Sample records for batch sequential designs

  1. Assessment of in vitro cyto/genotoxicity of sequentially treated electroplating effluent on the human hepatocarcinoma HuH-7 cell line.

    PubMed

    Naik, Umesh Chandra; Das, Mihir Tanay; Sauran, Swati; Thakur, Indu Shekhar

    2014-03-01

    The present study compares in vitro toxicity of electroplating effluent after the batch treatment process with that obtained after the sequential treatment process. Activated charcoal prepared from sugarcane bagasse through chemical carbonization, and tolerant indigenous bacteria, Bacillus sp. strain IST105, were used individually and sequentially for the treatment of electroplating effluent. The sequential treatment involving activated charcoal followed by bacterial treatment removed 99% of Cr(VI) compared with the batch processes, which removed 40% (charcoal) and 75% (bacteria), respectively. Post-treatment in vitro cyto/genotoxicity was evaluated by the MTT test and the comet assay in human HuH-7 hepatocarcinoma cells. The sequentially treated sample showed an increase in LC50 value with a 6-fold decrease in comet-assay DNA migration compared with that of untreated samples. A significant decrease in DNA migration and an increase in LC50 value of treated effluent proved the higher effectiveness of the sequential treatment process over the individual batch processes. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Landsat-4 (TDRSS-user) orbit determination using batch least-squares and sequential methods

    NASA Technical Reports Server (NTRS)

    Oza, D. H.; Jones, T. L.; Hakimi, M.; Samii, M. V.; Doll, C. E.; Mistretta, G. D.; Hart, R. C.

    1992-01-01

    TDRSS user orbit determination is analyzed using a batch least-squares method and a sequential estimation method. In the batch least-squares analysis, the orbit determination consistency for Landsat-4, which was heavily tracked by TDRSS during January 1991, was about 4 meters in the rms overlap comparisons and about 6 meters in the maximum position differences of the overlap comparisons. In the sequential analysis, the consistency was about 10 to 30 meters in terms of the 3-sigma state error covariance function. As a further measure of consistency, the first residual of each pass was within the 3-sigma bound in the residual space.
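
    The two consistency measures named in this record, rms overlap differences for the batch solutions and 3-sigma residual bounds for the sequential filter, can be illustrated with a short sketch. The code below is a hypothetical illustration, not the flight dynamics software; the array names and toy numbers are assumptions.

    ```python
    import numpy as np

    def rms_overlap(sol_a, sol_b):
        """RMS and maximum position difference (m) over the overlap of two orbit solutions.

        sol_a, sol_b: (N, 3) arrays of positions sampled at the same epochs.
        """
        diff = np.linalg.norm(sol_a - sol_b, axis=1)
        return np.sqrt(np.mean(diff ** 2)), diff.max()

    def first_residual_ok(residual, predicted_sigma):
        """Check a tracking pass's first measurement residual against its 3-sigma bound."""
        return abs(residual) <= 3.0 * predicted_sigma

    # Toy usage with fabricated numbers (meters, relative to a common reference).
    rng = np.random.default_rng(0)
    arc1 = rng.normal(scale=2.0, size=(100, 3))         # solution estimated from arc 1
    arc2 = arc1 + rng.normal(scale=1.5, size=(100, 3))  # overlapping solution from arc 2
    rms, max_diff = rms_overlap(arc1, arc2)
    print(f"overlap rms = {rms:.1f} m, max difference = {max_diff:.1f} m")
    print("first residual within 3-sigma:", first_residual_ok(residual=4.2, predicted_sigma=2.0))
    ```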

  3. Enzymatic saccharification of pretreated wheat straw: comparison of solids-recycling, sequential hydrolysis and batch hydrolysis.

    PubMed

    Pihlajaniemi, Ville; Sipponen, Satu; Sipponen, Mika H; Pastinen, Ossi; Laakso, Simo

    2014-02-01

    In the enzymatic hydrolysis of lignocellulosic materials, the recycling of the solid residue has previously been considered within the context of enzyme recycling. In this study, a steady-state investigation of a solids-recycling process was made with pretreated wheat straw and compared to sequential and batch hydrolysis at constant reaction times, substrate feed, and liquid and enzyme consumption. Compared to batch hydrolysis, the recycling and sequential processes showed roughly equal hydrolysis yields, while the volumetric productivity was significantly increased. In the 72 h process the improvement was 90%, owing to an increased reaction consistency, while the solids feed was 16% of the total process constituents. The improvement resulted primarily from product removal, which was equally efficient in the solids-recycling and sequential hydrolysis processes. No evidence of accumulation of enzymes beyond the accumulation of the substrate was found in recycling. A mathematical model of solids-recycling, based on a geometric series, was constructed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Simultaneous biodegradation of three mononitrophenol isomers by a tailor-made microbial consortium immobilized in sequential batch reactors.

    PubMed

    Fu, H; Zhang, J-J; Xu, Y; Chao, H-J; Zhou, N-Y

    2017-03-01

    The ortho-nitrophenol (ONP)-utilizing Alcaligenes sp. strain NyZ215, meta-nitrophenol (MNP)-utilizing Cupriavidus necator JMP134 and para-nitrophenol (PNP)-utilizing Pseudomonas sp. strain WBC-3 were assembled as a consortium to degrade three nitrophenol isomers in sequential batch reactors. A pilot test was conducted in flasks to demonstrate that a mixture of the three mononitrophenols at 0·5 mol l⁻¹ each could be mineralized by this microbial consortium within 84 h. Interestingly, neither ONP nor MNP was degraded until PNP was almost consumed by strain WBC-3. By immobilizing this consortium into polyurethane cubes, all three mononitrophenols were continuously degraded in lab-scale sequential reactors for six batch cycles over 18 days. The total concentrations of ONP, MNP and PNP degraded were 2·8, 1·5 and 2·3 mol l⁻¹, respectively, during this time course. Quantitative real-time PCR analysis showed that each member of the microbial consortium was relatively stable during the entire degradation process. This study provides a novel approach to treat polluted water, particularly with a mixture of co-existing isomers. Nitroaromatic compounds are readily spread in the environment and pose great potential toxicity concerns. Here, we report the simultaneous degradation of three isomers of mononitrophenol in a single system by employing a consortium of three bacteria, both in flasks and in lab-scale sequential batch reactors. The results demonstrate that simultaneous biodegradation of three mononitrophenol isomers can be achieved by a tailor-made microbial consortium immobilized in sequential batch reactors, providing a pilot study for a novel approach to the bioremediation of mixed pollutants, especially isomers present in wastewater. © 2016 The Society for Applied Microbiology.

  5. Evaluation of TDRSS-user orbit determination accuracy using batch least-squares and sequential methods

    NASA Technical Reports Server (NTRS)

    Oza, D. H.; Jones, T. L.; Hodjatzadeh, M.; Samii, M. V.; Doll, C. E.; Hart, R. C.; Mistretta, G. D.

    1991-01-01

    The development of the Real-Time Orbit Determination/Enhanced (RTOD/E) system as a prototype system for sequential orbit determination on a Disk Operating System (DOS)-based Personal Computer (PC) is addressed. The results of a study to compare the orbit determination accuracy of a Tracking and Data Relay Satellite System (TDRSS) user spacecraft obtained using RTOD/E with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), are also addressed. Independent assessments were made to examine the consistencies of results obtained by the batch and sequential methods. Comparisons were made between the forward-filtered RTOD/E orbit solutions and definitive GTDS orbit solutions for the Earth Radiation Budget Satellite (ERBS); the maximum solution differences were less than 25 m after the filter had reached steady state.

  6. TDRSS-user orbit determination using batch least-squares and sequential methods

    NASA Astrophysics Data System (ADS)

    Oza, D. H.; Jones, T. L.; Hakimi, M.; Samii, Mina V.; Doll, C. E.; Mistretta, G. D.; Hart, R. C.

    1993-02-01

    The Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) commissioned Applied Technology Associates, Incorporated, to develop the Real-Time Orbit Determination/Enhanced (RTOD/E) system on a Disk Operating System (DOS)-based personal computer (PC) as a prototype system for sequential orbit determination of spacecraft. This paper presents the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite System (TDRSS) user spacecraft, Landsat-4, obtained using RTOD/E, operating on a PC, with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. The results of Landsat-4 orbit determination will provide useful experience for the Earth Observing System (EOS) series of satellites. The Landsat-4 ephemerides were estimated for the January 17-23, 1991, timeframe, during which intensive TDRSS tracking data for Landsat-4 were available. Independent assessments were made of the consistencies (overlap comparisons for the batch case and covariances and the first measurement residuals for the sequential case) of solutions produced by the batch and sequential methods. The forward-filtered RTOD/E orbit solutions were compared with the definitive GTDS orbit solutions for Landsat-4; the solution differences were less than 40 meters after the filter had reached steady state.

  7. Comparison of ERBS orbit determination accuracy using batch least-squares and sequential methods

    NASA Technical Reports Server (NTRS)

    Oza, D. H.; Jones, T. L.; Fabien, S. M.; Mistretta, G. D.; Hart, R. C.; Doll, C. E.

    1991-01-01

    The Flight Dynamics Div. (FDD) at NASA-Goddard commissioned a study to develop the Real Time Orbit Determination/Enhanced (RTOD/E) system as a prototype system for sequential orbit determination of spacecraft on a DOS-based personal computer (PC). An overview of RTOD/E capabilities is presented, along with the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite System (TDRSS) user spacecraft obtained using RTOD/E on a PC with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. RTOD/E was used to perform sequential orbit determination for the Earth Radiation Budget Satellite (ERBS), and GTDS was used to perform the batch least-squares orbit determination. The estimated ERBS ephemerides were obtained for the Aug. 16 to 22, 1989, timeframe, during which intensive TDRSS tracking data for ERBS were available. Independent assessments were made to examine the consistencies of results obtained by the batch and sequential methods. Comparisons were made between the forward-filtered RTOD/E orbit solutions and definitive GTDS orbit solutions for ERBS; the solution differences were less than 40 meters after the filter had reached steady state.

  8. TDRSS-user orbit determination using batch least-squares and sequential methods

    NASA Technical Reports Server (NTRS)

    Oza, D. H.; Jones, T. L.; Hakimi, M.; Samii, Mina V.; Doll, C. E.; Mistretta, G. D.; Hart, R. C.

    1993-01-01

    The Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) commissioned Applied Technology Associates, Incorporated, to develop the Real-Time Orbit Determination/Enhanced (RTOD/E) system on a Disk Operating System (DOS)-based personal computer (PC) as a prototype system for sequential orbit determination of spacecraft. This paper presents the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite System (TDRSS) user spacecraft, Landsat-4, obtained using RTOD/E, operating on a PC, with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. The results of Landsat-4 orbit determination will provide useful experience for the Earth Observing System (EOS) series of satellites. The Landsat-4 ephemerides were estimated for the January 17-23, 1991, timeframe, during which intensive TDRSS tracking data for Landsat-4 were available. Independent assessments were made of the consistencies (overlap comparisons for the batch case and covariances and the first measurement residuals for the sequential case) of solutions produced by the batch and sequential methods. The forward-filtered RTOD/E orbit solutions were compared with the definitive GTDS orbit solutions for Landsat-4; the solution differences were less than 40 meters after the filter had reached steady state.

  9. Comparison of ERBS orbit determination accuracy using batch least-squares and sequential methods

    NASA Astrophysics Data System (ADS)

    Oza, D. H.; Jones, T. L.; Fabien, S. M.; Mistretta, G. D.; Hart, R. C.; Doll, C. E.

    1991-10-01

    The Flight Dynamics Div. (FDD) at NASA-Goddard commissioned a study to develop the Real Time Orbit Determination/Enhanced (RTOD/E) system as a prototype system for sequential orbit determination of spacecraft on a DOS-based personal computer (PC). An overview of RTOD/E capabilities is presented, along with the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite System (TDRSS) user spacecraft obtained using RTOD/E on a PC with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. RTOD/E was used to perform sequential orbit determination for the Earth Radiation Budget Satellite (ERBS), and GTDS was used to perform the batch least-squares orbit determination. The estimated ERBS ephemerides were obtained for the Aug. 16 to 22, 1989, timeframe, during which intensive TDRSS tracking data for ERBS were available. Independent assessments were made to examine the consistencies of results obtained by the batch and sequential methods. Comparisons were made between the forward-filtered RTOD/E orbit solutions and definitive GTDS orbit solutions for ERBS; the solution differences were less than 40 meters after the filter had reached steady state.

  10. Optical Tracking Data Validation and Orbit Estimation for Sparse Observations of Satellites by the OWL-Net.

    PubMed

    Choi, Jin; Jo, Jung Hyun; Yim, Hong-Suh; Choi, Eun-Jung; Cho, Sungki; Park, Jang-Hyun

    2018-06-07

    An Optical Wide-field patroL-Network (OWL-Net) has been developed for maintaining the orbital ephemerides of Korean low Earth orbit (LEO) satellites. The OWL-Net consists of five optical tracking stations. Brightness signals of sunlight reflected by the targets were detected with a charge-coupled device (CCD). A chopper system was adopted for fast astrometric data sampling, up to 50 Hz, within a short observation time. The astrometric accuracy of the optical observation data was validated against precise orbital ephemerides such as Consolidated Prediction File (CPF) data and precise orbit determination results based on onboard Global Positioning System (GPS) data from the target satellite. In the optical observation simulation of the OWL-Net for 2017, the average observation span for a single arc of 11 LEO observation targets was about 5 min, while the average separation between optical observations was 5 h. We estimated the position and velocity, together with an atmospheric drag coefficient, of the LEO observation targets using a sequential-batch orbit estimation technique applied after multi-arc batch orbit estimation. Post-fit residuals for the multi-arc batch and sequential-batch orbit estimations were analyzed with respect to the optical measurements and the reference orbits (CPF and GPS data). The post-fit residuals with respect to the reference orbits showed errors of a few tens of meters in the in-track direction for both the multi-arc batch and sequential-batch orbit estimation results.

  11. Parallel steady state studies on a milliliter scale accelerate fed-batch bioprocess design for recombinant protein production with Escherichia coli.

    PubMed

    Schmideder, Andreas; Cremer, Johannes H; Weuster-Botz, Dirk

    2016-11-01

    In general, fed-batch processes are applied for recombinant protein production with Escherichia coli (E. coli). However, state-of-the-art methods for identifying suitable reaction conditions suffer from severe drawbacks: direct transfer of process information from parallel batch studies is often defective, and sequential fed-batch studies are time-consuming and cost-intensive. In this study, continuously operated stirred-tank reactors on a milliliter scale were applied to identify suitable reaction conditions for fed-batch processes. Isopropyl β-d-1-thiogalactopyranoside (IPTG) induction strategies were varied in parallel-operated stirred-tank bioreactors to study the effects on the continuous production of the recombinant protein photoactivatable mCherry (PAmCherry) with E. coli. The best-performing induction strategies were transferred from the continuous processes on a milliliter scale to liter-scale fed-batch processes. Inducing recombinant protein expression by dynamically increasing the IPTG concentration to 100 µM led to an increase in the product concentration of 21% (8.4 g L⁻¹) compared to an implemented high-performance production process using the most frequently applied induction strategy, a single addition of 1000 µM IPTG. Thus, identifying feasible reaction conditions for fed-batch processes in parallel continuous studies on a milliliter scale was shown to be a powerful, novel method to accelerate bioprocess design in a cost-reducing manner. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1426-1435, 2016.

  12. Computational time reduction for sequential batch solutions in GNSS precise point positioning technique

    NASA Astrophysics Data System (ADS)

    Martín Furones, Angel; Anquela Julián, Ana Belén; Dimas-Pages, Alejandro; Cos-Gayón, Fernando

    2017-08-01

    Precise point positioning (PPP) is a well-established Global Navigation Satellite System (GNSS) technique that only requires information from the receiver (or rover) to obtain high-precision position coordinates. This is a very interesting and promising technique because it eliminates the need for a reference station near the rover receiver or a network of reference stations, thus reducing the cost of a GNSS survey. From a computational perspective, there are two ways to solve the system of observation equations produced by static PPP: either in a single step (so-called batch adjustment) or with a sequential adjustment/filter. The results of each should be the same if both are well implemented. However, if a sequential solution is needed (that is, not only the final coordinates but also those of the previous GNSS epochs), as in convergence studies, obtaining it by batch adjustment becomes a very time-consuming task owing to the matrix inversions that accumulate with each consecutive epoch. This is not a problem for the filter solution, which uses information computed at the previous epoch in the solution for the current epoch. Filter implementations, however, need extra consideration of user dynamics and of parameter state variations between observation epochs, with appropriate stochastic update parameter variances from epoch to epoch. These filtering considerations are not needed in batch adjustment, which makes it attractive. The main objective of this research is to significantly reduce the computation time required to obtain sequential results using batch adjustment. The new method we implemented in the adjustment process led to a mean reduction in computational time of 45%.
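
    To make the batch-versus-filter trade-off above concrete, the sketch below contrasts re-solving the accumulated normal equations at every epoch with a recursive (sequential) least-squares update that reuses the previous epoch's result. It is a minimal illustration under simplifying assumptions (linear observation model, uniform measurement noise, no parameter dynamics), not the authors' algorithm; all names and numbers are made up.

    ```python
    import numpy as np

    def batch_solution(H_list, y_list):
        """Re-solve the accumulated normal equations from scratch (cost grows with each epoch)."""
        H = np.vstack(H_list)
        y = np.concatenate(y_list)
        N = H.T @ H                       # accumulated normal matrix
        return np.linalg.solve(N, H.T @ y)

    def sequential_update(x, P, H_k, y_k, R_k):
        """One recursive least-squares (Kalman-style) measurement update."""
        S = H_k @ P @ H_k.T + R_k
        K = P @ H_k.T @ np.linalg.inv(S)
        x = x + K @ (y_k - H_k @ x)
        P = (np.eye(len(x)) - K @ H_k) @ P
        return x, P

    rng = np.random.default_rng(1)
    x_true = np.array([1.0, -2.0, 0.5])
    H_list, y_list = [], []
    x_seq, P_seq = np.zeros(3), np.eye(3) * 1e6     # diffuse prior
    for _ in range(50):                              # 50 "epochs" of 4 observations each
        H_k = rng.normal(size=(4, 3))
        y_k = H_k @ x_true + rng.normal(scale=0.01, size=4)
        H_list.append(H_k); y_list.append(y_k)
        x_seq, P_seq = sequential_update(x_seq, P_seq, H_k, y_k, np.eye(4) * 1e-4)

    print(batch_solution(H_list, y_list))  # both estimates agree closely,
    print(x_seq)                           # but only the batch form re-inverts everything
    ```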

  13. Improved solution accuracy for TDRSS-based TOPEX/Poseidon orbit determination

    NASA Technical Reports Server (NTRS)

    Doll, C. E.; Mistretta, G. D.; Hart, R. C.; Oza, D. H.; Bolvin, D. T.; Cox, C. M.; Nemesure, M.; Niklewski, D. J.; Samii, M. V.

    1994-01-01

    Orbit determination results are obtained by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) using a batch least-squares estimator available in the Goddard Trajectory Determination System (GTDS) and an extended Kalman filter estimation system to process Tracking and Data Relay Satellite (TDRS) System (TDRSS) measurements. GTDS is the operational orbit determination system used by the FDD in support of Ocean Topography Experiment (TOPEX)/Poseidon spacecraft navigation and health and safety operations. The extended Kalman filter was implemented in an orbit determination analysis prototype system closely related to the Real-Time Orbit Determination System/Enhanced (RTOD/E) system. In addition, the Precision Orbit Determination (POD) team within the GSFC Space Geodesy Branch generated an independent set of high-accuracy trajectories to support the TOPEX/Poseidon scientific data. These latter solutions use the Geodynamics (GEODYN) orbit determination system with laser ranging and Doppler Orbitography and Radiopositioning Integrated by Satellite (DORIS) tracking measurements. The TOPEX/Poseidon trajectories were estimated for November 7 through November 11, 1992, the timeframe under study. Independent assessments were made of the consistencies of solutions produced by the batch and sequential methods. The batch least-squares solutions were assessed based on the solution residuals, while the sequential solutions were assessed based primarily on the estimated covariances. The batch least-squares and sequential orbit solutions were compared with the definitive POD orbit solutions. The solution differences were generally less than 2 meters for the batch least-squares solutions and less than 13 meters for the sequential estimation solutions. After the sequential estimation solutions were processed with a smoother algorithm, position differences with the POD orbit solutions of less than 7 meters were obtained. The differences among the POD, GTDS, and filter/smoother solutions can be traced to differences in modeling and tracking data types, which are being analyzed in detail.
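
    The record above notes that running a smoother over the sequential (filter) solutions tightened the agreement with the POD orbits. The sketch below shows the general filter-then-smooth pattern on a toy scalar random-walk model: a forward Kalman pass followed by a backward Rauch-Tung-Striebel (RTS) pass. It is an illustrative, assumption-laden example, not the mission software.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n, F, Q, H, R = 200, 1.0, 0.01, 1.0, 1.0
    x_true = np.cumsum(rng.normal(scale=np.sqrt(Q), size=n))   # random-walk truth
    z = x_true + rng.normal(scale=np.sqrt(R), size=n)          # noisy measurements

    # Forward Kalman filter pass.
    xf, Pf, xp, Pp = np.zeros(n), np.zeros(n), np.zeros(n), np.zeros(n)
    x, P = 0.0, 10.0
    for k in range(n):
        xp[k], Pp[k] = F * x, F * P * F + Q          # predict
        K = Pp[k] * H / (H * Pp[k] * H + R)          # gain and update
        x = xp[k] + K * (z[k] - H * xp[k])
        P = (1 - K * H) * Pp[k]
        xf[k], Pf[k] = x, P

    # Backward RTS smoothing pass (uses stored predictions and filtered states).
    xs = xf.copy()
    for k in range(n - 2, -1, -1):
        C = Pf[k] * F / Pp[k + 1]
        xs[k] = xf[k] + C * (xs[k + 1] - xp[k + 1])

    print("filter RMSE:  ", np.sqrt(np.mean((xf - x_true) ** 2)))
    print("smoother RMSE:", np.sqrt(np.mean((xs - x_true) ** 2)))
    ```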

  14. Evaluation of Landsat-4 orbit determination accuracy using batch least-squares and sequential methods

    NASA Astrophysics Data System (ADS)

    Oza, D. H.; Jones, T. L.; Feiertag, R.; Samii, M. V.; Doll, C. E.; Mistretta, G. D.; Hart, R. C.

    The Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) commissioned Applied Technology Associates, Incorporated, to develop the Real-Time Orbit Determination/Enhanced (RTOD/E) system on a Disk Operating System (DOS)-based personal computer (PC) as a prototype system for sequential orbit determination of spacecraft. This paper presents the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite (TDRS) System (TDRSS) user spacecraft, Landsat-4, obtained using RTOD/E, operating on a PC, with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. The results of Landsat-4 orbit determination will provide useful experience for the Earth Observing System (EOS) series of satellites. The Landsat-4 ephemerides were estimated for the May 18-24, 1992, timeframe, during which intensive TDRSS tracking data for Landsat-4 were available. During this period, there were two separate orbit-adjust maneuvers on one of the TDRSS spacecraft (TDRS-East) and one small orbit-adjust maneuver for Landsat-4. Independent assessments were made of the consistencies (overlap comparisons for the batch case and covariances and the first measurement residuals for the sequential case) of solutions produced by the batch and sequential methods. The forward-filtered RTOD/E orbit solutions were compared with the definitive GTDS orbit solutions for Landsat-4; the solution differences were generally less than 30 meters after the filter had reached steady state.

  15. Evaluation of Landsat-4 orbit determination accuracy using batch least-squares and sequential methods

    NASA Technical Reports Server (NTRS)

    Oza, D. H.; Jones, T. L.; Feiertag, R.; Samii, M. V.; Doll, C. E.; Mistretta, G. D.; Hart, R. C.

    1993-01-01

    The Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) commissioned Applied Technology Associates, Incorporated, to develop the Real-Time Orbit Determination/Enhanced (RTOD/E) system on a Disk Operating System (DOS)-based personal computer (PC) as a prototype system for sequential orbit determination of spacecraft. This paper presents the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite (TDRS) System (TDRSS) user spacecraft, Landsat-4, obtained using RTOD/E, operating on a PC, with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), operating on a mainframe computer. The results of Landsat-4 orbit determination will provide useful experience for the Earth Observing System (EOS) series of satellites. The Landsat-4 ephemerides were estimated for the May 18-24, 1992, timeframe, during which intensive TDRSS tracking data for Landsat-4 were available. During this period, there were two separate orbit-adjust maneuvers on one of the TDRSS spacecraft (TDRS-East) and one small orbit-adjust maneuver for Landsat-4. Independent assessments were made of the consistencies (overlap comparisons for the batch case and covariances and the first measurement residuals for the sequential case) of solutions produced by the batch and sequential methods. The forward-filtered RTOD/E orbit solutions were compared with the definitive GTDS orbit solutions for Landsat-4; the solution differences were generally less than 30 meters after the filter had reached steady state.

  16. ANAEROBIC AND AEROBIC TREATMENT OF CHLORINATED ALIPHATIC COMPOUNDS

    EPA Science Inventory

    Biological degradation of 12 chlorinated aliphatic compounds (CACs) was assessed in bench-top reactors and in serum bottle tests. Three continuously mixed daily batch-fed reactor systems were evaluated: anaerobic, aerobic, and sequential-anaerobic-aerobic (sequential). Glucose,...

  17. Sequential Injection Analysis for Optimization of Molecular Biology Reactions

    PubMed Central

    Allen, Peter B.; Ellington, Andrew D.

    2011-01-01

    In order to automate the optimization of complex biochemical and molecular biology reactions, we developed a Sequential Injection Analysis (SIA) device and combined this with a Design of Experiment (DOE) algorithm. This combination of hardware and software automatically explores the parameter space of the reaction and provides continuous feedback for optimizing reaction conditions. As an example, we optimized the endonuclease digest of a fluorogenic substrate, and showed that the optimized reaction conditions also applied to the digest of the substrate outside of the device, and to the digest of a plasmid. The sequential technique quickly arrived at optimized reaction conditions with less reagent use than a batch process (such as a fluid handling robot exploring multiple reaction conditions in parallel) would have. The device and method should now be amenable to much more complex molecular biology reactions whose variable spaces are correspondingly larger. PMID:21338059
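
    As a rough illustration of the feedback idea described above (propose conditions, measure, refit, propose again), the sketch below runs a toy sequential optimization loop in which a quadratic response model is refitted after each new measurement. The `measure_response` function and all numbers are hypothetical stand-ins; the paper's actual DOE algorithm and instrument interface are not reproduced here.

    ```python
    import numpy as np

    def measure_response(conc):
        """Hypothetical fluorescence endpoint vs. enzyme concentration (toy model)."""
        return -(conc - 3.2) ** 2 + 10.0 + np.random.default_rng().normal(scale=0.05)

    lo, hi = 0.5, 8.0
    tried = [lo, (lo + hi) / 2, hi]                  # initial design points
    responses = [measure_response(c) for c in tried]

    for _ in range(6):                               # sequential re-design iterations
        a, b, c = np.polyfit(tried, responses, 2)    # quadratic response model
        nxt = np.clip(-b / (2 * a), lo, hi) if a < 0 else (lo + hi) / 2
        tried.append(float(nxt))                     # "inject" the next condition
        responses.append(measure_response(nxt))      # read back the response

    best = tried[int(np.argmax(responses))]
    print(f"estimated optimum near {best:.2f}")
    ```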

  18. Improved solution accuracy for Landsat-4 (TDRSS-user) orbit determination

    NASA Technical Reports Server (NTRS)

    Oza, D. H.; Niklewski, D. J.; Doll, C. E.; Mistretta, G. D.; Hart, R. C.

    1994-01-01

    This paper presents the results of a study to compare the orbit determination accuracy for a Tracking and Data Relay Satellite System (TDRSS) user spacecraft, Landsat-4, obtained using a Prototype Filter Smoother (PFS), with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS). The results of Landsat-4 orbit determination will provide useful experience for the Earth Observing System (EOS) series of satellites. The Landsat-4 ephemerides were estimated for the January 17-23, 1991, timeframe, during which intensive TDRSS tracking data for Landsat-4 were available. Independent assessments were made of the consistencies (overlap comparisons for the batch case and covariances for the sequential case) of solutions produced by the batch and sequential methods. The filtered and smoothed PFS orbit solutions were compared with the definitive GTDS orbit solutions for Landsat-4; the solution differences were generally less than 15 meters.

  19. Towards the Integration of Dark- and Photo-Fermentative Waste Treatment. 4. Repeated Batch Sequential Dark- and Photofermentation using Starch as Substrate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laurinavichene, T. V.; Belokopytov, B. F.; Laurinavichius, K. S.

    In this study we demonstrated the technical feasibility of a prolonged, sequential two-stage integrated process for starch fermentation under a repeated batch mode. In this scheme, the photobioreactor with purple bacteria in the second stage was fed directly with dark culture from the first stage without centrifugation, filtration, or sterilization (not demonstrated previously). After preliminary optimization, both the dark and the photo stages were performed under repeated batch modes with different process parameters. Continuous H₂ production in this system was observed at H₂ yields of up to 1.4 and 3.9 mole mole⁻¹ hexose during the dark and photo stages, respectively (for a total of 5.3 mole mole⁻¹ hexose), and at rates of 0.9 and 0.5 L L⁻¹ d⁻¹, respectively. Prolonged repeated batch H₂ production was maintained for up to 90 days in each stage and was rather stable under non-aseptic conditions. Potential improvements to these results are discussed.

  20. A fast and accurate online sequential learning algorithm for feedforward networks.

    PubMed

    Liang, Nan-Ying; Huang, Guang-Bin; Saratchandran, P; Sundararajan, N

    2006-11-01

    In this paper, we develop an online sequential learning algorithm for single hidden layer feedforward networks (SLFNs) with additive or radial basis function (RBF) hidden nodes in a unified framework. The algorithm is referred to as the online sequential extreme learning machine (OS-ELM) and can learn data one-by-one or chunk-by-chunk (a block of data) with fixed or varying chunk size. The activation functions for additive nodes in OS-ELM can be any bounded nonconstant piecewise continuous functions, and the activation functions for RBF nodes can be any integrable piecewise continuous functions. In OS-ELM, the parameters of the hidden nodes (the input weights and biases of additive nodes or the centers and impact factors of RBF nodes) are randomly selected, and the output weights are analytically determined based on the sequentially arriving data. The algorithm uses the ideas of the ELM of Huang et al., developed for batch learning, which has been shown to be extremely fast with better generalization performance than other batch training methods. Apart from selecting the number of hidden nodes, no other control parameters have to be manually chosen. A detailed performance comparison of OS-ELM is made with other popular sequential learning algorithms on benchmark problems drawn from the regression, classification and time series prediction areas. The results show that OS-ELM is faster than the other sequential algorithms and produces better generalization performance.
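
    The update scheme described in this abstract (fixed random hidden-node parameters, output weights solved analytically and then updated recursively as data chunks arrive) can be sketched compactly. The code below is a hedged illustration of that idea for sigmoid additive nodes on a toy regression problem; the data, chunk sizes and regularization term are assumptions, not the authors' benchmark setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def hidden_layer(X, W, b):
        """Random-feature hidden layer with sigmoid activation."""
        return 1.0 / (1.0 + np.exp(-(X @ W + b)))

    # Toy 1-D regression problem.
    X = rng.uniform(-3, 3, size=(600, 1))
    y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=600)

    L = 40                                    # number of hidden nodes
    W = rng.normal(size=(1, L))               # random input weights (fixed)
    b = rng.normal(size=L)                    # random biases (fixed)

    # Initialization phase on the first chunk (batch ELM solution).
    H0 = hidden_layer(X[:100], W, b)
    P = np.linalg.inv(H0.T @ H0 + 1e-6 * np.eye(L))
    beta = P @ H0.T @ y[:100]

    # Sequential phase: update output weights chunk by chunk.
    for start in range(100, 600, 50):
        Hk = hidden_layer(X[start:start + 50], W, b)
        yk = y[start:start + 50]
        S = np.eye(len(yk)) + Hk @ P @ Hk.T
        P = P - P @ Hk.T @ np.linalg.solve(S, Hk @ P)
        beta = beta + P @ Hk.T @ (yk - Hk @ beta)

    pred = hidden_layer(X, W, b) @ beta
    print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
    ```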

  1. Long-term biological hydrogen production by agar immobilized Rhodobacter capsulatus in a sequential batch photobioreactor.

    PubMed

    Elkahlout, Kamal; Alipour, Siamak; Eroglu, Inci; Gunduz, Ufuk; Yucel, Meral

    2017-04-01

    In this study, an agar immobilization technique was employed for biological hydrogen production using Rhodobacter capsulatus DSM 1710 (wild type) and YO3 (hup⁻ mutant) strains in a sequential batch process. Different agar and glutamate concentrations were tested with a defined nutrient medium. An agar concentration of 4% (w/v) and 4 mM glutamate were selected for bacterial immobilization in terms of the rate and longevity of hydrogen production. The acetate concentration was increased from 40 to 60-100 mM, and 60 mM gave the best results with both bacterial strains immobilized in 4% (w/v) agar. The cell concentration was increased from 2.5 to 5 mg dcw mL⁻¹ agar; increasing the cell concentration of the wild-type strain decreased yield and productivity, whereas these parameters improved with increasing cell concentration of the mutant strain. The hydrogen production time was also extended from 17 days up to 60 days, depending on the process conditions and parameters. Hydrogen production by immobilized photosynthetic bacteria is a convenient technology, as it enables hydrogen production at high organic acid concentrations compared with suspended cultures. In addition, immobilization increases the stability of the system and allowed sequential batch operation for long-term application.

  2. Linear Covariance Analysis and Epoch State Estimators

    NASA Technical Reports Server (NTRS)

    Markley, F. Landis; Carpenter, J. Russell

    2014-01-01

    This paper extends in two directions the results of prior work on generalized linear covariance analysis of both batch least-squares and sequential estimators. The first is an improved treatment of process noise in the batch, or epoch state, estimator with an epoch time that may be later than some or all of the measurements in the batch. The second is to account for process noise in specifying the gains in the epoch state estimator. We establish the conditions under which the latter estimator is equivalent to the Kalman filter.
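
    Linear covariance analysis of the kind referenced here propagates the estimation-error covariance analytically for a chosen gain sequence, without simulating measurements. The following sketch does this for a toy constant-velocity model with a Kalman gain; the Joseph-form update holds for any (possibly suboptimal) gain, which is what makes this style of analysis useful. The dynamics, noise levels and gains are illustrative assumptions, not the paper's formulation.

    ```python
    import numpy as np

    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity dynamics
    Q = np.diag([0.0, 1e-4])                   # process noise on velocity
    H = np.array([[1.0, 0.0]])                 # position measurements
    R = np.array([[0.25]])

    P = np.diag([100.0, 1.0])                  # initial error covariance
    for k in range(50):
        P = F @ P @ F.T + Q                    # time update of the error covariance
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain (the optimal choice)
        I_KH = np.eye(2) - K @ H
        P = I_KH @ P @ I_KH.T + K @ R @ K.T    # Joseph form, valid for any gain K

    print("steady-state position sigma:", np.sqrt(P[0, 0]))
    ```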

  3. Linear Covariance Analysis and Epoch State Estimators

    NASA Technical Reports Server (NTRS)

    Markley, F. Landis; Carpenter, J. Russell

    2012-01-01

    This paper extends in two directions the results of prior work on generalized linear covariance analysis of both batch least-squares and sequential estimators. The first is an improved treatment of process noise in the batch, or epoch state, estimator with an epoch time that may be later than some or all of the measurements in the batch. The second is to account for process noise in specifying the gains in the epoch state estimator. We establish the conditions under which the latter estimator is equivalent to the Kalman filter.

  4. Comparison of TOPEX/Poseidon orbit determination solutions obtained by the Goddard Space Flight Center Flight Dynamics Division and Precision Orbit Determination Teams

    NASA Technical Reports Server (NTRS)

    Doll, C.; Mistretta, G.; Hart, R.; Oza, D.; Cox, C.; Nemesure, M.; Bolvin, D.; Samii, Mina V.

    1993-01-01

    Orbit determination results are obtained by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) using the Goddard Trajectory Determination System (GTDS) and a real-time extended Kalman filter estimation system to process Tracking and Data Relay Satellite (TDRS) System (TDRSS) measurements in support of Ocean Topography Experiment (TOPEX)/Poseidon spacecraft navigation and health and safety operations. GTDS is the operational orbit determination system used by the FDD, and the extended Kalman filter was implemented in an analysis prototype system, the Real-Time Orbit Determination System/Enhanced (RTOD/E). The Precision Orbit Determination (POD) team within the GSFC Space Geodesy Branch generates an independent set of high-accuracy trajectories to support the TOPEX/Poseidon scientific data. These latter solutions use the Geodynamics (GEODYN) orbit determination system with laser ranging tracking data. The TOPEX/Poseidon trajectories were estimated for the October 22 - November 1, 1992, timeframe, for which the latest preliminary POD results were available. Independent assessments were made of the consistencies of solutions produced by the batch and sequential methods. The batch cases were assessed using overlap comparisons, while the sequential cases were assessed with covariances and the first measurement residuals. The batch least-squares and forward-filtered RTOD/E orbit solutions were compared with the definitive POD orbit solutions. The solution differences were generally less than 10 meters (m) for the batch least-squares solutions and less than 18 m for the sequential estimation solutions. The differences among the POD, GTDS, and RTOD/E solutions can be traced to differences in modeling and tracking data types, which are being analyzed in detail.

  5. Assessing the effect of sodium dichloroisocyanurate concentration on transfer of Salmonella enterica serotype Typhimurium in wash water for production of minimally processed iceberg lettuce (Lactuca sativa L.).

    PubMed

    Maffei, D F; Sant'Ana, A S; Monteiro, G; Schaffner, D W; Franco, B D G M

    2016-06-01

    This study evaluated the impact of sodium dichloroisocyanurate (5, 10, 20, 30, 40, 50 and 250 mg l(-1)) in wash water on transfer of Salmonella Typhimurium from contaminated lettuce to wash water and then to other noncontaminated lettuces washed sequentially in the same water. Experiments were designed to mimic the conditions commonly seen in minimally processed vegetable (MPV) processing plants in Brazil. The scenarios were as follows: (1) washing one inoculated lettuce portion in nonchlorinated water, followed by washing 10 noninoculated portions sequentially; (2) washing one inoculated lettuce portion in chlorinated water, followed by washing five noninoculated portions sequentially; (3) washing five inoculated lettuce portions in chlorinated water sequentially, followed by washing five noninoculated portions sequentially; (4) washing five noninoculated lettuce portions in chlorinated water sequentially, followed by washing five inoculated portions sequentially and then five noninoculated portions sequentially in the same water. Salm. Typhimurium transfer from inoculated lettuce to wash water and further dissemination to noninoculated lettuces occurred when nonchlorinated water was used (scenario 1). When chlorinated water was used (scenarios 2, 3 and 4), no measurable Salm. Typhimurium transfer occurred if the sanitizer concentration was ≥10 mg l(-1). Use of sanitizers at correct concentrations is important to minimize the risk of microbial transfer during MPV washing. In this study, the impact of sodium dichloroisocyanurate in the wash water on transfer of Salmonella Typhimurium from inoculated lettuce to wash water and then to other noninoculated lettuces washed sequentially in the same water was evaluated. The use of chlorinated water, at concentrations above 10 mg l(-1), effectively prevented Salm. Typhimurium transfer under several different washing scenarios. Conversely, when nonchlorinated water was used, Salm. Typhimurium transfer occurred in up to 10 noninoculated batches of lettuce washed sequentially in the same water. © 2016 The Society for Applied Microbiology.

  6. Fate of 90Sr and U(VI) in Dounreay sediments following saline inundation and erosion.

    PubMed

    Eagling, Jane; Worsfold, Paul J; Blake, William H; Keith-Roach, Miranda J

    2013-08-01

    There is concern that sea level rise associated with projected climate change will lead to the inundation, flooding and erosion by seawater of soils and sediments contaminated with radionuclides at coastal nuclear sites such as Dounreay (UK). Here, batch and column experiments were designed to simulate these scenarios, and sequential extractions were used to identify the key radionuclide solid-phase associations. Strontium was exchangeable and was mobilised rapidly by ion exchange with seawater Mg(2+) in both batch and column experiments. In contrast, U was more strongly bound to the sediments, and mobilisation was initially limited by the influence of the sediment on the pH of the water. Release was only observed when the pH increased above 6.9, suggesting that the formation of soluble U(VI)-carbonate species was important. Under dynamic flow conditions, long-term release was significant (47%) but controlled by slow desorption kinetics from a range of binding sites. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Aerobic degradation of petroleum refinery wastewater in sequential batch reactor.

    PubMed

    Thakur, Chandrakant; Srivastava, Vimal C; Mall, Indra D

    2014-01-01

    The aim of the present work was to study the effect of various parameters on the treatment of raw petroleum refinery wastewater (PRW), having a chemical oxygen demand (COD) of 350 mg L(-1) and a total organic carbon (TOC) of 70 mg L(-1), in a sequential batch reactor (SBR). The effect of hydraulic retention time (HRT) was studied under the instantaneous-fill condition. Maximum COD and TOC removal efficiencies were found to be 80% and 84%, respectively, for a fill phase of 2 h and a react phase of 2 h, with the fraction of the SBR filled with raw PRW in each cycle being 0.4. The effect of parameters was also studied in terms of the settling characteristics of the treated slurry. The kinetics of the treatment process were studied. FTIR and UV-visible analyses of the PRW before and after treatment were performed to understand the degradation mechanism.

  8. Optimality of affine control system of several species in competition on a sequential batch reactor

    NASA Astrophysics Data System (ADS)

    Rodríguez, J. C.; Ramírez, H.; Gajardo, P.; Rapaport, A.

    2014-09-01

    In this paper, we analyse the optimality of an affine control system of several species in competition for a single substrate in a sequential batch reactor, with the objective being to reach a given (low) level of the substrate. We allow controls to be bounded measurable functions of time plus possible impulses. A suitable modification of the dynamics leads to a slightly different optimal control problem, without impulsive controls, to which we apply different optimality conditions derived from the Pontryagin principle and the Hamilton-Jacobi-Bellman equation. We thus characterise the singular trajectories of our problem as the extremal trajectories keeping the substrate at a constant level. We also establish conditions under which an immediate one-impulse (IOI) strategy is optimal. Some numerical experiments are then included to illustrate our study and to show that those conditions are also necessary to ensure the optimality of the IOI strategy.

  9. Worst-error analysis of batch filter and sequential filter in navigation problems. [in spacecraft trajectory estimation

    NASA Technical Reports Server (NTRS)

    Nishimura, T.

    1975-01-01

    This paper proposes a worst-error analysis for dealing with problems of estimation of spacecraft trajectories in deep space missions. Navigation filters in use assume either constant or stochastic (Markov) models for their estimated parameters. When the actual behavior of these parameters does not follow the pattern of the assumed model, the filters sometimes give very poor performance. To prepare for such pathological cases, the worst errors of both batch and sequential filters are investigated based on incremental sensitivity studies of these filters. By finding critical switching instances of non-gravitational accelerations, intensive tracking can be carried out around those instances. Also, the worst errors in the target plane provide a measure for assigning the propellant budget for trajectory corrections. Thus, the worst-error study provides useful information as well as practical criteria for establishing the maneuver and tracking strategy of spacecraft missions.

  10. Characteristics of aerobic granules grown on glucose in a sequential batch shaking reactor.

    PubMed

    Cai, Chun-guang; Zhu, Nan-wen; Liu, Jun-shen; Wang, Zhen-peng; Cai, Wei-min

    2004-01-01

    Aerobic heterotrophic granular sludge was cultivated in a sequencing batch shaking reactor (SBSR) fed with a synthetic wastewater containing glucose as the carbon source. The characteristics of the aerobic granules were investigated. Compared with conventional activated sludge flocs, the aerobic granules exhibit excellent physical characteristics in terms of settleability, size, shape, biomass density, and physical strength. Scanning electron micrographs revealed that few filamentous bacteria could be found in mature granules; rod-shaped and coccoid bacteria were the dominant microorganisms.

  11. Design of a Facility to Implement a Low Cost Process for Production of NHC

    DTIC Science & Technology

    1979-05-15

    In brief, crude NHC is produced by sequential batch-wise solution processing, initially converting B10 (decaborane) to the sulfide ligand with subsequent... [The remainder of this DTIC record is OCR residue from a table of process solvents and reagents, including diborane, dioxane, decaborane, dibutyl sulfide, pyridine, 1-octyne, acetone, methanol, and pentane.]

  12. Anaerobic Digestion in a Flooded Densified Leachbed

    NASA Technical Reports Server (NTRS)

    Chynoweth, David P.; Teixeira, Arthur A.; Owens, John M.; Haley, Patrick J.

    2009-01-01

    A document discusses the adaptation of a patented biomass-digesting process, denoted sequential batch anaerobic composting (SEBAC), to recycling of wastes aboard a spacecraft. In SEBAC, high-solids-content biomass wastes are converted into methane, carbon dioxide, and compost.

  13. Enhanced bioethanol production by fed-batch simultaneous saccharification and co-fermentation at high solid loading of Fenton reaction and sodium hydroxide sequentially pretreated sugarcane bagasse.

    PubMed

    Zhang, Teng; Zhu, Ming-Jun

    2017-04-01

    The fed-batch simultaneous saccharification and co-fermentation (SSCF) of sugarcane bagasse (SCB) pretreated sequentially by Fenton reaction and NaOH was investigated at high solid loadings of 10-30% (w/v). Enzyme feeding mode, substrate feeding mode and a combination of both were compared with the batch mode at each solid loading. Ethanol concentrations above 80 g/L were obtained in the batch and enzyme feeding modes at a solid loading of 30% (w/v). The enzyme feeding mode was found to increase ethanol productivity and reduce enzyme loading, to 1.23 g/L/h and 9 FPU/g substrate, respectively. The present study provides an economically feasible process for high-concentration bioethanol production. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Feasibility of bioengineered two-stages sequential batch reactor and filtration-adsorption process for complex agrochemical effluent.

    PubMed

    Manekar, Pravin; Biswas, Rima; Urewar, Chaitali; Pal, Sukdeb; Nandy, Tapas

    2013-11-01

    In the present study, the feasibility of a bioengineered two-stage sequential batch reactor (BTSSBR) followed by a filtration-adsorption process was investigated for treating agrochemical effluent while overcoming factors affecting process stability, such as microbial imbalance and substrate sensitivity. An air stripper removed 90% of the toxic ammonia, and the stripped stream was combined with other streams for bio-oxidation and filtration-adsorption. The BTSSBR system achieved bio-oxidation at a hydraulic retention time of 6 days by fending off microbial imbalance and substrate sensitivity. The maximum reductions in COD and BOD by heterotrophic bacteria in the first reactor were 87% and 90%, respectively. Removal of toxic ammoniacal nitrogen by autotrophic bacteria in the second-stage bio-oxidation was 97%. Optimum filtration and adsorption of pollutants were achieved at rates of 10 and 9 m(3) m(-2) h(-1), respectively. The treatment scheme comprising the air stripper, BTSSBR and filtration-adsorption process shows great promise for treating agrochemical effluent. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Continuous treatment of non-sterile hospital wastewater by Trametes versicolor: How to increase fungal viability by means of operational strategies and pretreatments.

    PubMed

    Mir-Tutusaus, J A; Sarrà, M; Caminal, G

    2016-11-15

    Hospital wastewaters carry a high load of pharmaceutically active compounds (PhACs). Fungal treatments could be appropriate for source treatment of such effluents, but the transition to non-sterile conditions has proved difficult owing to competition with indigenous microorganisms, resulting in very short-duration operations. In this article, coagulation-flocculation and UV-radiation processes were studied as pretreatments to a fungal reactor treating non-sterile hospital wastewater in sequential batch and continuous operation modes. The influent was spiked with ibuprofen and ketoprofen, and both compounds were successfully degraded by over 80%. UV pretreatment did not extend the fungal activity after coagulation-flocculation, measured as laccase production and pellet integrity. Sequential batch operation did not reduce bacterial competition during fungal treatment. The best strategy was the addition of a coagulation-flocculation pretreatment to a continuous reactor, which led to an operation of 28 days without biomass renewal. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Optimization of probiotic and lactic acid production by Lactobacillus plantarum in submerged bioreactor systems.

    PubMed

    Brinques, Graziela Brusch; do Carmo Peralba, Maria; Ayub, Marco Antônio Záchia

    2010-02-01

    Biomass and lactic acid production by a Lactobacillus plantarum strain isolated from Serrano cheese, a microorganism traditionally used in foods and recognized as a potent probiotic, was optimized. Optimization procedures were carried out in submerged batch bioreactors using cheese whey as the main carbon source. Sequential experimental Plackett-Burman designs followed by central composite design (CCD) were used to assess the influence of temperature, pH, stirring, aeration rate, and concentrations of lactose, peptone, and yeast extract on biomass and lactic acid production. Results showed that temperature, pH, aeration rate, lactose, and peptone were the most influential variables for biomass formation. Under optimized conditions, the CCD for temperature and aeration rate showed that the model predicted maximal biomass production of 14.30 g l(-1) (dw) of L. plantarum. At the central point of the CCD, a biomass of 10.2 g l(-1) (dw), with conversion rates of 0.10 g of cell g(-1) lactose and 1.08 g lactic acid g(-1) lactose (w/w), was obtained. These results provide useful information about the optimal cultivation conditions for growing L. plantarum in batch bioreactors in order to boost biomass to be used as industrial probiotic and to obtain high yields of conversion of lactose to lactic acid.

  17. Computerized Serial Processing System at the University of California, Berkeley

    ERIC Educational Resources Information Center

    Silberstein, Stephen M.

    1975-01-01

    The extreme flexibility of the MARC format coupled with the simplicity of a batch-oriented processing system centered around a sequential master file has enabled the University of California, Berkeley, library to gradually build an unusually large serials data base in support of both technical and public services. (Author)

  18. Potential Transport and Degradation of “Aged” Pesticide Residues in Soil

    USDA-ARS?s Scientific Manuscript database

    Increased pesticide residence time in soil, or “aging”, has been shown to affect the sorption-desorption of pesticides in the soil, which in turn can control transport and degradation processes. Aging effects have been characterized by batch sequential extraction methods, in which sorption coefficie...

  19. Potential transport and degradation of “Aged” pesticide residues in soil

    USDA-ARS?s Scientific Manuscript database

    “Aging” has been shown to affect the sorption-desorption of pesticides in the soil, which in turn can control transport and degradation processes. Aging effects have been characterized by batch sequential extraction methods, in which sorption coefficients (i.e. Kd) are determined for the chemical re...

  20. Start-up of a sequential dry anaerobic digestion of paunch under psychrophilic and mesophilic temperatures.

    PubMed

    Nkemka, Valentine Nkongndem; Hao, Xiying

    2018-04-01

    The present laboratory study evaluated the sequential leach bed dry anaerobic digestion (DAD) of paunch under psychrophilic (22°C) and mesophilic (40°C) temperatures. Three leach bed reactors were operated under the mesophilic temperature in sequence at a solid retention time (SRT) of 40 d, with a new batch started 27 d into the run of the previous one. A total of six batches were operated for 135 d. The results showed that the mesophilic DAD of paunch was efficient, reaching methane yields of 126.9-212.1 mL g⁻¹ volatile solid (VS) and a VS reduction of 32.9-55.5%. The average daily methane production rate increased from 334.3 mL d⁻¹ to 571.4 mL d⁻¹ and 825.7 mL d⁻¹ when one, two and three leach bed reactors were in operation, respectively. The psychrophilic DAD of paunch was operated under an SRT of 100 d, and a total of three batches were performed in sequence for 300 d, with each batch starting after completion of the previous one. Improvements in the methane yield from 93.9 to 107.3 and 148.3 mL g⁻¹ VS and VS reductions of 24.8, 30.2 and 38.6% were obtained in the consecutive runs, indicating the adaptation of anaerobic microbes from mesophilic to psychrophilic temperatures. In addition, it took three runs for anaerobic microbes to reduce the volatile fatty acid accumulation observed in the first and second trials. This study demonstrates the potential of renewable energy recovery from paunch under psychrophilic and mesophilic temperatures. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  1. Elimination of water pathogens with solar radiation using an automated sequential batch CPC reactor.

    PubMed

    Polo-López, M I; Fernández-Ibáñez, P; Ubomba-Jaswa, E; Navntoft, C; García-Fernández, I; Dunlop, P S M; Schmid, M; Byrne, J A; McGuigan, K G

    2011-11-30

    Solar disinfection (SODIS) of water is a well-known, effective treatment process which is practiced at household level in many developing countries. However, this process is limited by the small volume treated, and there is no indication of treatment efficacy for the user. Low-cost glass tube reactors, together with compound parabolic collector (CPC) technology, have been shown to significantly increase the efficiency of solar disinfection. However, these reactors still require user input to control each batch SODIS process, and there is no feedback that the process is complete. Automatic operation of the batch SODIS process, controlled by UVA-radiation sensors, can provide information on the status of the process, ensure that the required UVA dose for complete disinfection is received, and reduce user workload through automatic sequential batch processing. In this work, an enhanced CPC photo-reactor with a concentration factor of 1.89 was developed. The apparatus was automated to achieve exposure to a pre-determined UVA dose. Treated water was automatically dispensed into a reservoir tank. The reactor was tested using Escherichia coli as a model pathogen in natural well water. A 6-log inactivation of E. coli was achieved following exposure to the minimum uninterrupted lethal UVA dose. The enhanced reactor decreased the exposure time required to achieve the lethal UVA dose, in comparison to a CPC system with a concentration factor of 1.0. Doubling the lethal UVA dose prevented the need for a period of post-exposure dark inactivation and reduced the overall treatment time. Using this reactor, SODIS can be carried out automatically at an affordable cost, with reduced exposure time and minimal user input. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Design of a potential colonic drug delivery system of mesalamine.

    PubMed

    Gohel, Mukesh C; Parikh, Rajesh K; Nagori, Stavan A; Dabhi, Mahesh R

    2008-01-01

    The aim of the present investigation was to develop a site-specific colonic drug delivery system built on the principle of combined pH and time sensitivity. Press-coated mesalamine tablets with a coat of HPMC E-15 were over-coated with Eudragit S100. The in vitro drug release study was conducted using a sequential dissolution technique at pH 1.2, 6.0, 7.2 and 6.4, mimicking different regions of the gastrointestinal tract. The optimized batch (F2) showed less than 6% drug release before reaching colonic pH 6.4, and complete drug release was obtained thereafter within 2 hr. A short-term dissolution stability study demonstrated a statistically insignificant difference in drug release.

  3. Online optimal experimental re-design in robotic parallel fed-batch cultivation facilities.

    PubMed

    Cruz Bournazou, M N; Barz, T; Nickel, D B; Lopez Cárdenas, D C; Glauche, F; Knepper, A; Neubauer, P

    2017-03-01

    We present an integrated framework for online optimal experimental re-design applied to parallel nonlinear dynamic processes that aims to precisely estimate the parameter set of macro-kinetic growth models with minimal experimental effort. This provides a systematic solution for rapid validation of a specific model for new strains, mutants, or products. In the biosciences this is especially important, as model identification is a long and laborious process that continues to limit the use of mathematical modeling in this field. The strength of this approach is demonstrated by fitting a macro-kinetic differential equation model for Escherichia coli fed-batch processes after 6 h of cultivation. The system includes two fully automated liquid handling robots: one containing eight mini-bioreactors and another used for automated at-line analyses, which allows for the immediate use of the available data in the modeling environment. As a result, the experiment can be continually re-designed while the cultivations are running, using the information generated by periodic parameter estimations. The advantages of an online re-computation of the optimal experiment are shown by a 50-fold lower average coefficient of variation on the parameter estimates compared to the sequential method (4.83% instead of 235.86%). The success obtained in such a complex system is a further step towards more efficient computer-aided bioprocess development. Biotechnol. Bioeng. 2017;114: 610-619. © 2016 Wiley Periodicals, Inc.
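
    The core loop described above, re-estimating a kinetic model's parameters as fresh at-line data arrive and using the updated fit to steer the running experiment, can be sketched as follows. The Monod-type model, parameter values and sampling schedule are illustrative assumptions, not the authors' macro-kinetic E. coli model or their re-design criterion.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    def monod(t, y, mu_max, Ks, Yxs):
        """Toy Monod growth model: biomass X grows on substrate S."""
        X, S = y
        mu = mu_max * S / (Ks + S)
        return [mu * X, -mu * X / Yxs]

    def simulate(params, t_eval):
        sol = solve_ivp(monod, (0, t_eval[-1]), [0.1, 10.0], args=tuple(params),
                        t_eval=t_eval, rtol=1e-6)
        return sol.y[0]                          # predicted biomass trajectory

    true_params = [0.6, 0.3, 0.5]                # mu_max, Ks, Yxs (made-up values)
    t_all = np.linspace(0.5, 12, 24)             # at-line sampling times (h)
    rng = np.random.default_rng(2)
    X_obs = simulate(true_params, t_all) * (1 + 0.03 * rng.normal(size=t_all.size))

    guess = [0.4, 0.5, 0.4]
    for n in (6, 12, 18, 24):                    # re-estimate as more samples arrive
        res = least_squares(
            lambda p: simulate(p, t_all[:n]) - X_obs[:n],
            guess, bounds=([0.05, 0.01, 0.1], [2.0, 5.0, 1.0]))
        guess = res.x                            # warm-start the next re-estimation
        print(f"after {n} samples: mu_max={res.x[0]:.2f}, Ks={res.x[1]:.2f}, Yxs={res.x[2]:.2f}")
    ```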

  4. Sequential acid and enzymatic hydrolysis in situ and bioethanol production from Gracilaria biomass.

    PubMed

    Wu, Fang-Chen; Wu, Jane-Yii; Liao, Yi-Jyun; Wang, Man-Ying; Shih, Ing-Lung

    2014-03-01

    Gracilaria sp., a red alga, was used as a feedstock for the production of bioethanol. Saccharification of Gracilaria sp. by sequential acid and enzyme hydrolysis in situ produced a high-quality hydrolysate that ensured its fermentability for ethanol production. The optimal saccharification process yielded a total of 11.85 g/L (59.26%) of glucose and galactose. Saccharomyces cerevisiae Wu-Y2 showed good performance in co-fermenting the glucose and galactose released in the hydrolysate from Gracilaria sp. A final ethanol concentration of 4.72 g/L (0.48 g/g sugar consumed; 94% conversion efficiency) and an ethanol productivity of 4.93 g/L/d were achieved. One gram of dry Gracilaria can be converted to 0.236 g (23.6%) of bioethanol via the processes developed. Efficient alcohol production by immobilized S. cerevisiae Wu-Y2 in batch and repeated-batch fermentation was also demonstrated. The findings of this study revealed that Gracilaria sp. can be a potential feedstock in a biorefinery for ethanol production. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Investigation of arsenic removal in batch wise water treatments by means of sequential hydride generation flow injection analysis.

    PubMed

    Toda, Kei; Takaki, Mari; Hashem, Md Abul

    2008-08-01

    Arsenic pollution of water is a major issue worldwide. Determination of inorganic arsenic in each oxidation state is important because As(III) is much more toxic than As(V). An automated arsenic measurement system was developed based on complete vaporization of As by a sequential procedure and collection/preconcentration of the vaporized AsH(3), which was subsequently measured by flow analysis. The automated, sensitive method was applied to monitoring As(III) and As(V) concentrations in contaminated water left standing overnight. The behavior of arsenic species was investigated under different conditions, and unique time-dependence profiles were obtained. For example, when anaerobic water samples were left to stand, the As(III) concentration began decreasing immediately, whereas a dead time was observed in the removal of As(V). Under normal groundwater conditions, most arsenic was removed from the water simply by standing overnight. To obtain more effective removal, the addition of oxidants and the use of steel wool were investigated. Simple batch-wise treatments of arsenic-contaminated water were demonstrated, and details of the transitional changes in As(III) and As(V) were investigated.

  6. A sequential treatment of intermediate tropical landfill leachate using a sequencing batch reactor (SBR) and coagulation.

    PubMed

    Yong, Zi Jun; Bashir, Mohammed J K; Ng, Choon Aun; Sethupathi, Sumathi; Lim, Jun-Wei

    2018-01-01

    The increase in landfill leachate generation is due to the increase of municipal solid waste (MSW) as global development continues. Landfill leachate has constantly been the most challenging issue in MSW management, as it contains high amounts of organic and inorganic compounds that might pollute water resources. Biologically treated landfill leachate often fails to fulfill the regulatory discharge standards. Thus, to prevent environmental pollution, many landfill leachate treatment plants involve multi-stage treatment processes. The Papan Landfill in Perak, Malaysia currently has no proper leachate treatment system. In the current study, sequential treatment via a sequencing batch reactor (SBR) followed by coagulation was used to remove chemical oxygen demand (COD), ammoniacal nitrogen (NH3-N), total suspended solids (TSS), and colour from raw landfill leachate. The optimum SBR aeration rate (L/min) and the optimal pH and alum dosage (g/L) for coagulation as a post-treatment were determined. The two-step sequential treatment by SBR followed by coagulation (alum) achieved removal efficiencies of 84.89%, 94.25%, 91.82% and 85.81% for COD, NH3-N, TSS and colour, respectively. Moreover, the two-stage treatment process achieved 95.0%, 95.0%, 95.3%, 100.0%, 87.2%, 62.9%, 50.0%, 41.3%, 41.2%, 34.8%, and 22.9% removal of cadmium, lead, copper, selenium, barium, iron, silver, nickel, zinc, arsenic, and manganese, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Acidification of In-Storage-Psychrophilic-Anaerobic-Digestion (ISPAD) process to reduce ammonia volatilization: Model development and validation.

    PubMed

    Madani-Hosseini, Mahsa; Mulligan, Catherine N; Barrington, Suzelle

    2016-06-01

    In-Storage-Psychrophilic-Anaerobic-Digestion (ISPAD) is an ambient-temperature treatment system for wastewaters stored for over 100 days under temperate climates, which produces a nitrogen-rich digestate susceptible to ammonia (NH3) volatilization. Present acidification techniques for reducing NH3 volatilization are expensive, have secondary environmental effects, and do not apply to ISPAD, which relies on batch-to-batch inoculation. The objectives of this study were to identify and validate sequential organic loading (OL) strategies producing imbalances in acidogen and methanogen growth, acidifying the ISPAD content to a pH of 6 one week before emptying, while also preserving the inoculation potential. This acidification process is challenging because wastewaters often offer a high buffering capacity and ISPAD operational practices foster low microbial populations. A model simulating the ISPAD pH regime was used to optimize three different sequential OLs to decrease the ISPAD pH to 6.0. All three strategies were compared in terms of biogas production, volatile fatty acid (VFA) concentration, microbial activity, glucose consumption, and pH decrease. Laboratory validation of the model outputs confirmed that a sequential OL of 13 kg glucose/m(3) of ISPAD content over 4 days could indeed reduce the pH to 6.0. Such an OL competes feasibly with present acidification techniques. Nevertheless, more research is required to explain the 3-day lag between the model results and the experimental data. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. "RCL-Pooling Assay": A Simplified Method for the Detection of Replication-Competent Lentiviruses in Vector Batches Using Sequential Pooling.

    PubMed

    Corre, Guillaume; Dessainte, Michel; Marteau, Jean-Brice; Dalle, Bruno; Fenard, David; Galy, Anne

    2016-02-01

    Nonreplicative recombinant HIV-1-derived lentiviral vectors (LV) are increasingly used in gene therapy of various genetic diseases, infectious diseases, and cancer. Before they are used in humans, preparations of LV must undergo extensive quality control testing. In particular, testing of LV must demonstrate the absence of replication-competent lentiviruses (RCL) with suitable methods, on representative fractions of vector batches. Current methods based on cell culture are challenging because high titers of vector batches translate into high volumes of cell culture to be tested in RCL assays. As vector batch sizes and titers continuously increase owing to improvements in production and purification methods, it became necessary for us to modify the current RCL assay based on the detection of p24 in cultures of indicator cells. Here, we propose a practical optimization of this method using a pairwise pooling strategy enabling easier testing of higher vector inoculum volumes. These modifications significantly decrease material handling and operator time, leading to a cost-effective method, while maintaining optimal sensitivity of the RCL testing. This optimized "RCL-pooling assay" improves the feasibility of quality control of large-scale batches of clinical-grade LV while maintaining the same sensitivity.

  9. Integrating sequencing batch reactor with bio-electrochemical treatment for augmenting remediation efficiency of complex petrochemical wastewater.

    PubMed

    Yeruva, Dileep Kumar; Jukuri, Srinivas; Velvizhi, G; Naresh Kumar, A; Swamy, Y V; Venkata Mohan, S

    2015-01-01

    The present study evaluates the sequential integration of two advanced biological treatment methods, viz., a sequencing batch reactor (SBR) and bioelectrochemical treatment (BET) systems, for the treatment of real-field petrochemical wastewater (PCW). Initially, two SBR reactors were operated under aerobic (SBR(Ae)) and anoxic (SBR(Ax)) microenvironments with an organic loading rate (OLR) of 9.68 kg COD/m(3)-day. SBR(Ax) showed relatively higher substrate degradation (3.34 kg COD/m(3)-day) compared to SBR(Ae) (2.9 kg COD/m(3)-day). To further improve treatment efficiency, the effluents from the SBR process were fed to BET reactors. BET(Ax) showed a higher substrate degradation rate (1.92 kg COD/m(3)-day) with simultaneous power generation (17.12 mW/m(2)), followed by BET(Ae) (1.80 kg COD/m(3)-day; 14.25 mW/m(2)). Integrating both processes resulted in a significant improvement in COD removal efficiency due to the flexibility of combining multiple microenvironments sequentially. The results were supported by GC-MS and FTIR analyses, which confirmed the increase in biodegradability of the wastewater. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Sequential Nonlinear Learning for Distributed Multiagent Systems via Extreme Learning Machines.

    PubMed

    Vanli, Nuri Denizcan; Sayin, Muhammed O; Delibalta, Ibrahim; Kozat, Suleyman Serdar

    2017-03-01

    We study online nonlinear learning over distributed multiagent systems, where each agent employs a single hidden layer feedforward neural network (SLFN) structure to sequentially minimize arbitrary loss functions. In particular, each agent trains its own SLFN using only the data revealed to it. On the other hand, the aim of the multiagent system is to train the SLFN at each agent as well as the optimal centralized batch SLFN that has access to all the data, by exchanging information between neighboring agents. We address this problem by introducing a distributed subgradient-based extreme learning machine algorithm. The proposed algorithm provides guaranteed upper bounds on the performance of the SLFN at each agent and shows that each of these individual SLFNs asymptotically achieves the performance of the optimal centralized batch SLFN. Our performance guarantees explicitly distinguish the effects of data- and network-dependent parameters on the convergence rate of the proposed algorithm. The experimental results illustrate that the proposed algorithm achieves the oracle performance significantly faster than state-of-the-art methods in the machine learning and signal processing literature. Hence, the proposed method is highly appealing for applications involving big data.
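
    A rough sketch of the single-agent building block described above (the architecture, hyperparameters and data stream are illustrative assumptions, and the consensus exchange between neighboring agents is reduced to a comment): an ELM-style SLFN with a fixed random hidden layer whose output weights are trained sequentially by stochastic (sub)gradient updates.

```python
import numpy as np

rng = np.random.default_rng(1)

class OnlineSLFN:
    """Single hidden layer feedforward network with a fixed random hidden
    layer (ELM-style); only the output weights are learned online."""
    def __init__(self, n_in, n_hidden, lr=0.05):
        self.W = rng.normal(size=(n_in, n_hidden))   # random input weights
        self.b = rng.normal(size=n_hidden)           # random biases
        self.beta = np.zeros(n_hidden)               # learned output weights
        self.lr = lr

    def _hidden(self, x):
        return np.tanh(x @ self.W + self.b)

    def predict(self, x):
        return self._hidden(x) @ self.beta

    def update(self, x, y):
        """One online step on the squared loss 0.5 * (f(x) - y)^2."""
        h = self._hidden(x)
        err = h @ self.beta - y
        self.beta -= self.lr * err * h               # (sub)gradient step
        # In a distributed setting, each agent would now average its beta
        # with those of its neighbors (consensus step) -- omitted here.

# Toy sequential stream: learn y = sin(x0) + 0.5*x1 from streaming samples.
net = OnlineSLFN(n_in=2, n_hidden=50)
for t in range(5000):
    x = rng.uniform(-2, 2, size=2)
    y = np.sin(x[0]) + 0.5 * x[1]
    net.update(x, y)

x_test = np.array([1.0, -0.5])
print(net.predict(x_test), np.sin(1.0) - 0.25)   # prediction vs. target
```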

  11. Iodine speciation in coastal and inland bathing waters and seaweeds extracts using a sequential injection standard addition flow-batch method.

    PubMed

    Santos, Inês C; Mesquita, Raquel B R; Bordalo, Adriano A; Rangel, António O S S

    2015-02-01

    The present work describes the development of a sequential injection standard addition method for iodine speciation in bathing waters and seaweed extracts without prior sample treatment. Iodine speciation was obtained by assessing the iodide and iodate content, the two inorganic forms of iodine in waters. For the determination of iodide, an iodide ion selective electrode (ISE) was used. The indirect determination of iodate was based on the spectrophotometric determination of nitrite (Griess reaction). For the iodate measurement, a mixing chamber was employed (flow-batch approach) to exploit its inherently efficient mixing, essential for the indirect determination of iodate. The application of the standard addition method enabled detection limits of 0.14 µM for iodide and 0.02 µM for iodate, together with the direct introduction of the target water samples, coastal and inland bathing waters. The results obtained were in agreement with those obtained by ICP-MS and a colorimetric reference procedure. Recovery tests also confirmed the accuracy of the developed method, which was effectively applied to bathing waters and seaweed extracts. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. An evaluation of the phosphorus storage capacity of an anaerobic/aerobic sequential batch biofilm reactor.

    PubMed

    Chiou, Ren-Jie; Yang, Yi-Rong

    2008-07-01

    The aim of this work was to assess the phosphorus storage capability of polyphosphate (poly-P) accumulating organisms (PAO) in the biofilm of a sequential batch biofilm reactor (SBBR). In the anaerobic phase, the specific COD uptake rate increases from 0.05 to 0.22 mg-COD/mg-biomass/h as the initial COD increases, and the main COD uptake activity occurs in the initial 30 min. Polyhydroxyalkanoate (PHA) accumulation (from 18 to 38 mg-PHA/g-biomass) and phosphorus release (from 20 to 60 mg-P/L) follow a similar trend. The adsorbed COD cannot be immediately transformed to PHAs. Since the PHA demand per unit of released phosphorus is independent of the initial COD, enhancing PHA accumulation would benefit phosphorus release. The only requirement is an initial amount of substrate that results in sufficient PHA accumulation (approximately 20 mg-PHA/g-biomass) for phosphorus release. During the aerobic phase, aeration should not only provide sufficient dissolved oxygen, but should also enhance mass transfer and diffusion. In other words, the limitation on the phosphorus storage capability always occurs during the anaerobic phase, not the aerobic phase.

  13. Exploiting an automated microfluidic hydrodynamic sequential injection system for determination of phosphate.

    PubMed

    Khongpet, Wanpen; Pencharee, Somkid; Puangpila, Chanida; Kradtap Hartwell, Supaporn; Lapanantnoppakhun, Somchai; Jakmunee, Jaroon

    2018-01-15

    A microfluidic hydrodynamic sequential injection (μHSI) spectrophotometric system was designed and fabricated. The system was built by laser-engraving a manifold pattern on an acrylic block and sealing it with another flat acrylic plate to form a microfluidic channel platform. The platform was incorporated with small solenoid valves to obtain a portable setup for programmable control of the liquid flow into the channel according to the HSI principle. The system was demonstrated for the determination of phosphate using a molybdenum blue method. Ascorbic acid, standard or sample, and acidic molybdate solutions were sequentially aspirated to fill the channel, forming a stacked zone before flowing to the detector. Under the optimum conditions, a linear calibration graph in the range of 0.1-6 mg P L(-1) was obtained. The detection limit was 0.1 mg L(-1). The system is compact (5.0 mm thick, 80 mm wide × 140 mm long), durable, portable, cost-effective, and consumes only small amounts of chemicals (83 μL each of molybdate and ascorbic acid, 133 μL of the sample solution and 1.7 mL of water carrier per run). It was applied to the determination of phosphate content in extracted soil samples. The recoveries of the analysis were in the range of 91.2-107.3%. The results obtained agreed well with those of the batch spectrophotometric method. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. An on-line potentiometric sequential injection titration process analyser for the determination of acetic acid.

    PubMed

    van Staden, J F; Mashamba, Mulalo G; Stefan, Raluca I

    2002-09-01

    An on-line potentiometric sequential injection titration process analyser for the determination of acetic acid is proposed. A solution of 0.1 mol L(-1) sodium chloride is used as the carrier. Titration is achieved by aspirating acetic acid samples between two strong-base zone volumes into a holding coil and by channelling the stack of well-defined zones, with flow reversal, through a reaction coil to a potentiometric sensor where the peak widths are measured. A linear relationship between peak width and the logarithm of the acid concentration was obtained in the range 1-9 g/100 mL. Vinegar samples were analysed without any sample pre-treatment. The method has a relative standard deviation of 0.4% with a sample frequency of 28 samples per hour. The results revealed good agreement between the proposed sequential injection method and an automated batch titration method.
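
    A minimal sketch of the calibration step implied here, with made-up numbers for illustration: fit peak width against the logarithm of acid concentration and invert the fit to estimate an unknown sample.

```python
import numpy as np

# Hypothetical calibration data: acetic acid standards (g/100 mL) and the
# corresponding peak widths (s) from the sequential injection titrator.
conc = np.array([1.0, 3.0, 5.0, 7.0, 9.0])
width = np.array([12.1, 18.9, 22.3, 24.4, 26.1])

# Linear fit: width = slope * log10(conc) + intercept
slope, intercept = np.polyfit(np.log10(conc), width, 1)

def concentration_from_width(w):
    """Invert the calibration line to estimate concentration (g/100 mL)."""
    return 10 ** ((w - intercept) / slope)

print(round(concentration_from_width(20.0), 2))  # unknown vinegar sample
```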

  15. Unraveling the Fate and Transport of SrEDTA-2 and Sr+2 in Hanford Sediments

    NASA Astrophysics Data System (ADS)

    Pace, M. N.; Mayes, M. A.; Jardine, P. M.; Mehlhorn, T. L.; Liu, Q. G.; Yin, X. L.

    2004-12-01

    Accelerated migration of strontium-90 has been observed in the vadose zone beneath the Hanford tank farm. The goal of this paper is to provide an improved understanding of the hydrogeochemical processes that contribute to strontium transport in the far-field Hanford vadose zone. Laboratory scale batch, saturated packed column experiments, and an unsaturated transport experiment in an undisturbed core were conducted to quantify geochemical and hydrological processes controlling Sr+2 and SrEDTA-2 sorption to Hanford flood deposits. After experimentation, the undisturbed core was disassembled and samples were collected from different bedding units as a function of depth. Sequential extractions were then performed on the samples. It has been suggested that organic chelates such as EDTA may be responsible for the accelerated transport of strontium due to the formation of stable anionic complexes. Duplicate batch and column experiments performed with Sr+2 and SrEDTA-2 suggested that the SrEDTA-2 complex was not stable in the presence of soil and rapid dissociation allowed strontium to be transported as a divalent cation. Batch experiments indicated a decrease in sorption with increasing rock:water ratios, whereas saturated packed column experiments indicated equal retardation in columns of different lengths. This difference between the batch and column experiments is primarily due to the difference between equilibrium conditions where dissolution of cations may compete for sorption sites versus flowing conditions where any dissolved cations are flushed through the system minimizing competition for sorption sites. Unsaturated transport in the undisturbed core resulted in significant Sr+2 retardation despite the presence of physical nonequilibrium. Core disassembly and sequential extractions revealed the mass wetness distribution and reactive mineral phases associated with strontium in the core. Overall, results indicated that strontium will most likely be transported through the Hanford far-field vadose zone as a divalent cation.

  16. Coupling fractionation and batch desorption to understand arsenic and fluoride co-contamination in the aquifer system.

    PubMed

    Kumar, Manish; Das, Nilotpal; Goswami, Ritusmita; Sarma, Kali Prasad; Bhattacharya, Prosun; Ramanathan, A L

    2016-12-01

    The present work is an attempt to study As and F(-) co-occurrence using laboratory-based assays that couple fractionation and batch dissolution experiments. A sequential extraction procedure (SEP), resulting in five "operationally defined phases", was performed on sediment and soil samples collected from the Brahmaputra flood plains, Assam, India. The high correlation between the Fe (hydr)oxide fraction and the total As content of the soil/sediment samples indicates that Fe (hydr)oxides are the principal source of As. F(-), being an anion, has a high potential to be sorbed onto positively charged surfaces. Findings of the SEP were used to design the batch desorption experiments by controlling the Fe (hydr)oxide content of the soil/sediment. Desorption of As and F(-) was observed under acidic, neutral and alkaline pH from untreated and Fe (hydr)oxide-removed samples. The highest amounts of As and F(-) were released from untreated samples under alkaline pH, while the amounts leached from samples with no Fe (hydr)oxide were low. The study showed that the Fe (hydr)oxide fraction commonly found in soils and sediments had a high affinity for negatively charged species such as F(-) and the oxyanions of As, AsO4(3-) (arsenate) and AsO3(3-) (arsenite). The Fe (hydr)oxide fraction was found to play the major role in the co-evolution of As and F(-). Two sorption coefficients were proposed, based on the easily leachable fraction and the As present in the groundwater of the sampling location, for understanding contamination vulnerability from leaching. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Grafting of a reactive siloxane onto an alpha-olefin in the melt phase

    NASA Astrophysics Data System (ADS)

    Bekele, Solomon

    1999-11-01

    This dissertation presents the outcomes of a study undertaken to graft a reactive siloxane onto an alpha-olefin in the melt phase, with the objective of conducting grafting and sequentially making a film of low coefficient of friction in a twin screw extruder. The areas of focus of the research were (1) design of experiments, (2) a batch kinetic study, (3) twin screw extruder grafting and film making and (4) film property analysis. The primary materials of the study were a film-grade homopolymer polyethylene, Equistar NA345-013, a vinylmethylsiloxane-dimethylsiloxane copolymer, Gelest VDT-731, and an ethyl 3,3-di-(t-amylperoxy)-butyrate, Elf Atochem Lupersol 533-M75. The batch mixer was a Haake Rheomix 400 modified to conduct the reaction under a N2 blanket. Continuous reactive extrusion and sequential film making were done in a Leistritz 18 mm x 40/1 L/D corotating and intermeshing twin screw extruder coupled with a flex-lip die. Reaction samples were analyzed using FT-IR for degree of grafting and GPC to determine changes in molecular weight distribution as measures of the degree of side reactions. The factors with main effects on the degree of grafting were found to be the mole percent of vinyl functionality available for reaction, the amount of initiator and the mixing temperature. Among side reactions, chain scission was found to be absent. The degree of cross-linking was mainly dependent on the mole percent of free radical initiator and the mixing temperature. Grafting was found to be a third-order reaction with respect to vinylsiloxane concentration. Batch kinetic data were scaled up to continuous reactive extrusion in the twin screw extruder. Tracer experiments with TiO2 were used to estimate the average residence time and the extent of axial dispersion. An axial plug-flow dispersion model was assumed to represent the nonideal flow of the grafting reaction in the twin screw extruder. The model was found to underpredict the degree of grafting by 9% to 25%. The coefficient of friction of the grafted film was found to be lower than that of the base polymer film by 50% to 60%. This increased to 65% to 75% when both sets of film samples were subjected to 50 kGy of electron beam irradiation.

  18. A new perspective of using sequential extraction: To predict the deficiency of trace elements during anaerobic digestion.

    PubMed

    Cai, Yafan; Wang, Jungang; Zhao, Yubin; Zhao, Xiaoling; Zheng, Zehui; Wen, Boting; Cui, Zongjun; Wang, Xiaofen

    2018-09-01

    Trace elements are commonly used as additives to facilitate anaerobic digestion. However, their addition is often blind because of the complexity of reaction conditions, which has impeded their widespread application. Therefore, this study was conducted to evaluate deficiencies in trace elements during anaerobic digestion by establishing relationships between changes in trace element bioavailability (the degree to which elements are available for interaction with biological systems) and digestion performance. To accomplish this, two batch experiments were conducted. In the first, sequential extraction was used to detect changes in trace element fractions and then to evaluate trace element bioavailability over the whole digestion cycle. In the second batch experiment, trace elements (Co, Fe, Cu, Zn, Mn, Mo and Se) were added to the reaction system at three concentrations (low, medium and high) and their effects were monitored. The results showed that sequential extraction was a suitable method for assessment of the bioavailability of trace elements (appropriate coefficient of variation and recovery rate). The results revealed that Se had the highest (44.2%-70.9%) bioavailability, while Fe had the lowest (1.7%-3.0%). A lack of trace elements was not directly related to their absolute bioavailability, but was instead associated with changes in their bioavailability throughout the digestion cycle. Trace elements were insufficient when their bioavailability was steady or increased over the digestion cycle. These results indicate that changes in trace element bioavailability during the digestion cycle can be used to predict their deficiency. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Gaussian process based modeling and experimental design for sensor calibration in drifting environments

    PubMed Central

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2016-01-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor’s response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP’s inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method. PMID:26924894
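
    A minimal sketch of the modeling step (using scikit-learn's GaussianProcessRegressor; the kernel, factor names, synthetic data and the uncertainty-based selection heuristic are assumptions, not the authors' exact procedure): the GP maps exposure conditions (concentration, temperature, humidity) to sensor response and returns a predictive standard deviation that can drive batch sequential selection of new calibration points.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)

# Synthetic calibration data: columns = [analyte conc., temperature, humidity]
X = rng.uniform([0.0, 15.0, 20.0], [10.0, 35.0, 80.0], size=(40, 3))
# Assumed response: nonlinear in concentration, drifting with T and RH.
y = 2.0 * np.sqrt(X[:, 0]) + 0.05 * (X[:, 1] - 25) - 0.01 * (X[:, 2] - 50)
y += rng.normal(0.0, 0.05, size=len(y))

kernel = 1.0 * RBF(length_scale=[3.0, 10.0, 30.0]) + WhiteKernel(0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Candidate exposure conditions for the next calibration batch: pick the
# points where the GP is most uncertain (simple batch-sequential heuristic).
candidates = rng.uniform([0.0, 15.0, 20.0], [10.0, 35.0, 80.0], size=(200, 3))
mean, std = gp.predict(candidates, return_std=True)
next_batch = candidates[np.argsort(std)[-5:]]
print(next_batch)
```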

  20. Periodate and hypobromite modification of Southern pine wood to improve sorption of copper ion

    Treesearch

    James D. McSweeny; Roger M. Rowell; George C. Chen; Thomas L. Eberhardt; Min Soo-Hong

    2008-01-01

    Milled southern pine wood was modified with sequential treatments of sodium periodate and sodium hypobromite for the purpose of improving copper ion (Cu2+) sorption capacity of the wood when tested in 24-h equilibrium batch tests. The modified wood provided additional carboxyl groups to those in the native wood and substantially increased Cu2+ uptake over that of...

  1. Recent Electrochemical and Optical Sensors in Flow-Based Analysis

    PubMed Central

    Chailapakul, Orawon; Ngamukot, Passapol; Yoosamran, Alongkorn; Siangproh, Weena; Wangfuengkanagul, Nattakarn

    2006-01-01

    Some recent analytical sensors based on electrochemical and optical detection coupled with different flow techniques have been chosen for this overview. A brief description of the fundamental concepts and applications of each flow technique, such as flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA), and multipumped FIA (MPFIA), is given.

  2. Investigation for improving Global Positioning System (GPS) orbits using a discrete sequential estimator and stochastic models of selected physical processes

    NASA Technical Reports Server (NTRS)

    Goad, Clyde C.; Chadwell, C. David

    1993-01-01

    GEODYNII is a conventional batch least-squares differential corrector computer program with deterministic models of the physical environment. Conventional algorithms were used to process differenced phase and pseudorange data to determine eight-day Global Positioning System (GPS) orbits with several-meter accuracy. However, random physical processes drive the errors whose magnitudes prevent improving the GPS orbit accuracy. To improve the orbit accuracy, these random processes should be modeled stochastically. The conventional batch least-squares algorithm cannot accommodate stochastic models; only a stochastic estimation algorithm, such as a sequential filter/smoother, is suitable. Also, GEODYNII cannot currently model the correlation among data values. Differenced pseudorange, and especially differenced phase, are precise data types that can be used to improve the GPS orbit precision. To overcome these limitations and improve the accuracy of GPS orbits computed using GEODYNII, we proposed to develop a sequential stochastic filter/smoother processor by using GEODYNII as a type of trajectory preprocessor. Our proposed processor is now complete. It contains a correlated double-difference range processing capability, first-order Gauss-Markov models for the solar radiation pressure scale coefficient and y-bias acceleration, and a random-walk model for the tropospheric refraction correction. The development approach was to interface the standard GEODYNII output files (measurement partials and variationals) with software modules containing the stochastic estimator, the stochastic models, and a double-differenced phase range processing routine. Thus, no modifications to the original GEODYNII software were required. A schematic of the development is shown. The observational data are edited in the preprocessor and passed to GEODYNII as one of its standard data types. A reference orbit is determined using GEODYNII as a batch least-squares processor, and the GEODYNII measurement partial (FTN90) and variational (FTN80, V-matrix) files are generated. These two files, along with a control statement file and a satellite identification and mass file, are passed to the filter/smoother to estimate time-varying parameter states at each epoch, improved satellite initial elements, and improved estimates of constant parameters.
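
    The stochastic models mentioned here have simple discrete-time forms. As a hedged sketch (scalar states and illustrative correlation times and variances, not the GEODYNII values): a first-order Gauss-Markov parameter decays toward zero with correlation time tau, while a random-walk parameter simply accumulates noise; both enter a sequential filter through the state transition and process-noise terms of the prediction step.

```python
import numpy as np

def gauss_markov_step(x, P, dt, tau, sigma):
    """Predict step for a scalar first-order Gauss-Markov state.
    x_{k+1} = phi * x_k + w_k, with phi = exp(-dt/tau) and
    Var(w_k) = sigma^2 * (1 - phi^2), so the process stays stationary."""
    phi = np.exp(-dt / tau)
    q = sigma**2 * (1.0 - phi**2)
    return phi * x, phi**2 * P + q

def random_walk_step(x, P, dt, q_rate):
    """Predict step for a scalar random-walk state (e.g. a troposphere bias):
    x_{k+1} = x_k + w_k, Var(w_k) = q_rate * dt."""
    return x, P + q_rate * dt

# Propagate illustrative solar-pressure-scale and troposphere states for 1 day.
x_srp, P_srp = 0.0, 1e-2       # Gauss-Markov state and variance
x_trp, P_trp = 0.0, 1e-4       # random-walk state and variance
dt = 60.0                      # s
for _ in range(int(86400 / dt)):
    x_srp, P_srp = gauss_markov_step(x_srp, P_srp, dt, tau=12 * 3600.0, sigma=0.1)
    x_trp, P_trp = random_walk_step(x_trp, P_trp, dt, q_rate=1e-9)
print(P_srp, P_trp)
```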

  3. A Model-based B2B (Batch to Batch) Control for An Industrial Batch Polymerization Process

    NASA Astrophysics Data System (ADS)

    Ogawa, Morimasa

    This paper describes an overview of a model-based B2B (batch-to-batch) control for an industrial batch polymerization process. In order to control the reaction temperature precisely, several methods based on a rigorous process dynamics model are employed at all design stages of the B2B control, such as modeling and parameter estimation of the reaction kinetics, which is one of the important parts of the process dynamics model. The designed B2B control consists of gain-scheduled I-PD/II2-PD control (I-PD with double integral control), feed-forward compensation at the batch start time, and model adaptation utilizing the results of the last batch operation. Throughout the actual batch operations, the B2B control provided superior control performance compared with conventional control methods.

  4. Performance evaluation of an asynchronous multisensor track fusion filter

    NASA Astrophysics Data System (ADS)

    Alouani, Ali T.; Gray, John E.; McCabe, D. H.

    2003-08-01

    Recently, the authors developed a new filter that uses data generated by asynchronous sensors to produce a state estimate that is optimal in the minimum mean square sense. The solution accounts for communication delays between the sensor platforms and the fusion center. It also deals with out-of-sequence data as well as latent data by processing the information in a batch-like manner. This paper compares, using simulated targets and Monte Carlo simulations, the performance of the filter to the optimal sequential processing approach. It was found that the performance of the new asynchronous multisensor track fusion filter (AMSTFF) is identical to that of the extended sequential Kalman filter (SEKF), while the new filter updates its track at a much lower rate than the SEKF.

  5. Sequentially Executed Model Evaluation Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-20

    Provides a message-passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers, which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in the development of models which operate on sequential information, such as time series, where evaluation is based on prior results combined with new data for the current iteration. It has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data are analyzed for anomalies.

  6. A Thermal Stability Test for Primary Explosive Stab Sensitizers: Study of the Thermal and Hydrolytic Stability of 2-Picryl-5-Nitrotetrazole,

    DTIC Science & Technology

    1984-02-01

    have been described previously [2]. The actual batch used was designated Batch D and was identical to that referred to as Batch C in Reference [2]... Tetrazene was type RD1357 prepared at Materials Research Laboratories. The batch used was designated Batch 10/83(A). Lead Azide was type RD1343 and was... Preparation of Experimental Detonators: Experimental detonators were prepared in mild steel tubes, 6 mm o.d., 3.2 mm i.d., length 6 mm, prepared from

  7. Efficient anaerobic transformation of raw wheat straw by a robust cow rumen-derived microbial consortium.

    PubMed

    Lazuka, Adèle; Auer, Lucas; Bozonnet, Sophie; Morgavi, Diego P; O'Donohue, Michael; Hernandez-Raquet, Guillermina

    2015-11-01

    A rumen-derived microbial consortium was enriched on raw wheat straw as the sole carbon source in a sequential batch-reactor (SBR) process under strict mesophilic anaerobic conditions. After five cycles of enrichment, the procedure enabled the selection of a stable and efficient lignocellulolytic microbial consortium, mainly constituted by members of the Firmicutes and Bacteroidetes phyla. The enriched community, designated the rumen-wheat straw-derived consortium (RWS), efficiently hydrolyzed lignocellulosic biomass, degrading 55.5% w/w of raw wheat straw over 15 days at 35°C and accumulating carboxylates as the main products. Cellulolytic and hemicellulolytic activities, mainly detected in the cell-bound fraction, were produced in the earlier steps of degradation, their production being correlated with the maximal lignocellulose degradation rates. Overall, these results demonstrate the potential of RWS to convert unpretreated lignocellulosic substrates into useful chemicals. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Ethanol production using whole plant biomass of Jerusalem artichoke by Kluyveromyces marxianus CBS1555.

    PubMed

    Kim, Seonghun; Park, Jang Min; Kim, Chul Ho

    2013-03-01

    Jerusalem artichoke is a low-requirement sugar crop containing cellulose and hemicellulose in the stalk and a high content of inulin in the tuber. However, the lignocellulosic component in Jerusalem artichoke stalk reduces the fermentability of the whole plant for efficient bioethanol production. In this study, Jerusalem artichoke stalk was pretreated sequentially with dilute acid and alkali, and then hydrolyzed enzymatically. During enzymatic hydrolysis, approximately 88 % of the glucan and xylan were converted to glucose and xylose, respectively. Batch and fed-batch simultaneous saccharification and fermentation of both pretreated stalk and tuber by Kluyveromyces marxianus CBS1555 were effectively performed, yielding 29.1 and 70.2 g/L ethanol, respectively. In fed-batch fermentation, ethanol productivity was 0.255 g ethanol per gram of dry Jerusalem artichoke biomass, or 0.361 g ethanol per gram of glucose, with a 0.924 g/L/h ethanol productivity. These results show that combining the tuber and the stalk hydrolysate is a useful strategy for whole biomass utilization in effective bioethanol fermentation from Jerusalem artichoke.

  9. A new experimental design method to optimize formulations focusing on a lubricant for hydrophilic matrix tablets.

    PubMed

    Choi, Du Hyung; Shin, Sangmun; Khoa Viet Truong, Nguyen; Jeong, Seong Hoon

    2012-09-01

    A robust experimental design method was developed with the well-established response surface methodology and time series modeling to facilitate the formulation development process with magnesium stearate incorporated into hydrophilic matrix tablets. Two directional analyses and a time-oriented model were utilized to optimize the experimental responses. Evaluations of tablet gelation and drug release were conducted with two factors, x₁ and x₂: a formulation factor (the amount of magnesium stearate) and a processing factor (mixing time), respectively. Moreover, different batch sizes (100- and 500-tablet batches) were also evaluated to investigate the effect of batch size. The selected input control factors were arranged in a mixture simplex lattice design with 13 experimental runs. The optimal settings of magnesium stearate for gelation were 0.46 g with 2.76 min mixing time for a 100-tablet batch and 1.54 g with 6.51 min for a 500-tablet batch. The optimal settings for drug release were 0.33 g with 7.99 min for a 100-tablet batch and 1.54 g with 6.51 min for a 500-tablet batch. The exact ratio and mixing time of magnesium stearate could be formulated according to the resulting hydrophilic matrix tablet properties. The newly designed experimental method provided very useful information for characterizing significant factors and hence for obtaining optimum formulations, allowing for a systematic and reliable experimental design method.
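
    As a generic illustration of the response-surface step described above (synthetic data and an assumed response function, not the study's actual measurements or design points), a quadratic model in the two factors can be fitted by least squares and its stationary point located:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic runs: x1 = magnesium stearate amount (g), x2 = mixing time (min)
x1 = rng.uniform(0.1, 2.0, 13)
x2 = rng.uniform(1.0, 10.0, 13)
# Assumed true response surface (e.g., a gelation score) plus noise.
y = 5 - (x1 - 0.5) ** 2 - 0.05 * (x2 - 3.0) ** 2 + 0.1 * x1 * x2
y += rng.normal(0, 0.05, 13)

# Quadratic response-surface model:
#   y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b12, b11, b22 = b

# Stationary point of the fitted quadratic (solve gradient = 0).
A = np.array([[2 * b11, b12], [b12, 2 * b22]])
opt = np.linalg.solve(A, -np.array([b1, b2]))
print("fitted optimum (x1, x2):", np.round(opt, 2))
```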

  10. Sorption of water alkalinity and hardness from high-strength wastewater on bifunctional activated carbon: process optimization, kinetics and equilibrium studies.

    PubMed

    Amosa, Mutiu K

    2016-08-01

    Sorption optimization and the sorption mechanism of hardness and alkalinity on a bifunctional empty fruit bunch-based powdered activated carbon (PAC) were studied. The PAC possessed both a high surface area and ion-exchange properties, and it was utilized in the treatment of biotreated palm oil mill effluent. Batch adsorption experiments designed with Design Expert(®) were conducted to correlate the individual and interactive effects of three adsorption parameters: PAC dosage, agitation speed and contact time. The sorption trends of the two contaminants were sequentially assessed through a full factorial design with three-factor interaction models and a central composite design with polynomial models of quadratic order. Analysis of variance revealed the significant factors for each design response, with very high R(2) values indicating good agreement between model and experimental values. The optimum operating conditions for the two contaminants differed due to their different regions of operating interest, thus necessitating the use of a desirability factor to obtain consolidated optimum operating conditions. The equilibrium data for alkalinity and hardness sorption were better represented by the Langmuir isotherm, while the pseudo-second-order kinetic model better described the adsorption rates and behavior. It was concluded that chemisorption contributed predominantly to the adsorption process.
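
    A minimal sketch of the isotherm and kinetic fits referred to here, with made-up data (qmax, KL, qe and k2 are the usual Langmuir and pseudo-second-order parameters, not values from the study):

```python
import numpy as np
from scipy.optimize import curve_fit

# --- Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce) ---------------
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])        # equilibrium conc. (mg/L)
qe_obs = np.array([8.5, 14.2, 21.0, 27.5, 31.8])    # uptake (mg/g), illustrative

def langmuir(Ce, qmax, KL):
    return qmax * KL * Ce / (1.0 + KL * Ce)

(qmax, KL), _ = curve_fit(langmuir, Ce, qe_obs, p0=[30.0, 0.05])

# --- Pseudo-second-order kinetics: qt = k2*qe^2*t / (1 + k2*qe*t) ---------
t = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 120.0])  # min
qt_obs = np.array([6.0, 9.8, 14.0, 17.5, 19.6, 20.3])

def pso(t, qe, k2):
    return k2 * qe**2 * t / (1.0 + k2 * qe * t)

(qe_fit, k2), _ = curve_fit(pso, t, qt_obs, p0=[20.0, 0.01])

print(f"Langmuir: qmax={qmax:.1f} mg/g, KL={KL:.3f} L/mg")
print(f"PSO: qe={qe_fit:.1f} mg/g, k2={k2:.4f} g/(mg*min)")
```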

  11. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention. It can ensure safety and provide products with consistent quality. The design of a simulation system for batch process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch process fault diagnosis.

  12. Analysis of filter tuning techniques for sequential orbit determination

    NASA Technical Reports Server (NTRS)

    Lee, T.; Yee, C.; Oza, D.

    1995-01-01

    This paper examines filter tuning techniques for a sequential orbit determination (OD) covariance analysis. Recently, there has been a renewed interest in sequential OD, primarily due to the successful flight qualification of the Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) using Doppler data extracted onboard the Extreme Ultraviolet Explorer (EUVE) spacecraft. TONS computes highly accurate orbit solutions onboard the spacecraft in real time using a sequential filter. As a result of the successful TONS-EUVE flight qualification experiment, the Earth Observing System (EOS) AM-1 Project has selected TONS as the prime navigation system. In addition, sequential OD methods can be used successfully for ground OD. Whether data are processed onboard or on the ground, a sequential OD procedure is generally favored over a batch technique when a real-time automated OD system is desired. Recently, OD covariance analyses were performed for the TONS-EUVE and TONS-EOS missions using the sequential processing options of the Orbit Determination Error Analysis System (ODEAS). ODEAS is the primary covariance analysis system used by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD). The results of these analyses revealed a high sensitivity of the OD solutions to the state process noise filter tuning parameters. The covariance analysis results show that the state estimate error contributions from measurement-related error sources, especially those due to random noise and satellite-to-satellite ionospheric refraction correction errors, increase rapidly as the state process noise increases. These results prompted an in-depth investigation of the role of the filter tuning parameters in sequential OD covariance analysis. This paper analyzes how the spacecraft state estimate errors due to dynamic and measurement-related error sources are affected by the process noise level used. This information is then used to establish guidelines for determining optimal filter tuning parameters in a given sequential OD scenario, for both covariance analysis and actual OD. Comparisons are also made with corresponding definitive OD results available from the TONS-EUVE analysis.
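
    The sensitivity discussed here can be illustrated with a toy scalar example (purely illustrative values, not ODEAS results): sweeping the state process noise q in a one-dimensional Kalman filter and iterating the covariance recursion to steady state shows how the filter's error budget changes with the tuning parameter.

```python
import numpy as np

def steady_state_variance(q, r, phi=1.0, h=1.0, n_iter=2000):
    """Iterate the scalar Kalman covariance recursion to steady state.
    q: process-noise variance, r: measurement-noise variance."""
    P = 1.0
    for _ in range(n_iter):
        P_pred = phi * P * phi + q              # time update
        K = P_pred * h / (h * P_pred * h + r)   # Kalman gain
        P = (1.0 - K * h) * P_pred              # measurement update
    return P

r = 1.0  # fixed measurement-noise variance
for q in [1e-4, 1e-3, 1e-2, 1e-1, 1.0]:
    print(f"q = {q:7.0e}  ->  steady-state error variance = "
          f"{steady_state_variance(q, r):.4f}")
```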

  13. Water reuse in the l-lysine fermentation process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsiao, T.Y.; Glatz, C.E.

    1996-02-05

    L-Lysine is produced commercially by fermentation. As is typical for fermentation processes, a large amount of liquid waste is generated. To minimize the waste, which is mostly the broth effluent from the cation exchange column used for l-lysine recovery, the authors investigated a strategy of recycling a large fraction of this broth effluent to the subsequent fermentation. This was done on a lab-scale process with Corynebacterium glutamicum ATCC 21253 as the l-lysine-producing organism. Broth effluent from a fermentation in a defined medium was able to replace 75% of the water for the subsequent batch; this recycle ratio was maintained for 3 sequential batches without affecting cell mass and l-lysine production. Broth effluent was recycled at a 50% recycle ratio in a fermentation in a complex medium containing beet molasses. The first recycle batch had an 8% lower final l-lysine level, but 8% higher maximum cell mass. In addition to reducing the volume of liquid waste, this recycle strategy has the additional advantage of utilizing the ammonium desorbed from the ion-exchange column as a nitrogen source in the recycle fermentation. The major problem of recycling the effluent from the complex medium was in the cation-exchange operation, where column capacity was 17% lower for the recycle batch. The loss of column capacity probably results from the buildup of cations competing with l-lysine for binding.

  14. Sequential and simultaneous strategies for biorefining of wheat straw using room temperature ionic liquids, xylanases and cellulases.

    PubMed

    Husson, Eric; Auxenfans, Thomas; Herbaut, Mickael; Baralle, Manon; Lambertyn, Virginie; Rakotoarivonina, Harivoni; Rémond, Caroline; Sarazin, Catherine

    2018-03-01

    Sequential and simultaneous strategies for fractionating wheat straw were developed by combining 1-ethyl-3-methyl imidazolium acetate [C2mim][OAc], endo-xylanases from Thermobacillus xylanilyticus and commercial cellulases. After [C2mim][OAc] pretreatment, hydrolysis of wheat straw catalyzed by endo-xylanases led to efficient xylose production with a very competitive yield (97.6 ± 1.3%). Subsequent enzymatic saccharification allowed total degradation of the cellulosic fraction (>99%) to be achieved. These high performances revealed an interesting complementarity of the [C2mim][OAc] and xylanase pretreatments for increasing the enzymatic digestibility of the cellulosic fraction, in agreement with the structural and morphological changes of wheat straw induced by each of these pretreatment steps. In addition, a higher tolerance to [C2mim][OAc] up to 30% v/v was observed for the endo-xylanases from T. xylanilyticus than for the cellulases from T. reesei. Based on this property, a simultaneous strategy combining [C2mim][OAc] and endo-xylanases as a one-batch pretreatment produced xylose with a yield similar to that obtained by the sequential strategy. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. 40 CFR 63.1402 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... properties may vary with time. For a unit operation operated in a batch mode (i.e., batch unit operation... means a unit operation operated in a batch mode. Block means the time period that comprises a single batch cycle. Combustion device burner means a device designed to mix and ignite fuel and air to provide...

  16. 40 CFR 63.1402 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... properties may vary with time. For a unit operation operated in a batch mode (i.e., batch unit operation... means a unit operation operated in a batch mode. Block means the time period that comprises a single batch cycle. Combustion device burner means a device designed to mix and ignite fuel and air to provide...

  17. 40 CFR 63.1402 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... properties may vary with time. For a unit operation operated in a batch mode (i.e., batch unit operation... means a unit operation operated in a batch mode. Block means the time period that comprises a single batch cycle. Combustion device burner means a device designed to mix and ignite fuel and air to provide...

  18. 40 CFR 63.1402 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... properties may vary with time. For a unit operation operated in a batch mode (i.e., batch unit operation... means a unit operation operated in a batch mode. Block means the time period that comprises a single batch cycle. Combustion device burner means a device designed to mix and ignite fuel and air to provide...

  19. 40 CFR 63.1402 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... properties may vary with time. For a unit operation operated in a batch mode (i.e., batch unit operation... means a unit operation operated in a batch mode. Block means the time period that comprises a single batch cycle. Combustion device burner means a device designed to mix and ignite fuel and air to provide...

  20. Acceptance Test Data for BWXT Coated Particle Batches 93172B and 93173B—Defective IPyC and Pyrocarbon Anisotropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunn, John D.; Helmreich, Grant W.; Dyer, John A.

    Coated particle batches J52O-16-93172B and J52O-16-93173B were produced by Babcock and Wilcox Technologies (BWXT) as part of the production campaign for the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program’s AGR-5/6/7 irradiation test in the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), but were not used in the final fuel composite. However, these batches may be used as demonstration production-scale coated particle fuel for other experiments. Each batch was coated in a 150-mm-diameter production-scale fluidized-bed chemical vapor deposition (CVD) furnace. Tristructural isotropic (TRISO) coatings were deposited on 425-μm-nominal-diameter spherical kernels from BWXT lot J52R-16-69317 containing a mixture of 15.5%-enriched uranium carbide and uranium oxide (UCO). The TRISO coatings consisted of four consecutive CVD layers: a ~50% dense carbon buffer layer with 100-μm-nominal thickness, a dense inner pyrolytic carbon (IPyC) layer with 40-μm-nominal thickness, a silicon carbide (SiC) layer with 35-μm-nominal thickness, and a dense outer pyrolytic carbon (OPyC) layer with 40-μm-nominal thickness. The TRISO-coated particle batches were sieved to upgrade the particles by removing over-sized and under-sized material, and the upgraded batches were designated by appending the letter A to the end of the batch number (e.g., 93172A). Secondary upgrading by sieving was performed on the A-designated batches to remove particles with missing or very-thin buffer layers that were identified during previous analysis of the individual batches for defective IPyC, as reported in the acceptance test data report for the AGR-5/6/7 production batches [Hunn et al. 2017b]. The additionally-upgraded batches were designated by appending the letter B to the end of the batch number (e.g., 93172B).

  1. Estimation and identification study for flexible vehicles

    NASA Technical Reports Server (NTRS)

    Jazwinski, A. H.; Englar, T. S., Jr.

    1973-01-01

    Techniques are studied for the estimation of rigid body and bending states and the identification of model parameters associated with the single-axis attitude dynamics of a flexible vehicle. This problem is highly nonlinear but completely observable provided sufficient attitude and attitude rate data is available and provided all system bending modes are excited in the observation interval. A sequential estimator tracks the system states in the presence of model parameter errors. A batch estimator identifies all model parameters with high accuracy.

  2. Sequential batch culture studies for the decolorisation of reactive dye by Coriolus versicolor.

    PubMed

    Sanghi, Rashmi; Dixit, Awantika; Guha, Sauymen

    2006-02-01

    The white rot fungus Coriolus versicolor could decolorise the reactive dye Remazol Brilliant Violet by almost 90%. The fungal mycelia removed color and COD by up to 95% and 75%, respectively, in a batch reactor. Decolorising activity was observed during the repeated reuse of the fungus. It was possible to substantially increase the dye-decolorising activity of the fungus by carefully selecting operational conditions such as media composition, age of the fungus and nitrogen source. The fungal pellets could be used for eight cycles during long-term operation, where the medium and dye were replenished at the end of each cycle and the fungus was recycled. The presence of a nitrogen source and the nutrient content of the media played an important role in sustaining the decolorisation activity of the fungus. The form of the nitrogen source (e.g. peptone vs. urea) was also important in maintaining the decolorising activity, with peptone showing better decolorisation.

  3. MBASIC batch processor architectural overview

    NASA Technical Reports Server (NTRS)

    Reynolds, S. M.

    1978-01-01

    The MBASIC (TM) batch processor, a language translator designed to operate in the MBASIC (TM) environment, is described. Features include: (1) a CONVERT TO BATCH command, usable from the ready mode; and (2) translation of the user's program in stages through several levels of intermediate language and optimization. The processor is to be designed and implemented in both machine-independent and machine-dependent sections. The architecture is planned so that optimization processes are transparent to the rest of the system and need not be included in the first design-implementation cycle.

  4. A Bayesian sequential design using alpha spending function to control type I error.

    PubMed

    Zhu, Han; Yu, Qingzhao

    2017-10-01

    We propose in this article a Bayesian sequential design using alpha spending functions to control the overall type I error in phase III clinical trials. We provide algorithms to calculate critical values, power, and sample sizes for the proposed design. Sensitivity analysis is implemented to check the effects of different prior distributions, and conservative priors are recommended. We compare the power and actual sample sizes of the proposed Bayesian sequential design with different alpha spending functions through simulations. We also compare the power of the proposed method with a frequentist sequential design using the same alpha spending function. Simulations show that, at the same sample size, the proposed method provides larger power than the corresponding frequentist sequential design. It also has larger power than a traditional Bayesian sequential design which sets equal critical values for all interim analyses. When compared with other alpha spending functions, the O'Brien-Fleming alpha spending function has the largest power and is the most conservative, in the sense that at the same sample size the null hypothesis is least likely to be rejected at an early stage of the trial. Finally, we show that adding a stop-for-futility step in the Bayesian sequential design can reduce the overall type I error and the actual sample sizes.
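
    As a small illustration of the alpha-spending idea underlying such designs, the sketch below computes only the cumulative and incremental type I error spent at each interim look for the O'Brien-Fleming-type spending function; turning the increments into exact group-sequential critical values additionally requires the joint distribution of the interim test statistics, which is omitted here. The information fractions are illustrative.

```python
from math import sqrt
from scipy.stats import norm

def obrien_fleming_spending(t, alpha=0.05):
    """O'Brien-Fleming-type spending function (Lan-DeMets):
    alpha(t) = 2 * (1 - Phi(z_{alpha/2} / sqrt(t))) for information fraction t."""
    z = norm.ppf(1.0 - alpha / 2.0)
    return 2.0 * (1.0 - norm.cdf(z / sqrt(t)))

looks = [0.25, 0.50, 0.75, 1.00]   # information fractions of the analyses
spent_prev = 0.0
for t in looks:
    spent = obrien_fleming_spending(t)
    print(f"t = {t:.2f}: cumulative alpha = {spent:.5f}, "
          f"incremental alpha = {spent - spent_prev:.5f}")
    spent_prev = spent
```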

  5. Quality-by-Design approach to monitor the operation of a batch bioreactor in an industrial avian vaccine manufacturing process.

    PubMed

    Largoni, Martina; Facco, Pierantonio; Bernini, Donatella; Bezzo, Fabrizio; Barolo, Massimiliano

    2015-10-10

    Monitoring batch bioreactors is a complex task, because several sources of variability can affect a running batch and impact the final product quality. Additionally, the product quality itself may not be measurable on line, but requires sampling and lab analysis taking several days to complete. In this study we show that, by using appropriate process analytical technology tools, the operation of an industrial batch bioreactor used in avian vaccine manufacturing can be effectively monitored as the batch progresses. Multivariate statistical models are built from historical databases of batches already completed, and they are used to enable the real-time identification of the variability sources, to reliably predict the final product quality, and to improve process understanding, paving the way to a reduction of final product rejections as well as of the product cycle time. It is also shown that the product quality "builds up" mainly during the first half of a batch, suggesting on the one side that reducing the variability during this period is crucial, and on the other side that the batch length can possibly be shortened. Overall, the study demonstrates that, by using a Quality-by-Design approach centered on the appropriate use of mathematical modeling, quality can indeed be built "by design" into the final product, whereas the role of end-point product testing can progressively lose its importance in product manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
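
    A minimal sketch of one common form of such multivariate statistical models (batch-wise unfolded trajectories, PCA fitted on historical good batches, Hotelling T² for a new batch); the PCA/T² choice, data shapes and control limit below are illustrative assumptions, not the industrial model described in the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import f as f_dist

rng = np.random.default_rng(3)

# Historical "good" batches: 30 batches x (10 variables * 50 time points),
# i.e. batch-wise unfolded trajectories (synthetic data for illustration).
n_batches, n_features = 30, 10 * 50
X = rng.normal(size=(n_batches, n_features))
mu, sd = X.mean(axis=0), X.std(axis=0) + 1e-12

pca = PCA(n_components=3).fit((X - mu) / sd)

def hotelling_t2(x_new):
    """Hotelling T^2 of a new (completed) batch in the PCA score space."""
    scores = pca.transform(((x_new - mu) / sd).reshape(1, -1))[0]
    return float(np.sum(scores**2 / pca.explained_variance_))

# Approximate 95% control limit for T^2 with a PCA model built on n batches.
a, n = pca.n_components_, n_batches
t2_limit = a * (n - 1) * (n + 1) / (n * (n - a)) * f_dist.ppf(0.95, a, n - a)

x_new = rng.normal(size=n_features)          # a new batch to be judged
print(hotelling_t2(x_new), "limit:", round(t2_limit, 2))
```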

  6. Continuous-flow ultrasound assisted oxidative desulfurization (UAOD) process: An efficient diesel treatment by injection of the aqueous phase.

    PubMed

    Rahimi, Masoud; Shahhosseini, Shahrokh; Movahedirad, Salman

    2017-11-01

    A new continuous-flow ultrasound assisted oxidative desulfurization (UAOD) process was developed in order to decrease energy and aqueous-phase consumption. In this process the aqueous phase is injected below the horn tip, leading to enhanced mixing of the phases. Diesel fuel with a sulfur content of 1550 ppmw was used as the oil phase, and an appropriate mixture of hydrogen peroxide and formic acid as the aqueous phase. In the first step, the optimized conditions for sulfur removal were obtained in batch-mode operation. Hence, the effects of the most important oxidation parameters, the oxidant-to-sulfur molar ratio, the acid-to-sulfur molar ratio and the sonication time, were investigated, and the optimized conditions were obtained using the Response Surface Methodology (RSM) technique. Afterwards, experiments corresponding to the best batch condition, and with the objective of minimizing the residence time and the aqueous phase-to-fuel volume ratio, were conducted in a newly designed double-compartment reactor with injection of the aqueous phase, to evaluate the process in continuous-flow operation. In addition, the effect of the nozzle diameter was examined. A significant improvement in sulfur removal was observed, especially at lower sonication times, for the dispersion method in comparison with conventional contact between the two phases. Finally, the flow pattern induced by the ultrasonic device, and the injection of the aqueous phase, were analyzed quantitatively and qualitatively by capturing sequential images. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Hybrid Computerized Adaptive Testing: From Group Sequential Design to Fully Sequential Design

    ERIC Educational Resources Information Center

    Wang, Shiyu; Lin, Haiyan; Chang, Hua-Hua; Douglas, Jeff

    2016-01-01

    Computerized adaptive testing (CAT) and multistage testing (MST) have become two of the most popular modes in large-scale computer-based sequential testing. Though most designs of CAT and MST exhibit strengths and weaknesses in recent large-scale implementations, there is no simple answer to the question of which design is better because different…

  8. Between-Batch Pharmacokinetic Variability Inflates Type I Error Rate in Conventional Bioequivalence Trials: A Randomized Advair Diskus Clinical Trial.

    PubMed

    Burmeister Getz, E; Carroll, K J; Mielke, J; Benet, L Z; Jones, B

    2017-03-01

    We previously demonstrated pharmacokinetic differences among manufacturing batches of a US Food and Drug Administration (FDA)-approved dry powder inhalation product (Advair Diskus 100/50) large enough to establish between-batch bio-inequivalence. Here, we provide independent confirmation of pharmacokinetic bio-inequivalence among Advair Diskus 100/50 batches, and quantify residual and between-batch variance component magnitudes. These variance estimates are used to consider the type I error rate of the FDA's current two-way crossover design recommendation. When between-batch pharmacokinetic variability is substantial, the conventional two-way crossover design cannot accomplish the objectives of FDA's statistical bioequivalence test (i.e., cannot accurately estimate the test/reference ratio and associated confidence interval). The two-way crossover, which ignores between-batch pharmacokinetic variability, yields an artificially narrow confidence interval on the product comparison. The unavoidable consequence is type I error rate inflation, to ∼25%, when between-batch pharmacokinetic variability is nonzero. This risk of a false bioequivalence conclusion is substantially higher than asserted by regulators as acceptable consumer risk (5%). © 2016 The Authors Clinical Pharmacology & Therapeutics published by Wiley Periodicals, Inc. on behalf of The American Society for Clinical Pharmacology and Therapeutics.
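
    A rough simulation sketch (with assumed parameter values, not the trial's data) of the mechanism described above: when a single test batch and a single reference batch are drawn for an entire two-way crossover, the batch draw shifts the observed ratio while the confidence interval reflects only within-subject variability, so bioequivalence is falsely concluded more often than the nominal 5%.

    # Type I error inflation of a simplified two-way crossover BE test when batch variance is ignored
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n, n_sim = 24, 5000
    sd_within, sd_batch = 0.25, 0.15           # log-scale within-subject and between-batch SDs (assumed)
    true_gmr = 1.25                            # product-level ratio sits exactly on the BE margin (null)

    false_pass = 0
    for _ in range(n_sim):
        # one test batch and one reference batch are drawn for the whole trial
        batch_shift = rng.normal(0, sd_batch) - rng.normal(0, sd_batch)
        # per-subject log(test) - log(reference) differences within the crossover
        d = np.log(true_gmr) + batch_shift + rng.normal(0, np.sqrt(2) * sd_within, n)
        se = d.std(ddof=1) / np.sqrt(n)
        lo, hi = d.mean() + np.array([-1, 1]) * stats.t.ppf(0.95, n - 1) * se
        if np.log(0.8) < lo and hi < np.log(1.25):   # 90% CI within BE limits -> declare BE
            false_pass += 1
    print(f"empirical rate of false BE conclusions: {false_pass / n_sim:.2%} (nominal 5%)")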

  9. ARTS: automated randomization of multiple traits for study design.

    PubMed

    Maienschein-Cline, Mark; Lei, Zhengdeng; Gardeux, Vincent; Abbasi, Taimur; Machado, Roberto F; Gordeuk, Victor; Desai, Ankit A; Saraf, Santosh; Bahroos, Neil; Lussier, Yves

    2014-06-01

    Collecting data from large studies on high-throughput platforms, such as microarray or next-generation sequencing, typically requires processing samples in batches. There are often systematic but unpredictable biases from batch-to-batch, so proper randomization of biologically relevant traits across batches is crucial for distinguishing true biological differences from experimental artifacts. When a large number of traits are biologically relevant, as is common for clinical studies of patients with varying sex, age, genotype and medical background, proper randomization can be extremely difficult to prepare by hand, especially because traits may affect biological inferences, such as differential expression, in a combinatorial manner. Here we present ARTS (automated randomization of multiple traits for study design), which aids researchers in study design by automatically optimizing batch assignment for any number of samples, any number of traits and any batch size. ARTS is implemented in Perl and is available at github.com/mmaiensc/ARTS. ARTS is also available in the Galaxy Tool Shed, and can be used at the Galaxy installation hosted by the UIC Center for Research Informatics (CRI) at galaxy.cri.uic.edu. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
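
    A small illustrative sketch of the underlying idea (not the ARTS implementation itself, which is written in Perl): candidate batch assignments are generated by random shuffling and the assignment that minimizes trait imbalance across batches is kept. Sample traits, sizes and the imbalance score are synthetic assumptions.

    # Random-search batch assignment that balances categorical traits across batches
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    samples = pd.DataFrame({
        "sex": rng.choice(["F", "M"], 96),
        "genotype": rng.choice(["AA", "AB", "BB"], 96),
    })
    batch_size, n_batches = 24, 4

    def imbalance(assign):
        """Sum over traits of the spread of category counts across batches."""
        score = 0.0
        for trait in samples.columns:
            counts = pd.crosstab(assign, samples[trait])
            score += (counts.max(axis=0) - counts.min(axis=0)).sum()
        return score

    best, best_score = None, np.inf
    for _ in range(2000):
        assign = rng.permutation(np.repeat(np.arange(n_batches), batch_size))
        s = imbalance(assign)
        if s < best_score:
            best, best_score = assign, s
    print("best imbalance score:", best_score)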

  10. Results of Hg speciation testing on DWPF SMECT-8, OGCT-1, AND OGCT-2 samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bannochie, C.

    2016-02-22

    The Savannah River National Laboratory (SRNL) was tasked with preparing and shipping samples for Hg speciation by Eurofins Frontier Global Sciences, Inc. in Seattle, WA on behalf of the Savannah River Remediation (SRR) Mercury Task Team. The sixteenth shipment of samples was designated to include a Defense Waste Processing Facility (DWPF) Slurry Mix Evaporator Condensate Tank (SMECT) sample from Sludge Receipt and Adjustment Tank (SRAT) Batch 738 processing and two Off-Gas Condensate Tank (OGCT) samples, one following Batch 736 and one following Batch 738. The DWPF sample designations for the three samples analyzed are provided. The Batch 738 ‘End of SME Cycle’ SMECT sample was taken at the conclusion of Slurry Mix Evaporator (SME) operations for this batch and represents the fourth SMECT sample examined from Batch 738. Batch 738 experienced a sludge slurry carryover event, which introduced sludge solids to the SMECT that were particularly evident in the SMECT-5 sample, but less evident in the ‘End of SME Cycle’ SMECT-8 sample.

  11. Improving cellulase productivity of Penicillium oxalicum RE-10 by repeated fed-batch fermentation strategy.

    PubMed

    Han, Xiaolong; Song, Wenxia; Liu, Guodong; Li, Zhonghai; Yang, Piao; Qu, Yinbo

    2017-03-01

    Medium optimization and repeated fed-batch fermentation were performed to improve the cellulase productivity by P. oxalicum RE-10 in submerged fermentation. First, Plackett-Burman design (PBD) and central composite design (CCD) were used to optimize the medium for cellulase production. PBD demonstrated wheat bran and NaNO3 had significant influences on cellulase production. The CCD results showed the maximum filter paper activity (FPA) production of 8.61U/mL could be achieved in Erlenmeyer flasks. The maximal FPA reached 12.69U/mL by submerged batch fermentation in a 7.5-L stirred tank, 1.76-fold higher than that on the original medium. Then, the repeated fed-batch fermentation strategy was performed successfully for increasing the cellulase productivity from 105.75U/L/h in batch fermentation to 158.38U/L/h. The cellulase activity and the glucan conversion of delignified corn cob residue hydrolysis had no significant difference between the enzymes sampled from different cycles of the repeated fed-batch fermentation and that from batch culture. Copyright © 2016 Elsevier Ltd. All rights reserved.
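
    A minimal sketch of constructing the kind of central composite design used in the optimization step; the two factors, their coded-to-real mapping and the ranges are illustrative assumptions, not the study's settings.

    # Rotatable two-factor central composite design: factorial, axial and center points
    import itertools
    import numpy as np

    def ccd_two_factors(alpha=np.sqrt(2), n_center=5):
        factorial = np.array(list(itertools.product([-1, 1], repeat=2)), dtype=float)
        axial = np.array([[alpha, 0], [-alpha, 0], [0, alpha], [0, -alpha]])
        center = np.zeros((n_center, 2))
        return np.vstack([factorial, axial, center])

    design = ccd_two_factors()
    # map coded levels to real settings, e.g. wheat bran (g/L) and NaNO3 (g/L) -- assumed ranges
    wheat_bran = 30 + 10 * design[:, 0]
    nano3 = 3 + 1 * design[:, 1]
    for wb, na in zip(wheat_bran, nano3):
        print(f"wheat bran {wb:5.1f} g/L, NaNO3 {na:4.2f} g/L")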

  12. Modulation and modeling of monoclonal antibody N-linked glycosylation in mammalian cell perfusion reactors.

    PubMed

    Karst, Daniel J; Scibona, Ernesto; Serra, Elisa; Bielser, Jean-Marc; Souquet, Jonathan; Stettler, Matthieu; Broly, Hervé; Soos, Miroslav; Morbidelli, Massimo; Villiger, Thomas K

    2017-09-01

    Mammalian cell perfusion cultures are gaining renewed interest as an alternative to traditional fed-batch processes for the production of therapeutic proteins, such as monoclonal antibodies (mAb). The steady state operation at high viable cell density allows the continuous delivery of antibody product with increased space-time yield and reduced in-process variability of critical product quality attributes (CQA). In particular, the production of a confined mAb N-linked glycosylation pattern has the potential to increase therapeutic efficacy and bioactivity. In this study, we show that accurate control of flow rates, media composition and cell density of a Chinese hamster ovary (CHO) cell perfusion bioreactor allowed the production of a constant glycosylation profile for over 20 days. Steady state was reached after an initial transition phase of 6 days required for the stabilization of extra- and intracellular processes. The possibility to modulate the glycosylation profile was further investigated in a Design of Experiment (DoE), at different viable cell density and media supplement concentrations. This strategy was implemented in a sequential screening approach, where various steady states were achieved sequentially during one culture. It was found that, whereas high ammonia levels reached at high viable cell densities (VCD) values inhibited the processing to complex glycan structures, the supplementation of either galactose, or manganese as well as their synergy significantly increased the proportion of complex forms. The obtained experimental data set was used to compare the reliability of a statistical response surface model (RSM) to a mechanistic model of N-linked glycosylation. The latter outperformed the response surface predictions with respect to its capability and reliability in predicting the system behavior (i.e., glycosylation pattern) outside the experimental space covered by the DoE design used for the model parameter estimation. Therefore, we can conclude that the modulation of glycosylation in a sequential steady state approach in combination with mechanistic model represents an efficient and rational strategy to develop continuous processes with desired N-linked glycosylation patterns. Biotechnol. Bioeng. 2017;114: 1978-1990. © 2017 Wiley Periodicals, Inc. © 2017 Wiley Periodicals, Inc.

  13. A high-throughput media design approach for high performance mammalian fed-batch cultures

    PubMed Central

    Rouiller, Yolande; Périlleux, Arnaud; Collet, Natacha; Jordan, Martin; Stettler, Matthieu; Broly, Hervé

    2013-01-01

    An innovative high-throughput medium development method based on media blending was successfully used to improve the performance of a Chinese hamster ovary fed-batch medium in shaking 96-deepwell plates. Starting from a proprietary chemically-defined medium, 16 formulations testing 43 of 47 components at 3 different levels were designed. Media blending was performed following a custom-made mixture design of experiments considering binary blends, resulting in 376 different blends that were tested during both cell expansion and fed-batch production phases in one single experiment. Three approaches were chosen to provide the best output of the large amount of data obtained. A simple ranking of conditions was first used as a quick approach to select new formulations with promising features. Then, prediction of the best mixes was done to maximize both growth and titer using the Design Expert software. Finally, a multivariate analysis enabled identification of individual potential critical components for further optimization. Applying this high-throughput method on a fed-batch, rather than on a simple batch, process opens new perspectives for medium and feed development that enables identification of an optimized process in a short time frame. PMID:23563583

  14. Multi-point objective-oriented sequential sampling strategy for constrained robust design

    NASA Astrophysics Data System (ADS)

    Zhu, Ping; Zhang, Siliang; Chen, Wei

    2015-03-01

    Metamodelling techniques are widely used to approximate system responses of expensive simulation models. In association with the use of metamodels, objective-oriented sequential sampling methods have been demonstrated to be effective in balancing the need for searching an optimal solution versus reducing the metamodelling uncertainty. However, existing infilling criteria are developed for deterministic problems and restricted to one sampling point in one iteration. To exploit the use of multiple samples and identify the true robust solution in fewer iterations, a multi-point objective-oriented sequential sampling strategy is proposed for constrained robust design problems. In this article, earlier development of objective-oriented sequential sampling strategy for unconstrained robust design is first extended to constrained problems. Next, a double-loop multi-point sequential sampling strategy is developed. The proposed methods are validated using two mathematical examples followed by a highly nonlinear automotive crashworthiness design example. The results show that the proposed method can mitigate the effect of both metamodelling uncertainty and design uncertainty, and identify the robust design solution more efficiently than the single-point sequential sampling approach.
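
    A simplified single-point sketch of objective-oriented sequential sampling with a metamodel: a Gaussian-process surrogate is fit to the current samples and the next sample is chosen to maximize expected improvement. The article's multi-point, constrained and robustness-oriented criterion is more elaborate; the test function and settings below are made up for illustration.

    # Sequential infill sampling of a surrogate model using expected improvement (minimization)
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def expensive_sim(x):                      # stand-in for the expensive simulation model
        return np.sin(3 * x) + 0.5 * x

    rng = np.random.default_rng(4)
    X = rng.uniform(0, 3, 5).reshape(-1, 1)    # initial space-filling samples
    y = expensive_sim(X).ravel()

    for _ in range(10):                        # sequential infill iterations
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6).fit(X, y)
        cand = np.linspace(0, 3, 400).reshape(-1, 1)
        mu, sd = gp.predict(cand, return_std=True)
        imp = y.min() - mu                     # improvement over the current best
        ei = imp * norm.cdf(imp / (sd + 1e-12)) + sd * norm.pdf(imp / (sd + 1e-12))
        x_new = cand[np.argmax(ei)]
        X = np.vstack([X, x_new.reshape(1, -1)])
        y = np.append(y, expensive_sim(x_new)[0])

    print("best sampled x:", X[np.argmin(y)].item(), "objective:", y.min())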

  15. Statistical Engineering in Air Traffic Management Research

    NASA Technical Reports Server (NTRS)

    Wilson, Sara R.

    2015-01-01

    NASA is working to develop an integrated set of advanced technologies to enable efficient arrival operations in high-density terminal airspace for the Next Generation Air Transportation System. This integrated arrival solution is being validated and verified in laboratories and transitioned to a field prototype for an operational demonstration at a major U.S. airport. Within NASA, this is a collaborative effort between Ames and Langley Research Centers involving a multi-year iterative experimentation process. Designing and analyzing a series of sequential batch computer simulations and human-in-the-loop experiments across multiple facilities and simulation environments involves a number of statistical challenges. Experiments conducted in separate laboratories typically have different limitations and constraints, and can take different approaches with respect to the fundamental principles of statistical design of experiments. This often makes it difficult to compare results from multiple experiments and incorporate findings into the next experiment in the series. A statistical engineering approach is being employed within this project to support risk-informed decision making and maximize the knowledge gained within the available resources. This presentation describes a statistical engineering case study from NASA, highlights statistical challenges, and discusses areas where existing statistical methodology is adapted and extended.

  16. An approach to optimize the batch mixing process for improving the quality consistency of the products made from traditional Chinese medicines*

    PubMed Central

    Yan, Bin-jun; Qu, Hai-bin

    2013-01-01

    The efficacy of traditional Chinese medicine (TCM) is based on the combined effects of its constituents. Variation in chemical composition between batches of TCM has always been the deterring factor in achieving consistency in efficacy. The batch mixing process can significantly reduce the batch-to-batch quality variation in TCM extracts by mixing them in a well-designed proportion. However, reducing the quality variation without sacrificing too much of the production efficiency is one of the challenges. Accordingly, an innovative and practical batch mixing method aimed at providing acceptable efficiency for industrial production of TCM products is proposed in this work, which uses a minimum number of batches of extracts to meet the content limits. The important factors affecting the utilization ratio of the extracts (URE) were studied by simulations. The results have shown that URE was affected by the correlation between the contents of constituents, and URE decreased with the increase in the number of targets and the relative standard deviations of the contents. URE could be increased by increasing the number of storage tanks. The results have provided a reference for designing the batch mixing process. The proposed method has possible application value in reducing the quality variation in TCM and providing acceptable production efficiency simultaneously. PMID:24190450

  17. An approach to optimize the batch mixing process for improving the quality consistency of the products made from traditional Chinese medicines.

    PubMed

    Yan, Bin-jun; Qu, Hai-bin

    2013-11-01

    The efficacy of traditional Chinese medicine (TCM) is based on the combined effects of its constituents. Variation in chemical composition between batches of TCM has always been the deterring factor in achieving consistency in efficacy. The batch mixing process can significantly reduce the batch-to-batch quality variation in TCM extracts by mixing them in a well-designed proportion. However, reducing the quality variation without sacrificing too much of the production efficiency is one of the challenges. Accordingly, an innovative and practical batch mixing method aimed at providing acceptable efficiency for industrial production of TCM products is proposed in this work, which uses a minimum number of batches of extracts to meet the content limits. The important factors affecting the utilization ratio of the extracts (URE) were studied by simulations. The results have shown that URE was affected by the correlation between the contents of constituents, and URE decreased with the increase in the number of targets and the relative standard deviations of the contents. URE could be increased by increasing the number of storage tanks. The results have provided a reference for designing the batch mixing process. The proposed method has possible application value in reducing the quality variation in TCM and providing acceptable production efficiency simultaneously.
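
    An illustrative sketch of the blending idea (not the authors' algorithm): find the smallest set of extract batches whose equal-proportion blend satisfies the content limits for each marker constituent. Batch contents and limits below are synthetic.

    # Exhaustive search for the minimum number of batches whose blend meets content limits
    import itertools
    import numpy as np

    rng = np.random.default_rng(5)
    contents = rng.normal(loc=[10.0, 4.0], scale=[1.5, 0.8], size=(12, 2))  # 12 batches x 2 constituents
    lower, upper = np.array([9.5, 3.8]), np.array([10.5, 4.2])              # content limits (targets)

    def smallest_feasible_blend(contents, lower, upper, max_k=5):
        for k in range(1, max_k + 1):
            for combo in itertools.combinations(range(len(contents)), k):
                blend = contents[list(combo)].mean(axis=0)
                if np.all(blend >= lower) and np.all(blend <= upper):
                    return combo, blend
        return None, None                       # no feasible blend within max_k batches

    combo, blend = smallest_feasible_blend(contents, lower, upper)
    print("batches mixed:", combo, "blend contents:", np.round(blend, 2))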

  18. Parallel experimental design and multivariate analysis provides efficient screening of cell culture media supplements to improve biosimilar product quality.

    PubMed

    Brühlmann, David; Sokolov, Michael; Butté, Alessandro; Sauer, Markus; Hemberger, Jürgen; Souquet, Jonathan; Broly, Hervé; Jordan, Martin

    2017-07-01

    Rational and high-throughput optimization of mammalian cell culture media has a great potential to modulate recombinant protein product quality. We present a process design method based on parallel design-of-experiment (DoE) of CHO fed-batch cultures in 96-deepwell plates to modulate monoclonal antibody (mAb) glycosylation using medium supplements. To reduce the risk of losing valuable information in an intricate joint screening, 17 compounds were separated into five different groups, considering their mode of biological action. The concentration ranges of the medium supplements were defined according to information encountered in the literature and in-house experience. The screening experiments produced wide glycosylation pattern ranges. Multivariate analysis including principal component analysis and decision trees was used to select the best performing glycosylation modulators. Subsequent D-optimal quadratic design with four factors (three promising compounds and temperature shift) in shake tubes confirmed the outcome of the selection process and provided a solid basis for sequential process development at a larger scale. The glycosylation profile with respect to the specifications for biosimilarity was greatly improved in shake tube experiments: 75% of the conditions were equally close or closer to the specifications for biosimilarity than the best 25% in 96-deepwell plates. Biotechnol. Bioeng. 2017;114: 1448-1458. © 2017 Wiley Periodicals, Inc. © 2017 Wiley Periodicals, Inc.

  19. SymDex: increasing the efficiency of chemical fingerprint similarity searches for comparing large chemical libraries by using query set indexing.

    PubMed

    Tai, David; Fang, Jianwen

    2012-08-27

    The large sizes of today's chemical databases require efficient algorithms to perform similarity searches. It can be very time consuming to compare two large chemical databases. This paper seeks to build upon existing research efforts by describing a novel strategy for accelerating existing search algorithms for comparing large chemical collections. The quest for efficiency has focused on developing better indexing algorithms by creating heuristics for searching an individual chemical against a chemical library by detecting and eliminating needless similarity calculations. For comparing two chemical collections, these algorithms simply execute searches for each chemical in the query set sequentially. The strategy presented in this paper achieves a speedup over these algorithms by indexing the set of all query chemicals so that redundant calculations that arise in the case of sequential searches are eliminated. We implement this novel algorithm by developing a similarity search program called Symmetric inDexing or SymDex. SymDex shows over a 232% maximum speedup compared to the state-of-the-art single query search algorithm over real data for various fingerprint lengths. Considerable speedup is even seen for batch searches where query set sizes are relatively small compared to typical database sizes. To the best of our knowledge, SymDex is the first search algorithm designed specifically for comparing chemical libraries. It can be adapted to most, if not all, existing indexing algorithms and shows potential for accelerating future similarity search algorithms for comparing chemical databases.
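
    A simplified sketch of the kind of pruning such fingerprint searches rely on (this is the classic popcount bound, not SymDex's symmetric query-set index itself): since Tanimoto(A, B) ≤ min(|A|, |B|)/max(|A|, |B|), sorting the query set by bit count lets whole blocks of queries be skipped for a given database fingerprint. Fingerprints, sizes and the threshold are synthetic.

    # Tanimoto search of a query set against a database with popcount-based pruning
    import numpy as np

    rng = np.random.default_rng(9)
    n_bits, threshold = 256, 0.8
    db = rng.random((1000, n_bits)) < 0.1        # database fingerprints (boolean)
    queries = rng.random((200, n_bits)) < 0.1    # query-set fingerprints

    q_counts = queries.sum(axis=1)
    order = np.argsort(q_counts)                 # index the query set by popcount
    queries, q_counts = queries[order], q_counts[order]

    hits = []
    for j, fp in enumerate(db):
        c = fp.sum()
        # only queries whose popcount lies within [threshold*c, c/threshold] can reach the threshold
        lo = np.searchsorted(q_counts, int(np.ceil(threshold * c)))
        hi = np.searchsorted(q_counts, int(np.floor(c / threshold)), side="right")
        for i in range(lo, hi):
            inter = np.logical_and(queries[i], fp).sum()
            tani = inter / (q_counts[i] + c - inter)
            if tani >= threshold:
                hits.append((order[i], j, round(float(tani), 3)))
    print(len(hits), "pairs above threshold")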

  20. Fluidized bed coal desulfurization

    NASA Technical Reports Server (NTRS)

    Ravindram, M.

    1983-01-01

    Laboratory scale experiments were conducted on two high volatile bituminous coals in a bench scale batch fluidized bed reactor. Chemical pretreatment and posttreatment of coals were tried as a means of enhancing desulfurization. Sequential chlorination and dechlorination cum hydrodesulfurization under modest conditions relative to the water slurry process were found to result in substantial sulfur reductions of about 80%. Sulfur forms as well as proximate and ultimate analyses of the processed coals are included. These studies indicate that a fluidized bed reactor process has considerable potential for being developed into a simple and economic process for coal desulfurization.

  1. Wall Interference in Two-Dimensional Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Kemp, William B., Jr.

    1986-01-01

    Viscosity and tunnel-wall constraints introduced via boundary conditions. TWINTN4 computer program developed to implement method of posttest assessment of wall interference in two-dimensional wind tunnels. Offers two methods for combining sidewall boundary-layer effects with upper and lower wall interference. In sequential procedure, Sewall method used to define flow free of sidewall effects, then assessed for upper and lower wall effects. In unified procedure, wind-tunnel flow equations altered to incorporate effects from all four walls at once. Program written in FORTRAN IV for batch execution.

  2. Gut-Bioreactor and Human Health in Future.

    PubMed

    Purohit, Hemant J

    2018-03-01

    The gut microbiome provides complementary metabolic potential to the human system. To understand the active participation and performance of the microbial community in human health, the concept of the gut as a plug-flow reactor operated in fed-batch mode can provide better insight. The concept suggests a virtually compartmentalized gut with sequential stratification of the microbial community in response to a typical host genotype. It also provides an analysis plan for the gut microbiome and its relevance in developing health management options under the identified clinical conditions.

  3. Formulation of advanced consumables management models: Environmental control and electrical power system performance models requirements

    NASA Technical Reports Server (NTRS)

    Daly, J. K.; Torian, J. G.

    1979-01-01

    Software design specifications for developing environmental control and life support system (ECLSS) and electrical power system (EPS) programs into interactive computer programs are presented. Specifications for the ECLSS program are at the detail design level with respect to modification of an existing batch mode program. The FORTRAN environmental analysis routines (FEAR) are the subject batch mode program. The characteristics of the FEAR program are included for use in modifying batch mode programs to form interactive programs. The EPS program specifications are at the preliminary design level. Emphasis is on top-down structuring in the development of an interactive program.

  4. Acceptance Test Data for the AGR-5/6/7 Irradiation Test Fuel Composite Defective IPyC Fraction and Pyrocarbon Anisotropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helmreich, Grant W.; Hunn, John D.; Skitt, Darren J.

    Coated particle composite J52R-16-98005 was produced by Babcock and Wilcox Technologies (BWXT) as fuel for the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program’s AGR-5/6/7 irradiation test in the Idaho National Laboratory (INL) Advanced Test Reactor (ATR). This composite was comprised of four coated particle fuel batches J52O-16-93165B (26%), 93168B (26%), 93169B (24%), and 93170B (24%), chosen based on the Quality Control (QC) data acquired for each individual candidate AGR-5/6/7 batch. Each batch was coated in a 150-mm-diameter production-scale fluidized-bed chemical vapor deposition (CVD) furnace. Tristructural isotropic (TRISO) coatings were deposited on 425-μm-nominal-diameter spherical kernels from BWXT Lot J52R-16-69317 containing a mixture of 15.5%-enriched uranium carbide and uranium oxide (UCO). The TRISO coatings consisted of four consecutive CVD layers: a ~50% dense carbon buffer layer with 100-μm-nominal thickness, a dense inner pyrolytic carbon (IPyC) layer with 40-μm-nominal thickness, a silicon carbide (SiC) layer with 35-μm-nominal thickness, and a dense outer pyrolytic carbon (OPyC) layer with 40-μm-nominal thickness. The TRISO-coated particle batches were sieved to upgrade the particles by removing over-sized and under-sized material, and the upgraded batches were designated by appending the letter A to the end of the batch number (e.g., 93165A). Secondary upgrading by sieving was performed on the A-designated batches to remove particles with missing or very-thin buffer layers that were identified during previous analysis of the individual batches for defective IPyC, as reported in the acceptance test data report for the AGR-5/6/7 production batches [Hunn et al. 2017]. The additionally-upgraded batches were designated by appending the letter B to the end of the batch number (e.g., 93165B).

  5. Introducing a Model for Optimal Design of Sequential Objective Structured Clinical Examinations

    ERIC Educational Resources Information Center

    Mortaz Hejri, Sara; Yazdani, Kamran; Labaf, Ali; Norcini, John J.; Jalili, Mohammad

    2016-01-01

    In a sequential OSCE, which has been suggested to reduce testing costs, candidates take a short screening test, and those who fail the test are asked to take the full OSCE. In order to introduce an effective and accurate sequential design, we developed a model for designing and evaluating screening OSCEs. Based on two datasets from a 10-station…

  6. In vitro growth of Curcuma longa L. in response to five mineral elements and plant density in fed-batch culture systems.

    PubMed

    El-Hawaz, Rabia F; Bridges, William C; Adelberg, Jeffrey W

    2015-01-01

    Plant density was varied with P, Ca, Mg, and KNO3 in a multifactor experiment to improve Curcuma longa L. micropropagation, biomass and microrhizome development in fed-batch liquid culture. The experiment had two paired D-optimal designs, testing sucrose fed-batch and nutrient sucrose fed-batch techniques. When sucrose became depleted, volume was restored to 5% m/v sucrose in 200 ml of modified liquid MS medium by adding sucrose solutions. Similarly, nutrient sucrose fed-batch was restored to set points with double concentration of treatments' macronutrient and MS micronutrient solutions, along with sucrose solutions. Changes in the amounts of water and sucrose supplementations were driven by the interaction of P and KNO3 concentrations. Increasing P from 1.25 to 6.25 mM increased both multiplication and biomass. The multiplication ratio was greatest in the nutrient sucrose fed-batch technique with the highest level of P, 6 buds/vessel, and the lowest level of Ca and KNO3. The highest density (18 buds/vessel) produced the highest fresh biomass at the highest concentrations of KNO3 and P with nutrient sucrose fed-batch, and moderate Ca and Mg concentrations. However, maximal rhizome dry biomass required highest P, sucrose fed-batch, and a moderate plant density. Different media formulations and fed-batch techniques were identified to maximize the propagation and storage organ responses. A single experimental design was used to optimize these dual purposes.

  7. In Vitro Growth of Curcuma longa L. in Response to Five Mineral Elements and Plant Density in Fed-Batch Culture Systems

    PubMed Central

    El-Hawaz, Rabia F.; Bridges, William C.; Adelberg, Jeffrey W.

    2015-01-01

    Plant density was varied with P, Ca, Mg, and KNO3 in a multifactor experiment to improve Curcuma longa L. micropropagation, biomass and microrhizome development in fed-batch liquid culture. The experiment had two paired D-optimal designs, testing sucrose fed-batch and nutrient sucrose fed-batch techniques. When sucrose became depleted, volume was restored to 5% m/v sucrose in 200 ml of modified liquid MS medium by adding sucrose solutions. Similarly, nutrient sucrose fed-batch was restored to set points with double concentration of treatments’ macronutrient and MS micronutrient solutions, along with sucrose solutions. Changes in the amounts of water and sucrose supplementations were driven by the interaction of P and KNO3 concentrations. Increasing P from 1.25 to 6.25 mM increased both multiplication and biomass. The multiplication ratio was greatest in the nutrient sucrose fed-batch technique with the highest level of P, 6 buds/vessel, and the lowest level of Ca and KNO3. The highest density (18 buds/vessel) produced the highest fresh biomass at the highest concentrations of KNO3 and P with nutrient sucrose fed-batch, and moderate Ca and Mg concentrations. However, maximal rhizome dry biomass required highest P, sucrose fed-batch, and a moderate plant density. Different media formulations and fed-batch techniques were identified to maximize the propagation and storage organ responses. A single experimental design was used to optimize these dual purposes. PMID:25830292

  8. Sequential biases in accumulating evidence

    PubMed Central

    Huggins, Richard; Dogo, Samson Henry

    2015-01-01

    Whilst it is common in clinical trials to use the results of tests at one phase to decide whether to continue to the next phase and to subsequently design the next phase, we show that this can lead to biased results in evidence synthesis. Two new kinds of bias associated with accumulating evidence, termed ‘sequential decision bias’ and ‘sequential design bias’, are identified. Both kinds of bias are the result of making decisions on the usefulness of a new study, or its design, based on the previous studies. Sequential decision bias is determined by the correlation between the value of the current estimated effect and the probability of conducting an additional study. Sequential design bias arises from using the estimated value instead of the clinically relevant value of an effect in sample size calculations. We considered both the fixed‐effect and the random‐effects models of meta‐analysis and demonstrated analytically and by simulations that in both settings the problems due to sequential biases are apparent. According to our simulations, the sequential biases increase with increased heterogeneity. Minimisation of sequential biases arises as a new and important research area necessary for successful evidence‐based approaches to the development of science. © 2015 The Authors. Research Synthesis Methods Published by John Wiley & Sons Ltd. PMID:26626562
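
    A toy simulation sketch of sequential decision bias as described above: an additional study is run only while the current pooled estimate is not yet significant, and the mean of the final fixed-effect estimates is compared with the true effect. All parameter values are illustrative assumptions.

    # Demonstrating bias when the decision to add a study depends on the current pooled estimate
    import numpy as np

    rng = np.random.default_rng(6)
    true_effect, se_study = 0.2, 0.15
    n_sim, max_studies = 20000, 5
    final_estimates = []

    for _ in range(n_sim):
        effects = [rng.normal(true_effect, se_study)]
        for _ in range(max_studies - 1):
            pooled = np.mean(effects)
            pooled_se = se_study / np.sqrt(len(effects))
            if pooled / pooled_se > 1.96:      # "already convincing" -> no further study is conducted
                break
            effects.append(rng.normal(true_effect, se_study))
        final_estimates.append(np.mean(effects))

    print("true effect:", true_effect,
          "mean pooled estimate:", round(float(np.mean(final_estimates)), 3))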

  9. Increasing efficiency of preclinical research by group sequential designs

    PubMed Central

    Piper, Sophie K.; Rex, Andre; Florez-Vargas, Oscar; Karystianis, George; Schneider, Alice; Wellwood, Ian; Siegerink, Bob; Ioannidis, John P. A.; Kimmelman, Jonathan; Dirnagl, Ulrich

    2017-01-01

    Despite the potential benefits of sequential designs, studies evaluating treatments or experimental manipulations in preclinical experimental biomedicine almost exclusively use classical block designs. Our aim with this article is to bring the existing methodology of group sequential designs to the attention of researchers in the preclinical field and to clearly illustrate its potential utility. Group sequential designs can offer higher efficiency than traditional methods and are increasingly used in clinical trials. Using simulation of data, we demonstrate that group sequential designs have the potential to improve the efficiency of experimental studies, even when sample sizes are very small, as is currently prevalent in preclinical experimental biomedicine. When simulating data with a large effect size of d = 1 and a sample size of n = 18 per group, sequential frequentist analysis consumes in the long run only around 80% of the planned number of experimental units. In larger trials (n = 36 per group), additional stopping rules for futility lead to the saving of resources of up to 30% compared to block designs. We argue that these savings should be invested to increase sample sizes and hence power, since the currently underpowered experiments in preclinical biomedicine are a major threat to the value and predictiveness in this research domain. PMID:28282371
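
    A small simulation sketch of the two-stage idea: an interim look halfway through can stop the experiment early for efficacy, and the expected number of animals used is compared with the fixed block design. The effect size d = 1 and n = 18 per group mirror the example quoted above; the per-look boundary z = 2.18 is the approximate two-look Pocock value for two-sided alpha = 0.05, and the rest is an assumption-laden simplification.

    # Two-stage group sequential design with early stopping for efficacy
    import numpy as np

    rng = np.random.default_rng(7)
    d, n_per_group, n_sim = 1.0, 18, 10000
    z_bound = 2.18
    used = []

    for _ in range(n_sim):
        a = rng.normal(d, 1, n_per_group)      # treatment group
        b = rng.normal(0, 1, n_per_group)      # control group
        half = n_per_group // 2
        z_interim = (a[:half].mean() - b[:half].mean()) / np.sqrt(2 / half)
        if abs(z_interim) > z_bound:
            used.append(2 * half)              # stopped early: only half the animals per group used
        else:
            z_final = (a.mean() - b.mean()) / np.sqrt(2 / n_per_group)
            used.append(2 * n_per_group)

    print("average animals used:", np.mean(used), "vs fixed design:", 2 * n_per_group)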

  10. The effect of a sequential structure of practice for the training of perceptual-cognitive skills in tennis

    PubMed Central

    2017-01-01

    Objective Anticipation of opponent actions, through the use of advanced (i.e., pre-event) kinematic information, can be trained using video-based temporal occlusion. Typically, this involves isolated opponent skills/shots presented as trials in a random order. However, two different areas of research concerning representative task design and contextual (non-kinematic) information, suggest this structure of practice restricts expert performance. The aim of this study was to examine the effect of a sequential structure of practice during video-based training of anticipatory behavior in tennis, as well as the transfer of these skills to the performance environment. Methods In a pre-practice-retention-transfer design, participants viewed life-sized video of tennis rallies across practice in either a sequential order (sequential group), in which participants were exposed to opponent skills/shots in the order they occur in the sport, or a non-sequential (non-sequential group) random order. Results In the video-based retention test, the sequential group was significantly more accurate in their anticipatory judgments when the retention condition replicated the sequential structure compared to the non-sequential group. In the non-sequential retention condition, the non-sequential group was more accurate than the sequential group. In the field-based transfer test, overall decision time was significantly faster in the sequential group compared to the non-sequential group. Conclusion Findings highlight the benefits of a sequential structure of practice for the transfer of anticipatory behavior in tennis. We discuss the role of contextual information, and the importance of representative task design, for the testing and training of perceptual-cognitive skills in sport. PMID:28355263

  11. Fed-batch anaerobic valorization of slaughterhouse by-products with mesophilic microbial consortia without methane production.

    PubMed

    Pessiot, J; Nouaille, R; Jobard, M; Singhania, R R; Bournilhas, A; Christophe, G; Fontanille, P; Peyret, P; Fonty, G; Larroche, C

    2012-07-01

    This work aimed at setting up a fully instrumented, laboratory-scale bioreactor enabling anaerobic valorization of solid substrates through hydrogen and/or volatile fatty acid (VFA) production using mixed microbial populations (consortia). The substrate used was made of meat-based wastes, especially from slaughterhouses, which are becoming available in large amounts as a consequence of the growing constraints for waste disposal from meat industry. A reconstituted microbial mesophilic consortium without Archaebacteria (methanogens), named PBr, was cultivated in a 5-L anaerobic bioreactor on slaughterhouse wastes. The experiments were carried out with sequential fed-batch operations, including liquid medium removal from the bioreactor and addition of fresh substrate. VFAs and nitrogen were the main metabolites observed, while hydrogen accumulation was very low and no methane production was evidenced. After 1,300 h of culture, yields obtained for VFAs reached 0.38 g/g dry matter. Strain composition of the microbial consortium was also characterized using molecular tools (temporal temperature gradient gel electrophoresis and gene sequencing).

  12. A comparison of abundance estimates from extended batch-marking and Jolly–Seber-type experiments

    PubMed Central

    Cowen, Laura L E; Besbeas, Panagiotis; Morgan, Byron J T; Schwarz, Carl J

    2014-01-01

    Little attention has been paid to the use of multi-sample batch-marking studies, as it is generally assumed that an individual's capture history is necessary for fully efficient estimates. However, recently, Huggins et al. (2010) present a pseudo-likelihood for a multi-sample batch-marking study where they used estimating equations to solve for survival and capture probabilities and then derived abundance estimates using a Horvitz–Thompson-type estimator. We have developed and maximized the likelihood for batch-marking studies. We use data simulated from a Jolly–Seber-type study and convert this to what would have been obtained from an extended batch-marking study. We compare our abundance estimates obtained from the Crosbie–Manly–Arnason–Schwarz (CMAS) model with those of the extended batch-marking model to determine the efficiency of collecting and analyzing batch-marking data. We found that estimates of abundance were similar for all three estimators: CMAS, Huggins, and our likelihood. Gains are made when using unique identifiers and employing the CMAS model in terms of precision; however, the likelihood typically had lower mean square error than the pseudo-likelihood method of Huggins et al. (2010). When faced with designing a batch-marking study, researchers can be confident in obtaining unbiased abundance estimators. Furthermore, they can design studies in order to reduce mean square error by manipulating capture probabilities and sample size. PMID:24558576

  13. Lessons learned for composite structures

    NASA Technical Reports Server (NTRS)

    Whitehead, R. S.

    1991-01-01

    Lessons learned for composite structures are presented in three technology areas: materials, manufacturing, and design. In addition, future challenges for composite structures are presented. Composite materials have long gestation periods from the developmental stage to fully matured production status. Many examples exist of unsuccessful attempts to accelerate this gestation period. Experience has shown that technology transition of a new material system to fully matured production status is time consuming, involves risk, is expensive and should not be undertaken lightly. The future challenges for composite materials require an intensification of the science based approach to material development, extension of the vendor/customer interaction process to include all engineering disciplines of the end user, reduced material costs because they are a significant factor in overall part cost, and improved batch-to-batch pre-preg physical property control. Historical manufacturing lessons learned are presented using current in-service production structure as examples. Most producibility problems for these structures can be traced to their sequential engineering design. This caused an excessive emphasis on design-to-weight and schedule at the expense of design-to-cost. This resulted in expensive performance originated designs, which required costly tooling and led to non-producible parts. Historically these problems have been allowed to persist throughout the production run. The current/future approach for the production of affordable composite structures mandates concurrent engineering design where equal emphasis is placed on product and process design. Design for simplified assembly is also emphasized, since assembly costs account for a major portion of total airframe costs. The future challenge for composite manufacturing is, therefore, to utilize concurrent engineering in conjunction with automated manufacturing techniques to build affordable composite structures. Composite design experience has shown that significant weight savings have been achieved, outstanding fatigue and corrosion resistance have been demonstrated, and in-service performance has been very successful. Currently no structural design show stoppers exist for composite structures. A major lesson learned is that the full scale static test is the key test for composites, since it is the primary structural 'hot spot' indicator. The major durability issue is supportability of thin skinned structure. Impact damage has been identified as the most significant issue for the damage tolerance control of composite structures. However, delaminations induced during assembly operations have demonstrated a significant nuisance value. The future challenges for composite structures are threefold. Firstly, composite airframe weight fraction should increase to 60 percent. At the same time, the cost of composite structures must be reduced by 50 percent to attain the goal of affordability. To support these challenges it is essential to develop lower cost materials and processes.

  14. Automated ILA design for synchronous sequential circuits

    NASA Technical Reports Server (NTRS)

    Liu, M. N.; Liu, K. Z.; Maki, G. K.; Whitaker, S. R.

    1991-01-01

    An iterative logic array (ILA) architecture for synchronous sequential circuits is presented. This technique utilizes linear algebra to produce the design equations. The ILA realization of synchronous sequential logic can be fully automated with a computer program. A programmable design procedure is proposed to fulfill the design task and layout generation. A software algorithm in the C language has been developed and tested to generate 1 micron CMOS layouts using the Hewlett-Packard FUNGEN module generator shell.

  15. Sequential injection analysis with chemiluminescence detection for rapid monitoring of commercial Calendula officinalis extractions.

    PubMed

    Hughes, Rachel R; Scown, David; Lenehan, Claire E

    2015-01-01

    Plant extracts containing high levels of antioxidants are desirable due to their reported health benefits. Most techniques capable of determining the antioxidant activity of plant extracts are unsuitable for rapid at-line analysis as they require extensive sample preparation and/or long analysis times. Therefore, analytical techniques capable of real-time or pseudo real-time at-line monitoring of plant extractions, and determination of extraction endpoints, would be useful to manufacturers of antioxidant-rich plant extracts. To develop a reliable method for the rapid at-line extraction monitoring of antioxidants in plant extracts. Calendula officinalis extracts were prepared from dried flowers and analysed for antioxidant activity using sequential injection analysis (SIA) with chemiluminescence (CL) detection. The intensity of CL emission from the reaction of acidic potassium permanganate with antioxidants within the extract was used as the analytical signal. The SIA-CL method was applied to monitor the extraction of C. officinalis over the course of a batch extraction to determine the extraction endpoint. Results were compared with those from ultra high performance liquid chromatography (UHPLC). Pseudo real-time, at-line monitoring showed the level of antioxidants in a batch extract of Calendula officinalis plateaued after 100 min of extraction. These results correlated well with those of an offline UHPLC study. SIA-CL was found to be a suitable method for pseudo real-time monitoring of plant extractions and determination of extraction endpoints with respect to antioxidant concentrations. The method was applied at-line in the manufacturing industry. Copyright © 2015 John Wiley & Sons, Ltd.

  16. High metal reactivity and environmental risks at a site contaminated by glass waste.

    PubMed

    Augustsson, A; Åström, M; Bergbäck, B; Elert, M; Höglund, L O; Kleja, D B

    2016-07-01

    This study addresses the reactivity and risks of metals (Ba, Cd, Co, Cr, Cu, Ni, Pb, Zn, As and Sb) at a Swedish site with large glass waste deposits. Old glassworks sites typically have high total metal concentrations, but as the metals are mainly bound within the glass waste and considered relatively inert, environmental investigations at these kinds of sites are limited. In this study, soil and landfill samples were subjected to a sequential chemical extraction procedure. Data from batch leaching tests and groundwater upstream and downstream of the waste deposits were also interpreted. The sequential extraction revealed that metals in <2 mm soil/waste samples were largely associated with geochemically active fractions, indicating that metals are released from pristine glass and subsequently largely retained in the surrounding soil and/or on secondary mineral coatings on fine glass particles. From the approximately 12,000 m(3) of coarse glass waste at the site, almost 4000 kg of Pb is estimated to have been lost through corrosion, which, however, corresponds to only a small portion of the total amount of Pb in the waste. Metal sorption within the waste deposits or in underlying soil layers is supported by fairly low metal concentrations in groundwater. However, elevated concentrations in downstream groundwater and in leachates of batch leaching tests were observed for several metals, indicating on-going leaching. Taken together, the high metal concentrations in geochemically active forms and the high amounts of as yet uncorroded metal-rich glass, indicate considerable risks to human health and the environment. Copyright © 2016. Published by Elsevier Ltd.

  17. Phosphate-Induced Immobilization of Uranium in Hanford Sediments.

    PubMed

    Pan, Zezhen; Giammar, Daniel E; Mehta, Vrajesh; Troyer, Lyndsay D; Catalano, Jeffrey G; Wang, Zheming

    2016-12-20

    Phosphate can be added to subsurface environments to immobilize U(VI) contamination. The efficacy of immobilization depends on the site-specific groundwater chemistry and aquifer sediment properties. Batch and column experiments were performed with sediments from the Hanford 300 Area in Washington State and artificial groundwater prepared to emulate the conditions at the site. Batch experiments revealed enhanced U(VI) sorption with increasing phosphate addition. X-ray absorption spectroscopy measurements of samples from the batch experiments found that U(VI) was predominantly adsorbed at conditions relevant to the column experiments and most field sites (low U(VI) loadings, <25 μM), and U(VI) phosphate precipitation occurred only at high initial U(VI) (>25 μM) and phosphate loadings. While batch experiments showed the transition of U(VI) uptake from adsorption to precipitation, the column study was more directly relevant to the subsurface environment because of the high solid:water ratio in the column and the advective flow of water. In column experiments, nearly six times more U(VI) was retained in sediments when phosphate-containing groundwater was introduced to U(VI)-loaded sediments than when the groundwater did not contain phosphate. This enhanced retention persisted for at least one month after cessation of phosphate addition to the influent fluid. Sequential extractions and laser-induced fluorescence spectroscopy of sediments from the columns suggested that the retained U(VI) was primarily in adsorbed forms. These results indicate that in situ remediation of groundwater by phosphate addition provides lasting benefit beyond the treatment period via enhanced U(VI) adsorption to sediments.

  18. Phosphate-Induced Immobilization of Uranium in Hanford Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pan, Zezhen; Giammar, Daniel E.; Mehta, Vrajesh

    2016-12-20

    Phosphate can be added to subsurface environments to immobilize U(VI) contamination. The efficacy of immobilization depends on the site-specific groundwater chemistry and aquifer sediment properties. Batch and column experiments were performed with sediments from the Hanford 300 Area in Washington State and artificial groundwater prepared to emulate the conditions at the site. Batch experiments revealed enhanced U(VI) sorption with increasing phosphate addition. X-ray absorption spectroscopy measurements of samples from the batch experiments found that U(VI) was predominantly adsorbed at conditions relevant to the column experiments and most field sites (low U(VI) loadings, <25 μM), and U(VI) phosphate precipitation occurred only at high initial U(VI) (>25 μM) and phosphate loadings. While batch experiments showed the transition of U(VI) uptake from adsorption to precipitation, the column study was more directly relevant to the subsurface environment because of the high solid:water ratio in the column and the advective flow of water. In column experiments, nearly six times more U(VI) was retained in sediments when phosphate-containing groundwater was introduced to U(VI)-loaded sediments than when the groundwater did not contain phosphate. This enhanced retention persisted for at least one month after cessation of phosphate addition to the influent fluid. Sequential extractions and laser-induced fluorescence spectroscopy of sediments from the columns suggested that the retained U(VI) was primarily in adsorbed forms. These results indicate that in situ remediation of groundwater by phosphate addition provides lasting benefit beyond the treatment period via enhanced U(VI) adsorption to sediments.

  20. The Design and Implementation of Adsorptive Removal of Cu(II) from Leachate Using ANFIS

    PubMed Central

    Turan, Nurdan Gamze; Ozgonenel, Okan

    2013-01-01

    Clinoptilolite was investigated for the removal of Cu(II) ions from industrial leachate. An adaptive neuro-fuzzy inference system (ANFIS) was used for modeling the batch experimental system and predicting the optimal input values, that is, initial pH, adsorbent dosage, and contact time. Experiments were studied under laboratory batch and fixed-bed conditions. The outcomes of the suggested ANFIS modeling were then compared to a full factorial experimental design (2³), which was utilized to assess the effect of the three factors on the adsorption of Cu(II) ions in aqueous leachate of industrial waste. It was observed that the optimized parameters are almost close to each other. The highest removal efficiency was found to be about 93.65% at pH 6, adsorbent dosage 11.4 g/L, and contact time 33 min for the batch conditions of the 2³ experimental design, and about 90.43% at pH 5, adsorbent dosage 15 g/L, and contact time 35 min for the batch conditions of ANFIS. The results show that clinoptilolite is an efficient sorbent and that ANFIS is easy to implement and able to model the batch experimental system. PMID:23844405
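
    A quick sketch of the 2³ full factorial layout underlying the comparison: each of the three factors (initial pH, adsorbent dosage, contact time) is run at a low and a high level, giving eight runs. The level values below are illustrative, not the paper's exact settings.

    # Enumerate the eight runs of a 2^3 full factorial design
    import itertools

    levels = {
        "pH": (4, 6),
        "dosage_g_per_L": (5, 15),
        "contact_time_min": (15, 35),
    }
    runs = [dict(zip(levels, combo)) for combo in itertools.product(*levels.values())]
    for i, run in enumerate(runs, 1):
        print(i, run)   # 8 runs in total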

  1. Improving tablet coating robustness by selecting critical process parameters from retrospective data.

    PubMed

    Galí, A; García-Montoya, E; Ascaso, M; Pérez-Lozano, P; Ticó, J R; Miñarro, M; Suñé-Negre, J M

    2016-09-01

    Although tablet coating processes are widely used in the pharmaceutical industry, they often lack adequate robustness. Up-scaling can be challenging as minor changes in parameters can lead to varying quality results. To select critical process parameters (CPP) using retrospective data of a commercial product and to establish a design of experiments (DoE) that would improve the robustness of the coating process. A retrospective analysis of data from 36 commercial batches. Batches were selected based on the quality results generated during batch release, some of which revealed quality deviations concerning the appearance of the coated tablets. The product is already marketed and belongs to the portfolio of a multinational pharmaceutical company. The Statgraphics 5.1 software was used for data processing to determine critical process parameters in order to propose new working ranges. This study confirms that it is possible to determine the critical process parameters and create design spaces based on retrospective data of commercial batches. This type of analysis is thus converted into a tool to optimize the robustness of existing processes. Our results show that a design space can be established with minimum investment in experiments, since current commercial batch data are processed statistically.

  2. A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.

    PubMed

    Yu, Qingzhao; Zhu, Lin; Zhu, Han

    2017-11-01

    Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potential to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently allocate newly recruited patients to different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate chosen to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence on the design of changing the prior distributions. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when the total sample size is fixed, the proposed design can obtain greater power and/or require a smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce the required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
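
    A minimal sketch of the allocation principle: to minimize the variance of the difference-in-means test statistic, the fraction of patients randomized to arm 1 follows Neyman allocation, sigma1/(sigma1 + sigma2). The paper's Bayesian design updates these quantities as data accumulate; here the standard deviations are simply assumed known for illustration.

    # Neyman allocation: randomization rate that minimizes Var(mean1 - mean2) for a fixed total n
    def optimal_allocation(sigma1: float, sigma2: float) -> float:
        """Fraction of newly recruited patients to randomize to arm 1."""
        return sigma1 / (sigma1 + sigma2)

    # e.g. a noisier experimental arm receives proportionally more patients
    print(optimal_allocation(sigma1=2.0, sigma2=1.0))   # ~0.667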

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serin, E.; Codel, G.; Mabhouti, H.

    Purpose: In small field geometries, the electronic equilibrium can be lost, making it challenging for the dose-calculation algorithm to accurately predict the dose, especially in the presence of tissue heterogeneities. In this study, the dosimetric accuracy of the Monte Carlo (MC) advanced dose calculation and sequential algorithms of the Multiplan treatment planning system was investigated for small radiation fields incident on homogeneous and heterogeneous geometries. Methods: Small open fields of fixed cones of a Cyberknife M6 unit, 100 to 500 mm2, were used for this study. The fields were incident on an in-house phantom containing lung, air, and bone inhomogeneities, and also on a homogeneous phantom. Using the same film batch, the net OD to dose calibration curve was obtained using CK with the 60 mm fixed cone by delivering 0-800 cGy. Films were scanned 48 hours after irradiation using an Epson 1000XL flatbed scanner. The dosimetric accuracy of the MC and sequential algorithms in the presence of the inhomogeneities was compared against EBT3 film dosimetry. Results: Open field tests in a homogeneous phantom showed good agreement between the two algorithms and film measurement. For the MC algorithm, the minimum gamma analysis passing rates between measured and calculated dose distributions were 99.7% and 98.3% for homogeneous and inhomogeneous fields, respectively, in the case of lung and bone. For the sequential algorithm, the minimum gamma analysis passing rates were 98.9% and 92.5% for homogeneous and inhomogeneous fields, respectively, for all cone sizes used. In the case of the air heterogeneity, the differences were larger for both calculation algorithms. Overall, when compared to measurement, MC had better agreement than the sequential algorithm. Conclusion: The Monte Carlo calculation algorithm in the Multiplan treatment planning system is an improvement over the existing sequential algorithm. Dose discrepancies were observed in the presence of air inhomogeneities.

  4. Mixing modes in a population-based interview survey: comparison of a sequential and a concurrent mixed-mode design for public health research.

    PubMed

    Mauz, Elvira; von der Lippe, Elena; Allen, Jennifer; Schilling, Ralph; Müters, Stephan; Hoebel, Jens; Schmich, Patrick; Wetzstein, Matthias; Kamtsiuris, Panagiotis; Lange, Cornelia

    2018-01-01

    Population-based surveys currently face the problem of decreasing response rates. Mixed-mode designs are now being implemented more often to account for this, to improve sample composition and to reduce overall costs. This study examines whether a concurrent or sequential mixed-mode design achieves better results on a number of indicators of survey quality. Data were obtained from a population-based health interview survey of adults in Germany that was conducted as a methodological pilot study as part of the German Health Update (GEDA). Participants were randomly allocated to one of two surveys; each of the surveys had a different design. In the concurrent mixed-mode design ( n  = 617) two types of self-administered questionnaires (SAQ-Web and SAQ-Paper) and computer-assisted telephone interviewing were offered simultaneously to the respondents along with the invitation to participate. In the sequential mixed-mode design ( n  = 561), SAQ-Web was initially provided, followed by SAQ-Paper, with an option for a telephone interview being sent out together with the reminders at a later date. Finally, this study compared the response rates, sample composition, health indicators, item non-response, the scope of fieldwork and the costs of both designs. No systematic differences were identified between the two mixed-mode designs in terms of response rates, the socio-demographic characteristics of the achieved samples, or the prevalence rates of the health indicators under study. The sequential design gained a higher rate of online respondents. Very few telephone interviews were conducted for either design. With regard to data quality, the sequential design (which had more online respondents) showed less item non-response. There were minor differences between the designs in terms of their costs. Postage and printing costs were lower in the concurrent design, but labour costs were lower in the sequential design. No differences in health indicators were found between the two designs. Modelling these results for higher response rates and larger net sample sizes indicated that the sequential design was more cost and time-effective. This study contributes to the research available on implementing mixed-mode designs as part of public health surveys. Our findings show that SAQ-Paper and SAQ-Web questionnaires can be combined effectively. Sequential mixed-mode designs with higher rates of online respondents may be of greater benefit to studies with larger net sample sizes than concurrent mixed-mode designs.

  5. Variable complexity online sequential extreme learning machine, with applications to streamflow prediction

    NASA Astrophysics Data System (ADS)

    Lima, Aranildo R.; Hsieh, William W.; Cannon, Alex J.

    2017-12-01

    In situations where new data arrive continually, online learning algorithms are computationally much less costly than batch learning ones in maintaining the model up-to-date. The extreme learning machine (ELM), a single hidden layer artificial neural network with random weights in the hidden layer, is solved by linear least squares, and has an online learning version, the online sequential ELM (OSELM). As more data become available during online learning, information on the longer time scale becomes available, so ideally the model complexity should be allowed to change, but the number of hidden nodes (HN) remains fixed in OSELM. A variable complexity VC-OSELM algorithm is proposed to dynamically add or remove HN in the OSELM, allowing the model complexity to vary automatically as online learning proceeds. The performance of VC-OSELM was compared with OSELM in daily streamflow predictions at two hydrological stations in British Columbia, Canada, with VC-OSELM significantly outperforming OSELM in mean absolute error, root mean squared error and Nash-Sutcliffe efficiency at both stations.
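
    The OSELM step referred to above updates the hidden-to-output weights by recursive least squares as new data chunks arrive. Below is a minimal sketch with a fixed number of hidden nodes (the dynamic node addition and removal of VC-OSELM is not shown); the network size, activation function and toy data are assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      def hidden_output(X, W, b):
          """Random-feature hidden layer with sigmoid activation (ELM style)."""
          return 1.0 / (1.0 + np.exp(-(X @ W + b)))

      # Initialize with a small batch, then update sequentially (OSELM idea).
      n_features, n_hidden = 3, 20
      W = rng.normal(size=(n_features, n_hidden))   # fixed random input weights
      b = rng.normal(size=n_hidden)

      X0 = rng.normal(size=(50, n_features))
      y0 = X0.sum(axis=1, keepdims=True) + 0.1 * rng.normal(size=(50, 1))
      H0 = hidden_output(X0, W, b)
      P = np.linalg.inv(H0.T @ H0 + 1e-6 * np.eye(n_hidden))  # small ridge term for stability
      beta = P @ H0.T @ y0                                    # initial output weights

      def oselm_update(P, beta, X_new, y_new):
          """Recursive least-squares update of the output weights for a new data chunk."""
          H = hidden_output(X_new, W, b)
          K = np.linalg.inv(np.eye(len(X_new)) + H @ P @ H.T)
          P = P - P @ H.T @ K @ H @ P
          beta = beta + P @ H.T @ (y_new - H @ beta)
          return P, beta

      # Stream new chunks of data as they "arrive".
      for _ in range(10):
          Xn = rng.normal(size=(10, n_features))
          yn = Xn.sum(axis=1, keepdims=True) + 0.1 * rng.normal(size=(10, 1))
          P, beta = oselm_update(P, beta, Xn, yn)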

  6. Sequential recycling of enzymatic lipid-extracted hydrolysate in fermentations with a thraustochytrid.

    PubMed

    Lowrey, Joshua; Armenta, Roberto E; Brooks, Marianne S

    2016-06-01

    This study extends the findings of prior studies proposing and validating nutrient recycling for the heterotrophic microalga Thraustochytrium sp. (T18), grown in optimized fed-batch conditions. Sequential nutrient recycling of enzymatically-derived hydrolysate in fermentors succeeded at growing the tested thraustochytrid strain, with little evidence of inhibition or detrimental effects upon culture health. The average maximum biomass obtained in the recycled hydrolysate was 63.68±1.46 g L(-1) in 90 h in the first recycle, followed by 65.27±1.15 g L(-1) in 90 h in the subsequent recycle of the same material. These compare with 58.59 g L(-1) and 64.92 g L(-1) observed in fresh media over the same time. Lipid production was slightly impaired, however, with a maximum total fatty acid content of 62.2±0.30% in the recycled hydrolysate compared to 69.4% in fresh control media. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Facile Large-scale synthesis of stable CuO nanoparticles

    NASA Astrophysics Data System (ADS)

    Nazari, P.; Abdollahi-Nejand, B.; Eskandari, M.; Kohnehpoushi, S.

    2018-04-01

    In this work, a novel approach to synthesizing CuO nanoparticles was introduced. A sequential corrosion and detaching process was proposed for the growth and dispersion of CuO nanoparticles at the optimum pH value of eight. The produced CuO nanoparticles were six nm (±2 nm) in diameter with a spherical shape, high crystallinity and uniformity in size. In this method, large-scale production of CuO nanoparticles (120 grams in an experimental batch) from Cu micro-particles was achieved, which may meet the market criteria for large-scale production of CuO nanoparticles.

  8. A Soft Sensor for Bioprocess Control Based on Sequential Filtering of Metabolic Heat Signals

    PubMed Central

    Paulsson, Dan; Gustavsson, Robert; Mandenius, Carl-Fredrik

    2014-01-01

    Soft sensors are the combination of robust on-line sensor signals with mathematical models for deriving additional process information. Here, we apply this principle to a microbial recombinant protein production process in a bioreactor by exploiting bio-calorimetric methodology. Temperature sensor signals from the cooling system of the bioreactor were used for estimating the metabolic heat of the microbial culture and from that the specific growth rate and active biomass concentration were derived. By applying sequential digital signal filtering, the soft sensor was made more robust for industrial practice with cultures generating low metabolic heat in environments with high noise level. The estimated specific growth rate signal obtained from the three stage sequential filter allowed controlled feeding of substrate during the fed-batch phase of the production process. The biomass and growth rate estimates from the soft sensor were also compared with an alternative sensor probe and a capacitance on-line sensor, for the same variables. The comparison showed similar or better sensitivity and lower variability for the metabolic heat soft sensor suggesting that using permanent temperature sensors of a bioreactor is a realistic and inexpensive alternative for monitoring and control. However, both alternatives are easy to implement in a soft sensor, alone or in parallel. PMID:25264951

  9. A soft sensor for bioprocess control based on sequential filtering of metabolic heat signals.

    PubMed

    Paulsson, Dan; Gustavsson, Robert; Mandenius, Carl-Fredrik

    2014-09-26

    Soft sensors are the combination of robust on-line sensor signals with mathematical models for deriving additional process information. Here, we apply this principle to a microbial recombinant protein production process in a bioreactor by exploiting bio-calorimetric methodology. Temperature sensor signals from the cooling system of the bioreactor were used for estimating the metabolic heat of the microbial culture and from that the specific growth rate and active biomass concentration were derived. By applying sequential digital signal filtering, the soft sensor was made more robust for industrial practice with cultures generating low metabolic heat in environments with high noise level. The estimated specific growth rate signal obtained from the three stage sequential filter allowed controlled feeding of substrate during the fed-batch phase of the production process. The biomass and growth rate estimates from the soft sensor were also compared with an alternative sensor probe and a capacitance on-line sensor, for the same variables. The comparison showed similar or better sensitivity and lower variability for the metabolic heat soft sensor suggesting that using permanent temperature sensors of a bioreactor is a realistic and inexpensive alternative for monitoring and control. However, both alternatives are easy to implement in a soft sensor, alone or in parallel.
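
    A minimal sketch of the idea behind the soft sensor described above: a noisy metabolic heat signal is passed through a three-stage cascade of first-order digital filters, and the specific growth rate is then taken from the slope of the log of the filtered signal, since metabolic heat is roughly proportional to active biomass at a constant heat yield. The filter constants, sampling rate and simulated culture are hypothetical, not the published tuning.

      import numpy as np

      def exp_filter(signal, alpha):
          """First-order exponential (low-pass) filter."""
          out = np.empty_like(signal)
          out[0] = signal[0]
          for i in range(1, len(signal)):
              out[i] = alpha * signal[i] + (1 - alpha) * out[i - 1]
          return out

      # Simulated noisy metabolic heat signal Q (W) for an exponentially growing culture.
      dt = 1.0 / 60.0                               # h (1-min samples)
      t = np.arange(0, 10, dt)
      mu_true = 0.25                                # 1/h
      Q_true = 0.5 * np.exp(mu_true * t)
      Q_meas = Q_true + np.random.default_rng(1).normal(0, 0.2, t.size)

      # Three-stage sequential filtering, then specific growth rate from d(ln Q)/dt.
      Q_filt = exp_filter(exp_filter(exp_filter(Q_meas, 0.05), 0.05), 0.05)
      mu_est = np.gradient(np.log(np.clip(Q_filt, 1e-6, None)), dt)
      print("estimated mu over the last 2 h: %.3f 1/h" % mu_est[t > 8].mean())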

  10. Navigation strategy and filter design for solar electric missions

    NASA Technical Reports Server (NTRS)

    Tapley, B. D.; Hagar, H., Jr.

    1972-01-01

    Methods which have been proposed to improve the navigation accuracy for low-thrust space vehicles include modifications to the standard Sequential- and Batch-type orbit determination procedures and the use of inertial measuring units (IMU) which measure directly the acceleration applied to the vehicle. The navigation accuracy obtained using one of the more promising modifications to the orbit determination procedures, dynamic model compensation (DMC), is compared with that of a combined IMU-Standard Orbit Determination algorithm. The unknown accelerations are approximated as both first-order and second-order Gauss-Markov processes. The comparison is based on numerical results obtained in a study of the navigation requirements of a numerically simulated 152-day low-thrust mission to the asteroid Eros. The results obtained in the simulation indicate that the DMC algorithm will yield a significant improvement over the navigation accuracies achieved with previous estimation algorithms. In addition, the DMC algorithm will yield better navigation accuracies than the IMU-Standard Orbit Determination algorithm, except for extremely precise IMU measurements, i.e., gyro platform alignment of 0.01 deg and accelerometer signal-to-noise ratio of 0.07. Unless these accuracies are achieved, the IMU navigation accuracies are generally unacceptable.
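
    A first-order Gauss-Markov process of the kind used above for the unknown accelerations can be discretized as in the sketch below; the time constant and steady-state standard deviation are hypothetical values, not those of the Eros mission study.

      import numpy as np

      def first_order_gauss_markov(n_steps, dt, tau, sigma, rng):
          """Discrete simulation of a first-order Gauss-Markov process
          a_dot = -a/tau + w, with steady-state standard deviation sigma."""
          phi = np.exp(-dt / tau)                   # state transition over one step
          q = sigma**2 * (1.0 - phi**2)             # discrete process-noise variance
          a = np.zeros(n_steps)
          for k in range(1, n_steps):
              a[k] = phi * a[k - 1] + rng.normal(0.0, np.sqrt(q))
          return a

      rng = np.random.default_rng(42)
      accel = first_order_gauss_markov(n_steps=5000, dt=10.0, tau=3600.0, sigma=1e-7, rng=rng)
      print("sample std vs. target:", accel.std(), 1e-7)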

  11. Model based adaptive control of a continuous capture process for monoclonal antibodies production.

    PubMed

    Steinebach, Fabian; Angarita, Monica; Karst, Daniel J; Müller-Späth, Thomas; Morbidelli, Massimo

    2016-04-29

    A two-column capture process for continuous processing of cell-culture supernatant is presented. Similar to other multicolumn processes, this process uses sequential countercurrent loading of the target compound in order to maximize resin utilization and productivity for a given product yield. The process was designed using a novel mechanistic model for affinity capture, which takes both specific adsorption and transport through the resin beads into account. Simulations as well as experimental results for the capture of an IgG antibody are discussed. The model was able to predict the process performance in terms of yield, productivity and capacity utilization. Compared to continuous capture with two columns operated batch-wise in parallel, a 2.5-fold higher capacity utilization was obtained for the same productivity and yield. This results in an equal improvement in product concentration and reduction of buffer consumption. The developed model was used not only for the process design and optimization but also for its online control. In particular, the unit operating conditions are changed in order to maintain high product yield while optimizing the process performance in terms of capacity utilization and buffer consumption, also in the presence of changing upstream conditions and resin aging. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Enzyme recycle and fed-batch addition for high-productivity soybean flour processing to produce enriched soy protein and concentrated hydrolysate of fermentable sugars.

    PubMed

    Loman, Abdullah Al; Islam, S M Mahfuzul; Li, Qian; Ju, Lu-Kwang

    2017-10-01

    Despite its high protein and carbohydrate content, soybean flour utilization has to date been limited to partial replacement of animal feed. Enzymatic processing can be exploited to increase its value by enriching the protein content and separating the carbohydrate for utilization as fermentation feedstock. Enzyme hydrolysis with fed-batch and recycle designs was evaluated here for achieving this goal with high productivities. The fed-batch process improved carbohydrate conversion, particularly at high substrate loadings of 250-375 g/L. In the recycle process, the hydrolysate retained a significant portion of the limiting enzyme α-galactosidase, accelerating the carbohydrate monomerization rate. At a single-pass retention time of 6 h and a recycle rate of 62.5%, the reducing sugar concentration reached up to 120 g/L using 4 mL/g enzyme. When compared with the batch and fed-batch processes, the recycle process increased the volumetric productivity of reducing sugar by 36% (vs. fed-batch) to 57% (vs. batch) and that of the protein product by 280% (vs. fed-batch) to 300% (vs. batch). Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Results of Hg speciation testing on DWPF SMECT-4, SMECT-6, and RCT-2 samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bannochie, C. J.

    2016-02-04

    The Savannah River National Laboratory (SRNL) was tasked with preparing and shipping samples for Hg speciation by Eurofins Frontier Global Sciences, Inc. in Seattle, WA on behalf of the Savannah River Remediation (SRR) Mercury Task Team. The fifteenth shipment of samples was designated to include Defense Waste Processing Facility (DWPF) Slurry Mix Evaporator Condensate Tank (SMECT) samples from Sludge Receipt and Adjustment Tank (SRAT) Batch 738 and a Recycle Condensate Tank (RCT) sample from SRAT Batch 736. The DWPF sample designations for the three samples analyzed are provided in Table 1. The Batch 738 ‘Baseline’ SMECT sample was taken prior to Precipitate Reactor Feed Tank (PRFT) addition and concentration and therefore precedes the SMECT-5 sample reported previously. The Batch 738 ‘End of SRAT Cycle’ SMECT sample was taken at the conclusion of SRAT operations for this batch (PRFT addition/concentration, acid additions, initial concentration, MCU addition, and steam stripping). Batch 738 experienced a sludge slurry carryover event, which introduced sludge solids to the SMECT that were particularly evident in the SMECT-5 sample, but less evident in the ‘End of SRAT Cycle’ SMECT-6 sample. The Batch 736 ‘After SME’ RCT sample was taken after completion of SMECT transfers at the end of the SME cycle.

  14. Asymptotic Properties of the Sequential Empirical ROC, PPV and NPV Curves Under Case-Control Sampling.

    PubMed

    Koopmeiners, Joseph S; Feng, Ziding

    2011-01-01

    The receiver operating characteristic (ROC) curve, the positive predictive value (PPV) curve and the negative predictive value (NPV) curve are three measures of performance for a continuous diagnostic biomarker. The ROC, PPV and NPV curves are often estimated empirically to avoid assumptions about the distributional form of the biomarkers. Recently, there has been a push to incorporate group sequential methods into the design of diagnostic biomarker studies. A thorough understanding of the asymptotic properties of the sequential empirical ROC, PPV and NPV curves will provide more flexibility when designing group sequential diagnostic biomarker studies. In this paper we derive asymptotic theory for the sequential empirical ROC, PPV and NPV curves under case-control sampling using sequential empirical process theory. We show that the sequential empirical ROC, PPV and NPV curves converge to the sum of independent Kiefer processes and show how these results can be used to derive asymptotic results for summaries of the sequential empirical ROC, PPV and NPV curves.

  15. Asymptotic Properties of the Sequential Empirical ROC, PPV and NPV Curves Under Case-Control Sampling

    PubMed Central

    Koopmeiners, Joseph S.; Feng, Ziding

    2013-01-01

    The receiver operating characteristic (ROC) curve, the positive predictive value (PPV) curve and the negative predictive value (NPV) curve are three measures of performance for a continuous diagnostic biomarker. The ROC, PPV and NPV curves are often estimated empirically to avoid assumptions about the distributional form of the biomarkers. Recently, there has been a push to incorporate group sequential methods into the design of diagnostic biomarker studies. A thorough understanding of the asymptotic properties of the sequential empirical ROC, PPV and NPV curves will provide more flexibility when designing group sequential diagnostic biomarker studies. In this paper we derive asymptotic theory for the sequential empirical ROC, PPV and NPV curves under case-control sampling using sequential empirical process theory. We show that the sequential empirical ROC, PPV and NPV curves converge to the sum of independent Kiefer processes and show how these results can be used to derive asymptotic results for summaries of the sequential empirical ROC, PPV and NPV curves. PMID:24039313
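
    A minimal sketch of the empirical ROC, PPV and NPV points from case-control samples: sensitivity and specificity come directly from the case and control biomarker values, while PPV and NPV additionally require an externally supplied disease prevalence, since case-control sampling does not identify it. The biomarker distributions, prevalence and thresholds below are illustrative assumptions.

      import numpy as np

      def empirical_curves(cases, controls, prevalence, thresholds):
          """Empirical ROC/PPV/NPV points at the given biomarker thresholds.
          Under case-control sampling, PPV and NPV need an external prevalence."""
          cases, controls = np.asarray(cases), np.asarray(controls)
          out = []
          for c in thresholds:
              tpr = np.mean(cases >= c)            # sensitivity
              fpr = np.mean(controls >= c)         # 1 - specificity
              spec = 1.0 - fpr
              ppv = tpr * prevalence / (tpr * prevalence + fpr * (1 - prevalence) + 1e-12)
              npv = spec * (1 - prevalence) / (spec * (1 - prevalence) + (1 - tpr) * prevalence + 1e-12)
              out.append((fpr, tpr, ppv, npv))
          return np.array(out)

      rng = np.random.default_rng(7)
      cases = rng.normal(1.0, 1.0, 200)            # biomarker values in cases
      controls = rng.normal(0.0, 1.0, 300)         # biomarker values in controls
      curves = empirical_curves(cases, controls, prevalence=0.1,
                                thresholds=np.linspace(-2, 3, 26))
      print(curves[:3])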

  16. 40 CFR 63.2460 - What requirements must I meet for batch process vents?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... (ii) When you conduct a performance test or design evaluation for a non-flare control device used to... paragraphs (c)(9)(ii)(A) through (D) of this section. The design evaluation option for small control devices...) of this section. (b) Group status. If a process has batch process vents, as defined in § 63.2550, you...

  17. 40 CFR 63.2460 - What requirements must I meet for batch process vents?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... (ii) When you conduct a performance test or design evaluation for a non-flare control device used to... paragraphs (c)(9)(ii)(A) through (D) of this section. The design evaluation option for small control devices...) of this section. (b) Group status. If a process has batch process vents, as defined in § 63.2550, you...

  18. 40 CFR 63.2460 - What requirements must I meet for batch process vents?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... (ii) When you conduct a performance test or design evaluation for a non-flare control device used to... paragraphs (c)(9)(ii)(A) through (D) of this section. The design evaluation option for small control devices... (c) of this section. (b) Group status. If a process has batch process vents, as defined in § 63.2550...

  19. 40 CFR 63.2460 - What requirements must I meet for batch process vents?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... (ii) When you conduct a performance test or design evaluation for a non-flare control device used to... paragraphs (c)(9)(ii)(A) through (D) of this section. The design evaluation option for small control devices... (c) of this section. (b) Group status. If a process has batch process vents, as defined in § 63.2550...

  20. 40 CFR 63.2460 - What requirements must I meet for batch process vents?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... (ii) When you conduct a performance test or design evaluation for a non-flare control device used to... paragraphs (c)(9)(ii)(A) through (D) of this section. The design evaluation option for small control devices... (c) of this section. (b) Group status. If a process has batch process vents, as defined in § 63.2550...

  1. A Novel Permeable Reactive Barrier (PRB) for Simultaneous and Rapid Removal of Heavy Metal and Organic Matter - A Systematic Chemical Speciation Approach on Sustainable Technique for Pallikarani Marshland Remediation

    NASA Astrophysics Data System (ADS)

    Selvaraj, A.; Nambi, I. M.

    2014-12-01

    In this study, an innovative technique of ZVI-mediated coupling of Fenton-like oxidation of phenol with Cr(VI) reduction was attempted. The hypothesis is that Fe3+ generated from the Cr(VI) reduction process acts as electron acceptor and catalyst for the Fenton phenol oxidation process. The Fe2+ formed from the Fenton reactions can be reused for Cr(VI) reduction. Thus iron can be made to cycle between the two reactions, changing back and forth between the Fe2+ and Fe3+ forms, which makes the treatment sustainable (Fig 1). This approach advances the current Fenton-like oxidation process by (i) single-system removal of heavy metal and organic matter, (ii) recycling of iron species, hence no additional iron is required, (iii) a higher contaminant removal to ZVI ratio, and (iv) eliminating sludge-related issues. Preliminary batch studies were conducted in different modes: (i) concurrent removal and (ii) sequential removal. Sequential removal was found better for in-situ PRB applications. The PRB was designed based on the kinetic rate slope and half-life time obtained from a primary column study. This PRB has two segments: (i) a ZVI segment [Cr(VI)] and (ii) an iron species segment [phenol]. This makes the treatment sustainable by (i) leaving no iron ions in the outlet stream and (ii) meeting the hypothesis and elongating the life span of the PRB. Sequential removal of contaminants was tested in a pilot-scale PRB (Fig 2) and its life span was calculated based on the exhaustion of the filling material. Aqueous, sand and iron aliquots were collected at various segments of the PRB and analyzed thoroughly for precipitation and chemical speciation (UV spectrometer, XRD, FTIR, electron microscope). The chemical speciation profile eliminates the uncertainties over the in-situ PRB's long-term performance. Based on the pilot-scale PRB study, field-level PRB wall construction was suggested to remove heavy metal and organic compounds from the Pallikaranai marshland (Fig 3), which is contaminated with leachate coming from the nearby Perungudi dumpsite. This research provides (i) deeper insight into an environmentally friendly, accelerated, sustainable technique for combined removal of organic matter and heavy metal, (ii) evaluation of the novel technique in a PRB, which resulted in the PRB's increased life span, and (iii) a PRB design to remediate the marshland and its ecosystem, thus saving the habitats related to it.

  2. Formation of Manganese Oxide Coatings onto Sand for Adsorption of Trace Metals from Groundwater.

    PubMed

    Tilak, A S; Ojewole, S; Williford, C W; Fox, G A; Sobecki, T M; Larson, S L

    2013-11-01

    Manganese oxide (MnO) occurs naturally in soil and has a high affinity for trace metal adsorption. In this work, we quantified the factors (pH; flow rate; use of oxidants such as bleach, H2O2, and O3; initial Mn(II) concentrations; and two types of geologic media) affecting MnO coatings onto Ottawa and aquifer sand using batch and column experiments. The batch experiments consisted of manual and automated titration, and the column experiments mimicked natural MnO adsorption and oxidation cycles as a strategy for in situ adsorption. A Pb solution of 50 mg L(-1) was passed through MnO-coated sand at a flow rate of 4 mL min(-1) to determine its adsorption capacity. Batch experimental results showed that MnO coatings increased from pH 6 to 8, with maximum MnO coating occurring at pH 8. Regarding MnO coatings, bleach and O3 were highly effective compared with H2O2. The Ottawa sand had approximately twice the MnO coating of the aquifer sand. The sequential increase in initial Mn(II) concentrations on both sands resulted in incremental buildup of MnO. The automated procedure enhanced MnO coatings by 3.5 times compared with the manual batch experiments. Column results showed that MnO coatings were highly dependent on initial Mn(II) and oxidant concentrations, pH, flow rate, number of cycles (h), and the type of geologic media used. Manganese oxide coating exceeded 1700 mg kg(-1) for Ottawa sand and 130 mg kg(-1) for aquifer sand. The Pb adsorption exceeded 2200 mg kg(-1) for the Ottawa sand and 300 mg kg(-1) for the aquifer sand. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  3. High-Alpha Research Vehicle (HARV) longitudinal controller: Design, analyses, and simulation results

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Hoffler, Keith D.; Proffitt, Melissa S.; Brown, Philip W.; Phillips, Michael R.; Rivers, Robert A.; Messina, Michael D.; Carzoo, Susan W.; Bacon, Barton J.; Foster, John F.

    1994-01-01

    This paper describes the design, analysis, and nonlinear simulation results (batch and piloted) for a longitudinal controller which is scheduled to be flight-tested on the High-Alpha Research Vehicle (HARV). The HARV is an F-18 airplane modified for and equipped with multi-axis thrust vectoring. The paper includes a description of the facilities, a detailed review of the feedback controller design, linear analysis results of the feedback controller, a description of the feed-forward controller design, nonlinear batch simulation results, and piloted simulation results. Batch simulation results include maximum pitch stick agility responses, angle of attack alpha captures, and alpha regulation for full lateral stick rolls at several alpha's. Piloted simulation results include task descriptions for several types of maneuvers, task guidelines, the corresponding Cooper-Harper ratings from three test pilots, and some pilot comments. The ratings show that desirable criteria are achieved for almost all of the piloted simulation tasks.

  4. Unconditioned commercial embryo culture media contain a large variety of non-declared proteins: a comprehensive proteomics analysis.

    PubMed

    Dyrlund, Thomas F; Kirkegaard, Kirstine; Poulsen, Ebbe Toftgaard; Sanggaard, Kristian W; Hindkjær, Johnny J; Kjems, Jørgen; Enghild, Jan J; Ingerslev, Hans Jakob

    2014-11-01

    Which non-declared proteins (proteins not listed on the composition list of the product data sheet) are present in unconditioned commercial embryo culture media? A total of 110 non-declared proteins were identified in unconditioned media and between 6 and 8 of these were quantifiable and therefore represent the majority of the total protein in the media samples. There are no data in the literature on what non-declared proteins are present in unconditioned (fresh media in which no embryos have been cultured) commercial embryo media. The following eight commercial embryo culture media were included in this study: G-1 PLUS and G-2 PLUS G5 Series from Vitrolife, Sydney IVF Cleavage Medium and Sydney IVF Blastocyst Medium from Cook Medical and EmbryoAssist, BlastAssist, Sequential Cleav and Sequential Blast from ORIGIO. Two batches were analyzed from each of the Sydney IVF media and one batch from each of the other media. All embryo culture media are supplemented by the manufacturers with purified human serum albumin (HSA 5 mg/ml). The purified HSA (HSA-solution from Vitrolife) and the recombinant human albumin supplement (G-MM from Vitrolife) were also analyzed. For protein quantification, media samples were in-solution digested with trypsin and analyzed by liquid chromatography-tandem mass spectrometry (LC-MS/MS). For in-depth protein identification, media were albumin depleted, dialyzed and concentrated before sodium dodecyl sulfate polyacrylamide gel electrophoresis. The gel was cut into 14 slices followed by in-gel trypsin digestion, and analysis by LC-MS/MS. Proteins were further investigated using gene ontology (GO) terms analysis. Using advanced mass spectrometry and high confidence criteria for accepting proteins (P < 0.01), a total of 110 proteins other than HSA were identified. The average HSA content was found to be 94% (92-97%) of total protein. Other individual proteins accounted for up to 4.7% of the total protein. Analysis of purified HSA strongly suggests that these non-declared proteins are introduced to the media when the albumin is added. GO analysis showed that many of these proteins have roles in defence pathways, for example 18 were associated with the innate immune response and 17 with inflammatory responses. Eight proteins have been reported previously as secreted embryo proteins. For six of the commercial embryo culture media only one batch was analyzed. However, this does not affect the overall conclusions. The results showed that the HSA added to IVF media contained many other proteins and that the amount varies from batch to batch. These variations in protein profiles are problematic when attempting to identify proteins derived from the embryos. Therefore, when studying the embryo secretome and analyzing conditioned media with the aim of finding potential biomarkers that can distinguish normal and abnormal embryo development, it is important that the medium used in the experimental and control groups is from the same batch. Furthermore, the proteins present in unconditioned media could potentially influence embryonic development, gestation age, birthweight and perhaps have subsequent effects on health of the offspring. The study was supported by the Danish Agency for Science, Technology and Innovation. Research at the Fertility Clinic, Aarhus University Hospital is supported by an unrestricted grant from Merck Sharp & Dohme Corp and Ferring. The authors declare no conflicts of interest. © The Author 2014. 
Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Coupling of acrylic dyeing wastewater treatment by heterogeneous Fenton oxidation in a continuous stirred tank reactor with biological degradation in a sequential batch reactor.

    PubMed

    Esteves, Bruno M; Rodrigues, Carmen S D; Boaventura, Rui A R; Maldonado-Hódar, F J; Madeira, Luís M

    2016-01-15

    This work deals with the treatment of a recalcitrant effluent, from the dyeing stage of acrylic fibres, by combination of the heterogeneous Fenton's process in a continuous stirred tank reactor (CSTR) with biological degradation in a sequential batch reactor (SBR). Three different catalysts (a commercial Fe/ZSM-5 zeolite and two distinct Fe-containing activated carbons - ACs - prepared by wet impregnation of iron acetate and iron nitrate) were employed on the Fenton's process, and afterwards a parametric study was carried out to determine the effect of the main operating conditions, namely the hydrogen peroxide feed concentration, temperature and contact time. Under the best operating conditions found, using the activated carbon impregnated with iron nitrate, 62.7% of discolouration and 39.9% of total organic carbon (TOC) reduction were achieved, at steady-state. Furthermore, a considerable increase in the effluent's biodegradability was attained (BOD5:COD ratio increased from <0.001 to 0.27 and SOUR - specific oxygen uptake rate - from <0.2 to 11.1 mg O2/(gVSS·h)), alongside a major decrease in its toxicity (from 92.1 to 94.0% of Vibrio fischeri inhibition down to 6.9-9.9%). This allowed the application of the subsequent biological degradation stage. The combination of the two processes provided a treated effluent that clearly complies with the legislated discharge limits. It was also found that the iron leaching from the three catalysts tested was very small in all runs, a crucial factor for the stability and long-term use of such materials. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Tier 3 batch system data locality via managed caches

    NASA Astrophysics Data System (ADS)

    Fischer, Max; Giffels, Manuel; Jung, Christopher; Kühn, Eileen; Quast, Günter

    2015-05-01

    Modern data processing increasingly relies on data locality for performance and scalability, whereas the common HEP approaches aim for uniform resource pools with minimal locality, recently even across site boundaries. To combine advantages of both, the High-Performance Data Analysis (HPDA) Tier 3 concept opportunistically establishes data locality via coordinated caches. In accordance with HEP Tier 3 activities, the design incorporates two major assumptions: First, only a fraction of data is accessed regularly and is thus the deciding factor for overall throughput. Second, data access may fall back to non-local, making permanent local data availability an inefficient resource usage strategy. Based on this, the HPDA design generically extends available storage hierarchies into the batch system. Using the batch system itself for scheduling file locality, an array of independent caches on the worker nodes is dynamically populated with high-profile data. Cache state information is exposed to the batch system both for managing caches and scheduling jobs. As a result, users directly work with a regular, adequately sized storage system. However, their automated batch processes are presented with local replications of data whenever possible.
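
    A minimal host-side sketch of the cache-aware scheduling idea described above: given the cache state reported by each worker node, a job is steered to the node that already holds the largest fraction of its input files. The data structures and file names are hypothetical, not the HPDA implementation.

      def best_worker(job_files, cache_state):
          """Pick the worker whose local cache already holds most of the job's input files.
          cache_state maps worker name -> set of cached file names."""
          def cached_fraction(worker):
              cached = cache_state[worker]
              return sum(f in cached for f in job_files) / len(job_files)
          return max(cache_state, key=cached_fraction)

      cache_state = {
          "node01": {"run1.root", "run2.root"},
          "node02": {"run2.root", "run3.root", "run4.root"},
      }
      job_files = ["run3.root", "run4.root", "run5.root"]
      print(best_worker(job_files, cache_state))   # node02: 2 of 3 files already local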

  7. A novel visual hardware behavioral language

    NASA Technical Reports Server (NTRS)

    Li, Xueqin; Cheng, H. D.

    1992-01-01

    Most hardware behavioral languages use only text to describe the behavior of the desired hardware design. This is inconvenient for VLSI designers who prefer the schematic approach. The proposed visual hardware behavioral language has the ability to graphically express design information using visual parallel models (blocks), visual sequential models (processes) and visual data flow graphs (which consist of primitive operational icons, control icons, and Data and Synchro links). Thus, the proposed visual hardware behavioral language can not only specify hardware concurrent and sequential functionality, but can also visually expose parallelism, sequentiality, and disjointness (mutually exclusive operations) to the hardware designers. This helps hardware designers capture design ideas easily and explicitly using the visual hardware behavioral language.

  8. Examination of the Relation between TEOG Score of Turkish Revolution History and Kemalism Course and Reading Comprehension Skill (An Example of Explanatory Sequential Mixed Design)

    ERIC Educational Resources Information Center

    Yuvaci, Ibrahim; Demir, Selçuk Besir

    2016-01-01

    This paper aims to determine the relation between reading comprehension skill and TEOG success. In this research, a mixed research method, the sequential explanatory mixed design, is utilized to thoroughly examine the relation between the reading comprehension skills and TEOG success of 8th grade students. In explanatory sequential mixed design…

  9. Generalized Linear Covariance Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Markley, F. Landis

    2014-01-01

    This talk presents a comprehensive approach to filter modeling for generalized covariance analysis of both batch least-squares and sequential estimators. We review and extend in two directions the results of prior work that allowed for partitioning of the state space into "solve-for" and "consider" parameters, accounted for differences between the formal values and the true values of the measurement noise, process noise, and a priori solve-for and consider covariances, and explicitly partitioned the errors into subspaces containing only the influence of the measurement noise, process noise, and solve-for and consider covariances. In this work, we explicitly add sensitivity analysis to this prior work, and relax an implicit assumption that the batch estimator's epoch time occurs prior to the definitive span. We also apply the method to an integrated orbit and attitude problem, in which gyro and accelerometer errors, though not estimated, influence the orbit determination performance. We illustrate our results using two graphical presentations, which we call the "variance sandpile" and the "sensitivity mosaic," and we compare the linear covariance results to confidence intervals associated with ensemble statistics from a Monte Carlo analysis.
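
    A minimal sketch of the solve-for/consider partition for a batch least-squares estimator: the consider parameters are not estimated, but their a priori covariance inflates the solve-for covariance through a sensitivity matrix. The formulas below are the generic textbook form under these assumptions, not the paper's exact formulation, and the partial-derivative matrices are random placeholders.

      import numpy as np

      def consider_covariance(Hx, Hc, R, Pcc):
          """Batch least-squares solve-for covariance including consider parameters.

          Hx : measurement partials w.r.t. solve-for states
          Hc : measurement partials w.r.t. consider parameters (not estimated)
          R  : measurement noise covariance
          Pcc: a priori covariance of the consider parameters
          """
          W = np.linalg.inv(R)
          Px = np.linalg.inv(Hx.T @ W @ Hx)          # noise-only (formal) covariance
          S = -Px @ Hx.T @ W @ Hc                    # sensitivity of estimate to consider params
          Pc = Px + S @ Pcc @ S.T                    # consider-augmented covariance
          return Px, S, Pc

      rng = np.random.default_rng(3)
      Hx = rng.normal(size=(20, 4))                  # 20 measurements, 4 solve-for states
      Hc = rng.normal(size=(20, 2))                  # 2 consider parameters
      R = 0.01 * np.eye(20)
      Pcc = np.diag([1e-4, 4e-4])
      Px, S, Pc = consider_covariance(Hx, Hc, R, Pcc)
      print("formal sigmas   :", np.sqrt(np.diag(Px)))
      print("consider sigmas :", np.sqrt(np.diag(Pc)))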

  10. GPU Accelerated Clustering for Arbitrary Shapes in Geoscience Data

    NASA Astrophysics Data System (ADS)

    Pankratius, V.; Gowanlock, M.; Rude, C. M.; Li, J. D.

    2016-12-01

    Clustering algorithms have become a vital component in intelligent systems for geoscience that helps scientists discover and track phenomena of various kinds. Here, we outline advances in Density-Based Spatial Clustering of Applications with Noise (DBSCAN), which detects clusters of arbitrary shape that are common in geospatial data. In particular, we propose a hybrid CPU-GPU implementation of DBSCAN and highlight new optimization approaches on the GPU that allow clustering detection in parallel while optimizing data transport during CPU-GPU interactions. We employ an efficient batching scheme between the host and GPU such that limited GPU memory is not prohibitive when processing large and/or dense datasets. To minimize data transfer overhead, we estimate the total workload size and employ an execution that generates optimized batches that will not overflow the GPU buffer. This work is demonstrated on space weather Total Electron Content (TEC) datasets containing over 5 million measurements from instruments worldwide, and allows scientists to spot spatially coherent phenomena with ease. Our approach is up to 30 times faster than a sequential implementation and therefore accelerates discoveries in large datasets. We acknowledge support from NSF ACI-1442997.
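
    The batching scheme mentioned above can be sketched on the host side as follows: the expected result volume per point is estimated and the index range is split so that each batch's results fit in the GPU buffer. All sizes are hypothetical and no actual GPU kernel is shown.

      def make_batches(n_points, est_results_per_point, bytes_per_result, gpu_buffer_bytes):
          """Split the dataset index range into batches whose estimated result size
          fits in the GPU result buffer (host-side planning only)."""
          per_point = est_results_per_point * bytes_per_result
          points_per_batch = max(1, gpu_buffer_bytes // per_point)
          return [(start, min(start + points_per_batch, n_points))
                  for start in range(0, n_points, points_per_batch)]

      # Example: 5 million points, ~40 estimated neighbours each, 8-byte results, 256 MB buffer.
      batches = make_batches(5_000_000, 40, 8, 256 * 1024**2)
      print(len(batches), "batches, first:", batches[0])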

  11. The Electrophysiological Biosensor for Batch-Measurement of Cell Signals

    NASA Astrophysics Data System (ADS)

    Suzuki, Kengo; Tanabe, Masato; Ezaki, Takahiro; Konishi, Satoshi; Oka, Hiroaki; Ozaki, Nobuhiko

    This paper presents the development of an electrophysiological biosensor. The developed sensor allows batch-measurement by detecting the signals from a large number of cells together. It employs the same measurement principle as the patch-clamp technique: a single cell is sucked and clamped in a micro hole with a detecting electrode. The detecting electrodes in the arrayed micro holes are connected together for the batch-measurement of signals from a large number of cells. Furthermore, an array of sensors for batch-measurement is designed to improve measurement throughput to satisfy the requirements of the drug screening application.

  12. Hierarchical Bayesian models to assess between- and within-batch variability of pathogen contamination in food.

    PubMed

    Commeau, Natalie; Cornu, Marie; Albert, Isabelle; Denis, Jean-Baptiste; Parent, Eric

    2012-03-01

    Assessing within-batch and between-batch variability is of major interest for risk assessors and risk managers in the context of microbiological contamination of food. For example, the ratio between the within-batch variability and the between-batch variability has a large impact on the results of a sampling plan. Here, we designed hierarchical Bayesian models to represent such variability. Compatible priors were built mathematically to obtain sound model comparisons. A numeric criterion is proposed to assess the contamination structure, comparing the ability of the models to replicate grouped data at the batch level using a posterior predictive loss approach. The models were applied to two case studies: contamination by Listeria monocytogenes of pork breast used to produce diced bacon, and contamination by the same microorganism of cold-smoked salmon at the end of the process. In the first case study, a contamination structure clearly exists and is located at the batch level, that is, the between-batch variability is relatively strong, whereas in the second a structure also exists but is less marked. © 2012 Society for Risk Analysis.
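
    A minimal sketch of a hierarchical model separating between-batch and within-batch variability on log-scale concentrations, written with PyMC (assumed to be installed, version 4 or later); the priors, simulated data and batch structure are illustrative and do not reproduce the authors' models or their compatible-prior construction.

      import numpy as np
      import pymc as pm

      rng = np.random.default_rng(0)
      n_batches, units_per_batch = 8, 6
      batch_means = rng.normal(2.0, 0.8, n_batches)          # simulated log10 contamination per batch
      y = np.concatenate([rng.normal(m, 0.3, units_per_batch) for m in batch_means])
      batch_idx = np.repeat(np.arange(n_batches), units_per_batch)

      with pm.Model():
          mu = pm.Normal("mu", mu=2.0, sigma=2.0)             # overall mean contamination
          sigma_b = pm.HalfNormal("sigma_between", sigma=1.0) # between-batch standard deviation
          sigma_w = pm.HalfNormal("sigma_within", sigma=1.0)  # within-batch standard deviation
          batch_effect = pm.Normal("batch_effect", mu=0.0, sigma=sigma_b, shape=n_batches)
          pm.Normal("y", mu=mu + batch_effect[batch_idx], sigma=sigma_w, observed=y)
          idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

      print(idata.posterior[["sigma_between", "sigma_within"]].mean())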

  13. Establishing column batch repeatability according to Quality by Design (QbD) principles using modeling software.

    PubMed

    Rácz, Norbert; Kormány, Róbert; Fekete, Jenő; Molnár, Imre

    2015-04-10

    Column technology needs further improvement even today. To obtain information on batch-to-batch repeatability, intelligent modeling software was applied. Twelve columns from the same production process, but from different batches, were compared in this work. In this paper, the retention parameters of these columns with real-life sample solutes were studied. The following parameters were selected for the measurements: gradient time, temperature and pH. Based on the calculated results, the batch-to-batch repeatability of BEH columns was evaluated. Two parallel measurements on two columns from the same batch were performed to obtain information about the quality of packing. Calculating the average of the individual working points at the highest critical resolution (R(s,crit)), it was found that the robustness, calculated with a newly released robustness module, had a success rate >98% among the predicted 3^6 = 729 experiments for all 12 columns. With the help of retention modeling, all substances could be separated independently of the batch and/or packing, using the same conditions, with high robustness of the experiments. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Preparation, Characterization, and Optimization of Folic Acid-Chitosan-Methotrexate Core-Shell Nanoparticles by Box-Behnken Design for Tumor-Targeted Drug Delivery.

    PubMed

    Naghibi Beidokhti, Hamid Reza; Ghaffarzadegan, Reza; Mirzakhanlouei, Sasan; Ghazizadeh, Leila; Dorkoosh, Farid Abedin

    2017-01-01

    The objective of this study was to investigate the combined influence of independent variables on the preparation of folic acid-chitosan-methotrexate nanoparticles (FA-Chi-MTX NPs). These NPs were designed and prepared for targeted drug delivery to tumors. The NPs of each batch were prepared by a coaxial electrospray atomization method and evaluated for particle size (PS) and particle size distribution (PSD). The independent variables were selected to be the concentration of FA-chitosan, the ratio of shell solution flow rate to core solution flow rate, and the applied voltage. The process design of experiments (DOE) was set up with three factors at three levels using Design Expert software. A Box-Behnken design was used to select 15 batches of experiments randomly. The chemical structure of FA-chitosan was examined by FTIR. The NPs of each batch were collected separately, and the morphologies of the NPs were investigated by field emission scanning electron microscopy (FE-SEM). The captured pictures of all batches were analyzed by ImageJ software. Mean PS and PSD were calculated for each batch. A polynomial equation was produced for each response. The FE-SEM results showed that the mean diameter of the core-shell NPs was around 304 nm, and nearly 30% of the produced NPs were in the desirable range. Optimum formulations were selected. The validation of the DOE optimization results showed errors around 2.5 and 2.3% for PS and PSD, respectively. Moreover, the feasibility of using the prepared NPs to target the tumor extracellular pH was shown, as drug release was greater at the pH of the endosome (acidic medium). Finally, our results proved that the FA-Chi-MTX NPs were active against human epithelial cervical cancer (HeLa) cells.
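
    For three factors at three coded levels, a Box-Behnken design consists of the twelve edge midpoints of the cube plus centre points, giving the 15-run layout mentioned above. The sketch below generates the coded design matrix; mapping the coded levels to actual factor settings is left out, and the number of centre points is an assumption.

      import itertools
      import numpy as np

      def box_behnken_3factor(n_center=3):
          """Coded Box-Behnken design for 3 factors: all +/-1 combinations taken
          two factors at a time (the third held at 0), plus centre points."""
          runs = []
          for i, j in itertools.combinations(range(3), 2):
              for a, b in itertools.product((-1, 1), repeat=2):
                  row = [0, 0, 0]
                  row[i], row[j] = a, b
                  runs.append(row)
          runs.extend([[0, 0, 0]] * n_center)
          return np.array(runs)

      design = box_behnken_3factor()
      print(design.shape)   # (15, 3): 12 edge midpoints + 3 centre points
      print(design)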

  15. Group-sequential three-arm noninferiority clinical trial designs

    PubMed Central

    Ochiai, Toshimitsu; Hamasaki, Toshimitsu; Evans, Scott R.; Asakura, Koko; Ohno, Yuko

    2016-01-01

    We discuss group-sequential three-arm noninferiority clinical trial designs that include active and placebo controls for evaluating both assay sensitivity and noninferiority. We extend two existing approaches, the fixed margin and fraction approaches, into a group-sequential setting with two decision-making frameworks. We investigate the operating characteristics, including power, Type I error rate, and maximum and expected sample sizes, as design factors vary. In addition, we discuss sample size recalculation and its impact on the power and Type I error rate via a simulation study. PMID:26892481

  16. Catalytic and electrocatalytic hydrogenolysis of brominated diphenyl ethers.

    PubMed

    Bonin, Pascale M L; Edwards, Patrick; Bejan, Dorin; Lo, Chun Chi; Bunce, Nigel J; Konstantinov, Alexandre D

    2005-02-01

    Polybrominated diphenyl ethers (PBDEs) are ubiquitous environmental contaminants due to their use as additive flame-retardants. Conventional catalytic hydrogenolysis in methanol solution and electrocatalytic hydrogenolysis in aqueous methanol were examined as methods for debrominating mono- and di-bromodiphenyl ethers, as well as a commercial penta-PBDE mixture, in each case using palladium on alumina as the catalyst. Electrocatalytic hydrogenolysis employed a divided flow-through batch cell, with reticulated vitreous carbon cathodes and IrO2/Ti dimensionally stable anodes. Both methods gave efficient sequential debromination, with essentially complete removal of bromine from the PBDEs, but the electrocatalytic method was limited by the poor solubility of PBDEs in aqueous methanol.

  17. An adaptive two-stage sequential design for sampling rare and clustered populations

    USGS Publications Warehouse

    Brown, J.A.; Salehi, M.M.; Moradi, M.; Bell, G.; Smith, D.R.

    2008-01-01

    How to design an efficient large-area survey continues to be an interesting question for ecologists. In sampling large areas, as is common in environmental studies, adaptive sampling can be efficient because it ensures survey effort is targeted to subareas of high interest. In two-stage sampling, higher density primary sample units are usually of more interest than lower density primary units when populations are rare and clustered. Two-stage sequential sampling has been suggested as a method for allocating second stage sample effort among primary units. Here, we suggest a modification: adaptive two-stage sequential sampling. In this method, the adaptive part of the allocation process means the design is more flexible in how much extra effort can be directed to higher-abundance primary units. We discuss how best to design an adaptive two-stage sequential sample. © 2008 The Society of Population Ecology and Springer.
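
    A minimal sketch of the adaptive allocation idea: all primary units receive a small first-stage sample, and second-stage effort is increased only in primary units whose first-stage counts exceed a threshold. The allocation rule and constants are illustrative, not the authors' design or estimator.

      import numpy as np

      def allocate_second_stage(first_stage_counts, base_n=2, extra_n=8, threshold=1):
          """Adaptive allocation of second-stage sample sizes among primary units:
          units whose first-stage count exceeds the threshold get extra effort."""
          counts = np.asarray(first_stage_counts)
          return np.where(counts > threshold, base_n + extra_n, base_n)

      # First-stage counts of a rare, clustered species in 10 primary units.
      first_stage = [0, 0, 3, 0, 1, 7, 0, 0, 2, 0]
      print(allocate_second_stage(first_stage))   # e.g. [2 2 10 2 2 10 2 2 10 2]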

  18. Nutrient recovery from the dry grind process using sequential micro and ultrafiltration of thin stillage.

    PubMed

    Arora, Amit; Dien, Bruce S; Belyea, Ronald L; Singh, Vijay; Tumbleson, M E; Rausch, Kent D

    2010-06-01

    The effectiveness of microfiltration (MF) and ultrafiltration (UF) for nutrient recovery from a thin stillage stream was determined. When a stainless steel MF membrane (0.1 μm pore size) was used, the content of solids increased from 7.0% to 22.8% with a mean permeate flux rate of 45 L/m(2)/h (LMH), fat increased and ash content decreased. UF experiments were conducted in batch mode under constant temperature and flow rate conditions. Permeate flux profiles were evaluated for regenerated cellulose membranes (YM1, YM10 and YM100) with molecular weight cut-offs of 1, 10 and 100 kDa. UF increased total solids, protein and fat and decreased ash in the retentate stream. When permeate streams from MF were subjected to UF, retentate total solids concentrations similar to those of commercial syrup (23-28.8%) were obtained. YM100 had the highest percent permeate flux decline (70% of initial flux) followed by the YM10 and YM1 membranes. Sequential filtration improved permeate flux rates of the YM100 membrane (32.6-73.4 LMH) but the percent decline was also highest in a sequential MF+YM100 system. Protein recovery was the highest in the YM1 retentate. Removal of solids, protein and fat from thin stillage may generate a permeate stream that may improve water removal efficiency and increase water recycling. Copyright 2010 Elsevier Ltd. All rights reserved.

  19. Sequential analysis in neonatal research-systematic review.

    PubMed

    Lava, Sebastiano A G; Elie, Valéry; Ha, Phuong Thi Viet; Jacqz-Aigrain, Evelyne

    2018-05-01

    As more new drugs are discovered, traditional designs come to their limits. Ten years after the adoption of the European Paediatric Regulation, we performed a systematic review on the US National Library of Medicine and Excerpta Medica database of sequential trials involving newborns. Out of 326 identified scientific reports, 21 trials were included. They enrolled 2832 patients, of whom 2099 were analyzed: the median number of neonates included per trial was 48 (IQR 22-87), and the median gestational age was 28.7 (IQR 27.9-30.9) weeks. Eighteen trials used sequential techniques to determine sample size, while 3 used continual reassessment methods for dose-finding. In 16 studies reporting sufficient data, the sequential design allowed a non-significant reduction in the number of enrolled neonates by a median of 24 (31%) patients (IQR -4.75 to 136.5, p = 0.0674) with respect to a traditional trial. When the number of neonates finally included in the analysis was considered, the difference became significant: 35 (57%) patients (IQR 10 to 136.5, p = 0.0033). Sequential trial designs have not been frequently used in neonatology. They might potentially be able to reduce the number of patients in drug trials, although this is not always the case. What is known: • In evaluating rare diseases in fragile populations, traditional designs come to their limits. About 20% of pediatric trials are discontinued, mainly because of recruitment problems. What is new: • Sequential trials involving newborns were infrequently used and only a few (n = 21) are available for analysis. • The sequential design allowed a non-significant reduction in the number of enrolled neonates by a median of 24 (31%) patients (IQR -4.75 to 136.5, p = 0.0674).

  20. Orbit Determination Toolbox

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Berry, Kevin; Gregpru. Late; Speckman, Keith; Hur-Diaz, Sun; Surka, Derek; Gaylor, Dave

    2010-01-01

    The Orbit Determination Toolbox is an orbit determination (OD) analysis tool based on MATLAB and Java that provides a flexible way to do early mission analysis. The toolbox is primarily intended for advanced mission analysis such as might be performed in concept exploration, proposal, early design phase, or rapid design center environments. The emphasis is on flexibility, but it has enough fidelity to produce credible results. Insight into all flight dynamics source code is provided. MATLAB is the primary user interface and is used for piecing together measurement and dynamic models. The Java Astrodynamics Toolbox is used as an engine for things that might be slow or inefficient in MATLAB, such as high-fidelity trajectory propagation, lunar and planetary ephemeris look-ups, precession, nutation, polar motion calculations, ephemeris file parsing, and the like. The primary analysis functions are sequential filter/smoother and batch least-squares commands that incorporate Monte-Carlo data simulation, linear covariance analysis, measurement processing, and plotting capabilities at the generic level. These functions have a user interface that is based on that of the MATLAB ODE suite. To perform a specific analysis, users write MATLAB functions that implement truth and design system models. The user provides his or her models as inputs to the filter commands. The software provides a capability to publish and subscribe to a software bus that is compliant with the NASA Goddard Mission Services Evolution Center (GMSEC) standards, to exchange data with other flight dynamics tools to simplify the flight dynamics design cycle. Using the publish and subscribe approach allows for analysts in a rapid design center environment to seamlessly incorporate changes in spacecraft and mission design into navigation analysis and vice versa.

  1. Layer-by-layer assembly of two-dimensional materials into wafer-scale heterostructures

    NASA Astrophysics Data System (ADS)

    Kang, Kibum; Lee, Kan-Heng; Han, Yimo; Gao, Hui; Xie, Saien; Muller, David A.; Park, Jiwoong

    2017-10-01

    High-performance semiconductor films with vertical compositions that are designed to atomic-scale precision provide the foundation for modern integrated circuitry and novel materials discovery. One approach to realizing such films is sequential layer-by-layer assembly, whereby atomically thin two-dimensional building blocks are vertically stacked, and held together by van der Waals interactions. With this approach, graphene and transition-metal dichalcogenides--which represent one- and three-atom-thick two-dimensional building blocks, respectively--have been used to realize previously inaccessible heterostructures with interesting physical properties. However, no large-scale assembly method exists at present that maintains the intrinsic properties of these two-dimensional building blocks while producing pristine interlayer interfaces, thus limiting the layer-by-layer assembly method to small-scale proof-of-concept demonstrations. Here we report the generation of wafer-scale semiconductor films with a very high level of spatial uniformity and pristine interfaces. The vertical composition and properties of these films are designed at the atomic scale using layer-by-layer assembly of two-dimensional building blocks under vacuum. We fabricate several large-scale, high-quality heterostructure films and devices, including superlattice films with vertical compositions designed layer-by-layer, batch-fabricated tunnel device arrays with resistances that can be tuned over four orders of magnitude, band-engineered heterostructure tunnel diodes, and millimetre-scale ultrathin membranes and windows. The stacked films are detachable, suspendable and compatible with water or plastic surfaces, which will enable their integration with advanced optical and mechanical systems.

  2. Layer-by-layer assembly of two-dimensional materials into wafer-scale heterostructures.

    PubMed

    Kang, Kibum; Lee, Kan-Heng; Han, Yimo; Gao, Hui; Xie, Saien; Muller, David A; Park, Jiwoong

    2017-10-12

    High-performance semiconductor films with vertical compositions that are designed to atomic-scale precision provide the foundation for modern integrated circuitry and novel materials discovery. One approach to realizing such films is sequential layer-by-layer assembly, whereby atomically thin two-dimensional building blocks are vertically stacked, and held together by van der Waals interactions. With this approach, graphene and transition-metal dichalcogenides-which represent one- and three-atom-thick two-dimensional building blocks, respectively-have been used to realize previously inaccessible heterostructures with interesting physical properties. However, no large-scale assembly method exists at present that maintains the intrinsic properties of these two-dimensional building blocks while producing pristine interlayer interfaces, thus limiting the layer-by-layer assembly method to small-scale proof-of-concept demonstrations. Here we report the generation of wafer-scale semiconductor films with a very high level of spatial uniformity and pristine interfaces. The vertical composition and properties of these films are designed at the atomic scale using layer-by-layer assembly of two-dimensional building blocks under vacuum. We fabricate several large-scale, high-quality heterostructure films and devices, including superlattice films with vertical compositions designed layer-by-layer, batch-fabricated tunnel device arrays with resistances that can be tuned over four orders of magnitude, band-engineered heterostructure tunnel diodes, and millimetre-scale ultrathin membranes and windows. The stacked films are detachable, suspendable and compatible with water or plastic surfaces, which will enable their integration with advanced optical and mechanical systems.

  3. Biodegradation of the X-ray contrast agent iopromide and the fluoroquinolone antibiotic ofloxacin by the white rot fungus Trametes versicolor in hospital wastewaters and identification of degradation products.

    PubMed

    Gros, Meritxell; Cruz-Morato, Carles; Marco-Urrea, Ernest; Longrée, Philipp; Singer, Heinz; Sarrà, Montserrat; Hollender, Juliane; Vicent, Teresa; Rodriguez-Mozaz, Sara; Barceló, Damià

    2014-09-01

    This paper describes the degradation of the X-ray contrast agent iopromide (IOP) and the antibiotic ofloxacin (OFLOX) by the white-rot fungus Trametes versicolor. Batch studies in synthetic medium revealed that between 60 and 80% of IOP and OFLOX were removed when spiked at approximately 12 mg L(-1) and 10 mg L(-1), respectively. A significant number of transformation products (TPs) were identified for both pharmaceuticals, confirming their degradation. IOP TPs were attributed to two principal reactions: (i) sequential deiodination of the aromatic ring and (ii) N-dealkylation of the amide at the hydroxylated side chain of the molecule. On the other hand, OFLOX transformation products were attributed mainly to the oxidation, hydroxylation and cleavage of the piperazine ring. Experiments in a 10 L bioreactor with fungal biomass fluidized by air pulses, operated in batch mode, achieved a high percentage of degradation of IOP and OFLOX when loaded with sterile (87% IOP, 98.5% OFLOX) and unsterile (65.4% IOP, 99% OFLOX) hospital wastewater (HWW) at their real concentrations (μg L(-1) level). Some of the most relevant IOP and OFLOX TPs identified in synthetic medium were also detected in bioreactor samples. Acute toxicity tests indicated a reduction of the toxicity in the final culture broth from both the experiments in synthetic medium and in the batch bioreactor. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Integrated ecotechnology approach towards treatment of complex wastewater with simultaneous bioenergy production.

    PubMed

    Hemalatha, Manupati; Sravan, J Shanthi; Yeruva, Dileep Kumar; Venkata Mohan, S

    2017-10-01

    Sequential integration of three-stage diverse biological processes was studied by exploiting the individual process advantages towards enhanced treatment of complex chemical-based wastewater. A successful attempt was made to integrate a sequencing batch reactor (SBR) with bioelectrochemical treatment (BET) and finally with microalgae treatment. The sequential integration showed individual substrate degradation (COD) of 55% in SBR, 49% in BET and 56% in microalgae, accounting for a consolidated treatment efficiency of 90%. Nitrate removal efficiency of 25% was observed in SBR, 31% in BET and 44% in microalgae, with a total efficiency of 72%. The SBR-treated effluents fed to BET with the electrode intervention showed TDS removal. BET exhibited relatively higher process performance than SBR. The integration approach significantly overcame the individual process limitations along with value addition as biomass (1.75 g/L), carbohydrates (640 mg/g), lipids (15%) and bioelectricity. The study provides a strategy of combining SBR as a pretreatment step to the BET process, with final polishing by microalgae cultivation, achieving the benefits of enhanced wastewater treatment along with value addition. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Kullback-Leibler information function and the sequential selection of experiments to discriminate among several linear models. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.

    1972-01-01

    A sequential adaptive experimental design procedure for a related problem is studied. It is assumed that a finite set of potential linear models relating certain controlled variables to an observed variable is postulated, and that exactly one of these models is correct. The problem is to sequentially design most informative experiments so that the correct model equation can be determined with as little experimentation as possible. Discussion includes: structure of the linear models; prerequisite distribution theory; entropy functions and the Kullback-Leibler information function; the sequential decision procedure; and computer simulation results. An example of application is given.
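
    To make the idea concrete, the sketch below picks each new design point by maximizing the Kullback-Leibler divergence between the Gaussian predictions of two rival linear models (a line versus a quadratic). It is only a toy illustration of the information-based selection principle described above, not Sidik's procedure; the candidate models, noise level, design grid and data-generating function are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.5                      # assumed known observation noise (illustrative)
x_grid = np.linspace(-2, 2, 41)  # candidate design points

# Two rival linear models: straight line vs. quadratic (hypothetical example)
designs = [lambda x: np.column_stack([np.ones_like(x), x]),
           lambda x: np.column_stack([np.ones_like(x), x, x**2])]

def true_response(x):            # data-generating model, unknown to the experimenter
    return 1.0 + 0.8 * x + 0.6 * x**2

# Start with a few arbitrary observations
x_obs = np.array([-1.5, 0.0, 1.5])
y_obs = true_response(x_obs) + rng.normal(0, sigma, x_obs.size)

for step in range(6):
    # Fit each candidate model by least squares
    fits = [np.linalg.lstsq(d(x_obs), y_obs, rcond=None)[0] for d in designs]
    preds = [d(x_grid) @ beta for d, beta in zip(designs, fits)]
    # KL divergence between the two Gaussian predictive densities with equal variance
    kl = (preds[0] - preds[1]) ** 2 / (2 * sigma**2)
    x_next = x_grid[np.argmax(kl)]          # most discriminating experiment
    y_next = true_response(x_next) + rng.normal(0, sigma)
    x_obs, y_obs = np.append(x_obs, x_next), np.append(y_obs, y_next)
    # Report residual sums of squares as a crude model-discrimination score
    rss = [np.sum((y_obs - d(x_obs) @ np.linalg.lstsq(d(x_obs), y_obs, rcond=None)[0]) ** 2)
           for d in designs]
    print(f"step {step}: next x = {x_next:+.2f}, RSS(line) = {rss[0]:.2f}, RSS(quad) = {rss[1]:.2f}")
```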

  6. FUGITIVE EMISSION SOURCES AND BATCH OPERATIONS IN SYNTHETIC ORGANIC CHEMICAL PRODUCTION

    EPA Science Inventory

    This survey report was developed for the EPA for use in assessing the potential magnitude of fugitive volatile organic compound (VOC) emissions from agitator seals, cooling towers and batch operations in the production of 378 designated chemicals. The information presented in thi...

  7. Systematic optimization of fed-batch simultaneous saccharification and fermentation at high-solid loading based on enzymatic hydrolysis and dynamic metabolic modeling of Saccharomyces cerevisiae.

    PubMed

    Unrean, Pornkamol; Khajeeram, Sutamat; Laoteng, Kobkul

    2016-03-01

    An integrative simultaneous saccharification and fermentation (SSF) model is a useful guiding tool for rapid process optimization to meet the techno-economic requirements of industrial-scale lignocellulosic ethanol production. In this work, we have developed an SSF model comprising a metabolic network of a Saccharomyces cerevisiae cell associated with fermentation kinetics and an enzyme hydrolysis model to quantitatively capture dynamic responses of yeast cell growth and fermentation during SSF. By using model-based design of feeding profiles for substrate and yeast cells in the fed-batch SSF process, efficient ethanol production with a high titer of up to 65 g/L and a high yield of 85% of the theoretical yield was accomplished. The ethanol titer and productivity were increased by 47 and 41%, respectively, in the optimized fed-batch SSF as compared to the batch process. The developed integrative SSF model is, therefore, considered a promising approach for the systematic design of economical and sustainable SSF bioprocessing of lignocellulose.
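
    The record combines an enzymatic hydrolysis model with fermentation kinetics. As a minimal stand-in for that kind of integrative SSF model (the published model uses a full metabolic network, which is not reproduced here), the sketch below couples product-inhibited cellulose hydrolysis with Monod growth and ethanol formation and adds a simple solids feed term; all parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameter values (assumptions, not fitted to the paper's data)
k_h, K_ig = 0.08, 5.0        # hydrolysis rate constant (1/h), glucose inhibition constant (g/L)
mu_max, K_s = 0.3, 0.5       # yeast growth: max specific rate (1/h), Monod constant (g/L)
Y_xs, Y_es = 0.1, 0.45       # biomass and ethanol yields on glucose (g/g)
feed_rate, feed_until = 2.0, 24.0   # cellulose feed (g/L/h) and feeding window (h) for fed-batch

def ssf(t, y):
    S, G, X, E = y           # cellulose, glucose, biomass, ethanol (g/L)
    r_h = k_h * S / (1 + G / K_ig)          # product-inhibited enzymatic hydrolysis
    mu = mu_max * G / (K_s + G)             # Monod growth on glucose
    q_s = mu / Y_xs                         # specific glucose uptake
    feed = feed_rate if t < feed_until else 0.0
    dS = feed - r_h
    dG = 1.111 * r_h - q_s * X              # 1.111: stoichiometric gain, cellulose -> glucose
    dX = mu * X
    dE = Y_es * q_s * X
    return [dS, dG, dX, dE]

sol = solve_ivp(ssf, (0, 96), [100.0, 0.0, 1.0, 0.0], max_step=0.5)
S, G, X, E = sol.y[:, -1]
print(f"after 96 h: cellulose {S:.1f} g/L, glucose {G:.1f} g/L, biomass {X:.1f} g/L, ethanol {E:.1f} g/L")
```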

  8. Simultaneous sequential monitoring of efficacy and safety led to masking of effects.

    PubMed

    van Eekelen, Rik; de Hoop, Esther; van der Tweel, Ingeborg

    2016-08-01

    Usually, sequential designs for clinical trials are applied to the primary (efficacy) outcome. In practice, other outcomes (e.g., safety) will also be monitored and influence the decision whether to stop a trial early. The implications of simultaneous monitoring for trial decision making are as yet unclear. This study examines what happens to the type I error, power, and required sample sizes when one efficacy outcome and one correlated safety outcome are monitored simultaneously using sequential designs. We conducted a simulation study in the framework of a two-arm parallel clinical trial. Interim analyses on two outcomes were performed independently and simultaneously on the same data sets using four sequential monitoring designs, including O'Brien-Fleming and Triangular Test boundaries. Simulations differed in values for correlations and true effect sizes. When an effect was present in both outcomes, competition was introduced, which decreased power (e.g., from 80% to 60%). Futility boundaries for the efficacy outcome reduced overall type I errors as well as power for the safety outcome. Monitoring two correlated outcomes, given that both are essential for early trial termination, leads to masking of true effects. Careful consideration of scenarios must be taken into account when designing sequential trials. Simulation results can help guide trial design. Copyright © 2016 Elsevier Inc. All rights reserved.
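
    A rough way to reproduce the masking effect described above is to simulate a two-arm trial with two correlated normal outcomes and apply the same crossing boundary to both at each interim look. The sketch below does this with an approximate O'Brien-Fleming-type boundary; the effect sizes, correlation, sample size and boundary constant are assumptions, and the authors' actual designs (including the Triangular Test and futility rules) are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
n_per_arm, looks, C = 200, 4, 2.024     # 4 equally spaced looks; C approximates the O'Brien-Fleming constant
bounds = C * np.sqrt(looks / np.arange(1, looks + 1))
delta_eff, delta_saf, rho = 0.3, 0.3, 0.5   # standardized effects and outcome correlation (assumptions)
cov = [[1, rho], [rho, 1]]

def z_stat(treat, ctrl):
    n = treat.shape[0]
    return (treat.mean() - ctrl.mean()) / np.sqrt(2 / n)   # standardized outcomes, unit variance

def one_trial(stop_on_safety):
    t = rng.multivariate_normal([delta_eff, delta_saf], cov, n_per_arm)
    c = rng.multivariate_normal([0.0, 0.0], cov, n_per_arm)
    for k in range(1, looks + 1):
        n_k = k * n_per_arm // looks
        z_eff = z_stat(t[:n_k, 0], c[:n_k, 0])
        z_saf = z_stat(t[:n_k, 1], c[:n_k, 1])
        if z_eff > bounds[k - 1]:
            return "efficacy"
        if stop_on_safety and z_saf > bounds[k - 1]:
            return "safety"           # trial stops before efficacy can be declared
    return "none"

for stop_on_safety in (False, True):
    results = [one_trial(stop_on_safety) for _ in range(2000)]
    power = results.count("efficacy") / len(results)
    print(f"safety monitoring {'on ' if stop_on_safety else 'off'}: efficacy 'power' = {power:.2f}")
```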

  9. Development of a mathematical model for the growth associated Polyhydroxybutyrate fermentation by Azohydromonas australica and its use for the design of fed-batch cultivation strategies.

    PubMed

    Gahlawat, Geeta; Srivastava, Ashok K

    2013-06-01

    In the present investigation, batch cultivation of Azohydromonas australica DSM 1124 was carried out in a bioreactor for growth-associated PHB production. The observed batch PHB production kinetics data were then used for the development of a mathematical model which adequately described the substrate limitation and inhibition during the cultivation. The statistical validity test demonstrated that the proposed mathematical model predictions were significant at the 99% confidence level. The model was thereafter extrapolated to fed-batch mode to identify various nutrient feeding regimes during bioreactor cultivation to improve PHB accumulation. The distinct capability of the mathematical model to predict highly dynamic fed-batch cultivation strategies was demonstrated by experimental implementation of two fed-batch cultivation strategies. A significantly high PHB concentration of 22.65 g/L and an overall PHB content of 76% were achieved during constant-feed-rate fed-batch cultivation, which is the highest PHB content reported so far using A. australica. Copyright © 2013 Elsevier Ltd. All rights reserved.
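
    Growth-associated PHB models of this kind are typically written as substrate-limited/inhibited growth combined with Luedeking-Piret product formation, extended with feed terms for fed-batch simulation. The sketch below is a generic model of that form, not the fitted model from the paper; all parameter values, the feed schedule and the kinetic expressions are assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters for a growth-associated PHB model (assumed values, not the fitted ones)
mu_max, K_s, K_i = 0.25, 2.0, 80.0   # 1/h, g/L, g/L: substrate limitation and inhibition
Y_xs, m_s = 0.4, 0.02                # biomass yield on sugar (g/g), maintenance coefficient (g/g/h)
alpha, beta = 0.6, 0.01              # Luedeking-Piret growth- and non-growth-associated terms
F, S_f, t_feed = 0.05, 400.0, (10.0, 30.0)  # feed rate (L/h), feed sugar (g/L), feeding window (h)

def fedbatch(t, y):
    X, S, P, V = y                                   # biomass, substrate, PHB (g/L), volume (L)
    S = max(S, 0.0)
    mu = mu_max * S / (K_s + S + S**2 / K_i)         # substrate limitation plus inhibition
    m_eff = m_s * S / (K_s + S)                      # maintenance shuts off as sugar runs out
    Fin = F if t_feed[0] <= t <= t_feed[1] else 0.0
    D = Fin / V                                      # dilution rate from feeding
    dX = mu * X - D * X
    dS = -(mu / Y_xs + m_eff) * X + D * (S_f - S)
    dP = (alpha * mu + beta) * X - D * P             # Luedeking-Piret product formation
    return [dX, dS, dP, Fin]

sol = solve_ivp(fedbatch, (0, 48), [0.5, 20.0, 0.0, 3.0], max_step=0.1)
X, S, P, V = sol.y[:, -1]
# PHB content reported as the PHB fraction of total biomass (residual biomass + PHB)
print(f"t = 48 h: biomass {X:.1f} g/L, residual sugar {S:.1f} g/L, PHB {P:.1f} g/L ({100*P/(X+P):.0f}% content)")
```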

  10. Anaerobic sequencing batch reactors for wastewater treatment: a developing technology.

    PubMed

    Zaiat, M; Rodrigues, J A; Ratusznei, S M; de Camargo, E F; Borzani, W

    2001-01-01

    This paper describes and discusses the main problems related to anaerobic batch and fed-batch processes for wastewater treatment. A critical analysis of the literature evaluated the industrial application viability and proposed alternatives to improve operation and control of this system. Two approaches were presented in order to make this anaerobic discontinuous process feasible for industrial application: (1) optimization of the operating procedures in reactors containing self-immobilized sludge as granules, and (2) design of bioreactors with inert support media for biomass immobilization.

  11. AFTER: Batch jobs on the Apollo ring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofstadler, P.

    1987-07-01

    This document describes AFTER, a system that allows users of an Apollo ring to submit batch jobs to run without leaving themselves logged in to the ring. Jobs may be submitted to run at a later time or on a different node. Results from the batch job are mailed to the user through some designated mail system. AFTER features an understandable user interface, good online help, and site customization. This manual serves primarily as a user's guide to AFTER, although administration and installation are covered for completeness.

  12. Easily constructed spectroelectrochemical cell for batch and flow injection analyses.

    PubMed

    Flowers, Paul A; Maynor, Margaret A; Owens, Donald E

    2002-02-01

    The design and performance of an easily constructed spectroelectrochemical cell suitable for batch and flow injection measurements are described. The cell is fabricated from a commercially available 5-mm quartz cuvette and employs 60 ppi reticulated vitreous carbon as the working electrode, resulting in a reasonable compromise between optical sensitivity and thin-layer electrochemical behavior. The spectroelectrochemical traits of the cell in both batch and flow modes were evaluated using aqueous ferricyanide and compare favorably to those reported previously for similar cells.

  13. Metataxonomic profiling and prediction of functional behaviour of wheat straw degrading microbial consortia

    PubMed Central

    2014-01-01

    Background Mixed microbial cultures, in which bacteria and fungi interact, have been proposed as an efficient way to deconstruct plant waste. The characterization of specific microbial consortia could be the starting point for novel biotechnological applications related to the efficient conversion of lignocellulose to cello-oligosaccharides, plastics and/or biofuels. Here, the diversity, composition and predicted functional profiles of novel bacterial-fungal consortia are reported, on the basis of replicated aerobic wheat straw enrichment cultures. Results In order to set up biodegradative microcosms, microbial communities were retrieved from a forest soil and introduced into a mineral salt medium containing 1% of (un)treated wheat straw. Following each incubation step, sequential transfers were carried out using 1 to 1,000 dilutions. The microbial source, along with three sequential batch cultures (transfers 1, 3 and 10), was analyzed by bacterial 16S rRNA gene and fungal ITS1 pyrosequencing. Faith’s phylogenetic diversity values became progressively smaller from the inoculum to the sequential batch cultures. Moreover, increases in the relative abundances of Enterobacteriales, Pseudomonadales, Flavobacteriales and Sphingobacteriales were noted along the enrichment process. Operational taxonomic units affiliated with Acinetobacter johnsonii, Pseudomonas putida and Sphingobacterium faecium were abundant and the underlying strains were successfully isolated. Interestingly, Klebsiella variicola (OTU1062) was found to dominate in both consortia, and K. variicola-affiliated strains retrieved from untreated wheat straw consortia showed endoglucanase/xylanase activities. Among the fungal players with high biotechnological relevance, we recovered members of the genera Penicillium, Acremonium, Coniochaeta and Trichosporon. Remarkably, the presence of peroxidases, alpha-L-fucosidases, beta-xylosidases, beta-mannanases and beta-glucosidases, involved in lignocellulose degradation, was indicated by predictive bacterial metagenome reconstruction. Reassuringly, tests for specific (hemi)cellulolytic enzymatic activities, performed on the consortial secretomes, confirmed the presence of such gene functions. Conclusion In an in-depth characterization of two wheat straw degrading microbial consortia, we revealed the enrichment and selection of specific bacterial and fungal taxa that were presumably involved in (hemi)cellulose degradation. Interestingly, the microbial community composition was strongly influenced by the wheat straw pretreatment. Finally, the functional bacterial-metagenome prediction and the evaluation of enzymatic activities (in the consortial secretomes) revealed the presence and enrichment of proteins involved in the deconstruction of plant biomass. PMID:24955113

  14. Continuous/Batch Mg/MgH2/H2O-Based Hydrogen Generator

    NASA Technical Reports Server (NTRS)

    Kindler, Andrew; Huang, Yuhong

    2010-01-01

    A proposed apparatus for generating hydrogen by means of chemical reactions of magnesium and magnesium hydride with steam would exploit the same basic principles as those discussed in the immediately preceding article, but would be designed to implement a hybrid continuous/batch mode of operation. The design concept would simplify the problem of optimizing thermal management and would help to minimize the size and weight necessary for generating a given amount of hydrogen.

  15. Bacterial, viral and turbidity removal by intermittent slow sand filtration for household use in developing countries: experimental investigation and modeling.

    PubMed

    Jenkins, Marion W; Tiwari, Sangam K; Darby, Jeannie

    2011-11-15

    A two-factor three-block experimental design was developed to permit rigorous evaluation and modeling of the main effects and interactions of sand size (d(10) of 0.17 and 0.52 mm) and hydraulic head (10, 20, and 30 cm) on removal of fecal coliform (FC) bacteria, MS2 bacteriophage virus, and turbidity, under two batch operating modes ('long' and 'short') in intermittent slow sand filters (ISSFs). Long operation involved an overnight pause time between feeding of two successive 20 L batches (16 h average batch residence time (RT)). Short operation involved no pause between two 20 L batch feeds (5h average batch RT). Conditions tested were representative of those encountered in developing country field settings. Over a ten week period, the 18 experimental filters were fed river water augmented with wastewater (influent turbidity of 5.4-58.6 NTU) and maintained with the wet harrowing method. Linear mixed modeling allowed systematic estimates of the independent marginal effects of each independent variable on each performance outcome of interest while controlling for the effects of variations in a batch's actual residence time, days since maintenance, and influent turbidity. This is the first study in which simultaneous measurement of bacteria, viruses and turbidity removal at the batch level over an extended duration has been undertaken with a large number of replicate units to permit rigorous modeling of ISSF performance variability within and across a range of likely filter design configurations and operating conditions. On average, the experimental filters removed 1.40 log fecal coliform CFU (SD 0.40 log, N=249), 0.54 log MS2 PFU (SD 0.42 log, N=245) and 89.0 percent turbidity (SD 6.9 percent, N=263). Effluent turbidity averaged 1.24 NTU (SD 0.53 NTU, N=263) and always remained below 3 NTU. Under the best performing design configuration and operating mode (fine sand, 10 cm head, long operation, initial HLR of 0.01-0.03 m/h), mean 1.82 log removal of bacteria (98.5%) and mean 0.94 log removal of MS2 viruses (88.5%) were achieved. Results point to new recommendations regarding filter design, manufacture, and operation for implementing ISSFs in local settings in developing countries. Sand size emerged as a critical design factor on performance. A single layer of river sand used in this investigation demonstrated removals comparable to those reported for 2 layers of crushed sand. Pause time and increased residence time each emerged as highly beneficial for improving removal performance on all four outcomes. A relatively large and significant negative effect of influent turbidity on MS2 viral removal in the ISSF was measured in parallel with a much smaller weaker positive effect of influent turbidity on FC bacterial removal. Disturbance of the schmutzdecke by wet harrowing showed no effect on virus removal and a modest reductive effect on the bacterial and turbidity removal as measured 7 days or more after the disturbance. For existing coarse sand ISSFs, this research indicates that a reduction in batch feed volume, effectively reducing the operating head and increasing the pore:batch volume ratio, could improve their removal performance by increasing batch residence time. Copyright © 2011 Elsevier Ltd. All rights reserved.
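
    The analysis above relies on linear mixed models with filter-level random effects to separate design factors from batch-level covariates. The sketch below shows that kind of model fitted to synthetic stand-in data shaped like the study design; the column names, effect sizes and noise levels are invented for illustration and do not reproduce the published estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data shaped like the study design (values are made up for illustration)
rng = np.random.default_rng(2)
filters = np.repeat(np.arange(18), 12)                       # 18 filters, 12 batches each
sand = np.repeat(rng.choice(["fine", "coarse"], 18), 12)     # d10 = 0.17 vs 0.52 mm
head_cm = np.repeat(rng.choice([10, 20, 30], 18), 12)        # hydraulic head (cm)
long_op = rng.integers(0, 2, filters.size)                   # 1 = overnight pause ("long" mode)
rt = 5 + 11 * long_op + rng.normal(0, 1, filters.size)       # batch residence time (h)
log_fc_removal = (1.0 + 0.4 * (sand == "fine") - 0.01 * (head_cm - 10)
                  + 0.03 * rt + rng.normal(0, 0.2, filters.size)
                  + np.repeat(rng.normal(0, 0.1, 18), 12))   # filter-level random effect

df = pd.DataFrame({"filter": filters, "sand": sand, "head_cm": head_cm,
                   "rt": rt, "log_fc_removal": log_fc_removal})

# Random intercept per filter; fixed effects for design factors and residence time
model = smf.mixedlm("log_fc_removal ~ C(sand) + head_cm + rt", df, groups=df["filter"])
print(model.fit().summary())
```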

  16. Mechanical System Reliability and Cost Integration Using a Sequential Linear Approximation Method

    NASA Technical Reports Server (NTRS)

    Kowal, Michael T.

    1997-01-01

    The development of new products is dependent on product designs that incorporate high levels of reliability along with a design that meets predetermined levels of system cost. Additional constraints on the product include explicit and implicit performance requirements. Existing reliability and cost prediction methods result in no direct linkage between the variables affecting these two dominant product attributes. A methodology to integrate reliability and cost estimates using a sequential linear approximation method is proposed. The sequential linear approximation method utilizes probability-of-failure sensitivities determined from probabilistic reliability methods as well as manufacturing cost sensitivities. The application of the sequential linear approximation method to a mechanical system is demonstrated.

  17. Optimising the design and operation of semi-continuous affinity chromatography for clinical and commercial manufacture.

    PubMed

    Pollock, James; Bolton, Glen; Coffman, Jon; Ho, Sa V; Bracewell, Daniel G; Farid, Suzanne S

    2013-04-05

    This paper presents an integrated experimental and modelling approach to evaluate the potential of semi-continuous chromatography for the capture of monoclonal antibodies (mAb) in clinical and commercial manufacture. Small-scale single-column experimental breakthrough studies were used to derive design equations for the semi-continuous affinity chromatography system. Verification runs with the semi-continuous 3-column and 4-column periodic counter current (PCC) chromatography system indicated the robustness of the design approach. The product quality profiles and step yields (after wash step optimisation) achieved were comparable to the standard batch process. The experimentally-derived design equations were incorporated into a decisional tool comprising dynamic simulation, process economics and sizing optimisation. The decisional tool was used to evaluate the economic and operational feasibility of whole mAb bioprocesses employing PCC affinity capture chromatography versus standard batch chromatography across a product's lifecycle from clinical to commercial manufacture. The tool predicted that PCC capture chromatography would offer more significant savings in direct costs for early-stage clinical manufacture (proof-of-concept) (∼30%) than for late-stage clinical (∼10-15%) or commercial (∼5%) manufacture. The evaluation also highlighted the potential facility fit issues that could arise with a capture resin (MabSelect) that experiences losses in binding capacity when operated in continuous mode over lengthy commercial campaigns. Consequently, the analysis explored the scenario of adopting the PCC system for clinical manufacture and switching to the standard batch process following product launch. The tool determined the PCC system design required to operate at commercial scale without facility fit issues and with similar costs to the standard batch process whilst pursuing a process change application. A retrofitting analysis established that the direct cost savings obtained by 8 proof-of-concept batches would be sufficient to pay back the investment cost of the pilot-scale semi-continuous chromatography system. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Design and fabrication of a fixed-bed batch type pyrolysis reactor for pilot scale pyrolytic oil production in Bangladesh

    NASA Astrophysics Data System (ADS)

    Aziz, Mohammad Abdul; Al-khulaidi, Rami Ali; Rashid, MM; Islam, M. R.; Rashid, MAN

    2017-03-01

    In this research, the development and performance testing of a fixed-bed batch type pyrolysis reactor for pilot-scale pyrolysis oil production was successfully completed. The characteristics of the pyrolysis oil were compared to other experimental results. A solid horizontal condenser, a burner for furnace heating and a reactor shield were designed. The pilot-scale pyrolytic oil production encountered numerous problems during the plant’s operation. This fixed-bed batch type pyrolysis reactor method demonstrates an energy-saving concept for solid waste tires by creating energy stability. From this experiment, product yields (wt.%) were 49% liquid (pyrolytic oil), 38.3% char and 12.7% pyrolytic gas, with an operation running time of 185 minutes.

  19. Recommendations of the VAC2VAC workshop on the design of multi-centre validation studies.

    PubMed

    Halder, Marlies; Depraetere, Hilde; Delannois, Frédérique; Akkermans, Arnoud; Behr-Gross, Marie-Emmanuelle; Bruysters, Martijn; Dierick, Jean-François; Jungbäck, Carmen; Kross, Imke; Metz, Bernard; Pennings, Jeroen; Rigsby, Peter; Riou, Patrice; Balks, Elisabeth; Dobly, Alexandre; Leroy, Odile; Stirling, Catrina

    2018-03-01

    Within the Innovative Medicines Initiative 2 (IMI 2) project VAC2VAC (Vaccine batch to vaccine batch comparison by consistency testing), a workshop has been organised to discuss ways of improving the design of multi-centre validation studies and use the data generated for product-specific validation purposes. Moreover, aspects of validation within the consistency approach context were addressed. This report summarises the discussions and outlines the conclusions and recommendations agreed on by the workshop participants. Copyright © 2018.

  20. Trial Sequential Methods for Meta-Analysis

    ERIC Educational Resources Information Center

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  1. Anaerobic ammonia removal in presence of organic matter: a novel route.

    PubMed

    Sabumon, P C

    2007-10-01

    This study describes the feasibility of an anaerobic ammonia removal process in the presence of organic matter. Different sources of biomass collected from diverse ecosystems containing ammonia and organic matter (OM) were screened for potential anaerobic ammonia removal. Sequential batch studies confirmed the possibility of anaerobic ammonia removal in the presence of OM, but ammonia was oxidized anoxically to nitrate (at an oxidation reduction potential, ORP, of -248+/-25 mV) by an unknown mechanism, unlike in the reported anammox process. The oxygen required for oxidation of ammonia might have been generated through the catalase enzymatic activity of facultative anaerobes in mixed culture. The possibility of oxygen generation by the catalase enzyme route was demonstrated. Among the inorganic electron acceptors (NO(2)(-), NO(3)(-) and SO(4)(2-)) studied, NO(2)(-) was found to be the most effective in total nitrogen removal. Denitrification by the developed culture was much more effective and faster than ammonia oxidation. The results of this study show that anaerobic ammonia removal is feasible in the presence of OM. The novel nitrogen removal route is hypothesized as enzymatic anoxic oxidation of NH(4)(+) to NO(3)(-), followed by denitrification via autotrophic and/or heterotrophic routes. The results of the batch study were confirmed in continuous reactor operation.

  2. Application of gas sensor arrays in assessment of wastewater purification effects.

    PubMed

    Guz, Łukasz; Łagód, Grzegorz; Jaromin-Gleń, Katarzyna; Suchorab, Zbigniew; Sobczuk, Henryk; Bieganowski, Andrzej

    2014-12-23

    A gas sensor array consisting of eight metal oxide semiconductor (MOS) type gas sensors was evaluated for its ability to assess selected wastewater parameters. Municipal wastewater was collected in a wastewater treatment plant (WWTP) in a primary sedimentation tank and was treated in a laboratory-scale sequential batch reactor (SBR). A comparison of the gas sensor array (electronic nose) response to the standard physical-chemical parameters of treated wastewater was performed. To analyze the measurement results, artificial neural networks were used. The e-nose gas sensor array and artificial neural networks proved to be a suitable method for the monitoring of treated wastewater quality. Neural networks used for data validation showed high correlation between the electronic nose readouts and: (I) chemical oxygen demand (COD) (r = 0.988); (II) total suspended solids (TSS) (r = 0.938); (III) turbidity (r = 0.940); (IV) pH (r = 0.554); (V) nitrogen compounds: N-NO3 (r = 0.958), N-NO2 (r = 0.869) and N-NH3 (r = 0.978); and (VI) volatile organic compounds (VOC) (r = 0.987). Good correlations of the abovementioned parameters were observed under stable treatment conditions in the laboratory batch reactor.

  3. Accounting for Test Variability through Sizing Local Domains in Sequential Design Optimization with Concurrent Calibration-Based Model Validation

    DTIC Science & Technology

    2013-08-01

    Drignei, Dorin; Mourelatos, Zissimos; Pandey, Vijitashwa

  4. X-ray Analysis of Defects and Anomalies in AGR-5/6/7 TRISO Particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helmreich, Grant W.; Hunn, John D.; Skitt, Darren J.

    2017-06-01

    Coated particle fuel batches J52O-16-93164, 93165, 93166, 93168, 93169, 93170, and 93172 were produced by Babcock and Wilcox Technologies (BWXT) for possible selection as fuel for the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program’s AGR-5/6/7 irradiation test in the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), or may be used for other tests. Each batch was coated in a 150-mm-diameter production-scale fluidized-bed chemical vapor deposition (CVD) furnace. Tristructural isotropic (TRISO) coatings were deposited on 425-μm-nominal-diameter spherical kernels from BWXT lot J52R-16-69317 containing a mixture of 15.4%-enriched uranium carbide and uranium oxide (UCO), with the exception of Batch 93164, which used similar kernels from BWXT lot J52L-16-69316. The TRISO coatings consisted of a ~50% dense carbon buffer layer with 100-μm-nominal thickness, a dense inner pyrolytic carbon (IPyC) layer with 40-μm-nominal thickness, a silicon carbide (SiC) layer with 35-μm-nominal thickness, and a dense outer pyrolytic carbon (OPyC) layer with 40-μm-nominal thickness. Each coated particle batch was sieved to upgrade the particles by removing over-sized and under-sized material, and the upgraded batch was designated by appending the letter A to the end of the batch number (e.g., 93164A). Secondary upgrading by sieving was performed on the upgraded batches to remove specific anomalies identified during analysis for Defective IPyC, and the upgraded batches were designated by appending the letter B to the end of the batch number (e.g., 93165B). Following this secondary upgrading, coated particle composite J52R-16-98005 was produced by BWXT as fuel for the AGR Program’s AGR-5/6/7 irradiation test in the INL ATR. This composite was comprised of coated particle fuel batches J52O-16-93165B, 93168B, 93169B, and 93170B.

  5. Moving from Batch to Field Using the RT3D Reactive Transport Modeling System

    NASA Astrophysics Data System (ADS)

    Clement, T. P.; Gautam, T. R.

    2002-12-01

    The public domain reactive transport code RT3D (Clement, 1997) is a general-purpose numerical code for solving coupled, multi-species reactive transport in saturated groundwater systems. The code uses MODFLOW to simulate flow and several modules of MT3DMS to simulate the advection and dispersion processes. RT3D employs an operator-split strategy which allows the code to solve the coupled reactive transport problem in a modular fashion. The coupling between reaction and transport is defined through a separate module where the reaction equations are specified. The code supports a versatile user-defined reaction option that allows users to define their own reaction system through a Fortran-90 subroutine, known as the RT3D-reaction package. Further, a utility code, known as BATCHRXN, allows users to independently test and debug their reaction package. To analyze a new reaction system at batch scale, users should first run BATCHRXN to test the ability of their reaction package to model the batch data. After testing, the reaction package can simply be ported to the RT3D environment to study the model response under 1-, 2-, or 3-dimensional transport conditions. This paper presents example problems that demonstrate the methods for moving from batch to field-scale simulations using the BATCHRXN and RT3D codes. The first example describes a simple first-order reaction system for simulating the sequential degradation of Tetrachloroethene (PCE) and its daughter products. The second example uses a relatively complex reaction system for describing the multiple degradation pathways of Tetrachloroethane (PCA) and its daughter products. References: 1) Clement, T.P., RT3D - A modular computer code for simulating reactive multi-species transport in 3-Dimensional groundwater aquifers, Battelle Pacific Northwest National Laboratory Research Report, PNNL-SA-28967, September, 1997. Available at: http://bioprocess.pnl.gov/rt3d.htm.
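
    The first example mentioned above is a sequential first-order decay chain, which is also a convenient batch system to prototype before moving to reactive transport. The sketch below integrates a generic PCE -> TCE -> DCE -> VC -> ethene chain with placeholder rate constants (molar-mass yield corrections are omitted), purely to illustrate the batch-scale step; it is not the RT3D or BATCHRXN code itself.

```python
import numpy as np
from scipy.integrate import solve_ivp

# First-order sequential dechlorination chain PCE -> TCE -> DCE -> VC -> ethene,
# the kind of batch reaction system one would test before moving to field-scale transport.
# Rate constants (1/day) are placeholders, not site-calibrated values.
k = {"PCE": 0.05, "TCE": 0.03, "DCE": 0.02, "VC": 0.01}

def decay_chain(t, c):
    pce, tce, dce, vc, eth = c
    return [-k["PCE"] * pce,
            k["PCE"] * pce - k["TCE"] * tce,
            k["TCE"] * tce - k["DCE"] * dce,
            k["DCE"] * dce - k["VC"] * vc,
            k["VC"] * vc]

c0 = [1.0, 0.0, 0.0, 0.0, 0.0]                 # mg/L, PCE only at t = 0
sol = solve_ivp(decay_chain, (0, 365), c0, t_eval=np.linspace(0, 365, 6))
for t, conc in zip(sol.t, sol.y.T):
    print(f"day {t:5.0f}: " + ", ".join(f"{n}={v:.3f}" for n, v in zip(["PCE", "TCE", "DCE", "VC", "ETH"], conc)))
```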

  6. Sequential ensemble-based optimal design for parameter estimation: SEQUENTIAL ENSEMBLE-BASED OPTIMAL DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Man, Jun; Zhang, Jiangjiang; Li, Weixuan

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy, respectively. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, larger ensemble size improves the parameter estimation and convergence of optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied in any other hydrological problems.
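
    As a toy illustration of ranking candidate measurements by an information metric after an ensemble Kalman update, the sketch below computes the relative entropy between prior and posterior ensembles (approximated as Gaussians) for a scalar parameter and three hypothetical observations of differing sensitivity and noise. It is not the SEOD algorithm of the record; the candidate names, sensitivities and noise levels are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
Ne = 500
theta = rng.normal(1.0, 0.5, Ne)                 # prior ensemble of a single parameter

# Hypothetical candidate measurements: each is a linear observation a*theta + noise(sd)
candidates = {"shallow well": (0.3, 0.10), "mid well": (0.8, 0.15), "deep well": (1.2, 0.30)}

def gauss_kl(m1, v1, m0, v0):
    """KL( N(m1,v1) || N(m0,v0) ) for scalar Gaussians."""
    return 0.5 * (np.log(v0 / v1) + (v1 + (m1 - m0) ** 2) / v0 - 1.0)

m0, v0 = theta.mean(), theta.var()
for name, (a, sd) in candidates.items():
    Hx = a * theta                                    # predicted observation for each ensemble member
    K = np.cov(theta, Hx)[0, 1] / (Hx.var() + sd**2)  # Kalman gain from ensemble statistics
    d = a * 1.0 + rng.normal(0, sd)                   # one synthetic datum at a nominal "true" parameter of 1.0
    post = theta + K * (d + rng.normal(0, sd, Ne) - Hx)   # perturbed-observation EnKF update
    re = gauss_kl(post.mean(), post.var(), m0, v0)
    print(f"{name:12s}: relative entropy = {re:.3f}")    # larger = more informative candidate
```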

  7. High-Alpha Research Vehicle Lateral-Directional Control Law Description, Analyses, and Simulation Results

    NASA Technical Reports Server (NTRS)

    Davidson, John B.; Murphy, Patrick C.; Lallman, Frederick J.; Hoffler, Keith D.; Bacon, Barton J.

    1998-01-01

    This report contains a description of a lateral-directional control law designed for the NASA High-Alpha Research Vehicle (HARV). The HARV is a F/A-18 aircraft modified to include a research flight computer, spin chute, and thrust-vectoring in the pitch and yaw axes. Two separate design tools, CRAFT and Pseudo Controls, were integrated to synthesize the lateral-directional control law. This report contains a description of the lateral-directional control law, analyses, and nonlinear simulation (batch and piloted) results. Linear analysis results include closed-loop eigenvalues, stability margins, robustness to changes in various plant parameters, and servo-elastic frequency responses. Step time responses from nonlinear batch simulation are presented and compared to design guidelines. Piloted simulation task scenarios, task guidelines, and pilot subjective ratings for the various maneuvers are discussed. Linear analysis shows that the control law meets the stability margin guidelines and is robust to stability and control parameter changes. Nonlinear batch simulation analysis shows the control law exhibits good performance and meets most of the design guidelines over the entire range of angle-of-attack. This control law (designated NASA-1A) was flight tested during the Summer of 1994 at NASA Dryden Flight Research Center.

  8. Automatic flow-batch system for cold vapor atomic absorption spectroscopy determination of mercury in honey from Argentina using online sample treatment.

    PubMed

    Domínguez, Marina A; Grünhut, Marcos; Pistonesi, Marcelo F; Di Nezio, María S; Centurión, María E

    2012-05-16

    An automatic flow-batch system that includes two borosilicate glass chambers to perform sample digestion and cold vapor atomic absorption spectroscopy determination of mercury in honey samples was designed. The sample digestion was performed using a low-cost halogen lamp to obtain the optimum temperature. Optimization of the digestion procedure was done using a Box-Behnken experimental design. A linear response was observed from 2.30 to 11.20 μg Hg L(-1). The relative standard deviation was 3.20% (n = 11, 6.81 μg Hg L(-1)), the sample throughput was 4 samples h(-1), and the detection limit was 0.68 μg Hg L(-1). The results obtained with the flow-batch method are in good agreement with those obtained with the reference method. The flow-batch system is simple, allows the use of both chambers simultaneously, is seen as a promising methodology for achieving green chemistry goals, and is a good approach to improving the quality control of honey.

  9. Application of gain scheduling to the control of batch bioreactors

    NASA Technical Reports Server (NTRS)

    Cardello, Ralph; San, Ka-Yiu

    1987-01-01

    The implementation of control algorithms in batch bioreactors is often complicated by the inherent variations in process dynamics during the course of fermentation. Such a wide operating range may render the performance of fixed-gain PID controllers unsatisfactory. In this work, a detailed study on the control of batch fermentation is performed. Furthermore, a simple batch controller design is proposed which incorporates the concept of gain scheduling, a subclass of adaptive control, with oxygen uptake rate as an auxiliary variable. The control of oxygen tension in the bioreactor is used as a vehicle to convey the proposed idea, analysis and results. Simulation experiments indicate that significant improvement in controller performance can be achieved by the proposed approach even in the presence of measurement noise.

  10. A Batch Feeder for Inhomogeneous Bulk Materials

    NASA Astrophysics Data System (ADS)

    Vislov, I. S.; Kladiev, S. N.; Slobodyan, S. M.; Bogdan, A. M.

    2016-04-01

    The work includes an analysis of mechanical feeders and batchers that find application in various technological processes and industrial fields. Feeders are usually classified according to their design features into two groups: conveyor-type feeders and non-conveyor feeders. Batchers are used to batch solid bulk materials; less frequently, they are used for liquids. In terms of batching method, they are divided into volumetric and weighing batchers. Weighing batchers do not provide sufficient batching accuracy. Automatic weighing batchers include a mass-controlling sensor and systems for automatic material feed and automatic mass discharge control. In terms of operating principle, batchers are divided into gravitational batchers and batchers with forced feed of material using conveyors and pumps. Improved consumption of raw materials, decreased loss of materials, and ease of use in the automatic control systems of industrial facilities allow the quality of technological processes and labor conditions to be improved. The batch feeder suggested by the authors is a volumetric batcher that has no comparable counterparts among conveyor-type feeders and solves the problem of targeted feeding of bulk material batches while increasing the reliability and hermeticity of the device.

  11. Sequential color video to parallel color video converter

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The engineering design, development, breadboard fabrication, test, and delivery of a breadboard field sequential color video to parallel color video converter is described. The converter was designed for use onboard a manned space vehicle to eliminate a flickering TV display picture and to reduce the weight and bulk of previous ground conversion systems.

  12. Adaptive x-ray threat detection using sequential hypotheses testing with fan-beam experimental data (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Thamvichai, Ratchaneekorn; Huang, Liang-Chih; Ashok, Amit; Gong, Qian; Coccarelli, David; Greenberg, Joel A.; Gehm, Michael E.; Neifeld, Mark A.

    2017-05-01

    We employ an adaptive measurement system, based on a sequential hypotheses testing (SHT) framework, for detecting material-based threats using experimental data acquired on an X-ray experimental testbed system. This testbed employs a 45-degree fan-beam geometry and 15 views over a 180-degree span to generate energy-sensitive X-ray projection data. Using this testbed system, we acquire multiple-view projection data for 200 bags. We consider an adaptive measurement design where the X-ray projection measurements are acquired in a sequential manner and the adaptation occurs through the choice of the optimal "next" source/view system parameter. Our analysis of such an adaptive measurement design using the experimental data demonstrates a 3x-7x reduction in the probability of error relative to a static measurement design. Here the static measurement design refers to the operational system baseline that corresponds to a sequential measurement using all the available sources/views. We also show that by using adaptive measurements it is possible to reduce the number of sources/views by nearly 50% compared to a system that relies on static measurements.
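
    The core of a sequential hypotheses testing detector can be illustrated with Wald's sequential probability ratio test: accumulate the log-likelihood ratio view by view and stop as soon as a boundary is crossed. The sketch below does this for a single Gaussian feature per view; the per-view distributions, error targets and view budget are assumptions, and the adaptive source/view selection of the record is not modeled.

```python
import numpy as np

rng = np.random.default_rng(4)
alpha, beta = 0.01, 0.01                      # target error rates
upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))   # Wald boundaries
mu_threat, mu_benign, sigma = 1.0, 0.0, 1.0   # per-view feature distributions (assumed)

def sprt(is_threat, max_views=15):
    """Accumulate the log-likelihood ratio view by view and stop at a Wald boundary."""
    llr = 0.0
    for view in range(1, max_views + 1):
        x = rng.normal(mu_threat if is_threat else mu_benign, sigma)
        llr += (x - mu_benign) ** 2 / (2 * sigma**2) - (x - mu_threat) ** 2 / (2 * sigma**2)
        if llr >= upper:
            return "threat", view
        if llr <= lower:
            return "benign", view
    return ("threat" if llr > 0 else "benign"), max_views   # forced call at the view budget

trials = [sprt(is_threat=(i % 2 == 0)) for i in range(4000)]
views_used = np.mean([v for _, v in trials])
errors = np.mean([(d == "threat") != (i % 2 == 0) for i, (d, v) in enumerate(trials)])
print(f"average views used: {views_used:.1f} of 15, empirical error rate: {errors:.3%}")
```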

  13. The mechanism and design of sequencing batch reactor systems for nutrient removal--the state of the art.

    PubMed

    Artan, N; Wilderer, P; Orhon, D; Morgenroth, E; Ozgür, N

    2001-01-01

    The Sequencing Batch Reactor (SBR) process for carbon and nutrient removal is subject to extensive research, and it is finding a wider application in full-scale installations. Despite the growing popularity, however, a widely accepted approach to process analysis and modeling, a unified design basis, and even a common terminology are still lacking; this situation is now regarded as the major obstacle hindering broader practical application of the SBR. In this paper a rational dimensioning approach is proposed for nutrient removal SBRs based on scientific information on process stoichiometry and modelling, also emphasizing practical constraints in design and operation.

  14. Demonstration of a strategy for product purification by high-gradient magnetic fishing: recovery of superoxide dismutase from unconditioned whey.

    PubMed

    Meyer, Andrea; Hansen, Dennis B; Gomes, Cláudia S G; Hobley, Timothy J; Thomas, Owen R T; Franzreb, Matthias

    2005-01-01

    A systematic approach for the design of a bioproduct recovery process employing magnetic supports and the technique of high-gradient magnetic fishing (HGMF) is described. The approach is illustrated for the separation of superoxide dismutase (SOD), an antioxidant protein present in low concentrations (ca. 0.15-0.6 mg L(-1)) in whey. The first part of the process design consisted of ligand screening in which metal chelate supports charged with copper(II) ions were found to be the most suitable. The second stage involved systematic and sequential optimization of conditions for the following steps: product adsorption, support washing, and product elution. Next, the capacity of a novel high-gradient magnetic separator (designed for biotechnological applications) for trapping and holding magnetic supports was determined. Finally, all of the above elements were assembled to deliver a HGMF process for the isolation of SOD from crude sweet whey, which consisted of (i) binding SOD using Cu2+ -charged magnetic metal chelator particles in a batch reactor with whey; (ii) recovery of the "SOD-loaded" supports by high-gradient magnetic separation (HGMS); (iii) washing out loosely bound and entrained proteins and solids; (iv) elution of the target protein; and (v) recovery of the eluted supports from the HGMF rig. Efficient recovery of SOD was demonstrated at approximately 50-fold increased scale (cf magnetic rack studies) in three separate HGMF experiments, and in the best of these (run 3) an SOD yield of >85% and purification factor of approximately 21 were obtained.

  15. Algorithms for the Construction of Parallel Tests by Zero-One Programming. Project Psychometric Aspects of Item Banking No. 7. Research Report 86-7.

    ERIC Educational Resources Information Center

    Boekkooi-Timminga, Ellen

    Nine methods for automated test construction are described. All are based on the concepts of information from item response theory. Two general kinds of methods for the construction of parallel tests are presented: (1) sequential test design; and (2) simultaneous test design. Sequential design implies that the tests are constructed one after the…

  16. Topics in the Sequential Design of Experiments

    DTIC Science & Technology

    1992-03-01

    Subject terms: Design of Experiments, Renewal Theory, Sequential Testing.

  17. MSFC Skylab airlock module, volume 1. [systems design and performance

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The history and development of the Skylab Airlock Module and Payload Shroud is presented from initial concept through final design. A summary is given of the Airlock features and systems. System design and performance are presented for the Spent Stage Experiment Support Module, structure and mechanical systems, mass properties, thermal and environmental control systems, EVA/IVA suit system, electrical power system, sequential system, and instrumentation system.

  18. Designing group sequential randomized clinical trials with time to event end points using a R function.

    PubMed

    Filleron, Thomas; Gal, Jocelyn; Kramar, Andrew

    2012-10-01

    A major and difficult task is the design of clinical trials with a time-to-event endpoint. In fact, it is necessary to compute the number of events and, in a second step, the required number of patients. Several commercial software packages are available for computing sample size in clinical trials with sequential designs and time-to-event endpoints, but only a few R functions are implemented. The purpose of this paper is to describe the features and use of the R function plansurvct.func, which is an add-on function to the package gsDesign and permits, in one run of the program, calculation of the number of events and required sample size, as well as the boundaries and corresponding p-values for a group sequential design. The use of the function plansurvct.func is illustrated by several examples and validated using East software. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
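
    The first step such a function automates is the required number of events, which for a two-arm log-rank comparison is commonly obtained from Schoenfeld's formula; a group sequential design then inflates that count slightly. The sketch below computes the fixed-design quantities under hypothetical inputs; it is not the plansurvct.func or gsDesign code, and the hazard ratio and event probability are assumptions.

```python
import math
from scipy.stats import norm

def schoenfeld_events(hr, alpha=0.05, power=0.80, alloc=0.5):
    """Required number of events for a two-arm log-rank comparison (fixed design)."""
    z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
    return (z_a + z_b) ** 2 / (alloc * (1 - alloc) * math.log(hr) ** 2)

hr, p_event = 0.70, 0.60          # hypothetical hazard ratio and expected event probability
d_fixed = schoenfeld_events(hr)
n_fixed = d_fixed / p_event       # translate events into patients via the event probability
print(f"fixed design: {math.ceil(d_fixed)} events, ~{math.ceil(n_fixed)} patients")
# A group sequential version keeps the same calculation but multiplies the event count by a
# boundary-specific inflation factor (close to 1 for O'Brien-Fleming-type spending), which is
# what packages such as gsDesign report alongside the interim boundaries.
```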

  19. Model-Based Nutrient Feeding Strategies for the Increased Production of Polyhydroxybutyrate (PHB) by Alcaligenes latus.

    PubMed

    Gahlawat, Geeta; Srivastava, Ashok K

    2017-10-01

    Polyhydroxyalkanoates (PHAs) are biodegradable polymers which are considered an effective alternative to conventional plastics due to their similar mechanical properties. However, the widespread use of these polymers is still hampered by their higher production cost compared to plastics. The high production cost could be offset by obtaining high yields and productivity. The goal of the present research was to enhance the yield of polyhydroxybutyrate (PHB) with the help of two simple fed-batch cultivation strategies. In the present study, average batch kinetics and substrate limitation/inhibition study data of Alcaligenes latus were used for the development of a PHB model, which was then adopted for designing various off-line nutrient feeding strategies to enhance PHB accumulation. The predictive ability of the model was validated by experimental implementation of two fed-batch strategies. One such dynamic strategy, fed-batch cultivation under a pseudo-steady state with respect to nitrogen with simultaneous carbon feeding, resulted in significantly high biomass and PHB concentrations of 39.17 g/L and 29.64 g/L, respectively. This feeding strategy demonstrated a high PHB productivity and PHB content of 0.6 g/L h and 75%, respectively, which were remarkably high in comparison to batch cultivation. The mathematical model can also be employed for designing various other nutrient feeding strategies.

  20. Characterization of co-products from producing ethanol by sequential extraction processing of corn

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hojilla-Evangelista, M.P.; Johnson, L.A.; Pometto, A.L. III

    1996-12-31

    Sequential Extraction Processing (SEP) is a new process for ethanol production that has the potential to produce more valuable co-products than alternative processes. Previous work determined the yields of oil and protein and evaluated their chemical and functional properties. The properties of the crude fiber and spent solids, however, have yet to be studied. This research was conducted to evaluate the potential of SEP corn fiber to increase ethanol conversion and serve as a replacement for gum arabic, and to evaluate the potential of SEP starch and fiber to be fermented to ethanol. SEP hemicellulose from crude fiber was readily dispersible in water, and its solution (5%) gave low viscosity despite having high solids content. These properties indicated potential utilization as stabilizers, thickeners, and adhesives for coatings and batters in food and industrial products. Enzyme hydrolysis studies and batch fermentation of SEP starch/fiber indicated that SEP crude fiber was more readily accessible to the action of cellulases. More ethanol (about 10%) was produced from the fermentation of SEP starch/fiber than from undegermed or degermed soft dent corn, particularly when the hemicellulose fraction was absent from the SEP fiber.

  1. Sequential anaerobic-aerobic biological treatment of colored wastewaters: case study of a textile dyeing factory wastewater.

    PubMed

    Abiri, Fardin; Fallah, Narges; Bonakdarpour, Babak

    2017-03-01

    In the present study, the feasibility of using a bacterial batch sequential anaerobic-aerobic process, in which activated sludge was used in both parts of the process, for pretreatment of wastewater generated by a textile dyeing factory was considered. The activated sludge used in the process was obtained from a municipal wastewater treatment plant and adapted to real dyeing wastewater using either an anaerobic-only or an anaerobic-aerobic process over a period of 90 days. The use of activated sludge adapted using the anaerobic-aerobic process resulted in a higher overall decolorization efficiency compared to that achieved with activated sludge adapted using the anaerobic-only cycles. Anaerobic and aerobic periods of around 34 and 22 hours, respectively, resulted in an effluent with chemical oxygen demand (COD) and color content which met the standards for discharge into the centralized wastewater treatment plant of the industrial estate in which the dyeing factory was situated. Neutralization of the real dyeing wastewater and addition of a carbon source to it, both of which result in a significant increase in the cost of the bacterial treatment process, were not found to be necessary to achieve the required discharge standards.

  2. Lineup Composition, Suspect Position, and the Sequential Lineup Advantage

    ERIC Educational Resources Information Center

    Carlson, Curt A.; Gronlund, Scott D.; Clark, Steven E.

    2008-01-01

    N. M. Steblay, J. Dysart, S. Fulero, and R. C. L. Lindsay (2001) argued that sequential lineups reduce the likelihood of mistaken eyewitness identification. Experiment 1 replicated the design of R. C. L. Lindsay and G. L. Wells (1985), the first study to show the sequential lineup advantage. However, the innocent suspect was chosen at a lower rate…

  3. Characterising and correcting batch variation in an automated direct infusion mass spectrometry (DIMS) metabolomics workflow.

    PubMed

    Kirwan, J A; Broadhurst, D I; Davidson, R L; Viant, M R

    2013-06-01

    Direct infusion mass spectrometry (DIMS)-based untargeted metabolomics measures many hundreds of metabolites in a single experiment. While every effort is made to reduce within-experiment analytical variation in untargeted metabolomics, unavoidable sources of measurement error are introduced. This is particularly true for large-scale multi-batch experiments, necessitating the development of robust workflows that minimise batch-to-batch variation. Here, we conducted a purpose-designed, eight-batch DIMS metabolomics study using nanoelectrospray (nESI) Fourier transform ion cyclotron resonance mass spectrometric analyses of mammalian heart extracts. First, we characterised the intrinsic analytical variation of this approach to determine whether our existing workflows are fit for purpose when applied to a multi-batch investigation. Batch-to-batch variation was readily observed across the 7-day experiment, both in terms of its absolute measurement using quality control (QC) and biological replicate samples, as well as its adverse impact on our ability to discover significant metabolic information within the data. Subsequently, we developed and implemented a computational workflow that includes total-ion-current filtering, QC-robust spline batch correction and spectral cleaning, and provide conclusive evidence that this workflow reduces analytical variation and increases the proportion of significant peaks. We report an overall analytical precision of 15.9%, measured as the median relative standard deviation (RSD) for the technical replicates of the biological samples, across eight batches and 7 days of measurements. When compared against the FDA guidelines for biomarker studies, which specify an RSD of <20% as an acceptable level of precision, we conclude that our new workflows are fit for purpose for large-scale, high-throughput nESI DIMS metabolomics studies.
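
    Two quantities from this workflow are easy to illustrate: the relative standard deviation of replicate injections and the effect of a QC-based batch correction. The sketch below uses a per-batch QC-median scaling as a simple stand-in for the QC-robust spline correction described above (the spline additionally removes within-batch drift along the injection order); the simulated peak, batch effects and QC spacing are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
n_batches, per_batch = 8, 30
batch = np.repeat(np.arange(n_batches), per_batch)
is_qc = (np.arange(batch.size) % 6 == 0)            # every 6th injection is a pooled QC

# One illustrative peak measured on technical replicates of the same material:
# a multiplicative batch effect plus injection-to-injection noise
batch_effect = rng.normal(1.0, 0.15, n_batches)[batch]
intensity = 1e6 * batch_effect * rng.lognormal(0, 0.08, batch.size)

def rsd(values):
    """Relative standard deviation in percent."""
    return 100 * values.std(ddof=1) / values.mean()

print(f"RSD of replicate injections before correction: {rsd(intensity[~is_qc]):.1f}%")

# Per-batch QC scaling: divide every injection by its batch's median QC response.
qc_factor = np.array([np.median(intensity[(batch == b) & is_qc]) for b in range(n_batches)])
corrected = intensity / qc_factor[batch]
print(f"RSD of replicate injections after correction:  {rsd(corrected[~is_qc]):.1f}%")
```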

  4. Evaluating Bias of Sequential Mixed-Mode Designs against Benchmark Surveys

    ERIC Educational Resources Information Center

    Klausch, Thomas; Schouten, Barry; Hox, Joop J.

    2017-01-01

    This study evaluated three types of bias--total, measurement, and selection bias (SB)--in three sequential mixed-mode designs of the Dutch Crime Victimization Survey: telephone, mail, and web, where nonrespondents were followed up face-to-face (F2F). In the absence of true scores, all biases were estimated as mode effects against two different…

  5. C-quence: a tool for analyzing qualitative sequential data.

    PubMed

    Duncan, Starkey; Collier, Nicholson T

    2002-02-01

    C-quence is a software application that matches sequential patterns of qualitative data specified by the user and calculates the rate of occurrence of these patterns in a data set. Although it was designed to facilitate analyses of face-to-face interaction, it is applicable to any data set involving categorical data and sequential information. C-quence queries are constructed using a graphical user interface. The program does not limit the complexity of the sequential patterns specified by the user.

  6. Cell-controlled hybrid perfusion fed-batch CHO cell process provides significant productivity improvement over conventional fed-batch cultures.

    PubMed

    Hiller, Gregory W; Ovalle, Ana Maria; Gagnon, Matthew P; Curran, Meredith L; Wang, Wenge

    2017-07-01

    A simple method originally designed to control lactate accumulation in fed-batch cultures of Chinese Hamster Ovary (CHO) cells has been modified and extended to allow cells in culture to control their own rate of perfusion to precisely deliver nutritional requirements. The method allows for very fast expansion of cells to high density while using a minimal volume of concentrated perfusion medium. When the short-duration cell-controlled perfusion is performed in the production bioreactor and is immediately followed by a conventional fed-batch culture using highly concentrated feeds, the overall productivity of the culture is approximately doubled when compared with a highly optimized state-of-the-art fed-batch process. The technology was applied with near uniform success to five CHO cell processes producing five different humanized monoclonal antibodies. The increases in productivity were due to the increases in sustained viable cell densities. Biotechnol. Bioeng. 2017;114: 1438-1447. © 2017 Wiley Periodicals, Inc.

  7. First year medical students' learning style preferences and their correlation with performance in different subjects within the medical course.

    PubMed

    Hernández-Torrano, Daniel; Ali, Syed; Chan, Chee-Kai

    2017-08-08

    Students commencing their medical training arrive with different educational backgrounds and a diverse range of learning experiences. Consequently, students would have developed preferred approaches to acquiring and processing information or learning style preferences. Understanding first-year students' learning style preferences is important to success in learning. However, little is understood about how learning styles impact learning and performance across different subjects within the medical curriculum. Greater understanding of the relationship between students' learning style preferences and academic performance in specific medical subjects would be valuable. This cross-sectional study examined the learning style preferences of first-year medical students and how they differ across gender. This research also analyzed the effect of learning styles on academic performance across different subjects within a medical education program in a Central Asian university. A total of 52 students (57.7% females) from two batches of first-year medical school completed the Index of Learning Styles Questionnaire, which measures four dimensions of learning styles: sensing-intuitive; visual-verbal; active-reflective; sequential-global. First-year medical students reported preferences for visual (80.8%) and sequential (60.5%) learning styles, suggesting that these students preferred to learn through demonstrations and diagrams and in a linear and sequential way. Our results indicate that male medical students have higher preference for visual learning style over verbal, while females seemed to have a higher preference for sequential learning style over global. Significant associations were found between sensing-intuitive learning styles and performance in Genetics [β = -0.46, B = -0.44, p < 0.01] and Anatomy [β = -0.41, B = -0.61, p < 0.05] and between sequential-global styles and performance in Genetics [β = 0.36, B = 0.43, p < 0.05]. More specifically, sensing learners were more likely to perform better than intuitive learners in the two subjects and global learners were more likely to perform better than sequential learners in Genetics. This knowledge will be helpful to individual students to improve their performance in these subjects by adopting new sensing learning techniques. Instructors can also benefit by modifying and adapting more appropriate teaching approaches in these subjects. Future studies to validate this observation will be valuable.

  8. Fast BIA-amperometric determination of isoniazid in tablets.

    PubMed

    Quintino, Maria S M; Angnes, Lúcio

    2006-09-26

    This paper proposes a new, fast and precise method to analyze isoniazid based on the electrochemical oxidation of the analyte at a glassy carbon electrode in 0.1 M NaOH. The quantification was performed utilizing amperometry associated with the batch injection analysis (BIA) technique. Fast sequential analysis (60 determinations h⁻¹) in an unusually wide linear dynamic range (from 2.5 × 10⁻⁸ to 1.0 × 10⁻³ M), with high sensitivity and low limits of detection (4.1 × 10⁻⁹ M) and quantification (1.4 × 10⁻⁸ M), was achieved. Such characteristics, allied to good repeatability of the current responses (relative standard deviation of 0.79% for 30 measurements), were exploited for the specific determination of isoniazid in an isoniazid-rifampin tablet.
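
    The detection and quantification limits quoted above follow the usual 3σ/slope and 10σ/slope conventions applied to a linear calibration. A minimal sketch of that arithmetic is given below; the calibration points and blank noise are invented placeholders, not the authors' measurements.

      # Hypothetical illustration of the 3*sigma/slope (LOD) and 10*sigma/slope
      # (LOQ) calculation from a linear amperometric calibration; all numbers
      # below are assumptions, not the published calibration data.
      import numpy as np

      conc = np.array([1e-7, 1e-6, 1e-5, 1e-4, 1e-3])       # mol/L (assumed)
      current = np.array([0.004, 0.041, 0.39, 4.1, 40.2])   # microampere (assumed)

      slope, intercept = np.polyfit(conc, current, 1)       # linear calibration fit
      sigma_blank = 0.0005                                  # s.d. of the blank signal (assumed)

      lod = 3 * sigma_blank / slope                         # limit of detection
      loq = 10 * sigma_blank / slope                        # limit of quantification
      print(f"LOD = {lod:.2g} M, LOQ = {loq:.2g} M")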

  9. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    PubMed

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
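
    As a rough illustration of the weighted regression idea described above (not the authors' estimator), the sketch below fits a weighted least-squares model of a cluster-level outcome summary on an indicator of the embedded dynamic treatment regimen plus a baseline covariate, with known design weights; everything is simulated.

      # Sketch: weighted least squares comparing two embedded cluster-level DTRs.
      # Data, weights and covariate are simulated placeholders; a real analysis
      # would use individual-level outcomes with a cluster-robust variance.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n_clusters = 40
      dtr = rng.integers(0, 2, n_clusters)           # embedded-DTR indicator (assumed)
      baseline = rng.normal(size=n_clusters)         # cluster-level baseline covariate
      y = 1.0 + 0.5 * dtr + 0.3 * baseline + rng.normal(scale=0.5, size=n_clusters)
      weights = rng.choice([2.0, 4.0], n_clusters)   # inverse-probability design weights (assumed)

      X = sm.add_constant(np.column_stack([dtr, baseline]))
      fit = sm.WLS(y, X, weights=weights).fit()
      print(fit.params)                              # intercept, DTR contrast, covariate effect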

  10. Design Language for Digital Systems

    NASA Technical Reports Server (NTRS)

    Shiva, S. G.

    1985-01-01

    Digital Systems Design Language (DDL) is a convenient hardware description language for developing and testing digital designs and for inputting design details into a design automation system. It describes digital systems at the gate, register-transfer, and combinational-block levels. DDL-based programs are written in FORTRAN IV for batch execution.

  11. Experimental design of a twin-column countercurrent gradient purification process.

    PubMed

    Steinebach, Fabian; Ulmer, Nicole; Decker, Lara; Aumann, Lars; Morbidelli, Massimo

    2017-04-07

    As typical for separation processes, single unit batch chromatography exhibits a trade-off between purity and yield. The twin-column MCSGP (multi-column countercurrent solvent gradient purification) process allows alleviating such trade-offs, particularly in the case of difficult separations. In this work an efficient and reliable procedure for the design of the twin-column MCSGP process is developed. This is based on a single batch chromatogram, which is selected as the design chromatogram. The derived MCSGP operation is not intended to provide optimal performance; rather, it delivers the target product corresponding to the selected fraction of the batch chromatogram, but with higher yield. The design procedure is illustrated for the isolation of the main charge isoform of a monoclonal antibody from Protein A eluate with ion-exchange chromatography. The main charge isoform was obtained at a purity and yield larger than 90%. At the same time process related impurities such as HCP and leached Protein A as well as aggregates were at least equally well removed. Additionally, the impact of several design parameters on the process performance in terms of purity, yield, productivity and buffer consumption is discussed. The obtained results can be used for further fine-tuning of the process parameters so as to improve its performance. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. How Many Batches Are Needed for Process Validation under the New FDA Guidance?

    PubMed

    Yang, Harry

    2013-01-01

    The newly updated FDA Guidance for Industry on Process Validation: General Principles and Practices ushers in a life cycle approach to process validation. While the guidance no longer considers the use of traditional three-batch validation appropriate, it does not prescribe the number of validation batches for a prospective validation protocol, nor does it provide specific methods to determine it. This potentially could leave manufacturers in a quandary. In this paper, I develop a Bayesian method to address the issue. By combining process knowledge gained from Stage 1 Process Design (PD) with expected outcomes of Stage 2 Process Performance Qualification (PPQ), the number of validation batches for PPQ is determined to provide a high level of assurance that the process will consistently produce future batches meeting quality standards. Several examples based on simulated data are presented to illustrate the use of the Bayesian method in helping manufacturers make risk-based decisions for Stage 2 PPQ, and they highlight the advantages of the method over traditional Frequentist approaches. The discussions in the paper lend support for a life cycle and risk-based approach to process validation recommended in the new FDA guidance.
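
    One generic way to frame such a calculation is a beta-binomial model: prior knowledge from Stage 1 about the per-batch probability of meeting specifications is summarized as a Beta distribution, and the number of PPQ batches is the smallest n for which an all-pass outcome would yield the desired posterior assurance. The sketch below is only that generic framing, not the specific model in the paper; the prior, threshold and assurance level are assumptions.

      # Generic beta-binomial sketch: smallest number of all-passing PPQ batches
      # that pushes the posterior probability P(p > p0) above the target assurance.
      # Prior parameters, p0 and the assurance level are assumed, not from the paper.
      from scipy.stats import beta

      a, b = 8.0, 1.0        # prior from Stage 1 process-design knowledge (assumed)
      p0 = 0.90              # required per-batch probability of meeting quality standards
      assurance = 0.95       # target posterior assurance

      for n in range(1, 31):
          posterior_tail = 1.0 - beta.cdf(p0, a + n, b)   # P(p > p0 | n passes, 0 failures)
          if posterior_tail >= assurance:
              print(f"{n} successful PPQ batches give assurance {posterior_tail:.3f}")
              break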

  13. Emulsion of Chloramphenicol: an Overwhelming Approach for Ocular Delivery.

    PubMed

    Ashara, Kalpesh C; Shah, Ketan V

    2017-03-01

    Ophthalmic formulations of chloramphenicol have poor bioavailability of chloramphenicol in the ocular cavity. The present study aimed at exploring the impact of different oil mixtures in the form of emulsion on the permeability of chloramphenicol after ocular application. Selection of the oil mixture and the ratio of its components was made by an equilibrium solubility method. An emulsifier was chosen according to its emulsification properties. A constrained simplex centroid design was used for the development of the emulsion. Emulsions were evaluated for physicochemical properties, zone of inhibition, in-vitro diffusion and ex-vivo local accumulation of chloramphenicol. The design was validated using a check-point batch, and reduced polynomial equations were also developed. The emulsion was optimized using Design-Expert® 6.0.8 software. Osmolarity, ocular irritation, sterility and isotonicity of the optimized batch were also assessed. Parker Neem®, olive and peppermint oils were selected as the oil phase in the ratio 63.64:20.2:16.16. PEG-400 was selected as an emulsifier according to a pseudo-ternary phase diagram. The constrained simplex-centroid design was applied in the range of 25-39% water, 55-69% PEG-400, 5-19% optimized oil mixture, and 1% chloramphenicol. For the in-vitro and ex-vivo studies, an unpaired Student's t-test showed a significant difference between the optimized emulsion batch and Chloramphenicol eye caps (a commercial product), although both were equally safe. The optimized batch of the chloramphenicol emulsion was found to be as safe as and more effective than Chloramphenicol eye caps.

  14. Impact of broiler processing scalding and chilling profiles on carcass and breast meat yield.

    PubMed

    Buhr, R J; Walker, J M; Bourassa, D V; Caudill, A B; Kiepper, B H; Zhuang, H

    2014-06-01

    The effect of scalding and chilling procedures was evaluated on carcass and breast meat weight and yield in broilers. On 4 separate weeks (trials), broilers were subjected to feed withdrawal, weighed, and then stunned and bled in 4 sequential batches (n = 16 broilers/batch, 64 broilers/trial). In addition, breast skin was collected before scalding, after scalding, and after defeathering for proximate analysis. Each batch of 16 carcasses was subjected to either hard (60.0°C for 1.5 min) or soft (52.8°C for 3 min) immersion scalding. Following defeathering and evisceration, 8 carcasses/batch were air-chilled (0.5°C, 120 min, 86% RH) and 8 carcasses/batch were immersion water-chilled (water and ice 0.5°C, 40 min). Carcasses were reweighed individually following evisceration and following chilling. Breast meat was removed from the carcass and weighed within 4 h postmortem. There were significant (P < 0.05) differences among the trials for all weights and yields; however, postfeed withdrawal shackle weight and postscald-defeathered eviscerated weights did not differ between the scalding and chilling treatments. During air-chilling all carcasses lost weight, resulting in postchill carcass yield of 73.0% for soft-scalded and 71.3% for hard-scalded carcasses, a difference of 1.7%. During water-chilling all carcasses gained weight, resulting in heavier postchill carcass weights (2,031 g) than for air-chilled carcasses (1,899 g). Postchill carcass yields were correspondingly higher for water-chilled carcasses, 78.2% for soft-scalded and 76.1% for hard-scalded carcasses, a difference of 2.1%. Only in trials 1 and 4 was breast meat yield significantly lower for hard-scalded, air-chilled carcasses (16.1 and 17.5%) than for the other treatments. Proximate analysis of skin sampled after scalding or defeathering did not differ significantly in moisture (P = 0.2530) or lipid (P = 0.6412) content compared with skin sampled before scalding. Skin protein content was significantly higher (P < 0.05) for prescald and soft-scalded skin samples than for hard-scalded skin samples, or for soft- or hard-scalded skin samples after defeathering. The hard-scalding method used in this experiment did not result in increased skin lipid loss either before or after defeathering. Poultry Science Association Inc.

  15. Sequential, progressive, equal-power, reflective beam-splitter arrays

    NASA Astrophysics Data System (ADS)

    Manhart, Paul K.

    2017-11-01

    The equations to calculate equal-power reflectivity of a sequential series of beam splitters are presented. Non-sequential optical design examples are offered for uniform illumination using diode lasers. Objects created using Boolean operators and swept surfaces can reflect light into predefined elevation and azimuth angles. Analysis of the illumination patterns for the array is also presented.
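
    For a lossless sequential array feeding N equal outputs, the usual result is that the k-th splitter must reflect 1/(N − k + 1) of the power reaching it, with the final element fully reflective; the short check below assumes exactly that idealized, loss-free case.

      # Equal-power reflectivities for a lossless sequential beam-splitter array:
      # the k-th of N splitters reflects 1/(N-k+1) of the incident power, so each
      # reflected output carries P0/N.
      import numpy as np

      N = 5                                    # number of outputs (assumed)
      P0 = 1.0                                 # input power
      reflectivities = [1.0 / (N - k + 1) for k in range(1, N + 1)]

      remaining, outputs = P0, []
      for R in reflectivities:
          outputs.append(remaining * R)        # power reflected to this output
          remaining *= (1.0 - R)               # power transmitted onward
      print(np.round(reflectivities, 3))       # 0.2, 0.25, 0.333, 0.5, 1.0
      print(np.round(outputs, 3))              # each output equals P0/N = 0.2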

  16. Correcting for intra-experiment variation in Illumina BeadChip data is necessary to generate robust gene-expression profiles.

    PubMed

    Kitchen, Robert R; Sabine, Vicky S; Sims, Andrew H; Macaskill, E Jane; Renshaw, Lorna; Thomas, Jeremy S; van Hemert, Jano I; Dixon, J Michael; Bartlett, John M S

    2010-02-24

    Microarray technology is a popular means of producing whole genome transcriptional profiles, however high cost and scarcity of mRNA has led many studies to be conducted based on the analysis of single samples. We exploit the design of the Illumina platform, specifically multiple arrays on each chip, to evaluate intra-experiment technical variation using repeated hybridisations of universal human reference RNA (UHRR) and duplicate hybridisations of primary breast tumour samples from a clinical study. A clear batch-specific bias was detected in the measured expressions of both the UHRR and clinical samples. This bias was found to persist following standard microarray normalisation techniques. However, when mean-centering or empirical Bayes batch-correction methods (ComBat) were applied to the data, inter-batch variation in the UHRR and clinical samples were greatly reduced. Correlation between replicate UHRR samples improved by two orders of magnitude following batch-correction using ComBat (ranging from 0.9833-0.9991 to 0.9997-0.9999) and increased the consistency of the gene-lists from the duplicate clinical samples, from 11.6% in quantile normalised data to 66.4% in batch-corrected data. The use of UHRR as an inter-batch calibrator provided a small additional benefit when used in conjunction with ComBat, further increasing the agreement between the two gene-lists, up to 74.1%. In the interests of practicalities and cost, these results suggest that single samples can generate reliable data, but only after careful compensation for technical bias in the experiment. We recommend that investigators appreciate the propensity for such variation in the design stages of a microarray experiment and that the use of suitable correction methods become routine during the statistical analysis of the data.
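
    Of the two corrections compared above, mean-centering is the simpler: each probe is centred within each batch so that batch means coincide, while ComBat additionally applies an empirical-Bayes adjustment. A minimal pandas sketch of the mean-centering step on toy data:

      # Per-batch mean-centering of a (samples x genes) expression matrix.
      # Illustrative toy data only; ComBat adds empirical-Bayes shrinkage of
      # batch location/scale effects on top of this kind of correction.
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(1)
      expr = pd.DataFrame(rng.normal(size=(6, 4)),
                          columns=[f"gene{i}" for i in range(4)])
      batch = pd.Series(["A", "A", "A", "B", "B", "B"])    # chip/batch labels (assumed)

      # subtract each batch mean, then add back the overall mean to keep the scale
      centred = expr.groupby(batch).transform(lambda x: x - x.mean()) + expr.mean()
      print(centred.groupby(batch).mean().round(3))        # batch means now coincide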

  17. Correcting for intra-experiment variation in Illumina BeadChip data is necessary to generate robust gene-expression profiles

    PubMed Central

    2010-01-01

    Background Microarray technology is a popular means of producing whole genome transcriptional profiles, however high cost and scarcity of mRNA has led many studies to be conducted based on the analysis of single samples. We exploit the design of the Illumina platform, specifically multiple arrays on each chip, to evaluate intra-experiment technical variation using repeated hybridisations of universal human reference RNA (UHRR) and duplicate hybridisations of primary breast tumour samples from a clinical study. Results A clear batch-specific bias was detected in the measured expressions of both the UHRR and clinical samples. This bias was found to persist following standard microarray normalisation techniques. However, when mean-centering or empirical Bayes batch-correction methods (ComBat) were applied to the data, inter-batch variation in the UHRR and clinical samples were greatly reduced. Correlation between replicate UHRR samples improved by two orders of magnitude following batch-correction using ComBat (ranging from 0.9833-0.9991 to 0.9997-0.9999) and increased the consistency of the gene-lists from the duplicate clinical samples, from 11.6% in quantile normalised data to 66.4% in batch-corrected data. The use of UHRR as an inter-batch calibrator provided a small additional benefit when used in conjunction with ComBat, further increasing the agreement between the two gene-lists, up to 74.1%. Conclusion In the interests of practicalities and cost, these results suggest that single samples can generate reliable data, but only after careful compensation for technical bias in the experiment. We recommend that investigators appreciate the propensity for such variation in the design stages of a microarray experiment and that the use of suitable correction methods become routine during the statistical analysis of the data. PMID:20181233

  18. Acceptance Test Data for Candidate AGR-5/6/7 TRISO Particle Batches BWXT Coater Batches 93165 93172 Defective IPyC Fraction and Pyrocarbon Anisotropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helmreich, Grant W.; Hunn, John D.; Skitt, Darren J.

    2017-03-01

    Coated particle fuel batches J52O-16-93165, 93166, 93168, 93169, 93170, and 93172 were produced by Babcock and Wilcox Technologies (BWXT) for possible selection as fuel for the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program’s AGR-5/6/7 irradiation test in the Idaho National Laboratory (INL) Advanced Test Reactor (ATR). Some of these batches may alternately be used as demonstration coated particle fuel for other experiments. Each batch was coated in a 150-mm-diameter production-scale fluidized-bed chemical vapor deposition (CVD) furnace. Tristructural isotropic (TRISO) coatings were deposited on 425-μm-nominal-diameter spherical kernels from BWXT lot J52R-16-69317 containing a mixture of 15.5%-enriched uranium carbide and uranium oxide (UCO). The TRISO coatings consisted of four consecutive CVD layers: a ~50% dense carbon buffer layer with 100-μm-nominal thickness, a dense inner pyrolytic carbon (IPyC) layer with 40-μm-nominal thickness, a silicon carbide (SiC) layer with 35-μm-nominal thickness, and a dense outer pyrolytic carbon (OPyC) layer with 40-μm-nominal thickness. The TRISO-coated particle batches were sieved to upgrade the particles by removing over-sized and under-sized material, and the upgraded batches were designated by appending the letter A to the end of the batch number (e.g., 93165A).

  19. Semiautomated, Reproducible Batch Processing of Soy

    NASA Technical Reports Server (NTRS)

    Thoerne, Mary; Byford, Ivan W.; Chastain, Jack W.; Swango, Beverly E.

    2005-01-01

    A computer-controlled apparatus processes batches of soybeans into one or more of a variety of food products, under conditions that can be chosen by the user and reproduced from batch to batch. Examples of products include soy milk, tofu, okara (an insoluble protein and fiber byproduct of soy milk), and whey. Most processing steps take place without intervention by the user. This apparatus was developed for use in research on processing of soy. It is also a prototype of other soy-processing apparatuses for research, industrial, and home use. Prior soy-processing equipment includes household devices that automatically produce soy milk but do not automatically produce tofu. The designs of prior soy-processing equipment require users to manually transfer intermediate solid soy products and to press them manually and, hence, under conditions that are not consistent from batch to batch. Prior designs do not afford choices of processing conditions: Users cannot use previously developed soy-processing equipment to investigate the effects of variations of techniques used to produce soy milk (e.g., cold grinding, hot grinding, and pre-cook blanching) and of such process parameters as cooking times and temperatures, grinding times, soaking times and temperatures, rinsing conditions, and sizes of particles generated by grinding. In contrast, the present apparatus is amenable to such investigations. The apparatus (see figure) includes a processing tank and a jacketed holding or coagulation tank. The processing tank can be capped by either of two different heads and can contain either of two different insertable mesh baskets. The first head includes a grinding blade and heating elements. The second head includes an automated press piston. One mesh basket, designated the okara basket, has oblong holes with a size equivalent to about 40 mesh [40 openings per inch (about 16 openings per centimeter)]. The second mesh basket, designated the tofu basket, has holes of 70 mesh [70 openings per inch (about 28 openings per centimeter)] and is used in conjunction with the press-piston head. Supporting equipment includes a soy-milk heat exchanger for maintaining selected coagulation temperatures, a filter system for separating okara from other particulate matter and from soy milk, two pumps, and various thermocouples, flowmeters, level indicators, pressure sensors, valves, tubes, and sample ports.

  20. Planning and Instruction and the Social Studies Curriculum: A Discourse on Design and Delivery Systems.

    ERIC Educational Resources Information Center

    Peters, Richard

    A model for Continuous-Integrated-Sequential (C/I/S) curricula for social studies education is presented. The design advocated involves ensuring continuity of instruction from grades K-12, an integration of social studies disciplines, and a sequential process of refining and reinforcing concept and skills from grade-to-grade along the K-12…

  1. Flexible sequential designs for multi-arm clinical trials.

    PubMed

    Magirr, D; Stallard, N; Jaki, T

    2014-08-30

    Adaptive designs that are based on group-sequential approaches have the benefit of being efficient as stopping boundaries can be found that lead to good operating characteristics with test decisions based solely on sufficient statistics. The drawback of these so-called 'pre-planned adaptive' designs is that unexpected design changes are not possible without impacting the error rates. 'Flexible adaptive designs' on the other hand can cope with a large number of contingencies at the cost of reduced efficiency. In this work, we focus on two different approaches for multi-arm multi-stage trials, which are based on group-sequential ideas, and discuss how these 'pre-planned adaptive designs' can be modified to allow for flexibility. We then show how the added flexibility can be used for treatment selection and sample size reassessment and evaluate the impact on the error rates in a simulation study. The results show that an impressive overall procedure can be found by combining a well-chosen pre-planned design with an application of the conditional error principle to allow flexible treatment selection. Copyright © 2014 John Wiley & Sons, Ltd.

  2. Repeated significance tests of linear combinations of sensitivity and specificity of a diagnostic biomarker

    PubMed Central

    Wu, Mixia; Shu, Yu; Li, Zhaohai; Liu, Aiyi

    2016-01-01

    A sequential design is proposed to test whether the accuracy of a binary diagnostic biomarker meets the minimal level of acceptance. The accuracy of a binary diagnostic biomarker is a linear combination of the marker’s sensitivity and specificity. The objective of the sequential method is to minimize the maximum expected sample size under the null hypothesis that the marker’s accuracy is below the minimal level of acceptance. The exact results of two-stage designs based on Youden’s index and efficiency indicate that the maximum expected sample sizes are smaller than the sample sizes of the fixed designs. Exact methods are also developed for estimation, confidence interval and p-value concerning the proposed accuracy index upon termination of the sequential testing. PMID:26947768
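
    The accuracy index referred to above is a linear combination of sensitivity and specificity; Youden's index is the unweighted sum minus one. A small sketch of computing both from a 2×2 validation table (the counts and the weight are invented for illustration):

      # Youden's index and a weighted accuracy index from a hypothetical 2x2 table;
      # counts and the sensitivity weight are illustrative, not from the paper.
      tp, fn = 42, 8        # diseased subjects: test positive / negative (assumed)
      tn, fp = 45, 5        # healthy subjects:  test negative / positive (assumed)

      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)

      youden = sensitivity + specificity - 1               # Youden's J
      w = 0.6                                              # weight on sensitivity (assumed)
      weighted_accuracy = w * sensitivity + (1 - w) * specificity
      print(f"Se={sensitivity:.2f} Sp={specificity:.2f} "
            f"J={youden:.2f} weighted accuracy={weighted_accuracy:.2f}")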

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shull, H.E.

    The objective of the project was to investigate the economic feasibility of converting potato waste to fuel alcohol. The source of potato starch was Troyer Farms Potato Chips. Experimental work was carried out at both the laboratory scale and the larger pilot scale batch operation at a decommissioned waste water treatment building on campus. The laboratory scale work was considerably more extensive than originally planned, resulting in a much improved scientific work. The pilot scale facility has been completed and operated successfully. In contrast, the analysis of the economic feasibility of commercial production has not yet been completed. The project was brought to a close with the successful demonstration of the fermentation and distillation using the large scale facilities described previously. Two batches of mash were cooked using the procedures established in support of the laboratory scale work. One of the batches was fermented using the optimum values of the seven controlled factors as predicted by the laboratory scale application of the Box-Wilson design. The other batch was fermented under conditions derived out of Mr. Rouse's interpretation of his long sequence of laboratory results. He was gratified to find that his commitment to the Box-Wilson experiments was justified. The productivity of the Box-Wilson design was greater. The difference between the performance of the two fermentors (one stirred, one not) has not been established yet. Both batches were then distilled together, demonstrating the satisfactory performance of the column still. 4 references.
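
    The Box-Wilson approach mentioned above is the central composite design: a two-level factorial core augmented with axial ("star") points and replicated centre points. The sketch below generates such a design matrix in coded units for three factors; the fermentation work controlled seven, and the axial distance and centre replication shown here are assumptions.

      # Coded-unit central composite (Box-Wilson) design: 2^k factorial points,
      # 2k axial points at +/-alpha, and centre replicates.  k=3 shown for brevity.
      import itertools
      import numpy as np

      k = 3
      alpha = np.sqrt(k)                      # spherical axial distance (an assumption)
      factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))

      axial = []
      for j in range(k):
          for a in (-alpha, alpha):
              point = np.zeros(k)
              point[j] = a
              axial.append(point)

      centre = np.zeros((4, k))               # four centre replicates (assumed)
      design = np.vstack([factorial, np.array(axial), centre])
      print(design.shape)                     # (2**k + 2*k + 4, k) runs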

  4. Coal fly ash interaction with environmental fluids: Geochemical and strontium isotope results from combined column and batch leaching experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brubaker, Tonya M; Stewart, Brian W; Capo, Rosemary C

    2013-05-01

    The major element and Sr isotope systematics and geochemistry of coal fly ash and its interactions with environmental waters were investigated using laboratory flow-through column leaching experiments (sodium carbonate, acetic acid, nitric acid) and sequential batch leaching experiments (water, acetic acid, hydrochloric acid). Column leaching of Class F fly ash samples shows rapid release of most major elements early in the leaching procedure, suggesting an association of these elements with soluble and surface bound phases. Delayed release of certain elements (e.g., Al, Fe, Si) signals gradual dissolution of more resistant silicate or glass phases as leaching continues. Strontium isotope results from both column and batch leaching experiments show a marked increase in ⁸⁷Sr/⁸⁶Sr ratio with continued leaching, yielding a total range of values from 0.7107 to 0.7138. For comparison, the isotopic composition of fluid output from a fly ash impoundment in West Virginia falls in a narrow range around 0.7124. The experimental data suggest the presence of a more resistant, highly radiogenic silicate phase that survives the combustion process and is leached after the more soluble minerals are removed. Strontium isotopic homogenization of minerals in coal does not always occur during the combustion process, despite the high temperatures encountered in the boiler. Early-released Sr tends to be isotopically uniform; thus the Sr isotopic composition of fly ash could be distinguishable from other sources and is a useful tool for quantifying the possible contribution of fly ash leaching to the total dissolved load in natural surface and ground waters.

  5. Structured Design Language for Computer Programs

    NASA Technical Reports Server (NTRS)

    Pace, Walter H., Jr.

    1986-01-01

    Box language used at all stages of program development. Developed to provide improved productivity in designing, coding, and maintaining computer programs. BOX system written in FORTRAN 77 for batch execution.

  6. Sequential Injection/Electrochemical Immunoassay for Quantifying the Pesticide Metabolite 3, 5, 6-Trichloro-2-Pyridinol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Guodong; Riechers, Shawn L.; Timchalk, Chuck

    2005-12-04

    An automated and sensitive sequential injection electrochemical immunoassay was developed to monitor a potential insecticide biomarker, 3,5,6-trichloro-2-pyridinol. The current method involved a sequential injection analysis (SIA) system equipped with a thin-layer electrochemical flow cell and permanent magnet, which was used to fix 3,5,6-trichloro-2-pyridinol (TCP) antibody coated magnetic beads (TCP-Ab-MBs) in the reaction zone. After competitive immunoreactions among TCP-Ab-MBs, TCP analyte, and horseradish peroxidase (HRP) labeled TCP, a 3,3′,5,5′-tetramethylbenzidine dihydrochloride and hydrogen peroxide (TMB-H2O2) substrate solution was injected to produce an electroactive enzymatic product. The activity of HRP tracers was monitored by square wave voltammetric scanning of the electroactive enzymatic product in the thin-layer flow cell. The voltammetric characteristics of the substrate and the enzymatic product were investigated under batch conditions, and the parameters of the immunoassay were optimized in the SIA system. Under the optimal conditions, the system was used to measure as low as 6 ng L⁻¹ (ppt) TCP, which is around 50-fold lower than the value indicated by the manufacturer of the TCP RaPID Assay® kit (0.25 μg/L, colorimetric detection). The performance of the developed immunoassay system was successfully evaluated on tap water and river water samples spiked with TCP. This technique could be readily used for detecting other environmental contaminants by developing specific antibodies against contaminants and is expected to open new opportunities for environmental and biological monitoring.

  7. Sequential causal inference: Application to randomized trials of adaptive treatment strategies

    PubMed Central

    Dawson, Ree; Lavori, Philip W.

    2009-01-01

    SUMMARY Clinical trials that randomize subjects to decision algorithms, which adapt treatments over time according to individual response, have gained considerable interest as investigators seek designs that directly inform clinical decision making. We consider designs in which subjects are randomized sequentially at decision points, among adaptive treatment options under evaluation. We present a sequential method to estimate the comparative effects of the randomized adaptive treatments, which are formalized as adaptive treatment strategies. Our causal estimators are derived using Bayesian predictive inference. We use analytical and empirical calculations to compare the predictive estimators to (i) the ‘standard’ approach that allocates the sequentially obtained data to separate strategy-specific groups as would arise from randomizing subjects at baseline; (ii) the semi-parametric approach of marginal mean models that, under appropriate experimental conditions, provides the same sequential estimator of causal differences as the proposed approach. Simulation studies demonstrate that sequential causal inference offers substantial efficiency gains over the standard approach to comparing treatments, because the predictive estimators can take advantage of the monotone structure of shared data among adaptive strategies. We further demonstrate that the semi-parametric asymptotic variances, which are marginal ‘one-step’ estimators, may exhibit significant bias, in contrast to the predictive variances. We show that the conditions under which the sequential method is attractive relative to the other two approaches are those most likely to occur in real studies. PMID:17914714

  8. Evidence for decreased interaction and improved carotenoid bioavailability by sequential delivery of a supplement.

    PubMed

    Salter-Venzon, Dawna; Kazlova, Valentina; Izzy Ford, Samantha; Intra, Janjira; Klosner, Allison E; Gellenbeck, Kevin W

    2017-05-01

    Despite the notable benefits of carotenoids for human health, the majority of diets worldwide repeatedly fall short of current health recommendations for intake of carotenoid-rich fruits and vegetables. To address this deficit, strategies designed to increase dietary intakes and subsequent plasma levels of carotenoids are warranted. When mixed carotenoids are delivered into the intestinal tract simultaneously, competition occurs for micelle formation and absorption, affecting carotenoid bioavailability. Previously, we tested the in vitro viability of a carotenoid mix designed to deliver individual carotenoids sequentially spaced from one another over the 6 hr transit time of the human upper gastrointestinal system. We hypothesized that temporally and spatially separating the individual carotenoids would reduce competition for micelle formation, improve uptake, and maximize efficacy. Here, we test this hypothesis in a double-blind, repeated-measure, cross-over human study with 12 subjects by comparing the change in plasma carotenoid levels for 8 hr after oral doses of a sequentially spaced carotenoid mix with those after a matched mix without sequential spacing. We find that the carotenoid change from baseline, measured as area under the curve, is increased following consumption of the sequentially spaced mix compared to concomitant carotenoid delivery. These results demonstrate reduced interaction and regulation between the sequentially spaced carotenoids, suggesting improved bioavailability from a novel sequentially spaced carotenoid mix.
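
    The plasma response above is summarized as baseline-corrected area under the curve over the 8 hr sampling window; a minimal sketch of that calculation with the trapezoidal rule follows (the sampling times and concentrations are invented, not study data).

      # Baseline-corrected AUC(0-8 h) of a plasma carotenoid curve, trapezoidal rule.
      # Times and concentrations are illustrative placeholders only.
      import numpy as np

      t = np.array([0, 1, 2, 4, 6, 8])                          # hours post-dose (assumed)
      conc = np.array([0.80, 0.95, 1.10, 1.25, 1.15, 1.05])     # umol/L (assumed)

      auc_change = np.trapz(conc - conc[0], t)                  # area of change from baseline
      print(f"AUC of change from baseline = {auc_change:.2f} umol*h/L")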

  9. Improved design of constrained model predictive tracking control for batch processes against unknown uncertainties.

    PubMed

    Wu, Sheng; Jin, Qibing; Zhang, Ridong; Zhang, Junfeng; Gao, Furong

    2017-07-01

    In this paper, an improved constrained tracking control design is proposed for batch processes under uncertainties. A new process model that facilitates process state and tracking error augmentation with additional tuning is first proposed. Then a subsequent controller design is formulated using robust stable constrained MPC optimization. Unlike conventional robust model predictive control (MPC), the proposed method gives the controller design more degrees of freedom for tuning so that improved tracking control can be achieved, which is important since uncertainties inevitably exist in practice and cause model/plant mismatches. An injection molding process is introduced to illustrate the effectiveness of the proposed MPC approach in comparison with conventional robust MPC. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  10. Batch Mode Reinforcement Learning based on the Synthesis of Artificial Trajectories

    PubMed Central

    Fonteneau, Raphael; Murphy, Susan A.; Wehenkel, Louis; Ernst, Damien

    2013-01-01

    In this paper, we consider the batch mode reinforcement learning setting, where the central problem is to learn from a sample of trajectories a policy that satisfies or optimizes a performance criterion. We focus on the continuous state space case for which usual resolution schemes rely on function approximators either to represent the underlying control problem or to represent its value function. As an alternative to the use of function approximators, we rely on the synthesis of “artificial trajectories” from the given sample of trajectories, and show that this idea opens new avenues for designing and analyzing algorithms for batch mode reinforcement learning. PMID:24049244
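
    For contrast with the artificial-trajectory idea, the usual function-approximator route that the paper positions itself against is fitted Q iteration: repeatedly regress bootstrapped Q-targets on the (state, action) pairs of the fixed batch of one-step transitions. The sketch below is that standard baseline on synthetic data, not the method proposed in the paper.

      # Fitted Q iteration on a fixed batch of transitions (s, a, r, s') with a
      # tree-ensemble regressor -- a standard batch-mode RL baseline, not the
      # artificial-trajectory synthesis described above.  Toy 1-D problem.
      import numpy as np
      from sklearn.ensemble import ExtraTreesRegressor

      rng = np.random.default_rng(0)
      n, actions, gamma = 500, [0.0, 1.0], 0.95
      s = rng.uniform(-1, 1, size=(n, 1))                       # continuous states
      a = rng.choice(actions, size=(n, 1))
      s_next = np.clip(s + np.where(a == 1.0, 0.1, -0.1)
                       + 0.01 * rng.normal(size=(n, 1)), -1, 1)
      r = -np.abs(s_next[:, 0])                                 # reward: stay near the origin

      X = np.hstack([s, a])
      q = ExtraTreesRegressor(n_estimators=50, random_state=0).fit(X, r)
      for _ in range(20):                                       # Q_k -> Q_{k+1}
          next_q = np.max(np.column_stack(
              [q.predict(np.hstack([s_next, np.full((n, 1), act)])) for act in actions]), axis=1)
          q = ExtraTreesRegressor(n_estimators=50, random_state=0).fit(X, r + gamma * next_q)

      greedy = int(np.argmax([q.predict([[0.5, act]])[0] for act in actions]))
      print("greedy action at s=0.5:", actions[greedy])         # expect the action moving toward 0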

  11. Automated batch characterization of inkjet-printed elastomer lenses using a LEGO platform.

    PubMed

    Sung, Yu-Lung; Garan, Jacob; Nguyen, Hoang; Hu, Zhenyu; Shih, Wei-Chuan

    2017-09-10

    Small, self-adhesive, inkjet-printed elastomer lenses have enabled smartphone cameras to image and resolve microscopic objects. However, the performance of different lenses within a batch is affected by hard-to-control environmental variables. We present a cost-effective platform to perform automated batch characterization of 300 lens units simultaneously for quality inspection. The system was designed and configured with LEGO bricks, 3D printed parts, and a digital camera. The scheme presented here may become the basis of a high-throughput, in-line inspection tool for quality control purposes and can also be employed for optimization of the manufacturing process.

  12. High rate psychrophilic anaerobic digestion of high solids (35%) dairy manure in sequence batch reactor.

    PubMed

    Saady, Noori M Cata; Massé, Daniel I

    2015-06-01

    Zero liquid discharge is increasingly adopted as an objective for waste treatment processes. The objective of this study was to increase the feed total solids (TS) and the organic loading rate (OLR) fed to a novel psychrophilic (20°C) dry anaerobic digestion (PDAD) process. Duplicate laboratory-scale bioreactors were fed cow feces and wheat straw (35% TS in feed) at an OLR of 6.0 g TCOD kg⁻¹ inoculum d⁻¹ during long-term operation (147 days consisting of 7 successive cycles). An overall average specific methane yield (SMY) of 151.8±7.9 N L CH₄ kg⁻¹ VS fed, with an averaged volatile solids removal of 42.4±4.3%, was obtained at a volatile solids-based inoculum-to-substrate ratio (ISR) of 2.13±0.2. The operation was stable as indicated by biogas and VFA profiles and the results were reproducible in successive cycles; a maximum SMY of 163.3±5.7 N L CH₄ kg⁻¹ VS fed was obtained. Hydrolysis was the rate-limiting step. High-rate PDAD of 35% TS dairy manure is possible in a sequential batch reactor within a 21-day treatment cycle length. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  13. Waste Water for Power Generation via Energy Efficient Selective Silica Separations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nenoff, Tina M.; Brady, Patrick Vane; Sasan, Koroush

    Silica is ubiquitous in produced and industrial waters, and plays a major disruptive role in water recycle. Herein we have investigated the use of mixed oxides for the removal of silica from these waters, and their incorporation into a low cost and low energy water purification process. High selectivity hydrotalcite (HTC, Mg₆Al₂(OH)₁₆(CO₃)·4H₂O) is combined in series with high surface area active alumina (AA, Al₂O₃) as the dissolved silica removal media. Batch test results indicated that combined HTC/AA is a more effective method for removing silica from industrial cooling tower waters (CTW) than using HTC or AA separately. The silica uptake via ion exchange on the mixed oxides was confirmed by Fourier transform infrared (FTIR) and energy dispersive spectroscopy (EDS). Furthermore, HTC/AA effectively removes silica from CTW even in the presence of large concentrations of competing anions, such as Cl⁻, NO₃⁻, HCO₃⁻, CO₃²⁻ and SO₄²⁻. Similar to the batch tests, Single Path Flow Through (SPFT) tests with sequential HTC/AA column filtration show very high silica removal too. A technoeconomic analysis (TEA) was simultaneously performed for cost comparisons to existing silica removal technologies.

  14. Metal-organic framework mixed-matrix disks: Versatile supports for automated solid-phase extraction prior to chromatographic separation.

    PubMed

    Ghani, Milad; Font Picó, Maria Francesca; Salehinia, Shima; Palomino Cabello, Carlos; Maya, Fernando; Berlier, Gloria; Saraji, Mohammad; Cerdà, Víctor; Turnes Palomino, Gemma

    2017-03-10

    We present for the first time the application of metal-organic framework (MOF) mixed-matrix disks (MMD) for the automated flow-through solid-phase extraction (SPE) of environmental pollutants. Zirconium terephthalate UiO-66 and UiO-66-NH₂ MOFs with different sizes (90, 200 and 300 nm) have been incorporated into mechanically stable polyvinylidene difluoride (PVDF) disks. The performance of the MOF-MMDs for automated SPE of seven substituted phenols prior to HPLC analysis has been evaluated using the sequential injection analysis technique. MOF-MMDs enabled the simultaneous extraction of phenols with the concomitant size exclusion of molecules of larger size. The best extraction performance was obtained using a MOF-MMD containing 90 nm UiO-66-NH₂ crystals. Using the selected MOF-MMD, detection limits ranging from 0.1 to 0.2 μg L⁻¹ were obtained. Relative standard deviations ranged from 3.9 to 5.3% intra-day, and 4.7-5.7% inter-day. Membrane batch-to-batch reproducibility was from 5.2 to 6.4%. Three different groundwater samples were analyzed with the proposed method using MOF-MMDs, obtaining recoveries ranging from 90 to 98% for all tested analytes. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Two-dimensional liquid chromatography consisting of twelve second-dimension columns for comprehensive analysis of intact proteins.

    PubMed

    Ren, Jiangtao; Beckner, Matthew A; Lynch, Kyle B; Chen, Huang; Zhu, Zaifang; Yang, Yu; Chen, Apeng; Qiao, Zhenzhen; Liu, Shaorong; Lu, Joann J

    2018-05-15

    A comprehensive two-dimensional liquid chromatography (LCxLC) system consisting of twelve columns in the second dimension was developed for comprehensive analysis of intact proteins in complex biological samples. The system consisted of an ion-exchange column in the first dimension and the twelve reverse-phase columns in the second dimension; all thirteen columns were monolithic and prepared inside 250 µm i.d. capillaries. These columns were assembled together through the use of three valves and an innovative configuration. The effluent from the first dimension was continuously fractionated and sequentially transferred into the twelve second-dimension columns, while the second-dimension separations were carried out in a series of batches (six columns per batch). This LCxLC system was tested first using standard proteins followed by real-world samples from E. coli. Baseline separation was observed for eleven standard proteins and hundreds of peaks were observed for the real-world sample analysis. Two-dimensional liquid chromatography, often considered as an effective tool for mapping proteins, is seen as laborious and time-consuming when configured offline. Our online LCxLC system with increased second-dimension columns promises to provide a solution to overcome these hindrances. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Treatment of industrial wastewater effluents using hydrodynamic cavitation and the advanced Fenton process.

    PubMed

    Chakinala, Anand G; Gogate, Parag R; Burgess, Arthur E; Bremner, David H

    2008-01-01

    For the first time, hydrodynamic cavitation induced by a liquid whistle reactor (LWR) has been used in conjunction with the advanced Fenton process (AFP) for the treatment of real industrial wastewater. Semi-batch experiments in the LWR were designed to investigate the performance of the process for two different industrial wastewater samples. The effect of various operating parameters such as pressure, H2O2 concentration and the initial concentration of industrial wastewater samples on the extent of mineralization as measured by total organic carbon (TOC) content have been studied with the aim of maximizing the extent of degradation. It has been observed that higher pressures, sequential addition of hydrogen peroxide at higher loadings and lower concentration of the effluent are more favourable for a rapid TOC mineralization. In general, the novel combination of hydrodynamic cavitation with AFP results in about 60-80% removal of TOC under optimized conditions depending on the type of industrial effluent samples. The combination described herein is most useful for treatment of bio-refractory materials where the diminution in toxicity can be achieved up to a certain level and then conventional biological oxidation can be employed for final treatment. The present work is the first to report the use of a hydrodynamic cavitation technique for real industrial wastewater treatment.

  17. BATCH-GE: Batch analysis of Next-Generation Sequencing data for genome editing assessment

    PubMed Central

    Boel, Annekatrien; Steyaert, Woutert; De Rocker, Nina; Menten, Björn; Callewaert, Bert; De Paepe, Anne; Coucke, Paul; Willaert, Andy

    2016-01-01

    Targeted mutagenesis by the CRISPR/Cas9 system is currently revolutionizing genetics. The ease of this technique has enabled genome engineering in-vitro and in a range of model organisms and has pushed experimental dimensions to unprecedented proportions. Due to its tremendous progress in terms of speed, read length, throughput and cost, Next-Generation Sequencing (NGS) has been increasingly used for the analysis of CRISPR/Cas9 genome editing experiments. However, the current tools for genome editing assessment lack flexibility and fall short in the analysis of large amounts of NGS data. Therefore, we designed BATCH-GE, an easy-to-use bioinformatics tool for batch analysis of NGS-generated genome editing data, available from https://github.com/WouterSteyaert/BATCH-GE.git. BATCH-GE detects and reports indel mutations and other precise genome editing events and calculates the corresponding mutagenesis efficiencies for a large number of samples in parallel. Furthermore, this new tool provides flexibility by allowing the user to adapt a number of input variables. The performance of BATCH-GE was evaluated in two genome editing experiments, aiming to generate knock-out and knock-in zebrafish mutants. This tool will not only contribute to the evaluation of CRISPR/Cas9-based experiments, but will be of use in any genome editing experiment and has the ability to analyze data from every organism with a sequenced genome. PMID:27461955

  18. 46 CFR 154.610 - Design temperature not colder than 0 °C (32 °F).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... materials must meet §§ 54.25-1 and 54.25-3 of this chapter. (b) Plates, forgings, rolled and forged bars and... batch of forgings, forged or rolled fittings, and forged or rolled bars and shapes. (f) The specified... ton batch of forgings, forged or rolled fittings and rolled or forged bars and shapes. (h) The...

  19. Superstructure-based Design and Optimization of Batch Biodiesel Production Using Heterogeneous Catalysts

    NASA Astrophysics Data System (ADS)

    Nuh, M. Z.; Nasir, N. F.

    2017-08-01

    Biodiesel is a fuel comprised of mono-alkyl esters of long-chain fatty acids derived from renewable lipid feedstocks such as vegetable oils and animal fats. Biodiesel production is a complex process that needs systematic design and optimization. However, no case study has applied the process system engineering (PSE) elements of superstructure optimization to the batch process, which involves complex problems and uses mixed-integer nonlinear programming (MINLP). PSE offers a solution to complex engineering systems by enabling the use of viable tools and techniques to better manage and comprehend the complexity of the system. This study aims to apply PSE tools to the simulation and optimization of the biodiesel process and to develop mathematical models for the components of the plant for cases A, B and C using published kinetic data. Secondly, it aims to determine the economic analysis for biodiesel production, focusing on heterogeneous catalysts. Finally, the objective of this study is to develop the superstructure for biodiesel production using a heterogeneous catalyst. The mathematical models are developed from the superstructure, and the resulting mixed-integer nonlinear model is solved and the economic analysis estimated using MATLAB software. The optimization, with the objective function of minimizing the annual production cost of the batch process, gives a cost of 23.2587 million USD for case C. Overall, the application of process system engineering (PSE) has streamlined the modelling, design and cost estimation, and the optimization helps to resolve the complexity of batch biodiesel production and processing.

  20. Implementing Quality Criteria in Designing and Conducting a Sequential Quan [right arrow] Qual Mixed Methods Study of Student Engagement with Learning Applied Research Methods Online

    ERIC Educational Resources Information Center

    Ivankova, Nataliya V.

    2014-01-01

    In spite of recent methodological developments related to quality assurance in mixed methods research, practical examples of how to implement quality criteria in designing and conducting sequential QUAN [right arrow] QUAL mixed methods studies to ensure the process is systematic and rigorous remain scarce. This article discusses a three-step…

  1. A multi-stage drop-the-losers design for multi-arm clinical trials.

    PubMed

    Wason, James; Stallard, Nigel; Bowden, Jack; Jennison, Christopher

    2017-02-01

    Multi-arm multi-stage trials can improve the efficiency of the drug development process when multiple new treatments are available for testing. A group-sequential approach can be used in order to design multi-arm multi-stage trials, using an extension to Dunnett's multiple-testing procedure. The actual sample size used in such a trial is a random variable that has high variability. This can cause problems when applying for funding as the cost will also be generally highly variable. This motivates a type of design that provides the efficiency advantages of a group-sequential multi-arm multi-stage design, but has a fixed sample size. One such design is the two-stage drop-the-losers design, in which a number of experimental treatments, and a control treatment, are assessed at a prescheduled interim analysis. The best-performing experimental treatment and the control treatment then continue to a second stage. In this paper, we discuss extending this design to have more than two stages, which is shown to considerably reduce the sample size required. We also compare the resulting sample size requirements to the sample size distribution of analogous group-sequential multi-arm multi-stage designs. The sample size required for a multi-stage drop-the-losers design is usually higher than, but close to, the median sample size of a group-sequential multi-arm multi-stage trial. In many practical scenarios, the disadvantage of a slight loss in average efficiency would be overcome by the huge advantage of a fixed sample size. We assess the impact of delay between recruitment and assessment as well as unknown variance on the drop-the-losers designs.
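
    The behaviour of a two-stage drop-the-losers design can be explored by simulation: draw stage-1 means for the experimental arms and control, carry only the best arm and control into stage 2, and test the pooled estimate. The sketch below does exactly that with assumed effect sizes and an unadjusted final test; a real design would also correct for selection bias and multiplicity.

      # Monte Carlo sketch of a two-stage drop-the-losers trial (K arms vs control);
      # effect sizes, sample sizes and the unadjusted z-test are assumptions.
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(2)
      K, n1, n2, sigma = 3, 50, 100, 1.0
      effects = np.array([0.0, 0.2, 0.4])               # true arm effects vs control (assumed)
      z_crit = norm.ppf(0.975)

      rejections, n_sim = 0, 5000
      for _ in range(n_sim):
          xbar1 = rng.normal(effects, sigma / np.sqrt(n1))    # stage-1 arm means
          cbar1 = rng.normal(0.0, sigma / np.sqrt(n1))        # stage-1 control mean
          best = int(np.argmax(xbar1))                        # drop the losers
          xbar2 = rng.normal(effects[best], sigma / np.sqrt(n2))
          cbar2 = rng.normal(0.0, sigma / np.sqrt(n2))
          xbar = (n1 * xbar1[best] + n2 * xbar2) / (n1 + n2)  # pooled selected-arm mean
          cbar = (n1 * cbar1 + n2 * cbar2) / (n1 + n2)        # pooled control mean
          z = (xbar - cbar) / (sigma * np.sqrt(2.0 / (n1 + n2)))
          rejections += z > z_crit
      print("empirical power (unadjusted final test):", rejections / n_sim)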

  2. Lineup composition, suspect position, and the sequential lineup advantage.

    PubMed

    Carlson, Curt A; Gronlund, Scott D; Clark, Steven E

    2008-06-01

    N. M. Steblay, J. Dysart, S. Fulero, and R. C. L. Lindsay (2001) argued that sequential lineups reduce the likelihood of mistaken eyewitness identification. Experiment 1 replicated the design of R. C. L. Lindsay and G. L. Wells (1985), the first study to show the sequential lineup advantage. However, the innocent suspect was chosen at a lower rate in the simultaneous lineup, and no sequential lineup advantage was found. This led the authors to hypothesize that protection from a sequential lineup might emerge only when an innocent suspect stands out from the other lineup members. In Experiment 2, participants viewed a simultaneous or sequential lineup with either the guilty suspect or 1 of 3 innocent suspects. Lineup fairness was varied to influence the degree to which a suspect stood out. A sequential lineup advantage was found only for the unfair lineups. Additional analyses of suspect position in the sequential lineups showed an increase in the diagnosticity of suspect identifications as the suspect was placed later in the sequential lineup. These results suggest that the sequential lineup advantage is dependent on lineup composition and suspect position. (c) 2008 APA, all rights reserved

  3. Design of two-column batch-to-batch recirculation to enhance performance in ion-exchange chromatography.

    PubMed

    Persson, Oliver; Andersson, Niklas; Nilsson, Bernt

    2018-01-05

    Preparative liquid chromatography is a separation technique widely used in the manufacturing of fine chemicals and pharmaceuticals. A major drawback of a traditional single-column batch chromatography step is the trade-off between product purity and process performance. Recirculation of impure product can be utilized to make the trade-off more favorable. The aim of the present study was to investigate the usage of a two-column batch-to-batch recirculation process step to increase the performance compared to single-column batch chromatography at a high purity requirement. The separation of a ternary protein mixture on ion-exchange chromatography columns was used to evaluate the proposed process. The investigation used modelling and simulation of the process step, experimental validation and optimization of the simulated process. In the presented case the yield increases from 45.4% to 93.6% and the productivity increases 3.4 times compared to the performance of a batch run for a nominal case. A rapid build-up of product concentration can be seen during the first cycles, before the process reaches a cyclic steady state with recurring concentration profiles. The optimization of the simulation model predicts that the recirculated salt can be used as a flying start for the elution, which would enhance the process performance. The proposed process is more complex than a batch process, but may improve the separation performance, especially while operating at cyclic steady state. The recirculation of impure fractions reduces the product losses and ensures separation of product to a high degree of purity. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Anaerobic co-digestion of waste activated sludge and greasy sludge from flotation process: batch versus CSTR experiments to investigate optimal design.

    PubMed

    Girault, R; Bridoux, G; Nauleau, F; Poullain, C; Buffet, J; Peu, P; Sadowski, A G; Béline, F

    2012-02-01

    In this study, the maximum ratio of greasy sludge to incorporate with waste activated sludge was investigated in batch and CSTR experiments. In batch experiments, inhibition occurred with a greasy sludge ratio of more than 20-30% of the feed COD. In CSTR experiments, the optimal greasy sludge ratio was 60% of the feed COD and inhibition occurred above a ratio of 80%. Hence, batch experiments can predict the CSTR yield when the degradation phenomena are additive but cannot be used to determine the maximum ratio to be used in a CSTR configuration. Additionally, when the ratio of greasy sludge increased from 0% to 60% of the feed COD, CSTR methane production increased by more than 60%. When the greasy sludge ratio increased from 60% to 90% of the feed COD, the reactor yield decreased by 75%. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Optimal flexible sample size design with robust power.

    PubMed

    Zhang, Lanju; Cui, Lu; Yang, Bo

    2016-08-30

    It is well recognized that sample size determination is challenging because of the uncertainty on the treatment effect size. Several remedies are available in the literature. Group sequential designs start with a sample size based on a conservative (smaller) effect size and allow early stop at interim looks. Sample size re-estimation designs start with a sample size based on an optimistic (larger) effect size and allow sample size increase if the observed effect size is smaller than planned. Different opinions favoring one type over the other exist. We propose an optimal approach using an appropriate optimality criterion to select the best design among all the candidate designs. Our results show that (1) for the same type of designs, for example, group sequential designs, there is room for significant improvement through our optimization approach; (2) optimal promising zone designs appear to have no advantages over optimal group sequential designs; and (3) optimal designs with sample size re-estimation deliver the best adaptive performance. We conclude that to deal with the challenge of sample size determination due to effect size uncertainty, an optimal approach can help to select the best design that provides most robust power across the effect size range of interest. Copyright © 2016 John Wiley & Sons, Ltd.
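
    The idea of "robust power across the effect size range" can be made concrete by evaluating the power of a candidate fixed-sample design over a grid of plausible standardized effects. The sketch below uses the usual normal approximation for a two-arm comparison; the sample size, alpha and effect-size grid are assumptions.

      # Power of a fixed two-arm design across a range of standardized effects,
      # normal approximation: power(d) = Phi(d*sqrt(n/2) - z_{1-alpha/2}).
      import numpy as np
      from scipy.stats import norm

      n_per_arm, alpha = 80, 0.05                       # assumed design
      z_crit = norm.ppf(1 - alpha / 2)

      effect_sizes = np.linspace(0.2, 0.6, 9)           # plausible standardized effects
      power = norm.cdf(effect_sizes * np.sqrt(n_per_arm / 2) - z_crit)
      for d, p in zip(effect_sizes, power):
          print(f"d = {d:.2f}  power = {p:.2f}")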

  6. Bacteriophage PRD1 batch experiments to study attachment, detachment and inactivation processes

    NASA Astrophysics Data System (ADS)

    Sadeghi, Gholamreza; Schijven, Jack F.; Behrends, Thilo; Hassanizadeh, S. Majid; van Genuchten, Martinus Th.

    2013-09-01

    Knowledge of virus removal in subsurface environments is pivotal for assessing the risk of viral contamination of water resources and developing appropriate protection measures. Columns packed with sand are frequently used to quantify attachment, detachment and inactivation rates of viruses. Since column transport experiments are very laborious, a common alternative is to perform batch experiments where usually one or two measurements are done assuming equilibrium is reached. It is also possible to perform kinetic batch experiments. In that case, however, it is necessary to monitor changes in the concentration with time. This means that kinetic batch experiments will be almost as laborious as column experiments. Moreover, attachment and detachment rate coefficients derived from batch experiments may differ from those determined using column experiments. The aim of this study was to determine the utility of kinetic batch experiments and investigate the effects of different designs of the batch experiments on estimated attachment, detachment and inactivation rate coefficients. The experiments involved various combinations of container size, sand-water ratio, and mixing method (i.e., rolling or tumbling by pivoting the tubes around their horizontal or vertical axes, respectively). Batch experiments were conducted with clean quartz sand, water at pH 7 and ionic strength of 20 mM, and using the bacteriophage PRD1 as a model virus. Values of attachment, detachment and inactivation rate coefficients were found by fitting an analytical solution of the kinetic model equations to the data. Attachment rate coefficients were found to be systematically higher under tumbling than under rolling conditions because of better mixing and more efficient contact of phages with the surfaces of the sand grains. In both mixing methods, more sand in the container yielded higher attachment rate coefficients. A linear increase in the detachment rate coefficient was observed with increased solid-water ratio using tumbling method. Given the differences in the attachment rate coefficients, and assuming the same sticking efficiencies since chemical conditions of the batch and column experiments were the same, our results show that collision efficiencies of batch experiments are not the same as those of column experiments. Upscaling of the attachment rate from batch to column experiments hence requires proper understanding of the mixing conditions. Because batch experiments, in which the kinetics are monitored, are as laborious as column experiments, there seems to be no major advantage in performing batch instead of column experiments.
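
    The rate coefficients above parameterize a two-compartment kinetic model: free phages attach at rate k_att, detach at rate k_det, and free and attached phages are inactivated at rates mu_l and mu_s, i.e. dC/dt = -k_att·C + k_det·S - mu_l·C and dS/dt = k_att·C - k_det·S - mu_s·S. A forward simulation of that model is sketched below with assumed coefficients; in the study the coefficients are instead estimated by fitting the model solution to the measured concentrations.

      # Forward simulation of the batch attachment/detachment/inactivation model:
      #   dC/dt = -k_att*C + k_det*S - mu_l*C    (free phage)
      #   dS/dt =  k_att*C - k_det*S - mu_s*S    (attached phage)
      # Rate coefficients and the initial titre are assumed values.
      import numpy as np
      from scipy.integrate import solve_ivp

      k_att, k_det, mu_l, mu_s = 0.30, 0.02, 0.05, 0.10     # 1/h (assumed)

      def rates(t, y):
          C, S = y
          dC = -k_att * C + k_det * S - mu_l * C
          dS = k_att * C - k_det * S - mu_s * S
          return [dC, dS]

      t_eval = np.linspace(0.0, 24.0, 9)                    # hours
      sol = solve_ivp(rates, (0.0, 24.0), [1e6, 0.0], t_eval=t_eval)
      print(np.round(sol.y[0]))                             # free phage vs time
      print(np.round(sol.y[1]))                             # attached phage vs time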

  7. Characterization of Nanoparticle Batch-To-Batch Variability

    PubMed Central

    Mülhopt, Sonja; Dilger, Marco; Adelhelm, Christel; Anderlohr, Christopher; Gómez de la Torre, Johan; Langevin, Dominique; Mahon, Eugene; Piella, Jordi; Puntes, Victor; Ray, Sikha; Schneider, Reinhard; Wilkins, Terry; Weiss, Carsten

    2018-01-01

    A central challenge for the safe design of nanomaterials (NMs) is the inherent variability of NM properties, both as produced and as they interact with, and evolve in, their surroundings. This has led to uncertainty in the literature regarding whether the biological and toxicological effects reported for NMs are related to specific NM properties themselves, or rather to the presence of impurities or physical effects such as agglomeration of particles. Thus, there is a strong need for systematic evaluation of the synthesis and processing parameters that lead to potential variability of different NM batches and the reproducible production of commonly utilized NMs. The work described here represents over three years of effort across 14 European laboratories to assess the reproducibility of nanoparticle properties produced by the same and modified synthesis routes for four of the OECD priority NMs (silicon dioxide, zinc oxide, cerium dioxide and titanium dioxide) as well as amine-modified polystyrene NMs, which are frequently employed as positive controls for nanotoxicity studies. For 46 different batches of the selected NMs, all physicochemical descriptors as prioritized by the OECD have been fully characterized. The study represents the most complete assessment of NM batch-to-batch variability performed to date and provides numerous important insights into the potential sources of variability of NMs and how these might be reduced. PMID:29738461

  8. Sequential circuit design for radiation hardened multiple voltage integrated circuits

    DOEpatents

    Clark, Lawrence T [Phoenix, AZ; McIver, III, John K.

    2009-11-24

    The present invention includes a radiation hardened sequential circuit, such as a bistable circuit, flip-flop or other suitable design, that presents substantial immunity to ionizing radiation while simultaneously maintaining a low operating voltage. In one embodiment, the circuit includes a plurality of logic elements that operate on a relatively low voltage, and master and slave latches, each having storage elements that operate on a relatively high voltage.

  9. Mining of high utility-probability sequential patterns from uncertain databases

    PubMed Central

    Zhang, Binbin; Fournier-Viger, Philippe; Li, Ting

    2017-01-01

    High-utility sequential pattern mining (HUSPM) has become an important issue in the field of data mining. Several HUSPM algorithms have been designed to mine high-utility sequential patterns (HUSPs). They have been applied in several real-life situations such as for consumer behavior analysis and event detection in sensor networks. Nonetheless, most studies on HUSPM have focused on mining HUSPs in precise data. But in real-life, uncertainty is an important factor as data is collected using various types of sensors that are more or less accurate. Hence, data collected in a real-life database can be annotated with existence probabilities. This paper presents a novel pattern mining framework called high utility-probability sequential pattern mining (HUPSPM) for mining high utility-probability sequential patterns (HUPSPs) in uncertain sequence databases. A baseline algorithm with three optional pruning strategies is presented to mine HUPSPs. Moreover, to speed up the mining process, a projection mechanism is designed to create a database projection for each processed sequence, which is smaller than the original database. Thus, the number of unpromising candidates can be greatly reduced, as well as the execution time for mining HUPSPs. Substantial experiments both on real-life and synthetic datasets show that the designed algorithm performs well in terms of runtime, number of candidates, memory usage, and scalability for different minimum utility and minimum probability thresholds. PMID:28742847

  10. Modeling eye gaze patterns in clinician-patient interaction with lag sequential analysis.

    PubMed

    Montague, Enid; Xu, Jie; Chen, Ping-Yu; Asan, Onur; Barrett, Bruce P; Chewning, Betty

    2011-10-01

    The aim of this study was to examine whether lag sequential analysis could be used to describe eye gaze orientation between clinicians and patients in the medical encounter. This topic is particularly important as new technologies are implemented into multiuser health care settings in which trust is critical and nonverbal cues are integral to achieving trust. This analysis method could lead to design guidelines for technologies and more effective assessments of interventions. Nonverbal communication patterns are important aspects of clinician-patient interactions and may affect patient outcomes. The eye gaze behaviors of clinicians and patients in 110 videotaped medical encounters were analyzed using the lag sequential method to identify significant behavior sequences. Lag sequential analysis included both event-based lag and time-based lag. Results from event-based lag analysis showed that the patient's gaze followed that of the clinician, whereas the clinician's gaze did not follow the patient's. Time-based sequential analysis showed that responses from the patient usually occurred within 2 s after the initial behavior of the clinician. Our data suggest that the clinician's gaze significantly affects the medical encounter but that the converse is not true. Findings from this research have implications for the design of clinical work systems and modeling interactions. Similar research methods could be used to identify different behavior patterns in clinical settings (physical layout, technology, etc.) to facilitate and evaluate clinical work system designs.
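
    The event-based half of the analysis can be pictured with a toy computation: lag-1 transition counts between coded gaze behaviors and their adjusted residuals. The sketch below uses the common Allison-Liker form of the adjusted residual; the behavior codes and the data stream are invented, and the study's actual coding scheme and its time-based analysis are not reproduced.

        import numpy as np

        # Toy coded gaze stream: D = doctor gazes at patient, P = patient gazes at
        # doctor, O = other. Codes and data are invented for illustration only.
        events = list("DPODPPDODPDPOOD")
        states = sorted(set(events))
        idx = {s: i for i, s in enumerate(states)}

        # Lag-1 transition counts: row = antecedent behavior, column = following one.
        counts = np.zeros((len(states), len(states)))
        for a, b in zip(events[:-1], events[1:]):
            counts[idx[a], idx[b]] += 1

        n = counts.sum()
        row = counts.sum(axis=1, keepdims=True)
        col = counts.sum(axis=0, keepdims=True)
        expected = row * col / n

        # Allison-Liker adjusted residuals; |z| > 1.96 flags lag-1 sequences occurring
        # more (or less) often than expected by chance at the 0.05 level.
        z = (counts - expected) / np.sqrt(expected * (1 - row / n) * (1 - col / n))
        print(states)
        print(np.round(z, 2))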

  11. Modeling Eye Gaze Patterns in Clinician-Patient Interaction with Lag Sequential Analysis

    PubMed Central

    Montague, E; Xu, J; Asan, O; Chen, P; Chewning, B; Barrett, B

    2011-01-01

    Objective The aim of this study was to examine whether lag-sequential analysis could be used to describe eye gaze orientation between clinicians and patients in the medical encounter. This topic is particularly important as new technologies are implemented into multi-user health care settings where trust is critical and nonverbal cues are integral to achieving trust. This analysis method could lead to design guidelines for technologies and more effective assessments of interventions. Background Nonverbal communication patterns are important aspects of clinician-patient interactions and may impact patient outcomes. Method Eye gaze behaviors of clinicians and patients in 110 videotaped medical encounters were analyzed using the lag-sequential method to identify significant behavior sequences. Lag-sequential analysis included both event-based lag and time-based lag. Results Results from event-based lag analysis showed that the patients’ gaze followed that of clinicians, while clinicians did not follow patients. Time-based sequential analysis showed that responses from the patient usually occurred within two seconds after the initial behavior of the clinician. Conclusion Our data suggest that the clinician’s gaze significantly affects the medical encounter but not the converse. Application Findings from this research have implications for the design of clinical work systems and modeling interactions. Similar research methods could be used to identify different behavior patterns in clinical settings (physical layout, technology, etc.) to facilitate and evaluate clinical work system designs. PMID:22046723

  12. Fully vs. Sequentially Coupled Loads Analysis of Offshore Wind Turbines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damiani, Rick; Wendt, Fabian; Musial, Walter

    The design and analysis methods for offshore wind turbines must consider the aerodynamic and hydrodynamic loads and response of the entire system (turbine, tower, substructure, and foundation) coupled to the turbine control system dynamics. Whereas a fully coupled (turbine and support structure) modeling approach is more rigorous, intellectual property concerns can preclude this approach. In fact, turbine control system algorithms and turbine properties are strictly guarded and often not shared. In many cases, a partially coupled analysis using separate tools and an exchange of reduced sets of data via sequential coupling may be necessary. In the sequentially coupled approach, the turbine and substructure designers will independently determine and exchange an abridged model of their respective subsystems to be used in their partners' dynamic simulations. Although the ability to achieve design optimization is sacrificed to some degree with a sequentially coupled analysis method, the central question here is whether this approach can deliver the required safety and how the differences in the results from the fully coupled method could affect the design. This work summarizes the scope and preliminary results of a study conducted for the Bureau of Safety and Environmental Enforcement aimed at quantifying differences between these approaches through aero-hydro-servo-elastic simulations of two offshore wind turbines on a monopile and jacket substructure.

  13. Accurately controlled sequential self-folding structures by polystyrene film

    NASA Astrophysics Data System (ADS)

    Deng, Dongping; Yang, Yang; Chen, Yong; Lan, Xing; Tice, Jesse

    2017-08-01

    Four-dimensional (4D) printing overcomes traditional fabrication limitations by designing heterogeneous materials that enable the printed structures to evolve over time (the fourth dimension) under external stimuli. Here, we present a simple 4D printing of self-folding structures that can be sequentially and accurately folded. When heated above their glass transition temperature, pre-strained polystyrene films shrink in the XY plane. In our process, silver ink traces printed on the film are used to provide the heat stimulus by conducting current to trigger the self-folding behavior. The parameters affecting the folding process are studied and discussed. Sequential folding and accurately controlled folding angles are achieved by using printed ink traces and an angle-lock design. Theoretical analyses are carried out to guide the design of the folding processes. Programmable structures such as a lock and a three-dimensional antenna are realized to test the feasibility and potential applications of this method. These self-folding structures change their shapes after fabrication under a controlled stimulus (electric current) and have potential applications in the fields of electronics, consumer devices, and robotics. Our design and fabrication method provides an easy way, using silver ink printed on polystyrene films, to 4D print self-folding structures for electrically induced sequential folding with angular control.

  14. Development of a new model for batch sedimentation and application to secondary settling tanks design.

    PubMed

    Karamisheva, Ralica D; Islam, M A

    2005-01-01

    Assuming that settling takes place in two zones (a constant rate zone and a variable rate zone), a model using four parameters accounting for the nature of the water-suspension system has been proposed for describing batch sedimentation processes. The sludge volume index (SVI) has been expressed in terms of these parameters. Some disadvantages of the SVI application as a design parameter have been pointed out, and it has been shown that a relationship between zone settling velocity and sludge concentration is more consistent for describing the settling behavior and for design of settling tanks. The permissible overflow rate has been related to the technological parameters of the secondary settling tank by simple working equations. The graphical representations of these equations could be used to optimize the design and operation of secondary settling tanks.
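
    The paper's own four-parameter model is not reproduced here, but the kind of relationship it argues for can be illustrated with the widely used Vesilind form, in which the zone settling velocity decays exponentially with sludge concentration and the permissible surface overflow rate is bounded by the settling velocity at the feed concentration; the symbols below are generic textbook notation, not the paper's parameters.

        \[
        v_s(X) = v_0\, e^{-kX}, \qquad \frac{Q_e}{A} \;\le\; v_s(X_f),
        \]

    where \(v_0\) and \(k\) characterize the water-suspension system, \(X_f\) is the feed solids concentration, \(Q_e\) the effluent flow rate and \(A\) the tank surface area.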

  15. [Not Available].

    PubMed

    Paturzo, Marco; Colaceci, Sofia; Clari, Marco; Mottola, Antonella; Alvaro, Rosaria; Vellone, Ercole

    2016-01-01

    Mixed methods designs: an innovative methodological approach for nursing research. Mixed methods (MM) research designs combine qualitative and quantitative approaches in the research process, within a single study or a series of studies. Their use can provide a wider understanding of multifaceted phenomena. This article presents a general overview of the structure and design of MM studies in order to spread this approach in the Italian nursing research community. The MM designs most commonly used in the nursing field are the convergent parallel design, the sequential explanatory design, the exploratory sequential design and the embedded design. For each design, a research example is presented. The use of MM can add value in improving clinical practice because, through the integration of qualitative and quantitative methods, researchers can better assess the complex phenomena typical of nursing.

  16. Design of Mixed Batch Reactor and Column Studies at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Weimin; Criddle, Craig S.

    2015-11-16

    We (the Stanford research team) were invited as external collaborators to contribute expertise in environmental engineering and field research at the ORNL IFRC, Oak Ridge, TN, for projects carried out at the Argonne National Laboratory and funded by US DOE. Specifically, we assisted in the design of batch and column reactors using ORNL IFRC materials to ensure the experiments were relevant to field conditions. During the funded research period, we characterized ORNL IFRC groundwater and sediments in batch microcosm and column experiments conducted at ANL, and we communicated with ANL team members through email and conference calls and face-to-face meetings at the annual ERSP PI meeting and national meetings. Microcosm test results demonstrated that U(VI) in sediments was reduced to U(IV) when amended with ethanol. The reduced products were not uraninite but unknown U(IV) complexes associated with Fe. Fe(III) in solid phase was only partially reduced. Due to budget reductions at ANL, Stanford contributions ended in 2011.

  17. Structural Optimization for Reliability Using Nonlinear Goal Programming

    NASA Technical Reports Server (NTRS)

    El-Sayed, Mohamed E.

    1999-01-01

    This report details the development of a reliability-based multi-objective design tool for solving structural optimization problems. Based on two different optimization techniques, namely sequential unconstrained minimization and nonlinear goal programming, the developed design method has the capability to take into account the effects of variability on the proposed design through a user-specified reliability design criterion. In its sequential unconstrained minimization mode, the developed design tool uses a composite objective function, in conjunction with weight-ordered design objectives, in order to take into account conflicting and multiple design criteria. Multiple design criteria of interest include structural weight, load-induced stress and deflection, and mechanical reliability. The nonlinear goal programming mode, on the other hand, provides a design method that eliminates the difficulty of having to define an objective function and constraints, while at the same time having the capability of handling rank-ordered design objectives or goals. For simulation purposes, the design of a pressure vessel cover plate was undertaken as a test bed for the newly developed design tool. The formulation of this structural optimization problem into sequential unconstrained minimization and goal programming form is presented. The resulting optimization problem was solved using: (i) the linear extended interior penalty function method algorithm; and (ii) Powell's conjugate directions method. Both single and multi-objective numerical test cases are included, demonstrating the design tool's capabilities as they apply to this design problem.
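
    As a minimal sketch of the sequential-unconstrained-minimization idea only (the goal-programming mode, the report's actual cover-plate data, and its linear extended interior penalty are not reproduced), the toy problem below folds a stress limit and a geometric requirement into one penalized composite objective and minimizes it with Powell's conjugate-directions method; the stress formula and every number are illustrative assumptions.

        import numpy as np
        from scipy.optimize import minimize

        # Illustrative stand-in for the cover-plate problem; all numbers are made up.
        # The constrained weight minimization is folded into a single composite
        # objective with exterior quadratic penalties and solved with Powell's
        # conjugate-directions method (scipy exposes it as method="Powell").
        RHO, P, S_ALLOW, R_MIN = 7850.0, 2.0e6, 250.0e6, 0.5

        def weight(x):
            t, r = x
            return RHO * np.pi * r**2 * t          # plate mass, kg

        def stress(x):
            t, r = x
            return 1.24 * P * (r / t)**2           # flat-plate bending estimate, Pa

        def penalized(x, rp=1.0e7):
            g1 = stress(x) / S_ALLOW - 1.0         # stress limit, feasible when g1 <= 0
            g2 = R_MIN / x[1] - 1.0                # opening must stay covered, g2 <= 0
            return weight(x) + rp * (max(g1, 0.0)**2 + max(g2, 0.0)**2)

        res = minimize(penalized, x0=[0.02, 0.6], method="Powell")
        print(res.x, weight(res.x), stress(res.x) / 1e6)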

  18. Microcomputer Applications in Interaction Analysis.

    ERIC Educational Resources Information Center

    Wadham, Rex A.

    The Timed Interval Categorical Observation Recorder (TICOR), a portable, battery powered microcomputer designed to automate the collection of sequential and simultaneous behavioral observations and their associated durations, was developed to overcome problems in gathering subtle interaction analysis data characterized by sequential flow of…

  19. Proceedings of the Conference on the Design of Experiments in Army Research, Development and Testing (29th)

    DTIC Science & Technology

    1984-06-01

    This record consists only of a conference program fragment and indexing terms. Recoverable content: a session on sequential testing, including a paper on a truncated sequential probability ratio test; subject terms include suicide, optical data, operational testing, reliability, random numbers, bootstrap methods, missing data, sequential testing, fire support, complex computer models, and carcinogenesis studies; the individual contributed papers can be ascertained from their titles.

  20. Physico-chemical, microbiological and ecotoxicological evaluation of a septic tank/Fenton reaction combination for the treatment of hospital wastewaters.

    PubMed

    Berto, Josiani; Rochenbach, Gisele Canan; Barreiros, Marco Antonio B; Corrêa, Albertina X R; Peluso-Silva, Sandra; Radetski, Claudemir Marcos

    2009-05-01

    Hospital wastewater is considered a complex mixture populated with pathogenic microorganisms. The genetic constitution of these microorganisms can be changed through the direct and indirect effects of hospital wastewater constituents, leading to the appearance of antibiotic multi-resistant bacteria. To avoid environmental contamination, hospital wastewaters must be treated. The objective of this study was to evaluate the efficiency of hospital wastewater treated by a combined process of biological degradation (septic tank) and the Fenton reaction. Thus, after septic tank biodegradation, batch Fenton reaction experiments were performed in a laboratory-scale reactor and the effectiveness of this sequential treatment was evaluated by a physico-chemical/microbiological time-course analysis of COD, BOD(5), and thermotolerant and total coliforms. The results showed that after 120 min of Fenton treatment BOD(5) and COD values decreased by 90.6% and 91.0%, respectively. The BOD(5)/COD ratio changed from 0.46 to 0.48 after 120 min of treatment. Bacterial removal efficiency reached 100%, while biotests carried out with Scenedesmus subspicatus and Daphnia magna showed a significant decrease in the ecotoxicity of hospital wastewater after the sequential treatment. The use of this combined system would ensure that neither multi-resistant bacteria nor ecotoxic substances are released to the environment through hospital wastewater discharge.

  1. 40 CFR 1065.1105 - Sampling system design.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Sampling system design. 1065.1105... Compounds § 1065.1105 Sampling system design. (a) General. We recommend that you design your SVOC batch... practical, adjust sampling times based on the emission rate of target analytes from the engine to obtain...

  2. Dose finding with the sequential parallel comparison design.

    PubMed

    Wang, Jessie J; Ivanova, Anastasia

    2014-01-01

    The sequential parallel comparison design (SPCD) is a two-stage design recommended for trials with possibly high placebo response. A drug-placebo comparison in the first stage is followed in the second stage by placebo nonresponders being re-randomized between drug and placebo. We describe how SPCD can be used in trials where multiple doses of a drug or multiple treatments are compared with placebo and present two adaptive approaches. We detail how to analyze data in such trials and give recommendations about the allocation proportion to placebo in the two stages of SPCD.
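
    One commonly cited way of combining the two stages of an SPCD analysis (a generic illustration, not necessarily the specific analysis or allocation recommendations developed in this paper) is a weighted sum of the stage-wise test statistics,

        \[
        Z = w\,Z_1 + \sqrt{1-w^{2}}\;Z_2, \qquad 0 \le w \le 1,
        \]

    where \(Z_1\) compares drug with placebo among all stage-1 subjects and \(Z_2\) compares drug with placebo among the re-randomized stage-1 placebo non-responders; if the two statistics can be treated as independent standard normals under the null hypothesis, \(Z\) is again standard normal and the overall type I error rate is preserved, and the choice of \(w\) interacts with the placebo allocation proportions about which the paper gives recommendations.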

  3. RACE/A: an architectural account of the interactions between learning, task control, and retrieval dynamics.

    PubMed

    van Maanen, Leendert; van Rijn, Hedderik; Taatgen, Niels

    2012-01-01

    This article discusses how sequential sampling models can be integrated in a cognitive architecture. The new theory Retrieval by Accumulating Evidence in an Architecture (RACE/A) combines the level of detail typically provided by sequential sampling models with the level of task complexity typically provided by cognitive architectures. We will use RACE/A to model data from two variants of a picture-word interference task in a psychological refractory period design. These models will demonstrate how RACE/A enables interactions between sequential sampling and long-term declarative learning, and between sequential sampling and task control. In a traditional sequential sampling model, the onset of the process within the task is unclear, as is the number of sampling processes. RACE/A provides a theoretical basis for estimating the onset of sequential sampling processes during task execution and allows for easy modeling of multiple sequential sampling processes within a task. Copyright © 2011 Cognitive Science Society, Inc.
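
    The sequential-sampling core of such models can be illustrated with a bare racing-accumulator simulation; this is a generic sketch, not the RACE/A equations, and the drift rates, noise level, threshold and non-decision time are all invented.

        import numpy as np

        # Two response alternatives race to a common evidence threshold. In a full
        # architectural model the drift rates would come from declarative activation;
        # here every parameter is invented purely to illustrate the sampling process.
        rng = np.random.default_rng(7)
        DRIFTS, NOISE, THRESHOLD, DT, T0 = np.array([0.9, 0.6]), 0.3, 1.0, 0.002, 0.25

        def one_trial():
            x, t = np.zeros(2), 0.0
            while x.max() < THRESHOLD:
                x += DRIFTS * DT + NOISE * np.sqrt(DT) * rng.standard_normal(2)
                t += DT
            return int(np.argmax(x)), T0 + t

        choices, rts = zip(*(one_trial() for _ in range(500)))
        print("P(choice 0):", round(float(np.mean(np.array(choices) == 0)), 2),
              "mean RT:", round(float(np.mean(rts)), 2), "s")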

  4. Evaluation Using Sequential Trials Methods.

    ERIC Educational Resources Information Center

    Cohen, Mark E.; Ralls, Stephen A.

    1986-01-01

    Although dental school faculty as well as practitioners are interested in evaluating products and procedures used in clinical practice, research design and statistical analysis can sometimes pose problems. Sequential trials methods provide an analytical structure that is both easy to use and statistically valid. (Author/MLW)

  5. Sequential Designs Based on Bayesian Uncertainty Quantification in Sparse Representation Surrogate Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Ray -Bing; Wang, Weichung; Jeff Wu, C. F.

    A numerical method, called OBSM, was recently proposed which employs overcomplete basis functions to achieve sparse representations. While the method can handle non-stationary response without the need of inverting large covariance matrices, it lacks the capability to quantify uncertainty in predictions. We address this issue by proposing a Bayesian approach which first imposes a normal prior on the large space of linear coefficients, then applies the MCMC algorithm to generate posterior samples for predictions. From these samples, Bayesian credible intervals can then be obtained to assess prediction uncertainty. A key application for the proposed method is the efficient construction of sequential designs. Several sequential design procedures with different infill criteria are proposed based on the generated posterior samples. As a result, numerical studies show that the proposed schemes are capable of solving problems of positive point identification, optimization, and surrogate fitting.

  6. Sequential Designs Based on Bayesian Uncertainty Quantification in Sparse Representation Surrogate Modeling

    DOE PAGES

    Chen, Ray -Bing; Wang, Weichung; Jeff Wu, C. F.

    2017-04-12

    A numerical method, called OBSM, was recently proposed which employs overcomplete basis functions to achieve sparse representations. While the method can handle non-stationary response without the need of inverting large covariance matrices, it lacks the capability to quantify uncertainty in predictions. We address this issue by proposing a Bayesian approach which first imposes a normal prior on the large space of linear coefficients, then applies the MCMC algorithm to generate posterior samples for predictions. From these samples, Bayesian credible intervals can then be obtained to assess prediction uncertainty. A key application for the proposed method is the efficient construction of sequential designs. Several sequential design procedures with different infill criteria are proposed based on the generated posterior samples. As a result, numerical studies show that the proposed schemes are capable of solving problems of positive point identification, optimization, and surrogate fitting.
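
    A stripped-down version of the idea is sketched below: posterior samples of the coefficients of an overcomplete Gaussian basis give pointwise credible intervals, and the widest interval suggests the next design point. A conjugate normal prior with known noise variance is used here in place of the paper's MCMC over the OBSM representation, and the basis, variances and infill rule are assumptions made for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 1, 25)
        y = np.sin(6 * x) + 0.1 * rng.standard_normal(x.size)

        centers = np.linspace(0, 1, 40)                    # overcomplete Gaussian bases
        def design(x):
            return np.exp(-0.5 * ((x[:, None] - centers) / 0.05) ** 2)

        Phi, tau2, sigma2 = design(x), 1.0, 0.01           # prior and noise variances
        A = Phi.T @ Phi / sigma2 + np.eye(centers.size) / tau2
        cov = np.linalg.inv(A)                             # posterior covariance
        mean = cov @ Phi.T @ y / sigma2                    # posterior mean

        xg = np.linspace(0, 1, 200)
        draws = rng.multivariate_normal(mean, cov, size=500) @ design(xg).T
        lo, hi = np.percentile(draws, [2.5, 97.5], axis=0) # 95% credible band
        next_x = xg[np.argmax(hi - lo)]                    # widest-band infill point
        print("next design point:", round(float(next_x), 3))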

  7. A Framework for Batched and GPU-Resident Factorization Algorithms Applied to Block Householder Transformations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Tingzing Tim; Tomov, Stanimire Z; Luszczek, Piotr R

    As modern hardware keeps evolving, an increasingly effective approach to developing energy efficient and high-performance solvers is to design them to work on many small size and independent problems. Many applications already need this functionality, especially for GPUs, which are currently known to be about four to five times more energy efficient than multicore CPUs. We describe the development of one-sided factorizations that work for a set of small dense matrices in parallel, and we illustrate our techniques on the QR factorization based on Householder transformations. We refer to this mode of operation as a batched factorization. Our approach is based on representing the algorithms as a sequence of batched BLAS routines for GPU-only execution. This is in contrast to the hybrid CPU-GPU algorithms that rely heavily on using the multicore CPU for specific parts of the workload. But for a system to benefit fully from the GPU's significantly higher energy efficiency, avoiding the use of the multicore CPU must be a primary design goal, so the system can rely more heavily on the more efficient GPU. Additionally, this will result in the removal of the costly CPU-to-GPU communication. Furthermore, we do not use a single symmetric multiprocessor (on the GPU) to factorize a single problem at a time. We illustrate how our performance analysis, and the use of profiling and tracing tools, guided the development and optimization of our batched factorization to achieve up to a 2-fold speedup and a 3-fold energy efficiency improvement compared to our highly optimized batched CPU implementations based on the MKL library (when using two sockets of Intel Sandy Bridge CPUs). Compared to a batched QR factorization featured in the CUBLAS library for GPUs, we achieved up to 5x speedup on the K40 GPU.
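
    The batched mode of operation, many small independent factorizations handled as one workload, can be pictured with the CPU-side toy below: a plain Householder QR looped over a batch of small matrices in NumPy. It does not use the GPU, batched BLAS, or the CUBLAS/MKL routines discussed above, and the batch and matrix sizes are arbitrary.

        import numpy as np

        # Toy picture of a "batched" one-sided factorization: many small, independent
        # QR factorizations processed as one workload. Plain NumPy on the CPU stands
        # in for the GPU batched-BLAS execution described above.
        rng = np.random.default_rng(1)
        batch = rng.standard_normal((1000, 8, 6))        # 1000 independent 8x6 problems

        def householder_qr(a):
            m, n = a.shape
            r, q = a.copy(), np.eye(m)
            for k in range(n):                           # one reflector per column
                x = r[k:, k]
                v = x.copy()
                v[0] += np.sign(x[0] or 1.0) * np.linalg.norm(x)
                v /= np.linalg.norm(v)
                r[k:, :] -= 2.0 * np.outer(v, v @ r[k:, :])
                q[:, k:] -= 2.0 * np.outer(q[:, k:] @ v, v)
            return q, r

        qs, rs = zip(*(householder_qr(a) for a in batch))
        err = max(np.linalg.norm(q @ r - a) for q, r, a in zip(qs, rs, batch))
        print("max reconstruction error over the batch:", err)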

  8. Circular chemiresistors for microchemical sensors

    DOEpatents

    Ho, Clifford K [Albuquerque, NM

    2007-03-13

    A circular chemiresistor for use in microchemical sensors. A pair of electrodes is fabricated on an electrically insulating substrate. The pattern of electrodes is arranged in a circle-filling geometry, such as a concentric, dual-track spiral design, or a circular interdigitated design. A drop of a chemically sensitive polymer (i.e., chemiresistive ink) is deposited on the insulating substrate on the electrodes, which spreads out into a thin, circular disk contacting the pair of electrodes. This circularly-shaped electrode geometry maximizes the contact area between the pair of electrodes and the polymer deposit, which provides a lower and more stable baseline resistance than with linear-trace designs. The circularly-shaped electrode pattern also serves to minimize batch-to-batch variations in the baseline resistance due to non-uniform distributions of conductive particles in the chemiresistive polymer film.

  9. Connecting drug delivery reality to smart materials design.

    PubMed

    Grainger, David W

    2013-09-15

    Inflated claims to both design and mechanistic novelty in drug delivery and imaging systems, including most nanotechnologies, are not supported by the generally poor translation of these systems to clinical efficacy. The "form begets function" design paradigm is seductive but perhaps over-simplistic in translation to pharmaceutical efficacy. Most innovations show few clinically important distinctions in their therapeutic benefits in relevant preclinical disease and delivery models, despite frequent claims to the contrary. Long-standing challenges in drug delivery issues must enlist more realistic, back-to-basics approaches to address fundamental materials properties in complex biological systems, preclinical test beds, and analytical methods to more reliably determine fundamental pharmaceutical figures of merit, including drug carrier purity and batch-batch variability, agent biodistribution, therapeutic index (safety), and efficacy. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Component and System Sensitivity Considerations for Design of a Lunar ISRU Oxygen Production Plant

    NASA Technical Reports Server (NTRS)

    Linne, Diane L.; Gokoglu, Suleyman; Hegde, Uday G.; Balasubramaniam, Ramaswamy; Santiago-Maldonado, Edgardo

    2009-01-01

    Component and system sensitivities of some design parameters of ISRU system components are analyzed. The differences between terrestrial and lunar excavation are discussed, and a qualitative comparison of large and small excavators is started. The effect of excavator size on the size of the ISRU plant's regolith hoppers is presented. Optimum operating conditions of both hydrogen and carbothermal reduction reactors are explored using recently developed analytical models. Design parameters such as batch size, conversion fraction, and maximum particle size are considered for a hydrogen reduction reactor while batch size, conversion fraction, number of melt zones, and methane flow rate are considered for a carbothermal reduction reactor. For both reactor types the effect of reactor operation on system energy and regolith delivery requirements is presented.

  11. A method for simultaneously counterbalancing condition order and assignment of stimulus materials to conditions.

    PubMed

    Zeelenberg, René; Pecher, Diane

    2015-03-01

    Counterbalanced designs are frequently used in the behavioral sciences. Studies often counterbalance either the order in which conditions are presented in the experiment or the assignment of stimulus materials to conditions. Occasionally, researchers need to simultaneously counterbalance both condition order and stimulus assignment to conditions. Lewis (1989; Behavior Research Methods, Instruments, & Computers 25:414-415, 1993) presented a method for constructing Latin squares that fulfill these requirements. The resulting Latin squares counterbalance immediate sequential effects, but not remote sequential effects. Here, we present a new method for generating Latin squares that simultaneously counterbalance both immediate and remote sequential effects and assignment of stimuli to conditions. An Appendix is provided to facilitate implementation of these Latin square designs.
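
    For comparison with the classic construction that this article extends, a Williams-style balanced Latin square, which counterbalances condition order and immediate (but not remote) sequential effects and says nothing about stimulus assignment, can be generated as follows; this is a generic textbook construction, not the authors' new method.

        # Balanced Latin square for an even number of conditions n: every condition
        # appears once per row (participant order) and per position, and every
        # condition immediately follows every other condition exactly once. For odd
        # n, the square plus its left-right mirror is needed.
        def balanced_latin_square(n):
            first, lo, hi = [0], 1, n - 1       # first row: 0, 1, n-1, 2, n-2, ...
            while len(first) < n:
                first.append(lo); lo += 1
                if len(first) < n:
                    first.append(hi); hi -= 1
            return [[(c + r) % n for c in first] for r in range(n)]

        for row in balanced_latin_square(6):
            print(row)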

  12. Enhanced Biocide Treatments with D-amino Acid Mixtures against a Biofilm Consortium from a Water Cooling Tower.

    PubMed

    Jia, Ru; Li, Yingchao; Al-Mahamedh, Hussain H; Gu, Tingyue

    2017-01-01

    Different species of microbes form mixed-culture biofilms in cooling water systems. They cause microbiologically influenced corrosion (MIC) and biofouling, leading to increased operational and maintenance costs. In this work, two D-amino acid mixtures were found to enhance two non-oxidizing biocides [tetrakis hydroxymethyl phosphonium sulfate (THPS) and NALCO 7330 (isothiazoline derivatives)] and one oxidizing biocide [bleach (NaClO)] against a biofilm consortium from a water cooling tower in lab tests. Fifty ppm (w/w) of an equimass mixture of D-methionine, D-leucine, D-tyrosine, D-tryptophan, D-serine, D-threonine, D-phenylalanine, and D-valine (D8) enhanced 15 ppm THPS and 15 ppm NALCO 7330 with similar efficacies achieved by the 30 ppm THPS alone treatment and the 30 ppm NALCO 7330 alone treatment, respectively in the single-batch 3-h biofilm removal test. A sequential treatment method was used to enhance bleach because D-amino acids react with bleach. After a 4-h biofilm removal test, the sequential treatment of 5 ppm bleach followed by 50 ppm D8 achieved extra 1-log reduction in sessile cell counts of acid producing bacteria, sulfate reducing bacteria, and general heterotrophic bacteria compared with the 5 ppm bleach alone treatment. The 10 ppm bleach alone treatment showed a similar efficacy with the sequential treatment of 5 ppm bleach followed by 50 ppm D8. The efficacy of D8 was found better than that of D4 (an equimass mixture of D-methionine, D-leucine, D-tyrosine, and D-tryptophan) in the enhancement of the three individual biocides against the biofilm consortium.

  13. Enhanced Biocide Treatments with D-amino Acid Mixtures against a Biofilm Consortium from a Water Cooling Tower

    PubMed Central

    Jia, Ru; Li, Yingchao; Al-Mahamedh, Hussain H.; Gu, Tingyue

    2017-01-01

    Different species of microbes form mixed-culture biofilms in cooling water systems. They cause microbiologically influenced corrosion (MIC) and biofouling, leading to increased operational and maintenance costs. In this work, two D-amino acid mixtures were found to enhance two non-oxidizing biocides [tetrakis hydroxymethyl phosphonium sulfate (THPS) and NALCO 7330 (isothiazoline derivatives)] and one oxidizing biocide [bleach (NaClO)] against a biofilm consortium from a water cooling tower in lab tests. Fifty ppm (w/w) of an equimass mixture of D-methionine, D-leucine, D-tyrosine, D-tryptophan, D-serine, D-threonine, D-phenylalanine, and D-valine (D8) enhanced 15 ppm THPS and 15 ppm NALCO 7330 with similar efficacies achieved by the 30 ppm THPS alone treatment and the 30 ppm NALCO 7330 alone treatment, respectively in the single-batch 3-h biofilm removal test. A sequential treatment method was used to enhance bleach because D-amino acids react with bleach. After a 4-h biofilm removal test, the sequential treatment of 5 ppm bleach followed by 50 ppm D8 achieved extra 1-log reduction in sessile cell counts of acid producing bacteria, sulfate reducing bacteria, and general heterotrophic bacteria compared with the 5 ppm bleach alone treatment. The 10 ppm bleach alone treatment showed a similar efficacy with the sequential treatment of 5 ppm bleach followed by 50 ppm D8. The efficacy of D8 was found better than that of D4 (an equimass mixture of D-methionine, D-leucine, D-tyrosine, and D-tryptophan) in the enhancement of the three individual biocides against the biofilm consortium. PMID:28861053

  14. Sequential lineup presentation promotes less-biased criterion setting but does not improve discriminability.

    PubMed

    Palmer, Matthew A; Brewer, Neil

    2012-06-01

    When compared with simultaneous lineup presentation, sequential presentation has been shown to reduce false identifications to a greater extent than it reduces correct identifications. However, there has been much debate about whether this difference in identification performance represents improved discriminability or more conservative responding. In this research, data from 22 experiments that compared sequential and simultaneous lineups were analyzed using a compound signal-detection model, which is specifically designed to describe decision-making performance on tasks such as eyewitness identification tests. Sequential (cf. simultaneous) presentation did not influence discriminability, but produced a conservative shift in response bias that resulted in less-biased choosing for sequential than simultaneous lineups. These results inform understanding of the effects of lineup presentation mode on eyewitness identification decisions.
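
    In basic equal-variance signal detection terms, which underlie (though do not exhaust) the compound model used in this analysis, discriminability and response bias are separated as

        \[
        d' = z(H) - z(F), \qquad c = -\tfrac{1}{2}\,\bigl[z(H) + z(F)\bigr],
        \]

    where \(H\) and \(F\) are the hit rate (correct identifications from target-present lineups) and false-alarm rate (identifications from target-absent lineups) and \(z\) is the inverse standard normal. The finding above corresponds to sequential presentation leaving \(d'\) essentially unchanged while shifting the criterion \(c\) in the conservative direction.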

  15. 40 CFR 63.1413 - Compliance demonstration procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... liquids, other than fuels, received by the control device. (i) For a scrubber, the design evaluation shall... scrubbing liquid. The design evaluation shall establish the design exhaust vent stream organic compound... process vent and the overall percent reduction for the collection of non-reactor batch process vents...

  16. Design of batch audio/video conversion platform based on JavaEE

    NASA Astrophysics Data System (ADS)

    Cui, Yansong; Jiang, Lianpin

    2018-03-01

    With the rapid development of the digital publishing industry, audio/video publishing is characterized by a diversity of coding standards for audio and video files, massive data volumes, and other significant features. Faced with massive and diverse data, converting files quickly and efficiently to a unified coding format has brought great difficulties to digital publishing organizations. In view of this demand and present situation, this paper proposes a distributed online audio and video format conversion platform with a B/S structure, based on the Spring+SpringMVC+Mybatis development architecture and combined with the open-source FFMPEG format conversion tool. Based on the Java language, the key technologies and strategies in the platform architecture design are analyzed, and an efficient audio and video format conversion system is designed and developed, composed of a front display system, a core scheduling server, and conversion servers. The test results show that, compared with an ordinary audio and video conversion scheme, the batch audio and video format conversion platform can effectively improve the conversion efficiency of audio and video files and reduce the complexity of the work. Practice has proved that the key technology discussed in this paper can be applied in the field of large-batch file processing and has practical application value.
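
    Independently of the JavaEE stack described above, the job handed to a conversion server ultimately reduces to invoking the FFMPEG command-line tool on each queued file. The sketch below shows that call pattern with a small thread pool; Python is used here only for brevity, ffmpeg is assumed to be installed and on the PATH, and the directory names, codecs and worker count are illustrative.

        import subprocess
        from concurrent.futures import ThreadPoolExecutor
        from pathlib import Path

        # Minimal stand-in for the conversion-server role: every file in an input
        # directory is transcoded to H.264/AAC MP4 by calling the ffmpeg CLI.
        def convert(src: Path, out_dir: Path) -> int:
            dst = out_dir / (src.stem + ".mp4")
            cmd = ["ffmpeg", "-y", "-i", str(src), "-c:v", "libx264", "-c:a", "aac", str(dst)]
            return subprocess.run(cmd, capture_output=True).returncode

        def convert_batch(in_dir="incoming", out_dir="converted", workers=4):
            out = Path(out_dir)
            out.mkdir(exist_ok=True)
            files = [p for p in Path(in_dir).iterdir() if p.is_file()]
            with ThreadPoolExecutor(max_workers=workers) as pool:
                return dict(zip(files, pool.map(lambda p: convert(p, out), files)))

        if __name__ == "__main__":
            print(convert_batch())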

  17. Bacteriophage PRD1 batch experiments to study attachment, detachment and inactivation processes.

    PubMed

    Sadeghi, Gholamreza; Schijven, Jack F; Behrends, Thilo; Hassanizadeh, S Majid; van Genuchten, Martinus Th

    2013-09-01

    Knowledge of virus removal in subsurface environments is pivotal for assessing the risk of viral contamination of water resources and developing appropriate protection measures. Columns packed with sand are frequently used to quantify attachment, detachment and inactivation rates of viruses. Since column transport experiments are very laborious, a common alternative is to perform batch experiments where usually one or two measurements are done assuming equilibrium is reached. It is also possible to perform kinetic batch experiments. In that case, however, it is necessary to monitor changes in the concentration with time. This means that kinetic batch experiments will be almost as laborious as column experiments. Moreover, attachment and detachment rate coefficients derived from batch experiments may differ from those determined using column experiments. The aim of this study was to determine the utility of kinetic batch experiments and investigate the effects of different designs of the batch experiments on estimated attachment, detachment and inactivation rate coefficients. The experiments involved various combinations of container size, sand-water ratio, and mixing method (i.e., rolling or tumbling by pivoting the tubes around their horizontal or vertical axes, respectively). Batch experiments were conducted with clean quartz sand, water at pH 7 and ionic strength of 20 mM, and using the bacteriophage PRD1 as a model virus. Values of attachment, detachment and inactivation rate coefficients were found by fitting an analytical solution of the kinetic model equations to the data. Attachment rate coefficients were found to be systematically higher under tumbling than under rolling conditions because of better mixing and more efficient contact of phages with the surfaces of the sand grains. In both mixing methods, more sand in the container yielded higher attachment rate coefficients. A linear increase in the detachment rate coefficient was observed with increased solid-water ratio using the tumbling method. Given the differences in the attachment rate coefficients, and assuming the same sticking efficiencies since chemical conditions of the batch and column experiments were the same, our results show that collision efficiencies of batch experiments are not the same as those of column experiments. Upscaling of the attachment rate from batch to column experiments hence requires proper understanding of the mixing conditions. Because batch experiments, in which the kinetics are monitored, are as laborious as column experiments, there seems to be no major advantage in performing batch instead of column experiments. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Production of pullulan by a thermotolerant aureobasidium pullulans strain in non-stirred fed batch fermentation process.

    PubMed

    Singh, Ranjan; Gaur, Rajeeva; Tiwari, Soni; Gaur, Manogya Kumar

    2012-07-01

    A total of 95 Aureobasidium pullulans isolates were obtained from different flower and leaf samples, of which 11 thermotolerant strains produced pullulan. One thermotolerant, non-melanin-producing pullulan producer, designated RG-5, produced the highest pullulan yield (37.1±1.0 g/l) at 42 °C and pH 5.5 within 48 h of incubation with 3% sucrose and 0.5% ammonium sulphate in a non-stirred fed-batch fermentor of 6 liters capacity. The initial two liters of fermentation medium were supplemented with a further two liters, fed into the fermentor in two successive batches at 5-h intervals. Sterile air was supplied only for 10 h at a rate of 0.5 vvm.

  19. Evaluation of a kinetic model for computer simulation of growth and fermentation by Scheffersomyces (Pichia) stipitis fed D-xylose.

    PubMed

    Slininger, P J; Dien, B S; Lomont, J M; Bothast, R J; Ladisch, M R; Okos, M R

    2014-08-01

    Scheffersomyces (formerly Pichia) stipitis is a potential biocatalyst for converting lignocelluloses to ethanol because the yeast natively ferments xylose. An unstructured kinetic model based upon a system of linear differential equations has been formulated that describes growth and ethanol production as functions of ethanol, oxygen, and xylose concentrations for both growth and fermentation stages. The model was validated for various growth conditions including batch, cell recycle, batch with in situ ethanol removal and fed-batch. The model provides a summary of basic physiological yeast properties and is an important tool for simulating and optimizing various culture conditions and evaluating various bioreactor designs for ethanol production. © 2014 Wiley Periodicals, Inc.
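
    The paper's validated rate expressions and fitted parameters are not reproduced here, but the general shape of such an unstructured batch model can be sketched with a generic Monod-type system with linear ethanol inhibition; all parameter values below are invented, and the real model also covers oxygen effects, cell recycle, in situ ethanol removal and fed-batch operation.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Generic unstructured batch model: Monod growth on xylose, linear ethanol
        # inhibition, growth-associated ethanol formation. Illustrative values only.
        MU_MAX, KS, P_MAX = 0.25, 2.0, 60.0     # 1/h, g/L, g/L
        YXS, YPS = 0.40, 0.35                   # g biomass/g xylose, g ethanol/g xylose

        def rhs(t, y):
            x, s, p = y                         # biomass, xylose, ethanol (g/L)
            mu = MU_MAX * (s / (KS + s)) * max(1.0 - p / P_MAX, 0.0)
            return [mu * x, -mu * x / YXS, (YPS / YXS) * mu * x]

        sol = solve_ivp(rhs, (0.0, 72.0), [0.5, 50.0, 0.0], max_step=0.5)
        print("final ethanol:", round(float(sol.y[2, -1]), 1), "g/L")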

  20. SEQUENTIAL EXTRACTIONS FOR PARTITIONING OF ARSENIC ON HYDROUS IRON OXIDES AND IRON SULFIDES

    EPA Science Inventory

    The objective of this study was to use model solids to test solutions designed to extract arsenic from relatively labile solid phase fractions. The use of sequential extractions provides analytical constraints on the identification of mineral phases that control arsenic mobility...

  1. THRESHOLD ELEMENTS AND THE DESIGN OF SEQUENTIAL SWITCHING NETWORKS.

    DTIC Science & Technology

    The report covers research performed from March 1966 to March 1967. The major topics treated are: (1) methods for finding weight-threshold vectors...that realize a given switching function in multi-threshold linear logic; (2) synthesis of sequential machines by means of shift registers and simple

  2. BIOLAB experiment development status 2005

    NASA Astrophysics Data System (ADS)

    Brinckmann, Enno; Manieri, Pierfilippo

    2005-08-01

    BIOLAB, ESA's major facility for biological Space research on the International Space Station (ISS), will accommodate the first two batches of experiments after its launch with the "Columbus" Laboratory (spring 2007). Seven experiments have been selected for development: three of the first batch have concluded Phase A/B with the testing of the breadboards, in which the main functions of the scientific studies can be simulated and defined for further inputs to the final design of the experiment hardware. The biological specimens of the first batch are scorpions, plant seedlings, bacteria suspensions and cell cultures of mammalian and invertebrate origin. The experiment protocols request demanding resources ranging from life support for the entire mission (90 days) to skilled crew operations and transport/storage in deep freezers. Even more sophisticated experiments are in preparation for the second batch, dealing with various cell culture systems. This presentation gives an overview about the experiment development status, whilst the science background and breadboard test results will be presented by the respective experiment teams.

  3. Sequencing batch-reactor control using Gaussian-process models.

    PubMed

    Kocijan, Juš; Hvala, Nadja

    2013-06-01

    This paper presents a Gaussian-process (GP) model for the design of sequencing batch-reactor (SBR) control for wastewater treatment. The GP model is a probabilistic, nonparametric model with uncertainty predictions. In the case of SBR control, it is used for the on-line optimisation of the batch-phases duration. The control algorithm follows the course of the indirect process variables (pH, redox potential and dissolved oxygen concentration) and recognises the characteristic patterns in their time profile. The control algorithm uses GP-based regression to smooth the signals and GP-based classification for the pattern recognition. When tested on the signals from an SBR laboratory pilot plant, the control algorithm provided a satisfactory agreement between the proposed completion times and the actual termination times of the biodegradation processes. In a set of tested batches the final ammonia and nitrate concentrations were below 1 and 0.5 mg L(-1), respectively, while the aeration time was shortened considerably. Copyright © 2013 Elsevier Ltd. All rights reserved.
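
    Only the regression half of the scheme is sketched below: a GP is fitted to a synthetic pH record so that a smoothed trajectory and an uncertainty band are available to a downstream pattern classifier. The kernel choice, the synthetic signal and the use of scikit-learn are assumptions for illustration, not the authors' implementation.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # Fit a GP to a noisy, synthetic pH record from one batch phase and read off
        # the smoothed value and a 95% band at a point of interest.
        rng = np.random.default_rng(3)
        t = np.linspace(0, 4, 80)[:, None]                 # hours into the phase
        ph = (7.6 - 0.35 * t.ravel() + 0.2 * np.tanh(4.0 * (t.ravel() - 2.5))
              + 0.03 * rng.standard_normal(t.shape[0]))    # synthetic pH profile

        gp = GaussianProcessRegressor(kernel=1.0 * RBF(0.5) + WhiteKernel(1e-3),
                                      normalize_y=True)
        gp.fit(t, ph)
        mean, std = gp.predict(t, return_std=True)
        print("smoothed pH near t = 2 h:", round(float(mean[40]), 2),
              "+/-", round(float(2 * std[40]), 2))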

  4. Autochthonous bioaugmentation with environmental samples rich in hydrocarbonoclastic bacteria for bench-scale bioremediation of oily seawater and desert soil.

    PubMed

    Ali, Nedaa; Dashti, Narjes; Salamah, Samar; Al-Awadhi, Husain; Sorkhoh, Naser; Radwan, Samir

    2016-05-01

    Oil-contaminated seawater and desert soil batches were bioaugmented with suspensions of pea (Pisum sativum) rhizosphere and soil with long history of oil pollution. Oil consumption was measured by gas-liquid chromatography. Hydrocarbonoclastic bacteria in the bioremediation batches were counted using a mineral medium with oil vapor as a sole carbon source and characterized by their 16S ribosomal RNA (rRNA)-gene sequences. Most of the oil was consumed during the first 2-4 months, and the oil-removal rate decreased or ceased thereafter due to nutrient and oxygen depletion. Supplying the batches with NaNO3 (nitrogen fertilization) at a late phase of bioremediation resulted in reenhanced oil consumption and bacterial growth. In the seawater batches bioaugmented with rhizospheric suspension, the autochthonous rhizospheric bacterial species Microbacterium oxidans and Rhodococcus spp. were established and contributed to oil-removal. The rhizosphere-bioaugmented soil batches selectively favored Arthrobacter nitroguajacolicus, Caulobacter segnis, and Ensifer adherens. In seawater batches bioaugmented with long-contaminated soil, the predominant oil-removing bacterium was the marine species Marinobacter hydrocarbonoclasticus. In soil batches on the other hand, the autochthonous inhabitants of the long-contaminated soil, Pseudomonas and Massilia species were established and contributed to oil removal. It was concluded that the use of rhizospheric bacteria for inoculating seawater and desert soil and of bacteria in long-contaminated soil for inoculating desert soil follows the concept of "autochthonous bioaugmentation." Inoculating seawater with bacteria in long-contaminated soil, on the other hand, merits the designation "allochthonous bioaugmentation."

  5. Graphics Flutter Analysis Methods, an interactive computing system at Lockheed-California Company

    NASA Technical Reports Server (NTRS)

    Radovcich, N. A.

    1975-01-01

    An interactive computer graphics system, Graphics Flutter Analysis Methods (GFAM), was developed to complement FAMAS, a matrix-oriented batch computing system, and other computer programs in performing complex numerical calculations using a fully integrated data management system. GFAM has many of the matrix operation capabilities found in FAMAS, but on a smaller scale, and is utilized when the analysis requires a high degree of interaction between the engineer and computer, and schedule constraints exclude the use of batch entry programs. Applications of GFAM to a variety of preliminary design, development design, and project modification programs suggest that interactive flutter analysis using matrix representations is a feasible and cost effective computing tool.

  6. Continuous flow operation with appropriately adjusting composites in influent for recovery of Cr(VI), Cu(II) and Cd(II) in self-driven MFC-MEC system.

    PubMed

    Li, Ming; Pan, Yuzhen; Huang, Liping; Zhang, Yong; Yang, Jinhui

    2017-03-01

    A self-driven microbial fuel cell (MFC) - microbial electrolysis cell (MEC) system, in which electricity generated by the MFCs is utilized in situ to power the MECs, has previously been reported for recovering Cr(VI), Cu(II) and Cd(II) with the individual metals fed to different units of the system in batch operation. Here the system was advanced by treating synthetic mixed-metal solutions with appropriately adjusted compositions in fed-batch and continuous flow operation to completely separate Cr(VI), Cu(II) and Cd(II) from each other. Under optimal conditions (a hydraulic residence time of 4 h, two serially connected MFCs matched with one MEC, and a feed composition of either 5 mg L -1 Cr(VI), 1 mg L -1 Cu(II) and 5 mg L -1 Cd(II), or 1 mg L -1 Cr(VI), 5 mg L -1 Cu(II) and 5 mg L -1 Cd(II)), the self-driven MFC-MEC system can completely and sequentially recover Cu(II), Cr(VI) and Cd(II) from the mixed metals. This study provides a truly sustainable, zero-energy-consumption approach for using bioelectrochemical systems to completely recover Cr(VI), Cu(II) and Cd(II) and separate them from each other or from wastes and contaminated sites.

  7. Decolorization of palm oil mill effluent using growing cultures of Curvularia clavata.

    PubMed

    Neoh, Chin Hong; Lam, Chi Yong; Lim, Chi Kim; Yahya, Adibah; Ibrahim, Zaharah

    2014-03-01

    Colored agricultural wastewaters are of environmental and health concern because colored effluents can produce toxic and carcinogenic by-products. In this study, batch culture optimization using response surface methods indicated that Curvularia clavata, a fungus isolated from pineapple solid waste, was able to decolorize sterile palm oil mill effluent (POME), whose color is mainly associated with polyphenols and lignin. Results showed successful decolorization of POME of up to 80% (initial ADMI [American Dye Manufacturing Index] of 3,793), with 54% contributed by biosorption and 46% by biodegradation after 5 days of treatment. Analysis using HPLC and GC-MS showed the degradation of color-causing compounds such as 3-methoxyphenyl isothiocyanate and the production of new metabolites. An ecotoxicity test indicated that the decolorized effluent is safe for discharge. To determine the longevity of the fungus over a prolonged decolorization period, sequential batch decolorization studies were carried out. The results showed that lignin peroxidase and laccase were the main ligninolytic enzymes involved in the degradation of color. Carboxymethyl cellulase (CMCase) and xylanase activities were also detected, suggesting possible roles of these enzymes in promoting growth of the fungus, which consequently contributed to improved decolorization of POME. In conclusion, the ability of C. clavata to treat the color of POME indicates that C. clavata is of potential use for the decolorization and degradation of agricultural wastewaters containing polyphenolic compounds.

  8. Coupled solar photo-Fenton and biological treatment for the degradation of diuron and linuron herbicides at pilot scale.

    PubMed

    Farré, Maria José; Maldonado, Manuel Ignacio; Gernjak, Wolfgang; Oller, Isabel; Malato, Sixto; Domènech, Xavier; Peral, José

    2008-06-01

    A coupled solar photo-Fenton (chemical) and biological treatment has been used to remove the biorecalcitrant herbicides diuron (42 mg l(-1)) and linuron (75 mg l(-1)) from water at pilot plant scale. The chemical process has been carried out in an 82 l solar pilot plant made up of four compound parabolic collector units, and it was followed by a biological treatment performed in a 40 l sequencing batch reactor. Two Fe(II) doses (2 and 5 mg l(-1)) and sequential additions of H2O2 (20 mg l(-1)) have been used to chemically degrade the initially polluted effluent. Next, biodegradability at different oxidation states has been assessed by means of the BOD/COD ratio. A reagent dose of Fe = 5 mg l(-1) and H2O2 = 100 mg l(-1) was required to obtain a biodegradable effluent after 100 min of irradiation time. Finally, the organic content of the photo-treated solution has been completely assimilated by a biomass consortium in the sequencing batch reactor using a total suspended solids concentration of 0.2 g l(-1) and a hydraulic retention time of 24 h. The data obtained at pilot plant scale (especially those corresponding to the chemical step) have been compared with previously published data from a similar system operating at laboratory scale.

  9. Sucrose purification and repeated ethanol production from sugars remaining in sweet sorghum juice subjected to a membrane separation process.

    PubMed

    Sasaki, Kengo; Tsuge, Yota; Kawaguchi, Hideo; Yasukawa, Masahiro; Sasaki, Daisuke; Sazuka, Takashi; Kamio, Eiji; Ogino, Chiaki; Matsuyama, Hideto; Kondo, Akihiko

    2017-08-01

    The juice from sweet sorghum cultivar SIL-05 (harvested at physiological maturity) was extracted, and the component sucrose and reducing sugars (such as glucose and fructose) were subjected to a membrane separation process to purify the sucrose for subsequent sugar refining and to obtain a feedstock for repeated bioethanol production. Nanofiltration (NF) of an ultrafiltration (UF) permeate using an NTR-7450 membrane (Nitto Denko Corporation, Osaka, Japan) concentrated the juice and produced a sucrose-rich fraction (143.2 g L -1 sucrose, 8.5 g L -1 glucose, and 4.5 g L -1 fructose). In addition, the above NF permeate was concentrated using an ESNA3 NF membrane to provide concentrated permeated sugars (227.9 g L -1 ) and capture various amino acids in the juice, enabling subsequent ethanol fermentation without the addition of an exogenous nitrogen source. Sequential batch fermentation using the ESNA3 membrane concentrate provided an ethanol titer and theoretical ethanol yield of 102.5-109.5 g L -1 and 84.4-89.6%, respectively, throughout the five-cycle batch fermentation by Saccharomyces cerevisiae BY4741. Our results demonstrate that a membrane process using UF and two types of NF membranes has the potential to allow sucrose purification and repeated bioethanol production.

  10. A preliminary evaluation of a reusable digital sterilization indicator prototype.

    PubMed

    Puttaiah, R; Griggs, J; D'Onofrio, M

    2014-09-01

    Critical and semicritical instruments used in patient care must undergo a terminal sterilization process. Chemical and physical indicators are important in providing information on the sterilizer's performance during each cycle, and periodic monitoring of sterilizers with biological indicators is necessary to validate their performance. Data loggers, or independent digital parametric indicators, are innovative devices that provide more information than the various classes of chemical indicators. In this study we evaluated a prototype of an independent digital parametric indicator for use in autoclaves. The purpose of the study was to evaluate the performance of an independent digital indicator/data logger prototype (DS1922F) that could be used for multiple cycles within an autoclave. Materials and methods: Three batches of the DS1922F (150 samples each) were used in this study, which was conducted as a series. The first batch was challenged with 300 sterilization cycles in an autoclave, and the data loggers were evaluated to study failures and their causes, make corrections, and improve the prototype design. After changes based on the first batch, the second batch of the prototype (150 samples) was again challenged with 300 sterilization cycles in an autoclave, and failures were studied again to further improve the prototype. The final (third) batch of the prototype (150 samples) was challenged with 600 cycles to see how long the devices would last. Kaplan-Meier survival analyses of all three batches were conducted (α = 0.05), and failed samples were studied qualitatively to understand the variables involved in prototype failure and to improve quality. Each tested batch provided crucial information on device failure and helped improve the prototype. The mean lifetime of the final batch (Batch 3) of the prototype was 498 (480, 516) sterilization cycles in an autoclave. In this study, the final batch of the DS1922F prototype data logger was found to be robust in withstanding the challenge of 600 autoclave cycles, with a mean lifetime of more than 450 cycles, several times the prescribed number of cycles. Instrument reprocessing is among the important aspects of infection control. While stringent instrument-reprocessing procedures are followed within the clinic to assure patient safety, regular use of sterilization process indicators and periodic biological validation of the sterilizer's performance are necessary. Chemical indicators for autoclaves indicate whether a particular cycle's parameters were achieved but do not show at what specific point in time, or at what temperature, a failure occurred. Data loggers and associated reader software, such as the prototype tested in this evaluation (DS1922F), are designed to provide continuous information on the time and temperature of the prescribed cycle. Data loggers provide immediate information on the process, as opposed to biological indicators, which take from days to a week to give a confirmatory result. Further, many countries do not have the sterilization monitoring service infrastructure to meet the demands of end users. In the absence of sterilization monitoring services, the use of digital data loggers for each sterilization cycle is more pragmatic.

  11. Color Breakup In Sequentially-Scanned LC Displays

    NASA Technical Reports Server (NTRS)

    Arend, L.; Lubin, J.; Gille, J.; Larimer, J.; Statler, Irving C. (Technical Monitor)

    1994-01-01

    In sequentially-scanned liquid-crystal displays, the chromatic components of color pixels are distributed in time. For such displays, eye, head, display, and image-object movements can cause the individual color elements to be visible. We analyze conditions (scan designs, types of eye movement) likely to produce color breakup.

  12. Sequential Requests and the Problem of Message Sampling.

    ERIC Educational Resources Information Center

    Cantrill, James Gerard

    S. Jackson and S. Jacobs's criticism of "single message" designs in communication research served as a framework for a study that examined the differences between various sequential request paradigms. The study sought to answer the following questions: (1) What were the most naturalistic request sequences assured to replicate…

  13. The Motivating Language of Principals: A Sequential Transformative Strategy

    ERIC Educational Resources Information Center

    Holmes, William Tobias

    2012-01-01

    This study implemented a Sequential Transformative Mixed Methods design with teachers (as recipients) and principals (to give voice) in the examination of principal talk in two different school accountability contexts (Continuously Improving and Continuously Zigzag) using the conceptual framework of Motivating Language Theory. In phase one,…

  14. Estimation and filtering techniques for high-accuracy GPS applications

    NASA Technical Reports Server (NTRS)

    Lichten, S. M.

    1989-01-01

    Techniques for determination of very precise orbits for satellites of the Global Positioning System (GPS) are currently being studied and demonstrated. These techniques can be used to make cm-accurate measurements of station locations relative to the geocenter, monitor earth orientation over timescales of hours, and provide tropospheric and clock delay calibrations during observations made with deep space radio antennas at sites where the GPS receivers have been collocated. For high-earth orbiters, meter-level knowledge of position will be available from GPS, while at low altitudes, sub-decimeter accuracy will be possible. Estimation of satellite orbits and other parameters such as ground station positions is carried out with a multi-satellite batch sequential pseudo-epoch state process noise filter. Both square-root information filtering (SRIF) and UD-factorized covariance filtering formulations are implemented in the software.
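
    The filter mentioned above combines batch and sequential processing of tracking data. As a hedged illustration of the underlying principle only (not the SRIF or UD-factorized implementations described in the abstract), the Python sketch below runs a plain recursive least-squares update that ingests measurements one at a time and converges to the batch least-squares estimate.

    ```python
    import numpy as np

    def sequential_least_squares(H, y, P0, x0):
        """Process scalar measurements one at a time (unit measurement noise assumed)."""
        x, P = x0.copy(), P0.copy()
        for h, z in zip(H, y):
            h = h.reshape(1, -1)
            S = h @ P @ h.T + 1.0          # innovation variance
            K = P @ h.T / S                # gain
            x = x + (K * (z - h @ x)).ravel()
            P = P - K @ h @ P
        return x, P

    rng = np.random.default_rng(1)
    H = rng.normal(size=(200, 3))          # measurement partials
    x_true = np.array([1.0, -2.0, 0.5])
    y = H @ x_true + rng.normal(scale=0.1, size=200)

    x_seq, _ = sequential_least_squares(H, y, P0=1e3 * np.eye(3), x0=np.zeros(3))
    x_batch, *_ = np.linalg.lstsq(H, y, rcond=None)
    print(np.allclose(x_seq, x_batch, atol=1e-2))   # sequential estimate ≈ batch estimate
    ```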

  15. Robust iterative learning control for multi-phase batch processes: an average dwell-time method with 2D convergence indexes

    NASA Astrophysics Data System (ADS)

    Wang, Limin; Shen, Yiteng; Yu, Jingxian; Li, Ping; Zhang, Ridong; Gao, Furong

    2018-01-01

    In order to cope with system disturbances in multi-phase batch processes with different dimensions, a hybrid robust control scheme of iterative learning control combined with feedback control is proposed in this paper. First, with a hybrid iterative learning control law designed by introducing the state error, the tracking error and the extended information, the multi-phase batch process is converted into a two-dimensional Fornasini-Marchesini (2D-FM) switched system with different dimensions. Second, a switching signal is designed using the average dwell-time method integrated with the related switching conditions to give sufficient conditions ensuring stable operation of the system. Finally, the minimum running time of the subsystems and the control law gains are calculated by solving the linear matrix inequalities. Meanwhile, a compound 2D controller with robust performance is obtained, which includes a robust extended feedback control that ensures rapid convergence of the steady-state tracking error. The application to an injection molding process displays the effectiveness and superiority of the proposed strategy.
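
    As a much-simplified illustration of the learning idea only (a toy P-type iterative learning controller applied to an assumed stable first-order plant, not the 2D-FM switched-system controller of the paper), the sketch below shows the batch-to-batch tracking error shrinking as the input profile is refined.

    ```python
    import numpy as np

    # Toy plant x(t+1) = a*x(t) + b*u(t), output y = x (assumed model, not from the paper).
    a, b, T = 0.5, 1.0, 50
    y_ref = np.sin(np.linspace(0, np.pi, T))    # reference trajectory for every batch

    def run_batch(u):
        x, y = 0.0, np.zeros(T)
        for t in range(T):
            x = a * x + b * u[t]
            y[t] = x
        return y

    u = np.zeros(T)                             # input profile, refined batch to batch
    L = 0.5                                     # learning gain, chosen so |1 - L*b| < 1
    for k in range(31):
        e = y_ref - run_batch(u)
        if k % 10 == 0:
            print(f"batch {k:2d}  max |tracking error| = {np.abs(e).max():.4f}")
        u += L * e                              # P-type ILC update: u_{k+1} = u_k + L*e_k
    ```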

  16. Formulation and Optimization of Multiparticulate Drug Delivery System Approach for High Drug Loading.

    PubMed

    Shah, Neha; Mehta, Tejal; Gohel, Mukesh

    2017-08-01

    The aim of the present work was to develop and optimize a multiparticulate formulation, viz. pellets of naproxen, by employing a QbD and risk assessment approach. A mixture design with extreme vertices was applied to a formulation with high drug loading (about 90%), with extrusion-spheronization as the pellet manufacturing process. The independent variables chosen were the levels of microcrystalline cellulose (MCC) (X1), polyvinylpyrrolidone K-90 (PVP K-90) (X2), croscarmellose sodium (CCS) (X3), and polacrilin potassium (PP) (X4). The dependent variables considered were disintegration time (DT) (Y1), sphericity (Y2), and percent drug release (Y3). The formulation was optimized based on the batches generated by MiniTab 17 software. The batch with the maximum composite desirability (0.98) proved to be optimum. From the evaluation of the design batches, it was observed that, even at low levels of variation, the excipients affect the pelletization properties of the blend and also the final drug release. In conclusion, pellets with high drug loading can be effectively manufactured and optimized systematically using the QbD approach.
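
    The optimum batch above was chosen by composite desirability. As a toy illustration (the target windows below are invented, not the paper's specifications), the sketch computes individual desirabilities for the three responses and combines them into a geometric-mean composite score.

    ```python
    import numpy as np

    def desirability_smaller_is_better(y, target, upper):
        """d = 1 at or below the target, 0 at or above the upper limit, linear in between."""
        return float(np.clip((upper - y) / (upper - target), 0.0, 1.0))

    def desirability_larger_is_better(y, lower, target):
        """d = 0 at or below the lower limit, 1 at or above the target."""
        return float(np.clip((y - lower) / (target - lower), 0.0, 1.0))

    def composite(ds):
        """Geometric mean of the individual desirabilities."""
        return float(np.prod(ds) ** (1.0 / len(ds)))

    # Hypothetical responses for one pellet batch: DT (min), sphericity, % drug release.
    d1 = desirability_smaller_is_better(y=4.0, target=3.0, upper=15.0)    # disintegration time
    d2 = desirability_larger_is_better(y=0.95, lower=0.80, target=1.00)   # sphericity
    d3 = desirability_larger_is_better(y=92.0, lower=70.0, target=95.0)   # drug release
    print(round(composite([d1, d2, d3]), 3))
    ```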

  17. Use of response surface methodology in a fed-batch process for optimization of tricarboxylic acid cycle intermediates to achieve high levels of canthaxanthin from Dietzia natronolimnaea HS-1.

    PubMed

    Nasri Nasrabadi, Mohammad Reza; Razavi, Seyed Hadi

    2010-04-01

    In this work, we applied statistical experimental design to a fed-batch process for optimization of tricarboxylic acid (TCA) cycle intermediates in order to achieve high-level production of canthaxanthin from Dietzia natronolimnaea HS-1 cultured in beet molasses. A fractional factorial design (screening test) was first conducted on five TCA cycle intermediates. Of the five TCA cycle intermediates investigated via screening tests, alpha-ketoglutarate, oxaloacetate and succinate were selected based on their statistically significant (P<0.05) and positive effects on canthaxanthin production. These significant factors were optimized by means of response surface methodology (RSM) in order to achieve high-level production of canthaxanthin. The experimental results of the RSM were fitted with a second-order polynomial equation by means of a multiple regression technique to identify the relationship between canthaxanthin production and the three TCA cycle intermediates. By means of this statistical design under a fed-batch process, the optimum conditions required to achieve the highest level of canthaxanthin (13,172 ± 25 μg l⁻¹) were determined as follows: alpha-ketoglutarate, 9.69 mM; oxaloacetate, 8.68 mM; succinate, 8.51 mM. Copyright 2009 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
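
    The RSM step above fits a second-order polynomial to the response by multiple regression. A minimal sketch of that kind of fit, using invented response data for the three intermediates rather than the study's measurements, could look like this:

    ```python
    import numpy as np

    def quadratic_design_matrix(X):
        """Full second-order model: intercept, linear, squared and interaction terms."""
        x1, x2, x3 = X.T
        return np.column_stack([
            np.ones(len(X)),
            x1, x2, x3,
            x1**2, x2**2, x3**2,
            x1*x2, x1*x3, x2*x3,
        ])

    rng = np.random.default_rng(2)
    X = rng.uniform(2.0, 12.0, size=(30, 3))       # mM levels of the three intermediates
    # Hypothetical response surface with an interior optimum near 9 mM for each factor.
    y = 13000 - 40 * ((X - 9.0) ** 2).sum(axis=1) + rng.normal(0, 50, 30)

    beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
    y_hat = quadratic_design_matrix(X) @ beta
    print("R^2 =", round(1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum(), 3))
    ```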

  18. The Obstacles for the Teaching of 8th Grade TR History of Revolution and Kemalism Course According to the Constructivist Approach (An Example of Exploratory Sequential Mixed Method Design)

    ERIC Educational Resources Information Center

    Karademir, Yavuz; Demir, Selcuk Besir

    2015-01-01

    The aim of this study is to ascertain the problems social studies teachers face in teaching the topics covered in the 8th grade TRHRK Course. The study was conducted in line with the explanatory sequential mixed methods design, one of the mixed research methods. The study involves three phases. In the first step, exploratory process…

  19. A high level language for a high performance computer

    NASA Technical Reports Server (NTRS)

    Perrott, R. H.

    1978-01-01

    The proposed computational aerodynamic facility will join the ranks of the supercomputers due to its architecture and increased execution speed. At present, the languages used to program these supercomputers are modifications of programming languages designed many years ago for sequential machines. A new programming language should be developed, based on the techniques which have proved valuable for sequential programming languages and incorporating the algorithmic techniques required for these supercomputers. The design objectives for such a language are outlined.

  20. Group sequential designs for stepped-wedge cluster randomised trials

    PubMed Central

    Grayling, Michael J; Wason, James MS; Mander, Adrian P

    2017-01-01

    Background/Aims: The stepped-wedge cluster randomised trial design has received substantial attention in recent years. Although various extensions to the original design have been proposed, no guidance is available on the design of stepped-wedge cluster randomised trials with interim analyses. In an individually randomised trial setting, group sequential methods can provide notable efficiency gains and ethical benefits. We address this by discussing how established group sequential methodology can be adapted for stepped-wedge designs. Methods: Utilising the error spending approach to group sequential trial design, we detail the assumptions required for the determination of stepped-wedge cluster randomised trials with interim analyses. We consider early stopping for efficacy, futility, or efficacy and futility. We describe first how this can be done for any specified linear mixed model for data analysis. We then focus on one particular commonly utilised model and, using a recently completed stepped-wedge cluster randomised trial, compare the performance of several designs with interim analyses to the classical stepped-wedge design. Finally, the performance of a quantile substitution procedure for dealing with the case of unknown variance is explored. Results: We demonstrate that the incorporation of early stopping in stepped-wedge cluster randomised trial designs could reduce the expected sample size under the null and alternative hypotheses by up to 31% and 22%, respectively, with no cost to the trial’s type-I and type-II error rates. The use of restricted error maximum likelihood estimation was found to be more important than quantile substitution for controlling the type-I error rate. Conclusion: The addition of interim analyses into stepped-wedge cluster randomised trials could help guard against time-consuming trials conducted on poor performing treatments and also help expedite the implementation of efficacious treatments. In future, trialists should consider incorporating early stopping of some kind into stepped-wedge cluster randomised trials according to the needs of the particular trial. PMID:28653550

  1. Group sequential designs for stepped-wedge cluster randomised trials.

    PubMed

    Grayling, Michael J; Wason, James Ms; Mander, Adrian P

    2017-10-01

    The stepped-wedge cluster randomised trial design has received substantial attention in recent years. Although various extensions to the original design have been proposed, no guidance is available on the design of stepped-wedge cluster randomised trials with interim analyses. In an individually randomised trial setting, group sequential methods can provide notable efficiency gains and ethical benefits. We address this by discussing how established group sequential methodology can be adapted for stepped-wedge designs. Utilising the error spending approach to group sequential trial design, we detail the assumptions required for the determination of stepped-wedge cluster randomised trials with interim analyses. We consider early stopping for efficacy, futility, or efficacy and futility. We describe first how this can be done for any specified linear mixed model for data analysis. We then focus on one particular commonly utilised model and, using a recently completed stepped-wedge cluster randomised trial, compare the performance of several designs with interim analyses to the classical stepped-wedge design. Finally, the performance of a quantile substitution procedure for dealing with the case of unknown variance is explored. We demonstrate that the incorporation of early stopping in stepped-wedge cluster randomised trial designs could reduce the expected sample size under the null and alternative hypotheses by up to 31% and 22%, respectively, with no cost to the trial's type-I and type-II error rates. The use of restricted error maximum likelihood estimation was found to be more important than quantile substitution for controlling the type-I error rate. The addition of interim analyses into stepped-wedge cluster randomised trials could help guard against time-consuming trials conducted on poor performing treatments and also help expedite the implementation of efficacious treatments. In future, trialists should consider incorporating early stopping of some kind into stepped-wedge cluster randomised trials according to the needs of the particular trial.
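
    Both records above rely on the error-spending approach to group sequential design. As a rough, simulation-based illustration (dedicated software would be used in practice, and none of the numbers below come from the trial discussed), the Python sketch spends a one-sided type-I error of 0.025 across three equally spaced looks with an O'Brien-Fleming-type spending function and calibrates the efficacy boundaries by Monte Carlo under the null.

    ```python
    import numpy as np
    from scipy.stats import norm

    def of_spending(t, alpha=0.025):
        """Lan-DeMets O'Brien-Fleming-type error spending function."""
        return 2.0 * (1.0 - norm.cdf(norm.ppf(1.0 - alpha / 2.0) / np.sqrt(t)))

    def mc_boundaries(info_fracs, alpha=0.025, n_sim=200_000, seed=3):
        """Calibrate efficacy boundaries by simulating the joint null distribution of
        the sequential z-statistics (independent-increments structure assumed)."""
        rng = np.random.default_rng(seed)
        steps = np.sqrt(np.diff([0.0] + list(info_fracs)))
        scores = np.cumsum(rng.normal(size=(n_sim, len(info_fracs))) * steps, axis=1)
        z = scores / np.sqrt(info_fracs)           # z-statistic at each interim look
        spent = [of_spending(t, alpha) for t in info_fracs]

        bounds, still_in, cum_alpha = [], np.ones(n_sim, dtype=bool), 0.0
        for k, t in enumerate(info_fracs):
            target = spent[k] - cum_alpha          # new alpha to spend at this look
            zk = np.sort(z[still_in, k])[::-1]
            b = zk[int(target * n_sim)]            # crossed by ~target fraction of all trials
            bounds.append(round(float(b), 2))
            still_in &= z[:, k] < b
            cum_alpha = spent[k]
        return bounds

    print(mc_boundaries([1/3, 2/3, 1.0]))          # roughly [3.7, 2.5, 2.0]
    ```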

  2. Influence of co-substrate on textile wastewater treatment and microbial community changes in the anaerobic biological sulfate reduction process.

    PubMed

    Rasool, Kashif; Mahmoud, Khaled A; Lee, Dae Sung

    2015-12-15

    This study investigated the anaerobic treatment of sulfate-rich synthetic textile wastewater in three sulfidogenic sequential batch reactors (SBRs). The experimental protocol was designed to examine the effect of three different co-substrates (lactate, glucose, and ethanol) and their concentrations on wastewater treatment performance. Sulfate reduction and dye degradation were improved when lactate and ethanol were used as electron donors, as compared with glucose. Moreover, at limited co-substrate concentrations, color, sulfate, and chemical oxygen demand (COD) removal efficiencies declined. As co-substrate COD was reduced gradually from 3000 to 500 mg/L, color removal efficiencies decreased from 98.23% to 78.46%, 63.37%, and 69.10%, whereas sulfate removal efficiencies decreased from 98.42%, 82.35%, and 87.0% to 30.27%, 21.50%, and 10.13%, for the lactate-, glucose-, and ethanol-fed reactors, respectively. Fourier transform infrared spectroscopy (FTIR) and total aromatic amine analysis revealed lactate to be a potential co-substrate for further biodegradation of the intermediate metabolites formed after dye degradation. Pyrosequencing analysis showed that microbial community structure was significantly affected by the co-substrate. The reactor with lactate as co-substrate showed the highest relative abundance of sulfate-reducing bacteria (SRB), followed by ethanol, whereas the glucose-fed reactor showed the lowest relative abundance of SRB. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Application of analytical quality by design principles for the determination of alkyl p-toluenesulfonates impurities in Aprepitant by HPLC. Validation using total-error concept.

    PubMed

    Zacharis, Constantinos K; Vastardi, Elli

    2018-02-20

    In the research presented we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of the AQbD protocol, the selected critical method attributes (CMAs) were the separation criteria between the critical peak pairs, the analysis time and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs, namely Plackett-Burman and Box-Behnken designs, were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles, a graphical decision-making tool, were constructed using the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of ±10%, meaning that 95% of future results will fall within the defined bias limits. The relative bias ranged between -1.3% and 3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and limit of quantification (LOQ) were adequate for the specific purpose and found to be 0.02% (corresponding to 48 μg g⁻¹ in sample) for both methyl and isopropyl p-toluenesulfonate. As a proof of concept, the validated method was successfully applied in the analysis of several Aprepitant batches, indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
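
    The Box-Behnken step above samples the factor space with a standard three-level design. As a self-contained illustration (the coded design itself is standard; the flow-rate mapping shown is a hypothetical range, not the method's), the sketch below builds the 3-factor Box-Behnken run list that such an optimization would execute.

    ```python
    import itertools
    import numpy as np

    def box_behnken_3(center_points=3):
        """Box-Behnken design for 3 factors in coded units (-1, 0, +1): all +/-1
        combinations for each pair of factors with the third held at 0, plus
        replicated centre runs."""
        runs = []
        for i, j in itertools.combinations(range(3), 2):
            for a, b in itertools.product((-1, 1), repeat=2):
                row = [0, 0, 0]
                row[i], row[j] = a, b
                runs.append(row)
        runs += [[0, 0, 0]] * center_points
        return np.array(runs)

    design = box_behnken_3()
    print(design.shape)                        # (15, 3): 12 edge runs + 3 centre runs
    # Map coded levels to real settings, e.g. a hypothetical flow range of 0.8-1.2 mL/min.
    flow_rate = 1.0 + 0.2 * design[:, 0]
    ```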

  4. Can sequential parallel comparison design and two-way enriched design be useful in medical device clinical trials?

    PubMed

    Ivanova, Anastasia; Zhang, Zhiwei; Thompson, Laura; Yang, Ying; Kotz, Richard M; Fang, Xin

    2016-01-01

    Sequential parallel comparison design (SPCD) was proposed for trials with high placebo response. In the first stage of SPCD, subjects are randomized between placebo and active treatment. In the second stage, placebo non-responders are re-randomized between placebo and active treatment. Data from the population of "all comers" and the subpopulation of placebo non-responders are then combined to yield a single p-value for the treatment comparison. The two-way enriched design (TED) is an extension of SPCD in which active-treatment responders are also re-randomized between placebo and active treatment in Stage 2. This article investigates the potential uses of SPCD and TED in medical device trials.
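
    A common way to combine evidence across the two SPCD stages is a weighted sum of the stage-wise test statistics. The sketch below is a simplified illustration of that idea only, using a weighted z-test on invented binary response counts; it is not the exact analysis prescribed by the SPCD literature.

    ```python
    import numpy as np
    from scipy.stats import norm

    def two_sample_z(resp_trt, n_trt, resp_pbo, n_pbo):
        """z-statistic for a difference in response proportions."""
        p1, p0 = resp_trt / n_trt, resp_pbo / n_pbo
        pooled = (resp_trt + resp_pbo) / (n_trt + n_pbo)
        se = np.sqrt(pooled * (1 - pooled) * (1 / n_trt + 1 / n_pbo))
        return (p1 - p0) / se

    # Stage 1: all randomized subjects; Stage 2: placebo non-responders re-randomized.
    z1 = two_sample_z(resp_trt=45, n_trt=100, resp_pbo=30, n_pbo=100)
    z2 = two_sample_z(resp_trt=25, n_trt=60, resp_pbo=12, n_pbo=60)

    w = 0.6                                    # pre-specified weight for stage 1 data
    z_combined = (w * z1 + (1 - w) * z2) / np.sqrt(w**2 + (1 - w)**2)
    print("combined z =", round(float(z_combined), 2),
          " one-sided p =", round(float(1 - norm.cdf(z_combined)), 4))
    ```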

  5. Sequential and prosodic design of English and Greek non-valenced news receipts.

    PubMed

    Kaimaki, Marianna

    2012-03-01

    Results arising from a prosodic and interactional study of the organization of everyday talk in English suggest that news receipts can be grouped into two categories: valenced (e.g., oh good) and non-valenced (e.g., oh really). In-depth investigation of both valenced and non-valenced news receipts shows that differences in their prosodic design do not seem to affect the sequential structure of the news informing sequence. News receipts with falling and rising pitch may have the same uptake and are treated in the same way by co-participants. A preliminary study of a Greek telephone corpus yielded the following receipts of news announcements: a malista, a(h) orea, a ne, a, oh. These are news markers composed of a standalone particle or a particle followed by an adverb or a response token (ne). Analysis of the sequential and prosodic design of Greek news announcement sequences is made to determine any interactional patterns and/or prosodic constraints. By examining the way in which co-participants display their interpretation of these turns I show that the phonological systems of contrast are different depending on the sequential environment, in much the same way that consonantal systems of contrast are not the same syllable initially and finally.

  6. Bioreactors for high cell density and continuous multi-stage cultivations: options for process intensification in cell culture-based viral vaccine production.

    PubMed

    Tapia, Felipe; Vázquez-Ramírez, Daniel; Genzel, Yvonne; Reichl, Udo

    2016-03-01

    With an increasing demand for efficacious, safe, and affordable vaccines for human and animal use, process intensification in cell culture-based viral vaccine production demands advanced process strategies to overcome the limitations of conventional batch cultivations. However, the use of fed-batch, perfusion, or continuous modes to drive processes at high cell density (HCD) and over extended operating times has so far been little explored in large-scale viral vaccine manufacturing. Also, possible reductions in cell-specific virus yields for HCD cultivations have been reported frequently. Taking into account that vaccine production is one of the most heavily regulated industries in the pharmaceutical sector, with tough margins to meet, it is understandable that process intensification has only recently been considered by both academia and industry as a next step toward more efficient viral vaccine production processes. Compared to conventional batch processes, fed-batch and perfusion strategies could result in ten to a hundred times higher product yields. Both cultivation strategies can be implemented to achieve cell concentrations exceeding 10^7 cells/mL or even 10^8 cells/mL, while keeping low levels of metabolites that potentially inhibit cell growth and virus replication. The trend towards HCD processes is supported by development of GMP-compliant cultivation platforms, i.e., acoustic settlers, hollow fiber bioreactors, and hollow fiber-based perfusion systems including tangential flow filtration (TFF) or alternating tangential flow (ATF) technologies. In this review, these process modes are discussed in detail and compared with conventional batch processes based on productivity indicators such as space-time yield, cell concentration, and product titers. In addition, options for the production of viral vaccines in continuous multi-stage bioreactors such as two- and three-stage systems are addressed. While such systems have shown similar virus titers compared to batch cultivations, keeping high yields for extended production times is still a challenge. Overall, we demonstrate that process intensification of cell culture-based viral vaccine production can be realized by the consistent application of fed-batch, perfusion, and continuous systems with a significant increase in productivity. The potential for even further improvements is high, considering recent developments in establishment of new (designer) cell lines, better characterization of host cell metabolism, advances in media design, and the use of mathematical models as a tool for process optimization and control.

  7. Prototype color field sequential television lens assembly

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The design, development, and evaluation of a prototype modular lens assembly with a self-contained field sequential color wheel is presented. The design of a color wheel of maximum efficiency, the selection of spectral filters, and the design of a quiet, efficient wheel drive system are included. Design tradeoffs considered for each aspect of the modular assembly are discussed. Emphasis is placed on achieving a design which can be attached directly to an unmodified camera, thus permitting use of the assembly in evaluating various candidate camera and sensor designs. A technique is described which permits maintaining high optical efficiency with an unmodified camera. A motor synchronization system is developed which requires only the vertical synchronization signal as a reference frequency input. Equations and tradeoff curves are developed to permit optimizing the filter wheel aperture shapes for a variety of different design conditions.

  8. Production of pullulan by a thermotolerant aureobasidium pullulans strain in non-stirred fed batch fermentation process

    PubMed Central

    Singh, Ranjan; Gaur, Rajeeva; Tiwari, Soni; Gaur, Manogya Kumar

    2012-01-01

    A total of 95 isolates of Aureobasidium pullulans were isolated from different flower and leaf samples, of which 11 thermotolerant strains produced pullulan. One thermotolerant, non-melanin-producing pullulan strain, designated RG-5, produced the highest amount of pullulan (37.1±1.0 g/l) at 42°C and pH 5.5 in 48 h of incubation with 3% sucrose and 0.5% ammonium sulphate in a non-stirred fed-batch fermentor of 6 liters capacity. The initial 2 liters of fermentation medium were further fed with 2 liters in two successive batches at 5-h intervals into the fermentor. Sterile air was supplied only for 10 h at a rate of 0.5 vvm. PMID:24031927

  9. Synthesizing a novel genetic sequential logic circuit: a push-on push-off switch

    PubMed Central

    Lou, Chunbo; Liu, Xili; Ni, Ming; Huang, Yiqi; Huang, Qiushi; Huang, Longwen; Jiang, Lingli; Lu, Dan; Wang, Mingcong; Liu, Chang; Chen, Daizhuo; Chen, Chongyi; Chen, Xiaoyue; Yang, Le; Ma, Haisu; Chen, Jianguo; Ouyang, Qi

    2010-01-01

    Design and synthesis of basic functional circuits are the fundamental tasks of synthetic biologists. Before it is possible to engineer higher-order genetic networks that can perform complex functions, a toolkit of basic devices must be developed. Among those devices, sequential logic circuits are expected to be the foundation of the genetic information-processing systems. In this study, we report the design and construction of a genetic sequential logic circuit in Escherichia coli. It can generate different outputs in response to the same input signal on the basis of its internal state, and ‘memorize' the output. The circuit is composed of two parts: (1) a bistable switch memory module and (2) a double-repressed promoter NOR gate module. The two modules were individually rationally designed, and they were coupled together by fine-tuning the interconnecting parts through directed evolution. After fine-tuning, the circuit could be repeatedly, alternatively triggered by the same input signal; it functions as a push-on push-off switch. PMID:20212522

  10. Synthesizing a novel genetic sequential logic circuit: a push-on push-off switch.

    PubMed

    Lou, Chunbo; Liu, Xili; Ni, Ming; Huang, Yiqi; Huang, Qiushi; Huang, Longwen; Jiang, Lingli; Lu, Dan; Wang, Mingcong; Liu, Chang; Chen, Daizhuo; Chen, Chongyi; Chen, Xiaoyue; Yang, Le; Ma, Haisu; Chen, Jianguo; Ouyang, Qi

    2010-01-01

    Design and synthesis of basic functional circuits are the fundamental tasks of synthetic biologists. Before it is possible to engineer higher-order genetic networks that can perform complex functions, a toolkit of basic devices must be developed. Among those devices, sequential logic circuits are expected to be the foundation of the genetic information-processing systems. In this study, we report the design and construction of a genetic sequential logic circuit in Escherichia coli. It can generate different outputs in response to the same input signal on the basis of its internal state, and 'memorize' the output. The circuit is composed of two parts: (1) a bistable switch memory module and (2) a double-repressed promoter NOR gate module. The two modules were individually rationally designed, and they were coupled together by fine-tuning the interconnecting parts through directed evolution. After fine-tuning, the circuit could be repeatedly, alternatively triggered by the same input signal; it functions as a push-on push-off switch.
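
    In digital-logic terms, the circuit behaves as a toggle: the memory module stores the internal state and the NOR-gate module routes the same input pulse to opposite outcomes depending on that state. The sketch below is an abstract Boolean simulation of that push-on push-off behaviour, not a model of the actual gene-network kinetics.

    ```python
    class PushOnPushOffSwitch:
        """Abstract two-module toggle: a bistable memory plus a gate that flips
        the stored state on every input pulse."""

        def __init__(self):
            self.state = False                    # bistable memory module, initially off

        def pulse(self, signal: bool) -> bool:
            """Same input signal, different outcome depending on the internal state."""
            if signal:
                self.state = not self.state       # gate output rewrites the memory
            return self.state

    switch = PushOnPushOffSwitch()
    print([switch.pulse(True) for _ in range(4)])  # [True, False, True, False]
    ```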

  11. Multi-objective design optimization of antenna structures using sequential domain patching with automated patch size determination

    NASA Astrophysics Data System (ADS)

    Koziel, Slawomir; Bekasiewicz, Adrian

    2018-02-01

    In this article, a simple yet efficient and reliable technique for fully automated multi-objective design optimization of antenna structures using sequential domain patching (SDP) is discussed. The optimization procedure according to SDP is a two-step process: (i) obtaining the initial set of Pareto-optimal designs representing the best possible trade-offs between considered conflicting objectives, and (ii) Pareto set refinement for yielding the optimal designs at the high-fidelity electromagnetic (EM) simulation model level. For the sake of computational efficiency, the first step is realized at the level of a low-fidelity (coarse-discretization) EM model by sequential construction and relocation of small design space segments (patches) in order to create a path connecting the extreme Pareto front designs obtained beforehand. The second stage involves response correction techniques and local response surface approximation models constructed by reusing EM simulation data acquired in the first step. A major contribution of this work is an automated procedure for determining the patch dimensions. It allows for appropriate selection of the number of patches for each geometry variable so as to ensure reliability of the optimization process while maintaining its low cost. The importance of this procedure is demonstrated by comparing it with uniform patch dimensions.

  12. Active Job Monitoring in Pilots

    NASA Astrophysics Data System (ADS)

    Kuehn, Eileen; Fischer, Max; Giffels, Manuel; Jung, Christopher; Petzold, Andreas

    2015-12-01

    Recent developments in high energy physics (HEP) including multi-core jobs and multi-core pilots require data centres to gain a deep understanding of the system to monitor, design, and upgrade computing clusters. Networking is a critical component. Especially the increased usage of data federations, for example in diskless computing centres or as a fallback solution, relies on WAN connectivity and availability. The specific demands of different experiments and communities, but also the need for identification of misbehaving batch jobs, requires an active monitoring. Existing monitoring tools are not capable of measuring fine-grained information at batch job level. This complicates network-aware scheduling and optimisations. In addition, pilots add another layer of abstraction. They behave like batch systems themselves by managing and executing payloads of jobs internally. The number of real jobs being executed is unknown, as the original batch system has no access to internal information about the scheduling process inside the pilots. Therefore, the comparability of jobs and pilots for predicting run-time behaviour or network performance cannot be ensured. Hence, identifying the actual payload is important. At the GridKa Tier 1 centre a specific tool is in use that allows the monitoring of network traffic information at batch job level. This contribution presents the current monitoring approach and discusses recent efforts and importance to identify pilots and their substructures inside the batch system. It will also show how to determine monitoring data of specific jobs from identified pilots. Finally, the approach is evaluated.

  13. Towards Batched Linear Solvers on Accelerated Hardware Platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haidar, Azzam; Dong, Tingzing Tim; Tomov, Stanimire

    2015-01-01

    As hardware evolves, an increasingly effective approach to develop energy-efficient, high-performance solvers is to design them to work on many small and independent problems. Indeed, many applications already need this functionality, especially for GPUs, which are known to be currently about four to five times more energy efficient than multicore CPUs for every floating-point operation. In this paper, we describe the development of the main one-sided factorizations, LU, QR, and Cholesky, that are needed for a set of small dense matrices to work in parallel. We refer to such algorithms as batched factorizations. Our approach is based on representing the algorithms as a sequence of batched BLAS routines for GPU-contained execution. Note that this is similar in functionality to the LAPACK and the hybrid MAGMA algorithms for large-matrix factorizations. But it is different from a straightforward approach, whereby each of the GPU's symmetric multiprocessors factorizes a single problem at a time. We illustrate how our performance analysis together with the profiling and tracing tools guided the development of batched factorizations to achieve up to 2-fold speedup and 3-fold better energy efficiency compared to our highly optimized batched CPU implementations based on the MKL library on a two-socket Intel Sandy Bridge server. Compared to a batched LU factorization featured in NVIDIA's CUBLAS library for GPUs, we achieve up to 2.5-fold speedup on the K40 GPU.
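
    The batched idea (many small independent problems handed to one library call) can be illustrated on the CPU with NumPy's stacked linear algebra. This is only an analogy to the GPU-resident batched BLAS routines described above, not the MAGMA or CUBLAS implementation itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    batch, n = 10_000, 8                          # many small independent problems

    # Build a batch of symmetric positive definite matrices and right-hand sides.
    A = rng.normal(size=(batch, n, n))
    A = A @ A.transpose(0, 2, 1) + n * np.eye(n)
    b = rng.normal(size=(batch, n, 1))

    # One "batched" call factors or solves all systems at once instead of looping.
    L = np.linalg.cholesky(A)                     # stacked Cholesky factorizations
    x = np.linalg.solve(A, b)                     # stacked solves

    residual = np.linalg.norm(A @ x - b, axis=(1, 2)).max()
    print("max residual over the batch:", residual)
    ```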

  14. 21 CFR 820.3 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... designs, manufactures, fabricates, assembles, or processes a finished device. Manufacturer includes but is... numbers, or both, from which the history of the manufacturing, packaging, labeling, and distribution of a unit, lot, or batch of finished devices can be determined. (e) Design history file (DHF) means a...

  15. Stratified randomization controls better for batch effects in 450K methylation analysis: a cautionary tale.

    PubMed

    Buhule, Olive D; Minster, Ryan L; Hawley, Nicola L; Medvedovic, Mario; Sun, Guangyun; Viali, Satupaitea; Deka, Ranjan; McGarvey, Stephen T; Weeks, Daniel E

    2014-01-01

    Batch effects in DNA methylation microarray experiments can lead to spurious results if not properly handled during the plating of samples. Two pilot studies examining the association of DNA methylation patterns across the genome with obesity in Samoan men were investigated for chip- and row-specific batch effects. For each study, the DNA of 46 obese men and 46 lean men were assayed using Illumina's Infinium HumanMethylation450 BeadChip. In the first study (Sample One), samples from obese and lean subjects were examined on separate chips. In the second study (Sample Two), the samples were balanced on the chips by lean/obese status, age group, and census region. We used methylumi, watermelon, and limma R packages, as well as ComBat, to analyze the data. Principal component analysis and linear regression were, respectively, employed to identify the top principal components and to test for their association with the batches and lean/obese status. To identify differentially methylated positions (DMPs) between obese and lean males at each locus, we used a moderated t-test. Chip effects were effectively removed from Sample Two but not Sample One. In addition, dramatic differences were observed between the two sets of DMP results. After "removing" batch effects with ComBat, Sample One had 94,191 probes differentially methylated at a q-value threshold of 0.05 while Sample Two had zero differentially methylated probes. The disparate results from Sample One and Sample Two likely arise due to the confounding of lean/obese status with chip and row batch effects. Even the best possible statistical adjustments for batch effects may not completely remove them. Proper study design is vital for guarding against spurious findings due to such effects.
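
    The remedy highlighted above is design-stage balancing rather than post hoc correction. As a schematic illustration (sample counts and labels mirror the 46/46 description, but the assignment scheme is a generic sketch, not the study's plating protocol), the code below deals samples to chips so that each chip receives a near-equal mix of lean and obese subjects instead of confounding phenotype with chip.

    ```python
    import random
    from collections import defaultdict

    def stratified_chip_assignment(samples, n_chips=8, seed=5):
        """Shuffle within each stratum, then deal that stratum's samples to chips
        round-robin, so every chip receives a near-equal share of each stratum."""
        rng = random.Random(seed)
        strata = defaultdict(list)
        for sample_id, label in samples:
            strata[label].append(sample_id)

        chips = defaultdict(list)
        for label, members in strata.items():
            rng.shuffle(members)
            for i, sample_id in enumerate(members):
                chips[i % n_chips].append(sample_id)
        return dict(chips)

    samples = [(f"obese_{i}", "obese") for i in range(46)] + \
              [(f"lean_{i}", "lean") for i in range(46)]
    chips = stratified_chip_assignment(samples)
    print({c: sum(s.startswith("obese") for s in ids) for c, ids in sorted(chips.items())})
    ```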

  16. Stratified randomization controls better for batch effects in 450K methylation analysis: a cautionary tale

    PubMed Central

    Buhule, Olive D.; Minster, Ryan L.; Hawley, Nicola L.; Medvedovic, Mario; Sun, Guangyun; Viali, Satupaitea; Deka, Ranjan; McGarvey, Stephen T.; Weeks, Daniel E.

    2014-01-01

    Background: Batch effects in DNA methylation microarray experiments can lead to spurious results if not properly handled during the plating of samples. Methods: Two pilot studies examining the association of DNA methylation patterns across the genome with obesity in Samoan men were investigated for chip- and row-specific batch effects. For each study, the DNA of 46 obese men and 46 lean men were assayed using Illumina's Infinium HumanMethylation450 BeadChip. In the first study (Sample One), samples from obese and lean subjects were examined on separate chips. In the second study (Sample Two), the samples were balanced on the chips by lean/obese status, age group, and census region. We used methylumi, watermelon, and limma R packages, as well as ComBat, to analyze the data. Principal component analysis and linear regression were, respectively, employed to identify the top principal components and to test for their association with the batches and lean/obese status. To identify differentially methylated positions (DMPs) between obese and lean males at each locus, we used a moderated t-test. Results: Chip effects were effectively removed from Sample Two but not Sample One. In addition, dramatic differences were observed between the two sets of DMP results. After “removing” batch effects with ComBat, Sample One had 94,191 probes differentially methylated at a q-value threshold of 0.05 while Sample Two had zero differentially methylated probes. The disparate results from Sample One and Sample Two likely arise due to the confounding of lean/obese status with chip and row batch effects. Conclusion: Even the best possible statistical adjustments for batch effects may not completely remove them. Proper study design is vital for guarding against spurious findings due to such effects. PMID:25352862

  17. Transfer of a three step mAb chromatography process from batch to continuous: Optimizing productivity to minimize consumable requirements.

    PubMed

    Gjoka, Xhorxhi; Gantier, Rene; Schofield, Mark

    2017-01-20

    The goal of this study was to adapt a batch mAb purification chromatography platform for continuous operation. The experiments and rationale used to convert from batch to continuous operation are described. Experimental data were used to design chromatography methods for continuous operation that would exceed the threshold for critical quality attributes and minimize the consumables required compared to the batch mode of operation. Four unit operations comprising Protein A capture, viral inactivation, flow-through anion exchange (AEX), and mixed-mode cation exchange chromatography (MMCEX) were integrated across two Cadence BioSMB PD multi-column chromatography systems in order to process a 25 L volume of harvested cell culture fluid (HCCF) in less than 12 h. Transfer from batch to continuous operation increased the productivity of the Protein A step from 13 to 50 g/L/h and of the MMCEX step from 10 to 60 g/L/h, with no impact on purification process performance in terms of contaminant removal (4.5 log reduction of host cell proteins, 50% reduction in soluble product aggregates) and overall chromatography process yield of recovery (75%). The increase in productivity, combined with continuous operation, reduced the resin volume required for Protein A and MMCEX chromatography by more than 95% compared to batch. The volume of AEX membrane required for flow-through operation was reduced by 74%. Moreover, the continuous process required 44% less buffer than an equivalent batch process. This significant reduction in consumables enables cost-effective, disposable, single-use manufacturing. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  18. NITRATE CONVERSION OF HB-LINE REILLEXTM HPQ RESIN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steimke, J.; Williams, M.; Steeper, T.

    Reillex™ HPQ ion exchange resin is used by HB Line to remove plutonium from aqueous streams. Reillex™ HPQ resin currently available from Vertellus Specialties LLC is a chloride ionic form, which can cause stress corrosion cracking in stainless steels. Therefore, HB Line Engineering requested that Savannah River National Laboratory (SRNL) convert resin from chloride form to nitrate form in the Engineering Development Laboratory (EDL). To perform this task, SRNL treated two batches of resin in 2012. The first batch of resin from Reilly Industries Batch 80302MA was initially treated at SRNL in 2001 to remove chloride. This batch of resin, nominally 30 liters, has been stored wet in carboys since that time until being retreated in 2012. The second batch of resin from Batch 23408 consisted of 50 kg of new resin purchased from Vertellus Specialties in 2012. Both batches were treated in a column designed to convert resin using downflow of 1.0 M sodium nitrate solution through the resin bed followed by rinsing with deionized water. Both batches were analyzed for chloride concentration, before and after treatment, using Neutron Activation Analysis (NAA). The resin specification [Werling, 2003] states the total chlorine and chloride concentration shall be less than 250 ppm. The resin condition for measuring this concentration is not specified; however, in service the resin would always be fully wet. Measurements in SRNL showed that changing from oven dry resin to fully wet resin, with liquid in the particle interstices but no supernatant, increases the total weight by a factor of at least three. Therefore, concentration of chlorine or chloride expressed as parts per million (ppm) decreases by a factor of three. Therefore, SRNL recommends measuring chlorine concentration on an oven dry basis, then dividing by three to estimate chloride concentration in the fully wet condition. Chloride concentration in the first batch (No. 80302MA) was nearly the same before the current treatment (759 ppm dry) and after treatment (745 ppm dry or ~248 ppm wet). Treatment of the second batch of resin (No. 23408) was very successful. Chloride concentration decreased from 120,000 ppm dry to an average of 44 ppm dry or ~15 ppm wet, which easily passes the 250 ppm wet criterion. Per guidance from HB Line Engineering, SRNL blended Batch 80302 resin with Batch P9059 resin which had been treated previously by ResinTech to remove chloride. The chloride concentrations for the two drums of Batch P9059 were 248 ppm dry (~83 ppm wet) ± 22.8% and 583 ppm dry (~194 ppm wet) ± 11.8%. The blended resin was packaged in five gallon buckets.

  19. Sequential programmable self-assembly: Role of cooperative interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonathan D. Halverson; Tkachenko, Alexei V.

    Here, we propose a general strategy of “sequential programmable self-assembly” that enables a bottom-up design of arbitrary multi-particle architectures on nano- and microscales. We show that a naive realization of this scheme, based on the pairwise additive interactions between particles, has fundamental limitations that lead to a relatively high error rate. This can be overcome by using cooperative interparticle binding. The cooperativity is a well known feature of many biochemical processes, responsible, e.g., for signaling and regulations in living systems. Here we propose to utilize a similar strategy for high precision self-assembly, and show that DNA-mediated interactions provide a convenient platform for its implementation. In particular, we outline a specific design of a DNA-based complex which we call “DNA spider,” that acts as a smart interparticle linker and provides a built-in cooperativity of binding. We demonstrate versatility of the sequential self-assembly based on spider-functionalized particles by designing several mesostructures of increasing complexity and simulating their assembly process. This includes a number of finite and repeating structures, in particular, the so-called tetrahelix and its several derivatives. Due to its generality, this approach allows one to design and successfully self-assemble virtually any structure made of a “GEOMAG” magnetic construction toy, out of nanoparticles. According to our results, once the binding cooperativity is strong enough, the sequential self-assembly becomes essentially error-free.

  20. Sequential programmable self-assembly: Role of cooperative interactions

    DOE PAGES

    Jonathan D. Halverson; Tkachenko, Alexei V.

    2016-03-04

    Here, we propose a general strategy of “sequential programmable self-assembly” that enables a bottom-up design of arbitrary multi-particle architectures on nano- and microscales. We show that a naive realization of this scheme, based on the pairwise additive interactions between particles, has fundamental limitations that lead to a relatively high error rate. This can be overcome by using cooperative interparticle binding. The cooperativity is a well known feature of many biochemical processes, responsible, e.g., for signaling and regulations in living systems. Here we propose to utilize a similar strategy for high precision self-assembly, and show that DNA-mediated interactions provide a convenient platform for its implementation. In particular, we outline a specific design of a DNA-based complex which we call “DNA spider,” that acts as a smart interparticle linker and provides a built-in cooperativity of binding. We demonstrate versatility of the sequential self-assembly based on spider-functionalized particles by designing several mesostructures of increasing complexity and simulating their assembly process. This includes a number of finite and repeating structures, in particular, the so-called tetrahelix and its several derivatives. Due to its generality, this approach allows one to design and successfully self-assemble virtually any structure made of a “GEOMAG” magnetic construction toy, out of nanoparticles. According to our results, once the binding cooperativity is strong enough, the sequential self-assembly becomes essentially error-free.

  1. Field-Sequential Color Converter

    NASA Technical Reports Server (NTRS)

    Studer, Victor J.

    1989-01-01

    An electronic conversion circuit enables display of signals from a field-sequential color-television camera on a standard color video monitor. Designed for incorporation into the color-television monitor on the Space Shuttle, the circuit weighs less, takes up less space, and consumes less power than previous conversion equipment. It incorporates state-of-the-art memory devices, also used in terrestrial stationary or portable closed-circuit television systems.

  2. Apollo experience report: Command and service module sequential events control subsystem

    NASA Technical Reports Server (NTRS)

    Johnson, G. W.

    1975-01-01

    The Apollo command and service module sequential events control subsystem is described, with particular emphasis on the major systems and component problems and solutions. The subsystem requirements, design, and development and the test and flight history of the hardware are discussed. Recommendations to avoid similar problems on future programs are outlined.

  3. An Undergraduate Survey Course on Asynchronous Sequential Logic, Ladder Logic, and Fuzzy Logic

    ERIC Educational Resources Information Center

    Foster, D. L.

    2012-01-01

    For a basic foundation in computer engineering, universities traditionally teach synchronous sequential circuit design, using discrete gates or field programmable gate arrays, and a microcomputers course that includes basic I/O processing. These courses, though critical, expose students to only a small subset of tools. At co-op schools like…

  4. Terminating Sequential Delphi Survey Data Collection

    ERIC Educational Resources Information Center

    Kalaian, Sema A.; Kasim, Rafa M.

    2012-01-01

    The Delphi survey technique is an iterative mail or electronic (e-mail or web-based) survey method used to obtain agreement or consensus among a group of experts in a specific field on a particular issue through a well-designed and systematic multiple sequential rounds of survey administrations. Each of the multiple rounds of the Delphi survey…

  5. The effects of the sequential addition of synthesis parameters on the performance of alkali activated fly ash mortar

    NASA Astrophysics Data System (ADS)

    Dassekpo, Jean-Baptiste Mawulé; Zha, Xiaoxiong; Zhan, Jiapeng; Ning, Jiaqian

    Geopolymer is an energy-efficient and sustainable material that is currently used in the construction industry as an alternative to Portland cement. As it is a new material, a specific mix design method is essential, and efforts have been made to develop a mix design procedure with the main focus on achieving better compressive strength and economy. In this paper, the sequential addition of synthesis parameters such as fly ash-sand, alkaline liquids, plasticizer and additional water at well-defined time intervals was investigated. A total of 4 mix procedures were used to study the compressive performance of fly ash-based geopolymer mortar, and the results of each method were analyzed and discussed. Experimental results show that the sequential addition of sodium hydroxide (NaOH), sodium silicate (Na2SiO3) and plasticizer (PL), followed by the addition of water (WA), considerably increases the compressive strength of the geopolymer-based mortar. These results clearly demonstrate the highly significant influence of the sequential addition of synthesis parameters on the compressive properties of geopolymer materials, and also provide a new mixing method for the preparation of geopolymer paste, mortar and concrete.

  6. Significantly enhanced production of acarbose in fed-batch fermentation with the addition of S-adenosylmethionine.

    PubMed

    Sun, Li-Hui; Li, Ming-Gang; Wang, Yuan-Shan; Zheng, Yu-Guo

    2012-06-01

    Acarbose, a pseudo-oligosaccharide, is widely used clinically in therapies for non-insulin-dependent diabetes. In the present study, S-adenosylmethionine (SAM) was added to selected media in order to investigate its effect on acarbose fermentation by Actinoplanes utahensis ZJB-08196. Acarbose titer was seen to increase markedly when SAM was added at various concentrations over a period of time. The effects of glucose and maltose on the production of acarbose were investigated in both batch and fed-batch fermentation. Optimal acarbose production was observed at relatively low glucose levels and high maltose levels. Based on these results, a further fed-batch experiment was designed so as to enhance the production of acarbose. Fed-batch fermentation was carried out at an initial glucose level of 10 g/l and an initial maltose level of 60 g/l. Then, 12 h post inoculation, 100 micromol/l SAM was added. In addition, 8 g/l of glucose was added every 24 h, and 20 g/l of maltose was added at 96 h. By way of this novel feeding strategy, the maximum titer of acarbose achieved was 6,113 mg/l at 192 h. To our knowledge, the production level of acarbose achieved in this study is the highest ever reported.

  7. Adaptation to high throughput batch chromatography enhances multivariate screening.

    PubMed

    Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried

    2015-09-01

    High throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or comparison of multiple resins in a given process, as opposed to the assessment of protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown and the impact of the load conditions are assessed in combination with the other factors explored. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Kinetic studies on batch cultivation of Trichoderma reesei and application to enhance cellulase production by fed-batch fermentation.

    PubMed

    Ma, Lijuan; Li, Chen; Yang, Zhenhua; Jia, Wendi; Zhang, Dongyuan; Chen, Shulin

    2013-07-20

    Reducing the production cost of cellulase, the key enzyme for hydrolysis of cellulose to fermentable sugars, remains a major challenge for biofuel production. Because of the complexity of cellulase production, kinetic modeling and mass balance calculations can be used as effective tools for process design and optimization. In this study, kinetic models for cell growth, substrate consumption and cellulase production in batch fermentation were developed and then applied to fed-batch fermentation to enhance cellulase production. The inhibitory effect of the substrate was considered, and a modified Luedeking-Piret model was developed for cellulase production and substrate consumption according to the growth characteristics of Trichoderma reesei. The model predictions fit the experimental data well. Simulation results showed that higher initial substrate concentrations led to a decrease in the cellulase production rate. Mass balance and kinetic simulation results were applied to determine the feeding strategy. Cellulase production and its corresponding productivity increased by 82.13% after employing the proper feeding strategy in fed-batch fermentation. This method, combining mathematics and chemometrics through kinetic modeling and mass balance, can not only improve the cellulase fermentation process but also help to better understand it. The model development can also provide insight into other similar fermentation processes. Copyright © 2013 Elsevier B.V. All rights reserved.
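
    The modified Luedeking-Piret structure couples product formation to both the growth rate and the biomass level, dP/dt = alpha*dX/dt + beta*X. The sketch below integrates a generic logistic-growth Luedeking-Piret batch model with invented parameter values; it illustrates the model form only, not the fitted Trichoderma reesei parameters from the study.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Invented parameters: mu_max (1/h), carrying capacity X_max (g/L), biomass yield
    # Y_xs (g cell / g substrate), Luedeking-Piret alpha (g/g) and beta (g/g/h).
    mu_max, X_max, Y_xs, alpha, beta = 0.12, 15.0, 0.45, 0.08, 0.004

    def batch_model(t, y):
        X, S, P = y
        dXdt = mu_max * X * (1.0 - X / X_max) if S > 0 else 0.0   # logistic growth
        dSdt = -dXdt / Y_xs                                       # substrate consumption
        dPdt = alpha * dXdt + beta * X                            # Luedeking-Piret product term
        return [dXdt, dSdt, dPdt]

    sol = solve_ivp(batch_model, t_span=(0.0, 120.0), y0=[0.5, 40.0, 0.0],
                    t_eval=np.linspace(0.0, 120.0, 7))
    for t, X, S, P in zip(sol.t, *sol.y):
        print(f"t={t:5.1f} h  X={X:5.2f}  S={max(S, 0):5.2f}  P={P:5.2f} g/L")
    ```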

  9. Evaluation of an injectable polymeric delivery system for controlled and localized release of biological factors to promote therapeutic angiogenesis

    NASA Astrophysics Data System (ADS)

    Rocker, Adam John

    Cardiovascular disease remains the leading cause of death worldwide and is frequently associated with partial or full occlusion of coronary arteries. Currently, angioplasty and bypass surgery are the standard approaches for treating patients with these ischemic heart conditions. However, a large number of patients cannot undergo these procedures. Therapeutic angiogenesis provides a minimally invasive tool for treating cardiovascular diseases by inducing new blood vessel growth from the existing vasculature. Angiogenic growth factors can be delivered locally through gene, cell, and protein therapy. Natural and synthetic polymer growth factor delivery systems are under extensive investigation due to their widespread applications and promising therapeutic potential. Although biocompatible, natural polymers often suffer from batch-to-batch variability which can cause unpredictable growth factor release rates. Synthetic polymers offer advantages for growth factor delivery as they can be easily modified to control release kinetics. During the angiogenesis process, vascular endothelial growth factor (VEGF) is necessary to initiate neovessel formation while platelet-derived growth factor (PDGF) is needed later to help stabilize and mature new vessels. In the setting of myocardial infarction, additional anti-inflammatory cytokines like IL-10 are needed to help optimize cardiac repair and limit the damaging effects of inflammation following infarction. To meet these angiogenic and anti-inflammatory needs, an injectable polymer delivery system created from a sulfonated reverse thermal gel encapsulating micelle nanoparticles was designed and evaluated. The sulfonate groups on the thermal gel electrostatically bind VEGF, which controls its release rate, while the micelles are loaded with PDGF and are slowly released as the gel degrades. IL-10 was loaded into the system as well and diffused from the gel over time. An in vitro release study was performed which demonstrated the sequential release capabilities of the polymer system. The ability of the polymer system to induce new blood vessel formation was analyzed in vivo using a subcutaneous injection mouse model. Histological assessment was used to quantify blood vessel formation and the inflammatory response; it showed that the polymer delivery system produced a significant increase in functional and mature vessel formation while significantly reducing inflammation.

  10. Sample size re-estimation and other midcourse adjustments with sequential parallel comparison design.

    PubMed

    Silverman, Rachel K; Ivanova, Anastasia

    2017-01-01

    Sequential parallel comparison design (SPCD) was proposed to reduce placebo response in a randomized trial with a placebo comparator. Subjects are randomized between placebo and drug in stage 1 of the trial, and then placebo non-responders are re-randomized in stage 2. The efficacy analysis includes all data from stage 1 and all placebo non-responding subjects from stage 2. This article investigates the possibility of re-estimating the sample size and adjusting the design parameters (the allocation proportion to placebo in stage 1 of SPCD and the weight of stage 1 data in the overall efficacy test statistic) during an interim analysis.

  11. Simultaneous application of chemical oxidation and extraction processes is effective at remediating soil Co-contaminated with petroleum and heavy metals.

    PubMed

    Yoo, Jong-Chan; Lee, Chadol; Lee, Jeung-Sun; Baek, Kitae

    2017-01-15

    Chemical extraction and oxidation processes for cleaning up heavy metals and hydrocarbons in soil have higher remediation efficiency and take less time than other remediation processes. In the batch extraction/oxidation process, 3% hydrogen peroxide (H2O2) removed approximately 70% of the petroleum and 0.1 M ethylenediaminetetraacetic acid (EDTA) removed about 60% of the Cu and Pb in the soil. In particular, petroleum was effectively oxidized by H2O2 without the addition of any catalysts, through the dissolution of Fe oxides in the natural soil. Furthermore, heavy metals bound to Fe-Mn oxyhydroxides could be extracted through metal-EDTA as well as Fe-EDTA complexation owing to the high affinity of EDTA for metals. However, the strong binding of Fe-EDTA inhibited the oxidation of petroleum in the extraction-oxidation sequential process, because Fe was removed during the EDTA extraction step. The oxidation-extraction sequential process did not significantly enhance the extraction of heavy metals from soil, because a small portion of the heavy metals remained bound to organic matter. Overall, simultaneous application of the oxidation and extraction processes resulted in highly efficient removal of both contaminants; this approach can be used to remove co-contaminants from soil in a short amount of time at a reasonable cost. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Preferential adsorption and surface precipitation of lead(II) ions onto anatase in artificially contaminated Dixie clay.

    PubMed

    Suzuki, Tasuma; Okita, Miyu; Kakoyama, Satoshi; Niinae, Masakazu; Nakata, Hideki; Fujii, Hiroshi; Tasaka, Yukio

    2017-09-15

    During TEM-EDS (transmission electron microscopy coupled with an X-ray energy dispersive spectrometer) analysis of Dixie clay artificially contaminated with Pb(II), we observed that Pb(II) was preferentially adsorbed and precipitated on the surface of TiO2. To deepen the understanding of the mechanism and importance of this phenomenon, batch sorption experiments, XANES (X-ray absorption near edge spectroscopy) analysis, and sequential extraction analysis were performed. The TiO2 in Dixie clay was found to be anatase, and anatase showed a higher Pb(II) sorption propensity than rutile, α-FeOOH, and one of the two MnO2 phases investigated in this study. Our experimental results indicated that the Pb precipitates preferentially formed on the surface of anatase were Pb(II) hydroxide or Pb(II) oxide. Additionally, sequential extraction analysis showed that at least 32% and 42% of the Pb(II) was sorbed onto anatase in Dixie clay contaminated with Pb contents of 736 mg Pb/kg and 1,958 mg Pb/kg, respectively. These results demonstrate that, in addition to the Fe and Mn oxides that are well known to serve as sinks for Pb(II) in the soil environment, TiO2 is also a metal oxide that controls the behavior and fate of Pb(II) in soils. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Experimental study on anomalous neutron production in deuterium/solid system

    NASA Astrophysics Data System (ADS)

    He, Jianyu; Zhu, Rongbao; Wang, Xiaozhong; Lu, Feng; Luo, Longjun; Liu, Hengjun; Jiang, Jincai; Tian, Baosheng; Chen, Guoan; Yuan, Yuan; Dong, Baiting; Yang, Liucheng; Qiao, Shengzhong; Yi, Guoan; Guo, Hua; Ding, Dazhao; Menlove, H. O.

    1991-05-01

    A series of experiments on both D2O electrolysis and thermal cycling of deuterium-absorbed Ti turnings was designed to examine anomalous phenomena in the deuterium/solid system. A neutron detector containing 16 BF3 tubes, with a detection limit of 0.38 n/s for a two-hour count, was used for the electrolysis experiments. No neutron counting rate statistically higher than the detection limit was observed in the Fleischmann and Pons type experiments. An HLNCC neutron detector equipped with 18 3He tubes and a JSR-11 shift register unit, with a detection limit of 0.20 n/s for a two-hour run, was employed to study neutron signals in the D2 gas experiments. Different material pretreatments were selected to examine changes in the frequency and size of neutron burst production. The experiment sequence was deliberately designed to distinguish neutron bursts from fake signals, e.g., electronic noise pickup, cosmic rays, and other sources of environmental background. Ten batches of dry fusion samples were tested; among them, seven batches showed neutron burst signals, occurring roughly between -100 degrees centigrade and near room temperature. In the first four runs of a typical sample batch, seven neutron bursts were observed with neutron numbers from 15 to 482, which are 3 and 75 times higher, respectively, than the uncertainty of the background. However, no bursts occurred for the H2 dummy samples run in between and afterwards, or for the sample batch after a certain number of runs.

  14. Batched matrix computations on hardware accelerators based on GPUs

    DOE PAGES

    Haidar, Azzam; Dong, Tingxing; Luszczek, Piotr; ...

    2015-02-09

    Scientific applications require solvers that work on many small problems that are independent of each other. At the same time, high-end hardware evolves rapidly and becomes ever more throughput-oriented, so there is an increasing need for an effective approach to developing energy-efficient, high-performance codes for these small matrix problems, which we call batched factorizations. The many applications that need this functionality could especially benefit from GPUs, which currently are four to five times more energy efficient than multicore CPUs on important scientific workloads. This study therefore describes the development of the most common one-sided factorizations, Cholesky, LU, and QR, for a set of small dense matrices. The algorithms we present, together with their implementations, are by design inherently parallel. In particular, our approach is based on representing the process as a sequence of batched BLAS routines that are executed entirely on a GPU. Importantly, this is unlike the LAPACK and hybrid MAGMA factorization algorithms, which work under drastically different assumptions about hardware design and the efficiency of execution of the various computational kernels involved in the implementation. Thus, our approach is more efficient than what works for a combination of multicore CPUs and GPUs for the problem sizes of interest in the application use cases. The paradigm in which a single chip (a GPU or a CPU) factorizes a single problem at a time is not at all efficient in our applications' context. We illustrate all of these claims through a detailed performance analysis. With the help of profiling and tracing tools, we guide our development of batched factorizations to achieve up to two-fold speedup and three-fold better energy efficiency compared with our highly optimized batched CPU implementations based on the MKL library. Finally, on a tested system featuring two sockets of Intel Sandy Bridge CPUs, compared with the batched LU factorization featured in the CUBLAS library for GPUs, we achieve as high as a 2.5× speedup on the NVIDIA K40 GPU.
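
    The batched idea (many small, independent factorizations or solves executed together rather than one at a time) can be illustrated outside the GPU context. The NumPy sketch below stacks a batch of small systems and solves them in a single broadcast call; it is only a conceptual stand-in for the batched BLAS/cuBLAS kernels discussed in the paper, not an implementation of them.

```python
# Conceptual illustration of "batched" linear algebra: many small, independent systems
# solved together. This is a NumPy stand-in for the batched BLAS / cuBLAS kernels
# discussed above, not an implementation of them.
import numpy as np

rng = np.random.default_rng(0)
batch, n = 10_000, 8                                       # many tiny problems
A = rng.standard_normal((batch, n, n)) + n * np.eye(n)     # batch of well-conditioned matrices
b = rng.standard_normal((batch, n, 1))                     # one right-hand side per problem

x = np.linalg.solve(A, b)      # a single call, broadcast over the leading batch axis
print(x.shape)                 # (10000, 8, 1)
```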

  15. Efficient estimation of the maximum metabolic productivity of batch systems

    DOE PAGES

    St. John, Peter C.; Crowley, Michael F.; Bomble, Yannick J.

    2017-01-31

    Production of chemicals from engineered organisms in a batch culture involves an inherent trade-off between productivity, yield, and titer. Existing strategies for strain design typically focus on designing mutations that achieve the highest possible yield while maintaining growth viability. While these methods are computationally tractable, an optimum productivity could be achieved by a dynamic strategy in which the intracellular division of resources is permitted to change with time. New methods for the design and implementation of dynamic microbial processes, both computational and experimental, have therefore been explored to maximize productivity. However, solving for the optimal metabolic behavior under the assumption that all fluxes in the cell are free to vary is a challenging numerical task. Previous studies have therefore typically focused on simpler strategies that are more feasible to implement in practice, such as the time-dependent control of a single flux or control variable.

  16. Optimizing trial design in pharmacogenetics research: comparing a fixed parallel group, group sequential, and adaptive selection design on sample size requirements.

    PubMed

    Boessen, Ruud; van der Baan, Frederieke; Groenwold, Rolf; Egberts, Antoine; Klungel, Olaf; Grobbee, Diederick; Knol, Mirjam; Roes, Kit

    2013-01-01

    Two-stage clinical trial designs may be efficient in pharmacogenetics research when there is some, but inconclusive, evidence of effect modification by a genomic marker. Two-stage designs allow stopping early for efficacy or futility and can offer the additional opportunity to enrich the study population to a specific patient subgroup after an interim analysis. This study compared sample size requirements for fixed parallel group, group sequential, and adaptive selection designs with equal overall power and control of the family-wise type I error rate. The designs were evaluated across scenarios that defined the effect sizes in the marker-positive and marker-negative subgroups and the prevalence of marker-positive patients in the overall study population. Effect sizes were chosen to reflect realistic planning scenarios in which at least some effect is present in the marker-negative subgroup. In addition, scenarios were considered in which the assumed 'true' subgroup effects (i.e., the postulated effects) differed from those hypothesized at the planning stage. As expected, both two-stage designs generally required fewer patients than a fixed parallel group design, and the advantage increased as the difference between subgroups increased. The adaptive selection design added little further reduction in sample size compared with the group sequential design when the postulated effect sizes were equal to those hypothesized at the planning stage. However, when the postulated effects deviated strongly in favor of enrichment, the comparative advantage of the adaptive selection design increased, which precisely reflects the adaptive nature of the design. Copyright © 2013 John Wiley & Sons, Ltd.

  17. Active pharmaceutical ingredient (API) production involving continuous processes--a process system engineering (PSE)-assisted design framework.

    PubMed

    Cervera-Padrell, Albert E; Skovby, Tommy; Kiil, Søren; Gani, Rafiqul; Gernaey, Krist V

    2012-10-01

    A systematic framework is proposed for the design of continuous pharmaceutical manufacturing processes. Specifically, the design framework focuses on organic chemistry based, active pharmaceutical ingredient (API) synthetic processes, but it could potentially be extended to biocatalytic and fermentation-based products. The method exploits the synergistic combination of continuous flow technologies (e.g., microfluidic techniques) and process systems engineering (PSE) methods and tools for faster process design and increased process understanding throughout the whole drug product and process development cycle. The design framework structures the many different and challenging design problems (e.g., solvent selection, reactor design, and design of separation and purification operations), driving the user from the initial drug discovery steps, where process knowledge is very limited, toward detailed design and analysis. Examples from the literature of PSE methods and tools applied to pharmaceutical process design and to novel pharmaceutical production technologies are provided throughout the text, assisting in the accumulation and interpretation of process knowledge. Different criteria are suggested for the selection of batch and continuous processes so that the whole design results in low capital and operational costs as well as a low environmental footprint. The design framework has been applied to the retrofit of an existing batch-wise process used by H. Lundbeck A/S to produce an API, zuclopenthixol. Some of its batch operations were successfully converted to continuous mode, obtaining higher yields that allowed a significant simplification of the whole process. The material and environmental footprint of the process, evaluated through the process mass intensity index (kg of material used per kg of product), was reduced to half of its initial value, with potential for further reduction. The case study includes reaction steps typically used by the pharmaceutical industry, featuring different characteristic reaction times, as well as L-L separation and distillation-based solvent exchange steps, and thus constitutes a good example of how the design framework can be used to efficiently design novel or existing API manufacturing processes that take advantage of continuous processing. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Direct infusion mass spectrometry metabolomics dataset: a benchmark for data processing and quality control

    PubMed Central

    Kirwan, Jennifer A; Weber, Ralf J M; Broadhurst, David I; Viant, Mark R

    2014-01-01

    Direct-infusion mass spectrometry (DIMS) metabolomics is an important approach for characterising molecular responses of organisms to disease, drugs and the environment. Increasingly large-scale metabolomics studies are being conducted, necessitating improvements in both bioanalytical and computational workflows to maintain data quality. This dataset represents a systematic evaluation of the reproducibility of a multi-batch DIMS metabolomics study of cardiac tissue extracts. It comprises twenty biological samples (cow vs. sheep) that were analysed repeatedly, in 8 batches across 7 days, together with a concurrent set of quality control (QC) samples. Data are presented from each step of the workflow and are available in MetaboLights. The strength of the dataset is that intra- and inter-batch variation can be corrected using QC spectra and the quality of this correction assessed independently using the repeatedly measured biological samples. Originally designed to test the efficacy of a batch-correction algorithm, it will enable others to evaluate novel data processing algorithms. Furthermore, this dataset serves as a benchmark for DIMS metabolomics, derived using best-practice workflows and rigorous quality assessment. PMID:25977770
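
    Intra- and inter-batch correction with QC samples, as described above, is often performed by scaling each feature within a batch against the QC signal for that batch. The sketch below shows one simple variant (per-batch QC median scaling) on synthetic data; it is not the specific batch-correction algorithm evaluated with this dataset.

```python
# Minimal QC-based batch correction: scale each feature within a batch by the median of
# the QC samples in that batch, relative to the global QC median. One simple approach
# among several published methods; all data here are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n, p = 96, 5
df = pd.DataFrame(rng.lognormal(mean=2.0, sigma=0.3, size=(n, p)),
                  columns=[f"feature_{i}" for i in range(p)])
df["batch"] = np.repeat(np.arange(8), n // 8)      # 8 batches of 12 injections
df["is_qc"] = (np.arange(n) % 12 == 0)             # one QC injection per batch

features = [c for c in df.columns if c.startswith("feature_")]
global_qc_median = df.loc[df["is_qc"], features].median()

corrected = df.copy()
for b, batch_df in df.groupby("batch"):
    batch_qc_median = batch_df.loc[batch_df["is_qc"], features].median()
    factor = global_qc_median / batch_qc_median    # per-feature correction factor
    corrected.loc[batch_df.index, features] = batch_df[features] * factor

print(corrected[features].head())
```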

  19. Fermentation of Saccharomyces cerevisiae - Combining kinetic modeling and optimization techniques points out avenues to effective process design.

    PubMed

    Scheiblauer, Johannes; Scheiner, Stefan; Joksch, Martin; Kavsek, Barbara

    2018-09-14

    A combined experimental/theoretical approach is presented for improving the predictability of Saccharomyces cerevisiae fermentations. In particular, a mathematical model was developed that explicitly takes into account the main mechanisms of the fermentation process, allowing for continuous computation of key process variables, including the biomass concentration and the respiratory quotient (RQ). For model calibration and experimental validation, batch and fed-batch fermentations were carried out. Comparison of the model-predicted biomass concentrations and RQ developments with the corresponding experimentally recorded values shows remarkably good agreement for both batch and fed-batch processes, confirming the adequacy of the model. Furthermore, sensitivity studies were performed to identify the model parameters whose variations have significant effects on the model predictions: the model responds with significant sensitivity to variations in only six parameters. These studies provide a valuable basis for model reduction, as also demonstrated in this paper. Finally, optimization-based parametric studies demonstrate how the model can be utilized to improve the efficiency of Saccharomyces cerevisiae fermentations. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Reversible logic gates on Physarum Polycephalum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schumann, Andrew

    2015-03-10

    In this paper, we consider how asynchronous sequential logic gates and quantum-style reversible logic gates can be implemented on Physarum polycephalum motions. We show that in asynchronous sequential logic gates we can erase information because of uncertainty in the direction of plasmodium propagation. Quantum-style reversible logic gates are therefore preferable for designing logic circuits on Physarum polycephalum.
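
    The distinction drawn above (information erasure in ordinary gates versus reversibility) can be checked mechanically: an irreversible gate maps several inputs to the same output, whereas a reversible gate is a bijection on its inputs. The small Python check below is generic and is not tied to the Physarum implementation.

```python
# Check reversibility by testing whether a gate's truth table is one-to-one.
# AND erases information (two input bits map to one output bit), while the Toffoli
# (controlled-controlled-NOT) gate maps 3 bits to 3 bits bijectively.
from itertools import product

def and_gate(a, b):
    return (a & b,)

def toffoli(a, b, c):
    return (a, b, c ^ (a & b))

def is_reversible(gate, n_inputs):
    outputs = [gate(*bits) for bits in product((0, 1), repeat=n_inputs)]
    return len(set(outputs)) == len(outputs)   # reversible iff all outputs are distinct

print(is_reversible(and_gate, 2))   # False: information is erased
print(is_reversible(toffoli, 3))    # True: no information loss
```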

  1. Functions Represented as Linear Sequential Data: Relationships between Presentation and Student Responses

    ERIC Educational Resources Information Center

    Ayalon, Michal; Watson, Anne; Lerman, Steve

    2015-01-01

    This study investigates students' ways of attending to linear sequential data in two tasks, and conjectures possible relationships between those ways and elements of the task design. Drawing on the substantial literature about such situations, we focus for this paper on linear rate of change, and on covariation and correspondence approaches to…

  2. Beyond Grand Rounds: A Comprehensive and Sequential Intervention to Improve Identification of Delirium

    ERIC Educational Resources Information Center

    Ramaswamy, Ravishankar; Dix, Edward F.; Drew, Janet E.; Diamond, James J.; Inouye, Sharon K.; Roehl, Barbara J. O.

    2011-01-01

    Purpose of the Study: Delirium is a widespread concern for hospitalized seniors, yet is often unrecognized. A comprehensive and sequential intervention (CSI) aiming to effect change in clinician behavior by improving knowledge about delirium was tested. Design and Methods: A 2-day CSI program that consisted of progressive 4-part didactic series,…

  3. Applying quality by design (QbD) concept for fabrication of chitosan coated nanoliposomes.

    PubMed

    Pandey, Abhijeet P; Karande, Kiran P; Sonawane, Raju O; Deshmukh, Prashant K

    2014-03-01

    In the present investigation, a quality by design (QbD) strategy was successfully applied to the fabrication of chitosan-coated nanoliposomes (CH-NLPs) encapsulating a hydrophilic drug. The effects of the processing variables on the particle size, encapsulation efficiency (%EE) and coating efficiency (%CE) of CH-NLPs (prepared using a modified ethanol injection method) were investigated. The concentrations of lipid, cholesterol, drug and chitosan; the stirring speed; the sonication time; the organic:aqueous phase ratio; and the temperature were identified as the key factors after risk analysis for conducting a screening design study. A separate study was designed to investigate the robustness of the predicted design space. The particle size, %EE and %CE of the optimized CH-NLPs were 111.3 nm, 33.4% and 35.2%, respectively. The observed responses were in accordance with the predicted responses, confirming the suitability and robustness of the design space for CH-NLP formulation. In conclusion, optimization of the selected key variables will help minimize the problems related to size, %EE and %CE that are generally encountered when scaling up processes for NLP formulations. The robustness of the design space will help minimize both intra-batch and inter-batch variations, which are quite common in the pharmaceutical industry.

  4. Aerospace electrode line

    NASA Technical Reports Server (NTRS)

    Miller, L.

    1980-01-01

    A facility which produces electrodes for spacecraft power supplies is described. The electrode assembly procedures are discussed. A number of design features in the production process are reported, including a batch operation mode and an independent equipment module design approach for transferring the electrode materials from process tank to process tank.

  5. Model-Based Optimal Experimental Design for Complex Physical Systems

    DTIC Science & Technology

    2015-12-03

    ...magnitude reduction in estimator error required to make solving the exact optimal design problem tractable. Instead of using a naive...for designing a sequence of experiments uses suboptimal approaches: batch design that has no feedback, or greedy (myopic) design that optimally...Equation 1 is difficult to solve directly, but can be expressed in an equivalent form using the principle of dynamic programming.

  6. Novel Designs of Quantum Reversible Counters

    NASA Astrophysics Data System (ADS)

    Qi, Xuemei; Zhu, Haihong; Chen, Fulong; Zhu, Junru; Zhang, Ziyang

    2016-11-01

    Reversible logic, an interesting and important topic, has been widely used in designing combinational and sequential circuits for low-power and high-speed computation. Although a significant amount of work has been done on reversible combinational logic, the realization of reversible sequential circuits is still at a premature stage. Reversible counters are not only an important part of sequential circuits but also an essential part of quantum circuit systems. In this paper, we design two kinds of novel reversible counters. To construct the counters, an innovative reversible T flip-flop gate (TFG), a T flip-flop block (T_FF) and a JK flip-flop block (JK_FF) are proposed. Based on these blocks and some existing reversible gates, a 4-bit binary-coded decimal (BCD) counter and a controlled up/down synchronous counter are designed. The counters were modeled and verified using the Verilog hardware description language (Verilog HDL), and the simulation results validate their logic structures. Compared with existing designs in terms of quantum cost (QC), delay (DL) and garbage outputs (GBO), the proposed designs perform better. They can therefore be used as important storage components in future low-power computing systems.

  7. Enhanced biodiesel production in Neochloris oleoabundans by a semi-continuous process in two stage photobioreactors.

    PubMed

    Yoon, Se Young; Hong, Min Eui; Chang, Won Seok; Sim, Sang Jun

    2015-07-01

    Under autotrophic conditions, highly productive biodiesel production was achieved using a semi-continuous culture system in Neochloris oleoabundans. In particular, flue gas generated by the combustion of liquefied natural gas and natural solar radiation were used for a cost-effective microalgal culture system. In the semi-continuous culture, the greater part (~80%) of the culture volume, containing vegetative cells grown under nitrogen-replete conditions in a first photobioreactor (PBR), was transferred directly to a second PBR and cultured sequentially under nitrogen-depleted conditions to accelerate oil accumulation. As a result, in the semi-continuous culture, the biomass and biodiesel productivities of the cells were increased by 58% (growth phase) and 51% (induction phase), respectively, compared with cells in batch culture. The semi-continuous culture system using two-stage photobioreactors is a very efficient strategy for further improving biodiesel production from microalgae under photoautotrophic conditions.

  8. Chemical reactions simulated by ground-water-quality models

    USGS Publications Warehouse

    Grove, David B.; Stollenwerk, Kenneth G.

    1987-01-01

    Recent literature concerning the modeling of chemical reactions during transport in ground water is examined with emphasis on sorption reactions. The theory of transport and reactions in porous media has been well documented. Numerous equations have been developed from this theory, to provide both continuous and sequential or multistep models, with the water phase considered for both mobile and immobile phases. Chemical reactions can be either equilibrium or non-equilibrium, and can be quantified in linear or non-linear mathematical forms. Non-equilibrium reactions can be separated into kinetic and diffusional rate-limiting mechanisms. Solutions to the equations are available by either analytical expressions or numerical techniques. Saturated and unsaturated batch, column, and field studies are discussed with one-dimensional, laboratory-column experiments predominating. A summary table is presented that references the various kinds of models studied and their applications in predicting chemical concentrations in ground waters.

  9. Effects of ammonium on uranium partitioning and kaolinite mineral dissolution.

    PubMed

    Emerson, Hilary P; Di Pietro, Silvina; Katsenovich, Yelena; Szecsody, Jim

    2017-02-01

    Ammonia gas injection is a promising technique for the remediation of uranium within the vadose zone. It can be used to manipulate the pH of a system and cause co-precipitation processes that are expected to remove uranium from the aqueous phase and decrease leaching from the solid phase. The work presented in this paper explores the effects of ammonium and sodium hydroxide on the partitioning of uranium and dissolution of the kaolinite mineral in simplified synthetic groundwaters using equilibrium batch sorption and sequential extraction experiments. It shows that there is a significant increase in uranium removal in systems with divalent cations present in the aqueous phase but not in sodium chloride synthetic groundwaters. Further, the initial conditions of the aqueous phase do not affect the dissolution of kaolinite. However, the type of base treatment does have an effect on mineral dissolution. Published by Elsevier Ltd.

  10. Perspectives on anaerobic treatment in developing countries.

    PubMed

    Foresti, E

    2001-01-01

    Developing countries occupy regions where the climate is warm most of the time. Even in sub-tropical areas, low temperatures do not persist for long periods. This is the main factor that makes anaerobic technology applicable and less expensive, even for the treatment of low-strength industrial wastewaters and domestic sewage. Based mainly on papers presented at the "VI Latin-American Workshop and Seminar on Anaerobic Digestion" held in Recife, Brazil, in November 2000, this text discusses the prospects for anaerobic treatment of wastewaters in developing countries. Emphasis is given to domestic sewage treatment and to the use of compact systems in which sequential batch reactors (SBR) or dissolved-air flotation (DAF) systems are applied for the post-treatment of anaerobic reactor effluents. Experiments at bench and pilot scale have indicated that these systems can achieve high performance in removing organic matter and nutrients during the treatment of domestic sewage at ambient temperatures.

  11. Fermentation of biomass sugars to ethanol using native industrial yeast strains.

    PubMed

    Yuan, Dawei; Rao, Kripa; Relue, Patricia; Varanasi, Sasidhar

    2011-02-01

    In this paper, the feasibility of a technology for fermenting sugar mixtures representative of cellulosic biomass hydrolyzates with native industrial yeast strains is demonstrated. This paper explores the isomerization of xylose to xylulose using a bi-layered enzyme pellet system capable of sustaining a micro-environmental pH gradient. This ability allows for considerable flexibility in conducting the isomerization and fermentation steps. With this method, the isomerization and fermentation could be conducted sequentially, in fed-batch, or simultaneously to maximize utilization of both C5 and C6 sugars and ethanol yield. This system takes advantage of a pH-dependent complexation of xylulose with a supplemented additive to achieve up to 86% isomerization of xylose at fermentation conditions. Commercially-proven Saccharomyces cerevisiae strains from the corn-ethanol industry were used and shown to be very effective in implementation of the technology for ethanol production. Copyright © 2010 Elsevier Ltd. All rights reserved.

  12. Combining microwave resonance technology to multivariate data analysis as a novel PAT tool to improve process understanding in fluid bed granulation.

    PubMed

    Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk

    2011-08-01

    A set of 192 fluid bed granulation batches at industrial scale were monitored in-line using microwave resonance technology (MRT) to determine the moisture, temperature and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA) and multivariate batch control charts were applied to the collected batch data sets. The combination of these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonal effect was revealed that impacted further processing through its influence on the final granule size. Moreover, it was demonstrated by means of PLS that a relation between the particle size and the MRT measurements can be quantitatively defined, highlighting the potential ability of the MRT sensor to predict information about the final granule size. This study has contributed to improving a fluid bed granulation process, and the process knowledge obtained shows that product quality can be built in by process design, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles. Copyright © 2011. Published by Elsevier B.V.

  13. Enhancing the biocompatibility of microfluidics-assisted fabrication of cell-laden microgels with channel geometry.

    PubMed

    Kim, Suntae; Oh, Jonghyun; Cha, Chaenyung

    2016-11-01

    Microfluidic flow-focusing devices (FFD) are widely used to generate monodisperse droplets and microgels with controllable size, shape and composition for various biomedical applications. However, the highly inconsistent and often low viability of cells encapsulated within microgels prepared via microfluidic FFD has been a major concern, and yet this aspect has not been systematically explored. In this study, we demonstrate that the biocompatibility of microfluidic FFD for fabricating cell-laden microgels can be significantly enhanced by controlling the channel geometry. When a single-emulsion ('single') microfluidic FFD is used to fabricate cell-laden microgels, there is a significant decrease and batch-to-batch variability in cell viability, regardless of their size and composition. It is determined that during droplet generation some of the cells are exposed to the oil phase, which is shown to have a cytotoxic effect. Therefore, a microfluidic device with sequential ('double') flow-focusing channels is employed instead, in which a secondary aqueous phase containing cells enters the primary aqueous phase, so the cells' exposure to the oil phase is minimized by directing them to the center of the droplets. This microfluidic channel geometry significantly enhances the biocompatibility of cell-laden microgels, while maintaining the benefits of a typical microfluidic process. This study therefore provides a simple and yet highly effective strategy to improve the biocompatibility of microfluidic fabrication of cell-laden microgels. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Treatment of Slaughter House Wastewater in a Sequencing Batch Reactor: Performance Evaluation and Biodegradation Kinetics

    PubMed Central

    Kundu, Pradyut; Debsarkar, Anupam; Mukherjee, Somnath

    2013-01-01

    Slaughterhouse wastewater contains diluted blood, protein, fat, and suspended solids; as a result, the organic and nutrient concentrations in this wastewater are very high, and the residues are partially solubilized, leading to a highly contaminating effect on riverbeds and other water bodies if it is discharged untreated. The performance of a laboratory-scale Sequencing Batch Reactor (SBR) has been investigated in aerobic-anoxic sequential mode for simultaneous removal of organic carbon and nitrogen from slaughterhouse wastewater. The reactor was operated under three different variations of the aerobic-anoxic sequence, namely (4+4), (5+3), and (3+5) hr of total react period, with two different sets of influent soluble COD (SCOD) and ammonia nitrogen (NH4+-N) levels: 1000 ± 50 mg/L with 90 ± 10 mg/L, and 1000 ± 50 mg/L with 180 ± 10 mg/L, respectively. It was observed that 86 to 95% SCOD removal was accomplished by the end of the 8.0 hr total react period. In the case of the (4+4) aerobic-anoxic operating cycle, a reasonable degree of nitrification, 90.12% and 74.75% corresponding to initial NH4+-N values of 96.58 and 176.85 mg/L, respectively, was achieved. The biokinetic coefficients (k, Ks, Y, kd) were also determined to evaluate SBR performance and to support scaling up to a full-scale reactor in future operation. PMID:24027751
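
    The biokinetic coefficients listed above (k, Ks, Y, kd) are the usual Monod-type parameters. The sketch below shows how they enter a simple substrate/biomass balance for a single react period; the parameter values are illustrative, not the coefficients determined in the study.

```python
# Simple Monod-type substrate/biomass balance for one SBR react period.
# k (max specific substrate utilization), Ks, Y and kd are illustrative values,
# not the coefficients determined in the study.
from scipy.integrate import solve_ivp

k, Ks, Y, kd = 4.0, 60.0, 0.5, 0.05   # 1/d, mg/L, mg VSS/mg COD, 1/d (hypothetical)

def rhs(t, y):
    S, X = y                            # substrate (SCOD) and biomass, mg/L
    r_su = k * S * X / (Ks + S)         # Monod substrate utilization rate
    dSdt = -r_su
    dXdt = Y * r_su - kd * X            # growth minus endogenous decay
    return [dSdt, dXdt]

sol = solve_ivp(rhs, (0.0, 8.0 / 24.0), [1000.0, 2500.0])   # 8 h react period, time in days
print(sol.y[0, -1])   # residual SCOD at the end of the react period
```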

  15. Optimization of sodium hydroxide pretreatment and enzyme loading for efficient hydrolysis of rice straw to improve succinate production by metabolically engineered Escherichia coli KJ122 under simultaneous saccharification and fermentation.

    PubMed

    Sawisit, Apichai; Jampatesh, Surawee; Jantama, Sirima Suvarnakuta; Jantama, Kaemwich

    2018-07-01

    Rice straw was pretreated with sodium hydroxide (NaOH) before being used for succinate production by Escherichia coli KJ122 under simultaneous saccharification and fermentation (SSF). NaOH pretreatment of the rice straw significantly enhanced lignin removal, up to 95%. With the optimized enzyme loading of 4% cellulase complex + 0.5% xylanase (endo-glucanase 67 CMC-U/g, β-glucosidase 26 pNG-U/g and xylanase 18 CMC-U/g dry biomass), total sugar conversion reached 91.7 ± 0.8% (w/w). Physicochemical analysis of the NaOH-pretreated rice straw indicated dramatic changes in its structure, thereby favoring enzymatic saccharification. In batch SSF, succinate production of 69.8 ± 0.3 g/L, with a yield of 0.84 g/g pretreated rice straw and a productivity of 0.76 ± 0.02 g/L/h, was obtained. Fed-batch SSF significantly improved the succinate concentration and productivity to 103.1 ± 0.4 g/L and 1.37 ± 0.07 g/L/h with a comparable yield. The results demonstrate the feasibility of sequential saccharification and fermentation of rice straw as a promising process for succinate production at industrial scale. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. Soft robot design methodology for `push-button' manufacturing

    NASA Astrophysics Data System (ADS)

    Paik, Jamie

    2018-06-01

    `Push-button' or fully automated manufacturing would enable the production of robots with zero intervention from human hands. Realizing this utopia requires a fundamental shift from a sequential (design-materials-manufacturing) to a concurrent design methodology.

  17. Using timed event sequential data in nursing research.

    PubMed

    Pecanac, Kristen E; Doherty-King, Barbara; Yoon, Ju Young; Brown, Roger; Schiefelbein, Tony

    2015-01-01

    Measuring behavior is important in nursing research, and innovative technologies are needed to capture the "real-life" complexity of behaviors and events. The purpose of this article is to describe the use of timed event sequential data in nursing research and to demonstrate the use of this data in a research study. Timed event sequencing allows the researcher to capture the frequency, duration, and sequence of behaviors as they occur in an observation period and to link the behaviors to contextual details. Timed event sequential data can easily be collected with handheld computers, loaded with a software program designed for capturing observations in real time. Timed event sequential data add considerable strength to analysis of any nursing behavior of interest, which can enhance understanding and lead to improvement in nursing practice.

  18. Speech Perception and Production by Sequential Bilingual Children: A Longitudinal Study of Voice Onset Time Acquisition

    PubMed Central

    McCarthy, Kathleen M; Mahon, Merle; Rosen, Stuart; Evans, Bronwen G

    2014-01-01

    The majority of bilingual speech research has focused on simultaneous bilinguals. Yet, in immigrant communities, children are often initially exposed to their family language (L1), before becoming gradually immersed in the host country's language (L2). This is typically referred to as sequential bilingualism. Using a longitudinal design, this study explored the perception and production of the English voicing contrast in 55 children (40 Sylheti-English sequential bilinguals and 15 English monolinguals). Children were tested twice: when they were in nursery (52-month-olds) and 1 year later. Sequential bilinguals' perception and production of English plosives were initially driven by their experience with their L1, but after starting school, changed to match that of their monolingual peers. PMID:25123987

  19. Interactive computer methods for generating mineral-resource maps

    USGS Publications Warehouse

    Calkins, James Alfred; Crosby, A.S.; Huffman, T.E.; Clark, A.L.; Mason, G.T.; Bascle, R.J.

    1980-01-01

    Inasmuch as maps are a basic tool of geologists, the U.S. Geological Survey's CRIB (Computerized Resources Information Bank) was constructed so that the data it contains can be used to generate mineral-resource maps. However, by the standard methods used (batch processing and off-line plotting), the production of a finished map commonly takes 2-3 weeks. To produce computer-generated maps more rapidly, cheaply, and easily, and also to provide an effective demonstration tool, we have devised two related methods for plotting maps as alternatives to conventional batch methods. These methods are: 1. Quick-Plot, an interactive program whose output appears on a CRT (cathode-ray-tube) device, and 2. the Interactive CAM (Cartographic Automatic Mapping) system, which combines batch and interactive runs. The output of the Interactive CAM system is final compilation (not camera-ready) paper copy. Both methods are designed to use data from the CRIB file in conjunction with a map-plotting program. Quick-Plot retrieves a user-selected subset of data from the CRIB file, immediately produces an image of the desired area on a CRT device, and plots data points according to a limited set of user-selected symbols. This method is useful for immediate evaluation of the map and for demonstrating how trial maps can be made quickly. The Interactive CAM system links the output of an interactive CRIB retrieval to a modified version of the CAM program, which runs in batch mode and stores plotting instructions on a disk rather than on a tape. The disk can be accessed by a CRT, and thus the user can view and evaluate the map output immediately after a batch run, without waiting 1-3 days for an off-line plot. The user can therefore do most of the layout and design work in a relatively short time by use of the CRT, before generating a plot tape and having the map plotted on an off-line plotter.

  20. Meta-analyses and adaptive group sequential designs in the clinical development process.

    PubMed

    Jennison, Christopher; Turnbull, Bruce W

    2005-01-01

    The clinical development process can be viewed as a succession of trials, possibly overlapping in calendar time. The design of each trial may be influenced by results from previous studies and other ongoing trials, as well as by external information. Results from all of these trials must be considered together in order to assess the efficacy and safety of the proposed new treatment. Meta-analysis techniques provide a formal way of combining the information. We examine how such methods can be used to combine results from: (1) a collection of separate studies, (2) a sequence of studies in an organized development program, and (3) stages within a single study using a (possibly adaptive) group sequential design. We present two examples. The first example concerns combining results from a Phase IIb trial using several dose levels or treatment arms with those of the Phase III trial comparing the treatment selected in Phase IIb against a control. This enables a "seamless transition" from Phase IIb to Phase III. The second example examines the use of combination tests to analyze data from an adaptive group sequential trial.
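
    The simplest formal way of combining study-level results, as discussed above, is a fixed-effect inverse-variance meta-analysis. The sketch below applies that pooling to made-up effect estimates and standard errors; it is a generic illustration, not the authors' worked examples.

```python
# Fixed-effect inverse-variance meta-analysis of study-level treatment effects.
# The effect estimates and standard errors are made up for illustration only.
import numpy as np

effects = np.array([0.30, 0.22, 0.41])   # hypothetical study-level estimates
ses     = np.array([0.12, 0.10, 0.18])   # hypothetical standard errors

w = 1.0 / ses**2                          # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)  # pooled effect estimate
pooled_se = np.sqrt(1.0 / np.sum(w))      # standard error of the pooled estimate

print(pooled, pooled_se)
```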

  1. Development of a low-flow multiplexed interface for capillary electrophoresis/electrospray ion trap mass spectrometry using sequential spray.

    PubMed

    Chen, Chao-Jung; Li, Fu-An; Her, Guor-Rong

    2008-05-01

    A multiplexed CE-MS interface using four low-flow sheath liquid ESI sprayers has been developed. Because of the limited space between the low-flow sprayers and the entrance aperture of the ESI source, multichannel analysis is difficult with conventional rotating-plate approaches. Instead, a multiplexed low-flow system was achieved by applying the ESI potential sequentially to the four low-flow sprayers, so that only one sprayer sprays at any given time. Synchronization of the scan event and the voltage relays was accomplished by using the data acquisition signal from the IT mass spectrometer. This synchronization resulted in the ESI voltage being applied sequentially to each of the four sprayers according to the corresponding scan event. With this design, a four-fold increase in analytical throughput was achieved. Because of the use of low-flow interfaces, this multiplexed system has higher sensitivity than a rotating-plate design using conventional sheath liquid interfaces. The multiplexed design presented here has the potential to be applied to other low-flow multiplexed systems, such as multiplexed capillary LC and multiplexed CEC.

  2. The consistency approach for quality control of vaccines - a strategy to improve quality control and implement 3Rs.

    PubMed

    De Mattia, Fabrizio; Chapsal, Jean-Michel; Descamps, Johan; Halder, Marlies; Jarrett, Nicholas; Kross, Imke; Mortiaux, Frederic; Ponsar, Cecile; Redhead, Keith; McKelvie, Jo; Hendriksen, Coenraad

    2011-01-01

    Current batch release testing of established vaccines emphasizes quality control of the final product and is often characterized by extensive use of animals. This report summarises the discussions of a joint ECVAM/EPAA workshop on the applicability of the consistency approach for routine release of human and veterinary vaccines and its potential to reduce animal use. The consistency approach is based on thorough characterization of the vaccine during development and on the principle that the quality of subsequent batches is the consequence of the strict application of a quality system and of consistent production of batches. The concept of consistency of production is state of the art for new-generation vaccines, where batch release is mainly based on non-animal methods. There is now the opportunity to introduce the approach into established vaccine production, where it has the potential to replace in vivo tests with non-animal tests designed to demonstrate batch quality while maintaining the highest quality standards. The report indicates how this approach may be further developed for application to established human and veterinary vaccines and emphasizes the continuing need for co-ordination and harmonization. It also gives recommendations for work to be undertaken in order to encourage acceptance and implementation of the consistency approach. Copyright © 2011. Published by Elsevier Ltd. All rights reserved.

  3. Use of Model-Based Nutrient Feeding for Improved Production of Artemisinin by Hairy Roots of Artemisia Annua in a Modified Stirred Tank Bioreactor.

    PubMed

    Patra, Nivedita; Srivastava, Ashok K

    2015-09-01

    Artemisinin has been indicated to be a potent drug for the treatment of malaria. Batch growth and artemisinin production kinetics of hairy root cultures of Artemisia annua were studied under shake flask conditions, resulting in the accumulation of 12.49 g/L biomass and 0.27 mg/g artemisinin. Using the kinetic data, a mathematical model was identified to understand and optimize the system behavior. The developed model was then extrapolated to design nutrient feeding strategies during fed-batch cultivation for enhanced production of artemisinin. In one fed-batch cultivation, sucrose (37 g/L) was fed at a constant rate of 0.1 L/day during days 10-15, which led to an improved artemisinin accumulation of 0.77 mg/g. The second fed-batch strategy involved maintaining a pseudo-steady-state sucrose concentration (20.8 g/L) during days 10-15, which resulted in an artemisinin accumulation of 0.99 mg/g. Fed-batch cultivation of Artemisia annua hairy roots with maintenance of a pseudo-steady-state substrate concentration was thereafter implemented in a bioreactor, which yielded an artemisinin accumulation of 1.0 mg/g in 16 days of cultivation. This is the highest reported artemisinin yield for hairy root cultivation in a bioreactor.
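
    Maintaining a pseudo-steady-state substrate concentration in fed-batch mode amounts to feeding substrate at the rate it is consumed, which for a constant-concentration feed gives F = mu*X*V / (Y_xs*(S_feed - S_set)). The sketch below evaluates this expression; the set-point sucrose concentration (20.8 g/L) is taken from the abstract, while the other values, including the feed concentration, are hypothetical.

```python
# Feed rate that holds the substrate at a pseudo-steady-state concentration in fed-batch
# mode: F = mu * X * V / (Y_xs * (S_feed - S_set)). Only the set point comes from the
# abstract; all other values are hypothetical placeholders.

def pseudo_steady_feed(mu, X, V, Y_xs, S_feed, S_set):
    """Volumetric feed rate (L/day) that keeps the substrate at S_set."""
    return mu * X * V / (Y_xs * (S_feed - S_set))

mu   = 0.10     # specific growth rate, 1/day (hypothetical)
X    = 10.0     # biomass concentration, g/L (hypothetical)
V    = 1.0      # working volume, L (hypothetical)
Y_xs = 0.45     # biomass yield on sucrose, g/g (hypothetical)
S_feed, S_set = 40.0, 20.8   # feed concentration (hypothetical) and set point from the study, g/L

print(pseudo_steady_feed(mu, X, V, Y_xs, S_feed, S_set))   # L/day
```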

  4. Enhancing tablet disintegration characteristics of a highly water-soluble high-drug-loading formulation by granulation process.

    PubMed

    Pandey, Preetanshu; Levins, Christopher; Pafiakis, Steve; Zacour, Brian; Bindra, Dilbir S; Trinh, Jade; Buckley, David; Gour, Shruti; Sharif, Shasad; Stamato, Howard

    2018-07-01

    The objective of this study was to improve the disintegration and dissolution characteristics of a highly water-soluble tablet matrix by altering the manufacturing process. A high disintegration time, along with a high dependence of the disintegration time on tablet hardness, was observed for a high drug loading (70% w/w) API when formulated using a high-shear wet granulation (HSWG) process. Keeping the formulation composition mostly constant, a fluid-bed granulation (FBG) process was explored as an alternative granulation method using a 2^(4-1) fractional factorial design with two center points. Ten FBG batches were manufactured with varying disintegrant amount, spray rate, inlet temperature (T) and atomization air pressure. The resultant final blend particle size was affected significantly by spray rate (p = .0009), inlet T (p = .0062), atomization air pressure (p = .0134) and the interaction between inlet T and spray rate (p = .0241). The compactibility of the final blend was affected significantly by disintegrant amount (p < .0001), atomization air pressure (p = .0013) and spray rate (p = .05). The fluid-bed batches gave significantly lower disintegration times than the HSWG batches, and mercury intrusion porosimetry data revealed that this was caused by the higher internal porosity of tablets manufactured from the FBG batches.
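
    A 2^(4-1) fractional factorial design like the one used here can be written down directly: take a full 2^3 design in three factors and set the fourth factor equal to the generator D = ABC, then append the center points. The construction below is generic (coded -1/0/+1 units) and does not reproduce the study's actual run order or factor levels.

```python
# Build a 2^(4-1) fractional factorial design (generator D = ABC) with 2 center points,
# in coded units. Generic construction; not the study's run order or factor levels.
import itertools
import numpy as np

base = np.array(list(itertools.product((-1, 1), repeat=3)))   # full 2^3 design in A, B, C
D = (base[:, 0] * base[:, 1] * base[:, 2]).reshape(-1, 1)     # generator: D = ABC
design = np.hstack([base, D])                                 # 8 factorial runs, 4 factors
design = np.vstack([design, np.zeros((2, 4))])                # append 2 center points

for run in design:
    print(run)   # 8 factorial runs + 2 center points
```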

  5. Acceptance Test Data for BWXT Coated Particle Batch 93164A Defective IPyC Fraction and Pyrocarbon Anisotropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helmreich, Grant W.; Hunn, John D.; Skitt, Darren J.

    2017-02-01

    Coated particle fuel batch J52O-16-93164 was produced by Babcock and Wilcox Technologies (BWXT) for possible selection as fuel for the Advanced Gas Reactor Fuel Development and Qualification (AGR) Program's AGR-5/6/7 irradiation test in the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), or may be used as demonstration production-scale coated particle fuel for other experiments. The tristructural-isotropic (TRISO) coatings were deposited in a 150-mm-diameter production-scale fluidized-bed chemical vapor deposition (CVD) furnace onto 425-μm-nominal-diameter spherical kernels from BWXT lot J52L-16-69316. Each kernel contained a mixture of 15.5%-enriched uranium carbide and uranium oxide (UCO) and was coated with four consecutive CVD layers: a ~50% dense carbon buffer layer with 100-μm-nominal thickness, a dense inner pyrolytic carbon (IPyC) layer with 40-μm-nominal thickness, a silicon carbide (SiC) layer with 35-μm-nominal thickness, and a dense outer pyrolytic carbon (OPyC) layer with 40-μm-nominal thickness. The TRISO-coated particle batch was sieved to upgrade the particles by removing over-sized and under-sized material, and the upgraded batch was designated by appending the letter A to the end of the batch number (i.e., 93164A).

  6. Speech Perception and Production by Sequential Bilingual Children: A Longitudinal Study of Voice Onset Time Acquisition

    ERIC Educational Resources Information Center

    McCarthy, Kathleen M.; Mahon, Merle; Rosen, Stuart; Evans, Bronwen G.

    2014-01-01

    The majority of bilingual speech research has focused on simultaneous bilinguals. Yet, in immigrant communities, children are often initially exposed to their family language (L1), before becoming gradually immersed in the host country's language (L2). This is typically referred to as sequential bilingualism. Using a longitudinal design, this…

  7. Evaluating the parent-adolescent communication toolkit: Usability and preliminary content effectiveness of an online intervention.

    PubMed

    Toombs, Elaine; Unruh, Anita; McGrath, Patrick

    2018-01-01

    This study aimed to assess the Parent-Adolescent Communication Toolkit (PACT), an online intervention designed to help improve parents' communication with their adolescents. Participant preferences for two module delivery systems (sequential and unrestricted module access) were identified. Usability assessment of the PACT intervention was completed using pre-test and posttest comparisons. Usability data, including participant completion and satisfaction ratings, were examined. Parents (N = 18) of adolescents were randomized to a sequential or unrestricted chapter access group. Parent participants completed pre-test measures, the PACT intervention and posttest measures. Participants provided feedback on the intervention to improve the modules and provided usability ratings. Adolescent pre- and posttest ratings were evaluated. Usability ratings were high and parent feedback was positive. The sequential module access group rated the intervention content higher and completed more content than the unrestricted chapter access group, indicating support for the sequential access design. Parent mean posttest communication scores were significantly higher (p < .05) than pre-test scores. No significant differences were detected for adolescent participants. Findings suggest that the Parent-Adolescent Communication Toolkit has the potential to improve parent-adolescent communication, but further effectiveness assessment is required.

  8. Evaluation of Heat Recuperation in a Concentric Hydrogen Reduction Reactor

    NASA Technical Reports Server (NTRS)

    Linne, Diane; Kleinhenz, Julie; Hegde, Uday

    2012-01-01

    Heat recuperation in an ISRU reactor system involves the recovery of heat from a reacted regolith batch by transferring this energy into a batch of fresh regolith. One concept for a hydrogen reduction reactor is a concentric chamber design where heat is transferred from the inner, reaction chamber into fresh regolith in the outer, recuperation chamber. This concept was tested and analyzed to define the overall benefit compared to a more traditional single chamber batch reactor. Data was gathered for heat-up and recuperation in the inner chamber alone, simulating a single chamber design, as well as recuperation into the outer chamber, simulating a dual chamber design. Experimental data was also used to improve two analytical models, with good agreement for temperature behavior during recuperation, calculated mass of the reactor concepts, and energy required during heat-up. The five tests, performed using JSC-1A regolith simulant, also explored the effectiveness of helium gas fluidization, hydrogen gas fluidization, and vibrational fluidization. Results indicate that higher hydrogen volumetric flow rates are required compared to helium for complete fluidization and mixing, and that vibrational fluidization may provide equivalent mixing while eliminating the need to flow large amounts of excess hydrogen. Analysis of the total energy required for heat-up and steady-state operations for a variety of conditions and assumptions shows that the dual-chamber concept requires the same or more energy than the single chamber concept. With no clear energy savings, the added mass and complexity of the dual-chamber makes it unlikely that this design concept will provide any added benefit to the overall ISRU oxygen production system.

  9. Batch, design optimization, and DNA sequencing study for continuous 1,3-propanediol production from waste glycerol by a soil-based inoculum.

    PubMed

    Kanjilal, Baishali; Noshadi, Iman; Bautista, Eddy J; Srivastava, Ranjan; Parnas, Richard S

    2015-03-01

    1,3-Propanediol (1,3-PD) was produced with a robust fermentation process using waste glycerol feedstock from biodiesel production and a soil-based bacterial inoculum. An iterative inoculation method was developed to achieve independence from soil and to selectively breed bacterial populations capable of metabolizing glycerol to 1,3-PD. The inoculum showed high resistance to impurities in the feedstock. 1,3-PD selectivity and yield in batch fermentations were optimized by appropriate nutrient compositions and pH control. The batch yield of 1,3-PD was maximized to ~0.7 mol/mol for industrial glycerol, which was higher than that for pure glycerin. 16S rDNA sequencing results show a systematic selective enrichment of 1,3-PD-producing bacteria with iterative inoculation and subsequent process control. A statistical design of experiments was carried out on industrial glycerol batches to optimize conditions, which were then used to run two continuous flow stirred-tank reactor (CSTR) experiments over a period of >500 h each. A detailed analysis of steady states at three dilution rates is presented. Enhanced specific 1,3-PD productivity was observed at faster dilution rates due to lower levels of solvent degeneration. A 1,3-PD productivity, specific productivity, and yield of 1.1 g/L/h, 1.5 g/g/h, and 0.6 mol/mol of glycerol, respectively, were obtained at a dilution rate of 0.1 h(-1), which is bettered only by pure strains in pure glycerin feeds.
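
    At steady state in a CSTR, volumetric productivity is the dilution rate times the steady-state product concentration, and specific productivity divides that by the biomass concentration. The short sketch below applies these definitions; the concentrations used are hypothetical and chosen only to illustrate the arithmetic behind figures like those reported.

```python
# Steady-state CSTR definitions: volumetric productivity = D * P and
# specific productivity = D * P / X. The concentrations are hypothetical,
# chosen only so the arithmetic mirrors the order of magnitude reported above.
D = 0.1      # dilution rate, 1/h (as in the abstract)
P = 11.0     # steady-state 1,3-PD concentration, g/L (hypothetical)
X = 0.73     # steady-state biomass concentration, g/L (hypothetical)

productivity = D * P                  # g/(L*h)
specific_productivity = D * P / X     # g product per g biomass per h
print(productivity, specific_productivity)
```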

  10. Flipped clinical training: a structured training method for undergraduates in complete denture prosthesis.

    PubMed

    K, Anbarasi; K, Kasim Mohamed; Vijayaraghavan, Phagalvarthy; Kandaswamy, Deivanayagam

    2016-12-01

    To design and implement flipped clinical training for undergraduate dental students in removable complete denture treatment, and to predict its effectiveness by comparing the assessment results of students trained by the flipped and traditional methods. Flipped training was designed by shifting the learning from the clinic to the learning center (phase I) while preserving practice in the clinic (phase II). In phase I, a student-faculty interactive session was arranged to recap prior knowledge. This was followed by an audio-synchronized video demonstration of the procedure, displayed in a repeatable way, and a subsequent display of possible errors that may occur in treatment, with guidelines to overcome such errors. In phase II, a live demonstration of the procedure was given. Students were asked to treat three patients under an instructor's supervision. The summative assessment was conducted by applying the same checklist criteria and rubric scoring used for the traditional method. Assessment results of three batches of students trained by the flipped method (study group) and three traditionally trained previous batches (control group) were compared by chi-square test. The numbers of students who prepared acceptable dentures (scores 2 and 3) and unacceptable dentures (score 1), summed over the three traditionally trained batches, were compared with the corresponding sums for the three flipped-trained batches; the comparison revealed that the number of students who demonstrated competency by preparing acceptable dentures was higher for flipped training (χ2 = 30.996, p < 0.001). The results reveal the superiority of flipped training in enhancing students' competency, and it is hence recommended for training in various clinical procedures.
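
    The comparison reported above is a standard chi-square test on a 2x2 table of acceptable versus unacceptable outcomes by training method. The sketch below shows such a test with hypothetical counts; it does not reproduce the study's data or its reported statistic.

```python
# Chi-square test on a 2x2 contingency table: rows are training methods, columns are
# acceptable vs. unacceptable dentures. The counts are hypothetical, not the study's data.
from scipy.stats import chi2_contingency

table = [[150, 60],    # traditional: acceptable, unacceptable
         [190, 20]]    # flipped:     acceptable, unacceptable

chi2, p, dof, expected = chi2_contingency(table)
print(chi2, p, dof)
```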

  11. Development and optimization of locust bean gum and sodium alginate interpenetrating polymeric network of capecitabine.

    PubMed

    Upadhyay, Mansi; Adena, Sandeep Kumar Reddy; Vardhan, Harsh; Pandey, Sureshwar; Mishra, Brahmeshwar

    2018-03-01

    The objective of the study was to develop an interpenetrating polymeric network (IPN) of capecitabine (CAP) using the natural polymers locust bean gum (LBG) and sodium alginate (NaAlg). The IPN microbeads were optimized by a Box-Behnken design (BBD) to provide the anticipated particle size with good drug entrapment efficiency. The comparative dissolution profile of the IPN microbeads of CAP against the marketed preparation showed them to be an excellent sustained drug delivery vehicle. An ionotropic gelation method utilizing calcium ions (Ca2+) as the cross-linker was used to prepare the IPN microbeads. The optimization study was done by response surface methodology based on the Box-Behnken design. The effects of the factors on the responses of the optimized batch were shown through response surface and contour plots. The optimized batch was analyzed for particle size, % drug entrapment, pharmacokinetics and in vitro drug release, and was further characterized by FTIR, XRD, and SEM. To study the water uptake capacity and hydrodynamic activity of the polymers, swelling studies and viscosity measurements were performed, respectively. The particle size and % drug entrapment of the optimized batch were 494.37 ± 1.4 µm and 81.39 ± 2.9%, respectively, close to the values predicted by the Minitab 17 software. The in vitro drug release study showed sustained release of 92% over 12 h, following an anomalous drug release pattern. The derived pharmacokinetic parameters of the optimized batch were improved compared with pure CAP. Thus, the IPN microbeads of CAP proved to be an effective extended drug delivery vehicle for this water-soluble antineoplastic drug.
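
    A three-factor Box-Behnken design consists of the +/-1 combinations for each pair of factors, with the remaining factor held at its center level, plus replicate center points. The construction below is generic and in coded units; the study's actual factors and levels are not reproduced.

```python
# Three-factor Box-Behnken design in coded units: +/-1 combinations for each pair of
# factors with the remaining factor at 0, plus center points. Generic construction;
# the study's actual factors and levels are not reproduced here.
import itertools
import numpy as np

def box_behnken_3(center_points=3):
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0, 0, 0]
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0, 0, 0]] * center_points
    return np.array(runs)

design = box_behnken_3()
print(design.shape)   # (15, 3): 12 edge-midpoint runs + 3 center points
```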

  12. On-line Flagging of Anomalies and Adaptive Sequential Hypothesis Testing for Fine-feature Characterization of Geosynchronous Satellites

    NASA Astrophysics Data System (ADS)

    Chaudhary, A.; Payne, T.; Kinateder, K.; Dao, P.; Beecher, E.; Boone, D.; Elliott, B.

    The objective of on-line flagging in this paper is to perform interactive assessment of geosynchronous satellite anomalies such as cross-tagging of satellites in a cluster, solar panel offset changes, etc. This assessment will utilize a Bayesian belief propagation procedure and will include automated update of baseline signature data for the satellite, while accounting for seasonal changes. Its purpose is to enable an ongoing, automated assessment of satellite behavior through its life cycle using the photometry data collected during the synoptic search performed by a ground or space-based sensor as a part of its metrics mission. Changes in the satellite features will be reported along with the probabilities of Type I and Type II errors. The objective of adaptive sequential hypothesis testing in this paper is to define future sensor tasking for the purpose of characterizing fine features of the satellite. The tasking will be designed to maximize new information with the fewest photometry data points to be collected during the synoptic search by a ground or space-based sensor. Its calculation is based on information entropy techniques. The tasking is defined by considering a sequence of hypotheses regarding the fine features of the satellite. The optimal observation conditions are then ordered to maximize new information about a chosen fine feature. The combined objective of on-line flagging and adaptive sequential hypothesis testing is to progressively discover new information about the features of a geosynchronous satellite by leveraging the regular but sparse cadence of data collection during the synoptic search performed by a ground or space-based sensor.
    Automated Algorithm to Detect Changes in Geostationary Satellite's Configuration and Cross-Tagging. Phan Dao, Air Force Research Laboratory/RVB. By characterizing geostationary satellites based on photometry and color photometry, analysts can evaluate satellite operational status and affirm their true identity. The process of ingesting photometry data and deriving satellite physical characteristics can be directed by analysts in a batch mode, meaning using a batch of recent data, or by automated algorithms in an on-line mode in which the assessment is updated with each new data point. Tools used for detecting changes to a satellite's status or identity, whether operated with a human in the loop or as automated algorithms, are generally not built to detect with minimum latency and traceable confidence intervals. To alleviate those deficiencies, we investigate the use of Hidden Markov Models (HMM), in a Bayesian Network framework, to infer the hidden state (changed or unchanged) of a three-axis stabilized geostationary satellite using broadband and color photometry. Unlike frequentist statistics, which exploit only the stationary statistics of the observables in the database, HMM also exploits the temporal pattern of the observables. The algorithm also operates in a "learning" mode to gradually evolve the HMM and accommodate natural changes such as those due to the seasonal dependence of a GEO satellite's light curve. Our technique is designed to operate with missing color data. The version that ingests both panchromatic and color data can accommodate gaps in color photometry data. That attribute is important because, while color indices, e.g., Johnson R and B, enhance the belief (probability) of a hidden state, in real-world situations flux data are collected sporadically in untasked collects, and color data are limited and sometimes absent. Fluxes are measured with experimental error, whose effect on the algorithm will be studied. Photometry data in the AFRL's Geo Color Photometry Catalog and Geo Observations with Latitudinal Diversity Simultaneously (GOLDS) data sets are used to simulate a wide variety of operational changes and identity cross-tags. The algorithm is tested against simulated sequences of observed magnitudes, mimicking the cadence of untasked SSN and other ground sensors, occasional operational changes, and the possible occurrence of cross-tags of in-cluster satellites. We would like to show that the on-line algorithm can detect change, sometimes right after the first post-change data point is analyzed, i.e., with zero latency. We also want to show the unsupervised "learning" capability that allows the HMM to evolve with time without the user's assistance. For example, users are not required to "label" the true state of the data points.
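
    A minimal sketch of the kind of on-line belief update described above, assuming a two-state (unchanged/changed) hidden Markov model with Gaussian emission models for the observed magnitudes. All transition probabilities and emission parameters below are invented for illustration and are not taken from the AFRL algorithm.

        # Hedged sketch: on-line forward (filtering) update for a two-state HMM,
        # states = (unchanged, changed). All numbers are illustrative, not AFRL values.
        import numpy as np
        from scipy.stats import norm

        A = np.array([[0.99, 0.01],    # P(next state | current state)
                      [0.02, 0.98]])
        emission = [norm(loc=12.0, scale=0.2),   # magnitude model, unchanged state
                    norm(loc=12.6, scale=0.3)]   # magnitude model, changed state

        def update_belief(belief, magnitude):
            """One predictor-corrector step: propagate belief, then weight by likelihood."""
            predicted = A.T @ belief
            likelihood = np.array([m.pdf(magnitude) for m in emission])
            posterior = predicted * likelihood
            return posterior / posterior.sum()

        belief = np.array([0.999, 0.001])            # prior: almost certainly unchanged
        for mag in [12.0, 12.1, 11.9, 12.7, 12.6]:   # simulated sparse photometry
            belief = update_belief(belief, mag)
            print(f"obs={mag:.1f}  P(changed)={belief[1]:.3f}")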

  13. Simultaneous versus sequential optimal experiment design for the identification of multi-parameter microbial growth kinetics as a function of temperature.

    PubMed

    Van Derlinden, E; Bernaerts, K; Van Impe, J F

    2010-05-21

    Optimal experiment design for parameter estimation (OED/PE) has become a popular tool for efficient and accurate estimation of kinetic model parameters. When the kinetic model under study encloses multiple parameters, different optimization strategies can be constructed. The most straightforward approach is to estimate all parameters simultaneously from one optimal experiment (single OED/PE strategy). However, due to the complexity of the optimization problem or the stringent limitations on the system's dynamics, the experimental information can be limited and parameter estimation convergence problems can arise. As an alternative, we propose to reduce the optimization problem to a series of two-parameter estimation problems, i.e., an optimal experiment is designed for a combination of two parameters while presuming the other parameters known. Two different approaches can be followed: (i) all two-parameter optimal experiments are designed based on identical initial parameter estimates and parameters are estimated simultaneously from all resulting experimental data (global OED/PE strategy), and (ii) optimal experiments are calculated and implemented sequentially whereby the parameter values are updated intermediately (sequential OED/PE strategy). This work exploits OED/PE for the identification of the Cardinal Temperature Model with Inflection (CTMI) (Rosso et al., 1993). This kinetic model describes the effect of temperature on the microbial growth rate and encloses four parameters. The three OED/PE strategies are considered and the impact of the OED/PE design strategy on the accuracy of the CTMI parameter estimation is evaluated. Based on a simulation study, it is observed that the parameter values derived from the sequential approach deviate more from the true parameters than the single and global strategy estimates. The single and global OED/PE strategies are further compared based on experimental data obtained from design implementation in a bioreactor. Comparable estimates are obtained, but global OED/PE estimates are, in general, more accurate and reliable. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  14. Communication latencies of wireless devices suitable for time-critical messaging to anesthesia providers.

    PubMed

    Epstein, Richard H; Dexter, Franklin; Rothman, Brian

    2013-04-01

    Rapid and reliable methods of text communication to mobile anesthesia care providers are important to patient care and to efficient operating room management. Anesthesia departments are implementing automated methods to send text messages to mobile devices for abnormal vital signs, clinical recommendations, quality of care, and compliance or billing issues. The most time-critical communications determine maximum acceptable latencies. We studied the reliability of several alphanumeric messaging systems to identify an appropriate technology for such use. Latencies between message initiation and delivery to 3 alphanumeric paging devices were measured over weeks. Two devices used Internet pathways outside the hospital's local network with an external paging vendor (SkyTel). The third device used only the internal hospital network (Zetron). Sequential cell phone text page latencies were examined for lag-1 autocorrelation using the runs test, with results binned by hour and by day. Message latencies subsequently were batched in successive 1-week bins for calculation of the mean and 99th percentiles of latencies. We defined acceptance criteria as a mean latency <30 seconds and no more than 1 in 200 pages (0.5%) having a latency longer than 100 seconds. Cell phone texting was used as a positive control to assure that the analysis was appropriate, because such devices have (known) poor reliability during high network activity. There was substantial correlation among latencies for sequential cell phone text messages when binned by hours (P < 0.0001), but not by days (P = 0.61). The 2 devices using Internet pathways outside the hospital's network demonstrated unacceptable performance, with 1.3% and 33% of latencies exceeding 100 seconds, respectively. The device dependent only on the internal network had a mean latency of 8 seconds, with 100% of 40,200 pages having latencies <100 seconds. The findings suggest that the network used was the deciding factor. Developers of anesthesia communication systems need to measure latencies of proposed communication pathways and devices used to deliver urgent messages to mobile users. Similar evaluation is relevant for text pagers used on an ad hoc basis for delivery of time-critical notifications. Testing over a period of hours to days is adequate only for disqualification of a candidate paging system, because acceptable results are not necessarily indicative of long-term performance. Rather, weeks of testing are required, with appropriate batching of pages for analysis.
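
    The acceptance criteria quoted above (mean latency below 30 seconds and no more than 0.5% of pages exceeding 100 seconds) translate directly into a few lines of analysis. The sketch below applies them to a simulated week of latency measurements; the latency values are placeholders, not the study's data.

        # Hedged sketch: apply the stated acceptance criteria to one week of latencies.
        # The simulated latencies are placeholders, not the study's measurements.
        import numpy as np

        rng = np.random.default_rng(0)
        latencies_s = rng.lognormal(mean=2.0, sigma=0.5, size=5000)   # hypothetical pages

        mean_latency = latencies_s.mean()
        frac_over_100 = (latencies_s > 100.0).mean()
        p99 = np.percentile(latencies_s, 99)

        acceptable = (mean_latency < 30.0) and (frac_over_100 <= 0.005)
        print(f"mean = {mean_latency:.1f} s, 99th pct = {p99:.1f} s, "
              f"fraction > 100 s = {frac_over_100:.4f}, acceptable = {acceptable}")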

  15. Adsorption and Desorption of Cesium in Clay Minerals: Effects of Natural Organic Matter and pH

    NASA Astrophysics Data System (ADS)

    Yoon, Hongkyu; Ilgen, Anastasia; Mills, Melissa; Lee, Moo; Seol, Jeung Gun; Cho, Nam Chan; Kang, Hyungyu

    2017-04-01

    Cesium (Cs) released into the environment (e.g., the Fukushima accident) poses significant environmental concerns and remediation challenges. The majority of Cs in the environment has remained within the surface soils due to the strong adsorption affinity of Cs towards clay minerals. Different clay minerals have different bonding sites, resulting in various adsorption mechanisms at the nanometer scale. For example, illite commonly has a basal spacing of 1.0 nm, but this widens to 1.4 nm once other cations exchange with K in the interlayer sites. Cs adsorbs strongly into these expanded wedge zones, which can control its mobility in the environment. In addition, natural organic matter (NOM) in the surface soils can interact with clay minerals, which can modify the mechanisms of Cs adsorption on the clay minerals by blocking specific adsorption sites and/or providing Cs adsorption sites on the NOM surface. In this work, three representative clay minerals (illite, vermiculite, montmorillonite) and humic acid (HA) are used to systematically investigate the adsorption and desorption behavior of Cs. We performed batch adsorption experiments over a range of Cs concentrations on the three clay minerals with and without HA, followed by sequential desorption batch testing. We tested desorption efficiency as a function of initial adsorbed Cs concentration, HA content, sodium concentration, and pH. The sequential extraction results are compared to the structural changes in the clay minerals, measured using extended X-ray absorption fine structure spectroscopy (EXAFS) and aberration-corrected (scanning) transmission electron microscopy (TEM) - energy dispersive X-ray spectroscopy (EDX). Hence, this work aims to identify the mechanisms of Cs fixation at the nanometer (or atomic) scale as a function of the clay mineral properties (e.g., expandability, permanent surface charge) and varying organic matter content at different pH values, and to enhance our atomic-scale mechanistic understanding of the clay mineral interactions with cesium in the presence of NOM. The expandability of the clay minerals and the effect of HA addition on Cs adsorption and desorption are highlighted to address the efficiency of Cs removal schemes for contaminated soils. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  16. Complete Non-Radioactive Operability Tests for Cladding Hull Chlorination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, Emory D; Johnson, Jared A.; Hylton, Tom D.

    2016-04-01

    Non-radioactive operability tests were made to test the metal chlorination reactor and condenser and their accessories using batch chlorinations of non-radioactive cladding samples and to identify optimum operating practices and components that need further modifications prior to installation of the equipment into the hot cell for tests on actual used nuclear fuel (UNF) cladding. The operability tests included (1) modifications to provide the desired heating and reactor temperature profile; and (2) three batch chlorination tests using, respectively, 100, 250, and 500 g of cladding. During the batch chlorinations, metal corrosion of the equipment was assessed, pressurization of the gas inlet was examined, and the best method for maintaining solid salt product transfer through the condenser was determined. Also, additional accessing equipment for collection of residual ash and positioning of the unit within the hot cell were identified, designed, and are being fabricated.

  17. Improving Embryonic Stem Cell Expansion through the Combination of Perfusion and Bioprocess Model Design

    PubMed Central

    Yeo, David; Kiparissides, Alexandros; Cha, Jae Min; Aguilar-Gallardo, Cristobal; Polak, Julia M.; Tsiridis, Elefterios; Pistikopoulos, Efstratios N.; Mantalaris, Athanasios

    2013-01-01

    Background High proliferative and differentiation capacity renders embryonic stem cells (ESCs) a promising cell source for tissue engineering and cell-based therapies. Harnessing their potential, however, requires well-designed, efficient and reproducible expansion and differentiation protocols as well as avoiding hazardous by-products, such as teratoma formation. Traditional, standard culture methodologies are fragmented and limited in their fed-batch feeding strategies that afford a sub-optimal environment for cellular metabolism. Herein, we investigate the impact of metabolic stress as a result of inefficient feeding utilizing a novel perfusion bioreactor and a mathematical model to achieve bioprocess improvement. Methodology/Principal Findings To characterize nutritional requirements, the expansion of undifferentiated murine ESCs (mESCs) encapsulated in hydrogels was performed in batch and perfusion cultures using bioreactors. Despite sufficient nutrient and growth factor provision, the accumulation of inhibitory metabolites resulted in the unscheduled differentiation of mESCs and a decline in their cell numbers in the batch cultures. In contrast, perfusion cultures maintained metabolite concentration below toxic levels, resulting in the robust expansion (>16-fold) of high quality ‘naïve’ mESCs within 4 days. A multi-scale mathematical model describing population segregated growth kinetics, metabolism and the expression of selected pluripotency (‘stemness’) genes was implemented to maximize information from available experimental data. A global sensitivity analysis (GSA) was employed that identified significant (6/29) model parameters and enabled model validation. Predicting the preferential propagation of undifferentiated ESCs in perfusion culture conditions demonstrates synchrony between theory and experiment. Conclusions/Significance The limitations of batch culture highlight the importance of cellular metabolism in maintaining pluripotency, which necessitates the design of suitable ESC bioprocesses. We propose a novel investigational framework that integrates a novel perfusion culture platform (controlled metabolic conditions) with mathematical modeling (information maximization) to enhance ESC bioprocess productivity and facilitate bioprocess optimization. PMID:24339957

  18. The combination of satellite observation techniques for sequential ionosphere VTEC modeling

    NASA Astrophysics Data System (ADS)

    Erdogan, Eren; Limberger, Marco; Schmidt, Michael; Seitz, Florian; Dettmering, Denise; Börger, Klaus; Brandert, Sylvia; Görres, Barbara; Kersten, Wilhelm F.; Bothmer, Volker; Hinrichs, Johannes; Venzmer, Malte; Mrotzek, Niclas

    2016-04-01

    The project OPTIMAP is a joint initiative by the Bundeswehr GeoInformation Centre (BGIC), the German Space Situational Awareness Centre (GSSAC), the German Geodetic Research Institute of the Technical University of Munich (DGFI-TUM) and the Institute for Astrophysics at the University of Göttingen (IAG). The main goal is to develop an operational tool for ionospheric mapping and prediction (OPTIMAP). A key feature of the project is the combination of different satellite observation techniques to improve the spatio-temporal data coverage and the sensitivity for selected target parameters. In the current status, information about the vertical total electron content (VTEC) is derived from the dual frequency signal processing of four techniques: (1) Terrestrial observations of GPS and GLONASS ensure the high-resolution coverage of continental regions, (2) the satellite altimetry mission Jason-2 is taken into account to provide VTEC in nadir direction along the satellite tracks over the oceans, (3) GPS radio occultations to Formosat-3/COSMIC are exploited for the retrieval of electron density profiles that are integrated to obtain VTEC and (4) Jason-2 carrier-phase observations tracked by the on-board DORIS receiver are processed to determine the relative VTEC. All measurements are sequentially pre-processed in hourly batches serving as input data of a Kalman filter (KF) for modeling the global VTEC distribution. The KF runs in a predictor-corrector mode allowing for the sequential processing of the measurements where update steps are performed with one-minute sampling in the current configuration. The spatial VTEC distribution is represented by B-spline series expansions, i.e., the corresponding B-spline series coefficients together with additional technique-dependent unknowns such as Differential Code Biases and Intersystem Biases are estimated by the KF. As a preliminary solution, the prediction model to propagate the filter state through time is defined by a random walk.
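
    A stripped-down sketch of the predictor-corrector cycle described above, assuming a random-walk state model for the (B-spline/VTEC) coefficients and a linear, scalar observation operator. The dimensions, noise levels, and synthetic measurements are invented for illustration and do not reflect the OPTIMAP configuration.

        # Hedged sketch: Kalman filter with a random-walk prediction model, in the spirit
        # of sequential VTEC coefficient estimation. All values are illustrative.
        import numpy as np

        n = 8                      # number of B-spline coefficients (toy size)
        x = np.zeros(n)            # state: coefficient estimates
        P = np.eye(n) * 10.0       # state covariance
        Q = np.eye(n) * 0.01       # random-walk process noise per update step
        R = 0.25                   # variance of a single VTEC observation

        def predict(x, P):
            # random walk: x_k = x_{k-1} + w,  w ~ N(0, Q)
            return x, P + Q

        def correct(x, P, h, z):
            # scalar observation z = h @ x + v,  v ~ N(0, R)
            S = h @ P @ h + R
            K = P @ h / S
            x = x + K * (z - h @ x)
            P = P - np.outer(K, h) @ P
            return x, P

        rng = np.random.default_rng(1)
        for _ in range(100):                       # one-minute update steps
            x, P = predict(x, P)
            h = rng.random(n); h /= h.sum()        # toy B-spline evaluation row
            z = h @ (np.ones(n) * 5.0) + rng.normal(scale=np.sqrt(R))  # synthetic VTEC
            x, P = correct(x, P, h, z)

        print("estimated coefficients:", np.round(x, 2))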

  19. Pollution prevention applications in batch manufacturing operations

    NASA Astrophysics Data System (ADS)

    Sykes, Derek W.; O'Shaughnessy, James

    2004-02-01

    Older, "low-tech" batch manufacturing operations are often fertile grounds for gains resulting from pollution prevention techniques. This paper presents a pollution prevention technique utilized for wastewater discharge permit compliance purposes at a batch manufacturer of detergents, deodorants, and floor-care products. This manufacturer generated industrial wastewater as a result of equipment rinses required after each product batch changeover. After investing a significant amount of capital in end-of-pipe wastewater treatment technology designed to address existing discharge limits, this manufacturer chose to investigate alternate, low-cost approaches to address anticipated new permit limits. Mass balances using spreadsheets and readily available formulation and production data were conducted on over 300 products to determine how each individual product contributed to the total wastewater pollutant load. These mass balances indicated that 22 products accounted for over 55% of the wastewater pollutant load. Laboratory tests were conducted to determine whether these same products could accept their individual changeover rinse water as make-up water in formulations without sacrificing product quality. This changeover reuse technique was then implemented at the plant scale for selected products. Significant reductions in wastewater volume (25%) and wastewater pollutant loading (85+%) were realized as a direct result of this approach.
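
    The mass-balance screening described above amounts to ranking products by their contribution to the total pollutant load. A minimal Python sketch with made-up formulation and production data follows; the product names and numbers are placeholders, not the manufacturer's data.

        # Hedged sketch: rank products by wastewater pollutant contribution from
        # formulation/production data. All numbers are invented placeholders.
        products = {
            # product: (batches per year, pollutant mass discharged per changeover rinse, kg)
            "detergent A": (120, 4.0),
            "deodorant B": (60, 1.5),
            "floor care C": (200, 0.3),
            "detergent D": (40, 6.0),
        }

        loads = {name: batches * kg for name, (batches, kg) in products.items()}
        total = sum(loads.values())

        running = 0.0
        for name, load in sorted(loads.items(), key=lambda kv: kv[1], reverse=True):
            running += load
            print(f"{name:12s}  {load:7.1f} kg/yr  cumulative share = {running / total:5.1%}")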

  20. Reliability-based design optimization using a generalized subset simulation method and posterior approximation

    NASA Astrophysics Data System (ADS)

    Ma, Yuan-Zhuo; Li, Hong-Shuang; Yao, Wei-Xing

    2018-05-01

    The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been a significant and challenging task, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol' sequences and Bucher's design. Subsequently, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.
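
    Once the probabilistic constraints have been replaced by ordinary deterministic ones, the outer optimization can be handled by a standard SQP solver. The sketch below uses SciPy's SLSQP as a stand-in for the sequential quadratic programming step; the objective and constraint are toy placeholders, not the article's RBDO examples.

        # Hedged sketch: solving a transformed (deterministic) design problem with SQP.
        # Objective and constraint are toy placeholders for the article's RBDO problems.
        import numpy as np
        from scipy.optimize import minimize

        def objective(d):
            return d[0] + d[1]                      # e.g., a weight/cost surrogate

        def transformed_constraint(d):
            # stands in for a probabilistic constraint converted via posterior
            # approximation, expressed here as g(d) >= 0
            return d[0] * d[1] - 2.0

        result = minimize(objective,
                          x0=np.array([3.0, 3.0]),
                          method="SLSQP",
                          bounds=[(0.5, 10.0), (0.5, 10.0)],
                          constraints=[{"type": "ineq", "fun": transformed_constraint}])

        print("optimal design:", np.round(result.x, 3), "objective:", round(result.fun, 3))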

  1. Simultaneous determination of potassium and total fluoride in toothpastes using a SIA system with two potentiometric detectors.

    PubMed

    Pérez-Olmos, R; Soto, J C; Zárate, N; Díez, I

    2008-05-12

    A sequential injection analysis (SIA) system has been developed for the first time to quantify potassium and total fluoride in toothpastes and gels used to prevent both dentinal hypersensitivity and dental caries. To enable this simultaneous determination, potentiometric detection was carried out using a conventional fluoride electrode and a tubular potassium-selective electrode formed by a PVC membrane containing valinomycin as the ionophore. A manifold that uses a three-way solenoid valve was designed. The valve, operated under binary sampling conditions, provides reproducible mixing ratios of two solutions, which allows the system to automatically generate, on-line, the calibration curves required by the analytical procedure. The calibration range was 1.0 x 10(-4) to 1.0 x 10(-3) mol L(-1) for both potassium and total fluoride determinations. The R.S.D. (11 readings) was less than 1.5% for both determinations. Off-line studies related to the dissolution of the solid samples, the transformation of monofluorophosphate into fluoride, the elimination of organic matrix interference on the plastic membrane of the potassium electrode, and the selection of the most adequate TISAB solution for fluoride determination were also carried out. A sampling rate of 18 samples h(-1) for both determinations was attained, the precisions and accuracies being statistically indistinguishable from those achieved by atomic emission spectroscopy (for potassium) and by conventional batch potentiometry (for total fluoride), adopted as reference techniques.

  2. ProperCAD: A portable object-oriented parallel environment for VLSI CAD

    NASA Technical Reports Server (NTRS)

    Ramkumar, Balkrishna; Banerjee, Prithviraj

    1993-01-01

    Most parallel algorithms for VLSI CAD proposed to date have one important drawback: they work efficiently only on the machines for which they were designed. As a result, algorithms designed to date are dependent on the architecture for which they are developed and do not port easily to other parallel architectures. A new project under way to address this problem is described. A Portable object-oriented parallel environment for CAD algorithms (ProperCAD) is being developed. The objectives of this research are (1) to develop new parallel algorithms that run in a portable object-oriented environment (CAD algorithms are being developed on a general-purpose platform for portable parallel programming called CARM, together with a C++ environment that is truly object-oriented and specialized for CAD applications); and (2) to design the parallel algorithms around a good sequential algorithm with a well-defined parallel-sequential interface (permitting the parallel algorithm to benefit from future developments in sequential algorithms). One CAD application that has been implemented as part of the ProperCAD project, flat VLSI circuit extraction, is described. The algorithm, its implementation, and its performance on a range of parallel machines are discussed in detail. It currently runs on an Encore Multimax, a Sequent Symmetry, Intel iPSC/2 and i860 hypercubes, a NCUBE 2 hypercube, and a network of Sun Sparc workstations. Performance data for other applications that were developed are also provided, namely test pattern generation for sequential circuits, parallel logic synthesis, and standard cell placement.

  3. The Constructivist Approach? I Have Heard about It but I Have Never Seen It: "An Example of Exploratory Sequential Mixed Design Study"

    ERIC Educational Resources Information Center

    Polat, Ahmet; Dogan, Soner; Demir, Selçuk Besir

    2016-01-01

    The present study was undertaken to investigate the quality of education based on the views of the students attending social studies education departments at the Faculties of Education and to determine the existing problems and present suggestions for their solutions. The study was conducted according to exploratory sequential mixed method. In…

  4. The Constructivist Approach? I Have Heard about It but I Have Never Seen It "An Example of Exploratory Sequential Mixed Design Study"

    ERIC Educational Resources Information Center

    Polat, Ahmet; Dogan, Soner; Demir, Selçuk Besir

    2016-01-01

    The present study was undertaken to investigate the quality of education based on the views of the students attending social studies education departments at the Faculties of Education and to determine the existing problems and present suggestions for their solutions. The study was conducted according to exploratory sequential mixed method. In…

  5. Enantioselective Cobalt-Catalyzed Sequential Nazarov Cyclization/Electrophilic Fluorination: Access to Chiral α-Fluorocyclopentenones.

    PubMed

    Zhang, Heyi; Cheng, Biao; Lu, Zhan

    2018-06-20

    A newly designed thiazoline iminopyridine ligand for enantioselective cobalt-catalyzed sequential Nazarov cyclization/electrophilic fluorination was developed. Various chiral α-fluorocyclopentenones were prepared with good yields and diastereo- and enantioselectivities. Further derivatizations could be easily carried out to provide chiral cyclopentenols with three contiguous stereocenters. Furthermore, a direct deesterification of fluorinated products could afford chiral α-single fluorine-substituted cyclopentenones.

  6. 40 CFR 63.11511 - What definitions apply to this subpart?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... eliminator except that the device is designed with multiple pads in series that are woven with layers of... unit (i.e., as a batch) for a predetermined period of time, during which none of the parts are removed... given capture system design: duct intake devices, hoods, enclosures, ductwork, dampers, manifolds...

  7. 40 CFR 63.11511 - What definitions apply to this subpart?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... eliminator except that the device is designed with multiple pads in series that are woven with layers of... unit (i.e., as a batch) for a predetermined period of time, during which none of the parts are removed... given capture system design: duct intake devices, hoods, enclosures, ductwork, dampers, manifolds...

  8. 40 CFR 63.11511 - What definitions apply to this subpart?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... eliminator except that the device is designed with multiple pads in series that are woven with layers of... unit (i.e., as a batch) for a predetermined period of time, during which none of the parts are removed... given capture system design: duct intake devices, hoods, enclosures, ductwork, dampers, manifolds...

  9. 40 CFR 63.11511 - What definitions apply to this subpart?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... eliminator except that the device is designed with multiple pads in series that are woven with layers of... unit (i.e., as a batch) for a predetermined period of time, during which none of the parts are removed... given capture system design: duct intake devices, hoods, enclosures, ductwork, dampers, manifolds...

  10. Application of Weighted Gene Co-expression Network Analysis for Data from Paired Design.

    PubMed

    Li, Jianqiang; Zhou, Doudou; Qiu, Weiliang; Shi, Yuliang; Yang, Ji-Jiang; Chen, Shi; Wang, Qing; Pan, Hui

    2018-01-12

    Investigating how genes jointly affect complex human diseases is important, yet challenging. The network approach (e.g., weighted gene co-expression network analysis (WGCNA)) is a powerful tool. However, genomic data usually contain substantial batch effects, which could mask true genomic signals. Paired design is a powerful tool that can reduce batch effects. However, it is currently unclear how to appropriately apply WGCNA to genomic data from paired design. In this paper, we modified the current WGCNA pipeline to analyse high-throughput genomic data from paired design. We illustrated the modified WGCNA pipeline by analysing the miRNA dataset provided by Shiah et al. (2014), which contains forty oral squamous cell carcinoma (OSCC) specimens and their matched non-tumourous epithelial counterparts. OSCC is the sixth most common cancer worldwide. The modified WGCNA pipeline identified two sets of novel miRNAs associated with OSCC, in addition to the existing miRNAs reported by Shiah et al. (2014). Thus, this work will be of great interest to readers of various scientific disciplines, in particular, genetic and genomic scientists as well as medical scientists working on cancer.

  11. Modeling of thermal mode of drying special purposes ceramic products in batch action chamber dryers

    NASA Astrophysics Data System (ADS)

    Lukianov, E. S.; Lozovaya, S. Yu; Lozovoy, N. M.

    2018-03-01

    The article is devoted to the modeling of batch-action chamber dryers in the processing line for producing shaped ceramic products. At the drying stage, for various reasons, most of these products warp and crack because of irregular shrinkage deformations caused by capillary forces. The primary cause is an improperly organized drying mode resulting from imperfections in the chamber dryer design, specifically the heat-transfer agent supply method and the limited possibility of creating a uniform temperature field in the whole volume of the chamber.

  12. Design and application of PDF model for extracting

    NASA Astrophysics Data System (ADS)

    Xiong, Lei

    2013-07-01

    In order to reduce the contribution workflow of the editorial department system from two steps to one, this paper advocates that the technology for extracting information from PDF files be transplanted from the PDF reader into the IEEE Xplore contribution system and combined with batch uploading, so that editors can upload about 1 GB of PDF files in a single batch. The computer will then automatically extract the title, author, address, e-mail, abstract, and keywords of each paper for later retrieval, saving considerable labor, material, and financial resources for the editorial department.
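
    A minimal sketch of the kind of batch metadata extraction described above. The choice of the pypdf library and the directory layout are assumptions for illustration; the editorial system's actual extraction pipeline is not described in the abstract.

        # Hedged sketch: batch extraction of basic PDF metadata, in the spirit of the
        # workflow described above. The pypdf library and directory layout are assumptions.
        from pathlib import Path
        from pypdf import PdfReader

        def extract_records(pdf_dir: str):
            records = []
            for path in sorted(Path(pdf_dir).glob("*.pdf")):
                reader = PdfReader(path)
                info = reader.metadata or {}
                first_page_text = reader.pages[0].extract_text() if reader.pages else ""
                records.append({
                    "file": path.name,
                    "title": getattr(info, "title", None),
                    "author": getattr(info, "author", None),
                    # abstract/keywords would require parsing first_page_text
                    "first_page_chars": len(first_page_text),
                })
            return records

        for rec in extract_records("submissions/"):
            print(rec)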

  13. Numerical investigation of effects on blanks for press hardening process during longitudinal flux heating

    NASA Astrophysics Data System (ADS)

    Dietrich, André; Nacke, Bernard

    2018-05-01

    With induction heating technology, it is possible to heat blanks for the press hardening process in 20 s or less. Furthermore, an induction system is compact and easy to control in comparison with conventional heating systems. To bring induction heating technology to the warm forming industry, it is necessary to analyze the process from the perspective of induction. This paper investigates the edge and end effects of a batch-heated blank. The results facilitate the later design of induction heating systems for the batch process.

  14. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between what is currently possible in the supervision and control of pharmaceutical production processes and its current application in industrial manufacturing. With rigid approval practices based on standard operational procedures, adaptations of production reactors towards the state of the art were more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing, the cultivation of genetically modified Escherichia coli bacteria.

  15. A simple method to determine IgG light chain to heavy chain polypeptide ratios expressed by CHO cells.

    PubMed

    Gerster, Anja; Wodarczyk, Claas; Reichenbächer, Britta; Köhler, Janet; Schulze, Andreas; Krause, Felix; Müller, Dethardt

    2016-12-01

    To establish a high-throughput method for determining the intra- and extracellular light chain (LC) to heavy chain (HC) polypeptide ratio of antibodies as a screening parameter during cell line development. Chinese Hamster Ovary (CHO) TurboCell pools containing differently designed vectors expected to result in different LC:HC polypeptide ratios were generated by targeted integration. Cell culture supernatants and cell lysates from a fed-batch experiment were purified by combined Protein A and anti-kappa affinity batch purification in 96-well format. Capture of all antibodies and their fragments allowed the determination of the intra- and extracellular LC:HC polypeptide ratios by reduced SDS capillary electrophoresis. The results demonstrate that the method is suitable for showing the significant impact of vector design on the intra- and extracellular LC:HC polypeptide ratios. Determination of LC:HC polypeptide ratios can provide important information for vector design optimization, leading to CHO cell lines with optimized antibody assembly and preferred product quality.

  16. Optimal Solutions of Multiproduct Batch Chemical Process Using Multiobjective Genetic Algorithm with Expert Decision System

    PubMed Central

    Mokeddem, Diab; Khellaf, Abdelhafid

    2009-01-01

    Optimal design problems are widely known for their multiple performance measures that often compete with each other. In this paper, an optimal multiproduct batch chemical plant design is presented. The design is first formulated as a multiobjective optimization problem, to be solved using the well-suited non-dominated sorting genetic algorithm (NSGA-II). NSGA-II has the capability to fine-tune variables in determining a set of non-dominated solutions distributed along the Pareto front in a single run of the algorithm. The ability of NSGA-II to identify a set of optimal solutions provides the decision-maker (DM) with a complete picture of the optimal solution space from which to make better and more appropriate choices. An outranking analysis with PROMETHEE II then helps the decision-maker finalize the selection of the best compromise. The effectiveness of the NSGA-II method on multiobjective optimization problems is illustrated through two carefully referenced examples. PMID:19543537
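
    The core NSGA-II idea of non-dominated sorting can be illustrated in a few lines. The sketch below extracts the Pareto front from a set of candidate plant designs with two invented minimization objectives (e.g., capital cost and production time); it is not the paper's implementation.

        # Hedged sketch: identify the non-dominated (Pareto) front for two minimization
        # objectives, the building block of NSGA-II. Data points are invented.
        def dominates(a, b):
            """True if design a is at least as good as b in all objectives and better in one."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def pareto_front(points):
            return [p for p in points
                    if not any(dominates(q, p) for q in points if q is not p)]

        # (capital cost [arbitrary units], production time [h]) for candidate plant designs
        designs = [(10.0, 40.0), (12.0, 30.0), (15.0, 25.0), (11.0, 45.0), (14.0, 28.0)]
        print("Pareto-optimal designs:", pareto_front(designs))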

  17. Mechanistic modeling of the loss of protein sieving due to internal and external fouling of microfilters.

    PubMed

    Bolton, Glen R; Apostolidis, Alex J

    2017-09-01

    Fed-batch and perfusion cell culture processes used to produce therapeutic proteins can use microfilters for product harvest. In this study, new explicit mathematical models of sieving loss due to internal membrane fouling, external membrane fouling, or a combination of the two were generated. The models accounted for membrane and cake structures and hindered solute transport. Internal membrane fouling was assumed to occur due to the accumulation of foulant on either membrane pore walls (pore-retention model) or membrane fibers (fiber-retention model). External cake fouling was assumed to occur either by the growth of a single incompressible cake layer (cake-growth) or by the accumulation of a number of independent cake layers (cake-series). The pore-retention model was combined with either the cake-series or cake-growth models to obtain models that describe internal and external fouling occurring either simultaneously or sequentially. The models were tested using well-documented sieving decline data available in the literature. The sequential pore-retention followed by cake-growth model provided a good fit of sieving decline data during beer microfiltration. The cake-series and cake-growth models provided good fits of sieving decline data during the microfiltration of a perfusion cell culture. The new models provide insights into the mechanisms of fouling that result in the loss of product sieving. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:1323-1333, 2017. © 2017 American Institute of Chemical Engineers.

  18. Retention and chemical speciation of uranium in an oxidized wetland sediment from the Savannah River Site.

    PubMed

    Li, Dien; Seaman, John C; Chang, Hyun-Shik; Jaffe, Peter R; Koster van Groos, Paul; Jiang, De-Tong; Chen, Ning; Lin, Jinru; Arthur, Zachary; Pan, Yuanming; Scheckel, Kirk G; Newville, Matthew; Lanzirotti, Antonio; Kaplan, Daniel I

    2014-05-01

    Uranium speciation and retention mechanisms on Savannah River Site (SRS) wetland sediments were studied using batch (ad)sorption experiments, sequential extraction, U L3-edge X-ray absorption near-edge structure (XANES) spectroscopy, fluorescence mapping, and μ-XANES. Under oxidized conditions, U was highly retained by the SRS wetland sediments. In contrast to similar sediments with much lower natural organic matter (NOM) content, significant sorption of U onto the SRS sediments was observed at pH < 4 and pH > 8. Sequential extraction indicated that the U species were primarily associated with the acid-soluble fraction (weak acetic acid extractable) and the organic fraction (Na-pyrophosphate extractable). Uranium L3-edge XANES spectra of the U-bound sediments were nearly identical to that of uranyl acetate. Based on fluorescence mapping, U and Fe distributions in the sediment were poorly correlated; U was distributed throughout the sample and did not appear as isolated U mineral phases. The primary oxidation state of U in these oxidized sediments was U(VI), and there was little evidence that the high sorptive capacity of the sediments could be ascribed to abiotic or biotic reduction to the less soluble U(IV) species or to secondary mineral formation. Collectively, this study suggests that U may be strongly bound to wetland sediments, not only under reducing conditions by reductive precipitation, but also under oxidizing conditions through NOM-uranium bonding. Published by Elsevier Ltd.

  19. Sequential anaerobic-aerobic biodegradation of emerging insensitive munitions compound 3-nitro-1,2,4-triazol-5-one (NTO).

    PubMed

    Madeira, Camila L; Speet, Samuel A; Nieto, Cristina A; Abrell, Leif; Chorover, Jon; Sierra-Alvarez, Reyes; Field, Jim A

    2017-01-01

    Insensitive munitions, such as 3-nitro-1,2,4-triazol-5-one (NTO), are being considered by the U.S. Army as replacements for conventional explosives. Environmental emissions of NTO are expected to increase as its use becomes widespread; but only a few studies have considered the remediation of NTO-contaminated sites. In this study, sequential anaerobic-aerobic biodegradation of NTO was investigated in bioreactors using soil as inoculum. Batch bioassays confirmed microbial reduction of NTO under anaerobic conditions to 3-amino-1,2,4-triazol-5-one (ATO) using pyruvate as electron-donating cosubstrate. However, ATO biodegradation was only observed after the redox condition was switched to aerobic. This study also demonstrated that the high-rate removal of NTO in contaminated water can be attained in a continuous-flow aerated bioreactor. The reactor was first fed ATO as sole energy and nitrogen source prior to NTO addition. After few days, ATO was removed in a sustained fashion by 100%. When NTO was introduced together with electron-donor (pyruvate), NTO degradation increased progressively, reaching a removal efficiency of 93.5%. Mineralization of NTO was evidenced by the partial release of inorganic nitrogen species in the effluent, and lack of ATO accumulation. A plausible hypothesis for these findings is that NTO reduction occurred in anaerobic zones of the biofilm whereas ATO was mineralized in the bulk aerobic zones of the reactor. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Sequential anaerobic-aerobic biodegradation of emerging insensitive munitions compound 3-nitro-1,2,4-triazol-5-one (NTO)

    PubMed Central

    Madeira, Camila L.; Speet, Samuel A.; Nieto, Cristina A.; Abrell, Leif; Chorover, Jon; Sierra-Alvarez, Reyes; Field, Jim A.

    2017-01-01

    Insensitive munitions, such as 3-nitro-1,2,4-triazol-5-one (NTO), are being considered by the U.S. Army as replacements for conventional explosives. Environmental emissions of NTO are expected to increase as its use becomes widespread; but only a few studies have considered the remediation of NTO-contaminated sites. In this study, sequential anaerobic-aerobic biodegradation of NTO was investigated in bioreactors using soil as inoculum. Batch bioassays confirmed microbial reduction of NTO under anaerobic conditions to 3-amino-1,2,4-triazol-5-one (ATO) using pyruvate as electron-donating cosubstrate. However, ATO biodegradation was only observed after the redox condition was switched to aerobic. This study also demonstrated that the high-rate removal of NTO in contaminated water can be attained in a continuous-flow aerated bioreactor. The reactor was first fed ATO as sole energy and nitrogen source prior to NTO addition. After few days, ATO was removed in a sustained fashion by 100%. When NTO was introduced together with electron-donor (pyruvate), NTO degradation increased progressively, reaching a removal efficiency of 93.5%. Mineralization of NTO was evidenced by the partial release of inorganic nitrogen species in the effluent and lack of ATO accumulation. A plausible hypothesis for these findings is that NTO reduction occurred in anaerobic zones of the biofilm whereas ATO was mineralized in the bulk aerobic zones of the reactor. PMID:27750172

  1. 40 CFR 53.1 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... substantial deviations from the design specifications of the sampler specified for reference methods in... general requirements as an ISO 9001-registered facility for the design and manufacture of designated... capable of automatically collecting a series of sequential samples. NO means nitrogen oxide. NO 2 means...

  2. Application and Design Characteristics of Generalized Training Devices.

    ERIC Educational Resources Information Center

    Parker, Edward L.

    This program identified applications and developed design characteristics for generalized training devices. The first of three sequential phases reviewed in detail new developments in Naval equipment technology that influence the design of maintenance training devices: solid-state circuitry, modularization, digital technology, standardization,…

  3. Estimation of fundamental kinetic parameters of polyhydroxybutyrate fermentation process of Azohydromonas australica using statistical approach of media optimization.

    PubMed

    Gahlawat, Geeta; Srivastava, Ashok K

    2012-11-01

    Polyhydroxybutyrate (PHB) is a biodegradable and biocompatible thermoplastic with many interesting applications in medicine, food packaging, and tissue engineering materials. The present study deals with the enhanced production of PHB by Azohydromonas australica using sucrose and the estimation of the fundamental kinetic parameters of the PHB fermentation process. Preliminary culture growth inhibition studies were followed by statistical optimization of the medium recipe using response surface methodology to increase PHB production. Batch cultivation in a 7-L bioreactor was then carried out using the optimum concentrations of the medium components (process variables) obtained from the statistical design to identify the batch growth and product kinetic parameters of PHB fermentation. A. australica exhibited maximum biomass and PHB concentrations of 8.71 and 6.24 g/L, respectively, in the bioreactor, with an overall PHB production rate of 0.75 g/h. Bioreactor cultivation studies demonstrated that the specific biomass and PHB yields on sucrose were 0.37 and 0.29 g/g, respectively. The kinetic parameters obtained in the present investigation will be used in the development of a batch kinetic mathematical model for PHB production, which will serve as a launching pad for further process optimization studies, e.g., the design of bioreactor cultivation strategies to further enhance biopolymer production.
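
    As an illustration of the kind of batch kinetic model such parameters would feed, the sketch below integrates logistic biomass growth coupled to Luedeking-Piret product formation and yield-based substrate consumption. The parameter values are invented placeholders, not the estimates reported for A. australica.

        # Hedged sketch: batch logistic growth with Luedeking-Piret product (PHB) kinetics.
        # Parameter values are illustrative placeholders, not the study's estimates.
        import numpy as np
        from scipy.integrate import solve_ivp

        mu_max, X_max = 0.25, 9.0      # 1/h, g/L   (logistic growth)
        alpha, beta = 0.6, 0.01        # growth- and non-growth-associated PHB terms
        Y_xs = 0.37                    # g biomass / g sucrose

        def batch_model(t, y):
            X, P, S = y
            dXdt = mu_max * X * (1.0 - X / X_max) if S > 0 else 0.0
            dPdt = alpha * dXdt + (beta * X if S > 0 else 0.0)
            dSdt = -dXdt / Y_xs
            return [dXdt, dPdt, dSdt]

        sol = solve_ivp(batch_model, (0.0, 48.0), y0=[0.1, 0.0, 30.0], max_step=0.5)
        X, P, S = sol.y[:, -1]
        print(f"t = 48 h: biomass = {X:.2f} g/L, PHB = {P:.2f} g/L, sucrose = {S:.2f} g/L")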

  4. Modeling of the adsorptive removal of arsenic(III) using plant biomass: a bioremedial approach

    NASA Astrophysics Data System (ADS)

    Roy, Palas; Dey, Uttiya; Chattoraj, Soumya; Mukhopadhyay, Debasis; Mondal, Naba Kumar

    2017-06-01

    In the present work, the possibility of using a non-conventional, finely ground (250 μm) Azadirachta indica (neem) bark powder [AiBP] has been tested as a low-cost biosorbent for the removal of arsenic(III) from water. The removal of As(III) was studied by performing a series of biosorption experiments (batch and column). The biosorption behavior of As(III) for batch and column operations was examined in the concentration ranges of 50-500 µg L-1 and 500.0-2000.0 µg L-1, respectively. Under optimized batch conditions, the AiBP could remove up to 89.96% of As(III) from the water system. An artificial neural network (ANN) model was developed from the batch experimental data sets and provided reasonable predictive performance (R2 = 0.961 and 0.954) for As(III) biosorption. In batch operation, the initial As(III) concentration had the most significant impact on the biosorption process. For column operation, a central composite design (CCD) was applied to investigate the influences on the breakthrough time, to optimize the As(III) biosorption process, and to evaluate the interacting effects of the different operating variables. The optimized CCD result revealed that the AiBP was an effective and economically feasible biosorbent with a maximum breakthrough time of 653.9 min when the independent variables were held at a 2.0 g AiBP dose, a 2000.0 µg L-1 initial As(III) concentration, and a 3.0 mL min-1 flow rate, at a maximum desirability value of 0.969.

  5. A Method To Determine the Kinetics of Solute Mixing in Liquid/Liquid Formulation Dual-Chamber Syringes.

    PubMed

    Werk, Tobias; Mahler, Hanns-Christian; Ludwig, Imke Sonja; Luemkemann, Joerg; Huwyler, Joerg; Hafner, Mathias

    Dual-chamber syringes were originally designed to separate a solid substance and its diluent. However, they can also be used to separate liquid formulations of two individual drug products, which cannot be co-formulated due to technical or regulatory issues. A liquid/liquid dual-chamber syringe can be designed to achieve homogenization and mixing of both solutions prior to administration, or it can be used to sequentially inject both solutions. While sequential injection can be easily achieved by a dual-chamber syringe with a bypass located at the needle end of the syringe barrel, mixing of the two fluids may provide more challenges. Within this study, the mixing behavior of surrogate solutions in different dual-chamber syringes is assessed. Furthermore, the influence of parameters such as injection angle, injection speed, agitation, and sample viscosity were studied. It was noted that mixing was poor for the commercial dual-chamber syringes (with a bypass designed as a longitudinal ridge) when the two liquids significantly differ in their physical properties (viscosity, density). However, an optimized dual-chamber syringe design with multiple bypass channels resulted in improved mixing of liquids. Dual-chamber syringes were originally designed to separate a solid substance and its diluent. However, they can also be used to separate liquid formulations of two individual drug products. A liquid/liquid dual-chamber syringe can be designed to achieve homogenization and mixing of both solutions prior to administration, or it can be used to sequentially inject both solutions. While sequential injection can be easily achieved by a dual-chamber syringe with a bypass located at the needle end of the syringe barrel, mixing of the two fluids may provide more challenges. Within this study, the mixing behavior of surrogate solutions in different dual-chamber syringes is assessed. Furthermore, the influence of parameters such as injection angle, injection speed, agitation, and sample viscosity were studied. It was noted that mixing was poor for the commercially available dual-chamber syringes when the two liquids significantly differ in viscosity and density. However, an optimized dual-chamber syringe design resulted in improved mixing of liquids. © PDA, Inc. 2017.

  6. Demonstration of Robustness and Integrated Operation of a Series-Bosch System

    NASA Technical Reports Server (NTRS)

    Abney, Morgan B.; Mansell, Matthew J.; Stanley, Christine; Barnett, Bill; Junaedi, Christian; Vilekar, Saurabh A.; Ryan, Kent

    2016-01-01

    Manned missions beyond low Earth orbit will require highly robust, reliable, and maintainable life support systems that maximize recycling of water and oxygen. Bosch technology is one option to maximize oxygen recovery, in the form of water, from metabolically-produced carbon dioxide (CO2). A two stage approach to Bosch, called Series-Bosch, reduces metabolic CO2 with hydrogen (H2) to produce water and solid carbon using two reactors: a Reverse Water-Gas Shift (RWGS) reactor and a carbon formation (CF) reactor. Previous development efforts demonstrated the stand-alone performance of a NASA-designed RWGS reactor designed for robustness against carbon formation, two membrane separators intended to maximize single pass conversion of reactants, and a batch CF reactor with both transit and surface catalysts. In the past year, Precision Combustion, Inc. (PCI) developed and delivered a RWGS reactor for testing at NASA. The reactor design was based on their patented Microlith® technology and was first evaluated under a Phase I Small Business Innovative Research (SBIR) effort in 2010. The RWGS reactor was recently evaluated at NASA to compare its performance and operating conditions with NASA's RWGS reactor. The test results will be provided in this paper. Separately, in 2015, a semi-continuous CF reactor was designed and fabricated at NASA based on the results from batch CF reactor testing. The batch CF reactor and the semi-continuous CF reactor were individually integrated with an upstream RWGS reactor to demonstrate the system operation and to evaluate performance. Here, we compare the performance and robustness to carbon formation of both RWGS reactors. We report the results of the integrated operation of a Series-Bosch system and we discuss the technology readiness level.

  7. The Magnitude, Generality, and Determinants of Flynn Effects on Forms of Declarative Memory and Visuospatial Ability: Time-Sequential Analyses of Data from a Swedish Cohort Study

    ERIC Educational Resources Information Center

    Ronnlund, Michael; Nilsson, Lars-Goran

    2008-01-01

    To estimate Flynn effects (FEs) on forms of declarative memory (episodic, semantic) and visuospatial ability (Block Design) time-sequential analyses of data for Swedish adult samples (35-80 years) assessed on either of four occasions (1989, 1994, 1999, 2004; n = 2995) were conducted. The results demonstrated cognitive gains across occasions,…

  8. Sequential design of discrete linear quadratic regulators via optimal root-locus techniques

    NASA Technical Reports Server (NTRS)

    Shieh, Leang S.; Yates, Robert E.; Ganesan, Sekar

    1989-01-01

    A sequential method employing classical root-locus techniques has been developed in order to determine the quadratic weighting matrices and discrete linear quadratic regulators of multivariable control systems. At each recursive step, an intermediate unity rank state-weighting matrix that contains some invariant eigenvectors of that open-loop matrix is assigned, and an intermediate characteristic equation of the closed-loop system containing the invariant eigenvalues is created.
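
    For context, the sketch below computes a standard discrete linear quadratic regulator gain for a toy two-state system once the weighting matrices have been chosen. It uses SciPy's discrete algebraic Riccati solver and does not reproduce the sequential root-locus procedure of the cited work; the system matrices are invented.

        # Hedged sketch: discrete LQR gain for given weighting matrices Q and R.
        # The system matrices are toy examples, not taken from the cited work.
        import numpy as np
        from scipy.linalg import solve_discrete_are

        A = np.array([[1.0, 0.1],
                      [0.0, 1.0]])          # discrete-time double integrator
        B = np.array([[0.005],
                      [0.1]])
        Q = np.diag([1.0, 0.1])             # state weighting
        R = np.array([[0.01]])              # control weighting

        P = solve_discrete_are(A, B, Q, R)                      # Riccati solution
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)       # feedback gain u = -K x

        closed_loop_eigs = np.linalg.eigvals(A - B @ K)
        print("LQR gain K:", np.round(K, 3))
        print("closed-loop eigenvalue magnitudes:", np.round(np.abs(closed_loop_eigs), 3))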

  9. Use of personalized Dynamic Treatment Regimes (DTRs) and Sequential Multiple Assignment Randomized Trials (SMARTs) in mental health studies

    PubMed Central

    Liu, Ying; ZENG, Donglin; WANG, Yuanjia

    2014-01-01

    Summary Dynamic treatment regimens (DTRs) are sequential decision rules tailored at each point where a clinical decision is made based on each patient’s time-varying characteristics and intermediate outcomes observed at earlier points in time. The complexity, patient heterogeneity, and chronicity of mental disorders call for learning optimal DTRs to dynamically adapt treatment to an individual’s response over time. The Sequential Multiple Assignment Randomized Trial (SMARTs) design allows for estimating causal effects of DTRs. Modern statistical tools have been developed to optimize DTRs based on personalized variables and intermediate outcomes using rich data collected from SMARTs; these statistical methods can also be used to recommend tailoring variables for designing future SMART studies. This paper introduces DTRs and SMARTs using two examples in mental health studies, discusses two machine learning methods for estimating optimal DTR from SMARTs data, and demonstrates the performance of the statistical methods using simulated data. PMID:25642116

  10. Safety Testing of AGR-2 UCO Compacts 5-2-2, 2-2-2, and 5-4-1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunn, John D.; Morris, Robert Noel; Baldwin, Charles A.

    2016-08-01

    Post-irradiation examination (PIE) is being performed on tristructural-isotropic (TRISO) coated-particle fuel compacts from the Advanced Gas Reactor (AGR) Fuel Development and Qualification Program second irradiation experiment (AGR-2). This effort builds upon the understanding acquired throughout the AGR-1 PIE campaign, and is establishing a database for the different AGR-2 fuel designs. The AGR-2 irradiation experiment included TRISO fuel particles coated at BWX Technologies (BWXT) with a 150-mm-diameter engineering-scale coater. Two coating batches were tested in the AGR-2 irradiation experiment. Batch 93085 had 508-μm-diameter uranium dioxide (UO2) kernels. Batch 93073 had 427-μm-diameter UCO kernels, which is a kernel design where some of the uranium oxide is converted to uranium carbide during fabrication to provide a getter for oxygen liberated during fission and limit CO production. Fabrication and property data for the AGR-2 coating batches have been compiled and compared to those for AGR-1. The AGR-2 TRISO coatings were most like the AGR-1 Variant 3 TRISO deposited in the 50-mm-diameter ORNL lab-scale coater. In both cases argon-dilution of the hydrogen and methyltrichlorosilane coating gas mixture employed to deposit the SiC was used to produce a finer-grain, more equiaxed SiC microstructure. In addition to the fact that AGR-1 fuel had smaller, 350-μm-diameter UCO kernels, notable differences in the TRISO particle properties included the pyrocarbon anisotropy, which was slightly higher in the particles coated in the engineering-scale coater, and the exposed kernel defect fraction, which was higher for AGR-2 fuel due to the detected presence of particles with impact damage introduced during TRISO particle handling.

  11. Sequential parallel comparison design with binary and time-to-event outcomes.

    PubMed

    Silverman, Rachel Kloss; Ivanova, Anastasia; Fine, Jason

    2018-04-30

    Sequential parallel comparison design (SPCD) has been proposed to increase the likelihood of success of clinical trials, especially trials with a potentially high placebo effect. Sequential parallel comparison design is conducted in 2 stages. Participants are randomized between active therapy and placebo in stage 1. Then, stage 1 placebo nonresponders are rerandomized between active therapy and placebo. Data from the 2 stages are pooled to yield a single P value. We consider SPCD with binary and with time-to-event outcomes. For time-to-event outcomes, response is defined as a favorable event prior to the end of follow-up for a given stage of SPCD. We show that for these cases, the usual test statistics from stages 1 and 2 are asymptotically normal and uncorrelated under the null hypothesis, leading to a straightforward combined testing procedure. In addition, we show that the estimators of the treatment effects from the 2 stages are asymptotically normal and uncorrelated under the null and alternative hypotheses, yielding confidence interval procedures with correct coverage. Simulations and real data analysis demonstrate the utility of the binary and time-to-event SPCD. Copyright © 2018 John Wiley & Sons, Ltd.
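
    A minimal sketch of the combined testing idea described above for binary outcomes (stage-wise z statistics, asymptotically uncorrelated under the null, pooled with a prespecified weight); the counts, the weight w and the pooled two-proportion z form are illustrative assumptions, not values from the paper.

      # Sketch: pooled SPCD test for binary outcomes. z1 compares active vs placebo
      # in stage 1; z2 compares the rerandomized stage-1 placebo nonresponders in
      # stage 2; the two statistics are combined with a prespecified weight w.
      import numpy as np
      from scipy.stats import norm

      def two_prop_z(x_active, n_active, x_placebo, n_placebo):
          """Two-proportion z statistic with a pooled variance estimate."""
          p1, p0 = x_active / n_active, x_placebo / n_placebo
          p = (x_active + x_placebo) / (n_active + n_placebo)
          se = np.sqrt(p * (1 - p) * (1 / n_active + 1 / n_placebo))
          return (p1 - p0) / se

      # Illustrative counts: responders / randomized, by stage and arm.
      z1 = two_prop_z(45, 100, 30, 100)    # stage 1: all participants
      z2 = two_prop_z(25, 60, 15, 60)      # stage 2: stage-1 placebo nonresponders

      w = 0.6                              # prespecified stage-1 weight
      z_combined = (w * z1 + (1 - w) * z2) / np.sqrt(w**2 + (1 - w)**2)
      print(f"z = {z_combined:.3f}, one-sided p = {1 - norm.cdf(z_combined):.4f}")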

  12. Dynamic flux balance modeling of microbial co-cultures for efficient batch fermentation of glucose and xylose mixtures.

    PubMed

    Hanly, Timothy J; Henson, Michael A

    2011-02-01

    Sequential uptake of pentose and hexose sugars that compose lignocellulosic biomass limits the ability of pure microbial cultures to efficiently produce value-added bioproducts. In this work, we used dynamic flux balance modeling to examine the capability of mixed cultures of substrate-selective microbes to improve the utilization of glucose/xylose mixtures and to convert these mixed substrates into products. Co-culture simulations of Escherichia coli strains ALS1008 and ZSC113, engineered for glucose-only and xylose-only uptake respectively, indicated that improvements in batch substrate consumption observed in previous experimental studies resulted primarily from an increase in ZSC113 xylose uptake relative to wild-type E. coli. The E. coli strain ZSC113 engineered for the elimination of glucose uptake was computationally co-cultured with wild-type Saccharomyces cerevisiae, which can only metabolize glucose, to determine if the co-culture was capable of enhanced ethanol production compared to pure cultures of wild-type E. coli and the S. cerevisiae strain RWB218 engineered for combined glucose and xylose uptake. Under the simplifying assumption that both microbes grow optimally under common environmental conditions, optimization of the strain inoculum and the aerobic to anaerobic switching time produced an almost twofold increase in ethanol productivity over the pure cultures. To examine the effect of reduced strain growth rates at non-optimal pH and temperature values, a break-even analysis was performed to determine possible reductions in individual strain substrate uptake rates that resulted in the same predicted ethanol productivity as the best pure culture. © 2010 Wiley Periodicals, Inc.
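
    The full dynamic flux balance approach solves a flux balance linear program at every time step; as a much-simplified, hedged stand-in, the sketch below integrates Monod-type growth for two substrate-selective strains on a glucose/xylose mixture, with all kinetic parameters invented for illustration.

      # Simplified co-culture sketch (not full dFBA): one strain consumes only
      # glucose and one only xylose, integrated with explicit Euler steps.
      # All parameters (mu_max, Ks, yields, inocula) are illustrative.
      import numpy as np

      dt, t_end = 0.05, 30.0                    # h
      glc, xyl = 15.0, 8.0                      # g/L initial sugars
      X_g, X_x = 0.05, 0.05                     # g/L initial biomass of each strain
      mu_g, Ks_g, Y_g = 0.40, 0.5, 0.45         # glucose-only strain
      mu_x, Ks_x, Y_x = 0.25, 1.0, 0.40         # xylose-only strain

      for _ in np.arange(0.0, t_end, dt):
          r_g = mu_g * glc / (Ks_g + glc) * X_g     # growth of the glucose consumer
          r_x = mu_x * xyl / (Ks_x + xyl) * X_x     # growth of the xylose consumer
          X_g += r_g * dt
          X_x += r_x * dt
          glc = max(glc - (r_g / Y_g) * dt, 0.0)
          xyl = max(xyl - (r_x / Y_x) * dt, 0.0)

      print(f"residual glucose {glc:.2f} g/L, residual xylose {xyl:.2f} g/L")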

  13. Influence of the soil/solution ratio, interaction time, and extractant on the evaluation of iron chelate sorption/desorption by soils.

    PubMed

    Hernández-Apaolaza, Lourdes; Lucena, Juan J

    2011-03-23

    Synthetic Fe chelates are the most efficient agricultural practice to control Fe deficiency in crops, EDTA/Fe3+ and o,o-EDDHA/Fe3+ being the most commonly used. Their efficacy as Fe sources and carriers in soils can be severely limited by their retention in the soil. The aim of this work is to evaluate the possible bias introduced in studies of iron chelate retention by soils. For that purpose, results obtained for EDTA and EDDHA iron chelates from two batch studies with different soil/solution ratios were compared with data obtained from a leaching column experiment. Moreover, different extractants were tested to study the o,o-EDDHA/Fe3+ and o,p-EDDHA/Fe3+ desorption from a calcareous soil, and the effect of interaction time on their retention was also evaluated. In summary, the mobility of the studied iron chelates through a calcareous soil differs greatly depending on the type of iron chelate, on the procedure used to evaluate the retention, and on the soil/solution ratio used. In general, the leaching column method is preferred because the conclusions reached are more representative of natural conditions, but batch methods are very useful as preliminary experiments, especially those with a high soil/solution ratio. Iron chelate desorption could be quantified by using a sequential extraction with water, sodium sulfate, and DTPA as extractants. Under the experimental conditions used in this study, o,o-EDDHA/Fe3+ retention increased with interaction time.

  14. Home | BEopt

    Science.gov Websites

    BEopt (Building Energy Optimization) software, developed by NREL (National Renewable Energy Laboratory), provides capabilities to evaluate residential building designs. The sequential search optimization technique used by BEopt finds minimum-cost building designs at different levels of energy savings.

  15. When to Use What Research Design

    ERIC Educational Resources Information Center

    Vogt, W. Paul; Gardner, Dianne C.; Haeffele, Lynne M.

    2012-01-01

    Systematic, practical, and accessible, this is the first book to focus on finding the most defensible design for a particular research question. Thoughtful guidelines are provided for weighing the advantages and disadvantages of various methods, including qualitative, quantitative, and mixed methods designs. The book can be read sequentially or…

  16. Designing User-Computer Dialogues: Basic Principles and Guidelines.

    ERIC Educational Resources Information Center

    Harrell, Thomas H.

    This discussion of the design of computerized psychological assessment or testing instruments stresses the importance of the well-designed computer-user interface. The principles underlying the three main functional elements of computer-user dialogue--data entry, data display, and sequential control--are discussed, and basic guidelines derived…

  17. A sequential linear optimization approach for controller design

    NASA Technical Reports Server (NTRS)

    Horta, L. G.; Juang, J.-N.; Junkins, J. L.

    1985-01-01

    A linear optimization approach with a simple real arithmetic algorithm is presented for reliable controller design and vibration suppression of flexible structures. Using first order sensitivity of the system eigenvalues with respect to the design parameters in conjunction with a continuation procedure, the method converts a nonlinear optimization problem into a maximization problem with linear inequality constraints. The method of linear programming is then applied to solve the converted linear optimization problem. The general efficiency of the linear programming approach allows the method to handle structural optimization problems with a large number of inequality constraints on the design vector. The method is demonstrated using a truss beam finite element model for the optimal sizing and placement of active/passive-structural members for damping augmentation. Results using both the sequential linear optimization approach and nonlinear optimization are presented and compared. The insensitivity to initial conditions of the linear optimization approach is also demonstrated.
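
    A schematic sketch of the approach described above (first-order eigenvalue sensitivities turning a nonlinear placement problem into a linear program that is re-solved along a continuation path); the parameterized system, the finite-difference sensitivities and the move limits below are invented for illustration and are not the authors' truss model.

      # Sketch: sequential linear programming on eigenvalue sensitivities. Each
      # continuation step solves an LP that nudges the design parameters so the
      # predicted closed-loop real parts move leftward, within move limits.
      import numpy as np
      from scipy.optimize import linprog

      def system_matrix(p):
          """Example closed-loop matrix A(p) depending on two design parameters."""
          return np.array([[-0.1 - p[0],  1.0],
                           [-1.0,        -0.2 - p[1]]])

      def eig_real_and_sensitivity(p, h=1e-6):
          """Real parts of the eigenvalues and finite-difference sensitivities."""
          lam0 = np.sort(np.linalg.eigvals(system_matrix(p)).real)
          S = np.zeros((lam0.size, p.size))
          for j in range(p.size):
              dp = np.zeros_like(p)
              dp[j] = h
              lam1 = np.sort(np.linalg.eigvals(system_matrix(p + dp)).real)
              S[:, j] = (lam1 - lam0) / h
          return lam0, S

      p = np.zeros(2)
      for step in range(5):
          lam, S = eig_real_and_sensitivity(p)
          # Minimize the predicted sum of real parts, sum_i (lam_i + S[i] . dp),
          # subject to move limits |dp_j| <= 0.05 (the continuation step size).
          res = linprog(S.sum(axis=0), bounds=[(-0.05, 0.05)] * p.size, method="highs")
          p = p + res.x
          print(f"step {step}: max Re(eig) = {lam.max():.4f}, p = {p.round(3)}")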

  18. Discrete-Event Simulation in Chemical Engineering.

    ERIC Educational Resources Information Center

    Schultheisz, Daniel; Sommerfeld, Jude T.

    1988-01-01

    Gives examples, descriptions, and uses for various types of simulation systems, including the Flowtran, Process, Aspen Plus, Design II, GPSS, Simula, and Simscript. Explains similarities in simulators, terminology, and a batch chemical process. Tables and diagrams are included. (RT)

  19. Optimization of medium components and cultural variables for enhanced production of acidic high maltose-forming and Ca2+-independent α-amylase by Bacillus acidicola.

    PubMed

    Sharma, Archana; Satyanarayana, Tulasi

    2011-05-01

    The production of acidic α-amylase by a novel acidophilic bacterium, Bacillus acidicola TSAS1, was optimized in submerged fermentation using statistical approaches. The process parameters that significantly affected α-amylase production (starch, K(2)HPO(4), inoculum size and temperature) were identified by a Plackett-Burman design. The optimum levels of the significant variables, as determined using the central composite design of response surface methodology, were starch (2.75%), K(2)HPO(4) (0.01%), inoculum size [2% (v/v) containing 1.9×10(8) CFU ml(-1)], and temperature (33°C). Overall 2.4- and 2.9-fold increases in enzyme production were attained in batch and fed-batch fermentations in the laboratory fermentor, respectively. Copyright © 2011 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
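
    As a generic illustration of the central composite / response surface workflow mentioned above (not the authors' data), the sketch below builds a two-factor face-centred central composite design and fits a quadratic surface by least squares; the factor coding and the simulated response are assumptions.

      # Sketch: two-factor face-centred central composite design plus a quadratic
      # response-surface fit and its stationary point. All values are illustrative.
      import numpy as np
      from itertools import product

      factorial = np.array(list(product([-1, 1], repeat=2)), dtype=float)  # corners
      axial = np.array([[-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)    # face points
      centre = np.zeros((3, 2))                                            # centre runs
      X = np.vstack([factorial, axial, centre])    # coded levels of two factors

      rng = np.random.default_rng(1)               # simulated response with an optimum
      y = 10 - 2*X[:, 0]**2 - 1.5*X[:, 1]**2 + 0.5*X[:, 0]*X[:, 1] + rng.normal(0, 0.2, len(X))

      # Quadratic model: b0 + b1 x1 + b2 x2 + b3 x1^2 + b4 x2^2 + b5 x1 x2
      Z = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                           X[:, 0]**2, X[:, 1]**2, X[:, 0]*X[:, 1]])
      b, *_ = np.linalg.lstsq(Z, y, rcond=None)

      # Stationary point of the fitted surface: solve the gradient equations.
      H = np.array([[2*b[3], b[5]],
                    [b[5],   2*b[4]]])
      print("fitted optimum (coded units):", np.linalg.solve(H, -b[1:3]).round(3))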

  20. Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples

    PubMed Central

    2012-01-01

    Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466

  1. Peptide-based protein capture agents with high affinity, selectivity, and stability as antibody replacements in biodetection assays

    NASA Astrophysics Data System (ADS)

    Coppock, Matthew B.; Farrow, Blake; Warner, Candice; Finch, Amethist S.; Lai, Bert; Sarkes, Deborah A.; Heath, James R.; Stratis-Cullum, Dimitra

    2014-05-01

    Current biodetection assays that employ monoclonal antibodies as primary capture agents exhibit limited fieldability, shelf life, and performance due to batch-to-batch production variability and restricted thermal stability. In order to improve upon the detection of biological threats in fieldable assays and systems for the Army, we are investigating protein catalyzed capture (PCC) agents as drop-in replacements for the existing antibody technology through iterative in situ click chemistry. The PCC agent oligopeptides are developed against known protein epitopes and can be mass produced using robotic methods. In this work, a PCC agent under development will be discussed. The performance, including affinity, selectivity, and stability of the capture agent technology, is analyzed by immunoprecipitation, western blotting, and ELISA experiments. The oligopeptide demonstrates superb selectivity coupled with high affinity through multi-ligand design, and improved thermal, chemical, and biochemical stability due to non-natural amino acid PCC agent design.

  2. Changing case Order to Optimise patterns of Performance in mammography Screening (CO-OPS): study protocol for a randomized controlled trial

    PubMed Central

    2014-01-01

    Background: X-ray mammography remains the predominant test for screening for breast cancer, with the aim of reducing breast cancer mortality. In the English NHS Breast Screening Programme each woman's mammograms are examined separately by two expert readers. The two readers read each batch in the same order and each indicates if there should be recall for further tests. This is a highly skilled, pressurised, repetitive and frequently intellectually unchallenging activity where readers examine one or more batches of 30–50 women's mammograms in each session. A vigilance decrement, or performance decrease over time, has been observed in similar repetitive visual tasks such as radar operation. Methods/Design: The CO-OPS study is a pragmatic, multi-centre, two-arm, double blind cluster randomised controlled trial of a computer software intervention designed to reduce the effects of a vigilance decrement in breast cancer screening. The unit of randomisation is the batch. Intervention batches will be examined in the opposite order by the two readers (one forwards, one backwards). Control batches will be read in the same order as one another, as is current standard practice. The hypothesis is that cancer detection rates will be higher in the intervention group because each reader's peak performance will occur when examining different women's mammograms. The trial will take place in 44 English breast screening centres for 1 year and 4 months. The primary outcome is cancer detection rate, which will be extracted from computer records after 1 year of the trial. The secondary outcomes include rate of disagreement between readers (a more statistically powerful surrogate for cancer detection rate), recall rate, positive predictive value, and interval cancer rate (cancers found between screening rounds, which will be measured three years after the end of the trial). Discussion: This is the first trial of an intervention to ameliorate a vigilance decrement in breast cancer screening. Trial registration: ISRCTN46603370 (submitted: 24 October 2012, date of registration: 26 March 2013). PMID:24411004

  3. Concurrent versus sequential sorafenib therapy in combination with radiation for hepatocellular carcinoma.

    PubMed

    Wild, Aaron T; Gandhi, Nishant; Chettiar, Sivarajan T; Aziz, Khaled; Gajula, Rajendra P; Williams, Russell D; Kumar, Rachit; Taparra, Kekoa; Zeng, Jing; Cades, Jessica A; Velarde, Esteban; Menon, Siddharth; Geschwind, Jean F; Cosgrove, David; Pawlik, Timothy M; Maitra, Anirban; Wong, John; Hales, Russell K; Torbenson, Michael S; Herman, Joseph M; Tran, Phuoc T

    2013-01-01

    Sorafenib (SOR) is the only systemic agent known to improve survival for hepatocellular carcinoma (HCC). However, SOR prolongs survival by less than 3 months and does not alter symptomatic progression. To improve outcomes, several phase I-II trials are currently examining SOR with radiation (RT) for HCC utilizing heterogeneous concurrent and sequential treatment regimens. Our study provides preclinical data characterizing the effects of concurrent versus sequential RT-SOR on HCC cells both in vitro and in vivo. Concurrent and sequential RT-SOR regimens were tested for efficacy among 4 HCC cell lines in vitro by assessment of clonogenic survival, apoptosis, cell cycle distribution, and γ-H2AX foci formation. Results were confirmed in vivo by evaluating tumor growth delay and performing immunofluorescence staining in a hind-flank xenograft model. In vitro, concurrent RT-SOR produced radioprotection in 3 of 4 cell lines, whereas sequential RT-SOR produced decreased colony formation among all 4. Sequential RT-SOR increased apoptosis compared to RT alone, while concurrent RT-SOR did not. Sorafenib induced reassortment into less radiosensitive phases of the cell cycle through G1-S delay and cell cycle slowing. More double-strand breaks (DSBs) persisted 24 h post-irradiation for RT alone versus concurrent RT-SOR. In vivo, sequential RT-SOR produced the greatest tumor growth delay, while concurrent RT-SOR was similar to RT alone. More persistent DSBs were observed in xenografts treated with sequential RT-SOR or RT alone versus concurrent RT-SOR. Sequential RT-SOR additionally produced a greater reduction in xenograft tumor vascularity and mitotic index than either concurrent RT-SOR or RT alone. In conclusion, sequential RT-SOR demonstrates greater efficacy against HCC than concurrent RT-SOR both in vitro and in vivo. These results may have implications for clinical decision-making and prospective trial design.

  4. Concurrent versus Sequential Sorafenib Therapy in Combination with Radiation for Hepatocellular Carcinoma

    PubMed Central

    Chettiar, Sivarajan T.; Aziz, Khaled; Gajula, Rajendra P.; Williams, Russell D.; Kumar, Rachit; Taparra, Kekoa; Zeng, Jing; Cades, Jessica A.; Velarde, Esteban; Menon, Siddharth; Geschwind, Jean F.; Cosgrove, David; Pawlik, Timothy M.; Maitra, Anirban; Wong, John; Hales, Russell K.; Torbenson, Michael S.; Herman, Joseph M.; Tran, Phuoc T.

    2013-01-01

    Sorafenib (SOR) is the only systemic agent known to improve survival for hepatocellular carcinoma (HCC). However, SOR prolongs survival by less than 3 months and does not alter symptomatic progression. To improve outcomes, several phase I-II trials are currently examining SOR with radiation (RT) for HCC utilizing heterogeneous concurrent and sequential treatment regimens. Our study provides preclinical data characterizing the effects of concurrent versus sequential RT-SOR on HCC cells both in vitro and in vivo. Concurrent and sequential RT-SOR regimens were tested for efficacy among 4 HCC cell lines in vitro by assessment of clonogenic survival, apoptosis, cell cycle distribution, and γ-H2AX foci formation. Results were confirmed in vivo by evaluating tumor growth delay and performing immunofluorescence staining in a hind-flank xenograft model. In vitro, concurrent RT-SOR produced radioprotection in 3 of 4 cell lines, whereas sequential RT-SOR produced decreased colony formation among all 4. Sequential RT-SOR increased apoptosis compared to RT alone, while concurrent RT-SOR did not. Sorafenib induced reassortment into less radiosensitive phases of the cell cycle through G1-S delay and cell cycle slowing. More double-strand breaks (DSBs) persisted 24 h post-irradiation for RT alone versus concurrent RT-SOR. In vivo, sequential RT-SOR produced the greatest tumor growth delay, while concurrent RT-SOR was similar to RT alone. More persistent DSBs were observed in xenografts treated with sequential RT-SOR or RT alone versus concurrent RT-SOR. Sequential RT-SOR additionally produced a greater reduction in xenograft tumor vascularity and mitotic index than either concurrent RT-SOR or RT alone. In conclusion, sequential RT-SOR demonstrates greater efficacy against HCC than concurrent RT-SOR both in vitro and in vivo. These results may have implications for clinical decision-making and prospective trial design. PMID:23762417

  5. Radar Unix: a complete package for GPR data processing

    NASA Astrophysics Data System (ADS)

    Grandjean, Gilles; Durand, Herve

    1999-03-01

    A complete package for ground penetrating radar data interpretation, including data processing, forward modeling and consultation of a case history database, is presented. Running on a Unix operating system, its architecture consists of a graphical user interface that generates batch files transmitted to a library of processing routines. This design allows better software maintenance and gives the user the possibility of running processing or modeling batch files independently and deferred in time. A case history database is available as a hypertext document which can be consulted using a standard HTML browser. All the software specifications are presented through a realistic example.

  6. THRESHOLD LOGIC SYNTHESIS OF SEQUENTIAL MACHINES.

    DTIC Science & Technology

    The application of threshold logic to the design of sequential machines is the subject of this research. A single layer of threshold logic units in...advantages of fewer components because of the use of threshold logic, along with very high-speed operation resulting from the use of only a single layer of...logic. In some instances, namely for asynchronous machines, the only delay need be the natural delay of the single layer of threshold elements. It is

  7. Heater Validation for the NEXT-C Hollow Cathodes

    NASA Technical Reports Server (NTRS)

    Verhey, Timothy R.; Soulas, George C.; Mackey, Jonathan A.

    2018-01-01

    Swaged cathode heaters whose design was successfully demonstrated under a prior flight project are to be provided by the NASA Glenn Research Center for the NEXT-C ion thruster being fabricated by Aerojet Rocketdyne. Extensive requalification activities were performed to validate process controls that had to be re-established or revised because systemic changes prevented reuse of the past approaches. A development batch of heaters was successfully fabricated based on the new process controls. Acceptance and cyclic life testing of multiple discharge- and neutralizer-sized heaters extracted from the development batch was initiated in August 2016, with the last heater completing testing in April 2017. Cyclic life testing results substantially exceeded the NEXT-C thruster requirement as well as all past experience for GRC-fabricated units. The heaters demonstrated ultimate cyclic life capability of 19050 to 33500 cycles. A qualification batch of heaters is now being fabricated using the finalized process controls. A set of six heaters will be acceptance and cyclic tested to verify conformance to the behavior observed with the development heaters. The heaters for flight use will then be provided to the contractor from the remainder of the qualification batch. This paper summarizes the fabrication process control activities and the acceptance and life testing of the development heater units.

  8. Low-temperature catalytic gasification of food processing wastes. 1995 topical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, D.C.; Hart, T.R.

    The catalytic gasification system described in this report has undergone continuing development and refining work at Pacific Northwest National Laboratory (PNNL) for over 16 years. The original experiments, performed for the Gas Research Institute, were aimed at developing kinetics information for steam gasification of biomass in the presence of catalysts. From the fundamental research evolved the concept of a pressurized, catalytic gasification system for converting wet biomass feedstocks to fuel gas. Extensive batch reactor testing and limited continuous stirred-tank reactor tests provided useful design information for evaluating the preliminary economics of the process. This report is a follow-on to previous interim reports which reviewed the results of the studies conducted with batch and continuous-feed reactor systems from 1989 to 1994, including much work with food processing wastes. The discussion here provides details of experiments on food processing waste feedstock materials, exclusively, that were conducted in batch and continuous-flow reactors.

  9. Laboratory-scale anaerobic sequencing batch reactor for treatment of stillage from fruit distillation.

    PubMed

    Rada, Elena Cristina; Ragazzi, Marco; Torretta, Vincenzo

    2013-01-01

    This work describes batch anaerobic digestion tests carried out on stillages, the residue of fruit distillation, in order to help establish design parameters for a planned plant. The experimental apparatus consisted of three reactors, each with a useful volume of 5 L. The phases of the work were: determining the basic components of the chemical oxygen demand (COD) of the stillages; determining the specific production of biogas; and estimating the rapidly biodegradable COD contained in the stillages. In particular, the main goal of the anaerobic digestion tests on stillages was to measure the specific gas production (SGP) and gas production rate (GPR) in reactors in which stillages were digested using ASBR (anaerobic sequencing batch reactor) technology. Runs were performed with increasing feed concentrations. The optimal loads for obtaining the maximum SGP and GPR values were 8-9 gCOD L(-1) and 0.9 gCOD g(-1) volatile solids.

  10. Bayesian assurance and sample size determination in the process validation life-cycle.

    PubMed

    Faya, Paul; Seaman, John W; Stamey, James D

    2017-01-01

    Validation of pharmaceutical manufacturing processes is a regulatory requirement and plays a key role in the assurance of drug quality, safety, and efficacy. The FDA guidance on process validation recommends a life-cycle approach which involves process design, qualification, and verification. The European Medicines Agency makes similar recommendations. The main purpose of process validation is to establish scientific evidence that a process is capable of consistently delivering a quality product. A major challenge faced by manufacturers is the determination of the number of batches to be used for the qualification stage. In this article, we present a Bayesian assurance and sample size determination approach where prior process knowledge and data are used to determine the number of batches. An example is presented in which potency uniformity data is evaluated using a process capability metric. By using the posterior predictive distribution, we simulate qualification data and make a decision on the number of batches required for a desired level of assurance.
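
    A hedged sketch of the kind of posterior-predictive calculation described above: pilot-batch data give a posterior for the process mean and variance, future qualification batches are simulated from the posterior predictive, and the assurance is the fraction of simulations meeting a capability criterion. The vague prior, pilot data, capability index and acceptance limit below are illustrative assumptions, not those of the article.

      # Sketch: Bayesian assurance for the number of qualification batches using a
      # posterior predictive simulation and a simple Ppk >= 1.0 criterion.
      import numpy as np

      rng = np.random.default_rng(42)
      LSL, USL = 95.0, 105.0                                      # spec limits (% label claim)
      pilot = np.array([99.8, 100.4, 99.1, 100.9, 100.2, 99.6])   # pilot batch potencies

      def posterior_draws(data, n_draws):
          """Draws of (mu, sigma) under a vague normal/scaled-inverse-chi-squared posterior."""
          n, ybar, s2 = len(data), data.mean(), data.var(ddof=1)
          sigma2 = (n - 1) * s2 / rng.chisquare(n - 1, size=n_draws)
          mu = rng.normal(ybar, np.sqrt(sigma2 / n))
          return mu, np.sqrt(sigma2)

      def assurance(n_batches, n_sims=5000):
          """Probability that simulated qualification data meet Ppk >= 1.0."""
          mu, sigma = posterior_draws(pilot, n_sims)
          passes = 0
          for m, s in zip(mu, sigma):
              y = rng.normal(m, s, size=n_batches)        # simulated batch potencies
              ppk = min(USL - y.mean(), y.mean() - LSL) / (3 * y.std(ddof=1))
              passes += ppk >= 1.0
          return passes / n_sims

      for n in (3, 5, 10):
          print(f"{n} batches -> assurance {assurance(n):.3f}")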

  11. Solving a mathematical model integrating unequal-area facilities layout and part scheduling in a cellular manufacturing system by a genetic algorithm.

    PubMed

    Ebrahimi, Ahmad; Kia, Reza; Komijan, Alireza Rashidi

    2016-01-01

    In this article, a novel integrated mixed-integer nonlinear programming model is presented for designing a cellular manufacturing system (CMS), considering machine layout and part scheduling problems simultaneously as interrelated decisions. The integrated CMS model is formulated to incorporate several design features including part due date, material handling time, operation sequence, processing time, an intra-cell layout of unequal-area facilities, and part scheduling. The objective function is to minimize makespan, tardiness penalties, and material handling costs of inter-cell and intra-cell movements. Two numerical examples are solved with the Lingo software to illustrate the results obtained by the incorporated features. In order to assess the effects and importance of integrating machine layout and part scheduling in designing a CMS, two approaches, sequential and concurrent, are investigated, and the improvement resulting from the concurrent approach is revealed. Also, due to the NP-hardness of the integrated model, an efficient genetic algorithm is designed. Computational results of this study indicate that the best solutions found by the GA are better than the solutions found by B&B, in much less time, for both the sequential and concurrent approaches. Moreover, comparisons between the objective function values (OFVs) obtained by the sequential and concurrent approaches demonstrate that the OFV improvement averages around 17% with the GA and 14% with B&B.
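
    As a hedged, generic illustration of the kind of genetic algorithm used for such permutation-type layout and scheduling problems (not the authors' encoding or operators), the sketch below evolves a job sequence for a toy weighted-completion-time objective with order crossover and swap mutation.

      # Sketch: permutation-encoded genetic algorithm on a toy scheduling objective
      # (total weighted completion time). Problem data and GA settings are illustrative.
      import random

      random.seed(0)
      proc = [4, 2, 7, 3, 5, 1, 6]          # processing times of 7 jobs
      weights = [1, 3, 2, 4, 1, 5, 2]       # job weights

      def cost(perm):
          """Total weighted completion time for a job sequence."""
          t = total = 0
          for j in perm:
              t += proc[j]
              total += weights[j] * t
          return total

      def order_crossover(p1, p2):
          a, b = sorted(random.sample(range(len(p1)), 2))
          child = [None] * len(p1)
          child[a:b] = p1[a:b]                          # copy a segment from parent 1
          fill = [g for g in p2 if g not in child]      # remaining genes in parent-2 order
          return [fill.pop(0) if gene is None else gene for gene in child]

      def mutate(perm, rate=0.2):
          if random.random() < rate:
              i, j = random.sample(range(len(perm)), 2)
              perm[i], perm[j] = perm[j], perm[i]
          return perm

      pop = [random.sample(range(len(proc)), len(proc)) for _ in range(40)]
      for _ in range(200):
          pop.sort(key=cost)
          elite = pop[:10]                  # keep the best sequences
          pop = elite + [mutate(order_crossover(*random.sample(elite, 2)))
                         for _ in range(30)]

      best = min(pop, key=cost)
      print("best sequence:", best, "cost:", cost(best))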

  12. Economic Factors in Tunnel Construction

    DOT National Transportation Integrated Search

    1979-02-01

    This report describes a new cost estimating system for tunneling. The system is designed so that it may be used to aid planners, engineers, and designers in evaluating the cost impact of decisions they may make during the sequential stages of plannin...

  13. Environmentally friendly microwave-assisted sequential extraction method followed by ICP-OES and ion-chromatographic analysis for rapid determination of sulphur forms in coal samples.

    PubMed

    Mketo, Nomvano; Nomngongo, Philiswa N; Ngila, J Catherine

    2018-05-15

    A rapid three-step sequential extraction method was developed under microwave radiation followed by inductively coupled plasma-optical emission spectroscopic (ICP-OES) and ion-chromatographic (IC) analysis for the determination of sulphur forms in coal samples. The experimental conditions of the proposed microwave-assisted sequential extraction (MW-ASE) procedure were optimized by using multivariate mathematical tools. Pareto charts generated from the 2³ full factorial design showed that extraction time has an insignificant effect on the extraction of sulphur species; therefore, all the sequential extraction steps were performed for 5 min. The optimum values according to the central composite designs and contour plots of the response surface methodology were 200 °C (microwave temperature) and 0.1 g (coal amount) for all the investigated extracting reagents (H2O, HCl and HNO3). When the optimum conditions of the proposed MW-ASE procedure were applied to coal CRMs, SARM 18 showed more organic sulphur (72%) and the other two coal CRMs (SARMs 19 and 20) were dominated by sulphide sulphur species (52-58%). The sum of the sulphur forms from the sequential extraction steps has shown consistent agreement (95-96%) with the certified total sulphur values on the coal CRM certificates. This correlation, in addition to the good precision (1.7%) achieved by the proposed procedure, suggests that the sequential extraction method is reliable, accurate and reproducible. To safeguard against the destruction of pyritic and organic sulphur forms in extraction step 1, water was used instead of HCl. Additionally, the notorious acidic mixture (HCl/HNO3/HF) was replaced by a greener reagent (H2O2) in the last extraction step. Therefore, the proposed MW-ASE method can be applied in routine laboratories for the determination of sulphur forms in coal and coal-related matrices. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Two Computer Programs for the Statistical Evaluation of a Weighted Linear Composite.

    ERIC Educational Resources Information Center

    Sands, William A.

    1978-01-01

    Two computer programs (one batch, one interactive) are designed to provide statistics for a weighted linear combination of several component variables. Both programs provide mean, variance, standard deviation, and a validity coefficient. (Author/JKS)

  15. MIGRATION OF HAZARDOUS SUBSTANCES THROUGH SOIL

    EPA Science Inventory

    Factorially designed column and batch leaching studies were conducted on samples of various industrial wastes, flue gas desulfurization sludges, and coal fly ash to determine the effect of leaching solution composition on release of hazardous substances from waste samples, and t...

  16. Impact of thermal spectrum small modular reactors on performance of once-through nuclear fuel cycles with low-enriched uranium

    DOE PAGES

    Brown, Nicholas R.; Worrall, Andrew; Todosow, Michael

    2016-11-18

    Small modular reactors (SMRs) offer potential benefits, such as enhanced operational flexibility. However, it is vital to understand the holistic impact of SMRs on nuclear fuel cycle performance. The focus of this paper is the fuel cycle impacts of light water SMRs in a once-through fuel cycle with low-enriched uranium fuel. A key objective of this paper is to describe preliminary example reactor core physics and fuel cycle analyses conducted in support of the U.S. Department of Energy, Office of Nuclear Energy, Fuel Cycle Options Campaign. The hypothetical light water SMR example case considered in these preliminary scoping studies is a cartridge type one-batch core with slightly less than 5.0% enrichment. Challenges associated with SMRs include increased neutron leakage, fewer assemblies in the core (and therefore fewer degrees of freedom in the core design), complex enrichment and burnable absorber loadings, full power operation with inserted control rods, the potential for frequent load-following operation, and shortened core height. Each of these will impact the achievable discharge burnup in the reactor and the fuel cycle performance. This paper summarizes a list of the factors relevant to SMR fuel, core, and operation that will impact fuel cycle performance. The high-level issues identified and preliminary scoping calculations in this paper are intended to inform on potential fuel cycle impacts of one-batch thermal spectrum SMRs. In particular, this paper highlights the impact of increased neutron leakage and reduced number of batches on the achievable burnup of the reactor. Fuel cycle performance metrics for a hypothetical example SMR are compared with those for a conventional three-batch light water reactor in the following areas: nuclear waste management, environmental impact, and resource utilization. The metrics performance for such an SMR is degraded for the mass of spent nuclear fuel and high-level waste disposed of, mass of depleted uranium disposed of, land use per energy generated, and carbon emissions per energy generated. Finally, it is noted that the features of some SMR designs impact three main aspects of fuel cycle performance: (1) small cores which means high leakage (there is a radial and axial component), (2) no boron which means heterogeneous core and extensive use of control rods and BPs, and (3) single batch cores. But not all of the SMR designs have all of these traits. As a result, the approach used in this study is therefore a bounding case and not all SMRs may be affected to the same extent.

  18. Learning to Design Backwards: Examining a Means to Introduce Human-Centered Design Processes to Teachers and Students

    ERIC Educational Resources Information Center

    Gibson, Michael R.

    2016-01-01

    "Designing backwards" is presented here as a means to utilize human-centered processes in diverse educational settings to help teachers and students learn to formulate and operate design processes to achieve three sequential and interrelated goals. The first entails teaching them to effectively and empathetically identify, frame and…

  19. A Survey of Methods for Computing Best Estimates of Endoatmospheric and Exoatmospheric Trajectories

    NASA Technical Reports Server (NTRS)

    Bernard, William P.

    2018-01-01

    Beginning with the mathematical prediction of planetary orbits in the early seventeenth century and continuing through the most recent developments in sensor fusion methods, many techniques have emerged that can be employed for the problem of endo- and exoatmospheric trajectory estimation. Although early methods were ad hoc, the twentieth century saw the emergence of many systematic approaches to estimation theory that produced a wealth of useful techniques. The broad genesis of estimation theory has resulted in an equally broad array of mathematical principles, methods and vocabulary. Among the fundamental ideas and methods that are briefly touched on are batch and sequential processing; smoothing, estimation, and prediction; sensor fusion and sensor fusion architectures; data association; Bayesian and non-Bayesian filtering; the family of Kalman filters; models of the dynamics of the phases of a rocket's flight; and asynchronous, delayed, and asequent data. Along the way, a few trajectory estimation issues are addressed and much of the vocabulary is defined.
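
    As a toy illustration of the batch-versus-sequential distinction surveyed above, the sketch below estimates a constant-velocity state from noisy position measurements once with batch least squares and once with a sequential Kalman filter; the dynamics, noise levels and measurement model are invented for the example.

      # Sketch: batch least squares vs. a sequential Kalman filter on the same
      # constant-velocity, position-only measurement problem. Numbers are illustrative.
      import numpy as np

      rng = np.random.default_rng(7)
      dt, n = 1.0, 20
      F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity dynamics
      H = np.array([[1.0, 0.0]])              # position-only measurement
      sigma = 0.5                             # measurement noise standard deviation

      x_true = np.array([0.0, 1.0])
      meas = []
      for _ in range(n):
          x_true = F @ x_true
          meas.append(H @ x_true + rng.normal(0, sigma, 1))

      # Batch least squares for the initial state: z_k = H F^(k+1) x0 + noise.
      A = np.vstack([H @ np.linalg.matrix_power(F, k + 1) for k in range(n)])
      x0_batch, *_ = np.linalg.lstsq(A, np.concatenate(meas), rcond=None)

      # Sequential Kalman filter on the same model, starting from a diffuse prior.
      x, P, R = np.zeros(2), np.eye(2) * 100.0, np.array([[sigma**2]])
      for zk in meas:
          x, P = F @ x, F @ P @ F.T                     # predict (no process noise)
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
          x = x + K @ (zk - H @ x)
          P = (np.eye(2) - K @ H) @ P

      print("batch estimate of the initial state:", x0_batch.round(3))
      print("sequential (filtered) final state:", x.round(3), "true:", x_true.round(3))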

  20. Determination of the total acidity in soft drinks using potentiometric sequential injection titration.

    PubMed

    van Staden, J Koos F; Mashamba, M Mulalo G; Stefan, R Raluca I

    2002-12-06

    A potentiometric SI titration system for the determination of total acidity in soft drinks is proposed. The concept is based on aspirating the acidic soft drink sample between two base zones into a holding coil, with the volume of the first base zone twice that of the second, and channelling it by flow reversal through a reaction coil to a potentiometric sensor. A solution of 0.1 mol l(-1) sodium chloride is used as an ionic strength adjustment buffer in the carrier stream. The system has been applied to the analysis of some South African soft drinks having a total acidity level of about 0.2-0.3% (w/v). The method has a sample frequency of 45 samples per h with a linear range of 0.1-0.6% (w/v). It is easy to use, fully computerised, and gives results that are comparable to both automated batch titration and manual titration.

  1. Rapid Vortex Fluidics: Continuous Flow Synthesis of Amides and Local Anesthetic Lidocaine.

    PubMed

    Britton, Joshua; Chalker, Justin M; Raston, Colin L

    2015-07-20

    Thin film flow chemistry using a vortex fluidic device (VFD) is effective in the scalable acylation of amines under shear, with the yields of the amides dramatically enhanced relative to traditional batch techniques. The optimized monophasic flow conditions are effective in ≤80 seconds at room temperature, enabling access to structurally diverse amides, functionalized amino acids and substituted ureas on multigram scales. Amide synthesis under flow was also extended to a total synthesis of local anesthetic lidocaine, with sequential reactions carried out in two serially linked VFD units. The synthesis could also be executed in a single VFD, in which the tandem reactions involve reagent delivery at different positions along the rapidly rotating tube with in situ solvent replacement, as a molecular assembly line process. This further highlights the versatility of the VFD in organic synthesis, as does the finding of a remarkably efficient debenzylation of p-methoxybenzyl amines. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. D-Tagatose production in the presence of borate by resting Lactococcus lactis cells harboring Bifidobacterium longum L-arabinose isomerase.

    PubMed

    Salonen, Noora; Salonen, Kalle; Leisola, Matti; Nyyssölä, Antti

    2013-04-01

    Bifidobacterium longum NRRL B-41409 L-arabinose isomerase (L-AI) was overexpressed in Lactococcus lactis using a phosphate-depletion-inducible expression system. The resting L. lactis cells harboring the B. longum L-AI were used for production of D-tagatose from D-galactose in the presence of borate buffer. Multivariable analysis suggested that high pH, temperature and borate concentration favoured the conversion of D-galactose to D-tagatose. Almost quantitative conversion (92%) was achieved at 20 g L⁻¹ substrate and at 37.5 °C after 5 days. A D-tagatose production rate of 185 g L⁻¹ day⁻¹ was obtained at 300 g L⁻¹ galactose, 1.15 M borate, and 41 °C over 10 days when the production medium was changed every 24 h. There was no significant loss in productivity during ten sequential 24 h batches.

  3. Cultivation of aerobic granular sludge for rubber wastewater treatment.

    PubMed

    Rosman, Noor Hasyimah; Nor Anuar, Aznah; Othman, Inawati; Harun, Hasnida; Sulong Abdul Razak, Muhammad Zuhdi; Elias, Siti Hanna; Mat Hassan, Mohd Arif Hakimi; Chelliapan, Shreesivadass; Ujang, Zaini

    2013-02-01

    Aerobic granular sludge (AGS) was successfully cultivated at 27±1 °C and pH 7.0±1 during the treatment of rubber wastewater using a sequential batch reactor system operated with a complete cycle time of 3 h. Results showed that the aerobic granular sludge had excellent settling ability and exhibited exceptional performance in organics and nutrient removal from rubber wastewater. Regular, dense and fast-settling granules (average diameter, 1.5 mm; settling velocity, 33 m h(-1); and sludge volume index, 22.3 mL g(-1)) were developed in a single reactor. In addition, 96.5% COD removal efficiency was observed in the system at the end of the granulation period, while ammonia and total nitrogen removal efficiencies were up to 94.7% and 89.4%, respectively. The study demonstrated the capability of AGS development in a single, tall and slender column-type bioreactor for the treatment of rubber wastewater. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Metabolic influence of lead on polyhydroxyalkanoates (PHA) production and phosphate uptake in activated sludge fed with glucose or acetic acid as carbon source.

    PubMed

    You, Sheng-Jie; Tsai, Yung-Pin; Cho, Bo-Chuan; Chou, Yi-Hsiu

    2011-09-01

    Sludge in a sequential batch reactor (SBR) system was used to investigate the effect of lead toxicity on the metabolisms of polyphosphate accumulating organism (PAO) and glycogen accumulating organism (GAO) communities fed with acetic acid or glucose as their sole carbon source, respectively. Results showed that the effect of lead on substrate utilization by both PAOs and GAOs was insignificant. However, lead substantially inhibited both phosphate release and phosphate uptake by PAOs. In trials with high acetic acid concentrations, an abnormal aerobic phosphate release was observed instead of phosphate uptake, and the release rate increased with increasing lead concentration. Results also showed that PAOs could normally synthesize polyhydroxybutyrate (PHB) in the anaerobic phase even at a lead concentration of 40 mg L(-1). However, they could not aerobically utilize PHB normally in the presence of lead. On the other hand, GAOs could not normally metabolize polyhydroxyvalerate (PHV) in either the anaerobic or the aerobic phase. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Aroma profiling of an aerated fermentation of natural grape must with selected yeast strains at pilot scale.

    PubMed

    Tronchoni, Jordi; Curiel, José Antonio; Sáenz-Navajas, María Pilar; Morales, Pilar; de-la-Fuente-Blanco, Arancha; Fernández-Zurbano, Purificación; Ferreira, Vicente; Gonzalez, Ramon

    2018-04-01

    The use of non-Saccharomyces strains in aerated conditions has proven effective for alcohol content reduction in wine during lab-scale fermentation. The process has been scaled up to 20 L batches, in order to produce lower alcohol wines amenable to sensory analysis. Sequential instead of simultaneous inoculation was chosen to prevent oxygen exposure of Saccharomyces cerevisiae during fermentation, since previous results indicated that this would result in increased acetic acid production. In addition, an adaptation step was included to facilitate non-Saccharomyces implantation in natural must. Wines elaborated with Torulaspora delbrueckii or Metschnikowia pulcherrima in aerated conditions contained less alcohol than control wine (S. cerevisiae, non-aerated). Sensory and aroma analysis revealed that the quality of mixed fermentations was affected by the high levels of some yeast amino acid related byproducts, which suggests that further progress requires a careful selection of non-Saccharomyces strains and the use of specific N-nutrients. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Immobilized acclimated biomass-powdered activated carbon for the bioregeneration of granular activated carbon loaded with phenol and o-cresol.

    PubMed

    Toh, Run-Hong; Lim, Poh-Eng; Seng, Chye-Eng; Adnan, Rohana

    2013-09-01

    The objectives of the study are to use immobilized acclimated biomass and immobilized biomass-powdered activated carbon (PAC) as a novel approach in the bioregeneration of granular activated carbon (GAC) loaded with phenol and o-cresol, respectively, and to compare the efficiency and rate of the bioregeneration of the phenolic compound-loaded GAC using immobilized and suspended biomasses under varying GAC dosages. Bioregeneration of GAC loaded with phenol and o-cresol, respectively, was conducted in batch system using the sequential adsorption and biodegradation approach. The results showed that the bioregeneration efficiency of GAC loaded with phenol or o-cresol was basically the same irrespective of whether the immobilized or suspended biomass was used. Nonetheless, the duration for bioregeneration was longer under immobilized biomass. The beneficial effect of immobilized PAC-biomass for bioregeneration is the enhancement of the removal rate of the phenolic compounds via adsorption and the shortening of the bioregeneration duration. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Highly organic natural media as permeable reactive barriers: TCE partitioning and anaerobic degradation profile in eucalyptus mulch and compost.

    PubMed

    Öztürk, Zuhal; Tansel, Berrin; Katsenovich, Yelena; Sukop, Michael; Laha, Shonali

    2012-10-01

    Batch and column experiments were conducted with eucalyptus mulch and commercial compost to evaluate the suitability of highly organic natural media to support anaerobic decomposition of trichloroethylene (TCE) in groundwater. Experimental data for TCE and its dechlorination byproducts were analyzed with the Hydrus-1D model to estimate the partitioning and kinetic parameters for the sequential dechlorination reactions during TCE decomposition. The highly organic natural media allowed development of a bioactive zone capable of decomposing TCE under anaerobic conditions. The first-order TCE biodecomposition reaction rates were 0.23 and 1.2 d(-1) in the eucalyptus mulch and compost media, respectively. The retardation factors for TCE in the eucalyptus mulch and compost columns were 35 and 301, respectively. The results showed that natural organic soil amendments can effectively support an anaerobic bioactive zone for remediation of TCE-contaminated groundwater. Natural organic media are effective, environmentally sustainable materials for use in permeable reactive barriers. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Chemical characterisation and analysis of the cell wall polysaccharides of duckweed (Lemna minor).

    PubMed

    Zhao, X; Moates, G K; Wellner, N; Collins, S R A; Coleman, M J; Waldron, K W

    2014-10-13

    Duckweed is potentially an ideal biofuel feedstock due to its high proportion of cellulose and starch and low lignin content. However, there is little detailed information on the composition and structure of duckweed cell walls relevant to optimising the conversion of duckweed biomass to ethanol and other biorefinery products. This study reports that, for the variety and batch evaluated, carbohydrates constitute 51.2% (w/w) of dry matter while starch accounts for 19.9%. This study, for the first time, analyses duckweed cell wall composition through a detailed sequential extraction. The cell wall is rich in cellulose and also contains 20.3% pectin comprising galacturonan, xylogalacturonan, rhamnogalacturonan; 3.5% hemicellulose comprising xyloglucan and xylan, and 0.03% phenolics. In addition, essential fatty acids (0.6%, α-linolenic and linoleic/linoelaidic acid) and p-coumaric acid (0.015%) respectively are the most abundant fatty acids and phenolics in whole duckweed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Treatment of variable and intermittently flowing wastewaters.

    PubMed

    Kocasoy, Günay

    1993-11-01

    The biological treatment of wastewaters originating from hotels and residential areas of seasonal use, which flow intermittently, is difficult because the bacteria cannot survive during periods of no flow. An investigation was conducted to develop a system able to overcome these difficulties, and the following system gave satisfactory results. The wastewater was taken initially into an aeration tank operating as a sequential batch reactor. After the sedimentation phase of the reactor, the waste was transferred to a coagulation-flocculation tank where it was treated by chemical means and then settled in order to separate the flocs. When the population of bacteria in the aeration tank reached the required level, the physico-chemical treatment was terminated; the tank used for chemical treatment was then operated as an equalization tank, while the aeration and sedimentation tanks were used as an activated sludge unit. This system proved to be a satisfactory method for the above-mentioned wastes.

  10. Simplified recovery of enzymes and nutrients in sweet potato wastewater and preparing health black tea and theaflavins with scrap tea.

    PubMed

    Li, Qing-Rong; Luo, Jia-Ling; Zhou, Zhong-Hua; Wang, Guang-Ying; Chen, Rui; Cheng, Shi; Wu, Min; Li, Hui; Ni, He; Li, Hai-Hang

    2018-04-15

    Industry discards large amounts of organic wastewater from sweet potato starch factories and scrap tea from tea production. A simplified procedure was developed to recover all biochemicals from sweet potato starch factory wastewater and use them to make health black tea and theaflavins from scrap green tea. The sweet potato wastewater was sequentially treated by isoelectric precipitation, ultrafiltration and nanofiltration to recover polyphenol oxidase (PPO), β-amylase, and small-molecular fractions, respectively. The PPO fraction can effectively transform green tea extracts into black tea with a high content of theaflavins through optimized fed-batch feeding fermentation. The PPO-transformed black tea with sporamins can be used to make health black tea, or to make theaflavins by fractionation with ethyl acetate. This work provides a resource- and environment-friendly approach for economically utilizing sweet potato wastewater and scrap tea, and for making biochemical, nutrient and health products. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Transfer of the epoxidation of soybean oil from batch to flow chemistry guided by cost and environmental issues.

    PubMed

    Kralisch, Dana; Streckmann, Ina; Ott, Denise; Krtschil, Ulich; Santacesaria, Elio; Di Serio, Martino; Russo, Vincenzo; De Carlo, Lucrezia; Linhart, Walter; Christian, Engelbert; Cortese, Bruno; de Croon, Mart H J M; Hessel, Volker

    2012-02-13

    The simple transfer of established chemical production processes from batch to flow chemistry does not automatically result in more sustainable ones. Detailed process understanding and the motivation to scrutinize known process conditions are necessary factors for success. Although the focus is usually "only" on intensifying transport phenomena to operate under intrinsic kinetics, there is also a large intensification potential in chemistry under harsh conditions and in the specific design of flow processes. Such an understanding and proposed processes are required at an early stage of process design, because decisions on the best-suited tools and parameters required to convert green engineering concepts into practice (typically with little chance of substantial changes later) are made during this period. Herein, we present a holistic and interdisciplinary process design approach that combines the concept of novel process windows with process modeling, simulation, and simplified cost and lifecycle assessment for the deliberate development of a cost-competitive and environmentally sustainable alternative to an existing production process for epoxidized soybean oil. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. A Generic Inner-Loop Control Law Structure for Six-Degree-of-Freedom Conceptual Aircraft Design

    NASA Technical Reports Server (NTRS)

    Cox, Timothy H.; Cotting, M. Christopher

    2005-01-01

    A generic control system framework for both real-time and batch six-degree-of-freedom simulations is presented. This framework uses a simplified dynamic inversion technique to allow for stabilization and control of any type of aircraft at the pilot interface level. The simulation, designed primarily for the real-time simulation environment, also can be run in a batch mode through a simple guidance interface. Direct vehicle-state acceleration feedback is required with the simplified dynamic inversion technique. The estimation of surface effectiveness within real-time simulation timing constraints also is required. The generic framework provides easily modifiable control variables, allowing flexibility in the variables that the pilot commands. A direct control allocation scheme is used to command aircraft effectors. Primary uses for this system include conceptual and preliminary design of aircraft, when vehicle models are rapidly changing and knowledge of vehicle six-degree-of-freedom performance is required. A simulated airbreathing hypersonic vehicle and simulated high-performance fighter aircraft are used to demonstrate the flexibility and utility of the control system.
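
    A minimal sketch, under invented numbers, of the simplified dynamic inversion idea described above for a single axis: a command filter produces a desired acceleration, and the control is obtained by inverting an estimated control effectiveness; the model, effectiveness estimate and gains are assumptions, not the framework's actual implementation.

      # Sketch: single-axis simplified dynamic inversion with an imperfect estimate
      # of control effectiveness. A first-order command filter sets the desired
      # acceleration; the plant, gains and mismatch are purely illustrative.
      dt, steps = 0.01, 500
      b_true, b_est = 2.0, 1.8          # true vs estimated control effectiveness
      f = lambda q: -0.5 * q            # modeled natural dynamics (illustrative)

      q, q_cmd, k_cmd = 0.0, 1.0, 3.0   # state, pilot command, command-filter gain
      for _ in range(steps):
          a_des = k_cmd * (q_cmd - q)           # desired acceleration
          u = (a_des - f(q)) / b_est            # dynamic inversion with the estimate
          q += (f(q) + b_true * u) * dt         # true plant response (Euler step)

      print(f"rate after {steps * dt:.1f} s: {q:.3f} (command {q_cmd})")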

  13. A Generic Inner-Loop Control Law Structure for Six-Degree-of-Freedom Conceptual Aircraft Design

    NASA Technical Reports Server (NTRS)

    Cox, Timothy H.; Cotting, Christopher

    2005-01-01

    A generic control system framework for both real-time and batch six-degree-of-freedom (6-DOF) simulations is presented. This framework uses a simplified dynamic inversion technique to allow for stabilization and control of any type of aircraft at the pilot interface level. The simulation, designed primarily for the real-time simulation environment, also can be run in a batch mode through a simple guidance interface. Direct vehicle-state acceleration feedback is required with the simplified dynamic inversion technique. The estimation of surface effectiveness within real-time simulation timing constraints also is required. The generic framework provides easily modifiable control variables, allowing flexibility in the variables that the pilot commands. A direct control allocation scheme is used to command aircraft effectors. Primary uses for this system include conceptual and preliminary design of aircraft, when vehicle models are rapidly changing and knowledge of vehicle 6-DOF performance is required. A simulated airbreathing hypersonic vehicle and simulated high-performance fighter aircraft are used to demonstrate the flexibility and utility of the control system.

  14. Comparative meta-analysis and experimental kinetic investigation of column and batch bottle microcosm treatability studies informing in situ groundwater remedial design.

    PubMed

    Driver, Erin M; Roberts, Jeff; Dollar, Peter; Charles, Maurissa; Hurst, Paul; Halden, Rolf U

    2017-02-05

    A systematic comparison was performed between batch bottle and continuous-flow column microcosms (BMs and CMs, respectively) commonly used for in situ groundwater remedial design. Review of recent literature (2000-2014) showed a preference for reporting batch kinetics, even when corresponding column data were available. Additionally, CMs produced higher observed rate constants, exceeding those of BMs by a factor of 6.1 ± 1.1 (standard error). In a subsequent laboratory investigation, 12 equivalent microcosm pairs were constructed from fractured bedrock and perchloroethylene (PCE) impacted groundwater. First-order PCE transformation kinetics of CMs were 8.0 ± 4.8 times faster than BMs (rates: 1.23 ± 0.87 vs. 0.16 ± 0.05 d⁻¹, respectively). Additionally, CMs transformed 16.1 ± 8.0 times more mass than BMs owing to continuous-feed operation. CMs are concluded to yield more reliable kinetic estimates because of much higher data density stemming from long-term, steady-state conditions. Since information from BMs and CMs is valuable and complementary, treatability studies should report kinetic data from both when available. This first systematic investigation of BMs and CMs highlights the need for a more unified framework for data use and reporting in treatability studies informing decision-making for field-scale groundwater remediation. Copyright © 2016 Elsevier B.V. All rights reserved.
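
    A minimal sketch of the rate-constant estimation step underlying the comparison above: first-order constants are obtained by regressing ln(C/C0) against time for each microcosm type. The sampling times and concentrations below are hypothetical placeholders, not the study's measurements.

```python
# Minimal sketch: estimating first-order transformation rate constants
# from microcosm concentration time series. The data below are
# hypothetical and only illustrate the regression step.
import numpy as np

def first_order_k(t_days, conc):
    """Fit ln(C/C0) = -k*t by least squares and return k (per day)."""
    y = np.log(np.asarray(conc) / conc[0])
    slope, _ = np.polyfit(np.asarray(t_days), y, 1)
    return -slope

t = [0, 2, 4, 8, 16]                      # sampling days (hypothetical)
c_column = [100, 8.5, 0.9, 0.05, 0.01]    # rapid PCE loss (hypothetical)
c_batch = [100, 73, 53, 28, 8]            # slower PCE loss (hypothetical)

k_cm = first_order_k(t, c_column)
k_bm = first_order_k(t, c_batch)
print(f"k_CM = {k_cm:.2f} 1/d, k_BM = {k_bm:.2f} 1/d, ratio = {k_cm / k_bm:.1f}")
```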

  15. The combined effect of wet granulation process parameters and dried granule moisture content on tablet quality attributes.

    PubMed

    Gabbott, Ian P; Al Husban, Farhan; Reynolds, Gavin K

    2016-09-01

    A pharmaceutical compound was used to study the effect of batch wet granulation process parameters in combination with the residual moisture content remaining after drying on granule and tablet quality attributes. The effect of three batch wet granulation process parameters was evaluated using a multivariate experimental design, with a novel constrained design space. Batches were characterised for moisture content, granule density, crushing strength, porosity, disintegration time and dissolution. Mechanisms of the effect of the process parameters on the granule and tablet quality attributes are proposed. Water quantity added during granulation showed a significant effect on granule density and tablet dissolution rate. Mixing time showed a significant effect on tablet crushing strength, and mixing speed showed a significant effect on the distribution of tablet crushing strengths obtained. The residual moisture content remaining after granule drying showed a significant effect on tablet crushing strength. The effect of moisture on tablet tensile strength has been reported before, but not in combination with granulation parameters and granule properties, and the impact on tablet dissolution was not assessed. Correlations between the energy input during granulation, the density of granules produced, and the quality attributes of the final tablets were also identified. Understanding the impact of the granulation and drying process parameters on granule and tablet properties provides a basis for process optimisation and scaling. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Solar oxidation and removal of arsenic--Key parameters for continuous flow applications.

    PubMed

    Gill, L W; O'Farrell, C

    2015-12-01

    Solar oxidation to remove arsenic from water has previously been investigated as a batch process. This research has investigated the kinetic parameters for the design of a continuous flow solar reactor to remove arsenic from contaminated groundwater supplies. Continuous flow recirculated batch experiments were carried out under artificial UV light to investigate the effect of different parameters on arsenic removal efficiency. Inlet water arsenic concentrations of up to 1000 μg/L were reduced to below 10 μg/L, requiring 12 mg/L iron, after receiving a UV dose of 12 kJ/L. Citrate, however, was somewhat surprisingly found to have a detrimental effect on the removal process in the continuous flow reactor studies, contrary to results found in batch scale tests. The impact of other typical groundwater quality parameters (phosphate and silica), which compete with arsenic for photooxidation products, revealed a much higher sensitivity to phosphate ions than to silicate. Other results showed no benefit from the addition of TiO2 photocatalyst but enhanced arsenic removal at higher temperatures up to 40 °C. Overall, these results have indicated the kinetic envelope from which a continuous flow SORAS single pass system could be more confidently designed for a full-scale community groundwater application at a village level. Copyright © 2015 Elsevier Ltd. All rights reserved.
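
    As a rough illustration of how a volumetric UV dose target such as 12 kJ/L translates into continuous-flow design, the sketch below sizes the residence time and single-pass flow for an assumed reactor. Every parameter value (irradiance, collector area, reactor volume) is an assumption for illustration only.

```python
# Back-of-envelope sizing sketch for a single-pass solar reactor that must
# deliver a target UV dose per unit volume. All reactor parameters below
# are assumptions chosen for illustration, not values from the study.
uv_irradiance = 30.0         # W/m^2 of UV reaching the water (assumed)
illuminated_area = 2.0       # m^2 of collector surface (assumed)
reactor_volume = 40.0        # L held under illumination (assumed)
target_dose_kj_per_l = 12.0  # dose associated with >99% As removal above

power_per_litre = uv_irradiance * illuminated_area / reactor_volume  # W/L
residence_time_s = target_dose_kj_per_l * 1000.0 / power_per_litre   # s
flow_rate_l_per_h = reactor_volume / (residence_time_s / 3600.0)

print(f"required residence time: {residence_time_s / 3600:.1f} h")
print(f"maximum single-pass flow: {flow_rate_l_per_h:.1f} L/h")
```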

  17. Orphan therapies: making best use of postmarket data.

    PubMed

    Maro, Judith C; Brown, Jeffrey S; Dal Pan, Gerald J; Li, Lingling

    2014-08-01

    Postmarket surveillance of the comparative safety and efficacy of orphan therapeutics is challenging, particularly when multiple therapeutics are licensed for the same orphan indication. To make best use of product-specific registry data collected to fulfill regulatory requirements, we propose the creation of a distributed electronic health data network among registries. Such a network could support sequential statistical analyses designed to detect early warnings of excess risks. We use a simulated example to explore the circumstances under which a distributed network may prove advantageous. We perform sample size calculations for sequential and non-sequential statistical studies aimed at comparing the incidence of hepatotoxicity following initiation of two newly licensed therapies for homozygous familial hypercholesterolemia. We calculate the sample size savings ratio, or the proportion of sample size saved if one conducted a sequential study as compared to a non-sequential study. Then, using models to describe the adoption and utilization of these therapies, we simulate when these sample sizes are attainable in calendar years. We then calculate the analytic calendar time savings ratio, analogous to the sample size savings ratio. We repeat these analyses for numerous scenarios. Sequential analyses detect effect sizes earlier or at the same time as non-sequential analyses. The most substantial potential savings occur when the market share is more imbalanced (i.e., 90% for therapy A) and the effect size is closest to the null hypothesis. However, due to low exposure prevalence, these savings are difficult to realize within the 30-year time frame of this simulation for scenarios in which the outcome of interest occurs at or more frequently than one event/100 person-years. We illustrate a process to assess whether sequential statistical analyses of registry data performed via distributed networks may prove a worthwhile infrastructure investment for pharmacovigilance.
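
    The two summary ratios described above reduce to simple proportions; the sketch below computes them for hypothetical sample sizes and calendar times, not the study's simulated values.

```python
# Minimal sketch of the two summary ratios described above, using
# hypothetical sample sizes and calendar times (not the study's results).
def savings_ratio(sequential, non_sequential):
    """Proportion saved by a sequential study relative to a non-sequential one."""
    return 1.0 - sequential / non_sequential

n_seq, n_fixed = 620.0, 900.0   # required sample sizes (hypothetical)
t_seq, t_fixed = 11.0, 14.0     # calendar years to reach them (hypothetical)

print(f"sample size savings ratio:   {savings_ratio(n_seq, n_fixed):.2f}")
print(f"calendar time savings ratio: {savings_ratio(t_seq, t_fixed):.2f}")
```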

  18. Sea Fighter Analysis

    DTIC Science & Technology

    2007-02-01

    which is used by the model to drive the normal activities of the crew (Figure C.1-2). These routines consist of a sequential list of high-level...separately. Figure C.1-3: Resources & Logic Sheet C.1.1.4 Scenario The scenario that is performed during a model run is a sequential list of all...were marked with a white fore and aft lineup stripe on both landing spots. Current Sea Fighter design does not provide a hangar; however, there

  19. Diagnosing Femoroacetabular Impingement From Plain Radiographs

    PubMed Central

    Ayeni, Olufemi R.; Chan, Kevin; Whelan, Daniel B.; Gandhi, Rajiv; Williams, Dale; Harish, Srinivasan; Choudur, Hema; Chiavaras, Mary M.; Karlsson, Jon; Bhandari, Mohit

    2014-01-01

    Background: A diagnosis of femoroacetabular impingement (FAI) requires careful history and physical examination, as well as an accurate and reliable radiologic evaluation using plain radiographs as a screening modality. Radiographic markers in the diagnosis of FAI are numerous and not fully validated. In particular, reliability in their assessment across health care providers is unclear. Purpose: To determine inter- and intraobserver reliability between orthopaedic surgeons and musculoskeletal radiologists. Study Design: Cohort study (diagnosis); Level of evidence, 3. Methods: Six physicians (3 orthopaedic surgeons, 3 musculoskeletal radiologists) independently evaluated a broad spectrum of FAI pathologies across 51 hip radiographs on 2 occasions separated by at least 4 weeks. Reviewers used 8 common criteria to diagnose FAI, including (1) pistol-grip deformity, (2) size of alpha angle, (3) femoral head-neck offset, (4) posterior wall sign abnormality, (5) ischial spine sign abnormality, (6) coxa profunda abnormality, (7) crossover sign abnormality, and (8) acetabular protrusion. Agreement was calculated using the intraclass correlation coefficient (ICC). Results: When establishing an FAI diagnosis, there was poor interobserver reliability between the surgeons and radiologists (ICC batch 1 = 0.33; ICC batch 2 = 0.15). In contrast, there was higher interobserver reliability within each specialty, ranging from fair to good (surgeons: ICC batch 1 = 0.72; ICC batch 2 = 0.70 vs radiologists: ICC batch 1 = 0.59; ICC batch 2 = 0.74). Orthopaedic surgeons had the highest interobserver reliability when identifying pistol-grip deformities (ICC = 0.81) or abnormal alpha angles (ICC = 0.81). Similarly, radiologists had the highest agreement for detecting pistol-grip deformities (ICC = 0.75). Conclusion: These results suggest that surgeons and radiologists agree among themselves, but there is a need to improve the reliability of radiographic interpretations for FAI between the 2 specialties. The observed degree of low reliability may ultimately lead to missed, delayed, or inappropriate treatments for patients with symptomatic FAI. PMID:26535344

  20. Hot plant recycling of asphaltic concrete : final report.

    DOT National Transportation Integrated Search

    1980-05-01

    This report covers the design, construction and evaluation of two hot mix recycling projects. One project recycled two inches of existing dense-asphaltic concrete through a modified batch plant. The second project recycled a total of five inches of e...

  1. IN-SITU REGENERATION OF GRANULAR ACTIVATED CARBON (GAC) USING FENTON'S REAGENTS

    EPA Science Inventory

    Fenton-dependent regeneration of granular activated carbon (GAC) initially saturated with one of several chlorinated aliphatic contaminants was studied in batch and continuous-flow reactors. Homogeneous and heterogeneous experiments were designed to investigate the effects of va...

  2. Logs Perl Module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owen, R. K.

    2007-04-04

    A perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.
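
    The record describes a Perl module; the sketch below is a Python analogue of the same idea, assuming the usual semicolon-delimited PBS accounting-record layout (date-time; record type; job id; key=value attributes). The layout and the sample line are assumptions for illustration, not part of the module's documentation.

```python
# Python sketch of the idea behind the Perl module described above: read
# PBS accounting-log lines, filter on record type and date range, and
# return the attributes as dictionaries. The semicolon-delimited layout
# ("MM/DD/YYYY HH:MM:SS;TYPE;job id;key=value ...") is assumed here.
from datetime import datetime

def parse_pbs_log(lines, record_types=("E",), start=None, end=None):
    for line in lines:
        try:
            stamp, rtype, jobid, attrs = line.rstrip("\n").split(";", 3)
        except ValueError:
            continue  # skip lines that do not match the assumed layout
        when = datetime.strptime(stamp, "%m/%d/%Y %H:%M:%S")
        if record_types and rtype not in record_types:
            continue
        if (start and when < start) or (end and when > end):
            continue
        fields = dict(kv.split("=", 1) for kv in attrs.split() if "=" in kv)
        yield {"time": when, "type": rtype, "id": jobid, **fields}

sample = ["03/04/2007 10:15:00;E;1234.pbs01;user=owen queue=batch resources_used.walltime=01:02:03"]
for rec in parse_pbs_log(sample):
    print(rec["id"], rec.get("resources_used.walltime"))
```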

  3. A strategy for comprehensive identification of sequential constituents using ultra-high-performance liquid chromatography coupled with linear ion trap-Orbitrap mass spectrometer, application study on chlorogenic acids in Flos Lonicerae Japonicae.

    PubMed

    Zhang, Jia-yu; Wang, Zi-jian; Li, Yun; Liu, Ying; Cai, Wei; Li, Chen; Lu, Jian-qiu; Qiao, Yan-jiang

    2016-01-15

    The analytical methodologies available for evaluating multi-component systems in traditional Chinese medicines (TCMs) have been inadequate. As a result, the poorly characterized multi-component composition hinders adequate interpretation of their bioactivities. In this paper, an ultra-high-performance liquid chromatography coupled with linear ion trap-Orbitrap (UPLC-LTQ-Orbitrap)-based strategy focused on the comprehensive identification of TCM sequential constituents was developed. The strategy was characterized by molecular design, multiple ion monitoring (MIM), targeted database hits and mass spectral trees similarity filter (MTSF), and further isomerism discrimination. It was successfully applied in the HRMS data-acquisition and processing of chlorogenic acids (CGAs) in Flos Lonicerae Japonicae (FLJ), and a total of 115 chromatographic peaks attributed to 18 categories were characterized, allowing a comprehensive revelation of CGAs in FLJ for the first time. This demonstrated that MIM based on molecular design could improve the efficiency of triggering MS/MS fragmentation reactions. Targeted database hits and MTSF searching greatly facilitated the processing of extremely large information data. Besides, the introduction of diagnostic product ions (DPIs) discrimination, ClogP analysis, and molecular simulation raised the efficiency and accuracy of characterizing sequential constituents, especially positional and geometric isomers. In conclusion, the results expanded our understanding of CGAs in FLJ, and the strategy could be exemplary for future research on the comprehensive identification of sequential constituents in TCMs. Meanwhile, it may offer a novel approach for analyzing sequential constituents and is promising for the quality control and evaluation of TCMs. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Learning Sequential Composition Control.

    PubMed

    Najafi, Esmaeil; Babuska, Robert; Lopes, Gabriel A D

    2016-11-01

    Sequential composition is an effective supervisory control method for addressing control problems in nonlinear dynamical systems. It executes a set of controllers sequentially to achieve a control specification that cannot be realized by a single controller. As these controllers are designed offline, sequential composition cannot address unmodeled situations that might occur during runtime. This paper proposes a learning approach to augment the standard sequential composition framework by using online learning to handle unforeseen situations. New controllers are acquired via learning and added to the existing supervisory control structure. In the proposed setting, learning experiments are restricted to take place within the domain of attraction (DOA) of the existing controllers. This guarantees that the learning process is safe (i.e., the closed loop system is always stable). In addition, the DOA of the new learned controller is approximated after each learning trial. This keeps the learning process short as learning is terminated as soon as the DOA of the learned controller is sufficiently large. The proposed approach has been implemented on two nonlinear systems: 1) a nonlinear mass-damper system and 2) an inverted pendulum. The results show that in both cases a new controller can be rapidly learned and added to the supervisory control structure.

  5. Optical design of system for a lightship

    NASA Astrophysics Data System (ADS)

    Chirkov, M. A.; Tsyganok, E. A.

    2017-06-01

    This article presents the results of the optical design of an illuminating optical system for a lightship using a freeform surface. It describes an algorithm for the optical design of a side-emitting lens for a point source using the Freeform Z function in Zemax non-sequential mode, the optimization of the calculation results, and the testing of the optical system with a real diode.

  6. Viral nerve necrosis in hatchery-produced fry of Asian seabass Lates calcarifer: sequential microscopic analysis of histopathology.

    PubMed

    Azad, I S; Shekhar, M S; Thirunavukkarasu, A R; Jithendran, K P

    2006-12-14

    We studied the natural progression of viral nerve necrosis (VNN) in larvae of Asian seabass Lates calcarifer Bloch from 0 to 40 days post-hatch (dph). The hatchlings were reared in the vicinity of a confirmed nodavirus-affected older batch. Using light and electron microscopy (EM), we made a sequential analysis of histopathological manifestations in nerve tissue and other organs. There were no changes from the day of hatching until 4 dph. Larvae at 4 dph had viral particles in the intramuscular spaces underlying the skin, but the nerve cells of the brain were normal. The first signs of necrosis of the brain cells were observed at 6 dph. EM observations revealed characteristic membrane-bound viral particles measuring 30 nm in the cytoplasm of nerve cells of the brain, spinal cord and retina. Histological samples of fry examined when group mortalities reached 20 to 35% revealed highly vacuolated brains, empty nerve cell cytoplasm and viral particles in the intercellular spaces. Viral particles occurred extensively in the intramuscular spaces and the epidermal layers. These observations were corroborated by positive immunostaining of the virus-rich intramuscular spaces. EM studies also revealed progressive necrotic changes in the cells harboring the virus. Results emphasize the need to maintain hygiene in the hatchery environment and to develop strategies for prevention of disease spread among cohabiting seabass and other susceptible fish larvae. Intramuscular localization of the nodavirus in both preclinical healthy-looking and post-clinical moribund larvae suggests that virus neutralization strategies during larval development could be effective in controlling VNN-associated mortalities.

  7. Transport of U(VI) through sediments amended with phosphate to induce in situ uranium immobilization.

    PubMed

    Mehta, Vrajesh S; Maillot, Fabien; Wang, Zheming; Catalano, Jeffrey G; Giammar, Daniel E

    2015-02-01

    Phosphate amendments can be added to U(VI)-contaminated subsurface environments to promote in situ remediation. The primary objective of this study was to evaluate the impacts of phosphate addition on the transport of U(VI) through contaminated sediments. In batch experiments using sediments (<2 mm size fraction) from a site in Rifle, Colorado, U(VI) only weakly adsorbed due to the dominance of the aqueous speciation by Ca-U(VI)-carbonate complexes. Column experiments with these sediments were performed with flow rates that correspond to a groundwater velocity of 1.1 m/day. In the absence of phosphate, the sediments took up 1.68-1.98 μg U/g of sediments when the synthetic groundwater influent contained 4 μM U(VI). When U(VI)-free influents were then introduced with and without phosphate, substantially more uranium was retained within the column when phosphate was present in the influent. Sequential extractions of sediments from the columns revealed that uranium was uniformly distributed along the length of the columns and was primarily in forms that could be extracted by ion exchange and contact with a weak acid. Laser induced fluorescence spectroscopy (LIFS) analysis along with sequential extraction results suggest adsorption as the dominant uranium uptake mechanism. The response of dissolved uranium concentrations to stopped-flow events and the comparison of experimental data with simulations from a simple reactive transport model indicated that uranium adsorption to and desorption from the sediments was not always at local equilibrium. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Sequential two-column electro-Fenton-photolytic reactor for the treatment of winery wastewater.

    PubMed

    Díez, A M; Sanromán, M A; Pazos, M

    2017-01-01

    The large volume of winery wastewater produced each year makes its treatment a priority issue due to its problematic characteristics, such as acidic pH, high organic load and coloured compounds. Furthermore, some of these effluents can contain dissolved pesticides from earlier grape treatments, which are recalcitrant to conventional treatment. Recently, the photo-electro-Fenton process has been reported as an effective procedure for mineralizing various organic contaminants and a promising technology for the treatment of these complex matrices. However, the reactors available for applying this process are scarce and show several limitations. In this study, a sequential two-column reactor for photo-electro-Fenton treatment was designed and evaluated for the treatment of two pesticides used in wine production, pirimicarb and pyrimethanil. Both studied pesticides were efficiently removed, and their transformation products were determined. Finally, the treatment of a complex aqueous matrix composed of winery wastewater and the previously studied pesticides was carried out in the designed sequential reactor. The high TOC and COD removals achieved and the low energy consumption demonstrated the efficiency of this new configuration.

  9. NASA DOE POD NDE Capabilities Data Book

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2015-01-01

    This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
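
    To illustrate the kind of test Wald's sequential analysis provides, the sketch below runs a generic sequential probability ratio test on binary hit/miss inspection outcomes. It is not the DOEPOD procedure itself, and the hypothesized POD values, error rates and data are illustrative assumptions.

```python
# Generic Wald sequential probability ratio test for binary hit/miss
# inspection data, shown only to illustrate the kind of sequential test
# the DOEPOD methodology builds on (this is not the DOEPOD procedure).
import math

def sprt(hits, p0=0.70, p1=0.90, alpha=0.05, beta=0.05):
    """Return a decision ('accept p1', 'accept p0') or 'continue testing'."""
    upper = math.log((1 - beta) / alpha)   # decide for p1 (high POD)
    lower = math.log(beta / (1 - alpha))   # decide for p0 (low POD)
    llr = 0.0
    for i, hit in enumerate(hits, start=1):
        llr += math.log(p1 / p0) if hit else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return f"accept p1 after {i} observations"
        if llr <= lower:
            return f"accept p0 after {i} observations"
    return "continue testing"

# 17 hits and 1 miss (illustrative inspection outcomes)
print(sprt([1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]))
```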

  10. Multi-arm group sequential designs with a simultaneous stopping rule.

    PubMed

    Urach, S; Posch, M

    2016-12-30

    Multi-arm group sequential clinical trials are efficient designs to compare multiple treatments to a control. They allow one to test for treatment effects already in interim analyses and can have a lower average sample number than fixed sample designs. Their operating characteristics depend on the stopping rule: We consider simultaneous stopping, where the whole trial is stopped as soon as for any of the arms the null hypothesis of no treatment effect can be rejected, and separate stopping, where only recruitment to arms for which a significant treatment effect could be demonstrated is stopped, but the other arms are continued. For both stopping rules, the family-wise error rate can be controlled by the closed testing procedure applied to group sequential tests of intersection and elementary hypotheses. The group sequential boundaries for the separate stopping rule also control the family-wise error rate if the simultaneous stopping rule is applied. However, we show that for the simultaneous stopping rule, one can apply improved, less conservative stopping boundaries for local tests of elementary hypotheses. We derive corresponding improved Pocock and O'Brien type boundaries as well as optimized boundaries to maximize the power or average sample number and investigate the operating characteristics and small sample properties of the resulting designs. To control the power to reject at least one null hypothesis, the simultaneous stopping rule requires a lower average sample number than the separate stopping rule. This comes at the cost of a lower power to reject all null hypotheses. Some of this loss in power can be regained by applying the improved stopping boundaries for the simultaneous stopping rule. The procedures are illustrated with clinical trials in systemic sclerosis and narcolepsy. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
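
    The sketch below only contrasts the shapes of Pocock-type (flat) and O'Brien-Fleming-type (decreasing) stopping boundaries on the z-scale. The constants used are placeholders; in practice they are obtained by numerical integration so that the family-wise error rate is controlled, and the improved boundaries derived in this paper are not reproduced here.

```python
# Shape of Pocock-type (flat) versus O'Brien-Fleming-type (decreasing)
# group sequential stopping boundaries on the z-scale. The constants
# c_pocock and c_obf are placeholders: exact values come from numerical
# routines that control the overall type I error, and the paper's
# improved simultaneous-stopping boundaries would differ again.
import math

def boundaries(k_analyses, c_pocock, c_obf):
    out = []
    for k in range(1, k_analyses + 1):
        t = k / k_analyses                      # information fraction
        out.append((t, c_pocock, c_obf / math.sqrt(t)))
    return out

for t, poc, obf in boundaries(3, c_pocock=2.29, c_obf=2.00):
    print(f"information {t:.2f}: Pocock z >= {poc:.2f}, OBF-type z >= {obf:.2f}")
```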

  11. High-strength fermentable wastewater reclamation through a sequential process of anaerobic fermentation followed by microalgae cultivation.

    PubMed

    Qi, Wenqiang; Chen, Taojing; Wang, Liang; Wu, Minghong; Zhao, Quanyu; Wei, Wei

    2017-03-01

    In this study, the sequential process of anaerobic fermentation followed by microalgae cultivation was evaluated from both nutrient and energy recovery standpoints. The effects of different fermentation type on the biogas generation, broth metabolites' composition, algal growth and nutrients' utilization, and energy conversion efficiencies for the whole processes were discussed. When the fermentation was designed to produce hydrogen-dominating biogas, the total energy conversion efficiency (TECE) of the sequential process was higher than that of the methane fermentation one. With the production of hydrogen in anaerobic fermentation, more organic carbon metabolites were left in the broth to support better algal growth with more efficient incorporation of ammonia nitrogen. By applying the sequential process, the heat value conversion efficiency (HVCE) for the wastewater could reach 41.2%, if methane was avoided in the fermentation biogas. The removal efficiencies of organic metabolites and NH 4 + -N in the better case were 100% and 98.3%, respectively. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Sequential Feedback Scheme Outperforms the Parallel Scheme for Hamiltonian Parameter Estimation.

    PubMed

    Yuan, Haidong

    2016-10-14

    Measurement and estimation of parameters are essential for science and engineering, where the main quest is to find the highest achievable precision with the given resources and design schemes to attain it. Two schemes, the sequential feedback scheme and the parallel scheme, are usually studied in the quantum parameter estimation. While the sequential feedback scheme represents the most general scheme, it remains unknown whether it can outperform the parallel scheme for any quantum estimation tasks. In this Letter, we show that the sequential feedback scheme has a threefold improvement over the parallel scheme for Hamiltonian parameter estimations on two-dimensional systems, and an order of O(d+1) improvement for Hamiltonian parameter estimation on d-dimensional systems. We also show that, contrary to the conventional belief, it is possible to simultaneously achieve the highest precision for estimating all three components of a magnetic field, which sets a benchmark on the local precision limit for the estimation of a magnetic field.

  13. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM-2.5

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  14. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM-2.5

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  15. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM−2.5.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  16. Quality by design approach of a pharmaceutical gel manufacturing process, part 1: determination of the design space.

    PubMed

    Rosas, Juan G; Blanco, Marcel; González, Josep M; Alcalá, Manel

    2011-10-01

    This work was conducted in the framework of a quality by design project involving the production of a pharmaceutical gel. Preliminary work included the identification of the quality target product profiles (QTPPs) from historical values for previously manufactured batches, as well as the critical quality attributes for the process (viscosity and pH), which were used to construct a D-optimal experimental design. The experimental design comprised 13 gel batches, three of which were replicates at the domain center intended to assess the reproducibility of the target process. The viscosity and pH models established exhibited very high linearity and negligible lack of fit (LOF). Thus, R² was 0.996 for viscosity and 0.975 for pH, and LOF was 0.53 for the former parameter and 0.84 for the latter. The process proved reproducible at the domain center. Water content and temperature were the most influential factors for viscosity, and water content and acid neutralized fraction were the most influential factors for pH. A desirability function was used to find the best compromise to optimize the QTPPs. The body of information was used to identify and define the design space for the process. A model capable of combining the two response variables into a single one was constructed to facilitate monitoring of the process. Copyright © 2011 Wiley-Liss, Inc.
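
    A generic Derringer-style desirability calculation of the kind referred to above, combining two response predictions into a single score. The target windows and the predicted viscosity and pH values are hypothetical placeholders, not the study's fitted models.

```python
# Generic two-response desirability calculation of the kind used to find
# the compromise optimum described above. Target windows and fitted-model
# predictions are hypothetical placeholders.
def desirability_target(y, low, target, high):
    """Derringer-type desirability: 1 at the target, 0 outside [low, high]."""
    if y <= low or y >= high:
        return 0.0
    if y <= target:
        return (y - low) / (target - low)
    return (high - y) / (high - target)

def overall(d_values):
    prod = 1.0
    for d in d_values:
        prod *= d
    return prod ** (1.0 / len(d_values))  # geometric mean of the desirabilities

visc_pred, ph_pred = 10400.0, 6.1         # model predictions (hypothetical)
d_visc = desirability_target(visc_pred, 8000.0, 10000.0, 12000.0)
d_ph = desirability_target(ph_pred, 5.5, 6.0, 6.5)
print(f"overall desirability D = {overall([d_visc, d_ph]):.2f}")
```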

  17. FT-NIR: A Tool for Process Monitoring and More.

    PubMed

    Martoccia, Domenico; Lutz, Holger; Cohen, Yvan; Jerphagnon, Thomas; Jenelten, Urban

    2018-03-30

    With ever-increasing pressure to optimize product quality, to reduce cost and to safely increase production output from existing assets, all combined with regular changes in terms of feedstock and operational targets, process monitoring with traditional instruments reaches its limits. One promising answer to these challenges is in-line, real time process analysis with spectroscopic instruments, and above all Fourier-Transform Near Infrared spectroscopy (FT-NIR). Its potential to afford decreased batch cycle times, higher yields, reduced rework and minimized batch variance is presented and application examples in the field of fine chemicals are given. We demonstrate that FT-NIR can be an efficient tool for improved process monitoring and optimization, effective process design and advanced process control.

  18. A group sequential adaptive treatment assignment design for proof of concept and dose selection in headache trials.

    PubMed

    Hall, David B; Meier, Ulrich; Diener, Hans-Cristoph

    2005-06-01

    The trial objective was to test whether a new mechanism of action would effectively treat migraine headaches and to select a dose range for further investigation. The motivation for a group sequential, adaptive, placebo-controlled trial design was (1) limited information about where across the range of seven doses to focus attention, (2) a need to limit sample size for a complicated inpatient treatment and (3) a desire to reduce exposure of patients to ineffective treatment. A design based on group sequential and up and down designs was developed and operational characteristics were explored by trial simulation. The primary outcome was headache response at 2 h after treatment. Groups of four treated and two placebo patients were assigned to one dose. Adaptive dose selection was based on response rates of 60% seen with other migraine treatments. If more than 60% of treated patients responded, then the next dose was the next lower dose; otherwise, the dose was increased. A stopping rule of at least five groups at the target dose and at least four groups at that dose with more than 60% response was developed to ensure that a selected dose would be statistically significantly (p=0.05) superior to placebo. Simulations indicated good characteristics in terms of control of type 1 error, sufficient power, modest expected sample size and modest bias in estimation. The trial design is attractive for phase 2 clinical trials when response is acute and simple, ideally binary, placebo comparator is required, and patient accrual is relatively slow allowing for the collection and processing of results as a basis for the adaptive assignment of patients to dose groups. The acute migraine trial based on this design was successful in both proof of concept and dose range selection.
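
    A simplified simulation of the assignment and stopping rule described above: after each group, the dose moves down if more than 60% of the four treated patients respond and up otherwise, and the trial stops once a dose has accumulated at least five groups, at least four of which exceeded 60% response. Placebo patients are ignored here and the dose-response probabilities are hypothetical.

```python
# Simplified simulation of the adaptive assignment rule described above.
# Placebo patients in each group are ignored and the per-dose response
# probabilities are hypothetical.
import random

def run_trial(p_response, start=3, seed=1):
    random.seed(seed)
    groups_at = [0] * len(p_response)   # groups assigned per dose
    wins_at = [0] * len(p_response)     # groups with >60% response
    dose = start
    while True:
        responders = sum(random.random() < p_response[dose] for _ in range(4))
        step_down = responders >= 3     # more than 60% of 4 treated patients
        groups_at[dose] += 1
        wins_at[dose] += step_down
        if groups_at[dose] >= 5 and wins_at[dose] >= 4:
            return dose, groups_at      # stopping rule met at this dose
        dose = max(dose - 1, 0) if step_down else min(dose + 1, len(p_response) - 1)

p = [0.10, 0.20, 0.35, 0.50, 0.65, 0.75, 0.85]   # hypothetical dose-response
selected, allocation = run_trial(p)
print("selected dose index:", selected, "groups per dose:", allocation)
```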

  19. TRANSIENT SUPPRESSION PACKAGING FOR REDUCED EMISSIONS FROM ROTARY KILN INCINERATORS

    EPA Science Inventory

    Experiments were performed on a 73 kW rotary kiln incinerator simulator to determine whether innovative waste packaging designs might reduce transient emissions of products of incomplete combustion due to batch charging of containerized liquid surrogate waste compounds bound on g...

  20. Nonlinear interferometry approach to photonic sequential logic

    NASA Astrophysics Data System (ADS)

    Mabuchi, Hideo

    2011-10-01

    Motivated by rapidly advancing capabilities for extensive nanoscale patterning of optical materials, I propose an approach to implementing photonic sequential logic that exploits circuit-scale phase coherence for efficient realizations of fundamental components such as a NAND-gate-with-fanout and a bistable latch. Kerr-nonlinear optical resonators are utilized in combination with interference effects to drive the binary logic. Quantum-optical input-output models are characterized numerically using design parameters that yield attojoule-scale energy separation between the latch states.

  1. Using a Mixed Methods Sequential Design to Identify Factors Associated with African American Mothers' Intention to Vaccinate Their Daughters Aged 9 to 12 for HPV with a Purpose of Informing a Culturally-Relevant, Theory-Based Intervention

    ERIC Educational Resources Information Center

    Cunningham, Jennifer L.

    2013-01-01

    The purpose of this sequential, explanatory mixed methods research study was to understand what factors influenced African American maternal intentions to get their daughters aged 9 years to 12 years vaccinated in Alabama. In the first, quantitative phase of the study, the research questions focused on identifying the predictive power of eleven…

  2. Real-time monitoring of a coffee roasting process with near infrared spectroscopy using multivariate statistical analysis: A feasibility study.

    PubMed

    Catelani, Tiago A; Santos, João Rodrigo; Páscoa, Ricardo N M J; Pezza, Leonardo; Pezza, Helena R; Lopes, João A

    2018-03-01

    This work proposes the use of near infrared (NIR) spectroscopy in diffuse reflectance mode and multivariate statistical process control (MSPC) based on principal component analysis (PCA) for real-time monitoring of the coffee roasting process. The main objective was the development of an MSPC methodology able to detect disturbances to the roasting process early, using real-time acquisition of NIR spectra. A total of fifteen roasting batches were defined according to an experimental design to develop the MSPC models. This methodology was tested on a set of five batches in which disturbances of different natures were imposed to simulate real faulty situations. Some of these batches were used to optimize the model while the remaining ones were used to test the methodology. A modelling strategy based on a sliding time window provided the best results in terms of distinguishing batches with and without disturbances, using typical MSPC charts: Hotelling's T² and squared prediction error statistics. A PCA model encompassing a time window of four minutes with three principal components was able to efficiently detect all disturbances assayed. NIR spectroscopy combined with the MSPC approach proved to be an adequate auxiliary tool for coffee roasters to detect faults in a conventional roasting process in real time. Copyright © 2017 Elsevier B.V. All rights reserved.
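
    A minimal sketch of the two MSPC statistics named above (Hotelling's T² and the squared prediction error) computed from a PCA model of in-control data. The spectra here are random placeholders standing in for the NIR training batches, and the study's sliding-window logic is not reproduced.

```python
# Minimal sketch of the two MSPC statistics named above, computed from a
# PCA model of in-control data. The "spectra" are random placeholders; a
# real model would be built from the NIR training batches.
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.normal(size=(150, 200))            # in-control spectra (placeholder)
mean, std = X_train.mean(axis=0), X_train.std(axis=0)
Xc = (X_train - mean) / std

n_pc = 3
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:n_pc].T                                  # loadings
lam = (S[:n_pc] ** 2) / (Xc.shape[0] - 1)        # score variances

def mspc_statistics(x_new):
    xc = (x_new - mean) / std
    t = xc @ P                                   # scores of the new spectrum
    t2 = float(np.sum(t ** 2 / lam))             # Hotelling's T^2
    residual = xc - t @ P.T
    spe = float(residual @ residual)             # squared prediction error (Q)
    return t2, spe

t2, spe = mspc_statistics(rng.normal(size=200))
print(f"T2 = {t2:.1f}, SPE = {spe:.1f}")
```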

  3. Preparing the Teacher of Tomorrow

    ERIC Educational Resources Information Center

    Hemp, Paul E.

    1976-01-01

    Suggested ways of planning and conducting high quality teacher preparation programs are discussed under major headings of student selection, sequential courses and experiences, and program design. (HD)

  4. Assisted annotation of medical free text using RapTAT

    PubMed Central

    Gobbel, Glenn T; Garvin, Jennifer; Reeves, Ruth; Cronin, Robert M; Heavirland, Julia; Williams, Jenifer; Weaver, Allison; Jayaramaraja, Shrimalini; Giuse, Dario; Speroff, Theodore; Brown, Steven H; Xu, Hua; Matheny, Michael E

    2014-01-01

    Objective To determine whether assisted annotation using interactive training can reduce the time required to annotate a clinical document corpus without introducing bias. Materials and methods A tool, RapTAT, was designed to assist annotation by iteratively pre-annotating probable phrases of interest within a document, presenting the annotations to a reviewer for correction, and then using the corrected annotations for further machine learning-based training before pre-annotating subsequent documents. Annotators reviewed 404 clinical notes either manually or using RapTAT assistance for concepts related to quality of care during heart failure treatment. Notes were divided into 20 batches of 19–21 documents for iterative annotation and training. Results The number of correct RapTAT pre-annotations increased significantly and annotation time per batch decreased by ∼50% over the course of annotation. Annotation rate increased from batch to batch for assisted but not manual reviewers. Pre-annotation F-measure increased from 0.5 to 0.6 to >0.80 (relative to both assisted reviewer and reference annotations) over the first three batches and more slowly thereafter. Overall inter-annotator agreement was significantly higher between RapTAT-assisted reviewers (0.89) than between manual reviewers (0.85). Discussion The tool reduced workload by decreasing the number of annotations needing to be added and helping reviewers to annotate at an increased rate. Agreement between the pre-annotations and reference standard, and agreement between the pre-annotations and assisted annotations, were similar throughout the annotation process, which suggests that pre-annotation did not introduce bias. Conclusions Pre-annotations generated by a tool capable of interactive training can reduce the time required to create an annotated document corpus by up to 50%. PMID:24431336
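
    The evaluation metrics reported above reduce to standard precision, recall and F-measure; the sketch below computes them from hypothetical phrase-level counts.

```python
# Minimal sketch of the evaluation metrics reported above (precision,
# recall, F-measure of pre-annotations against a reference standard),
# using hypothetical phrase-level counts.
def precision_recall_f(tp, fp, fn):
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f

tp, fp, fn = 410, 60, 90   # hypothetical counts for one annotation batch
p, r, f = precision_recall_f(tp, fp, fn)
print(f"precision={p:.2f} recall={r:.2f} F-measure={f:.2f}")
```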

  5. Cost Optimal Design of a Power Inductor by Sequential Gradient Search

    NASA Astrophysics Data System (ADS)

    Basak, Raju; Das, Arabinda; Sanyal, Amarnath

    2018-05-01

    Power inductors are used for compensating the VAR generated by long EHV transmission lines and in electronic circuits. For EHV lines, the rating of the inductor is decided by techno-economic considerations on the basis of the line susceptance. It is a high-voltage, high-current device, absorbing little active power and large reactive power. The cost is quite high; hence, the design should be made cost-optimally. The 3-phase power inductor is similar in construction to a 3-phase core-type transformer, with the exception that it has only one winding per phase and each limb is provided with an air gap, the length of which is decided by the required inductance. In this paper, a design methodology based on the sequential gradient search technique, and the corresponding algorithm, leading to the cost-optimal design of a 3-phase EHV power inductor is presented. The case study has been made on a long 220 kV line of NHPC running from Chukha HPS to Birpara of Coochbihar.

  6. Sequential experimental design based generalised ANOVA

    NASA Astrophysics Data System (ADS)

    Chakraborty, Souvik; Chowdhury, Rajib

    2016-07-01

    Over the last decade, surrogate modelling technique has gained wide popularity in the field of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and regression/interpolation for generating the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting probability of failure of three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimate of the failure probability.

  7. Sequential experimental design based generalised ANOVA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, Souvik, E-mail: csouvik41@gmail.com; Chowdhury, Rajib, E-mail: rajibfce@iitr.ac.in

    Over the last decade, surrogate modelling technique has gained wide popularity in the field of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and regression/interpolation for generating the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting probability of failure of three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimate of the failure probability.

  8. Modeling methods for merging computational and experimental aerodynamic pressure data

    NASA Astrophysics Data System (ADS)

    Haderlie, Jacob C.

    This research describes a process to model surface pressure data sets as a function of wing geometry from computational and wind tunnel sources and then merge them into a single predicted value. The described merging process will enable engineers to integrate these data sets with the goal of utilizing the advantages of each data source while overcoming the limitations of both; this provides a single, combined data set to support analysis and design. The main challenge with this process is accurately representing each data source everywhere on the wing. Additionally, this effort demonstrates methods to model wind tunnel pressure data as a function of angle of attack as an initial step towards a merging process that uses both location on the wing and flow conditions (e.g., angle of attack, flow velocity or Reynold's number) as independent variables. This surrogate model of pressure as a function of angle of attack can be useful for engineers that need to predict the location of zero-order discontinuities, e.g., flow separation or normal shocks. Because, to the author's best knowledge, there is no published, well-established merging method for aerodynamic pressure data (here, the coefficient of pressure Cp), this work identifies promising modeling and merging methods, and then makes a critical comparison of these methods. Surrogate models represent the pressure data for both data sets. Cubic B-spline surrogate models represent the computational simulation results. Machine learning and multi-fidelity surrogate models represent the experimental data. This research compares three surrogates for the experimental data (sequential--a.k.a. online--Gaussian processes, batch Gaussian processes, and multi-fidelity additive corrector) on the merits of accuracy and computational cost. The Gaussian process (GP) methods employ cubic B-spline CFD surrogates as a model basis function to build a surrogate model of the WT data, and this usage of the CFD surrogate in building the WT data could serve as a "merging" because the resulting WT pressure prediction uses information from both sources. In the GP approach, this model basis function concept seems to place more "weight" on the Cp values from the wind tunnel (WT) because the GP surrogate uses the CFD to approximate the WT data values. Conversely, the computationally inexpensive additive corrector method uses the CFD B-spline surrogate to define the shape of the spanwise distribution of the Cp while minimizing prediction error at all spanwise locations for a given arc length position; this, too, combines information from both sources to make a prediction of the 2-D WT-based Cp distribution, but the additive corrector approach gives more weight to the CFD prediction than to the WT data. Three surrogate models of the experimental data as a function of angle of attack are also compared for accuracy and computational cost. These surrogates are a single Gaussian process model (a single "expert"), product of experts, and generalized product of experts. The merging approach provides a single pressure distribution that combines experimental and computational data. The batch Gaussian process method provides a relatively accurate surrogate that is computationally acceptable, and can receive wind tunnel data from port locations that are not necessarily parallel to a variable direction. 
On the other hand, the sequential Gaussian process and additive corrector methods must receive a sufficient number of data points aligned with one direction, e.g., from pressure port bands (tap rows) aligned with the freestream. The generalized product of experts best represents wind tunnel pressure as a function of angle of attack, but at higher computational cost than the single expert approach. The format of the application data from computational and experimental sources in this work precluded the merging process from including flow condition variables (e.g., angle of attack) in the independent variables, so the merging process is only conducted in the wing geometry variables of arc length and span. The merging process of Cp data allows a more "hands-off" approach to aircraft design and analysis (i.e., fewer engineers are needed to debate the Cp distribution shape) and generates Cp predictions at any location on the wing. However, the costs that come with these benefits are engineer time (learning how to build surrogates), computational time in constructing the surrogates, and surrogate accuracy (surrogates introduce error into data predictions). This dissertation effort used the Trap Wing / First AIAA CFD High-Lift Prediction Workshop as a relevant transonic wing with a multi-element high-lift system, and this work identified that the batch GP model for the WT data and the B-spline surrogate for the CFD might best be combined using expert belief weights to describe Cp as a function of location on the wing element surface. (Abstract shortened by ProQuest.).
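
    A simplified sketch of the merging idea under stated assumptions: a cubic spline stands in for the CFD surrogate, and a Gaussian process models the offset between sparse wind tunnel taps and that spline, so the merged Cp prediction draws on both sources. The data are synthetic placeholders rather than Trap Wing results, and this is not the dissertation's expert-belief weighting scheme.

```python
# Simplified merging sketch: a cubic spline stands in for the CFD surrogate
# and a Gaussian process models the offset of sparse wind tunnel (WT) taps
# from that spline. Data are synthetic placeholders, and the weighting is
# not the dissertation's expert-belief scheme.
import numpy as np
from scipy.interpolate import CubicSpline
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Dense "CFD" Cp distribution along normalized arc length (synthetic).
s_cfd = np.linspace(0.0, 1.0, 60)
cp_cfd = -2.0 * np.exp(-15.0 * (s_cfd - 0.25) ** 2) + 0.5 * s_cfd
cfd_surrogate = CubicSpline(s_cfd, cp_cfd)

# Sparse "WT" taps with a smooth offset from the CFD plus noise (synthetic).
s_wt = np.array([0.05, 0.2, 0.3, 0.45, 0.6, 0.8, 0.95])
cp_wt = (cfd_surrogate(s_wt) + 0.15 * np.sin(3.0 * s_wt)
         + 0.02 * np.random.default_rng(1).normal(size=s_wt.size))

# GP on the WT-minus-CFD residual; merged prediction = CFD + GP correction.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2) + WhiteKernel(1e-4),
                              normalize_y=True)
gp.fit(s_wt.reshape(-1, 1), cp_wt - cfd_surrogate(s_wt))

s_query = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
cp_merged = cfd_surrogate(s_query.ravel()) + gp.predict(s_query)
print(np.round(cp_merged, 3))
```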

  9. Spacecraft Data Simulator for the test of level zero processing systems

    NASA Technical Reports Server (NTRS)

    Shi, Jeff; Gordon, Julie; Mirchandani, Chandru; Nguyen, Diem

    1994-01-01

    The Microelectronic Systems Branch (MSB) at Goddard Space Flight Center (GSFC) has developed a Spacecraft Data Simulator (SDS) to support the development, test, and verification of prototype and production Level Zero Processing (LZP) systems. Based on a disk array system, the SDS is capable of generating large test data sets up to 5 Gigabytes and outputting serial test data at rates up to 80 Mbps. The SDS supports data formats including NASA Communication (Nascom) blocks, Consultative Committee for Space Data System (CCSDS) Version 1 & 2 frames and packets, and all the Advanced Orbiting Systems (AOS) services. The capability to simulate both sequential and non-sequential time-ordered downlink data streams with errors and gaps is crucial to test LZP systems. This paper describes the system architecture, hardware and software designs, and test data designs. Examples of test data designs are included to illustrate the application of the SDS.

  10. Collaborative, Sequential and Isolated Decisions in Design

    NASA Technical Reports Server (NTRS)

    Lewis, Kemper; Mistree, Farrokh

    1997-01-01

    The Massachusetts Institute of Technology (MIT) Commission on Industrial Productivity, in their report Made in America, found that six recurring weaknesses were hampering American manufacturing industries. The two weaknesses most relevant to product development were 1) technological weakness in development and production, and 2) failures in cooperation. The remedies to these weaknesses are considered the essential twin pillars of CE: 1) improved development process, and 2) closer cooperation. In the MIT report, it is recognized that total cooperation among teams in a CE environment is rare in American industry, while the majority of the design research in mathematically modeling CE has assumed total cooperation. In this paper, we present mathematical constructs, based on game theoretic principles, to model degrees of collaboration characterized by approximate cooperation, sequential decision making and isolation. The design of a pressure vessel and a passenger aircraft are included as illustrative examples.

  11. Cluster designs to assess the prevalence of acute malnutrition by lot quality assurance sampling: a validation study by computer simulation.

    PubMed

    Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J

    2009-04-01

    Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67x3 (67 clusters of three observations) and a 33x6 (33 clusters of six observations) sampling scheme to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67x3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size that is required for data collection. The presence of intercluster correlation can impact dramatically the classification error that is associated with LQAS analysis.
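
    A small simulation in the spirit of the study above: beta-binomial clusters induce intracluster correlation, and an LQAS-style rule classifies an area as high prevalence once the case count reaches a threshold. The prevalences, correlation and decision threshold are illustrative assumptions, not the study's operating parameters.

```python
# Simulation sketch: beta-binomial clusters give intracluster correlation,
# and an LQAS-style rule classifies an area as "high prevalence" when the
# case count reaches a threshold. All parameters are illustrative.
import numpy as np

def classify(prevalence, n_clusters=67, cluster_size=3, rho=0.1, threshold=30, rng=None):
    """Return True if the sampled area is classified as high prevalence."""
    rng = rng if rng is not None else np.random.default_rng()
    # Beta-binomial clusters: the intracluster correlation equals rho.
    a = prevalence * (1.0 - rho) / rho
    b = (1.0 - prevalence) * (1.0 - rho) / rho
    p_cluster = rng.beta(a, b, size=n_clusters)
    cases = rng.binomial(cluster_size, p_cluster).sum()
    return cases >= threshold

rng = np.random.default_rng(7)
runs = 2000
false_high = np.mean([classify(0.10, rng=rng) for _ in range(runs)])       # acceptable area
missed_high = 1 - np.mean([classify(0.20, rng=rng) for _ in range(runs)])  # critical area
print(f"false 'high' rate: {false_high:.3f}, missed 'high' rate: {missed_high:.3f}")
```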

  12. ADS: A FORTRAN program for automated design synthesis: Version 1.10

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1985-01-01

    A new general-purpose optimization program for engineering design is described. ADS (Automated Design Synthesis - Version 1.10) is a FORTRAN program for solution of nonlinear constrained optimization problems. The program is segmented into three levels: strategy, optimizer, and one-dimensional search. At each level, several options are available so that a total of over 100 possible combinations can be created. Examples of available strategies are sequential unconstrained minimization, the Augmented Lagrange Multiplier method, and Sequential Linear Programming. Available optimizers include variable metric methods and the Method of Feasible Directions as examples, and one-dimensional search options include polynomial interpolation and the Golden Section method as examples. Emphasis is placed on ease of use of the program. All information is transferred via a single parameter list. Default values are provided for all internal program parameters such as convergence criteria, and the user is given a simple means to over-ride these, if desired.

  13. Medication Waste Reduction in Pediatric Pharmacy Batch Processes

    PubMed Central

    Veltri, Michael A.; Hamrock, Eric; Mollenkopf, Nicole L.; Holt, Kristen; Levin, Scott

    2014-01-01

    OBJECTIVES: To inform pediatric cart-fill batch scheduling for reductions in pharmaceutical waste using a case study and simulation analysis. METHODS: A pre and post intervention and simulation analysis was conducted during 3 months at a 205-bed children's center. An algorithm was developed to detect wasted medication based on time-stamped computerized provider order entry information. The algorithm was used to quantify pharmaceutical waste and associated costs for both preintervention (1 batch per day) and postintervention (3 batches per day) schedules. Further, simulation was used to systematically test 108 batch schedules outlining general characteristics that have an impact on the likelihood for waste. RESULTS: Switching from a 1-batch-per-day to a 3-batch-per-day schedule resulted in a 31.3% decrease in pharmaceutical waste (28.7% to 19.7%) and annual cost savings of $183,380. Simulation results demonstrate how increasing batch frequency facilitates a more just-in-time process that reduces waste. The most substantial gains are realized by shifting from a schedule of 1 batch per day to at least 2 batches per day. The simulation exhibits how waste reduction is also achievable by avoiding batch preparation during daily time periods where medication administration or medication discontinuations are frequent. Last, the simulation was used to show how reducing batch preparation time per batch provides some, albeit minimal, opportunity to decrease waste. CONCLUSIONS: The case study and simulation analysis demonstrate characteristics of batch scheduling that may support pediatric pharmacy managers in redesign toward minimizing pharmaceutical waste. PMID:25024671

  14. Medication waste reduction in pediatric pharmacy batch processes.

    PubMed

    Toerper, Matthew F; Veltri, Michael A; Hamrock, Eric; Mollenkopf, Nicole L; Holt, Kristen; Levin, Scott

    2014-04-01

    To inform pediatric cart-fill batch scheduling for reductions in pharmaceutical waste using a case study and simulation analysis. A pre and post intervention and simulation analysis was conducted during 3 months at a 205-bed children's center. An algorithm was developed to detect wasted medication based on time-stamped computerized provider order entry information. The algorithm was used to quantify pharmaceutical waste and associated costs for both preintervention (1 batch per day) and postintervention (3 batches per day) schedules. Further, simulation was used to systematically test 108 batch schedules outlining general characteristics that have an impact on the likelihood for waste. Switching from a 1-batch-per-day to a 3-batch-per-day schedule resulted in a 31.3% decrease in pharmaceutical waste (28.7% to 19.7%) and annual cost savings of $183,380. Simulation results demonstrate how increasing batch frequency facilitates a more just-in-time process that reduces waste. The most substantial gains are realized by shifting from a schedule of 1 batch per day to at least 2 batches per day. The simulation exhibits how waste reduction is also achievable by avoiding batch preparation during daily time periods where medication administration or medication discontinuations are frequent. Last, the simulation was used to show how reducing batch preparation time per batch provides some, albeit minimal, opportunity to decrease waste. The case study and simulation analysis demonstrate characteristics of batch scheduling that may support pediatric pharmacy managers in redesign toward minimizing pharmaceutical waste.
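
    An illustrative reconstruction of the waste-detection idea described in both records above: a dose counts as waste when it was prepared in a cart-fill batch and its order was then discontinued before the scheduled administration. The batch times and orders are hypothetical; the published algorithm works on time-stamped computerized provider order entry data.

```python
# Illustrative reconstruction of the waste-detection idea: a dose is waste
# when it was prepared in a cart-fill batch but its order was discontinued
# before the scheduled administration. Times and orders are hypothetical.
from datetime import datetime as dt

batch_prep_times = [dt(2024, 1, 1, 6, 0), dt(2024, 1, 1, 14, 0), dt(2024, 1, 1, 22, 0)]

orders = [  # (scheduled administration, order discontinued at or None)
    (dt(2024, 1, 1, 9, 0), dt(2024, 1, 1, 8, 0)),    # stopped after prep -> waste
    (dt(2024, 1, 1, 16, 0), dt(2024, 1, 1, 13, 0)),  # stopped before prep -> no waste
    (dt(2024, 1, 1, 23, 0), None),                   # given as ordered   -> no waste
]

def last_prep_before(admin_time):
    preps = [p for p in batch_prep_times if p <= admin_time]
    return max(preps) if preps else None

wasted = 0
for admin, stopped in orders:
    prep = last_prep_before(admin)
    if prep is not None and stopped is not None and prep < stopped < admin:
        wasted += 1   # prepared, then discontinued before administration
print("wasted doses:", wasted)
```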

  15. Students' Preferences and Opinions on Design of a Mobile Marketing Education Application

    ERIC Educational Resources Information Center

    Ozata, Zeynep; Ozdama Keskin, Nilgun

    2014-01-01

    The purpose of this study was to define and better understand business school students' opinions and preferences on the design of a mobile marketing education application. To accomplish this purpose an explanatory mixed methods study design was used and the data was collected sequentially. First, a questionnaire was conducted with 168 business…

  16. Pre-Modeling Ensures Accurate Solid Models

    ERIC Educational Resources Information Center

    Gow, George

    2010-01-01

    Successful solid modeling requires a well-organized design tree. The design tree is a list of all the object's features and the sequential order in which they are modeled. The solid-modeling process is faster and less prone to modeling errors when the design tree is a simple and geometrically logical definition of the modeled object. Few high…

  17. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.

    2007-01-01

    A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…

  18. The renaissance of continuous culture in the post-genomics age.

    PubMed

    Bull, Alan T

    2010-10-01

    The development of continuous culture techniques 60 years ago, and the subsequent formulation of theory and diversification of experimental systems, revolutionised microbiology and heralded a unique period of innovative research. Then, progressively, molecular biology and thence genomics and related high-information-density omics technologies took centre stage, and microbial growth physiology in general faded from educational programmes and research funding priorities alike. However, there has been a gathering appreciation over the past decade that if the claims of systems biology are to be realised, they will have to be based on rigorously controlled and reproducible microbial and cell growth platforms. This revival of continuous culture will be long lasting because its recognition as the growth system of choice is firmly established. The purpose of this review, therefore, is to remind microbiologists, particularly those new to continuous culture approaches, of the legacy of what I call the first age of continuous culture, and to explore a selection of studies that are using these techniques in the post-genomics age. The review looks at the impact of continuous culture across a comprehensive range of microbiological research and development. A frequently stated advantage of continuous culture is the ability to establish (quasi-)steady-state conditions, allowing environmental parameters to be manipulated without causing concomitant changes in the specific growth rate. However, continuous cultures also enable the critical study of specified transition states and of chemical, physical or biological perturbations. Such dynamic analyses enhance our understanding of, for example, microbial ecology and microbial pathology, and offer a wider scope for innovative drug discovery; they can also inform the optimization of batch and fed-batch operations, which are characterized by sequential transition states.
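
    The steady-state property noted above (manipulating environmental parameters without changing the specific growth rate) rests on the textbook chemostat result that, at steady state, the specific growth rate equals the operator-set dilution rate D. The sketch below works this out for an ideal single-substrate chemostat with Monod kinetics; it is illustrative background, not material from the review, and the parameter values are arbitrary.

```python
def chemostat_steady_state(D, mu_max, Ks, S_in, Y):
    """Steady state of an ideal single-substrate chemostat with Monod kinetics.

    At steady state the specific growth rate equals the dilution rate (mu = D),
    so the residual substrate is S* = Ks * D / (mu_max - D) and the biomass
    concentration follows from the yield coefficient: X* = Y * (S_in - S*).
    """
    if D >= mu_max:
        raise ValueError("D >= mu_max: washout, no non-trivial steady state")
    S_star = Ks * D / (mu_max - D)
    if S_star >= S_in:
        raise ValueError("residual substrate would exceed the feed: washout")
    X_star = Y * (S_in - S_star)
    return S_star, X_star

# Arbitrary illustrative parameters: mu_max = 0.5 1/h, Ks = 0.1 g/L,
# feed substrate 10 g/L, yield 0.4 g biomass per g substrate.
print(chemostat_steady_state(D=0.2, mu_max=0.5, Ks=0.1, S_in=10.0, Y=0.4))
```

    Setting D therefore sets the growth rate directly, which is why temperature, pH or medium composition can be varied at a fixed D without the confounding growth-rate shifts that occur in batch culture.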

  19. An anaerobic-aerobic sequential batch process with simultaneous methanogenesis and short-cut denitrification for the treatment of marine biofoulings.

    PubMed

    Akizuki, S; Toda, T

    2018-04-01

    Although the combination of denitritation and methanogenesis for wastewater treatment has been widely investigated, its application to solid waste treatment has rarely been studied. This study investigated an anaerobic-aerobic batch system with simultaneous denitritation and methanogenesis as an effective treatment for marine biofoulings, a major source of intermittently discharged organic solid waste. Sludge pre-exposed to NO₂⁻ was inoculated to achieve a stable methanogenesis process without NO₂⁻ inhibition. Both a high NH₄⁺-N removal of 99.5% and a high NO₂⁻-N accumulation of 96.4% were achieved on average during the nitritation step. A sufficient CH₄ recovery of 101 L-CH₄ kg-COD⁻¹ was achieved, indicating that the use of NO₂⁻-exposed sludge is effective in avoiding NO₂⁻ inhibition of methanogenesis. Methanogenesis was the main COD utilization pathway when substrate solubilization occurred actively, whereas denitritation was the main pathway when solubilization was limited by substrate shortage. The results showed a high COD removal efficiency of 96.0% and a relatively low nitrogen removal efficiency of 64.4%. Fitting equations were developed to optimize the effluent exchange ratio. The estimated results showed that increasing the effluent exchange ratio during the active solubilization period increased the nitrogen removal efficiency but decreased the CH₄ content of the biogas. An appropriate effluent exchange ratio, giving a high anaerobic effluent quality below approximately 120 mg-N L⁻¹ as well as a CH₄ gas quality sufficient for use as fuel in a gas engine generator, was achieved by daily effluent exchange of 80% during the first week and 5% during the subsequent 8 days. Copyright © 2017 Elsevier Ltd. All rights reserved.
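
    The performance figures in this record (removal efficiencies and CH₄ yield per unit COD) are simple mass-balance ratios. The sketch below shows only that bookkeeping; the function names and example numbers are hypothetical, and the authors' fitting equations for the effluent exchange ratio are not reproduced.

```python
def removal_efficiency(c_in: float, c_out: float) -> float:
    """Percent removal of a constituent (e.g. COD, NH4+-N, total nitrogen)."""
    return 100.0 * (c_in - c_out) / c_in

def methane_yield(ch4_volume_l: float, cod_kg: float) -> float:
    """CH4 yield in L-CH4 per kg COD; whether the denominator is COD fed or
    COD removed is a reporting convention (COD removed is assumed here)."""
    return ch4_volume_l / cod_kg

# Hypothetical numbers chosen only to match the reported orders of magnitude:
print(methane_yield(50.0, 0.5))          # ~100 L-CH4 per kg COD
print(removal_efficiency(250.0, 10.0))   # 96.0 % removal
```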

  20. 21 CFR 211.130 - Packaging and labeling operations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... (CONTINUED) DRUGS: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR FINISHED PHARMACEUTICALS Packaging and Labeling Control § 211.130 Packaging and labeling operations. There shall be written procedures designed to... manufacture and control of the batch. (d) Examination of packaging and labeling materials for suitability and...
