Science.gov

Sample records for scaling-up technique applied

  1. An efficient permeability scaling-up technique applied to the discretized flow equations

    SciTech Connect

    Urgelli, D.; Ding, Yu

    1997-08-01

Grid-block permeability scaling-up for numerical reservoir simulation has long been discussed in the literature. It is now recognized that a full permeability tensor is needed to obtain an accurate reservoir description at large scale. However, two major difficulties are encountered: (1) grid-block permeability cannot be properly defined because it depends on boundary conditions; (2) discretization of the flow equations with a full permeability tensor is not straightforward, and little work has been done on this subject. In this paper we propose a new method that gets around both difficulties. Because the two problems are closely related, a global approach preserves accuracy: in the proposed method, the permeability up-scaling technique is integrated into the discretized numerical scheme for flow simulation. The permeability is scaled up via the transmissibility term, in accordance with the fluid-flow calculation in the numerical scheme. A finite-volume scheme is studied in particular, and the transmissibility scaling-up technique for this scheme is presented. Several numerical flow-simulation examples are tested. The new method is compared with published numerical schemes for full-permeability-tensor discretization in which the full tensor is scaled up through various techniques. Comparison with fine-grid simulations shows that the new method is more accurate and more efficient.
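
    The abstract leaves the numerical details to the paper; as a hedged illustration of the general idea (a coarse transmissibility assembled directly from fine-grid values rather than from an upscaled tensor), here is a minimal series-parallel sketch for one coarse-block face. The grid dimensions and the harmonic/arithmetic combination rules are simplifying assumptions, not the authors' scheme.

```python
import numpy as np

def fine_transmissibility(k1, k2, dx, dy):
    # Two-point flux transmissibility between adjacent fine cells:
    # harmonic average of the two cell permeabilities over the shared face.
    return dy * 2.0 * k1 * k2 / (dx * (k1 + k2))

def coarse_transmissibility(kfine, dx, dy):
    # kfine: (nrows, ncols) fine-grid permeabilities spanning two coarse
    # blocks in the x-direction. Fine-cell transmissibilities along a row
    # act in series (combine harmonically); rows act in parallel (add).
    nrows, ncols = kfine.shape
    t = 0.0
    for row in kfine:
        tf = np.array([fine_transmissibility(row[j], row[j + 1], dx, dy)
                       for j in range(ncols - 1)])
        t += 1.0 / np.sum(1.0 / tf)   # series combination of one row
    return t

k = np.full((4, 4), 100.0)            # homogeneous 100 mD sanity check
T = coarse_transmissibility(k, dx=1.0, dy=1.0)
```

    For a homogeneous field the result reduces to the expected parallel sum of identical row transmissibilities, which is a quick way to sanity-check any upscaling rule.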

  2. Imaging techniques applied to the study of fluids in porous media. Scaling up in Class 1 reservoir type rock

    SciTech Connect

    Tomutsa, L.; Brinkmeyer, A.; Doughty, D.

    1993-04-01

A synergistic rock-characterization methodology has been developed. It derives reservoir-engineering parameters from X-ray computed tomography (CT) scanning, computer-assisted petrographic image analysis, minipermeameter measurements, and nuclear magnetic resonance imaging (NMRI). This methodology is used to investigate the effect of small-scale rock heterogeneity on oil distribution and recovery. It is also used to investigate the applicability of imaging technologies to the development of scale-up procedures from core plug to whole core, by comparing the results of detailed simulations with the images of the fluid distributions observed by CT scanning. Using the detailed rock and fluid data generated by the imaging technology described, one can verify various scaling-up techniques directly in the laboratory. As an example, realizations of rock properties statistically and spatially compatible with the observed values were generated by one of the various stochastic methods available (turning bands) and used as simulator input. The simulation results were compared with both the simulation results using the true rock properties and the fluid distributions observed by CT. Conclusions regarding the effect of the various permeability models on waterflood oil recovery were formulated.
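
    The turning-bands generator itself is fairly involved; as an illustrative stand-in for producing realizations "statistically and spatially compatible with the observed values," the sketch below builds a correlated Gaussian field by spectral filtering of white noise and rescales it to a target mean and standard deviation. The field size, correlation length, and porosity statistics are invented for illustration.

```python
import numpy as np

def correlated_field(shape, corr_len, mean, std, seed=0):
    # Smooth white noise with a Gaussian kernel in Fourier space, then
    # rescale so the realization reproduces the target mean and std.
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(shape)
    ky = np.fft.fftfreq(shape[0])[:, None]
    kx = np.fft.fftfreq(shape[1])[None, :]
    filt = np.exp(-2.0 * (np.pi * corr_len) ** 2 * (kx ** 2 + ky ** 2))
    field = np.real(np.fft.ifft2(np.fft.fft2(noise) * filt))
    field = (field - field.mean()) / field.std()   # standardize
    return mean + std * field

# e.g. a 64x64 porosity realization with mean 0.20 and std 0.03
phi = correlated_field((64, 64), corr_len=5.0, mean=0.20, std=0.03)
```

    Feeding such realizations to a flow simulator and comparing against the "true" measured field is exactly the kind of laboratory verification loop the abstract describes.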

  3. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

Tri-isotropic (TRISO) fuel particle coating is critical for the future use of nuclear energy produced by advanced gas reactors (AGRs). The fuel kernels are coated using chemical vapor deposition in a spouted fluidized bed. The challenges encountered in operating TRISO fuel coaters stem from the fact that in modern AGRs, such as High Temperature Gas Reactors (HTGRs), the acceptable level of defective or failed coated particles is essentially zero. This specification requires processes that produce coated spherical particles with even coatings and extremely low defect fractions. Unfortunately, the scale-up and design of the current processes and coaters have been based on empirical approaches and are operated as black boxes; hence, a voluminous amount of experimental development and trial-and-error work has been conducted. It has been clearly demonstrated that the quality of the coating applied to the fuel kernels is affected by the hydrodynamics, solids flow field, and flow-regime characteristics of the spouted-bed coaters, which are themselves influenced by design parameters and operating variables. Further complicating the outlook for future fuel-coating technology and nuclear energy production is the fact that a variety of new concepts will involve fuel kernels of different sizes and compositions of different densities. Therefore, without a fundamental understanding of the underlying phenomena of the spouted-bed TRISO coater, a significant amount of effort is required for the production of each type of particle, with a significant risk of not meeting the specifications. This difficulty will significantly and negatively impact the application of AGRs for power generation and pose further challenges to them as an alternative source of commercial energy production. Accordingly, the proposed work seeks to overcome these hurdles and advance the scale-up, design, and performance of TRISO fuel particle spouted-bed coaters.
The overall objectives of the proposed work are

  4. Mechanochemistry applied to reformulation and scale-up production of Ethionamide: Salt selection and solubility enhancement.

    PubMed

    de Melo, Cristiane C; da Silva, Cecilia C P; Pereira, Carla C S S; Rosa, Paulo C P; Ellena, Javier

    2016-01-01

Ethionamide (ETH), a Biopharmaceutics Classification System class II drug, is a second-line drug manufactured as an oral dosage form by Pfizer to treat tuberculosis. Since its discovery in 1956, only one reformulation has been proposed, in 2005, as part of the efforts to improve its solubility. Given the limited scientific research on active pharmaceutical ingredients (APIs) for the treatment of neglected diseases, we focused on the development of an accessible and green supramolecular-synthesis protocol for the production of novel solid forms of ETH. Initially, three salts were crystal-engineered and supramolecularly synthesized via slow evaporation of the solvent: a saccharinate, a maleate, and an oxalate. The crystal structures of all three salts were determined by single-crystal X-ray diffraction. Mechanochemical protocols were then developed for them; the scale-up production of the maleate salt proved reproducible, as confirmed by powder X-ray diffraction. Finally, a more complete solid-state characterization was carried out for the ETH maleate salt, including thermal analysis, infrared spectroscopy, scanning electron microscopy, and equilibrium solubility in different dissolution media. Although ETH maleate is thermodynamically less stable than ETH, the equilibrium solubility results revealed that this novel salt is much more soluble in purified water than ETH, making it a suitable new candidate for future formulations. PMID:26472469

  5. Estimating the Annual Incidence of Abortions in Iran Applying a Network Scale-up Approach

    PubMed Central

    Rastegari, Azam; Baneshi, Mohammad Reza; Haji-maghsoudi, Saiedeh; Nakhaee, Nowzar; Eslami, Mohammad; Malekafzali, Hossein; Haghdoost, Ali Akbar

    2014-01-01

Background: Abortion is a major public-health concern in developing countries. In settings where abortion is highly prohibited, direct interviews are not a reliable method for estimating the abortion rate. Indirect estimation methods may overcome this dilemma; they are practical methods for estimating the size of hidden groups that will not agree to participate in a direct interview. Objectives: The aim of this study was to explore the practicality of an indirect estimation method, known as the network scale-up method, and to provide an estimate of the number of abortions with and without medical indication (AWMI+ and AWMI-) in Iran. Materials and Methods: This cross-sectional study was conducted in the 31 provinces of Iran in 2012. A random sample of between 200 and 1000 people was selected in each province by multistage sampling; 75% of the data were collected from the capital and 25% from one other main city. We sampled urban residents over 18 years old (n = 12960) and asked them about the number of women they knew who had experienced a medical or non-medical abortion in the past year. A range for the transparency factor was estimated based on expert opinion. Results: The ranges of the transparency factor for AWMI+ and AWMI- were 0.43-0.75 and 0.2-0.34, respectively. For AWMI+, our minimum and maximum estimates (per 1000 pregnancies) were 70.54 and 116.9, respectively. The corresponding figures for AWMI- were 93.18 and 148.7. Conclusions: The rates of both AWMI+ and AWMI- were relatively high. Therefore, the health system should address this hidden problem with appropriate preventive policies. PMID:25558379
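
    The study's full estimator involves demographic adjustment and survey weighting; the sketch below shows only the basic network scale-up arithmetic: the fraction of respondents' contacts reported to belong to the hidden group estimates the population fraction, and dividing by a transparency factor (< 1) corrects for events that contacts conceal. All numbers are hypothetical illustration values.

```python
def network_scale_up(reports, network_sizes, population, transparency):
    # reports[i]: hidden-group members respondent i reports knowing.
    # network_sizes[i]: respondent i's total personal network size.
    # The visible fraction is scaled to the population, then inflated
    # by 1/transparency to account for under-reporting.
    visible = sum(reports) / sum(network_sizes)
    return population * visible / transparency

# Hypothetical survey of five respondents, each knowing ~300 people:
est_low = network_scale_up(reports=[1, 0, 2, 0, 1],
                           network_sizes=[300] * 5,
                           population=1_000_000, transparency=0.75)
est_high = network_scale_up(reports=[1, 0, 2, 0, 1],
                            network_sizes=[300] * 5,
                            population=1_000_000, transparency=0.43)
```

    A smaller transparency factor (more concealment) yields a larger estimate, which is why the study reports its results as a minimum-maximum range.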

  6. Scale effects: HCMM data simulation. Usage of filtering techniques for scaling-up simulations

    NASA Technical Reports Server (NTRS)

    Digennaro, V. (Principal Investigator)

    1980-01-01

Image reduction used to simulate an increase in the altitude of an acquisition platform is equivalent to data smoothing and can be achieved either by neighborhood averaging or by filtering techniques. The averaging approach is of limited accuracy for simulation. A filtering method is described, based on the hypothesis that all changes due to an altitude increase can be represented by a point-spread function. Determination of the scale function and scale factor is discussed, as well as implementation of the filtering, which can be performed in either the spatial or the frequency domain. In the spatial domain, filtering consists of convolving the image with the weight mask and then decimating the points according to the appropriate scale factor. A simulation of an aircraft daytime image in the infrared channel is examined.
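
    The spatial-domain variant described (convolution with a weight mask, then decimation by the scale factor) can be sketched as follows; the Gaussian point-spread function and its width are illustrative assumptions, not the paper's calibrated scale function.

```python
import numpy as np

def scale_down(image, factor, sigma):
    # Simulate a higher-altitude acquisition: convolve with a Gaussian
    # point-spread function (separable 1-D weight mask applied to rows,
    # then columns), then decimate by the scale factor.
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    w = np.exp(-0.5 * (x / sigma) ** 2)
    w /= w.sum()                                   # normalized weights
    smoothed = np.apply_along_axis(
        lambda r: np.convolve(r, w, mode="same"), 1, image)
    smoothed = np.apply_along_axis(
        lambda c: np.convolve(c, w, mode="same"), 0, smoothed)
    return smoothed[::factor, ::factor]            # decimation

img = np.ones((32, 32))
small = scale_down(img, factor=4, sigma=1.0)
```

    For a constant image, interior pixels are unchanged (the weights sum to one) while edge pixels are attenuated by the zero-padded boundary, one reason real implementations treat borders carefully.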

  7. Scaling up Psycholinguistics

    ERIC Educational Resources Information Center

    Smith, Nathaniel J.

    2011-01-01

    This dissertation contains several projects, each addressing different questions with different techniques. In chapter 1, I argue that they are unified thematically by their goal of "scaling up psycholinguistics"; they are all aimed at analyzing large data-sets using tools that reveal patterns to propose and test mechanism-neutral hypotheses about…

  8. Evaluation of scale-up methodologies of gas-solid spouted beds for coating TRISO nuclear fuel particles using advanced measurement techniques

    NASA Astrophysics Data System (ADS)

    Ali, Neven Y.

This work implements, for the first time, advanced non-invasive measurement techniques to evaluate two scale-up methodologies for gas-solid spouted beds: the approach reported in the literature, based on matching dimensionless groups, and a new mechanistic methodology developed in our laboratory, based on matching the radial profile of gas holdup, since the gas dynamics dictate the hydrodynamics of gas-solid spouted beds. The techniques are gamma-ray computed tomography (CT), which measures the cross-sectional distribution of the phase holdups and their radial profiles along the bed height, and radioactive particle tracking (RPT), which measures the three-dimensional (3D) solids velocity and its turbulence parameters. The measured local parameters and the analysis of the results obtained in this work validate our new scale-up methodology by comparing, for similarity, the phase holdups and the dimensionless solids velocities and turbulence parameters, non-dimensionalized using the minimum spouting superficial gas velocity. The scale-up methodology based on matching dimensionless groups, by contrast, could not be validated for hydrodynamic similarity with respect to local parameters such as phase holdups and dimensionless solids velocities and their turbulence parameters; in the literature this method had been validated only by measuring global parameters. Thus, this work confirms that validation of scale-up methods for gas-solid spouted beds should rest on measuring and analyzing the local hydrodynamic parameters.

  9. Applied ALARA techniques

    SciTech Connect

    Waggoner, L.O.

    1998-02-05

The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. The last reactor plants and processing facilities were shut down, and Hanford was given a new mission: to put the facilities in a safe condition, decontaminate them, and prepare them for decommissioning. The skills needed to operate these facilities were different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes cleanup of contaminated areas in and around all the facilities, recovery of reactor fuel from spent-fuel pools, and removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. Hanford facilities contain a myriad of radioactive isotopes located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early on that, in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and to involve radiological-control and/or ALARA-committee personnel early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit the spread of contamination, and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady, and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work.

  10. Scaling up Education Reform

    ERIC Educational Resources Information Center

    Gaffney, Jon D. H.; Richards, Evan; Kustusch, Mary Bridget; Ding, Lin; Beichner, Robert J.

    2008-01-01

The SCALE-UP (Student-Centered Activities for Large-Enrollment Undergraduate Programs) project was developed to implement reforms designed for small classes into large physics classes. Over 50 schools across the country, ranging from Wake Technical Community College to the Massachusetts Institute of Technology (MIT), have adopted it for classes of…

  11. The SCALE-UP Project

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    2015-03-01

The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large-enrollment classes. Nearly all research-based pedagogies have been designed for fairly high faculty-student ratios, a situation the economics of introductory courses at large universities often precludes, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students not only to succeed in acquiring content but also to practice important 21st-century skills like problem solving, communication, and teamwork. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large-enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicate highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).

  12. Scaling up of renewable chemicals.

    PubMed

    Sanford, Karl; Chotani, Gopal; Danielson, Nathan; Zahn, James A

    2016-04-01

The transition of promising technologies for the production of renewable chemicals from laboratory scale to commercial scale is often difficult and expensive. As a result, the time required for commercialization is typically underestimated, resulting in much slower penetration of these promising new methods and products into the chemical industries. The theme of 'sugar is the next oil' connects biological, chemical, and thermochemical conversions of renewable feedstocks to products that are either drop-in replacements for petroleum-derived chemicals or new-to-market chemicals and materials. The latter typically offer a functionality advantage and can command higher prices, which results in less severe scale-up challenges. For drop-in replacements, however, price is of paramount importance, and competitive capital and operating expenditures are a prerequisite for success. Hence, scale-up of the relevant technologies must be interfaced with effective and efficient management of both cell and steel factories. Details involved in all aspects of manufacturing, such as utilities, sterility, product recovery and purification, regulatory requirements, and emissions, must be managed successfully. PMID:26874264

  13. Wax deposition scale-up modeling for waxy crude production lines

    SciTech Connect

    Hsu, J.J.C.; Brubaker, J.P.

    1995-12-01

A wax-deposition scale-up model has been developed to scale up laboratory wax-deposition results to waxy crude production lines. The model allows users to predict the wax-deposition profile along a cold pipeline and to anticipate potential wax problems and pigging frequency. Accounting for flow turbulence significantly increases prediction accuracy, and accurate wax-deposition prediction should save capital and operating investment for waxy crude production systems. Many wax-deposition models apply only a molecular-diffusion mechanism and neglect shear effects. However, flow turbulence has a significant impact on wax deposition and cannot be neglected in wax-deposition modeling. Wax-deposition scale-up parameters including shear rate, shear stress, and Reynolds number have been studied; none of these parameters can be used as a scaler. The critical wax tension concept has been proposed as a scaler instead. A technique to scale up the shear effect, and hence wax deposition, is described. For a given oil and oil temperature, laboratory wax-deposition data can be scaled up by heat flux and flow velocity. The scale-up techniques could also be applied to multiphase-flow conditions. Examples are presented describing the profiles of wax deposition and effective inside diameter along North Sea and West Africa subsea pipelines. The difference between the wax-deposition profiles of stock-tank oil and live oil is also presented.

  14. Steamflood modeling and scale up in a heterogeneous reservoir

    SciTech Connect

    Dehghani, K.; Basham, W.M.; Durlofsky, L.J.

    1996-12-31

A study was undertaken to investigate the effects of different levels of reservoir description on modeling of the steamflood process in Midway-Sunset 26C South. This reservoir is highly stratified and interbedded with fine-scale heterogeneities. The study also included development of a methodology for coarsening very detailed, geostatistically derived models so that the effects of fine-scale heterogeneities on steamflood performance prediction can be captured. Porosity and permeability cubes with three different levels of detail in layering were generated geostatistically for a 30-acre area in 26C South using data from 57 wells. Cross-section models were extracted from these cubes, in both dipping and non-dipping structures, for steamflood simulations using field injection data, and the results were compared. The most detailed models were coarsened by two methods. In the first method, the impermeable layers were retained and the permeable sands were coarsened by averaging each 10 layers. In the second method, the dominant flow paths in the cross section were identified and used to selectively scale up the reservoir properties, leaving detail in regions where required and coarsening in other regions. The results showed that the detailed geostatistical model gives a more accurate performance prediction than a coarse geostatistical model, and that the most accurate coarsened models were obtained by applying the general scale-up technique.
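
    The first coarsening method (retain impermeable layers, average each 10 permeable layers) can be sketched in a few lines; the choice of an arithmetic average, appropriate for flow along the layering, and the layer values are assumptions for illustration, since the abstract does not specify the averaging rule.

```python
import numpy as np

def coarsen_layers(perm, group=10):
    # Keep impermeable layers (k == 0) intact; average each `group`
    # consecutive permeable layers into one coarse layer. Flow across
    # the layering would call for a harmonic average instead.
    coarse, permeable = [], []
    for k in perm:
        if k == 0.0:                       # impermeable: flush and keep
            if permeable:
                coarse.append(float(np.mean(permeable)))
                permeable = []
            coarse.append(0.0)
        else:
            permeable.append(k)
            if len(permeable) == group:
                coarse.append(float(np.mean(permeable)))
                permeable = []
    if permeable:                          # trailing partial group
        coarse.append(float(np.mean(permeable)))
    return coarse

layers = [100.0] * 10 + [0.0] + [50.0] * 10
print(coarsen_layers(layers))              # -> [100.0, 0.0, 50.0]
```

    Preserving the zero-permeability barriers is the essential point: averaging them away would spuriously connect sands that the real reservoir keeps separate.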

  16. Application of a new scale up methodology to the simulation of displacement processes in heterogeneous reservoirs

    SciTech Connect

    Durlofsky, L.J.; Milliken, W.J.; Dehghani, K.; Jones, R.C.

    1994-12-31

A general method for the scale-up of highly detailed, heterogeneous reservoir cross sections is presented and applied to the simulation of several recovery processes in a variety of geologic settings. The scale-up technique proceeds by first identifying portions of the fine-scale reservoir description that could potentially lead to high fluid velocities, typically regions of connected high permeability. These regions are then modeled in detail, while the remainder of the domain is coarsened using a general numerical technique for the calculation of effective permeability. The overall scale-up method is applied to the cross-sectional simulation of three actual fields; waterflood, steamflood, and miscible-flood recovery processes are considered. In all these cases, the scale-up technique is shown to give coarsened reservoir descriptions whose simulation results agree very well with those of the detailed reservoir descriptions. For these simulations, speedups in computation time for the coarsened models, relative to their fine-grid counterparts, range from a factor of 10 to a factor of 200.

  17. Scaling up Effects in the Organic Laboratory

    ERIC Educational Resources Information Center

    Persson, Anna; Lindstrom, Ulf M.

    2004-01-01

A simple and effective way of exposing chemistry students to some of the effects of scaling up an organic reaction is described. It gives the student an experience of the kind they may encounter in an industrial setting.

  18. Diagnostic technique applied for FEL electron bunches

    NASA Astrophysics Data System (ADS)

    Brovko, O.; Grebentsov, A.; Morozov, N.; Syresin, E.; Yurkov, M.

    2016-05-01

A diagnostic technique for FEL ultrashort electron bunches has been developed by the JINR-DESY collaboration within the framework of the FLASH and XFEL projects. Photon diagnostics are based on calorimetric measurements and detection of undulator radiation. The infrared undulator constructed at JINR and installed at FLASH is used for longitudinal bunch-shape measurements and for two-color lasing provided by the FIR and VUV undulators. Pump-probe experiments with the VUV and FIR undulators provide bunch-profile measurements with a resolution of several femtoseconds. Three new microchannel-plate (MCP) detectors operating in the X-ray range are now under development at JINR for SASE1-SASE3 of the European XFEL.

  19. Flash Diffusivity Technique Applied to Individual Fibers

    NASA Technical Reports Server (NTRS)

    Mayeaux, Brian; Yowell, Leonard; Wang, Hsin

    2007-01-01

A variant of the flash diffusivity technique has been devised for determining the thermal diffusivities, and thus the thermal conductivities, of individual aligned fibers. The technique is intended especially for application to nanocomposite fibers made from narrower fibers of polyphenylene benzobisthiazole (PBZT) and carbon nanotubes. These highly aligned nanocomposite fibers could exploit the high thermal conductivities of carbon nanotubes for thermal-management applications. In the flash diffusivity technique as practiced heretofore, one or more heat pulses are applied to the front face of a plate or disk specimen and the resulting time-varying temperature of the rear face is measured. Usually, the heat pulse is generated with a xenon flash lamp, and the variation of temperature on the rear face is measured with an infrared detector. The flash energy is made large enough to produce a usefully high temperature rise on the rear face, but not so large as to significantly alter the specimen material. Once the measurement has been completed, the thermal diffusivity of the specimen is computed from the thickness of the specimen and the time dependence of the temperature variation on the rear face. Heretofore, the infrared detector used in the flash diffusivity technique has been a single-point detector, which responds to a spatial average of the thermal radiation from the rear specimen surface. Such a detector cannot distinguish among regions of differing diffusivity within the specimen. Moreover, two basic assumptions of the thermal-diffusivity technique as practiced heretofore are that the specimen is homogeneous and that heat flows one-dimensionally from the front face to the rear face. These assumptions are not valid for an inhomogeneous (composite) material.
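
    For the classical flash method on a homogeneous slab (not the fiber variant this article describes), Parker's well-known relation converts the rear-face half-rise time into a diffusivity; the specimen dimensions below are made up for illustration.

```python
import math

def flash_diffusivity(thickness_m, t_half_s):
    # Parker's relation for the flash method: diffusivity from the
    # specimen thickness L and the time t_1/2 for the rear face to
    # reach half its peak temperature rise:
    #   alpha = 1.38 * L**2 / (pi**2 * t_half)  ~=  0.1388 * L**2 / t_half
    return 1.38 * thickness_m ** 2 / (math.pi ** 2 * t_half_s)

# Hypothetical 2 mm thick disk whose rear-face half-rise takes 50 ms:
alpha = flash_diffusivity(2e-3, 50e-3)     # m^2/s
```

    The quadratic dependence on thickness is why accurate thickness measurement dominates the error budget of the method.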

  20. Digital techniques applied to sine test control

    NASA Astrophysics Data System (ADS)

    Westoby, T. J.

    1981-09-01

Digital techniques are applied to solve problems experienced with analogue circuitry, enabling the design of a highly reliable sine-control system. A sine wave is generated whose frequency is proportional to a digital number held in the counters of the sweep generator, using a frequency-related pulse stream. This pulse stream generates a ramp by being applied to a counter; the rate of rise is varied by a rate multiplier arranged to slow the pulse stream as the ramp proceeds. Variation of frequency depends only on the frequency of the pulse stream entering the circuit, and the oscillator runs quite acceptably at both 0.1 Hz and 10 kHz. The total distortion at this stage is less than 2%. Since the control signal is quantized, only discrete changes in control are experienced and the control lines are static most of the time; the digital system can reduce the effects of a noisy return signal by as much as 64 times. The greatest advantage of digital techniques is in integrator stabilization. A tracking capacitor ensures that conversion is done to an accuracy of 1%, and residual ripple on the output is removed by a low-pass filter.
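
    A modern software analogue of a sine generator whose frequency is proportional to a digital number is the phase-accumulator (numerically controlled oscillator) loop sketched below; the bit width, frequency word, and sample rate are arbitrary illustration values, not parameters from the 1981 hardware.

```python
import math

def nco_samples(freq_word, accumulator_bits, n):
    # Numerically controlled oscillator: a phase accumulator advances
    # by a digital frequency word each sample, so the output frequency
    # is exactly proportional to that number. Output frequency is
    # freq_word * sample_rate / 2**accumulator_bits.
    modulus = 1 << accumulator_bits
    phase = 0
    out = []
    for _ in range(n):
        out.append(math.sin(2.0 * math.pi * phase / modulus))
        phase = (phase + freq_word) % modulus
    return out

# freq_word = 2**20 with a 24-bit accumulator: 1/16 of a cycle per sample
samples = nco_samples(freq_word=1 << 20, accumulator_bits=24, n=16)
```

    Because the frequency depends only on the digital word, such an oscillator has none of the drift of an analogue integrator, which is the article's central point.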

  1. Characterization of Filtration Scale-Up Performance

    SciTech Connect

    Daniel, Richard C.; Billing, Justin M.; Luna, Maria L.; Cantrell, Kirk J.; Peterson, Reid A.; Bonebrake, Michael L.; Shimskey, Rick W.; Jagoda, Lynette K.

    2009-03-09

The scale-up performance of the sintered stainless steel crossflow filter elements planned for use at the Pretreatment Engineering Platform (PEP) and at the Waste Treatment and Immobilization Plant (WTP) was characterized in partial fulfillment (see Table S.1) of the requirements of Test Plan TP-RPP-WTP-509. This test report details the results of experimental activities related only to filter scale-up characterization. These tests were performed under the Simulant Testing Program supporting Phase 1 of the demonstration of the pretreatment leaching processes at PEP. Pacific Northwest National Laboratory (PNNL) conducted the tests discussed herein for Bechtel National, Inc. (BNI) to address the data needs of Test Specification 24590-WTP-TSP-RT-07-004. The scale-up characterization tests employed high-level waste (HLW) simulants developed under Test Plan TP-RPP-WTP-469. The experimental activities outlined in TP-RPP-WTP-509 examined specific processes from two broad areas of simulant behavior: (1) leaching performance of the boehmite simulant as a function of suspending-phase chemistry, and (2) filtration performance of the blended simulant with respect to filter scale-up and fouling. With regard to leaching behavior, the effect of anions on the kinetics of boehmite leaching was examined in two experiments: one examined the effect of the aluminate anion on the rate of boehmite dissolution, and the other determined the effect of secondary anions typical of Hanford tank wastes on that rate. Both experiments provide insight into how compositional variations in the suspending phase affect the effectiveness of the leaching processes. In addition, the aluminate-anion studies provide information on the consequences of gibbsite in the waste, which derives from the expected fast dissolution of gibbsite relative to boehmite. This test report concerns only the results of the filtration performance with respect to scale-up.
Test results for boehmite

  2. Applying Renormalization Group Techniques to Nuclear Reactions

    NASA Astrophysics Data System (ADS)

    Eldredge, Zachary; Bogner, Scott; Nunes, Filomena

    2013-10-01

    Nuclear reactions are commonly used to explore the physics of unstable nuclei. Therefore, it is important that accurate, computationally favorable methods exist to describe them. Reaction models often make use of effective nucleon-nucleus potentials (optical potentials) which fit low-energy scattering data and include an imaginary component to account for the removal of flux from the elastic channel. When describing reactions in momentum space, the coupling between low- and high-momentum states can pose a technical challenge. We would like potentials which allow us to compute low-momentum interactions without including highly virtual momentum states. A solution to this problem is to apply renormalization group (RG) techniques to produce a new effective potential in which high and low momentum degrees of freedom are decoupled, so that we need only consider momenta below some cutoff. This poster will present results relating to an implementation of RG techniques on optical potentials, including complex potentials and spin-orbit effects. We show that our evolved optical potentials reproduce bound states and scattering phase shifts without the inclusion of any momenta above a selected cutoff, and compare new potentials to old ones to examine the effect of transformation.
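
    The decoupling idea can be illustrated on a toy matrix with a Wegner-type generator, rather than the optical-potential implementation the poster describes; the Hamiltonian entries, step size, and step count below are invented for the sketch.

```python
import numpy as np

def srg_evolve(h, steps=2000, ds=1e-3):
    # Similarity renormalization group flow dH/ds = [eta, H] with the
    # Wegner-style generator eta = [Hd, H], Hd = diagonal part of H,
    # integrated with crude Euler steps. The flow is a unitary
    # transformation, so the spectrum is (approximately, given the
    # simple integrator) preserved while off-diagonal couplings
    # between low- and high-lying states are driven toward zero.
    h = h.copy()
    for _ in range(steps):
        hd = np.diag(np.diag(h))
        eta = hd @ h - h @ hd
        h = h + ds * (eta @ h - h @ eta)
    return h

# Toy 3x3 symmetric "Hamiltonian" with low/high states coupled:
h0 = np.array([[1.0, 0.5, 0.4],
               [0.5, 4.0, 0.3],
               [0.4, 0.3, 9.0]])
h_evolved = srg_evolve(h0)
```

    After the flow, the low-lying state can be treated without reference to the high-lying ones, which is exactly the decoupling of momenta below a cutoff that the poster exploits for optical potentials.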

  3. Lessons Learned on "Scaling Up" of Projects

    ERIC Educational Resources Information Center

    Viadero, Debra

    2007-01-01

    Having developed a technology-based teaching unit on weather that appeared to work well for middle school students, Nancy Butler Songer and her colleagues at the University of Michigan decided in the late 1990s to take the next logical step in their research program: They scaled up. This article discusses lessons learned by several faculty…

  4. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  5. Scaling up: Distributed machine learning with cooperation

    SciTech Connect

    Provost, F.J.; Hennessy, D.N.

    1996-12-31

    Machine-learning methods are becoming increasingly popular for automated data analysis. However, standard methods do not scale up to massive scientific and business data sets without expensive hardware. This paper investigates a practical alternative for scaling up: the use of distributed processing to take advantage of the often dormant PCs and workstations available on local networks. Each workstation runs a common rule-learning program on a subset of the data. We first show that for commonly used rule-evaluation criteria, a simple form of cooperation can guarantee that a rule will look good to the set of cooperating learners if and only if it would look good to a single learner operating with the entire data set. We then show how such a system can further capitalize on different perspectives by sharing learned knowledge for significant reduction in search effort. We demonstrate the power of the method by learning from a massive data set taken from the domain of cellular fraud detection. Finally, we provide an overview of other methods for scaling up machine learning.
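    The cooperation guarantee rests on the observation that common rule-evaluation criteria are functions of simple counts (coverage, true positives), and counts over disjoint partitions sum exactly to the full-data counts. A minimal sketch on synthetic data (the rule and dataset are invented for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic data: binary feature matrix X and labels y; a "rule" is a
    # predicate over the features.
    X = rng.integers(0, 2, size=(1000, 5))
    y = rng.integers(0, 2, size=1000)
    rule = lambda X: (X[:, 0] == 1) & (X[:, 2] == 0)

    def rule_counts(X, y):
        covered = rule(X)
        # (coverage, true positives) -- the raw ingredients of most criteria
        return covered.sum(), (covered & (y == 1)).sum()

    # Full-data evaluation by a single learner.
    cov_full, tp_full = rule_counts(X, y)

    # Distributed evaluation: each "workstation" counts on its own partition,
    # then only the counts (not the data) are pooled.
    parts = np.array_split(np.arange(1000), 4)
    cov_sum = sum(rule_counts(X[p], y[p])[0] for p in parts)
    tp_sum = sum(rule_counts(X[p], y[p])[1] for p in parts)
    # Count-based criteria such as coverage and precision therefore agree
    # exactly between the cooperating learners and a single learner.
    ```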

  6. Nanomedicine scale-up technologies: feasibilities and challenges.

    PubMed

    Paliwal, Rishi; Babu, R Jayachandra; Palakurthi, Srinath

    2014-12-01

    Nanomedicine refers to biomedical and pharmaceutical applications of nanosized cargos of drugs/vaccine/DNA therapeutics including nanoparticles, nanoclusters, and nanospheres. Such particles have unique characteristics related to their size, surface, drug loading, and targeting potential. They are widely used to combat disease by controlled delivery of bioactive(s) or for diagnosis of life-threatening problems in their very early stage. The bioactive agent can be combined with a diagnostic agent in a nanodevice for theragnostic applications. However, the formulation scientist faces numerous challenges related to their development, scale-up feasibilities, regulatory aspects, and commercialization. This article reviews recent progress in the method of development of nanoparticles with a focus on polymeric and lipid nanoparticles, their scale-up techniques, and challenges in their commercialization. PMID:25047256

  7. The projected effect of scaling up midwifery.

    PubMed

    Homer, Caroline S E; Friberg, Ingrid K; Dias, Marcos Augusto Bastos; ten Hoope-Bender, Petra; Sandall, Jane; Speciale, Anna Maria; Bartlett, Linda A

    2014-09-20

    We used the Lives Saved Tool (LiST) to estimate deaths averted if midwifery was scaled up in 78 countries classified into three tertiles using the Human Development Index (HDI). We selected interventions in LiST to encompass the scope of midwifery practice, including prepregnancy, antenatal, labour, birth, and post-partum care, and family planning. Modest (10%), substantial (25%), or universal (95%) scale-up scenarios from present baseline levels were all found to reduce maternal deaths, stillbirths, and neonatal deaths by 2025 in all countries tested. With universal coverage of midwifery interventions for maternal and newborn health, excluding family planning, for the countries with the lowest HDI, 61% of all maternal, fetal, and neonatal deaths could be prevented. Family planning alone could prevent 57% of all deaths because of reduced fertility and fewer pregnancies. Midwifery with both family planning and interventions for maternal and newborn health could avert a total of 83% of all maternal deaths, stillbirths, and neonatal deaths. The inclusion of specialist care in the scenarios resulted in an increased number of deaths being prevented, meaning that midwifery care has the greatest effect when provided within a functional health system with effective referral and transfer mechanisms to specialist care. PMID:24965814

  8. Applying Cryopreservation Techniques to Diverse Biological Materials

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Developing new cryopreservation protocols for each new plant or tissue is time consuming and often unnecessary. Existing standard protocols can be applied to many plants resulting in moderate to excellent results or protocols may require only a few changes for optimum recovery of plants. Protocols ...

  9. Novel techniques applied to polymer lifetime predictions

    SciTech Connect

    Gillen, K.T.; Wise, J.; Clough, R.L.

    1993-12-31

    A study aimed at testing the Arrhenius life prediction approach is described. After aging elastomeric materials at several elevated (accelerated) temperatures, a modulus profiling apparatus was used to demonstrate that complicated diffusion-limited oxidation anomalies are typically present under accelerated oven-aging conditions. By using surface modulus results (oxidation leads to a monotonic increase in modulus), estimates are made of the true activation energy (E{sub a}) appropriate to the oxidation reactions dominating degradation. Even though macroscopic properties should be influenced by the diffusion-limited oxidation complications, ultimate tensile elongation results were found to be correlated with the true E{sub a}. This implies that cracks initiate at the hardened surface of the material and then quickly propagate through the less oxidized interior. Even if values of E{sub a} obtained from accelerated exposures can be determined and rationalized, another important question involves the Arrhenius assumption that E{sub a} remains constant in the extrapolation region. Preliminary data from two ultra-sensitive techniques (oxygen consumption and microcalorimetry) aimed at testing this fundamental assumption are described.
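    The Arrhenius approach being tested extrapolates accelerated-aging results to service temperature by fitting ln(time-to-failure) against 1/T, the slope giving E_a/R. A schematic sketch with invented numbers (not the paper's elastomer data):

    ```python
    import numpy as np

    # Hypothetical Arrhenius behavior: t = A * exp(Ea / (R*T)).
    R = 8.314   # J/(mol K)
    Ea = 90e3   # J/mol, assumed activation energy
    A = 1e-9    # hours, assumed pre-exponential factor

    def time_to_failure(T_celsius):
        T = T_celsius + 273.15
        return A * np.exp(Ea / (R * T))

    # "Measured" times at accelerated oven temperatures.
    T_acc = np.array([120.0, 100.0, 80.0])
    lnt = np.log(time_to_failure(T_acc))

    # Fit ln(t) vs 1/T, then extrapolate to a 25 C service temperature.
    slope, intercept = np.polyfit(1.0 / (T_acc + 273.15), lnt, 1)
    Ea_fit = slope * R                          # recovered activation energy
    t_25 = np.exp(intercept + slope / 298.15)   # predicted service lifetime
    ```

    The extrapolation is only as good as the assumption that E_a stays constant down to service temperature, which is exactly what the oxygen-consumption and microcalorimetry measurements are designed to probe.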

  10. Scaling Up Decision Theoretic Planning to Planetary Rover Problems

    NASA Technical Reports Server (NTRS)

    Meuleau, Nicolas; Dearden, Richard; Washington, Rich

    2004-01-01

    Because of communication limits, planetary rovers must operate autonomously for extended durations. The ability to plan under uncertainty is one of the main components of autonomy. Previous approaches to planning under uncertainty in NASA applications are not able to address the challenges of future missions because of several apparent limits. On the other hand, decision theory provides a solid, principled framework for reasoning about uncertainty and rewards. Unfortunately, there are several obstacles to a direct application of decision-theoretic techniques to the rover domain. This paper focuses on the issues of structure and concurrency, and of continuous state variables. We describe two techniques currently under development that specifically address these issues and allow scaling up decision-theoretic solution techniques to planetary rover planning problems involving a small number of goals.

  11. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  12. Scale-up on electrokinetic remediation: Engineering and technological parameters.

    PubMed

    López-Vizcaíno, Rubén; Navarro, Vicente; León, María J; Risco, Carolina; Rodrigo, Manuel A; Sáez, Cristina; Cañizares, Pablo

    2016-09-01

    This study analyses the effect of the scale-up of electrokinetic remediation (EKR) processes in natural soils. A procedure is proposed to prepare soils based on a compacting process, to obtain soils with moisture content and density similar to those found in real soils in the field. The soil used here was from a region with high agrarian activity (Mora, Spain). The scale-up study was performed in two installations at different scales: a mock-up pilot scale (0.175 m(3)) and a prototype at a scale very similar to a real application (16 m(3)). The electrode configuration selected consisted of rows of graphite electrodes facing each other, located in electrolyte wells. A discharge of 20 mg of 2,4-dichlorophenoxyacetic acid [2,4-D] per kg of dry soil was treated by applying an electric potential gradient of 1 V cm(-1). An increase in scale was observed to directly influence the amount of energy supplied to the soil being treated. As a result, electroosmotic and electromigration flows and electric heating are more intense than in the smaller-scale tests (by 24%, 1%, and 25%, respectively, relative to the values in the prototype). In addition, possible leaks were evaluated by conducting a watertightness test and quantifying evaporation losses. PMID:27209275

  13. Scale-up of sediment microbial fuel cells

    NASA Astrophysics Data System (ADS)

    Ewing, Timothy; Ha, Phuc Thi; Babauta, Jerome T.; Tang, Nghia Trong; Heo, Deukhyoun; Beyenal, Haluk

    2014-12-01

    Sediment microbial fuel cells (SMFCs) are used as renewable power sources to operate remote sensors. However, increasing the electrode surface area results in decreased power density, which demonstrates that SMFCs do not scale up with size. As an alternative to the physical scale-up of SMFCs, we proposed that it is possible to scale up power by using smaller-sized individually operated SMFCs connected to a power management system that electrically isolates the anodes and cathodes. To demonstrate our electronic scale-up approach, we operated one 0.36-m2 SMFC (called a single-equivalent SMFC) and four independent SMFCs of 0.09 m2 each (called scaled-up SMFCs) and managed the power using an innovative custom-developed power management system. We found that the single-equivalent SMFC and the scaled-up SMFCs produced similar power for the first 155 days. However, in the long term (>155 days) our scaled-up SMFCs generated significantly more power than the single-equivalent SMFC (2.33 mW vs. 0.64 mW). Microbial community analysis of the single-equivalent SMFC and the scaled-up SMFCs showed very similar results, demonstrating that the difference in operation mode had no significant effect on the microbial community. When we compared scaled-up SMFCs with parallel SMFCs, we found that the scaled-up SMFCs generated more power. Our novel approach demonstrates that SMFCs can be scaled up electronically.

  14. Scale-up of ecological experiments: Density variation in the mobile bivalve Macomona liliana

    USGS Publications Warehouse

    Schneider, D.C.; Walters, R.; Thrush, S.; Dayton, P.

    1997-01-01

    At present the problem of scaling up from controlled experiments (necessarily at a small spatial scale) to questions of regional or global importance is perhaps the most pressing issue in ecology. Most of the proposed techniques recommend iterative cycling between theory and experiment. We present a graphical technique that facilitates this cycling by allowing the scope of experiments, surveys, and natural history observations to be compared to the scope of models and theory. We apply the scope analysis to the problem of understanding the population dynamics of a bivalve exposed to environmental stress at the scale of a harbour. Previous lab and field experiments were found not to be 1:1 scale models of harbour-wide processes. Scope analysis allowed small scale experiments to be linked to larger scale surveys and to a spatially explicit model of population dynamics.

  15. GENOMIC AND PROTEOMIC TECHNIQUES APPLIED TO REPRODUCTIVE BIOLOGY

    EPA Science Inventory

    Genomic and proteomic techniques applied to reproductive biology
    John C. Rockett
    Reproductive Toxicology Division, National Health and Environmental Effects Research Laboratory, Office of Research and Development, United States Environmental Protection Agency, Research Tria...

  16. Accounting for the cost of scaling-up health interventions.

    PubMed

    Johns, Benjamin; Baltussen, Rob

    2004-11-01

    Recent studies such as the Commission on Macroeconomics and Health have highlighted the need for expanding the coverage of services for HIV/AIDS, malaria, tuberculosis, immunisations and other diseases. In order for policy makers to plan for these changes, they need to analyse the change in costs when interventions are 'scaled-up' to cover greater percentages of the population. Previous studies suggest that applying current unit costs to an entire population can misconstrue the true costs of an intervention. This study presents the methodology used in WHO-CHOICE's generalised cost effectiveness analysis, which includes non-linear cost functions for health centres, transportation and supervision costs, as well as the presence of fixed costs of establishing a health infrastructure. Results show changing marginal costs as predicted by economic theory. PMID:15386683
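    The non-linearity argument can be made concrete with a toy total-cost function containing a fixed infrastructure cost and a unit cost that rises at high coverage, so that marginal cost changes with scale. All numbers here are invented for illustration; this is not WHO-CHOICE's actual cost model:

    ```python
    # Toy health-intervention cost function (illustrative numbers only).
    def total_cost(coverage, population=1_000_000):
        """Total cost of covering a fraction `coverage` of the population."""
        fixed = 500_000.0                         # establishing infrastructure
        people = coverage * population
        unit = 12.0 if coverage <= 0.5 else 18.0  # unit costs rise past 50%
        return fixed + unit * people              # (harder-to-reach populations)

    # Marginal cost of one additional percentage point of coverage differs
    # at low vs. high coverage -- applying the current unit cost to the whole
    # population would miss this.
    mc_low = total_cost(0.41) - total_cost(0.40)
    mc_high = total_cost(0.81) - total_cost(0.80)
    ```

    Extrapolating today's average unit cost to universal coverage would understate the true scale-up cost whenever marginal costs rise, which is the study's central point.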

  17. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    Automated validation of flight-critical embedded systems is being done at ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques. The paper also discusses flight control system features which foster the use of automated testing techniques.

  18. Scale-up of miscible flood processes

    SciTech Connect

    Orr, F.M. Jr.

    1992-05-01

    Results of a wide-ranging investigation of the scaling of the physical mechanisms of miscible floods are reported. Advanced techniques for analysis of crude oils are considered in Chapter 2. Application of supercritical fluid chromatography is demonstrated for characterization of crude oils for equation-of-state calculations of phase equilibrium. Results of measurements of crude oil and phase compositions by gas chromatography and mass spectrometry are also reported. The theory of development of miscibility is considered in detail in Chapter 3. The theory is extended to four components, and sample solutions for a variety of gas injection systems are presented. The analytical theory shows that miscibility can develop even though standard tie-line extension criteria developed for ternary systems are not satisfied. In addition, the theory includes the first analytical solutions for condensing/vaporizing gas drives. In Chapter 4, methods for simulation of viscous fingering are considered. The scaling of the growth of transition zones in linear viscous fingering is considered. In addition, extension of the models developed previously to three dimensions is described, as is the inclusion of effects of equilibrium phase behavior. In Chapter 5, the combined effects of capillary and gravity-driven crossflow are considered. The experimental results presented show that very high recovery can be achieved by gravity segregation when interfacial tensions are moderately low. We argue that such crossflow mechanisms are important in multicontact miscible floods in heterogeneous reservoirs. In addition, results of flow visualization experiments are presented that illustrate the interplay of crossflow driven by gravity with that driven by viscous forces.

  19. Multisite Studies and Scaling up in Educational Research

    ERIC Educational Resources Information Center

    Harwell, Michael

    2012-01-01

    A scale-up study in education typically expands the sample of students, schools, districts, and/or practices or materials used in smaller studies in ways that build in heterogeneity. Yet surprisingly little is known about the factors that promote successful scaling up efforts in education, in large part due to the absence of empirically supported…

  20. Readiness for Change. Scaling-Up Brief. Number 3

    ERIC Educational Resources Information Center

    Fixsen, Dean L.; Blase, Karen A.; Horner, Rob; Sugai, George

    2009-01-01

    The purpose of this "Brief" is to define the variables a state or large district leadership team may wish to consider as they determine if they are "ready" to invest in the scaling-up of an innovation in education. As defined here, "scaling up" means that at least 60% of the students who could benefit from an innovation have access to that…

  1. Scale Up in Education. Volume 1: Ideas in Principle

    ERIC Educational Resources Information Center

    Schneider, Barbara Ed.; McDonald, Sarah-Kathryn Ed.

    2006-01-01

    "Scale Up in Education, Volume 1: Ideas in Principle" examines the challenges of "scaling up" from a multidisciplinary perspective. It brings together contributions from disciplines that routinely take promising innovations to scale, including medicine, business, engineering, computing, and education. Together the contributors explore appropriate…

  2. Applying Parallel Processing Techniques to Tether Dynamics Simulation

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    1996-01-01

    The focus of this research has been to determine the effectiveness of applying parallel processing techniques to a sizable real-world problem, the simulation of the dynamics associated with a tether which connects two objects in low earth orbit, and to explore the degree to which the parallelization process can be automated through the creation of new software tools. The goal has been to utilize this specific application problem as a base to develop more generally applicable techniques.

  3. Scaling up depot medroxyprogesterone acetate (DMPA): a systematic literature review illustrating the AIDED model

    PubMed Central

    2013-01-01

    Background Use of depot medroxyprogesterone acetate (DMPA), often known by the brand name Depo-Provera, has increased globally, particularly in multiple low- and middle-income countries (LMICs). As a reproductive health technology that has scaled up in diverse contexts, DMPA is an exemplar product innovation with which to illustrate the utility of the AIDED model for scaling up family health innovations. Methods We conducted a systematic review of the enabling factors and barriers to scaling up DMPA use in LMICs. We searched 11 electronic databases for academic literature published through January 2013 (n = 284 articles), and grey literature from major health organizations. We applied exclusion criteria to identify relevant articles from peer-reviewed (n = 10) and grey literature (n = 9), extracting data on scale up of DMPA in 13 countries. We then mapped the resulting factors to the five AIDED model components: ASSESS, INNOVATE, DEVELOP, ENGAGE, and DEVOLVE. Results The final sample of sources included studies representing variation in geographies and methodologies. We identified 15 enabling factors and 10 barriers to dissemination, diffusion, scale up, and/or sustainability of DMPA use. The greatest number of factors were mapped to the ASSESS, DEVELOP, and ENGAGE components. Conclusions Findings offer early empirical support for the AIDED model, and provide insights into scale up of DMPA that may be relevant for other family planning product innovations. PMID:23915274

  4. The Use of Video Tape as an Applied Research Technique.

    ERIC Educational Resources Information Center

    Deshler, J. David; Czaplewski, Ellen C.

    This report describes the application of videotape recordings as an applied research technique in a project directed toward the research and development of training designs and materials for human service agency professionals. The multiple applications of VTR and the types of data collection situations in which it has been utilized are briefly…

  5. Sensor Data Qualification Technique Applied to Gas Turbine Engines

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Simon, Donald L.

    2013-01-01

    This paper applies a previously developed sensor data qualification technique to a commercial aircraft engine simulation known as the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k). The sensor data qualification technique is designed to detect, isolate, and accommodate faulty sensor measurements. It features sensor networks, which group various sensors together and relies on an empirically derived analytical model to relate the sensor measurements. Relationships between all member sensors of the network are analyzed to detect and isolate any faulty sensor within the network.
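    The sensor-network idea can be sketched with a toy redundancy check: sensors expected to agree on the same quantity are compared, and an outlier is isolated. C-MAPSS40k's networks use an empirically derived analytical model relating the member sensors; the median test below is only an illustrative stand-in:

    ```python
    import numpy as np

    # Illustrative network of sensors that should report the same quantity.
    def isolate_faulty(readings, tol=5.0):
        """Flag any sensor whose reading deviates from the network median."""
        med = np.median(readings)
        return [i for i, r in enumerate(readings) if abs(r - med) > tol]

    healthy = isolate_faulty([101.2, 100.8, 101.0])  # all consistent -> []
    faulty = isolate_faulty([101.2, 100.8, 153.9])   # sensor index 2 isolated
    ```

    Once a faulty member is isolated, its measurement can be accommodated, e.g. replaced by the value the network model predicts from the remaining sensors.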

  6. Digital image correlation techniques applied to LANDSAT multispectral imagery

    NASA Technical Reports Server (NTRS)

    Bonrud, L. O. (Principal Investigator); Miller, W. J.

    1976-01-01

    The author has identified the following significant results. Automatic image registration and resampling techniques applied to LANDSAT data achieved accuracies corresponding to mean radial displacement errors of less than 0.2 pixel. The processing method utilized recursive computational techniques and line-by-line updating on the basis of feedback error signals. Goodness of local feature matching was evaluated through the implementation of a correlation algorithm. An automatic restart allowed the system to derive control point coordinates over a portion of the image and to restart the process, utilizing this new control point information as initial estimates.
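    The correlation-based feature matching can be illustrated with FFT-based cross-correlation, which recovers the displacement between two images as the location of the correlation peak. This is a simplified stand-in for the paper's recursive line-by-line scheme, using synthetic data:

    ```python
    import numpy as np

    # Synthetic reference image and a circularly shifted copy of it.
    rng = np.random.default_rng(1)
    ref = rng.standard_normal((64, 64))
    shift = (3, -5)
    moving = np.roll(ref, shift, axis=(0, 1))

    # Cross-correlation via FFT; its peak sits at the applied displacement.
    xcorr = np.fft.ifft2(np.fft.fft2(moving) * np.conj(np.fft.fft2(ref))).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    est = tuple(int((p + 32) % 64 - 32) for p in peak)  # wrap into [-32, 32)
    ```

    Subpixel accuracy of the kind reported (errors below 0.2 pixel) requires interpolating around the correlation peak rather than taking the integer maximum as done here.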

  7. Scale-up Synthesis of Diallyl Phthalate Prepolymer

    SciTech Connect

    Carey, D. A.

    1984-10-01

    This project was initiated to develop processes for the synthesis of diallyl phthalate (DAP) prepolymer in the Bendix Chemical Polymer Facility. Thus far, five scale-up reactions have been carried out in a 100-gallon reactor and fourteen have been conducted in the 15-gallon resin kettle. The synthesis of diallyl isophthalate prepolymer (DAIPP) was also investigated; eight scale-up reactions of this prepolymer have been carried out. Aging studies on DAIPP were also conducted.

  8. A new scale-up approach for dispersive mixing in twin-screw compounding

    NASA Astrophysics Data System (ADS)

    Fukuda, Graeme; Bigio, David I.; Andersen, Paul; Wetzel, Mark

    2015-05-01

    Scale-up rules in polymer processing are critical in ensuring consistency in product quality and properties when transitioning from low-volume laboratory mixing processes to high-volume industrial compounding. The scale-up approach investigated in this study evaluates the processes with respect to dispersive mixing. Demand for polymer composites with solid additives, such as carbon microfibers and nanotubes, has become increasingly strong. Dispersive mixing breaks down particles that agglomerate, which is paramount in processing composites because solid additives tend to collect and clump. The amount of stress imparted on the material governs the degree of dispersive mixing. A methodology has been developed to characterize the Residence Stress Distribution (RSD) within a twin-screw extruder in real time through the use of polymeric stress beads. Through this technique, certain mixing scale-up rules can be analyzed. The following research investigated two different scale-up rules. The industry standard for mixing scale-up takes the ratio of outer diameters cubed to convert the volumetric flow rate from the smaller process to a flow rate appropriate to the larger machine. This procedure then fixes both sets of operating conditions, since shear rate remains constant. The second rule studied is based on percent drag flow, or the fraction of pumping potential, for different elements along the screw configuration. The percent drag flow rule aims to bring greater focus to operating conditions when scaling up with respect to dispersive mixing. Through the use of the RSD methodology and a Design of Experiment (DOE) approach, rigorous statistical analysis was used to compare the validity of the two scale-up rules.
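    The industry-standard rule described above is a one-line calculation: throughput scales with the cube of the screw outer-diameter ratio at constant screw speed, preserving shear rate. The machine sizes and rate below are illustrative, not from the study:

    ```python
    # Industry-standard twin-screw throughput scale-up: Q2 = Q1 * (D2/D1)**3,
    # with screw speed held constant so shear rate is preserved.
    def scale_up_rate(Q1, D1, D2):
        """Scale throughput by the cube of the outer-diameter ratio."""
        return Q1 * (D2 / D1) ** 3

    Q_lab = 10.0  # kg/h on a hypothetical 27 mm lab extruder
    Q_prod = scale_up_rate(Q_lab, 27.0, 58.0)  # hypothetical 58 mm machine
    ```

    The percent-drag-flow rule, by contrast, sets operating conditions element by element along the screw, which is why the two rules can prescribe different large-machine conditions for the same lab process.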

  9. A quality by design approach to scale-up of high-shear wet granulation process.

    PubMed

    Pandey, Preetanshu; Badawy, Sherif

    2016-01-01

    High-shear wet granulation is a complex process, which in turn makes scale-up a challenging task. Scale-up of the high-shear wet granulation process has been studied extensively in the past, with various methodologies proposed in the literature. This review article discusses existing scale-up principles and categorizes the various approaches into two main scale-up strategies - parameter-based and attribute-based. With the advent of the quality by design (QbD) principle in the drug product development process, an increased emphasis toward the latter approach may be needed to ensure product robustness. In practice, a combination of both scale-up strategies is often utilized. In a QbD paradigm, there is also a need for an increased fundamental and mechanistic understanding of the process. This can be achieved either by increased experimentation, which comes at higher cost, or by using modeling techniques, which are also discussed as part of this review. PMID:26489403

  10. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
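    The contrast between a deterministic single-valued prediction and a probabilistic one can be sketched with Monte Carlo propagation of input uncertainty through a stand-in solar-array power model. This is not the SPACE model; the parameters and distributions are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000  # Monte Carlo samples

    # Stand-in power model: P = area * insolation * efficiency * degradation.
    area = 100.0                              # m^2, fixed
    insolation = 1361.0                       # W/m^2, fixed
    efficiency = rng.normal(0.14, 0.005, N)   # uncertain cell efficiency
    degradation = rng.uniform(0.90, 0.98, N)  # uncertain degradation factor

    power = area * insolation * efficiency * degradation
    p_mean = power.mean()
    p5, p95 = np.percentile(power, [5, 95])   # capability with uncertainty bounds
    ```

    A deterministic run would return a single number like `p_mean`; the probabilistic run additionally yields the 5th-95th percentile band, quantifying how input uncertainty limits confidence in the capability prediction.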

  11. Scaling up breastfeeding programmes in a complex adaptive world.

    PubMed

    Pérez-Escamilla, Rafael; Hall Moran, Victoria

    2016-07-01

    The 2016 Breastfeeding Lancet Series continues to provide unequivocal evidence regarding the numerous benefits that optimal breastfeeding practices offer to children and women worldwide and the major savings that improving these practices can have as a result of their major public health benefits. Unfortunately, this knowledge remains underutilized, as there has been little progress scaling up effective breastfeeding programmes globally. Improving the uptake and scaling up of effective national breastfeeding programmes that are potent enough to improve exclusive breastfeeding duration should be a top priority for all countries. Longitudinal complex adaptive systems research is needed to understand how best to empower decision makers to achieve this goal through well-validated participatory decision-making tools to help their countries assess baseline needs, including costs, as well as progress with their scaling-up efforts. Sound systems thinking frameworks and scaling-up models are now available to guide and prospectively study future scaling-up efforts that can be replicated, with proper adaptations, across countries. PMID:27161881

  12. Soil Moisture Estimation under Vegetation Applying Polarimetric Decomposition Techniques

    NASA Astrophysics Data System (ADS)

    Jagdhuber, T.; Schön, H.; Hajnsek, I.; Papathanassiou, K. P.

    2009-04-01

    Polarimetric decomposition techniques and inversion algorithms are developed and applied to the OPAQUE data set acquired in spring 2007 to investigate their potential and limitations for soil moisture estimation. A three-component model-based decomposition is used together with an eigenvalue decomposition in a combined approach to invert for soil moisture over bare and vegetated soils at L-band. The applied approach demonstrates a feasible capability to invert soil moisture after decomposing the volume and ground scattering components over agricultural land surfaces. But there are still deficiencies in modeling the volume disturbance. The results show a root mean square error below 8.5 vol.-% for the winter crop fields (winter wheat, winter triticale and winter barley) and below 11.5 vol.-% for the summer crop field (summer barley), whereas all fields have a distinct volume layer of 55-85 cm height.

  13. Image reconstruction techniques applied to nuclear mass models

    NASA Astrophysics Data System (ADS)

    Morales, Irving O.; Isacker, P. Van; Velazquez, V.; Barea, J.; Mendoza-Temis, J.; Vieyra, J. C. López; Hirsch, J. G.; Frank, A.

    2010-02-01

    A new procedure is presented that combines well-known nuclear models with image reconstruction techniques. A color-coded image is built by taking the differences between measured masses and the predictions given by the different theoretical models. This image is viewed as part of a larger array in the (N,Z) plane, where unknown nuclear masses are hidden, covered by a “mask.” We apply a suitably adapted deconvolution algorithm, used in astronomical observations, to “open the window” and see the rest of the pattern. We show that it is possible to significantly improve mass predictions in regions not too far from measured nuclear masses.

  14. Estimating Population Size Using the Network Scale Up Method

    PubMed Central

    Maltiel, Rachael; Raftery, Adrian E.; McCormick, Tyler H.; Baraff, Aaron J.

    2015-01-01

    We develop methods for estimating the size of hard-to-reach populations from data collected using network-based questions on standard surveys. Such data arise by asking respondents how many people they know in a specific group (e.g. people named Michael, intravenous drug users). The Network Scale up Method (NSUM) is a tool for producing population size estimates using these indirect measures of respondents’ networks. Killworth et al. (1998a,b) proposed maximum likelihood estimators of population size for a fixed effects model in which respondents’ degrees or personal network sizes are treated as fixed. We extend this by treating personal network sizes as random effects, yielding principled statements of uncertainty. This allows us to generalize the model to account for variation in people’s propensity to know people in particular subgroups (barrier effects), such as their tendency to know people like themselves, as well as their lack of awareness of or reluctance to acknowledge their contacts’ group memberships (transmission bias). NSUM estimates also suffer from recall bias, in which respondents tend to underestimate the number of members of larger groups that they know, and conversely for smaller groups. We propose a data-driven adjustment method to deal with this. Our methods perform well in simulation studies, generating improved estimates and calibrated uncertainty intervals, as well as in back estimates of real sample data. We apply them to data from a study of HIV/AIDS prevalence in Curitiba, Brazil. Our results show that when transmission bias is present, external information about its likely extent can greatly improve the estimates. The methods are implemented in the NSUM R package. PMID:26949438
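    As a sketch of the scale-up idea underlying NSUM, the basic Killworth et al. fixed-degree estimator (not the random-effects extension this paper develops) can be written in a few lines; the survey numbers below are fabricated for illustration:

```python
import numpy as np

def killworth_estimate(known_counts, known_sizes, hidden_counts, total_pop):
    """Basic Killworth et al. (1998) network scale-up estimator.

    known_counts:  (n_respondents, n_known_groups) answers to "how many X do you know?"
    known_sizes:   sizes of those known subpopulations
    hidden_counts: per-respondent counts for the hidden population
    total_pop:     size of the whole population
    """
    # Maximum likelihood estimate of each respondent's degree (network size)
    degrees = total_pop * known_counts.sum(axis=1) / np.sum(known_sizes)
    # Ratio estimator for the hidden population size
    return total_pop * hidden_counts.sum() / degrees.sum()

# Fabricated survey: 50 respondents, two known groups of 10,000 and 20,000 people
known = np.tile([5.0, 10.0], (50, 1))
estimate = killworth_estimate(known, np.array([10000.0, 20000.0]),
                              np.full(50, 2.5), 1_000_000)
```

    With these internally consistent inputs the estimator returns a hidden-population size of 5,000; the paper's contribution is to replace the fixed degrees with random effects so that barrier, transmission and recall biases can be modeled and uncertainty calibrated.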

  15. Brazilian meningococcal C conjugate vaccine: Scaling up studies.

    PubMed

    Bastos, Renata Chagas; de Souza, Iaralice Medeiros; da Silva, Milton Neto; Silva, Flavia de Paiva; Figueira, Elza Scott; Leal, Maria de Lurdes; Jessouroun, Ellen; da Silva, José Godinho; Medronho, Ricardo de Andrade; da Silveira, Ivna Alana Freitas Brasileiro

    2015-08-20

    Several outbreaks caused by Neisseria meningitidis group C have occurred in different regions of Brazil. A conjugate vaccine for Neisseria meningitidis was produced by chemical linkage between periodate-oxidized meningococcal C polysaccharide and hydrazide-activated monomeric tetanus toxoid via a modified reductive amination conjugation method. Vaccine safety and immunogenicity tested in Phase I and II trials showed satisfactory results. Before starting Phase III trials, vaccine production was scaled up to obtain industrial lots under Good Manufacturing Practices (GMP). Comparative analysis between data obtained from industrial and pilot scales of the meningococcal C conjugate bulk showed similar execution times in the scaled-up production process, without significant losses or alterations in the quality attributes of purified compounds. In conclusion, the scale-up was considered satisfactory, and production of the Brazilian meningococcal conjugate vaccine for Phase III trials is feasible. PMID:25865466

  16. Scaling Up Impact on Nutrition: What Will It Take?

    PubMed Central

    Gillespie, Stuart; Menon, Purnima; Kennedy, Andrew L

    2015-01-01

    Despite consensus on actions to improve nutrition globally, less is known about how to operationalize the right mix of actions—nutrition-specific and nutrition-sensitive—equitably, at scale, in different contexts. This review draws on a large scaling-up literature search and 4 case studies of large-scale nutrition programs with proven impact to synthesize critical elements for impact at scale. Nine elements emerged as central: 1) having a clear vision or goal for impact; 2) intervention characteristics; 3) an enabling organizational context for scaling up; 4) establishing drivers such as catalysts, champions, systemwide ownership, and incentives; 5) choosing contextually relevant strategies and pathways for scaling up; 6) building operational and strategic capacities; 7) ensuring adequacy, stability, and flexibility of financing; 8) ensuring adequate governance structures and systems; and 9) embedding mechanisms for monitoring, learning, and accountability. Translating current political commitment to large-scale impact on nutrition will require robust attention to these elements. PMID:26178028

  17. Advances and Practices of Bioprocess Scale-up.

    PubMed

    Xia, Jianye; Wang, Guan; Lin, Jihan; Wang, Yonghong; Chu, Ju; Zhuang, Yingping; Zhang, Siliang

    2016-01-01

    This chapter addresses recent progress in bioprocess engineering. In addition to an overview of the theory of multi-scale analysis for fermentation processes, examples of scale-up practice combining microbial physiological parameters with bioreactor fluid dynamics are also described. Furthermore, the methodology for process optimization and bioreactor scale-up by integrating fluid dynamics with biokinetics is highlighted. In addition to a short review of the heterogeneous environment in large-scale bioreactors and its effects, a scale-down strategy for investigating this issue is addressed. Mathematical models and simulation methodology for integrating the flow field in the reactor with the microbial kinetic response are described. Finally, a comprehensive discussion of the advantages and challenges of the model-driven scale-up method is given at the end of this chapter. PMID:25636486

  18. Scaling up impact on nutrition: what will it take?

    PubMed

    Gillespie, Stuart; Menon, Purnima; Kennedy, Andrew L

    2015-07-01

    Despite consensus on actions to improve nutrition globally, less is known about how to operationalize the right mix of actions (nutrition-specific and nutrition-sensitive) equitably, at scale, in different contexts. This review draws on a large scaling-up literature search and 4 case studies of large-scale nutrition programs with proven impact to synthesize critical elements for impact at scale. Nine elements emerged as central: 1) having a clear vision or goal for impact; 2) intervention characteristics; 3) an enabling organizational context for scaling up; 4) establishing drivers such as catalysts, champions, systemwide ownership, and incentives; 5) choosing contextually relevant strategies and pathways for scaling up; 6) building operational and strategic capacities; 7) ensuring adequacy, stability, and flexibility of financing; 8) ensuring adequate governance structures and systems; and 9) embedding mechanisms for monitoring, learning, and accountability. Translating current political commitment to large-scale impact on nutrition will require robust attention to these elements. PMID:26178028

  19. Applying field mapping refractive beam shapers to improve holographic techniques

    NASA Astrophysics Data System (ADS)

    Laskin, Alexander; Williams, Gavin; McWilliam, Richard; Laskin, Vadim

    2012-03-01

    The performance of various holographic techniques can be substantially improved by homogenizing the intensity profile of the laser beam using beam-shaping optics, for example achromatic field mapping refractive beam shapers such as the πShaper. The operational principle of these devices is the transformation of the laser beam intensity from a Gaussian to a flattop profile with high flatness of the output wavefront, preserving beam consistency, providing a collimated output beam of low divergence, high transmittance, extended depth of field and negligible residual wave aberration; the achromatic design provides the capability to work with several laser sources of different wavelengths simultaneously. Applying these beam shapers brings significant benefits to spatial light modulator (SLM) based techniques such as computer-generated holography or dot-matrix mastering of security holograms, since uniform illumination of an SLM simplifies the mathematical calculations and increases the predictability and reliability of the imaging results. Another example is multicolour Denisyuk holography, where the achromatic πShaper provides uniform illumination of a field at various wavelengths simultaneously. This paper describes some design basics of the field mapping refractive beam shapers and optical layouts for applying them in holographic systems. Examples of real implementations and experimental results are presented as well.

  20. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    NASA Astrophysics Data System (ADS)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). Such an approach is based on an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, as they result along a pre-defined alignment on a properly energized sample surface. The study investigated the possibility of applying HSI techniques for classification of different types of wheat kernels: vitreous, yellow berry and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies were acquired by a laboratory device equipped with an HSI system working in the near-infrared range (1000-1700 nm). The hypercubes were analyzed applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select some effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only considering the entire investigated wavelength range, but also selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed procedures based on HSI can be utilized for quality control purposes or for the definition of innovative sorting logics for wheat.
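    The PCA step the abstract describes can be sketched with plain NumPy on fabricated spectra. The band count and wavelength range follow the abstract; everything else, including the nearest-centroid classifier standing in for PLS-DA, is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
n_per, n_bands = 30, 121
wavelengths = np.linspace(1000, 1700, n_bands)     # nm, as in the abstract
# Fabricated reflectance spectra for three kernel classes
X = np.vstack([rng.normal(m, 0.02, (n_per, n_bands)) for m in (0.3, 0.5, 0.7)])
y = np.repeat([0, 1, 2], n_per)

# PCA via SVD on mean-centred spectra
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                             # first two component scores

# Nearest-centroid classification in the reduced space (PLS-DA stand-in)
centroids = np.array([scores[y == k].mean(axis=0) for k in range(3)])
pred = np.argmin(((scores[:, None, :] - centroids) ** 2).sum(axis=-1), axis=1)
accuracy = (pred == y).mean()
```

    On such well-separated synthetic classes the projection onto two components is enough for near-perfect classification; real kernel spectra would require the full PLS-DA treatment and wavelength selection described above.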

  1. Scaling-Up Successfully: Pathways to Replication for Educational NGOs

    ERIC Educational Resources Information Center

    Jowett, Alice; Dyer, Caroline

    2012-01-01

    Non-government organisations (NGOs) are big players in international development, critical to the achievement of the Millennium Development Goals (MDGs) and constantly under pressure to "achieve more". Scaling-up their initiatives successfully and sustainably can be an efficient and cost effective way for NGOs to increase their impact across a…

  2. Three Collaborative Models for Scaling Up Evidence-Based Practices

    PubMed Central

    Roberts, Rosemarie; Jones, Helen; Marsenich, Lynne; Sosna, Todd; Price, Joseph M.

    2015-01-01

    The current paper describes three models of research-practice collaboration to scale up evidence-based practices (EBP): (1) the Rolling Cohort model in England, (2) the Cascading Dissemination model in San Diego County, and (3) the Community Development Team model in 53 California and Ohio counties. Multidimensional Treatment Foster Care (MTFC) and KEEP are the focal evidence-based practices that are designed to improve outcomes for children and families in the child welfare, juvenile justice, and mental health systems. The three scale-up models each originated from collaboration between community partners and researchers with the shared goal of widespread implementation and sustainability of MTFC/KEEP. The three models were implemented in a variety of contexts; Rolling Cohort was implemented nationally, Cascading Dissemination was implemented within one county, and Community Development Team was targeted at the state level. The current paper presents an overview of the development of each model, the policy frameworks in which they are embedded, system challenges encountered during scale-up, and lessons learned. Common elements of successful scale-up efforts, barriers to success, factors relating to enduring practice relationships, and future research directions are discussed. PMID:21484449

  3. New tuberculosis technologies: challenges for retooling and scale-up.

    PubMed

    Pai, M; Palamountain, K M

    2012-10-01

    The availability of new tools does not mean that they will be adopted, used correctly, scaled up or have public health impact. Experience to date with new diagnostics suggests that many national tuberculosis programmes (NTPs) in high-burden countries are reluctant to adopt and scale up new tools, even when these are backed by evidence and global policy recommendations. We suggest that there are several common barriers to effective national adoption and scale-up of new technologies: global policy recommendations that do not provide sufficient information for scale-up, complex decision-making processes and weak political commitment at the country level, limited engagement of and support to NTP managers, high cost of tools and poor fit with user needs, unregulated markets and inadequate business models, limited capacity for laboratory strengthening and implementation research, and insufficient advocacy and donor support. Overcoming these barriers will require enhanced country-level advocacy, resources, technical assistance and political commitment. Some of the BRICS (Brazil, Russia, India, China, South Africa) countries are emerging as early adopters of policies and technologies, and are increasing their investments in TB control. They may provide the first opportunities to fully assess the public health impact of new tools. PMID:23107630

  4. Scaling up Education Reform: Addressing the Politics of Disparity

    ERIC Educational Resources Information Center

    Bishop, Russell; O'Sullivan, Dominic; Berryman, Mere

    2010-01-01

    What is school reform? What makes it sustainable? Who needs to be involved? How is scaling up achieved? This book is about the need for educational reforms that have built into them, from the outset, those elements that will see them sustained in the original sites and spread to others. Using the Te Kotahitanga Project as a model the authors…

  5. Charter Operators Spell Out Barriers to "Scaling Up"

    ERIC Educational Resources Information Center

    Zehr, Mary Ann

    2011-01-01

    The pace at which the highest-performing charter-management organizations (CMOs) are "scaling up" is being determined largely by how rapidly they can develop and hire strong leaders and acquire physical space, and by the level of support they receive for growth from city or state policies, say leaders from some charter organizations viewed by…

  6. Sustaining and Scaling up the Impact of Professional Development Programmes

    ERIC Educational Resources Information Center

    Zehetmeier, Stefan

    2015-01-01

    This paper deals with a crucial topic: which factors influence the sustainability and scale-up of a professional development programme's impact? Theoretical models and empirical findings from impact research (e.g. Zehetmeier and Krainer, "ZDM Int J Math" 43(6/7):875-887, 2011) and innovation research (e.g. Cobb and Smith,…

  7. Applying machine learning classification techniques to automate sky object cataloguing

    NASA Astrophysics Data System (ADS)

    Fayyad, Usama M.; Doyle, Richard J.; Weir, W. Nick; Djorgovski, Stanislav

    1993-08-01

    We describe the application of artificial intelligence machine learning techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Mt. Palomar Northern Sky Survey is nearly completed. This survey provides comprehensive coverage of the northern celestial hemisphere in the form of photographic plates. The plates are being transformed into digitized images whose quality will probably not be surpassed in the next ten to twenty years. The images are expected to contain on the order of 10^7 galaxies and 10^8 stars. Astronomers wish to determine which of these sky objects belong to various classes of galaxies and stars. Unfortunately, the size of this data set precludes analysis in an exclusively manual fashion. Our approach is to develop a software system which integrates the functions of independently developed techniques for image processing and data classification. Digitized sky images are passed through image processing routines to identify sky objects and to extract a set of features for each object. These routines are used to help select a useful set of attributes for classifying sky objects. Then GID3 (Generalized ID3) and O-B Tree, two inductive learning techniques, learn classification decision trees from examples. These classifiers are then applied to new data. This development process is highly interactive, with astronomer input playing a vital role. Astronomers refine the feature set used to construct sky object descriptions, and evaluate the performance of the automated classification technique on new data. This paper gives an overview of the machine learning techniques with an emphasis on their general applicability, describes the details of our specific application, and reports the initial encouraging results. The results indicate that our machine learning approach is well-suited to the problem. The primary benefit of the approach is increased data reduction throughput. Another benefit is
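    The core of ID3-style inductive learning mentioned in the abstract, choosing splits by information gain, can be sketched as follows; the two-feature "sky object" sample is fabricated for illustration:

```python
import numpy as np

def entropy(y):
    """Shannon entropy of a label vector, in bits."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def best_stump(X, y):
    """Pick the (feature, threshold) split maximizing information gain --
    the greedy step at the heart of ID3-style decision-tree induction."""
    base, best = entropy(y), (None, None, -1.0)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            if len(left) == 0 or len(right) == 0:
                continue
            gain = base - (len(left) * entropy(left)
                           + len(right) * entropy(right)) / len(y)
            if gain > best[2]:
                best = (j, t, gain)
    return best

# Toy "sky object" features: [brightness, ellipticity]; labels 0 = star, 1 = galaxy
X = np.array([[0.9, 0.1], [0.8, 0.2], [0.3, 0.7], [0.2, 0.8]])
y = np.array([0, 0, 1, 1])
feature, threshold, gain = best_stump(X, y)
```

    On this tiny sample the first split already separates the classes perfectly (information gain of 1 bit); a full ID3/GID3 tree would recurse on each impure branch.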

  8. Collaborative Group Learning using the SCALE-UP Pedagogy

    NASA Astrophysics Data System (ADS)

    Feldman, Gerald

    2011-10-01

    The time-honored conventional lecture ("teaching by telling") has been shown to be an ineffective mode of instruction for science classes. In these cases, where the enhancement of critical thinking skills and the development of problem-solving abilities are emphasized, collaborative group learning environments have proven to be far more effective. In addition, students naturally improve their teamwork skills through the close interaction they have with their group members. Early work on the Studio Physics model at Rensselaer Polytechnic Institute in the mid-1990's was extended to large classes via the SCALE-UP model pioneered at North Carolina State University a few years later. In SCALE-UP, students sit at large round tables in three groups of three; in this configuration, they carry out a variety of pencil/paper exercises (ponderables) using small whiteboards and perform hands-on activities like demos and labs (tangibles) throughout the class period. They also work on computer simulations using a shared laptop for each group of three. Formal lecture is reduced to a minimal level and the instructor serves more as a "coach" to facilitate the academic "drills" that the students are working on. Since its inception in 1997, the SCALE-UP pedagogical approach has been adopted by over 100 institutions across the country and about 20 more around the world. In this talk, I will present an overview of the SCALE-UP concept and I will outline the details of its deployment at George Washington University over the past 4 years. I will also discuss empirical data from assessments given to the SCALE-UP collaborative classes and the regular lecture classes at GWU in order to make a comparative study of the effectiveness of the two methodologies.

  9. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
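    The KBANN idea of mapping an inference rule into network weights can be illustrated with a toy rule. The consensus pattern and the "at least 3 of 4" threshold below are invented for the sketch; the real algorithm handles hierarchies of rules and then refines the weights with neural network training:

```python
import numpy as np

# Map the symbolic rule "site if at least 3 of the 4 consensus positions
# match" into a single threshold unit (KBANN-style initialization).
consensus = np.array([1, 1, 0, 1])               # toy consensus over binary features
weights = np.where(consensus == 1, 1.0, -1.0)    # matching a 1 helps, mismatching hurts
n_zero = np.sum(consensus == 0)
required = 3
# A zero-position contributes a match when x_i = 0, so the number of matches
# equals n_zero + weights . x; the unit fires when that count reaches `required`.
bias = -(required - n_zero) + 0.5

def unit(x):
    return float(weights @ x + bias > 0)

hit = unit(np.array([1, 1, 0, 0]))    # 3 positions match -> fires (1.0)
miss = unit(np.array([1, 0, 0, 0]))   # 2 positions match -> silent (0.0)
```

    Because the rule is only a starting point, training examples can subsequently push `weights` and `bias` away from this symbolic initialization, which is exactly the refinement step the abstract describes.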

  10. Extrapolation techniques applied to matrix methods in neutron diffusion problems

    NASA Technical Reports Server (NTRS)

    Mccready, Robert R

    1956-01-01

    A general matrix method is developed for the solution of characteristic-value problems of the type arising in many physical applications. The scheme employed is essentially that of Gauss and Seidel, with appropriate modifications needed to make it applicable to characteristic-value problems. An iterative procedure produces a sequence of estimates to the answer; and extrapolation techniques, based upon previous behavior of iterants, are utilized in speeding convergence. Theoretically sound limits are placed on the magnitude of the extrapolation that may be tolerated. This matrix method is applied to the problem of finding criticality and neutron fluxes in a nuclear reactor with control rods. The two-dimensional finite-difference approximation to the two-group neutron-diffusion equations is treated. Results for this example are indicated.
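    In the same spirit, an iterative characteristic-value solver accelerated by extrapolation on the iterant history can be sketched with power iteration plus Aitken's delta-squared process on the eigenvalue estimates (a simple stand-in for the paper's modified Gauss-Seidel scheme):

```python
import numpy as np

def dominant_eig(A, tol=1e-10, max_iter=500):
    """Power iteration whose Rayleigh-quotient eigenvalue estimates are
    accelerated with Aitken's delta-squared extrapolation, illustrating the
    use of iterant history to speed convergence."""
    x = np.arange(1.0, A.shape[0] + 1.0)   # arbitrary non-degenerate start
    history = []
    lam = 0.0
    for _ in range(max_iter):
        y = A @ x
        history.append(y @ x / (x @ x))    # Rayleigh-quotient estimate
        x = y / np.linalg.norm(y)
        lam = history[-1]
        if len(history) >= 3:
            l0, l1, l2 = history[-3:]
            denom = l2 - 2.0 * l1 + l0
            if abs(denom) > 1e-30:
                lam = l2 - (l2 - l1) ** 2 / denom   # extrapolated estimate
            if abs(l2 - l1) < tol:
                break
    return lam, x

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # eigenvalues 1 and 3
lam, vec = dominant_eig(A)
```

    For this symmetric test matrix the routine converges to the dominant eigenvalue 3; the extrapolated estimate is markedly more accurate than the raw Rayleigh-quotient sequence at each step, which is the effect the abstract exploits.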

  11. Vibration Monitoring Techniques Applied to Detect Damage in Rotating Disks

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, Andrew L.; Sawicki, Jerzy T.

    2002-01-01

    Rotor health monitoring and online damage detection are increasingly gaining the interest of the manufacturers of aircraft engines. This is primarily due to the need for improved safety during operation as well as the need for lower maintenance costs. Applied techniques for detecting damage in and monitoring the health of rotors are essential for engine safety, reliability, and life prediction. The goals of engine safety are addressed within the NASA-sponsored Aviation Safety Program (AvSP). AvSP provides research and technology products needed to help the Federal Aviation Administration and the aerospace industry improve aviation safety. The Nondestructive Evaluation Group at the NASA Glenn Research Center is addressing propulsion health management and the development of propulsion-system-specific technologies intended to detect potential failures prior to catastrophe.

  12. Image analysis technique applied to lock-exchange gravity currents

    NASA Astrophysics Data System (ADS)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with a horizontal smooth bed, and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image, relating the amount of dye uniformly distributed in the tank to the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
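    A minimal version of the per-pixel calibration and mass-conservation correction might look like the following; array sizes, pixel responses and concentrations are fabricated for the sketch:

```python
import numpy as np

# Hypothetical calibration: each pixel's grey level is recorded at several
# known, uniform dye concentrations; fit a per-pixel linear map grey -> density.
rng = np.random.default_rng(1)
ny, nx = 4, 5
known_conc = np.array([0.0, 0.5, 1.0])            # calibration concentrations
gain = rng.uniform(0.8, 1.2, (ny, nx))            # pixel-dependent response
calib = known_conc[:, None, None] * gain + 0.1    # stack of calibration images

# Least-squares slope and intercept for every pixel at once
A = np.stack([known_conc, np.ones_like(known_conc)], axis=1)   # (3, 2)
coef, *_ = np.linalg.lstsq(A, calib.reshape(3, -1), rcond=None)
slope, intercept = (c.reshape(ny, nx) for c in coef)

# Convert an instantaneous image to a density field, then rescale so the
# total mass matches the known initial mass (conservation correction)
image = 0.7 * gain + 0.1
density = (image - intercept) / slope
true_mass = 0.7 * ny * nx
density *= true_mass / density.sum()
```

    In a real experiment the conservation step corrects for lighting drift and calibration error rather than being an identity, but the structure (per-pixel fit, then a global mass rescale) is the same.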

  13. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W. (Dept. of Computer Sciences); Noordewier, M.O. (Dept. of Computer Science)

    1992-01-01

    We are primarily developing a machine learning (ML) system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information, our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, our KBANN algorithm maps inference rules about a given recognition task into a neural network. Neural network training techniques then use the training examples to refine these inference rules. We call these rules a domain theory, following the convention in the machine learning community. We have been applying this approach to several problems in DNA sequence analysis. In addition, we have been extending the capabilities of our learning system along several dimensions. We have also been investigating parallel algorithms that perform sequence alignments in the presence of frameshift errors.

  14. Airflow measurement techniques applied to radon mitigation problems

    SciTech Connect

    Harrje, D.T.; Gadsby, K.J.

    1989-01-01

    During the past decade a multitude of diagnostic procedures associated with the evaluation of air infiltration and air leakage sites have been developed. The spirit of international cooperation and exchange of ideas within the AIC-AIVC conferences has greatly facilitated the adoption and use of these measurement techniques in the countries participating in Annex V. But wide application of such diagnostic methods is not limited to air infiltration alone. The subject of this paper concerns the ways to evaluate and improve radon reduction in buildings using diagnostic methods directly related to developments familiar to the AIVC. Radon problems are certainly not unique to the United States, and the methods described here have to a degree been applied by researchers of other countries faced with similar problems. The radon problem involves more than a harmful pollutant of the living spaces of our buildings -- it also involves energy to operate radon removal equipment and the loss of interior conditioned air as a direct result. The techniques used for air infiltration evaluation will be shown to be very useful in dealing with the radon mitigation challenge. 10 refs., 7 figs., 1 tab.

  15. Scaling up adsorption media reactors for copper removal with the aid of dimensionless numbers.

    PubMed

    Chang, Ni-Bin; Houmann, Cameron; Wanielista, Martin

    2016-02-01

    Adsorption media may be used to sorb copper in an aquatic environment for pollution control. Effective design of adsorption media reactors is highly dependent on selection of the hydraulic residence time when scaling up a pilot-scale reactor to a field-scale reactor. This paper seeks to improve the scaling-up technique in the reactor design process through the use of the Damköhler and Péclet numbers via dimensional analysis. A new scaling-up theory is developed in this study through a joint consideration of the Damköhler and Péclet numbers for a constant media particle size, such that a balance between transport control and reaction control can be harmonized. A series of column breakthrough tests at varying hydraulic residence times revealed a clear peak adsorption capacity at a Damköhler number of 2.74. The Péclet numbers for the column breakthrough tests indicated that mechanical dispersion is an important effect that requires further consideration in the scaling-up process. However, perfect similitude of the Damköhler number cannot be maintained for a constant media particle size, and relaxation of hydrodynamic similitude through variation of the Péclet number must occur. PMID:26454119
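    The dimensionless groups involved are simple to compute. The sketch below back-solves the hydraulic residence time that would hit the reported optimum Da = 2.74; the first-order rate constant and column parameters are assumed purely for illustration:

```python
def damkohler(rate_constant, residence_time):
    """First-order Damkohler number: reaction rate constant x residence time."""
    return rate_constant * residence_time

def peclet(velocity, length, dispersion):
    """Column Peclet number: ratio of advective to dispersive transport."""
    return velocity * length / dispersion

# Hypothetical pilot column: choose the hydraulic residence time that gives
# the reported optimum Da = 2.74 for an assumed sorption rate constant.
k = 0.02                      # 1/min (assumed)
tau = 2.74 / k                # ~137 min hydraulic residence time
Da = damkohler(k, tau)
Pe = peclet(velocity=0.5, length=30.0, dispersion=5.0)   # cm/min, cm, cm^2/min
```

    The paper's point is that holding Da at its optimum while scaling the column changes the velocity and length, so Pe (and hence the relative importance of mechanical dispersion) cannot be held constant at the same time.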

  16. Scaling up high-impact interventions: how is it done?

    PubMed

    Smith, Jeffrey Michael; de Graft-Johnson, Joseph; Zyaee, Pashtoon; Ricca, Jim; Fullerton, Judith

    2015-06-01

    Building upon the World Health Organization's ExpandNet framework, 12 key principles of scale-up have emerged from the implementation of maternal and newborn health interventions. These principles are illustrated by three case studies of scale up of high-impact interventions: the Helping Babies Breathe initiative; pre-service midwifery education in Afghanistan; and advanced distribution of misoprostol for self-administration at home births to prevent postpartum hemorrhage. Program planners who seek to scale a maternal and/or newborn health intervention must ensure that: the necessary evidence and mechanisms for local ownership for the intervention are well-established; the intervention is as simple and cost-effective as possible; and the implementers and beneficiaries of the intervention are working in tandem to build institutional capacity at all levels and in consideration of all perspectives. PMID:26115856

  17. Scale-Up of Advanced Hot-Gas Desulfurization Sorbents

    SciTech Connect

    Jothimurugesan, K.; Gangwal, Santosh K.

    1996-10-14

    The overall objective of this project is to develop regenerable sorbents for hot gas desulfurization in IGCC systems. The specific objective of the project is to develop durable advanced sorbents that demonstrate a strong resistance to attrition and chemical deactivation, and high activity at temperatures as low as 343 C (650 F). A number of formulations will be prepared and screened in a 1/2-inch fixed bed reactor at high pressure (1 to 20 atm) and high temperatures using simulated coal-derived fuel gases. Screening criteria will include chemical reactivity, stability, and regenerability over the temperature range of 343 C to 650 C. After initial screening, at least 3 promising formulations will be tested for 25-30 cycles of absorption and regeneration. One of the superior formulations with the best cyclic performance will be selected for investigating scale-up parameters. The scaled-up formulation will be tested for long-term durability and chemical reactivity.

  18. Scaling up of manufacturing processes of recycled carpet based composites

    NASA Astrophysics Data System (ADS)

    Lakshminarayanan, Krishnan

    2011-12-01

    In this work, the feasibility of recycling post-consumer carpets into large-scale components using a modified vacuum-assisted resin molding process was successfully demonstrated. The scale-up also included the incorporation of nano-clay films in the carpet composites. It is expected that the films will enhance the ability of the composite to withstand environmental degradation and also serve as a fire retardant. Low-cost resins were used to fabricate the recycled carpet-based composites. The scale-up in terms of process was achieved by manufacturing composites without a hot press, thereby saving additional equipment cost. Mechanical and physical properties were evaluated. Large-scale samples demonstrated mechanical properties that were different from results from small samples. Acoustic tests indicate good sound absorption of the carpet composite. A cost analysis of the composite material based on the cost of the raw materials and the manufacturing process has been presented.

  19. Drug nanocrystals: A way toward scale-up.

    PubMed

    Raghava Srivalli, Kale Mohana; Mishra, Brahmeshwar

    2016-07-01

    Drug nanocrystals are unique drug delivery platforms that play a distinctive and increasingly important role in drug delivery, and both industry and academia are investing considerable time and money in developing nanocrystal products. Current research in this field shows a clear shift from lab-scale optimization studies to scale-up-focused studies. In this emerging scenario of nanocrystal technology, a review of exemplary, ongoing research studies with either scalability as their objective or upscaling as their future scope may smooth future upscaling attempts in this field. Hence, this paper reviews such research works as case studies, since an analysis of these studies may provide useful knowledge for carrying out more scale-up-oriented research on nanocrystals. PMID:27330370

  20. Routes to scaling up the fruits of biotechnology

    SciTech Connect

    Spalding, B.J.

    1985-07-17

    A fed-batch process without recycling, a fed-batch process with recycling, a straight batch process, a continuous growth process, and a multichamber process are some of the routes used for scaling up the fruits of biotechnology. Although computers can help optimize the process, the best bet is still the intelligent guess. Important considerations in scaling up include the imprecision of current chemical tests for measuring the amount of sugar in a fermentation broth, temperature control, and separation procedures. The type of medium to be used must also be considered: for example, because the cost of corn accounts for half the production cost of ethanol from corn, it is imperative that only small amounts of carbon end up in non-product form.

  1. Scale-up in Poroelastic System and Applications to Reservoirs

    SciTech Connect

    Berryman, J G

    2003-07-01

    A fundamental problem of heterogeneous systems is that the macroscale behavior is not necessarily well described by equations familiar to us at the meso- or microscale. In relatively simple cases like electrical conduction and elasticity, it is true that the equations describing macroscale behavior take the same form as those at the microscale. But in more complex systems, these simple results do not hold. Consider fluid flow in porous media, where the microscale behavior is well described by the Navier-Stokes equations for liquid in the pores, while the macroscale behavior instead obeys Darcy's equation. Rigorous methods for establishing the form of such macroscale equations include multiscale homogenization methods and the volume averaging method. In addition, it has been shown that Biot's equations of poroelasticity follow from a scale-up of the microscale equations of elasticity coupled to Navier-Stokes. Laboratory measurements have shown that Biot's equations indeed hold for simple systems, but heterogeneous systems can behave quite differently. So the question arises whether yet another level of scale-up is needed to arrive at equations valid at the reservoir scale and, if so, whether those equations take the form of Biot's equations or some other form. We discuss these issues and show that the double-porosity equations play a special role in the scale-up to equations describing reservoir behavior, for fluid pumping and geomechanics as well as for seismic wave propagation.
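The macroscale Darcy description contrasted with Navier-Stokes above can be illustrated with a minimal calculation; the permeability, viscosity, and pressure-gradient values below are illustrative assumptions, not values from the study.

```python
# Darcy's law at the macroscale: q = -(k / mu) * dP/dx
# All numbers are illustrative assumptions, not from the study.
k = 1e-13      # permeability, m^2 (~100 millidarcy)
mu = 1e-3      # water viscosity, Pa*s
dP_dx = -1e4   # pressure gradient, Pa/m

q = -(k / mu) * dP_dx  # Darcy flux (superficial velocity), m/s
print(f"Darcy flux: {q:.2e} m/s")
```

The microscale Navier-Stokes problem would instead resolve the velocity field inside individual pores; Darcy's law replaces that detail with the single effective parameter k.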

  2. Thermographic techniques applied to solar collector systems analysis

    SciTech Connect

    Eden, A.

    1980-02-01

    The use of thermography to analyze large solar collector array systems under dynamic operating conditions is discussed. The research at the Solar Energy Research Institute (SERI) in this area has focused on thermographic techniques and equipment to determine temperature distributions, flow patterns, and air blockages in solar collectors. The results of this extensive study, covering many sites and types of collectors, illustrate the capabilities of infrared (IR) analysis as a qualitative analysis tool and operation and maintenance procedure when applied to large arrays. Thermographic analysis of most collector systems qualitatively showed relative temperature distributions that indicated balanced flow patterns. In three significant cases, blocked or broken collector arrays, which previously had gone undetected, were discovered. Using this analysis, validation studies of large computer codes could examine collector arrays for flow patterns or blockages that could cause disagreement between actual and predicted performance. Initial operation and balancing of large systems could be accomplished without complicated sensor systems not needed for normal operations. Maintenance personnel could quickly check their systems without climbing onto the roof and without complicated sensor systems.

  3. Scaling-up voluntary medical male circumcision – what have we learned?

    PubMed Central

    Ledikwe, Jenny H; Nyanga, Robert O; Hagon, Jaclyn; Grignon, Jessica S; Mpofu, Mulamuli; Semo, Bazghina-werq

    2014-01-01

    In 2007, the World Health Organization (WHO) and the joint United Nations agency program on HIV/AIDS (UNAIDS) recommended voluntary medical male circumcision (VMMC) as an add-on strategy for HIV prevention. Fourteen priority countries were tasked with scaling-up VMMC services to 80% of HIV-negative men aged 15–49 years by 2016, representing a combined target of 20 million circumcisions. By December 2012, approximately 3 million procedures had been conducted. Within the following year, there was marked improvement in the pace of the scale-up. During 2013, the total number of circumcisions performed nearly doubled, with approximately 6 million total circumcisions conducted by the end of the year, reaching 30% of the initial target. The purpose of this review article was to apply a systems thinking approach, using the WHO health systems building blocks as a framework to examine the factors influencing the scale-up of the VMMC programs from 2008–2013. Facilitators that accelerated the VMMC program scale-up included: country ownership; sustained political will; service delivery efficiencies, such as task shifting and task sharing; use of outreach and mobile services; disposable, prepackaged VMMC kits; external funding; and a standardized set of indicators for VMMC. A low demand for the procedure has been a major barrier to achieving circumcision targets, while weak supply chain management systems and the lack of adequate financial resources with a heavy reliance on donor support have also adversely affected scale-up. Health systems strengthening initiatives and innovations have progressively improved VMMC service delivery, but an understanding of the contextual barriers and the facilitators of demand for the procedure is critical in reaching targets. There is a need for countries implementing VMMC programs to share their experiences more frequently to identify and to enhance best practices by other programs. PMID:25336991

  4. Scaling-up voluntary medical male circumcision - what have we learned?

    PubMed

    Ledikwe, Jenny H; Nyanga, Robert O; Hagon, Jaclyn; Grignon, Jessica S; Mpofu, Mulamuli; Semo, Bazghina-Werq

    2014-01-01

    In 2007, the World Health Organization (WHO) and the joint United Nations agency program on HIV/AIDS (UNAIDS) recommended voluntary medical male circumcision (VMMC) as an add-on strategy for HIV prevention. Fourteen priority countries were tasked with scaling-up VMMC services to 80% of HIV-negative men aged 15-49 years by 2016, representing a combined target of 20 million circumcisions. By December 2012, approximately 3 million procedures had been conducted. Within the following year, there was marked improvement in the pace of the scale-up. During 2013, the total number of circumcisions performed nearly doubled, with approximately 6 million total circumcisions conducted by the end of the year, reaching 30% of the initial target. The purpose of this review article was to apply a systems thinking approach, using the WHO health systems building blocks as a framework to examine the factors influencing the scale-up of the VMMC programs from 2008-2013. Facilitators that accelerated the VMMC program scale-up included: country ownership; sustained political will; service delivery efficiencies, such as task shifting and task sharing; use of outreach and mobile services; disposable, prepackaged VMMC kits; external funding; and a standardized set of indicators for VMMC. A low demand for the procedure has been a major barrier to achieving circumcision targets, while weak supply chain management systems and the lack of adequate financial resources with a heavy reliance on donor support have also adversely affected scale-up. Health systems strengthening initiatives and innovations have progressively improved VMMC service delivery, but an understanding of the contextual barriers and the facilitators of demand for the procedure is critical in reaching targets. There is a need for countries implementing VMMC programs to share their experiences more frequently to identify and to enhance best practices by other programs. PMID:25336991

  5. Volcanic Monitoring Techniques Applied to Controlled Fragmentation Experiments

    NASA Astrophysics Data System (ADS)

    Kueppers, U.; Alatorre-Ibarguengoitia, M. A.; Hort, M. K.; Kremers, S.; Meier, K.; Scharff, L.; Scheu, B.; Taddeucci, J.; Dingwell, D. B.

    2010-12-01

    Volcanic eruptions are an inevitable natural threat. The range of eruptive styles is large, and short-term fluctuations of explosivity or vent position pose a large risk that is not necessarily confined to the immediate vicinity of a volcano; explosive eruptions may also affect aviation, infrastructure, and climate, regionally as well as globally. Multiparameter monitoring networks are deployed on many active volcanoes to record signs of magmatic processes and help elucidate the secrets of volcanic phenomena. However, our mechanistic understanding of many processes hidden in the recorded signals is still poor. As a direct consequence, a solid interpretation of the state of a volcano is still a challenge. In an attempt to bridge this gap, we combined volcanic monitoring and experimental volcanology. We performed 15 well-monitored, field-based experiments and fragmented natural rock samples from Colima volcano (Mexico) by rapid decompression. We used cylindrical samples of 60 mm height and 25 mm and 60 mm diameter, respectively, with 25 and 35 vol.% open porosity. The applied pressure ranged from 4 to 18 MPa. Using different experimental set-ups, the pressurised volume above the samples ranged from 60 to 170 cm3. The experiments were performed at ambient conditions and at controlled sample porosity and size, confinement geometry, and applied pressure. The experiments were thoroughly monitored with 1) Doppler radar (DR), 2) high-speed and high-definition cameras, 3) acoustic and infrasound sensors, 4) pressure transducers, and 5) electrically conducting wires. Our aim was to check for common results achieved by the different approaches and, if so, to calibrate state-of-the-art monitoring tools. We present how the velocity of the ejected pyroclasts was measured by and evaluated for the different approaches and how it was affected by the experimental conditions and sample characteristics. We show that all deployed instruments successfully measured the pyroclast

  6. Droplet size measurements for spray dryer scale-up.

    PubMed

    Thybo, Pia; Hovgaard, Lars; Andersen, Sune Klint; Lindeløv, Jesper Saederup

    2008-01-01

    This study was dedicated to facilitating scale-up in spray drying from an atomization standpoint. The purpose was to investigate differences in operating conditions between a pilot-scale and a production-scale nozzle. The intention was to identify the operating ranges in which the two nozzles produced similar droplet size distributions. Method optimization and validation were also covered. Externally mixing two-fluid nozzles of similar designs were used in this study. Both nozzles are typically used in commercially available spray dryers, and they were characterized with respect to droplet size distributions as a function of liquid type, liquid flow rate, atomization gas flow rate, liquid orifice diameter, and atomization gas orifice diameter. All droplet size measurements were carried out using the Malvern Spraytec with nozzle operating conditions corresponding to typical settings for spray drying. This gave droplets with Sauter Mean Diameters less than 40 microm, typically 5-20 microm. A model previously proposed by Mansour and Chigier was used to correlate the droplet size to the operating parameters. It was possible to make a correlation for water incorporating the droplet sizes of both the pilot-scale and the production-scale nozzle. However, a single correlation could not properly account for the physical properties of the liquid to be atomized; therefore, the droplet size distributions of ethanol could not be adequately predicted from the water data. This study has shown that it is possible to scale up from a pilot-scale to a production-scale nozzle in a systematic fashion, provided that the nozzles are geometrically similar. When externally mixing two-fluid nozzles are used as atomizers, the results obtained from this study could be a useful guideline for selecting appropriate operating conditions when scaling up the spray-drying process. PMID:18379901
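The abstract describes correlating droplet size to operating parameters. As a sketch of the general approach only (not the actual Mansour-Chigier correlation, whose functional form is not given in the abstract), a power-law fit of Sauter Mean Diameter against a single atomization parameter can be done in log space; the `alr` (air-to-liquid ratio) variable and all data values below are invented for illustration.

```python
import numpy as np

# Hypothetical data: Sauter Mean Diameter (SMD, micrometers) versus
# air-to-liquid mass ratio (ALR) for a two-fluid nozzle. Invented values.
alr = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
smd = np.array([38.0, 21.0, 12.0, 6.8, 3.9])

# Fit SMD = a * ALR**b by ordinary least squares in log-log space.
b, log_a = np.polyfit(np.log(alr), np.log(smd), 1)
a = np.exp(log_a)
print(f"SMD ~ {a:.1f} * ALR^{b:.2f} micrometers")
```

A fit of this kind captures one nozzle and one liquid; as the abstract notes, a single correlation built on water data did not transfer to ethanol, since liquid properties enter the real correlation as well.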

  7. Pretreatment optimization of Sorghum pioneer biomass for bioethanol production and its scale-up.

    PubMed

    Koradiya, Manoj; Duggirala, Srinivas; Tipre, Devayani; Dave, Shailesh

    2016-01-01

    Based on one-parameter-at-a-time experiments, saccharification of delignified sorghum biomass with 4% and 70% v/v sulfuric acid resulted in maximum sugar production of 30.8 and 33.8 g% from biomass, respectively. A Box-Behnken design was applied for further optimization of the acid hydrolysis. As a result of the designed experiment, 36.3 g% sugar production was achieved when 3% v/v H2SO4 treatment was given for 60 min at 180°C. The process was scaled up to treat 2 kg of biomass. During the screening of yeast cultures, isolates C, MK-I, and N were found to be potent ethanol producers from sorghum hydrolyzate. Culture MK-I was the best and was therefore used for scale-up of ethanol production to 25 L capacity, which gave a yield of 0.49 g ethanol/g sugar from the hydrolyzate obtained from 2 kg of sorghum biomass. PMID:26384087

  8. Scale-up of commercial PCFB boiler plant technology

    SciTech Connect

    Lamar, T.W.

    1993-10-01

    The DMEC-1 Demonstration Project will provide an 80 MWe commercial-scale demonstration of the Pressurized Circulating Fluidized Bed (PCFB) technology. Following confirmation of the PCFB design at the 80 MWe scale, the technology will be scaled to even larger commercial units. It is anticipated that the market for commercial-scale PCFB plants will exist predominantly in the utility and independent power producer (IPP) sectors. These customers will require the best possible plant efficiency and the lowest achievable emissions at competitive cost. This paper describes the PCFB technology and the expected performance of a nominal 400 MWe PCFB power plant; Illinois No. 6 coal was used as a representative fuel for the analysis. The description of the plant performance is followed by a discussion of the scale-up of the major PCFB components, such as the PCFB boiler, the pressure vessel, the ceramic filter, the coal/sorbent handling system, the gas turbine, the heat recovery unit, and the steam turbine, demonstrating the reasonableness of scale-up from the demonstration plant to a nominal 400 MWe unit.

  9. The First Scale-Up Production of Theranostic Nanoemulsions

    PubMed Central

    Liu, Lu; Bagia, Christina; Janjic, Jelena M.

    2015-01-01

    Abstract Theranostic nanomedicines are a promising new technological advancement toward personalized medicine. Although much progress has been made in pre-clinical studies, their clinical utilization is still under development. A key ingredient for successful theranostic clinical translation is pharmaceutical process design for production on a sufficient scale for clinical testing. In this study, we report, for the first time, a successful scale-up of a model theranostic nanoemulsion. Celecoxib-loaded near-infrared-labeled perfluorocarbon nanoemulsion was produced on three levels of scale (small at 54 mL, medium at 270 mL, and large at 1,000 mL) using microfluidization. The average size and polydispersity were not affected by the equipment used or production scale. The overall nanoemulsion stability was maintained for 90 days upon storage and was not impacted by nanoemulsion production scale or composition. Cell-based evaluations show comparable results for all nanoemulsions with no significant impact of nanoemulsion scale on cell toxicity and their pharmacological effects. This report serves as the first example of a successful scale-up of a theranostic nanoemulsion and a model for future studies on theranostic nanomedicine production and development. PMID:26309798

  10. Scale-Up of Advanced Hot-Gas desulfurization Sorbents.

    SciTech Connect

    Jothimurugesan, K.; Gangwal, S.K.

    1997-10-02

    The overall objective of this project is to develop regenerable sorbents for hot-gas desulfurization in IGCC systems. The specific objective of the project is to develop durable advanced sorbents that demonstrate a strong resistance to attrition and chemical deactivation, and high activity at temperatures as low as 343°C (650°F). A number of formulations will be prepared and screened in a one-half-inch fixed-bed reactor at high pressure (1 to 20 atm) and high temperatures using simulated coal-derived fuel gases. Screening criteria will include chemical reactivity, stability, and regenerability over the temperature range of 343°C to 650°C. After initial screening, at least 3 promising formulations will be tested for 25-30 cycles of absorption and regeneration. One of the superior formulations with the best cyclic performance will be selected for investigating scale-up parameters. The scaled-up formulation will be tested for long-term durability and chemical reactivity.

  11. Scale-Up of Advanced Hot-Gas Desulfurization Sorbents

    SciTech Connect

    Jothimurugesan, K.; Gangwal, S.K.

    1997-04-21

    The overall objective of this project is to develop regenerable sorbents for hot-gas desulfurization in IGCC systems. The specific objective of the project is to develop durable advanced sorbents that demonstrate a strong resistance to attrition and chemical deactivation, and high activity at temperatures as low as 343°C (650°F). A number of formulations will be prepared and screened in a 1/2-inch fixed-bed reactor at high pressure (1 to 20 atm) and high temperatures using simulated coal-derived fuel gases. Screening criteria will include chemical reactivity, stability, and regenerability over the temperature range of 343°C to 650°C. After initial screening, at least 3 promising formulations will be tested for 25-30 cycles of absorption and regeneration. One of the superior formulations with the best cyclic performance will be selected for investigating scale-up parameters. The scaled-up formulation will be tested for long-term durability and chemical reactivity.

  12. Photoacoustic technique applied to the study of skin and leather

    SciTech Connect

    Vargas, M.; Varela, J.; Hernandez, L.; Gonzalez, A.

    1998-08-28

    In this paper the photoacoustic technique is applied to bull skin to determine its thermal and optical properties as a function of the tanning process steps. Our results show that the photoacoustic technique is sensitive to physical changes in this kind of material due to the tanning process.

  13. Photoacoustic technique applied to the study of skin and leather

    NASA Astrophysics Data System (ADS)

    Vargas, M.; Varela, J.; Hernández, L.; González, A.

    1998-08-01

    In this paper the photoacoustic technique is applied to bull skin to determine its thermal and optical properties as a function of the tanning process steps. Our results show that the photoacoustic technique is sensitive to physical changes in this kind of material due to the tanning process.

  14. Signal detection techniques applied to the Chandler wobble

    NASA Technical Reports Server (NTRS)

    Gross, R. S.

    1985-01-01

    A sudden excitation event of the Chandler wobble should induce the earth's rotation pole to undergo damped harmonic motion. This type of motion has been searched for in the observations of the Chandler wobble using techniques based upon the concept of a matched filter. Although the signal detection techniques used here were not sensitive enough to detect any such isolated sudden excitation events, the result that was obtained is consistent with a randomly excited model of the Chandler wobble.
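The matched-filter approach described in this abstract can be sketched on synthetic data: correlate a noisy series with a damped-harmonic template and look for a peak in the filter response. The template parameters, noise level, and event location below are invented for illustration, not taken from the Chandler wobble observations.

```python
import numpy as np

# Matched-filter detection of a damped harmonic excitation signature
# in a noisy series (synthetic illustration, not polar-motion data).
rng = np.random.default_rng(0)
t = np.arange(0.0, 200.0, 1.0)

# Template: damped oscillation (decay time and period are assumptions).
template = np.exp(-t[:50] / 30.0) * np.cos(2 * np.pi * t[:50] / 14.0)

# Signal: the template buried at sample 80 in white noise.
signal = 0.1 * rng.standard_normal(t.size)
signal[80:130] += template

# Matched filter: correlate the data against the template and
# take the lag of the maximum response as the detected onset.
response = np.correlate(signal, template, mode="valid")
detected = int(np.argmax(response))
print(f"Peak filter response at sample {detected}")
```

With strong enough noise relative to the template (as in the actual Chandler wobble data), the response peak can fall below the detection threshold, which is consistent with the null result reported above.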

  15. Quantity Versus Quality: A Survey Experiment to Improve the Network Scale-up Method.

    PubMed

    Feehan, Dennis M; Umubyeyi, Aline; Mahy, Mary; Hladik, Wolfgang; Salganik, Matthew J

    2016-04-15

    The network scale-up method is a promising technique that uses sampled social network data to estimate the sizes of epidemiologically important hidden populations, such as sex workers and people who inject illicit drugs. Although previous scale-up research has focused exclusively on networks of acquaintances, we show that the type of personal network about which survey respondents are asked to report is a potentially crucial parameter that researchers are free to vary. This generalization leads to a method that is more flexible and potentially more accurate. In 2011, we conducted a large, nationally representative survey experiment in Rwanda that randomized respondents to report about one of 2 different personal networks. Our results showed that asking respondents for less information can, somewhat surprisingly, produce more accurate size estimates. We also estimated the sizes of 4 key populations at risk for human immunodeficiency virus infection in Rwanda. Our estimates were higher than earlier estimates from Rwanda but lower than international benchmarks. Finally, in this article we develop a new sensitivity analysis framework and use it to assess the possible biases in our estimates. Our design can be customized and extended for other settings, enabling researchers to continue to improve the network scale-up method. PMID:27015875
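The basic network scale-up estimator underlying this method divides the total number of ties respondents report to the hidden population by the respondents' total personal network size, then multiplies by the overall population size. A minimal sketch with invented survey numbers:

```python
# Basic network scale-up estimator (all survey numbers invented):
#   N_hidden ~ (sum of reported ties to the hidden group)
#              / (sum of respondents' personal network sizes) * N_total
ties_to_hidden = [0, 1, 0, 2, 0, 0, 1]              # per-respondent reports
network_sizes = [120, 250, 90, 300, 150, 200, 180]  # estimated degrees
N_total = 1_000_000                                 # total population (assumed)

N_hidden = sum(ties_to_hidden) / sum(network_sizes) * N_total
print(f"Estimated hidden population size: {N_hidden:.0f}")
```

The experiment described in the abstract varies which personal network (and hence which degree estimate) respondents report about; the estimator's accuracy depends directly on how well those degrees are measured.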

  16. Quantity Versus Quality: A Survey Experiment to Improve the Network Scale-up Method

    PubMed Central

    Feehan, Dennis M.; Umubyeyi, Aline; Mahy, Mary; Hladik, Wolfgang; Salganik, Matthew J.

    2016-01-01

    The network scale-up method is a promising technique that uses sampled social network data to estimate the sizes of epidemiologically important hidden populations, such as sex workers and people who inject illicit drugs. Although previous scale-up research has focused exclusively on networks of acquaintances, we show that the type of personal network about which survey respondents are asked to report is a potentially crucial parameter that researchers are free to vary. This generalization leads to a method that is more flexible and potentially more accurate. In 2011, we conducted a large, nationally representative survey experiment in Rwanda that randomized respondents to report about one of 2 different personal networks. Our results showed that asking respondents for less information can, somewhat surprisingly, produce more accurate size estimates. We also estimated the sizes of 4 key populations at risk for human immunodeficiency virus infection in Rwanda. Our estimates were higher than earlier estimates from Rwanda but lower than international benchmarks. Finally, in this article we develop a new sensitivity analysis framework and use it to assess the possible biases in our estimates. Our design can be customized and extended for other settings, enabling researchers to continue to improve the network scale-up method. PMID:27015875

  17. Identifying the Characteristics of Effective High Schools: Report from Year One of the National Center on Scaling up Effective Schools. Research Report

    ERIC Educational Resources Information Center

    Rutledge, Stacey; Cohen-Vogel, Lora; Osborne-Lampkin, La'Tara

    2012-01-01

    The National Center on Scaling up Effective Schools (NCSU) is a five-year project working to develop, implement, and test new processes to scale up effective practices in high schools that districts will be able to apply within the context of their own unique goals and circumstances. This report describes the activities and findings of the first…

  18. Gravimetry and Space Techniques Applied to Geodynamics and Ocean Dynamics

    NASA Astrophysics Data System (ADS)

    Schutz, Bob E.; Anderson, Allen; Froidevaux, Claude; Parke, Michael

    The variety of disciplines represented in this volume (including space geodesy, oceanography, geophysics, and celestial mechanics) attests to the interdisciplinary applications of gravimetry and space techniques. The relation to sea level is addressed in some of the papers, and the contributions of the techniques to the development of global gravity models are discussed. The space technique of satellite altimetry has become a prominent contributor to sea surface topography as well as to ocean tide models and the determination of gravity, especially in ocean areas. Ocean tides influence the motion of near-Earth satellites and the rotation of the Earth. Modern space geodesy is increasingly relying on the Global Positioning System for measuring geophysical phenomena manifested at the surface through crustal deformations. Furthermore, the geophysical interpretation of gravity anomalies has been facilitated by the introduction of modern techniques. This volume represents only a small "snapshot" of the interdisciplinary research being conducted. Modern space geodesy is one of the common links between the disciplines reflected in this volume. New developments in gravimetry and space techniques will further enhance and foster interdisciplinary work in coming years.

  19. TA Beliefs in a SCALE-UP Style Classroom

    NASA Astrophysics Data System (ADS)

    DeBeck, George; Settelmeyer, Sam; Li, Sissi; Demaree, Dedra

    2010-10-01

    In Spring 2010, the Oregon State University physics department instituted a SCALE-UP (Student-Centered Active Learning Environment for Undergraduate Programs) style studio classroom in the introductory, calculus-based physics series. In our initial implementation, comprising two hours of lecture, two hours of studio, and two hours of lab work, the studio session was led by a faculty member and either 2 GTAs or 1 GTA and 1 LA. We plan to move to a model where senior GTAs can lead studio sections after co-teaching with the faculty member. It is critical that we know how to prepare and support the instructional team in facilitating student learning in this setting. We examine GTA and LA pedagogical beliefs through reflective journaling, interviews, and the personal experience of the authors. In particular, we examine how these beliefs changed over their first quarter of instruction, as well as the resources used to adapt to the new classroom environment.

  20. Scaled-up in vitro experiments of vocal fold paralysis

    NASA Astrophysics Data System (ADS)

    Peterson, Keith; Wei, Timothy; Krane, Michael

    2006-11-01

    Vocal fold paralysis is the inability of one or both vocal folds to open and close properly. Digital Particle Image Velocimetry (DPIV) measurements were taken to further understand the consequences paralyzed vocal folds have on the fluid dynamics downstream of the vocal folds during human phonation. The experiments were conducted in a free-stream water tunnel using a simplified scaled-up model of human vocal folds. The Reynolds and Strouhal numbers ranged from 4500 to 10000 and from 0.01 to 0.04, respectively. Various configurations were tested to emulate several types of vocal fold paralysis, including unilateral vocal fold immobility (UVFI), bilateral vocal fold immobility (BVFI), and the vocal folds operating at different oscillating frequencies. Data from these different conditions will be compared with an eye toward understanding the critical dynamics associated with this class of disease.
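The Reynolds and Strouhal numbers quoted in this abstract are the standard definitions Re = UL/ν and St = fL/U. A quick check with illustrative model values (the velocity, length scale, and frequency below are assumptions chosen to land inside the reported ranges):

```python
# Dimensionless numbers for a scaled-up vocal-fold model.
# U, L, and f are illustrative assumptions, not values from the study.
U = 0.45     # free-stream velocity, m/s
L = 0.02     # model length scale, m
nu = 1.0e-6  # kinematic viscosity of water, m^2/s
f = 0.5      # oscillation frequency, Hz

Re = U * L / nu  # Reynolds number
St = f * L / U   # Strouhal number
print(f"Re = {Re:.0f}, St = {St:.3f}")
```

Matching Re and St between the scaled-up water model and real phonation in air is what justifies comparing the model's downstream flow field to the physiological case.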

  1. Biological conversion of synthesis gas. Limiting conditions/scale-up

    SciTech Connect

    Basu, R.; Klasson, K.T.; Takriff, M.; Clausen, E.C.; Gaddy, J.L.

    1993-09-01

    The purpose of this research is to develop a technically and economically feasible process for biologically producing H2 from synthesis gas while, at the same time, removing harmful sulfur gas compounds. Six major tasks are being studied: 1. culture development, where the best cultures are selected and conditions optimized for simultaneous hydrogen production and sulfur gas removal; 2. mass transfer and kinetic studies, in which the equations necessary for process design are developed; 3. bioreactor design studies, where the cultures chosen in Task 1 are utilized in continuous reaction vessels to demonstrate process feasibility and define operating conditions; 4. evaluation of biological synthesis gas conversion under limiting conditions in preparation for industrial demonstration studies; 5. process scale-up, where laboratory data are scaled to larger-size units in preparation for process demonstration in a pilot-scale unit; and 6. economic evaluation, where process simulations are used to project process economics and identify high-cost areas during sensitivity analyses.

  2. Scaling Up Family Therapy in Fragile, Conflict-Affected States.

    PubMed

    Charlés, Laurie L

    2015-09-01

    This article discusses the design and delivery of two international family therapy-focused mental health and psychosocial support training projects, one in a fragile state and one in a post-conflict state. The training projects took place in Southeast Asia and the Middle East/North Africa. Each was funded, supported, and implemented by local, regional, and international stakeholders, and delivered as part of a broader humanitarian agenda to develop human resource capacity to work with families affected by atrocities. The two examples illustrate how task-shifting/task-sharing and transitional justice approaches were used to inform the scaling-up of professionals involved in each project. They also exemplify how state-citizen phenomena in each location affected the project design and delivery. PMID:25315510

  3. Production Scale-Up of Activated Carbons for Ultracapacitors

    SciTech Connect

    Dr. Steven D. Dietz

    2007-01-10

    Transportation use accounts for 67% of the petroleum consumption in the US. Electric and hybrid vehicles are promising technologies for decreasing our dependence on petroleum, and this is the objective of the FreedomCAR & Vehicle Technologies Program. Inexpensive and efficient energy storage devices are needed for electric and hybrid vehicles to be economically viable, and ultracapacitors are a leading energy storage technology being investigated by the FreedomCAR program. The most important parameter in determining the power and energy density of a carbon-based ultracapacitor is the amount of surface area accessible to the electrolyte, which is primarily determined by the pore size distribution. The major problems with current carbons are that their pore size distribution is not optimized for liquid electrolytes and that the best carbons are very expensive. TDA Research, Inc. (TDA) has developed methods to prepare porous carbons with tunable pore size distributions from inexpensive carbohydrate-based precursors. The use of low-cost feedstocks and processing steps greatly lowers production costs. During this project, with the assistance of Maxwell Technologies, we found that an impurity, identified as primarily sulfur, was limiting the performance of our carbon. A new carbon with low sulfur content was made, and its performance was greatly improved. We also scaled up the process to pre-production levels and are currently able to produce 0.25 tons/year of activated carbon; we could easily double this amount by purchasing a second rotary kiln. More importantly, we are working with MeadWestvaco under a Joint Development Agreement to scale up the process to produce hundreds of tons of high-quality, inexpensive carbon per year based on our processes.

  4. Cognitive task analysis: Techniques applied to airborne weapons training

    SciTech Connect

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E.; Carlow Associates, Inc., Fairfax, VA; Martin Marietta Energy Systems, Inc., Oak Ridge, TN; Tennessee Univ., Knoxville, TN )

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis can play in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are not currently in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  5. Novel method for constructing a large-scale design space in lubrication process by using Bayesian estimation based on the reliability of a scale-up rule.

    PubMed

    Maeda, Jin; Suzuki, Tatsuya; Takayama, Kozo

    2012-01-01

    A reliable large-scale design space was constructed by integrating the reliability of a scale-up rule into the Bayesian estimation without enforcing a large-scale design of experiments (DoE). A small-scale DoE was conducted using various Froude numbers (X(1)) and blending times (X(2)) in the lubricant blending process for theophylline tablets. The response surfaces, design space, and their reliability of the compression rate of the powder mixture (Y(1)), tablet hardness (Y(2)), and dissolution rate (Y(3)) on a small scale were calculated using multivariate spline interpolation, a bootstrap resampling technique, and self-organizing map clustering. A constant Froude number was applied as a scale-up rule. Experiments were conducted at four different small scales with the same Froude number and blending time in order to determine the discrepancies in the response variables between the scales so as to indicate the reliability of the scale-up rule. Three experiments under an optimal condition and two experiments under other conditions were performed on a large scale. The response surfaces on the small scale were corrected to those on the large scale by Bayesian estimation using the large-scale results and the reliability of the scale-up rule. Large-scale experiments performed under three additional sets of conditions showed that the corrected design space was more reliable than the small-scale design space even when there was some discrepancy in the pharmaceutical quality between the manufacturing scales. This approach is useful for setting up a design space in pharmaceutical development when a DoE cannot be performed at a commercial large manufacturing scale. PMID:22976324
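
    The constant-Froude-number rule at the heart of this scale-up approach is easy to state in code. The sketch below is illustrative only: the blender diameters and speed are hypothetical, not the study's equipment; it merely shows how the large-scale impeller speed is chosen so that Fr = N²D/g matches the small scale.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def froude_number(n_rps, diameter_m):
    """Froude number for a rotating blender: Fr = N^2 * D / g."""
    return n_rps ** 2 * diameter_m / G

def matched_speed(n_small_rps, d_small_m, d_large_m):
    """Large-scale speed keeping Fr constant: N_L = N_S * sqrt(D_S / D_L)."""
    return n_small_rps * math.sqrt(d_small_m / d_large_m)

# Hypothetical example: a 0.3 m lab blender at 2 rev/s scaled to a 1.2 m vessel
n_large = matched_speed(2.0, 0.3, 1.2)
fr_small = froude_number(2.0, 0.3)
fr_large = froude_number(n_large, 1.2)
```

    Doubling the blending time is then the remaining degree of freedom explored in the small-scale design of experiments; only the speed is pinned by the rule.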

  6. Machine-Learning Techniques Applied to Antibacterial Drug Discovery

    PubMed Central

    Durrant, Jacob D.; Amaro, Rommie E.

    2014-01-01

    The emergence of drug-resistant bacteria threatens to catapult humanity back to the pre-antibiotic era. Even now, multi-drug-resistant bacterial infections annually result in millions of hospital days, billions in healthcare costs, and, most importantly, tens of thousands of lives lost. As many pharmaceutical companies have abandoned antibiotic development in search of more lucrative therapeutics, academic researchers are uniquely positioned to fill the resulting vacuum. Traditional high-throughput screens and lead-optimization efforts are expensive and labor intensive. Computer-aided drug discovery techniques, which are cheaper and faster, can accelerate the identification of novel antibiotics in an academic setting, leading to improved hit rates and faster transitions to pre-clinical and clinical testing. The current review describes two machine-learning techniques, neural networks and decision trees, that have been used to identify experimentally validated antibiotics. We conclude by describing the future directions of this exciting field. PMID:25521642

  7. Machine-learning techniques applied to antibacterial drug discovery.

    PubMed

    Durrant, Jacob D; Amaro, Rommie E

    2015-01-01

    The emergence of drug-resistant bacteria threatens to revert humanity back to the preantibiotic era. Even now, multidrug-resistant bacterial infections annually result in millions of hospital days, billions in healthcare costs, and, most importantly, tens of thousands of lives lost. As many pharmaceutical companies have abandoned antibiotic development in search of more lucrative therapeutics, academic researchers are uniquely positioned to fill the pipeline. Traditional high-throughput screens and lead-optimization efforts are expensive and labor intensive. Computer-aided drug-discovery techniques, which are cheaper and faster, can accelerate the identification of novel antibiotics, leading to improved hit rates and faster transitions to preclinical and clinical testing. The current review describes two machine-learning techniques, neural networks and decision trees, that have been used to identify experimentally validated antibiotics. We conclude by describing the future directions of this exciting field. PMID:25521642
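
    Decision trees of the kind reviewed here split candidate molecules on descriptor thresholds. The following minimal sketch hand-rolls a one-level decision tree (a stump) trained by Gini impurity; the descriptors (logP, molecular weight, H-bond donors) and the activity labels are invented for illustration and are not from any screening dataset discussed in the review.

```python
# Minimal decision stump (one-level decision tree) trained by Gini impurity.
# Features and labels are hypothetical, purely to illustrate the mechanism.

def gini(labels):
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def fit_stump(X, y):
    """Find the (feature, threshold) split minimizing weighted Gini impurity."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or score < best[0]:
                lv = round(sum(left) / len(left)) if left else 0   # majority vote
                rv = round(sum(right) / len(right)) if right else 0
                best = (score, f, t, lv, rv)
    return best[1:]

def predict(stump, row):
    f, t, lv, rv = stump
    return lv if row[f] <= t else rv

# logP, molecular weight, H-bond donors (hypothetical descriptors)
X = [[1.2, 320.0, 2], [3.5, 480.0, 1], [0.8, 290.0, 3],
     [4.1, 510.0, 0], [1.0, 305.0, 2], [3.9, 495.0, 1]]
y = [1, 0, 1, 0, 1, 0]  # 1 = experimentally active, 0 = inactive

stump = fit_stump(X, y)
label = predict(stump, [1.1, 310.0, 2])  # screen a new candidate
```

    Real antibacterial work would grow the tree recursively and use far richer descriptor sets; the stump just exposes the split criterion.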

  8. Technology Assessment of Dust Suppression Techniques Applied During Structural Demolition

    SciTech Connect

    Boudreaux, J.F.; Ebadian, M.A.; Williams, P.T.; Dua, S.K.

    1998-10-20

    Hanford, Fernald, Savannah River, and other sites are currently reviewing technologies that can be implemented to demolish buildings in a cost-effective manner. In order to demolish a structure properly and, at the same time, minimize the amount of dust generated from a given technology, an evaluation must be conducted to choose the most appropriate dust suppression technology given site-specific conditions. Thus, the purpose of this research, which was carried out at the Hemispheric Center for Environmental Technology (HCET) at Florida International University, was to conduct an experimental study of dust aerosol abatement (dust suppression) methods as applied to nuclear D and D. This experimental study targeted the problem of dust suppression during the demolition of nuclear facilities. The resulting data were employed to assist in the development of mathematical correlations that can be applied to predict dust generation during structural demolition.

  9. Microwave de-embedding techniques applied to acoustics.

    PubMed

    Jackson, Charles M

    2005-07-01

    This paper describes the use of the microwave techniques of time domain reflectometry (TDR) and de-embedding in an acoustical application. Two methods of calibrating the reflectometer are presented to evaluate the consistency of the method. Measured and modeled S-parameters of woodwind instruments are presented. The raw measured data is de-embedded to obtain an accurate measurement. The acoustic TDR setup is described. PMID:16212248

  10. Analysis of soil images applying Laplacian Pyramidal techniques

    NASA Astrophysics Data System (ADS)

    Ballesteros, F.; de Castro, J.; Tarquis, A. M.; Méndez, A.

    2012-04-01

    The Laplacian pyramid is a technique for image encoding in which local operators of many scales, but identical shape, serve as the basis functions. Our work describes some properties of the filters of the Laplacian pyramid. In particular, we pay attention to the Gaussian and fractal behaviour of these filters, determine the normal and fractal ranges in the case of single-parameter filters, and study the influence of these filters on soil image processing. One usual property of any image is that neighboring pixels are highly correlated. This property makes it inefficient to represent the image directly in terms of the pixel values, because most of the encoded information would be redundant. Burt and Adelson designed a technique, named the Laplacian pyramid, for removing image correlation, which combines features of predictive and transform methods. This technique is non-causal, and its computations are simple and local. The predicted value for each pixel is computed as a local weighted average, using a unimodal weighting function centred on the pixel itself. Pyramid construction is equivalent to convolving the original image with a set of weighting functions determined by a parameter that defines the filter. According to the parameter values, these filters show behaviour that ranges from the Gaussian shape to the fractal. Previous works analyzed only Gaussian filters; here we determine the Gaussian and fractal intervals and study the energy of the Laplacian pyramid images according to the filter type. Qualitatively, the different behaviour involves a significant change in statistical characteristics at different levels of iteration, especially in the fractal case, which can highlight specific information in the images. Funding provided by Spanish Ministerio de Ciencia e Innovación (MICINN) through project no. AGL2010-21501/AGR is greatly appreciated.
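
    A minimal 1-D sketch of the Burt-Adelson construction described above, using the standard single-parameter 5-tap generating kernel (the parameter a controls the filter shape; a ≈ 0.4 is close to Gaussian). The signal is synthetic and the sketch is an assumption-laden illustration, not the authors' soil-image code; the point is that each pyramid level stores the detail removed by one REDUCE step, so the decomposition is exactly invertible.

```python
import numpy as np

def kernel(a):
    """Burt-Adelson 5-tap generating kernel; a controls the filter shape."""
    return np.array([0.25 - a / 2, 0.25, a, 0.25, 0.25 - a / 2])

def reduce_step(signal, w):
    """Blur with w and keep every second sample (REDUCE)."""
    return np.convolve(signal, w, mode="same")[::2]

def expand(signal, w, n):
    """Upsample to length n by zero insertion, then blur (EXPAND)."""
    up = np.zeros(n)
    up[::2] = signal
    return np.convolve(up, 2 * w, mode="same")

def laplacian_pyramid(signal, w, levels):
    """Each level stores the detail removed by one REDUCE; the final
    entry is the coarse residual, so the pyramid is exactly invertible."""
    pyr, g = [], signal
    for _ in range(levels):
        g_next = reduce_step(g, w)
        pyr.append(g - expand(g_next, w, len(g)))
        g = g_next
    pyr.append(g)
    return pyr

def reconstruct(pyr, w):
    g = pyr[-1]
    for lap in reversed(pyr[:-1]):
        g = lap + expand(g, w, len(lap))
    return g

sig = (np.sin(np.linspace(0, 4 * np.pi, 64))
       + 0.1 * np.random.default_rng(0).standard_normal(64))
pyr = laplacian_pyramid(sig, kernel(0.4), 3)
rec = reconstruct(pyr, kernel(0.4))
```

    Varying a toward the fractal end of the range changes the energy split across levels, which is the effect studied in the abstract.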

  11. Low background techniques applied in the BOREXINO experiment

    SciTech Connect

    Zuzel, G.

    2015-08-17

    The BOREXINO detector, located in the Gran Sasso National Laboratory in Italy, has been designed for real-time spectroscopy of low-energy solar neutrinos. Within the experiment several novel background reduction and assay techniques have been established; in many cases they are still the most sensitive worldwide. The developed methods and apparatus provided tools for a strict quality control program during the construction phase of the BOREXINO detector, which was the key to meeting the background requirements. Achievement of an extremely low background rate opened the possibility of probing, in real time, almost the entire spectrum of solar neutrinos.

  12. Advances in parameter estimation techniques applied to flexible structures

    NASA Technical Reports Server (NTRS)

    Maben, Egbert; Zimmerman, David C.

    1994-01-01

    In this work, various parameter estimation techniques are investigated in the context of structural system identification utilizing distributed parameter models and 'measured' time-domain data. Distributed parameter models are formulated using the PDEMOD software developed by Taylor. Enhancements made to PDEMOD for this work include the following: (1) a Wittrick-Williams based root solving algorithm; (2) a time simulation capability; and (3) various parameter estimation algorithms. The parameter estimation schemes are contrasted using the NASA Mini-Mast as the focus structure.

  13. Unconventional Coding Technique Applied to Multi-Level Polarization Modulation

    NASA Astrophysics Data System (ADS)

    Rutigliano, G. G.; Betti, S.; Perrone, P.

    2016-05-01

    A new technique is proposed to improve information confidentiality in optical-fiber communications without bandwidth consumption. A pseudorandom vectorial sequence was generated by a dynamic system algorithm and used to codify a multi-level polarization modulation based on the Stokes vector. Optical-fiber birefringence, usually considered as a disturbance, was exploited to obfuscate the signal transmission. At the receiver end, the same pseudorandom sequence was generated and used to decode the multi-level polarization modulated signal. The proposed scheme, working at the physical layer, provides strong information security without introducing complex processing and thus latency.
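
    The shared-sequence encode/decode symmetry described above can be sketched with a toy pseudorandom generator. Here a logistic-map iteration stands in for the paper's dynamic-system algorithm, and the multi-level polarization states are reduced to plain integer symbol indices; both simplifications are assumptions for illustration only.

```python
def logistic_sequence(seed, n, levels):
    """Pseudorandom symbol offsets from a logistic-map iteration
    (a hypothetical stand-in for the paper's dynamic-system generator)."""
    x, out = seed, []
    for _ in range(n):
        x = 3.99 * x * (1.0 - x)          # chaotic regime
        out.append(int(x * levels) % levels)
    return out

def encode(symbols, seed, levels=8):
    """Offset each multi-level polarization symbol by the shared sequence."""
    key = logistic_sequence(seed, len(symbols), levels)
    return [(s + k) % levels for s, k in zip(symbols, key)]

def decode(coded, seed, levels=8):
    """Receiver regenerates the same sequence and removes the offsets."""
    key = logistic_sequence(seed, len(coded), levels)
    return [(c - k) % levels for c, k in zip(coded, key)]

msg = [0, 3, 7, 2, 5, 1, 6, 4]            # 8-level symbol indices
tx = encode(msg, seed=0.37)
rx = decode(tx, seed=0.37)                # same seed recovers the symbols
```

    An eavesdropper without the seed sees only the scrambled sequence; no extra bandwidth is consumed because the symbol alphabet is unchanged.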

  14. Boson mapping techniques applied to constant gauge fields in QCD

    NASA Technical Reports Server (NTRS)

    Hess, Peter Otto; Lopez, J. C.

    1995-01-01

    Pairs of coordinates and derivatives of the constant gluon modes are mapped to new gluon-pair fields and their derivatives. Applying this mapping to the Hamiltonian of constant gluon fields results, for large coupling constants, in an effective Hamiltonian which separates into one part describing a scalar field and another for a field with spin two. The ground state is dominated by pairs of gluons coupled to color and spin zero, with slight admixtures of color-zero and spin-two pairs. As the color group we used SU(2).

  15. Applied geophysical techniques to evaluate earth dams and foundations

    NASA Astrophysics Data System (ADS)

    Llopis, Jose L.; Sharp, Michael K.; Butler, Dwain K.; Yule, Donald E.

    1995-05-01

    Mill Creek Dam, near Walla Walla, Washington, has experienced anomalous seepage since its first filling in 1941. Various attempts to abate and control the seepage, including construction of a concrete wall, have not been completely successful. Construction of the cutoff wall reduced the seepage by about 30 percent, from 33 cubic feet per second to 22 cubic feet per second, and downstream saturated farmland was reduced by 56 percent. However, there are indications of increased seepage pressures in a conglomerate formation in the right abutment. A comprehensive, integrated geophysics investigation of the right abutment area of the dam was conducted to detect and map anomalous conditions and assist in the evaluation of remedial measures. The geophysics program consisted of microgravity, ground penetrating radar, seismic reflection, electromagnetic conductivity, and electrical resistivity surveying. Results of the program indicate anomalous conditions extending from the reservoir area through the right abutment. The aspects of the program planning leading to technique selection and field procedures are emphasized, as well as the role of different geophysical techniques in defining the nature of anomalous conditions.

  16. Image enhancement techniques applied to solar feature detection

    NASA Astrophysics Data System (ADS)

    Kowalski, Artur J.

    This dissertation presents the development of automatic image enhancement techniques for solar feature detection. The new method allows for detection and tracking of the evolution of filaments in solar images. Series of H-alpha full-disk images are taken at regular time intervals to observe the changes of the solar disk features. In each picture, the solar chromosphere filaments are identified for further examination of their evolution. The initial preprocessing step involves local thresholding to convert grayscale images into black-and-white pictures with chromosphere granularity enhanced. An alternative preprocessing method, based on image normalization and global thresholding, is presented. The next step employs morphological closing operations with multi-directional linear structuring elements to extract elongated shapes in the image. After logical union of the directional filtering results, the remaining noise is removed from the final outcome using morphological dilation and erosion with a circular structuring element. Experimental results show that the developed techniques achieve excellent results in detecting large filaments and good detection rates for small filaments. The final chapter discusses proposed directions of future research and applications to other areas of solar image processing, in particular the detection of solar flares, plages and sunspots.
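
    The morphological pipeline described above (directional closings, logical union, then circular-element dilation and erosion) can be sketched on a tiny synthetic binary image. This sketch assumes SciPy's ndimage morphology; the image, element sizes and orientations are illustrative, not those used in the dissertation.

```python
import numpy as np
from scipy import ndimage

def line_se(length, orientation):
    """Linear structuring element: 'h', 'v', 'd' (diagonal), 'a' (anti-diagonal)."""
    se = np.zeros((length, length), bool)
    c = length // 2
    if orientation == "h":
        se[c, :] = True
    elif orientation == "v":
        se[:, c] = True
    elif orientation == "d":
        np.fill_diagonal(se, True)
    else:
        se[np.arange(length), np.arange(length)[::-1]] = True
    return se

def detect_filaments(binary, length=5):
    """Union of closings with directional linear elements, then dilation
    followed by erosion with a small circular (plus-shaped) element."""
    union = np.zeros_like(binary)
    for o in "hvda":
        union |= ndimage.binary_closing(binary, structure=line_se(length, o))
    disk = np.array([[0, 1, 0], [1, 1, 1], [0, 1, 0]], bool)
    return ndimage.binary_erosion(ndimage.binary_dilation(union, disk), disk)

img = np.zeros((15, 15), bool)
img[7, 2:6] = img[7, 7:13] = True   # elongated feature with a one-pixel gap
result = detect_filaments(img)
```

    The directional closing along the feature's own orientation bridges the gap, while the off-axis closings leave it untouched; the union therefore recovers the whole filament.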

  17. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 high alpha research vehicle (HARV) automated test systems are discussed. It is noted that operational experiences in developing and using these automated testing techniques have highlighted the need for incorporating target system features to improve testability. Improved target system testability can be accomplished with the addition of nonreal-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Also, test system and target system design issues must be addressed during the early stages of the target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.

  18. Applying manifold learning techniques to the CAESAR database

    NASA Astrophysics Data System (ADS)

    Mendoza-Schrock, Olga; Patrick, James; Arnold, Gregory; Ferrara, Matthew

    2010-04-01

    Understanding and organizing data is the first step toward exploiting sensor phenomenology for dismount tracking. What image features are good for distinguishing people, and what measurements, or combinations of measurements, can be used to classify the dataset by demographics including gender, age, and race? A particular technique, Diffusion Maps, has demonstrated the potential to extract features that intuitively make sense [1]. We want to develop an understanding of this tool by validating existing results on the Civilian American and European Surface Anthropometry Resource (CAESAR) database. This database, provided by the Air Force Research Laboratory (AFRL) Human Effectiveness Directorate and SAE International, is a rich dataset which includes 40 traditional anthropometric measurements of 4400 human subjects. If we could specifically measure the defining features for classification from this database, the next question would be to determine a subset of these features that can be measured from imagery. This paper briefly describes the Diffusion Map technique, shows potential for dimension reduction of the CAESAR database, and describes interesting problems to be further explored.
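
    A compact sketch of the Diffusion Map computation referenced above: a Gaussian kernel on pairwise distances, row-normalization to a Markov transition matrix, and embedding by the top non-trivial eigenvectors. The two-cluster synthetic data stand in for anthropometric measurements and are purely illustrative.

```python
import numpy as np

def diffusion_map(X, eps, k=2, t=1):
    """Embed rows of X into k diffusion coordinates."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / eps)                       # Gaussian affinity kernel
    P = K / K.sum(axis=1, keepdims=True)        # row-stochastic Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # the first eigenpair is trivial (eigenvalue 1, constant eigenvector)
    return vecs[:, 1:k + 1] * vals[1:k + 1] ** t

rng = np.random.default_rng(0)
# two hypothetical clusters of 3-D "measurements"
X = np.vstack([rng.normal(0.0, 0.1, (20, 3)),
               rng.normal(1.5, 0.1, (20, 3))])
Y = diffusion_map(X, eps=1.0, k=1)
# the first diffusion coordinate separates the two clusters by sign
```

    On real CAESAR-style data one would tune eps to the measurement scale and keep several coordinates; the single coordinate here just exhibits the dimension reduction.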

  19. Object Detection Techniques Applied on Mobile Robot Semantic Navigation

    PubMed Central

    Astua, Carlos; Barber, Ramon; Crespo, Jonathan; Jardon, Alberto

    2014-01-01

    The future of robotics predicts that robots will integrate themselves ever more closely with human beings and their environments. To achieve this integration, robots need to acquire information about the environment and its objects. There is a great need for algorithms that provide robots with this sort of skill, from locating the objects needed to accomplish a task up to treating those objects as information about the environment. This paper presents a way to provide mobile robots with the ability to detect objects for semantic navigation. The approach draws on current trends in robotics and, at the same time, can be exported to other platforms. Two methods to detect objects are proposed, contour detection and a descriptor-based technique, and the two are combined to overcome their respective limitations. Finally, the code is tested on a real robot to prove its accuracy and efficiency. PMID:24732101

  20. Discrete filtering techniques applied to sequential GPS range measurements

    NASA Technical Reports Server (NTRS)

    Vangraas, Frank

    1987-01-01

    The basic navigation solution is described for position and velocity based on range and delta range (Doppler) measurements from NAVSTAR Global Positioning System satellites. The application of discrete filtering techniques is examined to reduce the white noise distortions on the sequential range measurements. A second order (position and velocity states) Kalman filter is implemented to obtain smoothed estimates of range by filtering the dynamics of the signal from each satellite separately. Test results using a simulated GPS receiver show a steady-state noise reduction, the input noise variance divided by the output noise variance, of a factor of four. Recommendations for further noise reduction based on higher order Kalman filters or additional delta range measurements are included.
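
    A minimal version of the second-order (position and velocity states) Kalman filter described above, applied to one satellite's simulated range measurements. The noise levels, dynamics and tuning below are assumptions chosen for illustration, not the values from the simulated GPS receiver in the study.

```python
import numpy as np

def kalman_smooth_ranges(z, dt, r_var, q_var):
    """Second-order (range, range-rate) Kalman filter for one satellite's
    sequential range measurements, filtered separately as in the abstract."""
    F = np.array([[1.0, dt], [0.0, 1.0]])           # constant-velocity model
    H = np.array([[1.0, 0.0]])                      # we observe range only
    Q = q_var * np.array([[dt**3 / 3, dt**2 / 2],   # process-noise covariance
                          [dt**2 / 2, dt]])
    R = np.array([[r_var]])                         # measurement-noise variance
    x = np.array([z[0], 0.0])                       # init at first measurement
    P = np.eye(2) * r_var
    out = []
    for zk in z:
        x = F @ x                                   # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                         # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
        x = x + (K @ (zk - H @ x)).ravel()          # update
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

rng = np.random.default_rng(1)
t = np.arange(200) * 1.0
true_range = 2.0e7 + 800.0 * t                      # receding satellite
z = true_range + rng.normal(0.0, 10.0, t.size)      # white range noise
est = kalman_smooth_ranges(z, dt=1.0, r_var=100.0, q_var=0.01)
```

    After the transient, the smoothed ranges track the true ramp with a much smaller error variance than the raw measurements, the steady-state reduction reported in the abstract.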

  1. Object detection techniques applied on mobile robot semantic navigation.

    PubMed

    Astua, Carlos; Barber, Ramon; Crespo, Jonathan; Jardon, Alberto

    2014-01-01

    The future of robotics predicts that robots will integrate themselves ever more closely with human beings and their environments. To achieve this integration, robots need to acquire information about the environment and its objects. There is a great need for algorithms that provide robots with this sort of skill, from locating the objects needed to accomplish a task up to treating those objects as information about the environment. This paper presents a way to provide mobile robots with the ability to detect objects for semantic navigation. The approach draws on current trends in robotics and, at the same time, can be exported to other platforms. Two methods to detect objects are proposed, contour detection and a descriptor-based technique, and the two are combined to overcome their respective limitations. Finally, the code is tested on a real robot to prove its accuracy and efficiency. PMID:24732101

  2. Active Learning Techniques Applied to an Interdisciplinary Mineral Resources Course.

    NASA Astrophysics Data System (ADS)

    Aird, H. M.

    2015-12-01

    An interdisciplinary active learning course entitled 'Mineral Resources and the Environment' was introduced at the University of Puget Sound. Various formative assessment and active learning techniques that have been effective in other courses were adapted and implemented to improve student learning, increase retention and broaden knowledge and understanding of course material. This was an elective course targeted towards upper-level undergraduate geology and environmental majors. The course provided an introduction to the mineral resources industry, discussing geological, environmental, societal and economic aspects, legislation and the processes involved in exploration, extraction, processing, reclamation/remediation and recycling of products. Lectures and associated weekly labs were linked in subject matter; relevant readings from the recent scientific literature were assigned and discussed in the second lecture of the week. Peer-based learning was facilitated through weekly reading assignments with peer-led discussions and through group research projects, in addition to in-class exercises such as debates. Writing and research skills were developed by having student groups design, carry out and report on their own semester-long research projects on the lasting effects of the historical Ruston Smelter on the biology and water systems of Tacoma. The writing of their mini grant proposals and final project reports was carried out in stages to allow for feedback before the deadline. Speakers from industry were invited to share their specialist knowledge as guest lecturers, and students were encouraged to interact with them, with a view to employment opportunities. Formative assessment techniques included jigsaw exercises, gallery walks, placemat surveys, think-pair-share and take-home point summaries.
    Summative assessment included discussion leadership, exams, homework assignments, group projects, in-class exercises, field trips, and pre-discussion reading exercises.

  3. Innovative Visualization Techniques applied to a Flood Scenario

    NASA Astrophysics Data System (ADS)

    Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael

    2013-04-01

    The large and ever-increasing amounts of multi-dimensional, time-varying and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time-sampled data sets, providing an integrated mechanism that aids the user in collaboratively exploring, presenting and communicating visually complex and dynamic data. Here we present these concepts in the context of a 4-hour flood scenario from Lisbon in 2010, with data that include measures of water column (flood height) every 10 minutes at a 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers and storytelling. We explain in more detail two of these that are not currently in common use for data visualization: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparisons, time step, class values for the colour legend, etc., and capture the current application state, which can then be provided as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it.
    The second technique, web-based linked views, uses multiple windows that respond interactively to user selections, so that when an object is selected and changed in one window, it is automatically updated in all the other windows.

  4. Finite element techniques applied to cracks interacting with selected singularities

    NASA Technical Reports Server (NTRS)

    Conway, J. C.

    1975-01-01

    The finite-element method for computing the extensional stress-intensity factor for cracks approaching selected singularities of varied geometry is described. Stress-intensity factors are generated using both displacement and J-integral techniques, and numerical results are compared to those obtained experimentally in a photoelastic investigation. The selected singularities considered are a colinear crack, a circular penetration, and a notched circular penetration. Results indicate that singularities greatly influence the crack-tip stress-intensity factor as the crack approaches the singularity. In addition, the degree of influence can be regulated by varying the overall geometry of the singularity. Local changes in singularity geometry have little effect on the stress-intensity factor for the cases investigated.

  5. Status of text-mining techniques applied to biomedical text.

    PubMed

    Erhardt, Ramón A-A; Schneider, Reinhard; Blaschke, Christian

    2006-04-01

    Scientific progress is increasingly based on knowledge and information. Knowledge is now recognized as the driver of productivity and economic growth, leading to a new focus on the role of information in the decision-making process. Most scientific knowledge is registered in publications and other unstructured representations that make it difficult to use and to integrate the information with other sources (e.g. biological databases). Making a computer understand human language has proven to be a complex task, but there are techniques capable of detecting, distinguishing and extracting a limited number of different classes of facts. In the biomedical field, information extraction faces specific problems: complex and ever-changing nomenclature (especially for genes and proteins) and the limited representation of domain knowledge. PMID:16580973

  6. Security Verification Techniques Applied to PatchLink COTS Software

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer

    2006-01-01

    Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.

  7. Applying Clustering Techniques to Reduce Complexity in Automated Planning Domains

    NASA Astrophysics Data System (ADS)

    Dicken, Luke; Levine, John

    Automated Planning is a very active area of research within Artificial Intelligence. Broadly this discipline deals with the methods by which an agent can independently determine the action sequence required to successfully achieve a set of objectives. In this paper, we will present initial work outlining a new approach to planning based on Clustering techniques, in order to group states of the world together and use the fundamental structure of the world to lift out more abstract representations. We will show that this approach can limit the combinatorial explosion of a typical planning problem in a way that is much more intuitive and reusable than has previously been possible, and outline ways that this approach can be developed further.
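
    One way to make the state-grouping idea above concrete is single-linkage clustering of proposition-valued states under Hamming distance: states that differ in few propositions collapse into one abstract state, shrinking the search space. The union-find sketch below, and the 4-proposition example states, are illustrative assumptions, not the authors' algorithm.

```python
from itertools import combinations

def hamming(a, b):
    """Number of propositions on which two states disagree."""
    return sum(x != y for x, y in zip(a, b))

def abstract_states(states, max_dist):
    """Single-linkage clustering via union-find: states within max_dist
    of each other (directly or transitively) share one abstract state."""
    parent = list(range(len(states)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    for i, j in combinations(range(len(states)), 2):
        if hamming(states[i], states[j]) <= max_dist:
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[rj] = ri

    clusters = {}
    for i in range(len(states)):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())

# Hypothetical 4-proposition world states (e.g. door-open, holding-key, ...)
states = [(0, 0, 0, 0), (0, 0, 0, 1), (0, 0, 1, 1),
          (1, 1, 0, 0), (1, 1, 0, 1)]
groups = abstract_states(states, max_dist=1)
```

    A planner can then search over the handful of abstract states first and refine within a cluster only when needed, which is the intuition behind limiting the combinatorial explosion.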

  8. Technology Assessment of Dust Suppression Techniques Applied During Structural Demolition

    SciTech Connect

    Boudreaux, J.F.; Ebadian, M.A.; Dua, S.K.

    1997-08-06

    Hanford, Fernald, Savannah River, and other sites are currently reviewing technologies that can be implemented to demolish buildings in a cost-effective manner. In order to demolish a structure and, at the same time, minimize the amount of dust generated by a given technology, an evaluation must be conducted to choose the most appropriate dust suppression technology. Thus, the purpose of this research, which was conducted by the Hemispheric Center for Environmental Technology (HCET) at Florida International University (FIU), was to perform an experimental study of dust aerosol abatement (dust suppression) methods as applied to nuclear D and D. This experimental study specifically targeted the problem of dust suppression during demolition. The resulting data were used in the development of mathematical correlations that can be applied to structural demolition. In the Fiscal Year 1996 (FY96), the effectiveness of different dust suppressing agents was investigated for different types of concrete blocks. Initial tests were conducted in a broad particle size range. In Fiscal Year 1997 (FY97), additional tests were performed in the size range in which most of the particles were detected. Since particle distribution is an important parameter for predicting deposition in various compartments of the human respiratory tract, various tests were aimed at determining the particle size distribution of the airborne dust particles. The effectiveness of dust suppressing agents for particles of various size was studied. Instead of conducting experiments on various types of blocks, it was thought prudent to carry out additional tests on blocks of the same type. Several refinements were also incorporated in the test procedures and data acquisition system used in FY96.

  9. Polyethylene encapsulation of mixed wastes: Scale-up feasibility

    SciTech Connect

    Kalb, P.D.; Heiser, J.H.; Colombo, P.

    1991-12-31

    A polyethylene process for the improved encapsulation of radioactive, hazardous, and mixed wastes has been developed at Brookhaven National Laboratory (BNL). Improvements in waste loading and waste form performance have been demonstrated through bench-scale development and testing. Maximum waste loadings of up to 70 dry wt % mixed waste nitrate salt were achieved, compared with 13--20 dry wt % using conventional cement processes. Stability under anticipated storage and disposal conditions and compliance with applicable hazardous waste regulations were demonstrated through a series of lab-scale waste form performance tests. Full-scale demonstration of this process using actual or surrogate waste is currently planned. A scale-up feasibility test was successfully conducted, demonstrating the ability to process nitrate salts at production rates (up to 450 kg/hr) and the close agreement between bench- and full-scale process parameters. Cored samples from the resulting pilot-scale (114 liter) waste form were used to verify homogeneity and to provide additional specimens for confirmatory performance testing.

  10. Fungal biosynthesis of gold nanoparticles: mechanism and scale up

    PubMed Central

    Kitching, Michael; Ramani, Meghana; Marsili, Enrico

    2015-01-01

    Gold nanoparticles (AuNPs) are a widespread research tool because of their oxidation resistance, biocompatibility and stability. Chemical methods for AuNP synthesis often produce toxic residues that raise environmental concern. On the other hand, the biological synthesis of AuNPs in viable microorganisms and their cell-free extracts is an environmentally friendly and low-cost process. In general, fungi tolerate higher metal concentrations than bacteria and secrete abundant extracellular redox proteins to reduce soluble metal ions to their insoluble form and eventually to nanocrystals. Fungi harbour untapped biological diversity and may provide novel metal reductases for metal detoxification and bioreduction. A thorough understanding of the biosynthetic mechanism of AuNPs in fungi is needed to reduce the time of biosynthesis and to scale up the AuNP production process. In this review, we describe the known mechanisms for AuNP biosynthesis in viable fungi and fungal protein extracts and discuss the most suitable bioreactors for industrial AuNP biosynthesis. PMID:25154648

  11. Scaling up: Assessing social impacts at the macro-scale

    SciTech Connect

    Schirmer, Jacki

    2011-04-15

    Social impacts occur at various scales, from the micro-scale of the individual to the macro-scale of the community. Identifying the macro-scale social changes that result from an impacting event is a common goal of social impact assessment (SIA), but this is challenging as multiple factors simultaneously influence social trends at any given time, and there are usually only a small number of cases available for examination. While some methods have been proposed for establishing the contribution of an impacting event to macro-scale social change, they remain relatively untested. This paper critically reviews methods recommended to assess macro-scale social impacts, and proposes and demonstrates a new approach. The 'scaling up' method involves developing a chain of logic linking change at the individual/site scale to the community scale. It enables a more problematised assessment of the likely contribution of an impacting event to macro-scale social change than previous approaches. The use of this approach in a recent study of change in dairy farming in south east Australia is described.

  12. Challenges and Opportunities in Scaling-Up Nutrition in Healthcare

    PubMed Central

    Darnton-Hill, Ian; Samman, Samir

    2015-01-01

    Healthcare continues to be in a state of flux; conventionally, this provides opportunities and challenges. The opportunities include technological breakthroughs, improved economies and increasing availability of healthcare. On the other hand, economic disparities are increasing and leading to differing accessibility to healthcare, including within affluent countries. Nutrition has received an increase in attention and resources in recent decades, a lot of it stimulated by the rise in obesity, type 2 diabetes mellitus and hypertension. An increase in ageing populations also has meant increased interest in nutrition-related chronic diseases. In many middle-income countries, there has been an increase in the double burden of malnutrition with undernourished children and overweight/obese parents and adolescents. In low-income countries, an increased evidence base has allowed scaling-up of interventions to address under-nutrition, both nutrition-specific and nutrition-sensitive interventions. Immediate barriers (institutional, structural and biological) and longer-term barriers (staffing shortages where most needed and environmental impacts on health) are discussed. Significant barriers remain for the near universal access to healthcare, especially for those who are socio-economically disadvantaged, geographically isolated, living in war zones or where environmental damage has taken place. However, these barriers are increasingly being recognized, and efforts are being made to address them. The paper aims to take a broad view that identifies and then comments on the many social, political and scientific factors affecting the achievement of improved nutrition through healthcare. PMID:27417744

  13. Scaling Up Data-Centric Middleware on a Cluster Computer

    SciTech Connect

    Liu, D T; Franklin, M J; Garlick, J; Abdulla, G M

    2005-04-29

    Data-centric workflow middleware systems are workflow systems that treat data as first-class objects alongside programs. These systems improve the usability, responsiveness and efficiency of workflow execution over cluster (and grid) computers. In this work, we explore the scalability of one such system, GridDB, on cluster computers. We measure the performance and scalability of GridDB in executing data-intensive image processing workflows from the SuperMACHO astrophysics survey on a large cluster computer. Our first experimental study concerns the scale-up of GridDB. We make a rather surprising finding: although the middleware system issues many queries and transactions to a DBMS, file system operations present the first-tier bottleneck. We circumvent this bottleneck and increase the scalability of GridDB by more than 2-fold on our image processing application (up to 128 nodes). In a second study, we demonstrate the sensitivity of GridDB performance (and therefore application performance) to characteristics of the workflows being executed. To manage these sensitivities, we provide guidelines for trading off the costs and benefits of GridDB at a fine grain.

  14. Optical Trapping Techniques Applied to the Study of Cell Membranes

    NASA Astrophysics Data System (ADS)

    Morss, Andrew J.

    Optical tweezers allow for manipulating micron-sized objects using pN level optical forces. In this work, we use an optical trapping setup to aid in three separate experiments, all related to the physics of the cellular membrane. In the first experiment, in conjunction with Brian Henslee, we use optical tweezers to allow for precise positioning and control of cells in suspension to evaluate the cell size dependence of electroporation. Theory predicts that all cells porate at a transmembrane potential V_TM of roughly 1 V. The Schwan equation predicts that the transmembrane potential depends linearly on the cell radius r, thus predicting that cells should porate at threshold electric fields that go as 1/r. The threshold field required to induce poration is determined by applying a low voltage pulse to the cell and then applying additional pulses of greater and greater magnitude, checking for poration at each step using propidium iodide dye. We find that, contrary to expectations, cells do not porate at a constant value of the transmembrane potential but at a constant value of the electric field, which we find to be 692 V/cm for K562 cells. Delivering precise dosages of nanoparticles into cells is of importance for assessing the toxicity of nanoparticles or for genetic research. In the second experiment, we conduct nano-electroporation, a novel method of applying precise doses of transfection agents to cells, by using optical tweezers in conjunction with a confocal microscope to manipulate cells into contact with 100 nm wide nanochannels. This work was done in collaboration with Pouyan Boukany of Dr. Lee's group. The small cross-sectional area of these nanochannels means that the electric field within them is extremely large, 60 MV/m, which allows them to electrophoretically drive transfection agents into the cell. We find that nano-electroporation results in excellent dose control (to within 10% in our experiments) compared to bulk electroporation. We also find that
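
The cell-size dependence above follows from the steady-state Schwan relation for the induced transmembrane potential, V_TM = 1.5 * E * r * cos(theta). The following minimal sketch contrasts the constant-V_TM prediction (threshold field falling as 1/r) with the constant-field behaviour reported in the abstract; the cell radii used are illustrative values, not measurements from the study.

```python
import math

def schwan_vtm(field_v_per_m, radius_m, theta_rad=0.0):
    """Steady-state Schwan equation: V_TM = 1.5 * E * r * cos(theta)."""
    return 1.5 * field_v_per_m * radius_m * math.cos(theta_rad)

def threshold_field(vtm_crit_v, radius_m):
    """Field needed to reach a critical V_TM at the cell pole (theta = 0),
    under the constant-V_TM assumption: E_th = V_TM / (1.5 * r)."""
    return vtm_crit_v / (1.5 * radius_m)

# Constant-V_TM theory predicts E_th ~ 1/r: doubling the radius halves the
# threshold field. (The abstract reports the opposite: a constant field.)
e_small = threshold_field(1.0, 5e-6)    # 5 um cell radius (illustrative)
e_large = threshold_field(1.0, 10e-6)   # 10 um cell radius (illustrative)
assert abs(e_small / e_large - 2.0) < 1e-12
```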

  15. Cleaning techniques for applied-B ion diodes

    SciTech Connect

    Cuneo, M.E.; Menge, P.R.; Hanson, D.L.

    1995-09-01

    Measurements and theoretical considerations indicate that the lithium-fluoride (LiF) lithium ion source operates by electron-assisted field-desorption, and provides a pure lithium beam for 10--20 ns. Evidence on both the SABRE (1 TW) and PBFA-II (20 TW) accelerators indicates that the lithium beam is replaced by a beam of protons, and carbon resulting from electron thermal desorption of hydrocarbon surface and bulk contamination with subsequent avalanche ionization. Appearance of contaminant ions in the beam is accompanied by rapid impedance collapse, possibly resulting from loss of magnetic insulation in the rapidly expanding and ionizing, neutral layer. Electrode surface and source substrate cleaning techniques are being developed on the SABRE accelerator to reduce beam contamination, plasma formation, and impedance collapse. We have increased lithium current density a factor of 3 and lithium energy a factor of 5 through a combination of in-situ surface and substrate coatings, impermeable substrate coatings, and field profile modifications.

  16. Digital prototyping technique applied for redesigning plastic products

    NASA Astrophysics Data System (ADS)

    Pop, A.; Andrei, A.

    2015-11-01

    After products have been on the market for some time, they often need to be redesigned to meet new market requirements. New products are generally derived from similar but outdated products. Redesigning a product is an important part of the production and development process. The purpose of this paper is to show that using modern technology, like Digital Prototyping, in industry is an effective way to produce new products. This paper tries to demonstrate and highlight the effectiveness of the concept of Digital Prototyping, both in reducing the design time of a new product and in reducing the costs required for implementing this step. The results of this paper show that using Digital Prototyping techniques to design a new product from an existing one available on the market offers a significant reduction in manufacturing time and cost. The ability to simulate and test a new product with modern CAD-CAM programs in all aspects of production (design of the 3D model, simulation of structural resistance, analysis of the injection process and beautification) offers a helpful tool for engineers. The whole process can be realised quickly and effectively by one skilled engineer.

  17. Remote sensing techniques applied to seismic vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Juan Arranz, Jose; Torres, Yolanda; Hahgi, Azade; Gaspar-Escribano, Jorge

    2016-04-01

    Advances in remote sensing and photogrammetry techniques have increased the degree of accuracy and resolution in the record of the earth's surface. This has expanded the range of possible applications of these data. In this research, we have used these data to document the construction characteristics of the urban environment of Lorca, Spain. An exposure database has been created with the gathered information to be used in seismic vulnerability assessment. To this end, we have used data from photogrammetric flights at different periods, using orthorectified images in both the visible and infrared spectra. Furthermore, the analysis is completed using LiDAR data. From the combination of these data, it has been possible to delineate the building footprints and characterize the constructions with attributes such as the approximate date of construction, area, type of roof and even building materials. To carry out the calculation, we have developed different algorithms to compare images from different times, segment images, classify LiDAR data, and use the infrared data in order to remove vegetation or to compute roof surfaces with height, tilt and spectral fingerprint values. In addition, the accuracy of our results has been validated with ground truth data. Keywords: LiDAR, remote sensing, seismic vulnerability, Lorca

  18. Time-resolved infrared spectroscopic techniques as applied to channelrhodopsin

    PubMed Central

    Ritter, Eglof; Puskar, Ljiljana; Bartl, Franz J.; Aziz, Emad F.; Hegemann, Peter; Schade, Ulrich

    2015-01-01

    Among optogenetic tools, channelrhodopsins, the light gated ion channels of the plasma membrane from green algae, play the most important role. Properties like channel selectivity, timing parameters or color can be influenced by the exchange of selected amino acids. Although widely used, in the neurosciences for example, little is still known about their photocycles and the mechanism of ion channel gating and conductance. One of the preferred methods for these studies is infrared spectroscopy since it allows observation of proteins and their function at a molecular level and in near-native environment. The absorption of a photon in channelrhodopsin leads to retinal isomerization within femtoseconds, the conductive states are reached in the microsecond time scale and the return into the fully dark-adapted state may take more than minutes. To be able to cover all these time regimes, a range of different spectroscopic approaches is necessary. This mini-review focuses on time-resolved applications of the infrared technique to study channelrhodopsins and other light triggered proteins. We will discuss the approaches with respect to their suitability to the investigation of channelrhodopsin and related proteins. PMID:26217670

  19. Sputtering as a Technique for Applying Tribological Coatings

    NASA Technical Reports Server (NTRS)

    Ramalingam, S.

    1984-01-01

    Friction and wear-induced mechanical failures may be controlled to extend the life of tribological components through the interposition of selected solid materials between contacting surfaces. Thin solid films of soft and hard materials are appropriate to lower friction and enhance the wear resistance of precision tribo-elements. Tribological characteristics of thin hard coats deposited on a variety of ferrous and non-ferrous substrates were tested. The thin hard coats used were titanium nitride films deposited by reactive magnetron sputtering of metallic titanium. High contact stress, low speed tests showed wear rate reductions of one or more orders of magnitude, even with films a few micrometers in thickness. Low contact stress, high speed tests carried out under rather severe test conditions showed that thin films of TiN afforded significant friction reduction and wear protection. Thin hard coats were shown to improve the friction and wear performance of rolling contacts. Satisfactory film-to-substrate adhesion strengths can be obtained with reactive magnetron sputtering. X-ray diffraction and microhardness tests were employed to assess the effectiveness of the sputtering technique.

  20. Semantic Data And Visualization Techniques Applied To Geologic Field Mapping

    NASA Astrophysics Data System (ADS)

    Houser, P. I. Q.; Royo-Leon, M.; Munoz, R.; Estrada, E.; Villanueva-Rosales, N.; Pennington, D. D.

    2015-12-01

    Geologic field mapping involves the use of technology before, during, and after visiting a site. Geologists utilize hardware such as Global Positioning Systems (GPS) connected to mobile computing platforms such as tablets that include software such as ESRI's ArcPad and other software to produce maps and figures for a final analysis and report. Hand written field notes contain important information and drawings or sketches of specific areas within the field study. Our goal is to collect and geo-tag final and raw field data into a cyber-infrastructure environment with an ontology that allows for large data processing, visualization, sharing, and searching, aiding in connecting field research with prior research in the same area and/or aiding in experiment replication. Online searches of a specific field area return results such as weather data from NOAA and QuakeML seismic data from USGS. These results can then be saved to a field mobile device and searched while in the field where there is no Internet connection. To accomplish this we created the GeoField ontology service using the Web Ontology Language (OWL) and Protégé software. Advanced queries on the dataset can be made using reasoning capabilities that go beyond a standard database service. These improvements include the automated discovery of data relevant to a specific field site and visualization techniques aimed at enhancing analysis and collaboration while in the field by draping data over mobile views of the site using augmented reality. A case study is being performed at the University of Texas at El Paso's Indio Mountains Research Station located near Van Horn, Texas, an active multi-disciplinary field study site. The user can interactively move the camera around the study site and view their data digitally. Geologists can check their data against the site in real time and improve collaboration with another person, as both parties have the same interactive view of the data.

  1. Applying Realtime Intelligence Acquisition Techniques To Problems In Resource Management

    NASA Astrophysics Data System (ADS)

    Greer, Jerry D.

    1989-02-01

    Most people see little similarity between a battlefield manager and a natural resource manager. However, except for the element of time, many striking similarities may be drawn. Of course, there are obvious differences between the tranquil scenes of mountain scenery, forests, rivers or grasslands and bomb-scarred battlefields where survival is often the prime objective. The similarities center around the basic need for information upon which good decisions may be made. Both managers of battlefields and of natural resources require accurate, timely, and continuous information about changing conditions. Based on this information, they each make decisions to conserve the materials and resources under their charge. Their common goal is to serve the needs of the people in their society. On the one hand, the goal is victory in battle to perpetuate a way of life or a political system. On the other, the goal is victory in an ongoing battle against fire, insects, disease, soil erosion, vandalism, theft, and misuse in general. Here, a desire to maintain natural resources in a productive and healthy condition prevails. The objective of the natural resource manager is to keep natural resources in such a condition that they will continue to meet the needs and wants of the people who claim them for their common good. In this paper, the different needs for information are compared and a little history of some of the quasi-military aspects of resource management is given. Needs for information are compared and current uses of data acquisition techniques are reviewed. Similarities and differences are discussed and future opportunities for cooperation in data acquisition are outlined.

  2. Robustness of speckle imaging techniques applied to horizontal imaging scenarios

    NASA Astrophysics Data System (ADS)

    Bos, Jeremy P.

    Atmospheric turbulence near the ground severely limits the quality of imagery acquired over long horizontal paths. In defense, surveillance, and border security applications, there is interest in deploying man-portable, embedded systems incorporating image reconstruction to improve the quality of imagery available to operators. To be effective, these systems must operate over significant variations in turbulence conditions while also being subject to other variations due to operation by novice users. Systems that meet these requirements and are otherwise designed to be immune to the factors that cause variation in performance are considered robust. In addition to robustness in design, the portable nature of these systems implies a preference for systems with a minimum level of computational complexity. Speckle imaging methods are one of a variety of methods recently proposed for use in man-portable horizontal imagers. In this work, the robustness of speckle imaging methods is established by identifying a subset of design parameters that provide immunity to the expected variations in operating conditions while minimizing the computation time necessary for image recovery. This performance evaluation is made possible using a novel technique for simulating anisoplanatic image formation. I find that incorporating as few as 15 image frames and 4 estimates of the object phase per reconstructed frame provides an average 45% reduction in Mean Squared Error (MSE) and a 68% reduction in the deviation of the MSE. In addition, the Knox-Thompson phase recovery method is demonstrated to produce images in half the time required by the bispectrum. Finally, it is shown that certain blind image quality metrics can be used in place of the MSE to evaluate reconstruction quality in field scenarios. Using blind metrics rather than depending on user estimates allows for reconstruction quality that differs from the minimum MSE by as little as 1%, significantly reducing the deviation in
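
Reconstruction quality above is scored by Mean Squared Error. As a minimal, generic sketch (not the author's evaluation pipeline), MSE between a reference image and a reconstruction, treated here as flat pixel sequences, can be computed as:

```python
def mse(reference, estimate):
    """Mean Squared Error between two equal-length pixel sequences."""
    if len(reference) != len(estimate):
        raise ValueError("images must have the same number of pixels")
    return sum((a - b) ** 2 for a, b in zip(reference, estimate)) / len(reference)

# Toy check: a reconstruction closer to the reference scores a lower MSE.
truth = [0.0, 0.5, 1.0, 0.25]
good = [0.0, 0.5, 0.9, 0.25]    # small error on one pixel
poor = [0.3, 0.1, 0.4, 0.75]    # large errors on every pixel
assert mse(truth, good) < mse(truth, poor)
```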

  3. Scaling-up ultrasound standing wave enhanced sedimentation filters.

    PubMed

    Prest, Jeff E; Treves Brown, Bernard J; Fielden, Peter R; Wilkinson, Stephen J; Hawkes, Jeremy J

    2015-02-01

    Particle concentration and filtration is a key stage in a wide range of processing industries and also one that can present challenges for high-throughput, continuous operation. Here we demonstrate some features which increase the efficiency of ultrasound-enhanced sedimentation and could give the technology the potential to be scaled up. In this work, 20 mm piezoelectric plates were used to drive 100 mm high chambers formed from single structural elements. The coherent structural resonances were able to drive particles (yeast cells) in the water to nodes throughout the chamber. Ultrasound-enhanced sedimentation was used to demonstrate the efficiency of the system (>99% particle clearance). Sub-wavelength pin protrusions were used for the contacts between the resonant chamber and other elements. The pins provided support and transferred power, replacing glue which is inefficient for power transfer. Filtration energies of ∼4 J/ml of suspension were measured. A calculation of thermal convection indicates that the circulation could disrupt cell alignment in ducts >35 mm high when a 1 K temperature gradient is present; we predict higher efficiencies when this maximum height is observed. For the acoustic design, although modelling was minimal before construction, the very simple construction allowed us to form 3D models of the nodal patterns in the fluid and the duct structure. The models were compared with visual observations of particle movement, Chladni figures and scanning laser vibrometer mapping. This demonstrates that nodal planes in the fluid can be controlled by the position of clamping points and that the contacts could be positioned to increase the efficiency and reliability of particle manipulations in standing waves. PMID:25193111
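
The pressure nodes that collect particles in such a resonator are spaced half a wavelength apart, d = c/(2f). A small sketch of that relation follows; the drive frequency is an assumed illustrative value, since the abstract does not state it.

```python
def node_spacing(sound_speed_m_s, frequency_hz):
    """Spacing between pressure nodes in a standing wave: half a
    wavelength, d = c / (2 * f)."""
    return sound_speed_m_s / (2.0 * frequency_hz)

c_water = 1480.0   # approximate speed of sound in water at 20 C, m/s
f_drive = 2.0e6    # assumed drive frequency, Hz (not stated in the abstract)

d = node_spacing(c_water, f_drive)
assert abs(d - 0.00037) < 1e-9  # ~0.37 mm between adjacent nodes
```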

  4. Beginning with sustainable scale up in mind: initial results from a population, health and environment project in East Africa.

    PubMed

    Ghiron, Laura; Shillingi, Lucy; Kabiswa, Charles; Ogonda, Godfrey; Omimo, Antony; Ntabona, Alexis; Simmons, Ruth; Fajans, Peter

    2014-05-01

    Small-scale pilot projects have demonstrated that integrated population, health and environment approaches can address the needs and rights of vulnerable communities. However, these and other types of health and development projects have rarely gone on to influence larger policy and programme development. ExpandNet, a network of health professionals working on scaling up, argues this is because projects are often not designed with future sustainability and scaling up in mind. Developing and implementing sustainable interventions that can be applied on a larger scale requires a different mindset and new approaches to small-scale/pilot testing. This paper shows how this new approach is being applied and the initial lessons from its use in the Health of People and Environment in the Lake Victoria Basin Project currently underway in Uganda and Kenya. Specific lessons that are emerging are: 1) ongoing, meaningful stakeholder engagement has significantly shaped the design and implementation, 2) multi-sectoral projects are complex and striving for simplicity in the interventions is challenging, and 3) projects that address a sharply felt need experience substantial pressure for scale up, even before their effectiveness is established. Implicit in this paper is the recommendation that other projects would also benefit from applying a scale-up perspective from the outset. PMID:24908459

  5. Using Advanced Modeling to Accelerate the Scale-Up of Carbon Capture Technologies

    SciTech Connect

    Miller, David; Sun, Xin; Storlie, Curtis; Bhattacharyya, Debangsu

    2015-06-18

    Carbon capture and storage (CCS) is one of many approaches that are critical for significantly reducing domestic and global CO2 emissions. The U.S. Department of Energy’s Clean Coal Technology Program Plan envisions 2nd generation CO2 capture technologies ready for demonstration-scale testing around 2020 with the goal of enabling commercial deployment by 2025 [1]. Third generation technologies have a similarly aggressive timeline. A major challenge is that the development and scale-up of new technologies in the energy sector historically takes up to 15 years to move from the laboratory to pre-deployment and another 20 to 30 years for widespread industrial scale deployment. In order to help meet the goals of the DOE carbon capture program, the Carbon Capture Simulation Initiative (CCSI) was launched in early 2011 to develop, demonstrate, and deploy advanced computational tools and validated multi-scale models to reduce the time required to develop and scale up new carbon capture technologies. The CCSI Toolset (1) enables promising concepts to be more quickly identified through rapid computational screening of processes and devices, (2) reduces the time to design and troubleshoot new devices and processes by using optimization techniques to focus development on the best overall process conditions and by using detailed device-scale models to better understand and improve the internal behavior of complex equipment, and (3) provides quantitative predictions of device and process performance during scale up based on rigorously validated smaller scale simulations that take into account model and parameter uncertainty [2]. This article focuses on essential elements related to the development and validation of multi-scale models in order to help minimize risk and maximize learning as new technologies progress from pilot to demonstration scale.

  6. Scaling up debris-flow experiments on a centrifuge

    NASA Astrophysics Data System (ADS)

    Hung, C.; Capart, H.; Crone, T. J.; Grinspum, E.; Hsu, L.; Kaufman, D.; Li, L.; Ling, H.; Reitz, M. D.; Smith, B.; Stark, C. P.

    2013-12-01

    Boundary forces generated by debris flows can be powerful enough to erode bedrock and cause considerable damage to infrastructure during runout. Formulation of an erosion-rate law for debris flows is therefore a high priority, and it makes sense to build such a law around laboratory experiments. However, running experiments big enough to generate realistic boundary forces is a logistical challenge to say the least [1]. One alternative is to run table-top simulations with unnaturally weak but fast-eroding pseudo-bedrock, another is to extrapolate from micro-erosion of natural substrates driven by unnaturally weak impacts; hybrid-scale experiments have also been conducted [2]. Here we take a different approach in which we scale up granular impact forces by running our experiments under enhanced gravity in a geotechnical centrifuge [3]. Using a 40 cm-diameter rotating drum [2] spun at up to 100g, we generate debris flows with an effective depth of several meters. By varying effective gravity from 1g to 100g we explore the scaling of granular flow forces and the consequent bed and wall erosion rates. The velocity and density structure of these granular flows is monitored using laser sheets, high-speed video, and particle tracking [4], and the progressive erosion of the boundary surfaces is measured by laser scanning. The force structures and their fluctuations within the granular mass and at the boundaries are explored with contact dynamics numerical simulations that mimic the lab experimental conditions [5]. In this presentation we summarize these results and discuss how they can contribute to the formulation of debris-flow erosion law. [1] Major, J. J. (1997), Journal of Geology 105: 345-366, doi:10.1086/515930 [2] Hsu, L. (2010), Ph.D. thesis, University of California, Berkeley [3] Brucks, A., et al (2007), Physical Review E 75, 032301, doi:10.1103/PhysRevE.75.032301 [4] Spinewine, B., et al (2011), Experiments in Fluids 50: 1507-1525, doi: 10.1007/s00348
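
The gravity scaling used above follows the standard geotechnical-centrifuge rule: a model of depth h spun at N times gravity reproduces the stress field of a prototype of depth N*h. A minimal sketch with an illustrative flow depth (the actual depth in the drum is not stated in the abstract):

```python
def effective_depth(model_depth_m, g_level):
    """Geotechnical centrifuge scaling: a model of depth h spun at N*g
    carries the boundary stresses of a prototype of depth N*h."""
    return model_depth_m * g_level

# E.g. an assumed 5 cm deep granular flow in the 40 cm drum, spun at 100g,
# behaves (stress-wise) like a ~5 m deep prototype debris flow.
assert abs(effective_depth(0.05, 100) - 5.0) < 1e-9
```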

  7. Scale up tools in reactive extrusion and compounding processes. Could 1D-computer modeling be helpful?

    NASA Astrophysics Data System (ADS)

    Pradel, J.-L.; David, C.; Quinebèche, S.; Blondel, P.

    2014-05-01

    Industrial scale-up (or scale-down) in Compounding and Reactive Extrusion processes is one of the most critical R&D challenges. Indeed, most High Performance Polymers are obtained within a reactive compounding involving chemistry: free radical grafting, in situ compatibilization, rheology control... but also side reactions: oxidation, branching, chain scission... As described by basic Arrhenius and kinetics laws, the competition between all chemical reactions depends on residence time distribution and temperature. Then, to ensure the best possible scale-up methodology, we need tools to match the thermal history of the formulation along the screws from a lab-scale twin-screw extruder to an industrial one. This paper proposes a comparison between standard scale-up laws and the use of computer modeling software such as Ludovic®, applied and compared to experimental data. Scaling data from one compounding line to another, applying general rules (for example, at constant specific mechanical energy), shows differences between experimental and computed data, and the error depends on the screw speed range. For more accurate prediction, 1D computer modeling could be used to optimize the process conditions to ensure the best scale-up product, especially in temperature-sensitive reactive extrusion processes. When the product temperature along the screws is the key, Ludovic® software could help to compute the temperature profile along the screws and extrapolate conditions, even the screw profile, on industrial extruders.
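
One of the "general rules" referred to above is geometric scale-up, often approximated by Q2 = Q1 * (D2/D1)^n, with the exponent n between about 2.5 and 3 depending on whether throughput is volume- or heat-transfer-limited. A hedged sketch of that rule; the screw diameters and throughput are illustrative values, not data from the paper:

```python
def scaled_throughput(q1_kg_h, d1_mm, d2_mm, exponent=3.0):
    """Classical geometric scale-up rule for twin-screw extruders:
    Q2 = Q1 * (D2 / D1) ** n, with n ~ 2.5-3."""
    return q1_kg_h * (d2_mm / d1_mm) ** exponent

# Illustrative: a 26 mm lab extruder at 10 kg/h scaled to a 58 mm machine
# with the volumetric (n = 3) rule gives roughly 111 kg/h.
q2 = scaled_throughput(10.0, 26.0, 58.0)
assert 100.0 < q2 < 120.0
```

As the abstract notes, such purely geometric rules ignore differences in thermal history, which is exactly the gap 1D process simulation aims to close.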

  8. Validation and scale-up of plasmid DNA purification by phenyl-boronic acid chromatography.

    PubMed

    Gomes, A Gabriela; Azevedo, Ana M; Aires-Barros, M Raquel; Prazeres, D Miguel F

    2012-11-01

    This study addresses the feasibility of scaling-up the removal of host cell impurities from plasmid DNA (pDNA)-containing Escherichia coli lysates by phenyl-boronic (PB) acid chromatography using columns packed with 7.6 and 15.2 cm³ of controlled porous glass beads (CPG) derivatized with PB ligands. Equilibration was performed with water at 10 cm³/min and no conditioning of the lysate feed was required. At a ratio of lysate feed to adsorbent volume of 1.3, 93-96% of pDNA was recovered in the flow through while 66-71% of impurities remained bound (~2.5-fold purification). The entire sequence of loading, washing, elution, and re-equilibration was completed in 20 min. Run-to-run consistency was observed in terms of chromatogram features and performance (yield, purification factor, agarose electrophoresis) across the different amounts of adsorbent (0.75-15.2 cm³) by performing successive injections of lysates prepared independently and containing 3.7 or 6.1 kbp plasmids. The column productivity at large scale was 4 dm³ of alkaline lysate per hour per dm³ of PB-CPG resin. The method is rapid, reproducible, simple, and straightforward to scale-up. Furthermore, it is capable of handling heavily contaminated samples, constituting a good alternative to purification techniques such as isopropanol precipitation, aqueous two-phase systems, and tangential flow filtration. PMID:23175141
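
The reported productivity is consistent with the cycle figures in the abstract: a load ratio of 1.3 volumes of lysate per volume of adsorbent, at one full cycle every 20 min, gives three cycles per hour, i.e. about 4 dm³ of lysate per hour per dm³ of resin.

```python
def productivity(load_ratio, cycle_min):
    """Volumetric productivity: lysate volume processed per hour per unit
    adsorbent volume, given the load ratio per cycle and the cycle time."""
    cycles_per_hour = 60.0 / cycle_min
    return load_ratio * cycles_per_hour

# Values from the abstract: load ratio 1.3, full cycle in 20 min.
p = productivity(1.3, 20.0)
assert abs(p - 3.9) < 1e-9  # ~4 dm^3 lysate per hour per dm^3 resin
```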

  9. Semantic Representation and Scale-Up of Integrated Air Traffic Management Data

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Ranjan, Shubha; Wei, Mie; Eshow, Michelle

    2016-01-01

    Each day, the global air transportation industry generates a vast amount of heterogeneous data from air carriers, air traffic control providers, and secondary aviation entities handling baggage, ticketing, catering, fuel delivery, and other services. Generally, these data are stored in isolated data systems, separated from each other by significant political, regulatory, economic, and technological divides. These realities aside, integrating aviation data into a single, queryable, big data store could enable insights leading to major efficiency, safety, and cost advantages. In this paper, we describe an implemented system for combining heterogeneous air traffic management data using semantic integration techniques. The system transforms data from its original disparate source formats into a unified semantic representation within an ontology-based triple store. Our initial prototype stores only a small sliver of air traffic data covering one day of operations at a major airport. The paper also describes our analysis of difficulties ahead as we prepare to scale up data storage to accommodate successively larger quantities of data -- eventually covering all US commercial domestic flights over an extended multi-year timeframe. We review several approaches to mitigating scale-up related query performance concerns.
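
An ontology-based triple store of the kind described represents every fact as a (subject, predicate, object) triple. The following pure-Python sketch shows the idea with an in-memory store and wildcard pattern matching; the flight identifiers and predicate names are hypothetical, not the system's actual ontology.

```python
# Minimal in-memory triple store sketch (stdlib only).
triples = set()

def add(s, p, o):
    """Assert a (subject, predicate, object) fact."""
    triples.add((s, p, o))

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Hypothetical air-traffic facts:
add("flight:UA123", "hasOrigin", "airport:SFO")
add("flight:UA123", "hasDestination", "airport:ORD")
add("flight:DL456", "hasOrigin", "airport:SFO")

# "Which flights depart SFO?" -- the triple-pattern analogue of a
# SPARQL query against the store.
departures = sorted(s for s, _, _ in match(p="hasOrigin", o="airport:SFO"))
```

A production triple store adds indexing over all triple positions, which is exactly where the scale-up concerns discussed in the paper arise.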

  10. Scale-up of the production of highly reactive biogenic magnetite nanoparticles using Geobacter sulfurreducens.

    PubMed

    Byrne, J M; Muhamadali, H; Coker, V S; Cooper, J; Lloyd, J R

    2015-06-01

    Although there are numerous examples of large-scale commercial microbial synthesis routes for organic bioproducts, few studies have addressed the obvious potential for microbial systems to produce inorganic functional biomaterials at scale. Here we address this by focusing on the production of nanoscale biomagnetite particles by the Fe(III)-reducing bacterium Geobacter sulfurreducens, which was scaled up successfully from laboratory- to pilot plant-scale production, while maintaining the surface reactivity and magnetic properties which make this material well suited to commercial exploitation. At the largest scale tested, the bacterium was grown in a 50 l bioreactor, harvested and then inoculated into a buffer solution containing Fe(III)-oxyhydroxide and an electron donor and mediator, which promoted the formation of magnetite in under 24 h. This procedure was capable of producing up to 120 g of biomagnetite. The particle size distribution was maintained between 10 and 15 nm during scale-up of this second step from 10 ml to 10 l, with conserved magnetic properties and surface reactivity; the latter demonstrated by the reduction of Cr(VI). The process presented provides an environmentally benign route to magnetite production and serves as an alternative to harsher synthetic techniques, with the clear potential to be used to produce kilogram to tonne quantities. PMID:25972437
  12. Scale-up of hydrophobin-assisted recombinant protein production in tobacco BY-2 suspension cells.

    PubMed

    Reuter, Lauri J; Bailey, Michael J; Joensuu, Jussi J; Ritala, Anneli

    2014-05-01

    Plant suspension cell cultures are emerging as an alternative to mammalian cells for production of complex recombinant proteins. Plant cell cultures provide low production cost, intrinsic safety and adherence to current regulations, but low yields and costly purification technology hinder their commercialization. Fungal hydrophobins have been utilized as fusion tags to improve yields and facilitate efficient low-cost purification by surfactant-based aqueous two-phase separation (ATPS) in plant, fungal and insect cells. In this work, we report the utilization of hydrophobin fusion technology in the tobacco bright yellow 2 (BY-2) suspension cell platform and the establishment of pilot-scale propagation and downstream processing, including first-step purification by ATPS. Green fluorescent protein-hydrophobin fusion (GFP-HFBI) induced the formation of protein bodies in tobacco suspension cells, thus encapsulating the fusion protein into discrete compartments. Cultivation of the BY-2 suspension cells was scaled up in standard stirred tank bioreactors up to 600 L production volume, with no apparent change in growth kinetics. Subsequently, ATPS was applied to selectively capture the GFP-HFBI product from crude cell lysate, resulting in threefold concentration, good purity and up to 60% recovery. The ATPS was scaled up to a 20 L volume without loss of efficiency. This study provides the first proof of concept for large-scale hydrophobin-assisted production of recombinant proteins in tobacco BY-2 cell suspensions. PMID:24341724

  13. Microbial electrolysis cell scale-up for combined wastewater treatment and hydrogen production.

    PubMed

    Gil-Carrera, L; Escapa, A; Mehta, P; Santoyo, G; Guiot, S R; Morán, A; Tartakovsky, B

    2013-02-01

    This study demonstrates microbial electrolysis cell (MEC) scale-up from a 50 mL to a 10 L cell. Initially, a 50 mL membraneless MEC with a gas diffusion cathode was operated on synthetic wastewater at different organic loads. It was concluded that process scale-up might be best accomplished using a "reactor-in-series" concept. Consequently, 855 mL and 10 L MECs were built and operated. By optimizing the hydraulic retention time (HRT) of the 855 mL MEC and individually controlling the applied voltages of three anodic compartments with a real-time optimization algorithm, a COD removal of 5.7 g per litre of reactor per day and a hydrogen production of 1.0-2.6 L per litre of reactor per day were achieved. Furthermore, a 10 L setup of two MECs in series was constructed and operated on municipal wastewater. This test showed a COD removal rate of 0.5 g per litre of reactor per day, a removal efficiency of 60-76%, and an energy consumption of 0.9 Wh per gram of COD removed. PMID:23334014
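
The energy figure of merit reported in such studies (Wh of electrical input per gram of COD removed) is a simple ratio of integrated electrical energy to COD mass. The sketch below shows the computation; the voltage, current, duration, and COD values are hypothetical numbers chosen to reproduce a 0.9 Wh/g result, not data from the paper.

```python
def energy_per_g_cod(applied_voltage_v, avg_current_a, hours, cod_removed_g):
    """Electrical energy input (V * I * t, in Wh) per gram of COD removed."""
    return applied_voltage_v * avg_current_a * hours / cod_removed_g

# Hypothetical run: 1.0 V applied, 0.75 A average current, 24 h,
# 20 g of COD removed over the run.
metric = energy_per_g_cod(1.0, 0.75, 24.0, 20.0)  # 0.9 Wh/g
```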

  14. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  15. Scale-up and advanced performance analysis of boiler combustion chambers

    SciTech Connect

    Richter, W.

    1985-12-31

    This paper discusses methods for evaluating the thermal performance of large boiler furnaces. Merits and limitations of pilot-scale testing and mathematical modeling are pointed out. Available computer models for furnace performance prediction are reviewed according to their classification into finite-difference methods and zone methods. Current state-of-the-art models for industrial application are predominantly zone methods based on advanced Monte Carlo techniques for calculating radiation heat transfer. A representative model of this type is described in more detail, together with examples of its practical application. It is also shown how pilot-scale results can be scaled up with the help of the model to predict the full-scale performance of particular boiler furnaces.
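
Zone methods need radiative exchange factors between surfaces, and Monte Carlo ray tracing estimates these statistically. As a minimal sketch of the idea (the geometry and sample count are illustrative assumptions, not from the paper), the view factor between two coaxial unit squares one unit apart is estimated by firing cosine-weighted rays from one plate and counting hits on the other; the tabulated analytic value for this configuration is about 0.20.

```python
import math, random

def view_factor_mc(n_rays=200_000, seed=42):
    """Monte-Carlo view factor from a unit square at z=0 to one at z=1."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rays):
        # Emission point uniformly on the lower plate [0,1] x [0,1].
        x, y = rng.random(), rng.random()
        # Cosine-weighted direction for diffuse emission.
        theta = math.asin(math.sqrt(rng.random()))
        phi = 2.0 * math.pi * rng.random()
        dx = math.sin(theta) * math.cos(phi)
        dy = math.sin(theta) * math.sin(phi)
        dz = math.cos(theta)
        # Intersect the ray with the plane z = 1 and test containment.
        t = 1.0 / dz
        if 0.0 <= x + t * dx <= 1.0 and 0.0 <= y + t * dy <= 1.0:
            hits += 1
    return hits / n_rays

f12 = view_factor_mc()
```

Production zone models trace rays through participating (absorbing, emitting) gas as well, but the surface-to-surface case above is the statistical core of the technique.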

  16. Scale-up of phosphate remobilization from sewage sludge in a microbial fuel cell.

    PubMed

    Happe, Manuel; Sugnaux, Marc; Cachelin, Christian Pierre; Stauffer, Marc; Zufferey, Géraldine; Kahoun, Thomas; Salamin, Paul-André; Egli, Thomas; Comninellis, Christos; Grogg, Alain-François; Fischer, Fabian

    2016-01-01

    Phosphate remobilization from digested sewage sludge containing iron phosphate was scaled up in a microbial fuel cell (MFC). A 3-litre triple-chambered MFC was constructed. This reactor was operated as a microbial fuel cell and later as a microbial electrolysis cell to accelerate cathodic phosphate remobilization. Applying an additional voltage, exceeding native MFC power, accelerated chemical base formation and the related phosphate remobilization rate. The electrolysis approach was extended using a platinum-RVC cathode. The pH rose to 12.6 and 67% of the phosphate was recovered in 26 h. This was significantly faster than under microbial fuel cell conditions. Shrinking-core modelling of particle-fluid kinetics showed that the reaction resistance has to move inside the sewage sludge particle for considerable rate enhancement. Remobilized phosphate was subsequently precipitated as struvite, and inductively coupled plasma mass spectrometry indicated low levels of cadmium, lead, and other metals, as required by law for recycling fertilizers. PMID:26519694
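
The shrinking-core model invoked above has standard closed-form conversion-time relations (in the Levenspiel formulation) for each controlling resistance. The sketch below evaluates two of them under an assumed time-for-complete-conversion τ; the numbers are illustrative, not fitted to the paper's sludge data.

```python
def t_reaction_control(X, tau):
    """Surface-reaction control: t/tau = 1 - (1 - X)**(1/3)."""
    return tau * (1.0 - (1.0 - X) ** (1.0 / 3.0))

def t_ash_diffusion_control(X, tau):
    """Product-layer diffusion control: t/tau = 1 - 3(1-X)**(2/3) + 2(1-X)."""
    return tau * (1.0 - 3.0 * (1.0 - X) ** (2.0 / 3.0) + 2.0 * (1.0 - X))

tau = 26.0  # h, hypothetical time for complete particle conversion
t67_rxn = t_reaction_control(0.67, tau)       # time to 67% conversion
t67_ash = t_ash_diffusion_control(0.67, tau)
```

Comparing which relation fits the measured conversion-time curve is how such modelling locates the controlling resistance, i.e. whether it sits at the particle surface or inside the particle.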

  17. Imaging techniques applied to the study of fluids in porous media

    SciTech Connect

    Tomutsa, L.; Brinkmeyer, A.; Doughty, D.

    1993-04-01

    A synergistic rock characterization methodology has been developed. It derives reservoir engineering parameters from X-ray tomography (CT) scanning, computer-assisted petrographic image analysis, minipermeameter measurements, and nuclear magnetic resonance imaging (NMRI). This rock characterization methodology is used to investigate the effect of small-scale rock heterogeneity on oil distribution and recovery. It is also used to investigate the applicability of imaging technologies to the development of scale-up procedures from core plug to whole core, by comparing the results of detailed simulations with the images of the fluid distributions observed by CT scanning. By using the detailed rock and fluid data generated by the imaging technology described, one can verify directly, in the laboratory, various scaling-up techniques. As an example, realizations of rock properties statistically and spatially compatible with the observed values are generated by one of the various stochastic methods available (turning bands) and are used as simulator input. The simulation results were compared with both the simulation results using the true rock properties and the fluid distributions observed by CT. Conclusions regarding the effect of the various permeability models on waterflood oil recovery were formulated.

  18. Microalgal biohydrogen production considering light energy and mixing time as the two key features for scale-up.

    PubMed

    Oncel, S; Sabankay, M

    2012-10-01

    This study focuses on a scale-up procedure considering two vital parameters for microalgae cultivation, light energy and mixing, taking Chlamydomonas reinhardtii as the model microorganism. Applying a two-stage hydrogen production protocol to 1 L flat-type and 2.5 L tank-type photobioreactors, hydrogen production was investigated at constant light energy and mixing time. The conditions providing the shortest transfer time to an anaerobic culture (light energy: 2.96 kJ s⁻¹ m⁻³; mixing time: 1 min) and the highest hydrogen production rate (light energy: 1.22 kJ s⁻¹ m⁻³; mixing time: 2.5 min) were applied to a 5 L photobioreactor. The final hydrogen production for the 5 L system after 192 h was measured as 195 ± 10 mL, which is comparable with the other systems and is a good validation of the scale-up procedure. PMID:22858490

  19. Scale Up of Pan Coating Process Using Quality by Design Principles.

    PubMed

    Agrawal, Anjali M; Pandey, Preetanshu

    2015-11-01

    Scale up of pan coating process is of high importance to the pharmaceutical and food industry. The number of process variables and their interdependence in a pan coating process can make it a rather complex scale-up problem. This review discusses breaking down the coating process variables into three main categories: pan-related, spray-related, and thermodynamic-related factors. A review on how to scale up each of these factors is presented via two distinct strategies--"macroscopic" and "microscopic" scale-up. In a Quality by Design paradigm, where an increased process understanding is required, there is increased emphasis on "microscopic" scale-up, which by definition ensures a more reproducible process and thereby robust scale-up. This article also reviews the various existing and new modeling and process analytical technology tools that can provide additional information to facilitate a more fundamental understanding of the coating process. PMID:26202540

  20. SCALE-UP OF ADVANCED HOT-GAS DESULFURIZATION SORBENTS

    SciTech Connect

    K. JOTHIMURUGESAN; S.K. GANGWAL

    1998-03-01

    The objective of this study was to develop advanced regenerable sorbents for hot-gas desulfurization in IGCC systems. The specific objective was to develop durable advanced sorbents that demonstrate a strong resistance to attrition and chemical deactivation, and high sulfidation activity at temperatures as low as 343 C (650 F). Twenty sorbents were synthesized in this work. Details of the preparation technique and the formulations are proprietary, pending a patent application, thus no details regarding the technique are divulged in this report. Sulfidations were conducted with a simulated gas containing (vol %) 10 H2, 15 CO, 5 CO2, 0.4-1 H2S, 15 H2O, and balance N2 in the temperature range of 343-538 C. Regenerations were conducted at temperatures in the range of 400-600 C with air-N2 mixtures. To prevent sulfation, catalyst additives that promote regeneration at lower temperatures were investigated. Characterization was performed for fresh, sulfided, and regenerated sorbents.

  1. What are the barriers to scaling up health interventions in low and middle income countries? A qualitative study of academic leaders in implementation science

    PubMed Central

    2012-01-01

    Background Most low and middle income countries (LMICs) are currently not on track to reach the health-related Millennium Development Goals (MDGs). One way to accelerate progress would be through the large-scale implementation of evidence-based health tools and interventions. This study aimed to: (a) explore the barriers that have impeded such scale-up in LMICs, and (b) lay out an “implementation research agenda”—a series of key research questions that need to be addressed in order to help overcome such barriers. Methods Interviews were conducted with fourteen key informants, all of whom are academic leaders in the field of implementation science, who were purposively selected for their expertise in scaling up in LMICs. Interviews were transcribed by hand and manually coded to look for emerging themes related to the two study aims. Barriers to scaling up, and unanswered research questions, were organized into six categories, representing different components of the scaling up process: attributes of the intervention; attributes of the implementers; scale-up approach; attributes of the adopting community; socio-political, fiscal, and cultural context; and research context. Results Factors impeding the success of scale-up that emerged from the key informant interviews, and which are areas for future investigation, include: complexity of the intervention and lack of technical consensus; limited human resource, leadership, management, and health systems capacity; poor application of proven diffusion techniques; lack of engagement of local implementers and of the adopting community; and inadequate integration of research into scale-up efforts. Conclusions Key steps in expanding the evidence base on implementation in LMICs include studying how to: simplify interventions; train “scale-up leaders” and health workers dedicated to scale-up; reach and engage communities; match the best delivery strategy to the specific health problem and context; and raise the low

  2. Scaling up watershed model parameters--Flow and load simulations of the Edisto River Basin

    USGS Publications Warehouse

    Feaster, Toby D.; Benedict, Stephen T.; Clark, Jimmy M.; Bradley, Paul M.; Conrads, Paul A.

    2014-01-01

    The Edisto River is the longest and largest river system completely contained in South Carolina and is one of the longest free flowing blackwater rivers in the United States. The Edisto River basin also has fish-tissue mercury concentrations that are some of the highest recorded in the United States. As part of an effort by the U.S. Geological Survey to expand the understanding of relations among hydrologic, geochemical, and ecological processes that affect fish-tissue mercury concentrations within the Edisto River basin, analyses and simulations of the hydrology of the Edisto River basin were made with the topography-based hydrological model (TOPMODEL). The potential for scaling up a previous application of TOPMODEL for the McTier Creek watershed, which is a small headwater catchment to the Edisto River basin, was assessed. Scaling up was done in a step-wise process beginning with applying the calibration parameters, meteorological data, and topographic wetness index data from the McTier Creek TOPMODEL to the Edisto River TOPMODEL. Additional changes were made with subsequent simulations culminating in the best simulation, which included meteorological and topographic wetness index data from the Edisto River basin and updated calibration parameters for some of the TOPMODEL calibration parameters. Comparison of goodness-of-fit statistics between measured and simulated daily mean streamflow for the two models showed that with calibration, the Edisto River TOPMODEL produced slightly better results than the McTier Creek model, despite the significant difference in the drainage-area size at the outlet locations for the two models (30.7 and 2,725 square miles, respectively). Along with the TOPMODEL hydrologic simulations, a visualization tool (the Edisto River Data Viewer) was developed to help assess trends and influencing variables in the stream ecosystem. Incorporated into the visualization tool were the water-quality load models TOPLOAD, TOPLOAD-H, and LOADEST
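
The topographic wetness index data transferred between the two TOPMODEL applications is the model's core spatial input, TWI = ln(a / tan β), where a is the upslope contributing area per unit contour length and β the local slope. A minimal sketch with hypothetical cell values (not Edisto basin data):

```python
import math

def twi(upslope_area_per_contour_m, slope_radians):
    """Topographic wetness index: ln(a / tan(beta))."""
    return math.log(upslope_area_per_contour_m / math.tan(slope_radians))

# A flat valley-bottom cell (large contributing area, gentle slope)
# scores higher -- i.e. is predicted wetter -- than a steep ridge cell.
valley = twi(500.0, math.radians(1.0))
ridge = twi(20.0, math.radians(20.0))
```

Because the index enters TOPMODEL only through its basin-wide distribution, swapping in the receiving basin's TWI data, as done in the step-wise scale-up described above, is a natural first step.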

  3. Scaling up parallel scientific applications on the IBM SP

    SciTech Connect

    Skinner, David

    2004-01-08

    This document provides a technical description of the IBM SP seaborg.nersc.gov, with an emphasis on how the system performs as a production environment for large-scale scientific applications. The overall goal is to provide developers with the information they need to write and run such applications. While some of the information presented here may be applicable in a larger context, the focus is on experiences and scaling techniques specific to seaborg.nersc.gov. In the first few sections we seek to determine how well the theoretical capabilities of this machine may be realized in practice. Most of the measured performance numbers come from small code microkernels or test codes rather than from real user codes. In a companion document several real applications are explored in detail. The microkernel approach has value to the user since the most widely quoted performance numbers often reflect theoretical (peak) performance rather than what is realized in practice. Likewise, anecdotes from full-blown applications often involve mixed algorithms, hidden constraints, or other specifics which make the results hard to generalize. Microkernels and test codes provide a middle ground: a reasonable best-case scenario for real user codes, and a small kernel of code that can serve as a template for writing or modifying more substantial codes.
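
In the microkernel spirit the report describes, the sketch below is a toy memory-bandwidth probe: time a STREAM-style copy and divide bytes moved by elapsed time. It is written in Python for self-containment (a real HPC microkernel would be C or Fortran), and the array size is an arbitrary assumption.

```python
import time
from array import array

def copy_bandwidth(n=2_000_000):
    """Time a bulk copy of n doubles and return effective GB/s."""
    a = array("d", [1.0]) * n
    b = array("d", [0.0]) * n
    t0 = time.perf_counter()
    b[:] = a          # copy kernel: reads n doubles, writes n doubles
    dt = time.perf_counter() - t0
    bytes_moved = 2 * 8 * n   # 8 bytes per double, read + write
    return bytes_moved / dt / 1e9

gbps = copy_bandwidth()
```

The value of such a probe is the gap it exposes between the vendor's peak bandwidth figure and what a simple, realistic access pattern actually achieves.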

  4. Scaling Up Nature: Large Area Flexible Biomimetic Surfaces.

    PubMed

    Li, Yinyong; John, Jacob; Kolewe, Kristopher W; Schiffman, Jessica D; Carter, Kenneth R

    2015-10-28

    The fabrication and advanced function of large-area biomimetic superhydrophobic surfaces (SHS) and slippery lubricant-infused porous surfaces (SLIPS) are reported. The use of roll-to-roll nanoimprinting techniques enabled the continuous fabrication of SHS and SLIPS based on hierarchically wrinkled surfaces. Perfluoropolyether hybrid molds were used as flexible molds for roll-to-roll imprinting into a newly designed thiol-ene based photopolymer resin coated on flexible polyethylene terephthalate films. The patterned surfaces exhibit superhydrophobicity, with a water contact angle around 160°, without any further surface modification. The SHS can be easily converted into SLIPS by roll-to-roll coating with a fluorinated lubricant, and these surfaces show outstanding repellence to a variety of liquids. Furthermore, both SHS and SLIPS display antibiofouling properties when challenged with Escherichia coli K12 MG1655. The current article describes the transformation of artificial biomimetic structures from small, lab-scale coupons to low-cost, large-area platforms. PMID:26423494

  5. Scale-up of catalytic wet oxidation under moderate conditions

    SciTech Connect

    Harf, J.; Hug, A.; Vogel, F.; Rohr, P.R. von

    1999-05-01

    Catalytic Wet Oxidation with pure oxygen is a suitable treatment process for the degradation of organic matter in wastewaters and sludges. The moderate reaction conditions applied lead only to partial oxidation of the organics; the resulting process water therefore has to be purified in a biological treatment plant. In this study, experimental data collected during the wet oxidation of phenol and sewage sludge in a laboratory batch reactor as well as in a pilot plant are presented. A generalized kinetic model combined with a residence time analysis allows accurate prediction of the degradation of organic matter in the pilot plant. The wet oxidation of wastewaters and sewage sludge was realized in a single plant concept. Treating suspended or diluted organic wastes produces a highly biodegradable process water containing low-molecular-weight oxidation products. The investigated Catalytic Wet Oxidation of sewage sludge generates a residual solid complying with the European quality standards for disposal concerning leachability and organic content. Due to its low capital and operating costs, the Catalytic Wet Oxidation process constitutes an acceptable alternative to incineration for the disposal of sludges.

  6. Soil vapor extraction system design scale-up considerations

    SciTech Connect

    Peterson, E.F.; Battey, R.F.

    1997-12-31

    Tried and true design considerations need to be reexamined when designing and implementing a 10,000 scfm soil vapor extraction system. Soil vapor extraction systems have typically been applied at many sites on a fairly small scale, involving air flows of several hundred to a thousand cubic feet per minute. Systems of 10,000 scfm are rarely encountered and entail some unique design considerations. This paper describes the technology options, equipment availability, and other design considerations for a 10,000 scfm system (installed at a former aircraft maintenance facility in Southern California). During the design, low pressure centrifugal fans, higher pressure centrifugal blowers, regenerative blowers and positive-displacement blowers are considered as exhausters. Several technologies are considered for treatment of the extracted air to reduce volatile organic compound (VOC) content: granular activated carbon adsorption, resin adsorption, thermal oxidation and catalytic oxidation. Cost and efficiency criteria are evaluated for the final selection of process equipment at this site. The choice of technology for reduction of VOCs is strongly dependent upon the estimate of recoverable VOCs initially in the soil, the cost of activated carbon replacement and reactivation service, and the cost of fuel. The choice of exhauster is most strongly influenced by the vacuum required at the vapor extraction wells to efficiently move air through the soil matrix and the treatment equipment. The choice of type of exhauster is also limited by the 10,000 scfm air flow rate.

  7. Scale-up of miscible flood processes. Annual report

    SciTech Connect

    Orr, F.M. Jr.

    1992-05-01

    Results of a wide-ranging investigation of the scaling of the physical mechanisms of miscible floods are reported. Advanced techniques for analysis of crude oils are considered in Chapter 2. Application of supercritical fluid chromatography is demonstrated for characterization of crude oils for equation-of-state calculations of phase equilibrium. Results of measurements of crude oil and phase compositions by gas chromatography and mass spectrometry are also reported. The theory of development of miscibility is considered in detail in Chapter 3. The theory is extended to four components, and sample solutions for a variety of gas injection systems are presented. The analytical theory shows that miscibility can develop even though standard tie-line extension criteria developed for ternary systems are not satisfied. In addition, the theory includes the first analytical solutions for condensing/vaporizing gas drives. In Chapter 4, methods for simulation of viscous fingering are considered. The scaling of the growth of transition zones in linear viscous fingering is considered. In addition, extension of the models developed previously to three dimensions is described, as is the inclusion of effects of equilibrium phase behavior. In Chapter 5, the combined effects of capillary and gravity-driven crossflow are considered. The experimental results presented show that very high recovery can be achieved by gravity segregation when interfacial tensions are moderately low. We argue that such crossflow mechanisms are important in multicontact miscible floods in heterogeneous reservoirs. In addition, results of flow visualization experiments are presented that illustrate the interplay of crossflow driven by gravity with that driven by viscous forces.

  8. EFFECT OF PRELOADING ON THE SCALE-UP OF GAC MICRO- COLUMNS

    EPA Science Inventory

    A previously presented microcolumn scale-up procedure is evaluated. Scale-up assumptions that involve equal capacities in microcolumns and field columns are studied in an effort to determine whether preloading activated carbon with a natural water significantly reduces the carbo...

  9. 78 FR 25977 - Applications for New Awards; Investing in Innovation Fund, Scale-up Grants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-03

    ... Applications for New Awards; Investing in Innovation Fund, Scale- up Grants AGENCY: Office of Innovation and Improvement, Department of Education. ACTION: Notice. Overview Information Investing in Innovation Fund, Scale... Domestic Assistance (CFDA) Number: 84.411A (Scale-up grants). DATES: Applications Available: May 6,...

  10. What Does It Take to Scale Up and Sustain Evidence-Based Practices?

    ERIC Educational Resources Information Center

    Klingner, Janette K.; Boardman, Alison G.; Mcmaster, Kristen L.

    2013-01-01

    This article discusses the strategic scaling up of evidence-based practices. The authors draw from the scholarly work of fellow special education researchers and from the field of learning sciences. The article defines scaling up as the process by which researchers or educators initially implement interventions on a small scale, validate them, and…

  11. Fundamental Issues Concerning the Sustainment and Scaling Up of Professional Development Programs

    ERIC Educational Resources Information Center

    Tirosh, Dina; Tsamir, Pessia; Levenson, Esther

    2015-01-01

    The issue of sustaining and scaling up professional development for mathematics teachers raises several fundamental issues for researchers. This commentary addresses various definitions for sustainability and scaling up and how these definitions may affect the design of programs as well as the design of research. We consider four of the papers in…

  12. 'Scaling-up is a craft not a science': Catalysing scale-up of health innovations in Ethiopia, India and Nigeria.

    PubMed

    Spicer, Neil; Bhattacharya, Dipankar; Dimka, Ritgak; Fanta, Feleke; Mangham-Jefferies, Lindsay; Schellenberg, Joanna; Tamire-Woldemariam, Addis; Walt, Gill; Wickremasinghe, Deepthi

    2014-11-01

    Donors and other development partners commonly introduce innovative practices and technologies to improve health in low and middle income countries. Yet many innovations that are effective in improving health and survival are slow to be translated into policy and implemented at scale. Understanding the factors influencing scale-up is important. We conducted a qualitative study involving 150 semi-structured interviews with government, development partners, civil society organisations and externally funded implementers, professional associations and academic institutions in 2012/13 to explore scale-up of innovative interventions targeting mothers and newborns in Ethiopia, the Indian state of Uttar Pradesh and the six states of northeast Nigeria, which are settings with high burdens of maternal and neonatal mortality. Interviews were analysed using a common analytic framework developed for cross-country comparison and themes were coded using Nvivo. We found that programme implementers across the three settings require multiple steps to catalyse scale-up. Advocating for government to adopt and finance health innovations requires: designing scalable innovations; embedding scale-up in programme design and allocating time and resources; building implementer capacity to catalyse scale-up; adopting effective approaches to advocacy; presenting strong evidence to support government decision making; involving government in programme design; invoking policy champions and networks; strengthening harmonisation among external programmes; aligning innovations with health systems and priorities. Other steps include: supporting government to develop policies and programmes and strengthening health systems and staff; promoting community uptake by involving media, community leaders, mobilisation teams and role models. We conclude that scale-up has no magic bullet solution - implementers must embrace multiple activities, and require substantial support from donors and governments in

  13. Fed-batch bioreactor process scale-up from 3-L to 2,500-L scale for monoclonal antibody production from cell culture.

    PubMed

    Yang, Jeng-Dar; Lu, Canghai; Stasny, Brad; Henley, Joseph; Guinto, Woodrow; Gonzalez, Carlos; Gleason, Joseph; Fung, Monica; Collopy, Brett; Benjamino, Michael; Gangi, Jennifer; Hanson, Melissa; Ille, Elisabeth

    2007-09-01

    This case study focuses on the scale-up of an Sp2/0 mouse myeloma cell line-based fed-batch bioreactor process, from the initial 3-L bench scale to the 2,500-L scale. A stepwise scale-up strategy that involved several intermediate steps in increasing the bioreactor volume was adopted to minimize the risks associated with scale-up processes. Careful selection of several available mixing models from the literature, and appropriate application of the calculated results to our settings, resulted in successful scale-up of the agitation speed for the large bioreactors. Consideration was also given to scale-up of the nutrient feeding, inoculation, and the set-points of operational parameters such as temperature, pH, dissolved oxygen, dissolved carbon dioxide, and aeration in an integrated manner. Qualitative and quantitative side-by-side comparisons of bioreactor performance, as well as a panel of biochemical characterization tests, demonstrated that the comparability of the process and the product was well controlled and maintained during the process scale-up. The 2,500-L process is currently in use for the routine clinical production of Epratuzumab in support of two global Phase III clinical trials in patients with lupus. Today, the 2,500-L fed-batch production process for Epratuzumab has met all scheduled batch releases, and the quality of the antibody is consistent and reproducible, meeting all specifications, thus confirming the robustness of the process. PMID:17657776
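    The agitation scale-up described above is often approached with a constant power-per-volume criterion. A minimal sketch, assuming turbulent mixing (P ~ Np*rho*N^3*D^5, V ~ D^3) and illustrative vessel dimensions; this is one generic rule from the mixing literature, not necessarily the model the study selected:

```python
# Sketch: impeller speed scale-up at constant power per volume (P/V).
# In the turbulent regime P/V ~ N^3 * D^2, so holding P/V fixed gives
# N2 = N1 * (D1/D2)^(2/3).

def speed_constant_pv(n1_rpm: float, d1_m: float, d2_m: float) -> float:
    """Impeller speed at the large scale that keeps P/V constant."""
    return n1_rpm * (d1_m / d2_m) ** (2.0 / 3.0)

# Hypothetical numbers: a 3-L vessel with a 0.06 m impeller at 200 rpm,
# scaled to a 2,500-L vessel with a 0.55 m impeller (~46 rpm).
n2 = speed_constant_pv(200.0, 0.06, 0.55)
```

    Other criteria (constant tip speed, constant blend time) give different exponents, which is why the abstract stresses careful selection among mixing models.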

  14. Optimization techniques applied to passive measures for in-orbit spacecraft survivability

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.; Price, D. Marvin

    1987-01-01

    Optimization Techniques Applied to Passive Measures for In-Orbit Spacecraft Survivability is a six-month study designed to evaluate the effectiveness of the geometric programming (GP) optimization technique in determining the optimal design of a meteoroid and space debris protection system for the Space Station Core Module configuration. Geometric programming was found to be superior to other methods in that it provided maximum protection from impacts at the lowest weight and cost.
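    The core idea of geometric programming is that a posynomial objective becomes convex after the substitution t = exp(x). A minimal sketch on a toy trade-off (weight grows with shield thickness while penetration risk falls with it); the objective and coefficients are illustrative, not the study's shield model:

```python
import math

# Toy posynomial f(t) = c1*t + c2/t, minimized via the GP log-substitution
# t = exp(x): the transformed objective c1*e^x + c2*e^-x is convex in x,
# so plain gradient descent finds the global optimum sqrt(c2/c1).

def gp_minimize(c1=1.0, c2=1.0, x=0.5, lr=0.1, steps=2000):
    for _ in range(steps):
        t = math.exp(x)
        grad = c1 * t - c2 / t      # d/dx [c1*e^x + c2*e^-x]
        x -= lr * grad
    return math.exp(x)

t_opt = gp_minimize()               # analytic optimum is sqrt(c2/c1) = 1.0
```

    Real GP solvers handle many variables and posynomial constraints, but the log-convexity exploited here is the same.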

  15. Development of Promising Insulating Oil and Applied Techniques of EHD, ER·MR

    NASA Astrophysics Data System (ADS)

    Hanaoka, Ryoichi

    The development of environment-friendly insulating liquids has attracted attention for the design of oil-filled power apparatus such as transformers, from the viewpoint of environmental protection. Dielectric liquids can also be applied widely in fields involving electromagnetic phenomena. This article introduces recent trends in promising new vegetable-based oils for electrical insulation, and in EHD pumping, ER fluids and MR fluids as applied techniques of dielectric liquids.

  16. At-line process analytical technology (PAT) for more efficient scale up of biopharmaceutical microfiltration unit operations.

    PubMed

    Watson, Douglas S; Kerchner, Kristi R; Gant, Sean S; Pedersen, Joseph W; Hamburger, James B; Ortigosa, Allison D; Potgieter, Thomas I

    2016-01-01

    Tangential flow microfiltration (MF) is a cost-effective and robust bioprocess separation technique, but successful full scale implementation is hindered by the empirical, trial-and-error nature of scale-up. We present an integrated approach leveraging at-line process analytical technology (PAT) and mass balance based modeling to de-risk MF scale-up. Chromatography-based PAT was employed to improve the consistency of an MF step that had been a bottleneck in the process used to manufacture a therapeutic protein. A 10-min reverse phase ultra high performance liquid chromatography (RP-UPLC) assay was developed to provide at-line monitoring of protein concentration. The method was successfully validated and method performance was comparable to previously validated methods. The PAT tool revealed areas of divergence from a mass balance-based model, highlighting specific opportunities for process improvement. Adjustment of appropriate process controls led to improved operability and significantly increased yield, providing a successful example of PAT deployment in the downstream purification of a therapeutic protein. The general approach presented here should be broadly applicable to reduce risk during scale-up of filtration processes and should be suitable for feed-forward and feed-back process control. © 2015 American Institute of Chemical Engineers Biotechnol. Prog., 32:108-115, 2016. PMID:26519135
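    The mass-balance model the PAT measurements are compared against can be sketched with standard filtration bookkeeping. Function names, the sieving-coefficient form, and all numbers are assumptions for illustration, not the authors' model:

```python
import math

# Mass-balance sketch for a tangential-flow filtration step, for checking
# at-line concentration readings against expectation.

def retentate_conc(c0, v0, v, sieving=0.0):
    """Batch concentration: C = C0 * (V0/V)^(1-S) for sieving coefficient S
    (S=0: fully retained species, S=1: freely passing species)."""
    return c0 * (v0 / v) ** (1.0 - sieving)

def diafiltration_conc(c0, diavolumes, sieving=1.0):
    """Constant-volume diafiltration washout: C = C0 * exp(-S * N)
    after N diavolumes."""
    return c0 * math.exp(-sieving * diavolumes)

# A fully retained protein concentrated 2x, then a freely passing impurity
# washed out over one diavolume.
protein = retentate_conc(5.0, 10.0, 5.0)          # g/L, doubles to 10.0
impurity = diafiltration_conc(1.0, 1.0)           # falls to exp(-1) of start
```

    Divergence of an at-line assay from curves like these is what flags fouling or unexpected sieving during scale-up.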

  17. The technique of linear prediction filters applied to studies of solar wind-magnetosphere coupling

    NASA Technical Reports Server (NTRS)

    Clauer, C. Robert

    1986-01-01

    Linear prediction filtering is a powerful empirical technique suitable for the study of stimulus-response behavior. The technique enables one to determine the most general linear relationship between multiple time-varying quantities, assuming that the physical systems relating the quantities are linear and time invariant. Several researchers have applied linear prediction analysis to investigate solar wind-magnetosphere interactions. This short review describes the method of linear prediction analysis, its application to solar wind-magnetosphere coupling studies in terms of physical processes, and the results of investigations that have used this technique.
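    The linear, time-invariant relationship described above amounts to fitting an impulse-response filter by least squares. A two-tap pure-Python sketch with synthetic data; real studies use many lags and measured solar wind and geomagnetic time series:

```python
# Estimate the filter h relating a driver x (e.g. a solar wind coupling
# parameter) to a response y (e.g. a geomagnetic index) by least squares.

def fit_two_tap(x, y):
    """Solve min_h sum_t (y[t] - h0*x[t] - h1*x[t-1])^2 via the 2x2
    normal equations."""
    s00 = s01 = s11 = b0 = b1 = 0.0
    for t in range(1, len(x)):
        s00 += x[t] * x[t]
        s01 += x[t] * x[t - 1]
        s11 += x[t - 1] * x[t - 1]
        b0 += x[t] * y[t]
        b1 += x[t - 1] * y[t]
    det = s00 * s11 - s01 * s01
    return ((s11 * b0 - s01 * b1) / det, (s00 * b1 - s01 * b0) / det)

# Synthetic check: the response is exactly 0.5*x[t] + 0.3*x[t-1].
x = [0.2, 1.0, -0.7, 0.4, 1.3, -0.2, 0.8, -1.1, 0.5, 0.9]
y = [0.0] + [0.5 * x[t] + 0.3 * x[t - 1] for t in range(1, len(x))]
h0, h1 = fit_two_tap(x, y)    # recovers the true taps (0.5, 0.3)
```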

  18. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    NASA Astrophysics Data System (ADS)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques are used to analyse an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skill development focused on includes: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions, to name a few.

  19. Recreation in a Zoo Environment: Applying Animal Behavior Research Techniques to Understand How Visitors Allocate Time.

    ERIC Educational Resources Information Center

    Harris, Lisa

    1995-01-01

    A focal-animal sampling technique was applied to measure and quantify visitor behavior at an enclosed hummingbird aviary. The amount of time visitors stayed within the aviary and how they allocated that time were measured. Results can be used by exhibit designers to create and modify museum exhibits. (LZ)
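    The time-allocation arithmetic behind focal sampling is a simple time budget over timestamped state changes. A sketch with illustrative behavior labels (the actual coding scheme of the study is not given in the record):

```python
# Given one visitor's timestamped state changes, compute the fraction of
# the visit spent in each activity.

def time_budget(events, end_time):
    """events: chronological list of (timestamp, state); returns
    {state: fraction of total observation time}."""
    totals = {}
    next_times = [t for t, _ in events[1:]] + [end_time]
    for (t0, state), t1 in zip(events, next_times):
        totals[state] = totals.get(state, 0.0) + (t1 - t0)
    span = end_time - events[0][0]
    return {s: d / span for s, d in totals.items()}

# Hypothetical 6-minute visit, timestamps in seconds.
visit = [(0, "viewing"), (120, "reading_sign"), (150, "viewing"), (300, "walking")]
budget = time_budget(visit, 360)    # viewing: 270/360 = 0.75
```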

  20. Scaling-up Process-Oriented Guided Inquiry Learning Techniques for Teaching Large Information Systems Courses

    ERIC Educational Resources Information Center

    Trevathan, Jarrod; Myers, Trina; Gray, Heather

    2014-01-01

    Promoting engagement during lectures becomes significantly more challenging as class sizes increase. Therefore, lecturers need to experiment with new teaching methodologies to embolden deep learning outcomes and to develop interpersonal skills amongst students. Process Oriented Guided Inquiry Learning is a teaching approach that uses highly…

  1. Methodologies Used for Scaling-up From a Single Energy Production Unit to State Energy Sector

    NASA Astrophysics Data System (ADS)

    Cimdina, Ginta; Timma, Lelde; Veidenbergs, Ivars; Blumberga, Dagnija

    2015-12-01

    In a well-functioning and sustainable national energy sector, each of its elements should function with maximum efficiency. To ensure maximum efficiency and study possible improvements of the sector, a scaling-up framework is presented in this work. In this framework the starting point is a CHP unit and its operation; the next level of aggregation is the district heating network, followed by a municipal energy plan and finally a low carbon strategy. The authors argue that successful, innovative practices developed and tested at a lower level of aggregation can then be transferred to the upper levels, leading to a scaling-up effect of innovative practices. The work summarizes 12 methodologies used in the energy sector, dividing them among the levels of aggregation in the scaling-up framework.

  2. A Route to Scale Up DNA Origami Using DNA Tiles as Folding Staples

    SciTech Connect

    Zhao, Zhao; Yan, Hao; Liu, Yan

    2010-01-26

    A new strategy is presented to scale up DNA origami using multi-helical DNA tiles as folding staples. Atomic force microscopy images demonstrate the two-dimensional structures formed by using this strategy.

  3. Scaling Up Graph-Based Semisupervised Learning via Prototype Vector Machines

    PubMed Central

    Zhang, Kai; Lan, Liang; Kwok, James T.; Vucetic, Slobodan; Parvin, Bahram

    2014-01-01

    When the amount of labeled data is limited, semi-supervised learning can improve the learner's performance by also using the often easily available unlabeled data. In particular, a popular approach requires the learned function to be smooth on the underlying data manifold. By approximating this manifold as a weighted graph, such graph-based techniques can often achieve state-of-the-art performance. However, their high time and space complexities make them less attractive on large data sets. In this paper, we propose to scale up graph-based semisupervised learning using a set of sparse prototypes derived from the data. These prototypes serve as a small set of data representatives, which can be used to approximate the graph-based regularizer and to control model complexity. Consequently, both training and testing become much more efficient. Moreover, when the Gaussian kernel is used to define the graph affinity, a simple and principled method to select the prototypes can be obtained. Experiments on a number of real-world data sets demonstrate encouraging performance and scaling properties of the proposed approach. It also compares favorably with models learned via ℓ1-regularization at the same level of model sparsity. These results demonstrate the efficacy of the proposed approach in producing highly parsimonious and accurate models for semisupervised learning. PMID:25720002

  4. Scaling up graph-based semisupervised learning via prototype vector machines.

    PubMed

    Zhang, Kai; Lan, Liang; Kwok, James T; Vucetic, Slobodan; Parvin, Bahram

    2015-03-01

    When the amount of labeled data is limited, semisupervised learning can improve the learner's performance by also using the often easily available unlabeled data. In particular, a popular approach requires the learned function to be smooth on the underlying data manifold. By approximating this manifold as a weighted graph, such graph-based techniques can often achieve state-of-the-art performance. However, their high time and space complexities make them less attractive on large data sets. In this paper, we propose to scale up graph-based semisupervised learning using a set of sparse prototypes derived from the data. These prototypes serve as a small set of data representatives, which can be used to approximate the graph-based regularizer and to control model complexity. Consequently, both training and testing become much more efficient. Moreover, when the Gaussian kernel is used to define the graph affinity, a simple and principled method to select the prototypes can be obtained. Experiments on a number of real-world data sets demonstrate encouraging performance and scaling properties of the proposed approach. It also compares favorably with models learned via l1 -regularization at the same level of model sparsity. These results demonstrate the efficacy of the proposed approach in producing highly parsimonious and accurate models for semisupervised learning. PMID:25720002
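    The graph-smoothness assumption the prototype method accelerates can be shown with plain label propagation on a Gaussian-affinity graph. A toy 1-D sketch of the exact (non-prototype) version; the paper's contribution is approximating this regularizer with a few prototype representatives so it scales to large data:

```python
import math

# Clamped label propagation: labeled points keep their labels, unlabeled
# points repeatedly take the affinity-weighted average of their neighbors.

def label_propagation(points, labels, sigma=0.5, iters=200):
    """labels: dict {index: +1.0/-1.0}; returns propagated soft labels."""
    n = len(points)
    w = [[math.exp(-((points[i] - points[j]) ** 2) / (2 * sigma ** 2))
          for j in range(n)] for i in range(n)]
    f = [labels.get(i, 0.0) for i in range(n)]
    for _ in range(iters):
        g = []
        for i in range(n):
            if i in labels:            # clamp labeled points
                g.append(labels[i])
            else:                      # affinity-weighted neighbor average
                s = sum(w[i][j] * f[j] for j in range(n) if j != i)
                z = sum(w[i][j] for j in range(n) if j != i)
                g.append(s / z)
        f = g
    return f

# Two clusters; one labeled point in each is enough to label the rest.
pts = [0.0, 0.1, 0.2, 1.0, 1.1, 1.2]
soft = label_propagation(pts, {0: 1.0, 5: -1.0})
```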

  5. Scaling up and error analysis of transpiration for Populus euphratica in a desert riparian forest

    NASA Astrophysics Data System (ADS)

    Si, J.; Li, W.; Feng, Q.

    2013-12-01

    Water consumption of the forest stand is a key input for regional water resources management. However, water consumption is usually measured on a limited number of sample trees, so an important issue is how to scale up from a series of sample trees to the entire stand. Estimation of sap flow flux density (Fd) and stand sapwood area (AS-stand) are among the most critical factors for determining forest stand transpiration from sap flow measurements. Estimating Fd is sensitive to the various links in the sap flow measurement chain, and estimating AS-stand requires an appropriate indirect technique for obtaining the sapwood area of each tree (AS-tree), because it is impossible to measure AS-tree for all trees in a stand. In this study, Fd was measured in 2 mature P. euphratica trees at two radial depths, 0~10 and 10~30 mm, using sap flow sensors based on the heat ratio method, and a model relating AS-tree to stem diameter (DBH) and a growth model of AS-tree were established using survey data on DBH, tree age and AS-tree. The results show that transpiration can be scaled up from sample trees to the entire stand using AS-tree and Fd; however, stand transpiration (E) is overestimated by 12.6% if only the 0~10 mm Fd is used, and underestimated by 25.3% if only the 10~30 mm Fd is used, implying that major uncertainties in mean stand Fd estimates are caused by radial variation in Fd. E is also markedly overestimated when AS-stand is held constant, implying that simulating daily changes in AS-stand is key to improving prediction accuracy. The potential errors in transpiration stabilised at a sample size of approximately 30 or more trees for P. euphratica, suggesting that at least 30 trees should be sampled when building an allometric equation.
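    The scaling-up arithmetic is: stand transpiration = total sapwood area (from an allometric fit against DBH) times a radially weighted mean sap flux density. A sketch with illustrative allometric coefficients, radial weights and flux values, not the fitted numbers from the study:

```python
# Stand transpiration from per-tree allometry and radially weighted Fd.
# Units are left arbitrary in this sketch.

def sapwood_area_cm2(dbh_cm, a=0.45, b=1.9):
    """Assumed allometric form As = a * DBH^b."""
    return a * dbh_cm ** b

def stand_transpiration(dbh_list, fd_outer, fd_inner, outer_frac=0.4):
    """Weight Fd measured at the outer (0-10 mm) and inner (10-30 mm)
    depths by their assumed share of sapwood area; using either depth
    alone biases the estimate, as the abstract reports."""
    fd_mean = outer_frac * fd_outer + (1 - outer_frac) * fd_inner
    return sum(sapwood_area_cm2(d) for d in dbh_list) * fd_mean

trees = [18.0, 22.5, 30.0, 25.0]                       # DBH of sampled trees, cm
e_weighted = stand_transpiration(trees, fd_outer=30.0, fd_inner=18.0)
e_outer_only = stand_transpiration(trees, 30.0, 30.0)  # overestimates E
```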

  6. Metabolic Profiling of Geobacter sulfurreducens during Industrial Bioprocess Scale-Up

    PubMed Central

    Muhamadali, Howbeer; Xu, Yun; Ellis, David I.; Allwood, J. William; Rattray, Nicholas J. W.; Correa, Elon; Alrabiah, Haitham

    2015-01-01

    During the industrial scale-up of bioprocesses it is important to establish that the biological system has not changed significantly when moving from small laboratory-scale shake flasks or culturing bottles to an industrially relevant production level. Therefore, during upscaling of biomass production for a range of metal transformations, including the production of biogenic magnetite nanoparticles by Geobacter sulfurreducens, from 100-ml bench-scale to 5-liter fermentors, we applied Fourier transform infrared (FTIR) spectroscopy as a metabolic fingerprinting approach followed by the analysis of bacterial cell extracts by gas chromatography-mass spectrometry (GC-MS) for metabolic profiling. FTIR results clearly differentiated between the phenotypic changes associated with different growth phases as well as the two culturing conditions. Furthermore, the clustering patterns displayed by multivariate analysis were in agreement with the turbidimetric measurements, which displayed an extended lag phase for cells grown in a 5-liter bioreactor (24 h) compared to those grown in 100-ml serum bottles (6 h). GC-MS analysis of the cell extracts demonstrated an overall accumulation of fumarate during the lag phase under both culturing conditions, coinciding with the detected concentrations of oxaloacetate, pyruvate, nicotinamide, and glycerol-3-phosphate being at their lowest levels compared to other growth phases. These metabolites were overlaid onto a metabolic network of G. sulfurreducens, and taking into account the levels of these metabolites throughout the fermentation process, the limited availability of oxaloacetate and nicotinamide would seem to be the main metabolic bottleneck resulting from this scale-up process. Additional metabolite-feeding experiments were carried out to validate the above hypothesis. Nicotinamide supplementation (1 mM) did not display any significant effects on the lag phase of G. sulfurreducens cells grown in the 100-ml serum bottles. However

  7. Metabolic Profiling of Geobacter sulfurreducens during Industrial Bioprocess Scale-Up.

    PubMed

    Muhamadali, Howbeer; Xu, Yun; Ellis, David I; Allwood, J William; Rattray, Nicholas J W; Correa, Elon; Alrabiah, Haitham; Lloyd, Jonathan R; Goodacre, Royston

    2015-05-15

    During the industrial scale-up of bioprocesses it is important to establish that the biological system has not changed significantly when moving from small laboratory-scale shake flasks or culturing bottles to an industrially relevant production level. Therefore, during upscaling of biomass production for a range of metal transformations, including the production of biogenic magnetite nanoparticles by Geobacter sulfurreducens, from 100-ml bench-scale to 5-liter fermentors, we applied Fourier transform infrared (FTIR) spectroscopy as a metabolic fingerprinting approach followed by the analysis of bacterial cell extracts by gas chromatography-mass spectrometry (GC-MS) for metabolic profiling. FTIR results clearly differentiated between the phenotypic changes associated with different growth phases as well as the two culturing conditions. Furthermore, the clustering patterns displayed by multivariate analysis were in agreement with the turbidimetric measurements, which displayed an extended lag phase for cells grown in a 5-liter bioreactor (24 h) compared to those grown in 100-ml serum bottles (6 h). GC-MS analysis of the cell extracts demonstrated an overall accumulation of fumarate during the lag phase under both culturing conditions, coinciding with the detected concentrations of oxaloacetate, pyruvate, nicotinamide, and glycerol-3-phosphate being at their lowest levels compared to other growth phases. These metabolites were overlaid onto a metabolic network of G. sulfurreducens, and taking into account the levels of these metabolites throughout the fermentation process, the limited availability of oxaloacetate and nicotinamide would seem to be the main metabolic bottleneck resulting from this scale-up process. Additional metabolite-feeding experiments were carried out to validate the above hypothesis. Nicotinamide supplementation (1 mM) did not display any significant effects on the lag phase of G. sulfurreducens cells grown in the 100-ml serum bottles. However

  8. Comparison of a laboratory and a production coating spray gun with respect to scale-up.

    PubMed

    Mueller, Ronny; Kleinebudde, Peter

    2007-01-01

    A laboratory spray gun and a production spray gun were investigated in a scale-up study. Two Schlick spray guns, which are equipped with a new antibearding cap, were used in this study. The influence of the atomization air pressure, spray gun-to-tablet-bed distance, polymer solution viscosity, and spray rate was analyzed in a statistical design of experiments. The 2 spray guns were compared with respect to the spray width and height, droplet size, droplet velocity, and spray density. The droplet size, velocity, and spray density were measured with a Phase Doppler Particle Analyzer. A successful scale-up of the atomization is accomplished if similar droplet sizes, droplet velocities, and spray densities are achieved in the production scale as in the laboratory scale. This study gives basic information for the scale-up of the settings from the laboratory spray gun to the production spray gun. Both spray guns are highly comparable with respect to the droplet size and velocity. The scale-up of the droplet size should be performed by an adjustment of the atomization air pressure. The scale-up of the droplet velocity should be performed by an adjustment of the spray gun-to-tablet-bed distance. The presented statistical model and surface plots are convenient and powerful tools for scaling up the spray settings if the spray gun is changed from the laboratory spray gun to the production spray gun. PMID:17408226
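    The way such a fitted response model is used for scale-up can be sketched as: fit droplet size against atomization pressure on the production gun, then invert the fit to find the pressure that reproduces the laboratory droplet size. The linear model and all data points below are illustrative, not the study's measurements:

```python
# One-factor least-squares fit and its inversion for matching droplet size.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical production-gun data: median droplet size (um) vs pressure (bar).
pressure = [1.0, 1.5, 2.0, 2.5, 3.0]
d50_um = [42.0, 35.5, 30.0, 26.0, 22.5]
m, c = fit_line(pressure, d50_um)

def pressure_for_size(target_um):
    """Invert the fitted line: the pressure giving the target droplet size."""
    return (target_um - c) / m

p_match = pressure_for_size(30.0)   # pressure matching a 30 um lab droplet size
```

    The study's actual model covers several factors (pressure, distance, viscosity, spray rate) via a designed experiment, but the match-and-invert logic is the same.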

  9. Applied potential tomography. A new noninvasive technique for measuring gastric emptying

    SciTech Connect

    Avill, R.; Mangnall, Y.F.; Bird, N.C.; Brown, B.H.; Barber, D.C.; Seagar, A.D.; Johnson, A.G.; Read, N.W.

    1987-04-01

    Applied potential tomography is a new, noninvasive technique that yields sequential images of the resistivity of gastric contents after subjects have ingested a liquid or semisolid meal. This study validates the technique as a means of measuring gastric emptying. Experiments in vitro showed an excellent correlation between measurements of resistivity and either the square of the radius of a glass rod or the volume of water in a spherical balloon when both were placed in an oval tank containing saline. Altering the lateral position of the rod in the tank did not alter the values obtained. Images of abdominal resistivity were also directly correlated with the volume of air in a gastric balloon. Profiles of gastric emptying of liquid meals obtained using applied potential tomography were very similar to those obtained using scintigraphy or dye dilution techniques, provided that acid secretion was inhibited by cimetidine. Profiles of emptying of a mashed potato meal using applied potential tomography were also very similar to those obtained by scintigraphy. Measurements of the emptying of a liquid meal from the stomach were reproducible if acid secretion was inhibited by cimetidine. Thus, applied potential tomography is an accurate and reproducible method of measuring gastric emptying of liquids and particulate food. It is inexpensive, well tolerated, easy to use, and ideally suited for multiple studies in patients, even those who are pregnant.

  10. Color metallography and electron microscopy techniques applied to the characterization of 413.0 aluminum alloys.

    PubMed

    Vander Voort, George; Asensio-Lozano, Juan; Suárez-Peña, Beatriz

    2013-08-01

    The influence of refinement and modification of the microstructure on alloy 413.0 was analyzed by means of several microscopy techniques, as well as the effect of applying high pressure during solidification. For each treatment and solidification pressure condition employed, the most suitable microscopy techniques for identifying and characterizing the phases present were investigated. Color metallography and electron microscopy techniques were applied to the qualitative microstructural analysis. Volume fraction and grain size of the primary α-Al were characterized by quantitative metallographic techniques. The results show that the effect caused by applying high pressure during solidification of the alloy is more pronounced than that caused by modification and refinement of the microstructure when it solidifies at atmospheric pressure. Furthermore, it has been shown that, for Al-Si alloy characterization, when aiming to characterize the primary α-Al phase, optical color metallography under crossed polarized light plus a sensitive tint filter is the most suitable technique. When the goal is to characterize the eutectic Si, the use of optical color metallography or electron microscopy is equally valid. The characterization of iron-rich intermetallic compounds should preferably be performed by means of backscattered electron imaging. PMID:23701972

  11. An Optimized Integrator Windup Protection Technique Applied to a Turbofan Engine Control

    NASA Technical Reports Server (NTRS)

    Watts, Stephen R.; Garg, Sanjay

    1995-01-01

    This paper introduces a new technique for providing memoryless integrator windup protection which utilizes readily available optimization software tools. This integrator windup protection synthesis provides a concise methodology for creating integrator windup protection for each actuation system loop independently while assuring both controller and closed loop system stability. The individual actuation system loops' integrator windup protection can then be combined to provide integrator windup protection for the entire system. This technique is applied to an H-infinity based multivariable control designed for a linear model of an advanced afterburning turbofan engine. The resulting transient characteristics are examined for the integrated system while encountering single and multiple actuation limits.
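    The problem being solved can be shown with the most basic form of anti-windup, conditional integration in a PI loop; this is a generic illustration of windup, not the paper's optimization-based synthesis:

```python
# When the actuator saturates, a plain PI integrator keeps accumulating
# error ("windup") and overshoots on recovery. A memoryless guard simply
# stops integrating while the command is clamped.

def pi_step(state, err, kp=1.0, ki=0.5, dt=0.05, u_max=1.0, windup_guard=True):
    """One PI update; returns (new integrator state, limited command)."""
    integ = state + ki * err * dt
    u = kp * err + integ
    if abs(u) > u_max:                       # actuator limit hit
        u = max(-u_max, min(u_max, u))
        if windup_guard:
            integ = state                    # freeze integrator while clamped
    return integ, u

# With a sustained error of 2.0 the guarded integrator stays bounded,
# while the unguarded one grows without limit.
```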

  12. Scaling up health interventions in resource-poor countries: what role does research in stated-preference framework play?

    PubMed

    Pokhrel, Subhash

    2006-01-01

    Despite improved supply of health care services in low-income countries in the recent past, their uptake continues to be lower than anticipated. This has made it difficult to scale-up those interventions which are not only cost-effective from supply perspectives but that might have substantial impacts on improving the health status of these countries. Understanding demand-side barriers is therefore critically important. With the help of a case study from Nepal, this commentary argues that more research on demand-side barriers needs to be carried out and that the stated-preference (SP) approach to such research might be helpful. Since SP techniques place service users' preferences at the centre of the analysis, and because preferences reflect individual or social welfare, SP techniques are likely to be helpful in devising policies to increase social welfare (e.g. improved service coverage). Moreover, the SP data are collected in a controlled environment which allows straightforward identification of effects (e.g. that of process attributes of care) and large quantities of relevant data can be collected at moderate cost. In addition to providing insights into current preferences, SP data also provide insights into how preferences are likely to respond to a proposed change in resource allocation (e.g. changing service delivery strategy). Finally, the SP-based techniques have been used widely in resource-rich countries and their experience can be valuable in conducting scaling-up research in low-income countries. PMID:16573821
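    The choice-probability arithmetic behind stated-preference analysis is typically a conditional logit over service alternatives described by attributes. A sketch with illustrative attributes and coefficients, not estimates from the commentary:

```python
import math

# Conditional logit: P_i = exp(V_i) / sum_j exp(V_j), where V is a linear
# utility over alternative attributes.

def logit_shares(utilities):
    m = max(utilities)                       # subtract max for stability
    expv = [math.exp(v - m) for v in utilities]
    z = sum(expv)
    return [e / z for e in expv]

def utility(cost, travel_time_min, b_cost=-0.04, b_time=-0.02, asc=0.0):
    """Hypothetical utility: both cost and travel time reduce uptake."""
    return asc + b_cost * cost + b_time * travel_time_min

# Two hypothetical delivery options: a cheap but distant clinic visit
# vs a costlier but nearby outreach service.
shares = logit_shares([utility(50.0, 60.0), utility(80.0, 10.0)])
```

    Fitted on stated-choice data, such a model shows how predicted uptake shifts when a process attribute (cost, distance, waiting time) is changed, which is the policy use the commentary describes.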

  13. A comparison of two conformal mapping techniques applied to an aerobrake body

    NASA Technical Reports Server (NTRS)

    Hommel, Mark J.

    1987-01-01

    Conformal mapping is a classical technique that has been utilized for solving problems in aerodynamics and hydrodynamics. Conformal mapping has been successfully applied in the construction of grids around airfoils, engine inlets and other aircraft configurations. Conformal mapping techniques were applied to an aerobrake body having an axis of symmetry. Two different approaches were utilized: (1) the Karman-Trefftz transformation; and (2) a pointwise Schwarz-Christoffel transformation. In both cases, the aerobrake body was mapped onto a near circle, and a grid was generated in the mapped plane. The mapped body and grid were then mapped back into physical space and the properties of the associated grids were examined. Advantages and disadvantages of both approaches are discussed.
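    The first of the two approaches can be sketched directly with complex arithmetic. The circle parameters below are illustrative, not the aerobrake geometry; with exponent n = 2 the Karman-Trefftz map reduces to the classical Joukowski transform z = zeta + b^2/zeta, which is used as a check:

```python
import cmath
import math

# Karman-Trefftz map relating a (near-)circle in the zeta-plane to an
# airfoil-like contour in the z-plane.

def karman_trefftz(zeta, n=1.9, b=1.0):
    p = (zeta + b) ** n
    q = (zeta - b) ** n
    return n * b * (p + q) / (p - q)

# Map a circle of radius 1.1 centred slightly off the origin; the offset
# controls camber and thickness of the resulting body.
circle = [complex(-0.1, 0.1) + 1.1 * cmath.exp(1j * 2 * math.pi * k / 72)
          for k in range(72)]
body = [karman_trefftz(z) for z in circle]
```

    Grid generation then proceeds in the simple mapped plane, with the inverse map carrying the grid back to physical space, as the abstract describes.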

  14. Scaling up strategies of the chronic respiratory disease programme of the European Innovation Partnership on Active and Healthy Ageing (Action Plan B3: Area 5).

    PubMed

    Bousquet, J; Farrell, J; Crooks, G; Hellings, P; Bel, E H; Bewick, M; Chavannes, N H; de Sousa, J Correia; Cruz, A A; Haahtela, T; Joos, G; Khaltaev, N; Malva, J; Muraro, A; Nogues, M; Palkonen, S; Pedersen, S; Robalo-Cordeiro, C; Samolinski, B; Strandberg, T; Valiulis, A; Yorgancioglu, A; Zuberbier, T; Bedbrook, A; Aberer, W; Adachi, M; Agusti, A; Akdis, C A; Akdis, M; Ankri, J; Alonso, A; Annesi-Maesano, I; Ansotegui, I J; Anto, J M; Arnavielhe, S; Arshad, H; Bai, C; Baiardini, I; Bachert, C; Baigenzhin, A K; Barbara, C; Bateman, E D; Beghé, B; Kheder, A Ben; Bennoor, K S; Benson, M; Bergmann, K C; Bieber, T; Bindslev-Jensen, C; Bjermer, L; Blain, H; Blasi, F; Boner, A L; Bonini, M; Bonini, S; Bosnic-Anticevitch, S; Boulet, L P; Bourret, R; Bousquet, P J; Braido, F; Briggs, A H; Brightling, C E; Brozek, J; Buhl, R; Burney, P G; Bush, A; Caballero-Fonseca, F; Caimmi, D; Calderon, M A; Calverley, P M; Camargos, P A M; Canonica, G W; Camuzat, T; Carlsen, K H; Carr, W; Carriazo, A; Casale, T; Cepeda Sarabia, A M; Chatzi, L; Chen, Y Z; Chiron, R; Chkhartishvili, E; Chuchalin, A G; Chung, K F; Ciprandi, G; Cirule, I; Cox, L; Costa, D J; Custovic, A; Dahl, R; Dahlen, S E; Darsow, U; De Carlo, G; De Blay, F; Dedeu, T; Deleanu, D; De Manuel Keenoy, E; Demoly, P; Denburg, J A; Devillier, P; Didier, A; Dinh-Xuan, A T; Djukanovic, R; Dokic, D; Douagui, H; Dray, G; Dubakiene, R; Durham, S R; Dykewicz, M S; El-Gamal, Y; Emuzyte, R; Fabbri, L M; Fletcher, M; Fiocchi, A; Fink Wagner, A; Fonseca, J; Fokkens, W J; Forastiere, F; Frith, P; Gaga, M; Gamkrelidze, A; Garces, J; Garcia-Aymerich, J; Gemicioğlu, B; Gereda, J E; González Diaz, S; Gotua, M; Grisle, I; Grouse, L; Gutter, Z; Guzmán, M A; Heaney, L G; Hellquist-Dahl, B; Henderson, D; Hendry, A; Heinrich, J; Heve, D; Horak, F; Hourihane, J O' B; Howarth, P; Humbert, M; Hyland, M E; Illario, M; Ivancevich, J C; Jardim, J R; Jares, E J; Jeandel, C; Jenkins, C; Johnston, S L; Jonquet, O; Julge, K; Jung, K S; Just, J; 
Kaidashev, I; Kaitov, M R; Kalayci, O; Kalyoncu, A F; Keil, T; Keith, P K; Klimek, L; Koffi N'Goran, B; Kolek, V; Koppelman, G H; Kowalski, M L; Kull, I; Kuna, P; Kvedariene, V; Lambrecht, B; Lau, S; Larenas-Linnemann, D; Laune, D; Le, L T T; Lieberman, P; Lipworth, B; Li, J; Lodrup Carlsen, K; Louis, R; MacNee, W; Magard, Y; Magnan, A; Mahboub, B; Mair, A; Majer, I; Makela, M J; Manning, P; Mara, S; Marshall, G D; Masjedi, M R; Matignon, P; Maurer, M; Mavale-Manuel, S; Melén, E; Melo-Gomes, E; Meltzer, E O; Menzies-Gow, A; Merk, H; Michel, J P; Miculinic, N; Mihaltan, F; Milenkovic, B; Mohammad, G M Y; Molimard, M; Momas, I; Montilla-Santana, A; Morais-Almeida, M; Morgan, M; Mösges, R; Mullol, J; Nafti, S; Namazova-Baranova, L; Naclerio, R; Neou, A; Neffen, H; Nekam, K; Niggemann, B; Ninot, G; Nyembue, T D; O'Hehir, R E; Ohta, K; Okamoto, Y; Okubo, K; Ouedraogo, S; Paggiaro, P; Pali-Schöll, I; Panzner, P; Papadopoulos, N; Papi, A; Park, H S; Passalacqua, G; Pavord, I; Pawankar, R; Pengelly, R; Pfaar, O; Picard, R; Pigearias, B; Pin, I; Plavec, D; Poethig, D; Pohl, W; Popov, T A; Portejoie, F; Potter, P; Postma, D; Price, D; Rabe, K F; Raciborski, F; Radier Pontal, F; Repka-Ramirez, S; Reitamo, S; Rennard, S; Rodenas, F; Roberts, J; Roca, J; Rodriguez Mañas, L; Rolland, C; Roman Rodriguez, M; Romano, A; Rosado-Pinto, J; Rosario, N; Rosenwasser, L; Rottem, M; Ryan, D; Sanchez-Borges, M; Scadding, G K; Schunemann, H J; Serrano, E; Schmid-Grendelmeier, P; Schulz, H; Sheikh, A; Shields, M; Siafakas, N; Sibille, Y; Similowski, T; Simons, F E R; Sisul, J C; Skrindo, I; Smit, H A; Solé, D; Sooronbaev, T; Spranger, O; Stelmach, R; Sterk, P J; Sunyer, J; Thijs, C; To, T; Todo-Bom, A; Triggiani, M; Valenta, R; Valero, A L; Valia, E; Valovirta, E; Van Ganse, E; van Hage, M; Vandenplas, O; Vasankari, T; Vellas, B; Vestbo, J; Vezzani, G; Vichyanond, P; Viegi, G; Vogelmeier, C; Vontetsianos, T; Wagenmann, M; Wallaert, B; Walker, S; Wang, D Y; Wahn, U; Wickman, M; Williams, D M; 
Williams, S; Wright, J; Yawn, B P; Yiallouros, P K; Yusuf, O M; Zaidi, A; Zar, H J; Zernotti, M E; Zhang, L; Zhong, N; Zidarn, M; Mercier, J

    2016-01-01

    Action Plan B3 of the European Innovation Partnership on Active and Healthy Ageing (EIP on AHA) focuses on the integrated care of chronic diseases. Area 5 (Care Pathways) was initiated using chronic respiratory diseases as a model. The chronic respiratory disease action plan includes (1) AIRWAYS integrated care pathways (ICPs), (2) the joint initiative between the Reference site MACVIA-LR (Contre les MAladies Chroniques pour un VIeillissement Actif) and ARIA (Allergic Rhinitis and its Impact on Asthma), (3) Commitments for Action to the European Innovation Partnership on Active and Healthy Ageing and the AIRWAYS ICPs network. It is deployed in collaboration with the World Health Organization Global Alliance against Chronic Respiratory Diseases (GARD). The European Innovation Partnership on Active and Healthy Ageing has proposed a 5-step framework for developing an individual scaling up strategy: (1) what to scale up: (1-a) databases of good practices, (1-b) assessment of viability of the scaling up of good practices, (1-c) classification of good practices for local replication and (2) how to scale up: (2-a) facilitating partnerships for scaling up, (2-b) implementation of key success factors and lessons learnt, including emerging technologies for individualised and predictive medicine. This strategy has already been applied to the chronic respiratory disease action plan of the European Innovation Partnership on Active and Healthy Ageing. PMID:27478588

  15. Non-Intrusive Measurement Techniques Applied to the Hybrid Solid Fuel Degradation

    NASA Astrophysics Data System (ADS)

    Cauty, F.

    2004-10-01

    Knowledge of the solid fuel regression rate and the time evolution of the grain geometry is required for hybrid motor design and control of its operating conditions. Two non-intrusive techniques (NDT) have been applied to hybrid propulsion: both are based on wave propagation through the materials, using X-rays and ultrasound. X-ray techniques allow local thickness measurements (attenuated signal level) using small probes or 2D images (Real Time Radiography), with a trade-off between the size of the field of view and accuracy. Besides the safety hazards associated with high-intensity X-ray systems, the image analysis requires quite complex post-processing techniques. The ultrasound technique is more widely used in energetic material applications, including hybrid fuels. Depending upon the transducer size and the associated equipment, the application domain is large, from tiny samples to the quad-port wagon wheel grain of the 1.1 MN thrust HPDP motor. The effect of the physical quantities has to be taken into account in the wave propagation analysis. With respect to the various applications, there is no unique and perfect experimental method to measure the fuel regression rate. The best solution could be obtained by combining two techniques at the same time, each technique enhancing the quality of the global data.
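    The pulse-echo relation underlying the ultrasound measurement can be sketched as follows (a minimal illustration with invented values; the sound speed in a real fuel varies with temperature and pressure, which is the "effect of the physical quantities" noted above):

```python
# Hedged sketch: fuel web thickness and regression rate from ultrasonic
# pulse-echo times (illustrative numbers, not from the paper).
def thickness(echo_time_s, sound_speed_m_s):
    """Two-way travel: the pulse crosses the web twice."""
    return sound_speed_m_s * echo_time_s / 2.0

def regression_rate(t0, e0, t1, e1, c):
    """Mean surface regression rate between two echo readings."""
    return (thickness(e0, c) - thickness(e1, c)) / (t1 - t0)

c = 2000.0  # assumed sound speed in the fuel, m/s
# Web thinned between echo readings taken 1 s apart:
r = regression_rate(0.0, 40e-6, 1.0, 38e-6, c)
print(round(r * 1000, 3), "mm/s")
```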

  16. The Atomic AXAFS and XANES Techniques as Applied to Heterogeneous Catalysis and Electrocatalysis

    SciTech Connect

    Ramaker, D.; Koningsberger, D

    2010-01-01

    X-ray absorption spectroscopy (XAFS) is an attractive in situ and in operando technique. In recent years, the more conventional extended X-ray absorption fine structure (EXAFS) data analysis technique has been complemented by two newer analysis methods: the 'atomic' XAFS (AXAFS) technique, which analyzes the scattering from the absorber atom itself, and the Δμ XANES technique, which uses a difference method to isolate the changes in the X-ray absorption near edge structure (XANES) due to adsorbates on a metal surface. With AXAFS it is possible to follow the electronic effect a support has on a metal particle; with Δμ XANES it is possible to determine the adsorbate, the specific adsorption sites, and the adsorbate coverage on a metal catalyst. This unprecedented new information helps a great deal to unravel the complex kinetic mechanisms operating in working reactors or fuel cell systems. The fundamental principles and methodology for applying the AXAFS and Δμ XANES techniques are given here, and then specific applications are summarized, including H adsorption on supported Pt in the gas phase, water activation at a Pt cathode and methanol oxidation at a Pt anode in an electrochemical cell, sulfur oxidation on Pt, and oxygen reduction on a Au/SnOx cathode. Finally, the future outlook for time- and/or space-resolved applications of these techniques is contemplated.
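    The core of the Δμ difference method is a simple spectral subtraction; a minimal sketch with synthetic spectra (all values invented for illustration):

```python
# Hedged sketch of the delta-mu idea: subtract a clean-surface XANES
# spectrum from one taken with an adsorbate present (synthetic data).
energies = [11560 + i for i in range(6)]          # eV, illustrative
mu_clean = [0.10, 0.40, 0.90, 1.00, 0.95, 0.90]   # normalized absorption
mu_ads   = [0.10, 0.42, 0.97, 1.03, 0.94, 0.90]

delta_mu = [a - c for a, c in zip(mu_ads, mu_clean)]
# Energy where the adsorbate-induced change peaks; its position and
# shape are what identify the adsorption site in the real technique.
peak_e = energies[max(range(len(delta_mu)), key=lambda i: delta_mu[i])]
print(peak_e)
```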

  17. Confirmation of standard error analysis techniques applied to EXAFS using simulations

    SciTech Connect

    Booth, Corwin H; Hu, Yung-Jin

    2009-12-14

    Systematic uncertainties, such as those in calculated backscattering amplitudes, crystal glitches, etc., not only limit the ultimate accuracy of the EXAFS technique, but also affect the covariance matrix representation of real parameter errors in typical fitting routines. Despite major advances in EXAFS analysis and in understanding all potential uncertainties, these methods are not routinely applied by all EXAFS users. Consequently, reported parameter errors are not reliable in many EXAFS studies in the literature. This situation has made many EXAFS practitioners leery of conventional error analysis applied to EXAFS data. However, conventional error analysis, if properly applied, can teach us more about our data, and even about the power and limitations of the EXAFS technique. Here, we describe the proper application of conventional error analysis to r-space fitting to EXAFS data. Using simulations, we demonstrate the veracity of this analysis by, for instance, showing that the number of independent data points from Stern's rule is balanced by the degrees of freedom obtained from a χ² statistical analysis. By applying such analysis to real data, we determine the quantitative effect of systematic errors. In short, this study is intended to remind the EXAFS community about the role of fundamental noise distributions in interpreting our final results.
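    The Stern's-rule bookkeeping mentioned in the abstract can be illustrated in a few lines (the k-range, r-range, and parameter count below are assumptions, not values from the study; some formulations add 2 to the count):

```python
import math

# Hedged sketch: one common form of Stern's rule for the number of
# independent points in an r-space EXAFS fit, and the resulting
# degrees of freedom available to the chi-squared analysis.
def n_independent(dk, dr):
    return 2.0 * dk * dr / math.pi

n_idp = n_independent(dk=10.0, dr=2.0)  # k-range 10 A^-1, r-range 2 A
n_var = 8                               # fitted parameters (assumed)
dof = n_idp - n_var                     # must stay positive for a valid fit
print(round(n_idp, 2), round(dof, 2))
```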

  18. A THREE-DIMENSIONAL MICROFLUIDIC APPROACH TO SCALING UP MICROENCAPSULATION OF CELLS

    PubMed Central

    Tendulkar, Sameer; Mirmalek-Sani, Sayed-Hadi; Childers, Charles; Saul, Justin; Opara, Emmanuel C; Ramasubramanian, Melur K.

    2013-01-01

    Current applications of the microencapsulation technique include the use of encapsulated islet cells to treat Type 1 diabetes, and encapsulated hepatocytes for providing temporary but adequate metabolic support to allow spontaneous liver regeneration, or as a bridge to liver transplantation for patients with chronic liver disease. Also, microcapsules can be used for the controlled delivery of therapeutic drugs. The two most widely used devices for microencapsulation are the air-syringe pump droplet generator and the electrostatic bead generator, each of which is fitted with a single needle through which droplets of cells suspended in alginate solution are produced and cross-linked into microbeads. A major drawback in the design of these instruments is that they are incapable of producing sufficient numbers of microcapsules in a short time period to permit mass production of encapsulated and viable cells for transplantation in large animals and humans. We present in this paper a microfluidic approach to scaling up cell and protein encapsulation. The microfluidic chip consists of a 3D air supply and a multi-nozzle outlet for microcapsule generation. It has one alginate inlet and one compressed air inlet. The outlet has 8 nozzles, each with a 380 µm inner diameter, which produce hydrogel microspheres ranging from 500 to 700 µm in diameter. These nozzles are concentrically surrounded by air nozzles with a 2 mm inner diameter. Two tubes connected at the top allow the air to escape as the alginate solution fills the chamber. A 115 V variable-flow pump is used to pump the alginate solution, and Tygon® tubing is used to connect the in-house air supply to the air channel and the peristaltic/syringe pump to the alginate chamber. A pressure regulator is used to control the flow rate of air. We have encapsulated islets and proteins with this high-throughput device, which is expected to improve product quality control in the microencapsulation of cells, and hence the outcome of

  19. Comparison between different techniques applied to quartz CPO determination in granitoid mylonites

    NASA Astrophysics Data System (ADS)

    Fazio, Eugenio; Punturo, Rosalda; Cirrincione, Rosolino; Kern, Hartmut; Wenk, Hans-Rudolph; Pezzino, Antonino; Goswami, Shalini; Mamtani, Manish

    2016-04-01

    Since the second half of the last century, several techniques have been adopted to resolve the crystallographic preferred orientation (CPO) of the major minerals constituting crustal and mantle rocks. To this aim, many efforts have been made to increase the accuracy of the analytical devices as well as to progressively reduce the time needed to perform microstructural analysis. It is worth noting that many of these microstructural studies deal with quartz CPO because of the wide occurrence of this mineral phase in crustal rocks as well as its quite simple chemical composition. In the present work, four different techniques were applied to define the CPOs of dynamically recrystallized quartz domains from naturally deformed rocks collected from a ductile crustal-scale shear zone, in order to compare their advantages and limitations. The selected Alpine shear zone, located in the Aspromonte Massif (Calabrian Peloritani Orogen, southern Italy), comprises granitoid lithotypes. The adopted methods span from the "classical" universal stage (US) to the image analysis technique (CIP), electron back-scattered diffraction (EBSD), and time-of-flight neutron diffraction (TOF). When compared, the bulk texture pole figures obtained by means of these different techniques show a good correlation. Advances in the analytical techniques used for microstructural investigations are outlined by discussing the quartz CPO results presented in this study.

  20. Enabling and challenging factors in institutional reform: The case of SCALE-UP

    NASA Astrophysics Data System (ADS)

    Foote, Kathleen; Knaub, Alexis; Henderson, Charles; Dancy, Melissa; Beichner, Robert J.

    2016-06-01

    While many innovative teaching strategies exist, integration into undergraduate science teaching has been frustratingly slow. This study aims to understand the low uptake of research-based instructional innovations by studying 21 successful implementations of the Student Centered Active Learning with Upside-down Pedagogies (SCALE-UP) instructional reform. SCALE-UP significantly restructures the classroom environment and pedagogy to promote highly active and interactive instruction. Although originally designed for university introductory physics courses, SCALE-UP has spread to many other disciplines at hundreds of departments around the world. This study reports findings from in-depth, open-ended interviews with 21 key contact people involved with successful secondary implementations of SCALE-UP throughout the United States. We defined successful implementations as those that restructured their pedagogy and classroom and sustained and/or spread the change. Interviews were coded to identify the most common enabling and challenging factors during reform implementation and compared to the theoretical framework of Kotter's 8-step Change Model. The most common enabling influences that emerged are documenting and leveraging evidence of local success, administrative support, interaction with outside SCALE-UP user(s), and funding. Many challenges are linked to the lack of these enabling factors, including difficulty finding funding, space, and administrative and/or faculty support for reform. Our focus on successful secondary implementations meant that most interviewees were able to overcome challenges. Presentation of results is illuminated with case studies, quotes, and examples that can help secondary implementers with SCALE-UP reform efforts specifically. We also discuss the implications for policy makers, researchers, and the higher education community concerned with initiating structural change.

  1. Scaling up antiretroviral treatment and improving patient retention in care: lessons from Ethiopia, 2005-2013

    PubMed Central

    2014-01-01

    Background Antiretroviral treatment (ART) was provided to more than nine million people by the end of 2012. Although ART programs in resource-limited settings have expanded treatment, inadequate retention in care has been a challenge. Ethiopia has been scaling up ART and improving retention (defined as continuous engagement of patients in care) in care. We aimed to analyze the ART program in Ethiopia. Methods A mix of quantitative and qualitative methods was used. Routine ART program data were used to study ART scale up and patient retention in care. In-depth interviews and focus group discussions were conducted with program managers. Results The number of people receiving ART in Ethiopia increased from less than 9,000 in 2005 to more than 439,000 in 2013. Initially, the public health approach, health system strengthening, community mobilization and provision of care and support services allowed scaling up of ART services. While ART was being scaled up, retention was recognized to be insufficient. To improve retention, a second wave of interventions, related to programmatic, structural, socio-cultural, and patient information systems, has been implemented. The retention rate increased from 77% in 2004/5 to 92% in 2012/13. Conclusion Ethiopia has been able to scale up ART and improve retention in care in spite of its limited resources. This has been possible due to interventions by the ART program, supported by health systems strengthening, community-based organizations and the communities themselves. ART programs in resource-limited settings need to put in place similar measures to scale up ART and retain patients in care. PMID:24886686

  2. A comparative study of progressive versus successive spectrophotometric resolution techniques applied for pharmaceutical ternary mixtures.

    PubMed

    Saleh, Sarah S; Lotfy, Hayam M; Hassan, Nagiba Y; Salem, Hesham

    2014-11-11

    This work presents a comparative study of a novel progressive spectrophotometric resolution technique, the amplitude center method (ACM), versus the well-established successive spectrophotometric resolution techniques, namely successive derivative subtraction (SDS), successive derivative of ratio spectra (SDR), and mean centering of ratio spectra (MCR). All the proposed spectrophotometric techniques consist of several consecutive steps utilizing ratio and/or derivative spectra. The novel amplitude center method (ACM) can be used for the determination of ternary mixtures using a single divisor, where the concentrations of the components are determined through progressive manipulation performed on the same ratio spectrum. These methods were applied to the analysis of the ternary mixture of chloramphenicol (CHL), dexamethasone sodium phosphate (DXM) and tetryzoline hydrochloride (TZH) in eye drops in the presence of benzalkonium chloride as a preservative. The proposed methods were checked using laboratory-prepared mixtures and were successfully applied to the analysis of a pharmaceutical formulation containing the cited drugs. The proposed methods were validated according to the ICH guidelines. A comparative study was conducted between these methods regarding simplicity, limitations and sensitivity. The obtained results were statistically compared with those obtained from the official BP methods, showing no significant difference with respect to accuracy and precision. PMID:24873889
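    The ratio-spectrum step shared by these resolution techniques can be sketched with synthetic spectra (hypothetical absorbances chosen for illustration, not measured data):

```python
# Hedged sketch: dividing a mixture spectrum by the spectrum of one
# component turns that component's contribution into a constant, which
# a derivative or mean-centering step then removes (synthetic data).
sx = [1.0, 2.0, 3.0, 2.0, 1.0]   # pure component X, absorbance
sy = [0.5, 1.0, 2.0, 4.0, 2.0]   # pure component Y (the divisor)
mix = [2 * x + 3 * y for x, y in zip(sx, sy)]   # mixture: 2X + 3Y

ratio = [m / y for m, y in zip(mix, sy)]        # = 2*(X/Y) + 3
# Y now contributes only the constant 3; subtracting the mean removes
# that constant, leaving the X/Y term (less its own mean).
centered = [r - sum(ratio) / len(ratio) for r in ratio]
print([round(r, 2) for r in ratio])
```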

  4. Roller compaction process development and scale up using Johanson model calibrated with instrumented roll data.

    PubMed

    Nesarikar, Vishwas V; Patel, Chandrakant; Early, William; Vatsaraj, Nipa; Sprockel, Omar; Jerzweski, Robert

    2012-10-15

    Roller compaction is a dry granulation process used to convert powder blends into free-flowing agglomerates. During scale up or transfer of a roller compaction process, it is critical to maintain comparable ribbon densities at each scale in order to achieve similar tensile strengths and subsequently similar particle size distributions of the milled material. Similar ribbon densities can be reached by maintaining an analogous normal stress applied by the rolls on the ribbon for a given gap between the rolls. Johanson (1965) developed a model to predict normal stress based on material properties and roll diameter. However, the practical application of the Johanson model to estimate normal stress on the ribbon is limited by its requirement of an accurate estimate of the nip pressure, i.e. the pressure at the nip angle. Another weakness of the Johanson model is the assumption of a fixed angle of wall friction, which leads to the use of a fixed nip angle in the model. To overcome these limitations, we developed a novel approach using roll force equations based on a modified Johanson model in which the requirement of a pressure value at the nip angle was eliminated. An instrumented roll on a WP120 roller compactor was used to collect normal stress data measured at three locations across the width of a roll (P1, P2, P3), as well as gap and nip angle data on the ribbon, for placebo and various active blends along with the corresponding process parameters. The nip angles were estimated directly from the experimental pressure profile data of each run. The roll force equation of the Johanson model was validated using the normal stress, gap, and nip angle data of the placebo runs. The calculated roll force values compared well with those determined from the roll force equation provided for the Alexanderwerk® WP120 roller compactor. Subsequently, the calculation was reversed to estimate normal stress and the corresponding ribbon densities as a function of gap and RFU (roll force per unit roll width). A placebo model was developed
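    As a rough illustration of how roll force per unit width relates to normal stress on the ribbon (a back-of-the-envelope sketch with assumed numbers; the paper's calibrated model is more involved):

```python
import math

# Hedged sketch: estimate a mean normal stress on the ribbon by
# spreading the roll force per unit width (RFU) over the roll-ribbon
# contact arc between the nip angle and the gap (illustrative only).
def mean_normal_stress(rfu_n_per_m, roll_diameter_m, nip_angle_deg):
    contact_arc = (roll_diameter_m / 2.0) * math.radians(nip_angle_deg)
    return rfu_n_per_m / contact_arc      # Pa

sigma = mean_normal_stress(rfu_n_per_m=4.0e5,    # 400 kN/m, assumed
                           roll_diameter_m=0.12,  # assumed roll size
                           nip_angle_deg=10.0)    # assumed nip angle
print(round(sigma / 1e6, 1), "MPa")
```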

  5. Near-infrared spectroscopy and pattern recognition techniques applied to the identification of Jinhua ham

    NASA Astrophysics Data System (ADS)

    Li, Honglian; Zhao, Zhilei; Pang, Yanping; Wu, Guancheng; Wang, Yanfeng; Li, Xiaoting

    2009-11-01

    Near-infrared (NIR) diffuse reflectance spectroscopy and pattern recognition techniques were applied to develop a fast identification method for Jinhua ham. The samples, collected from different manufacturers, comprised nineteen Jinhua ham samples and four Xuanwei ham samples. NIR spectra were pretreated with second-derivative calculation and vector normalization. Three pattern recognition techniques, cluster analysis, conformity test, and principal component analysis (PCA), were separately used to qualify Jinhua ham. All three methods distinguished Jinhua ham successfully. The results indicate that a 100% recognition ratio is achieved by the methods and that the PCA method is the best one. Overall, NIR reflectance spectroscopy using pattern recognition is shown to have significant potential as a rapid and accurate method for the identification of ham.
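    The spectral pretreatment described, second-derivative calculation followed by vector normalization, can be sketched as follows (synthetic spectrum; production work would use Savitzky-Golay smoothing and a PCA library):

```python
import math

# Hedged sketch of the pretreatment step: a simple second-difference
# "derivative" followed by unit-norm (vector) normalization.
spectrum = [0.2, 0.5, 1.1, 1.6, 1.4, 0.9]   # synthetic absorbances

second_diff = [spectrum[i - 1] - 2 * spectrum[i] + spectrum[i + 1]
               for i in range(1, len(spectrum) - 1)]
norm = math.sqrt(sum(v * v for v in second_diff))
pretreated = [v / norm for v in second_diff]   # unit-length vector
print([round(v, 3) for v in pretreated])
```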

  6. Using wavelet denoising and mathematical morphology in the segmentation technique applied to blood cells images.

    PubMed

    Boix, Macarena; Cantó, Begoña

    2013-04-01

    Accurate image segmentation is used in medical diagnosis since this technique is a noninvasive pre-processing step for biomedical treatment. In this work we present an efficient segmentation method for medical image analysis; in particular, blood cells can be segmented with it. To that end, we combine the wavelet transform with morphological operations. Moreover, the wavelet thresholding technique is used to eliminate noise and prepare the image for suitable segmentation. In wavelet denoising we determine the best wavelet, i.e., the one that yields a segmentation with the largest area in the cell. We study different wavelet families and conclude that the db1 wavelet is the best and can serve for later work on blood pathologies. The proposed method generates good results when applied to several images. Finally, the proposed algorithm, implemented in the MatLab environment, is verified on selected blood cell images. PMID:23458301
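    A one-level db1 (Haar) decomposition with soft thresholding of the detail coefficients, the denoising step applied before segmentation, can be sketched on a 1-D toy signal (images use the 2-D transform; an unnormalized averaging form is used here for clarity):

```python
# Hedged sketch: one-level Haar (db1) denoising of a 1-D signal.
# Small detail coefficients (noise) are shrunk toward zero; large
# ones (edges between cell and background) survive.
def haar_denoise(signal, thresh):
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    soft = [max(abs(d) - thresh, 0.0) * (1 if d >= 0 else -1) for d in detail]
    out = []
    for a, d in zip(approx, soft):   # inverse transform
        out.extend([a + d, a - d])
    return out

noisy = [10.0, 10.4, 10.1, 9.9, 20.0, 19.6, 19.9, 20.1]
denoised = haar_denoise(noisy, thresh=0.15)
print([round(v, 2) for v in denoised])
```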

  7. Geophysical techniques applied to urban planning in complex near surface environments. Examples of Zaragoza, NE Spain

    NASA Astrophysics Data System (ADS)

    Pueyo-Anchuela, Ó.; Casas-Sainz, A. M.; Soriano, M. A.; Pocoví-Juan, A.

    Complex geological shallow subsurface environments represent an important handicap for urban and building projects. The geological features of the Central Ebro Basin, with sharp lateral changes in Quaternary deposits, alluvial karst phenomena, and anthropic activity, can preclude the characterization of future urban areas from isolated geomechanical tests or incorrectly dimensioned geophysical surveys alone. This complexity is analyzed here in two different test fields: (i) one linked to flat-bottomed valleys with an irregular distribution of Quaternary deposits related to sharp lateral facies changes and an irregular position of the preconsolidated substratum, and (ii) a second one with similar complexities in the alluvial deposits plus karst activity linked to solution of the underlying evaporite substratum. The results show that different geophysical techniques yield similar geological models in the first case (flat-bottomed valleys), whereas only the combined application of several geophysical techniques makes it possible to correctly evaluate the complexities of the geological model in the second case (alluvial karst). In this second case, geological and surface information helps refine the sensitivity of the applied geophysical techniques to different indicators of karst activity. In both cases 3D models are needed to correctly distinguish lateral alluvial sedimentary changes from superimposed karst activity.

  8. Contact Nd:YAG Laser Technique Applied To Head And Neck Reconstructive Surgery

    NASA Astrophysics Data System (ADS)

    Nobori, Takuo; Miyazaki, Yasuhiro; Moriyama, Ichiro; Sannikorn, Phakdee; Ohyama, Masaru

    1989-09-01

    The contact Nd:YAG laser system with a ceramic tip was applied to head and neck reconstructive surgery. Plastic surgery was performed in 78 patients with head and neck diseases during the past 11 years. Since 1984, reconstructive surgery was performed in 60 of these patients, and in 45 of these cases (75%) the contact Nd:YAG laser was used. With this laser technique, intraoperative bleeding was reduced to half the volume of the conventional procedure.

  9. A systems approach of the nondestructive evaluation techniques applied to Scout solid rocket motors.

    NASA Technical Reports Server (NTRS)

    Oaks, A. E.

    1971-01-01

    Review and appraisal of the status of the nondestructive tests applied to Scout solid-propellant rocket motors, using analytical techniques to evaluate radiography for detecting internal discontinuities such as voids and unbonds. Information relating to selecting, performing, controlling, and evaluating the results of NDE tests was reduced to a common simplified format. With these data and the results of the analytical studies performed, it was possible to make the basic appraisals of the ability of a test to meet all pertinent acceptance criteria and, where necessary, provide suggestions to improve the situation.

  10. Grid-based Moment Tensor Inversion Technique Applied to Earthquakes Offshore of Northeast Taiwan

    NASA Astrophysics Data System (ADS)

    Cheng, H.; Lee, S.; Ma, K.

    2010-12-01

    We use a grid-based moment tensor inversion technique and broadband continuous recordings to monitor earthquakes offshore northeast Taiwan in real time. The moment tensor inversion technique and a grid search scheme are applied to obtain the source parameters, including the hypocenter, moment magnitude, and focal mechanism. In Taiwan, routine moment tensor solutions are reported by the CWB (Central Weather Bureau) and BATS (Broadband Array in Taiwan for Seismology), both of which require some lag time for the information on event time and location before performing CMT (Centroid Moment Tensor) analysis. With the grid-based moment tensor inversion technique, the event location and focal mechanism can be obtained simultaneously within about two minutes of the occurrence of the earthquake. The inversion procedure is based on a database of 1-D Green's functions calculated by the frequency-wavenumber (f-k) method. The offshore region northeast of Taiwan has been taken as our first test area, covering 121.5°E to 123°E, 23.5°N to 25°N, and depths down to 136 km. A 3D grid system is set in this study area with an average grid size of 10 x 10 x 10 km³. We compare our results with past earthquakes from 2008 to 2010 that had been analyzed by BATS CMT. We also compare the event times detected by GridMT with the CWB earthquake reports. The results indicate that the grid-based moment tensor inversion system is efficient and feasible for real-time monitoring of local seismic activity. Our long-term goal is to use the GridMT technique with fully 3-D Green's functions for the whole of Taiwan.
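    The grid-search core of such a system can be sketched in a few lines: each grid node carries precomputed Green's-function synthetics, and the node minimizing the waveform misfit is selected (toy 1-D waveforms with invented values):

```python
# Hedged sketch of the grid-search step: compare the observed waveform
# against precomputed synthetics at each trial source node and keep
# the best-fitting node (synthetic toy data, not real seismograms).
observed = [0.0, 1.0, 0.5, -0.5]
synthetics = {                      # (lon, lat, depth_km) -> waveform
    (121.6, 24.0, 10): [0.0, 0.2, 0.1, 0.0],
    (122.0, 24.5, 20): [0.0, 0.9, 0.6, -0.4],
    (122.4, 24.8, 30): [1.0, 0.0, 0.0, 1.0],
}

def misfit(a, b):
    """Sum of squared residuals between two waveforms."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

best = min(synthetics, key=lambda node: misfit(observed, synthetics[node]))
print(best)
```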

  11. Scaling up watershed model parameters: flow and load simulations of the Edisto River Basin, South Carolina, 2007-09

    USGS Publications Warehouse

    Feaster, Toby D.; Benedict, Stephen T.; Clark, Jimmy M.; Bradley, Paul M.; Conrads, Paul A.

    2014-01-01

    As part of an ongoing effort by the U.S. Geological Survey to expand the understanding of relations among hydrologic, geochemical, and ecological processes that affect fish-tissue mercury concentrations within the Edisto River Basin, analyses and simulations of the hydrology of the Edisto River Basin were made using the topography-based hydrological model (TOPMODEL). A primary focus of the investigation was to assess the potential for scaling up a previous application of TOPMODEL for the McTier Creek watershed, which is a small headwater catchment to the Edisto River Basin. Scaling up was done in a step-wise manner, beginning with applying the calibration parameters, meteorological data, and topographic-wetness-index data from the McTier Creek TOPMODEL to the Edisto River TOPMODEL. Additional changes were made for subsequent simulations, culminating in the best simulation, which included meteorological and topographic wetness index data from the Edisto River Basin and updated calibration parameters for some of the TOPMODEL calibration parameters. The scaling-up process resulted in nine simulations being made. Simulation 7 best matched the streamflows at station 02175000, Edisto River near Givhans, SC, which was the downstream limit for the TOPMODEL setup, and was obtained by adjusting the scaling factor, including streamflow routing, and using NEXRAD precipitation data for the Edisto River Basin. The Nash-Sutcliffe coefficient of model-fit efficiency and Pearson’s correlation coefficient for simulation 7 were 0.78 and 0.89, respectively. Comparison of goodness-of-fit statistics between measured and simulated daily mean streamflow for the McTier Creek and Edisto River models showed that with calibration, the Edisto River TOPMODEL produced slightly better results than the McTier Creek model, despite the substantial difference in the drainage-area size at the outlet locations for the two models (30.7 and 2,725 square miles, respectively). Along with the TOPMODEL
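    The two goodness-of-fit statistics reported for simulation 7, the Nash-Sutcliffe efficiency and Pearson's correlation coefficient, can be computed as follows (toy daily flows, not the study's data):

```python
import math

# Hedged sketch of the two model-fit measures used to compare
# simulated and measured daily mean streamflow (illustrative values).
def nash_sutcliffe(obs, sim):
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den   # 1.0 is a perfect fit

def pearson(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

obs = [10.0, 12.0, 18.0, 25.0, 16.0]   # observed flows, toy units
sim = [11.0, 11.5, 17.0, 23.0, 17.5]   # simulated flows
print(round(nash_sutcliffe(obs, sim), 3), round(pearson(obs, sim), 3))
```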

  12. Enabling and Challenging Factors in Institutional Reform: The Case of SCALE-UP

    ERIC Educational Resources Information Center

    Foote, Kathleen; Knaub, Alexis; Henderson, Charles; Dancy, Melissa; Beichner, Robert J.

    2016-01-01

    While many innovative teaching strategies exist, integration into undergraduate science teaching has been frustratingly slow. This study aims to understand the low uptake of research-based instructional innovations by studying 21 successful implementations of the Student Centered Active Learning with Upside-down Pedagogies (SCALE-UP) instructional…

  13. Compaction Scale Up and Optimization of Cylindrical Fuel Compacts for the Next Generation Nuclear Plant

    SciTech Connect

    Jeffrey J. Einerson; Jeffrey A. Phillips; Eric L. Shaber; Scott E. Niedzialek; W. Clay Richardson; Scott G. Nagley

    2012-10-01

    Multiple process approaches have been used historically to manufacture cylindrical nuclear fuel compacts. Scale-up of fuel compacting was required for the Next Generation Nuclear Plant (NGNP) project to achieve an economically viable automated production process capable of providing a minimum of 10 compacts/minute with high production yields. In addition, the scale-up effort was required to achieve matrix density equivalent to baseline historical production processes, and allow compacting at fuel packing fractions up to 46% by volume. The scale-up approach of jet milling, fluid-bed overcoating, and hot-press compacting adopted in the U.S. Advanced Gas Reactor (AGR) Fuel Development Program involves significant paradigm shifts to capitalize on distinct advantages in simplicity, yield, and elimination of mixed waste. A series of designed experiments have been completed to optimize compaction conditions of time, temperature, and forming pressure using natural uranium oxycarbide (NUCO) fuel. Results from these experiments are included. The scale-up effort is nearing completion with the process installed and operational using nuclear fuel materials. The process is being certified for manufacture of qualification test fuel compacts for the AGR-5/6/7 experiment at the Advanced Test Reactor (ATR) at the Idaho National Laboratory (INL).

  14. Scaling up Comprehensive Sexuality Education in Nigeria: From National Policy to Nationwide Application

    ERIC Educational Resources Information Center

    Huaynoca, Silvia; Chandra-Mouli, Venkatraman; Yaqub, Nuhu, Jr.; Denno, Donna Marie

    2014-01-01

    Nigeria is one of few countries that reports having translated national policies on school-based comprehensive sexuality education (CSE) into near-nationwide implementation. We analysed data using the World Health Organization-ExpandNet framework, which provides a systematic structure for planning and managing the scaling up of health innovations.…

  15. The Role of Scaling up Research in Designing for and Evaluating Robustness

    ERIC Educational Resources Information Center

    Roschelle, J.; Tatar, D.; Shechtman, N.; Knudsen, J.

    2008-01-01

    One of the great strengths of Jim Kaput's research program was his relentless drive towards scaling up his innovative approach to teaching the mathematics of change and variation. The SimCalc mission, "democratizing access to the mathematics of change," was enacted by deliberate efforts to reach an increasing number of teachers and students each…

  16. Introduction to SCALE-UP: Student-Centered Activities for Large Enrollment University Physics.

    ERIC Educational Resources Information Center

    Beichner, Robert J.; Saul, Jeffery M.; Allain, Rhett J.; Deardorff, Duane L.; Abbott, David S.

    SCALE-UP is an extension of the highly successful IMPEC (Integrated Math, Physics, Engineering, and Chemistry) project, one of North Carolina State's curricular reform efforts undertaken as part of the SUCCEED coalition. The authors utilize the interactive, collaboratively based instruction that worked well in smaller class settings and find ways…

  17. Scaling Up Success: Lessons Learned from Technology-Based Educational Improvement

    ERIC Educational Resources Information Center

    Dede, Chris, Ed.; Honan, James P., Ed.; Peters, Laurence C., Ed.

    2005-01-01

    Drawing from the information presented at a conference sponsored by the Harvard Graduate School of Education and the Mid-Atlantic Regional Technology in Education Consortium, educators, researchers, and policymakers translate theory into practice to provide a hands-on resource that describes different models for scaling up success. This resource…

  18. From scaling up to sustainability in HIV: potential lessons for moving forward

    PubMed Central

    2013-01-01

    Background In 30 years of experience in responding to the HIV epidemic, critical decisions and program characteristics for successful scale-up have been studied. Now leaders face a new challenge: sustaining large-scale HIV prevention programs. Implementers, funders, and the communities served need to assess what strategies and practices of scaling up are also relevant for sustaining delivery at scale. Methods We reviewed white and gray literature to identify domains central to scaling-up programs and reviewed HIV case studies to identify how these domains might relate to sustaining delivery at scale. Results We found 10 domains identified as important for successfully scaling up programs that have potential relevance for sustaining delivery at scale: fiscal support; political support; community involvement, integration, buy-in, and depth; partnerships; balancing flexibility/adaptability and standardization; supportive policy, regulatory, and legal environment; building and sustaining strong organizational capacity; transferring ownership; decentralization; and ongoing focus on sustainability. We identified one additional potential domain important for programs sustaining delivery at scale: emphasizing equity. Conclusions Today, the public and private sector are examining their ability to generate value for populations. All stakeholders are aiming to stem the tide of the HIV epidemic. Implementers need a framework to guide the evolution of their strategies and management practices. Greater research is needed to refine the domains for policy and program implementers working to sustain HIV program delivery at scale. PMID:24199749

  19. Evaluating scale-up rules of a high-shear wet granulation process.

    PubMed

    Tao, Jing; Pandey, Preetanshu; Bindra, Dilbir S; Gao, Julia Z; Narang, Ajit S

    2015-07-01

    This work aimed to evaluate the commonly used scale-up rules for high-shear wet granulation process using a microcrystalline cellulose-lactose-based low drug loading formulation. Granule properties such as particle size, porosity, flow, and tabletability, and tablet dissolution were compared across scales using scale-up rules based on different impeller speed calculations or extended wet massing time. Constant tip speed rule was observed to produce slightly less granulated material at the larger scales. Longer wet massing time can be used to compensate for the lower shear experienced by the granules at the larger scales. Constant Froude number and constant empirical stress rules yielded granules that were more comparable across different scales in terms of compaction performance and tablet dissolution. Granule porosity was shown to correlate well with blend tabletability and tablet dissolution, indicating the importance of monitoring granule densification (porosity) during scale-up. It was shown that different routes can be chosen during scale-up to achieve comparable granule growth and densification by altering one of the three parameters: water amount, impeller speed, and wet massing time. PMID:26010137
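The impeller-speed rules compared in this abstract have compact standard forms. A minimal sketch, assuming a geometric scale-up in which only the bowl diameter changes; the function name, the example speeds and diameters, and the empirical-stress exponent of 0.8 are illustrative assumptions, not values from the paper:

```python
def scaled_impeller_speed(n1_rpm, d1_m, d2_m, rule="tip_speed", exponent=0.8):
    """Scale impeller speed from bowl diameter d1_m to d2_m.

    tip_speed : N2 = N1 * (d1/d2)          -> constant impeller tip speed
    froude    : N2 = N1 * (d1/d2)**0.5     -> constant Froude number
    stress    : N2 = N1 * (d1/d2)**exponent -> empirical shear-stress rule
                 (exponent between 0.5 and 1; 0.8 here is a placeholder)
    """
    ratio = d1_m / d2_m
    if rule == "tip_speed":
        return n1_rpm * ratio
    if rule == "froude":
        return n1_rpm * ratio ** 0.5
    if rule == "stress":
        return n1_rpm * ratio ** exponent
    raise ValueError(f"unknown rule: {rule}")

# Scaling a hypothetical 300 rpm speed from a 0.25 m to a 1.0 m bowl:
for rule in ("tip_speed", "froude", "stress"):
    print(rule, round(scaled_impeller_speed(300, 0.25, 1.0, rule), 1))
```

Because the constant-tip-speed rule gives the lowest large-scale speed, it imparts the least shear, which is consistent with the abstract's observation that it produced slightly less granulated material at the larger scales.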

  20. Sustainability of a Scale-Up Intervention in Early Mathematics: A Longitudinal Evaluation of Implementation Fidelity

    ERIC Educational Resources Information Center

    Clements, Douglas H.; Sarama, Julie; Wolfe, Christopher B.; Spitler, Mary Elaine

    2015-01-01

    Research Findings: We evaluated the fidelity of implementation and the sustainability of effects of a research-based model for scaling up educational interventions. The model was implemented by 64 teachers in 26 schools in 2 distal city districts serving low-resource communities, with schools randomly assigned to condition. Practice or Policy:…

  1. "Scaling Up" Educational Change: Some Musings on Misrecognition and Doxic Challenges

    ERIC Educational Resources Information Center

    Thomson, Pat

    2014-01-01

    Educational policy-makers around the world are strongly committed to the notion of "scaling up". This can mean anything from encouraging more teachers to take up a pedagogical innovation, all the way through to system-wide efforts to implement "what works" across all schools. In this paper, I use Bourdieu's notions of…

  2. 77 FR 25152 - Applications for New Awards; Investing in Innovation Fund, Scale-Up Grants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-27

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF EDUCATION Applications for New Awards; Investing in Innovation Fund, Scale- Up Grants Correction In notice document 2012-7362 appearing on pages 18216-18229 in the issue of Tuesday, March 27, 2012 make the...

  3. Including Performance Assessments in Accountability Systems: A Review of Scale-Up Efforts

    ERIC Educational Resources Information Center

    Tung, Rosann

    2010-01-01

    The purpose of this literature and field review is to understand previous efforts at scaling up performance assessments for use across districts and states. Performance assessments benefit students and teachers by providing more opportunities for students to demonstrate their knowledge and complex skills, by providing teachers with better…

  4. Lessons from scaling up a depression treatment program in primary care in Chile.

    PubMed

    Araya, Ricardo; Alvarado, Rubén; Sepúlveda, Rodrigo; Rojas, Graciela

    2012-09-01

    In Chile, the National Depression Detection and Treatment Program (Programa Nacional de Diagnóstico y Tratamiento de la Depresión, PNDTD) in primary care is a rare example of an evidence-based mental health program that was scaled up to the national level in a low- or middle-income country. This retrospective qualitative study aimed to better understand how policymakers made the decision to scale up mental health services to the national level, and to explore the elements, contexts, and processes that facilitated the decision to implement and sustain PNDTD. In-depth semistructured interviews with six key informants selected through intentional sampling were conducted in August-December 2008. Interviewees were senior officers at the Ministry of Health who were directly involved in the decision to scale up the program. Results yielded four elements pivotal to the decisionmaking process: scientific evidence, teamwork and leadership, strategic alliances, and program institutionalization. Each element contributed to building consensus, securing funding, attracting resources, and gaining lasting support from policymakers. Additionally, a review of available documentation led the authors to consider sociopolitical context and use of the media to be important factors. While research evidence for the effectiveness of mental health services in the primary care setting continues to accumulate, low- and middle-income countries should get started on the lengthy process of scaling up by incorporating the elements that led to decisionmaking and implementation of the PNDTD in Chile. PMID:23183564

  5. 77 FR 18216 - Applications for New Awards; Investing in Innovation Fund, Scale-Up Grants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-27

    ... FR 11087) and is available at http://www.gpo.gov/fdsys/pkg/FR-2012-02-24/pdf/2012-4357.pdf . Scale-up... FR 12004- 12071)(2010 i3 NFP). Priorities: This competition includes five absolute priorities and... the Federal Register on December 15, 2010 (75 FR 78486-78511), and corrected on May 12, 2011 (76...

  6. Final-Year Results from the i3 Scale-Up of Reading Recovery

    ERIC Educational Resources Information Center

    May, Henry; Sirinides, Philip; Gray, Abby; Davila, Heather Goldsworthy; Sam, Cecile; Blalock, Toscha; Blackman, Horatio; Anderson-Clark, Helen; Schiera, Andrew J.

    2015-01-01

    As part of the 2010 economic stimulus, a $55 million "Investing in Innovation" (i3) grant from the US Department of Education was awarded to scale up Reading Recovery across the nation. This paper presents the final round of results from the large-scale, mixed methods randomized evaluation of the implementation and impacts of Reading…

  7. Implementation in a Longitudinal Sample of New American Schools: Four Years into Scale-Up.

    ERIC Educational Resources Information Center

    Kirby, Sheila Nataraj; Berends, Mark; Naftel, Scott

    New American Schools (NAS) initiated whole-school reform in 1991 as a response to school reforms that had produced little change in the nation's test scores. Its mission is to help schools and districts raise student achievement using whole-school designs. NAS is in the scale-up phase of its effort in which designs are being diffused in partnering…

  8. Integrated Graduate and Continuing Education in Protein Chromatography for Bioprocess Development and Scale-Up

    ERIC Educational Resources Information Center

    Carta, Jungbauer

    2011-01-01

    We describe an intensive course that integrates graduate and continuing education focused on the development and scale-up of chromatography processes used for the recovery and purification of proteins with special emphasis on biotherapeutics. The course includes lectures, laboratories, teamwork, and a design exercise and offers a complete view of…

  9. Education Reform Support: A Framework for Scaling Up School Reform. Policy Paper Series.

    ERIC Educational Resources Information Center

    Healey, F. Henry; DeStefano, Joseph

    The Bureau for Africa of the United States Agency for International Development (USAID) has been examining in detail the question of how best to support and sustain sectorwide education reform in Africa. The USAID and Education Commission of the States jointly sponsored a seminar in October 1996 to examine the issue of "scaling up" and to bring…

  10. Scaling-Up Aid to Education: Is Absorptive Capacity a Constraint?

    ERIC Educational Resources Information Center

    Rose, Pauline

    2009-01-01

    "Absorptive capacity" is a frequently used term amongst development practitioners in education. It is adopted by some as a reason for caution over scaling up aid. Others are of the view that absorptive capacity is an excuse by some donors for not delivering on their Education for All financing commitments. Drawing on interviews with…

  11. Early College for All: Efforts to Scale up Early Colleges in Multiple Settings

    ERIC Educational Resources Information Center

    Edmunds, Julie A.

    2016-01-01

    Given the positive impacts of the small, stand-alone early college model and the desire to provide those benefits to more students, organizations have begun efforts to scale up the early college model in a variety of settings. These efforts have been supported by the federal government, particularly by the Investing in Innovation (i3) program.…

  12. TANK 18-F AND 19-F TANK FILL GROUT SCALE UP TEST SUMMARY

    SciTech Connect

    Stefanko, D.; Langton, C.

    2012-01-03

    High-level waste (HLW) tanks 18-F and 19-F have been isolated from FTF facilities. To complete operational closure, the tanks will be filled with grout for the purposes of: (1) physically stabilizing the tanks, (2) limiting/eliminating vertical pathways to residual waste, (3) entombing waste removal equipment, (4) discouraging future intrusion, and (5) providing an alkaline, chemically reducing environment within the closure boundary to control speciation and solubility of select radionuclides. This report documents the results of a four-cubic-yard bulk-fill scale-up test of the grout formulation recommended for filling Tanks 18-F and 19-F. Details of the scale-up test are provided in a Test Plan. The work was authorized under a Technical Task Request (TTR), HLE-TTR-2011-008, and was performed according to a Task Technical and Quality Assurance Plan (TTQAP), SRNL-RP-2011-00587. The bulk-fill scale-up test described in this report was intended to demonstrate proportioning, mixing, and transportation of material produced in a full-scale ready-mix concrete batch plant. In addition, the material produced for the scale-up test was characterized with respect to fresh properties, thermal properties, and compressive strength as a function of curing time.

  13. Scaling up STEM Academies Statewide: Implementation, Network Supports, and Early Outcomes

    ERIC Educational Resources Information Center

    Young, Viki; House, Ann; Sherer, David; Singleton, Corinne; Wang, Haiwen; Klopfenstein, Kristin

    2016-01-01

    This chapter presents a case study of scaling up the T-STEM initiative in Texas. Data come from the four-year longitudinal evaluation of the Texas High School Project (THSP). The evaluation studied the implementation and impact of T-STEM and the other THSP reforms using a mixed-methods design, including qualitative case studies; principal,…

  14. Heat and mass transfer scale-up issues during freeze drying: II. Control and characterization of the degree of supercooling.

    PubMed

    Rambhatla, Shailaja; Ramot, Roee; Bhugra, Chandan; Pikal, Michael J

    2004-01-01

    This study aims to investigate the effect of the ice nucleation temperature on the primary drying process using an ice fog technique for temperature-controlled nucleation. In order to facilitate scale up of the freeze-drying process, this research seeks to find a correlation of the product resistance and the degree of supercooling with the specific surface area of the product. Freeze-drying experiments were performed using 5% wt/vol solutions of sucrose, dextran, hydroxyethyl starch (HES), and mannitol. Temperature-controlled nucleation was achieved using the ice fog technique, where cold nitrogen gas was introduced into the chamber to form an "ice fog," thereby facilitating nucleation of samples at the temperature of interest. Manometric temperature measurement (MTM) was used during primary drying to evaluate the product resistance as a function of cake thickness. Specific surface areas (SSA) of the freeze-dried cakes were determined. The ice fog technique was refined to successfully control the ice nucleation temperature of solutions within 1 °C. A significant increase in product resistance was produced by a decrease in nucleation temperature. The SSA was found to increase with decreasing nucleation temperature, and the product resistance increased with increasing SSA. The ice fog technique can be refined into a viable method for nucleation temperature control. The SSA of the product correlates well with the degree of supercooling and with the resistance of the product to mass transfer (i.e., flow of water vapor through the dry layer). Using this correlation and SSA measurements, one could predict scale-up drying differences and accordingly alter the freeze-drying process so as to bring about equivalence of product temperature history during lyophilization. PMID:15760055

  15. Scale-up of an electrical capacitance tomography sensor for imaging pharmaceutical fluidized beds and validation by computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Wang, Haigang; Yang, Wuqiang

    2011-10-01

    The aim of this research is to apply electrical capacitance tomography (ECT) to pharmaceutical fluidized beds and to scale up the application of ECT from a lab-scale to a production-scale fluidized bed. The objective is to optimize the design of the production-scale fluidized bed and to improve the operating efficiency of the fluidization processes. This is the first time that ECT has been scaled up to a production-scale fluidized bed, 1.0 m in diameter with a batch process capacity of 100 kg, in a real industrial environment. At this scale, some key issues in ECT sensor design must be addressed. To validate the ECT measurement results, a two-phase flow model was used to simulate the process in lab-scale and pilot-scale fluidized beds. The key process parameters include the solid concentration, average concentration profiles, and the frequency spectrum of signal fluctuation obtained by the fast Fourier transform (FFT) and by multi-level wavelet decomposition in the time domain. The results show different hydrodynamic behaviour for fluidized beds of different scales. The time-averaged parameters from ECT and computational fluid dynamics are compared. Future work on ECT sensor design for large-scale fluidized beds is given at the end of the paper.
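The frequency-spectrum analysis of signal fluctuation mentioned in this abstract can be illustrated on synthetic data. A minimal sketch, assuming a single ECT pixel's solid-concentration time series with an invented 2 Hz bubbling component plus noise (sampling rate, amplitudes, and frequency are all placeholders):

```python
import numpy as np

# Hypothetical solid-concentration time series from one ECT pixel,
# sampled at fs Hz: a slow 2 Hz bubbling oscillation plus noise.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
conc = 0.3 + 0.05 * np.sin(2 * np.pi * 2.0 * t) + 0.01 * rng.standard_normal(t.size)

# One-sided amplitude spectrum of the fluctuation (mean removed).
fluct = conc - conc.mean()
spectrum = np.abs(np.fft.rfft(fluct)) * 2 / fluct.size
freqs = np.fft.rfftfreq(fluct.size, d=1 / fs)

dominant = freqs[np.argmax(spectrum)]
print(f"dominant fluctuation frequency: {dominant:.1f} Hz")
```

The dominant peak recovers the bubbling frequency; comparing such spectra across bed scales is one way the different hydrodynamic behaviours can be contrasted.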

  16. Evaluation of models for predicting spray mist diameter for scaling-up of the fluidized bed granulation process.

    PubMed

    Fujiwara, Maya; Dohi, Masafumi; Otsuka, Tomoko; Yamashita, Kazunari; Sako, Kazuhiro

    2012-01-01

    We evaluated models for predicting spray mist diameter suitable for scaling-up the fluidized bed granulation process. By precise selection of experimental conditions, we were able to identify a suitable prediction model that considers changes in binder solution, nozzle dimension, and spray conditions. We used hydroxypropyl cellulose (HPC), hydroxypropyl methylcellulose (HPMC), or polyvinylpyrrolidone (PVP) binder solutions, which are commonly employed by the pharmaceutical industry. Nozzle dimension and spray conditions for oral dosing were carefully selected to reflect manufacturing and small (1/10) scale process conditions. We were able to demonstrate that the prediction model proposed by Mulhem optimally estimated spray mist diameter when each coefficient was modified. Moreover, we developed a simple scale-up rule to produce the same spray mist diameter at different process scales. We confirmed that the Rosin-Rammler distribution could be applied to this process, and that its distribution coefficient was 1.43-1.72 regardless of binder solution, spray condition, or nozzle dimension. PMID:23124561
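The Rosin-Rammler droplet-size distribution named in this abstract has a standard closed form. A minimal sketch of the cumulative distribution F(d) = 1 - exp(-(d/d_mean)^n), using the reported spread-coefficient range 1.43-1.72; the characteristic diameter of 1.0 below is a placeholder, not a value from the paper:

```python
import math

def rosin_rammler_cdf(d, d_mean, n):
    """Cumulative fraction of spray mist with diameter <= d:
    F(d) = 1 - exp(-(d/d_mean)**n), where d_mean is the size at which
    F = 1 - 1/e (about 63.2%) and n is the distribution coefficient."""
    return 1.0 - math.exp(-((d / d_mean) ** n))

# With the reported coefficient range, compare the fraction of mist
# finer than half the characteristic diameter:
for n in (1.43, 1.72):
    print(n, round(rosin_rammler_cdf(0.5, 1.0, n), 3))
```

A larger n gives a narrower distribution (less fine mist below d_mean/2), which is why a roughly constant n across scales supports a simple scale-up rule targeting the same mist diameter.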

  17. Decision making for HIV prevention and treatment scale up: Bridging the gap between theory and practice

    PubMed Central

    Alistar, Sabina S.; Brandeau, Margaret L.

    2011-01-01

    Background Effectively controlling the HIV epidemic will require efficient use of limited resources. Despite ambitious global goals for HIV prevention and treatment scale up, few comprehensive practical tools exist to inform such decisions. Methods We briefly summarize modeling approaches for resource allocation for epidemic control, and discuss the practical limitations of these models. We describe typical challenges of HIV resource allocation in practice and some of the tools used by decision makers. We identify the characteristics needed in a model that can effectively support planners in decision making about HIV prevention and treatment scale up. Results An effective model to support HIV scale-up decisions will be flexible, with capability for parameter customization and incorporation of uncertainty. Such a model needs certain key technical features: it must capture epidemic effects; account for how intervention effectiveness depends on the target population and the level of scale up; capture benefit and cost differentials for packages of interventions versus single interventions, including both treatment and prevention interventions; incorporate key constraints on potential funding allocations; identify optimal or near-optimal solutions; and estimate the impact of HIV interventions on the health care system and the resulting resource needs. Additionally, an effective model needs a user-friendly design and structure, ease of calibration and validation, and accessibility to decision makers in all settings. Conclusions Resource allocation theory can make a significant contribution to decision making about HIV prevention and treatment scale up. What remains now is to develop models that can bridge the gap between theory and practice. PMID:21191118

  18. A New Change Detection Technique Applied to COSMO-SkyMed Stripmap Himage Data

    NASA Astrophysics Data System (ADS)

    Losurdo, A.; Marzo, C.; Guariglia, A.

    2015-05-01

    Change detection in SAR images is highly relevant to locating and monitoring land changes of interest. It is currently an important topic owing to the high revisit frequency of the new SAR satellite instruments (e.g. COSMO-SkyMed and Sentinel-1). Geocart S.p.A. has achieved important results on SAR change detection techniques within a technological project designed and implemented for the Italian Space Agency, entitled "Integrated Monitoring System: application to the GAS pipeline". The aim of the project is the development of a new remote sensing service integrating aerial and satellite data for gas pipeline monitoring. An important work package of the project develops change detection algorithms to be applied to COSMO-SkyMed Stripmap Himage data in order to identify heavy lorries on pipelines. In particular, the paper presents a new change detection technique based on a probabilistic approach, together with the corresponding application results.
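The abstract does not detail the probabilistic method itself, but the general shape of bitemporal SAR change detection can be sketched with the classical log-ratio operator. A minimal illustration on synthetic speckle-like data; this is a generic sketch, not the paper's technique, and the threshold rule and all numbers are assumptions:

```python
import numpy as np

def log_ratio_change_map(img_t1, img_t2, k=3.0):
    """Generic bitemporal SAR change-detection sketch: log-ratio of two
    co-registered intensity images, thresholded at k standard deviations
    above the mean log-ratio."""
    eps = 1e-6  # guard against division by zero in dark pixels
    lr = np.log((img_t2 + eps) / (img_t1 + eps))
    thresh = lr.mean() + k * lr.std()
    return lr > thresh  # True where backscatter rose strongly

rng = np.random.default_rng(1)
before = rng.gamma(4.0, 0.25, size=(64, 64))  # speckle-like background
after = before.copy()
after[30:34, 30:34] *= 8.0                    # simulated bright target (e.g. a lorry)
changed = log_ratio_change_map(before, after)
print(int(changed.sum()), "changed pixels")
```

The ratio form is the usual choice for SAR because multiplicative speckle cancels between acquisitions; a probabilistic method would replace the fixed k-sigma threshold with a decision rule derived from the log-ratio statistics.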

  19. Photothermal Techniques Applied to the Thermal Characterization of l-Cysteine Nanofluids

    NASA Astrophysics Data System (ADS)

    Alvarado, E. Maldonado; Ramón-Gallegos, E.; Jiménez Pérez, J. L.; Cruz-Orea, A.; Hernández Rosas, J.

    2013-05-01

    Thermal-diffusivity (D) and thermal-effusivity (e) measurements were carried out on l-cysteine nanofluids (l-cysteine in combination with Au nanoparticles and protoporphyrin IX (PpIX)) by using thermal lens spectrometry (TLS) and photopyroelectric (PPE) techniques. The TLS technique was used in the mode-mismatched experimental configuration to obtain the thermal diffusivity of the samples. The sample thermal effusivity (e) was obtained by using the PPE technique, in which the temperature variation of a sample exposed to modulated radiation is measured with a pyroelectric sensor. From the obtained thermal-diffusivity and thermal-effusivity values, the thermal conductivity and specific heat capacity of the sample were calculated. The obtained thermal parameters were compared with those of water. The results of this study could be applied to the detection of tumors by using the l-cysteine in combination with Au nanoparticles and PpIX nanofluid, referred to as the conjugate in this study.
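The step from measured D and e to the derived parameters follows directly from the definitions e = sqrt(k·ρc) and D = k/(ρc), giving k = e·sqrt(D) and ρc = e/sqrt(D). A minimal sketch; the water values below are approximate handbook numbers used only as a sanity check, not data from the paper:

```python
import math

def thermal_conductivity(e, D):
    """k = e * sqrt(D): conductivity (W·m^-1·K^-1) from effusivity e
    (W·s^0.5·m^-2·K^-1) and diffusivity D (m^2/s)."""
    return e * math.sqrt(D)

def volumetric_heat_capacity(e, D):
    """rho * c_p = e / sqrt(D) (J·m^-3·K^-1)."""
    return e / math.sqrt(D)

# Sanity check against water at room temperature (approximate values:
# D ~ 1.43e-7 m^2/s, e ~ 1580 W·s^0.5·m^-2·K^-1):
k = thermal_conductivity(1580.0, 1.43e-7)
rho_cp = volumetric_heat_capacity(1580.0, 1.43e-7)
print(round(k, 3), round(rho_cp / 1e6, 2))
```

Recovering roughly 0.6 W/(m·K) and 4.2 MJ/(m³·K) for water is the same consistency check the authors can apply before reporting nanofluid values relative to water.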

  20. Inverse problem solution techniques as applied to indirect in situ estimation of fish target strength.

    PubMed

    Stepnowski, A; Moszyński, M

    2000-05-01

    In situ indirect methods of fish target strength (TS) estimation are analyzed in terms of the inverse techniques recently applied to the problem in question. The solution of this problem requires finding the unknown probability density function (pdf) of fish target strength from acoustic echoes, which can be estimated by solving the integral equation relating the pdfs of the echo variable, the target strength, and the beam pattern of the echosounder transducer. The first part of the paper reviews existing indirect in situ TS-estimation methods. The second part introduces the novel TS-estimation methods, viz.: Expectation, Maximization, and Smoothing (EMS); Windowed Singular Value Decomposition (WSVD); Regularization; and Wavelet Decomposition, which are compared using simulations as well as actual data from acoustic surveys. The survey data, acquired by a dual-beam digital echosounder, were thoroughly analyzed by numerical algorithms, and estimates of the target strength and acoustic backscattering length pdfs were calculated from fish echoes received in the narrow-beam channel of the echosounder. Simultaneously, the estimates obtained directly from the dual-beam system were used as a reference for comparison with the estimates calculated by the newly introduced inverse techniques. The TS estimates analyzed in the paper are superior to those obtained from deconvolution or other conventional techniques, as the newly introduced methods partly avoid the problems of ill-conditioned equations and matrix inversion. PMID:10830379

  1. Micropillar compression technique applied to micron-scale mudstone elasto-plastic deformation.

    SciTech Connect

    Michael, Joseph Richard; Chidsey, Thomas; Heath, Jason E.; Dewers, Thomas A.; Boyce, Brad Lee; Buchheit, Thomas Edward

    2010-12-01

    Mudstone mechanical testing is often limited by poor core recovery and by sample size, preservation, and preparation issues, which can lead to sampling bias, damage, and time-dependent effects. A micropillar compression technique, originally developed by Uchic et al. (2004), is here applied to elasto-plastic deformation of small volumes of mudstone, in the range of cubic microns. This study examines the behavior of the Gothic shale, the basal unit of the Ismay zone of the Pennsylvanian Paradox Formation and a potential shale gas play in southeastern Utah, USA. Micropillars 5 microns in diameter and 10 microns in length are precision-manufactured using an ion-milling method. Characterization of samples is carried out using: dual focused ion - scanning electron beam imaging of nano-scale pores and of the distribution of matrix clay, quartz, and pore-filling organics; laser scanning confocal microscopy (LSCM) 3D imaging of natural fractures; and gas permeability, among other techniques. Compression testing of micropillars under load control is performed using two different nanoindenter techniques. Deformation of cores 0.5 cm in diameter by 1 cm in length is carried out and visualized with a microscope loading stage and laser scanning confocal microscopy. Axisymmetric multistage compression testing and multi-stress-path testing are carried out using 2.54 cm plugs. The discussion of results addresses the size of representative elementary volumes applicable to continuum-scale mudstone deformation, anisotropy, and size-scale plasticity effects, as well as fabrication-induced damage, alignment, and the influence of the substrate.

  2. Strategy for applying scaling technique to water retention curves of forest soils

    NASA Astrophysics Data System (ADS)

    Hayashi, Y.; Kosugi, K.; Mizuyama, T.

    2009-12-01

    Describing the infiltration of water into soils on a forested hillslope requires information on the spatial variability of the water retention curve (WRC). Using a scaling technique, Hayashi et al. (2009) found that porosity mostly characterizes the spatial variability of WRCs on a forested hillslope. This scaling technique was based on a model that assumes a lognormal pore size distribution and contains three parameters: the median of log-transformed pore radius, ψm; the variance of log-transformed pore radius, σ; and the effective porosity, θe. Thus, in the scaling method proposed by Hayashi et al. (2009), θe is a scaling factor to be determined for each individual soil, while ψm and σ are reference parameters common to the whole data set. They examined this scaling method using θe calculated, for each sample, as the difference between the observed saturated water content and the water content observed at ψ = -1000 cm, with ψm and σ derived from the whole data set of WRCs on the slope. It was then shown that this scaling method could explain almost 90 % of the spatial variability in WRCs on the forested hillslope. However, this method requires the whole data set of WRCs to derive the reference parameters (ψm and σ). To apply the scaling technique more practically, in this study we tested a scaling method using reference parameters derived from the WRCs at a small part of the slope. To examine the proposed scaling method, WRCs were observed for 246 undisturbed forest soil samples collected at 15 points distributed from the downslope to the upslope segments. In the proposed scaling method, we optimized common ψm and σ values to the WRCs of six soil samples collected at one point on the middle slope and applied these as reference parameters for the whole data set. The scaling method proposed by this study exhibited an increase of only 6 % in the residual sum of squares as compared with that of the method
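A lognormal pore-size WRC with parameters ψm, σ, and θe has a standard closed form (Kosugi's model), in which effective saturation is the lognormal complementary error function of suction. A minimal sketch, assuming suctions are given as positive magnitudes in cm; the parameter values below are placeholders for illustration, not fitted values from the study:

```python
import math

def lognormal_water_content(psi, theta_e, psi_m, sigma):
    """Lognormal pore-size WRC sketch (Kosugi-type model):
    Se = 0.5 * erfc( ln(psi/psi_m) / (sigma * sqrt(2)) ),
    theta = theta_e * Se,
    where psi and psi_m are suction magnitudes, sigma is the spread of
    log pore radius, and theta_e is the per-sample scaling factor."""
    se = 0.5 * math.erfc(math.log(psi / psi_m) / (sigma * math.sqrt(2)))
    return theta_e * se

# At psi == psi_m exactly half the effective pore space holds water:
print(lognormal_water_content(100.0, 0.4, 100.0, 1.2))
```

In the scaling method described above, ψm and σ are held fixed as reference parameters while θe is fitted per sample, so each soil's WRC is a vertically scaled copy of one shape.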

  3. Applying machine learning techniques to DNA sequence analysis. Progress report, February 14, 1991--February 13, 1992

    SciTech Connect

    Shavlik, J.W.

    1992-04-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
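The core KBANN idea, mapping a symbolic rule into initial network weights that training can then refine, can be sketched for a consensus-sequence rule. A minimal illustration; the consensus "TATAAT", the weight magnitude, and the bias rule are illustrative assumptions rather than settings from the report:

```python
import numpy as np

BASES = "ACGT"

def consensus_to_weights(consensus, w=4.0):
    """KBANN-style sketch: encode a consensus-sequence rule as the
    weights and bias of a single sigmoid unit over a one-hot sequence
    window. Each position matching the consensus contributes +w; the
    bias is set so the unit fires only when all positions match.
    Gradient training would then refine W and bias from examples."""
    W = np.zeros((len(consensus), 4))
    for i, base in enumerate(consensus):
        W[i, BASES.index(base)] = w
    bias = -w * (len(consensus) - 0.5)  # all antecedents must hold
    return W, bias

def unit_output(seq, W, bias):
    onehot = np.zeros_like(W)
    for i, base in enumerate(seq):
        onehot[i, BASES.index(base)] = 1.0
    return 1.0 / (1.0 + np.exp(-(np.sum(W * onehot) + bias)))

W, b = consensus_to_weights("TATAAT")
print(round(unit_output("TATAAT", W, b), 2),  # exact match: high
      round(unit_output("TACGAT", W, b), 2))  # two mismatches: near zero
```

Because the rule is only an initialization, subsequent training can soften it, e.g. learning that some consensus positions matter more than others, which is exactly the refinement the abstract describes.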

  4. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    NASA Astrophysics Data System (ADS)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.

  5. Data compression techniques applied to high resolution high frame rate video technology

    NASA Technical Reports Server (NTRS)

    Hartz, William G.; Alexovich, Robert E.; Neustadter, Marc S.

    1989-01-01

An investigation is presented of video data compression applied to microgravity space experiments using High Resolution High Frame Rate Video Technology (HHVT). An extensive survey of methods of video data compression described in the open literature was conducted, examining compression methods employing digital computing. The results of the survey are presented; they include a description of each method and an assessment of image degradation and video data parameters. An assessment is also made of present and near-term future technology for implementing video data compression in high-speed imaging systems, and its results are discussed and summarized. The results of a study of a baseline HHVT video system, and approaches for implementing video data compression, are presented. Case studies of three microgravity experiments are presented, and specific compression techniques and implementations are recommended.
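
    As a small example of the image-degradation assessment such a survey relies on, the peak signal-to-noise ratio (PSNR) of a lossy round trip can be computed directly (the frame and quantizer below are hypothetical, not from the HHVT study):

    ```python
    import numpy as np

    def psnr(original, compressed, max_val=255.0):
        """Peak signal-to-noise ratio in dB, a standard image-degradation measure."""
        mse = np.mean((original.astype(float) - compressed.astype(float)) ** 2)
        if mse == 0:
            return float("inf")
        return 10.0 * np.log10(max_val ** 2 / mse)

    rng = np.random.default_rng(0)
    frame = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
    # Hypothetical lossy round trip: quantize to 16 gray levels and reconstruct
    # at the bin midpoints, then measure the resulting degradation.
    degraded = (frame // 16) * 16 + 8
    print(round(psnr(frame, degraded), 1))
    ```

    Higher PSNR means less degradation; comparing PSNR at a given compression ratio is one way to rank candidate methods.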

  6. A New Normalized Difference Cloud Retrieval Technique Applied to Landsat Radiances Over the Oklahoma ARM Site

    NASA Technical Reports Server (NTRS)

Oreopoulos, Lazaros; Cahalan, Robert; Marshak, Alexander; Wen, Guoyong

    1999-01-01

We suggest a new approach to cloud retrieval, using a normalized difference of nadir reflectivities (NDNR) constructed from a non-absorbing and an absorbing (with respect to liquid water) wavelength. Using Monte Carlo simulations, we show that this quantity has the potential to remove first-order scattering effects caused by cloud-side illumination and shadowing at oblique Sun angles. Application of the technique to TM (Thematic Mapper) radiance observations from Landsat-5 over the Southern Great Plains site of the ARM (Atmospheric Radiation Measurement) program gives very similar regional statistics and histograms, but significant differences at the pixel level. NDNR can also be combined with the inverse NIPA (Nonlocal Independent Pixel Approximation) of Marshak (1998), which is applied here for the first time to overcast Landsat subscenes. We demonstrate the sensitivity of the NIPA-retrieved cloud fields to the parameters of the method and discuss practical issues related to the optimal choice of these parameters.
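
    The index itself is straightforward: for reflectances R_na (non-absorbing band) and R_a (absorbing band), NDNR = (R_na - R_a) / (R_na + R_a). A sketch with hypothetical pixel values (the band choices and numbers are illustrative, not taken from the paper):

    ```python
    import numpy as np

    def ndnr(r_nonabs, r_abs):
        """Normalized difference of nadir reflectances:
        (non-absorbing - absorbing) / (non-absorbing + absorbing)."""
        r1 = np.asarray(r_nonabs, dtype=float)
        r2 = np.asarray(r_abs, dtype=float)
        return (r1 - r2) / (r1 + r2)

    # Hypothetical pixels: an optically thicker cloud absorbs relatively more
    # at the liquid-absorbing wavelength, raising the normalized difference.
    thin = ndnr(0.30, 0.28)    # weak absorption signal -> small NDNR
    thick = ndnr(0.80, 0.55)   # stronger absorption signal -> larger NDNR
    print(round(float(thin), 3), round(float(thick), 3))
    ```

    Because the two bands share much of the first-order geometric signal (illuminated and shadowed cloud sides), the ratio form suppresses it, which is the motivation stated in the abstract.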

  7. Feasibility Studies of Applying Kalman Filter Techniques to Power System Dynamic State Estimation

    SciTech Connect

    Huang, Zhenyu; Schneider, Kevin P.; Nieplocha, Jarek

    2007-08-01

The lack of dynamic information in power system operations is largely attributable to the static modeling used in traditional state estimation, which is the basis for many other operations functions. This paper investigates the feasibility of applying Kalman filter techniques to enable the inclusion of dynamic modeling in the state estimation process and the estimation of power system dynamic states. The proposed Kalman-filter-based dynamic state estimation is tested on a multi-machine system with both large and small disturbances. Sensitivity studies of the dynamic state estimation performance with respect to measurement characteristics, namely sampling rate and noise level, are presented as well. The study results show a promising path forward to implementing Kalman-filter-based dynamic state estimation with the emerging phasor measurement technologies.
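
    A generic linear Kalman filter predict/update cycle can illustrate the idea (the two-state model, noise covariances, and single-sensor measurement setup below are hypothetical stand-ins, not the paper's machine model):

    ```python
    import numpy as np

    # Hypothetical 2-state linear model (e.g. an angle-like state and its rate),
    # with a noisy measurement of the first state only.
    dt = 0.1
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])              # measure the first state
    Q = 1e-4 * np.eye(2)                    # process noise covariance
    R = np.array([[0.04]])                  # measurement noise covariance (std 0.2)

    def kalman_step(x, P, z):
        """One predict/update cycle of the discrete-time Kalman filter."""
        x = F @ x                           # predict state
        P = F @ P @ F.T + Q                 # predict covariance
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ (z - H @ x)             # update with measurement residual
        P = (np.eye(2) - K @ H) @ P
        return x, P

    rng = np.random.default_rng(1)
    truth = np.array([0.0, 0.5])            # true state: constant rate 0.5
    x, P = np.zeros(2), np.eye(2)
    for _ in range(100):
        truth = F @ truth
        z = H @ truth + rng.normal(0.0, 0.2, size=1)
        x, P = kalman_step(x, P, z)

    print(np.round(x - truth, 2))           # estimation error after 100 steps
    ```

    Note that the filter recovers the unmeasured second state (the rate) from the dynamics alone, which is the appeal of dynamic state estimation over static snapshots.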

  8. Roller compaction scale-up using roll width as scale factor and laser-based determined ribbon porosity as critical material attribute.

    PubMed

    Allesø, Morten; Holm, René; Holm, Per

    2016-05-25

Due to the complexity and difficulties associated with mechanistic modeling of the roller compaction process for scale-up, an innovative equipment approach is to keep the roll diameter fixed between scales and instead vary the roll width. Assuming a fixed gap and roll force, this approach should create similar conditions in the nip regions of the two compactor scales and thus result in a scale-reproducible ribbon porosity. In the present work, a non-destructive laser-based technique was used to measure ribbon porosity at-line with high precision and high accuracy, as confirmed by an initial comparison to a well-established oil-intrusion volume displacement method. The ribbon porosity was found to be scale-independent when comparing the average porosity of a group of ribbon samples (n=12) from small scale (Mini-Pactor®) to large scale (Macro-Pactor®). A higher standard deviation of ribbon fragment porosities from the large-scale roller compactor was attributed to minor variations in powder densification across the roll width. With the intention of reproducing ribbon porosity from one scale to the other, process settings of roll force and gap size applied to the Mini-Pactor® (and identified during formulation development) were therefore directly transferable to subsequent commercial-scale production on the Macro-Pactor®. This creates a better link between formulation development and technology transfer and decreases the number of batches needed to establish the parameter settings of the commercial process. PMID:26545485
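
    The porosity comparison underlying the scale-up argument can be sketched with hypothetical density data (illustrative values, not the paper's measurements): porosity follows from envelope versus true density, and the two scales are compared on sample mean and spread for n = 12 ribbons each.

    ```python
    import statistics

    def porosity(envelope_density, true_density):
        """Ribbon porosity (fraction) from envelope vs. true (skeletal) density."""
        return 1.0 - envelope_density / true_density

    # Hypothetical envelope densities (g/cm^3) for 12 ribbons per scale.
    true_rho = 1.50
    small_scale = [porosity(rho, true_rho) for rho in
                   [1.20, 1.21, 1.19, 1.20, 1.22, 1.18,
                    1.20, 1.21, 1.19, 1.20, 1.20, 1.21]]
    large_scale = [porosity(rho, true_rho) for rho in
                   [1.19, 1.23, 1.17, 1.22, 1.18, 1.24,
                    1.20, 1.16, 1.23, 1.19, 1.21, 1.18]]

    # Scale-independent mean porosity, but a wider spread at the larger roll width.
    print(round(statistics.mean(small_scale), 3), round(statistics.mean(large_scale), 3))
    print(statistics.stdev(small_scale), statistics.stdev(large_scale))
    ```

    The same comparison of means (for transferability) and standard deviations (for across-roll-width uniformity) is what the abstract reports for the Mini-Pactor® and Macro-Pactor® data.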

  9. Single Layer Centrifugation Can Be Scaled-Up Further to Process up to 150 mL Semen

    PubMed Central

    Morrell, J. M.; van Wienen, M.; Wallgren, M.

    2011-01-01

Single Layer centrifugation (SLC) has been used to improve the quality of sperm samples in several species. However, where stallion or boar semen is to be used for AI, larger volumes of semen have to be processed than for other species, limiting the effectiveness of the original technique. The objective of the present study was to scale up the SLC method for both stallion and boar semen. Stallion semen could be processed in 100 mL glass tubes without a loss of sperm quality, and similarly, boar semen could be processed in 200 mL and 500 mL tubes without losing sperm quality. The results of these preliminary studies are encouraging, and larger trials are underway to evaluate these methods in the field. PMID:23738111

  10. Pore-Water Extraction Scale-Up Study for the SX Tank Farm

    SciTech Connect

    Truex, Michael J.; Oostrom, Martinus; Wietsma, Thomas W.; Last, George V.; Lanigan, David C.

    2013-01-15

The phenomena related to pore-water extraction from unsaturated sediments have previously been examined with limited laboratory experiments and numerical modeling; however, key scale-up issues have not yet been addressed. Laboratory experiments and numerical modeling were conducted to specifically examine pore-water extraction for sediment conditions relevant to the vadose zone beneath the SX Tank Farm at the Hanford Site in southeastern Washington State. Available SX Tank Farm data were evaluated to generate a conceptual model of the subsurface for a targeted pore-water extraction application in areas with elevated moisture and Tc-99 concentrations. The hydraulic properties of the porous media representative of the SX Tank Farm target application were determined using sediment mixtures prepared in the laboratory based on available borehole sediment particle-size data. Numerical modeling was used as an evaluation tool for scale-up of pore-water extraction for targeted field applications.