Science.gov

Sample records for scaling-up technique applied

  1. An efficient permeability scaling-up technique applied to the discretized flow equations

    SciTech Connect

    Urgelli, D.; Ding, Yu

    1997-08-01

    Grid-block permeability scaling-up for numerical reservoir simulations has been discussed for a long time in the literature. It is now recognized that a full permeability tensor is needed to get an accurate reservoir description at large scale. However, two major difficulties are encountered: (1) grid-block permeability cannot be properly defined because it depends on boundary conditions; (2) discretization of flow equations with a full permeability tensor is not straightforward and little work has been done on this subject. In this paper, we propose a new method, which allows us to get around both difficulties. Because the two problems are closely related, a global approach preserves accuracy: in the proposed method, the permeability up-scaling technique is integrated into the discretized numerical scheme for flow simulation. The permeability is scaled up via the transmissibility term, in accordance with the fluid flow calculation in the numerical scheme. A finite-volume scheme is studied in particular, and the transmissibility scaling-up technique for this scheme is presented. Several numerical flow-simulation examples are tested. The new method is compared with some published numerical schemes for full permeability tensor discretization in which the full permeability tensor is scaled up through various techniques. Comparing the results with fine-grid simulations shows that the new method is more accurate and more efficient.
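
    As a hedged illustration of upscaling through the transmissibility term rather than through a block permeability, the sketch below computes a coarse-block transmissibility from a fine-grid permeability field by harmonic averaging along the assumed flow direction and arithmetic averaging across it. This is a simplified, textbook-style stand-in, not the authors' finite-volume scheme; the random field, cell sizes, and averaging choices are assumptions.

        # Illustrative sketch (not the authors' scheme): upscale fine-grid permeability
        # into a coarse-block transmissibility by harmonic averaging along the flow
        # direction and arithmetic averaging across it.
        import numpy as np

        def coarse_transmissibility(k_fine, dx, dy, thickness=1.0):
            """k_fine: 2D array of fine-cell permeabilities spanning two neighbouring
            coarse blocks; flow is assumed along axis 1 (x)."""
            # harmonic average along the flow direction for each fine row
            k_harm_rows = k_fine.shape[1] / np.sum(1.0 / k_fine, axis=1)
            # arithmetic average across rows gives an effective block permeability
            k_eff = np.mean(k_harm_rows)
            area = k_fine.shape[0] * dy * thickness   # cross-sectional area
            length = k_fine.shape[1] * dx             # distance between block centres
            return k_eff * area / length              # transmissibility T = k*A/L

        rng = np.random.default_rng(0)
        k = np.exp(rng.normal(0.0, 1.0, size=(10, 20)))  # lognormal fine-scale field
        print(coarse_transmissibility(k, dx=1.0, dy=1.0))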

  2. Imaging techniques applied to the study of fluids in porous media. Scaling up in Class 1 reservoir type rock

    SciTech Connect

    Tomutsa, L.; Brinkmeyer, A.; Doughty, D.

    1993-04-01

    A synergistic rock characterization methodology has been developed. It derives reservoir engineering parameters from X-ray tomography (CT) scanning, computer-assisted petrographic image analysis, minipermeameter measurements, and nuclear magnetic resonance imaging (NMRI). This rock characterization methodology is used to investigate the effect of small-scale rock heterogeneity on oil distribution and recovery. It is also used to investigate the applicability of imaging technologies to the development of scale-up procedures from core plug to whole core, by comparing the results of detailed simulations with the images of the fluid distributions observed by CT scanning. By using the detailed rock and fluid data generated by the imaging technologies described, one can directly verify various scaling-up techniques in the laboratory. As an example, realizations of rock properties statistically and spatially compatible with the observed values are generated by one of the various stochastic methods available (turning bands) and are used as simulator input. The simulation results were compared with both the simulation results using the true rock properties and the fluid distributions observed by CT. Conclusions regarding the effect of the various permeability models on waterflood oil recovery were formulated.

  3. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    Tri-isotropic (TRISO) fuel particle coating is critical for the future use of nuclear energy produced by advanced gas reactors (AGRs). The fuel kernels are coated using chemical vapor deposition in a spouted fluidized bed. The challenges encountered in operating TRISO fuel coaters stem from the fact that in modern AGRs, such as High Temperature Gas Reactors (HTGRs), the acceptable level of defective/failed coated particles is essentially zero. This specification requires processes that produce coated spherical particles with even coatings having extremely low defect fractions. Unfortunately, the scale-up and design of the current processes and coaters have been based on empirical approaches and are operated as black boxes. Hence, a voluminous amount of experimental development and trial-and-error work has been conducted. It has been clearly demonstrated that the quality of the coating applied to the fuel kernels is impacted by the hydrodynamics, solids flow field, and flow regime characteristics of the spouted bed coaters, which themselves are influenced by design parameters and operating variables. Further complicating the outlook for future fuel-coating technology and nuclear energy production is the fact that a variety of new concepts will involve fuel kernels of different sizes and with compositions of different densities. Therefore, without a fundamental understanding of the underlying phenomena in the spouted bed TRISO coater, a significant amount of effort is required for production of each type of particle, with a significant risk of not meeting the specifications. This difficulty will significantly and negatively impact the applications of AGRs for power generation and cause further challenges to them as an alternative source of commercial energy production. Accordingly, the proposed work seeks to overcome such hurdles and advance the scale-up, design, and performance of TRISO fuel particle spouted bed coaters. The overall objectives of the proposed work are

  4. Mechanochemistry applied to reformulation and scale-up production of Ethionamide: Salt selection and solubility enhancement.

    PubMed

    de Melo, Cristiane C; da Silva, Cecilia C P; Pereira, Carla C S S; Rosa, Paulo C P; Ellena, Javier

    2016-01-01

    Ethionamide (ETH), a Biopharmaceutics Classification System class II drug, is a second-line drug manufactured as an oral dosage form by Pfizer to treat tuberculosis. Since its discovery in 1956, only one reformulation has been proposed, in 2005, as part of the efforts to improve its solubility. Due to the limited scientific research on active pharmaceutical ingredients (APIs) for the treatment of neglected diseases, we focused on the development of an accessible and green supramolecular synthesis protocol for the production of novel solid forms of ETH. Initially, three salts were obtained by crystal engineering and supramolecular synthesis via slow evaporation of the solvent: a saccharinate, a maleate and an oxalate. The crystal structures of all salts were determined by single-crystal X-ray diffraction. Subsequently, mechanochemical protocols were developed for them, with the scale-up production of the maleate salt proving reproducible, as confirmed by powder X-ray diffraction. Finally, a more complete solid-state characterization was carried out for the ETH maleate salt, including thermal analysis, infrared spectroscopy, scanning electron microscopy and equilibrium solubility in different dissolution media. Although ETH maleate is thermodynamically less stable than ETH, the equilibrium solubility results revealed that this novel salt is much more soluble in purified water than ETH, making it a suitable new candidate for future formulations. PMID:26472469

  5. Estimating the Annual Incidence of Abortions in Iran Applying a Network Scale-up Approach

    PubMed Central

    Rastegari, Azam; Baneshi, Mohammad Reza; Haji-maghsoudi, Saiedeh; Nakhaee, Nowzar; Eslami, Mohammad; Malekafzali, Hossein; Haghdoost, Ali Akbar

    2014-01-01

    Background: Abortion is a major public health concern in developing countries. In settings in which abortion is highly prohibited, direct interviews are not a reliable method to estimate the abortion rate. Indirect estimation methods might overcome this dilemma; they are practical methods to estimate the size of hidden groups that would not agree to participate in a direct interview. Objectives: The aim of this study was to explore the practicality of an indirect method for estimating the abortion rate, known as the network scale-up method, and to provide an estimate of the episodes of abortion with and without medical indications (AWMI+ and AWMI-) in Iran. Materials and Methods: This cross-sectional study was conducted in the 31 provinces of Iran in 2012. A random sample of between 200 and 1000 people was selected in each province by multistage sampling; 75% of the data were collected from the provincial capital and 25% from one other main city. We sampled urban people more than 18 years old (n = 12960) and asked them about the number of abortions among women they knew who had experienced medical or non-medical abortions in the past year. A range for the transparency factor was estimated based on expert opinion. Results: The ranges of the transparency factor for AWMI+ and AWMI- were 0.43-0.75 and 0.2-0.34, respectively. Regarding AWMI+, our minimum and maximum estimates (per 1000 pregnancies) were 70.54 and 116.9, respectively. The corresponding figures for AWMI- were 93.18 and 148.7. Conclusions: The frequency rates for AWMI+ and AWMI- were relatively high. Therefore, the health system has to address this hidden problem using appropriate preventive policies. PMID:25558379
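
    A minimal sketch of the basic network scale-up estimator under stated assumptions: the number of events respondents report within their personal networks is scaled by the ratio of the total population to the summed network sizes, then corrected by a transparency (visibility) factor. All numbers below are illustrative, not the study's data.

        # Minimal network scale-up sketch (illustrative numbers, not the study's data).
        def nsum_estimate(reported_events, network_sizes, population, transparency):
            """Scale the event rate observed within respondents' networks up to the
            whole population, then correct for under-reporting of a sensitive event
            via the transparency factor (< 1 means events are under-reported)."""
            visible_rate = sum(reported_events) / sum(network_sizes)
            return visible_rate * population / transparency

        # hypothetical inputs: a handful of respondents, assumed network sizes
        reported = [0, 1, 0, 2]          # abortions each respondent knew of last year
        networks = [250, 310, 280, 290]  # estimated personal network sizes
        print(nsum_estimate(reported, networks, population=50_000_000, transparency=0.43))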

  6. Scale effects: HCMM data simulation. Usage of filtering techniques for scaling-up simulations

    NASA Technical Reports Server (NTRS)

    Digennaro, V. (Principal Investigator)

    1980-01-01

    Image reduction used to simulate an increase in the altitude of an acquisition platform is equivalent to data smoothing and can be achieved either by neighborhood averaging or by filtering techniques. The averaging approach is too limited for accurate simulation. A filtering method is described which is based on the hypothesis that all changes due to an altitude increase can be represented by a point spread function. Determination of the scale function and factor is discussed, as well as implementation of the filtering. Filtering can be performed in either the spatial or the frequency domain. In the spatial domain, filtering consists of the convolution of the image with the weight mask, followed by decimation of the points according to the appropriate scale factor. A simulation of an aircraft daytime image in the infrared channel is examined.
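
    A hedged sketch of the spatial-domain procedure described above: convolve the image with a point spread function and then decimate by the scale factor. A Gaussian PSF, and the link between its width and the scale factor, are assumptions made for illustration.

        # Illustrative sketch of the spatial-domain approach: convolve with a point
        # spread function (a Gaussian here, an assumption) and decimate by the scale factor.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def simulate_coarser_altitude(image, scale_factor, psf_sigma=None):
            if psf_sigma is None:
                psf_sigma = scale_factor / 2.0    # assumed link between PSF width and scale
            smoothed = gaussian_filter(image, sigma=psf_sigma)  # convolution with the weight mask
            return smoothed[::scale_factor, ::scale_factor]     # decimation

        img = np.random.rand(512, 512)            # stand-in for an IR channel image
        coarse = simulate_coarser_altitude(img, scale_factor=4)
        print(coarse.shape)                       # (128, 128)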

  7. Scaling up Psycholinguistics

    ERIC Educational Resources Information Center

    Smith, Nathaniel J.

    2011-01-01

    This dissertation contains several projects, each addressing different questions with different techniques. In chapter 1, I argue that they are unified thematically by their goal of "scaling up psycholinguistics"; they are all aimed at analyzing large data-sets using tools that reveal patterns to propose and test mechanism-neutral hypotheses about…

  8. Evaluating of scale-up methodologies of gas-solid spouted beds for coating TRISO nuclear fuel particles using advanced measurement techniques

    NASA Astrophysics Data System (ADS)

    Ali, Neven Y.

    This work implements, for the first time, advanced non-invasive measurement techniques to evaluate two scale-up methodologies for gas-solid spouted beds: the approach reported in the literature, based on matching dimensionless groups, and a new mechanistic methodology developed in our laboratory, based on matching the radial profile of gas holdup, since the gas dynamics dictate the hydrodynamics of gas-solid spouted beds. These techniques are gamma-ray computed tomography (CT), to measure the cross-sectional distribution of the phases' holdups and their radial profiles along the bed height, and radioactive particle tracking (RPT), to measure the three-dimensional (3D) solids velocity and its turbulent parameters. The measured local parameters and the analysis of the results obtained in this work validate our new scale-up methodology for gas-solid spouted beds by comparing, for similarity, the phases' holdups and the dimensionless solids velocities and their turbulent parameters, non-dimensionalized using the minimum spouting superficial gas velocity. In contrast, the scale-up methodology based on matching dimensionless groups was not found to give hydrodynamic similarity with respect to local parameters such as the phases' holdups and the dimensionless solids velocities and their turbulent parameters; in the literature this method was validated only against global parameters. Thus, this work confirms that validation of scale-up methods for gas-solid spouted beds with respect to hydrodynamic similarity should rest on measuring and analyzing the local hydrodynamic parameters.
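
    A minimal sketch of the similarity comparison described above: solids velocities measured at two scales are non-dimensionalized by each bed's minimum spouting superficial gas velocity and the radial profiles are compared. The profile values and minimum spouting velocities below are placeholders, not data from this work.

        # Sketch of the similarity check: compare radial profiles of solids velocity
        # after non-dimensionalizing by each bed's minimum spouting velocity.
        # Profile values and U_ms numbers below are placeholders.
        import numpy as np

        r_over_R = np.linspace(0.0, 1.0, 6)                    # dimensionless radial positions
        v_small  = np.array([1.8, 1.5, 1.0, 0.4, -0.2, -0.4])  # m/s, small bed (RPT-style data)
        v_large  = np.array([2.7, 2.2, 1.5, 0.6, -0.3, -0.6])  # m/s, large bed
        U_ms_small, U_ms_large = 0.9, 1.4                      # minimum spouting gas velocities, m/s

        v_star_small = v_small / U_ms_small
        v_star_large = v_large / U_ms_large
        mismatch = np.max(np.abs(v_star_small - v_star_large))
        print(f"max dimensionless-velocity mismatch: {mismatch:.2f}")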

  9. Applied ALARA techniques

    SciTech Connect

    Waggoner, L.O.

    1998-02-05

    The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. The last reactor plants and processing facilities were shut down, and Hanford was given a new mission to put the facilities in a safe condition, decontaminate them, and prepare them for decommissioning. The skills that were necessary to operate these facilities were different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes cleanup of contaminated areas in and around all the facilities, recovery of reactor fuel from spent fuel pools, and the removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. At Hanford, facilities contain a myriad of radioactive isotopes located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early on that, in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and get radiological control and/or ALARA committee personnel involved early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit contamination spread, and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady, and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work.

  10. Scaling up Education Reform

    ERIC Educational Resources Information Center

    Gaffney, Jon D. H.; Richards, Evan; Kustusch, Mary Bridget; Ding, Lin; Beichner, Robert J.

    2008-01-01

    The SCALE-UP (Student-Centered Activities for Large Enrollment Undergraduate Programs) project was developed to implement reforms designed for small classes into large physics classes. Over 50 schools across the country, ranging from Wake Technical Community College to Massachusetts Institute of Technology (MIT), have adopted it for classes of…

  11. The SCALE-UP Project

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    2015-03-01

    The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students not only to learn and succeed in acquiring content, but also to practice important 21st century skills like problem solving, communication, and teamsmanship. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicate highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).

  12. Scaling up of renewable chemicals.

    PubMed

    Sanford, Karl; Chotani, Gopal; Danielson, Nathan; Zahn, James A

    2016-04-01

    The transition of promising technologies for production of renewable chemicals from laboratory scale to commercial scale is often difficult and expensive. As a result, the timeframe for commercialization is typically underestimated, resulting in much slower penetration of these promising new methods and products into the chemical industries. The theme of 'sugar is the next oil' connects biological, chemical, and thermochemical conversions of renewable feedstocks to products that are drop-in replacements for petroleum-derived chemicals or are new-to-market chemicals/materials. The latter typically offer a functionality advantage and can command higher prices, which results in less severe scale-up challenges. However, for drop-in replacements, price is of paramount importance and competitive capital and operating expenditures are a prerequisite for success. Hence, scale-up of relevant technologies must be interfaced with effective and efficient management of both cell and steel factories. Details involved in all aspects of manufacturing, such as utilities, sterility, product recovery and purification, regulatory requirements, and emissions, must be managed successfully. PMID:26874264

  13. Wax deposition scale-up modeling for waxy crude production lines

    SciTech Connect

    Hsu, J.J.C.; Brubaker, J.P.

    1995-12-01

    A wax deposition scale-up model has been developed to scale up laboratory wax deposition results for waxy crude production lines. The wax deposition model allows users to predict the wax deposition profile along a cold pipeline and to anticipate potential wax problems and pigging frequency. Consideration of the flow turbulence effect significantly increases prediction accuracy. Accurate wax deposition prediction should save capital and operating investment for waxy crude production systems. Many wax deposition models apply only a molecular diffusion mechanism and neglect the shear effect. However, the flow turbulence effect has a significant impact on wax deposition and cannot be neglected in wax deposition modeling. Wax deposition scale-up parameters including shear rate, shear stress, and Reynolds number have been studied; none of these parameters can be used as a scaler. The critical wax tension concept has been proposed as a scaler. A technique to scale up the shear effect, and hence wax deposition, is described. For a given oil and oil temperature, the laboratory wax deposition data can be scaled up by heat flux and flow velocity. The scale-up techniques could be applied to multiphase flow conditions. Examples are presented in this paper to describe profiles of wax deposition and effective inside diameter along North Sea and West Africa subsea pipelines. The difference in wax deposition profiles between stock tank oil and live oil is also presented.

  14. Steamflood modeling and scale up in a heterogeneous reservoir

    SciTech Connect

    Dehghani, K.; Basham, W.M.; Durlofsky, L.J.

    1996-12-31

    A study was undertaken to investigate the effects of different levels of reservoir description for modeling the steamflood process in Midway-Sunset 26C south. This reservoir is highly stratified and interbedded with fine-scale heterogeneities. The study also included development of a methodology for coarsening very detailed geostatistically derived models so that the effects of fine-scale heterogeneities on the steamflood performance prediction can be captured. Porosity and permeability cubes with three different levels of detail in layering were generated geostatistically for a 30-acre area in the 26C south using data from 57 wells. Cross-section models were extracted from these cubes in both dipping and non-dipping structures for steamflood simulations using field injection data, and the results were compared. The most detailed models were coarsened by two methods. In the first method, the impermeable layers were retained and the permeable sands were coarsened by averaging each 10 layers. In the second method, the dominant flow paths in the cross section are identified and used to selectively scale up the reservoir properties, leaving detail in regions where required and coarsening in other regions. The results showed that the detailed geostatistical model gives a more accurate performance prediction than a coarse geostatistical model. The results also showed that the most accurate coarsened models were obtained by applying the general scale-up technique.

  15. Steamflood modeling and scale up in a heterogeneous reservoir

    SciTech Connect

    Dehghani, K.; Basham, W.M.; Durlofsky, L.J.

    1996-01-01

    A study was undertaken to investigate the effects of different levels of reservoir description for modeling the steamflood process in Midway-Sunset 26C south. This reservoir is highly stratified and interbedded with fine-scale heterogeneities. The study also included development of a methodology for coarsening very detailed geostatistically derived models so that the effects of fine-scale heterogeneities on the steamflood performance prediction can be captured. Porosity and permeability cubes with three different levels of detail in layering were generated geostatistically for a 30-acre area in the 26C south using data from 57 wells. Cross-section models were extracted from these cubes in both dipping and non-dipping structures for steamflood simulations using field injection data, and the results were compared. The most detailed models were coarsened by two methods. In the first method, the impermeable layers were retained and the permeable sands were coarsened by averaging each 10 layers. In the second method, the dominant flow paths in the cross section are identified and used to selectively scale up the reservoir properties, leaving detail in regions where required and coarsening in other regions. The results showed that the detailed geostatistical model gives a more accurate performance prediction than a coarse geostatistical model. The results also showed that the most accurate coarsened models were obtained by applying the general scale-up technique.

  16. Application of a new scale up methodology to the simulation of displacement processes in heterogeneous reservoirs

    SciTech Connect

    Durlofsky, L.J.; Milliken, W.J.; Dehghani, K.; Jones, R.C.

    1994-12-31

    A general method for the scale up of highly detailed, heterogeneous reservoir cross sections is presented and applied to the simulation of several recovery processes in a variety of geologic settings. The scale up technique proceeds by first identifying portions of the fine scale reservoir description which could potentially lead to high fluid velocities, typically regions of connected, high permeability. These regions are then modeled in detail while the remainder of the domain is coarsened using a general numerical technique for the calculation of effective permeability. The overall scale up method is applied to the cross sectional simulation of three actual fields. Waterflood, steamflood and miscible flood recovery processes are considered. In all these cases, the scale up technique is shown to give coarsened reservoir descriptions which provide simulation results in very good agreement with those of the detailed reservoir descriptions. For these simulations, speedups in computation times, for the coarsened models relative to their fine grid counterparts, range from a factor of 10 to a factor of 200.
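
    As a hedged sketch of selective coarsening in this spirit, the code below keeps full vertical detail wherever a layer is flagged as part of a potential high-velocity (high-permeability) path and collapses the remaining layers into single effective-permeability layers. The geometric mean is used as a simple stand-in for the paper's numerical effective-permeability calculation, and the flagging rule is an assumption.

        # Illustrative selective-coarsening sketch: flagged fine layers are kept,
        # other blocks of layers are replaced by one effective-permeability layer.
        # The geometric mean stands in for the paper's numerical calculation.
        import numpy as np

        def coarsen(k_fine, keep_mask, block=10):
            """Collapse each group of `block` layers unless any layer in it is flagged."""
            n_blocks = k_fine.shape[0] // block
            coarse = []
            for b in range(n_blocks):
                sl = slice(b * block, (b + 1) * block)
                if keep_mask[sl].any():
                    coarse.append(k_fine[sl])            # keep full detail here
                else:
                    k_eff = np.exp(np.mean(np.log(k_fine[sl]), axis=0, keepdims=True))
                    coarse.append(k_eff)                 # one effective layer
            return np.concatenate(coarse, axis=0)

        rng = np.random.default_rng(1)
        k = np.exp(rng.normal(size=(100, 50)))           # fine-scale permeability field
        high_perm_rows = (k > np.quantile(k, 0.98)).any(axis=1)  # crude flag for fast paths
        print(coarsen(k, high_perm_rows).shape)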

  17. Scaling up Effects in the Organic Laboratory

    ERIC Educational Resources Information Center

    Persson, Anna; Lindstrom, Ulf M.

    2004-01-01

    A simple and effective way of exposing chemistry students to some of the effects of scaling up an organic reaction is described. It gives the student an experience that he or she may encounter in an industrial setting.

  18. Diagnostic technique applied for FEL electron bunches

    NASA Astrophysics Data System (ADS)

    Brovko, O.; Grebentsov, A.; Morozov, N.; Syresin, E.; Yurkov, M.

    2016-05-01

    A diagnostic technique for FEL ultrashort electron bunches has been developed within the JINR-DESY collaboration in the framework of the FLASH and XFEL projects. Photon diagnostics are based on calorimetric measurements and detection of undulator radiation. The infrared undulator constructed at JINR and installed at FLASH is used for longitudinal bunch shape measurements and for two-color lasing provided by the FIR and VUV undulators. The pump-probe experiments with the VUV and FIR undulators provide bunch profile measurements with a resolution of several femtoseconds. Three new microchannel plate (MCP) detectors operating in the X-ray range are now under development at JINR for the SASE1-SASE3 beamlines of the European XFEL.

  19. Flash Diffusivity Technique Applied to Individual Fibers

    NASA Technical Reports Server (NTRS)

    Mayeaux, Brian; Yowell, Leonard; Wang, Hsin

    2007-01-01

    A variant of the flash diffusivity technique has been devised for determining the thermal diffusivities, and thus the thermal conductivities, of individual aligned fibers. The technique is intended especially for application to nanocomposite fibers made from narrower fibers of polyphenylene benzobisthiazole (PBZT) and carbon nanotubes. These highly aligned nanocomposite fibers could exploit the high thermal conductivities of carbon nanotubes for thermal-management applications. In the flash diffusivity technique as practiced heretofore, one or more heat pulses are applied to the front face of a plate or disk material specimen and the resulting time-varying temperature on the rear face is measured. Usually, the heat pulse is generated by use of a xenon flash lamp, and the variation of temperature on the rear face is measured by use of an infrared detector. The flash energy is made large enough to produce a usefully high temperature rise on the rear face, but not so large as to significantly alter the specimen material. Once the measurement has been completed, the thermal diffusivity of the specimen is computed from the thickness of the specimen and the time dependence of the temperature variation on the rear face. Heretofore, the infrared detector used in the flash diffusivity technique has been a single-point detector, which responds to a spatial average of the thermal radiation from the rear specimen surface. Such a detector cannot distinguish among regions of differing diffusivity within the specimen. Moreover, two basic assumptions of the thermal-diffusivity technique as practiced heretofore are that the specimen is homogeneous and that heat flows one-dimensionally from the front to the rear face. These assumptions are not valid for an inhomogeneous (composite) material.
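
    For reference, the standard flash-method relation of Parker et al. recovers the diffusivity from the specimen thickness and the time for the rear face to reach half of its maximum temperature rise; conductivity then follows from density and specific heat. The sketch below uses that textbook relation with illustrative numbers; it is not specific to the fiber variant described here.

        # Standard flash-method relation (Parker et al.): thermal diffusivity from the
        # specimen thickness L and the time t_half for the rear face to reach half of
        # its maximum temperature rise. Values below are illustrative only.
        def flash_diffusivity(thickness_m, t_half_s):
            return 0.1388 * thickness_m ** 2 / t_half_s   # alpha, m^2/s

        def thermal_conductivity(alpha, density, specific_heat):
            return alpha * density * specific_heat        # k = alpha * rho * c_p

        alpha = flash_diffusivity(thickness_m=2e-3, t_half_s=0.05)
        print(alpha, thermal_conductivity(alpha, density=1500.0, specific_heat=1200.0))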

  20. Digital techniques applied to sine test control

    NASA Astrophysics Data System (ADS)

    Westoby, T. J.

    1981-09-01

    Digital techniques are applied to solve problems experienced in analogue circuitry, enabling the design of a highly reliable sine control system. A sine wave is generated whose frequency is proportional to a digital number held in the counters of the sweep generator, using a frequency-related pulse stream. This pulse stream is used to generate a ramp by applying it to a counter. The rate of rise is varied by using a rate multiplier arranged to slow the pulse stream as the ramp proceeds. Variation of frequency depends only on the frequency of the pulse stream entering the circuit, and the oscillator runs quite acceptably at 0.1 Hz and at 10 kHz. The total distortion at this stage is less than 2%. Since the control signal is quantized, only discrete changes in control are experienced, and the control lines are static most of the time; the digital system can reduce the effects of a noisy return signal by as much as 64 times. The greatest advantage of digital techniques is their use in integrator stabilization. A tracking capacitor ensures that conversion is done to an accuracy of 1%, and residual ripple on the output is removed by a low-pass filter.
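
    A minimal sketch of the underlying digital principle of setting frequency with a digital number: a phase accumulator is advanced by an increment proportional to the desired frequency and its top bits index a sine lookup table. The register width, table size, and clock rate below are arbitrary assumptions, not the hardware described in the paper.

        # Sketch of a digital sine generator along the lines described: the output
        # frequency is proportional to a digital increment added to a phase accumulator
        # on each clock tick, and the waveform is read from a lookup table.
        import math

        ACC_BITS, TABLE_BITS = 32, 10
        table = [math.sin(2 * math.pi * i / 2**TABLE_BITS) for i in range(2**TABLE_BITS)]

        def sine_samples(f_out, f_clock, n):
            increment = round(f_out / f_clock * 2**ACC_BITS)  # digital number sets frequency
            acc = 0
            for _ in range(n):
                acc = (acc + increment) % 2**ACC_BITS
                yield table[acc >> (ACC_BITS - TABLE_BITS)]   # top bits index the table

        print(list(sine_samples(f_out=0.1, f_clock=1000.0, n=5)))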

  1. Characterization of Filtration Scale-Up Performance

    SciTech Connect

    Daniel, Richard C.; Billing, Justin M.; Luna, Maria L.; Cantrell, Kirk J.; Peterson, Reid A.; Bonebrake, Michael L.; Shimskey, Rick W.; Jagoda, Lynette K.

    2009-03-09

    The scale-up performance of sintered stainless steel crossflow filter elements planned for use at the Pretreatment Engineering Platform (PEP) and at the Waste Treatment and Immobilization Plant (WTP) was characterized in partial fulfillment (see Table S.1) of the requirements of Test Plan TP-RPP-WTP-509. This test report details the results of experimental activities related only to filter scale-up characterization. These tests were performed under the Simulant Testing Program supporting Phase 1 of the demonstration of the pretreatment leaching processes at PEP. Pacific Northwest National Laboratory (PNNL) conducted the tests discussed herein for Bechtel National, Inc. (BNI) to address the data needs of Test Specification 24590-WTP-TSP-RT-07-004. Scale-up characterization tests employ high-level waste (HLW) simulants developed under Test Plan TP-RPP-WTP-469. The experimental activities outlined in TP-RPP-WTP-509 examined specific processes from two broad areas of simulant behavior: 1) leaching performance of the boehmite simulant as a function of suspending-phase chemistry and 2) filtration performance of the blended simulant with respect to filter scale-up and fouling. With regard to leaching behavior, the effect of anions on the kinetics of boehmite leaching was examined. Two experiments were conducted: 1) one examined the effect of the aluminate anion on the rate of boehmite dissolution, and 2) another determined the effect of secondary anions typical of Hanford tank wastes on the rate of boehmite dissolution. Both experiments provide insight into how compositional variations in the suspending phase impact the effectiveness of the leaching processes. In addition, the aluminate anion studies provide information on the consequences of gibbsite in the waste. The latter derives from the expected fast dissolution of gibbsite relative to boehmite. This test report concerns only the results of the filtration performance with respect to scale-up. Test results for boehmite

  2. Applying Renormalization Group Techniques to Nuclear Reactions

    NASA Astrophysics Data System (ADS)

    Eldredge, Zachary; Bogner, Scott; Nunes, Filomena

    2013-10-01

    Nuclear reactions are commonly used to explore the physics of unstable nuclei. Therefore, it is important that accurate, computationally favorable methods exist to describe them. Reaction models often make use of effective nucleon-nucleus potentials (optical potentials) which fit low-energy scattering data and include an imaginary component to account for the removal of flux from the elastic channel. When describing reactions in momentum space, the coupling between low- and high-momentum states can pose a technical challenge. We would like potentials which allow us to compute low-momentum interactions without including highly virtual momentum states. A solution to this problem is to apply renormalization group (RG) techniques to produce a new effective potential in which high and low momentum degrees of freedom are decoupled, so that we need only consider momenta below some cutoff. This poster will present results relating to an implementation of RG techniques on optical potentials, including complex potentials and spin-orbit effects. We show that our evolved optical potentials reproduce bound states and scattering phase shifts without the inclusion of any momenta above a selected cutoff, and compare new potentials to old ones to examine the effect of transformation.

  3. Lessons Learned on "Scaling Up" of Projects

    ERIC Educational Resources Information Center

    Viadero, Debra

    2007-01-01

    Having developed a technology-based teaching unit on weather that appeared to work well for middle school students, Nancy Butler Songer and her colleagues at the University of Michigan decided in the late 1990s to take the next logical step in their research program: They scaled up. This article discusses lessons learned by several faculty…

  4. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese-oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that, based on room-temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  5. Scaling up: Distributed machine learning with cooperation

    SciTech Connect

    Provost, F.J.; Hennessy, D.N.

    1996-12-31

    Machine-learning methods are becoming increasingly popular for automated data analysis. However, standard methods do not scale up to massive scientific and business data sets without expensive hardware. This paper investigates a practical alternative for scaling up: the use of distributed processing to take advantage of the often dormant PCs and workstations available on local networks. Each workstation runs a common rule-learning program on a subset of the data. We first show that for commonly used rule-evaluation criteria, a simple form of cooperation can guarantee that a rule will look good to the set of cooperating learners if and only if it would look good to a single learner operating with the entire data set. We then show how such a system can further capitalize on different perspectives by sharing learned knowledge for significant reduction in search effort. We demonstrate the power of the method by learning from a massive data set taken from the domain of cellular fraud detection. Finally, we provide an overview of other methods for scaling up machine learning.
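
    A simplified sketch of the cooperation idea: each learner evaluates a candidate rule on its own data partition and reports local counts, and pooling the counts reproduces the evaluation a single learner would compute over the entire data set. The rule representation and the toy fraud-detection records are assumptions, and this is not the paper's exact rule-evaluation criterion.

        # Sketch of cooperative rule evaluation: each worker scores a candidate rule on
        # its own partition and reports local (covered, correct) counts; summing the
        # counts gives the same accuracy the rule would get on the full data set.
        def local_counts(rule, partition):
            covered = [x for x in partition if rule["condition"](x)]
            correct = [x for x in covered if x["label"] == rule["prediction"]]
            return len(covered), len(correct)

        def pooled_accuracy(rule, partitions):
            covered, correct = 0, 0
            for part in partitions:
                c, k = local_counts(rule, part)   # in practice computed on separate machines
                covered, correct = covered + c, correct + k
            return correct / covered if covered else 0.0

        rule = {"condition": lambda x: x["calls_per_day"] > 50, "prediction": "fraud"}
        data = [{"calls_per_day": 60, "label": "fraud"}, {"calls_per_day": 10, "label": "ok"},
                {"calls_per_day": 70, "label": "ok"},   {"calls_per_day": 80, "label": "fraud"}]
        partitions = [data[:2], data[2:]]
        print(pooled_accuracy(rule, partitions))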

  6. Nanomedicine scale-up technologies: feasibilities and challenges.

    PubMed

    Paliwal, Rishi; Babu, R Jayachandra; Palakurthi, Srinath

    2014-12-01

    Nanomedicine refers to biomedical and pharmaceutical applications of nanosized cargos of drugs/vaccine/DNA therapeutics including nanoparticles, nanoclusters, and nanospheres. Such particles have unique characteristics related to their size, surface, drug loading, and targeting potential. They are widely used to combat disease by controlled delivery of bioactive(s) or for diagnosis of life-threatening problems in their very early stage. The bioactive agent can be combined with a diagnostic agent in a nanodevice for theragnostic applications. However, the formulation scientist faces numerous challenges related to their development, scale-up feasibilities, regulatory aspects, and commercialization. This article reviews recent progress in the method of development of nanoparticles with a focus on polymeric and lipid nanoparticles, their scale-up techniques, and challenges in their commercialization. PMID:25047256

  7. The projected effect of scaling up midwifery.

    PubMed

    Homer, Caroline S E; Friberg, Ingrid K; Dias, Marcos Augusto Bastos; ten Hoope-Bender, Petra; Sandall, Jane; Speciale, Anna Maria; Bartlett, Linda A

    2014-09-20

    We used the Lives Saved Tool (LiST) to estimate deaths averted if midwifery was scaled up in 78 countries classified into three tertiles using the Human Development Index (HDI). We selected interventions in LiST to encompass the scope of midwifery practice, including prepregnancy, antenatal, labour, birth, and post-partum care, and family planning. Modest (10%), substantial (25%), or universal (95%) scale-up scenarios from present baseline levels were all found to reduce maternal deaths, stillbirths, and neonatal deaths by 2025 in all countries tested. With universal coverage of midwifery interventions for maternal and newborn health, excluding family planning, for the countries with the lowest HDI, 61% of all maternal, fetal, and neonatal deaths could be prevented. Family planning alone could prevent 57% of all deaths because of reduced fertility and fewer pregnancies. Midwifery with both family planning and interventions for maternal and newborn health could avert a total of 83% of all maternal deaths, stillbirths, and neonatal deaths. The inclusion of specialist care in the scenarios resulted in an increased number of deaths being prevented, meaning that midwifery care has the greatest effect when provided within a functional health system with effective referral and transfer mechanisms to specialist care. PMID:24965814

  8. Applying Cryopreservation Techniques to Diverse Biological Materials

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Developing new cryopreservation protocols for each new plant or tissue is time consuming and often unnecessary. Existing standard protocols can be applied to many plants resulting in moderate to excellent results or protocols may require only a few changes for optimum recovery of plants. Protocols ...

  9. Novel techniques applied to polymer lifetime predictions

    SciTech Connect

    Gillen, K.T.; Wise, J.; Clough, R.L.

    1993-12-31

    A study aimed at testing the Arrhenius life prediction approach is described. After aging elastomeric materials at several elevated (accelerated) temperatures, a modulus profiling apparatus was used to demonstrate that complicated diffusion-limited oxidation anomalies are typically present under accelerated oven-aging conditions. By using surface modulus results (oxidation leads to a monotonic increase in modulus), estimates are made of the true activation energy (E_a) appropriate to the oxidation reactions dominating degradation. Even though macroscopic properties should be influenced by the diffusion-limited oxidation complications, ultimate tensile elongation results were found to be correlated to the true E_a. This implies that cracks initiate at the hardened surface of the material and then quickly propagate through the less oxidized interior. Even if values of E_a obtained from accelerated exposures can be determined and rationalized, another important question involves the Arrhenius assumption that E_a remains constant in the extrapolation region. Preliminary data from two ultra-sensitive techniques (oxygen consumption and microcalorimetry) aimed at testing this fundamental assumption are described.
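
    For context, the Arrhenius extrapolation underlying this life-prediction approach maps a time-to-failure measured at an accelerated aging temperature to a service temperature using the activation energy. The sketch below uses illustrative numbers, not data from the study.

        # Arrhenius extrapolation underlying the life-prediction approach: if the same
        # activation energy Ea governs degradation at both temperatures, the time to a
        # given damage level scales as exp(Ea/R * (1/T_use - 1/T_aging)).
        # Numbers below are illustrative, not data from the study.
        import math

        R = 8.314  # J/(mol*K)

        def extrapolate_lifetime(t_aging, T_aging_K, T_use_K, Ea_J_per_mol):
            return t_aging * math.exp(Ea_J_per_mol / R * (1.0 / T_use_K - 1.0 / T_aging_K))

        # e.g. 30 days at 110 C with Ea = 90 kJ/mol, extrapolated to 25 C service temperature
        print(extrapolate_lifetime(30.0, 383.15, 298.15, 90e3) / 365.0, "years")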

  10. Scaling Up Decision Theoretic Planning to Planetary Rover Problems

    NASA Technical Reports Server (NTRS)

    Meuleau, Nicolas; Dearden, Richard; Washington, Rich

    2004-01-01

    Because of communication limits, planetary rovers must operate autonomously for extended durations. The ability to plan under uncertainty is one of the main components of autonomy. Previous approaches to planning under uncertainty in NASA applications are not able to address the challenges of future missions because of several apparent limits. On the other hand, decision theory provides a solid, principled framework for reasoning about uncertainty and rewards. Unfortunately, there are several obstacles to a direct application of decision-theoretic techniques to the rover domain. This paper focuses on the issues of structure and concurrency, and of continuous state variables. We describe two techniques currently under development that specifically address these issues and allow decision-theoretic solution techniques to scale up to planetary rover planning problems involving a small number of goals.

  11. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  12. Scale-up on electrokinetic remediation: Engineering and technological parameters.

    PubMed

    López-Vizcaíno, Rubén; Navarro, Vicente; León, María J; Risco, Carolina; Rodrigo, Manuel A; Sáez, Cristina; Cañizares, Pablo

    2016-09-01

    This study analyses the effect of the scale-up of electrokinetic remediation (EKR) processes in natural soils. A soil-preparation procedure based on compaction is proposed to obtain soils with moisture content and density similar to those found in real soils in the field. The soil used here was from a region with high agrarian activity (Mora, Spain). The scale-up study was performed in two installations at different scales: a mock-up at pilot scale (0.175 m3) and a prototype at a scale very similar to a real application (16 m3). The electrode configuration selected consisted of rows of graphite electrodes facing each other, located in electrolyte wells. A discharge of 20 mg of 2,4-dichlorophenoxyacetic acid [2,4-D] per kg of dry soil was treated by applying an electric potential gradient of 1 V cm-1. An increase in scale was observed to directly influence the amount of energy supplied to the soil being treated. As a result, the electroosmotic and electromigration flows and the electric heating are more intense than in the smaller-scale tests (24%, 1%, and 25% higher, respectively, in the prototype). In addition, possible leaks were evaluated by conducting a watertightness test and quantifying evaporation losses. PMID:27209275

  13. Scale-up of sediment microbial fuel cells

    NASA Astrophysics Data System (ADS)

    Ewing, Timothy; Ha, Phuc Thi; Babauta, Jerome T.; Tang, Nghia Trong; Heo, Deukhyoun; Beyenal, Haluk

    2014-12-01

    Sediment microbial fuel cells (SMFCs) are used as renewable power sources to operate remote sensors. However, increasing the electrode surface area results in decreased power density, which demonstrates that SMFCs do not scale up with size. As an alternative to the physical scale-up of SMFCs, we proposed that it is possible to scale up power by using smaller-sized individually operated SMFCs connected to a power management system that electrically isolates the anodes and cathodes. To demonstrate our electronic scale-up approach, we operated one 0.36-m2 SMFC (called a single-equivalent SMFC) and four independent SMFCs of 0.09 m2 each (called scaled-up SMFCs) and managed the power using an innovative custom-developed power management system. We found that the single-equivalent SMFC and the scaled-up SMFCs produced similar power for the first 155 days. However, in the long term (>155 days) our scaled-up SMFCs generated significantly more power than the single-equivalent SMFC (2.33 mW vs. 0.64 mW). Microbial community analysis of the single-equivalent SMFC and the scaled-up SMFCs showed very similar results, demonstrating that the difference in operation mode had no significant effect on the microbial community. When we compared scaled-up SMFCs with parallel SMFCs, we found that the scaled-up SMFCs generated more power. Our novel approach demonstrates that SMFCs can be scaled up electronically.

  14. Scale-up of ecological experiments: Density variation in the mobile bivalve Macomona liliana

    USGS Publications Warehouse

    Schneider, D.C.; Walters, R.; Thrush, S.; Dayton, P.

    1997-01-01

    At present the problem of scaling up from controlled experiments (necessarily at a small spatial scale) to questions of regional or global importance is perhaps the most pressing issue in ecology. Most of the proposed techniques recommend iterative cycling between theory and experiment. We present a graphical technique that facilitates this cycling by allowing the scope of experiments, surveys, and natural history observations to be compared to the scope of models and theory. We apply the scope analysis to the problem of understanding the population dynamics of a bivalve exposed to environmental stress at the scale of a harbour. Previous lab and field experiments were found not to be 1:1 scale models of harbour-wide processes. Scope analysis allowed small scale experiments to be linked to larger scale surveys and to a spatially explicit model of population dynamics.

  15. GENOMIC AND PROTEOMIC TECHNIQUES APPLIED TO REPRODUCTIVE BIOLOGY

    EPA Science Inventory

    Genomic and proteomic techniques applied to reproductive biology
    John C. Rockett
    Reproductive Toxicology Division, National Health and Environmental Effects Research Laboratory, Office of Research and Development, United States Environmental Protection Agency, Research Tria...

  16. Accounting for the cost of scaling-up health interventions.

    PubMed

    Johns, Benjamin; Baltussen, Rob

    2004-11-01

    Recent studies such as the Commission on Macroeconomics and Health have highlighted the need for expanding the coverage of services for HIV/AIDS, malaria, tuberculosis, immunisations and other diseases. In order for policy makers to plan for these changes, they need to analyse the change in costs when interventions are 'scaled-up' to cover greater percentages of the population. Previous studies suggest that applying current unit costs to an entire population can misconstrue the true costs of an intervention. This study presents the methodology used in WHO-CHOICE's generalised cost effectiveness analysis, which includes non-linear cost functions for health centres, transportation and supervision costs, as well as the presence of fixed costs of establishing a health infrastructure. Results show changing marginal costs as predicted by economic theory. PMID:15386683
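
    As a hedged illustration only, the toy cost function below combines a fixed infrastructure cost with a variable cost whose marginal value rises as coverage grows; the functional form and all parameters are assumptions and do not represent the WHO-CHOICE model itself.

        # Toy cost function with the features discussed above: a fixed cost of
        # establishing infrastructure plus a variable cost whose marginal value
        # changes with coverage. Functional form and parameters are assumptions.
        def total_cost(coverage, population, fixed_cost, unit_cost, beta):
            """coverage in [0, 1]; beta > 1 means marginal costs rise as coverage grows
            (hard-to-reach groups), beta < 1 means economies of scale."""
            people_reached = coverage * population
            return fixed_cost + unit_cost * people_reached * coverage ** (beta - 1.0)

        for cov in (0.2, 0.5, 0.8, 0.95):
            c = total_cost(cov, population=1_000_000, fixed_cost=2e6, unit_cost=12.0, beta=1.4)
            print(f"coverage {cov:.0%}: average cost per person reached = {c / (cov * 1_000_000):.2f}")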

  17. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    Automated validation of flight-critical embedded systems is being done at ARC Dryden Flight Research Facility. The automated testing techniques are being used to perform closed-loop validation of man-rated flight control systems. The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 High Alpha Research Vehicle (HARV) automated test systems are discussed. Operationally applying automated testing techniques has accentuated flight control system features that either help or hinder the application of these techniques. The paper also discusses flight control system features which foster the use of automated testing techniques.

  18. Scale-up of miscible flood processes

    SciTech Connect

    Orr, F.M. Jr.

    1992-05-01

    Results of a wide-ranging investigation of the scaling of the physical mechanisms of miscible floods are reported. Advanced techniques for analysis of crude oils are considered in Chapter 2. Application of supercritical fluid chromatography is demonstrated for characterization of crude oils for equation-of-state calculations of phase equilibrium. Results of measurements of crude oil and phase compositions by gas chromatography and mass spectrometry are also reported. The theory of development of miscibility is considered in detail in Chapter 3. The theory is extended to four components, and sample solutions for a variety of gas injection systems are presented. The analytical theory shows that miscibility can develop even though standard tie-line extension criteria developed for ternary systems are not satisfied. In addition, the theory includes the first analytical solutions for condensing/vaporizing gas drives. In Chapter 4, methods for simulation of viscous fingering are considered. The scaling of the growth of transition zones in linear viscous fingering is considered. In addition, extension of the models developed previously to three dimensions is described, as is the inclusion of effects of equilibrium phase behavior. In Chapter 5, the combined effects of capillary and gravity-driven crossflow are considered. The experimental results presented show that very high recovery can be achieved by gravity segregation when interfacial tensions are moderately low. We argue that such crossflow mechanisms are important in multicontact miscible floods in heterogeneous reservoirs. In addition, results of flow visualization experiments are presented that illustrate the interplay of crossflow driven by gravity with that driven by viscous forces.

  19. Scale Up in Education. Volume 1: Ideas in Principle

    ERIC Educational Resources Information Center

    Schneider, Barbara Ed.; McDonald, Sarah-Kathryn Ed.

    2006-01-01

    "Scale Up in Education, Volume 1: Ideas in Principle" examines the challenges of "scaling up" from a multidisciplinary perspective. It brings together contributions from disciplines that routinely take promising innovations to scale, including medicine, business, engineering, computing, and education. Together the contributors explore appropriate…

  20. Readiness for Change. Scaling-Up Brief. Number 3

    ERIC Educational Resources Information Center

    Fixsen, Dean L.; Blase, Karen A.; Horner, Rob; Sugai, George

    2009-01-01

    The purpose of this "Brief" is to define the variables a state or large district leadership team may wish to consider as they determine if they are "ready" to invest in the scaling-up of an innovation in education. As defined here, "scaling up" means that at least 60% of the students who could benefit from an innovation have access to that…

  1. Multisite Studies and Scaling up in Educational Research

    ERIC Educational Resources Information Center

    Harwell, Michael

    2012-01-01

    A scale-up study in education typically expands the sample of students, schools, districts, and/or practices or materials used in smaller studies in ways that build in heterogeneity. Yet surprisingly little is known about the factors that promote successful scaling up efforts in education, in large part due to the absence of empirically supported…

  2. Applying Parallel Processing Techniques to Tether Dynamics Simulation

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    1996-01-01

    The focus of this research has been to determine the effectiveness of applying parallel processing techniques to a sizable real-world problem, the simulation of the dynamics associated with a tether which connects two objects in low earth orbit, and to explore the degree to which the parallelization process can be automated through the creation of new software tools. The goal has been to utilize this specific application problem as a base to develop more generally applicable techniques.

  3. Scaling up depot medroxyprogesterone acetate (DMPA): a systematic literature review illustrating the AIDED model

    PubMed Central

    2013-01-01

    Background Use of depot medroxyprogesterone acetate (DMPA), often known by the brand name Depo-Provera, has increased globally, particularly in multiple low- and middle-income countries (LMICs). As a reproductive health technology that has scaled up in diverse contexts, DMPA is an exemplar product innovation with which to illustrate the utility of the AIDED model for scaling up family health innovations. Methods We conducted a systematic review of the enabling factors and barriers to scaling up DMPA use in LMICs. We searched 11 electronic databases for academic literature published through January 2013 (n = 284 articles), and grey literature from major health organizations. We applied exclusion criteria to identify relevant articles from peer-reviewed (n = 10) and grey literature (n = 9), extracting data on scale up of DMPA in 13 countries. We then mapped the resulting factors to the five AIDED model components: ASSESS, INNOVATE, DEVELOP, ENGAGE, and DEVOLVE. Results The final sample of sources included studies representing variation in geographies and methodologies. We identified 15 enabling factors and 10 barriers to dissemination, diffusion, scale up, and/or sustainability of DMPA use. The greatest number of factors were mapped to the ASSESS, DEVELOP, and ENGAGE components. Conclusions Findings offer early empirical support for the AIDED model, and provide insights into scale up of DMPA that may be relevant for other family planning product innovations. PMID:23915274

  4. The Use of Video Tape as an Applied Research Technique.

    ERIC Educational Resources Information Center

    Deshler, J. David; Czaplewski, Ellen C.

    This report describes the application of videotape recordings as an applied research technique in a project directed toward the research and development of training designs and materials for human service agency professionals. The multiple applications of VTR and the types of data collection situations in which it has been utilized are briefly…

  5. Sensor Data Qualification Technique Applied to Gas Turbine Engines

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Simon, Donald L.

    2013-01-01

    This paper applies a previously developed sensor data qualification technique to a commercial aircraft engine simulation known as the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k). The sensor data qualification technique is designed to detect, isolate, and accommodate faulty sensor measurements. It features sensor networks, which group various sensors together, and relies on an empirically derived analytical model to relate the sensor measurements. Relationships between all member sensors of the network are analyzed to detect and isolate any faulty sensor within the network.
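
    A minimal sketch of the sensor-network idea under stated assumptions: each member sensor is predicted from the remaining members by an empirically fitted model (a linear least-squares fit here, an assumption, not necessarily the C-MAPSS40k implementation), and a sensor is flagged when its residual exceeds a threshold.

        # Sketch of the network idea: predict each member sensor from the others using
        # an empirically derived model (a linear least-squares fit, as an assumption)
        # and flag a sensor whose residual exceeds a threshold.
        import numpy as np

        def fit_network_models(history):
            """history: (n_samples, n_sensors) of fault-free data; returns one linear
            model per sensor predicting it from the remaining sensors."""
            models = []
            for j in range(history.shape[1]):
                X = np.delete(history, j, axis=1)
                X1 = np.hstack([X, np.ones((X.shape[0], 1))])   # add bias term
                coef, *_ = np.linalg.lstsq(X1, history[:, j], rcond=None)
                models.append(coef)
            return models

        def isolate_fault(sample, models, threshold):
            flags = []
            for j, coef in enumerate(models):
                x = np.append(np.delete(sample, j), 1.0)
                residual = abs(sample[j] - x @ coef)
                flags.append(residual > threshold)
            return flags

        rng = np.random.default_rng(2)
        base = rng.normal(size=(200, 1))
        hist = np.hstack([2 * base, 3 * base + 1, -base]) + 0.01 * rng.normal(size=(200, 3))
        models = fit_network_models(hist)
        print(isolate_fault(np.array([2.0, 4.0, -1.0]), models, threshold=0.5))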

  6. Digital image correlation techniques applied to LANDSAT multispectral imagery

    NASA Technical Reports Server (NTRS)

    Bonrud, L. O. (Principal Investigator); Miller, W. J.

    1976-01-01

    The author has identified the following significant results. Automatic image registration and resampling techniques applied to LANDSAT data achieved accuracies corresponding to mean radial displacement errors of less than 0.2 pixel. The processing method utilized recursive computational techniques and line-by-line updating on the basis of feedback error signals. Goodness of local feature matching was evaluated through the implementation of a correlation algorithm. An automatic restart allowed the system to derive control point coordinates over a portion of the image and to restart the process, utilizing this new control point information as initial estimates.
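
    A minimal sketch of the local feature-matching step: normalized cross-correlation of a small reference chip against a search window, with the correlation peak giving the control-point offset. This illustrates the principle only; the chip size, window, and data are placeholders.

        # Sketch of local feature matching: normalized cross-correlation of a small
        # reference chip against a search window; the peak gives the estimated offset.
        import numpy as np

        def ncc(a, b):
            a = a - a.mean(); b = b - b.mean()
            return float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12))

        def match_offset(chip, window):
            ch, cw = chip.shape
            best, best_score = (0, 0), -2.0
            for dy in range(window.shape[0] - ch + 1):
                for dx in range(window.shape[1] - cw + 1):
                    score = ncc(chip, window[dy:dy + ch, dx:dx + cw])
                    if score > best_score:
                        best, best_score = (dy, dx), score
            return best, best_score

        rng = np.random.default_rng(3)
        scene = rng.random((64, 64))
        chip = scene[20:36, 30:46]                      # 16x16 reference chip
        print(match_offset(chip, scene[10:50, 20:60]))  # expect offset (10, 10)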

  7. Scale-up Synthesis of Diallyl Phthalate Prepolymer

    SciTech Connect

    Carey, D. A.

    1984-10-01

    This project was initiated to develop processes for the synthesis of diallyl phthalate (DAP) prepolymer in the Bendix Chemical Polymer Facility. Thus far, five scale-up reactions have been carried out in a 100-gallon reactor and fourteen have been conducted in the 15-gallon resin kettle. The synthesis of diallyl isophthalate prepolymer (DAIPP) was also investigated; eight scale-up reactions of this prepolymer have been carried out. Aging studies on DAIPP were also conducted.

  8. A new scale-up approach for dispersive mixing in twin-screw compounding

    NASA Astrophysics Data System (ADS)

    Fukuda, Graeme; Bigio, David I.; Andersen, Paul; Wetzel, Mark

    2015-05-01

    Scale-up rules in polymer processing are critical in ensuring consistency in product quality and properties when transitioning from low-volume laboratory mixing processes to high-volume industrial compounding. The scale-up approach investigated in this study evaluates the processes with respect to dispersive mixing. Demand for polymer composites with solid additives, such as carbon microfibers and nanotubes, has become increasingly strong. Dispersive mixing breaks down particles that agglomerate, which is paramount in processing composites because solid additives tend to collect and clump. The amount of stress imparted on the material governs the degree of dispersive mixing. A methodology has been developed to characterize the Residence Stress Distribution (RSD) within a twin-screw extruder in real time through the use of polymeric stress beads. Through this technique, certain mixing scale-up rules can be analyzed. The following research investigated two different scale-up rules. The industry standard for mixing scale-up takes the ratio of outer diameters cubed to convert the volumetric flow rate from the smaller process to a flow rate appropriate for the larger machine. This procedure then resolves both operating conditions, since shear rate remains constant. The second rule studied is based on percent drag flow, or the fraction of pumping potential, for different elements along the screw configuration. The percent drag flow rule aims to bring greater focus to operating conditions when scaling up with respect to dispersive mixing. Through the use of the RSD methodology and a Design of Experiment (DOE) approach, rigorous statistical analysis was used to compare the validity of the two scale-up rules.
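
    The industry-standard rule described above scales volumetric throughput with the cube of the screw outer-diameter ratio. A minimal sketch of that stated rule follows; the machine diameters and feed rate are illustrative values, not data from the study.

```python
def scale_throughput(q_small, d_small_mm, d_large_mm):
    """Industry-standard volumetric scale-up: throughput scales with the cube
    of the outer-diameter ratio (shear rate held constant)."""
    return q_small * (d_large_mm / d_small_mm) ** 3

# Illustrative: 10 kg/h on a 27 mm lab extruder scaled to a 58 mm machine.
print(round(scale_throughput(10.0, 27.0, 58.0), 1))   # ~99.1 kg/h
```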

  9. A quality by design approach to scale-up of high-shear wet granulation process.

    PubMed

    Pandey, Preetanshu; Badawy, Sherif

    2016-01-01

    High-shear wet granulation is a complex process, which in turn makes scale-up a challenging task. Scale-up of the high-shear wet granulation process has been studied extensively in the past, with various methodologies proposed in the literature. This review article discusses existing scale-up principles and categorizes the various approaches into two main scale-up strategies - parameter-based and attribute-based. With the advent of the quality by design (QbD) principle in the drug product development process, an increased emphasis toward the latter approach may be needed to ensure product robustness. In practice, a combination of both scale-up strategies is often utilized. In a QbD paradigm, there is also a need for an increased fundamental and mechanistic understanding of the process. This can be achieved either by increased experimentation that comes at higher cost, or by using modeling techniques, which are also discussed as part of this review. PMID:26489403

  10. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability, a simplification made because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
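
    The record does not detail the probabilistic method itself, so the sketch below uses plain Monte Carlo sampling as a hedged stand-in for the general idea (propagating input uncertainties into a distribution of power capability); it is not the fast probabilistic technique or the SPACE model referenced above, and the toy power model, distributions, and numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Assumed input uncertainties (illustrative only, not ISS values):
flux = rng.normal(1361.0, 15.0, n)       # solar flux, W/m^2
eff = rng.normal(0.14, 0.005, n)         # effective array efficiency
deg = rng.uniform(0.00, 0.10, n)         # cumulative array degradation
area = 375.0                             # illustrative array area, m^2

# Toy power-capability model: uncertain inputs -> distribution of outputs (kW)
power_kw = area * eff * flux * (1.0 - deg) / 1000.0

print(f"mean {power_kw.mean():.1f} kW, "
      f"5th-95th percentile {np.percentile(power_kw, 5):.1f}-"
      f"{np.percentile(power_kw, 95):.1f} kW")
```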

  11. Scaling up breastfeeding programmes in a complex adaptive world.

    PubMed

    Pérez-Escamilla, Rafael; Hall Moran, Victoria

    2016-07-01

    The 2016 Breastfeeding Lancet Series continues to provide unequivocal evidence regarding the numerous benefits that optimal breastfeeding practices offer to children and women worldwide and the major savings that improving these practices can yield as a result of their major public health benefits. Unfortunately, this knowledge remains underutilized, as there has been little progress scaling up effective breastfeeding programmes globally. Improving the uptake and scaling up of effective national breastfeeding programmes that are potent enough to improve exclusive breastfeeding duration should be a top priority for all countries. Longitudinal research grounded in complex adaptive systems thinking is needed to understand how best to empower decision makers to achieve this goal through well-validated participatory decision-making tools that help their countries assess baseline needs, including costs, as well as progress with their scaling-up efforts. Sound systems thinking frameworks and scaling-up models are now available to guide, and to study prospectively, future scaling-up efforts that can be replicated, with proper adaptations, across countries. PMID:27161881

  12. Soil Moisture Estimation under Vegetation Applying Polarimetric Decomposition Techniques

    NASA Astrophysics Data System (ADS)

    Jagdhuber, T.; Schön, H.; Hajnsek, I.; Papathanassiou, K. P.

    2009-04-01

    Polarimetric decomposition techniques and inversion algorithms are developed and applied on the OPAQUE data set acquired in spring 2007 to investigate their potential and limitations for soil moisture estimation. A three-component model-based decomposition is used together with an eigenvalue decomposition in a combined approach to invert for soil moisture over bare and vegetated soils at L-band. The applied approach indicates a feasible capability to invert soil moisture after decomposing volume and ground scattering components over agricultural land surfaces. But there are still deficiencies in modeling the volume disturbance. The results show a root mean square error below 8.5 vol% for the winter crop fields (winter wheat, winter triticale and winter barley) and below 11.5 vol% for the summer crop field (summer barley), whereas all fields have a distinct volume layer of 55-85 cm height.

  13. Estimating Population Size Using the Network Scale Up Method

    PubMed Central

    Maltiel, Rachael; Raftery, Adrian E.; McCormick, Tyler H.; Baraff, Aaron J.

    2015-01-01

    We develop methods for estimating the size of hard-to-reach populations from data collected using network-based questions on standard surveys. Such data arise by asking respondents how many people they know in a specific group (e.g. people named Michael, intravenous drug users). The Network Scale up Method (NSUM) is a tool for producing population size estimates using these indirect measures of respondents’ networks. Killworth et al. (1998a,b) proposed maximum likelihood estimators of population size for a fixed effects model in which respondents’ degrees or personal network sizes are treated as fixed. We extend this by treating personal network sizes as random effects, yielding principled statements of uncertainty. This allows us to generalize the model to account for variation in people’s propensity to know people in particular subgroups (barrier effects), such as their tendency to know people like themselves, as well as their lack of awareness of or reluctance to acknowledge their contacts’ group memberships (transmission bias). NSUM estimates also suffer from recall bias, in which respondents tend to underestimate the number of members of larger groups that they know, and conversely for smaller groups. We propose a data-driven adjustment method to deal with this. Our methods perform well in simulation studies, generating improved estimates and calibrated uncertainty intervals, as well as in back estimates of real sample data. We apply them to data from a study of HIV/AIDS prevalence in Curitiba, Brazil. Our results show that when transmission bias is present, external information about its likely extent can greatly improve the estimates. The methods are implemented in the NSUM R package. PMID:26949438
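
    The Killworth et al. fixed-effects estimator that this work extends can be written in a few lines. The sketch below shows that baseline scale-up estimate only (the random-effects, barrier, transmission, and recall adjustments developed in the paper are not reproduced); the survey numbers are illustrative placeholders.

```python
import numpy as np

def nsum_estimate(reports_hidden, degrees, population_size):
    """Basic (fixed-effects) network scale-up estimator:
    N_hidden ~= N * sum_i y_i / sum_i d_i,
    where y_i is respondent i's reported contacts in the hidden group
    and d_i is respondent i's personal network size."""
    return population_size * np.sum(reports_hidden) / np.sum(degrees)

# Illustrative numbers only (not data from the paper):
y = np.array([0, 1, 0, 2, 0, 0, 1, 0])                  # contacts in hidden group
d = np.array([250, 310, 180, 420, 290, 200, 350, 260])  # personal network sizes
print(round(nsum_estimate(y, d, population_size=10_000_000)))
```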

  14. Image reconstruction techniques applied to nuclear mass models

    NASA Astrophysics Data System (ADS)

    Morales, Irving O.; Isacker, P. Van; Velazquez, V.; Barea, J.; Mendoza-Temis, J.; Vieyra, J. C. López; Hirsch, J. G.; Frank, A.

    2010-02-01

    A new procedure is presented that combines well-known nuclear models with image reconstruction techniques. A color-coded image is built by taking the differences between measured masses and the predictions given by the different theoretical models. This image is viewed as part of a larger array in the (N,Z) plane, where unknown nuclear masses are hidden, covered by a “mask.” We apply a suitably adapted deconvolution algorithm, used in astronomical observations, to “open the window” and see the rest of the pattern. We show that it is possible to improve significantly mass predictions in regions not too far from measured nuclear masses.
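
    The abstract names only a suitably adapted astronomical deconvolution algorithm without specifying it. As a generic, hedged illustration of that family of methods, a minimal Richardson-Lucy iteration is sketched below; the paper's actual adaptation and the masking of unknown (N,Z) cells are not reproduced.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, iterations=30):
    """Generic Richardson-Lucy image-space deconvolution (astronomy-style).
    observed: blurred float image; psf: normalized point-spread function."""
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / (blurred + 1e-12)      # compare data with current blur
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```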

  15. Scaling Up Impact on Nutrition: What Will It Take?

    PubMed Central

    Gillespie, Stuart; Menon, Purnima; Kennedy, Andrew L

    2015-01-01

    Despite consensus on actions to improve nutrition globally, less is known about how to operationalize the right mix of actions—nutrition-specific and nutrition-sensitive—equitably, at scale, in different contexts. This review draws on a large scaling-up literature search and 4 case studies of large-scale nutrition programs with proven impact to synthesize critical elements for impact at scale. Nine elements emerged as central: 1) having a clear vision or goal for impact; 2) intervention characteristics; 3) an enabling organizational context for scaling up; 4) establishing drivers such as catalysts, champions, systemwide ownership, and incentives; 5) choosing contextually relevant strategies and pathways for scaling up; 6) building operational and strategic capacities; 7) ensuring adequacy, stability, and flexibility of financing; 8) ensuring adequate governance structures and systems; and 9) embedding mechanisms for monitoring, learning, and accountability. Translating current political commitment to large-scale impact on nutrition will require robust attention to these elements. PMID:26178028

  16. Scaling up impact on nutrition: what will it take?

    PubMed

    Gillespie, Stuart; Menon, Purnima; Kennedy, Andrew L

    2015-07-01

    Despite consensus on actions to improve nutrition globally, less is known about how to operationalize the right mix of actions-nutrition-specific and nutrition-sensitive-equitably, at scale, in different contexts. This review draws on a large scaling-up literature search and 4 case studies of large-scale nutrition programs with proven impact to synthesize critical elements for impact at scale. Nine elements emerged as central: 1) having a clear vision or goal for impact; 2) intervention characteristics; 3) an enabling organizational context for scaling up; 4) establishing drivers such as catalysts, champions, systemwide ownership, and incentives; 5) choosing contextually relevant strategies and pathways for scaling up; 6) building operational and strategic capacities; 7) ensuring adequacy, stability, and flexibility of financing; 8) ensuring adequate governance structures and systems; and 9) embedding mechanisms for monitoring, learning, and accountability. Translating current political commitment to large-scale impact on nutrition will require robust attention to these elements. PMID:26178028

  17. Advances and Practices of Bioprocess Scale-up.

    PubMed

    Xia, Jianye; Wang, Guan; Lin, Jihan; Wang, Yonghong; Chu, Ju; Zhuang, Yingping; Zhang, Siliang

    2016-01-01

    This chapter addresses recent progress in bioprocess engineering. In addition to an overview of the theory of multi-scale analysis for fermentation processes, examples of scale-up practice combining microbial physiological parameters with bioreactor fluid dynamics are also described. Furthermore, the methodology for process optimization and bioreactor scale-up by integrating fluid dynamics with biokinetics is highlighted. In addition to a short review of the heterogeneous environment in large-scale bioreactors and its effects, a scale-down strategy for investigating this issue is addressed. Mathematical models and simulation methodology for integrating the flow field in the reactor with the microbial kinetic response are described. Finally, a comprehensive discussion of the advantages and challenges of the model-driven scale-up method is given at the end of this chapter. PMID:25636486

  18. Brazilian meningococcal C conjugate vaccine: Scaling up studies.

    PubMed

    Bastos, Renata Chagas; de Souza, Iaralice Medeiros; da Silva, Milton Neto; Silva, Flavia de Paiva; Figueira, Elza Scott; Leal, Maria de Lurdes; Jessouroun, Ellen; da Silva, José Godinho; Medronho, Ricardo de Andrade; da Silveira, Ivna Alana Freitas Brasileiro

    2015-08-20

    Several outbreaks caused by Neisseria meningitidis group C have occurred in different regions of Brazil. A conjugate vaccine for Neisseria meningitidis was produced by chemical linkage between periodate-oxidized meningococcal C polysaccharide and hydrazide-activated monomeric tetanus toxoid via a modified reductive amination conjugation method. Vaccine safety and immunogenicity tested in Phase I and II trials showed satisfactory results. Before starting Phase III trials, vaccine production was scaled up to obtain industrial lots under Good Manufacturing Practices (GMP). Comparative analysis of data obtained from the industrial and pilot scales of the meningococcal C conjugate bulk showed similar execution times in the scaled-up production process, without significant losses or alterations in the quality attributes of the purified compounds. In conclusion, the scale-up was considered satisfactory, and production of the Brazilian meningococcal conjugate vaccine for Phase III trials is feasible. PMID:25865466

  19. Applying field mapping refractive beam shapers to improve holographic techniques

    NASA Astrophysics Data System (ADS)

    Laskin, Alexander; Williams, Gavin; McWilliam, Richard; Laskin, Vadim

    2012-03-01

    The performance of various holographic techniques can be substantially improved by homogenizing the intensity profile of the laser beam using beam shaping optics, for example achromatic field mapping refractive beam shapers such as the πShaper. The operational principle of these devices presumes transformation of the laser beam intensity from a Gaussian to a flattop profile with high flatness of the output wavefront, preservation of beam consistency, a collimated output beam of low divergence, high transmittance, extended depth of field, and negligible residual wave aberration, while the achromatic design provides the capability to work with several laser sources of different wavelengths simultaneously. Applying these beam shapers brings serious benefits to Spatial Light Modulator based techniques such as Computer Generated Holography or Dot-Matrix mastering of security holograms, since uniform illumination of an SLM simplifies the mathematical calculations and increases the predictability and reliability of the imaging results. Another example is multicolour Denisyuk holography, where the achromatic πShaper provides uniform illumination of a field at various wavelengths simultaneously. This paper will describe some design basics of the field mapping refractive beam shapers and optical layouts for applying them in holographic systems. Examples of real implementations and experimental results will be presented as well.

  20. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    NASA Astrophysics Data System (ADS)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). This approach is based on an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, as they result along a pre-defined alignment on a suitably energized sample surface. The study investigated the possibility of applying HSI techniques to the classification of different types of wheat kernels: vitreous, yellow berry and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies were acquired by a laboratory device equipped with an HSI system working in the near-infrared range (1000-1700 nm). The hypercubes were analyzed by applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only when considering the entire investigated wavelength range, but also when selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed procedures based on HSI can be utilized for quality control purposes or for the definition of innovative sorting logics for wheat.
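
    The PCA-plus-PLS-DA workflow described above can be sketched with scikit-learn, treating PLS-DA as PLS regression on one-hot class labels. The spectra below are random placeholders (90 kernels by 121 bands), not the paper's data, and the component counts are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

# X: (n_kernels, n_wavelengths) reflectance spectra; y: 0=vitreous, 1=yellow berry, 2=fusarium
rng = np.random.default_rng(1)
X = rng.random((90, 121))                    # placeholder spectra, 121 bands
y = np.repeat([0, 1, 2], 30)

scores = PCA(n_components=5).fit_transform(X)   # dimensionality reduction step
print("PCA scores shape:", scores.shape)

plsda = PLSRegression(n_components=3)
plsda.fit(X, np.eye(3)[y])                   # PLS-DA = PLS on one-hot class labels
pred = plsda.predict(X).argmax(axis=1)
print("training accuracy:", (pred == y).mean())
```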

  1. New tuberculosis technologies: challenges for retooling and scale-up.

    PubMed

    Pai, M; Palamountain, K M

    2012-10-01

    The availability of new tools does not mean that they will be adopted, used correctly, scaled up or have public health impact. Experience to date with new diagnostics suggests that many national tuberculosis programmes (NTPs) in high-burden countries are reluctant to adopt and scale up new tools, even when these are backed by evidence and global policy recommendations. We suggest that there are several common barriers to effective national adoption and scale-up of new technologies: global policy recommendations that do not provide sufficient information for scale-up, complex decision-making processes and weak political commitment at the country level, limited engagement of and support to NTP managers, high cost of tools and poor fit with user needs, unregulated markets and inadequate business models, limited capacity for laboratory strengthening and implementation research, and insufficient advocacy and donor support. Overcoming these barriers will require enhanced country-level advocacy, resources, technical assistance and political commitment. Some of the BRICS (Brazil, Russia, India, China, South Africa) countries are emerging as early adopters of policies and technologies, and are increasing their investments in TB control. They may provide the first opportunities to fully assess the public health impact of new tools. PMID:23107630

  2. Scaling up Education Reform: Addressing the Politics of Disparity

    ERIC Educational Resources Information Center

    Bishop, Russell; O'Sullivan, Dominic; Berryman, Mere

    2010-01-01

    What is school reform? What makes it sustainable? Who needs to be involved? How is scaling up achieved? This book is about the need for educational reforms that have built into them, from the outset, those elements that will see them sustained in the original sites and spread to others. Using the Te Kotahitanga Project as a model the authors…

  3. Charter Operators Spell Out Barriers to "Scaling Up"

    ERIC Educational Resources Information Center

    Zehr, Mary Ann

    2011-01-01

    The pace at which the highest-performing charter-management organizations (CMOs) are "scaling up" is being determined largely by how rapidly they can develop and hire strong leaders and acquire physical space, and by the level of support they receive for growth from city or state policies, say leaders from some charter organizations viewed by…

  4. Sustaining and Scaling up the Impact of Professional Development Programmes

    ERIC Educational Resources Information Center

    Zehetmeier, Stefan

    2015-01-01

    This paper deals with a crucial topic: which factors influence the sustainability and scale-up of a professional development programme's impact? Theoretical models and empirical findings from impact research (e.g. Zehetmeier and Krainer, "ZDM Int J Math" 43(6/7):875-887, 2011) and innovation research (e.g. Cobb and Smith,…

  5. Scaling-Up Successfully: Pathways to Replication for Educational NGOs

    ERIC Educational Resources Information Center

    Jowett, Alice; Dyer, Caroline

    2012-01-01

    Non-government organisations (NGOs) are big players in international development, critical to the achievement of the Millennium Development Goals (MDGs) and constantly under pressure to "achieve more". Scaling-up their initiatives successfully and sustainably can be an efficient and cost effective way for NGOs to increase their impact across a…

  6. Three Collaborative Models for Scaling Up Evidence-Based Practices

    PubMed Central

    Roberts, Rosemarie; Jones, Helen; Marsenich, Lynne; Sosna, Todd; Price, Joseph M.

    2015-01-01

    The current paper describes three models of research-practice collaboration to scale up evidence-based practices (EBP): (1) the Rolling Cohort model in England, (2) the Cascading Dissemination model in San Diego County, and (3) the Community Development Team model in 53 California and Ohio counties. Multidimensional Treatment Foster Care (MTFC) and KEEP are the focal evidence-based practices that are designed to improve outcomes for children and families in the child welfare, juvenile justice, and mental health systems. The three scale-up models each originated from collaboration between community partners and researchers with the shared goal of widespread implementation and sustainability of MTFC/KEEP. The three models were implemented in a variety of contexts; Rolling Cohort was implemented nationally, Cascading Dissemination was implemented within one county, and Community Development Team was targeted at the state level. The current paper presents an overview of the development of each model, the policy frameworks in which they are embedded, system challenges encountered during scale-up, and lessons learned. Common elements of successful scale-up efforts, barriers to success, factors relating to enduring practice relationships, and future research directions are discussed. PMID:21484449

  7. Applying machine learning classification techniques to automate sky object cataloguing

    NASA Astrophysics Data System (ADS)

    Fayyad, Usama M.; Doyle, Richard J.; Weir, W. Nick; Djorgovski, Stanislav

    1993-08-01

    We describe the application of Artificial Intelligence machine learning techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Mt. Palomar Northern Sky Survey is nearly completed. This survey provides comprehensive coverage of the northern celestial hemisphere in the form of photographic plates. The plates are being transformed into digitized images whose quality will probably not be surpassed in the next ten to twenty years. The images are expected to contain on the order of 10^7 galaxies and 10^8 stars. Astronomers wish to determine which of these sky objects belong to various classes of galaxies and stars. Unfortunately, the size of this data set precludes analysis in an exclusively manual fashion. Our approach is to develop a software system which integrates the functions of independently developed techniques for image processing and data classification. Digitized sky images are passed through image processing routines to identify sky objects and to extract a set of features for each object. These routines are used to help select a useful set of attributes for classifying sky objects. Then GID3 (Generalized ID3) and O-B Tree, two inductive learning techniques, learn classification decision trees from examples. These classifiers are then applied to new data. The development process is highly interactive, with astronomer input playing a vital role. Astronomers refine the feature set used to construct sky object descriptions, and evaluate the performance of the automated classification technique on new data. This paper gives an overview of the machine learning techniques with an emphasis on their general applicability, describes the details of our specific application, and reports the initial encouraging results. The results indicate that our machine learning approach is well-suited to the problem. The primary benefit of the approach is increased data reduction throughput. Another benefit is
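
    GID3 and O-B Tree are not available in standard libraries, so the sketch below uses a scikit-learn CART decision tree as a stand-in to illustrate the same workflow: learn a classifier from per-object features, then apply it to newly digitized objects. The feature matrix, labels, and tree depth are placeholders.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Placeholder features per detected sky object: area, ellipticity, peak brightness, ...
rng = np.random.default_rng(2)
X_train = rng.random((500, 8))
y_train = rng.integers(0, 2, 500)            # 0 = star, 1 = galaxy (toy labels)

tree = DecisionTreeClassifier(max_depth=5).fit(X_train, y_train)

X_new = rng.random((10, 8))                  # features of newly digitized objects
print(tree.predict(X_new))                   # predicted star/galaxy classes
```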

  8. Collaborative Group Learning using the SCALE-UP Pedagogy

    NASA Astrophysics Data System (ADS)

    Feldman, Gerald

    2011-10-01

    The time-honored conventional lecture ("teaching by telling") has been shown to be an ineffective mode of instruction for science classes. In these cases, where the enhancement of critical thinking skills and the development of problem-solving abilities are emphasized, collaborative group learning environments have proven to be far more effective. In addition, students naturally improve their teamwork skills through the close interaction they have with their group members. Early work on the Studio Physics model at Rensselaer Polytechnic Institute in the mid-1990's was extended to large classes via the SCALE-UP model pioneered at North Carolina State University a few years later. In SCALE-UP, students sit at large round tables in three groups of three; in this configuration, they carry out a variety of pencil/paper exercises (ponderables) using small whiteboards and perform hands-on activities like demos and labs (tangibles) throughout the class period. They also work on computer simulations using a shared laptop for each group of three. Formal lecture is reduced to a minimal level and the instructor serves more as a "coach" to facilitate the academic "drills" that the students are working on. Since its inception in 1997, the SCALE-UP pedagogical approach has been adopted by over 100 institutions across the country and about 20 more around the world. In this talk, I will present an overview of the SCALE-UP concept and I will outline the details of its deployment at George Washington University over the past 4 years. I will also discuss empirical data from assessments given to the SCALE-UP collaborative classes and the regular lecture classes at GWU in order to make a comparative study of the effectiveness of the two methodologies.

  9. Vibration Monitoring Techniques Applied to Detect Damage in Rotating Disks

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, Andrew L.; Sawicki, Jerzy T.

    2002-01-01

    Rotor health monitoring and online damage detection are increasingly gaining the interest of the manufacturers of aircraft engines. This is primarily due to the need for improved safety during operation as well as the need for lower maintenance costs. Applied techniques for detecting damage in and monitoring the health of rotors are essential for engine safety, reliability, and life prediction. The goals of engine safety are addressed within the NASA-sponsored Aviation Safety Program (AvSP). AvSP provides research and technology products needed to help the Federal Aviation Administration and the aerospace industry improve aviation safety. The Nondestructive Evaluation Group at the NASA Glenn Research Center is addressing propulsion health management and the development of propulsion-system-specific technologies intended to detect potential failures prior to catastrophe.

  10. Image analysis technique applied to lock-exchange gravity currents

    NASA Astrophysics Data System (ADS)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image relating the amount of dye uniformly distributed in the tank and the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
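
    A minimal sketch of the per-pixel calibration idea described above (build a greyscale-to-concentration curve for each pixel from frames at known, uniform dye concentrations) is given below; the mass-conservation correction and camera specifics are omitted, and the function and array names are assumptions.

```python
import numpy as np

def calibrate_pixelwise(calib_grey, calib_conc):
    """Build a per-pixel greyscale -> dye concentration map.

    calib_grey: (n_levels, H, W) greyscale images at known, uniform concentrations
    calib_conc: (n_levels,) dye concentrations corresponding to those frames
    Returns a function that converts a new greyscale frame to a concentration field.
    """
    n_levels, H, W = calib_grey.shape

    def to_concentration(frame):
        out = np.empty((H, W))
        for i in range(H):
            for j in range(W):
                grey = calib_grey[:, i, j]
                order = np.argsort(grey)          # np.interp needs ascending x
                out[i, j] = np.interp(frame[i, j], grey[order], calib_conc[order])
        return out

    return to_concentration
```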

  11. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W. . Dept. of Computer Sciences); Noordewier, M.O. . Dept. of Computer Science)

    1992-01-01

    We are primarily developing a machine learning (ML) system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information, our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, our KBANN algorithm maps inference rules about a given recognition task into a neural network. Neural network training techniques then use the training examples to refine these inference rules. We call these rules a domain theory, following the convention in the machine learning community. We have been applying this approach to several problems in DNA sequence analysis. In addition, we have been extending the capabilities of our learning system along several dimensions. We have also been investigating parallel algorithms that perform sequence alignments in the presence of frameshift errors.

  12. Extrapolation techniques applied to matrix methods in neutron diffusion problems

    NASA Technical Reports Server (NTRS)

    Mccready, Robert R

    1956-01-01

    A general matrix method is developed for the solution of characteristic-value problems of the type arising in many physical applications. The scheme employed is essentially that of Gauss and Seidel with appropriate modifications needed to make it applicable to characteristic-value problems. An iterative procedure produces a sequence of estimates to the answer; and extrapolation techniques, based upon previous behavior of iterants, are utilized in speeding convergence. Theoretically sound limits are placed on the magnitude of the extrapolation that may be tolerated. This matrix method is applied to the problem of finding criticality and neutron fluxes in a nuclear reactor with control rods. The two-dimensional finite-difference approximation to the two-group neutron-diffusion equations is treated. Results for this example are indicated.
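
    The report's exact Gauss-Seidel modification and extrapolation limits are not reproduced here. As a hedged, generic illustration of an iterative characteristic-value solver accelerated by extrapolation on the iterants, the sketch below applies Aitken delta-squared acceleration to a power-type iteration; the matrix is a toy example, not a reactor model.

```python
import numpy as np

def dominant_eigenvalue(A, tol=1e-10, max_iter=500):
    """Power-type iteration with Aitken delta-squared extrapolation on the
    eigenvalue iterants (a generic stand-in, not the report's exact scheme)."""
    x = np.ones(A.shape[0])
    lams = []
    for _ in range(max_iter):
        y = A @ x
        lam = (x @ y) / (x @ x)                  # Rayleigh-quotient estimate
        x = y / np.linalg.norm(y)
        lams.append(lam)
        if len(lams) >= 3:
            l0, l1, l2 = lams[-3:]
            denom = l2 - 2 * l1 + l0
            if abs(denom) > 1e-14:
                accel = l2 - (l2 - l1) ** 2 / denom   # Aitken extrapolation
                if abs(accel - lam) < tol:
                    return accel
    return lams[-1]

A = np.array([[4.0, 1.0], [2.0, 3.0]])           # toy matrix, eigenvalues 5 and 2
print(dominant_eigenvalue(A))                    # ~5.0
```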

  13. Applying machine learning techniques to DNA sequence analysis

    SciTech Connect

    Shavlik, J.W.

    1992-01-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.

  14. Airflow measurement techniques applied to radon mitigation problems

    SciTech Connect

    Harrje, D.T.; Gadsby, K.J.

    1989-01-01

    During the past decade a multitude of diagnostic procedures associated with the evaluation of air infiltration and air leakage sites have been developed. The spirit of international cooperation and exchange of ideas within the AIC-AIVC conferences has greatly facilitated the adoption and use of these measurement techniques in the countries participating in Annex V. But wide application of such diagnostic methods is not limited to air infiltration alone. The subject of this paper concerns ways to evaluate and improve radon reduction in buildings using diagnostic methods directly related to developments familiar to the AIVC. Radon problems are certainly not unique to the United States, and the methods described here have to a degree been applied by researchers of other countries faced with similar problems. The radon problem involves more than a harmful pollutant of the living spaces of our buildings -- it also involves energy to operate radon removal equipment and the loss of interior conditioned air as a direct result. The techniques used for air infiltration evaluation will be shown to be very useful in dealing with the radon mitigation challenge. 10 refs., 7 figs., 1 tab.

  15. Scaling up adsorption media reactors for copper removal with the aid of dimensionless numbers.

    PubMed

    Chang, Ni-Bin; Houmann, Cameron; Wanielista, Martin

    2016-02-01

    Adsorption media may be used to sorb copper in an aquatic environment for pollution control. Effective design of adsorption media reactors is highly dependent on selection of the hydraulic residence time when scaling up a pilot-scale reactor to a field-scale reactor. This paper seeks to improve the scale-up step of the reactor design process through the use of the Damköhler and Péclet numbers via dimensional analysis. A new scale-up theory is developed in this study through joint consideration of the Damköhler and Péclet numbers for a constant media particle size, such that a balance between transport control and reaction control can be harmonized. A series of column breakthrough tests at varying hydraulic residence times revealed a clear peak adsorption capacity at a Damköhler number of 2.74. The Péclet numbers for the column breakthrough tests indicated that mechanical dispersion is an important effect that requires further consideration in the scaling-up process. However, perfect similitude of the Damköhler number cannot be maintained for a constant media particle size, and relaxation of hydrodynamic similitude through variation of the Péclet number must occur. PMID:26454119
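
    The paper keys the scale-up to the Damköhler number (peak adsorption capacity reported near Da = 2.74) while tracking the Péclet number. The sketch below uses common textbook forms (Da as a first-order rate constant times hydraulic residence time, Pe = vL/D), which may differ from the authors' exact formulation; all numerical values are illustrative assumptions.

```python
def damkohler(k_per_min, hrt_min):
    """Da = reaction rate / transport rate; here a first-order rate constant
    times the hydraulic residence time (assumed form)."""
    return k_per_min * hrt_min

def peclet(velocity_m_per_s, length_m, dispersion_m2_per_s):
    """Pe = advective transport / dispersive transport."""
    return velocity_m_per_s * length_m / dispersion_m2_per_s

# Illustrative field-scale check against the reported optimum Da ~ 2.74:
k = 0.055                                    # 1/min, assumed sorption rate constant
target_da = 2.74
print("required HRT:", round(target_da / k, 1), "min")   # ~49.8 min
print("Pe:", peclet(0.002, 1.2, 1.5e-4))                 # ~16, illustrative
```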

  16. Scaling up high-impact interventions: how is it done?

    PubMed

    Smith, Jeffrey Michael; de Graft-Johnson, Joseph; Zyaee, Pashtoon; Ricca, Jim; Fullerton, Judith

    2015-06-01

    Building upon the World Health Organization's ExpandNet framework, 12 key principles of scale-up have emerged from the implementation of maternal and newborn health interventions. These principles are illustrated by three case studies of scale up of high-impact interventions: the Helping Babies Breathe initiative; pre-service midwifery education in Afghanistan; and advanced distribution of misoprostol for self-administration at home births to prevent postpartum hemorrhage. Program planners who seek to scale a maternal and/or newborn health intervention must ensure that: the necessary evidence and mechanisms for local ownership for the intervention are well-established; the intervention is as simple and cost-effective as possible; and the implementers and beneficiaries of the intervention are working in tandem to build institutional capacity at all levels and in consideration of all perspectives. PMID:26115856

  17. Scale-Up of Advanced Hot-Gas Desulfurization Sorbents

    SciTech Connect

    Jothimurugesan, K.; Gangwal, Santosh K.

    1996-10-14

    The overall objective of this project is to develop regenerable sorbents for hot gas desulfurization in IGCC systems. The specific objective of the project is to develop durable advanced sorbents that demonstrate a strong resistance to attrition and chemical deactivation, and high activity at temperatures as low as 343°C (650°F). A number of formulations will be prepared and screened in a 1/2-inch fixed bed reactor at high pressure (1 to 20 atm) and high temperatures using simulated coal-derived fuel-gases. Screening criteria will include chemical reactivity, stability, and regenerability over the temperature range of 343°C to 650°C. After initial screening, at least 3 promising formulations will be tested for 25-30 cycles of absorption and regeneration. One of the superior formulations with the best cyclic performance will be selected for investigating scale up parameters. The scaled-up formulation will be tested for long term durability and chemical reactivity.

  18. Scaling up of manufacturing processes of recycled carpet based composites

    NASA Astrophysics Data System (ADS)

    Lakshminarayanan, Krishnan

    2011-12-01

    In this work, the feasibility of recycling post-consumer carpets into large-scale components using a modified vacuum-assisted resin molding process was successfully demonstrated. The scale-up also included the incorporation of nano-clay films in the carpet composites. It is expected that the films will enhance the ability of the composite to withstand environmental degradation and also serve as a fire retardant. Low-cost resins were used to fabricate the recycled carpet-based composites. The scale-up in terms of process was achieved by manufacturing composites without a hot press, thereby saving additional equipment cost. Mechanical and physical properties were evaluated. Large-scale samples demonstrated mechanical properties that differed from the results for small samples. Acoustic tests indicate good sound absorption of the carpet composite. A cost analysis of the composite material based on the cost of the raw materials and the manufacturing process is presented.

  19. Drug nanocrystals: A way toward scale-up.

    PubMed

    Raghava Srivalli, Kale Mohana; Mishra, Brahmeshwar

    2016-07-01

    Drug nanocrystals are unique drug delivery platforms that play a significantly important and distinctive role in drug delivery, and as such, industry and academia are spending a lot of time and money developing nanocrystal products. Current research in this field depicts a clear shift from lab-scale optimization studies to scale-up-focused studies. In this emerging scenario of nanocrystal technology, a review of some exemplary, ongoing research studies with either scalability as their objective or upscaling as their future scope may smooth future upscaling attempts in this field. Hence, this paper reviews such research efforts as case studies, since an analysis of these studies may provide beneficial knowledge for carrying out more scale-up-oriented research on nanocrystals. PMID:27330370

  20. Routes to scaling up the fruits of biotechnology

    SciTech Connect

    Spalding, B.J.

    1985-07-17

    A fed-batch process without recycling, a fed-batch process with recycling, a straight batch process, a continuous growth process, and a multichamber process are some of the routes used for scaling up the fruits of biotechnology. Although computers can help in optimizing the process, the best bet is still the intelligent guess. Some important considerations in scaling up are as follows: the imprecision of current chemical tests to measure the amount of sugar in a fermentation broth, temperature control, and separation procedures. The type of medium to be used must also be considered. For example, as the cost of corn accounts for half the production cost in the production of ethanol from corn, it is imperative that only small amounts of carbon end up in non-product form.

  1. Scale-up in Poroelastic System and Applications to Reservoirs

    SciTech Connect

    Berryman, J G

    2003-07-01

    A fundamental problem of heterogeneous systems is that the macroscale behavior is not necessarily well-described by equations familiar to us at the meso- or microscale. In relatively simple cases like electrical conduction and elasticity, it is true that the equations describing macroscale behavior take the same form as those at the microscale. But in more complex systems, these simple results do not hold. Consider fluid flow in porous media, where the microscale behavior is well-described by the Navier-Stokes equations for liquid in the pores while the macroscale behavior instead obeys Darcy's equation. Rigorous methods for establishing the form of such equations for macroscale behavior include multiscale homogenization methods and also the volume averaging method. In addition, it has been shown that Biot's equations of poroelasticity follow from a scale-up of the microscale equations of elasticity coupled to Navier-Stokes. Laboratory measurements have shown that Biot's equations indeed hold for simple systems, but heterogeneous systems can have quite different behavior. So the question arises whether there is yet another level of scale-up needed to arrive at equations valid for the reservoir scale. And if so, do these equations take the form of Biot's equations or some other form? We will discuss these issues and show that the double-porosity equations play a special role in the scale-up to equations describing reservoir behavior, for fluid pumping, geomechanics, as well as seismic wave propagation.
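
    For reference, the macroscale flow law contrasted here with the pore-scale Navier-Stokes description is Darcy's equation, which in a standard form (notation assumed, since the abstract gives no equations) reads:

```latex
% Macroscale (Darcy) description of single-phase flow in a porous medium:
\[
  \mathbf{q} \;=\; -\,\frac{k}{\mu}\,\bigl(\nabla p - \rho\,\mathbf{g}\bigr)
\]
% q: volumetric flux (Darcy velocity), k: permeability, \mu: fluid viscosity,
% p: pore pressure, \rho: fluid density, g: gravitational acceleration.
% At the pore scale the same fluid instead obeys the Navier-Stokes equations.
```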

  2. Thermographic techniques applied to solar collector systems analysis

    SciTech Connect

    Eden, A.

    1980-02-01

    The use of thermography to analyze large solar collector array systems under dynamic operating conditions is discussed. The research at the Solar Energy Research Institute (SERI) in this area has focused on thermographic techniques and equipment to determine temperature distributions, flow patterns, and air blockages in solar collectors. The results of this extensive study, covering many sites and types of collectors, illustrate the capabilities of infrared (IR) analysis as a qualitative analysis tool and operation and maintenance procedure when applied to large arrays. Thermographic analysis of most collector systems qualitatively showed relative temperature distributions that indicated balanced flow patterns. In three significant cases, blocked or broken collector arrays, which previously had gone undetected, were discovered. Using this analysis, validation studies of large computer codes could examine collector arrays for flow patterns or blockages that could cause disagreement between actual and predicted performance. Initial operation and balancing of large systems could be accomplished without complicated sensor systems not needed for normal operations. Maintenance personnel could quickly check their systems without climbing onto the roof and without complicated sensor systems.

  3. Scaling-up voluntary medical male circumcision – what have we learned?

    PubMed Central

    Ledikwe, Jenny H; Nyanga, Robert O; Hagon, Jaclyn; Grignon, Jessica S; Mpofu, Mulamuli; Semo, Bazghina-werq

    2014-01-01

    In 2007, the World Health Organization (WHO) and the joint United Nations agency program on HIV/AIDS (UNAIDS) recommended voluntary medical male circumcision (VMMC) as an add-on strategy for HIV prevention. Fourteen priority countries were tasked with scaling-up VMMC services to 80% of HIV-negative men aged 15–49 years by 2016, representing a combined target of 20 million circumcisions. By December 2012, approximately 3 million procedures had been conducted. Within the following year, there was marked improvement in the pace of the scale-up. During 2013, the total number of circumcisions performed nearly doubled, with approximately 6 million total circumcisions conducted by the end of the year, reaching 30% of the initial target. The purpose of this review article was to apply a systems thinking approach, using the WHO health systems building blocks as a framework to examine the factors influencing the scale-up of the VMMC programs from 2008–2013. Facilitators that accelerated the VMMC program scale-up included: country ownership; sustained political will; service delivery efficiencies, such as task shifting and task sharing; use of outreach and mobile services; disposable, prepackaged VMMC kits; external funding; and a standardized set of indicators for VMMC. A low demand for the procedure has been a major barrier to achieving circumcision targets, while weak supply chain management systems and the lack of adequate financial resources with a heavy reliance on donor support have also adversely affected scale-up. Health systems strengthening initiatives and innovations have progressively improved VMMC service delivery, but an understanding of the contextual barriers and the facilitators of demand for the procedure is critical in reaching targets. There is a need for countries implementing VMMC programs to share their experiences more frequently to identify and to enhance best practices by other programs. PMID:25336991

  4. Scaling-up voluntary medical male circumcision - what have we learned?

    PubMed

    Ledikwe, Jenny H; Nyanga, Robert O; Hagon, Jaclyn; Grignon, Jessica S; Mpofu, Mulamuli; Semo, Bazghina-Werq

    2014-01-01

    In 2007, the World Health Organization (WHO) and the joint United Nations agency program on HIV/AIDS (UNAIDS) recommended voluntary medical male circumcision (VMMC) as an add-on strategy for HIV prevention. Fourteen priority countries were tasked with scaling-up VMMC services to 80% of HIV-negative men aged 15-49 years by 2016, representing a combined target of 20 million circumcisions. By December 2012, approximately 3 million procedures had been conducted. Within the following year, there was marked improvement in the pace of the scale-up. During 2013, the total number of circumcisions performed nearly doubled, with approximately 6 million total circumcisions conducted by the end of the year, reaching 30% of the initial target. The purpose of this review article was to apply a systems thinking approach, using the WHO health systems building blocks as a framework to examine the factors influencing the scale-up of the VMMC programs from 2008-2013. Facilitators that accelerated the VMMC program scale-up included: country ownership; sustained political will; service delivery efficiencies, such as task shifting and task sharing; use of outreach and mobile services; disposable, prepackaged VMMC kits; external funding; and a standardized set of indicators for VMMC. A low demand for the procedure has been a major barrier to achieving circumcision targets, while weak supply chain management systems and the lack of adequate financial resources with a heavy reliance on donor support have also adversely affected scale-up. Health systems strengthening initiatives and innovations have progressively improved VMMC service delivery, but an understanding of the contextual barriers and the facilitators of demand for the procedure is critical in reaching targets. There is a need for countries implementing VMMC programs to share their experiences more frequently to identify and to enhance best practices by other programs. PMID:25336991

  5. Volcanic Monitoring Techniques Applied to Controlled Fragmentation Experiments

    NASA Astrophysics Data System (ADS)

    Kueppers, U.; Alatorre-Ibarguengoitia, M. A.; Hort, M. K.; Kremers, S.; Meier, K.; Scharff, L.; Scheu, B.; Taddeucci, J.; Dingwell, D. B.

    2010-12-01

    Volcanic eruptions are an inevitable natural threat. The range of eruptive styles is large, and short-term fluctuations of explosivity or vent position pose a large risk that is not necessarily confined to the immediate vicinity of a volcano. Rather, explosive eruptions may also affect aviation, infrastructure and climate, regionally as well as globally. Multiparameter monitoring networks are deployed on many active volcanoes to record signs of magmatic processes and help elucidate the secrets of volcanic phenomena. However, our mechanistic understanding of many processes hiding in the recorded signals is still poor. As a direct consequence, a solid interpretation of the state of a volcano is still a challenge. In an attempt to bridge this gap, we combined volcanic monitoring and experimental volcanology. We performed 15 well-monitored, field-based experiments and fragmented natural rock samples from Colima volcano (Mexico) by rapid decompression. We used cylindrical samples of 60 mm height and 25 mm and 60 mm diameter, respectively, and 25 and 35 vol.% open porosity. The applied pressure range was from 4 to 18 MPa. Using different experimental set-ups, the pressurised volume above the samples ranged from 60 - 170 cm3. The experiments were performed at ambient conditions and at controlled sample porosity and size, confinement geometry, and applied pressure. The experiments have been thoroughly monitored with 1) Doppler Radar (DR), 2) high-speed and high-definition cameras, 3) acoustic and infrasound sensors, 4) pressure transducers, and 5) electrically conducting wires. Our aim was to check for common results achieved by the different approaches and, if so, to calibrate state-of-the-art monitoring tools. We present how the velocity of the ejected pyroclasts was measured by and evaluated for the different approaches, and how it was affected by the experimental conditions and sample characteristics. We show that all deployed instruments successfully measured the pyroclast

  6. Droplet size measurements for spray dryer scale-up.

    PubMed

    Thybo, Pia; Hovgaard, Lars; Andersen, Sune Klint; Lindeløv, Jesper Saederup

    2008-01-01

    This study was dedicated to facilitating scale-up in spray drying from an atomization standpoint. The purpose was to investigate differences in operating conditions between a pilot and a production scale nozzle. The intention was to identify the operating ranges in which the two nozzles produced similar droplet size distributions. Furthermore, method optimization and validation were also covered. Externally mixing two-fluid nozzles of similar designs were used in this study. Both nozzles are typically used in commercially available spray dryers, and they have been characterized with respect to droplet size distributions as a function of liquid type, liquid flow rate, atomization gas flow rate, liquid orifice diameter, and atomization gas orifice diameter. All droplet size measurements were carried out using the Malvern Spraytec with nozzle operating conditions corresponding to typical settings for spray drying. This gave droplets with Sauter Mean Diameters less than 40 µm and typically 5-20 µm. A model previously proposed by Mansour and Chigier was used to correlate the droplet size to the operating parameters. It was possible to make a correlation for water incorporating the droplet sizes for both the pilot scale and the production scale nozzle. However, a single correlation was not able to account properly for the physical properties of the liquid to be atomized. Therefore, the droplet size distributions of ethanol could not be adequately predicted on the basis of the water data. This study has shown that it was possible to scale up from a pilot to a production scale nozzle in a systematic fashion. However, a prerequisite was that the nozzles were geometrically similar. When externally mixing two-fluid nozzles are used as atomizers, the results obtained from this study could be a useful guideline for selecting appropriate operating conditions when scaling up the spray-drying process. PMID:18379901

  7. Pretreatment optimization of Sorghum pioneer biomass for bioethanol production and its scale-up.

    PubMed

    Koradiya, Manoj; Duggirala, Srinivas; Tipre, Devayani; Dave, Shailesh

    2016-01-01

    Based on one-parameter-at-a-time experiments, saccharification of delignified sorghum biomass by 4% and 70% v/v sulfuric acid resulted in maximum sugar production of 30.8 and 33.8 g% from biomass, respectively. The Box-Behnken Design was applied for further optimization of the acid hydrolysis. As a result of the designed experiment, 36.3 g% sugar production was achieved when a 3% v/v H2SO4 treatment was given for 60 min at 180°C. The process was scaled up to treat 2 kg of biomass. During the screening of yeast cultures, isolates C, MK-I and N were found to be potent ethanol producers from sorghum hydrolyzate. Culture MK-I was the best and was therefore used for scale-up of ethanol production to 25 L capacity, which gave a yield of 0.49 g ethanol/g sugar from the hydrolyzate obtained from 2 kg of sorghum biomass. PMID:26384087

  8. The First Scale-Up Production of Theranostic Nanoemulsions

    PubMed Central

    Liu, Lu; Bagia, Christina; Janjic, Jelena M.

    2015-01-01

    Theranostic nanomedicines are a promising new technological advancement toward personalized medicine. Although much progress has been made in pre-clinical studies, their clinical utilization is still under development. A key ingredient for successful theranostic clinical translation is pharmaceutical process design for production on a sufficient scale for clinical testing. In this study, we report, for the first time, a successful scale-up of a model theranostic nanoemulsion. Celecoxib-loaded near-infrared-labeled perfluorocarbon nanoemulsion was produced on three levels of scale (small at 54 mL, medium at 270 mL, and large at 1,000 mL) using microfluidization. The average size and polydispersity were not affected by the equipment used or production scale. The overall nanoemulsion stability was maintained for 90 days upon storage and was not impacted by nanoemulsion production scale or composition. Cell-based evaluations show comparable results for all nanoemulsions with no significant impact of nanoemulsion scale on cell toxicity and their pharmacological effects. This report serves as the first example of a successful scale-up of a theranostic nanoemulsion and a model for future studies on theranostic nanomedicine production and development. PMID:26309798

  9. Scale-Up of Advanced Hot-Gas desulfurization Sorbents.

    SciTech Connect

    Jothimurugesan, K.; Gangwal, S.K.

    1997-10-02

    The overall objective of this project is to develop regenerable sorbents for hot gas desulfurization in IGCC systems. The specific objective of the project is to develop durable advanced sorbents that demonstrate a strong resistance to attrition and chemical deactivation, and high activity at temperatures as low as 343°C (650°F). A number of formulations will be prepared and screened in a one-half inch fixed bed reactor at high pressure (1 to 20 atm) and high temperatures using simulated coal-derived fuel-gases. Screening criteria will include chemical reactivity, stability, and regenerability over the temperature range of 343°C to 650°C. After initial screening, at least 3 promising formulations will be tested for 25-30 cycles of absorption and regeneration. One of the superior formulations with the best cyclic performance will be selected for investigating scale up parameters. The scaled-up formulation will be tested for long term durability and chemical reactivity.

  10. Scale-Up of Advanced Hot-Gas Desulfurization Sorbents

    SciTech Connect

    Jothimurugesan, K.; Gangwal, S.K.

    1997-04-21

    The overall objective of this project is to develop regenerable sorbents for hot gas desulfurization in IGCC systems. The specific objective of the project is to develop durable advanced sorbents that demonstrate a strong resistance to attrition and chemical deactivation, and high activity at temperatures as low as 343°C (650°F). A number of formulations will be prepared and screened in a 1/2-inch fixed bed reactor at high pressure (1 to 20 atm) and high temperatures using simulated coal-derived fuel-gases. Screening criteria will include chemical reactivity, stability, and regenerability over the temperature range of 343°C to 650°C. After initial screening, at least 3 promising formulations will be tested for 25-30 cycles of absorption and regeneration. One of the superior formulations with the best cyclic performance will be selected for investigating scale up parameters. The scaled-up formulation will be tested for long term durability and chemical reactivity.

  11. Scale-up of commercial PCFB boiler plant technology

    SciTech Connect

    Lamar, T.W.

    1993-10-01

    The DMEC-1 Demonstration Project will provide an 80 MWe commercial-scale demonstration of the Pressurized Circulating Fluidized Bed (PCFB) technology. Following confirmation of the PCFB design at the 80 MWe scale, the technology will be scaled to even larger commercial units. It is anticipated that the market for commercial-scale PCFB plants will exist predominantly in the utility and independent power producer (IPP) sectors. These customers will require the best possible plant efficiency and the lowest achievable emissions at competitive cost. This paper will describe the PCFB technology and the expected performance of a nominal 400 MWe PCFB power plant; Illinois No. 6 coal was used as a representative fuel for the analysis. The description of the plant performance will be followed by a discussion of the scale-up of the major PCFB components, such as the PCFB boiler, the pressure vessel, the ceramic filter, the coal/sorbent handling system, the gas turbine, the heat recovery unit and the steam turbine, demonstrating the reasonableness of scale-up from demonstration plant to a nominal 400 MWe unit.

  12. Photoacoustic technique applied to the study of skin and leather

    NASA Astrophysics Data System (ADS)

    Vargas, M.; Varela, J.; Hernández, L.; González, A.

    1998-08-01

    In this paper the photoacoustic technique is applied to bull skin to determine its thermal and optical properties as a function of the tanning process steps. Our results show that the photoacoustic technique is sensitive to the physical changes produced in this kind of material by the tanning process.

  13. Photoacoustic technique applied to the study of skin and leather

    SciTech Connect

    Vargas, M.; Varela, J.; Hernandez, L.; Gonzalez, A.

    1998-08-28

    In this paper the photoacoustic technique is applied to bull skin to determine its thermal and optical properties as a function of the tanning process steps. Our results show that the photoacoustic technique is sensitive to the physical changes produced in this kind of material by the tanning process.
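
    The record above does not reproduce the underlying relations, but a quantity commonly used when planning photoacoustic measurements of layered samples such as skin is the thermal diffusion length, mu = sqrt(alpha / (pi * f)), which sets how deep the periodic heating probes at a given modulation frequency. The sketch below evaluates it for an assumed, illustrative thermal diffusivity; the value is not taken from the paper.

      import math

      def thermal_diffusion_length(alpha_m2_s, freq_hz):
          """Thermal diffusion length mu = sqrt(alpha / (pi * f)) for modulated heating."""
          return math.sqrt(alpha_m2_s / (math.pi * freq_hz))

      # Illustrative diffusivity only (roughly tissue-like), not a value from the study.
      alpha = 1.0e-7  # m^2/s
      for f in (10, 100, 1000):  # modulation frequencies in Hz
          mu = thermal_diffusion_length(alpha, f)
          print(f"f = {f:5d} Hz  ->  thermal diffusion length = {mu * 1e6:6.1f} um")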

  14. Signal detection techniques applied to the Chandler wobble

    NASA Technical Reports Server (NTRS)

    Gross, R. S.

    1985-01-01

    A sudden excitation event of the Chandler wobble should induce the earth's rotation pole to undergo damped harmonic motion. This type of motion has been searched for in the observations of the Chandler wobble using techniques based upon the concept of a matched filter. Although the signal detection techniques used here were not sensitive enough to detect any such isolated sudden excitation events, the result that was obtained is consistent with a randomly excited model of the Chandler wobble.
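
    As a rough illustration of the matched-filter idea described above, the sketch below cross-correlates a damped harmonic template with a noisy series in which one excitation event is buried. The period, damping, event amplitude, and noise level are hypothetical placeholders, not the values used in the study.

      import numpy as np

      # Matched-filter sketch: correlate a damped harmonic template with a noisy series.
      dt = 1.0                                  # sampling interval (arbitrary units)
      t = np.arange(0, 400, dt)
      f0, q = 1.0 / 14.0, 60.0                  # oscillation frequency and quality factor (hypothetical)
      template = np.exp(-np.pi * f0 * t / q) * np.cos(2 * np.pi * f0 * t)
      template /= np.linalg.norm(template)      # unit-energy template

      rng = np.random.default_rng(0)
      series = rng.normal(0.0, 1.0, 1200)
      series[500:500 + t.size] += 8.0 * template    # add one weak damped-oscillation event to the noise

      detection = np.correlate(series, template, mode="valid")   # matched-filter output
      print("peak filter response at sample", int(np.argmax(np.abs(detection))))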

  15. Quantity Versus Quality: A Survey Experiment to Improve the Network Scale-up Method

    PubMed Central

    Feehan, Dennis M.; Umubyeyi, Aline; Mahy, Mary; Hladik, Wolfgang; Salganik, Matthew J.

    2016-01-01

    The network scale-up method is a promising technique that uses sampled social network data to estimate the sizes of epidemiologically important hidden populations, such as sex workers and people who inject illicit drugs. Although previous scale-up research has focused exclusively on networks of acquaintances, we show that the type of personal network about which survey respondents are asked to report is a potentially crucial parameter that researchers are free to vary. This generalization leads to a method that is more flexible and potentially more accurate. In 2011, we conducted a large, nationally representative survey experiment in Rwanda that randomized respondents to report about one of 2 different personal networks. Our results showed that asking respondents for less information can, somewhat surprisingly, produce more accurate size estimates. We also estimated the sizes of 4 key populations at risk for human immunodeficiency virus infection in Rwanda. Our estimates were higher than earlier estimates from Rwanda but lower than international benchmarks. Finally, in this article we develop a new sensitivity analysis framework and use it to assess the possible biases in our estimates. Our design can be customized and extended for other settings, enabling researchers to continue to improve the network scale-up method. PMID:27015875

  16. Quantity Versus Quality: A Survey Experiment to Improve the Network Scale-up Method.

    PubMed

    Feehan, Dennis M; Umubyeyi, Aline; Mahy, Mary; Hladik, Wolfgang; Salganik, Matthew J

    2016-04-15

    The network scale-up method is a promising technique that uses sampled social network data to estimate the sizes of epidemiologically important hidden populations, such as sex workers and people who inject illicit drugs. Although previous scale-up research has focused exclusively on networks of acquaintances, we show that the type of personal network about which survey respondents are asked to report is a potentially crucial parameter that researchers are free to vary. This generalization leads to a method that is more flexible and potentially more accurate. In 2011, we conducted a large, nationally representative survey experiment in Rwanda that randomized respondents to report about one of 2 different personal networks. Our results showed that asking respondents for less information can, somewhat surprisingly, produce more accurate size estimates. We also estimated the sizes of 4 key populations at risk for human immunodeficiency virus infection in Rwanda. Our estimates were higher than earlier estimates from Rwanda but lower than international benchmarks. Finally, in this article we develop a new sensitivity analysis framework and use it to assess the possible biases in our estimates. Our design can be customized and extended for other settings, enabling researchers to continue to improve the network scale-up method. PMID:27015875
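
    For readers unfamiliar with the underlying estimator, the basic (unweighted) network scale-up estimate divides the total number of ties respondents report to the hidden population by their total reported personal network size and multiplies by the size of the whole population. The toy sketch below uses made-up numbers; it does not reproduce the generalized estimator, survey weights, or sensitivity framework developed in the paper.

      import numpy as np

      def basic_scale_up_estimate(ties_to_hidden, network_size, population_size):
          """Basic NSUM estimator: N_hidden ~= N_total * sum(y_i) / sum(d_i)."""
          return population_size * np.sum(ties_to_hidden) / np.sum(network_size)

      # Hypothetical survey responses: ties each respondent reports to the hidden group (y_i)
      # and each respondent's estimated personal network size (d_i).
      y = np.array([0, 1, 0, 2, 0, 0, 1, 0])
      d = np.array([250, 300, 180, 400, 220, 150, 310, 270])
      print(f"estimated hidden population size: {basic_scale_up_estimate(y, d, 11_000_000):,.0f}")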

  17. Identifying the Characteristics of Effective High Schools: Report from Year One of the National Center on Scaling up Effective Schools. Research Report

    ERIC Educational Resources Information Center

    Rutledge, Stacey; Cohen-Vogel, Lora; Osborne-Lampkin, La'Tara

    2012-01-01

    The National Center on Scaling up Effective Schools (NCSU) is a five-year project working to develop, implement, and test new processes to scale up effective practices in high schools that districts will be able to apply within the context of their own unique goals and circumstances. This report describes the activities and findings of the first…

  18. Scaled-up in vitro experiments of vocal fold paralysis

    NASA Astrophysics Data System (ADS)

    Peterson, Keith; Wei, Timothy; Krane, Michael

    2006-11-01

    Vocal fold paralysis is the inability of either one, or both, vocal folds to open and close properly. Digital Particle Image Velocimetry (DPIV) measurements were taken to further understand the consequences paralyzed vocal folds have on the fluid dynamics downstream of the vocal folds during human phonation. The experiments were conducted in a free-stream water tunnel using a simplified scaled-up model of human vocal folds. The Reynolds and Strouhal numbers ranged from 4500 to 10000, and 0.01 to 0.04, respectively. Various configurations were tested to emulate several types of vocal fold paralyses. These configurations include unilateral vocal fold immobility (UVFI), bilateral vocal fold immobility (BVFI) and the vocal folds operating at different oscillating frequencies. Data from these different conditions will be compared with an eye toward understanding the critical dynamics associated with this class of disease.
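
    The dimensionless groups quoted above follow their standard definitions, Re = U L / nu and St = f L / U. The snippet below evaluates them for placeholder water-tunnel values chosen only to land inside the quoted ranges; they are not the experimental settings.

      # Standard dimensionless groups for the scaled-up water-tunnel model.
      # All numbers are placeholders, not the experimental values.
      nu_water = 1.0e-6      # kinematic viscosity of water, m^2/s
      U = 0.5                # free-stream speed, m/s
      L = 0.02               # vocal-fold length scale, m
      f = 1.0                # fold oscillation frequency, Hz

      Re = U * L / nu_water  # Reynolds number
      St = f * L / U         # Strouhal number
      print(f"Re = {Re:.0f}, St = {St:.3f}")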

  19. Biological conversion of synthesis gas. Limiting conditions/scale-up

    SciTech Connect

    Basu, R.; Klasson, K.T.; Takriff, M.; Clausen, E.C.; Gaddy, J.L.

    1993-09-01

    The purpose of this research is to develop a technically and economically feasible process for biologically producing H2 from synthesis gas while, at the same time, removing harmful sulfur gas compounds. Six major tasks are being studied: 1. Culture development, where the best cultures are selected and conditions optimized for simultaneous hydrogen production and sulfur gas removal; 2. Mass transfer and kinetic studies in which equations necessary for process design are developed; 3. Bioreactor design studies, where the cultures chosen in Task 1 are utilized in continuous reaction vessels to demonstrate process feasibility and define operating conditions; 4. Evaluation of biological synthesis gas conversion under limiting conditions in preparation for industrial demonstration studies; 5. Process scale-up where laboratory data are scaled to larger-size units in preparation for process demonstration in a pilot-scale unit; and 6. Economic evaluation, where process simulations are used to project process economics and identify high cost areas during sensitivity analyses.
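
    The mass-transfer task above is usually built around the dynamic gas absorption balance dC/dt = kLa (C* - C). As one illustration (an assumption here, not a method taken from the report), the volumetric mass-transfer coefficient kLa can be recovered from dissolved-gas measurements by a log-linear fit:

      import numpy as np

      # Hypothetical dissolved-gas data following C(t) = C_sat * (1 - exp(-kLa * t)) plus noise.
      t = np.linspace(0.0, 300.0, 16)      # s
      C_sat, kla_true = 0.8, 0.02          # mmol/L and 1/s (illustrative values)
      rng = np.random.default_rng(1)
      C = C_sat * (1.0 - np.exp(-kla_true * t)) + rng.normal(0.0, 0.005, t.size)

      # Linearize: ln(1 - C/C_sat) = -kLa * t, then fit the slope by least squares.
      mask = C < 0.98 * C_sat              # drop points at or near saturation
      slope = np.polyfit(t[mask], np.log(1.0 - C[mask] / C_sat), 1)[0]
      print(f"estimated kLa = {-slope:.4f} 1/s  (true value {kla_true})")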

  20. Scaling Up Family Therapy in Fragile, Conflict-Affected States.

    PubMed

    Charlés, Laurie L

    2015-09-01

    This article discusses the design and delivery of two international family therapy-focused mental health and psychosocial support training projects, one in a fragile state and one in a post-conflict state. The training projects took place in Southeast Asia and the Middle East/North Africa. Each was funded, supported, and implemented by local, regional, and international stakeholders, and delivered as part of a broader humanitarian agenda to develop human resource capacity to work with families affected by atrocities. The two examples illustrate how task-shifting/task-sharing and transitional justice approaches were used to inform the scaling-up of professionals involved in each project. They also exemplify how state-citizen phenomena in each location affected the project design and delivery. PMID:25315510

  1. TA Beliefs in a SCALE-UP Style Classroom

    NASA Astrophysics Data System (ADS)

    DeBeck, George; Settelmeyer, Sam; Li, Sissi; Demaree, Dedra

    2010-10-01

    In Spring 2010, the Oregon State University physics department instituted a SCALE-UP (Student-Centered Active Learning Environment for Undergraduate Programs) style studio classroom in the introductory, calculus-based physics series. In our initial implementation, comprising two hours of lecture, two hours of studio, and two hours of lab work, the studio session was led by a faculty member and either 2 GTAs or 1 GTA and 1 LA. We plan to move to a model where senior GTAs can lead studio sections after co-teaching with the faculty member. It is critical that we know how to prepare and support the instructional team in facilitating student learning in this setting. We examine GTA and LA pedagogical beliefs through reflective journaling, interviews, and personal experience of the authors. In particular, we examine how these beliefs changed over their first quarter of instruction, as well as the resources used to adapt to the new classroom environment.

  2. Gravimetry and Space Techniques Applied to Geodynamics and Ocean Dynamics

    NASA Astrophysics Data System (ADS)

    Schutz, Bob E.; Anderson, Allen; Froidevaux, Claude; Parke, Michael

    The variety of disciplines represented in this volume (including space geodesy, oceanography, geophysics, and celestial mechanics) attests to the interdisciplinary applications of gravimetry and space techniques. The relation to sea level is addressed within some of the papers and the contributions of the techniques to development of global gravity models are discussed. The space technique of satellite altimetry has become a prominent contributor to sea surface topography as well as ocean tide models and determination of gravity, especially in ocean areas. Ocean tides influence the motion of near-Earth satellites and the rotation of the Earth. Modern space geodesy is increasingly relying on the Global Positioning System for measuring geophysical phenomena manifested at the surface through crustal deformations. Furthermore, the geophysical interpretation of gravity anomalies has been facilitated by the introduction of modern techniques. This volume represents only a small "snapshot" of the interdisciplinary research being conducted. Modern space geodesy is one of the common links between the disciplines reflected in this volume. New developments in gravimetry and space techniques will further enhance and foster interdisciplinary work in coming years.

  3. Production Scale-Up of Activated Carbons for Ultracapacitors

    SciTech Connect

    Dr. Steven D. Dietz

    2007-01-10

    Transportation use accounts for 67% of the petroleum consumption in the US. Electric and hybrid vehicles are promising technologies for decreasing our dependence on petroleum, and this is the objective of the FreedomCAR & Vehicle Technologies Program. Inexpensive and efficient energy storage devices are needed for electric and hybrid vehicles to be economically viable, and ultracapacitors are a leading energy storage technology being investigated by the FreedomCAR program. The most important parameter in determining the power and energy density of a carbon-based ultracapacitor is the amount of surface area accessible to the electrolyte, which is primarily determined by the pore size distribution. The major problems with current carbons are that their pore size distribution is not optimized for liquid electrolytes and the best carbons are very expensive. TDA Research, Inc. (TDA) has developed methods to prepare porous carbons with tunable pore size distributions from inexpensive carbohydrate-based precursors. The use of low-cost feedstocks and processing steps greatly lowers the production costs. During this project, with the assistance of Maxwell Technologies, we found that an impurity was limiting the performance of our carbon; the major impurity was sulfur. A new carbon with low sulfur content was made, and the performance of the carbon was found to be greatly improved. We also scaled up the process to pre-production levels and are currently able to produce 0.25 tons/year of activated carbon. We could easily double this amount by purchasing a second rotary kiln. More importantly, we are working with MeadWestvaco on a Joint Development Agreement to scale up the process to produce hundreds of tons of high quality, inexpensive carbon per year based on our processes.

  4. Cognitive task analysis: Techniques applied to airborne weapons training

    SciTech Connect

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E.; Carlow Associates, Inc., Fairfax, VA; Martin Marietta Energy Systems, Inc., Oak Ridge, TN; Tennessee Univ., Knoxville, TN )

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis plays and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  5. Novel method for constructing a large-scale design space in lubrication process by using Bayesian estimation based on the reliability of a scale-up rule.

    PubMed

    Maeda, Jin; Suzuki, Tatsuya; Takayama, Kozo

    2012-01-01

    A reliable large-scale design space was constructed by integrating the reliability of a scale-up rule into the Bayesian estimation without enforcing a large-scale design of experiments (DoE). A small-scale DoE was conducted using various Froude numbers (X1) and blending times (X2) in the lubricant blending process for theophylline tablets. The response surfaces, design space, and their reliability of the compression rate of the powder mixture (Y1), tablet hardness (Y2), and dissolution rate (Y3) on a small scale were calculated using multivariate spline interpolation, a bootstrap resampling technique, and self-organizing map clustering. A constant Froude number was applied as a scale-up rule. Experiments were conducted at four different small scales with the same Froude number and blending time in order to determine the discrepancies in the response variables between the scales so as to indicate the reliability of the scale-up rule. Three experiments under an optimal condition and two experiments under other conditions were performed on a large scale. The response surfaces on the small scale were corrected to those on the large scale by Bayesian estimation using the large-scale results and the reliability of the scale-up rule. Large-scale experiments performed under three additional sets of conditions showed that the corrected design space was more reliable than the small-scale design space even when there was some discrepancy in the pharmaceutical quality between the manufacturing scales. This approach is useful for setting up a design space in pharmaceutical development when a DoE cannot be performed at a commercial large manufacturing scale. PMID:22976324
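
    The constant-Froude-number scale-up rule used above has a simple operational consequence: if the blending Froude number is written in the common form Fr = N^2 D / g (an assumption here; the authors' exact definition may differ), keeping Fr constant fixes the large-scale rotation speed from the small-scale one. A minimal sketch:

      import math

      G = 9.81  # m/s^2

      def froude(n_rps, diameter_m):
          """Froude number for a rotating blender, Fr = N^2 * D / g (one common definition)."""
          return n_rps ** 2 * diameter_m / G

      def speed_for_constant_froude(n_small_rps, d_small_m, d_large_m):
          """Rotation speed at the large scale that preserves Fr: N_L = N_S * sqrt(D_S / D_L)."""
          return n_small_rps * math.sqrt(d_small_m / d_large_m)

      # Illustrative geometry only (not the study's equipment).
      n_small, d_small, d_large = 0.5, 0.3, 1.2      # rev/s and vessel diameters in m
      n_large = speed_for_constant_froude(n_small, d_small, d_large)
      print(f"Fr small = {froude(n_small, d_small):.4f}, Fr large = {froude(n_large, d_large):.4f}")
      print(f"large-scale speed = {n_large * 60:.1f} rpm")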

  6. Machine-Learning Techniques Applied to Antibacterial Drug Discovery

    PubMed Central

    Durrant, Jacob D.; Amaro, Rommie E.

    2014-01-01

    The emergence of drug-resistant bacteria threatens to catapult humanity back to the pre-antibiotic era. Even now, multi-drug-resistant bacterial infections annually result in millions of hospital days, billions in healthcare costs, and, most importantly, tens of thousands of lives lost. As many pharmaceutical companies have abandoned antibiotic development in search of more lucrative therapeutics, academic researchers are uniquely positioned to fill the resulting vacuum. Traditional high-throughput screens and lead-optimization efforts are expensive and labor intensive. Computer-aided drug discovery techniques, which are cheaper and faster, can accelerate the identification of novel antibiotics in an academic setting, leading to improved hit rates and faster transitions to pre-clinical and clinical testing. The current review describes two machine-learning techniques, neural networks and decision trees, that have been used to identify experimentally validated antibiotics. We conclude by describing the future directions of this exciting field. PMID:25521642

  7. Machine-learning techniques applied to antibacterial drug discovery.

    PubMed

    Durrant, Jacob D; Amaro, Rommie E

    2015-01-01

    The emergence of drug-resistant bacteria threatens to revert humanity back to the preantibiotic era. Even now, multidrug-resistant bacterial infections annually result in millions of hospital days, billions in healthcare costs, and, most importantly, tens of thousands of lives lost. As many pharmaceutical companies have abandoned antibiotic development in search of more lucrative therapeutics, academic researchers are uniquely positioned to fill the pipeline. Traditional high-throughput screens and lead-optimization efforts are expensive and labor intensive. Computer-aided drug-discovery techniques, which are cheaper and faster, can accelerate the identification of novel antibiotics, leading to improved hit rates and faster transitions to preclinical and clinical testing. The current review describes two machine-learning techniques, neural networks and decision trees, that have been used to identify experimentally validated antibiotics. We conclude by describing the future directions of this exciting field. PMID:25521642
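
    As a minimal illustration of one of the two techniques named in the review, the sketch below trains a scikit-learn decision tree on hypothetical molecular-descriptor vectors with synthetic activity labels; it illustrates the general workflow only and is not the models or data discussed by the authors.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.model_selection import train_test_split

      # Hypothetical descriptor matrix: rows = compounds, columns = e.g. MW, logP, TPSA, ring count.
      rng = np.random.default_rng(42)
      X = rng.normal(size=(200, 4))
      y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # synthetic "active" label, illustration only

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
      clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
      print("held-out accuracy:", clf.score(X_test, y_test))
      print("predicted class for one new compound:", clf.predict(X_test[:1])[0])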

  8. Technology Assessment of Dust Suppression Techniques Applied During Structural Demolition

    SciTech Connect

    Boudreaux, J.F.; Ebadian, M.A.; Williams, P.T.; Dua, S.K.

    1998-10-20

    Hanford, Fernald, Savannah River, and other sites are currently reviewing technologies that can be implemented to demolish buildings in a cost-effective manner. In order to demolish a structure properly and, at the same time, minimize the amount of dust generated from a given technology, an evaluation must be conducted to choose the most appropriate dust suppression technology given site-specific conditions. Thus, the purpose of this research, which was carried out at the Hemispheric Center for Environmental Technology (HCET) at Florida International University, was to conduct an experimental study of dust aerosol abatement (dust suppression) methods as applied to nuclear D and D. This experimental study targeted the problem of dust suppression during the demolition of nuclear facilities. The resulting data were employed to assist in the development of mathematical correlations that can be applied to predict dust generation during structural demolition.

  9. Microwave de-embedding techniques applied to acoustics.

    PubMed

    Jackson, Charles M

    2005-07-01

    This paper describes the use of the microwave techniques of time domain reflectometry (TDR) and de-embedding in an acoustical application. Two methods of calibrating the reflectometer are presented to evaluate the consistency of the method. Measured and modeled S-parameters of woodwind instruments are presented. The raw measured data is de-embedded to obtain an accurate measurement. The acoustic TDR setup is described. PMID:16212248

  10. Analysis of soil images applying Laplacian Pyramidal techniques

    NASA Astrophysics Data System (ADS)

    Ballesteros, F.; de Castro, J.; Tarquis, A. M.; Méndez, A.

    2012-04-01

    The Laplacian pyramid is a technique for image encoding in which local operators of many scales but identical shape are the basis functions. Our work describes some properties of the filters of the Laplacian pyramid. In particular, we pay attention to the Gaussian and fractal behaviour of these filters, and we determine the normal and fractal ranges in the case of single-parameter filters, while studying the influence of these filters in soil image processing. One usual property of any image is that neighboring pixels are highly correlated. This property makes it inefficient to represent the image directly in terms of the pixel values, because most of the encoded information would be redundant. Burt and Adelson designed a technique, named the Laplacian pyramid, for removing image correlation which combines features of predictive and transform methods. This technique is non-causal, and its computations are simple and local. The predicted value for each pixel is computed as a local weighted average, using a unimodal weighting function centred on the pixel itself. Pyramid construction is equivalent to convolving the original image with a set of weighting functions determined by a parameter that defines the filter. According to the parameter values, these filters have a behaviour that goes from the Gaussian shape to the fractal. Previous works only analyze Gaussian filters, but we determine the Gaussian and fractal intervals and study the energy of the Laplacian pyramid images according to the filter types. The different behaviour involves, qualitatively, a significant change in statistical characteristics at different levels of iteration, especially in the fractal case, which can highlight specific information from the images. Funding provided by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no. AGL2010-21501/AGR is greatly appreciated.
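
    The construction described above can be sketched with OpenCV's pyrDown/pyrUp pair, which use a fixed 5x5 kernel rather than the single-parameter filters studied in the abstract; the per-level mean-square value below stands in for the level "energy". The image is synthetic, so this is an assumed, illustrative pipeline rather than the authors' code.

      import numpy as np
      import cv2   # OpenCV; pyrDown/pyrUp use a fixed 5x5 kernel, not the paper's tunable filter

      def laplacian_pyramid(image, levels=4):
          """Return [L0, ..., L_{n-1}, G_n]: band-pass levels plus the final low-pass residual."""
          pyramid, current = [], image.astype(np.float32)
          for _ in range(levels):
              down = cv2.pyrDown(current)
              up = cv2.pyrUp(down, dstsize=(current.shape[1], current.shape[0]))
              pyramid.append(current - up)     # Laplacian level = detail lost by downsampling
              current = down
          pyramid.append(current)              # coarsest Gaussian level
          return pyramid

      # Synthetic 256x256 test image (a soil image would be loaded with cv2.imread instead).
      img = np.random.default_rng(0).random((256, 256)).astype(np.float32)
      levels = laplacian_pyramid(img)
      print([lvl.shape for lvl in levels])
      print(["%.4f" % float(np.mean(lvl ** 2)) for lvl in levels])   # per-level mean-square "energy"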

  11. Unconventional Coding Technique Applied to Multi-Level Polarization Modulation

    NASA Astrophysics Data System (ADS)

    Rutigliano, G. G.; Betti, S.; Perrone, P.

    2016-05-01

    A new technique is proposed to improve information confidentiality in optical-fiber communications without bandwidth consumption. A pseudorandom vectorial sequence was generated by a dynamical-system algorithm and used to encode a multi-level polarization modulation based on the Stokes vector. Optical-fiber birefringence, usually considered a disturbance, was exploited to obfuscate the signal transmission. At the receiver end, the same pseudorandom sequence was generated and used to decode the multi-level polarization-modulated signal. The proposed scheme, working at the physical layer, provides strong information security without introducing complex processing and thus latency.
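
    A toy version of the idea, under assumptions not taken from the paper: a chaotic (logistic-map) sequence shared by transmitter and receiver cyclically shifts the mapping between 2-bit symbols and four polarization states, and the receiver undoes the shift with the same key. Channel effects, the Stokes-vector states themselves, and the authors' actual dynamic-system algorithm are omitted.

      import numpy as np

      def logistic_sequence(x0, n, r=3.99):
          """Chaotic pseudorandom sequence from the logistic map x -> r*x*(1-x)."""
          out, x = np.empty(n), x0
          for i in range(n):
              x = r * x * (1.0 - x)
              out[i] = x
          return out

      def scramble(symbols, key):
          """Shift each 2-bit symbol among 4 polarization states by a key-dependent amount."""
          shifts = (logistic_sequence(key, symbols.size) * 4).astype(int) % 4
          return (symbols + shifts) % 4

      def descramble(state_indices, key):
          shifts = (logistic_sequence(key, state_indices.size) * 4).astype(int) % 4
          return (state_indices - shifts) % 4

      key = 0.3141                                   # shared secret seed (illustrative)
      symbols = np.random.default_rng(7).integers(0, 4, 12)
      tx = scramble(symbols, key)                    # transmitted polarization-state indices
      print("recovered:", np.array_equal(descramble(tx, key), symbols))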

  12. Low background techniques applied in the BOREXINO experiment

    SciTech Connect

    Zuzel, G.

    2015-08-17

    The BOREXINO detector, located in the Gran Sasso National Laboratory in Italy, has been designed for real-time spectroscopy of low-energy solar neutrinos. Within the experiment several novel background reduction and assay techniques have been established. In many cases they are still the most sensitive world-wide. The developed methods and apparatus provided tools for a strict quality control program during the construction phase of the BOREXINO detector, which was the key to meeting the background requirements. Achievement of an extremely low background rate opened the possibility of probing, in real time, almost the entire spectrum of solar neutrinos.

  13. Advances in parameter estimation techniques applied to flexible structures

    NASA Technical Reports Server (NTRS)

    Maben, Egbert; Zimmerman, David C.

    1994-01-01

    In this work, various parameter estimation techniques are investigated in the context of structural system identification utilizing distributed parameter models and 'measured' time-domain data. Distributed parameter models are formulated using the PDEMOD software developed by Taylor. Enhancements made to PDEMOD for this work include the following: (1) a Wittrick-Williams based root solving algorithm; (2) a time simulation capability; and (3) various parameter estimation algorithms. The parameter estimation schemes will be contrasted using the NASA Mini-Mast as the focus structure.

  14. Boson mapping techniques applied to constant gauge fields in QCD

    NASA Technical Reports Server (NTRS)

    Hess, Peter Otto; Lopez, J. C.

    1995-01-01

    Pairs of coordinates and derivatives of the constant gluon modes are mapped to new gluon-pair fields and their derivatives. Applying this mapping to the Hamiltonian of constant gluon fields results, for large coupling constants, in an effective Hamiltonian which separates into one part describing a scalar field and another for a field with spin two. The ground state is dominated by pairs of gluons coupled to color and spin zero with slight admixtures of color-zero and spin-two pairs. As color group we used SU(2).

  15. Image enhancement techniques applied to solar feature detection

    NASA Astrophysics Data System (ADS)

    Kowalski, Artur J.

    This dissertation presents the development of automatic image enhancement techniques for solar feature detection. The new method allows for detection and tracking of the evolution of filaments in solar images. Series of H-alpha full-disk images are taken in regular time intervals to observe the changes of the solar disk features. In each picture, the solar chromosphere filaments are identified for further evolution examination. The initial preprocessing step involves local thresholding to convert grayscale images into black-and-white pictures with chromosphere granularity enhanced. An alternative preprocessing method, based on image normalization and global thresholding is presented. The next step employs morphological closing operations with multi-directional linear structuring elements to extract elongated shapes in the image. After logical union of directional filtering results, the remaining noise is removed from the final outcome using morphological dilation and erosion with a circular structuring element. Experimental results show that the developed techniques can achieve excellent results in detecting large filaments and good detection rates for small filaments. The final chapter discusses proposed directions of the future research and applications to other areas of solar image processing, in particular to detection of solar flares, plages and sunspots.
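
    The morphological stage of the pipeline can be sketched with scipy.ndimage on a synthetic thresholded image: closing with linear structuring elements at several orientations, a logical union, and a final erosion/dilation pass to drop isolated noise pixels. The kernel length, the set of directions, and the use of an erosion-then-dilation clean-up are assumptions for this illustration, not the dissertation's parameters.

      import numpy as np
      from scipy import ndimage

      def line_element(length, angle_deg):
          """Binary linear structuring element of the given length and orientation."""
          se = np.zeros((length, length), dtype=bool)
          c = length // 2
          for r in np.linspace(-c, c, 2 * length):
              i = int(round(c + r * np.sin(np.deg2rad(angle_deg))))
              j = int(round(c + r * np.cos(np.deg2rad(angle_deg))))
              se[i, j] = True
          return se

      # Synthetic thresholded image: a diagonal "filament" a few pixels wide plus salt noise.
      img = np.zeros((128, 128), dtype=bool)
      for k in range(20, 100):
          img[k, k:k + 4] = True
      img |= np.random.default_rng(3).random((128, 128)) < 0.02

      # Closing along several directions, union of the results, then erosion + dilation.
      union = np.zeros_like(img)
      for angle in (0, 45, 90, 135):
          union |= ndimage.binary_closing(img, structure=line_element(15, angle))
      cleaned = ndimage.binary_dilation(ndimage.binary_erosion(union))
      print("foreground pixels before/after cleaning:", int(union.sum()), int(cleaned.sum()))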

  16. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 high alpha research vehicle (HARV) automated test systems are discussed. It is noted that operational experiences in developing and using these automated testing techniques have highlighted the need for incorporating target system features to improve testability. Improved target system testability can be accomplished with the addition of nonreal-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Also, test system and target system design issues must be addressed during the early stages of the target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.

  17. Applied geophysical techniques to evaluate earth dams and foundations

    NASA Astrophysics Data System (ADS)

    Llopis, Jose L.; Sharp, Michael K.; Butler, Dwain K.; Yule, Donald E.

    1995-05-01

    Mill Creek Dam, near Walla Walla, Washington has experienced anomalous seepage since its first filling in 1941. Various attempts to abate and control the seepage, including construction of a concrete wall, have not been completely successful. Construction of the cutoff wall reduced the seepage by about 30 percent, from 33 cubic feet per second to 22 cubic feet per second, and downstream saturated farmland was reduced by 56 percent. However, there are indications of increased seepage pressures in a conglomerate formation in the right abutment. A comprehensive, integrated geophysics investigation of the right abutment area of the dam was conducted to detect and map anomalous conditions and assist in the evaluation of remedial measures. The geophysics program consisted of microgravity, ground penetrating radar, seismic reflection, electromagnetic conductivity, and electrical resistivity surveying. Results of the program indicate anomalous conditions extending from the reservoir area through the right abutment. The aspects of the program planning leading to technique selection and field procedures are emphasized, as well as the role of different geophysical techniques in defining the nature of anomalous condition.

  18. Applying manifold learning techniques to the CAESAR database

    NASA Astrophysics Data System (ADS)

    Mendoza-Schrock, Olga; Patrick, James; Arnold, Gregory; Ferrara, Matthew

    2010-04-01

    Understanding and organizing data is the first step toward exploiting sensor phenomenology for dismount tracking. What image features are good for distinguishing people and what measurements, or combination of measurements, can be used to classify the dataset by demographics including gender, age, and race? A particular technique, Diffusion Maps, has demonstrated the potential to extract features that intuitively make sense [1]. We want to develop an understanding of this tool by validating existing results on the Civilian American and European Surface Anthropometry Resource (CAESAR) database. This database, provided by the Air Force Research Laboratory (AFRL) Human Effectiveness Directorate and SAE International, is a rich dataset which includes 40 traditional, anthropometric measurements of 4400 human subjects. If we could specifically measure the defining features for classification, from this database, then the future question will then be to determine a subset of these features that can be measured from imagery. This paper briefly describes the Diffusion Map technique, shows potential for dimension reduction of the CAESAR database, and describes interesting problems to be further explored.
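
    The core of the Diffusion Maps technique referenced above is a Gaussian affinity matrix, its row-normalized Markov matrix, and the leading non-trivial eigenvectors. The sketch below applies it to random data standing in for the anthropometric measurement matrix; the kernel bandwidth heuristic and embedding dimension are assumptions, not choices made by the authors.

      import numpy as np
      from scipy.spatial.distance import pdist, squareform
      from scipy.linalg import eigh

      def diffusion_map(X, n_components=2, epsilon=None):
          """Basic diffusion-map embedding of the rows of X."""
          D = squareform(pdist(X))                      # pairwise Euclidean distances
          if epsilon is None:
              epsilon = np.median(D) ** 2               # heuristic kernel bandwidth
          K = np.exp(-D ** 2 / epsilon)                 # Gaussian affinity matrix
          P = K / K.sum(axis=1, keepdims=True)          # row-stochastic Markov matrix
          # Symmetrize as D^{1/2} P D^{-1/2} so a symmetric eigensolver can be used.
          d = K.sum(axis=1)
          S = np.sqrt(d)[:, None] * P / np.sqrt(d)[None, :]
          vals, vecs = eigh(S)
          idx = np.argsort(vals)[::-1]
          psi = vecs[:, idx] / np.sqrt(d)[:, None]      # right eigenvectors of P
          # Skip the trivial constant eigenvector; scale coordinates by their eigenvalues.
          return psi[:, 1:n_components + 1] * vals[idx][1:n_components + 1]

      # Stand-in for the 4400 x 40 CAESAR measurement matrix (random data, illustration only).
      X = np.random.default_rng(0).normal(size=(300, 40))
      embedding = diffusion_map(X)
      print(embedding.shape)   # (300, 2)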

  19. Active Learning Techniques Applied to an Interdisciplinary Mineral Resources Course.

    NASA Astrophysics Data System (ADS)

    Aird, H. M.

    2015-12-01

    An interdisciplinary active learning course was introduced at the University of Puget Sound entitled 'Mineral Resources and the Environment'. Various formative assessment and active learning techniques that have been effective in other courses were adapted and implemented to improve student learning, increase retention and broaden knowledge and understanding of course material. This was an elective course targeted towards upper-level undergraduate geology and environmental majors. The course provided an introduction to the mineral resources industry, discussing geological, environmental, societal and economic aspects, legislation and the processes involved in exploration, extraction, processing, reclamation/remediation and recycling of products. Lectures and associated weekly labs were linked in subject matter; relevant readings from the recent scientific literature were assigned and discussed in the second lecture of the week. Peer-based learning was facilitated through weekly reading assignments with peer-led discussions and through group research projects, in addition to in-class exercises such as debates. Writing and research skills were developed through student groups designing, carrying out and reporting on their own semester-long research projects around the lasting effects of the historical Ruston Smelter on the biology and water systems of Tacoma. The writing of their mini grant proposals and final project reports was carried out in stages to allow for feedback before the deadline. Speakers from industry were invited to share their specialist knowledge as guest lecturers, and students were encouraged to interact with them, with a view to employment opportunities. Formative assessment techniques included jigsaw exercises, gallery walks, placemat surveys, think pair share and take-home point summaries. Summative assessment included discussion leadership, exams, homeworks, group projects, in-class exercises, field trips, and pre-discussion reading exercises

  20. Innovative Visualization Techniques applied to a Flood Scenario

    NASA Astrophysics Data System (ADS)

    Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael

    2013-04-01

    The large and ever-increasing amounts of multi-dimensional, time-varying and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time-sampled data sets, providing an integrated mechanism that aids the user to collaboratively explore, present and communicate visually complex and dynamic data. Here we present these concepts in the context of a 4 hour flood scenario from Lisbon in 2010, with data that includes measures of water column (flood height) every 10 minutes at a 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers and storytelling. We explain in more detail two of these that are not currently in common use for data visualization: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparisons, time step, class values for colour legend, etc. and provide a snapshot of the current application state, which can then be provided as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it. The second technique, web-based linked views, includes multiple windows which interactively respond to the user selections, so that when selecting an object and changing it in one window, it will automatically update in all the other

  1. Finite element techniques applied to cracks interacting with selected singularities

    NASA Technical Reports Server (NTRS)

    Conway, J. C.

    1975-01-01

    The finite-element method for computing the extensional stress-intensity factor for cracks approaching selected singularities of varied geometry is described. Stress-intensity factors are generated using both displacement and J-integral techniques, and numerical results are compared to those obtained experimentally in a photoelastic investigation. The selected singularities considered are a colinear crack, a circular penetration, and a notched circular penetration. Results indicate that singularities greatly influence the crack-tip stress-intensity factor as the crack approaches the singularity. In addition, the degree of influence can be regulated by varying the overall geometry of the singularity. Local changes in singularity geometry have little effect on the stress-intensity factor for the cases investigated.

  2. Status of text-mining techniques applied to biomedical text.

    PubMed

    Erhardt, Ramón A-A; Schneider, Reinhard; Blaschke, Christian

    2006-04-01

    Scientific progress is increasingly based on knowledge and information. Knowledge is now recognized as the driver of productivity and economic growth, leading to a new focus on the role of information in the decision-making process. Most scientific knowledge is registered in publications and other unstructured representations that make it difficult to use and to integrate the information with other sources (e.g. biological databases). Making a computer understand human language has proven to be a complex achievement, but there are techniques capable of detecting, distinguishing and extracting a limited number of different classes of facts. In the biomedical field, extracting information has specific problems: complex and ever-changing nomenclature (especially genes and proteins) and the limited representation of domain knowledge. PMID:16580973

  3. Object Detection Techniques Applied on Mobile Robot Semantic Navigation

    PubMed Central

    Astua, Carlos; Barber, Ramon; Crespo, Jonathan; Jardon, Alberto

    2014-01-01

    The future of robotics predicts that robots will integrate themselves more every day with human beings and their environments. To achieve this integration, robots need to acquire information about the environment and its objects. There is a big need for algorithms to provide robots with this sort of skill, from locating the objects needed to accomplish a task up to treating these objects as information about the environment. This paper presents a way to provide mobile robots with the ability to detect objects for semantic navigation. This paper aims to use current trends in robotics in a way that, at the same time, can be exported to other platforms. Two methods to detect objects are proposed, contour detection and a descriptor-based technique, and both of them are combined to overcome their respective limitations. Finally, the code is tested on a real robot, to prove its accuracy and efficiency. PMID:24732101

  4. Discrete filtering techniques applied to sequential GPS range measurements

    NASA Technical Reports Server (NTRS)

    Vangraas, Frank

    1987-01-01

    The basic navigation solution is described for position and velocity based on range and delta range (Doppler) measurements from NAVSTAR Global Positioning System satellites. The application of discrete filtering techniques is examined to reduce the white noise distortions on the sequential range measurements. A second order (position and velocity states) Kalman filter is implemented to obtain smoothed estimates of range by filtering the dynamics of the signal from each satellite separately. Test results using a simulated GPS receiver show a steady-state noise reduction, the input noise variance divided by the output noise variance, of a factor of four. Recommendations for further noise reduction based on higher order Kalman filters or additional delta range measurements are included.
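
    A minimal version of the second-order (range and range-rate) filter described above is sketched here for a single satellite channel, with illustrative process and measurement noise settings rather than the report's tuning; the simulated geometry and noise are placeholders.

      import numpy as np

      # Two-state Kalman filter (range, range-rate) smoothing noisy range measurements
      # from one satellite. Dynamics, noise levels, and geometry are illustrative only.
      dt = 1.0
      F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-rate dynamics
      H = np.array([[1.0, 0.0]])                 # only range is measured
      Q = np.diag([0.01, 0.01])                  # process noise covariance
      R = np.array([[25.0]])                     # measurement noise variance (5 m sigma)

      rng = np.random.default_rng(0)
      true_range = 2.0e7 + 800.0 * np.arange(120) * dt          # slowly changing range, m
      z = true_range + rng.normal(0.0, 5.0, true_range.size)    # noisy range measurements

      x = np.array([[z[0]], [(z[1] - z[0]) / dt]])              # initial state from first two fixes
      P = np.diag([25.0, 50.0])
      filtered = []
      for zk in z:
          x, P = F @ x, F @ P @ F.T + Q                         # predict
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)          # Kalman gain
          x = x + K @ (np.array([[zk]]) - H @ x)                # update with the new range
          P = (np.eye(2) - K @ H) @ P
          filtered.append(float(x[0, 0]))

      print(f"raw residual std:      {np.std(z - true_range):.2f} m")
      print(f"filtered residual std: {np.std(np.array(filtered) - true_range):.2f} m")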

  5. Object detection techniques applied on mobile robot semantic navigation.

    PubMed

    Astua, Carlos; Barber, Ramon; Crespo, Jonathan; Jardon, Alberto

    2014-01-01

    The future of robotics predicts that robots will integrate themselves more every day with human beings and their environments. To achieve this integration, robots need to acquire information about the environment and its objects. There is a big need for algorithms to provide robots with this sort of skill, from locating the objects needed to accomplish a task up to treating these objects as information about the environment. This paper presents a way to provide mobile robots with the ability to detect objects for semantic navigation. This paper aims to use current trends in robotics in a way that, at the same time, can be exported to other platforms. Two methods to detect objects are proposed, contour detection and a descriptor-based technique, and both of them are combined to overcome their respective limitations. Finally, the code is tested on a real robot, to prove its accuracy and efficiency. PMID:24732101
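
    Both detection routes named in the abstract can be sketched with OpenCV on a synthetic image: contour extraction from a thresholded frame, and ORB descriptor matching against a template. The code assumes OpenCV 4's two-value findContours return and uses placeholder images, so it illustrates the general workflow rather than the authors' implementation.

      import numpy as np
      import cv2   # assumes OpenCV 4.x (two return values from findContours)

      # Synthetic grayscale "scene" with one bright rectangular object.
      scene = np.zeros((240, 320), dtype=np.uint8)
      cv2.rectangle(scene, (100, 80), (180, 160), 255, thickness=-1)
      template = scene[70:170, 90:190].copy()

      # Route 1: contour detection on a thresholded image.
      _, binary = cv2.threshold(scene, 127, 255, cv2.THRESH_BINARY)
      contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
      print("contours found:", len(contours), "bounding box:", cv2.boundingRect(contours[0]))

      # Route 2: ORB keypoint descriptors matched between template and scene.
      orb = cv2.ORB_create(nfeatures=200)
      kp1, des1 = orb.detectAndCompute(template, None)
      kp2, des2 = orb.detectAndCompute(scene, None)
      if des1 is not None and des2 is not None:
          matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
          print("descriptor matches:", len(matches))
      else:
          print("not enough texture for ORB on this synthetic image")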

  6. Applying Clustering Techniques to Reduce Complexity in Automated Planning Domains

    NASA Astrophysics Data System (ADS)

    Dicken, Luke; Levine, John

    Automated Planning is a very active area of research within Artificial Intelligence. Broadly this discipline deals with the methods by which an agent can independently determine the action sequence required to successfully achieve a set of objectives. In this paper, we will present initial work outlining a new approach to planning based on Clustering techniques, in order to group states of the world together and use the fundamental structure of the world to lift out more abstract representations. We will show that this approach can limit the combinatorial explosion of a typical planning problem in a way that is much more intuitive and reusable than has previously been possible, and outline ways that this approach can be developed further.
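
    A toy rendering of the idea, under assumptions not drawn from the paper: encode each concrete world state as a feature vector, cluster the vectors, and let the cluster labels define abstract macro-states over which a planner could search. k-means and the random feature encoding below are stand-ins for whatever clustering method and state representation the approach ultimately uses.

      import numpy as np
      from sklearn.cluster import KMeans

      # Hypothetical state features: each row encodes one world state of a planning problem
      # (e.g., counts of objects at locations). Random data stands in for a real state space.
      rng = np.random.default_rng(0)
      states = np.vstack([rng.normal(loc, 0.3, size=(50, 5)) for loc in (0.0, 2.0, 4.0)])

      kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(states)
      abstract_state = kmeans.labels_            # macro-state id for every concrete state
      print("states per macro-state:", np.bincount(abstract_state))
      # A planner could now search over 3 macro-states instead of 150 concrete states.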

  7. Security Verification Techniques Applied to PatchLink COTS Software

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer

    2006-01-01

    Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.

  8. Technology Assessment of Dust Suppression Techniques applied During Structural Demolition

    SciTech Connect

    Boudreaux, J.F.; Ebadian, M.A.; Dua, S.K.

    1997-08-06

    Hanford, Fernald, Savannah River, and other sites are currently reviewing technologies that can be implemented to demolish buildings in a cost-effective manner. In order to demolish a structure and, at the same time, minimize the amount of dust generated by a given technology, an evaluation must be conducted to choose the most appropriate dust suppression technology. Thus, the purpose of this research, which was conducted by the Hemispheric Center for Environmental Technology (HCET) at Florida International University (FIU), was to perform an experimental study of dust aerosol abatement (dust suppression) methods as applied to nuclear D and D. This experimental study specifically targeted the problem of dust suppression during demolition. The resulting data were used in the development of mathematical correlations that can be applied to structural demolition. In the Fiscal Year 1996 (FY96), the effectiveness of different dust suppressing agents was investigated for different types of concrete blocks. Initial tests were conducted in a broad particle size range. In Fiscal Year 1997 (FY97), additional tests were performed in the size range in which most of the particles were detected. Since particle distribution is an important parameter for predicting deposition in various compartments of the human respiratory tract, various tests were aimed at determining the particle size distribution of the airborne dust particles. The effectiveness of dust suppressing agents for particles of various size was studied. Instead of conducting experiments on various types of blocks, it was thought prudent to carry out additional tests on blocks of the same type. Several refinements were also incorporated in the test procedures and data acquisition system used in FY96.

  9. Fungal biosynthesis of gold nanoparticles: mechanism and scale up

    PubMed Central

    Kitching, Michael; Ramani, Meghana; Marsili, Enrico

    2015-01-01

    Gold nanoparticles (AuNPs) are a widespread research tool because of their oxidation resistance, biocompatibility and stability. Chemical methods for AuNP synthesis often produce toxic residues that raise environmental concern. On the other hand, the biological synthesis of AuNPs in viable microorganisms and their cell-free extracts is an environmentally friendly and low-cost process. In general, fungi tolerate higher metal concentrations than bacteria and secrete abundant extracellular redox proteins to reduce soluble metal ions to their insoluble form and eventually to nanocrystals. Fungi harbour untapped biological diversity and may provide novel metal reductases for metal detoxification and bioreduction. A thorough understanding of the biosynthetic mechanism of AuNPs in fungi is needed to reduce the time of biosynthesis and to scale up the AuNP production process. In this review, we describe the known mechanisms for AuNP biosynthesis in viable fungi and fungal protein extracts and discuss the most suitable bioreactors for industrial AuNP biosynthesis. PMID:25154648

  10. Scaling up: Assessing social impacts at the macro-scale

    SciTech Connect

    Schirmer, Jacki

    2011-04-15

    Social impacts occur at various scales, from the micro-scale of the individual to the macro-scale of the community. Identifying the macro-scale social changes that result from an impacting event is a common goal of social impact assessment (SIA), but is challenging as multiple factors simultaneously influence social trends at any given time, and there are usually only a small number of cases available for examination. While some methods have been proposed for establishing the contribution of an impacting event to macro-scale social change, they remain relatively untested. This paper critically reviews methods recommended to assess macro-scale social impacts, and proposes and demonstrates a new approach. The 'scaling up' method involves developing a chain of logic linking change at the individual/site scale to the community scale. It enables a more problematised assessment of the likely contribution of an impacting event to macro-scale social change than previous approaches. The use of this approach in a recent study of change in dairy farming in south east Australia is described.

  11. Scaling Up Data-Centric Middleware on a Cluster Computer

    SciTech Connect

    Liu, D T; Franklin, M J; Garlick, J; Abdulla, G M

    2005-04-29

    Data-centric workflow middleware systems are workflow systems that treat data as first class objects alongside programs. These systems improve the usability, responsiveness and efficiency of workflow execution over cluster (and grid) computers. In this work, we explore the scalability of one such system, GridDB, on cluster computers. We measure the performance and scalability of GridDB in executing data-intensive image processing workflows from the SuperMACHO astrophysics survey on a large cluster computer. Our first experimental study concerns the scale-up of GridDB. We make a rather surprising finding, that while the middleware system issues many queries and transactions to a DBMS, file system operations present the first-tier bottleneck. We circumvent this bottleneck and increase the scalability of GridDB by more than 2-fold on our image processing application (up to 128 nodes). In a second study, we demonstrate the sensitivity of GridDB performance (and therefore application performance) to characteristics of the workflows being executed. To manage these sensitivities, we provide guidelines for trading off the costs and benefits of GridDB at a fine-grain.

  12. Challenges and Opportunities in Scaling-Up Nutrition in Healthcare

    PubMed Central

    Darnton-Hill, Ian; Samman, Samir

    2015-01-01

    Healthcare continues to be in a state of flux; conventionally, this provides opportunities and challenges. The opportunities include technological breakthroughs, improved economies and increasing availability of healthcare. On the other hand, economic disparities are increasing and leading to differing accessibility to healthcare, including within affluent countries. Nutrition has received an increase in attention and resources in recent decades, a lot of it stimulated by the rise in obesity, type 2 diabetes mellitus and hypertension. An increase in ageing populations also has meant increased interest in nutrition-related chronic diseases. In many middle-income countries, there has been an increase in the double burden of malnutrition with undernourished children and overweight/obese parents and adolescents. In low-income countries, an increased evidence base has allowed scaling-up of interventions to address under-nutrition, both nutrition-specific and nutrition-sensitive interventions. Immediate barriers (institutional, structural and biological) and longer-term barriers (staffing shortages where most needed and environmental impacts on health) are discussed. Significant barriers remain for the near universal access to healthcare, especially for those who are socio-economically disadvantaged, geographically isolated, living in war zones or where environmental damage has taken place. However, these barriers are increasingly being recognized, and efforts are being made to address them. The paper aims to take a broad view that identifies and then comments on the many social, political and scientific factors affecting the achievement of improved nutrition through healthcare. PMID:27417744

  13. Polyethylene encapsulation of mixed wastes: Scale-up feasibility

    SciTech Connect

    Kalb, P.D.; Heiser, J.H.; Colombo, P.

    1991-12-31

    A polyethylene process for the improved encapsulation of radioactive, hazardous, and mixed wastes has been developed at Brookhaven National Laboratory (BNL). Improvements in waste loading and waste form performance have been demonstrated through bench-scale development and testing. Maximum waste loadings of up to 70 dry wt % mixed waste nitrate salt were achieved, compared with 13-20 dry wt % using conventional cement processes. Stability under anticipated storage and disposal conditions and compliance with applicable hazardous waste regulations were demonstrated through a series of lab-scale waste form performance tests. Full-scale demonstration of this process using actual or surrogate waste is currently planned. A scale-up feasibility test was successfully conducted, demonstrating the ability to process nitrate salts at production rates (up to 450 kg/hr) and the close agreement between bench- and full-scale process parameters. Cored samples from the resulting pilot-scale (114 liter) waste form were used to verify homogeneity and to provide additional specimens for confirmatory performance testing.

  14. Optical Trapping Techniques Applied to the Study of Cell Membranes

    NASA Astrophysics Data System (ADS)

    Morss, Andrew J.

    Optical tweezers allow for manipulating micron-sized objects using pN level optical forces. In this work, we use an optical trapping setup to aid in three separate experiments, all related to the physics of the cellular membrane. In the first experiment, in conjunction with Brian Henslee, we use optical tweezers to allow for precise positioning and control of cells in suspension to evaluate the cell size dependence of electroporation. Theory predicts that all cells porate at a transmembrane potential VTM of roughly 1 V. The Schwann equation predicts that the transmembrane potential depends linearly on the cell radius r, thus predicting that cells should porate at threshold electric fields that go as 1/r. The threshold field required to induce poration is determined by applying a low voltage pulse to the cell and then applying additional pulses of greater and greater magnitude, checking for poration at each step using propidium iodide dye. We find that, contrary to expectations, cells do not porate at a constant value of the transmembrane potential but at a constant value of the electric field which we find to be 692 V/cm for K562 cells. Delivering precise dosages of nanoparticles into cells is of importance for assessing toxicity of nanoparticles or for genetic research. In the second experiment, we conduct nano-electroporation—a novel method of applying precise doses of transfection agents to cells—by using optical tweezers in conjunction with a confocal microscope to manipulate cells into contact with 100 nm wide nanochannels. This work was done in collaboration with Pouyan Boukany of Dr. Lee's group. The small cross sectional area of these nano channels means that the electric field within them is extremely large, 60 MV/m, which allows them to electrophoretically drive transfection agents into the cell. We find that nano electroporation results in excellent dose control (to within 10% in our experiments) compared to bulk electroporation. We also find that
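
    The Schwann relation referenced above is usually quoted in the steady-state form V_TM = 1.5 E r cos(theta), where the 1.5 prefactor assumes a spherical cell with a non-conducting membrane; at the cell pole the threshold field for a fixed critical potential therefore scales as 1/r. A short numerical sketch (the radii and the 1 V threshold are illustrative):

      # Schwann (steady-state) relation: V_TM = 1.5 * E * r * cos(theta).
      # Threshold field to reach a critical potential V_crit at the pole (theta = 0):
      #   E_th = V_crit / (1.5 * r), i.e. proportional to 1/r.
      V_crit = 1.0                        # volts, the commonly cited ~1 V poration threshold
      for r_um in (5.0, 10.0, 20.0):      # illustrative cell radii in micrometers
          r_m = r_um * 1e-6
          e_th_v_per_cm = V_crit / (1.5 * r_m) / 100.0
          print(f"r = {r_um:4.1f} um  ->  E_th = {e_th_v_per_cm:7.0f} V/cm")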

  15. Remote sensing techniques applied to seismic vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Juan Arranz, Jose; Torres, Yolanda; Hahgi, Azade; Gaspar-Escribano, Jorge

    2016-04-01

    Advances in remote sensing and photogrammetry techniques have increased the degree of accuracy and resolution in the record of the earth's surface. This has expanded the range of possible applications of these data. In this research, we have used these data to document the construction characteristics of the urban environment of Lorca, Spain. An exposure database has been created with the gathered information to be used in seismic vulnerability assessment. To this end, we have used data from photogrammetric flights at different periods, using both orthorectified images in the visible and infrared spectrum. Furthermore, the analysis is completed using LiDAR data. From the combination of these data, it has been possible to delineate the building footprints and characterize the constructions with attributes such as the approximate date of construction, area, type of roof and even building materials. To carry out the calculation, we have developed different algorithms to compare images from different times, segment images, classify LiDAR data, and use the infrared data in order to remove vegetation or to compute roof surfaces with height value, tilt and spectral fingerprint. In addition, the accuracy of our results has been validated with ground truth data. Keywords: LiDAR, remote sensing, seismic vulnerability, Lorca

  16. Time-resolved infrared spectroscopic techniques as applied to channelrhodopsin

    PubMed Central

    Ritter, Eglof; Puskar, Ljiljana; Bartl, Franz J.; Aziz, Emad F.; Hegemann, Peter; Schade, Ulrich

    2015-01-01

    Among optogenetic tools, channelrhodopsins, the light-gated ion channels of the plasma membrane from green algae, play the most important role. Properties like channel selectivity, timing parameters or color can be influenced by the exchange of selected amino acids. Although widely used, in the field of neuroscience for example, little is still known about their photocycles and the mechanism of ion channel gating and conductance. One of the preferred methods for these studies is infrared spectroscopy since it allows observation of proteins and their function at a molecular level and in a near-native environment. The absorption of a photon in channelrhodopsin leads to retinal isomerization within femtoseconds, the conductive states are reached in the microsecond time scale and the return into the fully dark-adapted state may take more than minutes. To be able to cover all these time regimes, a range of different spectroscopic approaches is necessary. This mini-review focuses on time-resolved applications of the infrared technique to study channelrhodopsins and other light-triggered proteins. We will discuss the approaches with respect to their suitability to the investigation of channelrhodopsin and related proteins. PMID:26217670

  17. Sputtering as a Technique for Applying Tribological Coatings

    NASA Technical Reports Server (NTRS)

    Ramalingam, S.

    1984-01-01

    Friction and wear-induced mechanical failures may be controlled to extend the life of tribological components through the interposition of selected solid materials between contacting surfaces. Thin solid films of soft and hard materials are appropriate to lower friction and enhance the wear resistance of precision tribo-elements. Tribological characteristics of thin hard coats deposited on a variety of ferrous and non-ferrous substrates were tested. The thin hard coats used were titanium nitride films deposited by reactive magnetron sputtering of metallic titanium. High contact stress, low speed tests showed wear rate reductions of one or more orders of magnitude, even with films a few micrometers in thickness. Low contact stress, high speed tests carried out under rather severe test conditions showed that thin films of TiN afforded significant friction reduction and wear protection. Thin hard coats were shown to improve the friction and wear performance of rolling contacts. Satisfactory film-to-substrate adhesion strengths can be obtained with reactive magnetron sputtering. X-ray diffraction and microhardness tests were employed to assess the effectiveness of the sputtering technique.

  18. Cleaning techniques for applied-B ion diodes

    SciTech Connect

    Cuneo, M.E.; Menge, P.R.; Hanson, D.L.

    1995-09-01

    Measurements and theoretical considerations indicate that the lithium-fluoride (LiF) lithium ion source operates by electron-assisted field-desorption, and provides a pure lithium beam for 10-20 ns. Evidence on both the SABRE (1 TW) and PBFA-II (20 TW) accelerators indicates that the lithium beam is replaced by a beam of protons and carbon resulting from electron thermal desorption of hydrocarbon surface and bulk contamination with subsequent avalanche ionization. Appearance of contaminant ions in the beam is accompanied by rapid impedance collapse, possibly resulting from loss of magnetic insulation in the rapidly expanding and ionizing neutral layer. Electrode surface and source substrate cleaning techniques are being developed on the SABRE accelerator to reduce beam contamination, plasma formation, and impedance collapse. We have increased lithium current density by a factor of 3 and lithium energy by a factor of 5 through a combination of in-situ surface and substrate coatings, impermeable substrate coatings, and field profile modifications.

  19. Digital prototyping technique applied for redesigning plastic products

    NASA Astrophysics Data System (ADS)

    Pop, A.; Andrei, A.

    2015-11-01

    After products are on the market for some time, they often need to be redesigned to meet new market requirements. New products are generally derived from similar but outdated products. Redesigning a product is an important part of the production and development process. The purpose of this paper is to show that using modern technology, like Digital Prototyping, in industry is an effective way to produce new products. This paper tries to demonstrate and highlight the effectiveness of the concept of Digital Prototyping, both to reduce the design time of a new product and to reduce the costs required for implementing this step. The results of this paper show that using Digital Prototyping techniques to design a new product mould from an existing one available on the market offers a significant reduction in manufacturing time and cost. The ability to simulate and test a new product with modern CAD-CAM programs in all aspects of production (designing of the 3D model, simulation of the structural resistance, analysis of the injection process and beautification) offers a helpful tool for engineers. The whole process can be realised by one skilled engineer very quickly and effectively.

  20. Semantic Data And Visualization Techniques Applied To Geologic Field Mapping

    NASA Astrophysics Data System (ADS)

    Houser, P. I. Q.; Royo-Leon, M.; Munoz, R.; Estrada, E.; Villanueva-Rosales, N.; Pennington, D. D.

    2015-12-01

    Geologic field mapping involves the use of technology before, during, and after visiting a site. Geologists utilize hardware such as Global Positioning Systems (GPS) connected to mobile computing platforms such as tablets that include software such as ESRI's ArcPad and other software to produce maps and figures for a final analysis and report. Handwritten field notes contain important information and drawings or sketches of specific areas within the field study. Our goal is to collect and geo-tag final and raw field data into a cyber-infrastructure environment with an ontology that allows for large data processing, visualization, sharing, and searching, aiding in connecting field research with prior research in the same area and/or aiding with experiment replication. Online searches of a specific field area return results such as weather data from NOAA and QuakeML seismic data from USGS. These results can then be saved to a field mobile device and searched while in the field where there is no Internet connection. To accomplish this, we created the GeoField ontology service using the Web Ontology Language (OWL) and Protégé software. Advanced queries on the dataset can be made using reasoning capabilities that go beyond those of a standard database service. These improvements include the automated discovery of data relevant to a specific field site and visualization techniques aimed at enhancing analysis and collaboration while in the field by draping data over mobile views of the site using augmented reality. A case study is being performed at University of Texas at El Paso's Indio Mountains Research Station located near Van Horn, Texas, an active multi-disciplinary field study site. The user can interactively move the camera around the study site and view their data digitally. Geologists can check their data against the site in real time and improve collaboration with another person as both parties have the same interactive view of the data.

  1. Applying Realtime Intelligence Acquisition Techniques To Problems In Resource Management

    NASA Astrophysics Data System (ADS)

    Greer, Jerry D.

    1989-02-01

    Most people see little similarity between a battlefield manager and a natural resource manager. However, except for the element of time, many striking similarities may be drawn. Indeed, there are more differences between the tranquil scenes of mountain scenery, forests, rivers or grasslands and bomb scarred battlefields where survival is often the prime objective. The similarities center around the basic need for information upon which good decisions may be made. Both managers of battlefields and of natural resources require accurate, timely, and continuous information about changing conditions. Based on this information, they each make decisions to conserve the materials and resources under their charge. Their common goal is to serve the needs of the people in their society. On the one hand, the goal is victory in battle to perpetuate a way of life or a political system. On the other, the goal is victory in an ongoing battle against fire, insects, disease, soil erosion, vandalism, theft, and misuse in general. Here, a desire to maintain natural resources in a productive and healthy condition prevails. The objective of the natural resource manager is to keep natural resources in such a condition that they will continue to meet the needs and wants of the people who claim them for their common good. In this paper, the different needs for information are compared and a little history of some of the quasi-military aspects of resource management is given. Needs for information are compared and current uses of data acquisition techniques are reviewed. Similarities and differences are discussed and future opportunities for cooperation in data acquisition are outlined.

  2. Robustness of speckle imaging techniques applied to horizontal imaging scenarios

    NASA Astrophysics Data System (ADS)

    Bos, Jeremy P.

    Atmospheric turbulence near the ground severely limits the quality of imagery acquired over long horizontal paths. In defense, surveillance, and border security applications, there is interest in deploying man-portable, embedded systems incorporating image reconstruction to improve the quality of imagery available to operators. To be effective, these systems must operate over significant variations in turbulence conditions while also being subject to other variations due to operation by novice users. Systems that meet these requirements and are otherwise designed to be immune to the factors that cause variation in performance are considered robust. In addition to robustness in design, the portable nature of these systems implies a preference for systems with a minimum level of computational complexity. Speckle imaging methods are one of a variety of methods recently proposed for use in man-portable horizontal imagers. In this work, the robustness of speckle imaging methods is established by identifying a subset of design parameters that provide immunity to the expected variations in operating conditions while minimizing the computation time necessary for image recovery. This performance evaluation is made possible using a novel technique for simulating anisoplanatic image formation. I find that incorporating as few as 15 image frames and 4 estimates of the object phase per reconstructed frame provides an average 45% reduction in Mean Squared Error (MSE) and a 68% reduction in the deviation of the MSE. In addition, the Knox-Thompson phase recovery method is demonstrated to produce images in half the time required by the bispectrum. Finally, it is shown that certain blind image quality metrics can be used in place of the MSE to evaluate reconstruction quality in field scenarios. Using blind metrics rather than depending on user estimates allows for reconstruction quality that differs from the minimum MSE by as little as 1%, significantly reducing the deviation in
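
    As a minimal illustration of the Mean Squared Error figure of merit quoted above, the sketch below compares a single noisy frame against an average of 15 frames; the additive-noise model stands in for, and is far simpler than, the anisoplanatic simulation used in the work.

# Illustrative sketch: the Mean Squared Error (MSE) figure of merit applied to a
# simple average of simulated frames. Additive noise stands in for turbulence
# here; it is far simpler than the anisoplanatic simulation used in the work.
import numpy as np

rng = np.random.default_rng(1)
truth = rng.random((64, 64))  # synthetic "pristine" object

def mse(a, b):
    return np.mean((a - b) ** 2)

def average_of_frames(n_frames, noise_sigma=0.3):
    frames = truth + rng.normal(0.0, noise_sigma, size=(n_frames, *truth.shape))
    return frames.mean(axis=0)

single_frame = average_of_frames(1)
multi_frame = average_of_frames(15)  # 15 frames, as in the abstract
reduction = 100.0 * (1.0 - mse(multi_frame, truth) / mse(single_frame, truth))
print(f"MSE reduction from averaging 15 frames: {reduction:.0f}%")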

  3. Scaling-up ultrasound standing wave enhanced sedimentation filters.

    PubMed

    Prest, Jeff E; Treves Brown, Bernard J; Fielden, Peter R; Wilkinson, Stephen J; Hawkes, Jeremy J

    2015-02-01

    Particle concentration and filtration is a key stage in a wide range of processing industries and also one that can present challenges for high-throughput, continuous operation. Here we demonstrate some features which increase the efficiency of ultrasound enhanced sedimentation and could give the technology the potential to be scaled up. In this work, 20 mm piezoelectric plates were used to drive 100 mm high chambers formed from single structural elements. The coherent structural resonances were able to drive particles (yeast cells) in the water to nodes throughout the chamber. Ultrasound enhanced sedimentation was used to demonstrate the efficiency of the system (>99% particle clearance). Sub-wavelength pin protrusions were used for the contacts between the resonant chamber and other elements. The pins provided support and transferred power, replacing glue, which is inefficient for power transfer. Filtration energies of ∼4 J/ml of suspension were measured. A calculation of thermal convection indicates that the circulation could disrupt cell alignment in ducts >35 mm high when a 1 K temperature gradient is present; we predict higher efficiencies when this maximum height is observed. For the acoustic design, although modelling was minimal before construction, the very simple construction allowed us to form 3D models of the nodal patterns in the fluid and the duct structure. The models were compared with visual observations of particle movement, Chladni figures and scanning laser vibrometer mapping. This demonstrates that nodal planes in the fluid can be controlled by the position of clamping points and that the contacts could be positioned to increase the efficiency and reliability of particle manipulations in standing waves. PMID:25193111
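
    For orientation, the pressure nodes that collect the particles are spaced half an acoustic wavelength apart, lambda/2 = c/(2f). The sketch below uses assumed drive frequencies (the abstract specifies only the transducer and chamber dimensions) and restates the reported ~4 J/mL filtration energy on a per-litre basis.

# Illustrative sketch with assumed numbers: pressure nodes in a standing wave are
# spaced half a wavelength apart, lambda / 2 = c / (2 * f). The drive frequencies
# below are assumptions for illustration; the reported energy cost is ~4 J/mL.
SPEED_OF_SOUND_WATER = 1480.0  # m/s near room temperature

def node_spacing_mm(frequency_hz, c=SPEED_OF_SOUND_WATER):
    return 1e3 * c / (2.0 * frequency_hz)

for f_mhz in (1.0, 2.0, 3.0):  # illustrative frequencies
    print(f"{f_mhz:.0f} MHz -> node spacing ~{node_spacing_mm(f_mhz * 1e6):.2f} mm")

# Energy to clarify one litre of suspension at the reported ~4 J/mL:
energy_kj_per_litre = 4.0 * 1000.0 / 1000.0  # (J/mL) * (mL/L) / (J per kJ)
print(f"Energy per litre: ~{energy_kj_per_litre:.0f} kJ")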

  4. Beginning with sustainable scale up in mind: initial results from a population, health and environment project in East Africa.

    PubMed

    Ghiron, Laura; Shillingi, Lucy; Kabiswa, Charles; Ogonda, Godfrey; Omimo, Antony; Ntabona, Alexis; Simmons, Ruth; Fajans, Peter

    2014-05-01

    Small-scale pilot projects have demonstrated that integrated population, health and environment approaches can address the needs and rights of vulnerable communities. However, these and other types of health and development projects have rarely gone on to influence larger policy and programme development. ExpandNet, a network of health professionals working on scaling up, argues this is because projects are often not designed with future sustainability and scaling up in mind. Developing and implementing sustainable interventions that can be applied on a larger scale requires a different mindset and new approaches to small-scale/pilot testing. This paper shows how this new approach is being applied and the initial lessons from its use in the Health of People and Environment in the Lake Victoria Basin Project currently underway in Uganda and Kenya. Specific lessons that are emerging are: 1) ongoing, meaningful stakeholder engagement has significantly shaped the design and implementation, 2) multi-sectoral projects are complex and striving for simplicity in the interventions is challenging, and 3) projects that address a sharply felt need experience substantial pressure for scale-up, even before their effectiveness is established. Implicit in this paper is the recommendation that other projects would also benefit from applying a scale-up perspective from the outset. PMID:24908459

  5. Using Advanced Modeling to Accelerate the Scale-Up of Carbon Capture Technologies

    SciTech Connect

    Miller, David; Sun, Xin; Storlie, Curtis; Bhattacharyya, Debangsu

    2015-06-18

    Carbon capture and storage (CCS) is one of many approaches that are critical for significantly reducing domestic and global CO2 emissions. The U.S. Department of Energy’s Clean Coal Technology Program Plan envisions 2nd generation CO2 capture technologies ready for demonstration-scale testing around 2020 with the goal of enabling commercial deployment by 2025 [1]. Third generation technologies have a similarly aggressive timeline. A major challenge is that the development and scale-up of new technologies in the energy sector historically takes up to 15 years to move from the laboratory to pre-deployment and another 20 to 30 years for widespread industrial scale deployment. In order to help meet the goals of the DOE carbon capture program, the Carbon Capture Simulation Initiative (CCSI) was launched in early 2011 to develop, demonstrate, and deploy advanced computational tools and validated multi-scale models to reduce the time required to develop and scale up new carbon capture technologies. The CCSI Toolset (1) enables promising concepts to be more quickly identified through rapid computational screening of processes and devices, (2) reduces the time to design and troubleshoot new devices and processes by using optimization techniques to focus development on the best overall process conditions and by using detailed device-scale models to better understand and improve the internal behavior of complex equipment, and (3) provides quantitative predictions of device and process performance during scale up based on rigorously validated smaller scale simulations that take into account model and parameter uncertainty[2]. This article focuses on essential elements related to the development and validation of multi-scale models in order to help minimize risk and maximize learning as new technologies progress from pilot to demonstration scale.

  6. Scaling up debris-flow experiments on a centrifuge

    NASA Astrophysics Data System (ADS)

    Hung, C.; Capart, H.; Crone, T. J.; Grinspum, E.; Hsu, L.; Kaufman, D.; Li, L.; Ling, H.; Reitz, M. D.; Smith, B.; Stark, C. P.

    2013-12-01

    Boundary forces generated by debris flows can be powerful enough to erode bedrock and cause considerable damage to infrastructure during runout. Formulation of an erosion-rate law for debris flows is therefore a high priority, and it makes sense to build such a law around laboratory experiments. However, running experiments big enough to generate realistic boundary forces is a logistical challenge to say the least [1]. One alternative is to run table-top simulations with unnaturally weak but fast-eroding pseudo-bedrock; another is to extrapolate from micro-erosion of natural substrates driven by unnaturally weak impacts; hybrid-scale experiments have also been conducted [2]. Here we take a different approach in which we scale up granular impact forces by running our experiments under enhanced gravity in a geotechnical centrifuge [3]. Using a 40 cm diameter rotating drum [2] spun at up to 100g, we generate debris flows with an effective depth of several meters. By varying effective gravity from 1g to 100g, we explore the scaling of granular flow forces and the consequent bed and wall erosion rates. The velocity and density structure of these granular flows is monitored using laser sheets, high-speed video, and particle tracking [4], and the progressive erosion of the boundary surfaces is measured by laser scanning. The force structures and their fluctuations within the granular mass and at the boundaries are explored with contact dynamics numerical simulations that mimic the lab experimental conditions [5]. In this presentation we summarize these results and discuss how they can contribute to the formulation of a debris-flow erosion law. [1] Major, J. J. (1997), Journal of Geology 105: 345-366, doi:10.1086/515930 [2] Hsu, L. (2010), Ph.D. thesis, University of California, Berkeley [3] Brucks, A., et al (2007), Physical Review E 75, 032301, doi:10.1103/PhysRevE.75.032301 [4] Spinewine, B., et al (2011), Experiments in Fluids 50: 1507-1525, doi: 10.1007/s00348
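
    The scale-up logic, stated loosely, is that matching the boundary stress scale rho*g*h lets a shallow flow spun at N times gravity stand in for a much deeper flow at 1 g. The sketch below assumes a nominal 5 cm flow depth inside the drum purely for illustration; it is not a value reported in the abstract.

# Illustrative sketch of the centrifuge-scaling argument (assumed flow depth):
# matching the boundary stress scale rho * g * h implies that a flow of depth h
# spun at N x g behaves like a flow of effective depth ~ N * h at 1 g.
def effective_depth_m(flow_depth_cm, g_level):
    return flow_depth_cm * 1e-2 * g_level

ASSUMED_FLOW_DEPTH_CM = 5.0  # assumed granular flow depth inside the 40 cm drum

for g_level in (1, 10, 100):
    depth = effective_depth_m(ASSUMED_FLOW_DEPTH_CM, g_level)
    print(f"{g_level:3d} g -> effective depth ~{depth:5.2f} m")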

  7. Scale up tools in reactive extrusion and compounding processes. Could 1D-computer modeling be helpful?

    NASA Astrophysics Data System (ADS)

    Pradel, J.-L.; David, C.; Quinebèche, S.; Blondel, P.

    2014-05-01

    Industrial scale-up (or scale down) in Compounding and Reactive Extrusion processes is one of the most critical R&D challenges. Indeed, most high-performance polymers are obtained through reactive compounding involving chemistry: free radical grafting, in situ compatibilization, rheology control... but also side reactions: oxidation, branching, chain scission... As described by basic Arrhenius and kinetics laws, the competition between all chemical reactions depends on residence time distribution and temperature. Then, to ensure the best possible scale-up methodology, we need tools to match the thermal history of the formulation along the screws from a lab-scale twin-screw extruder to an industrial one. This paper proposes a comparison between standard scale-up laws and the use of computer modeling software such as Ludovic®, applied and compared to experimental data. Scaling data from one compounding line to another, applying general rules (for example at constant specific mechanical energy), shows differences between experimental and computed data, with the error depending on the screw speed range. For more accurate prediction, 1D computer modeling could be used to optimize the process conditions to ensure the best scale-up product, especially in temperature-sensitive reactive extrusion processes. When the product temperature along the screws is the key, Ludovic® software could help to compute the temperature profile along the screws and extrapolate conditions, even screw profile, on industrial extruders.
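
    One of the "general rules" mentioned above can be written down directly. The sketch below applies the common volumetric scale-up Q2 = Q1 (D2/D1)^3 at constant screw speed, with the specific mechanical energy (motor power per unit throughput) held constant; the screw diameters and lab throughput are assumed values, not taken from the paper.

# Illustrative sketch of one "general rule" (assumed diameters and throughput):
# volumetric scale-up of twin-screw throughput, Q2 = Q1 * (D2 / D1)**3 at constant
# screw speed, often combined with holding the specific mechanical energy
# (SME = motor power / throughput) constant.
def scaled_throughput(q1_kg_h, d1_mm, d2_mm, exponent=3.0):
    return q1_kg_h * (d2_mm / d1_mm) ** exponent

q_lab = 10.0  # kg/h on an assumed 26 mm lab-scale extruder
q_prod = scaled_throughput(q_lab, d1_mm=26.0, d2_mm=58.0)
print(f"Predicted throughput on an assumed 58 mm extruder: {q_prod:.0f} kg/h")
print(f"At constant SME, motor power scales by the same factor: {q_prod / q_lab:.1f}x")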

  8. Validation and scale-up of plasmid DNA purification by phenyl-boronic acid chromatography.

    PubMed

    Gomes, A Gabriela; Azevedo, Ana M; Aires-Barros, M Raquel; Prazeres, D Miguel F

    2012-11-01

    This study addresses the feasibility of scaling-up the removal of host cell impurities from plasmid DNA (pDNA)-containing Escherichia coli lysates by phenyl-boronic (PB) acid chromatography using columns packed with 7.6 and 15.2 cm³ of controlled porous glass beads (CPG) derivatized with PB ligands. Equilibration was performed with water at 10 cm³/min and no conditioning of the lysate feed was required. At a ratio of lysate feed to adsorbent volume of 1.3, 93-96% of pDNA was recovered in the flow-through while 66-71% of impurities remained bound (~2.5-fold purification). The entire sequence of loading, washing, elution, and re-equilibration was completed in 20 min. Run-to-run consistency was observed in terms of chromatogram features and performance (yield, purification factor, agarose electrophoresis) across the different amounts of adsorbent (0.75-15.2 cm³) by performing successive injections of lysates prepared independently and containing 3.7 or 6.1 kbp plasmids. The column productivity at large scale was 4 dm³ of alkaline lysate per hour per dm³ of PB-CPG resin. The method is rapid, reproducible, simple, and straightforward to scale up. Furthermore, it is capable of handling heavily contaminated samples, constituting a good alternative to purification techniques such as isopropanol precipitation, aqueous two-phase systems, and tangential flow filtration. PMID:23175141

  9. Semantic Representation and Scale-Up of Integrated Air Traffic Management Data

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Ranjan, Shubha; Wei, Mie; Eshow, Michelle

    2016-01-01

    Each day, the global air transportation industry generates a vast amount of heterogeneous data from air carriers, air traffic control providers, and secondary aviation entities handling baggage, ticketing, catering, fuel delivery, and other services. Generally, these data are stored in isolated data systems, separated from each other by significant political, regulatory, economic, and technological divides. These realities aside, integrating aviation data into a single, queryable, big data store could enable insights leading to major efficiency, safety, and cost advantages. In this paper, we describe an implemented system for combining heterogeneous air traffic management data using semantic integration techniques. The system transforms data from its original disparate source formats into a unified semantic representation within an ontology-based triple store. Our initial prototype stores only a small sliver of air traffic data covering one day of operations at a major airport. The paper also describes our analysis of difficulties ahead as we prepare to scale up data storage to accommodate successively larger quantities of data -- eventually covering all US commercial domestic flights over an extended multi-year timeframe. We review several approaches to mitigating scale-up related query performance concerns.
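
    As a toy illustration of the ontology-based triple-store approach (using a hypothetical namespace and properties, not the ontology described in the paper), a few flight facts can be loaded into an in-memory RDF graph and answered with a SPARQL query.

# Illustrative sketch with a hypothetical namespace and properties (not the NASA
# ontology): a few flight facts loaded into an in-memory RDF triple store with
# rdflib and answered by a SPARQL query.
from rdflib import Graph, Literal, Namespace, RDF

ATM = Namespace("http://example.org/atm#")  # hypothetical ontology namespace
g = Graph()

flight = ATM["flight/UA100"]
g.add((flight, RDF.type, ATM.Flight))
g.add((flight, ATM.departureAirport, Literal("SFO")))
g.add((flight, ATM.taxiOutMinutes, Literal(21)))

query = """
PREFIX atm: <http://example.org/atm#>
SELECT ?flight ?taxi WHERE {
    ?flight a atm:Flight ;
            atm:departureAirport "SFO" ;
            atm:taxiOutMinutes ?taxi .
}
"""
for row in g.query(query):
    print(row.flight, row.taxi)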

  10. Scale-up of the production of highly reactive biogenic magnetite nanoparticles using Geobacter sulfurreducens.

    PubMed

    Byrne, J M; Muhamadali, H; Coker, V S; Cooper, J; Lloyd, J R

    2015-06-01

    Although there are numerous examples of large-scale commercial microbial synthesis routes for organic bioproducts, few studies have addressed the obvious potential for microbial systems to produce inorganic functional biomaterials at scale. Here we address this by focusing on the production of nanoscale biomagnetite particles by the Fe(III)-reducing bacterium Geobacter sulfurreducens, which was scaled up successfully from laboratory- to pilot plant-scale production, while maintaining the surface reactivity and magnetic properties which make this material well suited to commercial exploitation. At the largest scale tested, the bacterium was grown in a 50 l bioreactor, harvested and then inoculated into a buffer solution containing Fe(III)-oxyhydroxide and an electron donor and mediator, which promoted the formation of magnetite in under 24 h. This procedure was capable of producing up to 120 g of biomagnetite. The particle size distribution was maintained between 10 and 15 nm during scale-up of this second step from 10 ml to 10 l, with conserved magnetic properties and surface reactivity; the latter demonstrated by the reduction of Cr(VI). The process presented provides an environmentally benign route to magnetite production and serves as an alternative to harsher synthetic techniques, with the clear potential to be used to produce kilogram to tonne quantities. PMID:25972437

  11. Scale-up of the production of highly reactive biogenic magnetite nanoparticles using Geobacter sulfurreducens

    PubMed Central

    Byrne, J. M.; Muhamadali, H.; Coker, V. S.; Cooper, J.; Lloyd, J. R.

    2015-01-01

    Although there are numerous examples of large-scale commercial microbial synthesis routes for organic bioproducts, few studies have addressed the obvious potential for microbial systems to produce inorganic functional biomaterials at scale. Here we address this by focusing on the production of nanoscale biomagnetite particles by the Fe(III)-reducing bacterium Geobacter sulfurreducens, which was scaled up successfully from laboratory- to pilot plant-scale production, while maintaining the surface reactivity and magnetic properties which make this material well suited to commercial exploitation. At the largest scale tested, the bacterium was grown in a 50 l bioreactor, harvested and then inoculated into a buffer solution containing Fe(III)-oxyhydroxide and an electron donor and mediator, which promoted the formation of magnetite in under 24 h. This procedure was capable of producing up to 120 g of biomagnetite. The particle size distribution was maintained between 10 and 15 nm during scale-up of this second step from 10 ml to 10 l, with conserved magnetic properties and surface reactivity; the latter demonstrated by the reduction of Cr(VI). The process presented provides an environmentally benign route to magnetite production and serves as an alternative to harsher synthetic techniques, with the clear potential to be used to produce kilogram to tonne quantities. PMID:25972437

  12. Scale-up of hydrophobin-assisted recombinant protein production in tobacco BY-2 suspension cells.

    PubMed

    Reuter, Lauri J; Bailey, Michael J; Joensuu, Jussi J; Ritala, Anneli

    2014-05-01

    Plant suspension cell cultures are emerging as an alternative to mammalian cells for production of complex recombinant proteins. Plant cell cultures provide low production cost, intrinsic safety and adherence to current regulations, but low yields and costly purification technology hinder their commercialization. Fungal hydrophobins have been utilized as fusion tags to improve yields and facilitate efficient low-cost purification by surfactant-based aqueous two-phase separation (ATPS) in plant, fungal and insect cells. In this work, we report the utilization of hydrophobin fusion technology in the tobacco bright yellow 2 (BY-2) suspension cell platform and the establishment of pilot-scale propagation and downstream processing, including first-step purification by ATPS. Green fluorescent protein-hydrophobin fusion (GFP-HFBI) induced the formation of protein bodies in tobacco suspension cells, thus encapsulating the fusion protein into discrete compartments. Cultivation of the BY-2 suspension cells was scaled up in standard stirred tank bioreactors up to 600 L production volume, with no apparent change in growth kinetics. Subsequently, ATPS was applied to selectively capture the GFP-HFBI product from crude cell lysate, resulting in threefold concentration, good purity and up to 60% recovery. The ATPS was scaled up to 20 L volume, without loss of efficiency. This study provides the first proof of concept for large-scale hydrophobin-assisted production of recombinant proteins in tobacco BY-2 cell suspensions. PMID:24341724

  13. Microbial electrolysis cell scale-up for combined wastewater treatment and hydrogen production.

    PubMed

    Gil-Carrera, L; Escapa, A; Mehta, P; Santoyo, G; Guiot, S R; Morán, A; Tartakovsky, B

    2013-02-01

    This study demonstrates microbial electrolysis cell (MEC) scale-up from a 50 mL to a 10 L cell. Initially, a 50 mL membraneless MEC with a gas diffusion cathode was operated on synthetic wastewater at different organic loads. It was concluded that process scale-up might be best accomplished using a "reactor-in-series" concept. Consequently, 855 mL and 10 L MECs were built and operated. By optimizing the hydraulic retention time (HRT) of the 855 mL MEC and individually controlling the applied voltages of three anodic compartments with a real-time optimization algorithm, a COD removal of 5.7 g L(R)(-1) d(-1) and a hydrogen production of 1.0-2.6 L L(R)(-1) d(-1) was achieved. Furthermore, a 10 L setup with two MECs in series was constructed and operated on municipal wastewater. This test showed a COD removal rate of 0.5 g L(R)(-1) d(-1), a removal efficiency of 60-76%, and an energy consumption of 0.9 Wh per g of COD removed. PMID:23334014

  14. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  15. Scale-up and advanced performance analysis of boiler combustion chambers

    SciTech Connect

    Richter, W.

    1985-12-31

    This paper discusses methods for evaluation of thermal performance of large boiler furnaces. Merits and limitations of pilot-scale testing and mathematical modeling are pointed out. Available computer models for furnace performance predictions are reviewed according to their classification into finite-difference methods and zone methods. Current state-of-the-art models for industrial application are predominantly zone methods based on advanced Monte Carlo-type techniques for calculation of radiation heat transfer. A representative of this model type is described in more detail, together with examples of its practical application. It is also shown how pilot-scale results can be scaled up with the help of the model to predict the full-scale performance of particular boiler furnaces.

  16. Scale-up of phosphate remobilization from sewage sludge in a microbial fuel cell.

    PubMed

    Happe, Manuel; Sugnaux, Marc; Cachelin, Christian Pierre; Stauffer, Marc; Zufferey, Géraldine; Kahoun, Thomas; Salamin, Paul-André; Egli, Thomas; Comninellis, Christos; Grogg, Alain-François; Fischer, Fabian

    2016-01-01

    Phosphate remobilization from digested sewage sludge containing iron phosphate was scaled up in a microbial fuel cell (MFC). A 3-litre triple-chambered MFC was constructed. This reactor was operated as a microbial fuel cell and later as a microbial electrolysis cell to accelerate cathodic phosphate remobilization. Applying an additional voltage and exceeding native MFC power accelerated chemical base formation and the related phosphate remobilization rate. The electrolysis approach was extended using a platinum-RVC cathode. The pH rose to 12.6 and 67% of the phosphate was recovered in 26 h. This was significantly faster than using microbial fuel cell conditions. Shrinking-core modelling of particle-fluid kinetics showed that the reaction resistance has to move inside the sewage sludge particle for considerable rate enhancement. Remobilized phosphate was subsequently precipitated as struvite, and inductively coupled plasma mass spectrometry indicated low levels of cadmium, lead, and other metals as required by law for recycling fertilizers. PMID:26519694

  17. Scale Up of Pan Coating Process Using Quality by Design Principles.

    PubMed

    Agrawal, Anjali M; Pandey, Preetanshu

    2015-11-01

    Scale up of pan coating process is of high importance to the pharmaceutical and food industry. The number of process variables and their interdependence in a pan coating process can make it a rather complex scale-up problem. This review discusses breaking down the coating process variables into three main categories: pan-related, spray-related, and thermodynamic-related factors. A review on how to scale up each of these factors is presented via two distinct strategies--"macroscopic" and "microscopic" scale-up. In a Quality by Design paradigm, where an increased process understanding is required, there is increased emphasis on "microscopic" scale-up, which by definition ensures a more reproducible process and thereby robust scale-up. This article also reviews the various existing and new modeling and process analytical technology tools that can provide additional information to facilitate a more fundamental understanding of the coating process. PMID:26202540

  18. Microalgal biohydrogen production considering light energy and mixing time as the two key features for scale-up.

    PubMed

    Oncel, S; Sabankay, M

    2012-10-01

    This study focuses on a scale-up procedure considering two vital parameters, light energy and mixing, for microalgae cultivation, taking Chlamydomonas reinhardtii as the model microorganism. Applying a two-stage hydrogen production protocol to 1 L flat-type and 2.5 L tank-type photobioreactors, hydrogen production was investigated with constant light energy and mixing time. The conditions that provide the shortest transfer time to anaerobic culture (light energy: 2.96 kJ s(-1) m(-3) and mixing time: 1 min) and the highest hydrogen production rate (light energy: 1.22 kJ s(-1) m(-3) and mixing time: 2.5 min) were applied to a 5 L photobioreactor. The final hydrogen production for the 5 L system after 192 h was measured as 195 ± 10 mL, which is comparable with the other systems and provides a good validation of the scale-up procedure. PMID:22858490

  19. Imaging techniques applied to the study of fluids in porous media

    SciTech Connect

    Tomutsa, L.; Brinkmeyer, A.; Doughty, D.

    1993-04-01

    A synergistic rock characterization methodology has been developed. It derives reservoir engineering parameters from X-ray tomography (CT) scanning, computer-assisted petrographic image analysis, minipermeameter measurements, and nuclear magnetic resonance imaging (NMRI). This rock characterization methodology is used to investigate the effect of small-scale rock heterogeneity on oil distribution and recovery. It is also used to investigate the applicability of imaging technologies to the development of scaleup procedures from core plug to whole core, by comparing the results of detailed simulations with the images of the fluid distributions observed by CT scanning. By using the detailed rock and fluid data generated by the imaging technology described, one can verify directly, in the laboratory, various scaling up techniques. As an example, realizations of rock properties statistically and spatially compatible with the observed values are generated by one of the various stochastic methods available (turning bands) and are used as simulator input. The simulation results were compared with both the simulation results using the true rock properties and the fluid distributions observed by CT. Conclusions regarding the effect of the various permeability models on waterflood oil recovery were formulated.

  20. SCALE-UP OF ADVANCED HOT-GAS DESULFURIZATION SORBENTS

    SciTech Connect

    K. JOTHIMURUGESAN; S.K. GANGWAL

    1998-03-01

    The objective of this study was to develop advanced regenerable sorbents for hot gas desulfurization in IGCC systems. The specific objective was to develop durable advanced sorbents that demonstrate a strong resistance to attrition and chemical deactivation, and high sulfidation activity at temperatures as low as 343 C (650 F). Twenty sorbents were synthesized in this work. Details of the preparation technique and the formulations are proprietary, pending a patent application; thus, no details regarding the technique are divulged in this report. Sulfidations were conducted with a simulated gas containing (vol %) 10 H2, 15 CO, 5 CO2, 0.4-1 H2S, 15 H2O, and balance N2 in the temperature range of 343-538 C. Regenerations were conducted at temperatures in the range of 400-600 C with air-N2 mixtures. To prevent sulfation, catalyst additives were investigated that promote regeneration at lower temperatures. Characterization was performed for fresh, sulfided and regenerated sorbents.

  1. What are the barriers to scaling up health interventions in low and middle income countries? A qualitative study of academic leaders in implementation science

    PubMed Central

    2012-01-01

    Background Most low and middle income countries (LMICs) are currently not on track to reach the health-related Millennium Development Goals (MDGs). One way to accelerate progress would be through the large-scale implementation of evidence-based health tools and interventions. This study aimed to: (a) explore the barriers that have impeded such scale-up in LMICs, and (b) lay out an “implementation research agenda”—a series of key research questions that need to be addressed in order to help overcome such barriers. Methods Interviews were conducted with fourteen key informants, all of whom are academic leaders in the field of implementation science, who were purposively selected for their expertise in scaling up in LMICs. Interviews were transcribed by hand and manually coded to look for emerging themes related to the two study aims. Barriers to scaling up, and unanswered research questions, were organized into six categories, representing different components of the scaling up process: attributes of the intervention; attributes of the implementers; scale-up approach; attributes of the adopting community; socio-political, fiscal, and cultural context; and research context. Results Factors impeding the success of scale-up that emerged from the key informant interviews, and which are areas for future investigation, include: complexity of the intervention and lack of technical consensus; limited human resource, leadership, management, and health systems capacity; poor application of proven diffusion techniques; lack of engagement of local implementers and of the adopting community; and inadequate integration of research into scale-up efforts. Conclusions Key steps in expanding the evidence base on implementation in LMICs include studying how to: simplify interventions; train “scale-up leaders” and health workers dedicated to scale-up; reach and engage communities; match the best delivery strategy to the specific health problem and context; and raise the low

  2. Scaling up watershed model parameters--Flow and load simulations of the Edisto River Basin

    USGS Publications Warehouse

    Feaster, Toby D.; Benedict, Stephen T.; Clark, Jimmy M.; Bradley, Paul M.; Conrads, Paul A.

    2014-01-01

    The Edisto River is the longest and largest river system completely contained in South Carolina and is one of the longest free flowing blackwater rivers in the United States. The Edisto River basin also has fish-tissue mercury concentrations that are some of the highest recorded in the United States. As part of an effort by the U.S. Geological Survey to expand the understanding of relations among hydrologic, geochemical, and ecological processes that affect fish-tissue mercury concentrations within the Edisto River basin, analyses and simulations of the hydrology of the Edisto River basin were made with the topography-based hydrological model (TOPMODEL). The potential for scaling up a previous application of TOPMODEL for the McTier Creek watershed, which is a small headwater catchment to the Edisto River basin, was assessed. Scaling up was done in a step-wise process beginning with applying the calibration parameters, meteorological data, and topographic wetness index data from the McTier Creek TOPMODEL to the Edisto River TOPMODEL. Additional changes were made with subsequent simulations culminating in the best simulation, which included meteorological and topographic wetness index data from the Edisto River basin and updated calibration parameters for some of the TOPMODEL calibration parameters. Comparison of goodness-of-fit statistics between measured and simulated daily mean streamflow for the two models showed that with calibration, the Edisto River TOPMODEL produced slightly better results than the McTier Creek model, despite the significant difference in the drainage-area size at the outlet locations for the two models (30.7 and 2,725 square miles, respectively). Along with the TOPMODEL hydrologic simulations, a visualization tool (the Edisto River Data Viewer) was developed to help assess trends and influencing variables in the stream ecosystem. Incorporated into the visualization tool were the water-quality load models TOPLOAD, TOPLOAD-H, and LOADEST
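
    The topographic wetness index that TOPMODEL relies on has a compact textbook definition, TWI = ln(a / tan(beta)), with a the upslope contributing area per unit contour length and beta the local slope. The sketch below is a generic implementation with illustrative inputs, not the USGS code used in the study.

# Illustrative sketch of the textbook definition (not the USGS code): the
# topographic wetness index used by TOPMODEL, TWI = ln(a / tan(beta)), where a is
# the upslope contributing area per unit contour length and beta is the local slope.
import numpy as np

def topographic_wetness_index(upslope_area_m, slope_deg):
    """TWI from upslope area per unit contour length (m) and slope (degrees)."""
    a = np.asarray(upslope_area_m, dtype=float)
    tan_beta = np.tan(np.radians(np.asarray(slope_deg, dtype=float)))
    return np.log(a / np.maximum(tan_beta, 1e-6))  # guard against flat cells

# Illustrative cells: a valley bottom accumulates a high index, a steep hillslope a low one.
print(topographic_wetness_index([5000.0, 50.0], [0.5, 20.0]))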

  3. Scaling up parallel scientific applications on the IBM SP

    SciTech Connect

    Skinner, David

    2004-01-08

    This document provides a technical description of the IBM SP seaborg.nersc.gov with an emphasis on how the system performs as a production environment for large-scale scientific applications. The overall goal is to provide developers with the information they need to write and run such applications. While some of the information presented here may be applicable in a larger context, the focus is on experiences and scaling techniques specific to seaborg.nersc.gov. In the first few sections, we seek to determine how well the theoretical capabilities of this machine may be realized in practice. Most of the measured performance numbers come from small code microkernels or test codes rather than from real user codes. In a companion document, several real applications are explored in detail. The microkernel approach described above has value to the user since the most widely quoted performance numbers often reflect the theoretical (peak) performance values rather than those realized in practice. Likewise, anecdotes from full-blown applications often involve mixed algorithms, hidden constraints or other specifics which make the results hard to generalize. Microkernels and test codes provide a middle ground that is a reasonable best-case scenario for real user codes and provide the user with a small kernel of code that can serve as a template for the writing or modification of more substantial codes.

  4. Scaling Up Nature: Large Area Flexible Biomimetic Surfaces.

    PubMed

    Li, Yinyong; John, Jacob; Kolewe, Kristopher W; Schiffman, Jessica D; Carter, Kenneth R

    2015-10-28

    The fabrication and advanced function of large area biomimetic superhydrophobic surfaces (SHS) and slippery lubricant-infused porous surfaces (SLIPS) are reported. The use of roll-to-roll nanoimprinting techniques enabled the continuous fabrication of SHS and SLIPS based on hierarchically wrinkled surfaces. Perfluoropolyether hybrid molds were used as flexible molds for roll-to-roll imprinting into a newly designed thiol-ene based photopolymer resin coated on flexible polyethylene terephthalate films. The patterned surfaces exhibit feasible superhydrophobicity with a water contact angle around 160° without any further surface modification. The SHS can be easily converted into SLIPS by roll-to-roll coating of a fluorinated lubricant, and these surfaces have outstanding repellence to a variety of liquids. Furthermore, both SHS and SLIPS display antibiofouling properties when challenged with Escherichia coli K12 MG1655. The current article describes the transformation of artificial biomimetic structures from small, lab-scale coupons to low-cost, large area platforms. PMID:26423494

  5. Scale-up of catalytic wet oxidation under moderate conditions

    SciTech Connect

    Harf, J.; Hug, A.; Vogel, F.; Rohr, P.R. von

    1999-05-01

    Catalytic Wet Oxidation with pure oxygen is a suitable treatment process for the degradation of organic matter in wastewaters and sludges. The applied moderate reaction conditions lead only to a partial oxidation of the organics. Therefore, the resulting process water has to be purified in a biological treatment plant. In this study, experimental data collected during the wet oxidation of phenol and sewage sludge in a laboratory batch reactor as well as in a pilot plant are presented. A generalized kinetic model combined with a residence time analysis allows accurate prediction of the degradation of organic matter in the pilot plant. The wet oxidation of wastewaters and sewage sludge was realized in one single plant concept. Treating suspended or diluted organic wastes produces a highly biodegradable process water containing low-molecular-weight oxidation products. The investigated Catalytic Wet Oxidation of sewage sludge generates a residual solid complying with the European quality standards for disposal concerning leachability and organic content. Due to its low capital and operating costs, the Catalytic Wet Oxidation process constitutes an acceptable alternative to incineration for the disposal of sludges.

  6. Soil vapor extraction system design scale-up considerations

    SciTech Connect

    Peterson, E.F.; Battey, R.F.

    1997-12-31

    Tried and true design considerations need to be reexamined when designing and implementing a 10,000 scfm soil vapor extraction system. Soil vapor extraction systems have typically been applied at many sites on a fairly small scale, involving air flows of several hundred to a thousand cubic feet per minute. Systems of 10,000 scfm are rarely encountered and entail some unique design considerations. This paper describes the technology options, equipment availability, and other design considerations for a 10,000 scfm system (installed at a former aircraft maintenance facility in Southern California). During the design, low pressure centrifugal fans, higher pressure centrifugal blowers, regenerative blowers and positive-displacement blowers are considered as exhausters. Several technologies are considered for treatment of the extracted air to reduce volatile organic compound (VOC) content: granular activated carbon adsorption, resin adsorption, thermal oxidation and catalytic oxidation. Cost and efficiency criteria are evaluated for the final selection of process equipment at this site. The choice of technology for reduction of VOCs is strongly dependent upon the estimate of recoverable VOCs initially in the soil, the cost of activated carbon replacement and reactivation service, and the cost of fuel. The choice of exhauster is most strongly influenced by the vacuum required at the vapor extraction wells to efficiently move air through the soil matrix and the treatment equipment. The choice of type of exhauster is also limited by the 10,000 scfm air flow rate.

  7. Scale-up of miscible flood processes. Annual report

    SciTech Connect

    Orr, F.M. Jr.

    1992-05-01

    Results of a wide-ranging investigation of the scaling of the physical mechanisms of miscible floods are reported. Advanced techniques for analysis of crude oils are considered in Chapter 2. Application of supercritical fluid chromatography is demonstrated for characterization of crude oils for equation-of-state calculations of phase equilibrium. Results of measurements of crude oil and phase compositions by gas chromatography and mass spectrometry are also reported. The theory of development of miscibility is considered in detail in Chapter 3. The theory is extended to four components, and sample solutions for a variety of gas injection systems are presented. The analytical theory shows that miscibility can develop even though standard tie-line extension criteria developed for ternary systems are not satisfied. In addition, the theory includes the first analytical solutions for condensing/vaporizing gas drives. In Chapter 4, methods for simulation of viscous fingering are considered. The scaling of the growth of transition zones in linear viscous fingering is considered. In addition, extension of the models developed previously to three dimensions is described, as is the inclusion of effects of equilibrium phase behavior. In Chapter 5, the combined effects of capillary and gravity-driven crossflow are considered. The experimental results presented show that very high recovery can be achieved by gravity segregation when interfacial tensions are moderately low. We argue that such crossflow mechanisms are important in multicontact miscible floods in heterogeneous reservoirs. In addition, results of flow visualization experiments are presented that illustrate the interplay of crossflow driven by gravity with that driven by viscous forces.

  8. EFFECT OF PRELOADING ON THE SCALE-UP OF GAC MICRO- COLUMNS

    EPA Science Inventory

    A previously presented microcolumn scale-up procedure is evaluated. Scale-up assumptions that involve equal capacities in microcolumns and field columns are studied in an effort to determine whether preloading activated carbon with a natural water significantly reduces the carbo...

  9. 78 FR 25977 - Applications for New Awards; Investing in Innovation Fund, Scale-up Grants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-03

    ... Applications for New Awards; Investing in Innovation Fund, Scale-up Grants AGENCY: Office of Innovation and Improvement, Department of Education. ACTION: Notice. Overview Information Investing in Innovation Fund, Scale... Domestic Assistance (CFDA) Number: 84.411A (Scale-up grants). DATES: Applications Available: May 6,...

  10. What Does It Take to Scale Up and Sustain Evidence-Based Practices?

    ERIC Educational Resources Information Center

    Klingner, Janette K.; Boardman, Alison G.; Mcmaster, Kristen L.

    2013-01-01

    This article discusses the strategic scaling up of evidence-based practices. The authors draw from the scholarly work of fellow special education researchers and from the field of learning sciences. The article defines scaling up as the process by which researchers or educators initially implement interventions on a small scale, validate them, and…

  11. Fundamental Issues Concerning the Sustainment and Scaling Up of Professional Development Programs

    ERIC Educational Resources Information Center

    Tirosh, Dina; Tsamir, Pessia; Levenson, Esther

    2015-01-01

    The issue of sustaining and scaling up professional development for mathematics teachers raises several fundamental issues for researchers. This commentary addresses various definitions for sustainability and scaling up and how these definitions may affect the design of programs as well as the design of research. We consider four of the papers in…

  12. 'Scaling-up is a craft not a science': Catalysing scale-up of health innovations in Ethiopia, India and Nigeria.

    PubMed

    Spicer, Neil; Bhattacharya, Dipankar; Dimka, Ritgak; Fanta, Feleke; Mangham-Jefferies, Lindsay; Schellenberg, Joanna; Tamire-Woldemariam, Addis; Walt, Gill; Wickremasinghe, Deepthi

    2014-11-01

    Donors and other development partners commonly introduce innovative practices and technologies to improve health in low and middle income countries. Yet many innovations that are effective in improving health and survival are slow to be translated into policy and implemented at scale. Understanding the factors influencing scale-up is important. We conducted a qualitative study involving 150 semi-structured interviews with government, development partners, civil society organisations and externally funded implementers, professional associations and academic institutions in 2012/13 to explore scale-up of innovative interventions targeting mothers and newborns in Ethiopia, the Indian state of Uttar Pradesh and the six states of northeast Nigeria, which are settings with high burdens of maternal and neonatal mortality. Interviews were analysed using a common analytic framework developed for cross-country comparison and themes were coded using Nvivo. We found that programme implementers across the three settings require multiple steps to catalyse scale-up. Advocating for government to adopt and finance health innovations requires: designing scalable innovations; embedding scale-up in programme design and allocating time and resources; building implementer capacity to catalyse scale-up; adopting effective approaches to advocacy; presenting strong evidence to support government decision making; involving government in programme design; invoking policy champions and networks; strengthening harmonisation among external programmes; aligning innovations with health systems and priorities. Other steps include: supporting government to develop policies and programmes and strengthening health systems and staff; promoting community uptake by involving media, community leaders, mobilisation teams and role models. We conclude that scale-up has no magic bullet solution - implementers must embrace multiple activities, and require substantial support from donors and governments in

  13. Fed-batch bioreactor process scale-up from 3-L to 2,500-L scale for monoclonal antibody production from cell culture.

    PubMed

    Yang, Jeng-Dar; Lu, Canghai; Stasny, Brad; Henley, Joseph; Guinto, Woodrow; Gonzalez, Carlos; Gleason, Joseph; Fung, Monica; Collopy, Brett; Benjamino, Michael; Gangi, Jennifer; Hanson, Melissa; Ille, Elisabeth

    2007-09-01

    This case study focuses on the scale-up of an Sp2/0 mouse myeloma cell line-based fed-batch bioreactor process, from the initial 3-L bench scale to the 2,500-L scale. A stepwise scale-up strategy that involved several intermediate steps in increasing the bioreactor volume was adopted to minimize the risks associated with scale-up processes. Careful selection of several available mixing models from the literature, and appropriate application of the calculated results to our settings, resulted in successful scale-up of agitation speed for the large bioreactors. Consideration was also given to scale-up of the nutrient feeding, inoculation, and the set-points of operational parameters such as temperature, pH, dissolved oxygen, dissolved carbon dioxide, and aeration in an integrated manner. It has been demonstrated through the qualitative and the quantitative side-by-side comparison of bioreactor performance as well as through a panel of biochemical characterization tests that the comparability of the process and the product was well controlled and maintained during the process scale-up. The 2,500-L process is currently in use for the routine clinical production of Epratuzumab in support of two global Phase III clinical trials in patients with lupus. Today, the 2,500-L fed-batch production process for Epratuzumab has met all scheduled batch releases, and the quality of the antibody is consistent and reproducible, meeting all specifications, thus confirming the robustness of the process. PMID:17657776
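
    One widely used agitation scale-up criterion of the kind alluded to above is constant power per unit volume: in the turbulent regime P is proportional to N^3 D^5 and V to D^3, so N2 = N1 (D1/D2)^(2/3). The impeller diameters and bench-scale speed below are assumed for illustration and are not the values used in the case study.

# Illustrative sketch of a common mixing correlation (assumed impeller diameters,
# not the values from the case study): agitation-speed scale-up at constant power
# per unit volume. In the turbulent regime P ~ Np * rho * N**3 * D**5 and V ~ D**3,
# so holding P/V constant gives N2 = N1 * (D1 / D2)**(2/3).
def scale_agitation_constant_pv(n1_rpm, d1_m, d2_m):
    return n1_rpm * (d1_m / d2_m) ** (2.0 / 3.0)

n_bench = 250.0  # rpm in an assumed 3-L bench bioreactor
n_large = scale_agitation_constant_pv(n_bench, d1_m=0.06, d2_m=0.60)
print(f"Large-scale agitation at constant P/V: ~{n_large:.0f} rpm")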

  14. Optimization techniques applied to passive measures for in-orbit spacecraft survivability

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.; Price, D. Marvin

    1987-01-01

    Optimization Techniques Applied to Passive Measures for In-Orbit Spacecraft Survivability is a six-month study designed to evaluate the effectiveness of the geometric programming (GP) optimization technique in determining the optimal design of a meteoroid and space debris protection system for the Space Station Core Module configuration. Geometric programming was found to be superior to other methods in that it provided maximum protection from impacts at the lowest weight and cost.

  15. Development of Promising Insulating Oil and Applied Techniques of EHD, ER·MR

    NASA Astrophysics Data System (ADS)

    Hanaoka, Ryoichi

    The development of environment-friendly insulating liquids has attracted attention for new designs of oil-filled power apparatus such as transformers, from the viewpoint of environmental protection. Dielectric liquids can also be applied widely in various fields involving electromagnetic phenomena. This article introduces recent trends in promising new vegetable-based oils for electrical insulation, and in EHD pumping, ER fluids and MR fluids as applied techniques of dielectric liquids.

  16. At-line process analytical technology (PAT) for more efficient scale up of biopharmaceutical microfiltration unit operations.

    PubMed

    Watson, Douglas S; Kerchner, Kristi R; Gant, Sean S; Pedersen, Joseph W; Hamburger, James B; Ortigosa, Allison D; Potgieter, Thomas I

    2016-01-01

    Tangential flow microfiltration (MF) is a cost-effective and robust bioprocess separation technique, but successful full scale implementation is hindered by the empirical, trial-and-error nature of scale-up. We present an integrated approach leveraging at-line process analytical technology (PAT) and mass balance based modeling to de-risk MF scale-up. Chromatography-based PAT was employed to improve the consistency of an MF step that had been a bottleneck in the process used to manufacture a therapeutic protein. A 10-min reverse phase ultra high performance liquid chromatography (RP-UPLC) assay was developed to provide at-line monitoring of protein concentration. The method was successfully validated and method performance was comparable to previously validated methods. The PAT tool revealed areas of divergence from a mass balance-based model, highlighting specific opportunities for process improvement. Adjustment of appropriate process controls led to improved operability and significantly increased yield, providing a successful example of PAT deployment in the downstream purification of a therapeutic protein. The general approach presented here should be broadly applicable to reduce risk during scale-up of filtration processes and should be suitable for feed-forward and feed-back process control. © 2015 American Institute of Chemical Engineers Biotechnol. Prog., 32:108-115, 2016. PMID:26519135

  17. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    NASA Astrophysics Data System (ADS)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques are used to analyse an ethically challenging situation involving ICT, raised in a recent media article, in order to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skill development focused on includes: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made, to name a few.

  18. The technique of linear prediction filters applied to studies of solar wind-magnetosphere coupling

    NASA Technical Reports Server (NTRS)

    Clauer, C. Robert

    1986-01-01

    Linear prediction filtering is a powerful empirical technique suitable for the study of stimulus-response behavior. The technique enables one to determine the most general linear relationship between multiple time-varying quantities, assuming that the physical systems relating the quantities are linear and time invariant. Several researchers have applied linear prediction analysis to investigate solar wind-magnetosphere interactions. This short review describes the method of linear prediction analysis, its application to solar wind-magnetosphere coupling studies both in terms of physical processes, and the results of investigations which have used this technique.
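    As a concrete illustration of the stimulus-response idea described above, the following sketch estimates a finite impulse response (linear prediction) filter between an input and an output time series by ordinary least squares, assuming the linear, time-invariant relationship the abstract describes. The signals and the "true" impulse response are synthetic placeholders rather than solar wind or magnetospheric data.

        # Sketch: estimate a linear prediction (impulse response) filter h so that
        # y[t] ~ sum_k h[k] * x[t - k], by least squares on lagged copies of x.
        import numpy as np

        def fit_linear_prediction_filter(x, y, n_lags):
            """Least-squares FIR filter relating input x to output y."""
            t = np.arange(n_lags, len(x))
            # Design matrix: column k holds x delayed by k samples.
            X = np.column_stack([x[t - k] for k in range(n_lags)])
            h, *_ = np.linalg.lstsq(X, y[t], rcond=None)
            return h

        rng = np.random.default_rng(0)
        x = rng.standard_normal(5000)              # stand-in "solar wind" driver
        true_h = np.array([0.0, 0.5, 0.3, 0.1])    # assumed impulse response
        y = np.convolve(x, true_h, mode="full")[: len(x)] + 0.05 * rng.standard_normal(len(x))

        h_est = fit_linear_prediction_filter(x, y, n_lags=6)
        print(np.round(h_est, 3))   # leading coefficients should approximate true_h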

  19. Recreation in a Zoo Environment: Applying Animal Behavior Research Techniques to Understand How Visitors Allocate Time.

    ERIC Educational Resources Information Center

    Harris, Lisa

    1995-01-01

    A focal-animal sampling technique was applied to measure and quantify visitor behavior at an enclosed hummingbird aviary. The amount of time visitors stayed within the aviary and how they allocated time was measured. Results can be used by exhibit designers to create and modify museum exhibits. (LZ)

  20. Scaling-up Process-Oriented Guided Inquiry Learning Techniques for Teaching Large Information Systems Courses

    ERIC Educational Resources Information Center

    Trevathan, Jarrod; Myers, Trina; Gray, Heather

    2014-01-01

    Promoting engagement during lectures becomes significantly more challenging as class sizes increase. Therefore, lecturers need to experiment with new teaching methodologies to embolden deep learning outcomes and to develop interpersonal skills amongst students. Process Oriented Guided Inquiry Learning is a teaching approach that uses highly…

  1. Methodologies Used for Scaling-up From a Single Energy Production Unit to State Energy Sector

    NASA Astrophysics Data System (ADS)

    Cimdina, Ginta; Timma, Lelde; Veidenbergs, Ivars; Blumberga, Dagnija

    2015-12-01

    In a well-functioning and sustainable national energy sector, each of its elements should function with maximum efficiency. To ensure maximum efficiency and to study possible improvements of the sector, a scaling-up framework is presented in this work. In this scaling-up framework the starting point is a CHP unit and its operation; the next level of aggregation is the district heating network, followed by a municipal energy plan and finally a low-carbon strategy. Within this framework the authors argue that successful innovative practices developed and tested at the lower levels of aggregation can then be transferred to the upper levels, leading to a scaling-up effect of innovative practices. The work summarizes 12 methodologies used in the energy sector, dividing them among the levels of aggregation in the scaling-up framework.

  2. A Route to Scale Up DNA Origami Using DNA Tiles as Folding Staples

    SciTech Connect

    Zhao, Zhao; Yan, Hao; Liu, Yan

    2010-01-26

    A new strategy is presented to scale up DNA origami using multi-helical DNA tiles as folding staples. Atomic force microscopy images demonstrate the two-dimensional structures formed by using this strategy.

  3. Scaling up graph-based semisupervised learning via prototype vector machines.

    PubMed

    Zhang, Kai; Lan, Liang; Kwok, James T; Vucetic, Slobodan; Parvin, Bahram

    2015-03-01

    When the amount of labeled data is limited, semisupervised learning can improve the learner's performance by also using the often easily available unlabeled data. In particular, a popular approach requires the learned function to be smooth on the underlying data manifold. By approximating this manifold as a weighted graph, such graph-based techniques can often achieve state-of-the-art performance. However, their high time and space complexities make them less attractive on large data sets. In this paper, we propose to scale up graph-based semisupervised learning using a set of sparse prototypes derived from the data. These prototypes serve as a small set of data representatives, which can be used to approximate the graph-based regularizer and to control model complexity. Consequently, both training and testing become much more efficient. Moreover, when the Gaussian kernel is used to define the graph affinity, a simple and principled method to select the prototypes can be obtained. Experiments on a number of real-world data sets demonstrate encouraging performance and scaling properties of the proposed approach. It also compares favorably with models learned via l1-regularization at the same level of model sparsity. These results demonstrate the efficacy of the proposed approach in producing highly parsimonious and accurate models for semisupervised learning. PMID:25720002
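    The prototype vector machine itself is not reproduced here; the sketch below only shows the underlying graph-based idea that it approximates - a Gaussian-kernel affinity graph with iterative label propagation on a small synthetic data set - which makes explicit the O(n^2) affinity matrix that motivates the prototype approximation.

        # Minimal sketch of graph-based semisupervised learning: build a Gaussian
        # affinity graph and propagate labels iteratively. This is the baseline the
        # prototype approach approximates, not the prototype vector machine itself.
        import numpy as np

        def label_propagation(X, y, sigma=1.0, n_iter=200):
            """y holds class indices for labeled points and -1 for unlabeled ones."""
            d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
            W = np.exp(-d2 / (2.0 * sigma ** 2))          # graph affinity
            np.fill_diagonal(W, 0.0)
            P = W / W.sum(axis=1, keepdims=True)          # row-normalized transitions
            classes = np.unique(y[y >= 0])
            F = np.zeros((len(X), len(classes)))
            F[y >= 0, :] = (y[y >= 0, None] == classes[None, :]).astype(float)
            clamp = F.copy()
            for _ in range(n_iter):
                F = P @ F
                F[y >= 0] = clamp[y >= 0]                 # clamp labeled points
            return classes[F.argmax(axis=1)]

        # Tiny synthetic example: two blobs, one labeled point each.
        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
        y = -np.ones(40, dtype=int)
        y[0], y[20] = 0, 1
        print(label_propagation(X, y, sigma=0.5))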

  4. Scaling Up Graph-Based Semisupervised Learning via Prototype Vector Machines

    PubMed Central

    Zhang, Kai; Lan, Liang; Kwok, James T.; Vucetic, Slobodan; Parvin, Bahram

    2014-01-01

    When the amount of labeled data are limited, semi-supervised learning can improve the learner's performance by also using the often easily available unlabeled data. In particular, a popular approach requires the learned function to be smooth on the underlying data manifold. By approximating this manifold as a weighted graph, such graph-based techniques can often achieve state-of-the-art performance. However, their high time and space complexities make them less attractive on large data sets. In this paper, we propose to scale up graph-based semisupervised learning using a set of sparse prototypes derived from the data. These prototypes serve as a small set of data representatives, which can be used to approximate the graph-based regularizer and to control model complexity. Consequently, both training and testing become much more efficient. Moreover, when the Gaussian kernel is used to define the graph affinity, a simple and principled method to select the prototypes can be obtained. Experiments on a number of real-world data sets demonstrate encouraging performance and scaling properties of the proposed approach. It also compares favorably with models learned via ℓ1-regularization at the same level of model sparsity. These results demonstrate the efficacy of the proposed approach in producing highly parsimonious and accurate models for semisupervised learning. PMID:25720002

  5. Scaling up and error analysis of transpiration for Populus euphratica in a desert riparian forest

    NASA Astrophysics Data System (ADS)

    Si, J.; Li, W.; Feng, Q.

    2013-12-01

    Water consumption information for a forest stand is the most important factor in regional water resources management. However, water consumption is usually measured on a limited number of individual sample trees, so an important issue is how to scale up data from a series of sample trees to the entire stand. Estimation of sap flow flux density (Fd) and stand sapwood area (AS-stand) are among the most critical factors for determining forest stand transpiration from sap flow measurements. To estimate Fd, the various links in the sap flow technique strongly affect the measurement; to estimate AS-stand, an appropriate indirect technique for measuring the sapwood area of each tree (AS-tree) is required, because it is impossible to measure AS-tree for all trees in a forest stand. In this study, Fd was measured in 2 mature P. euphratica trees at several radial depths (0-10 and 10-30 mm) using sap flow sensors based on the heat ratio method, and a relationship model between AS-tree and stem diameter (DBH) and a growth model of AS-tree were established using survey data on DBH, tree age and AS-tree. The results show that transpiration can be scaled up from sample trees to the entire forest stand using AS-tree and Fd; however, stand transpiration (E) is overestimated by 12.6% if only the 0-10 mm Fd is used and underestimated by 25.3% if only the 10-30 mm Fd is used, which implies that major uncertainties in mean stand Fd estimates are caused by radial variation in Fd. E is also clearly overestimated when AS-stand is held constant, so simulating daily changes in AS-stand is key to improving prediction accuracy. The potential errors in transpiration became nearly stable for sample sizes of approximately 30 trees or more, suggesting that at least 30 trees should be sampled when building an allometric equation for P. euphratica.
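    A minimal sketch of the scaling step described above: stand transpiration is taken as an area-weighted mean sap flux density multiplied by the stand sapwood area, with each tree's sapwood area predicted from DBH through an assumed power-law allometry. The allometric coefficients, flux densities and tree inventory below are hypothetical placeholders, not the fitted values from this study.

        # Sketch of scaling sap flow from sample trees to a stand:
        #   E_stand = Fd_mean * AS_stand,  AS_stand = sum over trees of AS_tree(DBH)
        # All coefficients and measurements below are illustrative placeholders.
        import numpy as np

        def sapwood_area_from_dbh(dbh_cm, a=0.45, b=1.9):
            """Assumed power-law allometry AS_tree = a * DBH^b (cm^2)."""
            return a * dbh_cm ** b

        def stand_transpiration(dbh_cm, fd_by_depth, ring_area_fractions):
            """fd_by_depth: sap flux densities (cm3 cm-2 h-1) for each radial band;
            ring_area_fractions: fraction of sapwood area in each band (sums to 1)."""
            fd_mean = np.dot(fd_by_depth, ring_area_fractions)      # area-weighted Fd
            as_stand_cm2 = sapwood_area_from_dbh(np.asarray(dbh_cm)).sum()
            return fd_mean * as_stand_cm2                           # cm3 h-1 for the stand

        dbh = [18.0, 22.5, 25.0, 31.0]              # hypothetical inventory (cm)
        fd = np.array([12.0, 7.0])                  # 0-10 mm and 10-30 mm bands
        weights = np.array([0.4, 0.6])              # assumed sapwood-area shares
        print(f"stand transpiration approx {stand_transpiration(dbh, fd, weights):.0f} cm3/h")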

  6. Metabolic Profiling of Geobacter sulfurreducens during Industrial Bioprocess Scale-Up

    PubMed Central

    Muhamadali, Howbeer; Xu, Yun; Ellis, David I.; Allwood, J. William; Rattray, Nicholas J. W.; Correa, Elon; Alrabiah, Haitham

    2015-01-01

    During the industrial scale-up of bioprocesses it is important to establish that the biological system has not changed significantly when moving from small laboratory-scale shake flasks or culturing bottles to an industrially relevant production level. Therefore, during upscaling of biomass production for a range of metal transformations, including the production of biogenic magnetite nanoparticles by Geobacter sulfurreducens, from 100-ml bench-scale to 5-liter fermentors, we applied Fourier transform infrared (FTIR) spectroscopy as a metabolic fingerprinting approach followed by the analysis of bacterial cell extracts by gas chromatography-mass spectrometry (GC-MS) for metabolic profiling. FTIR results clearly differentiated between the phenotypic changes associated with different growth phases as well as the two culturing conditions. Furthermore, the clustering patterns displayed by multivariate analysis were in agreement with the turbidimetric measurements, which displayed an extended lag phase for cells grown in a 5-liter bioreactor (24 h) compared to those grown in 100-ml serum bottles (6 h). GC-MS analysis of the cell extracts demonstrated an overall accumulation of fumarate during the lag phase under both culturing conditions, coinciding with the detected concentrations of oxaloacetate, pyruvate, nicotinamide, and glycerol-3-phosphate being at their lowest levels compared to other growth phases. These metabolites were overlaid onto a metabolic network of G. sulfurreducens, and taking into account the levels of these metabolites throughout the fermentation process, the limited availability of oxaloacetate and nicotinamide would seem to be the main metabolic bottleneck resulting from this scale-up process. Additional metabolite-feeding experiments were carried out to validate the above hypothesis. Nicotinamide supplementation (1 mM) did not display any significant effects on the lag phase of G. sulfurreducens cells grown in the 100-ml serum bottles. However

  7. Metabolic Profiling of Geobacter sulfurreducens during Industrial Bioprocess Scale-Up.

    PubMed

    Muhamadali, Howbeer; Xu, Yun; Ellis, David I; Allwood, J William; Rattray, Nicholas J W; Correa, Elon; Alrabiah, Haitham; Lloyd, Jonathan R; Goodacre, Royston

    2015-05-15

    During the industrial scale-up of bioprocesses it is important to establish that the biological system has not changed significantly when moving from small laboratory-scale shake flasks or culturing bottles to an industrially relevant production level. Therefore, during upscaling of biomass production for a range of metal transformations, including the production of biogenic magnetite nanoparticles by Geobacter sulfurreducens, from 100-ml bench-scale to 5-liter fermentors, we applied Fourier transform infrared (FTIR) spectroscopy as a metabolic fingerprinting approach followed by the analysis of bacterial cell extracts by gas chromatography-mass spectrometry (GC-MS) for metabolic profiling. FTIR results clearly differentiated between the phenotypic changes associated with different growth phases as well as the two culturing conditions. Furthermore, the clustering patterns displayed by multivariate analysis were in agreement with the turbidimetric measurements, which displayed an extended lag phase for cells grown in a 5-liter bioreactor (24 h) compared to those grown in 100-ml serum bottles (6 h). GC-MS analysis of the cell extracts demonstrated an overall accumulation of fumarate during the lag phase under both culturing conditions, coinciding with the detected concentrations of oxaloacetate, pyruvate, nicotinamide, and glycerol-3-phosphate being at their lowest levels compared to other growth phases. These metabolites were overlaid onto a metabolic network of G. sulfurreducens, and taking into account the levels of these metabolites throughout the fermentation process, the limited availability of oxaloacetate and nicotinamide would seem to be the main metabolic bottleneck resulting from this scale-up process. Additional metabolite-feeding experiments were carried out to validate the above hypothesis. Nicotinamide supplementation (1 mM) did not display any significant effects on the lag phase of G. sulfurreducens cells grown in the 100-ml serum bottles. However

  8. Comparison of a laboratory and a production coating spray gun with respect to scale-up.

    PubMed

    Mueller, Ronny; Kleinebudde, Peter

    2007-01-01

    A laboratory spray gun and a production spray gun were investigated in a scale-up study. Two Schlick spray guns, which are equipped with a new antibearding cap, were used in this study. The influence of the atomization air pressure, spray gun-to-tablet-bed distance, polymer solution viscosity, and spray rate was analyzed in a statistical design of experiments. The 2 spray guns were compared with respect to the spray width and height, droplet size, droplet velocity, and spray density. The droplet size, velocity, and spray density were measured with a Phase Doppler Particle Analyzer. A successful scale-up of the atomization is accomplished if similar droplet sizes, droplet velocities, and spray densities are achieved at the production scale as at the laboratory scale. This study gives basic information for the scale-up of the settings from the laboratory spray gun to the production spray gun. Both spray guns are highly comparable with respect to the droplet size and velocity. The scale-up of the droplet size should be performed by an adjustment of the atomization air pressure. The scale-up of the droplet velocity should be performed by an adjustment of the spray gun-to-tablet-bed distance. The presented statistical model and surface plots are convenient and powerful tools for scaling up the spray settings when changing from the laboratory spray gun to the production spray gun. PMID:17408226
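    The study's statistical model is not reproduced here; the sketch below only illustrates the kind of two-factor response-surface fit the abstract refers to, using a least-squares fit of droplet size against atomization air pressure and gun-to-bed distance on synthetic design points.

        # Sketch: fit a simple response-surface model for droplet size vs. atomization
        # air pressure and gun-to-bed distance from a two-factor design. All data are
        # synthetic placeholders; the study's actual design and model are not shown.
        import numpy as np

        # Hypothetical runs: (pressure [bar], distance [cm]) -> droplet size d50 [um]
        pressure = np.array([1.0, 1.0, 2.0, 2.0, 1.5])
        distance = np.array([10.0, 20.0, 10.0, 20.0, 15.0])
        d50_um   = np.array([28.0, 31.0, 18.0, 22.0, 24.5])

        # Model: d50 = b0 + b1*P + b2*D + b3*P*D  (fit by least squares)
        X = np.column_stack([np.ones_like(pressure), pressure, distance, pressure * distance])
        beta, *_ = np.linalg.lstsq(X, d50_um, rcond=None)

        def predict_d50(p, d):
            return beta @ np.array([1.0, p, d, p * d])

        # Example: predict droplet size for candidate production-scale settings.
        print(f"predicted d50 at 1.8 bar, 18 cm: {predict_d50(1.8, 18.0):.1f} um")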

  9. Applied potential tomography. A new noninvasive technique for measuring gastric emptying

    SciTech Connect

    Avill, R.; Mangnall, Y.F.; Bird, N.C.; Brown, B.H.; Barber, D.C.; Seagar, A.D.; Johnson, A.G.; Read, N.W.

    1987-04-01

    Applied potential tomography is a new, noninvasive technique that yields sequential images of the resistivity of gastric contents after subjects have ingested a liquid or semisolid meal. This study validates the technique as a means of measuring gastric emptying. Experiments in vitro showed an excellent correlation between measurements of resistivity and either the square of the radius of a glass rod or the volume of water in a spherical balloon when both were placed in an oval tank containing saline. Altering the lateral position of the rod in the tank did not alter the values obtained. Images of abdominal resistivity were also directly correlated with the volume of air in a gastric balloon. Profiles of gastric emptying of liquid meals obtained using applied potential tomography were very similar to those obtained using scintigraphy or dye dilution techniques, provided that acid secretion was inhibited by cimetidine. Profiles of emptying of a mashed potato meal using applied potential tomography were also very similar to those obtained by scintigraphy. Measurements of the emptying of a liquid meal from the stomach were reproducible if acid secretion was inhibited by cimetidine. Thus, applied potential tomography is an accurate and reproducible method of measuring gastric emptying of liquids and particulate food. It is inexpensive, well tolerated, easy to use, and ideally suited for multiple studies in patients, even those who are pregnant.

  10. Color metallography and electron microscopy techniques applied to the characterization of 413.0 aluminum alloys.

    PubMed

    Vander Voort, George; Asensio-Lozano, Juan; Suárez-Peña, Beatriz

    2013-08-01

    The influence on alloy 413.0 of the refinement and modification of its microstructure was analyzed by means of several microscopy techniques, as well as the effect of the application of high pressure during solidification. For each treatment and solidification pressure condition employed, the most suitable microscopy techniques for identifying and characterizing the phases present were investigated. Color metallography and electron microscopy techniques were applied to the qualitative microstructural analysis. Volume fraction and grain size of the primary α-Al were characterized by quantitative metallographic techniques. The results show that the effect caused by applying high pressure during solidification of the alloy is more pronounced than that caused by modification and refinement of the microstructure when it solidifies at atmospheric pressure. Furthermore, it has been shown that, for Al-Si alloy characterization, when aiming to characterize the primary α-Al phase, optical color metallography observed under crossed polarized light plus a sensitive tint filter is the most suitable technique. When the goal is to characterize the eutectic Si, the use of optical color metallography or electron microscopy is equally valid. The characterization of iron-rich intermetallic compounds should preferably be performed by means of backscattered electron imaging. PMID:23701972

  11. An Optimized Integrator Windup Protection Technique Applied to a Turbofan Engine Control

    NASA Technical Reports Server (NTRS)

    Watts, Stephen R.; Garg, Sanjay

    1995-01-01

    This paper introduces a new technique for providing memoryless integrator windup protection which utilizes readily available optimization software tools. This integrator windup protection synthesis provides a concise methodology for creating integrator windup protection for each actuation system loop independently while assuring both controller and closed loop system stability. The individual actuation system loops' integrator windup protection can then be combined to provide integrator windup protection for the entire system. This technique is applied to an H(exp infinity) based multivariable control designed for a linear model of an advanced afterburning turbofan engine. The resulting transient characteristics are examined for the integrated system while encountering single and multiple actuation limits.

  12. Scaling up health interventions in resource-poor countries: what role does research in stated-preference framework play?

    PubMed

    Pokhrel, Subhash

    2006-01-01

    Despite improved supply of health care services in low-income countries in the recent past, their uptake continues to be lower than anticipated. This has made it difficult to scale-up those interventions which are not only cost-effective from supply perspectives but that might have substantial impacts on improving the health status of these countries. Understanding demand-side barriers is therefore critically important. With the help of a case study from Nepal, this commentary argues that more research on demand-side barriers needs to be carried out and that the stated-preference (SP) approach to such research might be helpful. Since SP techniques place service users' preferences at the centre of the analysis, and because preferences reflect individual or social welfare, SP techniques are likely to be helpful in devising policies to increase social welfare (e.g. improved service coverage). Moreover, the SP data are collected in a controlled environment which allows straightforward identification of effects (e.g. that of process attributes of care) and large quantities of relevant data can be collected at moderate cost. In addition to providing insights into current preferences, SP data also provide insights into how preferences are likely to respond to a proposed change in resource allocation (e.g. changing service delivery strategy). Finally, the SP-based techniques have been used widely in resource-rich countries and their experience can be valuable in conducting scaling-up research in low-income countries. PMID:16573821

  13. A comparison of two conformal mapping techniques applied to an aerobrake body

    NASA Technical Reports Server (NTRS)

    Hommel, Mark J.

    1987-01-01

    Conformal mapping is a classical technique which has been utilized for solving problems in aerodynamics and hydrodynamics. Conformal mapping has been successfully applied in the construction of grids around airfoils, engine inlets and other aircraft configurations. Conformal mapping techniques were applied to an aerobrake body having an axis of symmetry. Two different approaches were utilized: (1) Karman-Trefftz transformation; and (2) Point Wise Schwarz Christoffel transformation. In both cases, the aerobrake body was mapped onto a near circle, and a grid was generated in the mapped plane. The mapped body and grid were then mapped back into physical space and the properties of the associated grids were examined. Advantages and disadvantages of both approaches are discussed.
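    As a small illustration of the first approach, the sketch below applies the Karman-Trefftz transformation to a circle; with exponent n = 2 it reduces to the Joukowski map. The circle centre, radius and exponent are illustrative values, not the aerobrake geometry of the study, and the Point Wise Schwarz-Christoffel approach is not shown.

        # Sketch of the Karman-Trefftz transformation
        #   z = n*b * [(zeta+b)^n + (zeta-b)^n] / [(zeta+b)^n - (zeta-b)^n]
        # which reduces to the Joukowski map for n = 2. Parameters are illustrative.
        import numpy as np

        def karman_trefftz(zeta, b=1.0, n=1.94):
            p, q = (zeta + b) ** n, (zeta - b) ** n
            return n * b * (p + q) / (p - q)

        # Circle in the mapped (zeta) plane, offset so it passes through zeta = +b.
        theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
        center = -0.1 + 0.1j
        radius = abs(1.0 - center)            # ensures the circle passes through +b
        zeta = center + radius * np.exp(1j * theta)

        z = karman_trefftz(zeta)
        print(z[:3])                          # mapped contour points in the physical plane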

  14. Scaling up strategies of the chronic respiratory disease programme of the European Innovation Partnership on Active and Healthy Ageing (Action Plan B3: Area 5).

    PubMed

    Bousquet, J; Farrell, J; Crooks, G; Hellings, P; Bel, E H; Bewick, M; Chavannes, N H; de Sousa, J Correia; Cruz, A A; Haahtela, T; Joos, G; Khaltaev, N; Malva, J; Muraro, A; Nogues, M; Palkonen, S; Pedersen, S; Robalo-Cordeiro, C; Samolinski, B; Strandberg, T; Valiulis, A; Yorgancioglu, A; Zuberbier, T; Bedbrook, A; Aberer, W; Adachi, M; Agusti, A; Akdis, C A; Akdis, M; Ankri, J; Alonso, A; Annesi-Maesano, I; Ansotegui, I J; Anto, J M; Arnavielhe, S; Arshad, H; Bai, C; Baiardini, I; Bachert, C; Baigenzhin, A K; Barbara, C; Bateman, E D; Beghé, B; Kheder, A Ben; Bennoor, K S; Benson, M; Bergmann, K C; Bieber, T; Bindslev-Jensen, C; Bjermer, L; Blain, H; Blasi, F; Boner, A L; Bonini, M; Bonini, S; Bosnic-Anticevitch, S; Boulet, L P; Bourret, R; Bousquet, P J; Braido, F; Briggs, A H; Brightling, C E; Brozek, J; Buhl, R; Burney, P G; Bush, A; Caballero-Fonseca, F; Caimmi, D; Calderon, M A; Calverley, P M; Camargos, P A M; Canonica, G W; Camuzat, T; Carlsen, K H; Carr, W; Carriazo, A; Casale, T; Cepeda Sarabia, A M; Chatzi, L; Chen, Y Z; Chiron, R; Chkhartishvili, E; Chuchalin, A G; Chung, K F; Ciprandi, G; Cirule, I; Cox, L; Costa, D J; Custovic, A; Dahl, R; Dahlen, S E; Darsow, U; De Carlo, G; De Blay, F; Dedeu, T; Deleanu, D; De Manuel Keenoy, E; Demoly, P; Denburg, J A; Devillier, P; Didier, A; Dinh-Xuan, A T; Djukanovic, R; Dokic, D; Douagui, H; Dray, G; Dubakiene, R; Durham, S R; Dykewicz, M S; El-Gamal, Y; Emuzyte, R; Fabbri, L M; Fletcher, M; Fiocchi, A; Fink Wagner, A; Fonseca, J; Fokkens, W J; Forastiere, F; Frith, P; Gaga, M; Gamkrelidze, A; Garces, J; Garcia-Aymerich, J; Gemicioğlu, B; Gereda, J E; González Diaz, S; Gotua, M; Grisle, I; Grouse, L; Gutter, Z; Guzmán, M A; Heaney, L G; Hellquist-Dahl, B; Henderson, D; Hendry, A; Heinrich, J; Heve, D; Horak, F; Hourihane, J O' B; Howarth, P; Humbert, M; Hyland, M E; Illario, M; Ivancevich, J C; Jardim, J R; Jares, E J; Jeandel, C; Jenkins, C; Johnston, S L; Jonquet, O; Julge, K; Jung, K S; Just, J; Kaidashev, I; Kaitov, M R; Kalayci, O; Kalyoncu, A F; Keil, T; Keith, P K; Klimek, L; Koffi N'Goran, B; Kolek, V; Koppelman, G H; Kowalski, M L; Kull, I; Kuna, P; Kvedariene, V; Lambrecht, B; Lau, S; Larenas-Linnemann, D; Laune, D; Le, L T T; Lieberman, P; Lipworth, B; Li, J; Lodrup Carlsen, K; Louis, R; MacNee, W; Magard, Y; Magnan, A; Mahboub, B; Mair, A; Majer, I; Makela, M J; Manning, P; Mara, S; Marshall, G D; Masjedi, M R; Matignon, P; Maurer, M; Mavale-Manuel, S; Melén, E; Melo-Gomes, E; Meltzer, E O; Menzies-Gow, A; Merk, H; Michel, J P; Miculinic, N; Mihaltan, F; Milenkovic, B; Mohammad, G M Y; Molimard, M; Momas, I; Montilla-Santana, A; Morais-Almeida, M; Morgan, M; Mösges, R; Mullol, J; Nafti, S; Namazova-Baranova, L; Naclerio, R; Neou, A; Neffen, H; Nekam, K; Niggemann, B; Ninot, G; Nyembue, T D; O'Hehir, R E; Ohta, K; Okamoto, Y; Okubo, K; Ouedraogo, S; Paggiaro, P; Pali-Schöll, I; Panzner, P; Papadopoulos, N; Papi, A; Park, H S; Passalacqua, G; Pavord, I; Pawankar, R; Pengelly, R; Pfaar, O; Picard, R; Pigearias, B; Pin, I; Plavec, D; Poethig, D; Pohl, W; Popov, T A; Portejoie, F; Potter, P; Postma, D; Price, D; Rabe, K F; Raciborski, F; Radier Pontal, F; Repka-Ramirez, S; Reitamo, S; Rennard, S; Rodenas, F; Roberts, J; Roca, J; Rodriguez Mañas, L; Rolland, C; Roman Rodriguez, M; Romano, A; Rosado-Pinto, J; Rosario, N; Rosenwasser, L; Rottem, M; Ryan, D; Sanchez-Borges, M; Scadding, G K; Schunemann, H J; Serrano, E; Schmid-Grendelmeier, P; Schulz, H; Sheikh, A; Shields, M; Siafakas, N; Sibille, Y; Similowski, T; Simons, F E 
R; Sisul, J C; Skrindo, I; Smit, H A; Solé, D; Sooronbaev, T; Spranger, O; Stelmach, R; Sterk, P J; Sunyer, J; Thijs, C; To, T; Todo-Bom, A; Triggiani, M; Valenta, R; Valero, A L; Valia, E; Valovirta, E; Van Ganse, E; van Hage, M; Vandenplas, O; Vasankari, T; Vellas, B; Vestbo, J; Vezzani, G; Vichyanond, P; Viegi, G; Vogelmeier, C; Vontetsianos, T; Wagenmann, M; Wallaert, B; Walker, S; Wang, D Y; Wahn, U; Wickman, M; Williams, D M; Williams, S; Wright, J; Yawn, B P; Yiallouros, P K; Yusuf, O M; Zaidi, A; Zar, H J; Zernotti, M E; Zhang, L; Zhong, N; Zidarn, M; Mercier, J

    2016-01-01

    Action Plan B3 of the European Innovation Partnership on Active and Healthy Ageing (EIP on AHA) focuses on the integrated care of chronic diseases. Area 5 (Care Pathways) was initiated using chronic respiratory diseases as a model. The chronic respiratory disease action plan includes (1) AIRWAYS integrated care pathways (ICPs), (2) the joint initiative between the Reference site MACVIA-LR (Contre les MAladies Chroniques pour un VIeillissement Actif) and ARIA (Allergic Rhinitis and its Impact on Asthma), (3) Commitments for Action to the European Innovation Partnership on Active and Healthy Ageing and the AIRWAYS ICPs network. It is deployed in collaboration with the World Health Organization Global Alliance against Chronic Respiratory Diseases (GARD). The European Innovation Partnership on Active and Healthy Ageing has proposed a 5-step framework for developing an individual scaling up strategy: (1) what to scale up: (1-a) databases of good practices, (1-b) assessment of viability of the scaling up of good practices, (1-c) classification of good practices for local replication and (2) how to scale up: (2-a) facilitating partnerships for scaling up, (2-b) implementation of key success factors and lessons learnt, including emerging technologies for individualised and predictive medicine. This strategy has already been applied to the chronic respiratory disease action plan of the European Innovation Partnership on Active and Healthy Ageing. PMID:27478588

  15. Non-Intrusive Measurement Techniques Applied to the Hybrid Solid Fuel Degradation

    NASA Astrophysics Data System (ADS)

    Cauty, F.

    2004-10-01

    Knowledge of the solid fuel regression rate and of the time evolution of the grain geometry is required for hybrid motor design and for control of its operating conditions. Two non-intrusive techniques (NDT), both based on wave propagation through the materials, have been applied to hybrid propulsion: X-rays and ultrasound. X-ray techniques allow local thickness measurements (attenuated signal level) using small probes, or 2D images (real-time radiography), with a trade-off between the size of the field of view and accuracy. Besides the safety hazards associated with high-intensity X-ray systems, the image analysis requires quite complex post-processing techniques. The ultrasound technique is more widely used in energetic material applications, including hybrid fuels. Depending upon the transducer size and the associated equipment, the application domain is large, from tiny samples to the quad-port wagon-wheel grain of the 1.1 MN thrust HPDP motor. The effect of the relevant physical quantities has to be taken into account in the wave propagation analysis. Across the various applications, there is no single, perfect experimental method for measuring the fuel regression rate; the best solution may be obtained by combining two techniques at the same time, each enhancing the quality of the overall data.

  16. The Atomic AXAFS and XANES Techniques as Applied to Heterogeneous Catalysis and Electrocatalysis

    SciTech Connect

    Ramaker, D.; Koningsberger, D

    2010-01-01

    X-Ray absorption spectroscopy (XAFS) is an attractive in situ and in operando technique. In recent years, the more conventional extended X-ray absorption fine structure (EXAFS) data analysis technique has been complemented by two newer analysis methods: the 'atomic' XAFS (AXAFS) technique, which analyzes the scattering from the absorber atom itself, and the Δμ XANES technique, which uses a difference method to isolate the changes in the X-ray absorption near edge structure (XANES) due to adsorbates on a metal surface. With AXAFS it is possible to follow the electronic effect a support has on a metal particle; with Δμ XANES it is possible to determine the adsorbate, the specific adsorption sites and adsorbate coverage on a metal catalyst. This unprecedented new information helps a great deal to unravel the complex kinetic mechanisms operating in working reactors or fuel cell systems. The fundamental principles and methodology for applying the AXAFS and Δμ XANES techniques are given here, and then specific applications are summarized, including H adsorption on supported Pt in the gas phase, water activation at a Pt cathode and methanol oxidation at a Pt anode in an electrochemical cell, sulfur oxidation on Pt, and oxygen reduction on a Au/SnOx cathode. Finally, the future outlook for time and/or space resolved applications of these techniques is contemplated.
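    A minimal sketch of the difference idea behind the Δμ technique: two XANES spectra (clean and adsorbate-covered surface) are normalized to a unit edge step and subtracted point by point. The spectra and the normalization window below are synthetic placeholders; the authors' actual alignment and normalization procedure is not reproduced.

        # Sketch of the Delta-mu idea: normalize two XANES spectra to a unit edge step
        # and take the point-by-point difference mu(covered) - mu(clean).
        import numpy as np

        def edge_step_normalize(energy_ev, mu, post_edge=(50.0, 150.0), e0=0.0):
            """Scale mu so the mean absorption in a post-edge window (relative to e0) is 1."""
            rel = energy_ev - e0
            window = (rel >= post_edge[0]) & (rel <= post_edge[1])
            return mu / mu[window].mean()

        energy = np.linspace(-30.0, 150.0, 400)                        # eV relative to the edge
        clean = 1.0 / (1.0 + np.exp(-energy / 2.0))                    # toy clean-surface XANES
        covered = clean + 0.08 * np.exp(-((energy - 8.0) / 6.0) ** 2)  # toy adsorbate feature

        delta_mu = edge_step_normalize(energy, covered) - edge_step_normalize(energy, clean)
        print(f"max |delta-mu| = {np.abs(delta_mu).max():.3f} near "
              f"{energy[np.abs(delta_mu).argmax()]:.1f} eV above the edge")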

  17. Confirmation of standard error analysis techniques applied to EXAFS using simulations

    SciTech Connect

    Booth, Corwin H; Hu, Yung-Jin

    2009-12-14

    Systematic uncertainties, such as those in calculated backscattering amplitudes, crystal glitches, etc., not only limit the ultimate accuracy of the EXAFS technique, but also affect the covariance matrix representation of real parameter errors in typical fitting routines. Despite major advances in EXAFS analysis and in understanding all potential uncertainties, these methods are not routinely applied by all EXAFS users. Consequently, reported parameter errors are not reliable in many EXAFS studies in the literature. This situation has made many EXAFS practitioners leery of conventional error analysis applied to EXAFS data. However, conventional error analysis, if properly applied, can teach us more about our data, and even about the power and limitations of the EXAFS technique. Here, we describe the proper application of conventional error analysis to r-space fitting to EXAFS data. Using simulations, we demonstrate the veracity of this analysis by, for instance, showing that the number of independent data points from Stern's rule is balanced by the degrees of freedom obtained from a χ2 statistical analysis. By applying such analysis to real data, we determine the quantitative effect of systematic errors. In short, this study is intended to remind the EXAFS community about the role of fundamental noise distributions in interpreting our final results.
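    For orientation, the sketch below evaluates Stern's rule for the number of independent data points and a reduced chi-square of the form commonly used in r-space EXAFS fitting. Conventions differ (some omit the "+ 2" term), so these formulas should be read as one common choice rather than the exact definitions used in this paper, and the fit residuals are placeholders.

        # Sketch: Stern's rule for independent data points and a reduced chi-square
        # as commonly defined for r-space EXAFS fits. Conventions vary; illustrative only.
        import numpy as np

        def n_independent(delta_k, delta_r):
            """Stern's rule: N_idp ~ 2*dk*dR/pi + 2."""
            return 2.0 * delta_k * delta_r / np.pi + 2.0

        def reduced_chi_square(residuals, uncertainties, n_idp, n_varied):
            """chi2_nu = [N_idp / (N_pts * nu)] * sum((residual/sigma)^2), nu = N_idp - n_varied."""
            nu = n_idp - n_varied
            chi2 = (n_idp / len(residuals)) * np.sum((residuals / uncertainties) ** 2)
            return chi2 / nu

        # Hypothetical fit window: k = 3-12 A^-1, R = 1-3 A, 5 varied parameters.
        n_idp = n_independent(delta_k=9.0, delta_r=2.0)
        rng = np.random.default_rng(5)
        resid = 0.01 * rng.standard_normal(120)          # placeholder fit residuals
        print(f"N_idp ~ {n_idp:.1f}, reduced chi-square ~ "
              f"{reduced_chi_square(resid, 0.01 * np.ones(120), n_idp, 5):.2f}")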

  18. A THREE-DIMENSIONAL MICROFLUIDIC APPROACH TO SCALING UP MICROENCAPSULATION OF CELLS

    PubMed Central

    Tendulkar, Sameer; Mirmalek-Sani, Sayed-Hadi; Childers, Charles; Saul, Justin; Opara, Emmanuel C; Ramasubramanian, Melur K.

    2013-01-01

    Current applications of the microencapsulation technique include the use of encapsulated islet cells to treat Type 1 diabetes, and encapsulated hepatocytes for providing temporary but adequate metabolic support to allow spontaneous liver regeneration, or as a bridge to liver transplantation for patients with chronic liver disease. Microcapsules can also be used for controlled delivery of therapeutic drugs. The two most widely used devices for microencapsulation are the air-syringe pump droplet generator and the electrostatic bead generator, each of which is fitted with a single needle through which droplets of cells suspended in alginate solution are produced and cross-linked into microbeads. A major drawback in the design of these instruments is that they are incapable of producing sufficient numbers of microcapsules in a short time period to permit mass production of encapsulated, viable cells for transplantation in large animals and humans. We present in this paper a microfluidic approach to scaling up cell and protein encapsulation. The microfluidic chip consists of a 3D air supply and a multi-nozzle outlet for microcapsule generation. It has one alginate inlet and one compressed-air inlet. The outlet has 8 nozzles, each having a 380-micrometer inner diameter, which produce hydrogel microspheres ranging from 500–700 µm in diameter. These nozzles are concentrically surrounded by air nozzles with a 2 mm inner diameter. Two tubes connected at the top allow the air to escape as the alginate solution fills the chamber. A 115 V variable-flow pump is used to pump the alginate solution, and Tygon® tubing is used to connect the in-house air supply to the air channel and the peristaltic/syringe pump to the alginate chamber. A pressure regulator is used to control the air flow rate. We have encapsulated islets and proteins with this high-throughput device, which is expected to improve product quality control in the microencapsulation of cells, and hence the outcome of

  19. Comparison between different techniques applied to quartz CPO determination in granitoid mylonites

    NASA Astrophysics Data System (ADS)

    Fazio, Eugenio; Punturo, Rosalda; Cirrincione, Rosolino; Kern, Hartmut; Wenk, Hans-Rudolph; Pezzino, Antonino; Goswami, Shalini; Mamtani, Manish

    2016-04-01

    Since the second half of the last century, several techniques have been adopted to resolve the crystallographic preferred orientation (CPO) of the major minerals constituting crustal and mantle rocks. To this aim, many efforts have been made to increase the accuracy of the analytical devices as well as to progressively reduce the time needed to perform microstructural analysis. It is worth noting that many of these microstructural studies deal with quartz CPO because of the wide occurrence of this mineral phase in crustal rocks as well as its quite simple chemical composition. In the present work, four different techniques were applied to define the CPOs of dynamically recrystallized quartz domains from naturally deformed rocks collected from a ductile crustal-scale shear zone, in order to compare their advantages and limitations. The selected Alpine shear zone, located in the Aspromonte Massif (Calabrian Peloritani Orogen, southern Italy), involves granitoid lithotypes. The adopted methods span from the "classical" universal stage (US) to the image analysis technique (CIP), electron back-scattered diffraction (EBSD) and time-of-flight neutron diffraction (TOF). When compared, the bulk texture pole figures obtained by means of these different techniques show a good correlation. Advances in the analytical techniques used for microstructural investigations are outlined by discussing the quartz CPO results presented in this study.

  20. Enabling and challenging factors in institutional reform: The case of SCALE-UP

    NASA Astrophysics Data System (ADS)

    Foote, Kathleen; Knaub, Alexis; Henderson, Charles; Dancy, Melissa; Beichner, Robert J.

    2016-06-01

    While many innovative teaching strategies exist, integration into undergraduate science teaching has been frustratingly slow. This study aims to understand the low uptake of research-based instructional innovations by studying 21 successful implementations of the Student Centered Active Learning with Upside-down Pedagogies (SCALE-UP) instructional reform. SCALE-UP significantly restructures the classroom environment and pedagogy to promote highly active and interactive instruction. Although originally designed for university introductory physics courses, SCALE-UP has spread to many other disciplines at hundreds of departments around the world. This study reports findings from in-depth, open-ended interviews with 21 key contact people involved with successful secondary implementations of SCALE-UP throughout the United States. We defined successful implementations as those who restructured their pedagogy and classroom and sustained and/or spread the change. Interviews were coded to identify the most common enabling and challenging factors during reform implementation and compared to the theoretical framework of Kotter's 8-step Change Model. The most common enabling influences that emerged are documenting and leveraging evidence of local success, administrative support, interaction with outside SCALE-UP user(s), and funding. Many challenges are linked to the lack of these enabling factors including difficulty finding funding, space, and administrative and/or faculty support for reform. Our focus on successful secondary implementations meant that most interviewees were able to overcome challenges. Presentation of results is illuminated with case studies, quotes, and examples that can help secondary implementers with SCALE-UP reform efforts specifically. We also discuss the implications for policy makers, researchers, and the higher education community concerned with initiating structural change.

  1. Scaling up antiretroviral treatment and improving patient retention in care: lessons from Ethiopia, 2005-2013

    PubMed Central

    2014-01-01

    Background Antiretroviral treatment (ART) was provided to more than nine million people by the end of 2012. Although ART programs in resource-limited settings have expanded treatment, inadequate retention in care has been a challenge. Ethiopia has been scaling up ART and improving retention (defined as continuous engagement of patients in care) in care. We aimed to analyze the ART program in Ethiopia. Methods A mix of quantitative and qualitative methods was used. Routine ART program data was used to study ART scale up and patient retention in care. In-depth interviews and focus group discussions were conducted with program managers. Results The number of people receiving ART in Ethiopia increased from less than 9,000 in 2005 to more than 439,000 in 2013. Initially, the public health approach, health system strengthening, community mobilization and provision of care and support services allowed scaling up of ART services. While ART was being scaled up, retention was recognized to be insufficient. To improve retention, a second wave of interventions, related to programmatic, structural, socio-cultural, and patient information systems, has been implemented. The retention rate increased from 77% in 2004/5 to 92% in 2012/13. Conclusion Ethiopia has been able to scale up ART and improve retention in care in spite of its limited resources. This has been possible due to interventions by the ART program, supported by health systems strengthening, community-based organizations and the communities themselves. ART programs in resource-limited settings need to put in place similar measures to scale up ART and retain patients in care. PMID:24886686

  2. A comparative study of progressive versus successive spectrophotometric resolution techniques applied for pharmaceutical ternary mixtures.

    PubMed

    Saleh, Sarah S; Lotfy, Hayam M; Hassan, Nagiba Y; Salem, Hesham

    2014-11-11

    This work represents a comparative study of a novel progressive spectrophotometric resolution technique namely, amplitude center method (ACM), versus the well-established successive spectrophotometric resolution techniques namely; successive derivative subtraction (SDS); successive derivative of ratio spectra (SDR) and mean centering of ratio spectra (MCR). All the proposed spectrophotometric techniques consist of several consecutive steps utilizing ratio and/or derivative spectra. The novel amplitude center method (ACM) can be used for the determination of ternary mixtures using single divisor where the concentrations of the components are determined through progressive manipulation performed on the same ratio spectrum. Those methods were applied for the analysis of the ternary mixture of chloramphenicol (CHL), dexamethasone sodium phosphate (DXM) and tetryzoline hydrochloride (TZH) in eye drops in the presence of benzalkonium chloride as a preservative. The proposed methods were checked using laboratory-prepared mixtures and were successfully applied for the analysis of pharmaceutical formulation containing the cited drugs. The proposed methods were validated according to the ICH guidelines. A comparative study was conducted between those methods regarding simplicity, limitation and sensitivity. The obtained results were statistically compared with those obtained from the official BP methods, showing no significant difference with respect to accuracy and precision. PMID:24873889
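    A minimal sketch of the derivative-of-ratio-spectra step that SDR-type methods build on: dividing the mixture spectrum by one component's spectrum (the divisor) turns that component's contribution into a constant, which differentiation then removes. The Gaussian bands below are synthetic stand-ins, not the actual CHL, DXM or TZH spectra, and the full successive procedures (SDS, MCR, ACM) are not reproduced.

        # Sketch of the derivative-of-ratio-spectra step: division by a divisor
        # spectrum makes that component's contribution constant; the derivative
        # removes it. Spectra are synthetic Gaussian bands, not real drug spectra.
        import numpy as np

        def gaussian_band(wl, center, width, amp=1.0):
            return amp * np.exp(-((wl - center) / width) ** 2)

        wavelength = np.linspace(220.0, 320.0, 500)
        comp_x = gaussian_band(wavelength, 255.0, 10.0)      # analyte of interest
        comp_y = gaussian_band(wavelength, 278.0, 12.0)      # interfering component
        mixture = 0.8 * comp_x + 1.3 * comp_y

        ratio = mixture / (comp_y + 1e-12)                   # divide by the divisor spectrum
        d_ratio = np.gradient(ratio, wavelength)             # derivative removes the constant 1.3

        # The derivative amplitude at a chosen wavelength tracks comp_x concentration.
        idx = np.argmin(np.abs(wavelength - 245.0))
        print(f"d(ratio)/d(lambda) at 245 nm: {d_ratio[idx]:.4f}")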

  3. A comparative study of progressive versus successive spectrophotometric resolution techniques applied for pharmaceutical ternary mixtures

    NASA Astrophysics Data System (ADS)

    Saleh, Sarah S.; Lotfy, Hayam M.; Hassan, Nagiba Y.; Salem, Hesham

    2014-11-01

    This work represents a comparative study of a novel progressive spectrophotometric resolution technique namely, amplitude center method (ACM), versus the well-established successive spectrophotometric resolution techniques namely; successive derivative subtraction (SDS); successive derivative of ratio spectra (SDR) and mean centering of ratio spectra (MCR). All the proposed spectrophotometric techniques consist of several consecutive steps utilizing ratio and/or derivative spectra. The novel amplitude center method (ACM) can be used for the determination of ternary mixtures using single divisor where the concentrations of the components are determined through progressive manipulation performed on the same ratio spectrum. Those methods were applied for the analysis of the ternary mixture of chloramphenicol (CHL), dexamethasone sodium phosphate (DXM) and tetryzoline hydrochloride (TZH) in eye drops in the presence of benzalkonium chloride as a preservative. The proposed methods were checked using laboratory-prepared mixtures and were successfully applied for the analysis of pharmaceutical formulation containing the cited drugs. The proposed methods were validated according to the ICH guidelines. A comparative study was conducted between those methods regarding simplicity, limitation and sensitivity. The obtained results were statistically compared with those obtained from the official BP methods, showing no significant difference with respect to accuracy and precision.

  4. Roller compaction process development and scale up using Johanson model calibrated with instrumented roll data.

    PubMed

    Nesarikar, Vishwas V; Patel, Chandrakant; Early, William; Vatsaraj, Nipa; Sprockel, Omar; Jerzweski, Robert

    2012-10-15

    Roller compaction is a dry granulation process used to convert powder blends into free flowing agglomerates. During scale up or transfer of roller compaction process, it is critical to maintain comparable ribbon densities at each scale in order to achieve similar tensile strengths and subsequently similar particle size distribution of milled material. Similar ribbon densities can be reached by maintaining analogous normal stress applied by the rolls on ribbon for a given gap between rolls. Johanson (1965) developed a model to predict normal stress based on material properties and roll diameter. However, the practical application of Johanson model to estimate normal stress on the ribbon is limited due to its requirement of accurate estimate of nip pressure i.e. pressure at the nip angle. Another weakness of Johanson model is the assumption of a fixed angle of wall friction that leads to use of a fixed nip angle in the model. To overcome the above mentioned limitations, we developed a novel approach using roll force equations based on a modified Johanson model in which the requirement of pressure value at nip angle was eliminated. An instrumented roll on WP120 roller compactor was used to collect normal stress data measured at three locations across the width of a roll (P1, P2, P3), as well as gap and nip angle data on ribbon for placebo and various active blends along with corresponding process parameters. The nip angles were estimated directly using experimental pressure profile data of each run. The roll force equation of Johanson model was validated using normal stress, gap, and nip angle data of the placebo runs. The calculated roll force values compared well with those determined from the roll force equation provided for the Alexanderwerk(®) WP120 roller compactor. Subsequently, the calculation was reversed to estimate normal stress and corresponding ribbon densities as a function of gap and RFU (roll force per unit roll width). A placebo model was developed

  5. Near-infrared spectroscopy and pattern recognition techniques applied to the identification of Jinhua ham

    NASA Astrophysics Data System (ADS)

    Li, Honglian; Zhao, Zhilei; Pang, Yanping; Wu, Guancheng; Wang, Yanfeng; Li, Xiaoting

    2009-11-01

    Near-infrared (NIR) diffuse reflectance spectroscopy and pattern recognition techniques were applied to develop a fast identification method for Jinhua ham. The samples were collected from different manufacturers: nineteen Jinhua ham samples and four Xuanwei ham samples. NIR spectra were pretreated with second-derivative calculation and vector normalization. Three pattern recognition techniques - cluster analysis, conformity test and principal component analysis (PCA) - were used separately to qualify Jinhua ham. All three methods distinguished Jinhua ham successfully. The results indicated that a 100% recognition ratio was achieved and that the PCA method performed best. Overall, NIR reflectance spectroscopy combined with pattern recognition is shown to have significant potential as a rapid and accurate method for the identification of ham.
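    A sketch of the preprocessing and PCA steps named above: second-derivative calculation (here via a Savitzky-Golay filter), vector normalization, and principal component analysis computed from an SVD. The spectra are synthetic placeholders standing in for measured NIR data; cluster analysis and the conformity test are not shown.

        # Sketch: second-derivative preprocessing, vector normalization, then PCA
        # scores via SVD. Spectra are synthetic placeholders, not measured NIR data.
        import numpy as np
        from scipy.signal import savgol_filter

        def preprocess(spectra, window=11, polyorder=2):
            d2 = savgol_filter(spectra, window, polyorder, deriv=2, axis=1)   # 2nd derivative
            return d2 / np.linalg.norm(d2, axis=1, keepdims=True)             # vector normalization

        def pca_scores(X, n_components=2):
            Xc = X - X.mean(axis=0)
            _, _, vt = np.linalg.svd(Xc, full_matrices=False)
            return Xc @ vt[:n_components].T

        rng = np.random.default_rng(2)
        wl = np.linspace(1100.0, 2500.0, 300)
        class_a = np.exp(-((wl - 1700) / 150) ** 2) + 0.02 * rng.standard_normal((19, wl.size))
        class_b = np.exp(-((wl - 1780) / 150) ** 2) + 0.02 * rng.standard_normal((4, wl.size))
        scores = pca_scores(preprocess(np.vstack([class_a, class_b])))
        print(scores[:3], scores[-3:])     # the two groups should separate in PC space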

  6. Using wavelet denoising and mathematical morphology in the segmentation technique applied to blood cells images.

    PubMed

    Boix, Macarena; Cantó, Begoña

    2013-04-01

    Accurate image segmentation is used in medical diagnosis since this technique is a noninvasive pre-processing step for biomedical treatment. In this work we present an efficient segmentation method for medical image analysis; in particular, blood cells can be segmented with this method. To that end, we combine the wavelet transform with morphological operations. Moreover, the wavelet thresholding technique is used to eliminate noise and prepare the image for suitable segmentation. In the wavelet denoising step we determine the wavelet that yields a segmentation with the largest area within the cell. We study different wavelet families and conclude that the db1 wavelet is the best choice, and that it can serve for later work on blood pathologies. The proposed method produces good results when applied to several images. Finally, the proposed algorithm, implemented in the MATLAB environment, is verified on selected blood cell images. PMID:23458301
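    A minimal sketch of the pipeline described above, assuming a db1 (Haar) wavelet soft-threshold denoising step followed by Otsu thresholding and morphological clean-up; the threshold values, structuring element and synthetic test image are illustrative choices, not the parameters used in the paper.

        # Sketch: db1 wavelet soft-threshold denoising, then Otsu thresholding and
        # morphological clean-up. A synthetic image of disk-shaped "cells" stands in
        # for real blood-cell micrographs; threshold values are illustrative.
        import numpy as np
        import pywt
        from skimage.filters import threshold_otsu
        from skimage.morphology import binary_opening, disk, remove_small_objects

        def wavelet_denoise(image, wavelet="db1", level=2, thresh=0.3):
            coeffs = pywt.wavedec2(image, wavelet, level=level)
            # Soft-threshold the detail coefficients, keep the approximation untouched.
            denoised = [coeffs[0]] + [
                tuple(pywt.threshold(d, thresh, mode="soft") for d in detail)
                for detail in coeffs[1:]
            ]
            return pywt.waverec2(denoised, wavelet)

        def segment_cells(image):
            smooth = wavelet_denoise(image)
            binary = smooth > threshold_otsu(smooth)          # global threshold
            binary = binary_opening(binary, disk(2))          # remove specks and thin bridges
            return remove_small_objects(binary, min_size=30)

        # Synthetic test image: two bright disks on a noisy background.
        rng = np.random.default_rng(3)
        yy, xx = np.mgrid[0:128, 0:128]
        cells = ((xx - 40) ** 2 + (yy - 40) ** 2 < 15 ** 2) | ((xx - 90) ** 2 + (yy - 80) ** 2 < 12 ** 2)
        image = cells.astype(float) + 0.3 * rng.standard_normal((128, 128))

        mask = segment_cells(image)
        print(f"segmented foreground fraction: {mask.mean():.3f}")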

  7. Geophysical techniques applied to urban planning in complex near surface environments. Examples of Zaragoza, NE Spain

    NASA Astrophysics Data System (ADS)

    Pueyo-Anchuela, Ó.; Casas-Sainz, A. M.; Soriano, M. A.; Pocoví-Juan, A.

    Complex shallow subsurface geological environments represent an important handicap for urban and building projects. The geological features of the Central Ebro Basin - sharp lateral changes in Quaternary deposits, alluvial karst phenomena and anthropic activity - mean that future urban areas cannot be characterized from isolated geomechanical tests or from incorrectly dimensioned geophysical surveys alone. This complexity is analyzed here in two test fields: (i) one linked to flat-bottomed valleys with an irregular distribution of Quaternary deposits related to sharp lateral facies changes and an irregular position of the preconsolidated substratum, and (ii) a second one with similar complexities in the alluvial deposits plus karst activity linked to dissolution of the underlying evaporite substratum. The results show that different geophysical techniques yield similar geological models in the first case (flat-bottomed valleys), whereas only the combined application of several geophysical techniques allows the complexities of the geological model to be evaluated correctly in the second case (alluvial karst). In this second case, the geological and surface information make it possible to refine the sensitivity of the applied geophysical techniques to different indicators of karst activity. In both cases 3D models are needed to correctly distinguish lateral alluvial sedimentary changes from superimposed karst activity.

  8. A systems approach of the nondestructive evaluation techniques applied to Scout solid rocket motors.

    NASA Technical Reports Server (NTRS)

    Oaks, A. E.

    1971-01-01

    Review and appraisal of the status of the nondestructive tests applied to Scout solid-propellant rocket motors, using analytical techniques to evaluate radiography for detecting internal discontinuities such as voids and unbonds. Information relating to selecting, performing, controlling, and evaluating the results of NDE tests was reduced to a common simplified format. With these data and the results of the analytical studies performed, it was possible to make the basic appraisals of the ability of a test to meet all pertinent acceptance criteria and, where necessary, provide suggestions to improve the situation.

  9. Contact Nd:YAG Laser Technique Applied To Head And Neck Reconstructive Surgery

    NASA Astrophysics Data System (ADS)

    Nobori, Takuo; Miyazaki, Yasuhiro; Moriyama, Ichiro; Sannikorn, Phakdee; Ohyama, Masaru

    1989-09-01

    A contact Nd:YAG laser system with a ceramic tip was applied to head and neck reconstructive surgery. Plastic surgery was performed in 78 patients with head and neck diseases during the past 11 years. Since 1984, reconstructive surgery was performed in 60 of these cases, and in 45 of them (75%) contact Nd:YAG laser surgery was used. With this laser technique, intraoperative bleeding was roughly half that of the conventional procedure.

  10. Grid-based Moment Tensor Inversion Technique Apply for Earthquakes Offshore of Northeast Taiwan

    NASA Astrophysics Data System (ADS)

    Cheng, H.; Lee, S.; Ma, K.

    2010-12-01

    We use a grid-based moment tensor inversion technique and continuous broadband recordings to monitor, in real time, earthquakes offshore of northeast Taiwan. The moment tensor inversion technique and a grid search scheme are applied to obtain source parameters, including the hypocenter, moment magnitude, and focal mechanism. In Taiwan, routine moment tensor solutions are reported by the CWB (Central Weather Bureau) and BATS (Broadband Array in Taiwan for Seismology), both of which require some lag time for information on event time and location before performing CMT (centroid moment tensor) analysis. By using the grid-based moment tensor inversion technique, the event location and focal mechanism can be obtained simultaneously within about two minutes of the occurrence of the earthquake. The inversion procedure is based on a database of 1-D Green's functions calculated with the frequency-wavenumber (fk) method. The offshore region northeast of Taiwan was chosen as our first test area, covering 121.5E to 123E, 23.5N to 25N, and depths down to 136 km. A 3D grid system is set up in this study area with an average grid size of 10 x 10 x 10 km3. We compare our results with past earthquakes from 2008 to 2010 that had been analyzed by BATS CMT, and we also compare the event times detected by GridMT with the CWB earthquake reports. The results indicate that the grid-based moment tensor inversion system is efficient and can feasibly be applied to real-time monitoring of local seismic activity. Our long-term goal is to use the GridMT technique with fully 3-D Green's functions for the whole of Taiwan.
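    The core of a GridMT-style monitor can be sketched as a loop over candidate grid points, each solving a linear least-squares problem for the six moment tensor elements against precomputed Green's functions and keeping the point with the highest variance reduction. The Green's functions and "observed" waveforms below are random placeholders; a real implementation would use the fk-computed 1-D Green's function database described above.

        # Sketch of the grid-search core: for every candidate source position, solve
        # the linear system d ~ G m for the moment tensor and keep the grid point with
        # the highest variance reduction. G matrices and data are random placeholders.
        import numpy as np

        def variance_reduction(d_obs, d_syn):
            return 1.0 - np.sum((d_obs - d_syn) ** 2) / np.sum(d_obs ** 2)

        def grid_search_mt(d_obs, greens_by_grid):
            """greens_by_grid: dict mapping grid index -> G matrix (n_samples x 6)."""
            best = None
            for grid_id, G in greens_by_grid.items():
                m, *_ = np.linalg.lstsq(G, d_obs, rcond=None)   # 6 moment-tensor elements
                vr = variance_reduction(d_obs, G @ m)
                if best is None or vr > best[2]:
                    best = (grid_id, m, vr)
            return best

        rng = np.random.default_rng(4)
        grids = {i: rng.standard_normal((600, 6)) for i in range(50)}   # placeholder G's
        true_m = np.array([1.0, -0.5, -0.5, 0.2, 0.0, 0.1])             # toy source
        d = grids[17] @ true_m + 0.05 * rng.standard_normal(600)

        grid_id, m_est, vr = grid_search_mt(d, grids)
        print(f"best grid point: {grid_id}, variance reduction: {vr:.2f}")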

  11. Scaling up watershed model parameters: flow and load simulations of the Edisto River Basin, South Carolina, 2007-09

    USGS Publications Warehouse

    Feaster, Toby D.; Benedict, Stephen T.; Clark, Jimmy M.; Bradley, Paul M.; Conrads, Paul A.

    2014-01-01

    As part of an ongoing effort by the U.S. Geological Survey to expand the understanding of relations among hydrologic, geochemical, and ecological processes that affect fish-tissue mercury concentrations within the Edisto River Basin, analyses and simulations of the hydrology of the Edisto River Basin were made using the topography-based hydrological model (TOPMODEL). A primary focus of the investigation was to assess the potential for scaling up a previous application of TOPMODEL for the McTier Creek watershed, which is a small headwater catchment to the Edisto River Basin. Scaling up was done in a step-wise manner, beginning with applying the calibration parameters, meteorological data, and topographic-wetness-index data from the McTier Creek TOPMODEL to the Edisto River TOPMODEL. Additional changes were made for subsequent simulations, culminating in the best simulation, which included meteorological and topographic wetness index data from the Edisto River Basin and updated calibration parameters for some of the TOPMODEL calibration parameters. The scaling-up process resulted in nine simulations being made. Simulation 7 best matched the streamflows at station 02175000, Edisto River near Givhans, SC, which was the downstream limit for the TOPMODEL setup, and was obtained by adjusting the scaling factor, including streamflow routing, and using NEXRAD precipitation data for the Edisto River Basin. The Nash-Sutcliffe coefficient of model-fit efficiency and Pearson’s correlation coefficient for simulation 7 were 0.78 and 0.89, respectively. Comparison of goodness-of-fit statistics between measured and simulated daily mean streamflow for the McTier Creek and Edisto River models showed that with calibration, the Edisto River TOPMODEL produced slightly better results than the McTier Creek model, despite the substantial difference in the drainage-area size at the outlet locations for the two models (30.7 and 2,725 square miles, respectively). Along with the TOPMODEL
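
    For reference, the two goodness-of-fit statistics reported for simulation 7 (the Nash-Sutcliffe efficiency and Pearson's correlation coefficient) can be computed from paired daily mean streamflow series as sketched below; the arrays and function names are placeholders and the snippet is not part of TOPMODEL itself.

        import numpy as np

        def nash_sutcliffe(obs, sim):
            """Nash-Sutcliffe model-fit efficiency for observed vs. simulated flows."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def pearson_r(obs, sim):
            """Pearson's correlation coefficient between the two series."""
            return float(np.corrcoef(obs, sim)[0, 1])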

  12. Evaluating scale-up rules of a high-shear wet granulation process.

    PubMed

    Tao, Jing; Pandey, Preetanshu; Bindra, Dilbir S; Gao, Julia Z; Narang, Ajit S

    2015-07-01

    This work aimed to evaluate the commonly used scale-up rules for the high-shear wet granulation process using a microcrystalline cellulose-lactose-based low drug loading formulation. Granule properties such as particle size, porosity, flow, and tabletability, and tablet dissolution were compared across scales using scale-up rules based on different impeller speed calculations or extended wet massing time. The constant tip speed rule was observed to produce slightly less granulated material at the larger scales. Longer wet massing time can be used to compensate for the lower shear experienced by the granules at the larger scales. The constant Froude number and constant empirical stress rules yielded granules that were more comparable across different scales in terms of compaction performance and tablet dissolution. Granule porosity was shown to correlate well with blend tabletability and tablet dissolution, indicating the importance of monitoring granule densification (porosity) during scale-up. It was shown that different routes can be chosen during scale-up to achieve comparable granule growth and densification by altering one of the three parameters: water amount, impeller speed, and wet massing time. PMID:26010137
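
    As a hedged illustration of the impeller-speed rules compared above: the constant tip speed rule keeps v = pi*D*N fixed across scales, while the constant Froude number rule (using the common definition Fr = D*N^2/g) keeps D*N^2 fixed. The sketch below computes the large-scale impeller speed for a hypothetical change in bowl diameter; the numbers are assumed and not taken from the paper.

        import math

        def speed_constant_tip_speed(n1, d1, d2):
            # keeps v = pi * D * N constant across scales
            return n1 * d1 / d2

        def speed_constant_froude(n1, d1, d2):
            # keeps Fr = D * N**2 / g constant across scales
            return n1 * math.sqrt(d1 / d2)

        # e.g. 300 rpm in a 0.25 m bowl scaled to a 0.60 m bowl (assumed values)
        print(speed_constant_tip_speed(300, 0.25, 0.60),
              speed_constant_froude(300, 0.25, 0.60))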

  13. Sustainability of a Scale-Up Intervention in Early Mathematics: A Longitudinal Evaluation of Implementation Fidelity

    ERIC Educational Resources Information Center

    Clements, Douglas H.; Sarama, Julie; Wolfe, Christopher B.; Spitler, Mary Elaine

    2015-01-01

    Research Findings: We evaluated the fidelity of implementation and the sustainability of effects of a research-based model for scaling up educational interventions. The model was implemented by 64 teachers in 26 schools in 2 distal city districts serving low-resource communities, with schools randomly assigned to condition. Practice or Policy:…

  14. "Scaling Up" Educational Change: Some Musings on Misrecognition and Doxic Challenges

    ERIC Educational Resources Information Center

    Thomson, Pat

    2014-01-01

    Educational policy-makers around the world are strongly committed to the notion of "scaling up". This can mean anything from encouraging more teachers to take up a pedagogical innovation, all the way through to system-wide efforts to implement "what works" across all schools. In this paper, I use Bourdieu's notions of…

  15. 77 FR 25152 - Applications for New Awards; Investing in Innovation Fund, Scale-Up Grants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-27

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF EDUCATION Applications for New Awards; Investing in Innovation Fund, Scale- Up Grants Correction In notice document 2012-7362 appearing on pages 18216-18229 in the issue of Tuesday, March 27, 2012 make the...

  16. Including Performance Assessments in Accountability Systems: A Review of Scale-Up Efforts

    ERIC Educational Resources Information Center

    Tung, Rosann

    2010-01-01

    The purpose of this literature and field review is to understand previous efforts at scaling up performance assessments for use across districts and states. Performance assessments benefit students and teachers by providing more opportunities for students to demonstrate their knowledge and complex skills, by providing teachers with better…

  17. Introduction to SCALE-UP: Student-Centered Activities for Large Enrollment University Physics.

    ERIC Educational Resources Information Center

    Beichner, Robert J.; Saul, Jeffery M.; Allain, Rhett J.; Deardorff, Duane L.; Abbott, David S.

    SCALE-UP is an extension of the highly successful IMPEC (Integrated Math, Physics, Engineering, and Chemistry) project, one of North Carolina State's curricular reform efforts undertaken as part of the SUCCEED coalition. The authors utilize the interactive, collaboratively based instruction that worked well in smaller class settings and find ways…

  18. Scaling Up Success: Lessons Learned from Technology-Based Educational Improvement

    ERIC Educational Resources Information Center

    Dede, Chris, Ed.; Honan, James P., Ed.; Peters, Laurence C., Ed.

    2005-01-01

    Drawing from the information presented at a conference sponsored by the Harvard Graduate School of Education and the Mid-Atlantic Regional Technology in Education Consortium, educators, researchers, and policymakers translate theory into practice to provide a hands-on resource that describes different models for scaling up success. This resource…

  19. From scaling up to sustainability in HIV: potential lessons for moving forward

    PubMed Central

    2013-01-01

    Background In 30 years of experience in responding to the HIV epidemic, critical decisions and program characteristics for successful scale-up have been studied. Now leaders face a new challenge: sustaining large-scale HIV prevention programs. Implementers, funders, and the communities served need to assess what strategies and practices of scaling up are also relevant for sustaining delivery at scale. Methods We reviewed white and gray literature to identify domains central to scaling-up programs and reviewed HIV case studies to identify how these domains might relate to sustaining delivery at scale. Results We found 10 domains identified as important for successfully scaling up programs that have potential relevance for sustaining delivery at scale: fiscal support; political support; community involvement, integration, buy-in, and depth; partnerships; balancing flexibility/adaptability and standardization; supportive policy, regulatory, and legal environment; building and sustaining strong organizational capacity; transferring ownership; decentralization; and ongoing focus on sustainability. We identified one additional potential domain important for programs sustaining delivery at scale: emphasizing equity. Conclusions Today, the public and private sector are examining their ability to generate value for populations. All stakeholders are aiming to stem the tide of the HIV epidemic. Implementers need a framework to guide the evolution of their strategies and management practices. Greater research is needed to refine the domains for policy and program implementers working to sustain HIV program delivery at scale. PMID:24199749

  20. Scaling-Up Aid to Education: Is Absorptive Capacity a Constraint?

    ERIC Educational Resources Information Center

    Rose, Pauline

    2009-01-01

    "Absorptive capacity" is a frequently used term amongst development practitioners in education. It is adopted by some as a reason for caution over scaling up aid. Others are of the view that absorptive capacity is an excuse by some donors for not delivering on their Education for All financing commitments. Drawing on interviews with…

  1. Early College for All: Efforts to Scale up Early Colleges in Multiple Settings

    ERIC Educational Resources Information Center

    Edmunds, Julie A.

    2016-01-01

    Given the positive impacts of the small, stand-alone early college model and the desire to provide those benefits to more students, organizations have begun efforts to scale up the early college model in a variety of settings. These efforts have been supported by the federal government, particularly by the Investing in Innovation (i3) program.…

  2. TANK 18-F AND 19-F TANK FILL GROUT SCALE UP TEST SUMMARY

    SciTech Connect

    Stefanko, D.; Langton, C.

    2012-01-03

    High-level waste (HLW) tanks 18-F and 19-F have been isolated from FTF facilities. To complete operational closure, the tanks will be filled with grout for the purpose of: (1) physically stabilizing the tanks, (2) limiting/eliminating vertical pathways to residual waste, (3) entombing waste removal equipment, (4) discouraging future intrusion, and (5) providing an alkaline, chemically reducing environment within the closure boundary to control speciation and solubility of select radionuclides. This report documents the results of a four-cubic-yard bulk fill scale-up test on the grout formulation recommended for filling Tanks 18-F and 19-F. Details of the scale-up test are provided in a Test Plan. The work was authorized under a Technical Task Request (TTR), HLE-TTR-2011-008, and was performed according to a Task Technical and Quality Assurance Plan (TTQAP), SRNL-RP-2011-00587. The bulk fill scale-up test described in this report was intended to demonstrate proportioning, mixing, and transportation of material produced in a full-scale ready-mix concrete batch plant. In addition, the material produced for the scale-up test was characterized with respect to fresh properties, thermal properties, and compressive strength as a function of curing time.

  3. Scaling up STEM Academies Statewide: Implementation, Network Supports, and Early Outcomes

    ERIC Educational Resources Information Center

    Young, Viki; House, Ann; Sherer, David; Singleton, Corinne; Wang, Haiwen; Klopfenstein, Kristin

    2016-01-01

    This chapter presents a case study of scaling up the T-STEM initiative in Texas. Data come from the four-year longitudinal evaluation of the Texas High School Project (THSP). The evaluation studied the implementation and impact of T-STEM and the other THSP reforms using a mixed-methods design, including qualitative case studies; principal,…

  4. Integrated Graduate and Continuing Education in Protein Chromatography for Bioprocess Development and Scale-Up

    ERIC Educational Resources Information Center

    Carta, Jungbauer

    2011-01-01

    We describe an intensive course that integrates graduate and continuing education focused on the development and scale-up of chromatography processes used for the recovery and purification of proteins with special emphasis on biotherapeutics. The course includes lectures, laboratories, teamwork, and a design exercise and offers a complete view of…

  5. Education Reform Support: A Framework for Scaling Up School Reform. Policy Paper Series.

    ERIC Educational Resources Information Center

    Healey, F. Henry; DeStefano, Joseph

    The Bureau for Africa of the United States Agency for International Development (USAID) has been examining in detail the question of how best to support and sustain sectorwide education reform in Africa. The USAID and Education Commission of the States jointly sponsored a seminar in October 1996 to examine the issue of "scaling up" and to bring…

  6. 77 FR 18216 - Applications for New Awards; Investing in Innovation Fund, Scale-Up Grants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-27

    ... FR 11087) and is available at http://www.gpo.gov/fdsys/pkg/FR-2012-02-24/pdf/2012-4357.pdf . Scale-up... FR 12004- 12071)(2010 i3 NFP). Priorities: This competition includes five absolute priorities and... the Federal Register on December 15, 2010 (75 FR 78486-78511), and corrected on May 12, 2011 (76...

  7. Compaction Scale Up and Optimization of Cylindrical Fuel Compacts for the Next Generation Nuclear Plant

    SciTech Connect

    Jeffrey J. Einerson; Jeffrey A. Phillips; Eric L. Shaber; Scott E. Niedzialek; W. Clay Richardson; Scott G. Nagley

    2012-10-01

    Multiple process approaches have been used historically to manufacture cylindrical nuclear fuel compacts. Scale-up of fuel compacting was required for the Next Generation Nuclear Plant (NGNP) project to achieve an economically viable automated production process capable of providing a minimum of 10 compacts/minute with high production yields. In addition, the scale-up effort was required to achieve matrix density equivalent to baseline historical production processes, and allow compacting at fuel packing fractions up to 46% by volume. The scale-up approach of jet milling, fluid-bed overcoating, and hot-press compacting adopted in the U.S. Advanced Gas Reactor (AGR) Fuel Development Program involves significant paradigm shifts to capitalize on distinct advantages in simplicity, yield, and elimination of mixed waste. A series of designed experiments have been completed to optimize compaction conditions of time, temperature, and forming pressure using natural uranium oxycarbide (NUCO) fuel. Results from these experiments are included. The scale-up effort is nearing completion with the process installed and operational using nuclear fuel materials. The process is being certified for manufacture of qualification test fuel compacts for the AGR-5/6/7 experiment at the Advanced Test Reactor (ATR) at the Idaho National Laboratory (INL).

  8. Scaling up Comprehensive Sexuality Education in Nigeria: From National Policy to Nationwide Application

    ERIC Educational Resources Information Center

    Huaynoca, Silvia; Chandra-Mouli, Venkatraman; Yaqub, Nuhu, Jr.; Denno, Donna Marie

    2014-01-01

    Nigeria is one of few countries that reports having translated national policies on school-based comprehensive sexuality education (CSE) into near-nationwide implementation. We analysed data using the World Health Organization-ExpandNet framework, which provides a systematic structure for planning and managing the scaling up of health innovations.…

  9. The Role of Scaling up Research in Designing for and Evaluating Robustness

    ERIC Educational Resources Information Center

    Roschelle, J.; Tatar, D.; Shechtman, N.; Knudsen, J.

    2008-01-01

    One of the great strengths of Jim Kaput's research program was his relentless drive towards scaling up his innovative approach to teaching the mathematics of change and variation. The SimCalc mission, "democratizing access to the mathematics of change," was enacted by deliberate efforts to reach an increasing number of teachers and students each…

  10. Implementation in a Longitudinal Sample of New American Schools: Four Years into Scale-Up.

    ERIC Educational Resources Information Center

    Kirby, Sheila Nataraj; Berends, Mark; Naftel, Scott

    New American Schools (NAS) initiated whole-school reform in 1991 as a response to school reforms that had produced little change in the nation's test scores. Its mission is to help schools and districts raise student achievement using whole-school designs. NAS is in the scale-up phase of its effort in which designs are being diffused in partnering…

  11. Lessons from scaling up a depression treatment program in primary care in Chile.

    PubMed

    Araya, Ricardo; Alvarado, Rubén; Sepúlveda, Rodrigo; Rojas, Graciela

    2012-09-01

    In Chile, the National Depression Detection and Treatment Program (Programa Nacional de Diagnóstico y Tratamiento de la Depresión, PNDTD) in primary care is a rare example of an evidence-based mental health program that was scaled up to the national level in a low- or middle-income country. This retrospective qualitative study aimed to better understand how policymakers made the decision to scale up mental health services to the national level, and to explore the elements, contexts, and processes that facilitated the decision to implement and sustain PNDTD. In-depth semistructured interviews with six key informants selected through intentional sampling were conducted in August-December 2008. Interviewees were senior officers at the Ministry of Health who were directly involved in the decision to scale up the program. Results yielded four elements pivotal to the decisionmaking process: scientific evidence, teamwork and leadership, strategic alliances, and program institutionalization. Each element contributed to building consensus, securing funding, attracting resources, and gaining lasting support from policymakers. Additionally, a review of available documentation led the authors to consider sociopolitical context and use of the media to be important factors. While research evidence for the effectiveness of mental health services in the primary care setting continues to accumulate, low- and middle-income countries should get started on the lengthy process of scaling up by incorporating the elements that led to decisionmaking and implementation of the PNDTD in Chile. PMID:23183564

  12. Enabling and Challenging Factors in Institutional Reform: The Case of SCALE-UP

    ERIC Educational Resources Information Center

    Foote, Kathleen; Knaub, Alexis; Henderson, Charles; Dancy, Melissa; Beichner, Robert J.

    2016-01-01

    While many innovative teaching strategies exist, integration into undergraduate science teaching has been frustratingly slow. This study aims to understand the low uptake of research-based instructional innovations by studying 21 successful implementations of the Student Centered Active Learning with Upside-down Pedagogies (SCALE-UP) instructional…

  13. Final-Year Results from the i3 Scale-Up of Reading Recovery

    ERIC Educational Resources Information Center

    May, Henry; Sirinides, Philip; Gray, Abby; Davila, Heather Goldsworthy; Sam, Cecile; Blalock, Toscha; Blackman, Horatio; Anderson-Clark, Helen; Schiera, Andrew J.

    2015-01-01

    As part of the 2010 economic stimulus, a $55 million "Investing in Innovation" (i3) grant from the US Department of Education was awarded to scale up Reading Recovery across the nation. This paper presents the final round of results from the large-scale, mixed methods randomized evaluation of the implementation and impacts of Reading…

  14. Heat and mass transfer scale-up issues during freeze drying: II. Control and characterization of the degree of supercooling.

    PubMed

    Rambhatla, Shailaja; Ramot, Roee; Bhugra, Chandan; Pikal, Michael J

    2004-01-01

    This study aims to investigate the effect of the ice nucleation temperature on the primary drying process using an ice fog technique for temperature-controlled nucleation. In order to facilitate scale-up of the freeze-drying process, this research seeks to find a correlation of the product resistance and the degree of supercooling with the specific surface area of the product. Freeze-drying experiments were performed using 5% wt/vol solutions of sucrose, dextran, hydroxyethyl starch (HES), and mannitol. Temperature-controlled nucleation was achieved using the ice fog technique where cold nitrogen gas was introduced into the chamber to form an "ice fog," thereby facilitating nucleation of samples at the temperature of interest. Manometric temperature measurement (MTM) was used during primary drying to evaluate the product resistance as a function of cake thickness. Specific surface areas (SSA) of the freeze-dried cakes were determined. The ice fog technique was refined to successfully control the ice nucleation temperature of solutions within 1 °C. A significant increase in product resistance was produced by a decrease in nucleation temperature. The SSA was found to increase with decreasing nucleation temperature, and the product resistance increased with increasing SSA. The ice fog technique can be refined into a viable method for nucleation temperature control. The SSA of the product correlates well with the degree of supercooling and with the resistance of the product to mass transfer (i.e., flow of water vapor through the dry layer). Using this correlation and SSA measurements, one could predict scale-up drying differences and accordingly alter the freeze-drying process so as to bring about equivalence of product temperature history during lyophilization. PMID:15760055

  15. Evaluation of models for predicting spray mist diameter for scaling-up of the fluidized bed granulation process.

    PubMed

    Fujiwara, Maya; Dohi, Masafumi; Otsuka, Tomoko; Yamashita, Kazunari; Sako, Kazuhiro

    2012-01-01

    We evaluated models for predicting spray mist diameter suitable for scaling-up the fluidized bed granulation process. By precise selection of experimental conditions, we were able to identify a suitable prediction model that considers changes in binder solution, nozzle dimension, and spray conditions. We used hydroxypropyl cellulose (HPC), hydroxypropyl methylcellulose (HPMC), or polyvinylpyrrolidone (PVP) binder solutions, which are commonly employed by the pharmaceutical industry. Nozzle dimension and spray conditions for oral dosing were carefully selected to reflect manufacturing and small (1/10) scale process conditions. We were able to demonstrate that the prediction model proposed by Mulhem optimally estimated spray mist diameter when each coefficient was modified. Moreover, we developed a simple scale-up rule to produce the same spray mist diameter at different process scales. We confirmed that the Rosin-Rammler distribution could be applied to this process, and that its distribution coefficient was 1.43-1.72 regardless of binder solution, spray condition, or nozzle dimension. PMID:23124561
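
    As a small illustration of the Rosin-Rammler droplet size distribution mentioned above, the cumulative volume fraction of droplets below a diameter d can be written as F(d) = 1 - exp(-(d/d_char)^n), where n is the distribution coefficient (reported as 1.43-1.72); the characteristic diameter and example values below are assumed for illustration, not the paper's data.

        import numpy as np

        def rosin_rammler_undersize(d, d_char, n):
            """Cumulative volume fraction of droplets smaller than diameter d."""
            return 1.0 - np.exp(-(np.asarray(d, float) / d_char) ** n)

        # e.g. fraction of spray volume below 30 um for an assumed 50 um
        # characteristic diameter and n = 1.6
        print(rosin_rammler_undersize(30.0, 50.0, 1.6))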

  16. Scale-up of an electrical capacitance tomography sensor for imaging pharmaceutical fluidized beds and validation by computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Wang, Haigang; Yang, Wuqiang

    2011-10-01

    The aim of this research is to apply electrical capacitance tomography (ECT) in pharmaceutical fluidized beds and to scale up the application of ECT from a lab-scale fluidized bed to a production-scale fluidized bed. The objective is to optimize the design of the production-scale fluidized bed and to improve the operating efficiency of the fluidization processes. This is the first time that ECT has been scaled up to a production-scale fluidized bed of 1.0 m diameter and batch process capacity of 100 kg in a real industrial environment. With a large-scale fluidized bed in a real industrial environment, some key issues in the ECT sensor design must be addressed. To validate the ECT measurement results, a two-phase flow model has been used to simulate the process in lab-scale and pilot-scale fluidized beds. The key process parameters include the solid concentration, average concentration profiles, and the frequency spectrum of signal fluctuation obtained by the fast Fourier transform (FFT) and multi-level wavelet decomposition in the time domain. The results show different hydrodynamic behaviour of fluidized beds of different scales. The time-averaged parameters from ECT and computational fluid dynamics are compared. Future work on ECT sensor design for large-scale fluidized beds is outlined at the end of the paper.
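
    A minimal sketch of the frequency-spectrum step mentioned above: the fast Fourier transform of a solid-concentration time series from one ECT region, after removing the time-averaged component. The function and sampling rate are illustrative assumptions, not the authors' processing chain.

        import numpy as np

        def fluctuation_spectrum(conc, fs):
            """conc: 1-D solid-concentration samples; fs: sampling rate (Hz)."""
            x = np.asarray(conc, float)
            x = x - x.mean()                      # keep only the fluctuating part
            power = np.abs(np.fft.rfft(x)) ** 2
            freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
            return freqs, power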

  17. Decision making for HIV prevention and treatment scale up: Bridging the gap between theory and practice

    PubMed Central

    Alistar, Sabina S.; Brandeau, Margaret L.

    2011-01-01

    Background Effectively controlling the HIV epidemic will require efficient use of limited resources. Despite ambitious global goals for HIV prevention and treatment scale up, few comprehensive practical tools exist to inform such decisions. Methods We briefly summarize modeling approaches for resource allocation for epidemic control, and discuss the practical limitations of these models. We describe typical challenges of HIV resource allocation in practice and some of the tools used by decision makers. We identify the characteristics needed in a model that can effectively support planners in decision making about HIV prevention and treatment scale up. Results An effective model to support HIV scale-up decisions will be flexible, with capability for parameter customization and incorporation of uncertainty. Such a model needs certain key technical features: it must capture epidemic effects; account for how intervention effectiveness depends on the target population and the level of scale up; capture benefit and cost differentials for packages of interventions versus single interventions, including both treatment and prevention interventions; incorporate key constraints on potential funding allocations; identify optimal or near-optimal solutions; and estimate the impact of HIV interventions on the health care system and the resulting resource needs. Additionally, an effective model needs a user-friendly design and structure, ease of calibration and validation, and accessibility to decision makers in all settings. Conclusions Resource allocation theory can make a significant contribution to decision making about HIV prevention and treatment scale up. What remains now is to develop models that can bridge the gap between theory and practice. PMID:21191118

  18. A New Change Detection Technique Applied to COSMO-SkyMed Stripmap Himage Data

    NASA Astrophysics Data System (ADS)

    Losurdo, A.; Marzo, C.; Guariglia, A.

    2015-05-01

    Change detection in SAR images is highly relevant for locating and monitoring land changes of interest. At present, it is a very important topic owing to the high revisit frequency of the new SAR satellite instruments (e.g. COSMO-SkyMed and Sentinel-1). Geocart S.p.A. has achieved important results on SAR change detection techniques within a technological project designed and implemented for the Italian Space Agency, entitled "Integrated Monitoring System: application to the GAS pipeline". The aim of the project is the development of a new remote sensing service integrating aerial and satellite data for GAS pipeline monitoring. An important work package of the project develops change detection algorithms to be applied to COSMO-SkyMed Stripmap Himage data in order to identify heavy lorries on pipelines. In particular, the paper presents a new change detection technique based on a probabilistic approach, together with the corresponding application results.

  19. Photothermal Techniques Applied to the Thermal Characterization of l-Cysteine Nanofluids

    NASA Astrophysics Data System (ADS)

    Alvarado, E. Maldonado; Ramón-Gallegos, E.; Jiménez Pérez, J. L.; Cruz-Orea, A.; Hernández Rosas, J.

    2013-05-01

    Thermal-diffusivity (D) and thermal-effusivity (e) measurements were carried out in l-cysteine nanofluids (l-cysteine in combination with Au nanoparticles and a protoporphyrin IX (PpIX) nanofluid) by using thermal lens spectrometry (TLS) and photopyroelectric (PPE) techniques. The TLS technique was used in the two-beam mode-mismatched experimental configuration to obtain the thermal diffusivity of the samples. The sample thermal effusivity (e) was obtained using the PPE technique, in which the temperature variation of a sample exposed to modulated radiation is measured with a pyroelectric sensor. From the obtained thermal-diffusivity and thermal-effusivity values, the thermal conductivity and specific heat capacity of the sample were calculated and compared with the thermal parameters of water. The results of this study could be applied to the detection of tumors by using the l-cysteine in combination with Au nanoparticles and the PpIX nanofluid, referred to as the conjugate in this study.
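
    The step from the two measured quantities to conductivity and heat capacity follows from e = sqrt(k*rho*c) and D = k/(rho*c), which give k = e*sqrt(D) and rho*c = e/sqrt(D). The sketch below evaluates these relations; the numerical values are illustrative (roughly water-like), not the paper's data.

        import math

        def thermal_properties(diffusivity, effusivity):
            """diffusivity in m^2/s; effusivity in W s^0.5 m^-2 K^-1."""
            k = effusivity * math.sqrt(diffusivity)      # conductivity, W/(m K)
            rho_c = effusivity / math.sqrt(diffusivity)  # volumetric heat capacity, J/(m^3 K)
            return k, rho_c

        print(thermal_properties(1.4e-7, 1580.0))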

  20. Inverse problem solution techniques as applied to indirect in situ estimation of fish target strength.

    PubMed

    Stepnowski, A; Moszyński, M

    2000-05-01

    In situ indirect methods of fish target strength (TS) estimation are analyzed in terms of the inverse techniques recently applied to the problem in question. The solution of this problem requires finding the unknown probability density function (pdf) of fish target strength from acoustic echoes, which can be estimated by solving the integral equation, relating pdf's of echo variable, target strength, and beam pattern of the echosounder transducer. In the first part of the paper the review of existing indirect in situ TS-estimation methods is presented. The second part introduces the novel TS-estimation methods, viz.: Expectation, Maximization, and Smoothing (EMS), Windowed Singular Value Decomposition (WSVD), Regularization and Wavelet Decomposition, which are compared using simulations as well as actual data from acoustic surveys. The survey data, acquired by the dual-beam digital echosounder, were thoroughly analyzed by numerical algorithms and the target strength and acoustical backscattering length pdf's estimates were calculated from fish echoes received in the narrow beam channel of the echosounder. Simultaneously, the estimates obtained directly from the dual-beam system were used as a reference for comparison of the estimates calculated by the newly introduced inverse techniques. The TS estimates analyzed in the paper are superior to those obtained from deconvolution or other conventional techniques, as the newly introduced methods partly avoid the problem of ill-conditioned equations and matrix inversion. PMID:10830379
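
    A hedged sketch of one of the inverse approaches named above (windowed/truncated singular value decomposition): after discretizing the integral equation into A x = b, where b is the echo-amplitude histogram, x the unknown target-strength pdf, and A the kernel built from the beam pattern, the ill-conditioning is controlled by keeping only the largest singular values. A, b, and k below are placeholders, not survey data.

        import numpy as np

        def truncated_svd_solve(A, b, k):
            """Solve A x ~= b keeping only the k largest singular values."""
            U, s, Vt = np.linalg.svd(A, full_matrices=False)
            s_inv = np.zeros_like(s)
            s_inv[:k] = 1.0 / s[:k]
            return Vt.T @ (s_inv * (U.T @ b))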

  1. Micropillar compression technique applied to micron-scale mudstone elasto-plastic deformation.

    SciTech Connect

    Michael, Joseph Richard; Chidsey, Thomas; Heath, Jason E.; Dewers, Thomas A.; Boyce, Brad Lee; Buchheit, Thomas Edward

    2010-12-01

    Mudstone mechanical testing is often limited by poor core recovery and by sample size, preservation, and preparation issues, which can lead to sampling bias, damage, and time-dependent effects. A micropillar compression technique, originally developed by Uchic et al. (2004), is applied here to elasto-plastic deformation of small volumes of mudstone, in the range of cubic microns. This study examines the behavior of the Gothic shale, the basal unit of the Ismay zone of the Pennsylvanian Paradox Formation and a potential shale gas play in southeastern Utah, USA. Micropillars 5 microns in diameter and 10 microns in length are precision-manufactured using an ion-milling method. Characterization of samples is carried out using dual focused ion/scanning electron beam imaging of nano-scaled pores and of the distribution of matrix clay and quartz, as well as pore-filling organics; laser scanning confocal microscopy (LSCM) 3D imaging of natural fractures; and gas permeability, among other techniques. Compression testing of micropillars under load control is performed using two different nanoindenter techniques. Deformation of cores 0.5 cm in diameter by 1 cm in length is carried out and visualized using a microscope loading stage and laser scanning confocal microscopy. Axisymmetric multistage compression testing and multi-stress-path testing are carried out using 2.54 cm plugs. The discussion of results addresses the size of representative elementary volumes applicable to continuum-scale mudstone deformation, anisotropy, and size-scale plasticity effects. Other issues include fabrication-induced damage, alignment, and the influence of the substrate.

  2. Strategy for applying scaling technique to water retention curves of forest soils

    NASA Astrophysics Data System (ADS)

    Hayashi, Y.; Kosugi, K.; Mizuyama, T.

    2009-12-01

    Describing the infiltration of water in soils on a forested hillslope requires information on the spatial variability of the water retention curve (WRC). Using a scaling technique, Hayashi et al. (2009) found that porosity mostly characterizes the spatial variability of WRCs on a forested hillslope. Their scaling technique was based on a model that assumes a lognormal pore size distribution and contains three parameters: the median of the log-transformed pore radius, ψm, the variance of the log-transformed pore radius, σ, and the effective porosity, θe. Thus, in the scaling method proposed by Hayashi et al. (2009), θe is a scaling factor that should be determined for each individual soil, while ψm and σ are reference parameters common to the whole data set. They examined this scaling method using θe calculated, for each sample, as the difference between the observed saturated water content and the water content observed at ψ = -1000 cm, with ψm and σ derived from the whole data set of WRCs on the slope. It was then shown that this scaling method could explain almost 90 % of the spatial variability in WRCs on the forested hillslope. However, the method requires the whole data set of WRCs to derive the reference parameters (ψm and σ). To apply the scaling technique more practically, in this study we tested a scaling method using reference parameters derived from the WRCs at a small part of the slope. To examine the proposed scaling method, the WRCs of 246 undisturbed forest soil samples, collected at 15 points distributed from the downslope to the upslope segments, were observed. In the proposed scaling method, we optimized the common ψm and σ to the WRCs of six soil samples collected at one point on the middle slope and applied these as reference parameters for the whole data set. The scaling method proposed in this study exhibited an increase of only 6 % in the residual sum of squares as compared with that of the method
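
    A sketch of the lognormal pore-size retention model behind the three parameters named above, assuming a Kosugi-type formulation written in terms of suction head: effective saturation is Se = 0.5*erfc(ln(h/h_m)/(sigma*sqrt(2))), and the water content is theta_r + theta_e*Se, so scaling an individual sample amounts to rescaling theta_e while h_m and sigma are shared reference parameters. The parameter names are illustrative, not the authors' code.

        import math

        def effective_saturation(h, h_m, sigma):
            """h: suction head (positive); h_m: median suction; sigma: spread."""
            return 0.5 * math.erfc(math.log(h / h_m) / (sigma * math.sqrt(2.0)))

        def water_content(h, theta_r, theta_e, h_m, sigma):
            # theta_e is the per-sample scaling factor; h_m and sigma are shared
            return theta_r + theta_e * effective_saturation(h, h_m, sigma)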

  3. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    NASA Astrophysics Data System (ADS)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.

  4. Data compression techniques applied to high resolution high frame rate video technology

    NASA Technical Reports Server (NTRS)

    Hartz, William G.; Alexovich, Robert E.; Neustadter, Marc S.

    1989-01-01

    An investigation is presented of video data compression applied to microgravity space experiments using High Resolution High Frame Rate Video Technology (HHVT). An extensive survey of methods of video data compression, described in the open literature, was conducted. The survey examines compression methods employing digital computing. The results of the survey are presented. They include a description of each method and assessment of image degradation and video data parameters. An assessment is made of present and near term future technology for implementation of video data compression in high speed imaging system. Results of the assessment are discussed and summarized. The results of a study of a baseline HHVT video system, and approaches for implementation of video data compression, are presented. Case studies of three microgravity experiments are presented and specific compression techniques and implementations are recommended.

  5. Applying machine learning techniques to DNA sequence analysis. Progress report, February 14, 1991--February 13, 1992

    SciTech Connect

    Shavlik, J.W.

    1992-04-01

    We are developing a machine learning system that modifies existing knowledge about specific types of biological sequences. It does this by considering sample members and nonmembers of the sequence motif being learned. Using this information (which we call a "domain theory"), our learning algorithm produces a more accurate representation of the knowledge needed to categorize future sequences. Specifically, the KBANN algorithm maps inference rules, such as consensus sequences, into a neural (connectionist) network. Neural network training techniques then use the training examples to refine these inference rules. We have been applying this approach to several problems in DNA sequence analysis and have also been extending the capabilities of our learning system along several dimensions.
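
    A hedged sketch of the KBANN-style rule-to-network mapping described above: each symbolic rule becomes a unit whose incoming weights are +w for positive antecedents and -w for negated ones, with the bias set so the unit activates only when all antecedents hold (w = 4 is a conventional choice in published descriptions of KBANN). The example rule is hypothetical, not drawn from the report.

        def rule_to_unit(pos_antecedents, neg_antecedents, w=4.0):
            """Map a conjunctive rule to (weights, bias) for a sigmoid unit."""
            weights = {a: w for a in pos_antecedents}
            weights.update({a: -w for a in neg_antecedents})
            bias = -(len(pos_antecedents) - 0.5) * w   # threshold of (P - 1/2) * w
            return weights, bias

        # e.g. a rule "promoter :- contact, conformation" (hypothetical)
        print(rule_to_unit(["contact", "conformation"], []))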

  6. Feasibility Studies of Applying Kalman Filter Techniques to Power System Dynamic State Estimation

    SciTech Connect

    Huang, Zhenyu; Schneider, Kevin P.; Nieplocha, Jarek

    2007-08-01

    The lack of dynamic information in power system operations is mainly attributable to the static modeling of traditional state estimation, and state estimation is the basis driving many other operations functions. This paper investigates the feasibility of applying Kalman filter techniques to enable the inclusion of dynamic modeling in the state estimation process and the estimation of power system dynamic states. The proposed Kalman-filter-based dynamic state estimation is tested on a multi-machine system with both large and small disturbances. Sensitivity studies of the dynamic state estimation performance with respect to measurement characteristics – sampling rate and noise level – are presented as well. The study results show that there is a promising path forward to implementing Kalman-filter-based dynamic state estimation with the emerging phasor measurement technologies.
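
    For readers unfamiliar with the filtering machinery, a generic linear Kalman filter predict/update step is sketched below; the state-transition and measurement matrices here are placeholders, whereas the paper builds them from power system dynamic models and phasor measurements.

        import numpy as np

        def kalman_step(x, P, z, F, H, Q, R):
            """x, P: prior state and covariance; z: new measurement vector."""
            # Predict
            x_pred = F @ x
            P_pred = F @ P @ F.T + Q
            # Update
            S = H @ P_pred @ H.T + R
            K = P_pred @ H.T @ np.linalg.inv(S)
            x_new = x_pred + K @ (z - H @ x_pred)
            P_new = (np.eye(x.size) - K @ H) @ P_pred
            return x_new, P_new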

  7. A New Normalized Difference Cloud Retrieval Technique Applied to Landsat Radiances Over the Oklahoma ARM Site

    NASA Technical Reports Server (NTRS)

    Orepoulos, Lazaros; Cahalan, Robert; Marshak, Alexander; Wen, Guoyong

    1999-01-01

    We suggest a new approach to cloud retrieval, using a normalized difference of nadir reflectivities (NDNR) constructed from a non-absorbing and an absorbing (with respect to liquid water) wavelength. Using Monte Carlo simulations, we show that this quantity has the potential to remove first-order scattering effects caused by cloud side illumination and shadowing at oblique Sun angles. Application of the technique to TM (Thematic Mapper) radiance observations from Landsat-5 over the Southern Great Plains site of the ARM (Atmospheric Radiation Measurement) program gives very similar regional statistics and histograms, but significant differences at the pixel level. NDNR can also be combined with the inverse NIPA (Nonlocal Independent Pixel Approximation) of Marshak (1998), which is applied here for the first time to overcast Landsat subscenes. We demonstrate the sensitivity of the NIPA-retrieved cloud fields to the parameters of the method and discuss practical issues related to the optimal choice of these parameters.
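
    The quantity itself follows the standard normalized-difference form, sketched below with the band choice left abstract; the exact wavelengths and any empirical adjustments used in the paper are not reproduced here.

        import numpy as np

        def ndnr(r_nonabsorbing, r_absorbing):
            """Normalized difference of nadir reflectances at a non-absorbing
            and a liquid-water-absorbing wavelength."""
            r_na = np.asarray(r_nonabsorbing, float)
            r_a = np.asarray(r_absorbing, float)
            return (r_na - r_a) / (r_na + r_a)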

  8. Roller compaction scale-up using roll width as scale factor and laser-based determined ribbon porosity as critical material attribute.

    PubMed

    Allesø, Morten; Holm, René; Holm, Per

    2016-05-25

    Due to the complexity and difficulties associated with the mechanistic modeling of the roller compaction process for scale-up, an innovative equipment approach is to keep the roll diameter fixed between scales and instead vary the roll width. Assuming a fixed gap and roll force, this approach should create similar conditions for the nip regions of the two compactor scales, and thus result in a scale-reproducible ribbon porosity. In the present work a non-destructive laser-based technique was used to measure the ribbon porosity at-line with high precision and high accuracy, as confirmed by an initial comparison to a well-established volume displacement oil intrusion method. The ribbon porosity was found to be scale-independent when comparing the average porosity of a group of ribbon samples (n=12) from small-scale (Mini-Pactor®) to large-scale (Macro-Pactor®). A higher standard deviation of ribbon fragment porosities from the large-scale roller compactor was attributed to minor variations in powder densification across the roll width. With the intention to reproduce ribbon porosity from one scale to the other, process settings of roll force and gap size applied to the Mini-Pactor® (and identified during formulation development) were therefore directly transferable to subsequent commercial-scale production on the Macro-Pactor®. This creates a better link between formulation development and tech transfer and decreases the number of batches needed to establish the parameter settings of the commercial process. PMID:26545485

  9. Single Layer Centrifugation Can Be Scaled-Up Further to Process up to 150 mL Semen

    PubMed Central

    Morrell, J. M.; van Wienen, M.; Wallgren, M.

    2011-01-01

    Single layer centrifugation (SLC) has been used to improve the quality of sperm samples in several species. However, where stallion or boar semen is to be used for AI, larger volumes of semen have to be processed than for other species, thus limiting the effectiveness of the original technique. The objective of the present study was to scale up the SLC method for both stallion and boar semen. Stallion semen could be processed in 100 mL glass tubes without a loss of sperm quality, and similarly, boar semen could be processed in 200 mL and 500 mL tubes without losing sperm quality. The results of these preliminary studies are encouraging, and larger trials are underway to evaluate the use of these methods in the field. PMID:23738111

  10. Scale-up and economic analysis of biodiesel production from municipal primary sewage sludge.

    PubMed

    Olkiewicz, Magdalena; Torres, Carmen M; Jiménez, Laureano; Font, Josep; Bengoa, Christophe

    2016-08-01

    Municipal wastewater sludge is a promising lipid feedstock for biodiesel production, but the need to eliminate the high water content before lipid extraction is the main limitation for scaling up. This study evaluates the economic feasibility of biodiesel production directly from liquid primary sludge based on experimental data at laboratory scale. Computational tools were used for the modelling of the process scale-up and the different configurations of lipid extraction to optimise this step, as it is the most expensive. The operational variables with a major influence in the cost were the extraction time and the amount of solvent. The optimised extraction process had a break-even price of biodiesel of 1232 $/t, being economically competitive with the current cost of fossil diesel. The proposed biodiesel production process from waste sludge eliminates the expensive step of sludge drying, lowering the biodiesel price. PMID:27131292

  11. Pore-Water Extraction Scale-Up Study for the SX Tank Farm

    SciTech Connect

    Truex, Michael J.; Oostrom, Martinus; Wietsma, Thomas W.; Last, George V.; Lanigan, David C.

    2013-01-15

    The phenomena related to pore-water extraction from unsaturated sediments have been previously examined with limited laboratory experiments and numerical modeling. However, key scale-up issues have not yet been addressed. Laboratory experiments and numerical modeling were conducted to specifically examine pore-water extraction for sediment conditions relevant to the vadose zone beneath the SX Tank Farm at Hanford Site in southeastern Washington State. Available SX Tank Farm data were evaluated to generate a conceptual model of the subsurface for a targeted pore-water extraction application in areas with elevated moisture and Tc-99 concentration. The hydraulic properties of the types of porous media representative of the SX Tank Farm target application were determined using sediment mixtures prepared in the laboratory based on available borehole sediment particle size data. Numerical modeling was used as an evaluation tool for scale-up of pore-water extraction for targeted field applications.

  12. Scaled-up dual anode/cathode microbial fuel cell stack for actual ethanolamine wastewater treatment.

    PubMed

    An, Byung-Min; Heo, Yoon; Maitlo, Hubdar-Ali; Park, Joo-Yang

    2016-06-01

    The aim of this work was to develop scaled-up microbial fuel cell (MFC) technology for actual ethanolamine wastewater treatment, using dual anode/cathode MFC stacks connected in series to achieve the desired current, treatment capacity, and volume capacity. However, after feeding actual wastewater into the MFC, the maximum power density decreased while the corresponding internal resistance increased. With continuous electricity production, a stack of eight MFCs in series achieved 96.05% COD removal and 97.30% ammonia removal at a flow rate of 15.98 L/d (HRT 12 h). The scaled-up dual anode/cathode MFC stack system in this research was demonstrated to treat actual ETA wastewater with the added benefit of harvesting electrical energy. PMID:26888335

  13. Scaling up of HIV-TB collaborative activities: Achievements and challenges in India.

    PubMed

    Deshmukh, Rajesh; Shah, Amar; Sachdeva, K S; Sreenivas, A N; Gupta, R S; Khaparde, S D

    2016-01-01

    India has been implementing HIV/TB collaborative activities since 2001, with rapid scale-up of infrastructure across the country during the past decade under the National AIDS Control Programme and the Revised National TB Control Programme. India has shown an over 50% reduction in new infections and around a 35% reduction in AIDS-related deaths, making it one of the success stories globally. Substantial progress in the implementation of collaborative TB/HIV activities has occurred in India, and it is marching towards the target set out in the Global Plan to Stop TB and endorsed by the UN General Assembly to halve HIV-associated TB deaths by 2015. While the successful approaches have led to impressive gains in HIV/TB control in India, there are emerging challenges including newer pockets with rising HIV trends in North India, increasing drug resistance, high mortality among co-infected patients, low HIV testing rates among TB patients in northern and eastern states in India, treatment delays and drop-outs, stigma and discrimination, etc. In spite of these difficulties, established HIV/TB coordination mechanisms at different levels, rapid scale-up of facilities with decentralisation of treatment services, regular joint supervision and monitoring, and newer initiatives like the use of rapid diagnostics for early diagnosis of TB among people living with HIV and TB notification have led to success in combating the threat of HIV/TB in India. This article highlights the steps taken by India, one of the largest HIV/TB programmes in the world, in scaling up the joint HIV-TB collaborative activities and the achievements so far, and discusses the emerging challenges, which could provide important lessons for other countries in scaling up their programmes. PMID:27235937

  14. CYMIC{reg_sign} -- Boiler scale-up and full scale demonstration experiences

    SciTech Connect

    Kokko, A.; Karvinen, R.; Ahlstedt, H.

    1995-12-31

    This paper describes the CYMIC boiler scale-up principles, the first full-scale experiences from the demonstration plant, and results from mathematical modelling of the cyclones. After CYMIC pilot testing was successfully completed with very positive results, the next step was CYMIC scale-up and a full-scale demonstration. The 30 MWth demonstration plant was commissioned during the fall of 1994. The plant is owned by VAPO Oy and is located in the city of Lieksa, eastern Finland. CYMIC has been scaled up by developing six different cyclones and a multiplication system to cover the capacity range from 30 to 600 MWth. The design of this CYMIC series and the first industrial-scale CYMIC sold are presented in the paper. The scale-up of the cyclone was mathematically modelled by Professor Karvinen and his group at Tampere University of Technology. The model, which uses the Sflow code, was tested and its parameters were set using the pilot test results. The model performed well, so three larger cyclones were calculated: the first was the cyclone for the Lieksa plant and the other two were larger standard cyclones. Particles were also included in the model. The variables in the calculations were the cyclone diameter and the inlet vane shape and position. Commissioning of the Lieksa plant began in August 1994. The process, including operation of the cyclone and the gas lock, was then verified at full scale. Flue gas emissions, the combustion efficiency, and the performance of the cyclone were also measured. This paper discusses the most interesting results of the measurements.

  15. Scale-up of HIV Viral Load Monitoring--Seven Sub-Saharan African Countries.

    PubMed

    Lecher, Shirley; Ellenberger, Dennis; Kim, Andrea A; Fonjungo, Peter N; Agolory, Simon; Borget, Marie Yolande; Broyles, Laura; Carmona, Sergio; Chipungu, Geoffrey; De Cock, Kevin M; Deyde, Varough; Downer, Marie; Gupta, Sundeep; Kaplan, Jonathan E; Kiyaga, Charles; Knight, Nancy; MacLeod, William; Makumbi, Boniface; Muttai, Hellen; Mwangi, Christina; Mwangi, Jane W; Mwasekaga, Michael; Ng'Ang'A, Lucy W; Pillay, Yogan; Sarr, Abdoulaye; Sawadogo, Souleymane; Singer, Daniel; Stevens, Wendy; Toure, Christiane Adje; Nkengasong, John

    2015-11-27

    To achieve global targets for universal treatment set forth by the Joint United Nations Programme on human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS) (UNAIDS), viral load monitoring for HIV-infected persons receiving antiretroviral therapy (ART) must become the standard of care in low- and middle-income countries (LMIC) (1). CDC and other U.S. government agencies, as part of the President's Emergency Plan for AIDS Relief, are supporting multiple countries in sub-Saharan Africa to change from the use of CD4 cell counts for monitoring of clinical response to ART to the use of viral load monitoring, which is the standard of care in developed countries. Viral load monitoring is the preferred method for immunologic monitoring because it enables earlier and more accurate detection of treatment failure before immunologic decline. This report highlights the initial successes and challenges of viral load monitoring in seven countries that have chosen to scale up viral load testing as a national monitoring strategy for patients on ART in response to World Health Organization (WHO) recommendations. Countries initiating viral load scale-up in 2014 observed increases in coverage after scale-up, and countries initiating in 2015 are anticipating similar trends. However, in six of the seven countries, viral load testing coverage in 2015 remained below target levels. Inefficient specimen transport, need for training, delays in procurement and distribution, and limited financial resources to support scale-up hindered progress. Country commitment and effective partnerships are essential to address the financial, operational, technical, and policy challenges of the rising demand for viral load monitoring. PMID:26605986

  16. How HIV/AIDS scale-up has impacted on non- HIV priority services in Zambia

    PubMed Central

    2010-01-01

    Background Much of the debate as to whether or not the scaling up of HIV service delivery in Africa benefits non-HIV priority services has focused on the use of nationally aggregated data. This paper analyses and presents routine health facility record data to show trend correlations across priority services. Methods Review of district office and health facility client records for 39 health facilities in three districts of Zambia, covering four consecutive years (2004-07). Intra-facility analyses were conducted, service and coverage trends assessed and rank correlations between services measured to compare service trends within facilities. Results VCT, ART and PMTCT client numbers and coverage levels increased rapidly. There were some strong positive correlations in trends within facilities between reproductive health services (family planning and antenatal care) and ART and PMTCT, with Spearman rank correlations ranging from 0.33 to 0.83. Childhood immunisation coverage also increased. Stock-outs of important drugs for non-HIV priority services were significantly more frequent than were stock-outs of antiretroviral drugs. Conclusions The analysis shows scale-up in reproductive health service numbers in the same facilities where HIV services were scaling up. While district childhood immunisations increased overall, this did not necessarily occur in facility catchment areas where HIV service scale-up occurred. The paper demonstrates an approach for comparing correlation trends across different services, using routine health facility information. Larger samples and explanatory studies are needed to understand the client, facility and health systems factors that contribute to positive and negative synergies between priority services. PMID:20825666

  17. Scale Up of Extended Thin Film Electrocatalyst Structures (ETFECS) (Fact Sheet)

    SciTech Connect

    Not Available

    2012-01-01

    This NREL Hydrogen and Fuel Cell Technical Highlight discusses how NREL synthesized >1 gram of platinum ETFECS (nanotubes) for use as novel fuel cell catalysts. These materials represent the cumulative yield of four individual batch syntheses (each >250 milligrams yield). By producing these materials at gram quantity, NREL has shown the viability of scale up and produced sufficient material to allow further validation of properties, as well as in-depth electrode optimization and fuel cell testing.

  18. Scaling up functional traits for ecosystem services with remote sensing: concepts and methods.

    PubMed

    Abelleira Martínez, Oscar J; Fremier, Alexander K; Günter, Sven; Ramos Bendaña, Zayra; Vierling, Lee; Galbraith, Sara M; Bosque-Pérez, Nilsa A; Ordoñez, Jenny C

    2016-07-01

    Ecosystem service-based management requires an accurate understanding of how human modification influences ecosystem processes and these relationships are most accurate when based on functional traits. Although trait variation is typically sampled at local scales, remote sensing methods can facilitate scaling up trait variation to regional scales needed for ecosystem service management. We review concepts and methods for scaling up plant and animal functional traits from local to regional spatial scales with the goal of assessing impacts of human modification on ecosystem processes and services. We focus our objectives on considerations and approaches for (1) conducting local plot-level sampling of trait variation and (2) scaling up trait variation to regional spatial scales using remotely sensed data. We show that sampling methods for scaling up traits need to account for the modification of trait variation due to land cover change and species introductions. Sampling intraspecific variation, stratification by land cover type or landscape context, or inference of traits from published sources may be necessary depending on the traits of interest. Passive and active remote sensing are useful for mapping plant phenological, chemical, and structural traits. Combining these methods can significantly improve their capacity for mapping plant trait variation. These methods can also be used to map landscape and vegetation structure in order to infer animal trait variation. Due to high context dependency, relationships between trait variation and remotely sensed data are not directly transferable across regions. We end our review with a brief synthesis of issues to consider and outlook for the development of these approaches. Research that relates typical functional trait metrics, such as the community-weighted mean, with remote sensing data and that relates variation in traits that cannot be remotely sensed to other proxies is needed. Our review narrows the gap between
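
    As a small illustration of one trait metric mentioned above, the community-weighted mean of a trait is the abundance-weighted average CWM = sum_i p_i * t_i, where p_i is the relative abundance of species i and t_i its trait value; the species numbers below are placeholders.

        def community_weighted_mean(abundances, trait_values):
            """Abundance-weighted mean trait value for one community plot."""
            total = float(sum(abundances))
            return sum((a / total) * t for a, t in zip(abundances, trait_values))

        # e.g. three species with relative cover 50/30/20 and an assumed leaf trait
        print(community_weighted_mean([50, 30, 20], [2.1, 1.6, 3.0]))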

  19. Solar coronal magnetic fields derived using seismology techniques applied to omnipresent sunspot waves

    NASA Astrophysics Data System (ADS)

    Jess, David B.; Reznikova, Veronika E.; Ryans, Robert S. I.; Christian, Damian J.; Keys, Peter H.; Mathioudakis, Mihalis; Mackay, Duncan H.; Krishna Prasad, S.; Banerjee, Dipankar; Grant, Samuel D. T.; Yau, Sean; Diamond, Conor

    2016-02-01

    Sunspots on the surface of the Sun are the observational signatures of intense manifestations of tightly packed magnetic field lines, with near-vertical field strengths exceeding 6,000 G in extreme cases. It is well accepted that both the plasma density and the magnitude of the magnetic field strength decrease rapidly away from the solar surface, making high-cadence coronal measurements through traditional Zeeman and Hanle effects difficult as the observational signatures are fraught with low-amplitude signals that can become swamped with instrumental noise. Magneto-hydrodynamic (MHD) techniques have previously been applied to coronal structures, with single and spatially isolated magnetic field strengths estimated as 9-55 G (refs). A drawback with previous MHD approaches is that they rely on particular wave modes alongside the detectability of harmonic overtones. Here we show, for the first time, how omnipresent magneto-acoustic waves, originating from within the underlying sunspot and propagating radially outwards, allow the spatial variation of the local coronal magnetic field to be mapped with high precision. We find coronal magnetic field strengths of 32 +/- 5 G above the sunspot, which decrease rapidly to values of approximately 1 G over a lateral distance of 7,000 km, consistent with previous isolated and unresolved estimations. Our results demonstrate a new, powerful technique that harnesses the omnipresent nature of sunspot oscillations to provide magnetic field mapping capabilities close to a magnetic source in the solar corona.
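
    As a rough indication of the underlying magnetoseismology idea (not the authors' full inversion), the sketch below assumes a low-beta corona in which a measured propagation speed close to the Alfven speed gives B ~ v * sqrt(mu0 * rho); the density and speed values are hypothetical.

      # Back-of-envelope sketch, not the published inversion: infer B from a wave speed
      # assuming it approximates the Alfven speed in a low-beta corona.
      import math

      mu0 = 4e-7 * math.pi     # vacuum permeability (SI)
      rho = 1e-11              # hypothetical coronal mass density, kg m^-3
      v = 900e3                # hypothetical measured phase speed, m s^-1

      B_tesla = v * math.sqrt(mu0 * rho)
      print(f"B ~ {B_tesla * 1e4:.0f} G")   # 1 T = 10^4 G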

  20. Shadowgraph Technique Applied to STARDUST Facility for Dust Tracking: First Results

    NASA Astrophysics Data System (ADS)

    Gaudio, P.; Malizia, A.; Camplani, M.; Barbato, F.; Antonelli, L.; Gelfusa, M.; Del Vecchio, M.; Salgado, L.; Bellecci, C.; Richetta, M.

    The problem of dust resuspension in case of a Loss Of Vacuum Accident (LOVA) in a nuclear fusion plant (ITER or DEMO like) is an important issue for the safety of workers and the security of the environment. The Quantum Electronics and Plasma Physics Research Group has implemented an optical set-up to track dust during a LOVA reproduction inside the experimental facility STARDUST. In this work, the shadowgraph technique is applied to track dark dust (such as tungsten). The technique is based on an expanded collimated beam of light, emitted by a laser (or a lamp) transversely to the flow field direction. Inside STARDUST, the dust moving in the air flow causes variations of the refractive index that can be detected by means of a CCD camera. A spatial modulation of the light-intensity distribution on the camera can be measured. The resulting pattern is a shadow of the refractive index field that prevails in the region of the disturbance. The authors use an incandescent white lamp to illuminate the vacuum vessel of the STARDUST facility. The collimated light passes through the test section to be investigated, and the images of the dust shadows are collected with a fast CCD camera. The images are then processed with mathematical algorithms to obtain information about the velocity fields of the dust during the accident reproductions. The experimental set-up, together with a critical analysis of the first results, is presented in this paper.
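
    A minimal sketch of the kind of image processing implied above (not the facility's actual algorithm): threshold the difference between a frame and the background to locate a dust shadow, then convert the centroid shift between consecutive frames into a velocity. All frames and calibration values are synthetic.

      import numpy as np

      def shadow_centroid(frame, background, threshold=30):
          # pixels where the dark dust shadow departs from the background
          diff = np.abs(frame.astype(float) - background.astype(float))
          ys, xs = np.nonzero(diff > threshold)
          return np.array([xs.mean(), ys.mean()]) if xs.size else None

      background = np.full((64, 64), 200, dtype=np.uint8)
      frame1, frame2 = background.copy(), background.copy()
      frame1[30:34, 10:14] = 50   # dust shadow near x ~ 12
      frame2[30:34, 20:24] = 50   # same shadow near x ~ 22 one frame later

      dt_s, mm_per_pixel = 1e-3, 0.1   # hypothetical frame interval and calibration
      shift = shadow_centroid(frame2, background) - shadow_centroid(frame1, background)
      print("velocity [mm/s]:", shift * mm_per_pixel / dt_s)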

  1. In vivo measurement of human skin absorption of topically applied substances by a photoacoustic technique.

    PubMed

    Gutiérrez-Juárez, G; Vargas-Luna, M; Córdova, T; Varela, J B; Bernal-Alvarado, J J; Sosa, M

    2002-08-01

    A photoacoustic technique is used for studying topically applied substance absorption in human skin. The proposed method utilizes a double-chamber PA cell. The absorption determination was obtained through the measurement of the thermal effusivity of the binary system substance-skin. The theoretical model assumes that the effective thermal effusivity of the binary system corresponds to that of a two-phase system. Experimental applications of the method employed different substances of topical application in different parts of the body of a volunteer. The method is demonstrated to be an easily used non-invasive technique for dermatology research. The relative concentrations as a function of time of substances such as ketoconazole and sunscreen were determined by fitting a sigmoidal function to the data, while an exponential function corresponds to the best fit for the set of data for nitrofurazone, Vaseline and VapoRub. The time constants associated with the rates of absorption were found to vary in the range between 10 and 58 min, depending on the substance and the part of the body. PMID:12214760
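
    The curve-fitting step described above can be sketched with SciPy as follows; the model forms follow the abstract, but the time/concentration data and starting guesses are hypothetical.

      import numpy as np
      from scipy.optimize import curve_fit

      def sigmoid(t, c_max, tau, t_half):
          return c_max / (1.0 + np.exp(-(t - t_half) / tau))

      def exponential(t, c_max, tau):
          return c_max * (1.0 - np.exp(-t / tau))

      t = np.array([0, 5, 10, 20, 30, 45, 60], dtype=float)      # minutes (hypothetical)
      c = np.array([0.02, 0.08, 0.22, 0.55, 0.78, 0.92, 0.97])   # relative concentration

      p_sig, _ = curve_fit(sigmoid, t, c, p0=[1.0, 8.0, 20.0])
      p_exp, _ = curve_fit(exponential, t, c, p0=[1.0, 20.0])
      print("sigmoid (c_max, tau, t_half):", p_sig)
      print("exponential (c_max, tau):", p_exp)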

  2. Film thickness measurement techniques applied to micro-scale two-phase flow systems

    SciTech Connect

    Tibirica, Cristiano Bigonha; do Nascimento, Francisco Julio; Ribatski, Gherhardt

    2010-05-15

    Recently, semi-empirical models to estimate flow boiling heat transfer coefficient, saturated CHF and pressure drop in micro-scale channels have been proposed. Most of the models were developed based on elongated bubbles and annular flows in view of the fact that these flow patterns are predominant in smaller channels. In these models, the liquid film thickness plays an important role and such a fact emphasizes that the accurate measurement of the liquid film thickness is a key point to validate them. On the other hand, several techniques have been successfully applied to measure liquid film thicknesses during condensation and evaporation under macro-scale conditions. However, although this subject has been targeted by several leading laboratories around the world, it seems that there is no conclusive result describing a successful technique capable of measuring dynamic liquid film thickness during evaporation inside micro-scale round channels. This work presents a comprehensive literature review of the methods used to measure liquid film thickness in macro- and micro-scale systems. The methods are described and the main difficulties related to their use in micro-scale systems are identified. Based on this discussion, the most promising methods to measure dynamic liquid film thickness in micro-scale channels are identified. (author)

  3. A comparative study of Quaternary dating techniques applied to sedimentary deposits in southwest Victoria, Australia

    NASA Astrophysics Data System (ADS)

    Sherwood, J.; Barbetti, M.; Ditchburn, R.; Kimber, R. W. L.; McCabe, W.; Murray-Wallace, C. V.; Prescott, J. R.; Whitehead, N.

    At five sites in western Victoria a total of five Quaternary dating techniques have been applied to shell beds varying in age from Holocene to beyond the last interglacial. To examine the age concordancy of the methods, 89 analyses were conducted—16 by radiocarbon, 26 by uranium series disequilibrium, 26 by amino acid racemisation, 5 by thermoluminescence and 16 by electron spin resonance, the latter previously reported by Goede (1989). Uncertainties associated with diagenetic environments of samples precluded reliable numerical age assignments for beds older than Holocene. Instead, relative dating of shell beds was based on a reference site (Goose Lagoon) which was assigned to the last interglacial based on its morphostratigraphic setting and concordant results of three of the dating methods (amino acid racemisation, uranium series disequilibrium and electron spin resonance). Overall there was considerable agreement between methods although not all were applied to each site. Uranium series dating proved most problematical. Migration of radionuclides between groundwater and shells introduced large errors at one site and led to appreciable uncertainties at others.

  4. Bioclimatic and vegetation mapping of a topographically complex oceanic island applying different interpolation techniques

    NASA Astrophysics Data System (ADS)

    Garzón-Machado, Víctor; Otto, Rüdiger; del Arco Aguilar, Marcelino José

    2014-07-01

    Different spatial interpolation techniques have been applied to construct objective bioclimatic maps of La Palma, Canary Islands. Interpolation of climatic data on this topographically complex island with strong elevation and climatic gradients represents a challenge. Furthermore, meteorological stations are not evenly distributed over the island, with few stations at high elevations. We carried out spatial interpolations of the compensated thermicity index (Itc) and the annual ombrothermic Index (Io), in order to obtain appropriate bioclimatic maps by using automatic interpolation procedures, and to establish their relation to potential vegetation units for constructing a climatophilous potential natural vegetation map (CPNV). For this purpose, we used five interpolation techniques implemented in a GIS: inverse distance weighting (IDW), ordinary kriging (OK), ordinary cokriging (OCK), multiple linear regression (MLR) and MLR followed by ordinary kriging of the regression residuals. Two topographic variables (elevation and aspect), derived from a high-resolution digital elevation model (DEM), were included in OCK and MLR. The accuracy of the interpolation techniques was examined by the results of the error statistics of test data derived from comparison of the predicted and measured values. Best results for both bioclimatic indices were obtained with the MLR method with interpolation of the residuals showing the highest R2 of the regression between observed and predicted values and lowest values of root mean square errors. MLR with correction of interpolated residuals is an attractive interpolation method for bioclimatic mapping on this oceanic island since it permits one to fully account for easily available geographic information but also takes into account local variation of climatic data.
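
    The regression-plus-residual-interpolation idea can be sketched as below; for brevity the residuals are interpolated with inverse distance weighting instead of the ordinary kriging used in the study, and all station data, coordinates and the elevation predictor are hypothetical.

      import numpy as np

      def idw(xy_known, values, xy_query, power=2.0):
          d = np.linalg.norm(xy_known - xy_query, axis=1)
          if np.any(d < 1e-9):
              return values[np.argmin(d)]
          w = 1.0 / d**power
          return np.sum(w * values) / np.sum(w)

      # hypothetical stations: x, y (km), elevation (m) and a bioclimatic index value
      xy = np.array([[0.0, 0.0], [10.0, 2.0], [3.0, 9.0], [8.0, 8.0]])
      elev = np.array([50.0, 600.0, 1200.0, 2000.0])
      idx = np.array([320.0, 250.0, 180.0, 90.0])

      # 1) multiple linear regression on the topographic predictor
      X = np.column_stack([np.ones_like(elev), elev])
      beta, *_ = np.linalg.lstsq(X, idx, rcond=None)
      residuals = idx - X @ beta

      # 2) interpolate the residuals and add them back at an unsampled location
      query_xy, query_elev = np.array([5.0, 5.0]), 900.0
      prediction = beta[0] + beta[1] * query_elev + idw(xy, residuals, query_xy)
      print(f"interpolated index at the query point: {prediction:.1f}")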

  5. Bioclimatic and vegetation mapping of a topographically complex oceanic island applying different interpolation techniques.

    PubMed

    Garzón-Machado, Víctor; Otto, Rüdiger; del Arco Aguilar, Marcelino José

    2014-07-01

    Different spatial interpolation techniques have been applied to construct objective bioclimatic maps of La Palma, Canary Islands. Interpolation of climatic data on this topographically complex island with strong elevation and climatic gradients represents a challenge. Furthermore, meteorological stations are not evenly distributed over the island, with few stations at high elevations. We carried out spatial interpolations of the compensated thermicity index (Itc) and the annual ombrothermic Index (Io), in order to obtain appropriate bioclimatic maps by using automatic interpolation procedures, and to establish their relation to potential vegetation units for constructing a climatophilous potential natural vegetation map (CPNV). For this purpose, we used five interpolation techniques implemented in a GIS: inverse distance weighting (IDW), ordinary kriging (OK), ordinary cokriging (OCK), multiple linear regression (MLR) and MLR followed by ordinary kriging of the regression residuals. Two topographic variables (elevation and aspect), derived from a high-resolution digital elevation model (DEM), were included in OCK and MLR. The accuracy of the interpolation techniques was examined by the results of the error statistics of test data derived from comparison of the predicted and measured values. Best results for both bioclimatic indices were obtained with the MLR method with interpolation of the residuals showing the highest R2 of the regression between observed and predicted values and lowest values of root mean square errors. MLR with correction of interpolated residuals is an attractive interpolation method for bioclimatic mapping on this oceanic island since it permits one to fully account for easily available geographic information but also takes into account local variation of climatic data. PMID:23686111

  6. Micropillar Compression Technique Applied to Micron-Scale Mudstone Elasto-Plastic Deformation

    NASA Astrophysics Data System (ADS)

    Dewers, T. A.; Boyce, B.; Buchheit, T.; Heath, J. E.; Chidsey, T.; Michael, J.

    2010-12-01

    Mudstone mechanical testing is often limited by poor core recovery and sample size, preservation and preparation issues, which can lead to sampling bias, damage, and time-dependent effects. A micropillar compression technique, originally developed by Uchic et al. 2004, is here applied to elasto-plastic deformation of small volumes of mudstone, in the range of cubic microns. This study examines behavior of the Gothic shale, the basal unit of the Ismay zone of the Pennsylvanian Paradox Formation and potential shale gas play in southeastern Utah, USA. Micropillars 5 microns in diameter and 10 microns in length are precision-manufactured using an ion-milling method. Characterization of samples is carried out using: dual focused ion - scanning electron beam imaging of nano-scaled pores and distribution of matrix clay and quartz, as well as pore-filling organics; laser scanning confocal (LSCM) 3D imaging of natural fractures; and gas permeability, among other techniques. Compression testing of micropillars under load control is performed using two different nanoindenter techniques. Deformation of cores 0.5 cm in diameter by 1 cm in length is carried out and visualized by a microscope loading stage and laser scanning confocal microscopy. Axisymmetric multistage compression testing and multi-stress path testing is carried out using 2.54 cm plugs. Discussion of results addresses size of representative elementary volumes applicable to continuum-scale mudstone deformation, anisotropy, and size-scale plasticity effects. Other issues include fabrication-induced damage, alignment, and influence of substrate. This work is funded by the US Department of Energy, Office of Basic Energy Sciences. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.
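
    For orientation, a nanoindenter load on a pillar of the stated dimensions converts to engineering stress and strain as in the sketch below; the load and displacement values are hypothetical, not data from the study.

      import math

      d, L = 5e-6, 10e-6                         # pillar diameter and length, m
      load_uN, displacement_nm = 5000.0, 150.0   # hypothetical nanoindenter readings

      area = math.pi * d**2 / 4.0
      stress_MPa = (load_uN * 1e-6) / area / 1e6
      strain = (displacement_nm * 1e-9) / L
      print(f"engineering stress ~ {stress_MPa:.0f} MPa, strain ~ {strain:.3f}")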

  7. Evaluation of scale-up from analytical to preparative supercritical fluid chromatography.

    PubMed

    Enmark, Martin; Åsberg, Dennis; Leek, Hanna; Öhlén, Kristina; Klarqvist, Magnus; Samuelsson, Jörgen; Fornstedt, Torgny

    2015-12-18

    An approach for reliable transfer from analytical to preparative scale supercritical fluid chromatography was evaluated. Here, we accounted for the conditions inside the columns as well as for the fact that most analytical instruments are volume-controlled while most preparative scale units are mass-controlled. The latter is a particular problem when performing pilot scale experiments and optimizations prior to scaling up to production scale. This was solved by measuring the mass flow, the pressure and the temperature on the analytical unit using external sensors. Thereafter, it was revealed with a design of experiments approach that the methanol fraction and the pressure are the two most important parameters to control for preserved retention throughout the scale-up; for preserved selectivity the temperature was most important in this particular system. Using this approach, the resulting chromatograms from the preparative unit agreed well with those from the analytical unit while keeping the same column length and particle size. A brief investigation on how the solute elution volume varies with the volumetric flow rate revealed a complex dependency on pressure, density and apparent methanol content. Since the methanol content is a parameter of great importance to control during the scale up, we must be careful when changing operational and column design conditions which generate deviations in pressure, density and methanol content between different columns. PMID:26615709
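
    The volume-control versus mass-control issue noted above reduces to the mobile-phase density at the pump heads; a back-of-envelope conversion, with hypothetical densities and flow settings, is sketched below.

      # Rough sketch: mass flow delivered by a volume-controlled analytical pump depends
      # on the CO2 density at the pump head, which varies with temperature and pressure.
      rho_co2 = 0.90    # g/mL, hypothetical CO2 density at a chilled, pressurized pump head
      rho_meoh = 0.79   # g/mL, methanol
      q_total = 2.0     # mL/min total volumetric flow set on the analytical unit
      x_meoh = 0.20     # volume fraction of methanol co-solvent

      mass_flow = q_total * ((1 - x_meoh) * rho_co2 + x_meoh * rho_meoh)   # g/min
      print(f"approximate mass flow to match on the preparative unit: {mass_flow:.2f} g/min")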

  8. Theoretical and Practical Issues That Are Relevant When Scaling Up hMSC Microcarrier Production Processes.

    PubMed

    Jossen, Valentin; Schirmer, Cedric; Mostafa Sindi, Dolman; Eibl, Regine; Kraume, Matthias; Pörtner, Ralf; Eibl, Dieter

    2016-01-01

    The potential of human mesenchymal stem cells (hMSCs) for allogeneic cell therapies has created a large amount of interest. However, this presupposes the availability of efficient scale-up procedures. Promising results have been reported for stirred bioreactors that operate with microcarriers. Recent publications focusing on microcarrier-based stirred bioreactors have demonstrated the successful use of Computational Fluid Dynamics (CFD) and suspension criteria (NS1u, NS1) for rapidly scaling up hMSC expansions from mL- to pilot scale. Nevertheless, one obstacle may be the formation of large microcarrier-cell-aggregates, which may result in mass transfer limitations and inhomogeneous distributions of stem cells in the culture broth. The dependence of microcarrier-cell-aggregate formation on impeller speed and shear stress levels was investigated for human adipose derived stromal/stem cells (hASCs) at the spinner scale by recording the Sauter mean diameter (d32) versus time. Cultivation at the suspension criteria provided d32 values between 0.2 and 0.7 mm, the highest cell densities (1.25 × 10(6) cells mL(-1) hASCs), and the highest expansion factors (117.0 ± 4.7 on day 7), while maintaining the expression of specific surface markers. Furthermore, suitability of the suspension criterion NS1u was investigated for scaling up microcarrier-based processes in wave-mixed bioreactors for the first time. PMID:26981131
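
    A quick sketch of the Sauter mean diameter, d32 = sum(n_i d_i^3) / sum(n_i d_i^2), the aggregate-size metric recorded in the study; the size classes and counts below are hypothetical.

      def sauter_mean_diameter(diameters_mm, counts):
          num = sum(n * d**3 for d, n in zip(diameters_mm, counts))
          den = sum(n * d**2 for d, n in zip(diameters_mm, counts))
          return num / den

      # hypothetical aggregate size distribution
      print(sauter_mean_diameter([0.2, 0.4, 0.7], [120, 60, 15]))   # ~0.46 mm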

  9. Moving from a project to programmatic response: scaling up harm reduction in Asia.

    PubMed

    Chatterjee, Anindya; Sharma, Mukta

    2010-03-01

    The response to the HIV epidemics among people who inject drugs in Asia began to emerge in the early to mid 1990s, with the rather hesitant implementation of small-scale needle syringe programmes and community care initiatives aiming to support those who were already living with the virus. Since then Asia has seen a significant scaling up of harm reduction, despite very limited resources and difficult policy and legislative environments. One of the major reasons this has happened is the utilisation of programme based approaches and the firm entrenchment of harm reduction thinking within national HIV/AIDS programmes and strategic plans--in most cases aided by multilateral and bilateral donors. Several models of scale up have been noted in Asia. The transition away from project based approaches, while on the whole positive, can also have a negative impact if the involvement of civil society and a client focussed approach are not protected. Also there are implications for which models of capacity building can be systematised for ongoing scale up. Most crucially, the tensions between drug policy, human rights and public health policies need to be resolved if harm reduction services are to be made available to the millions in Asia who are still unable to access these services. PMID:20079618

  10. Scale-up of controlled-shear affinity filtration using computational fluid dynamics.

    PubMed

    Francis, Patrick; Haynes, Charles A

    2009-05-01

    Controlled shear affinity filtration (CSAF) is an integrated bioprocess that positions a contoured rotor above a membrane affinity chromatography column to permit the capture and purification of a secreted protein product directly from cell culture. Here, computational fluid dynamics (CFD) simulations previously used on a laboratory-scale unit (Francis et al., Biotechnol. Bioeng. 2005, 95, 1207-1217) are extended to study the fluid hydrodynamics and expected filter performance of the CSAF device for rotor sizes up to 140 cm in radius. We show that the fluid hydrodynamics within the rotor chamber of larger-scale CSAF units are complex and include turbulent boundary layers; thus, CFD likely provides the only reliable route to CSAF scale-up. We then model design improvements that will be required for CSAF scale-up to permit processing of industrial feedstock. The result is the in silico design of a preparative CSAF device with an optimized rotor 140 cm in radius. The scaled up device has an effective filtration area of 5.93 m(2), which should allow for complete processing in ca. 2 h of 1000 L of culture harvested from either a perfusion, fed-batch or batch bioreactor. Finally, a novel method for the parallelization of CSAF units is presented for use in bioprocessing operations larger than 1000 L. PMID:19452478
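
    As a back-of-envelope check using only the figures quoted above, the average flux implied by processing 1000 L through 5.93 m(2) in about 2 h is:

      volume_L, area_m2, time_h = 1000.0, 5.93, 2.0
      flux = volume_L / (area_m2 * time_h)
      print(f"required average flux: {flux:.0f} L m^-2 h^-1")   # roughly 84 L m^-2 h^-1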

  11. Fermentation scale up for α-arbutin production by Xanthomonas BT-112.

    PubMed

    Wei, Meng; Ren, Yi; Liu, Changxia; Liu, Ruican; Zhang, Peng; Wei, Yi; Xu, Tao; Wang, Fang; Tan, Tianwei; Liu, Chunqiao

    2016-09-10

    α-Arbutin is a glycosylated hydroquinone that has an inhibitory function against tyrosinase. The aim of the present study is to develop an efficient and inexpensive method for large-scale production of α-arbutin by using Xanthomonas BT-112 as biocatalyst. To accomplish this goal, various surfactants were tested to enhance the α-arbutin production, and the optimal operational conditions for a 30 L jar fermenter were scaled up to a production level of 3000 L using a constant volumetric oxygen transfer coefficient (KLa) and the volumetric aeration rate per unit volume (Q/V) as scale-up criteria. Under the optimized conditions, the α-arbutin produced in the presence of 0.4% (w/v) Tween-80 was 124.8% higher than that of the control, and the yield of α-arbutin in the 3000 L fermenter was 38.2 g/L with a molar conversion ratio of 93.7% based on the amount of hydroquinone supplied. This result is comparable to the results from the laboratory-scale fermenter. Hence, 100-fold scale-up was successfully achieved. PMID:27208754
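
    The constant-Q/V criterion mentioned above can be illustrated with a short calculation; the small-scale aeration rate assumed here is hypothetical.

      V_small, V_large = 30.0, 3000.0   # working volumes, L
      Q_small = 30.0                    # hypothetical small-scale aeration, L air/min (1 vvm)

      vvm = Q_small / V_small           # aeration rate per unit volume, min^-1
      Q_large = vvm * V_large
      print(f"Q/V = {vvm:.2f} vvm -> required aeration at 3000 L: {Q_large:.0f} L air/min")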

  12. Innovative Use of SCALE-UP for Teaching General Education Astronomy

    NASA Astrophysics Data System (ADS)

    Keller, Luke; Rogers, M.

    2006-12-01

    Current development at Ithaca College is focused on transforming our general education astronomy courses from lecture-based into hands-on, active-learning courses by using the SCALE-UP model. SCALE-UP (Student Centered Activities for Large Enrollment University Physics) pioneered at North Carolina State University expands the successes of Studio Physics (developed at RPI) to large enrollment courses. Studio Physics does away with the usual lecture/recitation/laboratory sessions by having one dynamic, active-learning environment for approximately 40 students. SCALE-UP expands this model to accommodate 100+ students by using large round tables usually seating nine students who work in groups of three. Classes meet three times per week with each class blending lecture, hands-on activities, group problem solving, and the use of student polling devices. It is expected that this mode of teaching astronomy will lead to a better understanding of astronomy and the nature of science. We just finished renovating two existing classrooms and two storerooms to create a 99-seat active learning room. This poster will present the steps we took from initial planning meetings to our current curriculum development stage. We will highlight how we obtained administration buy-in, obtained funding, and planned the renovation with our facilities staff. We will also present our plans for curriculum development and assessment of our efforts.

  13. Nurse Family Partnership: Comparing Costs per Family in Randomized Trials Versus Scale-Up.

    PubMed

    Miller, Ted R; Hendrie, Delia

    2015-12-01

    The literature that addresses cost differences between randomized trials and full-scale replications is quite sparse. This paper examines how costs differed among three randomized trials and six statewide scale-ups of nurse family partnership (NFP) intensive home visitation to low income first-time mothers. A literature review provided data on pertinent trials. At our request, six well-established programs reported their total expenditures. We adjusted the costs to national prices based on mean hourly wages for registered nurses and then inflated them to 2010 dollars. A centralized data system provided utilization. Replications had fewer home visits per family than trials (25 vs. 31, p = .05), lower costs per client ($8860 vs. $12,398, p = .01), and lower costs per visit ($354 vs. $400, p = .30). Sample size limited the significance of these differences. In this type of labor intensive program, costs probably were lower in scale-up than in randomized trials. Key cost drivers were attrition and the stable caseload size possible in an ongoing program. Our estimates reveal a wide variation in cost per visit across six state programs, which suggests that those planning replications should not expect a simple rule to guide cost estimations for scale-ups. Nevertheless, NFP replications probably achieved some economies of scale. PMID:26507844
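
    The reported per-visit costs follow from dividing cost per client by visits per family, as this simple check of the abstract's figures shows (2010 USD):

      for label, cost_per_client, visits in [("trials", 12398, 31), ("scale-up", 8860, 25)]:
          print(f"{label}: ~${cost_per_client / visits:.0f} per visit")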

  14. Theoretical and Practical Issues That Are Relevant When Scaling Up hMSC Microcarrier Production Processes

    PubMed Central

    Jossen, Valentin; Schirmer, Cedric; Mostafa Sindi, Dolman; Eibl, Regine; Kraume, Matthias; Pörtner, Ralf; Eibl, Dieter

    2016-01-01

    The potential of human mesenchymal stem cells (hMSCs) for allogeneic cell therapies has created a large amount of interest. However, this presupposes the availability of efficient scale-up procedures. Promising results have been reported for stirred bioreactors that operate with microcarriers. Recent publications focusing on microcarrier-based stirred bioreactors have demonstrated the successful use of Computational Fluid Dynamics (CFD) and suspension criteria (NS1u, NS1) for rapidly scaling up hMSC expansions from mL- to pilot scale. Nevertheless, one obstacle may be the formation of large microcarrier-cell-aggregates, which may result in mass transfer limitations and inhomogeneous distributions of stem cells in the culture broth. The dependence of microcarrier-cell-aggregate formation on impeller speed and shear stress levels was investigated for human adipose derived stromal/stem cells (hASCs) at the spinner scale by recording the Sauter mean diameter (d32) versus time. Cultivation at the suspension criteria provided d32 values between 0.2 and 0.7 mm, the highest cell densities (1.25 × 10⁶ cells mL⁻¹ hASCs), and the highest expansion factors (117.0 ± 4.7 on day 7), while maintaining the expression of specific surface markers. Furthermore, suitability of the suspension criterion NS1u was investigated for scaling up microcarrier-based processes in wave-mixed bioreactors for the first time. PMID:26981131

  15. Analysis and improvement of a scaled-up and stacked microbial fuel cell.

    PubMed

    Dekker, Arjan; Ter Heijne, Annemiek; Saakes, Michel; Hamelers, Hubertus V M; Buisman, Cees J N

    2009-12-01

    Scaling up microbial fuel cells (MFCs) is inevitable when power outputs have to be obtained that can power electrical devices other than small sensors. This research has used a bipolar plate MFC stack of four cells with a total working volume of 20 L and a total membrane surface area of 2 m(2). The cathode limited MFC performance due to oxygen reduction rate and cell reversal. Furthermore, residence time distribution curves showed that bending membranes resulted in flow paths through which the catholyte could flow from inlet to outlet, while leaving the reactants unconverted. The cathode was improved by decreasing the pH, purging pure oxygen, and increasing the flow rate, which resulted in a 13-fold power density increase to 144 W m(-3) and a volumetric resistivity of only 1.2 mΩ m(3) per cell. Both results are major achievements compared to results currently published for laboratory and scaled-up MFCs. When designing a scaled-up MFC, it is important to ensure optimal contact between electrodes and substrate and to minimize the distances between electrodes. PMID:19943685

  16. Estimates of child deaths prevented from malaria prevention scale-up in Africa 2001-2010

    PubMed Central

    2012-01-01

    Background Funding from external agencies for malaria control in Africa has increased dramatically over the past decade resulting in substantial increases in population coverage by effective malaria prevention interventions. This unprecedented effort to scale-up malaria interventions is likely improving child survival and will likely contribute to meeting Millennium Development Goal (MDG) 4 to reduce the < 5 mortality rate by two thirds between 1990 and 2015. Methods The Lives Saved Tool (LiST) model was used to quantify the likely impact that malaria prevention intervention scale-up has had on malaria mortality over the past decade (2001-2010) across 43 malaria-endemic countries in sub-Saharan Africa. The likely impact of insecticide-treated nets (ITNs) and malaria prevention interventions in pregnancy (intermittent preventive treatment [IPTp] and ITNs used during pregnancy) over this period was assessed. Results The LiST model conservatively estimates that malaria prevention intervention scale-up over the past decade has prevented 842,800 (uncertainty: 562,800-1,364,645) child deaths due to malaria across 43 malaria-endemic countries in Africa, compared to a baseline of the year 2000. Over the entire decade, this represents an 8.2% decrease in the number of malaria-caused child deaths that would have occurred over this period had malaria prevention coverage remained unchanged since 2000. The biggest impact occurred in 2010 with a 24.4% decrease in malaria-caused child deaths compared to what would have happened had malaria prevention interventions not been scaled-up beyond 2000 coverage levels. ITNs accounted for 99% of the lives saved. Conclusions The results suggest that funding for malaria prevention in Africa over the past decade has had a substantial impact on decreasing child deaths due to malaria. Rapidly achieving and then maintaining universal coverage of these interventions should be an urgent priority for malaria control programmes in the future. Successful scale-up in many

  17. The 40Ar/39Ar dating technique applied to planetary sciences

    NASA Astrophysics Data System (ADS)

    Jourdan, F.

    2012-12-01

    The 40Ar/39Ar technique is a powerful geochronological method that can help to unravel the evolution of the solar system. The 40Ar/39Ar system not only records the timing of volcanic and metamorphic processes on asteroids and planets, it is also particularly well suited to dating impact events throughout the solar system. However, the 40Ar/39Ar method is a robust analytical technique if, and only if, the events to be dated are well understood and data are not over-interpreted. Yet, too many 'ages' reported in the literature are still based on over-interpretation of perturbed age spectra, which tends to blur the big picture. This presentation is centred on the most recent applications of the 40Ar/39Ar technique to planetary material and, through several examples, will attempt to demonstrate the benefit of focusing on statistically robust data. For example, 40Ar/39Ar dating of volcanic events on the Moon suggests that volcanism was mostly concentrated between ca. 3.8 and 3.1 Ga, but statistical filtering of the data allows the identification of a few well-defined eruptive events. The study of lunar volcanism would also benefit from dating of volcanic spherules. Rigorous filtering of the 40Ar/39Ar age database of lunar melt breccias yielded concordant, high-precision ages for two major basins (i.e. Imbrium & Serenitatis) of the Moon. 40Ar/39Ar dating of lunar impact spherules recovered from four different sites and with high- and low-K compositions shows an increase of ages younger than 400 Ma, suggesting a recent increase in the impact flux. The impact history of the LL parent body (bodies?) has yet to be well constrained but may mimic the LHB observed on the Moon, which would indicate that the LL parent body was quite large. 40Ar/39Ar dating (in progress) of grains from the asteroid Itokawa recovered by the Japanese Hayabusa mission has the potential to constrain the formation history and exposure age of Itokawa and will allow us to compare the results with the
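
    For readers unfamiliar with the method, the standard age equation t = (1/lambda) * ln(1 + J * R) can be evaluated as below, where R is the radiogenic 40Ar/39Ar ratio and J the neutron-fluence parameter; the decay constant is the commonly used Steiger and Jaeger (1977) value, and the J and R inputs are hypothetical.

      import math

      LAMBDA_K40 = 5.543e-10   # total 40K decay constant, 1/yr (Steiger and Jaeger 1977)

      def ar_ar_age(R, J):
          return math.log(1.0 + J * R) / LAMBDA_K40

      print(f"{ar_ar_age(R=180.0, J=0.01) / 1e9:.2f} Ga")   # hypothetical inputs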

  18. Specimen Referral Network to Rapidly Scale-Up CD4 Testing: The Hub and Spoke Model for Haiti

    PubMed Central

    Louis, Frantz Jean; Osborne, Anna Janick; Elias, Viala Jean; Buteau, Josiane; Boncy, Jacques; Elong, Angela; Dismer, Amber; Sasi, Vikram; Domercant, Jean Wysler; Lauture, Daniel; Balajee, S Arunmozhi; Marston, Barbara J

    2016-01-01

    Objectives Regular and quality CD4 testing is essential to monitor disease progression in people living with HIV. In Haiti, most laboratories have limited infrastructure and financial resources and have relied on manual laboratory techniques. We report the successful implementation of a national specimen referral network to rapidly increase patient coverage with quality CD4 testing while at the same time building infrastructure for referral of additional sample types over time. Method Following a thorough baseline analysis of facilities, expected workload, patient volumes, cost of technology and infrastructure constraints at health institutions providing care to HIV patients, the Haitian National Public Health Laboratory designed and implemented a national specimen referral network. The specimen referral network was scaled up in a step-wise manner from July 2011 to July 2014. Results Fourteen hubs serving a total of 67 healthcare facilities have been launched; in addition, 10 healthcare facilities operate FACSCount machines, 21 laboratories operate PIMA machines, and 11 healthcare facilities are still using manual CD4 tests. The number of health institutions able to access automated CD4 testing has increased from 27 to 113 (315%). Testing volume increased 76% on average. The number of patients enrolled on ART at the first healthcare facilities to join the network increased 182% within 6 months following linkage to the network. Performance on external quality assessment was acceptable at all 14 hubs. Conclusion A specimen referral network has enabled rapid uptake of quality CD4 testing, and served as a backbone to allow for other future tests to be scaled-up in a similar way. PMID:26900489

  19. Implementation of legal abortion in Nepal: a model for rapid scale-up of high-quality care

    PubMed Central

    2012-01-01

    Unsafe abortion's significant contribution to maternal mortality and morbidity was a critical factor leading to liberalization of Nepal's restrictive abortion law in 2002. Careful, comprehensive planning among a range of multisectoral stakeholders, led by Nepal's Ministry of Health and Population, enabled the country subsequently to introduce and scale up safe abortion services in a remarkably short timeframe. This paper examines factors that contributed to rapid, successful implementation of legal abortion in this mountainous republic, including deliberate attention to the key areas of policy, health system capacity, equipment and supplies, and information dissemination. Important elements of this successful model of scaling up safe legal abortion include: the pre-existence of postabortion care services, through which health-care providers were already familiar with the main clinical technique for safe abortion; government leadership in coordinating complementary contributions from a wide range of public- and private-sector actors; reliance on public-health evidence in formulating policies governing abortion provision, which led to the embrace of medical abortion and authorization of midlevel providers as key strategies for decentralizing care; and integration of abortion care into existing Safe Motherhood and the broader health system. While challenges remain in ensuring that all Nepali women can readily exercise their legal right to early pregnancy termination, the national safe abortion program has already yielded strong positive results. Nepal's experience making high-quality abortion care widely accessible in a short period of time offers important lessons for other countries seeking to reduce maternal mortality and morbidity from unsafe abortion and to achieve Millennium Development Goals. PMID:22475782

  20. Antiretroviral Treatment Scale-Up and Tuberculosis Mortality in High TB/HIV Burden Countries: An Econometric Analysis

    PubMed Central

    Yan, Isabel; Bendavid, Eran; Korenromp, Eline L.

    2016-01-01

    Introduction Antiretroviral therapy (ART) reduces mortality in patients with active tuberculosis (TB), but the population-level relationship between ART coverage and TB mortality is untested. We estimated the reduction in population-level TB mortality that can be attributed to increasing ART coverage across 41 high HIV-TB burden countries. Methods We compiled TB mortality trends between 1996 and 2011 from two sources: (1) national program-reported TB death notifications, adjusted for annual TB case detection rates, and (2) WHO TB mortality estimates. National coverage with ART, as proportion of HIV-infected people in need, was obtained from UNAIDS. We applied panel linear regressions controlling for HIV prevalence (5-year lagged), coverage of TB interventions (estimated by WHO and UNAIDS), gross domestic product per capita, health spending from domestic sources, urbanization, and country fixed effects. Results Models suggest that increasing ART coverage was followed by reduced TB mortality, across multiple specifications. For death notifications at 2 to 5 years following a given ART scale-up, a 1% increase in ART coverage predicted 0.95% faster mortality rate decline (p = 0.002); resulting in 27% fewer TB deaths in 2011 alone than would have occurred without ART. Based on WHO death estimates, a 1% increase in ART predicted a 1.0% reduced TB death rate (p<0.001), and 31% fewer deaths in 2011. TB mortality was higher at higher HIV prevalence (p<0.001), but not related to coverage of isoniazid preventive therapy, cotrimoxazole preventive therapy, or other covariates. Conclusion This econometric analysis supports a substantial impact of ART on population-level TB mortality realized already within the first decade of ART scale-up, which is apparent despite variable-quality mortality data. PMID:27536864
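
    A hedged sketch of a country-fixed-effects panel regression in the spirit of the analysis (not the authors' exact specification); the tiny panel and variable names are illustrative assumptions.

      import pandas as pd
      import statsmodels.formula.api as smf

      # hypothetical country-year panel; real data would come from WHO/UNAIDS sources
      df = pd.DataFrame({
          "country": ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
          "year": [2005, 2008, 2011] * 3,
          "log_tb_mort": [1.9, 1.7, 1.4, 2.2, 2.1, 1.9, 1.5, 1.2, 1.0],
          "art_coverage": [5, 25, 55, 2, 15, 40, 10, 35, 70],
          "hiv_prev_lag5": [12, 13, 13, 18, 19, 19, 6, 7, 7],
          "gdp_per_capita": [800, 900, 1000, 500, 550, 600, 1500, 1700, 1900],
      })

      # country fixed effects enter through C(country)
      model = smf.ols(
          "log_tb_mort ~ art_coverage + hiv_prev_lag5 + gdp_per_capita + C(country)",
          data=df,
      ).fit()
      print(model.params)   # with real data one would also cluster standard errors by country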

  1. Unsteady vortex lattice techniques applied to wake formation and performance of the statically thrusting propeller

    NASA Technical Reports Server (NTRS)

    Hall, G. F.

    1975-01-01

    The application of vortex lattice techniques to the problem of describing the aerodynamics and performance of statically thrusting propellers is considered. A numerical lifting surface theory is used to predict the aerodynamic forces and power. The chordwise and spanwise loading is modelled by bound vortices fixed to a twisted flat plate surface. In order to eliminate any a priori assumptions regarding the wake shape, it is assumed that the propeller starts from rest. The wake is generated in time and allowed to deform under its own self-induced velocity field as the motion of the propeller progresses. The bound circulation distribution is then determined with time by applying the flow tangency boundary condition at certain selected control points on the blades. The aerodynamics of the infinite wing and finite wing are also considered. The details of wake formation and roll-up are investigated, particularly the localized induction effect. It is concluded that proper wake roll-up and roll-up rates can be established by considering the details of motion at the instant of start.
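
    The basic building block of such a vortex lattice calculation, the induced velocity of a straight vortex segment from the Biot-Savart law (in the common Katz and Plotkin form), can be sketched as follows; this is illustrative and not the report's code.

      import numpy as np

      def segment_induced_velocity(p, a, b, gamma, core=1e-6):
          # velocity induced at point p by a straight vortex segment from a to b
          r1, r2 = p - a, p - b
          r1xr2 = np.cross(r1, r2)
          sq = np.dot(r1xr2, r1xr2)
          if sq < core**2:   # point (nearly) on the segment axis
              return np.zeros(3)
          r0 = b - a
          k = np.dot(r0, r1 / np.linalg.norm(r1) - r2 / np.linalg.norm(r2))
          return gamma / (4.0 * np.pi) * r1xr2 / sq * k

      # unit-strength segment along x, evaluated one unit above its midpoint
      v = segment_induced_velocity(np.array([0.5, 0.0, 1.0]),
                                   np.array([0.0, 0.0, 0.0]),
                                   np.array([1.0, 0.0, 0.0]), gamma=1.0)
      print(v)   # purely y-directed for this geometry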

  2. A study and evaluation of image analysis techniques applied to remotely sensed data

    NASA Technical Reports Server (NTRS)

    Atkinson, R. J.; Dasarathy, B. V.; Lybanon, M.; Ramapriyan, H. K.

    1976-01-01

    An analysis of phenomena causing nonlinearities in the transformation from Landsat multispectral scanner coordinates to ground coordinates is presented. Experimental results comparing rms errors at ground control points indicated a slight improvement when a nonlinear (8-parameter) transformation was used instead of an affine (6-parameter) transformation. Using a preliminary ground truth map of a test site in Alabama covering the Mobile Bay area and six Landsat images of the same scene, several classification methods were assessed. A methodology was developed for automatic change detection using classification/cluster maps. A coding scheme was employed for generation of change depiction maps indicating specific types of changes. Inter- and intraseasonal data of the Mobile Bay test area were compared to illustrate the method. A beginning was made in the study of data compression by applying a Karhunen-Loeve transform technique to a small section of the test data set. The second part of the report provides a formal documentation of the several programs developed for the analysis and assessments presented.
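
    The 6-parameter affine transformation mentioned above can be fitted to ground control points by least squares, as in the sketch below with hypothetical control point coordinates.

      import numpy as np

      # hypothetical ground control points: scanner (pixel) and ground coordinates
      scanner = np.array([[100, 200], [250, 220], [180, 400], [300, 380], [120, 350]], float)
      ground = np.array([[5010, 8020], [5160, 8035], [5085, 8215], [5205, 8190], [5025, 8165]], float)

      A = np.column_stack([scanner, np.ones(len(scanner))])   # [x, y, 1]
      coeffs, *_ = np.linalg.lstsq(A, ground, rcond=None)     # 3x2 affine coefficients

      predicted = A @ coeffs
      rms = np.sqrt(np.mean(np.sum((predicted - ground) ** 2, axis=1)))
      print("affine coefficients:\n", coeffs)
      print(f"RMS error at control points: {rms:.2f}")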

  3. Formulation of indomethacin colon targeted delivery systems using polysaccharides as carriers by applying liquisolid technique.

    PubMed

    Elkhodairy, Kadria A; Elsaghir, Hanna A; Al-Subayiel, Amal M

    2014-01-01

    The present study aimed at the formulation of matrix tablets for colon-specific drug delivery (CSDD) system of indomethacin (IDM) by applying liquisolid (LS) technique. A CSDD system based on time-dependent polymethacrylates and enzyme degradable polysaccharides was established. Eudragit RL 100 (E-RL 100) was employed as time-dependent polymer, whereas bacterial degradable polysaccharides were presented as LS systems loaded with the drug. Indomethacin-loaded LS systems were prepared using different polysaccharides, namely, guar gum (GG), pectin (PEC), and chitosan (CH), as carriers separately or in mixtures of different ratios of 1:3, 1:1, and 3:1. Liquisolid systems that displayed promising results concerning drug release rate in both pH 1.2 and pH 6.8 were compressed into tablets after the addition of the calculated amount of E-RL 100 and lubrication with magnesium stearate and talc in the ratio of 1:9. It was found that E-RL 100 improved the flowability and compressibility of all LS formulations. The release data revealed that all formulations succeeded to sustain drug release over a period of 24 hours. Stability study indicated that PEC-based LS system as well as its matrix tablets was stable over the period of storage (one year) and could provide a minimum shelf life of two years. PMID:24971345

  4. Sampled-Data Techniques Applied to a Digital Controller for an Altitude Autopilot

    NASA Technical Reports Server (NTRS)

    Schmidt, Stanley F.; Harper, Eleanor V.

    1959-01-01

    Sampled-data theory, using the Z transformation, is applied to the design of a digital controller for an aircraft-altitude autopilot. Particular attention is focused on the sensitivity of the design to parameter variations and the abruptness of the response, that is, the normal acceleration required to carry out a transient maneuver. Consideration of these two characteristics of the system has shown that the finite settling time design method produces an unacceptable system, primarily because of the high sensitivity of the response to parameter variations, although abruptness can be controlled by increasing the sampling period. Also demonstrated is the importance of having well-damped poles or zeros if cancellation is attempted in the design methods. A different method of smoothing the response and obtaining a design which is not excessively sensitive is proposed, and examples are carried through to demonstrate the validity of the procedure. This method is based on design concepts of continuous systems, and it is shown that if no pole-zero cancellations are allowed in the design, one can obtain a response which is not too abrupt, is relatively insensitive to parameter variations, and is not sensitive to practical limits on control-surface rate. This particular design also has the simplest possible pulse transfer function for the digital controller. Simulation techniques and root loci are used for the verification of the design philosophy.
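
    One routine sampled-data step related to the report, obtaining a pulse transfer function from a continuous compensator via the bilinear (Tustin) transform, is sketched below; the compensator and sampling period are hypothetical, and this is not the report's finite settling time design.

      from scipy.signal import cont2discrete

      num, den = [2.0, 1.0], [0.1, 1.0]   # hypothetical lead compensator C(s) = (2s + 1)/(0.1s + 1)
      T = 0.05                            # hypothetical sampling period, s

      num_d, den_d, _ = cont2discrete((num, den), T, method="bilinear")
      print("pulse transfer function numerator:  ", num_d.ravel())
      print("pulse transfer function denominator:", den_d)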

  5. Scaling-up the medical workforce in Timor-Leste: challenges of a great leap forward.

    PubMed

    Cabral, Jorge; Dussault, Gilles; Buchan, James; Ferrinho, Paulo

    2013-11-01

    The health services system of Timor-Leste (T-L) will, by 2015, add 800 physicians, most of them trained in Cuba, to the 233 employed by the national health system in 2010-2011. The need for more physicians is not in discussion: poor health indicators, low coverage and utilization of services, and poor quality of services are well documented in T-L. However, the choice of this scaling-up, with a relatively narrow focus on the medical workforce, needs to be assessed for its relevance to the health profile of the country, for its comprehensiveness in terms of other complementary measures needed to make it effective. This article discusses the potential effects of the rapid scaling-up of the medical workforce, and the organizational capacity needed to monitor the process and eventually mitigate any deleterious consequences. The analysis is based on a review of documentation collected on site (T-L) and on interviews with key-informants conducted in 2011. We stress that any workforce scaling-up is not simply a matter of increasing numbers of professionals, but should combine improved training, distribution, working conditions, management and motivation, as a means towards better performing health services' systems. This is a major challenge in a context of limited organizational and managerial capacity, underdeveloped information systems, limited training and research capacity, and dependency on foreign aid and technical assistance. Potential risks are associated with funding the additional costs of recruiting more personnel, associated expenditures on infrastructure, equipment and consumables, the impact on current staff mix, and the expected increased demand for services. We conclude that failing to manage effectively the forthcoming "great leap forward" will have long term effects: formal policies and plans for the balanced development of the health workforce, as well as strengthened institutions are urgently needed. PMID:23932856

  6. Formative evaluation of antiretroviral therapy scale-up efficiency in sub-Saharan Africa.

    PubMed

    Wagner, Glenn; Ryan, Gery; Taylor, Stephanie

    2007-11-01

    With millions in need of HIV antiretroviral therapy (ART) in the developing world, and scarce human and fiscal resources available, we conducted a formative evaluation of scale-up operations at clinics associated with AIDS Healthcare Foundation in Africa to identify lessons learned for improving scale-up efficiency. Site visits were made to six selected clinics in Uganda, Zambia, and South Africa, during which semistructured interviews with key stake-holders and observation of client flows and clinic operations were performed. This evaluation revealed the following lessons related to factors that are critical to efficient ART scale-up: (1) to ensure steady ART uptake, it is important to involve the community and community leaders in outreach, HIV education, and program decision-making; (2) minimizing bottlenecks to smooth patient flow requires efficient staff allocation to appropriate clinical duties, streamlined clinic visit schedule protocols, and tapping clients and the HIV community as a key source of labor; (3) to minimize clients dropping out of care, structures should be developed that enable clients to provide support and a "safety net" for helping each other remain in care; (4) computerized record management systems are essential for accurate antiretroviral inventory and dispensing records, quality assurance monitoring, and client enrollment records and visit scheduling; (5) effective organizational management and human resource policies are essential to maintain high job performance and satisfaction and limit burnout; (6) to maximize impact on social and economic health, it is important for ART programs to develop effective mechanisms for coordinating and referring clients to support service organizations. PMID:18240896

  7. Challenges Associated with Scaling up Artemisinin Combination Therapy in Sub-Saharan Africa A Review Article

    PubMed Central

    Njuguna, J; Qader, SS

    2008-01-01

    Malaria is the leading cause of morbidity and mortality in Sub-Saharan Africa. One key strategic intervention is provision of early diagnosis and prompt effective treatment. A major setback has been the development of drug resistance to commonly used antimalarials. To overcome this, most countries in Sub-Saharan Africa have adopted Artemisinin Combination Therapy (ACT) as a first line treatment for uncomplicated malaria. Artemether Lumefantrine (AL) and Artesunate Amodiaquine (ASAQ) are the main drugs of choice. There are key implementation issues, which may have a bearing on the scaling up of this new treatment. This article reviewed the published papers on ACT with focus on sustainability, compliance, and diagnosis. ACTs are costly, but highly effective. Their scaling up is the most cost effective malaria intervention currently available. Most countries rely heavily on the Global Fund for their scaling up. AL has a short shelf life, a complicated six-dose regimen that requires intake with fat to ensure sufficient bioavailability. High rates of adherence have been reported. Use of parasitic diagnosis is advocated to ensure rational use. Parasitic diagnostics like rapid test and microscopy are currently inadequate. The majority of malaria cases may continue to be diagnosed clinically leading to over prescription of drugs. ACTs are currently not available at the community level for home based management of malaria. Issues related to safety and rational use need to be addressed before their use in the informal health sector like community drug sellers and community health workers. The majority of malaria cases at the community level could go untreated or continue to be treated using less effective drugs. We conclude that ACTs are highly effective. A major challenge is ensuring rational use and access at the household level. It is hoped that addressing these issues will increase the likelihood that ACT achieves its intended goals of reducing morbidity and mortality

  8. Scale-up considerations relevant to experimental studies of nuclear waste-package behavior

    SciTech Connect

    Coles, D.G.; Peters, R.D.

    1986-04-01

    Results from a study that investigated whether testing large-scale nuclear waste-package assemblages was technically warranted are reported. It was recognized that the majority of the investigations for predicting waste-package performance to date have relied primarily on laboratory-scale experimentation. However, methods for the successful extrapolation of the results from such experiments, both geometrically and over time, to actual repository conditions have not been well defined. Because a well-developed scaling technology exists in the chemical-engineering discipline, it was presupposed that much of this technology could be applicable to the prediction of waste-package performance. A review of existing literature documented numerous examples where a consideration of scaling technology was important. It was concluded that much of the existing scale-up technology is applicable to the prediction of waste-package performance for both size and time extrapolations and that conducting scale-up studies may be technically merited. However, the applicability for investigating the complex chemical interactions needs further development. It was recognized that the complexity of the system, and the long time periods involved, renders a completely theoretical approach to performance prediction almost hopeless. However, a theoretical and experimental study was defined for investigating heat and fluid flow. It was concluded that conducting scale-up modeling and experimentation for waste-package performance predictions is possible using existing technology. A sequential series of scaling studies, both theoretical and experimental, will be required to formulate size and time extrapolations of waste-package performance.

  9. Advanced modeling to accelerate the scale up of carbon capture technologies

    SciTech Connect

    Miller, David C.; Sun, XIN; Storlie, Curtis B.; Bhattacharyya, Debangsu

    2015-06-01

    In order to help meet the goals of the DOE carbon capture program, the Carbon Capture Simulation Initiative (CCSI) was launched in early 2011 to develop, demonstrate, and deploy advanced computational tools and validated multi-scale models to reduce the time required to develop and scale-up new carbon capture technologies. This article focuses on essential elements related to the development and validation of multi-scale models in order to help minimize risk and maximize learning as new technologies progress from pilot to demonstration scale.

  10. Scale-up of microwave nitridation of sintered reaction bonded silicon nitride parts. Final report

    SciTech Connect

    Tiegs, T.N.; Kiggans, J.O.; Garvey, G.A.

    1997-10-01

    Scale-up studies were performed in which microwave heating was used to fabricate reaction-bonded silicon nitride and sintered reaction-bonded silicon nitride (SRBSN). Tests were performed in both a 2.45 GHz, 500 liter multimode cavity and a 2.45 GHz, 4000 liter multimode cavity. The silicon preforms processed in the studies were clevis pins for diesel engines. Up to 230 samples were processed in a single microwave furnace run. Data were collected which included weight gains for nitridation, and sintering studies were performed using a conventional resistance-heated furnace.

  11. Recommendations for scale-up of community-based misoprostol distribution programs.

    PubMed

    Robinson, Nuriya; Kapungu, Chisina; Carnahan, Leslie; Geller, Stacie

    2014-06-01

    Community-based distribution of misoprostol for prevention of postpartum hemorrhage (PPH) in resource-poor settings has been shown to be safe and effective. However, global recommendations for prenatal distribution and monitoring within a community setting are not yet available. In order to successfully translate misoprostol and PPH research into policy and practice, several critical points must be considered. A focus on engaging the community, emphasizing the safe nature of community-based misoprostol distribution, supply chain management, effective distribution, coverage, and monitoring plans are essential elements to community-based misoprostol program introduction, expansion, or scale-up. PMID:24680582

  12. Scaling-up treatment for HIV/AIDS: lessons learned from multidrug-resistant tuberculosis.

    PubMed

    Gupta, Rajesh; Irwin, Alexander; Raviglione, Mario C; Kim, Jim Yong

    2004-01-24

    The UN has launched an initiative to place 3 million people in developing countries on antiretroviral AIDS treatment by end 2005 (the 3 by 5 target). Lessons for HIV/AIDS treatment scale-up emerge from recent experience with multidrug-resistant tuberculosis. Expansion of treatment for multidrug-resistant tuberculosis through the multipartner mechanism known as the Green Light Committee (GLC) has enabled gains in areas relevant to 3 by 5, including policy development, drug procurement, rational use of drugs, and the strengthening of health systems. The successes of the GLC and the obstacles it has encountered provide insights for building sustainable HIV/AIDS treatment programmes. PMID:14751708

  13. SCALE-UP Your Astronomy and Physics Undergraduate Courses to Incorporate Heliophysics

    NASA Astrophysics Data System (ADS)

    Al-Rawi, Ahlam N.; Cox, Amanda; Hoshino, Laura; Fitzgerald, Cullen; Cebulka, Rebecca; Rodriguez Garrigues, Alvar; Montgomery, Michele; Velissaris, Chris; Flitsiyan, Elena

    2016-01-01

    Although physics and astronomy courses include heliophysics topics, students still leave these courses without knowing what heliophysics is and how heliophysics relates to their daily lives. To meet the goals of NASA's Living With a Star Program of incorporating heliophysics into the undergraduate curriculum, UCF Physics has modified courses such as Astronomy (for non-science majors), Astrophysics, and SCALE-UP: Electricity and Magnetism for Engineers and Scientists to incorporate heliophysics topics. In this presentation, we discuss these incorporations and give examples that have been published in NASA Wavelength. In an associated poster, we present data on student learning.

  14. Application of information and communication technology for scaling up youth sexual and reproductive health.

    PubMed

    Edouard, Elizabeth; Edouard, Lindsay

    2012-06-01

    The pervasive presence of the internet and the ubiquity of mobile devices have changed modalities for information exchange. Recent developments in information and communication technology (ICT) have specific implications for the dissemination of information among youth, as exemplified by the Arab Spring. The opportunity presented by these emerging technologies should be seized. ICT platforms should be used to scale up policies and programmes that promote the sexual and reproductive health of youth because of their low cost, increased access to remote populations, better efficiency, and improved flexibility for programming. Successful models should be identified through programme evaluation. PMID:22916552

  15. The scale-up and design of pressure hydrometallurgical process plants

    NASA Astrophysics Data System (ADS)

    Campbell, F.; Vardill, W. D.; Trytten, L.

    1999-09-01

    This article reviews more than 45 years of experience in the scale-up of pressure hydrometallurgical processes, from the pioneering collaboration between Sherritt and Chemical Construction Company to current process development by their successor, Dynatec Corporation. The evolution of test work is discussed, from traditional pilot-plant operations using semicommercial equipment to small scale or minipiloting with equipment several thousand times smaller than commercial units. Nickel, uranium, zinc, and gold processes have been developed and successfully implemented in worldwide operations treating a variety of feed materials, including concentrates, ores, and mattes. Data on test work duration and the ramp-up of commercial plants are presented.

  16. Scaling Up Early Infant Male Circumcision: Lessons From the Kingdom of Swaziland

    PubMed Central

    Fitzgerald, Laura; Benzerga, Wendy; Mirira, Munamato; Adamu, Tigistu; Shissler, Tracey; Bitchong, Raymond; Malaza, Mandla; Mamba, Makhosini; Mangara, Paul; Curran, Kelly; Khumalo, Thembisile; Mlambo, Phumzile; Njeuhmeli, Emmanuel; Maziya, Vusi

    2016-01-01

    ABSTRACT Background: The government of the Kingdom of Swaziland recognizes that it must urgently scale up HIV prevention interventions, such as voluntary medical male circumcision (VMMC). Swaziland has adopted a 2-phase approach to male circumcision scale-up. The catch-up phase prioritizes VMMC services for adolescents and adults, while the sustainability phase involves the establishment of early infant male circumcision (EIMC). Swaziland does not have a modern-day tradition of circumcision, and the VMMC program has met with client demand challenges. However, since the launch of the EIMC program in 2010, Swaziland now leads the Eastern and Southern Africa region in the scale-up of EIMC. Here we review Swaziland’s program and its successes and challenges. Methods: From February to May 2014, we collected data while preparing Swaziland’s “Male Circumcision Strategic and Operational Plan for HIV Prevention 2014–2018.” We conducted structured stakeholder focus group discussions and in-depth interviews, and we collected EIMC service delivery data from an implementing partner responsible for VMMC and EIMC service delivery. Data were summarized in consolidated narratives. Results: Between 2010 and 2014, trained providers performed more than 5,000 EIMCs in 11 health care facilities in Swaziland, and they reported no moderate or severe adverse events. According to a broad group of EIMC program stakeholders, an EIMC program needs robust support from facility, regional, and national leadership, both within and outside of HIV prevention coordination bodies, to promote institutionalization and ownership. Providers and health care managers in 3 of Swaziland’s 4 regional hospitals suggest that when EIMC is introduced into reproductive, maternal, newborn, and child health platforms, dedicated staff attention can help ensure that EIMC is performed amid competing priorities. Creating informed demand from communities also supports EIMC as a service delivery priority

  17. An integrated health sector response to violence against women in Malaysia: lessons for supporting scale up

    PubMed Central

    2012-01-01

    Background Malaysia has been at the forefront of the development and scale up of One-Stop Crisis Centres (OSCC) - an integrated health sector model that provides comprehensive care to women and children experiencing physical, emotional and sexual abuse. This study explored the strengths and challenges faced during the scaling up of the OSCC model to two States in Malaysia in order to identify lessons for supporting successful scale-up. Methods In-depth interviews were conducted with health care providers, policy makers and key informants in 7 hospital facilities. This was complemented by a document analysis of hospital records and protocols. Data were coded and analysed using NVivo 7. Results The implementation of the OSCC model differed between hospital settings, with practice being influenced by organisational systems and constraints. Health providers generally tried to offer care to abused women, but they were not fully supported within their facilities due to lack of training, time constraints, a limited allocated budget, or the lack of a referral system to external support services. Non-specialised hospitals in both States struggled with a scarcity of specialised staff and limited referral options for abused women. Despite these challenges, even in more resource-constrained settings staff who took the initiative found it was possible to adapt to provide some level of OSCC services, such as referring women to local NGOs or community support groups, or training nurses to offer basic counselling. Conclusions The national implementation of OSCC provides a potentially important source of support for women experiencing violence. Our findings confirm that pilot interventions for health sector responses to gender-based violence can be scaled up only when there is a sound health infrastructure in place – in other words a supportive health system. Furthermore, the successful replication of the OSCC model in other similar settings requires that the model – and the system

  18. 78 FR 32381 - Applications for New Awards, Investing in Innovation Fund, Scale-up and Validation Grants...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-30

    ... Scale-up and Validation grants (78 FR 25977) and (78 FR 25990). The NIAs inadvertently omitted part of... Applications for New Awards, Investing in Innovation Fund, Scale- up and Validation Grants; Correction AGENCY... Validation grants, the grantee must also ensure that the data from its evaluation are made available to...

  19. Principles for Scaling Up: Choosing, Measuring Effects, and Promoting the Widespread Use of Educational Innovation. CSE Report 634

    ERIC Educational Resources Information Center

    Baker, Eva L.

    2004-01-01

    The goal of scaling up of educational innovation is to produce robust, effective, replicable outcomes. This report addresses requirements to support scale-up of scientifically vetted innovation (or new ideas that are built on the findings of quality research and development). In this report, a number of issues are considered: the context of…

  20. Multidisciplinary Design Techniques Applied to Conceptual Aerospace Vehicle Design. Ph.D. Thesis Final Technical Report

    NASA Technical Reports Server (NTRS)

    Olds, John Robert; Walberg, Gerald D.

    1993-01-01

    Multidisciplinary design optimization (MDO) is an emerging discipline within aerospace engineering. Its goal is to bring structure and efficiency to the complex design process associated with advanced aerospace launch vehicles. Aerospace vehicles generally require input from a variety of traditional aerospace disciplines - aerodynamics, structures, performance, etc. As such, traditional optimization methods cannot always be applied. Several multidisciplinary techniques and methods were proposed as potentially applicable to this class of design problem. Among the candidate options are calculus-based (or gradient-based) optimization schemes and parametric schemes based on design of experiments theory. A brief overview of several applicable multidisciplinary design optimization methods is included. Methods from the calculus-based class and the parametric class are reviewed, but the research application reported focuses on methods from the parametric class. A vehicle of current interest was chosen as a test application for this research. The rocket-based combined-cycle (RBCC) single-stage-to-orbit (SSTO) launch vehicle combines elements of rocket and airbreathing propulsion in an attempt to produce an attractive option for launching medium-sized payloads into low earth orbit. The RBCC SSTO presents a particularly difficult problem for traditional one-variable-at-a-time optimization methods because of the lack of an adequate experience base and the highly coupled nature of the design variables. MDO, however, with its structured approach to design, is well suited to this problem. The results of the application of Taguchi methods, central composite designs, and response surface methods to the design optimization of the RBCC SSTO are presented. Attention is given to the aspect of Taguchi methods that attempts to locate a 'robust' design - that is, a design that is least sensitive to uncontrollable influences on the design. Near-optimum minimum dry weight solutions are
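
    As a generic illustration of the parametric-class methods named above, the sketch below fits a quadratic response surface to a small two-factor central composite design; the factor names and response values are hypothetical stand-ins, not the RBCC SSTO design variables or results.

      # Illustrative sketch only: quadratic response surface fitted to a
      # two-factor central composite design; all numbers are hypothetical.
      import numpy as np

      a = np.sqrt(2.0)                                   # axial distance in coded units
      X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],  # factorial points
                    [-a, 0], [a, 0], [0, -a], [0, a],    # axial points
                    [0, 0]])                             # center point
      y = np.array([10.2, 9.1, 9.8, 8.4, 10.0, 8.6, 9.9, 9.0, 9.2])  # hypothetical responses

      # Quadratic model terms: 1, x1, x2, x1*x2, x1^2, x2^2
      x1, x2 = X[:, 0], X[:, 1]
      A = np.column_stack([np.ones(len(y)), x1, x2, x1 * x2, x1**2, x2**2])

      coef, *_ = np.linalg.lstsq(A, y, rcond=None)       # least-squares surface fit
      print("response surface coefficients:", np.round(coef, 3))

    A fitted surface of this kind can then be searched for near-optimum and robust settings, in the spirit of the Taguchi and response-surface steps described above.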

  1. Large-scale biomass plantings in Minnesota: Scale-up and demonstration projects in perspective

    SciTech Connect

    Kroll, T.; Downing, M.

    1995-09-01

    Scale-up projects are an important step toward demonstration and commercialization of woody biomass because simply planting extensive acreage of hybrid poplar will not develop markets. Project objectives are to document the cost to plant and establish, and effort needed to monitor and maintain woody biomass on agricultural land. Conversion technologies and alternative end-uses are examined in a larger framework in order to afford researchers and industrial partners information necessary to develop supply and demand on a local or regional scale. Likely to be determined are risk factors of crop failure and differences between establishment of research plots and agricultural scale field work. Production economics are only one consideration in understanding demonstration and scale-up. Others are environmental, marketing, industrial, and agricultural in nature. Markets for energy crops are only beginning to develop. Although information collected as a result of planting up to 5000 acres of hybrid poplar in central Minnesota will not necessarily be transferable to other areas of the country, a national perspective will come from development of regional markets for woody and herbaceous crops. Several feedstocks, with alternative markets in different regions will eventually comprise the entire picture of biofuels feedstock market development. Current projects offer opportunities to learn about the complexity and requirements that will move biomass from research and development to actual market development. These markets may include energy and other end-uses such as fiber.

  2. Scaling-Up Quantum Heat Engines Efficiently via Shortcuts to Adiabaticity

    NASA Astrophysics Data System (ADS)

    Beau, Mathieu; Jaramillo, Juan; del Campo, Adolfo

    2016-04-01

    The finite-time operation of a quantum heat engine that uses a single particle as a working medium generally increases the output power at the expense of inducing friction that lowers the cycle efficiency. We propose to scale up a quantum heat engine utilizing a many-particle working medium in combination with the use of shortcuts to adiabaticity to boost the nonadiabatic performance by eliminating quantum friction and reducing the cycle time. To this end, we first analyze the finite-time thermodynamics of a quantum Otto cycle implemented with a quantum fluid confined in a time-dependent harmonic trap. We show that nonadiabatic effects can be controlled and tailored to match the adiabatic performance using a variety of shortcuts to adiabaticity. As a result, the nonadiabatic dynamics of the scaled-up many-particle quantum heat engine exhibits no friction and the cycle can be run at maximum efficiency with a tunable output power. We demonstrate our results with a working medium consisting of particles with inverse-square pairwise interactions, that includes noninteracting and hard-core bosons as limiting cases.
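
    For orientation, the adiabatic quantum Otto cycle with a harmonic working medium driven between trap frequencies \omega_1 (cold stroke) and \omega_2 > \omega_1 (hot stroke) has the textbook efficiency

      \eta_{\mathrm{ad}} = 1 - \frac{\omega_1}{\omega_2},

    and shortcuts to adiabaticity aim to recover this bound in finite time. This standard single-particle result is quoted here only as background; the paper's many-particle analysis is more general.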

  3. A practical approach for the scale-up of roller compaction process.

    PubMed

    Shi, Weixian; Sprockel, Omar L

    2016-09-01

    An alternative approach for the scale-up of ribbon formation during roller compaction was investigated, which required only one batch at the commercial scale to set the operational conditions. The scale-up of ribbon formation was based on a probability method. It was sufficient in describing the mechanism of ribbon formation at both scales. In this method, a statistical relationship between roller compaction parameters and ribbon attributes (thickness and density) was first defined with DoE using a pilot Alexanderwerk WP120 roller compactor. While the milling speed was included in the design, it has no practical effect on granule properties within the study range despite its statistical significance. The statistical relationship was then adapted to a commercial Alexanderwerk WP200 roller compactor with one experimental run. The experimental run served as a calibration of the statistical model parameters. The proposed transfer method was then confirmed by conducting a mapping study on the Alexanderwerk WP200 using a factorial DoE, which showed a match between the predictions and the verification experiments. The study demonstrates the applicability of the roller compaction transfer method using the statistical model from the development scale calibrated with one experiment point at the commercial scale. PMID:26883853
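
    A minimal sketch of the general transfer idea follows, assuming a linear pilot-scale model whose intercept is re-calibrated with a single commercial-scale run; the parameter names, units, and numbers are hypothetical stand-ins, not the WP120/WP200 model of the study.

      # Hedged sketch: one-point recalibration of a pilot-scale statistical model
      # at the commercial scale. All settings and responses are hypothetical.
      import numpy as np

      # Pilot-scale DoE: columns are roll force (kN/cm) and gap (mm);
      # response is ribbon relative density.
      X_pilot = np.array([[4, 1.5], [4, 2.5], [8, 1.5], [8, 2.5], [6, 2.0]])
      y_pilot = np.array([0.62, 0.58, 0.72, 0.68, 0.66])

      A = np.column_stack([np.ones(len(y_pilot)), X_pilot])
      beta, *_ = np.linalg.lstsq(A, y_pilot, rcond=None)  # [intercept, b_force, b_gap]

      # One calibration run on the commercial unit at known settings.
      x_comm = np.array([6.0, 2.0])
      y_comm_measured = 0.70

      # Keep the slopes; shift the intercept so the model passes through the
      # single commercial-scale observation.
      beta_comm = beta.copy()
      beta_comm[0] += y_comm_measured - (beta[0] + beta[1:] @ x_comm)

      predict = lambda x: beta_comm[0] + beta_comm[1:] @ np.asarray(x, float)
      print("predicted commercial ribbon density at 8 kN/cm, 1.5 mm:",
            round(predict([8.0, 1.5]), 3))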

  4. Characterization of enzymatically modified rice and barley starches with amylosucrase at scale-up production.

    PubMed

    Kim, Bum-Su; Kim, Hyun-Seok; Yoo, Sang-Ho

    2015-07-10

    Physicochemical properties of Neisseria polysaccharea amylosucrase (NpAS)-treated rice and barley starches were investigated at scale-up production. Pre-gelatinized rice and barley starches were treated with a significantly lower NpAS dose (0.1 U/mL) but a 100 times larger reaction volume (3500 mL), compared to the analytical scale (35 mL) used in the previous study. NpAS-treated starches in this scale-up production were characterized with respect to reaction efficiency (RE), resistant starch (RS) content, amylopectin (AP) branch-chain length distribution, solubility, swelling power, pasting viscosity, and thermal transition properties. The RE was enhanced up to 1.8-fold by increasing the reaction volume, which improved the RS content and AP branch-chain lengths of NpAS-treated starches. Compared with the native starch, NpAS-treated starches exhibited lower solubility and swelling power, lower pasting viscosity, and a large increase in the melting peak temperature. Consequently, NpAS treatment of pre-gelatinized starches in this study would be a potential approach for commercial RS production. PMID:25857960

  5. Scaling up early infant diagnosis of HIV in Rwanda, 2008-2010.

    PubMed

    Binagwaho, Agnes; Mugwaneza, Placidie; Irakoze, Ange Anitha; Nsanzimana, Sabin; Agbonyitor, Mawuena; Nutt, Cameron T; Wagner, Claire M; Rukundo, Alphonse; Ahayo, Anita; Drobac, Peter; Karema, Corine; Hinda, Ruton; Leung, Lucinda; Bandara, Sachini; Chopyak, Elena; Fawzi, Mary C Smith

    2013-01-01

    More than 390,000 children are newly infected with HIV each year, only 28 per cent of whom benefit from early infant diagnosis (EID). Rwanda's Ministry of Health identified several major challenges hindering EID scale-up in care of HIV-positive infants. It found poor counseling and follow-up by caregivers of HIV-exposed infants, lack of coordination with maternal and child health-care programs, and long delays between the collection of samples and return of results to the health facility and caregiver. By increasing geographic access, integrating EID with vaccination programs, and investing in a robust mobile phone reporting system, Rwanda increased population coverage of EID from approximately 28 to 72.4 per cent (and to 90.3 per cent within the prevention of mother to child transmission program) between 2008 and 2011. Turnaround time from sample collection to receipt of results at the originating health facility was reduced from 144 to 20 days. Rwanda rapidly scaled up and improved its EID program, but challenges persist for linking infected infants to care. PMID:23191941

  6. Final Aperture Superposition Technique applied to fast calculation of electron output factors and depth dose curves

    SciTech Connect

    Faddegon, B.A.; Villarreal-Barajas, J.E.

    2005-11-15

    The Final Aperture Superposition Technique (FAST) is described and applied to accurate, near instantaneous calculation of the relative output factor (ROF) and central axis percentage depth dose curve (PDD) for clinical electron beams used in radiotherapy. FAST is based on precalculation of dose at select points for the two extreme situations of a fully open final aperture and a final aperture with no opening (fully shielded). This technique is different from conventional superposition of dose deposition kernels: The precalculated dose is differential in position of the electron or photon at the downstream surface of the insert. The calculation for a particular aperture (x-ray jaws or MLC, insert in electron applicator) is done with superposition of the precalculated dose data, using the open field data over the open part of the aperture and the fully shielded data over the remainder. The calculation takes explicit account of all interactions in the shielded region of the aperture except the collimator effect: Particles that pass from the open part into the shielded part, or vice versa. For the clinical demonstration, FAST was compared to full Monte Carlo simulation of 10x10, 2.5x2.5, and 2x8 cm{sup 2} inserts. Dose was calculated to 0.5% precision in 0.4x0.4x0.2 cm{sup 3} voxels, spaced at 0.2 cm depth intervals along the central axis, using detailed Monte Carlo simulation of the treatment head of a commercial linear accelerator for six different electron beams with energies of 6-21 MeV. Each simulation took several hours on a personal computer with a 1.7 GHz processor. The calculation for the individual inserts, done with superposition, was completed in under a second on the same PC. Since simulations for the precalculation are only performed once, higher precision and resolution can be obtained without increasing the calculation time for individual inserts. Fully shielded contributions were largest for small fields and high beam energy, at the surface, reaching a
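
    A minimal sketch of the superposition step is given below, assuming two hypothetical precalculated contribution maps (fully open and fully shielded) for a single calculation point and a circular insert mask; in the real method such data come from Monte Carlo precalculation and cover many points, beams, and energies.

      # Sketch only: superpose open-field data over the open region of the
      # aperture and fully shielded data over the remainder. The maps and the
      # circular insert mask are hypothetical placeholders.
      import numpy as np

      ny, nx = 64, 64
      yy, xx = np.mgrid[0:ny, 0:nx]

      # Hypothetical precalculated per-cell contributions to one calculation point.
      D_open = np.exp(-((xx - nx / 2)**2 + (yy - ny / 2)**2) / 400.0)
      D_shielded = 0.05 * D_open                 # small leakage/scatter contribution

      # Aperture mask: True where the insert is open (circular cutout here).
      open_mask = (xx - nx / 2)**2 + (yy - ny / 2)**2 < 15**2

      # Superposition over the aperture plane.
      dose = D_open[open_mask].sum() + D_shielded[~open_mask].sum()

      # Relative output factor versus a fully open reference field.
      rof = dose / D_open.sum()
      print("relative output factor (sketch):", round(rof, 3))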

  7. Scale-up and evaluation of high solid ionic liquid pretreatment and enzymatic hydrolysis of switchgrass

    PubMed Central

    2013-01-01

    Background Ionic liquid (IL) pretreatment is receiving significant attention as a potential process that enables fractionation of lignocellulosic biomass and produces high yields of fermentable sugars suitable for the production of renewable fuels. However, successful optimization and scale up of IL pretreatment involves challenges, such as high solids loading, biomass handling and transfer, washing of pretreated solids and formation of inhibitors, which are not addressed during the development stages at the small scale in a laboratory environment. In a first for the research community, the Joint BioEnergy Institute, in collaboration with the Advanced Biofuels Process Demonstration Unit, a Department of Energy funded facility that supports academic and industrial entities in scaling their novel biofuels enabling technologies, has performed benchmark studies to identify key challenges associated with IL pretreatment using 1-ethyl-3-methylimidazolium acetate and subsequent enzymatic saccharification beyond bench scale. Results Using switchgrass as the model feedstock, we have successfully executed a 600-fold scale-up of IL pretreatment, relative to the bench scale (6 L vs 0.01 L), at 15% (w/w) biomass loading. Results show that IL pretreatment at 15% biomass generates a product containing 87.5% of glucan, 42.6% of xylan and only 22.8% of lignin relative to the starting material. The pretreated biomass is efficiently converted into monosaccharides during subsequent enzymatic hydrolysis at 10% loading over a 150-fold scale of operations (1.5 L vs 0.01 L) with 99.8% fermentable sugar conversion. The yields of glucose and xylose in the liquid streams were 94.8% and 62.2%, respectively, and the hydrolysate generated contains high titers of fermentable sugars (62.1 g/L of glucose and 5.4 g/L cellobiose). The overall glucan and xylan balances from pretreatment and saccharification were 95.0% and 77.1%, respectively. Enzymatic inhibition by [C2mim][OAc] at high solids

  8. Process engineering and scale-up of autotrophic Clostridium strain P11 syngas fermentation

    NASA Astrophysics Data System (ADS)

    Kundiyana, Dimple Kumar Aiyanna

    Scope and Method of Study. Biomass gasification followed by fermentation of syngas to ethanol is a potential process to produce bioenergy. The process is currently being researched under laboratory- and pilot-scale in an effort to optimize the process conditions and make the process feasible for commercial production of ethanol and other biofuels such as butanol and propanol. The broad objectives of the research were to improve ethanol yields during syngas fermentation and to design an economical fermentation process. The research included four statistically designed experimental studies in serum bottles, bench-scale and pilot-scale fermentors to screen alternate fermentation media components, to determine the effect of process parameters such as pH, temperature and buffer on syngas fermentation, to determine the effect of key limiting nutrients of the acetyl-CoA pathway in a continuous series reactor design, and to scale-up the syngas fermentation in a 100-L pilot scale fermentor. Findings and Conclusions. The first experimental study identified cotton seed extract (CSE) as a feasible medium for Clostridium strain P11 fermentation. The study showed that CSE at 0.5 g L-1 can potentially replace all the standard Clostridium strain P11 fermentation media components while using a media buffer did not significantly improve the ethanol production when used in fermentation with CSE. Scale-up of the CSE fermentation in 2-L and 5-L stirred tank fermentors showed 25% increase in ethanol yield. The second experimental study showed that syngas fermentation at 32°C without buffer was associated with higher ethanol concentration and reduced lag time in switching to solventogenesis. Conducting fermentation at 40°C or by lowering incubation pH to 5.0 resulted in reduced cell growth and no production of ethanol or acetic acid. The third experiment studied the effect of three limiting nutrients, calcium pantothenate, vitamin B12 and CoCl2 on syngas fermentation. Results

  9. Scale-up of Pore Scale Spatiotemporal CO2 Dissolution Data

    NASA Astrophysics Data System (ADS)

    Singh, H.; Srinivasan, S.; Ovaysi, S.; Wheeler, M. F.

    2014-12-01

    One of the potential risks associated with subsurface storage of CO2 is seepage of CO2 through existing faults. Some studies on this topic show that geochemistry plays an important role in rendering these faults effective conduits for CO2 movement, while others show that mineralization due to CO2 injection can result in seep migration and flow channeling. Therefore, understanding the changes in reservoir flow dynamics with time due to geochemical alteration of the porous media, and accurately scaling up these changes for representation in field scale models, is important to engineer the CO2 storage process. After CO2 is injected into the subsurface, dispersion results in mixing of CO2 with aqueous species present in in-situ brine. The reactions of the dissolved CO2 with rock minerals lead to dissolution and/or formation of precipitates which alter the pore structure by changing the porosity and the permeability. Scale-up of reservoir properties and flow response at a single snapshot of time has been described by other authors; however, scale-up for reactive processes cannot be correctly described at a single snapshot in time. We use the concept of a Representative Elementary Volume (REV) to explore the scaling characteristics of the reactive transport process. We model the REV using a variance-based statistical approach applied to high-resolution pore-scale data. For comparison purposes, we also compute the REV for conservative transport using 3-D pressure data. The REV for the reactive process is modeled using three different types of data: CO2 concentration, fluid/matrix pore-network data and dissolution data. For simplicity, we consider the spatial variations along a 2-D slice at various times, rendering this a 3D spatiotemporal dataset. The results indicate that the REV in a reservoir with reactive flow changes with time and is greater when compared to the REV for conservative flow. The change in REV with time for reservoirs with reactive flow
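
    As a sketch of the variance-based idea, the REV can be estimated as the window size at which the variance of window-averaged porosity levels off; the synthetic binary pore map and the tolerance below are hypothetical, not the paper's pore-scale CO2 data.

      # Variance-based REV estimate on a synthetic 2-D porosity field (sketch).
      import numpy as np

      rng = np.random.default_rng(0)
      field = rng.random((256, 256)) < 0.35        # synthetic binary pore map

      def window_variance(field, w):
          """Variance of mean porosity over non-overlapping w x w windows."""
          n = field.shape[0] // w
          blocks = field[:n * w, :n * w].reshape(n, w, n, w).mean(axis=(1, 3))
          return blocks.var()

      sizes = [4, 8, 16, 32, 64, 128]
      variances = [window_variance(field, w) for w in sizes]

      # Take the REV as the smallest window whose variance drops below a tolerance.
      tol = 1e-3
      rev = next((w for w, v in zip(sizes, variances) if v < tol), None)
      print(dict(zip(sizes, np.round(variances, 5))), "-> REV estimate:", rev)

    Repeating such an estimate on snapshots at successive times gives the time dependence of the REV that the abstract emphasizes for reactive flow.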

  10. Novel computational and analytic techniques for nonlinear systems applied to structural and celestial mechanics

    NASA Astrophysics Data System (ADS)

    Elgohary, Tarek Adel Abdelsalam

    In this Dissertation, computational and analytic methods are presented to address nonlinear systems with applications in structural and celestial mechanics. Scalar Homotopy Methods (SHM) are first introduced for the solution of general systems of nonlinear algebraic equations. The methods are applied to the solution of postbuckling and limit load problems of solids and structures as exemplified by simple plane elastic frames, considering only geometrical nonlinearities. In many problems, instead of simply adopting a root solving method, it is useful to study the particular problem in more detail in order to establish an especially efficient and robust method. Such a problem arises in satellite geodesy coordinate transformation where a new highly efficient solution, providing global accuracy with a non-iterative sequence of calculations, is developed. Simulation results are presented to compare the solution accuracy and algorithm performance for applications spanning the LEO-to-GEO range of missions. Analytic methods are introduced to address problems in structural mechanics and astrodynamics. Analytic transfer functions are developed to address the frequency domain control problem of flexible rotating aerospace structures. The transfer functions are used to design a Lyapunov stable controller that drives the spacecraft to a target position while suppressing vibrations in the flexible appendages. In astrodynamics, a Taylor series based analytic continuation technique is developed to address the classical two-body problem. A key algorithmic innovation for the trajectory propagation is that the classical averaged approximation strategy is replaced with a rigorous series based solution for exactly computing the acceleration derivatives. Evidence is provided to demonstrate that high precision solutions are easily obtained with the analytic continuation approach. For general nonlinear initial value problems (IVPs), the method of Radial Basis Functions time domain

  11. Element selective detection of molecular species applying chromatographic techniques and diode laser atomic absorption spectrometry.

    PubMed

    Kunze, K; Zybin, A; Koch, J; Franzke, J; Miclea, M; Niemax, K

    2004-12-01

    Tunable diode laser atomic absorption spectroscopy (DLAAS) combined with separation techniques and atomization in plasmas and flames is presented as a powerful method for analysis of molecular species. The analytical figures of merit of the technique are demonstrated by the measurement of Cr(VI) and Mn compounds, as well as molecular species including halogen atoms, hydrogen, carbon and sulfur. PMID:15561625

  12. Integrating Cognitive Behavioral and Applied Behavior Techniques With Dysfunctional Family Behavior.

    ERIC Educational Resources Information Center

    Barrish, I. J.

    Families experiencing severe conflict are often unable to effectively implement applied behavioral procedures due to interfering emotional responses (anger, blaming, anxiety and depression) and behavioral responses (yelling, crying and physical fighting), which often reduce effective implementation of applied behavioral procedures. Specific…

  13. Scaling Up Chronic Disease Prevention Interventions in Lower- and Middle-Income Countries

    PubMed Central

    Gaziano, Thomas A.; Pagidipati, Neha

    2013-01-01

    Chronic diseases are increasingly becoming a health burden in lower-and middle-income countries, putting pressure on public health efforts to scale up interventions. This article reviews current efforts in interventions on a population and individual level. Population-level interventions include ongoing efforts to reduce smoking rates, reduce intake of salt and trans–fatty acids, and increase physical activity in increasingly sedentary populations. Individual-level interventions include control and treatment of risk factors for chronic diseases and secondary prevention. This review also discusses the barriers in interventions, particularly those specific to low- and middle-income countries. Continued discussion of proven cost-effective interventions for chronic diseases in the developing world will be useful for improving public health policy. PMID:23297660

  14. Single remote sensing image scale-up combining modulation transform function compensation

    NASA Astrophysics Data System (ADS)

    Cao, Shixiang; Liu, Wei; Zhou, Nan; He, Hongyan; Jiang, Jie

    2016-01-01

    Remote sensing images often need to be scaled up for visualization or representation using only one original image. Based on the performance of the detector, a new and more applicable method is proposed here. To enhance the high-frequency components, the modulation transform function compensation (MTFC) step adjusts the spatial response before and after launch, under signal-to-noise ratio control. This largely reduces ringing artifacts arising from incorrect point spread function estimates. A contour stencil prior then limits edge artifacts in the upscaled image after MTFC. An iterative backprojection operation with fast convergence is also used to enforce intensity and contour consistency. Finally, we present an analysis based on real images, with a parallel implementation for full speed. Compared with existing algorithms, the proposed operator preserves geometric features and improves visual and quantitative quality for further analysis.
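
    A minimal sketch of generic iterative back-projection upscaling follows, assuming a simple box-average sampling model; it omits the MTFC and contour-stencil steps of the paper, and the scale factor and kernel are hypothetical.

      # Generic iterative back-projection super-resolution (sketch).
      import numpy as np

      def downsample(img, s):
          """Box-average downsampling by integer factor s (stand-in sensor model)."""
          h, w = img.shape[0] // s * s, img.shape[1] // s * s
          return img[:h, :w].reshape(h // s, s, w // s, s).mean(axis=(1, 3))

      def upsample(img, s):
          """Nearest-neighbour upsampling by integer factor s."""
          return np.repeat(np.repeat(img, s, axis=0), s, axis=1)

      def ibp_upscale(low, s=2, iters=10, step=1.0):
          high = upsample(low, s)                          # initial estimate
          for _ in range(iters):
              residual = low - downsample(high, s)         # mismatch at low resolution
              high = high + step * upsample(residual, s)   # back-project the residual
          return high

      low = np.random.default_rng(1).random((32, 32))
      high = ibp_upscale(low, s=2)
      print("consistency error:", np.abs(downsample(high, 2) - low).max())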

  15. Scaling-up of membraneless microbial electrolysis cells (MECs) for domestic wastewater treatment: Bottlenecks and limitations.

    PubMed

    Escapa, A; San-Martín, M I; Mateos, R; Morán, A

    2015-03-01

    Microbial electrolysis cells (MECs) have the potential to become a sustainable domestic wastewater (dWW) treatment system. However, new scale-up experiences are required to gain knowledge of critical issues in MEC designs. In this study we assess the ability of two twin membraneless MEC units (that are part of a modular pilot-scale MEC) to treat dWW. Batch tests yielded COD removal efficiencies as high as 92%, with most of the hydrogen (>80% of the total production) being produced during the first 48h. During the continuous tests, MECs performance deteriorated significantly (energy consumption was relatively high and COD removal efficiencies fell below 10% in many cases), which was attributed to an inadequate configuration of the anodic chamber, insufficient mixing inside this chamber, inefficient hydrogen management on the cathode side and finally to dWW in itself. Some alternatives to the current design are suggested. PMID:25590425

  16. Scale-up of the nitridation and sintering of silicon preforms using microwave heating

    SciTech Connect

    Kiggans, J.O. Jr.; Tiegs, T.N.; Davisson, C.C.; Morrow, M.S.; Garvey, G.J.

    1996-05-01

    Scale-up studies were performed in which microwave heating was used to fabricate reaction-bonded silicon nitride and sintered reaction-bonded silicon nitride (SRBSN). Tests were performed in 2.45 GHz multimode cavities of 500 liter and 4,000 liter capacity. A variety of sizes, shapes, and compositions of silicon preforms were processed in the studies, including bucket tappets and clevis pins for diesel engines. Up to 230 samples were processed in a single microwave furnace run. Data were collected which included weight gains for nitridation experiments, and final densities for nitridation and sintering experiments. For comparison, nitridation and sintering studies were performed using a conventional resistance-heated furnace.

  17. Processing parameters associated with scale-up of balloon film production

    NASA Technical Reports Server (NTRS)

    Simpson, D. M.; Harrison, I. R.

    1993-01-01

    A method is set forth for assessing strain-rate profiles that can be used to develop a scale-up theory for blown-film extrusion. Strain rates are evaluated by placing four ink dots on the stalk of an extruded bubble to follow the displacements of the dots as a function of time. The instantaneous Hencky strain is obtained with the displacement data and plotted for analysis. Specific attention is given to potential sources of error in the distance measurements and corrections for these complex bubble geometries. The method is shown to be effective for deriving strain-rate data related to different processing parameters for the production of balloon film. The strain rates can be compared to frostline height, blow-up ratio, and take-up ratio to optimize these processing variables.
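
    For reference, the instantaneous Hencky (true) strain computed from the separation L(t) of a pair of ink dots, relative to their initial separation L_0, and the corresponding strain rate are given by the standard definitions (the notation here is generic, not taken from the report):

      \varepsilon_H(t) = \ln\frac{L(t)}{L_0}, \qquad
      \dot{\varepsilon}(t) = \frac{d}{dt}\ln L(t) = \frac{1}{L(t)}\frac{dL}{dt}.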

  18. Programme science research on medical male circumcision scale-up in sub-Saharan Africa.

    PubMed

    Gray, Ronald H; Wawer, Maria J; Kigozi, Godfrey

    2013-08-01

    Three randomised trials demonstrate that voluntary medical male circumcision (MMC) reduces male HIV acquisition by 50-60%, and post-trial surveillance has shown that the effects are long lasting. Scale-up of services has been initiated in 14 high-priority sub-Saharan African countries with high rates of HIV and low prevalence of MMC. However, circumcision coverage in the region remains low. Challenges to MMC rollout include suboptimal demand among higher-risk men, the need to expand access and reduce costs of MMC through personnel task shifting and task sharing, assuring and maintaining a high quality of service provision, and the testing and introduction of non-surgical devices. In addition, early infant male circumcision has not been adequately evaluated in Africa. Here, we describe challenges to implementation and discuss the ongoing and future role of implementation and programme science in addressing such challenges. PMID:23698513

  19. Measurement of Glottal Flow across Scaled Up Dynamic Vocal Fold Motion

    NASA Astrophysics Data System (ADS)

    Sherman, Erica; Krane, Michael; Zhang, Lucy; Wei, Timothy

    2009-11-01

    An experiment to provide DPIV measurements of dynamic human vocal fold motion is presented. The experiment is run in a free-stream water tunnel using a 10x scaled-up model of the human vocal folds and vocal tract. The vocal fold model is a new design that incorporates both the rocking and the oscillatory open/close motions characteristic of vocal fold motion. The Reynolds number and Strouhal number have been matched to human physiologic conditions. Flow measurements show the start-up jet, vortex dynamics and ultimate jet pinch-off as the model progresses through a cycle. The effects of asymmetries associated with disease will be discussed.
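
    For reference, matching both dimensionless groups between the scaled water model (subscript m) and human physiologic conditions (subscript h) implies the standard similarity relations below, stated generally rather than with the experiment's specific operating values; D is a characteristic glottal dimension, U the jet velocity, f the oscillation frequency, and \nu the kinematic viscosity:

      Re = \frac{U D}{\nu}, \qquad St = \frac{f D}{U}
      \quad\Longrightarrow\quad
      \frac{U_m}{U_h} = \frac{D_h}{D_m}\,\frac{\nu_m}{\nu_h}, \qquad
      \frac{f_m}{f_h} = \frac{U_m}{U_h}\,\frac{D_h}{D_m}.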

  20. Examination of Flow in a Scaled-Up Vocal Fold Model for Diseased Conditions

    NASA Astrophysics Data System (ADS)

    Sherman, Erica; Zhang, Lucy; Wang, Xinshi; Wei, Timothy; Krane, Michael

    2010-11-01

    An experiment to provide DPIV measurements in a scaled up dynamic human vocal fold model is presented. The 10x scale vocal fold model is a new design that incorporates both the rocking as well as the oscillatory open/close motions characteristic of vocal fold motions. The experiment is run in a free-stream water tunnel where the oscillation frequencies and flow speeds are dynamically matched to physiologic conditions for both male and female phonation. The effects associated with vocal fold paralysis will be discussed. Flow measurements showing fluid kinematics including jet velocity and orientation, and vortex shedding as a function of time through an oscillation cycle will be presented. In addition, key data relevant to phonation, such as volumetric flow rate and glottal behavior will be presented.

  1. THE SCALE-UP OF LARGE PRESSURIZED FLUIDIZED BEDS FOR ADVANCED COAL FIRED PROCESSES

    SciTech Connect

    Leon Glicksman; Hesham Younis; Richard Hing-Fung Tan; Michel Louge; Elizabeth Griffith; Vincent Bricout

    1998-04-30

    Pressurized fluidization is a promising new technology for the clean and efficient combustion of coal. Its principle is to operate a coal combustor at a high inlet gas velocity to increase the flow of reactants and at an elevated pressure to raise the overall efficiency of the process. Unfortunately, commercialization of large pressurized fluidized beds is inhibited by uncertainties in scaling up units from the current pilot plant levels. In this context, our objective is to conduct a study of the fluid dynamics and solid capture of a large pressurized coal-fired unit. The idea is to employ dimensional similitude to simulate in a cold laboratory model the flow in a Pressurized Circulating Fluid Bed "Pyrolyzer," which is part of a High Performance Power System (HIPPS) developed by Foster Wheeler Development Corporation (FWDC) under the DOE's Combustion 2000 program.
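
    For orientation, cold-model similitude for fluidized beds is commonly based on matching a simplified set of dimensionless groups such as the following (listed as typical practice, not necessarily the exact scaling basis used in this project):

      \frac{u_0^2}{g D}, \qquad \frac{\rho_s}{\rho_g}, \qquad \frac{u_0}{u_{mf}}, \qquad \frac{G_s}{\rho_s u_0}, \qquad \text{bed geometry, particle size distribution, sphericity},

    where u_0 is the superficial gas velocity, u_{mf} the minimum fluidization velocity, D a characteristic bed dimension, \rho_s and \rho_g the solid and gas densities, and G_s the solids circulation flux.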

  2. Teaching assistant-student interactions in a modified SCALE-UP classroom

    NASA Astrophysics Data System (ADS)

    DeBeck, George; Demaree, Dedra

    2012-02-01

    In the spring term of 2010, Oregon State University (OSU) began using a SCALE-UP style classroom in the instruction of the introductory calculus-based physics series. Instruction in this classroom was conducted in weekly two-hour sessions facilitated by the primary professor and either two graduate teaching assistants (GTAs) or a graduate teaching assistant and an undergraduate learning assistant (LA). During the course of instruction, two of the eight tables in the room were audio and video recorded. We examine the practices of the GTAs in interacting with the students through both qualitative and quantitative analyses of these recordings. Quantitatively, significant differences are seen between the most experienced GTA and the rest. A major difference in confidence is also observed in the qualitative analysis of this GTA compared to a less experienced GTA.

  3. Scaling up ATLAS Database Release Technology for the LHC Long Run

    NASA Astrophysics Data System (ADS)

    Borodin, M.; Nevski, P.; Vaniachine, A.; ATLAS Collaboration

    2011-12-01

    To overcome scalability limitations in database access on the Grid, ATLAS introduced the Database Release technology replicating databases in files. For years, Database Release technology assured scalable database access for Monte Carlo production on the Grid. Since the previous CHEP, Database Release technology was used successfully in ATLAS data reprocessing on the Grid. A frozen Conditions DB snapshot guarantees reproducibility and transactional consistency, isolating Grid data processing tasks from continuous conditions updates at the "live" Oracle server. Database Release technology fully satisfies the requirements of ATLAS data reprocessing and Monte Carlo production. We parallelized the Database Release build workflow to avoid linear dependency of the build time on the length of the LHC data-taking period. In recent data reprocessing campaigns the build time was reduced by an order of magnitude thanks to a proven master-worker architecture of the kind used in Google MapReduce. We describe further Database Release optimizations scaling up the technology for the LHC long run.
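
    As an illustration of the master-worker pattern mentioned above (not the actual ATLAS build tooling; the task granularity, run ranges, and names are hypothetical), a minimal sketch in Python:

      # Master-worker parallel build sketch: the master splits the data-taking
      # period into independent slices, workers build each slice, and the
      # master merges the partial results.
      from multiprocessing import Pool

      def build_slice(run_range):
          """Worker: build a DB release payload for one range of run numbers
          (placeholder work)."""
          first, last = run_range
          return f"payload_{first}_{last}", last - first + 1

      if __name__ == "__main__":
          slices = [(i, i + 999) for i in range(200000, 210000, 1000)]
          with Pool(processes=4) as pool:
              results = pool.map(build_slice, slices)   # workers run in parallel
          print(len(results), "slices built; total runs covered:",
                sum(n for _, n in results))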

  4. Performance of kilowatt-class laser modules in scaling up laser produced plasma (LPP) EUV source

    NASA Astrophysics Data System (ADS)

    Ellwi, Samir; Comley, Andrew; Hay, Nick; Henderson, Ian; Brownell, Michael

    2005-05-01

    Powerlase has made significant steps forward in developing reliable and cost-effective, kilowatt-class laser modules with short pulse duration and small footprint, for use as EUV drivers. These characteristics in parallel to EUV target requirements are essential for the generation of 115W of in-band EUV power at the intermediate focus. These laser modules can be coupled to the EUV target by using our flexible spatial and temporal multiplexing approach in order to scale up the laser average power on target. The multiplexing method developed by Powerlase is modular and optimised for maximum EUV collection angle. To further this goal we are currently evaluating target materials such as xenon in various phases and forms and also have a programme in place to investigate suitable tin targets.

  5. Evaluation of liquid-fed ceramic melter scale-up correlations

    SciTech Connect

    Koegler, S.S.; Mitchell, S.J.

    1988-08-01

    This study was conducted to determine the parameters governing factors of scale for liquid-fed ceramic melters (LFCMs) in order to design full-scale melters using smaller-scale melter data. Results of melter experiments conducted at Pacific Northwest Laboratory (PNL) and Savannah River Laboratory (SRL) are presented for two feed compositions and five different liquid-fed ceramic melters. The melter performance data including nominal feed rate and glass melt rate are correlated as a function of melter surface area. Comparisons are made between the actual melt rate data and melt rates predicted by a cold cap heat transfer model. The heat transfer model could be used in scale-up calculations, but insufficient data are available on the cold cap characteristics. Experiments specifically designed to determine heat transfer parameters are needed to further develop the model. 17 refs.

  6. Scale-up of Lithium Aluminate Pellet Manufacturing with a Flowable Powder

    SciTech Connect

    Hollenberg, Glenn W.; Bagaasen, Larry M.; Kurosky, Randal P.; Tonn, D.; Carty, W.

    2004-01-01

    Thin-walled, high-density lithium aluminate pellets are challenging to manufacture for nuclear reactor applications. The key to scale-up of production was the development of flowable, high density, lithium aluminate powder that permitted (1) automated isostatic pressing, (2) low compaction during pressing, (3) low shrinkage during firing, (4) elimination of chlorine-containing fumed alumina and (5) near-net shape forming. A triple spray drying process was developed that included: (I) a unique-feedstock blend cycle, (II) a post-calcination grinding cycle, and (III) a high-pH final cycle with high solids loading slurry that was spray dried into flowable high-density spheres with large, uniform diameters. Today, pellet manufacturing at a rate of more than 400,000 per year is possible.

  7. Influence of tray geometry on scaling up distillation efficiency from laboratory data

    SciTech Connect

    Lopez, F.; Castells, F.

    1999-07-01

    This paper studies the effect of tray geometry (especially hole diameter) and liquid tray composition on tray efficiency in a bench-scale distillation column. The results of this study are used for scaling up tray efficiency. Two binary systems, ethanol/water and cyclohexane/n-heptane, were considered. The operating conditions were atmospheric pressure and total reflux. For each one, two different hole diameters (small and large) were also tested. Kirschbaum's industrial data (1962) for the ethanol/water system and Yanagi and Sakata's data (1982) for the cyclohexane/n-heptane system were considered as reference values. The results show the importance of reproducing the hole diameter and liquid tray composition in small trays for using laboratory data to predict large tray efficiency.

  8. A study on a voloxidizer with an oxygen concentration controller for a scale-up DESIGN

    SciTech Connect

    Kim, Young-Hwan; Yoon, Ji-Sup; Park, Byung-Suk; Jung, Jae-Hoo

    2007-07-01

    For the oxidation of tens of kilograms of UO{sub 2} pellets in a vol-oxidizer, existing devices take a long time; moreover, for scale-up to an engineering scale, the optimum oxygen concentration for maximum oxidation efficiency must be determined. In this study, we determined the optimum oxygen concentration to shorten the oxidation time of a simulation fuel, using a vol-oxidizer equipped with an oxygen concentration controller and sensor. We compared the characteristics of a galvanic sensor with those of a zirconium oxide (ZrO{sub 2}) sensor. The simulation fuel was manufactured with 14 metallic oxides and used at a mass of 500 g HM/batch. At 500 deg. C, the galvanic and zirconium oxide sensors were used to measure the oxidation time of the simulation fuel. The oxidation time was also measured as a function of oxygen concentration using the selected sensor, and the resulting sample was analyzed. (authors)

  9. Scale-up research in a dual fluidized bed gasification process.

    PubMed

    Narobe, Miha; Golob, Janvit; Mele, Jernej; Sekavčnik, Mihael; Senegačnik, Andrej; Klinar, Dušan

    2015-01-01

    Successful co-gasification of plastics and biomass was achieved on a 100 kW dual fluidized bed (DFB) gasification pilot plant. The results of a pilot plant experiment were used as a sound basis for scale-up prediction to a 750 kW semi-industrial DFB plant. With an eightfold increase in mass and heat flows, a simpler co-gasification process was predicted, since the losses occurring in gasification plants are expected to be relatively smaller in larger plants. The effect of decreased losses was studied with an equilibrium model. Three different situations were simulated with the following fixed values of losses: 70 kW, 115 kW and 160 kW. The model showed an increase in fuel conversion when losses were reduced. PMID:26085423

  10. Scaling up the 454 Titanium Library Construction and Pooling of Barcoded Libraries

    SciTech Connect

    Phung, Wilson; Hack, Christopher; Shapiro, Harris; Lucas, Susan; Cheng, Jan-Fang

    2009-03-23

    We have been developing a high throughput 454 library construction process at the Joint Genome Institute to meet the needs of de novo sequencing a large number of microbial and eukaryote genomes, EST, and metagenome projects. We have been focusing efforts in three areas: (1) modifying the current process to allow the construction of 454 standard libraries in a 96-well format; (2) developing a robotic platform to perform the 454 library construction; and (3) designing molecular barcodes to allow pooling and sorting of many different samples. In the development of a high throughput process to scale up the number of libraries by adapting the process to a 96-well plate format, the key process change involves the replacement of gel electrophoresis for size selection with Solid Phase Reversible Immobilization (SPRI) beads. Although the standard deviation of the insert sizes increases, the overall quality of the sequence and the distribution of the reads in the genome have not changed. Constructing 454 shotgun libraries manually on 96-well plates is time-consuming, labor-intensive, and ergonomically hazardous; we have been working to program a BioMek robot to perform the library construction. This will not only enable library construction to be completed in a single day, but will also minimize any ergonomic risk. In addition, we have implemented a set of molecular barcodes (also known as Multiple Identifiers, or MIDs) and a pooling process that allows us to sequence many targets simultaneously. Here we present the testing of pooling a set of selected fosmids derived from the endomycorrhizal fungus Glomus intraradices. By combining the robotic library construction process and the use of molecular barcodes, it is now possible to sequence hundreds of fosmids that represent a minimal tiling path of this genome. Here we present the progress and the challenges of developing these scaled-up processes.

  11. Minnesota wood energy scale-up project 1994 establishment cost data

    SciTech Connect

    Downing, M.; Pierce, R.; Kroll, T.

    1996-03-18

    The Minnesota Wood Energy Scale-up Project began in late 1993 with the first trees planted in the spring of 1994. The purpose of the project is to track and monitor economic costs of planting, maintaining and monitoring larger scale commercial plantings. For 15 years, smaller scale research plantings of hybrid poplar have been used to screen for promising, high-yielding poplar clones. In this project 1000 acres of hybrid poplar trees were planted on Conservation Reserve Program (CRP) land near Alexandria, Minnesota in 1994. The fourteen landowners involved re-contracted with the CRP for five-year extensions of their existing 10-year contracts. These extended contracts will expire in 2001, when the plantings are 7 years old. The end use for the trees planted in the Minnesota Wood Energy Scale-up Project is undetermined. They will belong to the owner of the land on which they are planted. There are no current contracts in place for the wood these trees are projected to supply. The structure of the wood industry in Minnesota has changed drastically over the past 5 years. Stumpage values for fiber have risen to more than $20 per cord in some areas, raising the possibility that these trees could be used for fiber rather than energy. Several legislative mandates have forced the State of Minnesota to pursue renewable energy, including biomass energy. These mandates, a potential need for an additional 1700 MW of power by 2008 by Northern States Power, and agricultural policies will all affect development of energy markets for wood produced much like agricultural crops. There has been a tremendous amount of local and international interest in the project. Contractual negotiations between area landowners, the CRP, a local Resource Conservation and Development District, the Minnesota Department of Natural Resources and others are currently underway for additional planting of 1000 acres in spring 1995.

  12. Supervision, monitoring and evaluation of nationwide scale-up of antiretroviral therapy in Malawi.

    PubMed Central

    Libamba, Edwin; Makombe, Simon; Mhango, Eustice; de Ascurra Teck, Olga; Limbambala, Eddie; Schouten, Erik J.; Harries, Anthony D.

    2006-01-01

    OBJECTIVE: To describe the supervision, monitoring and evaluation strategies used to assess the delivery of antiretroviral therapy during nationwide scale-up of treatment in Malawi. METHODS: In the first quarter of 2005, the HIV Unit of the Ministry of Health and its partners (the Lighthouse Clinic; Médecins Sans Frontières-Belgium, Thyolo district; and WHO's Country Office) undertook structured supervision and monitoring of all public sector health facilities in Malawi delivering antiretroviral therapy. FINDINGS: Data monitoring showed that by the end of 2004, there were 13,183 patients (5,274 (40%) male, 12,527 (95%) adults) who had ever started antiretroviral therapy. Of patients who had ever started, 82% (10,761/13,183) were alive and taking antiretrovirals; 8% (1026/13,183) were dead; 8% (1039/13,183) had been lost to follow up; <1% (106/13,183) had stopped treatment; and 2% (251/13,183) had transferred to another facility. Of those alive and on antiretrovirals, 98% (7098/7258) were ambulatory; 85% (6174/7258) were fit to work; 10% (456/4687) had significant side effects; and, based on pill counts, 96% (6824/7114) had taken their treatment correctly. Mistakes in the registration and monitoring of patients were identified and corrected. Drug stocks were checked, and one potential drug stock-out was averted. As a result of the supervisory visits, by the end of March 2005 recruitment of patients to facilities scheduled to start delivering antiretroviral therapy had increased. CONCLUSION: This report demonstrates the importance of early supervision for sites that are starting to deliver antiretroviral therapy, and it shows the value of combining data collection with supervision. Making regular supervisory and monitoring visits to delivery sites are essential for tracking the national scale-up of delivery of antiretrovirals. PMID:16628306

  13. Trans-National Scale-Up of Services in Global Health

    PubMed Central

    Shahin, Ilan; Sohal, Raman; Ginther, John; Hayden, Leigh; MacDonald, John A.; Mossman, Kathryn; Parikh, Himanshu; McGahan, Anita; Mitchell, Will; Bhattacharyya, Onil

    2014-01-01

    Background Scaling up innovative healthcare programs offers a means to improve access, quality, and health equity across multiple health areas. Despite large numbers of promising projects, little is known about successful efforts to scale up. This study examines trans-national scale, whereby a program operates in two or more countries. Trans-national scale is a distinct measure that reflects opportunities to replicate healthcare programs in multiple countries, thereby providing services to broader populations. Methods Based on the Center for Health Market Innovations (CHMI) database of nearly 1,200 health programs, the study contrasts 116 programs that have achieved trans-national scale with 1,068 single-country programs. Data were collected on the programs' health focus, service activity, legal status, and funding sources, as well as the programs' locations (rural v. urban emphasis), and founding year; differences are reported with statistical significance. Findings This analysis examines 116 programs that have achieved trans-national scale (TNS) across multiple disease areas and activity types. Compared to 1,068 single-country programs, we find that trans-nationally scaled programs are more donor-reliant; more likely to focus on targeted health needs such as HIV/AIDS, TB, malaria, or family planning rather than provide more comprehensive general care; and more likely to engage in activities that support healthcare services rather than provide direct clinical care. Conclusion This work, based on a large data set of health programs, reports on trans-national scale with comparison to single-country programs. The work is a step towards understanding when programs are able to replicate their services as they attempt to expand health services for the poor across countries and health areas. A subset of these programs should be the subject of case studies to understand factors that affect the scaling process, particularly seeking to identify mechanisms that lead to

  14. Manufacturing process scale-up of optical grade transparent spinel ceramic at ArmorLine Corporation

    NASA Astrophysics Data System (ADS)

    Spilman, Joseph; Voyles, John; Nick, Joseph; Shaffer, Lawrence

    2013-06-01

    While transparent Spinel ceramic's mechanical and optical characteristics are ideal for many Ultraviolet (UV), visible, Short-Wave Infrared (SWIR), Mid-Wave Infrared (MWIR), and multispectral sensor window applications, commercial adoption of the material has been hampered because the material has historically been available in relatively small sizes (one square foot per window or less), low volumes, unreliable supply, and with unreliable quality. Recent efforts, most notably by Technology Assessment and Transfer (TA and T), have scaled-up manufacturing processes and demonstrated the capability to produce larger windows on the order of two square feet, but with limited output not suitable for production type programs. ArmorLine Corporation licensed the hot-pressed Spinel manufacturing know-how of TA and T in 2009 with the goal of building the world's first dedicated full-scale Spinel production facility, enabling the supply of a reliable and sufficient volume of large Transparent Armor and Optical Grade Spinel plates. With over $20 million of private investment by J.F. Lehman and Company, ArmorLine has installed and commissioned the largest vacuum hot press in the world, the largest high-temperature/high-pressure hot isostatic press in the world, and supporting manufacturing processes within 75,000 square feet of manufacturing space. ArmorLine's equipment is capable of producing window blanks as large as 50" x 30" and the facility is capable of producing substantial volumes of material with its Lean configuration and 24/7 operation. Initial production capability was achieved in 2012. ArmorLine will discuss the challenges that were encountered during scale-up of the manufacturing processes, ArmorLine Optical Grade Spinel optical performance, and provide an overview of the facility and its capabilities.

  15. High-Throughput Synthesis, Screening, and Scale-Up of Optimized Conducting Indium Tin Oxides.

    PubMed

    Marchand, Peter; Makwana, Neel M; Tighe, Christopher J; Gruar, Robert I; Parkin, Ivan P; Carmalt, Claire J; Darr, Jawwad A

    2016-02-01

    A high-throughput optimization and subsequent scale-up methodology has been used for the synthesis of conductive tin-doped indium oxide (known as ITO) nanoparticles. ITO nanoparticles with up to 12 at% Sn were synthesized using a laboratory-scale (15 g/hour by dry mass) continuous hydrothermal synthesis process, and the as-synthesized powders were characterized by powder X-ray diffraction, transmission electron microscopy, energy-dispersive X-ray analysis, and X-ray photoelectron spectroscopy. Under standard synthetic conditions, either the cubic In₂O₃ phase or a mixture of InO(OH) and In₂O₃ phases was observed in the as-synthesized materials. These materials were pressed into compacts and heat-treated in an inert atmosphere, and their electrical resistivities were then measured using the Van der Pauw method. Sn doping yielded resistivities of ∼10⁻² Ω cm for most samples, with the lowest resistivity of 6.0 × 10⁻³ Ω cm (exceptionally conductive for such pressed nanopowders) at a Sn concentration of 10 at%. Thereafter, the optimized lab-scale composition was scaled up using a pilot-scale continuous hydrothermal synthesis process (at a rate of 100 g/hour by dry mass), and a comparable resistivity of 9.4 × 10⁻³ Ω cm was obtained. The use of the synthesized TCO nanomaterials for thin-film fabrication was finally demonstrated by deposition of a transparent, conductive film using a simple spin-coating process. PMID:26798986
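
    The resistivities quoted above were obtained with the Van der Pauw method. As a purely illustrative sketch (the paper's raw resistance readings are not given, so the two configuration resistances and the compact thickness below are assumed values), the following solves the Van der Pauw equation numerically for the sheet resistance and converts it to a resistivity:

```python
# Van der Pauw resistivity estimate -- illustrative only; the resistance
# readings and sample thickness below are hypothetical, not from the paper.
import numpy as np
from scipy.optimize import brentq

def sheet_resistance(R_A, R_B):
    """Solve exp(-pi*R_A/Rs) + exp(-pi*R_B/Rs) = 1 for the sheet resistance Rs."""
    f = lambda Rs: np.exp(-np.pi * R_A / Rs) + np.exp(-np.pi * R_B / Rs) - 1.0
    # Rs is bracketed between a tiny value and a generous upper bound.
    return brentq(f, 1e-6, 1e6 * max(R_A, R_B))

R_A, R_B = 0.022, 0.025      # ohms, two Van der Pauw configurations (assumed)
t = 0.10                     # cm, pressed-compact thickness (assumed)
Rs = sheet_resistance(R_A, R_B)
rho = Rs * t                 # resistivity in ohm*cm
print(f"sheet resistance = {Rs:.3f} ohm/sq, resistivity = {rho:.2e} ohm*cm")
```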

  16. A multiblock grid generation technique applied to a jet engine configuration

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.

    1992-01-01

    Techniques are presented for quickly finding a multiblock grid for a 2D geometrically complex domain from geometrical boundary data. An automated technique for determining a block decomposition of the domain is explained. Techniques for representing this domain decomposition and transforming it are also presented. Further, a linear optimization method may be used to solve the equations which determine grid dimensions within the block decomposition. These algorithms automate many stages in the domain decomposition and grid formation process and limit the need for human intervention and inputs. They are demonstrated for the meridional or throughflow geometry of a bladed jet engine configuration.
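
    As a hedged illustration of the linear-optimization step described above, the toy problem below assigns cell counts to block edges with scipy; the blocks, the shared-face constraint, and the minimum resolutions are invented for the example and are not the paper's formulation:

```python
# Toy linear program for block grid dimensions -- a sketch of the idea only,
# not the paper's formulation.  Variables are cell counts along three edges.
from scipy.optimize import linprog

# minimize n1 + n2 + n3
c = [1.0, 1.0, 1.0]
# shared face between block A and block B: n1 - n2 = 0
A_eq = [[1.0, -1.0, 0.0]]
b_eq = [0.0]
# minimum resolution per edge (hypothetical): n1 >= 20, n2 >= 20, n3 >= 40
bounds = [(20, None), (20, None), (40, None)]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print("edge cell counts:", [round(n) for n in res.x])
```

    Rounding the relaxed solution to integers (or adding integrality constraints) then gives usable grid dimensions for each block.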

  17. Barriers and Facilitators to Scaling Up the Non-Pneumatic Anti-Shock Garment for Treating Obstetric Hemorrhage: A Qualitative Study

    PubMed Central

    Jordan, Keely; Butrick, Elizabeth; Yamey, Gavin; Miller, Suellen

    2016-01-01

    Background Obstetric hemorrhage (OH), which includes hemorrhage from multiple etiologies during pregnancy, childbirth, or postpartum, is the leading cause of maternal mortality and accounts for one-quarter of global maternal deaths. The Non-pneumatic Anti-Shock Garment (NASG) is a first-aid device for obstetric hemorrhage that can be applied for postpartum or post-miscarriage hemorrhage and for ectopic pregnancies to buy time for a woman to reach a health care facility for definitive treatment. Despite successful field trials, and endorsement by safe motherhood organizations and the World Health Organization (WHO), scale-up has been slow in some countries. This qualitative study explores contextual factors affecting uptake. Methods From March 2013 to April 2013, we conducted 13 key informant interviews across four countries with a large burden of maternal mortality that had achieved varying success in scaling up the NASG: Ethiopia, India, Nigeria, and Zimbabwe. These key informants were health providers or program specialists working with the NASG. We applied a health policy analysis framework to organize the results. The framework has six domains: attributes of the intervention, attributes of the implementers, delivery strategy, attributes of the adopting community, the socio-political context, and the research context. Results The interviews from our study found that relevant facilitators for scale-up are the simplicity of the device, local and international champions, well-developed training sessions, recommendations by WHO and the International Federation of Gynecology and Obstetrics, and dissemination of NASG clinical trial results. Barriers to scaling up the NASG included limited health infrastructure, the relatively high upfront cost of the NASG, initial resistance by providers and policy makers, lack of in-country champions or policy makers advocating for NASG implementation, inadequate return and exchange programs, and lack of political will. Conclusions There was a

  18. NDE of Advanced Automotive Composite Materials Using the Ultrasound Infrared Thermography Technique

    NASA Astrophysics Data System (ADS)

    Choi, Seung-Hyun; Park, Soo-Keun; Kim, Jae-Yeol

    Infrared thermographic nondestructive inspection is a quality inspection and stability assessment method that diagnoses physical characteristics and defects by detecting the infrared radiation emitted from an object without destroying it. Recently, nondestructive inspection and assessment based on the ultrasound-infrared thermography technique have been widely adopted in diverse areas. The technique exploits the phenomenon that ultrasound incident on an object with cracks or defects on a mating surface generates local heating at the defect. The car industry increasingly uses composite materials for their light weight, strength, and environmental resistance. In this study, a car piston, one of these composite-material parts, was inspected nondestructively with the ultrasound-infrared thermography technique. The study also examined the effects of ultrasound frequency and power in order to optimize the inspection.

  19. Applying image transformation and classification techniques to airborne hyperspectral imagery for mapping Ashe juniper infestations

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Ashe juniper (Juniperus ashei Buchholz), in excessive coverage, reduces forage production, interferes with livestock management, and degrades watersheds and wildlife habitat in infested rangelands. The objective of this study was to apply minimum noise fraction (MNF) transformation and different cla...

  20. The radiation techniques of tomotherapy & intensity-modulated radiation therapy applied to lung cancer

    PubMed Central

    Zhu, Zhengfei

    2015-01-01

    Radiotherapy (RT) plays an important role in the management of lung cancer. Development of radiation techniques is a possible way to improve the effect of RT by reducing toxicities through better sparing the surrounding normal tissues. This article will review the application of two forms of intensity-modulated radiation therapy (IMRT), fixed-field IMRT and helical tomotherapy (HT) in lung cancer, including dosimetric and clinical studies. The advantages and potential disadvantages of these two techniques are also discussed. PMID:26207214

  1. Comparison of oxide measurement techniques applied to Ti6Al4V

    SciTech Connect

    Reissig, L.; Czubayko, U.; Wanderka, N.; Voelkl, R.; Glatzel, U. E-mail: uwe.glatzel@uni-bayreuth.de

    2005-08-15

    Titanium and its alloys can dissolve large amounts of oxygen, which generally degrades their mechanical properties. This paper compares energy-dispersive X-ray analysis, three-dimensional atom probe, and carrier-gas hot extraction as techniques for quantifying the oxygen content in commercial-grade titanium alloys. The assets and drawbacks of each technique are pointed out. Oxygen enrichment caused by machining is verified in the drill hole of an automotive connecting rod.

  2. Error analysis of the phase-shifting technique when applied to shadow moire

    SciTech Connect

    Han, Changwoon; Han Bongtae

    2006-02-20

    An exact solution for the intensity distribution of shadow moire fringes produced by a broad spectrum light is presented. A mathematical study quantifies errors in fractional fringe orders determined by the phase-shifting technique, and its validity is corroborated experimentally. The errors vary cyclically as the distance between the reference grating and the specimen increases. The amplitude of the maximum error is approximately 0.017 fringe, which defines the theoretical limit of resolution enhancement offered by the phase-shifting technique.
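
    For readers unfamiliar with the phase-shifting step analysed above, the sketch below applies the standard four-step algorithm to synthetic fringe intensities and recovers a fractional fringe order; it does not reproduce the paper's broad-spectrum intensity model or its error analysis:

```python
# Four-step phase-shifting on synthetic shadow-moire intensities (illustration;
# the paper's exact broad-spectrum intensity model is not reproduced here).
import numpy as np

phi_true = 0.35 * 2 * np.pi            # "unknown" phase at one pixel
I = [1.0 + 0.5 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]  # I1..I4

# Standard four-step estimator: phi = atan2(I4 - I2, I1 - I3)
phi_est = np.arctan2(I[3] - I[1], I[0] - I[2])
fractional_order = (phi_est / (2 * np.pi)) % 1.0   # fractional fringe order in [0, 1)
print(f"true order = 0.35, recovered order = {fractional_order:.3f}")
```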

  3. Nanowire-organic thin film transistor integration and scale up towards developing sensor array for biomedical sensing applications

    NASA Astrophysics Data System (ADS)

    Kumar, Prashanth S.; Hankins, Phillip T.; Rai, Pratyush; Varadan, Vijay K.

    2010-04-01

    Exploratory research has demonstrated the capability of conducting nanowire arrays to enhance the sensitivity and selectivity of bio-electrodes in sensing applications. With the help of different surface manipulation techniques, a wide range of biomolecules have been successfully immobilized on these nanowires. Flexible organic electronics, in the form of thin film transistors (TFTs) fabricated on flexible substrates, were a breakthrough that enabled the development of logic circuits on flexible substrates. In many health monitoring scenarios, a series of biomarkers, physical properties, and vital signals need to be observed. Since the nano-bio-electrodes are capable of measuring all or most of them, it has been suggested that an electrode array on a single substrate would be an excellent point-of-care tool. This requires an efficient control system for signal acquisition and telemetry. An array of flexible TFTs has been designed that acts as an active matrix for controlled switching of, or scanning by, the sensor array. This array is a scale-up of the flexible organic TFT that was fabricated and rigorously tested in previous studies. The integration of nanowire electrodes with the organic electronics was approached in two ways: by growing nanowires on the same substrate as the TFTs, and by flip-chip packaging, where the nanowires and TFTs are made on separate substrates. As a proof of concept, the application of the integrated device has been explored in various multi-focal biomedical sensing scenarios, such as neural probes for monitoring neurite growth, dopamine, and neuron activity, and spatial monitoring of the myocardium for ischemia.

  4. Using simulation and budget models to scale-up nitrogen leaching from field to region in Canada.

    PubMed

    Huffman, E C; Yang, J Y; Gameda, S; De Jong, R

    2001-12-11

    Efforts are underway at Agriculture and Agri-Food Canada (AAFC) to develop an integrated, nationally applicable, socioeconomic/biophysical modeling capability in order to predict the environmental impacts of policy and program scenarios. This paper outlines our Decision Support System (DSS), which integrates the IROWCN (Indicator of the Risk of Water Contamination by Nitrogen) index with the agricultural policy model CRAM (Canadian Regional Agricultural Model) and presents an outline of our methodology to provide independent assessments of the IROWCN results through the use of nitrogen (N) simulation models in select, data-rich areas. Three field-level models--DSSAT, N_ABLE, and EPIC--were evaluated using local measured data. The results show that all three dynamic models can be used to simulate biomass, grain yield, and soil N dynamics at the field level; but the accuracy of the models differ, suggesting that models need to be calibrated using local measured data before they are used in Canada. Further simulation of IROWCN in a maize field using N_ABLE showed that soil-mineral N levels are highly affected by the amount of fertilizer N applied and the time of year, meaning that fertilizer and manure N applications and weather data are crucial for improving IROWCN. Methods of scaling-up simulated IROWCN from field-level to soil-landscape polygons and CRAM regions are discussed. PMID:12805754

  5. Development of combinatorial chemistry methods for coatings: high-throughput weathering evaluation and scale-up of combinatorial leads.

    PubMed

    Potyrailo, Radislav A; Ezbiansky, Karin; Chisholm, Bret J; Morris, William G; Cawse, James N; Hassib, Lamyaa; Medford, George; Reitz, Hariklia

    2005-01-01

    Combinatorial screening of materials formulations followed by the scale-up of combinatorial leads has been applied for the development of high-performance coating materials for automotive applications. We replaced labor-intensive coating formulation, testing, and measurement with a "combinatorial factory" that includes robotic formulation of coatings, their deposition as 48 coatings on a 9x12-cm plastic substrate, accelerated performance testing, and automated spectroscopic and image analysis of resulting performance. This high-throughput (HT) performance testing and measurement of the resulting properties provided a powerful set of tools for the 10-fold accelerated discovery of these coating materials. Performance of coatings is evaluated with respect to their weathering, because this parameter is one of the primary considerations in end-use automotive applications. Our HT screening strategy provides previously unavailable capabilities of (1) high speed and reproducibility of testing by using robotic automation and (2) improved quantification by using optical spectroscopic analysis of discoloration of coating-substrate structure and automatic imaging of the integrity loss of coatings. Upon testing, the coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Using our HT methodology, we have developed several cost-competitive coatings leads that match the performance of more costly coatings. These HT screening results for the best coating compositions have been validated on the traditional scales of coating formulation and weathering testing. These validation results have confirmed the improved weathering performance of combinatorially developed coatings over conventional coatings on the traditional scale. PMID:15762746

  6. The Success for All Model of School Reform: Interim Findings from the Investing in Innovation (i3) Scale-Up

    ERIC Educational Resources Information Center

    Quint, Janet C.; Balu, Rekha; DeLaurentis, Micah; Rappaport, Shelley; Smith, Thomas J.; Zhu, Pei

    2014-01-01

    This is the second of three reports from MDRC's evaluation of the Success for All (SFA) scale-up demonstration, funded under the U.S. Department of Education's Investing in Innovation (i3) competition. The report presents updated findings on SFA's implementation and impacts in the scale-up sites participating in the evaluation. The…

  7. Synchrotron and Simulation Techniques Applied to Problems in Materials Science: Catalysts and Azul Maya Pigments

    SciTech Connect

    Chianelli, R.

    2005-01-12

    Development of synchrotron techniques for determining the structure of disordered, amorphous, and surface materials has exploded over the past twenty years due to the increasing availability of high-flux synchrotron radiation and the continuing development of increasingly powerful synchrotron techniques. These techniques are available to materials scientists who are not necessarily synchrotron scientists through interaction with the effective user communities that exist at synchrotrons such as the Stanford Synchrotron Radiation Laboratory (SSRL). In this article we review the application of multiple synchrotron characterization techniques to two classes of materials defined as "surface compounds." One class comprises materials such as MoS{sub 2-x}C{sub x}, widely used petroleum catalysts that improve the environmental properties of transportation fuels; these compounds may be viewed as "sulfide-supported carbides" in their catalytically active states. The second class is the "Maya Blue" pigments, based on technology created by the ancient Maya. These are organic/inorganic "surface complexes" consisting of the dye indigo and palygorskite, a common clay. The identification of both surface compounds relies on the application of synchrotron techniques as described in this report.

  8. Applying data mining techniques to medical time series: an empirical case study in electroencephalography and stabilometry.

    PubMed

    Anguera, A; Barreiro, J M; Lara, J A; Lizcano, D

    2016-01-01

    One of the major challenges in the medical domain today is how to exploit the huge amount of data that this field generates. To do this, approaches are required that are capable of discovering knowledge that is useful for decision making in the medical field. Time series are data types that are common in the medical domain and require specialized analysis techniques and tools, especially if the information of interest to specialists is concentrated within particular time series regions, known as events. This research followed the steps specified by the so-called knowledge discovery in databases (KDD) process to discover knowledge from medical time series derived from stabilometric (396 series) and electroencephalographic (200) patient electronic health records (EHR). The view offered in the paper is based on the experience gathered as part of the VIIP project. Knowledge discovery in medical time series has a number of difficulties and implications that are highlighted by illustrating the application of several techniques that cover the entire KDD process through two case studies. This paper illustrates the application of different knowledge discovery techniques for the purposes of classification within the above domains. The accuracy of this application for the two classes considered in each case is 99.86% and 98.11% for epilepsy diagnosis in the electroencephalography (EEG) domain and 99.4% and 99.1% for early-age sports talent classification in the stabilometry domain. The KDD techniques achieve better results than other traditional neural network-based classification techniques. PMID:27293535

  9. Pulsed remote field eddy current technique applied to non-magnetic flat conductive plates

    NASA Astrophysics Data System (ADS)

    Yang, Binfeng; Zhang, Hui; Zhang, Chao; Zhang, Zhanbin

    2013-12-01

    Non-magnetic metal plates are widely used in aviation and industrial applications. The detection of cracks in thick plate structures, such as the multilayered structures of aircraft fuselages, has been challenging for the nondestructive evaluation community. The remote field eddy current (RFEC) technique has shown the advantages of deep penetration and high sensitivity to deeply buried anomalies. However, the RFEC technique is mainly used to evaluate ferromagnetic tubes, and several problems must be resolved before it can be extended to the inspection of non-magnetic conductive plates. In this article, the pulsed remote field eddy current (PRFEC) technique for the detection of defects in non-magnetic conducting plates was investigated. First, the principle of the PRFEC technique was analysed, followed by an analysis of the differences between defect detection in ferromagnetic and in non-magnetic planar structures. Three different models of the PRFEC probe were simulated using ANSYS, and the location of the transition zone, the defect detection sensitivity, and the ability to detect defects in thick plates were analysed and compared for the three probes. The simulation results showed that the probe with a ferrite core had the highest detection capability. The conclusions derived from the simulation study were also validated experimentally.

  10. Activity-dependent synaptic GRIP1 accumulation drives synaptic scaling up in response to action potential blockade

    PubMed Central

    Gainey, Melanie A.; Tatavarty, Vedakumar; Nahmani, Marc; Lin, Heather; Turrigiano, Gina G.

    2015-01-01

    Synaptic scaling is a form of homeostatic plasticity that stabilizes neuronal firing in response to changes in synapse number and strength. Scaling up in response to action-potential blockade is accomplished through increased synaptic accumulation of GluA2-containing AMPA receptors (AMPAR), but the receptor trafficking steps that drive this process remain largely obscure. Here, we show that the AMPAR-binding protein glutamate receptor-interacting protein-1 (GRIP1) is essential for regulated synaptic AMPAR accumulation during scaling up. Synaptic abundance of GRIP1 was enhanced by activity deprivation, directly increasing synaptic GRIP1 abundance through overexpression increased the amplitude of AMPA miniature excitatory postsynaptic currents (mEPSCs), and shRNA-mediated GRIP1 knockdown prevented scaling up of AMPA mEPSCs. Furthermore, knockdown and replace experiments targeting either GRIP1 or GluA2 revealed that scaling up requires the interaction between GRIP1 and GluA2. Finally, GRIP1 synaptic accumulation during scaling up did not require GluA2 binding. Taken together, our data support a model in which activity-dependent trafficking of GRIP1 to synaptic sites drives the forward trafficking and enhanced synaptic accumulation of GluA2-containing AMPAR during synaptic scaling up. PMID:26109571

  11. Reformulation linearization technique based branch-and-reduce approach applied to regional water supply system planning

    NASA Astrophysics Data System (ADS)

    Lan, Fujun; Bayraksan, Güzin; Lansey, Kevin

    2016-03-01

    A regional water supply system design problem that determines pipe and pump design parameters and water flows over a multi-year planning horizon is considered. A non-convex nonlinear model is formulated and solved by a branch-and-reduce global optimization approach. The lower bounding problem is constructed via a three-pronged effort that involves transforming the space of certain decision variables, polyhedral outer approximations, and the Reformulation Linearization Technique (RLT). Range reduction techniques are employed systematically to speed up convergence. Computational results demonstrate the efficiency of the proposed algorithm and, in particular, the critical role that range reduction techniques can play in RLT-based branch-and-bound methods. Results also indicate that using reclaimed water not only saves freshwater sources but also provides a cost-effective non-potable water source in arid regions. Supplemental data for this article can be accessed at http://dx.doi.org/10.1080/0305215X.2015.1016508.
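
    As a minimal, hedged illustration of the reformulation-linearization idea (not the article's full water-supply model), the sketch below relaxes a single bilinear term w = x*y over a box with the four standard RLT/McCormick product constraints and solves the resulting linear lower-bounding problem:

```python
# RLT/McCormick relaxation of one bilinear term w = x*y over a box -- a minimal
# sketch of the lower-bounding construction, not the article's full model.
from scipy.optimize import linprog

xL, xU, yL, yU = 0.0, 4.0, 1.0, 3.0

# Variables z = [x, y, w]; minimize x + y - w (a toy non-convex objective,
# since -w stands in for -x*y).
c = [1.0, 1.0, -1.0]

# RLT constraints, written as A_ub @ z <= b_ub:
#   w >= xL*y + yL*x - xL*yL   ->  yL*x + xL*y - w <= xL*yL
#   w >= xU*y + yU*x - xU*yU   ->  yU*x + xU*y - w <= xU*yU
#   w <= xU*y + yL*x - xU*yL   -> -yL*x - xU*y + w <= -xU*yL
#   w <= xL*y + yU*x - xL*yU   -> -yU*x - xL*y + w <= -xL*yU
A_ub = [[ yL,  xL, -1.0],
        [ yU,  xU, -1.0],
        [-yL, -xU,  1.0],
        [-yU, -xL,  1.0]]
b_ub = [xL * yL, xU * yU, -xU * yL, -xL * yU]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(xL, xU), (yL, yU), (None, None)])
print("LP lower bound:", res.fun, "at (x, y, w) =", res.x)
```

    In a branch-and-reduce scheme, tightening the variable bounds (range reduction) shrinks this box and therefore tightens the same four constraints, which is one way to see why range reduction speeds up convergence.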

  12. Reliability of the Ultrasonic Technique Applied to Detection of Pipe Weld Defects

    NASA Astrophysics Data System (ADS)

    Rebello, J. M. A.; Carvalho, A. A.; Sagrilo, L. V. S.; Soares, S. D.

    2007-03-01

    The objective of this work is to evaluate the reliability of the ultrasonic nondestructive testing (NDT) technique, for specific test conditions, using POD (probability of detection) curves developed by experimental procedures. Two classes of defects, lack of penetration (LP) and lack of fusion (LF), were intentionally inserted in 24 weld beads belonging to 4 API X70 steel pipeline specimens with an outer diameter of 254 mm and a wall thickness of 19.05 mm. These specimens were inspected using manual and automatic ultrasonic techniques. The results, besides producing real POD curves, showed the superiority of the automatic techniques over the manual test in the probability of detection of these two classes of defects.
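
    For context, POD curves of this kind are commonly fitted to hit/miss inspection data with a log-logistic model; the sketch below does so on synthetic data using scikit-learn, which is an assumption here since the paper does not state its fitting procedure:

```python
# Log-logistic POD curve from synthetic hit/miss data -- an illustration of how
# POD curves are commonly fitted; not the paper's data or procedure.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
sizes = rng.uniform(0.5, 6.0, 300)                       # defect sizes, mm (synthetic)
p_true = 1 / (1 + np.exp(-(3.0 * np.log(sizes) - 2.0)))  # "true" detectability
hits = rng.random(300) < p_true                          # True = detected, False = missed

model = LogisticRegression().fit(np.log(sizes).reshape(-1, 1), hits)

a = np.array([1.0, 2.0, 4.0])                            # mm
pod = model.predict_proba(np.log(a).reshape(-1, 1))[:, 1]
for ai, pi in zip(a, pod):
    print(f"POD({ai:.1f} mm) ~ {pi:.2f}")
```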

  13. Schlieren technique applied to the arc temperature measurement in a high energy density cutting torch

    SciTech Connect

    Prevosto, L.; Mancinelli, B.; Artana, G.; Kelly, H.

    2010-01-15

    Plasma temperature and radial density profiles of the plasma species in a high energy density cutting arc have been obtained by using a quantitative schlieren technique. A Z-type two-mirror schlieren system was used in this research. Due to its high sensitivity, this technique allows the plasma composition and temperature to be measured from the arc axis to the surrounding medium by processing the gray-level contrast values of digital schlieren images recorded at the observation plane for a given position of a transverse knife edge located at the exit focal plane of the system. The technique provided a good visualization of the plasma flow emerging from the nozzle and of its interactions with the surrounding medium and the anode. The temperature values obtained are in good agreement with those previously obtained by the authors on the same torch using Langmuir probes.

  14. Monitoring gypsy moth defoliation by applying change detection techniques to Landsat imagery

    NASA Technical Reports Server (NTRS)

    Williams, D. L.; Stauffer, M. L.

    1978-01-01

    The overall objective of a research effort at NASA's Goddard Space Flight Center is to develop and evaluate digital image processing techniques that will facilitate the assessment of the intensity and spatial distribution of forest insect damage in Northeastern U.S. forests using remotely sensed data from Landsats 1, 2 and C. Automated change detection techniques are presently being investigated as a method of isolating the areas of change in the forest canopy resulting from pest outbreaks. In order to follow the change detection approach, Landsat scene correction and overlay capabilities are utilized to provide multispectral/multitemporal image files of 'defoliation' and 'nondefoliation' forest stand conditions.

  15. Innovative vibration technique applied to polyurethane foam as a viable substitute for conventional fatigue testing

    NASA Astrophysics Data System (ADS)

    Peralta, Alexander; Just-Agosto, Frederick; Shafiq, Basir; Serrano, David

    2012-12-01

    Lifetime prediction using three-point bending (TPB) can at times be prohibitively time consuming and costly, whereas vibration testing at higher frequency may potentially save time and revenue. A vibration technique that obtains lifetimes that reasonably match those determined under flexural TPB fatigue is developed. The technique designs the specimen with a procedure based on shape optimization and finite element analysis. When the specimen is vibrated in resonance, a stress pattern that mimics the stress pattern observed under conventional TPB fatigue testing is obtained. The proposed approach was verified with polyurethane foam specimens, resulting in an average error of 4.5% when compared with TPB.

  16. Validation and qualification of surface-applied fibre optic strain sensors using application-independent optical techniques

    NASA Astrophysics Data System (ADS)

    Schukar, Vivien G.; Kadoke, Daniel; Kusche, Nadine; Münzenberger, Sven; Gründer, Klaus-Peter; Habel, Wolfgang R.

    2012-08-01

    Surface-applied fibre optic strain sensors were investigated using a unique validation facility equipped with application-independent optical reference systems. First, different adhesives for the sensor's application were analysed regarding their material properties. Measurements resulting from conventional measurement techniques, such as thermo-mechanical analysis and dynamic mechanical analysis, were compared with measurements resulting from digital image correlation, which has the advantage of being a non-contact technique. Second, fibre optic strain sensors were applied to test specimens with the selected adhesives. Their strain-transfer mechanism was analysed in comparison with conventional strain gauges. Relative movements between the applied sensor and the test specimen were visualized easily using optical reference methods, digital image correlation and electronic speckle pattern interferometry. Conventional strain gauges showed limited opportunities for an objective strain-transfer analysis because they are also affected by application conditions.

  17. Scaling up a Mobile Telemedicine Solution in Botswana: Keys to Sustainability

    PubMed Central

    Ndlovu, Kagiso; Littman-Quinn, Ryan; Park, Elizabeth; Dikai, Zambo; Kovarik, Carrie L.

    2014-01-01

    Effective health care delivery is significantly compromised in an environment where resources, both human and technical, are limited. Botswana’s health care system is one of the many in the African continent with few specialized medical doctors, thereby posing a barrier to patients’ access to health care services. In addition, the traditional landline and non-robust Information Technology (IT) network infrastructure characterized by slow bandwidth still dominates the health care system in Botswana. Upgrading of the landline IT infrastructure to meet today’s health care demands is a tedious, long, and expensive process. Despite these challenges, there still lies hope in health care delivery utilizing wireless telecommunication services. Botswana has recently experienced tremendous growth in the mobile telecommunication industry coupled with an increase in the number of individually owned mobile devices. This growth inspired the Botswana-UPenn Partnership (BUP) to collaborate with local partners to explore using mobile devices as tools to improve access to specialized health care delivery. Pilot studies were conducted across four medical specialties, including radiology, oral medicine, dermatology, and cervical cancer screening. Findings from the studies became vital evidence in support of the first scale-up project of a mobile telemedicine solution in Botswana, also known as “Kgonafalo.” Some technical and social challenges were encountered during the initial studies, such as malfunctioning of mobile devices, accidental damage of devices, and cultural misalignment between IT and healthcare providers. These challenges brought about lessons learnt, including a strong need for unwavering senior management support, establishment of solid local public-private partnerships, and efficient project sustainability plans. Sustainability milestones included the development and signing of a Memorandum of Understanding (MOU) between the Botswana government and a private

  18. The Student-Centered Active Learning Environment for Undergraduate Programs (SCALE-UP) Project

    NASA Astrophysics Data System (ADS)

    Beichner, Robert J.

    2011-04-01

    How do you keep a classroom of 100 undergraduates actively learning? Can students practice communication and teamwork skills in a large class? How do you boost the performance of underrepresented groups? The Student-Centered Active Learning Environment for Undergraduate Programs (SCALE-UP) Project has addressed these concerns. Because of their inclusion in a leading introductory physics textbook, project materials are used by more than 1/3 of all science, math, and engineering majors nationwide. The room design and pedagogy have been adopted at more than 100 leading institutions across the country. Physics, chemistry, math, astronomy, biology, engineering, earth sciences, and even literature classes are currently being taught this way. Educational research indicates that students should collaborate on interesting tasks and be deeply involved with the material they are studying. We promote active learning in a redesigned classroom for 100 students or more. (Of course, smaller classes can also benefit.) Class time is spent primarily on "tangibles" and "ponderables"--hands-on activities, simulations, and interesting questions. Nine students sit in three teams at round tables. Instructors circulate and engage in Socratic dialogues. The setting looks like a banquet hall, with lively interactions nearly all the time. Hundreds of hours of classroom video and audio recordings, transcripts of numerous interviews and focus groups, data from conceptual learning assessments (using widely-recognized instruments in a pretest/posttest protocol), and collected portfolios of student work are part of our rigorous assessment effort. Our findings (based on data from over 16,000 students collected over five years as well as replications at adopting sites) can be summarized as the following: 1) Female failure rate is 1/5 of previous levels, even though more is demanded of students. 2) Minority failure rate is 1/4 that seen in traditionally taught courses. 3) At-risk students are more

  19. A comparison of automatic filtering techniques applied to biomechanical walking data.

    PubMed

    Giakas, G; Baltzopoulos, V

    1997-08-01

    The purpose of this study was to compare and evaluate six automatic filtering techniques commonly used in biomechanics for filtering gait analysis kinematic signals, namely: (1) power spectrum (signal-to-noise ratio) assessment; (2) generalised cross validation spline; (3) least-squares cubic splines; (4) regularisation of Fourier series; (5) regression model; and (6) residual analysis. A battery of 1440 signals representing the displacements of seven markers attached to the surface of the right lower limb and one marker attached to the surface of the sacrum during walking was used; the original signal and added noise characteristics were known a priori. The signals were filtered with every technique, and the root mean square error between the filtered and reference signals was calculated for each derivative domain. Results indicated that no single technique performs best in all the cases studied. Generally, power spectrum estimation, least-squares cubic splines, and generalised cross validation produced the most acceptable results. PMID:9239571
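
    The study's evaluation criterion is the root mean square error between each filtered signal and the known reference. The sketch below reproduces that idea on a synthetic signal with a plain Butterworth low-pass filter at several cut-offs; the filter and signal are illustrative and are not among the six techniques compared in the paper:

```python
# RMSE comparison of low-pass cut-offs on a synthetic "gait" signal with known
# noise -- a sketch of the evaluation idea, not the paper's test battery.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                                   # Hz, typical motion-capture rate
t = np.arange(0, 2, 1 / fs)
reference = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 4.0 * t)
noisy = reference + np.random.default_rng(0).normal(0, 0.05, t.size)

def rmse_after_filter(cutoff_hz):
    b, a = butter(2, cutoff_hz / (fs / 2))   # 2nd-order low-pass, zero-lag via filtfilt
    return np.sqrt(np.mean((filtfilt(b, a, noisy) - reference) ** 2))

for fc in (4.0, 6.0, 10.0):
    print(f"cut-off {fc:4.1f} Hz -> RMSE {rmse_after_filter(fc):.4f}")
```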

  20. Wavelet Techniques Applied to Modeling Transitional/Turbulent Flows in Turbomachinery

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Computer simulation is an essential part of the design and development of jet engines for the aeropropulsion industry. Engineers concerned with calculating the flow in jet engine components, such as compressors and turbines, need simple engineering models that accurately describe the complex flow of air and gases and that allow them to quickly estimate loads, losses, temperatures, and other design parameters. In this ongoing collaborative project, advanced wavelet analysis techniques are being used to gain insight into the complex flow phenomena. These insights, which cannot be achieved by commonly used methods, are being used to develop innovative new flow models and to improve existing ones. Wavelet techniques are very suitable for analyzing the complex turbulent and transitional flows pervasive in jet engines. These flows are characterized by intermittency and a multitude of scales. Wavelet analysis results in information about these scales and their locations. The distribution of scales is equivalent to the frequency spectrum provided by commonly used Fourier analysis techniques; however, no localization information is provided by Fourier analysis. In addition, wavelet techniques allow conditional sampling analyses of the individual scales, which is not possible by Fourier methods.
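
    As a small, hedged illustration of the scale-and-location information wavelets provide, the sketch below decomposes an intermittent signal and reports where the largest coefficient at each scale occurs; it assumes the PyWavelets package, which the article does not name:

```python
# Wavelet decomposition of an intermittent signal -- illustrative sketch only;
# assumes the PyWavelets package (pywt), which the article does not name.
import numpy as np
import pywt

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 10 * t)                        # large-scale content
burst = (t > 0.6) & (t < 0.65)
signal[burst] += 0.8 * np.sin(2 * np.pi * 200 * t[burst])  # localized small-scale burst

coeffs = pywt.wavedec(signal, "db4", level=5)              # [cA5, cD5, ..., cD1]
for lvl, d in enumerate(coeffs[1:], start=1):
    peak = np.argmax(np.abs(d)) / len(d)                   # fractional position of peak coefficient
    print(f"detail level {6 - lvl}: largest coefficient near t = {peak:.2f} s")
```

    Unlike a Fourier spectrum, the detail coefficients at the finest levels localize the burst near t = 0.6 s as well as identifying its scale.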

  1. Eddy current technique applied to the nondestructive evaluation of turbine blade wall thickness

    NASA Astrophysics Data System (ADS)

    Le Bihan, Yann; Joubert, Pierre-Yves; Placko, Dominique

    2000-05-01

    The high pressure turbine blades of jet engines show internal channels designed for air cooling. These recesses define the internal walls (partitions) and external walls of the blade. The external wall thickness is a critical parameter which has to be systematically checked in order to ensure the blade strength. The thickness evaluation is usually lead by ultrasonic technique or by X-ray tomography. Nevertheless, both techniques present some drawbacks related to measurement speed and automation capability. These drawbacks are bypassed by the eddy current (EC) technique, well known for its robustness and reliability. However, the wall thickness evaluation is made difficult because of the complexity of the blade geometry. In particular, some disturbances appear in the thickness evaluation because of the partitions, which exclude the use of classical EC probes such as cup-core probe. In this paper, we show the main advantages of probes creating an uniformly oriented magnetic field in order to reduce the partition disturbances. Furthermore, we propose a measurement process allowing to separate the wall thickness parameter from the EC signals. Finally, we present some experimental results validating the proposed technique.

  2. X-ray micro-beam techniques and phase contrast tomography applied to biomaterials

    NASA Astrophysics Data System (ADS)

    Fratini, Michela; Campi, Gaetano; Bukreeva, Inna; Pelliccia, Daniele; Burghammer, Manfred; Tromba, Giuliana; Cancedda, Ranieri; Mastrogiacomo, Maddalena; Cedola, Alessia

    2015-12-01

    A deeper comprehension of the biomineralization (BM) process is at the basis of tissue engineering and regenerative medicine developments. Several in-vivo and in-vitro studies were dedicated to this purpose via the application of 2D and 3D diagnostic techniques. Here, we develop a new methodology, based on different complementary experimental techniques (X-ray phase contrast tomography, micro-X-ray diffraction and micro-X-ray fluorescence scanning technique) coupled to new analytical tools. A qualitative and quantitative structural investigation, from the atomic to the micrometric length scale, is obtained for engineered bone tissues. The high spatial resolution achieved by X-ray scanning techniques allows us to monitor the bone formation at the first-formed mineral deposit at the organic-mineral interface within a porous scaffold. This work aims at providing a full comprehension of the morphology and functionality of the biomineralization process, which is of key importance for developing new drugs for preventing and healing bone diseases and for the development of bio-inspired materials.

  3. A comparison of model-based and hyperbolic localization techniques as applied to marine mammal calls

    NASA Astrophysics Data System (ADS)

    Tiemann, Christopher O.; Porter, Michael B.

    2003-10-01

    A common technique for the passive acoustic localization of singing marine mammals is that of hyperbolic fixing. This technique assumes straight-line, constant wave speed acoustic propagation to associate travel time with range, but in some geometries, these assumptions can lead to localization errors. A new localization algorithm based on acoustic propagation models can account for waveguide and multipath effects, and it has successfully been tested against real acoustic data from three different environments (Hawaii, California, and Bahamas) and three different species (humpback, blue, and sperm whales). Accuracy of the model-based approach has been difficult to verify given the absence of concurrent visual and acoustic observations of the same animal. However, the model-based algorithm was recently exercised against a controlled source of known position broadcasting recorded whale sounds, and location estimates were then compared to hyperbolic techniques and true source position. In geometries where direct acoustic paths exist, both model-based and hyperbolic techniques perform equally well. However, in geometries where bathymetric and refractive effects are important, such as at long range, the model-based approach shows improved accuracy.
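
    To make the baseline concrete, the sketch below implements the straight-line, constant-sound-speed assumption behind hyperbolic fixing: given time-difference-of-arrival measurements at known receiver positions, it estimates the source location by nonlinear least squares. The geometry and sound speed are hypothetical, and the model-based algorithm discussed above replaces this travel-time model with an acoustic propagation code:

```python
# Hyperbolic (TDOA) fix under a straight-line, constant-sound-speed model --
# a sketch of the baseline technique; positions and sound speed are hypothetical.
import numpy as np
from scipy.optimize import least_squares

c = 1500.0                                            # m/s, nominal sound speed
receivers = np.array([[0.0, 0.0], [1000.0, 0.0],
                      [0.0, 1000.0], [1000.0, 1000.0]])
source_true = np.array([620.0, 410.0])

ranges = np.linalg.norm(receivers - source_true, axis=1)
tdoa = (ranges - ranges[0]) / c                       # delays relative to receiver 0

def residuals(xy):
    r = np.linalg.norm(receivers - xy, axis=1)
    return (r - r[0]) / c - tdoa

fit = least_squares(residuals, x0=np.array([500.0, 500.0]))
print("estimated source position:", fit.x)            # should recover ~(620, 410)
```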

  4. PRACTICAL SENSITIVITY AND UNCERTAINTY ANALYSIS TECHNIQUES APPLIED TO AGRICULTURAL SYSTEMS MODELS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We present a practical evaluation framework for analysis of two complex, process-based agricultural system models, WEPP and RZWQM. The evaluation framework combines sensitivity analysis and the uncertainty analysis techniques of first order error analysis (FOA) and Monte Carlo simulation with Latin ...
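
    A minimal sketch of combining the two uncertainty techniques named above, first-order error analysis and Latin hypercube Monte Carlo, on a toy response function; it assumes scipy (version 1.7 or later for scipy.stats.qmc) and does not use the WEPP or RZWQM models or their parameter ranges:

```python
# First-order error analysis vs. Latin hypercube Monte Carlo on a toy model --
# a sketch of the uncertainty techniques named above (needs scipy >= 1.7).
import numpy as np
from scipy.stats import norm, qmc

def model(x1, x2):
    """Toy response, standing in for a process-based model output."""
    return 3.0 * x1 + x1 * x2

mean = np.array([2.0, 1.0])
std = np.array([0.2, 0.3])

# First-order error analysis: Var(y) ~ sum_i (df/dx_i)^2 * Var(x_i) at the mean.
grad = np.array([3.0 + mean[1], mean[0]])          # analytic partial derivatives
var_foa = np.sum((grad * std) ** 2)

# Monte Carlo with a Latin hypercube design mapped to normal inputs.
sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n=2000)
x = norm.ppf(u) * std + mean
var_mc = np.var(model(x[:, 0], x[:, 1]))

print(f"FOA variance ~ {var_foa:.3f}, LHS Monte Carlo variance ~ {var_mc:.3f}")
```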

  5. Applying Web Usability Techniques to Assess Student Awareness of Library Web Resources

    ERIC Educational Resources Information Center

    Krueger, Janice; Ray, Ron L.; Knight, Lorrie

    2004-01-01

    The authors adapted Web usability techniques to assess student awareness of their library's Web site. Students performed search tasks using a Web browser. Approaches were categorized according to a student's preference for, and success with, the library's Web resources. Forty-five percent of the students utilized the library's Web site as first…

  6. Applying Gaming and Simulation Techniques to the Design of Online Instruction

    ERIC Educational Resources Information Center

    Rude-Parkins, Carolyn; Miller, Karen Hughes; Ferguson, Karen; Bauer, Robert

    2006-01-01

    Critical in virtually all educational arenas, gaming and simulation techniques and distance learning are major areas of interest in today's U.S. Army training. The U.S. Army Armor School at Ft. Knox, KY contracted with the University of Louisville and Northrop Grumman Mission Systems in 2003 to develop online training for Army Captains. They…

  7. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  8. Time-lapse motion picture technique applied to the study of geological processes

    USGS Publications Warehouse

    Miller, R.D.; Crandell, D.R.

    1959-01-01

    Light-weight, battery-operated timers were built and coupled to 16-mm motion-picture cameras having apertures controlled by photoelectric cells. The cameras were placed adjacent to Emmons Glacier on Mount Rainier. The film obtained confirms the view that exterior time-lapse photography can be applied to the study of slow-acting geologic processes.

  9. Applying the Management-by-Objectives Technique in an Industrial Library

    ERIC Educational Resources Information Center

    Stanton, Robert O.

    1975-01-01

    An experimental "management-by-objectives" performance system was operated by the Libraries and Information Systems Center of Bell Laboratories during 1973. It was found that, though the system was very effective for work planning and the development of people, difficulties were encountered in applying it to certain classes of employees. (Author)

  10. Simulation for supporting scale-up of a fluidized bed reactor for advanced water oxidation.

    PubMed

    Tisa, Farhana; Raman, Abdul Aziz Abdul; Daud, Wan Mohd Ashri Wan

    2014-01-01

    Simulation of a fluidized bed reactor (FBR) was carried out for treating wastewater using the Fenton reaction, an advanced oxidation process (AOP). The simulation was performed to determine the characteristics of FBR performance, the concentration profile of the contaminants, and several prominent hydrodynamic properties (e.g., Reynolds number, velocity, and pressure) in the reactor. The simulation was implemented for a 2.8 L working volume using hydrodynamic correlations, the continuity equation, and simplified kinetic information for phenol degradation as a model. It shows that, using Fe³⁺ and Fe²⁺ mixtures as catalyst, TOC degradation of up to 45% was achieved for a contaminant range of 40-90 mg/L within 60 min. Concentration profiles and hydrodynamic characteristics were also generated. A subsequent scale-up study was conducted using the similitude method; the analysis shows that the models developed are applicable up to a 10 L working volume. The study demonstrates that, with appropriate modeling and simulation, data can be predicted for designing and operating an FBR for wastewater treatment. PMID:25309949

  11. Optimization and Scale-up of Inulin Extraction from Taraxacum kok-saghyz roots.

    PubMed

    Hahn, Thomas; Klemm, Andrea; Ziesse, Patrick; Harms, Karsten; Wach, Wolfgang; Rupp, Steffen; Hirth, Thomas; Zibek, Susanne

    2016-05-01

    The optimization and scale-up of inulin extraction from Taraxacum kok-saghyz Rodin was successfully performed. Based on solubility investigations, the extraction temperature was fixed at 85 °C. Inulin stability against degradation or hydrolysis was confirmed by extraction in the presence of model inulin. Having confirmed stability at the given conditions, the isolation procedure was transferred from a 1 L to a 1 m³ reactor. The Reynolds number was selected as the relevant dimensionless number to be kept constant at both scales. The stirrer speed at the large scale was adjusted to 3.25 rpm, based on the 300 rpm stirrer speed at the 1 L scale and the relevant physical and process-engineering parameters. These assumptions were confirmed by approximately homologous extraction kinetics at both scales. Since T. kok-saghyz is a research focus because of its rubber content, the isolation of side-products from the residual biomass is of great economic interest. Inulin is one such side-product and can be isolated in large quantity (~35% of dry mass) and with a high average degree of polymerization (15.5) at large scale, with a purity of 77%. PMID:27319152
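
    The stirrer-speed adjustment above follows from holding the impeller Reynolds number Re = rho*N*D^2/mu constant across scales, which for the same fluid gives N2 = N1*(D1/D2)^2. The short check below uses hypothetical impeller diameters (the abstract does not report them), chosen only to show how a 300 rpm bench speed maps to a few rpm at the cubic-metre scale:

```python
# Constant-Reynolds-number stirrer scale-up: N2 = N1 * (D1 / D2)**2 for the same
# fluid.  Impeller diameters are hypothetical; the abstract does not report them.
D1 = 0.08          # m, assumed lab-scale impeller diameter
D2 = 0.77          # m, assumed 1 m^3-scale impeller diameter
N1 = 300.0         # rpm, lab-scale stirrer speed (from the abstract)

N2 = N1 * (D1 / D2) ** 2
print(f"large-scale stirrer speed ~ {N2:.2f} rpm")   # ~3.2 rpm, close to the 3.25 rpm quoted
```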

  12. Advances in membrane emulsification. Part B: recent developments in modelling and scale-up approaches.

    PubMed

    Spyropoulos, Fotis; Lloyd, David M; Hancocks, Robin D; Pawlik, Aleksandra K

    2014-03-15

    Membrane emulsification is a promising process for formulating emulsions and particulates. It offers many advantages over conventional 'high-shear' processes with narrower size distribution products, higher batch repeatability and lower energy consumption commonly demonstrated at a small scale. Since the process was first introduced around 25 years ago, understanding of the underlying mechanisms involved during microstructure formation has advanced significantly leading to the development of modelling approaches that predict processing output; e.g. emulsion droplet size and throughput. The accuracy and ease of application of these models is important to allow for the development of design equations which can potentially facilitate scale-up of the process and meet the manufacturer's specific requirements. Part B of this review considers the advantages and disadvantages of a variety of models developed to predict droplet size, flow behaviour and other phenomena (namely droplet-droplet interactions), with presentation of the appropriate formulae where necessary. Furthermore, the advancement of the process towards an industrial scale is also highlighted with additional recommendations by the authors for future work. PMID:24122852

  13. SCALE UP OF CERAMIC WASTE FORMS FOR THE EBR-II SPENT FUEL TREATMENT PROCESS

    SciTech Connect

    Matthew C. Morrison; Kenneth J. Bateman; Michael F. Simpson

    2010-11-01

    The ceramic waste process is the intended method for disposing of waste salt electrolyte, which contains fission products from the fuel-processing electrorefiners (ER) at the INL. When mixed and processed with other materials, the waste salt can be stored in a durable ceramic waste form (CWF). The development of the CWF has recently progressed from small-scale testing and characterization to full-scale implementation and experimentation using surrogate materials in lieu of the ER electrolyte. Two full-scale (378 kg and 383 kg) CWF test runs have been successfully completed with final densities of 2.2 g/cm³ and 2.1 g/cm³, respectively. The purpose of the first CWF was to establish material preparation parameters. The emphasis of the second pre-qualification test run was to evaluate a preliminary multi-section CWF container design. Other considerations were to finalize material preparation parameters, measure the material height as it consolidates in the furnace, and identify when cracking occurs during the CWF cooldown process.

  14. Polyethylene encapsulation of nitrate salt wastes: Waste form stability, process scale-up, and economics

    SciTech Connect

    Kalb, P.D.; Heiser, J.H. III; Colombo, P.

    1991-07-01

    A polyethylene encapsulation system for treatment of low-level radioactive, hazardous, and mixed wastes has been developed at Brookhaven National Laboratory. Polyethylene has several advantages compared with conventional solidification/stabilization materials such as hydraulic cements. Waste can be encapsulated with greater efficiency and with better waste form performance than is possible with hydraulic cement. The properties of polyethylene relevant to its long-term durability in storage and disposal environments are reviewed. Responses to specific potential failure mechanisms including biodegradation, radiation, chemical attack, flammability, environmental stress cracking, and photodegradation are examined. These data are supported by results from extensive waste form performance testing including compressive yield strength, water immersion, thermal cycling, leachability of radioactive and hazardous species, irradiation, biodegradation, and flammability. The bench-scale process has been successfully tested for application with a number of specific "problem" waste streams. Quality assurance and performance testing of the resulting waste form confirmed scale-up feasibility. Use of this system at Rocky Flats Plant can result in over 70% fewer drums processed and shipped for disposal, compared with optimal cement formulations. Based on the current Rocky Flats production of nitrate salt per year, polyethylene encapsulation can yield an estimated annual savings between $1.5 million and $2.7 million, compared with conventional hydraulic cement systems. 72 refs., 23 figs., 16 tabs.

  15. Destruction of nuclear organic waste by supercritical water oxidation. Scale-up of the process

    SciTech Connect

    Moussiere, S.; Roubaud, A.; Fournel, B.

    2007-07-01

    In order to design and define appropriate dimensions for a supercritical water oxidation reactor, 2D and 3D simulations of the fluid dynamics and heat transfer during the oxidation process have been performed. The solver used is a commercial code, Fluent 6.2. The turbulent flow field in the reactor, created by the stirrer, is taken into account with a k-omega model and a swirl imposed on the fluid; in the 3D case, the rotation of the stirrer is modeled with a sliding mesh. The reactivity of the system is taken into account with a classical combustion model (EDC). Comparisons with experimental temperature measurements validate the ability of the CFD modeling to simulate the supercritical water oxidation process. The simulation results provide a view inside the reactor of the flow and temperature fields and of where the oxidation takes place and develops. Results indicate that the flow can be considered piston-like and that heat transfer is strongly enhanced by the stirring. Hence, scaling up of the reactor volume to reach a treatment capacity of 1 kg/h of pure organics can be done based on the residence times and temperature distribution needed for complete destruction of the organic matter. (authors)

  16. Scale-up of mild gasification to a process development unit

    SciTech Connect

    Campbell, J.A.L.; Carty, R.H.; Saladin, N.; Foster, H.

    1992-06-01

    The work performed during the second quarterly reporting period (February 21 through May 20, 1992) on the research program "Scale-Up of Mild Gasification to a Process Development Unit" is presented in this report. The overall objective of this project is to develop the IGT Mild-Gasification (MILDGAS) process for near-term commercialization. The specific objectives of the program are to: (1) design, construct, and operate a 24-tons/day adiabatic process development unit (PDU) to obtain process performance data suitable for further design scale-up; (2) obtain large batches of coal-derived co-products for industrial evaluation; (3) prepare a detailed design of a demonstration unit; and (4) develop technical and economic plans for commercialization of the MILDGAS process. The MILDGAS process is a continuous closed system for producing liquid and solid (char) co-products at mild operating conditions up to 50 psig and 1300 °F. It is capable of processing a wide range of both eastern caking and western noncaking coals. The PDU to be constructed comprises a 2.5-ft ID adiabatic gasifier for the production of char, coal liquids, and gases; a thermal cracker for upgrading the coal liquids; and a hot briquetting unit for the production of form coke and smokeless fuel briquettes. The facility will also incorporate support equipment for environmentally acceptable disposal of process waste.

  17. Carbon honeycomb grids for advanced lead-acid batteries. Part III: Technology scale-up

    NASA Astrophysics Data System (ADS)

    Kirchev, A.; Serra, L.; Dumenil, S.; Brichard, G.; Alias, M.; Jammet, B.; Vinit, L.

    2015-12-01

    The carbon honeycomb grid technology employs new carbon/carbon composites with an ordered 3D structure in place of the classic lead-acid battery current collectors. The technology was scaled up in the laboratory from small grids, corresponding to electrodes with a capacity of 3 Ah, to current collectors suitable for the assembly of lead-acid batteries covering the majority of typical lead-acid battery applications. Two series of 150 grids each (one positive and one negative) were manufactured using low-cost lab-scale equipment. They were then pasted with active materials, and the resulting battery plates were assembled into 12 V AGM-VRLA battery mono-blocks for laboratory testing and for outdoor demonstration in an electric scooter, replacing its original VRLA battery pack. The results obtained demonstrate that the technology can successfully replace state-of-the-art negative grids with considerable benefits. The use of the carbon honeycomb grids as positive plate current collectors is limited by anodic corrosion of the entire structure, which attacks both the carbon/carbon composite part and the electroplated lead-tin alloy coating.

  18. Context matters: Successes and challenges of intrapartum care scale-up in four districts of Afghanistan.

    PubMed

    Tappis, Hannah; Koblinsky, Marge; Winch, Peter J; Turkmani, Sabera; Bartlett, Linda

    2016-04-01

    Reducing preventable maternal mortality and achieving Sustainable Development Goal targets for 2030 will require increased investment in improving access to quality health services in fragile and conflict-affected states. This study explores the conditions that affect availability and utilisation of intrapartum care services in four districts of Afghanistan where mortality studies were conducted in 2002 and 2011. Information on changes in each district was collected through interviews with community members; service providers; and district, provincial and national officials. This information was then triangulated with programme and policy documentation to identify factors that affect the coverage of safe delivery and emergency obstetric care services. Comparison of barriers to maternal health service coverage across the four districts highlights the complexities of national health policy planning and resource allocation in Afghanistan, and provides examples of the types of challenges that must be addressed to extend the reach of life-saving maternal health interventions to women in fragile and conflict-affected states. Findings suggest that improvements in service coverage must be measured at a sub-national level, and context-specific service delivery models may be needed to effectively scale up intrapartum care services in extremely remote or insecure settings. PMID:26645366

  19. Scaling Up the Production of Recombinant Antimicrobial Plantaricin E from a Heterologous Host, Escherichia coli.

    PubMed

    Pal, Gargi; Srivastava, Sheela

    2015-09-01

    Enhanced production of heterologously expressed plantaricin E (PlnE) from Escherichia coli BL21 (DE3) was achieved from small- to large-scale batch cultures. Starting from a 15-ml shake-flask culture grown in Luria-Bertani (LB) broth, protein expression could be scaled up using 50 ml, 100 ml, 1 l, and 2 l batch cultures. Under similar conditions, PlnE was successfully expressed in a 30-l stirred fermenter. The protein was expressed as a TRX-(His)6 fusion protein and separated by Ni²⁺ affinity chromatography. Growth in two complex media, LB and Terrific broth (TB), was optimized and compared for the production of PlnE, which was higher in LB than in TB. In the fermenter, 140 and 180 mg of PlnE could be produced from 12 l of culture volume at 30 and 25 °C, respectively. The yield of heterologously purified PlnE was 1.2-1.5%, much higher than that of the plantaricins produced by the native strain of Lactobacillus plantarum (0.3-0.7%). Overproduction of PlnE by heterologous expression can overcome the constraint of low yield from the producer strain and provides an easy, low-cost strategy for large-scale production. PMID:26044056

  20. Scaling-up essential neuropsychiatric services in Ethiopia: a cost-effectiveness analysis.

    PubMed

    Strand, Kirsten Bjerkreim; Chisholm, Dan; Fekadu, Abebaw; Johansson, Kjell Arne

    2016-05-01

    INTRODUCTION: There is an immense need for scaling-up neuropsychiatric care in low-income countries. Contextualized cost-effectiveness analyses (CEAs) provide relevant information for local policies. The aim of this study is to perform a contextualized CEA of neuropsychiatric interventions in Ethiopia and to illustrate expected population health and budget impacts across neuropsychiatric disorders. METHODS: A mathematical population model (PopMod) was used to estimate intervention costs and effectiveness. Existing variables from a previous WHO-CHOICE regional CEA model were substantially revised. Treatments for depression, schizophrenia, bipolar disorder and epilepsy were analysed. The best available local data on epidemiology, intervention efficacy, current and target coverage, resource prices and salaries were used. Data were obtained from expert opinion, local hospital information systems, the Ministry of Health and literature reviews. RESULTS: Treatment of epilepsy with a first-generation antiepileptic drug is the most cost-effective treatment (US$ 321 per DALY averted). Treatments for depression have mid-range values compared with other interventions (US$ 457-1026 per DALY averted). Treatments for schizophrenia and bipolar disorder are least cost-effective (US$ 1168-3739 per DALY averted). CONCLUSION: This analysis gives the Ethiopian government a comprehensive overview of the expected costs, effectiveness and cost-effectiveness of introducing basic neuropsychiatric interventions. PMID:26491060

  1. High speed electrospinning for scaled-up production of amorphous solid dispersion of itraconazole.

    PubMed

    Nagy, Zsombor K; Balogh, Attila; Démuth, Balázs; Pataki, Hajnalka; Vigh, Tamás; Szabó, Bence; Molnár, Kolos; Schmidt, Bence T; Horák, Péter; Marosi, György; Verreck, Geert; Van Assche, Ivo; Brewster, Marcus E

    2015-03-01

    High speed electrospinning (HSES), compatible with the pharmaceutical industry, was used to demonstrate the viability of preparing drug-loaded polymer nanofibers with radically higher productivity than the known single-needle electrospinning (SNES) setup. Poorly water-soluble itraconazole (ITRA) was formulated with PVPVA64 matrix polymer using four different solvent-based methods: HSES, SNES, spray drying (SD) and film casting (FC). The formulations were assessed in terms of improvement in the dissolution rate of ITRA (using a "tapped basket" dissolution configuration) and analysed by SEM, DSC and XRPD. Despite the significantly increased productivity of HSES, the obtained morphology was very similar to the SNES nanofibrous material. According to the DSC and XRPD results, ITRA transformed into an amorphous form in most cases, with the exception of the FC samples. The limited dissolution of crystalline ITRA could be greatly improved: fast dissolution occurred (>90% within 10 min) for both the scaled-up and the single-needle electrospun fibers, while the improvement in the dissolution rate of the spray-dried microspheres was significantly lower. Production of amorphous solid dispersions (ASDs) with the HSES system proved to be flexibly scalable and easy to integrate into a continuous pharmaceutical manufacturing line, which opens new routes for the development of industrially relevant nanopharmaceuticals. PMID:25596415

  2. Scale-up of biopesticide production processes using wastewater sludge as a raw material.

    PubMed

    Yezza, A; Tyagi, R D; Valèro, J R; Surampalli, R Y; Smith, J

    2004-12-01

    Studies were conducted on the production of Bacillus thuringiensis (Bt)-based biopesticides to ascertain the performance of the process in shake flasks and in two geometrically similar fermentors (15 and 150 l) utilizing wastewater sludge as a raw material. The results showed that it was possible to achieve better oxygen transfer in the larger capacity fermentor. Viable cell counts increased by 38-55% in the bioreactor compared to shake flasks. As for spore counts, an increase of 25% was observed when changing from shake-flask to fermentor experiments. Spore counts were essentially unchanged between bench (15 l) and pilot scale (150 l), at 5.3-5.5 × 10(8) cfu/ml. An improvement of 30% in the entomotoxicity potential was obtained at pilot scale. Protease activity increased by two to four times at bench and pilot scale, respectively, compared to the maximum activity obtained in shake flasks. The maximum protease activity (4.1 IU/ml) was obtained at pilot scale due to better oxygen transfer. The Bt fermentation process using sludge as raw material was successfully scaled up and resulted in high productivity for toxin protein yield and high protease activity. PMID:15662544

  3. Linear Nitramine (DNDA-57): Synthesis, Scale-Up, Characterization, and Quantitative Estimation by GC/MS

    NASA Astrophysics Data System (ADS)

    Vijayalakshmi, R.; Naik, N. H.; Gore, G. M.; Sikder, A. K.

    2015-01-01

    Dinitro-diaza-alkanes (DNDA-57) are linear nitramine plasticizers that find use in propellants with a low temperature-sensitivity coefficient. DNDA-57 is a mixture of 2,4-dinitro-2,4-diazapentane (DNDA-5), 2,4-dinitro-2,4-diazahexane (DNDA-6), and 3,5-dinitro-3,5-diazaheptane (DNDA-7) with percentage compositions of 40 ± 5%, 44 ± 5% and 11 ± 2%, respectively. The synthesis process for DNDA-57 was established with slight modification of the reaction parameters to obtain good yield, and the process was scaled up. The synthesized compound was thoroughly characterized by spectroscopic as well as thermal methods. The present study emphasizes gas chromatographic-mass spectrometric (GC/MS) characterization in electron impact (EI) and chemical ionization (CI) modes to determine the fragmentation pattern. The identified components were further confirmed by general characterization. The study reveals that DNDA-5, DNDA-6, and DNDA-7 follow identical decomposition patterns. The friction and impact sensitivity study reveals the insensitive nature of DNDA-57.

  4. Isolation of prebiotic carbohydrates by supercritical fluid extraction. Scaling-up and economical feasibility.

    PubMed

    Montañés, F; Fornari, T; Olano, A; Ibáñez, E

    2012-08-10

    Production of prebiotic carbohydrates at competitive prices is currently a challenge, since the well-established production processes involve many purification steps that are labour intensive and require large amounts of reagents, which increases the price of prebiotics. Several processes involving the use of supercritical fluid technology to fractionate and purify solid carbohydrate mixtures have been studied in our laboratory. Research carried out at laboratory scale using model mixtures (lactose/lactulose and galactose/tagatose), commercially available carbohydrate mixtures, and carbohydrate mixtures produced by enzymatic transglycosylation and isomerized with complexating reagents demonstrated that purification of prebiotic carbohydrates is technically feasible by supercritical fluid extraction. In the present work, the process optimized at laboratory scale to fractionate carbohydrate mixtures produced by enzymatic transglycosylation has been scaled up to an industrial level, and its economic feasibility has been simulated using AspenONE(®) V7.3 software to obtain consistent data supporting the interest of a potential investment in large-scale prebiotics production using supercritical fluids. PMID:22560345

  5. Scale-up on basis of structured mixing models: A new concept.

    PubMed

    Mayr, B; Moser, A; Nagy, E; Horvat, P

    1994-02-01

    A new scale-up concept is presented for bioreactors equipped with Rushton turbines, based upon mixing models using the tanks-in-series approach. The physical mixing model includes four adjustable parameters: radial and axial circulation times, the number of ideally mixed elements in one cascade, and the volume of the ideally mixed turbine region. The model parameters were adjusted by a modified Monte-Carlo optimization method, which fitted the simulated response function to the experimental curve. The number of cascade elements turned out to be constant (N = 4). The fitted radial circulation time is in good agreement with the value obtained from the pumping capacity. For the remaining parameters, first- or second-order formal equations were developed, involving four operational parameters (stirring and aeration intensity, scale, and viscosity). The concept can be extended to several other types of bioreactors and appears to be a suitable tool for comparing the bioprocess performance of different bioreactor types. (c) 1994 John Wiley & Sons, Inc. PMID:18615651
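
    As a purely illustrative companion to the modelling approach above, the sketch below fits the circulation time of a simple tanks-in-series impulse response to a tracer curve by random search, a bare-bones stand-in for the modified Monte-Carlo optimization the authors describe. The response model, the parameter range and the "experimental" points are assumptions for the example, not the authors' data or code.

        # Minimal sketch: random-search ("Monte Carlo") fit of a tanks-in-series
        # mixing model to a tracer response.  All numbers are illustrative.
        import math
        import random

        def tanks_in_series(t, tau, n=4):
            # Impulse response E(t) of n ideally mixed tanks with total mean
            # circulation time tau.
            k = n / tau
            return (k ** n) * t ** (n - 1) * math.exp(-k * t) / math.factorial(n - 1)

        t_exp = [2.0, 4.0, 6.0, 8.0, 10.0, 14.0, 18.0]               # s (assumed)
        e_exp = [0.030, 0.088, 0.115, 0.095, 0.072, 0.027, 0.008]    # 1/s (assumed)

        def sse(tau):
            return sum((tanks_in_series(t, tau) - e) ** 2 for t, e in zip(t_exp, e_exp))

        random.seed(1)
        best_tau, best_err = None, float("inf")
        for _ in range(5000):
            tau = random.uniform(1.0, 30.0)          # candidate circulation time
            err = sse(tau)
            if err < best_err:
                best_tau, best_err = tau, err

        print(f"fitted circulation time ~ {best_tau:.1f} s (SSE {best_err:.2e})")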

  6. Optimization and scale-up of a fluid bed tangential spray rotogranulation process.

    PubMed

    Bouffard, J; Dumont, H; Bertrand, F; Legros, R

    2007-04-20

    The production of pellets in the pharmaceutical industry generally involves multi-step processing: (1) mixing, (2) wet granulation, (3) spheronization and (4) drying. While extrusion-spheronization processes have been popular because of their simplicity, fluid-bed rotogranulation (FBRG) is now being considered as an alternative, since it offers the advantage of combining the different steps into one processing unit, thus reducing processing time and material handling. This work aimed at the development of an FBRG process for the production of pellets in a 4.5-l Glatt GPCG1 tangential spray rotoprocessor and its optimization using factorial design. The factors considered were: (1) rotor disc velocity, (2) gap air pressure, (3) air flow rate, (4) binder spray rate and (5) atomization pressure. The pellets were characterized for their physical properties by measuring size distribution, roundness and flow properties. The results indicated that pellet mean particle size is negatively affected by air flow rate and rotor plate speed, while binder spray rate has a positive effect on size; pellet flow properties are enhanced by operating with increased air flow rate and worsened with increased binder spray rate. Multiple regression analysis enabled the identification of an optimal operating window for production of acceptable pellets. Scale-up of these operating conditions was tested in a 30-l Glatt GPCG15 FBRG. PMID:17166677

  7. Scale-up of HIV Treatment Through PEPFAR: A Historic Public Health Achievement

    PubMed Central

    El-Sadr, Wafaa M.; Holmes, Charles B.; Mugyenyi, Peter; Thirumurthy, Harsha; Ellerbrock, Tedd; Ferris, Robert; Sanne, Ian; Asiimwe, Anita; Hirnschall, Gottfried; Nkambule, Rejoice N.; Stabinski, Lara; Affrunti, Megan; Teasdale, Chloe; Zulu, Isaac; Whiteside, Alan

    2012-01-01

    Since its inception in 2003, the US President’s Emergency Plan for AIDS Relief (PEPFAR) has been an important driving force behind the global scale-up of HIV care and treatment services, particularly in expansion of access to antiretroviral therapy. Despite initial concerns about cost and feasibility, PEPFAR overcame challenges by leveraging and coordinating with other funders, by working in partnership with the most affected countries, by supporting local ownership, by using a public health approach, by supporting task-shifting strategies, and by paying attention to health systems strengthening. As of September 2011, PEPFAR directly supported initiation of antiretroviral therapy for 3.9 million people and provided care and support for nearly 13 million people. Benefits in terms of prevention of morbidity and mortality have been reaped by those receiving the services, with evidence of societal benefits beyond the anticipated clinical benefits. However, much remains to be accomplished to achieve universal access, to enhance the quality of programs, to ensure retention of patients in care, and to continue to strengthen health systems. PMID:22797746

  8. Public-private interactions on health in South Africa: opportunities for scaling up.

    PubMed

    Kula, Nothemba; Fryatt, Robert J

    2014-08-01

    South Africa has long recognized partnerships between the public and private sectors as a policy objective in health, but experience is still limited and poorly documented. The objectives of this article are to understand the factors that increase the likelihood of success of public-private interactions in South Africa, and identify and discuss opportunities for them to be scaled up. There is a strong legislative framework and a number of guidelines and tools that have been developed by the Treasury for managing partnerships. The review of literature confirmed the need for the state to have effective regulations in order to oversee quality and standards and to provide stewardship and oversight. The public sector requires sufficient capacity not only to manage relationships with the private sector but also to enable innovation and experimentation. Evaluation is an integral part of all interactions not only to learn from successes but also to identify any perverse incentives that may lead to unintended consequences. Four case studies show that the private for-profit sector is already engaged in a number of projects that are closely aligned to current health system reform priorities. Factors that increase the likelihood of interactions being successful include: increasing the government's capacity to manage public-private relationships; choosing public-private interactions that are strategically important to national goals; building a knowledge base on what works, where and why; moving from pilots to large scale initiatives; harnessing the contracting expertise in private providers; and encouraging innovation and learning. PMID:23962441

  9. Methadone maintenance therapy in Vietnam: an overview and scaling-up plan.

    PubMed

    Nguyen, Tam T M; Nguyen, Long T; Pham, Manh D; Vu, Hoang H; Mulvey, Kevin P

    2012-01-01

    Vietnam is among the countries with the highest rate of HIV transmission through injecting drug use. HIV prevalence among injecting drug users is 20%, and up to 50% in many provinces. The estimated number of drug users in the country by the end of 2011 was 171,000, with heroin the most commonly used drug (85%). Detoxification at home, in the community, and in rehabilitation centers had been the main modality for managing heroin addiction until Methadone Maintenance Treatment (MMT) was piloted in 2008. Recent reports have demonstrated positive treatment outcomes: the incidence of HIV was found to be remarkably low among patients on MMT, and treatment has significantly improved quality of life as well as social stability. The government has authorized the Ministry of Health (MoH) to expand methadone treatment to at least 30 provinces to provide treatment for more than 80,000 drug users by 2015. The Vietnam Administration for HIV/AIDS Control (VAAC) and the MoH have outlined the roles and responsibilities of key departments at the central and local levels in implementing and maintaining MMT. This paper describes the achievements of the MMT pilot program and the scaling-up plan, as well as strategies to ensure quality and sustainability and to overcome the challenges of the coming years. PMID:23227351

  10. Scaling up the global nursing health workforce: contributions of an international organization.

    PubMed

    Rukholm, Ellen E; Stamler, Lynnette Leeseberg; Talbot, Lise R; Bednash, Geraldine; Raines, Fay; Potempa, Kathleen; Nugent, Pauline; Clark, Dame Jill Macleod; Bernhauser, Sue; Parfitt, Barbara

    2009-01-01

    In this paper, key highlights of the scholarly work presented at the Toronto 2008 Global Alliance for Nursing Education & Scholarship (GANES) conference are summarized; challenges, opportunities and issues facing nursing education globally that arose from the conference discourse are outlined; and initial steps are suggested as a way forward to a shared global view of baccalaureate and graduate nursing education and scholarship. This shared view arises from beginning understandings of the issues and opportunities we face globally, starting with and building upon lessons learned from the literature and from the experiences of nursing educators and nursing education organizations locally, regionally, nationally and internationally. The theme of the groundbreaking GANES Toronto conference was "Educating the future nursing and health workforce: A global challenge". One hundred seventy delegates from 17 countries attended the event, with over 80 papers presented. A primary focus of GANES is for a strategic alliance of national nursing education organizations to contribute to nursing education leading practices and policy that address the scaling up of the global nursing and health workforce. The founding members of GANES see a clear link between a strong educational infrastructure and strong scholarship activities in nursing and the ability of a society to be healthy and prosperous. Evidence presented at the recent GANES conference supports that belief. Through the strength of partnerships and other capacity-building efforts, member countries can support each other to address global nursing education and health challenges while respecting local issues. PMID:19388426

  11. Transforming Global Health by Improving the Science of Scale-Up.

    PubMed

    Kruk, Margaret E; Yamey, Gavin; Angell, Sonia Y; Beith, Alix; Cotlear, Daniel; Guanais, Frederico; Jacobs, Lisa; Saxenian, Helen; Victora, Cesar; Goosby, Eric

    2016-03-01

    In its report Global Health 2035, the Commission on Investing in Health proposed that health investments can reduce mortality in nearly all low- and middle-income countries to very low levels, thereby averting 10 million deaths per year from 2035 onward. Many of these gains could be achieved through scale-up of existing technologies and health services. A key instrument to close this gap is policy and implementation research (PIR) that aims to produce generalizable evidence on what works to implement successful interventions at scale. Rigorously designed PIR promotes global learning and local accountability. Much greater national and global investments in PIR capacity will be required to enable the scaling of effective approaches and to prevent the recycling of failed ideas. Sample questions for the PIR research agenda include how to close the gap in the delivery of essential services to the poor, which population interventions for non-communicable diseases are most applicable in different contexts, and how to engage non-state actors in equitable provision of health services in the context of universal health coverage. PMID:26934704

  12. Ensembl Genomes 2013: scaling up access to genome-wide data.

    PubMed

    Kersey, Paul Julian; Allen, James E; Christensen, Mikkel; Davis, Paul; Falin, Lee J; Grabmueller, Christoph; Hughes, Daniel Seth Toney; Humphrey, Jay; Kerhornou, Arnaud; Khobova, Julia; Langridge, Nicholas; McDowall, Mark D; Maheswari, Uma; Maslen, Gareth; Nuhn, Michael; Ong, Chuang Kee; Paulini, Michael; Pedro, Helder; Toneva, Iliana; Tuli, Mary Ann; Walts, Brandon; Williams, Gareth; Wilson, Derek; Youens-Clark, Ken; Monaco, Marcela K; Stein, Joshua; Wei, Xuehong; Ware, Doreen; Bolser, Daniel M; Howe, Kevin Lee; Kulesha, Eugene; Lawson, Daniel; Staines, Daniel Michael

    2014-01-01

    Ensembl Genomes (http://www.ensemblgenomes.org) is an integrating resource for genome-scale data from non-vertebrate species. The project exploits and extends technologies for genome annotation, analysis and dissemination, developed in the context of the vertebrate-focused Ensembl project, and provides a complementary set of resources for non-vertebrate species through a consistent set of programmatic and interactive interfaces. These provide access to data including reference sequence, gene models, transcriptional data, polymorphisms and comparative analysis. This article provides an update to the previous publications about the resource, with a focus on recent developments. These include the addition of important new genomes (and related data sets) including crop plants, vectors of human disease and eukaryotic pathogens. In addition, the resource has scaled up its representation of bacterial genomes, and now includes the genomes of over 9000 bacteria. Specific extensions to the web and programmatic interfaces have been developed to support users in navigating these large data sets. Looking forward, analytic tools to allow targeted selection of data for visualization and download are likely to become increasingly important in future as the number of available genomes increases within all domains of life, and some of the challenges faced in representing bacterial data are likely to become commonplace for eukaryotes in future. PMID:24163254
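
    As a hedged illustration of the programmatic interfaces mentioned above, the sketch below issues a single REST lookup with Python. The host, endpoint and identifier are assumptions borrowed from the main Ensembl REST service (a vertebrate example) purely to show the request pattern; the Ensembl Genomes divisions expose an equivalent interface, and the current documentation should be consulted for exact hosts and endpoints.

        # Illustrative REST lookup; host, endpoint and ID are assumptions for the
        # example and may change; consult the current Ensembl/Ensembl Genomes docs.
        import requests

        SERVER = "https://rest.ensembl.org"          # assumed public REST host
        ENDPOINT = "/lookup/id/ENSG00000157764"      # assumed endpoint + example gene ID

        resp = requests.get(SERVER + ENDPOINT,
                            headers={"Content-Type": "application/json"},
                            timeout=30)
        resp.raise_for_status()
        record = resp.json()
        print(record.get("display_name"), record.get("species"), record.get("biotype"))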

  13. Ensembl Genomes 2013: scaling up access to genome-wide data

    PubMed Central

    Kersey, Paul Julian; Allen, James E.; Christensen, Mikkel; Davis, Paul; Falin, Lee J.; Grabmueller, Christoph; Hughes, Daniel Seth Toney; Humphrey, Jay; Kerhornou, Arnaud; Khobova, Julia; Langridge, Nicholas; McDowall, Mark D.; Maheswari, Uma; Maslen, Gareth; Nuhn, Michael; Ong, Chuang Kee; Paulini, Michael; Pedro, Helder; Toneva, Iliana; Tuli, Mary Ann; Walts, Brandon; Williams, Gareth; Wilson, Derek; Youens-Clark, Ken; Monaco, Marcela K.; Stein, Joshua; Wei, Xuehong; Ware, Doreen; Bolser, Daniel M.; Howe, Kevin Lee; Kulesha, Eugene; Lawson, Daniel; Staines, Daniel Michael

    2014-01-01

    Ensembl Genomes (http://www.ensemblgenomes.org) is an integrating resource for genome-scale data from non-vertebrate species. The project exploits and extends technologies for genome annotation, analysis and dissemination, developed in the context of the vertebrate-focused Ensembl project, and provides a complementary set of resources for non-vertebrate species through a consistent set of programmatic and interactive interfaces. These provide access to data including reference sequence, gene models, transcriptional data, polymorphisms and comparative analysis. This article provides an update to the previous publications about the resource, with a focus on recent developments. These include the addition of important new genomes (and related data sets) including crop plants, vectors of human disease and eukaryotic pathogens. In addition, the resource has scaled up its representation of bacterial genomes, and now includes the genomes of over 9000 bacteria. Specific extensions to the web and programmatic interfaces have been developed to support users in navigating these large data sets. Looking forward, analytic tools to allow targeted selection of data for visualization and download are likely to become increasingly important in future as the number of available genomes increases within all domains of life, and some of the challenges faced in representing bacterial data are likely to become commonplace for eukaryotes in future. PMID:24163254

  14. Transforming Global Health by Improving the Science of Scale-Up

    PubMed Central

    Kruk, Margaret E.; Yamey, Gavin; Angell, Sonia Y.; Beith, Alix; Cotlear, Daniel; Guanais, Frederico; Jacobs, Lisa; Saxenian, Helen; Victora, Cesar; Goosby, Eric

    2016-01-01

    In its report Global Health 2035, the Commission on Investing in Health proposed that health investments can reduce mortality in nearly all low- and middle-income countries to very low levels, thereby averting 10 million deaths per year from 2035 onward. Many of these gains could be achieved through scale-up of existing technologies and health services. A key instrument to close this gap is policy and implementation research (PIR) that aims to produce generalizable evidence on what works to implement successful interventions at scale. Rigorously designed PIR promotes global learning and local accountability. Much greater national and global investments in PIR capacity will be required to enable the scaling of effective approaches and to prevent the recycling of failed ideas. Sample questions for the PIR research agenda include how to close the gap in the delivery of essential services to the poor, which population interventions for non-communicable diseases are most applicable in different contexts, and how to engage non-state actors in equitable provision of health services in the context of universal health coverage. PMID:26934704

  15. From powder to technical body: the undervalued science of catalyst scale up.

    PubMed

    Mitchell, Sharon; Michels, Nina-Luisa; Pérez-Ramírez, Javier

    2013-07-21

    Progress in catalysis has been, is, and will always be motivated by societal needs (e.g. environment, energy, chemicals, fuels), with the ultimate aim of improving process efficiency on a technical scale. Technical catalysts are often complex multicomponent millimetre-sized bodies consisting of active phases, supports, and numerous additives in shaped forms suitable for their commercial application. They can differ strongly in composition, structure, porosity, and performance from research catalysts, i.e. laboratory-developed materials constituted by a single bulk or supported active phase in powder form, which are the predominant focus of academic investigations. The industrial manufacture of heterogeneous catalysts, encompassing the upscaled preparation, formulation, and structuring, is shrouded in secrecy and is decisive for the overall process viability. Yet despite the tremendous relevance, understanding the added complexity of these multicomponent systems and the consequences for the respective structure-property-function relationships has been largely neglected. Accordingly, our review examines the intricacies of the scale up of heterogeneous catalysts. While emphasising the lack of fundamental knowledge, we point out the multiple functions that additives could provide by enhancing the mass and heat transfer properties, acting as co-catalysts, or imparting improved chemical, mechanical, or thermal stability. Recent exemplary studies developing rational approaches to prepare, characterise, and evaluate technical catalysts are analysed in detail and new directions for research in this field are put forward. PMID:23648466

  16. Simulation for Supporting Scale-Up of a Fluidized Bed Reactor for Advanced Water Oxidation

    PubMed Central

    Abdul Raman, Abdul Aziz; Daud, Wan Mohd Ashri Wan

    2014-01-01

    Simulation of a fluidized bed reactor (FBR) for treating wastewater by the Fenton reaction, an advanced oxidation process (AOP), was performed. The simulation was used to determine the characteristics of FBR performance, the concentration profile of the contaminants, and prominent hydrodynamic properties (e.g., Reynolds number, velocity, and pressure) in the reactor. The simulation was implemented for a 2.8 L working volume using hydrodynamic correlations, the continuity equation, and simplified kinetics for phenol degradation as a model. The simulation shows that, using Fe3+ and Fe2+ mixtures as catalyst, TOC degradation of up to 45% was achieved for a contaminant range of 40-90 mg/L within 60 min. Concentration profiles and hydrodynamic characteristics were also generated. A subsequent scale-up study was conducted using the similitude method; the analysis shows that the models developed are applicable up to a 10 L working volume. The study demonstrates that, with appropriate modeling and simulation, data can be predicted for designing and operating an FBR for wastewater treatment. PMID:25309949
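
    A minimal sketch of the kind of simplified kinetics such a simulation can rest on: pseudo-first-order TOC decay integrated with an explicit Euler step. The rate constant is an assumption chosen only so that roughly 45% removal is reached in 60 min, as reported above; it is not the published model.

        # Pseudo-first-order TOC decay, d(TOC)/dt = -k*TOC, explicit Euler march.
        # k is an assumed lumped constant tuned to ~45% removal in 60 min.
        import math

        k = -math.log(0.55) / 60.0      # 1/min (assumption)
        toc0 = 80.0                     # mg/L, within the 40-90 mg/L range studied
        dt = 0.5                        # min

        toc, t = toc0, 0.0
        while t < 60.0:
            toc += -k * toc * dt
            t += dt

        removal = 100.0 * (1.0 - toc / toc0)
        print(f"TOC after 60 min: {toc:.1f} mg/L (~{removal:.0f}% removal)")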

  17. Accelerated reforms in healthcare financing: the need to scale up private sector participation in Nigeria

    PubMed Central

    Ejughemre, Ufuoma John

    2014-01-01

    The health sector, a foremost service sector in Nigeria, faces a number of challenges, primarily the persistent under-funding of health by the Nigerian government: evidence reveals low allocations to the health sector and poor health system performance, reflected in the country's key health indices. Notwithstanding this, there is evidence that the private sector could be a key player in delivering health services and improving health outcomes, including those related to healthcare financing. This underscores the need to optimize the role of the private sector in complementing the government's commitment to financing healthcare delivery and strengthening the health system in Nigeria. There are also concerns about the uneven quality and affordability of private-driven health systems, which necessitates reforms aimed at regulation. Accordingly, the argument is that the benefits of leveraging the private sector to complement the national government in healthcare financing outweigh the challenges, particularly in light of lean public resources and finite donor support. This article therefore highlights the potential for the Nigerian government to scale up healthcare financing by leveraging private resources, innovations and expertise, while working to achieve universal health coverage. PMID:24596895

  18. Completing Pre-Pilot Tasks To Scale Up Biomass Fractionation Pretreatment Apparatus From Batch To Continuous

    SciTech Connect

    Dick Wingerson

    2004-12-15

    PureVision Technology, Inc. (PureVision) was the recipient of a $200,000 Invention and Innovations (I&I) grant from the U.S. Department of Energy (DOE) to complete pre-pilot tasks in order to scale up its patented biomass fractionation pretreatment apparatus from batch to continuous processing. The initial goal of the I&I program, as detailed in PureVision's original application to the DOE, was to develop the design criteria to build a small continuous biomass fractionation pilot apparatus utilizing a retrofitted extruder with a novel screw configuration to create multiple reaction zones, separated by dynamic plugs within the reaction chamber, that support the continuous counter-flow of liquids and solids at elevated temperature and pressure. Although the ultimate results of this 27-month I&I program exceeded the initial expectations, some of the originally planned tasks were not completed due to a change of direction in the program. PureVision achieved its primary milestone by establishing the design criteria for a continuous process development unit (PDU). In addition, PureVision completed the procurement and assembly of the PDU and initiated its shakedown at Western Research Institute (WRI) in Laramie, WY between August 2003 and February 2004. During March 2004, PureVision and WRI performed initial testing of the continuous PDU at WRI.

  19. Scaling up the national methadone maintenance treatment program in China: achievements and challenges

    PubMed Central

    Yin, Wenyuan; Hao, Yang; Sun, Xinhua; Gong, Xiuli; Li, Fang; Li, Jianhua; Rou, Keming; Sullivan, Sheena G; Wang, Changhe; Cao, Xiaobin; Luo, Wei; Wu, Zunyou

    2010-01-01

    China’s methadone maintenance treatment program was initiated in 2004 as a small pilot project in just eight sites. It has since expanded into a nationwide program encompassing more than 680 clinics covering 27 provinces and serving some 242 000 heroin users by the end of 2009. The agencies that were tasked with the program’s expansion have been confronted with many challenges, including high drop-out rates, poor cooperation between local governing authorities and poor service quality at the counter. In spite of these difficulties, ongoing evaluation has suggested reductions in heroin use, risky injection practices and, importantly, criminal behaviours among clients, which has thus provided the impetus for further expansion. Clinic services have been extended to offer clients a range of ancillary services, including HIV, syphilis and hepatitis C testing, information, education and communication, psychosocial support services and referrals for treatment of HIV, tuberculosis and sexually transmitted diseases. Cooperation between health and public security officials has improved through regular meetings and dialogue. However, institutional capacity building is still needed to deliver sustainable and standardized services that will ultimately improve retention rates. This article documents the steps China made in overcoming the many barriers to success of its methadone program. These lessons might be useful for other countries in the region that are scaling-up their methadone programs. PMID:21113034

  20. Magnetic Resonance Techniques Applied to the Diagnosis and Treatment of Parkinson's Disease.

    PubMed

    de Celis Alonso, Benito; Hidalgo-Tobón, Silvia S; Menéndez-González, Manuel; Salas-Pacheco, José; Arias-Carrión, Oscar

    2015-01-01

    Parkinson's disease (PD) affects at least 10 million people worldwide. It is a neurodegenerative disease, which is currently diagnosed by neurological examination. No neuroimaging investigation or blood biomarker is available to aid diagnosis and prognosis. Most effort toward diagnosis using magnetic resonance (MR) has been focused on the use of structural/anatomical neuroimaging and diffusion tensor imaging (DTI). However, deep brain stimulation, a current strategy for treating PD, is guided by MR imaging (MRI). For clinical prognosis, diagnosis, and follow-up investigations, blood oxygen level-dependent MRI, DTI, spectroscopy, and transcranial magnetic stimulation have been used. These techniques represent the state of the art in the last 5 years. Here, we focus on MR techniques for the diagnosis and treatment of Parkinson's disease. PMID:26191037

  1. CORDIC algorithm based digital detection technique applied in resonator fiber optic gyroscope

    NASA Astrophysics Data System (ADS)

    Yang, Zhihuai; Jin, Xiaojun; Ma, Huilian; Jin, Zhonghe

    2009-06-01

    A digital detection technique based on the coordinate rotation digital computer (CORDIC) algorithm is proposed for a resonator fiber optic gyroscope (R-FOG). It allows the generation of the modulation signal, synchronous demodulation, and signal processing in the R-FOG to be realized in a single field-programmable gate array (FPGA). The frequency synthesis and synchronous detection techniques based on the CORDIC algorithm are first analyzed and designed. The experimental results indicate that the precision of the detection circuit satisfies the requirements for closed-loop feedback in the R-FOG system. The frequency of the laser is stably locked to the resonance frequency of the fiber ring resonator, and the open-loop gyro output signal is observed successfully. The dynamic range and the bias drift of the R-FOG are ±1.91 rad/s and 0.005 rad/s over 10 s, respectively.
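
    For readers unfamiliar with CORDIC, the sketch below is a generic, floating-point rotation-mode implementation of the sine/cosine iteration on which such frequency synthesis and demodulation blocks are built; it illustrates the algorithm only and is not the fixed-point FPGA code described in the paper.

        # Generic rotation-mode CORDIC: iteratively rotates (x, y) toward angle z
        # using shift-and-add style micro-rotations; gain K is pre-compensated.
        import math

        N_ITERS = 16
        ANGLES = [math.atan(2.0 ** -i) for i in range(N_ITERS)]
        K = 1.0
        for i in range(N_ITERS):
            K /= math.sqrt(1.0 + 2.0 ** (-2 * i))   # cumulative CORDIC gain correction

        def cordic_sin_cos(theta):
            """Return (cos(theta), sin(theta)) for |theta| <= pi/2."""
            x, y, z = K, 0.0, theta
            for i in range(N_ITERS):
                d = 1.0 if z >= 0.0 else -1.0
                x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
                z -= d * ANGLES[i]
            return x, y

        c, s = cordic_sin_cos(math.pi / 6)
        print(round(c, 4), round(s, 4))   # ~0.866, ~0.5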

  2. Speckle interferometric techniques applied to the observation of the solar photosphere

    NASA Astrophysics Data System (ADS)

    Aime, C.; Ricort, G.

    1980-01-01

    Speckle interferometric techniques are used to study the solar granulation. Calibration of the effects of atmospheric turbulence is performed either by using the difference in behavior between redundant and non-redundant apertures in the presence of atmospheric turbulence, by analysing Moon-limb blurring during a solar eclipse, or by using the changes in seeing conditions during speckle-interferometric measurements. These techniques require theoretical knowledge of the effects of atmospheric turbulence on the modulation transfer function (MTF) of the image, as it is impractical to use an unresolved star near the Sun as a reference source during daytime observations. The agreement between the experimental MTF obtained with an unresolved star and the theoretical form deduced from Korff's log-normal assumptions is extended to daytime conditions.

  3. Magnetic Resonance Techniques Applied to the Diagnosis and Treatment of Parkinson’s Disease

    PubMed Central

    de Celis Alonso, Benito; Hidalgo-Tobón, Silvia S.; Menéndez-González, Manuel; Salas-Pacheco, José; Arias-Carrión, Oscar

    2015-01-01

    Parkinson’s disease (PD) affects at least 10 million people worldwide. It is a neurodegenerative disease, which is currently diagnosed by neurological examination. No neuroimaging investigation or blood biomarker is available to aid diagnosis and prognosis. Most effort toward diagnosis using magnetic resonance (MR) has been focused on the use of structural/anatomical neuroimaging and diffusion tensor imaging (DTI). However, deep brain stimulation, a current strategy for treating PD, is guided by MR imaging (MRI). For clinical prognosis, diagnosis, and follow-up investigations, blood oxygen level-dependent MRI, DTI, spectroscopy, and transcranial magnetic stimulation have been used. These techniques represent the state of the art in the last 5 years. Here, we focus on MR techniques for the diagnosis and treatment of Parkinson’s disease. PMID:26191037

  4. Modelling laser speckle photographs of decayed teeth by applying a digital image information technique

    NASA Astrophysics Data System (ADS)

    Ansari, M. Z.; da Silva, L. C.; da Silva, J. V. P.; Deana, A. M.

    2016-09-01

    We report on the application of a digital image model to assess early carious lesions on teeth. Lesions in the early stages of decay were illuminated with a laser and laser speckle images were obtained. Due to the differences in optical properties between healthy and carious tissue, the two regions produce different scatter patterns. The digital image information technique allowed us to produce colour-coded 3D surface plots of the intensity information in the speckle images, where the height (on the z-axis) and the colour in the rendering correlate with the intensity of a pixel in the image. The quantitative changes in colour component density enhance the contrast between decayed and sound tissue, and visualization of the carious lesions becomes significantly clearer. The proposed technique may therefore be adopted in the early diagnosis of carious lesions.
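
    A minimal sketch of the rendering step described above, using a synthetic array in place of a real speckle image: pixel intensity is mapped to both surface height and colour. Library choices and the colour map are assumptions, not the authors' implementation.

        # Colour-coded 3D surface plot of image intensity (synthetic speckle here).
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        img = rng.random((128, 128))      # placeholder for a laser speckle image

        x, y = np.meshgrid(np.arange(img.shape[1]), np.arange(img.shape[0]))
        fig = plt.figure()
        ax = fig.add_subplot(111, projection="3d")
        ax.plot_surface(x, y, img, cmap="jet", linewidth=0, antialiased=False)
        ax.set_xlabel("x (pixel)")
        ax.set_ylabel("y (pixel)")
        ax.set_zlabel("intensity")
        plt.show()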

  5. Automated Boundary-Extraction and Region-Growing Techniques Applied to Solar Magnetograms

    NASA Technical Reports Server (NTRS)

    McAteer, R. T. James; Gallagher, Peter; Ireland, Jack; Young, C Alex

    2005-01-01

    We present an automated approach to active region extraction from full-disc MDI longitudinal magnetograms. It uses a region-growing technique in conjunction with boundary extraction to define a number of enclosed contours as belonging to separate regions of magnetic significance on the solar disc. This provides an objective definition of active regions and areas of plage on the Sun. A number of parameters relating to the flare potential of each region are discussed.
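
    The sketch below illustrates the general idea of two-threshold region growing on a magnetogram-like array: pixels above a high seed threshold start regions, which then absorb 4-connected neighbours above a lower threshold. The thresholds and the synthetic field are assumptions, and this is not the authors' pipeline.

        # Two-threshold region growing (seed + grow) via breadth-first flood fill.
        from collections import deque
        import numpy as np

        def region_grow(field, seed_thresh=150.0, grow_thresh=50.0):
            mask = np.abs(field) >= grow_thresh
            labels = np.zeros(field.shape, dtype=int)
            current = 0
            for seed in zip(*np.where(np.abs(field) >= seed_thresh)):
                if labels[seed]:
                    continue                      # already part of a grown region
                current += 1
                labels[seed] = current
                queue = deque([seed])
                while queue:
                    r, c = queue.popleft()
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < field.shape[0] and 0 <= cc < field.shape[1]
                                and mask[rr, cc] and not labels[rr, cc]):
                            labels[rr, cc] = current
                            queue.append((rr, cc))
            return labels

        demo = np.random.default_rng(1).normal(0.0, 60.0, (256, 256))  # fake field (G)
        print("regions found:", int(region_grow(demo).max()))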

  6. A fast technique applied to the analysis of Resistive Wall Modes with 3D conducting structures

    SciTech Connect

    Rubinacci, Guglielmo; Liu, Yueqiang

    2009-03-20

    This paper illustrates the development of a 'fast' technique for the analysis of Resistive Wall Modes (RWMs) in fusion devices with three-dimensional conducting structures, by means of the recently developed CarMa code. Thanks to its peculiar features, the computational cost scales almost linearly with the number of discrete unknowns. Some large-scale problems are solved in configurations of interest for the International Thermonuclear Experimental Reactor (ITER).

  7. 3D-Laser-Scanning Technique Applied to Bulk Density Measurements of Apollo Lunar Samples

    NASA Technical Reports Server (NTRS)

    Macke, R. J.; Kent, J. J.; Kiefer, W. S.; Britt, D. T.

    2015-01-01

    In order to better interpret gravimetric data from orbiters such as GRAIL and LRO and to understand the subsurface composition and structure of the lunar crust, it is important to have a reliable database of the density and porosity of lunar materials. To this end, we have been surveying these physical properties in both lunar meteorites and Apollo lunar samples. To measure porosity, both grain density and bulk density are required. For bulk density, our group has historically made extensive use of sub-mm bead immersion techniques, though several factors have made this technique problematic for our work with Apollo samples. Samples allocated for measurement are often smaller than optimal for the technique, leading to large error bars. Also, for some samples we were required to use pure alumina beads instead of our usual glass beads; the alumina beads were subject to undesirable static effects, producing unreliable results. Other investigators have tested the use of 3D laser scanners on meteorites for measuring bulk volumes. Early work, though promising, was plagued with difficulties, including poor response on dark or reflective surfaces, difficulty reproducing sharp edges, and long processing times for producing shape models. Due to progress in technology, however, laser scanners have improved considerably in recent years. We tested this technique on 27 lunar samples in the Apollo collection using a scanner at NASA Johnson Space Center. We found it to be reliable and more precise than beads, with the added benefit that it involves no direct contact with the sample, enabling the study of particularly friable samples for which bead immersion is not possible.
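
    For context, the quantities the scan feeds into reduce to two simple relations: bulk density is sample mass divided by the scanned bulk volume, and porosity follows from an independently measured grain density. The numbers below are illustrative assumptions, not measured Apollo values.

        # Bulk density and porosity from a laser-scan volume (illustrative numbers).
        mass_g = 125.4            # sample mass (g), assumed
        bulk_volume_cm3 = 52.1    # volume of the 3D shape model (cm^3), assumed
        grain_density = 3.10      # grain density, e.g. from pycnometry (g/cm^3), assumed

        bulk_density = mass_g / bulk_volume_cm3
        porosity = 1.0 - bulk_density / grain_density
        print(f"bulk density = {bulk_density:.2f} g/cm^3, porosity = {porosity:.1%}")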

  8. Vibrational techniques applied to photosynthesis: Resonance Raman and fluorescence line-narrowing.

    PubMed

    Gall, Andrew; Pascal, Andrew A; Robert, Bruno

    2015-01-01

    Resonance Raman spectroscopy may yield precise information on the conformation of, and the interactions assumed by, the chromophores involved in the first steps of the photosynthetic process. Selectivity is achieved via resonance with the absorption transition of the chromophore of interest. Fluorescence line-narrowing spectroscopy is a complementary technique, in that it provides the same level of information (structure, conformation, interactions), but in this case for the emitting pigment(s) only (whether isolated or in an ensemble of interacting chromophores). The selectivity provided by these vibrational techniques allows for the analysis of pigment molecules not only when they are isolated in solvents, but also when embedded in soluble or membrane proteins and even, as shown recently, in vivo. They can be used, for instance, to relate the electronic properties of these pigment molecules to their structure and/or the physical properties of their environment. These techniques are even able to follow subtle changes in chromophore conformation associated with regulatory processes. After a short introduction to the physical principles that govern resonance Raman and fluorescence line-narrowing spectroscopies, the information content of the vibrational spectra of chlorophyll and carotenoid molecules is described in this article, together with the experiments which helped in determining which structural parameter(s) each vibrational band is sensitive to. A selection of applications is then presented, in order to illustrate how these techniques have been used in the field of photosynthesis, and what type of information has been obtained. This article is part of a Special Issue entitled: Vibrational spectroscopies and bioenergetic systems. PMID:25268562

  9. A photoacoustic technique applied to detection of ethylene emissions in edible coated passion fruit

    NASA Astrophysics Data System (ADS)

    Alves, G. V. L.; dos Santos, W. C.; Waldman, W. R.; Oliveira, J. G.; Vargas, H.; da Silva, M. G.

    2010-03-01

    Photoacoustic spectroscopy was applied to study the physiological behavior of passion fruit when coated with edible films. The results have shown a reduction of the ethylene emission rate. Weight loss monitoring has not shown any significant differences between the coated and uncoated passion fruit. On the other hand, slower color changes of coated samples suggest a slowdown of the ripening process in coated passion fruit.

  10. Applying Data-mining techniques to study drought periods in Spain

    NASA Astrophysics Data System (ADS)

    Belda, F.; Penades, M. C.

    2010-09-01

    Data mining is a technique that can be used to interact with large databases and to help discover relations between parameters by extracting information from massive and multiple data archives. Drought affects many economic and social sectors, from agriculture to transportation, urban water supply and the development of modern industries. Given these impacts and the geographical and temporal distribution of drought, it is difficult to find a single definition of drought. Improving the understanding of climatic indices is necessary to reduce the impacts of drought and to facilitate quick decisions regarding this problem. The main objective is to analyze drought periods from 1950 to 2009 in Spain. We use several kinds of information with different formats, sources and transmission modes: satellite-based vegetation indices and dryness indices for several temporal periods, daily and monthly precipitation and temperature data, and soil moisture data from a numerical weather model. We mainly calculate the Standardized Precipitation Index (SPI), which has been widely used in the literature. We use OLAP-Mining techniques to discover association rules between remote-sensing data, numerical weather model output and climatic indices. Time-series data-mining techniques organize the data as a sequence of events, with each event having a time of recurrence, and cluster the data into groups of records with similar characteristics. A prior climatological classification is necessary if we want to study drought periods over all of Spain.
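
    The core of the SPI calculation mentioned above can be sketched as follows: fit a gamma distribution to a precipitation accumulation series and express each value's cumulative probability as a standard normal deviate. The series below is synthetic, and the handling of zero-precipitation months (part of the standard SPI definition) is omitted for brevity.

        # SPI sketch: gamma fit of a precipitation series, then normal-quantile transform.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        precip = rng.gamma(shape=2.0, scale=30.0, size=60)   # synthetic monthly totals (mm)

        shape, loc, scale = stats.gamma.fit(precip, floc=0)  # two-parameter gamma fit
        spi = stats.norm.ppf(stats.gamma.cdf(precip, shape, loc=loc, scale=scale))

        # SPI <= -1 is commonly read as at least moderate drought.
        print("months with SPI <= -1:", int(np.sum(spi <= -1.0)))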

  11. Metal oxide collectors for storing matter technique applied in secondary ion mass spectrometry

    NASA Astrophysics Data System (ADS)

    Miśnik, Maciej; Konarski, Piotr; Zawada, Aleksander

    2016-03-01

    We present results on the use of metal and metal oxide substrates as collectors in 'storing matter', a quantitative technique of secondary ion mass spectrometry (SIMS). This technique separates the two basic processes of secondary ion formation in SIMS: the process of ion sputtering is separated from the process of ionisation. The analysed sample is sputtered and the sputtered material is stored, with sub-monolayer coverage, onto a collector surface. Such deposits can then be analysed by SIMS, and as a result the so-called 'matrix effects' are significantly reduced. We deposit the sputtered material onto Ti and Cu substrates and also onto metal oxide substrates such as molybdenum, titanium, tin and indium oxides. The sputtering process is carried out within the same vacuum chamber where the SIMS analysis of the collected material is performed. For sputtering and SIMS analysis of the deposited material we use a 5 keV Ar+ beam of 500 nA. The presented results are obtained with stationary collectors. Here we present a case study of chromium. The results show that the molybdenum and titanium oxide substrates used as collectors increase the useful yield by two orders of magnitude with respect to pure elemental collectors such as Cu and Ti. Here, useful yield is defined as the ratio of the number of secondary ions detected during SIMS analysis to the number of atoms sputtered during the deposition process.

  12. Air flow measurement techniques applied to noise reduction of a centrifugal blower

    NASA Astrophysics Data System (ADS)

    Laage, John W.; Armstrong, Ashli J.; Eilers, Daniel J.; Olsen, Michael G.; Mann, J. Adin

    2005-09-01

    The air flow in a centrifugal blower was studied using a variety of flow and sound measurement techniques. The flow measurement techniques employed included Particle Image Velocimetry (PIV), pitot tubes, and a five-hole spherical probe. PIV was used to measure instantaneous and ensemble-averaged velocity fields over a large area of the outlet duct as a function of fan position, allowing visualization of the flow as it left the fan blades and progressed downstream. The results from the flow measurements were reviewed alongside the results of the sound measurements with the goal of identifying sources of noise and inefficiencies in flow performance. The radiated sound power was divided into broadband and tone noise components. The changes in the tone and broadband sound were compared to changes in flow quantities such as the turbulent kinetic energy and Reynolds stress. Results for each method are presented to demonstrate the strengths of each flow measurement technique as well as its limitations. Finally, the role that each played in identifying noise sources is described.
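
    As a hedged illustration of the post-processing step that links the PIV fields to the quantities named above, the sketch below computes the in-plane turbulent kinetic energy and Reynolds shear stress from an ensemble of instantaneous 2D velocity fields; the synthetic fields stand in for real PIV data.

        # In-plane TKE and Reynolds shear stress from an ensemble of 2D PIV fields.
        import numpy as np

        rng = np.random.default_rng(0)
        u = 10.0 + rng.normal(0.0, 1.0, (200, 64, 64))   # streamwise velocity (m/s), synthetic
        v = rng.normal(0.0, 0.5, (200, 64, 64))          # transverse velocity (m/s), synthetic

        u_fluc = u - u.mean(axis=0)                      # fluctuations about ensemble mean
        v_fluc = v - v.mean(axis=0)

        reynolds_shear = -(u_fluc * v_fluc).mean(axis=0)          # -<u'v'> per unit density
        tke_2d = 0.5 * ((u_fluc ** 2).mean(axis=0) + (v_fluc ** 2).mean(axis=0))

        print("peak in-plane TKE:", float(tke_2d.max()), "m^2/s^2")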

  13. Computer Vision Techniques Applied to Space Object Detect, Track, ID, Characterize

    NASA Astrophysics Data System (ADS)

    Flewelling, B.

    2014-09-01

    Space-based object detection and tracking represent a fundamental step necessary for detailed analysis of space objects. Initial observations of a resident space object (RSO) may result from careful sensor tasking to observe an object with well understood dynamics, or from measurements of opportunity on an object with poorly understood dynamics. Dim and eccentric objects present a particular challenge, which requires more dynamic use of imaging systems. As a result of more stressing data acquisition strategies, advanced techniques for the accurate processing of both point and streaking sources are needed. This paper focuses on two key methods in computer vision used to determine interest points within imagery. The Harris corner method and the method of phase congruency can be used to effectively extract static and streaking point sources and to indicate when apparent motion is present within an observation. The geometric inferences which can be made from the resulting detections are discussed, including a method to evaluate the localization uncertainty of the extracted detections based on the computation of the Hessian of the detector response. Finally, a technique which exploits the additional information found in detected streak endpoints to provide a better centroid in the presence of curved streaks is explained, and additional applications of the presented techniques are discussed.
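
    A minimal sketch of the Harris corner response on a synthetic frame is given below: the structure tensor is built from image gradients, smoothed over a local window, and scored as R = det(M) - k * trace(M)^2. The window size, k and the test image are assumptions; this is not the paper's implementation, and phase congruency is not shown.

        # Harris corner response: structure tensor from gradients, windowed, then scored.
        import numpy as np
        from scipy.ndimage import uniform_filter

        def harris_response(img, window=5, k=0.04):
            iy, ix = np.gradient(img.astype(float))
            ixx = uniform_filter(ix * ix, size=window)
            iyy = uniform_filter(iy * iy, size=window)
            ixy = uniform_filter(ix * iy, size=window)
            det = ixx * iyy - ixy * ixy
            trace = ixx + iyy
            return det - k * trace * trace

        frame = np.zeros((64, 64))
        frame[20:40, 20:40] = 1.0                 # one bright square "object"
        resp = harris_response(frame)
        peak = np.unravel_index(np.argmax(resp), resp.shape)
        print("strongest corner response near pixel", peak)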

  14. Quantification of material slippage in the iliotibial tract when applying the partial plastination clamping technique.

    PubMed

    Sichting, Freddy; Steinke, Hanno; Wagner, Martin F-X; Fritsch, Sebastian; Hädrich, Carsten; Hammer, Niels

    2015-09-01

    The objective of this study was to evaluate the potential of the partial plastination technique in minimizing material slippage and to discuss the effects on the tensile properties of thin dense connective tissue. The ends of twelve iliotibial tract samples were primed with polyurethane resin and covered by plastic plates to provide sufficient grip between the clamps. The central part of the samples remained in an anatomically unfixed condition. Strain data of twelve partially plastinated samples and ten samples in a completely anatomically unfixed state were obtained using uniaxial crosshead displacement and an optical image tracking technique. Testing of agreement between the strain data revealed ongoing but markedly reduced material slippage in partially plastinated samples compared to the unfixed samples. The mean measurement error introduced by material slippage was up to 18.0% in partially plastinated samples. These findings might complement existing data on measurement errors during material testing and highlight the importance of individual quantitative evaluation of errors that come along with self-made clamping techniques. PMID:26005842

  15. Assessment of ground-based monitoring techniques applied to landslide investigations

    NASA Astrophysics Data System (ADS)

    Uhlemann, S.; Smith, A.; Chambers, J.; Dixon, N.; Dijkstra, T.; Haslam, E.; Meldrum, P.; Merritt, A.; Gunn, D.; Mackay, J.

    2016-01-01

    A landslide complex in the Whitby Mudstone Formation at Hollin Hill, North Yorkshire, UK is periodically re-activated in response to rainfall-induced pore-water pressure fluctuations. This paper compares long-term measurements (i.e., 2009-2014) obtained from a combination of monitoring techniques that have been employed together for the first time on an active landslide. The results highlight the relative performance of the different techniques, and can provide guidance for researchers and practitioners for selecting and installing appropriate monitoring techniques to assess unstable slopes. Particular attention is given to the spatial and temporal resolutions offered by the different approaches that include: Real Time Kinematic-GPS (RTK-GPS) monitoring of a ground surface marker array, conventional inclinometers, Shape Acceleration Arrays (SAA), tilt meters, active waveguides with Acoustic Emission (AE) monitoring, and piezometers. High spatial resolution information has allowed locating areas of stability and instability across a large slope. This has enabled identification of areas where further monitoring efforts should be focused. High temporal resolution information allowed the capture of 'S'-shaped slope displacement-time behaviour (i.e. phases of slope acceleration, deceleration and stability) in response to elevations in pore-water pressures. This study shows that a well-balanced suite of monitoring techniques that provides high temporal and spatial resolutions on both measurement and slope scale is necessary to fully understand failure and movement mechanisms of slopes. In the case of the Hollin Hill landslide it enabled detailed interpretation of the geomorphological processes governing landslide activity. It highlights the benefit of regularly surveying a network of GPS markers to determine areas for installation of movement monitoring techniques that offer higher resolution both temporally and spatially. The small sensitivity of tilt meter measurements

  16. Implications of the Fast-Evolving Scale-Up of Adult Voluntary Medical Male Circumcision for Quality of Services in South Africa

    PubMed Central

    Rech, Dino; Spyrelis, Alexandra; Frade, Sasha; Perry, Linnea; Farrell, Margaret; Fertziger, Rebecca; Toledo, Carlos; Castor, Delivette; Njeuhmeli, Emmanuel; Loykissoonlal, Dayanund; Bertrand, Jane T.

    2014-01-01

    Background The scale-up of voluntary medical male circumcision (VMMC) services in South Africa has been rapid, in an attempt to achieve the national government target of 4.3 million adult male circumcisions for HIV prevention by 2016. This study assesses the effect of the scale-up on the quality of the VMMC program. Methods and Findings This analysis compares the quality of services at 15 sites operational in 2011 to (1) the same 15 sites in 2012 and (2) to a set of 40 sites representing the expanded program in 2012. Trained clinicians scored each site on 29 items measuring readiness to provide quality services (abbreviated version of the WHO Quality Assessment [QA] Guide) and 29 items to assess quality of surgical care provided (pre-op, surgical technique and post-op) based on the observation of VMMC procedures at each site. Declines in quality far outnumbered improvements. The negative effects in terms of readiness to provide quality services were most evident in expanded sites, whereas the declines in provision of quality services tended to affect both repeat sites and expanded sites equally. Areas of notable concern included the monitoring of adverse events, external supervision, post-operative counselling, and some infection control issues. Scores on quality of surgical technique tended to be among the highest across the 58 items observed, and the South Africa program has clearly institutionalized three “best practices” for surgical efficiency. Conclusions These findings demonstrate the challenges of rapidly developing large numbers of new VMMC sites with the necessary equipment, supplies, and protocols. The scale-up in South Africa has diluted human resources, with negative effects for both the original sites and the expanded program. PMID:24801209

  17. Applied Protein and Molecular Techniques for Characterization of B Cell Neoplasms in Horses

    PubMed Central

    Badial, Peres R.; Tallmadge, Rebecca L.; Miller, Steven; Stokol, Tracy; Richards, Kristy; Borges, Alexandre S.

    2015-01-01

    Mature B cell neoplasms cover a spectrum of diseases involving lymphoid tissues (lymphoma) or blood (leukemia), with an overlap between these two presentations. Previous studies describing equine lymphoid neoplasias have not included analyses of clonality using molecular techniques. The objective of this study was to use molecular techniques to advance the classification of B cell lymphoproliferative diseases in five adult equine patients with a rare condition of monoclonal gammopathy, B cell leukemia, and concurrent lymphadenopathy (lymphoma/leukemia). The B cell neoplasms were phenotypically characterized by gene and cell surface molecule expression, secreted immunoglobulin (Ig) isotype concentrations, Ig heavy-chain variable (IGHV) region domain sequencing, and spectratyping. All five patients had hyperglobulinemia due to IgG1 or IgG4/7 monoclonal gammopathy. Peripheral blood leukocyte immunophenotyping revealed high proportions of IgG1- or IgG4/7-positive cells and relative T cell lymphopenia. Most leukemic cells lacked the surface B cell markers CD19 and CD21. IGHG1 or IGHG4/7 gene expression was consistent with surface protein expression, and secreted isotype and Ig spectratyping revealed one dominant monoclonal peak. The mRNA expression of the B cell-associated developmental genes EBF1, PAX5, and CD19 was high compared to that of the plasma cell-associated marker CD38. Sequence analysis of the IGHV domain of leukemic cells revealed mutated Igs. In conclusion, the protein and molecular techniques used in this study identified neoplastic cells compatible with a developmental transition between B cell and plasma cell stages, and they can be used for the classification of equine B cell lymphoproliferative disease. PMID:26311245

  18. Enhanced nonlinear iterative techniques applied to a non-equilibrium plasma flow

    SciTech Connect

    Knoll, D.A.; McHugh, P.R.

    1996-12-31

    We study the application of enhanced nonlinear iterative methods to the steady-state solution of a system of two-dimensional convection-diffusion-reaction partial differential equations that describe the partially-ionized plasma flow in the boundary layer of a tokamak fusion reactor. This system of equations is characterized by multiple time and spatial scales, and contains highly anisotropic transport coefficients due to a strong imposed magnetic field. We use Newton's method to linearize the nonlinear system of equations resulting from an implicit, finite volume discretization of the governing partial differential equations, on a staggered Cartesian mesh. The resulting linear systems are neither symmetric nor positive definite, and are poorly conditioned. Preconditioned Krylov iterative techniques are employed to solve these linear systems. We investigate both a modified and a matrix-free Newton-Krylov implementation, with the goal of reducing CPU cost associated with the numerical formation of the Jacobian. A combination of a damped iteration, one-way multigrid and a pseudo-transient continuation technique are used to enhance global nonlinear convergence and CPU efficiency. GMRES is employed as the Krylov method with Incomplete Lower-Upper (ILU) factorization preconditioning. The goal is to construct a combination of nonlinear and linear iterative techniques for this complex physical problem that optimizes trade-offs between robustness, CPU time, memory requirements, and code complexity. It is shown that a one-way multigrid implementation provides significant CPU savings for fine grid calculations. Performance comparisons of the modified Newton-Krylov and matrix-free Newton-Krylov algorithms will be presented.
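
    As a rough illustration of the matrix-free Newton-Krylov idea (not the authors' tokamak edge-plasma solver), the sketch below solves a small steady 1-D convection-diffusion-reaction model problem with SciPy's newton_krylov, in which Jacobian-vector products are approximated by finite differences so the Jacobian is never assembled. The grid, coefficients and boundary values are arbitrary assumptions.

```python
# Minimal matrix-free Newton-Krylov sketch (illustrative only; not the
# authors' tokamak edge-plasma code).  Steady 1-D model problem:
#   D u'' - v u' - k u^2 + s = 0,   u(0) = u(1) = 0.
import numpy as np
from scipy.optimize import newton_krylov

n = 101                                  # grid points (assumed)
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
D, v, k, s = 0.05, 1.0, 2.0, 1.0         # assumed coefficients

def residual(u):
    """Residual of the finite-difference discretization."""
    r = np.zeros_like(u)
    r[0], r[-1] = u[0], u[-1]                              # Dirichlet boundaries
    diff = D * (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2     # diffusion
    conv = v * (u[2:] - u[:-2]) / (2.0 * h)                # central convection
    r[1:-1] = diff - conv - k * u[1:-1]**2 + s
    return r

# GMRES is used as the inner Krylov solver; Jacobian-vector products are
# formed by finite-difference directional derivatives ("matrix-free").
u = newton_krylov(residual, np.zeros(n), method='gmres', f_tol=1e-8)
print("max |residual| =", np.abs(residual(u)).max())
```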

  19. Sounding rocket thermal analysis techniques applied to GAS payloads. [Get Away Special payloads (STS)

    NASA Technical Reports Server (NTRS)

    Wing, L. D.

    1979-01-01

    Simplified analytical techniques of sounding rocket programs are suggested as a means of bringing the cost of thermal analysis of the Get Away Special (GAS) payloads within acceptable bounds. Particular attention is given to two methods adapted from sounding rocket technology - a method in which the container and payload are assumed to be divided in half vertically by a thermal plane of symmetry, and a method which considers the container and its payload to be an analogous one-dimensional unit having the real or correct container top surface area for radiative heat transfer and a fictitious mass and geometry which model the average thermal effects.
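
    As a minimal sketch of the second, one-dimensional method described above (a single lumped node whose real top surface radiates while a fictitious mass models the average thermal behaviour), the following toy calculation uses placeholder areas, optical properties and masses rather than actual GAS payload data.

```python
# One-node thermal analog sketch (placeholder values; not GAS payload data):
# a fictitious lumped mass whose real top surface exchanges heat radiatively.
SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
A_TOP = 0.20            # real top surface area, m^2 (assumed)
EPS   = 0.85            # IR emissivity of the top surface (assumed)
ALPHA = 0.30            # solar absorptivity (assumed)
Q_SUN = 1361.0          # solar flux when illuminated, W m^-2
MC    = 5.0e4           # fictitious lumped mass x specific heat, J/K (assumed)

def step(T, illuminated, dt=10.0):
    """Advance the lumped node temperature by one explicit Euler step."""
    q_in = ALPHA * A_TOP * Q_SUN if illuminated else 0.0
    q_out = EPS * SIGMA * A_TOP * T**4
    return T + dt * (q_in - q_out) / MC

T = 290.0                                   # initial temperature, K
for k in range(540):                        # one ~90-minute orbit in 10 s steps
    T = step(T, illuminated=(k < 330))      # crude sunlit/eclipse split (assumed)
print(f"temperature after one orbit: {T:.1f} K")
```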

  20. Optimization techniques applied to passive measures for in-orbit spacecraft survivability

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.; Helba, Michael J.; Hill, Janeil B.

    1992-01-01

    The purpose of this research is to provide Space Station Freedom protective structures design insight through the coupling of design/material requirements, hypervelocity impact phenomenology, meteoroid and space debris environment sensitivities, optimization techniques and operations research strategies, and mission scenarios. The goals of the research are: (1) to develop a Monte Carlo simulation tool which will provide top level insight for Space Station protective structures designers; (2) to develop advanced shielding concepts relevant to Space Station Freedom using unique multiple bumper approaches; and (3) to investigate projectile shape effects on protective structures design.

  1. Full-field speckle correlation technique as applied to blood flow monitoring

    NASA Astrophysics Data System (ADS)

    Vilensky, M. A.; Agafonov, D. N.; Timoshina, P. A.; Shipovskaya, O. V.; Zimnyakov, D. A.; Tuchin, V. V.; Novikov, P. A.

    2011-03-01

    The results of an experimental study of microcirculation monitoring in the superficial tissue layers of internal organs during gastro-duodenal hemorrhage, using the laser speckle contrast analysis technique, are presented. Microcirculation was monitored in real time in the course of laparotomy of the rat abdominal cavity. Microscopic hemodynamics was analyzed for the small intestine and stomach under different conditions (normal state, provoked ischemia, and administration of vasodilative agents such as papaverine and lidocaine). The prospects and problems of internal monitoring of micro-vascular flow in clinical conditions are discussed.
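
    For readers unfamiliar with the technique, the toy sketch below computes the laser speckle contrast K = sigma/mean over a sliding window on a synthetic speckle frame (lower contrast indicating faster flow); the window size and synthetic data are assumptions, not the parameters used in the study.

```python
# Laser speckle contrast sketch: K = sigma / mean over a sliding window.
# Synthetic frame and window size are assumptions for illustration only.
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(img, win=7):
    """Per-pixel speckle contrast over a win x win neighbourhood."""
    img = img.astype(np.float64)
    mean = uniform_filter(img, win)
    mean_sq = uniform_filter(img * img, win)
    var = np.clip(mean_sq - mean**2, 0.0, None)
    return np.sqrt(var) / (mean + 1e-12)

rng = np.random.default_rng(0)
frame = rng.exponential(scale=100.0, size=(256, 256))   # stand-in raw speckle image
K = speckle_contrast(frame, win=7)
print("mean contrast:", round(float(K.mean()), 3))      # ~1 for static, fully developed speckle
```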

  2. New twist on dating: radiocarbon dating techniques applied to air pollution studies

    SciTech Connect

    Porter, G.

    1981-05-01

    This paper deals with the problem of urban air pollution and to what extent it is caused by the burning of fossil fuels at factories or in cars, and to what extent it is due to the breathing processes of trees or the burning of natural fuels like wood. With the use of radiocarbon dating techniques the distinction between the pollutants can be made. The article describes the design of the gas proportional counter used to measure the extremely small samples of carbon in polluted air. (KRM)

  3. Evaluation of Bending Strength in Friction Welded Alumina/mild Steel Joints by Applying Factorial Technique

    NASA Astrophysics Data System (ADS)

    Jesudoss Hynes, N. Rajesh; Nagaraj, P.; Vivek Prabhu, M.

    Joining metals with ceramics has become significant in many applications because the resulting components combine properties such as ductility with high hardness and wear resistance. Using the friction welding technique, alumina can be joined to mild steel with a 1 mm thick AA1100 sheet as an interlayer. In the present work, the effect of friction time on interlayer thickness reduction and bending strength is investigated using a factorial design, and regression modeling is carried out with the aid of ANOVA. The regression model predicts the bending strength of the welded ceramic/metal joints accurately, within ± 2% of the experimental values.
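
    As a generic illustration of fitting a factorial-design regression (not the authors' data or factor levels), the sketch below builds a two-factor model with an interaction term on synthetic measurements using statsmodels; all values are placeholders.

```python
# Generic 2^2 factorial regression sketch with synthetic placeholder data
# (not the authors' friction-welding measurements).
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "friction_time":  [4, 4, 6, 6, 4, 4, 6, 6],           # s (assumed levels)
    "friction_press": [20, 40, 20, 40, 20, 40, 20, 40],   # MPa (assumed levels)
    "bend_strength":  [118, 131, 125, 142, 121, 129, 127, 145],  # MPa (synthetic)
})

# Main effects plus interaction term, as in a full factorial design.
model = smf.ols("bend_strength ~ friction_time * friction_press", data=data).fit()
print(model.summary())

new_point = pd.DataFrame({"friction_time": [5], "friction_press": [30]})
print("predicted strength:", round(model.predict(new_point).iloc[0], 1), "MPa")
```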

  4. A review of post-modern management techniques as currently applied to Turkish forestry.

    PubMed

    Dölarslan, Emre Sahin

    2009-01-01

    This paper reviews the effects of six post-modern management concepts as applied to Turkish forestry. Up to now, Turkish forestry has been constrained, both in terms of its operations and internal organization, by a highly bureaucratic system. The application of new thinking in forestry management, however, has recently resulted in new organizational and production concepts that promise to address problems specific to this Turkish industry and bring about positive changes. This paper will elucidate these specific issues and demonstrate how post-modern management thinking is influencing the administration and operational capacity of Turkish forestry within its current structure. PMID:18194835

  5. Zero order and signal processing spectrophotometric techniques applied for resolving interference of metronidazole with ciprofloxacin in their pharmaceutical dosage form

    NASA Astrophysics Data System (ADS)

    Attia, Khalid A. M.; Nassar, Mohammed W. I.; El-Zeiny, Mohamed B.; Serag, Ahmed

    2016-02-01

    Four rapid, simple, accurate and precise spectrophotometric methods were used for the determination of ciprofloxacin in the presence of metronidazole as an interferent. The methods under study are area under the curve and simultaneous equations, in addition to signal processing techniques for manipulating ratio spectra, namely Savitzky-Golay filters and the continuous wavelet transform. All the methods were validated according to the ICH guidelines, and accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory-prepared mixtures and assessed by applying the standard addition technique. They can therefore be used for the routine analysis of ciprofloxacin in quality-control laboratories.
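
    As a simplified, self-contained sketch of the ratio-spectra idea with Savitzky-Golay processing (synthetic Gaussian bands standing in for the real spectra, and arbitrary filter settings rather than the published ones):

```python
# Ratio-spectrum + Savitzky-Golay sketch (synthetic bands; illustrative only).
import numpy as np
from scipy.signal import savgol_filter

wl = np.linspace(240.0, 360.0, 481)                    # wavelength grid, nm (assumed)
band = lambda centre, width: np.exp(-0.5 * ((wl - centre) / width) ** 2)

cipro_unit = band(277.0, 12.0)                         # assumed analyte band shape
metro_unit = band(320.0, 15.0)                         # assumed interferent band shape
rng = np.random.default_rng(1)
mixture = 0.9 * cipro_unit + 0.6 * metro_unit + 0.002 * rng.normal(size=wl.size)

# Ratio spectrum: divide by the interferent's normalized spectrum, restricted
# to the region where that divisor is appreciable (usual practice).
mask = metro_unit > 0.1
ratio = mixture[mask] / metro_unit[mask]

# A Savitzky-Golay smoothed first derivative removes the constant term left
# by the interferent, leaving a signal proportional to the analyte.
d_ratio = savgol_filter(ratio, window_length=21, polyorder=3, deriv=1)
print("derivative amplitude (a.u.):", round(float(d_ratio.max() - d_ratio.min()), 4))
```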

  6. In-place recalibration technique applied to a capacitance-type system for measuring rotor blade tip clearance

    NASA Technical Reports Server (NTRS)

    Barranger, J. P.

    1978-01-01

    The rotor blade tip clearance measurement system consists of a capacitance sensing probe with self-contained tuning elements, a connecting coaxial cable, and remotely located electronics. Tests show that the accuracy of the system suffers from a strong dependence on probe tip temperature and humidity. A novel in-place recalibration technique is presented which partly overcomes this problem through a simple modification of the electronics that permits a scale factor correction. This technique, when applied to a commercial system, significantly reduced errors under varying conditions of humidity and temperature. Equations were also found that characterize the important cable and probe design quantities.

  7. A New Astrometric Technique Applied to the Likely Tidal Disruption Event, Swift J1644+57

    NASA Astrophysics Data System (ADS)

    Alianora Hounsell, Rebekah; Fruchter, Andrew S.; Levan, Andrew J.

    2015-01-01

    We have developed a new technique to align Hubble Space Telescope (HST) data using background galaxies as astrometric markers. This technique involves the cross-correlation of cutouts of regions about individual galaxies from different epochs, enabling the determination of an astrometric solution. The method avoids errors introduced by proper motion when the locations of stars are used to transform the images. We have used this approach to investigate the nature of the unusual gamma-ray source Sw J1644+57, which was initially classified as a long gamma-ray burst (LGRB). However, due to the object's atypical behavior in the X-ray and optical, along with its location within the host (150 ± 150 pc, see Levan et al. 2011), it has been suggested that the transient may be caused by a tidal disruption event (TDE). Additional theories have also been suggested for its origin that remain based on the collapsar model for a long burst, such as the collapse of a red giant, rather than a stripped star as is typical in LGRBs, or the creation of a magnetar. Precise astrometry of the transient with respect to the galaxy can potentially distinguish between these scenarios. Here we show that our method of alignment dramatically reduces the astrometric error of the position of the transient with respect to the nucleus of the host. We therefore discuss the implications of our result for the astrophysical nature of the object.
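
    As a toy sketch of the core alignment step (estimating the shift between two epochs by cross-correlating a cutout around one background galaxy), the code below uses a synthetic Gaussian "galaxy" and integer-pixel FFT cross-correlation; the real HST procedure combines many galaxies, handles sub-pixel shifts and distortion, and solves for a full astrometric solution.

```python
# Toy epoch-to-epoch alignment sketch via FFT cross-correlation of a galaxy
# cutout (synthetic data; not the authors' HST pipeline).
import numpy as np

def estimate_shift(cutout_a, cutout_b):
    """Integer-pixel shift of cutout_b relative to cutout_a."""
    a = cutout_a - cutout_a.mean()
    b = cutout_b - cutout_b.mean()
    cc = np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))).real
    dy, dx = np.unravel_index(np.argmax(cc), cc.shape)
    ny, nx = cc.shape
    dy = dy - ny if dy > ny // 2 else dy      # wrap to signed offsets
    dx = dx - nx if dx > nx // 2 else dx
    return dy, dx

yy, xx = np.mgrid[0:64, 0:64]
galaxy = lambda y0, x0: np.exp(-((yy - y0) ** 2 + (xx - x0) ** 2) / 18.0)
epoch1 = galaxy(32, 32) + 0.01 * np.random.default_rng(2).normal(size=(64, 64))
epoch2 = galaxy(35, 30) + 0.01 * np.random.default_rng(3).normal(size=(64, 64))

print("estimated (dy, dx):", estimate_shift(epoch1, epoch2))   # expect (3, -2)
```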

  8. Random sets technique for information fusion applied to estimation of brain functional images

    NASA Astrophysics Data System (ADS)

    Smith, Therese M.; Kelly, Patrick A.

    1999-05-01

    A new mathematical technique for information fusion based on random sets, developed and described by Goodman, Mahler and Nguyen (The Mathematics of Data Fusion, Kluwer, 1997), can be useful for estimation of functional brain images. Many image estimation algorithms employ prior models that incorporate general knowledge about sizes, shapes and locations of brain regions. Recently, algorithms have been proposed using specific prior knowledge obtained from other imaging modalities (for example, Bowsher, et al., IEEE Trans. Medical Imaging, 1996). However, there is more relevant information than is presently used. A technique that permits use of additional prior information about activity levels would improve the quality of prior models, and hence, of the resulting image estimate. The use of random sets provides this capability because it allows seemingly non-statistical (or ambiguous) information such as that contained in inference rules to be represented and combined with observations in a single statistical model, corresponding to a global joint density. This paper illustrates the use of this approach by constructing an example global joint density function for brain functional activity from measurements of functional activity, anatomical information, clinical observations and inference rules. The estimation procedure is tested on a data phantom with Poisson noise.

  9. New seismic reflection techniques applied to gas recognition in the Rharb Basin, Morocco

    SciTech Connect

    Jabour, H.; Dakki, M.

    1994-07-01

    The Rharb basin in Morocco is a Tertiary foreland filled by clastic series during the Miocene and Pliocene. This terrigenous influx, derived from the prerif to the northeast and the Meseta to the south, is characterized by a sandy episode during much of the Messinian and the Tortonian. The sand deposits were probably related to the uplift and major erosion of a part of the prerif during the sliding of an olistostrome (prerif nappe). Although most of the wells drilled in the basin have encountered biogenic gas accumulations, the problem still facing exploration in the area is seismic resolution and thin-bed tuning analysis. Recent studies using high seismic resolution techniques have permitted the authors to gain a deep insight into the stratigraphy and depositional environment of the thin sand reservoirs and their fluid content. AVO stratigraphy, inversion of seismic traces into acoustic impedance traces and seismic attributes calculation, and computing provide a remarkable example of the possibilities of depicting the lateral and vertical evolution of reservoir facies and localizing biogenic gas accumulations. Out of five recent exploratory wells drilled based on this new technique, three encountered gas-bearing sands with economic potential. Fifty-three amplitude anomalies have been identified and await processing.

  10. Applying Reflective Middleware Techniques to Optimize a QoS-enabled CORBA Component Model Implementation

    NASA Technical Reports Server (NTRS)

    Wang, Nanbor; Parameswaran, Kirthika; Kircher, Michael; Schmidt, Douglas

    2003-01-01

    Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality-of-service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.

  11. Applying Reflective Middleware Techniques to Optimize a QoS-enabled CORBA Component Model Implementation

    NASA Technical Reports Server (NTRS)

    Wang, Nanbor; Kircher, Michael; Schmidt, Douglas C.

    2000-01-01

    Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality-of-service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively: (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.

  12. Hyphenated GC-FTIR and GC-MS techniques applied in the analysis of bioactive compounds

    NASA Astrophysics Data System (ADS)

    Gosav, Steluta; Paduraru, Nicoleta; Praisler, Mirela

    2014-08-01

    The drugs of abuse, which affect human nature and cause numerous crimes, have become a serious problem throughout the world. There are hundreds of amphetamine analogues on the black market. They consist of various alterations of the basic amphetamine molecular structure, which are not yet included in the lists of forbidden compounds although they retain or only slightly modify the hallucinogenic effects of their parent compound. This great variety makes their identification quite a challenge. A number of analytical procedures for the identification of amphetamines and their analogues have recently been reported. We present the profiles of the main hallucinogenic amphetamines obtained with the hyphenated techniques recommended for the identification of illicit amphetamines, i.e. gas chromatography combined with mass spectrometry (GC-MS) and gas chromatography coupled with Fourier transform infrared spectrometry (GC-FTIR). The infrared spectra of the analyzed hallucinogenic amphetamines present some absorption bands (1490 cm^-1, 1440 cm^-1, 1245 cm^-1, 1050 cm^-1 and 940 cm^-1) that are very stable in position and shape, while their intensity depends on the side-chain substitution. The specific ionic fragment of the studied hallucinogenic compounds is the 3,4-methylenedioxybenzyl cation (m/e = 135), which has a small relative abundance (less than 20%). The complementarity of the above-mentioned techniques for the identification of hallucinogenic compounds is discussed.

  13. Ultrasonic Nondestructive Evaluation Techniques Applied to the Quantitative Characterization of Textile Composite Materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1998-01-01

    An overall goal of this research has been to enhance our understanding of the scientific principles necessary to develop advanced ultrasonic nondestructive techniques for the quantitative characterization of advanced composite structures. To this end, we have investigated a thin woven composite (5-harness biaxial weave). We have studied the effects that variations of the physical parameters of the experimental setup can have on the ultrasonic determination of the material properties for this thin composite. In particular, we have considered the variation of the nominal center frequency and the f-number of the transmitting transducer, which in turn address issues such as focusing and beam spread of ultrasonic fields. This study has employed a planar, two-dimensional, receiving pseudo-array that has permitted investigation of the diffraction patterns of ultrasonic fields. Distortion of the ultrasonic field due to the spatial anisotropy of the thin composite has prompted investigation of the phenomenon of phase cancellation at the face of a finite-aperture, piezoelectric receiver. We have performed phase-sensitive and phase-insensitive analyses to provide a measure of the amount of phase cancellation at the face of a finite-aperture, piezoelectric receiver. The pursuit of robust measurements of received energy (i.e., those not susceptible to phase cancellation at the face of a finite-aperture, piezoelectric receiver) supports the development of robust techniques to determine material properties from measured ultrasonic parameters.
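
    The distinction between phase-sensitive and phase-insensitive reception can be illustrated with a toy numerical model (assumed array size and a random-phase aberration standing in for the composite-induced distortion; not the authors' experimental configuration):

```python
# Toy phase-sensitive vs phase-insensitive receiver sketch (assumed array
# and aberration model; illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_elem = 64                              # pseudo-array elements (assumed)
amps = np.ones(n_elem)                   # equal signal magnitude on each element

for rms_phase in (0.0, 0.5, 1.5):        # aberration strength, radians (assumed)
    phases = rng.normal(scale=rms_phase, size=n_elem)
    signals = amps * np.exp(1j * phases)
    ps = np.abs(signals.sum()) ** 2              # phase-sensitive "intensity"
    pi = (np.abs(signals) ** 2).sum()            # phase-insensitive energy
    # Normalized so the ratio is 1 when all elements are in phase.
    print(f"rms phase {rms_phase:.1f} rad:  PS / (N * PI) = {ps / (n_elem * pi):.3f}")
```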

  14. Comparison of a Traditional Teaching Model to the Scale-Up Teaching Model in Undergraduate Biology: A Mixed Method Study

    NASA Astrophysics Data System (ADS)

    Mears, Samantha

    This project compared a SCALE-UP teaching model to a traditional teaching model. Traditional teaching is now considered a poor motivator for student performance and interests, and the SCALE-UP model was proposed to combat these problems. SCALE-UP classrooms are designed to encourage cooperative learning as well as other active learning methods. The study looked at teacher and student opinions of the two models to determine which one they preferred and why. The study also compared the students' grades between the two classes to see if there was a difference between test scores, as well as learning gains from pre-test to post-test. Student and teacher behaviors were also quantified based on categories of engagement in class. The purpose of this study was to add to the literature presenting the SCALE-UP model as a viable and better alternative to traditional lecture. Based on the results, students prefer and enjoy a SCALE-UP classroom more than a traditional lecture. The students also performed better and learned more than those in the traditional lecture class.

  15. The desirability and feasibility of scaling up community health insurance in low-income settings--lessons from Armenia.

    PubMed

    Poletti, Tim; Balabanova, Dina; Ghazaryan, Olga; Kocharyan, Hasmik; Hakobyan, Margarita; Arakelyan, Karen; Normand, Charles

    2007-02-01

    There is growing evidence that community financing mechanisms can raise additional revenue, increase equitable access to primary health care (PHC), and improve social protection. More recently there has been interest in scaling up community financing as a step towards universal coverage either via tax-based systems or social health insurance. Using key informant interviews and focus group discussions, this study sought to assess the desirability and feasibility of scaling-up community health insurance in Armenia. The results suggest that there is broad-based political support for scaling up the schemes and that community financing is synergistic with major health sector reforms. High levels of social capital within the rural communities should facilitate scaling up. Existing schemes have increased access and quality of care, but expansion of coverage is constrained by affordability, poor infrastructure, and weak linkages with the broader health system. Long-term subsidies and system-building will be essential if the expanded schemes are to be financially viable and pro-poor. Overall, successfully scaling up community financing in Armenia would depend on addressing a range of obstacles related to legislation, institutional capacity, human resources and resistance to change among certain stakeholders. PMID:17097789

  16. Excellence in Physics Education Award: SCALE-UP, Student Centered Active Learning Environment with Upside-down Pedagogies

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    2016-03-01

    The Student-Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) Project combines curricula and a specially-designed instructional space to enhance learning. SCALE-UP students practice communication and teamwork skills while performing activities that enhance their conceptual understanding and problem solving skills. This can be done with small or large classes and has been implemented at more than 250 institutions. Educational research indicates that students should collaborate on interesting tasks and be deeply involved with the material they are studying. SCALE-UP classtime is spent primarily on ``tangibles'' and ``ponderables''--hands-on measurements/observations and interesting questions. There are also computer simulations (called ``visibles'') and hypothesis-driven labs. Students sit at tables designed to facilitate group interactions. Instructors circulate and engage in Socratic dialogues. The setting looks like a banquet hall, with lively interactions nearly all the time. Impressive learning gains have been measured at institutions across the US and internationally. This talk describes today's students, how lecturing got started, what happens in a SCALE-UP classroom, and how the approach has spread. The SCALE-UP project has greatly benefitted from numerous Grants made by NSF and FIPSE to NCSU and other institutions.

  17. Act local, think global: how the Malawi experience of scaling up antiretroviral treatment has informed global policy.

    PubMed

    Harries, Anthony D; Ford, Nathan; Jahn, Andreas; Schouten, Erik J; Libamba, Edwin; Chimbwandira, Frank; Maher, Dermot

    2016-01-01

    The scale-up of antiretroviral therapy (ART) in Malawi was based on a public health approach adapted to its resource-poor setting, with principles and practices borrowed from the successful tuberculosis control framework. From 2004 to 2015, the number of new patients started on ART increased from about 3000 to over 820,000. Despite being a small country, Malawi has made a significant contribution to the 15 million people globally on ART and has also contributed policy and service delivery innovations that have supported international guidelines and scale up in other countries. The first set of global guidelines for scaling up ART released by the World Health Organization (WHO) in 2002 focused on providing clinical guidance. In Malawi, the ART guidelines adopted from the outset a more operational and programmatic approach with recommendations on health systems and services that were needed to deliver HIV treatment to affected populations. Seven years after the start of national scale-up, Malawi launched a new strategy offering all HIV-infected pregnant women lifelong ART regardless of the CD4-cell count, named Option B+. This strategy was subsequently incorporated into a WHO programmatic guide in 2012 and WHO ART guidelines in 2013, and has since then been adopted by the majority of countries worldwide. In conclusion, the Malawi experience of ART scale-up has become a blueprint for a public health response to HIV and has informed international efforts to end the AIDS epidemic by 2030. PMID:27600800

  18. Evaluating Ecotypes as a means of Scaling-up Permafrost Thermal Measurements in Western Alaska.

    NASA Astrophysics Data System (ADS)

    Cable, William; Romanovsky, Vladimir

    2015-04-01

    In many regions, permafrost temperatures are increasing due to climate change and in some cases permafrost is thawing and degrading. In areas where degradation has already occurred the effects can be dramatic, resulting in changing ecosystems, carbon release, and damage to infrastructure. Yet in many areas we lack baseline data, such as subsurface temperatures, needed to assess future changes and potential risk areas. Besides climate, the physical properties of the vegetation cover and subsurface material have a major influence on the thermal state of permafrost. These properties are often directly related to the type of ecosystem overlaying permafrost. Thus, classifying the landscape into general ecotypes might be an effective way to scale up permafrost thermal data. To evaluate using ecotypes as a way of scaling-up permafrost thermal data within a region we selected an area in Western Alaska, the Selawik National Wildlife Refuge, which is on the boundary between continuous and discontinuous permafrost. This region was selected because previously an ecological land classification had been conducted and a very high-resolution ecotype map was generated. Using this information we selected 18 spatially distributed sites covering the most abundant ecotypes, where we are collecting low vertical resolution soil temperature data to a depth of 1.5 meters at most sites. At three additional core sites, we are collecting air temperature, snow depth, and high vertical resolution soil temperature to a depth of 3 meters. The sites were installed in the summers of 2011 and 2012; consequently, we have at least two years of data from all sites. Mean monthly and mean annual air temperature and snow depth for all three core sites are similar within the 2012-2014 period. Additionally, the average air temperature and snow depth from our three cores sites compares well with that of a nearby meteorological station for which long-term data is available. During the study period snow depth

  19. Radio-isotope production scale-up at the University of Wisconsin

    SciTech Connect

    Nickles, Robert Jerome

    2014-06-19

    Our intent has been to scale up our production capacity for a subset of the NSAC-I list of radioisotopes in jeopardy, so as to make a significant impact on the projected national needs for Cu-64, Zr-89, Y-86, Ga-66, Br-76, I-124 and other radioisotopes that offer promise as PET synthons. The work-flow and milestones in this project have been compressed into a single year (Aug 1, 2012 - July 31, 2013). The grant budget was virtually dominated by the purchase of a pair of dual-mini-cells that have made the scale-up possible, now permitting the Curie-level processing of Cu-64 and Zr-89 with greatly reduced radiation exposure. Milestones: 1. We doubled our production of Cu-64 and Zr-89 during the grant period, both for local use and out-bound distribution to ≈ 30 labs nationwide. This involved the dove-tailing of beam schedules of both our PETtrace and legacy RDS cyclotron. 2. Implemented improved chemical separation of Zr-89, Ga-66, Y-86 and Sc-44, with remote, semi-automated dissolution, trap-and-release separation under LabView control in the two dual-mini-cells provided by this DOE grant. A key advance was to fit the chemical stream with miniature radiation detectors to confirm the transfer operations. 3. Implemented improved shipping of radioisotopes (Cu-64, Zr-89, Tc-95m, and Ho-163) with approved DOT 7A boxes, with a much-improved FedEx shipping success compared to our previous steel drums. 4. Implemented broad-range quantitative trace metal analysis, employing a new microwave plasma atomic emission spectrometer (Agilent 4200) capable of ppb sensitivity across the periodic table. This new instrument will prove essential in bringing our radiometals into FDA compliance, which requires CoAs for translational research in clinical trials. 5. Expanded our capabilities in target fabrication, with the purchase of a programmable 1600 °C inert gas tube furnace for the smelting of binary alloy target materials. A similar effort makes use of our RF induction furnace, allowing

  20. Mass Movement Hazards in the Mediterranean; A review on applied techniques and methodologies

    NASA Astrophysics Data System (ADS)

    Ziade, R.; Abdallah, C.; Baghdadi, N.

    2012-04-01

    Growing populations and the expansion of settlements and life-lines over hazardous areas in the Mediterranean region have greatly increased the impact of Mass Movements (MM) in both industrialized and developing countries. This trend is expected to continue in the next decades due to increased urbanization and development, continued deforestation and increased regional precipitation in MM-prone areas due to changing climatic patterns. Consequently, and over the past few years, monitoring of MM has acquired great importance from the scientific community as well as the civilian one. This article begins with a discussion of the MM classification, and the different topographic, geologic, hydrologic and environmental impacting factors. The intrinsic (preconditioning) variables determine the susceptibility of MM and extrinsic factors (triggering) can induce the probability of MM occurrence. The evolution of slope instability studies is charted from geodetic or observational techniques, to geotechnical field-based origins to recent higher levels of data acquisition through Remote Sensing (RS) and Geographic Information System (GIS) techniques. Since MM detection and zoning is difficult in remote areas, RS and GIS have enabled regional studies to predominate over site-based ones, since they provide multi-temporal images and hence greatly facilitate MM monitoring. The unusually broad spectrum of MM makes it difficult to define a single methodology to establish MM hazard. Since the probability of occurrence of MM is one of the key components in making rational decisions for management of MM risk, scientists and engineers have developed physical parameters, equations and environmental process models that can be used as assessment tools for management, education, planning and legislative purposes. Assessment of MM is attained through various modeling approaches mainly divided into three main sections: quantitative/Heuristic (1:2.000-1:10.000), semi-quantitative/Statistical (1

  1. A comparison of new, old and future densiometric techniques as applied to volcanologic study.

    NASA Astrophysics Data System (ADS)

    Pankhurst, Matthew; Moreland, William; Dobson, Kate; Þórðarson, Þorvaldur; Fitton, Godfrey; Lee, Peter

    2015-04-01

    The density of any material imposes a primary control upon its potential or actual physical behaviour in relation to its surrounds. It follows that a thorough understanding of the physical behaviour of dynamic, multi-component systems, such as active volcanoes, requires knowledge of the density of each component. If we are to accurately predict the physical behaviour of synthesized or natural volcanic systems, quantitative densiometric measurements are vital. The theoretical density of melt, crystals and bubble phases may be calculated using composition, structure, temperature and pressure inputs. However, measuring the density of natural, non-ideal, poly-phase materials remains problematic, especially if phase-specific measurement is important. Here we compare three methods: Archimedes' principle, He-displacement pycnometry and X-ray micro computed tomography (XMT), and discuss the utility and drawbacks of each in the context of modern volcanologic study. We have measured tephra, ash and lava from the 934 AD Eldgjá eruption (Iceland), and the 2010 AD Eyjafjallajökull eruption (Iceland), using each technique. These samples exhibit a range of particle sizes, phases and textures. We find that while the Archimedes method remains a useful, low-cost technique to generate whole-rock density data, relative precision is problematic at small particle sizes. Pycnometry offers a more precise whole-rock density value, at a comparable cost-per-sample. However, this technique is based upon the assumption that pore spaces within the sample are equally available for gas exchange, which may or may not be the case. XMT produces 3D images, at resolutions from nm to tens of µm per voxel where X-ray attenuation is a qualitative measure of relative electron density, expressed as greyscale number/brightness (usually 16-bit). Phases and individual particles can be digitally segmented according to their greyscale and other characteristics. This represents a distinct advantage over both
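
    The Archimedes whole-rock calculation referred to above amounts to a one-line formula; a minimal helper (with placeholder masses, not the Eldgjá or Eyjafjallajökull measurements, and ignoring the wrapping or waxing needed for open-porosity samples) is sketched below.

```python
# Archimedes-principle bulk density sketch (placeholder values only).
def archimedes_density(m_dry_g, m_submerged_g, rho_fluid_g_cm3=0.998):
    """Bulk density from dry mass and apparent mass when submerged in a fluid."""
    volume_cm3 = (m_dry_g - m_submerged_g) / rho_fluid_g_cm3   # displaced fluid volume
    return m_dry_g / volume_cm3

print(f"{archimedes_density(12.41, 7.86):.2f} g/cm^3")          # assumed example masses
```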

  2. STANFORD IN-SITU HIGH RATE YBCO PROCESS: TRANSFER TO METAL TAPES AND PROCESS SCALE UP

    SciTech Connect

    Malcolm R. Beasley; Robert H. Hammond

    2009-04-14

    Executive Summary: A materials science understanding of high-rate, low-cost processes for coated conductors will benefit applications in power utilities, for low-loss energy transmission and power generation, as well as DOD applications. The research in this program investigated several materials processing approaches that are new and original and are not being investigated elsewhere. This work added to the understanding of the materials science of high-rate PVD growth of HTSC YBCO assisted by a liquid phase. A newly discovered process uses amorphous glassy precursors, which can be made at high rate under flexible conditions of temperature and oxygen and later brought to conditions of oxygen partial pressure and temperature for rapid conversion to the YBCO superconductor. Good critical current densities were found, but further effort is needed to optimize the vortex pinning using known artificial inclusions. A new finding on the physics and materials science of vortex pinning in the HTSC system, with Sm in place of Y, came from growth at unusually low oxygen pressure, which results in clusters of a low- or non-superconducting phase within the nominal high-temperature phase. The driving force for this during growth is new physics, perhaps due to the low oxygen. This has the potential for high current in large magnetic fields at low cost, applicable to motors, generators and transformers. The technical demands of this project motivated the development of instrumentation that could be essential to eventual process scale-up. This includes atomic absorption based on tunable diode lasers for remote monitoring and control of evaporation sources (developed under DARPA support), and Fourier Transform Infrared Reflectivity (FTIR) as an aid in the synthesis of complex thin film materials (purchased by a DURIP-AFOSR grant).

  3. Towards scale-up and regulatory shelf-stability testing of curcumin encapsulated polyester nanoparticles.

    PubMed

    Grama, Charitra N; Venkatpurwar, Vinod P; Lamprou, Dimitrios A; Ravi Kumar, M N V

    2013-06-01

    This study reports scale-up and shelf-stability of curcumin encapsulated poly(lactic acid-co-glycolic acid) (PLGA) nanoparticles. The curcumin encapsulated PLGA nanoparticles were prepared by emulsification solvent evaporation/diffusion, and large quantities were made by varying the homogenisation time (5, 15 and 30 min). The particle size decreased as the homogenisation duration increased from 5 to 30 min, and the particles were spherical as confirmed by atomic force microscopy. For the large-scale preparations, the mean particle size was found to be 288.7 ± 3.4 (polydispersity index 0.15 ± 0.01), with a curcumin entrapment of 52.5 ± 4.3 %, which were comparable to the lab-scale preparations. The curcumin encapsulated nanoparticles were freeze-dried using sucrose (5 %, w/v) as a cryoprotectant. The freeze-dried nanoparticles were subjected to a 6-month stability study as per the International Conference on Harmonisation guideline at room temperature and refrigerated storage conditions. Intermediate sampling was done (monthly), and the nanoparticles were thoroughly characterised for particle size, entrapment efficiency, surface morphology and crystallinity, which were compared to fresh preparations. The curcumin encapsulated PLGA nanoparticles were found to be stable at refrigerated as well as room temperature storage test conditions, as indicated by their particle characteristics. X-ray diffraction results confirm the amorphous nature of curcumin upon nano-encapsulation, which stays intact after freeze drying and 6-month stability testing. Together these data offer the possibility of producing large quantities of polymer nanoparticles that are suitable for room as well as refrigerated storage conditions, opening up possibilities to conduct repeated dosings in a chronic setting or regulatory toxicology studies of such nanomedicines. PMID:25788136

  4. Scaling up nutrition in fragile and conflict-affected states: the pivotal role of governance.

    PubMed

    Taylor, Sebastian A J; Perez-Ferrer, Carolina; Griffiths, Andrew; Brunner, Eric

    2015-02-01

    Acute and chronic undernutrition undermine conditions for health, stability and socioeconomic development across the developing world. Although fragile and conflict-affected states have some of the highest rates of undernutrition globally, their response to the multilateral 'Scaling Up Nutrition' (SUN) initiative in its first two-year period was ambivalent. The purpose of this research was to investigate factors affecting fragile and conflict-affected states' engagement with SUN, and to examine what differentiated those fragile states that joined SUN in its first phase from those that did not. Drawing on global databases (Unicef, World Bank, UNDP), and qualitative country case studies (Afghanistan, the Democratic Republic of Congo, Sierra Leone, Pakistan and Yemen) we used bivariate logistic regressions and principal component analysis to assess social, economic and political factors across 41 fragile states looking for systematic differences between those that had signed up to SUN before March 2013 (n = 16), and those that had not (n = 25). While prevalence of malnutrition, health system functioning and level of citizen empowerment had little or no impact on a fragile state's likelihood of joining SUN, the quality of governance (QOG) strongly predicted accession. SUN-signatory fragile states scored systematically better on the World Bank's Country Policy and Institutional Assessment (CPIA) and the Worldwide Governance Indicators 'effectiveness of government' indices. We conclude that strengthening governance in fragile states may enhance their engagement with initiatives such as SUN, but also (recognising the potential for endogeneity), that the way aid is structured and delivered in fragile states may be an underlying determinant of whether and how governance in such contexts improves. The research demonstrates that more nuanced analysis of conditions within and among countries classed as 'fragile and conflict-affected' is both possible and necessary if aid

  5. Scaling up the evaluation of psychotherapy: evaluating motivational interviewing fidelity via statistical text classification

    PubMed Central

    2014-01-01

    Background Behavioral interventions such as psychotherapy are leading, evidence-based practices for a variety of problems (e.g., substance abuse), but the evaluation of provider fidelity to behavioral interventions is limited by the need for human judgment. The current study evaluated the accuracy of statistical text classification in replicating human-based judgments of provider fidelity in one specific psychotherapy—motivational interviewing (MI). Method Participants (n = 148) came from five previously conducted randomized trials and were either primary care patients at a safety-net hospital or university students. To be eligible for the original studies, participants met criteria for either problematic drug or alcohol use. All participants received a type of brief motivational interview, an evidence-based intervention for alcohol and substance use disorders. The Motivational Interviewing Skills Code is a standard measure of MI provider fidelity based on human ratings that was used to evaluate all therapy sessions. A text classification approach called a labeled topic model was used to learn associations between human-based fidelity ratings and MI session transcripts. It was then used to generate codes for new sessions. The primary comparison was the accuracy of model-based codes with human-based codes. Results Receiver operating characteristic (ROC) analyses of model-based codes showed reasonably strong sensitivity and specificity with those from human raters (range of area under ROC curve (AUC) scores: 0.62 – 0.81; average AUC: 0.72). Agreement with human raters was evaluated based on talk turns as well as code tallies for an entire session. Generated codes had higher reliability with human codes for session tallies and also varied strongly by individual code. Conclusion To scale up the evaluation of behavioral interventions, technological solutions will be required. The current study demonstrated preliminary, encouraging findings regarding the utility
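
    The study used a labeled topic model; as a much simpler, generic stand-in that shows the shape of such a pipeline (train a text classifier on coded utterances, then score held-out data and report ROC AUC), the sketch below uses TF-IDF features with logistic regression on an invented toy corpus.

```python
# Generic text-classification sketch (TF-IDF + logistic regression); a
# simplified stand-in for the labeled topic model, with invented toy data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

utterances = [
    "what do you make of your drinking lately",
    "tell me more about what worries you",
    "you must stop drinking immediately",
    "it sounds like part of you wants to change",
    "you have to follow the rules or else",
    "how would you like things to be different",
    "stop making excuses and just quit",
    "what are the good things about cutting back",
] * 10                                   # repeated so the toy split is non-trivial
labels = [1, 1, 0, 1, 0, 1, 0, 1] * 10   # 1 = "MI-consistent" (invented labels)

X_tr, X_te, y_tr, y_te = train_test_split(
    utterances, labels, test_size=0.25, random_state=0, stratify=labels)

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(X_tr, y_tr)
probs = clf.predict_proba(X_te)[:, 1]
print("ROC AUC on held-out toy data:", round(roc_auc_score(y_te, probs), 3))
```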

  6. Protecting HIV information in countries scaling up HIV services: a baseline study

    PubMed Central

    2011-01-01

    Background Individual-level data are needed to optimize clinical care and monitor and evaluate HIV services. Confidentiality and security of such data must be safeguarded to avoid stigmatization and discrimination of people living with HIV. We set out to assess the extent to which countries scaling up HIV services have developed and implemented guidelines to protect the confidentiality and security of HIV information. Methods Questionnaires were sent to UNAIDS field staff in 98 middle- and lower-income countries, some reportedly with guidelines (G-countries) and others intending to develop them (NG-countries). Responses were scored, aggregated and weighted to produce standard scores for six categories: information governance, country policies, data collection, data storage, data transfer and data access. Responses were analyzed using regression analyses for associations with national HIV prevalence, gross national income per capita, OECD income, receiving US PEPFAR funding, and being a G- or NG-country. Differences between G- and NG-countries were investigated using non-parametric methods. Results Higher information governance scores were observed for G-countries compared with NG-countries; no differences were observed for the country policies or data collection categories. However, for data storage, data transfer and data access, G-countries had lower scores compared with NG-countries. No significant associations were observed between country score and HIV prevalence, per capita gross national income, OECD economic category, and whether countries had received PEPFAR funding. Conclusions Few countries, including G-countries, had developed comprehensive guidelines on protecting the confidentiality and security of HIV information. Countries must develop their own guidelines, using established frameworks to guide their efforts, and may require assistance in adapting, adopting and implementing them. PMID:21294916

  7. Scaling up of facility-based neonatal care: a district health system experience.

    PubMed

    Shantharam Baliga, B; Raghuveera, K; Vivekananda Prabhu, B; Shenoy, Rathika; Rajeev, A

    2007-04-01

    With proportion of neonatal mortality increasing within under-five deaths, innovative approaches and stronger health systems are needed in neonatal care. We present data of a scaled-up neonatal facility in a District Government Headquarters hospital in Southern India. The special care neonatal unit (SCNU) was a community propelled, public private partnership worked out on the principles of private funding of public institutions and effective budgeting of the public health care system. In the first phase the unit was optimized over 3 years with non-governmental organizations (NGO) and government support from a basic nursery to a SCNU. The unit was operational through fixed maintenance budget from government and mobilized funds from NGOs and beneficiaries. Community health workers were motivated for effective utilization. In the second phase the unit's performance was studied and statistically analyzed in two time frames before and 5 years into the upgradation process. Neonatal admissions from the district increased by 14.65%. Hospital stillbirth, early neonatal and perinatal mortality rates showed significant decline (p < 0.05). There was a 48.59% (CI: 25.46-77.80) increase in antenatal referrals from community health centers. Caesarian sections for neonatal parameters that affect obstetric decisions showed percent changes of 163.25 (CI: 31.18-430.45) and 73.4 (CI: 14.15-164.39) for prematurity and low birth weight (LBW), respectively. Significant decline in case fatality rates for LBW, sepsis and birth asphyxia (p < 0.001) were observed. The district perinatal mortality rate showed a decline. Within the purview of financial constraints of the public health system, private funding, public-private cooperation and effective budgeting may become significant. Motivation of health workers and community to effectively utilize public health care services sets an evolutionary process of referral and vertical linkage of health care system. PMID:17166935

  8. Scale-up of nanoemulsion produced by emulsification and solvent diffusion.

    PubMed

    Mitri, Khalil; Vauthier, Christine; Huang, Nicolas; Menas, Assia; Ringard-Lefebvre, Catherine; Anselmi, Cecilia; Stambouli, Moncef; Rosilio, Veronique; Vachon, Jean-Jacques; Bouchemal, Kawthar

    2012-11-01

    The scale-up of nanoemulsions (NEs) produced by the emulsification and solvent diffusion process was successfully achieved in the present work. Up to 1500 mL of NEs were produced with olive oil, castor oil, almond oil, or Arlamol™ E by using a Y-shaped mixer device. NE droplet sizes were significantly modulated from 290 to 185 nm by changing the process parameters without modification of the formulation composition. Smaller NE droplet sizes were obtained by (1) decreasing the internal diameter of the Y-mixer from 5 to 0.8 mm, (2) increasing the flow rates of the organic and the aqueous phases upon mixing, and (3) increasing the temperature of the experiment from 5°C to 40°C. All the results for NE diameters (d_sc), expressed as a function of the Reynolds number (Re) and the shear rate inside the Y-mixer (γ̇), showed typical power-law relationships: d_sc = 10^2.82 Re^(-0.14) and d_sc = 10^2.60 γ̇^(-0.06), respectively. Such power laws for NE formation by the emulsification and solvent diffusion process have not previously been reported in the literature and constitute a new finding of this work. We definitely proved that the high turbulence created upon NE formation is the most important parameter allowing the droplet size to be decreased. PMID:22886515
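
    Assuming d_sc is expressed in nanometres (consistent with the 290-185 nm range quoted above), the reported empirical relationships can be evaluated directly; the helper below simply encodes the two power laws and is only meaningful within the conditions of the study.

```python
# Evaluate the empirical power laws reported above (d_sc assumed in nm);
# valid only within the operating window studied in the paper.
def droplet_nm_from_reynolds(Re):
    return 10 ** 2.82 * Re ** -0.14

def droplet_nm_from_shear_rate(gamma_dot):
    return 10 ** 2.60 * gamma_dot ** -0.06

for Re in (500, 2000, 8000):             # illustrative Reynolds numbers (assumed)
    print(f"Re = {Re:>5}: d_sc = {droplet_nm_from_reynolds(Re):.0f} nm")
```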

  9. Computational Psychotherapy Research: Scaling up the evaluation of patient-provider interactions

    PubMed Central

    Imel, Zac E.; Steyvers, Mark; Atkins, David C.

    2014-01-01

    In psychotherapy, the patient-provider interaction contains the treatment’s active ingredients. However, the technology for analyzing the content of this interaction has not fundamentally changed in decades, limiting both the scale and specificity of psychotherapy research. New methods are required in order to “scale up” to larger evaluation tasks and “drill down” into the raw linguistic data of patient-therapist interactions. In the current paper we demonstrate the utility of statistical text analysis models called topic models for discovering the underlying linguistic structure in psychotherapy. Topic models identify semantic themes (or topics) in a collection of documents (here, transcripts). We used topic models to summarize and visualize 1,553 psychotherapy and drug therapy (i.e., medication management) transcripts. Results showed that topic models identified clinically relevant content, including affective, content, and intervention related topics. In addition, topic models learned to identify specific types of therapist statements associated with treatment related codes (e.g., different treatment approaches, patient-therapist discussions about the therapeutic relationship). Visualizations of semantic similarity across sessions indicate that topic models identify content that discriminates between broad classes of therapy (e.g., cognitive behavioral therapy vs. psychodynamic therapy). Finally, predictive modeling demonstrated that topic model derived features can classify therapy type with a high degree of accuracy. Computational psychotherapy research has the potential to scale up the study of psychotherapy to thousands of sessions at a time, and we conclude by discussing the implications of computational methods such as topic models for the future of psychotherapy research and practice. PMID:24866972
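
    For readers who want a concrete picture of the technique, the minimal sketch below fits a two-topic LDA model to a handful of invented "transcript" snippets with scikit-learn; it is not the model configuration, vocabulary handling or data used in the study.

```python
# Minimal topic-model sketch (LDA via scikit-learn) on invented snippets;
# not the study's configuration or corpus.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "i feel anxious about work and cannot sleep at night",
    "we increased the medication dose and will monitor side effects",
    "my relationship with my partner has been very stressful",
    "the new prescription seems to reduce the panic symptoms",
    "sleep has improved but work stress is still overwhelming",
    "let us review the dosage schedule and any side effects",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

terms = vec.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {', '.join(top)}")
```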

  10. THE FERMI BUBBLES AS A SCALED-UP VERSION OF SUPERNOVA REMNANTS

    SciTech Connect

    Fujita, Yutaka; Ohira, Yutaka; Yamazaki, Ryo

    2013-09-20

    In this study, we treat Fermi bubbles as a scaled-up version of supernova remnants (SNRs). The bubbles are created through activities of the super-massive black hole (SMBH) or starbursts at the Galactic center (GC). Cosmic rays (CRs) are accelerated at the forward shocks of the bubbles, as in SNRs, which means that we cannot decide from the CR radiation whether the bubbles were created by the SMBH or by starbursts. We follow the evolution of the CR distribution by solving a diffusion-advection equation, considering the reduction of the diffusion coefficient by CR streaming. In this model, gamma rays are created through hadronic interaction between CR protons and the gas in the Galactic halo. In the GeV band, we can well reproduce the observed flat distribution of gamma-ray surface brightness because some amount of gas is left behind the shock. The edge of the bubbles is fairly sharp owing to the high gas density behind the shock and the reduction of the diffusion coefficient there. The latter also contributes to the hard gamma-ray spectrum of the bubbles. We find that the CR acceleration at the shock began when the bubbles were small, and the time scale of the energy injection at the GC was much smaller than the age of the bubbles. We predict that if CRs are accelerated to the TeV regime, the apparent bubble size should be larger in the TeV band, which could be used to discriminate our hadronic model from other leptonic models. We also present neutrino fluxes.
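
    Purely as an illustration of the kind of equation being solved (not the paper's geometry, coefficients or streaming physics), a 1-D explicit finite-difference update for a diffusion-advection equation with a central source can be written as follows; all numbers are placeholders.

```python
# Illustrative 1-D diffusion-advection update for a cosmic-ray density f:
#   df/dt = d/dx(D df/dx) - v df/dx + Q
# Placeholder coefficients; not the paper's Galactic-halo model.
import numpy as np

nx, L = 200, 1.0
dx = L / nx
x = (np.arange(nx) + 0.5) * dx
D, v = 1.0e-3, 0.2                           # assumed diffusion coefficient and advection speed
Q = np.where(x < 0.05, 1.0, 0.0)             # assumed central source region
dt = 0.4 * min(dx * dx / (2.0 * D), dx / v)  # stability-limited explicit time step

f = np.zeros(nx)
for _ in range(2000):
    diff = D * (np.roll(f, -1) - 2.0 * f + np.roll(f, 1)) / dx**2
    adv = -v * (f - np.roll(f, 1)) / dx      # upwind advection (v > 0)
    f = f + dt * (diff + adv + Q)
    f[0], f[-1] = f[1], 0.0                  # reflecting inner, absorbing outer boundary
print("peak CR density (arbitrary units):", round(float(f.max()), 3))
```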

  11. Nucleation of Laboratory Earthquakes: Observation, Characterization, and Scaling up to the Natural Earthquakes Dimensions

    NASA Astrophysics Data System (ADS)

    Latour, S.; Schubnel, A.; Nielsen, S. B.; Madariaga, R. I.; Vinciguerra, S.

    2013-12-01

    In this work we observe the nucleation phase of in-plane ruptures in the laboratory and characterize its dynamics. We use a laboratory toy-model, where mode II shear ruptures are produced on a pre-cut fault in a plate of polycarbonate. The fault is cut at the critical angle that allows a stick-slip behavior under uniaxial loading. The ruptures are thus naturally nucleated. The material is birefringent under stress, so that the rupture propagation can be followed by ultra-rapid elastophotometry. A network of acoustic sensors and accelerometers is disposed on the plate to measure the radiated wavefield and record laboratory near-field accelerograms. The far field stress level is also measured using strain gages. We show that the nucleation is composed of two distinct phases, a quasi-static and an acceleration stage, followed by dynamic propagation. We propose an empirical model which describes the rupture length evolution: the quasi-static phase is described by an exponential growth while the acceleration phase is described by an inverse power law of time. The transition from quasi-static to accelerating rupture is related to the critical nucleation length, which scales inversely with normal stress in accordance with theoretical predictions, and to a critical surfacic power, which may be an intrinsic property of the interface. Finally, we discuss these results in the framework of previous studies and propose a scaling up to natural earthquake dimensions. Figure: three spontaneously nucleated laboratory earthquakes at increasingly higher normal pre-stresses, visualized by photo-elasticity; the red curves highlight the position of the rupture tips as a function of time.

  12. Interagency Collaborative Team Model for Capacity Building to Scale-Up Evidence-Based Practice

    PubMed Central

    Hurlburt, Michael; Aarons, Gregory A; Fettes, Danielle; Willging, Cathleen; Gunderson, Lara; Chaffin, Mark J

    2015-01-01

    Background System-wide scale up of evidence-based practice (EBP) is a complex process. Yet, few strategic approaches exist to support EBP implementation and sustainment across a service system. Building on the Exploration, Preparation, Implementation, and Sustainment (EPIS) implementation framework, we developed and are testing the Interagency Collaborative Team (ICT) process model to implement an evidence-based child neglect intervention (i.e., SafeCare®) within a large children’s service system. The ICT model emphasizes the role of local agency collaborations in creating structural supports for successful implementation. Methods We describe the ICT model and present preliminary qualitative results from use of the implementation model in one large scale EBP implementation. Qualitative interviews were conducted to assess challenges in building system, organization, and home visitor collaboration and capacity to implement the EBP. Data collection and analysis centered on EBP implementation issues, as well as the experiences of home visitors under the ICT model. Results Six notable issues relating to implementation process emerged from participant interviews, including: (a) initial commitment and collaboration among stakeholders, (b) leadership, (c) communication, (d) practice fit with local context, (e) ongoing negotiation and problem solving, and (f) early successes. These issues highlight strengths and areas for development in the ICT model. Conclusions Use of the ICT model led to sustained and widespread use of SafeCare in one large county. Although some aspects of the implementation model may benefit from enhancement, qualitative findings suggest that the ICT process generates strong structural supports for implementation and creates conditions in which tensions between EBP structure and local contextual variations can be resolved in ways that support the expansion and maintenance of an EBP while preserving potential for public health benefit.

  13. Pattern Recognition Techniques Applied to the Study of Leishmanial Glyceraldehyde-3-Phosphate Dehydrogenase Inhibition

    PubMed Central

    Lozano, Norka B. H.; Oliveira, Rafael F.; Weber, Karen C.; Honorio, Kathia M.; Guido, Rafael V. C.; Andricopulo, Adriano D.; de Sousa, Alexsandro G.; da Silva, Albérico B. F.

    2014-01-01

    Chemometric pattern recognition techniques were employed in order to obtain Structure-Activity Relationship (SAR) models relating the structures of a series of adenosine compounds to their affinity for glyceraldehyde 3-phosphate dehydrogenase of Leishmania mexicana (LmGAPDH). A training set of 49 compounds was used to build the models, and the best ones were obtained with one geometrical and four electronic descriptors. The classification models were externally validated by predictions for a test set of 14 compounds not used in the model building process. Good results were obtained, as verified by the high rate of correct classifications. Moreover, the results are in good agreement with previous SAR studies on these molecules, suggesting that these findings may help in further investigations on ligands of LmGAPDH capable of improving the treatment of leishmaniasis. PMID:24566143

  14. Polymer Aging Techniques Applied to Degradation of a Polyurethane Propellant Binder

    SciTech Connect

    Assink, R.A.; Celina, M.; Graham, A.C.; Minier, L.M.

    1999-07-27

    The oxidative thermal aging of a crosslinked hydroxy-terminated polybutadiene (HTPB)/isophorone diisocyanate (IPDI) polyurethane rubber, commonly used as the polymeric binder matrix in solid rocket propellants, was studied at temperatures from room temperature to 125 C. We investigate changes in tensile elongation, mechanical hardening, polymer network properties, density, O2 permeation and molecular chain dynamics using a range of techniques including solvent swelling, detailed modulus profiling and NMR relaxation measurements. Using extensive data superposition and highly sensitive oxygen consumption measurements, we critically evaluate the Arrhenius methodology, which normally assumes a linear extrapolation of high temperature aging data. Significant curvature in the Arrhenius diagram of these oxidation rates was observed, similar to previous results found for other rubber materials. Preliminary gel/network properties suggest that crosslinking is the dominant process at higher temperatures. We also assess the importance of other constituents such as ammonium perchlorate or aluminum powder in the propellant formulation.
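
    The linear Arrhenius extrapolation that the abstract critiques can be illustrated numerically. The sketch below uses invented oxidation rates and temperatures (none of the numbers are the paper's measurements); it fits ln(rate) against 1/T and then compares local activation energies at the hot and cold ends of the data, which is one simple way to expose the kind of curvature described above.

      import numpy as np

      R = 8.314  # gas constant, J/(mol K)

      # Hypothetical oxidation rates (arbitrary units) at several aging temperatures (deg C).
      T_C = np.array([25.0, 65.0, 95.0, 110.0, 125.0])
      rate = np.array([1.1e-10, 4.0e-9, 6.5e-8, 2.0e-7, 5.5e-7])

      T_K = T_C + 273.15
      x = 1.0 / T_K            # Arrhenius abscissa
      y = np.log(rate)         # Arrhenius ordinate

      # Linear Arrhenius fit: ln(rate) = ln(A) - Ea/(R*T)
      slope, intercept = np.polyfit(x, y, 1)
      Ea = -slope * R / 1000.0  # apparent activation energy, kJ/mol
      print(f"apparent Ea from linear fit: {Ea:.1f} kJ/mol")

      # A crude curvature check: compare local Ea at the high- and low-temperature ends.
      Ea_high = -(y[-1] - y[-2]) / (x[-1] - x[-2]) * R / 1000.0
      Ea_low = -(y[1] - y[0]) / (x[1] - x[0]) * R / 1000.0
      print(f"local Ea (high T): {Ea_high:.1f} kJ/mol, local Ea (low T): {Ea_low:.1f} kJ/mol")
      # A lower effective Ea at low temperature means that a straight-line extrapolation
      # of high-temperature data would underestimate degradation rates near room temperature.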

  15. FTIR techniques applied to the detection of gelatine in paper artifacts: from macroscopic to microscopic approach

    NASA Astrophysics Data System (ADS)

    Rouchon, Véronique; Pellizzi, Eleonora; Janssens, Koen

    2010-09-01

    Gelatine has long been used to render paper hydrophobic to ink and thus suitable for writing. To this day, it is still employed in conservation workshops as an adhesive or a sizing agent, for instance during the treatment of iron gall ink manuscripts. Various types and concentrations of gelatine are recommended, depending on the desired effect, but little information is available regarding the physical distribution of gelatine in the paper. This aspect is, however, decisive for better control of conservation treatments. In this work, we investigate the possibilities offered by FTIR microscopy for measuring the gelatine distribution in paper. Laboratory papers were first treated with different types of gelatine and then embedded in a resin and cut into thin slices. Mapping techniques make it possible to compare the penetration of different types of gelatine in a semiquantitative way. The performance of conventional laboratory equipment and of a synchrotron radiation experimental setup is discussed.

  16. Optimization techniques applied to passive measures for in-orbit spacecraft survivability

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.; Price, D. Marvin

    1991-01-01

    Spacecraft designers have always been concerned about the effects of meteoroid impacts on mission safety. The engineering solution to this problem has generally been to erect a bumper or shield placed outboard from the spacecraft wall to disrupt/deflect the incoming projectiles. Spacecraft designers have a number of tools at their disposal to aid in the design process. These include hypervelocity impact testing, analytic impact predictors, and hydrodynamic codes. Analytic impact predictors generally provide the best quick-look estimate of design tradeoffs. The most complete way to determine the characteristics of an analytic impact predictor is through optimization of the protective structures design problem formulated with the predictor of interest. Space Station Freedom protective structures design insight is provided through the coupling of design/material requirements, hypervelocity impact phenomenology, meteoroid and space debris environment sensitivities, optimization techniques and operations research strategies, and mission scenarios. Major results are presented.

  17. A study of universal modulation techniques applied to satellite data collection

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A universal modulation and frequency control system for use with data collection platform (DCP) transmitters is examined. The final design discussed can, under software/firmware control, generate all of the specific digital data modulation formats currently used in the NASA satellite data collection service and can simultaneously synthesize the proper RF carrier frequencies employed. A novel technique for DCP time and frequency control is presented. The emissions of NBS radio station WWV/WWVH are received, detected, and finally decoded in microcomputer software to generate a highly accurate time base for the platform; with the assistance of external hardware, the microcomputer also directs the recalibration of all DCP oscillators to achieve very high frequency accuracies and low drift rates versus temperature, supply voltage, and time. The final programmable DCP design also employs direct microcomputer control of data reduction, formatting, transmitter switching, and system power management.

  18. Spectroscopic techniques applied to the characterization of decorated potteries from Caltagirone (Sicily, Italy)

    NASA Astrophysics Data System (ADS)

    Barilaro, D.; Barone, G.; Crupi, V.; Donato, M. G.; Majolino, D.; Messina, G.; Ponterio, R.

    2005-06-01

    The aim of the present work is the characterization of decorated pottery samples from Caltagirone (Sicily, Italy), a renowned production centre for this kind of artwork. The fragments were found during archaeological excavations and were attributed to historical periods very distant from one another (from the XVIII century BC to the XVI century AD). We therefore expect the manufacturing techniques to differ considerably over such a long time span. The measurements, performed by Fourier Transform InfraRed (FT-IR) absorbance and micro-Raman scattering, allowed a non-destructive study of these precious artefacts. Some pigments were identified, and various elements of the ceramic paste and glazed layer were characterized.

  19. Statistical damage identification techniques applied to the I-40 bridge over the Rio Grande River

    SciTech Connect

    Doebling, S.W.; Farrar, C.R.

    1998-03-01

    The statistical significance of vibration-based damage identification parameters is studied via application to the data from the tests performed on the Interstate 40 highway bridge in Albuquerque, New Mexico. A test of statistical significance is applied to the mean and confidence interval estimates of the modal properties and the corresponding damage indicators. The damage indicator used in this study is the change in the measured flexibility matrix. Previously presented deterministic results indicate that damage is detectable in all of the damage cases from these data sets. The results of this study indicate that the changes in both the modal properties and the damage indicators are statistically significant for all of the damage cases. However, these changes are distributed spatially for the first three damage cases and do not localize the damage until the fourth and final damage case.
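
    The flexibility-based damage indicator referred to above can be assembled from mass-normalized mode shapes and natural frequencies, with the indicator taken from the change in the flexibility matrix between the undamaged and damaged states. The sketch below uses synthetic modal data (the shapes, frequencies and "damage" perturbation are invented) and does not reproduce the statistical significance testing described in the abstract.

      import numpy as np

      def modal_flexibility(phi, omega):
          """Modal flexibility estimate: F ~ sum_i phi_i phi_i^T / omega_i^2, built from
          mass-normalized mode shapes (columns of phi) and circular frequencies omega."""
          return (phi / omega**2) @ phi.T

      rng = np.random.default_rng(0)
      n_dof, n_modes = 8, 3

      phi_u = rng.normal(size=(n_dof, n_modes))      # "undamaged" mode shapes (invented)
      omega_u = np.array([10.0, 25.0, 47.0])         # rad/s, invented

      # "Damage": slightly lower frequencies and perturbed shapes.
      phi_d = phi_u + 0.02 * rng.normal(size=(n_dof, n_modes))
      omega_d = omega_u * np.array([0.95, 0.99, 1.00])

      dF = modal_flexibility(phi_d, omega_d) - modal_flexibility(phi_u, omega_u)

      # A common localization index: the largest absolute change in each column of dF,
      # i.e. the degree of freedom whose flexibility changed most.
      indicator = np.max(np.abs(dF), axis=0)
      print("flexibility-change indicator per DOF:", np.round(indicator, 5))
      print("most affected DOF:", int(np.argmax(indicator)))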

  20. Contrast cancellation technique applied to digital x-ray imaging using silicon strip detectors

    SciTech Connect

    Avila, C.; Lopez, J.; Sanabria, J. C.; Baldazzi, G.; Bollini, D.; Gombia, M.; Cabal, A.E.; Ceballos, C.; Diaz Garcia, A.; Gambaccini, M.; Taibi, A.; Sarnelli, A.; Tuffanelli, A.; Giubellino, P.; Marzari-Chiesa, A.; Prino, F.; Tomassi, E.; Grybos, P.; Idzik, M.; Swientek, K.

    2005-12-15

    Dual-energy mammographic imaging experimental tests have been performed using a compact dichromatic imaging system based on a conventional x-ray tube, a mosaic crystal, and a 384-strip silicon detector equipped with full-custom electronics with single photon counting capability. For simulating breast tissue, a three-component phantom, made of Plexiglass, polyethylene, and water, has been used. Images have been collected with three different pairs of x-ray energies: 16-32 keV, 18-36 keV, and 20-40 keV. A Monte Carlo simulation of the experiment has also been carried out using the MCNP-4C transport code. The Alvarez-Macovski algorithm has been applied both to experimental and simulated data to remove the contrast between two of the phantom materials so as to enhance the visibility of the third one.
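
    The contrast-cancellation idea can be illustrated with a simplified weighted log-subtraction of the low- and high-energy log-attenuation images, choosing the weight so that two of the three phantom materials become indistinguishable while the third keeps its contrast. The attenuation coefficients, thickness and weight below are invented, and this sketch does not reproduce the full Alvarez-Macovski basis decomposition applied in the study.

      import numpy as np

      # Invented linear attenuation coefficients (1/cm) at a low and a high energy
      # for the three phantom materials; the values are illustrative only.
      mu_low  = {"plexiglas": 0.55, "polyethylene": 0.40, "water": 0.50}
      mu_high = {"plexiglas": 0.30, "polyethylene": 0.20, "water": 0.28}

      thickness = 2.0  # cm, uniform phantom thickness

      # Simple 1-D "image": a strip of each material.
      materials = ["plexiglas"] * 20 + ["water"] * 20 + ["polyethylene"] * 20
      M_low  = np.array([mu_low[m]  * thickness for m in materials])   # low-energy log-attenuation
      M_high = np.array([mu_high[m] * thickness for m in materials])   # high-energy log-attenuation

      # Weight chosen to cancel the plexiglas/water contrast in the hybrid image,
      # leaving the polyethylene region visible.
      w = (mu_low["plexiglas"] - mu_low["water"]) / (mu_high["plexiglas"] - mu_high["water"])
      hybrid = M_low - w * M_high

      print("plexiglas vs water step   :", round(hybrid[0] - hybrid[25], 6))    # ~0 by construction
      print("polyethylene vs water step:", round(hybrid[45] - hybrid[25], 6))   # remains non-zero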

  1. Inverting travel times with a triplication. [spline fitting technique applied to lunar seismic data reduction]

    NASA Technical Reports Server (NTRS)

    Jarosch, H. S.

    1982-01-01

    A method based on the use of constrained spline fits is used to overcome the difficulties arising when body-wave data in the form of T-delta are reduced to the tau-p form in the presence of cusps. In comparison with unconstrained spline fits, the method proposed here tends to produce much smoother models which lie approximately in the middle of the bounds produced by the extremal method. The method is noniterative and, therefore, computationally efficient. The method is applied to the lunar seismic data, where at least one triplication is presumed to occur in the P-wave travel-time curve. It is shown, however, that because of an insufficient number of data points for events close to the antipode of the center of the lunar network, the present analysis is not accurate enough to resolve the problem of a possible lunar core.
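
    For reference, the tau-p reduction mentioned above maps the travel-time curve T(Delta) to the delay time tau; the standard definition (independent of this paper's particular spline parameterization) is

      \[
      \tau(p) = T(\Delta) - p\,\Delta, \qquad p = \frac{dT}{d\Delta},
      \]

    and because tau(p) is single-valued and monotonically decreasing even where T(Delta) folds back on itself, constrained spline fits are easier to control in the tau-p domain than on the triplicated travel-time curve.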

  2. Contrast cancellation technique applied to digital x-ray imaging using silicon strip detectors.

    PubMed

    Avila, C; Lopez, J; Sanabria, J C; Baldazzi, G; Bollini, D; Gombia, M; Cabal, A E; Ceballos, C; Diaz Garcia, A; Gambaccini, M; Taibi, A; Sarnelli, A; Tuffanelli, A; Giubellino, P; Marzari-Chiesa, A; Prino, F; Tomassi, E; Grybos, P; Idzik, M; Swientek, K; Wiacek, P; Montaño, L M; Ramello, L; Sitta, M

    2005-12-01

    Dual-energy mammographic imaging experimental tests have been performed using a compact dichromatic imaging system based on a conventional x-ray tube, a mosaic crystal, and a 384-strip silicon detector equipped with full-custom electronics with single photon counting capability. For simulating breast tissue, a three-component phantom, made of Plexiglass, polyethylene, and water, has been used. Images have been collected with three different pairs of x-ray energies: 16-32 keV, 18-36 keV, and 20-40 keV. A Monte Carlo simulation of the experiment has also been carried out using the MCNP-4C transport code. The Alvarez-Macovski algorithm has been applied both to experimental and simulated data to remove the contrast between two of the phantom materials so as to enhance the visibility of the third one. PMID:16475775

  3. Discrimination and classification techniques applied on Mallotus and Phyllanthus high performance liquid chromatography fingerprints.

    PubMed

    Viaene, J; Goodarzi, M; Dejaegher, B; Tistaert, C; Hoang Le Tuan, A; Nguyen Hoai, N; Chau Van, M; Quetin-Leclercq, J; Vander Heyden, Y

    2015-06-01

    Mallotus and Phyllanthus genera, both containing several species commonly used as traditional medicines around the world, are the subjects of this discrimination and classification study. The objective of this study was to compare different discrimination and classification techniques to distinguish the two genera (Mallotus and Phyllanthus) on the one hand, and the six species (Mallotus apelta, Mallotus paniculatus, Phyllanthus emblica, Phyllanthus reticulatus, Phyllanthus urinaria L. and Phyllanthus amarus), on the other. Fingerprints of 36 samples from the 6 species were developed using reversed-phase high-performance liquid chromatography with ultraviolet detection (RP-HPLC-UV). After fingerprint data pretreatment, first an exploratory data analysis was performed using Principal Component Analysis (PCA), revealing two outlying samples, which were excluded from the calibration set used to develop the discrimination and classification models. Models were built by means of Linear Discriminant Analysis (LDA), Quadratic Discriminant Analysis (QDA), Classification and Regression Trees (CART) and Soft Independent Modeling of Class Analogy (SIMCA). Application of the models on the total data set (outliers included) confirmed a possible labeling issue for the outliers. LDA, QDA and CART, independently of the pretreatment, or SIMCA after "normalization and column centering (N_CC)" or after "Standard Normal Variate transformation and column centering (SNV_CC)" were found best to discriminate the two genera, while LDA after column centering (CC), N_CC or SNV_CC; QDA after SNV_CC; and SIMCA after N_CC or after SNV_CC best distinguished between the 6 species. As classification technique, SIMCA after N_CC or after SNV_CC results in the best overall sensitivity and specificity. PMID:26002209
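
    One of the combinations reported to work well above, SNV pretreatment followed by column centering and LDA, can be sketched with scikit-learn. The fingerprints below are synthetic stand-ins for the chromatograms, and a small PCA step is added before LDA purely for numerical stability when there are more time points than samples; that extra step is an assumption of this sketch, not necessarily the authors' workflow.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      def snv(X):
          """Standard Normal Variate: center and scale each fingerprint (row) individually."""
          return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

      rng = np.random.default_rng(1)

      # Synthetic "fingerprints": 36 samples x 200 time points; the two genera are given
      # peaks at different retention times so that a shape difference survives SNV.
      n_per_class, n_points = 18, 200
      t = np.arange(n_points)
      peak_a = np.exp(-0.5 * ((t - 60) / 5.0) ** 2)
      peak_b = np.exp(-0.5 * ((t - 140) / 5.0) ** 2)
      baseline = rng.normal(size=n_points)
      X = np.vstack([baseline + 2.0 * peak_a + 0.3 * rng.normal(size=(n_per_class, n_points)),
                     baseline + 2.0 * peak_b + 0.3 * rng.normal(size=(n_per_class, n_points))])
      y = np.array([0] * n_per_class + [1] * n_per_class)   # 0 = "Mallotus", 1 = "Phyllanthus"

      # Pretreatment: SNV per fingerprint, then column centering on the calibration set.
      X_snv = snv(X)
      cal, test = np.arange(0, 36, 2), np.arange(1, 36, 2)   # illustrative calibration/test split
      X_cc = X_snv - X_snv[cal].mean(axis=0)

      # PCA scores (fit on the calibration set only), then LDA on the scores.
      pca = PCA(n_components=5).fit(X_cc[cal])
      lda = LinearDiscriminantAnalysis().fit(pca.transform(X_cc[cal]), y[cal])
      print("external test accuracy:", lda.score(pca.transform(X_cc[test]), y[test]))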

  4. Applying simple geostatistical techniques to a routine production geology problem - a case study

    SciTech Connect

    Norris, R.J.; Hewitt, A.; Massonnat, G.J. )

    1994-07-01

    A production geology reservoir description was shown to represent poorly the known dynamic reservoir behavior. The permeability field was originally generated from a general porosity-permeability law applied to a contoured porosity field. This resulted in unrealistically high permeability values in some layers and early water breakthrough in flow simulations. Furthermore, known well values of permeability were not honored. To improve the model, a geostatistical approach was used. The first step involved the use of a porosity-permeability law based on total layer porosity values, that is, with no cutoff applied, which gave permeability values closer to the known reservoir values. This alone produces a permeability field which, although it better represents the absolute permeability values, is still too smooth, with preferential pathways occurring as artifacts of contouring. A second step therefore involves the "unsmoothing" of the permeability field. At each point of the permeability field the permeability values are resampled from a distribution centered around the original value. This is a form of Monte-Carlo replacement, conditioned to well data and honoring the original trend in the data. In this case no spatial correlation was included, due to lack of information. This simple Gaussian simulation approach provides multiple realizations based on deterministic information. The advantages are clear: more realistic images (no artificial pathways), an improved match of water breakthrough, and the honoring of all deterministic data (wells and trend). A subsequent step was the incorporation of further "soft" information. Geological analysis suggested that there was a degradation of reservoir properties along the east-west axis. This new information was rapidly assimilated into the model to produce final images of the reservoir.
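
    The "unsmoothing" step described above amounts to resampling each grid value around its contoured estimate while pinning the well measurements. The sketch below uses an invented 1-D permeability trend, invented well data, and Gaussian perturbations with no spatial correlation, consistent with the simplification described in the abstract but not a reproduction of the case study itself.

      import numpy as np

      rng = np.random.default_rng(42)

      # Smooth (contoured) permeability trend along one layer, in mD (illustrative values).
      k_trend = np.linspace(50.0, 400.0, 60)

      # Grid cells containing wells; their measured permeabilities must be honored.
      well_cells = np.array([5, 30, 55])
      k_wells = np.array([120.0, 260.0, 310.0])

      def unsmooth(k_trend, well_cells, k_wells, rel_std=0.3, rng=rng):
          """Monte-Carlo replacement: resample each cell from a Gaussian centered on the
          contoured (trend) value, then re-impose the well measurements so that every
          realization honors the hard data."""
          realization = rng.normal(loc=k_trend, scale=rel_std * k_trend)
          realization = np.clip(realization, 1.0, None)   # keep permeabilities physical
          realization[well_cells] = k_wells               # condition to wells
          return realization

      realizations = [unsmooth(k_trend, well_cells, k_wells) for _ in range(3)]
      for i, r in enumerate(realizations):
          print(f"realization {i}: k at wells = {r[well_cells]}")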

  5. Scaling up multi-camera tracking for real-world deployment

    NASA Astrophysics Data System (ADS)

    Raja, Yogesh; Gong, Shaogang

    2012-10-01

    A user-assisted multi-camera tracking system employing several key novel methodologies has previously been shown to be highly effective in assisting human users in tracking targets of interest through the industry-standard i-LIDS multi-camera benchmark data. A prototype system was developed in order to test and evaluate the effectiveness of this approach. In this paper, we develop this system further in order to improve tracking accuracy and further facilitate scalability to arbitrary numbers of camera views across much larger spatial areas and different locations. Specifically, we describe the following three areas of improvement: (1) dynamic learning mechanisms apply user feedback in adapting internal models to improve performance over time; (2) modular design and hardware acceleration techniques are explored with a view to real-time performance, extensive configurability to leverage available hardware and scalability to larger datasets; and (3) re-design of the user interface for deployment as a secure asynchronous remote web-based service. We conduct an extensive evaluation of the system in terms of: (1) tracking performance; and (2) the speed of the system in computation and in usage over a network. We use a newly collected real-world dataset significantly more challenging than i-LIDS, which comprises six cameras covering two London Underground stations. We show that: (1) dynamic learning is effective; (2) the user-assisted paradigm retains its effectiveness with this significantly more challenging dataset; (3) large-scale deployment and real-time computation is feasible due to linear scalability; (4) context-aware user search strategies and external non-visual information can aid search convergence; and (5) storage and querying of meta-data is a bottleneck to be overcome.

  6. Scale-up of industrial biodiesel production to 40 m³ using a liquid lipase formulation.

    PubMed

    Price, Jason; Nordblad, Mathias; Martel, Hannah H; Chrabas, Brent; Wang, Huali; Nielsen, Per Munk; Woodley, John M

    2016-08-01

    In this work, we demonstrate the scale-up from an 80 L fed-batch scale to 40 m³, along with the design of a 4 m³ continuous process for enzymatic biodiesel production catalyzed by NS-40116 (a liquid formulation of a modified Thermomyces lanuginosus lipase). Based on the analysis of actual pilot plant data for the transesterification of used cooking oil and brown grease, we propose a method applying first order integral analysis to fed-batch data based on either the bound glycerol or free fatty acid content in the oil. This method greatly simplifies the modeling process and gives an indication of the effect of mixing at the various scales (80 L to 40 m³), along with a prediction of the residence time needed to reach a desired conversion in a CSTR. Suitable process metrics reflecting commercial performance, such as the reaction time, enzyme efficiency, and reactor productivity, were evaluated for both the fed-batch and CSTR cases. Given similar operating conditions, the CSTR operation on average has a reaction time which is 1.3 times greater than the fed-batch operation. We also showed how the process metrics can be used to quickly estimate the selling price of the enzyme. Assuming a biodiesel selling price of 0.6 USD/kg and a one-time use of the enzyme (0.1% (w/w oil) enzyme dosage), the enzyme can then be sold for 30 USD/kg, which ensures that the enzyme cost is not more than 5% of the biodiesel revenue. Biotechnol. Bioeng. 2016;113: 1719-1728. © 2016 Wiley Periodicals, Inc. PMID:26806356
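
    The batch-versus-CSTR comparison above follows from elementary first-order reactor relations. The sketch below uses an invented rate constant and a moderate target conversion; with strictly first-order kinetics and a single ideal CSTR the time penalty relative to batch operation grows with conversion, and the numbers here are chosen only to land near the roughly 1.3x factor quoted above, not to reproduce the pilot-plant data.

      import numpy as np

      k = 0.35          # 1/h, illustrative apparent first-order rate constant
      X_target = 0.40   # illustrative fractional conversion

      # Batch (or ideal fed-batch) reaction time for first-order kinetics: X = 1 - exp(-k t)
      t_batch = -np.log(1.0 - X_target) / k

      # Single ideal CSTR at the same conversion: X = k*tau / (1 + k*tau)
      tau_cstr = X_target / (k * (1.0 - X_target))

      print(f"batch time     : {t_batch:.2f} h")
      print(f"CSTR residence : {tau_cstr:.2f} h")
      print(f"CSTR / batch   : {tau_cstr / t_batch:.2f}")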

  7. High throughput screening of particle conditioning operations: II. Evaluation of scale-up heuristics with prokaryotically expressed polysaccharide vaccines.

    PubMed

    Noyes, Aaron; Huffman, Ben; Berrill, Alex; Merchant, Nick; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Sunasara, Khurram; Mukhopadhyay, Tarit

    2015-08-01

    Multivalent polysaccharide conjugate vaccines are typically comprised of several different polysaccharides produced with distinct and complex production processes. Particle conditioning steps, such as precipitation and flocculation, may be used to aid the recovery and purification of such microbial vaccine products. An ultra scale-down approach to purify vaccine polysaccharides at the micro-scale would greatly enhance productivity, robustness, and speed the development of novel conjugate vaccines. In part one of this series, we described a modular and high throughput approach to develop particle conditioning processes (HTPC) for biologicals that combines flocculation, solids removal, and streamlined analytics. In this second part of the series, we applied HTPC to industrially relevant feedstreams comprised of capsular polysaccharides (CPS) from several bacterial species. The scalability of HTPC was evaluated between 0.8 mL and 13 L scales, with several different scaling methodologies examined. Clarification, polysaccharide yield, impurity clearance, and product quality achieved with HTPC were reproducible and comparable with larger scales. Particle sizing was the response with greatest sensitivity to differences in processing scale and enabled the identification of useful scaling rules. Scaling with constant impeller tip speed or power per volume in the impeller swept zone offered the most accurate scale up, with evidence that time integration of these values provided the optimal basis for scaling. The capability to develop a process at the micro-scale combined with evidence-based scaling metrics provide a significant advance for purification process development of vaccine processes. The USD system offers similar opportunities for HTPC of proteins and other complex biological molecules. PMID:25727194
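
    The two scaling bases highlighted above translate, for geometrically similar vessels operated in the turbulent regime, into simple rules for the impeller speed at the larger scale: constant tip speed keeps N*D fixed, while constant power per volume keeps N^3*D^2 fixed (since P scales as N^3*D^5 and V as D^3). The dimensions and speeds below are invented, not taken from the study.

      import numpy as np

      def scale_speed(n1_rpm, d1, d2, rule):
          """Impeller speed at scale 2 for geometrically similar, turbulent stirred vessels.
          rule = 'tip_speed' keeps N*D constant; rule = 'power_per_volume' keeps N^3*D^2 constant."""
          if rule == "tip_speed":
              return n1_rpm * (d1 / d2)
          if rule == "power_per_volume":
              return n1_rpm * (d1 / d2) ** (2.0 / 3.0)
          raise ValueError(rule)

      d_small, d_large = 0.05, 0.40      # m, illustrative impeller diameters (micro vs large scale)
      n_small = 800.0                    # rpm at the small scale (invented)

      for rule in ("tip_speed", "power_per_volume"):
          n_large = scale_speed(n_small, d_small, d_large, rule)
          print(f"{rule:17s}: {n_large:6.1f} rpm at the larger scale")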

  8. Time-reversal imaging techniques applied to tremor waveforms near Cholame, California to locate tectonic tremor

    NASA Astrophysics Data System (ADS)

    Horstmann, T.; Harrington, R. M.; Cochran, E. S.

    2012-12-01

    Frequently, the lack of distinctive phase arrivals makes locating tectonic tremor more challenging than locating earthquakes. Classic location algorithms based on travel times cannot be directly applied because impulsive phase arrivals are often difficult to recognize. Traditional location algorithms are often modified to use phase arrivals identified from stacks of recurring low-frequency events (LFEs) observed within tremor episodes, rather than single events. Stacking the LFE waveforms improves the signal-to-noise ratio for the otherwise non-distinct phase arrivals. In this study, we apply a different method to locate tectonic tremor: a modified time-reversal imaging approach that potentially exploits the information from the entire tremor waveform instead of phase arrivals from individual LFEs. Time reversal imaging uses the waveforms of a given seismic source recorded by multiple seismometers at discrete points on the surface and a 3D velocity model to rebroadcast the waveforms back into the medium to identify the seismic source location. In practice, the method works by reversing the seismograms recorded at each of the stations in time, and back-propagating them from the receiver location individually into the sub-surface as a new source time function. We use a staggered-grid, finite-difference code with 2.5 ms time steps and a grid node spacing of 50 m to compute the rebroadcast wavefield. We calculate the time-dependent curl field at each grid point of the model volume for each back-propagated seismogram. To locate the tremor, we assume that the source time function back-propagated from each individual station produces a similar curl field at the source position. We then cross-correlate the time dependent curl field functions and calculate a median cross-correlation coefficient at each grid point. The highest median cross-correlation coefficient in the model volume is expected to represent the source location. For our analysis, we use the velocity model of

  9. Large-timestep techniques for particle-in-cell simulation of systems with applied fields that vary rapidly in space

    SciTech Connect

    Friedman, A.; Grote, D.P.

    1996-10-01

    Under conditions which arise commonly in space-charge-dominated beam applications, the applied focusing, bending, and accelerating fields vary rapidly with axial position, while the self-fields (which are, on average, comparable in strength to the applied fields) vary smoothly. In such cases it is desirable to employ timesteps which advance the particles over distances greater than the characteristic scales over which the applied fields vary. Several related concepts are potentially applicable: sub-cycling of the particle advance relative to the field solution, a higher-order time-advance algorithm, force-averaging by integration along approximate orbits, and orbit-averaging. We report on our investigations into the utility of such techniques for systems typical of those encountered in accelerator studies for heavy-ion beam-driven inertial fusion.
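
    The sub-cycling option mentioned above can be illustrated on a single particle in one dimension: the smooth self-field kick is applied once per large timestep, while the rapidly varying applied field is integrated with several smaller sub-steps inside it. The field profiles, step sizes and sub-cycle count below are invented for illustration and are not the schemes studied in the paper.

      import numpy as np

      def applied_field(z):
          # Rapidly varying applied focusing field (illustrative): short period in z.
          return -4.0 * np.sin(40.0 * z)

      def self_field(z):
          # Smooth, slowly varying space-charge-like field (illustrative).
          return 0.5 * z

      def push(z, v, dt, n_sub):
          """One large timestep of length dt: the self-field kick is applied once,
          while the applied field is sub-cycled with n_sub smaller steps."""
          v += self_field(z) * dt            # one smooth self-field kick per big step
          h = dt / n_sub
          for _ in range(n_sub):             # sub-cycle through the rapidly varying applied field
              v += applied_field(z) * h
              z += v * h
          return z, v

      z, v = 0.01, 1.0
      for step in range(100):
          z, v = push(z, v, dt=0.05, n_sub=10)
      print(f"final position {z:.4f}, final velocity {v:.4f}")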

  10. A Novel Solution-Technique Applied to a Novel WAAS Architecture

    NASA Technical Reports Server (NTRS)

    Bavuso, J.

    1998-01-01

    The Federal Aviation Administration has embarked on an historic task of modernizing and significantly improving the national air transportation system. One system that uses the Global Positioning System (GPS) to determine aircraft navigational information is called the Wide Area Augmentation System (WAAS). This paper describes a reliability assessment of one candidate system architecture for the WAAS. A unique aspect of this study regards the modeling and solution of a candidate system that allows a novel cold sparing scheme. The cold spare is a WAAS communications satellite that is fabricated and launched after a predetermined number of orbiting satellite failures have occurred and after some stochastic fabrication time transpires. Because these satellites are complex systems with redundant components, they exhibit an increasing failure rate with a Weibull time to failure distribution. Moreover, the cold spare satellite build-time is Weibull and upon launch is considered to be a good-as-new system with an increasing failure rate and a Weibull time to failure distribution as well. The reliability model for this system is non-Markovian because three distinct system clocks are required: the time to failure of the orbiting satellites, the build time for the cold spare, and the time to failure for the launched spare satellite. A powerful dynamic fault tree modeling notation and Monte Carlo simulation technique with importance sampling are shown to arrive at a reliability prediction for a 10 year mission.
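
    The three clocks described above (a Weibull lifetime for the orbiting satellite, a Weibull build time for the cold spare, and a Weibull lifetime for the launched, good-as-new spare) can be sampled directly in a Monte Carlo sketch. The version below reduces the problem to a single primary satellite with one cold spare and a simplified success rule, uses invented shape and scale parameters, and does not reproduce the dynamic fault tree or the importance sampling of the study.

      import numpy as np

      rng = np.random.default_rng(7)

      MISSION_YEARS = 10.0
      N_TRIALS = 200_000

      # Invented Weibull parameters (shape > 1 gives the increasing failure rate
      # mentioned in the abstract; scales are in years).
      SAT_SHAPE, SAT_SCALE = 2.0, 9.0       # on-orbit satellite lifetime
      BUILD_SHAPE, BUILD_SCALE = 3.0, 1.5   # fabrication time of the cold spare

      def weibull(shape, scale, size):
          return scale * rng.weibull(shape, size)

      primary_life = weibull(SAT_SHAPE, SAT_SCALE, N_TRIALS)     # clock 1: orbiting satellite
      build_time = weibull(BUILD_SHAPE, BUILD_SCALE, N_TRIALS)   # clock 2: spare fabrication
      spare_life = weibull(SAT_SHAPE, SAT_SCALE, N_TRIALS)       # clock 3: launched spare

      # Cold-standby rule: the spare is fabricated only after the primary fails and is
      # good-as-new at launch. The mission succeeds if coverage lasts to the 10-year
      # point, i.e. primary life + build time + spare life >= 10 years (this also
      # covers the case where the primary never fails during the mission).
      success = primary_life + build_time + spare_life >= MISSION_YEARS

      print(f"estimated 10-year mission reliability: {success.mean():.4f}")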

  11. Digital Image Correlation Techniques Applied to Large Scale Rocket Engine Testing

    NASA Technical Reports Server (NTRS)

    Gradl, Paul R.

    2016-01-01

    Rocket engine hot-fire ground testing is necessary to understand component performance, reliability and engine system interactions during development. The J-2X upper stage engine completed a series of developmental hot-fire tests that derived performance of the engine and components, validated analytical models and provided the necessary data to identify where design changes, process improvements and technology development were needed. The J-2X development engines were heavily instrumented to provide the data necessary to support these activities which enabled the team to investigate any anomalies experienced during the test program. This paper describes the development of an optical digital image correlation technique to augment the data provided by traditional strain gauges which are prone to debonding at elevated temperatures and limited to localized measurements. The feasibility of this optical measurement system was demonstrated during full scale hot-fire testing of J-2X, during which a digital image correlation system, incorporating a pair of high speed cameras to measure three-dimensional, real-time displacements and strains was installed and operated under the extreme environments present on the test stand. The camera and facility setup, pre-test calibrations, data collection, hot-fire test data collection and post-test analysis and results are presented in this paper.

  12. Dependency between treatment outcome in pseudarthrosis of the humeral shaft and the surgical technique applied.

    PubMed

    Piotrowski, Maciej; Baczkowski, Bogusław; Markowicz, Agnieszka; Pankowski, Rafał; Luczkiewicz, Piotr

    2005-08-30

    Background. Treatment of non-union has always been one of the most difficult problems in bone pathology. In the present study we compare outcomes using 9 different methods of non-union treatment. Material and methods. From 1976 to 2003, 70 patients with 85 cases of pseudoarthrosis in the humeral shaft were operated. During that period, 103 operations using 9 different methods were performed. The study group consisted of 17 females, 36 males and 17 children, ranging in age from 3 to 85 years. The operation techniques were compared based on the achievement of bone union and recovery of limb functional efficiency. Nonunion type was also taken into account. Results. A high percentage of bone union was obtained by using a perforated block of corticocancellous graft taken from the iliac crest. The most complete limb function recovery was achieved using this method, as well as Judet's decortication with cancellous grafting and firm osteosynthesis. Conclusions. In oligotrophic and non-viable humeral shaft non-union, the most effective method is pseudarthrosis excision, using a perforated block of corticocancellous graft from the iliac crest to fill the gap, and firm osteosynthesis. Judet's decortication with cancellous grafting and firm osteosynthesis secured good outcome in hypertrophic pseudarthrosis. PMID:17611455

  13. Applying data fusion techniques for benthic habitat mapping and monitoring in a coral reef ecosystem

    NASA Astrophysics Data System (ADS)

    Zhang, Caiyun

    2015-06-01

    Accurate mapping and effective monitoring of benthic habitat in the Florida Keys are critical in developing management strategies for this valuable coral reef ecosystem. For this study, a framework was designed for automated benthic habitat mapping by combining multiple data sources (hyperspectral, aerial photography, and bathymetry data) and four contemporary imagery processing techniques (data fusion, Object-based Image Analysis (OBIA), machine learning, and ensemble analysis). In the framework, 1-m digital aerial photograph was first merged with 17-m hyperspectral imagery and 10-m bathymetry data using a pixel/feature-level fusion strategy. The fused dataset was then preclassified by three machine learning algorithms (Random Forest, Support Vector Machines, and k-Nearest Neighbor). Final object-based habitat maps were produced through ensemble analysis of outcomes from three classifiers. The framework was tested for classifying a group-level (3-class) and code-level (9-class) habitats in a portion of the Florida Keys. Informative and accurate habitat maps were achieved with an overall accuracy of 88.5% and 83.5% for the group-level and code-level classifications, respectively.
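
    The ensemble stage of the framework described above (three classifiers combined into a final object-based map) can be sketched with scikit-learn's VotingClassifier. This is an illustrative stand-in: the features are synthetic rather than the fused hyperspectral/aerial/bathymetry objects, and the simple probability vote used here is an assumption, not necessarily the exact ensemble analysis of the study.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier, VotingClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.svm import SVC

      # Synthetic stand-in for fused object features, with 3 classes playing the role
      # of the group-level habitat classes.
      X, y = make_classification(n_samples=600, n_features=12, n_informative=8,
                                 n_classes=3, random_state=0)
      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

      ensemble = VotingClassifier(
          estimators=[("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
                      ("svm", SVC(probability=True, random_state=0)),
                      ("knn", KNeighborsClassifier(n_neighbors=5))],
          voting="soft")  # combine class probabilities from the three classifiers

      ensemble.fit(X_train, y_train)
      print(f"overall accuracy on held-out objects: {ensemble.score(X_test, y_test):.3f}")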

  14. Laser ultrasound technique applied in material characterization of thermally sprayed nickel aluminum coatings

    NASA Astrophysics Data System (ADS)

    Yeh, C. H.; Yang, C. H.; Hsiao, W. T.; Su, C.-Y.

    2012-05-01

    Thermal spraying usually uses a nickel-aluminum alloy system as the major powder because of its strong adhesion to substrates. The composition of the powder material and the processing parameters used in the spraying process cause the material properties of the coatings to vary widely. This research aims at the nondestructive characterization of thermally sprayed coatings. A laser-generation/laser-detection laser ultrasound technique (LUT) is used to measure the dispersion spectra of surface waves propagating along the coated surfaces. A theoretical model for surface waves propagating along a multi-layered structure consisting of coating and substrate is used to model the sprayed coatings. An inversion algorithm based on Shuffled Complex Evolution (SCE-UA) is used, together with the theoretical model, to extract mechanical properties from the measured dispersion spectra. Three coatings with different sprayed powders and powder processing are investigated. The results show substantial linear scatter in the inverted properties, because the measured dispersion spectra have limited bandwidth owing to the relatively high attenuation. The slope of this linear scatter can be used to distinguish the coating properties. The ANiBNb sample with the ball-milled coating has the best properties, based on its highest velocity and lowest attenuation. This method is potentially useful for characterizing the mechanical properties of thermally sprayed coatings in a nondestructive way.

  15. Comparison of motion correction techniques applied to functional near-infrared spectroscopy data from children

    NASA Astrophysics Data System (ADS)

    Hu, Xiao-Su; Arredondo, Maria M.; Gomba, Megan; Confer, Nicole; DaSilva, Alexandre F.; Johnson, Timothy D.; Shalinsky, Mark; Kovelman, Ioulia

    2015-12-01

    Motion artifacts are the most significant sources of noise in the context of pediatric brain imaging designs and data analyses, especially in applications of functional near-infrared spectroscopy (fNIRS), where they can severely degrade the quality of the acquired data. Different methods have been developed to correct motion artifacts in fNIRS data, but the relative effectiveness of these methods for data from child and infant subjects (which are often found to be significantly noisier than adult data) remains largely unexplored. The issue is further complicated by the heterogeneity of fNIRS data artifacts. We compared the efficacy of the six most prevalent motion artifact correction techniques with fNIRS data acquired from children participating in a language acquisition task, including wavelet, spline interpolation, principal component analysis, moving average (MA), correlation-based signal improvement, and a combination of wavelet and MA. The evaluation of five predefined metrics suggests that the MA and wavelet methods yield the best outcomes. These findings elucidate the varied nature of fNIRS data artifacts and the efficacy of artifact correction methods with pediatric populations, as well as help inform both the theory and practice of optical brain imaging analysis.

  16. Correlation techniques as applied to pose estimation in space station docking

    NASA Astrophysics Data System (ADS)

    Rollins, John M.; Juday, Richard D.; Monroe, Stanley E., Jr.

    2002-08-01

    The telerobotic assembly of space-station components has become the method of choice for the International Space Station (ISS) because it offers a safe alternative to the more hazardous option of space walks. The disadvantage of telerobotic assembly is that it does not necessarily provide for direct arbitrary views of mating interfaces for the teleoperator. Unless cameras are present very close to the interface positions, such views must be generated graphically, based on calculated pose relationships derived from images. To assist in this photogrammetric pose estimation, circular targets, or spots, of high contrast have been affixed on each connecting module at carefully surveyed positions. The appearance of a subset of spots must form a constellation of specific relative positions in the incoming image stream in order for the docking to proceed. Spot positions are expressed in terms of their apparent centroids in an image. The precision of centroid estimation is required to be as fine as 1/20th pixel, in some cases. This paper presents an approach to spot centroid estimation using cross correlation between spot images and synthetic spot models of precise centration. Techniques for obtaining sub-pixel accuracy and for shadow and lighting irregularity compensation are discussed.
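
    The correlation-based centroiding described above can be sketched by correlating a noisy spot image against a synthetic spot model and refining the integer correlation peak with a one-dimensional parabolic fit along each axis. Everything below (spot size, noise level, window sizes) is invented for illustration; only the sub-pixel goal comes from the abstract, and the parabolic refinement is one common choice rather than necessarily the authors' exact procedure.

      import numpy as np
      from scipy.signal import correlate2d

      def gaussian_spot(shape, cx, cy, sigma=2.0):
          y, x = np.mgrid[0:shape[0], 0:shape[1]]
          return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

      def parabolic_offset(c_m1, c_0, c_p1):
          """Sub-pixel offset of a correlation peak from three samples around the maximum."""
          denom = c_m1 - 2 * c_0 + c_p1
          return 0.0 if denom == 0 else 0.5 * (c_m1 - c_p1) / denom

      # Observed spot with a known fractional-pixel centroid, plus a little noise.
      rng = np.random.default_rng(3)
      true_cx, true_cy = 16.30, 15.75
      image = gaussian_spot((32, 32), true_cx, true_cy) + 0.01 * rng.normal(size=(32, 32))

      # Synthetic spot model of precise centration (centered in its own window).
      model = gaussian_spot((15, 15), 7.0, 7.0)

      corr = correlate2d(image, model, mode="same")
      py, px = np.unravel_index(np.argmax(corr), corr.shape)

      # Refine the integer peak to sub-pixel precision with a 1-D parabola in each axis.
      dx = parabolic_offset(corr[py, px - 1], corr[py, px], corr[py, px + 1])
      dy = parabolic_offset(corr[py - 1, px], corr[py, px], corr[py + 1, px])

      print(f"estimated centroid: ({px + dx:.3f}, {py + dy:.3f})  true: ({true_cx}, {true_cy})")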

  17. Low temperature techniques applied for CTEM and STEM analysis of cellular components at a molecular level.

    PubMed

    Sjöstrand, F S

    1982-12-01

    One of the most important problems in tissue preparation for electron microscopic analysis at a molecular level involves the preservation of the tissue without introducing extensive denaturation of the proteins. Low temperature is a most efficient condition for the inhibition of protein denaturation and freeze-drying offers favourable conditions for transferring proteins to a dry state with minimal denaturation of the proteins. However, the embedding of the dried tissue in a plastic leads to extensive denaturation of the proteins when performed in the conventional way. This eliminates very efficiently the advantages of the method. The situation becomes even worse when subjecting the tissue to freeze-substitution. To eliminate as far as possible the denaturing effect of plastic embedding, freeze-drying can be combined with low temperature embedding in a plastic. Freeze-fracturing allows a most efficient use of low temperature to reduce conformation changes in proteins. The value of the freeze-fracturing technique depends entirely on a precise knowledge of the location of the fracture planes. Since this location is not known, it must be determined on the basis of a deduction. If this deduction is wrong, the method becomes misleading. Two methods which allow a certain testing of the correctness of the deduced location of the fracture planes are mentioned. PMID:6759657

  18. Experiences in applying optimization techniques to configurations for the Control of Flexible Structures (COFS) program

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.

    1989-01-01

    Optimization procedures are developed to systematically provide closely-spaced vibration frequencies. A general purpose finite-element program for eigenvalue and sensitivity analyses is combined with formal mathematical programming techniques. Results are presented for three studies. The first study uses a simple model to obtain a design with two pairs of closely-spaced frequencies. Two formulations are developed: an objective function-based formulation and constraint-based formulation for the frequency spacing. It is found that conflicting goals are handled better by a constraint-based formulation. The second study uses a detailed model to obtain a design with one pair of closely-spaced frequencies while satisfying requirements on local member frequencies and manufacturing tolerances. Two formulations are developed. Both the constraint-based and the objective function-based formulations perform reasonably well and converge to the same results. However, no feasible design solution exists which satisfies all design requirements for the choices of design variables and the upper and lower design variable values used. More design freedom is needed to achieve a fully satisfactory design. The third study is part of a redesign activity in which a detailed model is used.

  19. Experiences in applying optimization techniques to configurations for the Control Of Flexible Structures (COFS) Program

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.

    1988-01-01

    Optimization procedures are developed to systematically provide closely-spaced vibration frequencies. A general-purpose finite-element program for eigenvalue and sensitivity analyses is combined with formal mathematical programming techniques. Results are presented for three studies. The first study uses a simple model to obtain a design with two pairs of closely-spaced frequencies. Two formulations are developed: an objective function-based formulation and constraint-based formulation for the frequency spacing. It is found that conflicting goals are handled better by a constraint-based formulation. The second study uses a detailed model to obtain a design with one pair of closely-spaced frequencies while satisfying requirements on local member frequencies and manufacturing tolerances. Two formulations are developed. Both the constraint-based and the objective function-based formulations perform reasonably well and converge to the same results. However, no feasible design solution exists which satisfies all design requirements for the choices of design variables and the upper and lower design variable values used. More design freedom is needed to achieve a fully satisfactory design. The third study is part of a redesign activity in which a detailed model is used. The use of optimization in this activity allows investigation of numerous options (such as number of bays, material, minimum diagonal wall thicknesses) in a relatively short time. The procedure provides data for judgments on the effects of different options on the design.

  20. Machine Learning Techniques Applied to Sensor Data Correction in Building Technologies

    SciTech Connect

    Smith, Matt K; Castello, Charles C; New, Joshua Ryan

    2013-01-01

    Since commercial and residential buildings account for nearly half of the United States' energy consumption, making them more energy-efficient is a vital part of the nation's overall energy strategy. Sensors play an important role in this research by collecting data needed to analyze the performance of components, systems, and whole buildings. Given this reliance on sensors, ensuring that sensor data are valid is a crucial problem. The solutions being researched are machine learning techniques, namely artificial neural networks and Bayesian networks. The types of data investigated in this study are: (1) temperature; (2) humidity; (3) refrigerator energy consumption; (4) heat pump liquid pressure; and (5) water flow. These data are taken from Oak Ridge National Laboratory's (ORNL) ZEBRAlliance research project, which is composed of four single-family homes in Oak Ridge, TN. Results show that for the temperature, humidity, pressure, and flow sensors, data can mostly be predicted with a root-mean-square error (RMSE) of less than 10% of the respective sensor's mean value. Results for the energy sensor are not as good; RMSE values are centered around 100% of the mean value and are often well above 200%. Bayesian networks have RMSE of less than 5% of the respective sensor's mean value, but took substantially longer to train.

  1. Asymptotic approximation of the Wiener-Hopf technique as applied to jet atomisation

    NASA Astrophysics Data System (ADS)

    Chen, X.-N.; Kirchgässner, K.

    An approximate Wiener-Hopf (WH) technique is developed for solving problems involving fine spatial structure. As an example of the application of this method we investigate the atomisation of a liquid jet. The jet exits from a nozzle into an ambient fluid. Short interfacial waves become unstable and break into small particles. This problem is treated as a potential flow under the influence of capillary effects at the interface and the pressure fluctuation at the nozzle wall. Two simultaneous WH equations are obtained. To solve them, the singular parts in each equation are separated from the regular ones, which leads to a linear system of algebraic equations for the residues. The response-wave amplitudes are evaluated numerically and the instability diagram is presented. It is found that resonance occurs at double roots of the dispersion relation. For a given azimuthal number m, the double roots form two curves parametrised by the Weber number β. They merge at a certain critical point, where an even stronger resonance occurs. This finally selects the dominant modes. By adjusting one parameter, namely the velocity ratio U, the theoretical prediction agrees quite well with experimental results on jet atomisation.

  2. Modern Chemistry Techniques Applied to Metal Behavior and Chelation in Medical and Environmental Systems - Final Report

    SciTech Connect

    Sutton, M; Andresen, B; Burastero, S R; Chiarappa-Zucca, M L; Chinn, S C; Coronado, P R; Gash, A E; Perkins, J; Sawvel, A M; Szechenyi, S C

    2005-02-03

    This report details the research and findings generated over the course of a 3-year research project funded by Lawrence Livermore National Laboratory (LLNL) Laboratory Directed Research and Development (LDRD). Originally tasked with studying beryllium chemistry and chelation for the treatment of Chronic Beryllium Disease and environmental remediation of beryllium-contaminated environments, this work has yielded results in beryllium and uranium solubility and speciation associated with toxicology; specific and effective chelation agents for beryllium, capable of lowering beryllium tissue burden and increasing urinary excretion in mice, and of dissolving beryllium contamination at LLNL Site 300; 9Be NMR studies not previously performed at LLNL; secondary ion mass spectrometry (SIMS) imaging of beryllium in spleen and lung tissue; and beryllium interactions with aerogel/GAC material for environmental cleanup. The results show that chelator development using modern chemical techniques, such as chemical thermodynamic modeling, was successful in identifying and utilizing tried and tested beryllium chelators for use in medical and environmental scenarios. Additionally, a study of uranium speciation in simulated biological fluids identified uranium species present in urine, gastric juice, pancreatic fluid, airway surface fluid, simulated lung fluid, bile, saliva, plasma, interstitial fluid and intracellular fluid.

  3. Imaging techniques applied to the study of fluids in porous media

    SciTech Connect

    Tomutsa, L.; Doughty, D.; Mahmood, S.; Brinkmeyer, A.; Madden, M.P.

    1991-01-01

    A detailed understanding of rock structure and its influence on fluid entrapment, storage capacity, and flow behavior can improve the effective utilization and design of methods to increase the recovery of oil and gas from petroleum reservoirs. The dynamics of fluid flow and trapping phenomena in porous media were investigated. Miscible and immiscible displacement experiments in heterogeneous Berea and Shannon sandstone samples were monitored using X-ray computed tomography (CT scanning) to determine the effect of heterogeneities on fluid flow and trapping. The statistical analysis of pore and pore throat sizes in thin sections cut from these sandstone samples enabled the delineation of small-scale spatial distributions of porosity and permeability. Multiphase displacement experiments were conducted with micromodels constructed using thin slabs of the sandstones. The combination of the CT scanning, thin section, and micromodel techniques enables the investigation of how variations in pore characteristics influence fluid front advancement, fluid distributions, and fluid trapping. Plugs cut from the sandstone samples were investigated using high resolution nuclear magnetic resonance imaging, permitting the visualization of oil, water or both within individual pores. The application of these insights will aid in the proper interpretation of relative permeability, capillary pressure, and electrical resistivity data obtained from whole core studies. 7 refs., 14 figs., 2 tabs.

  4. Correlation Techniques as Applied to Pose Estimation in Space Station Docking

    NASA Technical Reports Server (NTRS)

    Rollins, J. Michael; Juday, Richard D.; Monroe, Stanley E., Jr.

    2002-01-01

    The telerobotic assembly of space-station components has become the method of choice for the International Space Station (ISS) because it offers a safe alternative to the more hazardous option of space walks. The disadvantage of telerobotic assembly is that it does not provide for direct arbitrary views of mating interfaces for the teleoperator. Unless cameras are present very close to the interface positions, such views must be generated graphically, based on calculated pose relationships derived from images. To assist in this photogrammetric pose estimation, circular targets, or spots, of high contrast have been affixed on each connecting module at carefully surveyed positions. The appearance of a subset of spots essentially must form a constellation of specific relative positions in the incoming digital image stream in order for the docking to proceed. Spot positions are expressed in terms of their apparent centroids in an image. The precision of centroid estimation is required to be as fine as 1/20th pixel, in some cases. This paper presents an approach to spot centroid estimation using cross correlation between spot images and synthetic spot models of precise centration. Techniques for obtaining sub-pixel accuracy and for shadow, obscuration and lighting irregularity compensation are discussed.

  5. Applied Focused Ion Beam Techniques for Sample Preparation of Astromaterials for Integrated Nano-Analysis

    SciTech Connect

    Graham, G A; Teslich, N E; Kearsley, A T; Stadermann, F J; Stroud, R M; Dai, Z R; Ishii, H A; Hutcheon, I D; Bajt, S; Snead, C J; Weber, P K; Bradley, J P

    2007-02-20

    Sample preparation is always a critical step in study of micrometer sized astromaterials available for study in the laboratory, whether their subsequent analysis is by electron microscopy or secondary ion mass spectrometry. A focused beam of gallium ions has been used to prepare electron transparent sections from an interplanetary dust particle, as part of an integrated analysis protocol to maximize the mineralogical, elemental, isotopic and spectroscopic information extracted from one individual particle. In addition, focused ion beam techniques have been employed to extract cometary residue preserved on the rims and walls of micro-craters in 1100 series aluminum foils that were wrapped around the sample tray assembly on the Stardust cometary sample collector. Non-ideal surface geometries and inconveniently located regions of interest required creative solutions. These include support pillar construction and relocation of a significant portion of sample to access a region of interest. Serial sectioning, in a manner similar to ultramicrotomy, is a significant development and further demonstrates the unique capabilities of focused ion beam microscopy for sample preparation of astromaterials.

  6. An alternative approach for delineating eco-sensitive zones around a wildlife sanctuary applying geospatial techniques.

    PubMed

    Deb, Shovik; Ahmed, Akram; Datta, Debajit

    2014-04-01

    The dynamics, degradation, and conservation of forest ecosystems are matters of prime concern worldwide at present. Proper planning and management of a forest area are essential to protect it from the growing pressure of urban-industrial sprawl. Establishment of eco-sensitive zones (ESZs), which act as buffer areas around the core forests, is one of the key approaches towards achieving this goal. This paper deals with the applicability of geospatial techniques to identify the ESZ around an Indian wildlife sanctuary following the different rules and acts prescribed by the Government of India. Gumti Wildlife Sanctuary, located in the northeastern state of Tripura in India, has been selected here as a case study. Collected information on the distribution of biodiversity and the human population in the area was also used to make the approach more holistic. As inferred from this study, remote sensing and geographical information systems were found to be easily implementable, time-effective and cost-effective tools for this purpose, with the distinct advantage of spatial as well as temporal accuracy in identifying the existing land use and land cover patterns in pilot assessments. However, the results indicated that only an appropriate hybridization of field-based information on the biodiversity and ecological aspects of the forest, as well as on patterns of human interference, with the remote sensing and GIS-based data could make this approach more relevant in actual implementations. PMID:24338098

  7. Percutaneous implantation of gastric electrodes - a novel technique applied in animals and in patients.

    PubMed

    Elfvin, A; Andersson, S; Abrahamsson, H; Edebo, A; Simrén, M; Lönroth, H

    2007-02-01

    Temporary electrodes implanted under general anaesthesia, or via an oral or percutaneous endoscopic gastrostomy route, have been used for testing of gastric electrical stimulation (GES). We have developed a principle for percutaneous electrode implantation. Leads were constructed so that the tip could be anchored to the gastric submucosa under gastroscopic control. Acute experiments were performed in anaesthetized pigs. Three patients referred for nausea and/or vomiting and non-established indications for GES (chronic intestinal pseudo-obstruction, functional dyspepsia without gastroparesis) were evaluated. Electrode function was tested by recording and stimulation techniques. In the pigs, a slow-wave (SW) rhythm (3 min⁻¹) was recorded, with a decrease in frequency at the end of the experiments. In the patients, implantation time from start of gastroscopy to end of electrode placement was 12-20 min. Electrode distance varied from 12 to 45 mm. Gastric electromyography showed a regular SW rhythm of about 3 min⁻¹. Antral pressure waves had intervals being multiples of the SW-to-SW time. With temporary GES for 7-9 days, the weekly frequency of the referral symptoms decreased >80% in two patients and 33% in one patient. Temporary percutaneous gastric leads can easily be implanted and may be used for testing of GES and study of gastric electrophysiology. PMID:17244164

  8. Analysis and simulation of wireless signal propagation applying geostatistical interpolation techniques

    NASA Astrophysics Data System (ADS)

    Kolyaie, S.; Yaghooti, M.; Majidi, G.

    2011-12-01

    This paper is part of ongoing research to examine the capability of geostatistical analysis for mobile network coverage prediction, simulation and tuning. Mobile network coverage predictions are used to find network coverage gaps and areas with poor serviceability. They are essential data for engineering and management in order to make better decisions regarding rollout, planning and optimisation of mobile networks. The objective of this research is to evaluate different interpolation techniques in coverage prediction. In the method presented here, raw data collected by drive-testing a sample of roads in the study area are analysed and various continuous surfaces are created using different interpolation methods. Two general interpolation methods are used in this paper with different variables: first, Inverse Distance Weighting (IDW) with various powers and numbers of neighbours, and second, ordinary kriging with Gaussian, spherical, circular and exponential semivariogram models and different numbers of neighbours. For the comparison of results, we used check points coming from the same drive-test data. Prediction values for the check points are extracted from each surface and the differences from the actual values are computed. The output of this research helps in finding an optimised and accurate model for coverage prediction.
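
    Of the two interpolation families compared above, IDW is the simpler to write down explicitly: each prediction is a weighted average of the nearest samples with weights 1/d^power. The sketch below applies it to invented drive-test signal levels and hold-out check points; the kriging variants and semivariogram fitting are not reproduced here.

      import numpy as np

      def idw(xy_known, z_known, xy_query, power=2.0, n_neighbors=8):
          """Inverse Distance Weighting: each query point is a weighted mean of the
          n_neighbors nearest samples, with weights 1/d**power."""
          z_pred = np.empty(len(xy_query))
          for i, q in enumerate(xy_query):
              d = np.linalg.norm(xy_known - q, axis=1)
              idx = np.argsort(d)[:n_neighbors]
              d_sel = d[idx]
              if d_sel[0] == 0.0:                    # query coincides with a sample
                  z_pred[i] = z_known[idx[0]]
                  continue
              w = 1.0 / d_sel ** power
              z_pred[i] = np.sum(w * z_known[idx]) / np.sum(w)
          return z_pred

      rng = np.random.default_rng(0)
      xy_drive = rng.uniform(0, 1000, size=(300, 2))               # drive-test sample locations (m)
      rssi = -60 - 0.03 * xy_drive[:, 0] + rng.normal(0, 2, 300)   # invented signal level (dBm)

      xy_check = rng.uniform(0, 1000, size=(50, 2))                # hold-out check points
      rssi_true = -60 - 0.03 * xy_check[:, 0] + rng.normal(0, 2, 50)

      for p in (1.0, 2.0, 3.0):
          pred = idw(xy_drive, rssi, xy_check, power=p)
          rmse = np.sqrt(np.mean((pred - rssi_true) ** 2))
          print(f"IDW power {p:.0f}: RMSE = {rmse:.2f} dB")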

  9. Hyperspectral imaging techniques applied to the monitoring of wine waste anaerobic digestion process

    NASA Astrophysics Data System (ADS)

    Serranti, Silvia; Fabbri, Andrea; Bonifazi, Giuseppe

    2012-11-01

    An anaerobic digestion process aimed at biogas production is characterized by different steps involving the variation of chemical and physical parameters related to the presence of specific biomasses, such as pH, chemical oxygen demand (COD), volatile solids, nitrate (NO₃⁻) and phosphate (PO₄³⁻). A correct process characterization requires periodic sampling of the organic mixture in the reactor and further analysis of the samples by traditional chemical-physical methods. Such an approach is discontinuous, time-consuming and expensive. A new analytical approach based on hyperspectral imaging in the NIR range (1000 to 1700 nm) is investigated and critically evaluated, with reference to the monitoring of a wine waste anaerobic digestion process. The proposed technique was applied to identify and demonstrate the correlation, in terms of quality and reliability of the results, between "classical" chemical-physical parameters and the spectral features of the digestate samples. Good results were obtained, ranging from R2 = 0.68 and RMSECV = 12.83 mg/l for nitrate to R2 = 0.90 and RMSECV = 5495.16 mg O2/l for COD. The proposed approach seems very useful for setting up innovative control strategies allowing full, continuous control of the anaerobic digestion process.
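
    The calibration statistics quoted above (R2 and RMSECV per parameter) are the kind of figures produced by cross-validating a multivariate regression of the reference values on the spectra. The sketch below assumes a PLS regression, a common choice for NIR data, although the abstract does not state which regression method produced its figures; the spectra, COD values and number of components are all synthetic.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(5)

      # Synthetic NIR-like spectra: 60 digestate samples x 170 wavelengths (1000-1700 nm band),
      # with the "COD" reference value encoded in two broad spectral features plus noise.
      n_samples, n_wavelengths = 60, 170
      wl = np.linspace(1000, 1700, n_wavelengths)
      cod = rng.uniform(5000, 60000, n_samples)                      # mg O2/l, invented range
      band1 = np.exp(-0.5 * ((wl - 1200) / 40) ** 2)
      band2 = np.exp(-0.5 * ((wl - 1450) / 60) ** 2)
      spectra = (np.outer(cod, band1) * 1e-5 + np.outer(60000 - cod, band2) * 1e-5
                 + 0.02 * rng.normal(size=(n_samples, n_wavelengths)))

      # Cross-validated predictions give RMSECV and a cross-validated R2.
      pls = PLSRegression(n_components=4)
      cod_cv = cross_val_predict(pls, spectra, cod, cv=10).ravel()

      rmsecv = np.sqrt(np.mean((cod_cv - cod) ** 2))
      r2 = 1.0 - np.sum((cod_cv - cod) ** 2) / np.sum((cod - cod.mean()) ** 2)
      print(f"RMSECV = {rmsecv:.0f} mg O2/l, R2(CV) = {r2:.2f}")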

  10. Experimental Studies of Active and Passive Flow Control Techniques Applied in a Twin Air-Intake

    PubMed Central

    Joshi, Shrey; Jindal, Aman; Maurya, Shivam P.; Jain, Anuj

    2013-01-01

    Flow control in twin air-intakes is necessary to improve performance characteristics, since the flow traveling through curved and diffused paths becomes complex, especially after merging. The paper presents a comparison between two well-known techniques of flow control: active and passive. It presents an effective design of a vortex generator jet (VGJ) and a vane-type passive vortex generator (VG) and uses them in a twin air-intake duct in different combinations to establish their effectiveness in improving the performance characteristics. The VGJ is designed to inject flow from the side wall at pitch angles of 90 degrees and 45 degrees. Corotating (parallel) and counter-rotating (V-shaped) configurations of the vane-type VG are used. It is observed that the VGJ has the potential to change the flow pattern drastically compared with the vane-type VG. When the VGJ is directed perpendicular to the side walls of the air-intake at a pitch angle of 90 degrees, static pressure recovery increases by 7.8% and total pressure loss is reduced by 40.7%, the best among all cases tested for the VGJ. For the larger VG attached to the side walls of the air-intake, static pressure recovery increases by 5.3%, but total pressure loss is reduced by only 4.5% compared with all other VG cases. PMID:23935422
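
    For clarity, the sketch below writes out the two performance metrics quoted above as they are commonly defined for intake ducts (static pressure recovery and total pressure loss, both normalized by the inlet dynamic pressure); the paper's exact normalization and measurement stations are not reproduced, and the numbers are placeholders.

```python
# Common duct-performance metrics with placeholder pressures [Pa].
def static_pressure_recovery(p_static_out, p_static_in, q_in):
    """Cp = (p_s,out - p_s,in) / q_in, with q_in the inlet dynamic pressure."""
    return (p_static_out - p_static_in) / q_in

def total_pressure_loss(p_total_in, p_total_out, q_in):
    """Loss coefficient = (p_t,in - p_t,out) / q_in."""
    return (p_total_in - p_total_out) / q_in

q_in = 1500.0
print(static_pressure_recovery(101900.0, 101325.0, q_in))  # ~0.38
print(total_pressure_loss(102900.0, 102750.0, q_in))       # ~0.10
```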

  11. Applying satellite remote sensing technique in disastrous rainfall systems around Taiwan

    NASA Astrophysics Data System (ADS)

    Liu, Gin-Rong; Chen, Kwan-Ru; Kuo, Tsung-Hua; Liu, Chian-Yi; Lin, Tang-Huang; Chen, Liang-De

    2016-05-01

    Many people in Asian regions suffer from disastrous rainfall year after year. The rainfall from typhoons or tropical cyclones (TCs) is one of their key water supply sources, but such TCs may also bring unexpected heavy rainfall, causing flash floods, mudslides or other disasters. Present techniques cannot stop or change a TC's route or intensity. However, heavy casualties and economic losses could be significantly mitigated if a TC's formation were known earlier and its rainfall amount and distribution were estimated more accurately before landfall. In light of these problems, this short article presents methods to detect a TC's formation earlier and to delineate its rainfall potential pattern more accurately in advance. For the first part, satellite-retrieved air-sea parameters are used to estimate the thermal and dynamic energy fields and their variation over open oceans, in order to delineate ocean areas and cloud clusters with a high possibility of typhoon occurrence. For the second part, an improved tropical rainfall potential (TRaP) model is proposed, with better assumptions than the original TRaP for TC rain-band rotation, rainfall amount estimation, and topographic correction, to obtain more accurate TC rainfall distributions, especially for hilly and mountainous areas such as Taiwan.
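
    The basic TRaP idea referred to above can be sketched as an accumulated rainfall estimate equal to the mean satellite rain rate along the track multiplied by the residence time of the rain area (rain-area extent divided by storm translation speed); the refinements proposed in the paper (rain-band rotation, topographic correction) are not modelled here, and all numbers are illustrative.

```python
# Rough TRaP-style rainfall potential estimate (illustrative values only).
def trap_rain_potential(mean_rain_rate_mm_per_h, rain_area_extent_km,
                        storm_speed_km_per_h):
    residence_time_h = rain_area_extent_km / storm_speed_km_per_h
    return mean_rain_rate_mm_per_h * residence_time_h   # accumulated mm

# e.g. 12 mm/h average rate, 300 km rain swath, storm moving at 20 km/h
print(trap_rain_potential(12.0, 300.0, 20.0), "mm")      # 180 mm
```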

  12. Remote field eddy current technique applied to the inspection of nonmagnetic steam generator tubes

    NASA Astrophysics Data System (ADS)

    Shin, Young-Kil; Chung, Tae-Eon; Lord, William

    2001-04-01

    As steam generator (SG) tubes have aged, new and subtle degradations have appeared, most of which start growing from the outside of the tubes. Since outer-diameter defects might not be detected by conventional eddy current testing due to skin-effect phenomena, this paper studies the feasibility of using the remote field eddy current (RFEC) technique, which has shown equal sensitivity to inner diameter (ID) and outer diameter (OD) defects in ferromagnetic pipe inspection. Finite element modeling studies show that the operating frequency needs to be increased up to a few hundred kHz in order for RFEC effects to occur in the nonmagnetic SG tube. The proper distance between exciter and sensor coils is found to be 1.5 OD, half the distance used in ferromagnetic pipe inspection. The resulting defect signals show equal sensitivity to ID and OD defects. These results demonstrate the superior capability of the proposed RFEC probe, compared with the differential ECT probe, in detecting OD defects.
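
    The frequency argument above follows from the standard skin-depth relation; the hedged sketch below evaluates delta = sqrt(2/(omega*mu*sigma)) for a nominal nonmagnetic (mu_r ~ 1) Inconel-type tube material, with conductivity values that are typical rather than taken from the paper.

```python
# Skin depth versus frequency for a nominal nonmagnetic SG tube material.
import math

MU0 = 4e-7 * math.pi  # vacuum permeability [H/m]

def skin_depth(freq_hz, conductivity_s_per_m, mu_r=1.0):
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2.0 / (omega * mu_r * MU0 * conductivity_s_per_m))

sigma_inconel = 1.0e6      # S/m, nominal value for an Inconel-type alloy
for f in (10e3, 100e3, 300e3):
    print(f"{f/1e3:6.0f} kHz -> skin depth {skin_depth(f, sigma_inconel)*1e3:.2f} mm")
```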

  13. Blade Displacement Measurement Technique Applied to a Full-Scale Rotor Test

    NASA Technical Reports Server (NTRS)

    Abrego, Anita I.; Olson, Lawrence E.; Romander, Ethan A.; Barrows, Danny A.; Burner, Alpheus W.

    2012-01-01

    Blade displacement measurements using multi-camera photogrammetry were acquired during the full-scale wind tunnel test of the UH-60A Airloads rotor, conducted in the National Full-Scale Aerodynamics Complex 40- by 80-Foot Wind Tunnel. The objectives were to measure the blade displacement and deformation of the four rotor blades as they rotated through the entire rotor azimuth. These measurements are expected to provide a unique dataset to aid in the development and validation of rotorcraft prediction techniques. They are used to resolve the blade shape and position, including pitch, flap, lag and elastic deformation. Photogrammetric data encompass advance ratios from 0.15 to slowed rotor simulations of 1.0, thrust coefficient to rotor solidity ratios from 0.01 to 0.13, and rotor shaft angles from -10.0 to 8.0 degrees. An overview of the blade displacement measurement methodology and system development, descriptions of image processing, uncertainty considerations, preliminary results covering static and moderate advance ratio test conditions and future considerations are presented. Comparisons of experimental and computational results for a moderate advance ratio forward flight condition show good trend agreements, but also indicate significant mean discrepancies in lag and elastic twist. Blade displacement pitch measurements agree well with both the wind tunnel commanded and measured values.
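
    As a simplified illustration of the photogrammetric step (not the NASA processing chain), the sketch below triangulates a single blade target from two calibrated cameras by linear (DLT) least squares; the projection matrices and image coordinates are synthetic placeholders.

```python
# Two-camera linear triangulation of a single target (synthetic geometry).
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Return the 3-D point minimizing the linear reprojection residual."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Two synthetic cameras looking at a point 5 m away, 1 m baseline
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.4, 0.2, 5.0, 1.0])
uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate(P1, P2, uv1, uv2))   # ~ [0.4, 0.2, 5.0]
```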

  14. Morphological analysis of the flippers in the Franciscana dolphin, Pontoporia blainvillei, applying X-ray technique.

    PubMed

    Del Castillo, Daniela Laura; Panebianco, María Victoria; Negri, María Fernanda; Cappozzo, Humberto Luis

    2014-07-01

    Pectoral flippers of cetaceans function to provide stability and maneuverability during locomotion. Directional asymmetry (DA) is a common feature among odontocete cetaceans, as is sexual dimorphism (SD). For the first time, DA, allometry, physical maturity, and SD of the flipper skeleton of Pontoporia blainvillei were analyzed using the X-ray technique. The number of carpals, metacarpals, and phalanges, and morphometric characters of the humerus, radius, ulna, and digit two, were studied in franciscana dolphins from Buenos Aires, Argentina. The number of visible epiphyses and their degree of fusion at the proximal and distal ends of the humerus, radius, and ulna were also analyzed. The flipper skeleton was symmetrical, showing a negative allometric trend, with similar growth patterns in both sexes with the exception of the width of the radius (P ≤ 0.01). SD was found in the number of phalanges of digit two (P ≤ 0.01) and in ulna and digit two lengths. Females showed a relatively longer ulna and a relatively shorter digit two, and the opposite occurred in males (P ≤ 0.01). The epiphyseal fusion pattern proved to be a tool for determining the dolphins' age; franciscana dolphins with a mature flipper were at least four years old. This study indicates that the flippers of franciscana dolphins are symmetrical, that both sexes show a negative allometric trend, that SD is observed in the radius, ulna, and digit two, and that the flipper skeleton allows the age class of the dolphins to be determined. PMID:24700648
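
    The allometry result above rests on the usual log-log regression of part size against body size, with a slope below 1 indicating negative allometry; a minimal sketch with synthetic (not franciscana) data is given below.

```python
# Log-log allometric regression on synthetic measurements.
import numpy as np

rng = np.random.default_rng(2)
body_len = rng.uniform(100, 160, 40)                                # cm
ulna_len = 0.8 * body_len**0.85 * np.exp(rng.normal(0, 0.03, 40))   # cm

slope, intercept = np.polyfit(np.log(body_len), np.log(ulna_len), 1)
print(f"allometric exponent b = {slope:.2f}")
print("negative allometry" if slope < 1 else "positive allometry or isometry")
```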

  15. Comparison of infrared and 3D digital image correlation techniques applied for mechanical testing of materials

    NASA Astrophysics Data System (ADS)

    Krstulović-Opara, Lovre; Surjak, Martin; Vesenjak, Matej; Tonković, Zdenko; Kodvanj, Janoš; Domazet, Željko

    2015-11-01

    To investigate the applicability of infrared thermography as a tool for acquiring dynamic yielding in metals, a comparison of infrared thermography with three-dimensional digital image correlation has been made. Dynamic tension tests and three-point bending tests of aluminum alloys were performed to evaluate the results obtained by IR thermography and to identify the capabilities and limits of the two methods. Both approaches detect the migration of the plastification zone during the yielding process. The results of the tension test and the three-point bending test proved the validity of the IR approach as a method for evaluating the dynamic yielding process when used on complex structures such as cellular porous materials. The stability of the yielding process in the three-point bending test, in contrast to the fluctuation of the plastification front in the tension test, is of great importance for the validation of numerical constitutive models. The research demonstrated the strong performance, robustness and reliability of the IR approach for evaluating yielding during dynamic loading processes, while the 3D DIC method proved superior in the low-velocity loading regimes. This research, based on two basic tests, confirmed the conclusions and suggestions presented in our previous research on porous materials, where mid-wave infrared thermography was applied.

  16. Zoneless and Mixture techniques applied to the Integrated Brazilian PSHA using GEM-OpenQuake

    NASA Astrophysics Data System (ADS)

    Pirchiner, M.; Drouet, S.; Assumpcao, M.

    2013-12-01

    The main goal of this work is to propose variations on the classic Probabilistic Seismic Hazard Analysis (PSHA) calculation: on one hand, applying the zoneless methodology to characterize seismic source activity; on the other, using Gaussian mixture models to combine Ground Motion Prediction Equation (GMPE) models into a mixed model. Our current knowledge of Brazilian intraplate seismicity does not allow us to identify the causative neotectonic active faults with confidence. This makes it difficult to characterize the main seismic sources and to compute the Gutenberg-Richter relation. Indeed, seismic zonations made by different specialists can differ widely, whereas the zoneless approach imposes a quantitative method for seismic source characterization and avoids subjective source-zone definition. In addition, the low seismicity rate and the limited coverage in space and time of the seismic networks do not offer enough observations to fit a confident GMPE for this region. Our purpose was therefore to use a Gaussian mixture model to estimate a composite model, built from pre-existing well-fitted GMPEs, that better describes the observed peak ground motion data. The other methodological evaluation is the use of the OpenQuake engine (a Global Earthquake Model initiative) for the hazard calculation. The logic-tree input will allow us, in the near future, to combine, with weights, other hazard models from different specialists. We expect these results to offer a new and solid basis for upgrading the Brazilian civil engineering seismic rules.
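
    The zoneless idea can be illustrated by kernel-smoothing catalogue epicentres into a spatial activity-rate grid instead of drawing subjective source zones; the minimal sketch below assumes a fixed isotropic Gaussian kernel and uses a synthetic catalogue, not the Brazilian one.

```python
# Kernel-smoothed (zoneless) activity-rate grid from synthetic epicentres.
import numpy as np

def kernel_rate_grid(epicentres_deg, grid_lon, grid_lat, bandwidth_deg=0.5,
                     years=50.0):
    """Events per year per unit area, smoothed onto a lon/lat grid."""
    lon_g, lat_g = np.meshgrid(grid_lon, grid_lat)
    rate = np.zeros_like(lon_g)
    for lon_e, lat_e in epicentres_deg:
        d2 = (lon_g - lon_e) ** 2 + (lat_g - lat_e) ** 2
        rate += np.exp(-d2 / (2 * bandwidth_deg**2))
    rate /= (2 * np.pi * bandwidth_deg**2) * years
    return rate

rng = np.random.default_rng(3)
catalogue = rng.normal([-48.0, -23.0], 1.0, size=(100, 2))   # synthetic epicentres
grid = kernel_rate_grid(catalogue, np.linspace(-52, -44, 81),
                        np.linspace(-27, -19, 81))
print("peak smoothed rate:", grid.max())
```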

  17. Linear and non-linear control techniques applied to actively lubricated journal bearings

    NASA Astrophysics Data System (ADS)

    Nicoletti, R.; Santos, I. F.

    2003-03-01

    The main objectives of actively lubricated bearings are the simultaneous reduction of wear and vibration between rotating and stationary machinery parts. To reduce wear and dissipate vibration energy up to certain limits, one can use conventional hydrodynamic lubrication. For further reduction of shaft vibration, active lubrication can be used, which is based on injecting pressurized oil into the bearing gap through orifices machined in the bearing sliding surface. The design and efficiency of some linear (PD, PI and PID) and a non-linear controller, applied to a tilting-pad journal bearing, are analysed and discussed. Important conclusions are reached about the application of integral controllers, which are responsible for changing the rotor-bearing equilibrium position and consequently the "passive" oil film damping coefficients. Numerical results show an effective reduction of the unbalance response of a rigid rotor, with the PD and the non-linear P controllers showing better performance over the frequency range of study (0-80 Hz). The feasibility of eliminating rotor-bearing instabilities (whirl phenomena) by using active lubrication is also investigated, clearly illustrating one of its most promising applications.
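
    As a hedged illustration of the linear control laws compared above, the sketch below applies a discrete PID law to a single-degree-of-freedom rotor surrogate excited by unbalance, with the control force standing in for the oil-injection action; the plant, gains and sample time are illustrative assumptions, not the authors' rotor-bearing model.

```python
# Discrete PID control of a 1-DOF rotor surrogate under unbalance excitation.
import numpy as np

def simulate(kp=2.0e5, ki=5.0e5, kd=1.0e3, dt=1e-4, t_end=0.5):
    m, c, k = 10.0, 50.0, 1.0e6          # simple mass-damper-stiffness surrogate
    x = v = integ = prev_err = 0.0
    omega = 2 * np.pi * 40.0             # 40 Hz unbalance excitation
    peak = 0.0
    for n in range(int(t_end / dt)):
        t = n * dt
        err = -x                         # regulate shaft displacement to zero
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv   # control force (oil injection)
        prev_err = err
        f_unb = 100.0 * np.sin(omega * t)        # unbalance force
        a = (f_unb + u - c * v - k * x) / m
        v += a * dt                              # semi-implicit Euler step
        x += v * dt
        if t > 0.3:                              # record steady-state peak
            peak = max(peak, abs(x))
    return peak

print("steady peak with PID    :", simulate())
print("steady peak uncontrolled:", simulate(kp=0, ki=0, kd=0))
```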

  18. Ultrasonic Nondestructive Evaluation Techniques Applied to the Quantitative Characterization of Textile Composite Materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1997-01-01

    In this Progress Report, we describe our further development of advanced ultrasonic nondestructive evaluation methods applied to the characterization of anisotropic materials. We present images obtained from experimental measurements of ultrasonic diffraction patterns transmitted through water only and through water and a thin woven composite. All images of diffraction patterns are included on the accompanying CD-ROM in JPEG format and Adobe (TM) Portable Document Format (PDF), in addition to the hardcopies of the images contained in this report. In our previous semi-annual Progress Report (NAG 1-1848, December 1996), we proposed a simple model to simulate the effect of a thin woven composite on an insonifying ultrasonic pressure field. This initial approach provided an avenue to begin development of a robust measurement method for nondestructive evaluation of anisotropic materials. In this Progress Report, we extend that work by performing experimental measurements on a single layer of a five-harness biaxial woven composite to investigate how a thin, yet architecturally complex, material interacts with the insonifying ultrasonic field. Section 2 of this Progress Report describes the experimental arrangement and the methods for data acquisition of the ultrasonic diffraction patterns transmitted through a thin woven composite, and briefly describes the thin composite specimen investigated. Section 3 details the analysis of the experimental data, followed by the experimental results in Section 4. Finally, a discussion of the observations and conclusions is given in Section 5.

  19. Evaluation and optimisation of bacterial genomic DNA extraction for no-culture techniques applied to vinegars.

    PubMed

    Mamlouk, Dhouha; Hidalgo, Claudio; Torija, María-Jesús; Gullo, Maria

    2011-10-01

    Direct genomic DNA extraction from vinegars was set up, and its suitability for PCR assays was evaluated by PCR/DGGE and sequencing of the 16S rRNA gene. The method was tested on 12 intermediary products of special vinegars, fruit vinegars and condiments produced from different raw materials and procedures. DNA extraction was performed on pellets by chemical, enzymatic and resin-mediated methods and their modifications. Suitable yield and DNA purity were obtained by modifying a method based on the use of PVP/CTAB to remove polyphenolic components and exopolysaccharides. By sequencing bands from the DGGE gel, Gluconacetobacter europaeus, Acetobacter malorum/cerevisiae and Acetobacter orleanensis were detected as the main species in samples with more than 4% acetic acid content. In samples with no acetic acid content, sequences retrieved from excised bands showed high similarity to prokaryotes with no role in vinegar fermentation: Burkholderia spp., Cupriavidus spp., Lactococcus lactis and Leuconostoc mesenteroides. The method is suitable for culture-independent study of vinegars containing polyphenols and exopolysaccharides, allowing a more complete assessment of vinegar bacteria. PMID:21839388

  20. Hybrid Technique in SCALE for Fission Source Convergence Applied to Used Nuclear Fuel Analysis

    SciTech Connect

    Ibrahim, Ahmad M; Peplow, Douglas E.; Bekar, Kursat B; Celik, Cihangir; Scaglione, John M; Ilas, Dan; Wagner, John C

    2013-01-01

    The new hybrid SOURCE ConveRgence accelERator (SOURCERER) sequence in SCALE deterministically computes a fission distribution and uses it as the starting source in a Monte Carlo eigenvalue criticality calculation. In addition to taking the guesswork out of defining an appropriate, problem-dependent starting source, the more accurate starting source provided by the deterministic calculation decreases the probability of producing inaccurate tally estimates associated with undersampling problems caused by inadequate source convergence. Furthermore, SOURCERER can increase the efficiency of the overall simulation by decreasing the number of cycles that have to be skipped before keff accumulation begins. SOURCERER was applied to a representative example of a used nuclear fuel cask utilized at the Maine Yankee storage site (Scaglione and Ilas). Because of the time constraints of the Used Fuel Research, Development, and Demonstration project, it was found that using more than 30,000 neutrons per cycle would lead to an inaccurate Monte Carlo calculation of keff due to the inevitable decrease in the number of skipped and active cycles used with this problem. For a fixed uncertainty objective and using 30,000 neutrons per cycle, the use of SOURCERER increased the efficiency of the keff calculation by 60% compared with a Monte Carlo calculation that used a starting source distributed uniformly in fissionable regions, even with the inclusion of the extra computational time required by the deterministic calculation. Additionally, the use of SOURCERER increased the reliability of the keff calculation for any number of skipped cycles below 350.
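
    The benefit of an informed starting source can be illustrated conceptually (this is not the SCALE implementation) with a toy fission-matrix source iteration: starting from a cheap coarse estimate of the dominant eigenvector typically converges in fewer cycles than starting from a uniform source.

```python
# Toy source iteration: uniform versus deterministically informed starting source.
import numpy as np

def cycles_to_converge(F, s0, tol=1e-6, max_cycles=1000):
    """Source iterations needed until the normalized fission source settles."""
    s = s0 / s0.sum()
    for cycle in range(1, max_cycles + 1):
        s_new = F @ s
        s_new /= s_new.sum()
        if np.linalg.norm(s_new - s, 1) < tol:
            return cycle
        s = s_new
    return max_cycles

n = 50
x = np.linspace(0, 1, n)
# Toy fission matrix: strong coupling within each half, weak coupling across,
# left half slightly more reactive (a stand-in for a loosely coupled system).
F = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.05)
F[: n // 2, n // 2 :] *= 0.01
F[n // 2 :, : n // 2] *= 0.01
F[: n // 2] *= 1.2

uniform_start = np.ones(n)
# "Deterministic" start: dominant eigenvector of a coarsened (every 5th cell) matrix
w, v = np.linalg.eig(F[::5, ::5])
informed_start = np.abs(v[:, np.argmax(w.real)].real).repeat(5)

print("cycles from uniform start :", cycles_to_converge(F, uniform_start))
print("cycles from informed start:", cycles_to_converge(F, informed_start))
```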