Sample records for homogenization methods

  1. Homogenization versus homogenization-free method to measure muscle glycogen fractions.

    PubMed

    Mojibi, N; Rasouli, M

    2016-12-01

    Glycogen is extracted from animal tissues with or without homogenization, using cold perchloric acid. Three methods for determining glycogen in rat muscle were compared at different physiological states. Two groups of five rats were kept either at rest or subjected to 45 minutes of muscular activity. The glycogen fractions were extracted and measured by the three methods. The data from the homogenization method show that total glycogen decreased following 45 minutes of physical activity and that the change occurred entirely in acid-soluble glycogen (ASG), while acid-insoluble glycogen (AIG) did not change significantly. Similar results were obtained with the "total-glycogen-fractionation" method. The findings of the "homogenization-free" method indicate that the acid-insoluble fraction (AIG) was the main portion of muscle glycogen and that most of the changes occurred in the AIG fraction. The results of the "homogenization" method are identical to those of "total glycogen fractionation" but differ from those of the "homogenization-free" protocol. The ASG fraction is the major portion of muscle glycogen and is the more metabolically active form.

  2. Utilizing Hierarchical Clustering to improve Efficiency of Self-Organizing Feature Map to Identify Hydrological Homogeneous Regions

    NASA Astrophysics Data System (ADS)

    Farsadnia, Farhad; Ghahreman, Bijan

    2016-04-01

    Identifying hydrologically homogeneous groups is both fundamental and applied research in hydrology. Clustering methods are among the conventional tools for delineating hydrologically homogeneous regions. Recently, the Self-Organizing feature Map (SOM) method has been applied in several studies; however, the main difficulty with this approach is interpreting its output map. Therefore, the SOM output is used as input to other clustering algorithms. The aim of this study is to apply a two-level Self-Organizing feature map combined with Ward hierarchical clustering to determine the hydrologically homogeneous regions in the North and Razavi Khorasan provinces. First, the dimension of the SOM input matrix was reduced by principal component analysis; the SOM was then used to form a two-dimensional feature map. To determine homogeneous regions for flood frequency analysis, the SOM output nodes were used as input to the Ward method. Generally, the regions identified by clustering algorithms are not statistically homogeneous, so they have to be adjusted to improve their homogeneity. After adjusting the regions with L-moment tests, five hydrologically homogeneous regions were identified. Finally, adjusted regions were created by the two-level SOM, and the best regional distribution function and associated parameters were selected by the L-moment approach. The results showed that the combination of self-organizing maps and Ward hierarchical clustering with principal components as input is more effective than hierarchical clustering alone, with either principal components or standardized inputs, in delineating hydrologically homogeneous regions.
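
    As an illustration of the two-level idea in the record above, the minimal sketch below trains a small self-organizing map on stand-in (PCA-reduced) catchment attributes and then groups the SOM codebook vectors with Ward hierarchical clustering from SciPy. The data, the 4x4 map size, and the choice of five regions are illustrative assumptions, not the study's actual configuration, and no L-moment homogeneity adjustment is included.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 3))          # stand-in for PCA-reduced catchment attributes

# --- level 1: train a tiny self-organizing map (plain NumPy, no SOM library) ---
n_nodes, dim = 16, X.shape[1]
W = rng.normal(size=(n_nodes, dim))    # codebook vectors (SOM nodes)
grid = np.array([(i, j) for i in range(4) for j in range(4)])  # 4x4 output map

for t in range(2000):
    x = X[rng.integers(len(X))]
    bmu = np.argmin(((W - x) ** 2).sum(axis=1))          # best-matching unit
    sigma = 2.0 * np.exp(-t / 1000)                      # shrinking neighborhood
    lr = 0.5 * np.exp(-t / 1000)                         # decaying learning rate
    d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
    h = np.exp(-d2 / (2 * sigma ** 2))                   # Gaussian neighborhood weights
    W += lr * h[:, None] * (x - W)

# --- level 2: Ward hierarchical clustering of the SOM codebook vectors ---
Z = linkage(W, method="ward")
node_region = fcluster(Z, t=5, criterion="maxclust")     # e.g. five candidate regions

# assign each catchment to the region of its best-matching SOM node
bmus = np.argmin(((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2), axis=1)
region = node_region[bmus]
print(np.bincount(region)[1:])                           # catchments per region
```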

  3. Mechanized syringe homogenization of human and animal tissues.

    PubMed

    Kurien, Biji T; Porter, Andrew C; Patel, Nisha C; Kurono, Sadamu; Matsumoto, Hiroyuki; Scofield, R Hal

    2004-06-01

    Tissue homogenization is a prerequisite to any fractionation schedule. A plethora of hands-on methods are available to homogenize tissues. Here we report a mechanized method for homogenizing animal and human tissues rapidly and easily. The Bio-Mixer 1200 (manufactured by Innovative Products, Inc., Oklahoma City, OK) utilizes the back-and-forth movement of two motor-driven disposable syringes, connected to each other through a three-way stopcock, to homogenize animal or human tissue. Using this method, we were able to homogenize human or mouse tissues (brain, liver, heart, and salivary glands) in 5 min. From sodium dodecyl sulfate-polyacrylamide gel electrophoresis analysis and a matrix-assisted laser desorption/ionization time-of-flight mass spectrometric enzyme assay for prolidase, we found that the homogenates obtained were as good as or better than those obtained using a manual glass-on-Teflon (DuPont, Wilmington, DE) homogenization protocol (all-glass tube and Teflon pestle). Use of the Bio-Mixer 1200 to homogenize animal or human tissue eliminates the need to work in the cold room, as is required with the other hands-on homogenization methods available, and frees up time for other experiments.

  4. Method of chaotic mixing and improved stirred tank reactors

    DOEpatents

    Muzzio, F.J.; Lamberto, D.J.

    1999-07-13

    The invention provides a method and apparatus for efficiently achieving a homogeneous mixture of fluid components by introducing said components, having a Reynolds number of between about ≤1 and about 500, into a vessel and continuously perturbing the mixing flow by altering the flow speed and mixing time until homogeneity is reached. This method prevents the components from aggregating into non-homogeneous segregated regions within said vessel during mixing and substantially reduces the time required for the admixed components to reach homogeneity. 19 figs.

  5. Spatial homogenization methods for pin-by-pin neutron transport calculations

    NASA Astrophysics Data System (ADS)

    Kozlowski, Tomasz

    For practical reactor core applications, low-order transport approximations such as SP3 have been shown to provide sufficient accuracy for both static and transient calculations with considerably less computational expense than the discrete ordinate or full spherical harmonics methods. These methods have been applied in several core simulators where homogenization was performed at the level of the pin cell. One of the principal problems has been to recover the error introduced by pin-cell homogenization. Two basic approaches to treat pin-cell homogenization error have been proposed: Superhomogenization (SPH) factors and Pin-Cell Discontinuity Factors (PDF). These methods are based on the well-established Equivalence Theory and Generalized Equivalence Theory to generate appropriate group constants. They are able to treat all sources of error together, allowing even few-group diffusion with one mesh per cell to reproduce the reference solution. A detailed investigation and consistent comparison of both homogenization techniques showed the potential of the PDF approach to improve the accuracy of core calculations, but also revealed its limitations. In principle, the method is applicable only for the boundary conditions at which it was created, i.e., for the boundary conditions considered during the homogenization process (normally zero current). Therefore, there exists a need to improve this method, making it more general and environment independent. The goal of the proposed general homogenization technique is to create a function that correctly predicts the appropriate correction factor from homogeneous information alone, i.e., a function derived from the heterogeneous solution that approximates the PDFs using only the homogeneous solution. It has been shown that the PDF can be well approximated by a least-squares polynomial fit of the non-dimensional heterogeneous solution and later used for PDF prediction from the homogeneous solution. This shows promise for PDF prediction at off-reference conditions, such as reactor transients, which produce conditions that cannot typically be anticipated a priori.
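
    The key step described above, approximating the pin-cell discontinuity factor by a least-squares polynomial fit so that it can later be predicted from the homogeneous solution alone, can be sketched as follows. The single "non-dimensional feature", the polynomial degree, and the synthetic training data are illustrative assumptions; the actual work fits PDFs obtained from reference heterogeneous transport solutions.

```python
import numpy as np

# Synthetic training set: pairs of (non-dimensional homogeneous-solution feature,
# reference pin-cell discontinuity factor). Both the feature choice and the
# numbers are illustrative assumptions, not reactor data.
rng = np.random.default_rng(1)
feature_train = np.linspace(0.6, 1.4, 25)
pdf_train = (1.0 + 0.15 * (feature_train - 1.0)
             - 0.05 * (feature_train - 1.0) ** 2
             + rng.normal(scale=0.002, size=25))

coeffs = np.polyfit(feature_train, pdf_train, deg=2)   # least-squares polynomial fit

def predict_pdf(feature):
    """Approximate the discontinuity factor from a homogeneous-solution feature."""
    return np.polyval(coeffs, feature)

print(predict_pdf(0.95))   # PDF estimate at an off-reference condition
```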

  6. Homogenization of periodic bi-isotropic composite materials

    NASA Astrophysics Data System (ADS)

    Ouchetto, Ouail; Essakhi, Brahim

    2018-07-01

    In this paper, we present a new method for homogenizing bi-periodic materials with bi-isotropic component phases. The presented method is a numerical approach based on the finite element method for computing the local electromagnetic properties. The homogenized constitutive parameters are expressed as a function of the macroscopic electromagnetic properties, which are obtained from the local properties. The obtained results are compared with the Unfolding Finite Element Method and the Maxwell-Garnett formulas.

  7. Homogeneity Pursuit

    PubMed Central

    Ke, Tracy; Fan, Jianqing; Wu, Yichao

    2014-01-01

    This paper explores the homogeneity of coefficients in high-dimensional regression, which extends the sparsity concept and is more general and suitable for many applications. Homogeneity arises when regression coefficients corresponding to neighboring geographical regions or a similar cluster of covariates are expected to be approximately the same. Sparsity corresponds to a special case of homogeneity in which a large cluster of coefficients is known to be zero. In this article, we propose a new method called clustering algorithm in regression via data-driven segmentation (CARDS) to explore homogeneity. New mathematical results are provided on the gain that can be achieved by exploring homogeneity. Statistical properties of two versions of CARDS are analyzed. In particular, the asymptotic normality of our proposed CARDS estimator is established, which reveals better estimation accuracy for homogeneous parameters than that obtained without homogeneity exploration. When our methods are combined with sparsity exploration, further efficiency can be achieved beyond the exploration of sparsity alone. This provides additional insights into the power of exploring low-dimensional structures in high-dimensional regression: homogeneity and sparsity. Our results also shed light on the properties of the fused Lasso. The newly developed method is further illustrated by simulation studies and applications to real data. Supplementary materials for this article are available online. PMID:26085701

  8. Homogenization-based interval analysis for structural-acoustic problem involving periodical composites and multi-scale uncertain-but-bounded parameters.

    PubMed

    Chen, Ning; Yu, Dejie; Xia, Baizhan; Liu, Jian; Ma, Zhengdong

    2017-04-01

    This paper presents a homogenization-based interval analysis method for the prediction of coupled structural-acoustic systems involving periodical composites and multi-scale uncertain-but-bounded parameters. In the structural-acoustic system, the macro plate structure is assumed to be composed of a periodically uniform microstructure. The equivalent macro material properties of the microstructure are computed using the homogenization method. By integrating the first-order Taylor expansion interval analysis method with the homogenization-based finite element method, a homogenization-based interval finite element method (HIFEM) is developed to solve a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters. The corresponding formulations of the HIFEM are deduced. A subinterval technique is also introduced into the HIFEM for higher accuracy. Numerical examples of a hexahedral box and an automobile passenger compartment are given to demonstrate the efficiency of the presented method for a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters.
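
    A minimal sketch of the first-order Taylor interval analysis that the HIFEM builds on is given below: the response interval for uncertain-but-bounded inputs is approximated as f(midpoint) plus or minus the sum of |sensitivity| times half-width, with sensitivities taken by finite differences. The toy response function stands in for the homogenized structural-acoustic model and is purely an assumption for illustration.

```python
import numpy as np

def taylor_interval(f, x_mid, x_half_width, h=1e-6):
    """First-order Taylor estimate of the response interval for
    uncertain-but-bounded inputs x in [x_mid - dx, x_mid + dx]."""
    x_mid = np.asarray(x_mid, float)
    dx = np.asarray(x_half_width, float)
    y0 = f(x_mid)
    # finite-difference sensitivities with respect to each uncertain parameter
    grad = np.array([(f(x_mid + h * e) - y0) / h for e in np.eye(len(x_mid))])
    radius = np.abs(grad) @ dx
    return y0 - radius, y0 + radius

# Toy scalar response depending on two bounded parameters (a stand-in for a
# homogenized material property and a fluid property, not an FE model).
f = lambda x: x[0] ** 2 / (1.0 + x[1])
lo, hi = taylor_interval(f, x_mid=[2.0, 0.5], x_half_width=[0.1, 0.05])
print(lo, hi)
```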

  9. Pseudo-thermosetting chitosan hydrogels for biomedical application.

    PubMed

    Berger, J; Reist, M; Chenite, A; Felt-Baeyens, O; Mayer, J M; Gurny, R

    2005-01-20

    To prepare transparent chitosan/beta-glycerophosphate (betaGP) pseudo-thermosetting hydrogels, the deacetylation degree (DD) of chitosan has been modified by reacetylation with acetic anhydride. Two methods (I and II) of reacetylation have been compared and have shown that the use of previously filtered chitosan, dilution of acetic anhydride and reduction of temperature in method II improves efficiency and reproducibility. Chitosans with DD ranging from 35.0 to 83.2% have been prepared according to method II under homogeneous and non-homogeneous reacetylation conditions and the turbidity of chitosan/betaGP hydrogels containing homogeneously or non-homogeneously reacetylated chitosan has been investigated. Turbidity is shown to be modulated by the DD of chitosan and by the homogeneity of the medium during reacetylation, which influences the distribution mode of the chitosan monomers. The preparation of transparent chitosan/betaGP hydrogels requires a homogeneously reacetylated chitosan with a DD between 35 and 50%.

  11. Numerical modeling of the acoustic wave propagation across a homogenized rigid microstructure in the time domain

    NASA Astrophysics Data System (ADS)

    Lombard, Bruno; Maurel, Agnès; Marigo, Jean-Jacques

    2017-04-01

    Homogenization of a thin micro-structure yields effective jump conditions that incorporate the geometrical features of the scatterers. These jump conditions apply across a thin but nonzero-thickness interface whose interior is disregarded. This paper aims (i) to propose a numerical method able to handle the jump conditions in order to simulate the homogenized problem in the time domain, and (ii) to inspect the validity of the homogenized problem when compared to the real one. For this purpose, we adapt the Explicit Simplified Interface Method originally developed for standard jump conditions across a zero-thickness interface. Doing so allows us to handle arbitrarily shaped interfaces on a Cartesian grid with the same efficiency and accuracy of the numerical scheme as obtained in a homogeneous medium. Numerical experiments are performed to test the properties of the numerical method and to inspect the validity of the homogenization problem.

  12. Homogenization of Mammalian Cells.

    PubMed

    de Araújo, Mariana E G; Lamberti, Giorgia; Huber, Lukas A

    2015-11-02

    Homogenization is the name given to the methodological steps necessary for releasing organelles and other cellular constituents as a free suspension of intact individual components. Most homogenization procedures used for mammalian cells (e.g., cavitation pump and Dounce homogenizer) rely on mechanical force to break the plasma membrane and may be supplemented with osmotic or temperature alterations to facilitate membrane disruption. In this protocol, we describe a syringe-based homogenization method that does not require specialized equipment, is easy to handle, and gives reproducible results. The method may be adapted for cells that require hypotonic shock before homogenization. We routinely use it as part of our workflow to isolate endocytic organelles from mammalian cells. © 2015 Cold Spring Harbor Laboratory Press.

  13. Method of fabricating a homogeneous wire of inter-metallic alloy

    DOEpatents

    Ohriner, Evan Keith; Blue, Craig Alan

    2001-01-01

    A method for fabricating a homogeneous wire of inter-metallic alloy comprising the steps of providing a base-metal wire bundle comprising a metal, an alloy or a combination thereof; working the wire bundle through at least one die to obtain a desired dimension and to form a precursor wire; and, controllably heating the precursor wire such that a portion of the wire will become liquid while simultaneously maintaining its desired shape, whereby substantial homogenization of the wire occurs in the liquid state and additional homogenization occurs in the solid state, resulting in a homogeneous alloy product.

  14. [Methods for enzymatic determination of triglycerides in liver homogenates].

    PubMed

    Höhn, H; Gartzke, J; Burck, D

    1987-10-01

    An enzymatic method is described for the determination of triacylglycerols in liver homogenate. In contrast to usual methods, higher reliability and selectivity are achieved by omitting the extraction step.

  15. A comparison of techniques for preparing fish fillet for ICP-AES multielemental analysis and the microwave digestion of whole fish.

    PubMed

    Moeller, A; Ambrose, R F; Que Hee, S S

    2001-01-01

    Four catfish fillet homogenate treatments before multielemental metal analysis by simultaneous inductively coupled plasma/atomic emission spectroscopy were compared in triplicate. These treatments were: nitric acid wet-ashing by Parr bomb digestion; nitric acid wet-ashing by microwave digestion; tetramethylammonium hydroxide/nitric acid wet digestion; and dry-ashing. The tetramethylammonium hydroxide/nitric acid method was imprecise (coefficients of variation > 20%). The dry-ashing method was fast and sensitive but had low recoveries of 50% for spiked Pb and Al and was not as precise as the Parr bomb or microwave treatments. The Parr bomb method was the most precise method but was less sensitive than the microwave method, which had nearly the same precision. The microwave method was then adapted to homogenates of small whole fish < or = 3 cm in length. The whole fish homogenate required more vigorous digestion conditions and the addition of more acid after the evaporative step, because its components were less readily oxidized and acid-solubilized than those of fillet. The whole fish homogenate was also more heterogeneous than catfish fillet. A quality assurance protocol to demonstrate homogenate uniformity is essential. The use of a non-specialized microwave oven system allowed precise results for fillet and whole fish homogenates.

  16. Progesterone lipid nanoparticles: Scaling up and in vivo human study.

    PubMed

    Esposito, Elisabetta; Sguizzato, Maddalena; Drechsler, Markus; Mariani, Paolo; Carducci, Federica; Nastruzzi, Claudio; Cortesi, Rita

    2017-10-01

    This investigation describes a scaling-up study aimed at producing progesterone-containing nanoparticles at pilot scale. In particular, hot homogenization techniques based on ultrasound homogenization or high-pressure homogenization were employed to produce lipid nanoparticles composed of tristearin or tristearin in association with caprylic/capric triglyceride. It was found that the high-pressure homogenization method yielded nanoparticles without agglomerates and with smaller mean diameters than the ultrasound homogenization method. X-ray characterization suggested a lamellar structural organization for both types of nanoparticles. Progesterone encapsulation efficiency was almost 100% in the case of the high-pressure homogenization method. A shelf-life study indicated a two-fold increase in progesterone stability when encapsulated in nanoparticles produced by the high-pressure homogenization method. Dialysis and Franz cell methods were performed to mimic subcutaneous and skin administration. Nanoparticles composed of tristearin mixed with caprylic/capric triglyceride displayed a slower release of progesterone than nanoparticles composed of pure tristearin. Franz cell experiments showed a higher progesterone skin uptake for the pure tristearin nanoparticles. A human in vivo study, based on tape stripping, was conducted to investigate the performance of the nanoparticles as progesterone skin delivery systems. Tape stripping results indicated a decrease of progesterone concentration in the stratum corneum within six hours, suggesting an interaction between the nanoparticle material and skin lipids. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Differential reactivities of four homogeneous assays for LDL-cholesterol in serum to intermediate-density lipoproteins and small dense LDL: comparisons with the Friedewald equation.

    PubMed

    Yamashita, Shizuya; Kawase, Ryota; Nakaoka, Hajime; Nakatani, Kazuhiro; Inagaki, Miwako; Yuasa-Kawase, Miyako; Tsubakio-Yamamoto, Kazumi; Sandoval, Jose C; Masuda, Daisaku; Ohama, Tohru; Nakagawa-Toyama, Yumiko; Matsuyama, Akifumi; Nishida, Makoto; Ishigami, Masato

    2009-12-01

    In routine clinical laboratory testing and numerous epidemiological studies, LDL-cholesterol (LDL-C) has been estimated commonly using the Friedewald equation. We investigated the relationship between the Friedewald equation and 4 homogeneous assays for LDL-C. LDL-C was determined by 4 homogeneous assays [liquid selective detergent method: LDL-C (L), selective solubilization method: LDL-C (S), elimination method: LDL-C (E), and enzyme selective protecting method: LDL-C (P)]. Samples with discrepancies between the Friedewald equation and the 4 homogeneous assays for LDL-C were subjected to polyacrylamide gel electrophoresis and the beta-quantification method. The correlations between the Friedewald equation and the 4 homogeneous LDL-C assays were as follows: LDL-C (L) (r=0.962), LDL-C (S) (r=0.986), LDL-C (E) (r=0.946) and LDL-C (P) (r=0.963). Discrepancies were observed in sera from type III hyperlipoproteinemia patients and in sera containing large amounts of midband and small dense LDL on polyacrylamide gel electrophoresis. LDL-C (S) was most strongly correlated with the beta-quantification method even in sera from patients with type III hyperlipoproteinemia. Of the 4 homogeneous assays for LDL-C, LDL-C (S) exhibited the closest correlation with the Friedewald equation and the beta-quantification method, thus reflecting the current clinical databases for coronary heart disease.
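
    For reference, the Friedewald equation used as the comparator above estimates LDL-C from total cholesterol, HDL-C, and triglycerides (all in mg/dL) as LDL-C = TC - HDL-C - TG/5, and is conventionally not applied when triglycerides exceed about 400 mg/dL. A minimal sketch:

```python
def friedewald_ldl_c(total_chol, hdl_c, triglycerides):
    """Estimate LDL cholesterol (mg/dL) with the Friedewald equation.

    LDL-C = TC - HDL-C - TG/5, valid for fasting samples with
    triglycerides below roughly 400 mg/dL.
    """
    if triglycerides >= 400:
        raise ValueError("Friedewald estimate unreliable for TG >= 400 mg/dL")
    return total_chol - hdl_c - triglycerides / 5.0

print(friedewald_ldl_c(200, 50, 150))   # -> 120.0 mg/dL
```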

  18. Ultra-thin carbon-fiber paper fabrication and carbon-fiber distribution homogeneity evaluation method

    NASA Astrophysics Data System (ADS)

    Zhang, L. F.; Chen, D. Y.; Wang, Q.; Li, H.; Zhao, Z. G.

    2018-01-01

    A preparation technology for ultra-thin carbon-fiber paper is reported. Carbon-fiber distribution homogeneity has a great influence on the properties of ultra-thin carbon-fiber paper. In this paper, a self-developed homogeneity analysis system is introduced to help users evaluate the distribution homogeneity of carbon fiber across two or more binary (two-value) images of carbon-fiber paper. A relative-uniformity factor W/H is introduced. The experimental results show that the smaller the W/H factor, the more uniform the distribution of carbon fiber. The new uniformity-evaluation method provides a practical and reliable tool for analyzing the homogeneity of materials.

  19. Comparison of Methods to Assay Liver Glycogen Fractions: The Effects of Starvation

    PubMed Central

    Mojibi, Nastaran

    2017-01-01

    Introduction: There are several methods to extract and measure glycogen in animal tissues. Glycogen is extracted with or without homogenization by using cold Perchloric Acid (PCA). Aim: Three procedures were compared for determining glycogen fractions in rat liver at different physiological states. Materials and Methods: The present study was conducted on two groups of rats: one group of five rats was fed standard rodent laboratory food and served as controls, and another five rats were starved overnight (15 hours) as cases. The glycogen fractions were extracted and measured by three methods: classical homogenization, total-glycogen-fractionation and homogenization-free protocols. Results: The data of the homogenization method showed that following 15 hours of starvation, total glycogen decreased (36.4±1.9 vs. 27.7±2.5, p=0.01) and the change occurred entirely in Acid Soluble Glycogen (ASG) (32.0±1.1 vs. 22.7±2.5, p=0.01), while Acid Insoluble Glycogen (AIG) did not change significantly (4.9±0.9 vs. 4.6±0.3, p=0.7). Similar results were achieved with the total-glycogen-fractionation method. The homogenization-free procedure indicated that the ASG and AIG fractions comprise about 2/3 and 1/3 of total glycogen, respectively, and that changes occurred in both the ASG (24.4±2.6 vs. 16.7±0.4, p<0.05) and AIG fractions (8.7±0.8 vs. 7.1±0.3, p=0.05). Conclusion: The findings of the 'homogenization' assay method indicate that ASG is the major portion of liver glycogen and is the more metabolically active form. The same results were obtained with the 'total-glycogen-fractionation' method. The 'homogenization-free' method gave different results, because the AIG had been contaminated with the ASG fraction. In both the 'homogenization' and 'homogenization-free' methods, ASG must be extracted at least twice to prevent contamination of AIG with ASG. PMID:28511372

  20. Homogeneity study of fixed-point continuous marine environmental and meteorological data: a review

    NASA Astrophysics Data System (ADS)

    Yang, Jinkun; Yang, Yang; Miao, Qingsheng; Dong, Mingmei; Wan, Fangfang

    2018-02-01

    The principle of inhomogeneity and the classification of homogeneity test methods are briefly described, and several common inhomogeneity detection methods and their relative merits are described in detail. Then, based on applications of the different homogeneity methods to ground meteorological data and marine environmental data, the present status and progress of the field are reviewed. At present, homogeneity research on radiosonde and ground meteorological data is mature both domestically and internationally, and research and application in marine environmental data should also be given full attention. Carrying out a variety of test and correction methods, combined with the use of a multi-mode test system, will make the results more reasonable and scientific, and can also provide accurate first-hand information for coastal climate change research.

  1. Hydrogen storage materials and method of making by dry homogenation

    DOEpatents

    Jensen, Craig M.; Zidan, Ragaiy A.

    2002-01-01

    Dry homogenized metal hydrides, in particular aluminum hydride compounds, are provided as a material for reversible hydrogen storage. The reversible hydrogen storage material comprises a dry homogenized material having transition metal catalytic sites on a metal aluminum hydride compound, or mixtures of metal aluminum hydride compounds. A method of making such reversible hydrogen storage materials by dry doping is also provided and comprises the steps of dry homogenizing metal hydrides by mechanical mixing, such as by crushing or ball milling a powder of a metal aluminum hydride with a transition metal catalyst. In another aspect of the invention, a method of powering a vehicle apparatus with the reversible hydrogen storage material is provided.

  2. Macro-architectured cellular materials: Properties, characteristic modes, and prediction methods

    NASA Astrophysics Data System (ADS)

    Ma, Zheng-Dong

    2017-12-01

    Macro-architectured cellular (MAC) material is defined as a class of engineered materials having configurable cells of relatively large (i.e., visible) size that can be architecturally designed to achieve various desired material properties. Two types of novel MAC materials, negative Poisson's ratio material and biomimetic tendon reinforced material, were introduced in this study. To estimate the effective material properties for structural analyses and to optimally design such materials, a set of suitable homogenization methods was developed that provided an effective means for the multiscale modeling of MAC materials. First, a strain-based homogenization method was developed using an approach that separated the strain field into a homogenized strain field and a strain variation field in the local cellular domain superposed on the homogenized strain field. The principle of virtual displacements for the relationship between the strain variation field and the homogenized strain field was then used to condense the strain variation field onto the homogenized strain field. The new method was then extended to a stress-based homogenization process based on the principle of virtual forces and further applied to address the discrete systems represented by the beam or frame structures of the aforementioned MAC materials. The characteristic modes and the stress recovery process used to predict the stress distribution inside the cellular domain and thus determine the material strengths and failures at the local level are also discussed.

  3. Method of chaotic mixing and improved stirred tank reactors

    DOEpatents

    Muzzio, Fernando J.; Lamberto, David J.

    1999-01-01

    The invention provides a method and apparatus for efficiently achieving a homogeneous mixture of fluid components by introducing said components, having a Reynolds number of between about ≤1 and about 500, into a vessel and continuously perturbing the mixing flow by altering the flow speed and mixing time until homogeneity is reached. This method prevents the components from aggregating into non-homogeneous segregated regions within said vessel during mixing and substantially reduces the time required for the admixed components to reach homogeneity.

  4. Reactive sintering of ceramic lithium ion electrolyte membranes

    DOEpatents

    Badding, Michael Edward; Dutta, Indrajit; Iyer, Sriram Rangarajan; Kent, Brian Alan; Lonnroth, Nadja Teresia

    2017-06-06

    Disclosed herein are methods for making a solid lithium ion electrolyte membrane, the methods comprising combining a first reactant chosen from amorphous, glassy, or low melting temperature solid reactants with a second reactant chosen from refractory oxides to form a mixture; heating the mixture to a first temperature to form a homogenized composite, wherein the first temperature is between a glass transition temperature of the first reactant and a crystallization onset temperature of the mixture; milling the homogenized composite to form homogenized particles; casting the homogenized particles to form a green body; and sintering the green body at a second temperature to form a solid membrane. Solid lithium ion electrolyte membranes manufactured according to these methods are also disclosed herein.

  5. Homogeneous Immunoassays: Historical Perspective and Future Promise

    NASA Astrophysics Data System (ADS)

    Ullman, Edwin F.

    1999-06-01

    The founding and growth of Syva Company is examined in the context of its leadership role in the development of homogeneous immunoassays. The simple mix-and-read protocols of these methods offer advantages in routine analytical and clinical applications. Early homogeneous methods were based on insensitive detection of immunoprecipitation during antigen/antibody binding. The advent of reporter groups in biology provided a means of quantitating immunochemical binding by labeling antibody or antigen and physically separating label incorporated into immune complexes from free label. Although high sensitivity was achieved, quantitative separations were experimentally demanding. Only when it became apparent that reporter groups could provide information, not only about the location of a molecule but also about its microscopic environment, was it possible to design practical non-separation methods. The evolution of early homogeneous immunoassays was driven largely by the development of improved detection strategies. The first commercial spin immunoassays, developed by Syva for drug abuse testing during the Vietnam war, were followed by increasingly powerful methods such as immunochemical modulation of enzyme activity, fluorescence, and photo-induced chemiluminescence. Homogeneous methods that quantify analytes at femtomolar concentrations within a few minutes now offer important new opportunities in clinical diagnostics, nucleic acid detection and drug discovery.

  6. Boundary element modelling of dynamic behavior of piecewise homogeneous anisotropic elastic solids

    NASA Astrophysics Data System (ADS)

    Igumnov, L. A.; Markov, I. P.; Litvinchuk, S. Yu

    2018-04-01

    A traditional direct boundary integral equations method is applied to solve three-dimensional dynamic problems of piecewise homogeneous linear elastic solids. The materials of the homogeneous parts are considered to be generally anisotropic. The technique used to solve the boundary integral equations is based on the boundary element method applied together with the Radau IIA convolution quadrature method. A numerical example of a suddenly loaded 3D prismatic rod consisting of two subdomains with different anisotropic elastic properties is presented to verify the accuracy of the proposed formulation.

  7. Point matching: A new electronic method for homogenizing the phase characteristics of giant magnetoimpedance sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, E. Costa, E-mail: edusilva@ele.puc-rio.br; Gusmão, L. A. P.; Barbosa, C. R. Hall

    2014-08-15

    Recently, our research group at PUC-Rio discovered that magnetic transducers based on the impedance phase characteristics of GMI sensors have the potential to increase sensitivity values one hundred-fold when compared to magnitude-based GMI transducers. Those GMI sensors can be employed in the measurement of ultra-weak magnetic fields, whose intensities are even lower than the environmental magnetic noise. A traditional solution for cancelling electromagnetic noise and interference makes use of gradiometric configurations, but the performance is strongly tied to the homogeneity of the sensing elements. This paper presents a new method that uses electronic circuits to modify the equivalent impedance of the GMI samples, aiming at homogenizing their phase characteristics and, consequently, improving the performance of gradiometric configurations based on GMI samples. A performance comparison between this new method and a previously developed homogenization method is also presented.

  8. Generating highly uniform electromagnetic field characteristics

    DOEpatents

    Crow, James Terry

    1998-01-01

    An apparatus and method for generating homogeneous electromagnetic fields within a volume. The homogeneity provided may be for magnetic and/or electric fields, and for field magnitude, radial gradient, or higher order radial derivative. The invention comprises conductive pathways oriented mirror symmetrically about a desired region of homogeneity. A corresponding apparatus and method is provided for substantially canceling the electromagnetic field outside of the apparatus, comprising a second set of conductive pathways placed outside the first set.

  9. Generating highly uniform electromagnetic field characteristics

    DOEpatents

    Crow, James T.

    1998-01-01

    An apparatus and method for generating homogeneous electromagnetic fields within a volume. The homogeneity provided may be for magnetic and/or electric fields, and for field magnitude, radial gradient, or higher order radial derivative. The invention comprises conductive pathways oriented about a desired region of homogeneity. A corresponding apparatus and method is provided for substantially canceling the electromagnetic field outside of the apparatus, comprising a second set of conductive pathways placed outside the first set.

  10. Generating highly uniform electromagnetic field characteristics

    DOEpatents

    Crow, James T.

    1997-01-01

    An apparatus and method for generating homogeneous electromagnetic fields within a volume. The homogeneity provided may be for magnetic and/or electric fields, and for field magnitude, radial gradient, or higher order radial derivative. The invention comprises conductive pathways oriented mirror symmetrically about a desired region of homogeneity. A corresponding apparatus and method is provided for substantially cancelling the electromagnetic field outside of the apparatus, comprising a second set of conductive pathways placed outside the first set.

  11. Nonlinear vibration of a traveling belt with non-homogeneous boundaries

    NASA Astrophysics Data System (ADS)

    Ding, Hu; Lim, C. W.; Chen, Li-Qun

    2018-06-01

    Free and forced nonlinear vibrations of a traveling belt with non-homogeneous boundary conditions are studied. Axially moving materials in operation are always externally excited and produce strong vibrations. Previous studies usually considered moving materials with homogeneous boundary conditions. In this paper, non-homogeneous boundaries are introduced by the support wheels. Equilibrium deformation of the belt is produced by the non-homogeneous boundaries. In order to solve for the equilibrium deformation, the differential and integral quadrature methods (DIQMs) are utilized to develop an iterative scheme. The influence of the equilibrium deformation on free and forced nonlinear vibrations of the belt is explored. The DIQMs are applied to solve for the natural frequencies and forced resonance responses of transverse vibration around the equilibrium deformation. The Galerkin truncation method (GTM) is utilized to confirm the DIQMs' results. The numerical results demonstrate that the non-homogeneous boundary conditions cause the transverse vibration to deviate from the straight equilibrium, increase the natural frequencies, and lead to the coexistence of square nonlinear terms and cubic nonlinear terms. Moreover, the influence of non-homogeneous boundaries can be exacerbated by the axial speed. Therefore, non-homogeneous boundary conditions of axially moving materials especially should be taken into account.

  12. Prediction of Process-Induced Distortions in L-Shaped Composite Profiles Using Path-Dependent Constitutive Law

    NASA Astrophysics Data System (ADS)

    Ding, Anxin; Li, Shuxin; Wang, Jihui; Ni, Aiqing; Sun, Liangliang; Chang, Lei

    2016-10-01

    In this paper, the corner spring-in angles of AS4/8552 L-shaped composite profiles with different thicknesses are predicted using a path-dependent constitutive law that accounts for the variation of material properties due to phase change during curing. The prediction accuracy mainly depends on the rubbery- and glassy-state properties obtained by homogenization rather than on experimental measurements. Both analytical and finite element (FE) homogenization methods are applied to predict the overall properties of the AS4/8552 composite. The effect of fiber volume fraction on the properties is investigated for both the rubbery and glassy states using both methods, and the predicted results are compared with experimental measurements for the glassy state. Good agreement is achieved between the predicted results and the available experimental data, showing the reliability of the homogenization method. Furthermore, the corner spring-in angles of the L-shaped composite profiles are measured experimentally, validating the path-dependent constitutive law as well as the property prediction by the FE homogenization method.

  13. An improved high-throughput lipid extraction method for the analysis of human brain lipids.

    PubMed

    Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett

    2013-03-01

    We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.

  14. A numerical homogenization method for heterogeneous, anisotropic elastic media based on multiscale theory

    DOE PAGES

    Gao, Kai; Chung, Eric T.; Gibson, Richard L.; ...

    2015-06-05

    The development of reliable methods for upscaling fine scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. We therefore propose a numerical homogenization algorithm based on multiscale finite element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method used multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters were then computed using these basis functions, and the approach applied a numerical discretization that is similar to the rotated staggered-grid finite difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity where the average size of the heterogeneities ranged from several centimeters to several meters, and the ratio between the dominant wavelength and the average size of the arbitrary heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations proved that the numerical homogenization was equally accurate for these complex cases.
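
    For the finely layered benchmark mentioned above, a classical analytic upscaling result (Backus-type averaging for normal-incidence propagation) gives the effective vertical P-wave modulus as the thickness-weighted harmonic mean of the layer moduli and the effective density as the arithmetic mean. The sketch below evaluates that textbook benchmark for a two-layer stack with illustrative numbers; it is not the authors' multiscale finite element algorithm.

```python
import numpy as np

def layered_vertical_pwave(moduli, densities, fractions):
    """Effective vertical P-wave properties of a finely layered stack
    (Backus-type averaging for normal-incidence propagation).

    moduli    : P-wave moduli (lambda + 2*mu) of the layers, Pa
    densities : layer densities, kg/m^3
    fractions : thickness fractions, summing to 1
    """
    f = np.asarray(fractions, float)
    m = np.asarray(moduli, float)
    rho = np.asarray(densities, float)
    m_eff = 1.0 / np.sum(f / m)          # harmonic (thickness-weighted) average of moduli
    rho_eff = np.sum(f * rho)            # arithmetic average of density
    return m_eff, rho_eff, np.sqrt(m_eff / rho_eff)   # modulus, density, velocity

# Two-layer example: stiff and soft layers in equal proportion (illustrative numbers)
print(layered_vertical_pwave([60e9, 20e9], [2700, 2300], [0.5, 0.5]))
```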

  15. Mixed mode control method and engine using same

    DOEpatents

    Kesse, Mary L [Peoria, IL; Duffy, Kevin P [Metamora, IL

    2007-04-10

    A method of mixed mode operation of an internal combustion engine includes the steps of controlling a homogeneous charge combustion event timing in a given engine cycle, and controlling a conventional charge injection event to be at least a predetermined time after the homogeneous charge combustion event. An internal combustion engine is provided, including an electronic controller having a computer readable medium with a combustion timing control algorithm recorded thereon, the control algorithm including means for controlling a homogeneous charge combustion event timing and means for controlling a conventional injection event timing to be at least a predetermined time from the homogeneous charge combustion event.

  16. Homogenizing Advanced Alloys: Thermodynamic and Kinetic Simulations Followed by Experimental Results

    NASA Astrophysics Data System (ADS)

    Jablonski, Paul D.; Hawk, Jeffrey A.

    2017-01-01

    Segregation of solute elements occurs in nearly all metal alloys during solidification. The resultant elemental partitioning can severely degrade as-cast material properties and lead to difficulties during post-processing (e.g., hot shorts and incipient melting). Many cast articles are subjected to a homogenization heat treatment in order to minimize segregation and improve their performance. Traditionally, homogenization heat treatments are based upon past practice or time-consuming trial-and-error experiments. Through the use of thermodynamic and kinetic modeling software, NETL has designed a systematic method to optimize homogenization heat treatments. Use of the method allows engineers and researchers to homogenize casting chemistries to levels appropriate for a given application. The method also allows for the adjustment of heat treatment schedules to fit limitations on in-house equipment (capability, reliability, etc.) while maintaining clear numeric targets for segregation reduction. In this approach, the Scheil module within Thermo-Calc is used to predict the as-cast segregation present within an alloy, and diffusion-controlled transformation simulations are then used to model homogenization kinetics as a function of time and temperature. Examples of computationally designed heat treatments and verification of their effects on segregation and properties of real castings are presented.
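
    A common back-of-the-envelope model behind such heat-treatment designs treats the as-cast segregation as a roughly sinusoidal composition profile whose wavelength is the dendrite arm spacing; its amplitude decays as exp(-4*pi^2*D*t/lambda^2), so the hold time needed to reach a target residual segregation follows from the temperature-dependent diffusivity. The sketch below uses that textbook relation with illustrative Arrhenius parameters; it is not the Thermo-Calc/DICTRA workflow described in the record.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def homogenization_time(T_kelvin, spacing_m, target_residual, D0=1e-4, Q=250e3):
    """Hold time for a sinusoidal segregation profile of wavelength `spacing_m`
    to decay to `target_residual` of its as-cast amplitude.

    Amplitude ratio: delta/delta0 = exp(-4*pi^2*D*t/lambda^2), with
    D = D0*exp(-Q/RT). D0 (m^2/s) and Q (J/mol) are illustrative
    solute-diffusion parameters, not values for a specific alloy.
    """
    D = D0 * np.exp(-Q / (R * T_kelvin))
    return -np.log(target_residual) * spacing_m ** 2 / (4 * np.pi ** 2 * D)

# e.g. 100 micron secondary dendrite arm spacing, 90 % reduction in segregation
t = homogenization_time(T_kelvin=1473.0, spacing_m=100e-6, target_residual=0.1)
print(t / 3600.0, "hours")
```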

  17. Layout optimization using the homogenization method

    NASA Technical Reports Server (NTRS)

    Suzuki, Katsuyuki; Kikuchi, Noboru

    1993-01-01

    A generalized layout problem involving sizing, shape, and topology optimization is solved by using the homogenization method for three-dimensional linearly elastic shell structures, in order to explore the possibility of establishing an integrated design system for automotive car bodies, as an extension of the previous work by Bendsoe and Kikuchi. A formulation of a three-dimensional homogenized shell, a solution algorithm, and several examples of computing the optimum layout are presented in this first of two articles.

  18. Generating highly uniform electromagnetic field characteristics

    DOEpatents

    Crow, J.T.

    1997-06-24

    An apparatus and method are disclosed for generating homogeneous electromagnetic fields within a volume. The homogeneity provided may be for magnetic and/or electric fields, and for field magnitude, radial gradient, or higher order radial derivative. The invention comprises conductive pathways oriented mirror symmetrically about a desired region of homogeneity. A corresponding apparatus and method is provided for substantially canceling the electromagnetic field outside of the apparatus, comprising a second set of conductive pathways placed outside the first set. 26 figs.

  19. Generating highly uniform electromagnetic field characteristics

    DOEpatents

    Crow, J.T.

    1998-05-05

    An apparatus and method are disclosed for generating homogeneous electromagnetic fields within a volume. The homogeneity provided may be for magnetic and/or electric fields, and for field magnitude, radial gradient, or higher order radial derivative. The invention comprises conductive pathways oriented about a desired region of homogeneity. A corresponding apparatus and method is provided for substantially canceling the electromagnetic field outside of the apparatus, comprising a second set of conductive pathways placed outside the first set. 55 figs.

  20. Generating highly uniform electromagnetic field characteristics

    DOEpatents

    Crow, J.T.

    1998-02-10

    An apparatus and method for generating homogeneous electromagnetic fields within a volume is disclosed. The homogeneity provided may be for magnetic and/or electric fields, and for field magnitude, radial gradient, or higher order radial derivative. The invention comprises conductive pathways oriented mirror symmetrically about a desired region of homogeneity. A corresponding apparatus and method is provided for substantially canceling the electromagnetic field outside of the apparatus, comprising a second set of conductive pathways placed outside the first set. 39 figs.

  1. LANDSAT-D investigations in snow hydrology

    NASA Technical Reports Server (NTRS)

    Dozier, J. (Principal Investigator)

    1984-01-01

    Two-stream methods provide rapid approximate calculations of radiative transfer in scattering and absorbing media. Although they provide information on fluxes only, and not on intensities, their speed makes them attractive alternatives to more precise methods. A comprehensive, unified review of these methods is provided for a homogeneous layer, and the equations for reflectance and transmittance are solved for a homogeneous layer over a non-reflecting surface. Any of the basic kernels for a single layer can be extended to a vertically inhomogeneous medium over a surface whose reflectance properties vary with illumination angle, as long as the medium can be subdivided into homogeneous layers.
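
    The extension from a single homogeneous layer to a stack rests on the standard adding relations, which combine the reflectance and transmittance of two layers through the geometric series of multiple reflections between them. The sketch below shows the simplest form, assuming diffuse fluxes and layers that reflect and transmit equally from either side; it is an idealization of the more general relations implied in the record.

```python
def add_layers(r1, t1, r2, t2):
    """Combine two homogeneous layers (layer 1 on top of layer 2).

    Standard adding relations for diffuse fluxes, assuming each layer reflects
    and transmits equally from above and below; the 1/(1 - r1*r2) factor sums
    the geometric series of multiple reflections between the layers.
    """
    denom = 1.0 - r1 * r2
    r12 = r1 + t1 * t1 * r2 / denom
    t12 = t1 * t2 / denom
    return r12, t12

# Stack two identical absorbing layers; over a non-reflecting surface the
# stack reflectance is unchanged and t12 is the flux reaching the surface.
r, t = add_layers(0.3, 0.6, 0.3, 0.6)
print(r, t)
```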

  2. Comparison of Three Different Methods for Pile Integrity Testing on a Cylindrical Homogeneous Polyamide Specimen

    NASA Astrophysics Data System (ADS)

    Lugovtsova, Y. D.; Soldatov, A. I.

    2016-01-01

    Three different methods for pile integrity testing are compared on a cylindrical homogeneous polyamide specimen. The methods are low-strain pile integrity testing, multichannel pile integrity testing, and testing with a shaker system. Since low-strain pile integrity testing is a well-established and standardized method, its results are used as a reference for the other two methods.

  3. Chemical Equation Balancing.

    ERIC Educational Resources Information Center

    Blakley, G. R.

    1982-01-01

    Reviews mathematical techniques for solving systems of homogeneous linear equations and demonstrates that the algebraic method of balancing chemical equations is a matter of solving a system of homogeneous linear equations. FORTRAN programs applying this matrix method to chemical equation balancing are available from the author. (JN)
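
    The point of the record above, that balancing a chemical equation amounts to solving a homogeneous linear system, can be illustrated by writing one row per element and one column per species (product columns negated) and extracting the null space. The propane-combustion example and the NumPy null-space extraction below are illustrative; they are not the author's FORTRAN programs.

```python
import numpy as np

# Balance  a C3H8 + b O2 -> c CO2 + d H2O
# Rows: elements (C, H, O); columns: species, with product columns negated.
A = np.array([
    [3, 0, -1,  0],   # carbon
    [8, 0,  0, -2],   # hydrogen
    [0, 2, -2, -1],   # oxygen
], dtype=float)

# The balanced coefficients span the null space of the homogeneous system A x = 0.
_, _, vh = np.linalg.svd(A)
x = vh[-1]                          # basis vector of the (one-dimensional) null space
x = x / x[np.argmin(np.abs(x))]     # rescale so the smallest coefficient equals 1
coeffs = np.round(x).astype(int)
print(coeffs)                       # -> [1 5 3 4]:  C3H8 + 5 O2 -> 3 CO2 + 4 H2O
```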

  4. A Homogenization Approach for Design and Simulation of Blast Resistant Composites

    NASA Astrophysics Data System (ADS)

    Sheyka, Michael

    Structural composites have been used in aerospace and structural engineering due to their high strength-to-weight ratio, and composite laminates have been used successfully and extensively in blast mitigation. This dissertation examines the use of the homogenization approach to design and simulate blast-resistant composites. Three case studies are performed to examine the usefulness of different methods for designing and optimizing composite plates for blast resistance. The first case study utilizes a single-degree-of-freedom system to simulate the blast together with a reliability-based approach; it examines homogeneous plates, and the optimal stacking sequence and plate thicknesses are determined. The second and third case studies use the homogenization method to calculate the properties of a composite unit cell made of two different materials. The methods are integrated with dynamic simulation environments and advanced optimization algorithms. The second case study is 2-D and uses an implicit blast simulation, while the third case study is 3-D and simulates the blast using an explicit method. Both case studies 2 and 3 rely on multi-objective genetic algorithms for the optimization process, and Pareto optimal solutions are determined in both. Case study 3 is an integrative method for determining the optimal stacking sequence, microstructure, and plate thicknesses. The validity of the different methods, such as homogenization, reliability analysis, explicit blast modeling, and multi-objective genetic algorithms, is discussed. Possible extension of the methods to include strain-rate effects and parallel computation is also examined.
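
    The "Pareto optimal solutions" referred to in case studies 2 and 3 are simply the non-dominated designs in objective space (for example, blast deflection versus plate mass, both minimized). The sketch below shows a minimal non-dominance filter on synthetic objective values; it is not the dissertation's multi-objective genetic algorithm.

```python
import numpy as np

def pareto_front(objectives):
    """Return a boolean mask of non-dominated rows (all objectives minimized).

    A design is dominated if some other design is no worse in every objective
    and strictly better in at least one.
    """
    F = np.asarray(objectives, float)
    keep = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        dominates_i = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        if dominates_i.any():
            keep[i] = False
    return keep

# Synthetic (deflection, mass) pairs for candidate composite plates
designs = np.array([[3.0, 10.0], [2.5, 12.0], [3.5, 9.0], [2.4, 15.0], [3.0, 11.0]])
print(designs[pareto_front(designs)])   # the non-dominated (Pareto optimal) designs
```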

  5. Multicomponent homogeneous alloys and method for making same

    DOEpatents

    Dutta, Partha S.; Miller, Thomas R.

    2003-09-02

    The present application discloses a method for preparing a homogeneous ternary or quaternary alloy from a quaternary melt. The method includes providing a family of phase diagrams for the quaternary melt which shows (i) composition/temperature data, (ii) tie lines connecting equilibrium liquid and solid compositions, and (iii) isotherms representing boundaries of a miscibility gap. Based on the family of phase diagrams, a quaternary melt composition and an alloy growth temperature is selected. A quaternary melt having the selected quaternary melt composition is provided and a ternary or quaternary alloy is grown from the quaternary melt at the selected alloy growth temperature. A method for making homogeneous ternary or quaternary alloy from a ternary or quaternary melt is also disclosed, as are homogeneous quaternary single-crystal alloys which are substantially free from crystal defects and which have the formula A_x B_(1-x) C_y D_(1-y), x and y being the same or different and in the range of 0.001 to 0.999.

  6. Evaluating a novel application of optical fibre evanescent field absorbance: rapid measurement of red colour in winegrape homogenates

    NASA Astrophysics Data System (ADS)

    Lye, Peter G.; Bradbury, Ronald; Lamb, David W.

    Silica optical fibres were used to measure colour (mg anthocyanin/g fresh berry weight) in samples of red wine grape homogenates via optical Fibre Evanescent Field Absorbance (FEFA). Colour measurements from 126 samples of grape homogenate were compared against the standard industry spectrophotometric reference method, which involves chemical extraction and subsequent optical absorption measurements of clarified samples at 520 nm. FEFA absorbance on homogenates at 520 nm (FEFA520h) was correlated with the industry reference measurements of colour (R2 = 0.46, n = 126). Using a simple regression equation, colour could be predicted with a standard error of cross-validation (SECV) of 0.21 mg/g, over a range of 0.6 to 2.2 mg anthocyanin/g with a standard deviation of 0.33 mg/g. With a Ratio of Performance Deviation (RPD) of 1.6, the technique, when utilizing only a single detection wavelength, is not robust enough to apply in a diagnostic sense; however, the results do demonstrate the potential of the FEFA method as a fast and low-cost assay of colour in homogenized samples.
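
    The SECV and RPD figures quoted above come from cross-validating the single-wavelength calibration; RPD is conventionally the ratio of the reference-value standard deviation to the SECV, so a value near 1.6 indicates a screening-level rather than quantitative model. The sketch below computes both statistics for a leave-one-out linear calibration on synthetic absorbance/colour pairs; the data and the simple linear model are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
absorbance = rng.uniform(0.2, 1.0, size=40)                 # FEFA520h (synthetic)
colour = 0.5 + 1.8 * absorbance + rng.normal(0, 0.2, 40)    # mg anthocyanin/g (synthetic)

# Leave-one-out cross-validation of a single-wavelength linear calibration
pred = np.empty_like(colour)
for i in range(len(colour)):
    keep = np.arange(len(colour)) != i
    slope, intercept = np.polyfit(absorbance[keep], colour[keep], 1)
    pred[i] = slope * absorbance[i] + intercept

secv = np.sqrt(np.mean((pred - colour) ** 2))   # standard error of cross-validation
rpd = np.std(colour, ddof=1) / secv             # ratio of performance to deviation
print(round(secv, 3), round(rpd, 2))
```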

  7. Comparative Evaluation of Three Homogenization Methods for Isolating Middle East Respiratory Syndrome Coronavirus Nucleic Acids From Sputum Samples for Real-Time Reverse Transcription PCR

    PubMed Central

    Yong, Dongeun; Ki, Chang-Seok; Kim, Jae-Seok; Seong, Moon-Woo; Lee, Hyukmin

    2016-01-01

    Background: Real-time reverse transcription PCR (rRT-PCR) of sputum samples is commonly used to diagnose Middle East respiratory syndrome coronavirus (MERS-CoV) infection. Owing to the difficulty of extracting RNA from sputum containing mucus, sputum homogenization is desirable prior to nucleic acid isolation. We determined optimal homogenization methods for isolating viral nucleic acids from sputum. Methods: We evaluated the following three sputum-homogenization methods: proteinase K and DNase I (PK-DNase) treatment, phosphate-buffered saline (PBS) treatment, and N-acetyl-L-cysteine and sodium citrate (NALC) treatment. Sputum samples were spiked with inactivated MERS-CoV culture isolates. RNA was extracted from pretreated, spiked samples using the easyMAG system (bioMérieux, France). Extracted RNAs were then subjected to rRT-PCR for MERS-CoV diagnosis (DiaPlex Q MERS-coronavirus, SolGent, Korea). Results: While analyzing 15 spiked sputum samples prepared in technical duplicate, false-negative results were obtained with five (16.7%) and four samples (13.3%), respectively, by using the PBS and NALC methods. The range of threshold cycle (Ct) values observed when detecting upE in sputum samples was 31.1-35.4 with the PK-DNase method, 34.7-39.0 with the PBS method, and 33.9-38.6 with the NALC method. Compared with the control, which was prepared by adding a one-tenth volume of 1:1,000 diluted viral culture to PBS solution, the ranges of Ct values obtained by the PBS and NALC methods differed significantly from the mean control Ct of 33.2 (both P<0.0001). Conclusions: The PK-DNase method is suitable for homogenizing sputum samples prior to RNA extraction. PMID:27374711

  8. Method for preparing hydrous zirconium oxide gels and spherules

    DOEpatents

    Collins, Jack L.

    2003-08-05

    Methods for preparing hydrous zirconium oxide spherules, hydrous zirconium oxide gels such as gel slabs, films, capillary and electrophoresis gels, zirconium monohydrogen phosphate spherules, hydrous zirconium oxide spherules having suspendable particles homogeneously embedded within to form a composite sorbent, zirconium monohydrogen phosphate spherules having suspendable particles of at least one different sorbent homogeneously embedded within to form a composite sorbent having a desired crystallinity, zirconium oxide spherules having suspendable particles homogeneously embedded within to form a composite, hydrous zirconium oxide fiber materials, zirconium oxide fiber materials, hydrous zirconium oxide fiber materials having suspendable particles homogeneously embedded within to form a composite, zirconium oxide fiber materials having suspendable particles homogeneously embedded within to form a composite and spherules of barium zirconate. The hydrous zirconium oxide spherules and gel forms prepared by the gel-sphere, internal gelation process are useful as inorganic ion exchangers, catalysts, getters and ceramics.

  9. Simple method for the generation of multiple homogeneous field volumes inside the bore of superconducting magnets.

    PubMed

    Chou, Ching-Yu; Ferrage, Fabien; Aubert, Guy; Sakellariou, Dimitris

    2015-07-17

    Standard Magnetic Resonance magnets produce a single homogeneous field volume, where the analysis is performed. Nonetheless, several modern applications could benefit from the generation of multiple homogeneous field volumes along the axis and inside the bore of the magnet. In this communication, we propose a straightforward method using a combination of ring structures of permanent magnets in order to cancel the gradient of the stray field in a series of distinct volumes. These concepts were demonstrated numerically on an experimentally measured magnetic field profile. We discuss advantages and limitations of our method and present the key steps required for an experimental validation.

  10. Comparative Evaluation of Three Homogenization Methods for Isolating Middle East Respiratory Syndrome Coronavirus Nucleic Acids From Sputum Samples for Real-Time Reverse Transcription PCR.

    PubMed

    Sung, Heungsup; Yong, Dongeun; Ki, Chang Seok; Kim, Jae Seok; Seong, Moon Woo; Lee, Hyukmin; Kim, Mi Na

    2016-09-01

    Real-time reverse transcription PCR (rRT-PCR) of sputum samples is commonly used to diagnose Middle East respiratory syndrome coronavirus (MERS-CoV) infection. Owing to the difficulty of extracting RNA from sputum containing mucus, sputum homogenization is desirable prior to nucleic acid isolation. We determined optimal homogenization methods for isolating viral nucleic acids from sputum. We evaluated the following three sputum-homogenization methods: proteinase K and DNase I (PK-DNase) treatment, phosphate-buffered saline (PBS) treatment, and N-acetyl-L-cysteine and sodium citrate (NALC) treatment. Sputum samples were spiked with inactivated MERS-CoV culture isolates. RNA was extracted from pretreated, spiked samples using the easyMAG system (bioMérieux, France). Extracted RNAs were then subjected to rRT-PCR for MERS-CoV diagnosis (DiaPlex Q MERS-coronavirus, SolGent, Korea). While analyzing 15 spiked sputum samples prepared in technical duplicate, false-negative results were obtained with five (16.7%) and four (13.3%) samples, respectively, by using the PBS and NALC methods. The range of threshold cycle (Ct) values observed when detecting upE in sputum samples was 31.1-35.4 with the PK-DNase method, 34.7-39.0 with the PBS method, and 33.9-38.6 with the NALC method. Compared with the control, which was prepared by adding a one-tenth volume of 1:1,000 diluted viral culture to PBS solution, the ranges of Ct values obtained by the PBS and NALC methods differed significantly from the mean control Ct of 33.2 (both P<0.0001). The PK-DNase method is suitable for homogenizing sputum samples prior to RNA extraction.
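
    The significance statement above compares Ct values from each pretreatment against the control mean of 33.2; the abstract does not name the exact test used. A one-sample t-test is one plausible way to reproduce that style of comparison; the Ct values below are invented points lying within the reported ranges, not the study's data:

```python
from scipy import stats

control_mean_ct = 33.2
# Hypothetical Ct values spanning the ranges reported for each homogenization method
ct_values = {
    "PK-DNase": [31.1, 32.2, 33.5, 34.3, 35.4],
    "PBS":      [34.7, 35.8, 37.0, 38.1, 39.0],
    "NALC":     [33.9, 35.1, 36.3, 37.4, 38.6],
}

for method, cts in ct_values.items():
    t_stat, p_value = stats.ttest_1samp(cts, control_mean_ct)
    print(f"{method}: mean Ct = {sum(cts) / len(cts):.1f}, p = {p_value:.4f}")
```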

  11. Numerical Modelling of Mechanical Properties of C-Pd Film by Homogenization Technique and Finite Element Method

    NASA Astrophysics Data System (ADS)

    Rymarczyk, Joanna; Kowalczyk, Piotr; Czerwosz, Elzbieta; Bielski, Włodzimierz

    2011-09-01

    The nanomechanical properties of nanostructural carbonaceous-palladium films are studied. The nanoindentation experiments are simulated numerically using the Finite Element Method. The homogenization theory is applied to compute the properties of the composite material used as the input data for the nanoindentation calculations.

  12. Configuration optimization of space structures

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos; Crivelli, Luis A.; Vandenbelt, David

    1991-01-01

    The objective is to develop a computer aid for the conceptual/initial design of aerospace structures, allowing configurations and shape to be a priori design variables. The topics are presented in viewgraph form and include the following: Kikuchi's homogenization method; a classical shape design problem; homogenization method steps; a 3D mechanical component design example; forming a homogenized finite element; a 2D optimization problem; treatment of volume inequality constraint; algorithms for the volume inequality constraint; objective function derivatives--taking advantage of design locality; stiffness variations; variations of potential; and schematics of the optimization problem.

  13. Peripheral nerve magnetic stimulation: influence of tissue non-homogeneity

    PubMed Central

    Krasteva, Vessela TZ; Papazov, Sava P; Daskalov, Ivan K

    2003-01-01

    Background Peripheral nerves are situated in a highly non-homogeneous environment, including muscles, bones, blood vessels, etc. Time-varying magnetic field stimulation of the median and ulnar nerves in the carpal region is studied, with special consideration of the influence of non-homogeneities. Methods A detailed three-dimensional finite element model (FEM) of the anatomy of the wrist region was built to assess the induced currents distribution by external magnetic stimulation. The electromagnetic field distribution in the non-homogeneous domain was defined as an internal Dirichlet problem using the finite element method. The boundary conditions were obtained by analysis of the vector potential field excited by external current-driven coils. Results The results include evaluation and graphical representation of the induced current field distribution at various stimulation coil positions. A comparative study of the real non-homogeneous structure with anisotropic conductivities of the tissues and a mock homogeneous medium is also presented. The possibility of achieving selective stimulation of either of the two nerves is assessed. Conclusion The model developed could be useful in theoretical prediction of the current distribution in the nerves during diagnostic stimulation and therapeutic procedures involving electromagnetic excitation. The errors in applying homogeneous domain modeling rather than real non-homogeneous biological structures are demonstrated. The practical implications of the applied approach are valid for any arbitrary weakly conductive medium. PMID:14693034

  14. Comparison of up-scaling methods in poroelasticity and its generalizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berryman, J G

    2003-12-13

    Four methods of up-scaling coupled equations at the microscale to equations valid at the mesoscale and/or macroscale for fluid-saturated and partially saturated porous media will be discussed, compared, and contrasted. The four methods are: (1) effective medium theory, (2) mixture theory, (3) two-scale and multiscale homogenization, and (4) volume averaging. All these methods have advantages for some applications and disadvantages for others. For example, effective medium theory, mixture theory, and homogenization methods can all give formulas for coefficients in the up-scaled equations, whereas volume averaging methods give the form of the up-scaled equations but generally must be supplemented with physical arguments and/or data in order to determine the coefficients. Homogenization theory requires a great deal of mathematical insight from the user in order to choose appropriate scalings for use in the resulting power-law expansions, while volume averaging requires more physical insight to motivate the steps needed to find coefficients. Homogenization often is performed on periodic models, while volume averaging does not require any assumption of periodicity and can therefore be related very directly to laboratory and/or field measurements. Validity of the homogenization process is often limited to specific ranges of frequency - in order to justify the scaling hypotheses that must be made - and therefore cannot be used easily over wide ranges of frequency. However, volume averaging methods can quite easily be used for wide band data analysis. So, we learn from these comparisons that a researcher in the theory of poroelasticity and its generalizations needs to be conversant with two or more of these methods to solve problems generally.

  15. Simple method for the generation of multiple homogeneous field volumes inside the bore of superconducting magnets

    PubMed Central

    Chou, Ching-Yu; Ferrage, Fabien; Aubert, Guy; Sakellariou, Dimitris

    2015-01-01

    Standard Magnetic Resonance magnets produce a single homogeneous field volume, where the analysis is performed. Nonetheless, several modern applications could benefit from the generation of multiple homogeneous field volumes along the axis and inside the bore of the magnet. In this communication, we propose a straightforward method using a combination of ring structures of permanent magnets in order to cancel the gradient of the stray field in a series of distinct volumes. These concepts were demonstrated numerically on an experimentally measured magnetic field profile. We discuss advantages and limitations of our method and present the key steps required for an experimental validation. PMID:26182891

  16. Homogenization of Periodic Masonry Using Self-Consistent Scheme and Finite Element Method

    NASA Astrophysics Data System (ADS)

    Kumar, Nitin; Lambadi, Harish; Pandey, Manoj; Rajagopal, Amirtham

    2016-01-01

    Masonry is a heterogeneous anisotropic continuum, made up of the brick and mortar arranged in a periodic manner. Obtaining the effective elastic stiffness of the masonry structures has been a challenging task. In this study, the homogenization theory for periodic media is implemented in a very generic manner to derive the anisotropic global behavior of the masonry, through rigorous application of the homogenization theory in one step and through a full three-dimensional behavior. We have considered the periodic Eshelby self-consistent method and the finite element method. Two representative unit cells that represent the microstructure of the masonry wall exactly are considered for calibration and numerical application of the theory.

  17. Linking biotic homogenization to habitat type, invasiveness and growth form of naturalized alien plants in North America

    Treesearch

    Hong Qian; Qinfeng Guo

    2010-01-01

    Aim Biotic homogenization is a growing phenomenon and has recently attracted much attention. Here, we analyse a large dataset of native and alien plants in North America to examine whether biotic homogenization is related to several ecological and biological attributes. Location North America (north of Mexico). Methods We assembled...

  18. Homogenization of tissues via picosecond-infrared laser (PIRL) ablation: Giving a closer view on the in-vivo composition of protein species as compared to mechanical homogenization.

    PubMed

    Kwiatkowski, M; Wurlitzer, M; Krutilin, A; Kiani, P; Nimer, R; Omidi, M; Mannaa, A; Bussmann, T; Bartkowiak, K; Kruber, S; Uschold, S; Steffen, P; Lübberstedt, J; Küpker, N; Petersen, H; Knecht, R; Hansen, N O; Zarrine-Afsar, A; Robertson, W D; Miller, R J D; Schlüter, H

    2016-02-16

    Posttranslational modifications and proteolytic processing regulate almost all physiological processes. Dysregulation can potentially result in pathologic protein species causing diseases. Thus, tissue species proteomes of diseased individuals provide diagnostic information. Since the composition of tissue proteomes can rapidly change during tissue homogenization by the action of enzymes released from their compartments, disease specific protein species patterns can vanish. Recently, we described a novel, ultrafast and soft method for cold vaporization of tissue via desorption by impulsive vibrational excitation (DIVE) using a picosecond-infrared-laser (PIRL). Given that DIVE extraction may provide improved access to the original composition of protein species in tissues, we compared the proteome composition of tissue protein homogenates after DIVE homogenization with conventional homogenizations. A higher number of intact protein species was observed in DIVE homogenates. Due to the ultrafast transfer of proteins from tissues via gas phase into frozen condensates of the aerosols, intact protein species were exposed to a lesser extent to enzymatic degradation reactions compared with conventional protein extraction. In addition, total yield of the number of proteins is higher in DIVE homogenates, because they are very homogenous and contain almost no insoluble particles, allowing direct analysis with subsequent analytical methods without the necessity of centrifugation. Enzymatic protein modifications during tissue homogenization are responsible for changes of the in-vivo protein species composition. Cold vaporization of tissues by PIRL-DIVE is comparable with taking a snapshot at the time of the laser irradiation of the dynamic changes that occur continuously under in-vivo conditions. At that time point all biomolecules are transferred into an aerosol, which is immediately frozen. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  19. Homogenization of tissues via picosecond-infrared laser (PIRL) ablation: Giving a closer view on the in-vivo composition of protein species as compared to mechanical homogenization

    PubMed Central

    Kwiatkowski, M.; Wurlitzer, M.; Krutilin, A.; Kiani, P.; Nimer, R.; Omidi, M.; Mannaa, A.; Bussmann, T.; Bartkowiak, K.; Kruber, S.; Uschold, S.; Steffen, P.; Lübberstedt, J.; Küpker, N.; Petersen, H.; Knecht, R.; Hansen, N.O.; Zarrine-Afsar, A.; Robertson, W.D.; Miller, R.J.D.; Schlüter, H.

    2016-01-01

    Posttranslational modifications and proteolytic processing regulate almost all physiological processes. Dysregulation can potentially result in pathologic protein species causing diseases. Thus, tissue species proteomes of diseased individuals provide diagnostic information. Since the composition of tissue proteomes can rapidly change during tissue homogenization by the action of enzymes released from their compartments, disease specific protein species patterns can vanish. Recently, we described a novel, ultrafast and soft method for cold vaporization of tissue via desorption by impulsive vibrational excitation (DIVE) using a picosecond-infrared-laser (PIRL). Given that DIVE extraction may provide improved access to the original composition of protein species in tissues, we compared the proteome composition of tissue protein homogenates after DIVE homogenization with conventional homogenizations. A higher number of intact protein species was observed in DIVE homogenates. Due to the ultrafast transfer of proteins from tissues via gas phase into frozen condensates of the aerosols, intact protein species were exposed to a lesser extent to enzymatic degradation reactions compared with conventional protein extraction. In addition, total yield of the number of proteins is higher in DIVE homogenates, because they are very homogenous and contain almost no insoluble particles, allowing direct analysis with subsequent analytical methods without the necessity of centrifugation. Biological significance Enzymatic protein modifications during tissue homogenization are responsible for changes of the in-vivo protein species composition. Cold vaporization of tissues by PIRL-DIVE is comparable with taking a snapshot at the time of the laser irradiation of the dynamic changes that occur continuously under in-vivo conditions. At that time point all biomolecules are transferred into an aerosol, which is immediately frozen. PMID:26778141

  20. Cellular uptake of beta-carotene from protein stabilized solid lipid nano-particles prepared by homogenization-evaporation method

    USDA-ARS?s Scientific Manuscript database

    Using a homogenization-evaporation method, beta-carotene (BC) loaded nano-particles were prepared with different ratios of food-grade sodium caseinate (SC), whey protein isolate (WPI), or soy protein isolate (SPI) to BC and evaluated for their physiochemical stability, in vitro cytotoxicity, and cel...

  1. Optimization of the Magnetic Field Homogeneity Area for Solenoid Type Magnets

    NASA Astrophysics Data System (ADS)

    Perepelkin, Eugene; Polyakova, Rima; Tarelkin, Aleksandr; Kovalenko, Alexander; Sysoev, Pavel; Sadovnikova, Marianne; Yudin, Ivan

    2018-02-01

    Homogeneous magnetic fields are important requisites in modern physics research. In this paper we discuss the problem of magnetic field homogeneity area maximization for solenoid magnets. We discuss A-model and B-model, which are basic types of solenoid magnets used to provide a homogeneous field, and methods for their optimization. We propose C-model which can be used for the NICA project. We have also carried out a cross-check of the C-model with the parameters stated for the CLEO II detector.

  2. First-order reactant in homogeneous turbulence before the final period of decay. [contaminant fluctuations in chemical reaction

    NASA Technical Reports Server (NTRS)

    Kumar, P.; Patel, S. R.

    1974-01-01

    A method is described for studying theoretically the concentration fluctuations of a dilute contaminant undergoing a first-order chemical reaction. The method is based on Deissler's (1958) theory for homogeneous turbulence for times before the final period, and it follows the approach used by Loeffler and Deissler (1961) to study temperature fluctuations in homogeneous turbulence. Four-point correlation equations are obtained; it is assumed that terms containing fifth-order correlations are very small in comparison with those containing fourth-order correlations, and can therefore be neglected. A spectrum equation is obtained in a form which can be solved numerically, yielding the decay law for the concentration fluctuations in homogeneous turbulence for the period much before the final period of decay.

  3. Pellet pestle homogenization of agarose gel slices at 45 degrees C for deoxyribonucleic acid extraction.

    PubMed

    Kurien, B T; Kaufman, K M; Harley, J B; Scofield, R H

    2001-09-15

    A simple method for extracting DNA from agarose gel slices is described. The extraction is rapid and does not involve harsh chemicals or sophisticated equipment. The method involves homogenization of the excised gel slice (in Tris-EDTA buffer), containing the DNA fragment of interest, at 45 degrees C in a microcentrifuge tube with a Kontes pellet pestle for 1 min. The "homogenate" is then centrifuged for 30 s and the supernatant is saved. The "homogenized" agarose is extracted one more time and the supernatant obtained is combined with the previous supernatant. The DNA extracted using this method lent itself to restriction enzyme analysis, ligation, transformation, and expression of functional protein in bacteria. This method was found to be applicable with 0.8, 1.0, and 2.0% agarose gels. DNA fragments varying from 23 to 0.4 kb were extracted using this procedure and a yield ranging from 40 to 90% was obtained. The yield was higher for fragments 2.0 kb and higher (70-90%). This range of efficiency was maintained when the starting material was kept between 10 and 300 ng. The heat step was found to be critical since homogenization at room temperature failed to yield any DNA. Extracting DNA with our method elicited an increased yield (up to twofold) compared with that extracted with a commercial kit. Also, the number of transformants obtained using the DNA extracted with our method was at least twice that obtained using the DNA extracted with the commercial kit. Copyright 2001 Academic Press.

  4. Method for preparing hydrous titanium oxide spherules and other gel forms thereof

    DOEpatents

    Collins, J.L.

    1998-10-13

    The present invention provides methods for preparing hydrous titanium oxide spherules, hydrous titanium oxide gels such as gel slabs, films, capillary and electrophoresis gels, titanium monohydrogen phosphate spherules, hydrous titanium oxide spherules having suspendible particles homogeneously embedded within to form a composite sorbent, titanium monohydrogen phosphate spherules having suspendible particles of at least one different sorbent homogeneously embedded within to form a composite sorbent having a desired crystallinity, titanium oxide spherules in the form of anatase, brookite or rutile, titanium oxide spherules having suspendible particles homogeneously embedded within to form a composite, hydrous titanium oxide fiber materials, titanium oxide fiber materials, hydrous titanium oxide fiber materials having suspendible particles homogeneously embedded within to form a composite, titanium oxide fiber materials having suspendible particles homogeneously embedded within to form a composite and spherules of barium titanate. These variations of hydrous titanium oxide spherules and gel forms prepared by the gel-sphere, internal gelation process offer more useful forms of inorganic ion exchangers, catalysts, getters and ceramics. 6 figs.

  5. Method for preparing hydrous titanium oxide spherules and other gel forms thereof

    DOEpatents

    Collins, Jack L.

    1998-01-01

    The present invention provides methods for preparing hydrous titanium oxide spherules, hydrous titanium oxide gels such as gel slabs, films, capillary and electrophoresis gels, titanium monohydrogen phosphate spherules, hydrous titanium oxide spherules having suspendible particles homogeneously embedded within to form a composite sorbent, titanium monohydrogen phosphate spherules having suspendible particles of at least one different sorbent homogeneously embedded within to form a composite sorbent having a desired crystallinity, titanium oxide spherules in the form of anatase, brookite or rutile, titanium oxide spherules having suspendible particles homogeneously embedded within to form a composite, hydrous titanium oxide fiber materials, titanium oxide fiber materials, hydrous titanium oxide fiber materials having suspendible particles homogeneously embedded within to form a composite, titanium oxide fiber materials having suspendible particles homogeneously embedded within to form a composite and spherules of barium titanate. These variations of hydrous titanium oxide spherules and gel forms prepared by the gel-sphere, internal gelation process offer more useful forms of inorganic ion exchangers, catalysts, getters and ceramics.

  6. Sample preparation methods for scanning electron microscopy of homogenized Al-Mg-Si billets: A comparative study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Österreicher, Johannes Albert; Kumar, Manoj

    Characterization of Mg-Si precipitates is crucial for optimizing the homogenization heat treatment of Al-Mg-Si alloys. Although sample preparation is key for high quality scanning electron microscopy imaging, most common methods lead to dealloying of Mg-Si precipitates. In this article we systematically evaluate different sample preparation methods: mechanical polishing, etching with various reagents, and electropolishing using different electrolytes. We demonstrate that the use of a nitric acid and methanol electrolyte for electropolishing a homogenized Al-Mg-Si alloy prevents the dissolution of Mg-Si precipitates, resulting in micrographs of higher quality. This preparation method is investigated in depth and the obtained scanning electron microscopy images are compared with transmission electron micrographs: the shape and size of Mg-Si precipitates appear very similar in either method. The scanning electron micrographs allow proper identification and measurement of the Mg-Si phases including needles with lengths of roughly 200 nm. These needles are β″ precipitates as confirmed by high resolution transmission electron microscopy. - Highlights: •Secondary precipitation in homogenized 6xxx Al alloys is crucial for extrudability. •Existing sample preparation methods for SEM are improvable. •Electropolishing with nitric acid/methanol yields superior quality in SEM. •The obtained micrographs are compared to TEM micrographs.

  7. A CUMULATIVE MIGRATION METHOD FOR COMPUTING RIGOROUS TRANSPORT CROSS SECTIONS AND DIFFUSION COEFFICIENTS FOR LWR LATTICES WITH MONTE CARLO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhaoyuan Liu; Kord Smith; Benoit Forget

    2016-05-01

    A new method for computing homogenized assembly neutron transport cross sections and diffusion coefficients that is both rigorous and computationally efficient is proposed in this paper. In the limit of a homogeneous hydrogen slab, the new method is equivalent to the long-used, and only-recently-published CASMO transport method. The rigorous method is used to demonstrate the sources of inaccuracy in the commonly applied “out-scatter” transport correction. It is also demonstrated that the newly developed method is directly applicable to lattice calculations performed by Monte Carlo and is capable of computing rigorous homogenized transport cross sections for arbitrarily heterogeneous lattices. Comparisons of several common transport cross section approximations are presented for a simple problem of infinite medium hydrogen. The new method has also been applied in computing 2-group diffusion data for an actual PWR lattice from the BEAVRS benchmark.

  8. High Shear Homogenization of Lignin to Nanolignin and Thermal Stability of Nanolignin-Polyvinyl Alcohol Blends

    Treesearch

    Sandeep S. Nair; Sudhir Sharma; Yunqiao Pu; Qining Sun; Shaobo Pan; J.Y. Zhu; Yulin Deng; Art J. Ragauskas

    2014-01-01

    A new method to prepare nanolignin using a simple high shear homogenizer is presented. The kraft lignin particles with a broad distribution ranging from large micron- to nano-sized particles were completely homogenized to nanolignin particles with sizes less than 100 nm after 4 h of mechanical shearing. The 13C nuclear magnetic resonance (NMR)...

  9. A Comparison of Aerosolization and Homogenization Techniques for Production of Alginate Microparticles for Delivery of Corticosteroids to the Colon.

    PubMed

    Samak, Yassmin O; El Massik, Magda; Coombes, Allan G A

    2017-01-01

    Alginate microparticles incorporating hydrocortisone hemisuccinate were produced by aerosolization and homogenization methods to investigate their potential for colonic drug delivery. Microparticle stabilization was achieved by CaCl2 crosslinking solution (0.5 M and 1 M), and drug loading was accomplished by diffusion into blank microparticles or by direct encapsulation. Homogenization method produced smaller microparticles (45-50 μm), compared to aerosolization (65-90 μm). High drug loadings (40% wt/wt) were obtained for diffusion-loaded aerosolized microparticles. Aerosolized microparticles suppressed drug release in simulated gastric fluid (SGF) and simulated intestinal fluid (SIF) prior to drug release in simulated colonic fluid (SCF) to a higher extent than homogenized microparticles. Microparticles prepared using aerosolization or homogenization (1 M CaCl2, diffusion loaded) released 5% and 17% of drug content after 2 h in SGF and 4 h in SIF, respectively, and 75% after 12 h in SCF. Thus, aerosolization and homogenization techniques show potential for producing alginate microparticles for colonic drug delivery in the treatment of inflammatory bowel disease. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  10. Direct vibro-elastography FEM inversion in Cartesian and cylindrical coordinate systems without the local homogeneity assumption

    NASA Astrophysics Data System (ADS)

    Honarvar, M.; Lobo, J.; Mohareri, O.; Salcudean, S. E.; Rohling, R.

    2015-05-01

    To produce images of tissue elasticity, the vibro-elastography technique involves applying a steady-state multi-frequency vibration to tissue, estimating displacements from ultrasound echo data, and using the estimated displacements in an inverse elasticity problem with the shear modulus spatial distribution as the unknown. In order to fully solve the inverse problem, all three displacement components are required. However, using ultrasound, the axial component of the displacement is measured much more accurately than the other directions. Therefore, simplifying assumptions must be used in this case. Usually, the equations of motion are transformed into a Helmholtz equation by assuming tissue incompressibility and local homogeneity. The local homogeneity assumption causes significant imaging artifacts in areas of varying elasticity. In this paper, we remove the local homogeneity assumption. In particular we introduce a new finite element based direct inversion technique in which only the coupling terms in the equation of motion are ignored, so it can be used with only one component of the displacement. Both Cartesian and cylindrical coordinate systems are considered. The use of multi-frequency excitation also allows us to obtain multiple measurements and reduce artifacts in areas where the displacement of one frequency is close to zero. The proposed method was tested in simulations and experiments against a conventional approach in which the local homogeneity is used. The results show significant improvements in elasticity imaging with the new method compared to previous methods that assume local homogeneity. For example, in simulations, the contrast to noise ratio (CNR) for the region with a spherical inclusion increases from an average value of 1.5 to 17 after using the proposed method instead of the local inversion with the homogeneity assumption, and similarly in the prostate phantom experiment, the CNR improved from an average value of 1.6 to about 20.
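
    Contrast-to-noise ratio (CNR) is the figure of merit cited above; definitions vary between papers, and the one below (mean contrast between inclusion and background normalized by the background standard deviation) is a common choice rather than necessarily the authors' exact formula. A sketch with a synthetic elasticity map:

```python
import numpy as np

def cnr(elasticity_map, inclusion_mask, background_mask):
    """Contrast-to-noise ratio between an inclusion and the background region."""
    inc = elasticity_map[inclusion_mask]
    bg = elasticity_map[background_mask]
    return np.abs(inc.mean() - bg.mean()) / bg.std()

# Synthetic 2D elasticity image with a stiff circular inclusion
rng = np.random.default_rng(1)
img = 1.0 + 0.1 * rng.standard_normal((100, 100))
yy, xx = np.mgrid[:100, :100]
inclusion = (yy - 50) ** 2 + (xx - 50) ** 2 < 15 ** 2
img[inclusion] += 2.0
print(f"CNR = {cnr(img, inclusion, ~inclusion):.1f}")
```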

  11. Effect of high-pressure homogenization preparation on mean globule size and large-diameter tail of oil-in-water injectable emulsions.

    PubMed

    Peng, Jie; Dong, Wu-Jun; Li, Ling; Xu, Jia-Ming; Jin, Du-Jia; Xia, Xue-Jun; Liu, Yu-Ling

    2015-12-01

    The effects of different high-pressure homogenization energy input parameters on the mean diameter droplet size (MDS) and on droplets > 5 μm in lipid injectable emulsions were evaluated. All emulsions were prepared at different water bath temperatures or at different rotation speeds and rotor-stator system times, and using different homogenization pressures and numbers of high-pressure system recirculations. The MDS and polydispersity index (PI) value of the emulsions were determined using the dynamic light scattering (DLS) method, and large-diameter tail assessments were performed using the light-obscuration/single particle optical sensing (LO/SPOS) method. Using 1000 bar homogenization pressure and seven recirculations, the energy input parameters related to the rotor-stator system will not have an effect on the final particle size results. When rotor-stator system energy input parameters are fixed, homogenization pressure and recirculation will affect mean particle size and large-diameter droplets. Particle size will decrease with increasing homogenization pressure from 400 bar to 1300 bar when homogenization recirculation is fixed; when the homogenization pressure is fixed at 1000 bar, the particle size of both MDS and percent of fat droplets exceeding 5 μm (PFAT5) will decrease with increasing homogenization recirculations, MDS dropped to 173 nm after five cycles and maintained this level, volume-weighted PFAT5 will drop to 0.038% after three cycles, so the "plateau" of MDS will come up later than that of PFAT5, and the optimal particle size is produced when both of them remained at plateau. Excess homogenization recirculation, such as nine cycles at 1000 bar, may lead to a PFAT5 increase to 0.060% rather than a decrease; therefore, the high-pressure homogenization procedure is the key factor affecting the particle size distribution of emulsions. Varying storage conditions (4-25°C) also influenced particle size, especially the PFAT5. Copyright © 2015. Published by Elsevier B.V.
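
    PFAT5 is the volume-weighted percentage of fat residing in globules larger than 5 μm, estimated from LO/SPOS counts. A simplified sketch of that calculation from a binned droplet-size distribution; the bins and counts are invented, and the pharmacopoeial procedure additionally normalizes by the formulation's total fat content and sample dilution rather than by the counted volume alone:

```python
import numpy as np

def pfat5(diameters_um, counts):
    """Volume-weighted percent of counted fat volume in droplets larger than 5 micrometres.

    diameters_um: bin-centre droplet diameters from LO/SPOS
    counts: droplet counts per bin (volume per bin taken as count * d**3)
    """
    d = np.asarray(diameters_um, dtype=float)
    n = np.asarray(counts, dtype=float)
    vol = n * d ** 3
    return 100.0 * vol[d > 5.0].sum() / vol.sum()

# Hypothetical size distribution dominated by sub-micron droplets
diameters = [0.1, 0.2, 0.5, 1.0, 2.0, 5.5, 8.0]
counts = [1e9, 8e8, 2e8, 5e6, 1e5, 30, 5]
print(f"PFAT5 ~ {pfat5(diameters, counts):.3f} %")
```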

  12. Comparison of statistical algorithms for detecting homogeneous river reaches along a longitudinal continuum

    NASA Astrophysics Data System (ADS)

    Leviandier, Thierry; Alber, A.; Le Ber, F.; Piégay, H.

    2012-02-01

    Seven methods designed to delineate homogeneous river segments, belonging to four families, namely — tests of homogeneity, contrast enhancing, spatially constrained classification, and hidden Markov models — are compared, firstly on their principles, then on a case study, and on theoretical templates. These templates contain patterns found in the case study but not considered in the standard assumptions of statistical methods, such as gradients and curvilinear structures. The influence of data resolution, noise and weak satisfaction of the assumptions underlying the methods is investigated. The control of the number of reaches obtained in order to achieve meaningful comparisons is discussed. No method is found that outperforms all the others on all trials. However, the methods with sequential algorithms (keeping at order n + 1 all breakpoints found at order n) fail more often than those running complete optimisation at any order. The Hubert-Kehagias method and Hidden Markov Models are the most successful at identifying subpatterns encapsulated within the templates. Ergodic Hidden Markov Models are, moreover, liable to exhibit transition areas.
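
    Several of the methods compared above (notably the Hubert-Kehagias family) search for the segmentation that minimizes within-segment variability over all possible breakpoints. The dynamic-programming sketch below illustrates that idea generically; it is not a reimplementation of any of the seven algorithms, and it fixes the number of segments k rather than selecting it:

```python
import numpy as np

def segment(series, k):
    """Split a 1-D series into k contiguous segments minimizing the within-segment SSE."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    s1 = np.concatenate(([0.0], np.cumsum(x)))       # prefix sums
    s2 = np.concatenate(([0.0], np.cumsum(x ** 2)))  # prefix sums of squares

    def sse(i, j):  # squared-error cost of segment x[i:j] (j exclusive)
        total = s1[j] - s1[i]
        return (s2[j] - s2[i]) - total * total / (j - i)

    dp = np.full((k + 1, n + 1), np.inf)  # dp[c][j]: best cost of the first j points in c segments
    cut = np.zeros((k + 1, n + 1), dtype=int)
    dp[0][0] = 0.0
    for c in range(1, k + 1):
        for j in range(c, n + 1):
            for i in range(c - 1, j):
                cost = dp[c - 1][i] + sse(i, j)
                if cost < dp[c][j]:
                    dp[c][j], cut[c][j] = cost, i

    bounds, j = [], n          # backtrack the interior breakpoints
    for c in range(k, 0, -1):
        bounds.append(j)
        j = cut[c][j]
    return sorted(int(b) for b in bounds[1:])

print(segment([1, 1, 1, 5, 5, 5, 2, 2, 2], k=3))  # -> [3, 6]
```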

  13. Identification of homogeneous regions for regionalization of watersheds by two-level self-organizing feature maps

    NASA Astrophysics Data System (ADS)

    Farsadnia, F.; Rostami Kamrood, M.; Moghaddam Nia, A.; Modarres, R.; Bray, M. T.; Han, D.; Sadatinejad, J.

    2014-02-01

    One of the several methods in estimating flood quantiles in ungauged or data-scarce watersheds is regional frequency analysis. Amongst the approaches to regional frequency analysis, different clustering techniques have been proposed to determine hydrologically homogeneous regions in the literature. Recently, Self-Organization feature Map (SOM), a modern hydroinformatic tool, has been applied in several studies for clustering watersheds. However, further studies are still needed with SOM on the interpretation of SOM output map for identifying hydrologically homogeneous regions. In this study, two-level SOM and three clustering methods (fuzzy c-mean, K-mean, and Ward's Agglomerative hierarchical clustering) are applied in an effort to identify hydrologically homogeneous regions in Mazandaran province watersheds in the north of Iran, and their results are compared with each other. Firstly the SOM is used to form a two-dimensional feature map. Next, the output nodes of the SOM are clustered by using unified distance matrix algorithm and three clustering methods to form regions for flood frequency analysis. The heterogeneity test indicates the four regions achieved by the two-level SOM and Ward approach after adjustments are sufficiently homogeneous. The results suggest that the combination of SOM and Ward is much better than the combination of either SOM and FCM or SOM and K-mean.
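
    The two-level approach described above first compresses the watershed attributes onto a SOM grid and then clusters the SOM prototype vectors hierarchically. A hedged sketch of that pipeline using the third-party MiniSom package and SciPy's Ward linkage; the grid size, iteration count and attribute matrix are illustrative choices, not the study's settings:

```python
import numpy as np
from minisom import MiniSom                            # third-party SOM implementation
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical attribute matrix: rows = watersheds, columns = standardized attributes
rng = np.random.default_rng(0)
attributes = rng.normal(size=(60, 6))

# Level 1: train a small SOM on the attributes
som = MiniSom(6, 6, attributes.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(attributes, 5000)

# Level 2: Ward hierarchical clustering of the SOM prototype (codebook) vectors
prototypes = som.get_weights().reshape(-1, attributes.shape[1])
node_labels = fcluster(linkage(prototypes, method="ward"), t=4, criterion="maxclust")

# Each watershed inherits the cluster of its best-matching SOM node
watershed_labels = [node_labels[i * 6 + j] for i, j in (som.winner(x) for x in attributes)]
print(watershed_labels[:10])
```

    The resulting candidate regions would still need the L-moment heterogeneity test and manual adjustment described in the abstract before being accepted as hydrologically homogeneous.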

  14. Probabilistic homogenization of random composite with ellipsoidal particle reinforcement by the iterative stochastic finite element method

    NASA Astrophysics Data System (ADS)

    Sokołowski, Damian; Kamiński, Marcin

    2018-01-01

    This study proposes a framework for determination of basic probabilistic characteristics of the orthotropic homogenized elastic properties of the periodic composite reinforced with ellipsoidal particles and a high stiffness contrast between the reinforcement and the matrix. Homogenization problem, solved by the Iterative Stochastic Finite Element Method (ISFEM) is implemented according to the stochastic perturbation, Monte Carlo simulation and semi-analytical techniques with the use of cubic Representative Volume Element (RVE) of this composite containing single particle. The given input Gaussian random variable is Young modulus of the matrix, while 3D homogenization scheme is based on numerical determination of the strain energy of the RVE under uniform unit stretches carried out in the FEM system ABAQUS. The entire series of several deterministic solutions with varying Young modulus of the matrix serves for the Weighted Least Squares Method (WLSM) recovery of polynomial response functions finally used in stochastic Taylor expansions inherent for the ISFEM. A numerical example consists of the High Density Polyurethane (HDPU) reinforced with the Carbon Black particle. It is numerically investigated (1) if the resulting homogenized characteristics are also Gaussian and (2) how the uncertainty in matrix Young modulus affects the effective stiffness tensor components and their PDF (Probability Density Function).
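
    Stripped to its numerical core, the workflow above is: run several deterministic homogenizations over a sweep of matrix Young's moduli, recover a polynomial response function by weighted least squares, and propagate the Gaussian input through it with a Taylor (perturbation) expansion, cross-checked by Monte Carlo. A simplified sketch of that recovery-and-expansion step only, with the FEM sweep replaced by a made-up response:

```python
import numpy as np

# Hypothetical sweep: matrix Young's modulus values and a stand-in for the
# homogenized stiffness component that a deterministic FEM run would return
E_matrix = np.linspace(2.0, 6.0, 9)                  # GPa, design points
C_eff = 4.0 + 1.3 * E_matrix + 0.05 * E_matrix ** 2  # invented response values

# Weighted least-squares recovery of a polynomial response function C_eff(E)
coeffs = np.polyfit(E_matrix, C_eff, deg=3, w=np.ones_like(E_matrix))
poly = np.poly1d(coeffs)
d1, d2 = poly.deriv(1), poly.deriv(2)

# Perturbation estimates for Gaussian input E ~ N(mu, sigma^2)
mu, sigma = 4.0, 0.4
mean_C = poly(mu) + 0.5 * d2(mu) * sigma ** 2        # second-order mean estimate
var_C = (d1(mu) * sigma) ** 2                        # first-order variance estimate
print(f"E[C_eff] ~ {mean_C:.3f}, Var[C_eff] ~ {var_C:.4f}")

# Monte Carlo cross-check on the fitted response function
samples = poly(np.random.default_rng(1).normal(mu, sigma, 100_000))
print(f"MC: mean {samples.mean():.3f}, var {samples.var():.4f}")
```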

  15. 3D geometric split-merge segmentation of brain MRI datasets.

    PubMed

    Marras, Ioannis; Nikolaidis, Nikolaos; Pitas, Ioannis

    2014-05-01

    In this paper, a novel method for MRI volume segmentation based on region adaptive splitting and merging is proposed. The method, called Adaptive Geometric Split Merge (AGSM) segmentation, aims at finding complex geometrical shapes that consist of homogeneous geometrical 3D regions. In each volume splitting step, several splitting strategies are examined and the most appropriate is activated. A way to find the maximal homogeneity axis of the volume is also introduced. Along this axis, the volume splitting technique divides the entire volume in a number of large homogeneous 3D regions, while at the same time, it defines more clearly small homogeneous regions within the volume in such a way that they have greater probabilities of survival at the subsequent merging step. Region merging criteria are proposed to this end. The presented segmentation method has been applied to brain MRI medical datasets to provide segmentation results when each voxel is composed of one tissue type (hard segmentation). The volume splitting procedure does not require training data, while it demonstrates improved segmentation performance in noisy brain MRI datasets, when compared to the state of the art methods. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Use of focused acoustics for cell disruption to provide ultra scale-down insights of microbial homogenization and its bioprocess impact--recovery of antibody fragments from rec E. coli.

    PubMed

    Li, Qiang; Aucamp, Jean P; Tang, Alison; Chatel, Alex; Hoare, Mike

    2012-08-01

    An ultra scale-down (USD) device that provides insight of how industrial homogenization impacts bioprocess performance is desirable in the biopharmaceutical industry, especially at the early stage of process development where only a small quantity of material is available. In this work, we assess the effectiveness of focused acoustics as the basis of an USD cell disruption method to mimic and study high-pressure, step-wise homogenization of rec Escherichia coli cells for the recovery of an intracellular protein, antibody fragment (Fab'). The release of both Fab' and of overall protein follows first-order reaction kinetics with respect to time of exposure to focused acoustics. The rate constant is directly proportional to applied electrical power input per unit volume. For nearly total protein or Fab' release (>99%), the key physical properties of the disruptate produced by focused acoustics, such as cell debris particle size distribution and apparent viscosity show good agreement with those for homogenates produced by high-pressure homogenization operated to give the same fractional release. The only key difference is observed for partial disruption of cells where focused acoustics yields a disruptate of lower viscosity than homogenization, evidently due to a greater extent of polynucleic acids degradation. Verification of this USD approach to cell disruption by high-pressure homogenization is achieved using USD centrifugation to demonstrate the same sedimentation characteristics of disruptates prepared using both the scaled-down focused acoustic and the pilot-scale homogenization methods for the same fraction of protein release. Copyright © 2012 Wiley Periodicals, Inc.
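
    The stated kinetics (first order in exposure time, rate constant proportional to power per unit volume) can be written compactly; R_max, k and the proportionality constant are generic symbols rather than fitted values from the paper:

```latex
% First-order release of Fab' (or total protein) under focused acoustics
R(t) = R_{\max}\left(1 - e^{-kt}\right), \qquad k \propto \frac{P}{V}
% R(t): cumulative release after exposure time t;  P/V: electrical power input per unit volume
```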

  17. Homogeneous fluorescent specific PCR for the authentication of medicinal snakes using cationic conjugated polymers.

    PubMed

    Jiang, Chao; Yuan, Yuan; Liu, Libing; Hou, Jingyi; Jin, Yan; Huang, Luqi

    2015-11-05

    A label-free, homogenous and sensitive one-step method for the molecular authentication of medicinal snakes has been developed by combining a rapid PCR technique with water-soluble cationic conjugated polyelectrolytes (CCPs). Three medicinal snake materials (Deinagkistrodon acutus, Zaocys dhumnades and Bungarus multicinctus; a total of 35 specimens) and 48 snake specimens with similar morphologies and textures were clearly distinguished by the naked eye by utilizing a CCP-based assay in a high-throughput manner. The identification of medicinal snakes in patented Chinese drugs was successfully performed using this detection system. In contrast to previous fluorescence-labeled oligonucleotide detection and direct DNA stain hybridization assays, this method does not require designing dye-labeled primers, and unfavorable dimer fluorescence is avoided in this homogenous method.

  18. Method for preparing hydrous iron oxide gels and spherules

    DOEpatents

    Collins, Jack L.; Lauf, Robert J.; Anderson, Kimberly K.

    2003-07-29

    The present invention is directed to methods for preparing hydrous iron oxide spherules, hydrous iron oxide gels such as gel slabs, films, capillary and electrophoresis gels, iron monohydrogen phosphate spherules, hydrous iron oxide spherules having suspendable particles homogeneously embedded within to form composite sorbents and catalysts, iron monohydrogen phosphate spherules having suspendable particles of at least one different sorbent homogeneously embedded within to form a composite sorbent, iron oxide spherules having suspendable particles homogeneously embedded within to form a composite of hydrous iron oxide fiber materials, iron oxide fiber materials, hydrous iron oxide fiber materials having suspendable particles homogeneously embedded within to form a composite, iron oxide fiber materials having suspendable particles homogeneously embedded within to form a composite, dielectric spherules of barium, strontium, and lead ferrites and mixtures thereof, and composite catalytic spherules of barium or strontium ferrite embedded with oxides of Mg, Zn, Pb, Ce and mixtures thereof. These variations of hydrous iron oxide spherules and gel forms prepared by the gel-sphere, internal gelation process offer more useful forms of inorganic ion exchangers, catalysts, getters, dielectrics, and ceramics.

  19. Effects of ultrasonication and conventional mechanical homogenization processes on the structures and dielectric properties of BaTiO3 ceramics.

    PubMed

    Akbas, Hatice Zehra; Aydin, Zeki; Yilmaz, Onur; Turgut, Selvin

    2017-01-01

    The effects of the homogenization process on the structures and dielectric properties of pure and Nb-doped BaTiO3 ceramics have been investigated using an ultrasonic homogenization and conventional mechanical methods. The reagents were homogenized using an ultrasonic processor with high-intensity ultrasonic waves and using a compact mixer-shaker. The components and crystal types of the powders were determined by Fourier-transform infrared spectroscopy (FTIR) and X-ray diffraction (XRD) analyses. The complex permittivity (ε′, ε″) and AC conductivity (σ′) of the samples were analyzed in a wide frequency range of 20 Hz to 2 MHz at room temperature. The structures and dielectric properties of pure and Nb-doped BaTiO3 ceramics strongly depend on the homogenization process in a solid-state reaction method. Using an ultrasonic processor with high-intensity ultrasonic waves based on acoustic cavitation phenomena can make a significant improvement in producing high-purity BaTiO3 ceramics without carbonate impurities with a small dielectric loss. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Enzymatic production of N-acetyl-d-glucosamine from crayfish shell wastes pretreated via high pressure homogenization.

    PubMed

    Wei, Guoguang; Zhang, Alei; Chen, Kequan; Ouyang, Pingkai

    2017-09-01

    This study presents an efficient pretreatment of crayfish shell using high pressure homogenization that enables N-acetyl-d-glucosamine (GlcNAc) production by chitinase. Firstly, the chitinase from Serratia proteamaculans NJ303 was screened for its ability to degrade crayfish shell and produce GlcNAc as the sole product. Secondly, high pressure homogenization, which caused the crayfish shell to adopt a fluffy netted structure that was characterized by scanning electron microscopy (SEM), Fourier-transform infrared spectroscopy (FT-IR), and X-ray diffraction (XRD), was evaluated as the best pretreatment method. In addition, the optimal conditions of high pressure homogenization of crayfish shell were determined to be five cycles at a pressure of 400 bar, which achieved a yield of 3.9 g/L of GlcNAc from 25 g/L of crayfish shell in a batch enzymatic reaction over 1.5 h. The results showed high pressure homogenization might be an efficient method for direct utilization of crayfish shell for enzymatic production of GlcNAc. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Enhanced Detection of Vibrio Cholerae in Oyster Homogenate Based on Centrifugal Removal of Inhibitory Agents

    NASA Technical Reports Server (NTRS)

    Alexander, Donita; DePaola, Angelo; Young, Ronald B.

    1998-01-01

    The disease cholera, caused by Vibrio cholerae, has been associated with consumption of contaminated seafood, including raw oysters. Detection of V. cholerae in foods typically involves blending the oysters, diluting the homogenate in alkaline peptone water (APW), overnight enrichment, and isolation on selective agar. Unfortunately, the oyster homogenate must be diluted to large volumes because lower dilutions inhibit the growth of V. cholerae. The goals of this study were to develop an alternative to large dilutions and to evaluate the basis for the inhibition observed in lower dilutions of oyster homogenates. Centrifugation of oyster homogenates at 10,000 x g for 15 min, followed by enrichment of the resulting pellet in APW, was found to eliminate the inhibition of V. cholerae growth. Inhibition appears not to be due to competing microflora but to a component(s) released when V. cholerae grows in the presence of oyster homogenate. The inhibitory component(s) kills the V. cholerae after the cell concentration reaches > 10^8 cells/mL, rather than initially preventing their growth. The pH also declines from 8.0 to 5.5 during this period; however, the pH decline by itself appears not to cause V. cholerae death. Seven strains of V. cholerae (O1 and non-O1) and two strains of V. vulnificus were susceptible to the inhibitory agent(s). However, other Vibrio and non-Vibrio species tested were not inhibited by the oyster homogenates. Based on digestion of oyster homogenates with pronase, trypsin and lipase, the inhibitory reaction involves a protein(s). In a preliminary trial with oyster homogenate seeded with 1 cfu/g of V. cholerae, the modified centrifugation technique detected a slightly higher percentage of samples at a 1:10 dilution than the standard FDA Bacteriological Analytical Method (BAM) detected in uncentrifuged oyster homogenate at a 1:100 dilution. V. cholerae in seeded samples could also be detected more frequently by the modified centrifugation method than by PCR at a 1:10 dilution.

  2. General expressions for downlink signal to interference and noise ratio in homogeneous and heterogeneous LTE-Advanced networks.

    PubMed

    Ali, Nora A; Mourad, Hebat-Allah M; ElSayed, Hany M; El-Soudani, Magdy; Amer, Hassanein H; Daoud, Ramez M

    2016-11-01

    The interference is the most important problem in LTE or LTE-Advanced networks. In this paper, the interference was investigated in terms of the downlink signal to interference and noise ratio (SINR). In order to compare the different frequency reuse methods that were developed to enhance the SINR, it would be helpful to have a generalized expression to study the performance of the different methods. Therefore, this paper introduces general expressions for the SINR in homogeneous and in heterogeneous networks. In homogeneous networks, the expression was applied for the most common types of frequency reuse techniques: soft frequency reuse (SFR) and fractional frequency reuse (FFR). The expression was examined by comparing it with previously developed ones in the literature and the comparison showed that the expression is valid for any type of frequency reuse scheme and any network topology. Furthermore, the expression was extended to include the heterogeneous network; the expression includes the problem of co-tier and cross-tier interference in heterogeneous networks (HetNet) and it was examined by the same method of the homogeneous one.
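
    Without reproducing the paper's generalized expressions, the quantity being generalized is the familiar downlink SINR of a user served by cell 0 in the presence of co-channel interferers; the textbook form below is a generic starting point, not the authors' final formula:

```latex
\mathrm{SINR}_0 \;=\; \frac{P_0\, G_0\, L_0^{-1}}
{\sum_{i \in \mathcal{I}} P_i\, G_i\, L_i^{-1} \;+\; N_0 W}
% P_i: transmit power, G_i: antenna gain, L_i: path loss of cell i
% \mathcal{I}: co-channel interfering cells (macro and/or small-cell tiers), N_0 W: noise power
```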

  3. Method of assessing heterogeneity in images

    DOEpatents

    Jacob, Richard E.; Carson, James P.

    2016-08-23

    A method of assessing heterogeneity in images is disclosed. 3D images of an object are acquired. The acquired images may be filtered and masked. Iterative decomposition is performed on the masked images to obtain image subdivisions that are relatively homogeneous. Comparative analysis, such as variogram analysis or correlogram analysis, is performed of the decomposed images to determine spatial relationships between regions of the images that are relatively homogeneous.
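
    The comparative step named above (variogram analysis) reduces each relatively homogeneous subdivision to a curve of semivariance versus lag. A minimal one-dimensional empirical-variogram sketch, independent of the patented implementation:

```python
import numpy as np

def empirical_variogram(values, max_lag):
    """Semivariance gamma(h) = 0.5 * mean((z(x + h) - z(x))**2) for lags 1..max_lag."""
    z = np.asarray(values, dtype=float)
    lags = np.arange(1, max_lag + 1)
    gamma = np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])
    return lags, gamma

# Hypothetical 1-D intensity profile extracted from one image subdivision
rng = np.random.default_rng(2)
profile = np.sin(np.linspace(0.0, 6.0, 200)) + 0.1 * rng.standard_normal(200)
lags, gamma = empirical_variogram(profile, max_lag=20)
print([(int(h), round(float(g), 4)) for h, g in zip(lags[:5], gamma[:5])])
```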

  4. Variation of Parameters in Differential Equations (A Variation in Making Sense of Variation of Parameters)

    ERIC Educational Resources Information Center

    Quinn, Terry; Rai, Sanjay

    2012-01-01

    The method of variation of parameters can be found in most undergraduate textbooks on differential equations. The method leads to solutions of the non-homogeneous equation of the form y = u_1y_1 + u_2y_2, a sum of function products using solutions to the homogeneous equation y_1 and…
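
    For readers who want the concrete formulas behind this summary, the standard variation-of-parameters construction for y'' + p(x)y' + q(x)y = g(x), with homogeneous solutions y_1 and y_2, is:

```latex
y_p = u_1 y_1 + u_2 y_2, \qquad
u_1' = -\frac{y_2\, g}{W(y_1, y_2)}, \quad
u_2' = \frac{y_1\, g}{W(y_1, y_2)}, \qquad
W(y_1, y_2) = y_1 y_2' - y_1' y_2
```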

  5. Effects of poling over the orthorhombic-tetragonal phase transition temperature in compositionally homogeneous (K,Na)NbO3-based ceramics

    NASA Astrophysics Data System (ADS)

    Morozov, M. I.; Kungl, H.; Hoffmann, M. J.

    2011-03-01

    Li-, Ta-, and Mn-modified (K,Na)NbO3 ceramics with various compositional homogeneity have been prepared by conventional and precursor methods. The homogeneous ceramic has demonstrated a sharper peak in temperature dependent piezoelectric response. The dielectric and piezoelectric properties of the homogeneous ceramics have been characterized at the experimental subcoercive electric fields near the temperature of the orthorhombic-tetragonal phase transition with respect to poling in both phases. Poling in the tetragonal phase is shown to enhance the low-signal dielectric and piezoelectric properties in the orthorhombic phase.

  6. Homogenization models for thin rigid structured surfaces and films.

    PubMed

    Marigo, Jean-Jacques; Maurel, Agnès

    2016-07-01

    A homogenization method for thin microstructured surfaces and films is presented. In both cases, sound hard materials are considered, associated with Neumann boundary conditions and the wave equation in the time domain is examined. For a structured surface, a boundary condition is obtained on an equivalent flat wall, which links the acoustic velocity to its normal and tangential derivatives (of the Myers type). For a structured film, jump conditions are obtained for the acoustic pressure and the normal velocity across an equivalent interface (of the Ventcels type). This interface homogenization is based on a matched asymptotic expansion technique, and differs slightly from the classical homogenization, which is known to fail for small structuration thicknesses. In order to get insight into what causes this failure, a two-step homogenization is proposed, mixing classical homogenization and matched asymptotic expansion. Results of the two homogenizations are analyzed in light of the associated elementary problems, which correspond to problems of fluid mechanics, namely, potential flows around rigid obstacles.

  7. Computational Homogenization of Mechanical Properties for Laminate Composites Reinforced with Thin Film Made of Carbon Nanotubes

    NASA Astrophysics Data System (ADS)

    El Moumen, A.; Tarfaoui, M.; Lafdi, K.

    2018-06-01

    Elastic properties of laminate composites based on Carbon Nanotubes (CNTs), used in military applications, were estimated using homogenization techniques and compared to the experimental data. The composite consists of three phases: T300 6k carbon fibers fabric with 5HS (satin) weave, baseline pure Epoxy matrix and CNTs added with 0.5%, 1%, 2% and 4%. Two-step homogenization methods based on an RVE model were employed. The objective of this paper is to determine the elastic properties of the structure starting from the knowledge of those of the constituents (CNTs, Epoxy and carbon fibers fabric). It is assumed that the composites have a geometric periodicity and the homogenization model can be represented by a representative volume element (RVE). For multi-scale analysis, finite element modeling of the unit cell with the two-step homogenization method is used. The first step gives the properties of the thin film made of epoxy and CNTs and the second is used for homogenization of the laminate composite. The fabric unit cell is chosen using a set of microscopic observations and then identified by its ability to enclose the characteristic periodic repeat in the fabric weave. The unit cell model of the 5-Harness satin weave fabric textile composite is identified for the numerical approach and its dimensions are chosen based on microstructural measurements. Finally, a good agreement was obtained between the elastic properties predicted using the numerical homogenization approach and the experimental data.

  8. Computational Homogenization of Mechanical Properties for Laminate Composites Reinforced with Thin Film Made of Carbon Nanotubes

    NASA Astrophysics Data System (ADS)

    El Moumen, A.; Tarfaoui, M.; Lafdi, K.

    2017-08-01

    Elastic properties of laminate composites based on Carbon Nanotubes (CNTs), used in military applications, were estimated using homogenization techniques and compared to the experimental data. The composite consists of three phases: T300 6k carbon fibers fabric with 5HS (satin) weave, baseline pure Epoxy matrix and CNTs added with 0.5%, 1%, 2% and 4%. Two-step homogenization methods based on an RVE model were employed. The objective of this paper is to determine the elastic properties of the structure starting from the knowledge of those of the constituents (CNTs, Epoxy and carbon fibers fabric). It is assumed that the composites have a geometric periodicity and the homogenization model can be represented by a representative volume element (RVE). For multi-scale analysis, finite element modeling of the unit cell with the two-step homogenization method is used. The first step gives the properties of the thin film made of epoxy and CNTs and the second is used for homogenization of the laminate composite. The fabric unit cell is chosen using a set of microscopic observations and then identified by its ability to enclose the characteristic periodic repeat in the fabric weave. The unit cell model of the 5-Harness satin weave fabric textile composite is identified for the numerical approach and its dimensions are chosen based on microstructural measurements. Finally, a good agreement was obtained between the elastic properties predicted using the numerical homogenization approach and the experimental data.
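
    The two-step scheme above is FEM/RVE-based and the abstract gives no closed formulas. As a rough analytical stand-in for the same two-step idea (step 1: homogenize the CNT-epoxy film, step 2: combine the result with the fibre fabric), one can chain a Halpin-Tsai estimate with a rule of mixtures; every number below is illustrative, and a woven-fabric laminate really requires the unit-cell FEM the paper describes:

```python
def halpin_tsai(E_m, E_f, vf, zeta):
    """Halpin-Tsai estimate of a composite modulus from matrix and filler moduli."""
    eta = (E_f / E_m - 1.0) / (E_f / E_m + zeta)
    return E_m * (1.0 + zeta * eta * vf) / (1.0 - eta * vf)

# Step 1: CNT-modified epoxy film (illustrative moduli in GPa; zeta scales with CNT aspect ratio)
E_epoxy, E_cnt = 3.0, 1000.0
E_film = halpin_tsai(E_epoxy, E_cnt, vf=0.02, zeta=100.0)

# Step 2: crude rule-of-mixtures combination with the carbon-fibre phase
# (the paper instead homogenizes a 5-harness satin weave unit cell by FEM)
E_fibre, vf_fibre = 230.0, 0.55
E_laminate = vf_fibre * E_fibre + (1.0 - vf_fibre) * E_film
print(f"Step 1 film modulus ~ {E_film:.1f} GPa; step 2 laminate estimate ~ {E_laminate:.0f} GPa")
```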

  9. ANALYSIS OF FISH HOMOGENATES FOR PERFLUORINATED COMPOUNDS

    EPA Science Inventory

    Perfluorinated compounds (PFCs) which include PFOS and PFOA are widely distributed in wildlife. Whole fish homogenates were analyzed for PFCs from the upper Mississippi, the Missouri and the Ohio rivers. Methods development, validation data, and preliminary study results will b...

  10. Sol-gel methods for synthesis of aluminosilicates for dental applications.

    PubMed

    Cestari, Alexandre

    2016-12-01

    Amorphous aluminosilicate glasses containing fluorine, phosphorus and calcium are used as a component of the glass ionomer dental cement. This cement is used as a restorative, base or filling material, but presents lower mechanical resistance than resin-modified materials. The Sol-Gel method is a possible route for preparation of glasses with lower temperature and energy consumption, with higher homogeneity and with uniform and nanometric particles, compared to the industrial methods. Glass ionomer cements with uniform, homogeneous and nanometric particles can present higher mechanical resistance than commercial ionomers. The aim of this work was to adapt the Sol-Gel methods to produce new aluminosilicate glass particles by non-hydrolytic, hydrolytic acid and hydrolytic basic routes, to improve glass ionomer cement characteristics. Three materials were synthesized with the same composition, to evaluate the properties of the glasses produced from the different methods, because multicomponent oxides are difficult to prepare with homogeneity. The objective was to develop a new route to produce new glass particles for ionomer cements with possibly higher resistance. The particles were characterized by thermal analysis (TG, DTA, DSC), transmission electron microscopy (TEM), X-ray diffraction (XRD), infrared spectroscopy (FTIR) and scanning electron microscopy coupled with energy dispersive spectroscopy (SEM-EDS). The glasses were tested with polyacrylic acid to form the glass ionomer cement by the setting reaction. It was possible to produce distinct materials for dental applications and one sample presented superior characteristics (homogeneity, nanometric particles, and homogeneous elemental distribution) compared with commercial glasses for ionomer cements. The new route for glass production can possibly improve the mechanical resistance of the ionomer cements. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Automated cell disruption is a reliable and effective method of isolating RNA from fresh snap-frozen normal and malignant oral mucosa samples.

    PubMed

    Van der Vorst, Sébastien; Dekairelle, Anne-France; Irenge, Léonid; Hamoir, Marc; Robert, Annie; Gala, Jean-Luc

    2009-01-01

    This study compared automated vs. manual tissue grinding in terms of RNA yield obtained from oral mucosa biopsies. A total of 20 patients undergoing uvulectomy for sleep-related disorders and 10 patients undergoing biopsy for head and neck squamous cell carcinoma were enrolled in the study. Samples were collected, snap-frozen in liquid nitrogen, and divided into two parts of similar weight. Sample grinding was performed on one sample from each pair, either manually or using an automated cell disruptor. The performance and efficacy of each homogenization approach was compared in terms of total RNA yield (spectrophotometry, fluorometry), mRNA quantity [densitometry of specific TP53 amplicons and TP53 quantitative reverse-transcribed real-time PCR (qRT-PCR)], and mRNA quality (functional analysis of separated alleles in yeast). Although spectrophotometry and fluorometry results were comparable for both homogenization methods, TP53 expression values obtained by amplicon densitometry and qRT-PCR were significantly and consistently better after automated homogenization (p<0.005) for both uvula and tumor samples. Functional analysis of separated alleles in yeast results was better with the automated technique for tumor samples. Automated tissue homogenization appears to be a versatile, quick, and reliable method of cell disruption and is especially useful in the case of small malignant samples, which show unreliable results when processed by manual homogenization.

  12. On strong homogeneity of a class of global optimization algorithms working with infinite and infinitesimal scales

    NASA Astrophysics Data System (ADS)

    Sergeyev, Yaroslav D.; Kvasov, Dmitri E.; Mukhametzhanov, Marat S.

    2018-06-01

    The necessity to find the global optimum of multiextremal functions arises in many applied problems where finding local solutions is insufficient. One of the desirable properties of global optimization methods is strong homogeneity meaning that a method produces the same sequences of points where the objective function is evaluated independently both of multiplication of the function by a scaling constant and of adding a shifting constant. In this paper, several aspects of global optimization using strongly homogeneous methods are considered. First, it is shown that even if a method possesses this property theoretically, numerically very small and large scaling constants can lead to ill-conditioning of the scaled problem. Second, a new class of global optimization problems where the objective function can have not only finite but also infinite or infinitesimal Lipschitz constants is introduced. Third, the strong homogeneity of several Lipschitz global optimization algorithms is studied in the framework of the Infinity Computing paradigm allowing one to work numerically with a variety of infinities and infinitesimals. Fourth, it is proved that a class of efficient univariate methods enjoys this property for finite, infinite and infinitesimal scaling and shifting constants. Finally, it is shown that in certain cases the usage of numerical infinities and infinitesimals can avoid ill-conditioning produced by scaling. Numerical experiments illustrating theoretical results are described.
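
    The property can be checked numerically for any solver by recording where the objective is evaluated for f(x) and for a·f(x)+b and comparing the two sequences. The sketch below does this for plain golden-section search, which depends only on comparisons of function values and is therefore strongly homogeneous for a > 0; it is an illustrative toy, not one of the Lipschitz algorithms analyzed in the paper.

```python
import math

def golden_section(f, a, b, iters=30):
    """Golden-section search that records every evaluation point."""
    points = []
    def fe(x):
        points.append(round(x, 12))
        return f(x)
    invphi = (math.sqrt(5) - 1) / 2
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    fc, fd = fe(c), fe(d)
    for _ in range(iters):
        if fc < fd:                 # keep [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = fe(c)
        else:                       # keep [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = fe(d)
    return points

f = lambda x: (x - 0.3) ** 2 + 0.1 * math.sin(20 * x)
scaled = lambda x: 1e6 * f(x) - 42.0   # a*f(x) + b with a > 0
# Identical evaluation sequences -> strong homogeneity of this comparison-based method
print(golden_section(f, 0.0, 1.0) == golden_section(scaled, 0.0, 1.0))   # True
```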

  13. [Near infrared analysis of blending homogeneity of Chinese medicine formula particles based on moving window F test method].

    PubMed

    Yang, Chan; Xu, Bing; Zhang, Zhi-Qiang; Wang, Xin; Shi, Xin-Yuan; Fu, Jing; Qiao, Yan-Jiang

    2016-10-01

    Blending uniformity is essential to ensure the homogeneity of Chinese medicine formula particles within each batch. This study was based on the blending process of ebony spray-dried powder and dextrin (the proportion of dextrin was 10%), in which near infrared (NIR) diffuse reflectance spectra were collected from six different sampling points and analyzed in combination with the moving window F test method in order to assess the uniformity of the blending process. The method was validated against the changes in citric acid content determined by HPLC. The results of the moving window F test method showed that the ebony spray-dried powder and dextrin were homogeneous during 200-300 r and segregated during 300-400 r. An advantage of this method is that the threshold value is defined statistically rather than empirically, and thus it does not suffer from the threshold ambiguities common to the moving block standard deviation (MBSD). This method could also be employed to monitor other blending processes of Chinese medicine powders on line. Copyright© by the Chinese Pharmaceutical Association.
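
    A minimal sketch of the moving-window F-test idea, assuming the NIR data have been reduced to a single scalar per revolution (for example an absorbance or score value); the window length, the two-sided test layout and the significance level are assumptions for illustration, not the authors' exact settings.

```python
import numpy as np
from scipy.stats import f as f_dist

def moving_window_f_test(signal, window=20, alpha=0.05):
    """Slide two adjacent windows over the signal and F-test their variances.
    Returns, for each position, whether the variance ratio is NOT significant,
    i.e. whether the blend looks homogeneous across that pair of windows."""
    signal = np.asarray(signal, dtype=float)
    lo = f_dist.ppf(alpha / 2, window - 1, window - 1)
    hi = f_dist.ppf(1 - alpha / 2, window - 1, window - 1)
    homogeneous = []
    for start in range(len(signal) - 2 * window + 1):
        w1 = signal[start:start + window]
        w2 = signal[start + window:start + 2 * window]
        F = np.var(w1, ddof=1) / np.var(w2, ddof=1)
        homogeneous.append(lo < F < hi)
    return homogeneous

# synthetic example: well-mixed for the first 300 revolutions, drifting afterwards
rng = np.random.default_rng(1)
well_mixed = rng.normal(1.0, 0.02, 300)
segregating = rng.normal(1.0, 0.02, 100) + np.linspace(0, 0.3, 100)
print(moving_window_f_test(np.concatenate([well_mixed, segregating]))[-5:])
```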

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuan, Chris, E-mail: cyuan@uwm.edu; Wang, Endong; Zhai, Qiang

    Temporal homogeneity of inventory data is one of the major problems in life cycle assessment (LCA). Addressing temporal homogeneity of life cycle inventory data is important in reducing the uncertainties and improving the reliability of LCA results. This paper attempts to present a critical review and discussion on the fundamental issues of temporal homogeneity in conventional LCA and propose a theoretical framework for temporal discounting in LCA. Theoretical perspectives for temporal discounting in life cycle inventory analysis are discussed first based on the key elements of a scientific mechanism for temporal discounting. Then generic procedures for performing temporal discounting in LCA are derived and proposed based on the nature of the LCA method and the identified key elements of a scientific temporal discounting method. A five-step framework is proposed and reported in detail based on the technical methods and procedures needed to perform temporal discounting in life cycle inventory analysis. Challenges and possible solutions are also identified and discussed for the technical procedure and scientific accomplishment of each step within the framework. - Highlights: • A critical review of the temporal homogeneity problem of life cycle inventory data • A theoretical framework for performing temporal discounting on inventory data • Methods provided to accomplish each step of the temporal discounting framework.

  15. Comparing the index-flood and multiple-regression methods using L-moments

    NASA Astrophysics Data System (ADS)

    Malekinezhad, H.; Nachtnebel, H. P.; Klik, A.

    In arid and semi-arid regions, the length of records is usually too short to ensure reliable quantile estimates. Comparing index-flood and multiple-regression analyses based on L-moments was the main objective of this study. Factor analysis was applied to determine the main variables influencing flood magnitude. Ward's cluster and L-moments approaches were applied to several sites in the Namak-Lake basin in central Iran to delineate homogeneous regions based on site characteristics. The homogeneity test was done using L-moments-based measures. Several distributions were fitted to the regional flood data, and the index-flood and multiple-regression methods were compared as two regional flood frequency methods. The results of factor analysis showed that length of main waterway, compactness coefficient, mean annual precipitation, and mean annual temperature were the main variables affecting flood magnitude. The study area was divided into three regions based on Ward's clustering method. The homogeneity test based on L-moments showed that all three regions were acceptably homogeneous. Five distributions were fitted to the annual peak flood data of the three homogeneous regions. Using the L-moment ratios and the Z-statistic criteria, the GEV distribution was identified as the most robust of the five candidate distributions for all the proposed sub-regions of the study area; in other words, the generalised extreme value distribution was the best-fit distribution for all three regions. The relative root mean square error (RRMSE) measure was applied to evaluate the performance of the index-flood and multiple-regression methods in comparison with the curve fitting (plotting position) method. In general, the index-flood method gives more reliable estimates for various flood magnitudes at different recurrence intervals. Therefore, this method should be adopted as the regional flood frequency method for the study area and the Namak-Lake basin in central Iran. To estimate floods of various return periods for gauged catchments in the study area, the mean annual peak flood of a catchment may be multiplied by the corresponding values of the growth factors computed using the GEV distribution.
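
    For readers unfamiliar with the mechanics, the sketch below shows the core index-flood calculation: fit a dimensionless regional GEV growth curve from sample L-moments (using Hosking's approximations) and scale it by a site's mean annual flood. The data are synthetic and the pooling of sites is simplified; this is not the study's dataset or exact procedure.

```python
import numpy as np
from math import gamma, log

def sample_lmoments(x):
    """First three sample L-moments (l1, l2, t3) via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2

def gev_from_lmoments(l1, l2, t3):
    """GEV parameters (location xi, scale alpha, shape k) from L-moments,
    using Hosking's rational approximation for k."""
    c = 2.0 / (3.0 + t3) - log(2.0) / log(3.0)
    k = 7.8590 * c + 2.9554 * c ** 2
    alpha = l2 * k / ((1.0 - 2.0 ** (-k)) * gamma(1.0 + k))
    xi = l1 - alpha * (1.0 - gamma(1.0 + k)) / k
    return xi, alpha, k

def gev_quantile(F, xi, alpha, k):
    return xi + alpha * (1.0 - (-log(F)) ** k) / k

# synthetic "region": pool dimensionless data (each site scaled by its mean flood)
rng = np.random.default_rng(7)
sites = [rng.gumbel(50, 20, 30) * s for s in (1.0, 1.8, 0.6)]
pooled = np.concatenate([q / q.mean() for q in sites])
xi, alpha, k = gev_from_lmoments(*sample_lmoments(pooled))
growth_100yr = gev_quantile(1 - 1.0 / 100, xi, alpha, k)   # regional growth factor
print("100-year flood estimate at site 0:", sites[0].mean() * growth_100yr)
```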

  16. Isolation of Salmonella from alfalfa seed and demonstration of impaired growth of heat-injured cells in seed homogenates.

    PubMed

    Liao, Ching-Hsing; Fett, William F

    2003-05-15

    Three major foodborne outbreaks of salmonellosis in 1998 and 1999 were linked to the consumption of raw alfalfa sprouts. In this report, an improved method is described for isolation of Salmonella from alfalfa seed lots, which had been implicated in these outbreaks. From each seed lot, eight samples each containing 25 g of seed were tested for the presence of Salmonella by the US FDA Bacteriological Analytical Manual (BAM) procedure and by a modified method applying two successive pre-enrichment steps. Depending on the seed lot, one to four out of eight samples tested positive for Salmonella by the standard procedure and two to seven out of eight samples tested positive by the modified method. Thus, the use of two consecutive pre-enrichment steps led to a higher detection rate than a single pre-enrichment step. This result indirectly suggested that Salmonella cells on contaminated seeds might be injured and failed to fully resuscitate in pre-enrichment broth containing seed components during the first 24 h of incubation. Responses of heat-injured Salmonella cells grown in buffered peptone water (BPW) and in three alfalfa seed homogenates were investigated. For preparation of seed homogenates, 25 g of seeds were homogenized in 200 ml of BPW using a laboratory Stomacher and subsequently held at 37 degrees C for 24 h prior to centrifugation and filtration. While untreated cells grew at about the same rate in BPW and in seed homogenates, heat-injured cells (52 degrees C, 10 min) required approximately 0.5 to 4.0 h longer to resuscitate in seed homogenates than in BPW. This result suggests that the alfalfa seed components or fermented metabolites from native bacteria hinder the repair and growth of heat-injured cells. This study also shows that an additional pre-enrichment step increases the frequency of isolation of Salmonella from naturally contaminated seeds, possibly by alleviating the toxic effect of seed homogenates on repair or growth of injured cells.

  17. Voxel-wise meta-analyses of brain blood flow and local synchrony abnormalities in medication-free patients with major depressive disorder

    PubMed Central

    Chen, Zi-Qi; Du, Ming-Ying; Zhao, You-Jin; Huang, Xiao-Qi; Li, Jing; Lui, Su; Hu, Jun-Mei; Sun, Huai-Qiang; Liu, Jia; Kemp, Graham J.; Gong, Qi-Yong

    2015-01-01

    Background Published meta-analyses of resting-state regional cerebral blood flow (rCBF) studies of major depressive disorder (MDD) have included patients receiving antidepressants, which might affect brain activity and thus bias the results. To our knowledge, no meta-analysis has investigated regional homogeneity changes in medication-free patients with MDD. Moreover, an association between regional homogeneity and rCBF has been demonstrated in some brain regions in healthy controls. We sought to explore to what extent resting-state rCBF and regional homogeneity changes co-occur in the depressed brain without the potential confound of medication. Methods Using the effect-size signed differential mapping method, we conducted 2 meta-analyses of rCBF and regional homogeneity studies of medication-free patients with MDD. Results Our systematic search identified 14 rCBF studies and 9 regional homogeneity studies. We identified conjoint decreases in resting-state rCBF and regional homogeneity in the insula and superior temporal gyrus in medication-free patients with MDD compared with controls. Other changes included altered resting-state rCBF in the precuneus and in the frontal–limbic–thalamic–striatal neural circuit as well as altered regional homogeneity in the uncus and parahippocampal gyrus. Meta-regression revealed that the percentage of female patients with MDD was negatively associated with resting-state rCBF in the right anterior cingulate cortex and that the age of patients with MDD was negatively associated with rCBF in the left insula and with regional homogeneity in the left uncus. Limitations The analysis techniques, patient characteristics and clinical variables of the included studies were heterogeneous. Conclusion The conjoint alterations of rCBF and regional homogeneity in the insula and superior temporal gyrus may be core neuropathological changes in medication-free patients with MDD and serve as a specific region of interest for further studies on MDD. PMID:25853283

  18. Assessing Dietary Exposure to Pyrethroid Insecticides by LC/MS/MS of Food Composites

    EPA Science Inventory

    Method: Commercially-obtained vegetables, chips, cereal, meat, and other solid food products were homogenized together to create composited control matrices at 1%, 5%, and 10% fat content. Lyophilized homogenates were spiked with 7 pyrethroids, 6 degradation products, bisphen...

  19. Using Floquet periodicity to easily calculate dispersion curves and wave structures of homogeneous waveguides

    NASA Astrophysics Data System (ADS)

    Hakoda, Christopher; Rose, Joseph; Shokouhi, Parisa; Lissenden, Clifford

    2018-04-01

    Dispersion curves are essential to any guided-wave-related project. The Semi-Analytical Finite Element (SAFE) method has become the conventional way to compute dispersion curves for homogeneous waveguides. However, only recently has a general SAFE formulation for commercial and open-source software become available, meaning that until now SAFE analyses have been variable and more time consuming than desirable. Likewise, the Floquet boundary conditions enable analysis of waveguides with periodicity and have been an integral part of the development of metamaterials. In fact, we have found the use of Floquet boundary conditions to be an extremely powerful tool for homogeneous waveguides, too. The nuances of using periodic boundary conditions for homogeneous waveguides that do not exhibit periodicity are discussed. Comparisons between this method and SAFE are made for selected homogeneous waveguide applications. The COMSOL Multiphysics software is used for the results shown, but any standard finite element software that can implement Floquet periodicity (user-defined or built-in) should suffice. Finally, we identify a number of complex waveguides for which dispersion curves can be found with relative ease by using the periodicity inherent to the Floquet boundary conditions.
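
    The essential trick, imposing a Bloch/Floquet phase relation between the two boundaries of a unit cell and solving an eigenvalue problem per wavenumber, can be shown on a toy 1-D diatomic mass-spring chain rather than an elastic waveguide. The sketch below is that analogue (plain NumPy/SciPy, no COMSOL), with unit masses and stiffness chosen arbitrarily.

```python
import numpy as np
from scipy.linalg import eigh

# Unit cell: nodes [u0, u1, u2], two identical springs K, masses m1 (node 0)
# and m2 (node 1); node 2 is the image of node 0 in the next cell, so its
# mass is counted there and the Floquet condition enforces u2 = u0*exp(i*mu).
K, m1, m2 = 1.0, 1.0, 2.0
K_cell = K * np.array([[ 1, -1,  0],
                       [-1,  2, -1],
                       [ 0, -1,  1]], dtype=complex)
M_cell = np.diag([m1, m2, 0.0]).astype(complex)

mus = np.linspace(0.0, np.pi, 200)            # Bloch phase across the cell
dispersion = []
for mu in mus:
    T = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [np.exp(1j * mu), 0.0]])    # Floquet periodicity constraint
    K_r = T.conj().T @ K_cell @ T
    M_r = T.conj().T @ M_cell @ T
    omega2 = eigh(K_r, M_r, eigvals_only=True)
    dispersion.append(np.sqrt(np.abs(omega2)))   # acoustic and optical branches

dispersion = np.array(dispersion)             # shape (200, 2): omega(mu) per branch
print(dispersion[-1])                         # band-edge frequencies at mu = pi
```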

  20. High-temperature viscoelastic creep constitutive equations for polymer composites: Homogenization theory and experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skontorp, A.; Wang, S.S.; Shibuya, Y.

    1994-12-31

    In this paper, a homogenization theory is developed to determine high-temperature effective viscoelastic constitutive equations for fiber-reinforced polymer composites. The homogenization theory approximates the microstructure of a fiber composite and determines simultaneously the effective macroscopic constitutive properties of the composite and the associated microscopic strain and stress in the heterogeneous material. The time-temperature dependent homogenization theory requires that the viscoelastic constituent properties of the matrix phase at elevated temperatures, the governing equations for the composites, and the boundary conditions of the problem be Laplace transformed to a conjugate problem. The homogenized effective properties in the transformed domain are determined using a two-scale asymptotic expansion of field variables and an averaging procedure. Field solutions in the unit cell are determined from basic and first-order governing equations with the aid of a boundary integral method (BIM). Effective viscoelastic constitutive properties of the composite at elevated temperatures are determined by an inverse transformation, as are the microscopic stress and deformation in the composite. Using this method, interactions among fibers and between the fibers and the matrix can be evaluated explicitly, resulting in accurate solutions for composites with a high volume fraction of reinforcing fibers. Examples are given for the case of a carbon-fiber reinforced thermoplastic polyamide composite in an elevated-temperature environment. The homogenization predictions are in good agreement with experimental data available for the composite.

  1. High-throughput method for optimum solubility screening for homogeneity and crystallization of proteins

    DOEpatents

    Kim, Sung-Hou [Moraga, CA; Kim, Rosalind [Moraga, CA; Jancarik, Jamila [Walnut Creek, CA

    2012-01-31

    An optimum solubility screen in which a panel of buffers and many additives are provided in order to obtain the most homogeneous and monodisperse protein condition for protein crystallization. The present methods are useful for proteins that aggregate and cannot be concentrated prior to setting up crystallization screens. A high-throughput method using the hanging-drop method and vapor diffusion equilibrium and a panel of twenty-four buffers is further provided. Using the present methods, 14 poorly behaving proteins have been screened, resulting in 11 of the proteins having highly improved dynamic light scattering results allowing concentration of the proteins, and 9 were crystallized.

  2. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratiannil, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.; Willett, K.

    2013-09-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including i) the centered root mean square error relative to the true homogeneous values at various averaging scales, ii) the error in linear trend estimates and iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.
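
    As a concrete illustration of two of the metrics mentioned (the centered RMSE against the true homogeneous series, and the error in the linear trend), the sketch below scores a homogenized series against a known truth; it is a single-station simplification of the benchmark's scoring, with synthetic data.

```python
import numpy as np

def centered_rmse(truth, hom):
    """RMSE between the anomaly (mean-removed) series -- insensitive to a
    constant offset, in the spirit of the HOME-style station scores."""
    return np.sqrt(np.mean(((hom - hom.mean()) - (truth - truth.mean())) ** 2))

def trend_error(truth, hom):
    """Difference in fitted linear trend (units per time step)."""
    t = np.arange(len(truth))
    return np.polyfit(t, hom, 1)[0] - np.polyfit(t, truth, 1)[0]

# synthetic monthly series: truth + an inhomogeneity (a break) + an imperfect correction
rng = np.random.default_rng(0)
truth = 0.001 * np.arange(1200) + rng.normal(0, 0.5, 1200)
inhomogeneous = truth + np.where(np.arange(1200) > 600, 0.8, 0.0)        # 0.8-unit break
homogenized = inhomogeneous - np.where(np.arange(1200) > 600, 0.7, 0.0)  # partial fix
print(centered_rmse(truth, homogenized), trend_error(truth, homogenized))
```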

  3. Homogenized description and retrieval method of nonlinear metasurfaces

    NASA Astrophysics Data System (ADS)

    Liu, Xiaojun; Larouche, Stéphane; Smith, David R.

    2018-03-01

    A patterned, plasmonic metasurface can strongly scatter incident light, functioning as an extremely low-profile lens, filter, reflector or other optical device. When the metasurface is patterned uniformly, its linear optical properties can be expressed using effective surface electric and magnetic polarizabilities obtained through a homogenization procedure. The homogenized description of a nonlinear metasurface, however, presents challenges both because of the inherent anisotropy of the medium and because of the much larger set of potential wave interactions available, making it difficult to assign effective nonlinear parameters to the otherwise inhomogeneous layer of metamaterial elements. Here we show that a homogenization procedure can be developed to describe nonlinear metasurfaces, which derive their nonlinear response from the enhanced local fields arising within the structured plasmonic elements. With the proposed homogenization procedure, we are able to assign effective nonlinear surface polarization densities to a nonlinear metasurface, and link these densities to the effective nonlinear surface susceptibilities and averaged macroscopic pumping fields across the metasurface. These effective nonlinear surface polarization densities are further linked to macroscopic nonlinear fields through the generalized sheet transition conditions (GSTCs). By inverting the GSTCs, the effective nonlinear surface susceptibilities of the metasurfaces can be solved for, leading to a generalized retrieval method for nonlinear metasurfaces. The application of the homogenization procedure and the GSTCs is demonstrated by retrieving the nonlinear susceptibilities of a SiO2 nonlinear slab. As an example, we investigate a nonlinear metasurface which presents nonlinear magnetoelectric coupling in the near-infrared regime. The method is expected to apply to any patterned metasurface whose thickness is much smaller than the wavelengths of operation, with inclusions of arbitrary geometry and material composition, across the electromagnetic spectrum.

  4. Type of homogenization and fat loss during continuous infusion of human milk.

    PubMed

    García-Lara, Nadia Raquel; Escuder-Vieco, Diana; Alonso Díaz, Clara; Vázquez Román, Sara; De la Cruz-Bértolo, Javier; Pallás-Alonso, Carmen Rosa

    2014-11-01

    Substantial fat loss may occur during continuous feeding of human milk (HM). A decrease of fat loss has been described following homogenization. Well-established methods of homogenization of HM for routine use in the neonatal intensive care unit (NICU) would be desirable. We compared the loss of fat based on the use of 3 different methods for homogenizing thawed HM during continuous feeding. Sixteen frozen donor HM samples were thawed, homogenized with ultrasound and separated into 3 aliquots ("baseline agitation," "hourly agitation," and "ultrasound"), and then frozen for 48 hours. Aliquots were thawed again and a baseline agitation was applied. Subsequently, aliquots baseline agitation and hourly agitation were drawn into a syringe, while ultrasound was applied to aliquot ultrasound before it was drawn into a syringe. The syringes were loaded into a pump (2 mL/h; 4 hours). At hourly intervals the hourly agitation infusion was stopped, the syringe was disconnected and gently shaken. During infusion, samples from the 3 groups were collected hourly for analysis of fat and caloric content. The 3 groups of homogenization showed similar fat content at the beginning of the infusion. For fat, mean (SD) hourly changes of -0.03 (0.01), -0.09 (0.01), and -0.09 (0.01) g/dL were observed for the hourly agitation, baseline agitation, and ultrasound groups, respectively. The decrease was smaller for the hourly agitation group (P < .001). When thawed HM is continuously infused, a smaller fat loss is observed when syringes are agitated hourly versus when ultrasound or a baseline homogenization is used. © The Author(s) 2014.

  5. Validation of the U.S. NRC NGNP evaluation model with the HTTR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saller, T.; Seker, V.; Downar, T.

    2012-07-01

    The High Temperature Test Reactor (HTTR) was modeled with TRITON/PARCS. Traditional light water reactor (LWR) homogenization methods rely on the short mean free paths of neutrons in LWRs. In gas-cooled, graphite-moderated reactors like the HTTR, neutrons have much longer mean free paths and penetrate further into neighboring assemblies than in LWRs. Because of this, conventional lattice calculations with a single assembly may not be valid. In addition to difficulties caused by the longer mean free paths, the HTTR presents unique axial and radial heterogeneities that require additional modifications to the single-assembly homogenization method. To handle these challenges, the homogenization domain is decreased while the computational domain is increased. Instead of homogenizing a single hexagonal fuel assembly, the assembly is split into six triangles on the radial plane and five blocks axially in order to account for the placement of burnable poisons. Furthermore, the radial domain is increased beyond a single fuel assembly to account for spectrum effects from neighboring fuel, reflector, and control rod assemblies. A series of five two-dimensional cases, each closer to the full core, were calculated to evaluate the effectiveness of the homogenization method and cross-sections. (authors)

  6. Homogeneous Biosensing Based on Magnetic Particle Labels

    PubMed Central

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J.; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation. PMID:27275824

  7. Construction of Optimal-Path Maps for Homogeneous-Cost-Region Path-Planning Problems

    DTIC Science & Technology

    1989-09-01

    ... studied in depth by researchers in such fields as artificial intelligence, robotics, and computational geometry. Most methods require homogeneous ...

  8. Fourier-Accelerated Nodal Solvers (FANS) for homogenization problems

    NASA Astrophysics Data System (ADS)

    Leuschner, Matthias; Fritzen, Felix

    2017-11-01

    Fourier-based homogenization schemes are useful to analyze heterogeneous microstructures represented by 2D or 3D image data. These iterative schemes involve discrete periodic convolutions with global ansatz functions (mostly fundamental solutions). The convolutions are efficiently computed using the fast Fourier transform. FANS operates on nodal variables on regular grids and converges to finite element solutions. Compared to established Fourier-based methods, the number of convolutions is reduced by FANS. Additionally, fast iterations are possible by assembling the stiffness matrix. Due to the related memory requirement, the method is best suited for medium-sized problems. A comparative study involving established Fourier-based homogenization schemes is conducted for a thermal benchmark problem with a closed-form solution. Detailed technical and algorithmic descriptions are given for all methods considered in the comparison. Furthermore, many numerical examples focusing on convergence properties for both thermal and mechanical problems, including also plasticity, are presented.
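
    For context, the kind of established Fourier-based scheme that FANS is compared against can be written in a few lines: a classical Moulinec-Suquet "basic scheme" for a 2-D thermal (conductivity) cell problem is sketched below. It is not FANS itself, and the reference conductivity, tolerance and test microstructure are arbitrary choices.

```python
import numpy as np

def fft_thermal_homogenize(k_img, E=(1.0, 0.0), k0=None, tol=1e-8, max_iter=500):
    """Basic fixed-point FFT scheme for the effective conductivity of a 2-D
    periodic pixel microstructure k_img under a mean temperature gradient E."""
    ny, nx = k_img.shape
    if k0 is None:
        k0 = 0.5 * (k_img.min() + k_img.max())      # reference medium
    E = np.asarray(E, dtype=float)
    e = np.zeros((2, ny, nx))                       # gradient field, init = mean gradient
    e[0], e[1] = E[0], E[1]
    xi_y = np.fft.fftfreq(ny)[:, None] * np.ones((1, nx))
    xi_x = np.ones((ny, 1)) * np.fft.fftfreq(nx)[None, :]
    xi2 = xi_x ** 2 + xi_y ** 2
    xi2[0, 0] = 1.0                                 # avoid division by zero
    for _ in range(max_iter):
        tau = (k_img - k0) * e                      # polarization (k - k0) * grad T
        tau_h = np.fft.fft2(tau, axes=(-2, -1))
        # Green operator of the reference medium: xi (xi . tau_hat) / (k0 |xi|^2)
        dot = xi_x * tau_h[0] + xi_y * tau_h[1]
        ge = np.stack([xi_x * dot, xi_y * dot]) / (k0 * xi2)
        ge[:, 0, 0] = 0.0                           # zero frequency: keep the mean
        e_new = np.empty_like(e)
        e_new[0] = E[0] - np.real(np.fft.ifft2(ge[0]))
        e_new[1] = E[1] - np.real(np.fft.ifft2(ge[1]))
        if np.max(np.abs(e_new - e)) < tol:
            e = e_new
            break
        e = e_new
    flux = k_img * e
    return flux.mean(axis=(1, 2))                   # effective flux <k grad T>

# example: stiff circular inclusion (k=10) in a matrix (k=1) on a 64x64 grid
n = 64
y, x = np.mgrid[0:n, 0:n]
k_img = np.where((x - n / 2) ** 2 + (y - n / 2) ** 2 < (n / 4) ** 2, 10.0, 1.0)
print(fft_thermal_homogenize(k_img))                # ~ first column of k_eff
```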

  9. MHODE: a local-homogeneity theory for improved source-parameter estimation of potential fields

    NASA Astrophysics Data System (ADS)

    Fedi, Maurizio; Florio, Giovanni; Paoletti, Valeria

    2015-08-01

    We describe a multihomogeneity theory for source-parameter estimation of potential fields. Similar to what happens for random source models, where the monofractal scaling-law has been generalized into a multifractal law, we propose to generalize the homogeneity law into a multihomogeneity law. This allows a theoretically correct approach to study real-world potential fields, which are inhomogeneous and so do not show scale invariance, except in the asymptotic regions (very near to or very far from their sources). Since the scaling properties of inhomogeneous fields change with the scale of observation, we show that they may be better studied at a set of scales than at a single scale and that a multihomogeneous model is needed to explain their complex scaling behaviour. In order to perform this task, we first introduce fractional-degree homogeneous fields, to show that: (i) homogeneous potential fields may have fractional or integer degree; (ii) the source distributions for a fractional degree are not confined to a bounded region, similarly to some integer-degree models, such as the infinite line mass; and (iii) differently from the integer-degree case, the fractional-degree source distributions are no longer uniform density functions. Using this enlarged set of homogeneous fields, real-world anomaly fields are studied at different scales, by a simple search, in any local window W, for the best homogeneous field of either integer or fractional degree, yielding a multiscale set of local homogeneity degrees and depth estimates, which we call a multihomogeneous model. This defines a new technique of source-parameter estimation (Multi-HOmogeneity Depth Estimation, MHODE), permitting retrieval of the parameters of complex sources. We test the method with inhomogeneous fields of finite sources, such as faults or cylinders, and show its effectiveness also in a real-case example. These applications show the usefulness of the new concepts, multihomogeneity and fractional homogeneity degree, in obtaining valid estimates of the source parameters in a consistent theoretical framework, thus overcoming the limitations imposed by global homogeneity on widespread methods, such as Euler deconvolution.

  10. Determination of perfluorinated compounds in fish fillet homogenates: method validation and application to fillet homogenates from the Mississippi River.

    PubMed

    Malinsky, Michelle Duval; Jacoby, Cliffton B; Reagen, William K

    2011-01-10

    We report herein a simple protein precipitation extraction-liquid chromatography tandem mass spectrometry (LC/MS/MS) method, validation, and application for the analysis of perfluorinated carboxylic acids (C7-C12), perfluorinated sulfonic acids (C4, C6, and C8), and perfluorooctane sulfonamide (FOSA) in fish fillet tissue. The method combines a rapid homogenization and protein precipitation tissue extraction procedure using stable-isotope internal standard (IS) calibration. Method validation in bluegill (Lepomis macrochirus) fillet tissue evaluated the following: (1) method accuracy and precision in both extracted matrix-matched calibration and solvent (unextracted) calibration, (2) quantitation of mixed branched and linear isomers of perfluorooctanoate (PFOA) and perfluorooctanesulfonate (PFOS) with linear isomer calibration, (3) quantitation of low level (ppb) perfluorinated compounds (PFCs) in the presence of high level (ppm) PFOS, and (4) specificity from matrix interferences. Both calibration techniques produced method accuracy of at least 100±13% with a precision (%RSD) ≤18% for all target analytes. Method accuracy and precision results for fillet samples from nine different fish species taken from the Mississippi River in 2008 and 2009 are also presented. Copyright © 2010 Elsevier B.V. All rights reserved.

  11. RELIABLE COMPUTATION OF HOMOGENEOUS AZEOTROPES. (R824731)

    EPA Science Inventory

    Abstract

    It is important to determine the existence and composition of homogeneous azeotropes in the analysis of phase behavior and in the synthesis and design of separation systems, from both theoretical and practical standpoints. A new method for reliably locating an...

  12. An infrared small target detection method based on multiscale local homogeneity measure

    NASA Astrophysics Data System (ADS)

    Nie, Jinyan; Qu, Shaocheng; Wei, Yantao; Zhang, Liming; Deng, Lizhen

    2018-05-01

    Infrared (IR) small target detection plays an important role in the field of image-based detection owing to its intrinsic characteristics. This paper presents a multiscale local homogeneity measure (MLHM) for infrared small target detection, which can enhance the performance of an IR small target detection system. Firstly, the intra-patch homogeneity of the target itself and the inter-patch heterogeneity between the target and the local background regions are integrated to enhance the significance of the small target. Secondly, a multiscale measure based on local regions is proposed to obtain the most appropriate response. Finally, an adaptive threshold method is applied for small target segmentation. Experimental results on three different scenarios indicate that the MLHM has good performance under the interference of strong noise.
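
    A single-scale simplification of this kind of measure is easy to write down: score each pixel by the contrast between an inner patch and its surrounding background, weighted by how homogeneous the inner patch is, then segment with an adaptive mean-plus-k-sigma threshold. The sketch below is that simplification (patch/cell sizes and the weighting are assumptions), not the published multiscale MLHM.

```python
import numpy as np

def local_homogeneity_map(img, patch=3, cell=9):
    """Single-scale, simplified local-homogeneity saliency map: bright,
    compact, internally homogeneous spots score highest."""
    img = img.astype(float)
    h, w = img.shape
    half, ph = cell // 2, patch // 2
    out = np.zeros_like(img)
    for i in range(half, h - half):
        for j in range(half, w - half):
            region = img[i - half:i + half + 1, j - half:j + half + 1]
            inner = img[i - ph:i + ph + 1, j - ph:j + ph + 1]
            mask = np.ones_like(region, dtype=bool)          # background = region
            mask[half - ph:half + ph + 1, half - ph:half + ph + 1] = False
            bg = region[mask]
            contrast = max(inner.mean() - bg.mean(), 0.0)    # inter-patch heterogeneity
            homogeneity = 1.0 / (1.0 + inner.var())          # intra-patch homogeneity
            out[i, j] = contrast * homogeneity
    return out

def adaptive_threshold(saliency, k=4.0):
    """Simple mean + k*std segmentation threshold."""
    return saliency > saliency.mean() + k * saliency.std()

# toy scene: noisy background with one small bright target
rng = np.random.default_rng(0)
scene = rng.normal(0.0, 1.0, (64, 64))
scene[30:33, 40:43] += 6.0
print(np.argwhere(adaptive_threshold(local_homogeneity_map(scene))))
```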

  13. Theoretical and experimental research on laser-beam homogenization based on metal gauze

    NASA Astrophysics Data System (ADS)

    Liu, Libao; Zhang, Shanshan; Wang, Ling; Zhang, Yanchao; Tian, Zhaoshuo

    2018-03-01

    A method of homogenizing CO2 laser heating by means of a metal gauze is investigated theoretically and experimentally. The light-field distribution of the expanded beam passing through the metal gauze was numerically calculated with diffractive optical theory; comparison with the case without the metal gauze shows that the method is effective. Experimentally, using a 30 W DC-discharge laser as the source and expanding the beam with a concave lens, beam intensity distributions recorded on thermal paper were compared with and without the metal gauze, and experiments based on a thermal imager were also performed. The experimental results were compatible with the theoretical calculation, and together they show that the homogeneity of CO2 laser heating can be enhanced by a metal gauze.

  14. Precipitation-lyophilization-homogenization (PLH) for preparation of clarithromycin nanocrystals: influencing factors on physicochemical properties and stability.

    PubMed

    Morakul, Boontida; Suksiriworapong, Jiraphong; Leanpolchareanchai, Jiraporn; Junyaprasert, Varaporn Buraphacheep

    2013-11-30

    Nanocrystal formation is one of the effective technologies used to improve the solubility and dissolution behavior of poorly soluble drugs. Clarithromycin is classified in BCS class II, having low bioavailability due to its very low dissolution. The main purpose of this study was to investigate the efficiency of clarithromycin nanocrystal preparation by a precipitation-lyophilization-homogenization (PLH) combination method in comparison with the high pressure homogenization (HPH) method. The factors influencing particle size reduction and physical stability were assessed. The results showed that the PLH technique provided an effective and rapid reduction of the nanocrystal particle size to 460 ± 10 nm with a homogeneous size distribution after only the fifth cycle of homogenization, whereas the same size was attained after 30 cycles by the HPH method. The smallest nanocrystals were achieved by using the combination of poloxamer 407 (2%, w/v) and SLS (0.1%, w/v) as stabilizers. This combination could prevent particle aggregation over 3 months of storage at 4 °C. The results from SEM showed that the clarithromycin nanocrystals were cubic-shaped, similar to the initial particle morphology. The DSC thermogram and X-ray diffraction pattern of the nanocrystals were not different from those of the original drug except for the intensity of the peaks, which indicated that the nanocrystals were present in the crystalline state and/or a partially amorphous form. In addition, the dissolution of the clarithromycin nanocrystals was dramatically increased compared to the coarse clarithromycin. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Total lipid extraction of homogenized and intact lean fish muscles using pressurized fluid extraction and batch extraction techniques.

    PubMed

    Isaac, Giorgis; Waldebäck, Monica; Eriksson, Ulla; Odham, Göran; Markides, Karin E

    2005-07-13

    The reliability and efficiency of the pressurized fluid extraction (PFE) technique for the extraction of total lipid content from cod, and the effect of sample treatment on the extraction efficiency, have been evaluated. The results were compared with two liquid-liquid extraction methods, the traditional and modified methods according to Jensen. Optimum conditions were found to be 2-propanol/n-hexane (65:35, v/v) as the first and n-hexane/diethyl ether (90:10, v/v) as the second solvent, 115 degrees C, and 10 min of static time. PFE extracts were cleaned up using the same procedure as in the methods according to Jensen. When total lipid yields obtained from homogenized cod muscle using PFE were compared with yields obtained with the original and modified Jensen methods, PFE gave significantly higher yields, approximately 10% higher (t test, P < 0.05). Infrared and NMR spectroscopy suggested that the additional material that inflates the gravimetric results is rather homogeneous and primarily consists of phospholipids with headgroups of inositidic and/or glycosidic nature. The comparative study demonstrated that PFE is a suitable alternative technique for extracting total lipid content from homogenized cod (lean fish) and herring (fat fish) muscle, showing a precision comparable to that obtained with the traditional and modified Jensen methods. Despite the necessary cleanup step, PFE showed important advantages: solvent consumption was cut by approximately 50% and automated extraction was possible.

  16. Mechanical Homogenization Increases Bacterial Homogeneity in Sputum

    PubMed Central

    Stokell, Joshua R.; Khan, Ammad

    2014-01-01

    Sputum obtained from patients with cystic fibrosis (CF) is highly viscous and often heterogeneous in bacterial distribution. Adding dithiothreitol (DTT) is the standard method for liquefaction prior to processing sputum for molecular detection assays. To determine if DTT treatment homogenizes the bacterial distribution within sputum, we measured the difference in mean total bacterial abundance and abundance of Burkholderia multivorans between aliquots of DTT-treated sputum samples with and without a mechanical homogenization (MH) step using a high-speed dispersing element. Additionally, we measured the effect of MH on bacterial abundance. We found a significant difference between the mean bacterial abundances in aliquots that were subjected to only DTT treatment and those of the aliquots which included an MH step (all bacteria, P = 0.04; B. multivorans, P = 0.05). There was no significant effect of MH on bacterial abundance in sputum. Although our results are from a single CF patient, they indicate that mechanical homogenization increases the homogeneity of bacteria in sputum. PMID:24759710

  17. Homogenization of Large-Scale Movement Models in Ecology

    USGS Publications Warehouse

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large-scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small-scale (10-100 m) habitat variability on large-scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
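
    For the piecewise-constant habitat case, the punchline of this homogenization is compact enough to state in code: the large-scale motility is the harmonic mean of the local motilities weighted by habitat area fractions (my summary of the standard cell-averaging result; the paper derives the general procedure). The numbers below are made up.

```python
import numpy as np

def homogenized_motility(mu_values, area_fractions):
    """For ecological diffusion u_t = Laplacian(mu(x) u), averaging over a
    small-scale mosaic of habitats with motilities mu_i occupying area
    fractions w_i gives a large-scale motility equal to the weighted
    harmonic mean 1 / sum(w_i / mu_i)."""
    mu = np.asarray(mu_values, dtype=float)
    w = np.asarray(area_fractions, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w / mu)

# e.g. fast movement through open habitat, slow movement in forage patches
print(homogenized_motility([10.0, 0.5], [0.7, 0.3]))
```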

  18. Investigation of methods for hydroclimatic data homogenization

    NASA Astrophysics Data System (ADS)

    Steirou, E.; Koutsoyiannis, D.

    2012-04-01

    We investigate the methods used for the adjustment of inhomogeneities in temperature time series covering the last 100 years. Based on a systematic study of the scientific literature, we classify and evaluate the observed inhomogeneities in historical and modern time series, as well as their adjustment methods. It turns out that these methods are mainly statistical, not well justified by experiments, and are rarely supported by metadata. In many of the cases studied the proposed corrections are not even statistically significant. From the global database GHCN-Monthly Version 2, we examine all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. In the United States of America, because of the large number of available stations, stations were chosen after a suitable sampling. In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in two thirds of the cases, the homogenization procedure increased the positive or decreased the negative temperature trends. One of the most common homogenization methods, 'SNHT for single shifts', was applied to synthetic time series with selected statistical characteristics, occasionally with offsets. The method was satisfactory when applied to independent, normally distributed data, but not to data with long-term persistence. The above results cast some doubt on the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.
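
    For reference, the single-shift SNHT statistic itself is simple to compute: standardize the candidate-minus-reference difference series and scan all split points for the maximum of T(k) = k·z1bar² + (n−k)·z2bar². The sketch below implements that scan (critical values, which come from tables or simulation, are not included), without any claim about the adjustment results discussed in the abstract.

```python
import numpy as np

def snht_single_shift(series):
    """Standard Normal Homogeneity Test statistic for a single shift
    (Alexandersson-type). `series` is typically a candidate-minus-reference
    difference series; it is standardized internally. Returns the maximum
    test statistic and the index of the most likely breakpoint."""
    z = (np.asarray(series, float) - np.mean(series)) / np.std(series, ddof=1)
    n = len(z)
    T = np.empty(n - 1)
    for k in range(1, n):
        z1 = z[:k].mean()
        z2 = z[k:].mean()
        T[k - 1] = k * z1 ** 2 + (n - k) * z2 ** 2
    kmax = int(np.argmax(T)) + 1
    return T[kmax - 1], kmax

# synthetic series with a 1-unit shift after index 60
rng = np.random.default_rng(3)
x = rng.normal(0, 1, 100)
x[60:] += 1.0
print(snht_single_shift(x))
```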

  19. Homogeneity and internal defects detect of infrared Se-based chalcogenide glass

    NASA Astrophysics Data System (ADS)

    Li, Zupan; Wu, Ligang; Lin, Changgui; Song, Bao'an; Wang, Xunsi; Shen, Xiang; Dai, Shixun

    2011-10-01

    Ge-Sb-Se chalcogenide glasses are excellent infrared optical materials, which are environmentally friendly and widely used in infrared thermal imaging systems. However, due to the opacity of Se-based glasses in the visible spectral region, it is difficult to measure their homogeneity and internal defects as is done for common oxide glasses. In this study, a measurement method was proposed to observe the homogeneity and internal defects of these glasses based on a near-IR imaging technique, and an effective measurement system was constructed. The testing results indicate that the method gives information on the homogeneity and internal defects of infrared Se-based chalcogenide glass clearly and intuitively.

  20. Comparative analysis of storage conditions and homogenization methods for tick and flea species for identification by MALDI-TOF MS.

    PubMed

    Nebbak, A; El Hamzaoui, B; Berenger, J-M; Bitam, I; Raoult, D; Almeras, L; Parola, P

    2017-12-01

    Ticks and fleas are vectors for numerous human and animal pathogens. Controlling them, which is important in combating such diseases, requires accurate identification, to distinguish between vector and non-vector species. Recently, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was applied to the rapid identification of arthropods. The growth of this promising tool, however, requires guidelines to be established. To this end, standardization protocols were applied to species of Rhipicephalus sanguineus (Ixodida: Ixodidae) Latreille and Ctenocephalides felis felis (Siphonaptera: Pulicidae) Bouché, including the automation of sample homogenization using two homogenizer devices, and varied sample preservation modes for a period of 1-6 months. The MS spectra were then compared with those obtained from manual pestle grinding, the standard homogenization method. Both automated methods generated intense, reproducible MS spectra from fresh specimens. Frozen storage methods appeared to represent the best preservation mode, for up to 6 months, while storage in ethanol is also possible, with some caveats for tick specimens. Carnoy's buffer, however, was shown to be less compatible with MS analysis for the purpose of identifying ticks or fleas. These standard protocols for MALDI-TOF MS arthropod identification should be complemented by additional MS spectrum quality controls, to generalize their use in monitoring arthropods of medical interest. © 2017 The Royal Entomological Society.

  1. An efficient computational method for the approximate solution of nonlinear Lane-Emden type equations arising in astrophysics

    NASA Astrophysics Data System (ADS)

    Singh, Harendra

    2018-04-01

    The key purpose of this article is to introduce an efficient computational method for the approximate solution of the homogeneous as well as non-homogeneous nonlinear Lane-Emden type equations. Using the proposed computational method, the given nonlinear equation is converted into a set of nonlinear algebraic equations whose solution gives the approximate solution to the Lane-Emden type equation. Various nonlinear cases of Lane-Emden type equations, like the standard Lane-Emden equation, the isothermal gas sphere equation and the white-dwarf equation, are discussed. Results are compared with some well-known numerical methods and it is observed that our results are more accurate.
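
    For readers who want a baseline to compare such methods against, a plain fourth-order Runge-Kutta integration of the standard Lane-Emden equation θ'' + (2/ξ)θ' + θⁿ = 0, θ(0)=1, θ'(0)=0 is sketched below; it is a conventional reference solver, not the method proposed in the article, and the small starting offset with series initial values is a standard workaround for the singularity at ξ = 0.

```python
import numpy as np

def lane_emden(n_index, xi_max=10.0, h=1e-3):
    """RK4 integration of theta'' + (2/xi) theta' + theta**n = 0 with
    theta(0)=1, theta'(0)=0; stops at the first zero of theta."""
    def rhs(xi, y):
        theta, dtheta = y
        term = max(theta, 0.0) ** n_index            # guard against tiny negatives
        return np.array([dtheta, -term - 2.0 * dtheta / xi])
    xi = 1e-4                                        # start slightly off the singular origin
    y = np.array([1.0 - xi ** 2 / 6.0, -xi / 3.0])   # series expansion near xi = 0
    while xi < xi_max and y[0] > 0.0:
        k1 = rhs(xi, y)
        k2 = rhs(xi + h / 2, y + h / 2 * k1)
        k3 = rhs(xi + h / 2, y + h / 2 * k2)
        k4 = rhs(xi + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        xi += h
    return xi, y[0]                                  # approximate first zero and theta there

# n = 1 has the analytic solution sin(xi)/xi with its first zero at pi
print(lane_emden(1.0))
```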

  2. Prediction of fat globule particle size in homogenized milk using Fourier transform mid-infrared spectra.

    PubMed

    Di Marzo, Larissa; Cree, Patrick; Barbano, David M

    2016-11-01

    Our objective was to develop partial least square models using data from Fourier transform mid-infrared (MIR) spectra to predict the particle size distributions d(0.5) and d(0.9), surface volume mean diameter D[3,2], and volume moment mean diameter D[4,3] of milk fat globules and validate the models. The goal of the study was to produce a method built into the MIR milk analyzer that could be used to warn the instrument operator that the homogenizer is near failure and needs to be replaced to ensure quality of results. Five homogenizers with different homogenization efficiency were used to homogenize pasteurized modified unhomogenized milks and farm raw bulk milks. Homogenized milks were collected from the homogenizer outlet and then run through an MIR milk analyzer without an in-line homogenizer to collect a MIR spectrum. A separate portion of each homogenized milk was analyzed with a laser light-scattering particle size analyzer to obtain reference values. The study was replicated 3 times with 3 independent sets of modified milks and bulk tank farm milks. Validation of the models was done with a set of 34 milks that were not used in the model development. Partial least square regression models were developed and validated for predicting the following milk fat globule particle size distribution parameters from MIR spectra: d(0.5) and d(0.9), surface volume mean diameter D[3,2], and volume moment mean diameter D[4,3]. The basis for the ability to model particle size distribution of milk fat emulsions was hypothesized to be the result of the partial least square modeling detecting absorbance shifts in MIR spectra of milk fat due to the Christiansen effect. The independent sample validation of particle size prediction methods found more variation in d(0.9) and D[4,3] predictions than the d(0.5) and D[3,2] predictions relative to laser light-scattering reference values, and this may be due to variation in particle size among different pump strokes. The accuracy of the d(0.9) prediction for routine quality assurance, to determine if a homogenizer within an MIR milk analyzer was near the failure level [i.e., d(0.9) >1.7µm] and needed to be replaced, is fit-for-purpose. The daily average particle size performance [i.e., d(0.9)] of a homogenizer based on the mean for the day could be used for monitoring homogenizer performance. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
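
    The modeling workflow (not the study's actual calibration) can be sketched with scikit-learn's PLS regression on simulated spectra; the spectra, the size values and the number of latent variables below are all made up for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Illustrative sketch: a partial least squares model mapping mid-infrared
# spectra to a fat-globule size parameter such as d(0.9). Synthetic data only.
rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 120, 300
sizes = rng.uniform(0.3, 2.0, n_samples)             # "d(0.9)" in micrometers
base = np.sin(np.linspace(0, 6 * np.pi, n_wavenumbers))
# spectra whose shape shifts slightly with particle size, plus noise
spectra = (base[None, :] * (1 + 0.05 * sizes[:, None])
           + 0.02 * rng.standard_normal((n_samples, n_wavenumbers)))

pls = PLSRegression(n_components=8)
pred = cross_val_predict(pls, spectra, sizes, cv=10).ravel()
rmse = np.sqrt(np.mean((pred - sizes) ** 2))
print(f"cross-validated RMSE: {rmse:.3f} um")
```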

  3. Particle size analysis of lamb meat: Effect of homogenization speed, comparison with myofibrillar fragmentation index and its relationship with shear force.

    PubMed

    Karumendu, L U; Ven, R van de; Kerr, M J; Lanza, M; Hopkins, D L

    2009-08-01

    The impact of homogenization speed on particle size (PS) results was examined using samples from the M. longissimus thoracis et lumborum (LL) of 40 lambs. One gram duplicate samples from meat aged for 1 and 5 days were homogenized at five different speeds: 11,000, 13,000, 16,000, 19,000 and 22,000 rpm. In addition, LL samples from 30 different lamb carcases, also aged for 1 and 5 days, were used to study the comparison between PS and myofibrillar fragmentation index (MFI) values. In this case, 1 g duplicate samples (n=30) were homogenized at 16,000 rpm and the other half (0.5 g samples) at 11,000 rpm (n=30). The homogenates were then subjected to respective combinations of treatments which included either PS analysis or the determination of MFI, both with or without three cycles of centrifugation. All 140 samples of LL included 65 g blocks for subsequent shear force (SF) testing. Homogenization at 16,000 rpm provided the greatest ability to detect ageing differences for particle size between samples aged for 1 and 5 days. Particle size at the 25% quantile provided the best result for detecting differences due to ageing. It was observed that as ageing increased the mean PS decreased and was significantly (P<0.001) less for 5-day aged samples compared to 1-day aged samples, while MFI values significantly increased (P<0.001) as the ageing period increased. When comparing the PS and MFI methods it became apparent that, as opposed to the MFI method, there was a greater coefficient of variation for the PS method, which warranted a quality assurance system. Given this requirement and examination of the mean, standard deviation and the 25% quantile for PS data, it was concluded that three cycles of centrifugation were not necessary, and this also applied to the MFI method. There were significant correlations (P<0.001) within the same lamb loin sample aged for a given period between mean MFI and mean PS (-0.53), mean MFI and mean SF (-0.38), and mean PS and mean SF (0.23). It was concluded that PS analysis offers significant potential for streamlining the determination of myofibrillar degradation when samples are measured after homogenization at 16,000 rpm with no centrifugation.

  4. Simulation of homogeneous condensation of small polyatomic systems in high pressure supersonic nozzle flows using Bhatnagar-Gross-Krook model

    NASA Astrophysics Data System (ADS)

    Kumar, Rakesh; Levin, Deborah A.

    2011-03-01

    In the present work, we have simulated the homogeneous condensation of carbon dioxide and ethanol using the Bhatnagar-Gross-Krook based approach. In an earlier work of Gallagher-Rogers et al. [J. Thermophys. Heat Transfer 22, 695 (2008)], it was found that it was not possible to simulate condensation experiments of Wegener et al. [Phys. Fluids 15, 1869 (1972)] using the direct simulation Monte Carlo method. Therefore, in this work, we have used the statistical Bhatnagar-Gross-Krook approach, which was found to be numerically more efficient than direct simulation Monte Carlo method in our previous studies [Kumar et al., AIAA J. 48, 1531 (2010)], to model homogeneous condensation of two small polyatomic systems, carbon dioxide and ethanol. A new weighting scheme is developed in the Bhatnagar-Gross-Krook framework to reduce the computational load associated with the study of homogeneous condensation flows. The solutions obtained by the use of the new scheme are compared with those obtained by the baseline Bhatnagar-Gross-Krook condensation model (without the species weighting scheme) for the condensing flow of carbon dioxide in the stagnation pressure range of 1-5 bars. Use of the new weighting scheme in the present work makes the simulation of homogeneous condensation of ethanol possible. We obtain good agreement between our simulated predictions for homogeneous condensation of ethanol and experiments in terms of the point of condensation onset and the distribution of mass fraction of ethanol condensed along the nozzle centerline.

  5. Isolation of biologically active nanomaterial (inclusion bodies) from bacterial cells

    PubMed Central

    2010-01-01

    Background In recent years bacterial inclusion bodies (IBs) were recognised as highly pure deposits of active proteins inside bacterial cells. Such active nanoparticles are very interesting for further downstream protein isolation, as well as for many other applications in nanomedicine, cosmetic, chemical and pharmaceutical industry. To prepare large quantities of a high quality product, the whole bioprocess has to be optimised. This includes not only the cultivation of the bacterial culture, but also the isolation step itself, which can be of critical importance for the production process. To determine the most appropriate method for the isolation of biologically active nanoparticles, three methods for bacterial cell disruption were analyzed. Results In this study, enzymatic lysis and two mechanical methods, high-pressure homogenization and sonication, were compared. During enzymatic lysis the enzyme lysozyme was found to attach to the surface of IBs, and it could not be removed by simple washing. As this represents an additional impurity in the engineered nanoparticles, we concluded that enzymatic lysis is not the most suitable method for IBs isolation. During sonication proteins are released (lost) from the surface of IBs and thus the surface of IBs appears more porous when compared to the other two methods. We also found that the acoustic output power needed to isolate the IBs from bacterial cells actually damages protein structures, thereby causing a reduction in biological activity. High-pressure homogenization also caused some damage to IBs, however the protein loss from the IBs was negligible. Furthermore, homogenization had no side-effects on protein biological activity. Conclusions The study shows that among the three methods tested, homogenization is the most appropriate method for the isolation of active nanoparticles from bacterial cells. PMID:20831775

  6. Isolation of biologically active nanomaterial (inclusion bodies) from bacterial cells.

    PubMed

    Peternel, Spela; Komel, Radovan

    2010-09-10

    In recent years bacterial inclusion bodies (IBs) have been recognised as highly pure deposits of active proteins inside bacterial cells. Such active nanoparticles are very interesting for further downstream protein isolation, as well as for many other applications in the nanomedicine, cosmetic, chemical and pharmaceutical industries. To prepare large quantities of a high quality product, the whole bioprocess has to be optimised. This includes not only the cultivation of the bacterial culture, but also the isolation step itself, which can be of critical importance for the production process. To determine the most appropriate method for the isolation of biologically active nanoparticles, three methods for bacterial cell disruption were analyzed. In this study, enzymatic lysis and two mechanical methods, high-pressure homogenization and sonication, were compared. During enzymatic lysis the enzyme lysozyme was found to attach to the surface of IBs, and it could not be removed by simple washing. As this represents an additional impurity in the engineered nanoparticles, we concluded that enzymatic lysis is not the most suitable method for IB isolation. During sonication proteins are released (lost) from the surface of IBs, and thus the surface of IBs appears more porous when compared to the other two methods. We also found that the acoustic output power needed to isolate the IBs from bacterial cells actually damages protein structures, thereby causing a reduction in biological activity. High-pressure homogenization also caused some damage to IBs; however, the protein loss from the IBs was negligible. Furthermore, homogenization had no side-effects on protein biological activity. The study shows that among the three methods tested, homogenization is the most appropriate method for the isolation of active nanoparticles from bacterial cells.

  7. Two-scale homogenization to determine effective parameters of thin metallic-structured films

    PubMed Central

    Marigo, Jean-Jacques

    2016-01-01

    We present a homogenization method based on matched asymptotic expansion technique to derive effective transmission conditions of thin structured films. The method leads unambiguously to effective parameters of the interface which define jump conditions or boundary conditions at an equivalent zero thickness interface. The homogenized interface model is presented in the context of electromagnetic waves for metallic inclusions associated with Neumann or Dirichlet boundary conditions for transverse electric or transverse magnetic wave polarization. By comparison with full-wave simulations, the model is shown to be valid for thin interfaces up to thicknesses close to the wavelength. We also compare our effective conditions with the two-sided impedance conditions obtained in transmission line theory and to the so-called generalized sheet transition conditions. PMID:27616916

  8. Colloidal nanocrystals and method of making

    DOEpatents

    Kahen, Keith

    2015-10-06

    A tight confinement nanocrystal comprises a homogeneous center region having a first composition and a smoothly varying region having a second composition wherein a confining potential barrier monotonically increases and then monotonically decreases as the smoothly varying region extends from the surface of the homogeneous center region to an outer surface of the nanocrystal. A method of producing the nanocrystal comprises forming a first solution by combining a solvent and at most two nanocrystal precursors; heating the first solution to a nucleation temperature; adding to the first solution, a second solution having a solvent, at least one additional and different precursor to form the homogeneous center region and at most an initial portion of the smoothly varying region; and lowering the solution temperature to a growth temperature to complete growth of the smoothly varying region.

  9. Fuel mixture stratification as a method for improving homogeneous charge compression ignition engine operation

    DOEpatents

    Dec, John E [Livermore, CA; Sjoberg, Carl-Magnus G [Livermore, CA

    2006-10-31

    A method for slowing the heat-release rate in homogeneous charge compression ignition ("HCCI") engines that allows operation without excessive knock at higher engine loads than are possible with conventional HCCI. This method comprises injecting a fuel charge in a manner that creates a stratified fuel charge in the engine cylinder to provide a range of fuel concentrations in the in-cylinder gases (typically with enough oxygen for complete combustion), using a two-stage ignition fuel with appropriate cool-flame chemistry so that regions of different fuel concentrations autoignite sequentially.

  10. A new multigroup method for cross-sections that vary rapidly in energy

    DOE PAGES

    Haut, Terry Scot; Ahrens, Cory D.; Jonko, Alexandra; ...

    2016-11-04

    Here, we present a numerical method for solving the time-independent thermal radiative transfer (TRT) equation or the neutron transport (NT) equation when the opacity (cross-section) varies rapidly in frequency (energy) on the microscale ε; ε corresponds to the characteristic spacing between absorption lines or resonances, and is much smaller than the macroscopic frequency (energy) variation of interest. The approach is based on a rigorous homogenization of the TRT/NT equation in the frequency (energy) variable. Discretization of the homogenized TRT/NT equation results in a multigroup-type system, and can therefore be solved by standard methods.
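
    For orientation, a standard flux-weighted group constant has the form below; this is the simple collapse that the multigroup-type system resembles, not the paper's rigorous frequency-variable homogenization, and the weighting spectrum φ is an assumption of the sketch.

```latex
\sigma_g \;=\; \frac{\displaystyle\int_{\Delta E_g} \sigma(E)\,\phi(E)\,\mathrm{d}E}
                    {\displaystyle\int_{\Delta E_g} \phi(E)\,\mathrm{d}E},
\qquad g = 1,\dots,G .
```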

  11. Direct current dielectric barrier assistant discharge to get homogeneous plasma in capacitive coupled discharge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, Yinchang, E-mail: ycdu@mail.ustc.edu.cn; Max-Planck Institute for Extraterrestrial Physics, D-85748 Garching; Li, Yangfang

    In this paper, we propose a method to get more homogeneous plasma in the geometrically asymmetric capacitive coupled plasma (CCP) discharge. The dielectric barrier discharge (DBD) is used as the auxiliary discharge system to improve the homogeneity of the geometrically asymmetric CCP discharge. The single Langmuir probe measurement shows that the DBD can increase the electron density in the low density volume, where the DBD electrodes are mounted, when the pressure is higher than 5 Pa. In this manner, we are able to improve the homogeneity of the plasma production and increase the overall density in the target volume. Finally, the finite element simulation results show that the DC bias, applied to the DBD electrodes, can increase the homogeneity of the electron density in the CCP discharge. The simulation results show good agreement with the experimental results.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Ramazan Sonat; Hummel, Andrew John; Hiruta, Hikaru

    The deterministic full core simulators require homogenized group constants covering the operating and transient conditions over the entire lifetime. Traditionally, the homogenized group constants are generated using a lattice physics code over an assembly or, in the case of prismatic high temperature reactors (HTRs), a block. Strong absorbers that cause strong local depressions in the flux profile require special techniques during homogenization over a large volume; fuel blocks with burnable poisons and control rod blocks are examples of such cases. Over the past several decades, a tremendous number of studies have been performed to improve the accuracy of full-core calculations through the homogenization procedure. However, those studies were mostly performed for light water reactor (LWR) analyses and thus may not be directly applicable to advanced thermal reactors such as HTRs. This report presents the application of the SuPer-Homogenization correction method to a hypothetical HTR core.
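
    In the classical SuPer-Homogenization (SPH) scheme (the report's specific variant may differ), a per-group, per-node correction factor multiplies the homogenized cross sections and is iterated, together with a flux normalization, until the low-order solution reproduces the reference reaction rates:

```latex
\tilde{\Sigma}_{g,k} \;=\; \mu_{g,k}\,\Sigma_{g,k},
\qquad
\mu_{g,k}^{(n+1)} \;=\; \frac{\bar{\phi}_{g,k}^{\,\mathrm{ref}}}{\bar{\phi}_{g,k}^{\,(n)}},
```

    where the reference flux comes from the heterogeneous lattice calculation and the iterate from the corrected low-order calculation over node k.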

  13. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for development and production of pharmaceutical suspensions in the future. Copyright © 2011 Wiley-Liss, Inc.

  14. PHYSICAL PROPERTIES OF ZIRCONIUM NITRIDE IN THE HOMOGENEITY REGION (in Ukrainian)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samsonov, G.V.; Verkhoglyadova, T.S.

    1962-01-01

    The x-ray method was used to determine the homogeneity region of zirconium nitride as 40 to 50 at.% (9.5 to 13.3% by weight) of nitrogen. It is also shown that the ionic contribution to the bonding in the zirconium nitride lattice increases with a decrease in the nitrogen content in this region, this increase being greater than in the homogeneity region of titanium nitride due to the smaller degree of unfilling of the electron d-shell of the zirconium atom in comparison with that of the titanium atom. (auth)

  15. Topology optimization based design of unilateral NMR for generating a remote homogeneous field.

    PubMed

    Wang, Qi; Gao, Renjing; Liu, Shutian

    2017-06-01

    This paper presents a topology optimization based design method for the design of unilateral nuclear magnetic resonance (NMR), with which a remote homogeneous field can be obtained. The topology optimization is actualized by seeking out the optimal layout of ferromagnetic materials within a given design domain. The design objective is defined as generating a sensitive magnetic field with optimal homogeneity and maximal field strength within a required region of interest (ROI). The sensitivity of the objective function with respect to the design variables is derived and the method for solving the optimization problem is presented. A design example is provided to illustrate the utility of the design method, specifically the ability to improve the quality of the magnetic field over the required ROI by determining the optimal structural topology for the ferromagnetic poles. In both simulations and experiments, the sensitive region of the magnetic field is about 2 times larger than that of the reference design, validating the feasibility of the design method. Copyright © 2017. Published by Elsevier Inc.
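
    One hedged way to write such a design objective (illustrative only; the paper derives its own objective and sensitivities, and ρ, λ and the ROI measure below are assumptions of this sketch) is to reward mean field strength over the ROI while penalizing its spatial variation:

```latex
\max_{\boldsymbol{\rho}} \; J(\boldsymbol{\rho}) \;=\;
\bar{B}_{\mathrm{ROI}} \;-\;
\lambda \sqrt{\frac{1}{|\Omega_{\mathrm{ROI}}|}\int_{\Omega_{\mathrm{ROI}}}
\bigl(B(\mathbf{x}) - \bar{B}_{\mathrm{ROI}}\bigr)^{2}\,\mathrm{d}\Omega },
```

    where ρ collects the element-wise ferromagnetic material design variables and λ trades homogeneity against field strength.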

  16. Effects of Annular Electromagnetic Stirring Coupled with Intercooling on Grain Refinement and Homogeneity During Direct Chill Casting of Large-Sized 7005 Alloy Billet

    NASA Astrophysics Data System (ADS)

    Luo, Yajun; Zhang, Zhifeng; Li, Bao; Gao, Mingwei; Qiu, Yang; He, Min

    2017-12-01

    To obtain a large-sized, high-quality aluminum alloy billet, an advanced uniform direct chill (UDC) casting method was developed by combining annular electromagnetic stirring (A-EMS) with intercooling in the sump. The 7005 alloy was chosen to investigate the effect of UDC on grain refinement and homogeneity during normal direct chill (NDC) casting. It was concluded that the microstructure consisting of both primary α-Al phase and secondary phases becomes finer and more homogeneous for the billets prepared with UDC casting compared to those prepared with NDC casting, and the forced cooling from both the inner and outer melt under A-EMS has a measurable effect on grain refinement and homogeneity.

  17. RVR Meander – Migration of meandering rivers in homogeneous and heterogeneous floodplains using physically-based bank erosion

    USDA-ARS?s Scientific Manuscript database

    The RVR Meander platform for computing long-term meandering-channel migration is presented, together with a method for planform migration based on the modeling of the streambank erosion processes of hydraulic erosion and mass failure. An application to a real-world river, with assumption of homogene...

  18. Homogeneous Grouping in the Context of High-Stakes Testing: Does It Improve Reading Achievement?

    ERIC Educational Resources Information Center

    Salcedo-Gonzalez, Trena

    2012-01-01

    As accountability reform intensifies, urban school districts strive to meet No Child Left Behind mandates to avoid severe penalties. This study investigated the resurgence of homogeneous grouping methods as a means to increase reading achievement and meet English Language Arts Adequate Yearly Progress requirements. Specifically, this study…

  19. F-Test Alternatives to Fisher's Exact Test and to the Chi-Square Test of Homogeneity in 2x2 Tables.

    ERIC Educational Resources Information Center

    Overall, John E.; Starbuck, Robert R.

    1983-01-01

    An alternative to Fisher's exact test and the chi-square test for homogeneity in two-by-two tables is developed. The method provides for Type I error rates which are closer to the stated alpha level than either of the alternatives. (JKS)

  20. A method to eliminate wetting during the homogenization of HgCdTe

    NASA Technical Reports Server (NTRS)

    Su, Ching-Hua; Lehoczky, S. L.; Szofran, F. R.

    1986-01-01

    Adhesion of HgCdTe samples to fused silica ampoule walls, or 'wetting', during the homogenization process was eliminated by adopting a slower heating rate. The idea is to decrease Cd activity in the sample so as to reduce the rate of reaction between Cd and the silica wall.

  1. Dynamic analysis and numerical experiments for balancing of the continuous single-disc and single-span rotor-bearing system

    NASA Astrophysics Data System (ADS)

    Wang, Aiming; Cheng, Xiaohan; Meng, Guoying; Xia, Yun; Wo, Lei; Wang, Ziyi

    2017-03-01

    Identification of rotor unbalance is critical for normal operation of rotating machinery. The single-disc and single-span rotor, as the most fundamental rotor-bearing system, has attracted research attention over a long time. In this paper, the continuous single-disc and single-span rotor is modeled as a homogeneous and elastic Euler-Bernoulli beam, and the forces applied by bearings and disc on the shaft are considered as point forces. A fourth-order non-homogeneous partial differential equation set with homogeneous boundary condition is solved for analytical solution, which expresses the unbalance response as a function of position, rotor unbalance and the stiffness and damping coefficients of bearings. Based on this analytical method, a novel Measurement Point Vector Method (MPVM) is proposed to identify rotor unbalance while operating. Only a measured unbalance response registered for four selected cross-sections of the rotor-shaft under steady-state operating conditions is needed when using the method. Numerical simulation shows that the detection error of the proposed method is very small when measurement error is negligible. The proposed method provides an efficient way for rotor balancing without test runs and external excitations.
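
    A representative form of the governing equation described above (uniform homogeneous shaft, point forces from the two bearings and the disc; the notation is illustrative rather than the authors' exact formulation) is:

```latex
EI\,\frac{\partial^{4} w(x,t)}{\partial x^{4}}
\;+\; \rho A\,\frac{\partial^{2} w(x,t)}{\partial t^{2}}
\;=\; \sum_{i} F_i(t)\,\delta(x - x_i),
```

    with homogeneous boundary conditions at the shaft ends, the bearing reactions and the disc force entering as the point loads F_i, and the unbalance driving the disc force at the rotation frequency.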

  2. Optimal lattice-structured materials

    DOE PAGES

    Messner, Mark C.

    2016-07-09

    This paper describes a method for optimizing the mesostructure of lattice-structured materials. These materials are periodic arrays of slender members resembling efficient, lightweight macroscale structures like bridges and frame buildings. Current additive manufacturing technologies can assemble lattice structures with length scales ranging from nanometers to millimeters. Previous work demonstrates that lattice materials have excellent stiffness- and strength-to-weight scaling, outperforming natural materials. However, there are currently no methods for producing optimal mesostructures that consider the full space of possible 3D lattice topologies. The inverse homogenization approach for optimizing the periodic structure of lattice materials requires a parameterized, homogenized material model describing the response of an arbitrary structure. This work develops such a model, starting with a method for describing the long-wavelength, macroscale deformation of an arbitrary lattice. The work combines the homogenized model with a parameterized description of the total design space to generate a parameterized model. Finally, the work describes an optimization method capable of producing optimal mesostructures. Several examples demonstrate the optimization method. One of these examples produces an elastically isotropic, maximally stiff structure, here called the isotruss, that arguably outperforms the anisotropic octet truss topology.
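
    For context, the periodic-homogenization definition of the effective stiffness that inverse homogenization typically parameterizes (a standard textbook form, not necessarily the exact model developed in the paper) reads:

```latex
C^{H}_{ijkl} \;=\; \frac{1}{|Y|}\int_{Y}
C_{pqrs}(\mathbf{y})\,
\bigl(\varepsilon^{0(ij)}_{pq} - \varepsilon_{pq}(\boldsymbol{\chi}^{ij})\bigr)
\bigl(\varepsilon^{0(kl)}_{rs} - \varepsilon_{rs}(\boldsymbol{\chi}^{kl})\bigr)\,
\mathrm{d}Y,
```

    where Y is the periodic unit cell and the corrector fields χ^{kl} solve the cell problems for unit macroscopic strains ε^{0(kl)}.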

  3. Magnetohydrodynamic Flow by a Stretching Cylinder with Newtonian Heating and Homogeneous-Heterogeneous Reactions

    PubMed Central

    Hayat, T.; Hussain, Zakir; Alsaedi, A.; Farooq, M.

    2016-01-01

    This article examines the effects of homogeneous-heterogeneous reactions and Newtonian heating in magnetohydrodynamic (MHD) flow of Powell-Eyring fluid by a stretching cylinder. The nonlinear partial differential equations of momentum, energy and concentration are reduced to nonlinear ordinary differential equations. Convergent solutions of the momentum, energy and reaction equations are developed by using the homotopy analysis method (HAM). This method is very efficient for the development of series solutions of highly nonlinear differential equations. It does not depend on any small or large parameter like the other methods, i.e., the perturbation method, the δ-perturbation expansion method, etc. We get more accurate results as we increase the order of approximations. Effects of different parameters on the velocity, temperature and concentration distributions are sketched and discussed. Comparison of the present study with previously published work is also made in the limiting sense. Numerical values of the skin friction coefficient and Nusselt number are also computed and analyzed. It is noticed that the flow accelerates for large values of the Powell-Eyring fluid parameter. Further, the temperature profile decreases and the concentration profile increases as the Powell-Eyring fluid parameter increases. The concentration distribution is a decreasing function of the homogeneous reaction parameter, while the heterogeneous reaction parameter shows the opposite influence. PMID:27280883
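
    The zeroth-order deformation equation at the core of HAM, in its standard generic form (the symbols below are not tied to the specific momentum/energy/concentration system of this study), is:

```latex
(1-q)\,\mathcal{L}\bigl[\phi(\eta;q) - u_0(\eta)\bigr]
\;=\; q\,\hbar\,H(\eta)\,\mathcal{N}\bigl[\phi(\eta;q)\bigr],
\qquad q \in [0,1],
```

    where L is an auxiliary linear operator, u_0 an initial guess, ħ the convergence-control parameter, H an auxiliary function, and N the nonlinear operator; expanding φ in powers of q and collecting terms generates the series solution.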

  4. Magnetohydrodynamic Flow by a Stretching Cylinder with Newtonian Heating and Homogeneous-Heterogeneous Reactions.

    PubMed

    Hayat, T; Hussain, Zakir; Alsaedi, A; Farooq, M

    2016-01-01

    This article examines the effects of homogeneous-heterogeneous reactions and Newtonian heating in magnetohydrodynamic (MHD) flow of Powell-Eyring fluid by a stretching cylinder. The nonlinear partial differential equations of momentum, energy and concentration are reduced to nonlinear ordinary differential equations. Convergent solutions of the momentum, energy and reaction equations are developed by using the homotopy analysis method (HAM). This method is very efficient for the development of series solutions of highly nonlinear differential equations. It does not depend on any small or large parameter like the other methods, i.e., the perturbation method, the δ-perturbation expansion method, etc. We get more accurate results as we increase the order of approximations. Effects of different parameters on the velocity, temperature and concentration distributions are sketched and discussed. Comparison of the present study with previously published work is also made in the limiting sense. Numerical values of the skin friction coefficient and Nusselt number are also computed and analyzed. It is noticed that the flow accelerates for large values of the Powell-Eyring fluid parameter. Further, the temperature profile decreases and the concentration profile increases as the Powell-Eyring fluid parameter increases. The concentration distribution is a decreasing function of the homogeneous reaction parameter, while the heterogeneous reaction parameter shows the opposite influence.

  5. Detecting subtle hydrochemical anomalies with multivariate statistics: an example from homogeneous groundwaters in the Great Artesian Basin, Australia

    NASA Astrophysics Data System (ADS)

    O'Shea, Bethany; Jankowski, Jerzy

    2006-12-01

    The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identified in the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques for waters appearing homogeneous is emphasized for all investigations of this type.
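
    A minimal sketch of this kind of workflow, assuming a samples-by-ions concentration matrix and using scikit-learn and SciPy (not the authors' exact procedure; the data below are synthetic placeholders):

```python
# Minimal sketch: PCA plus Ward hierarchical clustering on a samples-by-ions matrix.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical data: rows = groundwater samples, columns = major ions (mg/L),
# e.g. Na, K, Ca, Mg, Cl, HCO3, SO4.
X = np.random.default_rng(0).lognormal(mean=3.0, sigma=0.3, size=(40, 7))

Xz = StandardScaler().fit_transform(X)          # standardize each ion
scores = PCA(n_components=3).fit_transform(Xz)  # principal component scores

Z = linkage(Xz, method="ward")                  # Ward hierarchical clustering
water_types = fcluster(Z, t=3, criterion="maxclust")  # cut the tree into 3 groups

print(water_types)   # statistically defined water type per sample
print(scores[:5])    # first samples in PC space (useful for anomaly screening)
```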

  6. Unified treatment of microscopic boundary conditions and efficient algorithms for estimating tangent operators of the homogenized behavior in the computational homogenization method

    NASA Astrophysics Data System (ADS)

    Nguyen, Van-Dung; Wu, Ling; Noels, Ludovic

    2017-03-01

    This work provides a unified treatment of arbitrary kinds of microscopic boundary conditions usually considered in the multi-scale computational homogenization method for nonlinear multi-physics problems. An efficient procedure is developed to enforce the multi-point linear constraints arising from the microscopic boundary condition either by the direct constraint elimination or by the Lagrange multiplier elimination methods. The macroscopic tangent operators are computed in an efficient way from a multiple right hand sides linear system whose left hand side matrix is the stiffness matrix of the microscopic linearized system at the converged solution. The number of vectors at the right hand side is equal to the number of the macroscopic kinematic variables used to formulate the microscopic boundary condition. As the resolution of the microscopic linearized system often follows a direct factorization procedure, the computation of the macroscopic tangent operators is then performed using this factorized matrix at a reduced computational time.
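
    The linear-algebra pattern described above (factorize the converged microscopic stiffness once, then solve one right-hand side per macroscopic kinematic variable) can be sketched as follows; the sizes, names and the final contraction are illustrative assumptions, not the authors' implementation:

```python
# Sketch of the multiple-right-hand-side pattern: one factorization, many solves.
import numpy as np
from scipy.linalg import lu_factor, lu_solve

n_dof = 200     # microscopic degrees of freedom (illustrative)
n_macro = 6     # macroscopic kinematic variables, e.g. strain components

rng = np.random.default_rng(1)
A = rng.standard_normal((n_dof, n_dof))
K = A @ A.T + n_dof * np.eye(n_dof)        # SPD stand-in for the converged stiffness
F = rng.standard_normal((n_dof, n_macro))  # one RHS per macroscopic variable

lu, piv = lu_factor(K)                     # single factorization of the stiffness
X = lu_solve((lu, piv), F)                 # all sensitivity solves in one pass

# A homogenized tangent would then be assembled by contracting X with the same
# constraint/projection operators used to build F (problem-specific; placeholder here).
C_macro = F.T @ X                          # n_macro x n_macro placeholder contraction
print(C_macro.shape)
```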

  7. Homogenate-assisted Vacuum-powered Bubble Extraction of Moso Bamboo Flavonoids for On-line Scavenging Free Radical Capacity Analysis.

    PubMed

    Sun, Yinnan; Yang, Kui; Cao, Qin; Sun, Jinde; Xia, Yu; Wang, Yinhang; Li, Wei; Ma, Chunhui; Liu, Shouxin

    2017-07-11

    A homogenate-assisted vacuum-powered bubble extraction (HVBE) method using ethanol was applied for extraction of flavonoids from Phyllostachys pubescens (P. pubescens) leaves. The mechanisms of homogenate-assisted extraction and vacuum-powered bubble generation were discussed in detail. Furthermore, a method for the rapid determination of flavonoids by HPLC was established. HVBE followed by HPLC was successfully applied for the extraction and quantification of four flavonoids in P. pubescens, including orientin, isoorientin, vitexin, and isovitexin. This method provides a fast and effective means for the preparation and determination of plant active components. Moreover, the on-line antioxidant capacity, including the positive-ion and negative-ion free radical scavenging capacity of different fractions from the bamboo flavonoid extract, was evaluated. Results showed that the DPPH˙ free radical scavenging capacity of vitexin and isovitexin was larger than that of isoorientin and orientin. On the contrary, the ABTS⁺˙ free radical scavenging capacity of isoorientin and orientin was larger than that of vitexin and isovitexin.

  8. Keeping an eye on the ring: COMS plaque loading optimization for improved dose conformity and homogeneity.

    PubMed

    Gagne, Nolan L; Cutright, Daniel R; Rivard, Mark J

    2012-09-01

    To improve tumor dose conformity and homogeneity for COMS plaque brachytherapy by investigating the dosimetric effects of varying component source ring radionuclides and source strengths. The MCNP5 Monte Carlo (MC) radiation transport code was used to simulate plaque heterogeneity-corrected dose distributions for individually-activated source rings of 14, 16 and 18 mm diameter COMS plaques, populated with (103)Pd, (125)I and (131)Cs sources. Ellipsoidal tumors were contoured for each plaque size and MATLAB programming was developed to generate tumor dose distributions for all possible ring weighting and radionuclide permutations for a given plaque size and source strength resolution, assuming a 75 Gy apical prescription dose. These dose distributions were analyzed for conformity and homogeneity and compared to reference dose distributions from uniformly-loaded (125)I plaques. The most conformal and homogeneous dose distributions were reproduced within a reference eye environment to assess organ-at-risk (OAR) doses in the Pinnacle(3) treatment planning system (TPS). The gamma-index analysis method was used to quantitatively compare MC and TPS-generated dose distributions. Concentrating > 97% of the total source strength in a single or pair of central (103)Pd seeds produced the most conformal dose distributions, with tumor basal doses a factor of 2-3 higher and OAR doses a factor of 2-3 lower than those of corresponding uniformly-loaded (125)I plaques. Concentrating 82-86% of the total source strength in peripherally-loaded (131)Cs seeds produced the most homogeneous dose distributions, with tumor basal doses 17-25% lower and OAR doses typically 20% higher than those of corresponding uniformly-loaded (125)I plaques. Gamma-index analysis found > 99% agreement between MC and TPS dose distributions. A method was developed to select intra-plaque ring radionuclide compositions and source strengths to deliver more conformal and homogeneous tumor dose distributions than uniformly-loaded (125)I plaques. This method may support coordinated investigations of an appropriate clinical target for eye plaque brachytherapy.
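
    A hedged sketch of the brute-force ring-weighting search described above (not the authors' MATLAB code): it assumes per-ring dose kernels precomputed by Monte Carlo for unit source strength on a set of tumor points, renormalizes each candidate to the 75 Gy apical prescription, and scores a crude homogeneity index; all arrays and names are placeholders.

```python
# Illustrative brute-force search over ring source-strength weightings.
import itertools
import numpy as np

rng = np.random.default_rng(2)
n_rings, n_points = 3, 500
D = rng.uniform(0.5, 2.0, size=(n_rings, n_points))  # Gy per unit strength (toy kernels)
apex = n_points - 1                                   # index of the apical point
prescription = 75.0                                   # Gy prescribed at the apex

best = None
steps = np.linspace(0.0, 1.0, 21)                     # weight-fraction resolution
for w in itertools.product(steps, repeat=n_rings):
    if not np.isclose(sum(w), 1.0):
        continue                                      # keep only true fractions of strength
    dose = np.asarray(w) @ D                          # dose at all tumor points
    dose *= prescription / dose[apex]                 # renormalize to the apical prescription
    homogeneity = dose.max() / dose.min()             # crude homogeneity index (lower is better)
    if best is None or homogeneity < best[0]:
        best = (homogeneity, w)

print("best ring weighting:", best)
```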

  9. Limit analysis of multi-layered plates. Part I: The homogenized Love-Kirchhoff model

    NASA Astrophysics Data System (ADS)

    Dallot, Julien; Sab, Karam

    The purpose of this paper is to determine G_p^hom, the overall homogenized Love-Kirchhoff strength domain of a rigid perfectly plastic multi-layered plate, and to study the relationship between the 3D and the homogenized Love-Kirchhoff plate limit analysis problems. In the Love-Kirchhoff model, the generalized stresses are the in-plane (membrane) and the out-of-plane (flexural) stress field resultants. The homogenization method proposed by Bourgeois [1997. Modélisation numérique des panneaux structuraux légers. Ph.D. Thesis, University Aix-Marseille] and Sab [2003. Yield design of thin periodic plates by a homogenization technique and an application to masonry wall. C. R. Méc. 331, 641-646] for in-plane periodic rigid perfectly plastic plates is justified using the asymptotic expansion method. For laminated plates, an explicit parametric representation of the yield surface ∂G_p^hom is given thanks to the π-function (the plastic dissipation power density function) that describes the local strength domain at each point of the plate. This representation also provides a localization method for the determination of the 3D stress components corresponding to every generalized stress belonging to ∂G_p^hom. For a laminated plate described with a yield function of the form F(x_3, σ) = σ_u(x_3) F̂(σ), where σ_u is a positive even function of the out-of-plane coordinate x_3 and F̂ is a convex function of the local stress σ, two effective constants and a normalization procedure are introduced. A symmetric sandwich plate consisting of two von Mises materials (σ_u = σ_u^1 in the skins and σ_u = σ_u^2 in the core) is studied. It is found that, for small enough contrast ratios (r = σ_u^1/σ_u^2 ≤ 5), the normalized strength domain Ĝ_p^hom is close to the one corresponding to a homogeneous von Mises plate [Ilyushin, A.-A., 1956. Plasticité. Eyrolles, Paris].

  10. Numerical developments for short-pulsed Near Infra-Red laser spectroscopy. Part I: direct treatment

    NASA Astrophysics Data System (ADS)

    Boulanger, Joan; Charette, André

    2005-03-01

    This two-part study is devoted to the numerical treatment of short-pulsed laser near infra-red spectroscopy. The overall goal is to address the possibility of numerical inverse treatment based on a recently developed direct model to solve the transient radiative transfer equation. This model has been constructed in order to incorporate the latest improvements in short-pulsed laser interaction with semi-transparent media, and it combines a discrete ordinates computation of the implicit source term appearing in the radiative transfer equation with an explicit treatment of the transport of the light intensity using advection schemes, a method encountered in reactive flow dynamics. The incident collimated beam is analytically solved through the Bouguer-Beer-Lambert extinction law. In this first part, the direct model is extended to fully non-homogeneous materials and tested with two different spatial schemes in order to be adapted to the inversion methods presented in the following second part. First, the fundamental methods and schemes used in the direct model are presented. Then, tests are conducted by comparison with numerical simulations given as references. In the third and last part, multi-dimensional extensions of the code are provided. This allows presentation of numerical results of short-pulse propagation in 1-, 2- and 3-D homogeneous and non-homogeneous materials, together with some parametric studies on medium properties and pulse shape. For comparison, an integral method adapted to non-homogeneous media irradiated by a pulsed laser beam is also developed for the 3D case.
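
    The analytical treatment of the collimated beam mentioned above follows the Bouguer-Beer-Lambert extinction law, which for a non-homogeneous medium can be written (notation illustrative):

```latex
I_c(s) \;=\; I_0 \exp\!\left(-\int_{0}^{s} \beta_e(s')\,\mathrm{d}s'\right),
```

    where β_e is the local extinction (absorption plus scattering) coefficient along the beam path and I_0 the incident intensity.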

  11. Non-periodic homogenization of 3-D elastic media for the seismic wave equation

    NASA Astrophysics Data System (ADS)

    Cupillard, Paul; Capdeville, Yann

    2018-05-01

    Because seismic waves have a limited frequency spectrum, the velocity structure of the Earth that can be extracted from seismic records has a limited resolution. As a consequence, one obtains smooth images from waveform inversion, although the Earth holds discontinuities and small scales of various natures. Within the last decade, the non-periodic homogenization method shed light on how seismic waves interact with small geological heterogeneities and `see' upscaled properties. This theory enables us to compute long-wave equivalent density and elastic coefficients of any medium, with no constraint on the size, the shape and the contrast of the heterogeneities. In particular, the homogenization leads to the apparent, structure-induced anisotropy. In this paper, we implement this method in 3-D and show 3-D tests for the very first time. The non-periodic homogenization relies on an asymptotic expansion of the displacement and the stress involved in the elastic wave equation. Limiting ourselves to the order 0, we show that the practical computation of an upscaled elastic tensor basically requires (i) to solve an elastostatic problem and (ii) to low-pass filter the strain and the stress associated with the obtained solution. The elastostatic problem consists in finding the displacements due to local unit strains acting in all directions within the medium to upscale. This is solved using a parallel, highly optimized finite-element code. As for the filtering, we rely on the finite-element quadrature to perform the convolution in the space domain. We end up with an efficient numerical tool that we apply on various 3-D models to test the accuracy and the benefit of the homogenization. In the case of a finely layered model, our method agrees with results derived from Backus. In a more challenging model composed by a million small cubes, waveforms computed in the homogenized medium fit reference waveforms very well. Both direct phases and complex diffracted waves are accurately retrieved in the upscaled model, although it is smooth. Finally, our upscaling method is applied to a realistic geological model. The obtained homogenized medium holds structure-induced anisotropy. Moreover, full seismic wavefields in this medium can be simulated with a coarse mesh (no matter what the numerical solver is), which significantly reduces computation costs usually associated with discontinuities and small heterogeneities. These three tests show that the non-periodic homogenization is both accurate and tractable in large 3-D cases, which opens the path to the correct account of the effect of small-scale features on seismic wave propagation for various applications and to a deeper understanding of the apparent anisotropy.
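
    Schematically, the order-0 procedure described above amounts to solving one elastostatic (corrector) problem per unit macroscopic strain and then low-pass filtering the resulting fields; a hedged sketch of the two steps (notation illustrative, not the authors' exact formulas) is:

```latex
\nabla\cdot\Bigl(\mathbf{c}(\mathbf{x}) :
\bigl(\boldsymbol{\varepsilon}(\mathbf{u}^{(kl)}) + \mathbf{e}^{(kl)}\bigr)\Bigr) = \mathbf{0}
\quad\Longrightarrow\quad
\mathcal{F}\bigl(\boldsymbol{\sigma}^{(kl)}\bigr)
\;=\; \mathbf{c}^{\mathrm{eff}} : \mathcal{F}\bigl(\boldsymbol{\varepsilon}^{(kl)}\bigr),
```

    where e^{(kl)} are the local unit strains, F denotes the spatial low-pass filter, and c^eff is recovered by inverting the filtered strain relation.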

  12. Method for preparing homogeneous single crystal ternary III-V alloys

    DOEpatents

    Ciszek, Theodore F.

    1991-01-01

    A method for producing homogeneous, single-crystal III-V ternary alloys of high crystal perfection using a floating crucible system in which the outer crucible holds a ternary alloy of the composition desired to be produced in the crystal and an inner floating crucible having a narrow, melt-passing channel in its bottom wall holds a small quantity of melt of a pseudo-binary liquidus composition that would freeze into the desired crystal composition. The alloy of the floating crucible is maintained at a predetermined lower temperature than the alloy of the outer crucible, and a single crystal of the desired homogeneous alloy is pulled out of the floating crucible melt, as melt from the outer crucible flows into a bottom channel of the floating crucible at a rate that corresponds to the rate of growth of the crystal.

  13. Nonlinear Boltzmann equation for the homogeneous isotropic case: Some improvements to deterministic methods and applications to relaxation towards local equilibrium

    NASA Astrophysics Data System (ADS)

    Asinari, P.

    2011-03-01

    The Boltzmann equation is one of the most powerful paradigms for explaining transport phenomena in fluids. Since the early fifties, it has received a lot of attention due to aerodynamic requirements for high altitude vehicles, vacuum technology requirements and, nowadays, micro-electro-mechanical systems (MEMS). Because of the intrinsic mathematical complexity of the problem, Boltzmann himself started his work by considering first the case when the distribution function does not depend on space (homogeneous case), but only on time and the magnitude of the molecular velocity (isotropic collisional integral). The interest with regard to the homogeneous isotropic Boltzmann equation goes beyond simple dilute gases. In so-called econophysics, a Boltzmann-type model is sometimes introduced for studying the distribution of wealth in a simple market. Another recent application of the homogeneous isotropic Boltzmann equation is given by opinion formation modeling in quantitative sociology, also called socio-dynamics or sociophysics. The present work [1] aims to improve the deterministic method for solving the homogeneous isotropic Boltzmann equation proposed by Aristov [2] through two ideas: (a) the homogeneous isotropic problem is reformulated first in terms of particle kinetic energy (this allows one to ensure exact particle number and energy conservation during microscopic collisions) and (b) a DVM-like correction (where DVM stands for Discrete Velocity Model) is adopted for improving the relaxation rates (this allows one to satisfy exactly the conservation laws at the macroscopic level, which is particularly important for describing the late dynamics in the relaxation towards equilibrium).
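
    In the space-homogeneous setting discussed above, the distribution function depends only on time and molecular velocity (or, after the energy reformulation, on kinetic energy), so the equation reduces to the standard form:

```latex
\frac{\partial f(\mathbf{v},t)}{\partial t} \;=\; Q\bigl(f,f\bigr)(\mathbf{v},t),
```

    with Q the bilinear collision operator; isotropy further reduces f to a function of |v| (or of the kinetic energy) and t.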

  14. High throughput film dosimetry in homogeneous and heterogeneous media for a small animal irradiator

    PubMed Central

    Wack, L.; Ngwa, W.; Tryggestad, E.; Tsiamas, P.; Berbeco, R.; Ng, S.K.; Hesser, J.

    2013-01-01

    Purpose We have established a high-throughput Gafchromic film dosimetry protocol for narrow kilo-voltage beams in homogeneous and heterogeneous media for small-animal radiotherapy applications. The kV beam characterization is based on extensive Gafchromic film dosimetry data acquired in homogeneous and heterogeneous media. An empirical model is used for parameterization of depth and off-axis dependence of measured data. Methods We have modified previously published methods of film dosimetry to suit the specific tasks of the study. Unlike film protocols used in previous studies, our protocol employs simultaneous multichannel scanning and analysis of up to nine Gafchromic films per scan. A scanner and background correction were implemented to improve accuracy of the measurements. Measurements were taken in homogeneous and inhomogeneous phantoms at 220 kVp and a field size of 5 × 5 mm2. The results were compared against Monte Carlo simulations. Results Dose differences caused by variations in background signal were effectively removed by the corrections applied. Measurements in homogeneous phantoms were used to empirically characterize beam data in homogeneous and heterogeneous media. Film measurements in inhomogeneous phantoms and their empirical parameterization differed by about 2%–3%. The model differed from MC by about 1% (water, lung) to 7% (bone). Good agreement was found for measured and modelled off-axis ratios. Conclusions EBT2 films are a valuable tool for characterization of narrow kV beams, though care must be taken to eliminate disturbances caused by varying background signals. The usefulness of the empirical beam model in interpretation and parameterization of film data was demonstrated. PMID:23510532

  15. Preparation and characterization of paclitaxel nanosuspension using novel emulsification method by combining high speed homogenizer and high pressure homogenization.

    PubMed

    Li, Yong; Zhao, Xiuhua; Zu, Yuangang; Zhang, Yin

    2015-07-25

    The aim of this study was to develop an alternative, more bio-available, better tolerated paclitaxel nanosuspension (PTXNS) for intravenous injection in comparison with the commercially available Taxol(®) formulation. In this study, PTXNS was prepared by an emulsification method through the combination of a high speed homogenizer and high pressure homogenization, followed by a lyophilization process for intravenous administration. The main production parameters including volume ratio of organic phase in water and organic phase (Vo:Vw+o), concentration of PTX, content of PTX and emulsification time (Et), homogenization pressure (HP) and passes (Ps) for high pressure homogenization were optimized and their effects on mean particle size (MPS) and particle size distribution (PSD) of PTXNS were investigated. The characteristics of PTXNS, such as surface morphology, physical status of paclitaxel (PTX) in PTXNS, redispersibility of PTXNS in purified water, in vitro dissolution behaviour and bioavailability in vivo, were all investigated. The PTXNS obtained under optimum conditions had an MPS of 186.8 nm and a zeta potential (ZP) of -6.87 mV. The PTX content in PTXNS was approximately 3.42%. Moreover, the residual amount of chloroform was lower than the International Conference on Harmonization limit (60 ppm) for solvents. The dissolution study indicated that PTXNS dissolved faster than raw PTX and showed a sustained-dissolution character compared with the Taxol(®) formulation. Moreover, the bioavailability of PTXNS increased 14.38-fold and 3.51-fold compared with raw PTX and the Taxol(®) formulation, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Practical Aerobic Oxidations of Alcohols and Amines with Homogeneous Cu/TEMPO and Related Catalyst Systems

    PubMed Central

    Ryland, Bradford L.; Stahl, Shannon S.

    2014-01-01

    Alcohol and amine oxidations are common reactions in laboratory and industrial synthesis of organic molecules. Aerobic oxidation methods have long been sought for these transformations, but few practical methods exist that offer advantages over traditional oxidation methods. Recently developed homogeneous Cu/TEMPO (TEMPO = 2,2,6,6-tetramethylpiperidinyl-N-oxyl) and related catalyst systems appear to fill this void. The reactions exhibit high levels of chemoselectivity and broad functional-group tolerance, and they often operate efficiently at room temperature with ambient air as the oxidant. These advances, together with their historical context and recent applications, are highlighted in this minireview. PMID:25044821

  17. Rapid Solid-State Metathesis Routes to Nanostructured Silicon-Germanium

    NASA Technical Reports Server (NTRS)

    Rodriguez, Marc (Inventor); Kaner, Richard B. (Inventor); Bux, Sabah K. (Inventor); Fleurial, Jean-Pierre (Inventor)

    2014-01-01

    Methods for producing nanostructured silicon and silicon-germanium via solid state metathesis (SSM). The method of forming nanostructured silicon comprises the steps of combining a stoichiometric mixture of silicon tetraiodide (SiI4) and an alkaline earth metal silicide into a homogeneous powder, and initiating the reaction between the silicon tetraiodide (SiI4) and the alkaline earth metal silicide. The method of forming nanostructured silicon-germanium comprises the steps of combining a stoichiometric mixture of silicon tetraiodide (SiI4) and a germanium-based precursor into a homogeneous powder, and initiating the reaction between the silicon tetraiodide (SiI4) and the germanium-based precursor.

  18. Boundary value problems for multi-term fractional differential equations

    NASA Astrophysics Data System (ADS)

    Daftardar-Gejji, Varsha; Bhalekar, Sachin

    2008-09-01

    The multi-term fractional diffusion-wave equation with homogeneous/non-homogeneous boundary conditions has been solved using the method of separation of variables. It is observed that, unlike in the one-term case, the solution of the multi-term fractional diffusion-wave equation is not necessarily non-negative, and hence does not represent anomalous diffusion of any kind.
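
    A representative one-dimensional form of such a multi-term equation (the orders, coefficients and source term below are generic, not necessarily those treated in the paper) is:

```latex
\sum_{i=1}^{n} a_i\, {}^{C}\!D_t^{\alpha_i} u(x,t)
\;=\; \frac{\partial^{2} u(x,t)}{\partial x^{2}} + q(x,t),
\qquad 0 < \alpha_n \le \dots \le \alpha_1 \le 2,
```

    where {}^{C}D_t^{α} denotes a Caputo-type time-fractional derivative and homogeneous or non-homogeneous boundary conditions are imposed at the ends of the spatial domain.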

  19. Genetic progress in homogeneous regions of wheat cultivation in Rio Grande do Sul State, Brazil.

    PubMed

    Follmann, D N; Cargnelutti Filho, A; Lúcio, A D; de Souza, V Q; Caraffa, M; Wartha, C A

    2017-03-30

    The State of Rio Grande do Sul (RS) stands out as the largest wheat producer in Brazil. Wheat is the most emphasized winter cereal in RS, attracting public and private investments directed to wheat genetic breeding. The study of genetic progress should be performed routinely at breeding programs to study the behavior of cultivars developed for homogeneous regions of cultivation. The objectives of this study were: 1) to evaluate the genetic progress of wheat grain yield in RS; 2) to evaluate the influence of cultivar competition trial stratification in homogeneous regions of cultivation on the study of genetic progress. Grain yield data of 122 wheat cultivars evaluated in 137 trials arranged in randomized block design with three or four replications were used. Field trials were carried out in 23 locations in RS divided into two homogeneous regions during the period from 2002 to 2013. Genetic progress for RS and homogeneous regions was studied utilizing the method proposed by Vencovsky. Annual genetic progress for wheat grain yield during the period of 12 years in the State of RS was 2.86%, oscillating between homogeneous regions of cultivation. The difference of annual genetic progress in region 1 (1.82%) in relation to region 2 (4.38%) justifies the study of genetic progress by homogeneous regions of cultivation.

  20. Multiscale global identification of porous structures

    NASA Astrophysics Data System (ADS)

    Hatłas, Marcin; Beluch, Witold

    2018-01-01

    The paper is devoted to the evolutionary identification of the material constants of porous structures based on measurements conducted on the macro scale. Numerical homogenization with the RVE concept is used to determine the equivalent properties of a macroscopically homogeneous material. Finite element method software is applied to solve the boundary-value problem at both scales. A global optimization method in the form of an evolutionary algorithm is employed to solve the identification task. Modal analysis is performed to collect the data necessary for the identification. A numerical example demonstrating the effectiveness of the proposed approach is included.
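
    A hedged sketch of the shape of such an identification loop, with a toy surrogate in place of the FEM/RVE homogenization forward model and SciPy's differential evolution standing in for the paper's evolutionary algorithm; the measured frequencies, bounds and scaling law are placeholders:

```python
# Sketch: fit material constants so predicted natural frequencies match measured ones.
import numpy as np
from scipy.optimize import differential_evolution

measured = np.array([120.0, 310.0, 540.0])   # Hz, hypothetical modal data

def predicted_frequencies(E, rho):
    # Toy surrogate: frequencies scale with sqrt(E / rho) (stand-in for the FEM model).
    base = np.array([1.0, 2.6, 4.5])
    return base * np.sqrt(E / rho) / (2.0 * np.pi)

def objective(x):
    E, rho = x
    return np.sum((predicted_frequencies(E, rho) - measured) ** 2)

bounds = [(1e9, 1e11), (500.0, 3000.0)]      # E [Pa], density [kg/m^3] (illustrative)
result = differential_evolution(objective, bounds, seed=3)
print(result.x, result.fun)                  # identified constants and final misfit
```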

  1. PDF methods for combustion in high-speed turbulent flows

    NASA Technical Reports Server (NTRS)

    Pope, Stephen B.

    1995-01-01

    This report describes the research performed during the second year of this three-year project. The ultimate objective of the project is to extend the applicability of probability density function (pdf) methods from incompressible to compressible turbulent reactive flows. As described in subsequent sections, progress has been made on: (1) formulation and modelling of pdf equations for compressible turbulence, in both homogeneous and inhomogeneous inert flows; and (2) implementation of the compressible model in various flow configurations, namely decaying isotropic turbulence, homogeneous shear flow and the plane mixing layer.

  2. Method for removing trace pollutants from aqueous solutions

    DOEpatents

    Silver, Gary L.

    1986-01-01

    A method of substantially removing a trace metallic contaminant from a liquid containing the same comprises, adding an oxidizing agent to a liquid containing a trace amount of a metallic contaminant of a concentration of up to about 10^-1 ppm, the oxidizing agent being one which oxidizes the contaminant to form an oxidized product which is insoluble in the liquid and precipitates therefrom, and the conditions of the addition being selected to ensure that the precipitation of the oxidized product is homogeneous, and separating the homogeneously precipitated product from the liquid.

  3. Method of making metal oxide ceramic powders by using a combustible amino acid compound

    DOEpatents

    Pederson, L.R.; Chick, L.A.; Exarhos, G.J.

    1992-05-19

    This invention is directed to the formation of homogeneous, aqueous precursor mixtures of at least one substantially soluble metal salt and a substantially soluble, combustible co-reactant compound, typically an amino acid. This produces, upon evaporation, a substantially homogeneous intermediate material having a total solids level which would support combustion. The homogeneous intermediate material essentially comprises highly dispersed or solvated metal constituents and the co-reactant compound. The intermediate material is quite flammable. A metal oxide powder results on ignition of the intermediate product which combusts same to produce the product powder.

  4. Method of making metal oxide ceramic powders by using a combustible amino acid compound

    DOEpatents

    Pederson, Larry R.; Chick, Lawrence A.; Exarhos, Gregory J.

    1992-01-01

    This invention is directed to the formation of homogeneous, aqueous precursor mixtures of at least one substantially soluble metal salt and a substantially soluble, combustible co-reactant compound, typically an amino acid. This produces, upon evaporation, a substantially homogeneous intermediate material having a total solids level which would support combustion. The homogeneous intermediate material essentially comprises highly dispersed or solvated metal constituents and the co-reactant compound. The intermediate material is quite flammable. A metal oxide powder results on ignition of the intermediate product which combusts same to produce the product powder.

  5. A novel content-based active contour model for brain tumor segmentation.

    PubMed

    Sachdeva, Jainy; Kumar, Vinod; Gupta, Indra; Khandelwal, Niranjan; Ahuja, Chirag Kamal

    2012-06-01

    Brain tumor segmentation is a crucial step in surgical and treatment planning. Intensity-based active contour models such as gradient vector flow (GVF), magnetostatic active contour (MAC) and fluid vector flow (FVF) have been proposed to segment homogeneous objects/tumors in medical images. In this study, extensive experiments are done to analyze the performance of intensity-based techniques for homogeneous tumors on brain magnetic resonance (MR) images. The analysis shows that the state-of-the-art methods fail to segment homogeneous tumors against a similar background or when these tumors show partial diversity toward the background. They also have a preconvergence problem in the case of false edges/saddle points. Furthermore, the presence of weak edges and diffused edges (due to edema around the tumor) leads to oversegmentation by intensity-based techniques. Therefore, the proposed content-based active contour (CBAC) method uses both intensity and texture information present within the active contour to overcome the above-stated problems while capturing a large range in an image. It also proposes a novel use of the Gray-Level Co-occurrence Matrix to define a texture space for tumor segmentation. The effectiveness of this method is tested on two different real data sets (55 patients - more than 600 images) containing five different types of homogeneous, heterogeneous and diffused tumors, and on synthetic images (non-MR benchmark images). Remarkable results are obtained in segmenting homogeneous tumors of uniform intensity, heterogeneous tumors of complex content, and diffused tumors on MR images (T1-weighted, postcontrast T1-weighted and T2-weighted) as well as synthetic images (non-MR benchmark images of varying intensity, texture, noise content and false edges). Further, tumor volume is efficiently extracted from 2-dimensional slices, an approach named 2.5-dimensional segmentation. Copyright © 2012 Elsevier Inc. All rights reserved.
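
    For illustration only (not the CBAC implementation), the Gray-Level Co-occurrence Matrix texture descriptors referred to above can be computed with scikit-image roughly as follows; the toy image, quantization level and chosen properties are assumptions of the sketch:

```python
# Sketch of GLCM texture features on a 2D image slice using scikit-image.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(4)
slice_2d = rng.integers(0, 64, size=(128, 128), dtype=np.uint8)  # toy 6-bit image

# Co-occurrence matrix for one pixel offset and four directions.
glcm = graycomatrix(slice_2d, distances=[1],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=64, symmetric=True, normed=True)

# Texture descriptors that could feed an intensity-plus-texture energy term.
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
print(features)
```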

  6. Identification of intracellular degradation intermediates of aldolase B by antiserum to the denatured enzyme.

    PubMed Central

    Reznick, A Z; Rosenfelder, L; Shpund, S; Gershon, D

    1985-01-01

    A method has been developed that enables us to identify intracellular degradation intermediates of fructose-bisphosphate aldolase B (D-fructose-1,6-bisphosphate D-glyceraldehyde-3-phosphate-lyase, EC 4.1.2.13). This method is based on the use of antibody against thoroughly denatured purified aldolase. This antibody has been shown to recognize only denatured molecules, and it did not interact with "native" enzyme. Supernatants (24,000 X g for 30 min) of liver and kidney homogenates were incubated with antiserum to denatured enzyme. The antigen-antibody precipitates thus formed were subjected to NaDodSO4/PAGE, followed by electrotransfer to nitrocellulose paper and immunodecoration with antiserum to denatured enzyme and 125I-labeled protein A. Seven peptides with molecular weights ranging from 38,000 (that of the intact subunit) to 18,000, which cross-reacted antigenically with denatured fructose-bisphosphate aldolase, could be identified in liver. The longest three peptides were also present in kidney. The possibility that these peptides were artifacts of homogenization was ruled out as follows: 125I-labeled tagged purified native aldolase was added to the buffer prior to liver homogenization. The homogenates were then subjected to NaDodSO4/PAGE followed by autoradiography, and the labeled enzyme was shown to remain intact. This method is suggested for general use in the search for degradation products of other cellular proteins. PMID:3898080

  7. Evaluation of the method of collecting suspended sediment from large rivers by discharge-weighted pumping and separation by continuous- flow centrifugation

    USGS Publications Warehouse

    Moody, J.A.; Meade, R.H.

    1994-01-01

    The efficacy of the method is evaluated by comparing the particle size distributions of sediment collected by the discharge-weighted pumping method with the particle size distributions of sediment collected by depth integration and separated by gravitational settling. The pumping method was found to undersample the suspended sand sized particles (>63 µm) but to collect a representative sample of the suspended silt and clay sized particles (<63 µm). The success of the discharge-weighted pumping method depends on how homogeneously the silt and clay sized particles (<63 µm) are distributed in the vertical direction in the river. The degree of homogeneity depends on the composition and degree of aggregation of the suspended sediment particles. -from Authors

  8. A Genetic Algorithm Method for Direct estimation of paleostress states from heterogeneous fault-slip observations

    NASA Astrophysics Data System (ADS)

    Srivastava, D. C.

    2016-12-01

    Paleostress estimation from a group of heterogeneous fault-slip observations entails first the classification of the observations into homogeneous fault sets and then a separate inversion of each homogeneous set. This study combines these two issues into a nonlinear inverse problem and proposes a heuristic search method that inverts the heterogeneous fault-slip observations. The method estimates different paleostress states in a group of heterogeneous fault-slip observations and classifies it into homogeneous sets as a byproduct. It uses the genetic algorithm operators elitism, selection, encoding, crossover and mutation. These processes translate into a guided search that finds successively fitter solutions and operates iteratively until the termination criterion is met and the globally fittest stress tensors are obtained. We explain the basic steps of the algorithm on a working example and demonstrate the validity of the method on several synthetic groups and a natural group of heterogeneous fault-slip observations. The method is independent of any user-defined bias or any entrapment of the solution in a local optimum. It succeeds even in difficult situations where other classification methods are found to fail.
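
    A minimal genetic-algorithm skeleton showing the named operators (selection, crossover, mutation, elitism); the fitness function below is a placeholder, not the fault-slip misfit used in the study, and the real-valued encoding and parameter count are assumptions of the sketch:

```python
# Minimal GA loop: truncation selection, arithmetic crossover, Gaussian mutation, elitism.
import numpy as np

rng = np.random.default_rng(5)

def fitness(x):
    # Placeholder misfit: negative squared distance to an arbitrary "true" parameter vector.
    return -np.sum((x - np.array([0.3, -0.2, 0.7, 0.1])) ** 2)

pop = rng.uniform(-1, 1, size=(60, 4))          # real-encoded population
for generation in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    order = np.argsort(scores)[::-1]
    elite = pop[order[:2]].copy()               # elitism: carry over the best two unchanged
    parents = pop[order[: len(pop) // 2]]       # truncation selection of the fitter half
    children = []
    while len(children) < len(pop) - len(elite):
        i, j = rng.integers(0, len(parents), size=2)
        alpha = rng.random()
        child = alpha * parents[i] + (1 - alpha) * parents[j]   # arithmetic crossover
        child += rng.normal(0.0, 0.05, size=child.shape)        # Gaussian mutation
        children.append(child)
    pop = np.vstack([elite, np.array(children)])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(best)
```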

  9. Method to study the effect of blend flowability on the homogeneity of acetaminophen.

    PubMed

    Llusá, Marcos; Pingali, Kalyana; Muzzio, Fernando J

    2013-02-01

    Excipient selection is key to product development because it affects their processability and physical properties, which ultimately affect the quality attributes of the pharmaceutical product. To study how the flowability of lubricated formulations affects acetaminophen (APAP) homogeneity. The formulations studied here contain one of two types of cellulose (Avicel 102 or Ceollus KG-802), one of three grades of Mallinckrodt APAP (fine, semi-fine, or micronized), lactose (Fast-Flo) and magnesium stearate. These components are mixed in a 300-liter bin blender. Blend flowability is assessed with the Gravitational Displacement Rheometer. APAP homogeneity is assessed with off-line NIR. Excluding blends dominated by segregation, there is a trend between APAP homogeneity and blend flow index. Blend flowability is affected by the type of microcrystalline cellulose and by the APAP grade. The preliminary results suggest that the methodology used in this paper is adequate to study of the effect of blend flow index on APAP homogeneity.

  10. Elucidating the DEP phenomena using a volumetric polarization approach with consideration of the electric double layer

    PubMed Central

    Brcka, Jozef; Faguet, Jacques; Zhang, Guigen

    2017-01-01

    Dielectrophoretic (DEP) phenomena have been explored to great success for various applications like particle sorting and separation. To elucidate the underlying mechanism and quantify the DEP force experienced by particles, the point-dipole and Maxwell Stress Tensor (MST) methods are commonly used. However, both methods exhibit their own limitations. For example, the point-dipole method is unable to fully capture the essence of particle-particle interactions and the MST method is not suitable for particles of non-homogeneous property. Moreover, both methods fare poorly when it comes to explaining DEP phenomena such as the dependence of crossover frequency on medium conductivity. To address these limitations, the authors have developed a new method, termed volumetric-integration method, with the aid of computational implementation, to reexamine the DEP phenomena, elucidate the governing mechanism, and quantify the DEP force. The effect of an electric double layer (EDL) on particles' crossover behavior is dealt with through consideration of the EDL structure along with surface ionic/molecular adsorption, unlike in other methods, where the EDL is accounted for through simply assigning a surface conductance value to the particles. For validation, by comparing with literature experimental data, the authors show that the new method can quantify the DEP force on not only homogeneous particles but also non-homogeneous ones, and predict particle-particle interactions fairly accurately. Moreover, the authors also show that the predicted dependence of crossover frequency on medium conductivity and particle size agrees very well with experimental measurements. PMID:28396710

  11. Swelling-induced and controlled curving in layered gel beams

    PubMed Central

    Lucantonio, A.; Nardinocchi, P.; Pezzulla, M.

    2014-01-01

    We describe swelling-driven curving in originally straight, non-homogeneous beams. We present and verify a structural model of swollen beams that adopts a new point of view on swelling-induced deformation in bilayered gel beams: the swelling-induced deformation of the beam at equilibrium is split into two components, both depending on the elastic properties of the gel. The method allows us to (i) determine beam stretching and curving, once the characteristics of the solvent bath and of the non-homogeneous beam are assigned, and (ii) estimate the characteristics of non-homogeneous flat gel beams so as to obtain three-dimensional shapes under free-swelling conditions. The study was pursued by means of analytical, semi-analytical and numerical tools; excellent agreement among the outcomes of the different techniques was found, confirming the strength of the method. PMID:25383031

  12. Deflection load characteristics of laser-welded orthodontic wires.

    PubMed

    Watanabe, Etsuko; Stigall, Garrett; Elshahawy, Waleed; Watanabe, Ikuya

    2012-07-01

    To compare the deflection load characteristics of homogeneous and heterogeneous joints made by laser welding using various types of orthodontic wires. Four kinds of straight orthodontic rectangular wires (0.017 inch × 0.025 inch) were used: stainless-steel (SS), cobalt-chromium-nickel (Co-Cr-Ni), beta-titanium alloy (β-Ti), and nickel-titanium (Ni-Ti). Homogeneous and heterogeneous end-to-end joints (12 mm long each) were made by Nd:YAG laser welding. Two types of welding methods were used: two-point welding and four-point welding. Nonwelded wires were also used as a control. Deflection load (N) was measured by conducting the three-point bending test. The data (n  =  5) were statistically analyzed using analysis of variance/Tukey test (P < .05). The deflection loads for control wires measured were as follows: SS: 21.7 ± 0.8 N; Co-Cr-Ni: 20.0 ± 0.3 N; β-Ti: 13.9 ± 1.3 N; and Ni-Ti: 6.6 ± 0.4 N. All of the homogeneously welded specimens showed lower deflection loads compared to corresponding control wires and exhibited higher deflection loads compared to heterogeneously welded combinations. For homogeneous combinations, Co-Cr-Ni/Co-Cr-Ni showed a significantly (P < .05) higher deflection load than those of the remaining homogeneously welded groups. In heterogeneous combinations, SS/Co-Cr-Ni and β-Ti/Ni-Ti showed higher deflection loads than those of the remaining heterogeneously welded combinations (significantly higher for SS/Co-Cr-Ni). Significance (P < .01) was shown for the interaction between the two factors (materials combination and welding method). However, no significant difference in deflection load was found between four-point and two-point welding in each homogeneous or heterogeneous combination. Heterogeneously laser-welded SS/Co-Cr-Ni and β-Ti/Ni-Ti wires provide a deflection load that is comparable to that of homogeneously welded orthodontic wires.

  13. Simulation assisted characterization of kaolinite-methanol intercalation complexes synthesized using cost-efficient homogenization method

    NASA Astrophysics Data System (ADS)

    Makó, Éva; Kovács, András; Ható, Zoltán; Kristóf, Tamás

    2015-12-01

    Recent experimental and simulation findings with kaolinite-methanol intercalation complexes raised the question of the existence of more stable structures in wet and dry state, which has not been fully cleared up yet. Experimental and molecular simulation analyses were used to investigate different types of kaolinite-methanol complexes, revealing their real structures. Cost-efficient homogenization methods were applied to synthesize the kaolinite-dimethyl sulfoxide and kaolinite-urea pre-intercalation complexes of the kaolinite-methanol ones. The tested homogenization method required an order of magnitude lower amount of reagents than the generally applied solution method. The influence of the type of pre-intercalated molecules and of the wetting or drying (at room temperature and at 150 °C) procedure on the intercalation was characterized experimentally by X-ray diffraction and thermal analysis. Consistent with the suggestion from the present simulations, 1.12-nm and 0.83-nm stable kaolinite-methanol complexes were identified. For these complexes, our molecular simulations predict either single-layered structures of mobile methanol/water molecules or non-intercalated structures of methoxy-functionalized kaolinite. We found that the methoxy-modified kaolinite can easily be intercalated by liquid methanol.

  14. M-Adapting Low Order Mimetic Finite Differences for Dielectric Interface Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGregor, Duncan A.; Gyrya, Vitaliy; Manzini, Gianmarco

    2016-03-07

    We consider the problem of reducing numerical dispersion for an electromagnetic wave in a 2D domain with two materials separated by a flat interface and a factor-of-two difference in wave speed. The computational mesh in the homogeneous parts of the domain away from the interface consists of square elements. Here the method construction is based on m-adaptation in the homogeneous domain, which leads to fourth-order numerical dispersion (versus second order in the non-optimized method). The size of the elements in the two domains also differs by a factor of two, so as to preserve the same value of the Courant number in each. Near the interface, where the two meshes merge, the mesh with larger elements consists of degenerate pentagons. We demonstrate that prior to m-adaptation the accuracy of the method falls from second to first order due to the breaking of symmetry in the mesh. Next we develop an m-adaptation framework for the interface region and devise an optimization criterion. We prove that for the interface problem m-adaptation cannot produce an increase in method accuracy. This is in contrast to the homogeneous medium, where m-adaptation can increase accuracy by two orders.

  15. Assessment of protein set coherence using functional annotations

    PubMed Central

    Chagoyen, Monica; Carazo, Jose M; Pascual-Montano, Alberto

    2008-01-01

    Background: Analysis of large-scale experimental datasets frequently produces one or more sets of proteins that are subsequently mined for functional interpretation and validation. To this end, a number of computational methods have been devised that rely on the analysis of functional annotations. Although current methods provide valuable information (e.g. significantly enriched annotations, pairwise functional similarities), they do not specifically measure the degree of homogeneity of a protein set. Results: In this work we present a method that scores the degree of functional homogeneity, or coherence, of a set of proteins on the basis of the global similarity of their functional annotations. The method uses statistical hypothesis testing to assess the significance of the set in the context of the functional space of a reference set. As such, it can be used as a first step in the validation of sets expected to be homogeneous prior to further functional interpretation. Conclusion: We evaluate our method by analysing known biologically relevant sets as well as random ones. The known relevant sets comprise macromolecular complexes, cellular components and pathways described for Saccharomyces cerevisiae, which are mostly significantly coherent. Finally, we illustrate the usefulness of our approach for validating 'functional modules' obtained from computational analysis of protein-protein interaction networks. Matlab code and supplementary data are available at PMID:18937846
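
    A minimal sketch of the kind of significance test described above, assuming a precomputed pairwise functional-similarity matrix over a reference set of proteins: the coherence score of a set is its mean pairwise similarity, and an empirical p-value is obtained by comparison with randomly drawn sets of the same size. The toy similarity matrix and function names are hypothetical, not the authors' implementation.

      # Permutation-style coherence test for a protein set, given a pairwise
      # functional-similarity matrix over a reference set of proteins.
      import numpy as np

      rng = np.random.default_rng(1)

      def mean_pairwise_similarity(sim, idx):
          sub = sim[np.ix_(idx, idx)]
          iu = np.triu_indices(len(idx), k=1)      # upper triangle, no diagonal
          return sub[iu].mean()

      def coherence_pvalue(sim, idx, n_perm=2000):
          observed = mean_pairwise_similarity(sim, idx)
          n = sim.shape[0]
          null = np.array([
              mean_pairwise_similarity(sim, rng.choice(n, size=len(idx), replace=False))
              for _ in range(n_perm)
          ])
          # one-sided: how often a random set is at least as coherent
          return observed, (np.sum(null >= observed) + 1) / (n_perm + 1)

      # Toy example: a block of mutually similar proteins inside a random background.
      sim = rng.uniform(0.0, 0.4, size=(200, 200))
      sim = (sim + sim.T) / 2
      sim[:10, :10] = 0.9                          # hypothetical coherent module
      score, p = coherence_pvalue(sim, np.arange(10))
      print(score, p)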

  16. Comparison of gravimetric and gas chromatographic methods for assessing performance of textile materials against liquid pesticide penetration.

    PubMed

    Shaw, Anugrah; Abbi, Ruchika

    2004-01-01

    Penetration of liquid pesticides through textile materials is a criterion for determining the performance of protective clothing used by pesticide handlers. The pipette method is frequently used to apply liquid pesticides onto textile materials to measure penetration. Typically, analytical techniques such as Gas Chromatography (GC) are used to measure percentage penetration. These techniques are labor intensive and costly. A simpler gravimetric method was developed, and tests were conducted to compare the gravimetric and GC methods of analysis. Three types of pesticide formulations and 4 fabrics were used for the study. Diluted pesticide formulations were pipetted onto the test specimens and percentage penetration was measured using the 2 methods. For homogeneous formulation, the results of the two methods were fairly comparable. However, due to the filtering action of the textile materials, there were differences in the percentage penetration between the 2 methods for formulations that were not homogeneous.

  17. Homogeneity of lithium distribution in cylinder-type Li-ion batteries

    PubMed Central

    Senyshyn, A.; Mühlbauer, M. J.; Dolotko, O.; Hofmann, M.; Ehrenberg, H.

    2015-01-01

    Spatially-resolved neutron powder diffraction with a gauge volume of 2 × 2 × 20 mm3 has been applied as an in situ method to probe the lithium concentration in the graphite anode of different Li-ion cells of 18650-type in charged state. Structural studies performed in combination with electrochemical measurements and X-ray computed tomography under real cell operating conditions unambiguously revealed non-homogeneity of the lithium distribution in the graphite anode. Deviations from a homogeneous behaviour have been found in both radial and axial directions of 18650-type cells and were discussed in the frame of cell geometry and electrical connection of electrodes, which might play a crucial role in the homogeneity of the lithium distribution in the active materials within each electrode. PMID:26681110

  18. An efficient, reliable and inexpensive device for the rapid homogenization of multiple tissue samples by centrifugation.

    PubMed

    Ilyin, S E; Plata-Salamán, C R

    2000-02-15

    Homogenization of tissue samples is a common first step in the majority of current protocols for RNA, DNA, and protein isolation. This report describes a simple device for centrifugation-mediated homogenization of tissue samples. The method presented is applicable to RNA, DNA, and protein isolation, and we show examples where high quality total cell RNA, DNA, and protein were obtained from brain and other tissue samples. The advantages of the approach presented include: (1) a significant reduction in time investment relative to hand-driven or individual motorized-driven pestle homogenization; (2) easy construction of the device from inexpensive parts available in any laboratory; (3) high replicability in the processing; and (4) the capacity for the parallel processing of multiple tissue samples, thus allowing higher efficiency, reliability, and standardization.

  19. Finite-time consensus for controlled dynamical systems in network

    NASA Astrophysics Data System (ADS)

    Zoghlami, Naim; Mlayeh, Rhouma; Beji, Lotfi; Abichou, Azgal

    2018-04-01

    The key challenges in networked dynamical systems are component heterogeneity, nonlinearity, and the high dimension of the resulting state vector. In this paper, the emphasis is put on two classes of networked systems that include most controlled driftless systems as well as systems with drift. For each model structure, evolving in networks that form a homogeneous or heterogeneous multi-system, protocols integrating sufficient conditions are derived that lead to finite-time consensus. For the network topology, we make use of fixed directed and undirected graphs. To prove our approaches, finite-time stability theory and Lyapunov methods are used. As illustrative examples, homogeneous multi-unicycle kinematics and homogeneous/heterogeneous multi-second-order dynamics in networks are studied.
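
    For orientation only, the sketch below simulates a textbook finite-time consensus protocol for single-integrator agents on a fixed undirected graph, u_i = -sum_j a_ij sign(x_i - x_j)|x_i - x_j|^alpha with 0 < alpha < 1. It is not one of the protocols derived in the paper; the graph, gains and step size are arbitrary choices for illustration.

      # Finite-time consensus for single-integrator agents on a fixed undirected graph,
      # using the classic protocol u_i = -sum_j a_ij*sign(x_i-x_j)*|x_i-x_j|**alpha,
      # 0 < alpha < 1 (illustrative; not the paper's specific protocols).
      import numpy as np

      A = np.array([[0, 1, 0, 1],     # adjacency matrix of a 4-agent ring
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=float)

      def simulate(x0, alpha=0.5, dt=1e-3, steps=20_000):
          x = np.array(x0, dtype=float)
          for _ in range(steps):
              diff = x[:, None] - x[None, :]                    # x_i - x_j
              u = -(A * np.sign(diff) * np.abs(diff) ** alpha).sum(axis=1)
              x = x + dt * u                                    # explicit Euler step
              if np.ptp(x) < 1e-9:                              # consensus reached
                  break
          return x

      print(simulate([1.0, -2.0, 0.5, 3.0]))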

  20. Partitioning of the degradation space for OCR training

    NASA Astrophysics Data System (ADS)

    Barney Smith, Elisa H.; Andersen, Tim

    2006-01-01

    Generally speaking, optical character recognition algorithms tend to perform better when presented with homogeneous data. This paper studies a method designed to increase the homogeneity of training data, based on an understanding of the types of degradations that occur during the printing and scanning process and of how these degradations affect the homogeneity of the data. While it has been shown that dividing the degradation space by edge spread improves recognition accuracy over dividing it by threshold or point-spread-function width alone, the challenge lies in deciding how many partitions to use and at what values of edge spread the divisions should be made. Clustering of different character features, fonts, sizes, resolutions and noise levels shows that edge spread is indeed a strong indicator of the homogeneity of character data clusters.

  1. Indirect boundary element method to simulate elastic wave propagation in piecewise irregular and flat regions

    NASA Astrophysics Data System (ADS)

    Perton, Mathieu; Contreras-Zazueta, Marcial A.; Sánchez-Sesma, Francisco J.

    2016-06-01

    A new implementation of the indirect boundary element method allows simulating elastic wave propagation in complex configurations made of embedded regions that are either homogeneous with irregular boundaries or flat layered. In an older implementation, each layer of a flat layered region would have been treated as a separate homogeneous region without taking the flat boundary information into account. For both types of regions, the scattered field results from fictitious sources positioned along their boundaries. For the homogeneous regions, the fictitious sources emit as in a full space and the wave field is given by analytical Green's functions. For flat layered regions, the fictitious sources emit as in an unbounded flat layered region and the wave field is given by Green's functions obtained from the discrete wavenumber (DWN) method. The new implementation thus allows reducing the length of the discretized boundaries, but the DWN Green's functions require much more computation time than the full-space Green's functions. Several optimization steps are therefore implemented and discussed. Validations are presented for 2-D and 3-D problems; higher efficiency is achieved in 3-D.

  2. HOMPRA Europe - A gridded precipitation data set from European homogenized time series

    NASA Astrophysics Data System (ADS)

    Rustemeier, Elke; Kapala, Alice; Meyer-Christoffer, Anja; Finger, Peter; Schneider, Udo; Venema, Victor; Ziese, Markus; Simmer, Clemens; Becker, Andreas

    2017-04-01

    Reliable monitoring data are essential for robust analyses of climate variability and, in particular, long-term trends. In this regard, a gridded, homogenized data set of monthly precipitation totals - HOMPRA Europe (HOMogenized PRecipitation Analysis of European in-situ data) - is presented. The database consists of 5373 homogenized monthly time series, a carefully selected subset of the holdings of the Global Precipitation Climatology Centre (GPCC). The chosen series cover the period 1951-2005 and contain less than 10% missing values. Because of the large number of series, an automatic algorithm had to be developed for their homogenization. In principle, the algorithm is based on three steps: * Selection of overlapping station networks in the same precipitation regime, based on rank correlation and Ward's method of minimal variance. Since the underlying time series should be as homogeneous as possible, the station selection is carried out by deterministic first derivation in order to reduce artificial influences. * The natural variability and trends are temporarily removed by means of highly correlated neighboring time series in order to detect artificial break points in the annual totals. This ensures that only artificial changes can be detected. The detection is based on the algorithm of Caussinus and Mestre (2004). * In the last step, the detected breaks are corrected monthly by means of multiple linear regression (Mestre, 2003). Because the homogenization is automated, validation of the algorithm is essential. Therefore, the method was tested on artificial data sets. Additionally, the sensitivity of the method was tested by varying the neighborhood series. Where available in digitized form, the station history was also used to search for systematic errors in the jump detection. Finally, the actual HOMPRA Europe product is produced by interpolating the homogenized series onto a 1° grid using one of the interpolation schemes run operationally at GPCC (Becker et al., 2013; Schamm et al., 2014).

    References: Caussinus, H. and Mestre, O., 2004: Detection and correction of artificial shifts in climate series, Journal of the Royal Statistical Society, Series C (Applied Statistics), 53(3), 405-425. Mestre, O., 2003: Correcting climate series using ANOVA technique, Proceedings of the fourth seminar. Willmott, C., Rowe, C. and Philpot, W., 1985: Small-scale climate maps: a sensitivity analysis of some common assumptions associated with grid-point interpolation and contouring, The American Cartographer, 12, 5-16. Becker, A., Finger, P., Meyer-Christoffer, A., Rudolf, B., Schamm, K., Schneider, U. and Ziese, M., 2013: A description of the global land-surface precipitation data products of the Global Precipitation Climatology Centre with sample applications including centennial (trend) analysis from 1901-present, Earth System Science Data, 5, 71-99. Schamm, K., Ziese, M., Becker, A., Finger, P., Meyer-Christoffer, A., Schneider, U., Schröder, M. and Stender, P., 2014: Global gridded precipitation over land: a description of the new GPCC First Guess Daily product, Earth System Science Data, 6, 49-60.
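
    The sketch below is a deliberately simplified illustration of relative homogenization in the spirit of the steps above: it locates a single break point in the candidate-minus-reference difference series with a two-sample statistic and corrects the earlier segment by the mean offset. It is not the Caussinus and Mestre (2004) detection nor the ANOVA-based monthly correction of Mestre (2003); all data and function names are hypothetical.

      # Simplified relative-homogenization illustration: find one break in the
      # difference between a candidate annual series and a correlated reference,
      # then shift the earlier segment by the mean offset.
      import numpy as np

      rng = np.random.default_rng(2)

      def detect_single_break(candidate, reference):
          d = candidate - reference
          n = len(d)
          best_k, best_stat = None, -np.inf
          for k in range(5, n - 5):                  # keep a few years on each side
              left, right = d[:k], d[k:]
              pooled = np.sqrt(left.var(ddof=1) / len(left) + right.var(ddof=1) / len(right))
              stat = abs(left.mean() - right.mean()) / pooled
              if stat > best_stat:
                  best_k, best_stat = k, stat
          return best_k, best_stat

      def correct(candidate, reference, k):
          d = candidate - reference
          shift = d[k:].mean() - d[:k].mean()
          fixed = candidate.copy()
          fixed[:k] += shift                          # align the earlier segment
          return fixed

      # Toy data: a common climate signal plus an artificial -40 mm shift before year 30.
      truth = 800 + rng.normal(0, 30, size=55)
      reference = truth + rng.normal(0, 10, size=55)
      candidate = truth + rng.normal(0, 10, size=55)
      candidate[:30] -= 40
      k, stat = detect_single_break(candidate, reference)
      print(k, stat, np.round(correct(candidate, reference, k)[:5], 1))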

  3. Highly accelerated acquisition and homogeneous image reconstruction with rotating RF coil array at 7T-A phantom based study.

    PubMed

    Li, Mingyan; Zuo, Zhentao; Jin, Jin; Xue, Rong; Trakic, Adnan; Weber, Ewald; Liu, Feng; Crozier, Stuart

    2014-03-01

    Parallel imaging (PI) is widely used for imaging acceleration by means of coil spatial sensitivities associated with phased array coils (PACs). By employing a time-division multiplexing technique, a single-channel rotating radiofrequency coil (RRFC) provides an alternative method to reduce scan time. Strategically combining these two concepts could provide enhanced acceleration and efficiency. In this work, the imaging acceleration ability and homogeneous image reconstruction strategy of a 4-element rotating radiofrequency coil array (RRFCA) were numerically investigated and experimentally validated at 7T with a homogeneous phantom. Each coil of the RRFCA was capable of acquiring a large number of sensitivity profiles, leading to better acceleration performance, illustrated by improved geometry maps that have lower maximum values and more uniform distributions compared to 4- and 8-element stationary arrays. A reconstruction algorithm, rotating SENSitivity Encoding (rotating SENSE), was proposed to provide image reconstruction. Additionally, by optimally choosing the angular sampling positions and transmit profiles under the rotating scheme, phantom images could be faithfully reconstructed. The results indicate that the proposed technique is able to provide homogeneous reconstructions with overall higher and more uniform signal-to-noise ratio (SNR) distributions at high reduction factors. It is hoped that, by employing the high imaging acceleration and homogeneous image reconstruction ability of the RRFCA, the proposed method will facilitate human imaging for ultra-high-field MRI.

  4. Effect of Freezing Time on Macronutrients and Energy Content of Breastmilk

    PubMed Central

    Escuder-Vieco, Diana; García-Algar, Oscar; De la Cruz, Javier; Lora, David; Pallás-Alonso, Carmen

    2012-01-01

    Background: In neonatal units and human milk banks, freezing breastmilk at less than –20°C is the method of choice for preserving it. Scientific evidence on the loss of nutritional quality during freezing is scarce. Our main aim in this study was to determine the effect of freezing for up to 3 months on the content of fat, total nitrogen, lactose, and energy. Our secondary aim was to assess whether ultrasonic homogenization of samples enables a more suitable reading of breastmilk macronutrients with a human milk analyzer (HMA) (MIRIS®, Uppsala, Sweden). Methods: Refrigerated breastmilk samples were collected. Each sample was divided into six pairs of aliquots. One pair was analyzed on day 0, and the remaining pairs were frozen and analyzed, one each at 7, 15, 30, 60, and 90 days later. For each pair, one aliquot was homogenized by stirring and the other by applying ultrasound. Samples were analyzed with the HMA. Results: By 3 months after freezing, with both homogenization methods we observed a relevant and significant decline in the concentration of fat and in energy content. The changes in total nitrogen and lactose were not constant and were of lower magnitude. The absolute concentrations of all macronutrients and the caloric content were greater with ultrasonic homogenization. Conclusions: After 3 months of freezing at –20°C, an important decrease in fat and caloric content is observed. Correct homogenization is fundamental for correct nutritional analysis. PMID:22047109

  5. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation

    PubMed Central

    Tang, Liang; Cheng, Pengle

    2017-01-01

    Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic–plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis. PMID:28746390

  6. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation.

    PubMed

    Tang, Liang; Zhang, Jinjie; Cheng, Pengle

    2017-01-01

    Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic-plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis.

  7. [Investigation on the homogeneity and stability of quality controlling cosmetic samples containing arsenic].

    PubMed

    Dong, Bing; Song, Yu; Fan, Wenjia; Zhu, Ying

    2010-11-01

    The aim was to study the homogeneity and stability of arsenic in quality-control cosmetic samples. Arsenic was determined by an atomic fluorescence spectrophotometric method. The t-test and F-test were used to evaluate the significance of differences between within-bottle and between-bottle results for three batches. The RSDs of arsenic obtained at different times were compared with the relative expanded uncertainties to evaluate stability. The averages and variances of the within-bottle and between-bottle arsenic results were not significantly different, and the RSDs of arsenic were less than the relative expanded uncertainties. The quality-control cosmetic samples containing arsenic were therefore considered homogeneous and stable.
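
    A minimal sketch of this kind of homogeneity check, assuming hypothetical replicate arsenic results per bottle: a one-way ANOVA (F-test) compares between-bottle with within-bottle variability, and the overall RSD is compared against an assumed relative expanded uncertainty. The numbers and acceptance criterion are placeholders, not the authors' data or protocol.

      # Homogeneity check for bottled quality-control samples: one-way ANOVA across
      # bottles (between- vs within-bottle variance) plus an RSD-vs-uncertainty test.
      # Values are hypothetical arsenic results (mg/kg), three replicates per bottle.
      import numpy as np
      from scipy import stats

      bottles = np.array([
          [0.52, 0.54, 0.53],
          [0.55, 0.53, 0.54],
          [0.53, 0.52, 0.54],
          [0.54, 0.55, 0.53],
      ])

      f_stat, p_value = stats.f_oneway(*bottles)       # between- vs within-bottle
      rsd = bottles.std(ddof=1) / bottles.mean() * 100
      rel_expanded_uncertainty = 8.0                    # percent, hypothetical

      print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
      print(f"RSD = {rsd:.2f}% vs U_rel = {rel_expanded_uncertainty}%")
      print("homogeneous and stable" if p_value > 0.05 and rsd < rel_expanded_uncertainty
            else "re-examine batch")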

  8. Arc melting and homogenization of ZrC and ZrC + B alloys

    NASA Technical Reports Server (NTRS)

    Darolia, R.; Archbold, T. F.

    1973-01-01

    A description is given of the methods used to arc-melt and to homogenize near-stoichiometric ZrC and ZrC-boron alloys, giving attention to the oxygen contamination problem. The starting material for the carbide preparation was ZrC powder with an average particle size of 4.6 micron. Pellets weighing approximately 3 g each were prepared at room temperature from the powder by the use of an isostatic press operated at 50,000 psi. These pellets were individually melted in an arc furnace containing a static atmosphere of purified argon. A graphite resistance furnace was used for the homogenization process.

  9. Rapid and simple procedure for homogenizing leaf tissues suitable for mini-midi-scale DNA extraction in rice.

    PubMed

    Yi, Gihwan; Choi, Jun-Ho; Lee, Jong-Hee; Jeong, Unggi; Nam, Min-Hee; Yun, Doh-Won; Eun, Moo-Young

    2005-01-01

    We describe a rapid and simple procedure for homogenizing leaf samples suitable for mini/midi-scale DNA preparation in rice. The method uses tungsten carbide beads and a general-purpose vortexer to homogenize the leaf samples. In general, two samples at a time can be ground completely within 11.3 +/- 1.5 sec. Up to 20 samples can be ground at a time using a vortexer attachment. The DNA yields ranged from 2.2 to 7.6 microg from 25-150 mg of young fresh leaf tissue. The quality and quantity of the DNA were suitable for most PCR work and RFLP analysis.

  10. Rapid methods for extraction and concentration of poliovirus from oyster tissues.

    PubMed

    Richards, G P; Goldmintz, D; Green, D L; Babinchak, J A

    1982-12-01

    A procedure is discussed for the extraction of poliovirus from oyster meats by modification of several enterovirus extraction techniques. The modified method uses meat extract and Cat-Floc, a polycationic electrolyte, for virus extraction and concentration. Virus recovery from inoculated oyster homogenates is 93-120%. Adsorption of viruses to oyster proteins by acidification of homogenates does not affect virus recovery. Elution of viruses from oyster proteins appears more efficient at pH 9.5 than at pH 8.0. This technique is relatively simple, economical and requires only 2.5 h to complete the combined extraction and concentration procedure.

  11. Method of refining cracked oil by using metallic soaps. [desulfurization of cracked oils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masakichi, M.; Marunouchi, K.K.; Yoshimura, T.

    1937-04-13

    The method of refining cracked oil consists in dissolving oil-soluble heavy metallic soap of oleic acid in a volatile organic solvent which will disperse homogeneously in cracked oil; pouring the solution thus obtained slowly into cracked oil to effect dispersion naturally and homogeneously at room temperature in the cracked oil. This process serves to react the mercaptans in the cracked oil with the heavy metallic soap by a double decomposition reaction and to precipitate the mercaptans as insoluble metallic salts. The remaining liquid is distilled to separate it from the remaining solvent.

  12. MO-F-CAMPUS-I-02: Accuracy in Converting the Average Breast Dose Into the Mean Glandular Dose (MGD) Using the F-Factor in Cone Beam Breast CT- a Monte Carlo Study Using Homogeneous and Quasi-Homogeneous Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lai, C; Zhong, Y; Wang, T

    2015-06-15

    Purpose: To investigate the accuracy of estimating the mean glandular dose (MGD) for homogeneous breast phantoms by converting from the average breast dose using the F-factor in cone beam breast CT. Methods: EGSnrc-based Monte Carlo codes were used to estimate the MGDs. Hemi-ellipsoids 13 cm in diameter and 10 cm high were used to simulate pendant-geometry breasts. Two different types of hemi-ellipsoidal models were employed: voxels in quasi-homogeneous phantoms were designed as either adipose or glandular tissue, while voxels in homogeneous phantoms were designed as a mixture of adipose and glandular tissues. Breast compositions of 25% and 50% volume glandular fractions (VGFs), defined as the ratio of glandular tissue voxels to entire breast voxels in the quasi-homogeneous phantoms, were studied. These VGFs were converted into glandular fractions by weight and used to construct the corresponding homogeneous phantoms. 80 kVp x-rays with a mean energy of 47 keV were used in the simulation. A total of 10⁹ photons were used to image the phantoms and the energies deposited in the phantom voxels were tallied. Breast doses in homogeneous phantoms were averaged over all voxels and then used to calculate the MGDs using the F-factors evaluated at the mean energy of the x-rays. The MGDs for quasi-homogeneous phantoms were computed directly by averaging the doses over all glandular tissue voxels. The MGDs estimated for the two types of phantoms were normalized to the free-in-air dose at the iso-center and compared. Results: The normalized MGDs were 0.756 and 0.732 mGy/mGy for the 25% and 50% VGF homogeneous breasts and 0.761 and 0.733 mGy/mGy for the corresponding quasi-homogeneous breasts, respectively. The MGDs estimated for the two types of phantoms agreed within 1% in this study. Conclusion: MGDs for homogeneous breast models may be adequately estimated by converting from the average breast dose using the F-factor.

  13. Vitrification of ion exchange resins

    DOEpatents

    Cicero-Herman, Connie A.; Workman, Rhonda Jackson

    2001-01-01

    The present invention relates to vitrification of ion exchange resins that have become loaded with hazardous or radioactive wastes, in a way that produces a homogenous and durable waste form and reduces the disposal volume of the resin. The methods of the present invention involve directly adding borosilicate glass formers and an oxidizer to the ion exchange resin and heating the mixture at sufficient temperature to produce homogeneous glass.

  14. Processing of non-oxide ceramics from sol-gel methods

    DOEpatents

    Landingham, Richard; Reibold, Robert A.; Satcher, Joe

    2014-12-12

    A general procedure, applied to a variety of sol-gel precursors and solvent systems, for preparing and controlling homogeneous dispersions of very small particles within each other. Fine homogeneous dispersions processed at elevated temperatures under controlled atmospheres yield a ceramic powder that can be consolidated into a component by standard commercial means: sintering, hot pressing, hot isostatic pressing (HIP), hot/cold extrusion, spark plasma sintering (SPS), etc.

  15. Ensemble Learning Method for Hidden Markov Models

    DTIC Science & Technology

    2014-12-01

    Ensemble HMM landmine detector. Mine signatures vary according to the mine type, mine size, and burial depth. Similarly, clutter signatures vary with soil ... We propose using and optimizing various training approaches for the different K groups depending on their size and homogeneity. In particular, we investigate the maximum likelihood (ML), the minimum ...

  16. A new silica-infiltrated Y-TZP obtained by the sol-gel method.

    PubMed

    Campos, T M B; Ramos, N C; Machado, J P B; Bottino, M A; Souza, R O A; Melo, R M

    2016-05-01

    The aim of this study was to evaluate silica infiltration into dental zirconia (VITA In-Ceram 2000 YZ, Vita Zahnfabrik) and its effects on the zirconia's surface characteristics, structural homogeneity and bonding to a resin cement. Infiltration was performed by immersing the pre-sintered zirconia specimens in silica sols for five days (ZIn). Negative controls (pure zirconia specimens, ZCon-) and positive controls (specimens kept in water for 5 days, ZCon+) were also prepared. After sintering, the groups were evaluated by X-ray diffraction (XRD), grazing-angle X-ray diffraction (DRXR), scanning electron microscopy (SEM), contact angle measurements, optical profilometry, a biaxial flexural test and a shear bonding test. Weibull analysis was used to determine the Weibull modulus (m) and characteristic strength (σ0) of all groups. There were no major changes in strength for the infiltrated group, and its homogeneity (m) increased. A layer of ZrSiO4 formed on the surface. The bond strength to resin cement was improved after zirconia infiltration, acid conditioning and the use of an MDP primer. The sol-gel method is an efficient and simple way to increase the homogeneity of zirconia, and infiltration also improved bonding to resin cement. The performance of a zirconia infiltrated by silica gel thus improved in at least two ways: structural homogeneity and bonding to resin cement. The infiltration is simple to perform and can easily be managed in a prosthesis laboratory.

  17. Enhancement of Lipid Extraction from Marine Microalga, Scenedesmus Associated with High-Pressure Homogenization Process

    PubMed Central

    Cho, Seok-Cheol; Choi, Woon-Yong; Oh, Sung-Ho; Lee, Choon-Geun; Seo, Yong-Chang; Kim, Ji-Seon; Song, Chi-Ho; Kim, Ga-Vin; Lee, Shin-Young; Kang, Do-Hyung; Lee, Hyeon-Yong

    2012-01-01

    The marine microalga Scenedesmus sp., which is known to be suitable for biodiesel production because of its high lipid content, was subjected to the conventional Folch method of lipid extraction combined with a high-pressure homogenization pretreatment at 1200 psi and 35°C. The algal lipid yield was about 24.9% with this process, whereas only 19.8% lipid can be obtained by following a conventional lipid extraction procedure using the solvent chloroform : methanol (2 : 1, v/v). The present approach requires 30 min of process time and a moderate working temperature of 35°C, compared to the conventional extraction method, which usually requires >5 h and a temperature of 65°C. This combined extraction process was found to follow second-order reaction kinetics, meaning that most of the cellular lipids were extracted during the initial period of extraction, mostly within 30 min. In contrast, during the conventional extraction process the cellular lipids were slowly and continuously extracted for >5 h, following first-order kinetics. Confocal and scanning electron microscopy revealed the altered texture of the algal biomass pretreated with high-pressure homogenization. These results clearly demonstrate that the Folch method coupled with high-pressure homogenization pretreatment can easily disrupt the rigid cell walls of microalgae and release the intact lipids, with minimized extraction time and temperature, both of which are essential for maintaining good lipid quality for biodiesel production. PMID:22969270
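
    To make the kinetic comparison concrete, the sketch below fits standard first-order and second-order extraction models to hypothetical yield-versus-time data with scipy's curve_fit; the data points, initial guesses and model forms are illustrative assumptions, not the study's measurements.

      # Fit first- and second-order extraction-kinetics models to lipid-yield data.
      # The time/yield values are hypothetical, for illustration only.
      import numpy as np
      from scipy.optimize import curve_fit

      t = np.array([5, 10, 15, 20, 30, 45, 60], dtype=float)          # minutes
      y = np.array([14.0, 19.5, 22.0, 23.3, 24.3, 24.7, 24.9])        # % lipid yield

      def first_order(t, y_inf, k):
          return y_inf * (1.0 - np.exp(-k * t))

      def second_order(t, y_inf, k):
          return (y_inf ** 2 * k * t) / (1.0 + y_inf * k * t)

      for name, model, p0 in [("first order", first_order, (25.0, 0.1)),
                              ("second order", second_order, (25.0, 0.01))]:
          popt, _ = curve_fit(model, t, y, p0=p0)
          rss = np.sum((y - model(t, *popt)) ** 2)      # residual sum of squares
          print(f"{name}: y_inf={popt[0]:.1f}%, k={popt[1]:.3g}, RSS={rss:.2f}")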

  18. Effect of four different size reduction methods on the particle size, solubility enhancement and physical stability of nicergoline nanocrystals.

    PubMed

    Martena, Valentina; Shegokar, Ranjita; Di Martino, Piera; Müller, Rainer H

    2014-09-01

    Nicergoline, a poorly soluble active pharmaceutical ingredient, possesses vaso-active properties which cause peripheral and central vasodilatation. In this study, nanocrystals of nicergoline were prepared in an aqueous solution of polysorbate 80 (nanosuspension) using four different laboratory-scale size reduction techniques: high pressure homogenization (HPH), bead milling (BM) and two combination techniques (high pressure homogenization followed by bead milling, HPH + BM, and bead milling followed by high pressure homogenization, BM + HPH). The nanocrystals were investigated with regard to their mean particle size, zeta potential and particle dissolution. A short-term physical stability study on nanocrystals stored at three different temperatures (4, 20 and 40 °C) was performed to evaluate tendencies toward changes in particle size, aggregation and zeta potential. The size reduction technique and process parameters such as milling time, number of homogenization cycles and pressure greatly affected the size of the nanocrystals. Among the techniques used, the combination techniques showed superior and consistent particle size reduction compared to the other two methods, with HPH + BM and BM + HPH giving nanocrystals with mean particle sizes of 260 and 353 nm, respectively. Particle dissolution was increased for all nanocrystal samples, but particularly by HPH and the combination techniques. Independently of the production method, nicergoline nanocrystals showed a slight increase in particle size over time but remained below 500 nm at 20 °C and under refrigeration.

  19. Automatic benchmarking of homogenization packages applied to synthetic monthly series within the frame of the MULTITEST project

    NASA Astrophysics Data System (ADS)

    Guijarro, José A.; López, José A.; Aguilar, Enric; Domonkos, Peter; Venema, Victor; Sigró, Javier; Brunet, Manola

    2017-04-01

    After the successful inter-comparison of homogenization methods carried out in the COST Action ES0601 (HOME), many methods kept improving their algorithms, suggesting the need of performing new inter-comparison exercises. However, manual applications of the methodologies to a large number of testing networks cannot be afforded without involving the work of many researchers over an extended time. The alternative is to make the comparisons as automatic as possible, as in the MULTITEST project, which, funded by the Spanish Ministry of Economy and Competitiveness, tests homogenization methods by applying them to a large number of synthetic networks of monthly temperature and precipitation. One hundred networks of 10 series were sampled from different master networks containing 100 series of 720 values (60 years times 12 months). Three master temperature networks were built with different degree of cross-correlations between the series in order to simulate conditions of different station densities or climatic heterogeneity. Also three master synthetic networks were developed for precipitation, this time mimicking the characteristics of three different climates: Atlantic temperate, Mediterranean and monsoonal. Inhomogeneities were introduced in every network sampled from the master networks, and all publicly available homogenization methods that we could run in an automatic way were applied to them: ACMANT 3.0, Climatol 3.0, MASH 3.03, RHTestV4, USHCN v52d and HOMER 2.6. Most of them were tested with different settings, and their comparative results can be inspected in box-plot graphics of Root Mean Squared Errors and trend biases computed between the homogenized data and their original homogeneous series. In a first stage, inhomogeneities were applied to the synthetic homogeneous series with five different settings with increasing difficulty and realism: i) big shifts in half of the series; ii) the same with a strong seasonality; iii) short term platforms and local trends; iv) random number of shifts with random size and location in all series; and v) the same plus seasonality of random amplitude. The shifts were additive for temperature and multiplicative for precipitation. The second stage is dedicated to study the impact of the number of series in the networks, seasonalities other than sinusoidal, and the occurrence of simultaneous shifts in a high number of series. Finally, tests will be performed on a longer and more realistic benchmark, with varying number of missing data along time, similar to that used in the COST Action ES0601. These inter-comparisons will be valuable both to the users and to the developers of the tested packages, who can see how their algorithms behave under varied climate conditions.
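
    A hedged sketch of the two headline metrics mentioned above, computed for a hypothetical homogenized series against its known homogeneous truth: root mean squared error and the bias of the fitted linear trend. The synthetic series and units are placeholders for illustration.

      # Evaluation metrics of the kind used in the inter-comparison: root mean squared
      # error and linear-trend bias of a homogenized series against the known truth.
      import numpy as np

      def rmse(homogenized, truth):
          return np.sqrt(np.mean((homogenized - truth) ** 2))

      def trend_bias(homogenized, truth, years):
          # difference of the fitted linear trends, expressed per decade
          slope_h = np.polyfit(years, homogenized, 1)[0]
          slope_t = np.polyfit(years, truth, 1)[0]
          return (slope_h - slope_t) * 10.0

      # Hypothetical 60-year annual series
      rng = np.random.default_rng(3)
      years = np.arange(1951, 2011)
      truth = 10 + 0.01 * (years - years[0]) + rng.normal(0, 0.3, size=60)
      homogenized = truth + rng.normal(0, 0.1, size=60)   # residual homogenization error

      print(f"RMSE = {rmse(homogenized, truth):.3f}")
      print(f"trend bias = {trend_bias(homogenized, truth, years):+.3f} per decade")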

  20. Numerical homogenization of elastic and thermal material properties for metal matrix composites (MMC)

    NASA Astrophysics Data System (ADS)

    Schindler, Stefan; Mergheim, Julia; Zimmermann, Marco; Aurich, Jan C.; Steinmann, Paul

    2017-01-01

    A two-scale material modeling approach is adopted in order to determine macroscopic thermal and elastic constitutive laws and the respective parameters for metal matrix composite (MMC). Since the common homogenization framework violates the thermodynamical consistency for non-constant temperature fields, i.e., the dissipation is not conserved through the scale transition, the respective error is calculated numerically in order to prove the applicability of the homogenization method. The thermomechanical homogenization is applied to compute the macroscopic mass density, thermal expansion, elasticity, heat capacity and thermal conductivity for two specific MMCs, i.e., aluminum alloy Al2024 reinforced with 17 or 30 % silicon carbide particles. The temperature dependency of the material properties has been considered in the range from 0 to 500°C, the melting temperature of the alloy. The numerically determined material properties are validated with experimental data from the literature as far as possible.
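
    The paper obtains effective properties from computational homogenization of a representative volume; as a quick, much cruder plausibility check, the classical Voigt and Reuss mixture bounds can bracket some effective properties. The modulus values below are approximate literature-style numbers used purely for illustration, not results from the study.

      # Voigt (arithmetic) and Reuss (harmonic) mixture bounds for an Al2024/SiC MMC.
      # These classical bounds only bracket the effective stiffness; the paper itself
      # uses full computational homogenization of a representative volume element.
      def voigt(p_matrix, p_inclusion, vf):
          return (1.0 - vf) * p_matrix + vf * p_inclusion

      def reuss(p_matrix, p_inclusion, vf):
          return 1.0 / ((1.0 - vf) / p_matrix + vf / p_inclusion)

      E_al2024, E_sic = 73.0, 410.0          # Young's modulus in GPa (approximate)
      for vf in (0.17, 0.30):                # 17 % and 30 % SiC volume fraction
          print(f"vf={vf:.2f}: Reuss {reuss(E_al2024, E_sic, vf):.0f} GPa "
                f"<= E_eff <= Voigt {voigt(E_al2024, E_sic, vf):.0f} GPa")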

  1. Method of Mapping Anomalies in Homogenous Material

    NASA Technical Reports Server (NTRS)

    Taylor, Bryant D. (Inventor); Woodard, Stanley E. (Inventor)

    2016-01-01

    An electrical conductor and antenna are positioned in a fixed relationship to one another. Relative lateral movement is generated between the electrical conductor and a homogenous material while maintaining the electrical conductor at a fixed distance from the homogenous material. The antenna supplies a time-varying magnetic field that causes the electrical conductor to resonate and generate harmonic electric and magnetic field responses. Disruptions in at least one of the electric and magnetic field responses during this lateral movement are indicative of a lateral location of a subsurface anomaly. Next, relative out-of-plane movement is generated between the electrical conductor and the homogenous material in the vicinity of the anomaly's lateral location. Disruptions in at least one of the electric and magnetic field responses during this out-of-plane movement are indicative of a depth location of the subsurface anomaly. A recording of the disruptions provides a mapping of the anomaly.

  2. Assessing the use of food coloring as an appropriate visual guide for homogenously mixed capsule powders in extemporaneous compounding.

    PubMed

    Hoffmann, Brittany; Carlson, Christie; Rao, Deepa A

    2014-01-01

    The purpose of this work was to assess the use of food colors as a visual aid to determine homogeneous mixing in the extemporaneous preparation of capsules. Six different batches of progesterone slow-release 200-mg capsules were prepared by the Central Iowa Compounding Pharmacy, Des Moines, Iowa, using different mixing methods until they were visually determined to be homogeneous based on the yellow food coloring distribution in the preparation. UV-Vis spectrophotometry was used to extract and quantify the yellow food coloring content in each of these batches, which was compared to an in-house, small-batch geometric dilution preparation of progesterone slow-release 200-mg capsules. Of the 6 batches tested, only one, which followed the principles of additive dilution and an appropriate mixing time, was both visually and quantitatively homogeneous in the detection of yellow food coloring. The use of food coloring alone is not a valid quality-assurance tool for determining homogeneous mixing. Principles of geometric and/or additive dilution and appropriate mixing times, along with the food color, can serve as a quality-assurance tool.

  3. Improved Homogeneity of the Transmit Field by Simultaneous Transmission with Phased Array and Volume Coil

    PubMed Central

    Avdievich, Nikolai I.; Oh, Suk-Hoon; Hetherington, Hoby P.; Collins, Christopher M.

    2010-01-01

    Purpose: To improve the homogeneity of transmit volume coils at high magnetic fields (≥ 4 T). Due to RF field/tissue interactions at high fields, 4–8 T, the transmit profile from head-sized volume coils shows a distinctive pattern with a relatively strong RF magnetic field B1 in the center of the brain. Materials and Methods: In contrast to conventional volume coils at high field strengths, surface coil phased arrays can provide increased RF field strength peripherally. In theory, simultaneous transmission from these two devices could produce a more homogeneous transmission field. To minimize interactions between the phased array and the volume coil, counter-rotating current (CRC) surface coils consisting of two parallel rings carrying opposite currents were used for the phased array. Results: Numerical simulations and experimental data demonstrate that substantial improvements in transmit field homogeneity can be obtained. Conclusion: We have demonstrated the feasibility of using simultaneous transmission with human head-sized volume coils and CRC phased arrays to improve the homogeneity of the transmit RF B1 field for high-field MRI systems. PMID:20677280

  4. Effect of homogenous-heterogeneous reactions on MHD Prandtl fluid flow over a stretching sheet

    NASA Astrophysics Data System (ADS)

    Khan, Imad; Malik, M. Y.; Hussain, Arif; Salahuddin, T.

    An analysis is performed to explore the effects of homogeneous-heterogeneous reactions on the two-dimensional flow of a Prandtl fluid over a stretching sheet. In the present analysis, we use the developed model of homogeneous-heterogeneous reactions in boundary layer flow. The mathematical formulation of the presented flow phenomenon yields nonlinear partial differential equations. Using scaling transformations, the governing partial differential equations (the momentum equation and the homogeneous-heterogeneous reaction equations) are transformed into nonlinear ordinary differential equations (ODEs). The resulting nonlinear ODEs are then solved by a computational scheme known as the shooting method. The quantitative and qualitative behaviour of the physical quantities of interest (velocity, concentration and drag force coefficient) is examined under the prescribed physical constraints through figures and tables. It is observed that the velocity profile increases with the fluid parameters α and β, while the Hartmann number reduces it. The homogeneous and heterogeneous reaction parameters have opposite effects on the concentration profile. The concentration profile decreases for large values of the Schmidt number. The skin friction coefficient increases with the Hartmann number H and the fluid parameter α.
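
    The shooting method mentioned above can be illustrated on the classical Blasius boundary-layer equation rather than the paper's Prandtl-fluid system: guess the unknown initial curvature, integrate the ODE, and adjust the guess until the far-field condition is met. The equation, bracket and tolerances below are a stand-in example, not the authors' equations.

      # Shooting method demonstrated on the Blasius boundary-layer equation
      # f''' + 0.5*f*f'' = 0,  f(0)=f'(0)=0,  f'(inf)=1  (stand-in for the paper's
      # Prandtl-fluid system; the mechanics of the shooting scheme are the same).
      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import brentq

      ETA_MAX = 10.0                       # numerical "infinity" for the far field

      def rhs(eta, y):
          f, fp, fpp = y
          return [fp, fpp, -0.5 * f * fpp]

      def residual(fpp0):
          sol = solve_ivp(rhs, (0.0, ETA_MAX), [0.0, 0.0, fpp0], rtol=1e-8, atol=1e-10)
          return sol.y[1, -1] - 1.0        # f'(ETA_MAX) should equal 1

      fpp0 = brentq(residual, 0.1, 1.0)    # adjust the unknown initial curvature
      print(f"f''(0) = {fpp0:.4f}")        # ~0.332 for Blasius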

  5. Experimental investigation of homogeneous charge compression ignition combustion of biodiesel fuel with external mixture formation in a CI engine.

    PubMed

    Ganesh, D; Nagarajan, G; Ganesan, S

    2014-01-01

    In parallel to the interest in renewable fuels, there has also been increased interest in homogeneous charge compression ignition (HCCI) combustion. HCCI engines are being actively developed because they have the potential to be highly efficient and to produce low emissions. Even though HCCI has been researched extensively, a few challenges still exist. These include controlling the combustion at higher loads and the formation of a homogeneous mixture. To obtain better homogeneity, in the present investigation an external mixture formation method was adopted, in which a fuel vaporiser was used to achieve excellent HCCI combustion in a single-cylinder air-cooled direct injection diesel engine. In continuation of our previous work, in the current study vaporised jatropha methyl ester (JME) was mixed with air to form a homogeneous mixture and inducted into the cylinder during the intake stroke to analyze the combustion, emission and performance characteristics. To control the early ignition of the JME vapor-air mixture, a cooled (30 °C) exhaust gas recirculation (EGR) technique was adopted. The experimental results show an 81% reduction in NOx and a 72% reduction in smoke emissions.

  6. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2012-01-01

    The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Training the users on homogenization software was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that automatic algorithms can perform as well as manual ones.

  7. [Growth Factors and Interleukins in Amniotic Membrane Tissue Homogenate].

    PubMed

    Stachon, T; Bischoff, M; Seitz, B; Huber, M; Zawada, M; Langenbucher, A; Szentmáry, N

    2015-07-01

    Application of amniotic membrane homogenate eye drops may be a potential treatment alternative for therapy-resistant corneal epithelial defects. The purpose of this study was to determine the concentrations of epidermal growth factor (EGF), basic fibroblast growth factor (bFGF), hepatocyte growth factor (HGF), keratinocyte growth factor (KGF), interleukin-6 (IL-6) and interleukin-8 (IL-8) in amniotic membrane homogenates. Amniotic membranes from 8 placentas were prepared and stored at -80 °C using the standard methods of the LIONS Cornea Bank Saar-Lor-Lux, Trier/Westpfalz. After thawing, the amniotic membranes were cut into two pieces and homogenized in liquid nitrogen. One part of the homogenate was prepared in cell-lysis buffer, the other in PBS. The tissue homogenates were stored at -20 °C until enzyme-linked immunosorbent assay (ELISA) analysis of EGF, bFGF, HGF, KGF, IL-6 and IL-8 concentrations. Concentrations of KGF, IL-6 and IL-8 were below the detection limit with both preparation techniques. The EGF concentration in tissue homogenates treated with cell-lysis buffer (2412 pg/g tissue) was not significantly different from that in tissue homogenates treated with PBS (1586 pg/g tissue, p = 0.72). bFGF release was also not significantly different between cell-lysis buffer (3606 pg/g tissue) and PBS-treated tissue homogenates (4649 pg/g tissue, p = 0.35). HGF release was significantly lower with cell-lysis buffer (23,555 pg/g tissue) than with PBS-treated tissue (47,766 pg/g tissue, p = 0.007). Since they contain EGF, bFGF and HGF and lack IL-6 and IL-8, amniotic membrane homogenate eye drops may be a potential treatment alternative for therapy-resistant corneal epithelial defects.

  8. SU-E-T-76: Comparing Homogeneity Between Gafchromic Film EBT2 and EBT3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mizuno, H; Sumida, I; Ogawa, K

    2014-06-01

    Purpose: We found in a previous study that the homogeneity of EBT2 differed among lot numbers. Variation in local homogeneity of EBT3 among several lot numbers has not been reported. In this study, we investigated the film homogeneity of Gafchromic EBT3 films compared with EBT2 films. Methods: All sheets from five lots were cut into 12 pieces to investigate film homogeneity and were irradiated at 0.5, 2, and 3 Gy. To investigate intra- and inter-sheet uniformity, five sheets from five lots were exposed to 2 Gy: intra-sheet uniformity was evaluated by the coefficient of variation of homogeneity for all pieces of a single sheet, and inter-sheet uniformity was evaluated by the coefficient of variation of homogeneity among the same piece numbers in the five sheets. To investigate the difference in ADC value at various doses, a single sheet from each of five lots was irradiated at 0.5 Gy and 3 Gy in addition to 2 Gy. A scan resolution of 72 dots per inch (dpi) and a color depth of 48-bit RGB were used. Films were analyzed with in-house software; the average ADC value in a central ROI and profiles along the X and Y axes were measured. Results and Conclusion: Intra-sheet uniformity of non-irradiated EBT2 films ranged from 0.1% to 0.4%, whereas that of irradiated EBT2 films ranged from 0.2% to 1.5%. In contrast, intra-sheet uniformity of irradiated and non-irradiated EBT3 films ranged from 0.2% to 0.6%. Inter-sheet uniformity of all films was less than 0.5%. Interestingly, the homogeneity of EBT3 was similar for non-irradiated and irradiated films, whereas EBT2 showed a dose dependence of homogeneity in the ADC value evaluation. These results suggest that the homogeneity of EBT3 benefits from this characteristic.
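
    The uniformity statistics described above can be illustrated with a short sketch. The array name `adc`, its shape (5 sheets × 12 pieces) and the synthetic readings are assumptions for illustration, not the authors' data or software.

```python
# Hedged sketch: intra- and inter-sheet coefficients of variation, assuming
# `adc` holds one mean ADC value per film piece, shape (n_sheets, n_pieces).
import numpy as np

def coefficient_of_variation(values):
    return np.std(values, ddof=1) / np.mean(values)

rng = np.random.default_rng(1)
adc = 30000 + rng.normal(0, 150, size=(5, 12))  # synthetic readings

# Intra-sheet uniformity: CV across the 12 pieces of each single sheet.
intra = [coefficient_of_variation(adc[s, :]) for s in range(adc.shape[0])]

# Inter-sheet uniformity: CV across sheets for the same piece number.
inter = [coefficient_of_variation(adc[:, p]) for p in range(adc.shape[1])]

print([f"{cv:.2%}" for cv in intra])
print([f"{cv:.2%}" for cv in inter])
```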

  9. Multiscale intensity homogeneity transformation method and its application to computer-aided detection of pulmonary embolism in computed tomographic pulmonary angiography (CTPA)

    NASA Astrophysics Data System (ADS)

    Guo, Yanhui; Zhou, Chuan; Chan, Heang-Ping; Wei, Jun; Chughtai, Aamer; Sundaram, Baskaran; Hadjiiski, Lubomir M.; Patel, Smita; Kazerooni, Ella A.

    2013-04-01

    A 3D multiscale intensity homogeneity transformation (MIHT) method was developed to reduce false positives (FPs) in our previously developed CAD system for pulmonary embolism (PE) detection. In MIHT, the voxel intensity of a PE candidate region was transformed to an intensity homogeneity value (IHV) with respect to the local median intensity. The IHVs were calculated in multiscales (MIHVs) to measure the intensity homogeneity, taking into account vessels of different sizes and different degrees of occlusion. Seven new features including the entropy, gradient, and moments that characterized the intensity distributions of the candidate regions were derived from the MIHVs and combined with the previously designed features that described the shape and intensity of PE candidates for the training of a linear classifier to reduce the FPs. 59 CTPA PE cases were collected from our patient files (UM set) with IRB approval and 69 cases from the PIOPED II data set with access permission. 595 and 800 PEs were identified as reference standard by experienced thoracic radiologists in the UM and PIOPED set, respectively. FROC analysis was used for performance evaluation. Compared with our previous CAD system, at a test sensitivity of 80%, the new method reduced the FP rate from 18.9 to 14.1/scan for the PIOPED set when the classifier was trained with the UM set and from 22.6 to 16.0/scan vice versa. The improvement was statistically significant (p<0.05) by JAFROC analysis. This study demonstrated that the MIHT method is effective in reducing FPs and improving the performance of the CAD system.
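
    A minimal sketch of a multiscale local-median homogeneity transform in the spirit of MIHT is given below; the function name, the chosen scales and the exact homogeneity definition are assumptions, not the authors' implementation.

```python
# Hedged sketch: compare each voxel with the local median at several
# neighborhood scales and return one homogeneity map per scale.
import numpy as np
from scipy.ndimage import median_filter

def multiscale_homogeneity(volume, scales=(3, 5, 9)):
    """|I - local_median| / (local_median + eps), one map per scale."""
    eps = 1e-6
    maps = []
    for s in scales:
        local_med = median_filter(volume, size=s)
        maps.append(np.abs(volume - local_med) / (local_med + eps))
    return maps

# Example on a synthetic 3-D volume
vol = np.random.default_rng(2).normal(100, 10, size=(32, 32, 32))
for m in multiscale_homogeneity(vol):
    print(m.mean())
```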

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, J; Hu, W; Xing, Y

    Purpose: Different particle scanning beam delivery systems have different delivery accuracies. This study was performed to determine, for our particle treatment system, an appropriate ratio (n = FWHM/GS) of spot size (FWHM) to grid size (GS) that can provide homogeneous delivered dose distributions for both proton and heavy ion scanning beam radiotherapy. Methods: We analyzed the delivery errors of our beam delivery system using log files from the treatment of 28 patients. We used a homemade program to simulate square fields for different n values, with and without considering the delivery errors, and analyzed the homogeneity. All spots were located on a rectilinear grid with equal spacing in the x and y directions. After that, we selected 7 energy levels for both protons and carbon ions. For each energy level, we made 6 square field plans with different n values (1, 1.5, 2, 2.5, 3, 3.5). We then delivered those plans and used films to measure the homogeneity of each field. Results: For the program simulation without delivery errors, the homogeneity can be within ±3% when n ≥ 1.1. For both proton and carbon program simulations with delivery errors and for film measurements, the homogeneity can be within ±3% when n ≥ 2.5. Conclusion: For our facility with its system errors, n ≥ 2.5 is appropriate for maintaining homogeneity within ±3%.
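
    The error-free part of such a simulation can be sketched as follows: Gaussian spots are superposed on a regular grid and the flatness of the central plateau is evaluated for several n = FWHM/GS values. All numbers and names are illustrative assumptions; delivery errors are not modeled here.

```python
# Hedged sketch: 1-D superposition of Gaussian pencil-beam spots on a grid,
# with homogeneity reported as the +/- deviation of the central plateau.
import numpy as np

def field_homogeneity(n, fwhm=8.0, n_spots=21, samples=2000):
    sigma = fwhm / 2.355                       # FWHM -> Gaussian sigma
    grid_size = fwhm / n                       # spot spacing from n = FWHM/GS
    centers = (np.arange(n_spots) - n_spots // 2) * grid_size
    x = np.linspace(centers[2], centers[-3], samples)   # central plateau only
    dose = sum(np.exp(-0.5 * ((x - c) / sigma) ** 2) for c in centers)
    return (dose.max() - dose.min()) / (dose.max() + dose.min())

for n in (1.0, 1.5, 2.0, 2.5, 3.0):
    print(n, f"{field_homogeneity(n):.3%}")
```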

  11. New methods in the Newtonian potential theory. I - The representation of the potential energy of homogeneous gravitating bodies by converging series

    NASA Astrophysics Data System (ADS)

    Kondrat'ev, B. P.

    1993-06-01

    A method is developed for the representation of the potential energy of homogeneous gravitating, as well as electrically charged, bodies in the form of special series. These series contain terms consisting of products of the corresponding coefficients appearing in the expansion of the external and internal Newtonian potentials in Legendre polynomial series. Several versions of the representation of the potential energy through these series are possible. A formula is derived which expresses the potential energy not as a volume integral, as is the convention, but as an integral over the body surface. The method is tested for the particular cases of a sphere and an ellipsoid, and the convergence of the resulting series is demonstrated.

  12. Practical aerobic oxidations of alcohols and amines with homogeneous copper/TEMPO and related catalyst systems.

    PubMed

    Ryland, Bradford L; Stahl, Shannon S

    2014-08-18

    Oxidations of alcohols and amines are common reactions in the synthesis of organic molecules in the laboratory and industry. Aerobic oxidation methods have long been sought for these transformations, but few practical methods exist that offer advantages over traditional oxidation methods. Recently developed homogeneous Cu/TEMPO (TEMPO = 2,2,6,6-tetramethylpiperidinyl-N-oxyl) and related catalyst systems appear to fill this void. The reactions exhibit high levels of chemoselectivity and broad functional-group tolerance, and they often operate efficiently at room temperature with ambient air as the oxidant. These advances, together with their historical context and recent applications, are highlighted in this Minireview. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Structured-illumination photoacoustic Doppler flowmetry of axial flow in homogeneous scattering media

    NASA Astrophysics Data System (ADS)

    Zhang, Ruiying; Yao, Junjie; Maslov, Konstantin I.; Wang, Lihong V.

    2013-08-01

    We propose a method for photoacoustic flow measurement based on the Doppler effect from a flowing homogeneous medium. Excited by spatially modulated laser pulses, the flowing medium induces a Doppler frequency shift in the received photoacoustic signals. The frequency shift is proportional to the component of the flow speed projected onto the acoustic beam axis, and the sign of the shift reflects the flow direction. Unlike conventional flowmetry, this method does not rely on particle heterogeneity in the medium; thus, it can tolerate extremely high particle density. A red-ink phantom flowing in a tube immersed in water was used to validate the method in both the frequency and time domains. The phantom flow immersed in an intralipid solution was also measured.

  14. Inverse Monte Carlo method in a multilayered tissue model for diffuse reflectance spectroscopy

    NASA Astrophysics Data System (ADS)

    Fredriksson, Ingemar; Larsson, Marcus; Strömberg, Tomas

    2012-04-01

    Model based data analysis of diffuse reflectance spectroscopy data enables the estimation of optical and structural tissue parameters. The aim of this study was to present an inverse Monte Carlo method based on spectra from two source-detector distances (0.4 and 1.2 mm), using a multilayered tissue model. The tissue model variables include geometrical properties, light scattering properties, tissue chromophores such as melanin and hemoglobin, oxygen saturation and average vessel diameter. The method utilizes a small set of presimulated Monte Carlo data for combinations of different levels of epidermal thickness and tissue scattering. The path length distributions in the different layers are stored and the effect of the other parameters is added in the post-processing. The accuracy of the method was evaluated using Monte Carlo simulations of tissue-like models containing discrete blood vessels, evaluating blood tissue fraction and oxygenation. It was also compared to a homogeneous model. The multilayer model performed better than the homogeneous model and all tissue parameters significantly improved spectral fitting. Recorded in vivo spectra were fitted well at both distances, which we previously found was not possible with a homogeneous model. No absolute intensity calibration is needed and the algorithm is fast enough for real-time processing.

  15. Global stabilisation of a class of generalised cascaded systems by homogeneous method

    NASA Astrophysics Data System (ADS)

    Ding, Shihong; Zheng, Wei Xing

    2016-04-01

    This paper considers the problem of global stabilisation of a class of generalised cascaded systems. By using the extended adding a power integrator technique, a global controller is first constructed for the driving subsystem. Then based on the homogeneous properties and polynomial assumption, it is shown that the stabilisation of the driving subsystem implies the stabilisation of the overall cascaded system. Meanwhile, by properly choosing some control parameters, the global finite-time stability of the closed-loop cascaded system is also established. The proposed control method has several new features. First, the nonlinear cascaded systems considered in the paper are more general than the conventional ones, since the powers in the nominal part of the driving subsystem are not required to be restricted to ratios of positive odd numbers. Second, the proposed method has some flexible parameters which provide the possibility for designing continuously differentiable controllers for cascaded systems, while the existing designed controllers for such kinds of cascaded systems are only continuous. Third, the homogeneous and polynomial conditions adopted for the driven subsystem are easier to verify when compared with the matching conditions widely used previously. Furthermore, the efficiency of the proposed control method is validated by its application to finite-time tracking control of a non-holonomic wheeled mobile robot.

  16. An integral equation method for the homogenization of unidirectional fibre-reinforced media; antiplane elasticity and other potential problems.

    PubMed

    Joyce, Duncan; Parnell, William J; Assier, Raphaël C; Abrahams, I David

    2017-05-01

    In Parnell & Abrahams (2008 Proc. R. Soc. A 464 , 1461-1482. (doi:10.1098/rspa.2007.0254)), a homogenization scheme was developed that gave rise to explicit forms for the effective antiplane shear moduli of a periodic unidirectional fibre-reinforced medium where fibres have non-circular cross section. The explicit expressions are rational functions in the volume fraction. In that scheme, a (non-dilute) approximation was invoked to determine leading-order expressions. Agreement with existing methods was shown to be good except at very high volume fractions. Here, the theory is extended in order to determine higher-order terms in the expansion. Explicit expressions for effective properties can be derived for fibres with non-circular cross section, without recourse to numerical methods. Terms appearing in the expressions are identified as being associated with the lattice geometry of the periodic fibre distribution, fibre cross-sectional shape and host/fibre material properties. Results are derived in the context of antiplane elasticity but the analogy with the potential problem illustrates the broad applicability of the method to, e.g. thermal, electrostatic and magnetostatic problems. The efficacy of the scheme is illustrated by comparison with the well-established method of asymptotic homogenization where for fibres of general cross section, the associated cell problem must be solved by some computational scheme.

  17. An integral equation method for the homogenization of unidirectional fibre-reinforced media; antiplane elasticity and other potential problems

    PubMed Central

    Joyce, Duncan

    2017-01-01

    In Parnell & Abrahams (2008 Proc. R. Soc. A 464, 1461–1482. (doi:10.1098/rspa.2007.0254)), a homogenization scheme was developed that gave rise to explicit forms for the effective antiplane shear moduli of a periodic unidirectional fibre-reinforced medium where fibres have non-circular cross section. The explicit expressions are rational functions in the volume fraction. In that scheme, a (non-dilute) approximation was invoked to determine leading-order expressions. Agreement with existing methods was shown to be good except at very high volume fractions. Here, the theory is extended in order to determine higher-order terms in the expansion. Explicit expressions for effective properties can be derived for fibres with non-circular cross section, without recourse to numerical methods. Terms appearing in the expressions are identified as being associated with the lattice geometry of the periodic fibre distribution, fibre cross-sectional shape and host/fibre material properties. Results are derived in the context of antiplane elasticity but the analogy with the potential problem illustrates the broad applicability of the method to, e.g. thermal, electrostatic and magnetostatic problems. The efficacy of the scheme is illustrated by comparison with the well-established method of asymptotic homogenization where for fibres of general cross section, the associated cell problem must be solved by some computational scheme. PMID:28588412

  18. Gynogenesis in carp, Cyprinus Carpio L. and tench, Tinca Tinca L. induced by 60Co radiation in highly homogeneous radiating field

    NASA Astrophysics Data System (ADS)

    Pipota, J.; Linhart, O.

    The paper deals with a method of fertility inactivation of fish spermatozoa by gamma radiation. Spermatozoa motility remained unchanged after irradiation. The irradiated sperm was used to induce gynogenesis, both by retention of the second polar body and by mitotic gynogenesis, realized in carp for the first time. The homogeneity of the gamma-ray field was ±1%.

  19. Phase retrieval with the transport-of-intensity equation in an arbitrarily-shaped aperture by iterative discrete cosine transforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Lei; Zuo, Chao; Idir, Mourad

    A novel transport-of-intensity equation (TIE) based phase retrieval method is proposed in which an arbitrarily shaped aperture is placed in the optical wavefield. Within this arbitrarily shaped aperture, the TIE can be solved under non-uniform illumination and even non-homogeneous boundary conditions by iterative discrete cosine transforms with a phase compensation mechanism. Simulation with arbitrary phase, arbitrary aperture shape, and non-uniform intensity distribution verifies the effective compensation and high accuracy of the proposed method. An experiment is also carried out to check the feasibility of the proposed method in real measurement. Compared to existing methods, the proposed method is applicable to any type of phase distribution under non-uniform illumination and non-homogeneous boundary conditions within an arbitrarily shaped aperture, which makes the TIE technique with a hard aperture a more flexible phase retrieval tool in practical measurements.

  20. Phase retrieval with the transport-of-intensity equation in an arbitrarily-shaped aperture by iterative discrete cosine transforms

    DOE PAGES

    Huang, Lei; Zuo, Chao; Idir, Mourad; ...

    2015-04-21

    A novel transport-of-intensity equation (TIE) based phase retrieval method is proposed in which an arbitrarily shaped aperture is placed in the optical wavefield. Within this arbitrarily shaped aperture, the TIE can be solved under non-uniform illumination and even non-homogeneous boundary conditions by iterative discrete cosine transforms with a phase compensation mechanism. Simulation with arbitrary phase, arbitrary aperture shape, and non-uniform intensity distribution verifies the effective compensation and high accuracy of the proposed method. An experiment is also carried out to check the feasibility of the proposed method in real measurement. Compared to existing methods, the proposed method is applicable to any type of phase distribution under non-uniform illumination and non-homogeneous boundary conditions within an arbitrarily shaped aperture, which makes the TIE technique with a hard aperture a more flexible phase retrieval tool in practical measurements.

  1. Homotopy perturbation method with Laplace Transform (LT-HPM) for solving Lane-Emden type differential equations (LETDEs).

    PubMed

    Tripathi, Rajnee; Mishra, Hradyesh Kumar

    2016-01-01

    In this communication, we describe the Homotopy Perturbation Method with Laplace Transform (LT-HPM), which is used to solve Lane-Emden type differential equations. Lane-Emden type differential equations are very difficult to solve numerically. Here we apply this method to two linear homogeneous, two linear nonhomogeneous, and four nonlinear homogeneous Lane-Emden type differential equations and compare the results with the exact solutions. In the current study, for some examples the method gives power-series results closer to the exact solutions than other existing methods. The Laplace transform is used to accelerate the convergence of the power series, and the results, shown in tables and graphs, are in good agreement with other existing methods in the literature. The results show that LT-HPM is very effective and easy to implement.
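
    For context, the classical Lane-Emden equation of index n is y'' + (2/x) y' + y^n = 0 with y(0) = 1, y'(0) = 0. The short symbolic check below verifies the well-known exact solution sin(x)/x for n = 1, against which series methods such as LT-HPM are commonly compared. This is a generic illustration, not the authors' code.

```python
# Hedged sketch: symbolic check of the Lane-Emden equation for n = 1.
import sympy as sp

x = sp.symbols('x', positive=True)
y = sp.sin(x) / x                      # known exact solution for n = 1
residual = sp.diff(y, x, 2) + (2 / x) * sp.diff(y, x) + y
print(sp.simplify(residual))           # -> 0
```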

  2. Combined Enzymatic and Mechanical Cell Disruption and Lipid Extraction of Green Alga Neochloris oleoabundans

    PubMed Central

    Wang, Dongqin; Li, Yanqun; Hu, Xueqiong; Su, Weimin; Zhong, Min

    2015-01-01

    Microalgal biodiesel is one of the most promising renewable fuels. The wet technique for lipid extraction has advantages over the dry method, such as energy saving and a shorter procedure. Cell disruption is a key factor in wet oil extraction to facilitate the release of intracellular oil. Ultrasonication, high-pressure homogenization, enzymatic hydrolysis and the combination of enzymatic hydrolysis with high-pressure homogenization and ultrasonication were employed in this study to disrupt the cells of the microalga Neochloris oleoabundans. The degree of cell disruption was investigated. The cell morphology before and after disruption was assessed with scanning and transmission electron microscopy. The energy requirements and the operation cost for wet cell disruption were also estimated. The highest disruption degree, up to 95.41%, assessed by a counting method, was achieved by the combination of enzymatic hydrolysis and high-pressure homogenization. A lipid recovery of 92.6% was also obtained by the combined process. The combined process was found to be more efficient and economical compared with the individual processes. PMID:25853267

  3. Light emitting fabric technologies for photodynamic therapy.

    PubMed

    Mordon, Serge; Cochrane, Cédric; Tylcz, Jean Baptiste; Betrouni, Nacim; Mortier, Laurent; Koncar, Vladan

    2015-03-01

    Photodynamic therapy (PDT) is considered to be a promising method for treating various types of cancer. A homogeneous and reproducible illumination during clinical PDT plays a determinant role in preventing under- or over-treatment. The development of flexible light sources would considerably improve the homogeneity of light delivery. The integration of optical fiber into flexible structures could offer an interesting alternative. This paper aims to describe different methods proposed to develop Side Emitting Optical Fibers (SEOF), and how these SEOF can be integrated in a flexible structure to improve light illumination of the skin during PDT. Four main techniques can be described: (i) light blanket integrating side-glowing optical fibers, (ii) light emitting panel composed of SEOF obtained by micro-perforations of the cladding, (iii) embroidery-based light emitting fabric, and (iv) woven-based light emitting fabric. Woven-based light emitting fabrics give the best performances: higher fluence rate, best homogeneity of light delivery, good flexibility. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Application of finite elements heterogeneous multi-scale method to eddy currents non destructive testing of carbon composites material

    NASA Astrophysics Data System (ADS)

    Khebbab, Mohamed; Feliachi, Mouloud; El Hadi Latreche, Mohamed

    2018-03-01

    In the present paper, a simulation of eddy current non-destructive testing (EC NDT) on a unidirectional carbon fiber reinforced polymer is performed; for this, a magneto-dynamic formulation in terms of the magnetic vector potential is solved using the finite element heterogeneous multi-scale method (FE HMM). The goal of FE HMM is to compute the homogenized solution without calculating the homogenized tensor explicitly; the solution is based only on the physical characteristics known in the micro domain. This feature is well adapted to EC NDT for evaluating defects in carbon composite materials at the microscopic scale, where defect detection is performed by coil impedance measurement; the measured value is intimately linked to the material characteristics at the microscopic level. Based on this, our model can handle different defects such as cracks, inclusions, internal electrical conductivity changes, heterogeneities, etc. The simulation results were compared with the solution obtained with a homogenized material using a mixture law, and a good agreement was found.

  5. High-performance liquid chromatography purification of homogenous-length RNA produced by trans cleavage with a hammerhead ribozyme.

    PubMed Central

    Shields, T P; Mollova, E; Ste Marie, L; Hansen, M R; Pardi, A

    1999-01-01

    An improved method is presented for the preparation of milligram quantities of homogenous-length RNAs suitable for nuclear magnetic resonance or X-ray crystallographic structural studies. Heterogeneous-length RNA transcripts are processed with a hammerhead ribozyme to yield homogenous-length products that are then readily purified by anion exchange high-performance liquid chromatography. This procedure eliminates the need for denaturing polyacrylamide gel electrophoresis, which is the most laborious step in the standard procedure for large-scale production of RNA by in vitro transcription. The hammerhead processing of the heterogeneous-length RNA transcripts also substantially improves the overall yield and purity of the desired RNA product. PMID:10496226

  6. New procedure to design low radar cross section near perfect isotropic and homogeneous triangular carpet cloaks.

    PubMed

    Sharifi, Zohreh; Atlasbaf, Zahra

    2016-10-01

    A new design procedure for near perfect triangular carpet cloaks, fabricated based on only isotropic homogeneous materials, is proposed. This procedure enables us to fabricate a cloak with simple metamaterials or even without employing metamaterials. The proposed procedure together with an invasive weed optimization algorithm is used to design carpet cloaks based on quasi-isotropic metamaterial structures, Teflon and AN-73. According to the simulation results, the proposed cloaks have good invisibility properties against radar, especially monostatic radar. The procedure is a new method to derive isotropic and homogeneous parameters from transformation optics formulas so we do not need to use complicated structures to fabricate the carpet cloaks.

  7. Mt-Insar Landslide Monitoring with the Aid of Homogeneous Pixels Filter

    NASA Astrophysics Data System (ADS)

    Liu, X. J.; Zhao, C. Y.; Wang, B. H.; Zhu, W. F.

    2018-04-01

    SAR interferograms are often contaminated by random noise related to temporal decorrelation, geometric decorrelation and thermal noise, which obscures the fringes and greatly decreases the density of coherent targets and the accuracy of InSAR deformation results, especially for landslide monitoring in vegetated regions and in the rainy season. Two SAR interferogram filtering methods, the Goldstein filter and the homogeneous pixels filter, are compared for one specific landslide. The results show that the homogeneous pixels filter is better than the Goldstein filter for small-scale loess landslide monitoring, as it can increase the density of monitoring points. Moreover, the precision of the InSAR results reaches the millimeter level when compared with GPS time series measurements.

  8. TEST METHODS TO DETERMINE THE MERCURY EMISSIONS FROM SLUDGE INCINERATION PLANTS

    EPA Science Inventory

    Two test methods for mercury are described along with the laboratory and field studies done in developing and validating them. One method describes how to homogenize and analyze large quantities of sewage sludge. The other test method describes how to measure the mercury emission...

  9. An Active Patch Model for Real World Texture and Appearance Classification

    PubMed Central

    Mao, Junhua; Zhu, Jun; Yuille, Alan L.

    2014-01-01

    This paper addresses the task of natural texture and appearance classification. Our goal is to develop a simple and intuitive method that performs at state of the art on datasets ranging from homogeneous texture (e.g., material texture), to less homogeneous texture (e.g., the fur of animals), and to inhomogeneous texture (the appearance patterns of vehicles). Our method uses a bag-of-words model where the features are based on a dictionary of active patches. Active patches are raw intensity patches which can undergo spatial transformations (e.g., rotation and scaling) and adjust themselves to best match the image regions. The dictionary of active patches is required to be compact and representative, in the sense that we can use it to approximately reconstruct the images that we want to classify. We propose a probabilistic model to quantify the quality of image reconstruction and design a greedy learning algorithm to obtain the dictionary. We classify images using the occurrence frequency of the active patches. Feature extraction is fast (about 100 ms per image) using the GPU. The experimental results show that our method improves the state of the art on a challenging material texture benchmark dataset (KTH-TIPS2). To test our method on less homogeneous or inhomogeneous images, we construct two new datasets consisting of appearance image patches of animals and vehicles cropped from the PASCAL VOC dataset. Our method outperforms competing methods on these datasets. PMID:25531013

  10. Ab initio molecular dynamics in a finite homogeneous electric field.

    PubMed

    Umari, P; Pasquarello, Alfredo

    2002-10-07

    We treat homogeneous electric fields within density functional calculations with periodic boundary conditions. A nonlocal energy functional depending on the applied field is used within an ab initio molecular dynamics scheme. The reliability of the method is demonstrated in the case of bulk MgO for the Born effective charges, and the high- and low-frequency dielectric constants. We evaluate the static dielectric constant by performing a damped molecular dynamics in an electric field and avoiding the calculation of the dynamical matrix. Application of this method to vitreous silica shows good agreement with experiment and illustrates its potential for systems of large size.

  11. Indirect tissue electrophoresis: a new method for analyzing solid tissue protein.

    PubMed

    Smith, A C

    1988-01-01

    1. The eye lens core (nucleus) has been a valuable source of molecular biologic information. 2. In these studies, lens nuclei are usually homogenized so that any protein information related to anatomical subdivisions, or layers, of the nucleus is lost. 3. The present report is of a new method, indirect tissue electrophoresis (ITE), which, when applied to fish lens nuclei, permitted (a) automatic correlation of protein information with anatomic layer, (b) production of large, clear electrophoretic patterns even from small tissue samples and (c) detection of more proteins than in liquid extracts of homogenized tissues. 4. ITE seems potentially applicable to a variety of solid tissues.

  12. Cloaking of arbitrarily shaped objects with homogeneous coatings

    NASA Astrophysics Data System (ADS)

    Forestiere, Carlo; Dal Negro, Luca; Miano, Giovanni

    2014-05-01

    We present a theory for the cloaking of arbitrarily shaped objects and demonstrate electromagnetic scattering cancellation through designed homogeneous coatings. First, in the small-particle limit, we expand the dipole moment of a coated object in terms of its resonant modes. By zeroing the numerator of the resulting rational function, we accurately predict the permittivity values of the coating layer that abates the total scattered power. Then, we extend the applicability of the method beyond the small-particle limit, deriving the radiation corrections of the scattering-cancellation permittivity within a perturbation approach. Our method permits the design of invisibility cloaks for irregularly shaped devices such as complex sensors and detectors.

  13. Radiolabel ratio method for measuring pulmonary clearance of intratracheal bacterial challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaForce, F.M.; Boose, D.S.

    Calculation of bacterial clearance is a fundamental step in any study of in situ lung antibacterial defenses. A method is described whereby about 85% of a radiolabeled bacterial inoculum was consistently introduced into the bronchopulmonary tree of a mouse by the intratracheal route. Mice were then killed 1 and 4 hours later; their lungs were removed aseptically and homogenized, and viable bacteria and radiolabel counts were determined. Radiolabel counts fell slowly, and more than 80% of the original radiolabel was still present in homogenized lung samples from animals sacrificed 4 hours after challenge. Bacteria/isotope ratios for the bacterial inoculum and homogenized lung samples from animals sacrificed immediately after challenge were very similar. Bacterial clearance values were the same whether computed from bacterial counts alone or according to a radiolabel ratio method whereby the change in the bacteria/isotope ratio in ground lung aliquots was divided by a similar ratio from the bacteria used to inoculate the animals. Some contamination resulted from oral streptococci being swept into the bronchopulmonary tree during the aspiration process. This contamination was not a problem when penicillin was incorporated into the agar and penicillin-resistant strains were used for the bacterial challenges.

  14. Nano-ceramics and method thereof

    DOEpatents

    Satcher, Jr., Joe H.; Gash, Alex [Livermore, CA; Simpson, Randall [Livermore, CA; Landingham, Richard [Livermore, CA; Reibold, Robert A [Salida, CA

    2006-08-08

    Disclosed herein is a method to produce ceramic materials utilizing the sol-gel process. The methods enable the preparation of intimate homogeneous dispersions of materials while offering the ability to control the size of one component within another. The method also enables the preparation of materials that will densify at reduced temperature.

  15. The generalized scattering coefficient method for plane wave scattering in layered structures

    NASA Astrophysics Data System (ADS)

    Liu, Yu; Li, Chao; Wang, Huai-Yu; Zhou, Yun-Song

    2017-02-01

    The generalized scattering coefficient (GSC) method is pedagogically derived and employed to study the scattering of plane waves in homogeneous and inhomogeneous layered structures. The numerical stabilities and accuracies of this method and other commonly used numerical methods are discussed and compared. For homogeneous layered structures, concise scattering formulas with clear physical interpretations and strong numerical stability are obtained by introducing the GSCs. For inhomogeneous layered structures, three numerical methods are employed: the staircase approximation method, the power series expansion method, and the differential equation based on the GSCs. We investigate the accuracies and convergence behaviors of these methods by comparing their predictions to the exact results. The conclusions are as follows. The staircase approximation method has a slow convergence in spite of its simple and intuitive implementation, and a fine stratification within the inhomogeneous layer is required for obtaining accurate results. The expansion method results are sensitive to the expansion order, and the treatment becomes very complicated for relatively complex configurations, which restricts its applicability. By contrast, the GSC-based differential equation possesses a simple implementation while providing fast and accurate results.

  16. Computer program for thin-wire structures in a homogeneous conducting medium

    NASA Technical Reports Server (NTRS)

    Richmond, J. H.

    1974-01-01

    A computer program is presented for thin-wire antennas and scatterers in a homogeneous conducting medium. The analysis is performed in the real or complex frequency domain. The program handles insulated and bare wires with finite conductivity and lumped loads. The output data include the current distribution, impedance, radiation efficiency, gain, absorption cross section, scattering cross section, echo area and the polarization scattering matrix. The program uses sinusoidal bases and Galerkin's method.

  17. REPRESENTATIVE SAMPLING AND ANALYSIS OF HETEROGENEOUS SOILS

    EPA Science Inventory

    Standard sampling and analysis methods for hazardous substances in contaminated soils currently are available and routinely employed. Standard methods inherently assume a homogeneous soil matrix and contaminant distribution; therefore only small sample quantities typically are p...

  18. Benchmarking monthly homogenization algorithms

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Training was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.

  19. Voxel-wise meta-analyses of brain blood flow and local synchrony abnormalities in medication-free patients with major depressive disorder.

    PubMed

    Chen, Zi-Qi; Du, Ming-Ying; Zhao, You-Jin; Huang, Xiao-Qi; Li, Jing; Lui, Su; Hu, Jun-Mei; Sun, Huai-Qiang; Liu, Jia; Kemp, Graham J; Gong, Qi-Yong

    2015-11-01

    Published meta-analyses of resting-state regional cerebral blood flow (rCBF) studies of major depressive disorder (MDD) have included patients receiving antidepressants, which might affect brain activity and thus bias the results. To our knowledge, no meta-analysis has investigated regional homogeneity changes in medication-free patients with MDD. Moreover, an association between regional homogeneity and rCBF has been demonstrated in some brain regions in healthy controls. We sought to explore to what extent resting-state rCBF and regional homogeneity changes co-occur in the depressed brain without the potential confound of medication. Using the effect-size signed differential mapping method, we conducted 2 meta-analyses of rCBF and regional homogeneity studies of medication-free patients with MDD. Our systematic search identified 14 rCBF studies and 9 regional homogeneity studies. We identified conjoint decreases in resting-state rCBF and regional homogeneity in the insula and superior temporal gyrus in medication-free patients with MDD compared with controls. Other changes included altered resting-state rCBF in the precuneus and in the frontal-limbic-thalamic-striatal neural circuit as well as altered regional homogeneity in the uncus and parahippocampal gyrus. Meta-regression revealed that the percentage of female patients with MDD was negatively associated with resting-state rCBF in the right anterior cingulate cortex and that the age of patients with MDD was negatively associated with rCBF in the left insula and with regional homogeneity in the left uncus. The analysis techniques, patient characteristics and clinical variables of the included studies were heterogeneous. The conjoint alterations of rCBF and regional homogeneity in the insula and superior temporal gyrus may be core neuropathological changes in medication-free patients with MDD and serve as a specific region of interest for further studies on MDD.

  20. Testing the cosmic anisotropy with supernovae data: Hemisphere comparison and dipole fitting

    NASA Astrophysics Data System (ADS)

    Deng, Hua-Kai; Wei, Hao

    2018-06-01

    The cosmological principle is one of the cornerstones in modern cosmology. It assumes that the universe is homogeneous and isotropic on cosmic scales. Both the homogeneity and the isotropy of the universe should be tested carefully. In the present work, we are interested in probing the possible preferred direction in the distribution of type Ia supernovae (SNIa). To our best knowledge, two main methods have been used in almost all of the relevant works in the literature, namely the hemisphere comparison (HC) method and the dipole fitting (DF) method. However, the results from these two methods are not always approximately coincident with each other. In this work, we test the cosmic anisotropy by using these two methods with the joint light-curve analysis (JLA) and simulated SNIa data sets. In many cases, both methods work well, and their results are consistent with each other. However, in the cases with two (or even more) preferred directions, the DF method fails while the HC method still works well. This might shed new light on our understanding of these two methods.

  1. Analysis of messy data with heteroscedastic in mean models

    NASA Astrophysics Data System (ADS)

    Trianasari, Nurvita; Sumarni, Cucu

    2016-02-01

    In data analysis we are often faced with data that do not meet some of the standard assumptions; such data are commonly called messy data. This problem arises because the data contain outliers that bias the estimates or inflate their error. There are three approaches to analyzing messy data: standard analysis, data transformation, and non-standard analysis methods. Simulations were conducted to compare the performance of three procedures for testing means when the model variances are not homogeneous. Each scenario was simulated 500 times. The mean comparisons were then analyzed using three methods: the Welch test, mixed models, and the Welch-r test. Data generation was done with R version 3.1.2. Based on the simulation results, all three methods can be used in both the normal (homoscedastic) case and the non-homoscedastic case. The three methods work very well on balanced or unbalanced data when the assumption of homogeneity of variance is not violated. For balanced data, the three methods still showed excellent performance despite violation of the homogeneity-of-variance assumption, even when the degree of heterogeneity is high: the power of the test exceeded 90 percent, the best being the Welch method (98.4%) and the Welch-r method (97.8%). For unbalanced data, the Welch method performs very well in the case of moderate heterogeneity with positive pairing, with 98.2% power; the mixed models method performs very well in the case of high heterogeneity with negative pairing; and the Welch-r method works very well in both cases. However, if the level of heterogeneity of variance is very high, the power of all methods decreases, especially for the mixed models method. For balanced data, the methods that still work well enough (power above 50%) are the Welch-r method (62.6%) and the Welch method (58.6%). For unbalanced data, the Welch-r method works well enough in the cases of high heterogeneity with positive or negative pairing, with power of 68.8% and 51%, respectively; the Welch method performs well enough only in the case of high heterogeneity with positive pairing, with 64.8% power; and the mixed models method is good in the case of high heterogeneity with negative pairing, with 54.6% power. In general, when the variances are not homogeneous, the Welch method applied to ranked data (Welch-r) performs better than the other methods.
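
    A minimal sketch of the Welch and rank-based Welch (Welch-r) comparisons is shown below using scipy; the group sizes, means and variances are synthetic assumptions chosen only to mimic an unbalanced, heteroscedastic case, not the simulation design of the paper.

```python
# Hedged sketch: Welch test and Welch test on pooled ranks (Welch-r).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
group_a = rng.normal(10.0, 1.0, size=30)   # smaller group, smaller variance
group_b = rng.normal(10.8, 3.0, size=45)   # larger group, larger variance

# Welch test: two-sample t-test without the equal-variance assumption.
t_w, p_w = stats.ttest_ind(group_a, group_b, equal_var=False)

# Welch-r: apply the same Welch test to the pooled ranks of the data.
ranks = stats.rankdata(np.concatenate([group_a, group_b]))
ranks_a, ranks_b = ranks[:len(group_a)], ranks[len(group_a):]
t_r, p_r = stats.ttest_ind(ranks_a, ranks_b, equal_var=False)

print(f"Welch   p = {p_w:.4f}")
print(f"Welch-r p = {p_r:.4f}")
```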

  2. Homogenization of Classification Functions Measurement (HOCFUN): A Method for Measuring the Salience of Emotional Arousal in Thinking.

    PubMed

    Tonti, Marco; Salvatore, Sergio

    2015-01-01

    The problem of the measurement of emotion is a widely debated one. In this article we propose an instrument, the Homogenization of Classification Functions Measure (HOCFUN), designed for assessing the influence of emotional arousal on a rating task consisting of the evaluation of a sequence of images. The instrument defines an indicator (κ) that measures the degree of homogenization of the ratings given over 2 rating scales (pleasant-unpleasant and relevant-irrelevant). Such a degree of homogenization is interpreted as the effect of emotional arousal on thinking and therefore lends itself to be used as a marker of emotional arousal. A preliminary study of validation was implemented. The association of the κ indicator with 3 additional indicators was analyzed. Consistent with the hypotheses, the κ indicator proved to be associated, even if weakly and nonlinearly, with a marker of the homogenization of classification functions derived from a separate rating task and with 2 indirect indicators of emotional activation: the speed of performance on the HOCFUN task and an indicator of mood intensity. Taken as a whole, such results provide initial evidence supporting the HOCFUN construct validity.

  3. Permian paleoclimate data from fluid inclusions in halite

    USGS Publications Warehouse

    Benison, K.C.; Goldstein, R.H.

    1999-01-01

    This study has yielded surface water paleotemperatures from primary fluid inclusions in mid-Permian Nippewalla Group halite from western Kansas. A 'cooling nucleation' method is used to generate vapor bubbles in originally all-liquid primary inclusions. Then, surface water paleotemperatures are obtained by measuring temperatures of homogenization to liquid. Homogenization temperatures ranged from 21 °C to 50 °C and are consistent along individual fluid inclusion assemblages, indicating that the fluid inclusions have not been altered by thermal reequilibration. Homogenization temperatures show a range of up to 26 °C from base to top of individual cloudy chevron growth bands. Petrographic and fluid inclusion evidence indicate that no significant pressure correction is needed for the homogenization temperature data. We interpret these homogenization temperatures to represent shallow surface water paleotemperatures. The range in temperatures from base to top of single chevron bands may reflect daily temperature variations. These Permian surface water temperatures fall within the same range as some modern evaporative surface waters, suggesting that this Permian environment may have been relatively similar to its modern counterparts. Shallow surface water temperatures in evaporative settings correspond closely to local air temperatures. Therefore, the Permian surface water temperatures determined in this study may be considered proxies for local Permian air temperatures.

  4. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    NASA Astrophysics Data System (ADS)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality that are related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that unaccounted-for moisture in the aggregates adversely affects the homogeneity of the concrete mixture and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed, in which the task of producing the concrete mixture is handled by an automatic control system for the kneading-and-mixing machinery with operational automatic control of homogeneity. Theoretical underpinnings of the control of mixture homogeneity are presented, which relate homogeneity to a change in the frequency of vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system for regulating the water supply is determined as a function of the change in concrete mixture homogeneity during the continuous mixing of components. The following technical means for establishing automatic control were chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To identify the quality indicator of automatic control, the system provides a structural flowchart with transfer functions that determine the ACS operation in the transient dynamic mode.

  5. A Simple MO Treatment of Metal Clusters.

    ERIC Educational Resources Information Center

    Sahyun, M. R. V.

    1980-01-01

    Illustrates how a qualitative description of the geometry and electronic characteristics of homogeneous metal clusters can be obtained using semiempirical MO (molecular orbital theory) methods. Computer applications of MO methods to inorganic systems are also described. (CS)

  6. Homogenizing microwave illumination in thermoacoustic tomography by a linear-to-circular polarizer based on frequency selective surfaces

    NASA Astrophysics Data System (ADS)

    He, Yu; Shen, Yuecheng; Feng, Xiaohua; Liu, Changjun; Wang, Lihong V.

    2017-08-01

    A circularly polarized antenna, providing more homogeneous illumination compared to a linearly polarized antenna, is more suitable for microwave induced thermoacoustic tomography (TAT). The conventional realization of circular polarization is to use a helical antenna, but it suffers from low efficiency, low power capacity, and limited aperture in TAT systems. Here, we report an implementation of a circularly polarized illumination method in TAT by inserting a single-layer linear-to-circular polarizer based on frequency selective surfaces between a pyramidal horn antenna and an imaging object. The performance of the proposed method was validated by both simulations and experimental imaging of a breast tumor phantom. The results showed that circular polarization was achieved, and the resultant thermoacoustic signal-to-noise ratio was twice that of the helical antenna case. The proposed method is more desirable in a waveguide-based TAT system than the conventional method.

  7. An ultrahigh pressure homogenization technique for easily exfoliating few-layer phosphorene from bulk black phosphorus

    NASA Astrophysics Data System (ADS)

    Guan, Qing-Qing; Zhou, Hua-Jing; Ning, Ping; Lian, Pei-Chao; Wang, Bo; He, Liang; Chai, Xin-Sheng

    2018-05-01

    We have developed an easy and efficient method for exfoliating few-layer sheets of black phosphorus (BP) in N-methyl-2-pyrrolidone, using ultra-high pressure homogenization (UPH). The BP was first exfoliated into sheets that were a few atomic layers thick, using a homogenizer for only 30 min. Next, a double centrifugation procedure was used to separate the material into few-layer nanosheets that were examined by X-ray diffraction, atomic force microscopy (AFM), transmission electron microscopy (TEM), high-angle annular dark field scanning transmission electron microscopy (HAADF-STEM), and energy-dispersive X-ray (EDX) spectroscopy. The results show that the products are specimens of phosphorene that are only a few layers thick.

  8. Some variance reduction methods for numerical stochastic homogenization

    PubMed Central

    Blanc, X.; Le Bris, C.; Legoll, F.

    2016-01-01

    We give an overview of a series of recent studies devoted to variance reduction techniques for numerical stochastic homogenization. Numerical homogenization requires that a set of problems is solved at the microscale, the so-called corrector problems. In a random environment, these problems are stochastic and therefore need to be repeatedly solved, for several configurations of the medium considered. An empirical average over all configurations is then performed using the Monte Carlo approach, so as to approximate the effective coefficients necessary to determine the macroscopic behaviour. Variance severely affects the accuracy and the cost of such computations. Variance reduction approaches, borrowed from other contexts in the engineering sciences, can be useful. Some of these variance reduction techniques are presented, studied and tested here. PMID:27002065
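
    As a generic illustration of how a variance reduction device lowers the statistical error of a Monte Carlo average, the sketch below uses antithetic variates on a toy quantity of interest; it is not the corrector-problem setting of the paper, and all names and numbers are assumptions.

```python
# Hedged sketch: antithetic variates versus plain Monte Carlo for E[f(U)],
# U uniform on (0, 1). Antithetic pairing reduces variance for monotone f.
import numpy as np

rng = np.random.default_rng(4)

def quantity_of_interest(u):
    # stand-in for an expensive per-configuration computation
    return np.exp(u)

n = 5000
u = rng.random(n)

plain = quantity_of_interest(rng.random(2 * n))            # 2n independent draws
antithetic = 0.5 * (quantity_of_interest(u) + quantity_of_interest(1.0 - u))

print("plain MC      :", plain.mean(), plain.std(ddof=1) / np.sqrt(2 * n))
print("antithetic MC :", antithetic.mean(), antithetic.std(ddof=1) / np.sqrt(n))
```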

  9. Bayesian analysis of non-homogeneous Markov chains: application to mental health data.

    PubMed

    Sung, Minje; Soyer, Refik; Nhan, Nguyen

    2007-07-10

    In this paper we present a formal treatment of non-homogeneous Markov chains by introducing a hierarchical Bayesian framework. Our work is motivated by the analysis of correlated categorical data which arise in assessment of psychiatric treatment programs. In our development, we introduce a Markovian structure to describe the non-homogeneity of transition patterns. In doing so, we introduce a logistic regression set-up for Markov chains and incorporate covariates in our model. We present a Bayesian model using Markov chain Monte Carlo methods and develop inference procedures to address issues encountered in the analyses of data from psychiatric treatment programs. Our model and inference procedures are implemented to some real data from a psychiatric treatment study. Copyright 2006 John Wiley & Sons, Ltd.
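
    A toy sketch of a non-homogeneous two-state Markov chain whose transition probabilities follow a logistic function of time is given below; the states, coefficients and horizon are illustrative assumptions, not the hierarchical Bayesian model of the paper, but they convey how a logistic link lets transition patterns vary over time (and, with extra terms, with covariates).

```python
# Hedged sketch: simulate a two-state chain with time-varying transitions.
import numpy as np

def transition_prob(t, beta0=-1.0, beta1=0.05):
    """P(switch state at time t), logistic in t; covariates could be added."""
    return 1.0 / (1.0 + np.exp(-(beta0 + beta1 * t)))

rng = np.random.default_rng(5)
state, path = 0, [0]
for t in range(1, 25):
    p01 = transition_prob(t)                           # 0 -> 1
    p10 = transition_prob(t, beta0=-0.5, beta1=-0.02)  # 1 -> 0
    p_switch = p01 if state == 0 else p10
    state = 1 - state if rng.random() < p_switch else state
    path.append(state)
print(path)
```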

  10. Homogenization theory for designing graded viscoelastic sonic crystals

    NASA Astrophysics Data System (ADS)

    Qu, Zhao-Liang; Ren, Chun-Yu; Pei, Yong-Mao; Fang, Dai-Ning

    2015-02-01

    In this paper, we propose a homogenization theory for designing graded viscoelastic sonic crystals (VSCs) which consist of periodic arrays of elastic scatterers embedded in a viscoelastic host material. We extend an elastic homogenization theory to VSC by using the elastic-viscoelastic correspondence principle and propose an analytical effective loss factor of VSC. The results of VSC and the equivalent structure calculated by using the finite element method are in good agreement. According to the relation of the effective loss factor to the filling fraction, a graded VSC plate is easily and quickly designed. Then, the graded VSC may have potential applications in the vibration absorption and noise reduction fields. Project supported by the National Basic Research Program of China (Grant No. 2011CB610301).

  11. Elastic full waveform inversion based on the homogenization method: theoretical framework and 2-D numerical illustrations

    NASA Astrophysics Data System (ADS)

    Capdeville, Yann; Métivier, Ludovic

    2018-05-01

    Seismic imaging is an efficient tool to investigate the Earth interior. Many of the different imaging techniques currently used, including the so-called full waveform inversion (FWI), are based on limited frequency band data. Such data are not sensitive to the true earth model, but to a smooth version of it. This smooth version can be related to the true model by the homogenization technique. Homogenization for wave propagation in deterministic media with no scale separation, such as geological media, has been recently developed. With such an asymptotic theory, it is possible to compute an effective medium valid for a given frequency band such that effective waveforms and true waveforms are the same up to a controlled error. In this work we make the link between limited frequency band inversion, mainly FWI, and homogenization. We establish the relation between a true model and an FWI result model. This relation is important for a proper interpretation of FWI images. We numerically illustrate, in the 2-D case, that an FWI result is at best the homogenized version of the true model. Moreover, it appears that the homogenized FWI model is quite independent of the FWI parametrization, as long as it has enough degrees of freedom. In particular, inverting for the full elastic tensor is, in each of our tests, always a good choice. We show how the homogenization can help to understand FWI behaviour and help to improve its robustness and convergence by efficiently constraining the solution space of the inverse problem.

  12. Detailed description of oil shale organic and mineralogical heterogeneity via Fourier transform infrared microscopy

    USGS Publications Warehouse

    Washburn, Kathryn E.; Birdwell, Justin E.; Foster, Michael; Gutierrez, Fernando

    2015-01-01

    Mineralogical and geochemical information on reservoir and source rocks is necessary to assess and produce from petroleum systems. The standard methods in the petroleum industry for obtaining these properties are bulk measurements on homogenized, generally crushed, and pulverized rock samples and can take from hours to days to perform. New methods using Fourier transform infrared (FTIR) spectroscopy have been developed to more rapidly obtain information on mineralogy and geochemistry. However, these methods are also typically performed on bulk, homogenized samples. We present a new approach to rock sample characterization incorporating multivariate analysis and FTIR microscopy to provide non-destructive, spatially resolved mineralogy and geochemistry on whole rock samples. We are able to predict bulk mineralogy and organic carbon content within the same margin of error as standard characterization techniques, including X-ray diffraction (XRD) and total organic carbon (TOC) analysis. Validation of the method was performed using two oil shale samples from the Green River Formation in the Piceance Basin with differing sedimentary structures. One sample represents laminated Green River oil shales, and the other is representative of oil shale breccia. The FTIR microscopy results on the oil shales agree with XRD and LECO TOC data from the homogenized samples but also give additional detail regarding sample heterogeneity by providing information on the distribution of mineral phases and organic content. While measurements for this study were performed on oil shales, the method could also be applied to other geological samples, such as other mudrocks, complex carbonates, and soils.
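
    The multivariate step described here maps FTIR spectra to bulk properties such as TOC. As one plausible reading of that step, the sketch below uses partial least squares regression with scikit-learn on synthetic spectra; the array shapes, the choice of PLS, and the number of components are illustrative assumptions, not the authors' exact chemometric model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Hypothetical data: rows are FTIR spectra (absorbance at 500 wavenumbers),
# targets are TOC (wt. %) from a reference method such as LECO combustion analysis.
rng = np.random.default_rng(1)
spectra = rng.random((120, 500))
toc = spectra[:, 40] * 20 + rng.normal(0, 0.5, 120)  # synthetic spectrum-TOC relationship

X_train, X_test, y_train, y_test = train_test_split(spectra, toc, random_state=0)
pls = PLSRegression(n_components=5).fit(X_train, y_train)
print("R^2 on held-out spectra:", pls.score(X_test, y_test))
```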

  13. A fast and sensitive TLD method for measurement of energy and homogeneity of electron beams using transmitted radiation through lead.

    PubMed

    Pradhan, A S; Quast, U; Sharma, P K

    1994-09-01

    A simple and fast, but sensitive TLD method for the measurement of energy and homogeneity of therapeutically used electron beams has been developed and tested. This method is based on the fact that when small thicknesses of high-Z absorbers such as lead are interposed in the high-energy electron beams, the transmitted radiation increases with the energy of the electron beams. Consequently, the ratio of readouts of TLDs held on the two sides of a lead plate varied sharply (by a factor of 70) with a change in energy of the electron beam from 5 MeV to 18 MeV, offering a very sensitive method for the measurement of the energy of electron beams. By using the ratio of TL readouts of two types of TLD ribbon with widely different sensitivities, LiF TLD-700 ribbons on the upstream side and highly sensitive CaF2:Dy TLD-200 ribbons on the downstream side, an electron energy discrimination of better than +/- 0.1 MeV could be achieved. The homogeneity of the electron beam energy and the absorbed dose was measured by using a jig in which the TLDs were held in the desired array on both sides of a 4 mm thick lead plate. The method takes minimal beam time and makes it possible to carry out measurements for the audit of the quality of electron beams as well as for intercomparison of beams by mail.
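
    Because the readout ratio across the lead plate increases monotonically with beam energy, the measurement reduces to inverting a calibration curve. A minimal sketch of that inversion follows; the calibration values are entirely hypothetical and would have to be measured for a specific TLD/lead geometry.

```python
import numpy as np

# Hypothetical calibration: TL readout ratio (downstream/upstream TLD across the lead
# plate) measured at known electron beam energies (MeV). Illustrative numbers only.
cal_energy_mev = np.array([5.0, 8.0, 10.0, 12.0, 15.0, 18.0])
cal_ratio = np.array([0.05, 0.30, 0.80, 1.60, 2.90, 3.50])

def energy_from_ratio(ratio):
    """Interpolate the beam energy corresponding to a measured TL readout ratio."""
    return np.interp(ratio, cal_ratio, cal_energy_mev)

print(energy_from_ratio(1.2))  # -> energy estimate between 10 and 12 MeV
```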

  14. Optimization study on the magnetic field of superconducting Halbach Array magnet

    NASA Astrophysics Data System (ADS)

    Shen, Boyang; Geng, Jianzhao; Li, Chao; Zhang, Xiuchang; Fu, Lin; Zhang, Heng; Ma, Jun; Coombs, T. A.

    2017-07-01

    This paper presents the optimization of the strength and homogeneity of the magnetic field from a superconducting Halbach Array magnet. A conventional Halbach Array uses a special arrangement of permanent magnets which can generate a homogeneous magnetic field. A superconducting Halbach Array utilizes a High Temperature Superconductor (HTS) to construct an electromagnet that works below its critical temperature and performs equivalently to the permanent-magnet-based Halbach Array. The simulations of the superconducting Halbach Array were carried out using the H-formulation based on B-dependent critical current density and a bulk approximation, with the FEM platform COMSOL Multiphysics. The optimization focused on the coils' location, as well as the geometry and number of coils, on the premise of maintaining the total amount of superconductor. Results show that the Halbach Array configuration based superconducting magnet is able to generate a magnetic field with intensity over 1 Tesla and improved homogeneity using proper optimization methods. A mathematical relation between these optimization parameters and the intensity and homogeneity of the magnetic field was developed.
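
    The optimization target combines field strength with homogeneity. One common way to quantify homogeneity over a region of interest is the peak-to-peak deviation relative to the mean field, expressed in ppm; the sketch below computes that metric from sampled field values. The metric and the sampled data are assumptions for illustration, not the paper's COMSOL workflow.

```python
import numpy as np

def field_homogeneity_ppm(b_samples):
    """Peak-to-peak field deviation over a region of interest, in ppm of the mean field."""
    b = np.asarray(b_samples, dtype=float)
    return (b.max() - b.min()) / b.mean() * 1e6

# Hypothetical Bz samples (tesla) on a grid inside the magnet bore.
bz = 1.0 + 1e-4 * np.random.default_rng(0).standard_normal(1000)
print(f"mean field: {bz.mean():.4f} T, homogeneity: {field_homogeneity_ppm(bz):.0f} ppm")
```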

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lytkina, D. N., E-mail: darya-lytkina@yandex.ru; Shapovalova, Y. G., E-mail: elena.shapovalova@ro.ru; Rasskazova, L. A., E-mail: ly-2207@mail.ru

    The relevance of the work stems from the need for new materials used in medicine (orthopedics, surgery, dentistry, and others) as substitutes for natural bone tissue in cases of injuries, fractures, etc. The aim of the presented work is to develop a method of producing biocompatible materials based on polyesters of hydroxycarboxylic acids and calcium phosphate ceramic (hydroxyapatite, HA) with a homogeneous distribution of the inorganic component. Bioactive composites based on poly-L-lactide (PL) and hydroxyapatite with a homogeneous distribution were prepared. The results of scanning electron microscopy confirm the homogeneous distribution of the inorganic filler in the polymer matrix. A positive effect of ultrasound on the homogeneity of the composites was determined. The rate of hydrolysis of the composites was evaluated: the rate of hydrolysis of polylactide as an individual substance is 7 times lower than the rate of hydrolysis of the polylactide as part of the composite. It was found that the HA-containing composite materials do not cause a negative response in the cells of the immune system, while contributing to anti-inflammatory cytokines released by cells.

  16. The Influence of Landscape Heterogeneity on Ground Beetles (Coleoptera: Carabidae) in Fthiotida, Central Greece

    PubMed Central

    2014-01-01

    Abstract Pitfall traps were used to sample Carabidae in agricultural land of the Spercheios valley, Fthiotida, Central Greece. Four pairs of cultivated fields were sampled. One field of each pair was located in a heterogeneous area and the other in a more homogeneous area. Heterogeneous areas were composed of small fields. They had high percentages of non-cropped habitats and a high diversity of land use types. Homogeneous areas were composed of larger fields. They had lower percentages of non-cropped habitats and a lower diversity of land use types. One pair of fields had been planted with cotton, one with maize, one with olives and one with wheat. Altogether 28 carabid species were recorded. This paper describes the study areas, the sampling methods used and presents the data collected during the study. Neither heterogeneous nor homogeneous areas had consistently higher abundance levels, activity density levels, species richness levels or diversity levels. However, significant differences were seen in some of the comparisons between heterogeneous and homogeneous areas. PMID:24891833

  17. Homogenization of a Directed Dispersal Model for Animal Movement in a Heterogeneous Environment.

    PubMed

    Yurk, Brian P

    2016-10-01

    The dispersal patterns of animals moving through heterogeneous environments have important ecological and epidemiological consequences. In this work, we apply the method of homogenization to analyze an advection-diffusion (AD) model of directed movement in a one-dimensional environment in which the scale of the heterogeneity is small relative to the spatial scale of interest. We show that the large (slow) scale behavior is described by a constant-coefficient diffusion equation under certain assumptions about the fast-scale advection velocity, and we determine a formula for the slow-scale diffusion coefficient in terms of the fast-scale parameters. We extend the homogenization result to predict invasion speeds for an advection-diffusion-reaction (ADR) model with directed dispersal. For periodic environments, the homogenization approximation of the solution of the AD model compares favorably with numerical simulations. Invasion speed approximations for the ADR model also compare favorably with numerical simulations when the spatial period is sufficiently small.
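
    In the pure-diffusion special case of such models (no directed advection), the classical one-dimensional homogenization result for a rapidly varying, periodic diffusion coefficient is that the slow-scale coefficient is the harmonic mean of D over one period. The sketch below evaluates only that limiting-case formula numerically; the paper's advection-diffusion and ADR results are more involved and are not reproduced here.

```python
import numpy as np

def harmonic_mean_diffusion(d_of_x, period=1.0, n=10_000):
    """Homogenized coefficient for u_t = (D(x/eps) u_x)_x with periodic D (pure diffusion):
    D_hom = ( (1/L) * integral over one period of 1/D )**(-1)."""
    x = np.linspace(0.0, period, n, endpoint=False)
    return 1.0 / np.mean(1.0 / d_of_x(x))

# Example: diffusivity alternating between fast and slow habitat patches of equal width.
D = lambda x: np.where(np.sin(2 * np.pi * x) > 0, 2.0, 0.5)
print(harmonic_mean_diffusion(D))  # ~0.8 (harmonic mean), not 1.25 (arithmetic mean)
```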

  18. Homogenization analysis of invasion dynamics in heterogeneous landscapes with differential bias and motility.

    PubMed

    Yurk, Brian P

    2018-07-01

    Animal movement behaviors vary spatially in response to environmental heterogeneity. An important problem in spatial ecology is to determine how large-scale population growth and dispersal patterns emerge within highly variable landscapes. We apply the method of homogenization to study the large-scale behavior of a reaction-diffusion-advection model of population growth and dispersal. Our model includes small-scale variation in the directed and random components of movement and growth rates, as well as large-scale drift. Using the homogenized model we derive simple approximate formulas for persistence conditions and asymptotic invasion speeds, which are interpreted in terms of residence index. The homogenization results show good agreement with numerical solutions for environments with a high degree of fragmentation, both with and without periodicity at the fast scale. The simplicity of the formulas, and their connection to residence index make them appealing for studying the large-scale effects of a variety of small-scale movement behaviors.

  19. Substrate specificity and pH dependence of homogeneous wheat germ acid phosphatase.

    PubMed

    Van Etten, R L; Waymack, P P

    1991-08-01

    The broad substrate specificity of a homogeneous isoenzyme of wheat germ acid phosphatase (WGAP) was extensively investigated by chromatographic, electrophoretic, NMR, and kinetic procedures. WGAP exhibited no divalent metal ion requirement and was unaffected upon incubation with EDTA or o-phenanthroline. A comparison of two catalytically homogeneous isoenzymes revealed little difference in substrate specificity. The specificity of WGAP was established by determining the Michaelis constants for a wide variety of substrates. p-Nitrophenyl phosphate, pyrophosphate, tripolyphosphate, and ATP were preferred substrates while lesser activities were seen toward sugar phosphates, trimetaphosphate, phosphoproteins, and (much less) phosphodiesters. An extensive table of Km and Vmax values is given. The pathway for the hydrolysis of trimetaphosphate was examined by colorimetric and 31P NMR methods and it was found that linear tripolyphosphate is not a free intermediate in the enzymatic reaction. In contrast to literature reports, homogeneous wheat germ acid phosphatase exhibits no measurable carboxylesterase activity, nor does it hydrolyze phenyl phosphonothioate esters or phytic acid at significant rates.

  20. Visualizing excipient composition and homogeneity of Compound Liquorice Tablets by near-infrared chemical imaging

    NASA Astrophysics Data System (ADS)

    Wu, Zhisheng; Tao, Ou; Cheng, Wei; Yu, Lu; Shi, Xinyuan; Qiao, Yanjiang

    2012-02-01

    This study demonstrated that near-infrared chemical imaging (NIR-CI) is a promising technology for visualizing the spatial distribution and homogeneity of Compound Liquorice Tablets. The starch distribution (and, indirectly, the plant extract) could be spatially determined using the basic analysis of correlation between analytes (BACRA) method; the correlation coefficients between the starch spectrum and the spectrum of each sample were greater than 0.95. Building on the accurate determination of the starch distribution, a histogram-based method to assess homogeneity of distribution was proposed. The results demonstrated that the starch distribution in sample 3 was relatively heterogeneous according to four statistical parameters. Furthermore, agglomerate domains in each tablet were detected using score image layers from principal component analysis (PCA). Finally, a novel method named Standard Deviation of Macropixel Texture (SDMT) was introduced to detect agglomerates and heterogeneity from binary images. Each binary image was divided into macropixels of different side lengths, the number of zero values in each macropixel was counted, and the standard deviation of these counts was calculated. A curve was then fitted to the relationship between the standard deviation and the macropixel side length. The results revealed inter-tablet heterogeneity of both the starch and total-compound distributions; within tablets, the slope and intercept of the fitted curves indicated consistency of the starch distribution but inconsistency of the total-compound distribution.
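
    The SDMT statistic as described is straightforward to reproduce: tile the binary image with macropixels of a given side length, count zero-valued pixels in each tile, and take the standard deviation of those counts across tiles. The sketch below is a plain NumPy reading of that description on a synthetic image, not the authors' code.

```python
import numpy as np

def sdmt(binary_image, macropixel_size):
    """Standard Deviation of Macropixel Texture: std of zero-pixel counts per macropixel."""
    img = np.asarray(binary_image)
    h, w = img.shape
    m = macropixel_size
    counts = []
    for i in range(0, h - h % m, m):
        for j in range(0, w - w % m, m):
            tile = img[i:i + m, j:j + m]
            counts.append(np.count_nonzero(tile == 0))
    return float(np.std(counts))

# Hypothetical binary map of starch distribution in a tablet image.
rng = np.random.default_rng(2)
starch_map = (rng.random((128, 128)) > 0.4).astype(int)
print([round(sdmt(starch_map, m), 2) for m in (4, 8, 16, 32)])
```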

  1. Preparation of peanut butter suspension for determination of peanuts using enzyme-linked immunoassay kits.

    PubMed

    Trucksess, Mary W; Brewer, Vickery A; Williams, Kristina M; Westphal, Carmen D; Heeres, James T

    2004-01-01

    Peanuts are one of the 8 most common allergenic foods and a large proportion of peanut-allergic individuals have severe reactions, some to minimal exposure. Specific protein constituents in the peanuts are the cause of the allergic reactions in sensitized individuals who ingest the peanuts. To avoid accidental ingestion of peanut-contaminated food, methods of analysis for the determination of the allergenic proteins in foods are important tools. Such methods could help identify foods inadvertently contaminated with peanuts, thereby reducing the incidence of allergic reactions to peanuts. Commercial immunoassay kits are available but need study for method performance, which requires reference materials for within- and between-laboratory validations. In this study, National Institute of Standards and Technology Standard Reference Material 2387 peanut butter was used. A polytron homogenizer was used to prepare a homogeneous aqueous peanut butter suspension for the evaluation of method performance of some commercially available immunoassay kits such as Veratox for Peanut Allergen Test (Neogen Corp.), Ridascreen Peanut (R-Biopharm GmbH), and Bio-Kit Peanut Protein Assay Kit (Tepnel). Each gram of the aqueous peanut butter suspension contained 20 mg carboxymethylcellulose sodium salt, 643 microg peanut, 0.5 mg thimerosal, and 2.5 mg bovine serum albumin. The suspension was homogeneous, stable, reproducible, and applicable for adding to ice cream, cookies, breakfast cereals, and chocolate for recovery studies at spike levels ranging from 12 to 90 microg/g.

  2. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    PubMed

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

    Cell disruption is a key unit operation to make valuable, intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study we implemented a methodology, which we had originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization and finally investigated this unit operation in more detail by a multivariate approach. A combination of HPLC and automated data analysis describes a valuable, novel tool to monitor and evaluate cell disruption processes. Our methodology, which can be used both in upstream (USP) and downstream processing (DSP), describes a valuable tool to evaluate cell disruption processes as it can be implemented at-line, gives results within minutes after sampling and does not need manual intervention.

  3. Nuclear Forensics Applications of Principal Component Analysis on Micro X-ray Fluorescence Images

    DTIC Science & Technology

    analysis on quantified micro x-ray fluorescence intensity values. This method is then applied to address goals of nuclear forensics. The first ... researchers in the development and validation of nuclear forensics methods. A method for determining material homogeneity is developed and demonstrated
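
    This record is only a fragment, but the analysis it names (principal component analysis of quantified micro-XRF intensities as a homogeneity indicator) can be sketched roughly as below. Everything in the sketch (array shapes, use of scikit-learn, the score-spread criterion) is an assumption for illustration, not the report's procedure.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
# Hypothetical quantified micro-XRF map: 64x64 pixels, intensities for 6 elements.
intensities = rng.random((64 * 64, 6))

scores = PCA(n_components=2).fit_transform(intensities)
pc1_map = scores[:, 0].reshape(64, 64)
# A tight spread of leading-component scores across the map is one indicator that the
# sampled material is compositionally homogeneous.
print("PC1 score std across pixels:", pc1_map.std())
```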

  4. Control of the surface quality parameters of machine components during static pulsed treatment

    NASA Astrophysics Data System (ADS)

    Komkov, V. A.; Rabinskii, L. N.; Kokoreva, O. G.; Kuprikov, N. M.

    2016-12-01

    A technique is developed to determine the homogeneity of the structure in a surface layer subjected to strain hardening. Static pulsed treatment is found to be one of the most effective surface plastic deformation methods that can be used to control the uniformity of hardening a surface layer. This treatment makes it possible to create a hardened surface layer to a depth of 10 mm with a homogeneous or heterogeneous structure.

  5. [The effect of carbon tetrachloride poisoning on the activity of digestive proteases in rats and correction of the disorders with vegetable oils].

    PubMed

    Esaulenko, E E; Khil'chuk, M A; Bykov, I M

    2013-01-01

    The results of a study of the activity of digestive proteases (pepsin, trypsin, chymotrypsin) in homogenates of the stomach, pancreas and duodenum of experimental animals are presented. Rats were exposed to intoxication with carbon tetrachloride (subcutaneous administration of a 50% oil solution of CCl4 at a dose of 0.5 ml per 100 g body weight) for three days and then were given the analysed oils (black nut, walnut and flax oil) intragastrically by gavage at a dose of 0.2 ml per day for 23 days. Pepsin level in gastric mucosa homogenates and chymotrypsin activity in pancreatic homogenates were determined by the method of N.P. Pyatnitskiy, based on the ability of the enzymes to coagulate a dairy-acetate mixture, at 25 degrees C and 35 degrees C, respectively. Trypsin activity in pancreatic homogenates was determined colorimetrically by the method of Erlanger-Shaternikova. It was established that intoxication with CCl4 decreased the synthesis of proteolytic enzymes of the stomach (by 51%) and pancreas (by 70-78%). Administration of the analysed vegetable oils to the animals contributed to the normalization of proteolytic enzyme synthesis. It is concluded that the analysed vegetable oils, which contain large quantities of polyunsaturated fatty acids (omega-3 and omega-6), are promising for the correction of the detected biochemical abnormalities.

  6. Stability of cosmetic emulsion containing different amount of hemp oil.

    PubMed

    Kowalska, M; Ziomek, M; Żbikowska, A

    2015-08-01

    The aim of the study was to determine the optimal conditions, that is the content of hemp oil and time of homogenization to obtain stable dispersion systems. For this purpose, six emulsions were prepared, their stability was examined empirically and the most correctly formulated emulsion composition was determined using a computer simulation. Variable parameters (oil content and homogenization time) were indicated by the optimization software based on Kleeman's method. Physical properties of the synthesized emulsions were studied by numerous techniques involving particle size analysis, optical microscopy, Turbiscan test and viscosity of emulsions. The emulsion containing 50 g of oil and being homogenized for 6 min had the highest stability. Empirically determined parameters proved to be consistent with the results obtained using the computer software. The computer simulation showed that the most stable emulsion should contain from 30 to 50 g of oil and should be homogenized for 2.5-6 min. The computer software based on Kleeman's method proved to be useful for quick optimization of the composition and production parameters of stable emulsion systems. Moreover, obtaining an emulsion system with proper stability justifies further research extended with sensory analysis, which will allow the application of such systems (containing hemp oil, beneficial for skin) in the cosmetic industry. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  7. Supported Dendrimer-Encapsulated Metal Clusters: Toward Heterogenizing Homogeneous Catalysts

    DOE PAGES

    Ye, Rong; Zhukhovitskiy, Aleksandr V.; Deraedt, Christophe V.; ...

    2017-07-13

    Recyclable catalysts, especially those that display selective reactivity, are vital for the development of sustainable chemical processes. Among available catalyst platforms, heterogeneous catalysts are particularly well-disposed toward separation from the reaction mixture via filtration methods, which renders them readily recyclable. Furthermore, heterogeneous catalysts offer numerous handles—some without homogeneous analogues—for performance and selectivity optimization. These handles include nanoparticle size, pore profile of porous supports, surface ligands and interface with oxide supports, and flow rate through a solid catalyst bed. Despite these available handles, however, conventional heterogeneous catalysts are themselves often structurally heterogeneous compared to homogeneous catalysts, which complicates efforts to optimize and expand the scope of their reactivity and selectivity. Ongoing efforts in our laboratories aim to address the above challenge by heterogenizing homogeneous catalysts, which can be defined as the modification of homogeneous catalysts to render them in a separable (solid) phase from the starting materials and products. Specifically, we grow the small nanoclusters in dendrimers, a class of uniform polymers with the connectivity of fractal trees and generally radial symmetry. Thanks to their dense multivalency, shape persistence, and structural uniformity, dendrimers have proven to be versatile scaffolds for the synthesis and stabilization of small nanoclusters. Then these dendrimer-encapsulated metal clusters (DEMCs) are adsorbed onto mesoporous silica. Through this method, we have achieved selective transformations that had been challenging to accomplish in a heterogeneous setting, e.g., π-bond activation and aldol reactions. Extensive investigation into the catalytic systems under reaction conditions allowed us to correlate the structural features (e.g., oxidation states) of the catalysts and their activity. Moreover, we have demonstrated that supported DEMCs are also excellent catalysts for typical heterogeneous reactions, including hydrogenation and alkane isomerization. Critically, these investigations also confirmed that the supported DEMCs are heterogeneous and stable against leaching. Catalyst optimization is achieved through the modulation of various parameters. The clusters are oxidized (e.g., with PhICl2) or reduced (e.g., with H2) in situ. Changing the dendrimer properties (e.g., generation, terminal functional groups) is analogous to ligand modification in homogeneous catalysts, which affects both catalytic activity and selectivity. Similarly, pore size of the support is another factor in determining product distribution. In a flow reactor, the flow rate is adjusted to control the residence time of the starting material and intermediates, and thus the final product selectivity. Our approach to heterogeneous catalysis affords various advantages: (1) the catalyst system can tap into the reactivity typical to homogeneous catalysts, which conventional heterogeneous catalysts could not achieve; (2) unlike most homogeneous catalysts with comparable performance, the heterogenized homogeneous catalysts can be recycled; (3) improved activity or selectivity compared to conventional homogeneous catalysts is possible because of uniquely heterogeneous parameters for optimization. Here in this Account, we will briefly introduce metal clusters and describe the synthesis and characterizations of supported DEMCs.
We will present the catalysis studies of supported DEMCs in both the batch and flow modes. Lastly, we will summarize the current state of heterogenizing homogeneous catalysis and provide future directions for this area of research.

  8. Increasing Inferential Leverage in the Comparative Method: Placebo Tests in Small-"n" Research

    ERIC Educational Resources Information Center

    Glynn, Adam N.; Ichino, Nahomi

    2016-01-01

    We delineate the underlying homogeneity assumption, procedural variants, and implications of the comparative method and distinguish this from Mill's method of difference. We demonstrate that additional units can provide "placebo" tests for the comparative method even if the scope of inference is limited to the two units under comparison.…

  9. Comparison of manual and homogenizer methods for preparation of tick-derived stabilates of Theileria parva: equivalence testing using an in vitro titration model.

    PubMed

    Mbao, V; Speybroeck, N; Berkvens, D; Dolan, T; Dorny, P; Madder, M; Mulumba, M; Duchateau, L; Brandt, J; Marcotty, T

    2005-07-01

    Theileria parva sporozoite stabilates are used in the infection and treatment method of immunization, a widely accepted control option for East Coast fever in cattle. T. parva sporozoites are extracted from infected adult Rhipicephalus appendiculatus ticks either manually, using a pestle and a mortar, or by use of an electric homogenizer. A comparison of the two methods as a function of stabilate infectivity has never been documented. This study was designed to provide a quantitative comparison of stabilates produced by the two methods. The approach was to prepare batches of stabilate by both methods and then subject them to in vitro titration. Equivalence testing was then performed on the average effective doses (ED). The ratio of infective sporozoites yielded by the two methods was found to be 1.14 in favour of the manually ground stabilate with an upper limit of the 95% confidence interval equal to 1.3. We conclude that the choice of method rests more on costs, available infrastructure and standardization than on which method produces a richer sporozoite stabilate.
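
    The statistical core of this comparison is an equivalence test on the ratio of effective doses. A minimal sketch of one such check, comparing the confidence interval of the ED ratio against a pre-specified equivalence margin on a log scale, is given below; the margin, the standard error, and the normal approximation are assumptions for illustration and do not reproduce the authors' exact model.

```python
import numpy as np
from scipy import stats

def ratio_equivalent(ratio_estimate, se_log_ratio, margin=1.5, alpha=0.05):
    """Two one-sided test on the log scale: declare the two stabilate preparation methods
    equivalent if the (1 - 2*alpha) CI of the ED ratio lies within [1/margin, margin]."""
    z = stats.norm.ppf(1 - alpha)
    log_lo = np.log(ratio_estimate) - z * se_log_ratio
    log_hi = np.log(ratio_estimate) + z * se_log_ratio
    return np.exp(log_lo) > 1 / margin and np.exp(log_hi) < margin

# Illustrative numbers only (the paper reports a ratio of 1.14 with a 95% upper limit of 1.3).
print(ratio_equivalent(1.14, se_log_ratio=0.065))
```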

  10. Neuronal Correlates of Individual Differences in the Big Five Personality Traits: Evidences from Cortical Morphology and Functional Homogeneity.

    PubMed

    Li, Ting; Yan, Xu; Li, Yuan; Wang, Junjie; Li, Qiang; Li, Hong; Li, Junfeng

    2017-01-01

    There have been many neuroimaging studies of human personality traits, and they have already provided a glimpse into the neurobiology of complex traits. Most previous studies adopted voxel-based morphology (VBM) analysis to explore the brain-personality mechanism at two levels (vertex-based and region-based); the findings are mixed, with great inconsistencies, and the brain-personality relations are far from fully understood. Here, we used surface-based morphology (SBM) analysis, which provides better alignment of cortical landmarks, to examine the associations between cortical morphology and personality traits across 120 healthy individuals at both the vertex and regional levels. To further reveal local functional correlates of the morphology-personality relationships, we related surface-based functional homogeneity measures to the regions identified in the region-based SBM correlation. Vertex-wise analysis revealed that people with high agreeableness exhibited larger areas in the left superior temporal gyrus. Based on regional parcellation, we found that extroversion was negatively related to the volume of the left lateral occipito-temporal gyrus and that agreeableness was negatively associated with the sulcus depth of the left superior parietal lobule. Moreover, increased regional homogeneity in the left lateral occipito-temporal gyrus was related to extroversion scores, and increased regional homogeneity in the left superior parietal lobule was related to agreeableness scores. These findings provide supporting evidence of a link between personality and brain structure obtained with SBM, and further suggest that the local functional homogeneity associated with personality traits has neurobiological relevance that is likely based on anatomical substrates.

  11. A multi-scale homogenization model for fine-grained porous viscoplastic polycrystals: I - Finite-strain theory

    NASA Astrophysics Data System (ADS)

    Song, Dawei; Ponte Castañeda, P.

    2018-06-01

    We make use of the recently developed iterated second-order homogenization method to obtain finite-strain constitutive models for the macroscopic response of porous polycrystals consisting of large pores randomly distributed in a fine-grained polycrystalline matrix. The porous polycrystal is modeled as a three-scale composite, where the grains are described by single-crystal viscoplasticity and the pores are assumed to be large compared to the grain size. The method makes use of a linear comparison composite (LCC) with the same substructure as the actual nonlinear composite, but whose local properties are chosen optimally via a suitably designed variational statement. In turn, the effective properties of the resulting three-scale LCC are determined by means of a sequential homogenization procedure, utilizing the self-consistent estimates for the effective behavior of the polycrystalline matrix, and the Willis estimates for the effective behavior of the porous composite. The iterated homogenization procedure allows for a more accurate characterization of the properties of the matrix by means of a finer "discretization" of the properties of the LCC to obtain improved estimates, especially at low porosities, high nonlinearities and high triaxialities. In addition, consistent homogenization estimates for the average strain rate and spin fields in the pores and grains are used to develop evolution laws for the substructural variables, including the porosity, pore shape and orientation, as well as the "crystallographic" and "morphological" textures of the underlying matrix. In Part II of this work, which has appeared in Song and Ponte Castañeda (2018b), the model is used to generate estimates for both the instantaneous effective response and the evolution of the microstructure for porous FCC and HCP polycrystals under various loading conditions.

  12. Climatic warming in China during 1901–2015 based on an extended dataset of instrumental temperature records

    DOE PAGES

    Cao, Lijuan; Yan, Zhongwei; Zhao, Ping; ...

    2017-05-26

    Monthly mean instrumental surface air temperature (SAT) observations back to the nineteenth century in China are synthesized from different sources via specific quality-control, interpolation, and homogenization. Compared with the first homogenized long-term SAT dataset for China which contained 18 stations mainly located in the middle and eastern part of China, the present dataset includes homogenized monthly SAT series at 32 stations, with an extended coverage especially towards western China. Missing values are interpolated by using observations at nearby stations, including those from neighboring countries. Cross validation shows that the mean bias error (MBE) is generally small and falls between 0.45 °C and –0.35 °C. Multiple homogenization methods and available metadata are applied to assess the consistency of the time series and to adjust inhomogeneity biases. The homogenized annual mean SAT series shows a range of trends between 1.1 °C and 4.0 °C/century in northeastern China, between 0.4 °C and 1.9 °C/century in southeastern China, and between 1.4 °C and 3.7 °C/century in western China to the west of 105°E (from the initial years of the stations to 2015). The unadjusted data include unusually warm records during the 1940s and hence tend to underestimate the warming trends at a number of stations. As a result, the mean SAT series for China based on the climate anomaly method shows a warming trend of 1.56 °C/century during 1901–2015, larger than those based on other currently available datasets.
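
    The per-century warming rates quoted here are, in essence, linear trends fitted to homogenized annual-mean series. A minimal sketch of such a trend fit on synthetic data follows; it uses ordinary least squares and is not the climate-anomaly-method computation used to build the dataset.

```python
import numpy as np

def trend_per_century(years, annual_mean_temp):
    """Ordinary least-squares linear trend, expressed in degrees C per 100 years."""
    slope, _intercept = np.polyfit(years, annual_mean_temp, deg=1)
    return slope * 100.0

# Synthetic anomaly series, 1901-2015, with an imposed trend of ~1.5 degC/century plus noise.
years = np.arange(1901, 2016)
rng = np.random.default_rng(3)
anomalies = 0.015 * (years - years[0]) + rng.normal(0, 0.3, years.size)
print(f"{trend_per_century(years, anomalies):.2f} degC per century")
```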

  13. Climatic warming in China during 1901–2015 based on an extended dataset of instrumental temperature records

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, Lijuan; Yan, Zhongwei; Zhao, Ping

    Monthly mean instrumental surface air temperature (SAT) observations back to the nineteenth century in China are synthesized from different sources via specific quality-control, interpolation, and homogenization. Compared with the first homogenized long-term SAT dataset for China which contained 18 stations mainly located in the middle and eastern part of China, the present dataset includes homogenized monthly SAT series at 32 stations, with an extended coverage especially towards western China. Missing values are interpolated by using observations at nearby stations, including those from neighboring countries. Cross validation shows that the mean bias error (MBE) is generally small and falls between 0.45 °C and –0.35 °C. Multiple homogenization methods and available metadata are applied to assess the consistency of the time series and to adjust inhomogeneity biases. The homogenized annual mean SAT series shows a range of trends between 1.1 °C and 4.0 °C/century in northeastern China, between 0.4 °C and 1.9 °C/century in southeastern China, and between 1.4 °C and 3.7 °C/century in western China to the west of 105°E (from the initial years of the stations to 2015). The unadjusted data include unusually warm records during the 1940s and hence tend to underestimate the warming trends at a number of stations. As a result, the mean SAT series for China based on the climate anomaly method shows a warming trend of 1.56 °C/century during 1901–2015, larger than those based on other currently available datasets.

  14. Colloidal synthesis of silicon nanoparticles in molten salts.

    PubMed

    Shavel, A; Guerrini, L; Alvarez-Puebla, R A

    2017-06-22

    Silicon nanoparticles are unique materials with applications in a variety of fields, from electronics to catalysis and biomedical uses. Despite technological advancements in nanofabrication, the development of a simple and inexpensive route for the synthesis of homogeneous silicon nanoparticles remains highly challenging. Herein, we describe a new, simple and inexpensive colloidal synthetic method for the preparation, under normal pressure and mild temperature conditions, of relatively homogeneous spherical silicon nanoparticles of either ca. 4 or 6 nm diameter. The key features of this method are the selection of a eutectic salt mixture as a solvent, the identification of appropriate silicon alkoxide precursors, and the unconventional use of alkali earth metals as shape-controlling agents.

  15. Novel Bioreactor Platform for Scalable Cardiomyogenic Differentiation from Pluripotent Stem Cell-Derived Embryoid Bodies.

    PubMed

    Rungarunlert, Sasitorn; Ferreira, Joao N; Dinnyes, Andras

    2016-01-01

    Generation of cardiomyocytes from pluripotent stem cells (PSCs) is a common and valuable approach to produce large amounts of cells for various applications, including assays and models for drug development, cell-based therapies, and tissue engineering. All these applications would benefit from a reliable bioreactor-based methodology to consistently generate homogeneous PSC-derived embryoid bodies (EBs) at a large scale, which can further undergo cardiomyogenic differentiation. The goal of this chapter is to describe a scalable method to consistently generate large amounts of homogeneous and synchronized EBs from PSCs. This method utilizes a slow-turning lateral vessel bioreactor to direct EB formation and subsequent cardiomyogenic lineage differentiation.

  16. Numerical experiments in homogeneous turbulence

    NASA Technical Reports Server (NTRS)

    Rogallo, R. S.

    1981-01-01

    The direct simulation methods developed by Orszag and Patterson (1972) for isotropic turbulence were extended to homogeneous turbulence in an incompressible fluid subjected to uniform deformation or rotation. The results of simulations for irrotational strain (plane and axisymmetric), shear, rotation, and relaxation toward isotropy following axisymmetric strain are compared with linear theory and experimental data. Emphasis is placed on the shear flow because of its importance and because of the availability of accurate and detailed experimental data. The computed results are used to assess the accuracy of two popular models used in the closure of the Reynolds-stress equations. Data from a variety of the computed fields and the details of the numerical methods used in the simulation are also presented.

  17. [Mechanical Shimming Method and Implementation for Permanent Magnet of MRI System].

    PubMed

    Xue, Tingqiang; Chen, Jinjun

    2015-03-01

    A mechanical shimming method and device for the permanent magnet of an MRI system have been developed to meet its stringent homogeneity requirement without time-consuming passive shimming on site; installation and adjustment efficiency has thereby been increased.

  18. A Comparison of Performance versus Presentation Based Methods of Instructing Pre-service Teachers in Media Competencies.

    ERIC Educational Resources Information Center

    Mattox, Daniel V., Jr.

    Research compared conventional and experimental methods of instruction in a teacher education media course. The conventional method relied upon factual presentations to heterogeneous groups, while the experimental utilized homogeneous clusters of students and stressed individualized instruction. A pretest-posttest, experimental-control group…

  19. Non-homogeneous updates for the iterative coordinate descent algorithm

    NASA Astrophysics Data System (ADS)

    Yu, Zhou; Thibault, Jean-Baptiste; Bouman, Charles A.; Sauer, Ken D.; Hsieh, Jiang

    2007-02-01

    Statistical reconstruction methods show great promise for improving resolution and reducing noise and artifacts in helical X-ray CT. In fact, statistical reconstruction seems to be particularly valuable in maintaining reconstructed image quality when the dosage is low and the noise is therefore high. However, high computational cost and long reconstruction times remain a barrier to the use of statistical reconstruction in practical applications. Among the various iterative methods that have been studied for statistical reconstruction, iterative coordinate descent (ICD) has been found to have relatively low overall computational requirements due to its fast convergence. This paper presents a novel method for further speeding the convergence of the ICD algorithm, and therefore reducing the overall reconstruction time for statistical reconstruction. The method, which we call non-homogeneous iterative coordinate descent (NH-ICD), uses spatially non-homogeneous updates to speed convergence by focusing computation where it is most needed. Experimental results with real data indicate that the method speeds reconstruction by roughly a factor of two for typical 3D multi-slice geometries.
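
    The scheduling idea behind non-homogeneous updates can be illustrated on a toy quadratic problem: instead of sweeping all coordinates uniformly, coordinates are visited with probability proportional to how much they changed recently, concentrating work where the solution is still moving. The sketch below captures only that scheduling idea on a generic quadratic objective; it is not the CT reconstruction objective or the actual NH-ICD update formulas.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
A = rng.standard_normal((n, n)); A = A.T @ A + n * np.eye(n)   # SPD system matrix
b = rng.standard_normal(n)
x = np.zeros(n)
recent_change = np.ones(n)          # crude "update map": where is the estimate still moving?

for sweep in range(50):
    probs = recent_change / recent_change.sum()
    for j in rng.choice(n, size=n, p=probs):      # non-homogeneous coordinate visits
        old = x[j]
        # exact coordinate minimizer of 0.5*x^T A x - b^T x along coordinate j
        x[j] = (b[j] - A[j] @ x + A[j, j] * x[j]) / A[j, j]
        recent_change[j] = 0.9 * recent_change[j] + abs(x[j] - old) + 1e-12

print("residual norm:", np.linalg.norm(A @ x - b))
```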

  20. Physical-geometric optics method for large size faceted particles.

    PubMed

    Sun, Bingqiang; Yang, Ping; Kattawar, George W; Zhang, Xiaodong

    2017-10-02

    A new physical-geometric optics method is developed to compute the single-scattering properties of faceted particles. It incorporates a general absorption vector to accurately account for inhomogeneous wave effects, and subsequently yields analytical formulas that are effective and computationally efficient for absorptive scattering particles. A bundle of rays incident on a certain facet can be traced as a single beam. For a beam incident on multiple facets, a systematic beam-splitting technique based on computer graphics is used to split the original beam into several sub-beams so that each sub-beam is incident only on an individual facet. The new beam-splitting technique significantly reduces the computational burden. The present physical-geometric optics method can be generalized to arbitrary faceted particles with either convex or concave shapes and with a homogeneous or an inhomogeneous (e.g., a particle with a core) composition. The single-scattering properties of irregular convex homogeneous and inhomogeneous hexahedra are simulated and compared to their counterparts from two other methods, including a numerically rigorous method.

  1. Optimal regionalization of extreme value distributions for flood estimation

    NASA Astrophysics Data System (ADS)

    Asadi, Peiman; Engelke, Sebastian; Davison, Anthony C.

    2018-01-01

    Regionalization methods have long been used to estimate high return levels of river discharges at ungauged locations on a river network. In these methods, discharge measurements from a homogeneous group of similar, gauged, stations are used to estimate high quantiles at a target location that has no observations. The similarity of this group to the ungauged location is measured in terms of a hydrological distance measuring differences in physical and meteorological catchment attributes. We develop a statistical method for estimation of high return levels based on regionalizing the parameters of a generalized extreme value distribution. The group of stations is chosen by optimizing over the attribute weights of the hydrological distance, ensuring similarity and in-group homogeneity. Our method is applied to discharge data from the Rhine basin in Switzerland, and its performance at ungauged locations is compared to that of other regionalization methods. For gauged locations we show how our approach improves the estimation uncertainty for long return periods by combining local measurements with those from the chosen group.
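
    The core regional ingredient, a generalized extreme value fit and the corresponding high return level, can be sketched with scipy as below; the optimized weighting of donor stations by hydrological distance is the paper's contribution and is not reproduced here. The synthetic annual-maximum discharges and scipy's genextreme parameterization are assumptions for illustration.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)
# Hypothetical series of annual maximum discharges (m^3/s) at a gauged station.
annual_max_discharge = genextreme.rvs(c=-0.1, loc=300.0, scale=80.0, size=60, random_state=rng)

# Fit a GEV to the annual maxima and estimate, e.g., the 100-year return level.
c, loc, scale = genextreme.fit(annual_max_discharge)
return_period = 100.0
return_level = genextreme.isf(1.0 / return_period, c, loc, scale)
print(f"estimated 100-year discharge: {return_level:.0f} m^3/s")
```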

  2. A stochastic vortex structure method for interacting particles in turbulent shear flows

    NASA Astrophysics Data System (ADS)

    Dizaji, Farzad F.; Marshall, Jeffrey S.; Grant, John R.

    2018-01-01

    In a recent study, we have proposed a new synthetic turbulence method based on stochastic vortex structures (SVSs), and we have demonstrated that this method can accurately predict particle transport, collision, and agglomeration in homogeneous, isotropic turbulence in comparison to direct numerical simulation results. The current paper extends the SVS method to non-homogeneous, anisotropic turbulence. The key element of this extension is a new inversion procedure, by which the vortex initial orientation can be set so as to generate a prescribed Reynolds stress field. After validating this inversion procedure for simple problems, we apply the SVS method to the problem of interacting particle transport by a turbulent planar jet. Measures of the turbulent flow and of particle dispersion, clustering, and collision obtained by the new SVS simulations are shown to compare well with direct numerical simulation results. The influence of different numerical parameters, such as number of vortices and vortex lifetime, on the accuracy of the SVS predictions is also examined.

  3. ZERODUR: progress in CTE characterization

    NASA Astrophysics Data System (ADS)

    Jedamzik, Ralf; Kunisch, Clemens; Westerhoff, Thomas

    2013-09-01

    In 2010, SCHOTT introduced a method for the modeling of the thermal expansion behavior of ZERODUR® under arbitrary temperature profiles for an optimized production of material for the upcoming Extremely Large Telescope (ELT) projects. In 2012 a new product was introduced based on this method called ZERODUR® TAILORED. ZERODUR® TAILORED provides an evolution in the specification of the absolute Coefficient of Thermal Expansion (CTE) value by including the individual customer requirements in this process. This paper presents examples showing the benefit of an application oriented approach in the design of specifications using ZERODUR®. Additionally it will be shown how the modeling approach has advanced during the last years to improve the prediction accuracy on long time scales. ZERODUR® is known not only for its lowest CTE but also for its excellent CTE homogeneity as shown in the past for disc shaped blanks typical for telescope mirror substrates. Additionally this paper presents recent results of CTE homogeneity measurements in the single digit ppb/K range for a rectangular cast plate proving that the excellent CTE homogeneity is independent of the production format.

  4. Flux density calibration in diffuse optical tomographic systems.

    PubMed

    Biswas, Samir Kumar; Rajan, Kanhirodan; Vasu, Ram M

    2013-02-01

    The solution of the forward equation that models the transport of light through a highly scattering tissue material in diffuse optical tomography (DOT) using the finite element method gives the flux density (Φ) at the nodal points of the mesh. The experimentally measured flux (U_measured) on the boundary over a finite surface area in a DOT system has to be corrected to account for the system transfer functions (R) of the various building blocks of the measurement system. We present two methods to compensate for the perturbations caused by R and estimate the true flux density (Φ) from the calibrated measurement U_measured^cal. In the first approach, the measurement data with a homogeneous phantom (U_measured^homo) is used to calibrate the measurement system. The second scheme estimates the homogeneous phantom measurement using only the measurement from a heterogeneous phantom, thereby eliminating the necessity of a homogeneous phantom. This is done by statistically averaging the data (U_measured^hetero) and redistributing it to the corresponding detector positions. The experiments carried out on tissue-mimicking phantoms with single and multiple inhomogeneities, a human hand, and a pork tissue phantom demonstrate the robustness of the approach.
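
    The first calibration scheme divides out the unknown system response using a homogeneous-phantom measurement and the corresponding model prediction. A minimal sketch of such a per-channel ratio correction follows; the names and the simple division are assumptions for illustration, and the paper's actual compensation may be more elaborate.

```python
import numpy as np

def calibrate_flux(u_measured_hetero, u_measured_homo, phi_model_homo):
    """Per source-detector channel: estimate the true boundary flux of the heterogeneous
    object by removing the system response R inferred from a homogeneous-phantom scan,
    R ~ U_homo / Phi_homo  =>  Phi_hetero ~ U_hetero * Phi_homo / U_homo."""
    return np.asarray(u_measured_hetero) * np.asarray(phi_model_homo) / np.asarray(u_measured_homo)

# Hypothetical channel data (arbitrary units).
u_hetero = np.array([0.82, 0.40, 0.11])
u_homo = np.array([1.00, 0.55, 0.20])
phi_homo_model = np.array([3.1e-3, 1.7e-3, 6.0e-4])
print(calibrate_flux(u_hetero, u_homo, phi_homo_model))
```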

  5. Quantitation of repaglinide and metabolites in mouse whole-body thin tissue sections using droplet-based liquid microjunction surface sampling-high-performance liquid chromatography-electrospray ionization tandem mass spectrometry.

    PubMed

    Chen, Weiqi; Wang, Lifei; Van Berkel, Gary J; Kertesz, Vilmos; Gan, Jinping

    2016-03-25

    Herein, quantitation aspects of a fully automated autosampler/HPLC-MS/MS system applied for unattended droplet-based surface sampling of repaglinide dosed thin tissue sections with subsequent HPLC separation and mass spectrometric analysis of parent drug and various drug metabolites were studied. Major organs (brain, lung, liver, kidney and muscle) from whole-body thin tissue sections and corresponding organ homogenates prepared from repaglinide dosed mice were sampled by surface sampling and by bulk extraction, respectively, and analyzed by HPLC-MS/MS. A semi-quantitative agreement between data obtained by surface sampling and that by employing organ homogenate extraction was observed. Drug concentrations obtained by the two methods followed the same patterns for post-dose time points (0.25, 0.5, 1 and 2 h). Drug amounts determined in the specific tissues were typically higher when analyzing extracts from the organ homogenates. In addition, relative comparison of the levels of individual metabolites between the two analytical methods also revealed good semi-quantitative agreement. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Quantitation of repaglinide and metabolites in mouse whole-body thin tissue sections using droplet-based liquid microjunction surface sampling-high-performance liquid chromatography-electrospray ionization tandem mass spectrometry

    DOE PAGES

    Chen, Weiqi; Wang, Lifei; Van Berkel, Gary J.; ...

    2015-11-03

    Herein, quantitation aspects of a fully automated autosampler/HPLC-MS/MS system applied for unattended droplet-based surface sampling of repaglinide dosed thin tissue sections with subsequent HPLC separation and mass spectrometric analysis of parent drug and various drug metabolites were studied. Major organs (brain, lung, liver, kidney, muscle) from whole-body thin tissue sections and corresponding organ homogenates prepared from repaglinide dosed mice were sampled by surface sampling and by bulk extraction, respectively, and analyzed by HPLC-MS/MS. A semi-quantitative agreement between data obtained by surface sampling and that by employing organ homogenate extraction was observed. Drug concentrations obtained by the two methods followed the same patterns for post-dose time points (0.25, 0.5, 1 and 2 h). Drug amounts determined in the specific tissues were typically higher when analyzing extracts from the organ homogenates. Furthermore, relative comparison of the levels of individual metabolites between the two analytical methods also revealed good semi-quantitative agreement.

  7. Quantitation of repaglinide and metabolites in mouse whole-body thin tissue sections using droplet-based liquid microjunction surface sampling-high-performance liquid chromatography-electrospray ionization tandem mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Weiqi; Wang, Lifei; Van Berkel, Gary J.

    Herein, quantitation aspects of a fully automated autosampler/HPLC-MS/MS system applied for unattended droplet-based surface sampling of repaglinide dosed thin tissue sections with subsequent HPLC separation and mass spectrometric analysis of parent drug and various drug metabolites were studied. Major organs (brain, lung, liver, kidney, muscle) from whole-body thin tissue sections and corresponding organ homogenates prepared from repaglinide dosed mice were sampled by surface sampling and by bulk extraction, respectively, and analyzed by HPLC-MS/MS. A semi-quantitative agreement between data obtained by surface sampling and that by employing organ homogenate extraction was observed. Drug concentrations obtained by the two methods followed the same patterns for post-dose time points (0.25, 0.5, 1 and 2 h). Drug amounts determined in the specific tissues were typically higher when analyzing extracts from the organ homogenates. Furthermore, relative comparison of the levels of individual metabolites between the two analytical methods also revealed good semi-quantitative agreement.

  8. Homogeneity study of a corn flour laboratory reference material candidate for inorganic analysis.

    PubMed

    Dos Santos, Ana Maria Pinto; Dos Santos, Liz Oliveira; Brandao, Geovani Cardoso; Leao, Danilo Junqueira; Bernedo, Alfredo Victor Bellido; Lopes, Ricardo Tadeu; Lemos, Valfredo Azevedo

    2015-07-01

    In this work, a homogeneity study of a corn flour reference material candidate for inorganic analysis is presented. Seven kilograms of corn flour were used to prepare the material, which was distributed among 100 bottles. The elements Ca, K, Mg, P, Zn, Cu, Fe, Mn and Mo were quantified by inductively coupled plasma optical emission spectrometry (ICP OES) after acid digestion procedure. The method accuracy was confirmed by analyzing the rice flour certified reference material, NIST 1568a. All results were evaluated by analysis of variance (ANOVA) and principal component analysis (PCA). In the study, a sample mass of 400mg was established as the minimum mass required for analysis, according to the PCA. The between-bottle test was performed by analyzing 9 bottles of the material. Subsamples of a single bottle were analyzed for the within-bottle test. No significant differences were observed for the results obtained through the application of both statistical methods. This fact demonstrates that the material is homogeneous for use as a laboratory reference material. Copyright © 2015 Elsevier Ltd. All rights reserved.
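
    The between-bottle homogeneity check amounts to a one-way ANOVA of element concentrations across bottles. A minimal sketch with hypothetical replicate measurements is shown below; scipy's f_oneway stands in for the fuller ANOVA and PCA treatment described in the paper.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(6)
# Hypothetical Fe concentrations (mg/kg): 9 bottles, 3 digestion replicates each.
bottles = [rng.normal(loc=25.0, scale=0.8, size=3) for _ in range(9)]

f_stat, p_value = f_oneway(*bottles)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
if p_value > 0.05:
    print("no significant between-bottle differences -> material considered homogeneous")
```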

  9. Automated object-based classification of topography from SRTM data

    PubMed Central

    Drăguţ, Lucian; Eisank, Clemens

    2012-01-01

    We introduce an object-based method to automatically classify topography from SRTM data. The new method relies on the concept of decomposing land-surface complexity into more homogeneous domains. An elevation layer is automatically segmented and classified at three scale levels that represent domains of complexity by using self-adaptive, data-driven techniques. For each domain, scales in the data are detected with the help of local variance and segmentation is performed at these appropriate scales. Objects resulting from segmentation are partitioned into sub-domains based on thresholds given by the mean values of elevation and standard deviation of elevation respectively. Results reasonably resemble the patterns of existing global and regional classifications, displaying a level of detail close to manually drawn maps. Statistical evaluation indicates that most of the classes satisfy the regionalization requirements of maximizing internal homogeneity while minimizing external homogeneity. Most objects have boundaries matching natural discontinuities at the regional level. The method is simple and fully automated. The input data consist of only one layer, which does not need any pre-processing. Both segmentation and classification rely on only two parameters: elevation and standard deviation of elevation. The methodology is implemented as a customized process for the eCognition® software, available as an online download. The results are embedded in a web application with functionalities of visualization and download. PMID:22485060
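
    The partitioning step is described as thresholding each segmented object by its mean elevation and by its standard deviation of elevation. The sketch below reproduces only that thresholding logic on a small table of object statistics, using the global means of the two attributes as thresholds; the class names and threshold rule are a reading of the description, not eCognition code, and the multi-scale segmentation itself is not reproduced.

```python
import numpy as np

def classify_objects(mean_elev, std_elev):
    """Partition segmented objects into four sub-domains using the global means of the two
    object attributes (mean elevation, standard deviation of elevation) as thresholds."""
    elev_thr = np.mean(mean_elev)
    relief_thr = np.mean(std_elev)
    labels = np.empty(len(mean_elev), dtype=object)
    labels[(mean_elev >= elev_thr) & (std_elev >= relief_thr)] = "high, rough"
    labels[(mean_elev >= elev_thr) & (std_elev < relief_thr)] = "high, smooth"
    labels[(mean_elev < elev_thr) & (std_elev >= relief_thr)] = "low, rough"
    labels[(mean_elev < elev_thr) & (std_elev < relief_thr)] = "low, smooth"
    return labels

# Hypothetical per-object statistics extracted from a segmented elevation layer.
mean_elev = np.array([120.0, 900.0, 450.0, 1500.0])
std_elev = np.array([5.0, 60.0, 12.0, 140.0])
print(classify_objects(mean_elev, std_elev))
```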

  10. Observation of force-detected nuclear magnetic resonance in a homogeneous field

    PubMed Central

    Madsen, L. A.; Leskowitz, G. M.; Weitekamp, D. P.

    2004-01-01

    We report the experimental realization of BOOMERANG (better observation of magnetization, enhanced resolution, and no gradient), a sensitive and general method of magnetic resonance. The prototype millimeter-scale NMR spectrometer shows signal and noise levels in agreement with the design principles. We present 1H and 19F NMR in both solid and liquid samples, including time-domain Fourier transform NMR spectroscopy, multiple-pulse echoes, and heteronuclear J spectroscopy. By measuring a 1H-19F J coupling, this last experiment accomplishes chemically specific spectroscopy with force-detected NMR. In BOOMERANG, an assembly of permanent magnets provides a homogeneous field throughout the sample, while a harmonically suspended part of the assembly, a detector, is mechanically driven by spin-dependent forces. By placing the sample in a homogeneous field, signal dephasing by diffusion in a field gradient is made negligible, enabling application to liquids, in contrast to other force-detection methods. The design appears readily scalable to μm-scale samples where it should have sensitivity advantages over inductive detection with microcoils and where it holds great promise for application of magnetic resonance in biology, chemistry, physics, and surface science. We briefly discuss extensions of the BOOMERANG method to the μm and nm scales. PMID:15326302

  11. Automated object-based classification of topography from SRTM data

    NASA Astrophysics Data System (ADS)

    Drăguţ, Lucian; Eisank, Clemens

    2012-03-01

    We introduce an object-based method to automatically classify topography from SRTM data. The new method relies on the concept of decomposing land-surface complexity into more homogeneous domains. An elevation layer is automatically segmented and classified at three scale levels that represent domains of complexity by using self-adaptive, data-driven techniques. For each domain, scales in the data are detected with the help of local variance and segmentation is performed at these appropriate scales. Objects resulting from segmentation are partitioned into sub-domains based on thresholds given by the mean values of elevation and standard deviation of elevation respectively. Results reasonably resemble the patterns of existing global and regional classifications, displaying a level of detail close to manually drawn maps. Statistical evaluation indicates that most of the classes satisfy the regionalization requirements of maximizing internal homogeneity while minimizing external homogeneity. Most objects have boundaries matching natural discontinuities at the regional level. The method is simple and fully automated. The input data consist of only one layer, which does not need any pre-processing. Both segmentation and classification rely on only two parameters: elevation and standard deviation of elevation. The methodology is implemented as a customized process for the eCognition® software, available as an online download. The results are embedded in a web application with functionalities of visualization and download.

  12. Warm and cold pasta phase in relativistic mean field theory

    NASA Astrophysics Data System (ADS)

    Avancini, S. S.; Menezes, D. P.; Alloy, M. D.; Marinelli, J. R.; Moraes, M. M. W.; Providência, C.

    2008-07-01

    In the present article we investigate the onset of the pasta phase with different parametrizations of the nonlinear Walecka model. At zero temperature two different methods are used, one based on coexistent phases and the other on the Thomas-Fermi approximation. At finite temperature only the coexistence phases method is used. npe matter with fixed proton fractions and in β equilibrium is studied. The pasta phase decreases with the increase of temperature. The internal pasta structure and the beginning of the homogeneous phase vary depending on the proton fraction (or the imposition of β equilibrium), on the method used, and on the chosen parametrization. It is shown that a good parametrization of the surface tension with dependence on the temperature, proton fraction, and geometry is essential to describe correctly large isospin asymmetries and the transition from pasta to homogeneous matter.

  13. Optimization of synthesis process of thermally-responsive poly-n-isopropylacrylamide nanoparticles for controlled release of antimicrobial hydrophobic compounds

    NASA Astrophysics Data System (ADS)

    Hill, Laura E.; Gomes, Carmen L.

    2014-12-01

    The goal of this study was to develop an effective method to synthesize poly-n-isopropylacrylamide (PNIPAAM) nanoparticles with entrapped cinnamon bark extract (CBE) to improve its delivery to foodborne pathogens and control its release with temperature stimuli. CBE was used as a model for hydrophobic natural antimicrobials. A top-down procedure using crosslinked PNIPAAM was compared to a bottom-up procedure using NIPAAM monomer. Both processes relied on self-assembly of the molecules into micelles around the CBE at 40 °C. Processing conditions were compared including homogenization time of the polymer, hydration time prior to homogenization, lyophilization, and the effect of particle ultrafiltration. The top-down versus bottom-up synthesis methods yielded particles with significantly different characteristics, especially their release profiles and antimicrobial activities. The synthesis methods affected particle size, with the bottom-up procedure resulting in smaller (P < 0.05) diameters than the top-down procedure. The controlled release profile of CBE from nanoparticles was dependent on the release media temperature. A faster, burst release was observed at 40 °C and a slower, more sustained release was observed at lower temperatures. PNIPAAM particles containing CBE were analyzed for their antimicrobial activity against Salmonella enterica serovar Typhimurium LT2 and Listeria monocytogenes Scott A. The PNIPAAM particles synthesized via the top-down procedure had a much faster release, which led to a greater (P < 0.05) antimicrobial activity. Both of the top-down nanoparticles performed similarly, therefore the 7 min homogenization time nanoparticles would be the best for this application, as the process time is shorter and little improvement was seen by using a slightly longer homogenization.

  14. Travelling wave solutions of the homogeneous one-dimensional FREFLO model

    NASA Astrophysics Data System (ADS)

    Huang, B.; Hong, J. Y.; Jing, G. Q.; Niu, W.; Fang, L.

    2018-01-01

    At present there are few analytical studies of traffic flow, owing to the non-linearity of the governing equations. In the present paper we introduce travelling wave solutions for the homogeneous one-dimensional FREFLO model; they are expressed in series form and describe the process in which vehicles/pedestrians move with a negative velocity and decelerate to rest, then accelerate in the opposite direction to positive velocities. This method is expected to be extended to more complex situations in the future.

  15. Reactant conversion in homogeneous turbulence - Mathematical modeling, computational validations, and practical applications

    NASA Technical Reports Server (NTRS)

    Madnia, C. K.; Frankel, S. H.; Givi, P.

    1992-01-01

    The presently obtained closed-form analytical expressions, which predict the limiting rate of mean reactant conversion in homogeneous turbulent flows under the influence of a binary reaction, are derived via the single-point pdf method based on amplitude mapping closure. With this model, the maximum rate of the mean reactant's decay can be conveniently expressed in terms of definite integrals of the parabolic cylinder functions. The results obtained are shown to be in good agreement with data generated by direct numerical simulations.
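
    The closed-form expressions above are stated in terms of definite integrals of parabolic cylinder functions. As a hedged illustration of how such an integral can be evaluated numerically (the order, limits, and integrand below are placeholders, not the paper's actual expressions), one can combine scipy.special.pbdv with adaptive quadrature:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import pbdv   # pbdv(v, x) -> (D_v(x), d/dx D_v(x))

def parabolic_cylinder_integral(v, a, b):
    """Numerically evaluate the definite integral of D_v(x) over [a, b];
    a stand-in for the kind of integral appearing in closed-form
    decay-rate expressions (order and limits are illustrative)."""
    integrand = lambda x: pbdv(v, x)[0]
    value, _ = quad(integrand, a, b)
    return value

print(parabolic_cylinder_integral(v=0.5, a=0.0, b=2.0))
```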

  16. On controllability of homogeneous and inhomogeneous discrete-time multi-input bilinear systems in dimension two

    NASA Astrophysics Data System (ADS)

    Tie, Lin

    2017-08-01

    In this paper, the controllability problem of two-dimensional discrete-time multi-input bilinear systems is completely solved. The homogeneous and the inhomogeneous cases are studied separately, and necessary and sufficient conditions for controllability, which are easy to apply, are established by using a linear algebraic method. Moreover, for the uncontrollable systems, near-controllability is considered and similar necessary and sufficient conditions are also obtained. Finally, examples are provided to demonstrate the results of this paper.

  17. Optical spectroscopy to study confined and semi-closed explosions of homogeneous and composite charges

    NASA Astrophysics Data System (ADS)

    Maiz, Lotfi; Trzciński, Waldemar A.; Paszula, Józef

    2017-01-01

    Confined and semi-closed explosions of a new class of energetic composites, as well as of TNT and RDX charges, were investigated using optical spectroscopy. These composites are considered thermobaric explosives when used in layered charges, or enhanced blast explosives when pressed. Two methods to estimate the fireball temperature histories of both homogeneous and metallized explosives from the spectroscopic data are also presented, compared and analyzed. Fireball temperatures of the charges detonated in a small explosion chamber under air and argon atmospheres, and detonated in a semi-closed bunker, are presented and compared with theoretical values calculated by a thermochemical code. Important conclusions about the fireball temperatures and the physical and chemical phenomena occurring after the detonation of homogeneous explosives and composite formulations are deduced.

  18. Production of solid lipid nanoparticles (SLN): scaling up feasibilities.

    PubMed

    Dingler, A; Gohla, S

    2002-01-01

    Solid lipid nanoparticles (SLN/Lipopearls) are widely discussed as a new colloidal drug carrier system. In contrast to polymeric systems, such as polylactic copolyol microcapsules, these systems show good biocompatibility when applied parenterally. The solid lipid matrices can be composed of fats or waxes, and allow protection of incorporated active ingredients against chemical and physical degradation. The SLN can either be produced by 'hot homogenization' of melted lipids at elevated temperatures or by a 'cold homogenization' process. This paper deals with production technologies for SLN formulations, based on non-ethoxylated fat components for topical application and high pressure homogenization. Based on the chosen fat components, a novel and easy manufacturing and scaling-up method was developed to maintain chemical and physical integrity of the encapsulated active ingredients in the carrier.

  19. Homogeneous molybdenum disulfide tunnel diode formed via chemical doping

    NASA Astrophysics Data System (ADS)

    Liu, Xiaochi; Qu, Deshun; Choi, Min Sup; Lee, Changmin; Kim, Hyoungsub; Yoo, Won Jong

    2018-04-01

    We report on a simple, controllable chemical doping method to fabricate a lateral homogeneous MoS2 tunnel diode. MoS2 was doped to degenerate n- (1.6 × 1013 cm-2) and p-type (1.1 × 1013 cm-2) by benzyl viologen and AuCl3, respectively. The n- and p-doping can be patterned on the same MoS2 flake, and the high doping concentration can be maintained by Al2O3 masking together with vacuum annealing. A forward rectifying p-n diode and a band-to-band tunneling induced backward rectifying diode were realized by modulating the doping concentration of both the n- and p-sides. Our approach is a universal stratagem to fabricate diverse 2D homogeneous diodes with various functions.

  20. Some variance reduction methods for numerical stochastic homogenization.

    PubMed

    Blanc, X; Le Bris, C; Legoll, F

    2016-04-28

    We give an overview of a series of recent studies devoted to variance reduction techniques for numerical stochastic homogenization. Numerical homogenization requires that a set of problems is solved at the microscale, the so-called corrector problems. In a random environment, these problems are stochastic and therefore need to be repeatedly solved, for several configurations of the medium considered. An empirical average over all configurations is then performed using the Monte Carlo approach, so as to approximate the effective coefficients necessary to determine the macroscopic behaviour. Variance severely affects the accuracy and the cost of such computations. Variance reduction approaches, borrowed from other contexts in the engineering sciences, can be useful. Some of these variance reduction techniques are presented, studied and tested here. © 2016 The Author(s).
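
    A minimal sketch of the idea, assuming the simplest possible setting: in one dimension the homogenized coefficient is the harmonic mean of the random coefficient, so each "corrector problem" reduces to a harmonic average over a sampled configuration, and an antithetic-variate estimator can be compared with plain Monte Carlo. This only illustrates variance reduction for stochastic homogenization in general; it is not the specific techniques studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def effective_coeff(u):
    """1-D stochastic homogenization: for a(x) = 2 + u(x), piecewise constant
    over the cells of a sampled configuration u, the effective coefficient is
    the harmonic mean of a over the cells (the 1-D corrector problem has a
    closed-form consequence)."""
    a = 2.0 + u
    return u.size / np.sum(1.0 / a)

n_cells, M = 256, 2000

# plain Monte Carlo over M configurations
plain = np.array([effective_coeff(rng.random(n_cells)) for _ in range(M)])

# antithetic variates: pair each configuration u with 1 - u and average the two
anti = np.array([0.5 * (effective_coeff(u) + effective_coeff(1.0 - u))
                 for u in (rng.random(n_cells) for _ in range(M // 2))])

print("estimate  plain %.4f   antithetic %.4f" % (plain.mean(), anti.mean()))
print("variance of the mean at equal work:")
print("  plain      %.3e" % (plain.var() / M))
print("  antithetic %.3e" % (anti.var() / (M // 2)))
```

    Because the harmonic mean is monotone in each cell value, the antithetic pairing is negatively correlated with the original sample and the estimator variance drops at no extra cost per corrector solve.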

  1. Homogeneous immunoglobulins in the sera of lung carcinoma patients receiving cytotoxic chemotherapy--detection with the use of isoelectric focusing and immunoblotting.

    PubMed Central

    Haas, H; Lange, A; Schlaak, M

    1987-01-01

    Using isoelectric focusing (IEF) with immunoblotting, we have analysed serum immunoglobulins of 15 lung cancer patients on cytotoxic chemotherapy. In five of the patients homogeneous immunoglobulins were found which appeared between 9 and 18 months after beginning of treatment and were monoclonal in two and oligoclonal in three cases. These abnormalities were only partially shown by zonal electrophoresis with immunofixation and not detected by immune electrophoresis. Examination of 10 normal and 10 myeloma sera by the three techniques in parallel confirmed the competence and sensitivity of IEF with immunoblotting in detecting homogeneous immunoglobulins. Thus, this method provides a valuable tool for investigating an abnormal regulation of the immunoglobulin synthesis. PMID:3325203

  2. Performance of electrolyte measurements assessed by a trueness verification program.

    PubMed

    Ge, Menglei; Zhao, Haijian; Yan, Ying; Zhang, Tianjiao; Zeng, Jie; Zhou, Weiyan; Wang, Yufei; Meng, Qinghui; Zhang, Chuanbao

    2016-08-01

    In this study, we analyzed frozen sera with known commutabilities for standardization of serum electrolyte measurements in China. Fresh frozen sera were sent to 187 clinical laboratories in China for measurement of four electrolytes (sodium, potassium, calcium, and magnesium). Target values were assigned by two reference laboratories. Precision (CV), trueness (bias), and accuracy [total error (TEa)] were used to evaluate measurement performance, and the tolerance limit derived from the biological variation was used as the evaluation criterion. About half of the laboratories used a homogeneous system (same manufacturer for instrument, reagent and calibrator) for calcium and magnesium measurement, and more than 80% of laboratories used a homogeneous system for sodium and potassium measurement. More laboratories met the tolerance limit of imprecision (coefficient of variation [CVa]) than the tolerance limits of trueness (biasa) and TEa. For sodium, calcium, and magnesium, the minimal performance criterion derived from biological variation was used, and the pass rates for total error were approximately equal to those for bias (<50%). For potassium, the pass rates for CV and TE were more than 90%. Compared with the non-homogeneous systems, the homogeneous systems were superior for all three quality specifications. The use of commutable proficiency testing/external quality assessment (PT/EQA) samples with values assigned by reference methods can monitor performance and provide reliable data for improving the performance of laboratory electrolyte measurement. The homogeneous systems were superior to the non-homogeneous systems, whereas accuracy of assigned values of calibrators and assay stability remained challenges.
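
    The three quality specifications can be computed from replicate results in a few lines. The sketch below assumes a simple setting (one analyte, one laboratory, replicate measurements against an assigned target) and uses a common linear total-error model, TE = |bias| + 1.65 CV; the tolerance limits shown are placeholders, not the study's biological-variation criteria.

```python
import numpy as np

def evaluate_performance(results, target, cv_limit, bias_limit, te_limit):
    """Compare a laboratory's replicate results for one analyte against
    tolerance limits for imprecision (CV), trueness (bias) and total error.
    All limits here are illustrative placeholders."""
    results = np.asarray(results, dtype=float)
    cv = 100.0 * results.std(ddof=1) / results.mean()
    bias = 100.0 * (results.mean() - target) / target
    te = abs(bias) + 1.65 * cv          # a common linear total-error model
    return {
        "CV%": cv, "bias%": bias, "TE%": te,
        "pass_CV": cv <= cv_limit,
        "pass_bias": abs(bias) <= bias_limit,
        "pass_TE": te <= te_limit,
    }

# toy potassium example: replicate measurements (mmol/L) vs. an assigned target
print(evaluate_performance([4.02, 4.05, 3.98, 4.07, 4.01],
                           target=4.00, cv_limit=2.2, bias_limit=1.8, te_limit=5.6))
```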

  3. Use of vertical temperature gradients for prediction of tidal flat sediment characteristics

    USGS Publications Warehouse

    Miselis, Jennifer L.; Holland, K. Todd; Reed, Allen H.; Abelev, Andrei

    2012-01-01

    Sediment characteristics largely govern tidal flat morphologic evolution; however, conventional methods of investigating spatial variability in lithology on tidal flats are difficult to employ in these highly dynamic regions. In response, a series of laboratory experiments was designed to investigate the use of temperature diffusion toward sediment characterization. A vertical thermistor array was used to quantify temperature gradients in simulated tidal flat sediments of varying compositions. Thermal conductivity estimates derived from these arrays were similar to measurements from a standard heated needle probe, which substantiates the thermistor methodology. While the thermal diffusivities of dry homogeneous sediments were similar, diffusivities for saturated homogeneous sediments ranged approximately one order of magnitude. The thermal diffusivity of saturated sand was five times the thermal diffusivity of saturated kaolin and more than eight times the thermal diffusivity of saturated bentonite. This suggests that vertical temperature gradients can be used for distinguishing homogeneous saturated sands from homogeneous saturated clays and perhaps even between homogeneous saturated clay types. However, experiments with more realistic tidal flat mixtures were less discriminating. Relationships between thermal diffusivity and percent fines for saturated mixtures varied depending upon clay composition, indicating that clay hydration and/or water content controls thermal gradients. Furthermore, existing models for the bulk conductivity of sediment mixtures were improved only through the use of calibrated estimates of homogeneous end-member conductivity and water content values. Our findings suggest that remotely sensed observations of water content and thermal diffusivity could only be used to qualitatively estimate tidal flat sediment characteristics.

  4. The Fourier transforms for the spatially homogeneous Boltzmann equation and Landau equation

    NASA Astrophysics Data System (ADS)

    Meng, Fei; Liu, Fang

    2018-03-01

    In this paper, we study the Fourier transforms for two equations arising in the kinetic theory. The first equation is the spatially homogeneous Boltzmann equation. The Fourier transform of the spatially homogeneous Boltzmann equation has been first addressed by Bobylev (Sov Sci Rev C Math Phys 7:111-233, 1988) in the Maxwellian case. Alexandre et al. (Arch Ration Mech Anal 152(4):327-355, 2000) investigated the Fourier transform of the gain operator for the Boltzmann operator in the cut-off case. Recently, the Fourier transform of the Boltzmann equation is extended to hard or soft potential with cut-off by Kirsch and Rjasanow (J Stat Phys 129:483-492, 2007). We shall first establish the relation between the results in Alexandre et al. (2000) and Kirsch and Rjasanow (2007) for the Fourier transform of the Boltzmann operator in the cut-off case. Then we give the Fourier transform of the spatially homogeneous Boltzmann equation in the non cut-off case. It is shown that our results cover previous works (Bobylev 1988; Kirsch and Rjasanow 2007). The second equation is the spatially homogeneous Landau equation, which can be obtained as a limit of the Boltzmann equation when grazing collisions prevail. Following the method in Kirsch and Rjasanow (2007), we can also derive the Fourier transform for Landau equation.

  5. Analysis of openings and width of leaf on multileaf Collimators Using Gafchromic RTQA2 Film

    NASA Astrophysics Data System (ADS)

    Setiawati, Evi; Lailla Rachma, Assyifa; Hidayatullah, M.

    2018-05-01

    This study assessed the correction of leaf openings for treatment and the dose distribution using Gafchromic RTQA2 film, examining MLC corrections based on leaf movement and the uniformity of the irradiated field. Gafchromic RTQA2 film was irradiated according to a homogeneity-index planning scheme for different leaf openings and leaf widths. The exposed films were scanned and the images imported into MATLAB, where they were converted to greyscale and the darkening of the films was analysed as a function of dose. From this, a correlation between pixel value and dose was established, the homogeneity of the film dosimetry was evaluated, and corrections for leaf opening and leaf width were determined. The relation between pixel value and dose was linear, with y = -0.6x + 108 for low doses and y = -0.28x + 108 for high doses, and the homogeneity index ranged from 0.003 to 0.084. The homogeneity and dose-distribution corrections for leaf opening and leaf width were around 5%, within the 10% tolerance suggested by ICRU Report No. 50.
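
    The pixel-to-dose conversion and homogeneity evaluation can be sketched as follows. The linear calibration coefficients are those quoted in the abstract for the low-dose branch; the homogeneity-index definition used here, (D_max - D_min)/(D_max + D_min), is one common choice and may differ from the paper's exact formula, and the pixel values are invented.

```python
import numpy as np

def pixel_to_dose(pixel, slope=-0.6, intercept=108.0):
    """Apply a linear pixel-value-to-dose calibration of the form
    y = slope * x + intercept (the low-dose calibration quoted in the
    abstract; the high-dose branch would use slope = -0.28)."""
    return slope * np.asarray(pixel, dtype=float) + intercept

def homogeneity_index(dose):
    """One common flatness/homogeneity definition,
    (D_max - D_min) / (D_max + D_min); the paper's definition may differ."""
    d = np.asarray(dose, dtype=float)
    return (d.max() - d.min()) / (d.max() + d.min())

# toy greyscale values extracted from a scanned film region
pixels = np.array([112.0, 110.5, 111.2, 113.0, 111.8])
doses = pixel_to_dose(pixels)
print(doses, homogeneity_index(doses))
```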

  6. A homogenization approach for characterization of the fluid-solid coupling parameters in Biot's equations for acoustic poroelastic materials

    NASA Astrophysics Data System (ADS)

    Gao, K.; van Dommelen, J. A. W.; Göransson, P.; Geers, M. G. D.

    2015-09-01

    In this paper, a homogenization method is proposed to obtain the parameters of Biot's poroelastic theory from a multiscale perspective. It is assumed that the behavior of a macroscopic material point can be captured through the response of a microscopic Representative Volume Element (RVE) consisting of both a solid skeleton and a gaseous fluid. The macroscopic governing equations are assumed to be Biot's poroelastic equations and the RVE is governed by the conservation of linear momentum and the adopted linear constitutive laws under the isothermal condition. With boundary conditions relying on the macroscopic solid displacement and fluid pressure, the homogenized solid stress and fluid displacement are obtained based on energy consistency. This homogenization framework offers an approach to obtain Biot's parameters directly through the response of the RVE in the regime of Darcy's flow where the pressure gradient is dominating. A numerical experiment is performed in the form of a sound absorption test on a porous material with an idealized partially open microstructure that is described by Biot's equations where the parameters are obtained through the proposed homogenization approach. The result is evaluated by comparison with Direct Numerical Simulations (DNS), showing a superior performance of this approach compared to an alternative semi-phenomenological model for estimating Biot's parameters of the studied porous material.

  7. An experimental evaluation of the effect of homogenization quality as a preconditioning on oil-water two-phase volume fraction measurement accuracy using gamma-ray attenuation technique

    NASA Astrophysics Data System (ADS)

    Sharifzadeh, M.; Hashemabadi, S. H.; Afarideh, H.; Khalafi, H.

    2018-02-01

    The problem of how to accurately measure multiphase flow in the oil/gas industry has remained an important issue since the early 1980s, and oil-water two-phase flow rate measurement in particular has been regarded as an important problem. Gamma-ray attenuation is one of the most commonly used methods for phase fraction measurement, but it is strongly dependent on flow regime variations. One strategy for removing the regime-dependency problem is to use a homogenization system as a preconditioning tool, as this work demonstrates. First, a two-phase flow homogenizer loop (TPFHL) is introduced and verified by quantitative assessment. Then a static-equivalent multiphase flow (SEMPF) system, with an additional capability for preparing a uniform mixture, is described. The idea proposed in this system was verified by Monte Carlo simulations. Finally, different water-gas oil two-phase volume fractions were fed to the homogenizer loop and injected into the static-equivalent system. A comparison of the performance of the two systems using the gamma-ray attenuation technique showed not only an improved ability to prepare a homogenized mixture but also a remarkable increase in measurement accuracy for the static-equivalent system.
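
    For a well-mixed (homogenized) flow, the phase fraction follows directly from the Beer-Lambert attenuation law, which is the core of the gamma-ray technique referred to above. The sketch below assumes single-energy attenuation and uses made-up calibration counts; it illustrates the principle only, not the authors' measurement chain.

```python
import numpy as np

def water_fraction(I, I_oil, I_water):
    """Infer the water volume fraction of a well-mixed oil-water flow from
    gamma-ray counts using the Beer-Lambert law: for a homogeneous mixture,
    ln(I) is linear in the phase fractions, so
        alpha_w = ln(I_oil / I) / ln(I_oil / I_water).
    I_oil and I_water are counts measured with the pipe full of each single
    phase (calibration); all numbers here are made up for illustration."""
    return np.log(I_oil / I) / np.log(I_oil / I_water)

# toy calibration counts and one mixture measurement
print(water_fraction(I=8.3e4, I_oil=1.0e5, I_water=6.0e4))
```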

  8. Use of Flood Seasonality in Pooling-Group Formation and Quantile Estimation: An Application in Great Britain

    NASA Astrophysics Data System (ADS)

    Formetta, Giuseppe; Bell, Victoria; Stewart, Elizabeth

    2018-02-01

    Regional flood frequency analysis is one of the most commonly applied methods for estimating extreme flood events at ungauged sites or locations with short measurement records. It is based on: (i) the definition of a homogeneous group (pooling-group) of catchments, and on (ii) the use of the pooling-group data to estimate flood quantiles. Although many methods to define a pooling-group (pooling schemes, PS) are based on catchment physiographic similarity measures, in the last decade methods based on flood seasonality similarity have been contemplated. In this paper, two seasonality-based PS are proposed and tested both in terms of the homogeneity of the pooling-groups they generate and in terms of the accuracy in estimating extreme flood events. The method has been applied in 420 catchments in Great Britain (considered as both gauged and ungauged) and compared against the current Flood Estimation Handbook (FEH) PS. Results for gauged sites show that, compared to the current PS, the seasonality-based PS performs better both in terms of homogeneity of the pooling-group and in terms of the accuracy of flood quantile estimates. For ungauged locations, a national-scale hydrological model has been used for the first time to quantify flood seasonality. Results show that in 75% of the tested locations the seasonality-based PS provides an improvement in the accuracy of the flood quantile estimates. The remaining 25% were located in highly urbanized, groundwater-dependent catchments. The promising results support the aspiration that large-scale hydrological models complement traditional methods for estimating design floods.
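
    Flood-seasonality similarity is often summarised with circular statistics: each flood date is mapped to an angle on the annual circle and catchments are compared through their mean-date/concentration centroids. The sketch below shows one such generic similarity measure with invented flood dates; it is not necessarily the exact pooling scheme proposed in the paper.

```python
import numpy as np

def seasonality_stats(days_of_year):
    """Map flood dates (day of year, 1-365) onto the unit circle and return
    the mean-date angle and the concentration r (0 = uniform through the
    year, 1 = all floods on the same day)."""
    theta = 2.0 * np.pi * np.asarray(days_of_year, dtype=float) / 365.0
    x, y = np.cos(theta).mean(), np.sin(theta).mean()
    return np.arctan2(y, x), np.hypot(x, y)

def seasonality_distance(site_a, site_b):
    """Euclidean distance between the (x, y) seasonality centroids of two
    sites; small distances suggest candidates for the same pooling-group."""
    ta, ra = seasonality_stats(site_a)
    tb, rb = seasonality_stats(site_b)
    return np.hypot(ra * np.cos(ta) - rb * np.cos(tb),
                    ra * np.sin(ta) - rb * np.sin(tb))

# toy annual-maximum flood dates (day of year) for two catchments
print(seasonality_distance([20, 35, 50, 340, 355], [15, 25, 45, 350, 360]))
```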

  9. Towards machine ecoregionalization of Earth's landmass using pattern segmentation method

    NASA Astrophysics Data System (ADS)

    Nowosad, Jakub; Stepinski, Tomasz F.

    2018-07-01

    We present and evaluate a quantitative method for delineation of ecophysiographic regions throughout the entire terrestrial landmass. The method uses the new pattern-based segmentation technique which attempts to emulate the qualitative, weight-of-evidence approach to a delineation of ecoregions in a computer code. An ecophysiographic region is characterized by homogeneous physiography defined by the cohesiveness of patterns of four variables: land cover, soils, landforms, and climatic patterns. Homogeneous physiography is a necessary but not sufficient condition for a region to be an ecoregion, thus machine delineation of ecophysiographic regions is the first, important step toward global ecoregionalization. In this paper, we focus on the first-order approximation of the proposed method - delineation on the basis of the patterns of the land cover alone. We justify this approximation by the existence of significant spatial associations between various physiographic variables. Resulting ecophysiographic regionalization (ECOR) is shown to be more physiographically homogeneous than existing global ecoregionalizations (Terrestrial Ecoregions of the World (TEW) and Bailey's Ecoregions of the Continents (BEC)). The presented quantitative method has an advantage of being transparent and objective. It can be verified, easily updated, modified and customized for specific applications. Each region in ECOR contains detailed, SQL-searchable information about physiographic patterns within it. It also has a computer-generated label. To give a sense of how ECOR compares to TEW and, in the U.S., to EPA Level III ecoregions, we contrast these different delineations using two specific sites as examples. We conclude that ECOR yields regionalization somewhat similar to EPA level III ecoregions, but for the entire world, and by automatic means.

  10. Development of an electrothermal vaporization ICP-MS method and assessment of its applicability to studies of the homogeneity of reference materials.

    PubMed

    Friese, K C; Grobecker, K H; Wätjen, U

    2001-07-01

    A method has been developed for measurement of the homogeneity of analyte distribution in powdered materials by use of electrothermal vaporization with inductively coupled plasma mass spectrometric (ETV-ICP-MS) detection. The method enabled the simultaneous determination of As, Cd, Cu, Fe, Mn, Pb, and Zn in milligram amounts of samples of biological origin. The optimized conditions comprised a high plasma power of 1,500 W, reduced aerosol transport flow, and heating ramps below 300 °C per second. A temperature ramp to 550 °C ensured effective pyrolysis of approximately 70% of the organic compounds without losses of analyte. An additional hold stage at 700 °C led to separation of most of the analyte signals from the evaporation of carbonaceous matrix compounds. The effect of time resolution of signal acquisition on the precision of the ETV measurements was investigated. An increase in the number of masses monitored up to 20 is possible with not more than 1% additional relative standard deviation of results caused by limited temporal resolution of the transient signals. Recording of signals from the nebulization of aqueous standards in each sample run enabled correction for drift of the sensitivity of the ETV-ICP-MS instrument. The applicability of the developed method to homogeneity studies was assessed by use of four certified reference materials. According to the best repeatability observed in these sample runs, the maximum contribution of the method to the standard deviation is approximately 5% to 6% for all the elements investigated.

  11. Eulerian formulation of the interacting particle representation model of homogeneous turbulence

    DOE PAGES

    Campos, Alejandro; Duraisamy, Karthik; Iaccarino, Gianluca

    2016-10-21

    The Interacting Particle Representation Model (IPRM) of homogeneous turbulence incorporates information about the morphology of turbulent structures within the confines of a one-point model. In the original formulation [Kassinos & Reynolds, Center for Turbulence Research: Annual Research Briefs, 31-51, (1996)], the IPRM was developed in a Lagrangian setting by evolving second moments of velocity conditional on a given gradient vector. In the present work, the IPRM is re-formulated in an Eulerian framework and evolution equations are developed for the marginal PDFs. Eulerian methods avoid the issues associated with statistical estimators used by Lagrangian approaches, such as slow convergence. A more specific emphasis of this work is to use the IPRM to examine the long time evolution of homogeneous turbulence. We first describe the derivation of the marginal PDF in spherical coordinates, which reduces the number of independent variables and the cost associated with Eulerian simulations of PDF models. Next, a numerical method based on radial basis functions over a spherical domain is adapted to the IPRM. Finally, results obtained with the new Eulerian solution method are thoroughly analyzed. The sensitivity of the Eulerian simulations to parameters of the numerical scheme, such as the size of the time step and the shape parameter of the radial basis functions, is examined. A comparison between Eulerian and Lagrangian simulations is performed to discern the capabilities of each of the methods. Finally, a linear stability analysis based on the eigenvalues of the discrete differential operators is carried out for both the new Eulerian solution method and the original Lagrangian approach.

  12. A Review of Web Information Seeking Research: Considerations of Method and Foci of Interest

    ERIC Educational Resources Information Center

    Martzoukou, Konstantina

    2005-01-01

    Introduction: This review shows that Web information seeking research suffers from inconsistencies in method and a lack of homogeneity in research foci. Background: Qualitative and quantitative methods are needed to produce a comprehensive view of information seeking. Studies also recommend observation as one of the most fundamental ways of…

  13. Influence of high-pressure homogenization, ultrasonication, and supercritical fluid on free astaxanthin extraction from β-glucanase-treated Phaffia rhodozyma cells.

    PubMed

    Hasan, Mojeer; Azhar, Mohd; Nangia, Hina; Bhatt, Prakash Chandra; Panda, Bibhu Prasad

    2016-01-01

    In this study astaxanthin production by Phaffia rhodozyma was enhanced by chemical mutation using ethyl methane sulfonate. The mutant produces a higher amount of astaxanthin than the wild yeast strain. In comparison to supercritical fluid technique, high-pressure homogenization is better for extracting astaxanthin from yeast cells. Ultrasonication of dimethyl sulfoxide, hexane, and acetone-treated cells yielded less astaxanthin than β-glucanase enzyme-treated cells. The combination of ultrasonication with β-glucanase enzyme is found to be the most efficient method of extraction among all the tested physical and chemical extraction methods. It gives a maximum yield of 435.71 ± 6.55 µg free astaxanthin per gram of yeast cell mass.

  14. Large eddy simulation of hydrodynamic cavitation

    NASA Astrophysics Data System (ADS)

    Bhatt, Mrugank; Mahesh, Krishnan

    2017-11-01

    Large eddy simulation is used to study sheet to cloud cavitation over a wedge. The mixture of water and water vapor is represented using a homogeneous mixture model. Compressible Navier-Stokes equations for the mixture quantities, along with a transport equation for the vapor mass fraction employing finite-rate mass transfer between the two phases, are solved using the numerical method of Gnanaskandan and Mahesh. The method is implemented on unstructured grids with parallel MPI capabilities. Flow over a wedge is simulated at Re = 200,000 and the performance of the homogeneous mixture model is analyzed in predicting different regimes of sheet to cloud cavitation, namely incipient, transitory and periodic, as observed in the experimental investigation of Harish et al. This work is supported by the Office of Naval Research.

  15. ISOTOPE CONVERSION DEVICE AND METHOD

    DOEpatents

    Wigner, E.P.; Ohlinger, L.A.

    1958-11-11

    Homogeneous nuclear reactors are discussed, and an apparatus and method of operation are described. The apparatus consists essentially of a reaction tank, a heat exchanger connected to the reaction tank and two separate surge tanks connected to the heat exchanger. An oscillating differential pressure is applied to the surge tanks so that a portion of the homogeneous fissionable solution is circulated through the heat exchanger and reaction tank while maintaining sufficient solution in the reaction tank to sustain a controlled fission chain reaction. The reaction tank is disposed within another tank containing a neutron absorbing material through which coolant fluid is circulated, the outer tank being provided with means to permit and cause rotation thereof due to the circulation of the coolant therethrough.

  16. Extraction of intracellular protein from Chlorella pyrenoidosa using a combination of ethanol soaking, enzyme digest, ultrasonication and homogenization techniques.

    PubMed

    Zhang, Ruilin; Chen, Jian; Zhang, Xuewu

    2018-01-01

    Due to the rigid cell wall of Chlorella species, it is still challenging to extract significant amounts of protein effectively. Numerous methods, based on biological, mechanical and chemical approaches, have been used for the extraction of intracellular protein from microalgae. In this study, based on a comparison of different extraction methods, a new protocol involving ethanol soaking, enzyme digestion, ultrasonication and homogenization was established to maximize the amount of protein extracted. Under the optimized conditions, 72.4% of protein was extracted from the microalgae Chlorella pyrenoidosa, which should contribute to the research and development of Chlorella protein in functional food and medicine. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Volcanic hazard assessment for the Canary Islands (Spain) using extreme value theory

    NASA Astrophysics Data System (ADS)

    Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.

    2011-10-01

    The Canary Islands are an active volcanic region, densely populated and visited by several million tourists every year. Nearly twenty eruptions have been reported through written chronicles in the last 600 yr, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately contribute to the design of appropriate preparedness plans. Hence, the probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step for the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions which is most appropriate given the nature of the documented historical eruptive data. We first characterize the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies of eruptive time series used conventional statistics and treated the series as a homogeneous process. In this paper, we use a method that accounts for the time-dependence of the series and includes rare or extreme events, in the form of a few large eruptions, since these data require special methods of analysis. Hence, we use a statistical method from extreme value theory. In particular, we apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of a magnitude greater than one in the upcoming years. This is done in three steps: first, we analyze the historical eruptive series to assess independence and homogeneity of the process; second, we perform a Weibull analysis of the distribution of repose time between successive eruptions; third, we analyze the non-homogeneous Poisson process with a generalized Pareto distribution as the intensity function.
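
    The second step, a Weibull analysis of repose times, is easy to reproduce in outline. The sketch below fits a Weibull distribution to an invented set of inter-eruption repose times and converts it into a probability of at least one event within a given horizon, ignoring the time already elapsed since the last eruption; the full non-homogeneous Poisson process with a generalized Pareto intensity used in the paper is not implemented here.

```python
import numpy as np
from scipy import stats

# toy repose times between successive eruptions, in years (illustrative only)
repose = np.array([12.0, 35.0, 60.0, 8.0, 95.0, 41.0, 23.0, 130.0, 17.0])

# Weibull fit of the repose-time distribution (location fixed at zero)
shape, loc, scale = stats.weibull_min.fit(repose, floc=0.0)

# probability of at least one eruption within the next t years, as a crude
# simplification of the paper's non-homogeneous Poisson / extreme-value model
t = 20.0
p = stats.weibull_min.cdf(t, shape, loc=loc, scale=scale)
print("Weibull shape %.2f, scale %.1f yr, P(event within %g yr) = %.2f"
      % (shape, scale, t, p))
```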

  18. A user-friendly and scalable process to prepare a ready-to-use inactivated vaccine: the example of heartwater in ruminants under tropical conditions.

    PubMed

    Marcelino, Isabel; Lefrançois, Thierry; Martinez, Dominique; Giraud-Girard, Ken; Aprelon, Rosalie; Mandonnet, Nathalie; Gaucheron, Jérôme; Bertrand, François; Vachiéry, Nathalie

    2015-01-29

    The use of cheap and thermoresistant vaccines in poor tropical countries for the control of animal diseases is a key issue. Our work aimed at designing and validating a process for the large-scale production of a ready-to-use inactivated vaccine for ruminants. Our model was heartwater caused by the obligate intracellular bacterium Ehrlichia ruminantium (ER). The conventional inactivated vaccine against heartwater (based on whole bacteria inactivated with sodium azide) is prepared immediately before injection, using a syringe-extrusion method with Montanide ISA50. This is a fastidious time-consuming process and it limits the number of vaccine doses available. To overcome these issues, we tested three different techniques (syringe, vortex and homogenizer) and three Montanide ISA adjuvants (50, 70 and 70M). High-speed homogenizer was the optimal method to emulsify ER antigens with both ISA70 and 70M adjuvants. The emulsions displayed a good homogeneity (particle size below 1 μm and low phase separation), conductivity below 10 μS/cm and low antigen degradation at 4 °C for up to 1 year. The efficacy of the different formulations was then evaluated during vaccination trials on goats. The inactivated ER antigens emulsified with ISA70 and ISA70M in a homogenizer resulted in 80% and 100% survival rates, respectively. A cold-chain rupture assay using ISA70M+ER was performed to mimic possible field conditions exposing the vaccine at 37 °C for 4 days before delivery. Surprisingly, the animal survival rate was still high (80%). We also observed that the MAP-1B antibody response was very similar between animals vaccinated with ISA70+ER and ISA70M+ER emulsions, suggesting a more homogenous antigen distribution and presentation in these emulsions. Our work demonstrated that the combination of ISA70 or ISA70M and homogenizer is optimal for the production of an effective ready-to-use inactivated vaccine against heartwater, which could easily be produced on an industrial scale. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. A Study on Regional Frequency Analysis using Artificial Neural Network - the Sumjin River Basin

    NASA Astrophysics Data System (ADS)

    Jeong, C.; Ahn, J.; Ahn, H.; Heo, J. H.

    2017-12-01

    Regional frequency analysis compensates for the main shortcoming of at-site frequency analysis, namely the lack of sample size, through the regional concept. Regional rainfall quantiles depend on the identification of hydrologically homogeneous regions, so regional classification based on the assumption of hydrological homogeneity is very important. For regional clustering of rainfall, multidimensional variables and factors related to geographical features and meteorological characteristics are considered, such as mean annual precipitation, number of days with precipitation in a year, and average maximum daily precipitation in a month. The Self-Organizing Feature Map (SOM) method, an unsupervised artificial neural network algorithm, handles N-dimensional and nonlinear problems and presents results simply as a data visualization technique. In this study, for the Sumjin river basin in South Korea, cluster analysis was performed with the SOM method using high-dimensional geographical features and meteorological factors as input data. Then, to evaluate the homogeneity of the resulting regions, L-moment based discordancy and heterogeneity measures were used. Rainfall quantiles were estimated with the index flood method, one of the regional rainfall frequency analysis approaches. The clustering analysis using the SOM method and the consequent variation in rainfall quantiles were analyzed. This research was supported by a grant (2017-MPSS31-001) from the Supporting Technology Development Program for Disaster Management funded by the Ministry of Public Safety and Security (MPSS) of the Korean government.
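
    The L-moment screening mentioned above starts from at-site L-moment ratios. The sketch below computes the sample L-CV and L-skewness from unbiased probability-weighted moments for one invented annual-maximum series; these ratios are the inputs to Hosking-Wallis style discordancy and heterogeneity measures, which are not implemented here.

```python
import numpy as np

def l_moment_ratios(x):
    """Sample L-CV (t) and L-skewness (t3) from the first three
    probability-weighted moments b0, b1, b2 (unbiased estimators)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0
    l2 = 2.0 * b1 - b0
    l3 = 6.0 * b2 - 6.0 * b1 + b0
    return l2 / l1, l3 / l2      # (L-CV, L-skewness)

# toy annual-maximum rainfall series for one site (mm)
print(l_moment_ratios([88, 102, 95, 130, 77, 150, 99, 121, 84, 140]))
```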

  20. Novel Budesonide Particles for Dry Powder Inhalation Prepared Using a Microfluidic Reactor Coupled With Ultrasonic Spray Freeze Drying.

    PubMed

    Saboti, Denis; Maver, Uroš; Chan, Hak-Kim; Planinšek, Odon

    2017-07-01

    Budesonide (BDS) is a potent active pharmaceutical ingredient, often administered using respiratory devices such as metered dose inhalers, nebulizers, and dry powder inhalers. Inhalable drug particles are conventionally produced by crystallization followed by milling. This approach tends to generate partially amorphous materials that require post-processing to improve the formulations' stability. Other methods involve homogenization or precipitation and often require the use of stabilizers, mostly surfactants. The purpose of this study was therefore to develop a novel method for preparation of fine BDS particles using a microfluidic reactor coupled with ultrasonic spray freeze drying, and hence avoiding the need of additional homogenization or stabilizer use. A T-junction microfluidic reactor was employed to produce particle suspension (using an ethanol-water, methanol-water, and an acetone-water system), which was directly fed into an ultrasonic atomization probe, followed by direct feeding to liquid nitrogen. Freeze drying was the final preparation step. The result was fine crystalline BDS powders which, when blended with lactose and dispersed in an Aerolizer at 100 L/min, generated fine particle fraction in the range 47.6% ± 2.8% to 54.9% ± 1.8%, thus exhibiting a good aerosol performance. Subsequent sample analysis confirmed the suitability of the developed method to produce inhalable drug particles without additional homogenization or stabilizers. The developed method provides a viable solution for particle isolation in microfluidics in general. Copyright © 2017 American Pharmacists Association®. All rights reserved.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Rong; Zhukhovitskiy, Aleksandr V.; Deraedt, Christophe V.

    Recyclable catalysts, especially those that display selective reactivity, are vital for the development of sustainable chemical processes. Among available catalyst platforms, heterogeneous catalysts are particularly well-disposed toward separation from the reaction mixture via filtration methods, which renders them readily recyclable. Furthermore, heterogeneous catalysts offer numerous handles, some without homogeneous analogues, for performance and selectivity optimization. These handles include nanoparticle size, pore profile of porous supports, surface ligands and interface with oxide supports, and flow rate through a solid catalyst bed. Despite these available handles, however, conventional heterogeneous catalysts are themselves often structurally heterogeneous compared to homogeneous catalysts, which complicates efforts to optimize and expand the scope of their reactivity and selectivity. Ongoing efforts in our laboratories are aimed at addressing the above challenge by heterogenizing homogeneous catalysts, which can be defined as the modification of homogeneous catalysts to render them in a separable (solid) phase from the starting materials and products. Specifically, we grow the small nanoclusters in dendrimers, a class of uniform polymers with the connectivity of fractal trees and generally radial symmetry. Thanks to their dense multivalency, shape persistence, and structural uniformity, dendrimers have proven to be versatile scaffolds for the synthesis and stabilization of small nanoclusters. Then these dendrimer-encapsulated metal clusters (DEMCs) are adsorbed onto mesoporous silica. Through this method, we have achieved selective transformations that had been challenging to accomplish in a heterogeneous setting, e.g., π-bond activation and aldol reactions. Extensive investigation into the catalytic systems under reaction conditions allowed us to correlate the structural features (e.g., oxidation states) of the catalysts and their activity. Moreover, we have demonstrated that supported DEMCs are also excellent catalysts for typical heterogeneous reactions, including hydrogenation and alkane isomerization. Critically, these investigations also confirmed that the supported DEMCs are heterogeneous and stable against leaching. Catalyst optimization is achieved through the modulation of various parameters. The clusters are oxidized (e.g., with PhICl2) or reduced (e.g., with H2) in situ. Changing the dendrimer properties (e.g., generation, terminal functional groups) is analogous to ligand modification in homogeneous catalysts, which affects both catalytic activity and selectivity. Similarly, pore size of the support is another factor in determining product distribution. In a flow reactor, the flow rate is adjusted to control the residence time of the starting material and intermediates, and thus the final product selectivity. Our approach to heterogeneous catalysis affords various advantages: (1) the catalyst system can tap into the reactivity typical of homogeneous catalysts, which conventional heterogeneous catalysts could not achieve; (2) unlike most homogeneous catalysts with comparable performance, the heterogenized homogeneous catalysts can be recycled; (3) improved activity or selectivity compared to conventional homogeneous catalysts is possible because of uniquely heterogeneous parameters for optimization. In this Account, we will briefly introduce metal clusters and describe the synthesis and characterizations of supported DEMCs.
We will present the catalysis studies of supported DEMCs in both the batch and flow modes. Lastly, we will summarize the current state of heterogenizing homogeneous catalysis and provide future directions for this area of research.

  2. Reconstituted asbestos matrix for fuel cells

    NASA Technical Reports Server (NTRS)

    Mcbryar, H.

    1975-01-01

    Method is described for reprocessing commercially available asbestos matrix stock to yield greater porosity and bubble pressure (due to increased surface tension), improved homogeneity, and greater uniformity.

  3. Understanding homogeneous nucleation in solidification of aluminum by molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Mahata, Avik; Asle Zaeem, Mohsen; Baskes, Michael I.

    2018-02-01

    Homogeneous nucleation from aluminum (Al) melt was investigated by million-atom molecular dynamics simulations utilizing the second nearest neighbor modified embedded atom method potentials. The natural spontaneous homogeneous nucleation from the Al melt was produced without any influence of pressure, free surface effects and impurities. Initially, isothermal crystal nucleation from the undercooled melt was studied at different constant temperatures; later, superheated Al melt was quenched with different cooling rates. The crystal structure of nuclei, critical nucleus size, critical temperature for homogeneous nucleation, induction time, and nucleation rate were determined. The quenching simulations clearly revealed three temperature regimes: sub-critical nucleation, super-critical nucleation, and solid-state grain growth regimes. The main crystalline phase was identified as face-centered cubic, but a hexagonal close-packed (hcp) and an amorphous solid phase were also detected. The hcp phase was created due to the formation of stacking faults during solidification of the Al melt. By slowing down the cooling rate, the volume fraction of the hcp and amorphous phases decreased. After the simulation box was completely solid, grain growth was simulated and the grain growth exponent was determined for different annealing temperatures.

  4. Two-dimensional arbitrarily shaped acoustic cloaks composed of homogeneous parts

    NASA Astrophysics Data System (ADS)

    Li, Qi; Vipperman, Jeffrey S.

    2017-10-01

    Acoustic cloaking is an important application of acoustic metamaterials. Although the topic has received much attention, there are a number of areas where contributions are needed. In this paper, a design method for producing acoustic cloaks with arbitrary shapes that are composed of homogeneous parts is presented. The cloak is divided into sections, each of which, in turn, is further divided into two parts, followed by the application of transformation acoustics to derive the required properties for cloaking. With the proposed mapping relations, the properties of each part of the cloak are anisotropic but homogeneous, which can be realized using two alternating layers of homogeneous and isotropic materials. A hexagonal and an irregular cloak are presented as design examples. The full wave simulations using COMSOL Multiphysics finite element software show that the cloaks function well at reducing reflections and shadows. The variation of the cloak properties is investigated as a function of three important geometric parameters used in the transformations. A balance can be found between cloaking performance and materials properties that are physically realizable.

  5. Comparison of directly compressed vitamin B12 tablets prepared from micronized rotary-spun microfibers and cast films.

    PubMed

    Sebe, István; Bodai, Zsolt; Eke, Zsuzsanna; Kállai-Szabó, Barnabás; Szabó, Péter; Zelkó, Romána

    2015-01-01

    Fiber-based dosage forms are potential alternatives to conventional dosage forms because of the improved extent and rate of drug dissolution. Rotary-spun polymer fibers and cast films were prepared and micronized to allow direct compression after homogenization with tabletting excipients. The particle size distribution of powder mixtures of micronized fibers and films homogenized with tabletting excipients was determined by a laser scattering particle size distribution analyzer. The powder rheological behavior of the mixtures containing micronized fibers and cast films was also compared. Positron annihilation lifetime spectroscopy was applied for the microstructural characterization of micronized fibers and films. The release of the water-soluble vitamin B12 from the compressed tablets was determined. It was confirmed that the rotary spinning method resulted in a homogeneous, supramolecularly ordered powder mixture, which was successfully compressed after homogenization with conventional tabletting excipients. The obtained directly compressed tablets showed uniform drug release with low variation. The results highlight the novel application of micronized rotary-spun fibers as an intermediate for further processing, preserving the original favorable powder characteristics of fibrous systems.

  6. Homogeneity tests of clustered diagnostic markers with applications to the BioCycle Study

    PubMed Central

    Tang, Liansheng Larry; Liu, Aiyi; Schisterman, Enrique F.; Zhou, Xiao-Hua; Liu, Catherine Chun-ling

    2014-01-01

    Diagnostic trials often require the use of a homogeneity test among several markers. Such a test may be necessary to determine the power both during the design phase and in the initial analysis stage. However, no formal method is available for the power and sample size calculation when the number of markers is greater than two and marker measurements are clustered in subjects. This article presents two procedures for testing the accuracy among clustered diagnostic markers. The first procedure is a test of homogeneity among continuous markers based on a global null hypothesis of the same accuracy. The result under the alternative provides the explicit distribution for the power and sample size calculation. The second procedure is a simultaneous pairwise comparison test based on weighted areas under the receiver operating characteristic curves. This test is particularly useful if a global difference among markers is found by the homogeneity test. We apply our procedures to the BioCycle Study designed to assess and compare the accuracy of hormone and oxidative stress markers in distinguishing women with ovulatory menstrual cycles from those without. PMID:22733707
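
    One simple (if crude) way to compare the accuracy of two markers measured on the same subjects is to contrast their empirical AUCs and resample whole subjects, i.e. clusters, in a bootstrap. The sketch below is a generic illustration of that idea with invented data; it is not the weighted-AUC or explicit-distribution procedure developed in the paper.

```python
import numpy as np

def auc(scores, labels):
    """Empirical AUC via the Mann-Whitney statistic (ties get half credit)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    pos, neg = scores[labels == 1], scores[labels == 0]
    diff = pos[:, None] - neg[None, :]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / (pos.size * neg.size)

def paired_auc_test(m1, m2, labels, n_boot=2000, seed=0):
    """Bootstrap test of H0: AUC(marker 1) == AUC(marker 2), resampling whole
    subjects (clusters) so the pairing of the two markers is preserved."""
    rng = np.random.default_rng(seed)
    m1, m2, labels = np.asarray(m1), np.asarray(m2), np.asarray(labels)
    observed = auc(m1, labels) - auc(m2, labels)
    diffs, n = [], labels.size
    while len(diffs) < n_boot:
        idx = rng.integers(0, n, n)
        if labels[idx].min() == labels[idx].max():
            continue                      # resample lacks one class; skip it
        diffs.append(auc(m1[idx], labels[idx]) - auc(m2[idx], labels[idx]))
    diffs = np.asarray(diffs)
    p = 2.0 * min((diffs <= 0).mean(), (diffs >= 0).mean())  # crude two-sided p
    return observed, min(p, 1.0)

# invented data: two markers on 40 subjects, label 1 = diseased
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 40)
marker1 = 1.2 * y + rng.normal(size=40)
marker2 = 0.4 * y + rng.normal(size=40)
print(paired_auc_test(marker1, marker2, y))
```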

  7. Resolving Rapid Variation in Energy for Particle Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haut, Terry Scot; Ahrens, Cory Douglas; Jonko, Alexandra

    2016-08-23

    Resolving the rapid variation in energy in neutron and thermal radiation transport is needed for the predictive simulation capability in high-energy density physics applications. Energy variation is difficult to resolve due to rapid variations in cross sections and opacities caused by quantized energy levels in the nuclei and electron clouds. In recent work, we have developed a new technique to simultaneously capture slow and rapid variations in the opacities and the solution using homogenization theory, which is similar to multiband (MB) and to the finite-element with discontiguous support (FEDS) method, but does not require closure information. We demonstrated the accuracy and efficiency of the method for a variety of problems. We are researching how to extend the method to problems with multiple materials and the same material but with different temperatures and densities. In this highlight, we briefly describe homogenization theory and some results.

  8. Parallel Excitation for B-Field Insensitive Fat-Saturation Preparation

    PubMed Central

    Heilman, Jeremiah A.; Derakhshan, Jamal D.; Riffe, Matthew J.; Gudino, Natalia; Tkach, Jean; Flask, Chris A.; Duerk, Jeffrey L.; Griswold, Mark A.

    2016-01-01

    Multichannel transmission has the potential to improve many aspects of MRI through a new paradigm in excitation. In this study, multichannel transmission is used to address the effects that variations in B0 homogeneity have on fat-saturation preparation through the use of the frequency, phase, and amplitude degrees of freedom afforded by independent transmission channels. B1 homogeneity is intrinsically included via use of coil sensitivities in calculations. A new method, parallel excitation for B-field insensitive fat-saturation preparation, can achieve fat saturation in 89% of voxels with Mz ≤ 0.1 in the presence of ±4 ppm B0 variation, where traditional CHESS methods achieve only 40% in the same conditions. While there has been much progress to apply multichannel transmission at high field strengths, particular focus is given here to application of these methods at 1.5 T. PMID:22247080

  9. Fast computation of radiation pressure force exerted by multiple laser beams on red blood cell-like particles

    NASA Astrophysics Data System (ADS)

    Gou, Ming-Jiang; Yang, Ming-Lin; Sheng, Xin-Qing

    2016-10-01

    Mature red blood cells (RBCs) do not contain nuclei or large complex organelles, which allows them to be regarded approximately as homogeneous medium particles. To compute the radiation pressure force (RPF) exerted by multiple laser beams on this kind of arbitrarily shaped homogeneous nano-particles, a fast electromagnetic optics method is demonstrated. In general, based on Maxwell's equations, the matrix equation formed by the method of moments (MOM) has many right-hand sides (RHSs) corresponding to the different laser beams. To accelerate the solution of the matrix equation, the algorithm performs a low-rank decomposition of the excitation matrix consisting of all RHSs to identify the so-called skeleton laser beams by interpolative decomposition (ID). After the solutions corresponding to the skeletons are obtained, the desired responses can be reconstructed efficiently. Numerical results are presented to validate the developed method.
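
    The skeletonization idea can be sketched with a column-pivoted QR factorization standing in for the interpolative decomposition: pick a few skeleton excitation columns, solve the dense system only for those right-hand sides, and reconstruct the remaining solutions by linear superposition. The matrix sizes, rank, and toy impedance matrix below are invented for illustration.

```python
import numpy as np
from scipy.linalg import qr, lu_factor, lu_solve

rng = np.random.default_rng(0)

# stand-ins: a dense MoM-style impedance matrix Z (n x n) and an excitation
# matrix B whose m columns (right-hand sides) are numerically low-rank, as
# happens when many incident beams are smooth variations of one another
n, m, k = 300, 80, 12
Z = rng.standard_normal((n, n)) + n * np.eye(n)          # well-conditioned toy
B = rng.standard_normal((n, k)) @ rng.standard_normal((k, m))   # rank-k RHS block

# 1) pick k "skeleton" excitations with a column-pivoted QR of B
#    (a simple stand-in for the interpolative decomposition used in the paper)
_, _, piv = qr(B, mode="economic", pivoting=True)
skel = piv[:k]

# 2) express every column of B in the basis of the skeleton columns
coeff, *_ = np.linalg.lstsq(B[:, skel], B, rcond=None)   # (k x m) projection

# 3) solve the dense system only for the skeleton right-hand sides ...
lu = lu_factor(Z)
X_skel = lu_solve(lu, B[:, skel])

# 4) ... and reconstruct all m solutions by superposition (linearity of Z x = b)
X = X_skel @ coeff
print("max reconstruction error:", np.max(np.abs(Z @ X - B)))
```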

  10. Temperature Profile in Fuel and Tie-Tubes for Nuclear Thermal Propulsion Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vishal Patel

    A finite element method to calculate temperature profiles in heterogeneous geometries of tie-tube moderated LEU nuclear thermal propulsion systems and HEU designs with tie-tubes is developed and implemented in MATLAB. This new method is compared to previous methods to demonstrate shortcomings in those methods. Typical methods to analyze peak fuel centerline temperature in hexagonal geometries rely on spatial homogenization to derive an analytical expression. These methods are not applicable to cores with tie-tube elements because conduction to tie-tubes cannot be accurately modeled with the homogenized models. The fuel centerline temperature directly impacts safety and performance so it must be predicted carefully. The temperature profile in tie-tubes is also important when high temperatures are expected in the fuel because conduction to the tie-tubes may cause melting in tie-tubes, which may set maximum allowable performance. Estimations of maximum tie-tube temperature can be found from equivalent tube methods, however this method tends to be approximate and overly conservative. A finite element model of heat conduction on a unit cell can model spatial dependence and non-linear conductivity for fuel and tie-tube systems allowing for higher design fidelity of Nuclear Thermal Propulsion.
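
    A minimal sketch of the kind of conduction calculation described, reduced to one dimension: a rod with a heat-generating "fuel" region and a higher-conductivity "tie-tube-like" region, discretized with linear finite elements and fixed end temperatures. All geometry, material properties, and boundary values are illustrative, not design data.

```python
import numpy as np

# 1-D steady conduction  -d/dx( k(x) dT/dx ) = q(x)  on [0, L], linear elements
L, n_el = 0.02, 200                       # 2 cm domain, 200 elements
x = np.linspace(0.0, L, n_el + 1)
mid = 0.5 * (x[:-1] + x[1:])              # element midpoints
k = np.where(mid < 0.012, 30.0, 120.0)    # W/m-K: "fuel" region | "tie-tube" region
q = np.where(mid < 0.012, 5.0e8, 0.0)     # W/m^3: heat source in the fuel only
h = np.diff(x)

A = np.zeros((n_el + 1, n_el + 1))
b = np.zeros(n_el + 1)
for e in range(n_el):                     # assemble element stiffness and load
    ke = k[e] / h[e] * np.array([[1.0, -1.0], [-1.0, 1.0]])
    fe = q[e] * h[e] / 2.0 * np.array([1.0, 1.0])
    A[e:e+2, e:e+2] += ke
    b[e:e+2] += fe

# Dirichlet boundary temperatures at both ends (e.g. coolant-side temperatures)
for node, T_bc in ((0, 600.0), (n_el, 400.0)):
    A[node, :] = 0.0
    A[node, node] = 1.0
    b[node] = T_bc

T = np.linalg.solve(A, b)
print("peak temperature %.0f K at x = %.4f m" % (T.max(), x[np.argmax(T)]))
```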

  11. Electrodeposition of reduced graphene oxide with chitosan based on the coordination deposition method

    PubMed Central

    Liu, Mingyang; Qin, Chaoran; Zhang, Zheng; Ma, Shuai; Cai, Xiuru; Li, Xueqian

    2018-01-01

    The electrodeposition of graphene has drawn considerable attention due to its appealing applications for sensors, supercapacitors and lithium-ion batteries. However, there are still some limitations in the current electrodeposition methods for graphene. Here, we present a novel electrodeposition method for the direct deposition of reduced graphene oxide (rGO) with chitosan. In this method, a 2-hydroxypropyltrimethylammonium chloride-based chitosan-modified rGO material was prepared. This material disperses homogeneously in the chitosan solution, forming a deposition solution with good dispersion stability. Subsequently, the modified rGO material was deposited on an electrode through codeposition with chitosan, based on the coordination deposition method. After electrodeposition, homogeneous rGO/chitosan films can be generated on copper or silver electrodes or substrates. The electrodeposition method allows for the convenient and controlled creation of rGO/chitosan nanocomposite coatings and films of different shapes and thicknesses. It also introduces a new method of creating films, as they can be peeled completely from the electrodes. Moreover, this method allows an rGO/chitosan film to be deposited directly onto an electrode, which can then be used for electrochemical detection. PMID:29765797

  12. Advanced nodal neutron diffusion method with space-dependent cross sections: ILLICO-VX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajic, H.L.; Ougouag, A.M.

    1987-01-01

    Advanced transverse integrated nodal methods for neutron diffusion developed since the 1970s require that node- or assembly-homogenized cross sections be known. The underlying structural heterogeneity can be accurately accounted for in homogenization procedures by the use of heterogeneity or discontinuity factors. Other (milder) types of heterogeneity, burnup-induced or due to thermal-hydraulic feedback, can be resolved by explicitly accounting for the spatial variations of material properties. This can be done during the nodal computations via nonlinear iterations. The new method has been implemented in the code ILLICO-VX (ILLICO variable cross-section method). Numerous numerical tests were performed. As expected, the convergence rate of ILLICO-VX is lower than that of ILLICO, requiring approximately 30% more outer iterations per k_eff computation. The methodology has also been implemented as the NOMAD-VX option of the NOMAD multicycle, multigroup, two- and three-dimensional nodal diffusion depletion code. The burnup-induced heterogeneities (space dependence of cross sections) are calculated during the burnup steps.

  13. A Homogeneous Time-Resolved Fluorescence Immunoassay Method for the Measurement of Compound W

    PubMed Central

    Huang, Biao; Yu, Huixin; Bao, Jiandong; Zhang, Manda; Green, William L; Wu, Sing-Yung

    2018-01-01

    Objective: Using compound W (a 3,3′-diiodothyronine sulfate [T2S] immuno-crossreactive material)-specific polyclonal antibodies and homogeneous time-resolved fluorescence immunoassay (AlphaLISA) techniques to establish an indirect competitive compound W (ICW) quantitative detection method. Method: Photosensitive particles (donor beads) coated with compound W or T2S and rabbit anti-W antibody were incubated with biotinylated goat anti-rabbit antibody; together with streptavidin-coated acceptor particles, this constitutes the detection system. We optimized the test conditions and evaluated the detection performance. Results: The sensitivity of the method was 5 pg/mL, and the detection range was 5 to 10 000 pg/mL. The intra-assay coefficient of variation averaged <10% with stable reproducibility. Conclusions: The ICW-AlphaLISA shows good stability and high sensitivity and can measure a wide range of compound W levels in extracts of maternal serum samples. This may have clinical application in screening for congenital hypothyroidism in utero. PMID:29449777

  14. Conventional and dense gas techniques for the production of liposomes: a review.

    PubMed

    Meure, Louise A; Foster, Neil R; Dehghani, Fariba

    2008-01-01

    The aim of this review is to compare the potential of various techniques developed for the production of homogeneous, stable liposomes. Traditional techniques, such as the Bangham, detergent depletion, ether/ethanol injection, reverse-phase evaporation, and emulsion methods, were compared with the more recently developed techniques for liposome formation. The major hurdles for scaling up the traditional methods are the consumption of large quantities of volatile organic solvent, the stability and homogeneity of the liposomal product, and the lengthy multiple steps involved. The new methods have been designed to alleviate these issues in liposome formulation. Dense gas liposome techniques are still in their infancy; however, they have remarkable advantages in reducing the use of organic solvents, providing fast, single-stage production, and producing stable, uniform liposomes. Techniques such as the membrane contactor and heating methods are also promising, as they eliminate the use of organic solvent; however, high temperature is still required for processing.

  15. Efficient implicit LES method for the simulation of turbulent cavitating flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Egerer, Christian P., E-mail: christian.egerer@aer.mw.tum.de; Schmidt, Steffen J.; Hickel, Stefan

    2016-07-01

    We present a numerical method for efficient large-eddy simulation of compressible liquid flows with cavitation based on an implicit subgrid-scale model. Phase change and subgrid-scale interface structures are modeled by a homogeneous mixture model that assumes local thermodynamic equilibrium. Unlike previous approaches, emphasis is placed on operating on a small stencil (at most four cells). The truncation error of the discretization is designed to function as a physically consistent subgrid-scale model for turbulence. We formulate a sensor functional that detects shock waves or pseudo-phase boundaries within the homogeneous mixture model for localizing numerical dissipation. In smooth regions of the flow field, a formally non-dissipative central discretization scheme is used in combination with a regularization term to model the effect of unresolved subgrid scales. The new method is validated by computing standard single- and two-phase test cases. Comparison of results for a turbulent cavitating mixing layer obtained with the new method demonstrates its suitability for the target applications.

  16. Testing the ISP method with the PARIO device: Accuracy of results and influence of homogenization technique

    NASA Astrophysics Data System (ADS)

    Durner, Wolfgang; Huber, Magdalena; Yangxu, Li; Steins, Andi; Pertassek, Thomas; Göttlein, Axel; Iden, Sascha C.; von Unold, Georg

    2017-04-01

    The particle-size distribution (PSD) is one of the main properties of soils. To determine the proportions of the fine fractions silt and clay, sedimentation experiments are used; most common are the Pipette and Hydrometer methods. Both need manual sampling at specific times and are thus time-demanding and reliant on experienced operators. Durner et al. (Durner, W., S.C. Iden, and G. von Unold (2017): The integral suspension pressure method (ISP) for precise particle-size analysis by gravitational sedimentation, Water Resources Research, doi:10.1002/2016WR019830) recently developed the integral suspension pressure (ISP) method, which is implemented in the METER Group device PARIO™. This new method estimates continuous PSDs from sedimentation experiments by recording the temporal evolution of the suspension pressure at a certain measurement depth in a sedimentation cylinder. It requires no manual interaction after start and thus no specialized training of the lab personnel. The aim of this study was to test the precision and accuracy of the new method with a variety of materials, to answer the following research questions: (1) Are the results obtained by PARIO reliable and stable? (2) Are the results affected by the initial mixing technique used to homogenize the suspension, or by the presence of sand in the experiment? (3) Are the results identical to those obtained with the Pipette method as reference method? The experiments were performed with a pure quartz silt material and four real soil materials. PARIO measurements were done repetitively on the same samples in a temperature-controlled lab to characterize the repeatability of the measurements. Subsequently, the samples were investigated by the Pipette method to validate the results. We found that the statistical error for the silt fraction from replicate and repetitive measurements was in the range of 1% for the quartz material to 3% for soil materials. Since the sand fractions, as in any sedimentation method, must be measured explicitly and are used as fixed parameters in the PARIO evaluation, the error of the clay fraction is determined by error propagation from the sand and silt fractions. Homogenization of the suspension by overhead shaking gave lower reproducibility and smaller silt fractions than vertical stirring. However, it turned out that vertical stirring must be performed with sufficient rigour to obtain a fully homogeneous initial distribution. Analysis of material sieved to <2000 μm and to <200 μm gave equal results, i.e., there was no hint of dragging effects of large particles. Complete removal of the sand fraction, i.e., sieving to <63 μm, led to less silt, probably due to a loss of fine material in the sieving process. The PSDs obtained with PARIO corresponded very well with the results of the Pipette method.
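
    Since the clay fraction is obtained by difference from the explicitly measured sand fraction and the ISP-estimated silt fraction, its uncertainty follows by simple error propagation, as in the minimal sketch below (the numerical values are hypothetical, chosen only to match the 1-3% silt repeatability reported above).

    ```python
    # Illustrative only: propagate sand- and silt-fraction uncertainties to the
    # clay fraction when clay is obtained by difference (clay = 100 - sand - silt).
    import math

    sand, sigma_sand = 35.0, 0.5    # mass-%, hypothetical values
    silt, sigma_silt = 45.0, 2.0    # silt repeatability of ~1-3 % as reported

    clay = 100.0 - sand - silt
    # assuming independent errors, the variances add
    sigma_clay = math.sqrt(sigma_sand**2 + sigma_silt**2)
    print(f"clay = {clay:.1f} +/- {sigma_clay:.1f} mass-%")
    ```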

  17. Mechanical and Tear Properties of Fabric/Film Laminates

    NASA Technical Reports Server (NTRS)

    Said, Magdi A.

    1998-01-01

    Films reinforced with woven fabrics are being considered for the development of a material suitable for long-duration scientific balloons under a program managed by the National Aeronautics and Space Administration (NASA). Recently developed woven fabrics provide a relatively high strength-to-weight ratio compared to standard homogeneous films. Woven fabrics also have better crack propagation resistance and rip-stop capability than homogeneous lightweight, high-strength polymeric films such as polyester and nylon. Where joining is required, as in the case of scientific balloons, woven fabrics have the advantage over polymeric thin films that traditional textile methods can be used, as well as other techniques including hot sealing, adhesion, and ultrasonic means. Woven fabrics, however, lack the barrier properties needed for helium-filled scientific balloons; lamination with homogeneous films is therefore required to provide the gas barrier capability these applications demand.

  18. Economic regionalization and choice of strategic development directions of municipalities of the Republic of Tatarstan

    NASA Astrophysics Data System (ADS)

    Panasyuk, M. V.

    2018-01-01

    This paper presents the results of economic regionalization and zoning of the Republic of Tatarstan conducted in 2017. The most recent experience of economic regionalization and zoning of the Republic of Tatarstan in 2007-2015 is reviewed. The economic regionalization problem is solved on the basis of a new method and algorithm that uses quantitative measures characterizing spatial and economic features of the generated economic regions, including their internal and average connectivity, homogeneity, compactness, socio-economic development level, and the quality of life of the population. Three nodal economic regions and one homogeneous economic region in the Republic of Tatarstan were identified. The results of economic zoning within the homogeneous economic region led to the conclusion that there are two existing economic zones. They have the potential for developing a new economic growth pole and three economic centers (growth points) specializing in the agro-industrial sector.

  19. Exact image theory for the problem of dielectric/magnetic slab

    NASA Technical Reports Server (NTRS)

    Lindell, I. V.

    1987-01-01

    The exact image method, recently introduced for the exact solution of electromagnetic field problems involving homogeneous half-spaces and microstrip-like geometries, is developed for the problem of a homogeneous slab of dielectric and/or magnetic material in free space. Expressions for the image sources, creating the exact reflected and transmitted fields, are given and their numerical evaluation is demonstrated. Nonradiating modes, guided by the slab and responsible for the loss of convergence of the image functions, are considered and extracted. The theory allows, for example, an analysis of finite ground planes in microstrip antenna structures.

  20. [In vitro metabolism of fenbendazole prodrug].

    PubMed

    Wen, Ai-Dan; Duan, Li-Ping; Liu, Cong-Shan; Tao, Yi; Xue, Jian; Wu, Ning-Bo; Jiang, Bin; Zhang, Hao-Bing

    2013-02-01

    The synthesized fenbendazole prodrug N-methoxycarbonyl-N'-(2-nitro-4-phenylthiophenyl) thiourea (MPT) was analyzed in vitro in artificial gastric juice, artificial intestinal juice, and a mouse liver homogenate model by HPLC, and metabolic curves were then generated. MPT was also tested against Echinococcus granulosus protoscolices in vitro. The results showed that MPT could be metabolized in all three biological media and was converted to the active compound fenbendazole in liver homogenate, with a metabolic rate of 7.92%. In addition, the prodrug showed weak activity against E. granulosus protoscolices, with a mortality of 45.9%.

  1. Semi-supervised clustering for parcellating brain regions based on resting state fMRI data

    NASA Astrophysics Data System (ADS)

    Cheng, Hewei; Fan, Yong

    2014-03-01

    Many unsupervised clustering techniques have been adopted for parcellating brain regions of interest into functionally homogeneous subregions based on resting state fMRI data. However, unsupervised clustering techniques are not able to take advantage of existing knowledge of the functional neuroanatomy readily available from studies of cytoarchitectonic parcellation or meta-analysis of the literature. In this study, we propose a semi-supervised clustering method for parcellating the amygdala into functionally homogeneous subregions based on resting state fMRI data. In particular, the semi-supervised clustering is implemented under the framework of graph partitioning and adopts prior information and spatial consistency constraints to obtain a spatially contiguous parcellation result. The graph partitioning problem is solved using an efficient algorithm similar to the well-known weighted kernel k-means algorithm. Our method has been validated for parcellating the amygdala into 3 subregions based on resting state fMRI data of 28 subjects. The experimental results demonstrate that the proposed method is more robust than unsupervised clustering and able to parcellate the amygdala into centromedial, laterobasal, and superficial parts with improved functional homogeneity compared with the cytoarchitectonic parcellation result. The validity of the parcellation results is also supported by distinctive functional and structural connectivity patterns of the subregions and high consistency between coactivation patterns derived from a meta-analysis and functional connectivity patterns of the corresponding subregions.

  2. Calculation of Disease Dynamics in a Population of Households

    PubMed Central

    Ross, Joshua V.; House, Thomas; Keeling, Matt J.

    2010-01-01

    Early mathematical representations of infectious disease dynamics assumed a single, large, homogeneously mixing population. Over the past decade there has been growing interest in models consisting of multiple smaller subpopulations (households, workplaces, schools, communities), with the natural assumption of strong homogeneous mixing within each subpopulation, and weaker transmission between subpopulations. Here we consider a model of SIRS (susceptible-infectious-recovered-susceptible) infection dynamics in a very large (assumed infinite) population of households, with the simplifying assumption that each household is of the same size (although all methods may be extended to a population with a heterogeneous distribution of household sizes). For this households model we present efficient methods for studying several quantities of epidemiological interest: (i) the threshold for invasion; (ii) the early growth rate; (iii) the household offspring distribution; (iv) the endemic prevalence of infection; and (v) the transient dynamics of the process. We utilize these methods to explore a wide region of parameter space appropriate for human infectious diseases. We then extend these results to consider the effects of more realistic gamma-distributed infectious periods. We discuss how all these results differ from standard homogeneous-mixing models and assess the implications for the invasion, transmission and persistence of infection. The computational efficiency of the methodology presented here will hopefully aid in the parameterisation of structured models and in the evaluation of appropriate responses for future disease outbreaks. PMID:20305791
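
    For contrast with the household-structured results discussed above, the sketch below integrates the standard homogeneously mixing SIRS baseline (a single large population, no household structure). The parameter values are illustrative only and are not taken from the paper.

    ```python
    # Homogeneous-mixing SIRS baseline (illustrative parameters, not the
    # household model of the paper):
    #   dS/dt = w*R - b*S*I,  dI/dt = b*S*I - g*I,  dR/dt = g*I - w*R
    from scipy.integrate import solve_ivp

    beta, gamma, omega = 2.0, 1.0, 0.1   # transmission, recovery, waning rates

    def sirs(t, y):
        S, I, R = y
        return [omega * R - beta * S * I,
                beta * S * I - gamma * I,
                gamma * I - omega * R]

    sol = solve_ivp(sirs, (0.0, 200.0), [0.99, 0.01, 0.0])
    I_endemic = (1 - gamma / beta) / (1 + gamma / omega)   # analytic endemic prevalence
    print(f"R0 = {beta/gamma:.1f}, I(t=200) = {sol.y[1, -1]:.3f}, "
          f"analytic endemic prevalence = {I_endemic:.3f}")
    ```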

  3. Effective elastic moduli of triangular lattice material with defects

    NASA Astrophysics Data System (ADS)

    Liu, Xiaoyu; Liang, Naigang

    2012-10-01

    This paper presents an attempt to extend homogenization analysis to the effective elastic moduli of triangular lattice materials with microstructural defects. The proposed homogenization method adopts a process based on homogeneous strain boundary conditions, the micro-scale constitutive law and the micro-to-macro static operator to establish the relationship between the macroscopic properties of a given lattice material and its micro-discrete behaviors and structures. Further, the idea behind Eshelby's equivalent eigenstrain principle is introduced to replace a defect distribution by an imaginary displacement field (eigendisplacement) with the equivalent mechanical effect, and the triangular lattice Green's function technique is developed to solve for the eigendisplacement field. The proposed method therefore allows handling of different types of microstructural defects as well as their arbitrary spatial distribution within a general and compact framework. Analytical closed-form estimates are derived, in the dilute limit, for all the effective elastic moduli of stretch-dominated triangular lattices containing fractured cell walls and missing cells, respectively. Comparisons with numerical results, the Hashin-Shtrikman upper bounds, and uniform-strain upper bounds are also presented to illustrate the predictive capability of the proposed method for lattice materials. Based on this work, we propose that not only the effective Young's and shear moduli but also the effective Poisson's ratio of triangular lattice materials depend on the number density of fractured cell walls and their spatial arrangements.

  4. Statistical multi-path exposure method for assessing the whole-body SAR in a heterogeneous human body model in a realistic environment.

    PubMed

    Vermeeren, Günter; Joseph, Wout; Martens, Luc

    2013-04-01

    Assessing the whole-body absorption in a human in a realistic environment requires a statistical approach covering all possible exposure situations. This article describes the development of a statistical multi-path exposure method for heterogeneous realistic human body models. The method is applied for the 6-year-old Virtual Family boy (VFB) exposed to the GSM downlink at 950 MHz. It is shown that the whole-body SAR does not differ significantly over the different environments at an operating frequency of 950 MHz. Furthermore, the whole-body SAR in the VFB for multi-path exposure exceeds the whole-body SAR for worst-case single-incident plane wave exposure by 3.6%. Moreover, the ICNIRP reference levels are not conservative with the basic restrictions in 0.3% of the exposure samples for the VFB at the GSM downlink of 950 MHz. The homogeneous spheroid with the dielectric properties of the head suggested by the IEC underestimates the absorption compared to realistic human body models. Moreover, the variation in the whole-body SAR for realistic human body models is larger than for homogeneous spheroid models. This is mainly due to the heterogeneity of the tissues and the irregular shape of the realistic human body model compared to homogeneous spheroid human body models. Copyright © 2012 Wiley Periodicals, Inc.

  5. Intrinsic brain abnormalities in young healthy adults with childhood trauma: A resting-state functional magnetic resonance imaging study of regional homogeneity and functional connectivity.

    PubMed

    Lu, Shaojia; Gao, Weijia; Wei, Zhaoguo; Wang, Dandan; Hu, Shaohua; Huang, Manli; Xu, Yi; Li, Lingjiang

    2017-06-01

    Childhood trauma confers great risk for the development of multiple psychiatric disorders; however, the neural basis for this association is still unknown. The present resting-state functional magnetic resonance imaging study aimed to detect the effects of childhood trauma on brain function in a group of young healthy adults. In total, 24 healthy individuals with childhood trauma and 24 age- and sex-matched adults without childhood trauma were recruited. Each participant underwent resting-state functional magnetic resonance imaging scanning. Intra-regional brain activity was evaluated by regional homogeneity method and compared between groups. Areas with altered regional homogeneity were further selected as seeds in subsequent functional connectivity analysis. Statistical analyses were performed by setting current depression and anxiety as covariates. Adults with childhood trauma showed decreased regional homogeneity in bilateral superior temporal gyrus and insula, and the right inferior parietal lobule, as well as increased regional homogeneity in the right cerebellum and left middle temporal gyrus. Regional homogeneity values in the left middle temporal gyrus, right insula and right cerebellum were correlated with childhood trauma severity. In addition, individuals with childhood trauma also exhibited altered default mode network, cerebellum-default mode network and insula-default mode network connectivity when the left middle temporal gyrus, right cerebellum and right insula were selected as seed area, respectively. The present outcomes suggest that childhood trauma is associated with disturbed intrinsic brain function, especially the default mode network, in adults even without psychiatric diagnoses, which may mediate the relationship between childhood trauma and psychiatric disorders in later life.
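
    Regional homogeneity (ReHo) is commonly computed as Kendall's coefficient of concordance (KCC) over the time series of a voxel and its neighbours. The sketch below shows that formula on synthetic data; it is only an illustration of the metric, not the study's preprocessing or statistical pipeline, and assumes no tied ranks.

    ```python
    # Sketch of a regional-homogeneity (ReHo) value as Kendall's coefficient of
    # concordance (KCC) over K neighbouring voxel time series of length n.
    # Illustrative data; assumes no tied ranks.
    import numpy as np
    from scipy.stats import rankdata

    def kendalls_w(ts):                 # ts: shape (K, n) -> K voxels, n time points
        K, n = ts.shape
        ranks = np.apply_along_axis(rankdata, 1, ts)   # rank each time series
        Ri = ranks.sum(axis=0)                         # rank sums per time point
        S = np.sum((Ri - Ri.mean()) ** 2)
        return 12.0 * S / (K**2 * (n**3 - n))

    rng = np.random.default_rng(1)
    common = rng.standard_normal(100)
    cluster = common + 0.3 * rng.standard_normal((27, 100))  # 27-voxel neighbourhood
    print(f"ReHo (KCC) = {kendalls_w(cluster):.2f}")  # near 1 => homogeneous cluster
    ```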

  6. Evaluation of four methods for estimating leaf area of isolated trees

    Treesearch

    P.J. Peper; E.G. McPherson

    2003-01-01

    The accurate modeling of the physiological and functional processes of urban forests requires information on the leaf area of urban tree species. Several non-destructive, indirect leaf area sampling methods have shown good performance for homogenous canopies. These methods have not been evaluated for use in urban settings where trees are typically isolated and...

  7. Ultrasound assisted methods for enhanced extraction of phycobiliproteins from marine macro-algae, Gelidium pusillum (Rhodophyta).

    PubMed

    Mittal, Rochak; Tavanandi, Hrishikesh A; Mantri, Vaibhav A; Raghavarao, K S M S

    2017-09-01

    Extraction of phycobiliproteins (R-phycoerythrin, R-PE, and R-phycocyanin, R-PC) from macro-algae is difficult due to the presence of large polysaccharides (agar, cellulose, etc.) in the cell wall, which offer major hindrance to cell disruption. The present study is aimed at developing the most suitable methodology for the primary extraction of R-PE and R-PC from the marine macro-alga Gelidium pusillum (Stackhouse) Le Jolis. Such extraction of phycobiliproteins using ultrasonication and other conventional methods such as maceration, maceration in the presence of liquid nitrogen, homogenization, and freezing and thawing (alone and in combinations) is reported for the first time. Standardization of ultrasonication for different parameters such as ultrasonication amplitude (60, 90 and 120 µm) and ultrasonication time (1, 2, 4, 6, 8 and 10 min) at different temperatures (30, 35 and 40 °C) was carried out. Kinetic parameters were estimated for extraction of phycobiliproteins by ultrasonication based on second-order mass transfer kinetics. Based on calorimetric measurements, the power, ultrasound intensity and acoustic power density were estimated to be 41.97 W, 14.81 W/cm² and 0.419 W/cm³, respectively. A synergistic effect of ultrasonication was observed when it was employed in combination with other conventional primary extraction methods. Homogenization in combination with ultrasonication resulted in an enhancement in efficiency of 9.3% over homogenization alone. Similarly, maceration in combination with ultrasonication resulted in an enhancement in efficiency of 31% over maceration alone. Among all the methods employed, maceration in combination with ultrasonication resulted in the highest extraction efficiency of 77 and 93% for R-PE and R-PC, respectively, followed by homogenization in combination with ultrasonication (69.6% for R-PE and 74.1% for R-PC). HPLC analysis was carried out in order to ensure that R-PE was present in the extract and remained intact even after processing. Microscopic studies indicated a clear relation between the extraction efficiency of phycobiliproteins and the degree of cell disruption in a given primary extraction method. These combination methods were found to be effective for extraction of phycobiliproteins from the rigid biomass of Gelidium pusillum and can also be employed for downstream processing of biomolecules from other macro-algae. Copyright © 2017 Elsevier B.V. All rights reserved.
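
    The calorimetric quantities quoted above follow from P = m·cp·dT/dt and simple geometric normalisation. The sketch below shows the arithmetic only; the mass and heating rate are hypothetical placeholders (not the study's measured inputs), chosen to reproduce the reported order of magnitude, and the tip area and sample volume are back-calculated from the reported intensity and power density.

    ```python
    # Hypothetical inputs chosen to reproduce the reported order of magnitude:
    # power from calorimetry, then intensity (per probe-tip area) and acoustic
    # power density (per sample volume). Not the authors' measured values.
    cp_water = 4186.0          # J/(kg K)
    mass = 0.1                 # kg of suspension (assumed)
    dT_dt = 0.1                # K/s temperature rise rate (assumed)

    power = mass * cp_water * dT_dt          # W  -> ~42 W
    tip_area = power / 14.81                 # cm^2 implied by the reported intensity
    sample_volume = power / 0.419            # cm^3 implied by the reported density

    print(f"P = {power:.1f} W, implied tip area = {tip_area:.1f} cm^2, "
          f"implied volume = {sample_volume:.0f} cm^3")
    ```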

  8. Combining Heterogeneous Correlation Matrices: Simulation Analysis of Fixed-Effects Methods

    ERIC Educational Resources Information Center

    Hafdahl, Adam R.

    2008-01-01

    Monte Carlo studies of several fixed-effects methods for combining and comparing correlation matrices have shown that two refinements improve estimation and inference substantially. With rare exception, however, these simulations have involved homogeneous data analyzed using conditional meta-analytic procedures. The present study builds on…

  9. Extraction and characterization of corn germ proteins

    USDA-ARS?s Scientific Manuscript database

    Our study was conducted to develop methods to extract corn germ protein economically and characterize and identify potential applications of the recovered protein. Protein was extracted from both wet germ and finished (dried) germ using 0.1M NaCl as solvent. The method involved homogenization, sti...

  10. Abel's Theorem Simplifies Reduction of Order

    ERIC Educational Resources Information Center

    Green, William R.

    2011-01-01

    We give an alternative to the standard method of reduction of order, in which one uses one solution of a homogeneous, linear, second-order differential equation to find a second, linearly independent solution. Our method, based on Abel's Theorem, is shorter, less complex, and extends to higher-order equations.

  11. Anti-dsDNA antibodies in systemic lupus erythematosus: A combination of two quantitative methods and the ANA pattern is the most efficient strategy of detection.

    PubMed

    Almeida González, Delia; Roces Varela, Alfredo; Marcelino Rodríguez, Itahisa; González Vera, Alexander; Delgado Sánchez, Mónica; Aznar Esquivel, Antonio; Casañas Rodríguez, Carlos; Cabrera de León, Antonio

    2015-12-01

    Several methods have been used to measure anti-double-stranded DNA autoantibodies (anti-dsDNA). Our aim was to determine the most efficient strategy to test anti-dsDNA in systemic lupus erythematosus (SLE). In this study, anti-dsDNA and anti-nuclear antibody (ANA) tests were requested for 644 patients. Anti-dsDNA was tested by RIA, ELISA and CLIA in all patients. The results indicated that 78 patients had a positive anti-dsDNA test according to at least one of the methods. After a 3-year follow-up period only 26 patients were diagnosed with SLE. We evaluated each method and each combination of methods. Specificity and positive predictive value (PPV) increased with the number of assay methods used (p=0.002 for trend), and PPV was 100% in patients whose results were positive by all three anti-dsDNA assay methods. The proportion of anti-dsDNA-positive patients who had SLE was highest (82%; p < 0.001) among those with a homogeneous pattern of ANA staining, followed by those with a speckled pattern. In ANA-positive patients, when only RIA was considered, 59% of anti-dsDNA-positive patients had SLE, but when RIA and CLIA were both considered, all patients with positive results on both tests had SLE. The combination of RIA+CLIA in patients with homogeneous and speckled ANA staining showed a similar cost and higher sensitivity than RIA alone in ANA-positive patients (p < 0.001). We conclude that the most efficient strategy was to combine simultaneously two quantitative and sensitive methods, but only in patients with a homogeneous or speckled pattern of ANA staining. This approach maximized specificity and PPV, and reduced costs. Copyright © 2015 Elsevier B.V. All rights reserved.
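
    The diagnostic metrics compared above (sensitivity, specificity, PPV) are simple functions of a 2x2 table. The sketch below shows the calculation; the counts are hypothetical, constrained only by the 644-patient / 26-SLE totals reported in the abstract, and do not reproduce the study's actual contingency tables.

    ```python
    # Diagnostic metrics from a 2x2 table (illustrative counts, not the study data).
    def diagnostics(tp, fp, fn, tn):
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        ppv = tp / (tp + fp)
        return sens, spec, ppv

    # Hypothetical combined-test rule applied to 644 patients, 26 of whom have SLE
    tp, fp = 24, 0        # positives by the combined rule (assumed counts)
    fn, tn = 2, 618
    sens, spec, ppv = diagnostics(tp, fp, fn, tn)
    print(f"sensitivity = {sens:.2f}, specificity = {spec:.3f}, PPV = {ppv:.2f}")
    ```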

  12. Characterization of NIES CRM No. 23 Tea Leaves II for the determination of multielements.

    PubMed

    Mori, Ikuko; Ukachi, Miyuki; Nagano, Kimiyo; Ito, Hiroyasu; Yoshinaga, Jun; Nishikawa, Masataka

    2010-05-01

    A candidate environmental certified reference material (CRM) for the determination of multielements in tea leaves and materials of similar matrix, NIES CRM No. 23 Tea Leaves II, has been developed and characterized by the National Institute for Environmental Studies (NIES), Japan. The origin of the material was tea leaves, which were ground, sieved through a 106-μm mesh, homogenized, and then subdivided into amber glass bottles. The results of homogeneity and stability tests indicated that the material was sufficiently homogeneous and stable for use as a reference material. The property values of the material were statistically determined based on chemical analyses by a network of laboratories using a wide range of methods. Sixteen laboratories participated in the characterization, and nine certified values and five reference values were obtained. These property values of the candidate CRM, which are expressed as mass fractions, were close to the median and/or mean values of the mass fractions of elements in various tea products. The candidate CRM is appropriate for use in analytical quality control and in the evaluation of methods used in the analysis of tea and materials of similar matrix.
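
    Between-bottle homogeneity tests of this kind are typically based on one-way ANOVA across bottles, in the spirit of ISO Guide 35. The sketch below uses simulated data (not the CRM results) and reports both the F-test and the derived between-bottle standard deviation s_bb.

    ```python
    # Between-bottle homogeneity check in the spirit of ISO Guide 35 (simulated
    # data): one-way ANOVA across bottles plus an estimate of the between-bottle
    # standard deviation s_bb from the mean squares.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n_bottles, n_rep = 10, 3
    true_value, s_within = 100.0, 1.0          # arbitrary units
    data = true_value + s_within * rng.standard_normal((n_bottles, n_rep))

    F, p = stats.f_oneway(*data)               # H0: all bottle means are equal
    ms_between = n_rep * data.mean(axis=1).var(ddof=1)
    ms_within = data.var(axis=1, ddof=1).mean()
    s_bb = np.sqrt(max(ms_between - ms_within, 0.0) / n_rep)

    print(f"F = {F:.2f}, p = {p:.2f}, s_bb = {s_bb:.2f}")
    ```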

  13. Experimental wear behavioral studies of as-cast and 5 hr homogenized Al25Mg2Si2Cu4Ni alloy at constant load based on taguchi method

    NASA Astrophysics Data System (ADS)

    Harlapur, M. D.; Mallapur, D. G.; Udupa, K. Rajendra

    2018-04-01

    In the present study, an experimental investigation of the volumetric wear behaviour of the aluminium alloy Al-25Mg2Si2Cu4Ni, in the as-cast condition and after 5 h homogenization with T6 heat treatment, is carried out at constant load. A pin-on-disc apparatus was used to carry out the sliding wear tests. The Taguchi method based on an L16 orthogonal array was employed to evaluate the wear-behaviour data. Signal-to-noise ratios with the smaller-the-better objective and mean-of-means results were used. A general regression model was obtained by correlation. Lastly, a confirmation test was performed to compare the experimental results with those predicted from the regression model. The mathematical model reveals that load has the largest contribution to the wear rate compared to speed. Scanning electron microscopy was used to analyze the worn wear surfaces. The wear results show that the 5 h homogenized and T6-treated Al-25Mg2Si2Cu4Ni samples had better volumetric wear resistance than the as-cast samples.
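
    The smaller-the-better signal-to-noise ratio used in this kind of Taguchi analysis is S/N = -10·log10((1/n)·Σ y_i²). The sketch below applies that formula to hypothetical wear values (not the study's measurements); a higher S/N value indicates less wear.

    ```python
    # Smaller-the-better signal-to-noise ratio used in Taguchi analysis:
    # S/N = -10 * log10( (1/n) * sum(y_i^2) ).  Wear values are hypothetical.
    import numpy as np

    def sn_smaller_is_better(y):
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(y ** 2))

    wear_as_cast = [0.042, 0.045, 0.040]       # volumetric wear, arbitrary units
    wear_homogenized = [0.031, 0.029, 0.033]   # 5 h homogenized + T6 (hypothetical)

    print(f"S/N as-cast     = {sn_smaller_is_better(wear_as_cast):.1f} dB")
    print(f"S/N homogenized = {sn_smaller_is_better(wear_homogenized):.1f} dB")
    ```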

  14. Pore-scale modeling of hydromechanical coupled mechanics in hydrofracturing process

    NASA Astrophysics Data System (ADS)

    Chen, Zhiqiang; Wang, Moran

    2017-05-01

    Hydrofracturing is an important technique in petroleum industry to stimulate well production. Yet the mechanism of induced fracture growth is still not fully understood, which results in some unsatisfactory wells even with hydrofracturing treatments. In this work we establish a more accurate numerical framework for hydromechanical coupling, where the solid deformation and fracturing are modeled by discrete element method and the fluid flow is simulated directly by lattice Boltzmann method at pore scale. After validations, hydrofracturing is simulated with consideration on the strength heterogeneity effects on fracture geometry and microfailure mechanism. A modified topological index is proposed to quantify the complexity of fracture geometry. The results show that strength heterogeneity has a significant influence on hydrofracturing. In heterogeneous samples, the fracturing behavior is crack nucleation around the tip of fracture and connection of it to the main fracture, which is usually accompanied by shear failure. However, in homogeneous ones the fracture growth is achieved by the continuous expansion of the crack, where the tensile failure often dominates. It is the fracturing behavior that makes the fracture geometry in heterogeneous samples much more complex than that in homogeneous ones. In addition, higher pore pressure leads to more shear failure events for both heterogeneous and homogeneous samples.

  15. Numerical Generation of Dense Plume Fingers in Unsaturated Homogeneous Porous Media

    NASA Astrophysics Data System (ADS)

    Cremer, C.; Graf, T.

    2012-04-01

    In nature, the migration of dense plumes typically results in the formation of vertical plume fingers. Flow direction in fingers is downwards, which is counterbalanced by upwards flow of less dense fluid between fingers. In heterogeneous media, heterogeneity itself is known to trigger the formation of fingers. In homogeneous media, however, fingers are also created even if all grains have the same diameter. The reason is that pore-scale heterogeneity leading to different flow velocities also exists in homogeneous media due to two effects: (i) Grains of identical size may randomly arrange differently, e.g. forming tetrahedrons, hexahedrons or octahedrons. Each arrangement creates pores of varying diameter, thus resulting in different average flow velocities. (ii) Random variations of solute concentration lead to varying buoyancy effects, thus also resulting in different velocities. As a continuation of previous efforts to incorporate pore-scale heterogeneity into fully saturated soil such that dense fingers are realistically generated (Cremer and Graf, EGU Assembly, 2011), the current paper extends the research scope from saturated to unsaturated soil. Perturbation methods are evaluated by numerically re-simulating a laboratory-scale experiment of plume transport in homogeneous unsaturated sand (Simmons et al., Transp. Porous Media, 2002). The following five methods are discussed: (i) homogeneous sand, (ii) initial perturbation of solute concentration, (iii) spatially random, time-constant perturbation of the solute source, (iv) spatially and temporally random noise of the simulated solute concentration, and (v) a random K-field that introduces physically insignificant but numerically significant heterogeneity. Results demonstrate that, as opposed to saturated flow, perturbing the solute source will not result in plume fingering. This is because the location of the perturbed source (domain top) and the location of finger generation (groundwater surface) do not coincide. Alternatively, similar to saturated flow, applying either a random concentration noise (iv) or a random K-field (v) generates realistic plume fingering. Future work will focus on the generation mechanisms of plume finger splitting.

  16. Case study of small scale polytropic index in the central plasma sheet

    NASA Astrophysics Data System (ADS)

    Peng, XueXia; Cao, JinBin; Liu, WenLen; Ma, YuDuan; Lu, HaiYu; Yang, JunYing; Liu, LiuYuan; Liu, Xu; Wang, Jing; Wang, TieYan; Yu, Jiang

    2015-11-01

    This paper studies the effective polytropic index in the central plasma sheet (CPS) by using the method of Kartalev et al. (2006), which adopts a Haar-wavelet denoising technique to identify homogeneous MHD Bernoulli integral (MBI) regions and has frequently been used to study the polytropic relation in the solar wind. We chose the quiet CPS crossing by Cluster C1 during the interval 08:51:00-09:19:00 UT on 03 August 2001. In the central plasma sheet, the thermal pressure energy per unit mass is the most important part of the MBI, while the kinetic energy of fluid motion and the electromagnetic energy per unit mass are less important. The MBI contains many peaks, which correspond to isothermal or near-isothermal processes. The interval lengths of homogeneous MBI regions are generally less than 1 min. The polytropic indexes are calculated by linearly fitting ln p versus ln n within a 16 s window, which is shifted forward in 8 s steps. Those polytropic indexes with |R| ≥ 0.8 (R is the correlation coefficient between ln p and ln n) and p-value ≤ 0.1 in the homogeneous regions are almost all in the range [0, 1]. The mean and median effective polytropic indexes with high R and low p-value in homogeneous regions are 0.34 and 0.32, respectively, which differ substantially from the polytropic index obtained by the traditional method (α_trad = -0.15). This result indicates that the CPS is not uniform even during quiet times and that blanket application of the polytropic law to the plasma sheet may return misleading values of the polytropic index. The polytropic indexes in homogeneous regions with a high correlation coefficient generally have good regression significance and are thus credible. These results are important for understanding energy transport in the magnetotail in the MHD framework.
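
    The windowed fit described above (slope of ln p versus ln n in 16 s windows shifted by 8 s, retained only where |R| ≥ 0.8 and the p-value ≤ 0.1) can be sketched as below. The data are synthetic 4 s-resolution series generated with an assumed index of 0.34, not Cluster measurements.

    ```python
    # Sketch of the windowed polytropic fit: slope of ln(p) vs ln(n) in 16 s
    # windows shifted by 8 s, kept only where |R| >= 0.8 and p-value <= 0.1.
    # Synthetic 4 s-cadence data with an assumed index of 0.34.
    import numpy as np
    from scipy.stats import linregress

    rng = np.random.default_rng(3)
    n_pts = 400                                                      # 4 s cadence
    n = 0.3 * (1 + 0.1 * rng.standard_normal(n_pts))                 # density [cm^-3]
    p = 0.05 * n ** 0.34 * (1 + 0.01 * rng.standard_normal(n_pts))   # pressure [nPa]

    win, step = 4, 2          # 4 points = 16 s window, 2 points = 8 s step
    slopes = []
    for i in range(0, n_pts - win + 1, step):
        res = linregress(np.log(n[i:i + win]), np.log(p[i:i + win]))
        if abs(res.rvalue) >= 0.8 and res.pvalue <= 0.1:
            slopes.append(res.slope)

    print(f"{len(slopes)} accepted windows, median index = {np.median(slopes):.2f}")
    ```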

  17. Effect of freezing time on macronutrients and energy content of breastmilk.

    PubMed

    García-Lara, Nadia Raquel; Escuder-Vieco, Diana; García-Algar, Oscar; De la Cruz, Javier; Lora, David; Pallás-Alonso, Carmen

    2012-08-01

    In neonatal units and human milk banks, freezing breastmilk at less than -20 °C is the method of choice for preserving it. Scientific evidence on the loss of nutritional quality during freezing is scarce. Our main aim in this study was to determine the effect of freezing time of up to 3 months on the content of fat, total nitrogen, lactose, and energy. Our secondary aim was to assess whether ultrasonic homogenization of samples enables a more suitable reading of breastmilk macronutrients with a human milk analyzer (HMA) (MIRIS, Uppsala, Sweden). Refrigerated breastmilk samples were collected. Each sample was divided into six pairs of aliquots. One pair was analyzed on day 0, and the remaining pairs were frozen and analyzed, one each at 7, 15, 30, 60, and 90 days later. For each pair, one aliquot was homogenized by stirring and the other by applying ultrasound. Samples were analyzed with the HMA. By 3 months after freezing, with both homogenization methods, we observed a relevant and significant decline in fat concentration and energy content. The changes in total nitrogen and lactose were not consistent and were of lower magnitude. The absolute concentrations of all macronutrients and the caloric content were greater with ultrasonic homogenization. After 3 months of freezing at -20 °C, an important decrease in fat and caloric content is observed. Correct homogenization is fundamental for correct nutritional analysis.

  18. Edge-Based Image Compression with Homogeneous Diffusion

    NASA Astrophysics Data System (ADS)

    Mainberger, Markus; Weickert, Joachim

    It is well-known that edges contain semantically important image information. In this paper we present a lossy compression method for cartoon-like images that exploits information at image edges. These edges are extracted with the Marr-Hildreth operator followed by hysteresis thresholding. Their locations are stored in a lossless way using JBIG. Moreover, we encode the grey or colour values at both sides of each edge by applying quantisation, subsampling and PAQ coding. In the decoding step, information outside these encoded data is recovered by solving the Laplace equation, i.e. we inpaint with the steady state of a homogeneous diffusion process. Our experiments show that the suggested method outperforms the widely-used JPEG standard and can even beat the advanced JPEG2000 standard for cartoon-like images.
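
    Only the decoding step described above lends itself to a compact sketch: the pixels away from the stored edge data are recovered as the steady state of homogeneous diffusion, i.e. by solving the Laplace equation with the edge values as Dirichlet data. The tiny example below uses plain Jacobi iteration on synthetic "edge" data; the Marr-Hildreth extraction, JBIG and PAQ coding stages are omitted.

    ```python
    # Decoding-step sketch: recover missing pixels by solving the Laplace equation
    # (steady state of homogeneous diffusion) with stored edge pixels as Dirichlet
    # data. Tiny synthetic example, plain Jacobi iteration.
    import numpy as np

    H, W = 64, 64
    known = np.zeros((H, W), dtype=bool)
    values = np.zeros((H, W))

    known[:, 0], values[:, 0] = True, 0.0     # stored "edge" data: left border dark,
    known[:, -1], values[:, -1] = True, 1.0   # right border bright
    known[H // 2, 16:48] = True               # plus one stored interior edge
    values[H // 2, 16:48] = 0.5

    u = values.copy()
    for _ in range(2000):                     # Jacobi sweeps toward the steady state
        # np.roll gives periodic neighbours; acceptable for this sketch because
        # the fixed left/right borders dominate the reconstruction
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                      np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(known, values, avg)      # keep stored edge values fixed

    print(f"reconstructed range: {u.min():.2f} .. {u.max():.2f}")
    ```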

  19. Method for removing trace pollutants from aqueous solutions

    DOEpatents

    Silver, G.L.

    A method of substantially removing a trace metallic contaminant from a liquid containing the same comprises: adding an oxidizing agent to a liquid containing a trace amount of a metallic contaminant of a concentration of up to about 0.1 ppM, and separating the homogeneously precipitated product from the liquid.

  20. Comparative Robustness of Recent Methods for Analyzing Multivariate Repeated Measures Designs

    ERIC Educational Resources Information Center

    Seco, Guillermo Vallejo; Gras, Jaime Arnau; Garcia, Manuel Ato

    2007-01-01

    This study evaluated the robustness of two recent methods for analyzing multivariate repeated measures when the assumptions of covariance homogeneity and multivariate normality are violated. Specifically, the authors' work compares the performance of the modified Brown-Forsythe (MBF) procedure and the mixed-model procedure adjusted by the…

  1. Clustering "N" Objects into "K" Groups under Optimal Scaling of Variables.

    ERIC Educational Resources Information Center

    van Buuren, Stef; Heiser, Willem J.

    1989-01-01

    A method based on homogeneity analysis (multiple correspondence analysis or multiple scaling) is proposed to reduce many categorical variables to one variable with "k" categories. The method is a generalization of the sum of squared distances cluster analysis problem to the case of mixed measurement level variables. (SLD)

  2. Evaluation of Criterion Validity for Scales with Congeneric Measures

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2007-01-01

    A method for estimating criterion validity of scales with homogeneous components is outlined. It accomplishes point and interval estimation of interrelationship indices between composite scores and criterion variables and is useful for testing hypotheses about criterion validity of measurement instruments. The method can also be used with missing…

  3. A method to improve the B0 homogeneity of the heart in vivo.

    PubMed

    Jaffer, F A; Wen, H; Balaban, R S; Wolff, S D

    1996-09-01

    A homogeneous static (B0) magnetic field is required for many NMR experiments such as echo planar imaging, localized spectroscopy, and spiral scan imaging. Although semi-automated techniques have been described to improve B0 field homogeneity, none has been applied to the in vivo heart. The acquisition of cardiac field maps is complicated by motion, blood flow, and chemical shift artifact from epicardial fat. To overcome these problems, an ungated three-dimensional (3D) chemical shift image (CSI) was collected to generate a time- and motion-averaged B0 field map. B0 heterogeneity in the heart was minimized by using a previous algorithm that solves for the optimal shim coil currents for an input field map, using up to third-order current-bounded shims (1). The method improved the B0 homogeneity of the heart in all 11 normal volunteers studied. After application of the algorithm to the unshimmed cardiac field maps, the standard deviation of the proton frequency decreased by 43%, the magnitude 1H spectral linewidth decreased by 24%, and the peak-to-peak gradient decreased by 35%. Simulations of the high-order (second- and third-order) shims in B0 field correction of the heart show that high-order shims are important, accounting for nearly half of the improvement in homogeneity for several subjects. The T2* of the left ventricular anterior wall before and after field correction was determined at 4.0 Tesla. Finally, the results show that cardiac shimming is of benefit in cardiac 31P NMR spectroscopy and cardiac echo planar imaging.
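
    The shim-setting step referenced above (the algorithm of reference (1)) amounts to a bounded linear least-squares problem: choose shim-coil currents that best cancel the measured field map, subject to current limits. The sketch below is a generic stand-in for that step, using synthetic basis maps in place of the real per-coil field maps.

    ```python
    # Sketch of the shim-current calculation as a bounded linear least-squares
    # problem: minimise ||B0 + A @ currents|| subject to per-coil current limits.
    # Synthetic basis maps stand in for the measured shim-coil field maps.
    import numpy as np
    from scipy.optimize import lsq_linear

    rng = np.random.default_rng(4)
    n_vox, n_shims = 500, 8
    A = rng.standard_normal((n_vox, n_shims))        # field per unit current, per coil
    true_currents = rng.uniform(-0.5, 0.5, n_shims)
    b0 = A @ true_currents + 0.05 * rng.standard_normal(n_vox)   # measured field map

    res = lsq_linear(A, -b0, bounds=(-1.0, 1.0))     # current-bounded shims
    residual = b0 + A @ res.x
    print(f"field std before: {b0.std():.3f}, after: {residual.std():.3f}")
    ```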

  4. From analytical solutions of solute transport equations to multidimensional time-domain random walk (TDRW) algorithms

    NASA Astrophysics Data System (ADS)

    Bodin, Jacques

    2015-03-01

    In this study, new multi-dimensional time-domain random walk (TDRW) algorithms are derived from approximate one-dimensional (1-D), two-dimensional (2-D), and three-dimensional (3-D) analytical solutions of the advection-dispersion equation and from exact 1-D, 2-D, and 3-D analytical solutions of the pure-diffusion equation. These algorithms enable the calculation of both the time required for a particle to travel a specified distance in a homogeneous medium and the mass recovery at the observation point, which may be incomplete due to 2-D or 3-D transverse dispersion or diffusion. The method is extended to heterogeneous media, represented as a piecewise collection of homogeneous media. The particle motion is then decomposed along a series of intermediate checkpoints located on the medium interface boundaries. The accuracy of the multi-dimensional TDRW method is verified against (i) exact analytical solutions of solute transport in homogeneous media and (ii) finite-difference simulations in a synthetic 2-D heterogeneous medium of simple geometry. The results demonstrate that the method is ideally suited to purely diffusive transport and to advection-dispersion transport problems dominated by advection. Conversely, the method is not recommended for highly dispersive transport problems because the accuracy of the advection-dispersion TDRW algorithms degrades rapidly for a low Péclet number, consistent with the accuracy limit of the approximate analytical solutions. The proposed approach provides a unified methodology for deriving multi-dimensional time-domain particle equations and may be applicable to other mathematical transport models, provided that appropriate analytical solutions are available.
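
    As a one-dimensional illustration of the time-domain random-walk idea (not the paper's 2-D/3-D algorithms), the travel time over a distance L in a homogeneous medium under advection-dispersion can be sampled from the inverse-Gaussian first-passage-time distribution with mean L/v and shape L²/(2D). The parameter values below are illustrative.

    ```python
    # 1-D TDRW illustration: sample the travel time over a distance L from the
    # inverse-Gaussian first-passage-time distribution of advective-dispersive
    # transport (mean L/v, shape L^2 / (2 D)). Illustrative parameter values.
    import numpy as np

    rng = np.random.default_rng(5)
    L, v, D = 1.0, 0.1, 0.01         # m, m/s, m^2/s

    mean = L / v                     # mean travel time
    lam = L**2 / (2.0 * D)           # inverse-Gaussian shape parameter
    times = rng.wald(mean, lam, size=100_000)

    peclet = v * L / D
    print(f"Pe = {peclet:.0f}, sampled mean t = {times.mean():.1f} s "
          f"(theory {mean:.1f} s), std = {times.std():.1f} s")
    ```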

  5. Blend uniformity evaluation during continuous mixing in a twin screw granulator by in-line NIR using a moving F-test.

    PubMed

    Fonteyne, Margot; Vercruysse, Jurgen; De Leersnyder, Fien; Besseling, Rut; Gerich, Ad; Oostra, Wim; Remon, Jean Paul; Vervaet, Chris; De Beer, Thomas

    2016-09-07

    This study focuses on the twin screw granulator of a continuous from-powder-to-tablet production line. Whereas powder dosing into the granulation unit is possible from a container of preblended material, a truly continuous process uses several feeders (each one dosing an individual ingredient) and relies on a continuous blending step prior to granulation. The aim of the current study was to investigate the in-line blending capacity of this twin screw granulator, equipped with conveying elements only. The feasibility of in-line NIR (SentroPAT, Sentronic GmbH, Dresden, Germany) spectroscopy for evaluating the blend uniformity of powders after the granulator was tested. Anhydrous theophylline was used as a tracer molecule and was blended with lactose monohydrate. Theophylline and lactose were each fed from a different feeder into the twin screw granulator barrel. Both homogeneous mixtures and mixing experiments with induced errors were investigated. The in-line spectroscopic analyses showed that the twin screw granulator is a useful tool for in-line blending under different conditions. The blend homogeneity was evaluated by means of a novel statistical method, the moving F-test, in which the variance between two blocks of collected NIR spectra is compared. The α- and β-errors of the moving F-test are controlled by using the appropriate block size of spectra. The moving F-test proved to be an appropriate calibration- and maintenance-free method for blend homogeneity evaluation during continuous mixing. Copyright © 2016 Elsevier B.V. All rights reserved.
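
    The moving F-test idea can be sketched as a ratio of variances between consecutive blocks of an in-line signal, compared against an F critical value; the block size tunes the α- and β-errors, as noted above. The signal below is synthetic (a homogeneous blend with one induced feeder upset), not plant data, and the two-sided test is approximated with a max/min variance ratio.

    ```python
    # Sketch of the moving F-test: compare the variance of consecutive blocks of
    # an in-line signal (e.g. NIR-predicted tracer content) against the F critical
    # value. Synthetic signal with one induced disturbance.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    signal = 20.0 + 0.2 * rng.standard_normal(600)                # % tracer, steady blend
    signal[300:360] += 1.5 * np.sin(np.linspace(0, np.pi, 60))    # induced feeder upset

    block = 30                                  # block size tunes alpha/beta errors
    alpha = 0.01
    f_crit = stats.f.ppf(1 - alpha, block - 1, block - 1)

    flags = []
    for i in range(block, len(signal) - block + 1, block):
        v1 = np.var(signal[i - block:i], ddof=1)
        v2 = np.var(signal[i:i + block], ddof=1)
        F = max(v1, v2) / min(v1, v2)           # two-sided via max/min ratio
        flags.append((i, F > f_crit))

    print("blocks flagged as non-homogeneous:", [i for i, bad in flags if bad])
    ```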

  6. Fabrication of patterned calcium cross-linked alginate hydrogel films and coatings through reductive cation exchange.

    PubMed

    Bruchet, Marion; Melman, Artem

    2015-10-20

    Calcium cross-linked alginate hydrogels are widely used in targeted drug delivery, tissue engineering, wound treatment, and other biomedical applications. We developed a method for preparing homogeneous alginate hydrogels cross-linked with Ca(2+) cations using reductive cation exchange in homogeneous iron(III) cross-linked alginate hydrogels. Treatment of iron(III) cross-linked alginate hydrogels with calcium salts and sodium ascorbate results in reduction of iron(III) cations to iron(II), which are instantaneously replaced with Ca(2+) cations, producing homogeneous, ionically cross-linked hydrogels. Alternatively, the cation exchange can be performed by photochemical reduction in the presence of calcium chloride using a sacrificial photoreductant. This approach allows fabrication of patterned calcium alginate hydrogels through photochemical patterning of an iron(III) cross-linked alginate hydrogel followed by the photochemical reductive exchange of iron cations for calcium. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Certified reference material of volatile organic compounds for environmental analysis: BTEX in methanol.

    PubMed

    Neves, Laura A; Almeida, Renato R R; Rego, Eliane P; Rodrigues, Janaína Marques; de Carvalho, Lucas Junqueira; de M Goulart, Ana Letícia

    2015-04-01

    The Brazilian Metrology Institute (National Institute of Metrology, Quality, and Technology, Inmetro) has been developing a certified reference material (CRM) of the volatile organic compounds benzene; toluene; ethylbenzene; and ortho, meta, and para-xylenes (BTEX) in methanol, to ensure quality control for environmental-analysis measurements. The objective of this paper is to present the results of the certification studies: uncertainty estimates related to characterization, a homogeneity study, and a stability study on a single lot of CRM composed of BTEX in methanol. Analysis of variance (ANOVA) was used to evaluate the homogeneity and stability of the BTEX CRM, in accordance with the ISO Guide 30 series. The homogeneity and stability of the BTEX CRM were confirmed for all analytes and their respective properties. All the procedures used in this study complied with ISO Guide 34, ISO Guide 35, and the Guide to the Expression of Uncertainty in Measurement (GUM).

  8. Bopp-Podolsky black holes and the no-hair theorem

    NASA Astrophysics Data System (ADS)

    Cuzinatto, R. R.; de Melo, C. A. M.; Medeiros, L. G.; Pimentel, B. M.; Pompeia, P. J.

    2018-01-01

    Bopp-Podolsky electrodynamics is generalized to curved space-times. The equations of motion are written for the case of static spherically symmetric black holes and their exterior solutions are analyzed using Bekenstein's method. It is shown that the solutions split up into two parts, namely a non-homogeneous (asymptotically massless) regime and a homogeneous (asymptotically massive) sector which is null outside the event horizon. In addition, in the simplest approach to Bopp-Podolsky black holes, the non-homogeneous solutions are found to be Maxwell's solutions leading to a Reissner-Nordström black hole. It is also demonstrated that the only exterior solution consistent with the weak and null energy conditions is the Maxwell one. Thus, in the light of the energy conditions, it is concluded that only Maxwell modes propagate outside the horizon and, therefore, the no-hair theorem is satisfied in the case of Bopp-Podolsky fields in spherically symmetric space-times.

  9. Homogenized electromechanical properties of crystalline and ceramic relaxor ferroelectric 0.58Pb(Mg1/3Nb2/3)O3-0.42PbTiO3

    NASA Astrophysics Data System (ADS)

    Jayachandran, K. P.; Guedes, J. M.; Rodrigues, H. C.

    2007-10-01

    A modelling framework that incorporates the peculiarities of microstructural features, such as the spatial correlation of crystallographic orientations and morphological texture in piezoelectrics, is established. The mathematical homogenization theory of a piezoelectric medium is implemented using the finite element method by solving the coupled equilibrium electrical and mechanical fields. The dependence of the domain orientation on the macroscopic electromechanical properties of crystalline as well as polycrystalline ceramic relaxor ferroelectric 0.58Pb(Mg1/3Nb2/3)O3-0.42PbTiO3 (PMN-42% PT) is studied based on this model. The material shows large anisotropy in the piezoelectric coefficient ejK in its crystalline form. The homogenized electromechanical moduli of polycrystalline ceramic also exhibit significantly anisotropic behaviours. An optimum texture at which the piezoceramic exhibits its maximum longitudinal piezoelectric response is identified.

  10. Variable angle spectroscopic ellipsometry - Application to GaAs-AlGaAs multilayer homogeneity characterization

    NASA Technical Reports Server (NTRS)

    Alterovitz, Samuel A.; Snyder, Paul G.; Merkel, Kenneth G.; Woollam, John A.; Radulescu, David C.

    1988-01-01

    Variable angle spectroscopic ellipsometry has been applied to a GaAs-AlGaAs multilayer structure to obtain a three-dimensional characterization, using repetitive measurements at several spots on the same sample. The reproducibility of the layer thickness measurements is of order 10 Å, while the lateral dimension is limited by the beam diameter, presently of order 1 mm. Thus, the three-dimensional result mainly gives the sample homogeneity. In the present case three spots were used to scan the homogeneity over 1 in. of a wafer which had molecular-beam epitaxially grown layers. The thickness of the AlGaAs, GaAs, and oxide layers and the Al concentration varied by 1 percent or less from edge to edge. This result was confirmed by two methods of data analysis. No evidence of an interfacial layer was observed on top of the AlGaAs.

  11. A validated UPLC-MS/MS method for the analysis of linezolid and a novel oxazolidinone derivative (PH027) in plasma and its application to tissue distribution study in rabbits.

    PubMed

    Hedaya, Mohsen A; Thomas, Vidhya; Abdel-Hamid, Mohamed E; Kehinde, Elijah O; Phillips, Oludotun A

    2017-01-01

    Linezolid is the first approved oxazolidinone antibacterial agent, whereas PH027 is a novel compound of the same class that exhibits good in vitro antibacterial activity. The objective of this study was to develop an UPLC-MS/MS assay for the analysis of linezolid and PH027 in plasma and to apply the method to comparative pharmacokinetic and tissue distribution studies of both compounds. Plasma samples and calibrators were extracted with diethyl ether after addition of the internal standard solution. After evaporation of the ether layer, the residue was reconstituted in mobile phase and injected into the UPLC-MS/MS. The mobile phase consisted of 2 mM ammonium acetate buffer solution and acetonitrile (70:30) at a flow rate of 0.2 ml/min. Separation was achieved using a UPLC BEH C18 column, and quantitative determination of the analytes was performed using the multiple-reaction monitoring (MRM) scanning mode. The method was validated by analyzing quality control tissue homogenate samples, and was applied to analyze tissue homogenate samples obtained following IV injections of linezolid and PH027 in rabbits. The developed UPLC-MS/MS method was linear in the concentration range of 50-5000 ng/ml. Validation of the method proved that the method's precision, selectivity and stability were all within the acceptable limits. Linezolid and PH027 concentrations were accurately determined in the quality control tissue homogenate samples, and analysis of samples obtained following IV administration of the two compounds showed that the tissue to plasma concentration ratio of PH027 was higher than that of linezolid, probably due to its higher lipophilicity. The developed UPLC-MS/MS method for the analysis of linezolid and PH027 in rabbit's plasma can accurately determine the concentrations of these compounds in different tissues. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Polyamines and inorganic ions extracted from woody tissues by freeze-thawing

    Treesearch

    Rakesh Minocha; Walter C. Shortle

    1994-01-01

    A simple and fast method for extraction of major inorganic ions (Ca, Mg, Mn, K, and P) and cellular polyamines from small quantities of wood and woody plant tissues is described. The method involves repeated freezing and thawing of samples instead of homogenization or wet ash digestion. The efficiency of extraction of both polyamines and inorganic ions by these methods...

  13. SU-E-T-417: The Impact of Normal Tissue Constraints On PTV Dose Homogeneity for Intensity Modulated Radiotherapy (IMRT), Volume Modulated Arc Therapy (VMAT) and Tomotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peng, J; McDonald, D; Ashenafi, M

    2014-06-01

    Purpose: Complex intensity modulated arc therapy tends to spread low dose to normal tissue (NT) regions to obtain improved target conformity and homogeneity and OAR sparing. This work evaluates the trade-offs between PTV homogeneity and reduction of the maximum dose (Dmax) spread to NT during planning of IMRT, VMAT and Tomotherapy. Methods: Ten prostate patients, previously planned with step-and-shoot IMRT, were selected. To fairly evaluate how PTV homogeneity was affected by NT Dmax constraints, the original IMRT DVH objectives for PTV and OARs (femoral heads, and rectal and bladder wall) were applied to two VMAT plans in Pinnacle (V9.0) and to Tomotherapy (V4.2). The only constraint difference was the NT, which was defined as the body contour excluding targets, OARs and dose rings. The NT Dmax constraint for the first VMAT plan was set to the prescription dose (Dp). For the second VMAT plan (VMAT-NT) and Tomotherapy, it was set to the Dmax achieved in IMRT (~70-80% of Dp). All NT constraints were set to the lowest priority. Three common homogeneity indices (HI), RTOG-HI = Dmax/Dp, moderated-HI = D95%/D5% and complex-HI = (D2%-D98%)/Dp*100, were calculated. Results: All modalities achieved similar dosimetric endpoints for PTV and OARs. The complex-HI showed the most variability among the indices, with average values of 5.9, 4.9, 9.3 and 6.1 for IMRT, VMAT, VMAT-NT and Tomotherapy, respectively. VMAT provided the best PTV homogeneity without compromising any OAR/NT sparing. Both VMAT-NT and Tomotherapy, planned with more restrictive NT constraints, showed reduced homogeneity, with VMAT-NT showing the worst homogeneity (P<0.0001) for all HI. Tomotherapy gave the lowest NT Dmax, with slightly decreased homogeneity compared to VMAT. Finally, there was no significant difference in NT Dmax or Dmean between VMAT and VMAT-NT. Conclusion: PTV HI is highly dependent on the permitted NT constraints. The results demonstrate that VMAT-NT with more restrictive NT constraints does not reduce NT Dmax but yields higher Dmax and worse target homogeneity. Therefore, it is critical that planners do not use overly restrictive NT constraints during VMAT optimization. The Tomotherapy plan was not as sensitive to NT constraints; however, care should be taken to ensure NT is not pushed too hard. These results are relevant for clinical practice. The biological effect of higher Dmax and increased target heterogeneity needs further study.

  14. Palladium nanoparticle deposition via precipitation: a new method to functionalize macroporous silicon

    PubMed Central

    Scheen, Gilles; Bassu, Margherita; Douchamps, Antoine; Zhang, Chao; Debliquy, Marc; Francis, Laurent A

    2014-01-01

    We present an original two-step method for the deposition via precipitation of Pd nanoparticles into macroporous silicon. The method consists in immersing a macroporous silicon sample in a PdCl2/DMSO solution and then in annealing the sample at a high temperature. The impact of composition and concentration of the solution and annealing time on the nanoparticle characteristics is investigated. This method is compared to electroless plating, which is a standard method for the deposition of Pd nanoparticles. Scanning electron microscopy and computerized image processing are used to evaluate size, shape, surface density and deposition homogeneity of the Pd nanoparticles on the pore walls. Energy-dispersive x-ray spectroscopy (EDX) and x-ray photoelectron spectroscopy (XPS) analyses are used to evaluate the composition of the deposited nanoparticles. In contrast to electroless plating, the proposed method leads to homogeneously distributed Pd nanoparticles along the macropores depth with a surface density that increases proportionally with the PdCl2 concentration. Moreover EDX and XPS analysis showed that the nanoparticles are composed of Pd in its metallic state, while nanoparticles deposited by electroless plating are composed of both metallic Pd and PdOx. PMID:27877732

  15. Phantom Preparation and Optical Property Determination

    NASA Astrophysics Data System (ADS)

    He, Di; He, Jie; Mao, Heng

    2018-12-01

    Tissue-like optical phantoms are important for testing new imaging algorithms. Homogeneous optical phantoms with well-determined optical properties are the first step toward making a proper heterogeneous phantom for multi-modality imaging. Typical recipes for such phantoms consist of epoxy resin, hardener, India ink and titanium oxide. By altering the concentrations of India ink and titanium oxide and carefully mixing all the ingredients, multiple homogeneous phantoms with different absorption and scattering coefficients can be obtained. After fabricating the phantoms, their individual optical properties, including the absorption and scattering coefficients, must be determined. This is achieved by solving the diffusion equation for each phantom treated as a homogeneous slab under canonical illumination. We solve the diffusion equation for the homogeneous slab in the frequency domain and obtain a formula for the theoretical measurements. With our steady-state diffuse optical tomography (DOT) imaging system, the actual distribution of the incident light produced by a laser can be measured. Using this source distribution and the derived formula, numerical experiments show how the measurements change as the absorption and scattering coefficients are varied. These experiments indicate that the measurements alone are not sufficient to recover unique optical properties in the steady-state DOT problem. Thus, to determine the optical properties of a homogeneous slab, we fix one of the coefficients first and use optimization methods to find the other. By assembling multiple homogeneous slab phantoms with different optical properties, a heterogeneous phantom suitable for testing multi-modality imaging algorithms can then be obtained. In this paper, we describe how to make the phantoms, derive a formula by solving the diffusion equation, demonstrate the non-uniqueness of the steady-state DOT problem by analysing numerical results from our formula, and finally propose a possible way to determine the optical properties of a homogeneous slab in future work.
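
    As a rough illustration of the fitting strategy described above (fixing one optical coefficient and recovering the other by optimization), the hedged sketch below uses the steady-state diffusion approximation for an infinite homogeneous medium, with effective attenuation mu_eff = sqrt(3*mu_a*(mu_a + mu_s')), instead of the authors' frequency-domain slab solution; the detector distances, noise level, and coefficient values are synthetic placeholders.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def fluence(r, mu_a, mu_s_prime):
    """Steady-state diffusion-approximation fluence of a point source
    in an infinite homogeneous medium (up to a constant factor)."""
    mu_eff = np.sqrt(3.0 * mu_a * (mu_a + mu_s_prime))
    return np.exp(-mu_eff * r) / r

# Synthetic "measurements": detectors at several source-detector distances (cm).
r = np.linspace(1.0, 4.0, 8)
mu_a_true, mu_s_prime_fixed = 0.08, 10.0   # 1/cm, illustrative values only
rng = np.random.default_rng(1)
measured = fluence(r, mu_a_true, mu_s_prime_fixed) * (1 + 0.01 * rng.normal(size=r.size))

def misfit(mu_a):
    model = fluence(r, mu_a, mu_s_prime_fixed)
    # Compare shapes only (log ratio), since the absolute source power is unknown.
    resid = np.log(measured / model)
    return np.sum((resid - resid.mean()) ** 2)

result = minimize_scalar(misfit, bounds=(0.001, 1.0), method='bounded')
print(f"recovered mu_a = {result.x:.4f} 1/cm (true value {mu_a_true})")
```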

  16. Proton Minibeam Radiation Therapy Reduces Side Effects in an In Vivo Mouse Ear Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Girst, Stefanie, E-mail: stefanie.girst@unibw.de; Greubel, Christoph; Reindl, Judith

    Purpose: Proton minibeam radiation therapy is a novel approach to minimize normal tissue damage in the entrance channel by spatial fractionation while keeping tumor control through a homogeneous tumor dose using beam widening with an increasing track length. In the present study, the dose distributions for homogeneous broad beam and minibeam irradiation sessions were simulated. Also, in an animal study, acute normal tissue side effects of proton minibeam irradiation were compared with homogeneous irradiation in a tumor-free mouse ear model to account for the complex effects on the immune system and vasculature in an in vivo normal tissue model. Methods and Materials: At the ion microprobe SNAKE, 20-MeV protons were administered to the central part (7.2 × 7.2 mm²) of the ear of BALB/c mice, using either a homogeneous field with a dose of 60 Gy or 16 minibeams with a nominal 6000 Gy (4 × 4 minibeams, size 0.18 × 0.18 mm², with a distance of 1.8 mm). The same average dose was used over the irradiated area. Results: No ear swelling or other skin reactions were observed at any point after minibeam irradiation. In contrast, significant ear swelling (up to fourfold), erythema, and desquamation developed in homogeneously irradiated ears 3 to 4 weeks after irradiation. Hair loss and the disappearance of sebaceous glands were only detected in the homogeneously irradiated fields. Conclusions: These results show that proton minibeam radiation therapy results in reduced adverse effects compared with conventional homogeneous broad-beam irradiation and, therefore, might have the potential to decrease the incidence of side effects resulting from clinical proton and/or heavy ion therapy.

  17. Fine-grained zirconium-base material

    DOEpatents

    Van Houten, G.R.

    1974-01-01

    A method is described for making zirconium with inhibited grain growth characteristics, by the process of vacuum melting the zirconium, adding 0.3 to 0.5% carbon, stirring, homogenizing, and cooling. (Official Gazette)

  18. Application of singular value decomposition to structural dynamics systems with constraints

    NASA Technical Reports Server (NTRS)

    Juang, J.-N.; Pinson, L. D.

    1985-01-01

    Singular value decomposition is used to construct a coordinate transformation for a linear dynamic system subject to linear, homogeneous constraint equations. The method is compared with two commonly used methods, namely, classical Gaussian elimination and the Walton-Steeves approach. Although the classical method requires fewer numerical operations, the singular value decomposition method is more accurate and more convenient for eliminating the dependent coordinates. Numerical examples are presented to demonstrate the application of the method.
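
    For a linear dynamic system with homogeneous constraints C q = 0, the SVD-based transformation referred to above amounts to restricting the motion to the null space of the constraint matrix. The Python sketch below is a minimal illustration with made-up mass, stiffness, and constraint matrices; it is not the authors' implementation.

```python
import numpy as np

def constrained_modes(M, K, C, tol=1e-10):
    """Eigenanalysis of M qdd + K q = 0 subject to C q = 0.

    The SVD of C gives an orthonormal basis T of its null space;
    the substitution q = T z eliminates the dependent coordinates.
    """
    _, s, Vt = np.linalg.svd(C)
    rank = np.sum(s > tol * s.max())
    T = Vt[rank:].T                      # null-space basis, shape (n, n - rank)
    Mr = T.T @ M @ T                     # reduced mass matrix
    Kr = T.T @ K @ T                     # reduced stiffness matrix
    w2, Z = np.linalg.eig(np.linalg.solve(Mr, Kr))
    order = np.argsort(w2.real)
    return np.sqrt(w2.real[order]), T @ Z[:, order]   # frequencies, full-space modes

# Illustrative 3-DOF chain with the constraint q0 = q2.
M = np.diag([1.0, 2.0, 1.0])
K = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  2.]])
C = np.array([[1., 0., -1.]])
freqs, modes = constrained_modes(M, K, C)
print("natural frequencies (rad/s):", np.round(freqs, 4))
```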

  19. Homogenisation of minimum and maximum air temperature in northern Portugal

    NASA Astrophysics Data System (ADS)

    Freitas, L.; Pereira, M. G.; Caramelo, L.; Mendes, L.; Amorim, L.; Nunes, L.

    2012-04-01

    Homogenisation of minimum and maximum air temperature has been carried out for northern Portugal for the period 1941-2010. The database consists of monthly arithmetic averages calculated from daily values observed at stations within the network managed by the national Institute of Meteorology (IM). Some of the weather stations of the IM network have been collecting data for more than a century; however, over the entire observing period several factors have affected the climate series and have to be considered, such as changes in the station surroundings and changes related to the replacement of manually operated instruments. Besides these typical changes, of particular interest are station relocations to rural areas or to the urban-rural interface and the installation of automatic weather stations in the vicinity of the principal or synoptic stations with the aim of replacing them. The information from these relocated and new stations was merged to produce a single, representative time series for each site. This process started at the end of the 1990s, and the information on the time-series fusion process constitutes the set of metadata used. Two basic procedures were performed: (i) preliminary statistical and quality-control analysis; and (ii) detection and correction of homogeneity problems. For the first, quality-control software was developed and used, specifically dedicated to the detection of outliers based on the quartile values of the time series itself. The homogeneity analysis was performed using MASH (Multiple Analysis of Series for Homogenisation) and HOMER, a software application developed and recently made available within the COST Action ES0601 (COST-ES0601, 2012). Both methods provide fast quality control of the original data and were developed for automatic processing, analysis, homogeneity testing and adjustment of climatological data, but manual usage is also possible. The results obtained with both methods will be presented, compared and discussed, along with the results of the sensitivity tests performed with each method. COST-ES0601, 2012: "ACTION COST-ES0601 - Advances in homogenisation methods of climate series: an integrated approach HOME". Available at http://www.homogenisation.org/v_02_15/ [accessed 3 January 2012].
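
    The quartile-based outlier screening mentioned above can be sketched with a standard interquartile-range rule applied to a single calendar month across years. The snippet below is a generic illustration, not the software developed for the IM network; the temperature values and the 1.5*IQR factor are assumptions.

```python
import numpy as np

def quartile_outliers(series, factor=1.5):
    """Flag values outside [Q1 - factor*IQR, Q3 + factor*IQR]."""
    q1, q3 = np.nanpercentile(series, [25, 75])
    iqr = q3 - q1
    lo, hi = q1 - factor * iqr, q3 + factor * iqr
    return (series < lo) | (series > hi)

# Synthetic July mean maxima (deg C) for one station, 1991-2010,
# with one digitisation error (3.1) and one suspicious spike (45.0).
july_tmax = np.array([30.2, 31.1, 29.8, 30.5, 32.0, 31.4, 30.9, 29.5,
                      3.1, 30.7, 31.8, 30.1, 29.9, 45.0, 31.2, 30.4,
                      30.8, 31.5, 29.7, 30.6])
flags = quartile_outliers(july_tmax)
print("flagged years (index):", np.where(flags)[0], "values:", july_tmax[flags])
```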

  20. Random walks on mutual microRNA-target gene interaction network improve the prediction of disease-associated microRNAs.

    PubMed

    Le, Duc-Hau; Verbeke, Lieven; Son, Le Hoang; Chu, Dinh-Toi; Pham, Van-Huy

    2017-11-14

    MicroRNAs (miRNAs) have been shown to play an important role in pathological initiation, progression and maintenance. Because identification in the laboratory of disease-related miRNAs is not straightforward, numerous network-based methods have been developed to predict novel miRNAs in silico. Homogeneous networks (in which every node is a miRNA) based on the targets shared between miRNAs have been widely used to predict their role in disease phenotypes. Although such homogeneous networks can predict potential disease-associated miRNAs, they do not consider the roles of the target genes of the miRNAs. Here, we introduce a novel method based on a heterogeneous network that not only considers miRNAs but also the corresponding target genes in the network model. Instead of constructing homogeneous miRNA networks, we built heterogeneous miRNA networks consisting of both miRNAs and their target genes, using databases of known miRNA-target gene interactions. In addition, as recent studies demonstrated reciprocal regulatory relations between miRNAs and their target genes, we considered these heterogeneous miRNA networks to be undirected, assuming mutual miRNA-target interactions. Next, we introduced a novel method (RWRMTN) operating on these mutual heterogeneous miRNA networks to rank candidate disease-related miRNAs using a random walk with restart (RWR) based algorithm. Using both known disease-associated miRNAs and their target genes as seed nodes, the method can identify additional miRNAs involved in the disease phenotype. Experiments indicated that RWRMTN outperformed two existing state-of-the-art methods: RWRMDA, a network-based method that also uses a RWR on homogeneous (rather than heterogeneous) miRNA networks, and RLSMDA, a machine learning-based method. Interestingly, we could relate this performance gain to the emergence of "disease modules" in the heterogeneous miRNA networks used as input for the algorithm. Moreover, we could demonstrate that RWRMTN is stable, performing well when using both experimentally validated and predicted miRNA-target gene interaction data for network construction. Finally, using RWRMTN, we identified 76 novel miRNAs associated with 23 disease phenotypes which were present in a recent database of known disease-miRNA associations. Summarizing, using random walks on mutual miRNA-target networks improves the prediction of novel disease-associated miRNAs because of the existence of "disease modules" in these networks.
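
    The random walk with restart (RWR) at the core of RWRMTN can be illustrated on a toy heterogeneous network. The sketch below is a hypothetical, simplified version: the adjacency matrix, the seed set, and the restart probability are placeholders, and the real method's network construction and evaluation are not reproduced.

```python
import numpy as np

def rwr(adj, seeds, restart=0.7, tol=1e-8, max_iter=1000):
    """Random walk with restart on an undirected graph given by `adj`.

    Returns the steady-state visiting probability of every node,
    which is used to rank candidate nodes by proximity to the seeds.
    """
    W = adj / adj.sum(axis=0, keepdims=True)       # column-normalised transition matrix
    p0 = np.zeros(adj.shape[0])
    p0[list(seeds)] = 1.0 / len(seeds)             # restart distribution over seed nodes
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1 - restart) * W @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            break
        p = p_next
    return p

# Toy heterogeneous network: nodes 0-2 are "miRNAs", nodes 3-5 are "target genes".
adj = np.array([[0, 0, 0, 1, 1, 0],
                [0, 0, 0, 1, 0, 1],
                [0, 0, 0, 0, 0, 1],
                [1, 1, 0, 0, 0, 0],
                [1, 0, 0, 0, 0, 0],
                [0, 1, 1, 0, 0, 0]], dtype=float)
scores = rwr(adj, seeds={0, 3})                    # known disease miRNA 0 and target gene 3
print("nodes ranked by disease-association score:", np.argsort(-scores))
```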

  1. Estimating time since infection in early homogeneous HIV-1 samples using a poisson model

    PubMed Central

    2010-01-01

    Background The occurrence of a genetic bottleneck in HIV sexual or mother-to-infant transmission has been well documented. This results in a majority of new infections being homogeneous, i.e., initiated by a single genetic strain. Early after infection, prior to the onset of the host immune response, the viral population grows exponentially. In this simple setting, an approach for estimating evolutionary and demographic parameters based on comparison of diversity measures is a feasible alternative to the existing Bayesian methods (e.g., BEAST), which are instead based on the simulation of genealogies. Results We have devised a web tool that analyzes genetic diversity in acutely infected HIV-1 patients by comparing it to a model of neutral growth. More specifically, we consider a homogeneous infection (i.e., initiated by a unique genetic strain) prior to the onset of host-induced selection, where we can assume a random accumulation of mutations. Previously, we have shown that such a model successfully describes about 80% of sexual HIV-1 transmissions provided the samples are drawn early enough in the infection. Violation of the model is an indicator of either heterogeneous infections or the initiation of selection. Conclusions When the underlying assumptions of our model (homogeneous infection prior to selection and fast exponential growth) are met, we are under a very particular scenario for which we can use a forward approach (instead of backwards in time as provided by coalescent methods). This allows for more computationally efficient methods to derive the time since the most recent common ancestor. Furthermore, the tool performs statistical tests on the Hamming distance frequency distribution, and outputs summary statistics (mean of the best fitting Poisson distribution, goodness of fit p-value, etc). The tool runs within minutes and can readily accommodate the tens of thousands of sequences generated through new ultradeep pyrosequencing technologies. The tool is available on the LANL website. PMID:20973976
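
    The central statistical step described above, fitting a Poisson distribution to the pairwise Hamming distance spectrum and testing goodness of fit, can be sketched as follows. This is a simplified illustration, not the LANL tool: the toy sequences are invented, and the conversion from the fitted mean to time since infection (which requires a mutation-rate and sampling model) is deliberately omitted.

```python
import numpy as np
from scipy.stats import poisson, chisquare

def pairwise_hamming(seqs):
    """Hamming distances between all pairs of equal-length aligned sequences."""
    arr = np.array([list(s) for s in seqs])
    n = len(seqs)
    return np.array([(arr[i] != arr[j]).sum()
                     for i in range(n) for j in range(i + 1, n)])

# Invented toy alignment (real inputs would be many early HIV-1 sequences).
seqs = ["ACGTACGTAC", "ACGTACGTAC", "ACGTACGTAA",
        "ACGAACGTAC", "ACGTACCTAC", "ACGTACGTAC"]
hd = pairwise_hamming(seqs)

lam = hd.mean()                                    # ML estimate of the Poisson mean
kmax = hd.max()
observed = np.bincount(hd, minlength=kmax + 1).astype(float)
expected = poisson.pmf(np.arange(kmax + 1), lam) * hd.size
expected *= observed.sum() / expected.sum()        # rescale so totals match exactly
stat, p = chisquare(observed, expected)

print(f"mean Hamming distance (Poisson lambda): {lam:.3f}")
print(f"goodness-of-fit chi-square p-value: {p:.3f}")
```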

  2. Old document image segmentation using the autocorrelation function and multiresolution analysis

    NASA Astrophysics Data System (ADS)

    Mehri, Maroua; Gomez-Krämer, Petra; Héroux, Pierre; Mullot, Rémy

    2013-01-01

    Recent progress in the digitization of heterogeneous collections of ancient documents has raised new challenges in information retrieval in digital libraries and document layout analysis. Therefore, in order to control the quality of historical document image digitization and to meet the need for a characterization of their content using intermediate-level metadata (between image and document structure), we propose a fast automatic layout segmentation of old document images based on five descriptors. These descriptors, based on the autocorrelation function, are obtained by multiresolution analysis and used afterwards in a specific clustering method. The method proposed in this article has the advantage that it is performed without any hypothesis on the document structure, either about the document model (physical structure) or the typographical parameters (logical structure). It is also parameter-free, since it automatically adapts to the image content. In this paper, we first detail our proposal to characterize the content of old documents by extracting the autocorrelation features in the different areas of a page and at several resolutions. Then, we show that it is possible to automatically find the homogeneous regions defined by similar autocorrelation indices, without knowledge of the number of clusters, using adapted hierarchical ascendant classification and consensus clustering approaches. To assess our method, we apply our algorithm to 316 old document images, which encompass six centuries (1200-1900) of French history, in order to demonstrate the performance of our proposal in terms of segmentation and characterization of heterogeneous corpus content. Moreover, we define a new evaluation metric, the homogeneity measure, which aims at evaluating the segmentation and characterization accuracy of our methodology. We obtain a mean homogeneity accuracy of 85%. These results help to represent a document by a hierarchy of layout structure and content, and to define one or more signatures for each page, on the basis of a hierarchical representation of homogeneous blocks and their topology.
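
    The autocorrelation descriptors at the heart of the method above can be computed efficiently through the Wiener-Khinchin relation (autocorrelation as the inverse FFT of the power spectrum). The snippet below is a generic sketch on a synthetic block, not the authors' five-descriptor multiresolution pipeline; the block size and the "text-like" stripe pattern are assumptions.

```python
import numpy as np

def autocorrelation_2d(block):
    """Normalised 2D autocorrelation of an image block via FFT
    (Wiener-Khinchin: autocorrelation = inverse FFT of the power spectrum)."""
    block = block - block.mean()
    spectrum = np.fft.fft2(block)
    ac = np.fft.ifft2(np.abs(spectrum) ** 2).real
    ac = np.fft.fftshift(ac)
    return ac / ac.max()

# Synthetic 64x64 "text-like" block: horizontal stripes mimic regular text lines.
rows = np.arange(64)
block = ((rows % 8) < 3).astype(float)[:, None] * np.ones((64, 64))
block += 0.1 * np.random.default_rng(0).normal(size=(64, 64))

ac = autocorrelation_2d(block)
# A simple periodicity feature: the autocorrelation profile along the vertical axis,
# which peaks every 8 pixels for this line spacing.
centre = 32
print("vertical autocorrelation profile:", np.round(ac[centre:centre + 12, centre], 2))
```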

  3. Numerical methods in Markov chain modeling

    NASA Technical Reports Server (NTRS)

    Philippe, Bernard; Saad, Youcef; Stewart, William J.

    1989-01-01

    Several methods for computing stationary probability distributions of Markov chains are described and compared. The main linear algebra problem consists of computing an eigenvector of a sparse, usually nonsymmetric, matrix associated with a known eigenvalue. It can also be cast as a problem of solving a homogeneous singular linear system. Several methods based on combinations of Krylov subspace techniques are presented. The performance of these methods on some realistic problems are compared.
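
    As a small illustration of the linear-algebra problem described above (an eigenvector with known eigenvalue 1, equivalently a homogeneous singular system), the sketch below computes a stationary distribution with the power method and cross-checks it with a direct least-squares solve; it does not implement the Krylov-subspace schemes discussed in the record, and the 3-state transition matrix is invented.

```python
import numpy as np

def stationary_power(P, tol=1e-12, max_iter=10_000):
    """Stationary distribution pi of a row-stochastic matrix P (pi P = pi)
    computed with the power method."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        nxt = pi @ P
        if np.abs(nxt - pi).max() < tol:
            break
        pi = nxt
    return pi / pi.sum()

def stationary_direct(P):
    """Solve the homogeneous singular system pi (P - I) = 0 with sum(pi) = 1."""
    n = P.shape[0]
    A = np.vstack([(P - np.eye(n)).T, np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.1, 0.3, 0.6]])   # illustrative 3-state chain
print("power method :", np.round(stationary_power(P), 6))
print("direct solve :", np.round(stationary_direct(P), 6))
```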

  4. Continuous-flow extraction system for elemental association study: a case of synthetic metal-doped iron hydroxide.

    PubMed

    Hinsin, Duangduean; Pdungsap, Laddawan; Shiowatana, Juwadee

    2002-12-06

    A continuous-flow extraction system originally developed for sequential extraction was applied to study elemental association of a synthetic metal-doped amorphous iron hydroxide phase. The homogeneity and metal association of the precipitates were evaluated by gradual leaching using the system. Leachate was collected in fractions for determination of elemental concentrations. The result obtained as extractograms indicated that the doped metals were adsorbed more on the outermost surface rather than homogeneously distributed in the precipitates. The continuous-flow extraction method was also used for effective removal of surface adsorbed metals to obtain a homogeneous metal-doped synthetic iron hydroxide by a sequential extraction using acetic acid and small volume of hydroxylamine hydrochloride solution. The system not only ensures complete washing, but the extent of metal immobilization in the synthetic iron hydroxide could be determined with high accuracy from the extractograms. The initial metal/iron mole ratio (M/Fe) in solution affected the M/Fe mole ratio in homogeneous doped iron hydroxide phase. The M/Fe mole ratio of metal incorporation was approximately 0.01-0.02 and 0.03-0.06, for initial solution M/Fe mole ratio of 0.025 and 0.100, respectively.

  5. Nanocomposites based on banana starch reinforced with cellulose nanofibers isolated from banana peels.

    PubMed

    Pelissari, Franciele Maria; Andrade-Mahecha, Margarita María; Sobral, Paulo José do Amaral; Menegalli, Florencia Cecilia

    2017-11-01

    Cellulose nanofibers were isolated from banana peel using a combination of chemical and mechanical treatments with different numbers of passages through a high-pressure homogenizer (0, 3, 5, and 7 passages). New nanocomposites were then prepared from a mixed suspension of banana starch and cellulose nanofibers using the casting method, and the effect of the addition of these nanofibers on the properties of the resulting nanocomposites was investigated. The cellulose nanofibers homogeneously dispersed in the starch matrix increased the glass transition temperature, owing to the strong intermolecular interactions between the starch and cellulose. The nanocomposites exhibited significantly increased tensile strength, Young's modulus, water resistance, opacity, and crystallinity as the number of passages through the homogenizer increased. However, a more drastic mechanical treatment (seven passages) caused defects in the nanofibers, deteriorating the nanocomposite properties. The most suitable mechanical treatment condition for the preparation of cellulose nanofibers and the corresponding nanocomposite was five passages through the high-pressure homogenizer. In general, the cellulose nanofibers improved the features of the starch-based material and are potentially applicable as reinforcing elements in a variety of polymer composites.

  6. Investigating the spectral characteristics of backscattering from heterogeneous spherical nuclei using broadband finite-difference time-domain simulations

    NASA Astrophysics Data System (ADS)

    Chao, Guo-Shan; Sung, Kung-Bin

    2010-01-01

    Reflectance spectra measured from epithelial tissue have been used to extract size distribution and refractive index of cell nuclei for noninvasive detection of precancerous changes. Despite many in vitro and in vivo experimental results, the underlying mechanism of sizing nuclei based on modeling nuclei as homogeneous spheres and fitting the measured data with Mie theory has not been fully explored. We describe the implementation of a three-dimensional finite-difference time-domain (FDTD) simulation tool using a Gaussian pulse as the light source to investigate the wavelength-dependent characteristics of backscattered light from a nuclear model consisting of a nucleolus and clumps of chromatin embedded in homogeneous nucleoplasm. The results show that small-sized heterogeneities within the nuclei generate about five times higher backscattering than homogeneous spheres. More interestingly, backscattering spectra from heterogeneous spherical nuclei show periodic oscillations similar to those from homogeneous spheres, leading to high accuracy of estimating the nuclear diameter by comparison with Mie theory. In addition to the application in light scattering spectroscopy, the reported FDTD method could be adapted to study the relations between measured spectral data and nuclear structures in other optical imaging and spectroscopic techniques for in vivo diagnosis.

  7. New quality-control materials for the determination of alkylphenols and alkylphenol ethoxylates in sewage sludge.

    PubMed

    Fernández-Sanjuan, María; Lacorte, Silvia; Rigol, Anna; Sahuquillo, Angels

    2012-11-01

    The determination of alkylphenols in sewage sludge is still hindered by the complexity of the matrix and of the analytes, some of which are a mixture of isomers. Most of the methods published in the literature have not been validated, due to the lack of reference materials for the determination of alkylphenols in sludge. Given this situation, the objectives of the present study were to develop a new quality-control material for determining octylphenol, nonylphenol and nonylphenol monoethoxylate in sludge. The material was prepared from an anaerobically digested sewage sludge, which was thermally dried, sieved, homogenized and bottled after checking for the bulk homogeneity of the processed material. Together with the sewage sludge, an extract was also prepared, in order to provide a quality-control material for allowing laboratories to test the measuring step. The homogeneity and 1-year stability of the two materials were evaluated. Statistical analysis proved that the materials were homogeneous and stable for at least 1 year stored at different temperatures. These materials are intended to assist in the quality control of the determination of alkylphenols and alkylphenol ethoxylates in sewage sludge.

  8. Cryomilling for the fabrication of doxorubicin-containing silica-nanoparticle/polycaprolactone nanocomposite films

    NASA Astrophysics Data System (ADS)

    Gao, Yu; Lim, Jing; Han, Yiyuan; Wang, Lifeng; Chong, Mark Seow Khoon; Teoh, Swee-Hin; Xu, Chenjie

    2016-01-01

    Bionanocomposites need to have a homogeneous distribution of nanomaterials in the polymeric matrix to achieve consistent mechanical and biological functions. However, a significant challenge lies in achieving the homogeneous distribution of nanomaterials, particularly through a solvent-free approach. This report introduces a technology to address this need. Specifically, cryomilling, a solvent-free, low-temperature processing method, was applied to generate a bionanocomposite film with well-dispersed nanoparticles. As a proof-of-concept, polycaprolactone (PCL) and doxorubicin-containing silica nanoparticles (Si-Dox) were processed through cryomilling and subsequently heat pressed to form the PCL/Si-Dox (cPCL/Si-Dox) film. Homogeneous distribution of Si-Dox was observed under both confocal imaging and atomic force microscopy imaging. The mechanical properties of cPCL/Si-Dox were comparable to those of the pure PCL film. Subsequent in vitro release profiles suggested that sustained release of Dox from the cPCL/Si-Dox film was achievable over 50 days. When human cervical cancer cells were seeded directly on these films, uptake of Dox was observed as early as day 1 and significant inhibition of cell growth was recorded on day 5. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr07287e

  9. Layering, interface and edge effects in multi-layered composite medium

    NASA Technical Reports Server (NTRS)

    Datta, S. K.; Shah, A. H.; Karunesena, W.

    1990-01-01

    Guided waves in a cross-ply laminated plate are studied. Because of the complexity of the exact dispersion equation that governs the wave propagation in a multi-layered fiber-reinforced plate, a stiffness method that can be applied to any number of layers is presented. It is shown that, for a sufficiently large number of layers, the plate can be modeled as a homogeneous anisotropic plate. Also studied is the reflection of guided waves from the edge of a multilayered plate. These results differ considerably from those for a single homogeneous plate.

  10. Modularization of gradient-index optical design using wavefront matching enabled optimization.

    PubMed

    Nagar, Jogender; Brocker, Donovan E; Campbell, Sawyer D; Easum, John A; Werner, Douglas H

    2016-05-02

    This paper proposes a new design paradigm which allows for a modular approach to replacing a homogeneous optical lens system with a higher-performance GRadient-INdex (GRIN) lens system using a WaveFront Matching (WFM) method. In multi-lens GRIN systems, a full-system-optimization approach can be challenging due to the large number of design variables. The proposed WFM design paradigm enables optimization of each component independently by explicitly matching the WaveFront Error (WFE) of the original homogeneous component at the exit pupil, resulting in an efficient design procedure for complex multi-lens systems.

  11. Numerical study for melting heat transfer and homogeneous-heterogeneous reactions in flow involving carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Hayat, Tasawar; Muhammad, Khursheed; Alsaedi, Ahmed; Asghar, Saleem

    2018-03-01

    The present work concentrates on melting heat transfer in three-dimensional flow of a nanofluid over an impermeable stretchable surface. The analysis is made in the presence of a porous medium and homogeneous-heterogeneous reactions. Single- and multi-wall carbon nanotubes (CNTs) are considered, with water chosen as the base fluid. Suitable transformations yield the non-linear ordinary differential systems, whose solutions are obtained using the shooting method. The impacts of the influential variables on velocity and temperature are discussed graphically, and the skin friction coefficient and Nusselt number are discussed numerically. The results for MWCNTs and SWCNTs are compared and examined.
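
    The shooting method mentioned above can be illustrated on a much simpler two-point boundary value problem than the transformed nanofluid equations. The example below (y'' = 6y², y(0) = 1, y(1) = 1/4, with exact solution y = 1/(1+x)²) is a standard textbook case chosen only to show the mechanics of shooting; it is not the system from the study.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Two-point boundary value problem (illustrative only):
#   y'' = 6 y^2,  y(0) = 1,  y(1) = 1/4   (exact solution y = 1/(1+x)^2)

def rhs(x, state):
    y, yp = state
    return [yp, 6.0 * y**2]

def shoot(slope):
    """Integrate the IVP with a guessed initial slope y'(0) and return y(1) - 1/4."""
    sol = solve_ivp(rhs, (0.0, 1.0), [1.0, slope], rtol=1e-9, atol=1e-9)
    return sol.y[0, -1] - 0.25

# Bracket the unknown initial slope and solve the one-dimensional root problem.
slope = brentq(shoot, -3.0, -1.0)
print(f"shooting method: y'(0) = {slope:.6f} (exact value is -2)")
```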

  12. Efficient method for the calculation of mean extinction. II. Analyticity of the complex extinction efficiency of homogeneous spheroids and finite cylinders.

    PubMed

    Xing, Z F; Greenberg, J M

    1994-08-20

    The analyticity of the complex extinction efficiency is examined numerically in the size-parameter domain for homogeneous prolate and oblate spheroids and finite cylinders. The T-matrix code, which is the most efficient program available to date, is employed to calculate the individual particle-extinction efficiencies. Because of its computational limitations in the size-parameter range, a slightly modified Hilbert-transform algorithm is required to establish the analyticity numerically. The findings concerning analyticity that we reported for spheres (Astrophys. J. 399, 164-175, 1992) apply equally to these nonspherical particles.

  13. Diversity and homogeneity of oral microbiota in healthy Korean pre-school children using pyrosequencing.

    PubMed

    Lee, Soo Eon; Nam, Ok Hyung; Lee, Hyo-Seol; Choi, Sung Chul

    2016-07-01

    Objectives: The purpose of this study was to identify the oral microbiota in healthy Korean pre-school children using pyrosequencing. Materials and methods: Dental plaque samples were obtained from 10 caries-free pre-school children and analysed using pyrosequencing. Results: The pyrosequencing analysis revealed that, at the phylum level, Proteobacteria, Firmicutes, Bacteroidetes, Actinobacteria and Fusobacteria showed high abundance. Predominant genera such as Streptococcus, Neisseria, Capnocytophaga, Haemophilus and Veillonella were identified as the core microbiome. Conclusions: Both diversity and homogeneity were observed in the dental plaque microbiota of healthy Korean pre-school children.

  14. Three dimensional radiative flow of magnetite-nanofluid with homogeneous-heterogeneous reactions

    NASA Astrophysics Data System (ADS)

    Hayat, Tasawar; Rashid, Madiha; Alsaedi, Ahmed

    2018-03-01

    The present communication deals with the effects of homogeneous-heterogeneous reactions in the flow of a nanofluid over a non-linearly stretching sheet. A water-based nanofluid containing magnetite nanoparticles is considered, and non-linear radiation and non-uniform heat sink/source effects are examined. The non-linear differential systems are computed by the optimal homotopy analysis method (OHAM), convergent solutions are established, and the optimal values of the auxiliary variables are obtained. The impact of several non-dimensional parameters on the velocity components, temperature and concentration fields is examined, and graphs are plotted for the analysis of surface drag force and heat transfer rate.

  15. Applications to car bodies - Generalized layout design of three-dimensional shells

    NASA Technical Reports Server (NTRS)

    Fukushima, Junichi; Suzuki, Katsuyuki; Kikuchi, Noboru

    1993-01-01

    We shall describe applications of the homogenization method, formulated in Part 1, to the layout design of car bodies represented by three-dimensional shell structures, based on multi-loading optimization.

  16. Homogeneous fast-flux isotope-production reactor

    DOEpatents

    Cawley, W.E.; Omberg, R.P.

    1982-08-19

    A method is described for producing tritium in a liquid metal fast breeder reactor. Lithium target material is dissolved in the liquid metal coolant in order to facilitate the production and removal of tritium.

  17. Worldwide cloud cover model

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Sommerville, P. N.

    1979-01-01

    The classification of worldwide cloudiness into homogeneous regions is considered, using a satellite data set containing day IR, night IR, incoming, and absorbed solar radiation measurements on a 2.5-degree latitude-longitude grid. Methods of analysis are presented.

  18. Energetic materials and methods of tailoring electrostatic discharge sensitivity of energetic materials

    DOEpatents

    Daniels, Michael A.; Heaps, Ronald J.; Wallace, Ronald S.; Pantoya, Michelle L.; Collins, Eric S.

    2016-11-01

    An energetic material comprising an elemental fuel, an oxidizer or other element, and a carbon nanofiller or carbon fiber rods, where the carbon nanofiller or carbon fiber rods are substantially homogeneously dispersed in the energetic material. Methods of tailoring the electrostatic discharge sensitivity of an energetic material are also disclosed.

  19. Extraction and functional properties of non-zein proteins in corn germ from wet-milling

    USDA-ARS?s Scientific Manuscript database

    This study was conducted to develop methods of extracting corn germ protein and characterize and identify potential applications of the recovered protein. Protein was extracted from both wet germ and finished (dried) germ using 0.1M NaCl as solvent. The method involved homogenization, stirring, cent...

  20. One-step purification of nisin A by immunoaffinity chromatography.

    PubMed

    Suárez, A M; Azcona, J I; Rodríguez, J M; Sanz, B; Hernández, P E

    1997-12-01

    The lantibiotic nisin A was purified to homogeneity by a single-step immunoaffinity chromatography method. An immunoadsorption matrix was developed by direct binding of anti-nisin A monoclonal antibodies to N-hydroxysuccinimide-activated Sepharose. The purification procedure was rapid and reproducible and rendered much higher final yields of nisin than any other described method.

  1. Note on the eigensolution of a homogeneous equation with semi-infinite domain

    NASA Technical Reports Server (NTRS)

    Wadia, A. R.

    1980-01-01

    The 'variation-iteration' method using Green's functions to find the eigenvalues and the corresponding eigenfunctions of a homogeneous Fredholm integral equation is employed for the stability analysis of fluid hydromechanics problems with a semi-infinite (infinite) domain of application. The objective of the study is to develop a suitable numerical approach to the solution of such equations in order to better understand the full set of equations for 'real-world' flow models. The study involves a search for a suitable value of the length of the domain which is a fair finite approximation to infinity, which makes the eigensolution an approximation dependent on the length of the interval chosen. In the examples investigated y = 1 = a seems to be the best approximation of infinity; for y greater than unity this method fails due to the polynomial nature of Green's functions.

  2. Parallel fast multipole boundary element method applied to computational homogenization

    NASA Astrophysics Data System (ADS)

    Ptaszny, Jacek

    2018-01-01

    In the present work, a fast multipole boundary element method (FMBEM) and a parallel computer code for the 3D elasticity problem are developed and applied to the computational homogenization of a solid containing spherical voids. The system of equations is solved using the GMRES iterative solver. The boundary of the body is discretized using quadrilateral serendipity elements with adaptive numerical integration. Operations related to a single GMRES iteration, performed by traversing the corresponding tree structure upwards and downwards, are parallelized using the OpenMP standard. The assignment of tasks to threads is based on the assumption that the tree nodes at which the moment transformations are initialized can be partitioned into disjoint sets of equal or approximately equal size and assigned to the threads. The achieved speedup as a function of the number of threads is examined.
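
    In the FMBEM above, each GMRES iteration applies the system operator matrix-free by traversing the multipole tree. The idea of plugging a matrix-free operator into GMRES can be sketched in SciPy as below; the operator here is a trivial dense stand-in rather than a fast multipole evaluation, and the problem size and restart length are arbitrary.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

rng = np.random.default_rng(0)
n = 200
A = np.eye(n) + 0.01 * rng.normal(size=(n, n))   # well-conditioned stand-in system
b = rng.normal(size=n)

def matvec(x):
    # In an FMBEM code this would traverse the octree (upward and downward passes)
    # instead of multiplying by an assembled matrix.
    return A @ x

op = LinearOperator((n, n), matvec=matvec)
x, info = gmres(op, b, restart=50)
print("GMRES converged:", info == 0, "| residual norm:", np.linalg.norm(A @ x - b))
```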

  3. Synthesis of porous nanocrystalline NiO with hexagonal sheet-like morphology by homogeneous precipitation method

    NASA Astrophysics Data System (ADS)

    Sharma, Ravi Kant; Ghose, Ranjana

    2015-04-01

    Porous nanocrystalline NiO has been synthesized by a simple homogeneous precipitation method in short time at low calcination temperature without using any surfactant, chelating or gelating agents. The porous nanocrystalline NiO with a hexagonal sheet-like morphology were obtained by calcination of Ni(OH)2 nanoflakes at 500 °C. The calcination temperature strongly influences the morphology, crystallite size, specific surface area, pore volume and optical band gap of the samples. The samples were characterized using powder X-ray diffraction, thermal gravimetric analysis, FT-IR spectroscopy, UV-Visible diffuse reflectance spectroscopy, surface area measurements, field emission scanning electron microscopy coupled with energy dispersive X-ray analysis and transmission electron microscopy. The chemical activity of the samples was tested by catalytic reduction of 4-nitrophenol with NaBH4.

  4. Synthesis of focused beam with controllable arbitrary homogeneous polarization using engineered vectorial optical fields.

    PubMed

    Rui, Guanghao; Chen, Jian; Wang, Xiaoyan; Gu, Bing; Cui, Yiping; Zhan, Qiwen

    2016-10-17

    The propagation and focusing properties of light beams remain an active research interest owing to their promising applications in physics, chemistry and the biological sciences. One of the main challenges in these applications is the control of the polarization distribution within the focal volume. In this work, we propose and experimentally demonstrate a method for generating a focused beam with arbitrary homogeneous polarization at any transverse plane. The required input field at the pupil plane of a high numerical aperture objective lens can be found analytically by solving an inverse problem with the Richards-Wolf vectorial diffraction method, and can be experimentally created with a vectorial optical field generator. Focused fields with various polarizations are successfully generated and verified using Stokes parameter measurements to demonstrate the capability and versatility of the proposed technique.

  5. Rapid characterization of the chemical constituents of Cortex Fraxini by homogenate extraction followed by UHPLC coupled with Fourier transform ion cyclotron resonance mass spectrometry and GC-MS.

    PubMed

    Wang, Yinan; Han, Fei; Song, Aihua; Wang, Miao; Zhao, Min; Zhao, Chunjie

    2016-11-01

    Cortex Fraxini is an important traditional Chinese medicine. In this work, a rapid and reliable homogenate extraction method was applied for the fast extraction of Cortex Fraxini, and the method was optimized by response surface methodology. Ultra-high performance liquid chromatography combined with Fourier transform ion cyclotron resonance mass spectrometry and gas chromatography with mass spectrometry were used for the separation and characterization of the constituents of Cortex Fraxini. Liquid chromatography separation was conducted on a C18 column (150 mm × 2.1 mm, 1.8 μm), and gas chromatography separation was performed on a capillary column with a 5% phenyl-methylpolysiloxane stationary phase (30 m × 0.25 mm × 0.25 mm) by injection of silylated samples. According to the results, 33 chemical compounds were characterized by liquid chromatography with mass spectrometry and 11 by gas chromatography with mass spectrometry; coumarins were the major components identified by both techniques. The proposed homogenate extraction was an efficient and rapid method, and coumarins, phenylethanoid glycosides, iridoid glycosides, phenylpropanoids, and lignans were found to be the main constituents of Cortex Fraxini. This work lays the foundation for further study of Cortex Fraxini and will be helpful for the rapid extraction and characterization of ingredients in other traditional Chinese medicines.

  6. A new multigroup method for cross-sections that vary rapidly in energy

    NASA Astrophysics Data System (ADS)

    Haut, T. S.; Ahrens, C.; Jonko, A.; Lowrie, R.; Till, A.

    2017-01-01

    We present a numerical method for solving the time-independent thermal radiative transfer (TRT) equation or the neutron transport (NT) equation when the opacity (cross-section) varies rapidly in frequency (energy) on the microscale ε; ε corresponds to the characteristic spacing between absorption lines or resonances, and is much smaller than the macroscopic frequency (energy) variation of interest. The approach is based on a rigorous homogenization of the TRT/NT equation in the frequency (energy) variable. Discretization of the homogenized TRT/NT equation results in a multigroup-type system, and can therefore be solved by standard methods. We demonstrate the accuracy and efficiency of the approach on three model problems. First we consider the Elsasser band model with constant temperature and a line spacing ε = 10⁻⁴. Second, we consider a neutron transport application for fast neutrons incident on iron, where the characteristic resonance spacing ε necessitates ≈16,000 energy discretization parameters if Planck-weighted cross sections are used. Third, we consider an atmospheric TRT problem for an opacity corresponding to water vapor over a frequency range 1000-2000 cm⁻¹, where we take 12 homogeneous layers between 1-15 km, and temperature/pressure values in each layer from the standard US atmosphere. For all three problems, we demonstrate that we can achieve between 0.1 and 1 percent relative error in the solution, and with several orders of magnitude fewer parameters than a standard multigroup formulation using Planck-weighted (source-weighted) opacities for a comparable accuracy.

  7. A robust LC-MS/MS method for the determination of pidotimod in different biological matrixes and its application to in vivo and in vitro pharmacokinetic studies.

    PubMed

    Wang, Guangji; Wang, Qian; Rao, Tai; Shen, Boyu; Kang, Dian; Shao, Yuhao; Xiao, Jingcheng; Chen, Huimin; Liang, Yan

    2016-06-15

    Pidotimod, (R)-3-[(S)-(5-oxo-2-pyrrolidinyl) carbonyl]-thiazolidine-4-carboxylic acid, is frequently used to treat children with recurrent respiratory infections, yet its preclinical pharmacokinetics has rarely been reported to date. Herein, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed and validated to determine pidotimod in rat plasma, tissue homogenate and Caco-2 cells. Phenacetin was chosen as the internal standard because of the similarity of its chromatographic and mass spectrometric characteristics to those of pidotimod. The plasma calibration curves were established within the concentration range of 0.01-10.00 μg/mL, and similar linear curves were built using tissue homogenate and Caco-2 cells. The calibration curves for all biological samples showed good linearity (r>0.99) over the concentration ranges tested. The intra- and inter-day precision (RSD, %) values were below 15% and accuracy (RE, %) ranged from -15% to 15% at all quality control levels. For plasma, tissue homogenate and Caco-2 cells, no obvious matrix effect was found, and the average recoveries were all above 75%. Thus, the method demonstrated excellent accuracy, precision and robustness for high-throughput applications, and was successfully applied to studies of the absorption in rat plasma, distribution in rat tissues and intracellular uptake characteristics in Caco-2 cells of pidotimod.

  8. Development of salt and pH-induced solidified floating organic droplets homogeneous liquid-liquid microextraction for extraction of ten pyrethroid insecticides in fresh fruits and fruit juices followed by gas chromatography-mass spectrometry.

    PubMed

    Torbati, Mohammadali; Farajzadeh, Mir Ali; Torbati, Mostafa; Nabil, Ali Akbar Alizadeh; Mohebbi, Ali; Afshar Mogaddam, Mohammad Reza

    2018-01-01

    A new microextraction method, named salt- and pH-induced homogeneous liquid-liquid microextraction, has been developed in a home-made extraction device for the extraction and preconcentration of some pyrethroid insecticides from different fruit juice samples prior to gas chromatography-mass spectrometry. In the present work, an extraction device made from two parallel glass tubes with different lengths and diameters was used in the microextraction procedure. In this method, a homogeneous solution of the sample solution and an extraction solvent (pivalic acid) was broken by performing an acid-base reaction, and the extraction solvent was produced throughout the solution. The resulting droplets of the extraction solvent rose through the solution and were solidified using an ice bath, then collected without a centrifugation step. Under the optimum conditions, the limits of detection and quantification were in the ranges of 0.006-0.038 and 0.023-0.134 ng mL⁻¹, respectively. The enrichment factors and extraction recoveries of the selected analytes were in the ranges of 365-460 and 73-92%, respectively. The relative standard deviations were lower than 9% for intra-day (n = 6) and inter-day (n = 4) precision at a concentration of 1 ng mL⁻¹ of each analyte. Finally, some fruit juice samples were effectively analyzed by the proposed method.

  9. Evaluating Feynman integrals by the hypergeometry

    NASA Astrophysics Data System (ADS)

    Feng, Tai-Fu; Chang, Chao-Hsi; Chen, Jian-Bin; Gu, Zhi-Hua; Zhang, Hai-Bin

    2018-02-01

    The hypergeometric function method naturally provides the analytic expressions of scalar integrals from concerned Feynman diagrams in some connected regions of independent kinematic variables, also presents the systems of homogeneous linear partial differential equations satisfied by the corresponding scalar integrals. Taking examples of the one-loop B0 and massless C0 functions, as well as the scalar integrals of two-loop vacuum and sunset diagrams, we verify our expressions coinciding with the well-known results of literatures. Based on the multiple hypergeometric functions of independent kinematic variables, the systems of homogeneous linear partial differential equations satisfied by the mentioned scalar integrals are established. Using the calculus of variations, one recognizes the system of linear partial differential equations as stationary conditions of a functional under some given restrictions, which is the cornerstone to perform the continuation of the scalar integrals to whole kinematic domains numerically with the finite element methods. In principle this method can be used to evaluate the scalar integrals of any Feynman diagrams.

  10. Generation of Regionally Specific Neural Progenitor Cells (NPCs) and Neurons from Human Pluripotent Stem Cells (hPSCs).

    PubMed

    Cutts, Josh; Brookhouser, Nicholas; Brafman, David A

    2016-01-01

    Neural progenitor cells (NPCs) derived from human pluripotent stem cells (hPSCs) are a multipotent cell population capable of long-term expansion and differentiation into a variety of neuronal subtypes. As such, NPCs have tremendous potential for disease modeling, drug screening, and regenerative medicine. Current methods for the generation of NPCs result in cell populations homogeneous for pan-neural markers such as SOX1 and SOX2 but heterogeneous with respect to regional identity. In order to use NPCs and their neuronal derivatives to investigate mechanisms of neurological disorders and develop more physiologically relevant disease models, methods for generation of regionally specific NPCs and neurons are needed. Here, we describe a protocol in which exogenous manipulation of WNT signaling, through either activation or inhibition, during neural differentiation of hPSCs, promotes the formation of regionally homogeneous NPCs and neuronal cultures. In addition, we provide methods to monitor and characterize the efficiency of hPSC differentiation to these regionally specific cell identities.

  11. Safranal-loaded solid lipid nanoparticles: evaluation of sunscreen and moisturizing potential for topical applications

    PubMed Central

    Khameneh, Bahman; Halimi, Vahid; Jaafari, Mahmoud Reza; Golmohammadzadeh, Shiva

    2015-01-01

    Objective(s): In the current study, the sunscreen and moisturizing properties of solid lipid nanoparticle (SLN)-safranal formulations were evaluated. Materials and Methods: A series of SLNs was prepared using glyceryl monostearate, Tween 80 and different amounts of safranal by high-shear homogenization followed by ultrasound, and by high-pressure homogenization (HPH). The SLN formulations were characterized for size, zeta potential, morphology, thermal properties, and encapsulation efficacy. The Sun Protection Factor (SPF) of the products was determined in vitro using transpore tape. The moisturizing activity of the products was also evaluated by corneometer. Results: The SPF of the SLN-safranal formulations increased as the amount of safranal increased. The mean particle size for all formulas was approximately 106 nm by probe sonication and 233 nm using the HPH method. The encapsulation efficiency of safranal was around 70% for all SLN-safranal formulations. Conclusion: SLN-safranal formulations were found to be effective for the topical delivery of safranal and provided appropriate sunscreen properties. PMID:25810877

  12. A Homogeneous Time-Resolved Fluorescence Immunoassay Method for the Measurement of Compound W.

    PubMed

    Huang, Biao; Yu, Huixin; Bao, Jiandong; Zhang, Manda; Green, William L; Wu, Sing-Yung

    2018-01-01

    Compound W (a 3,3'-diiodothyronine sulfate [T2S] immuno-crossreactive material)-specific polyclonal antibodies and a homogeneous time-resolved fluorescence immunoassay technique (AlphaLISA) were used to establish an indirect competitive compound W (ICW) quantitative detection method. Photosensitive particles (donor beads) coated with compound W or T2S and rabbit anti-W antibody were incubated with biotinylated goat anti-rabbit antibody, which together with streptavidin-coated acceptor particles constitutes the detection system. We optimized the test conditions and evaluated the detection performance. The sensitivity of the method was 5 pg/mL, and the detection range was 5 to 10 000 pg/mL. The intra-assay coefficient of variation averaged <10% with stable reproducibility. The ICW-AlphaLISA shows good stability and high sensitivity and can measure a wide range of compound W levels in extracts of maternal serum samples. This may have clinical application in screening for congenital hypothyroidism in utero.
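
    Quantitation in an indirect competitive assay such as the one above is commonly done by fitting a four-parameter logistic (4PL) calibration curve to the calibrator signals and inverting it for unknowns. The sketch below is a generic, hypothetical example with invented calibrator concentrations and AlphaLISA counts; it is not the authors' data-reduction procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ec50, hill):
    """Four-parameter logistic: in a competitive format the signal
    decreases as the analyte concentration increases."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

# Invented calibrators (pg/mL) and AlphaLISA counts (arbitrary units).
conc = np.array([5, 20, 80, 320, 1250, 5000, 10000], dtype=float)
signal = np.array([98000, 91000, 72000, 45000, 21000, 9000, 7000], dtype=float)

params, _ = curve_fit(four_pl, conc, signal, p0=[5000, 100000, 300, 1.0],
                      bounds=([0.0, 0.0, 1.0, 0.1], [np.inf, np.inf, np.inf, 10.0]))
bottom, top, ec50, hill = params

def invert(sig):
    """Back-calculate concentration from a measured signal via the fitted 4PL."""
    return ec50 * ((top - bottom) / (sig - bottom) - 1.0) ** (1.0 / hill)

print(f"fitted EC50 = {ec50:.1f} pg/mL, Hill slope = {hill:.2f}")
print(f"unknown with signal 50000 -> {invert(50000.0):.1f} pg/mL")
```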

  13. An efficient numerical method for the solution of the problem of elasticity for 3D-homogeneous elastic medium with cracks and inclusions

    NASA Astrophysics Data System (ADS)

    Kanaun, S.; Markov, A.

    2017-06-01

    An efficient numerical method for the solution of static problems of elasticity for an infinite homogeneous medium containing inhomogeneities (cracks and inclusions) is developed. A finite number of heterogeneous inclusions and planar parallel cracks of arbitrary shapes is considered. The problem is reduced to a system of surface integral equations for the crack opening vectors and volume integral equations for the stress tensors inside the inclusions. For the numerical solution of these equations, a class of Gaussian approximating functions is used. The method based on these functions is mesh-free. For such functions, the elements of the matrix of the discretized system are combinations of explicit analytical functions and five standard 1D integrals that can be tabulated. Thus, numerical integration is excluded from the construction of the matrix of the discretized problem. For regular node grids, the matrix of the discretized system has Toeplitz properties, and the fast Fourier transform technique can be used to calculate matrix-vector products with such matrices.
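
    The closing remark above, that Toeplitz structure on a regular node grid allows FFT-accelerated matrix-vector products, can be demonstrated with the standard circulant-embedding trick. The snippet below is a generic 1D sketch, not the authors' 3D code; the matrix size and entries are arbitrary.

```python
import numpy as np

def toeplitz_matvec(col, row, x):
    """Product T @ x for the Toeplitz matrix with first column `col` and
    first row `row`, via embedding in a circulant matrix and using the FFT.
    Cost is O(n log n) instead of O(n^2)."""
    n = len(col)
    # First column of the (2n-1) x (2n-1) circulant embedding.
    c = np.concatenate([col, row[:0:-1]])
    x_pad = np.concatenate([x, np.zeros(n - 1)])
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x_pad)).real
    return y[:n]

# Check against a dense Toeplitz product on a small random example.
rng = np.random.default_rng(0)
n = 6
col = rng.normal(size=n)
row = np.concatenate([[col[0]], rng.normal(size=n - 1)])
x = rng.normal(size=n)

T = np.array([[col[i - j] if i >= j else row[j - i] for j in range(n)]
              for i in range(n)])
print("max abs error vs dense product:",
      np.abs(T @ x - toeplitz_matvec(col, row, x)).max())
```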

  14. Partial homogeneity based high-resolution nuclear magnetic resonance spectra under inhomogeneous magnetic fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Zhiliang; Lin, Liangjie; Lin, Yanqin, E-mail: linyq@xmu.edu.cn, E-mail: chenz@xmu.edu.cn

    2014-09-29

    In nuclear magnetic resonance (NMR), it is of great importance to obtain high-resolution spectra, especially under inhomogeneous magnetic fields. In this study, a method based on partial homogeneity is proposed for retrieving high-resolution one-dimensional NMR spectra under inhomogeneous fields. Signals from a series of small voxels, which exhibit high resolution owing to their small sizes, are recorded simultaneously. An inhomogeneity correction algorithm based on pattern recognition is then developed to automatically correct the influence of field inhomogeneity, thus yielding high-resolution information. Experiments on chemical solutions and fish spawn were carried out to demonstrate the performance of the proposed method. The proposed method serves as a single-radiofrequency-pulse high-resolution NMR spectroscopy under inhomogeneous fields and may provide an alternative route to obtaining high-resolution spectra of in vivo living systems or chemical-reaction systems, where the performance of conventional techniques is usually degraded by field inhomogeneity.

  15. Effects of collaboration and inquiry on reasoning and achievement in biology

    NASA Astrophysics Data System (ADS)

    Jensen, Jamie Lee

    The primary purpose of the present study was to compare the effectiveness of two collaborative grouping strategies and two instructional methods in terms of gains in reasoning ability and achievement in college biology. In order to do so, a quasi-experimental study was performed in which students were placed in one of four treatment conditions: heterogeneous grouping within inquiry instruction, homogeneous grouping within inquiry instruction, heterogeneous grouping within non-inquiry instruction, and homogeneous grouping within non-inquiry instruction. Students were placed in groups based on initial reasoning level. Reasoning levels and achievement gains were assessed at the end of the study. Results showed that within non-inquiry instruction, heterogeneous mean group scores were higher in both reasoning and achievement than homogeneous groups. In contrast, within inquiry instruction, homogeneous mean group scores were higher in both reasoning and achievement. Inquiry instruction, as a whole, significantly outperformed non-inquiry instruction in the development of reasoning ability. Within inquiry instruction, low-ability students had significantly greater reasoning gains when grouped homogeneously. These results support Piaget's developmental theory and contradict Vygotsky's developmental theory. These results also suggest that the success of one grouping strategy over another is highly dependent upon the nature of instruction, which may be a cause for such conflicting views on grouping strategies within the educational literature. In addition, inquiry instruction led to students having greater confidence in their reasoning ability as well as a more positive attitude toward collaboration. Instructional implications are discussed.

  16. A general multiscale framework for the emergent effective elastodynamics of metamaterials

    NASA Astrophysics Data System (ADS)

    Sridhar, A.; Kouznetsova, V. G.; Geers, M. G. D.

    2018-02-01

    This paper presents a general multiscale framework towards the computation of the emergent effective elastodynamics of heterogeneous materials, to be applied for the analysis of acoustic metamaterials and phononic crystals. The generality of the framework is exemplified by two key characteristics. First, the underlying formalism relies on the Floquet-Bloch theorem to derive a robust definition of scales and scale separation. Second, unlike most homogenization approaches that rely on a classical volume average, a generalized homogenization operator is defined with respect to a family of particular projection functions. This yields a generalized macro-scale continuum, instead of the classical Cauchy continuum. This makes it possible (in a micromorphic sense) to homogenize the rich dispersive behavior resulting from both Bragg scattering and local resonance. For an arbitrary unit cell, the homogenization projection functions are constructed using the Floquet-Bloch eigenvectors obtained in the desired frequency regime at select high symmetry points, which effectively resolves the emergent phenomena dominating that regime. Furthermore, a generalized Hill-Mandel condition is proposed that ensures power consistency between the homogenized and full-scale model. A high-order spatio-temporal gradient expansion is used to localize the multiscale problem, leading to a series of recursive unit cell problems giving the appropriate micro-mechanical corrections. The developed multiscale method is validated against standard numerical Bloch analysis of the dispersion spectra of example unit cells encompassing multiple high-order branches generated by local resonance and/or Bragg scattering.

  17. Optimization of processing parameters for the preparation of phytosterol microemulsions by the solvent displacement method.

    PubMed

    Leong, Wai Fun; Che Man, Yaakob B; Lai, Oi Ming; Long, Kamariah; Misran, Misni; Tan, Chin Ping

    2009-09-23

    The purpose of this study was to optimize the parameters involved in the production of water-soluble phytosterol microemulsions for use in the food industry. In this study, response surface methodology (RSM) was employed to model and optimize four of the processing parameters, namely, the number of cycles of high-pressure homogenization (1-9 cycles), the pressure used for high-pressure homogenization (100-500 bar), the evaporation temperature (30-70 °C), and the concentration ratio of microemulsions (1-5). All responses, namely particle size (PS), polydispersity index (PDI), and percent ethanol residual (%ER), were well fit by a reduced cubic model obtained by multiple regression after manual elimination. The coefficient of determination (R(2)) and absolute average deviation (AAD) values for PS, PDI, and %ER were 0.9628 and 0.5398%, 0.9953 and 0.7077%, and 0.9989 and 1.0457%, respectively. The optimized processing parameters were 4.88 cycles (approximately 5 cycles) of high-pressure homogenization, a homogenization pressure of 400 bar, an evaporation temperature of 44.5 °C, and a concentration ratio of microemulsions of 2.34. The corresponding responses for the optimized preparation condition were a minimal particle size of 328 nm, a minimal polydispersity index of 0.159, and <0.1% ethanol residual. The chi-square test verified the model, whereby the experimental values of PS, PDI, and %ER agreed with the predicted values at the 0.05 level of significance.
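
    For readers unfamiliar with the RSM step, the sketch below fits a small quadratic response surface and locates its optimum numerically. It is deliberately reduced to two hypothetical factors (pressure and number of cycles) with synthetic data; the study itself used four factors and a reduced cubic model fitted in a DOE package.

```python
# Sketch: response-surface fit and numerical optimum search (illustrative
# two-factor quadratic model on synthetic "particle size" data).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
pressure = rng.uniform(100, 500, 30)      # hypothetical design points (bar)
cycles = rng.uniform(1, 9, 30)            # hypothetical number of cycles
# Synthetic response with a minimum near 400 bar and 5 cycles
ps = 300 + 0.002 * (pressure - 400) ** 2 + 8 * (cycles - 5) ** 2 \
     + rng.normal(0, 5, 30)

def design_matrix(p, c):
    return np.column_stack([np.ones_like(p), p, c, p * c, p ** 2, c ** 2])

beta, *_ = np.linalg.lstsq(design_matrix(pressure, cycles), ps, rcond=None)

def predicted_ps(x):
    p, c = x
    return design_matrix(np.array([p]), np.array([c])) @ beta

opt = minimize(lambda x: predicted_ps(x)[0], x0=[300.0, 5.0],
               bounds=[(100, 500), (1, 9)])
print("Fitted optimum: %.0f bar, %.1f cycles" % tuple(opt.x))
```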

  18. Flow perfusion culture of MC3T3-E1 osteogenic cells on gradient calcium polyphosphate scaffolds with different pore sizes.

    PubMed

    Chen, Liang; Song, Wei; Markel, David C; Shi, Tong; Muzik, Otto; Matthew, Howard; Ren, Weiping

    2016-02-01

    Calcium polyphosphate is a biodegradable bone substitute. It remains a challenge to prepare porous calcium polyphosphate with desired gradient porous structures. In this study, a modified one-step gravity sintering method was used to prepare calcium polyphosphate scaffolds with a desired gradient pore-size distribution. The differences in porous structure, mechanical strength, and degradation rate between gradient and homogenous calcium polyphosphate scaffolds were evaluated by micro-computed tomography, scanning electron microscopy, and mechanical testing. Preosteoblastic MC3T3-E1 cells were seeded onto gradient and homogenous calcium polyphosphate scaffolds and cultured in a flow perfusion bioreactor, and the distribution, proliferation, and differentiation of the cells were compared between the two scaffold types. Although no significant difference in cell proliferation was found between the gradient and the homogenous calcium polyphosphate scaffolds, much higher cell differentiation and mineralization were observed in the gradient scaffolds than in the homogenous scaffolds, as manifested by increased alkaline phosphatase activity (p < 0.05). The improved distribution and differentiation of cultured cells within the gradient scaffolds were further supported by both (18)F-fluorine micro-positron emission tomography scanning and in vitro tetracycline labeling. We conclude that the calcium polyphosphate scaffold with gradient pore sizes enhances osteogenic cell differentiation as well as mineralization. The in vivo performance of gradient calcium polyphosphate scaffolds warrants further investigation in animal bone defect models. © The Author(s) 2015.

  19. A rapid and reliable procedure for extraction of cellular polyamines and inorganic ions from plant tissues

    Treesearch

    Rakesh Minocha; Walter C. Shortle; Stephanie L. Long; Subhash C. Minocha

    1994-01-01

    A fast and reliable method for the extraction of cellular polyamines and major inorganic ions (Ca, Mg, Mn, K, and P) from several plant tissues is described. The method involves repeated freezing and thawing of samples instead of homogenization. The efficiency of extraction of both the polyamines and inorganic ions by these two methods was compared for 10 different...

  20. Development and Validation of a HPTLC Method for Simultaneous Estimation of L-Glutamic Acid and γ-Aminobutyric Acid in Mice Brain

    PubMed Central

    Sancheti, J. S.; Shaikh, M. F.; Khatwani, P. F.; Kulkarni, Savita R.; Sathaye, Sadhana

    2013-01-01

    A new robust, simple and economic high-performance thin-layer chromatographic method was developed for the simultaneous estimation of L-glutamic acid and γ-aminobutyric acid in brain homogenate. The high-performance thin-layer chromatographic separation of these amino acids was achieved using n-butanol:glacial acetic acid:water (22:3:5 v/v/v) as the mobile phase and ninhydrin as the derivatising agent. Quantitation was achieved densitometrically at 550 nm over the concentration range of 10-100 ng/spot. The method showed good separation of the amino acids in brain homogenate, with Rf values for L-glutamic acid and γ-aminobutyric acid of 21.67±0.58 and 33.67±0.58, respectively. The limits of detection and quantification were 10 and 20 ng for L-glutamic acid and 4 and 10 ng for γ-aminobutyric acid, respectively. The method was also validated in terms of accuracy, precision and repeatability. The developed method was found to be precise and accurate with good reproducibility and shows promising applicability for studying the pathological status of disease and the therapeutic significance of drug treatment. PMID:24591747
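
    The LOD/LOQ figures quoted above are the kind of quantities obtained from a calibration line; a minimal sketch using the common 3.3σ/slope and 10σ/slope rules is shown below. The peak-area values are invented, and the real work used densitometric software rather than this simple linear fit.

```python
# Sketch: linear calibration with LOD/LOQ estimated from the residual
# standard deviation and the slope (illustrative numbers only).
import numpy as np

amount = np.array([10, 20, 40, 60, 80, 100], dtype=float)      # ng/spot
peak_area = np.array([115, 228, 470, 690, 905, 1120], dtype=float)

slope, intercept = np.polyfit(amount, peak_area, 1)
residuals = peak_area - (slope * amount + intercept)
sigma = residuals.std(ddof=2)            # residual standard deviation

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
r = np.corrcoef(amount, peak_area)[0, 1]
print(f"slope={slope:.2f}, r={r:.4f}, LOD={lod:.1f} ng, LOQ={loq:.1f} ng")
```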

  1. Development and Validation of a HPTLC Method for Simultaneous Estimation of L-Glutamic Acid and γ-Aminobutyric Acid in Mice Brain.

    PubMed

    Sancheti, J S; Shaikh, M F; Khatwani, P F; Kulkarni, Savita R; Sathaye, Sadhana

    2013-11-01

    A new robust, simple and economic high-performance thin-layer chromatographic method was developed for the simultaneous estimation of L-glutamic acid and γ-aminobutyric acid in brain homogenate. The high-performance thin-layer chromatographic separation of these amino acids was achieved using n-butanol:glacial acetic acid:water (22:3:5 v/v/v) as the mobile phase and ninhydrin as the derivatising agent. Quantitation was achieved densitometrically at 550 nm over the concentration range of 10-100 ng/spot. The method showed good separation of the amino acids in brain homogenate, with Rf values for L-glutamic acid and γ-aminobutyric acid of 21.67±0.58 and 33.67±0.58, respectively. The limits of detection and quantification were 10 and 20 ng for L-glutamic acid and 4 and 10 ng for γ-aminobutyric acid, respectively. The method was also validated in terms of accuracy, precision and repeatability. The developed method was found to be precise and accurate with good reproducibility and shows promising applicability for studying the pathological status of disease and the therapeutic significance of drug treatment.

  2. A contrast source method for nonlinear acoustic wave fields in media with spatially inhomogeneous attenuation.

    PubMed

    Demi, L; van Dongen, K W A; Verweij, M D

    2011-03-01

    Experimental data reveals that attenuation is an important phenomenon in medical ultrasound. Attenuation is particularly important for medical applications based on nonlinear acoustics, since higher harmonics experience higher attenuation than the fundamental. Here, a method is presented to accurately solve the wave equation for nonlinear acoustic media with spatially inhomogeneous attenuation. Losses are modeled by a spatially dependent compliance relaxation function, which is included in the Westervelt equation. Introduction of absorption in the form of a causal relaxation function automatically results in the appearance of dispersion. The appearance of inhomogeneities implies the presence of a spatially inhomogeneous contrast source in the presented full-wave method leading to inclusion of forward and backward scattering. The contrast source problem is solved iteratively using a Neumann scheme, similar to the iterative nonlinear contrast source (INCS) method. The presented method is directionally independent and capable of dealing with weakly to moderately nonlinear, large scale, three-dimensional wave fields occurring in diagnostic ultrasound. Convergence of the method has been investigated and results for homogeneous, lossy, linear media show full agreement with the exact results. Moreover, the performance of the method is demonstrated through simulations involving steered and unsteered beams in nonlinear media with spatially homogeneous and inhomogeneous attenuation. © 2011 Acoustical Society of America
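
    The Neumann scheme mentioned here is a successive-approximation iteration for an integral equation of the second kind. The sketch below shows the bare iteration on a generic discretized system u = u_inc + K u; the random operator K merely stands in for the Green's-function-times-contrast term and is not a physical model.

```python
# Sketch: Neumann (successive-approximation) iteration for u = u_inc + K @ u,
# converging when the spectral radius of K is below one.
import numpy as np

rng = np.random.default_rng(2)
n = 200
K = 0.4 * rng.standard_normal((n, n)) / np.sqrt(n)   # contraction-like stand-in operator
u_inc = rng.standard_normal(n)                       # "incident field"

u = u_inc.copy()
for it in range(200):
    u_new = u_inc + K @ u                            # Neumann update
    if np.linalg.norm(u_new - u) < 1e-10 * np.linalg.norm(u_new):
        u = u_new
        break
    u = u_new

u_direct = np.linalg.solve(np.eye(n) - K, u_inc)     # direct solve for comparison
print(f"converged in {it + 1} iterations, error {np.linalg.norm(u - u_direct):.2e}")
```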

  3. Assessment of a hybrid finite element-transfer matrix model for flat structures with homogeneous acoustic treatments.

    PubMed

    Alimonti, Luca; Atalla, Noureddine; Berry, Alain; Sgard, Franck

    2014-05-01

    Modeling complex vibroacoustic systems including poroelastic materials using finite element based methods can be unfeasible for practical applications. For this reason, analytical approaches such as the transfer matrix method are often preferred to obtain a quick estimation of the vibroacoustic parameters. However, the strong assumptions inherent within the transfer matrix method lead to a lack of accuracy in the description of the geometry of the system. As a result, the transfer matrix method is inherently limited to the high frequency range. Nowadays, hybrid substructuring procedures have become quite popular. Indeed, different modeling techniques are typically sought to describe complex vibroacoustic systems over the widest possible frequency range. As a result, the flexibility and accuracy of the finite element method and the efficiency of the transfer matrix method could be coupled in a hybrid technique to obtain a reduction of the computational burden. In this work, a hybrid methodology is proposed. The performances of the method in predicting the vibroacoustic indicators of flat structures with attached homogeneous acoustic treatments are assessed. The results prove that, under certain conditions, the hybrid model allows for a reduction of the computational effort while preserving enough accuracy with respect to the full finite element solution.

  4. The Mantel-Haenszel procedure revisited: models and generalizations.

    PubMed

    Fidler, Vaclav; Nagelkerke, Nico

    2013-01-01

    Several statistical methods have been developed for adjusting the Odds Ratio of the relation between two dichotomous variables X and Y for some confounders Z. With the exception of the Mantel-Haenszel method, commonly used methods, notably binary logistic regression, are not symmetrical in X and Y. The classical Mantel-Haenszel method however only works for confounders with a limited number of discrete strata, which limits its utility, and appears to have no basis in statistical models. Here we revisit the Mantel-Haenszel method and propose an extension to continuous and vector valued Z. The idea is to replace the observed cell entries in strata of the Mantel-Haenszel procedure by subject specific classification probabilities for the four possible values of (X,Y) predicted by a suitable statistical model. For situations where X and Y can be treated symmetrically we propose and explore the multinomial logistic model. Under the homogeneity hypothesis, which states that the odds ratio does not depend on Z, the logarithm of the odds ratio estimator can be expressed as a simple linear combination of three parameters of this model. Methods for testing the homogeneity hypothesis are proposed. The relationship between this method and binary logistic regression is explored. A numerical example using survey data is presented.
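
    For reference, the classical Mantel-Haenszel common odds-ratio estimator that the paper generalizes can be written in a few lines; the stratified 2x2 counts below are invented for illustration.

```python
# Sketch: classical Mantel-Haenszel common odds ratio over K discrete strata
# of a confounder Z (counts are made up for illustration).
import numpy as np

# Each stratum: 2x2 table [[a, b], [c, d]] for (X=1,Y=1), (X=1,Y=0), (X=0,Y=1), (X=0,Y=0)
strata = [
    np.array([[12,  8], [ 5, 15]]),
    np.array([[20, 10], [ 9, 21]]),
    np.array([[ 7,  6], [ 4, 13]]),
]

num = sum(t[0, 0] * t[1, 1] / t.sum() for t in strata)   # sum of a*d / n over strata
den = sum(t[0, 1] * t[1, 0] / t.sum() for t in strata)   # sum of b*c / n over strata
or_mh = num / den
print(f"Mantel-Haenszel common odds ratio: {or_mh:.2f}")
```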

  5. The Mantel-Haenszel Procedure Revisited: Models and Generalizations

    PubMed Central

    Fidler, Vaclav; Nagelkerke, Nico

    2013-01-01

    Several statistical methods have been developed for adjusting the Odds Ratio of the relation between two dichotomous variables X and Y for some confounders Z. With the exception of the Mantel-Haenszel method, commonly used methods, notably binary logistic regression, are not symmetrical in X and Y. The classical Mantel-Haenszel method however only works for confounders with a limited number of discrete strata, which limits its utility, and appears to have no basis in statistical models. Here we revisit the Mantel-Haenszel method and propose an extension to continuous and vector valued Z. The idea is to replace the observed cell entries in strata of the Mantel-Haenszel procedure by subject specific classification probabilities for the four possible values of (X,Y) predicted by a suitable statistical model. For situations where X and Y can be treated symmetrically we propose and explore the multinomial logistic model. Under the homogeneity hypothesis, which states that the odds ratio does not depend on Z, the logarithm of the odds ratio estimator can be expressed as a simple linear combination of three parameters of this model. Methods for testing the homogeneity hypothesis are proposed. The relationship between this method and binary logistic regression is explored. A numerical example using survey data is presented. PMID:23516463

  6. Multilevel Models for Intensive Longitudinal Data with Heterogeneous Autoregressive Errors: The Effect of Misspecification and Correction with Cholesky Transformation

    PubMed Central

    Jahng, Seungmin; Wood, Phillip K.

    2017-01-01

    Intensive longitudinal studies, such as ecological momentary assessment studies using electronic diaries, are gaining popularity across many areas of psychology. Multilevel models (MLMs) are the most widely used analytical tools for intensive longitudinal data (ILD). Although ILD often have individually distinct patterns of serial correlation of measures over time, inferences about the fixed effects and random components in MLMs are made under the assumption that all variance and autocovariance components are homogenous across individuals. In the present study, we introduced a multilevel model with Cholesky transformation to model ILD with individually heterogeneous covariance structure. In addition, the performance of the transformation method and the effects of misspecification of heterogeneous covariance structure were investigated through a Monte Carlo simulation. We found that, if individually heterogeneous covariances are incorrectly assumed to be homogenous independent or homogenous autoregressive, MLMs produce highly biased estimates of the variance of the random intercepts and the standard errors of the fixed intercept and the fixed effect of a level 2 covariate when the average autocorrelation is high. For intensive longitudinal data with individual-specific residual covariance, the suggested transformation method showed lower bias in those estimates than the misspecified models when the number of repeated observations within individuals is 50 or more. PMID:28286490
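
    The Cholesky-transformation idea can be illustrated outside the full multilevel model: build a subject-specific AR(1) residual covariance, factor it, and pre-multiply that subject's series by the inverse factor so the transformed residuals are approximately uncorrelated. The parameter values below are arbitrary.

```python
# Sketch: whitening an AR(1) series with the Cholesky factor of its covariance.
import numpy as np

def ar1_covariance(n, rho, sigma2):
    idx = np.arange(n)
    return sigma2 * rho ** np.abs(idx[:, None] - idx[None, :])

rng = np.random.default_rng(7)
n_obs = 50
rho_i, sigma2_i = 0.7, 1.5                    # subject-specific AR(1) parameters

cov = ar1_covariance(n_obs, rho_i, sigma2_i)
y = rng.multivariate_normal(np.zeros(n_obs), cov)

L = np.linalg.cholesky(cov)
y_white = np.linalg.solve(L, y)               # transformed (whitened) series

print("lag-1 autocorr before:", np.corrcoef(y[:-1], y[1:])[0, 1].round(2))
print("lag-1 autocorr after: ", np.corrcoef(y_white[:-1], y_white[1:])[0, 1].round(2))
```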

  7. Quantitative Evaluation of E1 Endoglucanase Recovery from Tobacco Leaves Using the Vacuum Infiltration-Centrifugation Method

    PubMed Central

    Kingsbury, Nathaniel J.; McDonald, Karen A.

    2014-01-01

    As a production platform for recombinant proteins, plant leaf tissue has many advantages, but commercialization of this technology has been hindered by high recovery and purification costs. Vacuum infiltration-centrifugation (VI-C) is a technique to obtain extracellularly-targeted products from the apoplast wash fluid (AWF). Because of its selective recovery of secreted proteins without homogenizing the whole tissue, VI-C can potentially reduce downstream production costs. Lab scale experiments were conducted to quantitatively evaluate the VI-C method and compared to homogenization techniques in terms of product purity, concentration, and other desirable characteristics. From agroinfiltrated Nicotiana benthamiana leaves, up to 81% of a truncated version of E1 endoglucanase from Acidothermus cellulolyticus was recovered with VI-C versus homogenate extraction, and average purity and concentration increases of 4.2-fold and 3.1-fold, respectively, were observed. Formulas were developed to predict recovery yields of secreted protein obtained by performing multiple rounds of VI-C on the same leaf tissue. From this, it was determined that three rounds of VI-C recovered 97% of the total active recombinant protein accessible to the VI-C procedure. The results suggest that AWF recovery is an efficient process that could reduce downstream processing steps and costs for plant-made recombinant proteins. PMID:24971334
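
    The multi-round recovery formulas are not reproduced in this abstract; the sketch below uses the simplest plausible assumption, a fixed per-round recovery fraction p of the still-accessible protein, with p chosen so that three rounds give roughly the reported 97%. Both the model and the value of p are assumptions for illustration only.

```python
# Sketch: cumulative recovery over repeated VI-C rounds, assuming each round
# recovers a fixed fraction p of the protein still accessible in the apoplast.
def cumulative_recovery(p, rounds):
    remaining = 1.0
    total = 0.0
    for _ in range(rounds):
        recovered = p * remaining
        total += recovered
        remaining -= recovered
    return total

p = 0.68     # hypothetical per-round recovery fraction
for k in (1, 2, 3):
    print(f"after {k} round(s): {100 * cumulative_recovery(p, k):.1f} % of accessible protein")
```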

  8. The local density of optical states of a metasurface

    NASA Astrophysics Data System (ADS)

    Lunnemann, Per; Koenderink, A. Femius

    2016-02-01

    While metamaterials are often desirable for near-field functions, such as perfect lensing or cloaking, they are often quantified by their response to plane waves from the far field. Here, we present a theoretical analysis of the local density of states near lattices of discrete magnetic scatterers, i.e., the response to near-field excitation by a point source. Based on a point-dipole theory using Ewald summation and an array scanning method, we can swiftly and semi-analytically evaluate the local density of states (LDOS) for magnetoelectric point sources in front of an infinite two-dimensional (2D) lattice composed of arbitrary magnetoelectric dipole scatterers. The method takes into account radiation damping as well as all retarded electrodynamic interactions in a self-consistent manner. We show that a lattice of magnetic scatterers evidences characteristic Drexhage oscillations. However, the oscillations are phase shifted relative to those of an electrically scattering lattice, consistent with the difference expected for reflection off homogeneous magnetic and electric mirrors, respectively. Furthermore, we identify in which source-surface separation regimes the metasurface may be treated as a homogeneous interface, and in which homogenization fails. A strong frequency and in-plane position dependence of the LDOS close to the lattice reveals coupling to guided modes supported by the lattice.

  9. Modified Homogeneous Data Set of Coronal Intensities

    NASA Astrophysics Data System (ADS)

    Dorotovič, I.; Minarovjech, M.; Lorenc, M.; Rybanský, M.

    2014-07-01

    The Astronomical Institute of the Slovak Academy of Sciences has published the intensities, recalibrated with respect to a common intensity scale, of the 530.3 nm (Fe xiv) green coronal line observed at ground-based stations up to the year 2008. The name of this publication is the Homogeneous Data Set (HDS). We have developed a method that allows the ground-based observations to be successfully replaced by satellite observations and, thus, the publication of the HDS to be continued. For this purpose, the observations of the Extreme-ultraviolet Imaging Telescope (EIT), onboard the Solar and Heliospheric Observatory (SOHO) satellite, were exploited. Among other data, the EIT instrument provides almost daily 28.4 nm (Fe xv) emission-line snapshots of the corona. The Fe xiv and Fe xv data (4051 observation days) taken in the period 1996 - 2008 have been compared and good agreement was found. The method to obtain the individual data for the HDS follows from the correlation analysis described in this article. The resulting data, now under the name of the Modified Homogeneous Data Set (MHDS), are identical up to 1996 to those in the HDS. The MHDS can be used further for studies of the coronal solar activity and its cycle. These data are available at http://www.suh.sk.
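
    The substitution step amounts to a correlation/regression calibration between the satellite Fe XV index and the ground-based Fe XIV scale; a toy version with synthetic data is sketched below (the published MHDS used 4051 matched observation days and its own calibration procedure).

```python
# Sketch: regression calibration placing satellite Fe XV intensities on the
# scale of ground-based Fe XIV values (synthetic data for illustration).
import numpy as np

rng = np.random.default_rng(3)
fe15 = rng.uniform(50, 400, 500)                  # "satellite" 28.4 nm indices
fe14 = 1.8 * fe15 + 20 + rng.normal(0, 15, 500)   # "ground-based" 530.3 nm values

slope, intercept = np.polyfit(fe15, fe14, 1)
r = np.corrcoef(fe15, fe14)[0, 1]
print(f"r = {r:.3f}; calibration: HDS ~ {slope:.2f} * Fe XV + {intercept:.1f}")

# Once calibrated, new satellite measurements can extend the series:
new_fe15 = np.array([120.0, 250.0])
print("estimated MHDS values:", slope * new_fe15 + intercept)
```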

  10. Homogeneity testing and quantitative analysis of manganese (Mn) in vitrified Mn-doped glasses by laser-induced breakdown spectroscopy (LIBS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unnikrishnan, V. K.; Nayak, Rajesh; Kartha, V. B.

    2014-09-15

    Laser-induced breakdown spectroscopy (LIBS), an atomic emission spectroscopy method, has rapidly grown into one of the best elemental analysis techniques over the past two decades. Homogeneity testing and quantitative analysis of manganese (Mn) in manganese-doped glasses have been carried out using an optimized LIBS system employing a nanosecond ultraviolet Nd:YAG laser as the source of excitation. The glass samples were prepared using conventional vitrification methods. The laser pulse irradiance on the surface of the glass samples, placed in air at atmospheric pressure, was about 1.7×10⁹ W/cm². The spatially integrated plasma emission was collected and imaged onto the spectrograph slit using an optical-fiber-based collection system. Homogeneity was checked by recording LIBS spectra from different sites on the sample surface and analyzing the elemental emission intensities for concentration determination. Validation of the observed LIBS results was done by comparison with scanning electron microscopy-energy dispersive X-ray spectroscopy (SEM-EDX) surface elemental mapping. The analytical performance of the LIBS system has been evaluated through the correlation of the LIBS-determined concentrations of Mn with the certified values. The results are found to be in very good agreement with the certified concentrations.
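
    A minimal homogeneity check of the kind described, the relative standard deviation of the Mn line intensity across sampling sites, might look like the sketch below; the intensities are invented, and the acceptance threshold is a generic rule of thumb rather than a value from the paper.

```python
# Sketch: site-to-site relative standard deviation (RSD) of an emission-line
# intensity as a simple homogeneity indicator (intensities are illustrative).
import numpy as np

site_intensities = np.array([10450, 10620, 10380, 10510, 10570, 10440], dtype=float)

rsd = 100.0 * site_intensities.std(ddof=1) / site_intensities.mean()
print(f"site-to-site RSD = {rsd:.2f} %")
# A small RSD (e.g. below a few percent) is commonly taken as evidence that
# the dopant is homogeneously distributed at the sampled length scale.
```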

  11. Regional Homogeneity

    PubMed Central

    Jiang, Lili; Zuo, Xi-Nian

    2015-01-01

    Much effort has been made to understand the organizational principles of human brain function using functional magnetic resonance imaging (fMRI) methods, among which resting-state fMRI (rfMRI) is an increasingly recognized technique for measuring the intrinsic dynamics of the human brain. Functional connectivity (FC) with rfMRI is the most widely used method to describe remote or long-distance relationships in studies of cerebral cortex parcellation, interindividual variability, and brain disorders. In contrast, local or short-distance functional interactions, especially at a scale of millimeters, have rarely been investigated or systematically reviewed like remote FC, although some local FC algorithms have been developed and applied to the discovery of brain-based changes under neuropsychiatric conditions. To fill this gap between remote and local FC studies, this review will (1) briefly survey the history of studies on organizational principles of human brain function; (2) propose local functional homogeneity as a network centrality to characterize multimodal local features of the brain connectome; (3) render a neurobiological perspective on local functional homogeneity by linking its temporal, spatial, and individual variability to information processing, anatomical morphology, and brain development; and (4) discuss its role in performing connectome-wide association studies and identify relevant challenges, and recommend its use in future brain connectomics studies. PMID:26170004
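
    Local functional homogeneity is most often computed as Kendall's coefficient of concordance (KCC) over a voxel's neighborhood, the ReHo measure; a small sketch on synthetic time series is given below. The neighborhood size and data are illustrative only.

```python
# Sketch: ReHo-style local functional homogeneity as Kendall's coefficient of
# concordance (KCC) over a voxel neighborhood (synthetic data, tie correction omitted).
import numpy as np
from scipy.stats import rankdata

def kendalls_w(time_series):
    """time_series: (k voxels, n time points) -> KCC in [0, 1]."""
    k, n = time_series.shape
    ranks = np.vstack([rankdata(ts) for ts in time_series])  # rank each voxel over time
    r_sum = ranks.sum(axis=0)                                # rank sums per time point
    s = ((r_sum - r_sum.mean()) ** 2).sum()
    return 12.0 * s / (k ** 2 * (n ** 3 - n))

rng = np.random.default_rng(4)
shared = rng.standard_normal(200)                       # common local fluctuation
neighborhood = shared + 0.5 * rng.standard_normal((27, 200))
print(f"KCC (ReHo) of the synthetic 27-voxel neighborhood: {kendalls_w(neighborhood):.3f}")
```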

  12. A Review of the Scattering-Parameter Extraction Method with Clarification of Ambiguity Issues in Relation to Metamaterial Homogenization

    NASA Astrophysics Data System (ADS)

    Arslanagic, S.; Hansen, T. V.; Mortensen, N. A.; Gregersen, A. H.; Sigmund, O.; Ziolkowski, R. W.; Breinbjerg, O.

    2013-04-01

    The scattering parameter extraction method of metamaterial homogenization is reviewed to show that the only ambiguity is the one related to the choice of the branch of the complex logarithmic function (or the complex inverse cosine function), whereas it has no ambiguity for the sign of the wave number and intrinsic impedance. While the method indeed yields two signs of the intrinsic impedance, and thus the wave number, the signs are dependent, and moreover, both sign combinations lead to the same permittivity and permeability, and are thus permissible. This observation is in distinct contrast to a number of statements in the literature where the correct sign of the intrinsic impedance and wave number, resulting from the scattering parameter method, is chosen by imposing additional physical requirements such as passivity. The scattering parameter method is reviewed through an investigation of a uniform plane wave normally incident on a planar slab in free-space, and the severity of the branch ambiguity is illustrated through simulations of a known metamaterial realization. Several approaches for proper branch selection are reviewed and their suitability to metamaterial samples is discussed.
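
    A compact numerical illustration of the retrieval under discussion is sketched below: forward Airy formulas generate S11/S21 for a homogeneous slab at normal incidence, and the inversion exposes the logarithmic branch index m as the one genuine ambiguity. The material parameters, slab thickness and wavelength are arbitrary test values, and the Re(z) >= 0 root choice is shown only as a common convention.

```python
# Sketch: S-parameter retrieval for a homogeneous slab at normal incidence,
# showing the branch index m of the complex logarithm (test values only).
import numpy as np

def slab_s_params(eps, mu, d, k0):
    n, z = np.sqrt(eps * mu), np.sqrt(mu / eps)
    r = (z - 1) / (z + 1)
    t = np.exp(1j * n * k0 * d)
    s11 = r * (1 - t**2) / (1 - r**2 * t**2)
    s21 = (1 - r**2) * t / (1 - r**2 * t**2)
    return s11, s21

def retrieve(s11, s21, d, k0, m=0):
    z = np.sqrt(((1 + s11)**2 - s21**2) / ((1 - s11)**2 - s21**2))
    if z.real < 0:                      # conventional sign choice (Re z >= 0)
        z = -z
    t = s21 / (1 - s11 * (z - 1) / (z + 1))
    n = -1j * (np.log(t) + 2j * np.pi * m) / (k0 * d)
    return n / z, n * z                 # permittivity, permeability

eps_true, mu_true = 4.0 + 0.1j, 1.0 + 0.0j
d, k0 = 0.005, 2 * np.pi / 0.06         # 5 mm slab, 60 mm free-space wavelength
s11, s21 = slab_s_params(eps_true, mu_true, d, k0)
for m in (-1, 0, 1):
    eps, mu = retrieve(s11, s21, d, k0, m)
    print(f"branch m={m:+d}: eps={eps:.3f}, mu={mu:.3f}")
```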

  13. Analysis of the methods for assessing socio-economic development level of urban areas

    NASA Astrophysics Data System (ADS)

    Popova, Olga; Bogacheva, Elena

    2017-01-01

    The present paper provides a targeted analysis of current approaches (ratings) for assessing the socio-economic development of urban areas. The survey focuses on identifying standardized methodologies for constructing area-assessment techniques that can support systems for intelligent monitoring, dispatching, building management, scheduling and effective management of an administrative-territorial unit. Such a system is characterized by a complex hierarchical structure, including tangible and intangible properties (parameters, attributes). Investigating the abovementioned methods should increase an administrative-territorial unit's attractiveness for investors and residents. The research aims at studying methods for evaluating the socio-economic development level of territories of the Russian Federation. Experimental and theoretical territory-estimation methods were reviewed. A complex analysis of the characteristics of the areas was carried out and evaluation parameters were determined. Integral indicators (resulting rating criteria values) as well as the overall rankings (parameters, characteristics) were analyzed. An inventory of the most widely used partial indicators (parameters, characteristics) of urban areas was compiled. The homogeneity of the resulting rating criteria values was verified and confirmed by determining the root-mean-square deviation, i.e. the divergence of the indices. The principal shortcomings of the assessment methodologies were revealed. Assessment methods with enhanced effectiveness and homogeneity were proposed.

  14. Nodal Diffusion Burnable Poison Treatment for Prismatic Reactor Cores

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. M. Ougouag; R. M. Ferrer

    2010-10-01

    The prismatic block version of the High Temperature Reactor (HTR), considered as a candidate Very High Temperature Reactor (VHTR) design, may use burnable poison pins located at some corners of the fuel blocks (i.e., assembly-equivalent structures). The presence of any highly absorbing materials, such as these burnable poisons, within fuel blocks of hexagonal-geometry, graphite-moderated High Temperature Reactors (HTRs) causes a local inter-block flux depression that most nodal diffusion-based methods have failed to properly model or otherwise represent. The location of these burnable poisons near vertices results in an asymmetry in the morphology of the assemblies (or blocks), hence the inadequacy of traditional homogenization methods, as these "spread" the essentially local effect of the burnable poisons throughout the assembly. Furthermore, the actual effect of the burnable poison is primarily local, with influence in its immediate vicinity, which happens to include a small region within the same assembly as well as similar regions in the adjacent assemblies. Traditional homogenization methods miss this artifact entirely. This paper presents a novel method for treating the local effect of the burnable poison explicitly in the context of a modern nodal method.

  15. Methods for Purifying Enzymes for Mycoremediation

    NASA Technical Reports Server (NTRS)

    Cullings, Kenneth W. (Inventor); DeSimone, Julia C. (Inventor); Paavola, Chad D. (Inventor)

    2014-01-01

    A process for purifying laccase from an ectomycorrhizal fruiting body is disclosed. The process includes steps of homogenization, sonication, centrifugation, filtration, affinity chromatography, ion exchange chromatography, and gel filtration. Purified laccase can also be separated into isomers.

  16. Evaluation of Time-Temperature Integrators (TTIs) with Microorganism-Entrapped Microbeads Produced Using Homogenization and SPG Membrane Emulsification Techniques.

    PubMed

    Rahman, A T M Mijanur; Lee, Seung Ju; Jung, Seung Won

    2015-12-28

    A comparative study was conducted to evaluate precision and accuracy in controlling the temperature dependence of encapsulated microbial time-temperature integrators (TTIs) developed using two different emulsification techniques. Weissella cibaria CIFP 009 cells, immobilized within 2% Na-alginate gel microbeads using homogenization (5,000, 7,000, and 10,000 rpm) and Shirasu porous glass (SPG) membrane technologies (10 μm), were applied to microbial TTIs. The prepared microbeads were characterized with respect to their size, size distribution, shape and morphology, entrapment efficiency, and bead production yield. Additionally, fermentation process parameters including growth rate were investigated. The TTI responses (changes in pH and titratable acidity (TA)) were evaluated as a function of temperature (20°C, 25°C, and 30°C). In comparison with the conventional method, SPG membrane technology not only produced highly uniform, small-sized beads with the narrowest size distribution, but also gave a bead production yield nearly 3.0 to 4.5 times higher. However, among the TTIs produced using the homogenization technique, poor linearity (R(2)) in terms of TA was observed for the 5,000 and 7,000 rpm treatments. Consequently, microbeads produced by the SPG membrane and by homogenization at 10,000 rpm were selected for adjusting the temperature dependence. The Ea values of TTIs containing 0.5, 1.0, and 1.5 g microbeads, prepared by the SPG membrane and conventional methods, were estimated to be 86.0, 83.5, and 76.6 kJ/mol, and 85.5, 73.5, and 62.2 kJ/mol, respectively. Therefore, microbial TTIs developed using SPG membrane technology are much more efficient in controlling temperature dependence.
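
    The Ea values quoted above come from Arrhenius analysis of rates at the three study temperatures; a minimal sketch of that fit is shown below with invented rate constants (the resulting Ea lands in the reported range only because the toy numbers were chosen that way).

```python
# Sketch: Arrhenius estimation of activation energy from rates at three
# temperatures (rate values are illustrative, not from the study).
import numpy as np

R = 8.314                                   # J/(mol K)
temps_c = np.array([20.0, 25.0, 30.0])
rates = np.array([0.012, 0.021, 0.036])     # e.g. titratable-acidity change per hour

inv_T = 1.0 / (temps_c + 273.15)
slope, intercept = np.polyfit(inv_T, np.log(rates), 1)   # ln k = ln A - Ea/(R T)
ea_kj = -slope * R / 1000.0
print(f"Ea ~ {ea_kj:.1f} kJ/mol")
```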

  17. Design optimization of space structures

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos

    1991-01-01

    The topology-shape-size optimization of space structures is investigated through Kikuchi's homogenization method. The method starts from a 'design domain block,' which is a region of space into which the structure is to materialize. This domain is initially filled with a finite element mesh, typically regular. Force and displacement boundary conditions corresponding to applied loads and supports are applied at specific points in the domain. An optimal structure is to be 'carved out' of the design under two conditions: (1) a cost function is to be minimized, and (2) equality or inequality constraints are to be satisfied. The 'carving' process is accomplished by letting microstructure holes develop and grow in elements during the optimization process. These holes have a rectangular shape in two dimensions and a cubical shape in three dimensions, and may also rotate with respect to the reference axes. The properties of the perforated element are obtained through an homogenization procedure. Once a hole reaches the volume of the element, that element effectively disappears. The project has two phases. In the first phase the method was implemented as the combination of two computer programs: a finite element module, and an optimization driver. In the second part, focus is on the application of this technique to planetary structures. The finite element part of the method was programmed for the two-dimensional case using four-node quadrilateral elements to cover the design domain. An element homogenization technique different from that of Kikuchi and coworkers was implemented. The optimization driver is based on an augmented Lagrangian optimizer, with the volume constraint treated as a Courant penalty function. The optimizer has to be especially tuned to this type of optimization because the number of design variables can reach into the thousands. The driver is presently under development.

  18. Simple Reversed-Phase HPLC Method with Spectrophotometric Detection for Measuring Acetaminophen-Protein Adducts in Rat Liver Samples

    PubMed Central

    Acharya, Miteshkumar; Lau-Cam, Cesar A.

    2012-01-01

    A simple reversed-phase HPLC method for measuring hepatic levels of acetaminophen- (APAP-) protein adducts following an overdose of APAP was developed. An aliquot of liver homogenate in phosphate-buffered saline pH 7.4 (PBS) was placed on a Nanosep centrifugal device, which was centrifuged to obtain a protein residue. This residue was incubated with a solution of p-aminobenzoic acid (PABA), the internal standard, and bacterial protease in PBS, transferred to a Nanosep centrifugal device, and centrifuged. A 100 μL portion of the filtrate was analyzed on a YMC-Pack ODS-AMQ C18 column, using 100 mM potassium dihydrogen phosphate-methanol-acetic acid (100 : 0.6 : 0.1) as the mobile phase, a flow rate of 1 mL/min, and photometric detection at 254 nm. PABA and APAP-cysteine-S-yl (APAP-Cys) eluted at ~14.7 min and 22.7 min, respectively. Method linearity, based on on-column concentrations of APAP-Cys, was observed over the range 0.078–40 μg. Recoveries of APAP-Cys from spiked blank liver homogenates ranged from ~83% to 91%. Limits of detection and of quantification of APAP-Cys, based on on-column concentrations, were 0.06 μg and 0.14 μg, respectively. RSD values for interday and intraday analyses of a blank liver homogenate spiked with APAP-Cys at three levels were, in all cases, ≤1.0% and <1.5%, respectively. The proposed method was found appropriate for comparing the antidotal properties of N-acetylcysteine and taurine in a rat model of APAP poisoning. PMID:22619591

  19. Pancreas Oxygen Persufflation Increases ATP Levels as Shown by Nuclear Magnetic Resonance

    PubMed Central

    Scott, W.E.; Weegman, B.P.; Ferrer-Fabrega, J.; Stein, S.A.; Anazawa, T.; Kirchner, V.A.; Rizzari, M.D.; Stone, J.; Matsumoto, S.; Hammer, B.E.; Balamurugan, A.N.; Kidder, L.S.; Suszynski, T.M.; Avgoustiniatos, E.S.; Stone, S.G.; Tempelman, L.A.; Sutherland, D.E.R.; Hering, B.J.; Papas, K.K.

    2010-01-01

    Background: Islet transplantation is a promising treatment for type 1 diabetes. Due to the shortage of suitable human pancreata, the high cost, and the large dose of islets presently required for long-term diabetes reversal, it is important to maximize viable islet yield. Traditional methods of pancreas preservation have been identified as suboptimal due to insufficient oxygenation. Enhanced oxygen delivery is a key area of improvement. In this paper, we explored improved oxygen delivery by persufflation (PSF), i.e., vascular gas perfusion. Methods: Human pancreata were obtained from brain-dead donors. Porcine pancreata were procured by en bloc viscerectomy from heparinized donation-after-cardiac-death donors and were preserved by either the two-layer method (TLM) or PSF. Following procurement, organs were transported to a 1.5-T magnetic resonance (MR) system for 31P nuclear magnetic resonance spectroscopy to investigate their bioenergetic status by measuring the ratio of adenosine triphosphate to inorganic phosphate (ATP:Pi) and for assessing PSF homogeneity by MRI. Results: Human and porcine pancreata can be effectively preserved by PSF. MRI showed that pancreatic tissue was homogeneously filled with gas. TLM can effectively raise ATP:Pi levels in rat pancreata but not in larger porcine pancreata; ATP:Pi levels were almost undetectable in porcine organs preserved with TLM. When human or porcine organs were preserved by PSF, ATP:Pi was elevated to levels similar to those observed in rat pancreata. Conclusion: The methods developed for human and porcine pancreas PSF homogeneously deliver oxygen throughout the organ. This elevates ATP levels during preservation and may improve islet isolation outcomes while enabling the use of marginal donors, thus expanding the usable donor pool. PMID:20692395

  20. Estimation of design floods in ungauged catchments using a regional index flood method. A case study of Lake Victoria Basin in Kenya

    NASA Astrophysics Data System (ADS)

    Nobert, Joel; Mugo, Margaret; Gadain, Hussein

    Reliable estimation of flood magnitudes corresponding to required return periods, vital for structural design purposes, is hampered by the lack of hydrological data in the study area of the Lake Victoria Basin in Kenya. Use of regional information, derived from data at gauged sites and regionalized for use at any location within a homogenous region, would improve the reliability of design flood estimation. Therefore, the regional index flood method has been applied. Based on data from 14 gauged sites, a delineation of the basin into two homogenous regions was achieved using elevation variation (90-m DEM), the spatial annual rainfall pattern and Principal Component Analysis of seasonal rainfall patterns (from 94 rainfall stations). At-site annual maximum series were modelled using the Log-Normal (LN) (3P), Log-Logistic (LLG), Generalized Extreme Value (GEV) and Log Pearson Type 3 (LP3) distributions. The parameters of the distributions were estimated using the method of probability weighted moments. Goodness-of-fit tests were applied and the GEV was identified as the most appropriate model for each site. Based on the GEV model, flood quantiles were estimated and regional frequency curves were derived from the averaged at-site growth curves. Using the least squares regression method, relationships were developed between the index flood, defined as the Mean Annual Flood (MAF), and catchment characteristics. The relationships indicated that area, mean annual rainfall and altitude were the three significant variables that most influence the index flood. Thereafter, flood magnitudes in ungauged catchments within a homogenous region were estimated from the derived equations for the index flood and the quantiles from the regional curves. These estimates will improve flood risk estimation and support water management and engineering decisions and actions.
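
    The index-flood mechanics can be sketched compactly: fit a GEV to an at-site annual-maximum series, scale the quantiles by the mean annual flood to obtain growth factors, and combine them with a regression-based MAF at the ungauged site. The series below is synthetic, and scipy's maximum-likelihood fit is used in place of the probability-weighted-moments estimation applied in the study.

```python
# Sketch: GEV fit, quantiles and dimensionless growth factors for the
# index-flood approach (synthetic annual-maximum series).
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)
ams = genextreme.rvs(c=-0.1, loc=100.0, scale=30.0, size=40, random_state=rng)

shape, loc, scale = genextreme.fit(ams)
maf = ams.mean()                                  # index flood (mean annual flood)

return_periods = np.array([10, 50, 100])
quantiles = genextreme.ppf(1 - 1.0 / return_periods, shape, loc=loc, scale=scale)
growth_factors = quantiles / maf                  # dimensionless regional growth curve

for T, q, g in zip(return_periods, quantiles, growth_factors):
    print(f"T={T:>3d} yr: Q={q:7.1f}, growth factor={g:.2f}")

# For an ungauged site, the design flood would be estimated as
#   Q_T = MAF_regression(area, rainfall, altitude) * growth_factor(T)
```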

  1. Comparison of experimentally and theoretically determined radiation characteristics of photosynthetic microorganisms

    NASA Astrophysics Data System (ADS)

    Kandilian, Razmig; Pruvost, Jérémy; Artu, Arnaud; Lemasson, Camille; Legrand, Jack; Pilon, Laurent

    2016-05-01

    This paper aims to experimentally and directly validate a recent theoretical method for predicting the radiation characteristics of photosynthetic microorganisms. Such predictions would facilitate light transfer analysis in photobioreactors (PBRs) to control their operation and to maximize their production of biofuel and other high-value products. The state-of-the-art experimental method can be applied to microorganisms of any shape and inherently accounts for their non-spherical and heterogeneous nature. On the other hand, the theoretical method treats the microorganisms as polydisperse homogeneous spheres with some effective optical properties. The absorption index is expressed as the weighted sum of the pigment mass absorption cross-sections and the refractive index is estimated based on the subtractive Kramers-Kronig relationship given an anchor refractive index and wavelength. Here, particular attention was paid to the green microalga Chlamydomonas reinhardtii grown under nitrogen-replete and nitrogen-limited conditions and to Chlorella vulgaris grown under nitrogen-replete conditions. First, relatively good agreement was found between the two methods for determining the mass absorption and scattering cross-sections and the asymmetry factor of both nitrogen-replete and nitrogen-limited C. reinhardtii with the proper anchor point. However, the homogeneous sphere approximation significantly overestimated the absorption cross-section of C. vulgaris cells. The latter were instead modeled as polydisperse coated spheres consisting of an absorbing core containing pigments and a non-absorbing but strongly refracting wall made of sporopollenin. The coated sphere approximation gave good predictions of the experimentally measured integral radiation characteristics of C. vulgaris. In both cases, the homogeneous and coated sphere approximations predicted resonances in the scattering phase function that were not observed experimentally. However, these approximations were sufficiently accurate to predict the fluence rate and local rate of photon absorption in PBRs.

  2. Upscaling: Effective Medium Theory, Numerical Methods and the Fractal Dream

    NASA Astrophysics Data System (ADS)

    Guéguen, Y.; Ravalec, M. Le; Ricard, L.

    2006-06-01

    Upscaling is a major issue regarding mechanical and transport properties of rocks. This paper examines three issues relative to upscaling. The first one is a brief overview of Effective Medium Theory (EMT), which is a key tool to predict average rock properties at a macroscopic scale in the case of a statistically homogeneous medium. EMT is of particular interest in the calculation of elastic properties. As discussed in this paper, EMT can thus provide a possible way to perform upscaling, although it is by no means the only one, and in particular it is irrelevant if the medium does not adhere to statistical homogeneity. This last circumstance is examined in part two of the paper. We focus on the example of constructing a hydrocarbon reservoir model. Such a construction is a required step in the process of making reasonable predictions for oil production. Taking into account rock permeability, lithological units and various structural discontinuities at different scales is part of this construction. The result is that stochastic reservoir models are built that rely on various numerical upscaling methods. These methods are reviewed. They provide techniques which make it possible to deal with upscaling on a general basis. Finally, a last case in which upscaling is trivial is considered in the third part of the paper. This is the fractal case. Fractal models have become popular precisely because they are free of the assumption of statistical homogeneity and yet do not involve numerical methods. It is suggested that using a physical criterion as a means to discriminate whether fractality is a dream or reality would be more satisfactory than relying on a limited data set alone.
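
    As a concrete taste of what EMT-style averaging delivers for a statistically homogeneous two-phase rock, the sketch below computes Voigt/Reuss/Hill averages and Hashin-Shtrikman bounds on the bulk modulus. The moduli and volume fractions are generic quartz/clay-like values chosen for illustration, not data from the paper.

```python
# Sketch: Voigt/Reuss/Hill averages and Hashin-Shtrikman bounds on the bulk
# modulus of a two-phase mixture (illustrative moduli in GPa).
import numpy as np

def voigt_reuss_hill(f, M):
    """f: volume fractions, M: moduli of each phase."""
    f, M = np.asarray(f, float), np.asarray(M, float)
    voigt = np.sum(f * M)
    reuss = 1.0 / np.sum(f / M)
    return voigt, reuss, 0.5 * (voigt + reuss)

def hs_bulk_bound(K1, G1, K2, f1):
    """Hashin-Shtrikman bound on K, with phase 1 taken as the reference phase."""
    f2 = 1.0 - f1
    return K1 + f2 / (1.0 / (K2 - K1) + f1 / (K1 + 4.0 * G1 / 3.0))

f = [0.7, 0.3]                 # quartz-like and clay-like volume fractions
K = [37.0, 21.0]               # bulk moduli (GPa)
G = [44.0, 7.0]                # shear moduli (GPa)

print("Voigt / Reuss / Hill K:", voigt_reuss_hill(f, K))
print("HS upper bound on K:", hs_bulk_bound(K[0], G[0], K[1], f[0]))  # stiff phase as reference
print("HS lower bound on K:", hs_bulk_bound(K[1], G[1], K[0], f[1]))  # soft phase as reference
```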

  3. The BUME method: a new rapid and simple chloroform-free method for total lipid extraction of animal tissue

    NASA Astrophysics Data System (ADS)

    Löfgren, Lars; Forsberg, Gun-Britt; Ståhlman, Marcus

    2016-06-01

    In this study we present a simple and rapid method for tissue lipid extraction. Snap-frozen tissue (15-150 mg) is collected in 2 ml homogenization tubes. 500 μl of BUME mixture (butanol:methanol [3:1]) is added, and automated homogenization of up to 24 frozen samples at a time is performed in less than 60 seconds, followed by a 5-minute single-phase extraction. After the addition of 500 μl heptane:ethyl acetate (3:1) and 500 μl 1% acetic acid, a 5-minute two-phase extraction is performed. Lipids are recovered from the upper phase by automated liquid handling using a standard 96-tip robot. A second two-phase extraction is performed using 500 μl heptane:ethyl acetate (3:1). Validation of the method showed that the extraction recoveries for the investigated lipids, which included sterols, glycerolipids, glycerophospholipids and sphingolipids, were similar to or better than those of the Folch method. We also applied the method to lipid extraction of liver and heart and compared the lipid species profiles with profiles generated after Folch and MTBE extraction. We conclude that the BUME method is superior to the Folch method in terms of simplicity, throughput, automation, solvent consumption, economy, health and environment, while delivering lipid recoveries fully comparable to or better than those of the Folch method.

  4. Mixedness determination of rare earth-doped ceramics

    NASA Astrophysics Data System (ADS)

    Czerepinski, Jennifer H.

    The lack of chemical uniformity in a powder mixture, such as clustering of a minor component, can lead to deterioration of materials properties. A method to determine powder mixture quality is to correlate the chemical homogeneity of a multi-component mixture with its particle size distribution and mixing method. This is applicable to rare earth-doped ceramics, which require at least 1-2 nm dopant ion spacing to optimize optical properties. Mixedness simulations were conducted for random heterogeneous Nd-doped LaF3 mixtures using the Concentric Shell Model of Mixedness (CSMM). Results indicate that when the host-to-dopant particle size ratio is 100, multi-scale concentration variance is optimized. In order to verify results from the model, experimental methods that probe a mixture at the micro, meso, and macro scales are needed. To directly compare CSMM results experimentally, an image processing method was developed to calculate variance profiles from electron images. An in-lens (IL) secondary electron image is subtracted from the corresponding Everhart-Thornley (ET) secondary electron image in a Field-Emission Scanning Electron Microscope (FESEM) to produce two phases and pores that can be quantified with 50 nm spatial resolution. A macro was developed to quickly analyze multi-scale compositional variance from these images. Results for a 50:50 mixture of NdF3 and LaF3 agree with the computational model. The method has proven to be applicable only for mixtures with major components and specific particle morphologies, but the macro is useful for any type of imaging that produces excellent phase contrast, such as confocal microscopy. Fluorescence spectroscopy was used as an indirect method to confirm computational results for Nd-doped LaF3 mixtures. Fluorescence lifetime can be used as a quantitative method to indirectly measure chemical homogeneity when the limits of electron microscopy have been reached. Fluorescence lifetime represents the compositional fluctuations of a dopant on the nanoscale while accounting for billions of particles in a fast, non-destructive manner. The significance of this study lies in showing how small-scale fluctuations in homogeneity limit the optimization of optical properties, which can be improved by the proper selection of particle size and mixing method.

  5. Method for calcining radioactive wastes

    DOEpatents

    Bjorklund, William J.; McElroy, Jack L.; Mendel, John E.

    1979-01-01

    This invention relates to a method for the preparation of radioactive wastes in a low-leachability form by calcining the radioactive waste on a fluidized bed of glass frit, transferring the calcined waste to a melter to form a homogeneous melt of the glass and the calcined waste, and then solidifying the melt to encapsulate the radioactive calcine in a glass matrix.

  6. A compact finite element method for elastic bodies

    NASA Technical Reports Server (NTRS)

    Rose, M. E.

    1984-01-01

    A nonconforming finite element method is described for treating linear equilibrium problems, and a convergence proof showing second-order accuracy is given. The close relationship to a related compact finite difference scheme due to Phillips and Rose is examined. A condensation technique is shown to preserve the compactness property and suggests an approach to a certain type of homogenization.

  7. One-step purification of nisin A by immunoaffinity chromatography.

    PubMed Central

    Suárez, A M; Azcona, J I; Rodríguez, J M; Sanz, B; Hernández, P E

    1997-01-01

    The lantibiotic nisin A was purified to homogeneity by a single-step immunoaffinity chromatography method. An immunoadsorption matrix was developed by direct binding of anti-nisin A monoclonal antibodies to N-hydroxysuccinimide-activated Sepharose. The purification procedure was rapid and reproducible and rendered much higher final yields of nisin than any other described method. PMID:9406424

  8. Superconductor precursor mixtures made by precipitation method

    DOEpatents

    Bunker, Bruce C.; Lamppa, Diana L.; Voigt, James A.

    1989-01-01

    Method and apparatus for preparing highly pure, homogeneous precursor powder mixtures for metal oxide superconductive ceramics. The mixes are prepared by instantaneous precipitation from stoichiometric solutions of metal salts such as nitrates at controlled pH values within the 9 to 12 range, by addition of solutions of non-complexing pyrolyzable cations, such as alkylammonium and carbonate ions.

  9. The Effect of Virtual Language Learning Method on Writing Ability of Iranian Intermediate EFL Learners

    ERIC Educational Resources Information Center

    Khoshsima, Hooshang; Sayadi, Fatemeh

    2016-01-01

    This study aimed at investigating the effect of the virtual language learning method on Iranian intermediate EFL learners' writing ability. The study was conducted with 20 English Translation students at Chabahar Maritime University who were assigned into two groups, control and experimental, after ensuring their homogeneity by administering a TOEFL…

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Ch.; Gao, X. W.; Sladek, J.

    This paper reports our recent research work on crack analysis in continuously non-homogeneous and linear elastic functionally graded materials. A meshless boundary element method is developed for this purpose. Numerical examples are presented and discussed to demonstrate the efficiency and accuracy of the present numerical method, and to show the effects of the material gradation on the crack-opening displacements and the stress intensity factors.

  11. Unsupported palladium alloy membranes and methods of making same

    DOEpatents

    Way, J. Douglas; Thoen, Paul; Gade, Sabina K.

    2015-06-02

    The invention provides support-free palladium membranes and methods of making these membranes. Single-gas testing of the unsupported foils produced hydrogen permeabilities equivalent to thicker membranes produced by cold-rolling. Defect-free films as thin as 7.2 microns can be fabricated, with ideal H₂/N₂ selectivities as high as 40,000. Homogeneous membrane compositions may also be produced using these methods.

  12. Study of DNA extraction methods for use in loop-mediated isothermal amplification detection of single resting cysts in the toxic dinoflagellates Alexandrium tamarense and A. catenella.

    PubMed

    Nagai, Satoshi; Yamamoto, Keigo; Hata, Naotugu; Itakura, Shigeru

    2012-09-01

    In a previous study, we experienced unstable amplification and low amplification success in loop-mediated isothermal amplification (LAMP) reactions from naturally occurring vegetative cells or resting cysts of the toxic dinoflagellates Alexandrium tamarense and Alexandrium catenella. In this study, we examined 4 methods for extracting DNA from single resting cysts of A. tamarense and A. catenella to obtain more stable and better amplification success and to facilitate unambiguous detection using the LAMP method. Apart from comparing the 4 different DNA extraction methods, namely, (1) boiling in Tris-EDTA (TE) buffer, (2) heating at 65 °C in hexadecyltrimethylammonium bromide buffer, (3) boiling in 0.5% Chelex buffer, and (4) boiling in 5% Chelex buffer, we also examined the need for homogenization to crush the resting cysts before DNA extraction in each method. Homogenization of resting cysts was found to be essential for DNA extraction in all 4 methods. The detection time was significantly shorter in 5% Chelex buffer than in the other buffers and the amplification success was 100% (65/65), indicating the importance of DNA extraction and the effectiveness of 5% Chelex buffer in the Alexandrium LAMP. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Modeling strategies for pharmaceutical blend monitoring and end-point determination by near-infrared spectroscopy.

    PubMed

    Igne, Benoît; de Juan, Anna; Jaumot, Joaquim; Lallemand, Jordane; Preys, Sébastien; Drennen, James K; Anderson, Carl A

    2014-10-01

    The implementation of a blend monitoring and control method based on a process analytical technology such as near infrared spectroscopy requires the selection and optimization of numerous criteria that will affect the monitoring outputs and expected blend end-point. Using a five component formulation, the present article contrasts the modeling strategies and end-point determination of a traditional quantitative method based on the prediction of the blend parameters employing partial least-squares regression with a qualitative strategy based on principal component analysis and Hotelling's T(2) and residual distance to the model, called Prototype. The possibility to monitor and control blend homogeneity with multivariate curve resolution was also assessed. The implementation of the above methods in the presence of designed experiments (with variation of the amount of active ingredient and excipients) and with normal operating condition samples (nominal concentrations of the active ingredient and excipients) was tested. The impact of criteria used to stop the blends (related to precision and/or accuracy) was assessed. Results demonstrated that while all methods showed similarities in their outputs, some approaches were preferred for decision making. The selectivity of regression based methods was also contrasted with the capacity of qualitative methods to determine the homogeneity of the entire formulation. Copyright © 2014. Published by Elsevier B.V.
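
    A bare-bones version of the qualitative (Prototype-style) strategy might look like the sketch below: a PCA model is built on spectra of acceptably blended reference material, and new spectra are judged by Hotelling's T² and the residual (Q) distance to the model. All spectra are simulated and the 95th-percentile limits are illustrative, not the control limits used in the study.

```python
# Sketch: PCA-based blend monitoring with Hotelling's T^2 and residual (Q)
# statistics on simulated spectra (all values are illustrative).
import numpy as np

rng = np.random.default_rng(6)
n_ref, n_wl, n_pc = 60, 100, 3

ref = rng.standard_normal((n_ref, n_wl)) * 0.01 + np.linspace(0.2, 1.0, n_wl)
mean, std = ref.mean(axis=0), ref.std(axis=0) + 1e-12
ref_c = (ref - mean) / std

U, S, Vt = np.linalg.svd(ref_c, full_matrices=False)
P = Vt[:n_pc].T                                  # loadings
score_var = (S[:n_pc] ** 2) / (n_ref - 1)        # variance of each score

def t2_and_q(spectrum):
    x = (spectrum - mean) / std
    t = x @ P
    t2 = np.sum(t**2 / score_var)                # Hotelling's T^2
    q = np.sum((x - t @ P.T) ** 2)               # residual (Q) statistic
    return t2, q

# Hypothetical limits; in practice they come from the reference distribution
t2_lim = np.percentile([t2_and_q(s)[0] for s in ref], 95)
q_lim = np.percentile([t2_and_q(s)[1] for s in ref], 95)

new_spectrum = rng.standard_normal(n_wl) * 0.01 + np.linspace(0.2, 1.0, n_wl)
t2, q = t2_and_q(new_spectrum)
print(f"T2={t2:.2f} (limit {t2_lim:.2f}), Q={q:.2f} (limit {q_lim:.2f})")
print("within model space -> blend considered homogeneous"
      if t2 <= t2_lim and q <= q_lim else "outside model space -> keep blending")
```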

  14. Determination of refractive and volatile elements in sediment using laser ablation inductively coupled plasma mass spectrometry.

    PubMed

    Duodu, Godfred Odame; Goonetilleke, Ashantha; Allen, Charlotte; Ayoko, Godwin A

    2015-10-22

    A wet-milling protocol was employed to produce pressed powder tablets with excellent cohesion and homogeneity suitable for laser ablation (LA) analysis of volatile and refractive elements in sediment. The influence of sample preparation on analytical performance was also investigated, including sample homogeneity, accuracy and limit of detection. Milling in a volatile solvent for 40 min ensured that the sample was well mixed and could reasonably recover both volatile (Hg) and refractive (Zr) elements. With the exception of Cr (-52%) and Nb (+26%), major, minor and trace elements in STSD-1 and MESS-3 could be analysed within ±20% of the certified values. The method was compared with a total digestion method using HF by analysing 10 different sediment samples. The laser method recovers significantly higher amounts of analytes such as Ag, Cd and Sn than the total digestion method, making it a more robust method for elements across the periodic table. LA-ICP-MS also eliminates the interferences from chemical reagents as well as the health and safety risks associated with digestion processes. Therefore, it can be considered as an enhanced method for the analysis of heterogeneous matrices such as river sediments. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Recent developments in multidimensional transport methods for the APOLLO 2 lattice code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zmijarevic, I.; Sanchez, R.

    1995-12-31

    A usual method of preparation of homogenized cross sections for reactor coarse-mesh calculations is based on two-dimensional multigroup transport treatment of an assembly together with an appropriate leakage model and a reaction-rate-preserving homogenization technique. The current generation of assembly spectrum codes based on collision probability methods is capable of treating complex geometries (i.e., irregular meshes of arbitrary shape), thus avoiding the modeling error that was introduced in codes with traditional tracking routines. The power and architecture of current computers allow the treatment of spatial domains comprising several mutually interacting assemblies using a fine multigroup structure and retaining all geometric details of interest. Increasing safety requirements demand detailed two- and three-dimensional calculations for very heterogeneous problems such as control rod positioning, broken Pyrex rods, irregular compacting of mixed-oxide (MOX) pellets at an MOX-UO2 interface, and many others. An effort has been made to include accurate multidimensional transport methods in the APOLLO 2 lattice code. These include the extension to three-dimensional axially symmetric geometries of the general-geometry collision probability module TDT and the development of new two- and three-dimensional characteristics methods for regular Cartesian meshes. In this paper we discuss the main features of recently developed multidimensional methods that are currently being tested.

  16. A Bayes linear Bayes method for estimation of correlated event rates.

    PubMed

    Quigley, John; Wilson, Kevin J; Walls, Lesley; Bedford, Tim

    2013-12-01

    Typically, full Bayesian estimation of correlated event rates can be computationally challenging since estimators are intractable. When estimation of event rates represents one activity within a larger modeling process, there is an incentive to develop more efficient inference than provided by a full Bayesian model. We develop a new subjective inference method for correlated event rates based on a Bayes linear Bayes model under the assumption that events are generated from a homogeneous Poisson process. To reduce the elicitation burden we introduce homogenization factors to the model and, as an alternative to a subjective prior, an empirical method using the method of moments is developed. Inference under the new method is compared against estimates obtained under a full Bayesian model, which takes a multivariate gamma prior, where the predictive and posterior distributions are derived in terms of well-known functions. The mathematical properties of both models are presented. A simulation study shows that the Bayes linear Bayes inference method and the full Bayesian model provide equally reliable estimates. An illustrative example, motivated by a problem of estimating correlated event rates across different users in a simple supply chain, shows how ignoring the correlation leads to biased estimation of event rates. © 2013 Society for Risk Analysis.
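
    The following toy sketch shows one way an empirical, moment-based prior for Poisson event rates can be set up and used for shrinkage. It is only in the spirit of the approach described above: it does not reproduce the Bayes linear Bayes model, its homogenization factors, or the multivariate gamma prior of the paper, and the counts and exposures are invented for illustration.

        import numpy as np

        # Toy empirical-Bayes sketch: events at several sources are modelled as Poisson
        # with rates drawn from a common gamma prior whose parameters are set by the
        # method of moments. Not the Bayes linear Bayes estimator of the paper.

        counts   = np.array([3., 7., 2., 9., 5.])        # observed event counts (placeholder)
        exposure = np.array([10., 12., 8., 15., 11.])    # observation time per source (placeholder)

        rates = counts / exposure                        # crude per-source rate estimates
        m, v = rates.mean(), rates.var(ddof=1)

        # Method of moments for a gamma(alpha, beta) prior: mean = alpha/beta, var = alpha/beta^2
        beta_hat  = m / v
        alpha_hat = m * beta_hat

        # Conjugate gamma-Poisson update gives shrunken posterior-mean rates
        post_mean = (alpha_hat + counts) / (beta_hat + exposure)
        print("prior mean rate:", round(alpha_hat / beta_hat, 3))
        print("shrunken rate estimates:", np.round(post_mean, 3))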

  17. Innovations In Site Characterization: Geophysical Investigation at Hazardous Waste Sites

    EPA Pesticide Factsheets

    This compendium describes a number of geophysical technologies and methods that were used at 11 sites with significantly different geological settings and types of subsurface contamination, ranging from relatively homogeneous stratigraphy to the highly ...

  18. Promoted Iron Nanocrystals Obtained via Ligand Exchange as Active and Selective Catalysts for Synthesis Gas Conversion

    PubMed Central

    2017-01-01

    Colloidal synthesis routes have been recently used to fabricate heterogeneous catalysts with more controllable and homogeneous properties. Herein a method was developed to modify the surface composition of colloidal nanocrystal catalysts and to purposely introduce specific atoms via ligands and change the catalyst reactivity. Organic ligands adsorbed on the surface of iron oxide catalysts were exchanged with inorganic species such as Na2S, not only to provide an active surface but also to introduce controlled amounts of Na and S acting as promoters for the catalytic process. The catalyst composition was optimized for the Fischer–Tropsch direct conversion of synthesis gas into lower olefins. At industrially relevant conditions, these nanocrystal-based catalysts with controlled composition were more active, selective, and stable than catalysts with similar composition but synthesized using conventional methods, possibly due to their homogeneity of properties and synergic interaction of iron and promoters. PMID:28824820

  19. Decarboxylative alkylation for site-selective bioconjugation of native proteins via oxidation potentials.

    PubMed

    Bloom, Steven; Liu, Chun; Kölmel, Dominik K; Qiao, Jennifer X; Zhang, Yong; Poss, Michael A; Ewing, William R; MacMillan, David W C

    2018-02-01

    The advent of antibody-drug conjugates as pharmaceuticals has fuelled a need for reliable methods of site-selective protein modification that furnish homogeneous adducts. Although bioorthogonal methods that use engineered amino acids often provide an elegant solution to the question of selective functionalization, achieving homogeneity using native amino acids remains a challenge. Here, we explore visible-light-mediated single-electron transfer as a mechanism towards enabling site- and chemoselective bioconjugation. Specifically, we demonstrate the use of photoredox catalysis as a platform for selectivity, wherein the discrepancy in oxidation potentials between internal and C-terminal carboxylates can be exploited to obtain C-terminal functionalization exclusively. This oxidation-potential-gated technology is amenable to endogenous peptides and has been successfully demonstrated on the protein insulin. As a fundamentally new approach to bioconjugation, this methodology provides a blueprint for the development of photoredox catalysis as a generic platform to target other redox-active side chains for native conjugation.

  20. Influence of homogeneous magnetic fields on the flow of a ferrofluid in the Taylor-Couette system.

    PubMed

    Altmeyer, S; Hoffmann, Ch; Leschhorn, A; Lücke, M

    2010-07-01

    We investigate numerically the influence of a homogeneous magnetic field on a ferrofluid in the gap between two concentric, independently rotating cylinders. The full Navier-Stokes equations are solved with a combination of a finite difference method and a Galerkin method. Structure, dynamics, symmetry properties, bifurcation, and stability behavior of different vortex structures are investigated for axial and transversal magnetic fields, as well as combinations of them. We show that a transversal magnetic field modulates the Taylor vortex flow and the spiral vortex flow. Thus, a transversal magnetic field induces wavy structures: wavy Taylor vortex flow (wTVF) and wavy spiral vortex flow. In contrast to the classic wTVF, which is a secondarily bifurcating structure, these magnetically generated wavy Taylor vortices are pinned by the magnetic field, i.e., they are stationary and they appear via a primary forward bifurcation out of the basic state of circular Couette flow.

  1. Synchronous characterization of semiconductor microcavity laser beam.

    PubMed

    Wang, T; Lippi, G L

    2015-06-01

    We report on a high-resolution double-channel imaging method used to synchronously map the intensity and optical-frequency distribution of a laser beam in the plane orthogonal to the propagation direction. The synchronous measurement allows us to show that the laser frequency has an inhomogeneous distribution below threshold, but becomes homogeneous across the fundamental Gaussian mode above threshold. The deviations of the beam's tails from the Gaussian shape, however, are accompanied by sizeable fluctuations in the laser wavelength, possibly deriving from manufacturing details and from the influence of spontaneous emission in the very-low-intensity wings. In addition to the synchronous spatial characterization, a temporal analysis at any given point in the beam cross section is carried out. Using this method, the beam homogeneity and spatial shape, energy density, energy center, and the defect-related spectrum can also be extracted from these high-resolution pictures.
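
    As a small illustration of the kind of quantities mentioned above, the sketch below extracts an energy center (intensity centroid) and a crude homogeneity figure from a 2D intensity image. The metrics and the synthetic image are assumptions; the authors' exact definitions are not reproduced.

        import numpy as np

        # Sketch: energy center and a simple homogeneity measure of a beam image.
        # A synthetic Gaussian-like beam with noise stands in for a camera frame.

        rng = np.random.default_rng(1)
        ny, nx = 128, 128
        y, x = np.mgrid[0:ny, 0:nx]
        image = np.exp(-(((x - 70) / 15.0) ** 2 + ((y - 60) / 12.0) ** 2))
        image += 0.01 * rng.random((ny, nx))             # background / noise

        total = image.sum()
        x_c = (image * x).sum() / total                  # energy center, x
        y_c = (image * y).sum() / total                  # energy center, y

        core = image > 0.5 * image.max()                 # bright core of the beam
        homogeneity = image[core].std() / image[core].mean()   # coefficient of variation

        print(f"energy center = ({x_c:.1f}, {y_c:.1f}), core CV = {homogeneity:.3f}")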

  2. Optimization of cell disruption methods for efficient recovery of bioactive metabolites via NMR of three freshwater microalgae (chlorophyta).

    PubMed

    Ma, Nyuk Ling; Teh, Kit Yinn; Lam, Su Shiung; Kaben, Anne Marie; Cha, Thye San

    2015-08-01

    This study demonstrates the use of NMR techniques coupled with chemometric analysis as a high-throughput data mining method to identify and examine the efficiency of different disruption techniques tested on microalgae (Chlorella variabilis, Scenedesmus regularis and Ankistrodesmus gracilis). The yield and chemical diversity obtained from the disruption methods, together with the effects of oven-drying and freeze-drying pre-treatments applied prior to disruption, are discussed. HCl extraction showed the highest recovery of oil compounds from the disrupted microalgae (up to 90%). In contrast, NMR analysis showed the highest intensity of bioactive metabolites for homogenized extracts pre-treated with freeze-drying, indicating that homogenization is a more favorable approach to recover bioactive substances from the disrupted microalgae. The results show the potential of NMR as a useful metabolic fingerprinting tool for assessing compound diversity in complex microalgae extracts. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Decarboxylative alkylation for site-selective bioconjugation of native proteins via oxidation potentials

    NASA Astrophysics Data System (ADS)

    Bloom, Steven; Liu, Chun; Kölmel, Dominik K.; Qiao, Jennifer X.; Zhang, Yong; Poss, Michael A.; Ewing, William R.; MacMillan, David W. C.

    2018-02-01

    The advent of antibody-drug conjugates as pharmaceuticals has fuelled a need for reliable methods of site-selective protein modification that furnish homogeneous adducts. Although bioorthogonal methods that use engineered amino acids often provide an elegant solution to the question of selective functionalization, achieving homogeneity using native amino acids remains a challenge. Here, we explore visible-light-mediated single-electron transfer as a mechanism towards enabling site- and chemoselective bioconjugation. Specifically, we demonstrate the use of photoredox catalysis as a platform for selectivity, wherein the discrepancy in oxidation potentials between internal and C-terminal carboxylates can be exploited to obtain C-terminal functionalization exclusively. This oxidation-potential-gated technology is amenable to endogenous peptides and has been successfully demonstrated on the protein insulin. As a fundamentally new approach to bioconjugation, this methodology provides a blueprint for the development of photoredox catalysis as a generic platform to target other redox-active side chains for native conjugation.

  4. Combinatorial construction of tilings by barycentric simplex orbits (D symbols) and their realizations in Euclidean and other homogeneous spaces.

    PubMed

    Molnár, Emil

    2005-11-01

    A new method, developed in previous works by the author (partly with co-authors), is presented which decides algorithmically, in principle by computer, whether a combinatorial space tiling (Tau, Gamma) is realizable in the d-dimensional Euclidean space E(d) (think of d = 2, 3, 4) or in other homogeneous spaces, e.g. in Thurston's 3-geometries: E(3), S(3), H(3), S(2) x R, H(2) x R, SL(2)R, Nil, Sol. Then our group Gamma will be an isometry group of a projective metric 3-sphere PiS(3) (R, < , >), acting discontinuously on its above tiling Tau. The method is illustrated by a plane example and by the well known rhombohedron tiling (Tau, Gamma), where Gamma = R3m is the Euclidean space group No. 166 in International Tables for Crystallography.

  5. Reverse engineering of the homogeneous-entity product profiles based on CCD

    NASA Astrophysics Data System (ADS)

    Gan, Yong; Zhong, Jingru; Sun, Ning; Sun, Aoran

    2011-08-01

    This measurement system uses a layer-by-layer (delaminated) measurement principle and measures values of the entity along three perpendicular directions. As the measured entity is immersed in the liquid layer by layer, each layer's image is collected by a CCD and digitally processed. The basic measuring principle and the working process of the method are introduced. According to Archimedes' principle, the buoyancy and the submerged volume associated with each immersion depth are measured with an electronic balance, and the corresponding mathematical models are established. By computing each layer's weight and centre of gravity with artificial-intelligence-based methods, the 3D coordinates of every small entity cell in the different layers can be reckoned and a 3D contour picture constructed. The experimental results show that the method can measure any homogeneous entity insoluble in water; the measurement is fast and non-destructive, and entities with internal holes can also be measured.
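
    The Archimedes step described above can be sketched in a few lines: the increase in buoyant force after each additional layer is immersed gives that layer's volume through V = ΔF_b/(ρg). The readings and liquid density below are placeholders, and the weight and centre-of-gravity reconstruction of the actual system is not reproduced.

        # Sketch of the Archimedes step: convert per-layer buoyancy increments into
        # layer volumes, V = dF_b / (rho * g). Placeholder readings and density.

        RHO_LIQUID = 1000.0       # kg/m^3 (water assumed)
        G = 9.81                  # m/s^2

        # cumulative buoyant force (N) read from the balance after each layer is immersed
        buoyancy_readings = [0.000, 0.012, 0.027, 0.045, 0.060]

        layer_volumes = []
        for prev, curr in zip(buoyancy_readings, buoyancy_readings[1:]):
            layer_volumes.append((curr - prev) / (RHO_LIQUID * G))   # m^3 of this layer

        total_volume = sum(layer_volumes)
        print("layer volumes (cm^3):", [round(v * 1e6, 2) for v in layer_volumes])
        print("total volume (cm^3):", round(total_volume * 1e6, 2))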

  6. Solution XAS Analysis for Exploring the Active Species in Homogeneous Vanadium Complex Catalysis

    NASA Astrophysics Data System (ADS)

    Nomura, Kotohiro; Mitsudome, Takato; Tsutsumi, Ken; Yamazoe, Seiji

    2018-06-01

    Selected examples of V K-edge X-ray Absorption Near Edge Structure (XANES) analysis of a series of vanadium complexes containing imido ligands (possessing a metal-nitrogen double bond) in toluene solution are introduced; their pre-edge and edge features were affected by the complex structures and the nature of the ligands. Selected results in exploring the oxidation states of the active species in ethylene dimerization/polymerization using homogeneous vanadium catalysts [consisting of (imido)vanadium(V) complexes and Al cocatalysts] by X-ray absorption spectroscopy (XAS) analyses are also introduced. It has been demonstrated that the method provides clearer information concerning the active species in situ, especially in combination with other methods (NMR and ESR spectra, X-ray crystallographic analysis, and reaction chemistry), and should be a powerful tool for the study of catalytic mechanisms as well as for structural analysis in solution.

  7. A computer-assisted study of pulse dynamics in anisotropic media

    NASA Astrophysics Data System (ADS)

    Krishnan, J.; Engelborghs, K.; Bär, M.; Lust, K.; Roose, D.; Kevrekidis, I. G.

    2001-06-01

    This study focuses on the computer-assisted stability analysis of travelling pulse-like structures in spatially periodic heterogeneous reaction-diffusion media. The physical motivation comes from pulse propagation in thin annular domains on a diffusionally anisotropic catalytic surface. The study was performed by computing the travelling pulse-like structures as limit cycles of the spatially discretized PDE, which in turn is performed in two ways: a Newton method based on a pseudospectral discretization of the PDE, and a Newton-Picard method based on a finite difference discretization. Details about the spectra of these modulated pulse-like structures are discussed, including how they may be compared with the spectra of pulses in homogeneous media. The effects of anisotropy on the dynamics of pulses and pulse pairs are studied. Beyond shifting the location of bifurcations present in homogeneous media, anisotropy can also introduce certain new instabilities.

  8. [Proposed method to estimate underreporting of induced abortion in Spain].

    PubMed

    Rodríguez Blas, C; Sendra Gutiérrez, J M; Regidor Poyatos, E; Gutiérrez Fisac, J L; Iñigo Martínez, J

    1994-01-01

    In Spain, from 1987 to 1990 the rate of legal abortion reported to the health authorities doubled; nevertheless, the observed geographical differences suggest underreporting of the number of voluntary pregnancy terminations. Based on information on several sociodemographic, economic and cultural characteristics, contraceptive use, availability of abortion services, fertility indices, and maternal and child health status, five homogeneous groups of autonomous regions were identified by applying factor and cluster analysis techniques. To estimate the level of underreporting, we assumed that all the regions forming a cluster ought to have the same abortion rate as the region with the highest rate in that group. We estimate that about 18,463 abortions (33.2%) were not reported during 1990. The proposed method can be used for assessing notification completeness, since it allows the identification of geographical areas where very similar rates of legal abortion are expected.

  9. Method of making active magnetic refrigerant, colossal magnetostriction and giant magnetoresistive materials based on Gd-Si-Ge alloys

    DOEpatents

    Gschneidner, Jr., Karl A.; Pecharsky, Alexandra O.; Pecharsky, Vitalij K.

    2003-07-08

    Method of making an active magnetic refrigerant represented by Gd5(SixGe1-x)4 alloy for 0 ≤ x ≤ 1.0 comprising placing amounts of the commercially pure Gd, Si, and Ge charge components in a crucible, heating the charge contents under subambient pressure to a melting temperature of the alloy for a time sufficient to homogenize the alloy and oxidize carbon with oxygen present in the Gd charge component to reduce carbon, rapidly solidifying the alloy in the crucible, and heat treating the solidified alloy at a temperature below the melting temperature for a time effective to homogenize a microstructure of the solidified material, and then cooling sufficiently fast to prevent the eutectoid decomposition and improve magnetocaloric and/or the magnetostrictive and/or the magnetoresistive properties thereof.

  10. Finite-time stabilization for a class of nonholonomic feedforward systems subject to inputs saturation.

    PubMed

    Gao, Fangzheng; Yuan, Ye; Wu, Yuqiang

    2016-09-01

    This paper studies the problem of finite-time stabilization by state feedback for a class of uncertain nonholonomic systems in feedforward-like form subject to input saturation. Under a weaker homogeneous condition on the system's growth, a saturated finite-time control scheme is developed by exploiting the adding-a-power-integrator method, the homogeneous domination approach and the nested saturation technique. Together with a novel switching control strategy, the designed saturated controller guarantees that the states of the closed-loop system are regulated to zero in finite time without violation of the constraint. As an application of the proposed theoretical results, the problem of saturated finite-time control of a vertical wheel on a rotating table is solved. Simulation results are given to demonstrate the effectiveness of the proposed method. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Non-homogeneous harmonic analysis: 16 years of development

    NASA Astrophysics Data System (ADS)

    Volberg, A. L.; Èiderman, V. Ya

    2013-12-01

    This survey contains results and methods in the theory of singular integrals, a theory which has been developing dramatically in the last 15-20 years. The central (although not the only) topic of the paper is the connection between the analytic properties of integrals and operators with Calderón-Zygmund kernels and the geometric properties of the measures. The history is traced of the classical Painlevé problem of describing removable singularities of bounded analytic functions, which has provided a strong incentive for the development of this branch of harmonic analysis. The progress of recent decades has largely been based on the creation of an apparatus for dealing with non-homogeneous measures, and much attention is devoted to this apparatus here. Several open questions are stated, first and foremost in the multidimensional case, where the method of curvature of a measure is not available. Bibliography: 128 titles.

  12. Metal Catalyst for Low-Temperature Growth of Controlled Zinc Oxide Nanowires on Arbitrary Substrates

    PubMed Central

    Kim, Baek Hyun; Kwon, Jae W.

    2014-01-01

    Zinc oxide nanowires generated by the hydrothermal method exhibit superior physical and chemical characteristics. Quality control of the growth has been very challenging, and controlled growth is only achievable under very limited conditions using homogeneous seed layers with high-temperature processes. Here we show controlled ZnO nanowire growth on various organic and inorganic materials without the requirement of a homogeneous seed layer or a high-temperature process. We also report the discovery of an important role of electronegativity in nanowire growth on arbitrary substrates. Using heterogeneous metal oxide interlayers with low-temperature hydrothermal methods, we demonstrate well-controlled ZnO nanowire arrays and single nanowires on flat or curved surfaces. A metal catalyst and heterogeneous metal oxide interlayers are found to determine the lattice match with ZnO and to largely influence the controlled alignment. These findings will contribute to the development of novel nanodevices using controlled nanowires. PMID:24625584

  13. Differential homogeneous immunosensor device

    DOEpatents

    Malmros, Mark K.; Gulbinski, III, Julian

    1990-04-10

    There is provided a novel method of testing for the presence of an analyte in a fluid suspected of containing the same. In this method, in the presence of the analyte, a substance capable of modifying certain characteristics of the substrate is bound to the substrate and the change in these characteristics is measured. While the method may be modified for carrying out quantitative differential analyses, it eliminates the need for washing analyte from the substrate, which is characteristic of prior-art methods.

  14. Sample preservation, transport and processing strategies for honeybee RNA extraction: Influence on RNA yield, quality, target quantification and data normalization.

    PubMed

    Forsgren, Eva; Locke, Barbara; Semberg, Emilia; Laugen, Ane T; Miranda, Joachim R de

    2017-08-01

    Viral infections in managed honey bees are numerous, and most of them are caused by viruses with an RNA genome. Since RNA degrades rapidly, appropriate sample management and RNA extraction methods are imperative to obtain high-quality RNA for downstream assays. This study evaluated the effect of various sampling-transport scenarios (combinations of temperature, RNA stabilizers, and transport duration) on six RNA quality parameters: yield, purity, integrity, cDNA synthesis efficiency, target detection and quantification. The use of water versus extraction buffer for the primary bee tissue homogenate prior to RNA extraction was also compared. The strategy least affected by time was preservation of samples at -80 °C. All other regimens turned out to be poor alternatives unless the samples were frozen or processed within 24 h. Chemical stabilizers have the greatest impact on RNA quality, and adding an extra homogenization step (a QIAshredder™ homogenizer) to the extraction protocol significantly improves the RNA yield and chemical purity. This study confirms that RIN values (RNA Integrity Number) should be used cautiously with bee RNA. Using water for the primary homogenate has no negative effect on RNA quality as long as this step takes no longer than 15 min. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Asphalt pavement aging and temperature dependent properties using functionally graded viscoelastic model

    NASA Astrophysics Data System (ADS)

    Dave, Eshan V.

    Asphalt concrete pavements are inherently graded viscoelastic structures, with oxidative aging of the asphalt binder and temperature cycling due to climatic conditions being the major causes of non-homogeneity. Current pavement analysis and simulation procedures rely on a layered approach to account for these non-homogeneities. The conventional finite-element modeling (FEM) technique discretizes the problem domain into smaller elements, each with a unique constitutive property. However, the assignment of a unique material property description to an element in the FEM approach makes it an unattractive choice for the simulation of problems with material non-homogeneities. Specialized elements, known as "graded elements", allow for non-homogeneous material property definitions within an element. This dissertation describes the development of a graded viscoelastic finite element analysis method and its application to the analysis of asphalt concrete pavements. Results show that the present research improves the efficiency and accuracy of simulations for asphalt pavement systems. Some of the practical implications of this work include the new technique's capability for accurate analysis and design of asphalt pavements and overlay systems and for the determination of pavement performance with varying climatic conditions and amount of in-service age. Other application areas include simulation of functionally graded fiber-reinforced concrete, geotechnical materials, metals and metal composites at high temperatures, polymers, and several other naturally existing and engineered materials.

  16. Development of a homogeneous assay format for p53 antibodies using fluorescence correlation spectroscopy

    NASA Astrophysics Data System (ADS)

    Neuweiler, Hannes; Scheffler, Silvia; Sauer, Markus

    2005-08-01

    The development of reliable methods for the detection of minute amounts of antibodies directly in homogeneous solution represents one of the major tasks in the current research field of molecular diagnostics. We demonstrate the potential of fluorescence correlation spectroscopy (FCS) in combination with quenched peptide-based fluorescence probes for sensitive detection of p53 antibodies directly in homogeneous solution. Single tryptophan (Trp) residues in the sequences of short, synthetic peptide epitopes of the human p53 protein efficiently quench the fluorescence of an oxazine fluorophore attached to the amino-terminal ends of the peptides. The fluorescence quenching mechanism is thought to be a photoinduced electron transfer reaction from Trp to the dye, enabled by the formation of intramolecular complexes between dye and Trp. Specific recognition of the epitope by the antibody confines the conformational flexibility of the peptide; consequently, complex formation between dye and Trp is abolished and fluorescence is recovered. Using FCS, antibody binding can be monitored by observing two parameters simultaneously: both the diffusional mobility of the peptide and the quenching amplitude induced by the conformational flexibility of the peptide change significantly upon antibody binding. Our data demonstrate that FCS in combination with fluorescence-quenched peptide epitopes opens new possibilities for the reliable detection of antibody binding events in homogeneous solution.

  17. Form of the manifestly covariant Lagrangian

    NASA Astrophysics Data System (ADS)

    Johns, Oliver Davis

    1985-10-01

    The preferred form for the manifestly covariant Lagrangian function of a single, charged particle in a given electromagnetic field is the subject of some disagreement in the textbooks. Some authors use a "homogeneous" Lagrangian and others use a "modified" form in which the covariant Hamiltonian function is made to be nonzero. We argue in favor of the "homogeneous" form. We show that the covariant Lagrangian theories can be understood only if one is careful to distinguish quantities evaluated on the varied (in the sense of the calculus of variations) world lines from quantities evaluated on the unvaried world lines. By making this distinction, we are able to derive the Hamilton-Jacobi and Klein-Gordon equations from the "homogeneous" Lagrangian, even though the covariant Hamiltonian function is identically zero on all world lines. The derivation of the Klein-Gordon equation in particular gives Lagrangian theoretical support to the derivations found in standard quantum texts, and is also shown to be consistent with the Feynman path-integral method. We conclude that the "homogeneous" Lagrangian is a completely adequate basis for covariant Lagrangian theory both in classical and quantum mechanics. The article also explores the analogy with the Fermat theorem of optics, and illustrates a simple invariant notation for the Lagrangian and other four-vector equations.

  18. Radiation and scattering by thin-wire structures in the complex frequency domain. [electromagnetic theory for thin-wire antennas

    NASA Technical Reports Server (NTRS)

    Richmond, J. H.

    1974-01-01

    Piecewise-sinusoidal expansion functions and Galerkin's method are employed to formulate a solution for an arbitrary thin-wire configuration in a homogeneous conducting medium. The analysis is performed in the real or complex frequency domain. In antenna problems, the solution determines the current distribution, impedance, radiation efficiency, gain and far-field patterns. In scattering problems, the solution determines the absorption cross section, scattering cross section and the polarization scattering matrix. The electromagnetic theory is presented for thin wires and the forward-scattering theorem is developed for an arbitrary target in a homogeneous conducting medium.

  19. Method of producing homogeneous mixed metal oxides and metal-metal oxide mixtures

    DOEpatents

    Quinby, Thomas C.

    1978-01-01

    Metal powders, metal oxide powders, and mixtures thereof of controlled particle size are provided by reacting an aqueous solution containing dissolved metal values with excess urea. Upon heating, urea reacts with water from the solution leaving a molten urea solution containing the metal values. The molten urea solution is heated to above about 180 °C whereupon metal values precipitate homogeneously as a powder. The powder is reduced to metal or calcined to form oxide particles. One or more metal oxides in a mixture can be selectively reduced to produce metal particles or a mixture of metal and metal oxide particles.

  20. Homogeneity of CdZnTe detectors

    NASA Astrophysics Data System (ADS)

    Hermon, H.; Schieber, M.; James, R. B.; Lund, J.; Antolak, A. J.; Morse, D. H.; Kolesnikov, N. N. P.; Ivanov, Y. N.; Goorsky, M. S.; Yoon, H.; Toney, J.; Schlesinger, T. E.

    1998-02-01

    We describe the current state of nuclear radiation detectors produced from single crystals of Cd1-xZnxTe (CZT), with 0.04 < x < 0.4, grown by the vertical high pressure Bridgman (VHPB) method. The crystals investigated were grown commercially in the USA and at the Institute of Solid State Physics, Chernogolovka, Russia. The CZT was evaluated by Sandia National Laboratories and the UCLA and CMU groups using proton-induced X-ray emission (PIXE), X-ray diffraction (XRD), photoluminescence (PL), infrared (IR) transmission microscopy, leakage current measurements and response to nuclear radiation. We discuss the homogeneity of the various CZT crystals based on the results from these measurement techniques.

  1. Silver-free activation of ligated gold(I) chlorides: the use of [Me3NB12Cl11]- as a weakly coordinating anion in homogeneous gold catalysis.

    PubMed

    Wegener, Michael; Huber, Florian; Bolli, Christoph; Jenne, Carsten; Kirsch, Stefan F

    2015-01-12

    Phosphane and N-heterocyclic carbene ligated gold(I) chlorides can be effectively activated by Na[Me3NB12Cl11] (1) under silver-free conditions. This activation method with a weakly coordinating closo-dodecaborate anion was shown to be suitable for a large variety of reactions known to be catalyzed by homogeneous gold species, ranging from carbocyclizations to heterocyclizations. Additionally, the capability of 1 in a previously unknown conversion of 5-silyloxy-1,6-allenynes was demonstrated. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Bifurcation approach to a logistic elliptic equation with a homogeneous incoming flux boundary condition

    NASA Astrophysics Data System (ADS)

    Umezu, Kenichiro

    In this paper, we consider a semilinear elliptic boundary value problem in a smooth bounded domain, having the so-called logistic nonlinearity that originates from population dynamics, with a nonlinear boundary condition. Although the logistic nonlinearity has an absorption effect in the problem, the nonlinear boundary condition is induced by the homogeneous incoming flux on the boundary. The objective of our study is to analyze the existence of a bifurcation component of positive solutions from trivial solutions and its asymptotic behavior and stability. We perform this analysis using the method developed by Lyapunov and Schmidt, based on a scaling argument.

  3. The stagnation-point flow towards a shrinking sheet with homogeneous - heterogeneous reactions effects: A stability analysis

    NASA Astrophysics Data System (ADS)

    Ismail, Nurul Syuhada; Arifin, Norihan Md.; Bachok, Norfifah; Mahiddin, Norhasimah

    2017-01-01

    A numerical study is performed to examine the problem of stagnation-point flow towards a shrinking sheet with homogeneous-heterogeneous reaction effects. By using a non-similar transformation, the governing equations can be reduced to ordinary differential equations. The resulting equations are then solved numerically by a shooting method implemented in Maple. Based on the numerical results obtained, dual solutions exist when the velocity ratio parameter λ < 0. A stability analysis is then carried out with the bvp4c solver in Matlab to determine which of the two solutions is stable.
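
    The flow equations of the paper are not reproduced here, but the shooting idea and the appearance of dual solutions can be illustrated on the classical Bratu problem y'' + e^y = 0, y(0) = y(1) = 0, which also admits two solutions. The brackets for the initial slope are assumptions chosen to isolate the two branches.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import brentq

        # Shooting-method sketch on the Bratu problem, which has dual solutions.
        # Guess the initial slope y'(0), integrate to x = 1, and adjust the slope until
        # the far boundary condition y(1) = 0 is met; two brackets give two branches.

        def rhs(x, u):
            y, dy = u
            return [dy, -np.exp(y)]

        def residual(slope):
            """y(1) when integrating from y(0) = 0 with y'(0) = slope."""
            sol = solve_ivp(rhs, (0.0, 1.0), [0.0, slope], rtol=1e-8, atol=1e-10)
            return sol.y[0, -1]

        slope_lower = brentq(residual, 0.1, 2.0)     # first (lower) solution branch
        slope_upper = brentq(residual, 5.0, 20.0)    # second (upper) solution branch
        print(f"dual shooting slopes: {slope_lower:.4f} and {slope_upper:.4f}")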

  4. Synthesis of ultrasmall, homogeneously alloyed, bimetallic nanoparticles on silica supports

    NASA Astrophysics Data System (ADS)

    Wong, A.; Liu, Q.; Griffin, S.; Nicholls, A.; Regalbuto, J. R.

    2017-12-01

    Supported nanoparticles containing more than one metal have a variety of applications in sensing, catalysis, and biomedicine. Common synthesis techniques for this type of material often result in large, unalloyed nanoparticles that lack the interactions between the two metals that give the particles their desired characteristics. We demonstrate a relatively simple, effective, generalizable method to produce highly dispersed, well-alloyed bimetallic nanoparticles. Ten permutations of noble and base metals (platinum, palladium, copper, nickel, and cobalt) were synthesized with average particle sizes from 0.9 to 1.4 nanometers, with tight size distributions. High-resolution imaging and x-ray analysis confirmed the homogeneity of alloying in these ultrasmall nanoparticles.

  5. Constitutive modeling and control of 1D smart composite structures

    NASA Astrophysics Data System (ADS)

    Briggs, Jonathan P.; Ostrowski, James P.; Ponte-Castaneda, Pedro

    1998-07-01

    Homogenization techniques for determining effective properties of composite materials may provide advantages for control of stiffness and strain in systems using hysteretic smart actuators embedded in a soft matrix. In this paper, a homogenized model of a 1D composite structure comprised of shape memory alloys and a rubber-like matrix is presented. With proportional and proportional/integral feedback, using current as the input state and global strain as an error state, implementation scenarios include the use of tractions on the boundaries and a nonlinear constitutive law for the matrix. The result is a simple model which captures the nonlinear behavior of the smart composite material system and is amenable to experiments with various control paradigms. The success of this approach in the context of the 1D model suggests that the homogenization method may prove useful in investigating control of more general smart structures. Applications of such materials could include active rehabilitation aids, e.g. wrist braces, as well as swimming/undulating robots, or adaptive molds for manufacturing processes.

  6. Determining the wedge angle and optical homogeneity of a glass plate by statistically analyzing the deformation in the wavefront surface.

    PubMed

    Yang, Pao-Keng

    2017-08-01

    By using a light-emitting diode as the probing light source and a Shack-Hartmann wavefront sensor as the recorder of the wavefront surface in a relative measurement, we present a useful method for determining the small wedge angle and optical homogeneity of a nominally planar glass plate from wavefront measurements. The measured wavefront surface from the light source was first calibrated to be a horizontal plane before the plate under test was inserted. The wedge angle of the plate can be determined from the inclination angle of the regression plane of the measured wavefront surface after the plate is inserted between the light source and the wavefront sensor. Despite the annoying time-dependent altitude fluctuation in the measured wavefront topography, the optical homogeneity of the plate can be estimated from the increment in the average variance of the wavefront surface about its regression plane after the light passes through it, by using the Bienaymé formula.
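
    A minimal sketch of the regression-plane step is shown below: fit a plane to the measured wavefront with least squares, read the tilt from the plane coefficients, convert it to a wedge angle with the standard thin-wedge relation tilt ≈ (n - 1)·wedge, and use the residual variance about the plane as a homogeneity proxy. The data, refractive index, and the (n - 1) conversion are assumptions standing in for the authors' calibration.

        import numpy as np

        # Sketch: regression plane of a wavefront surface W(x, y) = a*x + b*y + c.
        # Synthetic data; the thin-wedge conversion tilt = (n - 1) * wedge is assumed.

        rng = np.random.default_rng(2)
        n_glass = 1.517                       # assumed refractive index of the plate

        x, y = np.meshgrid(np.linspace(-5e-3, 5e-3, 40), np.linspace(-5e-3, 5e-3, 40))
        true_tilt = 2e-5                      # rad of wavefront tilt along x (placeholder)
        W = true_tilt * x + 3e-9 * rng.standard_normal(x.shape)   # wavefront surface (m)

        A = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])
        coeffs, *_ = np.linalg.lstsq(A, W.ravel(), rcond=None)
        a, b, c = coeffs

        tilt = np.hypot(a, b)                 # inclination of the regression plane (rad)
        wedge = tilt / (n_glass - 1.0)        # thin-wedge estimate of the wedge angle (rad)
        resid_var = np.var(W.ravel() - A @ coeffs)   # variance about the plane (homogeneity proxy)

        print(f"wavefront tilt = {tilt:.2e} rad, wedge angle ~ {wedge:.2e} rad")
        print(f"variance about regression plane = {resid_var:.2e} m^2")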

  7. A zebrafish embryo behaves both as a "cortical shell-liquid core" structure and a homogeneous solid when experiencing mechanical forces.

    PubMed

    Liu, Fei; Wu, Dan; Chen, Ken

    2014-12-01

    Mechanical properties are vital for living cells, and various models have been developed to study the mechanical behavior of cells. However, there is debate regarding whether a cell behaves more similarly to a "cortical shell-liquid core" structure (membrane-like) or a homogeneous solid (cytoskeleton-like) when experiencing stress from mechanical forces. Unlike most experimental methods, which concern the small-strain deformation of a cell, we focused on the mechanical behavior of a cell undergoing small to large strain by conducting microinjection experiments on zebrafish embryo cells. The power law of order 1.5 between the injection force and the injection distance indicates that the cell behaves as a homogeneous solid at small-strain deformation. The linear relation between the rupture force and the microinjector radius suggests that the embryo behaves in a membrane-like manner when subjected to large-strain deformation. We also discuss the possible reasons for the debate by analyzing the mechanical properties of F-actin filaments.
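
    The two diagnostic fits mentioned above are straightforward to reproduce in outline: a power law F ∝ d^p is checked by linear regression in log-log space, and the rupture force versus microinjector radius is checked by an ordinary linear fit. All numbers below are placeholders, not the measured zebrafish data.

        import numpy as np

        # Sketch: (1) power-law exponent from a log-log fit of force vs. injection distance;
        # (2) linear fit of rupture force vs. microinjector tip radius. Placeholder data.

        rng = np.random.default_rng(3)
        d = np.array([2.0, 4.0, 6.0, 8.0, 10.0])                 # injection distance (um)
        F = 0.05 * d**1.5 * (1 + 0.03 * rng.standard_normal(5))  # injection force (uN)

        p, log_k = np.polyfit(np.log(d), np.log(F), 1)           # slope = power-law exponent
        print(f"fitted exponent p = {p:.2f} (solid-like behaviour if close to 1.5)")

        R  = np.array([1.0, 2.0, 3.0, 4.0])                      # tip radius (um)
        Fr = np.array([0.21, 0.43, 0.62, 0.85])                  # rupture force (uN)
        slope, intercept = np.polyfit(R, Fr, 1)
        print(f"rupture force ~ {slope:.2f} * R + {intercept:.2f} (membrane-like if linear)")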

  8. Modeling Woven Polymer Matrix Composites with MAC/GMC

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M. (Technical Monitor)

    2000-01-01

    NASA's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) is used to predict the elastic properties of plain weave polymer matrix composites (PMCs). The traditional one-step three-dimensional homogenization procedure that has been used in conjunction with MAC/GMC for modeling woven composites in the past is inaccurate due to the lack of shear coupling inherent to the model. However, by performing a two-step homogenization procedure, in which the woven composite repeating unit cell is homogenized independently in the through-thickness direction prior to homogenization in the plane of the weave, MAC/GMC can now accurately model woven PMCs. This two-step procedure is outlined and implemented, and predictions are compared with results from the traditional one-step approach and other models and experiments from the literature. Full coupling of this two-step technique with MAC/GMC will result in a widely applicable, efficient, and accurate tool for the design and analysis of woven composite materials and structures.
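
    The ordering of the two homogenization steps can be illustrated with a deliberately crude sketch using only elementary series (Reuss) and parallel (Voigt) mixing rules; the actual generalized method of cells in MAC/GMC is far richer and is not reproduced here, and all moduli and volume fractions are placeholders.

        # Toy two-step homogenization of a plain-weave unit cell: homogenize each
        # through-thickness column first (series/Reuss), then combine the columns in
        # the plane of the weave (parallel/Voigt). Illustration only, not GMC.

        E_TOW    = 60.0   # GPa, effective modulus of an impregnated fibre tow (assumed)
        E_MATRIX = 3.5    # GPa, neat polymer matrix (assumed)

        def reuss(e1, e2, v1):            # series (through-thickness) mixing
            return 1.0 / (v1 / e1 + (1.0 - v1) / e2)

        def voigt(e1, e2, v1):            # parallel (in-plane) mixing
            return v1 * e1 + (1.0 - v1) * e2

        # Step 1: homogenize through the thickness of each column of the unit cell.
        col_with_tow = reuss(E_TOW, E_MATRIX, 0.5)    # column containing tow + matrix
        col_matrix   = E_MATRIX                       # pure-matrix column

        # Step 2: homogenize the resulting columns in the plane of the weave.
        E_eff = voigt(col_with_tow, col_matrix, 0.8)
        print(f"toy effective in-plane modulus ~ {E_eff:.1f} GPa")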

  9. Combinatorial synthesis of phosphors using arc-imaging furnace

    PubMed Central

    Ishigaki, Tadashi; Toda, Kenji; Yoshimura, Masahiro; Uematsu, Kazuyoshi; Sato, Mineo

    2011-01-01

    We have applied a novel 'melt synthesis technique' rather than a conventional solid-state reaction to rapidly synthesize phosphor materials. During a synthesis, the mixture of oxides or their precursors is melted by light pulses (10–60 s) in an arc-imaging furnace on a water-cooled copper hearth to form a globule of 1–5 mm diameter, which is then rapidly cooled by turning off the light. Using this method, we synthesized several phosphor compounds including Y3Al5O12:Ce (YAG) and SrAl2O4:Eu,Dy. Complex phosphor oxides are difficult to produce by conventional solid-state reaction techniques because of the slow reaction rates among solid oxides; as a result, the oxides do not readily form homogeneous compounds or solid solutions. Melt reactions, on the other hand, are very fast (10–60 s) and result in homogeneous compounds owing to rapid diffusion and mixing in the liquid phase. Therefore, melt synthesis techniques are suitable for preparing multi-component homogeneous compounds and solid solutions. PMID:27877432

  10. Simplified Calculation Model and Experimental Study of Latticed Concrete-Gypsum Composite Panels

    PubMed Central

    Jiang, Nan; Ma, Shaochun

    2015-01-01

    In order to address the complex behavior of the various constituent materials of (dense-column) latticed concrete-gypsum composite panels and the difficulty of determining the various elastic constants, this paper presents a detailed structural analysis of the (dense-column) latticed concrete-gypsum composite panel and proposes a feasible technical approach to simplified calculation. In conformity with mechanical rules, a typical panel element was selected and divided into two homogeneous composite sub-elements and a secondary homogeneous element, each solved separately, thus establishing an equivalence of the composite panel to a simple homogeneous panel and obtaining effective formulas for calculating the various elastic constants. Finally, the calculation results and the experimental results were compared, which revealed that the calculation method is correct and reliable, can meet the calculation needs of practical engineering, and provides a theoretical basis for simplified calculation in studies on composite panel elements and structures, as well as a reference for calculations of other panels. PMID:28793631

  11. Nonparametric estimation of median survival times with applications to multi-site or multi-center studies.

    PubMed

    Rahbar, Mohammad H; Choi, Sangbum; Hong, Chuan; Zhu, Liang; Jeon, Sangchoon; Gardiner, Joseph C

    2018-01-01

    We propose a nonparametric shrinkage estimator for the median survival times from several independent samples of right-censored data, which combines the samples and hypothesis information to improve efficiency. We compare the efficiency of the proposed shrinkage estimation procedure to the unrestricted estimator and the combined estimator through extensive simulation studies. Our results indicate that the performance of these estimators depends on the strength of homogeneity of the medians. When homogeneity holds, the combined estimator is the most efficient estimator. However, it becomes inconsistent when homogeneity fails. On the other hand, the proposed shrinkage estimator remains efficient. Its efficiency decreases as the survival medians deviate from equality, but it is expected to remain at least as good as the unrestricted estimator. Our simulation studies also indicate that the proposed shrinkage estimator is robust to moderate levels of censoring. We demonstrate the application of these methods by estimating the median time for trauma patients to receive red blood cells in the Prospective Observational Multi-center Major Trauma Transfusion (PROMMTT) study.
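
    A generic numerical sketch of the shrinkage idea is given below: site-level median estimates are pulled toward a pooled value with a data-driven weight that grows when the sites look homogeneous. This is a James-Stein-style illustration, not the censored-data estimator of the paper; the medians, variances, and weighting rule are assumptions.

        import numpy as np

        # Sketch: shrink site-level medians toward the pooled median, with more shrinkage
        # when a homogeneity statistic Q is small. Placeholder estimates and variances.

        site_medians = np.array([4.2, 3.8, 5.1, 4.5])      # estimated median survival times
        site_vars    = np.array([0.30, 0.25, 0.40, 0.35])  # variances of those estimates

        pooled = np.average(site_medians, weights=1.0 / site_vars)

        Q = np.sum((site_medians - pooled) ** 2 / site_vars)   # homogeneity statistic
        k = len(site_medians)
        w = min(1.0, (k - 1) / Q) if Q > 0 else 1.0            # crude shrinkage weight

        shrunken = w * pooled + (1.0 - w) * site_medians
        print("pooled median:", round(pooled, 3))
        print("shrunken site medians:", np.round(shrunken, 3))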

  12. Combinatorial synthesis of phosphors using arc-imaging furnace

    NASA Astrophysics Data System (ADS)

    Ishigaki, Tadashi; Toda, Kenji; Yoshimura, Masahiro; Uematsu, Kazuyoshi; Sato, Mineo

    2011-10-01

    We have applied a novel 'melt synthesis technique' rather than a conventional solid-state reaction to rapidly synthesize phosphor materials. During a synthesis, the mixture of oxides or their precursors is melted by light pulses (10-60 s) in an arc-imaging furnace on a water-cooled copper hearth to form a globule of 1-5 mm diameter, which is then rapidly cooled by turning off the light. Using this method, we synthesized several phosphor compounds including Y3Al5O12:Ce (YAG) and SrAl2O4:Eu,Dy. Complex phosphor oxides are difficult to produce by conventional solid-state reaction techniques because of the slow reaction rates among solid oxides; as a result, the oxides do not readily form homogeneous compounds or solid solutions. Melt reactions, on the other hand, are very fast (10-60 s) and result in homogeneous compounds owing to rapid diffusion and mixing in the liquid phase. Therefore, melt synthesis techniques are suitable for preparing multi-component homogeneous compounds and solid solutions.

  13. Atomic density functional and diagram of structures in the phase field crystal model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ankudinov, V. E., E-mail: vladimir@ankudinov.org; Galenko, P. K.; Kropotin, N. V.

    2016-02-15

    The phase field crystal model provides a continual description of the atomic density over the diffusion time of reactions. We consider a homogeneous structure (liquid) and a perfect periodic crystal, which are constructed from the one-mode approximation of the phase field crystal model. A diagram of 2D structures is constructed from the analytic solutions of the model using atomic density functionals. The diagram predicts equilibrium atomic configurations for transitions from the metastable state and includes the domains of existence of homogeneous, triangular, and striped structures corresponding to a liquid, a body-centered cubic crystal, and a longitudinal cross section of cylindrical tubes. The method developed here is employed for constructing the diagram for the homogeneous liquid phase and the body-centered iron lattice. The expression for the free energy is derived analytically from density functional theory. The specific features of approximating the phase field crystal model are compared with the approximations and conclusions of the weak crystallization and 2D melting theories.
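
    For orientation, a one-mode analysis of this kind typically starts from the standard (Swift-Hohenberg/Elder-Grant) phase-field-crystal free-energy functional, written here in dimensionless form; the paper's exact parametrization may differ, so this is only a reference point:

        \mathcal{F}[\psi] \;=\; \int \mathrm{d}\mathbf{r}\,\left\{ \frac{\psi}{2}\left[-\epsilon + \left(1+\nabla^{2}\right)^{2}\right]\psi \;+\; \frac{\psi^{4}}{4} \right\},

    where \psi is the rescaled atomic density field and \epsilon plays the role of an undercooling parameter. Minimizing over constant \psi gives the homogeneous (liquid) branch, while a one-mode ansatz \psi = \bar{\psi} + \sum_j A_j e^{i\mathbf{k}_j\cdot\mathbf{r}} yields the periodic (striped and triangular) branches that make up the structure diagram.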

  14. In vivo quantitative bioluminescence tomography using heterogeneous and homogeneous mouse models.

    PubMed

    Liu, Junting; Wang, Yabin; Qu, Xiaochao; Li, Xiangsi; Ma, Xiaopeng; Han, Runqiang; Hu, Zhenhua; Chen, Xueli; Sun, Dongdong; Zhang, Rongqing; Chen, Duofang; Chen, Dan; Chen, Xiaoyuan; Liang, Jimin; Cao, Feng; Tian, Jie

    2010-06-07

    Bioluminescence tomography (BLT) is a new optical molecular imaging modality which can monitor both physiological and pathological processes by using bioluminescent light-emitting probes in small living animals. In particular, this technology possesses great potential in drug development, early detection, and therapy monitoring in preclinical settings. In the present study, we developed a dual-modality BLT prototype system with a micro-computed tomography (MicroCT) registration approach, and improved the quantitative reconstruction algorithm based on an adaptive hp finite element method (hp-FEM). Detailed comparisons of source reconstruction between heterogeneous and homogeneous mouse models were performed. The models include mice with an implanted luminescence source and tumor-bearing mice with a firefly luciferase reporter gene. Our data suggest that reconstruction based on the heterogeneous mouse model is more accurate in localization and quantification than the homogeneous mouse model with appropriate optical parameters, and that BLT allows very early tumor detection in vivo based on tomographic reconstruction of the heterogeneous mouse model signal.

  15. Influence of hydroxypropylmethylcellulose addition and homogenization conditions on properties and ageing of corn starch based films.

    PubMed

    Jiménez, Alberto; Fabra, María José; Talens, Pau; Chiralt, Amparo

    2012-06-20

    Edible films based on corn starch, hydroxypropyl methylcellulose (HPMC) and their mixtures were prepared by using two different procedures to homogenize the film forming dispersions (rotor-stator and rotor-stator plus microfluidizer). The influence of both HPMC-starch ratio and the homogenization method on the structural, optical, tensile and barrier properties of the films was analysed. The ageing of the films was also studied by characterizing them after 5 weeks' storage. Starch re-crystallization in newly prepared and stored films was analysed by means of X-ray diffraction. HPMC-corn starch films showed phase separation of polymers, which was enhanced when microfluidization was applied to the film forming dispersion. Nevertheless, HPMC addition inhibited starch re-crystallization during storage, giving rise to more flexible films at the end of the period. Water barrier properties of starch films were hardly affected by the addition of HPMC, although oxygen permeability increased due to its poorer oxygen barrier properties. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Tannin-immobilized cellulose hydrogel fabricated by a homogeneous reaction as a potential adsorbent for removing cationic organic dye from aqueous solution.

    PubMed

    Pei, Ying; Chu, Shan; Chen, Yue; Li, Zhidong; Zhao, Jin; Liu, Shuqi; Wu, Xingjun; Liu, Jie; Zheng, Xuejing; Tang, Keyong

    2017-10-01

    Tannin-immobilized cellulose (CT) hydrogels were successfully fabricated by homogeneous immobilization and crosslinking reactions via a simple method. The structures and properties of the hydrogels were characterized by SEM and mechanical testing. Methylene Blue (MB) was selected as a cationic dye model, and the adsorption ability of the CT hydrogel was evaluated. The immobilized tannins acted as adsorption sites that bound MB by electrostatic attraction, resulting in the attractive adsorption ability of the CT hydrogel. The adsorption kinetics could be better described by the pseudo-second-order model, and the adsorption behavior was in agreement with a Langmuir isotherm. The adsorption-desorption cycle of the CT hydrogel was repeated six times without significant loss of adsorption capacity. In this work, both tannin immobilization and hydrogel formation were achieved simultaneously by a facile homogeneous reaction, providing a new pathway to fabricate tannin-immobilized materials for water treatment. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Theoretical study of mixing in liquid clouds – Part 1: Classical concepts

    DOE PAGES

    Korolev, Alexei; Khain, Alex; Pinsky, Mark; ...

    2016-07-28

    The present study considers the final stages of in-cloud mixing in the framework of the classical concepts of homogeneous and extreme inhomogeneous mixing. Simple analytical relationships between basic microphysical parameters were obtained for homogeneous and extreme inhomogeneous mixing based on adiabatic considerations. It was demonstrated that during homogeneous mixing the functional relationships between the moments of the droplet size distribution hold only during the primary stage of mixing. Subsequent random mixing between already mixed parcels and undiluted cloud parcels breaks these relationships. However, during extreme inhomogeneous mixing the functional relationships between the microphysical parameters hold for both primary and subsequent mixing. The obtained relationships can be used to identify the type of mixing from in situ observations. The effectiveness of the developed method was demonstrated using in situ data collected in convective clouds. It was found that for the specific set of in situ measurements the interaction between cloudy and entrained environments was dominated by extreme inhomogeneous mixing.

  18. A new chemical approach to optimize the in vitro SPF method on the HD6 PMMA plate.

    PubMed

    Marguerie, S; Pissavini, M; Baud, A; Carayol, T; Doucet, O

    2012-01-01

    In a previous study, we demonstrated that control of the roughness of molded PMMA plates improves in vitro SPF reproducibility. However, in vitro/in vivo deviations are still observed. Sunscreens show different behavior during spreading on the HD6 surface depending on the formulation, resulting in a more or less homogeneous distribution. The hydrophilic nature of HD6 appears to contribute significantly during spreading. Two different sunscreens, one giving a homogeneous and one a non-homogeneous distribution, were investigated to check whether the interfacial tension between product and substrate has a real influence on the spreading quality. Using microscopic observations, we attempted to correlate the in vitro SPF results with the product's spreading properties. In order to reduce this interfacial tension, an HD6 pretreatment with an amphoteric surfactant, cocamidopropyl betaine, was performed. In vitro SPF on "pretreated HD6" was examined using a cohort of 30 products. This pretreatment led to reliable results, demonstrating good association with the in vivo SPF.

  19. Materials for x-ray refractive lenses minimizing wavefront distortions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roth, Thomas; Alianelli, Lucia; Lengeler, Daniel

    2017-06-09

    Refraction through curved surfaces, reflection from curved mirrors in grazing incidence, and diffraction from Fresnel zone plates are key hard x-ray focusing mechanisms. In this article, we present materials used for refractive x-ray lenses. Important properties of such x-ray lenses include focusing strength, shape, and the material's homogeneity and absorption coefficient. Both the properties of the initial material and the fabrication process result in a lens with imperfections, which can lead to unwanted wavefront distortions. Different fabrication methods for one-dimensional and two-dimensional focusing lenses are presented, together with the respective benefits and inconveniences that are mostly due to shape fidelity. Different materials and material grades have been investigated in terms of their homogeneity and the absence of inclusions. Single-crystalline materials show high homogeneity, but suffer from unwanted diffracted radiation, which can be avoided using amorphous materials. Lastly, we show that shape imperfections can be corrected using a correction lens.

  20. Simplified Calculation Model and Experimental Study of Latticed Concrete-Gypsum Composite Panels.

    PubMed

    Jiang, Nan; Ma, Shaochun

    2015-10-27

    In order to address the complex behavior of the various constituent materials of (dense-column) latticed concrete-gypsum composite panels and the difficulty of determining the various elastic constants, this paper presents a detailed structural analysis of the (dense-column) latticed concrete-gypsum composite panel and proposes a feasible technical approach to simplified calculation. In conformity with mechanical rules, a typical panel element was selected and divided into two homogeneous composite sub-elements and a secondary homogeneous element, each solved separately, thus establishing an equivalence of the composite panel to a simple homogeneous panel and obtaining effective formulas for calculating the various elastic constants. Finally, the calculation results and the experimental results were compared, which revealed that the calculation method is correct and reliable, can meet the calculation needs of practical engineering, and provides a theoretical basis for simplified calculation in studies on composite panel elements and structures, as well as a reference for calculations of other panels.
