Sample records for process creates homogenous

  1. Pattern and process of biotic homogenization in the New Pangaea

    PubMed Central

    Baiser, Benjamin; Olden, Julian D.; Record, Sydne; Lockwood, Julie L.; McKinney, Michael L.

    2012-01-01

    Human activities have reorganized the earth's biota resulting in spatially disparate locales becoming more or less similar in species composition over time through the processes of biotic homogenization and biotic differentiation, respectively. Despite mounting evidence suggesting that this process may be widespread in both aquatic and terrestrial systems, past studies have predominantly focused on single taxonomic groups at a single spatial scale. Furthermore, change in pairwise similarity is itself dependent on two distinct processes, spatial turnover in species composition and changes in gradients of species richness. Most past research has failed to disentangle the effect of these two mechanisms on homogenization patterns. Here, we use recent statistical advances and collate a global database of homogenization studies (20 studies, 50 datasets) to provide the first global investigation of the homogenization process across major faunal and floral groups and elucidate the relative role of changes in species richness and turnover. We found evidence of homogenization (change in similarity ranging from −0.02 to 0.09) across nearly all taxonomic groups, spatial extent and grain sizes. Partitioning of change in pairwise similarity shows that overall change in community similarity is driven by changes in species richness. Our results show that biotic homogenization is truly a global phenomenon and put into question many of the ecological mechanisms invoked in previous studies to explain patterns of homogenization. PMID:23055062
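
    The partitioning referred to above can be illustrated with a Baselga-style decomposition of pairwise Sørensen dissimilarity into a spatial-turnover component and a richness-difference (nestedness-resultant) component. The sketch below is a minimal illustration of that idea using invented species lists; it is not the authors' exact method or data.

    ```python
    # Illustrative sketch (not the authors' exact method): partition pairwise Sorensen
    # dissimilarity into spatial turnover and richness-difference components for two
    # hypothetical locales at a historical and a current time step.

    def sorensen_partition(site1, site2):
        """Return (total dissimilarity, turnover, richness-difference) for two species sets."""
        a = len(site1 & site2)                        # shared species
        b = len(site1 - site2)                        # unique to locale 1
        c = len(site2 - site1)                        # unique to locale 2
        beta_sor = (b + c) / (2 * a + b + c)          # total Sorensen dissimilarity
        beta_sim = min(b, c) / (a + min(b, c))        # spatial turnover (Simpson)
        beta_sne = beta_sor - beta_sim                # richness-difference (nestedness-resultant)
        return beta_sor, beta_sim, beta_sne

    # Hypothetical historical and current species sets for two locales
    hist_1, hist_2 = {"A", "B", "C", "D"}, {"C", "D", "E", "F"}
    curr_1, curr_2 = {"A", "C", "D", "G"}, {"C", "D", "G", "H"}

    before = sorensen_partition(hist_1, hist_2)
    after = sorensen_partition(curr_1, curr_2)
    # A decrease in dissimilarity (increase in similarity) over time indicates homogenization.
    print("change in total dissimilarity:", after[0] - before[0])
    print("change in turnover component: ", after[1] - before[1])
    print("change in richness component: ", after[2] - before[2])
    ```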

  2. Pattern and process of biotic homogenization in the New Pangaea.

    PubMed

    Baiser, Benjamin; Olden, Julian D; Record, Sydne; Lockwood, Julie L; McKinney, Michael L

    2012-12-07

    Human activities have reorganized the earth's biota resulting in spatially disparate locales becoming more or less similar in species composition over time through the processes of biotic homogenization and biotic differentiation, respectively. Despite mounting evidence suggesting that this process may be widespread in both aquatic and terrestrial systems, past studies have predominantly focused on single taxonomic groups at a single spatial scale. Furthermore, change in pairwise similarity is itself dependent on two distinct processes, spatial turnover in species composition and changes in gradients of species richness. Most past research has failed to disentangle the effect of these two mechanisms on homogenization patterns. Here, we use recent statistical advances and collate a global database of homogenization studies (20 studies, 50 datasets) to provide the first global investigation of the homogenization process across major faunal and floral groups and elucidate the relative role of changes in species richness and turnover. We found evidence of homogenization (change in similarity ranging from -0.02 to 0.09) across nearly all taxonomic groups, spatial extent and grain sizes. Partitioning of change in pairwise similarity shows that overall change in community similarity is driven by changes in species richness. Our results show that biotic homogenization is truly a global phenomenon and put into question many of the ecological mechanisms invoked in previous studies to explain patterns of homogenization.

  3. On the time-homogeneous Ornstein-Uhlenbeck process in the foreign exchange rates

    NASA Astrophysics Data System (ADS)

    da Fonseca, Regina C. B.; Matsushita, Raul Y.; de Castro, Márcio T.; Figueiredo, Annibal

    2015-10-01

    Since Gaussianity and stationarity assumptions cannot be fulfilled by financial data, the time-homogeneous Ornstein-Uhlenbeck (THOU) process was introduced as a candidate model to describe time series of financial returns [1]. It is an Ornstein-Uhlenbeck (OU) process in which these assumptions are replaced by linearity and time-homogeneity. We employ the OU and THOU processes to analyze daily foreign exchange rates against the US dollar. We confirm that the OU process does not fit the data, while in most cases the patterns of the first four cumulants of the data can be described by the THOU process. However, there are some exceptions in which the data do not satisfy the linearity or time-homogeneity assumptions.
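
    As background for the abstract above, the sketch below simulates a standard (Gaussian, stationary) Ornstein-Uhlenbeck process using its exact AR(1) discretization and recovers the mean-reversion rate from the lag-1 autocorrelation. The THOU generalization of [1] is not reproduced here, and all parameter values are invented for illustration.

    ```python
    import numpy as np

    # Minimal OU sketch: dX_t = -theta*(X_t - mu) dt + sigma dW_t, sampled at step dt.
    # Parameters are arbitrary; this is not the THOU model of the paper.
    rng = np.random.default_rng(0)
    theta, mu, sigma = 2.0, 0.0, 0.5          # mean-reversion rate, long-run mean, volatility
    dt, n = 1.0 / 252, 20_000                 # daily step, number of observations

    a = np.exp(-theta * dt)                             # exact AR(1) coefficient
    s = sigma * np.sqrt((1 - a**2) / (2 * theta))       # exact innovation standard deviation
    x = np.empty(n)
    x[0] = mu
    for t in range(1, n):
        x[t] = mu + a * (x[t - 1] - mu) + s * rng.standard_normal()

    # Recover theta from the lag-1 autocorrelation of the sampled series
    a_hat = np.corrcoef(x[:-1], x[1:])[0, 1]
    print(f"true theta = {theta}, estimated theta = {-np.log(a_hat) / dt:.2f}")
    ```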

  4. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for development and production of pharmaceutical suspensions in the future. Copyright © 2011 Wiley-Liss, Inc.

  5. Mesoscopic homogenization of semi-insulating GaAs by two-step post growth annealing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffmann, B.; Jurisch, M.; Koehler, A.

    1996-12-31

    Mesoscopic homogenization of the electrical properties of s.i. LEC-GaAs is commonly realized by thermal treatment of the crystals, comprising the steps of dissolution of arsenic precipitates, homogenization of excess As, and re-precipitation under a controlled supersaturation. Because of the inhomogeneous distribution of dislocations and the corresponding cellular structure along and across LEC-grown crystals, a proper choice of the time-temperature program is necessary to minimize fluctuations in mesoscopic homogeneity. A modified two-step ingot annealing process is demonstrated to ensure a uniform level of mesoscopic homogeneity.

  6. Occurrence analysis of daily rainfalls by using non-homogeneous Poissonian processes

    NASA Astrophysics Data System (ADS)

    Sirangelo, B.; Ferrari, E.; de Luca, D. L.

    2009-09-01

    In recent years several temporally homogeneous stochastic models have been applied to describe the rainfall process. In particular, stochastic analysis of daily rainfall time series may help explain the statistical features of the temporal variability of the phenomenon. Owing to the evident periodicity of the physical process, these models should be applied only over short temporal intervals in which the occurrences and intensities of rainfall can reliably be considered homogeneous. To this aim, occurrences of daily rainfalls can be treated as a stationary stochastic process within monthly periods. In this context, point process models are widely used for at-site analysis of daily rainfall occurrence; they are continuous-time models able to capture the intermittent nature of rainfall and to simulate interstorm periods. With a different approach, the periodic features of daily rainfalls can be described by a temporally non-homogeneous stochastic model whose parameters are expressed as continuous functions of time. In this case, great attention has to be paid to the parsimony of the models, as regards the number of parameters and the bias introduced into the generation of synthetic series, and to the influence of threshold values in extracting the peak-storm database from recorded daily rainfall heights. In this work, a stochastic model based on a non-homogeneous Poisson process, characterized by a time-dependent intensity of rainfall occurrence, is employed to explain seasonal effects of daily rainfalls exceeding prefixed threshold values. In particular, the variation of the rainfall occurrence intensity λ(t) is modelled using Fourier series analysis, in which the non-homogeneous process is transformed into a homogeneous unit-rate one through a proper transformation of the time domain, and the minimum number of harmonics is chosen by applying available statistical tests. The procedure is applied to a dataset of rain gauges located in
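
    As a hedged illustration of the modelling idea described above, the sketch below simulates a non-homogeneous Poisson process whose intensity λ(t) is a one-harmonic Fourier series, using the standard Lewis-Shedler thinning algorithm. The intensity parameters are invented, not the fitted values of the study.

    ```python
    import numpy as np

    # Simulate daily-rainfall occurrences from a non-homogeneous Poisson process with a
    # Fourier-series intensity (one annual harmonic here), via Lewis-Shedler thinning.
    # All parameter values are illustrative, not estimates from any rain-gauge dataset.
    rng = np.random.default_rng(1)
    period = 365.25                                    # one year, in days

    def intensity(t):
        # lambda(t) >= 0, in events per day: mean rate plus one annual harmonic
        return 0.10 + 0.06 * np.cos(2 * np.pi * t / period) + 0.03 * np.sin(2 * np.pi * t / period)

    def simulate_nhpp(t_max, lam_max):
        """Thinning: draw candidates from a homogeneous process of rate lam_max and keep a
        candidate at time t with probability intensity(t) / lam_max."""
        events, t = [], 0.0
        while True:
            t += rng.exponential(1.0 / lam_max)
            if t > t_max:
                return np.array(events)
            if rng.random() < intensity(t) / lam_max:
                events.append(t)

    occ = simulate_nhpp(t_max=30 * period, lam_max=0.20)       # 30 years of occurrences
    print(f"{occ.size} occurrences, mean rate {occ.size / (30 * period):.3f} per day")
    ```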

  7. Orthogonality Measurement for Homogenous Projects-Bases

    ERIC Educational Resources Information Center

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  8. NON-HOMOGENEOUS POISSON PROCESS MODEL FOR GENETIC CROSSOVER INTERFERENCE.

    PubMed

    Leu, Szu-Yun; Sen, Pranab K

    2014-01-01

    The genetic crossover interference is usually modeled with a stationary renewal process to construct the genetic map. We propose two non-homogeneous, also dependent, Poisson process models applied to the known physical map. The crossover process is assumed to start from an origin and to occur sequentially along the chromosome. The increment rate depends on the position of the markers and the number of crossover events occurring between the origin and the markers. We show how to obtain parameter estimates for the process and use simulation studies and real Drosophila data to examine the performance of the proposed models.

  9. Autoregressive Processes in Homogenization of GNSS Tropospheric Data

    NASA Astrophysics Data System (ADS)

    Klos, A.; Bogusz, J.; Teferle, F. N.; Bock, O.; Pottiaux, E.; Van Malderen, R.

    2016-12-01

    Offsets due to changes in hardware equipment or other artificial events are the subject of the homogenization of tropospheric data estimated from the processing of Global Navigation Satellite System (GNSS) observables. This task aims at identifying the exact epochs of offsets and estimating their magnitudes, since offsets may artificially under- or over-estimate the trend and its uncertainty derived from tropospheric data used in climate studies. In this research, we analysed a common data set of differences of Integrated Water Vapour (IWV) from GPS and ERA-Interim (1995-2010) provided for a homogenization group working within the ES1206 COST Action GNSS4SWEC. We analysed daily IWV records of GPS and ERA-Interim in terms of trend, seasonal terms and noise model with Maximum Likelihood Estimation in the Hector software. We found that these data have the character of an autoregressive (AR) process. Based on this analysis, we performed Monte Carlo simulations of 25-year-long data with two different noise types, white noise as well as a combination of white and autoregressive noise, and also added a few strictly defined offsets. This synthetic data set, of exactly the same character as the IWV differences from GPS and ERA-Interim, was then subjected to manual and automatic/statistical homogenization. We made blind tests and detected possible epochs of offsets manually. We found that simulated offsets were easily detected in series with white noise, and no influence of the seasonal signal was noticed. The autoregressive series were much more problematic when offsets had to be determined: we found a few epochs for which no offset had been simulated. This was mainly due to strong autocorrelation of the data, which introduces an artificial trend. Due to the regime-like behaviour of AR processes, it is difficult for statistical methods to properly detect epochs of offsets, as previously reported by climatologists.
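
    The kind of Monte Carlo experiment described above can be sketched as follows: a synthetic daily series with a small trend, an annual cycle, AR(1) noise, and a few step offsets at known epochs, to which a homogenization algorithm would then be applied. All values below are invented for illustration; this is not the GNSS4SWEC benchmark itself.

    ```python
    import numpy as np

    # Synthetic 25-year daily series: trend + annual cycle + AR(1) noise + step offsets.
    # A homogenization method would then try to recover the offset epochs and magnitudes.
    rng = np.random.default_rng(2)
    n_days = 25 * 365
    t = np.arange(n_days)

    trend = 0.0001 * t                                   # small linear trend
    seasonal = 0.5 * np.sin(2 * np.pi * t / 365.25)      # annual cycle

    phi, sigma = 0.9, 0.3                                # AR(1) coefficient, innovation std dev
    noise = np.zeros(n_days)
    for i in range(1, n_days):
        noise[i] = phi * noise[i - 1] + sigma * rng.standard_normal()

    offsets = np.zeros(n_days)
    for epoch, size in [(3000, 0.8), (6200, -0.6)]:      # hypothetical offset epochs/magnitudes
        offsets[epoch:] += size

    series = trend + seasonal + noise + offsets          # series handed to the detection step
    print(series.mean(), series.std())
    ```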

  10. Effective inactivation of Saccharomyces cerevisiae in minimally processed Makgeolli using low-pressure homogenization-based pasteurization.

    PubMed

    Bak, Jin Seop

    2015-01-01

    In order to address the limitations associated with the inefficient pasteurization platform used to make Makgeolli, such as the presence of turbid colloidal dispersions in suspension, commercially available Makgeolli was minimally processed using a low-pressure homogenization-based pasteurization (LHBP) process. This continuous process demonstrates that promptly reducing the exposure time to excessive heat using either large molecules or insoluble particles can dramatically improve internal quality and decrease irreversible damage. Specifically, optimal homogenization increased concomitantly with physical parameters such as colloidal stability (65.0% of maximum and below 25-μm particles) following two repetitions at 25.0 MPa. However, biochemical parameters such as microbial population, acidity, and the presence of fermentable sugars rarely affected Makgeolli quality. Remarkably, there was a 4.5-log reduction in the number of Saccharomyces cerevisiae target cells at 53.5°C for 70 sec in optimally homogenized Makgeolli. This value was higher than the 37.7% measured from traditionally pasteurized Makgeolli. In contrast to the analytical similarity among homogenized Makgeollis, our objective quality evaluation demonstrated significant differences between pasteurized (or unpasteurized) Makgeolli and LHBP-treated Makgeolli. Keywords: low-pressure homogenization-based pasteurization; Makgeolli; minimal processing-preservation; Saccharomyces cerevisiae; suspension stability.

  11. Sojourning with the Homogeneous Poisson Process.

    PubMed

    Liu, Piaomu; Peña, Edsel A

    2016-01-01

    In this pedagogical article, distributional properties, some surprising, pertaining to the homogeneous Poisson process (HPP), when observed over a possibly random window, are presented. Properties of the gap-time that covered the termination time and the correlations among gap-times of the observed events are obtained. Inference procedures, such as estimation and model validation, based on event occurrence data over the observation window, are also presented. We envision that through the results in this paper, a better appreciation of the subtleties involved in the modeling and analysis of recurrent events data will ensue, since the HPP is arguably one of the simplest among recurrent event models. In addition, the use of the theorem of total probability, Bayes theorem, the iterated rules of expectation, variance and covariance, and the renewal equation could be illustrative when teaching distribution theory, mathematical statistics, and stochastic processes at both the undergraduate and graduate levels. This article is targeted towards both instructors and students.
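
    One of the "surprising" distributional properties alluded to is the length-biased gap that covers the termination time (the inspection paradox). The sketch below checks it by simulation for a homogeneous Poisson process observed on a fixed window; the rate and window length are arbitrary choices, not values from the article.

    ```python
    import numpy as np

    # For an HPP of rate lam observed on [0, tau], the gap covering the termination time tau
    # is length-biased: its mean approaches 2/lam (vs. the nominal mean gap 1/lam) when tau
    # is far from the origin. Rate, window and replication count are arbitrary.
    rng = np.random.default_rng(3)
    lam, tau, n_rep = 2.0, 50.0, 20_000

    covering = []
    for _ in range(n_rep):
        t, times = 0.0, [0.0]
        while t <= tau:                         # generate event times until we pass tau
            t += rng.exponential(1.0 / lam)
            times.append(t)
        covering.append(times[-1] - times[-2])  # gap straddling tau
    print(f"nominal mean gap 1/lam = {1 / lam:.2f}, mean covering gap = {np.mean(covering):.2f}")
    ```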

  12. Design and fabrication of optical homogenizer with micro structure by injection molding process

    NASA Astrophysics Data System (ADS)

    Chen, C.-C. A.; Chang, S.-W.; Weng, C.-J.

    2008-08-01

    This paper presents the design and fabrication of an optical homogenizer with a hybrid design of collimator, toroidal lens array, and projection lens for shaping a Gaussian beam into a uniform cylindrical beam. TracePro software was used to design the geometry of the homogenizer, and injection molding simulation was performed with Moldflow MPI to evaluate the mold design for the injection molding process. The optical homogenizer is a cylindrical part with a thickness of 8.03 mm and a diameter of 5 mm. The microstructure of the toroidal array has groove heights designed from 12 μm to 99 μm. An electrical injection molding machine and PMMA (n = 1.4747) were selected to perform the experiment. Experimental results show that the optical homogenizer achieved a transfer ratio of grooves (TRG) of 88.98%, an optical uniformity of 68%, and an optical efficiency of 91.88%. Future study focuses on the development of an optical homogenizer for LED light sources.

  13. Spatial homogenization methods for pin-by-pin neutron transport calculations

    NASA Astrophysics Data System (ADS)

    Kozlowski, Tomasz

    For practical reactor core applications low-order transport approximations such as SP3 have been shown to provide sufficient accuracy for both static and transient calculations with considerably less computational expense than the discrete ordinate or the full spherical harmonics methods. These methods have been applied in several core simulators where homogenization was performed at the level of the pin cell. One of the principal problems has been to recover the error introduced by pin-cell homogenization. Two basic approaches to treat pin-cell homogenization error have been proposed: Superhomogenization (SPH) factors and Pin-Cell Discontinuity Factors (PDF). These methods are based on well established Equivalence Theory and Generalized Equivalence Theory to generate appropriate group constants. These methods are able to treat all sources of error together, allowing even few-group diffusion with one mesh per cell to reproduce the reference solution. A detailed investigation and consistent comparison of both homogenization techniques showed potential of PDF approach to improve accuracy of core calculation, but also reveal its limitation. In principle, the method is applicable only for the boundary conditions at which it was created, i.e. for boundary conditions considered during the homogenization process---normally zero current. Therefore, there exists a need to improve this method, making it more general and environment independent. The goal of proposed general homogenization technique is to create a function that is able to correctly predict the appropriate correction factor with only homogeneous information available, i.e. a function based on heterogeneous solution that could approximate PDFs using homogeneous solution. It has been shown that the PDF can be well approximated by least-square polynomial fit of non-dimensional heterogeneous solution and later used for PDF prediction using homogeneous solution. This shows a promise for PDF prediction for off

  14. Wigner surmises and the two-dimensional homogeneous Poisson point process.

    PubMed

    Sakhr, Jamal; Nieminen, John M

    2006-04-01

    We derive a set of identities that relate the higher-order interpoint spacing statistics of the two-dimensional homogeneous Poisson point process to the Wigner surmises for the higher-order spacing distributions of eigenvalues from the three classical random matrix ensembles. We also report a remarkable identity that equates the second-nearest-neighbor spacing statistics of the points of the Poisson process and the nearest-neighbor spacing statistics of complex eigenvalues from Ginibre's ensemble of 2 x 2 complex non-Hermitian random matrices.
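
    As a numerical companion to the abstract above, the sketch below draws one realization of a two-dimensional homogeneous Poisson point process and checks its nearest-neighbour spacing against the exact result (mean spacing 1/(2*sqrt(rho)), density p(s) = 2*pi*rho*s*exp(-pi*rho*s^2)). The higher-order identities and the Ginibre comparison of the paper are not reproduced; the intensity and box size are arbitrary.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    # Monte Carlo check of nearest-neighbour spacings for a 2D homogeneous Poisson process.
    rng = np.random.default_rng(4)
    rho, box = 100.0, 20.0                       # intensity (points per unit area), box side
    n_pts = rng.poisson(rho * box**2)            # ~40,000 points
    pts = rng.uniform(0.0, box, size=(n_pts, 2))

    dist, _ = cKDTree(pts).query(pts, k=2)       # column 0 is the point itself (distance 0)
    nn = dist[:, 1]                              # nearest-neighbour spacings

    print("empirical mean NN spacing       :", nn.mean())
    print("theoretical mean 1/(2*sqrt(rho)):", 1 / (2 * np.sqrt(rho)))
    ```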

  15. Influence of Homogenization and Thermal Processing on the Gastrointestinal Fate of Bovine Milk Fat: In Vitro Digestion Study.

    PubMed

    Liang, Li; Qi, Ce; Wang, Xingguo; Jin, Qingzhe; McClements, David Julian

    2017-12-20

    Dairy lipids are an important source of energy and nutrients for infants and adults. The dimensions, aggregation state, and interfacial properties of fat globules in raw milk are changed by dairy processing operations, such as homogenization and thermal processing. These changes influence the behavior of fat globules within the human gastrointestinal tract (GIT). The gastrointestinal fate of raw milk, homogenized milk, high temperature short time (HTST) pasteurized milk, and ultrahigh temperature (UHT) pasteurized milk samples was therefore determined using a simulated GIT. The properties of particles in different regions of the GIT depended on the degree of milk processing. Homogenization increased the initial lipid digestion rate but did not influence the final digestion extent. Thermal processing of homogenized milk decreased the initial rate and final extent of lipid digestion, which was attributed to changes in interfacial structure. These results provide insights into the impact of dairy processing on the gastrointestinal fate of milk fat.

  16. Mechanical Homogenization Increases Bacterial Homogeneity in Sputum

    PubMed Central

    Stokell, Joshua R.; Khan, Ammad

    2014-01-01

    Sputum obtained from patients with cystic fibrosis (CF) is highly viscous and often heterogeneous in bacterial distribution. Adding dithiothreitol (DTT) is the standard method for liquefaction prior to processing sputum for molecular detection assays. To determine if DTT treatment homogenizes the bacterial distribution within sputum, we measured the difference in mean total bacterial abundance and abundance of Burkholderia multivorans between aliquots of DTT-treated sputum samples with and without a mechanical homogenization (MH) step using a high-speed dispersing element. Additionally, we measured the effect of MH on bacterial abundance. We found a significant difference between the mean bacterial abundances in aliquots that were subjected to only DTT treatment and those of the aliquots which included an MH step (all bacteria, P = 0.04; B. multivorans, P = 0.05). There was no significant effect of MH on bacterial abundance in sputum. Although our results are from a single CF patient, they indicate that mechanical homogenization increases the homogeneity of bacteria in sputum. PMID:24759710

  17. CO2-assisted high pressure homogenization: a solvent-free process for polymeric microspheres and drug-polymer composites.

    PubMed

    Kluge, Johannes; Mazzotti, Marco

    2012-10-15

    The study explores the enabling role of near-critical CO2 as a reversible plasticizer in the high pressure homogenization of polymer particles, aiming at their comminution as well as at the formation of drug-polymer composites. First, the effect of near-critical CO2 on the homogenization of aqueous suspensions of poly(lactic-co-glycolic acid) (PLGA) was investigated. Applying a pressure drop of 900 bar and up to 150 passes across the homogenizer, it was found that particles processed in the presence of CO2 were generally of microspherical morphology and at all times significantly smaller than those obtained in the absence of a plasticizer. The smallest particles, exhibiting a median x50 of 1.3 μm, were obtained by adding a small quantity of ethyl acetate, which exerts on PLGA an additional plasticizing effect during the homogenization step. Further, the study concerns the possibility of forming drug-polymer composites through simultaneous high pressure homogenization of the two relevant solids, and particularly the effect of near-critical CO2 on this process. Therefore, PLGA was homogenized together with crystalline S-ketoprofen (S-KET), a non-steroidal anti-inflammatory drug, at a drug to polymer ratio of 1:10, a pressure drop of 900 bar and up to 150 passes across the homogenizer. When the process was carried out in the presence of CO2, an impregnation efficiency of 91% has been reached, corresponding to 8.3 wt.% of S-KET in PLGA; moreover, composite particles were of microspherical morphology and significantly smaller than those obtained in the absence of CO2. The formation of drug-polymer composites through simultaneous homogenization of the two materials is thus greatly enhanced by the presence of CO2, which increases the efficiency for both homogenization and impregnation. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Occurrence analysis of daily rainfalls through non-homogeneous Poissonian processes

    NASA Astrophysics Data System (ADS)

    Sirangelo, B.; Ferrari, E.; de Luca, D. L.

    2011-06-01

    A stochastic model based on a non-homogeneous Poisson process, characterised by a time-dependent intensity of rainfall occurrence, is employed to explain seasonal effects of daily rainfalls exceeding prefixed threshold values. The data modelling has been performed with a partition of observed daily rainfall data into a calibration period for parameter estimation and a validation period for checking on occurrence process changes. The model has been applied to a set of rain gauges located in different geographical areas of Southern Italy. The results show a good fit for time-varying intensity of rainfall occurrence process by 2-harmonic Fourier law and no statistically significant evidence of changes in the validation period for different threshold values.

  19. Enhancement of Lipid Extraction from Marine Microalga, Scenedesmus Associated with High-Pressure Homogenization Process

    PubMed Central

    Cho, Seok-Cheol; Choi, Woon-Yong; Oh, Sung-Ho; Lee, Choon-Geun; Seo, Yong-Chang; Kim, Ji-Seon; Song, Chi-Ho; Kim, Ga-Vin; Lee, Shin-Young; Kang, Do-Hyung; Lee, Hyeon-Yong

    2012-01-01

    Marine microalga, Scenedesmus sp., which is known to be suitable for biodiesel production because of its high lipid content, was subjected to the conventional Folch method of lipid extraction combined with a high-pressure homogenization pretreatment at 1200 psi and 35°C. Algal lipid yield was about 24.9% through this process, whereas only 19.8% lipid could be obtained by following a conventional lipid extraction procedure using the solvent chloroform:methanol (2:1, v/v). The present approach requires a 30-min process time and a moderate working temperature of 35°C, compared to the conventional extraction method, which usually requires >5 h at 65°C. It was found that this combined extraction process followed second-order reaction kinetics, which means most of the cellular lipids were extracted during the initial period of extraction, mostly within 30 min. In contrast, during the conventional extraction process, the cellular lipids were slowly and continuously extracted over >5 h, following first-order kinetics. Confocal and scanning electron microscopy revealed an altered texture of the algal biomass pretreated with high-pressure homogenization. These results clearly demonstrate that the Folch method coupled with high-pressure homogenization pretreatment can easily disrupt the rigid cell walls of microalgae and release the intact lipids, with minimized extraction time and temperature, both of which are essential for maintaining good quality of the lipids for biodiesel production. PMID:22969270

  20. Preparation and characterization of molecularly homogeneous silica-titania film by sol-gel process with different synthetic strategies.

    PubMed

    Chen, Hsueh-Shih; Huang, Sheng-Hsin; Perng, Tsong-Pyng

    2012-10-24

    Three silica-titania thin films with various degrees of molecular homogeneity were synthesized by the sol-gel process with the same precursor formula but different reaction paths. The dried films prepared by a single spin-coating process have a thickness of 500-700 nm and displayed no cracks or pin holes. The transmittances and refractive indices of the samples are >97.8% in the range of 350-1800 nm and 1.62-1.65 at 500 nm, respectively. The in-plane and out-of-plane chemical homogeneities of the films were analyzed by X-ray photoelectron spectroscopy and Auger electron spectroscopy, respectively. For the film with the highest degree of homogeneity, the deviations of O, Si, and Ti atomic contents in both in-plane and out-of-plane directions are less than 1.5%, indicating that the film is highly molecularly homogeneous. It also possesses the highest transparency and the lowest refractive index among the three samples.

  1. Stimulus homogeneity enhances implicit learning: evidence from contextual cueing.

    PubMed

    Feldmann-Wüstefeld, Tobias; Schubö, Anna

    2014-04-01

    Visual search for a target object is faster if the target is embedded in a repeatedly presented invariant configuration of distractors ('contextual cueing'). It has also been shown that the homogeneity of a context affects the efficiency of visual search: targets receive prioritized processing when presented in a homogeneous context compared to a heterogeneous context, presumably due to grouping processes at early stages of visual processing. The present study investigated in three experiments whether context homogeneity also affects contextual cueing. In Experiment 1, context homogeneity varied on three levels of the task-relevant dimension (orientation) and contextual cueing was most pronounced for context configurations with high orientation homogeneity. When context homogeneity varied on three levels of the task-irrelevant dimension (color) and orientation homogeneity was fixed, no modulation of contextual cueing was observed: high orientation homogeneity led to large contextual cueing effects (Experiment 2) and low orientation homogeneity led to low contextual cueing effects (Experiment 3), irrespective of color homogeneity. Enhanced contextual cueing for homogeneous context configurations suggests that grouping processes affect not only visual search but also implicit learning. We conclude that memory representations of context configurations are more easily acquired when context configurations can be processed as larger, grouped perceptual units. However, this form of implicit perceptual learning is only improved by stimulus homogeneity when stimulus homogeneity facilitates grouping processes on a dimension that is currently relevant in the task. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Homogeneous Atomic Fermi Gases

    NASA Astrophysics Data System (ADS)

    Mukherjee, Biswaroop; Yan, Zhenjie; Patel, Parth B.; Hadzibabic, Zoran; Yefsah, Tarik; Struck, Julian; Zwierlein, Martin W.

    2017-03-01

    We report on the creation of homogeneous Fermi gases of ultracold atoms in a uniform potential. In the momentum distribution of a spin-polarized gas, we observe the emergence of the Fermi surface and the saturated occupation of one particle per momentum state: the striking consequence of Pauli blocking in momentum space for a degenerate gas. Cooling a spin-balanced Fermi gas at unitarity, we create homogeneous superfluids and observe spatially uniform pair condensates. For thermodynamic measurements, we introduce a hybrid potential that is harmonic in one dimension and uniform in the other two. The spatially resolved compressibility reveals the superfluid transition in a spin-balanced Fermi gas, saturation in a fully polarized Fermi gas, and strong attraction in the polaronic regime of a partially polarized Fermi gas.

  3. Process for forming a homogeneous oxide solid phase of catalytically active material

    DOEpatents

    Perry, Dale L.; Russo, Richard E.; Mao, Xianglei

    1995-01-01

    A process is disclosed for forming a homogeneous oxide solid phase reaction product of catalytically active material comprising one or more alkali metals, one or more alkaline earth metals, and one or more Group VIII transition metals. The process comprises reacting together one or more alkali metal oxides and/or salts, one or more alkaline earth metal oxides and/or salts, one or more Group VIII transition metal oxides and/or salts, capable of forming a catalytically active reaction product, in the optional presence of an additional source of oxygen, using a laser beam to ablate from a target such metal compound reactants in the form of a vapor in a deposition chamber, resulting in the deposition, on a heated substrate in the chamber, of the desired oxide phase reaction product. The resulting product may be formed in variable, but reproducible, stoichiometric ratios. The homogeneous oxide solid phase product is useful as a catalyst, and can be produced in many physical forms, including thin films, particulate forms, coatings on catalyst support structures, and coatings on structures used in reaction apparatus in which the reaction product of the invention will serve as a catalyst.

  4. Competing Contact Processes on Homogeneous Networks with Tunable Clusterization

    NASA Astrophysics Data System (ADS)

    Rybak, Marcin; Kułakowski, Krzysztof

    2013-03-01

    We investigate two homogeneous networks: the Watts-Strogatz network with mean degree ⟨k⟩ = 4 and the Erdös-Rényi network with ⟨k⟩ = 10. In both kinds of networks, the clustering coefficient C is a tunable control parameter. The network is the arena of two competing contact processes, where nodes can be in two states, S or D. A node S becomes D with probability 1 if at least two of its mutually linked neighbors are D. A node D becomes S with a given probability p if at least one of its neighbors is S. The competition between the processes is described by a phase diagram, where the critical probability pc depends on the clustering coefficient C. For p > pc the fraction of nodes in state S increases in time, seemingly coming to dominate the whole system. Below pc, the majority of nodes is in the D-state. The numerical results indicate that for the Watts-Strogatz network the D-process is activated at a finite value of the clustering coefficient C, close to 0.3. On the contrary, for the Erdös-Rényi network the transition is observed over the whole investigated range of C.
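
    A minimal simulation of the competing contact processes described above is sketched below on a Watts-Strogatz graph (using the rewiring probability as a crude stand-in for tuning clustering; the paper's tunable-clustering ensembles are not reproduced). Network size, the recovery probability and the number of steps are arbitrary.

    ```python
    import random
    import networkx as nx

    # Competing contact processes: an S node becomes D (prob. 1) if at least two of its
    # neighbours that are linked to each other are both D; a D node becomes S with
    # probability p_recover if at least one neighbour is S. Synchronous updates.
    random.seed(5)
    G = nx.watts_strogatz_graph(n=1000, k=4, p=0.1, seed=5)   # rewiring prob. 0.1 (arbitrary)
    state = {v: ("D" if random.random() < 0.5 else "S") for v in G}
    p_recover = 0.6                                           # the process parameter p (arbitrary)

    def step(state):
        new = dict(state)
        for v in G:
            nbrs = list(G[v])
            if state[v] == "S":
                d_nbrs = [u for u in nbrs if state[u] == "D"]
                # two D-neighbours that are mutually linked trigger the S -> D transition
                if any(G.has_edge(u, w) for i, u in enumerate(d_nbrs) for w in d_nbrs[i + 1:]):
                    new[v] = "D"
            elif any(state[u] == "S" for u in nbrs) and random.random() < p_recover:
                new[v] = "S"
        return new

    for _ in range(200):
        state = step(state)
    print("fraction of S nodes:", sum(s == "S" for s in state.values()) / len(state))
    ```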

  5. Experimenting With Ore: Creating the Taconite Process; flow chart of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Experimenting With Ore: Creating the Taconite Process; flow chart of process - Mines Experiment Station, University of Minnesota, Twin Cities Campus, 56 East River Road, Minneapolis, Hennepin County, MN

  6. Gravitational influences on the liquid-state homogenization and solidification of aluminum antimonide. [space processing of solar cell material

    NASA Technical Reports Server (NTRS)

    Ang, C.-Y.; Lacy, L. L.

    1979-01-01

    Typical commercial or laboratory-prepared samples of polycrystalline AlSb contain microstructural inhomogeneities of Al- or Sb-rich phases in addition to the primary AlSb grains. The paper reports on gravitational influences, such as density-driven convection or sedimentation, that cause microscopic phase separation and nonequilibrium conditions to exist in earth-based melts of AlSb. A triple-cavity electric furnace is used to homogenize the multiphase AlSb samples in space and on earth. A comparative characterization of identically processed low- and one-gravity samples of commercial AlSb reveals major improvements in the homogeneity of the low-gravity homogenized material.

  7. Efficacy of low-temperature high hydrostatic pressure processing in inactivating Vibrio parahaemolyticus in culture suspension and oyster homogenate.

    PubMed

    Phuvasate, Sureerat; Su, Yi-Cheng

    2015-03-02

    Culture suspensions of five clinical and five environmental Vibrio parahaemolyticus strains in 2% NaCl solution were subjected to high pressure processing (HPP) under various conditions (200-300 MPa for 5 and 10 min at 1.5-20°C) to study differences in pressure resistance among the strains. The most pressure-resistant and pressure-sensitive strains were selected to investigate the effects of low temperatures (15, 5 and 1.5°C) on HPP (200 or 250 MPa for 5 min) to inactivate V. parahaemolyticus in sterile oyster homogenates. Inactivation of V. parahaemolyticus cells in culture suspensions and oyster homogenates was greatly enhanced by lowering the processing temperature from 15 to 5 or 1.5°C. A treatment of oyster homogenates at 250 MPa for 5 min at 5°C decreased the populations of V. parahaemolyticus by 6.2 log CFU/g for strains 10290 and 100311Y11 and by >7.4 log CFU/g for strain 10292. Decreasing the processing temperature of the same treatment to 1.5°C reduced all the V. parahaemolyticus strains inoculated to oyster homogenates to non-detectable (<10 CFU/g) levels. Factors including pressure level, processing temperature and time all need to be considered for developing effective HPP for eliminating pathogens from foods. Further studies are needed to validate the efficacy of the HPP (250 MPa for 5 min at 1.5°C) in inactivating V. parahaemolyticus cells in whole oysters. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Parallel-Processing Software for Creating Mosaic Images

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Deen, Robert; McCauley, Michael; DeJong, Eric

    2008-01-01

    A computer program implements parallel processing for nearly real-time creation of panoramic mosaics of images of terrain acquired by video cameras on an exploratory robotic vehicle (e.g., a Mars rover). Because the original images are typically acquired at various camera positions and orientations, it is necessary to warp the images into the reference frame of the mosaic before stitching them together to create the mosaic. [Also see "Parallel-Processing Software for Correlating Stereo Images," Software Supplement to NASA Tech Briefs, Vol. 31, No. 9 (September 2007) page 26.] The warping algorithm in this computer program reflects the considerations that (1) for every pixel in the desired final mosaic, a good corresponding point must be found in one or more of the original images and (2) for this purpose, one needs a good mathematical model of the cameras and a good correlation of individual pixels with respect to their positions in three dimensions. The desired mosaic is divided into slices, each of which is assigned to one of a number of central processing units (CPUs) operating simultaneously. The results from the CPUs are gathered and placed into the final mosaic. The time taken to create the mosaic depends upon the number of CPUs, the speed of each CPU, and whether a local or a remote data-staging mechanism is used.

  9. Nonstationary homogeneous nucleation

    NASA Technical Reports Server (NTRS)

    Harstad, K. G.

    1974-01-01

    The theory of homogeneous condensation is reviewed and equations describing this process are presented. Numerical computer solutions to transient problems in nucleation (relaxation to steady state) are presented and compared to a prior computation.

  10. Modeling environmental noise exceedances using non-homogeneous Poisson processes.

    PubMed

    Guarnaccia, Claudio; Quartieri, Joseph; Barrios, Juan M; Rodrigues, Eliane R

    2014-10-01

    In this work a non-homogeneous Poisson model is considered to study noise exposure. The Poisson process, counting the number of times that a sound level surpasses a threshold, is used to estimate the probability that a population is exposed to high levels of noise a certain number of times in a given time interval. The rate function of the Poisson process is assumed to be of a Weibull type. The presented model is applied to community noise data from Messina, Sicily (Italy). Four sets of data are used to estimate the parameters involved in the model. After the estimation and tuning are made, a way of estimating the probability that an environmental noise threshold is exceeded a certain number of times in a given time interval is presented. This estimation can be very useful in the study of noise exposure of a population and also to predict, given the current behavior of the data, the probability of occurrence of high levels of noise in the near future. One of the most important features of the model is that it implicitly takes into account different noise sources, which need to be treated separately when using usual models.
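
    With a Weibull-type rate, lambda(t) = (beta/alpha)*(t/alpha)^(beta-1), the mean measure over (0, T] is Lambda(T) = (T/alpha)^beta and the number of exceedances in that interval is Poisson with mean Lambda(T). The sketch below computes the probability of k exceedances under this model; the parameter values are invented, not the ones fitted to the Messina data.

    ```python
    from math import exp, factorial

    # P(k exceedances in (0, T]) for an NHPP with Weibull-type rate:
    #   Lambda(T) = (T / alpha)**beta,  P(N = k) = Lambda**k * exp(-Lambda) / k!
    def prob_exceedances(k, T, alpha, beta):
        lam = (T / alpha) ** beta
        return lam**k * exp(-lam) / factorial(k)

    alpha, beta, T = 30.0, 1.2, 90.0            # hypothetical scale, shape, 90-day window
    for k in range(4):
        print(f"P({k} exceedances in {T:.0f} days) = {prob_exceedances(k, T, alpha, beta):.3f}")
    ```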

  11. Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures

    DTIC Science & Technology

    2017-11-01

    ARL-TR-8213, November 2017, US Army Research Laboratory: Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures.

  12. Characterization of Particles Created By Laser-Driven Hydrothermal Processing

    DTIC Science & Technology

    2016-06-01

    Particles created by laser-driven hydrothermal processing, an innovative technique used for the ablation of submerged materials, were characterized. Two naturally occurring materials, obsidian and tektite, were used as targets for this technique.

  13. High-pressure homogenization associated hydrothermal process of palygorskite for enhanced adsorption of Methylene blue

    NASA Astrophysics Data System (ADS)

    Zhang, Zhifang; Wang, Wenbo; Wang, Aiqin

    2015-02-01

    Palygorskite (PAL) was modified by a high-pressure homogenization assisted hydrothermal process. The effects of modification on the morphology, structure and physicochemical properties of PAL were systematically investigated by Field-emission scanning electron microscopy (FESEM), Transmission electron microscopy (TEM), Fourier transform infrared spectrometry (FTIR), Brunauer-Emmett-Teller (BET) analysis, X-ray diffraction (XRD) and Zeta potential analysis techniques, and the adsorption properties were systematically evaluated using Methylene blue (MB) as the model dye. The results revealed that the crystal bundles were disaggregated into individual nanorods and the PAL nanorods became more even after treatment via the combined high-pressure homogenization and hydrothermal process. The intrinsic crystal structure of PAL was retained after hydrothermal treatment, and the pore size calculated by the BET method increased. The adsorption capacity of PAL for MB was evidently improved (from 119 mg/g to 171 mg/g) after modification, and dispersing the PAL before the hydrothermal reaction is favorable for adsorption. The desorption evaluation confirms that the modified PAL has a stronger affinity for MB, which is beneficial for fabricating a stable organic-inorganic hybrid pigment.

  14. Effects of ultrasonication and conventional mechanical homogenization processes on the structures and dielectric properties of BaTiO3 ceramics.

    PubMed

    Akbas, Hatice Zehra; Aydin, Zeki; Yilmaz, Onur; Turgut, Selvin

    2017-01-01

    The effects of the homogenization process on the structures and dielectric properties of pure and Nb-doped BaTiO3 ceramics have been investigated using an ultrasonic homogenization and conventional mechanical methods. The reagents were homogenized using an ultrasonic processor with high-intensity ultrasonic waves and using a compact mixer-shaker. The components and crystal types of the powders were determined by Fourier-transform infrared spectroscopy (FTIR) and X-ray diffraction (XRD) analyses. The complex permittivity (ε′, ε″) and AC conductivity (σ′) of the samples were analyzed in a wide frequency range of 20 Hz to 2 MHz at room temperature. The structures and dielectric properties of pure and Nb-doped BaTiO3 ceramics strongly depend on the homogenization process in a solid-state reaction method. Using an ultrasonic processor with high-intensity ultrasonic waves based on acoustic cavitation phenomena can make a significant improvement in producing high-purity BaTiO3 ceramics without carbonate impurities with a small dielectric loss. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Homogeneity of Gd-based garnet transparent ceramic scintillators for gamma spectroscopy

    NASA Astrophysics Data System (ADS)

    Seeley, Z. M.; Cherepy, N. J.; Payne, S. A.

    2013-09-01

    Transparent polycrystalline ceramic scintillators based on the composition Gd1.49Y1.49Ce0.02Ga2.2Al2.8O12 are being developed for gamma spectroscopy detectors. Scintillator light yield and energy resolution depend on the details of various processing steps, including powder calcination, green body formation, and sintering atmosphere. We have found that gallium sublimation during vacuum sintering creates compositional gradients in the ceramic and can degrade the energy resolution. While sintering in oxygen produces ceramics with uniform composition and little afterglow, light yields are reduced, compared to vacuum sintering. By controlling the atmosphere during the various process steps, we were able to minimize the gallium sublimation, resulting in a more homogeneous composition and improved gamma spectroscopy performance.

  16. Political homogeneity can nurture threats to research validity.

    PubMed

    Chambers, John R; Schlenker, Barry R

    2015-01-01

    Political homogeneity within a scientific field nurtures threats to the validity of many research conclusions by allowing ideologically compatible values to influence interpretations, by minimizing skepticism, and by creating premature consensus. Although validity threats can crop up in any research, the usual corrective activities in science are more likely to be minimized and delayed.

  17. Temperature distribution around thin electroconductive layers created on composite textile substrates

    NASA Astrophysics Data System (ADS)

    Korzeniewska, Ewa; Szczesny, Artur; Krawczyk, Andrzej; Murawski, Piotr; Mróz, Józef; Seme, Sebastian

    2018-03-01

    In this paper, the authors describe the distribution of temperatures around electroconductive pathways created by a physical vacuum deposition process on flexible textile substrates used in elastic electronics and textronics. Cordura material was chosen as the substrate. Silver with 99.99% purity was used as the deposited metal. This research was based on thermographic photographs of the produced samples. Analysis of the temperature field around the electroconductive layer was carried out using Image ThermaBase EU software. The analysis of the temperature distribution highlights the software's usefulness in determining the homogeneity of the created metal layer. Higher local temperatures and non-uniform distributions at the same time can negatively influence the work of the textronic system.

  18. Thermal homogeneity of plastication processes in single-screw extruders

    NASA Astrophysics Data System (ADS)

    Bu, L. X.; Agbessi, Y.; Béreaux, Y.; Charmeau, J.-Y.

    2018-05-01

    Single-screw plastication, used in extrusion and in injection moulding, is a major way of processing commodity thermoplastics. During the plastication phase, the polymeric material is melted by the combined effects of shear-induced self-heating (viscous dissipation) and heat conduction coming from the barrel. In injection moulding, a high level of reliability is usually achieved that makes this process ideally suited to mass market production. Nonetheless, process fluctuations still appear that make moulded part quality control an everyday issue. In this work, we used combined modelling of plastication, throughput calculation and laminar dispersion to investigate whether, and how, thermal fluctuations could propagate along the screw length and affect the melt homogeneity at the end of the metering section. To do this, we used plastication models to relate changes in processing parameters to changes in the plastication length. Moreover, a simple model of throughput calculation is used to relate the screw geometry, the polymer rheology and the processing parameters to get a good estimate of the mass flow rate. Hence, we found that the typical residence time in a single screw is around one tenth of the thermal diffusion time scale. This residence time is too short for the dispersion coefficient to reach a steady state, but too long to be able to neglect radial thermal diffusion and resort to a purely convective solution. Therefore, a full diffusion/convection problem has to be solved with a base flow described by the classic pressure and drag velocity field. Preliminary results already show the major importance of the processing parameters in the breakthrough curve of an arbitrary temperature fluctuation at the end of the metering section of the injection moulding screw. When the flow back-pressure is high, the temperature fluctuation is spread more evenly with time, whereas a pressure drop in the flow will result in a breakthrough curve which presents a larger peak of

  19. Benchmarking monthly homogenization algorithms

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data

  20. Homogenization of regional river dynamics by dams and global biodiversity implications.

    PubMed

    Poff, N Leroy; Olden, Julian D; Merritt, David M; Pepin, David M

    2007-04-03

    Global biodiversity in river and riparian ecosystems is generated and maintained by geographic variation in stream processes and fluvial disturbance regimes, which largely reflect regional differences in climate and geology. Extensive construction of dams by humans has greatly dampened the seasonal and interannual streamflow variability of rivers, thereby altering natural dynamics in ecologically important flows on continental to global scales. The cumulative effects of modification to regional-scale environmental templates caused by dams is largely unexplored but of critical conservation importance. Here, we use 186 long-term streamflow records on intermediate-sized rivers across the continental United States to show that dams have homogenized the flow regimes on third- through seventh-order rivers in 16 historically distinctive hydrologic regions over the course of the 20th century. This regional homogenization occurs chiefly through modification of the magnitude and timing of ecologically critical high and low flows. For 317 undammed reference rivers, no evidence for homogenization was found, despite documented changes in regional precipitation over this period. With an estimated average density of one dam every 48 km of third- through seventh-order river channel in the United States, dams arguably have a continental scale effect of homogenizing regionally distinct environmental templates, thereby creating conditions that favor the spread of cosmopolitan, nonindigenous species at the expense of locally adapted native biota. Quantitative analyses such as ours provide the basis for conservation and management actions aimed at restoring and maintaining native biodiversity and ecosystem function and resilience for regionally distinct ecosystems at continental to global scales.

  1. A Tool for Creating Healthier Workplaces: The Conducivity Process

    ERIC Educational Resources Information Center

    Karasek, Robert A.

    2004-01-01

    The conducivity process, a methodology for creating healthier workplaces by promoting conducive production, is illustrated through the use of the "conducivity game" developed in the NordNet Project in Sweden, which was an action research project to test a job redesign methodology. The project combined the "conducivity" hypotheses about a…

  2. Homogeneity Pursuit

    PubMed Central

    Ke, Tracy; Fan, Jianqing; Wu, Yichao

    2014-01-01

    This paper explores the homogeneity of coefficients in high-dimensional regression, which extends the sparsity concept and is more general and suitable for many applications. Homogeneity arises when regression coefficients corresponding to neighboring geographical regions or a similar cluster of covariates are expected to be approximately the same. Sparsity corresponds to a special case of homogeneity with a large cluster of known atom zero. In this article, we propose a new method called clustering algorithm in regression via data-driven segmentation (CARDS) to explore homogeneity. New mathematics are provided on the gain that can be achieved by exploring homogeneity. Statistical properties of two versions of CARDS are analyzed. In particular, the asymptotic normality of our proposed CARDS estimator is established, which reveals better estimation accuracy for homogeneous parameters than that without homogeneity exploration. When our methods are combined with sparsity exploration, further efficiency can be achieved beyond the exploration of sparsity alone. This provides additional insights into the power of exploring low-dimensional structures in high-dimensional regression: homogeneity and sparsity. Our results also shed light on the properties of the fused Lasso. The newly developed method is further illustrated by simulation studies and applications to real data. Supplementary materials for this article are available online. PMID:26085701

  3. Polymer powder processing of cryomilled polycaprolactone for solvent-free generation of homogeneous bioactive tissue engineering scaffolds.

    PubMed

    Lim, Jing; Chong, Mark Seow Khoon; Chan, Jerry Kok Yen; Teoh, Swee-Hin

    2014-06-25

    Synthetic polymers used in tissue engineering require functionalization with bioactive molecules to elicit specific physiological reactions. These additives must be homogeneously dispersed in order to achieve enhanced composite mechanical performance and uniform cellular response. This work demonstrates the use of a solvent-free powder processing technique to form osteoinductive scaffolds from cryomilled polycaprolactone (PCL) and tricalcium phosphate (TCP). Cryomilling is performed to achieve a micrometer-sized distribution of PCL and reduce melt viscosity, thus improving TCP distribution and structural integrity. A breakthrough is achieved in the successful fabrication of a continuous film structure containing 70 weight percent TCP. Following compaction and melting, PCL/TCP composite scaffolds are found to display uniform distribution of TCP throughout the PCL matrix regardless of composition. Homogeneous spatial distribution is also achieved in fabricated 3D scaffolds. When seeded onto powder-processed PCL/TCP films, mesenchymal stem cells are found to undergo robust and uniform osteogenic differentiation, indicating the potential application of this approach to biofunctionalize scaffolds for tissue engineering applications. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. An empirical Bayesian and Buhlmann approach with non-homogenous Poisson process

    NASA Astrophysics Data System (ADS)

    Noviyanti, Lienda

    2015-12-01

    All general insurance companies in Indonesia have to adjust their current premium rates according to the maximum and minimum limit rates in the new regulation established by the Financial Services Authority (Otoritas Jasa Keuangan / OJK). In this research, we estimated premium rates by means of the Bayesian and the Buhlmann approach, using historical claim frequency and claim severity for five risk groups. We assumed a Poisson distributed claim frequency and a Normal distributed claim severity. In particular, we used a non-homogeneous Poisson process for estimating the parameters of claim frequency. We found that the estimated premium rates are higher than the actual current rate. Regarding the OJK upper and lower limit rates, the estimates across the five risk groups vary; some fall within the interval and some fall outside it.
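
    A minimal sketch of the credibility side of the approach described above: the classical Buhlmann premium combines each risk group's own experience with the collective mean through a credibility factor. The claim data, and the use of the plain Buhlmann estimator rather than the paper's Bayesian and non-homogeneous Poisson machinery, are assumptions for illustration.

```python
# Minimal sketch of a Buhlmann credibility premium calculation, assuming the
# classical model. Claim amounts per risk group and year are made-up numbers.
import numpy as np

# rows = 5 risk groups, columns = yearly aggregate claims (hypothetical data)
claims = np.array([
    [12.0, 15.0, 11.0, 14.0],
    [30.0, 28.0, 35.0, 33.0],
    [ 7.0,  9.0,  8.0,  6.0],
    [20.0, 22.0, 19.0, 25.0],
    [45.0, 40.0, 48.0, 44.0],
])
n_years = claims.shape[1]

group_means = claims.mean(axis=1)
overall_mean = group_means.mean()

# expected process variance (within-group) and variance of hypothetical means (between-group)
epv = claims.var(axis=1, ddof=1).mean()
vhm = group_means.var(ddof=1) - epv / n_years
vhm = max(vhm, 0.0)                      # guard against a negative estimate

k = epv / vhm if vhm > 0 else np.inf
z = n_years / (n_years + k)              # credibility factor

premiums = z * group_means + (1 - z) * overall_mean
print("Credibility factor Z =", round(z, 3))
print("Credibility premiums:", np.round(premiums, 2))
```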

  5. Numerical investigation of homogeneous cavitation nucleation in a microchannel

    NASA Astrophysics Data System (ADS)

    Lyu, Xiuxiu; Pan, Shucheng; Hu, Xiangyu; Adams, Nikolaus A.

    2018-06-01

    The physics of nucleation in water is an important issue for many areas, ranging from biomedical to engineering applications. Within the present study, we numerically investigate homogeneous nucleation in a microchannel induced by shock reflection to gain a better understanding of the mechanism of homogeneous nucleation. The liquid expands due to the reflected shock and homogeneous cavitation nuclei are generated. An Eulerian-Lagrangian approach is employed for modeling this process in a microchannel. Two-dimensional axisymmetric Euler equations are solved for obtaining the time evolution of the shock, the gas bubble, and the ambient fluid. The dynamics of the dispersed vapor bubbles is coupled with the surrounding fluid in a Lagrangian framework, describing bubble location and bubble size variation. Our results reproduce nuclei distributions at different stages of homogeneous nucleation and are in good agreement with experimental results. We obtain numerical data for the negative pressure that water can sustain during homogeneous nucleation. An energy transformation description for homogeneous nucleation inside a microchannel flow is derived and analyzed in detail.
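
    The Lagrangian bubble dynamics referred to above are often represented by a Rayleigh-Plesset-type equation. The sketch below integrates such an equation for a single nucleus under a prescribed tension pulse; it is a generic illustration under assumed parameter values, not the coupled Eulerian-Lagrangian model of the study.

```python
# Minimal sketch of Lagrangian bubble-size dynamics via the Rayleigh-Plesset
# equation under a prescribed far-field tension (negative pressure) pulse.
# All parameter values are assumptions chosen for illustration.
import numpy as np
from scipy.integrate import solve_ivp

rho = 1000.0        # water density, kg/m^3
sigma = 0.072       # surface tension, N/m
mu = 1.0e-3         # dynamic viscosity, Pa s
p0 = 1.0e5          # ambient pressure, Pa
R0 = 1.0e-6         # initial nucleus radius, m
pg0 = p0 + 2 * sigma / R0   # initial gas pressure (equilibrium)

def p_far(t):
    # far-field pressure: short tension pulse of -2 MPa between 0.2 and 1.2 microseconds
    return -2.0e6 if 0.2e-6 < t < 1.2e-6 else p0

def rhs(t, y):
    R, Rdot = y
    pg = pg0 * (R0 / R) ** (3 * 1.4)          # polytropic gas content
    p_wall = pg - 2 * sigma / R - 4 * mu * Rdot / R
    Rddot = ((p_wall - p_far(t)) / rho - 1.5 * Rdot ** 2) / R
    return [Rdot, Rddot]

sol = solve_ivp(rhs, (0.0, 1.2e-6), [R0, 0.0], max_step=1e-9)
print("max radius during tension pulse: %.2f um" % (1e6 * sol.y[0].max()))
```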

  6. Enhancement of anaerobic sludge digestion by high-pressure homogenization.

    PubMed

    Zhang, Sheng; Zhang, Panyue; Zhang, Guangming; Fan, Jie; Zhang, Yuxuan

    2012-08-01

    To improve anaerobic sludge digestion efficiency, the effects of high-pressure homogenization (HPH) conditions on anaerobic sludge digestion were investigated. Volatile solids (VS) and total chemical oxygen demand (TCOD) were significantly removed by anaerobic digestion, and the VS and TCOD removals increased with increasing homogenization pressure and homogenization cycle number; correspondingly, the cumulative biogas production also increased with increasing homogenization pressure and homogenization cycle number. The optimal homogenization pressure was 50 MPa for one homogenization cycle and 40 MPa for two homogenization cycles. The soluble chemical oxygen demand (SCOD) of the sludge supernatant significantly increased with increasing homogenization pressure and homogenization cycle number due to sludge disintegration. The relationship between biogas production and sludge disintegration showed that the cumulative biogas and methane production were mainly enhanced by the sludge disintegration, which accelerated the anaerobic digestion process and improved the methane content of the biogas. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Effects of high-speed homogenization and high-pressure homogenization on structure of tomato residue fibers.

    PubMed

    Hua, Xiao; Xu, Shanan; Wang, Mingming; Chen, Ying; Yang, Hui; Yang, Ruijin

    2017-10-01

    Tomato residue fibers obtained after derosination and deproteinization were processed by high-speed homogenization (HSH) and high-pressure homogenization (HPH), and their respective effects on fiber structure were investigated. Characterizations including particle size distribution, SEM, TEM and XRD were performed. HSH could break raw fibers into small particles of around 60 μm, while HPH could reshape fibers to build a network structure. Microfibrils were released and their nanostructure consisting of elementary fibrils was observed by TEM. XRD patterns indicated that both HSH and HPH could hardly alter the nanostructure of the fibers. Physicochemical properties including expansibility, water-holding capacity (WHC) and oil-holding capacity (OHC) were determined. Both HSH and HPH could increase the soluble fiber content by about 8%, but combined HSH-HPH processing did not give a better result. Acid (4 mol/L HCl) was used in place of the water medium, and acidic degradation of the fibers could be promoted by high-speed shearing or high-pressure processing. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Applications of High and Ultra High Pressure Homogenization for Food Safety.

    PubMed

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low-temperature long-time and high-temperature short-time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide "fresh-like" products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Accordingly, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species in relation to the food matrix and food chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered.

  9. Applications of High and Ultra High Pressure Homogenization for Food Safety

    PubMed Central

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low-temperature long-time and high-temperature short-time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide “fresh-like” products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350–400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Accordingly, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species in relation to the food matrix and food chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered. PMID:27536270

  10. Process spectroscopy in microemulsions—setup and multi-spectral approach for reaction monitoring of a homogeneous hydroformylation process

    NASA Astrophysics Data System (ADS)

    Meyer, K.; Ruiken, J.-P.; Illner, M.; Paul, A.; Müller, D.; Esche, E.; Wozny, G.; Maiwald, M.

    2017-03-01

    Reaction monitoring in disperse systems, such as emulsions, is of significant technical importance in various disciplines like biotechnological engineering, the chemical industry, food science, and a growing number of other technical fields. These systems pose several challenges when it comes to process analytics, such as heterogeneity of mixtures, changes in optical behavior, and low optical activity. In this regard, online nuclear magnetic resonance (NMR) spectroscopy is a powerful technique for process monitoring in complex reaction mixtures due to its unique direct comparison abilities, while at the same time being non-invasive and independent of the optical properties of the sample. In this study the applicability of online spectroscopic methods to the homogeneously catalyzed hydroformylation of 1-dodecene to tridecanal is investigated, which is operated on a mini-plant scale at Technische Universität Berlin. The design of a laboratory setup for process-like calibration experiments is presented, including a 500 MHz online NMR spectrometer, a benchtop NMR device with 43 MHz proton frequency as well as two Raman probes and a flow cell assembly for an ultraviolet and visible light (UV/VIS) spectrometer. Results of high-resolution online NMR spectroscopy are shown and technical as well as process-specific problems observed during the measurements are discussed.

  11. Variable valve timing in a homogenous charge compression ignition engine

    DOEpatents

    Lawrence, Keith E.; Faletti, James J.; Funke, Steven J.; Maloney, Ronald P.

    2004-08-03

    The present invention relates generally to the field of homogeneous charge compression ignition engines, in which fuel is injected when the cylinder piston is relatively close to the bottom dead center position for its compression stroke. The fuel mixes with air in the cylinder during the compression stroke to create a relatively lean homogeneous mixture that preferably ignites when the piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage can result. The present invention utilizes internal exhaust gas recirculation and/or compression ratio control to control the timing of ignition events and combustion duration in homogeneous charge compression ignition engines. Thus, at least one electro-hydraulic assist actuator is provided that is capable of mechanically engaging at least one cam-actuated intake and/or exhaust valve.

  12. A hybrid process combining homogeneous catalytic ozonation and membrane distillation for wastewater treatment.

    PubMed

    Zhang, Yong; Zhao, Peng; Li, Jie; Hou, Deyin; Wang, Jun; Liu, Huijuan

    2016-10-01

    A novel catalytic ozonation membrane reactor (COMR) coupling homogeneous catalytic ozonation and direct contact membrane distillation (DCMD) was developed for the treatment of refractory saline organic pollutants in wastewater. An ozonation process took place in the reactor to degrade organic pollutants, whilst the DCMD process was used to recover ionic catalysts and produce clean water. It was found that 98.6% of total organic carbon (TOC) and almost 100% of the salt were removed, and almost 100% of the metal ion catalyst was recovered. TOC in the permeate water was less than 16 mg/L after 5 h of operation, which was considered satisfactory as the TOC in the potassium hydrogen phthalate (KHP) feed water was as high as 1000 mg/L. Meanwhile, the membrane distillation flux in the COMR process was 49.8% higher than that in the DCMD process alone after 60 h of operation. Further, scanning electron microscope images showed a smaller amount and size of contaminants on the membrane surface, which indicated mitigation of membrane fouling. Tensile strength and FT-IR spectra tests did not reveal obvious changes in the polyvinylidene fluoride membrane after 60 h of operation, which indicated good durability. This novel COMR hybrid process exhibited promising application prospects for saline organic wastewater treatment. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Effect of heat and homogenization on in vitro digestion of milk

    USDA-ARS?s Scientific Manuscript database

    Central to commercial fluid milk processing is the use of high temperature, short time (HTST) pasteurization to ensure the safety and quality of milk, and homogenization to prevent creaming of fat-containing milk. UHT processed homogenized milk is also available commercially and is typically used to...

  14. (Ultra) High Pressure Homogenization for Continuous High Pressure Sterilization of Pumpable Foods – A Review

    PubMed Central

    Georget, Erika; Miller, Brittany; Callanan, Michael; Heinz, Volker; Mathys, Alexander

    2014-01-01

    Bacterial spores have a strong resistance to both chemical and physical hurdles and create a risk for the food industry, which has been tackled by applying high-intensity thermal treatments to sterilize food. These strong thermal treatments lead to a reduction of the organoleptic and nutritional properties of food, and alternatives are actively being searched for. Innovative hurdles offer an alternative to inactivate bacterial spores. In particular, recent technological developments have enabled a new generation of high pressure homogenizers working at pressures up to 400 MPa, thus opening new opportunities for high pressure sterilization of foods. In this short review, we summarize the work conducted on (ultra) high pressure homogenization ((U)HPH) to inactivate endospores in model and food systems. Specific attention is given to process parameters (pressure, inlet, and valve temperatures). This review gathers the current state of the art and underlines the potential of UHPH sterilization of pumpable foods while highlighting the needs for future work. PMID:25988118

  15. Homogenization patterns of the world's freshwater fish faunas.

    PubMed

    Villéger, Sébastien; Blanchet, Simon; Beauchard, Olivier; Oberdorff, Thierry; Brosse, Sébastien

    2011-11-01

    The world is currently undergoing an unprecedented decline in biodiversity, which is mainly attributable to human activities. For instance, nonnative species introduction, combined with the extirpation of native species, affects biodiversity patterns, notably by increasing the similarity among species assemblages. This biodiversity change, called taxonomic homogenization, has rarely been assessed at the world scale. Here, we fill this gap by assessing the current homogenization status of one of the most diverse vertebrate groups (i.e., freshwater fishes) at global and regional scales. We demonstrate that current homogenization of the freshwater fish faunas is still low at the world scale (0.5%) but reaches substantial levels (up to 10%) in some highly invaded river basins from the Nearctic and Palearctic realms. In these realms experiencing high changes, nonnative species introductions rather than native species extirpations drive taxonomic homogenization. Our results suggest that the "Homogocene era" is not yet the case for freshwater fish fauna at the worldwide scale. However, the distressingly high level of homogenization noted for some biogeographical realms stresses the need for further understanding of the ecological consequences of homogenization processes.

  16. Anthropogenic Matrices Favor Homogenization of Tree Reproductive Functions in a Highly Fragmented Landscape.

    PubMed

    Carneiro, Magda Silva; Campos, Caroline Cambraia Furtado; Beijo, Luiz Alberto; Ramos, Flavio Nunes

    2016-01-01

    Species homogenization or floristic differentiation are two possible consequences of the fragmentation process in plant communities. Despite the few studies, it seems clear that fragments with low forest cover inserted in anthropogenic matrices are more likely to experience floristic homogenization. However, the homogenization process has two other components, genetic and functional, which have not been investigated. The purpose of this study was to verify whether there was homogenization of tree reproductive functions in a fragmented landscape and, if found, to determine how the process was influenced by landscape composition. The study was conducted in eight fragments in southwestern Brazil. In each fragment, all individual trees with a diameter at breast height ≥3 cm were sampled in ten plots (0.2 ha) and classified into 26 reproductive functional types (RFTs). The process of functional homogenization was evaluated using additive partitioning of diversity. Additionally, the effect of landscape composition on functional diversity and on the number of individuals within each RFT was evaluated using a generalized linear mixed model. The fragments appeared to be in a process of functional homogenization (dominance of RFTs, alpha diversity lower than expected by chance, and low beta diversity). More than 50% of the RFTs and the functional diversity were affected by the landscape parameters. In general, the percentage of forest cover has a positive effect on RFTs while the percentage of coffee matrix has a negative one. The process of functional homogenization has serious consequences for biodiversity conservation because some functions may disappear that, in the long term, would threaten the fragments. This study contributes to a better understanding of how landscape changes affect the functional diversity, abundance of individuals in RFTs and the process of functional homogenization, as well as how to

  17. Anthropogenic Matrices Favor Homogenization of Tree Reproductive Functions in a Highly Fragmented Landscape

    PubMed Central

    2016-01-01

    Species homogenization or floristic differentiation are two possible consequences of the fragmentation process in plant communities. Despite the few studies, it seems clear that fragments with low forest cover inserted in anthropogenic matrices are more likely to experience floristic homogenization. However, the homogenization process has two other components, genetic and functional, which have not been investigated. The purpose of this study was to verify whether there was homogenization of tree reproductive functions in a fragmented landscape and, if found, to determine how the process was influenced by landscape composition. The study was conducted in eight fragments in southwestern Brazil. In each fragment, all individual trees with a diameter at breast height ≥3 cm were sampled in ten plots (0.2 ha) and classified into 26 reproductive functional types (RFTs). The process of functional homogenization was evaluated using additive partitioning of diversity. Additionally, the effect of landscape composition on functional diversity and on the number of individuals within each RFT was evaluated using a generalized linear mixed model. The fragments appeared to be in a process of functional homogenization (dominance of RFTs, alpha diversity lower than expected by chance, and low beta diversity). More than 50% of the RFTs and the functional diversity were affected by the landscape parameters. In general, the percentage of forest cover has a positive effect on RFTs while the percentage of coffee matrix has a negative one. The process of functional homogenization has serious consequences for biodiversity conservation because some functions may disappear that, in the long term, would threaten the fragments. This study contributes to a better understanding of how landscape changes affect the functional diversity, abundance of individuals in RFTs and the process of functional homogenization, as well as how to

  18. Deep nursing: a thoughtful, co-created nursing process.

    PubMed

    Griffiths, Colin

    2017-03-30

    This article examines some of the challenges in nursing practice experienced by patients and nurses in the UK and Ireland, and considers some of the associated stressors in the system. Nurses must respond to these challenges by crafting their own practice, and the article offers a blueprint for developing personal nursing practice through acceptance, paying detailed attention to patients, taking time with patients and personal reflection. It draws on innovations in learning disability practice to suggest that care should be jointly thought through and co-created by patients and nurses, and that this process of thoughtful engagement constitutes 'deep nursing'.

  19. Exploring earthquake databases for the creation of magnitude-homogeneous catalogues: tools for application on a regional and global scale

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Pagani, M.; Garcia, J.

    2016-09-01

    The creation of a magnitude-homogenized catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenizing multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Center (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilize this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonize magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonized into moment magnitude to form a catalogue of more than 562 840 events. This extended catalogue, while not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
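
    A minimal sketch of the core homogenization step described above, under assumed data: fit an empirical conversion between a local magnitude scale and moment magnitude from events common to both catalogues, then apply it to events that only carry the local magnitude. In practice, orthogonal or generalized regression is often preferred over ordinary least squares because both magnitude scales carry uncertainty; the arrays and the linear form below are illustrative only and are not the models implemented in the tools the abstract describes.

```python
# Minimal sketch of magnitude homogenization: fit an empirical linear model
# between a local magnitude scale and moment magnitude using common events,
# then apply it to convert the rest of the catalogue. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)

# hypothetical common events recorded in both scales (e.g. local mb and Mw)
mb_common = rng.uniform(4.0, 7.0, size=300)
mw_common = 0.85 * mb_common + 1.03 + rng.normal(scale=0.15, size=300)

# fit Mw = a * mb + b by ordinary least squares
a, b = np.polyfit(mb_common, mw_common, deg=1)

# apply the conversion to events that only have a local magnitude
mb_only = np.array([4.2, 5.1, 5.8, 6.4])
mw_proxy = a * mb_only + b
print("fitted conversion: Mw = %.2f * mb + %.2f" % (a, b))
print("homogenized magnitudes:", np.round(mw_proxy, 2))
```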

  20. Process spectroscopy in microemulsions—Raman spectroscopy for online monitoring of a homogeneous hydroformylation process

    NASA Astrophysics Data System (ADS)

    Paul, Andrea; Meyer, Klas; Ruiken, Jan-Paul; Illner, Markus; Müller, David-Nicolas; Esche, Erik; Wozny, Günther; Westad, Frank; Maiwald, Michael

    2017-03-01

    A major industrial reaction based on homogeneous catalysis is hydroformylation for the production of aldehydes from alkenes and syngas. Hydroformylation in microemulsions, which is currently under investigation at Technische Universität Berlin on a mini-plant scale, was identified as a cost-efficient approach which also enhances product selectivity. Herein, we present the application of online Raman spectroscopy to the reaction of 1-dodecene to 1-tridecanal within a microemulsion. To achieve a good representation of the operation range in the mini-plant with regard to concentrations of the reactants, a design of experiments was used. Based on initial Raman spectra, partial least squares regression (PLSR) models were calibrated for the prediction of 1-dodecene and 1-tridecanal. Limits of prediction arise from nonlinear correlations between Raman intensity and mass fractions of compounds in the microemulsion system. Furthermore, the prediction power of the PLSR models becomes limited due to unexpected by-product formation. Application of the lab-scale derived calibration spectra and PLSR models to online spectra from a mini-plant operation yielded promising estimations of 1-tridecanal and acceptable predictions of 1-dodecene mass fractions, suggesting Raman spectroscopy as a suitable technique for process analytics in microemulsions.
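
    A minimal sketch of the PLSR calibration workflow described above, using synthetic spectra in place of the mini-plant data: mass fractions of the two analytes weight two synthetic bands, and a PLSR model is calibrated to predict them from the spectra. The band shapes, noise level, and number of latent variables are assumptions.

```python
# Minimal sketch of a PLSR calibration: spectra as predictors, mass fractions
# as targets. The synthetic "Raman spectra" stand in for real calibration data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n_samples, n_wavenumbers = 80, 400

# synthetic spectra: two component bands weighted by mass fractions plus noise
x_axis = np.linspace(0.0, 1.0, n_wavenumbers)
band_dodecene = np.exp(-((x_axis - 0.3) / 0.02) ** 2)
band_tridecanal = np.exp(-((x_axis - 0.7) / 0.02) ** 2)
w = rng.uniform(0.0, 0.4, size=(n_samples, 2))           # mass fractions of the two analytes
spectra = w @ np.vstack([band_dodecene, band_tridecanal]) \
    + 0.01 * rng.normal(size=(n_samples, n_wavenumbers))

X_train, X_test, y_train, y_test = train_test_split(spectra, w, random_state=0)
pls = PLSRegression(n_components=4)                       # assumed number of latent variables
pls.fit(X_train, y_train)
print("R^2 on held-out spectra:", round(pls.score(X_test, y_test), 3))
```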

  1. Creating "Intelligent" Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, Noel; Taylor, Patrick

    2014-05-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is used to add value to individual model projections and construct a consensus projection. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, individual models reproduce certain climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequal weighting multi-model ensembles. The intention is to produce improved ("intelligent") unequal-weight ensemble averages. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables—e.g., outgoing longwave radiation and surface temperature. Several climate process metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument in combination with surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing the equal-weighted ensemble average and an ensemble weighted using the process-based metric. Additionally, this study investigates the dependence of the metric weighting scheme on the climate state using a combination of model simulations including a non-forced preindustrial control experiment, historical simulations, and
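
    A minimal sketch of the weighting idea described above, with made-up numbers in place of CMIP5 output and the CERES-based metric: each model receives a weight inversely proportional to its metric error, and the weighted ensemble mean is compared with the equal-weighted mean. Inverse-error weighting is only one of many possible skill-based weighting choices.

```python
# Minimal sketch of metric-based ensemble weighting versus equal weighting.
# Model projections and metric errors are illustrative values, not CMIP5 data.
import numpy as np

model_projection = np.array([2.9, 3.4, 2.1, 4.0, 3.1])   # e.g. regional warming, K
metric_error = np.array([0.20, 0.55, 0.10, 0.80, 0.30])  # error in a process-based metric

equal_mean = model_projection.mean()

# inverse-error weighting (one simple choice among many possible skill weightings)
weights = 1.0 / metric_error
weights /= weights.sum()
weighted_mean = np.sum(weights * model_projection)

print("equal-weighted ensemble mean:  %.2f" % equal_mean)
print("metric-weighted ensemble mean: %.2f" % weighted_mean)
```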

  2. Homogenization versus homogenization-free method to measure muscle glycogen fractions.

    PubMed

    Mojibi, N; Rasouli, M

    2016-12-01

    Glycogen is extracted from animal tissues with or without homogenization using cold perchloric acid. Three methods were compared for the determination of glycogen in rat muscle at different physiological states. Two groups of five rats were kept at rest or subjected to 45 minutes of muscular activity. The glycogen fractions were extracted and measured using the three methods. The data from the homogenization method show that total glycogen decreased following 45 min of physical activity and that the change occurred entirely in acid-soluble glycogen (ASG), while acid-insoluble glycogen (AIG) did not change significantly. Similar results were obtained using the "total glycogen fractionation" method. The findings of the "homogenization-free" method indicate that the acid-insoluble fraction (AIG) was the main portion of muscle glycogen and that the majority of changes occurred in the AIG fraction. The results of the "homogenization" method are identical to those of "total glycogen fractionation" but differ from the "homogenization-free" protocol. The ASG fraction is the major portion of muscle glycogen and is the more metabolically active form.

  3. Homogenous charge compression ignition engine having a cylinder including a high compression space

    DOEpatents

    Agama, Jorge R.; Fiveland, Scott B.; Maloney, Ronald P.; Faletti, James J.; Clarke, John M.

    2003-12-30

    The present invention relates generally to the field of homogeneous charge compression engines. In these engines, fuel is injected upstream or directly into the cylinder when the power piston is relatively close to its bottom dead center position. The fuel mixes with air in the cylinder as the power piston advances to create a relatively lean homogeneous mixture that preferably ignites when the power piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage can result. Thus, the present invention divides the homogeneous charge between a controlled-volume higher compression space and a lower compression space to better control the start of ignition.

  4. Comparing the effect of homogenization and heat processing on the properties and in vitro digestion of milk from organic and conventional dairy herds.

    PubMed

    Van Hekken, D L; Tunick, M H; Ren, D X; Tomasula, P M

    2017-08-01

    We compared the effects of homogenization and heat processing on the chemical and in vitro digestion traits of milk from organic and conventional herds. Raw milk from organic (>50% of dry matter intake from pasture) and conventional (no access to pasture) farms was adjusted to commercial whole and nonfat milk fat standards, and processed with or without homogenization, and with high-temperature-short-time or UHT pasteurization. The milk then underwent in vitro gastrointestinal digestion. Comparison of milk from organic and conventional herds showed that the milks responded to processing in similar ways. General composition was the same among the whole milk samples and among the nonfat milk samples. Protein profiles were similar, with intact caseins and whey proteins predominant and only minor amounts of peptides. Whole milk samples from grazing cows contained higher levels of α-linolenic (C18:3), vaccenic (C18:1 trans), and conjugated linoleic acids, and lower levels of palmitic (C16:0) and stearic (C18:0) acids than samples from nongrazing cows. Processing had no effect on conjugated linoleic acid and linolenic acid levels in milk, although homogenization resulted in higher levels of C8 to C14 saturated fatty acids. Of the 9 volatile compounds evaluated, milk from grazing cows contained lower levels of 2-butanone than milk from nongrazing cows, and milk from both farms showed spikes for heptanal in UHT samples and spikes for butanoic, octanoic, nonanoic, and n-decanoic acids in homogenized samples. At the start of in vitro digestion, nonfat raw and pasteurized milk samples formed the largest acid clots, and organic milk clots were larger than conventional milk clots; UHT whole milk formed the smallest clots. Milk digests from grazing cows had lower levels of free fatty acids than digests from nongrazing cows. In vitro proteolysis was similar in milk from both farms and resulted in 85 to 95% digestibility. Overall, milk from organic/grass-fed and conventional

  5. Magnetic field homogeneity of a conical coaxial coil pair.

    PubMed

    Salazar, F J; Nieves, F J; Bayón, A; Gascón, F

    2017-09-01

    An analytical study of the magnetic field created by a double-conical conducting sheet is presented. The analysis is based on the expansion of the magnetic field in terms of Legendre polynomials. It is demonstrated analytically that the angle of the conical surface that produces a nearly homogeneous magnetic field coincides with that of a pair of loops that fulfills the Helmholtz condition. From the results obtained, we propose an electric circuit formed by pairs of isolated conducting loops tightly wound around a pair of conical surfaces, calculating numerically the magnetic field produced by this system and its heterogeneity. An experimental setup of the proposed circuit was constructed and its magnetic field was measured. The results were compared with those obtained by numerical calculation, finding a good agreement. The numerical results demonstrate a significant improvement in homogeneity in the field of the proposed pair of conical coils compared with that achieved with a simple pair of Helmholtz loops or with a double solenoid. Moreover, a new design of a double pair of conical coils based on Braunbek's four loops is also proposed to achieve greater homogeneity. Regarding homogeneity, the rating of the analyzed configurations from best to worst is as follows: (1) double pair of conical coils, (2) pair of conical coils, (3) Braunbek's four loops, (4) Helmholtz pair, and (5) solenoid pair.

  6. Magnetic field homogeneity of a conical coaxial coil pair

    NASA Astrophysics Data System (ADS)

    Salazar, F. J.; Nieves, F. J.; Bayón, A.; Gascón, F.

    2017-09-01

    An analytical study of the magnetic field created by a double-conical conducting sheet is presented. The analysis is based on the expansion of the magnetic field in terms of Legendre polynomials. It is demonstrated analytically that the angle of the conical surface that produces a nearly homogeneous magnetic field coincides with that of a pair of loops that fulfills the Helmholtz condition. From the results obtained, we propose an electric circuit formed by pairs of isolated conducting loops tightly wound around a pair of conical surfaces, calculating numerically the magnetic field produced by this system and its heterogeneity. An experimental setup of the proposed circuit was constructed and its magnetic field was measured. The results were compared with those obtained by numerical calculation, finding a good agreement. The numerical results demonstrate a significant improvement in homogeneity in the field of the proposed pair of conical coils compared with that achieved with a simple pair of Helmholtz loops or with a double solenoid. Moreover, a new design of a double pair of conical coils based on Braunbek's four loops is also proposed to achieve greater homogeneity. Regarding homogeneity, the rating of the analyzed configurations from best to worst is as follows: (1) double pair of conical coils, (2) pair of conical coils, (3) Braunbek's four loops, (4) Helmholtz pair, and (5) solenoid pair.
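
    The two records above rest on the Helmholtz condition, which can be checked directly from the on-axis field of two coaxial current loops separated by one loop radius. The sketch below evaluates that field near the midpoint and its relative deviation from the central value; radius, current, and number of turns are illustrative values.

```python
# Minimal sketch of why the Helmholtz condition yields a homogeneous field:
# the on-axis field of two coaxial loops separated by one radius is nearly
# flat around the midpoint. Parameter values are assumptions.
import numpy as np

mu0 = 4e-7 * np.pi
R = 0.1          # loop radius, m
I = 1.0          # current, A
N = 100          # turns per loop

def loop_axis_field(z, z0):
    """Axial field of a single circular loop centered at z0 (Biot-Savart result)."""
    return mu0 * N * I * R**2 / (2.0 * (R**2 + (z - z0)**2) ** 1.5)

z = np.linspace(-0.02, 0.02, 201)                 # +/- 2 cm around the midpoint
b_helmholtz = loop_axis_field(z, -R / 2) + loop_axis_field(z, +R / 2)

b0 = loop_axis_field(0.0, -R / 2) + loop_axis_field(0.0, +R / 2)
print("field at center: %.4e T" % b0)
print("max relative deviation over +/-2 cm: %.2e" % (np.abs(b_helmholtz - b0).max() / b0))
```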

  7. Homogenization of CZ Si wafers by Tabula Rasa annealing

    NASA Astrophysics Data System (ADS)

    Meduňa, M.; Caha, O.; Kuběna, J.; Kuběna, A.; Buršík, J.

    2009-12-01

    The precipitation of interstitial oxygen in Czochralski-grown silicon has been investigated by infrared absorption spectroscopy, chemical etching, transmission electron microscopy and X-ray diffraction after application of a homogenization annealing process called Tabula Rasa. The influence of this homogenization step, consisting of short-time annealing at high temperature, has been observed for various temperatures and times. The experimental results involving the interstitial oxygen decay in Si wafers and the absorption spectra of SiOx precipitates during precipitation annealing at 1000 °C were compared with other techniques for various Tabula Rasa temperatures. The differences in oxygen precipitation, precipitate morphology and evolution of point defects between samples with and without Tabula Rasa applied are evident from all experimental techniques used. The results qualitatively correlate with predictions for the homogenization annealing process based on classical nucleation theory.

  8. Regional Homogeneity

    PubMed Central

    Jiang, Lili; Zuo, Xi-Nian

    2015-01-01

    Much effort has been made to understand the organizational principles of human brain function using functional magnetic resonance imaging (fMRI) methods, among which resting-state fMRI (rfMRI) is an increasingly recognized technique for measuring the intrinsic dynamics of the human brain. Functional connectivity (FC) with rfMRI is the most widely used method to describe remote or long-distance relationships in studies of cerebral cortex parcellation, interindividual variability, and brain disorders. In contrast, local or short-distance functional interactions, especially at a scale of millimeters, have rarely been investigated or systematically reviewed like remote FC, although some local FC algorithms have been developed and applied to the discovery of brain-based changes under neuropsychiatric conditions. To fill this gap between remote and local FC studies, this review will (1) briefly survey the history of studies on organizational principles of human brain function; (2) propose local functional homogeneity as a network centrality to characterize multimodal local features of the brain connectome; (3) render a neurobiological perspective on local functional homogeneity by linking its temporal, spatial, and individual variability to information processing, anatomical morphology, and brain development; and (4) discuss its role in performing connectome-wide association studies and identify relevant challenges, and recommend its use in future brain connectomics studies. PMID:26170004
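
    A minimal sketch of a local functional homogeneity measure in the spirit of regional homogeneity: Kendall's coefficient of concordance computed over the time series of a small voxel neighborhood. The synthetic time series below stand in for rfMRI data, and the neighborhood size and noise level are assumptions.

```python
# Minimal sketch of a ReHo-like local homogeneity index: Kendall's coefficient
# of concordance (KCC) across the time series of a voxel neighborhood.
import numpy as np
from scipy.stats import rankdata

def kendalls_w(timeseries):
    """KCC for an array of shape (n_voxels, n_timepoints); 1 = perfectly concordant."""
    k, n = timeseries.shape
    ranks = np.vstack([rankdata(ts) for ts in timeseries])   # rank each voxel's time series
    rank_sums = ranks.sum(axis=0)                             # summed ranks per time point
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (k**2 * (n**3 - n))

rng = np.random.default_rng(3)
shared = rng.normal(size=200)                                # common local signal
neighborhood = shared + 0.5 * rng.normal(size=(27, 200))     # 27-voxel cube around a center voxel
print("ReHo-like KCC of the neighborhood:", round(kendalls_w(neighborhood), 3))
```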

  9. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    NASA Astrophysics Data System (ADS)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality that are related to the moisture content of the aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that unaccounted-for moisture in the aggregates adversely affects the concrete mixture homogeneity and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed, in which the kneading-and-mixing machinery is operated under continuous automatic control of homogeneity. Theoretical underpinnings of the homogeneity control are presented, which relate homogeneity to a change in the frequency of the vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system for regulating the water supply is determined by the change in concrete mixture homogeneity during continuous mixing of the components. The following technical means for implementing the automatic control have been chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To assess the quality of the automatic control, the system is described by a block diagram with transfer functions that determine the ACS operation in the transient dynamic mode.
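
    A minimal sketch of the feedback idea described above, under assumed dynamics: a homogeneity index derived from the mixer's vibration signal is driven toward a setpoint by a simple proportional-integral adjustment of the water feed. The plant model, gains, and signal names are illustrative and are not taken from the article.

```python
# Minimal sketch of homogeneity-based water dosing control (illustrative only):
# a PI controller trims the water feed until a homogeneity index reaches its
# setpoint. The first-order plant model and all constants are assumptions.
def simulate_water_control(steps=60, setpoint=0.90, kp=0.8, ki=0.2, dt=1.0):
    water_rate = 0.0          # extra water dosing, arbitrary units
    homogeneity = 0.60        # initial homogeneity index from the vibro-acoustic sensor
    integral = 0.0
    history = []
    for _ in range(steps):
        # assumed first-order plant: homogeneity rises with water dosing, capped at 1.0
        homogeneity += dt * (0.05 * water_rate - 0.02 * homogeneity)
        homogeneity = min(homogeneity, 1.0)
        error = setpoint - homogeneity
        integral += error * dt
        water_rate = max(kp * error + ki * integral, 0.0)   # no negative dosing
        history.append((round(homogeneity, 3), round(water_rate, 3)))
    return history

print(simulate_water_control()[-5:])   # last few (homogeneity, water_rate) samples
```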

  10. Homogeneous sonophotolysis of food processing industry wastewater: Study of synergistic effects, mineralization and toxicity removal.

    PubMed

    Durán, A; Monteagudo, J M; Sanmartín, I; Gómez, P

    2013-03-01

    The mineralization of industrial wastewater coming from the food industry using an emerging homogeneous sonophotolytic oxidation process was evaluated as an alternative to, or a rapid pretreatment step for, conventional anaerobic digestion, with the aim of considerably reducing the total treatment time. At the selected operating conditions ([H2O2] = 11,750 ppm, pH = 8, amplitude = 50%, pulse length (cycles) = 1), 60% of the total organic carbon (TOC) is removed after 60 min and 98% after 180 min when treating an industrial effluent with 2114 ppm of TOC. This process completely removed the toxicity generated during storage or due to intermediate compounds. An important synergistic effect between sonolysis and photolysis (H2O2/UV) was observed. Thus the sonophotolysis (ultrasound/H2O2/UV) technique significantly increases TOC removal when compared with each individual process. Finally, a preliminary economic analysis confirms that sonophotolysis with H2O2 and pretreated water is a profitable system when compared with the same process without ultrasound waves and with no pretreatment. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. Cryogenic homogenization and sampling of heterogeneous multi-phase feedstock

    DOEpatents

    Doyle, Glenn Michael; Ideker, Virgene Linda; Siegwarth, James David

    2002-01-01

    An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock, reducing the temperature of the feedstock to a temperature below a critical temperature, reducing the size of the feedstock components, blending the reduced-size feedstock to form a homogeneous mixture, and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (-196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.

  12. Process to create simulated lunar agglutinate particles

    NASA Technical Reports Server (NTRS)

    Gustafson, Robert J. (Inventor); Gustafson, Marty A. (Inventor); White, Brant C. (Inventor)

    2011-01-01

    A method of creating simulated agglutinate particles by applying a heat source sufficient to partially melt a raw material is provided. The raw material is preferably any lunar soil simulant, crushed mineral, mixture of crushed minerals, or similar material, and the heat source creates localized heating of the raw material.

  13. Low-gravity homogenization and solidification of aluminum antimonide. [Apollo-Soyuz test project

    NASA Technical Reports Server (NTRS)

    Ang, C.-Y.; Lacy, L. L.

    1976-01-01

    The III-V semiconducting compound AlSb shows promise as a highly efficient solar cell material, but it has not been commercially exploited because of difficulties in compound synthesis. Liquid state homogenization and solidification of AlSb were carried out in the Apollo-Soyuz Test Project Experiment MA-044 in the hope that compositional homogeneity would be improved by negating the large density difference between the two constituents. Post-flight analysis and comparative characterization of the space-processed and ground-processed samples indicate that there are major homogeneity improvements in the low-gravity solidified material.

  14. Volume Averaging Study of the Capacitive Deionization Process in Homogeneous Porous Media

    DOE PAGES

    Gabitto, Jorge; Tsouris, Costas

    2015-05-05

    Ion storage in porous electrodes is important in applications such as energy storage by supercapacitors, water purification by capacitive deionization, extraction of energy from a salinity difference, and heavy ion purification. In this paper, a model is presented to simulate the charge process in homogeneous porous media comprising big pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Transport between the electrolyte solution and the charged wall is described using the Gouy–Chapman–Stern model. The effective transport parameters for isotropic porous media are calculated by solving the corresponding closure problems. Finally, the source terms that appear in the averaged equations are calculated using numerical computations. An alternative way to deal with the source terms is proposed.
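
    A minimal sketch of the Gouy-Chapman-Stern picture invoked above for the electrolyte/wall interface: a Stern (compact) layer in series with a diffuse layer whose charge follows the Gouy-Chapman relation. This is textbook double-layer physics with assumed parameter values, not the paper's volume-averaged model.

```python
# Minimal sketch of the Gouy-Chapman-Stern interface: split a total electrode
# potential between a Stern layer and a diffuse layer and report the stored
# charge. Electrolyte concentration and Stern capacitance are assumptions.
import numpy as np

e = 1.602e-19        # elementary charge, C
kB = 1.381e-23       # Boltzmann constant, J/K
eps0 = 8.854e-12     # vacuum permittivity, F/m
T = 298.0            # temperature, K
eps_r = 78.5         # relative permittivity of water
c0 = 10.0            # bulk salt concentration, mol/m^3 (10 mM), 1:1 electrolyte
n0 = c0 * 6.022e23   # number density, 1/m^3
C_stern = 0.2        # assumed Stern-layer capacitance, F/m^2

def diffuse_charge(psi_d):
    """Gouy-Chapman surface charge density (C/m^2) for diffuse-layer potential psi_d (V)."""
    return np.sqrt(8.0 * n0 * eps_r * eps0 * kB * T) * np.sinh(e * psi_d / (2.0 * kB * T))

def split_potential(psi_total, tol=1e-9):
    # solve psi_d + diffuse_charge(psi_d)/C_stern = psi_total by bisection
    lo, hi = 0.0, psi_total
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid + diffuse_charge(mid) / C_stern > psi_total:
            hi = mid
        else:
            lo = mid
    psi_d = 0.5 * (lo + hi)
    return psi_d, diffuse_charge(psi_d)

psi_d, sigma = split_potential(0.2)           # 0.2 V total double-layer potential
print("diffuse-layer potential: %.3f V, stored charge: %.4f C/m^2" % (psi_d, sigma))
```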

  15. Ultrasound and polar homogeneous reactions.

    PubMed

    Tuulmets, A

    1997-04-01

    The effect of ultrasound on the rates of homogeneous heterolytic reactions not switched to a free radical pathway can be explained by the perturbation of the molecular organization of, or the solvation in, the reacting system. A quantitative analysis of the sonochemical acceleration on the basis of the microreactor concept was carried out. It was found that (1) the Diels-Alder reaction cannot be accelerated by ultrasound except when SET or free radical processes are promoted, (2) the rectified diffusion during cavitation cannot be responsible for the acceleration of reactions, and (3) the sonochemical acceleration of polar homogeneous reactions takes place in the bulk reaction medium. This implies the presence of a 'sound-field' sonochemistry besides the 'hot-spot' sonochemistry. The occurrence of a sonochemical deceleration effect can be predicted.

  16. Homogenization patterns of the world’s freshwater fish faunas

    PubMed Central

    Villéger, Sébastien; Blanchet, Simon; Beauchard, Olivier; Oberdorff, Thierry; Brosse, Sébastien

    2011-01-01

    The world is currently undergoing an unprecedented decline in biodiversity, which is mainly attributable to human activities. For instance, nonnative species introduction, combined with the extirpation of native species, affects biodiversity patterns, notably by increasing the similarity among species assemblages. This biodiversity change, called taxonomic homogenization, has rarely been assessed at the world scale. Here, we fill this gap by assessing the current homogenization status of one of the most diverse vertebrate groups (i.e., freshwater fishes) at global and regional scales. We demonstrate that current homogenization of the freshwater fish faunas is still low at the world scale (0.5%) but reaches substantial levels (up to 10%) in some highly invaded river basins from the Nearctic and Palearctic realms. In these realms experiencing high changes, nonnative species introductions rather than native species extirpations drive taxonomic homogenization. Our results suggest that the “Homogocene era” is not yet the case for freshwater fish fauna at the worldwide scale. However, the distressingly high level of homogenization noted for some biogeographical realms stresses the need for further understanding of the ecological consequences of homogenization processes. PMID:22025692

  17. Homogenization of tissues via picosecond-infrared laser (PIRL) ablation: Giving a closer view on the in-vivo composition of protein species as compared to mechanical homogenization.

    PubMed

    Kwiatkowski, M; Wurlitzer, M; Krutilin, A; Kiani, P; Nimer, R; Omidi, M; Mannaa, A; Bussmann, T; Bartkowiak, K; Kruber, S; Uschold, S; Steffen, P; Lübberstedt, J; Küpker, N; Petersen, H; Knecht, R; Hansen, N O; Zarrine-Afsar, A; Robertson, W D; Miller, R J D; Schlüter, H

    2016-02-16

    Posttranslational modifications and proteolytic processing regulate almost all physiological processes. Dysregulation can potentially result in pathologic protein species causing diseases. Thus, tissue species proteomes of diseased individuals provide diagnostic information. Since the composition of tissue proteomes can rapidly change during tissue homogenization by the action of enzymes released from their compartments, disease specific protein species patterns can vanish. Recently, we described a novel, ultrafast and soft method for cold vaporization of tissue via desorption by impulsive vibrational excitation (DIVE) using a picosecond-infrared-laser (PIRL). Given that DIVE extraction may provide improved access to the original composition of protein species in tissues, we compared the proteome composition of tissue protein homogenates after DIVE homogenization with conventional homogenizations. A higher number of intact protein species was observed in DIVE homogenates. Due to the ultrafast transfer of proteins from tissues via gas phase into frozen condensates of the aerosols, intact protein species were exposed to a lesser extent to enzymatic degradation reactions compared with conventional protein extraction. In addition, total yield of the number of proteins is higher in DIVE homogenates, because they are very homogenous and contain almost no insoluble particles, allowing direct analysis with subsequent analytical methods without the necessity of centrifugation. Enzymatic protein modifications during tissue homogenization are responsible for changes of the in-vivo protein species composition. Cold vaporization of tissues by PIRL-DIVE is comparable with taking a snapshot at the time of the laser irradiation of the dynamic changes that occur continuously under in-vivo conditions. At that time point all biomolecules are transferred into an aerosol, which is immediately frozen. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Homogenization of tissues via picosecond-infrared laser (PIRL) ablation: Giving a closer view on the in-vivo composition of protein species as compared to mechanical homogenization

    PubMed Central

    Kwiatkowski, M.; Wurlitzer, M.; Krutilin, A.; Kiani, P.; Nimer, R.; Omidi, M.; Mannaa, A.; Bussmann, T.; Bartkowiak, K.; Kruber, S.; Uschold, S.; Steffen, P.; Lübberstedt, J.; Küpker, N.; Petersen, H.; Knecht, R.; Hansen, N.O.; Zarrine-Afsar, A.; Robertson, W.D.; Miller, R.J.D.; Schlüter, H.

    2016-01-01

    Posttranslational modifications and proteolytic processing regulate almost all physiological processes. Dysregulation can potentially result in pathologic protein species causing diseases. Thus, tissue species proteomes of diseased individuals provide diagnostic information. Since the composition of tissue proteomes can rapidly change during tissue homogenization by the action of enzymes released from their compartments, disease specific protein species patterns can vanish. Recently, we described a novel, ultrafast and soft method for cold vaporization of tissue via desorption by impulsive vibrational excitation (DIVE) using a picosecond-infrared-laser (PIRL). Given that DIVE extraction may provide improved access to the original composition of protein species in tissues, we compared the proteome composition of tissue protein homogenates after DIVE homogenization with conventional homogenizations. A higher number of intact protein species was observed in DIVE homogenates. Due to the ultrafast transfer of proteins from tissues via gas phase into frozen condensates of the aerosols, intact protein species were exposed to a lesser extent to enzymatic degradation reactions compared with conventional protein extraction. In addition, total yield of the number of proteins is higher in DIVE homogenates, because they are very homogenous and contain almost no insoluble particles, allowing direct analysis with subsequent analytical methods without the necessity of centrifugation. Biological significance Enzymatic protein modifications during tissue homogenization are responsible for changes of the in-vivo protein species composition. Cold vaporization of tissues by PIRL-DIVE is comparable with taking a snapshot at the time of the laser irradiation of the dynamic changes that occur continuously under in-vivo conditions. At that time point all biomolecules are transferred into an aerosol, which is immediately frozen. PMID:26778141

  19. Effect of homogenization and pasteurization on the structure and stability of whey protein in milk.

    PubMed

    Qi, Phoebe X; Ren, Daxi; Xiao, Yingping; Tomasula, Peggy M

    2015-05-01

    The effect of homogenization alone or in combination with high-temperature, short-time (HTST) pasteurization or UHT processing on the whey fraction of milk was investigated using highly sensitive spectroscopic techniques. In pilot plant trials, 1-L quantities of whole milk were homogenized in a 2-stage homogenizer at 35°C (6.9 MPa/10.3 MPa) and, along with skim milk, were subjected to HTST pasteurization (72°C for 15 s) or UHT processing (135°C for 2 s). Other whole milk samples were processed using homogenization followed by either HTST pasteurization or UHT processing. The processed skim and whole milk samples were centrifuged further to remove fat and then acidified to pH 4.6 to isolate the corresponding whey fractions, and centrifuged again. The whey fractions were then purified using dialysis and investigated using the circular dichroism, Fourier transform infrared, and Trp intrinsic fluorescence spectroscopic techniques. Results demonstrated that homogenization combined with UHT processing of milk caused not only changes in protein composition but also significant secondary structural loss, particularly in the amounts of apparent antiparallel β-sheet and α-helix, as well as diminished tertiary structural contact. In both cases of homogenization alone and followed by HTST treatments, neither caused appreciable chemical changes, nor remarkable secondary structural reduction. But disruption was evident in the tertiary structural environment of the whey proteins due to homogenization of whole milk as shown by both the near-UV circular dichroism and Trp intrinsic fluorescence. In-depth structural stability analyses revealed that even though processing of milk imposed little impairment on the secondary structural stability, the tertiary structural stability of whey protein was altered significantly. The following order was derived based on these studies: raw whole>HTST, homogenized, homogenized and pasteurized>skimmed and pasteurized, and skimmed UHT>homogenized

  20. The Challenges of Creating a Benchmarking Process for Administrative and Support Services

    ERIC Educational Resources Information Center

    Manning, Terri M.

    2007-01-01

    In the current climate of emphasis on outcomes assessment, colleges and universities are working diligently to create assessment processes for student learning outcomes, competence in general education, student satisfaction with services, and electronic tracking media to document evidence of competence in graduates. Benchmarking has become a…

  1. Homogenization of Mammalian Cells.

    PubMed

    de Araújo, Mariana E G; Lamberti, Giorgia; Huber, Lukas A

    2015-11-02

    Homogenization is the name given to the methodological steps necessary for releasing organelles and other cellular constituents as a free suspension of intact individual components. Most homogenization procedures used for mammalian cells (e.g., cavitation pump and Dounce homogenizer) rely on mechanical force to break the plasma membrane and may be supplemented with osmotic or temperature alterations to facilitate membrane disruption. In this protocol, we describe a syringe-based homogenization method that does not require specialized equipment, is easy to handle, and gives reproducible results. The method may be adapted for cells that require hypotonic shock before homogenization. We routinely use it as part of our workflow to isolate endocytic organelles from mammalian cells. © 2015 Cold Spring Harbor Laboratory Press.

  2. Revisiting Shock Initiation Modeling of Homogeneous Explosives

    NASA Astrophysics Data System (ADS)

    Partom, Yehuda

    2013-04-01

    Shock initiation of homogeneous explosives has been a subject of research since the 1960s, with neat and sensitized nitromethane as the main materials for experiments. A shock initiation model of homogeneous explosives was established in the early 1960s. It involves a thermal explosion event at the shock entrance boundary, which develops into a superdetonation that overtakes the initial shock. In recent years, Sheffield and his group, using accurate experimental tools, were able to observe details of the buildup of the superdetonation. There are many papers on modeling shock initiation of heterogeneous explosives, but only a few on modeling shock initiation of homogeneous explosives. In this article, bulk reaction reactive flow equations are used to model homogeneous shock initiation in an attempt to reproduce the experimental data of Sheffield and his group. It was possible to reproduce the main features of the shock initiation process, including thermal explosion, superdetonation, input shock overtake, overdriven detonation after overtake, and the beginning of decay toward Chapman-Jouguet (CJ) detonation. The time to overtake (TTO) as a function of input pressure was also calculated and compared to the experimental TTO.
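
    A minimal sketch of the thermal-explosion ingredient of the model class described above: an adiabatic bulk Arrhenius reaction whose induction time falls sharply with post-shock temperature. The kinetic parameters are illustrative assumptions, not calibrated values for nitromethane, and no hydrodynamics is included.

```python
# Minimal sketch of an adiabatic bulk-reaction thermal explosion: integrate a
# first-order Arrhenius reaction and report the induction (runaway) time for
# several assumed post-shock temperatures.
import numpy as np
from scipy.integrate import solve_ivp

Q = 4.3e6             # heat of reaction, J/kg
cv = 1700.0           # specific heat, J/(kg K)
Z = 2.6e9             # pre-exponential factor, 1/s
Ea_over_R = 14000.0   # activation temperature, K

def induction_time(T_shock):
    """Time for runaway of an adiabatic first-order Arrhenius reaction."""
    def rhs(t, y):
        T, lam = y                                  # temperature and reaction progress
        rate = Z * (1.0 - lam) * np.exp(-Ea_over_R / T)
        return [Q / cv * rate, rate]

    def runaway(t, y):                              # stop when nearly fully reacted
        return y[1] - 0.99
    runaway.terminal = True

    sol = solve_ivp(rhs, (0.0, 1.0), [T_shock, 0.0], events=runaway, max_step=1e-4)
    return sol.t[-1]

for T in (900.0, 1000.0, 1100.0):
    print("post-shock T = %4.0f K  ->  induction time ~ %.2e s" % (T, induction_time(T)))
```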

  3. Homogeneous Catalysis by Transition Metal Compounds.

    ERIC Educational Resources Information Center

    Mawby, Roger

    1988-01-01

    Examines four processes involving homogeneous catalysis which highlight the contrast between the simplicity of the overall reaction and the complexity of the catalytic cycle. Describes how catalysts provide circuitous routes in which all energy barriers are relatively low rather than lowering the activation energy for a single step reaction.…

  4. Numerical Generation of Dense Plume Fingers in Unsaturated Homogeneous Porous Media

    NASA Astrophysics Data System (ADS)

    Cremer, C.; Graf, T.

    2012-04-01

    In nature, the migration of dense plumes typically results in the formation of vertical plume fingers. Flow direction in fingers is downwards, which is counterbalanced by upwards flow of less dense fluid between fingers. In heterogeneous media, heterogeneity itself is known to trigger the formation of fingers. In homogeneous media, however, fingers are also created even when all grains have the same diameter. The reason is that pore-scale heterogeneity leading to different flow velocities also exists in homogeneous media, due to two effects: (i) Grains of identical size may randomly arrange differently, e.g. forming tetrahedrons, hexahedrons or octahedrons. Each arrangement creates pores of varying diameter, thus resulting in different average flow velocities. (ii) Random variations of solute concentration lead to varying buoyancy effects, thus also resulting in different velocities. As a continuation of previous efforts to incorporate pore-scale heterogeneity into fully saturated soil such that dense fingers are realistically generated (Cremer and Graf, EGU Assembly, 2011), the current paper extends the research scope from saturated to unsaturated soil. Perturbation methods are evaluated by numerically re-simulating a laboratory-scale experiment of plume transport in homogeneous unsaturated sand (Simmons et al., Transp. Porous Media, 2002). The following five methods are discussed: (i) homogeneous sand, (ii) initial perturbation of solute concentration, (iii) spatially random, time-constant perturbation of the solute source, (iv) spatially and temporally random noise of simulated solute concentration, and (v) a random K-field that introduces physically insignificant but numerically significant heterogeneity. Results demonstrate that, as opposed to saturated flow, perturbing the solute source will not result in plume fingering. This is because the location of the perturbed source (domain top) and the location of finger generation (groundwater surface) do not

  5. Light beam shaping and homogenization (LSBH) by irregular microlens structure for medical applications

    NASA Astrophysics Data System (ADS)

    Semchishen, Vladimir A.; Mrochen, Michael; Seminogov, Vladimir N.; Panchenko, Vladislav Y.; Seiler, Theo

    1998-04-01

    Purpose: The increasing interest in a homogeneous Gaussian light beam profile for applications in ophthalmology, e.g. photorefractive keratectomy (PRK), calls for simple optical systems with low energy losses. Therefore, we developed the Light Shaping Beam Homogenizer (LSBH), which works from the UV up to the mid-IR. Method: The irregular microlens structure on a quartz surface was fabricated using photolithography, chemical etching and chemical polishing. This created a three-dimensional structure on the quartz substrate characterized, in the case of a Gaussian beam, by a random distribution of the tilts of the individual irregularities. The LSBH was realized for the 193 nm and the 2.94 micrometer wavelengths. Simulation results obtained by 3-D analysis for an arbitrary incident light beam were compared to experimental results. Results: The correlation to a numerical Gaussian fit is better than 94% with high uniformity for an incident beam with an intensity modulation of nearly 100%. In the far field, the cross section of the beam always shows rotational symmetry. Transmittance and damage threshold of the LSBH depend only on the substrate characteristics. Conclusions: Considering our experimental and simulation results, it is possible to control the angular distribution of the beam intensity after the LSBH with higher efficiency compared with diffractive or holographic optical elements.

  6. Multiple-pass high-pressure homogenization of milk for the development of pasteurization-like processing conditions.

    PubMed

    Ruiz-Espinosa, H; Amador-Espejo, G G; Barcenas-Pozos, M E; Angulo-Guerrero, J O; Garcia, H S; Welti-Chanes, J

    2013-02-01

    Multiple-pass ultrahigh pressure homogenization (UHPH) was used for reducing microbial population of both indigenous spoilage microflora in whole raw milk and a baroresistant pathogen (Staphylococcus aureus) inoculated in whole sterile milk to define pasteurization-like processing conditions. Response surface methodology was followed and multiple response optimization of UHPH operating pressure (OP) (100, 175, 250 MPa) and number of passes (N) (1-5) was conducted through overlaid contour plot analysis. Increasing OP and N had a significant effect (P < 0.05) on microbial reduction of both spoilage microflora and Staph. aureus in milk. Optimized UHPH processes (five 202-MPa passes; four 232-MPa passes) defined a region where a 5-log10 reduction of total bacterial count of milk and a baroresistant pathogen are attainable, as a requisite parameter for establishing an alternative method of pasteurization. Multiple-pass UHPH optimized conditions might help in producing safe milk without the detrimental effects associated with thermal pasteurization. © 2012 The Society for Applied Microbiology.
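
    As an illustration of the response-surface step described above, the following is a minimal sketch that fits a second-order polynomial in operating pressure and number of passes and scans for conditions predicted to reach a 5-log10 reduction. The data points are fabricated for illustration and are not from the cited study.

    ```python
    # Hedged sketch of a response-surface fit: second-order polynomial in
    # operating pressure (OP, MPa) and number of passes (N). The data points
    # below are made up for illustration only.
    import numpy as np

    # columns: OP, N, observed log10 reduction (hypothetical)
    data = np.array([
        [100, 1, 1.2], [100, 3, 2.0], [100, 5, 2.6],
        [175, 1, 2.1], [175, 3, 3.4], [175, 5, 4.2],
        [250, 1, 3.0], [250, 3, 4.6], [250, 5, 5.5],
    ])
    OP, N, y = data[:, 0], data[:, 1], data[:, 2]

    # design matrix for y = b0 + b1*OP + b2*N + b3*OP*N + b4*OP^2 + b5*N^2
    X = np.column_stack([np.ones_like(OP), OP, N, OP * N, OP**2, N**2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    def predict(op, n):
        return beta @ np.array([1.0, op, n, op * n, op**2, n**2])

    # scan the (OP, N) grid for conditions predicted to reach a 5-log10 reduction
    for op in range(100, 251, 25):
        for n in range(1, 6):
            if predict(op, n) >= 5.0:
                print(f"predicted >= 5-log10 reduction at OP={op} MPa, N={n}")
    ```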

  7. Computationally Probing the Performance of Hybrid, Heterogeneous, and Homogeneous Iridium-Based Catalysts for Water Oxidation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    García-Melchor, Max; Vilella, Laia; López, Núria

    2016-04-29

    An attractive strategy to improve the performance of water oxidation catalysts would be to anchor a homogeneous molecular catalyst on a heterogeneous solid surface to create a hybrid catalyst. The idea of this combined system is to take advantage of the individual properties of each of the two catalyst components. We use Density Functional Theory to determine the stability and activity of a model hybrid water oxidation catalyst consisting of a dimeric Ir complex attached to the IrO2(110) surface through two oxygen atoms. We find that homogeneous catalysts can be bound to their matrix oxide without losing significant activity. Hence, designing hybrid systems that benefit from both the highly tunable activity of homogeneous catalysts and the stability of heterogeneous systems seems feasible.

  8. Utilizing Hierarchical Clustering to improve Efficiency of Self-Organizing Feature Map to Identify Hydrological Homogeneous Regions

    NASA Astrophysics Data System (ADS)

    Farsadnia, Farhad; Ghahreman, Bijan

    2016-04-01

    The identification of hydrologically homogeneous groups is considered both fundamental and applied research in hydrology. Clustering methods are among the conventional methods for assessing hydrologically homogeneous regions. Recently, the Self-Organizing feature Map (SOM) method has been applied in some studies; however, the main problem with this method is interpreting its output map. Therefore, SOM output is used as input to other clustering algorithms. The aim of this study is to apply a two-level Self-Organizing feature map and the Ward hierarchical clustering method to determine hydrologically homogeneous regions in the North and Razavi Khorasan provinces. First, principal component analysis was used to reduce the dimension of the SOM input matrix; the SOM was then used to form a two-dimensional feature map. To determine homogeneous regions for flood frequency analysis, the SOM output nodes were used as input to the Ward method. Generally, the regions identified by clustering algorithms are not statistically homogeneous; consequently, they have to be adjusted to improve their homogeneity. After adjustment of the regions using L-moment tests, five hydrologically homogeneous regions were identified. Finally, the adjusted regions were created by a two-level SOM, and the best regional distribution function and associated parameters were selected by the L-moment approach. The results showed that the combination of self-organizing maps and Ward hierarchical clustering with principal components as input is more effective than the hierarchical method with principal components or standardized inputs for delineating hydrologically homogeneous regions.
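
    A minimal sketch of the two-level workflow described above (a SOM followed by Ward clustering of the SOM nodes), assuming the third-party package minisom is available; the catchment attributes are random stand-ins rather than data from the cited study.

    ```python
    # Hedged sketch of two-level clustering: a self-organizing map (level 1)
    # followed by Ward hierarchical clustering of the SOM nodes (level 2).
    # Requires the third-party package `minisom`; catchment attributes here
    # are random stand-ins, not data from the cited study.
    import numpy as np
    from minisom import MiniSom
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    attributes = rng.normal(size=(60, 6))          # 60 catchments, 6 attributes
    attributes = (attributes - attributes.mean(0)) / attributes.std(0)

    # Level 1: train a small SOM and collect its codebook vectors
    som = MiniSom(5, 5, attributes.shape[1], sigma=1.0, learning_rate=0.5,
                  random_seed=0)
    som.train_random(attributes, 1000)
    codebook = som.get_weights().reshape(-1, attributes.shape[1])

    # Level 2: Ward clustering of the SOM nodes into candidate regions
    Z = linkage(codebook, method="ward")
    node_labels = fcluster(Z, t=5, criterion="maxclust")

    # Map each catchment to the region of its best-matching SOM node
    region = np.array([node_labels[np.ravel_multi_index(som.winner(x), (5, 5))]
                       for x in attributes])
    print(np.bincount(region)[1:])                  # catchments per candidate region
    ```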

  9. Homogenization and texture development in rapidly solidified AZ91E consolidated by Shear Assisted Processing and Extrusion (ShAPE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Overman, N. R.; Whalen, S. A.; Bowden, M. E.

    Shear Assisted Processing and Extrusion (ShAPE), a novel processing route that combines high shear and extrusion conditions, was evaluated as a processing method to densify melt-spun magnesium alloy (AZ91E) flake materials. This study illustrates the microstructural regimes and transitions in crystallographic texture that occur as a result of applying simultaneous linear and rotational shear during extrusion. Characterization of the flake precursor and extruded tube was performed using scanning and transmission electron microscopy, x-ray diffraction and microindentation techniques. Results show a unique transition in the orientation of basal texture development. Despite the high temperatures involved during processing, uniform grain refinement and material homogenization are observed. These results forecast the ability to implement the ShAPE processing approach for a broader range of materials with novel microstructures and high performance.

  10. Optimizing homogenization by chaotic unmixing?

    NASA Astrophysics Data System (ADS)

    Weijs, Joost; Bartolo, Denis

    2016-11-01

    A number of industrial processes rely on the homogeneous dispersion of non-Brownian particles in a viscous fluid. An ideal mixing would yield a so-called hyperuniform particle distribution. Such configurations are characterized by density fluctuations that grow more slowly than the standard √N fluctuations. Even though such distributions have been found in several natural structures, e.g. retina receptors in birds, they have remained out of experimental reach until very recently. Over the last 5 years, independent experiments and numerical simulations have shown that periodically driven suspensions can self-assemble hyperuniformly. Simple as the recipe may be, it has one important disadvantage. The emergence of hyperuniform states co-occurs with a critical phase transition from reversible to non-reversible particle dynamics. As a consequence, the homogenization dynamics occurs over a time that diverges with the system size (critical slowing down). Here, we discuss how this process can be sped up by exploiting the stirring properties of chaotic advection. Among the questions that we answer are: What are the physical mechanisms in a chaotic flow that are relevant for hyperuniformity? How can we tune the flow parameters so as to obtain optimal hyperuniformity in the fastest way? JW acknowledges funding by NWO (Netherlands Organisation for Scientific Research) through a Rubicon Grant.
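
    The notion of hyperuniformity invoked above can be made concrete with a small sketch that estimates the number variance of a point pattern in windows of increasing size: for a fully random (Poisson) pattern the variance grows like the mean count, whereas hyperuniform patterns show slower growth. The particle positions below are synthetic.

    ```python
    # Hedged sketch: estimate the number variance of a 2D point pattern in
    # circular windows of radius R. For a Poisson (fully random) pattern,
    # var(N) ~ <N(R)>; hyperuniform patterns show sub-Poissonian growth.
    # Points here are random stand-ins, not experimental data.
    import numpy as np

    rng = np.random.default_rng(1)
    pts = rng.uniform(0.0, 1.0, size=(5000, 2))     # synthetic particle positions

    def number_variance(points, radius, n_windows=2000, rng=rng):
        # sample window centres away from the box edges to avoid clipping
        lo, hi = radius, 1.0 - radius
        centres = rng.uniform(lo, hi, size=(n_windows, 2))
        counts = np.array([np.count_nonzero(
            np.sum((points - c) ** 2, axis=1) < radius ** 2) for c in centres])
        return counts.mean(), counts.var()

    for R in (0.02, 0.05, 0.1):
        mean_n, var_n = number_variance(pts, R)
        print(f"R={R:0.2f}  <N>={mean_n:7.1f}  var(N)={var_n:7.1f}  "
              f"var/<N>={var_n / mean_n:0.2f}")   # ~1 for Poisson, <1 if hyperuniform
    ```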

  11. Homogeneous crystal nucleation in polymers.

    PubMed

    Schick, C; Androsch, R; Schmelzer, J W P

    2017-11-15

    The pathway of crystal nucleation significantly influences the structure and properties of semi-crystalline polymers. Crystal nucleation is normally heterogeneous at low supercooling, and homogeneous at high supercooling, of the polymer melt. Homogeneous nucleation in bulk polymers has been, so far, hardly accessible experimentally, and was even doubted to occur at all. This topical review summarizes experimental findings on homogeneous crystal nucleation in polymers. Recently developed fast scanning calorimetry, with cooling and heating rates up to 10^6 K s^-1, allows for detailed investigations of nucleation near and even below the glass transition temperature, including analysis of nuclei stability. As for other materials, the maximum homogeneous nucleation rate for polymers is located close to the glass transition temperature. In the experiments discussed here, it is shown that polymer nucleation is homogeneous at such temperatures. Homogeneous nucleation in polymers is discussed in the framework of the classical nucleation theory. The majority of our observations are consistent with the theory. The discrepancies may guide further research, particularly experiments to progress theoretical development. Progress in the understanding of homogeneous nucleation is much needed, since most of the modelling approaches dealing with polymer crystallization exclusively consider homogeneous nucleation. This is also the basis for advancing theoretical approaches to the much more complex phenomena governing heterogeneous nucleation.
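
    For orientation, the classical nucleation theory framework referred to above is commonly summarized by the following generic expressions, quoted here as textbook forms rather than results of the cited review:

    ```latex
    % Classical nucleation theory (generic textbook form, for orientation only)
    J = J_0 \exp\!\left(-\frac{\Delta G^{*}}{k_{\mathrm B}T}\right),
    \qquad
    \Delta G^{*} = \frac{16\pi\,\sigma^{3}}{3\,(\Delta g_v)^{2}},
    \qquad
    r^{*} = \frac{2\sigma}{|\Delta g_v|},
    ```

    where σ is the crystal-melt interfacial free energy, Δg_v the bulk free-energy difference per unit volume between melt and crystal, r* the critical nucleus radius, and J_0 a kinetic prefactor.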

  12. Understanding homogeneous nucleation in solidification of aluminum by molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Mahata, Avik; Asle Zaeem, Mohsen; Baskes, Michael I.

    2018-02-01

    Homogeneous nucleation from aluminum (Al) melt was investigated by million-atom molecular dynamics simulations utilizing the second nearest neighbor modified embedded atom method potentials. The natural spontaneous homogeneous nucleation from the Al melt was produced without any influence of pressure, free surface effects and impurities. Initially, isothermal crystal nucleation from undercooled melt was studied at different constant temperatures, and later superheated Al melt was quenched with different cooling rates. The crystal structure of nuclei, critical nucleus size, critical temperature for homogeneous nucleation, induction time, and nucleation rate were determined. The quenching simulations clearly revealed three temperature regimes: sub-critical nucleation, super-critical nucleation, and solid-state grain growth regimes. The main crystalline phase was identified as face-centered cubic, but a hexagonal close-packed (hcp) and an amorphous solid phase were also detected. The hcp phase was created due to the formation of stacking faults during solidification of the Al melt. By slowing down the cooling rate, the volume fraction of the hcp and amorphous phases decreased. After the simulation box was completely solid, grain growth was simulated and the grain growth exponent was determined for different annealing temperatures.

  13. Investigations of effect of phase change mass transfer rate on cavitation process with homogeneous relaxation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Zhixia; Zhang, Liang; Saha, Kaushik

    Super-high fuel injection pressures and micro-sized nozzle orifices have been an important development trend for fuel injection systems. Accordingly, the cavitation transient process, fuel compressibility, the amount of noncondensable gas in the fuel and cavitation erosion have attracted more attention. Based on the fact that cavitation is in itself a kind of thermodynamic phase-change process, this paper takes the perspective of the cavitation phase-change mass transfer process to analyze the above-mentioned phenomena. Two-phase cavitating turbulent flow simulations using a VOF approach coupled with the HRM cavitation model and U-RANS with the standard k-ε turbulence model were performed to investigate the cavitation phase-change mass transfer process. It is concluded that the mass transfer time scale coefficient in the Homogeneous Relaxation Model (HRM), which represents the mass transfer rate, should be as small as possible while keeping the solver stable. At very fast mass transfer rates, the phase change occurs at a very thin interface between the liquid and vapor phases, and condensation is more localized, which is expected to contribute to more serious cavitation erosion. Both the initial noncondensable gas in the fuel and the fuel compressibility can accelerate the cavitation mass transfer process.
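
    The Homogeneous Relaxation Model mentioned above is usually written as a linear relaxation of the vapour mass fraction toward its equilibrium value; a generic form (not taken verbatim from the cited study) is:

    ```latex
    % Homogeneous Relaxation Model (generic form)
    \frac{\mathrm{D}x}{\mathrm{D}t} = \frac{\bar{x} - x}{\theta},
    ```

    where x is the instantaneous vapour mass fraction, x̄ the equilibrium quality and θ the relaxation time scale; the mass-transfer time scale coefficient discussed above enters through θ, and a smaller θ corresponds to faster phase-change mass transfer.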

  14. Effect of heat and homogenization on in vitro digestion of milk.

    PubMed

    Tunick, Michael H; Ren, Daxi X; Van Hekken, Diane L; Bonnaillie, Laetitia; Paul, Moushumi; Kwoczak, Raymond; Tomasula, Peggy M

    2016-06-01

    Central to commercial fluid milk processing is the use of high temperature, short time (HTST) pasteurization to ensure the safety and quality of milk, and homogenization to prevent creaming of fat-containing milk. Ultra-high-temperature sterilization is also applied to milk and is typically used to extend the shelf life of refrigerated, specialty milk products or to provide shelf-stable milk. The structures of the milk proteins and lipids are affected by processing but little information is available on the effects of the individual processes or sequences of processes on digestibility. In this study, raw whole milk was subjected to homogenization, HTST pasteurization, and homogenization followed by HTST or UHT processing. Raw skim milk was subjected to the same heating regimens. In vitro gastrointestinal digestion using a fasting model was then used to detect the processing-induced changes in the proteins and lipids. Using sodium dodecyl sulfate-PAGE, gastric pepsin digestion of the milk samples showed rapid elimination of the casein and α-lactalbumin bands, persistence of the β-lactoglobulin bands, and appearance of casein and whey peptide bands. The bands for β-lactoglobulin were eliminated within the first 15 min of intestinal pancreatin digestion. The remaining proteins and peptides of raw, HTST, and UHT skim samples were digested rapidly within the first 15 min of intestinal digestion, but intestinal digestion of raw and HTST pasteurized whole milk showed some persistence of the peptides throughout digestion. The availability of more lipid droplets upon homogenization, with greater surface area available for interaction with the peptides, led to persistence of the smaller peptide bands and thus slower intestinal digestion when followed by HTST pasteurization but not by UHT processing, in which the denatured proteins may be more accessible to the digestive enzymes. Homogenization and heat processing also affected the ζ-potential and free fatty acid release

  15. Kappa Distribution in a Homogeneous Medium: Adiabatic Limit of a Super-diffusive Process?

    NASA Astrophysics Data System (ADS)

    Roth, I.

    2015-12-01

    The classical statistical theory predicts that an ergodic, weakly interacting system, like charged particles in the presence of electromagnetic fields performing Brownian motions (characterized by small-range deviations in phase space and short-term microscopic memory), converges to the Gibbs-Boltzmann statistics. Observation of distributions with kappa power-law tails in homogeneous systems contradicts this prediction and necessitates a renewed analysis of the basic axioms of the diffusion process: the characteristics of the transition probability density function (pdf) for a single interaction, with the possibility of a non-Markovian process and non-local interaction. The non-local, Levy walk deviation is related to the non-extensive statistical framework. Particles bouncing along the (solar) magnetic field with evolving pitch angles, phases and velocities, as they interact resonantly with waves, undergo energy changes at undetermined time intervals, satisfying these postulates. The dynamic evolution of a general continuous time random walk is determined by the pdf of jumps and waiting times, resulting in a fractional Fokker-Planck equation with non-integer derivatives whose solution is given by a Fox H-function. The resulting procedure involves fractional calculus, which is known although not frequently used in physics, while the local, Markovian process recasts the evolution into the standard Fokker-Planck equation. Solution of the fractional Fokker-Planck equation with the help of the Mellin transform and evaluation of its residues at the poles of its Gamma functions results in a slowly converging sum with power laws. It is suggested that these tails form the Kappa function. Gradual vs impulsive solar electron distributions serve as prototypes of this description.
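
    For readers unfamiliar with the formalism, one commonly quoted form of the space-time fractional Fokker-Planck equation arising from continuous-time random walks is reproduced below; this is a generic form from the literature, not an equation taken from the cited abstract:

    ```latex
    % Space-time fractional Fokker-Planck equation (generic CTRW form)
    \frac{\partial W(v,t)}{\partial t}
      = {}_{0}D_{t}^{\,1-\alpha}
      \left[ \frac{\partial}{\partial v}\,\frac{V'(v)}{m\,\eta_{\alpha}}
           + K_{\mu}\,\frac{\partial^{\mu}}{\partial |v|^{\mu}} \right] W(v,t),
    ```

    where the Riemann-Liouville operator governed by the waiting-time exponent α accounts for non-Markovian memory, the Riesz fractional derivative governed by the jump-length exponent μ accounts for non-local (Levy-type) jumps, and the local, Markovian limit α = 1, μ = 2 recovers the standard Fokker-Planck equation.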

  16. Development of Molecular Catalysts to Bridge the Gap between Heterogeneous and Homogeneous Catalysts

    NASA Astrophysics Data System (ADS)

    Ye, Rong

    Catalysts, whether heterogeneous, homogeneous, or enzymatic, are composed of nanometer-sized inorganic and/or organic components. They share molecular factors including charge, coordination, interatomic distance, bonding, and orientation of catalytically active atoms. By controlling the governing catalytic components and molecular factors, catalytic processes of a multichannel and multiproduct nature could be run in all three catalytic platforms to create unique end-products. Unifying the fields of catalysis is the key to achieving the goal of 100% selectivity in catalysis. Recyclable catalysts, especially those that display selective reactivity, are vital for the development of sustainable chemical processes. Among available catalyst platforms, heterogeneous catalysts are particularly well-disposed toward separation from the reaction mixture via filtration methods, which renders them readily recyclable. Furthermore, heterogeneous catalysts offer numerous handles - some without homogeneous analogues - for performance and selectivity optimization. These handles include nanoparticle size, pore profile of porous supports, surface ligands and interface with oxide supports, and flow rate through a solid catalyst bed. Despite these available handles, however, conventional heterogeneous catalysts are themselves often structurally heterogeneous compared to homogeneous catalysts, which complicates efforts to optimize and expand the scope of their reactivity and selectivity. Ongoing efforts aim to address the above challenge by heterogenizing homogeneous catalysts, which can be defined as the modification of homogeneous catalysts to render them in a separable (solid) phase from the starting materials and products. Specifically, we grow small nanoclusters in dendrimers, a class of uniform polymers with the connectivity of fractal trees and generally radial symmetry. Thanks to their dense multivalency, shape persistence and structural uniformity, dendrimers have proven to

  17. Platinum isotopes in iron meteorites: Galactic cosmic ray effects and nucleosynthetic homogeneity in the p-process isotope 190Pt and the other platinum isotopes

    NASA Astrophysics Data System (ADS)

    Hunt, Alison C.; Ek, Mattias; Schönbächler, Maria

    2017-11-01

    Platinum isotopes are sensitive to the effects of galactic cosmic rays (GCR), which can alter isotope ratios and mask nucleosynthetic isotope variations. Platinum also features one p-process isotope, 190Pt, which has a very low abundance and is therefore challenging to analyse. Platinum-190 is relevant for early solar-system chronology because of its decay to 186Os. Here, we present new Pt isotope data for five iron meteorite groups (IAB, IIAB, IID, IIIAB and IVA), including high-precision measurements of 190Pt for the IAB, IIAB and IIIAB irons, determined by multi-collector ICPMS. The new data are in good agreement with previous studies and display correlations between different Pt isotopes. The slopes of these correlations are well reproduced by the available GCR models. We report Pt isotope ratios for the IID meteorite Carbo that are consistently higher than the effects predicted by the GCR model. This suggests that the model predictions do not fully account for all the GCR effects on Pt isotopes, but also that the pre-atmospheric radii and exposure times calculated for Carbo may be incorrect. Despite this, the good agreement of relative effects in Pt isotopes with the predicted GCR trends confirms that Pt isotopes are a useful in-situ neutron dosimeter. Once GCR effects are accounted for, our new dataset reveals s- and r-process homogeneity between the iron meteorite groups studied here and the Earth. New 190Pt data for the IAB, IIAB and IIIAB iron meteorites indicate the absence of GCR effects and homogeneity in the p-process isotope between these groups and the Earth. This corresponds well with results from other heavy p-process isotopes and suggests their homogeneous distribution in the inner solar system, although it does not exclude that potential p-process isotope variations are too diluted to be currently detectable.

  18. Effect of high-pressure homogenization on different matrices of food supplements.

    PubMed

    Martínez-Sánchez, Ascensión; Tarazona-Díaz, Martha Patricia; García-González, Antonio; Gómez, Perla A; Aguayo, Encarna

    2016-12-01

    There is a growing demand for food supplements containing high amounts of vitamins, phenolic compounds and minerals that provide health benefits. These functional compounds have different solubility properties, and maintaining them and guaranteeing their homogeneity requires the application of novel technologies. The quality of different drinkable functional foods after thermal processing (0.1 MPa) or high-pressure homogenization under two different conditions (80 MPa, 33 °C and 120 MPa, 43 °C) was studied. Physicochemical characteristics and sensory quality were evaluated throughout six months of accelerated storage at 40 °C and 75% relative humidity (RH). Aroma and color were better maintained in high-pressure homogenization-treated samples than in the thermally treated ones, which contributed significantly to extending their shelf life. The small particle size obtained after high-pressure homogenization treatments caused differences in turbidity and viscosity with respect to heat-treated samples. The use of high-pressure homogenization, more specifically at 120 MPa, provided active-ingredient homogeneity that ensures uniform content in functional food supplements. Although the effect of high-pressure homogenization can be affected by the food matrix, high-pressure homogenization can be implemented as an alternative to conventional heat treatments in a commercial setting within the functional food supplement or pharmaceutical industry. © The Author(s) 2016.

  19. Effects of homogenization process parameters on physicochemical properties of astaxanthin nanodispersions prepared using a solvent-diffusion technique

    PubMed Central

    Anarjan, Navideh; Jafarizadeh-Malmiri, Hoda; Nehdi, Imededdine Arbi; Sbihi, Hassen Mohamed; Al-Resayes, Saud Ibrahim; Tan, Chin Ping

    2015-01-01

    Nanodispersion systems allow incorporation of lipophilic bioactives, such as astaxanthin (a fat soluble carotenoid) into aqueous systems, which can improve their solubility, bioavailability, and stability, and widen their uses in water-based pharmaceutical and food products. In this study, response surface methodology was used to investigate the influences of homogenization time (0.5–20 minutes) and speed (1,000–9,000 rpm) in the formation of astaxanthin nanodispersions via the solvent-diffusion process. The product was characterized for particle size and astaxanthin concentration using laser diffraction particle size analysis and high performance liquid chromatography, respectively. Relatively high determination coefficients (ranging from 0.896 to 0.969) were obtained for all suggested polynomial regression models. The overall optimal homogenization conditions were determined by multiple response optimization analysis to be 6,000 rpm for 7 minutes. In vitro cellular uptake of astaxanthin from the suggested individual and multiple optimized astaxanthin nanodispersions was also evaluated. The cellular uptake of astaxanthin was found to be considerably increased (by more than five times) as it became incorporated into optimum nanodispersion systems. The lack of a significant difference between predicted and experimental values confirms the suitability of the regression equations connecting the response variables studied to the independent parameters. PMID:25709435

  20. Decay and growth laws in homogeneous shear turbulence

    NASA Astrophysics Data System (ADS)

    Briard, Antoine; Gomez, Thomas; Mons, Vincent; Sagaut, Pierre

    2016-07-01

    Homogeneous anisotropic turbulence has been widely studied in the past decades, both numerically and experimentally. Shear flows have received particular attention because of the numerous physical phenomena they exhibit. In the present paper, both the decay and growth of anisotropy in homogeneous shear flows at high Reynolds numbers are revisited thanks to a recent eddy-damped quasi-normal Markovian closure adapted to homogeneous anisotropic turbulence. The emphasis is put on several aspects: an asymptotic model for the slow part of the pressure-strain tensor is derived for the return-to-isotropy process when mean velocity gradients are released. Then, a general decay law for purely anisotropic quantities in Batchelor turbulence is proposed. Finally, a discussion is proposed to explain the scattering of global quantities obtained in DNS and experiments in sustained shear flows: the emphasis is put on the exponential growth rate of the kinetic energy and on the shear parameter.

  1. Homogeneous Freezing of Water Droplets and its Dependence on Droplet Size

    NASA Astrophysics Data System (ADS)

    Schmitt, Thea; Möhler, Ottmar; Höhler, Kristina; Leisner, Thomas

    2014-05-01

    The formulation and parameterisation of microphysical processes in tropospheric clouds, such as phase transitions, are still a challenge for weather and climate models. This includes the homogeneous freezing of supercooled water droplets, since this is an important process in deep convective systems, where almost pure water droplets may stay liquid until homogeneous freezing occurs at temperatures around 238 K. Though the homogeneous ice nucleation in supercooled water is considered to be well understood, recent laboratory experiments with typical cloud droplet sizes showed one to two orders of magnitude smaller nucleation rate coefficients than previous literature results, including earlier results from experiments with single levitated water droplets and from cloud simulation experiments at the AIDA (Aerosol Interaction and Dynamics in the Atmosphere) facility. This motivated us to re-analyse homogeneous droplet freezing experiments conducted during the previous years at the AIDA cloud chamber. This cloud chamber has a volume of 84 m3 and operates under atmospherically relevant conditions within wide ranges of temperature, pressure and humidity, whereby investigations of both tropospheric mixed-phase clouds and cirrus clouds can be realised. By controlled adiabatic expansions, the ascent of an air parcel in the troposphere can be simulated. According to our new results and their comparison to the results from single levitated droplet experiments, the homogeneous freezing of water droplets seems to be a volume-dependent process, at least for droplets as small as a few micrometers in diameter. A contribution of surface-induced freezing can be ruled out, in agreement with previous conclusions from the single droplet experiments. The obtained volume nucleation rate coefficients are in good agreement, within error bars, with some previous literature data, including our own results from earlier AIDA experiments, but they do not agree with recently published lower volume
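
    A hedged sketch of how volume nucleation rate coefficients are typically extracted from droplet freezing data is given below; the Poisson-type relation is standard in the droplet-freezing literature, and the numbers are illustrative rather than AIDA results.

    ```python
    # Hedged sketch: volume nucleation rate coefficient J_V from the frozen
    # fraction observed in a time interval dt for droplets of volume V_d.
    # Assumes freezing is a Poisson process proportional to droplet volume.
    import numpy as np

    def volume_nucleation_rate(frozen_fraction, droplet_diameter_um, dt_s):
        v_d = (np.pi / 6.0) * (droplet_diameter_um * 1e-6) ** 3   # m^3
        v_d_cm3 = v_d * 1e6                                       # cm^3
        return -np.log(1.0 - frozen_fraction) / (v_d_cm3 * dt_s)  # cm^-3 s^-1

    # illustrative numbers only (not AIDA data)
    print(f"J_V ~ {volume_nucleation_rate(0.3, 5.0, 1.0):.2e} cm^-3 s^-1")
    ```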

  2. Comparing the impact of homogenization and heat processing on the properties and in vitro digestion of milk from organic and conventional dairy herds

    USDA-ARS?s Scientific Manuscript database

    The effects of homogenization and heat processing on the chemical and in vitro digestion traits of milk from organic and conventional herds were compared. Raw milk from organic (>50% of dry matter intake from pasture) and conventional (no access to pasture) farms were adjusted to commercial whole a...

  3. Immortal homogeneous Ricci flows

    NASA Astrophysics Data System (ADS)

    Böhm, Christoph; Lafuente, Ramiro A.

    2018-05-01

    We show that for an immortal homogeneous Ricci flow solution any sequence of parabolic blow-downs subconverges to a homogeneous expanding Ricci soliton. This is established by constructing a new Lyapunov function based on curvature estimates which come from real geometric invariant theory.

  4. An investigation into the effects of excipient particle size, blending techniques and processing parameters on the homogeneity and content uniformity of a blend containing low-dose model drug

    PubMed Central

    Alyami, Hamad; Dahmash, Eman; Bowen, James

    2017-01-01

    Powder blend homogeneity is a critical attribute in formulation development of low-dose and potent active pharmaceutical ingredients (API), yet it is a complex process with multiple contributing factors. Excipient characteristics play a key role in an efficient blending process and final product quality. In this work the effect of excipient type and properties, blending technique and processing time on content uniformity was investigated. Powder characteristics for three commonly used excipients (starch, pregelatinised starch and microcrystalline cellulose) were initially explored using a laser diffraction particle size analyser and angle of repose for flowability, followed by thorough evaluations of surface topography employing scanning electron microscopy and interferometry. Blend homogeneity was evaluated based on content uniformity analysis of the model API, ergocalciferol, using a validated analytical technique. Flowability of the powders was directly related to particle size and shape, while surface topography results revealed the relationship between surface roughness and the ability of excipients with high surface roughness to lodge fine API particles within surface grooves, resulting in superior uniformity of content. Of the two blending techniques, geometric blending confirmed the ability to produce homogeneous blends at low dilution when processed for longer durations, whereas manual ordered blending failed to achieve the compendial requirement for content uniformity despite mixing for 32 minutes. Employing the novel dry powder hybrid mixer device, developed at the Aston University laboratory, results revealed the superiority of the device and enabled the production of homogeneous blends irrespective of excipient type and particle size. Lower dilutions of the API (1% and 0.5% w/w) were examined using non-sieved excipients, and the dry powder hybrid mixing device enabled the development of successful blends within compendial requirements and with low relative standard deviation. PMID:28609454

  5. An investigation into the effects of excipient particle size, blending techniques and processing parameters on the homogeneity and content uniformity of a blend containing low-dose model drug.

    PubMed

    Alyami, Hamad; Dahmash, Eman; Bowen, James; Mohammed, Afzal R

    2017-01-01

    Powder blend homogeneity is a critical attribute in formulation development of low-dose and potent active pharmaceutical ingredients (API), yet it is a complex process with multiple contributing factors. Excipient characteristics play a key role in an efficient blending process and final product quality. In this work the effect of excipient type and properties, blending technique and processing time on content uniformity was investigated. Powder characteristics for three commonly used excipients (starch, pregelatinised starch and microcrystalline cellulose) were initially explored using a laser diffraction particle size analyser and angle of repose for flowability, followed by thorough evaluations of surface topography employing scanning electron microscopy and interferometry. Blend homogeneity was evaluated based on content uniformity analysis of the model API, ergocalciferol, using a validated analytical technique. Flowability of the powders was directly related to particle size and shape, while surface topography results revealed the relationship between surface roughness and the ability of excipients with high surface roughness to lodge fine API particles within surface grooves, resulting in superior uniformity of content. Of the two blending techniques, geometric blending confirmed the ability to produce homogeneous blends at low dilution when processed for longer durations, whereas manual ordered blending failed to achieve the compendial requirement for content uniformity despite mixing for 32 minutes. Employing the novel dry powder hybrid mixer device, developed at the Aston University laboratory, results revealed the superiority of the device and enabled the production of homogeneous blends irrespective of excipient type and particle size. Lower dilutions of the API (1% and 0.5% w/w) were examined using non-sieved excipients, and the dry powder hybrid mixing device enabled the development of successful blends within compendial requirements and with low relative standard deviation.
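
    A minimal sketch of the content uniformity summary used in such blending studies: assay results expressed as percent of label claim reduced to a mean and relative standard deviation. The assay values and the %RSD threshold below are hypothetical, not compendial figures from the cited work.

    ```python
    # Hedged sketch: blend/content uniformity summary from assay results,
    # expressed as percent of label claim. Values are made up; the %RSD
    # threshold shown is illustrative, not a compendial citation.
    import numpy as np

    assays = np.array([98.5, 101.2, 99.7, 97.9, 102.4, 100.3, 99.1, 100.8,
                       98.8, 101.6])          # % of label claim, hypothetical

    mean = assays.mean()
    rsd = 100.0 * assays.std(ddof=1) / mean   # relative standard deviation, %

    print(f"mean = {mean:0.1f}% of label, %RSD = {rsd:0.2f}")
    print("uniform blend" if rsd <= 5.0 else "blend fails illustrative %RSD limit")
    ```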

  6. Disruption of Pseudomonas putida by high pressure homogenization: a comparison of the predictive capacity of three process models for the efficient release of arginine deiminase.

    PubMed

    Patil, Mahesh D; Patel, Gopal; Surywanshi, Balaji; Shaikh, Naeem; Garg, Prabha; Chisti, Yusuf; Banerjee, Uttam Chand

    2016-12-01

    Disruption of Pseudomonas putida KT2440 by high-pressure homogenization in a French press is discussed for the release of arginine deiminase (ADI). The enzyme release response of the disruption process was modelled for the experimental factors of biomass concentration in the broth being disrupted, the homogenization pressure and the number of passes of the cell slurry through the homogenizer. For the same data, the response surface method (RSM), artificial neural network (ANN) and support vector machine (SVM) models were compared for their ability to predict the performance parameters of the cell disruption. The ANN model proved to be best for predicting the ADI release. The fractional disruption of the cells was best modelled by the RSM. The fraction of the cells disrupted depended mainly on the operating pressure of the homogenizer. The concentration of the biomass in the slurry was the most influential factor in determining the total protein release. Nearly 27 U/mL of ADI was released within a single pass from a slurry with a biomass concentration of 260 g/L at an operating pressure of 510 bar. Using a biomass concentration of 100 g/L, the ADI release by the French press was 2.7-fold greater than in a conventional high-speed bead mill. In the French press, the total protein release was 5.8-fold more than in the bead mill. Statistical analysis of completely unseen data showed ANN and SVM modelling to be proficient alternatives to RSM for prediction and generalization of the cell disruption process in the French press.
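
    As a sketch of the model comparison described above, the snippet below fits a quadratic response surface, a small neural network and a support-vector regressor to synthetic disruption data and compares cross-validated R^2; the data and hyperparameters are stand-ins, not those of the study.

    ```python
    # Hedged sketch: comparing a quadratic response-surface model, a small
    # neural network, and a support-vector regressor for predicting enzyme
    # release from disruption conditions. Data below are synthetic stand-ins
    # (biomass g/L, pressure bar, passes -> release U/mL), not study data.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures, StandardScaler
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.uniform([50, 200, 1], [300, 600, 3], size=(60, 3))
    y = 0.05 * X[:, 0] + 0.02 * X[:, 1] + 4 * X[:, 2] + rng.normal(0, 2, 60)

    models = {
        "RSM (quadratic)": make_pipeline(PolynomialFeatures(2), LinearRegression()),
        "ANN (MLP)": make_pipeline(StandardScaler(),
                                   MLPRegressor(hidden_layer_sizes=(8,),
                                                max_iter=5000, random_state=0)),
        "SVM (RBF)": make_pipeline(StandardScaler(), SVR(C=10.0)),
    }
    for name, model in models.items():
        r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
        print(f"{name:16s} mean CV R^2 = {r2:0.3f}")
    ```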

  7. Heterogeneity in homogeneous nucleation from billion-atom molecular dynamics simulation of solidification of pure metal.

    PubMed

    Shibuta, Yasushi; Sakane, Shinji; Miyoshi, Eisuke; Okita, Shin; Takaki, Tomohiro; Ohno, Munekazu

    2017-04-05

    Can completely homogeneous nucleation occur? Large-scale molecular dynamics simulations performed on a graphics-processing-unit-rich supercomputer can shed light on this long-standing issue. Here, a billion-atom molecular dynamics simulation of homogeneous nucleation from an undercooled iron melt reveals that some satellite-like small grains surrounding previously formed large grains exist in the middle of the nucleation process, and that these are not distributed uniformly. At the same time, grains with a twin boundary are formed by heterogeneous nucleation from the surface of the previously formed grains. The local heterogeneity in the distribution of grains is caused by the local accumulation of the icosahedral structure in the undercooled melt near the previously formed grains. This insight is mainly attributable to the multi-graphics-processing-unit parallel computation combined with the rapid progress in high-performance computational environments. Nucleation is a fundamental physical process; however, it is a long-standing issue whether completely homogeneous nucleation can occur. Here the authors reveal, via a billion-atom molecular dynamics simulation, that local heterogeneity exists during homogeneous nucleation in an undercooled iron melt.

  8. Clinical perspective: creating an effective practice peer review process-a primer.

    PubMed

    Gandhi, Manisha; Louis, Frances S; Wilson, Shae H; Clark, Steven L

    2017-03-01

    Peer review serves as an important adjunct to other hospital quality and safety programs. Despite its importance, the available literature contains virtually no guidance regarding the structure and function of effective peer review committees. This Clinical Perspective provides a summary of the purposes, structure, and functioning of effective peer review committees. We also discuss important legal considerations that are a necessary component of such processes. This discussion includes useful templates for case selection and review. Proper committee structure, membership, work flow, and leadership as well as close cooperation with the hospital medical executive committee and legal representatives are essential to any effective peer review process. A thoughtful, fair, systematic, and organized approach to creating a peer review process will lead to confidence in the committee by providers, hospital leadership, and patients. If properly constructed, such committees may also assist in monitoring and enforcing compliance with departmental protocols, thus reducing harm and promoting high-quality practice. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Evidence of linked biogeochemical and hydrological processes in homogeneous and layered vadose zone systems

    NASA Astrophysics Data System (ADS)

    McGuire, J. T.; Hansen, D. J.; Mohanty, B. P.

    2010-12-01

    Understanding chemical fate and transport in the vadose zone is critical to protect groundwater resources and preserve ecosystem health. However, prediction can be challenging due to the dynamic hydrologic and biogeochemical nature of the vadose zone. Additional controls on hydrobiogeochemical processes are added by subsurface structural heterogeneity. This study uses repacked soil column experiments to quantify linkages between microbial activity, geochemical cycling and hydrologic flow. Three “short” laboratory soil columns were constructed to evaluate the effects of soil layering: a homogenized medium-grained sand, homogenized organic-rich loam, and a sand-over-loam layered column. In addition, two “long” columns were constructed using either gamma-irradiated (sterilized) or untreated sediments to evaluate the effects of both soil layers and the presence of microorganisms. The long columns were packed identically; a medium-grained sand matrix with two vertically separated and horizontally offset lenses of organic-rich loam. In all 5 columns, downward and upward infiltration of water was evaluated to simulate rainfall and rising water table events respectively. In-situ colocated probes were used to measure soil water content, matric potential, Eh, major anions, ammonium, Fe2+, and total sulfide. Enhanced biogeochemical cycling was observed in the short layered column versus the short, homogeneous columns, and enumerations of iron and sulfate reducing bacteria were 1-2 orders of magnitude greater. In the long columns, microbial activity caused mineral bands and produced insoluble gases that impeded water flow through the pores of the sediment. Capillary barriers, formed around the lenses due to soil textural differences, retarded water flow rates through the lenses. This allowed reducing conditions to develop, evidenced by the production of Fe2+ and S2-. At the fringes of the lenses, Fe2+ oxidized to form Fe(III)-oxide bands that further retarded water

  10. The Copenhagen problem with a quasi-homogeneous potential

    NASA Astrophysics Data System (ADS)

    Fakis, Demetrios; Kalvouridis, Tilemahos

    2017-05-01

    The Copenhagen problem is a well-known case of the famous restricted three-body problem. In this work, instead of considering Newtonian potentials and forces, we assume that the two primaries create a quasi-homogeneous potential; that is, we add to the inverse-square law of gravitation an inverse-cube corrective term in order to approximate various phenomena, such as the radiation pressure of the primaries or their non-sphericity. Based on this new consideration, we investigate the equilibrium locations of the small body and their parametric dependence, as well as the zero-velocity curves and surfaces for planar motion, and the evolution of the regions where this motion is permitted as the Jacobian constant varies.

  11. Homogeneous anisotropic solutions of topologically massive gravity with a cosmological constant and their homogeneous deformations

    NASA Astrophysics Data System (ADS)

    Moutsopoulos, George

    2013-06-01

    We solve the equations of topologically massive gravity (TMG) with a potentially non-vanishing cosmological constant for homogeneous metrics without isotropy. We only reproduce known solutions. We also discuss their homogeneous deformations, possibly with isotropy. We show that de Sitter space and hyperbolic space cannot be infinitesimally homogeneously deformed in TMG. We clarify some of their Segre-Petrov types and discuss the warped de Sitter spacetime.

  12. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratiannil, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.; Willett, K.

    2013-09-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including i) the centered root mean square error relative to the true homogeneous values at various averaging scales, ii) the error in linear trend estimates and iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.
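
    Two of the performance metrics listed above can be sketched in a few lines: the centred root-mean-square error of a homogenized series against the true series, and the error in the estimated linear trend. The series below are synthetic and purely illustrative.

    ```python
    # Hedged sketch of two of the performance metrics described above:
    # centred root-mean-square error against the true homogeneous series and
    # the error in the linear trend estimate. Series below are synthetic.
    import numpy as np

    def centred_rmse(homogenized, truth):
        a = homogenized - homogenized.mean()
        b = truth - truth.mean()
        return np.sqrt(np.mean((a - b) ** 2))

    def trend_error(homogenized, truth, years):
        slope_h = np.polyfit(years, homogenized, 1)[0]
        slope_t = np.polyfit(years, truth, 1)[0]
        return slope_h - slope_t          # e.g. in K per year

    years = np.arange(1900, 2000)
    rng = np.random.default_rng(0)
    truth = 0.005 * (years - years[0]) + rng.normal(0, 0.3, years.size)
    homog = truth + np.where(years > 1950, 0.06, 0.0)   # leftover break after 1950

    print("centred RMSE :", round(centred_rmse(homog, truth), 3), "K")
    print("trend error  :", round(trend_error(homog, truth, years), 5), "K/yr")
    ```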

  13. Creating an Equity State of Mind: A Learning Process

    ERIC Educational Resources Information Center

    Pickens, Augusta Maria

    2012-01-01

    The Diversity Scorecard Project evaluated in this study was created by the University of Southern California's Center for Urban Education. It was designed to create awareness among institutional members about the state of inequities in educational outcomes for underrepresented students. The Diversity Scorecard Project facilitators' aimed to…

  14. Concordance and discordance between taxonomic and functional homogenization: responses of soil mite assemblages to forest conversion.

    PubMed

    Mori, Akira S; Ota, Aino T; Fujii, Saori; Seino, Tatsuyuki; Kabeya, Daisuke; Okamoto, Toru; Ito, Masamichi T; Kaneko, Nobuhiro; Hasegawa, Motohiro

    2015-10-01

    The compositional characteristics of ecological assemblages are often simplified; this process is termed "biotic homogenization." This process of biological reorganization occurs not only taxonomically but also functionally. Testing both aspects of homogenization is essential when ecosystem functioning, which is supported by a diverse mosaic of functional traits in the landscape, is of concern. Here, we aimed to infer the underlying processes of taxonomic/functional homogenization at the local scale, a scale that is meaningful for this research question. We recorded species of litter-dwelling oribatid mites along a gradient of forest conversion from a natural forest to a monoculture larch plantation in Japan (11 stands in total), and collected data on the functional traits of the recorded species to quantify functional diversity. We calculated taxonomic and functional β-diversity, an index of biotic homogenization. We found that both taxonomic and functional β-diversity decreased with larch dominance (stand homogenization). After further deconstructing β-diversity into turnover and nestedness components, which reflect different processes of community organization, a significant decrease in response to larch dominance was observed only for the functional turnover. As a result, there was a steeper decline in functional β-diversity than in taxonomic β-diversity. This discordance between the taxonomic and functional responses suggests that species replacement occurs between species that are functionally redundant under environmental homogenization, ultimately leading to stronger homogenization of functional diversity. The insights gained from the community organization of oribatid mites suggest that the functional characteristics of local assemblages, which support the functionality of ecosystems, are of more concern in human-dominated forest landscapes.
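
    The partitioning of β-diversity into turnover and nestedness components referred to above can be sketched as follows, using the widely used Sorensen/Simpson-style pairwise decomposition; the presence/absence vectors are invented for illustration.

    ```python
    # Hedged sketch of pairwise beta-diversity partitioning (Baselga-style):
    # total Sorensen dissimilarity split into a turnover (Simpson) component
    # and a nestedness-resultant component. Presence/absence data are made up.
    import numpy as np

    def partition_beta(site1, site2):
        a = np.sum(site1 & site2)              # shared species
        b = np.sum(site1 & ~site2)             # unique to site 1
        c = np.sum(~site1 & site2)             # unique to site 2
        beta_sor = (b + c) / (2 * a + b + c)   # total dissimilarity
        beta_sim = min(b, c) / (a + min(b, c)) # spatial turnover
        return beta_sor, beta_sim, beta_sor - beta_sim   # nestedness component

    natural = np.array([1, 1, 1, 1, 0, 1, 0, 1], dtype=bool)
    larch   = np.array([1, 1, 0, 0, 1, 0, 0, 1], dtype=bool)
    total, turnover, nested = partition_beta(natural, larch)
    print(f"beta_sor={total:0.2f}  turnover={turnover:0.2f}  nestedness={nested:0.2f}")
    ```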

  15. Homogeneous (Cu, Ni)6Sn5 intermetallic compound joints rapidly formed in asymmetrical Ni/Sn/Cu system using ultrasound-induced transient liquid phase soldering process.

    PubMed

    Li, Z L; Dong, H J; Song, X G; Zhao, H Y; Tian, H; Liu, J H; Feng, J C; Yan, J C

    2018-04-01

    Homogeneous (Cu,Ni)6Sn5 intermetallic compound (IMC) joints were rapidly formed in an asymmetrical Ni/Sn/Cu system by an ultrasound-induced transient liquid phase (TLP) soldering process. In the traditional TLP soldering process, the intermetallic joints formed in the Ni/Sn/Cu system consisted of major (Cu,Ni)6Sn5 and minor Cu3Sn IMCs, and the grain morphology of the (Cu,Ni)6Sn5 IMCs exhibited fine rounded, needle-like and coarse rounded shapes from the Ni side to the Cu side, which was highly in accordance with the Ni concentration gradient across the joints. However, in the ultrasound-induced TLP soldering process, the intermetallic joints formed in the Ni/Sn/Cu system consisted only of (Cu,Ni)6Sn5 IMCs, which exhibited a uniform grain morphology of rounded shape with a remarkably narrowed Ni concentration gradient. The ultrasound-induced homogeneous intermetallic joints exhibited higher shear strength (61.6 MPa) than the traditional heterogeneous intermetallic joints (49.8 MPa). Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Quality of mango nectar processed by high-pressure homogenization with optimized heat treatment.

    PubMed

    Tribst, Alline Artigiani Lima; Franchi, Mark Alexandrow; de Massaguer, Pilar Rodriguez; Cristianini, Marcelo

    2011-03-01

    This work aimed to evaluate the effect of high-pressure homogenization (HPH) with heat shock on Aspergillus niger, vitamin C, and the color of mango nectar. The nectar was processed at 200 MPa followed by heat shock, which was optimized by response surface methodology using mango nectar ratio (45 to 70), heating time (10 to 20 min), and temperature (60 to 85 °C) as variables. The color of the mango nectar and vitamin C retention were evaluated for the optimized treatments, that is, 200 MPa + 61.5 °C/20 min or 73.5 °C/10 min. The mathematical model indicates that heat shock time and temperature had a positive effect on mould inactivation, whereas increasing the ratio resulted in a protective effect on A. niger. The optimized treatments did not increase the retention of vitamin C, but had a positive effect on the nectar color, in particular for samples treated at 200 MPa + 61.5 °C/20 min. The results obtained in this study show that the conidia can be inactivated by applying HPH with heat shock, making HPH an option for pasteurizing fruit nectar industrially.

  17. Challenges in modelling homogeneous catalysis: new answers from ab initio molecular dynamics to the controversy over the Wacker process.

    PubMed

    Stirling, András; Nair, Nisanth N; Lledós, Agustí; Ujaque, Gregori

    2014-07-21

    We present here a review of the mechanistic studies of the Wacker process stressing the long controversy about the key reaction steps. We give an overview of the previous experimental and theoretical studies on the topic. Then we describe the importance of the most recent Ab Initio Molecular Dynamics (AIMD) calculations in modelling organometallic reactivity in water. As a prototypical example of homogeneous catalytic reactions, the Wacker process poses serious challenges to modelling. The adequate description of the multiple role of the water solvent is very difficult by using static quantum chemical approaches including cluster and continuum solvent models. In contrast, such reaction systems are suitable for AIMD, and by combining with rare event sampling techniques, the method provides reaction mechanisms and the corresponding free energy profiles. The review also highlights how AIMD has helped to obtain a novel understanding of the mechanism and kinetics of the Wacker process.

  18. Broken Ergodicity in Ideal, Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John; Fu, Terry; Nguyen, Phu; Shum, Victor

    2010-01-01

    We discuss the statistical mechanics of numerical models of ideal homogeneous, incompressible turbulence and their relevance for dissipative fluids and magnetofluids. These numerical models are based on Fourier series, and the relevant statistical theory predicts that the Fourier coefficients of the fluid velocity and magnetic fields (if present) are zero-mean random variables. However, numerical simulations clearly show that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We explain this phenomenon in terms of 'broken ergodicity', which is defined to occur when dynamical behavior does not match ensemble predictions on very long time-scales. We review the theoretical basis of broken ergodicity, apply it to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence, and show new results from simulations using GPU (graphical processing unit) computers.

  19. Preparation and characterization of paclitaxel nanosuspension using novel emulsification method by combining high speed homogenizer and high pressure homogenization.

    PubMed

    Li, Yong; Zhao, Xiuhua; Zu, Yuangang; Zhang, Yin

    2015-07-25

    The aim of this study was to develop an alternative, more bioavailable, better tolerated paclitaxel nanosuspension (PTXNS) for intravenous injection in comparison with the commercially available Taxol(®) formulation. In this study, PTXNS was prepared by an emulsification method combining a high speed homogenizer and high pressure homogenization, followed by a lyophilization process for intravenous administration. The main production parameters, including the volume ratio of organic phase to water plus organic phase (Vo:Vw+o), concentration of PTX, content of PTX, emulsification time (Et), homogenization pressure (HP) and number of passes (Ps) for high pressure homogenization, were optimized and their effects on the mean particle size (MPS) and particle size distribution (PSD) of PTXNS were investigated. The characteristics of PTXNS, such as surface morphology, physical status of paclitaxel (PTX) in PTXNS, redispersibility of PTXNS in purified water, in vitro dissolution and in vivo bioavailability, were all investigated. The PTXNS obtained under optimum conditions had an MPS of 186.8 nm and a zeta potential (ZP) of -6.87 mV. The PTX content in PTXNS was approximately 3.42%. Moreover, the residual amount of chloroform was lower than the International Conference on Harmonization limit (60 ppm) for solvents. The dissolution study indicated that PTXNS had merits including faster initial dissolution than raw PTX and a sustained-dissolution character compared with the Taxol(®) formulation. Moreover, the bioavailability of PTXNS increased 14.38 and 3.51 times compared with raw PTX and the Taxol(®) formulation, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Generating and controlling homogeneous air turbulence using random jet arrays

    NASA Astrophysics Data System (ADS)

    Carter, Douglas; Petersen, Alec; Amili, Omid; Coletti, Filippo

    2016-12-01

    The use of random jet arrays, already employed in water tank facilities to generate zero-mean-flow homogeneous turbulence, is extended to air as a working fluid. A novel facility is introduced that uses two facing arrays of individually controlled jets (256 in total) to force steady homogeneous turbulence with negligible mean flow, shear, and strain. Quasi-synthetic jet pumps are created by expanding pressurized air through small straight nozzles and are actuated by fast-response low-voltage solenoid valves. Velocity fields, two-point correlations, energy spectra, and second-order structure functions are obtained from 2D PIV and are used to characterize the turbulence from the integral to the Kolmogorov scales. Several metrics are defined to quantify how well zero-mean-flow homogeneous turbulence is approximated for a wide range of forcing and geometric parameters. With increasing jet firing time, both the velocity fluctuations and the integral length scales are augmented and therefore the Reynolds number is increased. We reach a Taylor-microscale Reynolds number of 470, a large-scale Reynolds number of 74,000, and an integral-to-Kolmogorov length scale ratio of 680. The volume of the present homogeneous turbulence, the largest reported to date in a zero-mean-flow facility, is much larger than the integral length scale, allowing for the natural development of the energy cascade. The turbulence is found to be anisotropic irrespective of the distance between the jet arrays. Fine grids placed in front of the jets are effective at modulating the turbulence, reducing both velocity fluctuations and integral scales. Varying the jet-to-jet spacing within each array has no effect on the integral length scale, suggesting that this is dictated by the length scale of the jets.
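
    One of the PIV-based quantities named above, the second-order structure function, can be estimated from a planar velocity field as in the sketch below; the grid layout and averaging choices are assumptions for illustration, not the facility's actual post-processing.

```python
import numpy as np

def second_order_structure_function(u, dx, max_sep=None):
    """Longitudinal second-order structure function D_LL(r) along x for a
    2-D velocity field u(y, x) on a uniform grid with spacing dx, i.e.
    D_LL(r) = <[u(x + r) - u(x)]**2> averaged over positions and rows."""
    ny, nx = u.shape
    max_sep = max_sep or nx // 2
    r = np.arange(1, max_sep) * dx
    D = np.empty(max_sep - 1)
    for i, s in enumerate(range(1, max_sep)):
        du = u[:, s:] - u[:, :-s]          # velocity increments at separation s*dx
        D[i] = np.mean(du ** 2)
    return r, D

# Usage with a synthetic field standing in for a PIV snapshot.
rng = np.random.default_rng(1)
u = rng.standard_normal((128, 256))
r, D = second_order_structure_function(u, dx=1e-3)
```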

  1. Role of structural barriers for carotenoid bioaccessibility upon high pressure homogenization.

    PubMed

    Palmero, Paola; Panozzo, Agnese; Colle, Ines; Chigwedere, Claire; Hendrickx, Marc; Van Loey, Ann

    2016-05-15

    A specific approach to investigate the effect of high pressure homogenization on the carotenoid bioaccessibility in tomato-based products was developed. Six different tomato-based model systems were reconstituted in order to target the specific role of the natural structural barriers (chromoplast substructure/cell wall) and of the phases (soluble/insoluble) in determining the carotenoid bioaccessibility and viscosity changes upon high pressure homogenization. Results indicated that in the absence of natural structural barriers (carotenoid-enriched oil), the soluble and insoluble phases determined the carotenoid bioaccessibility upon processing, whereas in their presence these barriers governed the bioaccessibility. Furthermore, it was shown that the increase in viscosity upon high pressure homogenization is determined by the presence of the insoluble phase; however, this result was related to the initial ratio of soluble to insoluble phases in the system. In addition, no relationship between the changes in viscosity and carotenoid bioaccessibility upon high pressure homogenization was found. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Influence of homogenization treatment on physicochemical properties and enzymatic hydrolysis rate of pure cellulose fibers.

    PubMed

    Jacquet, N; Vanderghem, C; Danthine, S; Blecker, C; Paquot, M

    2013-02-01

    The aim of this study is to compare the effect of different homogenization treatments on the physicochemical properties and the hydrolysis rate of a pure bleached cellulose. Results obtained show that homogenization treatments improve the enzymatic hydrolysis rate of the cellulose fibers by 25 to 100%, depending on the homogenization treatment applied. Characterization of the samples also showed that homogenization had an impact on some physicochemical properties of the cellulose. For moderate treatment intensities (pressure below 500 bar and degree of homogenization below 25), an increase in water retention value (WRV) that correlated with the increase in hydrolysis rate was highlighted. Results also showed that the overall crystallinity of the cellulose appeared not to be affected by the homogenization treatment. For higher treatment intensities, homogenized cellulose samples developed a stable three-dimensional network that decreased cellulase mobility and slowed down the hydrolysis process.

  3. Homogenizing Advanced Alloys: Thermodynamic and Kinetic Simulations Followed by Experimental Results

    NASA Astrophysics Data System (ADS)

    Jablonski, Paul D.; Hawk, Jeffrey A.

    2017-01-01

    Segregation of solute elements occurs in nearly all metal alloys during solidification. The resultant elemental partitioning can severely degrade as-cast material properties and lead to difficulties during post-processing (e.g., hot shorts and incipient melting). Many cast articles are subjected to a homogenization heat treatment in order to minimize segregation and improve their performance. Traditionally, homogenization heat treatments are based upon past practice or time-consuming trial and error experiments. Through the use of thermodynamic and kinetic modeling software, NETL has designed a systematic method to optimize homogenization heat treatments. Use of the method allows engineers and researchers to homogenize casting chemistries to levels appropriate for a given application. The method also allows for the adjustment of heat treatment schedules to fit limitations on in-house equipment (capability, reliability, etc.) while maintaining clear numeric targets for segregation reduction. In this approach, the Scheil module within Thermo-Calc is used to predict the as-cast segregation present within an alloy, and diffusion-controlled transformation simulations are then used to model the homogenization kinetics as a function of time and temperature. Examples of computationally designed heat treatments and verification of their effects on segregation and properties of real castings are presented.
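
    As a rough complement to the workflow described above, the classical textbook estimate for isothermal homogenization treats the interdendritic segregation as a sinusoidal solute profile whose amplitude decays as exp(-4*pi^2*D*t/lambda^2), with lambda the secondary dendrite arm spacing. The Python sketch below uses that relation with placeholder diffusion data; it is not the NETL Thermo-Calc/kinetic method itself.

```python
import numpy as np

# Back-of-the-envelope homogenization estimate, assuming the interdendritic
# segregation is an idealized sinusoidal solute profile with wavelength equal
# to the secondary dendrite arm spacing (SDAS).  D0, Q and the SDAS below are
# placeholder values, not data for any specific alloy.
R = 8.314            # J/(mol K)
D0 = 2.0e-4          # m^2/s, assumed pre-exponential factor
Q = 250e3            # J/mol, assumed activation energy for solute diffusion
sdas = 50e-6         # m, assumed secondary dendrite arm spacing

def residual_segregation(T_kelvin, t_seconds):
    """Fraction of the initial sinusoidal segregation amplitude remaining
    after holding time t at temperature T: delta = exp(-4*pi^2*D*t/SDAS^2)."""
    D = D0 * np.exp(-Q / (R * T_kelvin))
    return np.exp(-4 * np.pi ** 2 * D * t_seconds / sdas ** 2)

def time_to_target(T_kelvin, target=0.01):
    """Holding time needed to reduce the segregation amplitude to `target`."""
    D = D0 * np.exp(-Q / (R * T_kelvin))
    return -np.log(target) * sdas ** 2 / (4 * np.pi ** 2 * D)

print(time_to_target(1473) / 3600, "h at 1200 °C (illustrative numbers only)")
```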

  4. Isotopic homogeneity of iron in the early solar nebula.

    PubMed

    Zhu, X K; Guo, Y; O'Nions, R K; Young, E D; Ash, R D

    2001-07-19

    The chemical and isotopic homogeneity of the early solar nebula, and the processes producing fractionation during its evolution, are central issues of cosmochemistry. Studies of the relative abundance variations of three or more isotopes of an element can in principle determine if the initial reservoir of material was a homogeneous mixture or if it contained several distinct sources of precursor material. For example, widespread anomalies observed in the oxygen isotopes of meteorites have been interpreted as resulting from the mixing of a solid phase that was enriched in 16O with a gas phase in which 16O was depleted, or as an isotopic 'memory' of Galactic evolution. In either case, these anomalies are regarded as strong evidence that the early solar nebula was not initially homogeneous. Here we present measurements of the relative abundances of three iron isotopes in meteoritic and terrestrial samples. We show that significant variations of iron isotopes exist in both terrestrial and extraterrestrial materials. But when plotted in a three-isotope diagram, all of the data for these Solar System materials fall on a single mass-fractionation line, showing that homogenization of iron isotopes occurred in the solar nebula before both planetesimal accretion and chondrule formation.

  5. Microstructural evolution during the homogenization heat treatment of 6XXX and 7XXX aluminum alloys

    NASA Astrophysics Data System (ADS)

    Priya, Pikee

    Homogenization heat treatment of as-cast billets is an important step in the processing of aluminum extrusions. Microstructural evolution during homogenization involves elimination of the eutectic morphology by spheroidisation of the interdendritic phases, minimization of the microsegregation across the grains through diffusion, dissolution of the low-melting phases, which enhances the surface finish of the extrusions, and precipitation of nano-sized dispersoids (for Cr-, Zr-, Mn-, Sc-containing alloys), which inhibit grain boundary motion to prevent recrystallization. Post-homogenization cooling reprecipitates some of the phases, changing the flow stress required for subsequent extrusion. These precipitates, however, are deleterious for the mechanical properties of the alloy and also hamper the age-hardenability, and are hence dissolved during solution heat treatment. Microstructural development during homogenization and subsequent cooling occurs both at the length scale of the Secondary Dendrite Arm Spacing (SDAS), in micrometers, and of the dispersoids, in nanometers. Numerical tools to simulate microstructural development at both length scales have been developed and validated against experiments. These tools provide easy and convenient means to study the process. A Cellular Automaton-Finite Volume-based model for evolution of interdendritic phases is coupled with a Particle Size Distribution-based model for precipitation of dispersoids across the grain. This comprehensive model has been used to study the effect of temperature, composition, as-cast microstructure, and cooling rates during post-homogenization quenching on microstructural evolution. The numerical study has been complemented with experiments involving Scanning Electron Microscopy, Energy Dispersive Spectroscopy, X-Ray Diffraction and Differential Scanning Calorimetry, and good agreement with the numerical results has been found. The current work aims to study the microstructural evolution during

  6. Creating "Intelligent" Climate Model Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, N. C.; Taylor, P. C.

    2014-12-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is often used to add value to model projections: consensus projections have been shown to consistently outperform individual models. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, certain models reproduce climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequal weighting multi-model ensembles. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables—e.g., outgoing longwave radiation and surface temperature. Metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument and surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing weighted and unweighted model ensembles. For example, one tested metric weights the ensemble by how well models reproduce the time-series probability distribution of the cloud forcing component of reflected shortwave radiation. The weighted ensemble for this metric indicates lower simulated precipitation (up to 0.7 mm/day) in tropical regions than the unweighted ensemble: since CMIP5 models have been shown to
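
    The weighting step described above amounts to replacing the equal-weight ensemble mean with a score-weighted one. The Python sketch below shows the arithmetic with synthetic numbers; the proportional normalization of scores into weights is an assumption for illustration, since the study tests many different weighting metrics.

```python
import numpy as np

def weighted_ensemble_mean(projections, scores):
    """Combine model projections using performance-based weights.
    `projections`: array of shape (n_models, ...) of projected fields.
    `scores`: per-model skill scores, larger = better agreement with the
    observational metric (e.g. a CERES-based process metric)."""
    scores = np.asarray(scores, dtype=float)
    weights = scores / scores.sum()
    return np.tensordot(weights, np.asarray(projections), axes=1)

# Illustrative use: 3 models, each a 2x2 regional field (synthetic numbers).
proj = np.array([[[2.0, 3.0], [1.0, 4.0]],
                 [[2.5, 2.5], [1.5, 3.5]],
                 [[3.0, 2.0], [2.0, 3.0]]])
equal = proj.mean(axis=0)
weighted = weighted_ensemble_mean(proj, scores=[0.9, 0.5, 0.1])
print(equal, weighted, sep="\n")
```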

  7. Synthetic river valleys: Creating prescribed topography for form-process inquiry and river rehabilitation design

    NASA Astrophysics Data System (ADS)

    Brown, R. A.; Pasternack, G. B.; Wallender, W. W.

    2014-06-01

    The synthesis of artificial landforms is complementary to geomorphic analysis because it affords a reflection on both the characteristics and intrinsic formative processes of real world conditions. Moreover, the applied terminus of geomorphic theory is commonly manifested in the engineering and rehabilitation of riverine landforms, where the goal is to create specific processes associated with specific morphology. To date, the synthesis of river topography has been explored outside of geomorphology through artistic renderings, computer science applications, and river rehabilitation design, while within geomorphology it has been explored using morphodynamic modeling, such as one-dimensional simulation of river reach profiles, two-dimensional simulation of river networks, and three-dimensional simulation of subreach scale river morphology. However, no existing approach allows geomorphologists, engineers, or river rehabilitation practitioners to create landforms of prescribed conditions. In this paper a method for creating topography of synthetic river valleys is introduced that utilizes a theoretical framework drawing from fluvial geomorphology, computer science, and geometric modeling. Such a method would be valuable to geomorphologists in understanding form-process linkages as well as to engineers and river rehabilitation practitioners in developing design surfaces that can be rapidly iterated. The method introduced herein relies on the discretization of river valley topography into geometric elements associated with overlapping and orthogonal two-dimensional planes such as the planform, profile, and cross section that are represented by mathematical functions, termed geometric element equations. Topographic surfaces can be parameterized independently or dependently using a geomorphic covariance structure between the spatial series of geometric element equations. To illustrate the approach and overall model flexibility examples are provided that are associated with

  8. Modeling the Homogenization Kinetics of As-Cast U-10wt% Mo alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zhijie; Joshi, Vineet; Hu, Shenyang Y.

    2016-01-15

    Low-enriched U-22at% Mo (U-10Mo) alloy has been considered as an alternative material to replace the highly enriched fuels in research reactors. For the U-10Mo to work effectively and replace the existing fuel material, a thorough understanding of the microstructure development from as-cast to the final formed structure is required. The as-cast microstructure is typically inhomogeneous, with molybdenum-rich and molybdenum-lean regions, which may affect the processing and possibly the in-reactor performance. This as-cast structure must be homogenized by thermal treatment to produce a uniform Mo distribution. The development of a modeling capability will improve the understanding of the effect of initial microstructures on the Mo homogenization kinetics. In the current work, we investigated the effect of as-cast microstructure on the homogenization kinetics. The kinetics of the homogenization was modeled based on a rigorous algorithm that relates the line scan data of Mo concentration to the gray scale in energy dispersive spectroscopy images, which was used to generate a reconstructed Mo concentration map. The map was then used as realistic microstructure input for physics-based homogenization models, where the entire homogenization kinetics can be simulated and validated against the available experiment data at different homogenization times and temperatures.
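
    The two steps named above (mapping image gray values to Mo concentration, then feeding the reconstructed profile into a diffusion-based homogenization model) can be illustrated in one dimension as in the sketch below; the linear calibration, the diffusivity and the scan length are placeholder assumptions, not the reported model.

```python
import numpy as np

def grayscale_to_concentration(gray, c_min, c_max):
    """Linearly map EDS image gray values to Mo concentration between the
    calibration endpoints c_min and c_max (a simplifying assumption)."""
    g = gray.astype(float)
    return c_min + (g - g.min()) * (c_max - c_min) / (g.max() - g.min())

def homogenize_1d(c, D, dx, dt, steps):
    """Explicit finite-difference solution of dc/dt = D d^2c/dx^2 with
    zero-flux boundaries, mimicking isothermal homogenization of a Mo
    line scan.  Stability requires D*dt/dx^2 <= 0.5."""
    c = c.copy()
    r = D * dt / dx ** 2
    assert r <= 0.5, "reduce dt for stability"
    for _ in range(steps):
        c[1:-1] += r * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[0], c[-1] = c[1], c[-2]          # no-flux ends
    return c

# Illustrative numbers only (not measured data): a 200-point line scan;
# the diffusivity is a placeholder for Mo diffusion at the anneal temperature.
gray = np.random.default_rng(2).integers(80, 200, 200)
c0 = grayscale_to_concentration(gray, c_min=7.0, c_max=13.0)   # wt% Mo
c_t = homogenize_1d(c0, D=1e-14, dx=1e-6, dt=20.0, steps=50000)
print(c0.std(), c_t.std())     # spread in Mo content shrinks as it homogenizes
```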

  9. Stability of the dithiocarbamate pesticide maneb in tomato homogenates during cold storage and thermal processing.

    PubMed

    Kontou, S; Tsipi, D; Tzia, C

    2004-11-01

    The effect of storage at 5 °C and of thermal processing by cooking at 100 °C and sterilization at 121 °C for 15 min on maneb residues in tomato homogenates was investigated. Remaining maneb and its toxic metabolite ethylenethiourea (ETU) were measured after each treatment by headspace gas chromatography with flame-photometric detection and by high-performance liquid chromatography with photo-diode array detection, respectively. No significant loss of maneb was observed during cold storage for up to 6 weeks, taking into account analytical variability. Conversely, thermal treatment resulted in substantial degradation of maneb with extensive conversion to ETU. After cooking, only 26 ± 1% (± SE, n = 8) of initial maneb residues remained in the samples, whilst the conversion to ETU was 28 ± 1% (mol mol⁻¹) (± SE, n = 4). Sterilization eliminated the residues of the parent compound, giving rise to conversion to ETU of up to 32 ± 1% (mol mol⁻¹) (± SE, n = 4).

  10. The Leadership Assignment: Creating Change.

    ERIC Educational Resources Information Center

    Calabrese, Raymond L.

    This book provides change-motivated leaders with an understanding of the change process and the tools to drive change. Eight change principles guide change agents in creating and sustaining change: prepare to lead change; knowledge is power; create empowering mental models; overcome resistance to change; lead change; accelerate the change process;…

  11. Producing a lycopene nanodispersion: Formulation development and the effects of high pressure homogenization.

    PubMed

    Shariffa, Y N; Tan, T B; Uthumporn, U; Abas, F; Mirhosseini, H; Nehdi, I A; Wang, Y-H; Tan, C P

    2017-11-01

    The aim of this study was to develop formulations to produce lycopene nanodispersions and to investigate the effects of the homogenization pressure on the physicochemical properties of the lycopene nanodispersion. The samples were prepared using an emulsification-evaporation technique. The best formulation was achieved by dispersing an organic phase (0.3% w/v lycopene dissolved in dichloromethane) in an aqueous phase (0.3% w/v Tween 20 dissolved in deionized water) at a ratio of 1:9 using a homogenization process. Increasing the homogenization pressure to 500 bar significantly reduced the particle size and lycopene concentration (p<0.05). Excessive homogenization pressure (700-900 bar) resulted in large particle sizes with high dispersibility. The zeta potential and turbidity of the lycopene nanodispersion were significantly influenced by the homogenization pressure. The results from this study provide useful information for producing small-sized lycopene nanodispersions with a narrow PDI and good stability for application in beverage products. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Homogenization techniques for population dynamics in strongly heterogeneous landscapes.

    PubMed

    Yurk, Brian P; Cobbold, Christina A

    2018-12-01

    An important problem in spatial ecology is to understand how population-scale patterns emerge from individual-level birth, death, and movement processes. These processes, which depend on local landscape characteristics, vary spatially and may exhibit sharp transitions through behavioural responses to habitat edges, leading to discontinuous population densities. Such systems can be modelled using reaction-diffusion equations with interface conditions that capture local behaviour at patch boundaries. In this work we develop a novel homogenization technique to approximate the large-scale dynamics of the system. We illustrate our approach, which also generalizes to multiple species, with an example of logistic growth within a periodic environment. We find that population persistence and the large-scale population carrying capacity are influenced by patch residence times that depend on patch preference, as well as by movement rates in adjacent patches. The forms of the homogenized coefficients yield key theoretical insights into how large-scale dynamics arise from small-scale features.
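
    For orientation, the baseline one-dimensional result for a periodic diffusion coefficient (without the habitat-preference interface conditions treated in the paper) is a harmonic mean; the sketch below computes it for a two-patch landscape. This is an illustrative simplification, not the authors' multi-species technique.

```python
import numpy as np

def homogenized_diffusion(d_values, fractions):
    """Classical 1-D periodic homogenization: the effective diffusion
    coefficient is the harmonic mean of the patch coefficients weighted by
    the fraction of the period each patch occupies,
        D_hom = 1 / sum_i (f_i / D_i).
    This ignores the interface conditions discussed above and is shown only
    as the textbook baseline case."""
    d = np.asarray(d_values, float)
    f = np.asarray(fractions, float)
    f = f / f.sum()
    return 1.0 / np.sum(f / d)

# Two-patch landscape: fast movement in the matrix, slow movement in habitat.
print(homogenized_diffusion([1.0, 0.1], [0.5, 0.5]))   # ~0.18, closer to the slow patch
```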

  13. Internal homogenization: effective permittivity of a coated sphere.

    PubMed

    Chettiar, Uday K; Engheta, Nader

    2012-10-08

    The concept of internal homogenization is introduced as a complementary approach to the conventional homogenization schemes, which could be termed external homogenization. The theory for the internal homogenization of the permittivity of subwavelength coated spheres is presented. The effective permittivity derived from the internal homogenization of core-shells is discussed for plasmonic and dielectric constituent materials. The effective model provided by the homogenization is a useful design tool in constructing coated particles with desired resonant properties.
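
    For orientation, the standard quasi-static relation that replaces a coated sphere by a homogeneous sphere with the same dipolar polarizability is sketched below; whether the paper's internal-homogenization expression reduces exactly to this textbook form is an assumption here.

```python
def coated_sphere_effective_permittivity(eps_core, eps_shell, r_core, r_shell):
    """Quasi-static effective permittivity of a sphere with the same dipolar
    polarizability as a core-shell particle (core eps_core of radius r_core
    inside a shell eps_shell of outer radius r_shell):
        f = (r_core / r_shell)**3
        eps_eff = eps_shell * (eps_core + 2*eps_shell + 2*f*(eps_core - eps_shell))
                            / (eps_core + 2*eps_shell -   f*(eps_core - eps_shell))
    Limits check out: f -> 1 gives eps_core, f -> 0 gives eps_shell."""
    f = (r_core / r_shell) ** 3
    num = eps_core + 2 * eps_shell + 2 * f * (eps_core - eps_shell)
    den = eps_core + 2 * eps_shell - f * (eps_core - eps_shell)
    return eps_shell * num / den

# Example: a silver-like core (negative permittivity) with a silica-like shell.
print(coated_sphere_effective_permittivity(-15 + 0.4j, 2.1, 30e-9, 45e-9))
```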

  14. A scheme to calculate higher-order homogenization as applied to micro-acoustic boundary value problems

    NASA Astrophysics Data System (ADS)

    Vagh, Hardik A.; Baghai-Wadji, Alireza

    2008-12-01

    Current technological challenges in materials science and high-tech device industry require the solution of boundary value problems (BVPs) involving regions of various scales, e.g. multiple thin layers, fibre-reinforced composites, and nano/micro pores. In most cases straightforward application of standard variational techniques to BVPs of practical relevance necessarily leads to unsatisfactorily ill-conditioned analytical and/or numerical results. To remedy the computational challenges associated with sub-sectional heterogeneities various sophisticated homogenization techniques need to be employed. Homogenization refers to the systematic process of smoothing out the sub-structural heterogeneities, leading to the determination of effective constitutive coefficients. Ordinarily, homogenization involves a sophisticated averaging and asymptotic order analysis to obtain solutions. In the majority of the cases only zero-order terms are constructed due to the complexity of the processes involved. In this paper we propose a constructive scheme for obtaining homogenized solutions involving higher order terms, and thus, guaranteeing higher accuracy and greater robustness of the numerical results. We present

  15. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2012-01-01

    The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data
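
    Two of the performance metrics listed above can be written compactly as in the sketch below; the HOME benchmark applies them at several averaging scales and to both station and network-mean series, which this illustration omits, and the synthetic series are not benchmark data.

```python
import numpy as np

def centered_rmse(homogenized, truth):
    """Centered root-mean-square error between a homogenized series and the
    true homogeneous series: anomalies about each series' own mean are
    compared, so a constant offset is not penalized."""
    h = homogenized - np.mean(homogenized)
    t = truth - np.mean(truth)
    return np.sqrt(np.mean((h - t) ** 2))

def trend_error(homogenized, truth):
    """Difference in linear trends (per time step) estimated by least squares."""
    x = np.arange(len(truth))
    slope_h = np.polyfit(x, homogenized, 1)[0]
    slope_t = np.polyfit(x, truth, 1)[0]
    return slope_h - slope_t

# Illustration with synthetic monthly anomalies and one residual (missed) break.
rng = np.random.default_rng(3)
truth = 0.002 * np.arange(600) + rng.standard_normal(600)
homog = truth + 0.3 * (np.arange(600) > 300)
print(centered_rmse(homog, truth), trend_error(homog, truth))
```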

  16. Multifractal spectra in homogeneous shear flow

    NASA Technical Reports Server (NTRS)

    Deane, A. E.; Keefe, L. R.

    1988-01-01

    Employing numerical simulations of 3-D homogeneous shear flow, the associated multifractal spectra of the energy dissipation, scalar dissipation and vorticity fields were calculated. The results for 128³ simulations of this flow, and those obtained in recent experiments that analyzed 1- and 2-D intersections of atmospheric and laboratory flows, are in some agreement. A two-scale Cantor set model of the energy cascade process, which describes the experimental results from 1-D intersections quite well, describes the 3-D results only marginally.
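
    For reference, a commonly used form of the two-scale Cantor (binomial) cascade divides each eddy's dissipation flux between two equal-sized sub-eddies in fractions p and 1-p; the resulting generalized dimensions and singularity spectrum are sketched below. The specific model parameters used in the paper are not reproduced here.

```latex
% Binomial two-scale Cantor cascade with equal subdivision scales (l = 1/2)
% and measure fractions p and 1-p:
\tau(q) = (q-1)\,D_q = -\log_2\!\bigl(p^{q} + (1-p)^{q}\bigr), \qquad
D_q = \frac{\log_2\!\bigl(p^{q} + (1-p)^{q}\bigr)}{1-q},
% with the singularity spectrum obtained by Legendre transform:
\alpha(q) = \frac{d\tau}{dq}, \qquad f\bigl(\alpha(q)\bigr) = q\,\alpha(q) - \tau(q).
```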

  17. Sewage sludge solubilization by high-pressure homogenization.

    PubMed

    Zhang, Yuxuan; Zhang, Panyue; Guo, Jianbin; Ma, Weifang; Fang, Wei; Ma, Boqiang; Xu, Xiangzhe

    2013-01-01

    The behavior of sludge solubilization using high-pressure homogenization (HPH) treatment was examined by investigating the sludge solid reduction and organics solubilization. The sludge volatile suspended solids (VSS) decreased from 10.58 to 6.67 g/L for the sludge sample with a total solids content (TS) of 1.49% after HPH treatment at a homogenization pressure of 80 MPa with four homogenization cycles; total suspended solids (TSS) correspondingly decreased from 14.26 to 9.91 g/L. About 86.15% of the TSS reduction was attributed to the VSS reduction. The increase of homogenization pressure from 20 to 80 MPa or homogenization cycle number from 1 to 4 was favorable to the sludge organics solubilization, and the protein and polysaccharide solubilization linearly increased with the soluble chemical oxygen demand (SCOD) solubilization. More proteins were solubilized than polysaccharides. The linear relationship between SCOD solubilization and VSS reduction had no significant change under different homogenization pressures, homogenization cycles and sludge solid contents. The SCOD of 1.65 g/L was solubilized for the VSS reduction of 1.00 g/L for the three experimental sludge samples with a TS of 1.00, 1.49 and 2.48% under all HPH operating conditions. The energy efficiency results showed that the HPH treatment at a homogenization pressure of 30 MPa with a single homogenization cycle for the sludge sample with a TS of 2.48% was the most energy efficient.

  18. Homogenous stretching or detachment faulting? Which process is primarily extending the Aegean crust

    NASA Astrophysics Data System (ADS)

    Kumerics, C.; Ring, U.

    2003-04-01

    In extending orogens like the Aegean Sea of Greece and the Basin-and-Range province of the western United States, knowledge of the rates of tectonic processes is important for understanding which process is primarily extending the crust. Platt et al. (1998) proposed that homogeneous stretching of the lithosphere (i.e. vertical ductile thinning associated with a subhorizontal foliation) at rates of 4-5 km Myr⁻¹ is the dominant process that formed the Alboran Sea in the western Mediterranean. The Aegean Sea in the eastern Mediterranean is well known for its low-angle normal faults (detachments) (Lister et al., 1984; Lister & Forster, 1996), suggesting that detachment faulting may have been the primary agent achieving ≳250 km (McKenzie, 1978) of extension since the Miocene. Ring et al. (2003) provided evidence for a very fast-slipping detachment on the islands of Syros and Tinos in the western Cyclades, which suggests that normal faulting was the dominant tectonic process that formed the Aegean Sea. However, most extensional detachments in the Aegean do not allow the amount of vertical ductile thinning associated with extension to be quantified, and therefore a full evaluation of the significance of vertical ductile thinning is not possible. On the Island of Ikaria in the eastern Aegean Sea, a subhorizontal extensional ductile shear zone is well exposed. We studied this shear zone in detail to quantify the amount of vertical ductile thinning associated with extension. Numerous studies have shown that natural shear zones usually deviate significantly from progressive simple shear and are characterized by pronounced shortening perpendicular to the shear zone. Numerous deformed pegmatitic veins in this shear zone on Ikaria allow the reconstruction of deformation and flow parameters (Passchier, 1990), which are necessary for quantifying the amount of vertical ductile thinning in the shear zone. Furthermore, a flow-path and finite-strain study in a syn-tectonic granite, which

  19. Case for a field-programmable gate array multicore hybrid machine for an image-processing application

    NASA Astrophysics Data System (ADS)

    Rakvic, Ryan N.; Ives, Robert W.; Lira, Javier; Molina, Carlos

    2011-01-01

    General purpose computer designers have recently begun adding cores to their processors in order to increase performance. For example, Intel has adopted a homogeneous quad-core processor as a base for general purpose computing. PlayStation3 (PS3) game consoles contain a multicore heterogeneous processor known as the Cell, which is designed to perform complex image processing algorithms at a high level. Can modern image-processing algorithms utilize these additional cores? On the other hand, modern advancements in configurable hardware, most notably field-programmable gate arrays (FPGAs) have created an interesting question for general purpose computer designers. Is there a reason to combine FPGAs with multicore processors to create an FPGA multicore hybrid general purpose computer? Iris matching, a repeatedly executed portion of a modern iris-recognition algorithm, is parallelized on an Intel-based homogeneous multicore Xeon system, a heterogeneous multicore Cell system, and an FPGA multicore hybrid system. Surprisingly, the cheaper PS3 slightly outperforms the Intel-based multicore on a core-for-core basis. However, both multicore systems are beaten by the FPGA multicore hybrid system by >50%.

  20. Design Process for Online Websites Created for Teaching Turkish as a Foreign Language in Web Based Environments

    ERIC Educational Resources Information Center

    Türker, Fatih Mehmet

    2016-01-01

    In today's world, where online learning environments have increased their efficiency in education and training, the design of the websites prepared for education and training purposes has become an important process. This study is about the teaching process of the online learning environments created to teach Turkish in web based environments, and…

  1. Using endemic road features to create self-explaining roads and reduce vehicle speeds.

    PubMed

    Charlton, Samuel G; Mackie, Hamish W; Baas, Peter H; Hay, Karen; Menezes, Miguel; Dixon, Claire

    2010-11-01

    This paper describes a project undertaken to establish a self-explaining roads (SER) design programme on existing streets in an urban area. The methodology focussed on developing a process to identify functional road categories and designs based on endemic road characteristics taken from functional exemplars in the study area. The study area was divided into two sections, one to receive SER treatments designed to maximise visual differences between road categories, and a matched control area to remain untreated for purposes of comparison. The SER design for local roads included increased landscaping and community islands to limit forward visibility, and removal of road markings to create a visually distinct road environment. In comparison, roads categorised as collectors received increased delineation, addition of cycle lanes, and improved amenity for pedestrians. Speed data collected 3 months after implementation showed a significant reduction in vehicle speeds on local roads and increased homogeneity of speeds on both local and collector roads. The objective speed data, combined with residents' speed choice ratings, indicated that the project was successful in creating two discriminably different road categories. 2010 Elsevier Ltd. All rights reserved.

  2. Cellulose nanofibers from banana peels as a Pickering emulsifier: High-energy emulsification processes.

    PubMed

    Costa, Ana Letícia Rodrigues; Gomes, Andresa; Tibolla, Heloisa; Menegalli, Florencia Cecilia; Cunha, Rosiane Lopes

    2018-08-15

    Cellulose nanofibers (CNFs) from banana peels were evaluated as a promising stabilizer for oil-in-water emulsions. CNFs were treated using ultrasound and a high-pressure homogenizer. Changes in the size, crystallinity index and zeta potential of the CNFs were associated with the intense effects of the cavitation phenomenon and shear forces promoted by the mechanical treatments. CNF-stabilized emulsions were produced under the same process conditions as the particles. The coalescence phenomenon was observed in the emulsions produced using the high-pressure homogenizer, whereas droplet flocculation occurred in emulsions processed by ultrasound. In the latter, coalescence stability was associated with the effects of cavitation forces acting on CNF breakup. Thus, smaller droplets created during the ultrasonication process could be covered by particles that acted as an effective barrier against droplet coalescence. Our results improve understanding of the relationship between the choice of emulsification process and its effects on the CNF properties, influencing the potential application of CNFs as a food emulsifier. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Quadrature transmit coil for breast imaging at 7 tesla using forced current excitation for improved homogeneity.

    PubMed

    McDougall, Mary Preston; Cheshkov, Sergey; Rispoli, Joseph; Malloy, Craig; Dimitrov, Ivan; Wright, Steven M

    2014-11-01

    To demonstrate the use of forced current excitation (FCE) to create homogeneous excitation of the breast at 7 tesla, insensitive to the effects of asymmetries in the electrical environment. FCE was implemented on two breast coils: one for quadrature ¹H imaging and one for proton-decoupled ¹³C spectroscopy. Both were a Helmholtz-saddle combination, with the saddle tuned to 298 MHz for imaging and 75 MHz for spectroscopy. Bench measurements were acquired to demonstrate the ability to force equal currents on elements in the presence of asymmetric loading to improve homogeneity. Modeling and temperature measurements were conducted per safety protocol. B1 mapping, imaging, and proton-decoupled ¹³C spectroscopy were demonstrated in vivo. Using FCE to ensure balanced currents on elements enabled straightforward tuning and maintaining of isolation between quadrature elements of the coil. Modeling and bench measurements confirmed homogeneity of the field, which resulted in images with excellent fat suppression and in broadband proton-decoupled carbon-13 spectra. FCE is a straightforward approach to ensure equal currents on multiple coil elements and a homogeneous excitation field, insensitive to the effects of asymmetries in the electrical environment. This enabled effective breast imaging and proton-decoupled carbon-13 spectroscopy at 7T. © 2014 Wiley Periodicals, Inc.

  4. A homogeneous superconducting magnet design using a hybrid optimization algorithm

    NASA Astrophysics Data System (ADS)

    Ni, Zhipeng; Wang, Qiuliang; Liu, Feng; Yan, Luguang

    2013-12-01

    This paper employs a hybrid optimization algorithm with a combination of linear programming (LP) and nonlinear programming (NLP) to design the highly homogeneous superconducting magnets for magnetic resonance imaging (MRI). The whole work is divided into two stages. The first LP stage provides a global optimal current map with several non-zero current clusters, and the mathematical model for the LP was updated by taking into account the maximum axial and radial magnetic field strength limitations. In the second NLP stage, the non-zero current clusters were discretized into practical solenoids. The superconducting conductor consumption was set as the objective function both in the LP and NLP stages to minimize the construction cost. In addition, the peak-peak homogeneity over the volume of imaging (VOI), the scope of 5 Gauss fringe field, and maximum magnetic field strength within superconducting coils were set as constraints. The detailed design process for a dedicated 3.0 T animal MRI scanner was presented. The homogeneous magnet produces a magnetic field quality of 6.0 ppm peak-peak homogeneity over a 16 cm by 18 cm elliptical VOI, and the 5 Gauss fringe field was limited within a 1.5 m by 2.0 m elliptical region.
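
    A toy version of the LP stage described above can be set up on the magnet axis alone, using the on-axis field of circular loops and minimizing total ampere-turns as a proxy for conductor consumption; the geometry, tolerance and solver below are illustrative assumptions, and the actual design also constrains the fringe field and the peak field inside the coils.

```python
import numpy as np
from scipy.optimize import linprog

# Candidate coaxial loops (radius R, axial position zc) and on-axis target points.
mu0 = 4e-7 * np.pi
R = np.full(40, 0.5)                       # m, fixed radius for simplicity
zc = np.linspace(-0.6, 0.6, 40)            # m, candidate loop positions
zt = np.linspace(-0.08, 0.08, 21)          # m, points spanning the imaging region
B0, tol = 3.0, 6e-6                        # target field (T) and a ±6 ppm window

# A[i, j]: on-axis field at zt[i] per unit ampere-turn in loop j.
A = mu0 * R**2 / (2.0 * (R**2 + (zt[:, None] - zc[None, :])**2) ** 1.5)

# Split signed currents into non-negative parts x = [I+, I-] so I = I+ - I-,
# and minimize the total ampere-turns sum(I+ + I-) as a conductor-use proxy.
n = len(zc)
c = np.ones(2 * n)
A_signed = np.hstack([A, -A])
A_ub = np.vstack([A_signed, -A_signed])
b_ub = np.concatenate([(B0 + tol) * np.ones(len(zt)),
                       -(B0 - tol) * np.ones(len(zt))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2 * n, method="highs")
current_map = res.x[:n] - res.x[n:]        # sparse clusters of non-zero current
print(res.status, current_map.round(1))
```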

  5. Mechanized syringe homogenization of human and animal tissues.

    PubMed

    Kurien, Biji T; Porter, Andrew C; Patel, Nisha C; Kurono, Sadamu; Matsumoto, Hiroyuki; Scofield, R Hal

    2004-06-01

    Tissue homogenization is a prerequisite to any fractionation schedule. A plethora of hands-on methods are available to homogenize tissues. Here we report a mechanized method for homogenizing animal and human tissues rapidly and easily. The Bio-Mixer 1200 (manufactured by Innovative Products, Inc., Oklahoma City, OK) utilizes the back-and-forth movement of two motor-driven disposable syringes, connected to each other through a three-way stopcock, to homogenize animal or human tissue. Using this method, we were able to homogenize human or mouse tissues (brain, liver, heart, and salivary glands) in 5 min. From sodium dodecyl sulfate-polyacrylamide gel electrophoresis analysis and a matrix-assisted laser desorption/ionization time-of-flight mass spectrometric enzyme assay for prolidase, we found that the homogenates obtained were as good as or even better than those obtained using a manual glass-on-Teflon (DuPont, Wilmington, DE) homogenization protocol (all-glass tube and Teflon pestle). Use of the Bio-Mixer 1200 to homogenize animal or human tissue precludes the need to stay in the cold room, as is the case with the other hands-on homogenization methods available, in addition to freeing up time for other experiments.

  6. Homogenization kinetics of a nickel-based superalloy produced by powder bed fusion laser sintering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Fan; Levine, Lyle E.; Allen, Andrew J.

    2017-04-01

    Additively manufactured (AM) metal components often exhibit fine dendritic microstructures and elemental segregation due to the initial rapid solidification and subsequent melting and cooling during the build process, which without homogenization would adversely affect materials performance. In this letter, we report in situ observation of the homogenization kinetics of an AM nickel-based superalloy using synchrotron small angle X-ray scattering. The identified kinetic time scale is in good agreement with thermodynamic diffusion simulation predictions using microstructural dimensions acquired by ex situ scanning electron microscopy. These findings could serve as a recipe for predicting, observing, and validating homogenization treatments in AM materials.

  7. Homogenization Kinetics of a Nickel-based Superalloy Produced by Powder Bed Fusion Laser Sintering.

    PubMed

    Zhang, Fan; Levine, Lyle E; Allen, Andrew J; Campbell, Carelyn E; Lass, Eric A; Cheruvathur, Sudha; Stoudt, Mark R; Williams, Maureen E; Idell, Yaakov

    2017-04-01

    Additively manufactured (AM) metal components often exhibit fine dendritic microstructures and elemental segregation due to the initial rapid solidification and subsequent melting and cooling during the build process, which without homogenization would adversely affect materials performance. In this letter, we report in situ observation of the homogenization kinetics of an AM nickel-based superalloy using synchrotron small angle X-ray scattering. The identified kinetic time scale is in good agreement with thermodynamic diffusion simulation predictions using microstructural dimensions acquired by ex situ scanning electron microscopy. These findings could serve as a recipe for predicting, observing, and validating homogenization treatments in AM materials.

  8. CREATE-IP and CREATE-V: Data and Services Update

    NASA Astrophysics Data System (ADS)

    Carriere, L.; Potter, G. L.; Hertz, J.; Peters, J.; Maxwell, T. P.; Strong, S.; Shute, J.; Shen, Y.; Duffy, D.

    2017-12-01

    The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center and the Earth System Grid Federation (ESGF) are working together to build a uniform environment for the comparative study and use of a group of reanalysis datasets of particular importance to the research community. This effort is called the Collaborative REAnalysis Technical Environment (CREATE) and it contains two components: the CREATE-Intercomparison Project (CREATE-IP) and CREATE-V. This year's efforts included generating and publishing an atmospheric reanalysis ensemble mean and spread and improving the analytics available through CREATE-V. Related activities included adding access to subsets of the reanalysis data through ArcGIS and expanding the visualization tool to GMAO forecast data. This poster will present the access mechanisms to this data and use cases including example Jupyter Notebook code. The reanalysis ensemble was generated using two methods, first using standard Python tools for regridding, extracting levels and creating the ensemble mean and spread on a virtual server in the NCCS environment. The second was using a new analytics software suite, the Earth Data Analytics Services (EDAS), coupled with a high-performance Data Analytics and Storage System (DASS) developed at the NCCS. Results were compared to validate the EDAS methodologies, and the results, including time to process, will be presented. The ensemble includes selected 6 hourly and monthly variables, regridded to 1.25 degrees, with 24 common levels used for the 3D variables. Use cases for the new data and services will be presented, including the use of EDAS for the backend analytics on CREATE-V, the use of the GMAO forecast aerosol and cloud data in CREATE-V, and the ability to connect CREATE-V data to NCCS ArcGIS services.
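
    A hedged sketch of how the regridding and ensemble mean/spread step could look with standard Python tools (xarray) is given below; the file names, the variable name "ta", and the target grid are placeholders, not the CREATE-IP production workflow (which also handles vertical levels and 6-hourly data).

```python
# Assumes the reanalysis files exist locally and expose a variable "ta" on
# coordinates named lat/lon; all of these names are hypothetical.
import numpy as np
import xarray as xr

files = ["merra2_ta.nc", "era_interim_ta.nc", "jra55_ta.nc"]   # hypothetical paths
target_lat = np.arange(-90, 90.01, 1.25)
target_lon = np.arange(0, 360, 1.25)

members = []
for f in files:
    ds = xr.open_dataset(f)
    # put every reanalysis on the common 1.25-degree grid before combining
    members.append(ds["ta"].interp(lat=target_lat, lon=target_lon))

ens = xr.concat(members, dim="member")
ens_mean = ens.mean(dim="member")      # ensemble mean
ens_spread = ens.std(dim="member")     # ensemble spread
xr.Dataset({"ta_mean": ens_mean, "ta_spread": ens_spread}).to_netcdf("ensemble_ta.nc")
```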

  9. Effect of homogenization and pasteurization on the structure and thermal stability of whey protein in milk

    USDA-ARS?s Scientific Manuscript database

    The effect of homogenization alone or in combination with high temperature, short time (HTST) pasteurization or UHT processing on the whey fraction of milk was investigated using highly sensitive spectroscopic techniques. In pilot plant trials, 1-L quantities of whole milk were homogenized in a two-...

  10. Inferring the demographic history from DNA sequences: An importance sampling approach based on non-homogeneous processes.

    PubMed

    Ait Kaci Azzou, S; Larribe, F; Froda, S

    2016-10-01

    In Ait Kaci Azzou et al. (2015) we introduced an Importance Sampling (IS) approach for estimating the demographic history of a sample of DNA sequences, the skywis plot. More precisely, we proposed a new nonparametric estimate of a population size that changes over time. We showed on simulated data that the skywis plot can work well in typical situations where the effective population size does not undergo very steep changes. In this paper, we introduce an iterative procedure which extends the previous method and gives good estimates under such rapid variations. In the iterative calibrated skywis plot we approximate the effective population size by a piecewise constant function, whose values are re-estimated at each step. These piecewise constant functions are used to generate the waiting times of non-homogeneous Poisson processes related to a coalescent process with mutation under a variable population size model. Moreover, the present IS procedure is based on a modified version of the Stephens and Donnelly (2000) proposal distribution. Finally, we apply the iterative calibrated skywis plot method to a simulated data set from a rapidly expanding exponential model, and we show that the method based on this new IS strategy correctly reconstructs the demographic history. Copyright © 2016. Published by Elsevier Inc.
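
    The waiting-time construction mentioned above can be illustrated generically: for a non-homogeneous Poisson process with piecewise-constant intensity, the next event time follows by accumulating integrated intensity until it exceeds an Exp(1) draw. The sketch below shows that inversion step only; it is not the authors' coalescent-specific code, where the rate also depends on the current sample size.

```python
import numpy as np

def next_waiting_time(rate_values, breakpoints, t0, rng):
    """Waiting time to the next event of a non-homogeneous Poisson process
    whose intensity is piecewise constant: rate_values[i] applies on
    [breakpoints[i], breakpoints[i+1]).  Inversion method: accumulate the
    integrated intensity until it exceeds an Exp(1) draw."""
    target = rng.exponential(1.0)
    t = t0
    i = np.searchsorted(breakpoints, t0, side="right") - 1
    acc = 0.0
    while True:
        seg_end = breakpoints[i + 1] if i + 1 < len(breakpoints) else np.inf
        seg = (seg_end - t) * rate_values[i]
        if acc + seg >= target:
            return t + (target - acc) / rate_values[i] - t0
        acc += seg
        t, i = seg_end, i + 1

rng = np.random.default_rng(4)
# The effective population size (hence the event rate) changes at times 0, 1, 2.
print(next_waiting_time([2.0, 0.5, 1.0], np.array([0.0, 1.0, 2.0]), t0=0.0, rng=rng))
```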

  11. An easy-to-use word processing program for creating concept cards in psychology courses: a method for teachers.

    PubMed

    Abramson, Charles I; Robinson, Ellen Gray; Rice, Jessica; Burley, Jami; Bergman, Staci; Delougherty, Patricia; Reudy, Katherine

    2002-06-01

    We describe a template to create concept cards in psychology courses using a word processing program. Students create their own individualized cards, which have the look and feel of flashcards and retain the same self-testing and monitoring features. Students report that the template is easy to use and that the cards help them focus their study behavior and employ critical thinking skills in learning class material. We offer several suggestions on how to use the cards.

  12. Local loss and spatial homogenization of plant diversity reduce ecosystem multifunctionality

    USDA-ARS?s Scientific Manuscript database

    Experimental studies show that local plant species loss decreases ecosystem functioning and services, but it remains unclear how other changes in biodiversity, such as spatial homogenization, alter multiple processes (multifunctionality) in natural ecosystems. We present a global analysis of eight ...

  13. Method to study the effect of blend flowability on the homogeneity of acetaminophen.

    PubMed

    Llusá, Marcos; Pingali, Kalyana; Muzzio, Fernando J

    2013-02-01

    Excipient selection is key to product development because it affects formulation processability and physical properties, which ultimately affect the quality attributes of the pharmaceutical product. The objective was to study how the flowability of lubricated formulations affects acetaminophen (APAP) homogeneity. The formulations studied here contain one of two types of cellulose (Avicel 102 or Ceollus KG-802), one of three grades of Mallinckrodt APAP (fine, semi-fine, or micronized), lactose (Fast-Flo) and magnesium stearate. These components are mixed in a 300-liter bin blender. Blend flowability is assessed with the Gravitational Displacement Rheometer. APAP homogeneity is assessed with off-line NIR. Excluding blends dominated by segregation, there is a trend between APAP homogeneity and blend flow index. Blend flowability is affected by the type of microcrystalline cellulose and by the APAP grade. The preliminary results suggest that the methodology used in this paper is adequate to study the effect of blend flow index on APAP homogeneity.

  14. Application and possible benefits of high hydrostatic pressure or high-pressure homogenization on beer processing: A review.

    PubMed

    Santos, Lígia Mr; Oliveira, Fabiano A; Ferreira, Elisa Hr; Rosenthal, Amauri

    2017-10-01

    Beer is the most consumed beverage in the world, especially in countries such as the USA, China and Brazil. It is an alcoholic beverage made from malted cereals, and barley malt is the main ingredient, to which water, hops and yeast are added. High-pressure processing is a non-traditional method to preserve food and beverages. This technology has become more attractive compared with heat pasteurization, due to the minimal changes it brings to the original nutritional and sensory characteristics of the product, and it comprises two processes: high hydrostatic pressure, which is the most industrially used process, and high-pressure homogenization. The use of high pressure hardly affects the molecules that are responsible for the aroma and taste, pigments and vitamins compared with conventional thermal processes. Thus, products processed by high-pressure processing have characteristics similar to fresh products, including beer. The aim of this paper was to review what has been investigated about beer processing using this technology with regard to the effects on physicochemical, microbiological and sensory characteristics and related issues. It is organized by processing steps, since high pressure can be applied to malting, mashing, boiling, filtration and pasteurization. Therefore, beer processed with high-pressure processing may have an extended shelf-life, because this process can inactivate beer spoilage microorganisms and result in a superior sensory quality related to freshness and preservation of flavors, as it does for juices that are already commercialized. However, beyond this application, high-pressure processing can modify protein structures, such as enzymes that are present in the malt, like α- and β-amylases. This process can activate enzymes to promote, for example, saccharification, or instead inactivate them at the end of mashing, depending on the pressure to which the product is subjected, besides being capable of isomerizing hops to raise beer bitterness

  15. A homogenized localizing gradient damage model with micro inertia effect

    NASA Astrophysics Data System (ADS)

    Wang, Zhao; Poh, Leong Hien

    2018-07-01

    The conventional gradient enhancement regularizes structural responses during material failure. However, it induces a spurious damage growth phenomenon, which is shown here to persist in dynamics. Similar issues were reported with the integral averaging approach. Consequently, the conventional nonlocal enhancement cannot adequately describe the dynamic fracture of quasi-brittle materials, particularly in the high strain rate regime, where a diffused damage profile precludes the development of closely spaced macrocracks. To this end, a homogenization theory is proposed to translate the micro processes onto the macro scale. Starting with simple elementary models at the micro scale to describe the fracture mechanisms, an additional kinematic field is introduced to capture the variations in deformation and velocity within a unit cell. An energetic equivalence between micro and macro is next imposed to ensure consistency at the two scales. The ensuing homogenized microforce balance resembles closely the conventional gradient expression, albeit with an interaction domain that decreases with damage, complemented by a micro inertia effect. Considering a direct single pressure bar example, the homogenized model is shown to resolve the non-physical responses obtained with conventional nonlocal enhancement. The predictive capability of the homogenized model is furthermore demonstrated by considering the spall tests of concrete, with good predictions on failure characteristics such as fragmentation profiles and dynamic tensile strengths, at three different loading rates.
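
    For context, the conventional implicit gradient enhancement referred to above regularizes the local equivalent strain through a Helmholtz-type equation with a fixed interaction length; a commonly used form is sketched below, with the understanding that the homogenized model in the paper replaces the constant gradient parameter by one that decreases with damage and adds a micro inertia contribution.

```latex
% Conventional implicit gradient enhancement (fixed interaction length l):
% the nonlocal equivalent strain \bar{\varepsilon} follows from the local
% equivalent strain \varepsilon_{eq} by solving
\bar{\varepsilon} - c\,\nabla^{2}\bar{\varepsilon} = \varepsilon_{eq},
\qquad c \sim l^{2},
% with homogeneous Neumann boundary conditions
\nabla\bar{\varepsilon}\cdot\mathbf{n} = 0 \ \text{on}\ \partial\Omega .
```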

  16. Two-Dimensional Homogeneous Fermi Gases

    NASA Astrophysics Data System (ADS)

    Hueck, Klaus; Luick, Niclas; Sobirey, Lennart; Siegl, Jonas; Lompe, Thomas; Moritz, Henning

    2018-02-01

    We report on the experimental realization of homogeneous two-dimensional (2D) Fermi gases trapped in a box potential. In contrast to harmonically trapped gases, these homogeneous 2D systems are ideally suited to probe local as well as nonlocal properties of strongly interacting many-body systems. As a first benchmark experiment, we use a local probe to measure the density of a noninteracting 2D Fermi gas as a function of the chemical potential and find excellent agreement with the corresponding equation of state. We then perform matter wave focusing to extract the momentum distribution of the system and directly observe Pauli blocking in a near unity occupation of momentum states. Finally, we measure the momentum distribution of an interacting homogeneous 2D gas in the crossover between attractively interacting fermions and bosonic dimers.

  17. Improving homogeneity by dynamic speed limit systems.

    PubMed

    van Nes, Nicole; Brandenburg, Stefan; Twisk, Divera

    2010-05-01

    Homogeneity of driving speeds is an important variable in determining road safety; more homogeneous driving speeds increase road safety. This study investigates the effect of introducing dynamic speed limit systems on homogeneity of driving speeds. A total of 46 subjects twice drove a route along 12 road sections in a driving simulator. The speed limit system (static-dynamic), the sophistication of the dynamic speed limit system (basic roadside, advanced roadside, and advanced in-car) and the situational condition (dangerous-non-dangerous) were varied. The homogeneity of driving speed, the rated credibility of the posted speed limit and the acceptance of the different dynamic speed limit systems were assessed. The results show that the homogeneity of individual speeds, defined as the variation in driving speed for an individual subject along a particular road section, was higher with the dynamic speed limit system than with the static speed limit system. The more sophisticated dynamic speed limit system tested within this study led to higher homogeneity than the less sophisticated systems. The acceptance of the dynamic speed limit systems used in this study was positive, they were perceived as quite useful and rather satisfactory. Copyright (c) 2009 Elsevier Ltd. All rights reserved.

  18. Searching regional rainfall homogeneity using atmospheric fields

    NASA Astrophysics Data System (ADS)

    Gabriele, Salvatore; Chiaravalloti, Francesco

    2013-03-01

    The correct identification of homogeneous areas in regional rainfall frequency analysis is fundamental to ensure the best selection of the probability distribution and of the regional model, so as to produce low bias and low root mean square error in quantile estimation. In an attempt to delineate rainfall spatial homogeneity, the paper explores a new approach that is based on meteo-climatic information. The results are verified ex post using standard homogeneity tests applied to the annual maximum daily rainfall series. The first step of the proposed procedure selects two different types of homogeneous large regions: convective macro-regions, which contain high values of the Convective Available Potential Energy index, normally associated with convective rainfall events, and stratiform macro-regions, which are characterized by low values of the Q vector Divergence index, associated with dynamic instability and stratiform precipitation. These macro-regions are identified using Hot Spot Analysis to emphasize clusters of extreme values of the indexes. In the second step, inside each identified macro-region, homogeneous sub-regions are found using kriging interpolation of the mean direction of the Vertically Integrated Moisture Flux. To check the proposed procedure, two detailed examples of homogeneous sub-regions are examined.

  19. Supported Dendrimer-Encapsulated Metal Clusters: Toward Heterogenizing Homogeneous Catalysts

    DOE PAGES

    Ye, Rong; Zhukhovitskiy, Aleksandr V.; Deraedt, Christophe V.; ...

    2017-07-13

    Recyclable catalysts, especially those that display selective reactivity, are vital for the development of sustainable chemical processes. Among available catalyst platforms, heterogeneous catalysts are particularly well-disposed toward separation from the reaction mixture via filtration methods, which renders them readily recyclable. Furthermore, heterogeneous catalysts offer numerous handles—some without homogeneous analogues—for performance and selectivity optimization. These handles include nanoparticle size, pore profile of porous supports, surface ligands and interface with oxide supports, and flow rate through a solid catalyst bed. Despite these available handles, however, conventional heterogeneous catalysts are themselves often structurally heterogeneous compared to homogeneous catalysts, which complicates efforts to optimize and expand the scope of their reactivity and selectivity. Ongoing efforts in our laboratories are aimed to address the above challenge by heterogenizing homogeneous catalysts, which can be defined as the modification of homogeneous catalysts to render them in a separable (solid) phase from the starting materials and products. Specifically, we grow the small nanoclusters in dendrimers, a class of uniform polymers with the connectivity of fractal trees and generally radial symmetry. Thanks to their dense multivalency, shape persistence, and structural uniformity, dendrimers have proven to be versatile scaffolds for the synthesis and stabilization of small nanoclusters. Then these dendrimer-encapsulated metal clusters (DEMCs) are adsorbed onto mesoporous silica. Through this method, we have achieved selective transformations that had been challenging to accomplish in a heterogeneous setting, e.g., π-bond activation and aldol reactions. Extensive investigation into the catalytic systems under reaction conditions allowed us to correlate the structural features (e.g., oxidation states) of the catalysts and their activity

  20. Quantitative Homogenization in Nonlinear Elasticity for Small Loads

    NASA Astrophysics Data System (ADS)

    Neukamm, Stefan; Schäffner, Mathias

    2018-04-01

    We study quantitative periodic homogenization of integral functionals in the context of nonlinear elasticity. Under suitable assumptions on the energy densities (in particular frame indifference; minimality, non-degeneracy and smoothness at the identity; p ≥ d growth from below; and regularity of the microstructure), we show that in a neighborhood of the set of rotations, the multi-cell homogenization formula of non-convex homogenization reduces to a single-cell formula. The latter can be expressed with the help of correctors. We prove that the homogenized integrand admits a quadratic Taylor expansion in an open neighborhood of the rotations - a result that can be interpreted as the fact that homogenization and linearization commute close to the rotations. Moreover, for small applied loads, we provide an estimate on the homogenization error in terms of a quantitative two-scale expansion.

  1. Effects of homogenization treatment on recrystallization behavior of 7150 aluminum sheet during post-rolling annealing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Zhanying; Department of Applied Science, University of Québec at Chicoutimi, Saguenay, QC G7H 2B1; Zhao, Gang

    2016-04-15

    The effects of two homogenization treatments applied to the direct chill (DC) cast billet on the recrystallization behavior in 7150 aluminum alloy during post-rolling annealing have been investigated using the electron backscatter diffraction (EBSD) technique. Following hot and cold rolling to the sheet, measured orientation maps, the recrystallization fraction and grain size, the misorientation angle and the subgrain size were used to characterize the recovery and recrystallization processes at different annealing temperatures. The results were compared between the conventional one-step homogenization and the new two-step homogenization, with the first step being pretreated at 250 °C. Al₃Zr dispersoids with higher densities and smaller sizes were obtained after the two-step homogenization, which strongly retarded subgrain/grain boundary mobility and inhibited recrystallization. Compared with the conventional one-step homogenized samples, a significantly lower recrystallized fraction and a smaller recrystallized grain size were obtained under all annealing conditions after cold rolling in the two-step homogenized samples. - Highlights: • Effects of two homogenization treatments on recrystallization in 7150 Al sheets • Quantitative study on the recrystallization evolution during post-rolling annealing • Al₃Zr dispersoids with higher densities and smaller sizes after two-step treatment • Higher recrystallization resistance of 7150 sheets with two-step homogenization.

  2. An efficient, reliable and inexpensive device for the rapid homogenization of multiple tissue samples by centrifugation.

    PubMed

    Ilyin, S E; Plata-Salamán, C R

    2000-02-15

    Homogenization of tissue samples is a common first step in the majority of current protocols for RNA, DNA, and protein isolation. This report describes a simple device for centrifugation-mediated homogenization of tissue samples. The method presented is applicable to RNA, DNA, and protein isolation, and we show examples where high quality total cell RNA, DNA, and protein were obtained from brain and other tissue samples. The advantages of the approach presented include: (1) a significant reduction in time investment relative to hand-driven or individual motorized-driven pestle homogenization; (2) easy construction of the device from inexpensive parts available in any laboratory; (3) high replicability in the processing; and (4) the capacity for the parallel processing of multiple tissue samples, thus allowing higher efficiency, reliability, and standardization.

  3. Creating visual explanations improves learning.

    PubMed

    Bobek, Eliza; Tversky, Barbara

    2016-01-01

    Many topics in science are notoriously difficult for students to learn. Mechanisms and processes outside student experience present particular challenges. While instruction typically involves visualizations, students usually explain in words. Because visual explanations can show parts and processes of complex systems directly, creating them should have benefits beyond creating verbal explanations. We compared learning from creating visual or verbal explanations for two STEM domains, a mechanical system (bicycle pump) and a chemical system (bonding). Both kinds of explanations were analyzed for content, and learning was assessed by a post-test. For the mechanical system, creating a visual explanation increased understanding particularly for participants of low spatial ability. For the chemical system, creating both visual and verbal explanations improved learning without new teaching. Creating a visual explanation was superior and benefitted participants of both high and low spatial ability. Visual explanations often included crucial yet invisible features. The greater effectiveness of visual explanations appears attributable to the checks they provide for completeness and coherence as well as to their roles as platforms for inference. The benefits should generalize to other domains like the social sciences, history, and archeology where important information can be visualized. Together, the findings provide support for the use of learner-generated visual explanations as a powerful learning tool.

  4. First-Principles Molecular Dynamics Studies of Organometallic Complexes and Homogeneous Catalytic Processes.

    PubMed

    Vidossich, Pietro; Lledós, Agustí; Ujaque, Gregori

    2016-06-21

    Computational chemistry is a valuable aid to complement experimental studies of organometallic systems and their reactivity. It allows probing mechanistic hypotheses and investigating molecular structures, shedding light on the behavior and properties of molecular assemblies at the atomic scale. When approaching a chemical problem, the computational chemist has to decide on the theoretical approach needed to describe electron/nuclear interactions and the composition of the model used to approximate the actual system. Both factors determine the reliability of the modeling study. The community dedicated much effort to developing and improving the performance and accuracy of theoretical approaches for electronic structure calculations, on which the description of (inter)atomic interactions rely. Here, the importance of the model system used in computational studies is highlighted through examples from our recent research focused on organometallic systems and homogeneous catalytic processes. We show how the inclusion of explicit solvent allows the characterization of molecular events that would otherwise not be accessible in reduced model systems (clusters). These include the stabilization of nascent charged fragments via microscopic solvation (notably, hydrogen bonding), transfer of charge (protons) between distant fragments mediated by solvent molecules, and solvent coordination to unsaturated metal centers. Furthermore, when weak interactions are involved, we show how conformational and solvation properties of organometallic complexes are also affected by the explicit inclusion of solvent molecules. Such extended model systems may be treated under periodic boundary conditions, thus removing the cluster/continuum (or vacuum) boundary, and require a statistical mechanics simulation technique to sample the accessible configurational space. First-principles molecular dynamics, in which atomic forces are computed from electronic structure calculations (namely, density

  5. A facile approach to manufacturing non-ionic surfactant nanodispersions using proniosome technology and high-pressure homogenization.

    PubMed

    Najlah, Mohammad; Hidayat, Kanar; Omer, Huner K; Mwesigwa, Enosh; Ahmed, Waqar; AlObaidy, Kais G; Phoenix, David A; Elhissi, Abdelbary

    2015-03-01

    In this study, a niosome nanodispersion was manufactured using high-pressure homogenization following the hydration of proniosomes. Using beclometasone dipropionate (BDP) as a model drug, the characteristics of the homogenized niosomes were compared with vesicles prepared via the conventional approach of probe-sonication. Particle size, zeta potential, and the drug entrapment efficiency were similar for both size reduction mechanisms. However, high-pressure homogenization was much more efficient than sonication in terms of homogenization output rate and avoidance of sample contamination, offering a greater potential for large-scale manufacturing of niosome nanodispersions. For example, high-pressure homogenization was capable of producing small niosomes (209 nm) in a short single size-reduction step (6 min) compared with the time-consuming process of sonication (237 nm in >18 min), with BDP entrapment efficiencies of 29.65 ± 4.04% and 36.4 ± 2.8%, respectively. In addition, the output rate of high-pressure homogenization was 10 ml/min compared with 0.83 ml/min using the sonication protocol. In conclusion, a facile, applicable, and highly efficient approach for preparing niosome nanodispersions has been established using proniosome technology and high-pressure homogenization.

  6. Absorbing metasurface created by diffractionless disordered arrays of nanoantennas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chevalier, Paul; Minao, Laboratoire de Photonique et Nanostructures; Bouchon, Patrick, E-mail: patrick.bouchon@onera.fr

    2015-12-21

    We study disordered arrays of metal-insulator-metal nanoantennas in order to create a diffractionless metasurface able to absorb light in the 3–5 μm spectral range. This study is conducted with angle-resolved reflectivity measurements obtained with a Fourier transform infrared spectrometer. A first design is based on a perturbation of a periodic arrangement, leading to a significant reduction of the radiative losses. Then, a random assembly of nanoantennas is built following a Poisson-disk distribution of given density, in order to obtain a nearly perfect cluttered assembly with optical properties of a homogeneous material.
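
    A Poisson-disk arrangement of the kind mentioned above enforces a minimum spacing between antennas while avoiding periodic order. The sketch below is a minimal dart-throwing illustration of that sampling; the patch size, spacing and antenna count are illustrative assumptions, not the fabricated design.

        import numpy as np

        def poisson_disk_darts(width, height, min_dist, n_target, max_tries=20000, seed=0):
            """Rejection ("dart-throwing") sampler for a Poisson-disk point set."""
            rng = np.random.default_rng(seed)
            points, tries = [], 0
            while len(points) < n_target and tries < max_tries:
                tries += 1
                candidate = rng.uniform([0.0, 0.0], [width, height])
                if all(np.hypot(*(candidate - p)) >= min_dist for p in points):
                    points.append(candidate)
            return np.array(points)

        # e.g. antenna centres on a 10 x 10 um patch with at least 0.5 um spacing
        # centres = poisson_disk_darts(10.0, 10.0, 0.5, n_target=150)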

  7. Sensitivity of liquid clouds to homogenous freezing parameterizations.

    PubMed

    Herbert, Ross J; Murray, Benjamin J; Dobbie, Steven J; Koop, Thomas

    2015-03-16

    Water droplets in some clouds can supercool to temperatures where homogeneous ice nucleation becomes the dominant freezing mechanism. In many cloud resolving and mesoscale models, it is assumed that homogeneous ice nucleation in water droplets only occurs below some threshold temperature typically set at -40°C. However, laboratory measurements show that there is a finite rate of nucleation at warmer temperatures. In this study we use a parcel model with detailed microphysics to show that cloud properties can be sensitive to homogeneous ice nucleation as warm as -30°C. Thus, homogeneous ice nucleation may be more important for cloud development, precipitation rates, and key cloud radiative parameters than is often assumed. Furthermore, we show that cloud development is particularly sensitive to the temperature dependence of the nucleation rate. In order to better constrain the parameterization of homogeneous ice nucleation, laboratory measurements are needed at both high (>-35°C) and low (<-38°C) temperatures. Key points: homogeneous freezing may be significant as warm as -30°C; homogeneous freezing should not be represented by a threshold approximation; there is a need for an improved parameterization of homogeneous ice nucleation.
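
    The contrast between a rate-based and a threshold treatment can be made concrete with the standard stochastic-freezing relation P = 1 - exp(-J(T)·V·Δt) for a droplet of volume V over a time step Δt. The sketch below uses a made-up placeholder nucleation-rate function, not a laboratory-derived parameterization, purely to illustrate the two treatments.

        import numpy as np

        def frozen_fraction_rate(T_c, droplet_volume_m3, dt_s, J):
            """Probability a droplet freezes in one step, given a rate J(T) in m^-3 s^-1."""
            return 1.0 - np.exp(-J(T_c) * droplet_volume_m3 * dt_s)

        def frozen_fraction_threshold(T_c, threshold_c=-40.0):
            """Conventional all-or-nothing treatment at a fixed threshold temperature."""
            return np.where(np.asarray(T_c) < threshold_c, 1.0, 0.0)

        # Placeholder rate: steep increase with supercooling (NOT a physical fit).
        J_demo = lambda T_c: 1e6 * np.exp(-2.0 * (T_c + 30.0))

        V = 4.0 / 3.0 * np.pi * (10e-6) ** 3           # 10-um-radius droplet
        print(frozen_fraction_rate(-35.0, V, 1.0, J_demo),
              frozen_fraction_threshold(-35.0))        # rate-based vs threshold result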

  8. Arc melting and homogenization of ZrC and ZrC + B alloys

    NASA Technical Reports Server (NTRS)

    Darolia, R.; Archbold, T. F.

    1973-01-01

    A description is given of the methods used to arc-melt and to homogenize near-stoichiometric ZrC and ZrC-boron alloys, giving attention to the oxygen contamination problem. The starting material for the carbide preparation was ZrC powder with an average particle size of 4.6 micron. Pellets weighing approximately 3 g each were prepared at room temperature from the powder by the use of an isostatic press operated at 50,000 psi. These pellets were individually melted in an arc furnace containing a static atmosphere of purified argon. A graphite resistance furnace was used for the homogenization process.

  9. Investigation on Hot Workability of Homogenized Al-Zn-Mg-Cu Alloy Based on Activation Energy and Processing Map

    NASA Astrophysics Data System (ADS)

    Peng, Xiaoyan; Su, Wusen; Xiao, Dan; Xu, Guofu

    2018-06-01

    Hot deformation behaviors of the homogenized Al-Zn-Mg-Cu alloy were studied by uniaxial compression tests carried out at 623-743 K and strain rates of 0.01-10 s⁻¹. A constitutive equation was developed to obtain the activation energy, from which the activation energy map was constructed. During hot deformation, the dominant softening mechanisms were dynamic recovery and dynamic recrystallization, which were increasingly favored with increasing temperature and decreasing activation energy. Based on the superposition of the activation energy map and the processing map, together with the microstructure characteristics, the optimized hot workability of the alloy was proposed in the domain of 670-743 K and 0.01-0.16 s⁻¹, where the peak efficiency was 0.39 and the activation energy range was 196-260 kJ mol⁻¹.
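
    This type of analysis usually rests on the hyperbolic-sine (Sellars-Tegart) constitutive law, strain rate = A·[sinh(ασ)]ⁿ·exp(-Q/RT), which becomes linear in its unknowns after taking logarithms. The sketch below fits that linearized form by least squares to extract n and Q; the data points and the stress multiplier α are illustrative placeholders, not measurements or constants from the paper.

        import numpy as np

        R = 8.314          # gas constant, J mol^-1 K^-1
        alpha = 0.014      # assumed stress multiplier, MPa^-1

        # Hypothetical peak-stress data: temperature (K), strain rate (s^-1), stress (MPa).
        T = np.array([623.0, 673.0, 723.0, 623.0, 673.0, 723.0])
        eps_dot = np.array([0.01, 0.01, 0.01, 1.0, 1.0, 1.0])
        sigma = np.array([120.0, 95.0, 70.0, 180.0, 150.0, 115.0])

        # Linearized law: ln(eps_dot) = ln(A) + n*ln(sinh(alpha*sigma)) - (Q/R)*(1/T)
        X = np.column_stack([np.ones_like(T), np.log(np.sinh(alpha * sigma)), 1.0 / T])
        coef, *_ = np.linalg.lstsq(X, np.log(eps_dot), rcond=None)
        n_exp, Q = coef[1], -coef[2] * R
        print(f"stress exponent n ~ {n_exp:.2f}, apparent activation energy Q ~ {Q/1e3:.0f} kJ/mol")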

  10. Properties of lotus seed starch-glycerin monostearin complexes formed by high pressure homogenization.

    PubMed

    Chen, Bingyan; Zeng, Shaoxiao; Zeng, Hongliang; Guo, Zebin; Zhang, Yi; Zheng, Baodong

    2017-07-01

    Starch-lipid complexes were prepared using lotus seed starch (LS) and glycerin monostearate (GMS) via a high pressure homogenization (HPH) process, and the effect of HPH on the physicochemical properties of LS-GMS complexes was investigated. The results of Fourier transform infrared spectroscopy and complex index analysis showed that LS-GMS complexes were formed at 40 MPa by HPH and that the complex index increased with increasing homogenization pressure. Scanning electron microscopy showed that LS-GMS complexes presented more nest-shaped structures with increasing homogenization pressure. X-ray diffraction and differential scanning calorimetry results revealed that a V-type crystalline polymorph was formed between LS and GMS, with higher homogenization pressure producing an increasingly stable complex. The LS-GMS complex inhibited starch granule swelling, solubility and pasting development, which further reduced peak and breakdown viscosity. During storage, LS-GMS complexes prepared at 70-100 MPa had higher Avrami exponent values and lower recrystallization rates compared with native starch, which suggests a lower retrogradation tendency. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Fuel mixture stratification as a method for improving homogeneous charge compression ignition engine operation

    DOEpatents

    Dec, John E [Livermore, CA; Sjoberg, Carl-Magnus G [Livermore, CA

    2006-10-31

    A method for slowing the heat-release rate in homogeneous charge compression ignition ("HCCI") engines that allows operation without excessive knock at higher engine loads than are possible with conventional HCCI. This method comprises injecting a fuel charge in a manner that creates a stratified fuel charge in the engine cylinder, providing a range of fuel concentrations in the in-cylinder gases (typically with enough oxygen for complete combustion), using a two-stage-ignition fuel with appropriate cool-flame chemistry so that regions of different fuel concentrations autoignite sequentially.

  12. Enabling Reanalysis Intercomparison with the CREATE-IP and CREATE-V Projects

    NASA Astrophysics Data System (ADS)

    Carriere, L.; Potter, G. L.; Hertz, J.; Shen, Y.; Britzolakis, G.; Peters, J.; Maxwell, T. P.; Li, J.; Strong, S.; Schnase, J. L.

    2016-12-01

    NASA Goddard Space Flight Center's Office of Computational and Information Sciences and Technology, the NASA Center for Climate Simulation (NCCS), and the Earth System Grid Federation (ESGF) are working together to build a uniform environment for the comparative study and use of a group of reanalysis datasets of particular importance to the research community. This effort is called the Collaborative REAnalysis Technical Environment (CREATE) and it contains two components: the CREATE-Intercomparison Project (CREATE-IP) and CREATE-V. For CREATE-IP, our target reanalyses include ECMWF ERA-Interim, NASA/GMAO MERRA and MERRA2, NOAA/NCEP CFSR, NOAA/ESRL 20CR and 20CRv2, JMA JRA25, and JRA55. Each dataset is reformatted similarly to the models in the CMIP5 archive. Repackaging the reanalysis data into a common structure and format simplifies access, subsetting, and reanalysis comparison. Both monthly average data and a selection of high frequency data (6-hr) relevant to investigations such as the 2016 El Niño are provided. Much of the processing workflow has been automated and new data appear on a regular basis. In collaboration with the CLIVAR Global Synthesis and Observations Panel (GSOP), we are also processing and publishing eight ocean reanalyses, from 1980 to the present. Here, the data are regridded to a common 1° x 1° grid, vertically interpolated to the World Ocean Atlas 09 (WOA09) depths, and an ensemble is generated. CREATE-V is a web based visualization tool that allows the user to simultaneously view four reanalyses to facilitate comparison. The addition of a backend analytics engine, based on UV-CDAT and Scala, provides the ability to generate a time series and anomaly for any given location on a map. The system enables scientists to identify data of interest and to visualize, subset, and compare data without the need to download large volumes of data for local visualization.
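
    A minimal sketch of the ocean-reanalysis repackaging step described above (regrid to a common 1° x 1° grid and form an ensemble) is shown below using xarray. File names, the variable name and the coordinate names are placeholders, not the actual CREATE-IP holdings or workflow code.

        import numpy as np
        import xarray as xr

        target_lat = np.arange(-89.5, 90.0, 1.0)
        target_lon = np.arange(0.5, 360.0, 1.0)

        members = []
        for path in ["reanalysis_a.nc", "reanalysis_b.nc"]:     # hypothetical input files
            ds = xr.open_dataset(path)
            # Interpolate onto the common grid (assumes coordinates are named lat/lon).
            members.append(ds["thetao"].interp(lat=target_lat, lon=target_lon))

        ensemble = xr.concat(members, dim="member")
        ensemble.mean(dim="member").to_netcdf("ensemble_mean_1deg.nc")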

  13. Airway bypass treatment of severe homogeneous emphysema: taking advantage of collateral ventilation.

    PubMed

    Choong, Cliff K; Cardoso, Paulo F G; Sybrecht, Gerhard W; Cooper, Joel D

    2009-05-01

    Airway bypass is being investigated as a new form of minimally invasive therapy for the treatment of homogeneous emphysema. It is a bronchoscopic catheter-based procedure that creates transbronchial extra-anatomic passages at the bronchial segmental level. The passages are expanded and supported by paclitaxel drug-eluting airway bypass stents, with the expectation that patency is maintained. The concept of airway bypass has been demonstrated in two separate experimental studies. These studies have shown that airway bypass takes advantage of collateral ventilation present in homogeneous emphysema to allow trapped gas to escape and reduce hyperinflation. It improves lung mechanics, expiratory flow, and volume. Airway bypass stent placements have been shown to be feasible and safe in both animal and human studies. Paclitaxel-eluting airway bypass stents were found to prolong stent patency and were adopted for clinical studies. A study evaluating the early results of the clinical application of airway bypass with paclitaxel-eluting stents found that airway bypass procedures reduced hyperinflation and improved pulmonary function and dyspnea in selected subjects who have severe emphysema. The duration of benefit appeared to correlate with the degree of pretreatment hyperinflation. These preliminary clinical results supported further evaluation of the procedure and led to the EASE Trial. The EASE Trial is a prospective, multicenter, randomized, double-blind, sham-controlled study. The trial aims to evaluate the safety and effectiveness of the airway bypass to improve pulmonary function and reduce dyspnea in homogeneous emphysema subjects who have severe hyperinflation. The trial is presently ongoing worldwide, though enrollment has been completed.

  14. Homogeneous buoyancy-generated turbulence

    NASA Technical Reports Server (NTRS)

    Batchelor, G. K.; Canuto, V. M.; Chasnov, J. R.

    1992-01-01

    Using a theoretical analysis of fundamental equations and a numerical simulation of the flow field, the statistically homogeneous motion that is generated by buoyancy forces after the creation of homogeneous random fluctuations in the density of infinite fluid at an initial instant is examined. It is shown that analytical results together with numerical results provide a comprehensive description of the 'birth, life, and death' of buoyancy-generated turbulence. Results of numerical simulations yielded the mean-square density and mean-square velocity fluctuations and the associated spectra as functions of time for various initial conditions, and the time required for the mean-square density fluctuation to fall to a specified small value was estimated.

  15. High temperature homogenization improves impact toughness of vitamin E-diffused, irradiated UHMWPE.

    PubMed

    Oral, Ebru; O'Brien, Caitlin; Doshi, Brinda; Muratoglu, Orhun K

    2017-06-01

    Diffusion of vitamin E into radiation cross-linked ultrahigh molecular weight polyethylene (UHMWPE) is used to increase stability against oxidation of total joint implant components. The dispersion of vitamin E throughout implant preforms has been optimized by a two-step process of doping and homogenization. Both of these steps are performed below the peak melting point of the cross-linked polymer (<140°C) to avoid loss of crystallinity and strength. Recently, it was discovered that the exposure of UHMWPE to elevated temperatures, around 300°C, for a limited amount of time in nitrogen, could improve the toughness without sacrificing wear resistance. We hypothesized that high temperature homogenization of antioxidant-doped, radiation cross-linked UHMWPE could improve its toughness. We found that homogenization at 300°C for 8 h resulted in an increase in the impact toughness (74 kJ/m² compared to 67 kJ/m²), the ultimate tensile strength (50 MPa compared to 43 MPa) and elongation at break (271% compared to 236%). The high temperature treatment did not compromise the wear resistance or the oxidative stability as measured by oxidation induction time. In addition, the desired homogeneity was achieved at a much shorter duration (8 h compared to >240 h) by using high temperature homogenization. © 2016 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 35:1343-1347, 2017.

  16. Self-dispersible nanocrystals of albendazole produced by high pressure homogenization and spray-drying.

    PubMed

    Paredes, Alejandro Javier; Llabot, Juan Manuel; Sánchez Bruni, Sergio; Allemandi, Daniel; Palma, Santiago Daniel

    2016-10-01

    Albendazole (ABZ) is a broad-spectrum antiparasitic drug used in the treatment of human or animal infections. Although ABZ has shown a high efficacy for repeated doses in monogastric mammals, its low aqueous solubility leads to erratic bioavailability. The aim of this work was to optimize a procedure in order to obtain ABZ self-dispersible nanocrystals (SDNC) by combining high pressure homogenization (HPH) and spray-drying (SD). The material thus obtained was characterized and the variables affecting both the HPH and SD processes were studied. As expected, the homogenizing pressure and number of cycles influenced the final particle size, while the stabilizer concentration had a strong impact on SD output and redispersion of powders upon contact with water. ABZ SDNC were successfully obtained with high process yield and redispersibility. The characteristic peaks of ABZ were clearly identified in the X-ray patterns of the processed samples. A noticeable increase in the dissolution rate was observed in the aqueous environment.

  17. Enzymatic cell wall degradation of high-pressure-homogenized tomato puree and its effect on lycopene bioaccessibility.

    PubMed

    Palmero, Paola; Colle, Ines; Lemmens, Lien; Panozzo, Agnese; Nguyen, Tuyen Thi My; Hendrickx, Marc; Van Loey, Ann

    2016-01-15

    High-pressure homogenization disrupts cell structures, assisting carotenoid release from the matrix and subsequent micellarization. However, lycopene bioaccessibility of tomato puree upon high-pressure homogenization is limited by the formation of a process-induced barrier. In this context, cell wall-degrading enzymes were applied to hydrolyze the formed barrier and enhance lycopene bioaccessibility. The effectiveness of the enzymes in degrading their corresponding substrates was evaluated (consistency, amount of reducing sugars, molar mass distribution and immunolabeling). An in vitro digestion procedure was applied to evaluate the effect of the enzymatic treatments on lycopene bioaccessibility. Enzymatic treatments with pectinases and cellulase were proved to effectively degrade their corresponding cell wall polymers; however, no further significant increase in lycopene bioaccessibility was obtained. A process-induced barrier consisting of cell wall material is not the only factor governing lycopene bioaccessibility upon high-pressure homogenization. © 2015 Society of Chemical Industry.

  18. Preparation of cotton linter nanowhiskers by high-pressure homogenization process and its application in thermoplastic starch

    NASA Astrophysics Data System (ADS)

    Savadekar, N. R.; Karande, V. S.; Vigneshwaran, N.; Kadam, P. G.; Mhaske, S. T.

    2015-03-01

    The present work deals with the preparation of cotton linter nanowhiskers (CLNW) by acid hydrolysis and subsequent processing in a high-pressure homogenizer. Prepared CLNW were then used as a reinforcing material in thermoplastic starch (TPS), with an aim to improve its performance properties. Concentration of CLNW was varied as 0, 1, 2, 3, 4 and 5 wt% in TPS. TPS/CLNW nanocomposite films were prepared by a solution-casting process. The nanocomposite films were characterized by tensile, differential scanning calorimetry, scanning electron microscopy (SEM), water vapor permeability (WVP), oxygen permeability (OP), X-ray diffraction and light transmittance properties. 3 wt% CLNW-loaded TPS nanocomposite films demonstrated 88% improvement in the tensile strength as compared to the pristine TPS polymer film; whereas WVP and OP decreased by 90 and 92%, respectively, which is highly appreciable compared to the quantity of CLNW added. DSC thermograms of nanocomposite films did not show any significant effect on melting temperature as compared to the pristine TPS. The light transmittance (Tr) value of TPS decreased with increasing CLNW content. Better interaction between CLNW and TPS, caused by the hydrophilic nature of both materials, and uniform distribution of CLNW in TPS were the prime reasons for the improvement in properties observed at 3 wt% loading of CLNW in TPS. However, SEM analysis showed that CLNW formed agglomerates at higher concentrations. These nanocomposite films can have potential use in food and pharmaceutical packaging applications.

  19. Stability analysis for virus spreading in complex networks with quarantine and non-homogeneous transition rates

    NASA Astrophysics Data System (ADS)

    Alarcon-Ramos, L. A.; Schaum, A.; Rodríguez Lucatero, C.; Bernal Jaquez, R.

    2014-03-01

    Virus propagations in complex networks have been studied in the framework of discrete time Markov process dynamical systems. These studies have been carried out under the assumption of homogeneous transition rates, yielding conditions for virus extinction in terms of the transition probabilities and the largest eigenvalue of the connectivity matrix. Nevertheless, the assumption of homogeneous rates is rather restrictive. In the present study we consider non-homogeneous transition rates, assigned according to a uniform distribution, with susceptible, infected and quarantine states, thus generalizing the previous studies. A remarkable result of this analysis is that extinction depends on the weakest element in the network. Simulation results are presented for large scale-free networks that corroborate our theoretical findings.
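
    For the homogeneous-rate case that this work generalizes, the extinction condition is commonly written as β/δ < 1/λmax(A), where λmax(A) is the largest eigenvalue of the connectivity matrix. The sketch below checks that condition on a random graph; the network and the rates are illustrative, and the paper's per-node rates and quarantine state are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200
        A = (rng.random((n, n)) < 0.02).astype(float)   # random connectivity matrix
        A = np.triu(A, 1)
        A = A + A.T                                     # undirected, no self-loops

        lam_max = np.max(np.linalg.eigvalsh(A))         # largest eigenvalue (A symmetric)
        beta, delta = 0.05, 0.4                         # infection / recovery probabilities
        print("extinction expected:", beta / delta < 1.0 / lam_max)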

  20. It's Who You Know "and" What You Know: The Process of Creating Partnerships between Schools and Communities

    ERIC Educational Resources Information Center

    Hands, Catherine

    2005-01-01

    Based on qualitative research, this article aims to clarify the process of creating school-community partnerships. Two secondary schools with numerous partnerships were selected within a southern Ontario school board characterized by economic and cultural diversity. Drawing on the within- and cross-case analyses of documents, observations, and 25…

  1. Influence of Interspecific Competition and Landscape Structure on Spatial Homogenization of Avian Assemblages

    PubMed Central

    Robertson, Oliver J.; McAlpine, Clive; House, Alan; Maron, Martine

    2013-01-01

    Human-induced biotic homogenization resulting from landscape change and increased competition from widespread generalists or ‘winners’ is widely recognized as a global threat to biodiversity. However, it remains unclear what aspects of landscape structure influence homogenization. This paper tests the importance of interspecific competition and landscape structure for the spatial homogeneity of avian assemblages within a fragmented agricultural landscape of eastern Australia. We used field observations of the density of 128 diurnal bird species to calculate taxonomic and functional similarity among assemblages. We then examined whether taxonomic and functional similarity varied with patch type, the extent of woodland habitat, land-use intensity, habitat subdivision, and the presence of Manorina colonies (a competitive genus of honeyeaters). We found the presence of a Manorina colony was the most significant factor positively influencing both taxonomic and functional similarity of bird assemblages. Competition from members of this widespread genus of native honeyeater, rather than landscape structure, was the main cause of both taxonomic and functional homogenization. These species have not recently expanded their range, but rather have increased in density in response to agricultural landscape change. The negative impacts of Manorina honeyeaters on assemblage similarity were most pronounced in landscapes of moderate land-use intensity. We conclude that in these human-modified landscapes, increased competition from dominant native species, or ‘winners’, can result in homogeneous avian assemblages and the loss of specialist species. These interacting processes make biotic homogenization resulting from land-use change a global threat to biodiversity in modified agro-ecosystems. PMID:23724136
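
    Assemblage similarity of the kind computed here from species-density data is often expressed as one minus the Bray-Curtis dissimilarity between pairs of sites. The sketch below illustrates that calculation; the density matrix is made up, and the paper's exact taxonomic and functional similarity indices are not specified in the abstract.

        import numpy as np
        from scipy.spatial.distance import pdist, squareform

        # rows = sites (assemblages), columns = species densities
        densities = np.array([
            [0.0, 2.5, 1.0, 0.2],
            [0.1, 2.0, 0.0, 0.4],
            [1.5, 0.0, 0.3, 0.0],
        ])

        similarity = 1.0 - squareform(pdist(densities, metric="braycurtis"))
        print(np.round(similarity, 2))   # 1.0 on the diagonal; higher values = more similar pairs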

  2. Reciprocity theory of homogeneous reactions

    NASA Astrophysics Data System (ADS)

    Agbormbai, Adolf A.

    1990-03-01

    The reciprocity formalism is applied to the homogeneous gaseous reactions in which the structure of the participating molecules changes upon collision with one another, resulting in a change in the composition of the gas. The approach is applied to various classes of dissociation, recombination, rearrangement, ionizing, and photochemical reactions. It is shown that for the principle of reciprocity to be satisfied it is necessary that all chemical reactions exist in complementary pairs which consist of the forward and backward reactions. The backward reaction may be described by either the reverse or inverse process. The forward and backward processes must satisfy the same reciprocity equation. Because the number of dynamical variables is usually unbalanced on both sides of a chemical equation, it is necessary that this balance be established by including as many of the dynamical variables as needed before the reciprocity equation can be formulated. Statistical transformation models of the reactions are formulated. The models are classified under the titles free exchange, restricted exchange and simplified restricted exchange. The special equations for the forward and backward processes are obtained. The models are consistent with the H theorem and Le Chatelier's principle. The models are also formulated in the context of the direct simulation Monte Carlo method.

  3. Cosmic homogeneity: a spectroscopic and model-independent measurement

    NASA Astrophysics Data System (ADS)

    Gonçalves, R. S.; Carvalho, G. C.; Bengaly, C. A. P., Jr.; Carvalho, J. C.; Bernui, A.; Alcaniz, J. S.; Maartens, R.

    2018-03-01

    Cosmology relies on the Cosmological Principle, i.e. the hypothesis that the Universe is homogeneous and isotropic on large scales. This implies in particular that the counts of galaxies should approach a homogeneous scaling with volume at sufficiently large scales. Testing homogeneity is crucial to obtain a correct interpretation of the physical assumptions underlying the current cosmic acceleration and structure formation of the Universe. In this letter, we use the Baryon Oscillation Spectroscopic Survey to make the first spectroscopic and model-independent measurements of the angular homogeneity scale θh. Applying four statistical estimators, we show that the angular distribution of galaxies in the range 0.46 < z < 0.62 is consistent with homogeneity at large scales, and that θh varies with redshift, indicating a smoother Universe in the past. These results are in agreement with the foundations of the standard cosmological paradigm.

  4. Decay of homogeneous two-dimensional quantum turbulence

    NASA Astrophysics Data System (ADS)

    Baggaley, Andrew W.; Barenghi, Carlo F.

    2018-03-01

    We numerically simulate the free decay of two-dimensional quantum turbulence in a large, homogeneous Bose-Einstein condensate. The large number of vortices, the uniformity of the density profile, and the absence of boundaries (where vortices can drift out of the condensate) isolate the annihilation of vortex-antivortex pairs as the only mechanism which reduces the number of vortices, Nv, during the turbulence decay. The results clearly reveal that vortex annihilation is a four-vortex process, confirming the decay law Nv ~ t^(-1/3), where t is time, which was inferred from experiments with relatively few vortices in small harmonically trapped condensates.

  5. The process of co-creating the interface for VENSTER, an interactive artwork for nursing home residents with dementia.

    PubMed

    Jamin, Gaston; Luyten, Tom; Delsing, Rob; Braun, Susy

    2017-10-17

    Interactive art installations might engage nursing home residents with dementia. The main aim of this article was to describe the challenging design process of an interactive artwork for nursing home residents, in co-creation with all stakeholders, and to share the methods used and lessons learned. This process is illustrated by the design of the interface of VENSTER as a case. Nursing home residents from the psychogeriatric ward, informal caregivers, client representatives, health care professionals and members of the management team were involved in the design process, which consisted of three phases: (1) identify requirements, (2) develop a prototype and (3) conduct usability tests. Several methods were used (e.g. guided co-creation sessions, "Wizard of Oz"). Each phase generated "lessons learned", which were used as the departure point of the next phase. Participants hardly paid attention to the installation and interface. There seemed, however, to be untapped potential for creating an immersive experience by focussing more on the content itself as an interface (e.g. creating specific scenes with cues for interaction, or scenes based on existing knowledge or prior experiences). Fifteen "lessons learned" that can potentially assist the design of an interactive artwork for nursing home residents suffering from dementia were derived from the design process. This description provides tools and best practices for stakeholders to make (better) informed choices during the creation of interactive artworks. It also illustrates how co-design can make the difference between designing a pleasurable experience and a meaningful one. Implications for rehabilitation: Co-design with all stakeholders can make the difference between designing a pleasurable experience and a meaningful one. There seems to be an untapped potential for creating an immersive experience by focussing more on the content itself as an interface (e.g. creating specific scenes with cues for interaction

  6. Cosmological models with homogeneous and isotropic spatial sections

    NASA Astrophysics Data System (ADS)

    Katanaev, M. O.

    2017-05-01

    The assumption that the universe is homogeneous and isotropic is the basis for the majority of modern cosmological models. We give an example of a metric all of whose spatial sections are spaces of constant curvature but the space-time is nevertheless not homogeneous and isotropic as a whole. We give an equivalent definition of a homogeneous and isotropic universe in terms of embedded manifolds.

  7. We're Born to Learn: Using the Brain's Natural Learning Process to Create Today's Curriculum. Second Edition

    ERIC Educational Resources Information Center

    Smilkstein, Rita

    2011-01-01

    This updated edition of the bestselling book on the brain's natural learning process brings new research results and applications in a power-packed teacher tool kit. Rita Smilkstein shows teachers how to create and deliver curricula that help students become the motivated, successful, and natural learners they were born to be. Updated features…

  8. Simulator for SUPO, a Benchmark Aqueous Homogeneous Reactor (AHR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, Steven Karl; Determan, John C.

    2015-10-14

    A simulator has been developed for SUPO (Super Power), an aqueous homogeneous reactor (AHR) that operated at Los Alamos National Laboratory (LANL) from 1951 to 1974. During that period SUPO accumulated approximately 600,000 kWh of operation. It is considered the benchmark for steady-state operation of an AHR. The SUPO simulator was developed using the process that resulted in a simulator for an accelerator-driven subcritical system, which has been previously reported.

  9. AnClim and ProClimDB software for data quality control and homogenization of time series

    NASA Astrophysics Data System (ADS)

    Stepanek, Petr

    2015-04-01

    During the last decade, a software package consisting of AnClim, ProClimDB and LoadData for processing (mainly climatological) data has been created. This software offers a complete solution for processing climatological time series, starting from loading the data from a central database (e.g. Oracle; software LoadData), through data quality control and homogenization, to time series analysis, extreme value evaluation and RCM output verification and correction (ProClimDB and AnClim software). The detection of inhomogeneities is carried out on a monthly scale through the application of AnClim, or newly by R functions called from ProClimDB, while quality control, the preparation of reference series and the correction of detected breaks are carried out by the ProClimDB software. The software combines many statistical tests, types of reference series and time scales (monthly, seasonal and annual, daily and sub-daily). These can be used to create an "ensemble" of solutions, which may be more reliable than any single method. AnClim software is suitable for educational purposes, e.g. for students getting acquainted with methods used in climatology. Built-in graphical tools and the comparison of various statistical tests help in better understanding a given method. ProClimDB is, by contrast, a tool aimed at processing large climatological datasets. Recently, functions from R can be called from within the software, making it more efficient in data processing and capable of easily including new methods (when available in R). An example of usage is the easy comparison of methods for the correction of inhomogeneities in daily data (HOM of Paul Della-Marta, the SPLIDHOM method of Olivier Mestre, DAP - our own method, QM of Xiaolan Wang, and others). The software is available, together with further information, at www.climahom.eu. Acknowledgement: this work was partially funded by the project "Building up a multidisciplinary scientific team focused on drought" No. CZ.1.07/2.3.00/20.0248.

  10. Homogeneity and Entropy

    NASA Astrophysics Data System (ADS)

    Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.

    1990-11-01

    RESUMEN (translated from Spanish): We present a methodology for the analysis of homogeneity based on Information Theory, applicable to samples of observational data. ABSTRACT: Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given data sample. Key words: DATA ANALYSIS

  11. Homogenization models for thin rigid structured surfaces and films.

    PubMed

    Marigo, Jean-Jacques; Maurel, Agnès

    2016-07-01

    A homogenization method for thin microstructured surfaces and films is presented. In both cases, sound hard materials are considered, associated with Neumann boundary conditions and the wave equation in the time domain is examined. For a structured surface, a boundary condition is obtained on an equivalent flat wall, which links the acoustic velocity to its normal and tangential derivatives (of the Myers type). For a structured film, jump conditions are obtained for the acoustic pressure and the normal velocity across an equivalent interface (of the Ventcels type). This interface homogenization is based on a matched asymptotic expansion technique, and differs slightly from the classical homogenization, which is known to fail for small structuration thicknesses. In order to get insight into what causes this failure, a two-step homogenization is proposed, mixing classical homogenization and matched asymptotic expansion. Results of the two homogenizations are analyzed in light of the associated elementary problems, which correspond to problems of fluid mechanics, namely, potential flows around rigid obstacles.

  12. A method to eliminate wetting during the homogenization of HgCdTe

    NASA Technical Reports Server (NTRS)

    Su, Ching-Hua; Lehoczky, S. L.; Szofran, F. R.

    1986-01-01

    Adhesion of HgCdTe samples to fused silica ampoule walls, or 'wetting', during the homogenization process was eliminated by adopting a slower heating rate. The idea is to decrease Cd activity in the sample so as to reduce the rate of reaction between Cd and the silica wall.

  13. Sewage sludge disintegration by high-pressure homogenization: a sludge disintegration model.

    PubMed

    Zhang, Yuxuan; Zhang, Panyue; Ma, Boqiang; Wu, Hao; Zhang, Sheng; Xu, Xin

    2012-01-01

    High-pressure homogenization (HPH) technology was applied as a pretreatment to disintegrate sewage sludge. The effects of homogenization pressure, homogenization cycle number, and total solid content on sludge disintegration were investigated. The sludge disintegration degree (DD(COD)), protein concentration, and polysaccharide concentration increased with the increase of homogenization pressure and homogenization cycle number, and decreased with the increase of sludge total solid (TS) content. The maximum DD(COD) of 43.94% was achieved at 80 MPa with four homogenization cycles for a 9.58 g/L TS sludge sample. A HPH sludge disintegration model of DD(COD) = k·N^a·P^b was established by multivariable linear regression to quantify the effects of homogenization parameters. The homogenization cycle exponent a and homogenization pressure exponent b were 0.4763 and 0.7324 respectively, showing that the effect of homogenization pressure (P) was more significant than that of homogenization cycle number (N). The value of the rate constant k decreased with the increase of sludge total solid content. The specific energy consumption increased with the increment of sludge disintegration efficiency. Lower specific energy consumption was required for higher total solid content sludge.
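
    The power-law model becomes linear after taking logarithms, ln(DD(COD)) = ln(k) + a·ln(N) + b·ln(P), which is the form the multivariable linear regression fits. The sketch below illustrates such a fit; the data points are placeholders, not the measurements reported in the paper.

        import numpy as np

        N = np.array([1, 2, 4, 1, 2, 4, 1, 2, 4], dtype=float)            # homogenization cycles
        P = np.array([20, 20, 20, 50, 50, 50, 80, 80, 80], dtype=float)   # pressure, MPa
        DD = np.array([5.1, 7.3, 10.2, 10.5, 14.8, 20.9, 15.2, 21.4, 30.1])  # DD(COD), %

        # ln(DD) = ln(k) + a*ln(N) + b*ln(P), solved by ordinary least squares
        X = np.column_stack([np.ones_like(N), np.log(N), np.log(P)])
        coef, *_ = np.linalg.lstsq(X, np.log(DD), rcond=None)
        k, a, b = np.exp(coef[0]), coef[1], coef[2]
        print(f"DD(COD) ~ {k:.3f} * N^{a:.3f} * P^{b:.3f}")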

  14. Directed Thermal Diffusions through Metamaterial Source Illusion with Homogeneous Natural Media

    PubMed Central

    Xu, Guoqiang; Zhang, Haochun; Jin, Liang

    2018-01-01

    Owing to the utilization of transformation optics, many significant research and development achievements have expanded the applications of illusion devices into thermal fields. However, most of the current studies on relevant thermal illusions used to reshape the thermal fields are dependent on certain pre-designed geometric profiles with complicated conductivity configurations. In this paper, we propose a methodology for designing a new class of thermal source illusion devices for achieving directed thermal diffusions with natural homogeneous media. The employment of space rotations in the linear transformation process allows the directed thermal diffusions to be independent of the geometric profiles, and the utilization of natural homogeneous media improves feasibility. Four schemes, with fewer types of homogeneous media filling the functional regions, are demonstrated in transient states. The expected performances are observed in each scheme. The related performance is analyzed by comparing the thermal distribution characteristics and the illusion effectiveness on the measured lines. The findings obtained in this paper see applications in the development of directed diffusions with minimal thermal loss, used in novel "multi-beam" thermal generation, thermal lenses, solar receivers, and waveguides. PMID:29671833

  15. Homogeneous and heterogeneous chemistry along air parcel trajectories

    NASA Technical Reports Server (NTRS)

    Jones, R. L.; Mckenna, D. L.; Poole, L. R.; Solomon, S.

    1990-01-01

    The study of coupled heterogeneous and homogeneous chemistry due to polar stratospheric clouds (PSCs) using Lagrangian parcel trajectories for interpretation of the Airborne Arctic Stratosphere Experiment (AASE) is discussed. This approach represents an attempt to quantitatively model the physical and chemical perturbation to stratospheric composition due to formation of PSCs using the fullest possible representation of the relevant processes. Further, the meteorological fields from the United Kingdom Meteorological Office global model were used to deduce potential vorticity and to infer regions of PSCs as an input to flight planning during AASE.

  16. Climate for Learning: A Symposium. Creating a Climate for Learning, and the Humanizing Process. The Principal and School Discipline. Curriculum Bulletin Vol. XXXII, No. 341.

    ERIC Educational Resources Information Center

    Johnson, Simon O.; Chaky, June

    This publication contains two articles focusing on creating a climate for learning. In "Creating a Climate for Learning, and the Humanizing Process," Simon O. Johnson offers practical suggestions for creating a humanistic learning environment. The author begins by defining the basic concepts--humanism, affective education, affective situation,…

  17. Effects of non-homogeneous flow on ADCP data processing in a hydroturbine forebay

    DOE PAGES

    Harding, S. F.; Richmond, M. C.; Romero-Gomez, P.; ...

    2016-01-02

    Accurate modeling of the velocity field in the forebay of a hydroelectric power station is important for both power generation and fish passage, and the flow field can be increasingly well represented by computational fluid dynamics (CFD) simulations. Acoustic Doppler Current Profilers (ADCPs) are investigated herein as a method of validating the numerical flow solutions, particularly in observed and calculated regions of non-homogeneous flow velocity. By using a numerical model of an ADCP operating in a velocity field calculated using CFD, the errors due to the spatial variation of the flow velocity are quantified. Furthermore, the numerical model of the ADCP is referred to herein as a Virtual ADCP (VADCP).

  18. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System (AWIPS) Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or DARE Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on-the-fly for AWIPS, by using two examples of AWIPS applications that were created by the Applied Meteorology Unit (AMU). The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) and 45th Weather Squadron (45 WS) to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and the 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. Advantages of both file types will be listed.
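
    A hedged sketch of producing the shapefile side of such an overlay is shown below, using GeoPandas and Shapely rather than the AMU tools themselves; the corridor vertices and attribute name are made-up placeholders, and the actual anvil-threat geometry logic (advection by upper-level winds) is not reproduced here.

        import geopandas as gpd
        from shapely.geometry import Polygon

        # Illustrative lon/lat vertices of a corridor polygon (placeholder values).
        corridor = Polygon([
            (-80.68, 28.62), (-80.10, 29.10), (-79.60, 28.95), (-80.30, 28.40),
        ])

        gdf = gpd.GeoDataFrame({"label": ["anvil_threat_corridor"]},
                               geometry=[corridor], crs="EPSG:4326")
        gdf.to_file("anvil_corridor.shp")   # shapefile that can be loaded as an AWIPS overlay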

  19. Homogeneity testing of the global ESA CCI multi-satellite soil moisture climate data record

    NASA Astrophysics Data System (ADS)

    Preimesberger, Wolfgang; Su, Chun-Hsu; Gruber, Alexander; Dorigo, Wouter

    2017-04-01

    ESA's Climate Change Initiative (CCI) creates a global, long-term data record by merging multiple available earth observation products, with the goal of providing a product for climate studies, trend analysis, and risk assessments. The blending of soil moisture (SM) time series derived from different active and passive remote sensing instruments with varying sensor characteristics, such as microwave frequency, signal polarization or radiometric accuracy, could potentially lead to inhomogeneities in the merged long-term data series, undercutting the usefulness of the product. To detect the spatio-temporal extent of contiguous periods without inhomogeneities, as well as to subsequently minimize their negative impact on the data records, different relative homogeneity tests (namely the Fligner-Killeen test of homogeneity of variances and the Wilcoxon rank-sums test) are implemented and tested on the combined active-passive ESA CCI SM data set. Inhomogeneities are detected by comparing the data against in-situ reference data from the ISMN and model-based estimates from GLDAS-Noah and MERRA-Land. Inhomogeneity testing is performed over the ESA CCI SM data time frame of 38 years (from 1978 to 2015), on a global quarter-degree grid, and with regard to six alterations in the combination of observation systems used in the data blending process. This study describes and explains observed variations in the spatial and temporal patterns of inhomogeneities in the combined products. In addition, we propose methodologies for measuring and reducing the impact of inhomogeneities on trends derived from the ESA CCI SM data set, and suggest the use of inhomogeneity-corrected data for future trend studies. This study is supported by the European Union's FP7 EartH2Observe "Global Earth Observation for Integrated Water Resource Assessment" project (grant agreement number 603608).
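
    The two relative tests named above can be applied to the differences between a candidate satellite series and a reference series on either side of a suspected break (e.g. a change in the sensor blend). The sketch below shows a minimal version with SciPy; the arrays and the break position are placeholders, not the ESA CCI data or the study's full testing procedure.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        diff_before = rng.normal(0.00, 0.03, size=400)   # candidate - reference, pre-break
        diff_after = rng.normal(0.01, 0.05, size=400)    # candidate - reference, post-break

        # Wilcoxon rank-sums test: shift in the median of the differences (mean break)
        w_stat, w_p = stats.ranksums(diff_before, diff_after)
        # Fligner-Killeen test: change in the variance of the differences
        f_stat, f_p = stats.fligner(diff_before, diff_after)

        print(f"rank-sums p = {w_p:.3f}, Fligner-Killeen p = {f_p:.3f}")
        # Small p-values flag a potential inhomogeneity at the tested break point.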

  20. A non-asymptotic homogenization theory for periodic electromagnetic structures.

    PubMed

    Tsukerman, Igor; Markel, Vadim A

    2014-08-08

    Homogenization of electromagnetic periodic composites is treated as a two-scale problem and solved by approximating the fields on both scales with eigenmodes that satisfy Maxwell's equations and boundary conditions as accurately as possible. Built into this homogenization methodology is an error indicator whose value characterizes the accuracy of homogenization. The proposed theory allows one to define not only bulk, but also position-dependent material parameters (e.g. in proximity to a physical boundary) and to quantify the trade-off between the accuracy of homogenization and its range of applicability to various illumination conditions.

  1. Gauge Fields in Homogeneous and Inhomogeneous Cosmologies

    NASA Astrophysics Data System (ADS)

    Darian, Bahman K.

    Despite its formidable appearance, the study of classical Yang-Mills (YM) fields on homogeneous cosmologies is amenable to a formal treatment. This dissertation is a report on a systematic approach to the general construction of invariant YM fields on homogeneous cosmologies undertaken for the first time in this context. This construction is subsequently followed by the investigation of the behavior of YM field variables for the most simple of self-gravitating YM fields. Particularly interesting was a dynamical system analysis and the discovery of chaotic signature in the axially symmetric Bianchi I-YM cosmology. Homogeneous YM fields are well studied and are known to have chaotic properties. The chaotic behavior of YM field variables in homogeneous cosmologies might eventually lead to an invariant definition of chaos in (general) relativistic cosmological models. By choosing the gauge fields to be Abelian, the construction and the field equations presented so far reduce to that of electromagnetic field in homogeneous cosmologies. A perturbative analysis of gravitationally interacting electromagnetic and scalar fields in inhomogeneous cosmologies is performed via the Hamilton-Jacobi formulation of general relativity. An essential feature of this analysis is the spatial gradient expansion of the generating functional (Hamilton principal function) to solve the Hamiltonian constraint. Perturbations of a spatially flat Friedman-Robertson-Walker cosmology with an exponential potential for the scalar field are presented.

  2. Homogenization of Large-Scale Movement Models in Ecology

    USGS Publications Warehouse

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
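
    The distinction between the two movement models can be written out explicitly; the following equations give the standard textbook forms and are quoted here for orientation, not taken from the paper. In ecological diffusion the motility sits inside the Laplacian, so individuals accumulate where local movement is slow, whereas Fickian diffusion moves individuals down motility-weighted density gradients.

        \begin{align}
          \text{Fickian:}\qquad & \frac{\partial u}{\partial t}
              = \nabla \cdot \bigl( \mu(\mathbf{x})\, \nabla u \bigr), \\
          \text{Ecological:}\qquad & \frac{\partial u}{\partial t}
              = \nabla^{2} \bigl( \mu(\mathbf{x})\, u \bigr),
        \end{align}

    with u the population density and \mu(\mathbf{x}) the habitat-dependent motility.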

  3. Homogenization Theory for the Prediction of Obstructed Solute Diffusivity in Macromolecular Solutions

    PubMed Central

    Donovan, Preston; Chehreghanianzabi, Yasaman; Rathinam, Muruhan; Zustiak, Silviya Petrova

    2016-01-01

    The study of diffusion in macromolecular solutions is important in many biomedical applications such as separations, drug delivery, and cell encapsulation, and key for many biological processes such as protein assembly and interstitial transport. Not surprisingly, multiple models for the a-priori prediction of diffusion in macromolecular environments have been proposed. However, most models include parameters that are not readily measurable, are specific to the polymer-solute-solvent system, or are fitted and do not have a physical meaning. Here, for the first time, we develop a homogenization theory framework for the prediction of effective solute diffusivity in macromolecular environments based on physical parameters that are easily measurable and not specific to the macromolecule-solute-solvent system. Homogenization theory is useful for situations where knowledge of fine-scale parameters is used to predict bulk system behavior. As a first approximation, we focus on a model where the solute is subjected to obstructed diffusion via stationary spherical obstacles. We find that the homogenization theory results agree well with computationally more expensive Monte Carlo simulations. Moreover, the homogenization theory agrees with effective diffusivities of a solute in dilute and semi-dilute polymer solutions measured using fluorescence correlation spectroscopy. Lastly, we provide a mathematical formula for the effective diffusivity in terms of a non-dimensional and easily measurable geometric system parameter. PMID:26731550

  4. Homogenization Theory for the Prediction of Obstructed Solute Diffusivity in Macromolecular Solutions.

    PubMed

    Donovan, Preston; Chehreghanianzabi, Yasaman; Rathinam, Muruhan; Zustiak, Silviya Petrova

    2016-01-01

    The study of diffusion in macromolecular solutions is important in many biomedical applications such as separations, drug delivery, and cell encapsulation, and key for many biological processes such as protein assembly and interstitial transport. Not surprisingly, multiple models for the a-priori prediction of diffusion in macromolecular environments have been proposed. However, most models include parameters that are not readily measurable, are specific to the polymer-solute-solvent system, or are fitted and do not have a physical meaning. Here, for the first time, we develop a homogenization theory framework for the prediction of effective solute diffusivity in macromolecular environments based on physical parameters that are easily measurable and not specific to the macromolecule-solute-solvent system. Homogenization theory is useful for situations where knowledge of fine-scale parameters is used to predict bulk system behavior. As a first approximation, we focus on a model where the solute is subjected to obstructed diffusion via stationary spherical obstacles. We find that the homogenization theory results agree well with computationally more expensive Monte Carlo simulations. Moreover, the homogenization theory agrees with effective diffusivities of a solute in dilute and semi-dilute polymer solutions measured using fluorescence correlation spectroscopy. Lastly, we provide a mathematical formula for the effective diffusivity in terms of a non-dimensional and easily measurable geometric system parameter.

  5. Occupational exposure to motor exhaust in Stockholm, Sweden--different grouping strategies using variability in NO₂ to create homogenous groups.

    PubMed

    Lewné, Marie; Plato, Nils; Bellander, Tom; Alderling, Magnus; Gustavsson, Per

    2011-01-01

    The aim of the present study was to investigate the personal variability in occupational exposure to NO₂, as a marker of exposure to diesel exhaust, and to compare a statistical method of grouping workers into homogenous groups with a grouping performed by a qualified occupational hygienist. Forty-seven workers exposed to motor exhaust in their occupation were included. Personal measurements of NO₂ were performed with diffusive samplers over three full working shifts. The results from the measurements were analysed with a linear mixed effects model, taking both between- and within-worker variability into consideration. The workers were divided into occupational groups in different ways in order to find a categorization with maximal homogeneity in exposure in each group. We used the between-worker variability ratio BR0.95 as an estimator of the between-worker variability. To study the effect of the divisions on the fit of the statistical model, we used the Akaike Information Criterion. The geometric mean for NO₂ for all 47 workers was 69 μg/m³ and the between-worker variability BR0.95 was 23.8. In six successive steps, the 47 workers were divided into up to eight groups, based on observed job characteristics. In the final grouping, seven groups were included, with geometric means ranging from 32 μg/m³ for outdoor workers to 316 μg/m³ for the most exposed group (tunnel construction workers). The BR0.95 varied between 2.4 and 6.3. The within-worker variability WR0.95 for the last division differed among the groups from 2.0 to 7.9. The Akaike Information Criterion decreased from 246, if all persons were included in one group, to 174 for the final grouping. The average level of NO₂ varied about 10 times between the different occupational groups, with the highest level for tunnel construction workers (316 μg/m³) and lowest for outdoor workers (32 μg/m³). For four of the seven groups the between-worker variability was higher than the within-worker variability.
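
    A hedged sketch of the kind of analysis the abstract describes is given below: a random-intercept mixed model on log-transformed NO₂ with worker as the grouping factor, from which between- and within-worker variances are extracted and converted to 95% fold-ranges using the common exp(3.92·s) convention. The file name, column names, and that convention are assumptions for illustration, not details taken from the study.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("no2_shifts.csv")            # hypothetical: columns worker, group, no2 (ug/m3)
        df["log_no2"] = np.log(df["no2"])

        # Random-intercept model: fixed effect of occupational group, random effect of worker
        fit = smf.mixedlm("log_no2 ~ C(group)", data=df, groups=df["worker"]).fit(reml=False)

        var_between = fit.cov_re.iloc[0, 0]           # between-worker variance (log scale)
        var_within = fit.scale                        # within-worker variance (log scale)
        br95 = np.exp(3.92 * np.sqrt(var_between))    # fold-range covering 95% of workers' mean exposures
        wr95 = np.exp(3.92 * np.sqrt(var_within))     # fold-range covering 95% of a worker's shifts

        # AIC = 2k - 2*log-likelihood can be compared across candidate groupings (ML fit above)
        print(f"logL = {fit.llf:.1f}, BR0.95 = {br95:.1f}, WR0.95 = {wr95:.1f}")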

  6. [Growth Factors and Interleukins in Amniotic Membrane Tissue Homogenate].

    PubMed

    Stachon, T; Bischoff, M; Seitz, B; Huber, M; Zawada, M; Langenbucher, A; Szentmáry, N

    2015-07-01

    Application of amniotic membrane homogenate eye drops may be a potential treatment alternative for therapy-resistant corneal epithelial defects. The purpose of this study was to determine the concentrations of epidermal growth factor (EGF), fibroblast growth factor basic (bFGF), hepatocyte growth factor (HGF), keratinocyte growth factor (KGF), interleukin-6 (IL-6) and interleukin-8 (IL-8) in amniotic membrane homogenates. Amniotic membranes of 8 placentas were prepared and thereafter stored at -80 °C using the standard methods of the LIONS Cornea Bank Saar-Lor-Lux, Trier/Westpfalz. After thawing, amniotic membranes were cut into two pieces and homogenized in liquid nitrogen. One part of the homogenate was prepared in cell-lysis buffer, the other part was prepared in PBS. The tissue homogenates were stored at -20 °C until enzyme-linked immunosorbent assay (ELISA) analysis for EGF, bFGF, HGF, KGF, IL-6 and IL-8 concentrations. Concentrations of KGF, IL-6 and IL-8 were below the detection limit using both preparation techniques. The EGF concentration in tissue homogenates treated with cell-lysis buffer (2412 pg/g tissue) was not significantly different compared to that of tissue homogenates treated with PBS (1586 pg/g tissue, p = 0.72). bFGF release was also not significantly different using cell-lysis buffer (3606 pg/g tissue) or PBS-treated tissue homogenates (4649 pg/g tissue, p = 0.35). HGF release was significantly lower using cell-lysis buffer (23,555 pg/g tissue), compared to PBS-treated tissue (47,766 pg/g tissue, p = 0.007). Containing EGF, bFGF and HGF, and lacking IL-6 and IL-8, the application of amniotic membrane homogenate eye drops may be a potential treatment alternative for therapy-resistant corneal epithelial defects.

  7. Anthropogenic disturbance homogenizes seagrass fish communities.

    PubMed

    Iacarella, Josephine C; Adamczyk, Emily; Bowen, Dan; Chalifour, Lia; Eger, Aaron; Heath, William; Helms, Sibylla; Hessing-Lewis, Margot; Hunt, Brian P V; MacInnis, Andrew; O'Connor, Mary I; Robinson, Clifford L K; Yakimishyn, Jennifer; Baum, Julia K

    2018-05-01

    Anthropogenic activities have led to the biotic homogenization of many ecological communities, yet in coastal systems this phenomenon remains understudied. In particular, activities that locally affect marine habitat-forming foundation species may perturb habitat and promote species with generalist, opportunistic traits, in turn affecting spatial patterns of biodiversity. Here, we quantified fish diversity in seagrass communities across 89 sites spanning 6° latitude along the Pacific coast of Canada, to test the hypothesis that anthropogenic disturbances homogenize (i.e., lower beta-diversity) assemblages within coastal ecosystems. We test for patterns of biotic homogenization at sites within different anthropogenic disturbance categories (low, medium, and high) at two spatial scales (within and across regions) using both abundance- and incidence-based beta-diversity metrics. Our models provide clear evidence that fish communities in high anthropogenic disturbance seagrass areas are homogenized relative to those in low disturbance areas. These results were consistent across within-region comparisons using abundance- and incidence-based measures of beta-diversity, and in across-region comparisons using incidence-based measures. Physical and biotic characteristics of seagrass meadows also influenced fish beta-diversity. Biotic habitat characteristics including seagrass biomass and shoot density were more differentiated among high disturbance sites, potentially indicative of a perturbed environment. Indicator species and trait analyses revealed fishes associated with low disturbance sites had characteristics including stenotopy, lower swimming ability, and egg guarding behavior. Our study is the first to show biotic homogenization of fishes across seagrass meadows within areas of relatively high human impact. These results support the importance of targeting conservation efforts in low anthropogenic disturbance areas across land- and seascapes, as well as managing

  8. 3D-ICs created using oblique processing

    NASA Astrophysics Data System (ADS)

    Burckel, D. Bruce

    2016-03-01

    This paper demonstrates that another class of three-dimensional integrated circuits (3D-ICs) exists, distinct from through silicon via centric and monolithic 3D-ICs. Furthermore, it is possible to create devices that are 3D at the device level (i.e. with active channels oriented in each of the three coordinate axes), by performing standard CMOS fabrication operations at an angle with respect to the wafer surface into high aspect ratio silicon substrates using membrane projection lithography (MPL). MPL requires only minimal fixturing changes to standard CMOS equipment, and no change to current state-of-the-art lithography. Eliminating the constraint of 2D planar device architecture enables a wide range of new interconnect topologies which could help reduce interconnect resistance/capacitance, and potentially improve performance.

  9. A non-asymptotic homogenization theory for periodic electromagnetic structures

    PubMed Central

    Tsukerman, Igor; Markel, Vadim A.

    2014-01-01

    Homogenization of electromagnetic periodic composites is treated as a two-scale problem and solved by approximating the fields on both scales with eigenmodes that satisfy Maxwell's equations and boundary conditions as accurately as possible. Built into this homogenization methodology is an error indicator whose value characterizes the accuracy of homogenization. The proposed theory allows one to define not only bulk, but also position-dependent material parameters (e.g. in proximity to a physical boundary) and to quantify the trade-off between the accuracy of homogenization and its range of applicability to various illumination conditions. PMID:25104912

  10. Layout optimization using the homogenization method

    NASA Technical Reports Server (NTRS)

    Suzuki, Katsuyuki; Kikuchi, Noboru

    1993-01-01

    A generalized layout problem involving sizing, shape, and topology optimization is solved by using the homogenization method for three-dimensional linearly elastic shell structures in order to seek a possibility of establishment of an integrated design system of automotive car bodies, as an extension of the previous work by Bendsoe and Kikuchi. A formulation of a three-dimensional homogenized shell, a solution algorithm, and several examples of computing the optimum layout are presented in this first part of the two articles.

  11. Homogeneous cosmological models and new inflation

    NASA Technical Reports Server (NTRS)

    Turner, Michael S.; Widrow, Lawrence M.

    1986-01-01

    The promise of the inflationary-universe scenario is to free the present state of the universe from extreme dependence upon initial data. Paradoxically, inflation is usually analyzed in the context of the homogeneous and isotropic Robertson-Walker cosmological models. It is shown that all but a small subset of the homogeneous models undergo inflation. Any initial anisotropy is so strongly damped that if sufficient inflation occurs to solve the flatness and horizon problems, the universe today would still be very isotropic.

  12. Peripheral nerve magnetic stimulation: influence of tissue non-homogeneity

    PubMed Central

    Krasteva, Vessela TZ; Papazov, Sava P; Daskalov, Ivan K

    2003-01-01

    Background Peripheral nerves are situated in a highly non-homogeneous environment, including muscles, bones, blood vessels, etc. Time-varying magnetic field stimulation of the median and ulnar nerves in the carpal region is studied, with special consideration of the influence of non-homogeneities. Methods A detailed three-dimensional finite element model (FEM) of the anatomy of the wrist region was built to assess the induced currents distribution by external magnetic stimulation. The electromagnetic field distribution in the non-homogeneous domain was defined as an internal Dirichlet problem using the finite element method. The boundary conditions were obtained by analysis of the vector potential field excited by external current-driven coils. Results The results include evaluation and graphical representation of the induced current field distribution at various stimulation coil positions. Comparative study for the real non-homogeneous structure with anisotropic conductivities of the tissues and a mock homogeneous media is also presented. The possibility of achieving selective stimulation of either of the two nerves is assessed. Conclusion The model developed could be useful in theoretical prediction of the current distribution in the nerves during diagnostic stimulation and therapeutic procedures involving electromagnetic excitation. The errors in applying homogeneous domain modeling rather than real non-homogeneous biological structures are demonstrated. The practical implications of the applied approach are valid for any arbitrary weakly conductive medium. PMID:14693034

  13. Analyzing multistep homogeneous nucleation in vapor-to-solid transitions using molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Tanaka, Kyoko K.; Diemand, Jürg; Tanaka, Hidekazu; Angélil, Raymond

    2017-08-01

    In this paper, we present multistep homogeneous nucleation in vapor-to-solid transitions as revealed by molecular dynamics simulations of Lennard-Jones molecules, in which liquid-like clusters are created and crystallize. During long, direct NVE (constant volume, energy, and number of molecules) simulations integrating (1.9-15) × 10⁶ molecules over up to 200 million steps (4.3 μs), crystallization in many large, supercooled nanoclusters is observed once the liquid clusters grow to a certain size (~800 molecules for the case of T ≃ 0.5 ε/k). In the simulations, we discovered an interesting process associated with crystallization: the solid clusters lost 2-5% of their mass during crystallization at low temperatures below their melting temperatures. Although the crystallized clusters were heated by latent heat, they were stabilized by cooling due to evaporation. The clusters crystallized quickly and completely except at surface layers. However, they did not have stable crystal structures; rather, they had metastable structures such as icosahedral, decahedral, face-centered-cubic-rich (fcc-rich), and hexagonal-close-packed-rich (hcp-rich). Several kinds of cluster structures coexisted in the same size range of ~1000-5000 molecules. Our results imply that multistep nucleation is a common first stage of condensation from vapor to solid.
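
    One routine analysis step implied by the abstract is grouping the molecules of a snapshot into clusters so that nucleation and growth can be tracked over the trajectory. The sketch below does this with a simple distance (Stillinger-type) criterion; the 1.5 sigma cutoff, the random stand-in coordinates, and the periodic box size are assumptions for illustration, not the authors' settings.

        import numpy as np
        from scipy.spatial import cKDTree
        from scipy.sparse import coo_matrix
        from scipy.sparse.csgraph import connected_components

        def cluster_sizes(positions, cutoff=1.5, box=None):
            """Cluster atoms: two atoms are linked if their separation is below `cutoff`
            (reduced LJ units); `box` enables periodic boundaries. Returns sizes, largest first."""
            tree = cKDTree(positions, boxsize=box)
            pairs = np.array(sorted(tree.query_pairs(r=cutoff)), dtype=int).reshape(-1, 2)
            n = len(positions)
            adj = coo_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])), shape=(n, n))
            _, labels = connected_components(adj, directed=False)
            return np.sort(np.bincount(labels))[::-1]

        rng = np.random.default_rng(0)
        snapshot = rng.uniform(0.0, 50.0, size=(5000, 3))     # stand-in for one MD frame
        print(cluster_sizes(snapshot, cutoff=1.5, box=50.0)[:10])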

  14. A Homogenization Approach for Design and Simulation of Blast Resistant Composites

    NASA Astrophysics Data System (ADS)

    Sheyka, Michael

    Structural composites have been used in aerospace and structural engineering due to their high strength to weight ratio. Composite laminates have been successfully and extensively used in blast mitigation. This dissertation examines the use of the homogenization approach to design and simulate blast resistant composites. Three case studies are performed to examine the usefulness of different methods that may be used in designing and optimizing composite plates for blast resistance. The first case study utilizes a single degree of freedom system to simulate the blast and a reliability based approach. The first case study examines homogeneous plates and the optimal stacking sequence and plate thicknesses are determined. The second and third case studies use the homogenization method to calculate the properties of composite unit cell made of two different materials. The methods are integrated with dynamic simulation environments and advanced optimization algorithms. The second case study is 2-D and uses an implicit blast simulation, while the third case study is 3-D and simulates blast using the explicit blast method. Both case studies 2 and 3 rely on multi-objective genetic algorithms for the optimization process. Pareto optimal solutions are determined in case studies 2 and 3. Case study 3 is an integrative method for determining optimal stacking sequence, microstructure and plate thicknesses. The validity of the different methods such as homogenization, reliability, explicit blast modeling and multi-objective genetic algorithms are discussed. Possible extension of the methods to include strain rate effects and parallel computation is also examined.
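
    Case studies 2 and 3 report Pareto-optimal designs from multi-objective genetic algorithms; the sketch below shows only the Pareto-front extraction step for candidate laminates scored on two objectives to be minimized (taken here, as an assumption, to be mass and peak back-face deflection). The scores are hypothetical and the GA itself (selection, crossover, mutation) is not shown.

        import numpy as np

        def pareto_front(objectives):
            """Boolean mask of non-dominated rows, assuming every objective is minimized."""
            obj = np.asarray(objectives, dtype=float)
            mask = np.ones(obj.shape[0], dtype=bool)
            for i in range(obj.shape[0]):
                # i is dominated if some design is no worse everywhere and strictly better somewhere
                dominated_by = np.all(obj <= obj[i], axis=1) & np.any(obj < obj[i], axis=1)
                mask[i] = not dominated_by.any()
            return mask

        # Hypothetical (mass [kg], deflection [mm]) scores for five stacking sequences
        scores = np.array([[4.2, 11.0], [3.8, 14.5], [5.0, 9.0], [4.2, 13.0], [6.1, 8.9]])
        print(scores[pareto_front(scores)])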

  15. Directed assembly-based printing of homogeneous and hybrid nanorods using dielectrophoresis

    NASA Astrophysics Data System (ADS)

    Chai, Zhimin; Yilmaz, Cihan; Busnaina, Ahmed A.; Lissandrello, Charles A.; Carter, David J. D.

    2017-11-01

    Printing nano and microscale three-dimensional (3D) structures using directed assembly of nanoparticles has many potential applications in electronics, photonics and biotechnology. This paper presents a reproducible and scalable 3D dielectrophoresis assembly process for printing homogeneous silica and hybrid silica/gold nanorods from silica and gold nanoparticles. The nanoparticles are assembled into patterned vias under a dielectrophoretic force generated by an alternating current (AC) field, and then completely fused in situ to form nanorods. The assembly process is governed by the applied AC voltage amplitude and frequency, pattern geometry, and assembly time. Here, we find out that complete assembly of nanorods is not possible without applying both dielectrophoresis and electrophoresis. Therefore, a direct current offset voltage is used to add an additional electrophoretic force to the assembly process. The assembly can be precisely controlled to print silica nanorods with diameters from 20-200 nm and spacing from 500 nm to 2 μm. The assembled nanorods have good uniformity in diameter and height over a millimeter scale. Besides homogeneous silica nanorods, hybrid silica/gold nanorods are also assembled by sequentially assembling silica and gold nanoparticles. The precision of the assembly process is further demonstrated by assembling a single particle on top of each nanorod to demonstrate an additional level of functionalization. The assembled hybrid silica/gold nanorods have potential to be used for metamaterial applications that require nanoscale structures as well as for plasmonic sensors for biosensing applications.

  16. Slowly digestible properties of lotus seed starch-glycerine monostearin complexes formed by high pressure homogenization.

    PubMed

    Chen, Bingyan; Jia, Xiangze; Miao, Song; Zeng, Shaoxiao; Guo, Zebin; Zhang, Yi; Zheng, Baodong

    2018-06-30

    Starch-lipid complexes were prepared using lotus seed starch (LS) and glycerin monostearate (GMS) via a high-pressure homogenization process, and the effect of high-pressure homogenization (HPH) on the slow digestion properties of LS-GMS was investigated. The digestion profiles showed that HPH treatment reduced the digestion rate of LS-GMS, and the extent of this change depended on the homogenization pressure. Scanning electron microscopy showed that HPH treatment changed the morphology of LS-GMS, with higher pressure producing a more compact block-shaped structure that resists enzymatic digestion. Gel-permeation chromatography and small-angle X-ray scattering revealed that high homogenization pressure affected the molecular weight distribution and the semi-crystalline region of the complexes, resulting in the formation of a new semi-crystalline structure with a repeat unit distance of 16-18 nm and a molecular weight distribution of 2.50-2.80 × 10⁵ Da, which displayed strong enzymatic resistance. Differential scanning calorimetry results suggested that the new semi-crystalline lamellae may originate from type-II complexes that exhibit a high transition temperature.

  17. Iterative and variational homogenization methods for filled elastomers

    NASA Astrophysics Data System (ADS)

    Goudarzi, Taha

    Elastomeric composites have increasingly proved invaluable in commercial technological applications due to their unique mechanical properties, especially their ability to undergo large reversible deformation in response to a variety of stimuli (e.g., mechanical forces, electric and magnetic fields, changes in temperature). Modern advances in organic materials science have revealed that elastomeric composites hold also tremendous potential to enable new high-end technologies, especially as the next generation of sensors and actuators featured by their low cost together with their biocompatibility, and processability into arbitrary shapes. This potential calls for an in-depth investigation of the macroscopic mechanical/physical behavior of elastomeric composites directly in terms of their microscopic behavior with the objective of creating the knowledge base needed to guide their bottom-up design. The purpose of this thesis is to generate a mathematical framework to describe, explain, and predict the macroscopic nonlinear elastic behavior of filled elastomers, arguably the most prominent class of elastomeric composites, directly in terms of the behavior of their constituents --- i.e., the elastomeric matrix and the filler particles --- and their microstructure --- i.e., the content, size, shape, and spatial distribution of the filler particles. This will be accomplished via a combination of novel iterative and variational homogenization techniques capable of accounting for interphasial phenomena and finite deformations. Exact and approximate analytical solutions for the fundamental nonlinear elastic response of dilute suspensions of rigid spherical particles (either firmly bonded or bonded through finite size interphases) in Gaussian rubber are first generated. These results are in turn utilized to construct approximate solutions for the nonlinear elastic response of non-Gaussian elastomers filled with a random distribution of rigid particles (again, either firmly

  18. Homogeneity and structure of CuZrAlY metallic glass ribbons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fetić, A. Salčinović, E-mail: amra.s@pmf.unsa.ba; Selimović, A.; Hrvat, K.

    2016-03-25

    Metallic glasses are metastable amorphous structures produced by a quenching (rapid cooling) technique. Due to very high cooling rates during the production process, it is very difficult to produce homogeneous samples with identical chemical composition. In this paper we present preliminary results of homogeneity and structure examinations of a CuZrAlY metallic glass ribbon. The ribbon, approximately 1.5 m long and 1 mm wide, was produced using the melt spinning technique. Samples from the middle and the end of the ribbon were chosen for further examination. The surface was checked by metallographic and scanning electron microscopy. The chemical composition in different areas of each sample was checked by energy-dispersive X-ray spectroscopy. Electrical resistivity measurements in the temperature range from 80 K to 280 K were also conducted.

  19. High-performance liquid chromatography purification of homogenous-length RNA produced by trans cleavage with a hammerhead ribozyme.

    PubMed Central

    Shields, T P; Mollova, E; Ste Marie, L; Hansen, M R; Pardi, A

    1999-01-01

    An improved method is presented for the preparation of milligram quantities of homogenous-length RNAs suitable for nuclear magnetic resonance or X-ray crystallographic structural studies. Heterogeneous-length RNA transcripts are processed with a hammerhead ribozyme to yield homogenous-length products that are then readily purified by anion exchange high-performance liquid chromatography. This procedure eliminates the need for denaturing polyacrylamide gel electrophoresis, which is the most laborious step in the standard procedure for large-scale production of RNA by in vitro transcription. The hammerhead processing of the heterogeneous-length RNA transcripts also substantially improves the overall yield and purity of the desired RNA product. PMID:10496226

  20. Homogeneous nucleation and droplet growth in nitrogen. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Dotson, E. H.

    1983-01-01

    A one-dimensional computer model of the homogeneous nucleation process and growth of condensate for nitrogen flows over airfoils is developed to predict the onset of condensation and thus to take advantage of as much of the Reynolds number capability of cryogenic tunnels as possible. Homogeneous nucleation data were taken using a DFVLR CAST-10 airfoil in the 0.3-Meter Transonic Cryogenic Tunnel and are used to evaluate the classical liquid droplet theory and several proposed corrections to it. For predicting liquid nitrogen condensation effects, use of an arbitrary Tolman constant of 0.25 x 250 billionth m, or the Reiss or Kikuchi correction, agrees with the CAST-10 data. Because no solid nitrogen condensation was found experimentally during the CAST-10 experiments, earlier nozzle data are used to evaluate corrections to the classical liquid droplet theory in the lower temperature regime. A theoretical expression for the surface tension of solid nitrogen is developed.

  1. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or Denver AWIPS Risk Reduction and Requirements Evaluation (DARE) Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on-the-fly for AWIPS, by using two examples of AWIPS applications that were created by the Applied Meteorology Unit (AMU) located at Cape Canaveral Air Force Station (CCAFS), Florida. The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) at Johnson Space Center, Texas and 45th Weather Squadron (45 WS) at CCAFS to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and the 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. The presentation will list the advantages and disadvantages of both file types for creating interactive graphical overlays in future AWIPS applications. Shapefiles are a popular format used extensively in Geographical Information Systems. They are usually used in AWIPS to depict static map backgrounds. A shapefile stores the geometry and attribute information of spatial features in a dataset (ESRI 1998). Shapefiles can contain point, line, and polygon features. Each shapefile contains a main file, index file, and a dBASE table. The main file contains a record for each spatial feature, which describes the feature with a list of its vertices. The index file contains the offset of each record from the beginning of the main file. The dBASE table contains records for each
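
    Since the abstract walks through the pieces of a shapefile (main .shp file, .shx index, dBASE table), a minimal sketch of writing such an overlay with the pyshp library is included below. The corridor coordinates, attribute field, and file name are hypothetical, and nothing here is specific to AWIPS ingest or styling.

        import shapefile  # pyshp

        # Hypothetical (lon, lat) vertices of a thunderstorm-anvil threat corridor
        corridor = [[-80.70, 28.40], [-80.35, 28.55], [-80.05, 28.75], [-79.80, 29.00]]

        w = shapefile.Writer("anvil_corridor", shapeType=shapefile.POLYLINE)
        w.field("NAME", "C", size=40)       # attribute stored in the dBASE (.dbf) table
        w.line([corridor])                  # geometry goes to the main (.shp) and index (.shx) files
        w.record("Anvil threat corridor")
        w.close()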

  2. Kinetic energy and scalar spectra in high Rayleigh number axially homogeneous buoyancy driven turbulence

    NASA Astrophysics Data System (ADS)

    Pawar, Shashikant S.; Arakeri, Jaywant H.

    2016-06-01

    Kinetic energy and scalar spectra from the measurements in high Rayleigh number axially homogeneous buoyancy driven turbulent flow are presented. Kinetic energy and concentration (scalar) spectra are obtained from the experiments wherein density difference is created using brine and fresh water and temperature spectra are obtained from the experiments in which heat is used. Scaling of the frequency spectra of lateral and longitudinal velocity near the tube axis is closer to the Kolmogorov-Obukhov scaling, while the scalar spectra show some evidence of dual scaling, Bolgiano-Obukhov scaling followed by Obukhov-Corrsin scaling. These scalings are also observed in the corresponding second order spatial structure functions of velocity and concentration fluctuations.
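
    A hedged sketch of the spectral estimate such measurements typically rest on is given below: a frequency spectrum of a velocity time series via Welch's method, with the fitted slope compared against the Kolmogorov-Obukhov value of -5/3. The sampling rate, file name, and fitting band are assumptions, not the authors' settings.

        import numpy as np
        from scipy.signal import welch

        fs = 1000.0                                    # Hz, hypothetical sampling rate
        u = np.load("axial_velocity_probe.npy")        # hypothetical velocity record near the tube axis

        f, e_uu = welch(u, fs=fs, nperseg=8192)        # one-sided power spectral density

        band = (f > 5.0) & (f < 100.0)                 # hypothetical inertial range
        slope = np.polyfit(np.log(f[band]), np.log(e_uu[band]), 1)[0]
        print(f"fitted spectral slope = {slope:.2f} (Kolmogorov-Obukhov: -5/3)")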

  3. Femtosecond laser-induced ripple patterns for homogenous nanostructuring of pyrolytic carbon heart valve implant

    NASA Astrophysics Data System (ADS)

    Stępak, Bogusz; Dzienny, Paulina; Franke, Volker; Kunicki, Piotr; Gotszalk, Teodor; Antończak, Arkadiusz

    2018-04-01

    Laser-induced periodic surface structures (LIPSS) are highly periodic, wavy surface features, frequently smaller than the incident light wavelength, that enable nanostructuring of many materials. In this paper the possibility of using them to homogeneously structure the surface of an artificial heart valve made of pyrolytic carbon (PyC) was examined. By changing laser irradiation parameters such as energy density and pulse separation, the most suitable conditions were established for a 1030 nm wavelength. A wide spectrum of periodicities and geometries was obtained. Interesting side effects, such as the formation of a thin shell-like layer, were observed. Modified surfaces were examined using EDX and Raman spectroscopy to determine changes in the elemental composition of the surface.

  4. Te homogeneous precipitation in Ge dislocation loop vicinity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perrin Toinin, J.; Portavoce, A., E-mail: alain.portavoce@im2np.fr; Texier, M.

    2016-06-06

    High resolution microscopies were used to study the interactions of Te atoms with Ge dislocation loops, after a standard n-type doping process in Ge. Te atoms neither segregate nor precipitate on dislocation loops, but form Te-Ge clusters at the same depth as dislocation loops, in contradiction with usual dopant behavior and thermodynamic expectations. Atomistic kinetic Monte Carlo simulations show that Te atoms are repelled from dislocation loops due to elastic interactions, promoting homogeneous Te-Ge nucleation between dislocation loops. This phenomenon is enhanced by Coulombic interactions between activated Te²⁺ or Te⁺ ions.

  5. Creating a multidisciplinary low back pain guideline: anatomy of a guideline adaptation process.

    PubMed

    Harstall, Christa; Taenzer, Paul; Angus, Donna K; Moga, Carmen; Schuller, Tara; Scott, N Ann

    2011-08-01

    A collaborative, multidisciplinary guideline adaptation process was developed to construct a single overarching, evidence-based clinical practice guideline (CPG) for all primary care practitioners responsible for the management of low back pain (LBP) to curb the use of ineffective treatments and improve patient outcomes. The adaptation strategy, which involved multiple committees and partnerships, leveraged existing knowledge transfer connections to recruit guideline development group (GDG) members and ensure that all stakeholders had a voice in the guideline development process. Videoconferencing was used to coordinate the large, geographically dispersed GDG. Information services and health technology assessment experts were used throughout the process to lighten the GDG's workload. The GDG reviewed seven seed guidelines and drafted an Alberta-specific guideline during 10 half-day meetings over a 12-month period. The use of ad hoc subcommittees to resolve uncertainties or disagreements regarding evidence interpretation expedited the process. Challenges were encountered in dealing with subjectivity, guideline appraisal tools, evidence source limitations and inconsistencies, and the lack of sophisticated evidence analysis inherent in guideline adaptation. Strategies for overcoming these difficulties are discussed. Guideline adaptation is useful when resources are limited and good-quality seed CPGs exist. The Ambassador Program successfully utilized existing stakeholder interest to create an overarching guideline that aligned guidance for LBP management across multiple primary care disciplines. Unforeseen challenges in guideline adaptation can be overcome with credible seed guidelines, a consistently applied and transparent methodology, and clear documentation of the subjective contextualization process. Multidisciplinary stakeholder input and an open, trusting relationship among all contributors will ensure that the end product is clinically meaningful.

  6. A micromechanical approach for homogenization of elastic metamaterials with dynamic microstructure.

    PubMed

    Muhlestein, Michael B; Haberman, Michael R

    2016-08-01

    An approximate homogenization technique is presented for generally anisotropic elastic metamaterials consisting of an elastic host material containing randomly distributed heterogeneities displaying frequency-dependent material properties. The dynamic response may arise from relaxation processes such as viscoelasticity or from dynamic microstructure. A Green's function approach is used to model elastic inhomogeneities embedded within a uniform elastic matrix as force sources that are excited by a time-varying, spatially uniform displacement field. Assuming dynamic subwavelength inhomogeneities only interact through their volume-averaged fields implies the macroscopic stress and momentum density fields are functions of both the microscopic strain and velocity fields, and may be related to the macroscopic strain and velocity fields through localization tensors. The macroscopic and microscopic fields are combined to yield a homogenization scheme that predicts the local effective stiffness, density and coupling tensors for an effective Willis-type constitutive equation. It is shown that when internal degrees of freedom of the inhomogeneities are present, Willis-type coupling becomes necessary on the macroscale. To demonstrate the utility of the homogenization technique, the effective properties of an isotropic elastic matrix material containing isotropic and anisotropic spherical inhomogeneities, isotropic spheroidal inhomogeneities and isotropic dynamic spherical inhomogeneities are presented and discussed.

  7. A micromechanical approach for homogenization of elastic metamaterials with dynamic microstructure

    PubMed Central

    Haberman, Michael R.

    2016-01-01

    An approximate homogenization technique is presented for generally anisotropic elastic metamaterials consisting of an elastic host material containing randomly distributed heterogeneities displaying frequency-dependent material properties. The dynamic response may arise from relaxation processes such as viscoelasticity or from dynamic microstructure. A Green's function approach is used to model elastic inhomogeneities embedded within a uniform elastic matrix as force sources that are excited by a time-varying, spatially uniform displacement field. Assuming dynamic subwavelength inhomogeneities only interact through their volume-averaged fields implies the macroscopic stress and momentum density fields are functions of both the microscopic strain and velocity fields, and may be related to the macroscopic strain and velocity fields through localization tensors. The macroscopic and microscopic fields are combined to yield a homogenization scheme that predicts the local effective stiffness, density and coupling tensors for an effective Willis-type constitutive equation. It is shown that when internal degrees of freedom of the inhomogeneities are present, Willis-type coupling becomes necessary on the macroscale. To demonstrate the utility of the homogenization technique, the effective properties of an isotropic elastic matrix material containing isotropic and anisotropic spherical inhomogeneities, isotropic spheroidal inhomogeneities and isotropic dynamic spherical inhomogeneities are presented and discussed. PMID:27616932

  8. Beyond homogenization discourse: Reconsidering the cultural consequences of globalized medical education.

    PubMed

    Gosselin, K; Norris, J L; Ho, M-J

    2016-07-01

    Global medical education standards, largely designed in the West, have been promoted across national boundaries with limited regard for cultural differences. This review aims to identify discourses on cultural globalization in medical education literature from non-Western countries. To explore the diversity of discourses related to globalization and culture in the field of medical education, the authors conducted a critical review of medical education research from non-Western countries published in Academic Medicine, Medical Education and Medical Teacher from 2006 to 2014. Key discourses about globalization and culture emerged from a preliminary analysis of this body of literature. A secondary analysis identified inductive sub-themes. Homogenization, polarization and hybridization emerged as key themes in the literature. These findings demonstrate the existence of discourses beyond Western-led homogenization and the co-existence of globalization discourses ranging from homogenization to syncretism to resistance. This review calls attention to the existence of manifold discourses about globalization and culture in non-Western medical education contexts. In refocusing global medical education processes to avoid Western cultural imperialism, it will also be necessary to avoid the pitfalls of other globalization discourses. Moving beyond existing discourses, researchers and educators should work towards equitable, context-sensitive and locally-driven approaches to global medical education.

  9. A micromechanical approach for homogenization of elastic metamaterials with dynamic microstructure

    NASA Astrophysics Data System (ADS)

    Muhlestein, Michael B.; Haberman, Michael R.

    2016-08-01

    An approximate homogenization technique is presented for generally anisotropic elastic metamaterials consisting of an elastic host material containing randomly distributed heterogeneities displaying frequency-dependent material properties. The dynamic response may arise from relaxation processes such as viscoelasticity or from dynamic microstructure. A Green's function approach is used to model elastic inhomogeneities embedded within a uniform elastic matrix as force sources that are excited by a time-varying, spatially uniform displacement field. Assuming dynamic subwavelength inhomogeneities only interact through their volume-averaged fields implies the macroscopic stress and momentum density fields are functions of both the microscopic strain and velocity fields, and may be related to the macroscopic strain and velocity fields through localization tensors. The macroscopic and microscopic fields are combined to yield a homogenization scheme that predicts the local effective stiffness, density and coupling tensors for an effective Willis-type constitutive equation. It is shown that when internal degrees of freedom of the inhomogeneities are present, Willis-type coupling becomes necessary on the macroscale. To demonstrate the utility of the homogenization technique, the effective properties of an isotropic elastic matrix material containing isotropic and anisotropic spherical inhomogeneities, isotropic spheroidal inhomogeneities and isotropic dynamic spherical inhomogeneities are presented and discussed.

  10. Fracture of Rolled Homogeneous Steel Armor (Nucleation Threshold Stress).

    DTIC Science & Technology

    1980-01-01

    Report AD-A081 618, Army Armament Research and Development Command, Aberdeen: ARBRL-MR-02984, Fracture of Rolled Homogeneous Steel Armor (Nucleation Threshold Stress), by Gerald L. Moss and Lynn Seaman, January 1980. Keywords: crack nucleation stress, crack threshold stress, fracture, fracture stress, spallation, armor, rolled homogeneous steel armor.

  11. Noncommutative complex structures on quantum homogeneous spaces

    NASA Astrophysics Data System (ADS)

    Ó Buachalla, Réamonn

    2016-01-01

    A new framework for noncommutative complex geometry on quantum homogeneous spaces is introduced. The main ingredients used are covariant differential calculi and Takeuchi's categorical equivalence for quantum homogeneous spaces. A number of basic results are established, producing a simple set of necessary and sufficient conditions for noncommutative complex structures to exist. Throughout, the framework is applied to the quantum projective spaces endowed with the Heckenberger-Kolb calculus.

  12. A homogeneous focusing system for diode lasers and its applications in metal surface modification

    NASA Astrophysics Data System (ADS)

    Wang, Fei; Zhong, Lijing; Tang, Xiahui; Xu, Chengwen; Wan, Chenhao

    2018-06-01

    High power diode lasers are applied in many different areas, including surface modification, welding and cutting, and they represent an important trend in the laser processing of metals. This paper aims to analyze the impact of the shape and homogeneity of the focal spot of the diode laser on surface modification. A focusing system using triplet lenses for a direct-output diode laser, which can be used to eliminate coma aberrations, is studied. A rectangular stripe with an aspect ratio from 8:1 to 25:1 is obtained, in which the power is homogeneously distributed along the fast axis, the power is 1117.6 W and the peak power intensity is 1.1587 × 10⁶ W/cm². This paper also presents a homogeneous focusing system by use of a Fresnel lens, in which the incident beam size is 40 × 40 mm², the focal length is 380 mm, and the dimension of the obtained focal spot is 2 × 10 mm². When the divergence angle of the incident light is in the range of 12.5-20 mrad and the pitch is 1 mm, the obtained homogeneity in the focal spot is optimal (about 95.22%). Experimental results show that the measured focal spot size is 2.04 × 10.39 mm². This research presents a novel design of homogeneous focusing systems for high power diode lasers.

  13. Creating multithemed ecological regions for macroscale ecology: Testing a flexible, repeatable, and accessible clustering method

    USGS Publications Warehouse

    Cheruvelil, Kendra Spence; Yuan, Shuai; Webster, Katherine E.; Tan, Pang-Ning; Lapierre, Jean-Francois; Collins, Sarah M.; Fergus, C. Emi; Scott, Caren E.; Norton Henry, Emily; Soranno, Patricia A.; Filstrup, Christopher T.; Wagner, Tyler

    2017-01-01

    Understanding broad-scale ecological patterns and processes often involves accounting for regional-scale heterogeneity. A common way to do so is to include ecological regions in sampling schemes and empirical models. However, most existing ecological regions were developed for specific purposes, using a limited set of geospatial features and irreproducible methods. Our study purpose was to: (1) describe a method that takes advantage of recent computational advances and increased availability of regional and global data sets to create customizable and reproducible ecological regions, (2) make this algorithm available for use and modification by others studying different ecosystems, variables of interest, study extents, and macroscale ecology research questions, and (3) demonstrate the power of this approach for the research question—How well do these regions capture regional-scale variation in lake water quality? To achieve our purpose we: (1) used a spatially constrained spectral clustering algorithm that balances geospatial homogeneity and region contiguity to create ecological regions using multiple terrestrial, climatic, and freshwater geospatial data for 17 northeastern U.S. states (~1,800,000 km2); (2) identified which of the 52 geospatial features were most influential in creating the resulting 100 regions; and (3) tested the ability of these ecological regions to capture regional variation in water nutrients and clarity for ~6,000 lakes. We found that: (1) a combination of terrestrial, climatic, and freshwater geospatial features influenced region creation, suggesting that the oft-ignored freshwater landscape provides novel information on landscape variability not captured by traditionally used climate and terrestrial metrics; and (2) the delineated regions captured macroscale heterogeneity in ecosystem properties not included in region delineation—approximately 40% of the variation in total phosphorus and water clarity among lakes was at the regional scale.
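
    The authors' spatially constrained spectral clustering is not reproduced here; the sketch below only conveys the flavor of balancing attribute homogeneity against contiguity by multiplying a feature-similarity kernel with a spatial-proximity kernel and feeding the product to scikit-learn's SpectralClustering as a precomputed affinity. The input files, kernel widths, and 50 km length scale are assumptions, and this simplification is a stand-in rather than the published algorithm.

        import numpy as np
        from sklearn.cluster import SpectralClustering
        from sklearn.metrics.pairwise import rbf_kernel

        features = np.load("geospatial_features.npy")   # hypothetical: standardized metrics, one row per landscape unit
        coords = np.load("unit_centroids.npy")          # hypothetical: projected x, y centroids in km

        feat_sim = rbf_kernel(features, gamma=0.1)                 # attribute homogeneity
        dist_sim = rbf_kernel(coords, gamma=1.0 / (50.0 ** 2))     # decays over ~50 km, favors contiguous regions
        affinity = feat_sim * dist_sim

        labels = SpectralClustering(n_clusters=100, affinity="precomputed",
                                    random_state=0).fit_predict(affinity)
        print(np.bincount(labels))                                 # region sizes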

  14. MANCOVA for one way classification with homogeneity of regression coefficient vectors

    NASA Astrophysics Data System (ADS)

    Mokesh Rayalu, G.; Ravisankar, J.; Mythili, G. Y.

    2017-11-01

    MANOVA and MANCOVA are the extensions of the univariate ANOVA and ANCOVA techniques to multidimensional or vector-valued observations. The assumption of a Gaussian distribution is replaced with a multivariate Gaussian distribution for the observation vectors and residual terms in the statistical models of these techniques. The objective of MANCOVA is to determine whether there are statistically reliable mean differences between groups after adjusting for the newly created covariate. When random assignment of samples or subjects to groups is not possible, multivariate analysis of covariance (MANCOVA) provides statistical matching of groups by adjusting the dependent variables as if all subjects scored the same on the covariates. In this research article, the MANCOVA technique is extended to a larger number of covariates, and homogeneity of the regression coefficient vectors is also tested.
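
    A hedged sketch of how the homogeneity-of-regression assumption behind MANCOVA is commonly checked is shown below: fit the multivariate model with a group-by-covariate interaction and test whether that interaction is negligible before dropping it. The data file, variable names, and the use of statsmodels' MANOVA class are assumptions for illustration, not the authors' computations.

        import pandas as pd
        from statsmodels.multivariate.manova import MANOVA

        df = pd.read_csv("scores.csv")   # hypothetical: outcomes y1, y2; factor group; covariate x

        # A non-significant C(group):x term supports homogeneity of the regression coefficient vectors
        slopes_check = MANOVA.from_formula("y1 + y2 ~ C(group) * x", data=df)
        print(slopes_check.mv_test())

        # If homogeneity holds, the MANCOVA model keeps a common slope for the covariate
        mancova = MANOVA.from_formula("y1 + y2 ~ C(group) + x", data=df)
        print(mancova.mv_test())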

  15. A Story Approach to Create Online College Courses

    ERIC Educational Resources Information Center

    Romero, Liz

    2016-01-01

    The purpose of this article is to describe the implementation of a story approach to create online courses in a college environment. The article describes the components of the approach and the implementation process to create a nursing and a language course. The implementation starts with the identification of the need and is followed by creating a…

  16. PRO-Elicere: A Study for Create a New Process of Dependability Analysis of Space Computer Systems

    NASA Astrophysics Data System (ADS)

    da Silva, Glauco; Netto Lahoz, Carlos Henrique

    2013-09-01

    This paper presents a new approach to computer system dependability analysis, called PRO-ELICERE, which introduces data mining concepts and intelligent decision-support mechanisms to analyze the potential hazards and failures of a critical computer system. Some techniques and tools that support traditional dependability analysis are also presented, and the concept of knowledge discovery and intelligent databases for critical computer systems is briefly discussed. After that, the paper introduces the PRO-ELICERE process, an intelligent approach to automating ELICERE, a process created to extract non-functional requirements for critical computer systems. PRO-ELICERE can be used in the V&V activities of projects at the Institute of Aeronautics and Space, such as the Brazilian Satellite Launcher (VLS-1).

  17. RY-Coding and Non-Homogeneous Models Can Ameliorate the Maximum-Likelihood Inferences From Nucleotide Sequence Data with Parallel Compositional Heterogeneity.

    PubMed

    Ishikawa, Sohta A; Inagaki, Yuji; Hashimoto, Tetsuo

    2012-01-01

    In phylogenetic analyses of nucleotide sequences, 'homogeneous' substitution models, which assume the stationarity of base composition across a tree, are widely used, although individual sequences may bear distinctive base frequencies. In the worst-case scenario, a homogeneous model-based analysis can yield an artifactual union of two distantly related sequences that achieved similar base frequencies in parallel. This potential difficulty can be countered by two approaches, 'RY-coding' and 'non-homogeneous' models. The former approach converts the four bases into purines and pyrimidines to normalize base frequencies across a tree, while the heterogeneity in base frequency is explicitly incorporated in the latter approach. The two approaches have been applied to real-world sequence data; however, their basic properties have not been fully examined by pioneering simulation studies. Here, we assessed the performances of maximum-likelihood analyses incorporating RY-coding and a non-homogeneous model (RY-coding and non-homogeneous analyses) on simulated data with parallel convergence to similar base composition. Both RY-coding and non-homogeneous analyses showed superior performances compared with homogeneous model-based analyses. Curiously, the performance of the RY-coding analysis appeared to be significantly affected by the setting of the substitution process used for sequence simulation, relative to that of the non-homogeneous analysis. The performance of a non-homogeneous analysis was also validated by analyzing a real-world sequence data set with significant base heterogeneity.
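
    RY-coding itself is a mechanical recoding of purines (A, G) to R and pyrimidines (C, T/U) to Y before the likelihood analysis; a minimal sketch is below. Handling of gaps and ambiguity codes is left as-is, which is an implementation choice rather than anything prescribed by the paper.

        RY_MAP = str.maketrans({"A": "R", "G": "R", "C": "Y", "T": "Y", "U": "Y"})

        def ry_code(seq):
            """Recode a nucleotide sequence into purine/pyrimidine (RY) symbols;
            gaps and ambiguity codes pass through unchanged."""
            return seq.upper().translate(RY_MAP)

        print(ry_code("atgcgtaacgguu"))   # -> RYRYRYRRYRRYY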

  18. Convergence and stress analysis of the homogeneous structure of human femur bone during standing up condition

    NASA Astrophysics Data System (ADS)

    Izzawati, B.; Daud, R.; Afendi, M.; Majid, M. S. Abdul; Zain, N. A. M.

    2017-09-01

    Finite element models have been widely used to quantify stress and to predict bone fractures in the human body. The present study focuses on the stress analysis of the homogeneous structure of the human femur during the standing-up condition. The main objective of this study is to evaluate and understand the biomechanics of the human femur and to prepare orthotropic homogeneous material models for FE analysis of the global proximal femur. It is also necessary to investigate the critical stress in the human femur for future studies on implantation of internal and external fixators. This suggests the possibility of creating a valid FE model simply by comparing the FE results with the actual biomechanical structure. Accordingly, a convergence test was performed on the FE model of the femur, and the stress analysis was based on the actual biomechanics of the human femur. The critical stress in the femoral shaft increases as the load on the femoral head increases and the pulling force at the greater trochanter decreases.

  19. Report: Recipient Subawards to Fellows Did Not Comply With Federal Requirements and EPA’s Involvement in Fellow Selection Process Creates the Appearance EPA Could Be Circumventing the Hiring Process

    EPA Pesticide Factsheets

    Report #14-P-0357, September 17, 2014. ASPH’s subawards to fellows made under the CA are contrary to federal requirements ... and ... creates an appearance that the EPA could be circumventing the hiring process.

  20. Nonlinear vibration of a traveling belt with non-homogeneous boundaries

    NASA Astrophysics Data System (ADS)

    Ding, Hu; Lim, C. W.; Chen, Li-Qun

    2018-06-01

    Free and forced nonlinear vibrations of a traveling belt with non-homogeneous boundary conditions are studied. The axially moving materials in operation are always externally excited and produce strong vibrations. The moving materials with the homogeneous boundary condition are usually considered. In this paper, the non-homogeneous boundaries are introduced by the support wheels. Equilibrium deformation of the belt is produced by the non-homogeneous boundaries. In order to solve the equilibrium deformation, the differential and integral quadrature methods (DIQMs) are utilized to develop an iterative scheme. The influence of the equilibrium deformation on free and forced nonlinear vibrations of the belt is explored. The DIQMs are applied to solve the natural frequencies and forced resonance responses of transverse vibration around the equilibrium deformation. The Galerkin truncation method (GTM) is utilized to confirm the DIQMs' results. The numerical results demonstrate that the non-homogeneous boundary conditions cause the transverse vibration to deviate from the straight equilibrium, increase the natural frequencies, and lead to coexistence of square nonlinear terms and cubic nonlinear terms. Moreover, the influence of non-homogeneous boundaries can be exacerbated by the axial speed. Therefore, non-homogeneous boundary conditions of axially moving materials especially should be taken into account.

  1. The effectiveness of CCDSR learning model to improve skills of creating lesson plan and worksheet science process skill (SPS) for pre-service physics teacher

    NASA Astrophysics Data System (ADS)

    Limatahu, I.; Sutoyo, S.; Wasis; Prahani, B. K.

    2018-03-01

    In previous research, the CCDSR (Condition, Construction, Development, Simulation, and Reflection) learning model was developed to improve science process skills for pre-service physics teachers. This research aims to analyze the effectiveness of the CCDSR learning model in improving the skills of creating Science Process Skill (SPS) lesson plans and worksheets for pre-service physics teachers in the academic year 2016/2017. The research used a one-group pre-test and post-test design with 12 pre-service physics teachers in Physics Education, University of Khairun. Data were collected through tests and observation. The pre-service physics teachers' skills in creating SPS lesson plans and worksheets were measured with the Science Process Skill Evaluation Sheet (SPSES). Data were analyzed with the Wilcoxon test and n-gain. The CCDSR learning model consists of 5 phases: (1) Condition, (2) Construction, (3) Development, (4) Simulation, and (5) Reflection. The results showed a significant increase in the skills of creating SPS lesson plans and worksheets at α = 5%, with an average n-gain in the moderate category. Thus, the CCDSR learning model is effective for improving pre-service physics teachers' skills in creating SPS lesson plans and worksheets.
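
    A hedged sketch of the two analysis steps named in the abstract, normalized gain (n-gain) and a Wilcoxon test on paired pre/post scores, follows. The scores and the 0-100 scale are invented for illustration; the conventional reading of 0.3-0.7 as a "moderate" gain is an assumption, not a value taken from the study.

        import numpy as np
        from scipy.stats import wilcoxon

        # Hypothetical pre/post scores (0-100) for 12 pre-service physics teachers
        pre = np.array([35, 42, 28, 50, 40, 33, 45, 38, 30, 48, 36, 41], dtype=float)
        post = np.array([68, 75, 60, 80, 72, 65, 78, 70, 64, 82, 69, 74], dtype=float)

        n_gain = (post - pre) / (100.0 - pre)     # normalized gain per participant
        stat, p = wilcoxon(pre, post)             # paired, non-parametric test of improvement
        print(f"mean n-gain = {n_gain.mean():.2f}, Wilcoxon statistic = {stat:.1f}, p = {p:.4f}")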

  2. Utilizing Educational Corporate Culture To Create a Quality School.

    ERIC Educational Resources Information Center

    Osborne, Bill

    Strategies for utilizing educational corporate culture to create a quality school are presented in this paper, which argues that the understanding of the shared belief system of organizational members is crucial to the process. Creating a quality school entails moving from a "teach the process" oriented model to one that internalizes the…

  3. Homogeneous Immunoassays: Historical Perspective and Future Promise

    NASA Astrophysics Data System (ADS)

    Ullman, Edwin F.

    1999-06-01

    The founding and growth of Syva Company is examined in the context of its leadership role in the development of homogeneous immunoassays. The simple mix and read protocols of these methods offer advantages in routine analytical and clinical applications. Early homogeneous methods were based on insensitive detection of immunoprecipitation during antigen/antibody binding. The advent of reporter groups in biology provided a means of quantitating immunochemical binding by labeling antibody or antigen and physically separating label incorporated into immune complexes from free label. Although high sensitivity was achieved, quantitative separations were experimentally demanding. Only when it became apparent that reporter groups could provide information, not only about the location of a molecule but also about its microscopic environment, was it possible to design practical non-separation methods. The evolution of early homogenous immunoassays was driven largely by the development of improved detection strategies. The first commercial spin immunoassays, developed by Syva for drug abuse testing during the Vietnam war, were followed by increasingly powerful methods such as immunochemical modulation of enzyme activity, fluorescence, and photo-induced chemiluminescence. Homogeneous methods that quantify analytes at femtomolar concentrations within a few minutes now offer important new opportunities in clinical diagnostics, nucleic acid detection and drug discovery.

  4. Short communication: effect of homogenization on heat inactivation of Mycobacterium avium subspecies paratuberculosis in milk.

    PubMed

    Hammer, P; Kiesner, C; Walte, H-G C

    2014-01-01

    Mycobacterium avium ssp. paratuberculosis (MAP) can be present in cow milk and low numbers may survive high-temperature, short-time (HTST) pasteurization. Although HTST treatment leads to inactivation of at least 5 log10 cycles, it might become necessary to enhance the efficacy of HTST by additional treatments such as homogenization if the debate about the role of MAP in Crohn's disease of humans concludes that MAP is a zoonotic agent. This study aimed to determine whether disrupting the clumps of MAP in milk by homogenization during the heat treatment process would enhance the inactivation of MAP. We used HTST pasteurization in a continuous-flow pilot-plant pasteurizer and evaluated the effect of upstream, downstream, and in-hold homogenization on inactivation of MAP. Reduction of MAP at 72°C with a holding time of 28 s was between 3.7 and 6.9 log10 cycles, with an overall mean of 5.5 log10 cycles. None of the 3 homogenization modes applied showed a statistically significant additional effect on the inactivation of MAP during HTST treatment.

  5. Volatile loss during homogenization of lunar melt inclusions

    NASA Astrophysics Data System (ADS)

    Ni, Peng; Zhang, Youxue; Guan, Yunbin

    2017-11-01

    Volatile abundances in the lunar mantle are critical factors to consider for constraining the model of Moon formation. Recently, the earlier understanding of a "dry" Moon has shifted to a fairly "wet" Moon due to the detection of measurable amounts of H2O in lunar volcanic glass beads, mineral grains, and olivine-hosted melt inclusions. The ongoing debate on a "dry" or "wet" Moon requires further studies on lunar melt inclusions to obtain a broader understanding of volatile abundances in the lunar mantle. One important uncertainty for lunar melt inclusion studies, however, is whether the homogenization of melt inclusions would cause volatile loss. In this study, a series of homogenization experiments were conducted on olivine-hosted melt inclusions from the sample 74220 to evaluate the possible loss of volatiles during homogenization of lunar melt inclusions. Our results suggest that significant loss of H2O could occur even during minutes of homogenization, while F, Cl and S in the inclusions remain unaffected. We model the trend of H2O loss in homogenized melt inclusions by a diffusive hydrogen loss model. The model can reconcile the observed experimental data well, with a best-fit H diffusivity in accordance with diffusion data explained by the "slow" mechanism for hydrogen diffusion in olivine. Surprisingly, no significant effect of the low oxygen fugacity on the Moon is observed on the diffusive loss of hydrogen during homogenization of lunar melt inclusions under reducing conditions. Our experimental and modeling results show that diffusive H loss is negligible for melt inclusions of >25 μm radius. As our results mitigate the concern of H2O loss during homogenization for crystalline lunar melt inclusions, we found that H2O/Ce ratios in melt inclusions from different lunar samples vary with degree of crystallization. Such a variation is more likely due to H2O loss on the lunar surface, while heterogeneity in their lunar mantle source is also a possibility.

  6. Homogenization of periodic bi-isotropic composite materials

    NASA Astrophysics Data System (ADS)

    Ouchetto, Ouail; Essakhi, Brahim

    2018-07-01

    In this paper, we present a new method for homogenizing bi-periodic materials with bi-isotropic component phases. The method is numerical, based on the finite element method, and computes the local electromagnetic properties. The homogenized constitutive parameters are expressed as a function of the macroscopic electromagnetic properties, which are obtained from the local properties. The results are compared to the Unfolding Finite Element Method and to the Maxwell-Garnett formulas.

  7. Refined Zigzag Theory for Homogeneous, Laminated Composite, and Sandwich Plates: A Homogeneous Limit Methodology for Zigzag Function Selection

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; DiSciuva, Marco; Gherlone, Marco

    2010-01-01

    The Refined Zigzag Theory (RZT) for homogeneous, laminated composite, and sandwich plates is presented from a multi-scale formalism starting with the in-plane displacement field expressed as a superposition of coarse and fine contributions. The coarse kinematic field is that of first-order shear-deformation theory, whereas the fine kinematic field has a piecewise-linear zigzag distribution through the thickness. The condition of limiting homogeneity of transverse-shear properties is proposed and yields four distinct sets of zigzag functions. By examining elastostatic solutions for highly heterogeneous sandwich plates, the best-performing zigzag functions are identified. The RZT predictive capabilities to model homogeneous and highly heterogeneous sandwich plates are critically assessed, demonstrating its superior efficiency, accuracy, and wide range of applicability. The present theory, which is derived from the virtual work principle, is well-suited for developing computationally efficient C0-continuous finite elements, and is thus appropriate for the analysis and design of high-performance load-bearing aerospace structures.

  8. Homogenization Models for Carbon Nanotubes

    NASA Astrophysics Data System (ADS)

    Muc, A.; Jamróz, M.

    2004-03-01

    Two homogenization models for evaluating Young's modulus of nanocomposites reinforced with single-walled and multi-walled carbon nanotubes are presented. The first model is based on a physical description taking into account the interatomic interaction and nanotube geometry. The elementary cell, here a nanotube with a surrounding resin layer, is treated as a homogeneous body — a material continuum. The second model, similar to a phenomenological engineering one, is obtained by combining the law of mixture with the Cox mechanical model. This model describes the stress distribution along stretched short fibers surrounded by a resin matrix. The similarities between composite materials reinforced with short fibers and nanotubes are elucidated. The results obtained are compared with those for classical microcomposites to demonstrate the advantages and disadvantages of both the composite materials.

  9. Method of fabricating a homogeneous wire of inter-metallic alloy

    DOEpatents

    Ohriner, Evan Keith; Blue, Craig Alan

    2001-01-01

    A method for fabricating a homogeneous wire of inter-metallic alloy comprising the steps of providing a base-metal wire bundle comprising a metal, an alloy or a combination thereof; working the wire bundle through at least one die to obtain a desired dimension and to form a precursor wire; and controllably heating the precursor wire such that a portion of the wire becomes liquid while simultaneously maintaining its desired shape, whereby substantial homogenization of the wire occurs in the liquid state and additional homogenization occurs in the solid state, resulting in a homogeneous alloy product.

  10. Soy Protein Isolate-Phosphatidylcholine Nanoemulsions Prepared Using High-Pressure Homogenization

    PubMed Central

    Li, Yang; Liu, Jun; Zhu, Ying; Zhang, Xiao-Yuan; Jiang, Lian-Zhou; Qi, Bao-Kun; Zhang, Xiao-Nan; Wang, Zhong-Jiang; Teng, Fei

    2018-01-01

    Nanoemulsions of soy protein isolate-phosphatidylcholine (SPI-PC) prepared under different emulsification conditions were studied. The homogenization pressure and the number of homogenization cycles were varied, along with the SPI and PC concentrations. Evaluations included turbidity, particle size, ζ-potential, particle distribution index, and turbiscan stability index (TSI). The nanoemulsions had the best stability when SPI was at 1.5%, PC was at 0.22%, the homogenization pressure was 100 MPa and homogenization was performed 4 times. The average particle size of the SPI-PC nanoemulsions was 217 nm, the TSI was 3.02, and the emulsification yield of the nanoemulsions was 93.4%. PMID:29735918

  11. Soy Protein Isolate-Phosphatidylcholine Nanoemulsions Prepared Using High-Pressure Homogenization.

    PubMed

    Li, Yang; Wu, Chang-Ling; Liu, Jun; Zhu, Ying; Zhang, Xiao-Yuan; Jiang, Lian-Zhou; Qi, Bao-Kun; Zhang, Xiao-Nan; Wang, Zhong-Jiang; Teng, Fei

    2018-05-07

    Nanoemulsions of soy protein isolate-phosphatidylcholine (SPI-PC) prepared under different emulsification conditions were studied. The homogenization pressure and the number of homogenization cycles were varied, along with the SPI and PC concentrations. Evaluations included turbidity, particle size, ζ-potential, particle distribution index, and turbiscan stability index (TSI). The nanoemulsions had the best stability when SPI was at 1.5%, PC was at 0.22%, the homogenization pressure was 100 MPa and homogenization was performed 4 times. The average particle size of the SPI-PC nanoemulsions was 217 nm, the TSI was 3.02, and the emulsification yield of the nanoemulsions was 93.4%.

  12. Sewage sludge disintegration by combined treatment of alkaline+high pressure homogenization.

    PubMed

    Zhang, Yuxuan; Zhang, Panyue; Zhang, Guangming; Ma, Weifang; Wu, Hao; Ma, Boqiang

    2012-11-01

    Alkaline pretreatment combined with high pressure homogenization (HPH) was applied to promote sewage sludge disintegration. For sewage sludge with a total solid content of 1.82%, the sludge disintegration degree (DD(COD)) with combined treatment was higher than the sum of the DD(COD) with single alkaline and single HPH treatment. A NaOH dosage ⩽0.04 mol/L, a homogenization pressure ⩽60 MPa and a single homogenization cycle were the suitable conditions for combined sludge treatment. The combined sludge treatment showed a maximum DD(COD) of 59.26%. By regression analysis, the combined sludge disintegration model was established as DD(COD) = 0.713·C^0.334·P^0.234·N^0.119, showing that the effect of operating parameters on sludge disintegration followed the order: NaOH dosage > homogenization pressure > number of homogenization cycles. The energy efficiency with combined sludge treatment significantly increased compared with that with single HPH treatment, and high energy efficiency was achieved at low homogenization pressure with a single homogenization cycle. Copyright © 2012 Elsevier Ltd. All rights reserved.
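
    A minimal sketch of evaluating the regression model quoted above, assuming (as an interpretation not stated explicitly in the abstract) that C is the NaOH dosage in mol/L, P the homogenization pressure in MPa, and N the number of homogenization cycles:

```python
# Hedged sketch: the power-law disintegration model as reconstructed from the abstract.
# The mapping of C, P, N to NaOH dosage, pressure and cycle count is an assumption.
def dd_cod(c_naoh_mol_per_l, p_mpa, n_cycles):
    """Predicted sludge disintegration degree (as a fraction) for the given settings."""
    return 0.713 * c_naoh_mol_per_l**0.334 * p_mpa**0.234 * n_cycles**0.119

# Example at the 'suitable' conditions quoted in the abstract: 0.04 mol/L NaOH, 60 MPa, 1 cycle;
# compare with the reported maximum DD(COD) of 59.26%.
print(f"DD(COD) ~ {dd_cod(0.04, 60, 1):.2f}")
```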

  13. Creating multithemed ecological regions for macroscale ecology: Testing a flexible, repeatable, and accessible clustering method.

    PubMed

    Cheruvelil, Kendra Spence; Yuan, Shuai; Webster, Katherine E; Tan, Pang-Ning; Lapierre, Jean-François; Collins, Sarah M; Fergus, C Emi; Scott, Caren E; Henry, Emily Norton; Soranno, Patricia A; Filstrup, Christopher T; Wagner, Tyler

    2017-05-01

    Understanding broad-scale ecological patterns and processes often involves accounting for regional-scale heterogeneity. A common way to do so is to include ecological regions in sampling schemes and empirical models. However, most existing ecological regions were developed for specific purposes, using a limited set of geospatial features and irreproducible methods. Our study purpose was to: (1) describe a method that takes advantage of recent computational advances and increased availability of regional and global data sets to create customizable and reproducible ecological regions, (2) make this algorithm available for use and modification by others studying different ecosystems, variables of interest, study extents, and macroscale ecology research questions, and (3) demonstrate the power of this approach for the research question: How well do these regions capture regional-scale variation in lake water quality? To achieve our purpose we: (1) used a spatially constrained spectral clustering algorithm that balances geospatial homogeneity and region contiguity to create ecological regions using multiple terrestrial, climatic, and freshwater geospatial data for 17 northeastern U.S. states (~1,800,000 km²); (2) identified which of the 52 geospatial features were most influential in creating the resulting 100 regions; and (3) tested the ability of these ecological regions to capture regional variation in water nutrients and clarity for ~6,000 lakes. We found that: (1) a combination of terrestrial, climatic, and freshwater geospatial features influenced region creation, suggesting that the oft-ignored freshwater landscape provides novel information on landscape variability not captured by traditionally used climate and terrestrial metrics; and (2) the delineated regions captured macroscale heterogeneity in ecosystem properties not included in region delineation; approximately 40% of the variation in total phosphorus and water clarity among lakes was at the regional
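
    The study uses a spatially constrained spectral clustering algorithm; the unconstrained scikit-learn sketch below, run on a synthetic feature matrix, only illustrates the general idea of clustering standardized geospatial features into candidate regions and is not the authors' algorithm.

```python
# Hedged sketch: ordinary (unconstrained) spectral clustering of standardized geospatial
# features into candidate regions; the feature matrix here is synthetic.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 52))             # 300 hypothetical landscape units x 52 features

X_std = StandardScaler().fit_transform(X)  # put all features on a comparable scale
regions = SpectralClustering(n_clusters=10, random_state=0).fit_predict(X_std)
print(np.bincount(regions))                # number of units assigned to each candidate region
```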

  14. In vivo quantitative bioluminescence tomography using heterogeneous and homogeneous mouse models.

    PubMed

    Liu, Junting; Wang, Yabin; Qu, Xiaochao; Li, Xiangsi; Ma, Xiaopeng; Han, Runqiang; Hu, Zhenhua; Chen, Xueli; Sun, Dongdong; Zhang, Rongqing; Chen, Duofang; Chen, Dan; Chen, Xiaoyuan; Liang, Jimin; Cao, Feng; Tian, Jie

    2010-06-07

    Bioluminescence tomography (BLT) is a new optical molecular imaging modality, which can monitor both physiological and pathological processes by using bioluminescent light-emitting probes in small living animals. In particular, this technology possesses great potential in drug development, early detection, and therapy monitoring in preclinical settings. In the present study, we developed a dual-modality BLT prototype system with a micro-computed tomography (MicroCT) registration approach, and improved the quantitative reconstruction algorithm based on the adaptive hp finite element method (hp-FEM). Detailed comparisons of source reconstruction between the heterogeneous and homogeneous mouse models were performed. The models include mice with an implanted luminescence source and tumor-bearing mice with a firefly luciferase reporter gene. Our data suggest that reconstruction based on the heterogeneous mouse model is more accurate in localization and quantification than the homogeneous mouse model with appropriate optical parameters, and that BLT allows super-early tumor detection in vivo based on tomographic reconstruction of the heterogeneous mouse model signal.

  15. The homogeneity effect on figure/ground perception in infancy.

    PubMed

    Takashima, Midori; Kanazawa, So; Yamaguchi, Masami K; Shiina, Ken

    2014-02-01

    We examined whether the homogeneity of the two profiles of Rubin's goblet affects figure/ground perception in infants. We modified the two profiles of Rubin's goblet in order to compare figure/ground perception under four test patterns: (1) two profiles painted with horizontal lines (horizontal-line condition), (2) two profiles painted middle gray (uni-color condition), (3) one profile painted light gray and the other dark gray (two-color condition), and (4) a goblet painted with concentric circles (concentric-circles condition). In the horizontal-line condition the homogeneity of the profiles was strengthened, and in the two-color condition the homogeneity of the profiles was weakened compared to the uni-color condition, which was the original Rubin's goblet. In the concentric-circles condition the homogeneity of the reversed areas of the horizontal-line pattern was strengthened. After infants were familiarized with each Rubin's goblet, the infants were tested on their discrimination between the two profiles and the goblet in the post-familiarization test. In the horizontal-line, uni-color, and concentric-circles conditions, infants showed a novelty preference for the two profiles in the post-familiarization test. On the other hand, in the two-color condition no preference was observed in the post-familiarization test. This means that infants perceived the goblet as figure and the two profiles as ground in the horizontal-line, uni-color, and concentric-circles conditions. We found that infants could not perceive the goblet area as figure when the homogeneity of the two profiles was weakened. It can be said that figure/ground perception in infancy is not affected by strengthened homogeneity, but is affected by weakened homogeneity. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Land-use intensification causes multitrophic homogenization of grassland communities.

    PubMed

    Gossner, Martin M; Lewinsohn, Thomas M; Kahl, Tiemo; Grassein, Fabrice; Boch, Steffen; Prati, Daniel; Birkhofer, Klaus; Renner, Swen C; Sikorski, Johannes; Wubet, Tesfaye; Arndt, Hartmut; Baumgartner, Vanessa; Blaser, Stefan; Blüthgen, Nico; Börschig, Carmen; Buscot, Francois; Diekötter, Tim; Jorge, Leonardo Ré; Jung, Kirsten; Keyel, Alexander C; Klein, Alexandra-Maria; Klemmer, Sandra; Krauss, Jochen; Lange, Markus; Müller, Jörg; Overmann, Jörg; Pašalić, Esther; Penone, Caterina; Perović, David J; Purschke, Oliver; Schall, Peter; Socher, Stephanie A; Sonnemann, Ilja; Tschapka, Marco; Tscharntke, Teja; Türke, Manfred; Venter, Paul Christiaan; Weiner, Christiane N; Werner, Michael; Wolters, Volkmar; Wurst, Susanne; Westphal, Catrin; Fischer, Markus; Weisser, Wolfgang W; Allan, Eric

    2016-12-08

    Land-use intensification is a major driver of biodiversity loss. Alongside reductions in local species diversity, biotic homogenization at larger spatial scales is of great concern for conservation. Biotic homogenization means a decrease in β-diversity (the compositional dissimilarity between sites). Most studies have investigated losses in local (α)-diversity and neglected biodiversity loss at larger spatial scales. Studies addressing β-diversity have focused on single or a few organism groups (for example, ref. 4), and it is thus unknown whether land-use intensification homogenizes communities at different trophic levels, above- and belowground. Here we show that even moderate increases in local land-use intensity (LUI) cause biotic homogenization across microbial, plant and animal groups, both above- and belowground, and that this is largely independent of changes in α-diversity. We analysed a unique grassland biodiversity dataset, with abundances of more than 4,000 species belonging to 12 trophic groups. LUI, and, in particular, high mowing intensity, had consistent effects on β-diversity across groups, causing a homogenization of soil microbial, fungal pathogen, plant and arthropod communities. These effects were nonlinear and the strongest declines in β-diversity occurred in the transition from extensively managed to intermediate intensity grassland. LUI tended to reduce local α-diversity in aboveground groups, whereas the α-diversity increased in belowground groups. Correlations between the β-diversity of different groups, particularly between plants and their consumers, became weaker at high LUI. This suggests a loss of specialist species and is further evidence for biotic homogenization. The consistently negative effects of LUI on landscape-scale biodiversity underscore the high value of extensively managed grasslands for conserving multitrophic biodiversity and ecosystem service provision. Indeed, biotic homogenization rather than local diversity
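
    β-diversity in this sense can be summarized, for example, as the mean pairwise compositional dissimilarity among sites; the sketch below, using synthetic abundance data, shows how a decline in that quantity signals homogenization. It is only an illustration of the concept, not the multivariate analysis used in the paper.

```python
# Hedged sketch: mean pairwise Bray-Curtis dissimilarity as a simple beta-diversity summary.
# The site-by-species abundance matrices are synthetic.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
heterogeneous = rng.poisson(lam=3.0, size=(20, 50))   # 20 sites x 50 species, independent communities

base = rng.poisson(lam=3.0, size=50)                  # one shared community...
homogenized = np.clip(base + rng.integers(-2, 3, size=(20, 50)), 0, None)  # ...with small site-to-site noise

print("beta (heterogeneous):", pdist(heterogeneous, metric="braycurtis").mean().round(3))
print("beta (homogenized):  ", pdist(homogenized, metric="braycurtis").mean().round(3))
```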

  17. Land-use intensification causes multitrophic homogenization of grassland communities

    NASA Astrophysics Data System (ADS)

    Gossner, Martin M.; Lewinsohn, Thomas M.; Kahl, Tiemo; Grassein, Fabrice; Boch, Steffen; Prati, Daniel; Birkhofer, Klaus; Renner, Swen C.; Sikorski, Johannes; Wubet, Tesfaye; Arndt, Hartmut; Baumgartner, Vanessa; Blaser, Stefan; Blüthgen, Nico; Börschig, Carmen; Buscot, Francois; Diekötter, Tim; Jorge, Leonardo Ré; Jung, Kirsten; Keyel, Alexander C.; Klein, Alexandra-Maria; Klemmer, Sandra; Krauss, Jochen; Lange, Markus; Müller, Jörg; Overmann, Jörg; Pašalić, Esther; Penone, Caterina; Perović, David J.; Purschke, Oliver; Schall, Peter; Socher, Stephanie A.; Sonnemann, Ilja; Tschapka, Marco; Tscharntke, Teja; Türke, Manfred; Venter, Paul Christiaan; Weiner, Christiane N.; Werner, Michael; Wolters, Volkmar; Wurst, Susanne; Westphal, Catrin; Fischer, Markus; Weisser, Wolfgang W.; Allan, Eric

    2016-12-01

    Land-use intensification is a major driver of biodiversity loss. Alongside reductions in local species diversity, biotic homogenization at larger spatial scales is of great concern for conservation. Biotic homogenization means a decrease in β-diversity (the compositional dissimilarity between sites). Most studies have investigated losses in local (α)-diversity and neglected biodiversity loss at larger spatial scales. Studies addressing β-diversity have focused on single or a few organism groups (for example, ref. 4), and it is thus unknown whether land-use intensification homogenizes communities at different trophic levels, above- and belowground. Here we show that even moderate increases in local land-use intensity (LUI) cause biotic homogenization across microbial, plant and animal groups, both above- and belowground, and that this is largely independent of changes in α-diversity. We analysed a unique grassland biodiversity dataset, with abundances of more than 4,000 species belonging to 12 trophic groups. LUI, and, in particular, high mowing intensity, had consistent effects on β-diversity across groups, causing a homogenization of soil microbial, fungal pathogen, plant and arthropod communities. These effects were nonlinear and the strongest declines in β-diversity occurred in the transition from extensively managed to intermediate intensity grassland. LUI tended to reduce local α-diversity in aboveground groups, whereas the α-diversity increased in belowground groups. Correlations between the β-diversity of different groups, particularly between plants and their consumers, became weaker at high LUI. This suggests a loss of specialist species and is further evidence for biotic homogenization. The consistently negative effects of LUI on landscape-scale biodiversity underscore the high value of extensively managed grasslands for conserving multitrophic biodiversity and ecosystem service provision. Indeed, biotic homogenization rather than local diversity

  18. Homogeneity revisited: analysis of updated precipitation series in Turkey

    NASA Astrophysics Data System (ADS)

    Bickici Arikan, Bugrayhan; Kahya, Ercan

    2018-01-01

    Homogeneous time series of meteorological variables are necessary for hydrologic and climate studies. The dependability of historical precipitation data is subjected to keen evaluation prior to every study in the water resources, hydrology, and climate change fields. This study aims to characterize the homogeneity of long-term Turkish precipitation data in order to ensure that they can be reliably used. The homogeneity of the monthly precipitation data set was tested using the standard normal homogeneity test, Buishand test, Von Neumann ratio test, and Pettitt test at the 5% significance level across Turkey. Our precipitation records, including the most updated observations, extracted from 160 meteorological stations for the period 1974-2014, were analyzed by all four homogeneity tests. According to the results of all tests, five out of 160 stations have an inhomogeneity. With regard to our strict confirmation rule, 44 out of 160 stations are said to be inhomogeneous since they failed at least one of the four tests. The breaks captured by the Buishand and Pettitt tests usually tend to appear in the middle of the precipitation series, whereas the standard normal homogeneity test tends to identify inhomogeneities mostly at the beginning or at the end of the records. Our results showed that 42 out of 44 inhomogeneous stations passed all four tests after applying a correction procedure based on the double mass curve analysis. Available metadata were used to interpret the detected inhomogeneities.
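
    Of the four tests named above, the Von Neumann ratio is the simplest to state: the mean squared successive difference divided by the variance, which is close to 2 for a homogeneous series and drops well below 2 when a break is present. The sketch below applies it to a synthetic series with an artificial break; the series is not Turkish station data, and a proper test would compare the ratio against tabulated critical values at the 5% level.

```python
# Hedged sketch: Von Neumann ratio applied to a synthetic annual precipitation series.
import numpy as np

def von_neumann_ratio(x):
    """Sum of squared successive differences over sum of squared deviations; ~2 if homogeneous."""
    x = np.asarray(x, dtype=float)
    return np.sum(np.diff(x) ** 2) / np.sum((x - x.mean()) ** 2)

rng = np.random.default_rng(42)
series = rng.normal(600.0, 80.0, size=41)   # hypothetical 1974-2014 annual totals (mm)
series[20:] += 150.0                        # impose a break in the middle of the record

print(f"Von Neumann ratio: {von_neumann_ratio(series):.2f}")  # well below 2 -> likely inhomogeneous
```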

  19. Freezing of homogenized sputum samples for intermittent storage.

    PubMed

    Holz, O; Mücke, M; Zarza, P; Loppow, D; Jörres, R A; Magnussen, H

    2001-08-01

    Among the reasons that restrict the application of sputum induction in outpatient settings is the need to process samples within 2 h after induction. The aim of our study was to assess whether freezing is suitable for the intermediate storage of sputum samples before processing. We compared differential cell counts between two sputum aliquots derived from the same sample. One aliquot was processed within 2 h after production and one after it had been frozen with the addition of dimethyl sulfoxide (DMSO) and stored for up to 10 days at -20 degrees C. Thirty-five samples were frozen immediately prior to the preparation of cytospins, and 10 samples were frozen at an even earlier stage, directly after homogenization. In both sets of experiments we observed a significant relationship between frozen and native samples regarding macrophages, neutrophils and eosinophils, as indicated by respective intraclass correlation coefficients of 0.96, 0.96, and 0.93 in the first, and of 0.92, 0.96 and 0.77 in the second experiments. Our results indicate that the freezing of sputum samples at different stages of processing does not alter sputum morphology to an extent that affects the results of differential cell counts.

  20. A study on the entrainment and mixing process in the continental stratocumulus clouds measured during the RACORO campaign

    DOE PAGES

    Yeom, Jae Min; Yum, Seong Soo; Liu, Yangang; ...

    2017-04-20

    Entrainment and mixing processes and their effects on cloud microphysics in the continental stratocumulus clouds observed in Oklahoma during the RACORO campaign are analyzed in the frame of homogeneous and inhomogeneous mixing concepts by combining the approaches of microphysical correlation, mixing diagram, and transition scale (number). A total of 110 horizontally penetrated cloud segments is analyzed in this paper. Mixing diagram and cloud microphysical relationship analyses show homogeneous mixing trait of positive relationship between liquid water content (L) and mean volume of droplets (V) (i.e., smaller droplets in more diluted parcel) in most cloud segments. Relatively small temperature and humidity differences between the entraining air from above the cloud top and cloudy air and relatively large turbulent dissipation rate are found to be responsible for this finding. The related scale parameters (i.e., transition length and transition scale number) are relatively large, which also indicates high likelihood of homogeneous mixing. Finally, clear positive relationship between L and vertical velocity (W) for some cloud segments is suggested to be evidence of vertical circulation mixing, which may further enhance the positive relationship between L and V created by homogeneous mixing.

  1. A study on the entrainment and mixing process in the continental stratocumulus clouds measured during the RACORO campaign

    NASA Astrophysics Data System (ADS)

    Yeom, Jae Min; Yum, Seong Soo; Liu, Yangang; Lu, Chunsong

    2017-09-01

    Entrainment and mixing processes and their effects on cloud microphysics in the continental stratocumulus clouds observed in Oklahoma during the RACORO campaign are analyzed in the frame of homogeneous and inhomogeneous mixing concepts by combining the approaches of microphysical correlation, mixing diagram, and transition scale (number). A total of 110 horizontally penetrated cloud segments is analyzed. Mixing diagram and cloud microphysical relationship analyses show homogeneous mixing trait of positive relationship between liquid water content (L) and mean volume of droplets (V) (i.e., smaller droplets in more diluted parcel) in most cloud segments. Relatively small temperature and humidity differences between the entraining air from above the cloud top and cloudy air and relatively large turbulent dissipation rate are found to be responsible for this finding. The related scale parameters (i.e., transition length and transition scale number) are relatively large, which also indicates high likelihood of homogeneous mixing. Clear positive relationship between L and vertical velocity (W) for some cloud segments is suggested to be evidence of vertical circulation mixing, which may further enhance the positive relationship between L and V created by homogeneous mixing.

  2. A study on the entrainment and mixing process in the continental stratocumulus clouds measured during the RACORO campaign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yeom, Jae Min; Yum, Seong Soo; Liu, Yangang

    Entrainment and mixing processes and their effects on cloud microphysics in the continental stratocumulus clouds observed in Oklahoma during the RACORO campaign are analyzed in the frame of homogeneous and inhomogeneous mixing concepts by combining the approaches of microphysical correlation, mixing diagram, and transition scale (number). A total of 110 horizontally penetrated cloud segments is analyzed in this paper. Mixing diagram and cloud microphysical relationship analyses show homogeneous mixing trait of positive relationship between liquid water content (L) and mean volume of droplets (V) (i.e., smaller droplets in more diluted parcel) in most cloud segments. Relatively small temperature and humidity differences between the entraining air from above the cloud top and cloudy air and relatively large turbulent dissipation rate are found to be responsible for this finding. The related scale parameters (i.e., transition length and transition scale number) are relatively large, which also indicates high likelihood of homogeneous mixing. Finally, clear positive relationship between L and vertical velocity (W) for some cloud segments is suggested to be evidence of vertical circulation mixing, which may further enhance the positive relationship between L and V created by homogeneous mixing.

  3. Human-Induced Landscape Changes Homogenize Atlantic Forest Bird Assemblages through Nested Species Loss

    PubMed Central

    Villegas Vallejos, Marcelo Alejandro; Padial, André Andrian; Vitule, Jean Ricardo Simões

    2016-01-01

    The increasing number of quantitative assessments of homogenization using citizen science data is particularly important in the Neotropics, given its high biodiversity and ecological peculiarity, and whose communities may react differently to landscape changes. We looked for evidence of taxonomic homogenization in terrestrial birds by investigating patterns of beta diversity along a gradient of human-altered landscapes (HAL), trying to identify species associated with this process. We analyzed bird data from 87 sites sampled in a citizen science program in the south Brazilian Atlantic Forest. Regional-scale taxonomic homogenization was assessed by comparing beta diversity among sites in different HALs (natural, rural or urban landscapes) accounting for variation derived from geographical distance and zoogeographical affinities by georeferencing sites and determining their position in a phytogeographical domain. Beta diversity was calculated by multivariate dispersion and by testing compositional changes due to turnover and nestedness among HALs and phytogeographical domains. Finally, we assessed which species were typical for each group using indicator species analysis. Bird homogenization was indicated by decreases in beta diversity following landscape changes. Beta diversity of rural sites was roughly half that of natural habitats, while urban sites held less than 10% of the natural areas’ beta diversity. Species composition analysis revealed that the turnover component was important in differentiating sites depending on HAL and phytogeography; the nestedness component was important among HALs, where directional species loss is maintained even considering effects of sampling effort. A similar result was obtained among phytogeographical domains, indicating nested-pattern dissimilarity among compositions of overlapping communities. As expected, a few native generalists and non-native urban specialists were characteristic of rural and urban sites. We generated

  4. Method of Mapping Anomalies in Homogenous Material

    NASA Technical Reports Server (NTRS)

    Taylor, Bryant D. (Inventor); Woodard, Stanley E. (Inventor)

    2016-01-01

    An electrical conductor and antenna are positioned in a fixed relationship to one another. Relative lateral movement is generated between the electrical conductor and a homogenous material while maintaining the electrical conductor at a fixed distance from the homogenous material. The antenna supplies a time-varying magnetic field that causes the electrical conductor to resonate and generate harmonic electric and magnetic field responses. Disruptions in at least one of the electric and magnetic field responses during this lateral movement are indicative of a lateral location of a subsurface anomaly. Next, relative out-of-plane movement is generated between the electrical conductor and the homogenous material in the vicinity of the anomaly's lateral location. Disruptions in at least one of the electric and magnetic field responses during this out-of-plane movement are indicative of a depth location of the subsurface anomaly. A recording of the disruptions provides a mapping of the anomaly.

  5. Rapid biotic homogenization of marine fish assemblages

    PubMed Central

    Magurran, Anne E.; Dornelas, Maria; Moyes, Faye; Gotelli, Nicholas J.; McGill, Brian

    2015-01-01

    The role human activities play in reshaping biodiversity is increasingly apparent in terrestrial ecosystems. However, the responses of entire marine assemblages are not well-understood, in part, because few monitoring programs incorporate both spatial and temporal replication. Here, we analyse an exceptionally comprehensive 29-year time series of North Atlantic groundfish assemblages monitored over 5° latitude to the west of Scotland. These fish assemblages show no systematic change in species richness through time, but steady change in species composition, leading to an increase in spatial homogenization: the species identity of colder northern localities increasingly resembles that of warmer southern localities. This biotic homogenization mirrors the spatial pattern of unevenly rising ocean temperatures over the same time period suggesting that climate change is primarily responsible for the spatial homogenization we observe. In this and other ecosystems, apparent constancy in species richness may mask major changes in species composition driven by anthropogenic change. PMID:26400102

  6. Creating Cartoons to Promote Leaderships Skills and Explore Leadership Qualities

    ERIC Educational Resources Information Center

    Smith, Latisha L.; Clausen, Courtney K.; Teske, Jolene K.; Ghayoorrad, Maryam; Gray, Phyllis; Al Subia, Sukainah; Atwood-Blaine, Dana; Rule, Audrey C.

    2015-01-01

    This document describes a strategy for increasing student leadership and creativity skills through the creation of cartoons. Creating cartoons engages students in divergent thinking and cognitive processes, such as perception, recall, and mental processing. When students create cartoons focused on a particular topic, they are making connections to…

  7. Creating a Pilot Educational Psychiatry Website: Opportunities, Barriers, and Next Steps.

    PubMed

    Torous, John; O'Connor, Ryan; Franzen, Jamie; Snow, Caitlin; Boland, Robert; Kitts, Robert

    2015-11-05

    While medical students and residents may be utilizing websites as online learning resources, medical trainees and educators now have the opportunity to create such educational websites and digital tools on their own. However, the process and theory of building educational websites for medical education have not yet been fully explored. To understand the opportunities, barriers, and process of creating a novel medical educational website. We created a pilot psychiatric educational website to better understand the options, opportunities, challenges, and processes involved in the creation of a psychiatric educational website. We sought to integrate visual and interactive Web design elements to underscore the potential of such Web technology. A pilot website (PsychOnCall) was created to demonstrate the potential of Web technology in medical and psychiatric education. Creating an educational website is now technically easier than ever before, and the primary challenge no longer is technology but rather the creation, validation, and maintenance of information for such websites as well as translating text-based didactics into visual and interactive tools. Medical educators can influence the design and implementation of online educational resources through creating their own websites and engaging medical students and residents in the process.

  8. Application of high-pressure homogenization on gums.

    PubMed

    Belmiro, Ricardo Henrique; Tribst, Alline Artigiani Lima; Cristianini, Marcelo

    2018-04-01

    High-pressure homogenization (HPH) is an emerging process during which a fluid product is pumped by pressure intensifiers, forcing it to flow through a narrow gap, usually measured in the order of micrometers. Gums are polysaccharides from vegetal, animal or microbial origin and are widely employed in food and chemical industries as thickeners, stabilizers, gelling agents and emulsifiers. The choice of a specific gum depends on its application and purpose because each form of gum has particular values with respect to viscosity, intrinsic viscosity, stability, and emulsifying and gelling properties, with these parameters being determined by its structure. HPH is able to alter those properties positively by inducing changes in the original polymer, allowing for new applications and improvements with respect to the technical properties of gums. This review highlights the most important advances when this process is applied to change polysaccharides from distinct sources and molecular structures, as well as the future challenges that remain. © 2017 Society of Chemical Industry.

  9. Pressure-strain-rate events in homogeneous turbulent shear flow

    NASA Technical Reports Server (NTRS)

    Brasseur, James G.; Lee, Moon J.

    1988-01-01

    A detailed study of the intercomponent energy transfer processes by the pressure-strain-rate in homogeneous turbulent shear flow is presented. Probability density functions (pdf's) and contour plots of the rapid and slow pressure-strain-rate show that the energy transfer processes are extremely peaky, with high-magnitude events dominating low-magnitude fluctuations, as reflected by very high flatness factors of the pressure-strain-rate. A concept of the energy transfer class was applied to investigate details of the direction as well as magnitude of the energy transfer processes. In incompressible flow, six disjoint energy transfer classes exist. Examination of contours in instantaneous fields, pdf's and weighted pdf's of the pressure-strain-rate indicates that in the low magnitude regions all six classes play an important role, but in the high magnitude regions four classes of transfer processes dominate. The contribution to the average slow pressure-strain-rate from the high magnitude fluctuations is only 50 percent or less. The relative significance of high and low magnitude transfer events is discussed.

  10. Creating Math Videos: Comparing Platforms and Software

    ERIC Educational Resources Information Center

    Abbasian, Reza O.; Sieben, John T.

    2016-01-01

    In this paper we present a short tutorial on creating mini-videos using two platforms--PCs and tablets such as iPads--and software packages that work with these devices. Specifically, we describe the step-by-step process of creating and editing videos using a Wacom Intuos pen-tablet plus Camtasia software on a PC platform and using the software…

  11. Edge-Based Image Compression with Homogeneous Diffusion

    NASA Astrophysics Data System (ADS)

    Mainberger, Markus; Weickert, Joachim

    It is well-known that edges contain semantically important image information. In this paper we present a lossy compression method for cartoon-like images that exploits information at image edges. These edges are extracted with the Marr-Hildreth operator followed by hysteresis thresholding. Their locations are stored in a lossless way using JBIG. Moreover, we encode the grey or colour values at both sides of each edge by applying quantisation, subsampling and PAQ coding. In the decoding step, information outside these encoded data is recovered by solving the Laplace equation, i.e. we inpaint with the steady state of a homogeneous diffusion process. Our experiments show that the suggested method outperforms the widely-used JPEG standard and can even beat the advanced JPEG2000 standard for cartoon-like images.
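
    The decoding step described above amounts to solving the Laplace equation with the stored pixels as fixed boundary data. The sketch below shows the idea with a plain Jacobi iteration on a tiny synthetic image; it is only a toy illustration (with periodic wrap-around at the image border for brevity), not the authors' codec.

```python
# Hedged sketch: homogeneous-diffusion inpainting by Jacobi iteration of the discrete
# Laplace equation, keeping the "encoded" pixels fixed; image and mask are synthetic.
import numpy as np

def diffusion_inpaint(image, known, n_iter=5000):
    """Relax unknown pixels to the 4-neighbour average while holding known pixels fixed."""
    u = image.astype(float).copy()
    for _ in range(n_iter):
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                      np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(known, image, avg)       # known pixels stay, the rest diffuses
    return u

img = np.zeros((32, 32))
img[:, 16:] = 1.0                             # a single vertical edge
known = np.zeros_like(img, dtype=bool)
known[:, 14:18] = True                        # store only the pixels flanking the edge
restored = diffusion_inpaint(img, known)
print(restored[16, ::8].round(2))             # values interpolate smoothly away from the edge
```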

  12. Method of removing the effects of electrical shorts and shunts created during the fabrication process of a solar cell

    DOEpatents

    Nostrand, Gerald E.; Hanak, Joseph J.

    1979-01-01

    A method of removing the effects of electrical shorts and shunts created during the fabrication process, and of improving the performance of a solar cell having a thick-film cermet electrode opposite the incident surface, by applying a reverse bias voltage of sufficient magnitude to burn out the electrical shorts and shunts but less than the breakdown voltage of the solar cell.

  13. Creating Economic Incentives for Waste Disposal in Developing Countries Using the MixAlco Process.

    PubMed

    Lonkar, Sagar; Fu, Zhihong; Wales, Melinda; Holtzapple, Mark

    2017-01-01

    In rapidly growing developing countries, waste disposal is a major challenge. Current waste disposal methods (e.g., landfills and sewage treatment) incur costs and often are not employed; thus, wastes accumulate in the environment. To address this challenge, it is advantageous to create economic incentives to collect and process wastes. One approach is the MixAlco process, which uses methane-inhibited anaerobic fermentation to convert waste biomass into carboxylate salts, which are chemically converted to industrial chemicals and fuels. In this paper, humanure (raw human feces and urine) is explored as a possible nutrient source for fermentation. This work focuses on fermenting municipal solid waste (energy source) and humanure (nutrient source) in batch fermentations. Using the Continuum Particle Distribution Model (CPDM), the performance of continuous countercurrent fermentation was predicted at different volatile solid loading rates (VSLR) and liquid residence times (LRT). For a four-stage countercurrent fermentation system at VSLR = 4 g/(L∙day), LRT = 30 days, and solids concentration = 100 g/L liquid, the model predicts a carboxylic acid concentration of 68 g/L and a conversion of 78.5%.

  14. Rotational homogeneity in graphene grown on Au(111)

    NASA Astrophysics Data System (ADS)

    Wofford, Joseph; Starodub, Elena; Walter, Andrew; Nie, Shu; Bostwick, Aaron; Bartelt, Norman; Thürmer, Konrad; Rotenberg, Eli; McCarty, Kevin; Dubon, Oscar

    2012-02-01

    The set of properties offered by the (111) surface of gold makes it intriguing as a platform on which to study the fundamental processes that underpin graphene growth on metals. Among these are the low carbon solubility and an interaction strength with graphene that is predicted to be smaller than most transition metals. We have investigated this synthesis process using low-energy electron microscopy and diffraction to monitor the sample surface in real time, and found that the resulting graphene film possesses a remarkable degree of rotational homogeneity. The dominant orientation of the graphene is aligned with the Au lattice, with a small minority rotated by 30 degrees. The origins of this in-plane structuring are puzzling because angularly resolved photo-emission spectroscopy and scanning tunneling microscopy experiments both suggest only a relatively small interaction between the two materials. Finally, the implications of these findings for the growth of high structural-quality graphene films are discussed.

  15. STEAM STIRRED HOMOGENEOUS NUCLEAR REACTOR

    DOEpatents

    Busey, H.M.

    1958-06-01

    A homogeneous nuclear reactor utilizing a self-circulating liquid fuel is described. The reactor vessel is in the form of a vertically disposed tubular member having the lower end closed by the tube walls and the upper end closed by a removable flanged assembly. A spherical reaction shell is located in the lower end of the vessel and spaced from the inside walls. The reaction shell is perforated on its lower surface and is provided with a bundle of small-diameter tubes extending vertically upward from its top central portion. The reactor vessel is surrounded in the region of the reaction shell by a neutron reflector. The liquid fuel, which may be a solution of enriched uranyl sulfate in ordinary or heavy water, is maintained at a level within the reactor vessel of approximately the top of the tubes. The heat of the reaction, which is created in the critical region within the spherical reaction shell, forms steam bubbles which move upwardly through the tubes. The upward movement of these bubbles results in the forcing of the liquid fuel out of the top of these tubes, from where the fuel passes downwardly in the space between the tubes and the vessel wall, where it is cooled by heat exchangers. The fuel then re-enters the critical region in the reaction shell through the perforations in the bottom. The upper portion of the reactor vessel is provided with baffles to prevent the liquid fuel from splashing into this region, which is also provided with a recombiner apparatus for recombining the radiolytically dissociated moderator vapor, and a control means.

  16. Soluble Molecularly Imprinted Nanorods for Homogeneous Molecular Recognition

    NASA Astrophysics Data System (ADS)

    Liang, Rongning; Wang, Tiantian; Zhang, Huan; Yao, Ruiqing; Qin, Wei

    2018-03-01

    Nowadays, it is still difficult for molecularly imprinted polymers (MIPs) to achieve homogeneous recognition since they cannot be easily dissolved in the organic or aqueous phase. To address this issue, soluble molecularly imprinted nanorods have been synthesized by using soluble polyaniline doped with a functionalized organic protonic acid as the polymer matrix. By employing 1-naphthoic acid as a model, the proposed imprinted nanorods exhibit an excellent solubility and good homogeneous recognition ability. The imprinting factor for the soluble imprinted nanorods is 6.8. The equilibrium dissociation constant and the apparent maximum binding amount of the proposed imprinted nanorods are 248.5 μM and 22.1 μmol/g, respectively. We believe that such imprinted nanorods may provide an appealing substitute for natural receptors in homogeneous recognition-related fields.

  17. Intensity and angle-of-arrival spectra of laser light propagating through axially homogeneous buoyancy-driven turbulence.

    PubMed

    Pawar, Shashikant S; Arakeri, Jaywant H

    2016-08-01

    Frequency spectra obtained from the measurements of light intensity and angle of arrival (AOA) of parallel laser light propagating through the axially homogeneous, axisymmetric buoyancy-driven turbulent flow at high Rayleigh numbers in a long (length-to-diameter ratio of about 10) vertical tube are reported. The flow is driven by an unstable density difference created across the tube ends using brine and fresh water. The highest Rayleigh number is about 8 × 10⁹. The aim of the present work is to find whether the conventional Obukhov-Corrsin scaling or Bolgiano-Obukhov (BO) scaling is obtained for the intensity and AOA spectra in the case of light propagation in a buoyancy-driven turbulent medium. Theoretical relations for the frequency spectra of log amplitude and AOA fluctuations developed for homogeneous isotropic turbulent media are modified for the buoyancy-driven flow in the present case to obtain the asymptotic scalings for the high and low frequency ranges. For low frequencies, the spectra of intensity and vertical AOA fluctuations obtained from measurements follow BO scaling, while scaling for the spectra of horizontal AOA fluctuations shows a small departure from BO scaling.

  18. Homogeneous PCBM layers fabricated by horizontal-dip coating for efficient bilayer heterojunction organic photovoltaic cells.

    PubMed

    Huh, Yoon Ho; Bae, In-Gon; Jeon, Hong Goo; Park, Byoungchoo

    2016-10-31

    We herein report a homogeneous [6,6]-phenyl C61 butyric acid methyl ester (PCBM) layer, produced by a solution process of horizontal-dipping (H-dipping) to improve the photovoltaic (PV) effects of bilayer heterojunction organic photovoltaic cells (OPVs) based on a bi-stacked poly(3-hexylthiophene) (P3HT) electron donor layer and a PCBM electron acceptor layer (P3HT/PCBM). It was shown that a homogeneous and uniform coating of PCBM layers in the P3HT/PCBM bilayer OPVs resulted in reliable and reproducible device performance. We recorded a power conversion efficiency (PCE) of 2.89%, which is higher than that (2.00%) of bilayer OPVs with a spin-coated PCBM layer. Moreover, introducing surfactant additives of poly(oxyethylene tridecyl ether) (PTE) into the homogeneous P3HT/PCBM PV layers resulted in the bilayer OPVs showing a PCE value of 3.95%, which is comparable to those of conventional bulk-heterojunction (BHJ) OPVs (3.57-4.13%) fabricated by conventional spin-coating. This improved device performance may be attributed to the selective collection of charge carriers at the interfaces among the active layers and electrodes due to the PTE additives as well as the homogeneous formation of the functional PCBM layer on the P3HT layer. Furthermore, H-dip-coated PCBM layers were deposited onto aligned P3HT layers by a rubbing technique, and the rubbed bilayer OPV exhibited improved in-plane anisotropic PV effects with PCE anisotropy as high as 1.81, which is also higher than that (1.54) of conventional rubbed BHJ OPVs. Our results suggest that the use of the H-dip-coating process in the fabrication of PCBM layers with the PTE interface-engineering additive could be of considerable interest to those seeking to improve PCBM-based opto-electrical organic thin-film devices.

  19. Method and Apparatus for Creating a Topography at a Surface

    DOEpatents

    Adams, David P.; Sinclair, Michael B.; Mayer, Thomas M.; Vasile, Michael J.; Sweatt, William C.

    2008-11-11

    Methods and apparatus whereby an optical interferometer is utilized to monitor and provide feedback control to an integrated energetic particle column, to create desired topographies, including the depth, shape and/or roughness of features, at a surface of a specimen. Energetic particle columns can direct energetic species including, ions, photons and/or neutral particles to a surface to create features having in-plane dimensions on the order of 1 micron, and a height or depth on the order of 1 nanometer. Energetic processes can include subtractive processes such as sputtering, ablation, focused ion beam milling and, additive processes, such as energetic beam induced chemical vapor deposition. The integration of interferometric methods with processing by energetic species offers the ability to create desired topographies at surfaces, including planar and curved shapes.

  20. Homogeneity of lava flows - Chemical data for historic Mauna Loa eruptions

    NASA Technical Reports Server (NTRS)

    Rhodes, J. M.

    1983-01-01

    Chemical analyses of basalts collected from the major historic eruptions of Mauna Loa volcano show that many of the flow fields are remarkably homogeneous in composition. Despite their large size (lengths 9-85 km), large areal extents (13-114 sq km), and various durations of eruption (1-450 days), many of the flow fields have compositional variability that is within, or close to, the analytical error for most elements. The flow fields that are not homogeneous vary mainly in olivine content in an otherwise homogeneous melt. Some are composite flow fields made up of several, apparently homogeneous subunits erupted at different elevations along the active volcanic rifts. Not all volcanoes produce lavas that are homogeneous like those of Mauna Loa. If studies such as this are to be used to evaluate compositional diversity in lavas where there is a lack of sampling control, such as on other planets, it is necessary to understand why some flow units and flow fields are compositionally homogeneous and others are not, and to develop criteria for distinguishing between them.

  1. The Stratospheric Water and Ozone Satellite Homogenized (SWOOSH) database: a long-term database for climate studies

    PubMed Central

    Davis, Sean M.; Rosenlof, Karen H.; Hassler, Birgit; Hurst, Dale F.; Read, William G.; Vömel, Holger; Selkirk, Henry; Fujiwara, Masatomo; Damadeo, Robert

    2017-01-01

    In this paper, we describe the construction of the Stratospheric Water and Ozone Satellite Homogenized (SWOOSH) database, which includes vertically resolved ozone and water vapor data from a subset of the limb profiling satellite instruments operating since the 1980s. The primary SWOOSH products are zonal-mean monthly-mean time series of water vapor and ozone mixing ratio on pressure levels (12 levels per decade from 316 to 1 hPa). The SWOOSH pressure level products are provided on several independent zonal-mean grids (2.5, 5, and 10°), and additional products include two coarse 3-D griddings (30° long × 10° lat, 20° × 5°) as well as a zonal-mean isentropic product. SWOOSH includes both individual satellite source data as well as a merged data product. A key aspect of the merged product is that the source records are homogenized to account for inter-satellite biases and to minimize artificial jumps in the record. We describe the SWOOSH homogenization process, which involves adjusting the satellite data records to a “reference” satellite using coincident observations during time periods of instrument overlap. The reference satellite is chosen based on the best agreement with independent balloon-based sounding measurements, with the goal of producing a long-term data record that is both homogeneous (i.e., with minimal artificial jumps in time) and accurate (i.e., unbiased). This paper details the choice of reference measurements, homogenization, and gridding process involved in the construction of the combined SWOOSH product and also presents the ancillary information stored in SWOOSH that can be used in future studies of water vapor and ozone variability. Furthermore, a discussion of uncertainties in the combined SWOOSH record is presented, and examples of the SWOOSH record are provided to illustrate its use for studies of ozone and water vapor variability on interannual to decadal timescales. The version 2.5 SWOOSH data are publicly available at doi:10
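
    The core homogenization idea, adjusting a newer record to a reference instrument using their coincident observations, can be illustrated with a constant-offset correction as in the sketch below; the monthly series are synthetic, and the single mean offset is a deliberate simplification of the actual SWOOSH procedure.

```python
# Hedged sketch: merging two satellite records by removing the mean offset estimated
# over their overlap period; all series and the +0.4 ppmv bias are synthetic.
import numpy as np

rng = np.random.default_rng(7)
months = 240
truth = 4.5 + 0.3 * np.sin(2 * np.pi * np.arange(months) / 12)   # hypothetical water vapor (ppmv)

reference = truth + rng.normal(0, 0.05, months)
reference[120:] = np.nan                                         # reference record ends half-way
newer = truth + 0.4 + rng.normal(0, 0.05, months)                # newer instrument with a +0.4 ppmv bias
newer[:96] = np.nan                                              # newer instrument starts later

overlap = ~np.isnan(reference) & ~np.isnan(newer)                # coincident months
offset = np.nanmean(newer[overlap] - reference[overlap])
merged = np.where(np.isnan(reference), newer - offset, reference)
print(f"estimated inter-satellite offset: {offset:.2f} ppmv")
```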

  2. The Stratospheric Water and Ozone Satellite Homogenized (SWOOSH) database: a long-term database for climate studies.

    PubMed

    Davis, Sean M; Rosenlof, Karen H; Hassler, Birgit; Hurst, Dale F; Read, William G; Vömel, Holger; Selkirk, Henry; Fujiwara, Masatomo; Damadeo, Robert

    2016-01-01

    In this paper, we describe the construction of the Stratospheric Water and Ozone Satellite Homogenized (SWOOSH) database, which includes vertically resolved ozone and water vapor data from a subset of the limb profiling satellite instruments operating since the 1980s. The primary SWOOSH products are zonal-mean monthly-mean time series of water vapor and ozone mixing ratio on pressure levels (12 levels per decade from 316 to 1 hPa). The SWOOSH pressure level products are provided on several independent zonal-mean grids (2.5, 5, and 10°), and additional products include two coarse 3-D griddings (30° long × 10° lat, 20° × 5°) as well as a zonal-mean isentropic product. SWOOSH includes both individual satellite source data as well as a merged data product. A key aspect of the merged product is that the source records are homogenized to account for inter-satellite biases and to minimize artificial jumps in the record. We describe the SWOOSH homogenization process, which involves adjusting the satellite data records to a "reference" satellite using coincident observations during time periods of instrument overlap. The reference satellite is chosen based on the best agreement with independent balloon-based sounding measurements, with the goal of producing a long-term data record that is both homogeneous (i.e., with minimal artificial jumps in time) and accurate (i.e., unbiased). This paper details the choice of reference measurements, homogenization, and gridding process involved in the construction of the combined SWOOSH product and also presents the ancillary information stored in SWOOSH that can be used in future studies of water vapor and ozone variability. Furthermore, a discussion of uncertainties in the combined SWOOSH record is presented, and examples of the SWOOSH record are provided to illustrate its use for studies of ozone and water vapor variability on interannual to decadal timescales. The version 2.5 SWOOSH data are publicly available at doi:10

  3. Creating a Pilot Educational Psychiatry Website: Opportunities, Barriers, and Next Steps

    PubMed Central

    O'Connor, Ryan; Franzen, Jamie; Snow, Caitlin; Boland, Robert; Kitts, Robert

    2015-01-01

    Background While medical students and residents may be utilizing websites as online learning resources, medical trainees and educators now have the opportunity to create such educational websites and digital tools on their own. However, the process and theory of building educational websites for medical education have not yet been fully explored. Objective To understand the opportunities, barriers, and process of creating a novel medical educational website. Methods We created a pilot psychiatric educational website to better understand the options, opportunities, challenges, and processes involved in the creation of a psychiatric educational website. We sought to integrate visual and interactive Web design elements to underscore the potential of such Web technology. Results A pilot website (PsychOnCall) was created to demonstrate the potential of Web technology in medical and psychiatric education. Conclusions Creating an educational website is now technically easier than ever before, and the primary challenge no longer is technology but rather the creation, validation, and maintenance of information for such websites as well as translating text-based didactics into visual and interactive tools. Medical educators can influence the design and implementation of online educational resources through creating their own websites and engaging medical students and residents in the process. PMID:27731837

  4. Stochastic transport in the presence of spatial disorder: Fluctuation-induced corrections to homogenization

    NASA Astrophysics Data System (ADS)

    Russell, Matthew J.; Jensen, Oliver E.; Galla, Tobias

    2016-10-01

    Motivated by uncertainty quantification in natural transport systems, we investigate an individual-based transport process involving particles undergoing a random walk along a line of point sinks whose strengths are themselves independent random variables. We assume particles are removed from the system via first-order kinetics. We analyze the system using a hierarchy of approaches when the sinks are sparsely distributed, including a stochastic homogenization approximation that yields explicit predictions for the extrinsic disorder in the stationary state due to sink strength fluctuations. The extrinsic noise induces long-range spatial correlations in the particle concentration, unlike fluctuations due to the intrinsic noise alone. Additionally, the mean concentration profile, averaged over both intrinsic and extrinsic noise, is elevated compared with the corresponding profile from a uniform sink distribution, showing that the classical homogenization approximation can be a biased estimator of the true mean.
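
    The bias of the classical homogenization approximation mentioned in the last sentence can be checked numerically: solve the steady transport problem for many realizations of random sink strengths, average the resulting profiles, and compare with the profile obtained from a single uniform (homogenized) sink strength. The sketch below makes strong simplifying assumptions (deterministic steady 1-D diffusion standing in for the mean of the individual-based model, a fixed-concentration source at one end, first-order point sinks on a regular lattice, exponentially distributed strengths); it is not the authors' model, and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def steady_profile(sink_strengths, n=400, L=1.0, D=1.0, spacing=20):
    """Steady concentration on [0, L] with diffusion, a fixed source at
    x = 0 (c = 1), no flux at x = L, and first-order point sinks at every
    `spacing`-th node.  Deterministic stand-in for the mean of the
    individual-based model, for one realization of the sink strengths."""
    dx = L / (n - 1)
    A = np.zeros((n, n))
    b = np.zeros(n)
    sink_nodes = np.arange(spacing, n - 1, spacing)
    s = np.zeros(n)
    s[sink_nodes] = sink_strengths[: len(sink_nodes)] / dx   # point sinks
    # interior nodes: D (c[i-1] - 2 c[i] + c[i+1]) / dx^2 - s_i c_i = 0
    for i in range(1, n - 1):
        A[i, i - 1] = D / dx**2
        A[i, i + 1] = D / dx**2
        A[i, i] = -2 * D / dx**2 - s[i]
    A[0, 0] = 1.0; b[0] = 1.0              # c(0) = 1 (source boundary)
    A[-1, -1] = 1.0; A[-1, -2] = -1.0      # no flux at x = L
    return np.linalg.solve(A, b)

mean_strength, n_sinks, n_real = 5.0, 19, 200
# Disordered sinks: iid strengths with the same mean as the uniform case.
profiles = [steady_profile(rng.exponential(mean_strength, n_sinks))
            for _ in range(n_real)]
uniform = steady_profile(np.full(n_sinks, mean_strength))
# Difference between the disorder-averaged profile and the homogenized one.
bias = np.mean(profiles, axis=0) - uniform
```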

  5. Homogenized description and retrieval method of nonlinear metasurfaces

    NASA Astrophysics Data System (ADS)

    Liu, Xiaojun; Larouche, Stéphane; Smith, David R.

    2018-03-01

    A patterned, plasmonic metasurface can strongly scatter incident light, functioning as an extremely low-profile lens, filter, reflector or other optical device. When the metasurface is patterned uniformly, its linear optical properties can be expressed using effective surface electric and magnetic polarizabilities obtained through a homogenization procedure. The homogenized description of a nonlinear metasurface, however, presents challenges both because of the inherent anisotropy of the medium as well as the much larger set of potential wave interactions available, making it challenging to assign effective nonlinear parameters to the otherwise inhomogeneous layer of metamaterial elements. Here we show that a homogenization procedure can be developed to describe nonlinear metasurfaces, which derive their nonlinear response from the enhanced local fields arising within the structured plasmonic elements. With the proposed homogenization procedure, we are able to assign effective nonlinear surface polarization densities to a nonlinear metasurface, and link these densities to the effective nonlinear surface susceptibilities and averaged macroscopic pumping fields across the metasurface. These effective nonlinear surface polarization densities are further linked to macroscopic nonlinear fields through the generalized sheet transition conditions (GSTCs). By inverting the GSTCs, the effective nonlinear surface susceptibilities of the metasurfaces can be solved for, leading to a generalized retrieval method for nonlinear metasurfaces. The application of the homogenization procedure and the GSTCs are demonstrated by retrieving the nonlinear susceptibilities of a SiO2 nonlinear slab. As an example, we investigate a nonlinear metasurface which presents nonlinear magnetoelectric coupling in near infrared regime. The method is expected to apply to any patterned metasurface whose thickness is much smaller than the wavelengths of operation, with inclusions of arbitrary geometry

  6. Creating relationships with persons with moderate to severe dementia

    PubMed Central

    Kjellström, Sofia; Hellström, Ingrid

    2013-01-01

    The study describes how relationships are created with persons with moderate to severe dementia. The material comprises 24 video sequences of Relational Time (RT) sessions, 24 interviews with persons with dementia and eight interviews with professional caregivers. The study method was Constructivist Grounded Theory. The categories of ‘Assigning time’, ‘Establishing security and trust’ and ‘Communicating equality’ were strategies for arriving at the core category, ‘Opening up’, which was the process that led to creating relationships. Both parties had to contribute to create a relationship; the professional caregiver controlled the process, but the person with dementia permitted the caregiver's overtures and opened up, thus making the relationship possible. Interpersonal relationships are significant to enhancing the well-being of persons with dementia. Small measures like RT that do not require major resources can open paths to creating relationships. PMID:24336663

  7. Large-area homogeneous periodic surface structures generated on the surface of sputtered boron carbide thin films by femtosecond laser processing

    NASA Astrophysics Data System (ADS)

    Serra, R.; Oliveira, V.; Oliveira, J. C.; Kubart, T.; Vilar, R.; Cavaleiro, A.

    2015-03-01

    Amorphous and crystalline sputtered boron carbide thin films have a very high hardness even surpassing that of bulk crystalline boron carbide (≈41 GPa). However, magnetron sputtered B-C films have high friction coefficients (C.o.F) which limit their industrial application. Nanopatterning of materials surfaces has been proposed as a solution to decrease the C.o.F. The contact area of the nanopatterned surfaces is decreased due to the nanometre size of the asperities which results in a significant reduction of adhesion and friction. In the present work, the surface of amorphous and polycrystalline B-C thin films deposited by magnetron sputtering was nanopatterned using infrared femtosecond laser radiation. Successive parallel laser tracks 10 μm apart were overlapped in order to obtain a processed area of about 3 mm2. Sinusoidal-like undulations with the same spatial period as the laser tracks were formed on the surface of the amorphous boron carbide films after laser processing. The undulations amplitude increases with increasing laser fluence. The formation of undulations with a 10 μm period was also observed on the surface of the crystalline boron carbide film processed with a pulse energy of 72 μJ. The amplitude of the undulations is about 10 times higher than in the amorphous films processed at the same pulse energy due to the higher roughness of the films and consequent increase in laser radiation absorption. LIPSS formation on the surface of the films was achieved for the three B-C films under study. However, LIPSS are formed under different circumstances. Processing of the amorphous films at low fluence (72 μJ) results in LIPSS formation only on localized spots on the film surface. LIPSS formation was also observed on the top of the undulations formed after laser processing with 78 μJ of the amorphous film deposited at 800 °C. Finally, large-area homogeneous LIPSS coverage of the boron carbide crystalline films surface was achieved within a large range

  8. Homogenization-assisted cavitation hybrid rotation extraction and macroporous resin enrichment of dihydroquercetin from Larix gmelinii.

    PubMed

    Xia, Yu; Wang, Yinhang; Li, Wei; Ma, Chunhui; Liu, Shouxin

    2017-12-01

    Cavitation hybrid rotation exploits cavitation, which was and still is often regarded as an unavoidable nuisance in flow systems, for extraction process intensification of active chemical compounds from natural products. In this study, a homogenization-assisted cavitation hybrid rotation extraction method was applied to extract dihydroquercetin (DHQ) from larch (Larix gmelinii) wood root. The extraction parameters were optimized in single-factor experiments with the DHQ extraction yields as the response values. The optimum conditions were as follows: number of extractions, three; ethanol volume fraction for the extraction, 60%; liquid-solid ratio for homogenization, 10 mL/g; homogenization time, 8 min; liquid-solid ratio for cavitation extraction, 9 mL/g; and cavitation extraction time, 35 min. Under these conditions, the DHQ content in the extract was 4.50 ± 0.02 mg/g, and the extraction efficiency was higher than that of traditional techniques. Cavitation can be effectively used to improve the extraction rate by increasing mass transfer rates and by possible rupture of the cell wall due to the formation of microcavities, leading to higher product yields with reduced processing time and solvent consumption. After the extraction process, macroporous resin column chromatography was used to concentrate and purify the DHQ. Three resins were selected from fifteen macroporous resins for further investigation of their performance. Among these resins, AB-8 resin exhibited relatively better adsorption capacities and desorption ratios for DHQ. The ethanol volume fraction of the solutions for sample loading and desorption, and the flow rates for loading and desorption, were optimized for the macroporous resin column chromatography. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Alterations in regional homogeneity of resting-state cerebral activity in patients with chronic prostatitis/chronic pelvic pain syndrome.

    PubMed

    Lin, Yusong; Bai, Yan; Liu, Peng; Yang, Xuejuan; Qin, Wei; Gu, Jianqin; Ding, Degang; Tian, Jie; Wang, Meiyun

    2017-01-01

    The purpose of this study was to explore the neural mechanism in chronic prostatitis/chronic pelvic pain syndrome (CP/CPPS) using resting-state functional magnetic resonance imaging. Functional magnetic resonance imaging was performed on 31 male CP/CPPS patients and 31 age- and education-matched male healthy controls on a 3-T magnetic resonance imaging unit. A two-sample t-test was adopted to compare regional homogeneity between the patients and healthy controls. The mean regional homogeneity values in the altered brain regions of patients were correlated with the clinical measurements using Pearson's correlation analyses. The CP/CPPS patients had significantly decreased regional homogeneity in the bilateral anterior cingulate cortices, insular cortices and right medial prefrontal cortex, and significantly increased regional homogeneity in the brainstem and right thalamus compared with the healthy controls. In the CP/CPPS patients, the mean regional homogeneity values in the left anterior cingulate cortex, bilateral insular cortices and brainstem were respectively correlated with the National Institutes of Health Chronic Prostatitis Symptom Index total score and pain subscale. These brain regions are important in the pain modulation process. Therefore, an impaired pain modulatory system, either through decreased descending pain inhibition or enhanced pain facilitation, may explain the pain symptoms in CP/CPPS.

  10. Homogeneous nucleation and microstructure evolution in million-atom molecular dynamics simulation

    PubMed Central

    Shibuta, Yasushi; Oguchi, Kanae; Takaki, Tomohiro; Ohno, Munekazu

    2015-01-01

    Homogeneous nucleation from an undercooled iron melt is investigated by the statistical sampling of million-atom molecular dynamics (MD) simulations performed on a graphics processing unit (GPU). Fifty independent instances of isothermal MD calculations with one million atoms in a quasi-two-dimensional cell over a nanosecond reveal that the nucleation rate and the incubation time of nucleation as functions of temperature have characteristic shapes with a nose at the critical temperature. This indicates that thermally activated homogeneous nucleation occurs spontaneously in MD simulations without any inducing factor, whereas most previous studies have employed factors such as pressure, surface effect, and continuous cooling to induce nucleation. Moreover, further calculations over ten nanoseconds capture the microstructure evolution on the order of tens of nanometers from the atomistic viewpoint and the grain growth exponent is directly estimated. Our novel approach based on the concept of “melting pots in a supercomputer” is opening a new phase in computational metallurgy with the aid of rapid advances in computational environments. PMID:26311304

  11. The single-zone numerical model of homogeneous charge compression ignition engine performance

    NASA Astrophysics Data System (ADS)

    Fedyanov, E. A.; Itkis, E. M.; Kuzmin, V. N.; Shumskiy, S. N.

    2017-02-01

    A single-zone model of methane-air mixture combustion in a Homogeneous Charge Compression Ignition (HCCI) engine was developed. First modeling efforts resulted in the selection of the detailed kinetic reaction mechanism most appropriate for the conditions of the HCCI process. The model was then extended to simulate the performance of the four-stroke engine and was coupled with physically reasonable adjusting functions. Validation of the calculations against experimental data showed acceptable agreement.
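
    For orientation, a single-zone formulation can be sketched as two ODEs, one for the unburned mass fraction and one for temperature, integrated while the cylinder volume follows a slider-crank law. The sketch below deliberately replaces the detailed methane mechanism used in the paper with a one-step global Arrhenius reaction, assumes adiabatic walls and constant gas properties, and uses illustrative numbers throughout; it shows the structure of such a model, not the authors' implementation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Geometry and operating point (illustrative values, not those of the paper).
CR, RL = 16.0, 3.5            # compression ratio, conrod/crank radius ratio
VC = 5.0e-5                   # clearance volume, m^3
RPM = 1500.0
OMEGA = RPM * 2 * np.pi / 60  # crank speed, rad/s

# Gas and "fuel" properties for a one-step global surrogate reaction.
R_SPEC, CV = 287.0, 820.0     # J/(kg K)
Q_FUEL = 2.7e6                # heat release per kg of mixture burned, J/kg
A_PRE, EA = 2.0e11, 1.6e5     # Arrhenius pre-factor (1/s), activation energy (J/mol)
RU = 8.314

def volume(theta):
    # Slider-crank cylinder volume as a function of crank angle.
    return VC * (1 + 0.5 * (CR - 1) *
                 (RL + 1 - np.cos(theta) - np.sqrt(RL**2 - np.sin(theta)**2)))

def dvol_dtheta(theta):
    s = np.sin(theta)
    return VC * 0.5 * (CR - 1) * (s + s * np.cos(theta) / np.sqrt(RL**2 - s**2))

def rhs(t, y, m):
    Y, T = y                          # unburned mass fraction, temperature (K)
    theta = -np.pi + OMEGA * t        # start at bottom dead centre
    V = volume(theta)
    p = m * R_SPEC * T / V            # ideal-gas pressure
    w = A_PRE * np.exp(-EA / (RU * T)) * max(Y, 0.0)   # global reaction rate, 1/s
    dYdt = -w
    # First law for a closed, adiabatic, homogeneous charge.
    dTdt = (Q_FUEL * w - p * dvol_dtheta(theta) * OMEGA / m) / CV
    return [dYdt, dTdt]

T0, p0 = 400.0, 1.0e5
m = p0 * volume(-np.pi) / (R_SPEC * T0)        # trapped mass at BDC
t_end = 2 * np.pi / OMEGA                      # one revolution: compression + expansion
sol = solve_ivp(rhs, (0.0, t_end), [1.0, T0], args=(m,),
                method="LSODA", rtol=1e-8, max_step=1e-5)
```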

  12. Heterogeneous vs. Homogeneous Groups: Methodology for Class Instruction for Post-Secondary Business Education Courses

    ERIC Educational Resources Information Center

    Davis, Theodore E., Jr.

    2012-01-01

    The primary purpose of this study is to investigate the influence of racial and gender diversity on group process and problem solving in an academic setting. The importance of this dynamic is its value in preparing students for the workplace. The supposition is that if a group is homogeneous, commitment and performance are improved. Data for the study…

  13. Creating customer value by streamlining business processes.

    PubMed

    Vantrappen, H

    1992-02-01

    Much of the strategic preoccupation of senior managers in the 1990s is focusing on the creation of customer value. Companies are seeking competitive advantage by streamlining the three processes through which they interact with their customers: product creation, order handling and service assurance. 'Micro-strategy' is a term which has been coined for the trade-offs and decisions on where and how to streamline these three processes. The article discusses micro-strategies applied by successful companies.

  14. Matrix algorithms for solving (in)homogeneous bound state equations

    PubMed Central

    Blank, M.; Krassnigg, A.

    2011-01-01

    In the functional approach to quantum chromodynamics, the properties of hadronic bound states are accessible via covariant integral equations, e.g. the Bethe–Salpeter equation for mesons. In particular, one has to deal with linear, homogeneous integral equations which, in sophisticated model setups, use numerical representations of the solutions of other integral equations as part of their input. Analogously, inhomogeneous equations can be constructed to obtain off-shell information in addition to bound-state masses and other properties obtained from the covariant analogue to a wave function of the bound state. These can be solved very efficiently using well-known matrix algorithms for eigenvalues (in the homogeneous case) and the solution of linear systems (in the inhomogeneous case). We demonstrate this by solving the homogeneous and inhomogeneous Bethe–Salpeter equations and find, e.g. that for the calculation of the mass spectrum it is as efficient or even advantageous to use the inhomogeneous equation as compared to the homogeneous. This is valuable insight, in particular for the study of baryons in a three-quark setup and more involved systems. PMID:21760640
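
    The algebraic structure the authors exploit can be shown on a generic Fredholm equation of the second kind: once the integral is replaced by a quadrature rule, the homogeneous equation becomes a matrix eigenvalue problem and the inhomogeneous one a linear system. The sketch below uses a toy exponential kernel on [0, 1] rather than a Bethe-Salpeter kernel, so it only illustrates the discretization and the two solver calls.

```python
import numpy as np

n = 200
x, w = np.polynomial.legendre.leggauss(n)          # Gauss-Legendre nodes/weights on [-1, 1]
x = 0.5 * (x + 1.0); w = 0.5 * w                   # map to [0, 1]
K = np.exp(-np.abs(x[:, None] - x[None, :]))       # toy kernel K(x, y)
M = K * w[None, :]                                 # quadrature turns the integral into a matrix

# Homogeneous equation  phi(x) = lam * int K(x, y) phi(y) dy  ->  eigenvalue problem.
evals, evecs = np.linalg.eig(M)
lam = 1.0 / evals.real.max()                       # coupling at which a solution exists

# Inhomogeneous equation  phi(x) = f(x) + lam0 * int K(x, y) phi(y) dy  ->  linear system.
f = np.sin(np.pi * x)
lam0 = 0.5 * lam                                   # stay away from the eigenvalue
phi = np.linalg.solve(np.eye(n) - lam0 * M, f)
```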

  15. Generation of phase II in vitro metabolites using homogenized horse liver.

    PubMed

    Wong, Jenny K Y; Chan, George H M; Leung, David K K; Tang, Francis P W; Wan, Terence S M

    2016-02-01

    The successful use of homogenized horse liver for the generation of phase I in vitro metabolites has been previously reported by the authors' laboratory. Prior to the use of homogenized liver, the authors' laboratory had been using mainly horse liver microsomes for carrying out equine in vitro metabolism studies. Homogenized horse liver has shown significant advantages over liver microsomes for in vitro metabolism studies as the procedures are much quicker and have higher capability for generating more in vitro metabolites. In this study, the use of homogenized liver has been extended to the generation of phase II in vitro metabolites (glucuronide and/or sulfate conjugates) using 17β-estradiol, morphine, and boldenone undecylenate as model substrates. It was observed that phase II metabolites could also be generated even without the addition of cofactors. To the authors' knowledge, this is the first report of the successful use of homogenized horse liver for the generation of phase II metabolites. It also demonstrates the ease with which both phase I and phase II metabolites can now be generated in vitro simply by using homogenized liver without the need for ultracentrifuges or tedious preparation steps. Copyright © 2015 John Wiley & Sons, Ltd.

  16. How does creating a concept map affect item-specific encoding?

    PubMed

    Grimaldi, Phillip J; Poston, Laurel; Karpicke, Jeffrey D

    2015-07-01

    Concept mapping has become a popular learning tool. However, the processes underlying the task are poorly understood. In the present study, we examined the effect of creating a concept map on the processing of item-specific information. In 2 experiments, subjects learned categorized or ad hoc word lists by making pleasantness ratings, sorting words into categories, or creating a concept map. Memory was tested using a free recall test and a recognition memory test, which is considered to be especially sensitive to item-specific processing. Typically, tasks that promote item-specific processing enhance free recall of categorized lists, relative to category sorting. Concept mapping resulted in lower recall performance than both the pleasantness rating and category sorting condition for categorized words. Moreover, concept mapping resulted in lower recognition memory performance than the other 2 tasks. These results converge on the conclusion that creating a concept map disrupts the processing of item-specific information. (c) 2015 APA, all rights reserved.

  17. Automatic Feature Selection and Weighting for the Formation of Homogeneous Groups for Regional Intensity-Duration-Frequency (IDF) Curve Estimation

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Burn, D. H.

    2017-12-01

    Extreme rainfall events can have devastating impacts on society. To quantify the associated risk, the IDF curve has been used to provide the essential rainfall-related information for urban planning. However, the recent changes in the rainfall climatology caused by climate change and urbanization have made the estimates provided by the traditional regional IDF approach increasingly inaccurate. This inaccuracy is mainly caused by two problems: 1) The ineffective choice of similarity indicators for the formation of a homogeneous group at different regions; and 2) An inadequate number of stations in the pooling group that does not adequately reflect the optimal balance between group size and group homogeneity or achieve the lowest uncertainty in the rainfall quantiles estimates. For the first issue, to consider the temporal difference among different meteorological and topographic indicators, a three-layer design is proposed based on three stages in the extreme rainfall formation: cloud formation, rainfall generation and change of rainfall intensity above urban surface. During the process, the impacts from climate change and urbanization are considered through the inclusion of potential relevant features at each layer. Then to consider spatial difference of similarity indicators for the homogeneous group formation at various regions, an automatic feature selection and weighting algorithm, specifically the hybrid searching algorithm of Tabu search, Lagrange Multiplier and Fuzzy C-means Clustering, is used to select the optimal combination of features for the potential optimal homogenous groups formation at a specific region. For the second issue, to compare the uncertainty of rainfall quantile estimates among potential groups, the two sample Kolmogorov-Smirnov test-based sample ranking process is used. During the process, linear programming is used to rank these groups based on the confidence intervals of the quantile estimates. The proposed methodology fills the gap
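
    One ingredient of such a procedure, comparing a target station's rescaled extreme-rainfall distribution with candidate neighbours via a two-sample Kolmogorov-Smirnov test, can be sketched as below. This is a simplified stand-in, not the authors' hybrid Tabu search / Lagrange multiplier / fuzzy C-means algorithm; the station series, the index-flood style rescaling and the 0.10 screening threshold are illustrative assumptions.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)

# Hypothetical annual-maximum rainfall series for a target station and
# candidate neighbours (synthetic values, for illustration only).
target = rng.gumbel(loc=40.0, scale=10.0, size=40)
candidates = {
    "A": rng.gumbel(loc=42.0, scale=10.0, size=35),
    "B": rng.gumbel(loc=41.0, scale=11.0, size=50),
    "C": rng.gumbel(loc=70.0, scale=18.0, size=45),   # deliberately dissimilar
}

def scaled(x):
    # Index-flood style rescaling so stations with different mean intensities
    # can be compared on a common dimensionless scale.
    return x / np.mean(x)

# Screen candidates: keep those whose rescaled distribution is statistically
# indistinguishable from the target's (two-sample Kolmogorov-Smirnov test).
group = ["target"]
for name, series in candidates.items():
    stat, pval = ks_2samp(scaled(target), scaled(series))
    if pval > 0.10:           # arbitrary screening threshold for this sketch
        group.append(name)
print("pooling group:", group)
```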

  18. Effect of homogenization on the properties and microstructure of Mozzarella cheese from buffalo milk.

    PubMed

    Abd El-Gawad, Mona A M; Ahmed, Nawal S; El-Abd, M M; Abd El-Rafee, S

    2012-04-02

    The name pasta filata refers to a unique plasticizing and texturing treatment of the fresh curd in hot water that imparts to the finished cheese its characteristic fibrous structure and melting properties. Mozzarella cheese was made from standardized homogenized and non-homogenized buffalo milk with 3 and 1.5% fat. The effect of homogenization on rheological properties, microstructure and sensory evaluation was investigated. Fresh raw buffalo milk and starter cultures of Streptococcus thermophilus and Lactobacillus delbrueckii ssp. bulgaricus were used. The coagulant was calf rennet powder (HA-LA). Standardized buffalo milk was homogenized at 25 kg/cm² pressure after heating to 60°C using a homogenizer. Milk and cheese were analysed. The microstructure of the cheese samples was investigated with either transmission or scanning electron microscopy. Statistical analyses were applied to the obtained data. Soluble nitrogen, total volatile free fatty acids, soluble tyrosine and tryptophan increased with the use of homogenized milk, although the increase was relatively smaller for homogenized Mozzarella cheese. Meltability of Mozzarella cheese increased with increasing fat content and storage period and decreased with homogenization. Mozzarella cheese firmness increased with homogenization and also increased as the storage period progressed. Flavour score, appearance and total score of Mozzarella cheese increased with homogenization and storage period progress, while body and texture score decreased with homogenization and increased with storage period progress. Microstructure of Mozzarella cheese showed that the low-fat cheese tends to be harder, more crumbly and less smooth than normal. Curd granule junctions were prominent in non-homogenized milk cheese. Homogenization of the cheese milk caused changes in the microstructure of the Mozzarella cheese. Microstructure studies of cheese revealed that cheese made from homogenized milk is smoother and has a finer texture than

  19. Creating homogenous strain distribution within 3D cell-encapsulated constructs using a simple and cost-effective uniaxial tensile bioreactor: Design and validation study.

    PubMed

    Subramanian, Gayathri; Elsaadany, Mostafa; Bialorucki, Callan; Yildirim-Ayan, Eda

    2017-08-01

    Mechanical loading bioreactors capable of applying uniaxial tensile strains are emerging to be a valuable tool to investigate physiologically relevant cellular signaling pathways and biochemical expression. In this study, we have introduced a simple and cost-effective uniaxial tensile strain bioreactor for the application of precise and homogenous uniaxial strains to 3D cell-encapsulated collagen constructs at physiological loading strains (0-12%) and frequencies (0.01-1 Hz). The bioreactor employs silicone-based loading chambers specifically designed to stretch constructs without direct gripping to minimize stress concentration at the ends of the construct and preserve its integrity. The loading chambers are driven by a versatile stepper motor ball-screw actuation system to produce stretching of the constructs. Mechanical characterization of the bioreactor performed through Finite Element Analysis demonstrated that the constructs experienced predominantly uniaxial tensile strain in the longitudinal direction. The strains produced were found to be homogenous over a 15 × 4 × 2 mm region of the construct equivalent to around 60% of the effective region of characterization. The strain values were also shown to be consistent and reproducible during cyclic loading regimes. Biological characterization confirmed the ability of the bioreactor to promote cell viability, proliferation, and matrix organization of cell-encapsulated collagen constructs. This easy-to-use uniaxial tensile strain bioreactor can be employed for studying morphological, structural, and functional responses of cell-embedded matrix systems in response to physiological loading of musculoskeletal tissues. It also holds promise for tissue-engineered strategies that involve delivery of mechanically stimulated cells at the site of injury through a biological carrier to develop a clinically useful therapy for tissue healing. Biotechnol. Bioeng. 2017;114: 1878-1887. © 2017 Wiley Periodicals

  20. Superfluid transition of homogeneous and trapped two-dimensional Bose gases.

    PubMed

    Holzmann, Markus; Baym, Gordon; Blaizot, Jean-Paul; Laloë, Franck

    2007-01-30

    Current experiments on atomic gases in highly anisotropic traps present the opportunity to study in detail the low temperature phases of two-dimensional inhomogeneous systems. Although, in an ideal gas, the trapping potential favors Bose-Einstein condensation at finite temperature, interactions tend to destabilize the condensate, leading to a superfluid Kosterlitz-Thouless-Berezinskii phase with a finite superfluid mass density but no long-range order, as in homogeneous fluids. The transition in homogeneous systems is conveniently described in terms of dissociation of topological defects (vortex-antivortex pairs). However, trapped two-dimensional gases are more directly approached by generalizing the microscopic theory of the homogeneous gas. In this paper, we first derive, via a diagrammatic expansion, the scaling structure near the phase transition in a homogeneous system, and then study the effects of a trapping potential in the local density approximation. We find that a weakly interacting trapped gas undergoes a Kosterlitz-Thouless-Berezinskii transition from the normal state at a temperature slightly below the Bose-Einstein transition temperature of the ideal gas. The characteristic finite superfluid mass density of a homogeneous system just below the transition becomes strongly suppressed in a trapped gas.

  1. Catalytic photodegradation of pharmaceuticals - homogeneous and heterogeneous photocatalysis.

    PubMed

    Klementova, S; Kahoun, D; Doubkova, L; Frejlachova, K; Dusakova, M; Zlamal, M

    2017-01-18

    Photocatalytic degradation of pharmaceuticals (hydrocortisone, estradiol, and verapamil) and personal care product additives (parabens: methyl, ethyl, and propyl derivatives) was investigated in the homogeneous phase (with ferric ions as the catalyst) and on TiO2. Ferric ions in concentrations corresponding to concentrations in natural water bodies were shown to be a significant accelerator of the degradation in homogeneous reaction mixtures. In heterogeneous photocatalytic reactions on TiO2, lower reaction rates, but mineralisation to higher extents, were observed.

  2. A comparison of maximal bioenergetic enzyme activities obtained with commonly used homogenization techniques.

    PubMed

    Grace, M; Fletcher, L; Powers, S K; Hughes, M; Coombes, J

    1996-12-01

    Homogenization of tissue for analysis of bioenergetic enzyme activities is a common practice in studies examining metabolic properties of skeletal muscle adaptation to disease, aging, inactivity or exercise. While numerous homogenization techniques are in use today, limited information exists concerning the efficacy of specific homogenization protocols. Therefore, the purpose of this study was to compare the efficacy of four commonly used approaches to homogenizing skeletal muscle for analysis of bioenergetic enzyme activity. The maximal enzyme activity (Vmax) of citrate synthase (CS) and lactate dehydrogenase (LDH) were measured from homogenous muscle samples (N = 48 per homogenization technique) and used as indicators to determine which protocol had the highest efficacy. The homogenization techniques were: (1) glass-on-glass pestle; (2) a combination of a mechanical blender and a teflon pestle (Potter-Elvehjem); (3) a combination of the mechanical blender and a biological detergent; and (4) the combined use of a mechanical blender and a sonicator. The glass-on-glass pestle homogenization protocol produced significantly higher (P < 0.05) enzyme activities compared to all other protocols for both enzymes. Of the four protocols examined, the data demonstrate that the glass-on-glass pestle homogenization protocol is the technique of choice for studying bioenergetic enzyme activity in skeletal muscle.

  3. Using a critical reflection process to create an effective learning community in the workplace.

    PubMed

    Walker, Rachel; Cooke, Marie; Henderson, Amanda; Creedy, Debra K

    2013-05-01

    Learning circles are an enabling process to critically examine and reflect on practices with the purpose of promoting individual and organizational growth and change. The authors adapted and developed a learning circle strategy to facilitate open discourse between registered nurses, clinical leaders, clinical facilitators and students, to critically reflect on practice experiences to promote a positive learning environment. This paper reports on an analysis of field notes taken during a critical reflection process used to create an effective learning community in the workplace. A total of 19 learning circles were conducted during in-service periods (that is, the time allocated for professional education between morning and afternoon shifts) over a 3 month period with 56 nurses, 33 students and 1 university-employed clinical supervisor. Participation rates ranged from 3 to 12 individuals per discussion. Ten themes emerged from content analysis of the clinical learning issues identified through the four-step model of critical reflection used in learning circle discussions. The four-step model of critical reflection allowed participants to reflect on clinical learning issues, and raise them in a safe environment that enabled topics to be challenged and explored in a shared and cooperative manner. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Five Important Lessons I Learned during the Process of Creating New Child Care Centers

    ERIC Educational Resources Information Center

    Whitehead, R. Ann

    2005-01-01

    In this article, the author describes her experiences of developing new child care sites and offers five important lessons that she learned through her experiences which helped her to create successful child care centers. These lessons include: (1) Finding an appropriate area and location; (2) Creating realistic financial projections based on real…

  5. TESTING HOMOGENEITY WITH GALAXY STAR FORMATION HISTORIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoyle, Ben; Jimenez, Raul; Tojeiro, Rita

    2013-01-01

    Observationally confirming spatial homogeneity on sufficiently large cosmological scales is of importance to test one of the underpinning assumptions of cosmology, and is also imperative for correctly interpreting dark energy. A challenging aspect of this is that homogeneity must be probed inside our past light cone, while observations take place on the light cone. The star formation history (SFH) in the galaxy fossil record provides a novel way to do this. We calculate the SFH of stacked luminous red galaxy (LRG) spectra obtained from the Sloan Digital Sky Survey. We divide the LRG sample into 12 equal-area contiguous sky patches and 10 redshift slices (0.2 < z < 0.5), which correspond to 120 blocks of volume ≈0.04 Gpc³. Using the SFH in a time period that samples the history of the universe between look-back times 11.5 and 13.4 Gyr as a proxy for homogeneity, we calculate the posterior distribution for the excess large-scale variance due to inhomogeneity, and find that the most likely solution is no extra variance at all. At 95% credibility, there is no evidence of deviations larger than 5.8%.

  6. Homogeneous Biosensing Based on Magnetic Particle Labels

    PubMed Central

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J.; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation. PMID:27275824

  7. Creating speech-synchronized animation.

    PubMed

    King, Scott A; Parent, Richard E

    2005-01-01

    We present a facial model designed primarily to support animated speech. Our facial model takes facial geometry as input and transforms it into a parametric deformable model. The facial model uses a muscle-based parameterization, allowing for easier integration between speech synchrony and facial expressions. Our facial model has a highly deformable lip model that is grafted onto the input facial geometry to provide the necessary geometric complexity needed for creating lip shapes and high-quality renderings. Our facial model also includes a highly deformable tongue model that can represent the shapes the tongue undergoes during speech. We add teeth, gums, and upper palate geometry to complete the inner mouth. To decrease the processing time, we hierarchically deform the facial surface. We also present a method to animate the facial model over time to create animated speech using a model of coarticulation that blends visemes together using dominance functions. We treat visemes as a dynamic shaping of the vocal tract by describing visemes as curves instead of keyframes. We show the utility of the techniques described in this paper by implementing them in a text-to-audiovisual-speech system that creates animation of speech from unrestricted text. The facial and coarticulation models must first be interactively initialized. The system then automatically creates accurate real-time animated speech from the input text. It is capable of cheaply producing tremendous amounts of animated speech with very low resource requirements.
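
    A dominance-function blend of the kind mentioned above can be sketched in a few lines: each viseme contributes a target value for an articulatory parameter, weighted by a dominance curve centred on its peak time, and the animated track is the normalized weighted average. The sketch below uses simple exponential dominance functions and scalar targets (closer to the classic keyframe formulation than to the curve-based visemes the paper describes); all names and values are hypothetical.

```python
import numpy as np

def dominance(t, t_peak, alpha=1.0, theta=8.0, c=1.0):
    """Cohen-Massaro style dominance function: how strongly a viseme
    attracts an articulatory parameter at time t (seconds)."""
    return alpha * np.exp(-theta * np.abs(t - t_peak) ** c)

def blend(t, visemes):
    """Normalized weighted average of viseme target values; `visemes` is a
    list of (target_value, peak_time) pairs for one articulatory
    parameter, e.g. lip opening."""
    d = np.array([dominance(t, tp) for _, tp in visemes])     # (n_visemes, n_t)
    targets = np.array([v for v, _ in visemes])[:, None]
    return (d * targets).sum(axis=0) / d.sum(axis=0)

# Hypothetical lip-opening targets for three consecutive visemes.
t = np.linspace(0.0, 0.6, 200)
track = blend(t, [(0.2, 0.1), (0.9, 0.3), (0.4, 0.5)])
```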

  8. Synthesis of focused beam with controllable arbitrary homogeneous polarization using engineered vectorial optical fields.

    PubMed

    Rui, Guanghao; Chen, Jian; Wang, Xiaoyan; Gu, Bing; Cui, Yiping; Zhan, Qiwen

    2016-10-17

    The propagation and focusing properties of light beams remain of research interest owing to their promising applications in physics, chemistry and biological sciences. One of the main challenges to these applications is the control of the polarization distribution within the focal volume. In this work, we propose and experimentally demonstrate a method for generating a focused beam with arbitrary homogeneous polarization at any transverse plane. The required input field at the pupil plane of a high numerical aperture objective lens can be found analytically by solving an inverse problem with the Richards-Wolf vectorial diffraction method, and can be experimentally created with a vectorial optical field generator. Focused fields with various polarizations are successfully generated and verified using a Stokes parameter measurement to demonstrate the capability and versatility of the proposed technique.

  9. Efficacy of various pasteurization time-temperature conditions in combination with homogenization on inactivation of Mycobacterium avium subsp. paratuberculosis in milk.

    PubMed

    Grant, Irene R; Williams, Alan G; Rowe, Michael T; Muir, D Donald

    2005-06-01

    The effect of various pasteurization time-temperature conditions with and without homogenization on the viability of Mycobacterium avium subsp. paratuberculosis was investigated using a pilot-scale commercial high-temperature, short-time (HTST) pasteurizer and raw milk spiked with 10(1) to 10(5) M. avium subsp. paratuberculosis cells/ml. Viable M. avium subsp. paratuberculosis was cultured from 27 (3.3%) of 816 pasteurized milk samples overall, 5 on Herrold's egg yolk medium and 22 by BACTEC culture. Therefore, in 96.7% of samples, M. avium subsp. paratuberculosis had been completely inactivated by HTST pasteurization, alone or in combination with homogenization. Heat treatments incorporating homogenization at 2,500 lb/in2, applied upstream (as a separate process) or in hold (at the start of a holding section), resulted in significantly fewer culture-positive samples than pasteurization treatments without homogenization (P < 0.001 for those in hold and P < 0.05 for those upstream). Where colony counts were obtained, the number of surviving M. avium subsp. paratuberculosis cells was estimated to be 10 to 20 CFU/150 ml, and the reduction in numbers achieved by HTST pasteurization with or without homogenization was estimated to be 4.0 to 5.2 log10. The impact of homogenization on clump size distribution in M. avium subsp. paratuberculosis broth suspensions was subsequently assessed using a Mastersizer X spectrometer. These experiments demonstrated that large clumps of M. avium subsp. paratuberculosis cells were reduced to single-cell or "miniclump" status by homogenization at 2,500 lb/in2. Consequently, when HTST pasteurization was being applied to homogenized milk, the M. avium subsp. paratuberculosis cells would have been present as predominantly declumped cells, which may possibly explain the greater inactivation achieved by the combination of pasteurization and homogenization.

  10. Efficacy of Various Pasteurization Time-Temperature Conditions in Combination with Homogenization on Inactivation of Mycobacterium avium subsp. paratuberculosis in Milk

    PubMed Central

    Grant, Irene R.; Williams, Alan G.; Rowe, Michael T.; Muir, D. Donald

    2005-01-01

    The effect of various pasteurization time-temperature conditions with and without homogenization on the viability of Mycobacterium avium subsp. paratuberculosis was investigated using a pilot-scale commercial high-temperature, short-time (HTST) pasteurizer and raw milk spiked with 10¹ to 10⁵ M. avium subsp. paratuberculosis cells/ml. Viable M. avium subsp. paratuberculosis was cultured from 27 (3.3%) of 816 pasteurized milk samples overall, 5 on Herrold's egg yolk medium and 22 by BACTEC culture. Therefore, in 96.7% of samples, M. avium subsp. paratuberculosis had been completely inactivated by HTST pasteurization, alone or in combination with homogenization. Heat treatments incorporating homogenization at 2,500 lb/in², applied upstream (as a separate process) or in hold (at the start of a holding section), resulted in significantly fewer culture-positive samples than pasteurization treatments without homogenization (P < 0.001 for those in hold and P < 0.05 for those upstream). Where colony counts were obtained, the number of surviving M. avium subsp. paratuberculosis cells was estimated to be 10 to 20 CFU/150 ml, and the reduction in numbers achieved by HTST pasteurization with or without homogenization was estimated to be 4.0 to 5.2 log₁₀. The impact of homogenization on clump size distribution in M. avium subsp. paratuberculosis broth suspensions was subsequently assessed using a Mastersizer X spectrometer. These experiments demonstrated that large clumps of M. avium subsp. paratuberculosis cells were reduced to single-cell or “miniclump” status by homogenization at 2,500 lb/in². Consequently, when HTST pasteurization was being applied to homogenized milk, the M. avium subsp. paratuberculosis cells would have been present as predominantly declumped cells, which may possibly explain the greater inactivation achieved by the combination of pasteurization and homogenization. PMID:15932977

  11. Retinoic Acid Engineered Amniotic Membrane Used as Graft or Homogenate: Positive Effects on Corneal Alkali Burns.

    PubMed

    Joubert, Romain; Daniel, Estelle; Bonnin, Nicolas; Comptour, Aurélie; Gross, Christelle; Belville, Corinne; Chiambaretta, Frédéric; Blanchon, Loïc; Sapin, Vincent

    2017-07-01

    Alkali burns are the most common, severe chemical ocular injuries, their functional prognosis depending on corneal wound healing efficiency. The purpose of our study was to compare the benefits of amniotic membrane (AM) grafts and homogenates for wound healing in the presence or absence of previous all-trans retinoic acid (atRA) treatment. Fifty male CD1 mice with reproducible corneal chemical burn were divided into five groups, as follows: group 1 was treated with saline solution; groups 2 and 3 received untreated AM grafts or grafts treated with atRA, respectively; and groups 4 and 5 received untreated AM homogenates or homogenates treated with atRA, respectively. After 7 days of treatment, ulcer area and depth were measured, and vascular endothelial growth factor (VEGF) and matrix metalloproteinase 9 (MMP-9) were quantified. AM induction by atRA was confirmed via quantification of retinoic acid receptor β (RARβ), a well-established retinoic acid-induced gene. Significant improvements of corneal wound healing in terms of ulcer area and depth were obtained with both strategies. No major differences were found between the efficiency of AM homogenates and grafts. This positive action was increased when AM was pretreated with atRA. Furthermore, AM induced a decrease in VEGF and MMP-9 levels during the wound healing process. The atRA treatment led to an even greater decrease in the expression of both proteins. Amnion homogenate is as effective as AM grafts in promoting corneal wound healing in a mouse model. A higher positive effect was obtained with atRA treatment.

  12. Investigation to biodiesel production by the two-step homogeneous base-catalyzed transesterification.

    PubMed

    Ye, Jianchu; Tu, Song; Sha, Yong

    2010-10-01

    For two-step transesterification biodiesel production from sunflower oil, based on the kinetics model of the homogeneous base-catalyzed transesterification and the liquid-liquid phase equilibrium of the transesterification product, the total methanol/oil mole ratio, the total reaction time, and the split ratios of methanol and reaction time between the two reactors are determined quantitatively. In consideration of the transesterification intermediate product, both the traditional distillation separation process and the improved separation process for the two-step reaction product are investigated in detail by means of rigorous process simulation. In comparison with the traditional distillation process, the improved separation process of the two-step reaction product has a distinct advantage in energy duty and equipment requirements due to the replacement of the costly methanol-biodiesel distillation column. Copyright 2010 Elsevier Ltd. All rights reserved.
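
    The kinetics underlying such an analysis are commonly written as three reversible steps (triglyceride → diglyceride → monoglyceride → glycerol, each releasing one methyl ester). A minimal sketch of that ODE system for a well-mixed reactor stage, chained into an idealized two-step scheme, is given below; the rate constants, initial concentrations, stage times and methanol split are purely illustrative assumptions, not the values determined in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Stepwise reversible transesterification: TG + A <-> DG + E, DG + A <-> MG + E,
# MG + A <-> GL + E (A = methanol, E = methyl ester).  Rate constants are
# purely illustrative, in L/(mol*min).
K = dict(k1=0.05, k2=0.11, k3=0.22, k4=1.2, k5=0.24, k6=0.007)

def rates(t, c):
    tg, dg, mg, gl, e, a = c
    r1 = K["k1"] * tg * a - K["k2"] * dg * e
    r2 = K["k3"] * dg * a - K["k4"] * mg * e
    r3 = K["k5"] * mg * a - K["k6"] * gl * e
    return [-r1, r1 - r2, r2 - r3, r3, r1 + r2 + r3, -(r1 + r2 + r3)]

def stage(c0, minutes):
    """One well-mixed reactor stage; returns concentrations at the end."""
    sol = solve_ivp(rates, (0.0, minutes), c0, method="LSODA", rtol=1e-8)
    return sol.y[:, -1]

# Two-step scheme: split a 6:1 total methanol/oil mole ratio 4:2 between the
# reactors and remove the glycerol phase between steps (idealized as complete).
tg0 = 1.0                                  # mol/L of oil (illustrative)
after1 = stage([tg0, 0, 0, 0, 0, 4.0 * tg0], minutes=30)
after1[3] = 0.0                            # glycerol leaves with the polar phase
after1[5] += 2.0 * tg0                     # fresh methanol for the second step
after2 = stage(after1, minutes=30)
ester_yield = after2[4] / (3.0 * tg0)      # fraction of the theoretical ester
print(f"ester yield after two steps: {ester_yield:.2f}")
```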

  13. A novel approach to make homogeneous protease-stable monovalent streptavidin

    DOE PAGES

    Zhang, M.; Shao, J; Xiao, J.; ...

    2015-06-11

    The interaction between the tetramer streptavidin and biotin is recognized as one of the strongest non-covalent associations. Owing to the tight and specific binding, the streptavidin-biotin system has been used widely for biomolecular labeling, purification, immobilization, and even for targeted delivery of therapeutic drugs. Here, we report a novel approach to make homogeneous monovalent tetramer streptavidin. The purified monovalent protein showed both thermal stability and protease stability. Unexpectedly, we found that two proteases, Proteinase K (PK) and Subtilisin (SU), can efficiently remove the His8-tag from the wild-type subunit without affecting the tetramer architecture of monovalent streptavidin, thus making it more homogeneous. In addition, crystallization was performed to assure the homogeneity of the monovalent protein prepared. Overall, monovalent streptavidin shows increased homogeneity and will likely be valuable for many future applications in a wide range of research areas.

  14. Homogeneous-heterogeneous reactions in curved channel with porous medium

    NASA Astrophysics Data System (ADS)

    Hayat, T.; Ayub, Sadia; Alsaedi, A.

    2018-06-01

    The purpose of the present investigation is to examine peristaltic flow through a porous medium in a curved conduit. The problem is modeled for an incompressible, electrically conducting Ellis fluid. The influence of the porous medium is tackled via modified Darcy's law. The considered model utilizes homogeneous-heterogeneous reactions with equal diffusivities for reactant and autocatalyst. Constitutive equations are formulated in the presence of viscous dissipation. The channel walls are compliant in nature. The governing equations are modeled and simplified under the assumptions of small Reynolds number and large wavelength. Graphical results for velocity, temperature, heat transfer coefficient and homogeneous-heterogeneous reaction parameters are examined for the emerging parameters of the problem. The results reveal that both the homogeneous-heterogeneous reaction effect and the heat transfer rate are enhanced with increasing curvature of the channel.

  15. Extreme between-study homogeneity in meta-analyses could offer useful insights.

    PubMed

    Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias

    2006-10-01

    Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empiric distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
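
    The left-sided Monte Carlo test described above can be sketched as follows: compute Cochran's Q for the observed study effects, then repeatedly simulate studies that all share the pooled effect (with the same within-study variances) and count how often the simulated Q is as small as the observed one. The effect sizes and variances below are hypothetical, and working on log risk ratios with normal within-study errors is a simplifying assumption.

```python
import numpy as np

rng = np.random.default_rng(2)

def cochran_q(effects, variances):
    """Cochran's Q for a fixed-effect meta-analysis (effects on the log scale,
    e.g. log risk ratios, with their within-study variances)."""
    w = 1.0 / variances
    pooled = np.sum(w * effects) / np.sum(w)
    return np.sum(w * (effects - pooled) ** 2), pooled

def left_sided_p(effects, variances, n_sim=20000):
    """Monte Carlo left-sided p-value: probability of a Q as small as (or
    smaller than) the observed one under exact between-study homogeneity."""
    q_obs, pooled = cochran_q(effects, variances)
    se = np.sqrt(variances)
    q_sim = np.empty(n_sim)
    for i in range(n_sim):
        sim = rng.normal(pooled, se)          # all studies share the pooled effect
        q_sim[i], _ = cochran_q(sim, variances)
    return np.mean(q_sim <= q_obs)

# Hypothetical meta-analysis: five suspiciously similar log risk ratios.
log_rr = np.array([0.10, 0.11, 0.10, 0.09, 0.10])
var = np.array([0.04, 0.05, 0.03, 0.06, 0.04])
print("left-sided P =", left_sided_p(log_rr, var))
```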

  16. Injection molding as a one-step process for the direct production of pharmaceutical dosage forms from primary powders.

    PubMed

    Eggenreich, K; Windhab, S; Schrank, S; Treffer, D; Juster, H; Steinbichler, G; Laske, S; Koscher, G; Roblegg, E; Khinast, J G

    2016-05-30

    The objective of the present study was to develop a one-step process for the production of tablets directly from primary powder by means of injection molding (IM), to create solid-dispersion-based tablets. Fenofibrate was used as the model API, and a polyvinyl caprolactam-polyvinyl acetate-polyethylene glycol graft co-polymer served as the matrix system. Formulations were injection-molded into tablets using state-of-the-art IM equipment. The resulting tablets were physico-chemically characterized and the drug release kinetics and mechanism were determined. Comparison tablets were produced either directly from powder or from pre-processed pellets prepared via hot melt extrusion (HME). The content of the model drug in the formulations was 10% (w/w), 20% (w/w) and 30% (w/w). After 120 min, both powder-based and pellet-based injection-molded tablets exhibited a drug release of 60%, independent of the processing route. Content uniformity analysis demonstrated that the model drug was homogeneously distributed. Moreover, analysis of single-dose uniformity also revealed geometric drug homogeneity between tablets of one shot. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Data Homogenization of the NOAA Long-Term Ozonesonde Records

    NASA Astrophysics Data System (ADS)

    Johnson, B.; Cullis, P.; Sterling, C. W.; Jordan, A. F.; Hall, E. G.; Petropavlovskikh, I. V.; Oltmans, S. J.; Mcconville, G.

    2015-12-01

    The NOAA long term balloon-borne ozonesonde sites at Boulder, Colorado; Hilo, Hawaii; and South Pole Station, Antarctica have measured weekly ozone profiles for more than 3 decades. The ozonesonde consists of an electrochemical concentration cell (ECC) sensor interfaced with a weather radiosonde which transmits high resolution ozone and meteorological data during ascent from the surface to 30-35 km altitude. During this 30 year time period there have been several model changes in the commercially available ECC ozonesondes and radiosondes as well as three adjustments in the ozone sensor solution composition at NOAA. These changes were aimed at optimizing the ozonesonde performance. Organized intercomparison campaigns conducted at the environmental simulation facility at the Research Centre Juelich, Germany and international field site testing have been the primary process for assessing new designs, instruments, or sensor solution changes and developing standard operating procedures. NOAA has also performed in-house laboratory tests and launched 28 dual ozonesondes at various sites since 1994 to provide further comparison data to determine the optimum homogenized data set. The final homogenization effort involved reviewing and editing several thousand individual ozonesonde profiles followed by applying the optimum correction algorithms for changes in type of sensor solution composition. The results of improved data sets will be shown with long term trends and uncertainties at various altitude levels.

  18. Pyroxene Homogenization and the Isotopic Systematics of Eucrites

    NASA Technical Reports Server (NTRS)

    Nyquist, L. E.; Bogard, D. D.

    1996-01-01

    The original Mg-Fe zoning of eucritic pyroxenes has in nearly all cases been partly homogenized, an observation that has been combined with other petrographic and compositional criteria to establish a scale of thermal "metamorphism" for eucrites. To evaluate hypotheses explaining development of conditions on the HED parent body (Vesta?) leading to pyroxene homogenization against their chronological implications, it is necessary to know whether pyroxene metamorphism was recorded in the isotopic systems. However, identifying the effects of the thermal metamorphism with specific effects in the isotopic systems has been difficult, due in part to a lack of correlated isotopic and mineralogical studies of the same eucrites. Furthermore, isotopic studies often place high demands on analytical capabilities, resulting in slow growth of the isotopic database. Additionally, some isotopic systems would not respond in a direct and sensitive way to pyroxene homogenization. Nevertheless, sufficient data exist to generalize some observations, and to identify directions of potentially fruitful investigations.

  19. Towards Low-Cost Effective and Homogeneous Thermal Activation of Shape Memory Polymers

    PubMed Central

    Lantada, Andrés Díaz; Rebollo, María Ángeles Santamaría

    2013-01-01

    A typical limitation of intelligent devices based on the use of shape-memory polymers as actuators is linked to the widespread use of distributed heating resistors, via the Joule effect, as the activation method, which involves several relevant issues needing attention, such as: (a) the final device size is considerably increased due to the additional space required for the resistances; (b) the use of resistances limits the materials' strength and the obtained devices are normally weaker; (c) the activation process through heating resistances is not homogeneous, thus leading to significant temperature differences within the polymeric structure and to undesirable thermal gradients and stresses, also limiting the application fields of shape-memory polymers. In our present work we describe interesting activation alternatives, based on coating shape-memory polymers with different kinds of conductive materials, including textiles, conductive threads and conductive paint, which stand out for their easy, rapid and very cheap implementation. Distributed heating and homogeneous activation can be achieved with several of the alternatives studied, and the technical results are comparable to those obtained using advanced shape-memory nanocomposites, which have to deal with complex synthesis, processing and safety aspects. Different combinations of shape-memory epoxy resin with several coating electrotextiles, conductive films and paints are prepared, simulated with the help of thermal finite element method based resources, and characterized using infrared thermography to validate the simulations and the overall design process. A final application linked to an active catheter pincer is detailed, and the advantages of using distributed heating instead of conventional resistors are discussed. PMID:28788401

  20. Degradation of S-nitrosocysteine in vascular tissue homogenates: role of divalent ions.

    PubMed

    Kostka, P; Xu, B; Skiles, E H

    1999-04-01

    The objective of the study was to inquire about the mechanism(s) involved in the catabolism of S-nitrosothiols by vascular tissue under in vitro conditions. Incubations of S-nitrosocysteine (CYSNO) or S-nitrosoglutathione (GSNO) with homogenates isolated from porcine aortic smooth muscle resulted in only a marginal depletion of S-nitrosothiols from the reaction mixtures, which became statistically significant at relatively high concentrations of homogenate (> or =300 microg of protein/ml). Degradation of CYSNO (but not GSNO) was found to be potentiated several-fold by millimolar concentrations of either Mg2+ or Ca2+ ions. Under such conditions, the degradation of CYSNO was significantly suppressed by the removal of proteins by ultrafiltration (>80% inhibition) and eliminated completely by the alkylation of thiol groups with 1 mM N-ethylmaleimide. The potentiating effect of divalent ions on the degradation of CYSNO was insensitive to 0.1 mM neocuproine (selective chelator of Cu+ ions), although it was enhanced in the presence of 0.1 mM o-phenanthroline (selective chelator of Fe2+ ions). It is concluded that the degradation of CYSNO by tissue homogenate involves the interaction with protein-bound sulfhydryl groups, which is stimulated by Mg2+ or Ca2+ ions. The potentiating effect of o-phenanthroline suggests that the liberation of the nitrosonium moiety in such a process may be accompanied by its transfer to sulfur center(s) by transient formation of dinitrosyl-iron complexes.

  1. Homogenizing bacterial cell factories: Analysis and engineering of phenotypic heterogeneity.

    PubMed

    Binder, Dennis; Drepper, Thomas; Jaeger, Karl-Erich; Delvigne, Frank; Wiechert, Wolfgang; Kohlheyer, Dietrich; Grünberger, Alexander

    2017-07-01

    In natural habitats, microbes form multispecies communities that commonly face rapidly changing and highly competitive environments. Thus, phenotypic heterogeneity has evolved as an innate and important survival strategy to gain an overall fitness advantage over cohabiting competitors. However, in defined artificial environments such as monocultures in small- to large-scale bioreactors, cell-to-cell variations are presumed to cause reduced production yields as well as process instability. Hence, engineering microbial production toward phenotypic homogeneity is a highly promising approach for synthetic biology and bioprocess optimization. In this review, we discuss recent studies that have unraveled the cell-to-cell heterogeneity observed during bacterial gene expression and metabolite production as well as the molecular mechanisms involved. In addition, current single-cell technologies are briefly reviewed with respect to their applicability in exploring cell-to-cell variations. We highlight emerging strategies and tools to reduce phenotypic heterogeneity in biotechnological expression setups. Here, strain or inducer modifications are combined with cell physiology manipulations to achieve the ultimate goal of equalizing bacterial populations. In this way, the majority of cells can be forced into high productivity, thus reducing less productive subpopulations that tend to consume valuable resources during production. Modifications in uptake systems, inducer molecules or nutrients represent valuable tools for diminishing heterogeneity. Finally, we address the challenge of transferring homogeneously responding cells into large-scale bioprocesses. Environmental heterogeneity originating from extrinsic factors such as stirring speed and pH, oxygen, temperature or nutrient distribution can significantly influence cellular physiology. We conclude that engineering microbial populations toward phenotypic homogeneity is an increasingly important task to take biotechnological

  2. ISO 55000: Creating an asset management system.

    PubMed

    Bradley, Chris; Main, Kevin

    2015-02-01

    In the October 2014 issue of HEJ, Keith Hamer, group vice-president, Asset Management & Engineering at Sodexo, and marketing director at Asset Wisdom, Kevin Main, argued that the new ISO 55000 standards present facilities managers with an opportunity to create 'a joined-up, whole lifecycle approach' to managing and delivering value from assets. In this article, Kevin Main and Chris Bradley, who runs various asset management projects, examine the process of creating an asset management system.

  3. Homogeneous free-form directional backlight for 3D display

    NASA Astrophysics Data System (ADS)

    Krebs, Peter; Liang, Haowen; Fan, Hang; Zhang, Aiqin; Zhou, Yangui; Chen, Jiayi; Li, Kunyang; Zhou, Jianying

    2017-08-01

    Realization of a near-perfect homogeneous secondary emission source for 3D display is proposed and demonstrated. The light source takes advantage of an array of free-form emission surfaces with a specially tailored light-guiding structure, a light diffuser and a Fresnel lens. A seamless and homogeneous directional emission is experimentally obtained, which is essential for a high-quality naked-eye 3D display.

  4. Universal Scaling Laws in the Dynamics of a Homogeneous Unitary Bose Gas

    NASA Astrophysics Data System (ADS)

    Eigen, Christoph; Glidden, Jake A. P.; Lopes, Raphael; Navon, Nir; Hadzibabic, Zoran; Smith, Robert P.

    2017-12-01

    We study the dynamics of an initially degenerate homogeneous Bose gas after an interaction quench to the unitary regime at a magnetic Feshbach resonance. As the cloud decays and heats, it exhibits a crossover from degenerate- to thermal-gas behavior, both of which are characterized by universal scaling laws linking the particle-loss rate to the total atom number N. In the degenerate and thermal regimes, the per-particle loss rate is ∝N^{2/3} and N^{26/9}, respectively. The crossover occurs at a universal kinetic energy per particle and at a universal time after the quench, in units of energy and time set by the gas density. By slowly sweeping the magnetic field away from the resonance and creating a mixture of atoms and molecules, we also map out the dynamics of correlations in the unitary gas, which display a universal temporal scaling with the gas density, and reach a steady state while the gas is still degenerate.

  5. Universal Scaling Laws in the Dynamics of a Homogeneous Unitary Bose Gas.

    PubMed

    Eigen, Christoph; Glidden, Jake A P; Lopes, Raphael; Navon, Nir; Hadzibabic, Zoran; Smith, Robert P

    2017-12-22

    We study the dynamics of an initially degenerate homogeneous Bose gas after an interaction quench to the unitary regime at a magnetic Feshbach resonance. As the cloud decays and heats, it exhibits a crossover from degenerate- to thermal-gas behavior, both of which are characterized by universal scaling laws linking the particle-loss rate to the total atom number N. In the degenerate and thermal regimes, the per-particle loss rate is ∝N^{2/3} and N^{26/9}, respectively. The crossover occurs at a universal kinetic energy per particle and at a universal time after the quench, in units of energy and time set by the gas density. By slowly sweeping the magnetic field away from the resonance and creating a mixture of atoms and molecules, we also map out the dynamics of correlations in the unitary gas, which display a universal temporal scaling with the gas density, and reach a steady state while the gas is still degenerate.
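
    For reference, the two regimes quoted above can be written compactly as scaling relations for the per-particle loss rate (denoted Γ here purely for notation; the proportionality constants, set by the gas density, are left unspecified):

    \[ \Gamma \equiv -\frac{1}{N}\frac{dN}{dt} \propto N^{2/3} \quad \text{(degenerate regime)}, \qquad \Gamma \propto N^{26/9} \quad \text{(thermal regime)}. \]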

  6. General Theorems about Homogeneous Ellipsoidal Inclusions

    ERIC Educational Resources Information Center

    Korringa, J.; And Others

    1978-01-01

    Mathematical theorems about the properties of ellipsoids are developed. Included are Poisson's theorem concerning the magnetization of a homogeneous body of ellipsoidal shape, the polarization of a dielectric, the transport of heat or electricity through an ellipsoid, and other problems. (BB)

  7. High pressure homogenization processing, thermal treatment and milk matrix affect in vitro bioaccessibility of phenolics in apple, grape and orange juice to different extents.

    PubMed

    He, Zhiyong; Tao, Yadan; Zeng, Maomao; Zhang, Shuang; Tao, Guanjun; Qin, Fang; Chen, Jie

    2016-06-01

    The effects of high pressure homogenization processing (HPHP), thermal treatment (TT) and milk matrix (soy, skimmed and whole milk) on the phenolic bioaccessibility and the ABTS scavenging activity of apple, grape and orange juice (AJ, GJ and OJ) were investigated. HPHP and soy milk diminished AJ's total phenolic bioaccessibility by 29.3% and 26.3%, respectively, whereas TT and bovine milk hardly affected it. HPHP had little effect on GJ's and OJ's total phenolic bioaccessibility, while TT enhanced them by 27.3-33.9% and 19.0-29.2%, respectively, and the milk matrix increased them by 26.6-31.1% and 13.3-43.4%, respectively. Furthermore, TT (80 °C/30 min) and TT (90 °C/30 s) had similar influences on GJ's and OJ's phenolic bioaccessibility. Skimmed milk showed a stronger enhancing effect on OJ's total phenolic bioaccessibility than soy and whole milk, but affected GJ's similarly to whole milk. These results contribute to promoting the health benefits of fruit juices by optimizing the processing and formulas in the food industry. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. The Stratospheric Water and Ozone Satellite Homogenized (SWOOSH) database: a long-term database for climate studies

    NASA Astrophysics Data System (ADS)

    Davis, Sean M.; Rosenlof, Karen H.; Hassler, Birgit; Hurst, Dale F.; Read, William G.; Vömel, Holger; Selkirk, Henry; Fujiwara, Masatomo; Damadeo, Robert

    2016-09-01

    In this paper, we describe the construction of the Stratospheric Water and Ozone Satellite Homogenized (SWOOSH) database, which includes vertically resolved ozone and water vapor data from a subset of the limb profiling satellite instruments operating since the 1980s. The primary SWOOSH products are zonal-mean monthly-mean time series of water vapor and ozone mixing ratio on pressure levels (12 levels per decade from 316 to 1 hPa). The SWOOSH pressure level products are provided on several independent zonal-mean grids (2.5, 5, and 10°), and additional products include two coarse 3-D griddings (30° long × 10° lat, 20° × 5°) as well as a zonal-mean isentropic product. SWOOSH includes both individual satellite source data as well as a merged data product. A key aspect of the merged product is that the source records are homogenized to account for inter-satellite biases and to minimize artificial jumps in the record. We describe the SWOOSH homogenization process, which involves adjusting the satellite data records to a "reference" satellite using coincident observations during time periods of instrument overlap. The reference satellite is chosen based on the best agreement with independent balloon-based sounding measurements, with the goal of producing a long-term data record that is both homogeneous (i.e., with minimal artificial jumps in time) and accurate (i.e., unbiased). This paper details the choice of reference measurements, homogenization, and gridding process involved in the construction of the combined SWOOSH product and also presents the ancillary information stored in SWOOSH that can be used in future studies of water vapor and ozone variability. Furthermore, a discussion of uncertainties in the combined SWOOSH record is presented, and examples of the SWOOSH record are provided to illustrate its use for studies of ozone and water vapor variability on interannual to decadal timescales. The version 2.5 SWOOSH data are publicly available at
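
    The core of the homogenization step described above is a bias adjustment of each source satellite record toward the chosen reference record over their period of overlap. The sketch below illustrates that idea in its simplest additive form; the array names, the synthetic data and the single-offset assumption are illustrative only and are not taken from the actual SWOOSH processing code.

```python
import numpy as np

def homogenize_to_reference(source, reference, overlap_mask):
    """Remove the mean source-minus-reference offset, estimated over the
    coincident overlap period, from the whole source record (a minimal,
    additive-bias sketch of inter-satellite homogenization)."""
    bias = np.nanmean(source[overlap_mask] - reference[overlap_mask])
    return source - bias

# Hypothetical monthly zonal-mean water vapor series (ppmv)
rng = np.random.default_rng(0)
t = np.arange(120)
reference = 4.5 + 0.3 * np.sin(2 * np.pi * t / 12)
source = reference + 0.2 + 0.05 * rng.normal(size=t.size)  # instrument with a constant bias
overlap = (t >= 60) & (t < 84)                              # two-year overlap period
adjusted = homogenize_to_reference(source, reference, overlap)
```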

  9. Disruption and molecule degradation of waxy maize starch granules during high pressure homogenization process.

    PubMed

    Wei, Benxi; Cai, Canxin; Xu, Baoguo; Jin, Zhengyu; Tian, Yaoqi

    2018-02-01

    The mechanism underlying the fragmentation of waxy maize starch (WMS) granules during high-pressure homogenization (HPH) was studied and the results were interpreted in terms of granular and molecular aspects. The diameter of disrupted starch granules decreased exponentially with increasing HPH pressure, but decreased linearly with the number of HPH cycles. Scanning electron microscopy revealed a cone-like inside-out disruption pattern through the channels that resulted in separation of blocklet fragments or starch fragments. The Mw of amylopectin was reduced by ∼50% following treatment at 150 MPa for two cycles, or at 100 MPa for eight cycles, and the decrease was in accordance with the disruption of starch granules. This indicated that amylopectin was "protected" by blocklets, and that disruption of WMS granules mainly occurred close to the linkages among blocklets. Increasing the HPH pressure appeared to be more effective for breaking starch granules than increasing the number of HPH cycles. Copyright © 2017 Elsevier Ltd. All rights reserved.
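
    The two granule-size trends reported above can be summarized schematically as follows, where d_0 is the initial granule diameter and α, β are unspecified positive constants rather than fitted values from the paper:

    \[ d(P) \approx d_0\,e^{-\alpha P} \quad \text{(increasing pressure, fixed cycle number)}, \qquad d(n) \approx d_0 - \beta n \quad \text{(increasing cycle number, fixed pressure)}. \]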

  10. Beyond relationships between homogeneous and heterogeneous catalysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dixon, David A.; Katz, Alexander; Arslan, Ilke

    2014-08-13

    Scientists who regard catalysis as a coherent field have been striving for decades to articulate the fundamental unifying principles. But because these principles seem to be broader than chemistry, chemical engineering, and materials science combined, catalytic scientists commonly interact within the sub-domains of homogeneous, heterogeneous, and bio-catalysis, and increasingly within even narrower domains such as organocatalysis, phase-transfer catalysis, acid-base catalysis, zeolite catalysis, etc. Attempts to unify catalysis have motivated researchers to find relationships between homogeneous and heterogeneous catalysis and to mimic enzymes. These themes have inspired vibrant international meetings and workshops, and we have benefited from the idea exchanges and have some thoughts about a path forward.

  11. Homogeneity of gels and gel-derived glasses

    NASA Technical Reports Server (NTRS)

    Mukherjee, S. P.

    1984-01-01

    The significance and implications of gel preparation procedures in controlling the homogeneity of multicomponent oxide gels are discussed. The role of physicochemical factors such as the structure and chemical reactivities of alkoxides, the formation of double-metal alkoxides, and the nature of solvent(s) are critically analyzed in the context of homogeneity of gels during gelation. Three procedures for preparing gels in the SiO2-B2O3-Na2O system are examined in the context of cation distribution. Light scattering results for glasses in the SiO2-B2O3-Na2O system prepared by both the gel technique and the conventional technique are examined.

  12. On Using Homogeneous Polynomials To Design Anisotropic Yield Functions With Tension/Compression Symmetry/Asymmetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soare, S.; Cazacu, O.; Yoon, J. W.

    With few exceptions, non-quadratic homogeneous polynomials have received little attention as possible candidates for yield functions. One reason might be that not every such polynomial is a convex function. In this paper we show that homogeneous polynomials can be used to develop powerful anisotropic yield criteria, and that imposing simple constraints on the identification process leads, a posteriori, to the desired convexity property. It is shown that combinations of such polynomials allow for modeling yielding properties of metallic materials with any crystal structure, i.e. both cubic and hexagonal, which display strength differential effects. Extensions of the proposed criteria to 3D stress states are also presented. We apply these criteria to the description of the aluminum alloy AA2090T3. We prove that a sixth order orthotropic homogeneous polynomial is capable of a satisfactory description of this alloy. Next, applications to the deep drawing of a cylindrical cup are presented. The newly proposed criteria were implemented as UMAT subroutines into the commercial FE code ABAQUS. We were able to predict six ears on the AA2090T3 cup's profile. Finally, we show that a tension/compression asymmetry in yielding can have an important effect on the earing profile.

  13. On Using Homogeneous Polynomials To Design Anisotropic Yield Functions With Tension/Compression Symmetry/Asymmetry

    NASA Astrophysics Data System (ADS)

    Soare, S.; Yoon, J. W.; Cazacu, O.

    2007-05-01

    With few exceptions, non-quadratic homogeneous polynomials have received little attention as possible candidates for yield functions. One reason might be that not every such polynomial is a convex function. In this paper we show that homogeneous polynomials can be used to develop powerful anisotropic yield criteria, and that imposing simple constraints on the identification process leads, a posteriori, to the desired convexity property. It is shown that combinations of such polynomials allow for modeling yielding properties of metallic materials with any crystal structure, i.e. both cubic and hexagonal, which display strength differential effects. Extensions of the proposed criteria to 3D stress states are also presented. We apply these criteria to the description of the aluminum alloy AA2090T3. We prove that a sixth order orthotropic homogeneous polynomial is capable of a satisfactory description of this alloy. Next, applications to the deep drawing of a cylindrical cup are presented. The newly proposed criteria were implemented as UMAT subroutines into the commercial FE code ABAQUS. We were able to predict six ears on the AA2090T3 cup's profile. Finally, we show that a tension/compression asymmetry in yielding can have an important effect on the earing profile.

  14. In-line Raman spectroscopic monitoring and feedback control of a continuous twin-screw pharmaceutical powder blending and tableting process.

    PubMed

    Nagy, Brigitta; Farkas, Attila; Gyürkés, Martin; Komaromy-Hiller, Szofia; Démuth, Balázs; Szabó, Bence; Nusser, Dávid; Borbás, Enikő; Marosi, György; Nagy, Zsombor Kristóf

    2017-09-15

    The integration of Process Analytical Technology (PAT) initiative into the continuous production of pharmaceuticals is indispensable for reliable production. The present paper reports the implementation of in-line Raman spectroscopy in a continuous blending and tableting process of a three-component model pharmaceutical system, containing caffeine as model active pharmaceutical ingredient (API), glucose as model excipient and magnesium stearate as lubricant. The real-time analysis of API content, blend homogeneity, and tablet content uniformity was performed using a Partial Least Squares (PLS) quantitative method. The in-line Raman spectroscopic monitoring showed that the continuous blender was capable of producing blends with high homogeneity, and technological malfunctions can be detected by the proposed PAT method. The Raman spectroscopy-based feedback control of the API feeder was also established, creating a 'Process Analytically Controlled Technology' (PACT), which guarantees the required API content in the produced blend. This is, to the best of the authors' knowledge, the first ever application of Raman-spectroscopy in continuous blending and the first Raman-based feedback control in the formulation technology of solid pharmaceuticals. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Numerical computation of homogeneous slope stability.

    PubMed

    Xiao, Shuangshuang; Li, Kemin; Ding, Xiaohua; Liu, Tong

    2015-01-01

    To simplify the computational process of homogeneous slope stability analysis, improve computational accuracy, and find multiple potential slip surfaces of a complex geometric slope, this study utilized the limit equilibrium method to derive expressions for the overall and partial factors of safety. The search for the minimum factor of safety (FOS) was transformed into a constrained nonlinear programming problem, to which an exhaustive method (EM) and a particle swarm optimization algorithm (PSO) were applied. In simple slope examples, the computational results using the EM and PSO were close to those obtained using other methods. Compared to the EM, the PSO had a small computation error and a significantly shorter computation time. As a result, the PSO could precisely calculate the slope FOS with high efficiency. The example of the multistage slope analysis indicated that this slope had two potential slip surfaces, with factors of safety of 1.1182 and 1.1560, respectively. The differences between these and the minimum FOS (1.0759) were small, but the positions of the slip surfaces were completely different from the critical slip surface (CSS).
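
    To illustrate the optimization side of the approach, the sketch below minimizes a stand-in objective with a basic global-best particle swarm optimizer. The fos() function is a hypothetical placeholder for a limit-equilibrium factor-of-safety evaluation, and the gains, bounds and particle counts are illustrative choices; this is not the authors' implementation.

```python
import numpy as np

def fos(params):
    """Hypothetical stand-in for a factor-of-safety evaluation.

    A real analysis would construct the trial slip surface from `params`
    and return its limit-equilibrium FOS; here a smooth test function
    with a known minimum is used purely to exercise the optimizer."""
    x, y = params
    return 1.0 + 0.1 * ((x - 2.0) ** 2 + (y + 1.0) ** 2)

def pso_minimize(obj, bounds, n_particles=30, n_iter=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best particle swarm optimizer."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([obj(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, lo.size))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([obj(p) for p in x])
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

best, best_fos = pso_minimize(fos, (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
print(best, best_fos)  # should approach (2, -1) with a minimum FOS of ~1.0
```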

  16. Development of soil properties and nitrogen cycling in created wetlands

    USGS Publications Warehouse

    Wolf, K.L.; Ahn, C.; Noe, G.B.

    2011-01-01

    Mitigation wetlands are expected to compensate for the loss of structure and function of natural wetlands within 5–10 years of creation; however, the age-based trajectory of development in wetlands is unclear. This study investigates the development of coupled structural (soil properties) and functional (nitrogen cycling) attributes of created non-tidal freshwater wetlands of varying ages and natural reference wetlands to determine if created wetlands attain the water quality ecosystem service of nitrogen (N) cycling over time. Soil condition component and its constituents, gravimetric soil moisture, total organic carbon, and total N, generally increased and bulk density decreased with age of the created wetland. Nitrogen flux rates demonstrated age-related patterns, with younger created wetlands having lower rates of ammonification, nitrification, nitrogen mineralization, and denitrification potential than older created wetlands and natural reference wetlands. Results show a clear age-related trajectory in coupled soil condition and N cycle development, which is essential for water quality improvement. These findings can be used to enhance N processing in created wetlands and inform the regulatory evaluation of mitigation wetlands by identifying structural indicators of N processing performance.

  17. Stabilisation of perturbed chains of integrators using Lyapunov-based homogeneous controllers

    NASA Astrophysics Data System (ADS)

    Harmouche, Mohamed; Laghrouche, Salah; Chitour, Yacine; Hamerlain, Mustapha

    2017-12-01

    In this paper, we present a Lyapunov-based homogeneous controller for the stabilisation of a perturbed chain of integrators of arbitrary order r ≥ 1. The proposed controller is based on a homogeneous controller for the stabilisation of a pure chain of integrators. Control of the homogeneity degree is also introduced, and various controllers are designed using this concept, namely a bounded controller with minimum amplitude of discontinuous control and a controller with globally fixed-time convergence. The performance of the controller is validated through simulations.
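
    As a toy illustration of homogeneous state feedback (not the paper's Lyapunov-based design, and without the perturbation), the sketch below simulates the pure double integrator, i.e. the chain of order r = 2, under a standard homogeneous feedback built from signed-power terms. The gains, exponents and initial condition are arbitrary illustrative choices.

```python
import numpy as np

def sig(z, a):
    """Signed power |z|**a * sign(z), the building block of homogeneous feedback."""
    return np.sign(z) * np.abs(z) ** a

# Homogeneous state feedback for the pure double integrator x1' = x2, x2' = u.
# With alpha in (0, 1), the exponents a2 = alpha and a1 = alpha / (2 - alpha)
# give the closed loop a negative homogeneity degree (finite-time convergence).
alpha, k1, k2 = 0.5, 1.0, 2.0
a1, a2 = alpha / (2.0 - alpha), alpha

dt, T = 1e-3, 15.0
x = np.array([2.0, -1.0])             # illustrative initial condition
for _ in range(int(T / dt)):
    u = -k1 * sig(x[0], a1) - k2 * sig(x[1], a2)
    x = x + dt * np.array([x[1], u])  # forward Euler step

print(x)  # both states should have converged close to zero
```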

  18. Contrasting Patterns of rDNA Homogenization within the Zygosaccharomyces rouxii Species Complex

    PubMed Central

    Chand Dakal, Tikam; Giudici, Paolo; Solieri, Lisa

    2016-01-01

    Arrays of repetitive ribosomal DNA (rDNA) sequences are generally expected to evolve as a coherent family, where repeats within such a family are more similar to each other than to orthologs in related species. The continuous homogenization of repeats within individual genomes is a recombination process termed concerted evolution. Here, we investigated the extent and the direction of concerted evolution in 43 yeast strains of the Zygosaccharomyces rouxii species complex (Z. rouxii, Z. sapae, Z. mellis), by analyzing two portions of the 35S rDNA cistron, namely the D1/D2 domains at the 5’ end of the 26S rRNA gene and the segment including the internal transcribed spacers (ITS) 1 and 2 (ITS regions). We demonstrate that intra-genomic rDNA sequence variation is unusually frequent in this clade and that rDNA arrays in single genomes consist of an intermixing of Z. rouxii, Z. sapae and Z. mellis-like sequences, putatively evolved by reticulate evolutionary events that involved repeated hybridization between lineages. The levels and distribution of sequence polymorphisms vary across rDNA repeats in different individuals, reflecting four patterns of rDNA evolution: I) rDNA repeats that are homogeneous within a genome but are chimeras derived from two parental lineages via recombination: Z. rouxii in the ITS region and Z. sapae in the D1/D2 region; II) intra-genomic rDNA repeats that retain polymorphisms only in ITS regions; III) rDNA repeats that vary only in their D1/D2 domains; IV) heterogeneous rDNA arrays that have both polymorphic ITS and D1/D2 regions. We argue that an ongoing process of homogenization following allodiploidization or incomplete lineage sorting gave rise to divergent evolutionary trajectories in different strains, depending upon temporal, structural and functional constraints. We discuss the consequences of these findings for Zygosaccharomyces species delineation and, more generally, for yeast barcoding. PMID:27501051

  19. Simple Köhler homogenizers for image-forming solar concentrators

    NASA Astrophysics Data System (ADS)

    Zhang, Weiya; Winston, Roland

    2010-08-01

    By adding simple Köhler homogenizers in the form of aspheric lenses generated with an optimization approach, we solve the problems of non-uniform irradiance distribution and non-square irradiance pattern existing in some image-forming solar concentrators. The homogenizers do not require optical bonding to the solar cells or total internal reflection surface. Two examples are shown including a Fresnel lens based concentrator and a two-mirror aplanatic system.

  20. On domain symmetry and its use in homogenization

    DOE PAGES

    Barbarosie, Cristian A.; Tortorelli, Daniel A.; Watts, Seth E.

    2017-03-08

    The present study focuses on solving partial differential equations in domains exhibiting symmetries and periodic boundary conditions for the purpose of homogenization. We show in a systematic manner how the symmetry can be exploited to significantly reduce the complexity of the problem and the computational burden. This is especially relevant in inverse problems, when one needs to solve the partial differential equation (the primal problem) many times in an optimization algorithm. The main motivation of our study is inverse homogenization used to design architected composite materials with novel properties which are being fabricated at ever-increasing rates thanks to recent advances in additive manufacturing. For example, one may optimize the morphology of a two-phase composite unit cell to achieve isotropic homogenized properties with maximal bulk modulus and minimal Poisson ratio. Typically, the isotropy is enforced by applying constraints to the optimization problem. However, in two dimensions, one can alternatively optimize the morphology of an equilateral triangle and then rotate and reflect the triangle to form a space-filling D3-symmetric hexagonal unit cell that necessarily exhibits isotropic homogenized properties. One can further use this D3 symmetry to reduce the computational expense by performing the “unit strain” periodic boundary condition simulations on the single triangle symmetry sector rather than the sixfold larger hexagon. In this paper we use group representation theory to derive the necessary periodic boundary conditions on the symmetry sectors of unit cells. The developments are done in a general setting, and specialized to the two-dimensional dihedral symmetries of the abelian D2, i.e. orthotropic, square unit cell and the nonabelian D3, i.e. trigonal, hexagonal unit cell. We then demonstrate how this theory can be applied by evaluating the homogenized properties of a two-phase planar composite over the triangle symmetry sector

  1. Testing homogeneity in Weibull-regression models.

    PubMed

    Bolfarine, Heleno; Valença, Dione M

    2005-10-01

    In survival studies with families or geographical units it may be of interest to test whether such groups are homogeneous for given explanatory variables. In this paper we consider score-type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model yields survival times that, conditional on the random effect, have an accelerated failure time representation. The test statistic requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model and, in the uncensored situation, a closed form is obtained for the test statistic. A simulation study is used to compare the power of the tests. The proposed tests are applied to real data sets with censored observations.
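
    One common way to write such a mixing model, consistent with the description above (the notation here is generic rather than the authors'), is an accelerated failure time form with a shared group effect:

    \[ \log T_{ij} = \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + b_i + \sigma\,\varepsilon_{ij}, \qquad \varepsilon_{ij} \sim \text{standard extreme value}, \]

    where b_i is the random effect shared by all subjects in group i. Group homogeneity then corresponds to the null hypothesis that the variance of b_i is zero, which a score-type test can assess without ever fitting the random effect itself.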

  2. Designing and Creating Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    McMeen, George R.

    Designed to encourage the use of a defined methodology and careful planning in creating computer-assisted instructional programs, this paper describes the instructional design process, compares computer-assisted instruction (CAI) and programmed instruction (PI), and discusses pragmatic concerns in computer programming. Topics addressed include:…

  3. Evolution of rDNA in Nicotiana Allopolyploids: A Potential Link between rDNA Homogenization and Epigenetics

    PubMed Central

    Kovarik, Ales; Dadejova, Martina; Lim, Yoong K.; Chase, Mark W.; Clarkson, James J.; Knapp, Sandra; Leitch, Andrew R.

    2008-01-01

    Background The evolution and biology of rDNA have interested biologists for many years, in part, because of two intriguing processes: (1) nucleolar dominance and (2) sequence homogenization. We review patterns of evolution in rDNA in the angiosperm genus Nicotiana to determine consequences of allopolyploidy on these processes. Scope Allopolyploid species of Nicotiana are ideal for studying rDNA evolution because phylogenetic reconstruction of DNA sequences has revealed patterns of species divergence and their parents. From these studies we also know that polyploids formed over widely different timeframes (thousands to millions of years), enabling comparative and temporal studies of rDNA structure, activity and chromosomal distribution. In addition studies on synthetic polyploids enable the consequences of de novo polyploidy on rDNA activity to be determined. Conclusions We propose that rDNA epigenetic expression patterns established even in F1 hybrids have a material influence on the likely patterns of divergence of rDNA. It is the active rDNA units that are vulnerable to homogenization, which probably acts to reduce mutational load across the active array. Those rDNA units that are epigenetically silenced may be less vulnerable to sequence homogenization. Selection cannot act on these silenced genes, and they are likely to accumulate mutations and eventually be eliminated from the genome. It is likely that whole silenced arrays will be deleted in polyploids of 1 million years of age and older. PMID:18310159

  4. Homogeneous Characterization of Transiting Exoplanet Systems

    NASA Astrophysics Data System (ADS)

    Gomez Maqueo Chew, Yilen; Faedi, Francesca; Hebb, Leslie; Pollacco, Don; Stassun, Keivan; Ghezzi, Luan; Cargile, Phillip; Barros, Susana; Smalley, Barry; Mack, Claude

    2012-02-01

    We aim to obtain a homogeneous set of high resolution, high signal-to-noise (S/N) spectra for a large and diverse sample of stars with transiting planets, using the Kitt Peak 4-m echelle spectrograph for bright Northern targets (7.7 < V < …). From a homogeneous analysis of this high-quality dataset, we will be able to investigate any systematic uncertainties on the derived stellar properties and, consequently, on the planetary properties derived from the iterative combination of our spectral analysis with the best available radial velocity data and transit photometry, in order to derive a homogeneous set of properties for the transiting systems. The resulting consistent set of physical properties will allow us to further explore known correlations, e.g., between the core size of the planet and stellar metallicity, and to newly identify subtle relationships providing insight into our understanding of planetary formation, structure, and evolution. Our pilot study analyzing our WASP-13 HIRES spectrum (R ≈ 48,000, S/N > 150) in combination with high-precision light curves shows an improvement in the precision of the stellar parameters of 60% in T_eff, 75% in [Fe/H], 82% in M_star, and 73% in R_star, which translates into a 64% improvement in the precision of R_pl, and more than 2% in M_pl, relative to the discovery paper's values.

  5. Role of blockages in particle transport through homogeneous granular assemblies

    NASA Astrophysics Data System (ADS)

    Tejada, I. G.; Sibille, L.; Chareyre, B.

    2016-09-01

    This letter deals with the transport of particles through granular assemblies and, specifically, with the intermittent formation of blockages originating from collective and purely mechanical clogging of constrictions. We perform numerical experiments with a micro-hydromechanical model that is able to reproduce the complex interplay between the carrier fluid, the transported particles and the granular assembly. The probability distribution functions (PDFs) of the duration of blockages and displacements give the time scale on which the effect of blockages is erased and the advection-dispersion paradigm is valid. Our experiments show that these PDFs fit exponential laws, reinforcing the idea that the formation and destruction of blockages are homogeneous Poisson processes.
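
    Since exponentially distributed durations are the signature of a homogeneous Poisson process, a quick check of that claim on measured blockage durations might look like the sketch below; the data here are synthetic and the fitting and goodness-of-fit choices are illustrative rather than those of the cited letter.

```python
import numpy as np
from scipy import stats

# Synthetic blockage durations (seconds): a stand-in for measured data,
# generated from an exponential law as expected for a homogeneous Poisson
# blockage formation/destruction process.
rng = np.random.default_rng(1)
durations = rng.exponential(scale=2.5, size=500)

# Fit an exponential distribution (location fixed at 0) and test the fit.
loc, scale = stats.expon.fit(durations, floc=0)
ks = stats.kstest(durations, "expon", args=(loc, scale))
print(f"mean blockage duration ~ {scale:.2f} s, KS p-value = {ks.pvalue:.2f}")
```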

  6. Optimization of the Magnetic Field Homogeneity Area for Solenoid Type Magnets

    NASA Astrophysics Data System (ADS)

    Perepelkin, Eugene; Polyakova, Rima; Tarelkin, Aleksandr; Kovalenko, Alexander; Sysoev, Pavel; Sadovnikova, Marianne; Yudin, Ivan

    2018-02-01

    Homogeneous magnetic fields are important requisites in modern physics research. In this paper we discuss the problem of maximizing the magnetic field homogeneity area for solenoid magnets. We discuss the A-model and the B-model, which are basic types of solenoid magnets used to provide a homogeneous field, and methods for their optimization. We propose a C-model which can be used for the NICA project. We have also carried out a cross-check of the C-model with the parameters stated for the CLEO II detector.

  7. Multicomponent homogeneous alloys and method for making same

    DOEpatents

    Dutta, Partha S.; Miller, Thomas R.

    2003-09-02

    The present application discloses a method for preparing a homogeneous ternary or quaternary alloy from a quaternary melt. The method includes providing a family of phase diagrams for the quaternary melt which shows (i) composition/temperature data, (ii) tie lines connecting equilibrium liquid and solid compositions, and (iii) isotherms representing boundaries of a miscibility gap. Based on the family of phase diagrams, a quaternary melt composition and an alloy growth temperature are selected. A quaternary melt having the selected quaternary melt composition is provided and a ternary or quaternary alloy is grown from the quaternary melt at the selected alloy growth temperature. A method for making a homogeneous ternary or quaternary alloy from a ternary or quaternary melt is also disclosed, as are homogeneous quaternary single-crystal alloys which are substantially free from crystal defects and which have the formula A_xB_(1-x)C_yD_(1-y), x and y being the same or different and in the range of 0.001 to 0.999.

  8. Salty popcorn in a homogeneous low-dimensional toy model of holographic QCD

    NASA Astrophysics Data System (ADS)

    Elliot-Ripley, Matthew

    2017-04-01

    Recently, a homogeneous ansatz has been used to study cold dense nuclear matter in the Sakai-Sugimoto model of holographic QCD. To justify this homogeneous approximation we here investigate a homogeneous ansatz within a low-dimensional toy version of Sakai-Sugimoto to study finite baryon density configurations and compare it to full numerical solutions. We find the ansatz corresponds to enforcing a dyon salt arrangement in which the soliton solutions are split into half-soliton layers. Within this ansatz we find analogues of the proposed baryonic popcorn transitions, in which solutions split into multiple layers in the holographic direction. The homogeneous results are found to qualitatively match the full numerical solutions, lending confidence to the homogeneous approximations of the full Sakai-Sugimoto model. In addition, we find exact compact solutions in the high density, flat space limit which demonstrate the existence of further popcorn transitions to three layers and beyond.

  9. RVR Meander – Migration of meandering rivers in homogeneous and heterogeneous floodplains using physically-based bank erosion

    USDA-ARS?s Scientific Manuscript database

    The RVR Meander platform for computing long-term meandering-channel migration is presented, together with a method for planform migration based on the modeling of the streambank erosion processes of hydraulic erosion and mass failure. An application to a real-world river, with assumption of homogene...

  10. Temperature lowering program for homogeneous doping in flux growth

    NASA Astrophysics Data System (ADS)

    Qiwei, Wang; Shouquan, Jia

    1989-10-01

    Based on the mass conservation law and the Burton-Prim-Slichter equation, the temperature program for homogeneous doping in flux growth by slow cooling was derived. The effect of various factors, such as initial supersaturation, solution volume, growth kinetic coefficient and degree of mixing in the solution on growth rate, crystal size and temperature program is discussed in detail. Theoretical analysis shows that there is a critical crystal size above which homogeneous doping is impossible.
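
    For context, the Burton-Prim-Slichter relation referred to above connects the effective dopant distribution coefficient to the growth conditions; in its usual form (symbols follow the standard convention rather than the paper's notation):

    \[ k_{\mathrm{eff}} = \frac{k_0}{k_0 + (1 - k_0)\,e^{-v\delta/D}}, \]

    where k_0 is the equilibrium distribution coefficient, v the growth rate, δ the solute boundary-layer thickness and D the dopant diffusivity in the solution. Keeping the dopant incorporation, set jointly by k_eff and the evolving solution composition, constant throughout slow cooling is what the derived temperature-lowering program aims to achieve.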

  11. HOMOGENEOUS NUCLEAR POWER REACTOR

    DOEpatents

    King, L.D.P.

    1959-09-01

    A homogeneous nuclear power reactor utilizing forced circulation of the liquid fuel is described. The reactor does not require fuel handling outside of the reactor vessel during any normal operation, including complete shutdown to room temperature; the reactor is self-regulating under extreme operating conditions and is controlled by the thermal expansion of the liquid fuel. The liquid fuel utilized is a uranium, phosphoric acid, and water solution which requires no gas exhaust system or independent gas recombining system, thereby eliminating the handling of radiolytic gas.

  12. Does prescribed burning result in biotic homogenization of coastal heathlands?

    PubMed

    Velle, Liv Guri; Nilsen, Liv Sigrid; Norderhaug, Ann; Vandvik, Vigdis

    2014-05-01

    Biotic homogenization due to replacement of native biodiversity by widespread generalist species has been demonstrated in a number of ecosystems and taxonomic groups worldwide, causing growing conservation concern. Human disturbance is a key driver of biotic homogenization, suggesting potential conservation challenges in seminatural ecosystems, where anthropogenic disturbances such as grazing and burning are necessary for maintaining ecological dynamics and functioning. We test whether prescribed burning results in biotic homogenization in the coastal heathlands of north-western Europe, a seminatural landscape where extensive grazing and burning has constituted the traditional land-use practice over the past 6000 years. We compare the beta-diversity before and after fire at three ecological scales: within local vegetation patches, between wet and dry heathland patches within landscapes, and along a 470 km bioclimatic gradient. Within local patches, we found no evidence of homogenization after fire; species richness increased, and the species that entered the burnt Calluna stands were not widespread generalists but native grasses and herbs characteristic of the heathland system. At the landscape scale, we saw a weak homogenization as wet and dry heathland patches became more compositionally similar after fire. This was because of a decrease in habitat-specific species unique to either wet or dry habitats and postfire colonization by a set of heathland specialists that established in both habitat types. Along the bioclimatic gradient, species that increased after fire generally had more specific environmental requirements and narrower geographical distributions than the prefire flora, resulting in a biotic 'heterogenisation' after fire. Our study demonstrates that human disturbance does not necessarily cause biotic homogenization, but that continuation of traditional land-use practices can instead be crucial for the maintenance of the diversity and ecological

  13. An homogeneous seismological bulletin for the European-Mediterranean region

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Godey, S.; Mazet-Roux, G.

    2003-12-01

    The Euro-Mediterranean Seismological Center (EMSC) has been working on the production of an homogeneous seismological bulletin for the European-Mediterranean region. Such a catalogue could be useful as a reference for seismic hazard studies in the region. The 3 main objectives are to rapidly (with a few months' delay) provide a bulletin for events with M>3 occurring in the Euro-Med region (from Iceland in the North-West to the Arabic Peninsula in the South-East) by merging parametric data, to obtain better location accuracy in border regions and to improve data availability. The different tools (data merging, data exchange procedures...) have been developed during a 2-year EC-funded project carried out in collaboration with 10 partners including the ISC. The development period ended with the EC project by the end of 2002. Seismological data from 1998 to 2003 have been gathered from about 60 contributors (mainly national networks) providing data from 1500 stations. Up to now, nearly 5,000,000 arrival times for more than 340 000 events were reported (both values including redundancies). Numerous tests have already been performed to validate the automatic processing of the bulletins. To finalize this test period and start the operational production, we have decided to create a 2-year catalogue for 1998 and 1999 and to carry out a careful comparison of the results with the ISC catalogue. More precisely, the station distribution and data coverage obtained with the EMSC database is presented in parallel with the ones provided by the ISC. A statistical analysis is performed to assess the accuracy of the EMSC relocation. For specific regions (i.e. Poland, Italy, North Africa), we present the improvements made using a larger dataset than the one provided by national organisations. Finally, we will analyse the performances, point out possible weaknesses and present the necessary improvements before starting the operational production, probably by early 2004.

  14. Creating breakthroughs at 3M.

    PubMed

    von Hippel, E; Thomke, S; Sonnack, M

    1999-01-01

    Most senior managers want their product development teams to create break-throughs--new products that will allow their companies to grow rapidly and maintain high margins. But more often they get incremental improvements to existing products. That's partly because companies must compete in the short term. Searching for breakthroughs is expensive and time consuming; line extensions can help the bottom line immediately. In addition, developers simply don't know how to achieve breakthroughs, and there is usually no system in place to guide them. By the mid-1990s, the lack of such a system was a problem even for an innovative company like 3M. Then a project team in 3M's Medical-Surgical Markets Division became acquainted with a method for developing breakthrough products: the lead user process. The process is based on the fact that many commercially important products are initially thought of and even prototyped by "lead users"--companies, organizations, or individuals that are well ahead of market trends. Their needs are so far beyond those of the average user that lead users create innovations on their own that may later contribute to commercially attractive breakthroughs. The lead user process transforms the job of inventing breakthroughs into a systematic task of identifying lead users and learning from them. The authors explain the process and how the 3M project team successfully navigated through it. In the end, the team proposed three major new product lines and a change in the division's strategy that has led to the development of breakthrough products. And now several more divisions are using the process to break away from incrementalism.

  15. Homogeneous dielectric barrier discharges in atmospheric air and its influencing factor

    NASA Astrophysics Data System (ADS)

    Ran, Junxia; Li, Caixia; Ma, Dong; Luo, Haiyun; Li, Xiaowei

    2018-03-01

    A stable homogeneous dielectric barrier discharge (DBD) is obtained in an atmospheric 2-3 mm air gap. It is generated using a 1 kHz center-frequency high-voltage power supply between two plane-parallel electrodes with specific alumina ceramic plates as the dielectric barriers. The discharge characteristics are studied by measurement of the electrical discharge parameters and observation of the light emission phenomena. The results show a large single current pulse of about 200 μs duration in each voltage pulse, and the light emission is radially homogeneous and covers the entire surface of the two electrodes. The homogeneous discharge generated is a Townsend discharge. The influences of the applied barrier, its thickness, and its surface roughness on the transition of discharge modes are studied. The results show that it is difficult to produce a homogeneous discharge using smooth plates, or alumina plates with surface roughness Ra < 100 nm, even at a 1 mm air gap. If the alumina plate is too thin, the discharge transitions to a filamentary discharge. If it is too thick, the discharge is too weak to observe. With increasing air gap distance and applied voltage, the discharge can also transition from a homogeneous mode to a filamentary mode. In order to generate a stable and homogeneous DBD at a larger air gap, proper dielectric material, dielectric thickness, and dielectric surface roughness should be used, together with proper applied voltage amplitude and frequency.

  16. A Class of Homogeneous Scalar Tensor Cosmologies with a Radiation Fluid

    NASA Astrophysics Data System (ADS)

    Yazadjiev, Stoytcho S.

    We present a new class of exact homogeneous cosmological solutions with a radiation fluid for all scalar tensor theories. The solutions belong to Bianchi type VI_h cosmologies. Explicit examples of nonsingular homogeneous scalar tensor cosmologies are also given.

  17. Homogeneous Reduction of Carbon Dioxide with Hydrogen.

    PubMed

    Dong, Kaiwu; Razzaq, Rauf; Hu, Yuya; Ding, Kuiling

    2017-04-01

    Carbon dioxide (CO2), a key greenhouse gas produced from both anthropogenic and natural sources, has been recently considered to be an important C1 building-block for the synthesis of many industrial fuels and chemicals. Catalytic hydrogenation of CO2 using a homogeneous system is regarded as an efficient process for CO2 valorization. This approach leads to the direct products including formic acid (HCOOH), carbon monoxide (CO), methanol (MeOH), and methane (CH4). The hydrogenation of CO2 to CO followed by alkene carbonylation provides value-added compounds, which also avoids the tedious separation and transportation of toxic CO. Moreover, the reduction of CO2 with H2 in the presence of amines is of significance to attain fine chemicals through catalytic formylation and methylation reactions. The synthesis of higher alcohols and dialkoxymethane from CO2 and H2 has been demonstrated recently, which opens access to new molecular structures using CO2 as an important C1 source.

  18. Asymptotic Expansion Homogenization for Multiscale Nuclear Fuel Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hales, J. D.; Tonks, M. R.; Chockalingam, K.

    2015-03-01

    Engineering-scale nuclear fuel performance simulations can benefit by utilizing high-fidelity models running at a lower length scale. Lower length-scale models provide a detailed view of the material behavior that is used to determine the average material response at the macroscale. These lower length-scale calculations may provide insight into material behavior where experimental data is sparse or nonexistent. This multiscale approach is especially useful in the nuclear field, since irradiation experiments are difficult and expensive to conduct. The lower length-scale models complement the experiments by influencing the types of experiments required and by reducing the total number of experiments needed. This multiscale modeling approach is a central motivation in the development of the BISON-MARMOT fuel performance codes at Idaho National Laboratory. These codes seek to provide more accurate and predictive solutions for nuclear fuel behavior. One critical aspect of multiscale modeling is the ability to extract the relevant information from the lower length-scale simulations. One approach, the asymptotic expansion homogenization (AEH) technique, has proven to be an effective method for determining homogenized material parameters. The AEH technique prescribes a system of equations to solve at the microscale that are used to compute homogenized material constants for use at the engineering scale. In this work, we employ AEH to explore the effect of evolving microstructural thermal conductivity and elastic constants on nuclear fuel performance. We show that the AEH approach fits cleanly into the BISON and MARMOT codes and provides a natural, multidimensional homogenization capability.
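
    As a minimal sanity check of what such homogenization produces (a classical textbook limit, not a result from the report), the asymptotic expansion procedure applied to a one-dimensional periodic two-phase laminate recovers the harmonic mean for conduction across the layers:

    \[ k^{*} = \left(\frac{f_1}{k_1} + \frac{f_2}{k_2}\right)^{-1}, \]

    where f_1, f_2 are the phase volume fractions and k_1, k_2 the phase conductivities. The same machinery, driven by evolving microstructures, yields the full homogenized conductivity and elasticity tensors used at the engineering scale.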

  19. Effects of two-step homogenization on precipitation behavior of Al{sub 3}Zr dispersoids and recrystallization resistance in 7150 aluminum alloy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Zhanying (Key Laboratory for Anisotropy and Texture of Materials, Northeastern University, Shenyang 110819, China); Zhao, Gang

    2015-04-15

    The effect of two-step homogenization treatments on the precipitation behavior of Al3Zr dispersoids was investigated by transmission electron microscopy (TEM) in 7150 alloys. Two-step treatments with the first step in the temperature range of 300–400 °C followed by the second step at 470 °C were applied during homogenization. Compared with the conventional one-step homogenization, both a finer particle size and a higher number density of Al3Zr dispersoids were obtained with two-step homogenization treatments. The most effective dispersoid distribution was attained using the first step held at 300 °C. In addition, the two-step homogenization minimized the precipitate-free zones and greatly increased the number density of dispersoids near dendrite grain boundaries. The effect of two-step homogenization on recrystallization resistance of 7150 alloys with different Zr contents was quantitatively analyzed using the electron backscattered diffraction (EBSD) technique. It was found that the improved dispersoid distribution through the two-step treatment can effectively inhibit the recrystallization process during the post-deformation annealing for 7150 alloys containing 0.04–0.09 wt.% Zr, resulting in a remarkable reduction of the volume fraction and grain size of recrystallization grains. - Highlights: • Effect of two-step homogenization on Al3Zr dispersoids was investigated by TEM. • Finer and higher number of dispersoids obtained with two-step homogenization • Minimized the precipitate-free zones and improved the dispersoid distribution • Recrystallization resistance with varying Zr content was quantified by EBSD. • Effectively inhibit the recrystallization through two-step treatments in 7150 alloy.

  20. Hydrogen storage materials and method of making by dry homogenization

    DOEpatents

    Jensen, Craig M.; Zidan, Ragaiy A.

    2002-01-01

    Dry homogenized metal hydrides, in particular aluminum hydride compounds, are provided as a material for reversible hydrogen storage. The reversible hydrogen storage material comprises a dry homogenized material having transition metal catalytic sites on a metal aluminum hydride compound, or mixtures of metal aluminum hydride compounds. A method of making such reversible hydrogen storage materials by dry doping is also provided and comprises the steps of dry homogenizing metal hydrides by mechanical mixing, such as by crushing or ball milling a powder of a metal aluminum hydride with a transition metal catalyst. In another aspect of the invention, a method of powering a vehicle apparatus with the reversible hydrogen storage material is provided.

  1. Crucial effect of melt homogenization on the fragility of non-stoichiometric chalcogenides

    NASA Astrophysics Data System (ADS)

    Ravindren, Sriram; Gunasekera, K.; Tucker, Z.; Diebold, A.; Boolchand, P.; Micoulaut, M.

    2014-04-01

    The kinetics of homogenization of binary As_xSe_(100-x) melts in the As concentration range 0% < x < 50% are followed in Fourier Transform (FT)-Raman profiling experiments, and show that 2 g sized melts in the middle concentration range 20% < x < 30% take nearly two weeks to homogenize when starting materials are reacted at 700 °C. In glasses of proven homogeneity, we find molar volumes to vary non-monotonically with composition, and the fragility index M displays a broad global minimum in the 20% < x < 30% range of x wherein M < 20. We show that properly homogenized samples have a lower measured fragility when compared to larger under-reacted melts. The enthalpy of relaxation at Tg, ΔH_nr(x), shows a minimum in the 27% < x < 37% range. The super-strong nature of melt compositions in the 20% < x < 30% range suppresses melt diffusion at high temperatures, leading to the slow kinetics of melt homogenization.

  2. [Near infrared analysis of blending homogeneity of Chinese medicine formula particles based on moving window F test method].

    PubMed

    Yang, Chan; Xu, Bing; Zhang, Zhi-Qiang; Wang, Xin; Shi, Xin-Yuan; Fu, Jing; Qiao, Yan-Jiang

    2016-10-01

    Blending uniformity is essential to ensure the homogeneity of Chinese medicine formula particles within each batch. This study was based on the blending process of ebony spray-dried powder and dextrin (the proportion of dextrin was 10%), in which near infrared (NIR) diffuse reflectance spectra collected from six different sampling points were analyzed in combination with a moving window F test method in order to assess the uniformity of the blending process. The method was validated against the changes in citric acid content determined by HPLC. The results of the moving window F test showed that the blend of ebony spray-dried powder and dextrin was homogeneous during 200-300 r and segregated during 300-400 r. An advantage of this method is that the threshold value is defined statistically, not empirically, and thus does not suffer from the threshold ambiguities common to the moving block standard deviation (MBSD). This method could also be employed to monitor other blending processes of Chinese medicine powders on-line. Copyright© by the Chinese Pharmaceutical Association.
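
    A generic version of the moving-window F test idea described above can be sketched as follows: a univariate summary of each consecutive NIR spectrum (for example a predicted content value) is compared, window by window, against the variance of a reference window taken from a known-homogeneous stretch of the blend. The statistic, window length and synthetic data below are illustrative assumptions, not the exact procedure of the cited study.

```python
import numpy as np
from scipy import stats

def moving_window_f_test(signal, ref_var, window=10, alpha=0.05, ref_df=30):
    """Flag windows whose variance significantly exceeds a reference variance.

    `signal` is a univariate summary of consecutive spectra; `ref_var` is the
    variance of a reference window known to be homogeneous. Each moving window
    is tested with an F statistic (window variance / reference variance)
    against the critical value at level `alpha`."""
    crit = stats.f.ppf(1 - alpha, dfn=window - 1, dfd=ref_df - 1)
    flags = []
    for i in range(len(signal) - window + 1):
        f_stat = np.var(signal[i:i + window], ddof=1) / ref_var
        flags.append(f_stat > crit)
    return np.array(flags)

# Hypothetical content predictions: a homogeneous segment, then segregation
rng = np.random.default_rng(0)
content = np.concatenate([rng.normal(10.0, 0.1, 200), rng.normal(10.0, 0.5, 100)])
flags = moving_window_f_test(content, ref_var=0.1 ** 2)
print(f"{flags.sum()} of {flags.size} windows flagged as non-homogeneous")
```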

  3. Cascade catalysis for the homogeneous hydrogenation of CO2 to methanol.

    PubMed

    Huff, Chelsea A; Sanford, Melanie S

    2011-11-16

    This communication demonstrates the homogeneous hydrogenation of CO2 to CH3OH via cascade catalysis. Three different homogeneous catalysts, (PMe3)4Ru(Cl)(OAc), Sc(OTf)3, and (PNN)Ru(CO)(H), operate in sequence to promote this transformation.

  4. Nature of low-frequency noise in homogeneous semiconductors

    PubMed Central

    Palenskis, Vilius; Maknys, Kęstutis

    2015-01-01

    This report deals with 1/f noise in homogeneous classical semiconductor samples based on silicon. We perform detailed calculations of the resistance fluctuations of a silicon sample due to both (a) changes in the charge carrier number arising from capture–emission processes and (b) the screening effect of the negatively charged centers, and show that the proportionality of the noise level to the square of the mobility appears as a presentation parameter and is not due to mobility fluctuations. The obtained calculation results explain well the observed experimental results of 1/f noise in Si, Ge and GaAs, and exclude mobility fluctuations as the origin of 1/f noise in these materials and their devices. It is also shown how to find, from the experimental 1/f noise results, the effective number of defects responsible for this noise in the measured frequency range. PMID:26674184

  5. jsc2018m000297_Investigation_Seeks_to_Create_Self-Assembling_Materials-MP4

    NASA Image and Video Library

    2018-05-14

    Investigation Seeks to Create Self-Assembling Materials------ As we travel farther into space, clever solutions to problems like engine part malfunctions and other possible mishaps will be a vital part of the planning process. 3D printing, or additive manufacturing, is an emerging technology that may be used to custom-create mission-critical parts. An integral piece of this process is understanding how particle shape, size distribution and packing behavior affect the manufacturing process. The Advanced Colloids Experiment-Temperature-7 investigation (ACE-T-7) aboard the International Space Station explores the feasibility of creating self-assembling microscopic particles for use in the manufacturing of materials during spaceflight. Read more about ACE-T-7 here: https://www.nasa.gov/feature/investigation-seeks-to-create-self-assembling-materials

  6. Integration of Modelling and Graphics to Create an Infrared Signal Processing Test Bed

    NASA Astrophysics Data System (ADS)

    Sethi, H. R.; Ralph, John E.

    1989-03-01

    The work reported in this paper was carried out as part of a contract with MoD (PE) UK. It considers the problems associated with realistic modelling of a passive infrared system in an operational environment. Ideally all aspects of the system and environment should be integrated into a complete end-to-end simulation, but in the past limited computing power has prevented this. Recent developments in workstation technology and the increasing availability of parallel processing techniques make end-to-end simulation possible. However, the complexity and speed of such simulations means difficulties for the operator in controlling the software and understanding the results. These difficulties can be greatly reduced by providing an extremely user-friendly interface and a very flexible, high power, high resolution colour graphics capability. Most system modelling is based on separate software simulation of the individual components of the system itself and its environment. These component models may have their own characteristic inbuilt assumptions and approximations, may be written in the language favoured by the originator and may have a wide variety of input and output conventions and requirements. The models and their limitations need to be matched to the range of conditions appropriate to the operational scenario. A comprehensive set of data bases needs to be generated by the component models and these data bases must be made readily available to the investigator. Performance measures need to be defined and displayed in some convenient graphics form. Some options are presented for combining available hardware and software to create an environment within which the models can be integrated, and which provide the required man-machine interface, graphics and computing power. The impact of massively parallel processing and artificial intelligence will be discussed. Parallel processing will make real-time end-to-end simulation possible and will greatly improve the

  7. Biotic homogenization of three insect groups due to urbanization.

    PubMed

    Knop, Eva

    2016-01-01

    Cities are growing rapidly and are thereby expected to cause a large-scale global biotic homogenization. Evidence for the homogenization hypothesis is mostly derived from plants and birds, whereas arthropods have so far been neglected. Here, I tested the homogenization hypothesis with three insect indicator groups, namely true bugs, leafhoppers, and beetles. In particular, I was interested in whether insect species community composition differs between urban and rural areas, whether communities are more similar between cities than between rural areas, and whether the observed pattern is explained by true species turnover, by species diversity gradients and geographic distance, or by non-native or specialist species, respectively. I analyzed insect species communities sampled on birch trees in a total of six Swiss cities and six rural areas nearby. In all indicator groups, urban and rural community composition was significantly dissimilar due to native species turnover. Further, for bug and leafhopper communities, I found evidence for large-scale homogenization due to urbanization, which was driven by reduced species turnover of specialist species in cities. Species turnover of beetle communities was similar between cities and rural areas. Interestingly, when specialist species of beetles were excluded from the analyses, cities were more dissimilar than rural areas, suggesting biotic differentiation of beetle communities in cities. Non-native species did not affect species turnover of the insect groups. However, given that non-native arthropod species are increasing rapidly, their homogenizing effect might be detected more often in the future. Overall, the results show that urbanization has a negative large-scale impact on the diversity of specialist species of the investigated insect groups. Specific measures in cities targeted at increasing the persistence of specialist species typical for the respective biogeographic region could help to stop the loss of biodiversity. © 2015 John Wiley & Sons Ltd.
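
    The urban-versus-rural similarity comparison at the heart of this design can be illustrated with a simple pairwise similarity calculation; the species sets below are hypothetical, and the Jaccard index is just one of several similarity measures that could be used.

```python
import numpy as np
from itertools import combinations

def jaccard(a, b):
    """Jaccard similarity between two species sets: shared / total species."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def mean_pairwise_similarity(communities):
    """Average Jaccard similarity over all pairs of sites."""
    return np.mean([jaccard(a, b) for a, b in combinations(communities, 2)])

# Hypothetical species lists (illustrative only, not data from the study)
urban = [{"sp1", "sp2", "sp3"}, {"sp1", "sp2", "sp4"}, {"sp1", "sp2", "sp3", "sp5"}]
rural = [{"sp1", "sp6", "sp7"}, {"sp2", "sp8", "sp9"}, {"sp3", "sp6", "sp10"}]

# Homogenization would show up as urban sites being, on average,
# more similar to one another than the rural sites are.
print(mean_pairwise_similarity(urban), mean_pairwise_similarity(rural))
```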

  8. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    PubMed

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

    Cell disruption is a key unit operation to make valuable, intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, such as measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study we implemented a methodology, which we had originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization, and finally investigated this unit operation in more detail by a multivariate approach. The combination of HPLC and automated data analysis is a valuable, novel tool for monitoring and evaluating cell disruption processes: it can be used both in upstream (USP) and downstream processing (DSP), can be implemented at-line, gives results within minutes after sampling, and does not need manual intervention.

  9. Castaways can't be choosers - Homogenization of rafting assemblages on floating seaweeds

    NASA Astrophysics Data System (ADS)

    Gutow, Lars; Beermann, Jan; Buschbaum, Christian; Rivadeneira, Marcelo M.; Thiel, Martin

    2015-01-01

    After detachment from benthic habitats, the epibiont assemblages on floating seaweeds undergo substantial changes, but little is known regarding whether succession varies among different seaweed species. Given that floating algae may represent a limiting habitat in many regions, rafting organisms may be unselective and colonize any available seaweed patch at the sea surface. This process may homogenize rafting assemblages on different seaweed species, which our study examined by comparing the assemblages on benthic and floating individuals of the fucoid seaweeds Fucus vesiculosus and Sargassum muticum in the northern Wadden Sea (North Sea). Species richness was about twice as high on S. muticum as on F. vesiculosus, both on benthic and floating individuals. In both seaweed species benthic samples were more diverse than floating samples. However, the species composition differed significantly only between benthic thalli, but not between floating thalli of the two seaweed species. Separate analyses of sessile and mobile epibionts showed that the homogenization of rafting assemblages was mainly caused by mobile species. Among these, grazing isopods from the genus Idotea reached extraordinarily high densities on the floating samples from the northern Wadden Sea, suggesting that the availability of seaweed rafts was indeed limiting. Enhanced break-up of algal rafts associated with intense feeding by abundant herbivores might force rafters to recolonize benthic habitats. These colonization processes may enhance successful dispersal of rafting organisms and thereby contribute to population connectivity between sink populations in the Wadden Sea and source populations from up-current regions.

  10. Data on the effect of homogenization heat treatments on the cast structure and tensile properties of alloy 718Plus in the presence of grain-boundary elements.

    PubMed

    Hosseini, Seyed Ali; Madar, Karim Zangeneh; Abbasi, Seyed Mehdi

    2017-08-01

    The segregation of elements during solidification and the direct formation of destructive phases such as Laves directly from the liquid result in inhomogeneity of the cast structure and degradation of mechanical properties. Homogenization heat treatment is one of the ways to eliminate the destructive Laves phase from the cast structure of superalloys such as 718Plus. The collected data present the effect of homogenization treatment conditions on the cast structure, hardness, and tensile properties of alloy 718Plus in the presence of boron and zirconium additives. For this purpose, five alloys with different contents of boron and zirconium were cast by the VIM/VAR process and then homogenized under various conditions. Microstructural investigation by OM and SEM and phase analysis by XRD were carried out, and hardness and tensile tests were then performed on the homogenized alloys.

  11. Dark energy homogeneity in general relativity: Are we applying it correctly?

    NASA Astrophysics Data System (ADS)

    Duniya, Didam G. A.

    2016-04-01

    Thus far, there does not appear to be an agreed (or adequate) definition of homogeneous dark energy (DE). This paper seeks to define a valid, adequate homogeneity condition for DE. Firstly, it is shown that as long as w_x ≠ -1, DE must have perturbations. It is then argued, independent of w_x, that a correct definition of homogeneous DE is one whose density perturbation vanishes in the comoving gauge and, hence, in the DE rest frame. Using phenomenological DE, the consequence of this approach is then investigated in the observed galaxy power spectrum, with the power spectrum being normalized on small scales at the present epoch z = 0. It is found that, for high magnification bias, relativistic corrections in the galaxy power spectrum are able to distinguish the concordance model from both a homogeneous DE and a clustering DE on super-horizon scales.

  12. Spatial variability in acoustic backscatter as an indicator of tissue homogenate production in pulsed cavitational ultrasound therapy.

    PubMed

    Parsons, Jessica E; Cain, Charles A; Fowlkes, J Brian

    2007-03-01

    Spatial variability in acoustic backscatter is investigated as a potential feedback metric for assessment of lesion morphology during cavitation-mediated mechanical tissue disruption ("histotripsy"). A 750-kHz annular array was aligned confocally with a 4.5-MHz passive backscatter receiver during ex vivo insonation of porcine myocardium. Various exposure conditions were used to elicit a range of damage morphologies and backscatter characteristics [pulse duration = 14 μs, pulse repetition frequency (PRF) = 0.07-3.1 kHz, average I_SPPA = 22-44 kW/cm²]. Variability in backscatter spatial localization was quantified by tracking the lag required to achieve peak correlation between sequential RF A-lines received. Mean spatial variability was observed to be significantly higher when damage morphology consisted of mechanically disrupted tissue homogenate versus mechanically intact coagulation necrosis (2.35 ± 1.59 mm versus 0.067 ± 0.054 mm, p < 0.025). Statistics from these variability distributions were used as the basis for selecting a threshold variability level to identify the onset of homogenate formation via an abrupt, sustained increase in spatially dynamic backscatter activity. Specific indices indicative of the state of the homogenization process were quantified as a function of acoustic input conditions. The prevalence of backscatter spatial variability was observed to scale with the amount of homogenate produced for various PRFs and acoustic intensities.
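
    A minimal sketch of the lag-tracking step described above, using standard cross-correlation (Python/NumPy). The array names, sampling rate, and sound speed are illustrative assumptions rather than the authors' acquisition parameters.

      # Sketch: lag of peak cross-correlation between two sequential RF A-lines,
      # converted to an axial displacement. Sampling rate and sound speed are assumed.
      import numpy as np

      def peak_correlation_shift_mm(a_prev, a_next, fs_hz=50e6, c_m_s=1540.0):
          """Axial displacement (mm) at which two A-lines are maximally correlated."""
          a = (a_prev - a_prev.mean()) / a_prev.std()
          b = (a_next - a_next.mean()) / a_next.std()
          xcorr = np.correlate(a, b, mode="full")
          lag_samples = np.argmax(xcorr) - (len(a) - 1)
          # Pulse-echo geometry: depth shift = lag * c / (2 * fs).
          return 1e3 * abs(lag_samples) * c_m_s / (2.0 * fs_hz)

      # Synthetic check: the second line is the first shifted by 40 samples (about 0.6 mm here).
      rng = np.random.default_rng(0)
      line1 = rng.standard_normal(2048)
      line2 = np.roll(line1, 40)
      print(peak_correlation_shift_mm(line1, line2))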

  13. Direction of unsaturated flow in a homogeneous and isotropic hillslope

    USGS Publications Warehouse

    Lu, Ning; Kaya, Basak Sener; Godt, Jonathan W.

    2011-01-01

    The distribution of soil moisture in a homogeneous and isotropic hillslope is a transient, variably saturated physical process controlled by rainfall characteristics, hillslope geometry, and the hydrological properties of the hillslope materials. The major driving mechanisms for moisture movement are gravity and gradients in matric potential. The latter is solely controlled by gradients of moisture content. In a homogeneous and isotropic saturated hillslope, absent a gradient in moisture content and under the driving force of gravity with a constant pressure boundary at the slope surface, flow is always in the lateral downslope direction, under either transient or steady state conditions. However, under variably saturated conditions, both gravity and moisture content gradients drive fluid motion, leading to complex flow patterns. In general, the flow field near the ground surface is variably saturated and transient, and the direction of flow could be laterally downslope, laterally upslope, or vertically downward. Previous work has suggested that prevailing rainfall conditions are sufficient to completely control these flow regimes. This work, however, shows that under time-varying rainfall conditions, vertical, downslope, and upslope lateral flow can concurrently occur at different depths and locations within the hillslope. More importantly, we show that the state of wetting or drying in a hillslope defines the temporal and spatial regimes of flow and when and where laterally downslope and/or laterally upslope flow occurs.

  14. Direction of unsaturated flow in a homogeneous and isotropic hillslope

    USGS Publications Warehouse

    Lu, N.; Kaya, B.S.; Godt, J.W.

    2011-01-01

    The distribution of soil moisture in a homogeneous and isotropic hillslope is a transient, variably saturated physical process controlled by rainfall characteristics, hillslope geometry, and the hydrological properties of the hillslope materials. The major driving mechanisms for moisture movement are gravity and gradients in matric potential. The latter is solely controlled by gradients of moisture content. In a homogeneous and isotropic saturated hillslope, absent a gradient in moisture content and under the driving force of gravity with a constant pressure boundary at the slope surface, flow is always in the lateral downslope direction, under either transient or steady state conditions. However, under variably saturated conditions, both gravity and moisture content gradients drive fluid motion, leading to complex flow patterns. In general, the flow field near the ground surface is variably saturated and transient, and the direction of flow could be laterally downslope, laterally upslope, or vertically downward. Previous work has suggested that prevailing rainfall conditions are sufficient to completely control these flow regimes. This work, however, shows that under time-varying rainfall conditions, vertical, downslope, and upslope lateral flow can concurrently occur at different depths and locations within the hillslope. More importantly, we show that the state of wetting or drying in a hillslope defines the temporal and spatial regimes of flow and when and where laterally downslope and/or laterally upslope flow occurs. Copyright 2011 by the American Geophysical Union.

  15. Effect of dynamic high pressure homogenization on the aggregation state of soy protein.

    PubMed

    Keerati-U-Rai, Maneephan; Corredig, Milena

    2009-05-13

    Although soy proteins are often employed as functional ingredients in oil-water emulsions, very little is known about the aggregation state of the proteins in solution and whether any changes occur to soy protein dispersions during homogenization. The effect of dynamic high pressure homogenization on the aggregation state of the proteins was investigated using microdifferential scanning calorimetry and high performance size exclusion chromatography coupled with multiangle laser light scattering. Soy protein isolates as well as glycinin and beta-conglycinin fractions were prepared from defatted soy flakes and redispersed in 50 mM sodium phosphate buffer at pH 7.4. The dispersions were then subjected to homogenization at two different pressures, 26 and 65 MPa. The results demonstrated that dynamic high pressure homogenization causes changes in the supramolecular structure of the soy proteins. Both beta-conglycinin and glycinin samples had an increased temperature of denaturation after homogenization. The chromatographic elution profile showed a reduction in the aggregate concentration with homogenization pressure for beta-conglycinin and an increase in the size of the soluble aggregates for glycinin and soy protein isolate.

  16. A rapid mechanism to remobilize and homogenize highly crystalline magma bodies.

    PubMed

    Burgisser, Alain; Bergantz, George W

    2011-03-10

    The largest products of magmatic activity on Earth, the great bodies of granite and their corresponding large eruptions, have a dual nature: homogeneity at the large scale and spatial and temporal heterogeneity at the small scale. This duality calls for a mechanism that selectively removes the large-scale heterogeneities associated with the incremental assembly of these magmatic systems and yet occurs rapidly despite crystal-rich, viscous conditions seemingly resistant to mixing. Here we show that a simple dynamic template can unify a wide range of apparently contradictory observations from both large plutonic bodies and volcanic systems by a mechanism of rapid remobilization (unzipping) of highly viscous crystal-rich mushes. We demonstrate that this remobilization can lead to rapid overturn and produce the observed juxtaposition of magmatic materials with very disparate ages and complex chemical zoning. What distinguishes our model is the recognition that the process has two stages. Initially, a stiff mushy magma is reheated from below, producing a reduction in crystallinity that leads to the growth of a subjacent buoyant mobile layer. When the thickening mobile layer becomes sufficiently buoyant, it penetrates the overlying viscous mushy magma. This second stage rapidly exports homogenized material from the lower mobile layer to the top of the system, and leads to partial overturn within the viscous mush itself as an additional mechanism of mixing. Model outputs illustrate that unzipping can rapidly produce large amounts of mobile magma available for eruption. The agreement between calculated and observed unzipping rates for historical eruptions at Pinatubo and at Montserrat demonstrates the general applicability of the model. This mechanism furthers our understanding of both the formation of periodically homogenized plutons (crust building) and of ignimbrites by large eruptions.

  17. Suppression of turbulent energy cascade due to phase separation in homogenous binary mixture fluid

    NASA Astrophysics Data System (ADS)

    Takagi, Youhei; Okamoto, Sachiya

    2015-11-01

    When a multi-component fluid mixture becomes thermophysically unstable by quenching from a well-melted condition, phase separation due to spinodal decomposition occurs and a self-organized structure is formed. During phase separation, free energy is consumed for the structure formation. In our previous report, phase separation in homogeneous turbulence was numerically simulated and the coarsening process of phase separation was discussed. In this study, we extended our numerical model to a high-Schmidt-number fluid corresponding to an actual polymer solution. The governing equations were the continuity, Navier-Stokes, and Cahn-Hilliard equations, as in our previous report. The flow field was isotropic homogeneous turbulence, and the dimensionless parameters in the Cahn-Hilliard equation were estimated based on the thermophysical condition of the binary mixture. From the numerical results, it was found that the turbulent energy cascade was drastically suppressed in the inertial subrange by phase separation for the high-Schmidt-number flow. Using the identification of turbulent and phase-separation structures, we discussed the relation between the total energy balance and the structure-formation processes. This study is financially supported by a Grant-in-Aid for Young Scientists (B) (No. T26820045) from the Ministry of Education, Culture, Sports, Science and Technology of Japan.

  18. Non-homogeneous flow profiles in sheared bacterial suspensions

    NASA Astrophysics Data System (ADS)

    Samanta, Devranjan; Cheng, Xiang

    Bacterial suspensions under shear exhibit interesting rheological behaviors, including a remarkable "superfluidic" state with vanishing viscosity at low shear rates. Theoretical studies have shown that this "superfluidic" state is linked with non-homogeneous shear flows, which are induced by coupling between the nematic order of active fluids and the hydrodynamics of shear flows. However, although the bulk rheology of bacterial suspensions has been studied experimentally, shear profiles within bacterial suspensions have not been explored so far. Here, we experimentally investigate the flow behaviors of E. coli suspensions under planar oscillatory shear. Using confocal microscopy and PIV, we measure velocity profiles across the gap between two shear plates. We find that with increasing shear rates, high-concentration bacterial suspensions exhibit an array of non-homogeneous flow behaviors such as yield-stress flows and shear banding. We show that these non-homogeneous flows are due to collective motion of the bacterial suspensions. The phase diagram of sheared bacterial suspensions is systematically mapped as a function of shear rate and bacterial concentration. Our experiments provide new insights into the rheology of bacterial suspensions and shed light on shear-induced dynamics of active fluids.

  19. Selecting for extinction: nonrandom disease-associated extinction homogenizes amphibian biotas.

    PubMed

    Smith, Kevin G; Lips, Karen R; Chase, Jonathan M

    2009-10-01

    Studying the patterns in which local extinctions occur is critical to understanding how extinctions affect biodiversity at local, regional and global spatial scales. To understand the importance of patterns of extinction at a regional spatial scale, we use data from extirpations associated with a widespread pathogenic agent of amphibian decline, Batrachochytrium dendrobatidis (Bd) as a model system. We apply novel null model analyses to these data to determine whether recent extirpations associated with Bd have resulted in selective extinction and homogenization of diverse tropical American amphibian biotas. We find that Bd-associated extinctions in this region were nonrandom and disproportionately, but not exclusively, affected low-occupancy and endemic species, resulting in homogenization of the remnant amphibian fauna. The pattern of extirpations also resulted in phylogenetic homogenization at the family level and ecological homogenization of reproductive mode and habitat association. Additionally, many more species were extirpated from the region than would be expected if extirpations occurred randomly. Our results indicate that amphibian declines in this region are an extinction filter, reducing regional amphibian biodiversity to highly similar relict assemblages and ultimately causing amplified biodiversity loss at regional and global scales.
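
    The null-model comparison described above can be illustrated with a small sketch (Python/NumPy). The site-by-species matrices and the random-extirpation null below are hypothetical stand-ins, not the authors' published procedure: mean pairwise Jaccard similarity among sites is computed before and after the observed extirpations, and the observed change is compared with changes produced by assigning the same number of local extinctions at random.

      # Sketch of a null-model test for extinction-driven homogenization.
      # Occurrence matrices and the random-extirpation null are illustrative assumptions.
      import numpy as np
      from itertools import combinations

      def mean_jaccard(occ):
          """Mean pairwise Jaccard similarity among rows (sites) of a 0/1 matrix."""
          sims = []
          for i, j in combinations(range(occ.shape[0]), 2):
              a, b = occ[i].astype(bool), occ[j].astype(bool)
              union = np.logical_or(a, b).sum()
              sims.append(np.logical_and(a, b).sum() / union if union else 1.0)
          return float(np.mean(sims))

      def null_similarity_changes(occ_before, n_extirpations, n_iter=999, seed=1):
          """Similarity changes when the same number of extirpations is assigned at random."""
          rng = np.random.default_rng(seed)
          presences = np.argwhere(occ_before == 1)
          base = mean_jaccard(occ_before)
          changes = []
          for _ in range(n_iter):
              occ = occ_before.copy()
              idx = rng.choice(len(presences), size=n_extirpations, replace=False)
              occ[tuple(presences[idx].T)] = 0
              changes.append(mean_jaccard(occ) - base)
          return np.array(changes)

      # Toy data: losing the two range-restricted species (last columns) homogenizes the sites.
      before = np.array([[1,1,1,0,1,0], [1,1,0,1,0,1], [1,0,1,1,0,0], [1,1,1,1,1,1]])
      after  = np.array([[1,1,1,0,0,0], [1,1,0,1,0,0], [1,0,1,1,0,0], [1,1,1,1,0,0]])
      observed = mean_jaccard(after) - mean_jaccard(before)   # > 0 means homogenization
      null = null_similarity_changes(before, int(before.sum() - after.sum()))
      p = (np.sum(null >= observed) + 1) / (len(null) + 1)    # one-sided p-value
      print(observed, p)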

  20. Creating Science Picture Books for an Authentic Audience

    ERIC Educational Resources Information Center

    DeFauw, Danielle L.; Saad, Klodia

    2014-01-01

    This article presents an authentic writing opportunity to help ninth-grade students use the writing process in a science classroom to write and illustrate picture books for fourth-grade students to demonstrate and share their understanding of a biology unit on cells. By creating a picture book, students experience the writing process, understand…

  1. Modification of homogeneous and isotropic turbulence by solid particles

    NASA Astrophysics Data System (ADS)

    Hwang, Wontae

    2005-12-01

    Particle-laden flows are prevalent in natural and industrial environments. Dilute loadings of small, heavy particles have been observed to attenuate the turbulence levels of the carrier-phase flow, up to 80% in some cases. We attempt to increase the physical understanding of this complex phenomenon by studying the interaction of solid particles with the most fundamental type of turbulence, which is homogeneous and isotropic with no mean flow. A flow facility was developed that could create air turbulence in a nearly-spherical chamber by means of synthetic jet actuators mounted on the corners. Loudspeakers were used as the actuators. Stationary turbulence and natural decaying turbulence were investigated using two-dimensional particle image velocimetry for the base flow qualification. Results indicated that the turbulence was fairly homogeneous throughout the measurement domain and very isotropic, with small mean flow. The particle-laden flow experiments were conducted in two different environments, the lab and in micro-gravity, to examine the effects of particle wakes and flow structure distortion caused by settling particles. The laboratory experiments showed that glass particles with diameters on the order of the turbulence Kolmogorov length scale attenuated the fluid turbulent kinetic energy (TKE) and dissipation rate with increasing particle mass loadings. The main source of fluid TKE production in the chamber was the speakers, but the loss of potential energy of the settling particles also resulted in a significant amount of production of extra TKE. The sink of TKE in the chamber was due to the ordinary fluid viscous dissipation and extra dissipation caused by particles. This extra dissipation could be divided into "unresolved" dissipation caused by local velocity disturbances in the vicinity of the small particles and dissipation caused by large-scale flow distortions from particle wakes and particle clusters. The micro-gravity experiments in NASA's KC-135

  2. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation

    PubMed Central

    Tang, Liang; Cheng, Pengle

    2017-01-01

    Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic–plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis. PMID:28746390

  3. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation.

    PubMed

    Tang, Liang; Zhang, Jinjie; Cheng, Pengle

    2017-01-01

    Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic-plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis.

  4. Numerical Computation of Homogeneous Slope Stability

    PubMed Central

    Xiao, Shuangshuang; Li, Kemin; Ding, Xiaohua; Liu, Tong

    2015-01-01

    To simplify the computational process of homogeneous slope stability analysis, improve computational accuracy, and find multiple potential slip surfaces of a complex geometric slope, this study utilized the limit equilibrium method to derive expressions for the overall and partial factors of safety. The study transformed the search for the minimum factor of safety (FOS) into a constrained nonlinear programming problem and applied an exhaustive method (EM) and a particle swarm optimization algorithm (PSO) to this problem. In simple slope examples, the computational results using the EM and PSO were close to those obtained using other methods. Compared to the EM, the PSO had a small computation error and a significantly shorter computation time. As a result, the PSO could precisely calculate the slope FOS with high efficiency. The example of the multistage slope analysis indicated that this slope had two potential slip surfaces, with factors of safety of 1.1182 and 1.1560, respectively. The differences between these and the minimum FOS (1.0759) were small, but the positions of the slip surfaces were completely different from the critical slip surface (CSS). PMID:25784927
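
    A generic particle swarm minimizer of the kind used above can be sketched in a few lines (Python/NumPy). The quadratic toy objective stands in for the factor-of-safety expression, which is not reproduced in the abstract, so the example is illustrative only.

      # Minimal PSO sketch for minimizing a bounded objective, standing in for the
      # search over slip-surface parameters that minimizes the factor of safety (FOS).
      import numpy as np

      def pso_minimize(f, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
          rng = np.random.default_rng(seed)
          lo, hi = np.array(bounds, dtype=float).T
          x = rng.uniform(lo, hi, size=(n_particles, len(bounds)))
          v = np.zeros_like(x)
          pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
          g = pbest[np.argmin(pbest_val)].copy()
          for _ in range(n_iter):
              r1, r2 = rng.random((2, *x.shape))
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
              x = np.clip(x + v, lo, hi)                    # keep particles inside bounds
              vals = np.array([f(p) for p in x])
              better = vals < pbest_val
              pbest[better], pbest_val[better] = x[better], vals[better]
              g = pbest[np.argmin(pbest_val)].copy()
          return g, float(pbest_val.min())

      # Toy objective: a smooth bowl standing in for FOS(slip-surface parameters).
      toy_fos = lambda p: 1.05 + 0.01 * np.sum((p - np.array([3.0, 1.5])) ** 2)
      best_x, best_fos = pso_minimize(toy_fos, bounds=[(0, 10), (0, 5)])
      print(best_x, best_fos)   # expected near [3.0, 1.5] with FOS close to 1.05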

  5. Stereo- and Regioselective Phyllobilane Oxidation in Leaf Homogenates of the Peace Lily (Spathiphyllum wallisii): Hypothetical Endogenous Path to Yellow Chlorophyll Catabolites

    PubMed Central

    Vergeiner, Clemens; Ulrich, Markus; Li, Chengjie; Liu, Xiujun; Müller, Thomas; Kräutler, Bernhard

    2015-01-01

    In senescent leaves, chlorophyll typically is broken down to colorless and essentially photo-inactive phyllobilanes, which are linear tetrapyrroles classified as “nonfluorescent” chlorophyll catabolites (NCCs) and dioxobilane-type NCCs (DNCCs). In homogenates of senescent leaves of the tropical evergreen Spathiphyllum wallisii, when left at room temperature and extracted with methanol, the major endogenous, naturally formed NCC was regio- and stereoselectively oxidized (in part) to a mixture of its 15-hydroxy and 15-methoxy derivative. In the absence of methanol in the extract, only the 15-OH-NCC was observed. The endogenous oxidation process depended upon molecular oxygen. It was inhibited by carbon monoxide, as well as by keeping the leaf homogenate and extract at low temperatures. The remarkable “oxidative activity” was inactivated by heating the homogenate for 10 min at 70 °C. Upon addition of a natural epimeric NCC (epiNCC) to the homogenate of senescent or green Sp. wallisii leaves at room temperature, the exogenous epiNCC was oxidized regio- and stereoselectively to 15-OH-epiNCC and 15-OMe-epiNCC. The identical two oxidized epiNCCs were also obtained as products of the oxidation of epiNCC with dicyanodichlorobenzoquinone (DDQ). Water elimination from 15-OH-epiNCC occurred readily and gave a known “yellow” chlorophyll catabolite (YCC). The endogenous oxidation process, described here, may represent the elusive natural path from the colorless NCCs to yellow and pink coloured phyllobilins, which were found in (extracts of) some senescent leaves. PMID:25382809

  6. Microstructure and Mechanical Properties of the As-Cast and As-Homogenized Mg-Zn-Sn-Mn-Ca Alloy Fabricated by Semicontinuous Casting

    PubMed Central

    Lu, Xing; Zhao, Guoqun; Zhou, Jixue; Zhang, Cunsheng; Yu, Junquan

    2018-01-01

    In this paper, a new type of low-cost Mg-3.36Zn-1.06Sn-0.33Mn-0.27Ca (wt %) alloy ingot with a diameter of 130 mm and a length of 4800 mm was fabricated by semicontinuous casting. The microstructure and mechanical properties in different areas of the ingot were investigated. The microstructure and mechanical properties of the alloy under different one-step and two-step homogenization conditions were studied. For the as-cast alloy, the average grain size and the second-phase size decrease from the center to the surface of the ingot, while the area fraction of the second phase increases gradually. At one-half of the radius of the ingot, the alloy presents the optimum comprehensive mechanical properties along the axial direction, which is attributed to the combined effect of relatively small grain size, low second-phase fraction, and uniform microstructure. For the as-homogenized alloy, the optimum two-step homogenization process parameters were determined as 340 °C × 10 h + 520 °C × 16 h. After the optimum homogenization, the proper size and morphology of the CaMgSn phase are conducive to improving the microstructure uniformity and the mechanical properties of the alloy. In addition, the yield strength of the alloy is reduced by 20.7% and the elongation is increased by 56.3%, which is more favorable for the subsequent hot deformation processing. PMID:29710818

  7. Microstructure and Mechanical Properties of the As-Cast and As-Homogenized Mg-Zn-Sn-Mn-Ca Alloy Fabricated by Semicontinuous Casting.

    PubMed

    Lu, Xing; Zhao, Guoqun; Zhou, Jixue; Zhang, Cunsheng; Yu, Junquan

    2018-04-29

    In this paper, a new type of low-cost Mg-3.36Zn-1.06Sn-0.33Mn-0.27Ca (wt %) alloy ingot with a diameter of 130 mm and a length of 4800 mm was fabricated by semicontinuous casting. The microstructure and mechanical properties in different areas of the ingot were investigated. The microstructure and mechanical properties of the alloy under different one-step and two-step homogenization conditions were studied. For the as-cast alloy, the average grain size and the second-phase size decrease from the center to the surface of the ingot, while the area fraction of the second phase increases gradually. At one-half of the radius of the ingot, the alloy presents the optimum comprehensive mechanical properties along the axial direction, which is attributed to the combined effect of relatively small grain size, low second-phase fraction, and uniform microstructure. For the as-homogenized alloy, the optimum two-step homogenization process parameters were determined as 340 °C × 10 h + 520 °C × 16 h. After the optimum homogenization, the proper size and morphology of the CaMgSn phase are conducive to improving the microstructure uniformity and the mechanical properties of the alloy. In addition, the yield strength of the alloy is reduced by 20.7% and the elongation is increased by 56.3%, which is more favorable for the subsequent hot deformation processing.

  8. Detecting subtle hydrochemical anomalies with multivariate statistics: an example from homogeneous groundwaters in the Great Artesian Basin, Australia

    NASA Astrophysics Data System (ADS)

    O'Shea, Bethany; Jankowski, Jerzy

    2006-12-01

    The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identified by the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques is emphasized for all investigations of waters that appear homogeneous.
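
    The multivariate workflow described above can be sketched with standard Python tooling (scikit-learn and SciPy). The ion list and the sample matrix below are hypothetical placeholders, not the study's data; the point is the sequence: standardize, reduce, cluster, then inspect samples that sit far from their cluster centroid.

      # Sketch: standardize major-ion data, then PCA + hierarchical (Ward) clustering.
      # Column names and the data matrix are illustrative assumptions.
      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from scipy.cluster.hierarchy import linkage, fcluster

      ions = ["Na", "K", "Ca", "Mg", "Cl", "HCO3", "SO4"]          # hypothetical columns, mg/L
      rng = np.random.default_rng(42)
      X = np.abs(rng.normal(loc=100.0, scale=15.0, size=(60, len(ions))))

      Z = StandardScaler().fit_transform(X)            # remove unit/magnitude bias
      scores = PCA(n_components=2).fit_transform(Z)    # compress correlated ions
      clusters = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")

      # Samples far from their cluster centroid in PC space are candidate subtle anomalies.
      for k in np.unique(clusters):
          centroid = scores[clusters == k].mean(axis=0)
          dist = np.linalg.norm(scores[clusters == k] - centroid, axis=1)
          print(f"cluster {k}: largest offset from centroid = {dist.max():.2f}")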

  9. Coherent perfect absorption in a homogeneously broadened two-level medium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Longhi, Stefano

    2011-05-15

    In recent works, it has been shown, rather generally, that the time-reversed process of lasing at threshold realizes a coherent perfect absorber (CPA). In a CPA, a lossy medium in an optical cavity with a specific degree of dissipation, equal in modulus to the gain of the lasing medium, can perfectly absorb coherent optical waves that are the time-reversed counterpart of the lasing field. Here, the time-reversed process of lasing is considered in detail for a homogeneously broadened two-level medium in an optical cavity and the conditions for CPA are derived. It is shown that, owing to the dispersive properties of the two-level medium, exact time-reversal symmetry is broken and the frequency of the field at which CPA occurs is generally different from that of the lasing mode. Moreover, at a large cooperation parameter, the observation of CPA in the presence of bistability requires one to operate in the upper branch of the hysteresis cycle.

  10. Homogeneity of lithium distribution in cylinder-type Li-ion batteries

    PubMed Central

    Senyshyn, A.; Mühlbauer, M. J.; Dolotko, O.; Hofmann, M.; Ehrenberg, H.

    2015-01-01

    Spatially-resolved neutron powder diffraction with a gauge volume of 2 × 2 × 20 mm3 has been applied as an in situ method to probe the lithium concentration in the graphite anode of different Li-ion cells of 18650-type in charged state. Structural studies performed in combination with electrochemical measurements and X-ray computed tomography under real cell operating conditions unambiguously revealed non-homogeneity of the lithium distribution in the graphite anode. Deviations from a homogeneous behaviour have been found in both radial and axial directions of 18650-type cells and were discussed in the frame of cell geometry and electrical connection of electrodes, which might play a crucial role in the homogeneity of the lithium distribution in the active materials within each electrode. PMID:26681110

  11. Determination of tailored filter sets to create rayfiles including spatial and angular resolved spectral information.

    PubMed

    Rotscholl, Ingo; Trampert, Klaus; Krüger, Udo; Perner, Martin; Schmidt, Franz; Neumann, Cornelius

    2015-11-16

    To simulate and optimize optical designs with respect to perceived color and homogeneity in commercial ray-tracing software, realistic light source models are needed. Spectral rayfiles provide angularly and spatially varying spectral information. We propose a spectral reconstruction method that requires a minimum of time-consuming goniophotometric near-field measurements with optical filters for the purpose of creating spectral rayfiles. Our discussion focuses on selecting the ideal optical filter combination for any arbitrary spectrum out of a given filter set by considering measurement uncertainties with Monte Carlo simulations. We minimize the simulation time by a preselection of all filter combinations, which is based on a factorial design.
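
    The filter-selection criterion described above can be sketched as a Monte Carlo scoring of candidate combinations (Python/NumPy). The Gaussian filter shapes, the target spectrum, the linear reconstruction model, and the 1% noise level are all illustrative assumptions rather than the paper's measurement model.

      # Sketch: score each candidate filter pair by its mean spectral-reconstruction error
      # under Monte Carlo perturbed readings, and keep the best-scoring pair.
      import numpy as np
      from itertools import combinations

      wl = np.linspace(380, 780, 201)                                    # wavelength grid, nm
      gauss = lambda mu, s: np.exp(-0.5 * ((wl - mu) / s) ** 2)
      filters = [gauss(mu, 40.0) for mu in (450, 500, 550, 600, 650)]    # hypothetical filters
      target = 0.6 * gauss(540, 60) + 0.4 * gauss(610, 50)               # hypothetical source spectrum

      def reconstruction_error(combo, n_mc=500, noise=0.01, seed=0):
          """Mean relative reconstruction error for one filter combination."""
          rng = np.random.default_rng(seed)
          basis = np.stack([filters[i] for i in combo], axis=1)          # (n_wavelengths, n_filters)
          gram = basis.T @ basis
          ideal = basis.T @ target                                       # noise-free filtered readings
          errs = []
          for _ in range(n_mc):
              noisy = ideal * (1.0 + noise * rng.standard_normal(len(combo)))
              recon = basis @ np.linalg.solve(gram, noisy)               # least-squares spectrum estimate
              errs.append(np.linalg.norm(recon - target) / np.linalg.norm(target))
          return float(np.mean(errs))

      best_pair = min(combinations(range(len(filters)), 2), key=reconstruction_error)
      print("best filter pair (indices):", best_pair)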

  12. Assessment of the effect of homogenized soil on soil hydraulic properties and soil water transport

    NASA Astrophysics Data System (ADS)

    Mohawesh, O.; Janssen, M.; Maaitah, O.; Lennartz, B.

    2017-09-01

    Soil hydraulic properties play a crucial role in simulating water flow and contaminant transport. Soil hydraulic properties are commonly measured using homogenized soil samples. However, soil structure has a significant effect on the soil's ability to retain and conduct water, particularly in aggregated soils. In order to determine the effect of soil homogenization on soil hydraulic properties and soil water transport, undisturbed soil samples were carefully collected. Five different soil structures were identified: angular-blocky, crumble, angular-blocky (different soil texture), granular, and subangular-blocky. The soil hydraulic properties were determined for undisturbed and homogenized soil samples of each soil structure and were used to model soil water transport with HYDRUS-1D. The homogenized soil samples showed a significant increase in wide pores (wCP) and a decrease in narrow pores (nCP). The wCP increased by 95.6, 141.2, 391.6, 3.9, and 261.3%, and the nCP decreased by 69.5, 10.5, 33.8, 72.7, and 39.3% for homogenized soil samples compared to undisturbed soil samples. The soil water retention curves exhibited a significant decrease in water-holding capacity for homogenized soil samples compared with the undisturbed soil samples. The homogenized soil samples also showed a decrease in soil hydraulic conductivity. The simulated results showed that water movement and distribution were affected by soil homogenization. Moreover, soil homogenization affected soil hydraulic properties and soil water transport. However, field studies are needed to determine the effect of these differences on water, chemical, and pollutant transport under various scenarios.

  13. Equilibrium states of homogeneous sheared compressible turbulence

    NASA Astrophysics Data System (ADS)

    Riahi, M.; Lili, T.

    2011-06-01

    Equilibrium states of homogeneous compressible turbulence subjected to rapid shear are studied using rapid distortion theory (RDT). The purpose of this study is to determine numerical solutions of the unsteady linearized equations governing the evolution of double-correlation spectra. In this work, an RDT code developed by the authors solves these equations for compressible homogeneous shear flows. Numerical integration of these equations is carried out using a second-order, simple and accurate scheme. The two Mach numbers relevant to homogeneous shear flow are the turbulent Mach number Mt, given by the root-mean-square turbulent velocity fluctuation divided by the speed of sound, and the gradient Mach number Mg, which is the mean shear rate times the transverse integral scale of the turbulence divided by the speed of sound. Validation of this code is performed by comparing RDT results with the direct numerical simulations (DNS) of [A. Simone, G.N. Coleman, and C. Cambon, J. Fluid Mech. 330, 307 (1997)] and [S. Sarkar, J. Fluid Mech. 282, 163 (1995)] for various values of the initial gradient Mach number Mg0. It was found that RDT is valid for small values of the non-dimensional time St (St < 3.5). It is important to note that RDT is also valid for large values of St (St > 10), in particular for large values of Mg0. This essential feature justifies the resort to RDT in order to determine equilibrium states in the compressible regime.
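
    The two shear-flow Mach numbers defined above can be made concrete with a short sketch (Python). The input values are arbitrary illustrative numbers, not data from the study.

      # Mt and Mg computed directly from the definitions given in the abstract.
      def turbulent_mach(u_rms, sound_speed):
          """Mt: rms turbulent velocity fluctuation over the speed of sound."""
          return u_rms / sound_speed

      def gradient_mach(shear_rate, transverse_integral_scale, sound_speed):
          """Mg: mean shear rate times transverse integral scale over the speed of sound."""
          return shear_rate * transverse_integral_scale / sound_speed

      c = 340.0   # m/s, assumed speed of sound
      print(turbulent_mach(u_rms=85.0, sound_speed=c))                                       # Mt = 0.25
      print(gradient_mach(shear_rate=3400.0, transverse_integral_scale=0.1, sound_speed=c))  # Mg = 1.0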

  14. Homogeneity and internal defects detect of infrared Se-based chalcogenide glass

    NASA Astrophysics Data System (ADS)

    Li, Zupan; Wu, Ligang; Lin, Changgui; Song, Bao'an; Wang, Xunsi; Shen, Xiang; Dai, Shixun

    2011-10-01

    Ge-Sb-Se chalcogenide glasses are excellent infrared optical materials, which are environmentally friendly and widely used in infrared thermal imaging systems. However, because Se-based glasses are opaque in the visible spectral region, it is difficult to measure their homogeneity and internal defects in the same way as for common oxide glasses. In this study, a method based on near-IR imaging was proposed to observe the homogeneity and internal defects of these glasses, and an effective measurement system was constructed. The test results indicated that the method gives clear and intuitive information on the homogeneity and internal defects of infrared Se-based chalcogenide glass.

  15. Kinetics of homogeneous and surface-catalyzed mercury(II) reduction by iron(II)

    USGS Publications Warehouse

    Amirbahman, Aria; Kent, Douglas B.; Curtis, Gary P.; Marvin-DiPasquale, Mark C.

    2013-01-01

    Production of elemental mercury, Hg(0), via Hg(II) reduction is an important pathway that should be considered when studying Hg fate in the environment. We conducted a kinetic study of abiotic homogeneous and surface-catalyzed Hg(0) production by Fe(II) under dark anoxic conditions. The Hg(0) production rate, from an initial 50 pM Hg(II) concentration, increased with increasing pH (5.5–8.1) and aqueous Fe(II) concentration (0.1–1 mM). The homogeneous rate was best described by the expression r_hom = k_hom [FeOH+][Hg(OH)2], with k_hom = 7.19 × 10^3 L (mol min)^-1. Compared to the homogeneous case, goethite (α-FeOOH) and hematite (α-Fe2O3) increased, and γ-alumina (γ-Al2O3) decreased, the Hg(0) production rate. Heterogeneous Hg(0) production rates were well described by a model incorporating equilibrium Fe(II) adsorption, rate-limited Hg(II) reduction by dissolved and adsorbed Fe(II), and rate-limited Hg(II) adsorption. Equilibrium Fe(II) adsorption was described using a surface complexation model calibrated with previously published experimental data. The Hg(0) production rate was well described by the expression r_het = k_het [>SOFe(II)][Hg(OH)2], where >SOFe(II) is the total adsorbed Fe(II) concentration; k_het values were 5.36 × 10^3, 4.69 × 10^3, and 1.08 × 10^2 L (mol min)^-1 for hematite, goethite, and γ-alumina, respectively. Hg(0) production coupled to reduction by Fe(II) may be an important process to consider in ecosystem Hg studies.
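
    The two rate laws quoted above can be evaluated directly; the sketch below (Python) uses the rate constants reported in the abstract, while the species concentrations are arbitrary illustrative values.

      # Homogeneous and surface-catalyzed Hg(0) production rates from the quoted rate laws.
      # Rate constants from the abstract; concentrations below are hypothetical.
      K_HOM = 7.19e3            # L mol^-1 min^-1, r_hom = k_hom [FeOH+][Hg(OH)2]
      K_HET_HEMATITE = 5.36e3   # L mol^-1 min^-1, r_het = k_het [>SOFe(II)][Hg(OH)2]

      def hom_rate(feoh_plus_m, hg_oh2_m, k=K_HOM):
          """Homogeneous Hg(0) production rate, mol L^-1 min^-1."""
          return k * feoh_plus_m * hg_oh2_m

      def het_rate(adsorbed_fe2_m, hg_oh2_m, k=K_HET_HEMATITE):
          """Surface-catalyzed Hg(0) production rate, mol L^-1 min^-1."""
          return k * adsorbed_fe2_m * hg_oh2_m

      # Hypothetical speciation: 1e-6 M FeOH+, 1e-5 M adsorbed Fe(II), 50 pM Hg(OH)2.
      print(hom_rate(1e-6, 50e-12))   # ~3.6e-13 mol L^-1 min^-1
      print(het_rate(1e-5, 50e-12))   # ~2.7e-12 mol L^-1 min^-1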

  16. Inactivation of Bacillus spores inoculated in milk by Ultra High Pressure Homogenization.

    PubMed

    Amador Espejo, Genaro Gustavo; Hernández-Herrero, M M; Juan, B; Trujillo, A J

    2014-12-01

    Ultra High-Pressure Homogenization treatments at 300 MPa with inlet temperatures (Ti) of 55, 65, 75 and 85 °C were applied to commercial Ultra High Temperature treated whole milk inoculated with Bacillus cereus, Bacillus licheniformis, Bacillus sporothermodurans, Bacillus coagulans, Geobacillus stearothermophilus and Bacillus subtilis spores in order to evaluate the inactivation level achieved. Ultra High-Pressure Homogenization conditions at 300 MPa with Ti = 75 and 85 °C achieved a spore inactivation of ∼5 log CFU/mL. Furthermore, under these processing conditions, commercial sterility (evaluated as the complete inactivation of the inoculated spores) was obtained in milk, with the exception of G. stearothermophilus and B. subtilis treated at 300 MPa with Ti = 75 °C. The results showed that G. stearothermophilus and B. subtilis have higher resistance to the Ultra High-Pressure Homogenization treatments applied than the other inoculated microorganisms, and that a treatment performed at 300 MPa with Ti = 85 °C was necessary to completely inactivate these microorganisms at the spore level inoculated (∼1 × 10^6 CFU/mL). Moreover, a change in the resistance of B. licheniformis, B. sporothermodurans, G. stearothermophilus and B. subtilis spores was observed, as the inactivation obtained increased remarkably in treatments performed with Ti between 65 and 75 °C. This study provides important evidence of the suitability of UHPH technology for the inactivation of spores in high numbers, leading to the possibility of obtaining commercially sterile milk. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. SELECTING SITES FOR COMPARISON WITH CREATED WETLANDS

    EPA Science Inventory

    The paper describes the method used for selecting natural wetlands to compare with created wetlands. The results of the selection process and the advantages and disadvantages of the method are discussed. The random site selection method required extensive field work and may have ...

  18. Captivate MenuBuilder: Creating an Online Tutorial for Teaching Software

    ERIC Educational Resources Information Center

    Yelinek, Kathryn; Tarnowski, Lynn; Hannon, Patricia; Oliver, Susan

    2008-01-01

    In this article, the authors, students in an instructional technology graduate course, describe a process to create an online tutorial for teaching software. They created the tutorial for a cyber school's use. Five tutorial modules were linked together through one menu screen using the MenuBuilder feature in the Adobe Captivate program. The…

  19. Surface transport processes in charged porous media

    DOE PAGES

    Gabitto, Jorge; Tsouris, Costas

    2017-03-03

    Surface transport processes are important in chemistry, colloidal sciences, engineering, biology, and geophysics. Natural or externally produced charges on surfaces create electrical double layers (EDLs) at the solid-liquid interface. The existence of the EDLs produces several complex processes including bulk and surface transport of ions. In this work, a model is presented to simulate bulk and transport processes in homogeneous porous media comprising big pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without Faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Description of the EDL between the electrolyte solution and the charged wall is accomplished using the Gouy-Chapman-Stern (GCS) model. The surface transport terms enter into the averaged equations due to the use of boundary conditions for diffuse interfaces. Two extra surface transport terms appear in the closed averaged equations. One is a surface diffusion term equivalent to the transport process in non-charged porous media. The second surface transport term is a migration term unique to charged porous media. The effective bulk and transport parameters for isotropic porous media are calculated by solving the corresponding closure problems.

  20. Surface transport processes in charged porous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gabitto, Jorge; Tsouris, Costas

    Surface transport processes are important in chemistry, colloidal sciences, engineering, biology, and geophysics. Natural or externally produced charges on surfaces create electrical double layers (EDLs) at the solid-liquid interface. The existence of the EDLs produces several complex processes including bulk and surface transport of ions. In this work, a model is presented to simulate bulk and transport processes in homogeneous porous media comprising big pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without Faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Description of the EDL between the electrolyte solution and the charged wall is accomplished using the Gouy-Chapman-Stern (GCS) model. The surface transport terms enter into the averaged equations due to the use of boundary conditions for diffuse interfaces. Two extra surface transport terms appear in the closed averaged equations. One is a surface diffusion term equivalent to the transport process in non-charged porous media. The second surface transport term is a migration term unique to charged porous media. The effective bulk and transport parameters for isotropic porous media are calculated by solving the corresponding closure problems.

  1. Ultra-thin carbon-fiber paper fabrication and carbon-fiber distribution homogeneity evaluation method

    NASA Astrophysics Data System (ADS)

    Zhang, L. F.; Chen, D. Y.; Wang, Q.; Li, H.; Zhao, Z. G.

    2018-01-01

    A preparation technology for ultra-thin carbon-fiber paper is reported. Carbon-fiber distribution homogeneity has a great influence on the properties of ultra-thin carbon-fiber paper. In this paper, a self-developed homogeneity analysis system is introduced to help users evaluate the distribution homogeneity of carbon fiber in two or more binary (two-value) images of carbon-fiber paper. A relative-uniformity factor W/H is introduced. The experimental results show that the smaller the W/H factor, the more uniform the carbon-fiber distribution. The new uniformity-evaluation method provides a practical and reliable tool for analyzing the homogeneity of such materials.

  2. A dose homogeneity and conformity evaluation between ViewRay and pinnacle-based linear accelerator IMRT treatment plans

    PubMed Central

    Saenz, Daniel L.; Paliwal, Bhudatt R.; Bayouth, John E.

    2014-01-01

    ViewRay, a novel technology providing soft-tissue imaging during radiotherapy, is investigated for its treatment planning capabilities by assessing treatment plan dose homogeneity and conformity compared with linear accelerator plans. ViewRay offers both adaptive radiotherapy and image guidance. The combination of cobalt-60 (Co-60) with 0.35 Tesla magnetic resonance imaging (MRI) allows for magnetic resonance (MR)-guided intensity-modulated radiation therapy (IMRT) delivery with multiple beams. This study investigated head and neck, lung, and prostate treatment plans to understand what is possible on ViewRay and to narrow focus toward sites with optimal dosimetry. The goal is not to provide a rigorous assessment of planning capabilities, but rather a first-order demonstration of ViewRay planning abilities. Images, structure sets, points, and dose from treatment plans created in Pinnacle for patients in our clinic were imported into ViewRay. The same objectives were used to assess plan quality and all critical structures were treated as similarly as possible. Homogeneity index (HI), conformity index (CI), and the volume receiving <20% of the prescription dose (DRx) were calculated to assess the plans. The 95% confidence intervals were recorded for all measurements and presented with the associated bars in graphs. The homogeneity index (D5/D95) showed a 1-5% inhomogeneity increase for head and neck, 3-8% for lung, and 4-16% for prostate. CI revealed a modest conformity increase for lung. The volume receiving 20% of the prescription dose increased 2-8% for head and neck and up to 4% for lung and prostate. Overall, for head and neck, Co-60 ViewRay treatments planned with its Monte Carlo treatment planning software were comparable with 6 MV plans computed with the convolution superposition algorithm on the Pinnacle treatment planning system. PMID:24872603

  3. A dose homogeneity and conformity evaluation between ViewRay and pinnacle-based linear accelerator IMRT treatment plans.

    PubMed

    Saenz, Daniel L; Paliwal, Bhudatt R; Bayouth, John E

    2014-04-01

    ViewRay, a novel technology providing soft-tissue imaging during radiotherapy, is investigated for its treatment planning capabilities by assessing treatment plan dose homogeneity and conformity compared with linear accelerator plans. ViewRay offers both adaptive radiotherapy and image guidance. The combination of cobalt-60 (Co-60) with 0.35 Tesla magnetic resonance imaging (MRI) allows for magnetic resonance (MR)-guided intensity-modulated radiation therapy (IMRT) delivery with multiple beams. This study investigated head and neck, lung, and prostate treatment plans to understand what is possible on ViewRay and to narrow focus toward sites with optimal dosimetry. The goal is not to provide a rigorous assessment of planning capabilities, but rather a first-order demonstration of ViewRay planning abilities. Images, structure sets, points, and dose from treatment plans created in Pinnacle for patients in our clinic were imported into ViewRay. The same objectives were used to assess plan quality and all critical structures were treated as similarly as possible. Homogeneity index (HI), conformity index (CI), and the volume receiving <20% of the prescription dose (DRx) were calculated to assess the plans. The 95% confidence intervals were recorded for all measurements and presented with the associated bars in graphs. The homogeneity index (D5/D95) showed a 1-5% inhomogeneity increase for head and neck, 3-8% for lung, and 4-16% for prostate. CI revealed a modest conformity increase for lung. The volume receiving 20% of the prescription dose increased 2-8% for head and neck and up to 4% for lung and prostate. Overall, for head and neck, Co-60 ViewRay treatments planned with its Monte Carlo treatment planning software were comparable with 6 MV plans computed with the convolution superposition algorithm on the Pinnacle treatment planning system.
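
    The homogeneity index and the low-dose volume metric used above can be computed from a dose array with a short sketch (Python/NumPy). The D5/D95 definition follows the abstract; the dose values below are synthetic, not patient data.

      # Plan metrics: HI = D5/D95 on the target, and the fraction of volume below 20% of Rx.
      import numpy as np

      def homogeneity_index(target_dose):
          """HI = D5 / D95; values further above 1 indicate a less homogeneous target dose."""
          d5 = np.percentile(target_dose, 95)    # D5: minimum dose to the hottest 5% of target voxels
          d95 = np.percentile(target_dose, 5)    # D95: minimum dose received by 95% of target voxels
          return d5 / d95

      def low_dose_volume_fraction(dose, rx_dose, threshold=0.20):
          """Fraction of voxels receiving less than `threshold` of the prescription dose."""
          return float(np.mean(dose < threshold * rx_dose))

      rng = np.random.default_rng(3)
      ptv_dose = rng.normal(70.0, 1.5, size=5000)      # Gy inside the target (synthetic)
      body_dose = rng.gamma(2.0, 6.0, size=200000)     # Gy elsewhere (synthetic)
      print(homogeneity_index(ptv_dose))
      print(low_dose_volume_fraction(body_dose, rx_dose=70.0))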

  4. ROS production in homogenate from the body wall of sea cucumber Stichopus japonicus under UVA irradiation: ESR spin-trapping study.

    PubMed

    Qi, Hang; Dong, Xiu-fang; Zhao, Ya-ping; Li, Nan; Fu, Hui; Feng, Ding-ding; Liu, Li; Yu, Chen-xu

    2016-02-01

    Sea cucumber Stichopus japonicus (S. japonicus) shows a strong tendency toward autolysis, which leads to severe deterioration in sea cucumber quality during processing and storage. In this study, to further characterize the mechanism of sea cucumber autolysis, hydroxyl radical production induced by ultraviolet A (UVA) irradiation was investigated. Homogenate from the body wall of S. japonicus was prepared and subjected to UVA irradiation at room temperature. Electron spin resonance (ESR) spectra of the treated samples were subsequently recorded. The results showed that hydroxyl radicals (OH) became more abundant as the duration of UVA treatment and the homogenate concentration increased. Addition of superoxide dismutase (SOD), catalase, EDTA, desferal, NaN3 and D2O to the homogenate samples led to different degrees of inhibition of OH production. Metal cations and pH also showed different effects on OH production. These results indicated that OH was produced in the homogenate via a possible pathway as follows: O2(-) → H2O2 → OH, suggesting that OH might be a critical factor in UVA-induced S. japonicus autolysis. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. MHODE: a local-homogeneity theory for improved source-parameter estimation of potential fields

    NASA Astrophysics Data System (ADS)

    Fedi, Maurizio; Florio, Giovanni; Paoletti, Valeria

    2015-08-01

    We describe a multihomogeneity theory for source-parameter estimation of potential fields. Similar to what happens for random source models, where the monofractal scaling law has been generalized into a multifractal law, we propose to generalize the homogeneity law into a multihomogeneity law. This allows a theoretically correct approach to studying real-world potential fields, which are inhomogeneous and so do not show scale invariance, except in the asymptotic regions (very near to or very far from their sources). Since the scaling properties of inhomogeneous fields change with the scale of observation, we show that they may be better studied at a set of scales than at a single scale and that a multihomogeneous model is needed to explain their complex scaling behaviour. In order to perform this task, we first introduce fractional-degree homogeneous fields, to show that: (i) homogeneous potential fields may have fractional or integer degree; (ii) the source distributions for a fractional degree are not confined to a bounded region, similarly to some integer-degree models, such as the infinite line mass; and (iii) differently from the integer-degree case, the fractional-degree source distributions are no longer uniform density functions. Using this enlarged set of homogeneous fields, real-world anomaly fields are studied at different scales, by a simple search, at any local window W, for the best homogeneous field of either integer or fractional degree, yielding a multiscale set of local homogeneity degrees and depth estimates which we call a multihomogeneous model. This defines a new technique of source-parameter estimation (Multi-HOmogeneity Depth Estimation, MHODE), permitting retrieval of the source parameters of complex sources. We test the method with inhomogeneous fields of finite sources, such as faults or cylinders, and show its effectiveness also in a real-case example. These applications show the usefulness of the new concepts, multihomogeneity and

  6. ANALYSIS OF FISH HOMOGENATES FOR PERFLUORINATED COMPOUNDS

    EPA Science Inventory

    Perfluorinated compounds (PFCs) which include PFOS and PFOA are widely distributed in wildlife. Whole fish homogenates were analyzed for PFCs from the upper Mississippi, the Missouri and the Ohio rivers. Methods development, validation data, and preliminary study results will b...

  7. Castings, Steel, Homogenization of Steel Castings

    DTIC Science & Technology

    1942-12-05

concerning the effect of homogenizing heat treatment upon the ballistic properties of cast steel armor-piercing projectiles. Hardenability ... of homogenizing treatments upon the corrosion of quenched and tempered cast steel. Harich, Riffin, and Bolotsky made two-bead weldability ...

  8. Development of porous lamellar poly(L-lactic acid) scaffolds by conventional injection molding process.

    PubMed

    Ghosh, Satyabrata; Viana, Júlio C; Reis, Rui L; Mano, João F

    2008-07-01

    A novel fabrication technique is proposed for the preparation of unidirectionally oriented, porous scaffolds by selective polymer leaching from lamellar structures created by conventional injection molding. The proof of the concept is implemented using a 50/50 wt.% poly(L-lactic acid)/poly(ethylene oxide) (PLLA/PEO) blend. With this composition, the PLLA and PEO blend is biphasic, containing a homogeneous PLLA/PEO phase and a PEO-rich phase. The two phases were structured using injection molding into well-defined alternating layers of homogeneous PLLA/PEO phase and PEO-rich phase. Leaching of water-soluble PEO from the PEO-rich phase produces macropores, and leaching of phase-separated PEO from the initially homogeneous PLLA/PEO phase produces micropores in the lamellae. Thus, scaffolds with a macroporous lamellar architecture with microporous walls can be produced. The lamellae are continuous along the flow direction, and a continuous lamellar thickness of less than 1 microm could be achieved. Porosities of 57-74% and pore sizes of around 50-100 microm can be obtained using this process. The tensile elastic moduli of the porous constructs were between 580 and 800 MPa. We propose that this organic-solvent-free method of preparing lamellar scaffolds with good mechanical properties, and the reproducibility associated with the injection molding technique, holds promise for a wide range of guided tissue engineering applications.

  9. Exploring cosmic homogeneity with the BOSS DR12 galaxy sample

    NASA Astrophysics Data System (ADS)

    Ntelis, Pierros; Hamilton, Jean-Christophe; Le Goff, Jean-Marc; Burtin, Etienne; Laurent, Pierre; Rich, James; Guillermo Busca, Nicolas; Tinker, Jeremy; Aubourg, Eric; du Mas des Bourboux, Hélion; Bautista, Julian; Palanque Delabrouille, Nathalie; Delubac, Timothée; Eftekharzadeh, Sarah; Hogg, David W.; Myers, Adam; Vargas-Magaña, Mariana; Pâris, Isabelle; Petitjean, Partick; Rossi, Graziano; Schneider, Donald P.; Tojeiro, Rita; Yeche, Christophe

    2017-06-01

In this study, we probe the transition to cosmic homogeneity in the Large Scale Structure (LSS) of the Universe using the CMASS galaxy sample of the BOSS spectroscopic survey, which covers the largest effective volume to date, 3 h-3 Gpc3 at 0.43 <= z <= 0.7. We study the scaled counts-in-spheres, N(<r), and the fractal correlation dimension, D2(r), to assess the homogeneity scale of the universe using a Landy & Szalay inspired estimator. Defining the scale of transition to homogeneity as the scale at which D2(r) reaches 3 within 1%, i.e. D2(r) > 2.97 for r > RH, we find RH = (63.3±0.7) h-1 Mpc, in agreement at the percentage level with the prediction of the ΛCDM model, RH = 62.0 h-1 Mpc. Thanks to the large cosmic depth of the survey, we investigate the redshift evolution of the transition-to-homogeneity scale and find agreement with the ΛCDM prediction. Finally, we find that D2 is compatible with 3 at scales larger than 300 h-1 Mpc in all redshift bins. These results consolidate the Cosmological Principle and represent a precise consistency test of the ΛCDM model.
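
    The homogeneity-scale definition used here can be made concrete with a minimal sketch: given scaled counts-in-spheres N(<r), the correlation dimension is D2(r) = d ln N(<r) / d ln r, and RH is the smallest scale at which D2 exceeds 2.97. The snippet below applies this to a toy analytic N(<r), not to BOSS data; the functional form and numbers are assumptions for illustration only.

        import numpy as np

        # Toy counts-in-spheres: clustered on small scales, approaching the
        # homogeneous expectation N(<r) ~ r^3 on large scales (NOT BOSS data).
        r = np.logspace(0.5, 2.5, 200)            # comoving scale, h^-1 Mpc (toy values)
        n_of_r = r**3 * (1.0 + 2.0 / r)           # assumed toy model

        # Correlation dimension D2(r) = d ln N(<r) / d ln r
        d2 = np.gradient(np.log(n_of_r), np.log(r))

        # Homogeneity scale: smallest r where D2 first exceeds 2.97 (within 1% of 3)
        above = np.where(d2 > 2.97)[0]
        if above.size:
            print(f"R_H ~ {r[above[0]]:.1f} h^-1 Mpc (toy model)")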

  10. Broken Ergodicity in Two-Dimensional Homogeneous Magnetohydrodynamic Turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    2010-01-01

Two-dimensional (2-D) homogeneous magnetohydrodynamic (MHD) turbulence has many of the same qualitative features as three-dimensional (3-D) homogeneous MHD turbulence. These features include several ideal invariants, along with the phenomenon of broken ergodicity. Broken ergodicity appears when certain modes act like random variables with mean values that are large compared to their standard deviations, indicating a coherent structure or dynamo. Recently, the origin of broken ergodicity in 3-D MHD turbulence that is manifest in the lowest wavenumbers was explained. Here, a detailed description of the origins of broken ergodicity in 2-D MHD turbulence is presented. It will be seen that broken ergodicity in ideal 2-D MHD turbulence can be manifest in the lowest wavenumbers of a finite numerical model for certain initial conditions or in the highest wavenumbers for another set of initial conditions. The origins of broken ergodicity in ideal 2-D homogeneous MHD turbulence are found through an eigenanalysis of the covariance matrices of the modal probability density functions. It will also be shown that when the lowest wavenumber magnetic field becomes quasi-stationary, the higher wavenumber modes can propagate as Alfven waves on these almost static large-scale magnetic structures.

  11. Statistical homogeneity tests applied to large data sets from high energy physics experiments

    NASA Astrophysics Data System (ADS)

    Trusina, J.; Franc, J.; Kůs, V.

    2017-12-01

Homogeneity tests are used in high energy physics to verify whether simulated Monte Carlo (MC) samples have the same distribution as measured data from a particle detector. The Kolmogorov-Smirnov, χ2, and Anderson-Darling tests are the most widely used techniques to assess the samples' homogeneity. Since MC generators produce plenty of entries from different models, each entry has to be re-weighted to obtain the same sample size as the measured data. One way of testing homogeneity is through binning. If we do not want to lose any information, we can apply generalized tests based on weighted empirical distribution functions. In this paper, we propose such generalized weighted homogeneity tests and introduce some of their asymptotic properties. We present results based on a numerical analysis which focuses on estimation of the type-I error and the power of the test. Finally, we present an application of our homogeneity tests to data from the DØ experiment at Fermilab.
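
    One common way to compare a re-weighted MC sample with data without binning is a Kolmogorov-Smirnov-type statistic built from weighted empirical distribution functions. The sketch below shows that construction on synthetic samples; it is not the generalized tests proposed in the paper, and the distributions and weights are assumptions. A p-value would typically require a permutation or bootstrap procedure, which is not shown.

        import numpy as np

        def weighted_ecdf(x, w, grid):
            """Weighted empirical distribution function evaluated on `grid`."""
            order = np.argsort(x)
            x, w = x[order], w[order]
            cum = np.cumsum(w) / np.sum(w)
            return np.interp(grid, x, cum, left=0.0, right=1.0)

        rng = np.random.default_rng(0)
        data = rng.normal(0.0, 1.0, 5000)              # "measured" sample, unit weights
        mc = rng.normal(0.05, 1.0, 20000)              # simulated sample
        mc_w = rng.uniform(0.5, 1.5, mc.size)          # assumed per-event MC weights

        grid = np.sort(np.concatenate([data, mc]))
        ks = np.max(np.abs(weighted_ecdf(data, np.ones_like(data), grid)
                           - weighted_ecdf(mc, mc_w, grid)))
        print(f"weighted KS-type statistic: {ks:.4f}")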

  12. How the Spectre of Societal Homogeneity Undermines Equitable Healthcare for Refugees

    PubMed Central

    Razum, Oliver; Wenner, Judith; Bozorgmehr, Kayvan

    2017-01-01

    Recourse to a purported ideal of societal homogeneity has become common in the context of the refugee reception crisis – not only in Japan, as Leppold et al report, but also throughout Europe. Calls for societal homogeneity in Europe originate from populist movements as well as from some governments. Often, they go along with reduced social support for refugees and asylum seekers, for example in healthcare provision. The fundamental right to health is then reduced to a citizens’ right, granted fully only to nationals. Germany, in spite of welcoming many refugees in 2015, is a case in point: entitlement and access to healthcare for asylum seekers are restricted during the first 15 months of their stay. We show that arguments brought forward to defend such restrictions do not hold, particularly not those which relate to maintaining societal homogeneity. European societies are not homogeneous, irrespective of migration. But as migration will continue, societies need to invest in what we call "globalization within." Removing entitlement restrictions and access barriers to healthcare for refugees and asylum seekers is one important element thereof. PMID:28812828

  13. Algebraic Reynolds stress modeling of turbulence subject to rapid homogeneous and non-homogeneous compression or expansion

    NASA Astrophysics Data System (ADS)

    Grigoriev, I. A.; Wallin, S.; Brethouwer, G.; Grundestam, O.; Johansson, A. V.

    2016-02-01

    A recently developed explicit algebraic Reynolds stress model (EARSM) by Grigoriev et al. ["A realizable explicit algebraic Reynolds stress model for compressible turbulent flow with significant mean dilatation," Phys. Fluids 25(10), 105112 (2013)] and the related differential Reynolds stress model (DRSM) are used to investigate the influence of homogeneous shear and compression on the evolution of turbulence in the limit of rapid distortion theory (RDT). The DRSM predictions of the turbulence kinetic energy evolution are in reasonable agreement with RDT while the evolution of diagonal components of anisotropy correctly captures the essential features, which is not the case for standard compressible extensions of DRSMs. The EARSM is shown to give a realizable anisotropy tensor and a correct trend of the growth of turbulence kinetic energy K, which saturates at a power law growth versus compression ratio, as well as retaining a normalized strain in the RDT regime. In contrast, an eddy-viscosity model results in a rapid exponential growth of K and excludes both realizability and high magnitude of the strain rate. We illustrate the importance of using a proper algebraic treatment of EARSM in systems with high values of dilatation and vorticity but low shear. A homogeneously compressed and rotating gas cloud with cylindrical symmetry, related to astrophysical flows and swirling supercritical flows, was investigated too. We also outline the extension of DRSM and EARSM to include the effect of non-homogeneous density coupled with "local mean acceleration" which can be important for, e.g., stratified flows or flows with heat release. A fixed-point analysis of direct numerical simulation data of combustion in a wall-jet flow demonstrates that our model gives quantitatively correct predictions of both streamwise and cross-stream components of turbulent density flux as well as their influence on the anisotropies. In summary, we believe that our approach, based on a proper

  14. Testing the homogeneity of the Universe using gamma-ray bursts

    NASA Astrophysics Data System (ADS)

    Li, Ming-Hua; Lin, Hai-Nan

    2015-10-01

Aims: The discovery of a statistically significant clustering in the distribution of gamma-ray bursts (GRBs) has recently been reported. Given that the cluster has a characteristic size of 2000-3000 Mpc and a redshift between 1.6 ≤ z ≤ 2.1, it has been claimed that this structure is incompatible with the cosmological principle of homogeneity and isotropy of our Universe. In this paper, we study the homogeneity of the GRB distribution using a subsample of the Greiner GRB catalogue, which contains 314 objects with redshift 0 < z < 2.5 (244 of them discovered by the Swift GRB mission). We try to reconcile the dilemma between the new observations and the current theory of structure formation and growth. Methods: To test the results against possible biases in redshift determination and the incompleteness of the Greiner sample, we also apply our analysis to the 244 GRBs discovered by Swift and the subsample presented by the Swift Gamma-Ray Burst Host Galaxy Legacy Survey (SHOALS). The real-space two-point correlation function (2PCF) of GRBs, ξ(r), is calculated using a Landy-Szalay estimator. We perform a standard least-χ2 fit to the measured 2PCFs of GRBs. We use the best-fit 2PCF to deduce a recently defined homogeneity scale. The homogeneity scale, RH, is defined as the comoving radius of the sphere inside which the counts of GRBs N(<r) approach the homogeneous expectation, i.e. the correlation dimension exceeds D2 = 2.97. We find a homogeneous distribution of GRBs on scales of r ≥ 8200 h-1 Mpc. For the Swift subsample of 244 GRBs, the correlation length and slope are r0 = 387.51 ± 132.75 h-1 Mpc and γ = 1.57 ± 0.65 (at 1
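
    The Landy-Szalay estimator mentioned above, ξ(r) = (DD − 2DR + RR) / RR with normalized data-data, data-random and random-random pair counts, is straightforward to sketch. The example below runs it on toy 3D positions rather than the GRB catalogue; the box size, bin edges and sample sizes are assumptions, and a power-law fit ξ(r) = (r/r0)^-γ would follow by least squares on the output.

        import numpy as np
        from scipy.spatial.distance import pdist, cdist

        def pair_counts(d, edges):
            return np.histogram(d, bins=edges)[0].astype(float)

        rng = np.random.default_rng(1)
        data = rng.uniform(0, 1000, size=(300, 3))     # toy "GRB" positions (h^-1 Mpc)
        rand = rng.uniform(0, 1000, size=(3000, 3))    # unclustered random catalogue

        edges = np.linspace(50, 500, 10)
        nd, nr = len(data), len(rand)
        dd = pair_counts(pdist(data), edges) / (nd * (nd - 1) / 2)
        rr = pair_counts(pdist(rand), edges) / (nr * (nr - 1) / 2)
        dr = pair_counts(cdist(data, rand).ravel(), edges) / (nd * nr)

        xi = (dd - 2 * dr + rr) / rr                   # Landy-Szalay estimator
        r_mid = 0.5 * (edges[:-1] + edges[1:])
        for r, x in zip(r_mid, xi):
            print(f"r = {r:6.1f}  xi = {x:+.3f}")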

  15. The Denali EarthScope Education Partnership: Creating Opportunities for Learning About Solid Earth Processes in Alaska and Beyond.

    NASA Astrophysics Data System (ADS)

    Roush, J. J.; Hansen, R. A.

    2003-12-01

    The Geophysical Institute of the University of Alaska Fairbanks, in partnership with Denali National Park and Preserve, has begun an education outreach program that will create learning opportunities in solid earth geophysics for a wide sector of the public. We will capitalize upon a unique coincidence of heightened public interest in earthquakes (due to the M 7.9 Denali Fault event of Nov. 3rd, 2002), the startup of the EarthScope experiment, and the construction of the Denali Science & Learning Center, a premiere facility for science education located just 43 miles from the epicenter of the Denali Fault earthquake. Real-time data and current research results from EarthScope installations and science projects in Alaska will be used to engage students and teachers, national park visitors, and the general public in a discovery process that will enhance public understanding of tectonics, seismicity and volcanism along the boundary between the Pacific and North American plates. Activities will take place in five program areas, which are: 1) museum displays and exhibits, 2) outreach via print publications and electronic media, 3) curriculum development to enhance K-12 earth science education, 4) teacher training to develop earth science expertise among K-12 educators, and 5) interaction between scientists and the public. In order to engage the over 1 million annual visitors to Denali, as well as people throughout Alaska, project activities will correspond with the opening of the Denali Science and Learning Center in 2004. An electronic interactive kiosk is being constructed to provide public access to real-time data from seismic and geodetic monitoring networks in Alaska, as well as cutting edge visualizations of solid earth processes. A series of print publications and a website providing access to real-time seismic and geodetic data will be developed for park visitors and the general public, highlighting EarthScope science in Alaska. A suite of curriculum modules

  16. Polyvinylpyrrolidone-Based Bio-Ink Improves Cell Viability and Homogeneity during Drop-On-Demand Printing

    PubMed Central

    Ng, Wei Long; Yeong, Wai Yee; Naing, May Win

    2017-01-01

Drop-on-demand (DOD) bioprinting has attracted huge attention for numerous biological applications due to its precise control over material volume and deposition pattern in a contactless printing approach. 3D bioprinting is still an emerging field and more work is required to improve the viability and homogeneity of printed cells during the printing process. Here, a general purpose bio-ink was developed using polyvinylpyrrolidone (PVP) macromolecules. Different PVP-based bio-inks (0%–3% w/v) were prepared and evaluated for their printability; the short-term and long-term viability of the printed cells were first investigated. The Z value of a bio-ink determines its printability; it is the inverse of the Ohnesorge number (Oh), i.e. the ratio of the Reynolds number to the square root of the Weber number, and is independent of the bio-ink velocity. The viability of printed cells is dependent on the Z values of the bio-inks; the results indicated that the cells can be printed without any significant impairment using a bio-ink with a threshold Z value of ≤9.30 (2% and 2.5% w/v). Next, the cell output was evaluated over a period of 30 min. The results indicated that PVP molecules mitigate cell adhesion and sedimentation during the printing process; the 2.5% w/v PVP bio-ink demonstrated the most consistent cell output over a period of 30 min. Hence, PVP macromolecules can play a critical role in improving cell viability and homogeneity during the bioprinting process. PMID:28772551

  17. Polyvinylpyrrolidone-Based Bio-Ink Improves Cell Viability and Homogeneity during Drop-On-Demand Printing.

    PubMed

    Ng, Wei Long; Yeong, Wai Yee; Naing, May Win

    2017-02-16

Drop-on-demand (DOD) bioprinting has attracted huge attention for numerous biological applications due to its precise control over material volume and deposition pattern in a contactless printing approach. 3D bioprinting is still an emerging field and more work is required to improve the viability and homogeneity of printed cells during the printing process. Here, a general purpose bio-ink was developed using polyvinylpyrrolidone (PVP) macromolecules. Different PVP-based bio-inks (0%-3% w/v) were prepared and evaluated for their printability; the short-term and long-term viability of the printed cells were first investigated. The Z value of a bio-ink determines its printability; it is the inverse of the Ohnesorge number (Oh), i.e. the ratio of the Reynolds number to the square root of the Weber number, and is independent of the bio-ink velocity. The viability of printed cells is dependent on the Z values of the bio-inks; the results indicated that the cells can be printed without any significant impairment using a bio-ink with a threshold Z value of ≤9.30 (2% and 2.5% w/v). Next, the cell output was evaluated over a period of 30 min. The results indicated that PVP molecules mitigate cell adhesion and sedimentation during the printing process; the 2.5% w/v PVP bio-ink demonstrated the most consistent cell output over a period of 30 min. Hence, PVP macromolecules can play a critical role in improving cell viability and homogeneity during the bioprinting process.
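
    The printability parameter can be written as Z = 1/Oh = Re/√We = √(ρ γ a)/μ for a fluid of density ρ, surface tension γ and viscosity μ ejected from a nozzle of characteristic radius a, which makes it velocity-independent as stated above. The sketch below evaluates Z for a set of illustrative, assumed fluid properties; these are placeholders, not the values measured in the study, and the only number taken from the abstract is the reported threshold Z ≤ 9.30.

        import math

        def z_number(density, surface_tension, nozzle_radius, viscosity):
            """Z = 1/Oh = Re / sqrt(We) = sqrt(rho * gamma * a) / mu (velocity-independent)."""
            return math.sqrt(density * surface_tension * nozzle_radius) / viscosity

        # Illustrative, assumed properties of a PVP-based bio-ink (not from the paper)
        rho = 1005.0        # kg/m^3
        gamma = 0.060       # N/m
        a = 15e-6           # nozzle radius, m
        mu = 3.5e-3         # Pa.s

        z = z_number(rho, gamma, a, mu)
        print(f"Z = {z:.1f}; reported printable threshold in the study: Z <= 9.30")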

  18. Homogenization in micro-magneto-mechanics

    NASA Astrophysics Data System (ADS)

    Sridhar, A.; Keip, M.-A.; Miehe, C.

    2016-07-01

    Ferromagnetic materials are characterized by a heterogeneous micro-structure that can be altered by external magnetic and mechanical stimuli. The understanding and the description of the micro-structure evolution is of particular importance for the design and the analysis of smart materials with magneto-mechanical coupling. The macroscopic response of the material results from complex magneto-mechanical interactions occurring on smaller length scales, which are driven by magnetization reorientation and associated magnetic domain wall motions. The aim of this work is to directly base the description of the macroscopic magneto-mechanical material behavior on the micro-magnetic domain evolution. This will be realized by the incorporation of a ferromagnetic phase-field formulation into a macroscopic Boltzmann continuum by the use of computational homogenization. The transition conditions between the two scales are obtained via rigorous exploitation of rate-type and incremental variational principles, which incorporate an extended version of the classical Hill-Mandel macro-homogeneity condition covering the phase field on the micro-scale. An efficient two-scale computational scenario is developed based on an operator splitting scheme that includes a predictor for the magnetization on the micro-scale. Two- and three-dimensional numerical simulations demonstrate the performance of the method. They investigate micro-magnetic domain evolution driven by macroscopic fields as well as the associated overall hysteretic response of ferromagnetic solids.

  19. Digital Documentation: Using Computers to Create Multimedia Reports.

    ERIC Educational Resources Information Center

    Speitel, Tom; And Others

    1996-01-01

    Describes methods for creating integrated multimedia documents using recent advances in print, audio, and video digitization that bring added usefulness to computers as data acquisition, processing, and presentation tools. Discusses advantages of digital documentation. (JRH)

  20. Generation of zonal flows through symmetry breaking of statistical homogeneity

    NASA Astrophysics Data System (ADS)

    Parker, Jeffrey B.; Krommes, John A.

    2014-03-01

In geophysical and plasma contexts, zonal flows (ZFs) are well known to arise out of turbulence. We elucidate the transition from homogeneous turbulence without ZFs to inhomogeneous turbulence with steady ZFs. Starting from the equation for barotropic flow on a β plane, we employ both the quasilinear approximation and a statistical average, which retains a great deal of the qualitative behavior of the full system. Within the resulting framework known as CE2, we extend recent understanding of the symmetry-breaking zonostrophic instability and show that it is an example of a Type I_s instability within the pattern formation literature. The broken symmetry is statistical homogeneity. Near the bifurcation point, the slow dynamics of CE2 are governed by a well-known amplitude equation. The important features of this amplitude equation, and therefore of the CE2 system, are multiple. First, the ZF wavelength is not unique. In an idealized, infinite system, there is a continuous band of ZF wavelengths that allow a nonlinear equilibrium. Second, of these wavelengths, only those within a smaller subband are stable. Unstable wavelengths must evolve to reach a stable wavelength; this process manifests as merging jets. These behaviors are shown numerically to hold in the CE2 system. We also conclude that the stability of the equilibria near the bifurcation point, which is governed by the Eckhaus instability, is independent of the Rayleigh-Kuo criterion.

  1. Homogeneous illusion device exhibiting transformed and shifted scattering effect

    NASA Astrophysics Data System (ADS)

    Mei, Jin-Shuo; Wu, Qun; Zhang, Kuang; He, Xun-Jun; Wang, Yue

    2016-06-01

Based on the theory of transformation optics, a type of homogeneous illusion device exhibiting a transformed and shifted scattering effect is proposed in this paper. The constitutive parameters of the proposed device are derived, and full-wave simulations are performed to validate the electromagnetic properties of the transformed and shifted scattering effect. The simulation results show that the proposed device not only can visually shift the image of a target in two dimensions, but also can visually transform the shape of the target. It is expected that such a homogeneous illusion device could find potential applications in military camouflage and other fields of electromagnetic engineering.

  2. Homogenization theory for designing graded viscoelastic sonic crystals

    NASA Astrophysics Data System (ADS)

    Qu, Zhao-Liang; Ren, Chun-Yu; Pei, Yong-Mao; Fang, Dai-Ning

    2015-02-01

    In this paper, we propose a homogenization theory for designing graded viscoelastic sonic crystals (VSCs) which consist of periodic arrays of elastic scatterers embedded in a viscoelastic host material. We extend an elastic homogenization theory to VSC by using the elastic-viscoelastic correspondence principle and propose an analytical effective loss factor of VSC. The results of VSC and the equivalent structure calculated by using the finite element method are in good agreement. According to the relation of the effective loss factor to the filling fraction, a graded VSC plate is easily and quickly designed. Then, the graded VSC may have potential applications in the vibration absorption and noise reduction fields. Project supported by the National Basic Research Program of China (Grant No. 2011CB610301).

  3. Some variance reduction methods for numerical stochastic homogenization

    PubMed Central

    Blanc, X.; Le Bris, C.; Legoll, F.

    2016-01-01

    We give an overview of a series of recent studies devoted to variance reduction techniques for numerical stochastic homogenization. Numerical homogenization requires that a set of problems is solved at the microscale, the so-called corrector problems. In a random environment, these problems are stochastic and therefore need to be repeatedly solved, for several configurations of the medium considered. An empirical average over all configurations is then performed using the Monte Carlo approach, so as to approximate the effective coefficients necessary to determine the macroscopic behaviour. Variance severely affects the accuracy and the cost of such computations. Variance reduction approaches, borrowed from other contexts in the engineering sciences, can be useful. Some of these variance reduction techniques are presented, studied and tested here. PMID:27002065
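
    One classical variance-reduction technique in this family is antithetic variates. The sketch below applies it to a toy 1D stochastic homogenization problem, where the homogenized coefficient of an i.i.d. random coefficient field is its harmonic mean, so each "corrector solve" reduces to an average; this is an illustration of the idea only, not the estimators studied by the authors, and all parameter values are assumptions.

        import numpy as np

        rng = np.random.default_rng(2)
        n_cells, n_mc = 1000, 200            # cells per realization, Monte Carlo realizations

        def homogenized_coeff(u):
            """Toy 1D 'corrector solve': with i.i.d. cell coefficients a = 1 + u,
            the homogenized coefficient is the harmonic mean of a over the cells."""
            a = 1.0 + u                      # coefficients in [1, 2]
            return 1.0 / np.mean(1.0 / a)

        # Plain Monte Carlo
        plain = np.array([homogenized_coeff(rng.random(n_cells)) for _ in range(n_mc)])

        # Antithetic variates: pair each realization u with 1 - u and average the two solves
        anti = []
        for _ in range(n_mc // 2):
            u = rng.random(n_cells)
            anti.append(0.5 * (homogenized_coeff(u) + homogenized_coeff(1.0 - u)))
        anti = np.array(anti)

        print(f"plain MC      : A* = {plain.mean():.4f}, var = {plain.var():.2e}")
        print(f"antithetic MC : A* = {anti.mean():.4f}, var = {anti.var():.2e}")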

  4. Homogeneity of Ge-rich nanostructures as characterized by chemical etching and transmission electron microscopy.

    PubMed

    Bollani, Monica; Chrastina, Daniel; Montuori, Valeria; Terziotti, Daniela; Bonera, Emiliano; Vanacore, Giovanni M; Tagliaferri, Alberto; Sordan, Roman; Spinella, Corrado; Nicotra, Giuseppe

    2012-02-03

    The extension of SiGe technology towards new electronic and optoelectronic applications on the Si platform requires that Ge-rich nanostructures be obtained in a well-controlled manner. Ge deposition on Si substrates usually creates SiGe nanostructures with relatively low and inhomogeneous Ge content. We have realized SiGe nanostructures with a very high (up to 90%) Ge content. Using substrate patterning, a regular array of nanostructures is obtained. We report that electron microscopy reveals an abrupt change in Ge content of about 20% between the filled pit and the island, which has not been observed in other Ge island systems. Dislocations are mainly found within the filled pit and only rarely in the island. Selective chemical etching and electron energy-loss spectroscopy reveal that the island itself is homogeneous. These Ge-rich islands are possible candidates for electronic applications requiring locally induced stress, and optoelectronic applications which exploit the Ge-like band structure of Ge-rich SiGe.

  5. Superconductivity modelling: Homogenization of Bean's model in three dimensions, and the problem of transverse conductivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bossavit, A.

The authors show how to pass from the local Bean's model, assumed to be valid as a behavior law for a homogeneous superconductor, to a model of similar form, valid on a larger space scale. The process, which can be iterated to higher and higher space scales, consists in solving for the fields e and j over a "periodicity cell" with periodic boundary conditions.

  6. Does Double Loop Learning Create Reliable Knowledge?

    ERIC Educational Resources Information Center

    Blackman, Deborah; Connelly, James; Henderson, Steven

    2004-01-01

    This paper addresses doubts concerning the reliability of knowledge being created by double loop learning processes. Popper's ontological worlds are used to explore the philosophical basis of the way that individual experiences are turned into organisational knowledge, and such knowledge is used to generate organisational learning. The paper…

  7. [Experimental-morphological study of morphogenetic potencies of homogeneous aggregates of different types of cells from the freshwater sponge Ephydatia fluviatilis (L.)].

    PubMed

    Nikitin, N S

    1977-01-01

The morphogenetic potencies of somatic cells of the fresh-water sponge Ephydatia fluviatilis in developing aggregates depend on their initial specialization and on the number of cells in the aggregate. Aggregates of nucleolar amoebocytes consisting of 500 or more cells have the highest morphogenetic potencies. All main cell types can arise in developing homogeneous aggregates of nucleolar amoebocytes. The fine structure of nucleolar amoebocytes at different stages of development of the homogeneous aggregates was studied by electron microscopy. The structural rearrangements accompanying the redifferentiation of nucleolar amoebocytes into other cell types are described.

  8. Altered regional homogeneity of spontaneous brain activity in idiopathic trigeminal neuralgia.

    PubMed

    Wang, Yanping; Zhang, Xiaoling; Guan, Qiaobing; Wan, Lihong; Yi, Yahui; Liu, Chun-Feng

    2015-01-01

The pathophysiology of idiopathic trigeminal neuralgia (ITN) has conventionally been explained by the neurovascular compression theory. Recent structural brain imaging evidence has suggested an additional central component to ITN pathophysiology. However, far less attention has been given to investigations of abnormal resting-state brain activity in these patients. The objective of this study was to investigate local brain activity in patients with ITN and its correlation with clinical variables of pain. Resting-state functional magnetic resonance imaging data from 17 patients with ITN and 19 age- and sex-matched healthy controls were analyzed using regional homogeneity (ReHo) analysis, a data-driven approach used to measure the regional synchronization of spontaneous brain activity. Patients with ITN had decreased ReHo in the left amygdala, right parahippocampal gyrus, and left cerebellum and increased ReHo in the right inferior temporal gyrus, right thalamus, right inferior parietal lobule, and left postcentral gyrus (corrected). Furthermore, the increase in ReHo in the left precentral gyrus was positively correlated with visual analog scale scores (r=0.54; P=0.002). Our study found abnormal functional homogeneity of intrinsic brain activity in several regions in ITN, suggesting maladaptive processing of daily pain attacks and a central role in the pathophysiology of ITN.
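
    Regional homogeneity is commonly computed as Kendall's coefficient of concordance (Kendall's W) over the time courses of a voxel and its neighbours; the sketch below applies that formula to synthetic time series and is only meant to show the quantity, not to reproduce the authors' processing pipeline. The neighbourhood size, time-series length and noise levels are assumptions.

        import numpy as np
        from scipy.stats import rankdata

        def kendalls_w(ts):
            """Kendall's coefficient of concordance for an (m, n) array:
            m time series (voxel + neighbours), n time points."""
            m, n = ts.shape
            ranks = np.apply_along_axis(rankdata, 1, ts)   # rank each series over time
            r_i = ranks.sum(axis=0)                        # summed ranks per time point
            s = np.sum((r_i - r_i.mean()) ** 2)
            return 12.0 * s / (m ** 2 * (n ** 3 - n))

        rng = np.random.default_rng(3)
        shared = rng.normal(size=200)                           # common fluctuation
        cluster = shared + 0.5 * rng.normal(size=(27, 200))     # toy 27-voxel neighbourhood
        print(f"ReHo (KCC) of a coherent neighbourhood: {kendalls_w(cluster):.3f}")
        print(f"ReHo (KCC) of pure noise:               {kendalls_w(rng.normal(size=(27, 200))):.3f}")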

  9. Population dynamics in non-homogeneous environments

    NASA Astrophysics Data System (ADS)

    Alards, Kim M. J.; Tesser, Francesca; Toschi, Federico

    2014-11-01

For organisms living in aquatic ecosystems the presence of fluid transport can have a strong influence on the dynamics of populations and on the evolution of species. In particular, displacements due to self-propulsion, combined with turbulent dispersion at larger scales, strongly influence local densities and thus population and genetic dynamics. Real marine environments are furthermore characterized by a high degree of non-homogeneity. In the case of population fronts propagating in "fast" turbulence, relative to the population duplication time, the flow effect can be studied by replacing the microscopic diffusivity with an effective turbulent diffusivity. In the opposite case of "slow" turbulence the advection by the flow has to be considered locally. Here we employ numerical simulations to study the influence of non-homogeneities in the diffusion coefficient of reacting individuals of different species expanding in a two-dimensional space. Moreover, to explore the influence of advection, we consider a population expanding in the presence of simple velocity fields such as cellular flows. The output is analyzed in terms of front roughness, front shape, propagation speed and, concerning the genetics, by means of heterozygosity and local and global extinction probabilities.

  10. Topology of actions and homogeneous spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozlov, Konstantin L

    2013-04-30

Topologization of a group of homeomorphisms and its action provide additional possibilities for studying the topological space, the group of homeomorphisms, and their interconnections. The subject of the paper is the use of the property of d-openness of an action (introduced by Ancel under the name of weak micro-transitivity) in the study of spaces with various forms of homogeneity. It is proved that a d-open action of a Cech-complete group is open. A characterization of Polish SLH spaces using d-openness is given, and it is established that any separable metrizable SLH space has an SLH completion that is a Polish space. Furthermore, the completion is realized in coordination with the completion of the acting group with respect to the two-sided uniformity. A sufficient condition is given for extension of a d-open action to the completion of the space with respect to the maximal equiuniformity with preservation of d-openness. A result of van Mill is generalized, namely, it is proved that any homogeneous CDH metrizable compactum is the only G-compactification of the space of rational numbers for the action of some Polish group. Bibliography: 39 titles.

  11. RELIABLE COMPUTATION OF HOMOGENEOUS AZEOTROPES. (R824731)

    EPA Science Inventory

    Abstract

    It is important to determine the existence and composition of homogeneous azeotropes in the analysis of phase behavior and in the synthesis and design of separation systems, from both theoretical and practical standpoints. A new method for reliably locating an...

  12. Homogeneous and heterogeneous micro-structuring of austenitic stainless steels by the low temperature plasma nitriding

    NASA Astrophysics Data System (ADS)

    Aizawa, T.; Yoshihara, S.-I.

    2018-06-01

Austenitic stainless steels are widely utilized as structural components and members as well as die and mold substrates for stamping. AISI316 dies and molds require a surface treatment that provides sufficient hardness and wear resistance. In addition, candidate treatment methods must avoid toxicity, high energy consumption and inefficiency. The low temperature plasma nitriding process has become one of the most promising methods to achieve solid-solution hardening by nitrogen super-saturation. In the present paper, a high density RF/DC plasma nitriding process was applied to form a uniform nitrided layer in the AISI316 matrix and to describe the essential mechanism of inner nitriding in this low temperature process. For AISI316 nitrided at 673 K for 14.4 ks, the nitrided layer thickness reached 60 μm with a surface hardness of 1700 HV and a surface nitrogen content of 7 mass%. This inner nitriding process is governed by the synergetic interrelation among nitrogen super-saturation, lattice expansion, phase transformation, plastic straining, microstructure refinement and the acceleration of nitrogen diffusion. As long as this interrelation is sustained during nitriding, the original austenitic microstructure is homogeneously nitrided to form fine grains with an average size of 0.1 μm and high crystallographic misorientation angles, and to form a two-phase (γ + α') structure with a plateau of nitrogen content at about 5 mass%. Once this interrelation no longer operates, the homogeneous microstructure becomes heterogeneous. Plastic straining takes place in selected coarse grains, which are partially refined into subgrains. This plastic localization is accompanied by localized phase transformation.

  13. Process for producing advanced ceramics

    DOEpatents

    Kwong, Kyei-Sing

    1996-01-01

A process for the synthesis of homogeneous advanced ceramics such as SiC+AlN, SiAlON, SiC+Al2O3, and Si3N4+AlN from natural clays such as kaolin, halloysite and montmorillonite by an intercalation and heat treatment method. Included are the steps of refining the clays, intercalating organic compounds into the layered structure of the clays, drying the intercalated mixture, firing the treated clays in controlled atmospheres and grinding the loosely agglomerated structure. Advanced ceramics produced by this procedure have the advantages of homogeneity, cost effectiveness, simplicity of manufacture, ease of grinding and a short process time. Advanced ceramics produced by this process can be used for refractory, wear-part and structural ceramics.

  14. Genetic homogeneity of Fascioloides magna in Austria.

    PubMed

    Husch, Christian; Sattmann, Helmut; Hörweg, Christoph; Ursprung, Josef; Walochnik, Julia

    2017-08-30

    The large American liver fluke, Fascioloides magna, is an economically relevant parasite of both domestic and wild ungulates. F. magna was repeatedly introduced into Europe, for the first time already in the 19th century. In Austria, a stable population of F. magna has established in the Danube floodplain forests southeast of Vienna. The aim of this study was to determine the genetic diversity of F. magna in Austria. A total of 26 individuals from various regions within the known area of distribution were investigated for their cytochrome oxidase subunit 1 (cox1) and nicotinamide dehydrogenase subunit 1 (nad1) gene haplotypes. Interestingly, all 26 individuals revealed one and the same haplotype, namely concatenated haplotype Ha5. This indicates a homogenous population of F. magna in Austria and may argue for a single introduction. Alternatively, genetic homogeneity might also be explained by a bottleneck effect and/or genetic drift. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Quantum memory with a controlled homogeneous splitting

    NASA Astrophysics Data System (ADS)

    Hétet, G.; Wilkowski, D.; Chanelière, T.

    2013-04-01

We propose a quantum memory protocol where an input light field can be stored onto and released from a single ground-state atomic ensemble by dynamically controlling the strength of an external static and homogeneous field. The technique relies on the adiabatic following of a polaritonic excitation onto a state for which the forward collective radiative emission is forbidden. The resemblance with the archetypal electromagnetically induced transparency is only formal because no ground-state coherence-based slow-light propagation is considered here. As compared to the other grand category of protocols derived from the photon-echo technique, our approach only involves a homogeneous static field. We discuss two physical situations where the effect can be observed, and show that in the limit where the excited-state lifetime is longer than the storage time, the protocols are perfectly efficient and noise free. We compare the technique with other quantum memories, and propose atomic systems where the experiment can be realized.

  16. Theoretical investigation of mixing in warm clouds – Part 2: Homogeneous mixing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pinsky, Mark; Khain, Alexander; Korolev, Alexei

Evolution of monodisperse and polydisperse droplet size distributions (DSD) during homogeneous mixing is analyzed. Time-dependent universal analytical expressions for supersaturation and liquid water content are derived. For an initial monodisperse DSD, these quantities are shown to depend on a sole non-dimensional parameter. The evolution of moments and moment-related functions in the course of homogeneous evaporation of polydisperse DSD is analyzed using a parcel model. It is shown that the classic conceptual scheme, according to which homogeneous mixing leads to a decrease in droplet mass at constant droplet concentration, is valid only in cases of monodisperse or initially very narrow polydisperse DSD. In cases of wide polydisperse DSD, mixing and successive evaporation lead to a decrease of both mass and concentration, so the characteristic droplet sizes remain nearly constant. As this feature is typically associated with inhomogeneous mixing, we conclude that in cases of an initially wide DSD at cloud top, homogeneous mixing is nearly indistinguishable from inhomogeneous mixing.

  17. Theoretical investigation of mixing in warm clouds – Part 2: Homogeneous mixing

    DOE PAGES

    Pinsky, Mark; Khain, Alexander; Korolev, Alexei; ...

    2016-07-28

Evolution of monodisperse and polydisperse droplet size distributions (DSD) during homogeneous mixing is analyzed. Time-dependent universal analytical expressions for supersaturation and liquid water content are derived. For an initial monodisperse DSD, these quantities are shown to depend on a sole non-dimensional parameter. The evolution of moments and moment-related functions in the course of homogeneous evaporation of polydisperse DSD is analyzed using a parcel model. It is shown that the classic conceptual scheme, according to which homogeneous mixing leads to a decrease in droplet mass at constant droplet concentration, is valid only in cases of monodisperse or initially very narrow polydisperse DSD. In cases of wide polydisperse DSD, mixing and successive evaporation lead to a decrease of both mass and concentration, so the characteristic droplet sizes remain nearly constant. As this feature is typically associated with inhomogeneous mixing, we conclude that in cases of an initially wide DSD at cloud top, homogeneous mixing is nearly indistinguishable from inhomogeneous mixing.

  18. Creating Sub-50 Nm Nanofluidic Junctions in PDMS Microfluidic Chip via Self-Assembly Process of Colloidal Particles

    PubMed Central

    Wei, Xi; Syed, Abeer; Mao, Pan; Han, Jongyoon; Song, Yong-Ak

    2016-01-01

    Polydimethylsiloxane (PDMS) is the prevailing building material to make microfluidic devices due to its ease of molding and bonding as well as its transparency. Due to the softness of the PDMS material, however, it is challenging to use PDMS for building nanochannels. The channels tend to collapse easily during plasma bonding. In this paper, we present an evaporation-driven self-assembly method of silica colloidal nanoparticles to create nanofluidic junctions with sub-50 nm pores between two microchannels. The pore size as well as the surface charge of the nanofluidic junction is tunable simply by changing the colloidal silica bead size and surface functionalization outside of the assembled microfluidic device in a vial before the self-assembly process. Using the self-assembly of nanoparticles with a bead size of 300 nm, 500 nm, and 900 nm, it was possible to fabricate a porous membrane with a pore size of ~45 nm, ~75 nm and ~135 nm, respectively. Under electrical potential, this nanoporous membrane initiated ion concentration polarization (ICP) acting as a cation-selective membrane to concentrate DNA by ~1,700 times within 15 min. This non-lithographic nanofabrication process opens up a new opportunity to build a tunable nanofluidic junction for the study of nanoscale transport processes of ions and molecules inside a PDMS microfluidic chip. PMID:27023724

  19. Homogenization via Sequential Projection to Nested Subspaces Spanned by Orthogonal Scaling and Wavelet Orthonormal Families of Functions

    DTIC Science & Technology

    2008-07-01

The homogenization procedure through successive multi-resolution projections onto nested subspaces spanned by orthogonal scaling and wavelet families is presented, followed by a numerical example. The report is intended to be essentially self-contained, with the relevant mathematical (Greenberg 1978; Gilbert 2006) and signal processing (Strang and Nguyen 1995) literature listed in the references. The ideas behind multi-resolution analysis unfold from the theory of linear operators in Hilbert spaces (Davis 1975).

  20. Creating a standardized process to offer the standard of care: continuous process improvement methodology is associated with increased rates of sperm cryopreservation among adolescent and young adult males with cancer.

    PubMed

    Shnorhavorian, Margarett; Kroon, Leah; Jeffries, Howard; Johnson, Rebecca

    2012-11-01

    There is limited literature on strategies to overcome the barriers to sperm banking among adolescent and young adult (AYA) males with cancer. By standardizing our process for offering sperm banking to AYA males before cancer treatment, we aimed to improve rates of sperm banking at our institution. Continuous process improvement is a technique that has recently been applied to improve health care delivery. We used continuous process improvement methodologies to create a standard process for fertility preservation for AYA males with cancer at our institution. We compared rates of sperm banking before and after standardization. In the 12-month period after implementation of a standardized process, 90% of patients were offered sperm banking. We demonstrated an 8-fold increase in the proportion of AYA males' sperm banking, and a 5-fold increase in the rate of sperm banking at our institution. Implementation of a standardized process for sperm banking for AYA males with cancer was associated with increased rates of sperm banking at our institution. This study supports the role of standardized health care in decreasing barriers to sperm banking.

  1. Creating nanoscale emulsions using condensation.

    PubMed

    Guha, Ingrid F; Anand, Sushant; Varanasi, Kripa K

    2017-11-08

    Nanoscale emulsions are essential components in numerous products, ranging from processed foods to novel drug delivery systems. Existing emulsification methods rely either on the breakup of larger droplets or solvent exchange/inversion. Here we report a simple, scalable method of creating nanoscale water-in-oil emulsions by condensing water vapor onto a subcooled oil-surfactant solution. Our technique enables a bottom-up approach to forming small-scale emulsions. Nanoscale water droplets nucleate at the oil/air interface and spontaneously disperse within the oil, due to the spreading dynamics of oil on water. Oil-soluble surfactants stabilize the resulting emulsions. We find that the oil-surfactant concentration controls the spreading behavior of oil on water, as well as the peak size, polydispersity, and stability of the resulting emulsions. Using condensation, we form emulsions with peak radii around 100 nm and polydispersities around 10%. This emulsion formation technique may open different routes to creating emulsions, colloidal systems, and emulsion-based materials.

  2. Homogenous Surface Nucleation of Solid Polar Stratospheric Cloud Particles

    NASA Technical Reports Server (NTRS)

    Tabazadeh, A.; Hamill, P.; Salcedo, D.; Gore, Warren J. (Technical Monitor)

    2002-01-01

A general surface nucleation rate theory is presented for the homogeneous freezing of crystalline germs on the surfaces of aqueous particles. While nucleation rates in a standard classical homogeneous freezing rate theory scale with volume, the rates in a surface-based theory scale with surface area. The theory is used to convert volume-based information on laboratory freezing rates (in units of cm-3 s-1) of nitric acid trihydrate (NAT) and nitric acid dihydrate (NAD) aerosols into surface-based values (in units of cm-2 s-1). We show that a surface-based model is capable of reproducing measured nucleation rates of NAT and NAD aerosols from concentrated aqueous HNO3 solutions in the temperature range of 165 to 205 K. Laboratory-measured nucleation rates are used to derive free energies for NAT and NAD germ formation in the stratosphere. NAD germ free energies range from about 23 to 26 kcal/mol, allowing for fast and efficient homogeneous NAD particle production in the stratosphere. However, NAT germ formation energies are large enough (greater than 26 kcal/mol) to prevent efficient NAT particle production in the stratosphere. We show that the atmospheric NAD particle production rates based on the surface rate theory are roughly 2 orders of magnitude larger than those obtained from a standard volume-based rate theory. Atmospheric volume and surface production of NAD particles will nearly cease in the stratosphere when denitrification in the air exceeds 40 and 78%, respectively. We show that a surface-based (volume-based) homogeneous freezing rate theory gives particle production rates which are (not) consistent with both laboratory and atmospheric data on the nucleation of solid polar stratospheric cloud particles.
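
    The bookkeeping difference between the two scalings can be illustrated with a short sketch: for a droplet of radius r, a volume-based rate J_v (cm-3 s-1) gives J_v·V germs per droplet per second, while a surface-based rate J_s (cm-2 s-1) gives J_s·A, so the two pictures diverge as the droplet size changes. The rate values below are placeholders chosen for illustration only and are not the laboratory or derived stratospheric rates discussed in the abstract.

        import math

        def per_droplet_rates(radius_cm, j_volume, j_surface):
            """Germ production per droplet per second under volume- and surface-based scaling."""
            volume = 4.0 / 3.0 * math.pi * radius_cm ** 3
            area = 4.0 * math.pi * radius_cm ** 2
            return j_volume * volume, j_surface * area

        # Illustrative placeholder rates (NOT the laboratory values in the abstract)
        j_v = 1.0e4        # cm^-3 s^-1
        j_s = 1.0e0        # cm^-2 s^-1

        for r_um in (0.1, 0.5, 1.0):                # assumed droplet radii, micrometres
            r_cm = r_um * 1e-4
            p_vol, p_surf = per_droplet_rates(r_cm, j_v, j_s)
            print(f"r = {r_um:4.1f} um : volume-based {p_vol:.2e} s^-1, surface-based {p_surf:.2e} s^-1")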

  3. Chromium isotopic homogeneity between the Moon, the Earth, and enstatite chondrites

    NASA Astrophysics Data System (ADS)

    Mougel, Bérengère; Moynier, Frédéric; Göpel, Christa

    2018-01-01

Among the elements exhibiting non-mass-dependent isotopic variations in meteorites, chromium (Cr) has been central in arguing for an isotopic homogeneity between the Earth and the Moon, thus questioning physical models of Moon formation. However, the Cr isotopic composition of the Moon relies on two samples only, which define an average value that is slightly different from the terrestrial standard. Here, by determining the Cr isotopic composition of 17 lunar, 9 terrestrial and 5 enstatite chondrite samples, we re-assess the isotopic similarity between these different planetary bodies and provide the first robust estimate for the Moon. On average, terrestrial and enstatite samples show similar ε54Cr. On the other hand, lunar samples show variable excesses of 53Cr and 54Cr compared to terrestrial and enstatite chondrite samples, with correlated ε53Cr and ε54Cr (parts per 10,000 deviation of the 53Cr/52Cr and 54Cr/52Cr ratios, normalized to the 50Cr/52Cr ratio, from the NIST SRM 3112a Cr standard). Unlike previous suggestions, we show for the first time that cosmic irradiation can significantly affect the Cr isotopic composition of lunar materials. Moreover, we also suggest that, rather than spallation reactions, neutron capture effects are the dominant process controlling the Cr isotope composition of lunar igneous rocks. This is supported by the correlation between ε53Cr and ε54Cr, and 150Sm/152Sm ratios. After correction for these effects, the average ε54Cr of the Moon is indistinguishable from the terrestrial and enstatite chondrite materials, reinforcing the idea of an Earth-Moon-enstatite chondrite system homogeneity. This is compatible with the most recent scenarios of Moon formation suggesting an efficient physical homogenization after a high-energy impact on a fast-spinning Earth, and/or with an impactor originating from the same reservoir in the inner proto-planetary disk as the Earth and enstatite chondrites and having a similar composition.

  4. Ultra-High Pressure Homogenization improves oxidative stability and interfacial properties of soy protein isolate-stabilized emulsions.

    PubMed

    Fernandez-Avila, C; Trujillo, A J

    2016-10-15

Ultra-High Pressure Homogenization (100-300 MPa) has great potential for technological, microbiological and nutritional aspects of fluid processing. Its effect on the oxidative stability and interfacial properties of oil-in-water emulsions prepared with 4% (w/v) soy protein isolate and soybean oil (10 and 20%, v/v) was studied and compared to emulsions treated by conventional homogenization (15 MPa). Emulsions were characterized by particle size, emulsifying activity index, surface protein concentration at the interface and by transmission electron microscopy. Primary and secondary lipid oxidation products were evaluated in emulsions upon storage. Emulsions with 20% oil treated at 100 and 200 MPa exhibited the highest oxidative stability due to the higher amounts of oil and protein surface load at the interface. This manuscript addresses the improvement in oxidative stability of emulsions treated by UHPH compared to conventional emulsions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Homogeneity of Political Party Preferences between Spouses

    ERIC Educational Resources Information Center

    Weiner, Terry S.

    1978-01-01

    A study investigated how political socialization within marriages affects political party affiliation. Findings indicated that the high degree of homogeneity of party preferences between spouses is due both to assortive mating and to influences of each spouse upon the other. For journal availability, see SO 506 355. (Author/DB)

  6. Extension theorems for homogenization on lattice structures

    NASA Technical Reports Server (NTRS)

    Miller, Robert E.

    1992-01-01

    When applying homogenization techniques to problems involving lattice structures, it is necessary to extend certain functions defined on a perforated domain to a simply connected domain. This paper provides general extension operators which preserve bounds on derivatives of order l. Only the special case of honeycomb structures is considered.

  7. Partial stabilisation of non-homogeneous bilinear systems

    NASA Astrophysics Data System (ADS)

    Hamidi, Z.; Ouzahra, M.

    2018-06-01

In this work, we study, in a Hilbert state space, the partial stabilisation of non-homogeneous bilinear systems using a bounded control. Necessary and sufficient conditions for weak and strong stabilisation are formulated in terms of approximate-observability-like assumptions. Applications to parabolic and hyperbolic equations are presented.

  8. Neural plasticity in amplitude of low frequency fluctuation, cortical hub construction, regional homogeneity resulting from working memory training.

    PubMed

    Takeuchi, Hikaru; Taki, Yasuyuki; Nouchi, Rui; Sekiguchi, Atsushi; Kotozaki, Yuka; Nakagawa, Seishu; Makoto Miyauchi, Carlos; Sassa, Yuko; Kawashima, Ryuta

    2017-05-03

Working memory training (WMT) induces changes in cognitive function and various neurological systems. Here, we investigated changes in recently developed resting-state functional magnetic resonance imaging measures of global information processing [degree centrality (DC), the degree of the cortical hub, which may have a central role in information integration in the brain], the magnitude of intrinsic brain activity [fractional amplitude of low frequency fluctuation (fALFF)], and local connectivity (regional homogeneity) in young adults who either underwent WMT or received no intervention for 4 weeks. Compared with no intervention, WMT increased DC in an anatomical cluster extending from the anterior cingulate cortex (ACC) to the medial prefrontal cortex (mPFC). Furthermore, WMT increased fALFF in an anatomical cluster including the right dorsolateral prefrontal cortex (DLPFC), frontopolar area and mPFC. WMT increased regional homogeneity in an anatomical cluster that spread from the precuneus to the posterior cingulate cortex and posterior parietal cortex. These results suggest WMT-induced plasticity in spontaneous brain activity and in global and local information processing in areas of the major networks of the brain during rest.

  9. Type of homogenization and fat loss during continuous infusion of human milk.

    PubMed

    García-Lara, Nadia Raquel; Escuder-Vieco, Diana; Alonso Díaz, Clara; Vázquez Román, Sara; De la Cruz-Bértolo, Javier; Pallás-Alonso, Carmen Rosa

    2014-11-01

Substantial fat loss may occur during continuous feeding of human milk (HM). A decrease in fat loss has been described following homogenization. Well-established methods of homogenizing HM for routine use in the neonatal intensive care unit (NICU) would be desirable. We compared the fat loss associated with 3 different methods of homogenizing thawed HM during continuous feeding. Sixteen frozen donor HM samples were thawed, homogenized with ultrasound, separated into 3 aliquots ("baseline agitation," "hourly agitation," and "ultrasound"), and then frozen for 48 hours. Aliquots were thawed again and a baseline agitation was applied. Subsequently, the baseline agitation and hourly agitation aliquots were drawn into syringes, while ultrasound was applied to the ultrasound aliquot before it was drawn into a syringe. The syringes were loaded into a pump (2 mL/h; 4 hours). At hourly intervals the hourly agitation infusion was stopped, and the syringe was disconnected and gently shaken. During infusion, samples from the 3 groups were collected hourly for analysis of fat and caloric content. The 3 groups showed similar fat content at the beginning of the infusion. For fat, mean (SD) hourly changes of -0.03 (0.01), -0.09 (0.01), and -0.09 (0.01) g/dL were observed for the hourly agitation, baseline agitation, and ultrasound groups, respectively. The decrease was smaller for the hourly agitation group (P < .001). When thawed HM is continuously infused, a smaller fat loss is observed when syringes are agitated hourly than when ultrasound or only a baseline homogenization is used. © The Author(s) 2014.

  10. Identification of hydrologically homogeneous regions in Ganga-Brahmaputra river basin using Self Organising Maps

    NASA Astrophysics Data System (ADS)

    Ojha, C. S. P.; Sharma, C.

    2017-12-01

Identification of hydrologically homogeneous regions is crucial for topographically complex regions such as Himalayan river basins. The Ganga-Brahmaputra river basin extends through three countries: India, Nepal and China. High elevations and rugged topography pose challenges for in-situ gauges, so it is commonly recommended to use data from hydrologically similar sites in the absence of site records. We identify hydrologically homogeneous regions in the Ganga-Brahmaputra river basin using Self-Organising Maps (SOM). The station characteristics used for identification of homogeneous regions are annual precipitation, total wet-season (July to September) precipitation, total dry-season (January to March) precipitation, latitude, longitude and elevation. Precipitation data were obtained from the Climate Research Unit (CRU). The number of clusters is determined using hierarchical k-means clustering. We find that the basin can be divided into 9 clusters. Division into clusters alone does not guarantee that the identified clusters are homogeneous; their homogeneity is therefore assessed using the Hosking-Wallis heterogeneity test. All the clusters were found to be acceptably homogeneous, with values of the Hosking-Wallis test statistic H < 1.
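
    The clustering step described above (grouping stations by precipitation and location/elevation attributes, here into 9 clusters) can be sketched with a plain k-means run on standardized station attributes. The stations below are synthetic placeholders with assumed attribute ranges, not the CRU-derived data, and the SOM stage and the Hosking-Wallis homogeneity check are not shown.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(4)
        n_stations = 300
        # Columns: annual P, wet-season P, dry-season P (mm), latitude, longitude, elevation (m)
        stations = np.column_stack([
            rng.uniform(300, 3000, n_stations),
            rng.uniform(200, 2500, n_stations),
            rng.uniform(10, 400, n_stations),
            rng.uniform(22, 31, n_stations),
            rng.uniform(73, 98, n_stations),
            rng.uniform(10, 5000, n_stations),
        ])

        x = StandardScaler().fit_transform(stations)       # put attributes on a common scale
        labels = KMeans(n_clusters=9, n_init=10, random_state=0).fit_predict(x)
        for k in range(9):
            print(f"cluster {k}: {np.sum(labels == k)} stations")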

  11. Investigation of methods for hydroclimatic data homogenization

    NASA Astrophysics Data System (ADS)

    Steirou, E.; Koutsoyiannis, D.

    2012-04-01

    We investigate the methods used for the adjustment of inhomogeneities of temperature time series covering the last 100 years. Based on a systematic study of scientific literature, we classify and evaluate the observed inhomogeneities in historical and modern time series, as well as their adjustment methods. It turns out that these methods are mainly statistical, not well justified by experiments and are rarely supported by metadata. In many of the cases studied the proposed corrections are not even statistically significant. From the global database GHCN-Monthly Version 2, we examine all stations containing both raw and adjusted data that satisfy certain criteria of continuity and distribution over the globe. In the United States, because of the large number of available stations, a subset was chosen by suitable sampling. In total we analyzed 181 stations globally. For these stations we calculated the differences between the adjusted and non-adjusted linear 100-year trends. It was found that in two-thirds of the cases, the homogenization procedure increased the positive or decreased the negative temperature trends. One of the most common homogenization methods, 'SNHT for single shifts', was applied to synthetic time series with selected statistical characteristics, occasionally with offsets. The method was satisfactory when applied to independent, normally distributed data, but not to data with long-term persistence. The above results cast some doubt on the use of homogenization procedures and tend to indicate that the global temperature increase during the last century is between 0.4°C and 0.7°C, where these two values are the estimates derived from raw and adjusted data, respectively.
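
    The core comparison, the difference between adjusted and raw 100-year linear trends, can be sketched as follows; the series and the single step adjustment below are purely synthetic, not GHCN data:

      import numpy as np

      def century_trend(annual_series):
          """Linear trend of an annual temperature series, expressed per 100 years."""
          years = np.arange(annual_series.size)
          return np.polyfit(years, annual_series, 1)[0] * 100.0

      rng = np.random.default_rng(0)
      raw = 0.005 * np.arange(100) + rng.normal(0.0, 0.3, 100)     # ~0.5 degC/century plus noise
      adjusted = raw + np.where(np.arange(100) >= 60, 0.2, 0.0)    # one upward shift applied late

      print("raw trend:     ", round(century_trend(raw), 2))
      print("adjusted trend:", round(century_trend(adjusted), 2))
      print("difference:    ", round(century_trend(adjusted) - century_trend(raw), 2))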

  12. Some variance reduction methods for numerical stochastic homogenization.

    PubMed

    Blanc, X; Le Bris, C; Legoll, F

    2016-04-28

    We give an overview of a series of recent studies devoted to variance reduction techniques for numerical stochastic homogenization. Numerical homogenization requires that a set of problems is solved at the microscale, the so-called corrector problems. In a random environment, these problems are stochastic and therefore need to be repeatedly solved, for several configurations of the medium considered. An empirical average over all configurations is then performed using the Monte Carlo approach, so as to approximate the effective coefficients necessary to determine the macroscopic behaviour. Variance severely affects the accuracy and the cost of such computations. Variance reduction approaches, borrowed from other contexts in the engineering sciences, can be useful. Some of these variance reduction techniques are presented, studied and tested here. © 2016 The Author(s).
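
    One of the simplest techniques in this family, antithetic variates, can be sketched as follows; the corrector solve is replaced by a cheap stand-in function, so only the variance-reduction mechanics are illustrated:

      import numpy as np

      rng = np.random.default_rng(1)

      def corrector_qoi(u):
          # Stand-in for an expensive corrector-problem solve: maps a random
          # micro-configuration u (a vector of uniforms) to a scalar contribution
          # to the effective coefficient.
          return np.exp(u.mean())

      d, n = 16, 2000

      # Plain Monte Carlo average over n independent configurations
      plain = np.array([corrector_qoi(rng.random(d)) for _ in range(n)])

      # Antithetic variates: pair each configuration u with its mirror 1 - u
      anti = np.array([0.5 * (corrector_qoi(u) + corrector_qoi(1.0 - u))
                       for u in (rng.random(d) for _ in range(n // 2))])

      print("plain MC variance of the estimator:  ", plain.var(ddof=1) / n)
      print("antithetic variance of the estimator:", anti.var(ddof=1) / (n // 2))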

  13. Reverse engineering of the homogeneous-entity product profiles based on CCD

    NASA Astrophysics Data System (ADS)

    Gan, Yong; Zhong, Jingru; Sun, Ning; Sun, Aoran

    2011-08-01

    This measurement system uses a layer-by-layer (delaminated) measurement principle and measures values in three perpendicular directions of the entity. As the measured entity is immersed in the liquid layer by layer, an image of every layer is collected by CCD and digitally processed. The basic measuring principle and the working process of the method are introduced. According to Archimedes' law, the buoyancy and the volume immersed at each layer depth are measured by an electronic balance and the corresponding mathematical models are established. By computing every layer's weight and centre of gravity with artificial-intelligence methods, the 3D coordinates of every minute entity cell in the different layers can be reckoned and the 3D contour of the entity reconstructed. The experimental results show that the method can measure any homogeneous entity that is insoluble in water; the measurement is fast and non-destructive, and entities with internal holes can also be measured.

  14. Improved model for detection of homogeneous production batches of electronic components

    NASA Astrophysics Data System (ADS)

    Kazakovtsev, L. A.; Orlov, V. I.; Stashkov, D. V.; Antamoshkin, A. N.; Masich, I. S.

    2017-10-01

    Supplying the electronic units of complex technical systems with electronic devices of proper quality is one of the most important problems in increasing whole-system reliability. Moreover, to reach the highest reliability of an electronic unit, the electronic devices of the same type must have equal characteristics that assure their coherent operation. The highest homogeneity of characteristics is reached if the electronic devices are manufactured as a single production batch, and each production batch must contain homogeneous raw materials. In this paper, we propose an improved model for detecting the homogeneous production batches within a shipped lot of electronic components, based on applying the kurtosis criterion to the results of non-destructive testing performed for each lot of electronic devices used in the space industry.

  15. HOMOGENEOUS NUCLEAR REACTOR

    DOEpatents

    Hammond, R.P.; Busey, H.M.

    1959-02-17

    Nuclear reactors of the homogeneous liquid fuel type are discussed. The reactor is comprised of an elongated closed vessel, vertically oriented, having a critical region at the bottom, a lower chimney structure extending from the critical region vertically upwardly and surrounded by heat exchanger coils, to a baffle region above which is located an upper chimney structure containing a catalyst functioning to recombine radiolytically dissociated moderator gases. In operation the liquid fuel circulates solely by convection from the critical region upwardly through the lower chimney and then downwardly through the heat exchanger to return to the critical region. The gases formed by radiolytic dissociation of the moderator are carried upwardly with the circulating liquid fuel and past the baffle into the region of the upper chimney where they are recombined by the catalyst and condensed, thence returning through the heat exchanger to the critical region.

  16. Quality Improvement: Creating a Float Pool Specialty Within a New Graduate Residency.

    PubMed

    Shinners, Jean; Alejandro, John Aldrich N; Frigillana, Vanessa; Desmond, Juliann; LaVigne, Ronda

    2016-01-01

    Creating new norms is essential for success as acute care leaders seek to redesign care delivery. Through the structures of the registered nurse (RN) residency and utilizing a quality improvement process, new graduate RNs demonstrated success in creating a centralized float pool resource.

  17. SU-E-T-76: Comparing Homogeneity Between Gafchromic Film EBT2 and EBT3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mizuno, H; Sumida, I; Ogawa, K

    2014-06-01

    Purpose: In a previous study we found that the homogeneity of EBT2 film differed among lot numbers. Variation in local homogeneity of EBT3 among several lot numbers has not been reported. In this study, we investigated the film homogeneity of Gafchromic EBT3 films compared with EBT2 films. Methods: All sheets from five lots were cut into 12 pieces to investigate film homogeneity and were irradiated at 0.5, 2, and 3 Gy. To investigate intra- and inter-sheet uniformity, five sheets from five lots were exposed to 2 Gy: intra-sheet uniformity was evaluated by the coefficient of variation of homogeneity over all pieces of a single sheet, and inter-sheet uniformity was evaluated by the coefficient of variation of homogeneity among the same piece numbers across the five sheets. To investigate the difference in ADC value at various doses, a single sheet from each of the five lots was irradiated at 0.5 Gy and 3 Gy in addition to 2 Gy. A scan resolution of 72 dots per inch (dpi) and a color depth of 48-bit RGB were used. Films were analyzed with in-house software; the average ADC value in a central ROI and profiles along the X and Y axes were measured. Results and Conclusion: Intra-sheet uniformity of non-irradiated EBT2 films ranged from 0.1% to 0.4%, whereas that of irradiated EBT2 films ranged from 0.2% to 1.5%. In contrast, intra-sheet uniformity of both irradiated and non-irradiated EBT3 films ranged from 0.2% to 0.6%. Inter-sheet uniformity of all films was less than 0.5%. Interestingly, the homogeneity of EBT3 was similar for non-irradiated and irradiated films, whereas EBT2 showed a dose dependence of homogeneity in the ADC-value evaluation. These results suggest that EBT3 homogeneity is improved in this respect.
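
    The uniformity metric itself is just a coefficient of variation; a minimal sketch with hypothetical ADC readings (not data from this study) is:

      import numpy as np

      def coefficient_of_variation(values):
          values = np.asarray(values, dtype=float)
          return values.std(ddof=1) / values.mean()

      # Hypothetical mean ADC values of the 12 pieces cut from one irradiated sheet
      pieces_one_sheet = [31240, 31310, 31180, 31290, 31350, 31220,
                          31270, 31300, 31190, 31260, 31330, 31210]
      print("intra-sheet CV: {:.2%}".format(coefficient_of_variation(pieces_one_sheet)))

      # Hypothetical mean ADC values of piece #1 taken from five different sheets
      piece1_across_sheets = [31240, 31280, 31200, 31300, 31260]
      print("inter-sheet CV: {:.2%}".format(coefficient_of_variation(piece1_across_sheets)))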

  18. Creating Intentional Spaces for Sustainable Development in the Indian Trans-Himalaya: Reconceptualizing Globalization from Below

    ERIC Educational Resources Information Center

    Shah, Payal

    2014-01-01

    In an era of globalization, multifaceted and complex changes have increasingly interconnected geographically dispersed places. A central question of globalization studies concerns whether top-down forces of globalization are forging a global culture or whether processes of globalization from below are able to push back against homogenization by…

  19. Partial sequence homogenization in the 5S multigene families may generate sequence chimeras and spurious results in phylogenetic reconstructions.

    PubMed

    Galián, José A; Rosato, Marcela; Rosselló, Josep A

    2014-03-01

    Multigene families have provided opportunities for evolutionary biologists to assess molecular evolution processes and phylogenetic reconstructions at deep and shallow systematic levels. However, the use of these markers is not free of technical and analytical challenges. Many evolutionary studies that used the nuclear 5S rDNA gene family rarely used contiguous 5S coding sequences due to the routine use of head-to-tail polymerase chain reaction primers that are anchored to the coding region. Moreover, the 5S coding sequences have been concatenated with independent, adjacent gene units in many studies, creating simulated chimeric genes as the raw data for evolutionary analysis. This practice is based on the tacitly assumed, but rarely tested, hypothesis that strict intra-locus concerted evolution processes are operating in 5S rDNA genes, without any empirical evidence as to whether it holds for the recovered data. The potential pitfalls of analysing the patterns of molecular evolution and reconstructing phylogenies based on these chimeric genes have not been assessed to date. Here, we compared the sequence integrity and phylogenetic behavior of entire versus concatenated 5S coding regions from a real data set obtained from closely related plant species (Medicago, Fabaceae). Our results suggest that within-array sequence homogenization is only partially operating in the 5S coding region, which is traditionally assumed to be highly conserved. Consequently, concatenating 5S genes increases haplotype diversity, generating novel chimeric genotypes that most likely do not exist within the genome. In addition, the patterns of gene evolution are distorted, leading to incorrect haplotype relationships in some evolutionary reconstructions.
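
    The effect of concatenation on haplotype diversity can be illustrated with a toy example (made-up five-base "coding regions", not Medicago data), as sketched below:

      from itertools import product

      # Toy intact 5S coding sequences cloned from one genome (two variable sites)
      entire_units = ["ACGTA", "ACGTA", "ACGCA", "TCGTA"]
      print("haplotypes among intact units:", len(set(entire_units)))        # 3

      # Concatenating 5' and 3' halves amplified from independent, adjacent units
      # can create chimeras that never existed as a contiguous molecule.
      halves_5p = {s[:3] for s in entire_units}
      halves_3p = {s[3:] for s in entire_units}
      chimeras = {a + b for a, b in product(halves_5p, halves_3p)}
      print("possible concatenated haplotypes:", len(chimeras))              # up to 4
      print("chimeras absent from the genome:", sorted(chimeras - set(entire_units)))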

  20. Optimization and characterization of high pressure homogenization produced chemically modified starch nanoparticles.

    PubMed

    Ding, Yongbo; Kan, Jianquan

    2017-12-01

    Chemically modified starch (RS4) nanoparticles were synthesized through homogenization and water-in-oil mini-emulsion cross-linking. Homogenization was optimized with regard to z-average diameter by using a three-factor, three-level Box-Behnken design. Homogenization pressure (X1), oil/water ratio (X2), and surfactant (X3) were selected as independent variables, whereas z-average diameter was considered the dependent variable. The following optimum preparation conditions were obtained to achieve the minimum average size of these nanoparticles: 50 MPa homogenization pressure, 10:1 oil/water ratio, and 2 g surfactant; under these conditions the predicted z-average diameter was 303.6 nm. The physicochemical properties of these nanoparticles were also determined. Dynamic light scattering experiments revealed that the RS4 nanoparticles had a PdI of 0.380 and an average size of approximately 300 nm, which was very close to the predicted z-average diameter (303.6 nm). The absolute value of the zeta potential of the RS4 nanoparticles (39.7 mV) was higher than that of RS4 (32.4 mV), and the swelling power was strengthened. X-ray diffraction results revealed that homogenization disrupted the crystalline structure of the RS4 nanoparticles, leading to an amorphous or low-crystallinity state. Stability analysis showed that the RS4 nanosuspensions (particle size) had good stability at 30 °C over 24 h.
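
    A minimal sketch of the response-surface step is given below: a full quadratic model fitted to a 15-run, three-factor Box-Behnken design and minimized over the coded design region. The response values are invented for illustration, not the measured diameters:

      import numpy as np
      from scipy.optimize import minimize

      # Coded levels (-1, 0, +1) of pressure (X1), oil/water ratio (X2), surfactant (X3)
      X = np.array([[-1,-1, 0],[ 1,-1, 0],[-1, 1, 0],[ 1, 1, 0],
                    [-1, 0,-1],[ 1, 0,-1],[-1, 0, 1],[ 1, 0, 1],
                    [ 0,-1,-1],[ 0, 1,-1],[ 0,-1, 1],[ 0, 1, 1],
                    [ 0, 0, 0],[ 0, 0, 0],[ 0, 0, 0]], dtype=float)
      y = np.array([420, 380, 400, 360, 430, 390, 370, 340,
                    410, 385, 355, 330, 310, 315, 305], dtype=float)   # z-average diameter (nm)

      def design_matrix(X):
          x1, x2, x3 = X.T
          return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                  x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

      beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)        # full quadratic model
      predict = lambda x: (design_matrix(np.atleast_2d(x)) @ beta).item()

      res = minimize(predict, x0=np.zeros(3), bounds=[(-1, 1)] * 3)      # stay inside the design
      print("coded optimum:", np.round(res.x, 2), "predicted diameter:", round(res.fun, 1), "nm")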

  1. Versatile and Programmable DNA Logic Gates on Universal and Label-Free Homogeneous Electrochemical Platform.

    PubMed

    Ge, Lei; Wang, Wenxiao; Sun, Ximei; Hou, Ting; Li, Feng

    2016-10-04

    Herein, a novel universal and label-free homogeneous electrochemical platform is demonstrated, on which a complete set of DNA-based two-input Boolean logic gates (OR, NAND, AND, NOR, INHIBIT, IMPLICATION, XOR, and XNOR) is constructed by simply and rationally deploying the designed DNA polymerization/nicking machines without complicated sequence modulation. Single-stranded DNA is employed as the proof-of-concept target/input to initiate or prevent the DNA polymerization/nicking cyclic reactions on these DNA machines to synthesize numerous intact G-quadruplex sequences or binary G-quadruplex subunits as the output. The generated output strands then self-assemble into G-quadruplexes that render remarkable decrease to the diffusion current response of methylene blue and, thus, provide the amplified homogeneous electrochemical readout signal not only for the logic gate operations but also for the ultrasensitive detection of the target/input. This system represents the first example of homogeneous electrochemical logic operation. Importantly, the proposed homogeneous electrochemical logic gates possess the input/output homogeneity and share a constant output threshold value. Moreover, the modular design of DNA polymerization/nicking machines enables the adaptation of these homogeneous electrochemical logic gates to various input and output sequences. The results of this study demonstrate the versatility and universality of the label-free homogeneous electrochemical platform in the design of biomolecular logic gates and provide a potential platform for the further development of large-scale DNA-based biocomputing circuits and advanced biosensors for multiple molecular targets.

  2. Exploring cosmic homogeneity with the BOSS DR12 galaxy sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ntelis, Pierros; Hamilton, Jean-Christophe; Busca, Nicolas Guillermo

    2017-06-01

    In this study, we probe the transition to cosmic homogeneity in the Large Scale Structure (LSS) of the Universe using the CMASS galaxy sample of the BOSS spectroscopic survey, which covers the largest effective volume to date, 3 h^-3 Gpc^3 at 0.43 ≤ z ≤ 0.7. We study the scaled counts-in-spheres, N(<r), and the fractal correlation dimension, D_2(r), to assess the homogeneity scale of the universe using a Landy and Szalay inspired estimator. Defining the scale of transition to homogeneity as the scale at which D_2(r) reaches 3 within 1%, i.e. D_2(r) > 2.97 for r > R_H, we find R_H = (63.3 ± 0.7) h^-1 Mpc, in agreement at the percentage level with the prediction of the ΛCDM model, R_H = 62.0 h^-1 Mpc. Thanks to the large cosmic depth of the survey, we investigate the redshift evolution of the transition-to-homogeneity scale and find agreement with the ΛCDM prediction. Finally, we find that D_2 is compatible with 3 at scales larger than 300 h^-1 Mpc in all redshift bins. These results consolidate the Cosmological Principle and represent a precise consistency test of the ΛCDM model.
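
    A toy version of the counts-in-spheres / D_2(r) estimator, applied to a uniform random mock instead of the CMASS catalogue and without the Landy-Szalay-style random-catalogue correction, might look like this:

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(0)
      box = 1000.0                                               # Mpc/h, homogeneous mock
      points = rng.uniform(0.0, box, size=(50000, 3))
      radii = np.linspace(40.0, 300.0, 14)

      # Keep only centres whose largest sphere fits inside the box, a crude substitute
      # for the random catalogues used to correct for survey geometry in the real analysis.
      inner = points[np.all((points > radii.max()) & (points < box - radii.max()), axis=1)]

      tree = cKDTree(points)
      counts = np.array([tree.query_ball_point(inner, r, return_length=True).mean() - 1.0
                         for r in radii])                         # counts-in-spheres N(<r)
      d2 = np.gradient(np.log(counts), np.log(radii))             # D2(r) = d ln N(<r) / d ln r

      above = d2 > 2.97
      print("D2(r):", np.round(d2, 2))
      print("first scale with D2 > 2.97:",
            radii[np.argmax(above)] if above.any() else "not reached")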

  3. Creating Reflective Choreographers: The Eyes See/Mind Sees Process

    ERIC Educational Resources Information Center

    Kimbrell, Sinead

    2012-01-01

    Since 1999, when the author first started teaching creative process-based dance programs in public schools, she has struggled to find the time to teach children the basic concepts and tools of dance while teaching them to be deliberate with their choreographic choices. In this article, the author describes a process that helps students and…

  4. SU-E-T-513: Investigating Dose of Internal Target Volume After Correcting for Tissue Heterogeneity in SBRT Lung Plans with Homogeneity Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qi, P; Zhuang, T; Magnelli, A

    2015-06-15

    Purpose: It has been recommended to use a prescription of 54 Gy in 3 fractions with heterogeneity corrections in place of the previously established scheme of 60 Gy in 3 fractions calculated with homogeneous dose calculation. This study investigates dose coverage of the internal target volume (ITV) with and without heterogeneity correction. Methods: Thirty patients who received stereotactic body radiotherapy (SBRT) to a dose of 60 Gy in 3 fractions with homogeneous planning for early stage non-small-cell lung cancer (NSCLC) were selected. The ITV was created either from 4DCT scans or from a fusion of multi-phase respiratory scans. The planning target volume (PTV) was a 5 mm expansion of the ITV. For this study, we recalculated the homogeneous clinical plans using heterogeneity corrections with monitor units set as clinically delivered. All plans were calculated with 3 mm dose grids and a collapsed cone convolution algorithm. To account for uncertainties from tumor delineation and image-guided radiotherapy, a structure ITV2mm was created by expanding the ITV with 2 mm margins. Dose coverage of the PTV, ITV and ITV2mm was compared with Student's paired t-test. Results: With heterogeneity corrections, the PTV V60Gy decreased by 10.1% ± 18.4% (p<0.01) while the maximum dose to the PTV increased by 3.7% ± 4.3% (p<0.01). With and without corrections, D99% was 65.8 ± 4.0 Gy and 66.7 ± 4.8 Gy (p=0.15) for the ITV, and 63.9 ± 3.4 Gy and 62.9 ± 4.6 Gy for the ITV2mm (p=0.22), respectively. The mean dose to the ITV and ITV2mm increased by 3.6% ± 4.7% (p<0.01) and 2.3% ± 5.2% (p=0.01) with heterogeneity corrections. Conclusion: After heterogeneity correction, the peripheral coverage of the PTV decreased to approximately 54 Gy, but D99% of the ITV and ITV2mm was unchanged and the mean dose to the ITV and ITV2mm increased. The clinical implication of these results requires further investigation.
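
    The statistical comparison itself is a one-line paired test; a sketch with hypothetical per-patient D99% values (not the study's data) is:

      import numpy as np
      from scipy import stats

      # Hypothetical per-patient D99% of the ITV (Gy), homogeneous vs. heterogeneity-corrected
      d99_homogeneous = np.array([66.1, 67.2, 65.0, 68.3, 66.9, 64.8, 67.5, 66.2])
      d99_corrected   = np.array([65.4, 66.8, 64.9, 67.1, 66.0, 64.5, 66.9, 65.7])

      t, p = stats.ttest_rel(d99_corrected, d99_homogeneous)
      diff = np.mean(d99_corrected - d99_homogeneous)
      print(f"mean change = {diff:+.2f} Gy, t = {t:.2f}, p = {p:.3f}")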

  5. High pressure homogenization to improve the stability of casein - hydroxypropyl cellulose aqueous systems.

    PubMed

    Ye, Ran; Harte, Federico

    2014-03-01

    The effect of high pressure homogenization on the improvement of the stability of hydroxypropyl cellulose (HPC) - micellar casein aqueous systems was investigated. HPC with two molecular weights (80 and 1150 kDa) and micellar casein were mixed in water to a concentration leading to phase separation (0.45% w/v HPC and 3% w/v casein) and immediately subjected to high pressure homogenization ranging from 0 to 300 MPa, in 100 MPa increments. The various dispersions were evaluated for stability, particle size, turbidity, protein content, and viscosity over a period of two weeks, and by Scanning Transmission Electron Microscopy (STEM) at the end of the storage period. The stability of casein-HPC complexes was enhanced with increasing homogenization pressure, especially for the complex containing high molecular weight HPC. The apparent particle size of the complexes was reduced from ~200 nm to ~130 nm when using 300 MPa, corresponding to the sharp decrease of absorbance when compared to the non-homogenized controls. High pressure homogenization reduced the viscosity of HPC-casein complexes regardless of the molecular weight of HPC, and STEM images revealed aggregates consistent with nano-scale protein-polysaccharide interactions.

  6. High pressure homogenization to improve the stability of casein - hydroxypropyl cellulose aqueous systems

    PubMed Central

    Ye, Ran; Harte, Federico

    2013-01-01

    The effect of high pressure homogenization on the improvement of the stability of hydroxypropyl cellulose (HPC) - micellar casein aqueous systems was investigated. HPC with two molecular weights (80 and 1150 kDa) and micellar casein were mixed in water to a concentration leading to phase separation (0.45% w/v HPC and 3% w/v casein) and immediately subjected to high pressure homogenization ranging from 0 to 300 MPa, in 100 MPa increments. The various dispersions were evaluated for stability, particle size, turbidity, protein content, and viscosity over a period of two weeks, and by Scanning Transmission Electron Microscopy (STEM) at the end of the storage period. The stability of casein-HPC complexes was enhanced with increasing homogenization pressure, especially for the complex containing high molecular weight HPC. The apparent particle size of the complexes was reduced from ~200 nm to ~130 nm when using 300 MPa, corresponding to the sharp decrease of absorbance when compared to the non-homogenized controls. High pressure homogenization reduced the viscosity of HPC-casein complexes regardless of the molecular weight of HPC, and STEM images revealed aggregates consistent with nano-scale protein-polysaccharide interactions. PMID:24159250

  7. HOMOGENEOUS CATALYTIC OXIDATION OF HYDROCARBONS IN ALTERNATIVE SOLVENTS

    EPA Science Inventory

    Homogeneous Catalytic Oxidations of Hydrocarbons in Alternative Solvent Systems

    Michael A. Gonzalez* and Thomas M. Becker, Sustainable Technology Division, Office of Research and Development; United States Environmental Protection Agency, 26 West Martin Luther King Drive, ...

  8. Standard deviation index for stimulated Brillouin scattering suppression with different homogeneities.

    PubMed

    Ran, Yang; Su, Rongtao; Ma, Pengfei; Wang, Xiaolin; Zhou, Pu; Si, Lei

    2016-05-10

    We present a new quantitative index based on the standard deviation to measure the homogeneity of spectral lines in a fiber amplifier system, so as to find the relation between the stimulated Brillouin scattering (SBS) threshold and the homogeneity of the corresponding spectral lines. A theoretical model is built and a simulation framework has been established to estimate the SBS threshold when input spectra with different homogeneities are set. In our experiment, by setting the phase modulation voltage to a constant value and the modulation frequency to different values, spectral lines with different homogeneities can be obtained. The experimental results show that the SBS threshold increases as the standard deviation of the modulated spectrum decreases, in good agreement with the theoretical results. When the phase modulation voltage is confined to 10 V and the modulation frequency is set to 80 MHz, the standard deviation of the modulated spectrum equals 0.0051, which is the lowest value in our experiment; under these settings, the highest SBS threshold was achieved. This standard deviation is therefore a useful quantitative index for evaluating the power scaling potential of a fiber amplifier system and serves as a design guideline for better suppressing SBS.
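
    One plausible reading of such an index, the standard deviation of the normalized line intensities of the modulated spectrum (smaller means more homogeneous), is sketched below; the exact definition used in the paper may differ, so treat this as an assumption:

      import numpy as np

      def spectral_std_index(line_intensities):
          """Standard deviation of a power-normalized line spectrum; lower values
          correspond to a more homogeneous distribution of power among the lines."""
          p = np.asarray(line_intensities, dtype=float)
          p = p / p.sum()
          return float(p.std())

      perfectly_even = [1.0, 1.0, 1.0, 1.0, 1.0]       # ideally homogeneous comb of 5 lines
      uneven         = [5.0, 1.0, 0.5, 1.0, 5.0]       # power concentrated in the outer lines
      print(spectral_std_index(perfectly_even))        # 0.0
      print(spectral_std_index(uneven))                # > 0, so a lower SBS threshold is expected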

  9. Dynamic and Thermal Turbulent Time Scale Modelling for Homogeneous Shear Flows

    NASA Technical Reports Server (NTRS)

    Schwab, John R.; Lakshminarayana, Budugur

    1994-01-01

    A new turbulence model, based upon dynamic and thermal turbulent time scale transport equations, is developed and applied to homogeneous shear flows with constant velocity and temperature gradients. The new model comprises transport equations for k, the turbulent kinetic energy; tau, the dynamic time scale; k(sub theta), the fluctuating temperature variance; and tau(sub theta), the thermal time scale. It offers conceptually parallel modeling of the dynamic and thermal turbulence at the two equation level, and eliminates the customary prescription of an empirical turbulent Prandtl number, Pr(sub t), thus permitting a more generalized prediction capability for turbulent heat transfer in complex flows and geometries. The new model also incorporates constitutive relations, based upon invariant theory, that allow the effects of nonequilibrium to modify the primary coefficients for the turbulent shear stress and heat flux. Predictions of the new model, along with those from two other similar models, are compared with experimental data for decaying homogeneous dynamic and thermal turbulence, homogeneous turbulence with constant temperature gradient, and homogeneous turbulence with constant temperature gradient and constant velocity gradient. The new model offers improvement in agreement with the data for most cases considered in this work, although it was no better than the other models for several cases where all the models performed poorly.

  10. Homogenization of locally resonant acoustic metamaterials towards an emergent enriched continuum.

    PubMed

    Sridhar, A; Kouznetsova, V G; Geers, M G D

    This contribution presents a novel homogenization technique for modeling heterogeneous materials with micro-inertia effects such as locally resonant acoustic metamaterials. Linear elastodynamics is used to model the micro and macro scale problems and an extended first order Computational Homogenization framework is used to establish the coupling. Craig Bampton Mode Synthesis is then applied to solve and eliminate the microscale problem, resulting in a compact closed form description of the microdynamics that accurately captures the Local Resonance phenomena. The resulting equations represent an enriched continuum in which additional kinematic degrees of freedom emerge to account for Local Resonance effects which would otherwise be absent in a classical continuum. Such an approach retains the accuracy and robustness offered by a standard Computational Homogenization implementation, whereby the problem and the computational time are reduced to the on-line solution of one scale only.

  11. Video image processing to create a speed sensor

    DOT National Transportation Integrated Search

    1999-11-01

    Image processing has been applied to traffic analysis in recent years, with different goals. In the report, a new approach is presented for extracting vehicular speed information, given a sequence of real-time traffic images. We extract moving edges ...

  12. The contact of a homogeneous unitary Fermi gas

    NASA Astrophysics Data System (ADS)

    Mukherjee, Biswaroop; Patel, Parth; Yan, Zhenjie; Fletcher, Richard; Struck, Julian; Zwierlein, Martin

    2017-04-01

    The contact is a fundamental quantity that measures the strength of short-range correlations in quantum gases. As one of its most important implications, it provides a link between the microscopic two-particle correlation function at small distance and the macroscopic thermodynamic properties of the gas. In particular, pairing and superfluidity in a unitary Fermi gas can be expected to leave their mark on the behavior of the contact. Here we present measurements of the temperature dependence of the contact of a unitary Fermi gas across the superfluid transition. By trapping ultracold 6Li atoms in a potential that is homogeneous in two directions and harmonic in the third, we obtain radiofrequency spectra of the homogeneous gas at a high signal-to-noise ratio. We compare our data to existing, but often mutually exclusive, theoretical calculations for the strongly interacting Fermi gas.

  13. Extreme degree of ionization in homogenous micro-capillary plasma columns heated by ultrafast current pulses.

    PubMed

    Avaria, G; Grisham, M; Li, J; Tomasel, F G; Shlyaptsev, V N; Busquet, M; Woolston, M; Rocca, J J

    2015-03-06

    Homogeneous plasma columns with ionization levels typical of megaampere discharges are created by rapidly heating gas-filled 520-μm-diameter channels with nanosecond rise time current pulses of 40 kA. Current densities of up to 0.3  GA cm^{-2} greatly increase Joule heating with respect to conventional capillary discharge Z pinches, reaching unprecedented degrees of ionization for a high-Z plasma column heated by a current pulse of remarkably low amplitude. Dense xenon plasmas are ionized to Xe^{28+}, while xenon impurities in hydrogen discharges reach Xe^{30+}. The unique characteristics of these hot, ∼300:1 length-to-diameter aspect ratio plasmas allow the observation of unexpected spectroscopic phenomena. Axial spectra show the unusual dominance of the intercombination line over the resonance line of He-like Al by nearly an order of magnitude, caused by differences in opacities in the axial and radial directions. These plasma columns could enable the development of sub-10-nm x-ray lasers.

  14. Electrospinning of caseinates to create protective fibrous mats

    USDA-ARS?s Scientific Manuscript database

    Electrospinning is a nonthermal process that produces fibers on the micron- or nano-scale from a polymer solution. If produced by electrospinning of biopolymer solutions, fibrous mats may be created for protecting foods and allowing for the preservation and controlled release of bioactives for healt...

  15. Electrospinning of caseinates to create protective fibrous mats

    USDA-ARS?s Scientific Manuscript database

    JUSTIFICATION Electrospinning is a nonthermal process that produces fibers with diameters on the micron- or nano-scales from a polymer solution. If produced by electrospinning of biopolymer solutions, fibrous mats may be created for protecting foods, improving food quality and allowing for the prese...

  16. Properties of frozen dairy desserts processed by microfluidization of their mixes.

    PubMed

    Olson, D W; White, C H; Watson, C E

    2003-04-01

    Sensory properties and rate of meltdown of nonfat (0% fat) and low-fat (2% fat) vanilla ice creams processed either by conventional valve homogenization or microfluidization of their mixes were compared with each other and to ice cream (10% fat) processed by conventional valve homogenization. Mixes for frozen dairy desserts containing 0, 2, and 10% fat were manufactured. Some of the nonfat and low-fat ice cream mixes were processed by microfluidization at 50, 100, 150, and 200 MPa, and the remaining nonfat and low-fat ice cream mixes and all of the ice cream mix were processed by conventional valve homogenization at 13.8 MPa, first stage, and 3.4 MPa, second stage. The finished frozen and hardened products were evaluated at d 1 and 45 for meltdown rate and for flavor and body and texture by preference testing. Nonfat and low-fat ice creams that usually had a slower meltdown were produced when processing their mixes by microfluidization instead of by conventional valve homogenization. Sensory scores for the ice cream were significantly higher than sensory scores for the nonfat and low-fat ice creams, but the sensory scores for the conventional valve homogenized controls for the nonfat ice cream and low-fat ice cream were not significantly different from the sensory scores for the nonfat ice cream and low-fat ice cream processed by microfluidization of the mixes, respectively. Microfluidization produced nonfat and low-fat ice creams that usually had a slower meltdown without affecting sensory properties.

  17. IBES: a tool for creating instructions based on event segmentation

    PubMed Central

    Mura, Katharina; Petersen, Nils; Huff, Markus; Ghose, Tandra

    2013-01-01

    Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, 20 participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, 10 and 12 participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool. PMID:24454296

  18. IBES: a tool for creating instructions based on event segmentation.

    PubMed

    Mura, Katharina; Petersen, Nils; Huff, Markus; Ghose, Tandra

    2013-12-26

    Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, 20 participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, 10 and 12 participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  19. Homogeneity and variation of donor doping in Verneuil-grown SrTiO3:Nb single crystals

    PubMed Central

    Rodenbücher, C.; Luysberg, M.; Schwedt, A.; Havel, V.; Gunkel, F.; Mayer, J.; Waser, R.

    2016-01-01

    The homogeneity of Verneuil-grown SrTiO3:Nb crystals was investigated. Due to the fast crystal growth process, inhomogeneities in the donor dopant distribution and variation in the dislocation density are expected to occur. In fact, for some crystals optical studies show variations in the density of Ti3+ states on the microscale and a cluster-like surface conductivity was reported in tip-induced resistive switching studies. However, our investigations by TEM, EDX mapping, and 3D atom probe reveal that the Nb donors are distributed in a statistically random manner, indicating that there is clearly no inhomogeneity on the macro-, micro-, and nanoscale in high quality Verneuil-grown crystals. In consequence, the electronic transport in the bulk of donor-doped crystals is homogeneous and it is not significantly channelled by extended defects such as dislocations which justifies using this material, for example, as electronically conducting substrate for epitaxial oxide film growth. PMID:27577508

  20. Estimation Methods for Non-Homogeneous Regression - Minimum CRPS vs Maximum Likelihood

    NASA Astrophysics Data System (ADS)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2017-04-01

    Non-homogeneous regression models are widely used to statistically post-process numerical weather prediction models. Such regression models correct for errors in mean and variance and are capable of forecasting a full probability distribution. In order to estimate the corresponding regression coefficients, CRPS minimization has been performed in many meteorological post-processing studies over the last decade. In contrast to maximum likelihood estimation, CRPS minimization is claimed to yield more calibrated forecasts. Theoretically, both scoring rules used as an optimization criterion should be able to locate a similar, unknown optimum. Discrepancies might result from a wrong distributional assumption about the observed quantity. To address this theoretical concept, this study compares maximum likelihood and minimum CRPS estimation for different distributional assumptions. First, a synthetic case study shows that, for an appropriate distributional assumption, both estimation methods yield similar regression coefficients, with the log-likelihood estimator being slightly more efficient. A real-world case study for surface temperature forecasts at different sites in Europe confirms these results but shows that surface temperature does not always follow the classical assumption of a Gaussian distribution. KEYWORDS: ensemble post-processing, maximum likelihood estimation, CRPS minimization, probabilistic temperature forecasting, distributional regression models
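
    The comparison can be reproduced in miniature for a Gaussian predictive distribution, using the closed-form Gaussian CRPS; the toy data and the simple link functions below are assumptions for illustration only:

      import numpy as np
      from scipy import stats, optimize

      def crps_gaussian(y, mu, sigma):
          """Closed-form CRPS of a Gaussian forecast N(mu, sigma^2) for observation y."""
          z = (y - mu) / sigma
          return sigma * (z * (2.0 * stats.norm.cdf(z) - 1.0)
                          + 2.0 * stats.norm.pdf(z) - 1.0 / np.sqrt(np.pi))

      # Toy non-homogeneous regression: mu = a0 + a1*ens_mean, log(sigma) = b0 + b1*log(ens_sd)
      rng = np.random.default_rng(2)
      n = 2000
      ens_mean = rng.normal(10.0, 3.0, n)
      ens_sd = rng.uniform(0.5, 2.0, n)
      obs = 1.0 + 0.9 * ens_mean + rng.normal(0.0, 1.2 * ens_sd)

      def unpack(theta):
          a0, a1, b0, b1 = theta
          return a0 + a1 * ens_mean, np.exp(b0 + b1 * np.log(ens_sd))

      neg_loglik = lambda th: -stats.norm.logpdf(obs, *unpack(th)).sum()
      mean_crps  = lambda th: crps_gaussian(obs, *unpack(th)).mean()

      theta0 = np.array([0.0, 1.0, 0.0, 1.0])
      print("ML estimate:  ", np.round(optimize.minimize(neg_loglik, theta0, method="Nelder-Mead").x, 2))
      print("CRPS estimate:", np.round(optimize.minimize(mean_crps,  theta0, method="Nelder-Mead").x, 2))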

  1. Creating a culture of mutual respect.

    PubMed

    Kaplan, Kathryn; Mestel, Pamela; Feldman, David L

    2010-04-01

    The Joint Commission mandates that hospitals seeking accreditation have a process to define and address disruptive behavior. Leaders at Maimonides Medical Center, Brooklyn, New York, took the initiative to create a code of mutual respect that not only requires respectful behavior, but also encourages sensitivity and awareness to the causes of frustration that often lead to inappropriate behavior. Steps to implementing the code included selecting code advocates, setting up a system for mediating disputes, tracking and addressing operational system issues, providing training for personnel, developing a formal accountability process, and measuring the results. Copyright 2010 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  2. Unified double- and single-sided homogeneous Green's function representations

    NASA Astrophysics Data System (ADS)

    Wapenaar, Kees; van der Neut, Joost; Slob, Evert

    2016-06-01

    In wave theory, the homogeneous Green's function consists of the impulse response to a point source, minus its time-reversal. It can be represented by a closed boundary integral. In many practical situations, the closed boundary integral needs to be approximated by an open boundary integral because the medium of interest is often accessible from one side only. The inherent approximations are acceptable as long as the effects of multiple scattering are negligible. However, in case of strongly inhomogeneous media, the effects of multiple scattering can be severe. We derive double- and single-sided homogeneous Green's function representations. The single-sided representation applies to situations where the medium can be accessed from one side only. It correctly handles multiple scattering. It employs a focusing function instead of the backward propagating Green's function in the classical (double-sided) representation. When reflection measurements are available at the accessible boundary of the medium, the focusing function can be retrieved from these measurements. Throughout the paper, we use a unified notation which applies to acoustic, quantum-mechanical, electromagnetic and elastodynamic waves. We foresee many interesting applications of the unified single-sided homogeneous Green's function representation in holographic imaging and inverse scattering, time-reversed wave field propagation and interferometric Green's function retrieval.
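
    Written out (this is just the definition quoted above put into symbols; sign conventions differ between formulations, so treat it as a sketch rather than the paper's exact notation), with x_A the source position and hats denoting the frequency domain:

      G_h(\mathbf{x},\mathbf{x}_A,t) = G(\mathbf{x},\mathbf{x}_A,t) - G(\mathbf{x},\mathbf{x}_A,-t),
      \qquad
      \hat{G}_h(\mathbf{x},\mathbf{x}_A,\omega)
        = \hat{G}(\mathbf{x},\mathbf{x}_A,\omega) - \hat{G}^{*}(\mathbf{x},\mathbf{x}_A,\omega)
        = 2j\,\mathrm{Im}\,\hat{G}(\mathbf{x},\mathbf{x}_A,\omega),

    since time reversal of a real-valued impulse response corresponds to complex conjugation in the frequency domain; the point-source terms of G and its conjugate cancel in the difference, which is why G_h satisfies a source-free (homogeneous) wave equation and admits the closed boundary-integral representation mentioned above.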

  3. Two-dimensional arbitrarily shaped acoustic cloaks composed of homogeneous parts

    NASA Astrophysics Data System (ADS)

    Li, Qi; Vipperman, Jeffrey S.

    2017-10-01

    Acoustic cloaking is an important application of acoustic metamaterials. Although the topic has received much attention, there are a number of areas where contributions are needed. In this paper, a design method for producing acoustic cloaks with arbitrary shapes that are composed of homogeneous parts is presented. The cloak is divided into sections, each of which, in turn, is further divided into two parts, followed by the application of transformation acoustics to derive the required properties for cloaking. With the proposed mapping relations, the properties of each part of the cloak are anisotropic but homogeneous, which can be realized using two alternating layers of homogeneous and isotropic materials. A hexagonal and an irregular cloak are presented as design examples. The full wave simulations using COMSOL Multiphysics finite element software show that the cloaks function well at reducing reflections and shadows. The variation of the cloak properties is investigated as a function of three important geometric parameters used in the transformations. A balance can be found between cloaking performance and materials properties that are physically realizable.

  4. Agile science: creating useful products for behavior change in the real world.

    PubMed

    Hekler, Eric B; Klasnja, Predrag; Riley, William T; Buman, Matthew P; Huberty, Jennifer; Rivera, Daniel E; Martin, Cesar A

    2016-06-01

    Evidence-based practice is important for behavioral interventions but there is debate on how best to support real-world behavior change. The purpose of this paper is to define products and a preliminary process for efficiently and adaptively creating and curating a knowledge base for behavior change for real-world implementation. We look to evidence-based practice suggestions and draw parallels to software development. We argue for targeting three products: (1) the smallest, meaningful, self-contained, and repurposable behavior change modules of an intervention; (2) "computational models" that define the interaction between modules, individuals, and context; and (3) "personalization" algorithms, which are decision rules for intervention adaptation. The "agile science" process includes a generation phase, whereby contender operational definitions and constructs of the three products are created and assessed for feasibility, and an evaluation phase, whereby effect size estimates and causal inferences are created. The process emphasizes early-and-often sharing. If correct, agile science could enable a more robust knowledge base for behavior change.

  5. Impact of the cation distribution homogeneity on the americium oxidation state in the U0.54Pu0.45Am0.01O2-x mixed oxide

    NASA Astrophysics Data System (ADS)

    Vauchy, Romain; Robisson, Anne-Charlotte; Martin, Philippe M.; Belin, Renaud C.; Aufore, Laurence; Scheinost, Andreas C.; Hodaj, Fiqiri

    2015-01-01

    The impact of the cation distribution homogeneity of the U0.54Pu0.45Am0.01O2-x mixed oxide on the americium oxidation state was studied by coupling X-ray diffraction (XRD), electron probe micro analysis (EPMA) and X-ray absorption spectroscopy (XAS). Oxygen-hypostoichiometric Am-bearing uranium-plutonium mixed oxide pellets were fabricated by two different co-milling based processes in order to obtain different cation distribution homogeneities. The americium was generated from β- decay of 241Pu. The XRD analysis of the obtained compounds did not reveal any structural difference between the samples. EPMA, however, revealed a high homogeneity in the cation distribution for one sample, and substantial heterogeneity of the U-Pu (so Am) distribution for the other. The difference in cation distribution was linked to a difference in Am chemistry as investigated by XAS, with Am being present at mixed +III/+IV oxidation state in the heterogeneous compound, whereas only Am(IV) was observed in the homogeneous compound. Previously reported discrepancies on Am oxidation states can hence be explained by cation distribution homogeneity effects.

  6. Do Algorithms Homogenize Students' Achievements in Secondary School Better than Teachers' Tracking Decisions?

    ERIC Educational Resources Information Center

    Klapproth, Florian

    2015-01-01

    Two objectives guided this research. First, this study examined how well teachers' tracking decisions contribute to the homogenization of their students' achievements. Second, the study explored whether teachers' tracking decisions would be outperformed in homogenizing the students' achievements by statistical models of tracking decisions. These…

  7. Fragility and super-strong character of non-stoichiometric chalcogenides: implications on melt homogenization

    NASA Astrophysics Data System (ADS)

    Ravindren, Sriram; Gunasekera, Kapila; Boolchand, Punit; Micoulaut, Matthieu

    2014-03-01

    The kinetics of homogenization of binary AsxSe100-x melts in the As concentration range 0% homogenize when the starting materials are reacted at 700°C. The enthalpy of relaxation at Tg, ΔHnr(x), shows a minimum in 27% homogeneous glasses, molar volumes vary non-monotonically with composition, and the fragility index m displays a broad global minimum in 20% homogenization. In comparing these results with earlier reports, there is evidence that fragility decreases as melts are homogenized. Furthermore, a clear scaling of m vs. Tg is observed, with a negative slope for Flexible glasses and a positive slope for Rigid and Stressed-rigid ones. The absence of a melting endotherm in non-stoichiometric As-Se compositions is reported. Fragilities of the Ge-As-Se system are reported and a correlation is observed with the fragilities of As-Se and Ge-Se. Supported by NSF grant DMR 08-53957.

  8. Sampling and Homogenization Strategies Significantly Influence the Detection of Foodborne Pathogens in Meat.

    PubMed

    Rohde, Alexander; Hammerl, Jens Andre; Appel, Bernd; Dieckmann, Ralf; Al Dahouk, Sascha

    2015-01-01

    Efficient preparation of food samples, comprising sampling and homogenization, for microbiological testing is an essential, yet largely neglected, component of foodstuff control. Salmonella enterica spiked chicken breasts were used as a surface contamination model whereas salami and meat paste acted as models of inner-matrix contamination. A systematic comparison of different homogenization approaches, namely, stomaching, sonication, and milling by FastPrep-24 or SpeedMill, revealed that for surface contamination a broad range of sample pretreatment steps is applicable and loss of culturability due to the homogenization procedure is marginal. In contrast, for inner-matrix contamination long treatments up to 8 min are required and only FastPrep-24 as a large-volume milling device produced consistently good recovery rates. In addition, sampling of different regions of the spiked sausages showed that pathogens are not necessarily homogenously distributed throughout the entire matrix. Instead, in meat paste the core region contained considerably more pathogens compared to the rim, whereas in the salamis the distribution was more even with an increased concentration within the intermediate region of the sausages. Our results indicate that sampling and homogenization as integral parts of food microbiology and monitoring deserve more attention to further improve food safety.

  9. Sampling and Homogenization Strategies Significantly Influence the Detection of Foodborne Pathogens in Meat

    PubMed Central

    Rohde, Alexander; Hammerl, Jens Andre; Appel, Bernd; Dieckmann, Ralf; Al Dahouk, Sascha

    2015-01-01

    Efficient preparation of food samples, comprising sampling and homogenization, for microbiological testing is an essential, yet largely neglected, component of foodstuff control. Salmonella enterica spiked chicken breasts were used as a surface contamination model whereas salami and meat paste acted as models of inner-matrix contamination. A systematic comparison of different homogenization approaches, namely, stomaching, sonication, and milling by FastPrep-24 or SpeedMill, revealed that for surface contamination a broad range of sample pretreatment steps is applicable and loss of culturability due to the homogenization procedure is marginal. In contrast, for inner-matrix contamination long treatments up to 8 min are required and only FastPrep-24 as a large-volume milling device produced consistently good recovery rates. In addition, sampling of different regions of the spiked sausages showed that pathogens are not necessarily homogenously distributed throughout the entire matrix. Instead, in meat paste the core region contained considerably more pathogens compared to the rim, whereas in the salamis the distribution was more even with an increased concentration within the intermediate region of the sausages. Our results indicate that sampling and homogenization as integral parts of food microbiology and monitoring deserve more attention to further improve food safety. PMID:26539462

  10. Influence of Homogenization on Microstructural Response and Mechanical Property of Al-Cu-Mn Alloy.

    PubMed

    Wang, Jian; Lu, Yalin; Zhou, Dongshuai; Sun, Lingyan; Li, Renxing; Xu, Wenting

    2018-05-29

    The evolution of the microstructures and properties of large direct chill (DC)-cast Al-Cu-Mn alloy ingots during homogenization was investigated. The results revealed that the Al-Cu-Mn alloy ingots had severe microsegregation and the main secondary phase was Al₂Cu, with minimal Al₇Cu₂Fe phase. Numerous primary eutectic phases existed at the grain boundaries and the main elements were segregated at the interfaces along the interdendritic region. The grain boundaries became discontinuous, residual phases were effectively dissolved into the matrix, and the segregation degree of all elements was reduced dramatically during homogenization. In addition, the homogenized alloys exhibited improved microstructures with finer grain size, higher number density of dislocation networks, higher density of uniformly distributed θ' or θ phase (Al₂Cu), and higher volume fraction of high-angle grain boundaries compared to the non-homogenized samples. After the optimal homogenization treatment at 535 °C for 10 h, the tensile strength and elongation were about 24 MPa, 20.5 MPa, and 1.3% higher than those of the specimen without homogenization treatment.

  11. Learning through Intermediate Problems in Creating Cognitive Models

    ERIC Educational Resources Information Center

    Miwa, Kazuhisa; Morita, Junya; Nakaike, Ryuichi; Terai, Hitoshi

    2014-01-01

    Cognitive modelling is one of the representative research methods in cognitive science. It is believed that creating cognitive models promotes learners' meta-cognitive activities such as self-monitoring and reflecting on their own cognitive processing. Preceding studies have confirmed that such meta-cognitive activities actually promote learning…

  12. More than Just Convenient: The Scientific Merits of Homogeneous Convenience Samples

    PubMed Central

    Jager, Justin; Putnick, Diane L.; Bornstein, Marc H.

    2017-01-01

    Despite their disadvantaged generalizability relative to probability samples, non-probability convenience samples are the standard within developmental science, and likely will remain so because probability samples are cost-prohibitive and most available probability samples are ill-suited to examine developmental questions. In lieu of focusing on how to eliminate or sharply reduce reliance on convenience samples within developmental science, here we propose how to augment their advantages when it comes to understanding population effects as well as subpopulation differences. Although all convenience samples have less clear generalizability than probability samples, we argue that homogeneous convenience samples have clearer generalizability relative to conventional convenience samples. Therefore, when researchers are limited to convenience samples, they should consider homogeneous convenience samples as a positive alternative to conventional (or heterogeneous) convenience samples. We discuss future directions as well as potential obstacles to expanding the use of homogeneous convenience samples in developmental science. PMID:28475254

  13. [Methods for enzymatic determination of triglycerides in liver homogenates].

    PubMed

    Höhn, H; Gartzke, J; Burck, D

    1987-10-01

    An enzymatic method is described for the determination of triacylglycerols in liver homogenate. In contrast to usual methods, higher reliability and selectivity are achieved by omitting the extraction step.

  14. Drag reduction in homogeneous turbulence by scale-dependent effective viscosity.

    PubMed

    Benzi, Roberto; Ching, Emily S C; Procaccia, Itamar

    2004-08-01

    We demonstrate, by using suitable shell models, that drag reduction in homogeneous turbulence is usefully discussed in terms of a scale-dependent effective viscosity. The essence of the phenomenon of drag reduction found in models that couple the velocity field to the polymers can be recaptured by an "equivalent" equation of motion for the velocity field alone, with a judiciously chosen scale-dependent effective viscosity that succinctly summarizes the important aspects of the interaction between the velocity and the polymer fields. Finally, we clarify the differences between drag reduction in homogeneous and in wall bounded flows.

  15. Hospital culture--why create one?

    PubMed

    Sovie, M D

    1993-01-01

    Hospitals, to survive, must be transformed into responsive, participative organizations capable of new practices that produce improved results in both quality of care and service at reduced costs. Creating, managing, and changing the culture are critical leadership functions that will enable the hospital to succeed. Strategic planning and effective implementation of planned change will produce the desired culture. Work restructuring, a focus on quality management along with changes in clinical practices, as well as the care and support processes, are all a part of the necessary hospital cultural revolution.

  16. Creating your own leadership brand.

    PubMed

    Kerfoot, Karlene

    2002-01-01

    Building equity in a brand happens through many encounters. The initial attraction must be followed by the meeting of expectations. This creates a loyalty that is part of an emotional connection to that brand. This is the same process people go through when they first meet a leader and decide if this is a person they want to buy into. People will examine your style, your competence, and your standards. If you fail on any of these fronts, your ability to lead will be severely compromised. People expect more of leaders now, because they know and recognize good leaders. And, predictably, people are now more cynical of leaders because of the well-publicized excess of a few leaders who advanced their own causes at the expense of their people and their financial future. This will turn out to be a good thing, because it will create a higher standard of leadership that all must aspire to achieve. When the bar is raised for us, our standards of performance are also raised.

  17. Fourier-Accelerated Nodal Solvers (FANS) for homogenization problems

    NASA Astrophysics Data System (ADS)

    Leuschner, Matthias; Fritzen, Felix

    2017-11-01

    Fourier-based homogenization schemes are useful to analyze heterogeneous microstructures represented by 2D or 3D image data. These iterative schemes involve discrete periodic convolutions with global ansatz functions (mostly fundamental solutions). The convolutions are efficiently computed using the fast Fourier transform. FANS operates on nodal variables on regular grids and converges to finite element solutions. Compared to established Fourier-based methods, the number of convolutions is reduced by FANS. Additionally, fast iterations are possible by assembling the stiffness matrix. Due to the related memory requirement, the method is best suited for medium-sized problems. A comparative study involving established Fourier-based homogenization schemes is conducted for a thermal benchmark problem with a closed-form solution. Detailed technical and algorithmic descriptions are given for all methods considered in the comparison. Furthermore, many numerical examples focusing on convergence properties for both thermal and mechanical problems, including also plasticity, are presented.
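
    For orientation, a minimal Fourier-based homogenization loop of the classical fixed-point (Moulinec-Suquet-type) kind, here for a 2D periodic thermal problem, is sketched below. This is one of the established schemes FANS is compared against, not the FANS algorithm itself:

      import numpy as np

      def effective_flux(k, E=(1.0, 0.0), tol=1e-8, max_iter=500):
          """Average flux in a 2D periodic cell of local conductivity k(x) under a
          prescribed macroscopic temperature gradient E (basic fixed-point scheme)."""
          N = k.shape[0]
          k0 = 0.5 * (k.min() + k.max())                      # reference conductivity
          xi = np.fft.fftfreq(N)
          XI = np.stack(np.meshgrid(xi, xi, indexing="ij"))   # (2, N, N) frequency grid
          xi2 = (XI ** 2).sum(axis=0)
          xi2[0, 0] = 1.0                                     # avoid division by zero

          g = np.zeros((2, N, N))
          g[0], g[1] = E                                      # initial guess: uniform gradient
          for _ in range(max_iter):
              tau_hat = np.fft.fft2((k - k0) * g, axes=(1, 2))      # polarization field
              proj = (XI * tau_hat).sum(axis=0) / (k0 * xi2)        # Green operator of the
              g_new_hat = -XI * proj                                # homogeneous reference medium
              g_new_hat[:, 0, 0] = np.array(E) * N * N              # enforce the mean gradient
              g_new = np.real(np.fft.ifft2(g_new_hat, axes=(1, 2)))
              if np.linalg.norm(g_new - g) <= tol * np.linalg.norm(g):
                  g = g_new
                  break
              g = g_new
          return (k * g).mean(axis=(1, 2))                    # column of k_eff applied to E

      # Square inclusion of conductivity 10 in a matrix of conductivity 1
      k = np.ones((64, 64))
      k[16:48, 16:48] = 10.0
      print("effective flux for E = (1, 0):", effective_flux(k))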

  18. Chemically Patterned Inverse Opal Created by a Selective Photolysis Modification Process.

    PubMed

    Tian, Tian; Gao, Ning; Gu, Chen; Li, Jian; Wang, Hui; Lan, Yue; Yin, Xianpeng; Li, Guangtao

    2015-09-02

    Anisotropic photonic crystal materials have long been pursued for their broad applications. A novel method for creating chemically patterned inverse opals is proposed here. The patterning technique is based on selective photolysis of a photolabile polymer together with post-modification of the released amine groups. The method allows regioselective modification within an inverse opal structure by taking advantage of selective chemical reactions. Moreover, combined with the unique signal self-reporting feature of the photonic crystal, the fabricated structure is capable of various applications, including gradient photonic bandgaps and dynamic chemical patterns. The proposed method extends the structural and chemical complexity of the photonic crystal, as well as its potential applications.

  19. Fresh broad (Vicia faba) tissue homogenate-based biosensor for determination of phenolic compounds.

    PubMed

    Ozcan, Hakki Mevlut; Sagiroglu, Ayten

    2014-08-01

    In this study, a novel fresh broad (Vicia faba) tissue homogenate-based biosensor for the determination of phenolic compounds was developed. The biosensor was constructed by immobilizing fresh broad (Vicia faba) tissue homogenate on a glassy carbon electrode. To stabilize the biosensor, the tissue homogenate was secured in a gelatin matrix cross-linked with glutaraldehyde. In the optimization and characterization studies, the amounts of fresh broad tissue homogenate and gelatin, the glutaraldehyde percentage, the optimum pH, temperature and buffer concentration, thermal stability, interference effects, linear range, storage stability, repeatability, and sample applications (wine, beer, fruit juices) were investigated. In addition, the detection ranges of thirteen phenolic compounds were obtained from calibration graphs. A typical calibration curve for the sensor revealed a linear range of 5-60 μM catechol. In reproducibility studies, the coefficient of variation (CV) and standard deviation (SD) were calculated as 1.59% and 0.64×10⁻³ μM, respectively.
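
    The paper's raw calibration and reproducibility data are not given in the abstract; as a hedged illustration of how such figures are typically obtained, the sketch below fits a straight-line calibration over the reported 5-60 μM catechol range and computes SD and CV from hypothetical repeated readings. All numerical values are invented for demonstration only.

    import numpy as np

    # hypothetical calibration data: catechol concentration (uM) vs. sensor response
    conc = np.array([5, 10, 20, 30, 40, 50, 60], dtype=float)   # within the 5-60 uM linear range
    resp = np.array([0.8, 1.6, 3.1, 4.7, 6.2, 7.8, 9.3])        # illustrative responses

    # linear least-squares calibration: response = slope * conc + intercept
    slope, intercept = np.polyfit(conc, resp, 1)

    def concentration_from_response(r):
        """Invert the calibration line to estimate concentration from a reading."""
        return (r - intercept) / slope

    # reproducibility: repeated measurements of one standard (hypothetical readings)
    repeats = concentration_from_response(np.array([3.10, 3.08, 3.12, 3.09, 3.11]))
    sd = repeats.std(ddof=1)              # standard deviation
    cv = 100.0 * sd / repeats.mean()      # coefficient of variation in percent
    print(f"SD = {sd:.3e} uM, CV = {cv:.2f} %")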

  20. Motion through a non-homogeneous porous medium: Hydrodynamic permeability of a membrane composed of cylindrical particles

    NASA Astrophysics Data System (ADS)

    Yadav, Pramod Kumar

    2018-01-01

    The present problem is concerned with the flow of a viscous, steady, incompressible fluid through a non-homogeneous porous medium. Here, the non-homogeneous porous medium is a membrane built up of cylindrical particles. The flow outside the membrane is governed by the Stokes equation, and the flow through the non-homogeneous porous membrane composed of cylindrical particles is governed by Darcy's law. In this work, we discuss the effect of various fluid parameters, such as the permeability parameter k0, the discontinuity coefficient at the fluid/non-homogeneous porous interface, and the viscosity ratio between the viscous incompressible fluid region and the non-homogeneous porous region, on the hydrodynamic permeability of the membrane, the stress, and the velocity profile. The hydrodynamic permeability of a membrane built up of non-homogeneous porous cylindrical particles is also compared with that of a membrane of porous cylindrical shells enclosing a cylindrical cavity. The effects of various fluid parameters on the streamline patterns are also discussed.
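
    For reference, the two flow regimes named in the abstract are commonly written in the following standard forms (the paper's specific matching conditions at the fluid/porous interface are not reproduced here); mu denotes the fluid viscosity and k0 the permeability:

    % Stokes flow in the free-fluid region (velocity v, pressure p):
    \mu \nabla^{2}\mathbf{v} = \nabla p, \qquad \nabla \cdot \mathbf{v} = 0
    % Darcy's law in the non-homogeneous porous region:
    \mathbf{v} = -\frac{k_{0}}{\mu}\,\nabla p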