Sample records for homogenization modelisation numerique

  1. Etude numerique et experimentale de la reponse vibro-acoustique des structures raidies a des excitations aeriennes et solidiennes

    NASA Astrophysics Data System (ADS)

    Mejdi, Abderrazak

    Aircraft fuselages are generally made of aluminium or of composite material reinforced by longitudinal stiffeners (stringers) and transverse stiffeners (frames); the stiffeners themselves may be metallic or composite. During the various phases of flight, aircraft structures are subjected to airborne excitations (turbulent boundary layer, TBL; diffuse acoustic field, DAF) on the outer skin, and the acoustic energy thus produced is transmitted into the cabin. The engines, mounted on the structure, also produce a significant structure-borne excitation. The objective of this project is to develop and implement modelling strategies for aircraft fuselages subjected to airborne and structure-borne excitations. First, the second chapter reviews and classifies the existing TBL models, and the vibro-acoustic response of finite and infinite flat structures is analysed. In the third chapter, the assumptions underlying existing models of orthogonally stiffened metallic structures under mechanical, DAF and TBL excitations are first re-examined; a detailed and reliable model of these structures is then developed. The model is validated numerically using the finite element method (FEM) and the boundary element method (BEM), and experimental validation tests are carried out on aircraft panels supplied by aeronautical companies. In the fourth chapter, the model is extended to composite structures reinforced by stiffeners that are also composite and of complex shape; a simple analytical model is also implemented and validated numerically. In the fifth chapter, the modelling of periodic stiffened composite structures is further refined by taking into account the coupling between in-plane and transverse displacements, as well as the size effect of finite periodic structures. The models developed have made it possible to carry out several parametric studies of the vibro-acoustic properties of aircraft structures, thereby easing the task of designers. As part of this thesis, one article was published in the Journal of Sound and Vibration and three others were submitted, respectively, to the Journal of the Acoustical Society of America, the International Journal of Solid Mechanics and the Journal of Sound and Vibration. Keywords: stiffened structures, composites, vibro-acoustics, transmission loss.

  2. Etude thermo-hydraulique de l'ecoulement du moderateur dans le reacteur CANDU-6

    NASA Astrophysics Data System (ADS)

    Mehdi Zadeh, Foad

    Given the size (6.0 m x 7.6 m) and the multiply connected domain that characterize the calandria vessel of CANDU-6 reactors (380 channels in the vessel), the physics governing the behaviour of the moderator fluid is still poorly understood. Sampling data in an operating reactor would require modifying the configuration of the reactor vessel in order to insert probes, and the intense radiation field prevents the use of ordinary sensors. Consequently, the moderator flow must be studied either with an experimental model or with a numerical model. As for the experimental route, building and operating such facilities is very expensive, and the scaling parameters required to build a reduced-scale experimental model are mutually contradictory. Numerical modelling therefore remains an important alternative. At present, the nuclear industry uses a so-called porous-medium numerical approach, which approximates the domain by a continuous medium in which the tube bundle is replaced by distributed hydraulic resistances. This model can describe the macroscopic features of the flow, but it does not account for local effects that influence the global flow, such as the temperature and velocity distributions near the tubes and hydrodynamic instabilities. In the context of nuclear safety, the local effects around the calandria tubes are of particular interest. Indeed, simulations performed with this approach predict that the flow can adopt several hydrodynamic configurations, some of which show an asymmetric behaviour within the vessel. This can cause boiling of the moderator on the wall of the channels. Under such conditions, the reactivity coefficient can vary significantly, leading to an increase in reactor power, with potentially major consequences for nuclear safety. A detailed CFD (Computational Fluid Dynamics) model that accounts for local effects is therefore necessary. The goal of this research is to model the complex behaviour of the moderator flow within the calandria vessel of a CANDU-6 nuclear reactor, in particular near the calandria tubes. These simulations are used to identify the possible flow configurations in the calandria. The study thus aims to formulate the theoretical basis of the macroscopic instabilities of the moderator, i.e. the asymmetric motions that can cause moderator boiling. The challenge of the project is to determine the impact of these flow configurations on the reactivity of the CANDU-6 reactor.
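
    The porous-medium (distributed hydraulic resistance) approach mentioned above is typically expressed as a momentum sink added to the averaged Navier-Stokes equations. A generic Darcy-Forchheimer form is sketched below for reference only; the coefficients are illustrative and not the specific closure used for the CANDU-6 calandria:

        \mathbf{S}_{\mathrm{mom}} = -\left( \frac{\mu}{K}\,\mathbf{u} + \frac{1}{2}\, C_2\, \rho\, \lvert \mathbf{u} \rvert\, \mathbf{u} \right)

    where K is an equivalent permeability and C_2 an inertial-loss coefficient, both tuned so that the sink reproduces the measured pressure drop across the tube bank.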

  3. Modelisation par elements finis du muscle strie

    NASA Astrophysics Data System (ADS)

    Leonard, Mathieu

    This research project created a finite element model of human striated muscle in order to study the mechanisms that produce traumatic muscle injuries. The model constitutes a numerical platform able to discern the influence of the mechanical properties of the fasciae and of the muscle cell on the dynamic behaviour of the muscle during an eccentric contraction, in particular the Young's modulus and shear modulus of the connective-tissue layer, the orientation of the collagen fibres of this membrane, and the Poisson's ratio of the muscle. In vitro experimental characterization of these parameters at high strain rates, using active human striated muscle, is essential for the study of traumatic muscle injuries. The numerical model developed is able to represent muscle contraction as a phase transition of the muscle cell, through a change in stiffness and volume, using the material constitutive laws predefined in the LS-DYNA software (v971, Livermore Software Technology Corporation, Livermore, CA, USA). The project therefore introduces a physiological phenomenon that could explain common muscle injuries (cramps, soreness, strains, etc.) as well as diseases or disorders of the connective tissue such as collagenoses and muscular dystrophy. The predominance of muscle injuries during eccentric contractions is also discussed. The model developed in this project thus brings to the fore the concept of phase transition, opening the door to new technologies for muscle activation in people with paraplegia, or to compact artificial muscles for prostheses and exoskeletons. Keywords: striated muscle, muscle injury, fascia, eccentric contraction, finite element model, phase transition.

  4. Developpement D'un Modele Climatique Regional: Fizr Simulation des Conditions de Janvier de la Cote Ouest Nord Americaine

    NASA Astrophysics Data System (ADS)

    Goyette, Stephane

    1995-11-01

    The subject of this thesis is regional numerical climate modelling. The main objective is to develop a regional climate model capable of simulating phenomena at the spatial mesoscale. The study area is the North American West Coast, chosen because of the complexity of its relief and of the control the relief exerts on the climate. The motivations for this study are twofold: on the one hand, the coarse spatial resolution of atmospheric general circulation models (GCMs) cannot, in practice, be increased without an excessive increase in integration costs; on the other hand, environmental management increasingly requires regional climate data at finer spatial resolution. GCMs have so far been the models most valued for their ability to simulate the climate and global climate change. Fine-scale climate phenomena, however, still escape GCMs because of their coarse resolution, and the socio-economic repercussions of possible climate change are closely tied to phenomena that current GCMs cannot perceive. To circumvent some of these resolution problems, a practical approach is to take a limited spatial domain of a GCM and to nest within it another numerical model with a high-resolution grid. This nesting implies a new numerical simulation. This "retro-simulation" is guided, over the restricted domain, by pieces of information supplied by the GCM and is forced by mechanisms handled only by the nested model. Thus, in order to refine the spatial precision of large-scale climate predictions, we develop here a numerical model called FIZR that provides regional climate information valid at fine spatial scale. This new class of nested "intelligent" model-interpolators belongs to the family of so-called driven models. The guiding hypothesis of our study is that fine-scale climate is often governed by surface forcings rather than by large-scale atmospheric transport. The proposed technique therefore guides FIZR with the sampled dynamics of a GCM and forces it with the GCM physics as well as with a mesoscale orographic forcing at every node of the fine computational grid. To assess the robustness and accuracy of our regional climate model, we chose the West Coast of the North American continent, a region whose geographic distribution of precipitation and temperature is strongly influenced by the underlying relief. The results of a January simulation with FIZR show that we can simulate precipitation and screen-level temperature fields much closer to climate observations than those simulated by a GCM. This performance is clearly attributable to the mesoscale orographic forcing and to the surface characteristics specified at fine scale.
A model similar to FIZR can, in principle, be implemented on top of any GCM, so any research organization involved in global large-scale numerical modelling could equip itself with such a regionalization tool.

  5. Etude aerodynamique d'un jet turbulent impactant une paroi concave

    NASA Astrophysics Data System (ADS)

    LeBlanc, Benoit

    Given the growing demand for higher temperatures in the combustion chambers of aerospace propulsion systems (turboshaft engines, jet engines, etc.), interest in impinging-jet cooling has grown. Cooling the turbine blades allows an increase in combustion temperature, which translates into higher combustion efficiency and therefore better fuel economy. Heat transfer in the blades is influenced by the aerodynamics of jet cooling, particularly for turbulent flows. A lack of understanding of the aerodynamics inside these confined spaces can lead to unexpected changes in heat transfer, which increases the risk of creep. It is therefore of interest to the aerospace industry and to academia to pursue research on turbulent jets impinging on curved walls. Jets impinging on curved surfaces have already been the subject of numerous studies. However, oscillatory conditions observed in the laboratory have proved difficult to reproduce numerically, since the flow structures of jets impinging on concave walls depend strongly on turbulence and on unsteady effects. An experimental study was carried out at the PPRIME Institute at the Universite de Poitiers to observe the oscillation phenomenon in the jet. A series of tests covered laminar and turbulent flow conditions, but the cost of the experiments only allowed a glimpse of the global phenomenon. A second series of tests was performed numerically at the Universite de Moncton with OpenFOAM for laminar, two-dimensional flow conditions. The goal of the present study is therefore to continue the investigation of the oscillatory aerodynamics of jets impinging on curved walls, but for a transitional, turbulent, three-dimensional flow regime. The Reynolds numbers used in the numerical study, based on the jet diameter, are Red = 3333 and 6667, regarded as transitional to turbulence. In this study a numerical set-up is built; the mesh, the numerical scheme, the boundary conditions and the discretization are discussed and selected, and the results are then validated against experimental turbulence data. In turbulence modelling, Reynolds-Averaged Navier-Stokes (RANS) models have difficulties with unsteady flows in the transitional regime. Large Eddy Simulation (LES) offers a more accurate solution, but at a cost still out of reach for this study. The method employed here is Detached Eddy Simulation (DES), a hybrid of the two (RANS and LES). To analyse the flow topology, Proper Orthogonal Decomposition (POD) was also applied to the numerical results. The study first showed the relatively high computational time associated with DES runs needed to keep the Courant number low. The numerical results nevertheless succeeded in correctly reproducing the asynchronous flapping observed in the experiments. The observed flapping appears to be caused by transitional effects, which would explain the difficulty RANS models have in correctly reproducing the aerodynamics of this flow. The jet flow, in turn, is three-dimensional and turbulent most of the time, except for short periods during which it is stable and independent of the third dimension. The topological study of the flow also allowed the identification of the main underlying structures that were blurred by the turbulence. Keywords: impinging jet, concave wall, turbulence, transitional, detached eddy simulation (DES), OpenFOAM.
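
    As a point of reference for the POD step mentioned above, the snapshot POD of simulation output is commonly computed with a thin SVD. The following is a minimal, generic NumPy sketch; the array layout, mode count and synthetic data are illustrative assumptions, not the post-processing chain of this thesis.

        import numpy as np

        def snapshot_pod(snapshots, n_modes=5):
            """Snapshot POD via the SVD.

            snapshots : (n_points, n_snapshots) array, each column one flow field
                        sampled at one time instant.
            Returns the temporal mean, the first n_modes spatial modes, their
            relative energies and the temporal coefficients.
            """
            # Subtract the temporal mean so the modes describe fluctuations only.
            mean_field = snapshots.mean(axis=1, keepdims=True)
            fluct = snapshots - mean_field

            # Thin SVD: columns of U are spatial modes, S**2 their energy content.
            U, S, Vt = np.linalg.svd(fluct, full_matrices=False)

            modes = U[:, :n_modes]
            energy = S[:n_modes] ** 2 / np.sum(S ** 2)     # relative energy per mode
            coeffs = np.diag(S[:n_modes]) @ Vt[:n_modes]   # temporal coefficients

            return mean_field, modes, energy, coeffs

        # Example with synthetic data: 1000 spatial points, 200 snapshots.
        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            data = rng.standard_normal((1000, 200))
            _, modes, energy, coeffs = snapshot_pod(data, n_modes=3)
            print(modes.shape, energy, coeffs.shape)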

  6. Modelisation numerique de l'hydrologie pour l'aide a la gestion des bassins versants, par l'utilisation conjointe des systemes d'information geographique et de la methode des elements finis un nouvel outil pour le developpement durable SAGESS

    NASA Astrophysics Data System (ADS)

    Bel Hadj Kacem, Mohamed Salah

    All hydrological processes are affected by the spatial variability of the physical parameters of the watershed, and also by human intervention on the landscape. The water outflow from a watershed strictly depends on the spatial and temporal variability of these physical parameters. It is now apparent that the integration of mathematical models into GIS can benefit both GIS and three-dimensional environmental models: a true modeling capability can help the modeling community bridge the gap between planners, scientists, decision-makers and end-users. The main goal of this research is to design a practical tool that simulates surface run-off by combining Geographic Information Systems with a simulation of the hydrological behaviour by the Finite Element Method.

  7. Etude de pratiques d'enseignement relatives a la modelisation en sciences et technologies avec des enseignants du secondaire

    NASA Astrophysics Data System (ADS)

    Aurousseau, Emmanuelle

    Models are widely used tools in science and technology (S&T) to represent and explain a phenomenon that is difficult to access, or even abstract. The modelling process is presented explicitly in the Quebec education program (PFEQ), notably for the second cycle of secondary school (Quebec, Ministere de l'Education, du Loisir et du Sport, 2007a). It is thus one of the seven approaches that students and teachers are expected to use. However, many studies highlight the difficulty teachers have in structuring their teaching practices around models and the modelling process, even though these are recognized as indispensable. Models help reconcile the concrete and abstract domains between which the scientist, even a budding one, moves back and forth in order to relate the experimental reference field that is manipulated and observed to the associated theoretical field that is constructed. The objective of this research is therefore to understand how models and the modelling process help articulate the concrete and the abstract in the teaching of science and technology (S&T) in the second cycle of secondary school. To answer this question, we worked with teachers in a collaborative perspective through focus groups and classroom observation. These arrangements allowed us to examine the teaching practices that four teachers implement when using models and modelling processes. The analysis of the teaching practices and of the adjustments the teachers envisage in their practice allows us to draw out knowledge both for research and for teaching practice regarding the use of models and of the modelling process in secondary-school S&T.

  8. Conceptual Modeling (CM) for Military Modeling and Simulation (M&S) (Modelisation conceptuelle (MC) pour la modelisation et la simulation (M&S) militaires)

    DTIC Science & Technology

    2012-07-01

    ... of the modelling and simulation community and provide it with implementation guidance; and provide ... definition; relationship to standards; specification of a CM management process; specification of CM artefacts. Important considerations ... using the present guidance as a reference. • The VV&A (verification, validation and acceptance) of conceptual models must be an integral part of the

  9. Modelisation numerique de tunnels de metro dans les massifs rocheux sedimentaires de la region de Montreal

    NASA Astrophysics Data System (ADS)

    Lavergne, Catherine

    Geological formations of the Montreal area are mostly made of limestones. The usual design approach is based on rock mass classification systems that consider the rock mass as an equivalent continuous and isotropic material. However, for shallow excavations, stability is generally controlled by geological structures, which in Montreal are bedding planes that give the rock mass a strong stress and strain anisotropy. The objectives of the research are to build a numerical model that accounts for the anisotropy of sedimentary rocks and to determine the influence of the design parameters on displacements, stresses and failure around unsupported metro excavations. The geotechnical data used in this study come from a metro extension project and were made available to the author. The excavation geometries analysed are the running tunnel, the station and a garage consisting of three (3) parallel tunnels, for rock cover between 4 and 16 m. The numerical modelling was done with the FLAC software, which represents a continuous medium, using the ubiquitous-joint constitutive model to simulate the strength anisotropy of sedimentary rock masses. The model considers gravity stresses for an anisotropic material as well as pore pressures. In total, eleven (11) design parameters were analysed. Results show that the unconfined compressive strength of the intact rock, fault zones and pore pressures in soils have an important influence on the stability of the numerical model. The geometry of the excavation, the thickness of rock cover, the RQD, Poisson's ratio and the horizontal tectonic stresses have a moderate influence. Finally, the ubiquitous-joint parameters, pore pressures in the rock mass, the width of the garage pillars and the damage linked to the excavation method have a low impact. The FLAC results were compared with those of UDEC, a code based on the distinct element method. Similar conclusions were obtained regarding displacements, stress state and failure modes; however, the UDEC model gives slightly less conservative results than FLAC. This study stands out by its local character and by the large amount of geotechnical data available to determine the parameters of the numerical model. The results led to recommendations for laboratory tests that can be applied to characterize more specifically the anisotropy of sedimentary rocks.

  10. Modelisation numerique d'un actionneur plasma de type decharge a barriere dielectrique par la methode de derive-diffusion

    NASA Astrophysics Data System (ADS)

    Xing, Jacques

    The dielectric barrier discharge (DBD) plasma actuator is a device proposed for active flow control in order to improve the performance of aircraft and turbomachines. Essentially, these actuators are made of two electrodes separated by a layer of dielectric material and convert electricity directly into flow momentum. Because of the high costs associated with experiments in realistic operating conditions, there is a need to develop a robust numerical model that can predict the plasma body force and the effects of various parameters on it. Indeed, this plasma body force can be affected by atmospheric conditions (temperature, pressure, and humidity), by the velocity of the neutral flow, by the applied voltage (amplitude, frequency, and waveform), and by the actuator geometry. In that respect, the purpose of this thesis is to implement a plasma model for DBD actuators that has the potential to account for the effects of these various parameters. In DBD actuator modelling, two types of approach are commonly proposed: low-order (or phenomenological) modelling and high-order (or scientific) modelling. However, a critical analysis presented in this thesis showed that phenomenological models are not robust enough to predict the plasma body force without artificial calibration for each specific case; moreover, they are based on erroneous assumptions. Hence, the selected approach to model the plasma body force is a scientific drift-diffusion model with four chemical species (electrons, positive ions, negative ions, and neutrals). This model was chosen because it gives numerical results consistent with experimental data. Moreover, it has great potential to include the effect of temperature, pressure, and humidity on the plasma body force and requires only a reasonable computational time. The model was independently implemented in the C++ programming language and validated on several test cases. It was later used to simulate the effect of the plasma body force on laminar-turbulent transition over an airfoil in order to assess its performance in a practical CFD simulation. Numerical results show that this model gives a better prediction of the effect of the plasma on the fluid flow for a practical aerospace case than a phenomenological model.
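
    For reference, a drift-diffusion description of the kind referred to above is usually written as continuity equations for the charged species coupled to Poisson's equation, with the body force given by the net charge density. The generic form below uses standard notation; the exact source terms and species set used in the thesis may differ:

        \frac{\partial n_k}{\partial t} + \nabla \cdot \left( \operatorname{sgn}(q_k)\, \mu_k\, n_k\, \mathbf{E} - D_k \nabla n_k \right) = S_k , \qquad k \in \{ e, +, - \}

        \nabla \cdot \left( \varepsilon_0 \varepsilon_r \nabla \phi \right) = -e \left( n_{+} - n_{-} - n_{e} \right), \qquad \mathbf{E} = -\nabla \phi

        \mathbf{f}_{\mathrm{body}} = e \left( n_{+} - n_{-} - n_{e} \right) \mathbf{E}

    where n_k, mu_k and D_k are the number density, mobility and diffusivity of species k, and S_k collects ionization, attachment and recombination terms.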

  11. Modelisation of the SECM in molten salts environment

    NASA Astrophysics Data System (ADS)

    Lucas, M.; Slim, C.; Delpech, S.; di Caprio, D.; Stafiej, J.

    2014-06-01

    We develop a cellular automata modelisation of SECM experiments to study corrosion in molten salt media for Generation IV nuclear reactors. The electrodes used in these experiments are cylindrical glass tips with a coaxial metal wire inside. As a result of the simulations we obtain the current approach curves of electrodes whose geometries are characterized by several values of the ratio of glass to metal area at the tip. We compare these results with the predictions of known analytic expressions, solutions of the partial differential equations for a flat, uniform substrate geometry. We also present results for other, more complicated substrate surface geometries, e.g. a regular saw-tooth modulated surface, a surface obtained by an Eden growth process, ...
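
    As a rough illustration of the kind of lattice model described above, the sketch below uses a deterministic finite-difference diffusion update on a 2D grid (a stand-in for the stochastic cellular automata of the paper) with an absorbing metal disc, an insulating glass sheath and an insulating substrate. The grid sizes, the glass-to-metal ratio and the boundary handling are illustrative assumptions only.

        import numpy as np

        nx, nz = 201, 60        # x along the substrate, z across the tip-substrate gap
        metal_half = 10         # half-width of the metal disc (cells)
        glass_half = 30         # half-width of metal + glass sheath (RG ratio ~ 3)
        alpha = 0.2             # D*dt/dx**2, kept < 0.25 for stability
        cx = nx // 2

        c = np.ones((nz, nx))   # normalized concentration, bulk value = 1

        for step in range(5000):
            # Explicit diffusion update (5-point Laplacian).
            lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
                   np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4.0 * c)
            c += alpha * lap

            # Boundary conditions, re-imposed after each step:
            c[0, :] = c[1, :]                                   # insulating substrate (zero flux)
            c[:, 0], c[:, -1] = 1.0, 1.0                        # bulk solution at the sides
            c[-1, :] = 1.0                                      # bulk above the tip plane...
            glass = slice(cx - glass_half, cx + glass_half + 1)
            c[-1, glass] = c[-2, glass]                         # ...except the insulating glass
            c[-1, cx - metal_half:cx + metal_half + 1] = 0.0    # and the absorbing metal disc

        # Normalized tip "current": total diffusive flux into the metal disc.
        flux = np.sum(c[-2, cx - metal_half:cx + metal_half + 1])
        print("tip flux (arbitrary units):", flux)

    Repeating such a calculation for several gap widths would yield the kind of approach curve compared above with the analytic flat-substrate expressions.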

  12. Le recours aux modeles dans l'enseignement de la biologie au secondaire : Conceptions d'enseignantes et d'enseignants et modes d'utilisation

    NASA Astrophysics Data System (ADS)

    Varlet, Madeleine

    The use of models and modelling is mentioned in the scientific literature as a way to foster constructivist teaching and learning practices and thereby address learning difficulties in science. Studying teachers' relationship to models and modelling is therefore relevant for understanding their teaching practices and for identifying elements which, if taken into account in initial and disciplinary training, can contribute to the development of constructivist science teaching. Several studies have examined these conceptions without distinguishing between the subjects taught, such as physics, chemistry or biology, even though models are not necessarily used or understood in the same way in these different disciplines. Our research focused on the conceptions of secondary-school biology teachers regarding scientific models, some forms of representation of these models, and the ways they are used in class. The results, obtained through a series of semi-structured interviews, indicate that their conceptions of models are globally compatible with the scientifically accepted view, but vary with respect to the forms of representation of models. Examination of these conceptions reveals a limited knowledge of models that varies with the subject taught. Level of education, prior training, teaching experience and a possible compartmentalization of subjects could explain the different conceptions identified. In addition, temporal, conceptual and technical difficulties can hold back their attempts at modelling with students. Nevertheless, our results support the hypothesis that teachers' own conceptions of models, of their forms of representation and of a constructivist approach to teaching represent the greatest obstacles to building models in class. Keywords: models and modelling, biology, conceptions, modes of use, constructivism, teaching, secondary school.

  13. Vectored Thrust Digital Flight Control for Crew Escape. Volume 2.

    DTIC Science & Technology

    1985-12-01

    no. 24. Lecrique, J., A. Rault, M. Tessier and J.L. Testud (1978), - "Multivariable Regulation of a Thermal Power Plant Steam Generator," presented...and Extended Kalman Observers," presented at the Conf. Decision and Control, San Diego, CA. Testud , J.L. (1977), Commande Numerique Multivariable du

  14. Turbomachinery Design Using CFD (La Conception des Turbomachines par l’Aerodynamique Numerique).

    DTIC Science & Technology

    1994-05-01

    Reference fragments: "Method for Flow Calculations in Turbomachines", Vrije Univ. Brussel, Dienst Stromingsmechanica ... Thompkins, W.T., 1981, "A Fortran Program for Calculating ..." ... "Model Equation for Simulating Flows in Multistage Turbomachinery" ... MBB-Bericht Nr. UFE 1352, 1977 ... ASME paper 85-GT-226, Houston, March

  15. Methodes d'amas quantiques a temperature finie appliquees au modele de Hubbard

    NASA Astrophysics Data System (ADS)

    Plouffe, Dany

    Since their discovery in the 1980s, high-critical-temperature superconductors have attracted a great deal of interest in solid-state physics. Understanding the origin of the phases observed in these materials, such as superconductivity, has been one of the great challenges of theoretical solid-state physics over the past 25 years. One of the mechanisms proposed to explain these phenomena is the strong electron-electron interaction. The Hubbard model is one of the simplest models that accounts for these interactions. Despite its apparent simplicity, some of its features, including its phase diagram, are still not well established, in spite of several theoretical advances in recent years. This study is devoted to the analysis of numerical methods for computing various properties of the Hubbard model as a function of temperature. We describe methods (VCA and CPT) that allow the finite-temperature Green function of an infinite system to be computed approximately from the Green function calculated on a cluster of finite size. To compute these Green functions, we use techniques that considerably reduce the numerical effort needed for thermodynamic averages, by greatly reducing the space of states to be considered in these averages. Although this study aims first at developing cluster methods to solve the Hubbard model at finite temperature in a general way and at studying the basic properties of this model, we apply it to conditions approaching those of high-critical-temperature superconductors. The methods presented here make it possible to draw a phase diagram for antiferromagnetism and superconductivity that shows several similarities with that of high-temperature superconductors. Keywords: Hubbard model, thermodynamics, antiferromagnetism, superconductivity, numerical methods, large matrices.
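
    For context, the single-band Hubbard Hamiltonian referred to above is conventionally written as follows (standard notation; the lattice, filling and parameter values studied in the thesis are not reproduced here):

        H = -t \sum_{\langle i,j \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right) + U \sum_{i} n_{i\uparrow} n_{i\downarrow} - \mu \sum_{i,\sigma} n_{i\sigma}

    where t is the nearest-neighbour hopping amplitude, U the on-site repulsion and \mu the chemical potential; finite-temperature properties then follow from grand-canonical averages of the form \langle A \rangle = \operatorname{Tr}\!\left[ A\, e^{-\beta H} \right] / \operatorname{Tr}\!\left[ e^{-\beta H} \right], which is where the reduction of the state space mentioned above becomes important.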

  16. Hidden Markov random field model and Broyden-Fletcher-Goldfarb-Shanno algorithm for brain image segmentation

    NASA Astrophysics Data System (ADS)

    Guerrout, EL-Hachemi; Ait-Aoudia, Samy; Michelucci, Dominique; Mahiou, Ramdane

    2018-05-01

    Many routine medical examinations produce images of patients suffering from various pathologies. With the huge number of medical images, manual analysis and interpretation became a tedious task, so automatic image segmentation became essential for diagnosis assistance. Segmentation consists in dividing the image into homogeneous and significant regions. We focus on hidden Markov random fields (HMRF) to model the segmentation problem; this modelling leads to a classical function-minimisation problem. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm is one of the most powerful methods for solving unconstrained optimisation problems. In this paper, we investigate the combination of HMRF and the BFGS algorithm to perform the segmentation. The proposed method shows very good segmentation results compared with well-known approaches. The tests are conducted on brain magnetic resonance image databases (BrainWeb and IBSR) that are widely used to confront results objectively. The well-known Dice coefficient (DC) was used as the similarity metric. The experimental results show that, in many cases, our proposed method approaches the perfect segmentation, with a Dice coefficient above 0.9; moreover, it generally outperforms other methods in the tests conducted.
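
    The two ingredients named above, the Dice coefficient and a BFGS minimisation, can be illustrated with a short generic sketch. SciPy's general-purpose BFGS routine and the toy quadratic-plus-quartic energy below are stand-ins chosen only for the example; they are not the authors' HMRF energy or implementation.

        import numpy as np
        from scipy.optimize import minimize

        def dice_coefficient(seg_a, seg_b, label):
            """Dice coefficient between two label maps for one tissue class."""
            a = (seg_a == label)
            b = (seg_b == label)
            intersection = np.logical_and(a, b).sum()
            denom = a.sum() + b.sum()
            return 2.0 * intersection / denom if denom else 1.0

        # Toy smooth objective standing in for an HMRF energy; BFGS only needs
        # the function (gradients are approximated numerically if not supplied).
        def toy_energy(x):
            return np.sum((x - 3.0) ** 2) + 0.1 * np.sum(x ** 4)

        result = minimize(toy_energy, x0=np.zeros(4), method="BFGS")
        print("minimiser:", result.x)

        # Dice example on two small synthetic segmentations (labels 0/1/2).
        ref = np.array([[0, 1, 1], [2, 2, 1], [0, 0, 2]])
        est = np.array([[0, 1, 1], [2, 1, 1], [0, 0, 2]])
        print("Dice (label 1):", dice_coefficient(ref, est, label=1))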

  17. Developpements numeriques recents realises en aeroelasticite chez Dassault Aviation pour la conception des avions de combat modernes et des avions d’affaires

    DTIC Science & Technology

    2003-03-01

    combat modernes et des avions d’affaires. E. Garrigues, Th. Percheron, DASSAULT AVIATION, DGT/DTA/IAP, F-92214 Saint-Cloud Cedex, France. 1. Introduction ... flight parameters, rigid-body accelerations and structural responses (strain gauges and accelerometers). (Figure: structural prediction, adjustments, flight tests.)

  18. Etude de la transmission sonore a travers un protecteur de type "coquilles" : modelisation numerique et validation experimentale

    NASA Astrophysics Data System (ADS)

    Boyer, Sylvain

    It is estimated that of the 3.7 million workers in Quebec, more than 500,000 are exposed daily to noise levels that can cause damage to the auditory system. When it is not possible to reduce the ambient noise level by modifying the noise sources or by limiting sound propagation, wearing individual hearing protectors, such as earmuffs, remains the last resort. Although seen as a short-term solution, it is commonly used because it is inexpensive, easy to deploy and adaptable to most operations in noisy environments. However, hearing protectors can be both ill-suited to the workers and their environment and uncomfortable, which limits wearing time and reduces their effective protection. To address these difficulties, a research project on hearing protection entitled "Development of tools and methods to better assess and improve individual hearing protection of workers" was set up in 2010, associating the Ecole de technologie superieure (ETS) and the Institut de recherche Robert-Sauve en sante et en securite du travail (IRSST). Within this research program, the present doctoral work deals specifically with hearing protection by means of "passive" earmuff-type protectors, whose use raises three specific issues presented in the following paragraphs. The first issue is the discomfort caused, for example, by the static pressure induced by the headband clamping force, which can reduce the wearing time recommended to limit noise exposure. The user should therefore be given a comfortable protector adapted to the work environment and activity. The second issue is the evaluation of the actual protection provided by the protector. The REAT (Real Ear Attenuation Threshold) method, also regarded as a "gold standard", is used to quantify noise reduction but generally overestimates protector performance. Field measurement techniques, such as F-MIRE (Field Measurement in Real Ear), may in the future be better tools for assessing individual attenuation. While these techniques exist for earplugs, they must be adapted and improved for earmuffs, by determining the optimal location of the acoustic sensors and the individual compensation factors that relate the microphone measurement to the measurement that would have been taken at the eardrum. The third issue is the optimization of earmuff attenuation to adapt it to the individual and to the work environment. Indeed, earmuff design is generally based on empirical concepts and trial-and-error methods on prototypes. Predictive tools have been studied very little so far and deserve further attention. The use of virtual prototyping would make it possible to optimize the design before production, to accelerate the product development phase and to reduce its cost. The general objective of this thesis is to address these issues through the development of a model of the sound attenuation of an earmuff-type hearing protector.
Because of the complexity of the geometry of these protectors, the main modelling method selected a priori is the finite element method (FEM). To reach this general objective, three specific objectives were established and are presented in the following three paragraphs. (Abstract shortened by ProQuest.)

  19. Time Sensitive Course of Action Development and Evaluation

    DTIC Science & Technology

    2010-10-01

    Applications militaires de la modelisation humaine ). RTO-MP-HFM-202 14. ABSTRACT The development of courses of action that integrate military with...routes between the capital town C of the province and a neighboring country M. Both roads are historically significant smuggling routes. There were

  20. Biological Rhythms Modelisation of Vigilance and Sleep in Microgravity State with COSINOR and Volterra's Kernels Methods

    NASA Astrophysics Data System (ADS)

    Gaudeua de Gerlicz, C.; Golding, J. G.; Bobola, Ph.; Moutarde, C.; Naji, S.

    2008-06-01

    Spaceflight under microgravity causes basic biological and physiological imbalances in human beings. Many studies have already been published on this topic, especially regarding sleep disturbances and circadian rhythms (vigilance-sleep alternation, body temperature, ...). Factors such as space motion sickness, noise, or excitement can cause severe sleep disturbances. For stays of longer than four months in space, gradual increases in the planned duration of sleep were reported. [1] The average sleep in orbit was more than 1.5 hours shorter than during control periods on Earth, where sleep averaged 7.9 hours. [2] Alertness and calmness recordings yielded a clear circadian pattern of 24 h but with a phase delay of 4 h. Calmness showed a biphasic component (12 h); mean sleep duration was 6.4 h, structured by 3-5 non-REM/REM cycles. Models of the neurophysiological mechanisms of stress and of the interactions between various physiological and psychological rhythm variables can already be built with the COSINOR method. [3]
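
    As background for the COSINOR analysis mentioned above, the single-component cosinor fits a sinusoid of fixed period by linear least squares. The sketch below is a generic Python illustration with synthetic data; the function name, the 24 h period and the noise level are assumptions made for the example, not the study's protocol.

        import numpy as np

        def cosinor_fit(t, y, period=24.0):
            """Single-component cosinor fit: y(t) ~ M + A*cos(2*pi*t/period + phi).

            Linearised as y = M + beta*cos(w*t) + gamma*sin(w*t) and solved by
            ordinary least squares.  Returns MESOR M, amplitude A and acrophase phi.
            """
            w = 2.0 * np.pi / period
            X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
            (M, beta, gamma), *_ = np.linalg.lstsq(X, y, rcond=None)
            A = np.hypot(beta, gamma)
            phi = np.arctan2(-gamma, beta)   # acrophase, in radians
            return M, A, phi

        # Synthetic example: a 24 h rhythm sampled every hour for 3 days.
        t = np.arange(0.0, 72.0, 1.0)
        y = 5.0 + 2.0 * np.cos(2 * np.pi * t / 24.0 + 0.8) + 0.1 * np.random.randn(t.size)
        print(cosinor_fit(t, y))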

  1. Simulation '80 Symposium.

    DTIC Science & Technology

    1980-11-21

    defensive, and both the question and the answer seemed to generate supporting reactions from the audience. Discrete Event Simulation: the session on ... R. Toscano / A. Maceri / F. Maceri (Italy), "Analyse numerique de quelques problemes de contact en theorie des membranes" ... (Switzerland), "Stockage de chaleur a faible profondeur : simulation par elements finis" ... A. Rizk Abu El-Wafa / M. Tawfik / M.S. Mansour (Egypt), "Digital

  2. Computational and Experimental Assessment of Jets in Cross Flow (Evaluation Numerique et Experimentale des Jets dans des Courants Transversaux)

    DTIC Science & Technology

    1993-11-01

    ... are characterized by continuous schlieren visualization of the part of the mixing layer located beneath the jet (figure 3), as well as by tomoscopy ... (characterize the waves). These waves seem to originate from the region of the ejector, just ... a Mach disc; in figure 4, one observes the trace of the

  3. Human Modelling for Military Application (Applications militaires de la modelisation humaine)

    DTIC Science & Technology

    2010-10-01

    techniques (rooted in the mathematics-centered analytic methods arising from World War I analyses by Lanchester 2 ). Recent requirements for research and...34Dry Shooting for Airplane Gunners - Popular Science Monthly". January 1919. p. 13-14. 2 Lanchester F.W., Mathematics in Warfare in The World of

  4. Algorithms for Robust Identification and Control of Large Space Structures. Phase 1.

    DTIC Science & Technology

    1988-05-14

    Variate Analysis," Proc. Amer. Control Conf., San Francisco, * pp. 445-451. LECTIQUE, J., Rault, A., Tessier, M., and Testud , J.L. (1978), "Multivariable...Rault, J.L. Testud , and J. Papon (1978), "Model Predictive Heuris- tic Control: Applications to Industrial Processes," Automatica, Vol. 14, pp. 413...Control ’. Conference, Minneapolis, MN, June. TESTUD , J.L. (1979), "Commande Numerique Multivariable du Ballon de Recupera- tion de Vapeur," Adersa/Gerbios

  5. AGARD Flight Test Instrumentation Series. Volume 19. Digital Signal Conditioning for Flight Test. (Le Traitement du Signal Numerique pour les Essais en Vol)

    DTIC Science & Technology

    1991-06-01

    intensive systems, including the use of onboard digital computers. Topics include: measurements that are digital in origin, sampling, encoding, transmitting...Individuals charged with designing aircraft measuring systems to become better acquainted with new solutions to their requirements. This volume Is...concerned with aircraft measuring systems as related to flight test and flight research. Measure - ments that are digital in origin or that must be

  6. A Selection of Experimental Test Cases for the Validation of CFD Codes (Recueil de cas d’essai experimentaux pour la validation des codes de l’aerodynamique numerique). Volume 1

    DTIC Science & Technology

    1994-08-01

    volume II. The report is accompanied by a set of diskettes containing the relevant data for all the test cases; these diskettes are available ... GERMANY. PURPOSE OF THE TEST: The tests are part of a larger effort to establish a database of experimental measurements for missile configurations

  7. Team Modelling: Survey of Experimental Platforms (Modelisation d’equipes : Examen de plate-formes experimentales)

    DTIC Science & Technology

    2006-09-01

    Control Force Agility Shared Situational Awareness Attentional Demand Interoperability Network Based Operations Effect Based Operations Speed of...Command Self Synchronization Reach Back Reach Forward Information Superiority Increased Mission Effectiveness Humansystems® Team Modelling...communication effectiveness and Distributed Mission Training (DMT) effectiveness . The NASA Ames Centre - Distributed Research Facilities platform could

  8. Bellman Continuum (3rd) International Workshop (13-14 June 1988)

    DTIC Science & Technology

    1988-06-01

    Table-of-contents fragments: Modelling Uncertain Problem, p. 53; David Bensoussan, Asymptotic Linearization of Uncertain Multivariable Systems by Sliding Modes ... K. Ghosh, Robust Model Tracking for a Class of Singularly Perturbed Nonlinear Systems via Composite Control, p. 93; F. Garofalo and L. Glielmo ... MODELISATION ET COMMANDE EN ECONOMIE / MODELS AND CONTROL POLICIES IN ECONOMICS; Qualitative Differential Games: A Viability Approach, p. 117.

  9. CFD Techniques for Propulsion Applications Panel Symposium (77th), Held in San Antonio, Texas on 27-31 May 1991 (Les Techniques de l’Aerodynamique Numerique pour les Applications aux Propulseurs)

    DTIC Science & Technology

    1992-02-01

    CONCLUDING REMARKS ... secondary flow pattern. Probably both factors are influential. Unfortunately the present study has examined the ... secondary ... Panels which are composed of experts appointed by the National Delegates, the Consultant and Exchange Programme and the Aerospace Applications Studies ... CP 352, September 1983 / Combustion Problems in Turbine Engines, AGARD CP 353, January 1984 / Hazard Studies for Solid Propellant Rocket Motors, AGARD CP

  10. Computational Aerodynamics Based on the Euler Equations (L’aerodynamique Numerique a Partir des Equations d’Euler)

    DTIC Science & Technology

    1994-01-01

    0 The Mission of AGARD 0 According to its Charter, the mission of AGARD is to bring together the leading personalities of the NATO nations in the...advances in the aerospace sciences relevant to strengthening the common defence posture; • - Improving the co-operation among member nations in aerospace...for the physical principles. To construct the relevant equations for fluid gas consisting of pseudo particles, 10 is the internal energy due motion it

  11. Environmental Modeling Packages for the MSTDCL TDP: Review and Recommendations (Trousses de Modelisation Environnementale Pour le PDT DCLTCM: Revue et Recommendations)

    DTIC Science & Technology

    2009-09-01

    frequency shallow water scenarios, and DRDC has ready access to a well-established PE model ( PECan ). In those spectral areas below 1 kHz, where the PE...PCs Personnel Computers PE Parabolic Equation PECan PE Model developed by DRDC SPADES/ICE Sensor Performance and Acoustic Detection Evaluation

  12. Simulation numerique de l'effet du reflecteur radial sur les cellules rep en utilisant les codes DRAGON et DONJON

    NASA Astrophysics Data System (ADS)

    Bejaoui, Najoua

    Pressurized water reactors (PWRs) constitute the largest fleet of nuclear reactors in operation around the world. Although these reactors have been studied extensively by designers and operators using efficient numerical methods, some calculation weaknesses remain, given the geometric complexity of the core, such as the analysis of the neutron flux behaviour at the core-reflector interface. The standard calculation scheme is a two-step process. In the first step, a detailed calculation at the assembly level with reflective boundary conditions provides homogenized cross sections for the assemblies, condensed to a reduced number of energy groups; this step is called the lattice calculation. The second step uses the homogenized properties of each assembly to calculate reactor properties at the core level; this step is called the full-core (or whole-core) calculation. This decoupling of the two calculation steps is the origin of methodological biases, particularly at the core-reflector interface: the periodicity hypothesis used to generate the cross-section libraries becomes less pertinent for assemblies adjacent to the reflector, which is generally represented by one of two models, an equivalent reflector or albedo matrices. The reflector helps slow down neutrons leaving the reactor and returns them to the core. This effect leads to two fission peaks in the fuel assemblies located at the core/reflector interface, the fission rate increasing because of the greater proportion of re-entrant neutrons. This change in the neutron spectrum arises deep inside the fuel located on the periphery of the core. To remedy this, we simulated a peripheral assembly together with a TMI-PWR reflector and developed an advanced calculation scheme that takes into account the environment of the peripheral assemblies and generates equivalent neutronic properties for the reflector. This scheme was tested on a core without control mechanisms and loaded with fresh fuel. The results of this study show that the explicit representation of the reflector and the calculation of the peripheral assembly with our advanced scheme correct the energy spectrum at the core interface and increase the peripheral power by up to 12% compared with the reference scheme.
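
    The assembly homogenization and group condensation performed in the lattice step are conventionally defined by flux weighting. The textbook form below is given for reference only; it is not claimed to be the exact equivalence procedure used in this work:

        \Sigma_{x,G}^{\mathrm{hom}} = \frac{\displaystyle \int_{V} \mathrm{d}^3 r \sum_{g \in G} \Sigma_{x,g}(\mathbf{r})\, \phi_g(\mathbf{r})}{\displaystyle \int_{V} \mathrm{d}^3 r \sum_{g \in G} \phi_g(\mathbf{r})}

    where V is the assembly volume, g runs over the fine energy groups condensed into the macro-group G, and \phi_g is the lattice flux; the bias discussed above comes from computing \phi_g with reflective (periodic) boundary conditions that do not hold next to the reflector.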

  13. A Selection of Experimental Test Cases for the Validation of CFD Codes. Volume 2. (Recueil de cas d’essai Experimentaux Pour la Validation des Codes de L’Aerodynamique Numerique. Volume 2)

    DTIC Science & Technology

    1994-08-01

    ... The technique was modified to calculate the drag using the non-intrusive LDV and sidewall pressure measurements rather

  14. A Designer’s Guide to Human Performance Modelling (La Modelisation des Performances Humaines: Manuel du Concepteur).

    DTIC Science & Technology

    1998-12-01

    failure detection, monitoring, and decision making.) moderator function. Originally, the output from these One of the best known OCM implementations, the...imposed by the tasks themselves, the information and equipment provided, the task environment, operator skills and experience, operator strategies , the...problem-solving situation, including the toward failure.) knowledge necessary to generate the right problem- solving strategies , the attention that

  15. Computational approach to estimating the effects of blood properties on changes in intra-stent flow.

    PubMed

    Benard, Nicolas; Perrault, Robert; Coisne, Damien

    2006-08-01

    In this study, various blood rheological assumptions are numerically investigated for the hemodynamic properties of intra-stent flow. Non-Newtonian blood properties have not previously been implemented in investigations of stented coronary flow, although their effects appear essential for a correct estimation and distribution of the wall shear stress (WSS) exerted by the fluid on the internal vessel surface. Our numerical model is based on a full 3D stent mesh; rigid walls and steady inflow conditions are applied. Newtonian behaviour, a non-Newtonian model based on the Carreau-Yasuda relation, and a characteristic Newtonian viscosity defined from representative flow parameters are considered. Non-Newtonian flow generates an alteration of near-wall viscosity compared with the Newtonian case. Maximal WSS values are located in the central part of the stent pattern and minimal values are concentrated on the proximal stent wire surface. An increase in flow rate emphasizes fluid perturbations and generates a rise in WSS except in the interstrut area. Nevertheless, a local quantitative analysis reveals an underestimation of WSS when a Newtonian blood model is used, with the clinical consequence of overestimating the restenosis risk area. Introducing a characteristic viscosity appears to be a useful option compared with rheological modelling based on experimental data, with savings in computer time and relevant results for quantitative and qualitative WSS determination.
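
    For reference, the Carreau-Yasuda relation mentioned above can be evaluated in a few lines of code. The sketch below uses parameter values commonly quoted in the literature for blood; they are shown only as an illustration and are not necessarily the values used in this paper.

        import numpy as np

        def carreau_yasuda_viscosity(shear_rate, mu0=0.056, mu_inf=0.00345,
                                     lam=3.313, a=2.0, n=0.3568):
            """Carreau-Yasuda apparent viscosity (Pa.s) versus shear rate (1/s).

            mu(gamma) = mu_inf + (mu0 - mu_inf) * [1 + (lam*gamma)**a] ** ((n - 1)/a)

            Default parameters are commonly quoted literature values for blood,
            used here only for illustration.
            """
            shear_rate = np.asarray(shear_rate, dtype=float)
            return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * shear_rate) ** a) ** ((n - 1.0) / a)

        # Apparent viscosity over a physiological range of shear rates.
        gamma = np.logspace(-1, 3, 5)          # 0.1 to 1000 1/s
        print(np.column_stack([gamma, carreau_yasuda_viscosity(gamma)]))

    At high shear rates the model approaches the constant infinite-shear viscosity, which is why a single "characteristic" Newtonian value, as tested in the paper, can be a reasonable approximation in fast-moving regions.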

  16. Future Modelling and Simulation Challenges (Defis futurs pour la modelisation et la simulation)

    DTIC Science & Technology

    2002-11-01

    Language School Figure 2: Location of the simulation center within the MEC Military operations research section - simulation lab Military operations... language . This logic can be probabilistic (branching is randomised, which is useful for modelling error), tactical (a branch goes to the task with the... language and a collection of simulation tools that can be used to create human and team behaviour models to meet users’ needs. Hence, different ways of

  17. Modelisation frequentielle de la permittivite du beton pour le controle non destructif par georadar

    NASA Astrophysics Data System (ADS)

    Bourdi, Taoufik

    Ground penetrating radar (GPR) is an attractive non-destructive testing (NDT) technique for measuring the thickness of concrete slabs and characterizing fractures, owing to its resolution and penetration depth. GPR equipment is becoming easier to use and interpretation software is becoming more readily available. However, several conferences and workshops on the application of GPR in civil engineering have concluded that further research is needed, in particular on the modelling and measurement techniques of the electrical properties of concrete. With better information on the electrical properties of concrete at GPR frequencies, instrumentation and interpretation techniques could be improved more effectively. The Jonscher model has proved effective in geophysics; its use in civil engineering is presented here for the first time. First, we validated the application of the Jonscher model for characterizing the dielectric permittivity of concrete. The results clearly showed that this model can faithfully reproduce the variation of the permittivity of different types of concrete over the GPR frequency band (100 MHz-2 GHz). Second, we demonstrated the value of the Jonscher model by comparing it with other models (Debye and extended Debye) already used in civil engineering. We also showed how the Jonscher model can help predict shielding effectiveness and interpret GPR waves. It was found that the Jonscher model gives a good representation of the variation of the permittivity of concrete over the GPR frequency range considered; moreover, this modelling is valid for different types of concrete and different water contents. In a final part, we presented the use of the Jonscher model for estimating the thickness of a concrete slab with the GPR technique in the frequency domain. Keywords: NDT, concrete, GPR, permittivity, Jonscher.
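
    For reference, the three-parameter Jonscher parameterization is commonly written in the GPR literature as shown below; the notation is generic and the fitted parameter values for the concretes studied are not reproduced here:

        \varepsilon_{e}(\omega) = \varepsilon_{\infty} + \chi_{r} \left( \frac{\omega}{\omega_{r}} \right)^{n-1} \left[ 1 - i \cot\!\left( \frac{n\pi}{2} \right) \right]

    where \varepsilon_{\infty} is the high-frequency limit of the real permittivity, \chi_{r} the real susceptibility at the reference angular frequency \omega_{r}, and 0 < n \le 1 controls the frequency dependence of both the real part and the losses.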

  18. 3D Modelling of Urban Terrain (Modelisation 3D de milieu urbain)

    DTIC Science & Technology

    2011-09-01

    Panel • IST Information Systems Technology Panel • NMSG NATO Modelling and Simulation Group • SAS System Analysis and Studies Panel • SCI... Systems Concepts and Integration Panel • SET Sensors and Electronics Technology Panel These bodies are made up of national representatives as well as...of a part of it may be made for individual use only. The approval of the RTA Information Management Systems Branch is required for more than one

  19. Directory of Factual and Numeric Databases of Relevance to Aerospace and Defence R and D (Repertoire de Bases de donnees Factuelles ou Numeriques d’interet pour la R and D).

    DTIC Science & Technology

    1992-07-01

    have become quite common in science and engineering, and will become more so as the demand for reliable data increases, and with it the pace of data ... over the last decade. They are expected to play a more important role in the future, with the evolution of the demand for reliable information and ... computational codes. The wind tunnel data contained in the SEADS data base were obtained using these forward fuselage models (10%, 4% and 2%) over the Mach

  20. Understanding and Modeling Vortical Flows to Improve the Technology Readiness Level for Military Aircraft (Comprehension et Modelisation des Flux de Vortex Pour Ameliorer le Niveau de Maturite Technologique au Profit des Avions Militaires)

    DTIC Science & Technology

    2009-10-01

    Figure residue (convergence plot): log(dρ/dt) versus iterations for the SA, EARSM, EARSM + CC, Hellsten EARSM, Hellsten EARSM + CC and DRSM models. VORTEX BREAKDOWN (RTO-TR-AVT-113): as a vortex passes through a normal shock, the tangential velocity is

  1. Human Behaviour Representation in Constructive Modelling (Representation du comportement humain dans des modelisations creatives)

    DTIC Science & Technology

    2009-09-01

    involved in R&T activities. RTO reports both to the Military Committee of NATO and to the Conference of National Armament Directors. It comprises a...4 11.5.3 Project Description 11-5 Chapter 12 – Technical Evaluation Report 12-1 12.1 Executive Summary 12-1 12.2 Introduction 12-2 12.3...modelling human factors has been slow over the past decade, other forums have been reporting a number of theoretical and applied papers on human behaviour

  2. Human Behaviour Representation in Constructive Modelling (Representation du comportement humain dans des modelisations creatives)

    DTIC Science & Technology

    2009-09-01

    ordination with other NATO bodies involved in R&T activities. RTO reports both to the Military Committee of NATO and to the Conference of National...Aims 11-4 11.5.2 Background 11-4 11.5.3 Project Description 11-5 Chapter 12 – Technical Evaluation Report 12-1 12.1 Executive Summary 12-1...track. Although progress in modelling human factors has been slow over the past decade, other forums have been reporting a number of theoretical and

  3. Effets de l'humidite sur la propagation du delaminage dans un composite carbone/epoxy sollicite en mode mixte I/II

    NASA Astrophysics Data System (ADS)

    LeBlanc, Luc R.

    Composite materials are increasingly used in fields such as aerospace, high-performance cars and sporting goods, to name a few. Studies have shown that exposure to moisture degrades the strength of composites by promoting the initiation and propagation of delamination. Of these studies, very few address the effect of moisture on delamination initiation under mixed-mode I/II loading, and none addresses the effect of moisture on the delamination growth rate under mixed-mode I/II loading in a composite. The first part of this thesis determines the effects of moisture on delamination growth under mixed-mode I/II loading. Specimens of a unidirectional carbon/epoxy composite (G40-800/5276-1) were immersed in a distilled-water bath at 70°C until saturation. Quasi-static tests covering a range of mode I/II mixities (0%, 25%, 50%, 75% and 100%) were carried out to determine the effect of moisture on the delamination resistance of the composite. Fatigue tests were performed over the same range of mode I/II mixities to determine the effect of moisture on delamination initiation and on the delamination growth rate. The quasi-static results showed that moisture reduces the delamination resistance of the carbon/epoxy composite over the whole range of mode I/II mixities, except in pure mode I, where the delamination resistance increases after moisture exposure. Under fatigue loading, moisture accelerates delamination initiation and increases the growth rate for all mode I/II mixities. The experimental data were used to determine which of the static delamination criteria and mixed-mode I/II fatigue growth-rate models proposed in the literature best represent delamination in the composite studied. A regression curve was used to find the best fit between the experimental data and the static delamination criteria considered, and a regression surface was used to find the best fit between the experimental data and the fatigue growth-rate models considered. Based on these fits, the best static delamination criterion is the B-K criterion and the best fatigue growth model is the Kenane-Benzeggagh model. Numerical models can be used to predict delamination during the design of complex parts. Predicting the delamination length under fatigue loading is essential to ensure that an interlaminar crack will not grow excessively and cause failure of the part before the end of its design life. Following the recent trend, such models are often based on the cohesive-zone approach with a finite element formulation. In the work presented in this thesis, the fatigue delamination growth model of Landry & LaPlante (2012) was improved by adding the treatment of mixed-mode I/II loading and by modifying the algorithm used to compute the maximum delamination driving force. The cohesive-zone parameters were calibrated from the quasi-static mode I and mode II experiments. Numerical simulations of the quasi-static mixed-mode I/II tests, with dry and moist specimens, were compared with the experiments. Fatigue simulations were also carried out and compared with the experimental delamination growth rates. The numerical results for both the quasi-static and fatigue tests showed good agreement with the experiments over the whole range of mode I/II mixities studied.
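
    Since the abstract names the B-K (Benzeggagh-Kenane) criterion as the best static delamination criterion, a minimal sketch of that criterion is given below. The toughness values and the exponent eta are placeholders, not the measured G40-800/5276-1 data; the Kenane-Benzeggagh fatigue model mentioned in the abstract is not reproduced here.

      import numpy as np

      def bk_toughness(g_Ic, g_IIc, mixity, eta):
          """Benzeggagh-Kenane mixed-mode criterion:
          Gc = G_Ic + (G_IIc - G_Ic) * (G_II/G_T)**eta, with mixity = G_II/G_T."""
          return g_Ic + (g_IIc - g_Ic) * mixity ** eta

      # Illustrative values only (kJ/m^2 and a fitted exponent)
      g_Ic, g_IIc, eta = 0.30, 1.20, 1.8
      for b in (0.0, 0.25, 0.5, 0.75, 1.0):    # mode mixities tested in the thesis
          print(f"G_II/G_T = {b:4.2f}  ->  Gc = {bk_toughness(g_Ic, g_IIc, b, eta):.3f} kJ/m^2")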

  4. The Second NATO Modelling and Simulation Conference(Deuxieme conference OTAN sur la modelisation et la simulation)

    DTIC Science & Technology

    2001-07-01

    Major General A C Figgures, Capability Manager (Manœuvre) UK MOD, provided the Conference with a fitting end message encouraging the SE and M&S...[Contents residue:] Welcoming Address, 'Synthetic Environments - Managing the Breakout', by M. Markin; Opening Address for NATO M&S Conference by G. Sürsal; ...Keynote Address by G.J. Burrows; Industry's Role by M. Mansell; The RMCS SSEL by J.R. Searle; Session 1: Policy, Strategy & Management; A Strategy

  5. Reduction of Military Vehicle Acquisition Time and Cost through Advanced Modelling and Virtual Simulation (La reduction des couts et des delais d’acquisition des vehicules militaires par la modelisation avancee et la simulation de produit virtuel)

    DTIC Science & Technology

    2003-03-01

    nations, a very thorough examination of current practices. Introduction The Applied Vehicle Technology Panel (AVT) of the Research and Technology...the introduction of new information generated by computer codes required it to be timely and presented in appropriate fashion so that it could...military competition between the NATO allies and the Soviet Union. The second was the introduction of commercial, high capacity transonic aircraft and

  6. Models for Aircrew Safety Assessment: Uses, Limitations and Requirements (la Modelisation des conditions de securite des equipages: applications, limitations et cahiers des charges)

    DTIC Science & Technology

    1999-08-01

    immediately, reducing venous return artifacts during the first beat of the simulation. [Figure 4 residue.] ...Figure 5: The effect of network complexity. The aortic pressure is shown in Figure 5 during the fifth beat for the networks with one and three...Mechanical Engineering Department, University of Victoria. [19] Huyghe J.M., 1986, "Nonlinear Finite Element Models of The Beating Left

  7. Modelling and Simulation as a Service: New Concepts and Service-Oriented Architectures (Modelisation et simulation en tant que service: Nouveaux concepts et architectures orientes service)

    DTIC Science & Technology

    2015-05-01

    delivery business model where S&T activities are conducted in a NATO dedicated executive body, having its own personnel, capabilities and infrastructure...SD-4: Design for Securability; 5.3.2 Recommendations on Simulation Environment Infrastructure; 5.3.2.1 Recommendation IN-1: Harmonize ... Critical Data and Algorithms; 5.3.2.2 Recommendation IN-2: Establish Permanent Simulation Infrastructure; 5.3.2.3 Recommendation IN-3: Establish

  8. Modelling of Molecular Structures and Properties in Physical Chemistry and Biophysics, Forty-Fourth International Meeting (Modelisation des Structures et Proprietes Moleculaires en Chimie Physique et en Biophysique, Quarante- Quatrieme Reunion Internationale)

    DTIC Science & Technology

    1989-09-01

    pyridone). Previous work on pyridinium, pyrazinium or pyrimidinium salts ... 2-pyrimidone salts [43] has shown that some...forces. ...A VIBRATIONAL MOLECULAR FORCE FIELD FOR MACROMOLECULAR MODELLING, Gerard Vergoten...microscopic point of view are (1) understanding, (2) interpretation of experimental results, (3) semiquantitative estimates of experimental results and (4

  9. Propagation Modelling and Decision Aids for Communications, Radar and Navigation Systems (La Modelisation de la Propagation et Aides a la Decision Pour les Systemes de Telecommunications, de Radar et de Navigation)

    DTIC Science & Technology

    1994-09-01

    the refractive index can be determined from a simplified form of the Appleton-... density, temperature, ion composition, ionospheric electric field...see Cannon [1994]. The electron density profile is based upon the underlying neutral composition, temperature and wind together with the electric field...in many of the newer HF prediction decision aids. They also provide a very useful stand-alone...software, NSSDC/WDC-A-R&S 90-19, National Space

  10. Simulation d'ecoulements internes compressibles laminaires et turbulents par une methode d'elements finis

    NASA Astrophysics Data System (ADS)

    Rebaine, Ali

    1997-08-01

    This work concerns the numerical simulation of two-dimensional laminar and turbulent compressible internal flows, with particular attention to flows in supersonic ejectors. The Navier-Stokes equations are written in conservative form using the so-called enthalpic variables as independent variables: static pressure, momentum and specific total enthalpy. A stable variational formulation of the Navier-Stokes equations is used, based on the SUPG (Streamline Upwinding Petrov Galerkin) method with a shock-capturing operator for strong gradients. A turbulence model for ejector flows is developed: it splits the domain into two regions, one near the solid wall, where the Baldwin-Lomax model is used, and one far from the wall, where a new formulation based on Schlichting's jet model is proposed. A technique for computing the turbulent viscosity on an unstructured mesh is implemented. The spatial discretization of the variational form uses the finite element method with a mixed approximation: quadratic for the momentum and velocity components and linear for the remaining variables. The temporal discretization uses a finite difference method with the implicit Euler scheme. The matrix system resulting from the space-time discretization is solved with the GMRES algorithm and a diagonal preconditioner. Numerical validations were carried out on several types of nozzles and ejectors, the main one being the simulation of the flow in the ejector tested at the NASA Lewis research center. The results obtained compare very well with previous work and are clearly better for turbulent ejector flows.
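
    The abstract states that the discretized system is solved with GMRES and a diagonal preconditioner. The sketch below shows that combination on a stand-in sparse system using SciPy; it is not the thesis' SUPG finite element code, and the matrix here is a simple tridiagonal placeholder.

      import numpy as np
      import scipy.sparse as sp
      from scipy.sparse.linalg import LinearOperator, gmres

      # Stand-in sparse system; in the thesis it is the SUPG/implicit-Euler Jacobian.
      n = 200
      A = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
      b = np.ones(n)

      # Diagonal (Jacobi) preconditioner, as mentioned in the abstract.
      inv_diag = 1.0 / A.diagonal()
      M = LinearOperator((n, n), matvec=lambda v: inv_diag * v.ravel())

      x, info = gmres(A, b, M=M, restart=30)
      print("GMRES converged" if info == 0 else f"GMRES info={info}",
            "residual:", np.linalg.norm(b - A @ x))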

  11. Commerce de detail de l'essence automobile: Modelisation de l'impact a court terme des facteurs endogenes et exogenes sur les ventes d'essence dans les stations-service a Montreal

    NASA Astrophysics Data System (ADS)

    Nguimbus, Raphael

    Determining the impact of the controllable and uncontrollable factors that influence the sales volumes of retail outlets selling homogeneous, highly substitutable products is the core of this thesis. The aim is to estimate a set of stable, asymptotically efficient coefficients that are uncorrelated with the random site-specific effects of gasoline stations in the Montreal market (Quebec, Canada) over the period 1993-1997. The econometric model specified and tested isolates a set of four variables: the retail price posted at a regular-gasoline site, the site's service capacity during peak hours, the hours of service, and the number of competing sites within a two-kilometre radius. These four factors influence gasoline sales at service stations. Panel data and robust estimation methods (a minimum-distance estimator) are used to estimate the parameters of the sales model. We start from the general hypothesis that each site develops an attraction force that draws motorist customers and allows it to generate sales. This attraction capacity varies from site to site as a result of the combination of marketing effort and the competitive environment around the site. The notions of neighbourhood and spatial competition explain the behaviour of the decision-makers who manage the sites. The goal of this thesis is to develop a decision-support tool (an analytical model) that allows managers of service-station chains to allocate commercial resources efficiently across their outlets.
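
    To make the panel-data idea concrete, the sketch below estimates the effect of four station-level covariates on synthetic data with unobserved site effects, using a simple within (fixed-effects) estimator. The thesis uses a random-effects specification with a minimum-distance estimator, which this sketch does not reproduce; all variable names and values here are invented.

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic panel: n_sites stations observed over n_periods months (illustrative only).
      n_sites, n_periods = 50, 24
      site = np.repeat(np.arange(n_sites), n_periods)
      X = rng.normal(size=(n_sites * n_periods, 4))   # price, peak capacity, hours, competitors
      beta_true = np.array([-1.5, 0.8, 0.4, -0.6])
      alpha = rng.normal(scale=2.0, size=n_sites)     # unobserved site effects
      y = X @ beta_true + alpha[site] + rng.normal(scale=0.5, size=n_sites * n_periods)

      def demean_by(group, a):
          """Subtract the per-site mean from each observation (within transformation)."""
          means = np.zeros((group.max() + 1, a.shape[1] if a.ndim > 1 else 1))
          np.add.at(means, group, a.reshape(len(a), -1))
          counts = np.bincount(group).reshape(-1, 1)
          return a - (means / counts)[group].reshape(a.shape)

      beta_hat, *_ = np.linalg.lstsq(demean_by(site, X), demean_by(site, y), rcond=None)
      print("estimated effects:", np.round(beta_hat.ravel(), 2))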

  12. Radio Wave Propagation Modeling, Prediction and Assessment (L’Evaluation, la Prevision et la Modelisation des Ondes Hertziennes)

    DTIC Science & Technology

    1990-01-01

    modifiers and added an additional set of modifiers to adjust the average VTOP. The original DECO model made use of waveguide excitation factors and...ranges far beyond the horizon. The modified refractivity M is defined by M = N + (h/a) x 10^6 = N + 0.157 h (2.1), where h is the height above the earth's...[Figure 2.4: N and M profiles for an elevated duct (trapping layer; refractivity N, modified refractivity M).]

  13. Modelisation and distribution of neutron flux in radium-beryllium source (226Ra-Be)

    NASA Astrophysics Data System (ADS)

    Didi, Abdessamad; Dadouch, Ahmed; Jai, Otman

    2017-09-01

    The Monte Carlo N-Particle code (MCNP-6) is used to analyze the thermal, epithermal and fast neutron fluxes of a 3-millicurie radium-beryllium source, in order to determine many materials qualitatively and quantitatively by the method of neutron activation analysis. The radium-beryllium neutron source is set up for practical work and research in the nuclear field. The main objective of this work is to characterize the flux profile of radium-beryllium irradiation; this theoretical study supports the design of optimal irradiation conditions and improved performance of the facility for research and education in nuclear physics.

  14. Guide to Modelling & Simulation (M&S) for NATO Network-Enabled Capability (M&S for NNEC) (Guide de la modelisation et de la simulation (M&S) pour las NATO network-enabled capability (M&S de la NNEC))

    DTIC Science & Technology

    2010-02-01

    interdependencies, and then modifying plans according to updated projections. This is currently an immature area where further research is required. The...crosscutting.html. [7] Zeigler, B.P. and Hammonds, P. (2007). “Modelling and Simulation- Based Data Engineering: Introducing Pragmatics and Ontologies for...the optimum benefit to be obtained and while immature , ongoing research needs to be maintained. 20) Use of M&S to support complex operations needs

  15. Modelisation de l'historique d'operation de groupes turbine-alternateur

    NASA Astrophysics Data System (ADS)

    Szczota, Mickael

    Because of their ageing fleet, utility managers increasingly need tools that can help them plan maintenance operations efficiently. Hydro-Quebec started a project that aims to predict the degradation of its hydroelectric runners and to use that information to rank the generating units. This ranking will indicate which generating units are most at risk of a major failure. Cracking caused by fatigue is a predominant degradation mode, and the loading sequence applied to the runner is a parameter that drives crack growth. The aim of this thesis is therefore to create a generator of synthetic loading sequences that are statistically equivalent to the observed history; these simulated sequences are then used as input to a life-assessment model. We first describe how the generating units are operated by Hydro-Quebec and analyse the available data; the analysis shows that the data are non-stationary. We then review modelling and validation methods. In the following chapter, particular attention is given to a precise description of the validation and comparison procedure. We then compare three kinds of model: discrete-time Markov chains, discrete-time semi-Markov chains and the moving block bootstrap. For the first two models, we describe how to account for the non-stationarity. Finally, we show that the Markov chain is not suited to our case and that semi-Markov chains perform better when they include the non-stationarity. The final choice between semi-Markov chains and the moving block bootstrap depends on the user, but with a long-term view we recommend semi-Markov chains for their flexibility. Keywords: stochastic models, model validation, reliability, semi-Markov chains, Markov chains, bootstrap
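
    A minimal sketch of the simplest model compared in the thesis: a homogeneous discrete-time Markov chain fitted to an observed state sequence and then used to simulate a synthetic loading sequence. The operating states and the observed sequence are hypothetical; the recommended non-stationary semi-Markov model and the moving block bootstrap are not implemented here.

      import numpy as np

      def fit_transition_matrix(states, n_states):
          """Maximum-likelihood transition matrix of a discrete-time Markov chain."""
          counts = np.zeros((n_states, n_states))
          for a, b in zip(states[:-1], states[1:]):
              counts[a, b] += 1
          return counts / counts.sum(axis=1, keepdims=True)

      def simulate(P, start, length, rng):
          seq = [start]
          for _ in range(length - 1):
              seq.append(rng.choice(len(P), p=P[seq[-1]]))
          return np.array(seq)

      rng = np.random.default_rng(1)
      # Hypothetical operating states (e.g. 0 = stopped, 1 = part load, 2 = full load)
      observed = rng.choice(3, size=500, p=[0.2, 0.3, 0.5])
      P = fit_transition_matrix(observed, 3)
      synthetic = simulate(P, start=observed[-1], length=200, rng=rng)
      print("estimated P:\n", np.round(P, 2))
      print("state frequencies, observed vs synthetic:",
            np.bincount(observed, minlength=3) / len(observed),
            np.bincount(synthetic, minlength=3) / len(synthetic))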

  16. Aging Mechanisms and Control. Symposium Part A - Developments in Computational Aero- and Hydro-Acoustics. Symposium Part B - Monitoring and Management of Gas Turbine Fleets for Extended Life and Reduced Costs (Les mecanismes vieillissants et le controle) (Symposium Partie A - Developpements dans le domaine de l’aeroacoustique et I’hydroacoustique numeriques) (Symposium Partie B - Le suivi et la gestion des turbomoteurs en vue du prolongement de l

    DTIC Science & Technology

    2003-02-01

    ...the flow and noise in the diffuser of an industrial gas turbine engine. A steady RANS CFD calculation and experiments were used to identify the gross...finally, the defence industry was restructuring, demanding that we review our relationship with them. Ministers agreed that changes were

  17. Flying Qualities Flight Testing of Digital Flight Control Systems. Flight Test Techniques Series - Volume 21 (les Essais en vol des performances des systemes de commande de vol numeriques)

    DTIC Science & Technology

    2001-12-01

    product operator, Ucg = X body-axis velocity at the cg, Uvane = X body-axis velocity at the vane, Vcg = Y body-axis velocity at the cg, Vvane = Y body-axis...alpha_vane = arctan(Wvane/Uvane), beta_vane = ... (5); Ucg = Vtrue cos(beta_true) cos(alpha_true), Vcg = Vtrue sin(beta_true), Wcg = Vtrue cos(beta_true) sin...from the definitions of these angles: Vtrue = sqrt(Ucg^2 + Vcg^2 + Wcg^2), alpha_true = arctan(Wcg/Ucg), beta_true = arcsin(Vcg/Vtrue) (12)

  18. Unsteady Aerodynamics - Fundamentals and Applications of Aircraft Dynamics. Conference Proceedings of the Joint Symposium of the Fluid Dynamics and Flight Mechanics Panels Held in Goettingen, Federal Republic of Germany on 6-9 May 1985.

    DTIC Science & Technology

    1985-11-01

    (vortices with their axis perpendicular to the main flow) shed from an airfoil oscillating in pitch under dynamic stall conditions...aerodynamic...on R. From this experimental analysis, an attempt at theoretical modelling of the nonlinear effects observed at...wall shear on an airfoil undergoing harmonic motion parallel or perpendicular to the undisturbed flow", EUROMECH

  19. Modelisation of an unspecialized quadruped walking mammal.

    PubMed

    Neveu, P; Villanova, J; Gasc, J P

    2001-12-01

    Kinematics and structural analyses were used as basic data to elaborate a dynamic quadruped model that may represent an unspecialized mammal. Hedgehogs were filmed on a treadmill with a cinefluorographic system providing the trajectories of skeletal elements during locomotion. Body parameters such as limb-segment mass and length and segment centres of mass were measured on cadavers. These biological parameters were compiled in order to build a virtual quadruped robot. The robot's locomotor behaviour was compared with that of the actual hedgehog to improve the model and to reveal the necessary changes. Apart from its use in robotics, the resulting model may be useful for simulating the locomotion of extinct mammals.

  20. Conductivite dans le modele de Hubbard bi-dimensionnel a faible couplage

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic

    The two-dimensional (2D) Hubbard model is often regarded as the minimal model for copper-oxide high-temperature superconductors (cuprates). On a square lattice, this model exhibits the phases common to all cuprates: the antiferromagnetic phase, the superconducting phase and the so-called pseudogap phase. It has no exact solution, but several approximate methods allow its properties to be studied numerically. Optical and transport properties are well known in the cuprates and are therefore good candidates for validating a theoretical model and better understanding the physics of these materials. This thesis deals with the calculation of these properties for the 2D Hubbard model at weak to intermediate coupling. The method used is the two-particle self-consistent (TPSC) approach, which is non-perturbative and includes the effect of spin and charge fluctuations at all wavelengths. The complete derivation of the conductivity expression in the TPSC approach is presented. This expression contains the so-called vertex corrections, which account for correlations between quasiparticles. To make the numerical computation of these corrections feasible, algorithms using, among other things, fast Fourier transforms and cubic splines are developed. Calculations are carried out for the square lattice with nearest-neighbour hopping around the antiferromagnetic quantum critical point. At dopings below the critical point, the optical conductivity shows a mid-infrared bump at low temperature, as observed in several cuprates. In the resistivity as a function of temperature, an insulating behaviour is found in the pseudogap when vertex corrections are neglected, and a metallic behaviour when they are included. Near the critical point, the resistivity is linear in T at low temperature and becomes progressively proportional to T^2 at high doping. A few results with longer-range hopping are also presented. Keywords: Hubbard, quantum critical point, conductivity, vertex corrections

  1. Reconnaissance invariante d'objets 3-D et correlation SONG

    NASA Astrophysics Data System (ADS)

    Roy, Sebastien

    This thesis proposes solutions to two problems in automatic pattern recognition: invariant recognition of three-dimensional objects from intensity images, and recognition that is robust to disjoint noise. A system using angular scanning of the images and a feature-space trajectory classifier achieves invariant recognition of three-dimensional objects. Robustness to disjoint noise is achieved with the SONG correlation. We achieved recognition invariant to translation, rotation and scale change of three-dimensional objects from segmented intensity images, using angular scanning and a feature-space trajectory classifier. To obtain translation invariance, the centre of the angular scan coincides with the geometric centre of the image. The angular scan produces a feature vector that is invariant to scale changes of the image, and it converts rotations about an axis parallel to the line of sight into translations of the signal. The feature-space trajectory classifier represents a rotation about an axis perpendicular to the line of sight as a curve in feature space. Classification is performed by measuring the distance from the feature vector of the image to be recognized to the trajectories stored in the space. Our numerical results show a classification rate reaching 98% on an image bank of 5 military vehicles. The sliced orthogonal nonlinear generalized (SONG) correlation processes the grey levels present in an image independently: it sums the linear correlations of the binary images having the same grey level. This correlation is equivalent to counting the number of pixels located at the same relative positions and having the same intensities in two images. We present an optoelectronic implementation of the SONG correlation based on the joint transform correlator. The results of the numerical and optical experiments show that disjoint noise does not degrade the SONG correlation.
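
    A minimal numerical sketch of the SONG idea as described above: the grey levels are processed independently and the linear correlations of the binary slices are summed, which amounts to counting pixels with the same relative position and intensity. This is an illustrative digital version, not the opto-electronic joint-transform implementation of the thesis.

      import numpy as np

      def song_correlation(img_a, img_b, levels=256):
          """Sliced orthogonal nonlinear generalized (SONG) correlation: the sum, over
          grey levels, of the linear cross-correlations of the binary slices."""
          shape = [img_a.shape[0] + img_b.shape[0] - 1, img_a.shape[1] + img_b.shape[1] - 1]
          acc = np.zeros(shape)
          for g in range(levels):
              a = (img_a == g).astype(float)
              b = (img_b == g).astype(float)
              if a.any() and b.any():
                  # linear cross-correlation of the two binary slices via zero-padded FFTs
                  acc += np.fft.irfft2(np.fft.rfft2(a, shape) *
                                       np.conj(np.fft.rfft2(b, shape)), shape)
          return acc

      rng = np.random.default_rng(0)
      scene = rng.integers(0, 8, size=(64, 64))
      target = scene[16:48, 16:48].copy()              # reference embedded in the scene
      c = song_correlation(scene, target, levels=8)
      print("peak value:", round(float(c.max()), 1),
            "at offset", np.unravel_index(c.argmax(), c.shape))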

  2. Modelisation de photodetecteurs a base de matrices de diodes avalanche monophotoniques pour tomographie d'emission par positrons

    NASA Astrophysics Data System (ADS)

    Corbeil Therrien, Audrey

    Positron emission tomography (PET) is a valuable tool in preclinical research and medical diagnosis. This technique provides a quantitative image of specific metabolic functions through the detection of annihilation photons. These photons are detected with two components: a scintillator first converts the energy of the 511 keV photon into visible-spectrum photons, and a photodetector then converts the light into an electrical signal. Recently, arrays of single-photon avalanche diodes (SPADs) have attracted much interest for PET. These arrays form sensitive, robust and compact detectors with outstanding timing resolution. These qualities make them a promising photodetector for PET, but the parameters of the array and of the readout electronics must be optimized to reach the best PET performance. Optimizing the array quickly becomes difficult, because the various parameters interact in complex ways with the avalanche and noise-generation processes. Moreover, readout electronics for SPAD arrays are still rudimentary, and it would be worthwhile to analyze different readout strategies. The most economical way to address this question is to use a simulator to converge toward the configuration giving the best performance. This thesis presents the development of such a simulator. It models the behaviour of a SPAD array based on semiconductor-physics equations and probabilistic models. It includes the three main noise sources: thermal noise, correlated spurious triggering and optical crosstalk. The simulator also allows new readout approaches better suited to this type of detector to be tested and compared. Ultimately, the simulator aims to quantify the impact of the photodetector parameters on the energy and timing resolutions and thereby optimize the performance of the SPAD array. For example, increasing the active-area ratio improves performance, but only up to a point: other phenomena tied to the active area, such as thermal noise, then degrade the result. The simulator lets us find the compromise between these two extremes. Simulations with the initial parameters give a detection efficiency of 16.7%, an energy resolution of 14.2% FWHM and a timing resolution of 0.478 ns FWHM. Finally, although aimed at PET, the proposed simulator can be adapted to other applications by changing the photon source and the performance targets. Keywords: photodetectors, single-photon avalanche diodes, semiconductors, positron emission tomography, simulations, modelling, single-photon detection, scintillators, quenching circuit, SPAD, SiPM, Geiger-mode avalanche photodiodes
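
    As an illustration of the kind of Monte Carlo model the abstract describes, the sketch below draws scintillation photon detections, dark counts and optical crosstalk for a SPAD array and looks at the spread of the first-trigger timestamp. All parameter values are invented, afterpulsing is omitted, and the physics is far cruder than the semiconductor-based simulator of the thesis.

      import numpy as np

      rng = np.random.default_rng(42)

      # Illustrative (not device-fitted) parameters
      N_PHOTONS   = 2000      # scintillation photons reaching the array per 511 keV event
      PDE         = 0.25      # photon detection efficiency
      TAU_DECAY   = 40e-9     # scintillation decay constant (s)
      DCR         = 5e6       # dark count rate of the whole array (Hz)
      P_CROSSTALK = 0.05      # probability that an avalanche fires one neighbour
      WINDOW      = 200e-9    # acquisition window (s)
      JITTER      = 0.10e-9   # single-photon timing jitter, sigma (s)

      def one_event():
          n_det = rng.binomial(N_PHOTONS, PDE)
          t = rng.exponential(TAU_DECAY, n_det)                # photon arrival times
          t = t + rng.normal(0.0, JITTER, n_det)               # SPAD timing jitter
          dark = rng.uniform(0.0, WINDOW, rng.poisson(DCR * WINDOW))
          fired = np.concatenate([t, dark])
          xtalk = fired[rng.random(fired.size) < P_CROSSTALK]  # crosstalk copies
          return np.sort(np.concatenate([fired, xtalk]))[0]    # first-trigger timestamp

      stamps = np.array([one_event() for _ in range(2000)])
      print(f"first-trigger timing spread: {1e12 * stamps.std():.0f} ps (std)")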

  3. Developpement de techniques de diagnostic non intrusif par tomographie optique

    NASA Astrophysics Data System (ADS)

    Dubot, Fabien

    Over the past two decades, optical diagnostic techniques have developed rapidly, in industrial processes as well as in medical imaging. The appeal of these methods lies mainly in the fact that they are completely non-invasive, use radiation sources that are harmless to people and the environment, and are relatively inexpensive and easy to implement compared with other imaging techniques. One such technique is Diffuse Optical Tomography (DOT). This three-dimensional imaging method characterizes the radiative properties of a semi-transparent medium from near-infrared optical measurements obtained with a set of sources and detectors located on the boundary of the probed domain. It relies on a forward model of light propagation in the medium, which provides the predictions, and on an algorithm that minimizes a cost function combining the predictions and the measurements, which reconstructs the parameters of interest. In this work, the forward model is the diffusion approximation of the radiative transfer equation in the frequency domain, and the parameters of interest are the spatial distributions of the absorption and reduced scattering coefficients. This thesis is devoted to the development of a robust inverse method for solving the DOT problem in the frequency domain. To that end, the work is structured in three parts, which form the main axes of the thesis. First, a comparison of the damped Gauss-Newton and Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithms is presented for the two-dimensional case. Two regularization methods are combined for each algorithm: for the damped Gauss-Newton algorithm, mesh-based reduction of the dimension of the control space together with Tikhonov penalization; for the BFGS method, mesh-based regularization together with Sobolev gradients, uniform or spatially dependent, in the extraction of the cost-function gradient. The numerical results indicate that the BFGS algorithm outperforms the damped Gauss-Newton algorithm in terms of reconstruction quality, computation time and ease of selecting the regularization parameter. Second, a study of the quasi-independence of the optimal Tikhonov penalization parameter with respect to the dimension of the control space, in inverse problems estimating spatially dependent functions, is carried out. This study follows an observation made in the first part of the work, where the Tikhonov parameter determined by the L-curve method turned out to be independent of the dimension of the control space in the under-determined case. This hypothesis is demonstrated theoretically and then verified numerically, first on a linear inverse heat-conduction problem and then on the nonlinear DOT inverse problem. The numerical verification relies on determining an optimal Tikhonov parameter, defined as the one that minimizes the discrepancy between the targets and the reconstructions. The theoretical demonstration rests on Morozov's discrepancy principle in the linear case, while in the nonlinear case it rests essentially on the assumption that the radiative functions to be reconstructed are random variables following a normal distribution. In conclusion, the thesis shows that the Tikhonov parameter can be determined using a parameterization of the control variables associated with a coarse mesh, so as to reduce computation time. Third, a wavelet-based multiscale inverse method combined with the BFGS algorithm is developed. This method, which reformulates the original inverse problem as a sequence of inverse subproblems from the largest scale to the smallest using the wavelet transform, copes with the local convergence of the optimizer and with the many local minima of the cost function. The numerical results show that the proposed method is more stable with respect to the initial estimate of the radiative properties and provides more accurate final reconstructions than the ordinary BFGS algorithm, while requiring similar computation times. The results of this work are presented in this thesis in the form of four articles. The first was accepted in the International Journal of Thermal Sciences, the second in Inverse Problems in Science and Engineering, the third in the Journal of Computational and Applied Mathematics, and the fourth was submitted to the Journal of Quantitative Spectroscopy & Radiative Transfer. Ten other papers were published in peer-reviewed conference proceedings. These papers are available in pdf format on the website of the t3e research chair (www.t3e.info).
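
    The following sketch illustrates the notion of an 'optimal' Tikhonov parameter used above, i.e. the one minimizing the error against a known target, on a small synthetic linear smoothing problem. It stands in for neither the DOT forward model nor the damped Gauss-Newton/BFGS machinery of the thesis.

      import numpy as np

      rng = np.random.default_rng(3)

      # Small ill-posed linear problem standing in for the forward model (illustrative).
      n = 60
      x_grid = np.linspace(0, 1, n)
      A = np.exp(-((x_grid[:, None] - x_grid[None, :]) ** 2) / (2 * 0.05 ** 2))  # smoothing kernel
      f_true = np.exp(-((x_grid - 0.4) ** 2) / 0.01) + 0.5 * np.exp(-((x_grid - 0.75) ** 2) / 0.005)
      d = A @ f_true + rng.normal(0, 1e-2, n)                                    # noisy data

      def tikhonov(A, d, lam):
          """Minimise ||A f - d||^2 + lam * ||f||^2 (zeroth-order Tikhonov)."""
          return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ d)

      lams = np.logspace(-8, 0, 17)
      errs = [np.linalg.norm(tikhonov(A, d, lam) - f_true) for lam in lams]
      best = lams[int(np.argmin(errs))]
      print(f"'optimal' lambda (smallest error against the known target): {best:.1e}")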

  4. Methodologie de modelisation aerostructurelle d'une aile utilisant un logiciel de calcul aerodynamique et un logiciel de calcul par elements finis =

    NASA Astrophysics Data System (ADS)

    Communier, David

    In the structural analysis of an aircraft wing, it is difficult to model faithfully the aerodynamic forces experienced by the wing. To simplify the analysis, the theoretical maximum lift of the wing is usually distributed over its main spar or its ribs. This distribution implies that the whole wing will be stronger than necessary and therefore that the structure will not be fully optimized. To overcome this problem, an aerodynamic distribution of the lift should be applied over the complete wing surface, yielding a much more reliable load distribution on the wing. To achieve this, the results of a software package that computes the aerodynamic loads on the wing must be coupled with those of a software package used for its design and structural analysis. In this project, the software used to compute the pressure coefficients on the wing is XFLR5, and the software used for design and structural analysis is CATIA V5. XFLR5 allows a rapid analysis of a wing based on the analysis of its airfoils. It computes airfoil performance in the same way as XFOIL and offers three computational methods for the wing: Lifting Line Theory (LLT), the Vortex Lattice Method (VLM) and 3D Panels. In our methodology we use the 3D Panels method, whose validity was tested in a wind tunnel to confirm the XFLR5 calculations. As for the design and finite element analysis of the structure, CATIA V5 is widely used in the aerospace field and allows the wing design steps to be automated. This thesis therefore describes a methodology for the aerostructural study of an aircraft wing.
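
    A minimal sketch of the coupling step described above: converting panel pressure coefficients (such as those exported from XFLR5's 3D-panel method) into force vectors that can be applied to a structural mesh. The three panels, the sign convention and the numbers are illustrative assumptions, not the thesis' actual export format or CATIA V5 workflow.

      import numpy as np

      def panel_forces(cp, areas, normals, v_inf, rho=1.225):
          """Convert panel pressure coefficients into force vectors:
          F_i = -Cp_i * q_inf * A_i * n_i, with q_inf = 0.5 * rho * V^2
          (outward normals; the sign makes suction pull the panel outward)."""
          q_inf = 0.5 * rho * v_inf ** 2
          return -(cp * q_inf * areas)[:, None] * normals

      # Hypothetical 3-panel example (a real export has one row per wing panel)
      cp      = np.array([-1.2, -0.4, 0.3])
      areas   = np.array([0.02, 0.02, 0.02])                      # m^2
      normals = np.array([[0, 0, 1], [0, 0, 1], [0, 0, -1]], dtype=float)
      F = panel_forces(cp, areas, normals, v_inf=30.0)
      print("panel forces [N]:\n", np.round(F, 2),
            "\ntotal vertical force ~", round(F[:, 2].sum(), 2), "N")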

  5. Sustaining Tunisian SMEs' Competitiveness in the Knowledge Society

    NASA Astrophysics Data System (ADS)

    Del Vecchio, Pasquale; Elia, Gianluca; Secundo, Giustina

    The paper aims to contribute to the debate about the knowledge and digital divide affecting countries' competitiveness in the knowledge society. A survey based on qualitative and quantitative data collection was performed to analyze the level of ICT and e-business adoption among Tunisian SMEs. The results show that increasing SME competitiveness requires investing in all the components of intellectual capital: human capital (the knowledge, skills and abilities of people in using ICTs), structural capital (supportive infrastructure such as buildings, software, processes, patents, trademarks and proprietary databases) and social capital (relations and collaboration inside and outside the company). To this end, the LINCET project ("Laboratoire d'Innovation Numerique pour la Competitivité de l'Entreprise Tunisienne") is finally proposed as a coherent proposition to foster the growth of all the components of intellectual capital for the benefit of the competitiveness of Tunisian SMEs.

  6. Reseaux Neuronaux Optiques

    NASA Astrophysics Data System (ADS)

    Bergeron, Alain

    This research aims at the optical implementation of neural networks. Two different architectures are proposed. The first is an associative memory that associates an arbitrary output with any object while preserving information about its position. The second architecture, a neural classifier for robotic control, identifies an input and assigns it to one of several categories; its output is compatible with standard digital systems. A modular approach is favoured for implementing these architectures, with the correlator as the basic building block. Additional modules are introduced to carry out the neural operations properly. The first is an optoelectronic threshold that implements a nonlinear function, an essential element of neural networks. The second is an opto-digital encoder, useful for classifying objects. The problem of recording the memory is addressed using global iterative encoding.

  7. Charge Transport in Carbon Nanotubes-Polymer Composite Photovoltaic Cells

    PubMed Central

    Ltaief, Adnen; Bouazizi, Abdelaziz; Davenas, Joel

    2009-01-01

    We investigate the dark and illuminated current density-voltage (J/V) characteristics of poly(2-methoxy-5-(2'-ethylhexyloxy)-1,4-phenylenevinylene) (MEH-PPV)/single-walled carbon nanotube (SWNT) composite photovoltaic cells. Using an exponential band-tail model, the conduction mechanism is analysed for polymer-only devices and composite devices in terms of space-charge-limited current (SCLC) conduction, and the power parameters and threshold voltages are determined. Devices made from MEH-PPV:SWNT (1:1) composites showed a photoresponse with an open-circuit voltage Voc of 0.4 V, a short-circuit current density Jsc of 1 µA/cm² and a fill factor FF of 43%. We modelled the organic photovoltaic devices with an equivalent circuit, from which we calculated the series and shunt resistances.
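
    A small sketch of the single-diode equivalent circuit with series and shunt resistances mentioned above, solved point by point for the J-V curve. The parameter values are chosen only so that the curve has the same order of magnitude as the reported Voc (about 0.4 V) and Jsc (about 1 uA/cm2); they are not the authors' fitted values.

      import numpy as np
      from scipy.optimize import fsolve

      q, kB, T = 1.602e-19, 1.381e-23, 300.0

      def jv_curve(v, j_ph, j0, n, rs, rsh):
          """Single-diode equivalent circuit with series (rs) and shunt (rsh) resistances:
          J = J_ph - J0*(exp(q(V + J*rs)/(n*kB*T)) - 1) - (V + J*rs)/rsh."""
          vt = n * kB * T / q
          def solve_one(vv):
              f = lambda j: j_ph - j0 * (np.exp((vv + j * rs) / vt) - 1) \
                            - (vv + j * rs) / rsh - j
              return fsolve(f, x0=0.0)[0]
          return np.array([solve_one(vv) for vv in v])

      # Illustrative parameters, A/cm^2 and ohm*cm^2 (not fitted to the MEH-PPV:SWNT cells)
      V = np.linspace(0, 0.5, 12)
      J = jv_curve(V, j_ph=1e-6, j0=2e-10, n=1.8, rs=50.0, rsh=5e6)
      v_oc = np.interp(0.0, -J, V)                 # crude open-circuit voltage estimate
      print(f"Jsc ~ {J[0]*1e6:.2f} uA/cm^2, Voc ~ {v_oc:.2f} V")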

  8. LLR data analysis and impact on lunar dynamics from recent developments at OCA LLR Station

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vishnu; Fienga, Agnes; Courde, Clement; Torre, Jean-Marie; Exertier, Pierre; Samain, Etienne; Feraudy, Dominique; Albanese, Dominique; Aimar, Mourad; Mariey, Hervé; Viot, Hervé; Martinot-Lagarde, Gregoire

    2016-04-01

    Since late 2014, the OCA LLR station has been able to range at an infrared wavelength (1064 nm). IR ranging improves both the temporal and spatial coverage of the LLR observations. IR detection also allows densification of the normal points, including on the L1 and L2 retroreflectors, thanks to a better signal-to-noise ratio. This contributes to better modelling of the lunar libration. The hypothesis of lunar dust and environmental effects, suggested by the chromatic behaviour noticed on returns from the L2 retroreflector, is discussed. In addition, data analysis shows that accounting for retroreflector tilt and using a calibration profile in the normal-point reduction algorithm improve the precision of the normal points, thereby impacting lunar dynamical models and interior physics.

  9. Implementation en VHDL/FPGA d'afficheur video numerique (AVN) pour des applications aerospatiales

    NASA Astrophysics Data System (ADS)

    Pelletier, Sebastien

    The objective of this project is to develop a video controller in VHDL to replace the specialized component currently used at CMC Electronique. A thorough review of current trends and practice in video controllers was carried out to define the system specifications. Image storage and display techniques are explained in order to carry the project through. The new controller is developed on an electronic platform with an FPGA, a VGA port and memory for data storage. It is programmable and occupies little space in an FPGA, which allows it to fit into any new low-cost, mass-market technology. It adapts readily to all display resolutions since it is modular and configurable. In the short term, this project will provide improved control over the specifications and quality standards tied to avionics constraints.

  10. Contribution to study of interfaces instabilities in plane, cylindrical and spherical geometry

    NASA Astrophysics Data System (ADS)

    Toque, Nathalie

    1996-12-01

    This thesis presents several hydrodynamic instability experiments, which are studied numerically and theoretically. The experiments are in plane and cylindrical geometry. Their X-ray radiographs show the evolution of an interface between two solid media crossed by a detonation wave. These materials are initially solid; under the shock wave they become liquid or remain between the two phases, solid and liquid. The numerical study aims at simulating, with the EAD and Ouranos codes, the interface instabilities that appear in the experiments. The experimental radiographs and the numerical images are in fairly good agreement. The theoretical study models a spatio-temporal part of the experiments to obtain the quantitative growth of perturbations at the interfaces and in the flows. The models are linear and in plane, cylindrical and spherical geometry. They precede the forthcoming study of the transition between linear and nonlinear growth of instabilities in multifluid flows crossed by shock waves.

  11. Generation of 238U Covariance Matrices by Using the Integral Data Assimilation Technique of the CONRAD Code

    NASA Astrophysics Data System (ADS)

    Privas, E.; Archier, P.; Bernard, D.; De Saint Jean, C.; Destouche, C.; Leconte, P.; Noguère, G.; Peneliau, Y.; Capote, R.

    2016-02-01

    A new IAEA Coordinated Research Project (CRP) aims to test, validate and improve the IRDF library. Among the isotopes of interest, the modelisation of the 238U capture and fission cross sections represents a challenging task. A new description of the 238U neutrons induced reactions in the fast energy range is within progress in the frame of an IAEA evaluation consortium. The Nuclear Data group of Cadarache participates in this effort utilizing the 238U spectral indices measurements and Post Irradiated Experiments (PIE) carried out in the fast reactors MASURCA (CEA Cadarache) and PHENIX (CEA Marcoule). Such a collection of experimental results provides reliable integral information on the (n,γ) and (n,f) cross sections. This paper presents the Integral Data Assimilation (IDA) technique of the CONRAD code used to propagate the uncertainties of the integral data on the 238U cross sections of interest for dosimetry applications.

  12. Methodes de caracterisation des proprietes thermomecaniques d'un acier martensitique =

    NASA Astrophysics Data System (ADS)

    Ausseil, Lucas

    The goal of this study is to develop methods for measuring the thermomechanical properties of a martensitic steel during rapid heating. These data feed existing finite element models with experimental values. For this purpose, 4340 steel is used. This steel, notably used in gears, has very attractive mechanical properties, which can be modified by heat treatment. The Gleeble 3800 thermomechanical simulator is used; it can, in principle, reproduce all the conditions present in manufacturing processes. The dilatometry tests carried out in this project yield the exact austenitic and martensitic phase-transformation temperatures. Tensile tests were also used to derive the yield strength of the material in the austenitic domain from 850°C to 1100°C. The effect of deformation on the transformation start temperature is shown qualitatively. A numerical simulation is also carried out to understand the phenomena occurring during the tests.

  13. Modelisation de l'architecture des forets pour ameliorer la teledetection des attributs forestiers

    NASA Astrophysics Data System (ADS)

    Cote, Jean-Francois

    The quality of indirect measurements of canopy structure, from in situ and satellite remote sensing, depends on knowledge of vegetation canopy architecture. Technological advances in ground-based, airborne and satellite remote sensing can now significantly improve the effectiveness of measurement programs on forest resources. The structure of a vegetation canopy describes the position, orientation, size and shape of the canopy elements. The complexity of the canopy in forest environments greatly limits our ability to characterize forest structural attributes. Architectural models have been developed to help interpret canopy structural measurements from remote sensing. Recently, terrestrial LiDAR (TLiDAR, Terrestrial Light Detection and Ranging) systems have been used to gather information on the structure of individual trees or forest stands. TLiDAR allows the extraction of 3D structural information under the canopy at the centimetre scale. The methodology proposed in this Ph.D. thesis is a strategy to overcome the weakness in the structural sampling of vegetation cover. The main objective of the Ph.D. is to develop an architectural model of the vegetation canopy, called L-Architect (LiDAR data to vegetation Architecture), with a focus on the ability to document forest sites and to obtain information on canopy structure from remote sensing tools. Specifically, L-Architect reconstructs the architecture of individual conifer trees from TLiDAR data. The quantitative evaluation of L-Architect consisted of investigating (i) the structural consistency of the reconstructed trees and (ii) their radiative coherence when the reconstructed trees are included in a 3D radiative transfer model. A methodology was then developed to quasi-automatically reconstruct the structure of individual trees using an optimization algorithm fed by TLiDAR data and allometric relationships. L-Architect thus provides an explicit link between TLiDAR range measurements and the structural attributes of individual trees. L-Architect was finally applied to model the architecture of the forest canopy for better characterization of vertical and horizontal structure with airborne LiDAR data. This project provides a means of answering the demand for detailed canopy architectural data, which are difficult to obtain, in order to reproduce a variety of forest covers. Because of the importance of architectural models, L-Architect makes a significant contribution to improving the capacity for parameter inversion over vegetation cover in optical and lidar remote sensing. Keywords: architectural modelling, terrestrial lidar, forest canopy, structural parameters, remote sensing.

  14. Ground observations and remote sensing data for integrated modelisation of water budget in the Merguellil catchment, Tunisia

    NASA Astrophysics Data System (ADS)

    Mougenot, Bernard

    2016-04-01

    The Mediterranean region is affected by water scarcity. Some countries, such as Tunisia, have reached the limit of 550 m3/year/capita because of the overexploitation of scarce water resources for irrigation, domestic use and industry. Many programmes aim to evaluate strategies to improve water use at the regional level. In central Tunisia, on the Merguellil catchment, we develop integrated water-resource models based on social surveys, ground observations and remote sensing data. The main objective is to close the water budget at the regional level and to estimate irrigation and groundwater pumping in order to test scenarios with end users. Our work benefits from French, bilateral and European projects (ANR, MISTRALS/SICMed, FP6, FP7, GMES/GEOLAND-ESA) and from network projects such as JECAM and AERONET, for which the Merguellil site is a reference. The site combines irrigated and rainfed crops mixing cereals, market gardening and orchards, and will be proposed as a new environmental observing system connected to the OMERE, TENSIFT and OSR systems in Tunisia, Morocco and France respectively. We present here an original and large set of ground and remote sensing data, acquired mainly from 2008 to the present, to be used for the calibration/validation of water-budget processes and integrated models for present conditions and scenarios. Ground data: meteorological stations; local-scale water budget (flux tower, soil fluxes, soil and surface temperature, soil moisture, drainage, streamflow, water level in lakes and the aquifer); vegetation parameters on selected fields each month (LAI, height, biomass, yield); land cover three times per year; bare-soil roughness; irrigation and pumping estimates; soil texture. Remote sensing data: products from multiple platforms (MODIS, SPOT, LANDSAT, ASTER, PLEIADES, ASAR, COSMO-SkyMed, TerraSAR-X…), multiple wavelengths (solar, microwave and thermal) and multiple resolutions (0.5 m to 1 km). Ground observations are used (1) to calibrate soil-vegetation-atmosphere models at the field scale, for the different compartments and for irrigated and rainfed land, over a limited period (seasons or a set of dry and wet years), and (2) to calibrate and validate, in particular, evapotranspiration derived from multi-wavelength satellite data at the watershed level in relation to aquifer conditions (pumping and recharge rate). Some examples will be highlighted.

  15. Prediction du profil de durete de l'acier AISI 4340 traite thermiquement au laser

    NASA Astrophysics Data System (ADS)

    Maamri, Ilyes

    Surface heat treatments are processes intended to give the core and the surface of mechanical parts different properties. They improve wear and fatigue resistance by hardening critical surface zones through short, localized heat inputs. Among the processes that stand out for their surface power density, laser surface heat treatment offers fast, localized and precise thermal cycles while limiting the risk of unwanted distortion. The mechanical properties of the hardened zone obtained with this process depend on the physico-chemical properties of the material to be treated and on several process parameters. To exploit the capabilities of this process adequately, strategies must be developed to control and adjust the parameters so as to produce the desired characteristics of the hardened surface accurately, without resorting to the classic, long and costly trial-and-error process. The objective of the project is therefore to develop models to predict the hardness profile in the heat treatment of AISI 4340 steel parts. To understand the behaviour of the process and evaluate the effects of the various parameters on treatment quality, a sensitivity study was conducted based on a structured experimental design combined with proven statistical analysis techniques. The results of this study identified the most relevant variables to use for modelling. Following this analysis, and in order to build a first model, two modelling techniques were considered: multiple regression and neural networks. Both techniques led to models of acceptable quality, with an accuracy of about 90%. To improve the performance of the neural-network models, two new approaches based on the geometric characterization of the hardness profile were considered. Unlike the first models, which predict the hardness profile from the process parameters alone, the new models combine the same parameters with geometric attributes of the hardness profile that reflect treatment quality. The resulting models show that this strategy leads to very promising results.
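
    To illustrate the neural-network branch of the modelling described above, the sketch below trains a small multilayer perceptron to predict a hardened depth from process parameters on synthetic data. The input columns, the response and the underlying relation are invented; the real models were built from designed experiments on AISI 4340.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(7)

      # Synthetic stand-in data: process parameters -> hardened depth (mm), invented relation
      n = 300
      power = rng.uniform(500, 2000, n)     # laser power, W
      speed = rng.uniform(5, 30, n)         # scanning speed, mm/s
      focus = rng.uniform(-5, 5, n)         # focal offset, mm
      depth = 0.4 * power / 1000 - 0.02 * speed + 0.01 * focus ** 2 \
              + rng.normal(0, 0.03, n)

      X = np.column_stack([power, speed, focus])
      X_tr, X_te, y_tr, y_te = train_test_split(X, depth, random_state=0)

      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(16, 16),
                                         max_iter=5000, random_state=0))
      model.fit(X_tr, y_tr)
      print(f"R^2 on held-out data: {model.score(X_te, y_te):.2f}")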

  16. Realisation et Applications D'un Laser a Fibre a L'erbium Monofrequence

    NASA Astrophysics Data System (ADS)

    Larose, Robert

    The incorporation of rare-earth ions into the glass matrix of an optical fiber has enabled the emergence of all-fiber amplifying components. The goal of this thesis is, on the one hand, to analyze and model such a device and, on the other, to build and characterize a fiber amplifier and a fiber oscillator. Using a custom-made, highly erbium-doped fiber, a tunable fiber laser is built that operates in a multi-longitudinal-mode regime with a 1.5 GHz linewidth, and also as a single-frequency source with a 70 kHz linewidth. The laser is then used to characterize a Bragg grating written by photosensitivity in an optical fiber. The tuning technique also allows locking to the bottom of an acetylene resonance: the laser then holds the centre of the line with an error of less than 1 MHz, thereby correcting the mechanical drifts of the cavity.

  17. Modelisation des emissions de particules microniques et nanometriques en usinage

    NASA Astrophysics Data System (ADS)

    Khettabi, Riad

    Machining parts emits particles of microscopic and nanometric sizes that can be hazardous to health. The goal of this work is to study these particle emissions with a view to prevention and reduction at the source. The approach is both experimental and theoretical, at the microscopic and macroscopic scales. The work begins with tests to determine the influence of the material, the tool and the machining parameters on particle emissions. A new parameter characterizing the emissions, called the dust unit, is then developed and a predictive model is proposed. This model is based on a new hybrid theory that integrates energy, tribological and plastic-deformation approaches, and includes tool geometry, material properties, cutting conditions and chip segmentation. It was validated in turning on four materials: Al6061-T6, AISI 1018, AISI 4140 and grey cast iron.

  18. Comparison of different 3D wavefront sensing and reconstruction techniques for MCAO

    NASA Astrophysics Data System (ADS)

    Bello, Dolores; Vérinaud, Christophe; Conan, Jean-Marc; Fusco, Thierry; Carbillet, Marcel; Esposito, Simone

    2003-02-01

    The vertical distribution of turbulence limits the field of view of classical adaptive optics because of anisoplanatism. Multiconjugate adaptive optics (MCAO) uses several deformable mirrors conjugated to different layers in the atmosphere to overcome this effect. In the last few years, many studies and developments have addressed the analysis of the turbulence volume and the choice of wavefront reconstruction techniques. An extensive study of MCAO modelling and performance estimation has been carried out at OAA and ONERA. The Monte Carlo codes developed allow many aspects to be simulated and investigated: comparison of turbulence-analysis strategies (tomography or layer-oriented) and comparison of different reconstruction approaches. For instance, in the layer-oriented approach, the control of a given deformable mirror can be deduced either from the whole set of wavefront-sensor measurements or only from the associated wavefront sensor. Numerical simulations are presented showing the advantages and disadvantages of these options for several cases depending on the number, geometry and magnitude of the guide stars.

  19. Regard epistemique sur une evolution conceptuelle en physique au secondaire

    NASA Astrophysics Data System (ADS)

    Potvin, Patrice

    The thesis, which is in continuity with Legendre's (1993) work, deals with qualitative understanding of physics notions at the secondary level. It attempts to identify and to label, in the verbalizations of 12 to 16 year-old students, the tendencies that guide their cognitive itineraries through the exploration of problem-situations. The hypotheses of work are about modelisations, conceptions and p-prims. These last objects are seen, in DiSessa's epistemological perspective, as a type of habit that influences the determination of links between the parameters of a problem. In other words, they coordinate logically and mathematically. Methodology is based on explicitation interviews. This type of interview authorizes verbalizations that involve an "intuitive sense" of mechanics. Twenty students are invited to share their evocations as they explore the logics of a computerized microworld. This microworld has been programmed on the "Interactive Physics(TM)" software and is made of five different situations that involve speed, acceleration, mass, force and inertia. The situations are presented to the students from the least to the most complex. An analysis of the verbalizations of the five students shows the existence of elements that play a role in modelisation and qualitative construction of comprehension as well as in its qualitative/quantitative articulation. Results indicate the presence of coordinative habits easily discernible. P-prims appear to play an important part in the construction of models and in the determination of links between the variables of a problem. The analysis of the results allows to see that conceptions are not so important pieces in comprehension. As such, they seem phenotypic. Also, analysis allows to recognize the difficulty to understand properly the inverse relation (1/x) and its asymptotic nature. The "p-prim" analysis also establishes the possibility to analyze not only efficient and inefficient intuitions, but also the cognitive itineraries of students working to construct the logic of the movement of a "ball" as a whole. Implications of the thesis are, among others, at the praxic level; it becomes possible to imagine sequences of learning and teaching physics that are based on the consideration of p-prims despite the implicit nature of these objects. This is a truly constructivist practice which establishes bridges between novice and expert knowledge because there are p-prims in both of them. As so, the thesis acknowledges a perspective of learning inscribed in "continuity". It also proposes a fertile theoretical ground for the comprehension of physics.

  20. Algorithmes de couplage RANS et ecoulement potentiel

    NASA Astrophysics Data System (ADS)

    Gallay, Sylvain

    In the aircraft development process, the selected solution must satisfy numerous criteria across many disciplines, such as structures, aerodynamics, stability and control, performance and safety, while respecting strict schedules and minimizing costs. Candidate geometries are numerous in the early product-definition and preliminary-design stages, and multidisciplinary optimization environments are being developed by the various aeronautical industries. Different methods involving different levels of modelling are required for the different phases of project development. During the definition and preliminary-design phases, fast methods are needed to evaluate candidates efficiently. Developing methods that improve the accuracy of existing approaches while keeping the computational cost low provides a higher level of fidelity in the early phases of the project and thus greatly reduces the associated risks. In aerodynamics, the development of viscous/inviscid coupling algorithms upgrades linear inviscid calculation methods into nonlinear methods that account for viscous effects. These methods make it possible to characterize the viscous flow over the configurations and to predict, among other things, stall mechanisms or the position of shock waves on lifting surfaces. This thesis focuses on the coupling between a three-dimensional potential-flow method and two-dimensional viscous section data. Existing methods are implemented and their limits identified. An original method is then developed and validated. Results on an elliptical wing demonstrate the capability of the algorithm at high angles of attack and in the post-stall region. The coupling algorithm was compared with higher-fidelity data on configurations taken from the literature. A fuselage model based on empirical relations and RANS simulations was tested and validated. The lift, drag and pitching-moment coefficients, as well as the pressure coefficients extracted along the span, showed good agreement with wind-tunnel data and RANS models for transonic configurations. A high-lift configuration was used to study the modelling of high-lift surfaces in the potential-flow method, demonstrating that camber can be taken into account solely through the viscous section data.
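
    The coupling idea summarized above (a 3D inviscid solution corrected with 2D viscous section data) can be sketched as a nonlinear fixed-point iteration on an effective angle of attack. This is only a generic illustration under assumed data, not the author's algorithm: the viscous lift curve cl_2d below is a made-up placeholder and the induced-angle model assumes elliptic loading.

        import numpy as np

        # Generic alpha-coupling sketch: correct a 3D inviscid estimate with a
        # 2D viscous lift curve, assuming an elliptic wing so that the induced
        # angle is alpha_i = CL / (pi * AR).  Illustration only.
        AR = 8.0                        # hypothetical aspect ratio
        alpha = np.radians(12.0)        # geometric angle of attack

        def cl_2d(a):
            """Made-up viscous section lift curve with a gentle stall."""
            cl_lin = 2 * np.pi * a
            return cl_lin / (1.0 + (a / np.radians(15.0)) ** 4)   # placeholder shape

        CL = 0.5                                   # initial guess
        for _ in range(100):
            alpha_i = CL / (np.pi * AR)            # induced angle (elliptic loading)
            CL_new = cl_2d(alpha - alpha_i)        # 2D viscous data at the effective angle
            if abs(CL_new - CL) < 1e-8:
                break
            CL = 0.5 * CL + 0.5 * CL_new           # relaxed fixed-point update

        print(f"converged CL = {CL:.3f}")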

  1. Modelisation numerique des phenomenes physiques du soudage par friction-malaxage et comportement en fatigue de joints soudes en aluminium 7075-T6

    NASA Astrophysics Data System (ADS)

    Gemme, Frederic

    The aim of the present research project is to increase the amount of fundamental knowledge regarding the process by getting a better understanding of the physical phenomena involved in friction stir welding (FSW). Such knowledge is required to improve the process in the context of industrial applications. In order to do so, the first part of the project is dedicated to a theoretical study of the process, while the microstructure and the mechanical properties of welded joints obtained in different welding conditions are measured and analyzed in the second part. The combination of the tool rotating and translating movements induces plastic deformation and heat generation of the welded material. The material thermomechanical history is responsible for metallurgical phenomena occurring during FSW such as recrystallization and precipitate dissolution and coarsening. Process modelling is used to reproduce this thermomechanical history in order to predict the influence of welding on the material microstructure. It is helpful to study heat generation and heat conduction mechanisms and to understand how joint properties are related to them. In the current work, a finite element numerical model based on solid mechanics has been developed to compute the thermomechanical history of the welded material. The computation results were compared to reference experimental data in order to validate the model and to calibrate unknown physical parameters. The model was used to study the effect of the friction coefficient on the thermomechanical history. Results showed that contact conditions at the workpiece/tool interface have a strong effect on relative amounts of heat generated by friction and by plastic deformation. The comparison with the experimental torque applied by the tool for different rotational speeds has shown that the friction coefficient decreases when the rotational speed increases. Consequently, heat generation is far more important near the material/tool interface and the material deformation is shallower, increasing the lack of penetration probability. The variation of thermomechanical conditions with regards to the rotational speed is responsible for the variation of the nugget shape, as recrystallization conditions are not reached in the same volume of material. The second part of the research project was dedicated to a characterization of the welded joints microstructure and mechanical properties. Sound joints were obtained by using a manufacturing procedure involving process parameters optimization and quality control of the joint integrity. Five different combinations of rotational and advancing speeds were studied. Microstructure observations have shown that the rotational speed has an effect on recrystallization conditions because of the variation of the contact conditions at the material/tool interface. On the other hand, the advancing speed has a strong effect on the precipitation state in the heat affected zone (HAZ). The heat input increases when the advancing speed decreases. The material softening in the HAZ is then more pronounced. Mechanical testing of the welded joints showed that the fatigue resistance increases when the rotational speed increases and the advancing speed decreases. The fatigue resistance of FSW joints mainly depends on the ratio of the advancing speed on the rotational speed, called the welding pitch k. 
When the welding pitch is high (k ≥ 0.66 mm/rev), the fatigue resistance is governed by crack initiation at the root of the circular grooves left by the tool on the weld surface. The size of these grooves is directly related to the welding pitch. When the welding pitch is low (k ≤ 0.2 mm/rev), the heat input is high and the fatigue resistance is limited by the HAZ softening. The fatigue resistance is optimized when k lies in the 0.25-0.30 mm/rev range. Outside that range, the presence of small lateral lips becomes critical. The results of the characterization part of the project showed that the effect of the applied vertical force on the formation of lateral lips warrants further investigation. Eliminating the lateral lips, which could be achieved with a more precise adjustment of the vertical force, could lead to an improved fatigue resistance. The elimination of the lateral lips, and also of the circular grooves left by the tool, might be obtained by developing an appropriate surfacing technique and could improve the fatigue resistance without reducing the advancing speed. (Abstract shortened by UMI.)
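
    The welding pitch quoted above is simply the advance per tool revolution. A minimal sketch of its computation and of the fatigue-limiting regimes reported in the abstract (the threshold values are those stated above; the example speeds are arbitrary):

        def welding_pitch(advancing_speed_mm_min: float, rotational_speed_rev_min: float) -> float:
            """Welding pitch k in mm/rev = advancing speed / rotational speed."""
            return advancing_speed_mm_min / rotational_speed_rev_min

        def limiting_mechanism(k: float) -> str:
            """Fatigue-limiting regime as reported in the abstract."""
            if k >= 0.66:
                return "crack initiation at the tool grooves (high pitch)"
            if k <= 0.20:
                return "HAZ softening from high heat input (low pitch)"
            if 0.25 <= k <= 0.30:
                return "near-optimal pitch"
            return "intermediate regime (lateral lips may be critical)"

        k = welding_pitch(advancing_speed_mm_min=300.0, rotational_speed_rev_min=1000.0)  # example values
        print(k, limiting_mechanism(k))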

  2. Adsorption de gaz sur les materiaux microporeux modelisation, thermodynamique et applications

    NASA Astrophysics Data System (ADS)

    Richard, Marc-Andre

    2009-12-01

    Our work on gas adsorption in microporous materials is part of the research effort aimed at increasing the efficiency of on-board hydrogen storage for vehicles. Our objective was to study the possibility of using adsorption to improve the efficiency of small-scale hydrogen liquefaction systems. We also evaluated the performance of a cryogenic hydrogen storage system based on physisorption. Because we are dealing with particularly wide temperature ranges and high pressures in the supercritical region of the gas, we first had to work on the modelling and thermodynamics of adsorption. Representing the adsorbed amount of gas as a function of temperature and pressure with a semi-empirical model is a useful tool for determining the mass of gas adsorbed in a system, but also for calculating the thermal effects associated with adsorption. We adapted the Dubinin-Astakhov (D-A) model to fit adsorption isotherms of hydrogen, nitrogen and methane on activated carbon at high pressure and over a wide range of supercritical temperatures, assuming an invariant adsorption volume. With five regression parameters (including the adsorption volume Va), the model we developed represents very well experimental adsorption isotherms of hydrogen (from 30 to 293 K, up to 6 MPa), nitrogen (from 93 to 298 K, up to 6 MPa) and methane (from 243 to 333 K, up to 9 MPa) on activated carbon. We calculated the internal energy of the adsorbed phase from the model using solution thermodynamics, without neglecting the adsorption volume. We then presented the mass and energy conservation equations for an adsorption system and validated our approach by comparing simulations with adsorption and desorption tests. In addition to the internal energy, we evaluated the entropy, the differential energy of adsorption and the isosteric heat of adsorption. We studied the performance of an adsorption-based hydrogen storage system for vehicles. The hydrogen storage capacity and thermal performance of a 150 L tank filled with Maxsorb MSC-30(TM) activated carbon (specific surface area ~3000 m2/g) were studied over a temperature range of 60 to 298 K and at pressures up to 35 MPa. The system was considered globally, without focusing on a particular design. It is possible to store 5 kg of hydrogen at pressures of 7.8, 15.2 and 29 MPa at temperatures of 80, 114 and 172 K respectively, when the residual hydrogen is recovered at 2.5 bar by heating. Simulating the thermal phenomena allowed us to analyse the cooling required during filling, the heating during discharge and the dormancy time. We developed a hydrogen liquefaction cycle based on adsorption with mechanical compression (ACM) and evaluated its feasibility. The objective was to substantially increase the efficiency of small-scale hydrogen liquefaction systems (less than one tonne/day) without increasing their capital cost. We adapted the ACM refrigeration cycle so that it could subsequently be added to a hydrogen liquefaction cycle. We then simulated idealized ACM refrigeration cycles. Even under these ideal conditions, the specific refrigeration is low. Moreover, the maximum theoretical efficiency of these refrigeration cycles is about 20 to 30% of the ideal. We experimentally implemented an ACM refrigeration cycle with the nitrogen/activated carbon pair. (Abstract shortened by UMI.)
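
    As an illustration of the kind of semi-empirical isotherm referred to above, the sketch below evaluates a Dubinin-Astakhov-type expression for the adsorbed amount as a function of temperature and pressure. The exact functional form and all parameter values are assumptions for illustration, not the fitted parameters of this work.

        import numpy as np

        # Dubinin-Astakhov-type supercritical isotherm (assumed illustrative form):
        #   n_a(P, T) = n_max * exp( -(R*T*ln(P0/P) / (alpha + beta*T))**m )
        # Parameter values below are placeholders, not the fitted values of the thesis.
        R = 8.314  # J/(mol K)

        def n_adsorbed(P, T, n_max=70.0, alpha=3000.0, beta=18.0, P0=1500.0e6, m=2.0):
            A = R * T * np.log(P0 / P)             # adsorption potential, J/mol
            E = alpha + beta * T                   # characteristic energy, J/mol
            return n_max * np.exp(-(A / E) ** m)   # adsorbed amount, mol/kg of adsorbent

        # Example: hydrogen-like isotherm points at 77 K between 0.1 and 6 MPa
        P = np.linspace(0.1e6, 6e6, 5)
        print(np.round(n_adsorbed(P, 77.0), 2))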

  3. Caracterisation et modelisation de la degradation des proprietes fonctionnelles des AMF soumis a un chargement cyclique

    NASA Astrophysics Data System (ADS)

    Paradis, Alexandre

    The principal objective of the present thesis is to elaborate a computational model describing the mechanical properties of NiTi under different loading conditions. A secondary objective is to build an experimental database of NiTi under stress, strain and temperature in order to validate the versatility of the new model proposed herewith. The simulation model presently used at the Laboratoire sur les Alliages a Memoire et les Systemes Intelligents (LAMSI) of ETS behaves well under quasi-static loading. However, dynamic loading with the same model does not allow degradation to be included. The goal of the present thesis is therefore to build a model capable of describing such degradation in a relatively accurate manner. Experimental tests and results are presented; in particular, new results on the behaviour of NiTi when cycling is paused are presented in Chapter 2. A model based on Likhachev's micromechanical model is developed in Chapter 3, and good agreement is found with experimental data. Finally, an adaptation of the model is presented in Chapter 4, allowing it to eventually be implemented into commercial finite-element software.

  4. Modelisation de la diffusion sur les surfaces metalliques: De l'adatome aux processus de croissance

    NASA Astrophysics Data System (ADS)

    Boisvert, Ghyslain

    This thesis is devoted to the study of surface diffusion processes, with the ultimate goal of understanding and modelling the growth of a thin film. Mastering growth is of primary importance given its role in the miniaturization of electronic circuits. We study here the surfaces of the noble metals and of the late transition metals. We first consider the diffusion of a single adatom on a metallic surface. Among other results, we showed the appearance of correlations between successive events when the temperature is comparable to the diffusion barrier, i.e., diffusion can no longer be described as a random walk. We propose a simple phenomenological model that reproduces the simulation results well. These calculations also allowed us to show that diffusion obeys the Meyer-Neldel law. This law states that, for an activated process, the prefactor increases exponentially with the barrier. In addition, this work helps clarify the physical origin of this law. By comparing dynamical results with static ones, we find that the barrier extracted from the dynamical calculations is essentially the same as that obtained with a much simpler static approach. The barrier can therefore be obtained with more accurate, i.e., ab initio, methods such as density functional theory, which are unfortunately also much more computationally demanding. This is what we did for several metallic systems. Our results with this latter approach compare very well with experimental data. We paid particular attention to the (111) surface of platinum. This surface is full of interesting features, such as the non-hexagonal equilibrium shape of the islands and two different adsorption sites for the adatom. Moreover, previous ab initio calculations failed to confirm the equilibrium shape and greatly overestimated the barrier. Our calculations, more complete and performed within a formalism better suited to this kind of problem, correctly predict the equilibrium shape, which is in fact due to a different relaxation of the surface stress at the two types of steps forming the island edges. Our value for the barrier is also strongly reduced once the forces on the surface atoms are relaxed, bringing the theoretical result much closer to the experimental value. Our calculations for copper show that the diffusion of small islands during growth cannot be neglected in this case, casting doubt on the interpretation of the experimental measurements. (Abstract shortened by UMI.)
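
    The Meyer-Neldel compensation law mentioned above can be made concrete with a small numerical sketch: the hop rate follows an Arrhenius form, and the prefactor grows exponentially with the barrier. All numerical values below are illustrative assumptions, not values computed in the thesis.

        import numpy as np

        kB = 8.617e-5  # Boltzmann constant, eV/K

        def hop_rate(E_a, T, nu00=1.0e12, T_MN=300.0):
            """Arrhenius hop rate with a Meyer-Neldel prefactor.

            Meyer-Neldel law: the prefactor increases exponentially with the barrier,
            nu0 = nu00 * exp(E_a / (kB * T_MN)), where T_MN is the isokinetic
            temperature.  All parameter values here are illustrative placeholders.
            """
            nu0 = nu00 * np.exp(E_a / (kB * T_MN))
            return nu0 * np.exp(-E_a / (kB * T))

        for E_a in (0.3, 0.5, 0.7):                    # barriers in eV (assumed)
            print(E_a, f"{hop_rate(E_a, T=250.0):.3e}")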

  5. Modelisation de la Propagation des Ondes Sonores dans un Environnement Naturel Complexe

    NASA Astrophysics Data System (ADS)

    L'Esperance, Andre

    This work is devoted to outdoor sound propagation in a complex natural environment, i.e., in the presence of real conditions of wind, temperature gradient and atmospheric turbulence. More specifically, the work has two objectives. First, it aims to develop a heuristic sound propagation model (MHP) able to take into account the full set of meteorological and acoustic phenomena influencing outdoor sound propagation. Second, it aims to identify under which circumstances, and to what extent, meteorological conditions affect sound propagation. The work is divided into five parts. After a brief introduction stating the motivations of the study (Chapter 1), Chapter 2 reviews previous work in the field of outdoor sound propagation. This chapter also presents the foundations of geometrical acoustics, on which the acoustic part of the heuristic propagation model was developed, and describes how refraction and atmospheric turbulence can be handled within ray theory. Chapter 3 presents the heuristic propagation model (MHP) developed in this work. The first section of the chapter describes the acoustic propagation model, which assumes a linear sound-speed gradient and is based on a hybrid solution combining geometrical acoustics and residue theory. The second section of Chapter 3 deals more specifically with the modelling of the meteorological aspects and the determination of the sound-speed profiles and fluctuation indices associated with the meteorological conditions. Section 3 of the chapter describes how the resulting sound-speed profiles are linearized for the calculations in the acoustic model, and finally Section 4 gives the general trends obtained with the model. Chapter 4 describes the measurement campaigns carried out at Rock-Spring (Pennsylvania, United States) during the summer of 1990 and at Bouin (Vendee, France) during the autumn of 1991. The Rock-Spring campaign focused on refraction effects, whereas the Bouin campaign focused on turbulence effects. Section 4.1 describes the equipment and the processing of the meteorological data in each case, and Section 4.2 does the same for the acoustic results. Finally, Chapter 5 compares the experimental results with those given by the MHP, for both the meteorological and the acoustic results. Comparisons with another model (the Fast Field Program) are also presented.
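
    The linear sound-speed gradient assumed in the acoustic part of the model has a convenient geometric consequence: rays become circular arcs. The sketch below builds an effective sound-speed profile (temperature plus a wind component) and the corresponding ray radius of curvature after linearization; every profile coefficient is a made-up example, not a value from the thesis.

        import numpy as np

        def effective_sound_speed(z, c0=340.0, a_T=-0.05, u0=2.0, a_u=0.8, phi=0.0):
            """Effective sound speed c_eff(z) = c(z) + u(z)*cos(phi).

            c(z) = c0 + a_T*z models the temperature effect and u(z) = u0 + a_u*log(1+z)
            is a crude wind profile; every coefficient here is an assumed example.
            """
            return c0 + a_T * z + (u0 + a_u * np.log1p(z)) * np.cos(phi)

        # For a profile linearized as c_eff(z) ~ c0 + g*z, rays are circular arcs of
        # radius R ~ c0/|g| (downward-refracting if g > 0 aloft toward the receiver).
        z = np.linspace(0.0, 50.0, 200)
        c = effective_sound_speed(z)
        g = np.polyfit(z, c, 1)[0]          # linearized gradient, 1/s
        print(f"linearized gradient g = {g:.4f} 1/s, ray radius ~ {abs(c[0] / g):.0f} m")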

  6. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    NASA Astrophysics Data System (ADS)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Electricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise applied to hydrological forecasts, which is essential to synthesize the available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where a large amount of human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on a statistical model of the empirical error of perfect forecasts, using streamflow sub-samples defined by quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples defined by quantile class, streamflow variation and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure a good post-processing of the hydrological ensemble, allowing a good improvement of the reliability, skill and sharpness of ensemble forecasts. The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which is not able to take into account hydrological dynamics and processes, i.e. sample heterogeneity: the same streamflow range can correspond to different processes, such as rising limbs or recessions, whose uncertainties differ. The dynamical approach improves the reliability, skill and sharpness of the forecasts and globally reduces the confidence interval width. When compared in detail, the dynamical approach allows a noticeable reduction of the confidence intervals during recessions, where uncertainty is relatively lower, and a slight increase of the confidence intervals during rising limbs or snowmelt, where uncertainty is greater. The dynamical approach, validated by the forecasters' experience (the empirical approach was considered not discriminative enough), improved the forecasters' confidence and the communication of uncertainties. Montanari, A. and Brath, A. (2004). A stochastic approach for assessing the uncertainty of rainfall-runoff simulations. Water Resources Research, 40, W01106, doi:10.1029/2003WR002540. Schaefli, B., Balin Talamba, D. and Musy, A. (2007). Quantifying hydrological modeling errors through a mixture of normal distributions. Journal of Hydrology, 332, 303-315.
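
    A minimal sketch of the empirical dressing idea described above: past errors of "perfect" forecasts are grouped by streamflow quantile class (the dynamical variant would additionally split by streamflow variation), and the resulting error quantiles are used to dress a new deterministic forecast. The archive data and class boundaries below are synthetic placeholders, not EDF data.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic archive of past "perfect" forecasts and their errors (placeholders).
        past_flow = rng.gamma(shape=2.0, scale=50.0, size=5000)   # m3/s
        past_error = rng.normal(scale=0.15 * past_flow)           # heteroscedastic errors

        # Empirical approach: error quantiles per streamflow quantile class.
        edges = np.quantile(past_flow, [0.0, 0.25, 0.5, 0.75, 1.0])
        classes = np.digitize(past_flow, edges[1:-1])              # class index 0..3

        def dress(forecast, levels=(0.1, 0.5, 0.9)):
            """Dress a deterministic forecast with the error quantiles of its class."""
            c = np.digitize(forecast, edges[1:-1])
            errs = past_error[classes == c]
            return forecast + np.quantile(errs, levels)

        print(np.round(dress(120.0), 1))   # e.g. 10%, 50%, 90% dressed values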

  7. La conception, la modelisation et la simulation du systeme VSC-HVDC offshore

    NASA Astrophysics Data System (ADS)

    Benhalima, Seghir

    Wind energy is recognized worldwide as a proven technology to meet the growing demand for green, sustainable energy. Several research projects have been carried out to exploit this stochastic energy source and integrate it with conventional energy sources without affecting the performance of existing electrical grids. In addition, wind energy has great potential at sea, which means that offshore production of this energy will keep increasing worldwide. The optimal extraction of this energy source requires a connection to the grid through a voltage source converter, which plays the role of an interface. To minimise the losses due to transporting energy over very long distances, the technology called High Voltage Direct Current based on Voltage Source Converters (VSC-HVDC) is used. To achieve this goal, a new topology is designed with a new control algorithm based on the control of the power generated by the wind farm, DC voltage regulation and the synchronization between the wind farm and the VSC-HVDC link (based on NPC). The proposed topology and its control technique are validated using the "MATLAB/Simulink program". The results are promising, because the THD is less than 5% and the power factor is close to one.

  8. MONET: multidimensional radiative cloud scene model

    NASA Astrophysics Data System (ADS)

    Chervet, Patrick

    1999-12-01

    All cloud fields exhibit variable structures (bulges) and heterogeneities in their water distribution. With the development of multidimensional radiative models by the atmospheric community, it is now possible to describe horizontal heterogeneities of the cloud medium and to study their influence on radiative quantities. We have developed a complete radiative cloud scene generator, called MONET (French acronym for MOdelisation des Nuages En Tridim.), to compute radiative cloud scenes from visible to infrared wavelengths for various viewing and solar conditions, different spatial scales, and various locations on the Earth. MONET is composed of two parts: a cloud medium generator (CSSM -- Cloud Scene Simulation Model) developed by the Air Force Research Laboratory, and a multidimensional radiative code (SHDOM -- Spherical Harmonic Discrete Ordinate Method) developed at the University of Colorado by Evans. MONET computes images for several scenarios defined by user inputs: date, location, viewing angles, wavelength, spatial resolution, meteorological conditions (atmospheric profiles, cloud types), etc. For the same cloud scene, we can output different viewing conditions and/or various wavelengths. Shadowing effects on clouds or the ground are taken into account. This code is useful for studying heterogeneity effects on satellite data for various cloud types and spatial resolutions, and for determining the specifications of new imaging sensors.

  9. Etude vibroacoustique d'un systeme coque-plancher-cavite avec application a un fuselage simplifie

    NASA Astrophysics Data System (ADS)

    Missaoui, Jemai

    The objective of this work is to develop semi-analytical models to study the structural, acoustic and vibro-acoustic behaviour of a shell-floor-cavity system. The connection between the shell and the floor is ensured using the concept of artificial stiffness. This flexible modelling concept makes it easier to choose the expansion functions describing the motion of each substructure. The results of this study provide insight into the basic physical phenomena encountered in an aircraft structure. An integro-modal approach is developed to compute the acoustic modal characteristics. It relies on a discretization of the irregular cavity into acoustic sub-cavities whose expansion bases are known a priori. This physically based approach has the advantage of being both efficient and accurate. Its validity was demonstrated using results available in the literature. A vibro-acoustic model is developed in order to analyse and understand the structural and acoustic effects of the floor in this configuration. The validity of the results, in terms of resonances and transfer functions, is verified against experimental measurements carried out in the laboratory.

  10. Multi-scale properties of large eddy simulations: correlations between resolved-scale velocity-field increments and subgrid-scale quantities

    NASA Astrophysics Data System (ADS)

    Linkmann, Moritz; Buzzicotti, Michele; Biferale, Luca

    2018-06-01

    We provide analytical and numerical results concerning multi-scale correlations between the resolved velocity field and the subgrid-scale (SGS) stress tensor in large eddy simulations (LES). Following previous studies for the Navier-Stokes equations, we derive the exact hierarchy of LES equations governing the spatio-temporal evolution of velocity structure functions of any order. The aim is to assess the influence of the subgrid model on the inertial range intermittency. We provide a series of predictions, within the multifractal theory, for the scaling of correlations involving the SGS stress, and we compare them against numerical results from high-resolution Smagorinsky LES and from a-priori filtered data generated from direct numerical simulations (DNS). We find that LES data generally agree very well with filtered DNS results and with the multifractal prediction for all leading terms in the balance equations. Discrepancies are measured for some of the sub-leading terms involving cross-correlations between resolved velocity increments and the SGS tensor or the SGS energy transfer, suggesting that there must be room to improve the SGS modelling to further extend the inertial range properties for any fixed LES resolution.
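
    For reference, the Smagorinsky closure used for the LES runs discussed above builds an eddy viscosity from the resolved strain rate, nu_t = (Cs*Delta)^2 |S|. The snippet below is a generic 2D illustration on a periodic grid with random fields, not the code used in the paper; Cs and the grid size are assumed values.

        import numpy as np

        def smagorinsky_nu_t(u, v, dx, Cs=0.16):
            """Eddy viscosity nu_t = (Cs*Delta)^2 * |S| on a 2D periodic grid.

            |S| = sqrt(2 S_ij S_ij), with S_ij the resolved strain-rate tensor
            computed by centered differences.  Generic illustration only.
            """
            dudx = (np.roll(u, -1, 1) - np.roll(u, 1, 1)) / (2 * dx)
            dudy = (np.roll(u, -1, 0) - np.roll(u, 1, 0)) / (2 * dx)
            dvdx = (np.roll(v, -1, 1) - np.roll(v, 1, 1)) / (2 * dx)
            dvdy = (np.roll(v, -1, 0) - np.roll(v, 1, 0)) / (2 * dx)
            S11, S22, S12 = dudx, dvdy, 0.5 * (dudy + dvdx)
            S_mag = np.sqrt(2 * (S11**2 + S22**2 + 2 * S12**2))
            return (Cs * dx) ** 2 * S_mag

        rng = np.random.default_rng(2)
        u, v = rng.normal(size=(2, 64, 64))          # placeholder resolved velocity field
        print(smagorinsky_nu_t(u, v, dx=2 * np.pi / 64).mean())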

  11. Mesure et retroaction sur un qubit multi-niveaux en electrodynamique quantique en circuit non lineaire

    NASA Astrophysics Data System (ADS)

    Boissonneault, Maxime

    Circuit quantum electrodynamics is a promising architecture for quantum computing as well as for studying quantum optics. In this architecture, one or several superconducting qubits, playing the role of atoms, are coupled to one or several resonators playing the role of optical cavities. In this thesis, I study the interaction between a single superconducting qubit and a single resonator, while allowing the qubit to have more than two levels and the resonator to have a Kerr nonlinearity. I am particularly interested in the readout of the qubit state and its improvement, in the back-action of the measurement process on the qubit, and in probing the quantum properties of the resonator through the qubit. To do so, I use a reduced analytical model that I derive from the full description of the system, mainly through unitary transformations and an adiabatic elimination. I also use an in-house numerical library that allows efficient simulation of the evolution of the complete system. I compare the predictions of the reduced analytical model and the results of numerical simulations with experimental results obtained by the quantronics group at CEA Saclay. These results come from the spectroscopy of a superconducting qubit coupled to a driven nonlinear resonator. In the low-spectroscopy-power regime, the reduced model correctly predicts the position and width of the qubit line. The line position undergoes the Lamb and Stark shifts, and its width is dominated by measurement-induced dephasing. I show that, for typical circuit QED parameters, quantitative agreement requires a nonlinear-response model of the intra-resonator field, such as the one developed here. In the high-spectroscopy-power regime, sidebands appear; they are caused by quantum fluctuations of the intra-resonator electromagnetic field around its equilibrium value. These fluctuations are caused by squeezing of the electromagnetic field due to the resonator nonlinearity, and the observation of their effect through qubit spectroscopy is a first. Following the quantitative success of the reduced model, I show that two parameter regimes marginally improve the dispersive readout of a qubit with a linear resonator, and significantly improve a bifurcation readout with a nonlinear resonator. I explain the operation of a qubit readout in a linear resonator developed by an experimental group at Yale University. This readout, which exploits the qubit-induced nonlinearities, has a high fidelity but uses a very high power and is destructive. In all these cases, the multilevel structure of the qubit proves crucial for the measurement. By suggesting ways to improve the readout of superconducting qubits, and by quantitatively describing the physics of a multilevel system coupled to a driven nonlinear resonator, the results presented in this thesis are relevant both for the use of the circuit QED architecture in quantum information processing and for quantum optics. Keywords: circuit quantum electrodynamics, quantum information, measurement, superconducting qubit, transmon, Kerr nonlinearity

  12. Modelisation de l'erosion et des sources de pollution dans le bassin versant Iroquois/Blanchette dans un contexte de changements climatiques

    NASA Astrophysics Data System (ADS)

    Coulibaly, Issa

    As the main source of drinking water for the municipality of Edmundston, the Iroquois/Blanchette watershed is a critical asset for the city, hence the constant efforts to preserve its water quality. Several studies have been carried out there. The most recent ones identified pollution threats of various origins, including those associated with climate change (e.g. Maaref 2012). Given the climate change impacts projected at the scale of New Brunswick, the Iroquois/Blanchette watershed could be strongly affected, and in several ways. Several impact scenarios are plausible, notably flood, erosion and pollution risks driven by increased precipitation and runoff. In view of all these potential threats, the objective of this study is to assess the potential impacts of climate change on erosion and pollution risks at the scale of the Iroquois/Blanchette watershed. To do so, the Canadian version of the Revised Universal Soil Loss Equation, RUSLE-CAN, and the hydrological model SWAT (Soil and Water Assessment Tool) were used to model erosion and pollution risks in the study area. The data used in this work come from diverse sources (remote sensing, soil, topographic, meteorological, etc.). The simulations were carried out in two distinct steps: first under present conditions, with 2013 chosen as the reference year, and then for 2025 and 2050. The results show an upward trend in sediment production in the coming years. The maximum annual production increases by 8.34% and 8.08% in 2025 and 2050 respectively under our most optimistic scenario, and by 29.99% in 2025 and 29.72% in 2050 under the most pessimistic scenario, relative to 2013. As for pollution, the simulated concentrations (sediment, nitrate and phosphorus) evolve with climate change. The maximum sediment concentration decreases in 2025 and 2050 compared with 2013: from 11.20 mg/L in 2013, it falls to 9.03 mg/L in 2025 and to 6.25 mg/L in 2050. A decrease in the maximum nitrate concentration is also expected over the years, more pronounced in 2025: from 4.12 mg/L in 2013, it falls to 1.85 mg/L in 2025 and to 2.90 mg/L in 2050. The phosphorus concentration, on the other hand, increases in the coming years relative to 2013, rising from 0.056 mg/L in 2013 to 0.234 mg/L in 2025 and then 0.144 mg/L in 2050.
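
    The soil-loss model named above follows the multiplicative RUSLE structure. The sketch below shows that structure with placeholder factor values; it is not the calibrated RUSLE-CAN parameterization used in the study.

        def rusle_soil_loss(R, K, LS, C, P):
            """RUSLE-type annual soil loss A = R * K * LS * C * P.

            R: rainfall-runoff erosivity, K: soil erodibility, LS: slope length and
            steepness factor, C: cover management, P: support practice.  The values
            used below are illustrative placeholders, not calibrated RUSLE-CAN factors.
            """
            return R * K * LS * C * P

        A = rusle_soil_loss(R=900.0, K=0.03, LS=1.2, C=0.15, P=1.0)
        print(f"A = {A:.2f} (t/ha/yr, with consistent factor units)")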

  13. POD and PPP with multi-frequency processing

    NASA Astrophysics Data System (ADS)

    Roldán, Pedro; Navarro, Pedro; Rodríguez, Daniel; Rodríguez, Irma

    2017-04-01

    Precise Orbit Determination (POD) and Precise Point Positioning (PPP) are methods for estimating the orbits and clocks of GNSS satellites and the precise positions and clocks of user receivers. These methods are traditionally based on processing the ionosphere-free combination. With this combination, the delay introduced in the signal when passing through the ionosphere is removed, taking advantage of the dependency of this delay on the inverse square of the frequency. It is also possible to process the individual frequencies, but in that case the ionospheric delay must be properly modelled. This modelling is usually very challenging, as the electron content of the ionosphere experiences important temporal and spatial variations. These two options define the two main kinds of processing: the dual-frequency ionosphere-free processing, typically used in POD and in certain applications of PPP, and the single-frequency processing with estimation or modelling of the ionosphere, mostly used in PPP processing. In magicGNSS, a software tool developed by GMV for POD and PPP, a hybrid approach has been implemented. This approach combines observations from any number of individual frequencies and any number of ionosphere-free combinations of these frequencies. In this way, the ionosphere-free observations allow a better estimation of positions and orbits, while the inclusion of observations from individual frequencies makes it possible to estimate the ionospheric delay and to reduce the noise of the solution. It is also possible to include other kinds of combinations, such as the geometry-free combination, instead of processing individual frequencies. The joint processing of all the frequencies for all the constellations requires both the estimation or modelling of the ionospheric delay and the estimation of inter-frequency biases. The ionospheric delay can be estimated from the single-frequency or dual-frequency geometry-free observations, but it is also possible to use a-priori information based on ionospheric models, on external estimations and on the expected behavior of the ionosphere. The inter-frequency biases appear because the delay of the signal inside the transmitter and the receiver strongly depends on its frequency. However, it is possible to include constraints on these delays in the estimator, assuming small variations over time. By using different types of combinations, all the available information from GNSS systems can be included in the processing. This is especially interesting for the Galileo satellites, which transmit on several frequencies, and the GPS IIF satellites, which transmit on L5 in addition to the traditional L1 and L2. Several experiments have been performed to assess the improvement in POD and PPP performance when using all the constellations and all the available frequencies of each constellation. This paper describes the new approach of multi-frequency processing, including the estimation of the biases and ionospheric delays affecting GNSS observations, and presents the results of the experimentation activities performed to assess the benefits in the POD and PPP algorithms.
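
    The ionosphere-free combination mentioned above eliminates the first-order ionospheric delay by exploiting its 1/f^2 dependence. A minimal sketch (GPS L1/L2 frequencies; the pseudorange values are placeholders, not real observations):

        # Ionosphere-free combination of two pseudoranges P1, P2 at frequencies f1, f2:
        #   P_IF = (f1**2 * P1 - f2**2 * P2) / (f1**2 - f2**2)
        # The first-order ionospheric delay (proportional to 1/f**2) cancels out.
        f1, f2 = 1575.42e6, 1227.60e6          # GPS L1 and L2 carrier frequencies, Hz

        def iono_free(P1: float, P2: float) -> float:
            return (f1**2 * P1 - f2**2 * P2) / (f1**2 - f2**2)

        # Placeholder pseudoranges (m): same geometric range, ionospheric delays scaled by 1/f^2
        rho, I1 = 21_000_000.0, 5.0            # hypothetical geometric range and L1 iono delay
        P1 = rho + I1
        P2 = rho + I1 * (f1 / f2) ** 2
        print(iono_free(P1, P2) - rho)          # ~0: the ionospheric delay is removed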

  14. Enhancement of anaerobic sludge digestion by high-pressure homogenization.

    PubMed

    Zhang, Sheng; Zhang, Panyue; Zhang, Guangming; Fan, Jie; Zhang, Yuxuan

    2012-08-01

    To improve anaerobic sludge digestion efficiency, the effects of high-pressure homogenization (HPH) conditions on the anaerobic sludge digestion were investigated. The VS and TCOD were significantly removed with the anaerobic digestion, and the VS removal and TCOD removal increased with increasing the homogenization pressure and homogenization cycle number; correspondingly, the accumulative biogas production also increased with increasing the homogenization pressure and homogenization cycle number. The optimal homogenization pressure was 50 MPa for one homogenization cycle and 40 MPa for two homogenization cycles. The SCOD of the sludge supernatant significantly increased with increasing the homogenization pressure and homogenization cycle number due to the sludge disintegration. The relationship between the biogas production and the sludge disintegration showed that the accumulative biogas and methane production were mainly enhanced by the sludge disintegration, which accelerated the anaerobic digestion process and improved the methane content in the biogas. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Formulation, caracterisation, modelisation et prevision du comportement thermomecanique des pieces plastiques et composites de fibres de bois : Application aux engrenages

    NASA Astrophysics Data System (ADS)

    Mijiyawa, Faycal

    This study adapts wood-fibre thermoplastic composite materials to gears, enables the manufacture of a new generation of gears, and predicts the thermal behaviour of these gears. After an extensive literature review on thermoplastics (polyethylene and polypropylene) reinforced with wood fibres (birch and aspen), and on the formulation and thermomechanical behaviour of plastic and composite gears, a link was established with the present doctoral thesis. Indeed, many studies on the formulation and characterization of wood-fibre composites have already been carried out, but none addressed the manufacture of gears. The various formulation techniques drawn from the literature made it easier to obtain a composite material with nearly the same properties as the plastics (nylon, acetal, etc.) used in gear design. The formulation of the wood-fibre-reinforced thermoplastics was carried out at the Centre de recherche en materiaux lignocellulosiques (CRML) of the Universite du Quebec a Trois-Rivieres (UQTR), in collaboration with the Mechanical Engineering Department, by compounding the composites on a two-roll Thermotron-C.W. Brabender machine (model T-303, Germany); parts were then manufactured by thermocompression moulding. The thermoplastics used in this thesis are polypropylene (PP) and high-density polyethylene (HDPE), reinforced with birch and aspen fibres. Because of the incompatibility between wood fibre and thermoplastic, a chemical treatment with a coupling agent was applied to increase the mechanical properties of the composites. For the polypropylene/wood composites: (1) The tensile elastic moduli and tensile strengths of the PP/birch and PP/aspen composites increase linearly with fibre content, with or without coupling agent (maleated polypropylene, MAPP). Moreover, the adhesion between the wood fibres and the plastic is improved by using only 3% MAPP, leading to an increase in maximum stress, although no significant effect is observed on the elastic modulus. (2) The results show that, in general, the tensile properties of the polypropylene/birch, polypropylene/aspen and polypropylene/birch/aspen composites are very similar. Wood-plastic composites (WPCs), in particular those containing 30% and 40% fibres, have higher elastic moduli than some plastics used in gear applications (e.g. nylon). For the polyethylene/wood composites with 3% maleated polyethylene (MAPE): (1) Tensile tests: the elastic modulus increases from 1.34 GPa to 4.19 GPa for the HDPE/birch composite and from 1.34 GPa to 3.86 GPa for the HDPE/aspen composite. The maximum stress increases from 22 MPa to 42.65 MPa for the HDPE/birch composite and from 22 MPa to 43.48 MPa for the HDPE/aspen composite. (2) Bending tests: the elastic modulus increases from 1.04 GPa to 3.47 GPa for the HDPE/birch composite and to 3.64 GPa for the HDPE/aspen composite. The maximum stress increases from 23.90 MPa to 66.70 MPa for the HDPE/birch composite and to 59.51 MPa for the HDPE/aspen composite. (3) Poisson's ratio, determined by acoustic pulse measurements, is around 0.35 for all HDPE/wood composites. (4) Thermogravimetric analysis (TGA) reveals that the composite materials have a thermal stability intermediate between that of the wood fibres and that of the HDPE matrix. (5) Wettability (contact angle) tests show that adding wood fibres does not significantly decrease the contact angle with water, because the wood fibres (birch or aspen) appear to be enveloped by the matrix at the composite surface, as shown by scanning electron microscope (SEM) images. (6) The Lavengood-Goettler model best predicts the elastic modulus of the thermoplastic/wood composites. (7) HDPE reinforced with 40% birch fibres is best suited for gear manufacturing, because shrinkage during mould cooling is smaller. The numerical simulation appears to predict the equilibrium temperature better at a speed of 500 rpm, whereas at 1000 rpm a divergence of the model is observed. (Abstract shortened by ProQuest.)

  16. Developpement de techniques numeriques pour l'estimation, la modelisation et la prediction de proprietes thermodynamiques et structurales de systems metalliques a fort ordonnancement chimique

    NASA Astrophysics Data System (ADS)

    Harvey, Jean-Philippe

    In this work, the possibility of calculating and evaluating with a high degree of precision the Gibbs energy of complex multiphase equilibria for which chemical ordering is explicitly and simultaneously considered in the thermodynamic description of solid (short range order and long range order) and liquid (short range order) metallic phases is studied. The cluster site approximation (CSA) and the cluster variation method (CVM) are implemented in a new minimization technique of the Gibbs energy of multicomponent and multiphase systems to describe the thermodynamic behaviour of metallic solid solutions showing strong chemical ordering. The modified quasichemical model in the pair approximation (MQMPA) is also implemented in the new minimization algorithm presented in this work to describe the thermodynamic behaviour of metallic liquid solutions. The constrained minimization technique implemented in this work consists of a sequential quadratic programming technique based on an exact Newton’s method (i.e. the use of exact second derivatives in the determination of the Hessian of the objective function) combined with a line search method to identify a direction of sufficient decrease of the merit function. The implementation of a new algorithm to perform the constrained minimization of the Gibbs energy is justified by the difficulty of identifying, in specific cases, the correct multiphase assemblage of a system where the thermodynamic behaviour of the equilibrium phases is described by one of the previously quoted models using the FactSage software (e.g. solid_CSA+liquid_MQMPA; solid1_CSA+solid2_CSA). After a rigorous validation of the constrained Gibbs energy minimization algorithm using several assessed binary and ternary systems found in the literature, the CVM and the CSA models used to describe the energetic behaviour of metallic solid solutions present in systems with key industrial applications such as the Cu-Zr and the Al-Zr systems are parameterized using fully consistent thermodynamic and structural data generated from a Monte Carlo (MC) simulator also implemented in the framework of this project. In this MC simulator, the modified embedded atom model in the second nearest neighbour formalism (MEAM-2NN) is used to describe the cohesive energy of each studied structure. A new Al-Zr MEAM-2NN interatomic potential needed to evaluate the cohesive energy of the condensed phases of this system is presented in this work. The thermodynamic integration (TI) method implemented in the MC simulator allows the evaluation of the absolute Gibbs energy of the considered solid or liquid structures. The original implementation of the TI method allowed us to evaluate theoretically for the first time all the thermodynamic mixing contributions (i.e., mixing enthalpy and mixing entropy contributions) of a metallic liquid (Cu-Zr and Al-Zr) and of a solid solution (face-centered cubic (FCC) Al-Zr solid solution) described by the MEAM-2NN. Thermodynamic and structural data obtained from MC and molecular dynamics simulations are then used to parameterize the CVM for the Al-Zr FCC solid solution and the MQMPA for the Al-Zr and the Cu-Zr liquid phases, respectively. The extended thermodynamic study of these systems allows the introduction of a new type of configuration-dependent excess parameters in the definition of the thermodynamic function of solid solutions described by the CVM or the CSA.
These parameters greatly improve the precision of these thermodynamic models based on experimental evidences found in the literature. A new parameterization approach of the MQMPA model of metallic liquid solutions is presented throughout this work. In this new approach, calculated pair fractions obtained from MC/MD simulations are taken into account as well as configuration-independent volumetric relaxation effects (regular like excess parameters) in order to parameterize precisely the Gibbs energy function of metallic melts. The generation of a complete set of fully consistent thermodynamic, physical and structural data for solid, liquid, and stoichiometric compounds and the subsequent parameterization of their respective thermodynamic model lead to the first description of the complete Al-Zr phase diagram in the range of composition [0 ≤ XZr ≤ 5 / 9] based on theoretical and fully consistent thermodynamic properties. MC and MD simulations are performed for the Al-Zr system to define for the first time the precise thermodynamic behaviour of the amorphous phase for its entire range of composition. Finally, all the thermodynamic models for the liquid phase, the FCC solid solution and the amorphous phase are used to define conditions based on thermodynamic and volumetric considerations that favor the amorphization of Al-Zr alloys.
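
    The core numerical task described above, minimizing the total Gibbs energy subject to element mass balances, can be illustrated with a much simpler ideal-solution example solved by a generic constrained optimizer. This is not the SQP/exact-Newton code of the thesis; the species, standard Gibbs energies and element amounts below are made up for illustration.

        import numpy as np
        from scipy.optimize import minimize

        R, T = 8.314, 1000.0                      # J/(mol K), K
        g0 = np.array([-50e3, -80e3, -60e3])      # standard Gibbs energies of 3 species (assumed)
        # Element balance matrix: atoms of (A, B) per formula unit of each species (assumed)
        E = np.array([[1, 0], [0, 1], [1, 1]])
        b = np.array([1.0, 1.0])                  # total moles of A and B

        def gibbs(n):
            """Total Gibbs energy of an ideal solution with mole numbers n."""
            n = np.clip(n, 1e-12, None)
            x = n / n.sum()
            return float(np.sum(n * (g0 + R * T * np.log(x))))

        cons = {"type": "eq", "fun": lambda n: E.T @ n - b}   # element mass balance
        res = minimize(gibbs, x0=np.array([0.5, 0.5, 0.5]), bounds=[(1e-12, None)] * 3,
                       constraints=[cons], method="SLSQP")
        print(np.round(res.x, 4), round(res.fun / 1e3, 2), "kJ")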

  17. Sewage sludge disintegration by high-pressure homogenization: a sludge disintegration model.

    PubMed

    Zhang, Yuxuan; Zhang, Panyue; Ma, Boqiang; Wu, Hao; Zhang, Sheng; Xu, Xin

    2012-01-01

    High-pressure homogenization (HPH) technology was applied as a pretreatment to disintegrate sewage sludge. The effects of homogenization pressure, homogenization cycle number, and total solid content on sludge disintegration were investigated. The sludge disintegration degree (DD(COD)), protein concentration, and polysaccharide concentration increased with the increase of homogenization pressure and homogenization cycle number, and decreased with the increase of sludge total solid (TS) content. The maximum DD(COD) of 43.94% was achieved at 80 MPa with four homogenization cycles for a 9.58 g/L TS sludge sample. A HPH sludge disintegration model, DD(COD) = k·N^a·P^b, was established by multivariable linear regression to quantify the effects of the homogenization parameters. The homogenization cycle exponent a and homogenization pressure exponent b were 0.4763 and 0.7324 respectively, showing that the effect of homogenization pressure (P) was more significant than that of homogenization cycle number (N). The value of the rate constant k decreased with the increase of sludge total solid content. The specific energy consumption increased with the increment of sludge disintegration efficiency. Lower specific energy consumption was required for higher total solid content sludge.
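
    The power-law disintegration model above is linear in log space, so its exponents can be recovered by multivariable linear regression, as sketched below on synthetic data generated from the reported exponents (a = 0.4763, b = 0.7324); the data points, k and the noise level are assumptions, not measurements from the paper.

        import numpy as np

        # Synthetic data from DD_COD = k * N**a * P**b with the exponents reported above.
        rng = np.random.default_rng(3)
        N = rng.integers(1, 5, size=40).astype(float)        # homogenization cycles
        P = rng.uniform(20.0, 80.0, size=40)                 # homogenization pressure, MPa
        DD = 1.8 * N**0.4763 * P**0.7324 * rng.lognormal(sigma=0.02, size=40)

        # Fit log(DD) = log(k) + a*log(N) + b*log(P) by least squares.
        X = np.column_stack([np.ones_like(N), np.log(N), np.log(P)])
        coef, *_ = np.linalg.lstsq(X, np.log(DD), rcond=None)
        k_fit, a_fit, b_fit = np.exp(coef[0]), coef[1], coef[2]
        print(round(k_fit, 3), round(a_fit, 3), round(b_fit, 3))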

  18. Internal homogenization: effective permittivity of a coated sphere.

    PubMed

    Chettiar, Uday K; Engheta, Nader

    2012-10-08

    The concept of internal homogenization is introduced as a complementary approach to the conventional homogenization schemes, which could be termed external homogenization. The theory for the internal homogenization of the permittivity of subwavelength coated spheres is presented. The effective permittivity derived from the internal homogenization of core-shells is discussed for plasmonic and dielectric constituent materials. The effective model provided by the homogenization is a useful design tool in constructing coated particles with desired resonant properties.
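
    In the quasi-static (subwavelength) limit, the textbook equivalent permittivity of a coated sphere gives a concrete example of the internal-homogenization idea: the core-shell particle is replaced by a homogeneous sphere with the same dipolar response. The snippet below is only an illustration of the concept with assumed material values, not the derivation of the paper.

        def coated_sphere_eps_eff(eps_core, eps_shell, r_core, r_shell):
            """Quasi-static equivalent permittivity of a coated sphere.

            eps_eff = eps_shell * [(eps_core + 2*eps_shell) + 2*f*(eps_core - eps_shell)]
                                / [(eps_core + 2*eps_shell) -   f*(eps_core - eps_shell)]
            with f = (r_core / r_shell)**3 the core volume fraction.
            """
            f = (r_core / r_shell) ** 3
            num = (eps_core + 2 * eps_shell) + 2 * f * (eps_core - eps_shell)
            den = (eps_core + 2 * eps_shell) - f * (eps_core - eps_shell)
            return eps_shell * num / den

        # Example: a plasmonic-like core (negative real permittivity) in a dielectric shell
        print(coated_sphere_eps_eff(eps_core=-2.0 + 0.1j, eps_shell=2.25,
                                    r_core=40e-9, r_shell=50e-9))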

  19. Effect of homogenization on the properties and microstructure of Mozzarella cheese from buffalo milk.

    PubMed

    Abd El-Gawad, Mona A M; Ahmed, Nawal S; El-Abd, M M; Abd El-Rafee, S

    2012-04-02

    The name pasta filata refers to the unique plasticizing and texturing treatment of the fresh curd in hot water that imparts to the finished cheese its characteristic fibrous structure and melting properties. Mozzarella cheese was made from standardized homogenized and non-homogenized buffalo milk with 3% and 1.5% fat, and the effect of homogenization on the rheology, microstructure and sensory evaluation was investigated. Fresh raw buffalo milk and starter cultures of Streptococcus thermophilus and Lactobacillus delbrueckii ssp. bulgaricus were used. The coagulant was calf rennet powder (HA-LA). Standardized buffalo milk was homogenized at 25 kg/cm2 pressure after heating to 60°C using a homogenizer. Milk and cheese were analysed. The microstructure of the cheese samples was investigated with either transmission or scanning electron microscopy. Statistical analyses were applied to the data obtained. Soluble nitrogen, total volatile free fatty acids, soluble tyrosine and tryptophan increased with the use of homogenized milk, and also increased, though relatively less, in the homogenized Mozzarella cheese. The meltability of Mozzarella cheese increased with increasing fat content and storage period and decreased with homogenization. Mozzarella cheese firmness increased with homogenization and also increased as the storage period progressed. The flavour score, appearance and total score of Mozzarella cheese increased with homogenization and with storage, while the body and texture score decreased with homogenization and increased with storage. The microstructure showed that the low-fat cheese tends to be harder, more crumbly and less smooth than normal. Curd granule junctions were prominent in the cheese made from non-homogenized milk. Homogenization of the milk caused changes in the microstructure of the Mozzarella cheese. Microstructure studies revealed that cheese made from homogenized milk is smoother and has a finer texture than cheese made from non-homogenized milk, but is also firmer and more elastic.

  20. Stimulus homogeneity enhances implicit learning: evidence from contextual cueing.

    PubMed

    Feldmann-Wüstefeld, Tobias; Schubö, Anna

    2014-04-01

    Visual search for a target object is faster if the target is embedded in a repeatedly presented invariant configuration of distractors ('contextual cueing'). It has also been shown that the homogeneity of a context affects the efficiency of visual search: targets receive prioritized processing when presented in a homogeneous context compared to a heterogeneous context, presumably due to grouping processes at early stages of visual processing. The present study investigated in three Experiments whether context homogeneity also affects contextual cueing. In Experiment 1, context homogeneity varied on three levels of the task-relevant dimension (orientation) and contextual cueing was most pronounced for context configurations with high orientation homogeneity. When context homogeneity varied on three levels of the task-irrelevant dimension (color) and orientation homogeneity was fixed, no modulation of contextual cueing was observed: high orientation homogeneity led to large contextual cueing effects (Experiment 2) and low orientation homogeneity led to low contextual cueing effects (Experiment 3), irrespective of color homogeneity. Enhanced contextual cueing for homogeneous context configurations suggest that grouping processes do not only affect visual search but also implicit learning. We conclude that memory representation of context configurations are more easily acquired when context configurations can be processed as larger, grouped perceptual units. However, this form of implicit perceptual learning is only improved by stimulus homogeneity when stimulus homogeneity facilitates grouping processes on a dimension that is currently relevant in the task. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Un accumulateur echangeur de chaleur hybride pour la gestion simultanee des energies solaire et electrique

    NASA Astrophysics Data System (ADS)

    Ait Hammou, Zouhair

    This study deals with the design of a hybrid heat-exchanger/storage unit (AECH) for the simultaneous management of solar and electric energy. A mathematical model based on the energy conservation equations is presented. It is developed to test different storage materials, including phase-change materials (solid/liquid) and sensible-heat storage materials. A computer code is implemented and then validated against analytical and numerical results from the literature. In parallel, a reduced-scale experimental prototype was built in the laboratory to validate the code. Simulations are carried out to study the effects of the design parameters and storage materials on the thermal behaviour of the AECH and on electric energy consumption. The results of simulations over four winter months show that n-octadecane paraffin and capric acid are two suitable candidates for energy storage intended for space heating. Using these two materials in the AECH reduces electric energy consumption by 32% and alleviates the peak-demand problem, since 90% of the electric energy is consumed during off-peak hours. In addition, with a preferential tariff, the calculation of the costs associated with electric energy consumption shows that a consumer adopting this system benefits from a 50% reduction in the electricity bill.
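
    The storage-capacity comparison between sensible-heat and phase-change materials rests on a simple energy balance. The sketch below computes the heat stored by a mass of material heated through its melting point; the property values are rough, generic numbers for an n-octadecane-like paraffin, not those used in the study.

        def stored_heat(m, cp_solid, cp_liquid, latent, T_init, T_melt, T_final):
            """Heat stored (J) when heating a phase-change material from T_init to T_final.

            Sensible heat in the solid up to T_melt, latent heat of fusion at T_melt,
            then sensible heat in the liquid.  Generic property values only.
            """
            q = m * cp_solid * (min(T_melt, T_final) - T_init)
            if T_final > T_melt:
                q += m * latent + m * cp_liquid * (T_final - T_melt)
            return q

        # Rough, assumed properties (J/(kg K) and J/kg); temperatures in deg C
        q = stored_heat(m=100.0, cp_solid=1900.0, cp_liquid=2200.0,
                        latent=2.4e5, T_init=15.0, T_melt=28.0, T_final=45.0)
        print(f"{q / 1e6:.1f} MJ stored by 100 kg")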

  2. Caracterisation des Ondes Radar de Surface par la Simulation Numerique et les Mesures GPR pour l'Auscultation en

    NASA Astrophysics Data System (ADS)

    Filali, Bilai

    Graphene, as an advanced carbon nanostructure, has recently attracted a great deal of scholarly interest because of its outstanding mechanical, electrical and thermal properties. There are several practical ways to synthesize graphene, such as mechanical exfoliation, chemical vapor deposition (CVD), and anodic arc discharge. In this thesis a method of graphene synthesis in plasma is discussed, in which the synthesis is supported by the erosion of the anode material. This graphene synthesis method is one of the most practical methods and can provide a high production rate. High-purity graphene flakes have been synthesized with an anodic arc method at a pressure of about 500 Torr. Raman spectrometry, scanning electron microscopy (SEM), atomic force microscopy (AFM) and transmission electron microscopy (TEM) have been utilized for characterization of the synthesis products. Arc-produced graphene and commercially available graphene were compared with these instruments; the differences lie in the number of layers, the thickness of each layer and the shape of the structure itself. The temperature dependence of the synthesis procedure has been studied. It has been found that graphene can be produced on a copper foil substrate at temperatures near the melting point of copper. However, a decrease in substrate temperature yields a transformation of the synthesized graphene into amorphous carbon. A glow discharge was utilized to functionalize graphene. SEM and EDS observations indicated an increase of the oxygen content in the graphene after its exposure to the glow discharge.

  3. Utilization of the multiple scattering model for the conception of heterogeneous materials with specific electromagnetic properties

    NASA Astrophysics Data System (ADS)

    Vignéras-Lefebvre, V.; Miane, J. L.; Parneix, J. P.

    1993-03-01

    A multiple-scattering model of the behaviour of heterogeneous structures made of spherical particles and subjected to microwave fields is presented. First, we present the principle of scattering by a single sphere, using Mie's equations written in the form of a transfer matrix. This matrix, generalized to a distribution of particles, describes the average behaviour of the material. Within the Rayleigh limit, the value of the effective permittivity thus obtained is compared with experimental results.
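
    For reference, in the Rayleigh (long-wavelength) limit mentioned at the end of the abstract, the classical mixing rule against which such multiple-scattering results are usually checked is the Maxwell Garnett (Clausius-Mossotti) formula; the expression below is that textbook sketch, not necessarily the exact form derived from the transfer-matrix formalism of the paper:

        \varepsilon_{\mathrm{eff}} \;=\; \varepsilon_m\,\frac{1 + 2 f \beta}{1 - f \beta},
        \qquad
        \beta \;=\; \frac{\varepsilon_i - \varepsilon_m}{\varepsilon_i + 2\varepsilon_m},

    where \varepsilon_m is the host permittivity, \varepsilon_i the permittivity of the spherical inclusions and f their volume fraction.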

  4. Mechanized syringe homogenization of human and animal tissues.

    PubMed

    Kurien, Biji T; Porter, Andrew C; Patel, Nisha C; Kurono, Sadamu; Matsumoto, Hiroyuki; Scofield, R Hal

    2004-06-01

    Tissue homogenization is a prerequisite to any fractionation schedule. A plethora of hands-on methods are available to homogenize tissues. Here we report a mechanized method for homogenizing animal and human tissues rapidly and easily. The Bio-Mixer 1200 (manufactured by Innovative Products, Inc., Oklahoma City, OK) utilizes the back-and-forth movement of two motor-driven disposable syringes, connected to each other through a three-way stopcock, to homogenize animal or human tissue. Using this method, we were able to homogenize human or mouse tissues (brain, liver, heart, and salivary glands) in 5 min. From sodium dodecyl sulfate-polyacrylamide gel electrophoresis analysis and a matrix-assisted laser desorption/ionization time-of-flight mass spectrometric enzyme assay for prolidase, we found that the homogenates obtained were as good as or even better than those obtained using a manual glass-on-Teflon (DuPont, Wilmington, DE) homogenization protocol (all-glass tube and Teflon pestle). Use of the Bio-Mixer 1200 to homogenize animal or human tissue precludes the need to stay in the cold room, as is the case with the other hands-on homogenization methods available, in addition to freeing up time for other experiments.

  5. Influence of homogenization treatment on physicochemical properties and enzymatic hydrolysis rate of pure cellulose fibers.

    PubMed

    Jacquet, N; Vanderghem, C; Danthine, S; Blecker, C; Paquot, M

    2013-02-01

    The aim of this study is to compare the effect of different homogenization treatments on the physicochemical properties and the hydrolysis rate of a pure bleached cellulose. The results obtained show that homogenization treatments improve the enzymatic hydrolysis rate of the cellulose fibers by 25 to 100%, depending on the homogenization treatment applied. Characterization of the samples also showed that homogenization had an impact on some physicochemical properties of the cellulose. For moderate treatment intensities (pressure below 500 bar and degree of homogenization below 25), an increase in water retention value (WRV) that correlated with the increase in hydrolysis rate was highlighted. The results also showed that the overall crystallinity of the cellulose appeared not to be affected by the homogenization treatment. For higher treatment intensities, the homogenized cellulose samples developed a stable three-dimensional network that reduces cellulase mobility and slows down the hydrolysis process.

  6. Distributed parameter modeling of repeated truss structures

    NASA Technical Reports Server (NTRS)

    Wang, Han-Ching

    1994-01-01

    A new approach to find homogeneous models for beam-like repeated flexible structures is proposed which conceptually involves two steps. The first step involves the approximation of the 3-D non-homogeneous model by a 1-D periodic beam model. The structure is modeled as a 3-D non-homogeneous continuum. The displacement field is approximated by a Taylor series expansion. Then, the cross-sectional mass and stiffness matrices are obtained by energy equivalence using their additive properties. Due to the repeated nature of the flexible bodies, the mass and stiffness matrices are also periodic. This procedure is systematic and requires less dynamics detail. The second step involves the homogenization from a 1-D periodic beam model to a 1-D homogeneous beam model. The periodic beam model is homogenized into an equivalent homogeneous beam model using the additive property of compliance along the generic axis. The major departure from previous approaches in the literature is using compliance instead of stiffness in homogenization. An obvious justification is that the stiffness is additive at each cross section but not along the generic axis. The homogenized model preserves many properties of the original periodic model.
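
    The central observation (compliance, not stiffness, is additive along the beam axis) amounts to the following sketch, written here for a scalar stiffness distribution K(x) that is periodic with period ℓ:

        \bar S \;=\; \frac{1}{\ell}\int_0^{\ell} K(x)^{-1}\,dx,
        \qquad
        K_{\mathrm{hom}} \;=\; \bar S^{-1} \;=\; \left[\frac{1}{\ell}\int_0^{\ell} K(x)^{-1}\,dx\right]^{-1},

    i.e. the bays within one period contribute through the harmonic (not arithmetic) mean of their stiffnesses, whereas parallel members at a fixed cross section still add their stiffnesses directly; the procedure described above applies this idea to the cross-sectional mass and stiffness matrices rather than to a scalar.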

  7. Mesoscopic homogenization of semi-insulating GaAs by two-step post growth annealing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffmann, B.; Jurisch, M.; Koehler, A.

    1996-12-31

    Mesoscopic homogenization of the electrical properties of s.i. LEC-GaAs is commonly realized by thermal treatment of the crystals, including the steps of dissolution of arsenic precipitates, homogenization of excess As and re-precipitation by creating a controlled supersaturation. Because of the inhomogeneous distribution of dislocations and the corresponding cellular structure along and across LEC-grown crystals, a proper choice of the time-temperature program is necessary to minimize fluctuations of mesoscopic homogeneity. A modified two-step ingot annealing process is demonstrated to ensure a homogeneous distribution of the mesoscopic properties.

  8. A comparison of maximal bioenergetic enzyme activities obtained with commonly used homogenization techniques.

    PubMed

    Grace, M; Fletcher, L; Powers, S K; Hughes, M; Coombes, J

    1996-12-01

    Homogenization of tissue for analysis of bioenergetic enzyme activities is a common practice in studies examining metabolic properties of skeletal muscle adaptation to disease, aging, inactivity or exercise. While numerous homogenization techniques are in use today, limited information exists concerning the efficacy of specific homogenization protocols. Therefore, the purpose of this study was to compare the efficacy of four commonly used approaches to homogenizing skeletal muscle for analysis of bioenergetic enzyme activity. The maximal enzyme activities (Vmax) of citrate synthase (CS) and lactate dehydrogenase (LDH) were measured from homogenous muscle samples (N = 48 per homogenization technique) and used as indicators to determine which protocol had the highest efficacy. The homogenization techniques were: (1) glass-on-glass pestle; (2) a combination of a mechanical blender and a Teflon pestle (Potter-Elvehjem); (3) a combination of the mechanical blender and a biological detergent; and (4) the combined use of a mechanical blender and a sonicator. The glass-on-glass pestle homogenization protocol produced significantly higher (P < 0.05) enzyme activities compared to all other protocols for both enzymes. Of the four protocols examined, the data demonstrate that the glass-on-glass pestle homogenization protocol is the technique of choice for studying bioenergetic enzyme activity in skeletal muscle.

  9. Homogeneous anisotropic solutions of topologically massive gravity with a cosmological constant and their homogeneous deformations

    NASA Astrophysics Data System (ADS)

    Moutsopoulos, George

    2013-06-01

    We solve the equations of topologically massive gravity (TMG) with a potentially non-vanishing cosmological constant for homogeneous metrics without isotropy. We only reproduce known solutions. We also discuss their homogeneous deformations, possibly with isotropy. We show that de Sitter space and hyperbolic space cannot be infinitesimally homogeneously deformed in TMG. We clarify some of their Segre-Petrov types and discuss the warped de Sitter spacetime.

  10. Homogeneity Pursuit

    PubMed Central

    Ke, Tracy; Fan, Jianqing; Wu, Yichao

    2014-01-01

    This paper explores the homogeneity of coefficients in high-dimensional regression, which extends the sparsity concept and is more general and suitable for many applications. Homogeneity arises when regression coefficients corresponding to neighboring geographical regions or a similar cluster of covariates are expected to be approximately the same. Sparsity corresponds to a special case of homogeneity with a large cluster of known atom zero. In this article, we propose a new method called clustering algorithm in regression via data-driven segmentation (CARDS) to explore homogeneity. New mathematics are provided on the gain that can be achieved by exploring homogeneity. Statistical properties of two versions of CARDS are analyzed. In particular, the asymptotic normality of our proposed CARDS estimator is established, which reveals better estimation accuracy for homogeneous parameters than that without homogeneity exploration. When our methods are combined with sparsity exploration, further efficiency can be achieved beyond the exploration of sparsity alone. This provides additional insights into the power of exploring low-dimensional structures in high-dimensional regression: homogeneity and sparsity. Our results also shed light on the properties of the fused Lasso. The newly developed method is further illustrated by simulation studies and applications to real data. Supplementary materials for this article are available online. PMID:26085701

  11. Sewage sludge solubilization by high-pressure homogenization.

    PubMed

    Zhang, Yuxuan; Zhang, Panyue; Guo, Jianbin; Ma, Weifang; Fang, Wei; Ma, Boqiang; Xu, Xiangzhe

    2013-01-01

    The behavior of sludge solubilization using high-pressure homogenization (HPH) treatment was examined by investigating the sludge solid reduction and organics solubilization. The sludge volatile suspended solids (VSS) decreased from 10.58 to 6.67 g/L for the sludge sample with a total solids content (TS) of 1.49% after HPH treatment at a homogenization pressure of 80 MPa with four homogenization cycles; total suspended solids (TSS) correspondingly decreased from 14.26 to 9.91 g/L. About 86.15% of the TSS reduction was attributed to the VSS reduction. The increase of homogenization pressure from 20 to 80 MPa or homogenization cycle number from 1 to 4 was favorable to the sludge organics solubilization, and the protein and polysaccharide solubilization linearly increased with the soluble chemical oxygen demand (SCOD) solubilization. More proteins were solubilized than polysaccharides. The linear relationship between SCOD solubilization and VSS reduction had no significant change under different homogenization pressures, homogenization cycles and sludge solid contents. The SCOD of 1.65 g/L was solubilized for the VSS reduction of 1.00 g/L for the three experimental sludge samples with a TS of 1.00, 1.49 and 2.48% under all HPH operating conditions. The energy efficiency results showed that the HPH treatment at a homogenization pressure of 30 MPa with a single homogenization cycle for the sludge sample with a TS of 2.48% was the most energy efficient.

  12. RY-Coding and Non-Homogeneous Models Can Ameliorate the Maximum-Likelihood Inferences From Nucleotide Sequence Data with Parallel Compositional Heterogeneity.

    PubMed

    Ishikawa, Sohta A; Inagaki, Yuji; Hashimoto, Tetsuo

    2012-01-01

    In phylogenetic analyses of nucleotide sequences, 'homogeneous' substitution models, which assume the stationarity of base composition across a tree, are widely used, although individual sequences may bear distinctive base frequencies. In the worst-case scenario, a homogeneous model-based analysis can yield an artifactual union of two distantly related sequences that achieved similar base frequencies in parallel. Such potential difficulty can be countered by two approaches, 'RY-coding' and 'non-homogeneous' models. The former approach converts the four bases into purines and pyrimidines to normalize base frequencies across a tree, while the heterogeneity in base frequency is explicitly incorporated in the latter approach. The two approaches have been applied to real-world sequence data; however, their basic properties have not been fully examined by pioneering simulation studies. Here, we assessed the performance of maximum-likelihood analyses incorporating RY-coding and a non-homogeneous model (RY-coding and non-homogeneous analyses) on simulated data with parallel convergence to similar base composition. Both RY-coding and non-homogeneous analyses showed superior performance compared with homogeneous model-based analyses. Curiously, the performance of the RY-coding analysis appeared to be significantly affected by the setting of the substitution process used for sequence simulation, relative to that of the non-homogeneous analysis. The performance of a non-homogeneous analysis was also validated by analyzing a real-world sequence data set with significant base heterogeneity.
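
    A minimal sketch of the RY recoding step itself (purines A, G to R; pyrimidines C, T/U to Y), which is all that is needed to normalize parallel GC/AT biases before a two-state analysis; this is an illustration, not the authors' pipeline:

        # RY-coding: collapse the four-base alphabet to purine/pyrimidine.
        RY_MAP = {"A": "R", "G": "R", "C": "Y", "T": "Y", "U": "Y"}

        def ry_code(seq: str) -> str:
            """Recode a nucleotide sequence into the two-state R/Y alphabet."""
            return "".join(RY_MAP.get(base, "-") for base in seq.upper())  # gaps/ambiguities -> '-'

        print(ry_code("ATGCCGTTA"))   # -> RYRYYRYYR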

  13. Quantitative Homogenization in Nonlinear Elasticity for Small Loads

    NASA Astrophysics Data System (ADS)

    Neukamm, Stefan; Schäffner, Mathias

    2018-04-01

    We study quantitative periodic homogenization of integral functionals in the context of nonlinear elasticity. Under suitable assumptions on the energy densities (in particular frame indifference; minimality, non-degeneracy and smoothness at the identity; p ≥ d growth from below; and regularity of the microstructure), we show that in a neighborhood of the set of rotations, the multi-cell homogenization formula of non-convex homogenization reduces to a single-cell formula. The latter can be expressed with the help of correctors. We prove that the homogenized integrand admits a quadratic Taylor expansion in an open neighborhood of the rotations - a result that can be interpreted as the fact that homogenization and linearization commute close to the rotations. Moreover, for small applied loads, we provide an estimate on the homogenization error in terms of a quantitative two-scale expansion.
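
    Schematically, and in standard notation (W the stored energy density, Y the unit cell with |Y| = 1, periodic Sobolev competitors), the multi-cell and single-cell quantities compared in the abstract read as follows; this is a sketch of the usual definitions, not a restatement of the paper's precise hypotheses:

        W_{\mathrm{hom}}(F) \;=\; \lim_{k\to\infty}\;
        \inf_{\varphi\in W^{1,p}_{\mathrm{per}}(kY;\mathbb{R}^d)}\;
        \frac{1}{|kY|}\int_{kY} W\bigl(y,\;F+\nabla\varphi(y)\bigr)\,dy,
        \qquad
        W^{(1)}_{\mathrm{hom}}(F) \;=\;
        \inf_{\varphi\in W^{1,p}_{\mathrm{per}}(Y;\mathbb{R}^d)}\;
        \int_{Y} W\bigl(y,\;F+\nabla\varphi(y)\bigr)\,dy,

    and the quoted result is that the two coincide for deformation gradients F in a neighborhood of the rotations SO(d), where the homogenized integrand moreover admits a quadratic (linearized) expansion.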

  14. Operational coupled atmosphere - ocean - ice forecast system for the Gulf of St. Lawrence, Canada

    NASA Astrophysics Data System (ADS)

    Faucher, M.; Roy, F.; Desjardins, S.; Fogarty, C.; Pellerin, P.; Ritchie, H.; Denis, B.

    2009-09-01

    A fully interactive coupled atmosphere-ocean-ice forecasting system for the Gulf of St. Lawrence (GSL) has been running in experimental mode at the Canadian Meteorological Centre (CMC) for the last two winter seasons. The goal of this project is to provide more accurate weather and sea ice forecasts over the GSL and adjacent coastal areas by including atmosphere-ocean-ice interactions in the CMC operational forecast system using a formal coupling strategy between two independent modeling components. The atmospheric component is the Canadian operational GEM model (Côté et al. 1998) and the oceanic component is the ocean-ice model for the Gulf of St. Lawrence developed at the Maurice Lamontagne Institute (IML) (Saucier et al. 2003, 2004). The coupling between these two models is achieved by exchanging surface fluxes and variables through MPI communication. The re-gridding of the variables is done with a package developed at the Recherche en Prevision Numerique centre (RPN, Canada). Coupled atmosphere-ocean-ice forecasts are issued once a day based on 00 GMT data. Results for the past two years have demonstrated that the coupled system produces improved forecasts in and around the GSL during all seasons, proving that atmosphere-ocean-ice interactions are indeed important even for short-term Canadian weather forecasts. This has important implications for other coupled modeling and data assimilation partnerships that are in progress involving EC, the Department of Fisheries and Oceans (DFO) and the Department of National Defence (DND). Following this experimental phase, it is anticipated that this GSL system will be the first fully interactive coupled system to be implemented at CMC.
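
    The coupling strategy described above boils down to a flux/state exchange loop between the two components at every coupling interval. The stub below is a schematic sketch only: both components are trivial placeholders, the regridding is the identity, and none of the names correspond to the GEM or ocean-ice interfaces (the operational system performs this exchange between separate executables through MPI with conservative regridding).

        # Schematic coupling cycle (illustrative placeholders, not the CMC system).
        class Atmosphere:
            def __init__(self):
                self.sst = 278.0                       # lower boundary condition [K]
            def surface_fluxes(self):
                return {"heat_W_m2": -50.0, "wind_stress_N_m2": 0.1}
            def receive(self, surface_state):
                self.sst = surface_state["sst_K"]
            def step(self, dt):
                pass                                   # atmospheric dynamics/physics would go here

        class OceanIce:
            def __init__(self):
                self.sst = 278.0
            def surface_state(self):
                return {"sst_K": self.sst, "ice_fraction": 0.3}
            def receive(self, fluxes):
                self.sst -= fluxes["heat_W_m2"] * 1e-5 # toy response to the received heat flux
            def step(self, dt):
                pass                                   # ocean/ice dynamics would go here

        def regrid(fields):
            return fields                              # stands in for the RPN regridding package

        atm, ocn = Atmosphere(), OceanIce()
        for _ in range(48):                            # e.g. 48 coupling steps
            ocn.receive(regrid(atm.surface_fluxes()))  # atmosphere -> ocean/ice: surface fluxes
            atm.receive(regrid(ocn.surface_state()))   # ocean/ice -> atmosphere: SST, ice cover
            atm.step(900.0)
            ocn.step(900.0)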

  15. Orthogonality Measurement for Homogenous Projects-Bases

    ERIC Educational Resources Information Center

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  16. Homogenization of Mammalian Cells.

    PubMed

    de Araújo, Mariana E G; Lamberti, Giorgia; Huber, Lukas A

    2015-11-02

    Homogenization is the name given to the methodological steps necessary for releasing organelles and other cellular constituents as a free suspension of intact individual components. Most homogenization procedures used for mammalian cells (e.g., cavitation pump and Dounce homogenizer) rely on mechanical force to break the plasma membrane and may be supplemented with osmotic or temperature alterations to facilitate membrane disruption. In this protocol, we describe a syringe-based homogenization method that does not require specialized equipment, is easy to handle, and gives reproducible results. The method may be adapted for cells that require hypotonic shock before homogenization. We routinely use it as part of our workflow to isolate endocytic organelles from mammalian cells. © 2015 Cold Spring Harbor Laboratory Press.

  17. On the connection between financial processes with stochastic volatility and nonextensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Queirós, S. M. D.; Tsallis, C.

    2005-11-01

    The GARCH algorithm is the most renowned generalisation of Engle's original proposal for modelling returns, the ARCH process. Both cases are characterised by presenting a time-dependent and correlated variance or volatility. Besides a memory parameter, b (present in ARCH), and an independent and identically distributed noise, ω, GARCH involves another parameter, c, such that, for c = 0, the standard ARCH process is reproduced. In this manuscript we use a generalised noise following a distribution characterised by an index q_n, such that q_n = 1 recovers the Gaussian distribution. Matching low statistical moments of the GARCH distribution for returns with a q-Gaussian distribution obtained by maximising the entropy S_q = (1 - Σ_i p_i^q)/(q - 1), basis of nonextensive statistical mechanics, we obtain a sole analytical connection between q and (b, c, q_n) which turns out to be remarkably good when compared with computational simulations. With this result we also derive an analytical approximation for the stationary distribution for the (squared) volatility. Using a generalised Kullback-Leibler relative entropy form based on S_q, we also analyse the degree of dependence between successive returns, z_t and z_{t+1}, of GARCH(1,1) processes. This degree of dependence is quantified by an entropic index, q_op. Our analysis points to the existence of a unique relation between the three entropic indexes q_op, q and q_n of the problem, independent of the value of (b, c).
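
    As a hedged illustration of the process under discussion, the sketch below simulates a GARCH(1,1) series driven by heavy-tailed noise. The parameter names a, b, c follow the abstract only loosely, and the q-Gaussian noise ω is emulated through a variance-normalized Student-t, using the standard correspondence ν = (3 − q_n)/(q_n − 1) for 1 < q_n < 5/3 (so that the variance exists); none of this is the authors' code or parameter choice.

        import numpy as np

        def garch_q_noise(n, a=1e-5, b=0.10, c=0.88, q_n=1.3, seed=0):
            """GARCH(1,1) returns with q-Gaussian-like (Student-t) noise; c = 0 gives ARCH."""
            rng = np.random.default_rng(seed)
            nu = (3.0 - q_n) / (q_n - 1.0)                     # q-Gaussian <-> Student-t mapping
            omega = rng.standard_t(nu, size=n) / np.sqrt(nu / (nu - 2.0))   # unit-variance noise
            z = np.empty(n)
            sigma2 = a / (1.0 - b - c)                         # start at the stationary variance
            for t in range(n):
                z[t] = np.sqrt(sigma2) * omega[t]              # return z_t = sigma_t * omega_t
                sigma2 = a + b * z[t] ** 2 + c * sigma2        # volatility recursion
            return z

        z = garch_q_noise(100_000)
        kurt = np.mean((z - z.mean()) ** 4) / np.var(z) ** 2   # fat tails: kurtosis well above 3
        print(z.std(), kurt)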

  18. Sewage sludge disintegration by combined treatment of alkaline+high pressure homogenization.

    PubMed

    Zhang, Yuxuan; Zhang, Panyue; Zhang, Guangming; Ma, Weifang; Wu, Hao; Ma, Boqiang

    2012-11-01

    Alkaline pretreatment combined with high pressure homogenization (HPH) was applied to promote sewage sludge disintegration. For sewage sludge with a total solid content of 1.82%, the sludge disintegration degree (DD(COD)) with combined treatment was higher than the sum of the DD(COD) with single alkaline and single HPH treatment. NaOH dosage ⩽0.04 mol/L, homogenization pressure ⩽60 MPa and a single homogenization cycle were the suitable conditions for combined sludge treatment. The combined sludge treatment showed a maximum DD(COD) of 59.26%. By regression analysis, the combined sludge disintegration model was established as DD(COD) = 0.713·C^0.334·P^0.234·N^0.119 (C: NaOH dosage; P: homogenization pressure; N: number of homogenization cycles), showing that the effect of the operating parameters on sludge disintegration followed the order: NaOH dosage > homogenization pressure > number of homogenization cycles. The energy efficiency with combined sludge treatment significantly increased compared with that with single HPH treatment, and high energy efficiency was achieved at low homogenization pressure with a single homogenization cycle. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Rheological Behavior of Tomato Fiber Suspensions Produced by High Shear and High Pressure Homogenization and Their Application in Tomato Products

    PubMed Central

    Sun, Ping; Adhikari, Benu P.; Li, Dong

    2018-01-01

    This study investigated the effects of high shear and high pressure homogenization on the rheological properties (steady shear viscosity, storage and loss modulus, and deformation) and homogeneity of tomato fiber suspensions. The tomato fiber suspensions at different concentrations (0.1%-1%, w/w) were subjected to high shear and high pressure homogenization, and the morphology (distribution of fiber particles), rheological properties, and color parameters of the homogenized suspensions were measured. The homogenized suspensions were significantly more uniform than the unhomogenized suspension. The homogenized suspensions were also found to better resist the deformation caused by external stress (creep behavior). The apparent viscosity and the storage and loss moduli of the homogenized tomato fiber suspension are comparable with those of commercial tomato ketchup even at a fiber concentration as low as 0.5% (w/w), implying the possibility of using tomato fiber as a thickener. The model tomato sauce produced using tomato fiber showed desirable consistency and color. These results indicate that the application of tomato fiber in tomato-based food products would be desirable and beneficial. PMID:29743890

  20. Sensitivity of liquid clouds to homogenous freezing parameterizations.

    PubMed

    Herbert, Ross J; Murray, Benjamin J; Dobbie, Steven J; Koop, Thomas

    2015-03-16

    Water droplets in some clouds can supercool to temperatures where homogeneous ice nucleation becomes the dominant freezing mechanism. In many cloud resolving and mesoscale models, it is assumed that homogeneous ice nucleation in water droplets only occurs below some threshold temperature typically set at -40°C. However, laboratory measurements show that there is a finite rate of nucleation at warmer temperatures. In this study we use a parcel model with detailed microphysics to show that cloud properties can be sensitive to homogeneous ice nucleation at temperatures as warm as -30°C. Thus, homogeneous ice nucleation may be more important for cloud development, precipitation rates, and key cloud radiative parameters than is often assumed. Furthermore, we show that cloud development is particularly sensitive to the temperature dependence of the nucleation rate. In order to better constrain the parameterization of homogeneous ice nucleation, laboratory measurements are needed at both high (>-35°C) and low (<-38°C) temperatures. Key points: homogeneous freezing may be significant at temperatures as warm as -30°C; homogeneous freezing should not be represented by a threshold approximation; there is a need for an improved parameterization of homogeneous ice nucleation.

  1. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratiannil, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.; Willett, K.

    2013-09-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous values at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both for the individual station series and for the network-average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.
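
    Two of the metrics listed above are easy to state concretely; the snippet below computes the centered root mean square error against the true homogeneous series and the error in the fitted linear trend on toy monthly data (an illustration of the metric definitions, not the HOME benchmark software).

        import numpy as np

        def centred_rmse(homogenized, truth):
            d = (homogenized - homogenized.mean()) - (truth - truth.mean())
            return np.sqrt(np.mean(d ** 2))

        def trend_error(homogenized, truth, steps_per_decade=120):
            t = np.arange(truth.size)
            slope_h = np.polyfit(t, homogenized, 1)[0]
            slope_t = np.polyfit(t, truth, 1)[0]
            return (slope_h - slope_t) * steps_per_decade      # trend difference per decade

        rng = np.random.default_rng(1)
        truth = 0.0005 * np.arange(1200) + rng.normal(0.0, 0.5, 1200)  # 100 years of monthly values
        homog = truth + np.where(np.arange(1200) > 700, 0.3, 0.0)      # one residual break left in
        print(centred_rmse(homog, truth), trend_error(homog, truth))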

  2. A facile approach to manufacturing non-ionic surfactant nanodipsersions using proniosome technology and high-pressure homogenization.

    PubMed

    Najlah, Mohammad; Hidayat, Kanar; Omer, Huner K; Mwesigwa, Enosh; Ahmed, Waqar; AlObaidy, Kais G; Phoenix, David A; Elhissi, Abdelbary

    2015-03-01

    In this study, a niosome nanodispersion was manufactured using high-pressure homogenization following the hydration of proniosomes. Using beclometasone dipropionate (BDP) as a model drug, the characteristics of the homogenized niosomes were compared with vesicles prepared via the conventional approach of probe-sonication. Particle size, zeta potential, and drug entrapment efficiency were similar for both size reduction mechanisms. However, high-pressure homogenization was much more efficient than sonication in terms of homogenization output rate and avoidance of sample contamination, offering a greater potential for large-scale manufacturing of niosome nanodispersions. For example, high-pressure homogenization was capable of producing small niosomes (209 nm) in a short single size-reduction step (6 min) as compared with the time-consuming process of sonication (237 nm in >18 min), and the BDP entrapment efficiencies were 29.65% ± 4.04 and 36.4% ± 2.8, respectively. In addition, the output rate of high-pressure homogenization was 10 ml/min compared with 0.83 ml/min using the sonication protocol. In conclusion, a facile, applicable, and highly efficient approach for preparing niosome nanodispersions has been established using proniosome technology and high-pressure homogenization.

  3. Homogeneous crystal nucleation in polymers.

    PubMed

    Schick, C; Androsch, R; Schmelzer, J W P

    2017-11-15

    The pathway of crystal nucleation significantly influences the structure and properties of semi-crystalline polymers. Crystal nucleation is normally heterogeneous at low supercooling, and homogeneous at high supercooling, of the polymer melt. Homogeneous nucleation in bulk polymers has been, so far, hardly accessible experimentally, and was even doubted to occur at all. This topical review summarizes experimental findings on homogeneous crystal nucleation in polymers. Recently developed fast scanning calorimetry, with cooling and heating rates up to 10^6 K s^-1, allows for detailed investigations of nucleation near and even below the glass transition temperature, including analysis of nuclei stability. As for other materials, the maximum homogeneous nucleation rate for polymers is located close to the glass transition temperature. In the experiments discussed here, it is shown that polymer nucleation is homogeneous at such temperatures. Homogeneous nucleation in polymers is discussed in the framework of the classical nucleation theory. The majority of our observations are consistent with the theory. The discrepancies may guide further research, particularly experiments to progress theoretical development. Progress in the understanding of homogeneous nucleation is much needed, since most of the modelling approaches dealing with polymer crystallization exclusively consider homogeneous nucleation. This is also the basis for advancing theoretical approaches to the much more complex phenomena governing heterogeneous nucleation.

  4. Utilizing Hierarchical Clustering to improve Efficiency of Self-Organizing Feature Map to Identify Hydrological Homogeneous Regions

    NASA Astrophysics Data System (ADS)

    Farsadnia, Farhad; Ghahreman, Bijan

    2016-04-01

    Hydrologic homogeneous group identification is considered both fundamental and applied research in hydrology. Clustering methods are among the conventional methods used to assess hydrological homogeneous regions. Recently, the Self-Organizing feature Map (SOM) method has been applied in some studies. However, the main problem of this method is the interpretation of its output map. Therefore, the SOM is used as input to other clustering algorithms. The aim of this study is to apply a two-level Self-Organizing feature map and the Ward hierarchical clustering method to determine the hydrologic homogeneous regions in the North and Razavi Khorasan provinces. First, principal component analysis was used to reduce the dimension of the SOM input matrix; then the SOM was used to form a two-dimensional feature map. To determine homogeneous regions for flood frequency analysis, the SOM output nodes were used as input to the Ward method. Generally, the regions identified by the clustering algorithms are not statistically homogeneous. Consequently, they have to be adjusted to improve their homogeneity. After adjustment of the regions' homogeneity by L-moment tests, five hydrologic homogeneous regions were identified. Finally, adjusted regions were created by a two-level SOM and then the best regional distribution function and associated parameters were selected by the L-moment approach. The results showed that the combination of self-organizing maps and Ward hierarchical clustering with principal components as input is more effective than the hierarchical method with principal components or standardized inputs in achieving hydrologic homogeneous regions.
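
    The two-level idea (dimension reduction, a small SOM, then Ward clustering of the SOM prototypes, with the resulting candidate regions still to be screened by L-moment homogeneity tests) can be sketched as below. This is an illustration on random data, not the authors' implementation; MiniSom is an assumed third-party choice (pip install minisom), and any SOM library would serve.

        import numpy as np
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster
        from minisom import MiniSom   # assumed third-party SOM implementation

        rng = np.random.default_rng(0)
        attributes = rng.normal(size=(60, 12))     # 60 sites x 12 catchment attributes (toy data)

        pcs = PCA(n_components=4).fit_transform(attributes)       # level 0: principal components

        som = MiniSom(6, 6, pcs.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
        som.train_random(pcs, 2000)                                # level 1: 6 x 6 feature map
        codebook = som.get_weights().reshape(-1, pcs.shape[1])     # 36 prototype vectors

        node_labels = fcluster(linkage(codebook, method="ward"),   # level 2: Ward on prototypes
                               t=5, criterion="maxclust")

        # assign each site to the cluster of its best-matching SOM node (candidate regions)
        node_index = [som.winner(x)[0] * 6 + som.winner(x)[1] for x in pcs]
        site_region = node_labels[node_index]
        print(site_region)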

  5. A non-asymptotic homogenization theory for periodic electromagnetic structures.

    PubMed

    Tsukerman, Igor; Markel, Vadim A

    2014-08-08

    Homogenization of electromagnetic periodic composites is treated as a two-scale problem and solved by approximating the fields on both scales with eigenmodes that satisfy Maxwell's equations and boundary conditions as accurately as possible. Built into this homogenization methodology is an error indicator whose value characterizes the accuracy of homogenization. The proposed theory allows one to define not only bulk, but also position-dependent material parameters (e.g. in proximity to a physical boundary) and to quantify the trade-off between the accuracy of homogenization and its range of applicability to various illumination conditions.

  6. Assessment the effect of homogenized soil on soil hydraulic properties and soil water transport

    NASA Astrophysics Data System (ADS)

    Mohawesh, O.; Janssen, M.; Maaitah, O.; Lennartz, B.

    2017-09-01

    Soil hydraulic properties play a crucial role in simulating water flow and contaminant transport. Soil hydraulic properties are commonly measured using homogenized soil samples. However, soil structure has a significant effect on the soil's ability to retain and to conduct water, particularly in aggregated soils. In order to determine the effect of soil homogenization on soil hydraulic properties and soil water transport, undisturbed soil samples were carefully collected. Five different soil structures were identified: Angular-blocky, Crumble, Angular-blocky (different soil texture), Granular, and subangular-blocky. The soil hydraulic properties were determined for undisturbed and homogenized soil samples for each soil structure. The soil hydraulic properties were used to model soil water transport using HYDRUS-1D. The homogenized soil samples showed a significant increase in wide pores (wCP) and a decrease in narrow pores (nCP). The wCP increased by 95.6, 141.2, 391.6, 3.9, and 261.3%, and the nCP decreased by 69.5, 10.5, 33.8, 72.7, and 39.3% for homogenized soil samples compared to undisturbed soil samples. The soil water retention curves exhibited a significant decrease in water holding capacity for homogenized soil samples compared with the undisturbed soil samples. The homogenized soil samples also showed a decrease in soil hydraulic conductivity. The simulated results showed that water movement and distribution were affected by soil homogenization. Moreover, soil homogenization affected soil hydraulic properties and soil water transport. However, field studies are needed to determine the effect of these differences on water, chemical, and pollutant transport under several scenarios.

  7. Fluctuations Magnetiques des Gaz D'electrons Bidimensionnels: Application AU Compose Supraconducteur LANTHANE(2-X) Strontium(x) Cuivre OXYGENE(4)

    NASA Astrophysics Data System (ADS)

    Benard, Pierre

    We present a study of the magnetic fluctuations of the normal phase of the superconducting copper oxide La_{2-x}Sr_{x}CuO_4. The compound is modeled by the two-dimensional Hubbard Hamiltonian with a second-nearest-neighbour hopping term (tt'U model). The model is studied within the GRPA (Generalized Random Phase Approximation), including the renormalization of the Hubbard interaction by Brueckner-Kanamori diagrams. In the approach presented in this work, the maxima of the magnetic structure factor observed in neutron scattering experiments are associated with the lattice 2k_F anomalies of the structure factor of non-interacting two-dimensional electron gases. These anomalies arise from scattering between particles located at points of the Fermi surface where the Fermi velocities are tangent, and lead to divergences whose nature depends on the geometry of the Fermi surface in the vicinity of these points. These results are then applied to the tt'U model, of which the usual tU Hubbard model is a special case. In most cases, the interactions do not determine the position of the maxima of the structure factor. The role of the interaction is to increase the intensity of the structures of the magnetic structure factor associated with the magnetic instability of the system. These structures are often already present in the imaginary part of the non-interacting susceptibility. The intensity ratio between the absolute maxima and the other structures of the magnetic structure factor makes it possible to determine the ratio U_rn/U_c, which measures the proximity to a magnetic instability. The phase diagram is then studied in order to delimit the range of validity of the approximation. After a discussion of the collective modes and of the effect of a non-zero imaginary part of the self-energy, the origin of the energy scale of the magnetic fluctuations is examined. It is then shown that the three-band model predicts the same results for the position of the structures of the magnetic structure factor as the one-band model, in the limit where the hybridization of the orbitals of the oxygen atoms of the Cu-O_2 planes and the second-nearest-neighbour hopping amplitude vanish. It is furthermore found that the effect of the hybridization of the oxygen-atom orbitals is well modeled by the second-nearest-neighbour hopping term. Even though they correctly describe the qualitative behaviour of the maxima of the magnetic structure factor, the three-band and one-band models do not yield a position of these structures consistent with the experimental measurements if one assumes that the band is rigid, that is, that the parameters of the Hamiltonian are independent of the strontium concentration. This may be caused by a dependence of the Hamiltonian parameters on the strontium concentration. Finally, the results are compared with neutron scattering experiments and with other theories, in particular those of Littlewood et al. (1993) and Q. Si et al. (1993). The comparison with the experimental results for the lanthanum compound suggests that the Fermi liquid has a disjoint Fermi surface and that it lies close to an incommensurable magnetic instability.
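
    For orientation, a standard RPA treatment of the tt'U model takes the form sketched below (textbook expressions; the thesis works with a generalized RPA in which U is replaced by a Brueckner-Kanamori renormalized interaction, so this is only the schematic starting point):

        \chi_{\mathrm{RPA}}(\mathbf q,\omega) \;=\;
        \frac{\chi_0(\mathbf q,\omega)}{1 - U\,\chi_0(\mathbf q,\omega)},
        \qquad
        \varepsilon_{\mathbf k} \;=\; -2t\,(\cos k_x + \cos k_y) \;-\; 4t'\cos k_x \cos k_y \;-\; \mu,

    where \chi_0 is the Lindhard susceptibility of the non-interacting tt' band. The maxima of the magnetic structure factor then track the 2k_F structure already present in Im \chi_0, and the magnetic instability is approached as U\chi_0 \to 1, consistent with the role the abstract assigns to the ratio U_rn/U_c.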

  8. Les mousses adaptatives pour l'amelioration de l'absorption acoustique: Modelisation, mise en oeuvre, mecanismes de controle

    NASA Astrophysics Data System (ADS)

    Leroy, Pierre

    The objective of this thesis is to conduct a thorough numerical and experimental analysis of the smart foam concept, in order to highlight the physical mechanisms and the technological limitations for the control of acoustic absorption. A smart foam is made of an absorbing material with an embedded actuator able to compensate for the lack of effectiveness of this material at low frequencies (<500 Hz). In this study, the absorbing material is a melamine foam and the actuator is a piezoelectric film of PVDF. A 3D finite element model coupling poroelastic, acoustic, elastic and piezoelectric fields is proposed. The model uses volume and surface quadratic elements. The improved (u,p) formulation is used. An orthotropic porous element is proposed. The power balance in the porous medium is established. This model is a powerful and general tool allowing the modeling of all hybrid configurations using poroelastic and piezoelectric fields. Three smart foam prototypes have been built with the aim of validating the numerical model and setting up experimental active control. The comparison of numerical calculations and experimental measurements shows the validity of the model for passive aspects, transducer behaviors and also for the control configuration. The active control of acoustic absorption is carried out at normal incidence under a plane-wave assumption in the frequency range [0-1500 Hz]. The minimization criterion is the reflected pressure measured by a unidirectional microphone. Three control cases were tested: off-line control with a sum of pure tones, and adaptive control with the nFX-LMS algorithm for a pure tone and for a random broadband noise. The results reveal the possibility of absorbing a pressure of 1 Pa at 100 Hz with 100 V, and a broadband noise of 94 dB with about a hundred Vrms above 250 Hz. These results have been obtained with a mean foam thickness of 4 cm. The control ability of the prototypes is directly connected to the acoustic flow. An important limitation for broadband control comes from the high distortion level through the system at low and high frequencies (<500 Hz, >1500 Hz). The use of the numerical model, supplemented by an analytical study, made it possible to clarify the action mode and the dissipation mechanisms in smart foams. The PVDF moves with the same phase and amplitude as the residual incident pressure which is not dissipated in the foam. Viscous dissipation is then very weak at low frequencies and becomes more important at high frequencies. The wave which has not been dissipated in the porous material is transmitted by the PVDF into the back cavity. The outlook of this study is, on the one hand, the improvement of the model and the prototypes and, on the other hand, the widening of the field of research to the control of acoustic transmission and acoustic radiation of surfaces. The model could be improved by integrating viscoelastic elements able to account for the behavior of the adhesive layer between the PVDF and the foam. A model of electro-elastomer materials would also have to be implemented in the code. This new type of actuator could make it possible to exceed the PVDF displacement limitations. Finally, with a view to industrial integration, it would be interesting to seek configurations able to maximize acoustic absorption while limiting the transmission and radiation of surfaces at the same time.

  9. Effects of homogenization treatment on recrystallization behavior of 7150 aluminum sheet during post-rolling annealing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Zhanying; Department of Applied Science, University of Québec at Chicoutimi, Saguenay, QC G7H 2B1; Zhao, Gang

    2016-04-15

    The effects of two homogenization treatments applied to the direct chill (DC) cast billet on the recrystallization behavior in 7150 aluminum alloy during post-rolling annealing have been investigated using the electron backscatter diffraction (EBSD) technique. Following hot and cold rolling to the sheet, measured orientation maps, the recrystallization fraction and grain size, the misorientation angle and the subgrain size were used to characterize the recovery and recrystallization processes at different annealing temperatures. The results were compared between the conventional one-step homogenization and the new two-step homogenization, with the first step being pretreated at 250 °C. Al3Zr dispersoids with higher densities and smaller sizes were obtained after the two-step homogenization, which strongly retarded subgrain/grain boundary mobility and inhibited recrystallization. Compared with the conventional one-step homogenized samples, a significantly lower recrystallized fraction and a smaller recrystallized grain size were obtained under all annealing conditions after cold rolling in the two-step homogenized samples. - Highlights: • Effects of two homogenization treatments on recrystallization in 7150 Al sheets • Quantitative study on the recrystallization evolution during post-rolling annealing • Al3Zr dispersoids with higher densities and smaller sizes after two-step treatment • Higher recrystallization resistance of 7150 sheets with two-step homogenization.

  10. Effect of high-pressure homogenization on different matrices of food supplements.

    PubMed

    Martínez-Sánchez, Ascensión; Tarazona-Díaz, Martha Patricia; García-González, Antonio; Gómez, Perla A; Aguayo, Encarna

    2016-12-01

    There is a growing demand for food supplements containing high amounts of vitamins, phenolic compounds and minerals that provide health benefits. These functional compounds have different solubility properties, and maintaining their content and guaranteeing their homogeneity requires the application of novel technologies. The quality of different drinkable functional foods after thermal processing (0.1 MPa) or high-pressure homogenization under two different conditions (80 MPa, 33 ℃ and 120 MPa, 43 ℃) was studied. Physicochemical characteristics and sensory qualities were evaluated throughout six months of accelerated storage at 40 ℃ and 75% relative humidity (RH). Aroma and color were better maintained in the high-pressure homogenization-treated samples than in the thermally treated ones, which contributed significantly to extending their shelf life. The small particle size obtained after the high-pressure homogenization treatments caused differences in turbidity and viscosity with respect to the heat-treated samples. The use of high-pressure homogenization, more specifically 120 MPa, provided active ingredient homogeneity to ensure uniform content in functional food supplements. Although the effect of high-pressure homogenization can be affected by the food matrix, high-pressure homogenization can be implemented as an alternative to conventional heat treatments in a commercial setting within the functional food supplement or pharmaceutical industry. © The Author(s) 2016.

  11. Effect of fat content and homogenization under conventional or ultra-high-pressure conditions on interactions between proteins in rennet curds.

    PubMed

    Zamora, A; Trujillo, A J; Armaforte, E; Waldron, D S; Kelly, A L

    2012-09-01

    The objective of this study was to investigate the influence of conventional and ultra-high-pressure homogenization on interactions between proteins within drained rennet curds. The effect of fat content of milk (0.0, 1.8, or 3.6%) and homogenization treatment on dissociation of proteins by different chemical agents was thus studied. Increasing the fat content of raw milk increased levels of unbound whey proteins and calcium-bonded caseins in curds; in contrast, hydrophobic interactions and hydrogen bonds were inhibited. Both homogenization treatments triggered the incorporation of unbound whey proteins in the curd, and of caseins through ionic bonds involving calcium salts. Conventional homogenization-pasteurization enhanced interactions between caseins through hydrogen bonds and hydrophobic interactions. In contrast, ultra-high-pressure homogenization impaired hydrogen bonding, led to the incorporation of both whey proteins and caseins through hydrophobic interactions and increased the amount of unbound caseins. Thus, both homogenization treatments provoked changes in the protein interactions within rennet curds; however, the nature of the changes depended on the homogenization conditions. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  12. Nonlinear vibration of a traveling belt with non-homogeneous boundaries

    NASA Astrophysics Data System (ADS)

    Ding, Hu; Lim, C. W.; Chen, Li-Qun

    2018-06-01

    Free and forced nonlinear vibrations of a traveling belt with non-homogeneous boundary conditions are studied. The axially moving materials in operation are always externally excited and produce strong vibrations. The moving materials with the homogeneous boundary condition are usually considered. In this paper, the non-homogeneous boundaries are introduced by the support wheels. Equilibrium deformation of the belt is produced by the non-homogeneous boundaries. In order to solve the equilibrium deformation, the differential and integral quadrature methods (DIQMs) are utilized to develop an iterative scheme. The influence of the equilibrium deformation on free and forced nonlinear vibrations of the belt is explored. The DIQMs are applied to solve the natural frequencies and forced resonance responses of transverse vibration around the equilibrium deformation. The Galerkin truncation method (GTM) is utilized to confirm the DIQMs' results. The numerical results demonstrate that the non-homogeneous boundary conditions cause the transverse vibration to deviate from the straight equilibrium, increase the natural frequencies, and lead to coexistence of square nonlinear terms and cubic nonlinear terms. Moreover, the influence of non-homogeneous boundaries can be exacerbated by the axial speed. Therefore, non-homogeneous boundary conditions of axially moving materials especially should be taken into account.

  13. Homogenization models for thin rigid structured surfaces and films.

    PubMed

    Marigo, Jean-Jacques; Maurel, Agnès

    2016-07-01

    A homogenization method for thin microstructured surfaces and films is presented. In both cases, sound hard materials are considered, associated with Neumann boundary conditions and the wave equation in the time domain is examined. For a structured surface, a boundary condition is obtained on an equivalent flat wall, which links the acoustic velocity to its normal and tangential derivatives (of the Myers type). For a structured film, jump conditions are obtained for the acoustic pressure and the normal velocity across an equivalent interface (of the Ventcels type). This interface homogenization is based on a matched asymptotic expansion technique, and differs slightly from the classical homogenization, which is known to fail for small structuration thicknesses. In order to get insight into what causes this failure, a two-step homogenization is proposed, mixing classical homogenization and matched asymptotic expansion. Results of the two homogenizations are analyzed in light of the associated elementary problems, which correspond to problems of fluid mechanics, namely, potential flows around rigid obstacles.

  14. Homogenization versus homogenization-free method to measure muscle glycogen fractions.

    PubMed

    Mojibi, N; Rasouli, M

    2016-12-01

    Glycogen is extracted from animal tissues with or without homogenization, using cold perchloric acid. Three methods were compared for the determination of glycogen in rat muscle at different physiological states. Two groups of five rats were kept at rest or subjected to 45 minutes of muscular activity. The glycogen fractions were extracted and measured using the three methods. The data from the homogenization method show that total glycogen decreased following 45 min of physical activity and that the change occurred entirely in the acid-soluble glycogen (ASG), while the acid-insoluble glycogen (AIG) did not change significantly. Similar results were obtained using the "total-glycogen-fractionation" method. The findings of the "homogenization-free" method indicate that the acid-insoluble fraction (AIG) was the main portion of muscle glycogen and that the majority of changes occurred in the AIG fraction. The results of the "homogenization" method are identical to those of "total glycogen fractionation", but differ from those of the "homogenization-free" protocol. The ASG fraction is the major portion of muscle glycogen and is the more metabolically active form.

  15. Motion through a non-homogeneous porous medium: Hydrodynamic permeability of a membrane composed of cylindrical particles

    NASA Astrophysics Data System (ADS)

    Yadav, Pramod Kumar

    2018-01-01

    The present problem is concerned with the flow of a viscous, steady, incompressible fluid through a non-homogeneous porous medium. Here, the non-homogeneous porous medium is a membrane built up of cylindrical particles. The flow outside the membrane is governed by the Stokes equation and the flow through the non-homogeneous porous membrane composed of cylindrical particles is governed by Darcy's law. In this work, we discuss the effect of various fluid parameters, such as the permeability parameter k0, the discontinuity coefficient at the fluid/non-homogeneous porous interface, and the viscosity ratio of the viscous incompressible fluid region to the non-homogeneous porous region, on the hydrodynamic permeability of the membrane, the stress, and the velocity profile. A comparative study of the hydrodynamic permeability of a membrane built up of non-homogeneous porous cylindrical particles and of a porous cylindrical shell enclosing a cylindrical cavity has been carried out. The effects of the various fluid parameters on the streamline flow patterns are also discussed.
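
    In standard notation, the two flow regions described above are governed by the following pair of equations (a sketch of the governing system; μ is the fluid viscosity and k0 the permeability of the porous region):

        \mu\,\nabla^{2}\mathbf v \;=\; \nabla p, \qquad \nabla\cdot\mathbf v = 0
        \quad\text{(clear-fluid region, Stokes flow)},
        \qquad\qquad
        \mathbf v \;=\; -\,\frac{k_0}{\mu}\,\nabla p
        \quad\text{(porous region, Darcy's law)},

    matched at the fluid/porous interface by continuity of the normal velocity together with a stress/slip condition involving the discontinuity coefficient referred to in the abstract.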

  16. Numerical investigation of homogeneous cavitation nucleation in a microchannel

    NASA Astrophysics Data System (ADS)

    Lyu, Xiuxiu; Pan, Shucheng; Hu, Xiangyu; Adams, Nikolaus A.

    2018-06-01

    The physics of nucleation in water is an important issue for many areas, ranging from biomedical to engineering applications. Within the present study, we numerically investigate homogeneous nucleation in a microchannel induced by shock reflection, to gain a better understanding of the mechanism of homogeneous nucleation. The liquid expands due to the reflected shock and homogeneous cavitation nuclei are generated. An Eulerian-Lagrangian approach is employed for modeling this process in a microchannel. Two-dimensional axisymmetric Euler equations are solved to obtain the time evolution of the shock, the gas bubble, and the ambient fluid. The dynamics of the dispersed vapor bubbles is coupled with the surrounding fluid in a Lagrangian framework, describing bubble location and bubble size variation. Our results reproduce nuclei distributions at different stages of homogeneous nucleation and are in good agreement with experimental results. We obtain numerical data for the negative pressure that water can sustain during homogeneous nucleation. An energy transformation description for homogeneous nucleation inside a microchannel flow is derived and analyzed in detail.
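
    For context, classical nucleation theory gives the homogeneous cavitation nucleation rate per unit volume in the form below (the standard textbook expression, quoted here as a sketch and not necessarily the exact model adopted in the paper):

        J \;=\; J_0 \exp\!\left(-\,\frac{16\pi\,\sigma^{3}}{3\,k_B T\,\Delta p^{2}}\right),
        \qquad \Delta p \;=\; p_v - p_l,

    where σ is the surface tension, p_l the (negative) liquid pressure reached behind the reflected expansion wave, p_v the vapour pressure and J_0 a kinetic prefactor; the steep exponential dependence on Δp is why the tension the water can sustain is a key quantity extracted from such simulations.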

  17. Method of fabricating a homogeneous wire of inter-metallic alloy

    DOEpatents

    Ohriner, Evan Keith; Blue, Craig Alan

    2001-01-01

    A method for fabricating a homogeneous wire of inter-metallic alloy comprising the steps of providing a base-metal wire bundle comprising a metal, an alloy or a combination thereof; working the wire bundle through at least one die to obtain a desired dimension and to form a precursor wire; and, controllably heating the precursor wire such that a portion of the wire will become liquid while simultaneously maintaining its desired shape, whereby substantial homogenization of the wire occurs in the liquid state and additional homogenization occurs in the solid state resulting in a homogenous alloy product.

  18. Soy Protein Isolate-Phosphatidylcholine Nanoemulsions Prepared Using High-Pressure Homogenization

    PubMed Central

    Li, Yang; Liu, Jun; Zhu, Ying; Zhang, Xiao-Yuan; Jiang, Lian-Zhou; Qi, Bao-Kun; Zhang, Xiao-Nan; Wang, Zhong-Jiang; Teng, Fei

    2018-01-01

    The nanoemulsions of soy protein isolate-phosphatidylcholine (SPI-PC) prepared under different emulsification conditions were studied. The homogenization pressure and the number of homogenization cycles were varied, along with the SPI and PC concentrations. Evaluations included turbidity, particle size, ζ-potential, particle distribution index, and turbiscan stability index (TSI). The nanoemulsions had the best stability when SPI was at 1.5%, PC was at 0.22%, the homogenization pressure was 100 MPa and homogenization was performed 4 times. The average particle size of the SPI-PC nanoemulsions was 217 nm, the TSI was 3.02 and the emulsification yield of the nanoemulsions was 93.4%. PMID:29735918

  19. Soy Protein Isolate-Phosphatidylcholine Nanoemulsions Prepared Using High-Pressure Homogenization.

    PubMed

    Li, Yang; Wu, Chang-Ling; Liu, Jun; Zhu, Ying; Zhang, Xiao-Yuan; Jiang, Lian-Zhou; Qi, Bao-Kun; Zhang, Xiao-Nan; Wang, Zhong-Jiang; Teng, Fei

    2018-05-07

    Nanoemulsions of soy protein isolate-phosphatidylcholine (SPI-PC) prepared under different emulsification conditions were studied. Homogenization pressure and the number of homogenization cycles were varied, along with the SPI and PC concentrations. Evaluations included turbidity, particle size, ζ-potential, particle distribution index, and turbiscan stability index (TSI). The nanoemulsions had the best stability when SPI was at 1.5%, PC was at 0.22%, the homogenization pressure was 100 MPa and homogenization was performed 4 times. The average particle size of the SPI-PC nanoemulsions was 217 nm, the TSI was 3.02 and the emulsification yield of the nanoemulsions was 93.4%.

  20. A non-asymptotic homogenization theory for periodic electromagnetic structures

    PubMed Central

    Tsukerman, Igor; Markel, Vadim A.

    2014-01-01

    Homogenization of electromagnetic periodic composites is treated as a two-scale problem and solved by approximating the fields on both scales with eigenmodes that satisfy Maxwell's equations and boundary conditions as accurately as possible. Built into this homogenization methodology is an error indicator whose value characterizes the accuracy of homogenization. The proposed theory allows one to define not only bulk, but also position-dependent material parameters (e.g. in proximity to a physical boundary) and to quantify the trade-off between the accuracy of homogenization and its range of applicability to various illumination conditions. PMID:25104912

  1. Cosmological models with homogeneous and isotropic spatial sections

    NASA Astrophysics Data System (ADS)

    Katanaev, M. O.

    2017-05-01

    The assumption that the universe is homogeneous and isotropic is the basis for the majority of modern cosmological models. We give an example of a metric all of whose spatial sections are spaces of constant curvature but the space-time is nevertheless not homogeneous and isotropic as a whole. We give an equivalent definition of a homogeneous and isotropic universe in terms of embedded manifolds.

  2. Experimental Investigation of Piston Heat Transfer in a Light Duty Engine Under Conventional Diesel, Homogeneous Charge Compression Ignition, and Reactivity Controlled Compression Ignition Combustion Regimes

    DTIC Science & Technology

    2014-01-15

    Experimental investigation of piston heat transfer in a light-duty engine under conventional diesel (CDC), homogeneous charge compression ignition (HCCI), and reactivity controlled compression ignition (RCCI) combustion, covering low-temperature combustion (LTC) regimes including RCCI, partially premixed combustion (PPC), and HCCI.

  3. MO-F-CAMPUS-I-02: Accuracy in Converting the Average Breast Dose Into the Mean Glandular Dose (MGD) Using the F-Factor in Cone Beam Breast CT- a Monte Carlo Study Using Homogeneous and Quasi-Homogeneous Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lai, C; Zhong, Y; Wang, T

    2015-06-15

    Purpose: To investigate the accuracy of estimating the mean glandular dose (MGD) for homogeneous breast phantoms by converting from the average breast dose using the F-factor in cone beam breast CT. Methods: EGSnrc-based Monte Carlo codes were used to estimate the MGDs. Hemi-ellipsoids 13 cm in diameter and 10 cm high were used to simulate pendant-geometry breasts. Two different types of hemi-ellipsoidal models were employed: voxels in quasi-homogeneous phantoms were designed as either adipose or glandular tissue, while voxels in homogeneous phantoms were designed as a mixture of adipose and glandular tissues. Breast compositions of 25% and 50% volume glandular fractions (VGFs), defined as the ratio of glandular tissue voxels to entire breast voxels in the quasi-homogeneous phantoms, were studied. These VGFs were converted into glandular fractions by weight and used to construct the corresponding homogeneous phantoms. 80 kVp x-rays with a mean energy of 47 keV were used in the simulation. A total of 10^9 photons were used to image the phantoms, and the energies deposited in the phantom voxels were tallied. Breast doses in homogeneous phantoms were averaged over all voxels and then used to calculate the MGDs using the F-factors evaluated at the mean energy of the x-rays. The MGDs for quasi-homogeneous phantoms were computed directly by averaging the doses over all glandular tissue voxels. The MGDs estimated for the two types of phantoms were normalized to the free-in-air dose at the iso-center and compared. Results: The normalized MGDs were 0.756 and 0.732 mGy/mGy for the 25% and 50% VGF homogeneous breasts and 0.761 and 0.733 mGy/mGy for the corresponding quasi-homogeneous breasts, respectively. The MGDs estimated for the two types of phantoms agreed within 1% in this study. Conclusion: MGDs for homogeneous breast models may be adequately estimated by converting from the average breast dose using the F-factor.
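    One common way to carry out the F-factor style conversion described above is to scale the average dose in the homogeneous adipose/glandular mixture by the ratio of mass energy-absorption coefficients of glandular tissue to the mixture, evaluated at the mean beam energy. The sketch below illustrates only that arithmetic; the coefficient values and the average dose are placeholders, not data from this record or from the EGSnrc simulations.

```python
def mgd_from_average_dose(d_avg, mu_en_rho_glandular, mu_en_rho_mixture):
    """MGD estimate: average mixture dose scaled by the glandular-to-mixture
    ratio of mass energy-absorption coefficients (F-factor style conversion)."""
    return d_avg * (mu_en_rho_glandular / mu_en_rho_mixture)

# All numerical values below are placeholders, for illustration only.
d_avg = 0.75                  # average breast dose per unit free-in-air dose
mu_en_rho_glandular = 0.041   # cm^2/g at the mean beam energy
mu_en_rho_mixture   = 0.038   # cm^2/g for the adipose/glandular mixture

print("estimated MGD:",
      mgd_from_average_dose(d_avg, mu_en_rho_glandular, mu_en_rho_mixture))
```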

  4. Effect of high-pressure homogenization preparation on mean globule size and large-diameter tail of oil-in-water injectable emulsions.

    PubMed

    Peng, Jie; Dong, Wu-Jun; Li, Ling; Xu, Jia-Ming; Jin, Du-Jia; Xia, Xue-Jun; Liu, Yu-Ling

    2015-12-01

    The effect of different high-pressure homogenization energy input parameters on the mean diameter droplet size (MDS) and on droplets > 5 μm of lipid injectable emulsions was evaluated. All emulsions were prepared at different water bath temperatures or at different rotation speeds and rotor-stator processing times, and using different homogenization pressures and numbers of high-pressure system recirculations. The MDS and polydispersity index (PI) of the emulsions were determined using the dynamic light scattering (DLS) method, and large-diameter tail assessments were performed using the light-obscuration/single-particle optical sensing (LO/SPOS) method. Using 1000 bar homogenization pressure and seven recirculations, the energy input parameters of the rotor-stator system had no effect on the final particle size results. When the rotor-stator energy input parameters are fixed, homogenization pressure and recirculation affect the mean particle size and the large-diameter droplets. Particle size decreases with increasing homogenization pressure from 400 bar to 1300 bar when the number of recirculations is fixed; when the homogenization pressure is fixed at 1000 bar, both the MDS and the percentage of fat droplets exceeding 5 μm (PFAT5) decrease with increasing recirculations: the MDS dropped to 173 nm after five cycles and then remained at this level, while the volume-weighted PFAT5 dropped to 0.038% after three cycles, so the MDS reaches its plateau later than PFAT5, and the optimal particle size is obtained when both remain at their plateaus. Excess homogenization recirculation, such as nine passes at 1000 bar, may lead to a PFAT5 increase to 0.060% rather than a decrease; therefore, the high-pressure homogenization procedure is the key factor affecting the particle size distribution of the emulsions. Varying storage conditions (4-25°C) also influenced particle size, especially PFAT5. Copyright © 2015. Published by Elsevier B.V.

  5. Homogenization of tissues via picosecond-infrared laser (PIRL) ablation: Giving a closer view on the in-vivo composition of protein species as compared to mechanical homogenization.

    PubMed

    Kwiatkowski, M; Wurlitzer, M; Krutilin, A; Kiani, P; Nimer, R; Omidi, M; Mannaa, A; Bussmann, T; Bartkowiak, K; Kruber, S; Uschold, S; Steffen, P; Lübberstedt, J; Küpker, N; Petersen, H; Knecht, R; Hansen, N O; Zarrine-Afsar, A; Robertson, W D; Miller, R J D; Schlüter, H

    2016-02-16

    Posttranslational modifications and proteolytic processing regulate almost all physiological processes. Dysregulation can potentially result in pathologic protein species causing diseases. Thus, tissue species proteomes of diseased individuals provide diagnostic information. Since the composition of tissue proteomes can rapidly change during tissue homogenization by the action of enzymes released from their compartments, disease specific protein species patterns can vanish. Recently, we described a novel, ultrafast and soft method for cold vaporization of tissue via desorption by impulsive vibrational excitation (DIVE) using a picosecond-infrared-laser (PIRL). Given that DIVE extraction may provide improved access to the original composition of protein species in tissues, we compared the proteome composition of tissue protein homogenates after DIVE homogenization with conventional homogenizations. A higher number of intact protein species was observed in DIVE homogenates. Due to the ultrafast transfer of proteins from tissues via gas phase into frozen condensates of the aerosols, intact protein species were exposed to a lesser extent to enzymatic degradation reactions compared with conventional protein extraction. In addition, total yield of the number of proteins is higher in DIVE homogenates, because they are very homogenous and contain almost no insoluble particles, allowing direct analysis with subsequent analytical methods without the necessity of centrifugation. Enzymatic protein modifications during tissue homogenization are responsible for changes of the in-vivo protein species composition. Cold vaporization of tissues by PIRL-DIVE is comparable with taking a snapshot at the time of the laser irradiation of the dynamic changes that occur continuously under in-vivo conditions. At that time point all biomolecules are transferred into an aerosol, which is immediately frozen. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  6. Homogenization of tissues via picosecond-infrared laser (PIRL) ablation: Giving a closer view on the in-vivo composition of protein species as compared to mechanical homogenization

    PubMed Central

    Kwiatkowski, M.; Wurlitzer, M.; Krutilin, A.; Kiani, P.; Nimer, R.; Omidi, M.; Mannaa, A.; Bussmann, T.; Bartkowiak, K.; Kruber, S.; Uschold, S.; Steffen, P.; Lübberstedt, J.; Küpker, N.; Petersen, H.; Knecht, R.; Hansen, N.O.; Zarrine-Afsar, A.; Robertson, W.D.; Miller, R.J.D.; Schlüter, H.

    2016-01-01

    Posttranslational modifications and proteolytic processing regulate almost all physiological processes. Dysregulation can potentially result in pathologic protein species causing diseases. Thus, tissue species proteomes of diseased individuals provide diagnostic information. Since the composition of tissue proteomes can rapidly change during tissue homogenization by the action of enzymes released from their compartments, disease specific protein species patterns can vanish. Recently, we described a novel, ultrafast and soft method for cold vaporization of tissue via desorption by impulsive vibrational excitation (DIVE) using a picosecond-infrared-laser (PIRL). Given that DIVE extraction may provide improved access to the original composition of protein species in tissues, we compared the proteome composition of tissue protein homogenates after DIVE homogenization with conventional homogenizations. A higher number of intact protein species was observed in DIVE homogenates. Due to the ultrafast transfer of proteins from tissues via gas phase into frozen condensates of the aerosols, intact protein species were exposed to a lesser extent to enzymatic degradation reactions compared with conventional protein extraction. In addition, total yield of the number of proteins is higher in DIVE homogenates, because they are very homogenous and contain almost no insoluble particles, allowing direct analysis with subsequent analytical methods without the necessity of centrifugation. Biological significance Enzymatic protein modifications during tissue homogenization are responsible for changes of the in-vivo protein species composition. Cold vaporization of tissues by PIRL-DIVE is comparable with taking a snapshot at the time of the laser irradiation of the dynamic changes that occur continuously under in-vivo conditions. At that time point all biomolecules are transferred into an aerosol, which is immediately frozen. PMID:26778141

  7. Efficacy of various pasteurization time-temperature conditions in combination with homogenization on inactivation of Mycobacterium avium subsp. paratuberculosis in milk.

    PubMed

    Grant, Irene R; Williams, Alan G; Rowe, Michael T; Muir, D Donald

    2005-06-01

    The effect of various pasteurization time-temperature conditions with and without homogenization on the viability of Mycobacterium avium subsp. paratuberculosis was investigated using a pilot-scale commercial high-temperature, short-time (HTST) pasteurizer and raw milk spiked with 10(1) to 10(5) M. avium subsp. paratuberculosis cells/ml. Viable M. avium subsp. paratuberculosis was cultured from 27 (3.3%) of 816 pasteurized milk samples overall, 5 on Herrold's egg yolk medium and 22 by BACTEC culture. Therefore, in 96.7% of samples, M. avium subsp. paratuberculosis had been completely inactivated by HTST pasteurization, alone or in combination with homogenization. Heat treatments incorporating homogenization at 2,500 lb/in2, applied upstream (as a separate process) or in hold (at the start of a holding section), resulted in significantly fewer culture-positive samples than pasteurization treatments without homogenization (P < 0.001 for those in hold and P < 0.05 for those upstream). Where colony counts were obtained, the number of surviving M. avium subsp. paratuberculosis cells was estimated to be 10 to 20 CFU/150 ml, and the reduction in numbers achieved by HTST pasteurization with or without homogenization was estimated to be 4.0 to 5.2 log10. The impact of homogenization on clump size distribution in M. avium subsp. paratuberculosis broth suspensions was subsequently assessed using a Mastersizer X spectrometer. These experiments demonstrated that large clumps of M. avium subsp. paratuberculosis cells were reduced to single-cell or "miniclump" status by homogenization at 2,500 lb/in2. Consequently, when HTST pasteurization was being applied to homogenized milk, the M. avium subsp. paratuberculosis cells would have been present as predominantly declumped cells, which may possibly explain the greater inactivation achieved by the combination of pasteurization and homogenization.

  8. Efficacy of Various Pasteurization Time-Temperature Conditions in Combination with Homogenization on Inactivation of Mycobacterium avium subsp. paratuberculosis in Milk

    PubMed Central

    Grant, Irene R.; Williams, Alan G.; Rowe, Michael T.; Muir, D. Donald

    2005-01-01

    The effect of various pasteurization time-temperature conditions with and without homogenization on the viability of Mycobacterium avium subsp. paratuberculosis was investigated using a pilot-scale commercial high-temperature, short-time (HTST) pasteurizer and raw milk spiked with 10(1) to 10(5) M. avium subsp. paratuberculosis cells/ml. Viable M. avium subsp. paratuberculosis was cultured from 27 (3.3%) of 816 pasteurized milk samples overall, 5 on Herrold's egg yolk medium and 22 by BACTEC culture. Therefore, in 96.7% of samples, M. avium subsp. paratuberculosis had been completely inactivated by HTST pasteurization, alone or in combination with homogenization. Heat treatments incorporating homogenization at 2,500 lb/in2, applied upstream (as a separate process) or in hold (at the start of a holding section), resulted in significantly fewer culture-positive samples than pasteurization treatments without homogenization (P < 0.001 for those in hold and P < 0.05 for those upstream). Where colony counts were obtained, the number of surviving M. avium subsp. paratuberculosis cells was estimated to be 10 to 20 CFU/150 ml, and the reduction in numbers achieved by HTST pasteurization with or without homogenization was estimated to be 4.0 to 5.2 log10. The impact of homogenization on clump size distribution in M. avium subsp. paratuberculosis broth suspensions was subsequently assessed using a Mastersizer X spectrometer. These experiments demonstrated that large clumps of M. avium subsp. paratuberculosis cells were reduced to single-cell or “miniclump” status by homogenization at 2,500 lb/in2. Consequently, when HTST pasteurization was being applied to homogenized milk, the M. avium subsp. paratuberculosis cells would have been present as predominantly declumped cells, which may possibly explain the greater inactivation achieved by the combination of pasteurization and homogenization. PMID:15932977

  9. Modeling the Homogenization Kinetics of As-Cast U-10wt% Mo alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zhijie; Joshi, Vineet; Hu, Shenyang Y.

    2016-01-15

    Low-enriched U-22 at% Mo (U-10Mo) alloy has been considered as an alternative material to replace the highly enriched fuels in research reactors. For U-10Mo to work effectively and replace the existing fuel material, a thorough understanding of the microstructure development from the as-cast to the final formed structure is required. The as-cast microstructure is typically inhomogeneous, with molybdenum-rich and molybdenum-lean regions that may affect the processing and possibly the in-reactor performance. This as-cast structure must be homogenized by thermal treatment to produce a uniform Mo distribution. The development of a modeling capability will improve the understanding of the effect of initial microstructures on the Mo homogenization kinetics. In the current work, we investigated the effect of the as-cast microstructure on the homogenization kinetics. The kinetics of homogenization was modeled based on a rigorous algorithm that relates line-scan data of Mo concentration to the gray scale in energy dispersive spectroscopy images, which was used to generate a reconstructed Mo concentration map. The map was then used as realistic microstructure input for physics-based homogenization models, in which the entire homogenization kinetics can be simulated and validated against the available experimental data at different homogenization times and temperatures.
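    The homogenization physics underlying such models is, at its core, diffusion of Mo down concentration gradients. The sketch below is a minimal stand-in, not the authors' reconstruction-based model: it relaxes an assumed one-dimensional Mo line-scan profile by Fick's second law with an explicit finite-difference scheme, using a placeholder interdiffusion coefficient and anneal time.

```python
import numpy as np

# Assumed inputs: a synthetic 1-D Mo concentration line scan (wt%) with
# Mo-rich and Mo-lean bands, and a constant interdiffusion coefficient.
nx, L = 200, 100e-6                  # number of nodes, domain length (m)
dx = L / (nx - 1)
x = np.linspace(0.0, L, nx)
c = 10.0 + 3.0 * np.sin(2.0 * np.pi * x / 20e-6)   # initial Mo profile, wt%

D = 1.0e-16                          # interdiffusion coefficient, m^2/s (placeholder)
dt = 0.4 * dx**2 / D                 # explicit stability limit: D*dt/dx^2 <= 0.5
t_final = 3600.0 * 10                # 10 h homogenization anneal (placeholder)

t = 0.0
while t < t_final:
    # Fick's second law, explicit FTCS update with zero-flux boundaries
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2.0 * c[1:-1] + c[:-2])
    c[0], c[-1] = c[1], c[-2]
    t += dt

print("residual Mo inhomogeneity (max-min): %.3f wt%%" % (c.max() - c.min()))
```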

  10. Ultra-thin carbon-fiber paper fabrication and carbon-fiber distribution homogeneity evaluation method

    NASA Astrophysics Data System (ADS)

    Zhang, L. F.; Chen, D. Y.; Wang, Q.; Li, H.; Zhao, Z. G.

    2018-01-01

    A preparation technology for ultra-thin carbon-fiber paper is reported. Carbon-fiber distribution homogeneity has a great influence on the properties of ultra-thin carbon-fiber paper. In this paper, a self-developed homogeneity analysis system is introduced to help users evaluate the distribution homogeneity of carbon fiber among two or more binary (two-value) images of carbon-fiber paper. A relative-uniformity factor W/H is introduced. The experimental results show that the smaller the W/H factor, the higher the uniformity of the carbon-fiber distribution. The new uniformity-evaluation method provides a practical and reliable tool for analyzing the homogeneity of materials.
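    The W/H relative-uniformity factor itself is not defined in this record. As a generic stand-in for scoring distribution homogeneity from a binary image, the sketch below uses the coefficient of variation of fiber coverage over square tiles; lower values indicate a more uniform distribution. The metric and the synthetic images are assumptions for illustration only.

```python
import numpy as np

def coverage_cv(binary_image, tile=32):
    """Coefficient of variation of fiber coverage over square tiles of a
    binary image (1 = fiber, 0 = background). Lower CV = more uniform."""
    h, w = binary_image.shape
    fractions = []
    for i in range(0, h - tile + 1, tile):
        for j in range(0, w - tile + 1, tile):
            fractions.append(binary_image[i:i + tile, j:j + tile].mean())
    fractions = np.array(fractions)
    return fractions.std() / fractions.mean()

# Synthetic example: a uniform random fiber map vs. a clustered one
rng = np.random.default_rng(0)
uniform   = (rng.random((256, 256)) < 0.3).astype(int)
clustered = uniform.copy()
clustered[:128, :128] = (rng.random((128, 128)) < 0.6).astype(int)

print("CV uniform   :", round(coverage_cv(uniform), 3))
print("CV clustered :", round(coverage_cv(clustered), 3))
```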

  11. Identification of hydrologically homogeneous regions in Ganga-Brahmaputra river basin using Self Organising Maps

    NASA Astrophysics Data System (ADS)

    Ojha, C. S. P.; Sharma, C.

    2017-12-01

    Identification of hydrologically homogeneous regions is crucial for topographically complex regions such as Himalayan river basins. The Ganga-Brahmaputra river basin extends through three countries: India, Nepal, and China. High elevations and rugged topography pose a challenge for in-situ gauges, so data from a hydrologically similar site are commonly used in the absence of site records. We identify hydrologically homogeneous regions in the Ganga-Brahmaputra river basin using a Self-Organising Map (SOM). The station characteristics used for the identification of homogeneous regions are annual precipitation, total wet-season (July to September) precipitation, total dry-season (January to March) precipitation, latitude, longitude, and elevation. Precipitation data were obtained from the Climate Research Unit (CRU). The number of clusters was determined using hierarchical k-means clustering, and we found that the basin can be divided into 9 clusters. Mere division of the basin into clusters does not, however, establish that the identified clusters are homogeneous; their homogeneity was therefore assessed using the Hosking-Wallis heterogeneity test. All clusters were found to be acceptably homogeneous, with the Hosking-Wallis test statistic H < 1.
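    A minimal sketch of the clustering step described above: k-means with 9 clusters on standardized station attributes, here generated synthetically as stand-ins for the CRU-derived values. The subsequent Hosking-Wallis heterogeneity test on the resulting groups is not reproduced.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic station attributes: annual, wet-season and dry-season precipitation,
# latitude, longitude, elevation (placeholders for the CRU-derived values).
rng = np.random.default_rng(42)
n_stations = 300
X = np.column_stack([
    rng.gamma(5.0, 200.0, n_stations),     # annual precipitation, mm
    rng.gamma(4.0, 150.0, n_stations),     # wet-season (Jul-Sep) precipitation, mm
    rng.gamma(2.0, 30.0, n_stations),      # dry-season (Jan-Mar) precipitation, mm
    rng.uniform(22.0, 31.0, n_stations),   # latitude, deg N
    rng.uniform(73.0, 97.0, n_stations),   # longitude, deg E
    rng.uniform(0.0, 5000.0, n_stations),  # elevation, m
])

# Standardize the attributes, then partition the stations into 9 clusters.
labels = KMeans(n_clusters=9, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(X))
print("stations per cluster:", np.bincount(labels))
```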

  12. Homogenization patterns of the world’s freshwater fish faunas

    PubMed Central

    Villéger, Sébastien; Blanchet, Simon; Beauchard, Olivier; Oberdorff, Thierry; Brosse, Sébastien

    2011-01-01

    The world is currently undergoing an unprecedented decline in biodiversity, which is mainly attributable to human activities. For instance, nonnative species introduction, combined with the extirpation of native species, affects biodiversity patterns, notably by increasing the similarity among species assemblages. This biodiversity change, called taxonomic homogenization, has rarely been assessed at the world scale. Here, we fill this gap by assessing the current homogenization status of one of the most diverse vertebrate groups (i.e., freshwater fishes) at global and regional scales. We demonstrate that current homogenization of the freshwater fish faunas is still low at the world scale (0.5%) but reaches substantial levels (up to 10%) in some highly invaded river basins from the Nearctic and Palearctic realms. In these realms experiencing high changes, nonnative species introductions rather than native species extirpations drive taxonomic homogenization. Our results suggest that the “Homogocene era” is not yet the case for freshwater fish fauna at the worldwide scale. However, the distressingly high level of homogenization noted for some biogeographical realms stresses the need for further understanding of the ecological consequences of homogenization processes. PMID:22025692

  13. Homogenization patterns of the world's freshwater fish faunas.

    PubMed

    Villéger, Sébastien; Blanchet, Simon; Beauchard, Olivier; Oberdorff, Thierry; Brosse, Sébastien

    2011-11-01

    The world is currently undergoing an unprecedented decline in biodiversity, which is mainly attributable to human activities. For instance, nonnative species introduction, combined with the extirpation of native species, affects biodiversity patterns, notably by increasing the similarity among species assemblages. This biodiversity change, called taxonomic homogenization, has rarely been assessed at the world scale. Here, we fill this gap by assessing the current homogenization status of one of the most diverse vertebrate groups (i.e., freshwater fishes) at global and regional scales. We demonstrate that current homogenization of the freshwater fish faunas is still low at the world scale (0.5%) but reaches substantial levels (up to 10%) in some highly invaded river basins from the Nearctic and Palearctic realms. In these realms experiencing high changes, nonnative species introductions rather than native species extirpations drive taxonomic homogenization. Our results suggest that the "Homogocene era" is not yet the case for freshwater fish fauna at the worldwide scale. However, the distressingly high level of homogenization noted for some biogeographical realms stresses the need for further understanding of the ecological consequences of homogenization processes.

  14. Effect of dynamic high pressure homogenization on the aggregation state of soy protein.

    PubMed

    Keerati-U-Rai, Maneephan; Corredig, Milena

    2009-05-13

    Although soy proteins are often employed as functional ingredients in oil-water emulsions, very little is known about the aggregation state of the proteins in solution and whether any changes occur to soy protein dispersions during homogenization. The effect of dynamic high pressure homogenization on the aggregation state of the proteins was investigated using microdifferential scanning calorimetry and high performance size exclusion chromatography coupled with multiangle laser light scattering. Soy protein isolates as well as glycinin and beta-conglycinin fractions were prepared from defatted soy flakes and redispersed in 50 mM sodium phosphate buffer at pH 7.4. The dispersions were then subjected to homogenization at two different pressures, 26 and 65 MPa. The results demonstrated that dynamic high pressure homogenization causes changes in the supramolecular structure of the soy proteins. Both beta-conglycinin and glycinin samples had an increased temperature of denaturation after homogenization. The chromatographic elution profile showed a reduction in the aggregate concentration with homogenization pressure for beta-conglycinin and an increase in the size of the soluble aggregates for glycinin and soy protein isolate.

  15. Homogeneity of lava flows - Chemical data for historic Mauna Loan eruptions

    NASA Technical Reports Server (NTRS)

    Rhodes, J. M.

    1983-01-01

    Chemical analyses of basalts collected from the major historic eruptions of Mauna Loa volcano show that many of the flow fields are remarkably homogeneous in composition. Despite their large size (lengths 9-85 km), large areal extents (13-114 sq km), and various durations of eruption (1-450 days), many of the flow fields have compositional variability that is within, or close to, the analytical error for most elements. The flow fields that are not homogeneous vary mainly in olivine content in an otherwise homogeneous melt. Some are composite flow fields made up of several, apparently homogeneous subunits erupted at different elevations along the active volcanic rifts. Not all volcanoes produce lavas that are homogeneous like those of Mauna Loa. If studies such as this are to be used to evaluate compositional diversity in lavas where there is a lack of sampling control, such as on other planets, it is necessary to understand why some flow units and flow fields are compositionally homogeneous and others are not, and to develop criteria for distinguishing between them.

  16. Fracture of Rolled Homogeneous Steel Armor (Nucleation Threshold Stress).

    DTIC Science & Technology

    1980-01-01

    Report ARBRL-MR-02984, Army Armament Research and Development Command, Aberdeen Proving Ground, by Gerald L. Moss and Lynn Seaman. Keywords: crack nucleation stress, crack threshold stress, fracture, fracture stress, spallation, armor, rolled homogeneous steel armor.

  17. Salty popcorn in a homogeneous low-dimensional toy model of holographic QCD

    NASA Astrophysics Data System (ADS)

    Elliot-Ripley, Matthew

    2017-04-01

    Recently, a homogeneous ansatz has been used to study cold dense nuclear matter in the Sakai-Sugimoto model of holographic QCD. To justify this homogeneous approximation we here investigate a homogeneous ansatz within a low-dimensional toy version of Sakai-Sugimoto to study finite baryon density configurations and compare it to full numerical solutions. We find the ansatz corresponds to enforcing a dyon salt arrangement in which the soliton solutions are split into half-soliton layers. Within this ansatz we find analogues of the proposed baryonic popcorn transitions, in which solutions split into multiple layers in the holographic direction. The homogeneous results are found to qualitatively match the full numerical solutions, lending confidence to the homogeneous approximations of the full Sakai-Sugimoto model. In addition, we find exact compact solutions in the high density, flat space limit which demonstrate the existence of further popcorn transitions to three layers and beyond.

  18. Pseudo-thermosetting chitosan hydrogels for biomedical application.

    PubMed

    Berger, J; Reist, M; Chenite, A; Felt-Baeyens, O; Mayer, J M; Gurny, R

    2005-01-20

    To prepare transparent chitosan/beta-glycerophosphate (betaGP) pseudo-thermosetting hydrogels, the deacetylation degree (DD) of chitosan has been modified by reacetylation with acetic anhydride. Two methods (I and II) of reacetylation have been compared and have shown that the use of previously filtered chitosan, dilution of acetic anhydride and reduction of temperature in method II improves efficiency and reproducibility. Chitosans with DD ranging from 35.0 to 83.2% have been prepared according to method II under homogeneous and non-homogeneous reacetylation conditions and the turbidity of chitosan/betaGP hydrogels containing homogeneously or non-homogeneously reacetylated chitosan has been investigated. Turbidity is shown to be modulated by the DD of chitosan and by the homogeneity of the medium during reacetylation, which influences the distribution mode of the chitosan monomers. The preparation of transparent chitosan/betaGP hydrogels requires a homogeneously reacetylated chitosan with a DD between 35 and 50%.

  19. Pseudo-thermosetting chitosan hydrogels for biomedical application.

    PubMed

    Berger, J; Reist, M; Chenite, A; Felt-Baeyens, O; Mayer, J M; Gurny, R

    2005-01-06

    To prepare transparent chitosan/beta-glycerophosphate (betaGP) pseudo-thermosetting hydrogels, the deacetylation degree (DD) of chitosan has been modified by reacetylation with acetic anhydride. Two methods (I and II) of reacetylation have been compared and have shown that the use of previously filtered chitosan, dilution of acetic anhydride and reduction of temperature in method II improves efficiency and reproducibility. Chitosans with DD ranging from 35.0 to 83.2% have been prepared according to method II under homogeneous and non-homogeneous reacetylation conditions and the turbidity of chitosan/betaGP hydrogels containing homogeneously or non-homogeneously reacetylated chitosan has been investigated. Turbidity is shown to be modulated by the DD of chitosan and by the homogeneity of the medium during reacetylation, which influences the distribution mode of the chitosan monomers. The preparation of transparent chitosan/betaGP hydrogels requires a homogeneously reacetylated chitosan with a DD between 35 and 50%.

  20. Data on the effect of homogenization heat treatments on the cast structure and tensile properties of alloy 718Plus in the presence of grain-boundary elements.

    PubMed

    Hosseini, Seyed Ali; Madar, Karim Zangeneh; Abbasi, Seyed Mehdi

    2017-08-01

    The segregation of elements during solidification and the direct formation of destructive phases such as Laves from the liquid result in inhomogeneity of the cast structure and degradation of mechanical properties. Homogenization heat treatment is one of the ways to eliminate the destructive Laves phase from the cast structure of superalloys such as 718Plus. The collected data present the effect of homogenization treatment conditions on the cast structure, hardness, and tensile properties of alloy 718Plus in the presence of boron and zirconium additives. For this purpose, five alloys with different contents of boron and zirconium were cast by the VIM/VAR process and then homogenized under various conditions. Microstructural investigation by OM and SEM and phase analysis by XRD were carried out, and hardness and tensile tests were then performed on the homogenized alloys.

  1. Numerical modeling of the acoustic wave propagation across a homogenized rigid microstructure in the time domain

    NASA Astrophysics Data System (ADS)

    Lombard, Bruno; Maurel, Agnès; Marigo, Jean-Jacques

    2017-04-01

    Homogenization of a thin micro-structure yields effective jump conditions that incorporate the geometrical features of the scatterers. These jump conditions apply across a thin but nonzero-thickness interface whose interior is disregarded. This paper aims (i) to propose a numerical method able to handle the jump conditions in order to simulate the homogenized problem in the time domain, and (ii) to inspect the validity of the homogenized problem when compared to the real one. For this purpose, we adapt the Explicit Simplified Interface Method, originally developed for standard jump conditions across a zero-thickness interface. Doing so allows us to handle arbitrary-shaped interfaces on a Cartesian grid with the same efficiency and accuracy of the numerical scheme as obtained in a homogeneous medium. Numerical experiments are performed to test the properties of the numerical method and to inspect the validity of the homogenized problem.

  2. Cosmic homogeneity: a spectroscopic and model-independent measurement

    NASA Astrophysics Data System (ADS)

    Gonçalves, R. S.; Carvalho, G. C.; Bengaly, C. A. P., Jr.; Carvalho, J. C.; Bernui, A.; Alcaniz, J. S.; Maartens, R.

    2018-03-01

    Cosmology relies on the Cosmological Principle, i.e. the hypothesis that the Universe is homogeneous and isotropic on large scales. This implies in particular that the counts of galaxies should approach a homogeneous scaling with volume at sufficiently large scales. Testing homogeneity is crucial to obtain a correct interpretation of the physical assumptions underlying the current cosmic acceleration and structure formation of the Universe. In this letter, we use the Baryon Oscillation Spectroscopic Survey to make the first spectroscopic and model-independent measurements of the angular homogeneity scale θh. Applying four statistical estimators, we show that the angular distribution of galaxies in the range 0.46 < z < 0.62 is consistent with homogeneity at large scales, and that θh varies with redshift, indicating a smoother Universe in the past. These results are in agreement with the foundations of the standard cosmological paradigm.
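    The underlying test can be pictured with scaled counts-in-spheres: for a homogeneous distribution the mean number of neighbours within a radius grows as the enclosing volume (or area, in 2D), so the ratio to that expectation tends to 1 at large scales. The sketch below illustrates this in a two-dimensional patch with uniform random points; it is a toy illustration, not one of the four estimators applied to the BOSS data.

```python
import numpy as np

rng = np.random.default_rng(1)
n, box = 2000, 10.0                       # uniform random points in a (box x box) patch
pts = rng.uniform(0.0, box, size=(n, 2))
density = n / box**2

# Use only central points as circle centres, so no edge correction is needed.
r_max = 1.0
central = pts[(pts[:, 0] > r_max) & (pts[:, 0] < box - r_max) &
              (pts[:, 1] > r_max) & (pts[:, 1] < box - r_max)]

# Pairwise squared distances from the central points to all points.
d2 = ((central[:, None, :] - pts[None, :, :]) ** 2).sum(axis=2)

for r in (0.2, 0.5, 1.0):
    counts = (d2 < r**2).sum(axis=1) - 1            # exclude the centre itself
    scaled = counts.mean() / (density * np.pi * r**2)
    print(f"r = {r:.1f}: scaled count N/N_homogeneous = {scaled:.3f}")
```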

  3. Immortal homogeneous Ricci flows

    NASA Astrophysics Data System (ADS)

    Böhm, Christoph; Lafuente, Ramiro A.

    2018-05-01

    We show that for an immortal homogeneous Ricci flow solution any sequence of parabolic blow-downs subconverges to a homogeneous expanding Ricci soliton. This is established by constructing a new Lyapunov function based on curvature estimates which come from real geometric invariant theory.

  4. Peripheral nerve magnetic stimulation: influence of tissue non-homogeneity

    PubMed Central

    Krasteva, Vessela TZ; Papazov, Sava P; Daskalov, Ivan K

    2003-01-01

    Background Peripheral nerves are situated in a highly non-homogeneous environment, including muscles, bones, blood vessels, etc. Time-varying magnetic field stimulation of the median and ulnar nerves in the carpal region is studied, with special consideration of the influence of non-homogeneities. Methods A detailed three-dimensional finite element model (FEM) of the anatomy of the wrist region was built to assess the induced currents distribution by external magnetic stimulation. The electromagnetic field distribution in the non-homogeneous domain was defined as an internal Dirichlet problem using the finite element method. The boundary conditions were obtained by analysis of the vector potential field excited by external current-driven coils. Results The results include evaluation and graphical representation of the induced current field distribution at various stimulation coil positions. Comparative study for the real non-homogeneous structure with anisotropic conductivities of the tissues and a mock homogeneous media is also presented. The possibility of achieving selective stimulation of either of the two nerves is assessed. Conclusion The model developed could be useful in theoretical prediction of the current distribution in the nerves during diagnostic stimulation and therapeutic procedures involving electromagnetic excitation. The errors in applying homogeneous domain modeling rather than real non-homogeneous biological structures are demonstrated. The practical implications of the applied approach are valid for any arbitrary weakly conductive medium. PMID:14693034

  5. Pattern and process of biotic homogenization in the New Pangaea

    PubMed Central

    Baiser, Benjamin; Olden, Julian D.; Record, Sydne; Lockwood, Julie L.; McKinney, Michael L.

    2012-01-01

    Human activities have reorganized the earth's biota resulting in spatially disparate locales becoming more or less similar in species composition over time through the processes of biotic homogenization and biotic differentiation, respectively. Despite mounting evidence suggesting that this process may be widespread in both aquatic and terrestrial systems, past studies have predominantly focused on single taxonomic groups at a single spatial scale. Furthermore, change in pairwise similarity is itself dependent on two distinct processes, spatial turnover in species composition and changes in gradients of species richness. Most past research has failed to disentangle the effect of these two mechanisms on homogenization patterns. Here, we use recent statistical advances and collate a global database of homogenization studies (20 studies, 50 datasets) to provide the first global investigation of the homogenization process across major faunal and floral groups and elucidate the relative role of changes in species richness and turnover. We found evidence of homogenization (change in similarity ranging from −0.02 to 0.09) across nearly all taxonomic groups, spatial extent and grain sizes. Partitioning of change in pairwise similarity shows that overall change in community similarity is driven by changes in species richness. Our results show that biotic homogenization is truly a global phenomenon and put into question many of the ecological mechanisms invoked in previous studies to explain patterns of homogenization. PMID:23055062

  6. Pattern and process of biotic homogenization in the New Pangaea.

    PubMed

    Baiser, Benjamin; Olden, Julian D; Record, Sydne; Lockwood, Julie L; McKinney, Michael L

    2012-12-07

    Human activities have reorganized the earth's biota resulting in spatially disparate locales becoming more or less similar in species composition over time through the processes of biotic homogenization and biotic differentiation, respectively. Despite mounting evidence suggesting that this process may be widespread in both aquatic and terrestrial systems, past studies have predominantly focused on single taxonomic groups at a single spatial scale. Furthermore, change in pairwise similarity is itself dependent on two distinct processes, spatial turnover in species composition and changes in gradients of species richness. Most past research has failed to disentangle the effect of these two mechanisms on homogenization patterns. Here, we use recent statistical advances and collate a global database of homogenization studies (20 studies, 50 datasets) to provide the first global investigation of the homogenization process across major faunal and floral groups and elucidate the relative role of changes in species richness and turnover. We found evidence of homogenization (change in similarity ranging from -0.02 to 0.09) across nearly all taxonomic groups, spatial extent and grain sizes. Partitioning of change in pairwise similarity shows that overall change in community similarity is driven by changes in species richness. Our results show that biotic homogenization is truly a global phenomenon and put into question many of the ecological mechanisms invoked in previous studies to explain patterns of homogenization.
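    A minimal sketch of the quantity at the heart of these two records: the change in pairwise compositional similarity (Jaccard, here) between two localities from a historical to a current species list. A positive change indicates homogenization, a negative one differentiation. The species lists are invented for illustration.

```python
def jaccard(a, b):
    """Jaccard similarity of two species sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Hypothetical historical and current assemblages of two localities
hist_1, hist_2 = {"sp_a", "sp_b", "sp_c"}, {"sp_c", "sp_d", "sp_e"}
curr_1 = (hist_1 - {"sp_b"}) | {"sp_x"}          # native loss + shared invader
curr_2 = (hist_2 - {"sp_e"}) | {"sp_x"}

delta = jaccard(curr_1, curr_2) - jaccard(hist_1, hist_2)
print("change in pairwise similarity:", round(delta, 3))   # > 0 -> homogenization
```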

  7. Microstructural characterization of Ni-based self-fluxing alloy after selective surface-engineering using diode laser

    NASA Astrophysics Data System (ADS)

    Chun, Eun-Joon; Park, Changkyoo; Nishikawa, Hiroshi; Kim, Min-Su

    2018-06-01

    The microstructural characterization of a thermal-sprayed Ni-based self-fluxing alloy (Metco-16C®) after laser-assisted homogenization treatment was performed. To this end, a high-power diode laser system was used, which supported real-time control of the target homogenization temperature at the substrate surface. Non-homogeneities of the as-sprayed state, namely macrosegregation of certain elements (C and Cu) and local concentration of Cr-based carbides and borides in certain regions, could be remedied by the homogenization treatment. After homogenization at 1423 K, the hardness of the thermal-sprayed layer increased to 1280 HV from 750 HV in the as-sprayed state. At this homogenization temperature, the microstructure of the thermal-sprayed layer consisted of a lamellar structure of the matrix phases (austenite and Ni3Si) with fine (<5 μm) carbides and borides (rod-like Cr5B3, lumpy M23C6, and extra-fine M7C3). In contrast, despite the formation of several kinds of carbides and borides during homogenization at 1473 K, the hardness fell below that of the as-sprayed state, because the liquid-state homogenization treatment at this temperature did not produce the lamellar austenite/Ni3Si structure.

  8. De l'importance des orbites periodiques: Detection et applications

    NASA Astrophysics Data System (ADS)

    Doyon, Bernard

    The set of Unstable Periodic Orbits (UPOs) of a chaotic system is intimately related to its dynamical properties. From the (in principle infinite) set of UPOs hidden in phase space, one can obtain important dynamical quantities such as the Lyapunov exponents, the invariant measure, the topological entropy and the fractal dimension. In quantum chaos (i.e., the study of quantum systems that have a chaotic equivalent in the classical limit), these same UPOs provide the bridge between the classical and quantum behaviour of non-integrable systems. Locating these fundamental cycles is a complex problem. This thesis first addresses the problem of detecting UPOs in chaotic systems. A comparative study of two recent algorithms is presented. We examine both methods in depth in order to apply them to different systems, including dissipative and conservative continuous flows. An analysis of the convergence rate of the algorithms is also carried out in order to identify the strengths and limitations of these numerical schemes. The detection methods we use rely on a particular transformation of the initial dynamics. This trick inspired an alternative method for targeting and stabilizing an arbitrary periodic orbit in a chaotic system. Targeting is generally combined with control methods to stabilize a given cycle rapidly; in general, the position and stability of the cycle in question must be known. The new targeting method we present does not require a priori knowledge of the position and stability of the periodic orbits. It could be a complementary tool to current targeting and control methods.

  9. Effets non lineaires transversaux dans les guides d'ondes plans

    NASA Astrophysics Data System (ADS)

    Dumais, Patrick

    Transverse nonlinear effects due to the non-resonant optical Kerr effect are studied in two types of planar-geometry waveguides. First (in Chapter 2), the emission of spatial solitons from a channel waveguide is studied historically, analytically and numerically, with the aim of designing and fabricating such a device in AlGaAs in the spectral region below half the bandgap of this material, i.e. around 1.5 microns. The component, as designed, includes a multiple-quantum-well structure. Local disordering of this structure allows a local variation of the Kerr coefficient in the guide, which leads to the emission of a spatial soliton above a threshold optical power. The experimental observation of an intensity-dependent change in the field profile at the output of the fabricated guide is presented. Second (in Chapter 3), a technique for measuring the Kerr coefficient in a planar waveguide is presented. It consists of measuring the change in transmission through a mask placed at the output of the guide as a function of the peak intensity at the input of the planar guide. A method for determining the optimal conditions for the sensitivity of the measurement is presented and illustrated with several examples. Finally, the realization of an optical parametric oscillator based on a periodically poled lithium niobate crystal is presented. The theory of optical parametric oscillators is laid out, with an emphasis on the generation of intense pulses at wavelengths around 1.5 microns from a Ti:sapphire laser, in order to obtain a source for the soliton-emission experiments.

  10. A Class of Homogeneous Scalar Tensor Cosmologies with a Radiation Fluid

    NASA Astrophysics Data System (ADS)

    Yazadjiev, Stoytcho S.

    We present a new class of exact homogeneous cosmological solutions with a radiation fluid for all scalar tensor theories. The solutions belong to Bianchi type VIh cosmologies. Explicit examples of nonsingular homogeneous scalar tensor cosmologies are also given.

  11. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2012-01-01

    The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Training the users on homogenization software was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that automatic algorithms can perform as well as manual ones.
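    Two of the performance metrics named above are straightforward to compute for a single station series: the centred root-mean-square error of a homogenized series relative to the true homogeneous series, and the error in the estimated linear trend. The sketch below does this for synthetic series with one residual break; it is an illustration of the metrics, not of the benchmark dataset itself.

```python
import numpy as np

def centred_rmse(homogenized, truth):
    """Centred RMSE: RMS difference after removing each series' own mean."""
    a = homogenized - homogenized.mean()
    b = truth - truth.mean()
    return np.sqrt(np.mean((a - b) ** 2))

def trend_error(homogenized, truth):
    """Difference of linear trends (per time step) estimated by least squares."""
    t = np.arange(len(truth))
    return np.polyfit(t, homogenized, 1)[0] - np.polyfit(t, truth, 1)[0]

# Synthetic monthly temperature series: truth, plus a version with an
# imperfectly removed break (a stand-in for a homogenization contribution).
rng = np.random.default_rng(3)
n = 600
truth = 0.001 * np.arange(n) + rng.normal(0.0, 0.5, n)
homogenized = truth.copy()
homogenized[300:] += 0.2          # residual inhomogeneity of 0.2 K after month 300

print("centred RMSE :", round(centred_rmse(homogenized, truth), 3))
print("trend error  :", round(trend_error(homogenized, truth), 5))
```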

  12. Spatial homogenization methods for pin-by-pin neutron transport calculations

    NASA Astrophysics Data System (ADS)

    Kozlowski, Tomasz

    For practical reactor core applications, low-order transport approximations such as SP3 have been shown to provide sufficient accuracy for both static and transient calculations with considerably less computational expense than the discrete ordinates or full spherical harmonics methods. These methods have been applied in several core simulators where homogenization was performed at the level of the pin cell. One of the principal problems has been to recover the error introduced by pin-cell homogenization. Two basic approaches to treat the pin-cell homogenization error have been proposed: Superhomogenization (SPH) factors and Pin-Cell Discontinuity Factors (PDFs). These methods are based on the well-established Equivalence Theory and Generalized Equivalence Theory to generate appropriate group constants, and they are able to treat all sources of error together, allowing even few-group diffusion with one mesh per cell to reproduce the reference solution. A detailed investigation and consistent comparison of both homogenization techniques showed the potential of the PDF approach to improve the accuracy of core calculations, but also revealed its limitation: in principle, the method is applicable only for the boundary conditions at which the factors were created, i.e., for the boundary conditions considered during the homogenization process, normally zero net current. There is therefore a need to improve this method, making it more general and environment-independent. The goal of the proposed general homogenization technique is to create a function that correctly predicts the appropriate correction factor with only homogeneous information available, i.e., a function based on the heterogeneous solution that can approximate PDFs using the homogeneous solution. It has been shown that the PDF can be well approximated by a least-squares polynomial fit of a non-dimensional heterogeneous solution and later used for PDF prediction from the homogeneous solution. This shows promise for PDF prediction at off-reference conditions, such as during reactor transients, which produce conditions that cannot typically be anticipated a priori.
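    A minimal sketch of the fitting idea described above: approximate a correction factor (a stand-in for a pin-cell discontinuity factor) by a least-squares polynomial in a non-dimensional feature of the cell solution, then predict it for an off-reference condition. The feature, the reference values and the quadratic form are all invented for illustration.

```python
import numpy as np

# Training data (invented): a non-dimensional feature of the cell solution,
# e.g. a surface-to-average flux ratio, and the reference correction factor
# obtained from heterogeneous transport calculations.
rng = np.random.default_rng(7)
flux_ratio = np.linspace(0.85, 1.15, 40)
pdf_reference = 1.0 + 0.6 * (flux_ratio - 1.0) - 0.8 * (flux_ratio - 1.0) ** 2
pdf_reference += rng.normal(0.0, 0.002, flux_ratio.size)   # reference "noise"

# Least-squares quadratic fit of the correction factor versus the feature.
coeffs = np.polyfit(flux_ratio, pdf_reference, deg=2)

# Prediction for an off-reference condition using only homogeneous information.
new_ratio = 1.07
print("predicted correction factor:", round(np.polyval(coeffs, new_ratio), 4))
```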

  13. [Growth Factors and Interleukins in Amniotic Membrane Tissue Homogenate].

    PubMed

    Stachon, T; Bischoff, M; Seitz, B; Huber, M; Zawada, M; Langenbucher, A; Szentmáry, N

    2015-07-01

    Application of amniotic membrane homogenate eye drops may be a potential treatment alternative for therapy-resistant corneal epithelial defects. The purpose of this study was to determine the concentrations of epidermal growth factor (EGF), basic fibroblast growth factor (bFGF), hepatocyte growth factor (HGF), keratinocyte growth factor (KGF), interleukin-6 (IL-6) and interleukin-8 (IL-8) in amniotic membrane homogenates. Amniotic membranes of 8 placentas were prepared and then stored at -80 °C using the standard methods of the LIONS Cornea Bank Saar-Lor-Lux, Trier/Westpfalz. After thawing, the amniotic membranes were cut into two pieces and homogenized in liquid nitrogen. One part of the homogenate was prepared in cell-lysis buffer, the other in PBS. The tissue homogenates were stored at -20 °C until enzyme-linked immunosorbent assay (ELISA) analysis of the EGF, bFGF, HGF, KGF, IL-6 and IL-8 concentrations. Concentrations of KGF, IL-6 and IL-8 were below the detection limit with both preparation techniques. The EGF concentration in tissue homogenates treated with cell-lysis buffer (2412 pg/g tissue) was not significantly different from that of tissue homogenates treated with PBS (1586 pg/g tissue, p = 0.72). bFGF release was also not significantly different between cell-lysis buffer (3606 pg/g tissue) and PBS-treated tissue homogenates (4649 pg/g tissue, p = 0.35). HGF release was significantly lower with cell-lysis buffer (23,555 pg/g tissue) than with PBS-treated tissue (47,766 pg/g tissue, p = 0.007). As they contain EGF, bFGF and HGF and lack IL-6 and IL-8, amniotic membrane homogenate eye drops may be a potential treatment alternative for therapy-resistant corneal epithelial defects. Georg Thieme Verlag KG Stuttgart · New York.

  14. CO2-assisted high pressure homogenization: a solvent-free process for polymeric microspheres and drug-polymer composites.

    PubMed

    Kluge, Johannes; Mazzotti, Marco

    2012-10-15

    The study explores the enabling role of near-critical CO(2) as a reversible plasticizer in the high pressure homogenization of polymer particles, aiming at their comminution as well as at the formation of drug-polymer composites. First, the effect of near-critical CO(2) on the homogenization of aqueous suspensions of poly lactic-co-glycolic acid (PLGA) was investigated. Applying a pressure drop of 900 bar and up to 150 passes across the homogenizer, it was found that particles processed in the presence of CO(2) were generally of microspherical morphology and at all times significantly smaller than those obtained in the absence of a plasticizer. The smallest particles, exhibiting a median x(50) of 1.3 μm, were obtained by adding a small quantity of ethyl acetate, which exerts on PLGA an additional plasticizing effect during the homogenization step. Further, the study concerns the possibility of forming drug-polymer composites through simultaneous high pressure homogenization of the two relevant solids, and particularly the effect of near-critical CO(2) on this process. Therefore, PLGA was homogenized together with crystalline S-ketoprofen (S-KET), a non-steroidal anti-inflammatory drug, at a drug to polymer ratio of 1:10, a pressure drop of 900 bar and up to 150 passes across the homogenizer. When the process was carried out in the presence of CO(2), an impregnation efficiency of 91% has been reached, corresponding to 8.3 wt.% of S-KET in PLGA; moreover, composite particles were of microspherical morphology and significantly smaller than those obtained in the absence of CO(2). The formation of drug-polymer composites through simultaneous homogenization of the two materials is thus greatly enhanced by the presence of CO(2), which increases the efficiency for both homogenization and impregnation. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Some low homogenization pressures improve certain probiotic characteristics of yogurt culture bacteria and Lactobacillus acidophilus LA-K.

    PubMed

    Muramalla, T; Aryana, K J

    2011-08-01

    Lactobacillus delbrueckii ssp. bulgaricus, Streptococcus salivarius ssp. thermophilus, and Lactobacillus acidophilus are dairy cultures widely used in the manufacture of cultured dairy products. Commonly used homogenization pressures in the dairy industry are 13.80 MPa or less. It is not known whether low homogenization pressures can stimulate bacteria to improve their probiotic characteristics. Objectives were to determine the effect of homogenization at 0, 3.45, 6.90, 10.34, and 13.80 MPa on acid tolerance, bile tolerance, protease activity, and growth of L. delbrueckii ssp. bulgaricus LB-12, S. salivarius ssp. thermophilus ST-M5, and L. acidophilus LA-K. The cultures were individually inoculated in cool autoclaved skim milk (4°C) and homogenized for 5 continuous passes. Growth and bile tolerance of samples were determined hourly for 10h of incubation. Acid tolerance was determined every 20 min for 120 min of incubation. Protease activity was determined at 0, 12, and 24h of incubation. All homogenization pressures studied improved acid tolerance of L. delbrueckii ssp. bulgaricus LB-12 but had no beneficial effect on protease activity and had negative effects on growth and bile tolerance. A pressure of 6.90 MPa improved acid tolerance, bile tolerance, and protease activity of S. salivarius ssp. thermophilus ST-M5, but none of the homogenization pressures studied had an effect on its growth. Homogenization pressures of 13.80 and 6.90 MPa improved acid tolerance and bile tolerance, respectively, of L. acidophilus LA-K but had no effect on protease activity and its growth. Some low homogenization pressures positively influenced some characteristics of yogurt culture bacteria and L. acidophilus LA-K. Culture pretreatment with some low homogenization pressures can be recommended for improvement of certain probiotic characteristics. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  16. SU-E-T-76: Comparing Homogeneity Between Gafchromic Film EBT2 and EBT3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mizuno, H; Sumida, I; Ogawa, K

    2014-06-01

    Purpose: In a previous study we found that the homogeneity of EBT2 differed among lot numbers. Variation in local homogeneity of EBT3 among several lot numbers has not been reported. In this study, we investigated the film homogeneity of Gafchromic EBT3 films compared with EBT2 films. Methods: All sheets from five lots were cut into 12 pieces to investigate film homogeneity and were irradiated at 0.5, 2, and 3 Gy. To investigate intra- and inter-sheet uniformity, five sheets from five lots were exposed to 2 Gy: intra-sheet uniformity was evaluated by the coefficient of variation of homogeneity over all pieces of a single sheet, and inter-sheet uniformity was evaluated by the coefficient of variation of homogeneity among the same piece numbers across the five sheets. To investigate the difference in ADC value at various doses, a single sheet from each of the five lots was irradiated at 0.5 Gy and 3 Gy in addition to 2 Gy. A scan resolution of 72 dots per inch (dpi) and a color depth of 48-bit RGB were used. Films were analyzed with in-house software; the average ADC value in a central ROI and profiles along the X and Y axes were measured. Results and Conclusion: Intra-sheet uniformity of non-irradiated EBT2 films ranged from 0.1% to 0.4%, whereas that of irradiated EBT2 films ranged from 0.2% to 1.5%. In contrast, intra-sheet uniformity of both irradiated and non-irradiated EBT3 films ranged from 0.2% to 0.6%. Inter-sheet uniformity of all films was less than 0.5%. Interestingly, the homogeneity of EBT3 was similar for non-irradiated and irradiated films, whereas EBT2 showed a dose dependence of homogeneity in the ADC value evaluation. These results suggest that the homogeneity of EBT3 has been corrected with respect to this dose dependence.
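
    The intra- and inter-sheet uniformity metrics described above are plain coefficient-of-variation calculations. The sketch below is a minimal illustration (not the authors' in-house software; the array of mean ADC values is synthetic) of how both quantities could be computed from the 12 pieces of each of the five sheets.

```python
import numpy as np

# Hypothetical mean ADC values: rows = 5 sheets (one per lot), columns = 12 pieces per sheet.
adc = np.random.default_rng(0).normal(30000, 150, size=(5, 12))

# Intra-sheet uniformity: CV of the 12 piece means within each sheet.
intra_cv = adc.std(axis=1, ddof=1) / adc.mean(axis=1) * 100  # one value per sheet, in %

# Inter-sheet uniformity: CV among the same piece number across the 5 sheets.
inter_cv = adc.std(axis=0, ddof=1) / adc.mean(axis=0) * 100  # one value per piece position, in %

print("intra-sheet CV (%):", np.round(intra_cv, 2))
print("inter-sheet CV (%):", np.round(inter_cv, 2))
```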

  17. Enhanced Detection of Vibrio Cholerae in Oyster Homogenate Based on Centrifugal Removal of Inhibitory Agents

    NASA Technical Reports Server (NTRS)

    Alexander, Donita; DePaola, Angelo; Young, Ronald B.

    1998-01-01

    The disease cholera, caused by Vibrio cholerae, has been associated with consumption of contaminated seafood, including raw oysters. Detection of V. cholerae in foods typically involves blending the oysters, diluting the homogenate in alkaline peptone water (APW), overnight enrichment, and isolation on selective agar. Unfortunately, the oyster homogenate must be diluted to large volumes because lower dilutions inhibit the growth of V. cholerae. The goals of this study were to develop an alternative to large dilutions and to evaluate the basis for the inhibition observed in lower dilutions of oyster homogenates. Centrifugation of oyster homogenates at 10,000 x g for 15 min, followed by enrichment of the resulting pellet in APW, was found to eliminate the inhibition of V. cholerae growth. Inhibition appears not to be due to competing microflora but to a component(s) released when V. cholerae grows in the presence of oyster homogenate. The inhibitory component(s) kills the V. cholerae after the cell concentration reaches > 10(exp 8) cells/mL, rather than initially preventing their growth. The pH also declines from 8.0 to 5.5 during this period; however, the pH decline by itself appears not to cause V. cholerae death. Seven strains of V. cholerae (01 and non-01) and two strains of V. vulnificus were susceptible to the inhibitory agent(s). However, other Vibrio and non-Vibrio species tested were not inhibited by the oyster homogenates. Based on digestion of oyster homogenates with pronase, trypsin and lipase, the inhibitory reaction involves a protein(s). In a preliminary trial with oyster homogenate seeded with 1 cfu/g of V. cholerae, the modified centrifugation technique detected a slightly higher percentage of samples at a 1:10 dilution than the standard FDA Bacteriological Analytical Method (BAM) detected in uncentrifuged oyster homogenate at a 1:100 dilution. V. cholerae in seeded samples could also be detected more frequently by the modified centrifugation method than by PCR at a 1:10 dilution.

  18. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study investigated first the homogenization kinetics of the different systems to focus then on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was equally sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for development and production of pharmaceutical suspensions in the future. Copyright © 2011 Wiley-Liss, Inc.

  19. Improving homogeneity by dynamic speed limit systems.

    PubMed

    van Nes, Nicole; Brandenburg, Stefan; Twisk, Divera

    2010-05-01

    Homogeneity of driving speeds is an important variable in determining road safety; more homogeneous driving speeds increase road safety. This study investigates the effect of introducing dynamic speed limit systems on homogeneity of driving speeds. A total of 46 subjects twice drove a route along 12 road sections in a driving simulator. The speed limit system (static-dynamic), the sophistication of the dynamic speed limit system (basic roadside, advanced roadside, and advanced in-car) and the situational condition (dangerous-non-dangerous) were varied. The homogeneity of driving speed, the rated credibility of the posted speed limit and the acceptance of the different dynamic speed limit systems were assessed. The results show that the homogeneity of individual speeds, defined as the variation in driving speed for an individual subject along a particular road section, was higher with the dynamic speed limit system than with the static speed limit system. The more sophisticated dynamic speed limit system tested within this study led to higher homogeneity than the less sophisticated systems. The acceptance of the dynamic speed limit systems used in this study was positive, they were perceived as quite useful and rather satisfactory. Copyright (c) 2009 Elsevier Ltd. All rights reserved.

  20. Generation of phase II in vitro metabolites using homogenized horse liver.

    PubMed

    Wong, Jenny K Y; Chan, George H M; Leung, David K K; Tang, Francis P W; Wan, Terence S M

    2016-02-01

    The successful use of homogenized horse liver for the generation of phase I in vitro metabolites has been previously reported by the authors' laboratory. Prior to the use of homogenized liver, the authors' laboratory had been using mainly horse liver microsomes for carrying out equine in vitro metabolism studies. Homogenized horse liver has shown significant advantages over liver microsomes for in vitro metabolism studies as the procedures are much quicker and have higher capability for generating more in vitro metabolites. In this study, the use of homogenized liver has been extended to the generation of phase II in vitro metabolites (glucuronide and/or sulfate conjugates) using 17β-estradiol, morphine, and boldenone undecylenate as model substrates. It was observed that phase II metabolites could also be generated even without the addition of cofactors. To the authors' knowledge, this is the first report of the successful use of homogenized horse liver for the generation of phase II metabolites. It also demonstrates the ease with which both phase I and phase II metabolites can now be generated in vitro simply by using homogenized liver without the need for ultracentrifuges or tedious preparation steps. Copyright © 2015 John Wiley & Sons, Ltd.

  1. Searching regional rainfall homogeneity using atmospheric fields

    NASA Astrophysics Data System (ADS)

    Gabriele, Salvatore; Chiaravalloti, Francesco

    2013-03-01

    The correct identification of homogeneous areas in regional rainfall frequency analysis is fundamental to ensure the best selection of the probability distribution and the regional model, which produce low bias and low root mean square error in quantile estimation. In an attempt to characterize the spatial homogeneity of rainfall, the paper explores a new approach based on meteo-climatic information. The results are verified ex post using standard homogeneity tests applied to the annual maximum daily rainfall series. The first step of the proposed procedure selects two different types of homogeneous large regions: convective macro-regions, which contain high values of the Convective Available Potential Energy index, normally associated with convective rainfall events, and stratiform macro-regions, which are characterized by low values of the Q-vector divergence index, associated with dynamic instability and stratiform precipitation. These macro-regions are identified using Hot Spot Analysis to emphasize clusters of extreme values of the indices. In the second step, inside each identified macro-region, homogeneous sub-regions are found using kriging interpolation on the mean direction of the Vertically Integrated Moisture Flux. To check the proposed procedure, two detailed examples of homogeneous sub-regions are examined.

  2. Optimization of the Magnetic Field Homogeneity Area for Solenoid Type Magnets

    NASA Astrophysics Data System (ADS)

    Perepelkin, Eugene; Polyakova, Rima; Tarelkin, Aleksandr; Kovalenko, Alexander; Sysoev, Pavel; Sadovnikova, Marianne; Yudin, Ivan

    2018-02-01

    Homogeneous magnetic fields are important requisites in modern physics research. In this paper we discuss the problem of magnetic field homogeneity area maximization for solenoid magnets. We discuss A-model and B-model, which are basic types of solenoid magnets used to provide a homogeneous field, and methods for their optimization. We propose C-model which can be used for the NICA project. We have also carried out a cross-check of the C-model with the parameters stated for the CLEO II detector.

  3. Method of chaotic mixing and improved stirred tank reactors

    DOEpatents

    Muzzio, F.J.; Lamberto, D.J.

    1999-07-13

    The invention provides a method and apparatus for efficiently achieving a homogeneous mixture of fluid components by introducing said components, having a Reynolds number of between about ≤1 and about 500, into a vessel and continuously perturbing the mixing flow by altering the flow speed and mixing time until homogeneity is reached. This method prevents the components from aggregating into non-homogeneous segregated regions within said vessel during mixing and substantially reduces the time required for the admixed components to reach homogeneity. 19 figs.

  4. Stabilisation of perturbed chains of integrators using Lyapunov-based homogeneous controllers

    NASA Astrophysics Data System (ADS)

    Harmouche, Mohamed; Laghrouche, Salah; Chitour, Yacine; Hamerlain, Mustapha

    2017-12-01

    In this paper, we present a Lyapunov-based homogeneous controller for the stabilisation of a perturbed chain of integrators of arbitrary order r ≥ 1. The proposed controller is based on a homogeneous controller for the stabilisation of a pure chain of integrators. Control of the homogeneity degree is also introduced, and various controllers are designed using this concept, namely a bounded controller with minimum amplitude of the discontinuous control and a controller with globally fixed-time convergence. The performance of the controller is validated through simulations.
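
    As a rough illustration of homogeneity-based stabilisation (a generic finite-time design for the special case r = 2, not the Lyapunov-based construction of the paper; the gains, exponents, and disturbance are illustrative assumptions), the sketch below simulates a perturbed double integrator under homogeneous state feedback.

```python
import numpy as np

def sig(v, a):
    """Signed power |v|^a * sign(v), the building block of homogeneous feedback."""
    return np.sign(v) * np.abs(v) ** a

# Perturbed double integrator: x1' = x2, x2' = u + d(t)
k1, k2 = 6.0, 4.0        # illustrative gains
alpha = 0.6              # homogeneity parameter in (0, 1); alpha = 1 recovers linear feedback
a2, a1 = alpha, alpha / (2.0 - alpha)   # standard exponent choice for a chain of two integrators

def u(x1, x2):
    return -k1 * sig(x1, a1) - k2 * sig(x2, a2)

dt, T = 1e-3, 10.0
x1, x2 = 2.0, -1.0       # initial condition
for step in range(int(T / dt)):          # explicit Euler integration
    t = step * dt
    d = 0.3 * np.sin(5.0 * t)            # bounded matched perturbation (assumed)
    x1 += dt * x2
    x2 += dt * (u(x1, x2) + d)

# With a bounded perturbation the state settles in a small neighbourhood of the origin.
print(f"state after {T:.0f} s: x1={x1:.4f}, x2={x2:.4f}")
```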

  5. Effects of two-step homogenization on precipitation behavior of Al{sub 3}Zr dispersoids and recrystallization resistance in 7150 aluminum alloy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Zhanying (Key Laboratory for Anisotropy and Texture of Materials, Northeastern University, Shenyang 110819, China); Zhao, Gang

    2015-04-15

    The effect of two-step homogenization treatments on the precipitation behavior of Al3Zr dispersoids was investigated by transmission electron microscopy (TEM) in 7150 alloys. Two-step treatments with the first step in the temperature range of 300–400 °C followed by the second step at 470 °C were applied during homogenization. Compared with the conventional one-step homogenization, both a finer particle size and a higher number density of Al3Zr dispersoids were obtained with two-step homogenization treatments. The most effective dispersoid distribution was attained using the first step held at 300 °C. In addition, the two-step homogenization minimized the precipitate free zones and greatly increased the number density of dispersoids near dendrite grain boundaries. The effect of two-step homogenization on recrystallization resistance of 7150 alloys with different Zr contents was quantitatively analyzed using the electron backscattered diffraction (EBSD) technique. It was found that the improved dispersoid distribution through the two-step treatment can effectively inhibit the recrystallization process during the post-deformation annealing for 7150 alloys containing 0.04–0.09 wt.% Zr, resulting in a remarkable reduction of the volume fraction and grain size of recrystallization grains. - Highlights: • Effect of two-step homogenization on Al3Zr dispersoids was investigated by TEM. • Finer and higher number of dispersoids obtained with two-step homogenization • Minimized the precipitate free zones and improved the dispersoid distribution • Recrystallization resistance with varying Zr content was quantified by EBSD. • Effectively inhibit the recrystallization through two-step treatments in 7150 alloy.

  6. Sperm quality and oxidative status as affected by homogenization of liquid-stored boar semen diluted in short- and long-term extenders.

    PubMed

    Menegat, Mariana B; Mellagi, Ana Paula G; Bortolin, Rafael C; Menezes, Tila A; Vargas, Amanda R; Bernardi, Mari Lourdes; Wentz, Ivo; Gelain, Daniel P; Moreira, José Cláudio F; Bortolozzo, Fernando P

    2017-04-01

    Homogenization of diluted boar semen during storage has for a long time been regarded as beneficial. Recent studies indicated an adverse effect of homogenization on sperm quality for yet unknown reasons. This study aimed to verify the effect of homogenization on sperm parameters and to elucidate the impact of oxidative stress. Twenty-one normospermic ejaculates (21 boars) were diluted with Androstar ® Plus (AND) and Beltsville Thawing Solution (BTS). Semen doses were submitted to no-homogenization (NoHom) or twice-a-day manual homogenization (2xHom) during storage at 17°C for 168h. NoHom and 2xHom were similar (P>0.05) for both short- and long-term extenders with respect to motility and kinematics parameters (CASA system), membrane viability (SYBR-14/PI), acrosome integrity, lipid peroxidation, protein oxidation, intracellular reactive oxygen species, sulfhydryl content, and total radical-trapping antioxidant potential. 2xHom reduced sperm motility and motion kinematics (VCL, VSL, VAP, BCF, and ALH) following the thermoresistance test and presented with a slight increase in pH along the storage (P=0.05) as compared to NoHom. Furthermore, 2xHom semen doses presented with a constant SOD and GSH-Px activity during storage whereas enzymatic activity increased for NoHom at the end of the storage. These findings confirm that homogenization of semen doses is detrimental to sperm quality. Moreover, it is shown that the effect of homogenization is unlikely to be primarily related to oxidative stress. Homogenization is not recommended for storage of liquid boar semen for up to 168h in both short- and long-term extenders. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Modelisation geometrique par NURBS pour le design aerodynamique des ailes d'avion

    NASA Astrophysics Data System (ADS)

    Bentamy, Anas

    The constant evolution of computer science gives rise to many research areas, especially in computer-aided design. This study contributes to the advancement of numerical methods in engineering computer-aided design, specifically in aerospace science. Geometric modeling based on NURBS has been applied successfully to generate a parametric wing surface for aerodynamic design while satisfying manufacturing constraints. The goal of providing a smooth geometry described with few parameters has been achieved. In this case, a wing design including ruled surfaces at the leading-edge slat and at the flap, and curved central surfaces whose intrinsic geometric properties come from conic curves, requires 130 control points and 15 geometric design variables. The 3D character of the wing needs to be analyzed with surface-investigation techniques in order to judge the visual aspect conveniently and to detect any sign inversion in both directions of parametrization, u and v. Color mapping of the Gaussian curvature appears to be a very effective visualization tool. The automation of the construction has been achieved using a heuristic optimization algorithm, simulated annealing. The relatively high speed of convergence to solutions confirms its practical interest in present-day engineering problems. The robustness of the geometric model has been tested successfully on an academic inverse design problem. The results obtained suggest multiple possible applications, from an extension to a complete geometric description of an airplane to interaction with other disciplines belonging to a preliminary aeronautical design process.

  8. Development of a precision multimodal surgical navigation system for lung robotic segmentectomy

    PubMed Central

    Soldea, Valentin; Lachkar, Samy; Rinieri, Philippe; Sarsam, Mathieu; Bottet, Benjamin; Peillon, Christophe

    2018-01-01

    Minimally invasive sublobar anatomical resection is becoming more and more popular to manage early lung lesions. Robotic-assisted thoracic surgery (RATS) is unique in comparison with other minimally invasive techniques. Indeed, RATS is able to better integrate multiple streams of information including advanced imaging techniques, in an immersive experience at the level of the robotic console. Our aim was to describe three-dimensional (3D) imaging throughout the surgical procedure from preoperative planning to intraoperative assistance and complementary investigations such as radial endobronchial ultrasound (R-EBUS) and virtual bronchoscopy for pleural dye marking. All cases were operated using the DaVinci SystemTM. Modelisation was provided by Visible Patient™ (Strasbourg, France). Image integration in the operative field was achieved using the Tile Pro multi display input of the DaVinci console. Our experience was based on 114 robotic segmentectomies performed between January 2012 and October 2017. The clinical value of 3D imaging integration was evaluated in 2014 in a pilot study. Progressively, we have reached the conclusion that the use of such an anatomic model improves the safety and reliability of procedures. The multimodal system including 3D imaging has been used in more than 40 patients so far and demonstrated a perfect operative anatomic accuracy. Currently, we are developing an original virtual reality experience by exploring 3D imaging models at the robotic console level. The act of operating is being transformed and the surgeon now oversees a complex system that improves decision making. PMID:29785294

  9. Development of a precision multimodal surgical navigation system for lung robotic segmentectomy.

    PubMed

    Baste, Jean Marc; Soldea, Valentin; Lachkar, Samy; Rinieri, Philippe; Sarsam, Mathieu; Bottet, Benjamin; Peillon, Christophe

    2018-04-01

    Minimally invasive sublobar anatomical resection is becoming more and more popular to manage early lung lesions. Robotic-assisted thoracic surgery (RATS) is unique in comparison with other minimally invasive techniques. Indeed, RATS is able to better integrate multiple streams of information including advanced imaging techniques, in an immersive experience at the level of the robotic console. Our aim was to describe three-dimensional (3D) imaging throughout the surgical procedure from preoperative planning to intraoperative assistance and complementary investigations such as radial endobronchial ultrasound (R-EBUS) and virtual bronchoscopy for pleural dye marking. All cases were operated using the DaVinci System TM . Modelisation was provided by Visible Patient™ (Strasbourg, France). Image integration in the operative field was achieved using the Tile Pro multi display input of the DaVinci console. Our experience was based on 114 robotic segmentectomies performed between January 2012 and October 2017. The clinical value of 3D imaging integration was evaluated in 2014 in a pilot study. Progressively, we have reached the conclusion that the use of such an anatomic model improves the safety and reliability of procedures. The multimodal system including 3D imaging has been used in more than 40 patients so far and demonstrated a perfect operative anatomic accuracy. Currently, we are developing an original virtual reality experience by exploring 3D imaging models at the robotic console level. The act of operating is being transformed and the surgeon now oversees a complex system that improves decision making.

  10. Mechanical behavior and modelisation of Ti-6Al-4V titanium sheet under hot stamping conditions

    NASA Astrophysics Data System (ADS)

    Sirvin, Q.; Velay, V.; Bonnaire, R.; Penazzi, L.

    2017-10-01

    The Ti-6Al-4V titanium alloy is widely used for the manufacture of aeronautical and automotive parts (solid parts). In aeronautics, this alloy is employed for its excellent mechanical behavior associated with low density, outstanding corrosion resistance, and good mechanical properties up to 600°C. It is especially used for the manufacture of fuselage frames, on the pylon for the primary structure (machined from forged blocks), and for the secondary structure in sheet form. In this last case, sheet metal forming can be done by various methods: at room temperature by a drawing operation, at very high temperature (≃900°C) by superplastic forming (SPF), and at intermediate temperature (≥750°C) by hot forming (HF). In order to reduce production costs and environmental impact, reducing cycle times together with lowering temperature levels is relevant. This study focuses on modelling the behavior of the Ti-6Al-4V alloy at temperatures above room temperature, to obtain greater formability, and below SPF conditions, to reduce tooling and energy costs. The displacement field measurement obtained by Digital Image Correlation (DIC) is based on an innovative surface preparation pattern adapted to high-temperature exposure. Different material parameters are identified to define a model able to predict the mechanical behavior of the Ti-6Al-4V alloy under hot stamping conditions. The identified plastic hardening model is introduced into a FEM code to simulate an omega-shape forming operation.

  11. Scaling forecast models for wind turbulence and wind turbine power intermittency

    NASA Astrophysics Data System (ADS)

    Duran Medina, Olmo; Schmitt, Francois G.; Calif, Rudy

    2017-04-01

    The intermittency of wind turbine power remains an important issue for the large-scale development of this renewable energy. The energy peaks injected into the electric grid create difficulties for energy distribution management. Hence, a correct forecast of wind power in the short and medium term is needed because of the high unpredictability of the intermittency phenomenon. We consider a statistical approach through the analysis and characterization of stochastic fluctuations. The theoretical framework is the multifractal modelling of wind velocity fluctuations. Here, we consider data from three wind turbines, two of which use direct-drive technology. These turbines produce energy under real operating conditions and allow us to test our power-production forecast models at different time horizons. Two forecast models were developed based on two physical principles observed in the wind and power time series: the scaling properties on the one hand, and the intermittency of the wind power increments on the other. The first tool is related to the intermittency through a multifractal lognormal fit of the power fluctuations. The second tool is based on an analogy between the power scaling properties and a fractional Brownian motion. Indeed, long-term memory is found in both time series. Both models show encouraging results, since the correct tendency of the signal is reproduced over different time scales. These tools are first steps in the search for efficient forecasting approaches to help the grid adapt to wind energy fluctuations.
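
    As a minimal sketch of the scaling-analysis step mentioned above (not the authors' forecast models; the time series and fitting range are synthetic assumptions), one can estimate scaling exponents of the power increments from structure functions S_q(tau) = <|P(t+tau) - P(t)|^q>; a nonlinear dependence of the exponent on q is the usual signature of multifractal intermittency.

```python
import numpy as np

rng = np.random.default_rng(1)
p = np.cumsum(rng.normal(size=20000))      # stand-in for a wind power time series (assumed data)

taus = np.unique(np.logspace(0, 3, 20).astype(int))
qs = [1, 2, 3, 4]

def structure_function(x, tau, q):
    """S_q(tau) = mean of |x(t+tau) - x(t)|^q over the series."""
    inc = x[tau:] - x[:-tau]
    return np.mean(np.abs(inc) ** q)

zeta = {}
for q in qs:
    s = np.array([structure_function(p, t, q) for t in taus])
    # Scaling exponent zeta(q): slope of log S_q(tau) versus log tau.
    zeta[q] = np.polyfit(np.log(taus), np.log(s), 1)[0]

print("zeta(q):", {q: round(z, 3) for q, z in zeta.items()})
# For a monofractal (Brownian-like) signal zeta(q) ~ q*H is linear in q;
# concavity of zeta(q) in q indicates multifractal intermittency.
```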

  12. A Simplified and Reliable Damage Method for the Prediction of the Composites Pieces

    NASA Astrophysics Data System (ADS)

    Viale, R.; Coquillard, M.; Seytre, C.

    2012-07-01

    Structural engineers are often faced with test results on composite structures that are considerably tougher than predicted. In an attempt to reduce this frequent gap, a survey of some extensive synthesis works on prediction methods and failure criteria was carried out. This inquiry deals with the plane stress state only. All classical methods have strong and weak points with respect to practicality and reliability. The main conclusion is that, in the plane stress case, the best usual industrial methods give rather similar predictions. But in general they do not explain the often large discrepancies with respect to the tests, mainly in cases of strong stress gradients or of bi-axial laminate loadings. It seems that only the methods considering the complexity of composite damage (so-called physical methods or Continuum Damage Mechanics, "CDM") bring a clear improvement over the usual methods. The only drawback of these methods is their relative intricacy, especially under pressing industrial conditions. A method with an approximate but simplified representation of the CDM phenomenology is presented. It was compared to tests and to other methods: - it brings a fair improvement of the correlation with tests with respect to the usual industrial methods, - it gives results very similar to the painstaking CDM methods and very close to the test results. Several examples are provided. In addition, this method is really economical with respect to material characterization as well as to the modelling and computation effort.

  13. Cascade catalysis for the homogeneous hydrogenation of CO2 to methanol.

    PubMed

    Huff, Chelsea A; Sanford, Melanie S

    2011-11-16

    This communication demonstrates the homogeneous hydrogenation of CO(2) to CH(3)OH via cascade catalysis. Three different homogeneous catalysts, (PMe(3))(4)Ru(Cl)(OAc), Sc(OTf)(3), and (PNN)Ru(CO)(H), operate in sequence to promote this transformation.

  14. Design and fabrication of optical homogenizer with micro structure by injection molding process

    NASA Astrophysics Data System (ADS)

    Chen, C.-C. A.; Chang, S.-W.; Weng, C.-J.

    2008-08-01

    This paper presents the design and fabrication of an optical homogenizer with a hybrid design of collimator, toroidal lens array, and projection lens for shaping a Gaussian beam into a uniform cylindrical beam. TracePro software was used to design the geometry of the homogenizer, and injection molding simulation was performed with Moldflow MPI to evaluate the mold design for the injection molding process. The optical homogenizer is a cylindrical part with a thickness of 8.03 mm and a diameter of 5 mm. The microstructure of the toroidal array has groove heights designed from 12 μm to 99 μm. An electrical injection molding machine and PMMA (n = 1.4747) were selected to perform the experiment. Experimental results show that the optical homogenizer achieved a transfer ratio of grooves (TRG) of 88.98%, an optical uniformity of 68%, and an optical efficiency of 91.88%. Future work focuses on the development of an optical homogenizer for LED light sources.

  15. Crucial effect of melt homogenization on the fragility of non-stoichiometric chalcogenides

    NASA Astrophysics Data System (ADS)

    Ravindren, Sriram; Gunasekera, K.; Tucker, Z.; Diebold, A.; Boolchand, P.; Micoulaut, M.

    2014-04-01

    The kinetics of homogenization of binary AsxSe100 - x melts in the As concentration range 0% < x < 50% are followed in Fourier Transform (FT)-Raman profiling experiments, and show that 2 g sized melts in the middle concentration range 20% < x < 30% take nearly two weeks to homogenize when starting materials are reacted at 700 °C. In glasses of proven homogeneity, we find molar volumes to vary non-monotonically with composition, and the fragility index M displays a broad global minimum in the 20% < x < 30% range of x wherein M< 20. We show that properly homogenized samples have a lower measured fragility when compared to larger under-reacted melts. The enthalpy of relaxation at Tg, ΔHnr(x) shows a minimum in the 27% < x < 37% range. The super-strong nature of melt compositions in the 20% < x < 30% range suppresses melt diffusion at high temperatures leading to the slow kinetics of melt homogenization.

  16. Mechanical Homogenization Increases Bacterial Homogeneity in Sputum

    PubMed Central

    Stokell, Joshua R.; Khan, Ammad

    2014-01-01

    Sputum obtained from patients with cystic fibrosis (CF) is highly viscous and often heterogeneous in bacterial distribution. Adding dithiothreitol (DTT) is the standard method for liquefaction prior to processing sputum for molecular detection assays. To determine if DTT treatment homogenizes the bacterial distribution within sputum, we measured the difference in mean total bacterial abundance and abundance of Burkholderia multivorans between aliquots of DTT-treated sputum samples with and without a mechanical homogenization (MH) step using a high-speed dispersing element. Additionally, we measured the effect of MH on bacterial abundance. We found a significant difference between the mean bacterial abundances in aliquots that were subjected to only DTT treatment and those of the aliquots which included an MH step (all bacteria, P = 0.04; B. multivorans, P = 0.05). There was no significant effect of MH on bacterial abundance in sputum. Although our results are from a single CF patient, they indicate that mechanical homogenization increases the homogeneity of bacteria in sputum. PMID:24759710

  17. Producing a lycopene nanodispersion: Formulation development and the effects of high pressure homogenization.

    PubMed

    Shariffa, Y N; Tan, T B; Uthumporn, U; Abas, F; Mirhosseini, H; Nehdi, I A; Wang, Y-H; Tan, C P

    2017-11-01

    The aim of this study was to develop formulations to produce lycopene nanodispersions and to investigate the effects of the homogenization pressure on the physicochemical properties of the lycopene nanodispersion. The samples were prepared by using emulsification-evaporation technique. The best formulation was achieved by dispersing an organic phase (0.3% w/v lycopene dissolved in dichloromethane) in an aqueous phase (0.3% w/v Tween 20 dissolved in deionized water) at a ratio of 1:9 by using homogenization process. The increased level of homogenization pressure to 500bar reduced the particle size and lycopene concentration significantly (p<0.05). Excessive homogenization pressure (700-900bar) resulted in large particle sizes with high dispersibility. The zeta potential and turbidity of the lycopene nanodispersion were significantly influenced by the homogenization pressure. The results from this study provided useful information for producing small-sized lycopene nanodispersions with a narrow PDI and good stability for application in beverage products. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Statistical homogeneity tests applied to large data sets from high energy physics experiments

    NASA Astrophysics Data System (ADS)

    Trusina, J.; Franc, J.; Kůs, V.

    2017-12-01

    Homogeneity tests are used in high energy physics for the verification of simulated Monte Carlo samples, that is, to check whether they have the same distribution as the data measured by a particle detector. The Kolmogorov-Smirnov, χ2, and Anderson-Darling tests are the most widely used techniques to assess the samples' homogeneity. Since MC generators produce plenty of entries from different models, each entry has to be re-weighted to obtain the same sample size as the measured data. One way of performing homogeneity testing is through binning. If we do not want to lose any information, we can apply generalized tests based on weighted empirical distribution functions. In this paper, we propose such generalized weighted homogeneity tests and introduce some of their asymptotic properties. We present results based on a numerical analysis which focuses on estimation of the type-I error and power of the tests. Finally, we present an application of our homogeneity tests to data from the DØ experiment at Fermilab.
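
    The sketch below gives a minimal, unbinned illustration of a weight-aware homogeneity statistic: a weighted Kolmogorov-Smirnov distance between a data sample and a re-weighted MC sample, with an approximate permutation p-value. It is a generic illustration under those assumptions, not the generalized tests proposed by the authors, and the samples are synthetic.

```python
import numpy as np

def weighted_ecdf(x, w, grid):
    """Weighted empirical CDF of sample x (weights w), evaluated at the points in grid."""
    order = np.argsort(x)
    x, w = x[order], w[order]
    cum = np.cumsum(w) / w.sum()
    idx = np.searchsorted(x, grid, side="right")
    return np.where(idx > 0, cum[np.clip(idx - 1, 0, len(x) - 1)], 0.0)

def weighted_ks(x1, w1, x2, w2):
    """Sup distance between two weighted ECDFs (a weighted KS-type statistic)."""
    grid = np.sort(np.concatenate([x1, x2]))
    return np.max(np.abs(weighted_ecdf(x1, w1, grid) - weighted_ecdf(x2, w2, grid)))

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 5000)                 # "measured" sample, unit weights
mc = rng.normal(0.05, 1.0, 20000)                 # MC sample with per-event weights
w_data = np.ones_like(data)
w_mc = rng.uniform(0.5, 1.5, mc.size)             # illustrative MC event weights

obs = weighted_ks(data, w_data, mc, w_mc)

# Approximate p-value by permuting sample labels over the pooled (value, weight) pairs.
pooled_x = np.concatenate([data, mc])
pooled_w = np.concatenate([w_data, w_mc])
n1, count = data.size, 0
for _ in range(200):
    perm = rng.permutation(pooled_x.size)
    a, b = perm[:n1], perm[n1:]
    if weighted_ks(pooled_x[a], pooled_w[a], pooled_x[b], pooled_w[b]) >= obs:
        count += 1
print(f"weighted KS = {obs:.4f}, permutation p ~ {count / 200:.3f}")
```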

  19. Homogenization of Large-Scale Movement Models in Ecology

    USGS Publications Warehouse

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
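
    To make the contrast between small-scale and large-scale parameters concrete, here is a 1-D sketch (an illustrative caricature, not the paper's derivation or code): for ecological diffusion of the form u_t = (mu(x) u)_xx with rapidly varying motility mu, a standard two-scale argument shows that the large-scale dynamics of v = mu*u are governed by an effective coefficient equal to the harmonic mean of mu, which on a patchy habitat map can be far below the arithmetic mean.

```python
import numpy as np

# Hypothetical habitat map: motility on a 10 m grid over a 10 km transect,
# alternating "slow" habitat (e.g. dense forest) and "fast" habitat (open range).
mu = np.where(np.arange(1000) % 10 < 3, 5.0, 500.0)   # m^2 per day, illustrative values

arithmetic_mean = mu.mean()
harmonic_mean = 1.0 / np.mean(1.0 / mu)   # effective large-scale coefficient for v = mu*u

print(f"arithmetic mean motility: {arithmetic_mean:8.1f} m^2/day")
print(f"harmonic  mean motility: {harmonic_mean:8.1f} m^2/day")
# The harmonic mean is dominated by the slow habitat, so small-scale (10-100 m)
# patches of low motility strongly limit large-scale (10-100 km) spread.
```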

  20. Versatile and Programmable DNA Logic Gates on Universal and Label-Free Homogeneous Electrochemical Platform.

    PubMed

    Ge, Lei; Wang, Wenxiao; Sun, Ximei; Hou, Ting; Li, Feng

    2016-10-04

    Herein, a novel universal and label-free homogeneous electrochemical platform is demonstrated, on which a complete set of DNA-based two-input Boolean logic gates (OR, NAND, AND, NOR, INHIBIT, IMPLICATION, XOR, and XNOR) is constructed by simply and rationally deploying the designed DNA polymerization/nicking machines without complicated sequence modulation. Single-stranded DNA is employed as the proof-of-concept target/input to initiate or prevent the DNA polymerization/nicking cyclic reactions on these DNA machines to synthesize numerous intact G-quadruplex sequences or binary G-quadruplex subunits as the output. The generated output strands then self-assemble into G-quadruplexes that render remarkable decrease to the diffusion current response of methylene blue and, thus, provide the amplified homogeneous electrochemical readout signal not only for the logic gate operations but also for the ultrasensitive detection of the target/input. This system represents the first example of homogeneous electrochemical logic operation. Importantly, the proposed homogeneous electrochemical logic gates possess the input/output homogeneity and share a constant output threshold value. Moreover, the modular design of DNA polymerization/nicking machines enables the adaptation of these homogeneous electrochemical logic gates to various input and output sequences. The results of this study demonstrate the versatility and universality of the label-free homogeneous electrochemical platform in the design of biomolecular logic gates and provide a potential platform for the further development of large-scale DNA-based biocomputing circuits and advanced biosensors for multiple molecular targets.

  1. Effect of high-pressure homogenization on droplet size distribution and rheological properties of ice cream mixes.

    PubMed

    Innocente, N; Biasutti, M; Venir, E; Spaziani, M; Marchesini, G

    2009-05-01

    The effect of different homogenization pressures (15/3 MPa and 97/3 MPa) on fat globule size and distribution as well as on structure-property relationships of ice cream mixes was investigated. Dynamic light scattering, steady shear, and dynamic rheological analyses were performed on mixes with different fat contents (5 and 8%) and different aging times (4 and 20 h). The homogenization of ice cream mixes determined a change from bimodal to monomodal particle size distributions and a reduction in the mean particle diameter. Mean fat globule diameters were reduced at higher pressure, but the homogenization effect on size reduction was less marked with the highest fat content. The rheological behavior of mixes was influenced by both the dispersed and the continuous phases. Higher fat contents caused greater viscosity and dynamic moduli. The lower homogenization pressure (15/3 MPa) mainly affected the dispersed phase and resulted in a more pronounced viscosity reduction in the higher fat content mixes. High-pressure homogenization (97/3 MPa) greatly enhanced the viscoelastic properties and the apparent viscosity. Rheological results indicated that unhomogenized and 15/3 MPa homogenized mixes behaved as weak gels. The 97/3 MPa treatment led to stronger gels, perhaps as the overall result of a network rearrangement or interpenetrating network formation, and the fat globules were found to behave as interactive fillers. High-pressure homogenization determined the apparent viscosity of 5% fat to be comparable to that of 8% fat unhomogenized mix.

  2. Microstructural evolution during the homogenization heat treatment of 6XXX and 7XXX aluminum alloys

    NASA Astrophysics Data System (ADS)

    Priya, Pikee

    Homogenization heat treatment of as-cast billets is an important step in the processing of aluminum extrusions. Microstructural evolution during homogenization involves elimination of the eutectic morphology by spheroidisation of the interdendritic phases, minimization of the microsegregation across the grains through diffusion, dissolution of the low-melting phases, which enhances the surface finish of the extrusions, and precipitation of nano-sized dispersoids (for Cr-, Zr-, Mn-, Sc-containing alloys), which inhibit grain boundary motion to prevent recrystallization. Post-homogenization cooling reprecipitates some of the phases, changing the flow stress required for subsequent extrusion. These precipitates, however, are deleterious for the mechanical properties of the alloy and also hamper the age-hardenability and are hence dissolved during solution heat treatment. Microstructural development during homogenization and subsequent cooling occurs both at the length scale of the Secondary Dendrite Arm Spacing (SDAS) in micrometers and at that of the dispersoids in nanometers. Numerical tools to simulate microstructural development at both length scales have been developed and validated against experiments. These tools provide easy and convenient means to study the process. A Cellular Automaton-Finite Volume-based model for evolution of interdendritic phases is coupled with a Particle Size Distribution-based model for precipitation of dispersoids across the grain. This comprehensive model has been used to study the effect of temperature, composition, as-cast microstructure, and cooling rates during post-homogenization quenching on microstructural evolution. The numerical study has been complemented with experiments involving Scanning Electron Microscopy, Energy Dispersive Spectroscopy, X-Ray Diffraction and Differential Scanning Calorimetry, and good agreement with the numerical results has been found. The current work aims to study the microstructural evolution during homogenization heat treatment at both length scales, which includes (i) the dissolution and transformation of the as-cast secondary phases; (ii) the precipitation of dispersoids; and (iii) the reprecipitation of some of the secondary phases during post-homogenization cooling. The kinetics of the phase transformations are mostly diffusion controlled, except for the eta to S phase transformation in 7XXX alloys, which is interface reaction rate controlled and has been implemented using a novel approach. Recommendations for homogenization temperature, time, cooling rates and compositions are made for Al-Si-Mg-Fe-Mn and Al-Zn-Cu-Mg-Zr alloys. The numerical model developed has been applied to a through-process solidification-homogenization modeling of a Direct-Chill cast AA7050 cylindrical billet to study the radial variation of microstructure after solidification, homogenization and post-homogenization cooling.

  3. Prediction of fat globule particle size in homogenized milk using Fourier transform mid-infrared spectra.

    PubMed

    Di Marzo, Larissa; Cree, Patrick; Barbano, David M

    2016-11-01

    Our objective was to develop partial least square models using data from Fourier transform mid-infrared (MIR) spectra to predict the particle size distributions d(0.5) and d(0.9), surface volume mean diameter D[3,2], and volume moment mean diameter D[4,3] of milk fat globules and validate the models. The goal of the study was to produce a method built into the MIR milk analyzer that could be used to warn the instrument operator that the homogenizer is near failure and needs to be replaced to ensure quality of results. Five homogenizers with different homogenization efficiency were used to homogenize pasteurized modified unhomogenized milks and farm raw bulk milks. Homogenized milks were collected from the homogenizer outlet and then run through an MIR milk analyzer without an in-line homogenizer to collect a MIR spectrum. A separate portion of each homogenized milk was analyzed with a laser light-scattering particle size analyzer to obtain reference values. The study was replicated 3 times with 3 independent sets of modified milks and bulk tank farm milks. Validation of the models was done with a set of 34 milks that were not used in the model development. Partial least square regression models were developed and validated for predicting the following milk fat globule particle size distribution parameters from MIR spectra: d(0.5) and d(0.9), surface volume mean diameter D[3,2], and volume moment mean diameter D[4,3]. The basis for the ability to model particle size distribution of milk fat emulsions was hypothesized to be the result of the partial least square modeling detecting absorbance shifts in MIR spectra of milk fat due to the Christiansen effect. The independent sample validation of particle size prediction methods found more variation in d(0.9) and D[4,3] predictions than the d(0.5) and D[3,2] predictions relative to laser light-scattering reference values, and this may be due to variation in particle size among different pump strokes. The accuracy of the d(0.9) prediction for routine quality assurance, to determine if a homogenizer within an MIR milk analyzer was near the failure level [i.e., d(0.9) >1.7µm] and needed to be replaced, is fit-for-purpose. The daily average particle size performance [i.e., d(0.9)] of a homogenizer based on the mean for the day could be used for monitoring homogenizer performance. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
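
    The modelling step described above is, at its core, a partial least squares regression from spectra to particle-size parameters. The following sketch is a generic illustration of that kind of calibration (not the authors' model; the spectra, reference d(0.9) values, and the number of latent variables are synthetic assumptions).

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins: 120 milk samples, 300 mid-infrared absorbance points each,
# and a laser light-scattering d(0.9) reference value (micrometres) per sample.
n_samples, n_wavenumbers = 120, 300
spectra = rng.normal(size=(n_samples, n_wavenumbers))
d90 = 1.0 + 0.5 * spectra[:, 40] - 0.3 * spectra[:, 150] + 0.05 * rng.normal(size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(spectra, d90, random_state=0)

pls = PLSRegression(n_components=8)      # number of latent variables is an assumption
pls.fit(X_train, y_train)
pred = pls.predict(X_test).ravel()

rmse = np.sqrt(np.mean((pred - y_test) ** 2))
print(f"validation RMSE for d(0.9): {rmse:.3f} um")
# A simple quality-assurance rule from the abstract: flag the homogenizer if the
# daily mean predicted d(0.9) exceeds 1.7 um.
print("homogenizer near failure:", pred.mean() > 1.7)
```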

  4. Do Algorithms Homogenize Students' Achievements in Secondary School Better than Teachers' Tracking Decisions?

    ERIC Educational Resources Information Center

    Klapproth, Florian

    2015-01-01

    Two objectives guided this research. First, this study examined how well teachers' tracking decisions contribute to the homogenization of their students' achievements. Second, the study explored whether teachers' tracking decisions would be outperformed in homogenizing the students' achievements by statistical models of tracking decisions. These…

  5. Effect of heat and homogenization on in vitro digestion of milk

    USDA-ARS?s Scientific Manuscript database

    Central to commercial fluid milk processing is the use of high temperature, short time (HTST) pasteurization to ensure the safety and quality of milk, and homogenization to prevent creaming of fat-containing milk. UHT processed homogenized milk is also available commercially and is typically used to...

  6. Extreme between-study homogeneity in meta-analyses could offer useful insights.

    PubMed

    Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias

    2006-10-01

    Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empiric distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
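
    The sketch below illustrates the kind of calculation involved: Cochran's Q computed under a fixed-effect model, with a left-sided Monte Carlo p-value asking how often chance alone would produce results this homogeneous. It is a simplified illustration, not the authors' exact procedure, and the study effects and standard errors are invented.

```python
import numpy as np

# Hypothetical meta-analysis: log risk ratios and their standard errors for 8 studies.
yi = np.array([0.10, 0.12, 0.11, 0.09, 0.10, 0.13, 0.11, 0.10])
se = np.array([0.20, 0.25, 0.18, 0.22, 0.30, 0.24, 0.21, 0.19])

def cochran_q(y, s):
    w = 1.0 / s**2
    mu = np.sum(w * y) / np.sum(w)        # fixed-effect summary log risk ratio
    return np.sum(w * (y - mu) ** 2)

q_obs = cochran_q(yi, se)

# Left-sided Monte Carlo p-value: how often would studies drawn from a *common*
# true effect give a Q statistic as small as (or smaller than) the observed one?
rng = np.random.default_rng(0)
mu_fe = np.sum(yi / se**2) / np.sum(1.0 / se**2)
n_sim = 20000
q_sim = np.array([cochran_q(rng.normal(mu_fe, se), se) for _ in range(n_sim)])
p_left = np.mean(q_sim <= q_obs)

print(f"Q = {q_obs:.2f}, left-sided Monte Carlo p = {p_left:.4f}")
# A very small left-sided p (e.g. < 0.01) flags suspiciously extreme between-study homogeneity.
```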

  7. Genetic progress in homogeneous regions of wheat cultivation in Rio Grande do Sul State, Brazil.

    PubMed

    Follmann, D N; Cargnelutti Filho, A; Lúcio, A D; de Souza, V Q; Caraffa, M; Wartha, C A

    2017-03-30

    The State of Rio Grande do Sul (RS) stands out as the largest wheat producer in Brazil. Wheat is the most emphasized winter cereal in RS, attracting public and private investments directed to wheat genetic breeding. The study of genetic progress should be performed routinely at breeding programs to study the behavior of cultivars developed for homogeneous regions of cultivation. The objectives of this study were: 1) to evaluate the genetic progress of wheat grain yield in RS; 2) to evaluate the influence of cultivar competition trial stratification in homogeneous regions of cultivation on the study of genetic progress. Grain yield data of 122 wheat cultivars evaluated in 137 trials arranged in randomized block design with three or four replications were used. Field trials were carried out in 23 locations in RS divided into two homogeneous regions during the period from 2002 to 2013. Genetic progress for RS and homogeneous regions was studied utilizing the method proposed by Vencovsky. Annual genetic progress for wheat grain yield during the period of 12 years in the State of RS was 2.86%, oscillating between homogeneous regions of cultivation. The difference of annual genetic progress in region 1 (1.82%) in relation to region 2 (4.38%) justifies the study of genetic progress by homogeneous regions of cultivation.

  8. Effect of homogenization and pasteurization on the structure and stability of whey protein in milk.

    PubMed

    Qi, Phoebe X; Ren, Daxi; Xiao, Yingping; Tomasula, Peggy M

    2015-05-01

    The effect of homogenization alone or in combination with high-temperature, short-time (HTST) pasteurization or UHT processing on the whey fraction of milk was investigated using highly sensitive spectroscopic techniques. In pilot plant trials, 1-L quantities of whole milk were homogenized in a 2-stage homogenizer at 35°C (6.9 MPa/10.3 MPa) and, along with skim milk, were subjected to HTST pasteurization (72°C for 15 s) or UHT processing (135°C for 2 s). Other whole milk samples were processed using homogenization followed by either HTST pasteurization or UHT processing. The processed skim and whole milk samples were centrifuged further to remove fat and then acidified to pH 4.6 to isolate the corresponding whey fractions, and centrifuged again. The whey fractions were then purified using dialysis and investigated using the circular dichroism, Fourier transform infrared, and Trp intrinsic fluorescence spectroscopic techniques. Results demonstrated that homogenization combined with UHT processing of milk caused not only changes in protein composition but also significant secondary structural loss, particularly in the amounts of apparent antiparallel β-sheet and α-helix, as well as diminished tertiary structural contact. In both cases of homogenization alone and followed by HTST treatments, neither caused appreciable chemical changes, nor remarkable secondary structural reduction. But disruption was evident in the tertiary structural environment of the whey proteins due to homogenization of whole milk as shown by both the near-UV circular dichroism and Trp intrinsic fluorescence. In-depth structural stability analyses revealed that even though processing of milk imposed little impairment on the secondary structural stability, the tertiary structural stability of whey protein was altered significantly. The following order was derived based on these studies: raw whole>HTST, homogenized, homogenized and pasteurized>skimmed and pasteurized, and skimmed UHT>homogenized UHT. The methodology demonstrated in this study can be used to gain insight into the behavior of milk proteins when processed and provides a new empirical and comparative approach for analyzing and assessing the effect of processing schemes on the nutrition and quality of milk and dairy product without the need for extended separation and purification, which can be both time-consuming and disruptive to protein structures. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  9. MHODE: a local-homogeneity theory for improved source-parameter estimation of potential fields

    NASA Astrophysics Data System (ADS)

    Fedi, Maurizio; Florio, Giovanni; Paoletti, Valeria

    2015-08-01

    We describe a multihomogeneity theory for source-parameter estimation of potential fields. Similar to what happens for random source models, where the monofractal scaling-law has been generalized into a multifractal law, we propose to generalize the homogeneity law into a multihomogeneity law. This allows a theoretically correct approach to study real-world potential fields, which are inhomogeneous and so do not show scale invariance, except in the asymptotic regions (very near to or very far from their sources). Since the scaling properties of inhomogeneous fields change with the scale of observation, we show that they may be better studied at a set of scales than at a single scale and that a multihomogeneous model is needed to explain its complex scaling behaviour. In order to perform this task, we first introduce fractional-degree homogeneous fields, to show that: (i) homogeneous potential fields may have fractional or integer degree; (ii) the source-distributions for a fractional-degree are not confined in a bounded region, similarly to some integer-degree models, such as the infinite line mass and (iii) differently from the integer-degree case, the fractional-degree source distributions are no longer uniform density functions. Using this enlarged set of homogeneous fields, real-world anomaly fields are studied at different scales, by a simple search, at any local window W, for the best homogeneous field of either integer or fractional-degree, this yielding a multiscale set of local homogeneity-degrees and depth estimations which we call multihomogeneous model. It is so defined a new technique of source parameter estimation (Multi-HOmogeneity Depth Estimation, MHODE), permitting retrieval of the source parameters of complex sources. We test the method with inhomogeneous fields of finite sources, such as faults or cylinders, and show its effectiveness also in a real-case example. These applications show the usefulness of the new concepts, multihomogeneity and fractional homogeneity-degree, to obtain valid estimates of the source parameters in a consistent theoretical framework, so overcoming the limitations imposed by global-homogeneity to widespread methods, such as Euler deconvolution.

  10. Impact of homogenization of pasteurized human milk on gastric digestion in the preterm infant: A randomized controlled trial.

    PubMed

    de Oliveira, Samira C; Bellanger, Amandine; Ménard, Olivia; Pladys, Patrick; Le Gouar, Yann; Henry, Gwénaële; Dirson, Emelyne; Rousseau, Florence; Carrière, Frédéric; Dupont, Didier; Bourlieu, Claire; Deglaire, Amélie

    2017-08-01

    It has been suggested that homogenization of Holder-pasteurized human milk (PHM) could improve fat absorption and weight gain in preterm infants, but the impact on the PHM digestive kinetics has never been studied. Our objective was to determine the impact of PHM homogenization on gastric digestion in preterm infants. In a randomized controlled trial, eight hospitalized tube-fed preterm infants were their own control to compare the gastric digestion of PHM and of homogenized PHM (PHHM). PHM was obtained from donors and, for half of it, was homogenized by ultrasonication. Over a six-day sequence, gastric aspirates were collected twice a day, before and 35, 60 or 90 min after the start of PHM or PHHM ingestion. The impact of homogenization on PHM digestive kinetics and disintegration was tested using a general linear mixed model. Results were expressed as means ± SD. Homogenization leaded to a six-fold increase in the specific surface (P < 0.01) of lipid droplets. The types of aggregates formed during digestion were different between PHM and PHHM, but the lipid fraction kept its initial structure all over the gastric digestion (native globules in PHM vs. blend of droplets in PHHM). Homogenization increased the gastric lipolysis level (P < 0.01), particularly at 35 and 60 min (22 and 24% higher for PHHM, respectively). Homogenization enhanced the proteolysis of serum albumin (P < 0.05) and reduced the meal emptying rate (P < 0.001, half-time estimated at 30 min for PHM and 38 min for PHHM). The postprandial gastric pH was not affected (4.7 ± 0.9 at 90 min). Homogenization of PHM increased the gastric lipolysis level. This could be a potential strategy to improve fat absorption, and thus growth and development in infants fed with PHM; however, its gastrointestinal tolerance needs to be investigated further. This trial was registered at clinicaltrials.gov as NCT02112331. Copyright © 2017 European Society for Clinical Nutrition and Metabolism. Published by Elsevier Ltd. All rights reserved.

  11. Use of focused acoustics for cell disruption to provide ultra scale-down insights of microbial homogenization and its bioprocess impact--recovery of antibody fragments from rec E. coli.

    PubMed

    Li, Qiang; Aucamp, Jean P; Tang, Alison; Chatel, Alex; Hoare, Mike

    2012-08-01

    An ultra scale-down (USD) device that provides insight into how industrial homogenization impacts bioprocess performance is desirable in the biopharmaceutical industry, especially at the early stage of process development where only a small quantity of material is available. In this work, we assess the effectiveness of focused acoustics as the basis of a USD cell disruption method to mimic and study high-pressure, step-wise homogenization of rec Escherichia coli cells for the recovery of an intracellular protein, antibody fragment (Fab'). The release of both Fab' and overall protein follows first-order reaction kinetics with respect to time of exposure to focused acoustics. The rate constant is directly proportional to applied electrical power input per unit volume. For nearly total protein or Fab' release (>99%), the key physical properties of the disruptate produced by focused acoustics, such as cell debris particle size distribution and apparent viscosity, show good agreement with those for homogenates produced by high-pressure homogenization operated to give the same fractional release. The only key difference is observed for partial disruption of cells, where focused acoustics yields a disruptate of lower viscosity than homogenization, evidently due to a greater extent of polynucleic acid degradation. Verification of this USD approach to cell disruption by high-pressure homogenization is achieved using USD centrifugation to demonstrate the same sedimentation characteristics of disruptates prepared using both the scaled-down focused acoustic and the pilot-scale homogenization methods for the same fraction of protein release. Copyright © 2012 Wiley Periodicals, Inc.
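
    As a small, hedged sketch of the first-order kinetics reported above, the snippet below evaluates F(t) = 1 - exp(-k t) with a rate constant assumed to scale linearly with electrical power input per unit volume; the proportionality constant and the operating values are invented for illustration, not fitted to the study's data.

    ```python
    # First-order release kinetics with k proportional to power per unit volume
    # (alpha, power and volume are hypothetical illustrative values).
    import numpy as np

    def fraction_released(t_min, power_w, volume_ml, alpha=0.002):
        """F(t) = 1 - exp(-k t), with k = alpha * (P / V), in 1/min."""
        k = alpha * power_w / volume_ml
        return 1.0 - np.exp(-k * np.asarray(t_min, dtype=float))

    t = np.array([0, 2, 5, 10, 20])        # minutes of focused-acoustic exposure
    print(fraction_released(t, power_w=100.0, volume_ml=1.0))
    ```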

  12. Experimental consideration of capillary chromatography based on tube radial distribution of ternary mixture carrier solvents under laminar flow conditions.

    PubMed

    Jinno, Naoya; Hashimoto, Masahiko; Tsukagoshi, Kazuhiko

    2011-01-01

    A capillary chromatography system has been developed based on the tube radial distribution of the carrier solvents using an open capillary tube and a water-acetonitrile-ethyl acetate mixture carrier solution. This tube radial distribution chromatography (TRDC) system works under laminar flow conditions. In this study, a phase diagram for the ternary mixture carrier solvents of water, acetonitrile, and ethyl acetate was constructed. The phase diagram that included a boundary curve between homogeneous and heterogeneous solutions was considered together with the component ratios of the solvents in the homogeneous carrier solutions required for the TRDC system. It was found that the TRDC system performed well with homogeneous solutions having component ratios of the solvents that were positioned near the homogeneous-heterogeneous solution boundary of the phase diagram. For preparing the carrier solutions of water-hydrophilic/hydrophobic organic solvents for the TRDC system, we used for the first time methanol, ethanol, 1,4-dioxane, and 1-propanol, instead of acetonitrile (hydrophilic organic solvent), as well as chloroform and 1-butanol, instead of ethyl acetate (hydrophobic organic solvent). The homogeneous ternary mixture carrier solutions were prepared near the homogeneous-heterogeneous solution boundary. Analyte mixtures of 2,6-naphthalenedisulfonic acid and 1-naphthol were separated with the TRDC system using these homogeneous ternary mixture carrier solutions. The pressure change in the capillary tube under laminar flow conditions might alter the carrier solution from homogeneous in the batch vessel to heterogeneous, thus affecting the tube radial distribution of the solvents in the capillary tube.

  13. Effect of homogenization and pasteurization on the structure and thermal stability of whey protein in milk

    USDA-ARS?s Scientific Manuscript database

    The effect of homogenization alone or in combination with high temperature, short time (HTST) pasteurization or UHT processing on the whey fraction of milk was investigated using highly sensitive spectroscopic techniques. In pilot plant trials, 1-L quantities of whole milk were homogenized in a two-...

  14. Influence of Homogenization on Microstructural Response and Mechanical Property of Al-Cu-Mn Alloy.

    PubMed

    Wang, Jian; Lu, Yalin; Zhou, Dongshuai; Sun, Lingyan; Li, Renxing; Xu, Wenting

    2018-05-29

    The evolution of the microstructures and properties of large direct chill (DC)-cast Al-Cu-Mn alloy ingots during homogenization was investigated. The results revealed that the Al-Cu-Mn alloy ingots had severe microsegregation and that the main secondary phase was Al₂Cu, with minimal Al₇Cu₂Fe phase. Numerous primary eutectic phases existed at the grain boundaries, and the main elements were segregated at the interfaces along the interdendritic regions. During homogenization the grain boundaries became discontinuous, residual phases were effectively dissolved into the matrix, and the segregation degree of all elements was reduced dramatically. In addition, the homogenized alloys exhibited improved microstructures with finer grain size, a higher number density of dislocation networks, a higher density of uniformly distributed θ' or θ phase (Al₂Cu), and a higher volume fraction of high-angle grain boundaries compared to the nonhomogenized samples. After the optimal homogenization treatment at 535 °C for 10 h, the tensile strength and elongation were about 24 MPa, 20.5 MPa, and 1.3% higher, respectively, than those of the specimen without homogenization treatment.

  15. Catalytic combustion of hydrogen-air mixtures in stagnation flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ikeda, H.; Libby, P.A.; Williams, F.A.

    1993-04-01

    The interaction between heterogeneous and homogeneous reactions arising when a mixture of hydrogen and air impinges on a platinum plate at elevated temperature is studied. A reasonably complete description of the kinetic mechanism for homogeneous reactions is employed along with a simplified model for heterogeneous reactions. Four regimes are identified depending on the temperature of the plate, on the rate of strain imposed on the flow adjacent to the plate and on the composition and temperature of the reactant stream: (1) surface reaction alone; (2) surface reaction inhibiting homogeneous reaction; (3) homogeneous reaction inhibiting surface reaction; and (4) homogeneous reaction alone. These regimes are related to those found earlier for other chemical systems and form the basis of future experimental investigation of the chemical system considered in the present study.

  16. Benchmarking monthly homogenization algorithms

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Training was found to be very important. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.
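
    Two of the benchmark metrics named above can be written down compactly; the sketch below (with invented toy series, not HOME benchmark data) computes the centred root-mean-square error against the true homogeneous series and the error in the fitted linear trend.

    ```python
    # Centred RMSE and linear-trend error for a homogenized series vs. the truth
    # (toy data with one uncorrected break; illustrative only).
    import numpy as np

    def centred_rmse(homogenized, truth):
        d = (homogenized - homogenized.mean()) - (truth - truth.mean())
        return np.sqrt(np.mean(d**2))

    def trend_error(homogenized, truth, years):
        return np.polyfit(years, homogenized, 1)[0] - np.polyfit(years, truth, 1)[0]

    years = np.arange(1951, 2001)
    truth = 0.01 * (years - years[0]) + np.random.default_rng(0).normal(0, 0.3, years.size)
    homog = truth + 0.2 * (years > 1975)   # residual inhomogeneity left by an algorithm
    print(centred_rmse(homog, truth), trend_error(homog, truth, years))
    ```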

  17. Anthropogenic Matrices Favor Homogenization of Tree Reproductive Functions in a Highly Fragmented Landscape.

    PubMed

    Carneiro, Magda Silva; Campos, Caroline Cambraia Furtado; Beijo, Luiz Alberto; Ramos, Flavio Nunes

    2016-01-01

    Species homogenization or floristic differentiation are two possible consequences of the fragmentation process in plant communities. Despite the few studies, it seems clear that fragments with low forest cover inserted in anthropogenic matrices are more likely to experience floristic homogenization. However, the homogenization process has two other components, genetic and functional, which have not been investigated. The purpose of this study was to verify whether there was homogenization of tree reproductive functions in a fragmented landscape and, if found, to determine how the process was influenced by landscape composition. The study was conducted in eight fragments in southwestern Brazil. In each fragment, all individual trees with a diameter at breast height ≥3 cm were sampled in ten plots (0.2 ha) and classified within 26 reproductive functional types (RFTs). The process of functional homogenization was evaluated using additive partitioning of diversity. Additionally, the effect of landscape composition on functional diversity and on the number of individuals within each RFT was evaluated using a generalized linear mixed model. The fragments appeared to be in a process of functional homogenization (dominance of RFTs, alpha diversity lower than expected by chance and low beta diversity). More than 50% of the RFTs and the functional diversity were affected by the landscape parameters. In general, the percentage of forest cover has a positive effect on RFTs while the percentage of coffee matrix has a negative one. The process of functional homogenization has serious consequences for biodiversity conservation because some functions may disappear that, in the long term, would threaten the fragments. This study contributes to a better understanding of how landscape changes affect functional diversity, the abundance of individuals in RFTs and the process of functional homogenization, as well as how to manage fragmented landscapes.
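
    A minimal sketch of the additive diversity partitioning used above (beta = gamma - mean alpha) is given below; the small sites-by-RFT presence matrix is invented purely for illustration and is not the study's data.

    ```python
    # Additive partitioning of diversity: gamma = mean(alpha) + beta.
    import numpy as np

    def additive_partition(presence):              # presence: sites x types (0/1)
        alpha_bar = presence.sum(axis=1).mean()    # mean per-site richness
        gamma = int((presence.sum(axis=0) > 0).sum())  # landscape-level richness
        return alpha_bar, gamma - alpha_bar, gamma     # alpha, beta, gamma

    sites_by_rft = np.array([[1, 1, 1, 0, 0],
                             [1, 1, 0, 1, 0],
                             [1, 1, 1, 0, 1]])
    print(additive_partition(sites_by_rft))        # low beta -> homogenization
    ```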

  18. Anthropogenic Matrices Favor Homogenization of Tree Reproductive Functions in a Highly Fragmented Landscape

    PubMed Central

    2016-01-01

    Species homogenization or floristic differentiation are two possible consequences of the fragmentation process in plant communities. Despite the few studies, it seems clear that fragments with low forest cover inserted in anthropogenic matrices are more likely to experience floristic homogenization. However, the homogenization process has two other components, genetic and functional, which have not been investigated. The purpose of this study was to verify whether there was homogenization of tree reproductive functions in a fragmented landscape and, if found, to determine how the process was influenced by landscape composition. The study was conducted in eight fragments in southwestern Brazil. In each fragment, all individual trees with a diameter at breast height ≥3 cm were sampled in ten plots (0.2 ha) and classified within 26 reproductive functional types (RFTs). The process of functional homogenization was evaluated using additive partitioning of diversity. Additionally, the effect of landscape composition on functional diversity and on the number of individuals within each RFT was evaluated using a generalized linear mixed model. The fragments appeared to be in a process of functional homogenization (dominance of RFTs, alpha diversity lower than expected by chance and low beta diversity). More than 50% of the RFTs and the functional diversity were affected by the landscape parameters. In general, the percentage of forest cover has a positive effect on RFTs while the percentage of coffee matrix has a negative one. The process of functional homogenization has serious consequences for biodiversity conservation because some functions may disappear that, in the long term, would threaten the fragments. This study contributes to a better understanding of how landscape changes affect functional diversity, the abundance of individuals in RFTs and the process of functional homogenization, as well as how to manage fragmented landscapes. PMID:27760218

  19. Voxel-wise meta-analyses of brain blood flow and local synchrony abnormalities in medication-free patients with major depressive disorder

    PubMed Central

    Chen, Zi-Qi; Du, Ming-Ying; Zhao, You-Jin; Huang, Xiao-Qi; Li, Jing; Lui, Su; Hu, Jun-Mei; Sun, Huai-Qiang; Liu, Jia; Kemp, Graham J.; Gong, Qi-Yong

    2015-01-01

    Background Published meta-analyses of resting-state regional cerebral blood flow (rCBF) studies of major depressive disorder (MDD) have included patients receiving antidepressants, which might affect brain activity and thus bias the results. To our knowledge, no meta-analysis has investigated regional homogeneity changes in medication-free patients with MDD. Moreover, an association between regional homogeneity and rCBF has been demonstrated in some brain regions in healthy controls. We sought to explore to what extent resting-state rCBF and regional homogeneity changes co-occur in the depressed brain without the potential confound of medication. Methods Using the effect-size signed differential mapping method, we conducted 2 meta-analyses of rCBF and regional homogeneity studies of medication-free patients with MDD. Results Our systematic search identified 14 rCBF studies and 9 regional homogeneity studies. We identified conjoint decreases in resting-state rCBF and regional homogeneity in the insula and superior temporal gyrus in medication-free patients with MDD compared with controls. Other changes included altered resting-state rCBF in the precuneus and in the frontal–limbic–thalamic–striatal neural circuit as well as altered regional homogeneity in the uncus and parahippocampal gyrus. Meta-regression revealed that the percentage of female patients with MDD was negatively associated with resting-state rCBF in the right anterior cingulate cortex and that the age of patients with MDD was negatively associated with rCBF in the left insula and with regional homogeneity in the left uncus. Limitations The analysis techniques, patient characteristics and clinical variables of the included studies were heterogeneous. Conclusion The conjoint alterations of rCBF and regional homogeneity in the insula and superior temporal gyrus may be core neuropathological changes in medication-free patients with MDD and serve as a specific region of interest for further studies on MDD. PMID:25853283

  20. Voxel-wise meta-analyses of brain blood flow and local synchrony abnormalities in medication-free patients with major depressive disorder.

    PubMed

    Chen, Zi-Qi; Du, Ming-Ying; Zhao, You-Jin; Huang, Xiao-Qi; Li, Jing; Lui, Su; Hu, Jun-Mei; Sun, Huai-Qiang; Liu, Jia; Kemp, Graham J; Gong, Qi-Yong

    2015-11-01

    Published meta-analyses of resting-state regional cerebral blood flow (rCBF) studies of major depressive disorder (MDD) have included patients receiving antidepressants, which might affect brain activity and thus bias the results. To our knowledge, no meta-analysis has investigated regional homogeneity changes in medication-free patients with MDD. Moreover, an association between regional homogeneity and rCBF has been demonstrated in some brain regions in healthy controls. We sought to explore to what extent resting-state rCBF and regional homogeneity changes co-occur in the depressed brain without the potential confound of medication. Using the effect-size signed differential mapping method, we conducted 2 meta-analyses of rCBF and regional homogeneity studies of medication-free patients with MDD. Our systematic search identified 14 rCBF studies and 9 regional homogeneity studies. We identified conjoint decreases in resting-state rCBF and regional homogeneity in the insula and superior temporal gyrus in medication-free patients with MDD compared with controls. Other changes included altered resting-state rCBF in the precuneus and in the frontal-limbic-thalamic-striatal neural circuit as well as altered regional homogeneity in the uncus and parahippocampal gyrus. Meta-regression revealed that the percentage of female patients with MDD was negatively associated with resting-state rCBF in the right anterior cingulate cortex and that the age of patients with MDD was negatively associated with rCBF in the left insula and with regional homogeneity in the left uncus. The analysis techniques, patient characteristics and clinical variables of the included studies were heterogeneous. The conjoint alterations of rCBF and regional homogeneity in the insula and superior temporal gyrus may be core neuropathological changes in medication-free patients with MDD and serve as a specific region of interest for further studies on MDD.

  1. Effects of poling over the orthorhombic-tetragonal phase transition temperature in compositionally homogeneous (K,Na)NbO3-based ceramics

    NASA Astrophysics Data System (ADS)

    Morozov, M. I.; Kungl, H.; Hoffmann, M. J.

    2011-03-01

    Li-, Ta-, and Mn-modified (K,Na)NbO3 ceramics with varying degrees of compositional homogeneity have been prepared by conventional and precursor methods. The homogeneous ceramic demonstrated a sharper peak in the temperature-dependent piezoelectric response. The dielectric and piezoelectric properties of the homogeneous ceramics have been characterized at experimentally applied subcoercive electric fields near the temperature of the orthorhombic-tetragonal phase transition, with respect to poling in both phases. Poling in the tetragonal phase is shown to enhance the low-signal dielectric and piezoelectric properties in the orthorhombic phase.

  2. A MULTISCALE FRAMEWORK FOR THE STOCHASTIC ASSIMILATION AND MODELING OF UNCERTAINTY ASSOCIATED NCF COMPOSITE MATERIALS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehrez, Loujaine; Ghanem, Roger; McAuliffe, Colin

    A multiscale framework to construct stochastic macroscopic constitutive material models is proposed. A spectral projection approach, specifically polynomial chaos expansion, has been used to construct explicit functional relationships between the homogenized properties and input parameters from finer scales. A homogenization engine embedded in Multiscale Designer, a software package for composite materials, has been used for the upscaling process. The framework is demonstrated using non-crimp fabric composite materials by constructing probabilistic models of the homogenized properties of a non-crimp fabric laminate in terms of the input parameters together with the homogenized properties from finer scales.
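
    The spectral-projection idea can be sketched in a few lines; the example below fits a one-dimensional polynomial chaos expansion (probabilists' Hermite basis in a standard normal input) to samples of a homogenized property by regression. The quadratic "upscaling" function stands in for the homogenization engine and is purely an assumption for illustration.

    ```python
    # Regression-based polynomial chaos expansion (1-D Hermite chaos sketch).
    import numpy as np
    from numpy.polynomial.hermite_e import hermevander

    def fit_pce(xi, y, order=3):
        """Least-squares PCE coefficients in the He_0..He_order basis."""
        Psi = hermevander(xi, order)               # design matrix
        coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)
        return coeffs

    rng = np.random.default_rng(1)
    xi = rng.standard_normal(200)                  # standardized input parameter
    E_hom = 70.0 + 5.0 * xi + 0.8 * xi**2          # stand-in homogenized modulus
    c = fit_pce(xi, E_hom)
    print("PCE mean of the property ~", round(c[0], 2))   # E[He_k] = 0 for k > 0
    ```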

  3. Mixed Mode Fuel Injector And Injection System

    DOEpatents

    Stewart, Chris Lee; Tian, Ye; Wang, Lifeng; Shafer, Scott F.

    2005-12-27

    A fuel injector includes a homogeneous charge nozzle outlet set and a conventional nozzle outlet set that are controlled respectively by first and second three-way needle control valves. Each fuel injector includes first and second concentric needle valve members. One of the needle valve members moves to an open position for a homogeneous charge injection event, while the other needle valve member moves to an open position for a conventional injection event. The fuel injector has the ability to operate in a homogeneous charge mode with a homogeneous charge spray pattern, in a conventional mode with a conventional spray pattern, or in a mixed mode.

  4. Low-gravity homogenization and solidification of aluminum antimonide. [Apollo-Soyuz test project

    NASA Technical Reports Server (NTRS)

    Ang, C.-Y.; Lacy, L. L.

    1976-01-01

    The III-V semiconducting compound AlSb shows promise as a highly efficient solar cell material, but it has not been commercially exploited because of difficulties in compound synthesis. Liquid state homogenization and solidification of AlSb were carried out in the Apollo-Soyuz Test Project Experiment MA-044 in the hope that compositional homogeneity would be improved by negating the large density difference between the two constituents. Post-flight analysis and comparative characterization of the space-processed and ground-processed samples indicate that there are major homogeneity improvements in the low-gravity solidified material.

  5. Analysis of an Interface Crack for a Functionally Graded Strip Sandwiched between Two Homogeneous Layers of Finite Thickness

    NASA Technical Reports Server (NTRS)

    Shbeeh, N. I.; Binienda, W. K.

    1999-01-01

    The interface crack problem for a composite layer that consists of a homogeneous substrate, a coating and a non-homogeneous interface was formulated in terms of singular integral equations with Cauchy kernels and integrated using the Lobatto-Chebyshev collocation technique. Mixed-mode stress intensity factors and strain energy release rates were calculated. The stress intensity factors were compared for accuracy with relevant results previously published. Parametric studies were conducted for various thicknesses of each layer and for various non-homogeneity ratios. A particular application to a zirconia thermal barrier coating on a steel substrate is demonstrated.

  6. Macro-architectured cellular materials: Properties, characteristic modes, and prediction methods

    NASA Astrophysics Data System (ADS)

    Ma, Zheng-Dong

    2017-12-01

    Macro-architectured cellular (MAC) material is defined as a class of engineered materials having configurable cells of relatively large (i.e., visible) size that can be architecturally designed to achieve various desired material properties. Two types of novel MAC materials, negative Poisson's ratio material and biomimetic tendon reinforced material, were introduced in this study. To estimate the effective material properties for structural analyses and to optimally design such materials, a set of suitable homogenization methods was developed that provided an effective means for the multiscale modeling of MAC materials. First, a strain-based homogenization method was developed using an approach that separated the strain field into a homogenized strain field and a strain variation field in the local cellular domain superposed on the homogenized strain field. The principle of virtual displacements for the relationship between the strain variation field and the homogenized strain field was then used to condense the strain variation field onto the homogenized strain field. The new method was then extended to a stress-based homogenization process based on the principle of virtual forces and further applied to address the discrete systems represented by the beam or frame structures of the aforementioned MAC materials. The characteristic modes and the stress recovery process used to predict the stress distribution inside the cellular domain and thus determine the material strengths and failures at the local level are also discussed.
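
    For context, the generic first-order homogenization relations that such methods build on can be stated as volume averages over the cell domain; the LaTeX below is a standard textbook statement under the Hill-Mandel condition, not the specific strain-based/stress-based formulation developed in the record.

    ```latex
    % Volume averages over the cell domain \Omega and the Hill--Mandel condition.
    \bar{\varepsilon}_{ij} = \frac{1}{|\Omega|}\int_{\Omega}\varepsilon_{ij}(\mathbf{x})\,\mathrm{d}\Omega ,
    \qquad
    \bar{\sigma}_{ij} = \frac{1}{|\Omega|}\int_{\Omega}\sigma_{ij}(\mathbf{x})\,\mathrm{d}\Omega ,
    \qquad
    \bar{\sigma}_{ij}\,\bar{\varepsilon}_{ij}
      = \frac{1}{|\Omega|}\int_{\Omega}\sigma_{ij}(\mathbf{x})\,\varepsilon_{ij}(\mathbf{x})\,\mathrm{d}\Omega .
    % Splitting the local strain into the homogenized part and a fluctuation,
    % \varepsilon_{ij}(\mathbf{x}) = \bar{\varepsilon}_{ij} + \tilde{\varepsilon}_{ij}(\mathbf{x}),
    % the effective stiffness follows from
    \bar{\sigma}_{ij} = C^{\mathrm{eff}}_{ijkl}\,\bar{\varepsilon}_{kl} .
    ```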

  7. Progesterone lipid nanoparticles: Scaling up and in vivo human study.

    PubMed

    Esposito, Elisabetta; Sguizzato, Maddalena; Drechsler, Markus; Mariani, Paolo; Carducci, Federica; Nastruzzi, Claudio; Cortesi, Rita

    2017-10-01

    This investigation describes a scaling-up study aimed at producing progesterone-containing nanoparticles at a pilot scale. In particular, hot homogenization techniques based on ultrasound or high-pressure homogenization were employed to produce lipid nanoparticles constituted of tristearin or of tristearin in association with caprylic-capric triglyceride. It was found that the high-pressure homogenization method enabled the production of nanoparticles without agglomerates and with smaller mean diameters than the ultrasound homogenization method. X-ray characterization suggested a lamellar structural organization of both types of nanoparticles. Progesterone encapsulation efficiency was almost 100% in the case of the high-pressure homogenization method. A shelf-life study indicated a two-fold increase in the stability of progesterone when encapsulated in nanoparticles produced by the high-pressure homogenization method. Dialysis and Franz cell methods were performed to mimic subcutaneous and skin administration. Nanoparticles constituted of tristearin in mixture with caprylic/capric triglyceride displayed a slower release of progesterone than nanoparticles constituted of pure tristearin. The Franz cell experiments evidenced higher progesterone skin uptake in the case of pure tristearin nanoparticles. A human in vivo study, based on tape stripping, was conducted to investigate the performance of the nanoparticles as progesterone skin delivery systems. Tape stripping results indicated a decrease of progesterone concentration in the stratum corneum within six hours, suggesting an interaction between the nanoparticle material and skin lipids. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Applications of High and Ultra High Pressure Homogenization for Food Safety.

    PubMed

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low temperature long time and high temperature short time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide "fresh-like" products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Accordingly, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species in relation to the food matrix and food chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered.

  9. Applications of High and Ultra High Pressure Homogenization for Food Safety

    PubMed Central

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low temperature long time and high temperature short time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide “fresh-like” products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350–400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Accordingly, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species in relation to the food matrix and food chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered. PMID:27536270

  10. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation

    PubMed Central

    Tang, Liang; Cheng, Pengle

    2017-01-01

    Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic–plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis. PMID:28746390
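
    As a rough illustration of the kind of homogenized constitutive response such a module model uses, the toy one-dimensional bilinear elastic-plastic law below maps nominal strain to stress; the modulus, yield stress and hardening modulus are invented placeholders and are not calibrated to the RUC described in the record.

    ```python
    # Toy 1-D bilinear elastic-plastic law (monotonic loading only; values are
    # hypothetical placeholders, not calibrated battery-module properties).
    import numpy as np

    def bilinear_stress(strain, E=500.0, sigma_y=5.0, H=50.0):
        strain = np.asarray(strain, dtype=float)
        eps_y = sigma_y / E                              # elastic limit strain
        return np.where(strain <= eps_y,
                        E * strain,                      # elastic branch
                        sigma_y + H * (strain - eps_y))  # hardening branch

    eps = np.linspace(0.0, 0.12, 7)                # up to ~12% deformation
    print(np.round(bilinear_stress(eps), 2))       # MPa, illustrative only
    ```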

  11. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation.

    PubMed

    Tang, Liang; Zhang, Jinjie; Cheng, Pengle

    2017-01-01

    Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic-plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis.

  12. Cooperative Learning: Homogeneous and Heterogeneous Grouping of Iranian EFL Learners in a Writing Context

    ERIC Educational Resources Information Center

    Zamani, Mona

    2016-01-01

    One of the important aspects of learning and teaching through cooperation is the group composition or grouping "who with whom". An unresolved issue is that of the superiority of heterogeneity or homogeneity in the structure of the groups. The present study was an attempt to investigate the impact that homogeneous and heterogeneous…

  13. Linking biotic homogenization to habitat type, invasiveness and growth form of naturalized alien plants in North America

    Treesearch

    Hong Qian; Qinfeng Guo

    2010-01-01

    Aim Biotic homogenization is a growing phenomenon and has recently attracted much attention. Here, we analyse a large dataset of native and alien plants in North America to examine whether biotic homogenization is related to several ecological and biological attributes. Location North America (north of Mexico). Methods We assembled...

  14. Homogeneous Grouping and the Individualization of Instruction in Remedial Reading in an Intermediate School.

    ERIC Educational Resources Information Center

    Kelly, Thomas F.

    A remedial reading program designed for intermediate-grade students who read from 1 to 7 years below grade level was studied. The program provided individualized instruction within classes homogeneously grouped on the basis of reading level only. Six seventh-grade classes were studied, with three acting as homogeneously grouped experimental…

  15. Effect of Ability Grouping in Reciprocal Teaching Technique of Collaborative Learning on Individual Achievements and Social Skills

    ERIC Educational Resources Information Center

    Sumadi; Degeng, I Nyoman S.; Sulthon; Waras

    2017-01-01

    This research focused on the effects of ability grouping in the reciprocal teaching technique of collaborative learning on individual achievements and social skills. The results showed that (1) there are significant differences in individual achievement between the high homogeneous group, the middle homogeneous group, the low homogeneous group,…

  16. The homogeneity effect on figure/ground perception in infancy.

    PubMed

    Takashima, Midori; Kanazawa, So; Yamaguchi, Masami K; Shiina, Ken

    2014-02-01

    We examined whether the homogeneity of the two profiles of Rubin's goblet affects figure/ground perception in infants. We modified the two profiles of Rubin's goblet in order to compare figure/ground perception under four test patterns: (1) two profiles painted with horizontal lines (horizontal-line condition), (2) two profiles painted middle gray (uni-color condition), (3) one profile painted light gray and the other dark gray (two-color condition), and (4) a goblet painted with concentric circles (concentric-circles condition). In the horizontal-line condition the homogeneity of the profiles was strengthened, and in the two-color condition the homogeneity of the profiles was weakened compared to the uni-color condition, which was an original Rubin's goblet. In the concentric-circles condition the homogeneity of the reversed areas of the horizontal-line pattern was strengthened. After infants were familiarized with each Rubin's goblet, they were tested on their discrimination between the two profiles and the goblet in a post-familiarization test. In the horizontal-line, uni-color and concentric-circles conditions, infants showed a novelty preference for the two profiles in the post-familiarization test. On the other hand, in the two-color condition no preference was observed in the post-familiarization test. This means that infants perceived the goblet as figure and the two profiles as ground in the horizontal-line, uni-color and concentric-circles conditions. We found that infants could not perceive the goblet area as figure when the homogeneity of the two profiles was weakened. It can be said that figure/ground perception in infancy is not affected by strengthened homogeneity, but is affected by weakened homogeneity. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Land-use intensification causes multitrophic homogenization of grassland communities

    NASA Astrophysics Data System (ADS)

    Gossner, Martin M.; Lewinsohn, Thomas M.; Kahl, Tiemo; Grassein, Fabrice; Boch, Steffen; Prati, Daniel; Birkhofer, Klaus; Renner, Swen C.; Sikorski, Johannes; Wubet, Tesfaye; Arndt, Hartmut; Baumgartner, Vanessa; Blaser, Stefan; Blüthgen, Nico; Börschig, Carmen; Buscot, Francois; Diekötter, Tim; Jorge, Leonardo Ré; Jung, Kirsten; Keyel, Alexander C.; Klein, Alexandra-Maria; Klemmer, Sandra; Krauss, Jochen; Lange, Markus; Müller, Jörg; Overmann, Jörg; Pašalić, Esther; Penone, Caterina; Perović, David J.; Purschke, Oliver; Schall, Peter; Socher, Stephanie A.; Sonnemann, Ilja; Tschapka, Marco; Tscharntke, Teja; Türke, Manfred; Venter, Paul Christiaan; Weiner, Christiane N.; Werner, Michael; Wolters, Volkmar; Wurst, Susanne; Westphal, Catrin; Fischer, Markus; Weisser, Wolfgang W.; Allan, Eric

    2016-12-01

    Land-use intensification is a major driver of biodiversity loss. Alongside reductions in local species diversity, biotic homogenization at larger spatial scales is of great concern for conservation. Biotic homogenization means a decrease in β-diversity (the compositional dissimilarity between sites). Most studies have investigated losses in local (α)-diversity and neglected biodiversity loss at larger spatial scales. Studies addressing β-diversity have focused on single or a few organism groups (for example, ref. 4), and it is thus unknown whether land-use intensification homogenizes communities at different trophic levels, above- and belowground. Here we show that even moderate increases in local land-use intensity (LUI) cause biotic homogenization across microbial, plant and animal groups, both above- and belowground, and that this is largely independent of changes in α-diversity. We analysed a unique grassland biodiversity dataset, with abundances of more than 4,000 species belonging to 12 trophic groups. LUI, and, in particular, high mowing intensity, had consistent effects on β-diversity across groups, causing a homogenization of soil microbial, fungal pathogen, plant and arthropod communities. These effects were nonlinear and the strongest declines in β-diversity occurred in the transition from extensively managed to intermediate intensity grassland. LUI tended to reduce local α-diversity in aboveground groups, whereas the α-diversity increased in belowground groups. Correlations between the β-diversity of different groups, particularly between plants and their consumers, became weaker at high LUI. This suggests a loss of specialist species and is further evidence for biotic homogenization. The consistently negative effects of LUI on landscape-scale biodiversity underscore the high value of extensively managed grasslands for conserving multitrophic biodiversity and ecosystem service provision. Indeed, biotic homogenization rather than local diversity loss could prove to be the most substantial consequence of land-use intensification.

  18. Concordance and discordance between taxonomic and functional homogenization: responses of soil mite assemblages to forest conversion.

    PubMed

    Mori, Akira S; Ota, Aino T; Fujii, Saori; Seino, Tatsuyuki; Kabeya, Daisuke; Okamoto, Toru; Ito, Masamichi T; Kaneko, Nobuhiro; Hasegawa, Motohiro

    2015-10-01

    The compositional characteristics of ecological assemblages are often simplified; this process is termed "biotic homogenization." This process of biological reorganization occurs not only taxonomically but also functionally. Testing both aspects of homogenization is essential if ecosystem functioning supported by a diverse mosaic of functional traits in the landscape is concerned. Here, we aimed to infer the underlying processes of taxonomic/functional homogenization at the local scale, which is a scale that is meaningful for this research question. We recorded species of litter-dwelling oribatid mites along a gradient of forest conversion from a natural forest to a monoculture larch plantation in Japan (in total 11 stands), and collected data on the functional traits of the recorded species to quantify functional diversity. We calculated the taxonomic and functional β-diversity, an index of biotic homogenization. We found that both the taxonomic and functional β-diversity decreased with larch dominance (stand homogenization). After further deconstructing β-diversity into the components of turnover and nestedness, which reflect different processes of community organization, a significant decrease in the response to larch dominance was observed only for the functional turnover. As a result, there was a steeper decline in the functional β-diversity than the taxonomic β-diversity. This discordance between the taxonomic and functional response suggests that species replacement occurs between species that are functionally redundant under environmental homogenization, ultimately leading to the stronger homogenization of functional diversity. The insights gained from community organization of oribatid mites suggest that the functional characteristics of local assemblages, which support the functionality of ecosystems, are of more concern in human-dominated forest landscapes.
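
    The turnover/nestedness split referred to above can be computed pairwise in a few lines; the sketch below uses the standard Baselga-type partition of Sorensen dissimilarity, with two invented presence/absence vectors in place of the mite assemblage data.

    ```python
    # Pairwise partition of Sorensen dissimilarity into turnover (Simpson) and
    # nestedness components for two 0/1 species vectors (illustrative data).
    import numpy as np

    def beta_partition(site1, site2):
        a = int(np.sum((site1 == 1) & (site2 == 1)))   # shared species
        b = int(np.sum((site1 == 1) & (site2 == 0)))   # unique to site 1
        c = int(np.sum((site1 == 0) & (site2 == 1)))   # unique to site 2
        beta_sor = (b + c) / (2 * a + b + c)           # total dissimilarity
        beta_sim = min(b, c) / (a + min(b, c))         # turnover component
        return beta_sor, beta_sim, beta_sor - beta_sim # nestedness = difference

    s1 = np.array([1, 1, 1, 1, 0, 0])
    s2 = np.array([1, 1, 0, 0, 1, 0])
    print(beta_partition(s1, s2))
    ```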

  19. Land-use intensification causes multitrophic homogenization of grassland communities.

    PubMed

    Gossner, Martin M; Lewinsohn, Thomas M; Kahl, Tiemo; Grassein, Fabrice; Boch, Steffen; Prati, Daniel; Birkhofer, Klaus; Renner, Swen C; Sikorski, Johannes; Wubet, Tesfaye; Arndt, Hartmut; Baumgartner, Vanessa; Blaser, Stefan; Blüthgen, Nico; Börschig, Carmen; Buscot, Francois; Diekötter, Tim; Jorge, Leonardo Ré; Jung, Kirsten; Keyel, Alexander C; Klein, Alexandra-Maria; Klemmer, Sandra; Krauss, Jochen; Lange, Markus; Müller, Jörg; Overmann, Jörg; Pašalić, Esther; Penone, Caterina; Perović, David J; Purschke, Oliver; Schall, Peter; Socher, Stephanie A; Sonnemann, Ilja; Tschapka, Marco; Tscharntke, Teja; Türke, Manfred; Venter, Paul Christiaan; Weiner, Christiane N; Werner, Michael; Wolters, Volkmar; Wurst, Susanne; Westphal, Catrin; Fischer, Markus; Weisser, Wolfgang W; Allan, Eric

    2016-12-08

    Land-use intensification is a major driver of biodiversity loss. Alongside reductions in local species diversity, biotic homogenization at larger spatial scales is of great concern for conservation. Biotic homogenization means a decrease in β-diversity (the compositional dissimilarity between sites). Most studies have investigated losses in local (α)-diversity and neglected biodiversity loss at larger spatial scales. Studies addressing β-diversity have focused on single or a few organism groups (for example, ref. 4), and it is thus unknown whether land-use intensification homogenizes communities at different trophic levels, above- and belowground. Here we show that even moderate increases in local land-use intensity (LUI) cause biotic homogenization across microbial, plant and animal groups, both above- and belowground, and that this is largely independent of changes in α-diversity. We analysed a unique grassland biodiversity dataset, with abundances of more than 4,000 species belonging to 12 trophic groups. LUI, and, in particular, high mowing intensity, had consistent effects on β-diversity across groups, causing a homogenization of soil microbial, fungal pathogen, plant and arthropod communities. These effects were nonlinear and the strongest declines in β-diversity occurred in the transition from extensively managed to intermediate intensity grassland. LUI tended to reduce local α-diversity in aboveground groups, whereas the α-diversity increased in belowground groups. Correlations between the β-diversity of different groups, particularly between plants and their consumers, became weaker at high LUI. This suggests a loss of specialist species and is further evidence for biotic homogenization. The consistently negative effects of LUI on landscape-scale biodiversity underscore the high value of extensively managed grasslands for conserving multitrophic biodiversity and ecosystem service provision. Indeed, biotic homogenization rather than local diversity loss could prove to be the most substantial consequence of land-use intensification.

  20. Direct current dielectric barrier assistant discharge to get homogeneous plasma in capacitive coupled discharge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, Yinchang, E-mail: ycdu@mail.ustc.edu.cn; Max-Planck Institute for Extraterrestrial Physics, D-85748 Garching; Li, Yangfang

    In this paper, we propose a method to obtain more homogeneous plasma in a geometrically asymmetric capacitively coupled plasma (CCP) discharge. A dielectric barrier discharge (DBD) is used as the auxiliary discharge system to improve the homogeneity of the geometrically asymmetric CCP discharge. Single Langmuir probe measurements show that the DBD can increase the electron density in the low-density volume, where the DBD electrodes are mounted, when the pressure is higher than 5 Pa. In this manner, we are able to improve the homogeneity of the plasma production and increase the overall density in the target volume. Finally, the finite element simulation results show that a DC bias applied to the DBD electrodes can increase the homogeneity of the electron density in the CCP discharge. The simulation results show good agreement with the experimental results.

  1. A novel approach to make homogeneous protease-stable monovalent streptavidin

    DOE PAGES

    Zhang, M.; Shao, J; Xiao, J.; ...

    2015-06-11

    The interaction between the tetramer streptavidin and biotin is recognized as one of the strongest non-covalent associations. Owing to this tight and specific binding, the streptavidin-biotin system has been used widely for biomolecular labeling, purification, immobilization, and even for targeted delivery of therapeutic drugs. Here, we report a novel approach to make homogeneous monovalent tetramer streptavidin. The purified monovalent protein showed both thermal stability and protease stability. Unexpectedly, we found that two proteases, Proteinase K (PK) and Subtilisin (SU), can efficiently remove the His8-tag from the wild-type subunit without affecting the tetramer architecture of monovalent streptavidin, thus making it more homogeneous. In addition, crystallization was performed to confirm the homogeneity of the monovalent protein prepared. Overall, monovalent streptavidin shows increased homogeneity and will likely be valuable for many future applications in a wide range of research areas.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Ramazan Sonat; Hummel, Andrew John; Hiruta, Hikaru

    Deterministic full-core simulators require homogenized group constants covering the operating and transient conditions over the entire lifetime. Traditionally, the homogenized group constants are generated using a lattice physics code over an assembly or, in the case of prismatic high temperature reactors (HTRs), a block. Strong absorbers that cause strong local depressions in the flux profile require special techniques during homogenization over a large volume; fuel blocks with burnable poisons and control rod blocks are examples of such cases. Over the past several decades, a tremendous number of studies have been performed to improve the accuracy of full-core calculations through the homogenization procedure. However, those studies were mostly performed for light water reactor (LWR) analyses and thus may not be directly applicable to advanced thermal reactors such as HTRs. This report presents the application of the SuPer-Homogenization correction method to a hypothetical HTR core.

  3. Dark energy homogeneity in general relativity: Are we applying it correctly?

    NASA Astrophysics Data System (ADS)

    Duniya, Didam G. A.

    2016-04-01

    Thus far, there does not appear to be an agreed (or adequate) definition of homogeneous dark energy (DE). This paper seeks to define a valid, adequate homogeneity condition for DE. Firstly, it is shown that as long as w_x ≠ -1, DE must have perturbations. It is then argued, independent of w_x, that a correct definition of homogeneous DE is one whose density perturbation vanishes in comoving gauge: and hence, in the DE rest frame. Using phenomenological DE, the consequence of this approach is then investigated in the observed galaxy power spectrum—with the power spectrum being normalized on small scales, at the present epoch z=0. It is found that for high magnification bias, relativistic corrections in the galaxy power spectrum are able to distinguish the concordance model from both a homogeneous DE and a clustering DE—on super-horizon scales.
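
    A hedged worked statement of the homogeneity condition discussed above, in one common convention for scalar perturbations (conformal Hubble rate \mathcal{H}=aH, DE velocity potential V_x, equation of state w_x; signs and factors of k differ between authors, so treat this as a sketch rather than the paper's exact definitions):

    ```latex
    % "Homogeneous DE": the density contrast vanishes in the gauge comoving
    % with the dark energy, i.e. in the DE rest frame.
    \delta_x^{(\mathrm{com})} \;\equiv\;
      \delta_x \,+\, 3\,\mathcal{H}\,(1+w_x)\,\frac{V_x}{k} \;=\; 0 .
    ```

    For w_x \neq -1 the continuity and Euler equations generically source nonzero \delta_x and V_x in other gauges, which is why the condition is stated in the comoving gauge rather than as a vanishing perturbation in every gauge.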

  4. Microphysical Modelling of the 1999-2000 Arctic Winter. 3; Impact of Homogeneous Freezing on PSCs

    NASA Technical Reports Server (NTRS)

    Drdla, K.

    2003-01-01

    Simulations of the 1999-2000 winter have tested the effect on polar stratospheric clouds (PSCs) of the homogeneous freezing of liquid ternary solutions into nitric acid trihydrate (NAT) and nitric acid dihydrate (NAD). Proposed laboratory-derived volume-based and surface-based homogeneous freezing rates have both been examined, including different assumptions about the extrapolation of laboratory measurements to atmospheric conditions. Widespread PSC formation and denitrification are possible in several of the scenarios examined. However, the simulations are all unable to explain the solid-phase PSCs observed early in the 1999-2000 winter, and are unable to reproduce the measured extent of vortex denitrification. These problems can both be attributed to the relatively cold temperatures, more than 5 K below the NAT condensation point, necessary for effective homogeneous freezing. Therefore synoptic-scale homogeneous freezing appears unlikely to be the primary mechanism responsible for solid-phase PSC formation.

  5. Method for preparing hydrous zirconium oxide gels and spherules

    DOEpatents

    Collins, Jack L.

    2003-08-05

    Methods for preparing hydrous zirconium oxide spherules, hydrous zirconium oxide gels such as gel slabs, films, capillary and electrophoresis gels, zirconium monohydrogen phosphate spherules, hydrous zirconium oxide spherules having suspendable particles homogeneously embedded within to form a composite sorbent, zirconium monohydrogen phosphate spherules having suspendable particles of at least one different sorbent homogeneously embedded within to form a composite sorbent having a desired crystallinity, zirconium oxide spherules having suspendable particles homogeneously embedded within to form a composite, hydrous zirconium oxide fiber materials, zirconium oxide fiber materials, hydrous zirconium oxide fiber materials having suspendable particles homogeneously embedded within to form a composite, zirconium oxide fiber materials having suspendable particles homogeneously embedded within to form a composite and spherules of barium zirconate. The hydrous zirconium oxide spherules and gel forms prepared by the gel-sphere, internal gelation process are useful as inorganic ion exchangers, catalysts, getters and ceramics.

  6. The efficiency of average linkage hierarchical clustering algorithm associated multi-scale bootstrap resampling in identifying homogeneous precipitation catchments

    NASA Astrophysics Data System (ADS)

    Chuan, Zun Liang; Ismail, Noriszura; Shinyie, Wendy Ling; Lit Ken, Tan; Fam, Soo-Fen; Senawi, Azlyna; Yusoff, Wan Nur Syahidah Wan

    2018-04-01

    Due to the limited availability of historical precipitation records, agglomerative hierarchical clustering algorithms are widely used to extrapolate information from gauged to ungauged precipitation catchments, yielding more reliable projections of extreme hydro-meteorological events such as extreme precipitation events. However, accurately identifying the optimum number of homogeneous precipitation catchments from the dendrograms produced by agglomerative hierarchical algorithms is very subjective. The main objective of this study is to propose an efficient regionalized algorithm to identify homogeneous precipitation catchments for non-stationary precipitation time series. The homogeneous precipitation catchments are identified using the average linkage hierarchical clustering algorithm associated with multi-scale bootstrap resampling, with the uncentered correlation coefficient as the similarity measure. The regionalized homogeneous precipitation is consolidated using the K-sample Anderson-Darling non-parametric test. The analysis shows that the proposed regionalized algorithm performs better than the agglomerative hierarchical clustering algorithms proposed in previous studies.
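
    The clustering step described above can be sketched with standard tools; the snippet below applies average-linkage hierarchical clustering with the uncentered correlation coefficient as the similarity measure (equivalently, cosine distance) to random placeholder series. The multiscale-bootstrap assessment of cluster uncertainty is not reproduced here.

    ```python
    # Average-linkage clustering of catchment series with 1 - uncentered
    # correlation (cosine distance); data are random placeholders.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(42)
    series = rng.gamma(shape=2.0, scale=10.0, size=(12, 120))   # 12 sites x 120 months

    dist = pdist(series, metric="cosine")     # cosine distance = 1 - uncentered corr.
    Z = linkage(dist, method="average")       # average-linkage agglomeration
    labels = fcluster(Z, t=4, criterion="maxclust")   # cut into 4 candidate regions
    print(labels)
    ```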

  7. High Shear Homogenization of Lignin to Nanolignin and Thermal Stability of Nanolignin-Polyvinyl Alcohol Blends

    Treesearch

    Sandeep S. Nair; Sudhir Sharma; Yunqiao Pu; Qining Sun; Shaobo Pan; J.Y. Zhu; Yulin Deng; Art J. Ragauskas

    2014-01-01

    A new method to prepare nanolignin using a simple high shear homogenizer is presented. The kraft lignin particles with a broad distribution ranging from large micron- to nano-sized particles were completely homogenized to nanolignin particles with sizes less than 100 nm after 4 h of mechanical shearing. The 13C nuclear magnetic resonance (NMR)...

  8. Differential Effects of Literacy Instruction Time and Homogeneous Ability Grouping in Kindergarten Classrooms: Who Will Benefit? Who Will Suffer?

    ERIC Educational Resources Information Center

    Hong, Guanglei; Corter, Carl; Hong, Yihua; Pelletier, Janette

    2012-01-01

    This study challenges the belief that homogeneous ability grouping benefits high-ability students in cognitive and social-emotional development at the expense of their low-ability peers. From a developmental point of view, the authors hypothesize that homogeneous grouping may improve the learning behaviors and may benefit the literacy learning of…

  9. Comparison of different hydrological similarity measures to estimate flow quantiles

    NASA Astrophysics Data System (ADS)

    Rianna, M.; Ridolfi, E.; Napolitano, F.

    2017-07-01

    This paper aims to evaluate the influence of hydrological similarity measures on the definition of homogeneous regions. To this end, several attribute sets have been analyzed in the context of the Region of Influence (ROI) procedure. Several combinations of geomorphological, climatological, and geographical characteristics are also used to cluster potentially homogeneous regions. To verify the goodness of the resulting pooled sites, homogeneity tests are carried out. Through a Monte Carlo simulation and a jack-knife procedure, flow quantiles are estimated for the regions that effectively result as homogeneous. The analysis is performed in both the so-called gauged and ungauged scenarios to analyze the effect of hydrological similarity measures on flow quantile estimation.
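
    A schematic sketch of a Region-of-Influence style pooling step follows: sites are ranked by Euclidean distance in standardized attribute space, the nearest ones form the ROI of a target site, and a flow quantile is estimated from the pooled records rescaled by each site's index flood. All numbers are synthetic, and the index-flood back-transform is an assumption for illustration rather than the paper's exact procedure.

    ```python
    # ROI-style pooling and quantile estimation on synthetic data (illustration).
    import numpy as np

    rng = np.random.default_rng(7)
    n_sites = 20
    attrs = rng.normal(size=(n_sites, 3))      # standardized catchment attributes
    flows = [rng.gumbel(loc=50 + 5 * i, scale=12, size=40) for i in range(n_sites)]

    def roi_quantile(target, k=6, p=0.99):
        d = np.linalg.norm(attrs - attrs[target], axis=1)
        region = np.argsort(d)[:k]             # target plus its nearest neighbours
        pooled = np.concatenate([flows[s] / np.mean(flows[s]) for s in region])
        growth = np.quantile(pooled, p)        # regional growth-curve quantile
        return growth * np.mean(flows[target]) # back-transform with the index flood

    print(round(roi_quantile(target=0), 1))
    ```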

  10. Homogenization kinetics of a nickel-based superalloy produced by powder bed fusion laser sintering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Fan; Levine, Lyle E.; Allen, Andrew J.

    2017-04-01

    Additively manufactured (AM) metal components often exhibit fine dendritic microstructures and elemental segregation due to the initial rapid solidification and subsequent melting and cooling during the build process, which without homogenization would adversely affect materials performance. In this letter, we report in situ observation of the homogenization kinetics of an AM nickel-based superalloy using synchrotron small angle X-ray scattering. The identified kinetic time scale is in good agreement with thermodynamic diffusion simulation predictions using microstructural dimensions acquired by ex situ scanning electron microscopy. These findings could serve as a recipe for predicting, observing, and validating homogenization treatments in AM materials.

  11. Homogenization Kinetics of a Nickel-based Superalloy Produced by Powder Bed Fusion Laser Sintering.

    PubMed

    Zhang, Fan; Levine, Lyle E; Allen, Andrew J; Campbell, Carelyn E; Lass, Eric A; Cheruvathur, Sudha; Stoudt, Mark R; Williams, Maureen E; Idell, Yaakov

    2017-04-01

    Additively manufactured (AM) metal components often exhibit fine dendritic microstructures and elemental segregation due to the initial rapid solidification and subsequent melting and cooling during the build process, which without homogenization would adversely affect materials performance. In this letter, we report in situ observation of the homogenization kinetics of an AM nickel-based superalloy using synchrotron small angle X-ray scattering. The identified kinetic time scale is in good agreement with thermodynamic diffusion simulation predictions using microstructural dimensions acquired by ex situ scanning electron microscopy. These findings could serve as a recipe for predicting, observing, and validating homogenization treatments in AM materials.

  12. Mixed mode control method and engine using same

    DOEpatents

    Kesse, Mary L [Peoria, IL; Duffy, Kevin P [Metamora, IL

    2007-04-10

    A method of mixed mode operation of an internal combustion engine includes the steps of controlling a homogeneous charge combustion event timing in a given engine cycle, and controlling a conventional charge injection event to be at least a predetermined time after the homogeneous charge combustion event. An internal combustion engine is provided, including an electronic controller having a computer readable medium with a combustion timing control algorithm recorded thereon, the control algorithm including means for controlling a homogeneous charge combustion event timing and means for controlling a conventional injection event timing to be at least a predetermined time from the homogeneous charge combustion event.

  13. Heterogeneous boiling-up of superheated liquid at achievable superheat threshold.

    PubMed

    Ermakov, G V; Lipnyagov, E V; Perminov, S A; Gurashkin, A L

    2009-07-21

    The classical theory of homogeneous nucleation describes well the superheat threshold observed in experiments. It may be assumed therefore that homogeneous boiling-up of a liquid takes place in experiments, and the theory has been verified experimentally well. The streak photography used in this study showed that boiling-up of a superheated liquid at the threshold of the achievable superheat occurs at a limited number of surface fluctuation centers in a vessel, rather than in the bulk as one would expect with homogeneous nucleation. Thus, the homogeneous theory, which rather accurately describes the heterogeneous threshold of the achievable superheat, obviously is not confirmed in experiments.

  14. A Comparison of Aerosolization and Homogenization Techniques for Production of Alginate Microparticles for Delivery of Corticosteroids to the Colon.

    PubMed

    Samak, Yassmin O; El Massik, Magda; Coombes, Allan G A

    2017-01-01

    Alginate microparticles incorporating hydrocortisone hemisuccinate were produced by aerosolization and homogenization methods to investigate their potential for colonic drug delivery. Microparticle stabilization was achieved with a CaCl2 crosslinking solution (0.5 M and 1 M), and drug loading was accomplished by diffusion into blank microparticles or by direct encapsulation. The homogenization method produced smaller microparticles (45-50 μm) compared to aerosolization (65-90 μm). High drug loadings (40% wt/wt) were obtained for diffusion-loaded aerosolized microparticles. Aerosolized microparticles suppressed drug release in simulated gastric fluid (SGF) and simulated intestinal fluid (SIF) prior to drug release in simulated colonic fluid (SCF) to a higher extent than homogenized microparticles. Microparticles prepared using aerosolization or homogenization (1 M CaCl2, diffusion loaded) released 5% and 17% of drug content after 2 h in SGF and 4 h in SIF, respectively, and 75% after 12 h in SCF. Thus, aerosolization and homogenization techniques show potential for producing alginate microparticles for colonic drug delivery in the treatment of inflammatory bowel disease. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  15. Monte Carlo Estimation of Absorbed Dose Distributions Obtained from Heterogeneous 106Ru Eye Plaques.

    PubMed

    Zaragoza, Francisco J; Eichmann, Marion; Flühs, Dirk; Sauerwein, Wolfgang; Brualla, Lorenzo

    2017-09-01

    The distribution of the emitter substance in 106Ru eye plaques is usually assumed to be homogeneous for treatment planning purposes. However, this distribution is never homogeneous, and it widely differs from plaque to plaque due to manufacturing factors. By Monte Carlo simulation of radiation transport, we study the absorbed dose distribution obtained from the specific CCA1364 and CCB1256 106Ru plaques, whose actual emitter distributions were measured. The idealized, homogeneous CCA and CCB plaques are also simulated. The largest discrepancies in depth dose distribution observed between the heterogeneous and the homogeneous plaques were 7.9% and 23.7% for the CCA and CCB plaques, respectively. In terms of isodose lines, the line referring to 100% of the reference dose penetrates 0.2 and 1.8 mm deeper in the case of the heterogeneous CCA and CCB plaques, respectively, with respect to the homogeneous counterparts. The observed differences in absorbed dose distributions obtained from heterogeneous and homogeneous plaques are clinically irrelevant if the plaques are used with a lateral safety margin of at least 2 mm. However, these differences may be relevant if the plaques are used in eccentric positioning.

  16. Homogenizing Advanced Alloys: Thermodynamic and Kinetic Simulations Followed by Experimental Results

    NASA Astrophysics Data System (ADS)

    Jablonski, Paul D.; Hawk, Jeffrey A.

    2017-01-01

    Segregation of solute elements occurs in nearly all metal alloys during solidification. The resultant elemental partitioning can severely degrade as-cast material properties and lead to difficulties during post-processing (e.g., hot shorts and incipient melting). Many cast articles are subjected to a homogenization heat treatment in order to minimize segregation and improve their performance. Traditionally, homogenization heat treatments are based upon past practice or time-consuming trial and error experiments. Through the use of thermodynamic and kinetic modeling software, NETL has designed a systematic method to optimize homogenization heat treatments. Use of the method allows engineers and researchers to homogenize casting chemistries to levels appropriate for a given application. The method also allows for the adjustment of heat treatment schedules to fit limitations on in-house equipment (capability, reliability, etc.) while maintaining clear numeric targets for segregation reduction. In this approach, the Scheil module within Thermo-Calc is used to predict the as-cast segregation present within an alloy, and diffusion-controlled transformation simulations are then used to model homogenization kinetics as a function of time and temperature. Examples of computationally designed heat treatments and verification of their effects on segregation and properties of real castings are presented.
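
    The kinetics underlying such computationally designed schedules can be illustrated with the classical sinusoidal-segregation model, in which the amplitude of a composition wave with wavelength comparable to the dendrite arm spacing decays as exp(-4*pi^2*D*t/lambda^2). The diffusivity, activation energy, and arm spacing in the sketch below are placeholder values for illustration, not data from this study.

```python
# Illustrative estimate of a homogenization hold time from the classical
# sinusoidal-segregation model: the amplitude of a composition wave of
# wavelength lam decays as exp(-4*pi^2*D*t/lam^2). D0, Q and lam are
# placeholder values, not values from the study.
import numpy as np

def hold_time(delta_target, D, lam):
    """Time for the residual segregation index to fall to delta_target."""
    return -np.log(delta_target) * lam**2 / (4 * np.pi**2 * D)

R = 8.314                      # J/(mol K)
D0, Q = 1e-4, 250e3            # assumed Arrhenius prefactor (m^2/s) and activation energy (J/mol)
T = 1200 + 273.15              # homogenization temperature, K
D = D0 * np.exp(-Q / (R * T))  # diffusivity at T
lam = 50e-6                    # assumed secondary dendrite arm spacing, m

t = hold_time(0.01, D, lam)    # time to reduce segregation amplitude to 1 %
print(f"D = {D:.2e} m^2/s, hold time ~ {t/3600:.1f} h")
```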

  17. High pressure homogenization to improve the stability of casein - hydroxypropyl cellulose aqueous systems.

    PubMed

    Ye, Ran; Harte, Federico

    2014-03-01

    The effect of high pressure homogenization on the improvement of the stability of hydroxypropyl cellulose (HPC) and micellar casein aqueous systems was investigated. HPC with two molecular weights (80 and 1150 kDa) and micellar casein were mixed in water to a concentration leading to phase separation (0.45% w/v HPC and 3% w/v casein) and immediately subjected to high pressure homogenization ranging from 0 to 300 MPa, in 100 MPa increments. The various dispersions were evaluated for stability, particle size, turbidity, protein content, and viscosity over a period of two weeks, and by Scanning Transmission Electron Microscopy (STEM) at the end of the storage period. The stability of casein-HPC complexes was enhanced with increasing homogenization pressure, especially for the complex containing high molecular weight HPC. The apparent particle size of the complexes was reduced from ~200 nm to ~130 nm at 300 MPa, corresponding to a sharp decrease in absorbance compared to the non-homogenized controls. High pressure homogenization reduced the viscosity of HPC-casein complexes regardless of the molecular weight of HPC, and STEM images revealed aggregates consistent with nano-scale protein-polysaccharide interactions.

  18. High pressure homogenization to improve the stability of casein - hydroxypropyl cellulose aqueous systems

    PubMed Central

    Ye, Ran; Harte, Federico

    2013-01-01

    The effect of high pressure homogenization on the improvement of the stability of hydroxypropyl cellulose (HPC) and micellar casein aqueous systems was investigated. HPC with two molecular weights (80 and 1150 kDa) and micellar casein were mixed in water to a concentration leading to phase separation (0.45% w/v HPC and 3% w/v casein) and immediately subjected to high pressure homogenization ranging from 0 to 300 MPa, in 100 MPa increments. The various dispersions were evaluated for stability, particle size, turbidity, protein content, and viscosity over a period of two weeks, and by Scanning Transmission Electron Microscopy (STEM) at the end of the storage period. The stability of casein-HPC complexes was enhanced with increasing homogenization pressure, especially for the complex containing high molecular weight HPC. The apparent particle size of the complexes was reduced from ~200 nm to ~130 nm at 300 MPa, corresponding to a sharp decrease in absorbance compared to the non-homogenized controls. High pressure homogenization reduced the viscosity of HPC-casein complexes regardless of the molecular weight of HPC, and STEM images revealed aggregates consistent with nano-scale protein-polysaccharide interactions. PMID:24159250

  19. Sampling and Homogenization Strategies Significantly Influence the Detection of Foodborne Pathogens in Meat.

    PubMed

    Rohde, Alexander; Hammerl, Jens Andre; Appel, Bernd; Dieckmann, Ralf; Al Dahouk, Sascha

    2015-01-01

    Efficient preparation of food samples, comprising sampling and homogenization, for microbiological testing is an essential, yet largely neglected, component of foodstuff control. Salmonella enterica spiked chicken breasts were used as a surface contamination model whereas salami and meat paste acted as models of inner-matrix contamination. A systematic comparison of different homogenization approaches, namely, stomaching, sonication, and milling by FastPrep-24 or SpeedMill, revealed that for surface contamination a broad range of sample pretreatment steps is applicable and loss of culturability due to the homogenization procedure is marginal. In contrast, for inner-matrix contamination long treatments up to 8 min are required and only FastPrep-24 as a large-volume milling device produced consistently good recovery rates. In addition, sampling of different regions of the spiked sausages showed that pathogens are not necessarily homogenously distributed throughout the entire matrix. Instead, in meat paste the core region contained considerably more pathogens compared to the rim, whereas in the salamis the distribution was more even with an increased concentration within the intermediate region of the sausages. Our results indicate that sampling and homogenization as integral parts of food microbiology and monitoring deserve more attention to further improve food safety.

  20. Effective inactivation of Saccharomyces cerevisiae in minimally processed Makgeolli using low-pressure homogenization-based pasteurization.

    PubMed

    Bak, Jin Seop

    2015-01-01

    In order to address the limitations associated with the inefficient pasteurization platform used to make Makgeolli, such as the presence of turbid colloidal dispersions in suspension, commercially available Makgeolli was minimally processed using a low-pressure homogenization-based pasteurization (LHBP) process. This continuous process demonstrates that promptly reducing the exposure time to excessive heat using either large molecules or insoluble particles can dramatically improve internal quality and decrease irreversible damage. Specifically, optimal homogenization increased concomitantly with physical parameters such as colloidal stability (65.0% of maximum and below 25-μm particles) following two repetitions at 25.0 MPa. However, biochemical parameters such as microbial population, acidity, and the presence of fermentable sugars rarely affected Makgeolli quality. Remarkably, there was a 4.5-log reduction in the number of Saccharomyces cerevisiae target cells at 53.5°C for 70 sec in optimally homogenized Makgeolli. This value was higher than the 37.7% measured from traditionally pasteurized Makgeolli. In contrast to the analytical similarity among homogenized Makgeollis, our objective quality evaluation demonstrated significant differences between pasteurized (or unpasteurized) Makgeolli and LHBP-treated Makgeolli. Keywords: low-pressure homogenization-based pasteurization; Makgeolli; minimal processing-preservation; Saccharomyces cerevisiae; suspension stability.

  1. Convergence and divergence in gesture repertoires as an adaptive mechanism for social bonding in primates.

    PubMed

    Roberts, Anna Ilona; Roberts, Sam George Bradley

    2017-11-01

    A key challenge for primates living in large, stable social groups is managing social relationships. Chimpanzee gestures may act as a time-efficient social bonding mechanism, and the presence (homogeneity) and absence (heterogeneity) of overlap in repertoires in particular may play an important role in social bonding. However, how homogeneity and heterogeneity in the gestural repertoire of primates relate to social interaction is poorly understood. We used social network analysis and generalized linear mixed modelling to examine this question in wild chimpanzees. The repertoire size of both homogeneous and heterogeneous visual, tactile and auditory gestures was associated with the duration of time spent in social bonding behaviour, centrality in the social bonding network and demography. The audience size of partners who displayed similar or different characteristics to the signaller (e.g. same or opposite age or sex category) also influenced the use of homogeneous and heterogeneous gestures. Homogeneous and heterogeneous gestures were differentially associated with the presence of emotional reactions in response to the gesture and the presence of a change in the recipient's behaviour. Homogeneity and heterogeneity of gestural communication play a key role in maintaining a differentiated set of strong and weak social relationships in complex, multilevel societies.

  2. Superfluid transition of homogeneous and trapped two-dimensional Bose gases.

    PubMed

    Holzmann, Markus; Baym, Gordon; Blaizot, Jean-Paul; Laloë, Franck

    2007-01-30

    Current experiments on atomic gases in highly anisotropic traps present the opportunity to study in detail the low temperature phases of two-dimensional inhomogeneous systems. Although, in an ideal gas, the trapping potential favors Bose-Einstein condensation at finite temperature, interactions tend to destabilize the condensate, leading to a superfluid Kosterlitz-Thouless-Berezinskii phase with a finite superfluid mass density but no long-range order, as in homogeneous fluids. The transition in homogeneous systems is conveniently described in terms of dissociation of topological defects (vortex-antivortex pairs). However, trapped two-dimensional gases are more directly approached by generalizing the microscopic theory of the homogeneous gas. In this paper, we first derive, via a diagrammatic expansion, the scaling structure near the phase transition in a homogeneous system, and then study the effects of a trapping potential in the local density approximation. We find that a weakly interacting trapped gas undergoes a Kosterlitz-Thouless-Berezinskii transition from the normal state at a temperature slightly below the Bose-Einstein transition temperature of the ideal gas. The characteristic finite superfluid mass density of a homogeneous system just below the transition becomes strongly suppressed in a trapped gas.

  3. Data re-arranging techniques leading to proper variable selections in high energy physics

    NASA Astrophysics Data System (ADS)

    Kůs, Václav; Bouř, Petr

    2017-12-01

    We introduce a new data-based approach to homogeneity testing and variable selection carried out in high energy physics experiments, where one of the basic tasks is to test the homogeneity of weighted samples, mainly the Monte Carlo simulations (weighted) and real data measurements (unweighted). This technique is called 'data re-arranging' and it enables variable selection to be performed by means of classical statistical homogeneity tests such as the Kolmogorov-Smirnov, Anderson-Darling, or Pearson's chi-square divergence test. P-values of our variants of the homogeneity tests are investigated and the empirical verification through 46-dimensional high energy particle physics data sets is accomplished under a newly proposed (equiprobable) quantile binning. In particular, the procedure of homogeneity testing is applied to re-arranged Monte Carlo samples and real DATA sets measured at the Tevatron particle accelerator in Fermilab at the DØ experiment, originating from top-antitop quark pair production in two decay channels (electron, muon) with 2, 3, or 4+ jets detected. Finally, the variable selections in the electron and muon channels induced by the re-arranging procedure for homogeneity testing are provided for the Tevatron top-antitop quark data sets.
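
    A simplified version of the binned homogeneity test is sketched below: equiprobable bin edges are taken at quantiles of the pooled sample, the weighted Monte Carlo histogram is scaled to the data normalization, and a Pearson chi-square statistic is formed. The weight handling and degrees of freedom are simplifying assumptions rather than the exact procedure of the paper.

```python
# Sketch of homogeneity testing between a weighted Monte Carlo sample and
# unweighted data under equiprobable quantile binning.
import numpy as np
from scipy.stats import chi2

def chi2_homogeneity(mc, mc_weights, data, n_bins=20):
    # Equiprobable bins: edges at the quantiles of the pooled sample.
    pooled = np.concatenate([mc, data])
    edges = np.quantile(pooled, np.linspace(0, 1, n_bins + 1))

    o_mc, _ = np.histogram(mc, bins=edges, weights=mc_weights)  # weighted counts
    o_da, _ = np.histogram(data, bins=edges)                    # unweighted counts

    # Scale MC to the data normalization and form the Pearson statistic.
    o_mc = o_mc * o_da.sum() / o_mc.sum()
    stat = np.sum((o_da - o_mc) ** 2 / np.where(o_mc > 0, o_mc, 1))
    pval = chi2.sf(stat, df=n_bins - 1)
    return stat, pval

rng = np.random.default_rng(2)
mc = rng.normal(0, 1, 5000)
w = rng.uniform(0.5, 1.5, 5000)      # event weights of the simulation
data = rng.normal(0.05, 1, 1000)     # "measured" sample
print(chi2_homogeneity(mc, w, data))
```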

  4. Impact of the cation distribution homogeneity on the americium oxidation state in the U0.54Pu0.45Am0.01O2-x mixed oxide

    NASA Astrophysics Data System (ADS)

    Vauchy, Romain; Robisson, Anne-Charlotte; Martin, Philippe M.; Belin, Renaud C.; Aufore, Laurence; Scheinost, Andreas C.; Hodaj, Fiqiri

    2015-01-01

    The impact of the cation distribution homogeneity of the U0.54Pu0.45Am0.01O2-x mixed oxide on the americium oxidation state was studied by coupling X-ray diffraction (XRD), electron probe micro analysis (EPMA) and X-ray absorption spectroscopy (XAS). Oxygen-hypostoichiometric Am-bearing uranium-plutonium mixed oxide pellets were fabricated by two different co-milling based processes in order to obtain different cation distribution homogeneities. The americium was generated from β⁻ decay of 241Pu. The XRD analysis of the obtained compounds did not reveal any structural difference between the samples. EPMA, however, revealed a high homogeneity in the cation distribution for one sample, and substantial heterogeneity of the U-Pu (and hence Am) distribution for the other. The difference in cation distribution was linked to a difference in Am chemistry as investigated by XAS, with Am being present at a mixed +III/+IV oxidation state in the heterogeneous compound, whereas only Am(IV) was observed in the homogeneous compound. Previously reported discrepancies on Am oxidation states can hence be explained by cation distribution homogeneity effects.

  5. Sampling and Homogenization Strategies Significantly Influence the Detection of Foodborne Pathogens in Meat

    PubMed Central

    Rohde, Alexander; Hammerl, Jens Andre; Appel, Bernd; Dieckmann, Ralf; Al Dahouk, Sascha

    2015-01-01

    Efficient preparation of food samples, comprising sampling and homogenization, for microbiological testing is an essential, yet largely neglected, component of foodstuff control. Salmonella enterica spiked chicken breasts were used as a surface contamination model whereas salami and meat paste acted as models of inner-matrix contamination. A systematic comparison of different homogenization approaches, namely, stomaching, sonication, and milling by FastPrep-24 or SpeedMill, revealed that for surface contamination a broad range of sample pretreatment steps is applicable and loss of culturability due to the homogenization procedure is marginal. In contrast, for inner-matrix contamination long treatments up to 8 min are required and only FastPrep-24 as a large-volume milling device produced consistently good recovery rates. In addition, sampling of different regions of the spiked sausages showed that pathogens are not necessarily homogenously distributed throughout the entire matrix. Instead, in meat paste the core region contained considerably more pathogens compared to the rim, whereas in the salamis the distribution was more even with an increased concentration within the intermediate region of the sausages. Our results indicate that sampling and homogenization as integral parts of food microbiology and monitoring deserve more attention to further improve food safety. PMID:26539462

  6. Tai Chi Chuan optimizes the functional organization of the intrinsic human brain architecture in older adults

    PubMed Central

    Wei, Gao-Xia; Dong, Hao-Ming; Yang, Zhi; Luo, Jing; Zuo, Xi-Nian

    2014-01-01

    Whether Tai Chi Chuan (TCC) can influence the intrinsic functional architecture of the human brain remains unclear. To examine TCC-associated changes in functional connectomes, resting-state functional magnetic resonance images were acquired from 40 older individuals including 22 experienced TCC practitioners (experts) and 18 demographically matched TCC-naïve healthy controls, and their local functional homogeneities across the cortical mantle were compared. Compared to the controls, the TCC experts had significantly greater and more experience-dependent functional homogeneity in the right post-central gyrus (PosCG) and less functional homogeneity in the left anterior cingulate cortex (ACC) and the right dorsal lateral prefrontal cortex. Increased functional homogeneity in the PosCG was correlated with TCC experience. Intriguingly, decreases in functional homogeneity (improved functional specialization) in the left ACC and increases in functional homogeneity (improved functional integration) in the right PosCG both predicted performance gains on attention network behavior tests. These findings provide evidence for the functional plasticity of the brain’s intrinsic architecture toward optimizing local functional organization, with great implications for understanding the effects of TCC on cognition, behavior and health in the aging population. PMID:24860494

  7. Homogeneous ice nucleation from aqueous inorganic/organic particles representative of biomass burning: water activity, freezing temperatures, nucleation rates.

    PubMed

    Knopf, Daniel A; Rigg, Yannick J

    2011-02-10

    Homogeneous ice nucleation plays an important role in the formation of cirrus clouds with subsequent effects on the global radiative budget. Here we report on homogeneous ice nucleation temperatures and corresponding nucleation rate coefficients of aqueous droplets serving as surrogates of biomass burning aerosol. Micrometer-sized (NH4)2SO4/levoglucosan droplets with mass ratios of 10:1, 1:1, 1:5, and 1:10 and aqueous multicomponent organic droplets with and without (NH4)2SO4 under typical tropospheric temperatures and relative humidities are investigated experimentally using a droplet conditioning and ice nucleation apparatus coupled to an optical microscope with image analysis. Homogeneous freezing was determined as a function of temperature and water activity, a(w), which was set at droplet preparation conditions. The ice nucleation data indicate that minor addition of (NH4)2SO4 to the aqueous organic droplets renders the temperature dependency of water activity negligible in contrast to the case of aqueous organic solution droplets. The mean homogeneous ice nucleation rate coefficient derived from 8 different aqueous droplet compositions with average diameters of ∼60 μm for temperatures as low as 195 K and a(w) of 0.82-1 is 2.18 × 10^6 cm^-3 s^-1. The experimentally derived freezing temperatures and homogeneous ice nucleation rate coefficients are in agreement with predictions of the water activity-based homogeneous ice nucleation theory when taking predictive uncertainties into account. However, the presented ice nucleation data indicate that the water activity-based homogeneous ice nucleation theory overpredicts the freezing temperatures by up to 3 K and corresponding ice nucleation rate coefficients by up to ∼2 orders of magnitude. A shift of 0.01 in a(w), which is well within the uncertainty of typical field and laboratory relative humidity measurements, brings experimental and predicted freezing temperatures and homogeneous ice nucleation rate coefficients into agreement. The experimentally derived ice nucleation data are applied to constrain the water activity-based homogeneous ice nucleation theory to smaller than ±1 order of magnitude compared to the predictive uncertainty of larger than ±6 orders of magnitude. The atmospheric implications of these findings are discussed.

  8. Homogeneous dielectric barrier discharges in atmospheric air and its influencing factor

    NASA Astrophysics Data System (ADS)

    Ran, Junxia; Li, Caixia; Ma, Dong; Luo, Haiyun; Li, Xiaowei

    2018-03-01

    A stable homogeneous dielectric barrier discharge (DBD) is obtained in a 2-3 mm air gap at atmospheric pressure. It is generated using a 1 kHz center-frequency high-voltage power supply between two plane parallel electrodes, with specific alumina ceramic plates as the dielectric barriers. The discharge characteristics are studied by measuring the electrical discharge parameters and observing the light emission. The results show that a large single current pulse of about 200 μs duration appears in each voltage pulse, and that the light emission is radially homogeneous and covers the entire surface of the two electrodes. The homogeneous discharge generated is a Townsend discharge. The influences of the applied barrier, its thickness, and its surface roughness on the transition between discharge modes are studied. The results show that it is difficult to produce a homogeneous discharge using smooth plates or alumina plates with surface roughness Ra < 100 nm, even at a 1 mm air gap. If the alumina plate is too thin, the discharge transitions to a filamentary discharge; if it is too thick, the discharge is too weak to observe. With increasing air gap distance and applied voltage, the discharge can also transition from a homogeneous mode to a filamentary mode. In order to generate a stable and homogeneous DBD at a larger air gap, proper dielectric material, dielectric thickness, and dielectric surface roughness should be used, together with proper applied voltage amplitude and frequency.

  9. Homogeneity revisited: analysis of updated precipitation series in Turkey

    NASA Astrophysics Data System (ADS)

    Bickici Arikan, Bugrayhan; Kahya, Ercan

    2018-01-01

    Homogeneous time series of meteorological variables are necessary for hydrologic and climate studies. The dependability of historical precipitation data is subjected to keen evaluation prior to every study in the water resources, hydrology, and climate change fields. This study aims to characterize the homogeneity of long-term Turkish precipitation data in order to ensure that they can be reliably used. The homogeneity of the monthly precipitation data set was tested using the standard normal homogeneity test, Buishand test, Von Neumann ratio test, and Pettitt test at the 5% significance level across Turkey. Our precipitation records, including the most recent observations, extracted from 160 meteorological stations for the period 1974-2014, were analyzed with all four homogeneity tests. According to the results of all tests, five out of 160 stations show an inhomogeneity. Under our strict confirmation rule, 44 out of 160 stations are considered inhomogeneous since they failed at least one of the four tests. The breaks captured by the Buishand and Pettitt tests usually tend to appear in the middle of the precipitation series, whereas the standard normal homogeneity test tends to identify inhomogeneities mostly at the beginning or the end of the records. Our results showed that 42 out of 44 inhomogeneous stations passed all four tests after applying a correction procedure based on double mass curve analysis. Available metadata were used to interpret the detected inhomogeneities.
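
    Two of the four tests are simple enough to sketch directly; the snippet below gives minimal implementations of the Pettitt change-point test and the Von Neumann ratio and applies them to a synthetic series with an artificial break. The series and the significance handling are illustrative only.

```python
# Minimal sketches of two homogeneity tests used in the study: the Pettitt
# change-point test and the Von Neumann ratio, applied to a synthetic series.
import numpy as np

def pettitt(x):
    """Return the most probable change point (1-based) and the approximate p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # U_k = sum over i <= k, j > k of sign(x_j - x_i)
    u = np.array([np.sign(x[k + 1:, None] - x[:k + 1][None, :]).sum()
                  for k in range(n - 1)])
    k_max = np.abs(u).max()
    p = 2.0 * np.exp(-6.0 * k_max**2 / (n**3 + n**2))
    return int(np.abs(u).argmax()) + 1, min(p, 1.0)

def von_neumann_ratio(x):
    """Ratio near 2 suggests homogeneity; values well below 2 suggest breaks."""
    x = np.asarray(x, dtype=float)
    return np.sum(np.diff(x) ** 2) / np.sum((x - x.mean()) ** 2)

rng = np.random.default_rng(3)
series = np.concatenate([rng.gamma(4, 20, 20), rng.gamma(4, 28, 21)])  # shift in year 21
cp, p = pettitt(series)
print(f"Pettitt change point at index {cp}, p ~ {p:.3f}")
print(f"Von Neumann ratio = {von_neumann_ratio(series):.2f}")
```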

  10. Computational Homogenization of Mechanical Properties for Laminate Composites Reinforced with Thin Film Made of Carbon Nanotubes

    NASA Astrophysics Data System (ADS)

    El Moumen, A.; Tarfaoui, M.; Lafdi, K.

    2018-06-01

    The elastic properties of laminate composites based on carbon nanotubes (CNTs), used in military applications, were estimated using homogenization techniques and compared to experimental data. The composite consists of three phases: a T300 6k carbon fiber fabric with 5HS (satin) weave, a baseline pure epoxy matrix, and CNTs added at 0.5%, 1%, 2%, and 4%. A two-step homogenization method based on an RVE model was employed. The objective of this paper is to determine the elastic properties of the structure starting from knowledge of those of its constituents (CNTs, epoxy, and carbon fiber fabric). It is assumed that the composites have a geometric periodicity and that the homogenization model can be represented by a representative volume element (RVE). For the multi-scale analysis, finite element modeling of a unit cell with a two-step homogenization method is used. The first step gives the properties of the thin film made of epoxy and CNTs, and the second is used for homogenization of the laminate composite. The fabric unit cell is chosen using a set of microscopic observations and is identified by its ability to enclose the characteristic periodic repeat of the fabric weave. The unit cell model of the 5-harness satin weave fabric textile composite is identified for the numerical approach, and its dimensions are chosen based on microstructural measurements. Finally, good agreement was obtained between the elastic properties predicted using the numerical homogenization approach and the experimental data.

  11. Computational Homogenization of Mechanical Properties for Laminate Composites Reinforced with Thin Film Made of Carbon Nanotubes

    NASA Astrophysics Data System (ADS)

    El Moumen, A.; Tarfaoui, M.; Lafdi, K.

    2017-08-01

    The elastic properties of laminate composites based on carbon nanotubes (CNTs), used in military applications, were estimated using homogenization techniques and compared to experimental data. The composite consists of three phases: a T300 6k carbon fiber fabric with 5HS (satin) weave, a baseline pure epoxy matrix, and CNTs added at 0.5%, 1%, 2%, and 4%. A two-step homogenization method based on an RVE model was employed. The objective of this paper is to determine the elastic properties of the structure starting from knowledge of those of its constituents (CNTs, epoxy, and carbon fiber fabric). It is assumed that the composites have a geometric periodicity and that the homogenization model can be represented by a representative volume element (RVE). For the multi-scale analysis, finite element modeling of a unit cell with a two-step homogenization method is used. The first step gives the properties of the thin film made of epoxy and CNTs, and the second is used for homogenization of the laminate composite. The fabric unit cell is chosen using a set of microscopic observations and is identified by its ability to enclose the characteristic periodic repeat of the fabric weave. The unit cell model of the 5-harness satin weave fabric textile composite is identified for the numerical approach, and its dimensions are chosen based on microstructural measurements. Finally, good agreement was obtained between the elastic properties predicted using the numerical homogenization approach and the experimental data.
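
    The two-step idea can be caricatured with closed-form estimates: a Halpin-Tsai relation for the CNT-doped epoxy film, then a simple rule of mixtures at the laminate level. All moduli, volume fractions, and the shape factor below are illustrative assumptions; the study itself relies on finite element RVE models rather than these analytical bounds.

```python
# Sketch of a two-step stiffness homogenization in the spirit of the papers:
# step 1 estimates the CNT/epoxy film modulus with the Halpin-Tsai relation,
# step 2 combines fiber and film/matrix by a longitudinal rule of mixtures.
def halpin_tsai(E_f, E_m, v_f, zeta):
    """Halpin-Tsai estimate of a composite modulus (same units as inputs)."""
    eta = (E_f / E_m - 1.0) / (E_f / E_m + zeta)
    return E_m * (1.0 + zeta * eta * v_f) / (1.0 - eta * v_f)

# Step 1: epoxy film doped with CNTs (assumed values)
E_cnt, E_epoxy = 1000.0, 3.5        # GPa
v_cnt = 0.01                        # ~1 vol% CNTs, a simplifying assumption
E_film = halpin_tsai(E_cnt, E_epoxy, v_cnt, zeta=20.0)

# Step 2: laminate level, longitudinal rule of mixtures (assumed fractions)
E_fiber = 230.0                     # T300 carbon fiber, GPa
v_fiber, v_film = 0.55, 0.45
E_laminate = v_fiber * E_fiber + v_film * E_film
print(f"film modulus ~ {E_film:.2f} GPa, laminate (longitudinal) ~ {E_laminate:.1f} GPa")
```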

  12. Individual differences in verbal creative thinking are reflected in the precuneus.

    PubMed

    Chen, Qun-Lin; Xu, Ting; Yang, Wen-Jing; Li, Ya-Dan; Sun, Jiang-Zhou; Wang, Kang-Cheng; Beaty, Roger E; Zhang, Qing-Lin; Zuo, Xi-Nian; Qiu, Jiang

    2015-08-01

    There have been many structural and functional imaging studies of creative thinking, but combining structural and functional magnetic resonance imaging (MRI) investigations with respect to creative thinking is still lacking. Thus, the aim of the present study was to explore the associations among inter-individual verbal creative thinking and both regional homogeneity and cortical morphology of the brain surface. We related the local functional homogeneity of spontaneous brain activity to verbal creative thinking and its dimensions--fluency, originality, and flexibility--by examining these inter-individual differences in a large sample of 268 healthy college students. Results revealed that people with high verbal creative ability and high scores for the three dimensions of creativity exhibited lower regional functional homogeneity in the right precuneus. Both cortical volume and thickness of the right precuneus were positively associated with individual verbal creativity and its dimensions. Moreover, originality was negatively correlated with functional homogeneity in the left superior frontal gyrus and positively correlated with functional homogeneity in the right occipito-temporal gyrus. In contrast, flexibility was positively correlated with functional homogeneity in the left superior and middle occipital gyrus. These findings provide additional evidence of a link between verbal creative thinking and brain structure in the right precuneus, a region involved in internally focused attention and effective semantic retrieval, and further suggest that local functional homogeneity of verbal creative thinking has neurobiological relevance that is likely based on anatomical substrates. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Comparing the Performance of Approaches for Testing the Homogeneity of Variance Assumption in One-Factor ANOVA Models

    ERIC Educational Resources Information Center

    Wang, Yan; Rodríguez de Gil, Patricia; Chen, Yi-Hsin; Kromrey, Jeffrey D.; Kim, Eun Sook; Pham, Thanh; Nguyen, Diep; Romano, Jeanine L.

    2017-01-01

    Various tests to check the homogeneity of variance assumption have been proposed in the literature, yet there is no consensus as to their robustness when the assumption of normality does not hold. This simulation study evaluated the performance of 14 tests for the homogeneity of variance assumption in one-way ANOVA models in terms of Type I error…
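
    For reference, a few of the commonly compared variance-homogeneity checks are available directly in SciPy; the snippet below runs Levene (mean-centered), Brown-Forsythe (median-centered), and Bartlett tests on synthetic groups, one of which has an inflated variance. The simulation study evaluates many more variants and conditions than this.

```python
# Quick illustration of three homogeneity-of-variance checks for a one-factor
# ANOVA layout, applied to synthetic groups with one inflated variance.
import numpy as np
from scipy.stats import levene, bartlett

rng = np.random.default_rng(4)
g1 = rng.normal(0, 1.0, 30)
g2 = rng.normal(0, 1.5, 30)   # inflated variance
g3 = rng.normal(0, 1.0, 30)

print("Levene (mean-centered):  ", levene(g1, g2, g3, center="mean"))
print("Brown-Forsythe (median): ", levene(g1, g2, g3, center="median"))
print("Bartlett:                ", bartlett(g1, g2, g3))
```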

  14. Homogeneous free-form directional backlight for 3D display

    NASA Astrophysics Data System (ADS)

    Krebs, Peter; Liang, Haowen; Fan, Hang; Zhang, Aiqin; Zhou, Yangui; Chen, Jiayi; Li, Kunyang; Zhou, Jianying

    2017-08-01

    Realization of a near perfect homogeneous secondary emission source for 3D display is proposed and demonstrated. The light source takes advantage of an array of free-form emission surface with a specially tailored light guiding structure, a light diffuser and Fresnel lens. A seamless and homogeneous directional emission is experimentally obtained which is essential for a high quality naked-eye 3D display.

  15. Noncommutative complex structures on quantum homogeneous spaces

    NASA Astrophysics Data System (ADS)

    Ó Buachalla, Réamonn

    2016-01-01

    A new framework for noncommutative complex geometry on quantum homogeneous spaces is introduced. The main ingredients used are covariant differential calculi and Takeuchi's categorical equivalence for quantum homogeneous spaces. A number of basic results are established, producing a simple set of necessary and sufficient conditions for noncommutative complex structures to exist. Throughout, the framework is applied to the quantum projective spaces endowed with the Heckenberger-Kolb calculus.

  16. Damage of tracer erythropoietin results in erroneous estimation of concentration in mouse submaxillary gland.

    PubMed

    Vidal, A; Carcagno, M; Criscuolo, M; Barcelò, A C; Alippi, R M; Leal, T; Bozzini, C E

    1993-02-01

    It has been previously reported that 1) plasma erythropoietin (Epo) titer during exposure to hypobaria is lower in nephrectomized rats and mice whose submaxillary glands (SMG) were either ablated or atrophied than in nephrectomized controls whose SMG were intact and 2) that the gland shows one of the highest levels of immunoreactive Epo (iEpo) in the body. The latter observation, however, was questioned recently when it was observed that SMG extracts degrade labeled Epo used as tracer antigen in the radioimmunoassay (RIA), thus giving invalid estimates of Epo. Since this interpretation was in turn questioned, the present study was conducted to obtain more information on the subject and make these conflicting points clear. Investigation of the reported/possible degradation of Epo by SMG homogenates was conducted via polyacrylamide gel electrophoresis followed by radioautography or by a RIA in solid phase in which there was no simultaneous incubation of the tracer antigen with the SMG homogenates. It was observed that 125I-labeled rhEpo was degraded when incubated with SMG homogenates. Degradation was rapid, being evident when incubation lasted 30 minutes, and occurred in the presence of a protease inhibitor. It showed a high degree of specificity since it did not occur when Epo was incubated with kidney homogenate or normal mouse serum. SMG homogenate did not degrade labeled thyrotrophic hormone and degraded alpha interferon (IFN-alpha) only partially. When estimates of iEpo in SMG homogenate were performed in conditions of simultaneous (SI-RIA) or nonsimultaneous (NSI-RIA) incubation of the homogenate with tracer Epo, it was observed that while estimates of Epo in plasma were similar in both types of RIA and somewhat higher in kidney homogenate in the SI-RIA than in the NSI-RIA, estimates of Epo in SMG were about 60 times higher in the former than in the latter. Therefore, it could be concluded that most of the Epo detected by standard RIA in SMG homogenate does not represent true Epo because of damage of tracer Epo which determines loss of the integrity of the RIA system.

  17. Hydrogen storage materials and method of making by dry homogenation

    DOEpatents

    Jensen, Craig M.; Zidan, Ragaiy A.

    2002-01-01

    Dry homogenized metal hydrides, in particular aluminum hydride compounds, are provided as a material for reversible hydrogen storage. The reversible hydrogen storage material comprises a dry homogenized material having transition metal catalytic sites on a metal aluminum hydride compound, or mixtures of metal aluminum hydride compounds. A method of making such reversible hydrogen storage materials by dry doping is also provided and comprises the steps of dry homogenizing metal hydrides by mechanical mixing, such as by crushing or ball milling a powder, of a metal aluminum hydride with a transition metal catalyst. In another aspect of the invention, a method of powering a vehicle apparatus with the reversible hydrogen storage material is provided.

  18. Homogeneity and internal defects detect of infrared Se-based chalcogenide glass

    NASA Astrophysics Data System (ADS)

    Li, Zupan; Wu, Ligang; Lin, Changgui; Song, Bao'an; Wang, Xunsi; Shen, Xiang; Dai, Shixun

    2011-10-01

    Ge-Sb-Se chalcogenide glasses are excellent infrared optical materials, which are environmentally friendly and widely used in infrared thermal imaging systems. However, due to the opacity of Se-based glasses in the visible spectral region, it is difficult to measure their homogeneity and internal defects as is done for common oxide glasses. In this study, a method was proposed to observe the homogeneity and internal defects of these glasses based on a near-IR imaging technique, and an effective measurement system was constructed. The testing results indicate that the method gives information on the homogeneity and internal defects of infrared Se-based chalcogenide glass clearly and intuitively.

  19. Gravitational influences on the liquid-state homogenization and solidification of aluminum antimonide. [space processing of solar cell material

    NASA Technical Reports Server (NTRS)

    Ang, C.-Y.; Lacy, L. L.

    1979-01-01

    Typical commercial or laboratory-prepared samples of polycrystalline AlSb contain microstructural inhomogeneities of Al- or Sb-rich phases in addition to the primary AlSb grains. The paper reports on gravitational influences, such as density-driven convection or sedimentation, that cause microscopic phase separation and nonequilibrium conditions to exist in earth-based melts of AlSb. A triple-cavity electric furnace is used to homogenize the multiphase AlSb samples in space and on earth. A comparative characterization of identically processed low- and one-gravity samples of commercial AlSb reveals major improvements in the homogeneity of the low-gravity homogenized material.

  20. Linear scaling relationships and volcano plots in homogeneous catalysis - revisiting the Suzuki reaction.

    PubMed

    Busch, Michael; Wodrich, Matthew D; Corminboeuf, Clémence

    2015-12-01

    Linear free energy scaling relationships and volcano plots are common tools used to identify potential heterogeneous catalysts for myriad applications. Despite the striking simplicity and predictive power of volcano plots, they remain unknown in homogeneous catalysis. Here, we construct volcano plots to analyze a prototypical reaction from homogeneous catalysis, the Suzuki cross-coupling of olefins. Volcano plots succeed both in discriminating amongst different catalysts and reproducing experimentally known trends, which serves as validation of the model for this proof-of-principle example. These findings indicate that the combination of linear scaling relationships and volcano plots could serve as a valuable methodology for identifying homogeneous catalysts possessing a desired activity through a priori computational screening.
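
    How linear scaling relationships generate a volcano can be shown with a toy construction: write the free energies of two elementary steps as linear functions of a single descriptor and take the most uphill step as the activity limit. The slopes and intercepts below are invented for illustration and are not the Suzuki-coupling relationships derived in the paper.

```python
# Toy illustration of how linear scaling relationships generate a volcano plot:
# two step free energies are linear in one descriptor, and the most uphill
# step limits the activity (Sabatier-style construction).
import numpy as np

descriptor = np.linspace(-2.0, 2.0, 200)          # e.g. a binding free energy, eV

# Hypothetical linear scaling relations for two elementary steps
dG_step1 = 0.8 * descriptor + 0.3                 # becomes uphill for strong binding
dG_step2 = -0.9 * descriptor + 0.5                # becomes uphill for weak binding

# Activity measure: the negative of the largest uphill step free energy
activity = -np.maximum(dG_step1, dG_step2)
best = descriptor[np.argmax(activity)]
print(f"volcano apex at descriptor value {best:.2f} (toy numbers)")
```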

  1. On the time-homogeneous Ornstein-Uhlenbeck process in the foreign exchange rates

    NASA Astrophysics Data System (ADS)

    da Fonseca, Regina C. B.; Matsushita, Raul Y.; de Castro, Márcio T.; Figueiredo, Annibal

    2015-10-01

    Since Gaussianity and stationarity assumptions cannot be fulfilled by financial data, the time-homogeneous Ornstein-Uhlenbeck (THOU) process was introduced as a candidate model to describe time series of financial returns [1]. It is an Ornstein-Uhlenbeck (OU) process in which these assumptions are replaced by linearity and time-homogeneity. We employ the OU and THOU processes to analyze daily foreign exchange rates against the US dollar. We confirm that the OU process does not fit the data, while in most cases the patterns of the first four cumulants of the data can be described by the THOU process. However, there are some exceptions in which the data do not follow the linearity or time-homogeneity assumptions.
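
    For comparison, the baseline OU process is easy to simulate with its exact transition density, and its sample cumulants can be computed directly; for a Gaussian OU process the third and fourth cumulants are near zero, which is precisely where it fails to match return data. The parameter values in the sketch below are arbitrary.

```python
# Exact-update simulation of a stationary Ornstein-Uhlenbeck process (the
# baseline model the paper compares against); theta, mu, sigma are arbitrary.
import numpy as np

def simulate_ou(theta, mu, sigma, x0, dt, n, seed=0):
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = x0
    a = np.exp(-theta * dt)
    sd = sigma * np.sqrt((1.0 - a**2) / (2.0 * theta))  # exact transition std
    for i in range(1, n):
        x[i] = mu + (x[i - 1] - mu) * a + sd * rng.standard_normal()
    return x

returns = simulate_ou(theta=5.0, mu=0.0, sigma=0.01, x0=0.0, dt=1/252, n=2500)

# Sample cumulants to compare with data (the THOU process generalizes these):
m = returns.mean()
c2 = returns.var()
c3 = ((returns - m) ** 3).mean()
c4 = ((returns - m) ** 4).mean() - 3 * c2**2
print(f"mean={m:.4f}, var={c2:.6f}, 3rd cumulant={c3:.2e}, 4th cumulant={c4:.2e}")
```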

  2. [Relapse: causes and consequences].

    PubMed

    Thomas, P

    2013-09-01

    Relapse after a first episode of schizophrenia is the recurrence of acute symptoms after a period of partial or complete remission. Due to its variable aspects, there is no operational definition of relapse able to modelise the outcome of schizophrenia and measure how the treatment modifies the disease. Follow-up studies based on proxys such as hospital admission revealed that 7 of 10 patients relapsed after a first episode of schizophrenia. The effectiveness of antipsychotic medications on relapse prevention has been widely demonstrated. Recent studies claim for the advantages of atypical over first generation antipsychotic medication. Non-adherence to antipsychotic represents with addictions the main causes of relapse long before some non-consensual factors such as premorbid functioning, duration of untreated psychosis and associated personality disorders. The consequences of relapse are multiple, psychological, biological and social. Pharmaco-clinical studies have demonstrated that the treatment response decreases with each relapse. Relapse, even the first one, will contribute to worsen the outcome of the disease and reduce the capacity in general functionning. Accepting the idea of continuing treatment is a complex decision in which the psychiatrist plays a central role besides patients and their families. The development of integrated actions on modifiable risk factors such as psychosocial support, addictive comorbidities, access to care and the therapeutic alliance should be promoted. Relapse prevention is a major goal of the treatment of first-episode schizophrenia. It is based on adherence to the maintenance treatment, identification of prodromes, family active information and patient therapeutical education. Copyright © 2013 L’Encéphale. Published by Elsevier Masson SAS.. All rights reserved.

  3. Codigestion of solid wastes: a review of its uses and perspectives including modeling.

    PubMed

    Mata-Alvarez, Joan; Dosta, Joan; Macé, Sandra; Astals, Sergi

    2011-06-01

    The last two years have witnessed a dramatic increase in the number of papers published on the subject of codigestion, highlighting the relevance of this topic within anaerobic digestion research. Consequently, it seems appropriate to undertake a review of codigestion practices starting from the late 1970s, when the first papers related to this concept were published, and continuing to the present day, demonstrating the exponential growth in the interest shown in this approach in recent years. Following a general analysis of the situation, state-of-the-art codigestion is described, focusing on the two most important areas as regards publication: codigestion involving sewage sludge and the organic fraction of municipal solid waste (including a review of the secondary advantages for wastewater treatment plant related to biological nutrient removal), and codigestion in the agricultural sector, that is, including agricultural - farm wastes, and energy crops. Within these areas, a large number of oversized digesters appear which can be used to codigest other substrates, resulting in economic and environmental advantages. Although the situation may be changing, there is still a need for good examples on an industrial scale, particularly with regard to wastewater treatment plants, in order to extend this beneficial practice. In the last section, a detailed analysis of papers addressing the important aspect of modelisation is included. This analysis includes the first codigestion models to be developed as well as recent applications of the standardised anaerobic digestion model ADM1 to codigestion. (This review includes studies ranging from laboratory to industrial scale.).

  4. Method for preparing hydrous titanium oxide spherules and other gel forms thereof

    DOEpatents

    Collins, J.L.

    1998-10-13

    The present invention provides methods for preparing hydrous titanium oxide spherules, hydrous titanium oxide gels such as gel slabs, films, capillary and electrophoresis gels, titanium monohydrogen phosphate spherules, hydrous titanium oxide spherules having suspendible particles homogeneously embedded within to form a composite sorbent, titanium monohydrogen phosphate spherules having suspendible particles of at least one different sorbent homogeneously embedded within to form a composite sorbent having a desired crystallinity, titanium oxide spherules in the form of anatase, brookite or rutile, titanium oxide spherules having suspendible particles homogeneously embedded within to form a composite, hydrous titanium oxide fiber materials, titanium oxide fiber materials, hydrous titanium oxide fiber materials having suspendible particles homogeneously embedded within to form a composite, titanium oxide fiber materials having suspendible particles homogeneously embedded within to form a composite and spherules of barium titanate. These variations of hydrous titanium oxide spherules and gel forms prepared by the gel-sphere, internal gelation process offer more useful forms of inorganic ion exchangers, catalysts, getters and ceramics. 6 figs.

  5. Method for preparing hydrous titanium oxide spherules and other gel forms thereof

    DOEpatents

    Collins, Jack L.

    1998-01-01

    The present invention provides methods for preparing hydrous titanium oxide spherules, hydrous titanium oxide gels such as gel slabs, films, capillary and electrophoresis gels, titanium monohydrogen phosphate spherules, hydrous titanium oxide spherules having suspendible particles homogeneously embedded within to form a composite sorbent, titanium monohydrogen phosphate spherules having suspendible particles of at least one different sorbent homogeneously embedded within to form a composite sorbent having a desired crystallinity, titanium oxide spherules in the form of anatase, brookite or rutile, titanium oxide spherules having suspendible particles homogeneously embedded within to form a composite, hydrous titanium oxide fiber materials, titanium oxide fiber materials, hydrous titanium oxide fiber materials having suspendible particles homogeneously embedded within to form a composite, titanium oxide fiber materials having suspendible particles homogeneously embedded within to form a composite and spherules of barium titanate. These variations of hydrous titanium oxide spherules and gel forms prepared by the gel-sphere, internal gelation process offer more useful forms of inorganic ion exchangers, catalysts, getters and ceramics.

  6. Controllable synthesis of nickel bicarbonate nanocrystals with high homogeneity for a high-performance supercapacitor

    NASA Astrophysics Data System (ADS)

    Gu, Jianmin; Liu, Xin; Wang, Zhuang; Bian, Zhenpan; Jin, Cuihong; Sun, Xiao; Yin, Baipeng; Wu, Tianhui; Wang, Lin; Tang, Shoufeng; Wang, Hongchao; Gao, Faming

    2017-08-01

    The electrochemical performance of supercapacitors may be associated with the homogeneity of the electrode materials. However, the relationship between the degree of uniformity of the electrode materials and the electrochemical performance of the supercapacitor is not clear. Herein, we synthesize two types of nickel bicarbonate nanocrystals with different degrees of uniformity to investigate this relationship. As the electroactive material, nickel bicarbonate nanocrystals with a homogeneous structure can provide a larger space and offer more exposed atoms for the electrochemical reaction than nanocrystals with a heterogeneous structure. The homogeneous nickel bicarbonate nanocrystals exhibit better electrochemical performance and show excellent specific capacitance (1596 F g-1 at 2 A g-1 and 1260 F g-1 at 30 A g-1), which is approximately twice that of the heterogeneous nickel bicarbonate nanocrystals. The cycling stability of the homogeneous nanocrystals (~80%) is also higher than that of the inhomogeneous ones (~61%) at a high current density of 5 A g-1.

  7. Preparation and characterization of molecularly homogeneous silica-titania film by sol-gel process with different synthetic strategies.

    PubMed

    Chen, Hsueh-Shih; Huang, Sheng-Hsin; Perng, Tsong-Pyng

    2012-10-24

    Three silica-titania thin films with various degrees of molecular homogeneity were synthesized by the sol-gel process with the same precursor formula but different reaction paths. The dried films prepared by a single spin-coating process have a thickness of 500-700 nm and displayed no cracks or pin holes. The transmittances and refractive indices of the samples are >97.8% in the range of 350-1800 nm and 1.62-1.65 at 500 nm, respectively. The in-plane and out-of-plane chemical homogeneities of the films were analyzed by X-ray photoelectron spectroscopy and Auger electron spectroscopy, respectively. For the film with the highest degree of homogeneity, the deviations of O, Si, and Ti atomic contents in both in-plane and out-of-plane directions are less than 1.5%, indicating that the film is highly molecularly homogeneous. It also possesses the highest transparency and the lowest refractive index among the three samples.

  8. The importance of carbon nanotube wire density, structural uniformity, and purity for fabricating homogeneous carbon nanotube-copper wire composites by copper electrodeposition

    NASA Astrophysics Data System (ADS)

    Sundaram, Rajyashree; Yamada, Takeo; Hata, Kenji; Sekiguchi, Atsuko

    2018-04-01

    We present the influence of density, structural regularity, and purity of carbon nanotube wires (CNTWs) used as Cu electrodeposition templates on fabricating homogeneous high-electrical performance CNT-Cu wires lighter than Cu. We show that low-density CNTWs (<0.6 g/cm3 for multiwall nanotube wires) with regular macro- and microstructures and high CNT content (>90 wt %) are essential for making homogeneous CNT-Cu wires. These homogeneous CNT-Cu wires show a continuous Cu matrix with evenly mixed nanotubes of high volume fractions (˜45 vol %) throughout the wire-length. Consequently, the composite wires show densities ˜5.1 g/cm3 (33% lower than Cu) and electrical conductivities ˜6.1 × 104 S/cm (>100 × CNTW conductivity). However, composite wires from templates with higher densities or structural inconsistencies are non-uniform with discontinuous Cu matrices and poor CNT/Cu mixing. These non-uniform CNT-Cu wires show conductivities 2-6 times lower than the homogeneous composite wires.

  9. Numerical Study of Microstructural Evolution During Homogenization of Al-Si-Mg-Fe-Mn Alloys

    NASA Astrophysics Data System (ADS)

    Priya, Pikee; Johnson, David R.; Krane, Matthew J. M.

    2016-09-01

    Microstructural evolution during homogenization of Al-Si-Mg-Fe-Mn alloys occurs in two stages at different length scales: while holding at the homogenization temperature (diffusion on the scale of the secondary dendrite arm spacing (SDAS) in micrometers) and during quenching to room temperature (dispersoid precipitation at the nanometer to submicron scale). Here a numerical study estimates microstructural changes during both stages. A diffusion-based model developed to simulate evolution at the SDAS length scale predicts homogenization times and microstructures matching experiments. That model is coupled with a Kampmann Wagner Neumann-based precipitate nucleation and growth model to study the effect of temperature, composition, as-cast microstructure, and cooling rates during posthomogenization quenching on microstructural evolution. A homogenization schedule of 853 K (580 °C) for 8 hours, followed by cooling at 250 K/h, is suggested to optimize microstructures for easier extrusion, consisting of minimal α-Al(FeMn)Si, no β-AlFeSi, and Mg2Si dispersoids <1 μm size.

  10. [Effect of caffeine on active Ca2+ ion transport in a homogenate of skeletal muscles and myocardium].

    PubMed

    Ritov, V B; Murzakhmetova, M K

    1985-08-01

    A Ca2+-selective electrode was used to study active transport of Ca2+ by sarcoplasmic reticulum fragments in homogenates of rabbit skeletal muscle and myocardium. The specific Ca2+ transport activities (μmol Ca2+/min/mg tissue) are 40-60 and 3-5 units for fast and slow muscles and the myocardium, respectively. Caffeine (5 mM) exerts a powerful inhibitory influence on Ca2+ transport in skeletal muscle homogenates. For fast muscles, the degree of inhibition exceeds 50%. The rate of Ca2+ transport in the myocardium homogenate increases in the presence of creatine phosphate. The latter produces no effect on Ca2+ transport in skeletal muscle homogenates. The high sensitivity of Ca2+ transport to caffeine, a specific blocker of Ca2+ transport to the terminal cisterns of the sarcoplasmic reticulum, suggests that the terminal cisterns, apart from being a reservoir for the Ca2+ needed to trigger contraction, may play an essential role in muscle relaxation.

  11. Theoretical investigation of mixing in warm clouds – Part 2: Homogeneous mixing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pinsky, Mark; Khain, Alexander; Korolev, Alexei

    Evolution of monodisperse and polydisperse droplet size distributions (DSD) during homogeneous mixing is analyzed. Time-dependent universal analytical expressions for supersaturation and liquid water content are derived. For an initial monodisperse DSD, these quantities are shown to depend on a sole non-dimensional parameter. The evolution of moments and moment-related functions in the course of homogeneous evaporation of polydisperse DSD is analyzed using a parcel model. It is shown that the classic conceptual scheme, according to which homogeneous mixing leads to a decrease in droplet mass at constant droplet concentration, is valid only in cases of monodisperse or initially very narrow polydisperse DSD. In cases of wide polydisperse DSD, mixing and successive evaporation lead to a decrease of both mass and concentration, so the characteristic droplet sizes remain nearly constant. As this feature is typically associated with inhomogeneous mixing, we conclude that in cases of an initially wide DSD at cloud top, homogeneous mixing is nearly indistinguishable from inhomogeneous mixing.

  12. Properties of lotus seed starch-glycerin monostearin complexes formed by high pressure homogenization.

    PubMed

    Chen, Bingyan; Zeng, Shaoxiao; Zeng, Hongliang; Guo, Zebin; Zhang, Yi; Zheng, Baodong

    2017-07-01

    Starch-lipid complexes were prepared from lotus seed starch (LS) and glycerin monostearate (GMS) via a high pressure homogenization (HPH) process, and the effect of HPH on the physicochemical properties of the LS-GMS complexes was investigated. Fourier transform infrared spectroscopy and complex index analysis showed that LS-GMS complexes were formed at 40 MPa by HPH and that the complex index increased with increasing homogenization pressure. Scanning electron microscopy showed that LS-GMS complexes presented a more nest-shaped structure with increasing homogenization pressure. X-ray diffraction and differential scanning calorimetry results revealed that a V-type crystalline polymorph was formed between LS and GMS, with higher homogenization pressure producing an increasingly stable complex. The LS-GMS complex inhibited starch granule swelling, solubility and pasting development, which further reduced peak and breakdown viscosity. During storage, LS-GMS complexes prepared at 70-100 MPa had higher Avrami exponent values and lower recrystallization rates compared with native starch, which suggests a lower retrogradation tendency. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Homogenization-based interval analysis for structural-acoustic problem involving periodical composites and multi-scale uncertain-but-bounded parameters.

    PubMed

    Chen, Ning; Yu, Dejie; Xia, Baizhan; Liu, Jian; Ma, Zhengdong

    2017-04-01

    This paper presents a homogenization-based interval analysis method for the prediction of coupled structural-acoustic systems involving periodical composites and multi-scale uncertain-but-bounded parameters. In the structural-acoustic system, the macro plate structure is assumed to be composed of a periodically uniform microstructure. The equivalent macro material properties of the microstructure are computed using the homogenization method. By integrating the first-order Taylor expansion interval analysis method with the homogenization-based finite element method, a homogenization-based interval finite element method (HIFEM) is developed to solve a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters. The corresponding formulations of the HIFEM are deduced. A subinterval technique is also introduced into the HIFEM for higher accuracy. Numerical examples of a hexahedral box and an automobile passenger compartment are given to demonstrate the efficiency of the presented method for a periodical composite structural-acoustic system with multi-scale uncertain-but-bounded parameters.
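
    The numerical core described here, a first-order Taylor interval analysis refined by a subinterval technique, can be illustrated independently of the structural-acoustic finite element machinery. In the sketch below the response function and its gradient are hypothetical stand-ins for a homogenized FE output; only the interval bookkeeping is the point.

    ```python
    import numpy as np

    def interval_bounds_first_order(u, grad_u, b_mid, b_rad):
        """First-order Taylor interval bounds of u(b) for b in [b_mid - b_rad, b_mid + b_rad]."""
        u0 = u(b_mid)
        half_width = np.dot(np.abs(grad_u(b_mid)), b_rad)   # sum_i |du/db_i| * radius_i
        return u0 - half_width, u0 + half_width

    def interval_bounds_subinterval(u, grad_u, b_mid, b_rad, n_sub=4):
        """Subinterval technique: split each parameter interval and take the union of the bounds."""
        axes = [b_mid[i] - b_rad[i] + (2 * np.arange(n_sub) + 1) * b_rad[i] / n_sub
                for i in range(len(b_mid))]                 # subinterval midpoints along each axis
        sub_rad = b_rad / n_sub
        lows, highs = [], []
        for mids in np.array(np.meshgrid(*axes)).T.reshape(-1, len(b_mid)):
            lo, hi = interval_bounds_first_order(u, grad_u, mids, sub_rad)
            lows.append(lo)
            highs.append(hi)
        return min(lows), max(highs)

    # Hypothetical smooth response standing in for a homogenized structural-acoustic FE output.
    u = lambda b: b[0] ** 2 + 3.0 * np.sin(b[1])
    grad_u = lambda b: np.array([2.0 * b[0], 3.0 * np.cos(b[1])])
    print(interval_bounds_first_order(u, grad_u, np.array([1.0, 0.5]), np.array([0.1, 0.2])))
    print(interval_bounds_subinterval(u, grad_u, np.array([1.0, 0.5]), np.array([0.1, 0.2])))
    ```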

  14. Cheese milk low homogenization enhanced early lipolysis and volatiles compounds production in hard cooked cheeses.

    PubMed

    Vélez, María A; Hynes, Erica R; Meinardi, Carlos A; Wolf, Verónica I; Perotti, María C

    2017-06-01

    Homogenization applied to cheese milk has been shown to increase lipolysis, but its use is not widespread as it can induce detrimental effects. The aim of this work was to assess the effect of low-pressure homogenization of the cream, followed by pre-incubation of the cheese milk, on the composition, ripening index, lipolysis and volatile profiles of hard cooked cheeses. For that purpose, control and experimental miniature Reggianito cheeses were made and analyzed during ripening (3, 45 and 90 days). Homogenization had no impact on composition and proteolysis. An acceleration of lipolysis was clearly noticeable at the beginning of ripening in cheeses made with homogenized milk, while both types of cheese reached similar levels at 90 days. We found that the levels of several compounds derived from fatty acid catabolism were noticeably influenced by the treatment applied: straight-chain aldehydes such as hexanal, heptanal and nonanal, and methyl ketones from C5 to C9, were preferentially formed in the experimental cheeses. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Non-identical multiplexing promotes chimera states

    NASA Astrophysics Data System (ADS)

    Ghosh, Saptarshi; Zakharova, Anna; Jalan, Sarika

    2018-01-01

    We present the emergence of chimeras, a state referring to coexistence of partly coherent, partly incoherent dynamics in networks of identical oscillators, in a multiplex network consisting of two non-identical layers which are interconnected. We demonstrate that the parameter range displaying the chimera state in the homogeneous first layer of the multiplex networks can be tuned by changing the link density or connection architecture of the same nodes in the second layer. We focus on the impact of the interconnected second layer on the enlargement or shrinking of the coupling regime for which chimeras are displayed in the homogeneous first layer. We find that a denser homogeneous second layer promotes chimera in a sparse first layer, where chimeras do not occur in isolation. Furthermore, while a dense connection density is required for the second layer if it is homogeneous, this is not true if the second layer is inhomogeneous. We demonstrate that a sparse inhomogeneous second layer which is common in real-world complex systems can promote chimera states in a sparse homogeneous first layer.

  16. Influence of Homogenization and Thermal Processing on the Gastrointestinal Fate of Bovine Milk Fat: In Vitro Digestion Study.

    PubMed

    Liang, Li; Qi, Ce; Wang, Xingguo; Jin, Qingzhe; McClements, David Julian

    2017-12-20

    Dairy lipids are an important source of energy and nutrients for infants and adults. The dimensions, aggregation state, and interfacial properties of fat globules in raw milk are changed by dairy processing operations, such as homogenization and thermal processing. These changes influence the behavior of fat globules within the human gastrointestinal tract (GIT). The gastrointestinal fate of raw milk, homogenized milk, high temperature short time (HTST) pasteurized milk, and ultrahigh temperature (UHT) pasteurized milk samples was therefore determined using a simulated GIT. The properties of particles in different regions of the GIT depended on the degree of milk processing. Homogenization increased the initial lipid digestion rate but did not influence the final digestion extent. Thermal processing of homogenized milk decreased the initial rate and final extent of lipid digestion, which was attributed to changes in interfacial structure. These results provide insights into the impact of dairy processing on the gastrointestinal fate of milk fat.

  17. Fresh broad (Vicia faba) tissue homogenate-based biosensor for determination of phenolic compounds.

    PubMed

    Ozcan, Hakki Mevlut; Sagiroglu, Ayten

    2014-08-01

    In this study, a novel fresh broad (Vicia faba) tissue homogenate-based biosensor for the determination of phenolic compounds was developed. The biosensor was constructed by immobilizing tissue homogenate of fresh broad (Vicia faba) onto a glassy carbon electrode. For the stability of the biosensor, the fresh broad tissue homogenate was secured in a gelatin-glutaraldehyde cross-linking matrix. In the optimization and characterization studies, the amounts of fresh broad tissue homogenate and gelatin, the glutaraldehyde percentage, optimum pH, optimum temperature, optimum buffer concentration, thermal stability, interference effects, linear range, storage stability, repeatability and sample applications (wine, beer, fruit juices) were also investigated. In addition, the detection ranges of thirteen phenolic compounds were obtained from the calibration graphs. A typical calibration curve for the sensor revealed a linear range of 5-60 μM catechol. In reproducibility studies, the coefficient of variation (CV) and standard deviation (SD) were calculated as 1.59% and 0.64×10⁻³ μM, respectively.

  18. Role of structural barriers for carotenoid bioaccessibility upon high pressure homogenization.

    PubMed

    Palmero, Paola; Panozzo, Agnese; Colle, Ines; Chigwedere, Claire; Hendrickx, Marc; Van Loey, Ann

    2016-05-15

    A specific approach was developed to investigate the effect of high pressure homogenization on carotenoid bioaccessibility in tomato-based products. Six different tomato-based model systems were reconstituted in order to target the specific roles of the natural structural barriers (chromoplast substructure/cell wall) and of the phases (soluble/insoluble) in determining the carotenoid bioaccessibility and viscosity changes upon high pressure homogenization. Results indicated that in the absence of natural structural barriers (carotenoid-enriched oil), the soluble and insoluble phases determined the carotenoid bioaccessibility upon processing, whereas in their presence these barriers governed the bioaccessibility. Furthermore, it was shown that the increase in viscosity upon high pressure homogenization is determined by the presence of the insoluble phase; this effect, however, was related to the initial ratio of the soluble to insoluble phases in the system. In addition, no relationship was found between the changes in viscosity and carotenoid bioaccessibility upon high pressure homogenization. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Method to study the effect of blend flowability on the homogeneity of acetaminophen.

    PubMed

    Llusá, Marcos; Pingali, Kalyana; Muzzio, Fernando J

    2013-02-01

    Excipient selection is key to product development because it affects the processability and physical properties of formulations, which ultimately affect the quality attributes of the pharmaceutical product. The objective was to study how the flowability of lubricated formulations affects acetaminophen (APAP) homogeneity. The formulations studied here contain one of two types of cellulose (Avicel 102 or Ceollus KG-802), one of three grades of Mallinckrodt APAP (fine, semi-fine, or micronized), lactose (Fast-Flo) and magnesium stearate. These components are mixed in a 300-liter bin blender. Blend flowability is assessed with the Gravitational Displacement Rheometer. APAP homogeneity is assessed with off-line NIR. Excluding blends dominated by segregation, there is a trend between APAP homogeneity and blend flow index. Blend flowability is affected by the type of microcrystalline cellulose and by the APAP grade. The preliminary results suggest that the methodology used in this paper is adequate to study the effect of blend flow index on APAP homogeneity.

  20. A scheme to calculate higher-order homogenization as applied to micro-acoustic boundary value problems

    NASA Astrophysics Data System (ADS)

    Vagh, Hardik A.; Baghai-Wadji, Alireza

    2008-12-01

    Current technological challenges in materials science and high-tech device industry require the solution of boundary value problems (BVPs) involving regions of various scales, e.g. multiple thin layers, fibre-reinforced composites, and nano/micro pores. In most cases straightforward application of standard variational techniques to BVPs of practical relevance necessarily leads to unsatisfactorily ill-conditioned analytical and/or numerical results. To remedy the computational challenges associated with sub-sectional heterogeneities various sophisticated homogenization techniques need to be employed. Homogenization refers to the systematic process of smoothing out the sub-structural heterogeneities, leading to the determination of effective constitutive coefficients. Ordinarily, homogenization involves a sophisticated averaging and asymptotic order analysis to obtain solutions. In the majority of the cases only zero-order terms are constructed due to the complexity of the processes involved. In this paper we propose a constructive scheme for obtaining homogenized solutions involving higher order terms, and thus, guaranteeing higher accuracy and greater robustness of the numerical results. We present
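
    As background for the scheme sketched above, the classical two-scale ansatz on which such higher-order homogenization rests expands the field in the scale-separation parameter ε = l/L (generic form, not the authors' specific operator):

    \[
      u^\varepsilon(x) \;=\; u_0\!\left(x, \tfrac{x}{\varepsilon}\right) \;+\; \varepsilon\, u_1\!\left(x, \tfrac{x}{\varepsilon}\right) \;+\; \varepsilon^2 u_2\!\left(x, \tfrac{x}{\varepsilon}\right) \;+\; \cdots,
    \]

    where each corrector u_k(x, y) is periodic in the fast variable y = x/ε and is obtained from a cell problem posed on the unit cell. Truncating after u_0 recovers the usual effective constitutive coefficients; retaining the ε and ε² terms is what yields the higher accuracy and robustness referred to in the abstract.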

  1. Theoretical investigation of mixing in warm clouds – Part 2: Homogeneous mixing

    DOE PAGES

    Pinsky, Mark; Khain, Alexander; Korolev, Alexei; ...

    2016-07-28

    Evolution of monodisperse and polydisperse droplet size distributions (DSD) during homogeneous mixing is analyzed. Time-dependent universal analytical expressions for supersaturation and liquid water content are derived. For an initial monodisperse DSD, these quantities are shown to depend on a sole non-dimensional parameter. The evolution of moments and moment-related functions in the course of homogeneous evaporation of polydisperse DSD is analyzed using a parcel model. It is shown that the classic conceptual scheme, according to which homogeneous mixing leads to a decrease in droplet mass at constant droplet concentration, is valid only in cases of monodisperse or initially very narrow polydisperse DSD. In cases of wide polydisperse DSD, mixing and successive evaporation lead to a decrease of both mass and concentration, so the characteristic droplet sizes remain nearly constant. As this feature is typically associated with inhomogeneous mixing, we conclude that in cases of an initially wide DSD at cloud top, homogeneous mixing is nearly indistinguishable from inhomogeneous mixing.

  2. Refined Zigzag Theory for Homogeneous, Laminated Composite, and Sandwich Plates: A Homogeneous Limit Methodology for Zigzag Function Selection

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; DiSciuva, Marco; Gherlone, Marco

    2010-01-01

    The Refined Zigzag Theory (RZT) for homogeneous, laminated composite, and sandwich plates is presented from a multi-scale formalism starting with the in-plane displacement field expressed as a superposition of coarse and fine contributions. The coarse kinematic field is that of first-order shear-deformation theory, whereas the fine kinematic field has a piecewise-linear zigzag distribution through the thickness. The condition of limiting homogeneity of transverse-shear properties is proposed and yields four distinct sets of zigzag functions. By examining elastostatic solutions for highly heterogeneous sandwich plates, the best-performing zigzag functions are identified. The RZT predictive capabilities to model homogeneous and highly heterogeneous sandwich plates are critically assessed, demonstrating its superior efficiency, accuracy, and wide range of applicability. The present theory, which is derived from the virtual work principle, is well-suited for developing computationally efficient C0-continuous finite elements, and is thus appropriate for the analysis and design of high-performance load-bearing aerospace structures.
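
    The kinematic assumption summarized above (a first-order shear-deformation field enriched by a piecewise-linear zigzag contribution) is usually written, in RZT notation, as

    \[
      u_x^{(k)}(x,y,z) = u(x,y) + z\,\theta_x(x,y) + \phi_x^{(k)}(z)\,\psi_x(x,y), \qquad
      u_y^{(k)}(x,y,z) = v(x,y) + z\,\theta_y(x,y) + \phi_y^{(k)}(z)\,\psi_y(x,y), \qquad
      u_z(x,y,z) = w(x,y),
    \]

    where (u, v, w, θ_x, θ_y) are the coarse first-order shear-deformation variables, ψ_x and ψ_y are the zigzag amplitudes, and φ_x^(k), φ_y^(k) are the layerwise piecewise-linear zigzag functions whose admissible forms are fixed by the limiting-homogeneity condition discussed in the abstract.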

  3. [Characteristics of the glutamate decarboxylase reaction in homogenates of various regions of the rat brain].

    PubMed

    Rozanov, V A

    1987-01-01

    The glutamate decarboxylase activity in crude homogenates of the cerebellum, cortex and truncal part of the rat brain was studied under different incubation conditions: in the presence of 25 mM sodium glutamate, 0.4 mM pyridoxal-5'-phosphate, or both of these components. It was found that the initial glutamate decarboxylase activity in cerebellum homogenates is approximately twice as high as in the cortex and trunk homogenates. Addition of the substrate and cofactor, especially in combination, considerably stimulates the yield of gamma-aminobutyric acid (GABA) in the glutamate decarboxylase reaction, the most pronounced activation being observed in the truncal homogenates. The glutamate/GABA ratio, both initially and after completion of the reaction, is maximal in the cortex and minimal in the truncal part of the brain. The data obtained provide evidence for differences in the content of the GABA-producing enzyme rather than for the presence of specific mechanisms of enzyme regulation in different brain areas.

  4. Predicting equilibrium states with Reynolds stress closures in channel flow and homogeneous shear flow

    NASA Technical Reports Server (NTRS)

    Abid, R.; Speziale, C. G.

    1993-01-01

    Turbulent channel flow and homogeneous shear flow have served as basic building block flows for the testing and calibration of Reynolds stress models. A direct theoretical connection is made between homogeneous shear flow in equilibrium and the log-layer of fully-developed turbulent channel flow. It is shown that if a second-order closure model is calibrated to yield good equilibrium values for homogeneous shear flow it will also yield good results for the log-layer of channel flow provided that the Rotta coefficient is not too far removed from one. Most of the commonly used second-order closure models introduce an ad hoc wall reflection term in order to mask deficient predictions for the log-layer of channel flow that arise either from an inaccurate calibration of homogeneous shear flow or from the use of a Rotta coefficient that is too large. Illustrative model calculations are presented to demonstrate this point which has important implications for turbulence modeling.

  5. Predicting equilibrium states with Reynolds stress closures in channel flow and homogeneous shear flow

    NASA Technical Reports Server (NTRS)

    Abid, R.; Speziale, C. G.

    1992-01-01

    Turbulent channel flow and homogeneous shear flow have served as basic building block flows for the testing and calibration of Reynolds stress models. A direct theoretical connection is made between homogeneous shear flow in equilibrium and the log-layer of fully-developed turbulent channel flow. It is shown that if a second-order closure model is calibrated to yield good equilibrium values for homogeneous shear flow it will also yield good results for the log-layer of channel flow provided that the Rotta coefficient is not too far removed from one. Most of the commonly used second-order closure models introduce an ad hoc wall reflection term in order to mask deficient predictions for the log-layer of channel flow that arise either from an inaccurate calibration of homogeneous shear flow or from the use of a Rotta coefficient that is too large. Illustrative model calculations are presented to demonstrate this point which has important implications for turbulence modeling.

  6. Effects of thermal inhomogeneity on 4m class mirror substrates

    NASA Astrophysics Data System (ADS)

    Jedamzik, Ralf; Kunisch, Clemens; Westerhoff, Thomas

    2016-07-01

    The new generation of ground-based telescopes is moving to a next stage of performance and resolution. Tolerances on mirror substrate material properties and their homogeneity are coming into focus. The homogeneity of the coefficient of thermal expansion (CTE) is even more important than the absolute CTE. The shape error of a mirror, even one made of ZERODUR, is affected by changes in temperature and by temperature gradients. Front-to-back gradients will change the radius of curvature R, which in turn will change the focus. Some systems rely on passive athermalization and have no means to refocus. Similarly, changes in soak temperature will result in surface changes to the extent that the coefficient of thermal expansion is non-zero. When there are inhomogeneities in CTE, the mirror will react accordingly. Results of numerical experiments are presented discussing the impact of CTE inhomogeneities on the optical performance of 4 m class mirror substrates. The latest improvements in 4 m class ZERODUR CTE homogeneity and in thermal expansion metrology are presented as well.

  7. Homogeneity of the coefficient of linear thermal expansion of ZERODUR: a review of a decade of evaluations

    NASA Astrophysics Data System (ADS)

    Jedamzik, Ralf; Westerhoff, Thomas

    2017-09-01

    The coefficient of thermal expansion (CTE) and its spatial homogeneity from small to large formats is the most important property of ZERODUR. For more than a decade, SCHOTT has documented its excellent CTE homogeneity. This started with reviews of past astronomical telescope projects such as the VLT, Keck and GTC mirror blanks and continued with dedicated evaluations of the production. In recent years, extensive CTE measurements on samples cut from randomly selected single ZERODUR parts of meter size and arbitrary shape, from large production boules, and even from 4 m sized blanks have demonstrated the excellent CTE homogeneity in production. The published homogeneity data show peak-to-valley CTE variations at the single ppb/K level on medium spatial scales of several centimeters down to small spatial scales of only a few millimeters, mostly at the limit of the measurement reproducibility. This review paper summarizes the results, also with respect to the increased CTE measurement accuracy over the last decade of ZERODUR production.

  8. Global Well-posedness of the Spatially Homogeneous Kolmogorov-Vicsek Model as a Gradient Flow

    NASA Astrophysics Data System (ADS)

    Figalli, Alessio; Kang, Moon-Jin; Morales, Javier

    2018-03-01

    We consider the so-called spatially homogeneous Kolmogorov-Vicsek model, a non-linear Fokker-Planck equation for self-driven stochastic particles with orientation interaction, under the assumption of spatial homogeneity. We prove the global existence and uniqueness of weak solutions to the equation. We also show that weak solutions converge exponentially to a steady state, which has the form of the Fisher-von Mises distribution.
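
    For reference, the Fisher-von Mises steady state mentioned here is, on the unit sphere in R³, the density

    \[
      f(\omega) = \frac{\kappa}{4\pi\sinh\kappa}\,\exp\!\left(\kappa\,\mu\cdot\omega\right), \qquad |\omega| = |\mu| = 1,
    \]

    where μ is the mean orientation and κ ≥ 0 is the concentration parameter; κ → 0 recovers the uniform distribution, while large κ concentrates the orientations around μ.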

  9. Simple Köhler homogenizers for image-forming solar concentrators

    NASA Astrophysics Data System (ADS)

    Zhang, Weiya; Winston, Roland

    2010-08-01

    By adding simple Köhler homogenizers in the form of aspheric lenses generated with an optimization approach, we solve the problems of non-uniform irradiance distribution and non-square irradiance pattern existing in some image-forming solar concentrators. The homogenizers do not require optical bonding to the solar cells or total internal reflection surface. Two examples are shown including a Fresnel lens based concentrator and a two-mirror aplanatic system.

  10. Differential responses of Africanized and European honey bees (Apis mellifera) to viral replication following mechanical transmission or Varroa destructor parasitism.

    PubMed

    Hamiduzzaman, Mollah Md; Guzman-Novoa, Ernesto; Goodwin, Paul H; Reyes-Quintana, Mariana; Koleoglu, Gun; Correa-Benítez, Adriana; Petukhova, Tatiana

    2015-03-01

    For the first time, adults and brood of Africanized and European honey bees (Apis mellifera) were compared for relative virus levels over 48 h following Varroa destructor parasitism or injection of V. destructor homogenate. Rates of increase of deformed wing virus (DWV) for Africanized versus European bees were temporarily lowered for 12 h with parasitism and sustainably lowered over the entire experiment (48 h) with homogenate injection in adults. The rates were also temporarily lowered for 24 h with parasitism but were not affected by homogenate injection in brood. Rates of increase of black queen cell virus (BQCV) for Africanized versus European bees were similar with parasitism but sustainably lowered over the entire experiment with homogenate injection in adults, and were similar for parasitism and homogenate injection in brood. Analyses of sac brood bee virus and Israeli acute paralysis virus were limited, as detection did not occur after both homogenate injection and parasitism treatment, or levels were not significantly higher than those following control buffer injection. Lower rates of replication of DWV and BQCV in Africanized bees show that they may have greater viral resistance, at least early after treatment. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Alterations in regional homogeneity of resting-state cerebral activity in patients with chronic prostatitis/chronic pelvic pain syndrome.

    PubMed

    Lin, Yusong; Bai, Yan; Liu, Peng; Yang, Xuejuan; Qin, Wei; Gu, Jianqin; Ding, Degang; Tian, Jie; Wang, Meiyun

    2017-01-01

    The purpose of this study was to explore the neural mechanisms of chronic prostatitis/chronic pelvic pain syndrome (CP/CPPS) using resting-state functional magnetic resonance imaging. Functional magnetic resonance imaging was performed on 31 male CP/CPPS patients and 31 age- and education-matched male healthy controls on a 3-T magnetic resonance imaging unit. A two-sample t-test was adopted to compare regional homogeneity between the patients and healthy controls. The mean regional homogeneity values in the altered brain regions of patients were correlated with the clinical measurements using Pearson's correlation analyses. The CP/CPPS patients had significantly decreased regional homogeneity in the bilateral anterior cingulate cortices, insular cortices and right medial prefrontal cortex, and significantly increased regional homogeneity in the brainstem and right thalamus, compared with the healthy controls. In the CP/CPPS patients, the mean regional homogeneity values in the left anterior cingulate cortex, bilateral insular cortices and brainstem were respectively correlated with the National Institutes of Health Chronic Prostatitis Symptom Index total score and pain subscale. These brain regions are important in the pain modulation process. Therefore, an impaired pain modulatory system, either through decreased descending pain inhibition or enhanced pain facilitation, may explain the pain symptoms in CP/CPPS.

  12. Fragility and super-strong character of non-stoichiometric chalcogenides: implications on melt homogenization

    NASA Astrophysics Data System (ADS)

    Ravindren, Sriram; Gunasekera, Kapila; Boolchand, Punit; Micoulaut, Matthieu

    2014-03-01

    The kinetics of homogenization of binary AsxSe100-x melts in the As concentration range 0%

  13. A comparison of techniques for preparing fish fillet for ICP-AES multielemental analysis and the microwave digestion of whole fish.

    PubMed

    Moeller, A; Ambrose, R F; Que Hee, S S

    2001-01-01

    Four catfish fillet homogenate treatments prior to multielemental metal analysis by simultaneous inductively coupled plasma/atomic emission spectroscopy were compared in triplicate. The treatments were: nitric acid wet-ashing by Parr bomb digestion; nitric acid wet-ashing by microwave digestion; tetramethylammonium hydroxide/nitric acid wet digestion; and dry-ashing. The tetramethylammonium hydroxide/nitric acid method was imprecise (coefficients of variation > 20%). The dry-ashing method was fast and sensitive but had low recoveries of 50% for spiked Pb and Al and was not as precise as the Parr bomb or microwave treatments. The Parr bomb method was the most precise but was less sensitive than the microwave method, which had nearly the same precision. The microwave method was then adapted to homogenates of small whole fish ≤3 cm in length. The whole fish homogenate required more vigorous digestion conditions and the addition of more acid after the evaporative step, because it contains components that are less oxidizable and less acid-soluble than fillet. The whole fish homogenate was also more heterogeneous than catfish fillet. A quality assurance protocol to demonstrate homogenate uniformity is essential. The use of a non-specialized microwave oven system allowed precise results for fillet and whole fish homogenates.

  14. Numerical simulation of elasto-plastic deformation of composites: evolution of stress microfields and implications for homogenization models

    NASA Astrophysics Data System (ADS)

    González, C.; Segurado, J.; LLorca, J.

    2004-07-01

    The deformation of a composite made up of a random and homogeneous dispersion of elastic spheres in an elasto-plastic matrix was simulated by the finite element analysis of three-dimensional multiparticle cubic cells with periodic boundary conditions. "Exact" results (to a few percent) in tension and shear were determined by averaging 12 stress-strain curves obtained from cells containing 30 spheres, and they were compared with the predictions of secant homogenization models. In addition, the numerical simulations supplied detailed information of the stress microfields, which was used to ascertain the accuracy and the limitations of the homogenization models to include the nonlinear deformation of the matrix. It was found that secant approximations based on the volume-averaged second-order moment of the matrix stress tensor, combined with a highly accurate linear homogenization model, provided excellent predictions of the composite response when the matrix strain hardening rate was high. This was not the case, however, in composites which exhibited marked plastic strain localization in the matrix. The analysis of the evolution of the matrix stresses revealed that better predictions of the composite behavior can be obtained with new homogenization models which capture the essential differences in the stress carried by the elastic and plastic regions in the matrix at the onset of plastic deformation.
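
    As context for the class of models being benchmarked (a generic sketch of the idea, written for an incompressible matrix; not the authors' exact formulation), a modified secant scheme replaces the elasto-plastic matrix at each macroscopic load level by a linear comparison material whose secant shear modulus is read off the matrix flow curve at an effective stress built from the second-order moment of the matrix stress field:

    \[
      \hat{\sigma}_m = \sqrt{\left\langle \sigma_{eq}^2 \right\rangle_m}, \qquad
      \mu_m^{\mathrm{sec}} = \frac{\hat{\sigma}_m}{3\,\varepsilon_{eq}(\hat{\sigma}_m)},
    \]

    where ⟨·⟩_m is the volume average over the matrix and ε_eq(σ) is the matrix uniaxial flow curve inverted for strain. The linear comparison composite (secant matrix plus elastic spheres) is then homogenized with an accurate linear scheme to obtain the overall response, which is exactly the step whose accuracy the multiparticle cell computations are used to assess.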

  15. Gauge Fields in Homogeneous and Inhomogeneous Cosmologies

    NASA Astrophysics Data System (ADS)

    Darian, Bahman K.

    Despite its formidable appearance, the study of classical Yang-Mills (YM) fields on homogeneous cosmologies is amenable to a formal treatment. This dissertation is a report on a systematic approach to the general construction of invariant YM fields on homogeneous cosmologies undertaken for the first time in this context. This construction is subsequently followed by the investigation of the behavior of YM field variables for the most simple of self-gravitating YM fields. Particularly interesting was a dynamical system analysis and the discovery of chaotic signature in the axially symmetric Bianchi I-YM cosmology. Homogeneous YM fields are well studied and are known to have chaotic properties. The chaotic behavior of YM field variables in homogeneous cosmologies might eventually lead to an invariant definition of chaos in (general) relativistic cosmological models. By choosing the gauge fields to be Abelian, the construction and the field equations presented so far reduce to that of electromagnetic field in homogeneous cosmologies. A perturbative analysis of gravitationally interacting electromagnetic and scalar fields in inhomogeneous cosmologies is performed via the Hamilton-Jacobi formulation of general relativity. An essential feature of this analysis is the spatial gradient expansion of the generating functional (Hamilton principal function) to solve the Hamiltonian constraint. Perturbations of a spatially flat Friedman-Robertson-Walker cosmology with an exponential potential for the scalar field are presented.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, J; Hu, W; Xing, Y

    Purpose: Different particle scanning beam delivery systems have different delivery accuracies. This study was performed to determine, for our particle treatment system, an appropriate ratio (n = FWHM/GS) of spot size (FWHM) to grid size (GS) that can provide homogeneous delivered dose distributions for both proton and heavy ion scanning beam radiotherapy. Methods: We analyzed the delivery errors of our beam delivery system using log files from the treatment of 28 patients. We used a homemade program to simulate square fields for different n values with and without the delivery errors and analyzed the homogeneity. All spots were located on a rectilinear grid with equal spacing in the x and y directions. After that, we selected 7 energy levels for both protons and carbon ions. For each energy level, we made 6 square field plans with different n values (1, 1.5, 2, 2.5, 3, 3.5). We then delivered those plans and used films to measure the homogeneity of each field. Results: For the program simulations without delivery errors, the homogeneity can be kept within ±3% when n≥1.1. For both proton and carbon program simulations with delivery errors and for the film measurements, the homogeneity can be kept within ±3% when n≥2.5. Conclusion: For our facility with its system errors, n≥2.5 is appropriate for maintaining homogeneity within ±3%.
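
    The quantity studied here, the ratio n = FWHM/GS, controls how smoothly neighbouring Gaussian spots overlap. The sketch below is an idealized 1-D illustration with error-free delivery (not the clinical simulation code used in the study); it reproduces the qualitative trend that, without delivery errors, the dose ripple of a uniform field falls off very rapidly with increasing n.

    ```python
    import numpy as np

    def field_ripple(n, fwhm=10.0, n_spots=21):
        """Peak-to-valley dose ripple (% of mean) of a 1-D row of equal-weight Gaussian spots."""
        sigma = fwhm / 2.355                     # Gaussian sigma from FWHM
        gs = fwhm / n                            # grid spacing for the requested n = FWHM/GS
        centers = (np.arange(n_spots) - n_spots // 2) * gs
        x = np.linspace(-2 * gs, 2 * gs, 2001)   # sample only the well-covered central region
        dose = sum(np.exp(-0.5 * ((x - c) / sigma) ** 2) for c in centers)
        return 100.0 * (dose.max() - dose.min()) / dose.mean()

    for n in (1.0, 1.5, 2.0, 2.5, 3.0):
        print(f"n = {n:.1f}: ripple = {field_ripple(n):.4f} %")
    ```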

  17. Biochemical studies of amylase, lipase and protease in Callosobruchus maculatus (Coleoptera: Chrysomelidae) populations fed with Vigna unguiculata grain cultivated with diazotrophic bacteria strains.

    PubMed

    Silva, L B; Torres, É B; Nóbrega, R A S; Lopes, G N; Vogado, R F; Pavan, B E; Fernandes-Junior, P I

    2017-12-01

    The objective of this study was to evaluate the enzymatic activity of homogenates of insects fed on grain of cowpea, Vigna unguiculata (L.), cultivars grown with different nitrogen sources. For the experiment we used aliquots of the homogenate of 100 unsexed adult insects that emerged from 10 g of grain obtained from four cowpea cultivars ('BRS Acauã', 'BRS Carijó', 'BRS Pujante', and 'BRS Tapaihum') grown under different nitrogen-source regimes: mineral fertilizer, inoculation with strains of diazotrophs (BR 3267, BR 3262, BR 3299; INPA 03-11B, 03-84 UFLA), and a control (soil nitrogen only). The parameters evaluated were the enzymatic activities of insect protease, amylase and lipase, and the starch content of the grains. There were differences in the amylase, lipase and protease activities of the insect homogenates according to the food source. A lower amylase activity of the C. maculatus homogenate was observed when insects were fed grain of the cultivar BRS Carijó. A lower lipase activity of the C. maculatus homogenate was observed when the insects fed on grain from the combination of the cultivar BRS Tapaihum with the diazotroph strain BR 3262. The lowest proteolytic activity was observed in homogenates of insects fed on grain from the combination of 'BRS Carijó' with strain BR 3262. Starch content correlated positively with the amylase activity of the C. maculatus homogenate. According to the cluster analysis, the cultivar BRS Carijó behaved differently from the other cultivars.

  18. Elastic full waveform inversion based on the homogenization method: theoretical framework and 2-D numerical illustrations

    NASA Astrophysics Data System (ADS)

    Capdeville, Yann; Métivier, Ludovic

    2018-05-01

    Seismic imaging is an efficient tool to investigate the Earth's interior. Many of the imaging techniques currently used, including the so-called full waveform inversion (FWI), are based on limited frequency band data. Such data are not sensitive to the true earth model, but to a smooth version of it. This smooth version can be related to the true model by the homogenization technique. Homogenization for wave propagation in deterministic media with no scale separation, such as geological media, has recently been developed. With such an asymptotic theory, it is possible to compute an effective medium, valid for a given frequency band, such that effective waveforms and true waveforms are the same up to a controlled error. In this work we make the link between limited frequency band inversion, mainly FWI, and homogenization. We establish the relation between a true model and an FWI result model. This relation is important for a proper interpretation of FWI images. We numerically illustrate, in the 2-D case, that an FWI result is at best the homogenized version of the true model. Moreover, it appears that the homogenized FWI model is quite independent of the FWI parametrization, as long as it has enough degrees of freedom. In particular, inverting for the full elastic tensor is, in each of our tests, always a good choice. We show how homogenization can help to understand FWI behaviour and to improve its robustness and convergence by efficiently constraining the solution space of the inverse problem.

  19. Assessing the homogenization of urban land management with an application to US residential lawn care

    PubMed Central

    Polsky, Colin; Grove, J. Morgan; Knudson, Chris; Groffman, Peter M.; Bettez, Neil; Cavender-Bares, Jeannine; Hall, Sharon J.; Heffernan, James B.; Hobbie, Sarah E.; Larson, Kelli L.; Morse, Jennifer L.; Neill, Christopher; Nelson, Kristen C.; Ogden, Laura A.; O’Neil-Dunne, Jarlath; Pataki, Diane E.; Roy Chowdhury, Rinku; Steele, Meredith K.

    2014-01-01

    Changes in land use, land cover, and land management present some of the greatest potential global environmental challenges of the 21st century. Urbanization, one of the principal drivers of these transformations, is commonly thought to be generating land changes that are increasingly similar. An implication of this multiscale homogenization hypothesis is that the ecosystem structure and function and human behaviors associated with urbanization should be more similar in certain kinds of urbanized locations across biogeophysical gradients than across urbanization gradients in places with similar biogeophysical characteristics. This paper introduces an analytical framework for testing this hypothesis, and applies the framework to the case of residential lawn care. This set of land management behaviors are often assumed—not demonstrated—to exhibit homogeneity. Multivariate analyses are conducted on telephone survey responses from a geographically stratified random sample of homeowners (n = 9,480), equally distributed across six US metropolitan areas. Two behaviors are examined: lawn fertilizing and irrigating. Limited support for strong homogenization is found at two scales (i.e., multi- and single-city; 2 of 36 cases), but significant support is found for homogenization at only one scale (22 cases) or at neither scale (12 cases). These results suggest that US lawn care behaviors are more differentiated in practice than in theory. Thus, even if the biophysical outcomes of urbanization are homogenizing, managing the associated sustainability implications may require a multiscale, differentiated approach because the underlying social practices appear relatively varied. The analytical approach introduced here should also be productive for other facets of urban-ecological homogenization. PMID:24616515

  20. Influence of Interspecific Competition and Landscape Structure on Spatial Homogenization of Avian Assemblages

    PubMed Central

    Robertson, Oliver J.; McAlpine, Clive; House, Alan; Maron, Martine

    2013-01-01

    Human-induced biotic homogenization resulting from landscape change and increased competition from widespread generalists or ‘winners’, is widely recognized as a global threat to biodiversity. However, it remains unclear what aspects of landscape structure influence homogenization. This paper tests the importance of interspecific competition and landscape structure, for the spatial homogeneity of avian assemblages within a fragmented agricultural landscape of eastern Australia. We used field observations of the density of 128 diurnal bird species to calculate taxonomic and functional similarity among assemblages. We then examined whether taxonomic and functional similarity varied with patch type, the extent of woodland habitat, land-use intensity, habitat subdivision, and the presence of Manorina colonies (a competitive genus of honeyeaters). We found the presence of a Manorina colony was the most significant factor positively influencing both taxonomic and functional similarity of bird assemblages. Competition from members of this widespread genus of native honeyeater, rather than landscape structure, was the main cause of both taxonomic and functional homogenization. These species have not recently expanded their range, but rather have increased in density in response to agricultural landscape change. The negative impacts of Manorina honeyeaters on assemblage similarity were most pronounced in landscapes of moderate land-use intensity. We conclude that in these human-modified landscapes, increased competition from dominant native species, or ‘winners’, can result in homogeneous avian assemblages and the loss of specialist species. These interacting processes make biotic homogenization resulting from land-use change a global threat to biodiversity in modified agro-ecosystems. PMID:23724136

  1. Type of homogenization and fat loss during continuous infusion of human milk.

    PubMed

    García-Lara, Nadia Raquel; Escuder-Vieco, Diana; Alonso Díaz, Clara; Vázquez Román, Sara; De la Cruz-Bértolo, Javier; Pallás-Alonso, Carmen Rosa

    2014-11-01

    Substantial fat loss may occur during continuous feeding of human milk (HM). A decrease in fat loss has been described following homogenization. Well-established methods for homogenizing HM for routine use in the neonatal intensive care unit (NICU) would be desirable. We compared fat loss with 3 different methods of homogenizing thawed HM during continuous feeding. Sixteen frozen donor HM samples were thawed, homogenized with ultrasound, separated into 3 aliquots ("baseline agitation," "hourly agitation," and "ultrasound"), and then frozen for 48 hours. Aliquots were thawed again and a baseline agitation was applied. Subsequently, the "baseline agitation" and "hourly agitation" aliquots were drawn into syringes, while ultrasound was applied to the "ultrasound" aliquot before it was drawn into a syringe. The syringes were loaded into a pump (2 mL/h; 4 hours). At hourly intervals the "hourly agitation" infusion was stopped, and the syringe was disconnected and gently shaken. During infusion, samples from the 3 groups were collected hourly for analysis of fat and caloric content. The 3 homogenization groups showed similar fat content at the beginning of the infusion. For fat, mean (SD) hourly changes of -0.03 (0.01), -0.09 (0.01), and -0.09 (0.01) g/dL were observed for the hourly agitation, baseline agitation, and ultrasound groups, respectively. The decrease was smaller for the hourly agitation group (P < .001). When thawed HM is continuously infused, a smaller fat loss is observed when syringes are agitated hourly than when ultrasound or only a baseline homogenization is used. © The Author(s) 2014.

  2. Processing effects on physicochemical properties of creams formulated with modified milk fat.

    PubMed

    Bolling, J C; Duncan, S E; Eigel, W N; Waterman, K M

    2005-04-01

    Type of thermal process [high temperature, short time pasteurization (HTST) or ultra-high temperature pasteurization (UHT)] and homogenization sequence (before or after pasteurization) were examined for influence on the physicochemical properties of natural cream (20% milk fat) and creams formulated with 20% low-melt, fractionated butteroil emulsified with skim milk, or buttermilk and butter-derived aqueous phase. Homogenization sequence influenced physicochemical makeup of the creams. Creams homogenized before pasteurization contained more milk fat surface material, higher phospholipid levels, and less protein at the milk fat interface than creams homogenized after pasteurization. Phosphodiesterase I activity was higher (relative to protein on lipid globule surface) when cream was homogenized before pasteurization. Creams formulated with skim milk and modified milk fat had relatively more phospholipid adsorbed at the milk fat interface. Ultra-high-temperature-pasteurized natural and reformulated creams were higher in viscosity at all shear rates investigated compared with HTST-pasteurized creams. High-temperature, short time-pasteurized natural cream was more viscous than HTST-pasteurized reformulated creams at most shear rates investigated. High-temperature, short time-pasteurized creams had better emulsion stability than UHT-pasteurized creams. Cream formulated with buttermilk had creaming stability most comparable to natural cream, and cream formulated with skim milk and modified butteroil was least stable to creaming. Most creams feathered in a pH range of 5.00 to 5.20, indicating that they were moderately stable to slightly unstable emulsions. All processing sequences yielded creams within sensory specifications with the exception of treatments homogenized before UHT pasteurization and skim milk formulations homogenized after UHT pasteurization.

  3. Diversity and Biotic Homogenization of Urban Land-Snail Faunas in Relation to Habitat Types and Macroclimate in 32 Central European Cities

    PubMed Central

    Horsák, Michal; Lososová, Zdeňka; Čejka, Tomáš; Juřičková, Lucie; Chytrý, Milan

    2013-01-01

    The effects of non-native species invasions on community diversity and biotic homogenization have been described for various taxa in urban environments, but not for land snails. Here we relate the diversity of native and non-native land-snail urban faunas to urban habitat types and macroclimate, and analyse homogenization effects of non-native species across cities and within the main urban habitat types. Land-snail species were recorded in seven 1-ha plots in 32 cities of ten countries of Central Europe and Benelux (224 plots in total). Each plot represented one urban habitat type characterized by different management and a specific disturbance regime. For each plot, we obtained January, July and mean annual temperature and annual precipitation. Snail species were classified into either native or non-native. The effects of habitat type and macroclimate on the number of native and non-native species were analysed using generalized estimating equations; the homogenization effect of non-native species based on the Jaccard similarity index and homogenization index. We recorded 67 native and 20 non-native species. Besides being more numerous, native species also had much higher beta diversity than non-natives. There were significant differences between the studied habitat types in the numbers of native and non-native species, both of which decreased from less to heavily urbanized habitats. Macroclimate was more important for the number of non-native than native species; however in both cases the effect of climate on diversity was overridden by the effect of urban habitat type. This is the first study on urban land snails documenting that non-native land-snail species significantly contribute to homogenization among whole cities, but both the homogenization and diversification effects occur when individual habitat types are compared among cities. This indicates that the spread of non-native snail species may cause biotic homogenization, but it depends on scale and habitat type. PMID:23936525

  4. Comparison of Methods to Assay Liver Glycogen Fractions: The Effects of Starvation

    PubMed Central

    Mojibi, Nastaran

    2017-01-01

    Introduction: There are several methods to extract and measure glycogen in animal tissues. Glycogen is extracted with or without homogenization by using cold perchloric acid (PCA). Aim: Three procedures were compared for determining glycogen fractions in rat liver at different physiological states. Materials and Methods: The present study was conducted on two groups of rats: one group of five rats was fed standard rodent laboratory food and served as controls, and another five rats were starved overnight (15 hours) as cases. The glycogen fractions were extracted and measured using three methods: classical homogenization, total-glycogen-fractionation and homogenization-free protocols. Results: The data from the homogenization method showed that following 15-hour starvation, total glycogen decreased (36.4±1.9 vs. 27.7±2.5, p=0.01) and the change occurred entirely in Acid Soluble Glycogen (ASG) (32.0±1.1 vs. 22.7±2.5, p=0.01), while Acid Insoluble Glycogen (AIG) did not change significantly (4.9±0.9 vs. 4.6±0.3, p=0.7). Similar results were achieved using the total-glycogen-fractionation method. The homogenization-free procedure indicated that the ASG and AIG fractions comprise about 2/3 and 1/3 of total glycogen, respectively, and that changes occurred in both the ASG (24.4±2.6 vs. 16.7±0.4, p<0.05) and AIG fractions (8.7±0.8 vs. 7.1±0.3, p=0.05). Conclusion: The findings of the homogenization assay method indicate that ASG is the major portion of liver glycogen and is the more metabolically active form. The same results were obtained using the total-glycogen-fractionation method. The homogenization-free method gave different results, because the AIG fraction was contaminated with ASG. In both the homogenization and homogenization-free methods, ASG must be extracted at least twice to prevent contamination of AIG with ASG. PMID:28511372

  5. Diversity and biotic homogenization of urban land-snail faunas in relation to habitat types and macroclimate in 32 central European cities.

    PubMed

    Horsák, Michal; Lososová, Zdeňka; Čejka, Tomáš; Juřičková, Lucie; Chytrý, Milan

    2013-01-01

    The effects of non-native species invasions on community diversity and biotic homogenization have been described for various taxa in urban environments, but not for land snails. Here we relate the diversity of native and non-native land-snail urban faunas to urban habitat types and macroclimate, and analyse homogenization effects of non-native species across cities and within the main urban habitat types. Land-snail species were recorded in seven 1-ha plots in 32 cities of ten countries of Central Europe and Benelux (224 plots in total). Each plot represented one urban habitat type characterized by different management and a specific disturbance regime. For each plot, we obtained January, July and mean annual temperature and annual precipitation. Snail species were classified into either native or non-native. The effects of habitat type and macroclimate on the number of native and non-native species were analysed using generalized estimating equations; the homogenization effect of non-native species based on the Jaccard similarity index and homogenization index. We recorded 67 native and 20 non-native species. Besides being more numerous, native species also had much higher beta diversity than non-natives. There were significant differences between the studied habitat types in the numbers of native and non-native species, both of which decreased from less to heavily urbanized habitats. Macroclimate was more important for the number of non-native than native species; however in both cases the effect of climate on diversity was overridden by the effect of urban habitat type. This is the first study on urban land snails documenting that non-native land-snail species significantly contribute to homogenization among whole cities, but both the homogenization and diversification effects occur when individual habitat types are compared among cities. This indicates that the spread of non-native snail species may cause biotic homogenization, but it depends on scale and habitat type.

  6. Favorable effect of optimal lipid-lowering therapy on neointimal tissue characteristics after drug-eluting stent implantation: qualitative optical coherence tomographic analysis.

    PubMed

    Jang, Ji-Yong; Kim, Jung-Sun; Shin, Dong-Ho; Kim, Byeong-Keuk; Ko, Young-Guk; Choi, Donghoon; Jang, Yangsoo; Hong, Myeong-Ki

    2015-10-01

    Serial follow-up optical coherence tomography (OCT) was used to evaluate the effect of optimal lipid-lowering therapy on qualitative changes in neointimal tissue characteristics after drug-eluting stent (DES) implantation. DES-treated patients (n = 218) who received statin therapy were examined with serial follow-up OCT. First and second follow-up OCT evaluations were performed approximately 6 and 18 months after the index procedure, respectively. Patients were divided into two groups based on the level of low-density lipoprotein cholesterol (LDL-C) measured at the second follow-up: the optimal lipid-lowering group (n = 121), with an LDL-C reduction of ≥50% or an LDL-C level ≤70 mg/dL, and the conventional group (n = 97). Neointimal characteristics were qualitatively categorized as homogeneous or non-homogeneous patterns using OCT. The non-homogeneous group included heterogeneous, layered, or neoatherosclerosis patterns. Qualitative changes in neointimal tissue characteristics between the first and second follow-up OCT examinations were assessed. Between the first and second follow-up OCT procedures, the neointimal cross-sectional area increased more substantially in the conventional group (0.4 mm² vs. 0.2 mm² in the optimal lipid-lowering group, p = 0.01). The neointimal pattern changed from homogeneous to non-homogeneous less often in the optimal lipid-lowering group (1.3%, 1/77, p < 0.001) than in the conventional group (15.3%, 11/72, p = 0.44). Optimal LDL-C reduction was an independent predictor for the prevention of neointimal pattern change from homogeneous to non-homogeneous (odds ratio: 0.05, 95% confidence interval: 0.01-0.46, p = 0.008). Our findings suggest that an intensive reduction in LDL-C levels can prevent non-homogeneous changes in the neointima and increases in neointimal cross-sectional area compared with conventional LDL-C control. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  7. SU-E-T-417: The Impact of Normal Tissue Constraints On PTV Dose Homogeneity for Intensity Modulated Radiotherapy (IMRT), Volume Modulated Arc Therapy (VMAT) and Tomotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peng, J; McDonald, D; Ashenafi, M

    2014-06-01

    Purpose: Complex intensity modulated arc therapy tends to spread low dose to normal tissue (NT) regions in order to obtain improved target conformity and homogeneity and OAR sparing. This work evaluates the trade-offs between PTV homogeneity and reduction of the maximum dose (Dmax) spread to NT when planning IMRT, VMAT and Tomotherapy. Methods: Ten prostate patients, previously planned with step-and-shoot IMRT, were selected. To fairly evaluate how PTV homogeneity was affected by NT Dmax constraints, the original IMRT DVH objectives for the PTV and OARs (femoral heads, and rectal and bladder wall) were applied to 2 VMAT plans in Pinnacle (V9.0) and to Tomotherapy (V4.2). The only constraint difference was the NT, which was defined as the body contour excluding targets, OARs and dose rings. The NT Dmax constraint for the 1st VMAT plan was set to the prescription dose (Dp). For the 2nd VMAT plan (VMAT-NT) and Tomotherapy, it was set to the Dmax achieved in IMRT (~70-80% of Dp). All NT constraints were set to the lowest priority. Three common homogeneity indices (HI), RTOG-HI = Dmax/Dp, moderated-HI = D95%/D5% and complex-HI = (D2%-D98%)/Dp*100, were calculated. Results: All modalities achieved similar dosimetric endpoints for the PTV and OARs. The complex-HI showed the most variability among the indices, with average values of 5.9, 4.9, 9.3 and 6.1 for IMRT, VMAT, VMAT-NT and Tomotherapy, respectively. VMAT provided the best PTV homogeneity without compromising any OAR/NT sparing. Both VMAT-NT and Tomotherapy, planned with more restrictive NT constraints, showed reduced homogeneity, with VMAT-NT showing the worst homogeneity (P<0.0001) for all HI. Tomotherapy gave the lowest NT Dmax, with slightly decreased homogeneity compared to VMAT. Finally, there was no significant difference in NT Dmax or Dmean between VMAT and VMAT-NT. Conclusion: PTV HI is highly dependent on the permitted NT constraints. The results demonstrated that VMAT-NT with more restrictive NT constraints does not reduce the NT Dmax, but instead yields a significantly higher Dmax and worse target homogeneity. Therefore, it is critical that planners do not use overly restrictive NT constraints during VMAT optimization. The Tomotherapy plan was not as sensitive to NT constraints; however, care should be taken to ensure NT is not pushed too hard. These results are relevant for clinical practice. The biological effect of higher Dmax and increased target heterogeneity needs further study.
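
    The three homogeneity indices quoted in the Methods are simple functionals of the PTV dose distribution. A minimal sketch of how they could be computed from a sampled PTV dose array and the prescription dose Dp (illustrative only, not the planning-system implementation; Dx% denotes the dose received by x% of the PTV volume):

    ```python
    import numpy as np

    def homogeneity_indices(ptv_dose, dp):
        """RTOG, moderated and complex homogeneity indices from a sampled PTV dose distribution."""
        d = np.asarray(ptv_dose, dtype=float)
        # Dx% (dose covering x% of the volume) corresponds to the (100 - x)th percentile of the samples
        d2, d5, d95, d98 = np.percentile(d, [98, 95, 5, 2])
        return {
            "RTOG-HI": d.max() / dp,                 # Dmax / Dp
            "moderated-HI": d95 / d5,                # D95% / D5%
            "complex-HI": 100.0 * (d2 - d98) / dp,   # (D2% - D98%) / Dp * 100
        }

    # example: a nearly uniform PTV dose (hypothetical 78 Gy prescription with 2% noise)
    rng = np.random.default_rng(0)
    sample = 78.0 * (1.0 + 0.02 * rng.standard_normal(10000))
    print(homogeneity_indices(sample, dp=78.0))
    ```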

  8. Simultaneous dual mode combustion engine operating on spark ignition and homogenous charge compression ignition

    DOEpatents

    Fiveland, Scott B.; Wiggers, Timothy E.

    2004-06-22

    An engine particularly suited to single speed operation environments, such as stationary power generators. The engine includes a plurality of combustion cylinders operable under homogenous charge compression ignition, and at least one combustion cylinder operable on spark ignition concepts. The cylinder operable on spark ignition concepts can be convertible to operate under homogenous charge compression ignition. The engine is started using the cylinders operable under spark ignition concepts.

  9. Generating highly uniform electromagnetic field characteristics

    DOEpatents

    Crow, James Terry

    1998-01-01

    An apparatus and method for generating homogenous electromagnetic fields within a volume. The homogeneity provided may be for magnetic and/or electric fields, and for field magnitude, radial gradient, or higher order radial derivative. The invention comprises conductive pathways oriented mirror symmetrically about a desired region of homogeneity. A corresponding apparatus and method is provided for substantially canceling the electromagnetic field outside of the apparatus, comprising a second set of conductive pathways placed outside the first set.

  10. Generating highly uniform electromagnetic field characteristics

    DOEpatents

    Crow, James T.

    1998-01-01

    An apparatus and method for generating homogenous electromagnetic fields within a volume. The homogeneity provided may be for magnetic and/or electric fields, and for field magnitude, radial gradient, or higher order radial derivative. The invention comprises conductive pathways oriented about a desired region of homogeneity. A corresponding apparatus and method is provided for substantially canceling the electromagnetic field outside of the apparatus, comprising a second set of conductive pathways placed outside the first set.

  11. Homogenization of periodic bi-isotropic composite materials

    NASA Astrophysics Data System (ADS)

    Ouchetto, Ouail; Essakhi, Brahim

    2018-07-01

    In this paper, we present a new method for homogenizing bi-periodic materials with bi-isotropic component phases. The presented method is a numerical method based on the finite element method and computes the local electromagnetic properties. The homogenized constitutive parameters are expressed as a function of the macroscopic electromagnetic properties, which are obtained from the local properties. The obtained results are compared to the Unfolding Finite Element Method and to Maxwell-Garnett formulas.

  12. Generating highly uniform electromagnetic field characteristics

    DOEpatents

    Crow, James T.

    1997-01-01

    An apparatus and method for generating homogenous electromagnetic fields within a volume. The homogeneity provided may be for magnetic and/or electric fields, and for field magnitude, radial gradient, or higher order radial derivative. The invention comprises conductive pathways oriented mirror symmetrically about a desired region of homogeneity. A corresponding apparatus and method is provided for substantially cancelling the electromagnetic field outside of the apparatus, comprising a second set of conductive pathways placed outside the first set.

  13. Asymmetric dipolar ring

    DOEpatents

    Prosandeev, Sergey A.; Ponomareva, Inna V.; Kornev, Igor A.; Bellaiche, Laurent M.

    2010-11-16

    A device having a dipolar ring surrounding an interior region that is disposed asymmetrically on the ring. The dipolar ring generates a toroidal moment switchable between at least two stable states by a homogeneous field applied to the dipolar ring in the plane of the ring. The ring may be made of ferroelectric or magnetic material. In the former case, the homogeneous field is an electric field and in the latter case, the homogeneous field is a magnetic field.

  14. Large Area Crop Inventory Experiment (LACIE). Second-generation sampling strategy evaluation report. [Kansas, North Dakota, and U.S.S.R.

    NASA Technical Reports Server (NTRS)

    Basu, J. P. (Principal Investigator); Dragich, S. M.; Mcguigan, D. P.

    1978-01-01

    The author has identified the following significant results. The stratification procedure in the new sampling strategy for LACIE included: (1) correlation test results indicating that an agrophysical stratum may be homogeneous with respect to agricultural density, but not with respect to wheat density; and (2) agrophysical unit homogeneity test results indicating that with respect to agricultural density many agrophysical units are not homogeneous, but removal of one or more refined strata from any such current agrophysical unit can make the strata homogeneous. The apportioning procedure results indicated that the current procedure is not performing well and that the apportioned estimates of refined strata wheat area are often unreliable.

  15. Reactive sintering of ceramic lithium ion electrolyte membranes

    DOEpatents

    Badding, Michael Edward; Dutta, Indrajit; Iyer, Sriram Rangarajan; Kent, Brian Alan; Lonnroth, Nadja Teresia

    2017-06-06

    Disclosed herein are methods for making a solid lithium ion electrolyte membrane, the methods comprising combining a first reactant chosen from amorphous, glassy, or low melting temperature solid reactants with a second reactant chosen from refractory oxides to form a mixture; heating the mixture to a first temperature to form a homogenized composite, wherein the first temperature is between a glass transition temperature of the first reactant and a crystallization onset temperature of the mixture; milling the homogenized composite to form homogenized particles; casting the homogenized particles to form a green body; and sintering the green body at a second temperature to form a solid membrane. Solid lithium ion electrolyte membranes manufactured according to these methods are also disclosed herein.

  16. Stability analysis for virus spreading in complex networks with quarantine and non-homogeneous transition rates

    NASA Astrophysics Data System (ADS)

    Alarcon-Ramos, L. A.; Schaum, A.; Rodríguez Lucatero, C.; Bernal Jaquez, R.

    2014-03-01

    Virus propagations in complex networks have been studied in the framework of discrete-time Markov process dynamical systems. These studies have been carried out under the assumption of homogeneous transition rates, yielding conditions for virus extinction in terms of the transition probabilities and the largest eigenvalue of the connectivity matrix. Nevertheless, the assumption of homogeneous rates is rather restrictive. In the present study we consider non-homogeneous transition rates, assigned according to a uniform distribution, with susceptible, infected and quarantine states, thus generalizing the previous studies. A remarkable result of this analysis is that extinction depends on the weakest element in the network. Simulation results are presented for large scale-free networks that corroborate our theoretical findings.
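
    As context for the eigenvalue condition mentioned above, the classical homogeneous-rate criterion can be checked numerically: extinction is expected when the effective spreading rate lies below the inverse of the largest eigenvalue of the connectivity matrix. The sketch below is a minimal illustration under that homogeneous-rate assumption; the random graph and the infection/recovery probabilities (beta, delta) are invented for demonstration and are not the paper's model.

        import numpy as np

        # Illustrative check of the classical homogeneous-rate extinction condition
        # beta / delta < 1 / lambda_max(A), which the cited work generalizes to
        # non-homogeneous rates and an additional quarantine state.
        rng = np.random.default_rng(1)
        n = 200
        A = (rng.random((n, n)) < 0.05).astype(float)   # random contact graph (assumed)
        A = np.maximum(A, A.T)                          # make it undirected
        np.fill_diagonal(A, 0.0)

        lambda_max = np.max(np.linalg.eigvals(A).real)  # largest eigenvalue of the connectivity matrix
        beta, delta = 0.02, 0.5                         # infection and recovery probabilities (assumed)
        print("epidemic threshold 1/lambda_max =", 1.0 / lambda_max)
        print("effective spreading rate beta/delta =", beta / delta)
        print("extinction expected:", beta / delta < 1.0 / lambda_max)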

  17. Study of homogeneity and inhomogeneity phantom in CUDA EGS for small field dosimetry

    NASA Astrophysics Data System (ADS)

    Yani, Sitti; Rhani, Mohamad Fahdillah; Haryanto, Freddy; Arif, Idam

    2017-02-01

    CUDA EGS is a CUDA implementation of photon transport simulation in a material, based on a Monte Carlo algorithm for X-ray imaging. The objective of this study was to investigate the effect of inhomogeneities in an inhomogeneous phantom for small-field dosimetry (1×1, 2×2, 3×3, 4×4 and 5×5 cm²). Two phantoms, a homogeneous and an inhomogeneous phantom, were used. The interactions in both the homogeneous and the inhomogeneous phantom were dominated by Compton interaction and multiple scattering. CUDA EGS can represent the inhomogeneity effect in small-field dosimetry by combining the grayscale curves of the homogeneous and inhomogeneous phantoms. The grayscale curve of the inhomogeneous phantom is not symmetric because of the presence of different materials in the phantom.

  18. Gravitational self-interactions of a degenerate quantum scalar field

    NASA Astrophysics Data System (ADS)

    Chakrabarty, Sankha S.; Enomoto, Seishi; Han, Yaqi; Sikivie, Pierre; Todarello, Elisa M.

    2018-02-01

    We develop a formalism to help calculate in quantum field theory the departures from the description of a system by classical field equations. We apply the formalism to a homogeneous condensate with attractive contact interactions and to a homogeneous self-gravitating condensate in critical expansion. In their classical descriptions, such condensates persist forever. We show that in their quantum description, parametric resonance causes quanta to jump in pairs out of the condensate into all modes with wave vector less than some critical value. We calculate, in each case, the time scale over which the homogeneous condensate is depleted and after which a classical description is invalid. We argue that the duration of classicality of inhomogeneous condensates is shorter than that of homogeneous condensates.

  19. Effects of Annular Electromagnetic Stirring Coupled with Intercooling on Grain Refinement and Homogeneity During Direct Chill Casting of Large-Sized 7005 Alloy Billet

    NASA Astrophysics Data System (ADS)

    Luo, Yajun; Zhang, Zhifeng; Li, Bao; Gao, Mingwei; Qiu, Yang; He, Min

    2017-12-01

    To obtain a large-sized, high-quality aluminum alloy billet, an advanced uniform direct chill (UDC) casting method was developed by combining annular electromagnetic stirring (A-EMS) with intercooling in the sump. The 7005 alloy was chosen to investigate the effect of UDC on grain refinement and homogeneity during normal direct chill (NDC) casting. It was concluded that the microstructure consisting of both primary α-Al phase and secondary phases becomes finer and more homogeneous for the billets prepared with UDC casting compared to those prepared with NDC casting, and the forced cooling from both the inner and outer melt under A-EMS has a measurable effect on grain refinement and homogeneity.

  20. Homogeneous-heterogeneous reactions in curved channel with porous medium

    NASA Astrophysics Data System (ADS)

    Hayat, T.; Ayub, Sadia; Alsaedi, A.

    2018-06-01

    The purpose of the present investigation is to examine peristaltic flow through a porous medium in a curved conduit. The problem is modeled for an incompressible, electrically conducting Ellis fluid. The influence of the porous medium is handled via a modified Darcy's law. The considered model utilizes homogeneous-heterogeneous reactions with equal diffusivities for the reactant and the autocatalyst. The constitutive equations are formulated in the presence of viscous dissipation. The channel walls are compliant in nature. The governing equations are modeled and simplified under the assumptions of small Reynolds number and large wavelength. Graphical results for velocity, temperature, heat transfer coefficient and the homogeneous-heterogeneous reaction parameters are examined for the emerging parameters of the problem. The results reveal an enhancement of both the homogeneous-heterogeneous reaction effect and the heat transfer rate with increasing curvature of the channel.

  1. Volatile loss during homogenization of lunar melt inclusions

    NASA Astrophysics Data System (ADS)

    Ni, Peng; Zhang, Youxue; Guan, Yunbin

    2017-11-01

    Volatile abundances in the lunar mantle are critical factors to consider for constraining the model of Moon formation. Recently, the earlier understanding of a 'dry' Moon has shifted to a fairly 'wet' Moon due to the detection of measurable amounts of H2O in lunar volcanic glass beads, mineral grains, and olivine-hosted melt inclusions. The ongoing debate on a 'dry' or 'wet' Moon requires further studies on lunar melt inclusions to obtain a broader understanding of volatile abundances in the lunar mantle. One important uncertainty for lunar melt inclusion studies, however, is whether the homogenization of melt inclusions would cause volatile loss. In this study, a series of homogenization experiments were conducted on olivine-hosted melt inclusions from the sample 74220 to evaluate the possible loss of volatiles during homogenization of lunar melt inclusions. Our results suggest that significant loss of H2O could occur even during minutes of homogenization, while F, Cl and S in the inclusions remain unaffected. We model the trend of H2O loss in homogenized melt inclusions by a diffusive hydrogen loss model. The model can reconcile the observed experimental data well, with a best-fit H diffusivity in accordance with diffusion data explained by the 'slow' mechanism for hydrogen diffusion in olivine. Surprisingly, no significant effect of the low oxygen fugacity on the Moon is observed on the diffusive loss of hydrogen during homogenization of lunar melt inclusions under reducing conditions. Our experimental and modeling results show that diffusive H loss is negligible for melt inclusions of >25 μm radius. While our results mitigate the concern of H2O loss during homogenization for crystalline lunar melt inclusions, we found that H2O/Ce ratios in melt inclusions from different lunar samples vary with the degree of crystallization. Such a variation is more likely due to H2O loss on the lunar surface, although heterogeneity in their lunar mantle source is also a possibility. A similar size-dependent trend of H2O concentrations was also observed in natural unheated melt inclusions in 74220. By comparing the trend of diffusive H loss in the natural MIs and in our homogenized MIs, the cooling rate for 74220 was estimated to be ∼1 °C/s or slower.

  2. A 14 h⁻³ Gpc³ study of cosmic homogeneity using BOSS DR12 quasar sample

    NASA Astrophysics Data System (ADS)

    Laurent, Pierre; Le Goff, Jean-Marc; Burtin, Etienne; Hamilton, Jean-Christophe; Hogg, David W.; Myers, Adam; Ntelis, Pierros; Pâris, Isabelle; Rich, James; Aubourg, Eric; Bautista, Julian; Delubac, Timothée; du Mas des Bourboux, Hélion; Eftekharzadeh, Sarah; Palanque Delabrouille, Nathalie; Petitjean, Patrick; Rossi, Graziano; Schneider, Donald P.; Yeche, Christophe

    2016-11-01

    The BOSS quasar sample is used to study cosmic homogeneity with a 3D survey in the redshift range 2.2 < z < 2.8. We measure the count-in-sphere, N(<r), i.e. the average number of objects around a given object, and its logarithmic derivative, the fractal correlation dimension, D2(r). For a homogeneous distribution, N(<r) ∝ r³ and D2(r) = 3. Due to the uncertainty on tracer density evolution, 3D surveys can only probe homogeneity up to a redshift dependence, i.e. they probe so-called "spatial isotropy". Our data demonstrate spatial isotropy of the quasar distribution in the redshift range 2.2 < z < 2.8 in a model-independent way, independent of any FLRW fiducial cosmology, resulting in 3 − ⟨D2⟩ < 1.7 × 10⁻³ (2σ) over the range 250 < r < 1200 h⁻¹ Mpc for the quasar distribution. If we assume that quasars do not have a bias much less than unity, this implies spatial isotropy of the matter distribution on large scales. Then, combining with the Copernican principle, we finally get homogeneity of the matter distribution on large scales. Alternatively, using a flat ΛCDM fiducial cosmology with CMB-derived parameters, and measuring the quasar bias relative to this ΛCDM model, our data provide a consistency check of the model, in terms of how homogeneous the Universe is on different scales. D2(r) is found to be compatible with our ΛCDM model over the whole 10 < r < 1200 h⁻¹ Mpc range. For the matter distribution we obtain 3 − ⟨D2⟩ < 5 × 10⁻⁵ (2σ) over the range 250 < r < 1200 h⁻¹ Mpc, consistent with homogeneity on large scales.
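
    For reference, the quantities used above are related by a simple definition (written here in LaTeX as a sketch consistent with the abstract): the fractal correlation dimension is the logarithmic slope of the count-in-sphere, and homogeneity corresponds to D2 = 3.

        D_2(r) \equiv \frac{d\,\ln N(<r)}{d\,\ln r},
        \qquad
        N(<r) \propto r^{D_2},
        \qquad
        \text{homogeneity} \;\Longleftrightarrow\; N(<r) \propto r^{3} \;\Longleftrightarrow\; D_2(r) = 3 .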

  3. Chemical zoning and homogenization of olivines in ordinary chondrites and implications for thermal histories of chondrules

    NASA Technical Reports Server (NTRS)

    Miyamoto, Masamichi; Mckay, David S.; Mckay, Gordon A.; Duke, Michael B.

    1986-01-01

    The extent and degree of homogenization of chemical zoning of olivines in type 3 ordinary chondrites is studied in order to obtain some constraints on the cooling histories of chondrites. Based on Mg-Fe and CaO zoning, olivines in type 3 chondrites are classified into four types. A single chondrule usually contains olivines with the same type of zoning. Microporphyritic olivines show all four zoning types. Barred olivines usually show almost homogenized chemical zoning. The cooling rates or burial depths needed to homogenize the chemical zoning are calculated by solving the diffusion equation, using the zoning profiles as an initial condition. Mg-Fe zoning of olivine may be altered during initial cooling, whereas CaO zoning is hardly changed. Barred olivines may be homogenized during initial cooling because their size is relatively small. To simulate microporphyritic olivine chondrules, cooling from just below the liquidus at moderately high rates is preferable to cooling from above the liquidus at low rates. For postaccumulation metamorphism of type 3 chondrites to keep Mg-Fe zoning unaltered, the maximum metamorphic temperature must be less than about 400 C if cooling rates based on Fe-Ni data are assumed. Calculated cooling rates for both Fa and CaO homogenization are consistent with those from Fe-Ni data for type 4 chondrites. A hot ejecta blanket several tens of meters thick on the surface of a parent body is sufficient to homogenize Mg-Fe zoning if the temperature of the blanket is 600-700 C. Burial depths for petrologic types of ordinary chondrites in a parent body heated by Al-26 are broadly consistent with those previously proposed.
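
    The cooling-rate and burial-depth estimates above come from solving the diffusion equation with the measured zoning profile as the initial condition. A generic one-dimensional form with an Arrhenius diffusivity and a prescribed cooling history is sketched below in LaTeX; the symbols D_0, E_a and the linear-cooling form are illustrative assumptions, not values from the paper.

        \frac{\partial C}{\partial t}
          = \frac{\partial}{\partial x}\!\left( D\bigl(T(t)\bigr)\,\frac{\partial C}{\partial x} \right),
        \qquad
        D(T) = D_0 \exp\!\left(-\frac{E_a}{R\,T}\right),
        \qquad
        T(t) = T_0 - \dot{T}\, t ,

    with C(x, 0) set to the measured Mg-Fe or CaO zoning profile: fast cooling (large \dot{T}) preserves the zoning, whereas slow cooling or deep burial homogenizes it.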

  4. Proton Minibeam Radiation Therapy Reduces Side Effects in an In Vivo Mouse Ear Model.

    PubMed

    Girst, Stefanie; Greubel, Christoph; Reindl, Judith; Siebenwirth, Christian; Zlobinskaya, Olga; Walsh, Dietrich W M; Ilicic, Katarina; Aichler, Michaela; Walch, Axel; Wilkens, Jan J; Multhoff, Gabriele; Dollinger, Günther; Schmid, Thomas E

    2016-05-01

    Proton minibeam radiation therapy is a novel approach to minimize normal tissue damage in the entrance channel by spatial fractionation while keeping tumor control through a homogeneous tumor dose using beam widening with an increasing track length. In the present study, the dose distributions for homogeneous broad beam and minibeam irradiation sessions were simulated. Also, in an animal study, acute normal tissue side effects of proton minibeam irradiation were compared with homogeneous irradiation in a tumor-free mouse ear model to account for the complex effects on the immune system and vasculature in an in vivo normal tissue model. At the ion microprobe SNAKE, 20-MeV protons were administered to the central part (7.2 × 7.2 mm²) of the ear of BALB/c mice, using either a homogeneous field with a dose of 60 Gy or 16 minibeams with a nominal 6000 Gy (4 × 4 minibeams, size 0.18 × 0.18 mm², with a distance of 1.8 mm). The same average dose was used over the irradiated area. No ear swelling or other skin reactions were observed at any point after minibeam irradiation. In contrast, significant ear swelling (up to fourfold), erythema, and desquamation developed in homogeneously irradiated ears 3 to 4 weeks after irradiation. Hair loss and the disappearance of sebaceous glands were only detected in the homogeneously irradiated fields. These results show that proton minibeam radiation therapy results in reduced adverse effects compared with conventional homogeneous broad-beam irradiation and, therefore, might have the potential to decrease the incidence of side effects resulting from clinical proton and/or heavy ion therapy. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  5. The role of grammatical category information in spoken word retrieval.

    PubMed

    Duràn, Carolina Palma; Pillon, Agnesa

    2011-01-01

    We investigated the role of lexical syntactic information such as grammatical gender and category in spoken word retrieval processes by using a blocking paradigm in picture and written word naming experiments. In Experiments 1, 3, and 4, we found that the naming of target words (nouns) from pictures or written words was faster when these target words were named within a list where only words from the same grammatical category had to be produced (homogeneous category list: all nouns) than when they had to be produced within a list also comprising words from another grammatical category (heterogeneous category list: nouns and verbs). On the other hand, we detected no significant facilitation effect when the target words had to be named within a homogeneous gender list (all masculine nouns) compared to a heterogeneous gender list (both masculine and feminine nouns). In Experiment 2, using the same blocking paradigm but manipulating the semantic category of the items, we found that naming latencies were significantly slower in the semantically homogeneous condition than in the semantically heterogeneous condition. Thus, semantic category homogeneity caused interference rather than the facilitation effect produced by grammatical category homogeneity. Finally, in Experiment 5, nouns in the heterogeneous category condition had to be named just after a verb (category-switching position) or a noun (same-category position). We found a facilitation effect of category homogeneity but no significant effect of position, which showed that the effect of category homogeneity found in Experiments 1, 3, and 4 was not due to a cost of switching between grammatical categories in the heterogeneous grammatical category list. These findings support the hypothesis that grammatical category information impacts word retrieval processes in speech production, even when words are to be produced in isolation. They are discussed within the context of extant theories of lexical production.

  6. Layout optimization using the homogenization method

    NASA Technical Reports Server (NTRS)

    Suzuki, Katsuyuki; Kikuchi, Noboru

    1993-01-01

    A generalized layout problem involving sizing, shape, and topology optimization is solved by using the homogenization method for three-dimensional linearly elastic shell structures in order to seek a possibility of establishment of an integrated design system of automotive car bodies, as an extension of the previous work by Bendsoe and Kikuchi. A formulation of a three-dimensional homogenized shell, a solution algorithm, and several examples of computing the optimum layout are presented in this first part of the two articles.

  7. Homogeneous cosmological models and new inflation

    NASA Technical Reports Server (NTRS)

    Turner, Michael S.; Widrow, Lawrence M.

    1986-01-01

    The promise of the inflationary-universe scenario is to free the present state of the universe from extreme dependence upon initial data. Paradoxically, inflation is usually analyzed in the context of the homogeneous and isotropic Robertson-Walker cosmological models. It is shown that all but a small subset of the homogeneous models undergo inflation. Any initial anisotropy is so strongly damped that if sufficient inflation occurs to solve the flatness and horizon problems, the universe today would still be very isotropic.

  8. Temperature lowering program for homogeneous doping in flux growth

    NASA Astrophysics Data System (ADS)

    Qiwei, Wang; Shouquan, Jia

    1989-10-01

    Based on the mass conservation law and the Burton-Prim-Slichter equation, the temperature program for homogeneous doping in flux growth by slow cooling was derived. The effect of various factors, such as initial supersaturation, solution volume, growth kinetic coefficient and degree of mixing in the solution on growth rate, crystal size and temperature program is discussed in detail. Theoretical analysis shows that there is a critical crystal size above which homogeneous doping is impossible.

  9. Catalytic photodegradation of pharmaceuticals - homogeneous and heterogeneous photocatalysis.

    PubMed

    Klementova, S; Kahoun, D; Doubkova, L; Frejlachova, K; Dusakova, M; Zlamal, M

    2017-01-18

    Photocatalytic degradation of pharmaceuticals (hydrocortisone, estradiol, and verapamil) and personal care product additives (parabens: methyl, ethyl, and propyl derivatives) was investigated in the homogeneous phase (with ferric ions as the catalyst) and on TiO2. Ferric ions in concentrations corresponding to concentrations in natural water bodies were shown to be a significant accelerator of the degradation in homogeneous reaction mixtures. In heterogeneous photocatalytic reactions on TiO2, lower reaction rates, but mineralisation to higher extents, were observed.

  10. Organic, Organometallic and Bioorganic Catalysts for Electrochemical Reduction of CO2

    PubMed Central

    Schlager, Stefanie; Portenkirchner, Engelbert; Sariciftci, Niyazi Serdar

    2017-01-01

    Abstract A broad review of homogeneous and heterogeneous catalytic approaches toward CO2 reduction using organic, organometallic, and bioorganic systems is provided. Electrochemical, bioelectrochemical and photoelectrochemical approaches are discussed in terms of their faradaic efficiencies, overpotentials and reaction mechanisms. Organometallic complexes as well as semiconductors and their homogeneous and heterogeneous catalytic activities are compared to enzymes. In both cases, their immobilization on electrodes is discussed and compared to homogeneous catalysts in solution. PMID:28383174

  11. Generating highly uniform electromagnetic field characteristics

    DOEpatents

    Crow, J.T.

    1997-06-24

    An apparatus and method are disclosed for generating homogeneous electromagnetic fields within a volume. The homogeneity provided may be for magnetic and/or electric fields, and for field magnitude, radial gradient, or higher order radial derivative. The invention comprises conductive pathways oriented mirror symmetrically about a desired region of homogeneity. A corresponding apparatus and method is provided for substantially canceling the electromagnetic field outside of the apparatus, comprising a second set of conductive pathways placed outside the first set. 26 figs.

  12. Generating highly uniform electromagnetic field characteristics

    DOEpatents

    Crow, J.T.

    1998-05-05

    An apparatus and method are disclosed for generating homogeneous electromagnetic fields within a volume. The homogeneity provided may be for magnetic and/or electric fields, and for field magnitude, radial gradient, or higher order radial derivative. The invention comprises conductive pathways oriented about a desired region of homogeneity. A corresponding apparatus and method is provided for substantially canceling the electromagnetic field outside of the apparatus, comprising a second set of conductive pathways placed outside the first set. 55 figs.

  13. Generating highly uniform electromagnetic field characteristics

    DOEpatents

    Crow, J.T.

    1998-02-10

    An apparatus and method for generating homogeneous electromagnetic fields within a volume is disclosed. The homogeneity provided may be for magnetic and/or electric fields, and for field magnitude, radial gradient, or higher order radial derivative. The invention comprises conductive pathways oriented mirror symmetrically about a desired region of homogeneity. A corresponding apparatus and method is provided for substantially canceling the electromagnetic field outside of the apparatus, comprising a second set of conductive pathways placed outside the first set. 39 figs.

  14. U.S. Joint Special Operations Forces: Two Few, Overworked, Young, Homogenous & Macho to Fulfill the Unconventional Demands of the Long War?

    DTIC Science & Technology

    2008-05-28

    ...to be the targets of nearly daily mortar, improvised explosive device (IED), and occasional suicide vehicle-borne IED (SVBIED) attacks...

  15. Preparation and Immunoaffinity Depletion of Fresh Frozen Tissue Homogenates for Mass Spectrometry-Based Proteomics in the Context of Drug Target/Biomarker Discovery.

    PubMed

    Prieto, DaRue A; Chan, King C; Johann, Donald J; Ye, Xiaoying; Whitely, Gordon; Blonder, Josip

    2017-01-01

    The discovery of novel drug targets and biomarkers via mass spectrometry (MS)-based proteomic analysis of clinical specimens has proven to be challenging. The wide dynamic range of protein concentration in clinical specimens and the high background/noise originating from highly abundant proteins in tissue homogenates and serum/plasma encompass two major analytical obstacles. Immunoaffinity depletion of highly abundant blood-derived proteins from serum/plasma is a well-established approach adopted by numerous researchers; however, the utilization of this technique for immunodepletion of tissue homogenates obtained from fresh frozen clinical specimens is lacking. We first developed immunoaffinity depletion of highly abundant blood-derived proteins from tissue homogenates, using renal cell carcinoma as a model disease, and followed this study by applying it to different tissue types. Tissue homogenate immunoaffinity depletion of highly abundant proteins may be equally important as is the recognized need for depletion of serum/plasma, enabling more sensitive MS-based discovery of novel drug targets, and/or clinical biomarkers from complex clinical samples. Provided is a detailed protocol designed to guide the researcher through the preparation and immunoaffinity depletion of fresh frozen tissue homogenates for two-dimensional liquid chromatography, tandem mass spectrometry (2D-LC-MS/MS)-based molecular profiling of tissue specimens in the context of drug target and/or biomarker discovery.

  16. Differential reactivities of four homogeneous assays for LDL-cholesterol in serum to intermediate-density lipoproteins and small dense LDL: comparisons with the Friedewald equation.

    PubMed

    Yamashita, Shizuya; Kawase, Ryota; Nakaoka, Hajime; Nakatani, Kazuhiro; Inagaki, Miwako; Yuasa-Kawase, Miyako; Tsubakio-Yamamoto, Kazumi; Sandoval, Jose C; Masuda, Daisaku; Ohama, Tohru; Nakagawa-Toyama, Yumiko; Matsuyama, Akifumi; Nishida, Makoto; Ishigami, Masato

    2009-12-01

    In routine clinical laboratory testing and numerous epidemiological studies, LDL-cholesterol (LDL-C) has been estimated commonly using the Friedewald equation. We investigated the relationship between the Friedewald equation and 4 homogeneous assays for LDL-C. LDL-C was determined by 4 homogeneous assays [liquid selective detergent method: LDL-C (L), selective solubilization method: LDL-C (S), elimination method: LDL-C (E), and enzyme selective protecting method: LDL-C (P)]. Samples with discrepancies between the Friedewald equation and the 4 homogeneous assays for LDL-C were subjected to polyacrylamide gel electrophoresis and the beta-quantification method. The correlations between the Friedewald equation and the 4 homogeneous LDL-C assays were as follows: LDL-C (L) (r=0.962), LDL-C (S) (r=0.986), LDL-C (E) (r=0.946) and LDL-C (P) (r=0.963). Discrepancies were observed in sera from type III hyperlipoproteinemia patients and in sera containing large amounts of midband and small dense LDL on polyacrylamide gel electrophoresis. LDL-C (S) was most strongly correlated with the beta-quantification method even in sera from patients with type III hyperlipoproteinemia. Of the 4 homogeneous assays for LDL-C, LDL-C (S) exhibited the closest correlation with the Friedewald equation and the beta-quantification method, thus reflecting the current clinical databases for coronary heart disease.
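
    For context, the Friedewald estimate that the four homogeneous assays are compared against is a simple arithmetic formula (in mg/dL). The sketch below is illustrative only; the function name is an assumption, and the formula is the standard Friedewald equation, which is known to break down at high triglyceride levels and in type III hyperlipoproteinemia, exactly where the abstract reports discrepancies.

        def friedewald_ldl_c(total_c: float, hdl_c: float, triglycerides: float) -> float:
            """Estimate LDL-C (mg/dL) by the Friedewald equation.

            LDL-C = TC - HDL-C - TG/5, where TG/5 approximates VLDL cholesterol.
            Not valid for TG > ~400 mg/dL (illustrative check only)."""
            if triglycerides > 400:
                raise ValueError("Friedewald equation is not valid for TG > 400 mg/dL")
            return total_c - hdl_c - triglycerides / 5.0

        print(friedewald_ldl_c(total_c=200.0, hdl_c=50.0, triglycerides=150.0))  # -> 120.0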

  17. Optimization and characterization of high pressure homogenization produced chemically modified starch nanoparticles.

    PubMed

    Ding, Yongbo; Kan, Jianquan

    2017-12-01

    Chemically modified starch (RS4) nanoparticles were synthesized through homogenization and water-in-oil mini-emulsion cross-linking. Homogenization was optimized with regard to z-average diameter using a three-factor, three-level Box-Behnken design. Homogenization pressure (X1), oil/water ratio (X2), and surfactant (X3) were selected as independent variables, whereas z-average diameter was the dependent variable. The following optimum preparation conditions were obtained to achieve the minimum average size of the nanoparticles: 50 MPa homogenization pressure, 10:1 oil/water ratio, and 2 g surfactant, for which the predicted z-average diameter was 303.6 nm. The physicochemical properties of these nanoparticles were also determined. Dynamic light scattering experiments revealed that the RS4 nanoparticles had a PdI of 0.380 and an average size of approximately 300 nm, very close to the predicted z-average diameter (303.6 nm). The absolute value of the zeta potential of the RS4 nanoparticles (39.7 mV) was higher than that of RS4 (32.4 mV), with strengthened swelling power. X-ray diffraction results revealed that homogenization disrupted the crystalline structure of the RS4 nanoparticles, leading to an amorphous or low-crystallinity state. Stability analysis showed that the RS4 nanosuspensions (particle size) had good stability at 30 °C over 24 h.
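
    As an illustration of the three-factor, three-level Box-Behnken design mentioned above, the coded design matrix can be generated as follows. This is a minimal sketch: the coded levels (-1, 0, +1) and the choice of three centre points are the standard layout for a three-factor design, while the mapping of the codes to actual pressures, oil/water ratios and surfactant amounts is an assumption for illustration only.

        from itertools import combinations, product

        def box_behnken_3factor(n_center: int = 3):
            """Return the coded runs (-1, 0, +1) of a three-factor Box-Behnken design."""
            runs = []
            for i, j in combinations(range(3), 2):       # each pair of factors
                for a, b in product((-1, 1), repeat=2):  # at the four corner combinations
                    run = [0, 0, 0]                      # remaining factor at its centre level
                    run[i], run[j] = a, b
                    runs.append(tuple(run))
            runs.extend([(0, 0, 0)] * n_center)          # replicated centre points
            return runs

        # X1 = homogenization pressure, X2 = oil/water ratio, X3 = surfactant (coded levels)
        for run in box_behnken_3factor():
            print(run)   # 12 edge runs + 3 centre points = 15 runs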

  18. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    NASA Astrophysics Data System (ADS)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality-management tools at all stages of the technological process. It is established that unaccounted-for moisture in the aggregates adversely affects concrete mixture homogeneity and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed, in which the kneading-and-mixing machinery is operated under continuous automatic control of homogeneity. The theoretical underpinnings of the homogeneity control, related to changes in the frequency of vibrodynamic oscillations of the mixer body, are presented. The structure of the technical means of the automatic control system for regulating the water supply is determined by the change in concrete mixture homogeneity during continuous mixing of the components. The following technical means for implementing automatic control were chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To identify the quality indicator of automatic control, the system provides a structural flowchart with transfer functions that determine the ACS operation in the transient dynamic mode.

  19. Effective homogeneity of the exchange-correlation and non-interacting kinetic energy functionals under density scaling.

    PubMed

    Borgoo, Alex; Teale, Andrew M; Tozer, David J

    2012-01-21

    Correlated electron densities, experimental ionisation potentials, and experimental electron affinities are used to investigate the homogeneity of the exchange-correlation and non-interacting kinetic energy functionals of Kohn-Sham density functional theory under density scaling. Results are presented for atoms and small molecules, paying attention to the influence of the integer discontinuity and the choice of the electron affinity. For the exchange-correlation functional, effective homogeneities are highly system-dependent on either side of the integer discontinuity. By contrast, the average homogeneity-associated with the potential that averages over the discontinuity-is generally close to 4/3 when the discontinuity is computed using positive affinities for systems that do bind an excess electron and negative affinities for those that do not. The proximity to 4/3 becomes increasingly pronounced with increasing atomic number. Evaluating the discontinuity using a zero affinity in systems that do not bind an excess electron instead leads to effective homogeneities on the electron abundant side that are close to 4/3. For the non-interacting kinetic energy functional, the effective homogeneities are less system-dependent and the effect of the integer discontinuity is less pronounced. Average values are uniformly below 5/3. The study provides information that may aid the development of improved exchange-correlation and non-interacting kinetic energy functionals. © 2012 American Institute of Physics
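
    For orientation, the "effective homogeneity" studied above can be summarized by the standard definition sketched below in LaTeX (this is not necessarily the paper's working notation): a functional F[ρ] is homogeneous of degree k under density scaling if it picks up a factor λ^k, and an effective degree can be extracted from the corresponding Euler-type relation.

        F[\rho_{\lambda}] = \lambda^{k} F[\rho],
        \qquad
        \rho_{\lambda}(\mathbf{r}) = \lambda\,\rho(\mathbf{r}),
        \qquad\Longrightarrow\qquad
        k_{\mathrm{eff}}[\rho]
          = \frac{\displaystyle\int \rho(\mathbf{r})\,
                  \frac{\delta F}{\delta \rho(\mathbf{r})}\, d\mathbf{r}}{F[\rho]} ,

    so that values near 4/3 and 5/3 correspond to the density-scaling exponents of Dirac exchange and Thomas-Fermi kinetic energy, respectively.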

  20. High-temperature viscoelastic creep constitutive equations for polymer composites: Homogenization theory and experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skontorp, A.; Wang, S.S.; Shibuya, Y.

    1994-12-31

    In this paper, a homogenization theory is developed to determine high-temperature effective viscoelastic constitutive equations for fiber-reinforced polymer composites. The homogenization theory approximates the microstructure of a fiber composite and determines simultaneously the effective macroscopic constitutive properties of the composite and the associated microscopic strain and stress in the heterogeneous material. The time-temperature dependent homogenization theory requires that the viscoelastic constituent properties of the matrix phase at elevated temperatures, the governing equations for the composite, and the boundary conditions of the problem be Laplace transformed to a conjugate problem. The homogenized effective properties in the transformed domain are determined using a two-scale asymptotic expansion of field variables and an averaging procedure. Field solutions in the unit cell are determined from basic and first-order governing equations with the aid of a boundary integral method (BIM). Effective viscoelastic constitutive properties of the composite at elevated temperatures are determined by an inverse transformation, as are the microscopic stress and deformation in the composite. Using this method, interactions among fibers and between the fibers and the matrix can be evaluated explicitly, resulting in accurate solutions for composites with a high volume fraction of reinforcing fibers. Examples are given for the case of a carbon-fiber reinforced thermoplastic polyamide composite in an elevated temperature environment. The homogenization predictions are in good agreement with experimental data available for the composite.
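
    The Laplace-transform step described above follows the elastic-viscoelastic correspondence principle: in the transform domain the unit-cell problem is formally elastic with s-dependent moduli, so homogenized relaxation moduli are obtained by elastic two-scale homogenization followed by numerical inversion. The LaTeX below is a schematic sketch of that structure, not the paper's actual cell equations or BIM discretization; \hat{\chi}^{kl} denotes the transformed corrector (cell) functions.

        \hat{C}^{\,H}_{ijkl}(s)
          = \frac{1}{|Y|}\int_{Y} \hat{C}_{ijpq}(y,s)
            \left(\delta_{pk}\delta_{ql}
              + \frac{\partial \hat{\chi}^{\,kl}_{p}(y,s)}{\partial y_{q}}\right) dy,
        \qquad
        C^{\,H}_{ijkl}(t) = \mathcal{L}^{-1}\!\left[\hat{C}^{\,H}_{ijkl}(s)\right](t).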

  1. Evaluating a novel application of optical fibre evanescent field absorbance: rapid measurement of red colour in winegrape homogenates

    NASA Astrophysics Data System (ADS)

    Lye, Peter G.; Bradbury, Ronald; Lamb, David W.

    Silica optical fibres were used to measure colour (mg anthocyanin/g fresh berry weight) in samples of red wine grape homogenates via optical Fibre Evanescent Field Absorbance (FEFA). Colour measurements from 126 samples of grape homogenate were compared against the standard industry spectrophotometric reference method, which involves chemical extraction and subsequent optical absorption measurements of clarified samples at 520 nm. FEFA absorbance on homogenates at 520 nm (FEFA520h) was correlated with the industry reference method measurements of colour (R² = 0.46, n = 126). Using a simple regression equation, colour could be predicted with a standard error of cross-validation (SECV) of 0.21 mg/g, over a range of 0.6 to 2.2 mg anthocyanin/g and a standard deviation of 0.33 mg/g. With a Ratio of Performance Deviation (RPD) of 1.6, the technique, when utilizing only a single detection wavelength, is not robust enough to apply in a diagnostic sense; however, the results do demonstrate the potential of the FEFA method as a fast and low-cost assay of colour in homogenized samples.
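
    A quick, illustrative check of the performance figure quoted above: the RPD is the standard deviation of the reference values divided by the SECV.

        # Ratio of Performance Deviation (RPD) = SD of reference values / SECV
        sd_reference = 0.33   # mg anthocyanin/g (reported standard deviation)
        secv = 0.21           # mg/g (reported standard error of cross-validation)
        rpd = sd_reference / secv
        print(f"RPD = {rpd:.1f}")   # ~1.6, consistent with the value reported above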

  2. Assessing the use of food coloring as an appropriate visual guide for homogenously mixed capsule powders in extemporaneous compounding.

    PubMed

    Hoffmann, Brittany; Carlson, Christie; Rao, Deepa A

    2014-01-01

    The purpose of this work was to assess the use of food colors as a visual aid to determine homogeneous mixing in the extemporaneous preparation of capsules. Six different batches of progesterone slow-release 200-mg capsules were prepared by different mixing methods until visually determined as homogeneous based on yellow food coloring distribution in the preparation by the Central Iowa Compounding Pharmacy, Des Moines, Iowa. UV-Vis spectrophotometry was used to extract and evaluate the yellow food coloring content in each of these batches, and the results were compared to an in-house, small-batch geometric dilution preparation of progesterone slow-release 200-mg capsules. Of the 6 batches tested, only one, which followed the principles of additive dilution and an appropriate mixing time, was both visually and quantitatively homogeneous in the detection of yellow food coloring. The use of food coloring alone is not a valid quality-assurance tool in determining homogeneous mixing. Principles of geometric and/or additive dilution and appropriate mixing times along with the food color can serve as a quality-assurance tool.

  3. Effects of end-ring/shield configuration on homogeneity and signal-to-noise ratio in a birdcage-type coil loaded with a human head.

    PubMed

    Liu, Wanzhan; Collins, Christopher M; Delp, Pamela J; Smith, Michael B

    2004-01-01

    We modeled four different end-ring/shield configurations of a birdcage coil to examine their effects on field homogeneity and signal-to-noise ratio (SNR) at 64 MHz and 125 MHz. The configurations are defined as: 1) conventional: a conventional cylindrical shield; 2) surrounding shield: a shield with annular extensions to closely shield the end rings; 3) solid connection: a shield with annular extensions connected to the rungs; and 4) thin wire connection: a shield with thin wires connected to the rungs. At both frequencies, the coil with conventional end-ring/shield configuration produces the most homogeneous RF magnetic (B1) field when the coil is empty, but produces the least homogeneous B1 field when the coil is loaded with a human head. The surrounding shield configuration results in the most homogeneous B1 and highest SNR in the coil loaded with the human head at both frequencies, followed closely by the solid connection configuration. Copyright 2003 Wiley-Liss, Inc.

  4. How the Spectre of Societal Homogeneity Undermines Equitable Healthcare for Refugees

    PubMed Central

    Razum, Oliver; Wenner, Judith; Bozorgmehr, Kayvan

    2017-01-01

    Recourse to a purported ideal of societal homogeneity has become common in the context of the refugee reception crisis – not only in Japan, as Leppold et al report, but also throughout Europe. Calls for societal homogeneity in Europe originate from populist movements as well as from some governments. Often, they go along with reduced social support for refugees and asylum seekers, for example in healthcare provision. The fundamental right to health is then reduced to a citizens’ right, granted fully only to nationals. Germany, in spite of welcoming many refugees in 2015, is a case in point: entitlement and access to healthcare for asylum seekers are restricted during the first 15 months of their stay. We show that arguments brought forward to defend such restrictions do not hold, particularly not those which relate to maintaining societal homogeneity. European societies are not homogeneous, irrespective of migration. But as migration will continue, societies need to invest in what we call "globalization within." Removing entitlement restrictions and access barriers to healthcare for refugees and asylum seekers is one important element thereof. PMID:28812828

  5. Method for preparing hydrous iron oxide gels and spherules

    DOEpatents

    Collins, Jack L.; Lauf, Robert J.; Anderson, Kimberly K.

    2003-07-29

    The present invention is directed to methods for preparing hydrous iron oxide spherules, hydrous iron oxide gels such as gel slabs, films, capillary and electrophoresis gels, iron monohydrogen phosphate spherules, hydrous iron oxide spherules having suspendable particles homogeneously embedded within to form composite sorbents and catalysts, iron monohydrogen phosphate spherules having suspendable particles of at least one different sorbent homogeneously embedded within to form a composite sorbent, iron oxide spherules having suspendable particles homogeneously embedded within to form a composite of hydrous iron oxide fiber materials, iron oxide fiber materials, hydrous iron oxide fiber materials having suspendable particles homogeneously embedded within to form a composite, iron oxide fiber materials having suspendable particles homogeneously embedded within to form a composite, dielectric spherules of barium, strontium, and lead ferrites and mixtures thereof, and composite catalytic spherules of barium or strontium ferrite embedded with oxides of Mg, Zn, Pb, Ce and mixtures thereof. These variations of hydrous iron oxide spherules and gel forms prepared by the gel-sphere, internal gelation process offer more useful forms of inorganic ion exchangers, catalysts, getters, dielectrics, and ceramics.

  6. Selecting for extinction: nonrandom disease-associated extinction homogenizes amphibian biotas.

    PubMed

    Smith, Kevin G; Lips, Karen R; Chase, Jonathan M

    2009-10-01

    Studying the patterns in which local extinctions occur is critical to understanding how extinctions affect biodiversity at local, regional and global spatial scales. To understand the importance of patterns of extinction at a regional spatial scale, we use data from extirpations associated with a widespread pathogenic agent of amphibian decline, Batrachochytrium dendrobatidis (Bd) as a model system. We apply novel null model analyses to these data to determine whether recent extirpations associated with Bd have resulted in selective extinction and homogenization of diverse tropical American amphibian biotas. We find that Bd-associated extinctions in this region were nonrandom and disproportionately, but not exclusively, affected low-occupancy and endemic species, resulting in homogenization of the remnant amphibian fauna. The pattern of extirpations also resulted in phylogenetic homogenization at the family level and ecological homogenization of reproductive mode and habitat association. Additionally, many more species were extirpated from the region than would be expected if extirpations occurred randomly. Our results indicate that amphibian declines in this region are an extinction filter, reducing regional amphibian biodiversity to highly similar relict assemblages and ultimately causing amplified biodiversity loss at regional and global scales.

  7. Improved Homogeneity of the Transmit Field by Simultaneous Transmission with Phased Array and Volume Coil

    PubMed Central

    Avdievich, Nikolai I.; Oh, Suk-Hoon; Hetherington, Hoby P.; Collins, Christopher M.

    2010-01-01

    Purpose To improve the homogeneity of transmit volume coils at high magnetic fields (≥ 4 T). Due to RF field/ tissue interactions at high fields, 4–8 T, the transmit profile from head-sized volume coils shows a distinctive pattern with relatively strong RF magnetic field B1 in the center of the brain. Materials and Methods In contrast to conventional volume coils at high field strengths, surface coil phased arrays can provide increased RF field strength peripherally. In theory, simultaneous transmission from these two devices could produce a more homogeneous transmission field. To minimize interactions between the phased array and the volume coil, counter rotating current (CRC) surface coils consisting of two parallel rings carrying opposite currents were used for the phased array. Results Numerical simulations and experimental data demonstrate that substantial improvements in transmit field homogeneity can be obtained. Conclusion We have demonstrated the feasibility of using simultaneous transmission with human head-sized volume coils and CRC phased arrays to improve homogeneity of the transmit RF B1 field for high-field MRI systems. PMID:20677280

  8. Revisiting Shock Initiation Modeling of Homogeneous Explosives

    NASA Astrophysics Data System (ADS)

    Partom, Yehuda

    2013-04-01

    Shock initiation of homogeneous explosives has been a subject of research since the 1960s, with neat and sensitized nitromethane as the main materials for experiments. A shock initiation model of homogeneous explosives was established in the early 1960s. It involves a thermal explosion event at the shock entrance boundary, which develops into a superdetonation that overtakes the initial shock. In recent years, Sheffield and his group, using accurate experimental tools, were able to observe details of buildup of the superdetonation. There are many papers on modeling shock initiation of heterogeneous explosives, but there are only a few papers on modeling shock initiation of homogeneous explosives. In this article, bulk reaction reactive flow equations are used to model homogeneous shock initiation in an attempt to reproduce experimental data of Sheffield and his group. It was possible to reproduce the main features of the shock initiation process, including thermal explosion, superdetonation, input shock overtake, overdriven detonation after overtake, and the beginning of decay toward Chapman-Jouget (CJ) detonation. The time to overtake (TTO) as function of input pressure was also calculated and compared to the experimental TTO.
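
    The bulk-reaction reactive-flow description used above couples the flow equations to a reaction-progress variable; a generic single-step, temperature-driven Arrhenius form is sketched below in LaTeX as an illustration only, since the paper's specific rate law and parameters are not given in the abstract.

        \frac{d\lambda}{dt} = Z\,(1-\lambda)\,\exp\!\left(-\frac{E_{a}}{R\,T}\right),
        \qquad 0 \le \lambda \le 1 ,

    where λ is the reacted mass fraction: thermal explosion at the shock-entrance boundary corresponds to runaway of this rate, and the superdetonation is its subsequent propagation into pre-shocked, preheated material.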

  9. A novel transcription initiation factor (TIF), TIF-IE, is required for homogeneous Acanthamoeba castellanii TIF-IB (SL1) to form a committed complex.

    PubMed

    Radebaugh, C A; Kubaska, W M; Hoffman, L H; Stiffler, K; Paule, M R

    1998-10-16

    The fundamental transcription initiation factor (TIF) for ribosomal RNA expression by eukaryotic RNA polymerase I, TIF-IB, has been purified to near homogeneity from Acanthamoeba castellanii using standard techniques. The purified factor consists of the TATA-binding protein and four TATA-binding protein-associated factors with relative molecular weights of 145,000, 99,000, 96,000, and 91,000. This yields a calculated native molecular weight of 460,000, which compares well with its mass determined by scanning transmission electron microscopy (493,000) and its sedimentation rate, which is close to RNA polymerase I (515,000). Both impure and nearly homogeneous TIF-IB exhibit an apparent equilibrium dissociation constant of 56 +/- 3 pM. However, although impure TIF-IB can form a promoter-DNA complex resistant to challenge by other promoter-containing DNAs, near homogeneous TIF-IB cannot do so. An additional transcription factor, dubbed TIF-IE, restores the ability of near homogeneous TIF-IB to sequester DNA into a committed complex.

  10. Matrix algorithms for solving (in)homogeneous bound state equations

    PubMed Central

    Blank, M.; Krassnigg, A.

    2011-01-01

    In the functional approach to quantum chromodynamics, the properties of hadronic bound states are accessible via covariant integral equations, e.g. the Bethe–Salpeter equation for mesons. In particular, one has to deal with linear, homogeneous integral equations which, in sophisticated model setups, use numerical representations of the solutions of other integral equations as part of their input. Analogously, inhomogeneous equations can be constructed to obtain off-shell information in addition to bound-state masses and other properties obtained from the covariant analogue to a wave function of the bound state. These can be solved very efficiently using well-known matrix algorithms for eigenvalues (in the homogeneous case) and the solution of linear systems (in the inhomogeneous case). We demonstrate this by solving the homogeneous and inhomogeneous Bethe–Salpeter equations and find, e.g. that for the calculation of the mass spectrum it is as efficient or even advantageous to use the inhomogeneous equation as compared to the homogeneous. This is valuable insight, in particular for the study of baryons in a three-quark setup and more involved systems. PMID:21760640
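
    A minimal linear-algebra sketch of the two strategies described above, with a random matrix standing in for the discretized Bethe-Salpeter kernel (purely illustrative; the physical kernel depends on the bound-state mass and on model input): the homogeneous equation becomes an eigenvalue problem, while the inhomogeneous equation is a single linear solve.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 50
        K = rng.normal(size=(n, n)) / np.sqrt(n)   # stand-in for the discretized kernel K(M)

        # Homogeneous equation  psi = K psi : look for a kernel eigenvalue equal to 1
        # (in practice one scans the bound-state mass M until an eigenvalue crosses 1).
        eigenvalues, eigenvectors = np.linalg.eig(K)
        i = np.argmin(np.abs(eigenvalues - 1.0))
        print("eigenvalue closest to 1:", eigenvalues[i])

        # Inhomogeneous equation  psi = psi0 + K psi  ->  (I - K) psi = psi0 :
        # a single linear solve gives off-shell information at any M.
        psi0 = rng.normal(size=n)
        psi = np.linalg.solve(np.eye(n) - K, psi0)
        print("residual:", np.linalg.norm(psi - psi0 - K @ psi))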

  11. Production of starch nanoparticles using normal maize starch via heat-moisture treatment under mildly acidic conditions and homogenization.

    PubMed

    Park, Eun Young; Kim, Min-Jung; Cho, MyoungLae; Lee, Ju Hun; Kim, Jong-Yea

    2016-10-20

    Normal maize starch was subjected to heat-moisture treatment (HMT) under mildly acidic conditions (0.000, 0.050, or 0.075M H2SO4) for various treatment times (3, 5, or 8h) followed by homogenization up to 60min to prepare nanoparticles. The combination of HMT (0.075M, for 8h) and homogenization (60min) produced nanoparticles with diameters of less than 50nm at a yield higher than 80%. X-ray diffractometry and size-exclusion chromatography revealed that HMT under mildly acidic conditions selectively hydrolyzed the starch chains (especially amylose and/or long chains of amylopectin) in the amorphous region of the granules without significant damage to the crystalline structure, however, modification of the molecular structure in the amorphous region increased fragility of the granules during homogenization. Homogenization for 60min caused obvious damage in the long-range crystalline structure of the HMT starch (0.15N, for 8h), while the short-range chain associations (FT-IR) remained intact. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Effects of ultrasonication and conventional mechanical homogenization processes on the structures and dielectric properties of BaTiO3 ceramics.

    PubMed

    Akbas, Hatice Zehra; Aydin, Zeki; Yilmaz, Onur; Turgut, Selvin

    2017-01-01

    The effects of the homogenization process on the structures and dielectric properties of pure and Nb-doped BaTiO3 ceramics have been investigated using ultrasonic homogenization and conventional mechanical methods. The reagents were homogenized using an ultrasonic processor with high-intensity ultrasonic waves and using a compact mixer-shaker. The components and crystal types of the powders were determined by Fourier-transform infrared spectroscopy (FTIR) and X-ray diffraction (XRD) analyses. The complex permittivity (ε′, ε″) and AC conductivity (σ′) of the samples were analyzed in a wide frequency range of 20 Hz to 2 MHz at room temperature. The structures and dielectric properties of pure and Nb-doped BaTiO3 ceramics strongly depend on the homogenization process in a solid-state reaction method. Using an ultrasonic processor with high-intensity ultrasonic waves based on acoustic cavitation phenomena can make a significant improvement in producing high-purity BaTiO3 ceramics without carbonate impurities and with a small dielectric loss. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Slowly digestible properties of lotus seed starch-glycerine monostearin complexes formed by high pressure homogenization.

    PubMed

    Chen, Bingyan; Jia, Xiangze; Miao, Song; Zeng, Shaoxiao; Guo, Zebin; Zhang, Yi; Zheng, Baodong

    2018-06-30

    Starch-lipid complexes were prepared using lotus seed starch (LS) and glycerin monostearate (GMS) via a high-pressure homogenization process, and the effect of high-pressure homogenization (HPH) on the slow-digestion properties of LS-GMS was investigated. The digestion profiles showed that HPH treatment reduced the digestion rate of LS-GMS, and the extent of this change depended on the homogenization pressure. Scanning electron microscopy showed that HPH treatment changed the morphology of LS-GMS, with higher pressure producing a more compact block-shaped structure that resists enzyme digestion. The results of gel-permeation chromatography and small-angle X-ray scattering revealed that high homogenization pressure affected the molecular weight distribution and the semi-crystalline region of the complexes, resulting in the formation of a new semi-crystalline structure with a repeat-unit distance of 16-18 nm and a molecular weight distribution of 2.50-2.80 × 10⁵ Da, which displayed strong enzymatic resistance. Differential scanning calorimetry results revealed that the new semi-crystalline lamellae may originate from type-II complexes, which exhibit a high transition temperature. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Effect of homogenous-heterogeneous reactions on MHD Prandtl fluid flow over a stretching sheet

    NASA Astrophysics Data System (ADS)

    Khan, Imad; Malik, M. Y.; Hussain, Arif; Salahuddin, T.

    An analysis is performed to explore the effects of homogeneous-heterogeneous reactions on the two-dimensional flow of a Prandtl fluid over a stretching sheet. In the present analysis, we use the developed model of homogeneous-heterogeneous reactions in boundary layer flow. The mathematical formulation of the flow phenomenon yields nonlinear partial differential equations. Using scaling transformations, the governing partial differential equations (the momentum equation and the homogeneous-heterogeneous reaction equations) are transformed into non-linear ordinary differential equations (ODEs). The resulting non-linear ODEs are then solved by a computational scheme known as the shooting method. The quantitative and qualitative behavior of the relevant physical quantities (velocity, concentration and drag force coefficient) is examined under the prescribed physical constraints through figures and tables. It is observed that the velocity profile is enhanced with the fluid parameters α and β, while the Hartmann number reduces it. The homogeneous and heterogeneous reaction parameters have opposite effects on the concentration profile. The concentration profile shows retarding behavior for large values of the Schmidt number. The skin friction coefficient increases with increments in the Hartmann number H and the fluid parameter α.
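
    The shooting procedure itself is not reproduced in the abstract, so the following is a minimal sketch of the general technique on a classical boundary-layer problem, the Blasius equation f''' + 0.5 f f'' = 0 with f(0) = f'(0) = 0 and f'(∞) = 1; the choice of equation and the use of SciPy's solve_ivp and brentq are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the shooting method for a boundary-layer ODE (Blasius
# equation used as a stand-in): guess the unknown wall curvature f''(0),
# integrate as an initial-value problem, and iterate until the far-field
# boundary condition f'(inf) = 1 is satisfied.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

ETA_INF = 10.0  # finite proxy for "infinity" in the similarity variable

def rhs(eta, y):
    # State vector y = [f, f', f'']
    f, fp, fpp = y
    return [fp, fpp, -0.5 * f * fpp]

def residual(guess):
    # Integrate with a guessed f''(0) and return the far-field mismatch.
    sol = solve_ivp(rhs, (0.0, ETA_INF), [0.0, 0.0, guess], rtol=1e-8, atol=1e-10)
    return sol.y[1, -1] - 1.0

fpp0 = brentq(residual, 0.1, 1.0)   # solve for the shooting parameter
print(f"f''(0) ≈ {fpp0:.5f}")       # classical Blasius value ≈ 0.33206
```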

  15. Enzymatic production of N-acetyl-d-glucosamine from crayfish shell wastes pretreated via high pressure homogenization.

    PubMed

    Wei, Guoguang; Zhang, Alei; Chen, Kequan; Ouyang, Pingkai

    2017-09-01

    This study presents an efficient pretreatment of crayfish shell using high pressure homogenization that enables N-acetyl-d-glucosamine (GlcNAc) production by chitinase. First, the chitinase from Serratia proteamaculans NJ303 was screened for its ability to degrade crayfish shell and produce GlcNAc as the sole product. Second, high pressure homogenization, which caused the crayfish shell to adopt a fluffy netted structure as characterized by scanning electron microscopy (SEM), Fourier-transform infrared spectroscopy (FT-IR) and X-ray diffraction (XRD), was evaluated as the best pretreatment method. In addition, the optimal conditions for high pressure homogenization of crayfish shell were determined to be five cycles at a pressure of 400 bar, which achieved a yield of 3.9 g/L of GlcNAc from 25 g/L of crayfish shell in a batch enzymatic reaction over 1.5 h. The results showed that high pressure homogenization may be an efficient method for the direct utilization of crayfish shell for enzymatic production of GlcNAc. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Experimental investigation of homogeneous charge compression ignition combustion of biodiesel fuel with external mixture formation in a CI engine.

    PubMed

    Ganesh, D; Nagarajan, G; Ganesan, S

    2014-01-01

    In parallel to the interest in renewable fuels, there has also been increased interest in homogeneous charge compression ignition (HCCI) combustion. HCCI engines are being actively developed because they have the potential to be highly efficient and to produce low emissions. Even though HCCI has been researched extensively, a few challenges still exist. These include controlling the combustion at higher loads and the formation of a homogeneous mixture. To obtain better homogeneity, in the present investigation an external mixture formation method was adopted, in which a fuel vaporiser was used to achieve excellent HCCI combustion in a single-cylinder air-cooled direct injection diesel engine. In continuation of our previous work, in the current study vaporised jatropha methyl ester (JME) was mixed with air to form a homogeneous mixture and inducted into the cylinder during the intake stroke to analyze the combustion, emission and performance characteristics. To control the early ignition of the JME vapour-air mixture, a cooled (30 °C) exhaust gas recirculation (EGR) technique was adopted. The experimental results show an 81% reduction in NOx and a 72% reduction in smoke emissions.

  17. Heterogeneity in homogeneous nucleation from billion-atom molecular dynamics simulation of solidification of pure metal.

    PubMed

    Shibuta, Yasushi; Sakane, Shinji; Miyoshi, Eisuke; Okita, Shin; Takaki, Tomohiro; Ohno, Munekazu

    2017-04-05

    Can completely homogeneous nucleation occur? Large-scale molecular dynamics simulations performed on a graphics-processing-unit-rich supercomputer can shed light on this long-standing issue. Here, a billion-atom molecular dynamics simulation of homogeneous nucleation from an undercooled iron melt reveals that some satellite-like small grains surrounding previously formed large grains exist in the middle of the nucleation process, and that these are not distributed uniformly. At the same time, grains with a twin boundary are formed by heterogeneous nucleation from the surface of the previously formed grains. The local heterogeneity in the distribution of grains is caused by the local accumulation of the icosahedral structure in the undercooled melt near the previously formed grains. This insight is mainly attributable to multi-graphics-processing-unit parallel computation combined with the rapid progress in high-performance computational environments. Nucleation is a fundamental physical process; however, it is a long-standing issue whether completely homogeneous nucleation can occur. Here the authors reveal, via a billion-atom molecular dynamics simulation, that local heterogeneity exists during homogeneous nucleation in an undercooled iron melt.

  18. Ice nucleation triggered by negative pressure.

    PubMed

    Marcolli, Claudia

    2017-11-30

    Homogeneous ice nucleation needs supercooling of more than 35 K to become effective. When pressure is applied to water, the melting and the freezing points both decrease. Conversely, melting and freezing temperatures increase under negative pressure, i.e. when water is stretched. This study presents an extrapolation of homogeneous ice nucleation temperatures from positive to negative pressures as a basis for further exploration of ice nucleation under negative pressure. It predicts that increasing negative pressure at temperatures below about 262 K eventually results in homogeneous ice nucleation, while at warmer temperatures homogeneous cavitation, i.e. bubble nucleation, dominates. Negative pressure occurs locally and briefly when water is stretched due to mechanical shock, sonic waves, or fragmentation. The occurrence of such transient negative pressure should suffice to trigger homogeneous ice nucleation at large supercooling in the absence of ice-nucleating surfaces. In addition, negative pressure can act together with ice-inducing surfaces to enhance their intrinsic ice nucleation efficiency. Dynamic ice nucleation can be used to improve the properties and uniformity of frozen products by applying ultrasonic fields and might also be relevant for the freezing of large drops in rainclouds.

  19. Homogenous charge compression ignition engine having a cylinder including a high compression space

    DOEpatents

    Agama, Jorge R.; Fiveland, Scott B.; Maloney, Ronald P.; Faletti, James J.; Clarke, John M.

    2003-12-30

    The present invention relates generally to the field of homogeneous charge compression engines. In these engines, fuel is injected upstream or directly into the cylinder when the power piston is relatively close to its bottom dead center position. The fuel mixes with air in the cylinder as the power piston advances to create a relatively lean homogeneous mixture that preferably ignites when the power piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage, can result. Thus, the present invention divides the homogeneous charge between a controlled volume higher compression space and a lower compression space to better control the start of ignition.

  20. Effect of Thermal Gradient on Vibration of Non-uniform Visco-elastic Rectangular Plate

    NASA Astrophysics Data System (ADS)

    Khanna, Anupam; Kaur, Narinder

    2016-04-01

    Here, a theoretical model is presented to analyze the effect of bilinear temperature variations on the vibration of a non-homogeneous visco-elastic rectangular plate with non-uniform thickness. Non-uniformity in the thickness of the plate is assumed to be linear in one direction. Since the plate's material is considered non-homogeneous, the authors characterize the non-homogeneity in Poisson's ratio and density of the plate's material exponentially in the x-direction. The plate is assumed to be clamped at the ends. The deflection for the first two modes of vibration is calculated using the Rayleigh-Ritz technique and tabulated for various values of the plate parameters, i.e., taper constant, aspect ratio, non-homogeneity constants and thermal gradient. A comparison of the present findings with the existing literature is also provided in tabular and graphical form.

  1. First-order reactant in homogeneous turbulence before the final period of decay. [contaminant fluctuations in chemical reaction]

    NASA Technical Reports Server (NTRS)

    Kumar, P.; Patel, S. R.

    1974-01-01

    A method is described for studying theoretically the concentration fluctuations of a dilute contaminant undergoing a first-order chemical reaction. The method is based on Deissler's (1958) theory of homogeneous turbulence for times before the final period, and it follows the approach used by Loeffler and Deissler (1961) to study temperature fluctuations in homogeneous turbulence. Four-point correlation equations are obtained; it is assumed that terms containing fifth-order correlations are very small in comparison with those containing fourth-order correlations and can therefore be neglected. A spectrum equation is obtained in a form which can be solved numerically, yielding the decay law for the concentration fluctuations in homogeneous turbulence for the period much before the final period of decay.

  2. Homogenization of CZ Si wafers by Tabula Rasa annealing

    NASA Astrophysics Data System (ADS)

    Meduňa, M.; Caha, O.; Kuběna, J.; Kuběna, A.; Buršík, J.

    2009-12-01

    The precipitation of interstitial oxygen in Czochralski-grown silicon has been investigated by infrared absorption spectroscopy, chemical etching, transmission electron microscopy and X-ray diffraction after application of a homogenization annealing process called Tabula Rasa. The influence of this homogenization step, consisting of short-time annealing at high temperature, has been observed for various temperatures and times. The experimental results involving the interstitial oxygen decay in Si wafers and the absorption spectra of SiOx precipitates during precipitation annealing at 1000 °C were compared with other techniques for various Tabula Rasa temperatures. The differences in oxygen precipitation, precipitate morphology and evolution of point defects in samples with and without Tabula Rasa applied are evident from all of the experimental techniques used. The results qualitatively correlate with the predictions of the homogenization annealing process based on classical nucleation theory.

  3. Soluble Molecularly Imprinted Nanorods for Homogeneous Molecular Recognition

    NASA Astrophysics Data System (ADS)

    Liang, Rongning; Wang, Tiantian; Zhang, Huan; Yao, Ruiqing; Qin, Wei

    2018-03-01

    Nowadays, it is still difficult for molecularly imprinted polymers (MIPs) to achieve homogeneous recognition since they cannot be easily dissolved in an organic or aqueous phase. To address this issue, soluble molecularly imprinted nanorods have been synthesized by using soluble polyaniline doped with a functionalized organic protonic acid as the polymer matrix. By employing 1-naphthoic acid as a model, the proposed imprinted nanorods exhibit excellent solubility and good homogeneous recognition ability. The imprinting factor for the soluble imprinted nanorods is 6.8. The equilibrium dissociation constant and the apparent maximum binding capacity of the proposed imprinted nanorods are 248.5 μM and 22.1 μmol/g, respectively. We believe that such imprinted nanorods may provide an appealing substitute for natural receptors in homogeneous-recognition-related fields.

  4. Homogeneity study of fixed-point continuous marine environmental and meteorological data: a review

    NASA Astrophysics Data System (ADS)

    Yang, Jinkun; Yang, Yang; Miao, Qingsheng; Dong, Mingmei; Wan, Fangfang

    2018-02-01

    The principle of inhomogeneity and the classification of homogeneity test methods are briefly described, and several common homogeneity test methods and their relative merits are discussed in detail. Based on applications of the different homogeneity methods to ground meteorological data and marine environmental data, the present status and recent progress are then reviewed. At present, homogeneity research on radiosonde and ground meteorological data is mature at home and abroad, and research and application to marine environmental data should also be given full attention. Carrying out a variety of test and correction methods, combined with the use of a multi-mode test system, will make the results more reasonable and scientific, and can also provide accurate first-hand information for coastal climate change research.
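
    The review abstract does not single out a particular test, so as an illustration of one widely used absolute test, the sketch below implements the standard normal homogeneity test (SNHT) statistic for a single candidate breakpoint in an annual series; the synthetic data and the quoted critical-value range are assumptions for demonstration only.

```python
# Minimal sketch of the standard normal homogeneity test (SNHT) statistic for
# a single shift-type breakpoint in a climate-style annual series.
import numpy as np

def snht(x):
    """Return (max T_k, most likely breakpoint index) for series x."""
    x = np.asarray(x, dtype=float)
    n = x.size
    z = (x - x.mean()) / x.std(ddof=1)       # standardized series
    t = np.empty(n - 1)
    for k in range(1, n):                    # candidate break after position k
        t[k - 1] = k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
    kmax = int(np.argmax(t))
    return t[kmax], kmax + 1

# Synthetic example: a +1.0 mean shift introduced halfway through the record.
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(1.0, 1.0, 50)])
t_max, k_break = snht(series)
print(f"T_max = {t_max:.1f} at index {k_break}")
# T_max is then compared against tabulated critical values (roughly 8-10 for
# n ~ 100 at the 5% level); exceeding them flags the series as inhomogeneous.
```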

  5. Two-Dimensional Homogeneous Fermi Gases

    NASA Astrophysics Data System (ADS)

    Hueck, Klaus; Luick, Niclas; Sobirey, Lennart; Siegl, Jonas; Lompe, Thomas; Moritz, Henning

    2018-02-01

    We report on the experimental realization of homogeneous two-dimensional (2D) Fermi gases trapped in a box potential. In contrast to harmonically trapped gases, these homogeneous 2D systems are ideally suited to probe local as well as nonlocal properties of strongly interacting many-body systems. As a first benchmark experiment, we use a local probe to measure the density of a noninteracting 2D Fermi gas as a function of the chemical potential and find excellent agreement with the corresponding equation of state. We then perform matter wave focusing to extract the momentum distribution of the system and directly observe Pauli blocking in a near unity occupation of momentum states. Finally, we measure the momentum distribution of an interacting homogeneous 2D gas in the crossover between attractively interacting fermions and bosonic dimers.
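
    For context, the noninteracting benchmark referred to above is the textbook density equation of state of an ideal 2D Fermi gas, quoted here per spin component as general background rather than as a result of the cited experiment:

```latex
% Ideal (noninteracting) 2D Fermi gas density per spin component; textbook
% result given for context, with the T -> 0 limit shown on the right.
\[
  n(\mu, T) \;=\; \frac{m\,k_{\mathrm B}T}{2\pi\hbar^{2}}
  \,\ln\!\left(1 + e^{\mu/k_{\mathrm B}T}\right),
  \qquad
  n \;\xrightarrow[\;T\to 0\;]{}\; \frac{m\,\mu}{2\pi\hbar^{2}}
  \;=\; \frac{k_{\mathrm F}^{2}}{4\pi}.
\]
```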

  6. Improved model for detection of homogeneous production batches of electronic components

    NASA Astrophysics Data System (ADS)

    Kazakovtsev, L. A.; Orlov, V. I.; Stashkov, D. V.; Antamoshkin, A. N.; Masich, I. S.

    2017-10-01

    Supplying the electronic units of complex technical systems with electronic devices of the proper quality is one of the most important problems for increasing whole-system reliability. Moreover, to reach the highest reliability of an electronic unit, the electronic devices of the same type must have equal characteristics which assure their coherent operation. The highest homogeneity of the characteristics is reached if the electronic devices are manufactured as a single production batch. Moreover, each production batch must contain homogeneous raw materials. In this paper, we propose an improved model for detecting the homogeneous production batches within a shipped lot of electronic components, based on applying the kurtosis criterion to the results of non-destructive testing performed for each lot of electronic devices used in the space industry.
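
    The paper's exact decision rule is not given in the abstract; as a rough illustration of why kurtosis is informative here, the sketch below flags a lot whose test-parameter distribution is strongly platykurtic (negative excess kurtosis), which is characteristic of a mixture of two well-separated batches. The threshold and the synthetic data are assumptions, not the authors' criterion.

```python
# Minimal sketch: excess kurtosis of a non-destructive test parameter as a
# flag for a shipped lot containing more than one production batch.
import numpy as np
from scipy.stats import kurtosis

def looks_like_mixture(values, threshold=-1.0):
    """A mixture of two well-separated batches tends to be platykurtic
    (strongly negative excess kurtosis), so flag lots below the threshold."""
    return kurtosis(values, fisher=True, bias=False) < threshold

rng = np.random.default_rng(1)
single_batch = rng.normal(100.0, 1.0, 500)                   # one homogeneous batch
mixed_lot = np.concatenate([rng.normal(98.0, 1.0, 250),
                            rng.normal(102.0, 1.0, 250)])    # two batches in one lot
print(looks_like_mixture(single_batch))   # expected: False
print(looks_like_mixture(mixed_lot))      # expected: True
```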

  7. Nonstationary homogeneous nucleation

    NASA Technical Reports Server (NTRS)

    Harstad, K. G.

    1974-01-01

    The theory of homogeneous condensation is reviewed and equations describing this process are presented. Numerical computer solutions to transient problems in nucleation (relaxation to steady state) are presented and compared to a prior computation.

  8. Development of a homogeneous immunoassay system using protein A fusion fragmented Renilla luciferase.

    PubMed

    Mie, Masayasu; Thuy, Ngo Phan Bich; Kobatake, Eiry

    2012-03-07

    A homogeneous immunoassay system was developed using fragmented Renilla luciferase (Rluc). The B domain of protein A was fused to two Rluc fragments. When complexes between an antibody and fragmented Rluc fusion proteins bind to target molecules, the Rluc fragments come into close proximity and the luminescence activity of fragmented Rluc is restored by complementation. As proof-of-principle, this fragmented Rluc system was used to detect E. coli homogeneously using an anti-E. coli antibody.

  9. Method of chaotic mixing and improved stirred tank reactors

    DOEpatents

    Muzzio, Fernando J.; Lamberto, David J.

    1999-01-01

    The invention provides a method and apparatus for efficiently achieving a homogeneous mixture of fluid components by introducing said components, having a Reynolds number of between about 1 or less and about 500, into a vessel and continuously perturbing the mixing flow by altering the flow speed and mixing time until homogeneity is reached. This method prevents the components from aggregating into non-homogeneous segregated regions within said vessel during mixing and substantially reduces the time for the admixed components to reach homogeneity.

  10. Preparation of pigments for space-stable thermal control coatings

    NASA Technical Reports Server (NTRS)

    Campbell, W. B.; Smith, R. G.

    1972-01-01

    The identification and control of vapor phase reaction kinetics to produce pigments by homogeneous nucleation were achieved. A vapor phase apparatus was designed, fabricated, and calibrated through 1800 C. Vapor phase reactions were analyzed, calculations made, and powders of alumina, rutile, zinc orthotitanate (in a mixed phase), calcium tungstate, and lanthana were produced by homogeneous nucleation. Electron microscopy shows uniform particle morphology and size, and supports anticipated advantages of vapor-phase homogeneous nucleation; namely, purity, freedom from defects, and uniform particle sizing without grinding.

  11. [Methods for enzymatic determination of triglycerides in liver homogenates].

    PubMed

    Höhn, H; Gartzke, J; Burck, D

    1987-10-01

    An enzymatic method is described for the determination of triacylglycerols in liver homogenate. In contrast to usual methods, higher reliability and selectivity are achieved by omitting the extraction step.

  12. Scar Homogenization Versus Limited-Substrate Ablation in Patients With Nonischemic Cardiomyopathy and Ventricular Tachycardia.

    PubMed

    Gökoğlan, Yalçın; Mohanty, Sanghamitra; Gianni, Carola; Santangeli, Pasquale; Trivedi, Chintan; Güneş, Mahmut F; Bai, Rong; Al-Ahmad, Amin; Gallinghouse, G Joseph; Horton, Rodney; Hranitzky, Patrick M; Sanchez, Javier E; Beheiry, Salwa; Hongo, Richard; Lakkireddy, Dhanunjaya; Reddy, Madhu; Schweikert, Robert A; Dello Russo, Antonio; Casella, Michela; Tondo, Claudio; Burkhardt, J David; Themistoclakis, Sakis; Di Biase, Luigi; Natale, Andrea

    2016-11-01

    Scar homogenization improves long-term ventricular arrhythmia-free survival compared with standard limited-substrate ablation in patients with post-infarction ventricular tachycardia (VT). Whether such benefit extends to patients with nonischemic cardiomyopathy and scar-related VT is unclear. The aim of this study was to assess the long-term efficacy of an endoepicardial scar homogenization approach compared with standard ablation in this population. Consecutive patients with dilated nonischemic cardiomyopathy (n = 93), scar-related VTs, and evidence of low-voltage regions on the basis of pre-defined criteria on electroanatomic mapping (i.e., bipolar voltage <1.5 mV) underwent either standard VT ablation (group 1 [n = 57]) or endoepicardial ablation of all abnormal potentials within the electroanatomic scar (group 2 [n = 36]). Acute procedural success was defined as noninducibility of any VT at the end of the procedure; long-term success was defined as freedom from any ventricular arrhythmia at follow-up. Acute procedural success rates were 69.4% and 42.1% after scar homogenization and standard ablation, respectively (p = 0.01). During a mean follow-up period of 14 ± 2 months, single-procedure success rates were 63.9% after scar homogenization and 38.6% after standard ablation (p = 0.031). After multivariate analysis, scar homogenization and left ventricular ejection fraction were predictors of long-term success. During follow-up, the rehospitalization rate was significantly lower in the scar homogenization group (p = 0.035). In patients with dilated nonischemic cardiomyopathy, scar-related VT, and evidence of low-voltage regions on electroanatomic mapping, endoepicardial homogenization of the scar significantly increased freedom from any recurrent ventricular arrhythmia compared with a standard limited-substrate ablation. However, the success rate with this approach appeared to be lower than previously reported with ischemic cardiomyopathy, presumably because of the septal and midmyocardial distribution of the scar in some patients. Copyright © 2016 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  13. A physically constrained classical description of the homogeneous nucleation of ice in water.

    PubMed

    Koop, Thomas; Murray, Benjamin J

    2016-12-07

    Liquid water can persist in a supercooled state to below 238 K in the Earth's atmosphere, a temperature range where homogeneous nucleation becomes increasingly probable. However, the rate of homogeneous ice nucleation in supercooled water is poorly constrained, in part, because supercooled water eludes experimental scrutiny in the region of the homogeneous nucleation regime where it can exist only fleetingly. Here we present a new parameterization of the rate of homogeneous ice nucleation based on classical nucleation theory. In our approach, we constrain the key terms in classical theory, i.e., the diffusion activation energy and the ice-liquid interfacial energy, with physically consistent parameterizations of the pertinent quantities. The diffusion activation energy is related to the translational self-diffusion coefficient of water for which we assess a range of descriptions and conclude that the most physically consistent fit is provided by a power law. The other key term is the interfacial energy between the ice embryo and supercooled water whose temperature dependence we constrain using the Turnbull correlation, which relates the interfacial energy to the difference in enthalpy between the solid and liquid phases. The only adjustable parameter in our model is the absolute value of the interfacial energy at one reference temperature. That value is determined by fitting this classical model to a selection of laboratory homogeneous ice nucleation data sets between 233.6 K and 238.5 K. On extrapolation to temperatures below 233 K, into a range not accessible to standard techniques, we predict that the homogeneous nucleation rate peaks between about 227 and 231 K at a maximum nucleation rate many orders of magnitude lower than previous parameterizations suggest. This extrapolation to temperatures below 233 K is consistent with the most recent measurement of the ice nucleation rate in micrometer-sized droplets at temperatures of 227-232 K on very short time scales using an X-ray laser technique. In summary, we present a new physically constrained parameterization for homogeneous ice nucleation which is consistent with the latest literature nucleation data and our physical understanding of the properties of supercooled water.
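
    As background for the parameterization described above, the classical-nucleation-theory expression being constrained has the schematic form below (standard CNT notation; the specific temperature-dependent fits for the diffusion activation energy and the interfacial energy are given in the paper itself):

```latex
% Schematic classical nucleation theory (CNT) rate for homogeneous ice
% nucleation; the paper constrains \Delta F_{\mathrm{diff}}(T) and
% \sigma_{\mathrm{iw}}(T) with physically based parameterizations.
\[
  J_{\mathrm{hom}}(T) \;=\; A(T)\,
  \exp\!\left(-\frac{\Delta F_{\mathrm{diff}}(T)}{k_{\mathrm B}T}\right)
  \exp\!\left(-\frac{\Delta G^{*}(T)}{k_{\mathrm B}T}\right),
  \qquad
  \Delta G^{*} \;=\; \frac{16\pi\,\sigma_{\mathrm{iw}}^{3}}{3\,\Delta g_{v}^{2}},
\]
% where \Delta F_{\mathrm{diff}} is the diffusion activation energy,
% \sigma_{\mathrm{iw}} the ice-liquid interfacial energy, and \Delta g_{v}
% the Gibbs energy difference per unit volume between liquid and ice.
```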

  14. Supported Dendrimer-Encapsulated Metal Clusters: Toward Heterogenizing Homogeneous Catalysts

    DOE PAGES

    Ye, Rong; Zhukhovitskiy, Aleksandr V.; Deraedt, Christophe V.; ...

    2017-07-13

    Recyclable catalysts, especially those that display selective reactivity, are vital for the development of sustainable chemical processes. Among available catalyst platforms, heterogeneous catalysts are particularly well-disposed toward separation from the reaction mixture via filtration methods, which renders them readily recyclable. Furthermore, heterogeneous catalysts offer numerous handles—some without homogeneous analogues—for performance and selectivity optimization. These handles include nanoparticle size, pore profile of porous supports, surface ligands and interface with oxide supports, and flow rate through a solid catalyst bed. Despite these available handles, however, conventional heterogeneous catalysts are themselves often structurally heterogeneous compared to homogeneous catalysts, which complicates efforts to optimize and expand the scope of their reactivity and selectivity. Ongoing efforts in our laboratories are aimed to address the above challenge by heterogenizing homogeneous catalysts, which can be defined as the modification of homogeneous catalysts to render them in a separable (solid) phase from the starting materials and products. Specifically, we grow the small nanoclusters in dendrimers, a class of uniform polymers with the connectivity of fractal trees and generally radial symmetry. Thanks to their dense multivalency, shape persistence, and structural uniformity, dendrimers have proven to be versatile scaffolds for the synthesis and stabilization of small nanoclusters. These dendrimer-encapsulated metal clusters (DEMCs) are then adsorbed onto mesoporous silica. Through this method, we have achieved selective transformations that had been challenging to accomplish in a heterogeneous setting, e.g., π-bond activation and aldol reactions. Extensive investigation into the catalytic systems under reaction conditions allowed us to correlate the structural features (e.g., oxidation states) of the catalysts with their activity. Moreover, we have demonstrated that supported DEMCs are also excellent catalysts for typical heterogeneous reactions, including hydrogenation and alkane isomerization. Critically, these investigations also confirmed that the supported DEMCs are heterogeneous and stable against leaching. Catalyst optimization is achieved through the modulation of various parameters. The clusters are oxidized (e.g., with PhICl2) or reduced (e.g., with H2) in situ. Changing the dendrimer properties (e.g., generation, terminal functional groups) is analogous to ligand modification in homogeneous catalysts, which affects both catalytic activity and selectivity. Similarly, the pore size of the support is another factor in determining product distribution. In a flow reactor, the flow rate is adjusted to control the residence time of the starting material and intermediates, and thus the final product selectivity. Our approach to heterogeneous catalysis affords various advantages: (1) the catalyst system can tap into reactivity typical of homogeneous catalysts, which conventional heterogeneous catalysts could not achieve; (2) unlike most homogeneous catalysts with comparable performance, the heterogenized homogeneous catalysts can be recycled; (3) improved activity or selectivity compared to conventional homogeneous catalysts is possible because of uniquely heterogeneous parameters for optimization. In this Account, we briefly introduce metal clusters and describe the synthesis and characterization of supported DEMCs. We present catalysis studies of supported DEMCs in both batch and flow modes. Lastly, we summarize the current state of heterogenizing homogeneous catalysis and provide future directions for this area of research.

  15. Dynamic facilitation explains 'democratic' particle motion of metabasin transitions

    NASA Astrophysics Data System (ADS)

    Hedges, Lester O.; Garrahan, Juan P.

    2008-08-01

    Transitions between metabasins in supercooled liquids seem to occur through rapid collective particle rearrangements. These events have been called 'democratic' as they appear homogeneous over a significant number of particles. This could suggest that 'democratic' rearrangements are fundamentally distinct from those leading to dynamic heterogeneity. Here we show, however, that this apparently homogeneous particle motion can be explained solely in terms of dynamic facilitation, and is therefore intrinsically heterogeneous. We do so by studying metabasin transitions in facilitated spin models and constrained lattice gases. We find that metabasin transitions occur through a sequence of locally facilitated events taking place over a relatively short time frame. When observed on small enough spatial windows these events appear sudden and homogeneous. Our results indicate that metabasin transitions, while apparently homogeneous and 'democratic', are yet another manifestation of dynamical heterogeneity in glass formers.

  16. Homogeneity of lithium distribution in cylinder-type Li-ion batteries

    PubMed Central

    Senyshyn, A.; Mühlbauer, M. J.; Dolotko, O.; Hofmann, M.; Ehrenberg, H.

    2015-01-01

    Spatially-resolved neutron powder diffraction with a gauge volume of 2 × 2 × 20 mm3 has been applied as an in situ method to probe the lithium concentration in the graphite anode of different Li-ion cells of 18650-type in charged state. Structural studies performed in combination with electrochemical measurements and X-ray computed tomography under real cell operating conditions unambiguously revealed non-homogeneity of the lithium distribution in the graphite anode. Deviations from a homogeneous behaviour have been found in both radial and axial directions of 18650-type cells and were discussed in the frame of cell geometry and electrical connection of electrodes, which might play a crucial role in the homogeneity of the lithium distribution in the active materials within each electrode. PMID:26681110

  17. An efficient, reliable and inexpensive device for the rapid homogenization of multiple tissue samples by centrifugation.

    PubMed

    Ilyin, S E; Plata-Salamán, C R

    2000-02-15

    Homogenization of tissue samples is a common first step in the majority of current protocols for RNA, DNA, and protein isolation. This report describes a simple device for centrifugation-mediated homogenization of tissue samples. The method presented is applicable to RNA, DNA, and protein isolation, and we show examples where high quality total cell RNA, DNA, and protein were obtained from brain and other tissue samples. The advantages of the approach presented include: (1) a significant reduction in time investment relative to hand-driven or individual motorized-driven pestle homogenization; (2) easy construction of the device from inexpensive parts available in any laboratory; (3) high replicability in the processing; and (4) the capacity for the parallel processing of multiple tissue samples, thus allowing higher efficiency, reliability, and standardization.

  18. Cryogenic homogenization and sampling of heterogeneous multi-phase feedstock

    DOEpatents

    Doyle, Glenn Michael; Ideker, Virgene Linda; Siegwarth, James David

    2002-01-01

    An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock, reducing the temperature of the feedstock to a temperature below a critical temperature, reducing the size of the feedstock components, blending the reduced-size feedstock to form a homogeneous mixture, and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (-196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.

  19. Finite-time consensus for controlled dynamical systems in network

    NASA Astrophysics Data System (ADS)

    Zoghlami, Naim; Mlayeh, Rhouma; Beji, Lotfi; Abichou, Azgal

    2018-04-01

    The key challenges in networked dynamical systems are component heterogeneity, nonlinearity, and the high dimension of the resulting vector of state variables. In this paper, the emphasis is put on two classes of networked systems that include most controlled driftless systems as well as systems with drift. For each model structure defining homogeneous or heterogeneous multi-system behaviour, and for each model evolving in a network forming a homogeneous or heterogeneous multi-system, protocols incorporating sufficient conditions are derived that lead to finite-time consensus. Likewise, for the networking topology, we make use of fixed directed and undirected graphs. To prove our approaches, finite-time stability theory and Lyapunov methods are used. As illustrative examples, homogeneous multi-unicycle kinematics and homogeneous/heterogeneous multi-second-order dynamics in networks are studied.
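
    The paper's protocols are not reproduced in the abstract; purely as an illustration of finite-time consensus for the simplest homogeneous case, the sketch below simulates single-integrator agents on a fixed undirected ring under the standard signed fractional-power protocol u_i = -Σ_j a_ij sign(x_i - x_j)|x_i - x_j|^α with 0 < α < 1, which is a textbook protocol rather than the one derived in the paper.

```python
# Minimal sketch: finite-time consensus of single-integrator agents on a fixed
# undirected ring graph under u_i = -sum_j a_ij*sign(x_i-x_j)*|x_i-x_j|**alpha
# with 0 < alpha < 1. Illustrative toy only; the paper treats broader classes.
import numpy as np

N, ALPHA, DT, STEPS = 6, 0.5, 1e-3, 20000
A = np.zeros((N, N))
for i in range(N):                       # ring topology: neighbors i-1 and i+1
    A[i, (i - 1) % N] = A[i, (i + 1) % N] = 1.0

x0 = np.array([3.0, -1.0, 4.0, 0.5, -2.0, 1.0])
x = x0.copy()
for _ in range(STEPS):                   # forward-Euler integration
    diff = x[:, None] - x[None, :]       # diff[i, j] = x_i - x_j
    u = -(A * np.sign(diff) * np.abs(diff) ** ALPHA).sum(axis=1)
    x = x + DT * u

print(np.round(x, 4))                    # states converge to the initial average
print(round(float(x0.mean()), 4))        # ... which the protocol conserves
```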

  20. Partitioning of the degradation space for OCR training

    NASA Astrophysics Data System (ADS)

    Barney Smith, Elisa H.; Andersen, Tim

    2006-01-01

    Generally speaking, optical character recognition algorithms tend to perform better when presented with homogeneous data. This paper studies a method designed to increase the homogeneity of training data, based on an understanding of the types of degradations that occur during the printing and scanning process, and how these degradations affect the homogeneity of the data. While it has been shown that dividing the degradation space by edge spread improves recognition accuracy over dividing the degradation space by threshold or point spread function width alone, the challenge is in deciding how many partitions to use and at what values of edge spread the divisions should be made. Clustering of different types of character features, fonts, sizes, resolutions and noise levels shows that edge spread is indeed a strong indicator of the homogeneity of character data clusters.

  1. Ultrapure glass optical waveguide development in microgravity by the sol-gel process

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Containerless melting of glasses in space for the preparation of ultrapure homogeneous glass for optical waveguides is discussed. The homogenization of glass made from conventional raw materials is normally achieved on Earth either by gravity-induced convection currents or by mechanical stirring of the melt. Because of the absence of gravity-induced convection currents, the homogenization of glass using conventional raw materials is difficult in the space environment. Multicomponent, homogeneous, noncrystalline oxide gels can be prepared by the sol-gel process, and these gels are promising starting materials for melting glasses in the space environment. The sol-gel process is based on the polymerization reaction of alkoxysilane with other metal alkoxy compounds or suitable metal salts. Many of the alkoxysilanes and other metal alkoxides are liquids and thus can be purified by distillation.

  2. Computationally Probing the Performance of Hybrid, Heterogeneous, and Homogeneous Iridium-Based Catalysts for Water Oxidation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    García-Melchor, Max; Vilella, Laia; López, Núria

    2016-04-29

    An attractive strategy to improve the performance of water oxidation catalysts would be to anchor a homogeneous molecular catalyst on a heterogeneous solid surface to create a hybrid catalyst. The idea of this combined system is to take advantage of the individual properties of each of the two catalyst components. We use Density Functional Theory to determine the stability and activity of a model hybrid water oxidation catalyst consisting of a dimeric Ir complex attached to the IrO2(110) surface through two oxygen atoms. We find that homogeneous catalysts can be bound to their matrix oxide without losing significant activity. Hence, designing hybrid systems that benefit from both the high tunability of activity of homogeneous catalysts and the stability of heterogeneous systems seems feasible.

  3. A homogeneous focusing system for diode lasers and its applications in metal surface modification

    NASA Astrophysics Data System (ADS)

    Wang, Fei; Zhong, Lijing; Tang, Xiahui; Xu, Chengwen; Wan, Chenhao

    2018-06-01

    High power diode lasers are applied in many different areas, including surface modification, welding and cutting, and they represent an important technical trend in laser processing of metals. This paper aims to analyze the impact of the shape and homogeneity of the focal spot of the diode laser on surface modification. A focusing system using triplet lenses for a direct-output diode laser, which can be used to eliminate coma aberrations, is studied. A rectangular stripe with an aspect ratio from 8:1 to 25:1 is obtained, in which the power is homogeneously distributed along the fast axis; the power is 1117.6 W and the peak power intensity is 1.1587 × 10⁶ W/cm². This paper also presents a homogeneous focusing system based on a Fresnel lens, in which the incident beam size is 40 × 40 mm², the focal length is 380 mm, and the dimension of the obtained focal spot is 2 × 10 mm². When the divergence angle of the incident light is in the range of 12.5-20 mrad and the pitch is 1 mm, the homogeneity of the focal spot is optimal (about 95.22%). Experimental results show that the measured focal spot size is 2.04 × 10.39 mm². This research presents a novel design of homogeneous focusing systems for high power diode lasers.

  4. Simulation of homogeneous condensation of small polyatomic systems in high pressure supersonic nozzle flows using Bhatnagar-Gross-Krook model

    NASA Astrophysics Data System (ADS)

    Kumar, Rakesh; Levin, Deborah A.

    2011-03-01

    In the present work, we have simulated the homogeneous condensation of carbon dioxide and ethanol using the Bhatnagar-Gross-Krook based approach. In an earlier work of Gallagher-Rogers et al. [J. Thermophys. Heat Transfer 22, 695 (2008)], it was found that it was not possible to simulate condensation experiments of Wegener et al. [Phys. Fluids 15, 1869 (1972)] using the direct simulation Monte Carlo method. Therefore, in this work, we have used the statistical Bhatnagar-Gross-Krook approach, which was found to be numerically more efficient than direct simulation Monte Carlo method in our previous studies [Kumar et al., AIAA J. 48, 1531 (2010)], to model homogeneous condensation of two small polyatomic systems, carbon dioxide and ethanol. A new weighting scheme is developed in the Bhatnagar-Gross-Krook framework to reduce the computational load associated with the study of homogeneous condensation flows. The solutions obtained by the use of the new scheme are compared with those obtained by the baseline Bhatnagar-Gross-Krook condensation model (without the species weighting scheme) for the condensing flow of carbon dioxide in the stagnation pressure range of 1-5 bars. Use of the new weighting scheme in the present work makes the simulation of homogeneous condensation of ethanol possible. We obtain good agreement between our simulated predictions for homogeneous condensation of ethanol and experiments in terms of the point of condensation onset and the distribution of mass fraction of ethanol condensed along the nozzle centerline.

  5. Performance of electrolyte measurements assessed by a trueness verification program.

    PubMed

    Ge, Menglei; Zhao, Haijian; Yan, Ying; Zhang, Tianjiao; Zeng, Jie; Zhou, Weiyan; Wang, Yufei; Meng, Qinghui; Zhang, Chuanbao

    2016-08-01

    In this study, we analyzed frozen sera with known commutabilities for standardization of serum electrolyte measurements in China. Fresh frozen sera were sent to 187 clinical laboratories in China for measurement of four electrolytes (sodium, potassium, calcium, and magnesium). Target values were assigned by two reference laboratories. Precision (CV), trueness (bias), and accuracy [total error (TEa)] were used to evaluate measurement performance, and the tolerance limit derived from biological variation was used as the evaluation criterion. About half of the laboratories used a homogeneous system (same manufacturer for instrument, reagent and calibrator) for calcium and magnesium measurement, and more than 80% of laboratories used a homogeneous system for sodium and potassium measurement. More laboratories met the tolerance limit of imprecision (coefficient of variation [CVa]) than the tolerance limits of trueness (biasa) and TEa. For sodium, calcium, and magnesium, the minimal performance criterion derived from biological variation was used, and the pass rates for total error were approximately equal to those for bias (<50%). For potassium, the pass rates for CV and TE were more than 90%. Compared with the non-homogeneous systems, the homogeneous systems were superior for all three quality specifications. The use of commutable proficiency testing/external quality assessment (PT/EQA) samples with values assigned by reference methods can monitor performance and provide reliable data for improving the performance of laboratory electrolyte measurement. The homogeneous systems were superior to the non-homogeneous systems, whereas the accuracy of assigned values of calibrators and assay stability remained challenges.
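
    The abstract does not spell out the formulas; as a rough sketch of the three quality indicators, the snippet below computes a laboratory's CV, bias, and total error against tolerance limits. The total-error convention TE = |bias| + 1.645·CV and all numerical limits and replicates are illustrative assumptions, not values from the study.

```python
# Minimal sketch: precision (CV), trueness (bias) and total error (TE) for one
# laboratory's replicate results against biological-variation tolerance limits.
# The TE rule (|bias| + 1.645*CV) and every number below are assumptions.
import numpy as np

def performance(results, target, cv_limit, bias_limit, te_limit):
    r = np.asarray(results, dtype=float)
    cv = 100.0 * r.std(ddof=1) / r.mean()           # imprecision, %
    bias = 100.0 * (r.mean() - target) / target     # trueness, %
    te = abs(bias) + 1.645 * cv                     # one common total-error rule
    return {"CV%": round(cv, 2), "bias%": round(bias, 2), "TE%": round(te, 2),
            "pass_CV": cv <= cv_limit,
            "pass_bias": abs(bias) <= bias_limit,
            "pass_TE": te <= te_limit}

# Hypothetical potassium replicates (mmol/L) versus a reference target value.
print(performance([4.02, 4.05, 3.98, 4.04, 4.01], target=4.00,
                  cv_limit=2.3, bias_limit=1.8, te_limit=5.6))
```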

  6. Use of vertical temperature gradients for prediction of tidal flat sediment characteristics

    USGS Publications Warehouse

    Miselis, Jennifer L.; Holland, K. Todd; Reed, Allen H.; Abelev, Andrei

    2012-01-01

    Sediment characteristics largely govern tidal flat morphologic evolution; however, conventional methods of investigating spatial variability in lithology on tidal flats are difficult to employ in these highly dynamic regions. In response, a series of laboratory experiments was designed to investigate the use of temperature diffusion for sediment characterization. A vertical thermistor array was used to quantify temperature gradients in simulated tidal flat sediments of varying compositions. Thermal conductivity estimates derived from these arrays were similar to measurements from a standard heated needle probe, which substantiates the thermistor methodology. While the thermal diffusivities of dry homogeneous sediments were similar, diffusivities for saturated homogeneous sediments ranged over approximately one order of magnitude. The thermal diffusivity of saturated sand was five times that of saturated kaolin and more than eight times that of saturated bentonite. This suggests that vertical temperature gradients can be used to distinguish homogeneous saturated sands from homogeneous saturated clays and perhaps even between homogeneous saturated clay types. However, experiments with more realistic tidal flat mixtures were less discriminating. Relationships between thermal diffusivity and percent fines for saturated mixtures varied depending upon clay composition, indicating that clay hydration and/or water content controls thermal gradients. Furthermore, existing models for the bulk conductivity of sediment mixtures were improved only through the use of calibrated estimates of homogeneous end-member conductivity and water content values. Our findings suggest that remotely sensed observations of water content and thermal diffusivity could be used only to qualitatively estimate tidal flat sediment characteristics.
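
    The study's own estimation procedure is not described in the abstract; as one illustrative route from a vertical thermistor pair to a diffusivity estimate, the sketch below uses the classical amplitude-attenuation solution of the 1D heat equation for a periodic surface forcing, α = ωΔz²/(2 ln²(A₁/A₂)). The forcing period, sensor spacing, and amplitudes are assumed values, not data from the experiments.

```python
# Minimal sketch: thermal diffusivity from the amplitude attenuation of a
# periodic (e.g., tidal) temperature signal between two buried sensors, using
# the 1-D heat-equation result alpha = omega*dz**2 / (2*ln(A1/A2)**2).
import numpy as np

def diffusivity_from_amplitudes(a_upper, a_lower, dz, period_s):
    """a_upper, a_lower: signal amplitudes (K) at the upper and lower sensors;
    dz: sensor separation (m); period_s: forcing period (s)."""
    omega = 2.0 * np.pi / period_s
    return omega * dz**2 / (2.0 * np.log(a_upper / a_lower) ** 2)

# Example: 12.4-h tidal forcing, sensors 5 cm apart, amplitudes 2.0 K and 0.8 K.
alpha = diffusivity_from_amplitudes(2.0, 0.8, 0.05, 12.4 * 3600)
print(f"alpha ≈ {alpha:.2e} m^2/s")   # order 1e-7 m^2/s, typical of wet sediment
```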

  7. High temperature homogenization improves impact toughness of vitamin E-diffused, irradiated UHMWPE.

    PubMed

    Oral, Ebru; O'Brien, Caitlin; Doshi, Brinda; Muratoglu, Orhun K

    2017-06-01

    Diffusion of vitamin E into radiation cross-linked ultrahigh molecular weight polyethylene (UHMWPE) is used to increase stability against oxidation of total joint implant components. The dispersion of vitamin E throughout implant preforms has been optimized by a two-step process of doping and homogenization. Both of these steps are performed below the peak melting point of the cross-linked polymer (<140°C) to avoid loss of crystallinity and strength. Recently, it was discovered that the exposure of UHMWPE to elevated temperatures, around 300°C, for a limited amount of time in nitrogen could improve the toughness without sacrificing wear resistance. We hypothesized that high temperature homogenization of antioxidant-doped, radiation cross-linked UHMWPE could improve its toughness. We found that homogenization at 300°C for 8 h resulted in an increase in the impact toughness (74 kJ/m² compared to 67 kJ/m²), the ultimate tensile strength (50 MPa compared to 43 MPa) and the elongation at break (271% compared to 236%). The high temperature treatment did not compromise the wear resistance or the oxidative stability as measured by oxidation induction time. In addition, the desired homogeneity was achieved in a much shorter time (8 h compared to >240 h) by using high temperature homogenization. © 2016 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 35:1343-1347, 2017.

  8. A numerical homogenization method for heterogeneous, anisotropic elastic media based on multiscale theory

    DOE PAGES

    Gao, Kai; Chung, Eric T.; Gibson, Richard L.; ...

    2015-06-05

    The development of reliable methods for upscaling fine-scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. We therefore propose a numerical homogenization algorithm based on multiscale finite element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method uses multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters are then computed using these basis functions, and the approach applies a numerical discretization that is similar to the rotated staggered-grid finite difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity in which the average size of the heterogeneities ranged from several centimeters to several meters, and the ratio between the dominant wavelength and the average size of the arbitrary heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations proved that the numerical homogenization was equally accurate for these complex cases.
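
    For the finely layered benchmark mentioned above, the classical analytic reference is the Backus (1962) average, which gives the effective transversely isotropic stiffnesses of a stack of thin isotropic layers in closed form; the sketch below implements that analytic comparison case, not the paper's multiscale finite element method, and the layer values are arbitrary.

```python
# Minimal sketch: Backus (1962) averaging of thin isotropic layers into an
# effective VTI medium, the analytic benchmark for finely layered media.
import numpy as np

def backus_average(lam, mu, h):
    """lam, mu: Lame parameters per layer (Pa); h: layer thicknesses (m)."""
    lam, mu, h = (np.asarray(a, dtype=float) for a in (lam, mu, h))
    w = h / h.sum()                       # thickness weights; <q> = sum(w*q)
    def avg(q):
        return float(np.sum(w * q))
    c33 = 1.0 / avg(1.0 / (lam + 2 * mu))
    c44 = 1.0 / avg(1.0 / mu)
    c13 = avg(lam / (lam + 2 * mu)) * c33
    c11 = avg(4 * mu * (lam + mu) / (lam + 2 * mu)) \
        + avg(lam / (lam + 2 * mu)) ** 2 * c33
    c66 = avg(mu)
    return {"C11": c11, "C33": c33, "C44": c44, "C13": c13, "C66": c66}

# Two alternating isotropic layers of equal thickness (illustrative values).
print(backus_average(lam=[2.0e9, 6.0e9], mu=[1.0e9, 4.0e9], h=[0.5, 0.5]))
```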

  9. Inhibitory competition in figure-ground perception: context and convexity.

    PubMed

    Peterson, Mary A; Salvagio, Elizabeth

    2008-12-15

    Convexity has long been considered a potent cue as to which of two regions on opposite sides of an edge is the shaped figure. Experiment 1 shows that for a single edge, there is only a weak bias toward seeing the figure on the convex side. Experiments 1-3 show that the bias toward seeing the convex side as figure increases as the number of edges delimiting alternating convex and concave regions increases, provided that the concave regions are homogeneous in color. The results of Experiments 2 and 3 rule out a probability summation explanation for these context effects. Taken together, the results of Experiments 1-3 show that the homogeneity versus heterogeneity of the convex regions is irrelevant. Experiment 4 shows that homogeneity of alternating regions is not sufficient for context effects; a cue that favors the perception of the intervening regions as figures is necessary. Thus homogeneity does not alone operate as a background cue. We interpret our results within a model of figure-ground perception in which shape properties on opposite sides of an edge compete for representation and the competitive strength of weak competitors is further reduced when they are homogeneous.

  10. Field Performance of an Optimized Stack of YBCO Square “Annuli” for a Compact NMR Magnet

    PubMed Central

    Hahn, Seungyong; Voccio, John; Bermond, Stéphane; Park, Dong-Keun; Bascuñán, Juan; Kim, Seok-Beom; Masaru, Tomita; Iwasa, Yukikazu

    2011-01-01

    The spatial field homogeneity and time stability of the trapped field generated by a stack of YBCO square plates with a center hole (square "annuli") were investigated. By optimizing the stacking of magnetized square annuli, we aim to construct a compact NMR magnet. The stacked magnet consists of 750 thin YBCO plates, each 40-mm square and 80-μm thick with a 25-mm bore, and has a Ø10 mm room-temperature access for NMR measurement. To improve the spatial field homogeneity of the 750-plate stack (YP750), a three-step optimization was performed: 1) statistical selection of the best plates from the supply plates; 2) field homogeneity measurement of multi-plate modules; and 3) optimal assembly of the modules to maximize field homogeneity. In this paper, we present analytical and experimental results of field homogeneity and temporal stability at 77 K, performed on YP750 and on a hybrid stack, YPB750, in which two YBCO bulk annuli, each Ø46 mm and 16-mm thick with a 25-mm bore, are added to YP750, one at the top and the other at the bottom. PMID:22081753

  11. Isotopic homogeneity of iron in the early solar nebula.

    PubMed

    Zhu, X K; Guo, Y; O'Nions, R K; Young, E D; Ash, R D

    2001-07-19

    The chemical and isotopic homogeneity of the early solar nebula, and the processes producing fractionation during its evolution, are central issues of cosmochemistry. Studies of the relative abundance variations of three or more isotopes of an element can in principle determine if the initial reservoir of material was a homogeneous mixture or if it contained several distinct sources of precursor material. For example, widespread anomalies observed in the oxygen isotopes of meteorites have been interpreted as resulting from the mixing of a solid phase that was enriched in 16O with a gas phase in which 16O was depleted, or as an isotopic 'memory' of Galactic evolution. In either case, these anomalies are regarded as strong evidence that the early solar nebula was not initially homogeneous. Here we present measurements of the relative abundances of three iron isotopes in meteoritic and terrestrial samples. We show that significant variations of iron isotopes exist in both terrestrial and extraterrestrial materials. But when plotted in a three-isotope diagram, all of the data for these Solar System materials fall on a single mass-fractionation line, showing that homogenization of iron isotopes occurred in the solar nebula before both planetesimal accretion and chondrule formation.
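
    For context, "falling on a single mass-fractionation line" in a three-isotope plot means that the two measured ratios co-vary with a slope fixed by the isotope masses; for iron ratios referenced to ⁵⁴Fe this slope is approximately 1.5. This is standard mass-dependent fractionation algebra rather than a result specific to the paper, and it is why purely mass-dependent processes keep samples on one line while inherited isotopic heterogeneity would scatter points off it.

```latex
% Expected mass-dependent fractionation slope in an Fe three-isotope diagram
% (delta values relative to a common standard; ratios referenced to ^{54}Fe).
\[
  \delta^{57}\mathrm{Fe} \;\approx\; \beta\,\delta^{56}\mathrm{Fe},
  \qquad
  \beta \;\approx\; \frac{m_{57}-m_{54}}{m_{56}-m_{54}} \;\approx\; 1.5
  \quad \text{(linear approximation).}
\]
```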

  12. Homogeneous Immunoassays: Historical Perspective and Future Promise

    NASA Astrophysics Data System (ADS)

    Ullman, Edwin F.

    1999-06-01

    The founding and growth of Syva Company is examined in the context of its leadership role in the development of homogeneous immunoassays. The simple mix-and-read protocols of these methods offer advantages in routine analytical and clinical applications. Early homogeneous methods were based on insensitive detection of immunoprecipitation during antigen/antibody binding. The advent of reporter groups in biology provided a means of quantitating immunochemical binding by labeling antibody or antigen and physically separating label incorporated into immune complexes from free label. Although high sensitivity was achieved, quantitative separations were experimentally demanding. Only when it became apparent that reporter groups could provide information not only about the location of a molecule but also about its microscopic environment was it possible to design practical non-separation methods. The evolution of early homogeneous immunoassays was driven largely by the development of improved detection strategies. The first commercial spin immunoassays, developed by Syva for drug abuse testing during the Vietnam War, were followed by increasingly powerful methods such as immunochemical modulation of enzyme activity, fluorescence, and photo-induced chemiluminescence. Homogeneous methods that quantify analytes at femtomolar concentrations within a few minutes now offer important new opportunities in clinical diagnostics, nucleic acid detection and drug discovery.

  13. Homogenization of Classification Functions Measurement (HOCFUN): A Method for Measuring the Salience of Emotional Arousal in Thinking.

    PubMed

    Tonti, Marco; Salvatore, Sergio

    2015-01-01

    The problem of the measurement of emotion is a widely debated one. In this article we propose an instrument, the Homogenization of Classification Functions Measure (HOCFUN), designed for assessing the influence of emotional arousal on a rating task consisting of the evaluation of a sequence of images. The instrument defines an indicator (κ) that measures the degree of homogenization of the ratings given over 2 rating scales (pleasant-unpleasant and relevant-irrelevant). Such a degree of homogenization is interpreted as the effect of emotional arousal on thinking and therefore lends itself to be used as a marker of emotional arousal. A preliminary study of validation was implemented. The association of the κ indicator with 3 additional indicators was analyzed. Consistent with the hypotheses, the κ indicator proved to be associated, even if weakly and nonlinearly, with a marker of the homogenization of classification functions derived from a separate rating task and with 2 indirect indicators of emotional activation: the speed of performance on the HOCFUN task and an indicator of mood intensity. Taken as a whole, such results provide initial evidence supporting the HOCFUN construct validity.

  14. Standard deviation index for stimulated Brillouin scattering suppression with different homogeneities.

    PubMed

    Ran, Yang; Su, Rongtao; Ma, Pengfei; Wang, Xiaolin; Zhou, Pu; Si, Lei

    2016-05-10

    We present a new quantitative index based on the standard deviation to measure the homogeneity of spectral lines in a fiber amplifier system, so as to find the relation between the stimulated Brillouin scattering (SBS) threshold and the homogeneity of the corresponding spectral lines. A theoretical model is built and a simulation framework has been established to estimate the SBS threshold when input spectra with different homogeneities are set. In our experiment, by setting the phase modulation voltage to a constant value and the modulation frequency to different values, spectral lines with different homogeneities can be obtained. The experimental results show that the SBS threshold is negatively correlated with the standard deviation of the modulated spectrum, which is in good agreement with the theoretical results. When the phase modulation voltage is confined to 10 V and the modulation frequency is set to 80 MHz, the standard deviation of the modulated spectrum equals 0.0051, which is the lowest value in our experiment. Thus, at this time, the highest SBS threshold is achieved. This standard deviation can be a good quantitative index for evaluating the power scaling potential of a fiber amplifier system, and is also a design guideline for suppressing SBS more effectively.
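
    The exact normalization behind the index is not given in the abstract; the sketch below encodes one plausible reading as a stated assumption: normalize the modulated spectral line powers to unit total power and take their standard deviation, so that a perfectly flat (most homogeneous) spectrum gives the smallest value and, per the reported trend, the highest SBS threshold.

```python
# Minimal sketch of a standard-deviation index of spectral homogeneity.
# Assumption: line powers are normalized to unit total power before taking
# the standard deviation; a flatter spectrum gives a smaller index.
import numpy as np

def std_index(line_powers):
    p = np.asarray(line_powers, dtype=float)
    p = p / p.sum()                      # normalize to unit total power
    return float(p.std())

flat_spectrum = [1.0, 1.0, 1.0, 1.0, 1.0]      # perfectly even spectral lines
uneven_spectrum = [3.0, 0.5, 0.5, 0.5, 0.5]    # same total power, less even
print(std_index(flat_spectrum))    # 0.0 -> highest expected SBS threshold
print(std_index(uneven_spectrum))  # > 0 -> lower expected SBS threshold
```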

  15. Effect of the fat globule sizes on the meltdown of ice cream.

    PubMed

    Koxholt, M M; Eisenmann, B; Hinrichs, J

    2001-01-01

    The meltdown of ice cream is influenced by its composition and additives and by fat globule size. The objective of this study was to examine the effect of fat globule size and fat agglomerate size on the meltdown stability of ice cream. Therefore, an ice cream mix (10% milk fat) was homogenized at pressures ranging from 0 to 30 MPa in single-stage, double-stage, and selective homogenization processes. The ice cream, produced on a continuous ice cream freezer, was characterized by an optimized meltdown test while, in addition, the fat globule sizes and the free fat content were determined in the mix and the molten ice cream. The meltdown was dependent on the fat agglomerate sizes in the unfrozen serum phase. Agglomerates smaller than a critical diameter led to significantly higher meltdown rates. Homogenization pressures of at least 10 MPa were sufficient to produce a stable ice cream. Furthermore, proof was provided that double-stage homogenization is not necessary for fat contents up to 10% and that selective homogenization is possible to produce stable ice creams. Based on these results a model was deduced describing the stabilizing mechanisms during the meltdown process.

  16. Short communication: effect of homogenization on heat inactivation of Mycobacterium avium subspecies paratuberculosis in milk.

    PubMed

    Hammer, P; Kiesner, C; Walte, H-G C

    2014-01-01

    Mycobacterium avium ssp. paratuberculosis (MAP) can be present in cow milk and low numbers may survive high-temperature, short-time (HTST) pasteurization. Although HTST treatment leads to inactivation of at least 5 log10 cycles, it might become necessary to enhance the efficacy of HTST by additional treatments such as homogenization if the debate about the role of MAP in Crohn's disease of humans concludes that MAP is a zoonotic agent. This study aimed to determine whether disrupting the clumps of MAP in milk by homogenization during the heat treatment process would enhance the inactivation of MAP. We used HTST pasteurization in a continuous-flow pilot-plant pasteurizer and evaluated the effect of upstream, downstream, and in-hold homogenization on inactivation of MAP. Reduction of MAP at 72°C with a holding time of 28s was between 3.7 and 6.9 log10 cycles, with an overall mean of 5.5 log10 cycles. None of the 3 homogenization modes applied showed a statistically significant additional effect on the inactivation of MAP during HTST treatment. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
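
    For orientation, the "log10 cycles" of reduction quoted above is simply the base-10 logarithm of the ratio of viable counts before and after treatment; the counts in the example below are hypothetical.

```python
# How a log10 reduction ("log cycles") is computed from plate counts before and
# after pasteurization; the counts below are hypothetical, not data from the study.
import math

def log10_reduction(n_before, n_after):
    return math.log10(n_before / n_after)

print(log10_reduction(1e7, 3e1))   # about 5.5 log10 cycles, matching the mean reported above
```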

  17. Using Floquet periodicity to easily calculate dispersion curves and wave structures of homogeneous waveguides

    NASA Astrophysics Data System (ADS)

    Hakoda, Christopher; Rose, Joseph; Shokouhi, Parisa; Lissenden, Clifford

    2018-04-01

    Dispersion curves are essential to any guided-wave-related project. The Semi-Analytical Finite Element (SAFE) method has become the conventional way to compute dispersion curves for homogeneous waveguides. However, only recently has a general SAFE formulation for commercial and open-source software become available, meaning that until now SAFE analyses have been variable and more time consuming than desirable. Likewise, the Floquet boundary conditions enable analysis of waveguides with periodicity and have been an integral part of the development of metamaterials. In fact, we have found the use of Floquet boundary conditions to be an extremely powerful tool for homogeneous waveguides, too. The nuances of using periodic boundary conditions for homogeneous waveguides that do not exhibit periodicity are discussed. Comparisons between this method and SAFE are made for selected homogeneous waveguide applications. The COMSOL Multiphysics software is used for the results shown, but any standard finite element software that can implement Floquet periodicity (user-defined or built-in) should suffice. Finally, we identify a number of complex waveguides for which dispersion curves can be found with relative ease by using the periodicity inherent to the Floquet boundary conditions.
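
    The sketch below illustrates the Floquet/Bloch workflow on the simplest possible "waveguide", a homogeneous 1D mass-spring chain treated as a periodic unit cell: sweep the Bloch wavenumber, solve the reduced eigenproblem per cell, and read off the dispersion curve. The cell length and material values are arbitrary, and a real waveguide analysis replaces the 1×1 matrix with the finite element mass and stiffness matrices of a 3D cell.

```python
# Floquet/Bloch dispersion of a homogeneous 1D mass-spring chain: neighbouring
# cells enter the single-cell problem only through the phase factor exp(+/- i k a).
import numpy as np

m, K, a = 1.0, 1.0, 1.0                       # mass, spring stiffness, cell length (arbitrary)
ks = np.linspace(1e-3, np.pi / a, 100)        # Bloch wavenumbers over the first Brillouin zone

omegas = []
for k in ks:
    # Bloch-reduced dynamic stiffness of the one-DOF unit cell
    K_bloch = K * (2.0 - np.exp(1j * k * a) - np.exp(-1j * k * a))
    w2 = np.linalg.eigvals(np.array([[K_bloch / m]]))   # trivial 1x1 eigenproblem
    omegas.append(np.sqrt(w2.real.max()))

# Analytical check: omega(k) = 2*sqrt(K/m)*|sin(k a / 2)|
assert np.allclose(omegas, 2 * np.sqrt(K / m) * np.abs(np.sin(ks * a / 2)), atol=1e-8)
```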

  18. Tests of a Prototype for Assessing the Field Homogeneity of the Iseult/Inumac 11.7T Whole Body MRI Magnet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quettier, Lionel

    A neuroscience research center with very high field MRI equipment was opened in November 2006 by the CEA life science division. One of the imaging systems requires an 11.75 T magnet with a 900 mm warm bore, the so-called Iseult/Inumac magnet. Given its large aperture and field strength, this magnet is a challenge compared with the largest MRI systems ever built, and it is therefore being developed within an ambitious R&D program. With the objective of demonstrating that field homogeneity better than 1 ppm can be achieved using double pancake windings, a 24-double-pancake model coil working at 1.5 T was designed. This model magnet was manufactured by Alstom MSA and tested at CEA. It was measured with very high precision in order to fully characterize the field homogeneity and to investigate and discriminate the parameters that influence the field map. The magnet reached the bare-magnet field homogeneity specification expected for Iseult and thus successfully demonstrated the feasibility of building a homogeneous magnet with the double pancake winding technique.
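
    For reference, a peak-to-peak homogeneity figure in ppm is typically computed from field-map samples over the region of interest as shown below; the sample values are invented.

```python
# Peak-to-peak field homogeneity in ppm from a set of field-map samples
# (the field values below are made up for illustration).
import numpy as np

def peak_to_peak_ppm(field_samples_tesla, b0_tesla):
    b = np.asarray(field_samples_tesla, dtype=float)
    return (b.max() - b.min()) / b0_tesla * 1e6

field_map = 1.5 + 1e-6 * np.array([0.1, -0.2, 0.4, 0.0, -0.3])  # tesla
print(peak_to_peak_ppm(field_map, 1.5))   # ~0.47 ppm, i.e. better than 1 ppm
```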

  19. Permian paleoclimate data from fluid inclusions in halite

    USGS Publications Warehouse

    Benison, K.C.; Goldstein, R.H.

    1999-01-01

    This study has yielded surface water paleotemperatures from primary fluid inclusions in mid-Permian Nippewalla Group halite from western Kansas. A 'cooling nucleation' method is used to generate vapor bubbles in originally all-liquid primary inclusions. Then, surface water paleotemperatures are obtained by measuring temperatures of homogenization to liquid. Homogenization temperatures ranged from 21°C to 50°C and are consistent along individual fluid inclusion assemblages, indicating that the fluid inclusions have not been altered by thermal reequilibration. Homogenization temperatures show a range of up to 26°C from base to top of individual cloudy chevron growth bands. Petrographic and fluid inclusion evidence indicate that no significant pressure correction is needed for the homogenization temperature data. We interpret these homogenization temperatures to represent shallow surface water paleotemperatures. The range in temperatures from base to top of single chevron bands may reflect daily temperature variations. These Permian surface water temperatures fall within the same range as some modern evaporative surface waters, suggesting that this Permian environment may have been relatively similar to its modern counterparts. Shallow surface water temperatures in evaporative settings correspond closely to local air temperatures. Therefore, the Permian surface water temperatures determined in this study may be considered proxies for local Permian air temperatures.

  20. A homogeneous superconducting magnet design using a hybrid optimization algorithm

    NASA Astrophysics Data System (ADS)

    Ni, Zhipeng; Wang, Qiuliang; Liu, Feng; Yan, Luguang

    2013-12-01

    This paper employs a hybrid optimization algorithm with a combination of linear programming (LP) and nonlinear programming (NLP) to design the highly homogeneous superconducting magnets for magnetic resonance imaging (MRI). The whole work is divided into two stages. The first LP stage provides a global optimal current map with several non-zero current clusters, and the mathematical model for the LP was updated by taking into account the maximum axial and radial magnetic field strength limitations. In the second NLP stage, the non-zero current clusters were discretized into practical solenoids. The superconducting conductor consumption was set as the objective function both in the LP and NLP stages to minimize the construction cost. In addition, the peak-peak homogeneity over the volume of imaging (VOI), the scope of 5 Gauss fringe field, and maximum magnetic field strength within superconducting coils were set as constraints. The detailed design process for a dedicated 3.0 T animal MRI scanner was presented. The homogeneous magnet produces a magnetic field quality of 6.0 ppm peak-peak homogeneity over a 16 cm by 18 cm elliptical VOI, and the 5 Gauss fringe field was limited within a 1.5 m by 2.0 m elliptical region.
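
    A heavily simplified sketch of the first (LP) stage described above, assuming the usual formulation in which candidate current loops on a grid contribute linearly to the axial field at target points: total loop current (a crude proxy for conductor consumption) is minimized subject to the field staying within a homogeneity band around B0. The geometry, tolerances and restriction to nonnegative currents are illustrative assumptions, and the fringe-field and peak-field constraints of the real design are omitted.

```python
# LP current-map stage (illustrative): minimize total loop current subject to the
# on-axis field at target points staying within a homogeneity band around B0.
import numpy as np
from scipy.optimize import linprog

mu0 = 4e-7 * np.pi
loop_z = np.linspace(-0.8, 0.8, 33)        # candidate loop axial positions (m)
loop_R = 0.4                                # all candidate loops at one radius (m)
target_z = np.linspace(-0.08, 0.08, 9)      # on-axis target points in the VOI (m)
B0, tol = 3.0, 3.0 * 100e-6                 # 3 T target with a 100 ppm band (illustrative)

# Sensitivity matrix: on-axis field of a circular loop per unit current.
dz = target_z[:, None] - loop_z[None, :]
A = mu0 * loop_R**2 / (2.0 * (loop_R**2 + dz**2) ** 1.5)

# minimize sum(I)  s.t.  B0 - tol <= A @ I <= B0 + tol,  I >= 0
A_ub = np.vstack([A, -A])
b_ub = np.concatenate([np.full(len(target_z), B0 + tol),
                       np.full(len(target_z), -(B0 - tol))])
res = linprog(c=np.ones(len(loop_z)), A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None)] * len(loop_z), method="highs")

print("LP status:", res.message)
if res.success:
    print("optimal current map (A):", res.x.round(1))
```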

  1. Computational prediction of new auxetic materials.

    PubMed

    Dagdelen, John; Montoya, Joseph; de Jong, Maarten; Persson, Kristin

    2017-08-22

    Auxetics comprise a rare family of materials that manifest negative Poisson's ratio, which causes an expansion instead of contraction under tension. Most known homogeneously auxetic materials are porous foams or artificial macrostructures, and there are few examples of inorganic materials that exhibit this behavior as polycrystalline solids. It is now possible to accelerate the discovery of materials with target properties, such as auxetics, using high-throughput computations, open databases, and efficient search algorithms. Candidates exhibiting features correlating with auxetic behavior were chosen from the set of more than 67 000 materials in the Materials Project database. Poisson's ratios were derived from the calculated elastic tensor of each material in this reduced set of compounds. We report that this strategy results in the prediction of three previously unidentified homogeneously auxetic materials as well as a number of compounds with a near-zero homogeneous Poisson's ratio, which are here denoted "anepirretic materials". There are very few inorganic materials with an auxetic homogeneous Poisson's ratio in polycrystalline form. Here the authors develop an approach to screening materials databases for target properties such as negative Poisson's ratio by using stability and structural motifs to predict new instances of homogeneous auxetic behavior, as well as a number of materials with near-zero Poisson's ratio.
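
    A hedged sketch of the screening step implied above: given a 6×6 elastic stiffness matrix in Voigt notation (as stored, for example, in the Materials Project), the polycrystalline Poisson's ratio can be estimated with the standard Voigt-Reuss-Hill average and negative values flagged as auxetic candidates. The stiffness values below are made up; this is not the authors' actual screening code.

```python
# Polycrystalline ("homogeneous") Poisson's ratio from a 6x6 stiffness matrix via
# the Voigt-Reuss-Hill average; negative results would flag an auxetic candidate.
import numpy as np

def vrh_poisson(C):
    C = np.asarray(C, dtype=float)
    S = np.linalg.inv(C)                                  # compliance matrix
    KV = (C[0,0]+C[1,1]+C[2,2] + 2*(C[0,1]+C[1,2]+C[0,2])) / 9.0
    GV = (C[0,0]+C[1,1]+C[2,2] - (C[0,1]+C[1,2]+C[0,2]) + 3*(C[3,3]+C[4,4]+C[5,5])) / 15.0
    KR = 1.0 / (S[0,0]+S[1,1]+S[2,2] + 2*(S[0,1]+S[1,2]+S[0,2]))
    GR = 15.0 / (4*(S[0,0]+S[1,1]+S[2,2]) - 4*(S[0,1]+S[1,2]+S[0,2]) + 3*(S[3,3]+S[4,4]+S[5,5]))
    K, G = 0.5 * (KV + KR), 0.5 * (GV + GR)               # Hill averages
    return (3*K - 2*G) / (2 * (3*K + G))

# Cubic stiffness matrix in GPa (made-up, roughly aluminium-like values):
C11, C12, C44 = 108.0, 62.0, 28.0
C = np.array([[C11, C12, C12, 0, 0, 0],
              [C12, C11, C12, 0, 0, 0],
              [C12, C12, C11, 0, 0, 0],
              [0, 0, 0, C44, 0, 0],
              [0, 0, 0, 0, C44, 0],
              [0, 0, 0, 0, 0, C44]])
print(round(vrh_poisson(C), 3))   # ~0.35 here; a negative value would indicate an auxetic
```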

  2. ROS production in homogenate from the body wall of sea cucumber Stichopus japonicus under UVA irradiation: ESR spin-trapping study.

    PubMed

    Qi, Hang; Dong, Xiu-fang; Zhao, Ya-ping; Li, Nan; Fu, Hui; Feng, Ding-ding; Liu, Li; Yu, Chen-xu

    2016-02-01

    Sea cucumber Stichopus japonicus (S. japonicus) shows a strong ability of autolysis, which leads to severe deterioration in sea cucumber quality during processing and storage. In this study, to further characterize the mechanism of sea cucumber autolysis, hydroxyl radical production induced by ultraviolet A (UVA) irradiation was investigated. Homogenate from the body wall of S. japonicus was prepared and subjected to UVA irradiation at room temperature. Electron Spin Resonance (ESR) spectra of the treated samples were subsequently recorded. The results showed that hydroxyl radicals (OH) became more abundant as the UVA treatment time and the homogenate concentration increased. Addition of superoxide dismutase (SOD), catalase, EDTA, desferal, NaN3 and D2O to the homogenate samples led to different degrees of inhibition of OH production. Metal cations and pH also showed different effects on OH production. These results indicated that OH was produced in the homogenate with a possible pathway as follows: O2(-) → H2O2 → OH, suggesting that OH might be a critical factor in UVA-induced S. japonicus autolysis. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Configuration optimization of space structures

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos; Crivelli, Luis A.; Vandenbelt, David

    1991-01-01

    The objective is to develop a computer aid for the conceptual/initial design of aerospace structures, allowing configurations and shape to be a priori design variables. The topics are presented in viewgraph form and include the following: Kikuchi's homogenization method; a classical shape design problem; homogenization method steps; a 3D mechanical component design example; forming a homogenized finite element; a 2D optimization problem; treatment of the volume inequality constraint; algorithms for the volume inequality constraint; objective function derivatives--taking advantage of design locality; stiffness variations; variations of potential; and schematics of the optimization problem.

  4. Simple quasi-analytical holonomic homogenization model for the non-linear analysis of in-plane loaded masonry panels: Part 1, meso-scale

    NASA Astrophysics Data System (ADS)

    Milani, G.; Bertolesi, E.

    2017-07-01

    A simple quasi-analytical holonomic homogenization approach for the non-linear analysis of in-plane loaded masonry walls is presented. The elementary cell (REV) is discretized with 24 triangular elastic constant-stress elements (bricks) and non-linear interfaces (mortar). A holonomic behavior with softening is assumed for the mortar. It is shown how the mechanical problem in the unit cell is characterized by very few displacement variables and how the homogenized stress-strain behavior can be evaluated semi-analytically.

  5. Boundary element modelling of dynamic behavior of piecewise homogeneous anisotropic elastic solids

    NASA Astrophysics Data System (ADS)

    Igumnov, L. A.; Markov, I. P.; Litvinchuk, S. Yu

    2018-04-01

    A traditional direct boundary integral equations method is applied to solve three-dimensional dynamic problems of piecewise homogeneous linear elastic solids. The materials of homogeneous parts are considered to be generally anisotropic. The technique used to solve the boundary integral equations is based on the boundary element method applied together with the Radau IIA convolution quadrature method. A numerical example of suddenly loaded 3D prismatic rod consisting of two subdomains with different anisotropic elastic properties is presented to verify the accuracy of the proposed formulation.

  6. PHYSICAL PROPERTIES OF ZIRCONIUM NITRIDE IN THE HOMOGENEITY REGION (in Ukrainian)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samsonov, G.V.; Verkhoglyadova, T.S.

    1962-01-01

    The x-ray method was used to determine the homogeneity region of zirconium nitride as 40 to 50 at.% (9.5 to 13.3% by weight) of nitrogen. It is also shown that the ionic component of the bonding in the zirconium nitride lattice increases with a decrease in the nitrogen content in this region, this increase being larger than in the homogeneity region of titanium nitride due to the smaller degree of unfilling of the electron d-shell of the zirconium atom in comparison with that of the titanium atom. (auth)

  7. Averaging principle for second-order approximation of heterogeneous models with homogeneous models.

    PubMed

    Fibich, Gadi; Gavious, Arieh; Solan, Eilon

    2012-11-27

    Typically, models with a heterogeneous property are considerably harder to analyze than the corresponding homogeneous models, in which the heterogeneous property is replaced by its average value. In this study we show that any outcome of a heterogeneous model that satisfies the two properties of differentiability and symmetry is O(ε²) equivalent to the outcome of the corresponding homogeneous model, where ε is the level of heterogeneity. We then use this averaging principle to obtain new results in queuing theory, game theory (auctions), and social networks (marketing).
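
    A quick numerical illustration of the O(ε²) statement, using a toy M/M/1-like outcome g(μ) = 1/(μ − λ): agents receive heterogeneous service rates μ(1 + εδ) with a symmetric, zero-mean heterogeneity pattern δ, and the population-average outcome is compared with the homogeneous-model outcome g(μ). The parameter values are arbitrary.

```python
# The gap between the heterogeneous population average and the homogeneous model
# shrinks roughly fourfold each time the heterogeneity level eps is halved, i.e. O(eps^2).
import numpy as np

mu, lam = 2.0, 1.0
g = lambda m: 1.0 / (m - lam)                 # smooth outcome of a single "agent"
delta = np.tile([-1.0, 1.0], 50_000)          # symmetric, exactly zero-mean heterogeneity

for eps in [0.2, 0.1, 0.05]:
    heterogeneous = g(mu * (1.0 + eps * delta)).mean()
    homogeneous = g(mu)
    print(eps, abs(heterogeneous - homogeneous))   # error ~ 4*eps^2 here
```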

  8. Retrotransposon insertion targeting: a mechanism for homogenization of centromere sequences on nonhomologous chromosomes.

    PubMed

    Birchler, James A; Presting, Gernot G

    2012-04-01

    The centromeres of most eukaryotic organisms consist of highly repetitive arrays that are similar across nonhomologous chromosomes. These sequences evolve rapidly, thus posing a mystery as to how such arrays can be homogenized. Recent work in species in which centromere-enriched retrotransposons occur indicates that these elements preferentially insert into the centromeric regions. In two different Arabidopsis species, a related element was recognized in which the specificity for such targeting was altered. These observations provide a partial explanation for how homogenization of centromere DNA sequences occurs.

  9. Averaging principle for second-order approximation of heterogeneous models with homogeneous models

    PubMed Central

    Fibich, Gadi; Gavious, Arieh; Solan, Eilon

    2012-01-01

    Typically, models with a heterogeneous property are considerably harder to analyze than the corresponding homogeneous models, in which the heterogeneous property is replaced by its average value. In this study we show that any outcome of a heterogeneous model that satisfies the two properties of differentiability and symmetry is O(ɛ²) equivalent to the outcome of the corresponding homogeneous model, where ɛ is the level of heterogeneity. We then use this averaging principle to obtain new results in queuing theory, game theory (auctions), and social networks (marketing). PMID:23150569

  10. The lunar crust - A product of heterogeneous accretion or differentiation of a homogeneous moon

    NASA Technical Reports Server (NTRS)

    Brett, R.

    1973-01-01

    The outer portion of the moon (including the aluminum-rich crust and the source regions of mare basalts) was either accreted heterogeneously or was the product of widespread differentiation of an originally homogeneous source. Existing evidence for and against each of these two models is reviewed. It is concluded that the accretionary model presents more problems than it solves, and the model involving differentiation of an originally homogeneous moon is considered to be more plausible. A hypothesis for the formation of mare basalts is advanced.

  11. Homogeneous Freezing of Water Droplets and its Dependence on Droplet Size

    NASA Astrophysics Data System (ADS)

    Schmitt, Thea; Möhler, Ottmar; Höhler, Kristina; Leisner, Thomas

    2014-05-01

    The formulation and parameterisation of microphysical processes in tropospheric clouds, such as phase transitions, is still a challenge for weather and climate models. This includes the homogeneous freezing of supercooled water droplets, since this is an important process in deep convective systems, where almost pure water droplets may stay liquid until homogeneous freezing occurs at temperatures around 238 K. Though the homogeneous ice nucleation in supercooled water is considered to be well understood, recent laboratory experiments with typical cloud droplet sizes showed one to two orders of magnitude smaller nucleation rate coefficients than previous literature results, including earlier results from experiments with single levitated water droplets and from cloud simulation experiments at the AIDA (Aerosol Interaction and Dynamics in the Atmosphere) facility. This motivated us to re-analyse homogeneous droplet freezing experiments conducted during the previous years at the AIDA cloud chamber. This cloud chamber has a volume of 84 m³ and operates under atmospherically relevant conditions within wide ranges of temperature, pressure and humidity, whereby investigations of both tropospheric mixed-phase clouds and cirrus clouds can be realised. By controlled adiabatic expansions, the ascent of an air parcel in the troposphere can be simulated. According to our new results and their comparison to the results from single levitated droplet experiments, the homogeneous freezing of water droplets seems to be a volume-dependent process, at least for droplets as small as a few micrometers in diameter. A contribution of surface induced freezing can be ruled out, in agreement with previous conclusions from the single droplet experiments. The obtained volume nucleation rate coefficients are in good agreement, within error bars, with some previous literature data, including our own results from earlier AIDA experiments, but they do not agree with recently published lower volume nucleation rate coefficients. This contribution will show the results from the re-analysis of AIDA homogeneous freezing experiments with pure water droplets and will discuss the comparison to the literature data.
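
    The volume-dependent picture discussed above can be summarized by the standard relation P = 1 − exp(−J·V·t) for the probability that a droplet of volume V freezes within time t at volume nucleation rate coefficient J; the J value in the sketch below is an invented order of magnitude, not an AIDA result.

```python
# Volume-dependent homogeneous freezing probability P = 1 - exp(-J * V * t);
# the nucleation rate coefficient used here is a made-up order of magnitude.
import math

def freezing_probability(d_um, t_s, J_cm3_s):
    V_cm3 = math.pi / 6.0 * (d_um * 1e-4) ** 3     # droplet volume in cm^3
    return 1.0 - math.exp(-J_cm3_s * V_cm3 * t_s)

for d in (2.0, 5.0, 10.0):                          # droplet diameters in micrometres
    print(d, freezing_probability(d, t_s=10.0, J_cm3_s=1e10))  # larger droplets freeze first
```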

  12. A 14 h⁻³ Gpc³ study of cosmic homogeneity using BOSS DR12 quasar sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laurent, Pierre; Goff, Jean-Marc Le; Burtin, Etienne

    2016-11-01

    The BOSS quasar sample is used to study cosmic homogeneity with a 3D survey in the redshift range 2.2 < z < 2.8. We measure the count-in-sphere, N(<r), i.e. the average number of objects around a given object, and its logarithmic derivative, the fractal correlation dimension, D₂(r). For a homogeneous distribution N(<r) ∝ r³ and D₂(r) = 3. Due to the uncertainty on tracer density evolution, 3D surveys can only probe homogeneity up to a redshift dependence, i.e. they probe so-called 'spatial isotropy'. Our data demonstrate spatial isotropy of the quasar distribution in the redshift range 2.2 < z < 2.8 in a model-independent way, independent of any FLRW fiducial cosmology, resulting in 3 − ⟨D₂⟩ < 1.7 × 10⁻³ (2σ) over the range 250 < r < 1200 h⁻¹ Mpc for the quasar distribution. If we assume that quasars do not have a bias much less than unity, this implies spatial isotropy of the matter distribution on large scales. Then, combining with the Copernican principle, we finally get homogeneity of the matter distribution on large scales. Alternatively, using a flat ΛCDM fiducial cosmology with CMB-derived parameters, and measuring the quasar bias relative to this ΛCDM model, our data provide a consistency check of the model, in terms of how homogeneous the Universe is on different scales. D₂(r) is found to be compatible with our ΛCDM model on the whole 10 < r < 1200 h⁻¹ Mpc range. For the matter distribution we obtain 3 − ⟨D₂⟩ < 5 × 10⁻⁵ (2σ) over the range 250 < r < 1200 h⁻¹ Mpc, consistent with homogeneity on large scales.
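
    A minimal sketch of the statistic used above: given the average count-in-sphere N(<r), the fractal correlation dimension is the logarithmic derivative D₂(r) = d ln N / d ln r, which equals 3 for a homogeneous distribution. Synthetic counts for a perfectly uniform density are used here.

```python
# Fractal correlation dimension D2(r) = d ln N(<r) / d ln r from count-in-sphere data;
# for a perfectly homogeneous field N(<r) ~ r^3, so D2(r) = 3 at all scales.
import numpy as np

r = np.logspace(1, 3, 30)              # sphere radii (e.g. in Mpc/h)
N = 4.0 / 3.0 * np.pi * r**3           # count-in-sphere for a uniform unit density

D2 = np.gradient(np.log(N), np.log(r))
print(np.allclose(D2, 3.0))            # True: homogeneity gives D2(r) = 3
```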

  13. Deflection load characteristics of laser-welded orthodontic wires.

    PubMed

    Watanabe, Etsuko; Stigall, Garrett; Elshahawy, Waleed; Watanabe, Ikuya

    2012-07-01

    To compare the deflection load characteristics of homogeneous and heterogeneous joints made by laser welding using various types of orthodontic wires. Four kinds of straight orthodontic rectangular wires (0.017 inch × 0.025 inch) were used: stainless-steel (SS), cobalt-chromium-nickel (Co-Cr-Ni), beta-titanium alloy (β-Ti), and nickel-titanium (Ni-Ti). Homogeneous and heterogeneous end-to-end joints (12 mm long each) were made by Nd:YAG laser welding. Two types of welding methods were used: two-point welding and four-point welding. Nonwelded wires were also used as a control. Deflection load (N) was measured by conducting the three-point bending test. The data (n  =  5) were statistically analyzed using analysis of variance/Tukey test (P < .05). The deflection loads for control wires measured were as follows: SS: 21.7 ± 0.8 N; Co-Cr-Ni: 20.0 ± 0.3 N; β-Ti: 13.9 ± 1.3 N; and Ni-Ti: 6.6 ± 0.4 N. All of the homogeneously welded specimens showed lower deflection loads compared to corresponding control wires and exhibited higher deflection loads compared to heterogeneously welded combinations. For homogeneous combinations, Co-Cr-Ni/Co-Cr-Ni showed a significantly (P < .05) higher deflection load than those of the remaining homogeneously welded groups. In heterogeneous combinations, SS/Co-Cr-Ni and β-Ti/Ni-Ti showed higher deflection loads than those of the remaining heterogeneously welded combinations (significantly higher for SS/Co-Cr-Ni). Significance (P < .01) was shown for the interaction between the two factors (materials combination and welding method). However, no significant difference in deflection load was found between four-point and two-point welding in each homogeneous or heterogeneous combination. Heterogeneously laser-welded SS/Co-Cr-Ni and β-Ti/Ni-Ti wires provide a deflection load that is comparable to that of homogeneously welded orthodontic wires.

  14. Effect of Varroa destructor, Wounding and Varroa Homogenate on Gene Expression in Brood and Adult Honey Bees.

    PubMed

    Koleoglu, Gun; Goodwin, Paul H; Reyes-Quintana, Mariana; Hamiduzzaman, Mollah Md; Guzman-Novoa, Ernesto

    2017-01-01

    Honey bee (Apis mellifera) gene expression related to immunity for hymenoptaecin (AmHym) and defensin-1 (AmDef-1), longevity for vitellogenin (AmVit2) and stem cell proliferation for poly U binding factor 68 kDa (AmPuf68) was compared following Varroa destructor parasitism, buffer injection and injection of V. destructor compounds in its homogenate. In adults, V. destructor parasitism decreased expression of all four genes, while buffer injection decreased expression of AmHym, AmPuf68 and AmVit2, and homogenate injection decreased expression of AmPuf68 and AmVit2 but increased expression of AmDef-1 relative to their respective controls. The effect of V. destructor parasitism in adults relative to the controls was not significantly different from buffer injection for AmHym and AmVit2 expression, and it was not significantly different from homogenate injection for AmPuf68 and AmVit2. In brood, V. destructor parasitism, buffer injection and homogenate injection decreased AmVit2 expression, whereas AmHym expression was decreased by V. destructor parasitism but increased by buffer and homogenate injection relative to the controls. The effect of varroa parasitism in brood was not significantly different from buffer or homogenate injection for AmPuf68 and AmVit2. Expression levels of the four genes did not correlate with detectable viral levels in either brood or adults. The results of this study indicate that the relative effects of V. destructor parasitism on honey bee gene expression are also shared with other types of stresses. Therefore, some of the effects of V. destructor on honey bees may be mostly due to wounding and injection of foreign compounds into the hemolymph of the bee during parasitism. Although both brood and adults are naturally parasitized by V. destructor, their gene expression responded differently, probably the result of different mechanisms of host responses during development.

  15. Does prescribed burning result in biotic homogenization of coastal heathlands?

    PubMed

    Velle, Liv Guri; Nilsen, Liv Sigrid; Norderhaug, Ann; Vandvik, Vigdis

    2014-05-01

    Biotic homogenization due to replacement of native biodiversity by widespread generalist species has been demonstrated in a number of ecosystems and taxonomic groups worldwide, causing growing conservation concern. Human disturbance is a key driver of biotic homogenization, suggesting potential conservation challenges in seminatural ecosystems, where anthropogenic disturbances such as grazing and burning are necessary for maintaining ecological dynamics and functioning. We test whether prescribed burning results in biotic homogenization in the coastal heathlands of north-western Europe, a seminatural landscape where extensive grazing and burning has constituted the traditional land-use practice over the past 6000 years. We compare the beta-diversity before and after fire at three ecological scales: within local vegetation patches, between wet and dry heathland patches within landscapes, and along a 470 km bioclimatic gradient. Within local patches, we found no evidence of homogenization after fire; species richness increased, and the species that entered the burnt Calluna stands were not widespread generalists but native grasses and herbs characteristic of the heathland system. At the landscape scale, we saw a weak homogenization as wet and dry heathland patches became more compositionally similar after fire. This was because of a decrease in habitat-specific species unique to either wet or dry habitats and postfire colonization by a set of heathland specialists that established in both habitat types. Along the bioclimatic gradient, species that increased after fire generally had more specific environmental requirements and narrower geographical distributions than the prefire flora, resulting in a biotic 'heterogenisation' after fire. Our study demonstrates that human disturbance does not necessarily cause biotic homogenization, but that continuation of traditional land-use practices can instead be crucial for the maintenance of the diversity and ecological function of a seminatural ecosystem. The species that established after prescribed burning were heathland specialists with relatively narrow geographical ranges. © 2013 John Wiley & Sons Ltd.

  16. Effect of Varroa destructor, Wounding and Varroa Homogenate on Gene Expression in Brood and Adult Honey Bees

    PubMed Central

    Koleoglu, Gun; Goodwin, Paul H.; Reyes-Quintana, Mariana; Hamiduzzaman, Mollah Md.; Guzman-Novoa, Ernesto

    2017-01-01

    Honey bee (Apis mellifera) gene expression related to immunity for hymenoptaecin (AmHym) and defensin-1 (AmDef-1), longevity for vitellogenin (AmVit2) and stem cell proliferation for poly U binding factor 68 kDa (AmPuf68) was compared following Varroa destructor parasitism, buffer injection and injection of V. destructor compounds in its homogenate. In adults, V. destructor parasitism decreased expression of all four genes, while buffer injection decreased expression of AmHym, AmPuf68 and AmVit2, and homogenate injection decreased expression of AmPuf68 and AmVit2 but increased expression of AmDef-1 relative to their respective controls. The effect of V. destructor parasitism in adults relative to the controls was not significantly different from buffer injection for AmHym and AmVit2 expression, and it was not significantly different from homogenate injection for AmPuf68 and AmVit2. In brood, V. destructor parasitism, buffer injection and homogenate injection decreased AmVit2 expression, whereas AmHym expression was decreased by V. destructor parasitism but increased by buffer and homogenate injection relative to the controls. The effect of varroa parasitism in brood was not significantly different from buffer or homogenate injection for AmPuf68 and AmVit2. Expression levels of the four genes did not correlate with detectable viral levels in either brood or adults. The results of this study indicate that the relative effects of V. destructor parasitism on honey bee gene expression are also shared with other types of stresses. Therefore, some of the effects of V. destructor on honey bees may be mostly due to wounding and injection of foreign compounds into the hemolymph of the bee during parasitism. Although both brood and adults are naturally parasitized by V. destructor, their gene expression responded differently, probably the result of different mechanisms of host responses during development. PMID:28081188

  17. Direct vibro-elastography FEM inversion in Cartesian and cylindrical coordinate systems without the local homogeneity assumption

    NASA Astrophysics Data System (ADS)

    Honarvar, M.; Lobo, J.; Mohareri, O.; Salcudean, S. E.; Rohling, R.

    2015-05-01

    To produce images of tissue elasticity, the vibro-elastography technique involves applying a steady-state multi-frequency vibration to tissue, estimating displacements from ultrasound echo data, and using the estimated displacements in an inverse elasticity problem with the shear modulus spatial distribution as the unknown. In order to fully solve the inverse problem, all three displacement components are required. However, using ultrasound, the axial component of the displacement is measured much more accurately than the other directions. Therefore, simplifying assumptions must be used in this case. Usually, the equations of motion are transformed into a Helmholtz equation by assuming tissue incompressibility and local homogeneity. The local homogeneity assumption causes significant imaging artifacts in areas of varying elasticity. In this paper, we remove the local homogeneity assumption. In particular we introduce a new finite element based direct inversion technique in which only the coupling terms in the equation of motion are ignored, so it can be used with only one component of the displacement. Both Cartesian and cylindrical coordinate systems are considered. The use of multi-frequency excitation also allows us to obtain multiple measurements and reduce artifacts in areas where the displacement of one frequency is close to zero. The proposed method was tested in simulations and experiments against a conventional approach in which local homogeneity is assumed. The results show significant improvements in elasticity imaging with the new method compared to previous methods that assume local homogeneity. For example, in simulations the contrast to noise ratio (CNR) for the region with a spherical inclusion increases from an average value of 1.5 to 17 when the proposed method is used instead of local inversion with the homogeneity assumption, and similarly in the prostate phantom experiment the CNR improved from an average value of 1.6 to about 20.

  18. Phantom Preparation and Optical Property Determination

    NASA Astrophysics Data System (ADS)

    He, Di; He, Jie; Mao, Heng

    2018-12-01

    Tissue-like optical phantoms are important for testing new imaging algorithms. Homogeneous optical phantoms with well-determined optical properties are the first step towards making a proper heterogeneous phantom for multi-modality imaging. Typical recipes for such phantoms consist of epoxy resin, hardener, India ink and titanium oxide. By altering the concentrations of India ink and titanium oxide and carefully mixing all the ingredients, multiple homogeneous phantoms with different absorption and scattering coefficients can be obtained. After fabricating the phantoms, their individual optical properties, i.e. the absorption and scattering coefficients, must be determined. This is achieved by solving the diffusion equation for each phantom treated as a homogeneous slab under canonical illumination. We solve the diffusion equation for the homogeneous slab in the frequency domain and obtain a formula for the theoretical measurements. With our steady-state diffuse optical tomography (DOT) imaging system, we can measure the real distribution of the incident light produced by a laser. Using this measured source distribution and the derived formula, numerical experiments show how the measurements change as the absorption and scattering coefficients are varied. These experiments indicate that the measurements alone are not sufficient to recover unique optical properties in the steady-state DOT problem. Therefore, to determine the optical properties of a homogeneous slab, one coefficient is fixed first and optimization methods are used to find the other. By assembling multiple homogeneous slab phantoms with different optical properties, a heterogeneous phantom suitable for testing multi-modality imaging algorithms can then be obtained. In this paper, we describe how to make the phantoms, derive a formula to solve the diffusion equation, demonstrate the non-uniqueness of the steady-state DOT problem by analysing numerical results of our formula, and finally propose a possible way to determine the optical properties of a homogeneous slab for our future work.
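
    For orientation, the sketch below evaluates the much simpler steady-state point-source diffusion solution for an infinite homogeneous medium rather than the slab, frequency-domain solution the authors derive; the optical coefficients are typical-phantom guesses.

```python
# Steady-state diffuse fluence around an isotropic point source in an infinite
# homogeneous medium (a simplification of the slab forward model discussed above).
import numpy as np

def fluence_infinite_medium(r_cm, mua, musp):
    """Fluence at distance r (cm) for absorption mua and reduced scattering musp (1/cm)."""
    D = 1.0 / (3.0 * (mua + musp))            # diffusion coefficient (cm)
    mu_eff = np.sqrt(mua / D)                 # effective attenuation (1/cm)
    return np.exp(-mu_eff * r_cm) / (4.0 * np.pi * D * r_cm)

r = np.linspace(0.5, 3.0, 6)                  # source-detector distances (cm)
print(fluence_infinite_medium(r, mua=0.05, musp=10.0))
```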

  19. The Role of Grammatical Category Information in Spoken Word Retrieval

    PubMed Central

    Duràn, Carolina Palma; Pillon, Agnesa

    2011-01-01

    We investigated the role of lexical syntactic information such as grammatical gender and category in spoken word retrieval processes by using a blocking paradigm in picture and written word naming experiments. In Experiments 1, 3, and 4, we found that the naming of target words (nouns) from pictures or written words was faster when these target words were named within a list where only words from the same grammatical category had to be produced (homogeneous category list: all nouns) than when they had to be produced within a list comprising also words from another grammatical category (heterogeneous category list: nouns and verbs). On the other hand, we detected no significant facilitation effect when the target words had to be named within a homogeneous gender list (all masculine nouns) compared to a heterogeneous gender list (both masculine and feminine nouns). In Experiment 2, using the same blocking paradigm by manipulating the semantic category of the items, we found that naming latencies were significantly slower in the semantic category homogeneous in comparison with the semantic category heterogeneous condition. Thus semantic category homogeneity caused an interference, not a facilitation effect like grammatical category homogeneity. Finally, in Experiment 5, nouns in the heterogeneous category condition had to be named just after a verb (category-switching position) or a noun (same-category position). We found a facilitation effect of category homogeneity but no significant effect of position, which showed that the effect of category homogeneity found in Experiments 1, 3, and 4 was not due to a cost of switching between grammatical categories in the heterogeneous grammatical category list. These findings supported the hypothesis that grammatical category information impacts word retrieval processes in speech production, even when words are to be produced in isolation. They are discussed within the context of extant theories of lexical production. PMID:22110465

  20. Analytical Framework for Identifying and Differentiating Recent Hitchhiking and Severe Bottleneck Effects from Multi-Locus DNA Sequence Data

    DOE PAGES

    Sargsyan, Ori

    2012-05-25

    Hitchhiking and severe bottleneck effects have an impact on the dynamics of genetic diversity of a population by inducing homogenization at a single locus and at the genome-wide scale, respectively. As a result, identification and differentiation of the signatures of such events from DNA sequence data at a single locus is challenging. This study develops an analytical framework for identifying and differentiating recent homogenization events at multiple neutral loci in low recombination regions. The dynamics of genetic diversity at a locus after a recent homogenization event is modeled according to the infinite-sites mutation model and the Wright-Fisher model of reproduction with constant population size. In this setting, I derive analytical expressions for the distribution, mean, and variance of the number of polymorphic sites in a random sample of DNA sequences from a locus affected by a recent homogenization event. Based on this framework, three likelihood-ratio based tests are presented for identifying and differentiating recent homogenization events at multiple loci. Lastly, I apply the framework to two data sets. First, I consider human DNA sequences from four non-coding loci on different chromosomes for inferring the evolutionary history of modern human populations. The results suggest, in particular, that recent homogenization events at the loci are identifiable when the effective human population size is 50000 or greater in contrast to 10000, and the estimates of the recent homogenization events agree with the “Out of Africa” hypothesis. Second, I use HIV DNA sequences from HIV-1-infected patients to infer the times of HIV seroconversions. The estimates are contrasted with other estimates derived as the mid-time point between the last HIV-negative and first HIV-positive screening tests. Finally, the results show that significant discrepancies can exist between the estimates.
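
    For orientation only: under the standard neutral infinite-sites model the expected number of polymorphic sites in a sample of n sequences is E[S] = θ Σ_{i=1}^{n-1} 1/i. The paper derives the corresponding distribution after a recent homogenization event, which is not reproduced in this sketch; θ and n below are arbitrary.

```python
# Baseline expectation for the number of segregating (polymorphic) sites under the
# standard neutral infinite-sites model; the post-homogenization distribution derived
# in the paper is not reproduced here.
def expected_segregating_sites(theta, n):
    return theta * sum(1.0 / i for i in range(1, n))

print(expected_segregating_sites(theta=5.0, n=20))   # ~17.7 for these illustrative values
```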

  1. Proton Minibeam Radiation Therapy Reduces Side Effects in an In Vivo Mouse Ear Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Girst, Stefanie, E-mail: stefanie.girst@unibw.de; Greubel, Christoph; Reindl, Judith

    Purpose: Proton minibeam radiation therapy is a novel approach to minimize normal tissue damage in the entrance channel by spatial fractionation while keeping tumor control through a homogeneous tumor dose using beam widening with an increasing track length. In the present study, the dose distributions for homogeneous broad beam and minibeam irradiation sessions were simulated. Also, in an animal study, acute normal tissue side effects of proton minibeam irradiation were compared with homogeneous irradiation in a tumor-free mouse ear model to account for the complex effects on the immune system and vasculature in an in vivo normal tissue model. Methods and Materials: At the ion microprobe SNAKE, 20-MeV protons were administered to the central part (7.2 × 7.2 mm²) of the ear of BALB/c mice, using either a homogeneous field with a dose of 60 Gy or 16 minibeams with a nominal 6000 Gy (4 × 4 minibeams, size 0.18 × 0.18 mm², with a distance of 1.8 mm). The same average dose was used over the irradiated area. Results: No ear swelling or other skin reactions were observed at any point after minibeam irradiation. In contrast, significant ear swelling (up to fourfold), erythema, and desquamation developed in homogeneously irradiated ears 3 to 4 weeks after irradiation. Hair loss and the disappearance of sebaceous glands were only detected in the homogeneously irradiated fields. Conclusions: These results show that proton minibeam radiation therapy results in reduced adverse effects compared with conventional homogeneous broad-beam irradiation and, therefore, might have the potential to decrease the incidence of side effects resulting from clinical proton and/or heavy ion therapy.

  2. Isolation of Salmonella from alfalfa seed and demonstration of impaired growth of heat-injured cells in seed homogenates.

    PubMed

    Liao, Ching-Hsing; Fett, William F

    2003-05-15

    Three major foodborne outbreaks of salmonellosis in 1998 and 1999 were linked to the consumption of raw alfalfa sprouts. In this report, an improved method is described for isolation of Salmonella from alfalfa seed lots, which had been implicated in these outbreaks. From each seed lot, eight samples each containing 25 g of seed were tested for the presence of Salmonella by the US FDA Bacteriological Analytical Manual (BAM) procedure and by a modified method applying two successive pre-enrichment steps. Depending on the seed lot, one to four out of eight samples tested positive for Salmonella by the standard procedure and two to seven out of eight samples tested positive by the modified method. Thus, the use of two consecutive pre-enrichment steps led to a higher detection rate than a single pre-enrichment step. This result indirectly suggested that Salmonella cells on contaminated seeds might have been injured and failed to fully resuscitate in pre-enrichment broth containing seed components during the first 24 h of incubation. Responses of heat-injured Salmonella cells grown in buffered peptone water (BPW) and in three alfalfa seed homogenates were investigated. For preparation of seed homogenates, 25 g of seeds were homogenized in 200 ml of BPW using a laboratory Stomacher and subsequently held at 37 degrees C for 24 h prior to centrifugation and filtration. While untreated cells grew at about the same rate in BPW and in seed homogenates, heat-injured cells (52 degrees C, 10 min) required approximately 0.5 to 4.0 h longer to resuscitate in seed homogenates than in BPW. This result suggests that the alfalfa seed components or fermented metabolites from native bacteria hinder the repair and growth of heat-injured cells. This study also shows that an additional pre-enrichment step increases the frequency of isolation of Salmonella from naturally contaminated seeds, possibly by alleviating the toxic effect of seed homogenates on repair or growth of injured cells.

  3. Recalibrating disease parameters for increasing realism in modeling epidemics in closed settings.

    PubMed

    Bioglio, Livio; Génois, Mathieu; Vestergaard, Christian L; Poletto, Chiara; Barrat, Alain; Colizza, Vittoria

    2016-11-14

    The homogeneous mixing assumption is widely adopted in epidemic modelling for its parsimony and represents the building block of more complex approaches, including very detailed agent-based models. The latter assume homogeneous mixing within schools, workplaces and households, mostly due to the lack of detailed information on human contact behaviour within these settings. The recent data availability on high-resolution face-to-face interactions makes it now possible to assess the goodness of this simplified scheme in reproducing relevant aspects of the infection dynamics. We consider empirical contact networks gathered in different contexts, as well as synthetic data obtained through realistic models of contacts in structured populations. We perform stochastic spreading simulations on these contact networks and in populations of the same size under a homogeneous mixing hypothesis. We adjust the epidemiological parameters of the latter in order to fit the prevalence curve of the contact epidemic model. We quantify the agreement by comparing epidemic peak times, peak values, and epidemic sizes. Good approximations of the peak times and peak values are obtained with the homogeneous mixing approach, with a median relative difference smaller than 20% in all cases investigated. Accuracy in reproducing the peak time depends on the setting under study, while for the peak value it is independent of the setting. Recalibration is found to be linear in the epidemic parameters used in the contact data simulations, showing changes across empirical settings but robustness across groups and population sizes. An adequate rescaling of the epidemiological parameters can yield a good agreement between the epidemic curves obtained with a real contact network and a homogeneous mixing approach in a population of the same size. The use of such recalibrated homogeneous mixing approximations would enhance the accuracy and realism of agent-based simulations and limit the intrinsic biases of the homogeneous mixing assumption.
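
    A toy version of the homogeneous-mixing baseline used in the recalibration: a deterministic SIR model integrated with a simple Euler scheme, from which the epidemic peak time, peak value and final size are read off. The parameter values are arbitrary, not those fitted in the study.

```python
# Deterministic SIR model under homogeneous mixing; peak time, peak prevalence and
# epidemic size are the quantities compared against contact-network simulations above.
import numpy as np

def sir_homogeneous(beta, gamma, n=1000, i0=10, dt=0.05, t_max=200.0):
    s, i, r = n - i0, i0, 0.0
    times = np.arange(0.0, t_max, dt)
    prevalence = np.empty_like(times)
    for k, _ in enumerate(times):
        prevalence[k] = i
        new_inf = beta * s * i / n * dt       # new infections in this step
        new_rec = gamma * i * dt              # new recoveries in this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return times, prevalence, r

t, prev, final_size = sir_homogeneous(beta=0.5, gamma=0.2)
print("peak time:", t[prev.argmax()], "peak value:", prev.max(), "epidemic size:", final_size)
```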

  4. Capteur de CO2 à fibres optiques par absorption moléculaire à 4,3 μm

    NASA Astrophysics Data System (ADS)

    Bendamardji, S.; Alayli, Y.; Huard, S.

    1996-04-01

    This paper describes a remote optical fibre sensor for carbon dioxide detection by molecular absorption in the mid-infrared (4.3 μm), corresponding to the fundamental mode ν3. To overcome the strong attenuation of optical fibres in this spectral region, an optical-powering (opto-alimentation) technique is used that shifts the working wavelength from 4.3 μm to 860 nm and allows a standard 50/125 optical fibre to link the measurement site and the monitoring site. The absorption has been simulated using an original model of the absorption spectrum, and the resulting calibration curves allow the sensor to detect partial pressures above 100 μbar with a minimum error margin of 100 μbar, which is acceptable given the intended use of the device. The sensor is designed to monitor the CO2 level in gas-enriched agricultural greenhouses.

  5. Modelisation agregee de chauffe-eau electriques commandes par champ moyen pour la gestion des charges dans un reseau

    NASA Astrophysics Data System (ADS)

    Losseau, Romain

    The ongoing energy transition is about to entail important changes in the way we use and manage energy. In this view, smart grids are expected to play a significant part through the use of intelligent storage techniques. Initiated in 2014, the SmartDesc project follows this trend to create an innovative load management program by exploiting the thermal storage associated with electric water heaters in residential households. The device control algorithms rely on the recent theory of mean field games to achieve a decentralized control of the water heater temperatures, producing an aggregate optimal trajectory designed to smooth the electric demand of a neighborhood. Currently, this theory does not include the power and temperature constraints that arise from the tank heating system or that are necessary for the user's safety and comfort. Therefore, a trajectory violating these constraints would not be feasible and would not produce the expected load smoothing. This master's thesis presents a method to detect the non-feasibility of a target trajectory, based on the Kolmogorov equations associated with the controlled electric water heaters, and suggests a way to correct it so as to make it achievable under constraints. First, a model of the water heaters under temperature constraints, based on partial differential equations, is presented. Subsequently, a numerical scheme is developed to simulate it and is applied to the mean field control. The results of the mean field control with and without constraints are compared, and non-feasibilities of the target trajectory are highlighted when violations occur. The last part of the thesis is dedicated to developing an accelerated version of the mean field algorithm and a method for correcting the target trajectory so as to enlarge as much as possible the set of achievable profiles.

  6. Modelling a radiology department service using a VDL integrated approach.

    PubMed

    Guglielmino, Maria Gabriella; Celano, Giovanni; Costa, Antonio; Fichera, Sergio

    2009-01-01

    The healthcare industry is facing several challenges, such as cost reduction and quality improvement of the provided services. Engineering studies can be very useful in supporting organizational and management processes. Healthcare service efficiency depends on strong collaboration between clinical and engineering experts, especially when it comes to analyzing the system and its constraints in detail and, subsequently, deciding on the reengineering of some key activities. The purpose of this paper is to propose a case study showing how a mix of representation tools allows the manager of a radiology department to solve human and technological resource re-organizational issues that have to be faced due to the introduction of a new technology and a new portfolio of services. In order to simulate the activities within the radiology department and examine the relationship between human and technological resources, different visual diagrammatic language (VDL) techniques have been implemented to capture knowledge about the heterogeneous factors related to healthcare service delivery. In particular, flow charts, IDEF0 diagrams and Petri nets have been successfully integrated with one another as modelling tools. The simulation study performed through the application of the aforementioned VDL techniques suggests the opportunity of re-organizing the nurse activities within the radiology department. The re-organization of a healthcare service, and in particular of a radiology department, by means of joint flow charts, IDEF0 diagrams and Petri nets is a poorly investigated topic in the literature. This paper demonstrates how flow charts and IDEF0 can help people working within the department to understand the weak points of their organization and constitute an efficient base of knowledge for the implementation of a Petri net aimed at improving departmental performance.

  7. Etude de la dynamique des porteurs dans des nanofils de silicium par spectroscopie terahertz

    NASA Astrophysics Data System (ADS)

    Beaudoin, Alexandre

    This thesis presents a study of the electrical conduction properties and the temporal dynamics of charge carriers in silicon nanowires probed by terahertz radiation. The cases of unintentionally doped and n-type doped silicon nanowires are compared for different configurations of the experimental setup. Terahertz transmission spectroscopy measurements show that it is possible to detect the presence of dopants in the nanowires through their absorption of terahertz radiation (~1-12 meV). The difficulties of modelling the transmission of an electromagnetic pulse through a system of nanowires are also discussed. Differential detection, a modification of the terahertz spectroscopy system, is tested and its performance is compared with that of the standard characterization setup. Instructions and recommendations for implementing this type of measurement are included. The results of an optical pump-terahertz probe experiment are also presented. In this experiment, the charge carriers temporarily created by the absorption of the optical pump (lambda ~ 800 nm) in the nanowires (the photocarriers) add to the carriers initially present and therefore increase the absorption of the terahertz radiation. First, the anisotropy of the absorption of the terahertz radiation and of the optical pump by the nanowires is demonstrated. Second, the photocarrier recombination time is studied as a function of the number of injected photocarriers, and a hypothesis explaining the behaviours observed for the undoped and n-doped nanowires is presented. Third, the photoconductivity is extracted for the undoped and n-doped nanowires over a range of 0.5 to 2 THz, and a fit to the photoconductivity provides an estimate of the number of dopants in the n-doped nanowires. Keywords: nanowire, silicon, terahertz, conductivity, spectroscopy, photoconductivity.

  8. Étude de la réponse photoacoustique d'objets massifs en 3D

    NASA Astrophysics Data System (ADS)

    Séverac, H.; Mousseigne, M.; Franceschi, J. L.

    1996-11-01

    In sectors where reliability is of capital importance, such as microelectronics or materials physics, it is particularly attractive to obtain information on material behaviour without resorting to destructive tests such as chemical analyses or mechanical testing. The non-destructive testing method presented here is based on the generation of waves by a laser beam focused on the surface of a sample, without reaching the ablation regime. Studying the propagation of the various waves in three-dimensional space provides quantitative information on the response of the materials examined. Modelling of the thermoelastic phenomena allowed a rigorous analytical treatment and gave rise to simulation software, written in Turbo-Pascal, for more general studies.

  9. HOMOGENEOUS CATALYTIC OXIDATION OF HYDROCARBONS IN ALTERNATIVE SOLVENTS

    EPA Science Inventory

    Homogeneous Catalytic Oxidations of Hydrocarbons in Alternative Solvent Systems

    Michael A. Gonzalez* and Thomas M. Becker, Sustainable Technology Division, Office of Research and Development; United States Environmental Protection Agency, 26 West Martin Luther King Drive, ...

  10. Nano-catalysts: Bridging the gap between homogeneous and heterogeneous catalysis

    EPA Science Inventory

    Functionalized nanoparticles have emerged as sustainable alternatives to conventional materials, as robust, high-surface-area heterogeneous catalyst supports. We envisioned a catalyst system, which can bridge the homogenous and heterogeneous system. Postsynthetic surface modifica...

  11. More than Just Convenient: The Scientific Merits of Homogeneous Convenience Samples

    PubMed Central

    Jager, Justin; Putnick, Diane L.; Bornstein, Marc H.

    2017-01-01

    Despite their disadvantaged generalizability relative to probability samples, non-probability convenience samples are the standard within developmental science, and likely will remain so because probability samples are cost-prohibitive and most available probability samples are ill-suited to examine developmental questions. In lieu of focusing on how to eliminate or sharply reduce reliance on convenience samples within developmental science, here we propose how to augment their advantages when it comes to understanding population effects as well as subpopulation differences. Although all convenience samples have less clear generalizability than probability samples, we argue that homogeneous convenience samples have clearer generalizability relative to conventional convenience samples. Therefore, when researchers are limited to convenience samples, they should consider homogeneous convenience samples as a positive alternative to conventional (or heterogeneous) convenience samples. We discuss future directions as well as potential obstacles to expanding the use of homogeneous convenience samples in developmental science. PMID:28475254

  12. Collective strong coupling with homogeneous Rabi frequencies using a 3D lumped element microwave resonator

    NASA Astrophysics Data System (ADS)

    Angerer, Andreas; Astner, Thomas; Wirtitsch, Daniel; Sumiya, Hitoshi; Onoda, Shinobu; Isoya, Junichi; Putz, Stefan; Majer, Johannes

    2016-07-01

    We design and implement 3D-lumped element microwave cavities that spatially focus magnetic fields to a small mode volume. They allow coherent and uniform coupling to electron spins hosted by nitrogen vacancy centers in diamond. We achieve large homogeneous single spin coupling rates, with an enhancement of more than one order of magnitude compared to standard 3D cavities with a fundamental resonance at 3 GHz. Finite element simulations confirm that the magnetic field distribution is homogeneous throughout the entire sample volume, with a root mean square deviation of 1.54%. With a sample containing 10^17 nitrogen vacancy electron spins, we achieve a collective coupling strength of Ω = 12 MHz, a cooperativity factor C = 27, and clearly enter the strong coupling regime. This makes it possible to interface a macroscopic spin ensemble with microwave circuits, and the homogeneous Rabi frequency paves the way to manipulate the full ensemble population in a coherent way.

  13. Point matching: A new electronic method for homogenizing the phase characteristics of giant magnetoimpedance sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, E. Costa, E-mail: edusilva@ele.puc-rio.br; Gusmão, L. A. P.; Barbosa, C. R. Hall

    2014-08-15

    Recently, our research group at PUC-Rio discovered that magnetic transducers based on the impedance phase characteristics of GMI sensors have the potential to increase sensitivity by a factor of one hundred compared to magnitude-based GMI transducers. These GMI sensors can be employed in the measurement of ultra-weak magnetic fields, whose intensities are even lower than the environmental magnetic noise. A traditional solution for cancelling electromagnetic noise and interference makes use of gradiometric configurations, but the performance is strongly tied to the homogeneity of the sensing elements. This paper presents a new method that uses electronic circuits to modify the equivalent impedance of the GMI samples, aiming at homogenizing their phase characteristics and, consequently, improving the performance of gradiometric configurations based on GMI samples. A performance comparison between this new method and another homogenization method previously developed is also shown.

  14. Variable valve timing in a homogenous charge compression ignition engine

    DOEpatents

    Lawrence, Keith E.; Faletti, James J.; Funke, Steven J.; Maloney, Ronald P.

    2004-08-03

    The present invention relates generally to the field of homogenous charge compression ignition engines, in which fuel is injected when the cylinder piston is relatively close to the bottom dead center position for its compression stroke. The fuel mixes with air in the cylinder during the compression stroke to create a relatively lean homogeneous mixture that preferably ignites when the piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage can result. The present invention utilizes internal exhaust gas recirculation and/or compression ratio control to control the timing of ignition events and combustion duration in homogeneous charge compression ignition engines. Thus, at least one electro-hydraulic assist actuator is provided that is capable of mechanically engaging at least one cam actuated intake and/or exhaust valve.

  15. Phenoloxidase activity in larval and juvenile homogenates and adult plasma and haemocytes of bivalve molluscs.

    PubMed

    Luna-González, Antonio; Maeda-Martínez, Alfonso N; Vargas-Albores, Francisco; Ascencio-Valle, Felipe; Robles-Mungaray, Miguel

    2003-10-01

    Phenoloxidase (PO) activity was studied in larval and juvenile homogenates and in the plasma and haemocytes of adult Crassostrea gigas, Argopecten ventricosus, Nodipecten subnodosus, and Atrina maura. Samples were tested for the presence of PO activity by incubation with the substrate L-3, 4-dihydroxyphenylalanine using trypsin, alpha-chymotrypsin, laminarin, lipopolysaccharides (LPS), and sodium dodecyl sulphate (SDS) to elicit activation of the prophenoloxidase (proPO) system. PO activity was not detected in larval homogenate. In juvenile homogenate, PO activity was found only in C. gigas and N. subnodosus. PO activity was present in adult samples and was enhanced by elicitors in the plasma of all species tested, but in haemocyte lysate supernatant (HLS) of only N. subnodosus. Activation of proPO by laminarin was suppressed by a protease inhibitor cocktail (P-2714) in plasma and HLS of all species tested.

  16. High-Efficiency Inhibition of Gravity Segregation in Al-Bi Immiscible Alloys by Adding Lanthanum

    NASA Astrophysics Data System (ADS)

    Jia, Peng; Zhang, Jinyang; Geng, Haoran; Teng, Xinying; Zhao, Degang; Yang, Zhongxi; Wang, Yi; Hu, Song; Xiang, Jun; Hu, Xun

    2018-05-01

    The inhibition of gravity segregation has been a long-standing challenge in the fabrication and application of homogeneous immiscible alloys. Therefore, the effect of the rare-earth element La on the gravity segregation of Al-Bi immiscible alloys was investigated to understand the homogenization mechanism. The results showed that the addition of La can completely suppress the gravity segregation. This is attributed to the nucleation of the Bi-rich liquid phase on the in-situ produced LaBi2 phase and the change in the shape of the LaBi2@Bi droplets. In addition, a novel strategy is developed to prepare homogeneous immiscible alloys through the addition of rare-earth elements. This strategy is not only applicable to other immiscible alloys but is also conducive to finding more elements that suppress gravity segregation. This study provides a useful reference for the fabrication of homogeneous immiscible alloys.

  17. Numerical homogenization of elastic and thermal material properties for metal matrix composites (MMC)

    NASA Astrophysics Data System (ADS)

    Schindler, Stefan; Mergheim, Julia; Zimmermann, Marco; Aurich, Jan C.; Steinmann, Paul

    2017-01-01

    A two-scale material modeling approach is adopted in order to determine macroscopic thermal and elastic constitutive laws and the respective parameters for metal matrix composite (MMC). Since the common homogenization framework violates the thermodynamical consistency for non-constant temperature fields, i.e., the dissipation is not conserved through the scale transition, the respective error is calculated numerically in order to prove the applicability of the homogenization method. The thermomechanical homogenization is applied to compute the macroscopic mass density, thermal expansion, elasticity, heat capacity and thermal conductivity for two specific MMCs, i.e., aluminum alloy Al2024 reinforced with 17 or 30 % silicon carbide particles. The temperature dependency of the material properties has been considered in the range from 0 to 500°C, the melting temperature of the alloy. The numerically determined material properties are validated with experimental data from the literature as far as possible.
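
    The sketch below is not the paper's two-scale finite-element homogenization; it only illustrates the simplest volume-average estimates (Voigt and Reuss bounds) for effective properties of an Al2024/SiC composite at the two particle fractions mentioned in the abstract. The constituent property values are rough room-temperature figures assumed for illustration.

```python
# Minimal sketch: Voigt (iso-strain) and Reuss (iso-stress) bounds for the
# effective Young's modulus and thermal conductivity of a particle-reinforced
# metal matrix composite. Constituent properties are assumed values, not data
# from the paper.

def voigt(p_m, p_p, vf):
    """Volume-weighted arithmetic mean (upper bound)."""
    return (1.0 - vf) * p_m + vf * p_p

def reuss(p_m, p_p, vf):
    """Volume-weighted harmonic mean (lower bound)."""
    return 1.0 / ((1.0 - vf) / p_m + vf / p_p)

E_matrix, E_particle = 73.0, 410.0    # GPa, Al2024 and SiC (assumed)
k_matrix, k_particle = 120.0, 110.0   # W/(m K), Al2024 and SiC (assumed)

for vf in (0.17, 0.30):               # particle volume fractions from the paper
    print(f"vf = {vf:.2f}: "
          f"E in [{reuss(E_matrix, E_particle, vf):.1f}, {voigt(E_matrix, E_particle, vf):.1f}] GPa, "
          f"k in [{reuss(k_matrix, k_particle, vf):.1f}, {voigt(k_matrix, k_particle, vf):.1f}] W/(m K)")
```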

  18. Determination of macro-scale soil properties from pore-scale structures: model derivation.

    PubMed

    Daly, K R; Roose, T

    2018-01-01

    In this paper, we use homogenization to derive a set of macro-scale poro-elastic equations for soils composed of rigid solid particles, air-filled pore space and a poro-elastic mixed phase. We consider the derivation in the limit of large deformation and show that by solving representative problems on the micro-scale we can parametrize the macro-scale equations. To validate the homogenization procedure, we compare the predictions of the homogenized equations with those of the full equations for a range of different geometries and material properties. We show that the results differ by [Formula: see text] for all cases considered. The success of the homogenization scheme means that it can be used to determine the macro-scale poro-elastic properties of soils from the underlying structure. Hence, it will prove a valuable tool in both characterization and optimization.

  19. Rapid biotic homogenization of marine fish assemblages

    PubMed Central

    Magurran, Anne E.; Dornelas, Maria; Moyes, Faye; Gotelli, Nicholas J.; McGill, Brian

    2015-01-01

    The role human activities play in reshaping biodiversity is increasingly apparent in terrestrial ecosystems. However, the responses of entire marine assemblages are not well-understood, in part, because few monitoring programs incorporate both spatial and temporal replication. Here, we analyse an exceptionally comprehensive 29-year time series of North Atlantic groundfish assemblages monitored over 5° latitude to the west of Scotland. These fish assemblages show no systematic change in species richness through time, but steady change in species composition, leading to an increase in spatial homogenization: the species identity of colder northern localities increasingly resembles that of warmer southern localities. This biotic homogenization mirrors the spatial pattern of unevenly rising ocean temperatures over the same time period suggesting that climate change is primarily responsible for the spatial homogenization we observe. In this and other ecosystems, apparent constancy in species richness may mask major changes in species composition driven by anthropogenic change. PMID:26400102
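
    Spatial homogenization of assemblages is often quantified as a rise over time in the mean pairwise compositional similarity among localities. The sketch below illustrates that idea with Jaccard similarity on invented presence/absence data; it is not the survey data nor necessarily the authors' exact metric.

```python
# Minimal sketch: quantify spatial homogenization as an increase over time in
# the mean pairwise Jaccard similarity of species composition among localities.
# The species lists below are invented for illustration only.
from itertools import combinations

def jaccard(a, b):
    """Jaccard similarity of two sets of species."""
    return len(a & b) / len(a | b)

def mean_pairwise_similarity(assemblages):
    """Mean Jaccard similarity over all pairs of localities."""
    pairs = list(combinations(assemblages, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

# Hypothetical surveys: species per locality in an early and a late year.
year_1985 = [{"cod", "haddock", "ling"}, {"whiting", "hake"}, {"saithe", "ling"}]
year_2010 = [{"cod", "hake", "whiting"}, {"whiting", "hake", "cod"}, {"hake", "whiting"}]

print("1985 mean similarity:", round(mean_pairwise_similarity(year_1985), 3))
print("2010 mean similarity:", round(mean_pairwise_similarity(year_2010), 3))
# A rising mean similarity indicates biotic homogenization.
```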

  20. Pyroxene Homogenization and the Isotopic Systematics of Eucrites

    NASA Technical Reports Server (NTRS)

    Nyquist, L. E.; Bogard, D. D.

    1996-01-01

    The original Mg-Fe zoning of eucritic pyroxenes has in nearly all cases been partly homogenized, an observation that has been combined with other petrographic and compositional criteria to establish a scale of thermal "metamorphism" for eucrites. To evaluate hypotheses explaining development of conditions on the HED parent body (Vesta?) leading to pyroxene homogenization against their chronological implications, it is necessary to know whether pyroxene metamorphism was recorded in the isotopic systems. However, identifying the effects of the thermal metamorphism with specific effects in the isotopic systems has been difficult, due in part to a lack of correlated isotopic and mineralogical studies of the same eucrites. Furthermore, isotopic studies often place high demands on analytical capabilities, resulting in slow growth of the isotopic database. Additionally, some isotopic systems would not respond in a direct and sensitive way to pyroxene homogenization. Nevertheless, sufficient data exist to generalize some observations, and to identify directions of potentially fruitful investigations.

  1. Homogenization of locally resonant acoustic metamaterials towards an emergent enriched continuum.

    PubMed

    Sridhar, A; Kouznetsova, V G; Geers, M G D

    This contribution presents a novel homogenization technique for modeling heterogeneous materials with micro-inertia effects such as locally resonant acoustic metamaterials. Linear elastodynamics is used to model the micro and macro scale problems and an extended first order Computational Homogenization framework is used to establish the coupling. Craig Bampton Mode Synthesis is then applied to solve and eliminate the microscale problem, resulting in a compact closed form description of the microdynamics that accurately captures the Local Resonance phenomena. The resulting equations represent an enriched continuum in which additional kinematic degrees of freedom emerge to account for Local Resonance effects which would otherwise be absent in a classical continuum. Such an approach retains the accuracy and robustness offered by a standard Computational Homogenization implementation, whereby the problem and the computational time are reduced to the on-line solution of one scale only.
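
    As a rough illustration of the model-reduction step only, the sketch below applies Craig-Bampton mode synthesis to a small spring-mass chain: boundary degrees of freedom are retained, and the internal ones are replaced by static constraint modes plus a few fixed-interface eigenmodes. The chain, its partition, and the number of retained modes are assumptions for illustration; the paper's computational homogenization framework is not reproduced.

```python
# Minimal sketch of Craig-Bampton mode synthesis on a 1D spring-mass chain.
import numpy as np
from scipy.linalg import eigh

n = 10                                   # total DOFs
k, m = 1.0e4, 1.0                        # spring stiffness and mass
K = 2 * k * np.eye(n) - k * (np.eye(n, k=1) + np.eye(n, k=-1))
M = m * np.eye(n)

b = [0, n - 1]                           # boundary (retained) DOFs
i = [d for d in range(n) if d not in b]  # internal DOFs
Kib, Kii = K[np.ix_(i, b)], K[np.ix_(i, i)]
Mii = M[np.ix_(i, i)]

# Constraint modes: static response of internal DOFs to unit boundary motion.
Psi = -np.linalg.solve(Kii, Kib)

# Fixed-interface modes: lowest eigenmodes of the internal problem.
n_modes = 3
_, Phi = eigh(Kii, Mii)
Phi = Phi[:, :n_modes]

# Craig-Bampton transformation u = T q, q = [boundary DOFs, modal coordinates].
T = np.zeros((n, len(b) + n_modes))
T[b, :len(b)] = np.eye(len(b))
T[np.ix_(i, range(len(b)))] = Psi
T[np.ix_(i, range(len(b), len(b) + n_modes))] = Phi

K_red, M_red = T.T @ K @ T, T.T @ M @ T  # reduced system

# Compare the lowest natural frequencies of the full and reduced models.
w_full = np.sqrt(eigh(K, M, eigvals_only=True)[:4])
w_red = np.sqrt(eigh(K_red, M_red, eigvals_only=True)[:4])
print("full   :", np.round(w_full, 2))
print("reduced:", np.round(w_red, 2))
```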

  2. Method of Mapping Anomalies in Homogenous Material

    NASA Technical Reports Server (NTRS)

    Taylor, Bryant D. (Inventor); Woodard, Stanley E. (Inventor)

    2016-01-01

    An electrical conductor and antenna are positioned in a fixed relationship to one another. Relative lateral movement is generated between the electrical conductor and a homogenous material while maintaining the electrical conductor at a fixed distance from the homogenous material. The antenna supplies a time-varying magnetic field that causes the electrical conductor to resonate and generate harmonic electric and magnetic field responses. Disruptions in at least one of the electric and magnetic field responses during this lateral movement are indicative of a lateral location of a subsurface anomaly. Next, relative out-of-plane movement is generated between the electrical conductor and the homogenous material in the vicinity of the anomaly's lateral location. Disruptions in at least one of the electric and magnetic field responses during this out-of-plane movement are indicative of a depth location of the subsurface anomaly. A recording of the disruptions provides a mapping of the anomaly.

  3. Enzymatic cell wall degradation of high-pressure-homogenized tomato puree and its effect on lycopene bioaccessibility.

    PubMed

    Palmero, Paola; Colle, Ines; Lemmens, Lien; Panozzo, Agnese; Nguyen, Tuyen Thi My; Hendrickx, Marc; Van Loey, Ann

    2016-01-15

    High-pressure homogenization disrupts cell structures, assisting carotenoid release from the matrix and subsequent micellarization. However, lycopene bioaccessibility of tomato puree upon high-pressure homogenization is limited by the formation of a process-induced barrier. In this context, cell wall-degrading enzymes were applied to hydrolyze the formed barrier and enhance lycopene bioaccessibility. The effectiveness of the enzymes in degrading their corresponding substrates was evaluated (consistency, amount of reducing sugars, molar mass distribution and immunolabeling). An in vitro digestion procedure was applied to evaluate the effect of the enzymatic treatments on lycopene bioaccessibility. Enzymatic treatments with pectinases and cellulase were shown to effectively degrade their corresponding cell wall polymers; however, no further significant increase in lycopene bioaccessibility was obtained. A process-induced barrier consisting of cell wall material is not the only factor governing lycopene bioaccessibility upon high-pressure homogenization. © 2015 Society of Chemical Industry.

  4. The Effect of Homogenization on the Corrosion Behavior of Al-Mg Alloy

    NASA Astrophysics Data System (ADS)

    Li, Yin; Hung, Yuanchun; Du, Zhiyong; Xiao, Zhengbing; Jia, Guangze

    2018-04-01

    The effect of homogenization on the corrosion behavior of 5083-O aluminum alloy is presented in this paper. Intergranular corrosion and exfoliation corrosion tests were used to characterize the corrosion behavior of the 5083-O aluminum alloy. The variations in the morphology, kind, and distribution of the precipitates, and in the dislocation configurations in the samples after homogenization, were evaluated using optical microscopy (OM), scanning electron microscopy (SEM), and transmission electron microscopy (TEM). The effects of the highly active grain boundary character distribution and the types of constituent particles on the corrosion are discussed on the basis of experimental observations. The results indicated that the corrosion behavior of the 5083-O alloy was closely related to the microstructure obtained by the heat treatment. Homogenization carried out after casting had the optimal effect on the overall corrosion resistance of the material. Nevertheless, all samples could satisfy the requirements of corrosion resistance in marine applications.

  5. Decay and growth laws in homogeneous shear turbulence

    NASA Astrophysics Data System (ADS)

    Briard, Antoine; Gomez, Thomas; Mons, Vincent; Sagaut, Pierre

    2016-07-01

    Homogeneous anisotropic turbulence has been widely studied in the past decades, both numerically and experimentally. Shear flows have received a particular attention because of the numerous physical phenomena they exhibit. In the present paper, both the decay and growth of anisotropy in homogeneous shear flows at high Reynolds numbers are revisited thanks to a recent eddy-damped quasi-normal Markovian closure adapted to homogeneous anisotropic turbulence. The emphasis is put on several aspects: an asymptotic model for the slow part of the pressure-strain tensor is derived for the return to isotropy process when mean velocity gradients are released. Then, a general decay law for purely anisotropic quantities in Batchelor turbulence is proposed. Finally, a discussion is proposed to explain the scattering of global quantities obtained in DNS and experiments in sustained shear flows: the emphasis is put on the exponential growth rate of the kinetic energy and on the shear parameter.

  6. Retinoic Acid Engineered Amniotic Membrane Used as Graft or Homogenate: Positive Effects on Corneal Alkali Burns.

    PubMed

    Joubert, Romain; Daniel, Estelle; Bonnin, Nicolas; Comptour, Aurélie; Gross, Christelle; Belville, Corinne; Chiambaretta, Frédéric; Blanchon, Loïc; Sapin, Vincent

    2017-07-01

    Alkali burns are the most common, severe chemical ocular injuries, their functional prognosis depending on corneal wound healing efficiency. The purpose of our study was to compare the benefits of amniotic membrane (AM) grafts and homogenates for wound healing in the presence or absence of previous all-trans retinoic acid (atRA) treatment. Fifty male CD1 mice with reproducible corneal chemical burn were divided into five groups, as follows: group 1 was treated with saline solution; groups 2 and 3 received untreated AM grafts or grafts treated with atRA, respectively; and groups 4 and 5 received untreated AM homogenates or homogenates treated with atRA, respectively. After 7 days of treatment, ulcer area and depth were measured, and vascular endothelial growth factor (VEGF) and matrix metalloproteinase 9 (MMP-9) were quantified. AM induction by atRA was confirmed via quantification of retinoic acid receptor β (RARβ), a well-established retinoic acid-induced gene. Significant improvements of corneal wound healing in terms of ulcer area and depth were obtained with both strategies. No major differences were found between the efficiency of AM homogenates and grafts. This positive action was increased when AM was pretreated with atRA. Furthermore, AM induced a decrease in VEGF and MMP-9 levels during the wound healing process. The atRA treatment led to an even greater decrease in the expression of both proteins. Amnion homogenate is as effective as AM grafts in promoting corneal wound healing in a mouse model. A higher positive effect was obtained with atRA treatment.

  7. Nonlinear Boltzmann equation for the homogeneous isotropic case: Some improvements to deterministic methods and applications to relaxation towards local equilibrium

    NASA Astrophysics Data System (ADS)

    Asinari, P.

    2011-03-01

    The Boltzmann equation is one of the most powerful paradigms for explaining transport phenomena in fluids. Since the early fifties, it has received a lot of attention due to aerodynamic requirements for high-altitude vehicles, vacuum technology requirements and, nowadays, micro-electro-mechanical systems (MEMS). Because of the intrinsic mathematical complexity of the problem, Boltzmann himself started his work by considering first the case when the distribution function does not depend on space (homogeneous case), but only on time and the magnitude of the molecular velocity (isotropic collisional integral). The interest in the homogeneous isotropic Boltzmann equation goes beyond simple dilute gases. In the so-called econophysics, a Boltzmann-type model is sometimes introduced for studying the distribution of wealth in a simple market. Another recent application of the homogeneous isotropic Boltzmann equation is given by opinion formation modeling in quantitative sociology, also called socio-dynamics or sociophysics. The present work [1] aims to improve the deterministic method for solving the homogeneous isotropic Boltzmann equation proposed by Aristov [2] by two ideas: (a) the homogeneous isotropic problem is reformulated first in terms of particle kinetic energy (this allows one to ensure exact particle number and energy conservation during microscopic collisions) and (b) a DVM-like correction (where DVM stands for Discrete Velocity Model) is adopted for improving the relaxation rates (this allows one to satisfy exactly the conservation laws at macroscopic level, which is particularly important for describing the late dynamics in the relaxation towards the equilibrium).

  8. Achieving Continuous Manufacturing for Final Dosage Formation: Challenges and How to Meet Them May 20-21 2014 Continuous Manufacturing Symposium.

    PubMed

    Byrn, Stephen; Futran, Maricio; Thomas, Hayden; Jayjock, Eric; Maron, Nicola; Meyer, Robert F; Myerson, Allan S; Thien, Michael P; Trout, Bernhardt L

    2015-03-01

    We describe the key issues and possibilities for continuous final dosage formation, otherwise known as downstream processing or drug product manufacturing. A distinction is made between heterogeneous processing and homogeneous processing, the latter of which is expected to add more value to continuous manufacturing. We also give the key motivations for moving to continuous manufacturing, some of the exciting new technologies, and the barriers to implementation of continuous manufacturing. Continuous processing of heterogeneous blends is the natural first step in converting existing batch processes to continuous. In heterogeneous processing, there are discrete particles that can segregate, whereas in homogeneous processing, components are blended and homogenized such that they do not segregate. Heterogeneous processing can incorporate technologies that are closer to existing technologies, whereas homogeneous processing necessitates the development and incorporation of new technologies. Homogeneous processing has the greatest potential for reaping the full rewards of continuous manufacturing, but it takes long-term vision and a more significant change in process development than heterogeneous processing. Heterogeneous processing has the detriment that, as the technologies are adopted rather than developed, there is a strong tendency to incorporate correction steps, which we call below "The Rube Goldberg Problem." Thus, although heterogeneous processing will likely play a major role in the near-term transformation of heterogeneous to continuous processing, it is expected that homogeneous processing is the next step that will follow. Specific action items for industry leaders are given. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  9. High throughput film dosimetry in homogeneous and heterogeneous media for a small animal irradiator

    PubMed Central

    Wack, L.; Ngwa, W.; Tryggestad, E.; Tsiamas, P.; Berbeco, R.; Ng, S.K.; Hesser, J.

    2013-01-01

    Purpose We have established a high-throughput Gafchromic film dosimetry protocol for narrow kilo-voltage beams in homogeneous and heterogeneous media for small-animal radiotherapy applications. The kV beam characterization is based on extensive Gafchromic film dosimetry data acquired in homogeneous and heterogeneous media. An empirical model is used for parameterization of depth and off-axis dependence of measured data. Methods We have modified previously published methods of film dosimetry to suit the specific tasks of the study. Unlike film protocols used in previous studies, our protocol employs simultaneous multichannel scanning and analysis of up to nine Gafchromic films per scan. A scanner and background correction were implemented to improve accuracy of the measurements. Measurements were taken in homogeneous and inhomogeneous phantoms at 220 kVp and a field size of 5 × 5 mm2. The results were compared against Monte Carlo simulations. Results Dose differences caused by variations in background signal were effectively removed by the corrections applied. Measurements in homogeneous phantoms were used to empirically characterize beam data in homogeneous and heterogeneous media. Film measurements in inhomogeneous phantoms and their empirical parameterization differed by about 2%–3%. The model differed from MC by about 1% (water, lung) to 7% (bone). Good agreement was found for measured and modelled off-axis ratios. Conclusions EBT2 films are a valuable tool for characterization of narrow kV beams, though care must be taken to eliminate disturbances caused by varying background signals. The usefulness of the empirical beam model in interpretation and parameterization of film data was demonstrated. PMID:23510532
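
    A film protocol of this kind ultimately rests on a calibration that maps background-corrected scanner readings to dose. The sketch below shows one common form of that step (net optical density plus a low-order polynomial fit); all pixel values and doses are invented, and the paper's multichannel scanning and scanner correction are not modeled.

```python
# Minimal sketch: Gafchromic-style film calibration. Readings of films exposed
# to known doses are background-corrected, converted to net optical density,
# and fitted with a low-order polynomial, which is then applied per pixel.
import numpy as np

# Calibration films: known doses and mean scanner pixel values (invented).
dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0])                      # Gy
pv = np.array([42000.0, 38000.0, 35000.0, 31000.0, 26000.0, 23000.0])
pv_dark = 800.0                                                       # scanner background signal

# Net optical density relative to the unexposed (0 Gy) film.
net_od = np.log10((pv[0] - pv_dark) / (pv - pv_dark))

# Fit dose as a low-order polynomial in net OD (a common empirical choice).
coeffs = np.polyfit(net_od, dose, 2)

def pixel_to_dose(pixel_value):
    """Convert a background-corrected pixel value to dose using the fit."""
    od = np.log10((pv[0] - pv_dark) / (pixel_value - pv_dark))
    return np.polyval(coeffs, od)

# Consistency check: a pixel value interpolated near 3 Gy maps back close to 3 Gy.
print("film exposed near 3 Gy reads back as %.2f Gy" % pixel_to_dose(np.interp(3.0, dose, pv)))
```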

  10. [Analysis of COX1 sequences of Taenia isolates from four areas of Guangxi].

    PubMed

    Yang, Yi-Chao; Ou-Yang, Yi; Su, Ai-Rong; Wan, Xiao-Ling; Li, Shu-Lin

    2012-06-01

    To analyze the COX1 sequences of Taenia isolates from four areas of Guangxi Zhuang Autonomous Region, and to understand the distribution of Taenia asiatica in Guangxi. Patients with taeniasis in Luzhai, Rongshui, Tiandong and Sanjiang in Guangxi were treated by deworming, and the Taenia isolates were collected. Cytochrome c oxidase subunit 1 (COX1) sequences of these isolates were amplified by PCR, and the PCR products were sequenced after T-A cloning. The homogeneities and genetic distances were calculated and analyzed, and phylogenetic trees were constructed with software. Meanwhile, the COX1 sequences of the isolates from the 4 areas were compared separately with the sequences of Taenia species in GenBank. The COX1 sequences of the 5 Taenia isolates collected had the same length of 444 bp. There were 5 variable positions between the Luzhai isolate and Taenia asiatica; the homogeneity was 98.87% and their genetic distance was 0.011. The phylogenetic tree analysis revealed that the Luzhai isolate and Taenia asiatica, located at the same node, had a close relationship. The homogeneity between Rongshui isolate A and Taenia solium was 100%, while the homogeneities of Rongshui isolate B with Taenia saginata and Taenia asiatica were 98.20% and 96.17%, respectively. The homogeneities of the Tiandong and Sanjiang isolates with Taenia solium were 99.55% and 96.40%, respectively, and the genetic distances were 0.005 and 0.037, respectively. The homogeneity between the Luzhai isolate and Taenia saginata was 96.40%. Taenia asiatica exists in Luzhai, and Taenia solium and Taenia saginata coexist in Rongshui, Guangxi Zhuang Autonomous Region.
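
    The reported homogeneities and genetic distances correspond to simple pairwise comparisons of aligned sequences. The sketch below computes percentage identity and the uncorrected p-distance for two toy sequences; it is not the COX1 data and ignores alignment gaps and model-based corrections.

```python
# Minimal sketch: percentage homogeneity (identity) and uncorrected p-distance
# between two aligned sequences of equal length. Toy strings, not real COX1 data.

def identity_and_p_distance(seq_a, seq_b):
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    identity = 100.0 * matches / len(seq_a)
    p_distance = 1.0 - matches / len(seq_a)
    return identity, p_distance

ref = "ATGGCACTTTTAATTCGAGCCGAACTAGGT"   # hypothetical reference fragment
iso = "ATGGCACTCTTAATTCGAGCTGAACTAGGT"   # hypothetical isolate fragment
ident, p = identity_and_p_distance(ref, iso)
print(f"homogeneity = {ident:.2f}%, p-distance = {p:.3f}")
```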

  11. Intrinsic brain abnormalities in young healthy adults with childhood trauma: A resting-state functional magnetic resonance imaging study of regional homogeneity and functional connectivity.

    PubMed

    Lu, Shaojia; Gao, Weijia; Wei, Zhaoguo; Wang, Dandan; Hu, Shaohua; Huang, Manli; Xu, Yi; Li, Lingjiang

    2017-06-01

    Childhood trauma confers great risk for the development of multiple psychiatric disorders; however, the neural basis for this association is still unknown. The present resting-state functional magnetic resonance imaging study aimed to detect the effects of childhood trauma on brain function in a group of young healthy adults. In total, 24 healthy individuals with childhood trauma and 24 age- and sex-matched adults without childhood trauma were recruited. Each participant underwent resting-state functional magnetic resonance imaging scanning. Intra-regional brain activity was evaluated by regional homogeneity method and compared between groups. Areas with altered regional homogeneity were further selected as seeds in subsequent functional connectivity analysis. Statistical analyses were performed by setting current depression and anxiety as covariates. Adults with childhood trauma showed decreased regional homogeneity in bilateral superior temporal gyrus and insula, and the right inferior parietal lobule, as well as increased regional homogeneity in the right cerebellum and left middle temporal gyrus. Regional homogeneity values in the left middle temporal gyrus, right insula and right cerebellum were correlated with childhood trauma severity. In addition, individuals with childhood trauma also exhibited altered default mode network, cerebellum-default mode network and insula-default mode network connectivity when the left middle temporal gyrus, right cerebellum and right insula were selected as seed area, respectively. The present outcomes suggest that childhood trauma is associated with disturbed intrinsic brain function, especially the default mode network, in adults even without psychiatric diagnoses, which may mediate the relationship between childhood trauma and psychiatric disorders in later life.
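
    Regional homogeneity (ReHo) is conventionally computed as Kendall's coefficient of concordance over the time series of a voxel and its neighbours. The sketch below implements that coefficient (without tie correction) on random surrogate data; it does not reproduce the study's preprocessing or statistics.

```python
# Minimal sketch: regional homogeneity (ReHo) as Kendall's coefficient of
# concordance (KCC) over a neighbourhood of time series. Random data stand in
# for preprocessed fMRI; tie correction is omitted.
import numpy as np
from scipy.stats import rankdata

def kendalls_w(time_series):
    """time_series: array of shape (k_voxels, n_timepoints)."""
    k, n = time_series.shape
    ranks = np.vstack([rankdata(ts) for ts in time_series])  # rank each series in time
    col_sums = ranks.sum(axis=0)                              # summed rank per time point
    s = ((col_sums - col_sums.mean()) ** 2).sum()
    return 12.0 * s / (k ** 2 * (n ** 3 - n))

rng = np.random.default_rng(0)
common = rng.standard_normal(200)                            # shared signal
coherent = common + 0.3 * rng.standard_normal((27, 200))     # 27-voxel cluster, similar series
incoherent = rng.standard_normal((27, 200))                  # independent series

print("ReHo, coherent cluster  :", round(kendalls_w(coherent), 3))   # high (concordant)
print("ReHo, incoherent cluster:", round(kendalls_w(incoherent), 3)) # low
```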

  12. Preparation and characterization of paclitaxel nanosuspension using novel emulsification method by combining high speed homogenizer and high pressure homogenization.

    PubMed

    Li, Yong; Zhao, Xiuhua; Zu, Yuangang; Zhang, Yin

    2015-07-25

    The aim of this study was to develop an alternative, more bioavailable, better tolerated paclitaxel nanosuspension (PTXNS) for intravenous injection in comparison with the commercially available Taxol(®) formulation. In this study, PTXNS was prepared by an emulsification method combining a high-speed homogenizer and high-pressure homogenization, followed by a lyophilization process for intravenous administration. The main production parameters, including the volume ratio of organic phase to water plus organic phase (Vo:Vw+o), concentration of PTX, content of PTX, emulsification time (Et), and homogenization pressure (HP) and passes (Ps) for high-pressure homogenization, were optimized, and their effects on the mean particle size (MPS) and particle size distribution (PSD) of PTXNS were investigated. The characteristics of PTXNS, such as surface morphology, physical status of paclitaxel (PTX) in PTXNS, redispersibility of PTXNS in purified water, in vitro dissolution and in vivo bioavailability, were all investigated. The PTXNS obtained under optimum conditions had an MPS of 186.8 nm and a zeta potential (ZP) of -6.87 mV. The PTX content in PTXNS was approximately 3.42%. Moreover, the residual amount of chloroform was lower than the International Conference on Harmonization limit (60 ppm) for solvents. The dissolution study indicated that PTXNS dissolved faster than raw PTX and showed a sustained-dissolution character compared with the Taxol(®) formulation. Moreover, the bioavailability of PTXNS increased 14.38-fold and 3.51-fold compared with raw PTX and the Taxol(®) formulation, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Curie temperatures of titanomagnetite in ignimbrites: Effects of emplacement temperatures, cooling rates, exsolution, and cation ordering

    NASA Astrophysics Data System (ADS)

    Jackson, Mike; Bowles, Julie A.

    2014-11-01

    Pumices, ashes, and tuffs from Mt. St. Helens and from Novarupta contain two principal forms of titanomagnetite: homogeneous grains with Curie temperatures in the range 350-500°C and oxyexsolved grains with similar bulk composition, containing ilmenite lamellae and having Curie temperatures above 500°C. Thermomagnetic analyses and isothermal annealing experiments in combination with stratigraphic settings and thermal models show that emplacement temperatures and cooling history may have affected the relative proportions of homogeneous and exsolved grains and have clearly had a strong influence on the Curie temperature of the homogeneous phase. The exsolved grains are most common where emplacement temperatures exceeded 600°C, and in laboratory experiments, heating to over 600°C in air causes the homogeneous titanomagnetites to oxyexsolve rapidly. Where emplacement temperatures were lower, Curie temperatures of the homogeneous grains are systematically related to overburden thickness and cooling timescales, and thermomagnetic curves are generally irreversible, with lower Curie temperatures measured during cooling, but little or no change is observed in room temperature susceptibility. We interpret this irreversible behavior as reflecting variations in the degree of cation ordering in the titanomagnetites, although we cannot conclusively rule out an alternative interpretation involving fine-scale subsolvus unmixing. Short-range ordering within the octahedral sites may play a key role in the observed phenomena. Changes in the Curie temperature have important implications for the acquisition, stabilization, and retention of natural remanence and may in some cases enable quantification of the emplacement temperatures or cooling rates of volcanic units containing homogeneous titanomagnetites.

  14. Homogeneity of the geochemical reference material BRP-1 (paraná basin basalt) and assessment of minimum mass

    USGS Publications Warehouse

    Cotta, Aloisio J. B.; Enzweiler, Jacinta; Wilson, Stephen A.; Perez, Carlos A.; Nardy, Antonio J. R.; Larizzatti, Joao H.

    2007-01-01

    Reference materials (RM) are required for quantitative analyses and their successful use is associated with the degree of homogeneity, and the traceability and confidence limits of the values established by characterisation. During the production of a RM, the chemical characterisation can only commence after it has been demonstrated that the material has the required level of homogeneity. Here we describe the preparation of BRP-1, a proposed geochemical reference material, and the results of the tests to evaluate its degree of homogeneity between and within bottles. BRP-1 is the first of two geochemical RM being produced by Brazilian institutions in collaboration with the United States Geological Survey (USGS) and the International Association of Geoanalysts (IAG). Two test portions of twenty bottles of BRP-1 were analysed by wavelength dispersive-XRF spectrometry and major, minor and eighteen trace elements were determined. The results show that for most of the investigated elements, the units of BRP-1 were homogeneous at conditions approximately three times more rigorous than those strived for by the test of “sufficient homogeneity”. Furthermore, the within bottle homogeneity of BRP-1 was evaluated using small beam (1 mm2) synchrotron radiation XRF spectrometry and, for comparison, the USGS reference materials BCR-2 and GSP-2 were also evaluated. From our data, it has been possible to assign representative minimum masses for some major constituents (1 mg) and for some trace elements (1-13 mg), except Zr in GSP-2, for which test portions of 74 mg are recommended.
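
    Between-bottle homogeneity is commonly screened with a one-way analysis of variance on replicate test portions from each bottle. The sketch below runs such a check on simulated concentrations; the bottle counts and scatter are assumptions, not BRP-1 results.

```python
# Minimal sketch: between-bottle homogeneity check via one-way ANOVA on
# duplicate test portions from each bottle, in the spirit of "sufficient
# homogeneity" testing. The concentrations are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_bottles, n_replicates = 20, 2
true_value = 50.0                      # e.g. mg/kg of some trace element (assumed)
analytical_sd = 0.8                    # within-bottle (measurement) scatter
between_bottle_sd = 0.2                # bottle-to-bottle scatter

bottle_means = true_value + between_bottle_sd * rng.standard_normal(n_bottles)
data = bottle_means[:, None] + analytical_sd * rng.standard_normal((n_bottles, n_replicates))

f_stat, p_value = stats.f_oneway(*data)   # one group per bottle
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
print("no evidence of between-bottle heterogeneity" if p_value > 0.05
      else "between-bottle differences detected")
```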

  15. ANALYSIS OF FISH HOMOGENATES FOR PERFLUORINATED COMPOUNDS

    EPA Science Inventory

    Perfluorinated compounds (PFCs) which include PFOS and PFOA are widely distributed in wildlife. Whole fish homogenates were analyzed for PFCs from the upper Mississippi, the Missouri and the Ohio rivers. Methods development, validation data, and preliminary study results will b...

  16. Climate differentiates forest structure across a residential macrosystem

    EPA Science Inventory

    The extent of urban ecological homogenization depends on how humans build, inhabit, and manage cities. Morphological and socio-economic facets of neighborhoods can drive the homogenization of forest cover, thus affecting urban ecological and hydrological processes, and ecosystem...

  17. New exact perfect fluid solutions of Einstein's equations. II

    NASA Astrophysics Data System (ADS)

    Uggla, Claes; Rosquist, Kjell

    1990-12-01

    A family of new spatially homogeneous Bianchi type VIh perfect fluid solutions of the Einstein equations is presented. The fluid flow is orthogonal to the spatially homogeneous hypersurfaces, and the pressure is proportional to the energy density.

  18. General Theorems about Homogeneous Ellipsoidal Inclusions

    ERIC Educational Resources Information Center

    Korringa, J.; And Others

    1978-01-01

    Mathematical theorems about the properties of ellipsoids are developed. Included are Poisson's theorem concerning the magnetization of a homogeneous body of ellipsoidal shape, the polarization of a dielectric, the transport of heat or electricity through an ellipsoid, and other problems. (BB)

  19. Mid-infrared spectrometry of milk for dairy metabolomics: a comparison of two sampling techniques and effect of homogenization.

    PubMed

    Aernouts, Ben; Polshin, Evgeny; Saeys, Wouter; Lammertyn, Jeroen

    2011-10-31

    Milk production is a dominant factor in the metabolism of dairy cows involving a very intensive interaction with the blood circulation. As a result, the extracted milk contains valuable information on the metabolic status of the cow. On-line measurement of milk components during milking two or more times a day would promote early detection of systemic and local alterations, thus providing a great input for strategic and management decisions. The objective of this study was to investigate the potential of mid-infrared (mid-IR) spectroscopy to measure the milk composition using two different measurement modes: micro attenuated total reflection (μATR) and high throughput transmission (HTT). Partial least squares (PLS) regression was used for prediction of fat, crude protein, lactose and urea after preprocessing the IR data and selecting the most informative wavenumber variables. The prediction accuracies were determined separately for raw and homogenized copies of a wide range of milk samples in order to estimate the possibility for on-line analysis of the milk. In the case of fat content, both measurement modes resulted in excellent predictions for homogenized samples (R(2)>0.92) but poor results for raw samples (R(2)<0.70). Homogenization was, however, not mandatory to achieve good predictions for crude protein and lactose with both μATR and HTT, and for urea with μATR spectroscopy. Excellent results were obtained for prediction of crude protein, lactose and urea content (R(2)>0.99, 0.98 and 0.86, respectively) in raw and homogenized milk using μATR IR spectroscopy. These results were significantly better than those obtained by HTT IR spectroscopy. The prediction performance of HTT was nevertheless still good for crude protein and lactose content (R(2)>0.86 and 0.78, respectively) in raw and homogenized samples, whereas the detection of urea in milk with HTT spectroscopy was significantly better (R(2)=0.69 versus 0.16) after homogenization of the milk samples. Based on these observations it can be concluded that the μATR approach is most suitable for rapid at-line or even on-line milk composition measurement, although homogenization is crucial to achieve good prediction of the fat content. Copyright © 2011 Elsevier B.V. All rights reserved.
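
    The chemometric core of such a study is a PLS regression from spectra to reference concentrations. The sketch below fits one on simulated spectra containing a single synthetic absorption band; the study's preprocessing, wavenumber selection, and real reference chemistry are not included.

```python
# Minimal sketch: predicting a milk component (e.g. fat content) from mid-IR
# spectra with partial least squares regression, on simulated data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
n_samples, n_wavenumbers = 120, 400
fat = rng.uniform(2.0, 6.0, n_samples)                        # reference fat content, %
band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 150) / 10.0) ** 2)
spectra = (np.outer(fat, band)                                # fat-related absorption band
           + 0.3 * rng.standard_normal((n_samples, n_wavenumbers)))  # noise

X_train, X_test, y_train, y_test = train_test_split(spectra, fat, random_state=0)
pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)
print("R^2 on held-out samples:", round(r2_score(y_test, pls.predict(X_test).ravel()), 3))
```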

  20. Communication—indentation of Li-ion pouch cell: Effect of material homogenization on prediction of internal short circuit

    DOE PAGES

    Kumar, A.; Kalnaus, Sergiy; Simunovic, Srdjan; ...

    2016-09-12

    We performed finite element simulations of spherical indentation of Li-ion pouch cells. Our model fully resolves the different layers in the cell. The results of the layer-resolved models were compared to the models available in the literature that treat the cell as an equivalent homogenized continuum material. Simulations were carried out for different sizes of the spherical indenter. Here, we show that calibration of a failure criterion for the cell in the homogenized model depends on the indenter size, whereas in the layer-resolved model, such dependency is greatly diminished.

  1. Goedel, Penrose, anti-Mach: Extra supersymmetries of time-dependent plane waves

    NASA Astrophysics Data System (ADS)

    Blau, Matthias; Meessen, Patrick; O'Loughlin, Martin

    2003-09-01

    We prove that M-theory plane waves with extra supersymmetries are necessarily homogeneous (but possibly time-dependent), and we show by explicit construction that such time-dependent plane waves can admit extra supersymmetries. To that end we study the Penrose limits of Gödel-like metrics, show that the Penrose limit of the M-theory Gödel metric (with 20 supercharges) is generically a time-dependent homogeneous plane wave of the anti-Mach type, and display the four extra Killing spinors in that case. We conclude with some general remarks on the Killing spinor equations for homogeneous plane waves.

  2. LANDSAT-D investigations in snow hydrology

    NASA Technical Reports Server (NTRS)

    Dozier, J. (Principal Investigator)

    1984-01-01

    Two stream methods provide rapid approximate calculations of radiative transfer in scattering and absorbing media. Although they provide information on fluxes only, and not on intensities, their speed makes them attractive compared with more precise methods. This report provides a comprehensive, unified review of the methods for a homogeneous layer and solves the equations for the reflectance and transmittance of a homogeneous layer over a non-reflecting surface. Any of the basic kernels for a single layer can be extended to a vertically inhomogeneous medium over a surface whose reflectance properties vary with illumination angle, as long as the medium can be subdivided into homogeneous layers.
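
    The extension to a stack of homogeneous layers relies on the adding method, which sums the multiple reflections between adjacent layers. The sketch below shows that combination step for two symmetric layers with assumed reflectance and transmittance values; it is not the report's full two-stream solution.

```python
# Minimal sketch: combining two homogeneous layers with known reflectance R and
# transmittance T via the adding method (symmetric layers, no internal sources).
# The layer values are invented for illustration.

def add_layers(r1, t1, r2, t2):
    """Reflectance and transmittance of layer 1 stacked on top of layer 2."""
    denom = 1.0 - r1 * r2              # geometric series of inter-layer bounces
    r = r1 + t1 * r2 * t1 / denom
    t = t1 * t2 / denom
    return r, t

# Example: a moderately reflective snow-like layer over a thinner layer.
r_top, t_top = 0.6, 0.3
r_bottom, t_bottom = 0.4, 0.5

r12, t12 = add_layers(r_top, t_top, r_bottom, t_bottom)
print(f"combined: R = {r12:.3f}, T = {t12:.3f}")

# A black (non-reflecting) lower boundary adds nothing:
# add_layers(r12, t12, 0.0, 0.0) returns (r12, 0.0).
```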

  3. Two stroke homogenous charge compression ignition engine with pulsed air supplier

    DOEpatents

    Clarke, John M.

    2003-08-05

    A two stroke homogenous charge compression ignition engine includes a volume pulsed air supplier, such as a piston driven pump, for efficient scavenging. The usage of a homogenous charge tends to decrease emissions. The use of a volume pulsed air supplier in conjunction with conventional poppet type intake and exhaust valves results in a relatively efficient scavenging mode for the engine. The engine preferably includes features that permit valving event timing, air pulse event timing and injection event timing to be varied relative to engine crankshaft angle. The principal use of the invention lies in improving diesel engines.

  4. Manifestations of drag reduction by polymer additives in decaying, homogeneous, isotropic turbulence.

    PubMed

    Perlekar, Prasad; Mitra, Dhrubaditya; Pandit, Rahul

    2006-12-31

    The existence of drag reduction by polymer additives, well established for wall-bounded turbulent flows, is controversial in homogeneous, isotropic turbulence. To settle this controversy, we carry out a high-resolution direct numerical simulation of decaying, homogeneous, isotropic turbulence with polymer additives. Our study reveals clear manifestations of drag-reduction-type phenomena: On the addition of polymers to the turbulent fluid, we obtain a reduction in the energy-dissipation rate, a significant modification of the fluid energy spectrum especially in the deep-dissipation range, a suppression of small-scale intermittency, and a decrease in small-scale vorticity filaments.

  5. Drag reduction in homogeneous turbulence by scale-dependent effective viscosity.

    PubMed

    Benzi, Roberto; Ching, Emily S C; Procaccia, Itamar

    2004-08-01

    We demonstrate, by using suitable shell models, that drag reduction in homogeneous turbulence is usefully discussed in terms of a scale-dependent effective viscosity. The essence of the phenomenon of drag reduction found in models that couple the velocity field to the polymers can be recaptured by an "equivalent" equation of motion for the velocity field alone, with a judiciously chosen scale-dependent effective viscosity that succinctly summarizes the important aspects of the interaction between the velocity and the polymer fields. Finally, we clarify the differences between drag reduction in homogeneous and in wall bounded flows.

  6. Method of making metal oxide ceramic powders by using a combustible amino acid compound

    DOEpatents

    Pederson, L.R.; Chick, L.A.; Exarhos, G.J.

    1992-05-19

    This invention is directed to the formation of homogeneous, aqueous precursor mixtures of at least one substantially soluble metal salt and a substantially soluble, combustible co-reactant compound, typically an amino acid. This produces, upon evaporation, a substantially homogeneous intermediate material having a total solids level which would support combustion. The homogeneous intermediate material essentially comprises highly dispersed or solvated metal constituents and the co-reactant compound. The intermediate material is quite flammable. A metal oxide powder results on ignition of the intermediate product which combusts same to produce the product powder.

  7. Method of making metal oxide ceramic powders by using a combustible amino acid compound

    DOEpatents

    Pederson, Larry R.; Chick, Lawrence A.; Exarhos, Gregory J.

    1992-01-01

    This invention is directed to the formation of homogeneous, aqueous precursor mixtures of at least one substantially soluble metal salt and a substantially soluble, combustible co-reactant compound, typically an amino acid. This produces, upon evaporation, a substantially homogeneous intermediate material having a total solids level which would support combustion. The homogeneous intermediate material essentially comprises highly dispersed or solvated metal constituents and the co-reactant compound. The intermediate material is quite flammable. A metal oxide powder results on ignition of the intermediate product which combusts same to produce the product powder.

  8. Homogeneous illusion device exhibiting transformed and shifted scattering effect

    NASA Astrophysics Data System (ADS)

    Mei, Jin-Shuo; Wu, Qun; Zhang, Kuang; He, Xun-Jun; Wang, Yue

    2016-06-01

    Based on the theory of transformation optics, a type of homogeneous illusion device exhibiting a transformed and shifted scattering effect is proposed in this paper. The constitutive parameters of the proposed device are derived, and full-wave simulations are performed to validate the electromagnetic properties of the transformed and shifted scattering effect. The simulation results show that the proposed device not only can visually shift the image of the target in two dimensions, but also can visually transform the shape of the target. It is expected that such a homogeneous illusion device could have potential applications in military camouflage and other fields of electromagnetic engineering.

  9. Transmission of hemic neoplasia in the bay mussel, Mytilus edulis, using whole cells and cell homogenate.

    PubMed

    Elston, R A; Kent, M L; Drum, A S

    1988-01-01

    Experimental studies with hemic neoplasia in the bay mussel indicated that the condition can be transmitted allogeneically with intact whole cells and cell-free homogenate. A differential pathogenesis of the disease in mussels receiving the two different inocula supports the argument that actual cell transplantation occurred. In addition to the first demonstration of the infectious nature of the disease with cell-free homogenates, it was also shown that the disease is transmitted by cohabitation. Remission of the disease occurred in some mussels indicating individual variation in recognition mechanisms.

  10. Spectral reproducibility and quantification of peptides in MALDI of samples prepared by micro-spotting.

    PubMed

    Bae, Yong Jin; Park, Kyung Man; Ahn, Sung Hee; Moon, Jeong Hee; Kim, Myung Soo

    2014-08-01

    Previously, we reported that MALDI spectra of peptides became reproducible when temperature was kept constant. Linear calibration curves derived from such spectral data could be used for quantification. Homogeneity of samples was one of the requirements. Among the three popular matrices used in peptide MALDI [i.e., α-cyano-4-hydroxycinnamic acid (CHCA), 2,5-dihydroxybenzoic acid (DHB), and sinapinic acid (SA)], homogeneous samples could be prepared by conventional means only for CHCA. In this work, we showed that sample preparation by micro-spotting improved the homogeneity for all three cases.

  11. Quadrature transmit coil for breast imaging at 7 tesla using forced current excitation for improved homogeneity.

    PubMed

    McDougall, Mary Preston; Cheshkov, Sergey; Rispoli, Joseph; Malloy, Craig; Dimitrov, Ivan; Wright, Steven M

    2014-11-01

    To demonstrate the use of forced current excitation (FCE) to create homogeneous excitation of the breast at 7 tesla, insensitive to the effects of asymmetries in the electrical environment. FCE was implemented on two breast coils: one for quadrature (1) H imaging and one for proton-decoupled (13) C spectroscopy. Both were a Helmholtz-saddle combination, with the saddle tuned to 298 MHz for imaging and 75 MHz for spectroscopy. Bench measurements were acquired to demonstrate the ability to force equal currents on elements in the presence of asymmetric loading to improve homogeneity. Modeling and temperature measurements were conducted per safety protocol. B1 mapping, imaging, and proton-decoupled (13) C spectroscopy were demonstrated in vivo. Using FCE to ensure balanced currents on elements enabled straightforward tuning and maintaining of isolation between quadrature elements of the coil. Modeling and bench measurements confirmed homogeneity of the field, which resulted in images with excellent fat suppression and in broadband proton-decoupled carbon-13 spectra. FCE is a straightforward approach to ensure equal currents on multiple coil elements and a homogeneous excitation field, insensitive to the effects of asymmetries in the electrical environment. This enabled effective breast imaging and proton-decoupled carbon-13 spectroscopy at 7T. © 2014 Wiley Periodicals, Inc.

  12. Bioimmunological responses to Schistosoma mansoni and Fasciola gigantica worm homogenates either with or without saponin.

    PubMed

    Maghraby, Amany Sayed; Hamed, Manal Abdel-Aziz; Ali, Sanaa Ahmed

    2010-06-03

    In this study, we evaluated the biochemical, immunological, histopathological and antischistosomal activities of Schistosoma mansoni or Fasciola gigantica worm homogenates mixed either with or without saponin extracted from Atriplex nummularia. The immunization schedule was based on subcutaneous administration of two doses (50 microg/100 microl PBS) of each homogenate at 15-day intervals. Fifteen days after the last homogenate inoculation, all mice were challenged with 100 Schistosoma mansoni cercariae and sacrificed after two months. Free radical scavengers and liver function enzymes were determined in mice liver. Worm counts and the histopathological picture of the liver were also assessed. Immunization with Schistosoma or Fasciola worm homogenates, mixed either with or without saponin, produced an amelioration of free radical scavenger levels and liver function enzymes and a reduction in worm burden, as well as improvement of the histological features of the liver and of the number and size of granulomas, with evidence of an increased immune reaction manifested by a lymphocytic cuff surrounding the granuloma, diminution of its fibrotic and collagen content, and destruction of Schistosoma ova. Fasciola or Schistosoma worm antigens mixed with or without saponin succeeded in eliminating the products of oxidative stress and assisting in the immune-mediated destruction of eggs, which ameliorates the histopathological picture of the liver cells and preserves liver function.

  13. Homogeneous and inhomogeneous material effect in gamma index evaluation of IMRT technique based on fan beam and Cone Beam CT patient images

    NASA Astrophysics Data System (ADS)

    Wibowo, W. E.; Waliyyulhaq, M.; Pawiro, S. A.

    2017-05-01

    Patient-specific quality assurance (QA) for lung-case intensity-modulated radiation therapy (IMRT) is traditionally limited to homogeneous material, despite the fact that planning is carried out with inhomogeneous material present. Moreover, the chest area contains many inhomogeneous materials, such as lung, soft tissue, and bone, which require special attention to avoid inaccuracies in the dose calculation of the Treatment Planning System (TPS). Recent preliminary studies have shown that Cone Beam CT (CBCT) can be used not only to position the patient prior to irradiation but also to serve as a planning modality. Our study presents the influence of homogeneous and inhomogeneous materials, using Fan Beam CT and Cone Beam CT modalities in the IMRT technique, on the Gamma Index (GI) value. We used variations of the number of segments and of the Calculation Grid Resolution (CGR). The results showed that the deviation of the averaged GI value between CGR 0.2 cm and 0.4 cm ranged from -0.44% to 1.46% for homogeneous material and from -1.74% to 0.98% for inhomogeneous material. In performing patient-specific IMRT QA for lung cancer, homogeneous material can thus be used in evaluating the gamma index.
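
    Gamma-index evaluation combines a dose-difference and a distance-to-agreement criterion. The sketch below computes a 1D global gamma with 3%/3 mm criteria on synthetic profiles; the clinical plans, CT data, and 2D/3D geometry of the study are not reproduced.

```python
# Minimal sketch: 1D global gamma index with 3%/3 mm criteria, comparing a
# "measured" dose profile against a reference profile (synthetic data).
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_crit=0.03, dist_crit=3.0):
    """Gamma for each evaluation point (global normalisation to max reference dose)."""
    d_norm = dose_crit * d_ref.max()
    gammas = np.empty_like(d_eval)
    for i, (xe, de) in enumerate(zip(x_eval, d_eval)):
        cap = np.sqrt(((x_ref - xe) / dist_crit) ** 2 + ((d_ref - de) / d_norm) ** 2)
        gammas[i] = cap.min()
    return gammas

x = np.linspace(-50.0, 50.0, 201)                        # mm
reference = 2.0 / (1.0 + np.exp(np.abs(x) - 25.0))       # Gy, smooth field edge
measured = 1.02 * np.interp(x, x - 1.0, reference)       # 2% scaled, 1 mm shifted

gamma = gamma_1d(x, reference, x, measured)
print("gamma pass rate (gamma <= 1): %.1f%%" % (100.0 * np.mean(gamma <= 1.0)))
```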

  14. The Fourier transforms for the spatially homogeneous Boltzmann equation and Landau equation

    NASA Astrophysics Data System (ADS)

    Meng, Fei; Liu, Fang

    2018-03-01

    In this paper, we study the Fourier transforms for two equations arising in the kinetic theory. The first equation is the spatially homogeneous Boltzmann equation. The Fourier transform of the spatially homogeneous Boltzmann equation has been first addressed by Bobylev (Sov Sci Rev C Math Phys 7:111-233, 1988) in the Maxwellian case. Alexandre et al. (Arch Ration Mech Anal 152(4):327-355, 2000) investigated the Fourier transform of the gain operator for the Boltzmann operator in the cut-off case. Recently, the Fourier transform of the Boltzmann equation is extended to hard or soft potential with cut-off by Kirsch and Rjasanow (J Stat Phys 129:483-492, 2007). We shall first establish the relation between the results in Alexandre et al. (2000) and Kirsch and Rjasanow (2007) for the Fourier transform of the Boltzmann operator in the cut-off case. Then we give the Fourier transform of the spatially homogeneous Boltzmann equation in the non cut-off case. It is shown that our results cover previous works (Bobylev 1988; Kirsch and Rjasanow 2007). The second equation is the spatially homogeneous Landau equation, which can be obtained as a limit of the Boltzmann equation when grazing collisions prevail. Following the method in Kirsch and Rjasanow (2007), we can also derive the Fourier transform for Landau equation.

  15. Exploring cosmic homogeneity with the BOSS DR12 galaxy sample

    NASA Astrophysics Data System (ADS)

    Ntelis, Pierros; Hamilton, Jean-Christophe; Le Goff, Jean-Marc; Burtin, Etienne; Laurent, Pierre; Rich, James; Guillermo Busca, Nicolas; Tinker, Jeremy; Aubourg, Eric; du Mas des Bourboux, Hélion; Bautista, Julian; Palanque Delabrouille, Nathalie; Delubac, Timothée; Eftekharzadeh, Sarah; Hogg, David W.; Myers, Adam; Vargas-Magaña, Mariana; Pâris, Isabelle; Petitjean, Partick; Rossi, Graziano; Schneider, Donald P.; Tojeiro, Rita; Yeche, Christophe

    2017-06-01

    In this study, we probe the transition to cosmic homogeneity in the Large Scale Structure (LSS) of the Universe using the CMASS galaxy sample of the BOSS spectroscopic survey, which covers the largest effective volume to date, 3 h^-3 Gpc^3 at 0.43 <= z <= 0.7. We study the scaled counts-in-spheres, N(<r), and the fractal correlation dimension, D_2(r). Defining the homogeneity scale R_H as the scale above which D_2(r) > 2.97, we find R_H = (63.3±0.7) h^-1 Mpc, in agreement at the percentage level with the prediction of the ΛCDM model, R_H = 62.0 h^-1 Mpc. Thanks to the large cosmic depth of the survey, we investigate the redshift evolution of the transition-to-homogeneity scale and find agreement with the ΛCDM prediction. Finally, we find that D_2 is compatible with 3 at scales larger than 300 h^-1 Mpc in all redshift bins. These results consolidate the Cosmological Principle and represent a precise consistency test of the ΛCDM model.
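
    The fractal correlation dimension is the logarithmic slope of the counts-in-spheres, D_2(r) = d ln N(<r) / d ln r, which tends to 3 for a homogeneous distribution. The sketch below estimates it for a uniform toy point set; survey selection, weights, and random catalogues are omitted.

```python
# Minimal sketch: correlation dimension D2(r) from counts-in-spheres for a
# uniform (homogeneous) point set; D2 should approach 3.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
box = 1000.0                                             # toy box size, h^-1 Mpc
points = rng.uniform(0.0, box, size=(50000, 3))          # uniform point set
tree = cKDTree(points)

radii = np.geomspace(50.0, 300.0, 10)
inner = points[np.all((points > radii[-1]) & (points < box - radii[-1]), axis=1)]
centres = inner[:500]                                     # centres far from the box edges

# Mean counts-in-spheres N(<r), excluding the centre point itself.
counts = np.array([np.mean(tree.query_ball_point(centres, r, return_length=True)) - 1.0
                   for r in radii])

d2 = np.gradient(np.log(counts), np.log(radii))           # D2(r) = d ln N / d ln r
for r, d in zip(radii, d2):
    print(f"r = {r:6.1f} h^-1 Mpc   D2 = {d:.2f}")        # close to 3 for a homogeneous set
```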

  16. Identification of homogeneous regions for regionalization of watersheds by two-level self-organizing feature maps

    NASA Astrophysics Data System (ADS)

    Farsadnia, F.; Rostami Kamrood, M.; Moghaddam Nia, A.; Modarres, R.; Bray, M. T.; Han, D.; Sadatinejad, J.

    2014-02-01

    One of several methods for estimating flood quantiles in ungauged or data-scarce watersheds is regional frequency analysis. Among the approaches to regional frequency analysis, different clustering techniques have been proposed in the literature to determine hydrologically homogeneous regions. Recently, the Self-Organizing feature Map (SOM), a modern hydroinformatic tool, has been applied in several studies for clustering watersheds. However, further work is still needed on the interpretation of the SOM output map for identifying hydrologically homogeneous regions. In this study, a two-level SOM and three clustering methods (fuzzy c-means, K-means, and Ward's agglomerative hierarchical clustering) are applied in an effort to identify hydrologically homogeneous regions in the watersheds of Mazandaran province in the north of Iran, and their results are compared with each other. First, the SOM is used to form a two-dimensional feature map; next, the output nodes of the SOM are clustered using the unified distance matrix algorithm and the three clustering methods to form regions for flood frequency analysis. The heterogeneity test indicates that the four regions obtained by the two-level SOM and Ward approach are, after adjustments, sufficiently homogeneous. The results suggest that the combination of SOM and Ward is much better than the combination of either SOM and FCM or SOM and K-means.
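
    A minimal version of the two-level approach is sketched below: train a small self-organizing map on watershed attributes, cluster the SOM codebook with Ward's method, and assign each watershed the cluster of its best-matching node. The hand-rolled SOM, grid size, and random attributes are assumptions for illustration, not the study's configuration.

```python
# Minimal sketch of two-level SOM clustering: SOM codebook, then Ward clustering
# of the nodes, then mapping watersheds to candidate homogeneous regions.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def train_som(data, grid=(6, 6), n_iter=3000, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.standard_normal((rows * cols, data.shape[1]))
    pos = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
    for t in range(n_iter):
        frac = t / n_iter
        lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
        dist2 = ((pos - pos[bmu]) ** 2).sum(axis=1)
        h = np.exp(-dist2 / (2 * sigma ** 2))                # Gaussian neighbourhood
        weights += lr * h[:, None] * (x - weights)
    return weights

rng = np.random.default_rng(4)
watersheds = rng.standard_normal((80, 5))   # 80 watersheds x 5 standardized attributes

codebook = train_som(watersheds)
node_labels = fcluster(linkage(codebook, method="ward"), t=4, criterion="maxclust")

bmus = np.argmin(((watersheds[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2), axis=1)
regions = node_labels[bmus]                  # candidate homogeneous region per watershed
print("watersheds per region:", np.bincount(regions)[1:])
```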

  17. Field homogeneity improvement of maglev NdFeB magnetic rails from joints.

    PubMed

    Li, Y J; Dai, Q; Deng, C Y; Sun, R X; Zheng, J; Chen, Z; Sun, Y; Wang, H; Yuan, Z D; Fang, C; Deng, Z G

    2016-01-01

    An ideal magnetic rail should provide a homogeneous magnetic field along the longitudinal direction to guarantee the reliable friction-free operation of high-temperature superconducting (HTS) maglev vehicles. In reality, however, magnetic field inhomogeneity can arise for many reasons, of which joint gaps are the most direct. Because every magnetic rail is assembled from many permanent magnet segments, joint gaps inevitably exist between adjacent segments and degrade the longitudinal magnetic field homogeneity above the rail. To improve the running performance of maglev systems, two new rail joints, named the mitered rail joint and the overlapped rail joint, are proposed based on the normal rail joint. The overlapped rail joint is found to provide a more homogeneous magnetic field, and its structure is further optimized so that the maglev vehicle can pass through the joint gaps as smoothly as possible. The results show that the overlapped rail joint with optimal parameters significantly reduces the magnetic field inhomogeneity compared with the other two rail joints. In addition, an appropriate gap width is suggested that balances the thermal expansion of the magnets against field homogeneity, providing a valuable reference for the future design of magnetic rails.

  18. Analysis of leaf openings and leaf width on multileaf collimators using Gafchromic RTQA2 film

    NASA Astrophysics Data System (ADS)

    Setiawati, Evi; Lailla Rachma, Assyifa; Hidayatullah, M.

    2018-05-01

    This research assessed the accuracy of the multileaf collimator (MLC) leaf openings used for treatment, and the resulting dose distribution, using Gafchromic RTQA2 film; that is, an MLC correction was derived from the measured leaf movement and the uniformity of the irradiated field. The method consisted of irradiating Gafchromic RTQA2 film according to a homogeneity-index planning scheme for different leaf openings and leaf widths. The exposed films were then scanned and the scanned images were imported into MATLAB, where they were converted to greyscale and the increase in film darkening with dose was analysed. From this, a correlation between pixel value and dose was established and the homogeneity index was computed, both to verify that the film dosimetry was homogeneous and to correct the leaf openings and leaf widths. The relation between pixel value and dose was linear, with y = (-0.6)x + 108 for low doses and y = (-0.28)x + 108 for high doses, and the homogeneity index ranged from 0.003 to 0.084. The deviation of the corrected dose distribution for the leaf openings and leaf widths was around 5%, within the 10% tolerance suggested by ICRU Report No. 50.
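
    As a rough illustration of the pixel-to-dose step described above (a minimal sketch only: the direction of the reported fits, the sample greyscale values, and the homogeneity-index formula are assumptions, since the abstract does not specify them; the study itself used MATLAB):

      import numpy as np

      # Hypothetical scanned-film greyscale values (illustrative only).
      pixels = np.array([[150.0, 148.0, 151.0],
                         [149.0, 147.0, 150.0]])

      # Reported linear fits, here assumed to map pixel value x to dose y.
      dose_low  = -0.60 * pixels + 108.0    # low-dose branch
      dose_high = -0.28 * pixels + 108.0    # high-dose branch

      # One common homogeneity-index definition (an assumption; the abstract
      # does not state which formula the authors used).
      def homogeneity_index(dose):
          return (dose.max() - dose.min()) / dose.mean()

      print(homogeneity_index(dose_low), homogeneity_index(dose_high))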

  19. Effect of lipid viscosity and high-pressure homogenization on the physical stability of "Vitamin E" enriched emulsion.

    PubMed

    Alayoubi, Alaadin; Abu-Fayyad, Ahmed; Rawas-Qalaji, Mutasem M; Sylvester, Paul W; Nazzal, Sami

    2015-01-01

    Recently there has been growing interest in vitamin E for its potential use in cancer therapy. The objective of this work was therefore to formulate a physically stable parenteral lipid emulsion able to deliver higher doses of vitamin E than commonly used in commercial products. Specifically, the objectives were to study the effects of homogenization pressure, number of homogenizing cycles, viscosity of the oil phase, and oil content on the physical stability of emulsions fortified with high doses of vitamin E (up to 20% by weight). This was done using a 27-run, 4-factor, 3-level Box-Behnken statistical design. Viscosity, homogenization pressure, and number of cycles were found to have a significant effect on particle size, which ranged from 213 to 633 nm, and on the percentage of vitamin E remaining emulsified after storage, which ranged from 17 to 100%. Increasing the oil content from 10 to 20% had no significant effect on the responses. Based on the results, it was concluded that stable vitamin E-rich emulsions can be prepared by repeated homogenization at higher pressures and by lowering the viscosity of the oil phase, which can be adjusted by blending the viscous vitamin E with medium-chain triglycerides (MCT).
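
    For readers unfamiliar with the experimental design mentioned above, the sketch below generates a 4-factor Box-Behnken design with the third-party pyDOE2 package; the factor names, ranges, and the choice of three centre points (giving 27 runs, consistent with the abstract) are illustrative assumptions, not the authors' protocol.

      import numpy as np
      from pyDOE2 import bbdesign              # third-party DOE package (assumption)

      # 4-factor Box-Behnken design: 24 edge points + 3 centre points = 27 runs.
      design = bbdesign(4, center=3)           # coded levels -1, 0, +1
      print(design.shape)                      # (27, 4)

      # Map coded levels onto hypothetical factor ranges (illustrative values only).
      factors = {
          "pressure_bar":     (500.0, 1500.0),   # homogenization pressure
          "cycles":           (3.0, 9.0),        # number of homogenizing cycles
          "oil_viscosity_cP": (50.0, 500.0),     # viscosity of the oil phase
          "oil_content_pct":  (10.0, 20.0),      # oil content
      }
      lows  = np.array([v[0] for v in factors.values()])
      highs = np.array([v[1] for v in factors.values()])
      runs = lows + (design + 1.0) / 2.0 * (highs - lows)
      print(runs[:3])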

  20. Homogenization Theory for the Prediction of Obstructed Solute Diffusivity in Macromolecular Solutions.

    PubMed

    Donovan, Preston; Chehreghanianzabi, Yasaman; Rathinam, Muruhan; Zustiak, Silviya Petrova

    2016-01-01

    The study of diffusion in macromolecular solutions is important in many biomedical applications such as separations, drug delivery, and cell encapsulation, and key for many biological processes such as protein assembly and interstitial transport. Not surprisingly, multiple models for the a priori prediction of diffusion in macromolecular environments have been proposed. However, most models include parameters that are not readily measurable, are specific to the polymer-solute-solvent system, or are fitted and do not have a physical meaning. Here, for the first time, we develop a homogenization theory framework for the prediction of effective solute diffusivity in macromolecular environments based on physical parameters that are easily measurable and not specific to the macromolecule-solute-solvent system. Homogenization theory is useful for situations where knowledge of fine-scale parameters is used to predict bulk system behavior. As a first approximation, we focus on a model where the solute is subjected to obstructed diffusion via stationary spherical obstacles. We find that the homogenization theory results agree well with computationally more expensive Monte Carlo simulations. Moreover, the homogenization theory agrees with effective diffusivities of a solute in dilute and semi-dilute polymer solutions measured using fluorescence correlation spectroscopy. Lastly, we provide a mathematical formula for the effective diffusivity in terms of a non-dimensional and easily measurable geometric system parameter.
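
    The paper's closing formula is not reproduced in the abstract; as a classical point of comparison (a hedged sketch, not the paper's homogenization result), the Maxwell approximation for diffusion around impermeable spherical obstacles gives an effective diffusivity that depends only on the obstacle volume fraction:

      import numpy as np

      def maxwell_effective_diffusivity(D0, phi):
          """Classical Maxwell approximation for impermeable spherical obstacles.

          D_eff / D0 = 2(1 - phi) / (2 + phi), reasonable at dilute volume
          fractions phi; a textbook comparison point, not the formula derived
          in the paper above.
          """
          phi = np.asarray(phi, dtype=float)
          return D0 * 2.0 * (1.0 - phi) / (2.0 + phi)

      # Example: free diffusivity D0 = 100 um^2/s (illustrative value) at several phi.
      print(maxwell_effective_diffusivity(100.0, [0.0, 0.1, 0.3]))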
