Science.gov

Sample records for process creates homogenous

  1. Children Creating Ways To Represent Changing Situations: On the Development of Homogeneous Spaces.

    ERIC Educational Resources Information Center

    Nemirovsky, Ricardo; Tierney, Cornelia

    2001-01-01

    Focuses on children creating representations on paper for situations that change over time. Articulates the distinction between homogeneous and heterogeneous spaces and reflects on children's tendency to create hybrids between them. (Author/MM)

  2. Creating a Flexible Budget Process

    ERIC Educational Resources Information Center

    Frew, James; Olson, Robert; Pelton, M. Lee

    2009-01-01

    The budget process is often an especially thorny area in communication between administrators and faculty members. Last year, Willamette University took a step toward reducing tensions surrounding the budget. As university administrators planned for the current year, they faced the high degree of uncertainty that the financial crisis has forced on…

  3. Pattern and process of biotic homogenization in the New Pangaea.

    PubMed

    Baiser, Benjamin; Olden, Julian D; Record, Sydne; Lockwood, Julie L; McKinney, Michael L

    2012-12-07

    Human activities have reorganized the Earth's biota, resulting in spatially disparate locales becoming more or less similar in species composition over time through the processes of biotic homogenization and biotic differentiation, respectively. Despite mounting evidence suggesting that this process may be widespread in both aquatic and terrestrial systems, past studies have predominantly focused on single taxonomic groups at a single spatial scale. Furthermore, change in pairwise similarity is itself dependent on two distinct processes, spatial turnover in species composition and changes in gradients of species richness, and most past research has failed to disentangle the effects of these two mechanisms on homogenization patterns. Here, we use recent statistical advances and collate a global database of homogenization studies (20 studies, 50 datasets) to provide the first global investigation of the homogenization process across major faunal and floral groups and to elucidate the relative roles of changes in species richness and turnover. We found evidence of homogenization (change in similarity ranging from -0.02 to 0.09) across nearly all taxonomic groups, spatial extents and grain sizes. Partitioning of the change in pairwise similarity shows that the overall change in community similarity is driven by changes in species richness. Our results show that biotic homogenization is truly a global phenomenon and call into question many of the ecological mechanisms invoked in previous studies to explain patterns of homogenization.
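
    A minimal sketch of the partitioning idea described above, in Python: pairwise Sørensen dissimilarity is split into its spatial-turnover and richness-difference (nestedness) components, Baselga-style, and compared between a historical and a current species pool. The species sets are hypothetical and the study's full statistical machinery is not reproduced.

    ```python
    # Sketch: partitioning change in pairwise similarity into turnover and
    # richness components (Baselga-style). Species sets are hypothetical.

    def partition_beta(site1: set, site2: set):
        """Return (total dissimilarity, turnover, nestedness) for two sites."""
        a = len(site1 & site2)                    # shared species
        b = len(site1 - site2)                    # unique to site 1
        c = len(site2 - site1)                    # unique to site 2
        beta_sor = (b + c) / (2 * a + b + c)      # Sorensen dissimilarity
        beta_sim = min(b, c) / (a + min(b, c))    # spatial turnover part
        beta_sne = beta_sor - beta_sim            # richness-difference part
        return beta_sor, beta_sim, beta_sne

    historical = ({"trout", "dace", "sculpin"}, {"dace", "perch", "pike"})
    current = ({"trout", "dace", "carp"}, {"dace", "carp", "pike"})

    d_hist, *_ = partition_beta(*historical)
    d_curr, turnover, nestedness = partition_beta(*current)
    # A drop in dissimilarity over time indicates biotic homogenization.
    print(f"change in dissimilarity: {d_curr - d_hist:+.3f}")
    print(f"current turnover {turnover:.3f}, nestedness {nestedness:.3f}")
    ```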

  4. The Largest Fragment of a Homogeneous Fragmentation Process

    NASA Astrophysics Data System (ADS)

    Kyprianou, Andreas; Lane, Francis; Mörters, Peter

    2017-03-01

    We show that in homogeneous fragmentation processes the largest fragment at time $t$ has size $e^{-t\Phi'(\bar{p})}\, t^{-\frac{3}{2}(\log\Phi)'(\bar{p}) + o(1)}$, where $\Phi$ is the Lévy exponent of the fragmentation process and $\bar{p}$ is the unique solution of the equation $(\log\Phi)'(\bar{p}) = \frac{1}{1+\bar{p}}$. We argue that this result is in line with predictions arising from the classification of homogeneous fragmentation processes as logarithmically correlated random fields.
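
    As a toy illustration of the equation defining $\bar{p}$, the sketch below assumes a uniform binary dislocation measure, for which the Lévy exponent is $\Phi(p) = p/(p+2)$; that choice is ours, not the paper's, and it admits the closed-form solution $\bar{p} = \sqrt{2}$ against which the numerical root can be checked.

    ```python
    # Sketch: solving (log Phi)'(p) = 1/(1+p) for a toy Levy exponent.
    # Phi(p) = p/(p+2) corresponds to uniform binary splitting (illustrative).
    import numpy as np
    from scipy.optimize import brentq

    def phi(p):
        return p / (p + 2.0)

    def dlog_phi(p, h=1e-6):
        # central-difference derivative of log(Phi)
        return (np.log(phi(p + h)) - np.log(phi(p - h))) / (2 * h)

    p_bar = brentq(lambda p: dlog_phi(p) - 1.0 / (1.0 + p), 0.5, 5.0)
    print(p_bar, np.sqrt(2))   # the two values agree for this Phi
    ```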

  5. Autoregressive Processes in Homogenization of GNSS Tropospheric Data

    NASA Astrophysics Data System (ADS)

    Klos, A.; Bogusz, J.; Teferle, F. N.; Bock, O.; Pottiaux, E.; Van Malderen, R.

    2016-12-01

    Offsets due to hardware changes or other artificial events must be addressed when homogenizing tropospheric data estimated during the processing of Global Navigation Satellite System (GNSS) observables. The task is to identify the exact epochs of offsets and to estimate their magnitudes, since offsets may artificially bias the trend and trend uncertainty derived from tropospheric data and used in climate studies. In this research, we analysed a common data set of Integrated Water Vapour (IWV) differences between GPS and ERA-Interim (1995-2010) provided to a homogenization group working within COST Action ES1206 GNSS4SWEC. We analysed the daily IWV records in terms of trend, seasonal terms and noise model using Maximum Likelihood Estimation in the Hector software, and found that the data have the character of an autoregressive (AR) process. Based on this analysis, we performed Monte Carlo simulations of 25-year-long series with two noise types, white noise alone and a combination of white and autoregressive noise, and added a few strictly defined offsets. This synthetic data set, with exactly the same character as the IWV differences between GPS and ERA-Interim, was then subjected to manual and automatic/statistical homogenization. In blind tests, we detected possible offset epochs manually. Simulated offsets were easily detected in the series with white noise, and no influence of the seasonal signal was noticed. The autoregressive series were much more problematic: we flagged a few epochs for which no offset had been simulated, mainly because the strong autocorrelation of the data introduces artificial trends. Owing to the regime-like behaviour of AR processes, statistical methods have difficulty detecting offset epochs correctly, as previously reported by climatologists.
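
    A minimal sketch of the detection problem: the same step offset is buried once in white noise and once in AR(1) noise, and a naive two-sample scan tries to locate it. All parameter values are illustrative, not those of the study.

    ```python
    # Sketch: why AR(1) noise hampers offset detection in IWV-like series.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 25 * 365                              # 25 years of daily data
    t = np.arange(n)

    def series(ar_coef):
        noise = np.zeros(n)
        eps = rng.normal(0.0, 1.0, n)
        for i in range(1, n):
            noise[i] = ar_coef * noise[i - 1] + eps[i]
        return np.where(t > n // 2, 1.5, 0.0) + noise   # one simulated offset

    def max_t_epoch(x, guard=200):
        # largest two-sample t statistic over candidate break points
        m = len(x)
        c1, c2 = np.cumsum(x), np.cumsum(x * x)
        ks = np.arange(guard, m - guard)
        m1, m2 = c1[ks - 1] / ks, (c1[-1] - c1[ks - 1]) / (m - ks)
        v1 = c2[ks - 1] / ks - m1 ** 2
        v2 = (c2[-1] - c2[ks - 1]) / (m - ks) - m2 ** 2
        tstat = np.abs(m1 - m2) / np.sqrt(v1 / ks + v2 / (m - ks))
        return ks[np.argmax(tstat)]

    print("white noise:", max_t_epoch(series(0.0)))   # close to n//2 = 4562
    print("AR(1) 0.9  :", max_t_epoch(series(0.9)))   # often far off the mark
    ```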

  6. Process to create simulated lunar agglutinate particles

    NASA Technical Reports Server (NTRS)

    Gustafson, Robert J. (Inventor); Gustafson, Marty A. (Inventor); White, Brant C. (Inventor)

    2011-01-01

    A method of creating simulated agglutinate particles by applying a heat source sufficient to partially melt a raw material is provided. The raw material is preferably any lunar soil simulant, crushed mineral, mixture of crushed minerals, or similar material, and the heat source creates localized heating of the raw material.

  7. Creep rupture as a non-homogeneous Poissonian process

    PubMed Central

    Danku, Zsuzsa; Kun, Ferenc

    2013-01-01

    Creep rupture of heterogeneous materials occurring under constant sub-critical external loads is responsible for the collapse of engineering constructions and for natural catastrophes. Acoustic monitoring of crackling bursts provides microscopic insight into the failure process. Based on a fiber bundle model, we show that the accelerating bursting activity when approaching failure can be described by the Omori law. For long-range load redistribution, the time series of bursts proved to be a non-homogeneous Poissonian process with power-law distributed burst sizes and waiting times. We demonstrate that limitations of experiments, such as a finite detection threshold and time resolution, have striking effects on the characteristic exponents, which have to be taken into account when comparing model calculations with experiments. Recording events solely within the Omori time to failure, the size distribution of bursts shows a crossover to a lower exponent, which is promising for forecasting imminent catastrophic failure. PMID:24045539
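
    The accelerating, non-homogeneous Poissonian character described above can be illustrated by simulating burst times with the standard thinning algorithm, using an Omori-type rate lambda(t) = A/(t_f - t). The constants are invented, not taken from the fiber bundle model.

    ```python
    # Sketch: non-homogeneous Poisson bursts accelerating toward failure,
    # simulated by thinning a bounding homogeneous process.
    import numpy as np

    rng = np.random.default_rng(0)
    t_f, A = 100.0, 50.0
    horizon = 99.0                      # stop short of the singularity at t_f

    def rate(t):
        return A / (t_f - t)            # Omori-type acceleration

    lam_max = rate(horizon)             # bounding rate on [0, horizon]
    events, t = [], 0.0
    while t < horizon:
        t += rng.exponential(1.0 / lam_max)           # candidate event time
        if t < horizon and rng.uniform() < rate(t) / lam_max:
            events.append(t)                          # accept w.p. rate/lam_max

    waits = np.diff(events)
    print(len(events), "bursts; mean waiting time", round(waits.mean(), 3))
    ```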

  8. Competing Contact Processes on Homogeneous Networks with Tunable Clusterization

    NASA Astrophysics Data System (ADS)

    Rybak, Marcin; Kułakowski, Krzysztof

    2013-03-01

    We investigate two homogeneous networks: the Watts-Strogatz network with mean degree ⟨k⟩ = 4 and the Erdős-Rényi network with ⟨k⟩ = 10. In both kinds of networks, the clustering coefficient C is a tunable control parameter. The network is the arena of two competing contact processes, where nodes can be in two states, S or D. A node in state S becomes D with probability 1 if at least two of its mutually linked neighbors are D; a node in state D becomes S with a given probability p if at least one of its neighbors is S. The competition between the processes is described by a phase diagram in which the critical probability pc depends on the clustering coefficient C. For p > pc the fraction of nodes in state S increases in time and appears to take over the whole system; below pc, the majority of nodes end up in state D. The numerical results indicate that for the Watts-Strogatz network the D-process is activated at a finite value of the clustering coefficient C, close to 0.3, whereas for the Erdős-Rényi network the transition is observed over the whole investigated range of C.
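
    A minimal sketch of the update rules as stated above, run on a Watts-Strogatz graph with networkx; the system size, rewiring probability and p are illustrative choices, not the paper's settings.

    ```python
    # Sketch: competing S/D contact processes on a Watts-Strogatz network.
    import random
    import networkx as nx

    N, p_recover = 1000, 0.6
    g = nx.watts_strogatz_graph(N, k=4, p=0.1, seed=1)   # mean degree 4
    state = {v: random.choice("SD") for v in g}

    def step():
        v = random.randrange(N)
        nbrs = list(g[v])
        if state[v] == "S":
            # S -> D if at least two neighbors are D and linked to each other
            d = [u for u in nbrs if state[u] == "D"]
            if any(g.has_edge(u, w) for i, u in enumerate(d) for w in d[i + 1:]):
                state[v] = "D"
        elif any(state[u] == "S" for u in nbrs):
            # D -> S with probability p if at least one neighbor is S
            if random.random() < p_recover:
                state[v] = "S"

    for _ in range(200 * N):
        step()
    print("fraction in state S:", sum(s == "S" for s in state.values()) / N)
    ```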

  9. Experimenting With Ore: Creating the Taconite Process; flow chart of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Experimenting With Ore: Creating the Taconite Process; flow chart of process - Mines Experiment Station, University of Minnesota, Twin Cities Campus, 56 East River Road, Minneapolis, Hennepin County, MN

  10. Creating Documentary Theatre as Educational Process.

    ERIC Educational Resources Information Center

    Hirschfeld-Medalia, Adeline

    With the celebration of the United States bicentennial as impetus, university students and faculty attempted several approaches to the creation of a touring documentary production composed almost completely from primary sources. This paper describes the process involved in producing a traveling show which featured groups relatively excluded from…

  11. Can An Evolutionary Process Create English Text?

    SciTech Connect

    Bailey, David H.

    2008-10-29

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
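
    The contrast drawn above, evolution on a fitness landscape rather than toward a fixed target, can be sketched in a few lines: candidates are scored by how well their letter-pair statistics match a reference corpus, so no particular output string is pre-specified. The corpus and scoring below are stand-ins, not the study's actual scheme.

    ```python
    # Sketch: target-free evolution of text toward corpus-like bigram
    # statistics (illustrative scoring, not the paper's method).
    import random
    from collections import Counter

    CORPUS = "it was the best of times it was the worst of times "
    ALPHABET = "abcdefghijklmnopqrstuvwxyz "

    def bigrams(s):
        return Counter(s[i:i + 2] for i in range(len(s) - 1))

    REF = bigrams(CORPUS)

    def fitness(s):
        bg = bigrams(s)
        keys = set(REF) | set(bg)
        # negative L1 distance between bigram frequencies; no fixed target
        return -sum(abs(REF[k] / len(CORPUS) - bg[k] / len(s)) for k in keys)

    def mutate(s):
        i = random.randrange(len(s))
        return s[:i] + random.choice(ALPHABET) + s[i + 1:]

    pop = ["".join(random.choice(ALPHABET) for _ in range(40)) for _ in range(60)]
    for _ in range(400):
        pop.sort(key=fitness, reverse=True)
        pop = pop[:20] + [mutate(random.choice(pop[:20])) for _ in range(40)]
    print(max(pop, key=fitness))   # English-like letter pairs, no target string
    ```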

  12. Improved microbiological diagnostic due to utilization of a high-throughput homogenizer for routine tissue processing.

    PubMed

    Redanz, Sylvio; Podbielski, Andreas; Warnke, Philipp

    2015-07-01

    Tissue specimens are valuable materials for microbiological diagnostics and require swift and accurate processing. Established processing methods are complex, labor intensive, hardly if at all standardizable, and prone to incorporate contaminants. To improve analyses of tissue samples in routine microbiological diagnostics, by simplifying, accelerating and standardizing processing as well as increasing the microbial yield, the performance of the Precellys 24 high-throughput tissue homogenizer was evaluated. Tissue samples were artificially inoculated with Staphylococcus aureus, Escherichia coli, and Candida albicans in 3 different ways, on the surface and within the material, and the microbial yield from homogenized samples was compared to the direct plating method. Further, as proof of principle, routine tissue samples from knee and hip endoprosthesis infections were analyzed. Tissue homogenization with the Precellys 24 homogenizer is easy and fast to perform and allows a high degree of standardization. Microbial yield after homogenization was significantly higher than with the conventional plating technique.

  13. [Chemiluminescence spectroscopic analysis of homogeneous charge compression ignition combustion processes].

    PubMed

    Liu, Hai-feng; Yao, Ming-fa; Jin, Chao; Zhang, Peng; Li, Zhe-ming; Zheng, Zun-qing

    2010-10-01

    To study the combustion reaction kinetics of homogeneous charge compression ignition (HCCI) under different port injection strategies and intake temperature conditions, tests were carried out on a modified single-cylinder optical engine using chemiluminescence spectroscopic analysis. The experiments kept the fuel mass constant, used n-heptane as the fuel, held the speed at 600 r/min and the inlet pressure at 0.1 MPa, and controlled the inlet temperature at 95 °C and 125 °C, respectively. The chemiluminescence spectra show that the emission is quite faint during low temperature heat release (LTHR), and these band spectra originate from formaldehyde (CH2O) chemiluminescence. During the phase spanning later LTHR, the negative temperature coefficient (NTC) region and early high temperature heat release (HTHR), the band spectra likewise originate from formaldehyde (CH2O) chemiluminescence. The CO-O* continuum is strong during HTHR, and radicals such as OH, HCO, CH and CH2O appear superimposed on this continuum. After the HTHR, the chemiluminescence intensity is again quite faint. Compared with a start of injection (SOI) of -30° ATDC, the chemiluminescence intensity is higher under the SOI = -300° ATDC condition due to more intense emission of the CO-O* continuum; more HCO and OH radicals are also formed, indicating a more intense combustion reaction. Similarly, a more intense CO-O* continuum and more HCO and OH radicals are emitted at the higher intake temperature.

  14. A criterion for assessing homogeneity distribution in hyperspectral images. Part 1: homogeneity index bases and blending processes.

    PubMed

    Rosas, Juan G; Blanco, Marcelo

    2012-11-01

    The Process Analytical Technologies (PAT) initiative of the US Food and Drug Administration (US FDA) has established a framework for the development of imaging techniques to determine the real-time distribution of mixture components during the production of solid dosage forms. This study, the first in a series of two parts, uses existing mixing indices and a new criterion called the "percentage of homogeneity" (H%) to assess image homogeneity. Image analysis techniques use feature extraction procedures to extract information from images subjected to treatments including colour segmentation and binarization. The surface distribution of components was determined by macropixel analysis, which splits an image into non-overlapping blocks of a preset size and calculates several statistical parameters for the resulting divisional structure; these parameters were used to compute mixing indices. In this work, we explored the potential of image processing in combination with mixing indices and H% for assessing blending end-point and component distribution on images. As a simplified test, an arrangement of binary and ternary systems of coloured particles was mixed while collecting at-line multispectral (MSI) and non-invasive RGB pictures at preset intervals.
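
    The macropixel step described above reduces to a block-statistics computation. The sketch below splits a binarized component map into non-overlapping blocks, derives per-block concentrations, and forms a generic variance-based mixing index; this index is a stand-in for the paper's H%, whose exact definition is given in the article.

    ```python
    # Sketch: macropixel analysis of a binarized component map.
    import numpy as np

    rng = np.random.default_rng(3)
    img = rng.random((128, 128)) < 0.3        # hypothetical binary map

    def block_concentrations(img, bs=16):
        h, w = img.shape
        blocks = img[:h - h % bs, :w - w % bs].reshape(h // bs, bs, w // bs, bs)
        return blocks.mean(axis=(1, 3))       # component fraction per block

    bs = 16
    conc = block_concentrations(img, bs)
    p = img.mean()                            # overall component fraction
    s2 = conc.var()                           # between-block variance
    s2_random = p * (1 - p) / bs ** 2         # fully mixed (binomial) limit
    mixing_index = 1 - np.sqrt(max(s2 - s2_random, 0) / (p * (1 - p)))
    print(f"overall fraction {p:.3f}, mixing index {mixing_index:.3f}")
    ```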

  15. Non-homogeneous biofilm modeling applied to bioleaching processes.

    PubMed

    Olivera-Nappa, Alvaro; Picioreanu, Cristian; Asenjo, Juan A

    2010-07-01

    A two-dimensional non-homogeneous biofilm model is proposed for the first time to study chemical and biochemical reactions at the microorganism scale applied to biological metal leaching from mineral ores. The spatial and temporal relation between these reactions, microorganism growth and the morphological changes of the biofilm caused by solid inorganic precipitate formation were studied using this model. The model considers diffusion limitations due to accumulation of inorganic particles over the mineral substratum, and allows the study of the effect of discrete phases on chemical and microbiological mineral solubilization. The particle-based modeling strategy allowed representation of contact reactions between the microorganisms and the insoluble precipitates, such as those required for sulfur attack and solubilization. Time-dependent simulations of chemical chalcopyrite leaching showed that chalcopyrite passivation occurs only when an impervious solid layer is formed on the mineral surface. This mineral layer hinders the diffusion of one kinetically determinant mineral-attacking chemical species through a nearly irreversible chemical mechanism. Simulations with iron and sulfur oxidizing microorganisms revealed that chemolithoautotrophic biofilms are able to delay passivation onset by formation of corrosion pits and increase of the solid layer porosity through sulfur dissolution. The model results also show that the observed flat morphology of bioleaching biofilms is favored preferentially at low iron concentrations due to preferential growth at the biofilm edge on the surface of sulfur-forming minerals. Flat biofilms can also be advantageous for chalcopyrite bioleaching because they tend to favor sulfur dissolution over iron oxidation. The adopted modeling strategy is of great interest for the numerical representation of heterogeneous biofilm systems including abiotic solid particles.

  16. Homogeneous and Heterogeneous Catalytic Processes Promoted by Organoactinides

    NASA Astrophysics Data System (ADS)

    Burns, Carol J.; Eisen, Moris S.

    During the last two decades, the chemistry of organoactinides has flourished, reaching a high level of sophistication. The use of organoactinide complexes as stoichiometric or catalytic compounds to promote synthetically important organic transformations has matured due to their rich, complex, and uniquely informative organometallic chemistry. Compared to early or late transition metal complexes, the actinides sometimes exhibit parallel and sometimes totally different reactivities for similar processes. In many instances the regiospecific and chemical selectivities displayed by organoactinide complexes are complementary to that observed for other transition metal complexes. Several recent review articles (Edelman et al., 1995; Edelmann and Gun'ko, 1997; Ephritikhine, 1997; Hitchcock et al., 1997; Berthet and Ephritikhine, 1998; Blake et al., 1998; Edelmann and Lorenz, 2000), dealing mostly with the synthesis of new actinide complexes, confirm the broad and rapidly expanding scope of this field.

  17. Effect of homogenization process on the hardness of Zn-Al-Cu alloys

    NASA Astrophysics Data System (ADS)

    Villegas-Cardenas, Jose D.; Saucedo-Muñoz, Maribel L.; Lopez-Hirata, Victor M.; De Ita-De la Torre, Antonio; Avila-Davila, Erika O.; Gonzalez-Velazquez, Jorge Luis

    2015-10-01

    The effect of a homogenizing treatment on the hardness of as-cast Zn-Al-Cu alloys was investigated. Eight alloy compositions were prepared and homogenized at 350 °C for 180 h, and their Rockwell "B" hardness was subsequently measured. All the specimens were analyzed by X-ray diffraction and metallographically prepared for observation by optical microscopy and scanning electron microscopy. The results indicated that hardness in both conditions (as-cast and homogenized) increased with increasing Al and Cu contents; this increase is likely related to the presence of the θ and τ' phases. A regression equation was obtained that gives the hardness of the homogenized alloys as a function of their chemical composition and of processing parameters, such as the homogenization time and temperature used in their preparation.
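
    A regression of that general shape is easy to reproduce. The sketch below fits hardness as a linear function of Al and Cu content by least squares; the compositions and hardness values are invented, and the paper's actual equation also involves homogenization time and temperature.

    ```python
    # Sketch: least-squares hardness model HRB ~ f(Al, Cu). Data invented.
    import numpy as np

    al  = np.array([ 5, 10, 15, 20, 25, 30, 35, 40], dtype=float)   # wt.% Al
    cu  = np.array([ 1,  2,  1,  3,  2,  3,  1,  2], dtype=float)   # wt.% Cu
    hrb = np.array([38, 47, 52, 63, 66, 74, 71, 80], dtype=float)   # Rockwell B

    A = np.column_stack([np.ones_like(al), al, cu])
    coef, *_ = np.linalg.lstsq(A, hrb, rcond=None)
    print("HRB = {:.1f} + {:.2f}*Al + {:.2f}*Cu".format(*coef))
    ```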

  18. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry for monitoring the homogenization process. The obtained data were fitted using a novel heuristic model, and it was possible to identify characteristic homogenization times for each formulation. The subsequent study of process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers, and this novel technique can become a vital tool for the development and production of pharmaceutical suspensions.
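
    The notion of a characteristic homogenization time can be illustrated by fitting an online signal with a first-order approach to a plateau; the paper's heuristic model is not reproduced here, so the exponential form and all numbers below are stand-ins.

    ```python
    # Sketch: characteristic time from v(t) = v_inf + dv * exp(-t / tau),
    # fitted to a synthetic ultrasound-velocity trace.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(2)
    t = np.linspace(0, 30, 31)                                      # minutes
    v = 1485 + 12 * np.exp(-t / 6.0) + rng.normal(0, 0.3, t.size)   # m/s

    def model(t, v_inf, dv, tau):
        return v_inf + dv * np.exp(-t / tau)

    (v_inf, dv, tau), _ = curve_fit(model, t, v, p0=(1480.0, 10.0, 5.0))
    print(f"plateau {v_inf:.1f} m/s, characteristic time {tau:.1f} min")
    ```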

  19. A Tool for Creating Healthier Workplaces: The Conducivity Process

    ERIC Educational Resources Information Center

    Karasek, Robert A.

    2004-01-01

    The conducivity process, a methodology for creating healthier workplaces by promoting conducive production, is illustrated through the use of the "conducivity game" developed in the NordNet Project in Sweden, which was an action research project to test a job redesign methodology. The project combined the "conducivity" hypotheses about a…

  1. A test of homogeneity for age-dependent branching processes with immigration

    PubMed Central

    Yanev, Nikolay M.; Jordan, Craig T.

    2016-01-01

    We propose a novel procedure to test whether the immigration process of a discretely observed age-dependent branching process with immigration is time-homogeneous. The construction of the test is motivated by the behavior of the coefficient of variation of the population size. When immigration is time-homogeneous, we find that this coefficient converges to a constant, whereas when immigration is time-inhomogeneous we find that it is time-dependent, at least transiently. Thus, we test the assumption that the immigration process is time-homogeneous by verifying that the sample coefficient of variation does not vary significantly over time. The test is simple to implement and does not require specification or fitting any branching process to the data. Simulations and an application to real data on the progression of leukemia are presented to illustrate the approach. PMID:27134694
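
    The idea of the test is easy to mimic numerically: track the sample coefficient of variation (CV) of population size across replicates and check whether it settles to a constant. The branching mechanism below is a deliberately crude stand-in for an age-dependent process, with Poisson immigration that is either constant or drifting.

    ```python
    # Sketch: sample CV of population size under constant vs drifting
    # Poisson immigration (toy branching dynamics, illustrative only).
    import numpy as np

    rng = np.random.default_rng(42)
    reps, T = 200, 60

    def simulate(immigration):
        pop = np.zeros((reps, T), dtype=np.int64)
        for t in range(1, T):
            offspring = rng.binomial(pop[:, t - 1] * 2, 0.5)   # mean 1/parent
            pop[:, t] = offspring + rng.poisson(immigration(t), reps)
        return pop

    def cv_path(pop):
        return pop[:, 1:].std(axis=0, ddof=1) / pop[:, 1:].mean(axis=0)

    print("constant :", np.round(cv_path(simulate(lambda t: 5.0))[-5:], 2))
    print("drifting :", np.round(cv_path(simulate(lambda t: 5.0 + 0.5 * t))[-5:], 2))
    ```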

  2. A test of homogeneity for age-dependent branching processes with immigration.

    PubMed

    Hyrien, Ollivier; Yanev, Nikolay M; Jordan, Craig T

    We propose a novel procedure to test whether the immigration process of a discretely observed age-dependent branching process with immigration is time-homogeneous. The construction of the test is motivated by the behavior of the coefficient of variation of the population size. When immigration is time-homogeneous, we find that this coefficient converges to a constant, whereas when immigration is time-inhomogeneous we find that it is time-dependent, at least transiently. Thus, we test the assumption that the immigration process is time-homogeneous by verifying that the sample coefficient of variation does not vary significantly over time. The test is simple to implement and does not require specification or fitting any branching process to the data. Simulations and an application to real data on the progression of leukemia are presented to illustrate the approach.

  3. Markov processes and partial differential equations on a group: the space-homogeneous case

    NASA Astrophysics Data System (ADS)

    Bendikov, A. D.

    1987-10-01

    Contents: Introduction. Terminology and notation. Chapter I. Potential theory of conjugate processes: §1.1 Markov processes and harmonic spaces; §1.2 Processes of class $\mathscr{A}$ and Brelot spaces; §1.3 Processes of class $\mathscr{B}$ and Bauer spaces. Chapter II. Space-homogeneous processes on a group: §2.1 Space-homogeneous processes and harmonic structures; §2.2 Quasidiagonal processes; §2.3 An example of a non-quasidiagonal process. Chapter III. Elliptic equations on a group: §3.1 Admissible distributions and multipliers; §3.2 Weak solutions of elliptic equations ($L_p$-theory); §3.3 Weyl's lemma and the hypoelliptic property. References.

  4. Process spectroscopy in microemulsions—Raman spectroscopy for online monitoring of a homogeneous hydroformylation process

    NASA Astrophysics Data System (ADS)

    Paul, Andrea; Meyer, Klas; Ruiken, Jan-Paul; Illner, Markus; Müller, David-Nicolas; Esche, Erik; Wozny, Günther; Westad, Frank; Maiwald, Michael

    2017-03-01

    A major industrial reaction based on homogeneous catalysis is hydroformylation, the production of aldehydes from alkenes and syngas. Hydroformylation in microemulsions, currently under investigation at Technische Universität Berlin on a mini-plant scale, was identified as a cost-efficient approach that also enhances product selectivity. Herein, we present the application of online Raman spectroscopy to the reaction of 1-dodecene to 1-tridecanal within a microemulsion. To achieve a good representation of the operating range of the mini-plant with regard to reactant concentrations, a design of experiments was used. Based on initial Raman spectra, partial least squares regression (PLSR) models were calibrated for the prediction of 1-dodecene and 1-tridecanal. Limits of prediction arise from nonlinear correlations between Raman intensity and the mass fractions of compounds in the microemulsion system; the predictive power of the PLSR models is further limited by unexpected by-product formation. Applying the lab-scale calibration spectra and PLSR models to online spectra from mini-plant operation yielded promising estimates of 1-tridecanal and acceptable predictions of 1-dodecene mass fractions, suggesting Raman spectroscopy as a suitable technique for process analytics in microemulsions.
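
    The calibration step can be sketched with scikit-learn: a PLS regression maps spectra to mass fractions. The spectra below are synthetic Gaussian bands with noise, standing in for baseline-corrected Raman spectra; the band positions and all other numbers are illustrative.

    ```python
    # Sketch: PLSR calibration from (synthetic) spectra to mass fractions.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    wn = np.linspace(600, 1800, 400)                 # wavenumber axis, 1/cm

    def band(center, width):
        return np.exp(-0.5 * ((wn - center) / width) ** 2)

    n = 40
    y = rng.uniform(0.0, 0.3, n)                     # product mass fraction
    X = (np.outer(y, band(1715, 12))                 # growing product band
         + np.outer(0.3 - y, band(1640, 15))         # depleting educt band
         + 0.01 * rng.normal(size=(n, wn.size)))

    pls = PLSRegression(n_components=3)
    print(cross_val_score(pls, X, y, cv=5, scoring="r2").round(3))
    ```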

  5. Parallel-Processing Software for Creating Mosaic Images

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Deen, Robert; McCauley, Michael; DeJong, Eric

    2008-01-01

    A computer program implements parallel processing for nearly real-time creation of panoramic mosaics of images of terrain acquired by video cameras on an exploratory robotic vehicle (e.g., a Mars rover). Because the original images are typically acquired at various camera positions and orientations, it is necessary to warp the images into the reference frame of the mosaic before stitching them together to create the mosaic. [Also see "Parallel-Processing Software for Correlating Stereo Images," Software Supplement to NASA Tech Briefs, Vol. 31, No. 9 (September 2007) page 26.] The warping algorithm in this computer program reflects the considerations that (1) for every pixel in the desired final mosaic, a good corresponding point must be found in one or more of the original images and (2) for this purpose, one needs a good mathematical model of the cameras and a good correlation of individual pixels with respect to their positions in three dimensions. The desired mosaic is divided into slices, each of which is assigned to one of a number of central processing units (CPUs) operating simultaneously. The results from the CPUs are gathered and placed into the final mosaic. The time taken to create the mosaic depends upon the number of CPUs, the speed of each CPU, and whether a local or a remote data-staging mechanism is used.
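
    The slice-per-CPU decomposition is straightforward to mimic; the sketch below uses Python's multiprocessing in place of whatever parallel framework the original program used, and the warping step is a stub where the camera-model-based resampling would go.

    ```python
    # Sketch: assembling a mosaic from slices "warped" in parallel.
    import numpy as np
    from multiprocessing import Pool

    H, W, N_SLICES = 512, 2048, 8

    def warp_slice(args):
        idx, rows = args
        # Stub: a real implementation samples source images through a camera
        # model; here each output pixel is just filled with its slice index.
        return idx, np.full((rows, W), idx, dtype=np.uint8)

    def build_mosaic():
        rows = H // N_SLICES
        with Pool() as pool:
            parts = pool.map(warp_slice, [(i, rows) for i in range(N_SLICES)])
        mosaic = np.empty((H, W), dtype=np.uint8)
        for idx, block in sorted(parts, key=lambda p: p[0]):
            mosaic[idx * rows:(idx + 1) * rows] = block
        return mosaic

    if __name__ == "__main__":
        print(build_mosaic().shape)
    ```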

  6. Deep nursing: a thoughtful, co-created nursing process.

    PubMed

    Griffiths, Colin

    2017-03-30

    This article examines some of the challenges in nursing practice experienced by patients and nurses in the UK and Ireland, and considers some of the associated stressors in the system. Nurses must respond to these challenges by crafting their own practice, and the article offers a blueprint for developing personal nursing practice through acceptance, paying detailed attention to patients, taking time with patients and personal reflection. It draws on innovations in learning disability practice to suggest that care should be jointly thought through and co-created by patients and nurses, and that this process of thoughtful engagement constitutes 'deep nursing'.

  7. A homogeneous microwave curing process for epoxy/glass fibre composites

    SciTech Connect

    Outifa, L.; Delmotte, M.; Jullien, H.

    1995-12-01

    A global model is established which describes a successful process for optimised and homogeneous curing of large epoxy/glass fibre composite pieces. The process is characterised by a strong coupling between dielectric, chemical and thermal aspects. It appears that the use of a single-mode travelling continuous wave is necessary to obtain a homogeneous distribution of the electric field. Dielectric properties of matched transitions and of moulds are selected to focus the energy inside the material to be cured. A thermal model describes the energy transfers related to chemical and dielectric heat sources and their variations during processing. The whole model is founded on general considerations and equations and can therefore be extended to many other configurations.

  8. Low-risk gasoline alkylation process using a homogeneous liquid phase catalyst

    SciTech Connect

    Nelson, S.R.; Nelson, L.G.

    1996-12-31

    Kerr-McGee's interest in finding additional applications for its ROSES technology has led to a promising new alkylation process for the production of gasoline. The technology is timely due to its inherent environmental safety. The Homogeneous Alkylation Technology (HAT™) process uses a soluble alkylaluminum chloride-based catalyst at less than 1 percent of the acid concentrations used in conventional alkylation processes. The patented process greatly reduces the environmental risks associated with accidental acid releases from HF and sulfuric acid alkylation units. In addition, the process is projected to operate at lower cost than sulfuric acid alkylation and is a retrofit option for existing HF and sulfuric-acid alkylation units. Kerr-McGee has entered into a relationship with a major U.S. refiner to carry on the development of the HAT process. A gallon-per-day-scale pilot unit has been constructed for use in developing the process. 1 fig., 1 tab.

  9. The Processes and Timescales That Produce Zoning and Homogeneity in Magmatic Systems

    NASA Astrophysics Data System (ADS)

    Bergantz, G. W.; Bachmann, O.

    2006-12-01

    Erupted sequences that are continuously zoned are common in both intermediate and silicic systems. The zoning is established and sustained primarily by chaotic low-Reynolds-number convection associated with cooling, crystallization, degassing and/or the addition of new magma. The dynamics of this process and the formation of the gradients are reasonably well understood. In most cases, systems that show continuous large-scale zoning are mechanically dominated by the silicate liquid phase. However, as the crystallinity increases, many magmatic systems display a dual nature: a lack of large-scale zoning, with compositional uniformity at the macro-to-meso scales, accompanied by great complexity of texture and age at the micro-to-meso scale. This is particularly well expressed by an absence of whole-rock compositional gradients in Monotonous Intermediate ignimbrites and in many plutons. Since most convective processes will produce large-scale gradients, what processes can produce large-scale homogeneity but crystal-scale heterogeneity? We propose and exemplify a multistage process that combines large-scale but low-Reynolds-number circulation, which produces a complex crystal cargo (the active regime), with a propagating rheological capture front that can lock in and sustain homogeneity (the mushy regime). The gradients in the crystal-rich, propagating mushy regime are minor, and near-eutectic conditions buffer the compositions and intensive variables, producing homogeneity. Repeated rejuvenation or unlocking of the crystal-rich mushy material back into the convective regime yields the common observation of reversals in temperature and prolonged crystallization histories seen in zoned crystals. This approach unifies the application of new multiphase fluid dynamics and emerging micro-analytical techniques with the 'convective liquidus/solidification front' of Marsh and the 'defrosting' model of Mahood.

  10. Selves creating stories creating selves: a process model of self-development.

    PubMed

    McLean, Kate C; Pasupathi, Monisha; Pals, Jennifer L

    2007-08-01

    This article is focused on the growing empirical emphasis on connections between narrative and self-development. The authors propose a process model of self-development in which storytelling is at the heart of both stability and change in the self. Specifically, we focus on how situated stories help develop and maintain the self with reciprocal impacts on enduring aspects of self, specifically self-concept and the life story. This article emphasizes the research that has shown how autobiographical stories affect the self and provides a direction for future work to maximize the potential of narrative approaches to studying processes of self-development.

  11. Land Use and the Democratic Process: Creating Your Own Magazine

    ERIC Educational Resources Information Center

    King, David C.

    1976-01-01

    Presents ideas for creating an environmental education publication or television documentary as a secondary level class activity. Suggested topics for the publication include information on international land use, climatic factors, food production, nuclear energy, and individual freedom versus land use restriction. (Author/DB)

  12. Process for forming a homogeneous oxide solid phase of catalytically active material

    DOEpatents

    Perry, Dale L.; Russo, Richard E.; Mao, Xianglei

    1995-01-01

    A process is disclosed for forming a homogeneous oxide solid phase reaction product of catalytically active material comprising one or more alkali metals, one or more alkaline earth metals, and one or more Group VIII transition metals. The process comprises reacting together one or more alkali metal oxides and/or salts, one or more alkaline earth metal oxides and/or salts, one or more Group VIII transition metal oxides and/or salts, capable of forming a catalytically active reaction product, in the optional presence of an additional source of oxygen, using a laser beam to ablate from a target such metal compound reactants in the form of a vapor in a deposition chamber, resulting in the deposition, on a heated substrate in the chamber, of the desired oxide phase reaction product. The resulting product may be formed in variable, but reproducible, stoichiometric ratios. The homogeneous oxide solid phase product is useful as a catalyst, and can be produced in many physical forms, including thin films, particulate forms, coatings on catalyst support structures, and coatings on structures used in reaction apparatus in which the reaction product of the invention will serve as a catalyst.

  13. Cyclization of 1,4-hydroxycarbonyls is not a homogenous gas phase process

    NASA Astrophysics Data System (ADS)

    Dibble, Theodore S.

    2007-10-01

    Previous studies of 1,4-hydroxycarbonyls derived from alkanes have suggested that they can cyclize to saturated furans, which can subsequently eliminate water to form the corresponding dihydrofurans. CBS-QB3 and G3 studies of 5-hydroxy-2-pentanone and 2-hydroxypentanal show that both steps have activation barriers far too large for these reactions to occur as homogenous gas phase reactions. Similar results were obtained in CBS-QB3 studies of the analogous process leading from 2- and 3-methyl-4-hydroxy-2-butenal (species posited to form in the degradation of isoprene) to 3-methylfuran. The latter two processes are much more favorable, thermodynamically, than the formation of dihydrofurans from the saturated 1,4-hydroxycarbonyls.
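
    To see why "far too large" barriers rule out a homogeneous gas-phase step, an activation barrier can be converted into a unimolecular rate with the Eyring equation; the barrier heights below are invented round numbers, not the CBS-QB3/G3 values from the study.

    ```python
    # Sketch: k = (kB*T/h) * exp(-dG/(R*T)) for two hypothetical barriers.
    import numpy as np

    kB, h, R, T = 1.380649e-23, 6.62607015e-34, 8.314462618, 298.15
    for dG_kcal in (20.0, 40.0):                 # kcal/mol, illustrative
        dG = dG_kcal * 4184.0                    # J/mol
        k = (kB * T / h) * np.exp(-dG / (R * T))
        print(f"dG = {dG_kcal:.0f} kcal/mol -> k = {k:.2e} 1/s")
    ```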

  14. A hybrid process combining homogeneous catalytic ozonation and membrane distillation for wastewater treatment.

    PubMed

    Zhang, Yong; Zhao, Peng; Li, Jie; Hou, Deyin; Wang, Jun; Liu, Huijuan

    2016-10-01

    A novel catalytic ozonation membrane reactor (COMR) coupling homogeneous catalytic ozonation and direct contact membrane distillation (DCMD) was developed for treating refractory saline organic pollutants in wastewater. An ozonation process takes place in the reactor to degrade organic pollutants, whilst the DCMD process is used to recover ionic catalysts and produce clean water. It was found that 98.6% of total organic carbon (TOC) and almost 100% of salt were removed, and almost 100% of the metal ion catalyst was recovered. TOC in the permeate water was less than 16 mg/L after 5 h of operation, which is satisfactory given that the TOC of the potassium hydrogen phthalate (KHP) feed water was as high as 1000 mg/L. Meanwhile, the membrane distillation flux in the COMR process was 49.8% higher than in the DCMD process alone after 60 h of operation. Further, scanning electron microscope images showed a smaller amount and size of contaminants on the membrane surface, indicating mitigation of membrane fouling. Tensile strength and FT-IR spectra tests did not reveal obvious changes in the polyvinylidene fluoride membrane after 60 h of operation, indicating good durability. This novel COMR hybrid process exhibits promising prospects for saline organic wastewater treatment.

  15. Novel particulate production processes to create unique security materials

    NASA Astrophysics Data System (ADS)

    Hampden-Smith, Mark; Kodas, Toivo; Haubrich, Scott; Oljaca, Miki; Einhorn, Rich; Williams, Darryl

    2006-02-01

    Particles are frequently used to impart security features to high value items. These particles are typically produced by traditional methods, and therefore the security must be derived from the chemical composition of the particles rather than the particle production process. Here, we present new and difficult-to-reproduce particle production processes based on spray pyrolysis that can produce unique particles and features that are dependent on the use of these new-to-the-world processes and process trade secrets. Specifically two examples of functional materials are described, luminescent materials and electrocatalytic materials.

  16. The influence of homogenization process on lasing performance in polymer-nematic liquid crystal emulsions

    NASA Astrophysics Data System (ADS)

    Adamow, Alina; Sznitko, Lech; Mysliwiec, Jaroslaw

    2017-07-01

    In this letter we report results of studies of amplified spontaneous emission in polymer - liquid crystal emulsions based on mixtures of poly(vinyl alcohol) and the nematic liquid crystal 5CB doped with three luminescent dyes: DCM, Coumarin 504 and Coumarin 540. The mixture of dyes was used to extend the range of the stimulated emission spectra. We investigated the emission properties of four samples with different sizes and distributions of liquid crystal micro-droplets, controlled by the duration of ultrasound exposure during the homogenization process. We determined the threshold conditions for the occurrence of stimulated emission and compared the emission spectra obtained below and above threshold.

  17. An empirical Bayesian and Buhlmann approach with non-homogenous Poisson process

    NASA Astrophysics Data System (ADS)

    Noviyanti, Lienda

    2015-12-01

    All general insurance companies in Indonesia have to adjust their current premium rates to the maximum and minimum limit rates in the new regulation established by the Financial Services Authority (Otoritas Jasa Keuangan / OJK). In this research, we estimated premium rates by means of the Bayesian and the Bühlmann approaches using historical claim frequency and claim severity for five risk groups. We assumed Poisson-distributed claim frequencies and Normally distributed claim severities, and in particular used a non-homogeneous Poisson process to estimate the parameters of the claim frequency. We found that the estimated premium rates are higher than the actual current rates. Relative to the OJK upper and lower limit rates, the estimates for the five risk groups vary: some fall inside the interval and some outside.
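
    The Bühlmann step admits a compact illustration: given yearly claim frequencies per risk group, the credibility premium blends each group's mean with the grand mean. The claim counts below are invented, and the paper's NHPP intensity fitting is not reproduced.

    ```python
    # Sketch: Buhlmann credibility premiums for grouped claim frequencies.
    import numpy as np

    # rows: risk groups, columns: yearly claim counts (hypothetical)
    X = np.array([[ 3,  4, 2,  5,  4],
                  [ 8,  7, 9,  6,  8],
                  [ 1,  2, 1,  0,  2],
                  [ 5,  5, 6,  4,  5],
                  [10, 12, 9, 11, 10]], dtype=float)
    r, n = X.shape
    group_means = X.mean(axis=1)
    grand_mean = X.mean()

    v = X.var(axis=1, ddof=1).mean()            # expected process variance
    a = group_means.var(ddof=1) - v / n         # variance of hypothetical means
    Z = n / (n + v / a)                         # credibility factor (a > 0 here)
    premium = Z * group_means + (1 - Z) * grand_mean
    print(np.round(premium, 2))
    ```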

  18. Volume Averaging Study of the Capacitive Deionization Process in Homogeneous Porous Media

    SciTech Connect

    Gabitto, Jorge; Tsouris, Costas

    2015-05-05

    Ion storage in porous electrodes is important in applications such as energy storage by supercapacitors, water purification by capacitive deionization, extraction of energy from a salinity difference and heavy ion purification. In this paper, a model is presented to simulate the charge process in homogeneous porous media comprising big pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Transport between the electrolyte solution and the charged wall is described using the Gouy–Chapman–Stern model. The effective transport parameters for isotropic porous media are calculated solving the corresponding closure problems. Finally, the source terms that appear in the average equations are calculated using numerical computations. An alternative way to deal with the source terms is proposed.

  19. Volume Averaging Study of the Capacitive Deionization Process in Homogeneous Porous Media

    DOE PAGES

    Gabitto, Jorge; Tsouris, Costas

    2015-05-05

    Ion storage in porous electrodes is important in applications such as energy storage by supercapacitors, water purification by capacitive deionization, extraction of energy from a salinity difference and heavy ion purification. In this paper, a model is presented to simulate the charge process in homogeneous porous media comprising big pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Transport between the electrolyte solution and the charged wall is described using the Gouy–Chapman–Stern model. The effective transport parameters for isotropic porous media are calculated solving the corresponding closure problems. Finally, the source terms that appear in the average equations are calculated using numerical computations. An alternative way to deal with the source terms is proposed.

  1. Creating Reflective Choreographers: The Eyes See/Mind Sees Process

    ERIC Educational Resources Information Center

    Kimbrell, Sinead

    2012-01-01

    Since 1999, when the author first started teaching creative process-based dance programs in public schools, she has struggled to find the time to teach children the basic concepts and tools of dance while teaching them to be deliberate with their choreographic choices. In this article, the author describes a process that helps students and…

  2. Homogeneous sonophotolysis of food processing industry wastewater: Study of synergistic effects, mineralization and toxicity removal.

    PubMed

    Durán, A; Monteagudo, J M; Sanmartín, I; Gómez, P

    2013-03-01

    The mineralization of industrial wastewater from the food industry by an emerging homogeneous sonophotolytic oxidation process was evaluated as an alternative to, or a rapid pretreatment step for, conventional anaerobic digestion, with the aim of considerably reducing the total treatment time. At the selected operating conditions ([H2O2] = 11,750 ppm, pH = 8, amplitude = 50%, pulse length (cycles) = 1), 60% of TOC is removed after 60 min and 98% after 180 min when treating an industrial effluent with 2114 ppm of total organic carbon (TOC). The process completely removed the toxicity generated during storage or caused by intermediate compounds. An important synergistic effect between sonolysis and photolysis (H2O2/UV) was observed; thus the sonophotolysis (ultrasound/H2O2/UV) technique significantly increases TOC removal compared with each individual process. Finally, a preliminary economic analysis confirms that sonophotolysis with H2O2 and pretreated water is profitable compared with the same process without ultrasound and without pretreatment.

  3. Parallel information processing channels created in the retina

    PubMed Central

    Schiller, Peter H.

    2010-01-01

    In the retina, several parallel channels originate that extract different attributes from the visual scene. This review describes how these channels arise and what their functions are. Following the introduction four sections deal with these channels. The first discusses the “ON” and “OFF” channels that have arisen for the purpose of rapidly processing images in the visual scene that become visible by virtue of either light increment or light decrement; the ON channel processes images that become visible by virtue of light increment and the OFF channel processes images that become visible by virtue of light decrement. The second section examines the midget and parasol channels. The midget channel processes fine detail, wavelength information, and stereoscopic depth cues; the parasol channel plays a central role in processing motion and flicker as well as motion parallax cues for depth perception. Both these channels have ON and OFF subdivisions. The third section describes the accessory optic system that receives input from the retinal ganglion cells of Dogiel; these cells play a central role, in concert with the vestibular system, in stabilizing images on the retina to prevent the blurring of images that would otherwise occur when an organism is in motion. The last section provides a brief overview of several additional channels that originate in the retina. PMID:20876118

  4. Evidence of linked biogeochemical and hydrological processes in homogeneous and layered vadose zone systems

    NASA Astrophysics Data System (ADS)

    McGuire, J. T.; Hansen, D. J.; Mohanty, B. P.

    2010-12-01

    Understanding chemical fate and transport in the vadose zone is critical to protect groundwater resources and preserve ecosystem health. However, prediction can be challenging due to the dynamic hydrologic and biogeochemical nature of the vadose zone. Additional controls on hydrobiogeochemical processes are added by subsurface structural heterogeneity. This study uses repacked soil column experiments to quantify linkages between microbial activity, geochemical cycling and hydrologic flow. Three “short” laboratory soil columns were constructed to evaluate the effects of soil layering: a homogenized medium-grained sand, homogenized organic-rich loam, and a sand-over-loam layered column. In addition, two “long” columns were constructed using either gamma-irradiated (sterilized) or untreated sediments to evaluate the effects of both soil layers and the presence of microorganisms. The long columns were packed identically; a medium-grained sand matrix with two vertically separated and horizontally offset lenses of organic-rich loam. In all 5 columns, downward and upward infiltration of water was evaluated to simulate rainfall and rising water table events respectively. In-situ colocated probes were used to measure soil water content, matric potential, Eh, major anions, ammonium, Fe2+, and total sulfide. Enhanced biogeochemical cycling was observed in the short layered column versus the short, homogeneous columns, and enumerations of iron and sulfate reducing bacteria were 1-2 orders of magnitude greater. In the long columns, microbial activity caused mineral bands and produced insoluble gases that impeded water flow through the pores of the sediment. Capillary barriers, formed around the lenses due to soil textural differences, retarded water flow rates through the lenses. This allowed reducing conditions to develop, evidenced by the production of Fe2+ and S2-. At the fringes of the lenses, Fe2+ oxidized to form Fe(III)-oxide bands that further retarded water

  5. Spatial Division Multiplexed Microwave Signal processing by selective grating inscription in homogeneous multicore fibers.

    PubMed

    Gasulla, Ivana; Barrera, David; Hervás, Javier; Sales, Salvador

    2017-01-30

    The use of Spatial Division Multiplexing for Microwave Photonics signal processing is proposed and experimentally demonstrated, for the first time to our knowledge, based on the selective inscription of Bragg gratings in homogeneous multicore fibers. The fabricated devices behave as sampled true time delay elements for radiofrequency signals offering a wide range of operation possibilities within the same optical fiber. The key to processing flexibility comes from the implementation of novel multi-cavity configurations by inscribing a variety of different fiber Bragg gratings along the different cores of a 7-core fiber. This entails the development of the first fabrication method to inscribe high-quality gratings characterized by arbitrary frequency spectra and located in arbitrary longitudinal positions along the individual cores of a multicore fiber. Our work opens the way towards the development of unique compact fiber-based solutions that enable the implementation of a wide variety of 2D (spatial and wavelength diversity) signal processing functionalities that will be key in future fiber-wireless communications scenarios. We envisage that Microwave Photonics systems and networks will benefit from this technology in terms of compactness, operation versatility and performance stability.
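
    The sampled true time delay behaviour comes from simple geometry: a grating inscribed deeper along a core reflects later. The sketch below turns assumed grating depths in each core into round-trip delays; the positions and group index are illustrative, not the fabricated device's values.

    ```python
    # Sketch: round-trip delays tau = 2*n_g*z/c for gratings at depth z
    # in each core of a 7-core fiber (illustrative numbers).
    import numpy as np

    n_g = 1.468                         # effective group index
    c = 299_792_458.0                   # m/s
    z = np.array([0.00, 0.02, 0.04, 0.06, 0.08, 0.10, 0.12])  # m, per core

    tau = 2 * n_g * z / c
    print((tau * 1e12).round(1), "ps")                  # basic RF tap delays
    print("adjacent-core step:", round((tau[1] - tau[0]) * 1e12, 1), "ps")
    ```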

  6. Spatial Division Multiplexed Microwave Signal processing by selective grating inscription in homogeneous multicore fibers

    PubMed Central

    Gasulla, Ivana; Barrera, David; Hervás, Javier; Sales, Salvador

    2017-01-01

    The use of Spatial Division Multiplexing for Microwave Photonics signal processing is proposed and experimentally demonstrated, for the first time to our knowledge, based on the selective inscription of Bragg gratings in homogeneous multicore fibers. The fabricated devices behave as sampled true time delay elements for radiofrequency signals offering a wide range of operation possibilities within the same optical fiber. The key to processing flexibility comes from the implementation of novel multi-cavity configurations by inscribing a variety of different fiber Bragg gratings along the different cores of a 7-core fiber. This entails the development of the first fabrication method to inscribe high-quality gratings characterized by arbitrary frequency spectra and located in arbitrary longitudinal positions along the individual cores of a multicore fiber. Our work opens the way towards the development of unique compact fiber-based solutions that enable the implementation of a wide variety of 2D (spatial and wavelength diversity) signal processing functionalities that will be key in future fiber-wireless communications scenarios. We envisage that Microwave Photonics systems and networks will benefit from this technology in terms of compactness, operation versatility and performance stability. PMID:28134304

  7. Spatial Division Multiplexed Microwave Signal processing by selective grating inscription in homogeneous multicore fibers

    NASA Astrophysics Data System (ADS)

    Gasulla, Ivana; Barrera, David; Hervás, Javier; Sales, Salvador

    2017-01-01

    The use of Spatial Division Multiplexing for Microwave Photonics signal processing is proposed and experimentally demonstrated, for the first time to our knowledge, based on the selective inscription of Bragg gratings in homogeneous multicore fibers. The fabricated devices behave as sampled true time delay elements for radiofrequency signals offering a wide range of operation possibilities within the same optical fiber. The key to processing flexibility comes from the implementation of novel multi-cavity configurations by inscribing a variety of different fiber Bragg gratings along the different cores of a 7-core fiber. This entails the development of the first fabrication method to inscribe high-quality gratings characterized by arbitrary frequency spectra and located in arbitrary longitudinal positions along the individual cores of a multicore fiber. Our work opens the way towards the development of unique compact fiber-based solutions that enable the implementation of a wide variety of 2D (spatial and wavelength diversity) signal processing functionalities that will be key in future fiber-wireless communications scenarios. We envisage that Microwave Photonics systems and networks will benefit from this technology in terms of compactness, operation versatility and performance stability.

  8. A safeguards verification technique for solution homogeneity and volume measurements in process tanks

    SciTech Connect

    Suda, S.; Franssen, F.

    1987-01-01

    A safeguards verification technique is being developed for determining whether process-liquid homogeneity has been achieved in process tanks and for authenticating volume-measurement algorithms involving temperature corrections. It is proposed that, in new designs for bulk-handling plants employing automated process lines, bubbler probes and thermocouples be installed at several heights in key accountability tanks. High-accuracy measurements of density using an electromanometer can now be made which match or even exceed analytical-laboratory accuracies. Together with regional determination of tank temperatures, these measurements provide density, liquid-column weight and temperature gradients over the fill range of the tank that can be used to ascertain when the tank solution has reached equilibrium. Temperature-correction algorithms can be authenticated by comparing the volumes obtained from the several bubbler-probe liquid-height measurements, each based on different amounts of liquid above and below the probe. The verification technique is based on the automated electromanometer system developed by Brookhaven National Laboratory (BNL). The IAEA has recently approved the purchase of a stainless-steel tank equipped with multiple bubbler and thermocouple probes for installation in its Bulk Calibration Laboratory at IAEA Headquarters, Vienna. The verification technique is scheduled for preliminary trials in late 1987.
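
    The underlying hydrostatics is compact: two bubbler probes at known heights give the density from the pressure difference, and the density then converts a probe pressure into a liquid level. All numbers below are invented.

    ```python
    # Sketch: density and level from a bubbler-probe pair, rho = dP/(g*dz).
    g = 9.80665                     # m/s^2
    z1, z2 = 0.10, 1.10             # probe heights above tank bottom, m
    p1, p2 = 18_500.0, 6_300.0      # gauge pressures at the probes, Pa

    rho = (p1 - p2) / (g * (z2 - z1))      # solution density, kg/m^3
    level = z1 + p1 / (rho * g)            # liquid height above bottom, m
    print(f"density {rho:.1f} kg/m^3, level {level:.3f} m")
    ```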

  9. Decolorization of Reactive Red 2 by advanced oxidation processes: Comparative studies of homogeneous and heterogeneous systems.

    PubMed

    Wu, Chung-Hsin; Chang, Chung-Liang

    2006-02-06

    This study investigated the decolorization of Reactive Red 2 in water using advanced oxidation processes (AOPs): UV/TiO2, UV/SnO2, UV/TiO2+SnO2, O3, O3+MnO2, UV/O3 and UV/O3+TiO2+SnO2. Kinetic analyses indicated that the decolorization of Reactive Red 2 could be approximated as pseudo-first-order kinetics in both homogeneous and heterogeneous systems. The decolorization rate at pH 7 exceeded that at pH 4 and pH 10 in the UV/TiO2 and UV/TiO2+SnO2 systems, whereas the rate constants in the systems including O3 followed the order pH 10 > pH 7 > pH 4. The UV/TiO2+SnO2 and O3+MnO2 systems exhibited greater decolorization rates than the UV/TiO2 and O3 systems, respectively, and the degree of rate enhancement depended on pH. Variation of the dye concentration influenced the decolorization efficiency of the heterogeneous systems more significantly than that of the homogeneous systems. Experimental results verified that decolorization and desulfuration occurred at nearly the same rate. Moreover, the decolorization rate constants at pH 7 in the various systems followed the order UV/O3 ≥ O3+MnO2 ≥ UV/O3+TiO2+SnO2 > O3 > UV/TiO2+SnO2 ≥ UV/TiO2 > UV/SnO2.
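
    The pseudo-first-order analysis amounts to a straight-line fit of ln(C0/C) against time; the concentrations below are invented for illustration.

    ```python
    # Sketch: pseudo-first-order rate constant from ln(C0/C) = k*t.
    import numpy as np

    t = np.array([0, 5, 10, 15, 20, 30], dtype=float)      # min
    C = np.array([100, 61, 37, 22, 13, 5], dtype=float)    # dye conc., mg/L
    k, _ = np.polyfit(t, np.log(C[0] / C), 1)              # slope = k, 1/min
    print(f"k = {k:.3f} 1/min, half-life = {np.log(2) / k:.1f} min")
    ```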

  10. Creating the Virtual Work: Readers' Processes in Understanding Literary Texts.

    ERIC Educational Resources Information Center

    Earthman, Elise Ann

    A study examined the ways in which college readers interact with literary texts. The method of interviews and think-along protocols, in which subjects read a text aloud while simultaneously verbalizing their thoughts, was used to compare the reading processes of eight college freshmen to those of eight master's students in literature who…

  11. Comprehension Process Instruction: Creating Reading Success in Grades K-3

    ERIC Educational Resources Information Center

    Block, Cathy Collins; Rodgers, Lori L.; Johnson, Rebecca B.

    2004-01-01

    Filling a crucial gap in the literature, this immensely practical volume presents innovative tools for helping K-3 students significantly increase their ability to make meaning from texts. The focus is on teaching the comprehension processes employed by expert readers, using a carefully sequenced combination of whole-class activities, specially…

  12. Creating a national citizen engagement process for energy policy

    PubMed Central

    Pidgeon, Nick; Demski, Christina; Butler, Catherine; Parkhill, Karen; Spence, Alexa

    2014-01-01

    This paper examines some of the science communication challenges involved when designing and conducting public deliberation processes on issues of national importance. We take as our illustrative case study a recent research project investigating public values and attitudes toward future energy system change for the United Kingdom. National-level issues such as this are often particularly difficult to engage the public with because of their inherent complexity, derived from multiple interconnected elements and policy frames, extended scales of analysis, and different manifestations of uncertainty. With reference to the energy system project, we discuss ways of meeting a series of science communication challenges arising when engaging the public with national topics, including the need to articulate systems thinking and problem scale, to provide balanced information and policy framings in ways that open up spaces for reflection and deliberation, and the need for varied methods of facilitation and data synthesis that permit access to participants’ broader values. Although resource intensive, national-level deliberation is possible and can produce useful insights both for participants and for science policy. PMID:25225393

  13. Creating a national citizen engagement process for energy policy.

    PubMed

    Pidgeon, Nick; Demski, Christina; Butler, Catherine; Parkhill, Karen; Spence, Alexa

    2014-09-16

    This paper examines some of the science communication challenges involved when designing and conducting public deliberation processes on issues of national importance. We take as our illustrative case study a recent research project investigating public values and attitudes toward future energy system change for the United Kingdom. National-level issues such as this are often particularly difficult to engage the public with because of their inherent complexity, derived from multiple interconnected elements and policy frames, extended scales of analysis, and different manifestations of uncertainty. With reference to the energy system project, we discuss ways of meeting a series of science communication challenges arising when engaging the public with national topics, including the need to articulate systems thinking and problem scale, to provide balanced information and policy framings in ways that open up spaces for reflection and deliberation, and the need for varied methods of facilitation and data synthesis that permit access to participants' broader values. Although resource intensive, national-level deliberation is possible and can produce useful insights both for participants and for science policy.

  14. Effective inactivation of Saccharomyces cerevisiae in minimally processed Makgeolli using low-pressure homogenization-based pasteurization.

    PubMed

    Bak, Jin Seop

    2015-01-01

    In order to address the limitations associated with the inefficient pasteurization platform used to make Makgeolli, such as the presence of turbid colloidal dispersions in suspension, commercially available Makgeolli was minimally processed using a low-pressure homogenization-based pasteurization (LHBP) process. This continuous process demonstrates that promptly reducing the exposure time of large molecules or insoluble particles to excessive heat can dramatically improve internal quality and decrease irreversible damage. Specifically, optimal homogenization improved physical parameters such as colloidal stability (65.0% of maximum, with particles below 25 μm) after two passes at 25.0 MPa, whereas biochemical parameters such as microbial population, acidity, and fermentable sugar content rarely affected Makgeolli quality. Remarkably, there was a 4.5-log reduction in Saccharomyces cerevisiae target cells at 53.5°C for 70 s in optimally homogenized Makgeolli, a higher reduction than the 37.7% measured in traditionally pasteurized Makgeolli. In contrast to the analytical similarity among homogenized Makgeollis, our objective quality evaluation demonstrated significant differences between pasteurized (or unpasteurized) Makgeolli and LHBP-treated Makgeolli. Keywords: low-pressure homogenization-based pasteurization; Makgeolli; minimal processing-preservation; Saccharomyces cerevisiae; suspension stability.

  15. Kappa Distribution in a Homogeneous Medium: Adiabatic Limit of a Super-diffusive Process?

    NASA Astrophysics Data System (ADS)

    Roth, I.

    2015-12-01

    Classical statistical theory predicts that an ergodic, weakly interacting system, such as charged particles in the presence of electromagnetic fields performing Brownian motions (characterized by small-range deviations in phase space and short-term microscopic memory), converges to Gibbs-Boltzmann statistics. Observation of distributions with kappa-power-law tails in homogeneous systems contradicts this prediction and necessitates a renewed analysis of the basic axioms of the diffusion process: the characteristics of the transition probability density function (pdf) for a single interaction, allowing for a possibly non-Markovian process and non-local interaction. The non-local, Levy-walk deviation is related to the non-extensive statistical framework. Particles bouncing along (solar) magnetic field lines with evolving pitch angles, phases and velocities, as they interact resonantly with waves, undergo energy changes at undetermined time intervals, satisfying these postulates. The dynamic evolution of a general continuous-time random walk is determined by the pdfs of jumps and waiting times, resulting in a fractional Fokker-Planck equation with non-integer derivatives whose solution is given by a Fox H-function. The resulting procedure involves fractional calculus, well established though not frequently used in physics, while the local, Markovian limit recasts the evolution into the standard Fokker-Planck equation. Solving the fractional Fokker-Planck equation with the help of the Mellin transform and evaluating the residues at the poles of its Gamma functions yields a slowly converging sum with power laws. It is suggested that these tails form the kappa function. Gradual vs. impulsive solar electron distributions serve as prototypes of this description.
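
    For readers unfamiliar with the equation the abstract invokes, one standard form of the space-time fractional Fokker-Planck equation is reproduced below; the notation is assumed here, not taken from the paper.

```latex
% Riesz space derivative of order 0 < \alpha <= 2, Caputo time derivative of
% order 0 < \beta <= 1; \alpha = 2, \beta = 1 recovers ordinary diffusion.
\frac{\partial^{\beta} P(v,t)}{\partial t^{\beta}}
  = K_{\alpha,\beta}\,\frac{\partial^{\alpha} P(v,t)}{\partial |v|^{\alpha}},
\qquad 0 < \alpha \le 2,\quad 0 < \beta \le 1 .
```

    For α < 2 the solutions, expressible through Fox H-functions, develop power-law tails P(v) ∝ |v|^-(1+α), which is the kind of slowly converging power-law sum the author proposes to identify with the kappa distribution.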

  16. People Create Health: Effective Health Promotion is a Creative Process

    PubMed Central

    Cloninger, C. Robert; Cloninger, Kevin M.

    2015-01-01

    Effective health promotion involves the creative cultivation of physical, mental, social, and spiritual well-being. Efforts at health promotion produce weak and inconsistent benefits when they do not engage people in expressing their own goals and values. Likewise, health promotion has been ineffective when it relies only on instruction about facts regarding a healthy lifestyle, or focuses on reduction of disease rather than the cultivation of well-being. Meta-analysis of longitudinal studies and experimental interventions shows that improvements in subjective well-being lead to short-term and long-term reductions in medical morbidity and mortality, as well as to healthier functioning and longevity. However, these effects are inconsistent and weak (correlations of about 0.15). The most consistent and strong predictor of both subjective well-being and objective health status in longitudinal studies is a creative personality profile characterized by being highly self-directed, cooperative, and self-transcendent. There is a synergy among these personality traits that enhances all aspects of the health and happiness of people. Experimental interventions to cultivate this natural creative potential of people are just beginning, but available exploratory research has shown that creativity can be enhanced and that the changes are associated with widespread and profound benefits, including greater physical, mental, social, and spiritual well-being. In addition to benefits mediated by choice of diet, physical activity, and health care utilization, the effect of a creative personality on health may be partly mediated by effects on the regulation of heart rate variability. Creativity promotes autonomic balance with parasympathetic dominance, leading to a calm alert state that promotes an awakening of plasticities and intelligences that stress inhibits. We suggest that health, happiness, and meaning can be cultivated by a complex adaptive process that enhances healthy functioning

  17. Process spectroscopy in microemulsions—setup and multi-spectral approach for reaction monitoring of a homogeneous hydroformylation process

    NASA Astrophysics Data System (ADS)

    Meyer, K.; Ruiken, J.-P.; Illner, M.; Paul, A.; Müller, D.; Esche, E.; Wozny, G.; Maiwald, M.

    2017-03-01

    Reaction monitoring in disperse systems, such as emulsions, is of significant technical importance in various disciplines like biotechnological engineering, chemical industry, food science, and a growing number of other technical fields. These systems pose several challenges for process analytics, such as heterogeneity of mixtures, changes in optical behavior, and low optical activity. In this context, online nuclear magnetic resonance (NMR) spectroscopy is a powerful technique for process monitoring in complex reaction mixtures due to its direct comparison abilities, while at the same time being non-invasive and independent of the optical properties of the sample. In this study the applicability of online spectroscopic methods to the homogeneously catalyzed hydroformylation of 1-dodecene to tridecanal is investigated, which is operated at mini-plant scale at Technische Universität Berlin. The design of a laboratory setup for process-like calibration experiments is presented, including a 500 MHz online NMR spectrometer, a benchtop NMR device with 43 MHz proton frequency, two Raman probes, and a flow cell assembly for an ultraviolet and visible light (UV/VIS) spectrometer. Results of high-resolution online NMR spectroscopy are shown, and technical as well as process-specific problems observed during the measurements are discussed.

  18. We're Born To Learn: Using the Brain's Natural Learning Process To Create Today's Curriculum.

    ERIC Educational Resources Information Center

    Smilkstein, Rita

    This book provides research-based, concrete strategies for creating a student-centered curriculum in which every student can learn. It breaks down the Natural Human Learning Process (NHLP) into six stages, providing guidelines and models showing educators how to create learning experiences at each stage of the process for individuals, small…

  19. A monolith purification process for virus-like particles from yeast homogenate.

    PubMed

    Burden, Claire S; Jin, Jing; Podgornik, Aleš; Bracewell, Daniel G

    2012-01-01

    Monoliths are an alternative stationary phase format to conventional particle-based media for large biomolecules. Conventional resins suffer from limited capacities and flow rates when used for viruses, virus-like particles (VLPs) and other nanoplex materials. The monolith structure provides a more open pore structure that improves accessibility for these materials and gives better mass transport through convective flow and reduced pressure drops. To examine the performance of this format for bioprocessing, we selected the challenging capture of a VLP from clarified yeast homogenate. Using a recombinant Saccharomyces cerevisiae host, it was found that hydrophobic-interaction-based separation using a hydroxyl-derivatised monolith had the best performance. The monolith was then compared with a known beaded-resin method, and its dynamic binding capacity was shown to be three-fold higher, with an equivalent 90% recovery of the VLP. To understand the impact of the crude feed material, confocal microscopy was used to visualize lipid contaminants deriving from the homogenised yeast. The lipid formed a layer on top of the column, even after regeneration of the column with isopropanol, resulting in increasing pressure drops with the number of operational cycles. Removing the lipid upstream of the column significantly reduces the amount and rate of this fouling. Using Amberlite XAD-4 beads, around 70% of the lipid was removed, with a VLP loss of around 20%. Applying the reduced-lipid feed rather than an untreated feed further increased the dynamic binding capacity of the monolith from 0.11 to 0.25 mg per mL of column.

  20. Modeling of HIV/AIDS dynamic evolution using non-homogeneous semi-markov process.

    PubMed

    Dessie, Zelalem Getahun

    2014-01-01

    The purpose of this study is to model the progression of HIV/AIDS disease in individual patients under ART follow-up using non-homogeneous semi-Markov processes. The model focuses on the patient's age as a relevant factor for forecasting transitions among the different levels of seriousness of the disease. A sample of 1456 patients was taken from hospital records at Amhara Referral Hospitals, Amhara Region, Ethiopia, for patients under ART follow-up from June 2006 to August 2013. The states of disease progression adopted in the model were defined based on the following CD4 cell counts: >500 cells/mm(3) (SI); 349 to 500 cells/mm(3) (SII); 199 to 350 cells/mm(3) (SIII); ≤200 cells/mm(3) (SIV); and death (D). The first four states are referred to as living states. The probability that an HIV/AIDS patient in any one of the living states will transition to the death state increases with age, irrespective of the current state of the patient. More generally, the probability of dying decreases with increasing CD4 counts over time. For an HIV/AIDS patient in a specific state of the disease, the probability of remaining in the same state decreases with increasing age. Within the living states, the results show that the probability of being in a better state is non-zero, but less than the probability of being in a worse state, for all ages. A reliability analysis also revealed that the survival probabilities all decline over time. Computed conditional probabilities show differential subject response depending on the age of the patient. The dynamic nature of AIDS progression is confirmed, with the particular finding that patients are more likely to move to a worse state than a better one unless interventions are made. Our findings suggest that ongoing ART treatment services could be provided more effectively with careful consideration of the recent disease status of patients.
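
    To make the transition-probability language concrete, the sketch below runs a deliberately simplified discrete-time Markov approximation over the five states; the matrix entries are hypothetical, not the fitted values from the study, and the actual model is a non-homogeneous semi-Markov process whose probabilities vary with patient age.

```python
# Toy discrete-time Markov chain over the CD4-defined states; entries are
# hypothetical. The study's model is non-homogeneous semi-Markov (age-varying).
import numpy as np

states = ["SI", "SII", "SIII", "SIV", "D"]
P = np.array([
    [0.80, 0.15, 0.03, 0.01, 0.01],   # from SI
    [0.10, 0.70, 0.15, 0.03, 0.02],   # from SII
    [0.02, 0.10, 0.70, 0.13, 0.05],   # from SIII
    [0.01, 0.03, 0.10, 0.71, 0.15],   # from SIV
    [0.00, 0.00, 0.00, 0.00, 1.00],   # death is absorbing
])

# k-step transition probabilities are matrix powers; e.g. the probability
# that a patient starting in SII has died within 10 cycles:
P10 = np.linalg.matrix_power(P, 10)
print(f"P(D within 10 steps | SII) = {P10[states.index('SII'), -1]:.3f}")
```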

  1. Impact of the homogenization process on the structure and antioxidant properties of chitosan-lignin composite films.

    PubMed

    Crouvisier-Urion, Kevin; Lagorce-Tachon, Aurélie; Lauquin, Camille; Winckler, Pascale; Tongdeesoontorn, Wirongrong; Domenek, Sandra; Debeaufort, Frédéric; Karbowiak, Thomas

    2017-12-01

    This work investigated the impact of two homogenization treatments, High Shear (HS) and High Pressure (HP), on the structure and antioxidant activity of chitosan-lignin bio-composite films. Laser light scattering analysis revealed that smaller lignin particles were obtained after HP processing, around 0.6 μm, compared to HS treatment, between 2.5 and 5 μm. Moreover, these particles were more homogeneously distributed in the chitosan film matrix after the HP process, while some aggregates remained after HS treatment, as highlighted by two-photon microscopy. The surface hydrophobicity of the composite films, as measured by water contact angle, increased after both homogenization treatments. Finally, the antioxidant activity of the composite films was determined using the DPPH· assay. No significant difference in radical scavenging activity was observed after either HS or HP processing. However, migration of lignin residues from the film to the extraction medium was noticed, particularly for the HP process.

  2. Efficacy of low-temperature high hydrostatic pressure processing in inactivating Vibrio parahaemolyticus in culture suspension and oyster homogenate.

    PubMed

    Phuvasate, Sureerat; Su, Yi-Cheng

    2015-03-02

    Culture suspensions of five clinical and five environmental Vibrio parahaemolyticus strains in 2% NaCl solution were subjected to high pressure processing (HPP) under various conditions (200-300 MPa for 5 and 10 min at 1.5-20°C) to study differences in pressure resistance among the strains. The most pressure-resistant and pressure-sensitive strains were selected to investigate the effects of low temperatures (15, 5 and 1.5°C) on HPP (200 or 250 MPa for 5 min) inactivation of V. parahaemolyticus in sterile oyster homogenates. Inactivation of V. parahaemolyticus cells in culture suspensions and oyster homogenates was greatly enhanced by lowering the processing temperature from 15 to 5 or 1.5°C. Treating oyster homogenates at 250 MPa for 5 min at 5°C decreased the populations of V. parahaemolyticus by 6.2 log CFU/g for strains 10290 and 100311Y11 and by >7.4 log CFU/g for strain 10292. Decreasing the processing temperature of the same treatment to 1.5°C reduced all the V. parahaemolyticus strains inoculated into oyster homogenates to non-detectable (<10 CFU/g) levels. Factors including pressure level, processing temperature and time all need to be considered in developing effective HPP for eliminating pathogens from foods. Further studies are needed to validate the efficacy of HPP (250 MPa for 5 min at 1.5°C) in inactivating V. parahaemolyticus cells in whole oysters.

  3. Effects of non-homogeneous flow on ADCP data processing in a hydroturbine forebay

    SciTech Connect

    Harding, S. F.; Richmond, M. C.; Romero-Gomez, P.; Serkowski, J. A.

    2016-01-02

    Accurate modeling of the velocity field in the forebay of a hydroelectric power station is important for both power generation and fish passage, and the field can be increasingly well represented by computational fluid dynamics (CFD) simulations. Acoustic Doppler Current Profilers (ADCPs) are investigated herein as a method of validating the numerical flow solutions, particularly in observed and calculated regions of non-homogeneous flow velocity. By using a numerical model of an ADCP operating in a velocity field calculated using CFD, the errors due to the spatial variation of the flow velocity are quantified. The numerical model of the ADCP is referred to herein as a Virtual ADCP (VADCP).

  4. Hot Deformation Behaviors and Processing Maps of 2024 Aluminum Alloy in As-cast and Homogenized States

    NASA Astrophysics Data System (ADS)

    Chen, Liang; Zhao, Guoqun; Gong, Jie; Chen, Xiaoxue; Chen, Mengmeng

    2015-12-01

    Isothermal hot compression tests of as-cast and homogenized 2024 aluminum alloy were carried out over a wide range of deformation temperatures (623-773 K) and strain rates (0.001-10 s-1). Constitutive equations for both initial states were established based on the Arrhenius model, and processing maps were constructed based on the dynamic material model. The results show that the flow stress is markedly affected by both strain rate and deformation temperature, and that the flow stress in the homogenized state is always higher than that in the as-cast state. Calculation of the correlation coefficient (R) and the average absolute relative error of the established constitutive equations indicates that the Arrhenius model provides only a rough estimate of the flow stress. A much more precise flow stress was obtained by introducing strain compensation into the Arrhenius model, since the effects of strain on the material constants were then properly accounted for. Furthermore, the processing maps suggest the following windows for hot forming: 710-773 K at 0.001-1 s-1 for the as-cast state, and 680-773 K at 0.003-0.22 s-1 for the homogenized state.
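
    For reference, the hyperbolic-sine Arrhenius constitutive model the abstract refers to has the standard form below, linked to the Zener-Hollomon parameter Z; the constants actually fitted for the 2024 alloy are not reproduced here.

```latex
% Standard hot-deformation constitutive model (constants A, alpha, n, Q are
% material-dependent; strain compensation makes them polynomials in strain).
\dot{\varepsilon} = A\,[\sinh(\alpha\sigma)]^{n}\exp\!\left(-\frac{Q}{RT}\right),
\qquad
Z = \dot{\varepsilon}\,\exp\!\left(\frac{Q}{RT}\right) = A\,[\sinh(\alpha\sigma)]^{n}.
```

    Strain compensation, as used by the authors, replaces A, α, n and Q with polynomial functions of strain, which is what brings the predicted flow stress into much closer agreement with the measurements.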

  5. Effects of non-homogeneous flow on ADCP data processing in a hydroturbine forebay

    SciTech Connect

    Harding, S. F.; Richmond, M. C.; Romero-Gomez, P.; Serkowski, J. A.

    2016-12-01

    Observations of the flow conditions in the forebay of a hydroelectric power station indicate significant regions of non-homogeneous velocities near the intakes and shoreline. The effect of these non-homogeneous regions on the velocity measurement of an acoustic Doppler current profiler (ADCP) is investigated. By using a numerical model of an ADCP operating in a velocity field calculated using computational fluid dynamics (CFD), the errors due to the spatial variation of the flow velocity are identified. The numerical model of the ADCP is referred to herein as a Virtual ADCP (VADCP). Two scenarios are modeled in the numerical analyses presented. First, the measurement error of the VADCP is calculated for a single instrument adjacent to the short converging intake of the powerhouse. Second, the flow discharge through the forebay is estimated from a transect of VADCP instruments at different distances from the powerhouse. The influence of instrument location and orientation is investigated for both cases. A velocity error of up to 94% of the reference velocity is calculated for a VADCP modeled adjacent to an operating intake. Qualitative agreement between the calculated VADCP velocities and the reference velocities is observed at an offset of one intake height upstream of the powerhouse.
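
    The core of the VADCP idea, sampling a known velocity field along diverging beams and then inverting as a real instrument would, can be sketched briefly. In the hypothetical example below, a four-beam Janus geometry at 20 degrees samples a horizontally sheared flow; because each beam sees a different velocity, the homogeneous-flow inversion misassigns part of the horizontal gradient to the vertical component. The field, range and geometry are illustrative, not those of the forebay study.

```python
# Why non-homogeneous flow biases an ADCP: each beam measures one radial
# velocity at a different point, but the inversion assumes all beams saw the
# same vector. Geometry is a generic 4-beam Janus; the flow field is made up.
import numpy as np

theta = np.radians(20.0)
beams = np.array([                       # beam unit vectors (x, y, z)
    [ np.sin(theta), 0.0, -np.cos(theta)],
    [-np.sin(theta), 0.0, -np.cos(theta)],
    [0.0,  np.sin(theta), -np.cos(theta)],
    [0.0, -np.sin(theta), -np.cos(theta)],
])

def flow(x, y, z):
    """Non-homogeneous field: u decays with x, mimicking an intake gradient."""
    return np.array([1.0 - 0.5 * x, 0.0, 0.0])

r = 10.0                                 # range to the sampled cell (m)
b = np.array([flow(*(r * e)) @ e for e in beams])   # radial velocity per beam

# Homogeneous-flow inversion: least-squares solve of beams @ u = b.
u_est, *_ = np.linalg.lstsq(beams, b, rcond=None)
print("estimated u:", u_est.round(3))      # horizontal shear aliases into a
print("true u at centerline: [1. 0. 0.]")  # spurious vertical velocity
```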

  6. Creating a Context for the Learning of Science Process Skills through Picture Books

    ERIC Educational Resources Information Center

    Monhardt, Leigh; Monhardt, Rebecca

    2006-01-01

    This article provides suggestions on ways in which science process skills can be taught in a meaningful context through children's literature. It is hoped that the following examples of how process skills can be taught using children's books will provide a starting point from which primary teachers can create additional examples. Many…

  7. 36 CFR 1237.26 - What materials and processes must agencies use to create audiovisual records?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT AUDIOVISUAL, CARTOGRAPHIC, AND RELATED RECORDS MANAGEMENT § 1237.26 What materials and processes must agencies use to create audiovisual..., see § 1237.3). (1) Ensure that residual sodium thiosulfate (hypo) on newly processed black-and-white...

  8. Effects of non-homogeneous flow on ADCP data processing in a hydroturbine forebay

    DOE PAGES

    Harding, S. F.; Richmond, M. C.; Romero-Gomez, P.; ...

    2016-01-02

    Accurate modeling of the velocity field in the forebay of a hydroelectric power station is important for both power generation and fish passage, and the field can be increasingly well represented by computational fluid dynamics (CFD) simulations. Acoustic Doppler Current Profilers (ADCPs) are investigated herein as a method of validating the numerical flow solutions, particularly in observed and calculated regions of non-homogeneous flow velocity. By using a numerical model of an ADCP operating in a velocity field calculated using CFD, the errors due to the spatial variation of the flow velocity are quantified. The numerical model of the ADCP is referred to herein as a Virtual ADCP (VADCP).

  9. Microstructure, hardness homogeneity, and tensile properties of 1050 aluminum processed by constrained groove pressing

    NASA Astrophysics Data System (ADS)

    Hajizadeh, K.; Ejtemaei, S.; Eghbali, B.

    2017-08-01

    1050 commercial purity aluminum was subjected to severe plastic deformation through constrained groove pressing (CGP) at room temperature. Transmission electron microscope observations showed that after four CGP passes the microstructure consists mostly of elongated grains/subgrains with average width/length of 506/1440 nm. This ultrafine-grained microstructure leads to a significant increase in the yield strength of the starting material, from 93 to 182 MPa. At the same time, after four CGP passes the material still displays a considerable ductility of 19%. Microhardness profiles reveal that the average microhardness increases monotonically with increasing strain during CGP. However, the degree of deformation homogeneity in the samples remains almost unchanged at higher numbers of passes, consistent with the non-uniform distribution of imposed plastic strain predicted by finite-element analysis.

  10. Optimization of homogenization-evaporation process for lycopene nanoemulsion production and its beverage applications.

    PubMed

    Kim, Sang Oh; Ha, Thi Van Anh; Choi, Young Jin; Ko, Sanghoon

    2014-08-01

    Lycopene is a natural antioxidant with several health benefits. Undesirable oxidation of lycopene compromises these benefits and also affects the sensory quality of food products containing it. The health benefits associated with lycopene in food preparations can be protected by preventing its degradation through incorporation into the oil phase of an oil-in-water nanoemulsion. In this study, lycopene nanoemulsions were prepared from a low-concentration lycopene extract using an emulsification-evaporation technique. The effects of the concentrations of lycopene extract (0.015 to 0.085 mg/mL) and emulsifier (0.3 to 0.7 mg/mL), and of the number of homogenization cycles (2 to 4), on droplet size, emulsification efficiency (EE), and nanoemulsion stability were investigated and optimized by statistical analysis using a Box-Behnken design. Regression analysis was used to fit second-order polynomial models relating the independent and dependent variables, with multiple regression coefficients (R2) of 0.924, 0.933, and 0.872 for droplet size, EE, and nanoemulsion stability, respectively. Analysis of variance showed that the lycopene extract concentration had the most significant effect on all the response variables. Response surface methodology predicted that a formulation containing 0.085 mg/mL of lycopene extract and 0.7 mg/mL of emulsifier, subjected to three homogenization cycles, is optimal for achieving the smallest droplet size, greatest emulsion stability, and acceptable EE. The observed responses were in agreement with the predicted values for the optimized formulation. This study provides important information for the statistical design of lycopene nanoemulsion preparation.
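
    For reference, the Box-Behnken analysis fits each response to a second-order polynomial in the three factors; the generic form is shown below, with the fitted coefficients themselves not reproduced here.

```latex
% Second-order response surface model for the three factors
% X1 = lycopene extract conc., X2 = emulsifier conc., X3 = homogenization
% cycles; Y is droplet size, EE, or stability. Coefficients are fitted.
Y = \beta_0 + \sum_{i=1}^{3}\beta_i X_i
    + \sum_{i=1}^{3}\beta_{ii} X_i^{2}
    + \sum_{i<j}\beta_{ij} X_i X_j + \varepsilon .
```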

  11. Regional Homogeneity

    PubMed Central

    Jiang, Lili; Zuo, Xi-Nian

    2015-01-01

    Much effort has been made to understand the organizational principles of human brain function using functional magnetic resonance imaging (fMRI) methods, among which resting-state fMRI (rfMRI) is an increasingly recognized technique for measuring the intrinsic dynamics of the human brain. Functional connectivity (FC) with rfMRI is the most widely used method to describe remote or long-distance relationships in studies of cerebral cortex parcellation, interindividual variability, and brain disorders. In contrast, local or short-distance functional interactions, especially at a scale of millimeters, have rarely been investigated or systematically reviewed like remote FC, although some local FC algorithms have been developed and applied to the discovery of brain-based changes under neuropsychiatric conditions. To fill this gap between remote and local FC studies, this review will (1) briefly survey the history of studies on organizational principles of human brain function; (2) propose local functional homogeneity as a network centrality to characterize multimodal local features of the brain connectome; (3) render a neurobiological perspective on local functional homogeneity by linking its temporal, spatial, and individual variability to information processing, anatomical morphology, and brain development; and (4) discuss its role in performing connectome-wide association studies and identify relevant challenges, and recommend its use in future brain connectomics studies. PMID:26170004
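
    The review does not commit to a single local metric, but the most widely used local functional homogeneity measure, regional homogeneity (ReHo), is Kendall's coefficient of concordance (KCC) computed over the time series of a voxel and its neighbors. The sketch below shows that computation on synthetic data; the 27-voxel neighborhood and noise level are arbitrary illustrative choices.

```python
# ReHo as Kendall's W over a voxel neighborhood's time series (synthetic data).
import numpy as np
from scipy.stats import rankdata

def kendalls_w(ts):
    """ts: (K, n) array, K time series with n time points each."""
    K, n = ts.shape
    ranks = np.apply_along_axis(rankdata, 1, ts)   # rank each series over time
    R = ranks.sum(axis=0)                          # rank sums per time point
    S = ((R - R.mean()) ** 2).sum()
    return 12.0 * S / (K**2 * (n**3 - n))          # W in [0, 1]

rng = np.random.default_rng(0)
shared = rng.standard_normal(100)                          # common signal
cluster = shared + 0.3 * rng.standard_normal((27, 100))    # 3x3x3 neighborhood
print(f"ReHo (Kendall's W) = {kendalls_w(cluster):.3f}")   # close to 1
```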

  12. Study on rheo-diecasting process of 7075R alloys by SA-EMS melt homogenized treatment

    NASA Astrophysics Data System (ADS)

    Zhihua, G.; Jun, X.; Zhifeng, Z.; Guojun, L.; Mengou, T.

    2016-03-01

    An advanced melt processing technology, spiral annular electromagnetic stirring (SA-EMS), based on the annular electromagnetic stirring (A-EMS) process, was developed for manufacturing Al-alloy components with high integrity. The SA-EMS process innovatively combines non-contact electromagnetic stirring with a spiral annular chamber with specially designed profiles to produce high-quality melt slurry in situ; intensive forced shearing is achieved at high shear rates and high turbulence intensity inside the spiral annular chamber. In this paper, the solidification microstructure and hardness of a 7075R alloy die-cast connecting rod conditioned by the SA-EMS melt processing technology were investigated. The results indicate that the SA-EMS melt processing technology exhibits superior grain refinement and remarkable structural homogeneity, and that it can markedly enhance mechanical performance and reduce cracking tendency.

  13. The Challenges of Creating a Benchmarking Process for Administrative and Support Services

    ERIC Educational Resources Information Center

    Manning, Terri M.

    2007-01-01

    In the current climate of emphasis on outcomes assessment, colleges and universities are working diligently to create assessment processes for student learning outcomes, competence in general education, student satisfaction with services, and electronic tracking media to document evidence of competence in graduates. Benchmarking has become a…

  14. 36 CFR 1237.26 - What materials and processes must agencies use to create audiovisual records?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT AUDIOVISUAL, CARTOGRAPHIC, AND RELATED RECORDS MANAGEMENT § 1237.26 What materials and processes must agencies use to create audiovisual...

  15. PROCESS WATER BUILDING, TRA605. FORMS ARE SET TO CREATE THREE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605. FORMS ARE SET TO CREATE THREE SHIELDED CELLS FOR THE PUMPS THAT WILL BE IN WEST HALF OF THE BUILDING. PUMPS WILL LIFT WATER TO WORKING RESERVOIR. CAMERA FACES NORTHEAST. INL NEGATIVE NO. 1465. Unknown Photographer, 2/13/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  16. Porcine liver decellularization under oscillating pressure conditions: a technical refinement to improve the homogeneity of the decellularization process.

    PubMed

    Struecker, Benjamin; Hillebrandt, Karl Herbert; Voitl, Robert; Butter, Antje; Schmuck, Rosa B; Reutzel-Selke, Anja; Geisel, Dominik; Joehrens, Korinna; Pickerodt, Philipp A; Raschzok, Nathanael; Puhl, Gero; Neuhaus, Peter; Pratschke, Johann; Sauer, Igor M

    2015-03-01

    Decellularization and recellularization of parenchymal organs may facilitate the generation of autologous functional liver organoids by repopulation of decellularized porcine liver matrices with induced liver cells. We present an accelerated (7 h overall perfusion time) and effective protocol for human-scale liver decellularization by pressure-controlled perfusion with 1% Triton X-100 and 1% sodium dodecyl sulfate via the hepatic artery (120 mmHg) and portal vein (60 mmHg). In addition, we analyzed the effect of oscillating pressure conditions on pig liver decellularization (n=19). The proprietary perfusion device used to generate these pressure conditions mimics intra-abdominal conditions during respiration to optimize microperfusion within livers and thus improve the homogeneity of the decellularization process. The efficiency of perfusion decellularization was analyzed by macroscopic observation, histological staining (hematoxylin and eosin [H&E], Sirius red, and alcian blue), immunohistochemical staining (collagen IV, laminin, and fibronectin), and biochemical assessment (DNA, collagen, and glycosaminoglycans) of decellularized liver matrices. The integrity of the extracellular matrix (ECM) postdecellularization was visualized by corrosion casting and three-dimensional computed tomography scanning. We found that livers perfused under oscillating pressure conditions (P(+)) showed a more homogeneous course of decellularization and contained less DNA compared with livers perfused without oscillating pressure conditions (P(-)). Microscopically, livers from the (P(-)) group showed remnant cell clusters, while no cells were found in livers from the (P(+)) group. The grade of disruption of the ECM was higher in livers from the (P(-)) group, although the perfusion rates and pressure did not significantly differ. Immunohistochemical staining revealed that important matrix components were still present after decellularization. Corrosion casting showed an intact

  17. Challenges in modelling homogeneous catalysis: new answers from ab initio molecular dynamics to the controversy over the Wacker process.

    PubMed

    Stirling, András; Nair, Nisanth N; Lledós, Agustí; Ujaque, Gregori

    2014-07-21

    We present here a review of the mechanistic studies of the Wacker process stressing the long controversy about the key reaction steps. We give an overview of the previous experimental and theoretical studies on the topic. Then we describe the importance of the most recent Ab Initio Molecular Dynamics (AIMD) calculations in modelling organometallic reactivity in water. As a prototypical example of homogeneous catalytic reactions, the Wacker process poses serious challenges to modelling. The adequate description of the multiple role of the water solvent is very difficult by using static quantum chemical approaches including cluster and continuum solvent models. In contrast, such reaction systems are suitable for AIMD, and by combining with rare event sampling techniques, the method provides reaction mechanisms and the corresponding free energy profiles. The review also highlights how AIMD has helped to obtain a novel understanding of the mechanism and kinetics of the Wacker process.

  18. Dense and Homogeneous Compaction of Fine Ceramic and Metallic Powders: High-Speed Centrifugal Compaction Process

    SciTech Connect

    Suzuki, Hiroyuki Y.

    2008-02-15

    The High-Speed Centrifugal Compaction Process (HCP) is a variation of the colloidal compacting method in which the powders sediment under a very large centrifugal force. The compacting mechanism of HCP differs from that of conventional colloidal processes such as slip casting. This unique compacting mechanism leads to a number of advantageous characteristics, such as higher compacting speed, wide applicability to net-shape formation, and a flawless microstructure of the green compacts. However, HCP also has several detrimental characteristics that must be overcome to realize the process's full potential.

  19. A Comparison of Two Algorithms for the Simulation of Non-Homogeneous Poisson Processes with Degree-Two Exponential Polynomial Intensity Function.

    DTIC Science & Technology

    1977-09-01

    Professor Richard W. Hamming advised me on numerical analysis. Lieutenant Donald R. Bouchoux, USN, on format, content and sundry other matters, and... analysis of such activities. For very simple systems involving event streams, the use of the homogeneous Poisson process as a model... analysis of processes that exhibit gross departures from the homogeneous event stream criterion. If these processes are to be simulated, it is necessary
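
    One classic way to simulate such a process, plausibly among those the report compares, is Lewis-Shedler thinning; a minimal sketch for the degree-two exponential polynomial intensity named in the title is given below. The coefficients are invented for illustration, and the report's actual algorithms are not reproduced.

```python
# Thinning simulation of a non-homogeneous Poisson process with intensity
# lambda(t) = exp(a0 + a1*t + a2*t^2). Coefficients are illustrative.
import numpy as np

def nhpp_thinning(a0, a1, a2, t_max, rng):
    lam = lambda t: np.exp(a0 + a1 * t + a2 * t * t)
    grid = np.linspace(0.0, t_max, 1001)
    lam_max = lam(grid).max()            # crude bound on the intensity
    events, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)      # candidate from HPP(lam_max)
        if t > t_max:
            return np.array(events)
        if rng.random() < lam(t) / lam_max:      # accept w.p. lam(t)/lam_max
            events.append(t)

rng = np.random.default_rng(1)
times = nhpp_thinning(a0=0.5, a1=0.8, a2=-0.1, t_max=10.0, rng=rng)
print(f"simulated {times.size} events on [0, 10]")
```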

  20. Flexible printed circuit boards laser bonding using a laser beam homogenization process

    NASA Astrophysics Data System (ADS)

    Kim, Joohan; Choi, Haewoon

    2012-11-01

    A laser micro-bonding process using laser beam shaping is successfully demonstrated for flexible printed circuit boards. A CW Ytterbium fiber laser with a wavelength of 1070 nm and a laser power density of 1-7 W/mm2 is employed as a local heat source for bonding flexible printed circuit boards to rigid printed circuit boards. To improve the bonding quality, a micro-lens array is used to modify the Gaussian laser beam for the bonding process. An electromagnetic modeling and heat transfer simulation is conducted to verify the effect of the micro-lens array on the laser bonding process. The optimal bonding parameters are found experimentally. As the measured temperature ramp rate of the boards exceeds 1100 K/s, bonding occurs within 100-200 ms at a laser power density of 5 W/mm2. The bonding quality of the FPCB is verified with a shear strength test. Process characteristics are also discussed.

  1. Polymer powder processing of cryomilled polycaprolactone for solvent-free generation of homogeneous bioactive tissue engineering scaffolds.

    PubMed

    Lim, Jing; Chong, Mark Seow Khoon; Chan, Jerry Kok Yen; Teoh, Swee-Hin

    2014-06-25

    Synthetic polymers used in tissue engineering require functionalization with bioactive molecules to elicit specific physiological reactions. These additives must be homogeneously dispersed in order to achieve enhanced composite mechanical performance and uniform cellular response. This work demonstrates the use of a solvent-free powder processing technique to form osteoinductive scaffolds from cryomilled polycaprolactone (PCL) and tricalcium phosphate (TCP). Cryomilling is performed to achieve a micrometer-sized PCL particle distribution and to reduce melt viscosity, thus improving TCP distribution and structural integrity. A breakthrough is achieved in the successful fabrication of a continuous film structure containing 70 weight percent TCP. Following compaction and melting, PCL/TCP composite scaffolds display uniform distribution of TCP throughout the PCL matrix regardless of composition. Homogeneous spatial distribution is also achieved in fabricated 3D scaffolds. When seeded onto powder-processed PCL/TCP films, mesenchymal stem cells undergo robust and uniform osteogenic differentiation, indicating the potential of this approach for biofunctionalizing scaffolds for tissue engineering applications.

  2. Homogenous VUV advanced oxidation process for enhanced degradation and mineralization of antibiotics in contaminated water.

    PubMed

    Pourakbar, Mojtaba; Moussavi, Gholamreza; Shekoohiyan, Sakine

    2016-03-01

    This study aimed to evaluate the degradation and mineralization of amoxicillin (AMX) using the VUV advanced oxidation process. The effects of pH, initial AMX concentration, the presence of water matrix constituents, and hydraulic retention time (HRT), as well as the mineralization level achieved by the VUV process, were considered. For direct comparison, the tests were also performed with UVC radiation. The results show that the degradation of AMX followed first-order kinetics. Direct photolysis by UVC degraded 50 mg/L of AMX in 50 min, versus 3 min for the VUV process. The removal efficiency of the VUV process was directly influenced by the pH of the solution, with higher removal rates achieved at high pH values. The results show that 10 mg/L of AMX was completely degraded and mineralized within 50 s and 100 s, respectively, indicating that the AMX was completely converted into non-hazardous products. Operating the photoreactor in continuous-flow mode revealed that 10 mg/L AMX was completely degraded and mineralized at HRT values of 120 s and 300 s. It was concluded that the VUV advanced process is an efficient and viable technique for the degradation and mineralization of water contaminated with antibiotics.

  3. Homogeneity Pursuit

    PubMed Central

    Ke, Tracy; Fan, Jianqing; Wu, Yichao

    2014-01-01

    This paper explores the homogeneity of coefficients in high-dimensional regression, which extends the sparsity concept and is more general and suitable for many applications. Homogeneity arises when regression coefficients corresponding to neighboring geographical regions or a similar cluster of covariates are expected to be approximately the same. Sparsity corresponds to a special case of homogeneity with a large cluster of known atom zero. In this article, we propose a new method called clustering algorithm in regression via data-driven segmentation (CARDS) to explore homogeneity. New mathematical results are provided on the gain that can be achieved by exploring homogeneity. Statistical properties of two versions of CARDS are analyzed. In particular, the asymptotic normality of our proposed CARDS estimator is established, which reveals better estimation accuracy for homogeneous parameters than is possible without homogeneity exploration. When our methods are combined with sparsity exploration, further efficiency can be achieved beyond the exploration of sparsity alone. This provides additional insights into the power of exploring low-dimensional structures in high-dimensional regression: homogeneity and sparsity. Our results also shed light on the properties of the fused Lasso. The newly developed method is further illustrated by simulation studies and applications to real data. Supplementary materials for this article are available online. PMID:26085701
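
    Schematically, homogeneity exploration of this kind augments a least-squares fit with a penalty on differences between coefficients presumed to be close; the generic fused form below is meant only to convey the idea, not to reproduce the CARDS construction with its data-driven segmentation.

```latex
% Generic fused-penalty objective; E links coefficient pairs presumed similar
% and p is a (possibly folded-concave) penalty function.
\hat{\beta} = \arg\min_{\beta}\;
  \tfrac{1}{2}\lVert y - X\beta \rVert_2^{2}
  + \lambda \sum_{(j,k)\in E} p\!\left(\lvert \beta_j - \beta_k \rvert\right).
```

    The fused Lasso is the special case p(t) = t with E chaining the coefficients along a fixed ordering, which is why the paper's results also bear on its properties.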

  4. Experimental development of processes to produce homogenized alloys of immiscible metals, phase 3

    NASA Technical Reports Server (NTRS)

    Reger, J. L.

    1976-01-01

    An experimental package was designed and built for use in a drop tower. This effort comprised a thermal analysis, container/heater fabrication, and assembly of an expulsion device for rapid quenching of heated specimens during low-gravity conditions. Six gallium-bismuth specimens with compositions in the immiscibility region (50 a/o of each element) were processed in the experimental package: four under low-gravity conditions and two in a one-gravity environment. One of the one-gravity specimens lacked telemetry data and was excluded from analysis, since its processing conditions were not known. Metallurgical, Hall effect, resistivity, and superconductivity examinations were performed on the five remaining specimens. Examination showed that the gallium was dispersed in the bismuth. The low-gravity specimens showed a relatively uniform distribution of gallium, with particle sizes of 1 micrometer or less, in contrast to the one-gravity control specimen. Comparison of the cooling rates of the dropped specimens with their microstructures indicated that low cooling rates are more desirable.

  5. Multiple-pass high-pressure homogenization of milk for the development of pasteurization-like processing conditions.

    PubMed

    Ruiz-Espinosa, H; Amador-Espejo, G G; Barcenas-Pozos, M E; Angulo-Guerrero, J O; Garcia, H S; Welti-Chanes, J

    2013-02-01

    Multiple-pass ultrahigh-pressure homogenization (UHPH) was used to reduce the microbial populations of both the indigenous spoilage microflora in whole raw milk and a baroresistant pathogen (Staphylococcus aureus) inoculated into whole sterile milk, in order to define pasteurization-like processing conditions. Response surface methodology was followed, and multiple-response optimization of UHPH operating pressure (OP) (100, 175, 250 MPa) and number of passes (N) (1-5) was conducted through overlaid contour plot analysis. Increasing OP and N had a significant effect (P < 0.05) on the microbial reduction of both the spoilage microflora and Staph. aureus in milk. Optimized UHPH processes (five 202-MPa passes; four 232-MPa passes) defined a region where a 5-log(10) reduction of the total bacterial count of milk and of a baroresistant pathogen is attainable, a requisite for establishing an alternative method of pasteurization. Optimized multiple-pass UHPH conditions might help in producing safe milk without the detrimental effects associated with thermal pasteurization.

  6. Photodegradation of methyl red by advanced and homogeneous photo-Fenton's processes: a comparative study and kinetic approach.

    PubMed

    Devi, L Gomathi; Raju, K S Anantha; Kumar, S Girish

    2009-07-01

    The degradation of methyl red (MR), an azo dye, was carried out by the homogeneous photo-Fenton's process (HPFP) and the advanced photo-Fenton's process (APFP) using symmetrical peroxides, hydrogen peroxide and ammonium persulfate (APS), as oxidants. The APFP showed higher efficiency than its homogeneous counterpart even at high dye concentrations, due to the faster reduction of Fe3+ to Fe2+ ions on the iron surface. H2O2 proved to be the better oxidant for both processes. However, APS efficiently inhibited the precipitation of iron oxyhydroxides at higher dosages of iron powder compared to H2O2, by providing additional acidity to the reaction medium. The decolorization rate constants for the various oxidation processes follow the order: Fe0/H2O2/UV > Fe0/H2O2/dark > Fe0/APS/UV > Fe2+/H2O2/UV > Fe0/UV > Fe0/APS/dark > Fe0/dark ≈ H2O2/UV > Fe2+/APS/UV > APS/UV > Fe2+/H2O2/dark > Fe2+/APS/dark ≈ Fe2+/UV. The degradation reaction was followed by UV-visible and GC-MS spectroscopic techniques. Based on the intermediates obtained, probable degradation mechanisms are proposed. The initial step in the APFP involves reduction of the azo group to amines, while in the HPFP oxidation of the azo group leads to the formation of hydroxylated products.
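
    The chemistry underlying both systems is the classical Fenton initiation; the textbook steps below are standard chemistry, not transcribed from the paper. In the advanced (zero-valent iron) process the metal surface also regenerates Fe2+ from Fe3+, the faster reduction the authors credit for its higher efficiency.

```latex
% Fenton initiation with either peroxide, and Fe(II) regeneration at the
% zero-valent iron surface (standard reactions).
\mathrm{Fe^{2+} + H_2O_2 \longrightarrow Fe^{3+} + OH^{-} + {}^{\bullet}OH}
\\
\mathrm{Fe^{2+} + S_2O_8^{2-} \longrightarrow Fe^{3+} + SO_4^{2-} + SO_4^{\bullet -}}
\\
\mathrm{Fe^{0} + 2\,Fe^{3+} \longrightarrow 3\,Fe^{2+}}
```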

  7. Laboratory Studies of Homogeneous and Heterogeneous Chemical Processes of Importance in the Upper Atmosphere

    NASA Technical Reports Server (NTRS)

    Molina, Mario J.

    2003-01-01

    The objective of this study was to conduct measurements of chemical kinetics parameters for reactions of importance in the stratosphere and upper troposphere, and to study the interaction of trace gases with ice surfaces in order to elucidate the mechanism of heterogeneous chlorine activation processes, using both a theoretical and an experimental approach. The measurements were carried out under temperature and pressure conditions covering those applicable to the stratosphere and upper troposphere. The main experimental technique employed was turbulent flow-chemical ionization mass spectrometry, which is particularly well suited for investigations of radical-radical reactions.

  8. Effects of homogenization process parameters on physicochemical properties of astaxanthin nanodispersions prepared using a solvent-diffusion technique.

    PubMed

    Anarjan, Navideh; Jafarizadeh-Malmiri, Hoda; Nehdi, Imededdine Arbi; Sbihi, Hassen Mohamed; Al-Resayes, Saud Ibrahim; Tan, Chin Ping

    2015-01-01

    Nanodispersion systems allow incorporation of lipophilic bioactives, such as astaxanthin (a fat soluble carotenoid) into aqueous systems, which can improve their solubility, bioavailability, and stability, and widen their uses in water-based pharmaceutical and food products. In this study, response surface methodology was used to investigate the influences of homogenization time (0.5-20 minutes) and speed (1,000-9,000 rpm) in the formation of astaxanthin nanodispersions via the solvent-diffusion process. The product was characterized for particle size and astaxanthin concentration using laser diffraction particle size analysis and high performance liquid chromatography, respectively. Relatively high determination coefficients (ranging from 0.896 to 0.969) were obtained for all suggested polynomial regression models. The overall optimal homogenization conditions were determined by multiple response optimization analysis to be 6,000 rpm for 7 minutes. In vitro cellular uptake of astaxanthin from the suggested individual and multiple optimized astaxanthin nanodispersions was also evaluated. The cellular uptake of astaxanthin was found to be considerably increased (by more than five times) as it became incorporated into optimum nanodispersion systems. The lack of a significant difference between predicted and experimental values confirms the suitability of the regression equations connecting the response variables studied to the independent parameters.

  9. Effects of homogenization process parameters on physicochemical properties of astaxanthin nanodispersions prepared using a solvent-diffusion technique

    PubMed Central

    Anarjan, Navideh; Jafarizadeh-Malmiri, Hoda; Nehdi, Imededdine Arbi; Sbihi, Hassen Mohamed; Al-Resayes, Saud Ibrahim; Tan, Chin Ping

    2015-01-01

    Nanodispersion systems allow incorporation of lipophilic bioactives, such as astaxanthin (a fat soluble carotenoid) into aqueous systems, which can improve their solubility, bioavailability, and stability, and widen their uses in water-based pharmaceutical and food products. In this study, response surface methodology was used to investigate the influences of homogenization time (0.5–20 minutes) and speed (1,000–9,000 rpm) in the formation of astaxanthin nanodispersions via the solvent-diffusion process. The product was characterized for particle size and astaxanthin concentration using laser diffraction particle size analysis and high performance liquid chromatography, respectively. Relatively high determination coefficients (ranging from 0.896 to 0.969) were obtained for all suggested polynomial regression models. The overall optimal homogenization conditions were determined by multiple response optimization analysis to be 6,000 rpm for 7 minutes. In vitro cellular uptake of astaxanthin from the suggested individual and multiple optimized astaxanthin nanodispersions was also evaluated. The cellular uptake of astaxanthin was found to be considerably increased (by more than five times) as it became incorporated into optimum nanodispersion systems. The lack of a significant difference between predicted and experimental values confirms the suitability of the regression equations connecting the response variables studied to the independent parameters. PMID:25709435

  10. Small-scale variability in solute transport processes in a homogeneous clay loam soil

    SciTech Connect

    Garrido, F.; Ghodrati, M.; Chendorain, M.; Campbell, C.G.

    1999-12-01

    Small-scale variations in transport parameters may have a profound influence on larger scale flow processes. Fiber-optic miniprobes (FOMPs) provide the opportunity to continuously measure solute resident concentration in small soil volumes. A 20-channel multiplexed FOMP system was used in repeated miscible displacements in a repacked clay loam soil column to examine small-scale, point-to-point variability in convective-dispersive transport processes. Transport parameters, measured 10 cm below the surface, were compared at two drip irrigation point densities and two fluxes. Irrigation densities of one drip point per 4 cm2 and per 11 cm2 of column surface area produced similar results. The breakthrough curves measured at 0.10 cm h-1 had a larger immobile phase than those at a flux of 1.07 cm h-1. In the clay loam soil the mobile-immobile model fit the breakthrough curves better than the convective-dispersive equation (CDE), with r2 values of 99.6 and 97.1, respectively. This analysis demonstrated that dispersion and mass recovery were much more variable than pore water velocity in this repacked clay loam soil. However, even in the most variable transport conditions encountered, only 17 sampling points were necessary to describe the column-average transport parameters to within 20% of the mean.
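
    For reference, the two models fitted to the breakthrough curves have the standard one-dimensional forms below (notation assumed): the convective-dispersive equation, and the mobile-immobile two-region extension whose first-order exchange term produces the immobile-phase tailing noted at the lower flux.

```latex
% CDE: dispersion coefficient D, pore-water velocity v.
\frac{\partial C}{\partial t}
  = D\,\frac{\partial^{2} C}{\partial x^{2}} - v\,\frac{\partial C}{\partial x}
\qquad\text{(CDE)}
\\[4pt]
% MIM: mobile/immobile water contents \theta_m, \theta_{im}; transfer rate \alpha.
\theta_m \frac{\partial C_m}{\partial t}
  + \theta_{im} \frac{\partial C_{im}}{\partial t}
  = \theta_m D_m \frac{\partial^{2} C_m}{\partial x^{2}}
  - \theta_m v_m \frac{\partial C_m}{\partial x},
\qquad
\theta_{im}\frac{\partial C_{im}}{\partial t} = \alpha\,(C_m - C_{im}).
```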

  11. Laboratory Studies of Homogeneous and Heterogeneous Chemical Processes of Importance in the Upper Atmosphere

    NASA Technical Reports Server (NTRS)

    Molina, Mario J.

    2001-01-01

    The objective of this study is to conduct measurements of chemical kinetics parameters for reactions of importance in the stratosphere and upper troposphere, and to study the interaction of trace gases such as HCl with ice surfaces in order to elucidate the mechanism of heterogeneous chlorine activation processes, using both a theoretical and an experimental approach. The measurements will be carried out under temperature and pressure conditions covering those applicable to the stratosphere and upper troposphere. The techniques to be employed include turbulent flow - chemical ionization mass spectrometry, and optical ellipsometry. The next section summarizes our research activities during the second year of the project, and the section that follows consists of the statement of work for the third year.

  12. Microstructural Homogeneity and Hot Deformation of Various Friction-Stir-Processed 5083 Al Alloys

    NASA Astrophysics Data System (ADS)

    García-Bernal, M. A.; Mishra, R. S.; Hernández-Silva, D.; Sauce-Rangel, V. M.

    2016-12-01

    Diverse studies on FSP of 5083 Al alloys have been conducted, and some have made comparisons with previous studies of similar alloys, but such comparisons can be invalid because of differences in the parameters used during FSP, above all the tool profile. Five 5083 Al alloys produced by different production routes were friction-stir-processed and compared among themselves and with two other superplastic forming (SPF) grade 5083 Al alloys. The results suggest that the grain size refinement is independent of the original microstructure and that there is a relationship between the size of the second phase before and after FSP. The combination of continuous-cast 5083 Al alloys + FSP showed outstanding hot deformation behavior in comparison with rolled or extruded 5083 Al alloys + FSP, and even with the SPF 5083 Al alloys.

  13. Microstructural Homogeneity and Hot Deformation of Various Friction-Stir-Processed 5083 Al Alloys

    NASA Astrophysics Data System (ADS)

    García-Bernal, M. A.; Mishra, R. S.; Hernández-Silva, D.; Sauce-Rangel, V. M.

    2017-01-01

    Diverse studies on FSP of 5083 Al alloys have been conducted, and some have made comparisons with previous studies of similar alloys, but such comparisons can be invalid because of differences in the parameters used during FSP, above all the tool profile. Five 5083 Al alloys produced by different production routes were friction-stir-processed and compared among themselves and with two other superplastic forming (SPF) grade 5083 Al alloys. The results suggest that the grain size refinement is independent of the original microstructure and that there is a relationship between the size of the second phase before and after FSP. The combination of continuous-cast 5083 Al alloys + FSP showed outstanding hot deformation behavior in comparison with rolled or extruded 5083 Al alloys + FSP, and even with the SPF 5083 Al alloys.

  14. Degradation mechanism of cyanobacterial toxin cylindrospermopsin by hydroxyl radicals in homogeneous UV/H₂O₂ process.

    PubMed

    He, Xuexiang; Zhang, Geshan; de la Cruz, Armah A; O'Shea, Kevin E; Dionysiou, Dionysios D

    2014-04-15

    The degradation of cylindrospermopsin (CYN), a widely distributed and highly toxic cyanobacterial toxin (cyanotoxin), remains poorly elucidated. In this study, the mechanism of CYN destruction by the UV-254 nm/H2O2 advanced oxidation process (AOP) was investigated by mass spectrometry. The byproducts identified indicated three common reaction pathways: hydroxyl addition (+16 Da), alcoholic oxidation or dehydrogenation (-2 Da), and elimination of sulfate (-80 Da). Degradation was initiated at the hydroxymethyl uracil and tricyclic guanidine groups; uracil moiety cleavage/fragmentation and further ring-opening of the alkaloid were also noted at extended reaction times or higher UV fluence. The degradation rates of CYN decreased, and fewer byproducts were detected, in natural water matrices; however, CYN was effectively eliminated under extended UV irradiation. This study demonstrates the efficiency of CYN degradation and provides a better understanding of the mechanism of CYN degradation by hydroxyl radical, a reactive oxygen species that can be generated by most AOPs and is present in natural water environments.

  14. Application and possible benefits of high hydrostatic pressure or high-pressure homogenization on beer processing: A review.

    PubMed

    Santos, Lígia Mr; Oliveira, Fabiano A; Ferreira, Elisa Hr; Rosenthal, Amauri

    2017-10-01

    Beer is the most consumed beverage in the world, especially in countries such as the USA, China and Brazil. It is an alcoholic beverage made from malted cereals, the barley malt being the main ingredient, to which water, hops and yeast are added. High-pressure processing is a non-traditional method to preserve food and beverages. This technology has become more attractive than heat pasteurization because of the minimal changes it brings to the original nutritional and sensory characteristics of the product, and it comprises two processes: high hydrostatic pressure, which is the most industrially used process, and high-pressure homogenization. Compared with conventional thermal processes, high pressure scarcely affects the molecules responsible for aroma and taste, pigments and vitamins. Thus, products processed by high pressure, including beer, have characteristics similar to those of fresh products. The aim of this paper was to review what has been investigated about beer processing using this technology regarding the effects on physicochemical, microbiological and sensory characteristics and related issues. It is organized by processing step, since high pressure can be applied to malting, mashing, boiling, filtration and pasteurization. Beer processed by high pressure may have an extended shelf-life, because the process can inactivate beer spoilage microorganisms, and a superior sensory quality related to freshness and preservation of flavors, as it does for juices that are already commercialized. Beyond this application, high-pressure processing can also modify protein structures, such as the enzymes present in the malt, like α- and β-amylases. The process can activate enzymes to promote, for example, saccharification, or instead inactivate them at the end of mashing, depending on the pressure to which the product is submitted, besides being capable of isomerizing hops to raise beer bitterness.

  15. Processing of α-chitin nanofibers by dynamic high pressure homogenization: characterization and antifungal activity against A. niger.

    PubMed

    Salaberria, Asier M; Fernandes, Susana C M; Diaz, Rene Herrera; Labidi, Jalel

    2015-02-13

    Chitin nano-objects are more interesting and attractive materials than native chitin because of their usable form, low density, high surface area and promising mechanical properties. This work presents a straightforward and environmentally friendly method for processing chitin nanofibers using dynamic high pressure homogenization. The technique proved to be a remarkably simple way to convert α-chitin from yellow lobster wastes into α-chitin nanofibers with a uniform width (below 100 nm) and high aspect ratio, and may contribute to a major breakthrough in chitin applications. The resulting α-chitin nanofibers were characterized and compared with native α-chitin in terms of chemical and crystal structure, thermal degradation and antifungal activity. The biological assays highlighted that the nano nature of chitin nanofibers plays an important role in the antifungal activity against Aspergillus niger.

  16. New American Cancer Society process for creating trustworthy cancer screening guidelines.

    PubMed

    Brawley, Otis; Byers, Tim; Chen, Amy; Pignone, Michael; Ransohoff, David; Schenk, Maryjean; Smith, Robert; Sox, Harold; Thorson, Alan G; Wender, Richard

    2011-12-14

    Guidelines for cancer screening written by different organizations often differ, even when they are based on the same evidence. Those dissimilarities can create confusion among health care professionals, the general public, and policy makers. The Institute of Medicine (IOM) recently released 2 reports to establish new standards for developing more trustworthy clinical practice guidelines and conducting systematic evidence reviews that serve as their basis. Because the American Cancer Society (ACS) is an important source of guidance about cancer screening for both health care practitioners and the general public, it has revised its methods to create a more transparent, consistent, and rigorous process for developing and communicating guidelines. The new ACS methods align with the IOM principles for trustworthy clinical guideline development by creating a single generalist group for writing the guidelines, commissioning independent systematic evidence reviews, and clearly articulating the benefits, limitations, and harms associated with a screening test. This new process should ensure that ACS cancer screening guidelines will continue to be a trustworthy source of information for both health care practitioners and the general public to guide clinical practice, personal choice, and public policy about cancer screening.

  17. Development of a new cucumber reference material for pesticide residue analysis: feasibility study for material processing, homogeneity and stability assessment.

    PubMed

    Grimalt, Susana; Harbeck, Stefan; Shegunova, Penka; Seghers, John; Sejerøe-Olsen, Berit; Emteborg, Håkan; Dabrio, Marta

    2015-04-01

    The feasibility of the production of a reference material for pesticide residue analysis in a cucumber matrix was investigated. Cucumber was spiked at 0.075 mg/kg with each of the 15 selected pesticides (acetamiprid, azoxystrobin, carbendazim, chlorpyrifos, cypermethrin, diazinon, (α + β)-endosulfan, fenitrothion, imazalil, imidacloprid, iprodione, malathion, methomyl, tebuconazole and thiabendazole). Three different strategies were considered for processing the material, based on the physicochemical properties of the vegetable and the target pesticides. As a result, a frozen spiked slurry of fresh cucumber, a spiked freeze-dried cucumber powder and a freeze-dried cucumber powder spiked by spraying the powder were studied. The effects of processing and aspects related to the reconstitution of the material were evaluated by monitoring the pesticide levels in the three materials. Two separate analytical methods based on LC-MS/MS and GC-MS/MS were developed and validated in-house. The spiked freeze-dried cucumber powder was selected as the most feasible material, and more exhaustive studies on homogeneity and stability of the pesticide residues in the matrix were carried out. The results suggested that the between-unit homogeneity was satisfactory with a sample intake of dried material as low as 0.1 g. A 9-week isochronous stability study was undertaken at -20 °C, 4 °C and 18 °C, with -70 °C designated as the reference temperature. The pesticides tested exhibited adequate stability at -20 °C during the 9-week period, as well as at -70 °C for a period of 18 months. These results constitute a good basis for the development of a new candidate reference material for selected pesticides in a cucumber matrix.

  18. Digital Image Processing Techniques to Create Attractive Astronomical Images from Research Data

    NASA Astrophysics Data System (ADS)

    Rector, T. A.; Levay, Z.; Frattare, L. M.; English, J.; Pummill, K.

    2003-12-01

    The quality of modern astronomical data, the power of modern computers and the agility of current image processing software enable the creation of high-quality images in a purely digital form that rival the quality of traditional photographic astronomical images. The combination of these technological advancements has created a new ability to make color astronomical images. And in many ways, it has led to a new philosophy towards how to create them. We present a practical guide to generate astronomical images from research data by using powerful image processing programs. These programs use a layering metaphor that allows an unlimited number of astronomical datasets to be combined in any desired color scheme, creating an immense parameter space to be explored using an iterative approach. Several examples of image creation are presented. We also present a philosophy on how to use color and composition to create images that simultaneously highlight the scientific detail within an image and are aesthetically appealing. We advocate an approach that uses visual grammar, defined as the elements which affect the interpretation of an image, to maximize the richness and detail in an image while maintaining scientific accuracy. By properly using visual grammar, one can imply qualities that a two-dimensional image intrinsically cannot show, such as depth, motion and energy. In addition, composition can be used to engage the viewer and keep him or her interested for a longer period of time. The effective use of these techniques can result in a striking image that will effectively convey the science within the image, to scientists and to the public.
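
    The layering approach lends itself to a simple numerical illustration. Below is a minimal sketch (not the authors' code) of assigning three registered grayscale datasets to color channels after a percentile stretch; the arrays are synthetic stand-ins for science frames, and all names are hypothetical.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    def stretch(data, lo=0.5, hi=99.5):
        """Percentile-clip one dataset and normalize it to [0, 1]."""
        vmin, vmax = np.percentile(data, [lo, hi])
        return np.clip((data - vmin) / (vmax - vmin), 0.0, 1.0)

    # Synthetic stand-ins for three registered narrow-band exposures.
    rng = np.random.default_rng(0)
    h_alpha = rng.gamma(2.0, 1.0, (256, 256))
    o_iii = rng.gamma(2.0, 1.2, (256, 256))
    s_ii = rng.gamma(2.0, 0.8, (256, 256))

    # Layering metaphor: each dataset becomes one color layer in the stack.
    rgb = np.dstack([stretch(s_ii), stretch(h_alpha), stretch(o_iii)])

    plt.imshow(rgb, origin="lower")
    plt.axis("off")
    plt.savefig("composite.png", dpi=150, bbox_inches="tight")
    ```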

  1. Synthetic river valleys: Creating prescribed topography for form-process inquiry and river rehabilitation design

    NASA Astrophysics Data System (ADS)

    Brown, R. A.; Pasternack, G. B.; Wallender, W. W.

    2014-06-01

    The synthesis of artificial landforms is complementary to geomorphic analysis because it affords a reflection on both the characteristics and intrinsic formative processes of real world conditions. Moreover, the applied terminus of geomorphic theory is commonly manifested in the engineering and rehabilitation of riverine landforms, where the goal is to create specific processes associated with specific morphology. To date, the synthesis of river topography has been explored outside of geomorphology through artistic renderings, computer science applications, and river rehabilitation design; within geomorphology it has been explored using morphodynamic modeling, such as one-dimensional simulation of river reach profiles, two-dimensional simulation of river networks, and three-dimensional simulation of subreach scale river morphology. Yet no existing approach allows geomorphologists, engineers, or river rehabilitation practitioners to create landforms of prescribed conditions. In this paper a method for creating topography of synthetic river valleys is introduced that utilizes a theoretical framework drawing from fluvial geomorphology, computer science, and geometric modeling. Such a method is valuable to geomorphologists in understanding form-process linkages as well as to engineers and river rehabilitation practitioners in developing design surfaces that can be rapidly iterated. The method relies on the discretization of river valley topography into geometric elements associated with overlapping and orthogonal two-dimensional planes, such as the planform, profile, and cross section, that are represented by mathematical functions termed geometric element equations. Topographic surfaces can be parameterized independently or dependently using a geomorphic covariance structure between the spatial series of the geometric element equations. To illustrate the approach and overall model flexibility, several examples are provided.
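
    As a rough illustration of the geometric element idea (a sketch under assumed parameters, not the paper's model), the fragment below composes a sinusoidal planform element, a linear profile element, and a parabolic cross-section element into a gridded channel surface.

    ```python
    import numpy as np

    # Assumed valley parameters (purely illustrative).
    L, W = 1000.0, 40.0            # valley length and channel width (m)
    amp, wavelength = 25.0, 250.0  # planform meander amplitude and wavelength (m)
    slope, depth = 0.002, 2.0      # longitudinal slope and bankfull depth (m)

    s = np.linspace(0.0, L, 501)        # streamwise stations
    n = np.linspace(-W / 2, W / 2, 81)  # cross-stream offsets
    S, N = np.meshgrid(s, n, indexing="ij")

    # Geometric element equations: planform, profile, and cross section.
    x = S
    y = amp * np.sin(2 * np.pi * S / wavelength) + N      # planform + offset
    z = -slope * S - depth * (1.0 - (2.0 * N / W) ** 2)   # profile + cross section

    print(x.shape, y.shape, z.shape)  # one (station, offset) grid per coordinate
    ```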

  2. The Parametric Model of the Human Mandible Coronoid Process Created by Method of Anatomical Features.

    PubMed

    Vitković, Nikola; Mitić, Jelena; Manić, Miodrag; Trajanović, Miroslav; Husain, Karim; Petrović, Slađana; Arsić, Stojanka

    2015-01-01

    Geometrically accurate and anatomically correct 3D models of human bones are of great importance for medical research and practice in orthopedics and surgery. These geometrical models can be created by techniques based on input geometrical data acquired from volumetric scanning methods (e.g., Computed Tomography (CT)) or from 2D images (e.g., X-ray). Geometrical models of human bones created in this way can be applied to the education of medical practitioners, preoperative planning, etc. In cases where the geometrical data about a bone is incomplete (e.g., fractures), it may be necessary to create its complete geometrical model. One possible solution to this problem is the application of parametric models, whose geometry can be changed and adapted to the specific patient based on parameter values acquired from medical images (e.g., X-ray). In this paper, the Method of Anatomical Features (MAF), which enables the creation of geometrically precise and anatomically accurate models of human bones, is applied to create a parametric model of the Human Mandible Coronoid Process (HMCP). The results on the geometrical accuracy of the model are quite satisfactory, as stated by medical practitioners and confirmed in the literature.

  3. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or Denver AWIPS Risk Reduction and Requirements Evaluation (DARE) Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on the fly for AWIPS, using two examples of AWIPS applications created by the Applied Meteorology Unit (AMU) located at Cape Canaveral Air Force Station (CCAFS), Florida. The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) at Johnson Space Center, Texas and the 45th Weather Squadron (45 WS) at CCAFS to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. The presentation will list the advantages and disadvantages of both file types for creating interactive graphical overlays in future AWIPS applications. Shapefiles are a popular format used extensively in Geographical Information Systems. They are usually used in AWIPS to depict static map backgrounds. A shapefile stores the geometry and attribute information of spatial features in a dataset (ESRI 1998). Shapefiles can contain point, line, and polygon features. Each shapefile contains a main file, an index file, and a dBASE table. The main file contains a record for each spatial feature, which describes the feature with a list of its vertices. The index file contains the offset of each record from the beginning of the main file. The dBASE table contains a record of attribute data for each spatial feature.
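
    For readers unfamiliar with the format, the sketch below writes and reads a minimal polygon shapefile using the pyshp package (an assumption; the AMU tools are not implied to use it), with invented coordinates and attribute values.

    ```python
    import shapefile  # the pyshp package (assumed available)

    # Write one polygon with one dBASE attribute record (values invented).
    with shapefile.Writer("anvil_corridor", shapeType=shapefile.POLYGON) as w:
        w.field("NAME", "C", size=40)  # attribute column stored in the .dbf
        w.poly([[(-80.7, 28.4), (-80.2, 28.9), (-79.8, 28.6), (-80.7, 28.4)]])
        w.record("Anvil threat corridor")

    # Read it back: geometry from the main/index files, attributes from the .dbf.
    with shapefile.Reader("anvil_corridor") as r:
        for shape, rec in zip(r.shapes(), r.records()):
            print(rec["NAME"], len(shape.points), "vertices")
    ```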

  4. Creating OGC Web Processing Service workflows using a web-based editor

    NASA Astrophysics Data System (ADS)

    de Jesus, J.; Walker, P.; Grant, M.

    2012-04-01

    The OGC WPS (Web Processing Service) specifies how geospatial algorithms may be accessed in an SOA (Service Oriented Architecture). Service providers can encode both simple and sophisticated algorithms as WPS processes and publish them as web services. These services are not only useful individually but may be built into complex processing chains (workflows) that can solve complex data analysis and/or scientific problems. The NETMAR project has extended the Web Processing Service (WPS) framework to provide transparent integration between it and the commonly used WSDL (Web Service Description Language), which describes web services, with its default SOAP (Simple Object Access Protocol) binding. The extensions allow WPS services to be orchestrated using commonly used tools (in this case Taverna Workbench, but BPEL-based systems would also be an option). We have also developed a WebGUI service editor, based on HTML5 and the WireIt! Javascript API, that allows users to create these workflows using only a web browser. The editor is coded entirely in Javascript and performs all XSLT transformations needed to produce a Taverna-compatible (T2FLOW) workflow description, which can be exported and run on a local Taverna Workbench or uploaded to a web-based orchestration server and run there. Here we present the NETMAR WebGUI service chain editor and discuss the problems associated with the development of a WebGUI for scientific workflow editing; content transformation into the Taverna orchestration language (T2FLOW/SCUFL); final orchestration in the Taverna engine; and how to deal with the large volumes of data being transferred between different WPS services (possibly running on different servers) during workflow orchestration. We will also demonstrate using the WebGUI to create a simple workflow from published web processing services, showing how simple services may be chained together to produce outputs that would previously have required a GIS (Geographic Information System).
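
    A minimal client-side sketch of calling a single WPS process is shown below, using the OWSLib library; the endpoint URL, process identifier, and input names are hypothetical.

    ```python
    from owslib.wps import WebProcessingService, monitorExecution

    # Connect and list the advertised processes (URL is hypothetical).
    wps = WebProcessingService("http://example.org/wps")
    for p in wps.processes:
        print(p.identifier, "-", p.title)

    # Execute one process; a workflow engine such as Taverna chains several
    # such calls, feeding one service's output into the next service's input.
    inputs = [("dataset", "http://example.org/data/chl_a.nc")]
    execution = wps.execute("median_filter", inputs, output="result")
    monitorExecution(execution)  # poll the asynchronous job until it completes
    print(execution.status)
    ```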

  5. Redistribution Mechanisms and Quantification of Homogeneity in Friction Stir Welding and Processing of an Aluminum Silicon Alloy

    DTIC Science & Technology

    2012-09-01

    Indexed snippet fragments (full abstract not recoverable): slurries, which are high-viscosity fluids containing particulates such as platelets in a blood flow, all definitively show that shearing ... flow chart of homogeneity index procedure (Appendix G) ... MATLAB codes (Appendix H) ... narrow deformation zone and "flow-line" type chip microstructure; note the improved homogeneity of the chip as compared to the bulk. After [48].

  6. Preparation of cotton linter nanowhiskers by high-pressure homogenization process and its application in thermoplastic starch

    NASA Astrophysics Data System (ADS)

    Savadekar, N. R.; Karande, V. S.; Vigneshwaran, N.; Kadam, P. G.; Mhaske, S. T.

    2015-03-01

    The present work deals with the preparation of cotton linter nanowhiskers (CLNW) by acid hydrolysis and subsequent processing in a high-pressure homogenizer. The prepared CLNW were then used as a reinforcing material in thermoplastic starch (TPS), with the aim of improving its performance properties. The concentration of CLNW was varied as 0, 1, 2, 3, 4 and 5 wt% in TPS, and TPS/CLNW nanocomposite films were prepared by a solution-casting process. The nanocomposite films were characterized by tensile testing, differential scanning calorimetry (DSC), scanning electron microscopy (SEM), water vapor permeability (WVP), oxygen permeability (OP), X-ray diffraction and light transmittance measurements. The 3 wt% CLNW-loaded TPS nanocomposite films demonstrated an 88 % improvement in tensile strength compared to the pristine TPS polymer film, whereas WVP and OP decreased by 90 and 92 %, respectively, which is highly appreciable given the quantity of CLNW added. DSC thermograms of the nanocomposite films did not show any significant effect on melting temperature compared to pristine TPS. The light transmittance (Tr) of TPS decreased with increasing CLNW content. Better interaction between CLNW and TPS, owing to the hydrophilic nature of both materials, and the uniform distribution of CLNW in TPS were the prime reasons for the improvement in properties observed at 3 wt% loading of CLNW in TPS. However, CLNW formed agglomerates at higher concentrations, as determined from SEM analysis. These nanocomposite films have potential use in food and pharmaceutical packaging applications.

  7. Development of a reference material for Staphylococcus aureus enterotoxin A in cheese: feasibility study, processing, homogeneity and stability assessment.

    PubMed

    Zeleny, R; Emteborg, H; Charoud-Got, J; Schimmel, H; Nia, Y; Mutel, I; Ostyn, A; Herbin, S; Hennekinne, J-A

    2015-02-01

    Staphylococcal food poisoning is caused by enterotoxins excreted into foods by strains of staphylococci. Commission Regulation 1441/2007 specifies thresholds for the presence of these toxins in foods. In this article we report on progress towards reference materials (RMs) for Staphylococcal enterotoxin A (SEA) in cheese. RMs are crucial to enforce legislation and to implement and safeguard reliable measurements. First, a feasibility study revealed a suitable processing procedure for cheese powders: the blank material was prepared by cutting, grinding, freeze-drying and milling. For the spiked material, a cheese-water slurry was spiked with SEA solution, freeze-dried and diluted with blank material to the desired SEA concentration. Thereafter, batches of three materials (blank; two SEA concentrations) were processed. The materials were shown to be sufficiently homogeneous, and storage at ambient temperature for 4 weeks did not indicate degradation. These results provide the basis for the development of an RM for SEA in cheese. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. Regional Homogeneity of Resting-State Brain Activity Suppresses the Effect of Dopamine-Related Genes on Sensory Processing Sensitivity.

    PubMed

    Chen, Chunhui; Xiu, Daiming; Chen, Chuansheng; Moyzis, Robert; Xia, Mingrui; He, Yong; Xue, Gui; Li, Jin; He, Qinghua; Lei, Xuemei; Wang, Yunxin; Liu, Bin; Chen, Wen; Zhu, Bi; Dong, Qi

    2015-01-01

    Sensory processing sensitivity (SPS) is an intrinsic personality trait whose genetic and neural bases have recently been studied. The current study used a neural mediation model to explore whether resting-state brain functions mediated the effects of dopamine-related genes on SPS. 298 healthy Chinese college students (96 males, mean age = 20.42 years, SD = 0.89) were scanned with magnetic resonance imaging during resting state, genotyped for 98 loci within the dopamine system, and administered the Highly Sensitive Person Scale. We extracted a "gene score" that summarized the genetic variations representing the 10 loci that were significantly linked to SPS, and then used path analysis to search for brain regions whose resting-state data would help explain the gene-behavior association. Mediation analysis revealed that temporal homogeneity of regional spontaneous activity (ReHo) in the precuneus actually suppressed the effect of dopamine-related genes on SPS. The path model explained 16% of the variance of SPS. This study represents the first attempt at using a multi-gene voxel-based neural mediation model to explore the complex relations among genes, brain, and personality.

  9. Development of a web-based support system for both homogeneous and heterogeneous air quality control networks: process and product.

    PubMed

    Andrade, J; Ares, J; García, R; Presa, J; Rodríguez, S; Piñeiro-Iglesias, M; López-Mahía, P; Muniategui, S; Prada, D

    2007-10-01

    The Environmental Laboratories Automation Software System, or PALMA (Spanish abbreviation), was developed by a multidisciplinary team to support the main tasks of heterogeneous air quality control networks. The software process for PALMA development, which can equally be applied to similar multidisciplinary projects, was (a) well-defined, (b) agreed between environmental technicians and software engineers, (c) based on quality guides, and (d) clearly user-centred. Moreover, it introduces some interesting advantages over classical step-by-step approaches. PALMA is a web-based system that allows 'off-line' and automated telematic data acquisition from distributed immission stations belonging not only to homogeneous but also to heterogeneous air quality control networks. It provides graphic and tabular representations for a comprehensive and centralised analysis of the acquired data, and supports the daily work associated with such networks: validation of the acquired data, alerts with regard to (periodic) tasks (e.g., analyser verification), downloading of files with environmental information (e.g., dust forecasts), etc. The deployment of PALMA has provided qualitative and quantitative improvements in the work performed by the people in charge of the control network considered.

  10. Processing sleep data created with the Drosophila Activity Monitoring (DAM) System.

    PubMed

    Pfeiffenberger, Cory; Lear, Bridget C; Keegan, Kevin P; Allada, Ravi

    2010-11-01

    Adult behavioral assays have been used with great success in Drosophila melanogaster to identify circadian rhythm genes. In particular, the locomotor activity assay can identify altered behavior patterns over the course of several days in small populations, or even individual flies. Sleep is a highly conserved behavior that is required for optimal performance and, in many cases, life of an organism. Drosophila demonstrate a behavioral state that shows traits consistent with sleep: periods of relative behavioral immobility that coincide with an increased arousal threshold after ~5 min of inactivity, regulated by circadian and homeostatic mechanisms. However, because flies do not produce brain waves recordable by electroencephalography, sleep researchers use behavior-based paradigms to infer when a fly is asleep, as opposed to awake but immobile. Data on Drosophila activity can be collected using an automated monitoring system to provide insight into sleep duration, consolidation, and latency, as well as sleep deprivation and rebound. This protocol details the use of Counting Macro, an Excel-based program, to process data created with the Drosophila Activity Monitoring (DAM) System from TriKinetics for sleep analyses. Specifically, it details the steps necessary to convert the raw data created by the DAM System into sleep duration and consolidation data, broken down into the light (L), dark (D), light:dark cycling (LD), and constant darkness (DD) phases of a behavior experiment.
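
    Counting Macro itself is an Excel program, but the core rule it applies (a sleep bout is at least 5 min of zero activity) is easy to sketch; below is a hypothetical Python version operating on per-minute beam-crossing counts.

    ```python
    import numpy as np

    def sleep_minutes(counts, threshold=5):
        """Mark a minute as sleep when it lies inside a run of at least
        `threshold` consecutive minutes with zero beam crossings."""
        asleep = np.zeros(len(counts), dtype=bool)
        run_start = None
        for i, c in enumerate(list(counts) + [1]):  # sentinel closes a final run
            if c == 0 and run_start is None:
                run_start = i
            elif c != 0 and run_start is not None:
                if i - run_start >= threshold:
                    asleep[run_start:i] = True
                run_start = None
        return asleep

    # One simulated hour: activity, a 12 min rest bout, movement, then a
    # 3 min pause that is too short to count as sleep.
    counts = [3, 1, 2] + [0] * 12 + [2, 4] + [0] * 3 + [1] * 40
    asleep = sleep_minutes(counts)
    print("total sleep (min):", int(asleep.sum()))  # -> 12
    ```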

  11. Clinical perspective: creating an effective practice peer review process-a primer.

    PubMed

    Gandhi, Manisha; Louis, Frances S; Wilson, Shae H; Clark, Steven L

    2017-03-01

    Peer review serves as an important adjunct to other hospital quality and safety programs. Despite its importance, the available literature contains virtually no guidance regarding the structure and function of effective peer review committees. This Clinical Perspective provides a summary of the purposes, structure, and functioning of effective peer review committees. We also discuss important legal considerations that are a necessary component of such processes. This discussion includes useful templates for case selection and review. Proper committee structure, membership, work flow, and leadership as well as close cooperation with the hospital medical executive committee and legal representatives are essential to any effective peer review process. A thoughtful, fair, systematic, and organized approach to creating a peer review process will lead to confidence in the committee by providers, hospital leadership, and patients. If properly constructed, such committees may also assist in monitoring and enforcing compliance with departmental protocols, thus reducing harm and promoting high-quality practice. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Using a critical reflection process to create an effective learning community in the workplace.

    PubMed

    Walker, Rachel; Cooke, Marie; Henderson, Amanda; Creedy, Debra K

    2013-05-01

    Learning circles are an enabling process to critically examine and reflect on practices with the purpose of promoting individual and organizational growth and change. The authors adapted and developed a learning circle strategy to facilitate open discourse between registered nurses, clinical leaders, clinical facilitators and students, to critically reflect on practice experiences to promote a positive learning environment. This paper reports on an analysis of field notes taken during a critical reflection process used to create an effective learning community in the workplace. A total of 19 learning circles were conducted during in-service periods (that is, the time allocated for professional education between morning and afternoon shifts) over a 3 month period with 56 nurses, 33 students and 1 university-employed clinical supervisor. Participation rates ranged from 3 to 12 individuals per discussion. Ten themes emerged from content analysis of the clinical learning issues identified through the four-step model of critical reflection used in learning circle discussions. The four-step model of critical reflection allowed participants to reflect on clinical learning issues, and raise them in a safe environment that enabled topics to be challenged and explored in a shared and cooperative manner.

  13. Creating Economic Incentives for Waste Disposal in Developing Countries Using the MixAlco Process.

    PubMed

    Lonkar, Sagar; Fu, Zhihong; Wales, Melinda; Holtzapple, Mark

    2017-01-01

    In rapidly growing developing countries, waste disposal is a major challenge. Current waste disposal methods (e.g., landfills and sewage treatment) incur costs and often are not employed; thus, wastes accumulate in the environment. To address this challenge, it is advantageous to create economic incentives to collect and process wastes. One approach is the MixAlco process, which uses methane-inhibited anaerobic fermentation to convert waste biomass into carboxylate salts, which are chemically converted to industrial chemicals and fuels. In this paper, humanure (raw human feces and urine) is explored as a possible nutrient source for fermentation. This work focuses on fermenting municipal solid waste (energy source) and humanure (nutrient source) in batch fermentations. Using the Continuum Particle Distribution Model (CPDM), the performance of continuous countercurrent fermentation was predicted at different volatile solid loading rates (VSLR) and liquid residence times (LRT). For a four-stage countercurrent fermentation system at VSLR = 4 g/(L∙day), LRT = 30 days, and solids concentration = 100 g/L liquid, the model predicts carboxylic acid concentration of 68 g/L and conversion of 78.5 %.

  14. Comparing the impact of homogenization and heat processing on the properties and in vitro digestion of milk from organic and conventional dairy herds

    USDA-ARS?s Scientific Manuscript database

    The effects of homogenization and heat processing on the chemical and in vitro digestion traits of milk from organic and conventional herds were compared. Raw milk from organic (>50% of dry matter intake from pasture) and conventional (no access to pasture) farms were adjusted to commercial whole a...

  15. Waste container weighing data processing to create reliable information of household waste generation.

    PubMed

    Korhonen, Pirjo; Kaila, Juha

    2015-05-01

    Household mixed waste container weighing data were processed by knowledge discovery and data mining techniques to create reliable information on household waste generation. The final data set included 27,865 weight measurements covering the whole of 2013 and was selected from a database of the Helsinki Region Environmental Services Authority, Finland. The data set contains mixed household waste arising in 6 m(3) containers and was processed by identifying missing values and inconsistently low and high values as errors. The share of missing values and errors in the data set was 0.6%. This provides evidence that the waste weighing data give reliable information on mixed waste generation at the collection point level. Characteristic of mixed household waste arising at the collection point level is a wide variation between pickups. The seasonal variation pattern, a result of collective similarities in household behaviour, was clearly detected by smoothed medians of the waste weight time series. Evaluation of the collection time series against the defined distribution range of pickup weights at the collection point level shows that 65% of the pickups were from collection points with optimally dimensioned container capacity; collection points with over- and under-dimensioned container capacities accounted for 9.5% and 3.4% of all pickups, respectively. Occasional extra waste in containers occurred in 21.2% of the pickups, indicating the irregular behaviour of individual households. The results of this analysis show that processing waste weighing data using knowledge discovery and data mining techniques provides trustworthy information on household waste generation and its variations. Copyright © 2015 Elsevier Ltd. All rights reserved.
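
    As an illustration of this kind of screening (a sketch only; the file, column names and plausibility limits are invented, not those of the study), the fragment below flags missing and implausible weighings and smooths the remainder with a rolling median.

    ```python
    import pandas as pd

    # Hypothetical pickup log with columns: date, point_id, weight_kg.
    df = pd.read_csv("pickups.csv", parse_dates=["date"])

    # Flag missing values and inconsistently low/high weights as errors.
    low, high = 5.0, 2000.0  # invented plausibility limits (kg)
    df["error"] = df["weight_kg"].isna() | ~df["weight_kg"].between(low, high)
    print("error share: {:.1%}".format(df["error"].mean()))

    # Seasonal variation pattern via smoothed medians of the weight series.
    clean = df.loc[~df["error"]].set_index("date").sort_index()
    weekly = clean["weight_kg"].resample("W").median()
    smoothed = weekly.rolling(window=5, center=True, min_periods=1).median()
    print(smoothed.describe())
    ```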

  16. Noble metal-catalyzed homogeneous and heterogeneous processes in treating simulated nuclear waste media with formic acid

    SciTech Connect

    King, R.B.; Bhattacharyya, N.K.; Smith, H.D.

    1995-09-01

    Simulants for the Hanford Waste Vitrification Plant feed containing the major non-radioactive components Al, Cd, Fe, Mn, Nd, Ni, Si, Zr, Na, CO₃²⁻, NO₃⁻, and NO₂⁻ were used to study reactions of formic acid at 90 °C catalyzed by the noble metals Ru, Rh, and/or Pd found in significant quantities in uranium fission products. Such reactions were monitored using gas chromatography to analyze the CO₂, H₂, NO, and N₂O in the gas phase and a micro-ammonia electrode to analyze the NH₄⁺/NH₃ in the liquid phase as a function of time. The following reactions have been studied in these systems since they are undesirable side reactions in nuclear waste processing: (1) Decomposition of formic acid to CO₂ + H₂ is undesirable because of the potential fire and explosion hazard of H₂. Rhodium, which was introduced as soluble RhCl₃·3H₂O, was found to be the most active catalyst for H₂ generation from formic acid above ~80 °C in the presence of nitrite ion. The H₂ production rate has an approximate pseudo-first-order dependence on the Rh concentration. (2) Generation of NH₃ from the formic acid reduction of nitrate and/or nitrite is undesirable because of a possible explosion hazard from NH₄NO₃ accumulation in a waste processing plant off-gas system. The Rh-catalyzed reduction of nitrogen-oxygen compounds to ammonia by formic acid was found to exhibit the following features: (a) Nitrate rather than nitrite is the principal source of NH₃. (b) Ammonia production occurs at the expense of hydrogen production. (c) Supported rhodium metal catalysts are more active than rhodium in any other form, suggesting that ammonia production involves heterogeneous rather than homogeneous catalysis.

  17. Two-Step Process To Create "Roll-Off" Superamphiphobic Paper Surfaces.

    PubMed

    Jiang, Lu; Tang, Zhenguan; Clinton, Rahmat M; Breedveld, Victor; Hess, Dennis W

    2017-03-15

    Surface modification of cellulose-based paper, which displays roll-off properties for water and oils (surface tension ≥23.8 mN·m(-1)) and good repellency toward n-heptane (20.1 mN·m(-1)), is reported. Droplets of water, diiodomethane, motor oil, hexadecane, and decane all "bead up", i.e., exhibit high contact angles, and roll off the treated surface under the influence of gravity. Unlike widely used approaches that rely on the deposition of nanoparticles or electrospun nanofibers to create superamphiphobic surfaces, our method generates a hierarchical structure as an inherent property of the substrate and displays good adhesion between the film and substrate. The two-step combination of plasma etching and vapor deposition used in this study enables fine-tuning of the nanoscale roughness and thereby facilitates enhanced fundamental understanding of the effect of micro- and nanoscale roughness on the paper wetting properties. The surfaces maintain their "roll-off" properties after dynamic impact tests, demonstrating their mechanical robustness. Furthermore, the superamphiphobic paper has high gas permeability due to pore-volume enhancement by plasma etching but maintains the mechanical flexibility and strength of untreated paper, despite the presence of nanostructures. The unique combination of the chemical and physical properties of the resulting superamphiphobic paper is of practical interest for a range of applications such as breathable and disposable medical apparel, antifouling biomedical devices, antifingerprint paper, liquid packaging, microfluidic devices, and medical testing strips through a simple surface etching plus coating process.

  18. Effective learning environments - the process of creating and maintaining an online continuing education tool.

    PubMed

    Davies, Sharon; Lorello, Gianni Roberto; Downey, Kristi; Friedman, Zeev

    2017-01-01

    Continuing medical education (CME) is an indispensable part of maintaining physicians' competency. Since attending conferences requires clinical absenteeism and is not universally available, online learning has become popular. The purpose of this study is to conduct a retrospective analysis examining the creation process of an anesthesia website for adherence to the published guidelines and, in turn, provide an illustration of developing accredited online CME. Using Kern's guide to curriculum development, our website analysis confirmed each of the six steps was met. As well, the technical design features are consistent with the published literature on efficient online educational courses. Analysis of the database from 3937 modules and 1628 site evaluations reveals the site is being used extensively and is effective as demonstrated by the participants' examination results, content evaluations and reports of improvements in patient management. Utilizing technology to enable distant learning has become a priority for many educators. When creating accredited online CME programs, course developers should understand the educational principles and technical design characteristics that foster effective online programs. This study provides an illustration of incorporating these features. It also demonstrates significant participation in online CME by anesthesiologists and highlights the need for more accredited programs.

  19. Five Important Lessons I Learned during the Process of Creating New Child Care Centers

    ERIC Educational Resources Information Center

    Whitehead, R. Ann

    2005-01-01

    In this article, the author describes her experiences of developing new child care sites and offers five important lessons that she learned through her experiences which helped her to create successful child care centers. These lessons include: (1) Finding an appropriate area and location; (2) Creating realistic financial projections based on real…

  1. Benchmarking monthly homogenization algorithms

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets, modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real-world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study, as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics, including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed using both the individual station series and the network average regional series. The performance of the contributions depends significantly on the error metric considered; contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data.
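
    The benchmark construction can be illustrated with a toy version (assumed break rate and shift sizes, not HOME's exact settings): breakpoints arrive as a Poisson process, and each adds a normally distributed step change to the remainder of the series.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def insert_breaks(series, breaks_per_year=1 / 15, size_sd=0.8):
        """Insert random break-type inhomogeneities: the number of breaks
        is Poisson, positions are uniform, and shift sizes are normal."""
        n_breaks = rng.poisson(breaks_per_year * len(series) / 12)
        positions = np.sort(rng.integers(1, len(series), size=n_breaks))
        shifted = series.copy()
        for pos in positions:
            shifted[pos:] += rng.normal(0.0, size_sd)  # step change from pos on
        return shifted, positions

    # Toy homogeneous monthly temperature series: seasonal cycle plus noise.
    months = np.arange(100 * 12)
    homogeneous = 10 + 8 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 1, months.size)

    inhomogeneous, breaks = insert_breaks(homogeneous)
    print("breakpoints inserted at months:", breaks)
    ```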

  2. Detailed homogeneous abundance studies of 14 Galactic s-process enriched post-AGB stars: In search of lead (Pb)

    NASA Astrophysics Data System (ADS)

    De Smedt, K.; Van Winckel, H.; Kamath, D.; Siess, L.; Goriely, S.; Karakas, A. I.; Manick, R.

    2016-03-01

    Context. This paper is part of a larger project in which we systematically study the chemical abundances of Galactic and extragalactic post-asymptotic giant branch (post-AGB) stars. The goal at large is to provide improved observational constraints for models of the complex interplay between AGB s-process nucleosynthesis and the associated mixing processes. Aims: Lead (Pb) is the final product of s-process nucleosynthesis and is predicted to have large overabundances with respect to other s-process elements in AGB stars of low metallicity. However, Pb abundance studies of s-process enriched post-AGB stars in the Magellanic Clouds show a discrepancy between observed and predicted Pb abundances: the upper limits determined from spectral studies are much lower than predicted. In this paper, we focus specifically on the Pb abundance of 14 Galactic s-process enhanced post-AGB stars to check whether the same discrepancy is present in the Galaxy as well. Among these 14 objects, two had not yet been subject to a detailed abundance study in the literature; we apply the same method to obtain accurate abundances for the 12 others. Our homogeneous abundance results provide the input for detailed spectral synthesis computations in the spectral regions where Pb lines are located. Methods: We used high-resolution UVES and HERMES spectra for detailed spectral abundance studies of our sample of Galactic post-AGB stars. None of the sample stars displays clear Pb lines, so we deduced only upper limits of the Pb abundance by using spectrum synthesis in the spectral ranges of the strongest Pb lines. Results: We do not find any clear evidence of Pb overabundances in our sample. The derived upper limits are strongly correlated with the effective temperature of the stars, with increasing upper limits for increasing effective temperatures; we obtain stronger Pb constraints on the cooler objects. Moreover, we confirm the s-process enrichment and carbon enhancement of the two objects not previously studied in detail.

  3. Effects of ultrasonication and conventional mechanical homogenization processes on the structures and dielectric properties of BaTiO3 ceramics.

    PubMed

    Akbas, Hatice Zehra; Aydin, Zeki; Yilmaz, Onur; Turgut, Selvin

    2017-01-01

    The effects of the homogenization process on the structures and dielectric properties of pure and Nb-doped BaTiO3 ceramics have been investigated using ultrasonic homogenization and conventional mechanical methods. The reagents were homogenized using an ultrasonic processor with high-intensity ultrasonic waves and, separately, using a compact mixer-shaker. The components and crystal types of the powders were determined by Fourier-transform infrared spectroscopy (FTIR) and X-ray diffraction (XRD) analyses. The complex permittivity (ε′, ε″) and AC conductivity (σ′) of the samples were analyzed over a wide frequency range of 20 Hz to 2 MHz at room temperature. The structures and dielectric properties of pure and Nb-doped BaTiO3 ceramics strongly depend on the homogenization process in a solid-state reaction method. Using an ultrasonic processor with high-intensity ultrasonic waves, based on the acoustic cavitation phenomenon, can yield a significant improvement in producing high-purity BaTiO3 ceramics without carbonate impurities and with a small dielectric loss.

  4. Design Process for Online Websites Created for Teaching Turkish as a Foreign Language in Web Based Environments

    ERIC Educational Resources Information Center

    Türker, Fatih Mehmet

    2016-01-01

    In today's world, where online learning environments have increased their efficiency in education and training, the design of the websites prepared for education and training purposes has become an important process. This study is about the teaching process of the online learning environments created to teach Turkish in web based environments, and…

  5. Orthogonality Measurement for Homogenous Projects-Bases

    ERIC Educational Resources Information Center

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  6. Near InfraRed Spectroscopy homogeneity evaluation of complex powder blends in a small-scale pharmaceutical preformulation process, a real-life application.

    PubMed

    Storme-Paris, I; Clarot, I; Esposito, S; Chaumeil, J C; Nicolas, A; Brion, F; Rieutord, A; Chaminade, P

    2009-05-01

    Near InfraRed Spectroscopy (NIRS) is a potentially powerful tool for assessing the homogeneity of industrial powder blends. In the particular context of hospital manufacturing, we considered introducing the technique at a small pharmaceutical process scale, with the objective of following blend homogeneity in mixtures of seven components. This article investigates the performance of various NIRS-based methodologies for assessing powder blending. The formulation studied is prescribed in a haematology unit as part of the treatment for digestive decontamination in children receiving stem-cell transplantation. It is composed of the active pharmaceutical ingredients (APIs) colimycin and tobramycin and five excipients. We evaluated 39 different blends composing 14 different formulations, with uncorrelated proportions of constituents between these 14 formulations. The reference methods used to establish the NIRS models were gravimetry and a High Performance Liquid Chromatography method coupled to Evaporative Light Scattering Detection. Unsupervised and supervised, qualitative and quantitative chemometric methods were applied to assess powder blend homogeneity using a benchtop instrument equipped with an optical fibre. For qualitative evaluations, unsupervised Moving Block Standard Deviation, autocorrelation functions and Partial Least Square Discriminant Analysis (PLS-DA) were used. For quantitative evaluations, Partial Least Square Cross-Validated models were chosen. Results are expressed as API and major excipient percentages of theoretical values as a function of blending time. The 14 different formulations were satisfactorily discriminated only by supervised algorithms, such as an optimised PLS-DA model. The homogeneity state was demonstrated after 16 min of blending, quantifying three components with a precision between 1.2% and 1.4% w/w. This study demonstrates, for the first time, the effective implementation of NIRS for blend homogeneity evaluation.
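
    The unsupervised Moving Block Standard Deviation criterion is straightforward to sketch (toy data; the numbers are illustrative, not the study's): the blend is judged homogeneous once the block-wise spectral standard deviation levels off at a low value.

    ```python
    import numpy as np

    def mbsd(spectra, block=10):
        """Moving Block Standard Deviation: for each window of consecutive
        spectra, take the SD at every wavelength and average across
        wavelengths; a low plateau indicates a homogeneous blend."""
        spectra = np.asarray(spectra)
        return np.array([
            spectra[i:i + block].std(axis=0).mean()
            for i in range(len(spectra) - block + 1)
        ])

    # Toy NIR data: 120 scans x 600 wavelengths whose scan-to-scan
    # variability decays as blending proceeds.
    rng = np.random.default_rng(1)
    t = np.arange(120)[:, None]
    spectra = rng.normal(0.0, 0.05 + 0.5 * np.exp(-t / 20.0), size=(120, 600))

    profile = mbsd(spectra)
    print("MBSD at start vs end:", profile[0], profile[-1])  # decreasing
    ```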

  7. Homogeneous Atomic Fermi Gases

    NASA Astrophysics Data System (ADS)

    Mukherjee, Biswaroop; Yan, Zhenjie; Patel, Parth B.; Hadzibabic, Zoran; Yefsah, Tarik; Struck, Julian; Zwierlein, Martin W.

    2017-03-01

    We report on the creation of homogeneous Fermi gases of ultracold atoms in a uniform potential. In the momentum distribution of a spin-polarized gas, we observe the emergence of the Fermi surface and the saturated occupation of one particle per momentum state: the striking consequence of Pauli blocking in momentum space for a degenerate gas. Cooling a spin-balanced Fermi gas at unitarity, we create homogeneous superfluids and observe spatially uniform pair condensates. For thermodynamic measurements, we introduce a hybrid potential that is harmonic in one dimension and uniform in the other two. The spatially resolved compressibility reveals the superfluid transition in a spin-balanced Fermi gas, saturation in a fully polarized Fermi gas, and strong attraction in the polaronic regime of a partially polarized Fermi gas.

  8. Creating Joint Attentional Frames and Pointing to Evidence in the Reading and Writing Process

    ERIC Educational Resources Information Center

    Unger, John A.; Liu, Rong; Scullion, Vicki A.

    2015-01-01

    This theory-into-practice paper integrates Tomasello's concept of Joint Attentional Frames and well-known ideas related to the work of Russian psychologist, Lev Vygotsky, with more recent ideas from social semiotics. Classroom procedures for incorporating student-created Joint Attentional Frames into literacy lessons are explained by links to…

  9. Simulation of the Vapor Intrusion Process for Non-Homogeneous Soils Using a Three-Dimensional Numerical Model.

    PubMed

    Bozkurt, Ozgur; Pennell, Kelly G; Suuberg, Eric M

    2009-01-01

    This paper presents model simulation results of vapor intrusion into structures built atop sites contaminated with volatile or semi-volatile chemicals of concern. A three-dimensional finite element model was used to investigate the importance of factors that could influence vapor intrusion when the site is characterized by non-homogeneous soils. Model simulations were performed to examine how soil layers of differing properties alter soil gas concentration profiles and vapor intrusion rates into structures. The results illustrate difference in soil gas concentration profiles and vapor intrusion rates between homogeneous and layered soils. The findings support the need for site conceptual models to adequately represent the site's geology when conducting site characterizations, interpreting field data and assessing the risk of vapor intrusion at a given site. For instance, in layered geologies, a lower permeability and diffusivity soil layer between the source and building often limits vapor intrusion rates, even if a higher permeability layer near the foundation permits increased soil gas flow rates into the building. In addition, the presence of water-saturated clay layers can considerably influence soil gas concentration profiles. Therefore, interpreting field data without accounting for clay layers in the site conceptual model could result in inaccurate risk calculations. Important considerations for developing more accurate conceptual site models are discussed in light of the findings.
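
    The limiting effect of a low-diffusivity layer can be seen even in a drastically simplified steady-state, one-dimensional diffusion calculation (a sketch only, not the paper's 3-D finite element model; all values are invented): layers act as resistances in series.

    ```python
    # Steady 1-D diffusion through soil layers in series:
    # flux J = C_source / sum(L_i / D_eff_i), with zero concentration at the top.
    layers = [
        {"name": "sand", "L": 2.0, "D_eff": 5e-6},  # thickness (m), m^2/s
        {"name": "clay", "L": 0.5, "D_eff": 1e-8},
    ]
    c_source = 100.0  # source vapor concentration, g/m^3 (assumed)

    flux = c_source / sum(l["L"] / l["D_eff"] for l in layers)
    flux_sand_only = c_source / (layers[0]["L"] / layers[0]["D_eff"])

    print(f"two-layer flux: {flux:.2e} g/m^2/s")
    print(f"sand-only flux: {flux_sand_only:.2e} g/m^2/s")
    # The clay layer dominates the resistance and sharply limits the flux.
    ```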

  10. Lycopene degradation, isomerization and in vitro bioaccessibility in high pressure homogenized tomato puree containing oil: effect of additional thermal and high pressure processing.

    PubMed

    Knockaert, Griet; Pulissery, Sudheer K; Colle, Ines; Van Buggenhout, Sandy; Hendrickx, Marc; Loey, Ann Van

    2012-12-01

    In the present study, the effect of equivalent thermal and high pressure processes at pasteurization and sterilization intensities on some health related properties of high pressure homogenized tomato puree containing oil were investigated. Total lycopene concentration, cis-lycopene content and in vitro lycopene bioaccessibility were examined as health related properties. Results showed that pasteurization hardly affected the health related properties of tomato puree. Only the formation of cis-lycopene during intense thermal pasteurization was observed. Sterilization processes on the other hand had a significant effect on the health related properties. A significant decrease in total lycopene concentration was found after the sterilization processes. Next to degradation, significant isomerization was also observed: all-trans-lycopene was mainly converted to 9-cis- and 13-cis-lycopene. High pressure sterilization limited the overall lycopene isomerization, when compared to the equivalent thermal sterilization processes. The formation of 5-cis-lycopene on the other hand seemed to be favoured by high pressure. The in vitro lycopene bioaccessibility of high pressure homogenized tomato puree containing oil was decreased during subsequent thermal or high pressure processing, whereby significant changes were observed for all the sterilization processes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Method of removing the effects of electrical shorts and shunts created during the fabrication process of a solar cell

    DOEpatents

    Nostrand, Gerald E.; Hanak, Joseph J.

    1979-01-01

    A method is disclosed for removing the effects of electrical shorts and shunts created during the fabrication process, thereby improving the performance of a solar cell having a thick-film cermet electrode opposite the incident surface: a reverse bias voltage is applied that is of sufficient magnitude to burn out the electrical shorts and shunts but less than the breakdown voltage of the solar cell.

  13. The difference between implicit and explicit associative processes at study in creating false memory in the DRM paradigm.

    PubMed

    Kawasaki, Yayoi; Yama, Hiroshi

    2006-01-01

    The effects of implicit and explicit associative processes on false recognition were examined by manipulating the exposure duration of studied items: 20 ms or 2000 ms. Participants studied lists of words that were high associates of a nonpresented word (the critical lure) in either condition. After learning each list, they completed a recognition test with remember/know judgements, either immediately (Experiment 1) or 1 minute later (Experiment 2). In Experiment 1, know responses for critical lures were more frequent in the 20 ms condition than in the 2000 ms condition, whereas remember responses were more frequent in the 2000 ms condition: implicit associative processes create familiarity with critical lures, while explicit associative processes create the details of false memories. Comparing the results of Experiment 1 with those of Experiment 2, remember responses for critical lures increased over the retention interval only in the 20 ms condition. Thus the character of false memories created by implicit associative processes can change with the passage of time.

  14. Disruption of Pseudomonas putida by high pressure homogenization: a comparison of the predictive capacity of three process models for the efficient release of arginine deiminase.

    PubMed

    Patil, Mahesh D; Patel, Gopal; Surywanshi, Balaji; Shaikh, Naeem; Garg, Prabha; Chisti, Yusuf; Banerjee, Uttam Chand

    2016-12-01

    Disruption of Pseudomonas putida KT2440 by high-pressure homogenization in a French press is discussed for the release of arginine deiminase (ADI). The enzyme-release response of the disruption process was modelled for the experimental factors of biomass concentration in the broth being disrupted, the homogenization pressure and the number of passes of the cell slurry through the homogenizer. For the same data, the response surface method (RSM), the artificial neural network (ANN) and the support vector machine (SVM) models were compared for their ability to predict the performance parameters of the cell disruption. The ANN model proved best for predicting the ADI release, while the fractional disruption of the cells was best modelled by the RSM. The fraction of the cells disrupted depended mainly on the operating pressure of the homogenizer, whereas the concentration of biomass in the slurry was the most influential factor in determining the total protein release. Nearly 27 U/mL of ADI was released within a single pass from a slurry with a biomass concentration of 260 g/L at an operating pressure of 510 bar. Using a biomass concentration of 100 g/L, the ADI release by the French press was 2.7-fold greater than in a conventional high-speed bead mill, and the total protein release was 5.8-fold greater. Statistical analysis of completely unseen data showed ANN and SVM modelling to be proficient alternatives to RSM for prediction and generalization of the cell disruption process in a French press.
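
    The model-comparison strategy in this record generalizes readily; a minimal sketch (hypothetical data and factor ranges, not the paper's dataset) pits a quadratic response surface against a small neural network and a support vector regressor using cross-validated R²:

```python
# Sketch of an RSM-vs-ANN-vs-SVM comparison on simulated disruption data.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# factors: biomass (g/L), pressure (bar), number of passes -- hypothetical
X = rng.uniform([50, 200, 1], [300, 1000, 5], size=(60, 3))
# hypothetical ADI release (U/mL) with measurement noise
y = 0.05 * X[:, 0] + 0.02 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 2, 60)

models = {
    "RSM (quadratic)": make_pipeline(PolynomialFeatures(2), LinearRegression()),
    "ANN": make_pipeline(StandardScaler(),
                         MLPRegressor((8,), max_iter=5000, random_state=0)),
    "SVM": make_pipeline(StandardScaler(), SVR(C=10.0)),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean cross-validated R^2 = {r2:.2f}")
```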

  15. Atomic processes in plasmas created by an ultra-short laser pulse

    NASA Astrophysics Data System (ADS)

    Audebert, P.; Lecherbourg, L.; Bastiani-Ceccotti, S.; Geindre, J.-P.; Blancard, C.; Cossé, P.; Faussurier, G.; Shepherd, R.; Renaudin, P.

    2008-05-01

    Point projection K-shell absorption spectroscopy has been used to measure absorption spectra of a transient aluminum plasma created by an ultra-short laser pulse. The 1s-2p and 1s-3p absorption lines of weakly ionized aluminum were measured over an extended range of densities in a relatively low-temperature regime. Independent plasma characterization was obtained from a frequency-domain interferometry (FDI) diagnostic, allowing the interpretation of the absorption spectra in terms of spectral opacities. The experimental spectra are compared with opacity calculations using the density and temperature inferred from the analysis of the FDI data.

  16. All varieties of encoding variability are not created equal: Separating variable processing from variable tasks

    PubMed Central

    Huff, Mark J.; Bodner, Glen E.

    2014-01-01

    Whether encoding variability facilitates memory is shown to depend on whether item-specific and relational processing are both performed across study blocks, and whether study items are weakly versus strongly related. Variable-processing groups studied a word list once using an item-specific task and once using a relational task. Variable-task groups’ two different study tasks recruited the same type of processing each block. Repeated-task groups performed the same study task each block. Recall and recognition were greatest in the variable-processing group, but only with weakly related lists. A variable-processing benefit was also found when task-based processing and list-type processing were complementary (e.g., item-specific processing of a related list) rather than redundant (e.g., relational processing of a related list). That performing both item-specific and relational processing across trials, or within a trial, yields encoding-variability benefits may help reconcile decades of contradictory findings in this area. PMID:25018583

  17. Measurement of USMC Logistics Processes: Creating a Baseline to Support Precision Logistics Implementation

    DTIC Science & Technology

    1998-01-01

    unavailability of parts. ORDER AND SHIP TIMES FROM RETAIL SUPPLY We turn now to the supply of parts, beginning with measurement of the order and ship (O&S)...Point, according to archived supply data. Defining the Order and Ship Process from Retail Supply The retail O&S process begins with the identification...take more than two weeks for the entire O&S process, even though backorders are not at issue here. What is not clear from these results is what

  18. Process scale-up considerations for non-thermal atmospheric-pressure plasma synthesis of nanoparticles by homogeneous nucleation

    NASA Astrophysics Data System (ADS)

    Cole, Jonathan; Zhang, Yao; Liu, Tianqi; Liu, Chang-jun; Mohan Sankaran, R.

    2017-08-01

    Scale-up of non-thermal atmospheric-pressure plasma reactors for the synthesis of nanoparticles by homogeneous nucleation is challenging: the active volume is typically kept small to facilitate gas breakdown, enhance discharge stability, and limit particle size and agglomeration, but a small volume also limits throughput. Here, we introduce a dielectric barrier discharge reactor with a coaxial electrode geometry for nanoparticle production that enables a simple scale-up strategy: increasing the outer and inner electrode diameters together increases the plasma volume approximately linearly while keeping the electrode gap small enough to maintain the electric field strength. We show with two test reactors that, for a given residence time, the nanoparticle production rate increases linearly with volume over a range of precursor concentrations, with minimal effect on the shape of the particle size distribution. However, our study also reveals that increasing the total gas flow rate in a smaller-volume reactor enhances precursor conversion and yields a production rate comparable to that of a larger-volume reactor. These results suggest that scale-up requires a better understanding of the influence of reactor geometry on particle growth dynamics and may not always be a simple function of reactor volume.
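
    The scale-up rule described here is easy to check arithmetically; the sketch below (hypothetical dimensions, not the paper's reactors) shows that with the gap and length fixed, the annular plasma volume grows essentially linearly with electrode diameter:

```python
# Annular DBD volume vs electrode diameter at a constant gap (hypothetical).
import math

def annular_volume_cm3(d_inner_mm, gap_mm, length_mm):
    r_i = d_inner_mm / 2
    r_o = r_i + gap_mm
    return math.pi * (r_o**2 - r_i**2) * length_mm * 1e-3

gap, length = 1.0, 100.0  # mm, held constant to preserve field strength
for d in (5.0, 10.0, 20.0):
    v = annular_volume_cm3(d, gap, length)
    print(f"inner diameter {d:4.1f} mm -> plasma volume {v:5.2f} cm^3")
# To hold residence time constant, gas flow must scale with volume, so
# throughput rises roughly in proportion to electrode diameter.
```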

  19. Feasibility study for producing a carrot/potato matrix reference material for 11 selected pesticides at EU MRL level: material processing, homogeneity and stability assessment.

    PubMed

    Saldanha, Helena; Sejerøe-Olsen, Berit; Ulberth, Franz; Emons, Hendrik; Zeleny, Reinhard

    2012-05-01

    The feasibility of producing a matrix reference material for selected pesticides in a carrot/potato matrix was investigated. A commercially available baby food (carrot/potato-based mash) was spiked with 11 pesticides at the respective EU maximum residue limits (MRLs) and further processed by either freezing or freeze-drying. Batches of some 150 units were produced per material type. First, the materials were assessed for the relative amount of pesticide recovered after processing (the ratio of the pesticide concentration in the processed material to the initially spiked concentration). In addition, the materials' homogeneity (bottle-to-bottle variation) and their short-term (1 month) and mid-term (5 months) stability at different temperatures were assessed. For this, an in-house validated GC-EI-MS method operated in SIM mode, with a sample preparation procedure based on the QuEChERS ("quick, easy, cheap, effective, rugged, and safe") principle, was applied. Measurements on the frozen material provided the most promising results (smallest analyte losses during production), and freeze-drying also proved to be a suitable alternative processing technique for most of the investigated pesticides. Both the frozen and the freeze-dried materials proved sufficiently homogeneous for the intended use, and storage at -20°C for 5 months did not reveal any detectable material degradation. The results constitute an important step towards the development of a pesticide matrix reference material. Copyright © 2011 Elsevier Ltd. All rights reserved.
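
    Bottle-to-bottle homogeneity of this kind is typically quantified with a one-way ANOVA across bottles (in the spirit of ISO Guide 35); a minimal sketch on simulated data, with a small injected bottle effect, recovers the between-bottle standard deviation from the mean squares:

```python
# One-way ANOVA homogeneity check on simulated pesticide data (mg/kg).
import numpy as np

rng = np.random.default_rng(1)
n_bottles, n_reps = 10, 3
bottle_effect = rng.normal(0, 0.002, (n_bottles, 1))      # hypothetical
data = 0.10 + bottle_effect + rng.normal(0, 0.004, (n_bottles, n_reps))

bottle_means = data.mean(axis=1)
grand_mean = data.mean()
ms_between = n_reps * ((bottle_means - grand_mean) ** 2).sum() / (n_bottles - 1)
ms_within = ((data - bottle_means[:, None]) ** 2).sum() / (n_bottles * (n_reps - 1))

# between-bottle variance component; clipped at zero if MSB < MSW
s_bb = np.sqrt(max((ms_between - ms_within) / n_reps, 0.0))
print(f"between-bottle SD = {s_bb:.4f} mg/kg, "
      f"within-bottle SD = {np.sqrt(ms_within):.4f} mg/kg")
```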

  20. Rethinking Communication in Innovation Processes: Creating Space for Change in Complex Systems

    ERIC Educational Resources Information Center

    Leeuwis, Cees; Aarts, Noelle

    2011-01-01

    This paper systematically rethinks the role of communication in innovation processes, starting from largely separate theoretical developments in communication science and innovation studies. Literature review forms the basis of the arguments presented. The paper concludes that innovation is a collective process that involves the contextual…

  2. 36 CFR 1237.26 - What materials and processes must agencies use to create audiovisual records?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... photographic film does not exceed 0.014 grams per square meter. (2) Require laboratories to process film in accordance with this standard. Process color film in accordance with the manufacturer's recommendations. (3...) version of each image must be comparable in quality to a 35 mm film photograph or better, and must...

  3. Creating Trauma-Informed Child Welfare Systems Using a Community Assessment Process

    ERIC Educational Resources Information Center

    Hendricks, Alison; Conradi, Lisa; Wilson, Charles

    2011-01-01

    This article describes a community assessment process designed to evaluate a specific child welfare jurisdiction based on the current definition of trauma-informed child welfare and its essential elements. This process has recently been developed and pilot tested within three diverse child welfare systems in the United States. The purpose of the…

  4. Creating Space Force Structure Through Strategic Planning: The Air Force Reserve Visioning Process

    DTIC Science & Technology

    2007-11-02

    strategic visioning and strategic planning processes together to achieve military mission objectives. It focuses on our current National and Military...dissimilar and similar organizations to work together through the strategic planning process to achieve common objectives. Finally, a case study will be

  6. The Contribution of Prefrontal Executive Processes to Creating a Sense of Self

    PubMed Central

    Hirstein, William

    2011-01-01

    According to several current theories, executive processes help achieve various mental actions such as remembering, planning and decision-making, by executing cognitive operations on representations held in consciousness. I plan to argue that these executive processes are partly responsible for our sense of self, because of the way they produce the impression of an active, controlling presence in consciousness. If we examine what philosophers have said about the “ego” (Descartes), “the Self” (Locke and Hume), the “self of all selves” (William James), we will find that it fits what is now known about executive processes. Hume, for instance, famously argued that he could not detect the self in consciousness, and this would correspond to the claim (made by Crick and Koch, for instance) that we are not conscious of the executive processes themselves, but rather of their results. PMID:21694967

  7. 36 CFR 1237.26 - What materials and processes must agencies use to create audiovisual records?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... photographic film does not exceed 0.014 grams per square meter. (2) Require laboratories to process film in... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT AUDIOVISUAL, CARTOGRAPHIC, AND...

  8. Creating Resilient Children and Empowering Families Using a Multifamily Group Process.

    ERIC Educational Resources Information Center

    Sayger, Thomas V.

    1996-01-01

    Presents a model for prevention and early intervention using a multifamily group counseling process to increase the resiliency of children and to empower families living with multiple stressors in high-risk environments. (Author)

  9. Creating low vision and nonvisual instructions for diabetes technology: an empirically validated process.

    PubMed

    Williams, Ann S

    2012-03-01

    Nearly 20% of the adults with diagnosed diabetes in the United States also have visual impairment. Many individuals in this group perform routine diabetes self-management tasks independently, often using technology that was not specifically designed for use by people with visual impairment (e.g., insulin pumps and pens). Equitable care for persons with disabilities requires providing instructions in formats accessible for nonreaders. However, instructions in accessible formats, such as recordings, braille, or digital documents that are legible to screen readers, are seldom available. This article includes a summary of existing guidelines for creating accessible documents. The guidelines are followed by a description of the production of accessible nonvisual instructions for use of insulin pens used in a study of dosing accuracy. The study results indicate that the instructions were used successfully by 40 persons with visual impairment. Instructions in accessible formats can increase access to the benefits of diabetes technology for persons with visual impairment. Recorded instructions may also be useful to sighted persons who do not read well, such as those with dyslexia, low literacy, or who use English as a second language. Finally, they may have important benefits for fully sighted people who find it easier to learn to use technology by handling the equipment while listening to instructions. Manufacturers may also benefit from marketing to an increased pool of potential users. © 2012 Diabetes Technology Society.

  10. BrainK for Structural Image Processing: Creating Electrical Models of the Human Head.

    PubMed

    Li, Kai; Papademetris, Xenophon; Tucker, Don M

    2016-01-01

    BrainK is a set of automated procedures for characterizing the tissues of the human head from MRI, CT, and photogrammetry images. The tissue segmentation and cortical surface extraction support the primary goal of modeling the propagation of electrical currents through head tissues with a finite difference model (FDM) or finite element model (FEM) created from the BrainK geometries. The electrical head model is necessary for accurate source localization of dense array electroencephalographic (dEEG) measures from head surface electrodes. It is also necessary for accurate targeting of cerebral structures with transcranial current injection from those surface electrodes. BrainK must achieve five major tasks: image segmentation, registration of the MRI, CT, and sensor photogrammetry images, cortical surface reconstruction, dipole tessellation of the cortical surface, and Talairach transformation. We describe the approach to each task, and we compare the accuracies for the key tasks of tissue segmentation and cortical surface extraction in relation to existing research tools (FreeSurfer, FSL, SPM, and BrainVisa). BrainK achieves good accuracy with minimal or no user intervention, it deals well with poor quality MR images and tissue abnormalities, and it provides improved computational efficiency over existing research packages.

  12. Rapid monitoring for the enhanced definition and control of a selective cell homogenate purification by a batch-flocculation process.

    PubMed

    Habib, G; Zhou, Y; Hoare, M

    2000-10-20

    Downstream bioprocess operations, for example selective flocculation, are inherently variable due to fluctuations in feed material, equipment performance, and the quality of additives such as flocculating agents. Because of these fluctuations in operating conditions, some form of process control is essential for reproducible and satisfactory process performance and hence product quality. Both the product (alcohol dehydrogenase) and key contaminants (RNA, protein, cell debris) within a Saccharomyces cerevisiae system were monitored in real time, adopting an at-line enzymatic reaction and a rapid UV-VIS spectral-analysis technique every 135 seconds. The real-time measurements were implemented within two control configurations to regulate the batch-flocculation process according to prespecified control objectives, using the flocculant dose as the sole manipulated variable. An adaptive, model-based control arrangement was studied, which combined the rapid measurements with a process model and two model parameter-identification techniques for real-time prediction of process behavior. Based on an up-to-date mathematical description of the flocculation system, process optimization was attained, and subsequent feedback control to this optimum operating set point was reproducibly demonstrated with 92% accuracy. A simpler control configuration was also investigated, adopting the cell debris concentration as the control variable. Both control arrangements resulted in superior flocculation-process performance in terms of contaminant removal, product recovery, and excess flocculant usage compared to an uncontrolled system.
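
    The simpler of the two control configurations lends itself to a compact illustration: measure, compare with the setpoint, and correct the flocculant dose. The loop below is a toy proportional controller with a hypothetical saturating process model, not the authors' model-based scheme:

```python
# Toy feedback loop: drive cell-debris concentration to a setpoint by
# adjusting the flocculant dose after each rapid at-line measurement.
def debris_after_dose(dose_mg_per_g):
    # hypothetical process: debris removal saturates with dose
    return 10.0 / (1.0 + 0.8 * dose_mg_per_g)   # g/L remaining

setpoint, dose, gain = 2.0, 1.0, 0.5            # g/L target; initial dose

for cycle in range(8):
    debris = debris_after_dose(dose)            # the "135-second" measurement
    error = debris - setpoint
    dose = max(dose + gain * error, 0.0)        # proportional correction
    print(f"cycle {cycle}: debris {debris:.2f} g/L -> dose {dose:.2f} mg/g")
```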

  13. Study of stirred layers on 316L steel created by friction stir processing

    NASA Astrophysics Data System (ADS)

    Langlade, C.; Roman, A.; Schlegel, D.; Gete, E.; Folea, M.

    2014-08-01

    Nanostructured materials are known to exhibit attractive properties, especially in the mechanical field, where high hardness is of great interest. The friction stir process (FSP) is a recent surface-engineering technique derived from the friction stir welding (FSW) method. In this study, the FSP of a 316L austenitic stainless steel has been evaluated. The treated layers have been characterized in terms of hardness and microstructure, and these results have been related to the FSP operating parameters. The process has been analysed using a response surface method (RSM) to enable prediction of the stirred-layer thickness.
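
    A response-surface fit of this kind amounts to least-squares estimation of a second-order polynomial in the process parameters; the sketch below (hypothetical FSP runs and parameter names, not the study's data) fits such a surface and predicts the stirred-layer thickness at an untried setting:

```python
# Quadratic response surface for stirred-layer thickness (hypothetical data).
import numpy as np

# columns: rotation speed (rpm), traverse speed (mm/min), thickness (um)
runs = np.array([
    [ 800,  50, 120], [ 800, 100,  95], [ 800, 150,  80],
    [1000,  50, 150], [1000, 100, 125], [1000, 150, 100],
    [1200,  50, 170], [1200, 100, 140], [1200, 150, 115],
], dtype=float)

x1, x2, y = runs[:, 0], runs[:, 1], runs[:, 2]
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # full quadratic model

def predict(r, t):
    return coef @ np.array([1, r, t, r * t, r**2, t**2])

print(f"predicted thickness at 1100 rpm, 80 mm/min: {predict(1100, 80):.0f} um")
```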

  14. Creating an Administrative Structure to Support Faculty Governance: A Participatory Process.

    ERIC Educational Resources Information Center

    Littlefield, Vivian M.

    1989-01-01

    Principles of organizational change are examined as they apply to academic units in general, and the way in which one well-established academic department in nursing changed its administrative structure is described. The process used faculty participation in decision-making. (Author/MSE)

  15. Creating Professional Communities in Schools through Organizational Learning: An Evaluation of a School Improvement Process.

    ERIC Educational Resources Information Center

    Scribner, Jay Paredes; Cockrell, Karen Sunday; Cockrell, Dan H.; Valentine, Jerry W.

    1999-01-01

    Analyzes a school-improvement process's potential to foster professional community in three rural middle schools through organizational learning. Findings of a two-year qualitative case study reveal bureaucracy/community tensions and isolate four influential community-building factors: principal leadership, organizational history, organizational…

  16. Creating Sustainable Education Projects in Roatán, Honduras through Continuous Process Improvement

    ERIC Educational Resources Information Center

    Raven, Arjan; Randolph, Adriane B.; Heil, Shelli

    2010-01-01

    The investigators worked together with permanent residents of Roatán, Honduras on sustainable initiatives to help improve the island's troubled educational programs. Our initiatives focused on increasing the number of students eligible and likely to attend a university. Using a methodology based in continuous process improvement, we developed…

  17. Not All Analogies Are Created Equal: Associative and Categorical Analogy Processing following Brain Damage

    ERIC Educational Resources Information Center

    Schmidt, Gwenda L.; Cardillo, Eileen R.; Kranjec, Alexander; Lehet, Matthew; Widick, Page; Chatterjee, Anjan

    2012-01-01

    Current research on analogy processing assumes that different conceptual relations are treated similarly. However, just as words and concepts are related in distinct ways, different kinds of analogies may employ distinct types of relationships. An important distinction in how words are related is the difference between associative (dog-bone) and…

  19. Partners in Process: How Museum Educators and Classroom Teachers Can Create Outstanding Results

    ERIC Educational Resources Information Center

    Moisan, Heidi

    2009-01-01

    Collaborative processes by nature are not neat and tidy; and if mismanaged, they can lead to chaos rather than creative productivity. However, when a museum and a group of teachers establish a respectful peer community that maximizes all the members talents, truly impactful teaching and learning result. This article analyzes the "Great…

  20. Microstructure and homogeneity of semi-solid 7075 aluminum tubes processed by parallel tubular channel angular pressing

    NASA Astrophysics Data System (ADS)

    Meshkabadi, Ramin; Faraji, Ghader; Javdani, Akbar; Fata, Ali; Pouyafar, Vahid

    2017-09-01

    Semi-solid processing is a developing technology for the fabrication of intricate parts at lower temperatures than conventional casting routes. In this research, parallel tubular channel angular pressing (PTCAP) was used as a strain-inducing pre-treatment, and the influence of the number of PTCAP passes and the semi-solid heating parameters on the microstructural characteristics of Al7075 tubes was investigated. The results demonstrated that the PTCAP process can successfully serve as a pre-straining method for tubular samples destined for semi-solid microstructures. As the temperature increased, the solid particle size first decreased, then increased, and the particles gradually became more spherical. The most suitable condition for the subsequent semi-solid forming process was obtained at a reheating temperature of 620 °C, as confirmed by the uniform distribution of grains. Both the grain size and the shape factor increased with increasing holding time. The distribution of the solid particles was strongly dependent on the holding time and became more uniform as the holding time increased. Comparison of the grain size and shape factor between ECAPed and PTCAPed samples revealed the high capacity of the PTCAP process as a strain-inducing stage. The two-pass PTCAP samples also exhibited higher hardness in the semi-solid state than the one-pass samples.

  1. Novel Processing for Creating 3D Architectured Porous Shape Memory Alloy

    DTIC Science & Technology

    2013-03-01

    together with HIPing, liquid phase sintering was used. At 1150°C Nb forms a eutectic with binary NiTi. This eutectic is very reactive and dissolves...mechanical properties [6]. The eutectic phase has been shown to have high tensile strength bonding [7], and high wettability of NiTi. The liquid... eutectic wicks into space between powder particles, liquid sintering them together upon solidification. The liquid phase sintering processing method is

  2. The Process of Creating Integrated Home Care in Lithuania: from Idea to Reality

    PubMed Central

    Jurkuvienė, Ramunė; Butkevičienė, Rūta; Gajdosikienė, Indrė

    2016-01-01

    Background: The article presents an analysis of the formulation and implementation of a social innovation, integrated home care (IHC), in post-Soviet Lithuania. From 1998, a series of top-down orders to implement IHC were issued; however, home nursing did not start. In 2011, the Ministry of Social Security and Labour began a process to develop integrated home care using new, collaborative processes. The result was 21 pilot projects with well-conceptualized IHC services. Method: Using data from focus groups, interviews, and recorded observations, the research team systematically documented the innovation process, including themes and deviations, employing Smale's Innovation Trinity framework to organize the larger picture. Results: In the Lithuanian post-totalitarian context, top-down communication was found to be prevalent. Not only IHC, but also openness to change and dialogue at high levels, were innovations. Patient-centered practice at local levels could only occur when a new attitude of mind was reached through dialogue with officials at higher levels and between peers. Conclusions: The enactment, rather than the mask, of dialogue, together with participatory program development, was critical to the success of the IHC innovation. This is difficult to achieve in the light of antiquated public bureaucracies, but in this case the Ministry team, rather than avoiding the expectation of top-down communication, turned it into an asset through the promotion of collaboration.

  3. Biorefinery cascade processing for creating added value on tomato industrial by-products from Tunisia.

    PubMed

    Kehili, Mouna; Schmidt, Lisa Marie; Reynolds, Wienke; Zammel, Ayachi; Zetzl, Carsten; Smirnova, Irina; Allouche, Noureddine; Sayadi, Sami

    2016-01-01

    In today's consumer perception of industrial processes and food production, aspects such as food quality, human health, environmental safety, and energy security have become keywords. Much effort has therefore been extended toward adding value to the biowastes of agri-food industries through biorefinery processing approaches. This study focused, for the first time, on the valorization of tomato by-products from a Tunisian industry for the recovery of value-added compounds using biorefinery cascade processing. The process integrated supercritical CO2 extraction of carotenoids within the oil fractions from tomato seeds (TS) and tomato peels (TP), followed by batch isolation of protein from the residues. The remaining lignocellulosic matter from both fractions was then submitted to liquid hot water (LHW) hydrolysis. Supercritical CO2 experiments extracted 5.79% oleoresin, 410.53 mg lycopene/kg, and 31.38 mg β-carotene/kg from TP, and 26.29% oil, 27.84 mg lycopene/kg, and 5.25 mg β-carotene/kg from TS, on dry weights. Protein extraction yields approached 30% of the initial protein contents (13.28% in TP and 39.26% in TS) and revealed that TP and TS are rich sources of essential amino acids. LHW treatment run at 120-200 °C and 50 bar for 30 min showed that a temperature of 160 °C was the most suitable for cellulose and hemicellulose hydrolysis from TP and TS while keeping the degradation products low. The results indicated that tomato by-products are not only a green source of lycopene-rich oleoresin, tomato seed oil (TSO) and protein with good nutritional quality, but also a source of lignocellulosic matter with potential for bioethanol production. This study provides an important reference for the concept and feasibility of the cascade fractionation of valuable compounds from tomato industrial by-products. Graphical abstract: Schema of the biorefinery cascade processing of tomato industrial by-products toward the isolation of valuable fractions.

  4. Degradation Mechanism of Cyanobacterial Toxin Cylindrospermopsin by Hydroxyl Radicals in Homogeneous UV/H2O2 Process

    EPA Science Inventory

    The degradation of cylindrospermopsin (CYN), a widely distributed and highly toxic cyanobacterial toxin (cyanotoxin), remains poorly elucidated. In this study, the mechanism of CYN destruction by UV-254 nm/H2O2 advanced oxidation process (AOP) was investigated by mass spectrometr...
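
    Hydroxyl-radical AOPs of this kind are usually summarized with pseudo-first-order kinetics, C/C0 = exp(-k_obs * t); a minimal sketch (illustrative decay data, not the study's measurements) recovers k_obs by linear regression through the origin:

```python
# Pseudo-first-order fit for a UV/H2O2 degradation curve (hypothetical data).
import numpy as np

t = np.array([0, 2, 5, 10, 15, 20], dtype=float)          # minutes
c_over_c0 = np.array([1.0, 0.72, 0.45, 0.20, 0.09, 0.04])

# linearize: ln(C/C0) = -k_obs * t; least squares through the origin
k_obs = -(t @ np.log(c_over_c0)) / (t @ t)
print(f"k_obs = {k_obs:.3f} 1/min, half-life = {np.log(2) / k_obs:.1f} min")
```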

  6. Dimensional Methods: Dimensions, Units and the Principle of Dimensional Homogeneity. Physical Processes in Terrestrial and Aquatic Ecosystems, Applied Mathematics.

    ERIC Educational Resources Information Center

    Fletcher, R. Ian

    These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. The module is concerned with conventional techniques such as concepts of measurement,…

  8. Dynamic Disturbance Processes Create Dynamic Lek Site Selection in a Prairie Grouse.

    PubMed

    Hovick, Torre J; Allred, Brady W; Elmore, R Dwayne; Fuhlendorf, Samuel D; Hamilton, Robert G; Breland, Amber

    2015-01-01

    It is well understood that landscape processes can affect habitat selection patterns, movements, and species persistence. These selection patterns may be altered or even eliminated as a result of changes in disturbance regimes and a concomitant management focus on uniform, moderate disturbance across landscapes. To assess how restored landscape heterogeneity influences habitat selection patterns, we examined 21 years (1991, 1993-2012) of Greater Prairie-Chicken (Tympanuchus cupido) lek location data in tallgrass prairie with restored fire and grazing processes. Our study took place at The Nature Conservancy's Tallgrass Prairie Preserve located at the southern extent of Flint Hills in northeastern Oklahoma. We specifically addressed stability of lek locations in the context of the fire-grazing interaction, and the environmental factors influencing lek locations. We found that lek locations were dynamic in a landscape with interacting fire and grazing. While previous conservation efforts have treated leks as stable with high site fidelity in static landscapes, a majority of lek locations in our study (i.e., 65%) moved by nearly one kilometer on an annual basis in this dynamic setting. Lek sites were in elevated areas with low tree cover and low road density. Additionally, lek site selection was influenced by an interaction of fire and patch edge, indicating that in recently burned patches, leks were located near patch edges. These results suggest that dynamic and interactive processes such as fire and grazing that restore heterogeneity to grasslands do influence habitat selection patterns in prairie grouse, a phenomenon that is likely to apply throughout the Greater Prairie-Chicken's distribution when dynamic processes are restored. As conservation moves toward restoring dynamic historic disturbance patterns, it will be important that siting and planning of anthropogenic structures (e.g., wind energy, oil and gas) and management plans not view lek locations as static

  9. ArhiNet - A Knowledge-Based System for Creating, Processing and Retrieving Archival eContent

    NASA Astrophysics Data System (ADS)

    Salomie, Ioan; Dinsoreanu, Mihaela; Pop, Cristina; Suciu, Sorin

    This paper addresses the problem of creating, processing and querying semantically enhanced eContent from archives and digital libraries. We present an analysis of the archival domain, resulting in the creation of an archival domain model and a domain ontology core. Our system adds semantic mark-up to the content of historical documents, thus enabling document and knowledge retrieval in response to natural language, ontology-guided queries. The system functionality follows two main workflows: (i) semantically enhanced eContent generation and knowledge acquisition, and (ii) knowledge processing and retrieval. Within the first workflow, the relevant domain information is extracted from documents written in natural languages, followed by semantic annotation and domain ontology population. In the second workflow, ontologically guided natural language queries trigger reasoning processes that provide relevant search results. The paper also discusses the transformation of the OWL domain ontology into a hierarchical data model, thus providing support for efficient ontology processing.

  10. Creating a process for incorporating epidemiological modelling into outbreak management decisions.

    PubMed

    Akselrod, Hana; Mercon, Monica; Kirkeby Risoe, Petter; Schlegelmilch, Jeffrey; McGovern, Joanne; Bogucki, Sandy

    2012-01-01

    Modern computational models of infectious diseases greatly enhance our ability to understand new infectious threats and assess the effects of different interventions. The recently released CDC Framework for Preventing Infectious Diseases calls for increased use of predictive modelling of epidemic emergence for public health preparedness. Currently, the utility of these technologies in preparedness and response to outbreaks is limited by gaps between modelling output and the information requirements of incident management. The authors propose an operational structure that will facilitate the integration of modelling capabilities into action planning for outbreak management, using the Incident Command System (ICS) and Synchronization Matrix framework. It is designed to be adaptable and scalable for use by state and local planners under the National Response Framework (NRF) and Emergency Support Function #8 (ESF-8). Specific epidemiological modelling requirements are described and integrated with the core processes for public health emergency decision support. These methods can be used in checklist format to align prospective or real-time modelling output with anticipated decision points, and to guide strategic situational assessments at the community level. It is anticipated that formalising these processes will facilitate the translation of the CDC's policy guidance from theory to practice during public health emergencies involving infectious outbreaks.
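
    As a concrete illustration of the modelling output that could populate such a synchronization matrix, the sketch below integrates a basic SIR model (hypothetical parameters, not CDC guidance) and reports when prevalence crosses a planning threshold:

```python
# Minimal SIR projection: flag the day a surge-capacity threshold is crossed.
beta, gamma = 0.30, 0.10        # transmission / recovery rates (1/day)
s, i, r = 0.999, 0.001, 0.0     # initial population fractions
dt, days, threshold = 0.1, 180, 0.05

crossed = None
for step in range(int(days / dt)):
    if crossed is None and i >= threshold:
        crossed = step * dt     # a decision point for the incident commander
    ds = -beta * s * i * dt
    di = (beta * s * i - gamma * i) * dt
    s, i, r = s + ds, i + di, r - ds - di

print(f"5% prevalence reached on day {crossed:.0f}; "
      f"final attack rate ~ {r + i:.0%}")
```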

  11. Methodology for Creating UMLS Content Views Appropriate for Biomedical Natural Language Processing

    PubMed Central

    Aronson, Alan R.; Mork, James G.; Névéol, Aurélie; Shooshan, Sonya E.; Demner-Fushman, Dina

    2008-01-01

    Given the growth in UMLS Metathesaurus content and the consequent growth in language complexity, it is not surprising that NLP applications that depend on the UMLS are experiencing increased difficulty in maintaining adequate levels of performance. This phenomenon underscores the need for UMLS content views which can support NLP processing of both the biomedical literature and clinical text. We report on experiments designed to provide guidance as to whether to adopt a conservative vs. an aggressive approach to the construction of UMLS content views. We tested three conservative views and two new aggressive views against two NLP applications and found that the conservative views consistently performed better for the literature application, but the most aggressive view performed best for the clinical application. PMID:18998883
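
    In practice, a UMLS content view is realized by filtering the Metathesaurus concept file on source vocabulary, term type, and suppressibility; the sketch below assumes the standard pipe-delimited MRCONSO.RRF layout (the column indices should be checked against your UMLS release) and uses illustrative source and term-type sets:

```python
# Sketch: build a simple UMLS content view from MRCONSO.RRF.
KEEP_SOURCES = {"SNOMEDCT_US", "MSH"}   # conservative vs aggressive views
KEEP_TERM_TYPES = {"PT", "PN", "MH"}    # differ mainly in these two sets

def content_view(path="MRCONSO.RRF"):
    with open(path, encoding="utf-8") as f:
        for line in f:
            fields = line.rstrip("\n").split("|")
            # assumed indices: CUI=0, SAB=11, TTY=12, STR=14, SUPPRESS=16
            cui, sab, tty, term, suppress = (fields[0], fields[11],
                                             fields[12], fields[14],
                                             fields[16])
            if sab in KEEP_SOURCES and tty in KEEP_TERM_TYPES and suppress == "N":
                yield cui, term

# for cui, term in content_view(): index (cui, term) for the NLP application
```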

  12. Creating memory illusions: expectancy-based processing and the generation of false memories.

    PubMed

    MacRae, C Neil; Schloerscheidt, Astrid M; Bodenhausen, Galen V; Milne, Alan B

    2002-01-01

    The present research investigated the generation of memory illusions. In particular, it attempted to delineate the conditions under which category-based thinking prompts the elicitation of false memories. Noting fundamental differences in the manner in which expected and unexpected person-related information is processed and represented in the mind, it was anticipated that, via gist-based recognition, participants would display a pronounced propensity to generate expectancy-consistent false memories. The results of three experiments supported this prediction. In addition, the research revealed that participants' false memories were accompanied by the subjective experience of knowing (Expt. 2) and that false recognition was exacerbated under conditions of executive dysfunction (Expt. 3). We consider the theoretical implications of these findings for recent treatments of memory illusions and social cognition.

  13. Description of the process used to create the 1992 Hanford Mortality Study database

    SciTech Connect

    Gilbert, E.S.; Buchanan, J.A.; Holter, N.A.

    1992-12-01

    An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files, maintained by the Hanford Environmental Health Foundation (HEHF) and including demographic data and job histories; the Hanford Mortality (HMO) files, also maintained by HEHF and including information on deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files, maintained by PNL's Health Physics Department and containing data on external dosimetry; and a file of workers with confirmed internal depositions of radionuclides, also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.

  15. An approach for extraction of kernel oil from Pinus pumila using homogenate-circulating ultrasound in combination with an aqueous enzymatic process and evaluation of its antioxidant activity.

    PubMed

    Chen, Fengli; Zhang, Qiang; Gu, Huiyan; Yang, Lei

    2016-11-04

    In this study, a novel approach involving homogenate-circulating ultrasound in combination with aqueous enzymatic extraction (H-CUAEE) was developed for the extraction of kernel oil from Pinus pumila. Following a comparison of enzyme types and concentrations, an enzyme mixture consisting of cellulase, pectinase and hemicellulase (1:1:1, w/w/w) at a concentration of 2.5% was selected and applied for effective oil extraction and release. Several variables potentially influencing extraction yields, namely homogenization time, incubation temperature, incubation time, mark-space ratio of ultrasound irradiation, ultrasound irradiation power, liquid-solid ratio, pH and stirring rate, were screened using a Plackett-Burman design. Among the eight variables, incubation temperature, incubation time and liquid-solid ratio were statistically significant and were further optimized by a Box-Behnken design to predict the optimum extraction conditions and ascertain operability ranges for maximum extraction yield. Under the optimum operating conditions, the extraction yield of P. pumila kernel oil was 31.89±1.12%, with a Δ5-unsaturated polymethylene-interrupted fatty acid content of 20.07% and an unsaturated fatty acid content of 93.47%. Our results indicate that the proposed H-CUAEE process has enormous potential for the efficient and environmentally friendly extraction of edible oils. Copyright © 2016 Elsevier B.V. All rights reserved.
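
    The screening step mentioned here, a Plackett-Burman design, can be built from its classical cyclic generator; the sketch below constructs the 12-run design (suitable for up to 11 two-level factors, with unused columns serving as dummies) and estimates main effects from hypothetical yield data:

```python
# 12-run Plackett-Burman screening design and main-effect estimates.
import numpy as np

# classic N=12 generator row (Plackett & Burman, 1946)
gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
design = np.array([np.roll(gen, k) for k in range(11)] + [[-1] * 11])

# hypothetical extraction yields (%) for the 12 runs
y = np.array([28.1, 30.5, 25.9, 31.2, 29.8, 30.9,
              24.7, 25.2, 24.1, 29.5, 23.8, 22.6])

effects = design.T @ y / 6          # mean(+) minus mean(-) per factor column
for j, e in enumerate(effects, start=1):
    print(f"factor {j:2d}: main effect = {e:+.2f}")
# The few largest-magnitude factors would then go forward to a Box-Behnken
# optimization, as in the study.
```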

  16. Open system Hf isotope homogenization by a DISPOREP process under amphibolite-facies conditions, an example from the Limpopo Belt (South Africa)

    NASA Astrophysics Data System (ADS)

    Zeh, Armin; Gerdes, Axel

    2013-04-01

    Isotope homogenization in metamorphic rocks is a prerequisite for precise isochron dating. However, whether or not homogenization occurs during a metamorphic overprint depends on several parameters and processes, which compete with each other and comprise at least (i) volume diffusion, (ii) dissolution-re-precipitation, (iii) intergranular diffusive or fluid-enhanced transport, and (iv) metamorphic mineral reaction(s). Isotope homogenization is commonly reached in high-grade (granulite-facies) metamorphic rocks, where diffusion is fast and mineral reactions and dissolution-re-precipitation are accompanied or maintained by a melt phase, but it is incomplete in low-grade to amphibolite-facies rocks in the presence of an aqueous fluid phase. This holds true in particular for the Lu-Hf isotope system, which is mainly controlled by accessory zircon, a mineral that is very resistant to dissolution in aqueous fluids and has slow diffusivities for Hf, U and Pb. Thus zircon often maintains its primary U-Pb-Hf isotope composition obtained during previous magmatic crystallisation (i.e., magmatic grains in orthogneisses or detrital magmatic grains in paragneisses), even under very high-grade metamorphic conditions >1000 °C. However, results of recent isotope studies show that the U-Pb and Lu-Hf isotope systems of zircon-bearing ortho- and paragneisses can homogenize completely (on hand-specimen scale) even under amphibolite-facies T-P conditions of

  17. Enhanced intestinal absorption activity and hepatoprotective effect of herpetrione via preparation of nanosuspensions using pH-dependent dissolving-precipitating/homogenization process.

    PubMed

    Shen, Baode; Jin, Shiying; Lv, Qingyuan; Jin, Shixiao; Yu, Chao; Yue, Pengfei; Han, Jin; Yuan, Hailong

    2013-09-01

    The main purpose of this study was to enhance the intestinal absorption activity and hepatoprotective effect of herpetrione by means of drug nanosuspensions. Herpetrione nanosuspensions (HNS) were prepared using a pH-dependent dissolving-precipitating/homogenization process and then systematically characterized. The intestinal absorption of HNS was studied using the recirculating perfusion technique, in comparison with herpetrione coarse suspensions (HCS) and pure herpetrione. The protective effect of HNS against acute liver injury induced by carbon tetrachloride (CCl4) in mice was also investigated and compared with that of HCS. The mean particle size of HNS was 269 ± 7 nm with a polydispersity index of 0.187 ± 0.021. X-ray powder diffraction indicated that herpetrione was in an amorphous state in both the coarse powder and the nanosuspensions. The intestinal absorption of HNS was superior to that of HCS and pure herpetrione. As evidenced by the lowering of serum aminotransferase levels and the reduced degree of liver lesions, pretreatment with HNS markedly enhanced the hepatoprotective effect of herpetrione against acute liver injury induced by CCl4 in mice. HNS prepared using the pH-dependent dissolving-precipitating/homogenization technique are thus able to significantly enhance the intestinal absorption and hepatoprotective effect of herpetrione, owing to the reduction in particle size. © 2013 Royal Pharmaceutical Society.

  18. A triplet-triplet annihilation based up-conversion process investigated in homogeneous solutions and oil-in-water microemulsions of a surfactant.

    PubMed

    Penconi, Marta; Gentili, Pier Luigi; Massaro, Giuseppina; Elisei, Fausto; Ortica, Fausto

    2014-01-01

    The triplet-triplet annihilation based up-conversion process, involving platinum octaethyl-porphyrin (PtOEP) as a sensitizer and tetraphenyl-pyrene (TPPy) as an emitter, has been investigated in homogeneous solutions of toluene, bromobenzene and anisole, and in oil-in-water microemulsions of the TX-100 surfactant in which toluene constitutes the non-polar phase. In homogeneous solutions, the highest up-conversion quantum yield (of the order of 20%) was achieved in toluene, the least viscous of the solvents explored. The up-conversion emission from the PtOEP-TPPy pair was then investigated in a toluene-based oil-in-water microemulsion at three different concentrations of the solutes, showing quantum yields up to the order of 1% under the same irradiation conditions but different deoxygenating procedures. The results reported here might represent a good starting point for future investigations in microheterogeneous systems. Optimization of the microemulsion composition, in terms of surfactant, co-surfactant and toluene concentrations, could allow the sensitizer and emitter concentrations to be increased and the best operative conditions to be established for even higher up-conversion efficiencies.
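
    Up-conversion quantum yields of the kind quoted here are normally measured relative to a fluorescence standard; a commonly used form of the relative expression (standard in the TTA-UC literature, with the factor of 2 normalizing the two-photon process so that the maximum yield is unity) is:

```latex
% Relative TTA up-conversion quantum yield against a reference fluorophore.
% A = absorbance at the excitation wavelength, F = integrated emission
% intensity, \eta = solvent refractive index; "ref" = reference standard.
\Phi_{\mathrm{UC}} = 2\,\Phi_{\mathrm{ref}}
  \left(\frac{A_{\mathrm{ref}}}{A_{\mathrm{UC}}}\right)
  \left(\frac{F_{\mathrm{UC}}}{F_{\mathrm{ref}}}\right)
  \left(\frac{\eta_{\mathrm{UC}}}{\eta_{\mathrm{ref}}}\right)^{2}
```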

  19. Not all analogies are created equal: Associative and categorical analogy processing following brain damage

    PubMed Central

    Schmidt, Gwenda L.; Cardillo, Eileen R.; Kranjec, Alexander; Lehet, Matthew; Widick, Page; Chatterjee, Anjan

    2012-01-01

    Current research on analogy processing assumes that different conceptual relations are treated similarly. However, just as words and concepts are related in distinct ways, different kinds of analogies may employ distinct types of relationships. An important distinction in how words are related is the difference between associative (dog-bone) and categorical (dog-cat) relations. To test the hypothesis that analogical mapping of different types of relations would have different neural instantiations, we tested patients with left and right hemisphere lesions on their ability to understand two types of analogies, ones expressing an associative relationship and others expressing a categorical relationship. Voxel-based lesion-symptom mapping (VLSM) and behavioral analyses revealed that associative analogies relied on a large left-lateralized language network while categorical analogies relied on both left and right hemispheres. The verbal nature of the task could account for the left hemisphere findings. We argue that categorical relations additionally rely on the right hemisphere because they are more difficult, abstract, and fragile; and contain more distant relationships. PMID:22402184

  20. Integrated assessment of emerging science and technologies as creating learning processes among assessment communities.

    PubMed

    Forsberg, Ellen-Marie; Ribeiro, Barbara; Heyen, Nils B; Nielsen, Rasmus Øjvind; Thorstensen, Erik; de Bakker, Erik; Klüver, Lars; Reiss, Thomas; Beekman, Volkert; Millar, Kate

    2016-12-01

    Emerging science and technologies are often characterised by complexity, uncertainty and controversy. Regulation and governance of such scientific and technological developments need to build on knowledge and evidence that reflect this complicated situation. This insight is sometimes formulated as a call for integrated assessment of emerging science and technologies, and such a call is analysed in this article. The article addresses two overall questions. The first is: to what extent are emerging science and technologies currently assessed in an integrated way? The second is: if there appears to be a need for further integration, what should such integration consist of? In the article we briefly outline the pedigree of the term 'integrated assessment' and present a number of interpretations of the concept that are useful for informing current analyses and discussions of integration in assessment. Based on four case studies of assessment of emerging science and technologies, studies of assessment traditions, literature analysis and dialogues with assessment professionals, currently under-developed integration dimensions are identified. We suggest how these dimensions can be addressed in a practical approach to assessment in which representatives of different assessment communities and stakeholders are involved. We call this approach the Trans Domain Technology Evaluation Process (TranSTEP).

  2. Creating an environment to implement and sustain evidence based practice: a developmental process.

    PubMed

    Aitken, Leanne M; Hackwood, Ben; Crouch, Shannon; Clayton, Samantha; West, Nicky; Carney, Debbie; Jack, Leanne

    2011-11-01

    Elements of evidence based practice (EBP) are well described in the literature and achievement of EBP is frequently being cited as an organisational goal. Despite this, the practical processes and resources for achieving EBP are often not readily apparent, available or successful. The objective was to describe a multi-dimensional EBP program designed to incorporate evidence into practice and lead to sustainable improvement in patient care and, ultimately, patient outcomes. A multi-dimensional EBP program incorporating EBP champions and mentors, provision of resources, creation of a culture to foster EBP and use of practical EBP strategies was implemented in a 22-bed intensive care unit (ICU) in a public, tertiary hospital in Brisbane, Australia. The practical EBP strategies included workgroups, journal club and nursing rounds. The multi-dimensional EBP program has been successfully implemented over the past three years. EBP champions and mentors are now active and two EBP workgroups have investigated specific aspects of practice, with one of these resulting in development of an associated research project. Journal club is a routine component of the education days that all ICU nurses attend. Nursing rounds are now conducted twice a week, with between one and seven short-term issues identified for each patient reviewed in the first 12 months. A multi-dimensional program of practice change has been implemented in one setting and is providing a forum for discussion of practice-related issues and improvements. Adaptation of these strategies to multiple different health care settings is possible, with the potential for sustained practice change and improvement. Copyright © 2011 Australian College of Critical Care Nurses Ltd. Published by Elsevier Ltd. All rights reserved.

  3. Magnifying absolute instruments for optically homogeneous regions

    SciTech Connect

    Tyc, Tomas

    2011-09-15

    We propose a class of magnifying absolute optical instruments with a positive isotropic refractive index. They create magnified stigmatic images, either virtual or real, of optically homogeneous three-dimensional spatial regions within geometrical optics.

  4. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System (AWIPS) Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or DARE Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on-the-fly for AWIPS by using two examples of AWIPS applications created by the Applied Meteorology Unit (AMU). The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) and 45th Weather Squadron (45 WS) to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and the 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. Advantages of both file types will be listed.
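
    As a rough illustration of how such an overlay file can be generated on the fly, the sketch below writes a small polygon shapefile with the pyshp library; the corridor coordinates, filename and attribute names are hypothetical assumptions, and the actual AMU tools may be implemented quite differently.

        # Minimal sketch: write a polygon shapefile that an AWIPS-style display
        # could ingest as a graphical overlay. Requires pyshp (pip install pyshp).
        # All names and coordinates below are illustrative assumptions.
        import shapefile

        # Hypothetical anvil threat corridor as a closed lon/lat ring.
        corridor = [(-80.8, 28.4), (-80.4, 28.4), (-80.2, 28.9),
                    (-81.0, 28.9), (-80.8, 28.4)]

        w = shapefile.Writer("anvil_threat_corridor", shapeType=shapefile.POLYGON)
        w.field("NAME", "C", size=40)   # one text attribute per shape
        w.poly([corridor])              # outer ring (first point == last point)
        w.record("anvil threat corridor")
        w.close()                       # writes the .shp, .shx and .dbf files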

  5. On the Importance of Processing Conditions for the Nutritional Characteristics of Homogenized Composite Meals Intended for Infants

    PubMed Central

    Östman, Elin; Forslund, Anna; Tareke, Eden; Björck, Inger

    2016-01-01

    The nutritional quality of infant food is an important consideration in the effort to prevent a further increase in the rate of childhood obesity. We hypothesized that the canning of composite infant meals would lead to elevated contents of carboxymethyl-lysine (CML) and favor high glycemic and insulinemic responses compared with milder heat treatment conditions. We have compared composite infant pasta Bolognese meals that were either conventionally canned (CANPBol), or prepared by microwave cooking (MWPBol). A meal where the pasta and Bolognese sauce were separate during microwave cooking (MWP_CANBol) was also included. The infant meals were tested at breakfast in healthy adults using white wheat bread (WWB) as reference. A standardized lunch meal was served at 240 min and blood was collected from fasting to 360 min after breakfast. The 2-h glucose response (iAUC) was lower following the test meals than with WWB. The insulin response was lower after the MWP_CANBol (−47%, p = 0.0000) but markedly higher after CANPBol (+40%, p = 0.0019), compared with WWB. A combined measure of the glucose and insulin responses (ISIcomposite) revealed that MWP_CANBol resulted in 94% better insulin sensitivity than CANPBol. Additionally, the separate processing of the meal components in MWP_CANBol resulted in 39% lower CML levels than the CANPBol. It was therefore concluded that intake of commercially canned composite infant meals leads to reduced postprandial insulin sensitivity and increased exposure to oxidative stress promoting agents. PMID:27271662
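
    The record does not spell out how ISIcomposite was computed. A common convention for a composite insulin sensitivity index from fasting and postprandial glucose/insulin data is the Matsuda-type formula below; treating ISIcomposite this way is an assumption made here for illustration, not a detail given in the abstract.

        \mathrm{ISI}_{\mathrm{composite}} \;=\; \frac{10000}{\sqrt{\,G_{0}\, I_{0}\, \bar{G}\, \bar{I}\,}}

    where $G_0$ and $I_0$ are the fasting glucose and insulin concentrations and $\bar{G}$, $\bar{I}$ are the mean postprandial concentrations over the sampling period; higher values indicate better insulin sensitivity.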

  6. On the Importance of Processing Conditions for the Nutritional Characteristics of Homogenized Composite Meals Intended for Infants.

    PubMed

    Östman, Elin; Forslund, Anna; Tareke, Eden; Björck, Inger

    2016-06-03

    The nutritional quality of infant food is an important consideration in the effort to prevent a further increase in the rate of childhood obesity. We hypothesized that the canning of composite infant meals would lead to elevated contents of carboxymethyl-lysine (CML) and favor high glycemic and insulinemic responses compared with milder heat treatment conditions. We have compared composite infant pasta Bolognese meals that were either conventionally canned (CANPBol), or prepared by microwave cooking (MWPBol). A meal where the pasta and Bolognese sauce were separate during microwave cooking (MWP_CANBol) was also included. The infant meals were tested at breakfast in healthy adults using white wheat bread (WWB) as reference. A standardized lunch meal was served at 240 min and blood was collected from fasting to 360 min after breakfast. The 2-h glucose response (iAUC) was lower following the test meals than with WWB. The insulin response was lower after the MWP_CANBol (-47%, p = 0.0000) but markedly higher after CANPBol (+40%, p = 0.0019), compared with WWB. A combined measure of the glucose and insulin responses (ISIcomposite) revealed that MWP_CANBol resulted in 94% better insulin sensitivity than CANPBol. Additionally, the separate processing of the meal components in MWP_CANBol resulted in 39% lower CML levels than the CANPBol. It was therefore concluded that intake of commercially canned composite infant meals leads to reduced postprandial insulin sensitivity and increased exposure to oxidative stress promoting agents.

  7. PRO-Elicere: A Study for Create a New Process of Dependability Analysis of Space Computer Systems

    NASA Astrophysics Data System (ADS)

    da Silva, Glauco; Netto Lahoz, Carlos Henrique

    2013-09-01

    This paper presents a new approach to computer system dependability analysis, called PRO-ELICERE, which introduces data mining concepts and intelligent decision-support mechanisms to analyze the potential hazards and failures of a critical computer system. It also presents some techniques and tools that support traditional dependability analysis, and briefly discusses the concept of knowledge discovery and intelligent databases for critical computer systems. It then introduces the PRO-ELICERE process, an intelligent approach to automating ELICERE, a process created to extract non-functional requirements for critical computer systems. PRO-ELICERE can be used in the V&V activities of projects at the Institute of Aeronautics and Space, such as the Brazilian Satellite Launcher (VLS-1).

  8. Study of fundamental chemical processes in explosive decomposition by laser-powered homogeneous pyrolysis. Final report 1 jul 78-31 aug 81

    SciTech Connect

    McMillen, D.F.; Golden, D.M.

    1981-11-12

    Very Low-Pressure Pyrolysis studies of 2,4-dinitrotoluene decomposition resulted in decomposition rates consistent with log (k/s) = 12.1 - 43.9/2.3 RT. These results support the conclusion that previously reported 'anomalously' low Arrhenius parameters for the homogeneous gas-phase decomposition of ortho-nitrotoluene actually represent surface-catalyzed reactions. Preliminary qualitative results for pyrolysis of ortho-nitrotoluene in the absence of hot reactor walls, using the Laser-Powered Homogeneous Pyrolysis technique (LPHP), provide further support for this conclusion: only products resulting from Ph-NO2 bond scission were observed; no products indicating complex intramolecular oxidation-reduction or elimination processes could be detected. The LPHP technique was successfully modified to use a pulsed laser and a heated flow system, so that the technique becomes suitable for study of surface-sensitive, low vapor pressure substrates such as TNT. The validity and accuracy of the technique was demonstrated by applying it to the decomposition of substances whose Arrhenius parameters for decomposition were already well known. IR-fluorescence measurements show that the temperature-space-time behavior under the present LPHP conditions is in agreement with expectations and with requirements which must be met if the method is to have quantitative validity. LPHP studies of azoisopropane decomposition, chosen as a radical-forming test reaction, show the accepted literature parameters to be substantially in error and indicate that the correct values are in all probability much closer to those measured in this work: log (k/s) = 13.9 - 41.2/2.3 RT.
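
    To make the rate expressions concrete, the Arrhenius parameters can be evaluated directly. A worked example at 600 K follows, assuming the usual conventions for such reports (k in s^-1, activation energy in kcal/mol, R = 1.987 x 10^-3 kcal mol^-1 K^-1) and writing the 2.3 factor to full precision as 2.303:

        \log\!\left(k/\mathrm{s}^{-1}\right) = 12.1 - \frac{43.9}{2.303\,R\,T}
        \quad\Rightarrow\quad
        \log k\,\big|_{T=600\,\mathrm{K}} = 12.1 - \frac{43.9}{2.746} \approx -3.9,
        \qquad k \approx 1.3\times10^{-4}\ \mathrm{s}^{-1}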

  9. Homogeneous processes of atmospheric interest

    NASA Technical Reports Server (NTRS)

    Rossi, M. J.; Barker, J. R.; Golden, D. M.

    1983-01-01

    Upper atmospheric research programs in the department of chemical kinetics are reported. Topics discussed include: (1) third-order rate constants of atmospheric importance; (2) a computational study of the HO2 + HO2 and DO2 + DO2 reactions; (3) measurement and estimation of rate constants for modeling reactive systems; (4) kinetics and thermodynamics of ion-molecule association reactions; (5) entropy barriers in ion-molecule reactions; (6) reaction rate constant for OH + HOONO2 yields products over the temperature range 246 to 324 K; (7) very low-pressure photolysis of tert-butyl nitrite at 248 nm; (8) summary of preliminary data for the photolysis of ClONO2 and N2O5 at 285 nm; and (9) heterogeneous reaction of N2O5 and H2O.

  10. Homogeneity and elemental distribution in self-assembled bimetallic Pd-Pt aerogels prepared by a spontaneous one-step gelation process.

    PubMed

    Oezaslan, M; Liu, W; Nachtegaal, M; Frenkel, A I; Rutkowski, B; Werheid, M; Herrmann, A-K; Laugier-Bonnaud, C; Yilmaz, H-C; Gaponik, N; Czyrska-Filemonowicz, A; Eychmüller, A; Schmidt, T J

    2016-07-27

    Multi-metallic aerogels have recently emerged as a novel and promising class of unsupported electrocatalyst materials due to their high catalytic activity and improved durability for various electrochemical reactions. Aerogels can be prepared by a spontaneous one-step gelation process, where the chemical co-reduction of metal precursors and the prompt formation of nanochain-containing hydrogels, as a preliminary stage for the preparation of aerogels, take place. However, detailed knowledge about the homogeneity and chemical distribution of these three-dimensional Pd-Pt aerogels at the nano-scale as well as at the macro-scale is still lacking. Therefore, we used a combination of spectroscopic and microscopic techniques to obtain a better insight into the structure and elemental distribution of the various Pd-rich Pd-Pt aerogels prepared by the spontaneous one-step gelation process. Synchrotron-based extended X-ray absorption fine structure (EXAFS) spectroscopy and high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) in combination with energy-dispersive X-ray spectroscopy (EDX) were employed in this work to uncover the structural architecture and chemical composition of the various Pd-rich Pd-Pt aerogels over a broad length range. The Pd80Pt20, Pd60Pt40 and Pd50Pt50 aerogels showed heterogeneity in the chemical distribution of the Pt and Pd atoms inside the macroscopic nanochain-network. The features of mono-metallic clusters were not detected by EXAFS or STEM-EDX, indicating alloyed nanoparticles. However, the local chemical composition of the Pd-Pt alloys strongly varied along the nanochains and thus within a single aerogel. To determine the electrochemically active surface area (ECSA) of the Pd-Pt aerogels for application in electrocatalysis, we used the electrochemical CO stripping method. Due to their high porosity and extended network structure, the resulting values of the ECSA for the Pd-Pt aerogels were higher than that for
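
    For context, CO stripping converts the integrated charge of the CO monolayer oxidation peak into an ECSA. A standard evaluation, assuming the customary monolayer charge density of 420 µC cm^-2 for Pt-group surfaces (a textbook convention, not a value quoted in the record), is:

        \mathrm{ECSA} \;=\; \frac{Q_{\mathrm{CO}}}{0.420\ \mathrm{mC\,cm^{-2}} \times m_{\mathrm{metal}}}

    where $Q_{\mathrm{CO}}$ is the baseline-corrected CO stripping charge and $m_{\mathrm{metal}}$ is the metal loading on the electrode.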

  11. High pressure homogenization processing, thermal treatment and milk matrix affect in vitro bioaccessibility of phenolics in apple, grape and orange juice to different extents.

    PubMed

    He, Zhiyong; Tao, Yadan; Zeng, Maomao; Zhang, Shuang; Tao, Guanjun; Qin, Fang; Chen, Jie

    2016-06-01

    The effects of high pressure homogenization processing (HPHP), thermal treatment (TT) and milk matrix (soy, skimmed and whole milk) on the phenolic bioaccessibility and the ABTS scavenging activity of apple, grape and orange juice (AJ, GJ and OJ) were investigated. HPHP and soy milk diminished AJ's total phenolic bioaccessibility by 29.3% and 26.3%, respectively, whereas TT and bovine milk hardly affected it. HPHP had little effect on GJ's and OJ's total phenolic bioaccessibility, while TT enhanced them by 27.3-33.9% and 19.0-29.2%, respectively, and the milk matrix increased them by 26.6-31.1% and 13.3-43.4%, respectively. Furthermore, TT (80 °C/30 min) and TT (90 °C/30 s) had similar influences on GJ's and OJ's phenolic bioaccessibility. Skimmed milk showed a better enhancing effect on OJ's total phenolic bioaccessibility than soy and whole milk, but had a similar effect to whole milk on GJ's. These results contribute to promoting the health benefits of fruit juices by optimizing processing and formulations in the food industry. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Creating a Patient Complaint Capture and Resolution Process to Incorporate Best Practices for Patient-Centered Representation.

    PubMed

    Levin, Cynthia Mahood; Hopkins, Joseph

    2014-11-01

    A growing body of evidence suggests that patient (including family) feedback can provide compelling opportunities for developing risk management and quality improvement strategies, as well as improving customer satisfaction. The Patient Representative Department (PRD) at Stanford Health Care (SHC) (Stanford, California) created a streamlined patient complaint capture and resolution process to improve the capture of patient complaints and grievances from multiple parts of the organization and manage them in a centralized database. In March 2008 the PRD rolled out a data management system for tracking patient complaints and generating reports to SHC leadership, and SHC needed to modify and address its data input procedures. A reevaluation of the overall workflow showed it to be complex, with overlapping and redundant steps, and to lack standard processes and actions. Best-practice changes were implemented: (1) leadership engagement, (2) increased capture of complaints, (3) centralized data and reporting, (4) improved average response times to patient grievances and complaints, and (5) improved service recovery. Standard workflows were created for each category of complaint linked to specific actions. Complaints captured increased from 20 to 270 per month. Links to a specific physician rose from 16%-36% to more than 80%. In addition, 68% of high-complaint physicians improved. With improved workflows, responses to patients expressing concerns met a requirement of less than seven days. Standardized workflows for managing complaints and grievances, centralized data management and clear leadership accountability can improve responsiveness to patients, capture incidents more consistently, and meet regulatory and accreditation requirements.

  13. Creating a high-reliability health care system: improving performance on core processes of care at Johns Hopkins Medicine.

    PubMed

    Pronovost, Peter J; Armstrong, C Michael; Demski, Renee; Callender, Tiffany; Winner, Laura; Miller, Marlene R; Austin, J Matthew; Berenholtz, Sean M; Yang, Ting; Peterson, Ronald R; Reitz, Judy A; Bennett, Richard G; Broccolino, Victor A; Davis, Richard O; Gragnolati, Brian A; Green, Gene E; Rothman, Paul B

    2015-02-01

    In this article, the authors describe an initiative that established an infrastructure to manage quality and safety efforts throughout a complex health care system and that improved performance on core measures for acute myocardial infarction, heart failure, pneumonia, surgical care, and children's asthma. The Johns Hopkins Medicine Board of Trustees created a governance structure to establish health care system-wide oversight and hospital accountability for quality and safety efforts throughout Johns Hopkins Medicine. The Armstrong Institute for Patient Safety and Quality was formed; institute leaders used a conceptual model nested in a fractal infrastructure to implement this initiative to improve performance at two academic medical centers and three community hospitals, starting in March 2012. The initiative aimed to achieve ≥ 96% compliance on seven inpatient process-of-care core measures and meet the requirements for the Delmarva Foundation and Joint Commission awards. The primary outcome measure was the percentage of patients at each hospital who received the recommended process of care. The authors compared health system and hospital performance before (2011) and after (2012, 2013) the initiative. The health system achieved ≥ 96% compliance on six of the seven targeted measures by 2013. Of the five hospitals, four received the Delmarva Foundation award and two received The Joint Commission award in 2013. The authors argue that, to improve quality and safety, health care systems should establish a system-wide governance structure and accountability process. They also should define and communicate goals and measures and build an infrastructure to support peer learning.

  14. ABA Southern Region Burn disaster plan: the process of creating and experience with the ABA southern region burn disaster plan.

    PubMed

    Kearns, Randy D; Cairns, Bruce A; Hickerson, William L; Holmes, James H

    2014-01-01

    The Southern Region of the American Burn Association began to craft a regional plan to address a surge of burn-injured patients after a mass casualty event in 2004. Published in 2006, this plan has been tested through modeling, exercise, and actual events. This article focuses on the process of how the plan was created, how it was tested, and how it interfaces with other ongoing efforts on preparedness. One key to success regarding how people respond to a disaster can be traced to preexisting relationships and collaborations. These activities would include training or working together and building trust long before the crisis. Knowing who you can call and rely on when you need help, within the context of your plan, can be pivotal in successfully managing a disaster. This article describes how a coalition of burn center leaders came together. Their ongoing personal association has facilitated the development of planning activities and has kept the process dynamic. This article also includes several of the building blocks for developing a plan from creation to composition, implementation, and testing. The plan discussed here is an example of linking leadership, relationships, process, and documentation together. On the basis of these experiences, the authors believe these elements are present in other regions. The intent of this work is to share an experience and to offer it as a guide to aid others in their regional burn disaster planning efforts.

  15. Creating Sub-50 nm Nanofluidic Junctions in PDMS Microfluidic Chip via Self-Assembly Process of Colloidal Particles.

    PubMed

    Wei, Xi; Syed, Abeer; Mao, Pan; Han, Jongyoon; Song, Yong-Ak

    2016-03-13

    Polydimethylsiloxane (PDMS) is the prevailing building material to make microfluidic devices due to its ease of molding and bonding as well as its transparency. Due to the softness of the PDMS material, however, it is challenging to use PDMS for building nanochannels. The channels tend to collapse easily during plasma bonding. In this paper, we present an evaporation-driven self-assembly method of silica colloidal nanoparticles to create nanofluidic junctions with sub-50 nm pores between two microchannels. The pore size as well as the surface charge of the nanofluidic junction is tunable simply by changing the colloidal silica bead size and surface functionalization outside of the assembled microfluidic device in a vial before the self-assembly process. Using the self-assembly of nanoparticles with a bead size of 300 nm, 500 nm, and 900 nm, it was possible to fabricate a porous membrane with a pore size of ~45 nm, ~75 nm and ~135 nm, respectively. Under electrical potential, this nanoporous membrane initiated ion concentration polarization (ICP) acting as a cation-selective membrane to concentrate DNA by ~1,700 times within 15 min. This non-lithographic nanofabrication process opens up a new opportunity to build a tunable nanofluidic junction for the study of nanoscale transport processes of ions and molecules inside a PDMS microfluidic chip.
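
    Notably, the reported pore sizes track the bead diameter at a nearly constant ratio of about 0.15 (45/300, 75/500, 135/900), consistent with the interstitial voids of densely packed spheres. A minimal sketch of that scaling follows; the 0.15 factor is inferred here from the numbers in the record, not stated by the authors.

        # Estimate the nanopore size of a self-assembled colloidal membrane from
        # the bead diameter. Assumption: pore size ~ 0.15 * bead diameter, the
        # ratio implied by the 300/500/900 nm -> ~45/~75/~135 nm data above.
        PORE_TO_BEAD_RATIO = 0.15

        def estimated_pore_nm(bead_diameter_nm: float) -> float:
            """Approximate interstitial pore size (nm) for a packed-bead membrane."""
            return PORE_TO_BEAD_RATIO * bead_diameter_nm

        for d in (300, 500, 900):
            print(f"{d} nm beads -> ~{estimated_pore_nm(d):.0f} nm pores")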

  16. The Denali EarthScope Education Partnership: Creating Opportunities for Learning About Solid Earth Processes in Alaska and Beyond.

    NASA Astrophysics Data System (ADS)

    Roush, J. J.; Hansen, R. A.

    2003-12-01

    The Geophysical Institute of the University of Alaska Fairbanks, in partnership with Denali National Park and Preserve, has begun an education outreach program that will create learning opportunities in solid earth geophysics for a wide sector of the public. We will capitalize upon a unique coincidence of heightened public interest in earthquakes (due to the M 7.9 Denali Fault event of Nov. 3rd, 2002), the startup of the EarthScope experiment, and the construction of the Denali Science & Learning Center, a premiere facility for science education located just 43 miles from the epicenter of the Denali Fault earthquake. Real-time data and current research results from EarthScope installations and science projects in Alaska will be used to engage students and teachers, national park visitors, and the general public in a discovery process that will enhance public understanding of tectonics, seismicity and volcanism along the boundary between the Pacific and North American plates. Activities will take place in five program areas, which are: 1) museum displays and exhibits, 2) outreach via print publications and electronic media, 3) curriculum development to enhance K-12 earth science education, 4) teacher training to develop earth science expertise among K-12 educators, and 5) interaction between scientists and the public. In order to engage the over 1 million annual visitors to Denali, as well as people throughout Alaska, project activities will correspond with the opening of the Denali Science and Learning Center in 2004. An electronic interactive kiosk is being constructed to provide public access to real-time data from seismic and geodetic monitoring networks in Alaska, as well as cutting edge visualizations of solid earth processes. A series of print publications and a website providing access to real-time seismic and geodetic data will be developed for park visitors and the general public, highlighting EarthScope science in Alaska. A suite of curriculum modules

  17. Important components to create personal working alliances with clients in the mental health sector to support the recovery process.

    PubMed

    Klockmo, Carolina; Marnetoft, Sven-Uno; Selander, John; Nordenmark, Mikael

    2014-03-01

    Personligt ombud (PO) is a Swedish version of case management that aims to support individuals with psychiatric disabilities. Guidelines to the PO service emphasize the different role that the PO plays with respect to the relationship with clients. The aim of this study was to investigate the components that POs found to be important in the relationship with clients. Telephone interviews with 22 POs across Sweden were carried out. The interviews were recorded, transcribed, and analyzed using qualitative content analysis. The relationship with each client was described as the foundation of the POs' work; it was the only 'tool' they had. The findings were reflected in a main theme, which showed the importance of creating personal working alliances with each client, where POs put the client at the center of the work and adjusted their support according to the client's needs at the time. Important components were that the PO and the client trusted each other, that the power between the PO and the client was balanced, and that the PO provided personal support. Many of the components that POs found to be important are shown as essential in recovery-oriented services. POs followed the client in the process and remained as long as necessary, which is one way of bringing hope to the client's recovery process. However, the personal tone can be fraught with difficulties, and to maintain professionalism it is necessary to reflect, through discussions with colleagues, with the leader and in supervision.

  18. Thermomechanical process optimization of U-10wt% Mo – Part 2: The effect of homogenization on the mechanical properties and microstructure

    SciTech Connect

    Joshi, Vineet V.; Nyberg, Eric A.; Lavender, Curt A.; Paxton, Dean M.; Burkes, Douglas E.

    2015-07-09

    Low-enriched uranium alloyed with 10 wt% molybdenum (U-10Mo) is currently being investigated as an alternative to the highly enriched uranium fuel used in several of the United States’ high performance research reactors. Development of the methods to fabricate the U-10Mo fuel plates is currently underway and requires fundamental understanding of the mechanical properties at the expected processing temperatures. In the first part of this series, it was determined that the as-cast U-10Mo had a dendritic microstructure with chemical inhomogeneity and underwent eutectoid transformation during hot compression testing. In the present (second) part of the work, the as-cast samples were heat treated at several temperatures and times to homogenize the Mo content. Like the previous as-cast material, the “homogenized” materials were then tested under compression between 500 and 800°C. The as-cast samples and those treated at 800°C for 24 hours had grain sizes of 25-30 μm, whereas those treated at 1000°C for 16 hours had grain sizes around 250 μm before testing. Upon compression testing, it was determined that the heat treatment affected the mechanical properties and the precipitation of the lamellar phase at sub-eutectoid temperatures.

  19. Geovisualisation as a process of creating complementary visualisations: static two-dimensional, surface three-dimensional, and interactive

    NASA Astrophysics Data System (ADS)

    Horbiński, Tymoteusz; Medyńska-Gulij, Beata

    2017-06-01

    In the following paper, geovisualisation will be applied to one spatial phenomenon and understood as a process of creating complementary visualisations: static two-dimensional, surface three-dimensional, and interactive. The central challenge that the researchers faced was to find a method of presenting the phenomenon in a multi-faceted way. The main objective of the four-stage study was to show the capacity of the contemporary software for presenting geographical space from various perspectives while maintaining the standards of cartographic presentation and making sure that the form remains attractive for the user. The correctness, effectiveness, and usefulness of the proposed approach was analysed on the basis of a geovisualisation of natural aggregate extraction in the Gniezno district in the years 2005-2015. For each of the three visualisations, the researchers planned a different range of information, different forms of graphic and cartographic presentation, different use and function, but as far as possible the same accessible databases and the same free technologies. On the basis of the final publication, the researchers pointed out the advantages of the proposed workflow and the correctness of the detailed flowchart.

  20. Dangling conversations: reflections on the process of creating digital stories during a workshop with people with early-stage dementia.

    PubMed

    Stenhouse, R; Tait, J; Hardy, P; Sumner, T

    2013-03-01

    Care and compassion are key features of the NHS Constitution. Recent reports have identified a lack of compassion in the care and treatment of older people. Nurses draw on aesthetic knowledge, developed through engagement with the experience of others, when providing compassionate care. Patient Voices reflective digital stories are used in healthcare education to facilitate student engagement with the patient experience. Digital stories were made with seven people with early-stage dementia as part of a learning package for student nurses. In this paper the authors reflect on their experience and observations from facilitating the 4-day digital story-making workshop. Social theories of dementia provide a theoretical framework for understanding these reflections. Despite considerable challenges in developing a story, and anxiety about using the technology, reading and speaking, all participants engaged in creating their own digital stories. Positive changes in the participants' interactions were observed. These improvements appeared to be the product of the person-centred facilitation and the creative process, which supported self-expression and a sense of identity. Nurses working in this way could facilitate the ability of the person with dementia to participate in their care, and improve their sense of well-being by supporting self-expression. © 2012 Blackwell Publishing.

  1. Influence of Thermal Boundary Effects on the Process of Creating Recovery Stresses in a SMA Wire Activated by Joule Heating

    NASA Astrophysics Data System (ADS)

    Debska, Aleksandra; Balandraud, Xavier; Destrebecq, Jean-François; Gwozdziewicz, Piotr; Seruga, Andrzej

    2017-07-01

    The study deals with the influence of thermal boundary effects on the process of creating recovery stresses in a SMA wire activated by Joule heating, during a thermal cycle (up to the return to ambient temperature). First, a thermal characterization is performed using infrared thermography for temperature profile measurements along the wire in a steady-state regime. Second, recovery stress tests are performed using a uniaxial testing machine. Finally, tests are analyzed using a thermomechanical model, taking the inhomogeneous temperature distribution along the wire into account. The influence of the initial distribution of martensite (before thermal activation of the memory effect) is discussed, as well as the influence of the wire length. It is shown that the thermal boundary effects at the contact with the grips of the testing machine significantly influence the response of the wire. For instance, during the heating of the wire, an austenite-to-martensite transformation may occur in the zones near the wire ends (where the temperature remains close to ambient) due to the increased stress. A length of influence of the thermal boundary effects on the overall wire response is defined, and a condition to neglect this influence is proposed. The study highlights the importance of taking thermal boundary effects into account for practical applications of SMAs based on Joule heating.
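
    The length of influence of the thermal boundary effects can be rationalized with a one-dimensional fin-type heat balance for a Joule-heated wire clamped in massive grips. The generic model below (constant properties, grips idealized as heat sinks at ambient temperature) is a textbook sketch under those assumptions, not the authors' exact thermomechanical model.

        k A \frac{d^{2}T}{dx^{2}} + I^{2}R' - h P\,(T - T_{\infty}) = 0,
        \qquad T(0) = T(L) = T_{\infty}

    where A and P are the wire cross-section and perimeter, R' the electrical resistance per unit length and h the convection coefficient. The end zones relax over the length scale $\ell = \sqrt{kA/(hP)}$, so the thermal boundary effects become negligible only when the wire length satisfies $L \gg \ell$.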

  2. Don't homogenize, synchronize.

    PubMed

    Sawhney, M

    2001-01-01

    To be more responsive to customers, companies often break down organizational walls between their units--setting up all manner of cross-business and cross-functional task forces and working groups and promoting a "one-company" culture. But such attempts can backfire terribly by distracting business and functional units and by contaminating their strategies and processes. Fortunately, there's a better way, says the author. Rather than tear down organizational walls, a company can make them permeable to information. It can synchronize all its data on products, filtering the information through linked databases and applications and delivering it in a coordinated, meaningful form to customers. As a result, the organization can present a single, unified face to the customer--one that can change as market conditions warrant--without imposing homogeneity on its people. Such synchronization can lead not just to stronger customer relationships and more sales but also to greater operational efficiency. It allows a company, for example, to avoid the high costs of maintaining many different information systems with redundant data. The decoupling of product control from customer control in a synchronized company reflects a fundamental fact about business: While companies have to focus on creating great products, customers think in terms of the activities they perform and the benefits they seek. For companies, products are ends, but for customers, products are means. The disconnect between how customers think and how companies organize themselves is what leads to inefficiencies and missed opportunities, and that's exactly the problem that synchronization solves. Synchronized companies can get closer to customers, sustain product innovation, and improve operational efficiency--goals that have traditionally been very difficult to achieve simultaneously.

  3. Is the Universe homogeneous?

    PubMed

    Maartens, Roy

    2011-12-28

    The standard model of cosmology is based on the existence of homogeneous surfaces as the background arena for structure formation. Homogeneity underpins both general relativistic and modified gravity models and is central to the way in which we interpret observations of the cosmic microwave background (CMB) and the galaxy distribution. However, homogeneity cannot be directly observed in the galaxy distribution or CMB, even with perfect observations, since we observe on the past light cone and not on spatial surfaces. We can directly observe and test for isotropy, but to link this to homogeneity we need to assume the Copernican principle (CP). First, we discuss the link between isotropic observations on the past light cone and isotropic space-time geometry: what observations do we need to be isotropic in order to deduce space-time isotropy? Second, we discuss what we can say with the Copernican assumption. The most powerful result is based on the CMB: the vanishing of the dipole, quadrupole and octupole of the CMB is sufficient to impose homogeneity. Real observations lead to near-isotropy on large scales--does this lead to near-homogeneity? There are important partial results, and we discuss why this remains a difficult open question. Thus, we are currently unable to prove homogeneity of the Universe on large scales, even with the CP. However, we can use observations of the cosmic microwave background, galaxies and clusters to test homogeneity itself.

  4. Fe2O3-loaded activated carbon fiber/polymer materials and their photocatalytic activity for methylene blue mineralization by combined heterogeneous-homogeneous photocatalytic processes

    NASA Astrophysics Data System (ADS)

    Kadirova, Zukhra C.; Hojamberdiev, Mirabbos; Katsumata, Ken-Ichi; Isobe, Toshihiro; Matsushita, Nobuhiro; Nakajima, Akira; Okada, Kiyoshi

    2017-04-01

    Fe2O3-supported activated carbon felts (Fe-ACFTs) were prepared by impregnating felts consisting of activated carbon fibers (ACFs) with either polyester fibers (PS-A20) or polyethylene pulp (PE-W15) in Fe(III) nitrate solution, followed by calcination at 250 °C for 1 h. The prepared Fe-ACFTs with 31-35 wt% Fe were characterized by N2-adsorption, scanning electron microscopy, and X-ray diffraction. The Fe-ACFT(PS-A20) samples with 5-31 wt% Fe were microporous with specific surface areas (SBET) ranging from 750 to 150 m2/g, whereas the Fe-ACFT(PE-W15) samples with 2-35 wt% Fe were mesoporous with SBET ranging from 830 to 320 m2/g. The deposition of iron oxide resulted in a decrease in the SBET and methylene blue (MB) adsorption capacity while increasing the photodegradation of MB. The optimum MB degradation conditions included 0.98 mM oxalic acid, pH = 3, 0.02-0.05 mM MB, and 100 mg/L photocatalyst. The negative impact of MB desorption during the photodegradation reaction was more pronounced for the mesoporous PE-W15 samples and can be neglected by adding oxalic acid in cyclic experiments. Almost complete and simultaneous mineralization of oxalate and MB was achieved by the combined heterogeneous-homogeneous photocatalytic processes. The leached Fe ions in aqueous solution [Fe3+]f were measured after 60 min for every cycle and found to be about 2 ppm in all four successive cycles. The developed photocatalytic materials showed good performance even at low iron oxide content (2-5 wt% Fe-ACFT). Moreover, it is easy to re-impregnate the ACF when the content of iron oxide is reduced during the cyclic process. Thus, low leaching of Fe ions and the possibility of cyclic usage are the advantages of the photocatalytic materials developed in this study.

  5. Acid-catalyzed cyclization of terpenes under homogeneous and heterogeneous conditions as probed through stereoisotopic studies: a concerted process with competing preorganized chair and boat transition states.

    PubMed

    Raptis, Christos; Lykakis, Ioannis N; Tsangarakis, Constantinos; Stratakis, Manolis

    2009-11-09

    Based on stereoisotopic studies and beta-secondary isotope effects, we propose that the acid-catalyzed cyclization of geranyl acetate proceeds through a concerted mechanism. Under heterogeneous conditions (zeolite Y confinement), a preorganized chairlike transition state predominates, whereas under homogeneous conditions the boat- and chairlike transition states are almost isoenergetic. For the case of farnesyl acetate, we propose that under homogeneous conditions a concerted dicyclization occurs with a preorganized boat-chair transition state competing with the chair-chair transition state. Under zeolite confinement conditions, the chair-chairlike dicyclization transition state is highly favorable. The preference of chairlike transition states within the cavities of zeolite Y is attributed to a transition state shape selectivity effect.

  6. Economy, Culture, Public Policy, and the Urban Underclass. A Discussion of Research on Processes and Mechanisms That Create, Maintain, or Overcome Urban Poverty.

    ERIC Educational Resources Information Center

    Pearson, Robert W.

    1989-01-01

    Research is examining the processes by which persistent and concentrated urban poverty is created, maintained, prevented, or overcome. This paper reports on discussion and suggestions generated in a planning meeting of the Social Science Research Council's Committee for Research on the Urban Underclass held on September 21-23, 1988. Issues…

  7. Report: Recipient Subawards to Fellows Did Not Comply With Federal Requirements and EPA’s Involvement in Fellow Selection Process Creates the Appearance EPA Could Be Circumventing the Hiring Process

    EPA Pesticide Factsheets

    Report #14-P-0357, September 17, 2014. ASPH’s subawards to fellows made under the CA are contrary to federal requirements ... and ... creates an appearance that the EPA could be circumventing the hiring process.

  8. Stimulus homogeneity enhances implicit learning: evidence from contextual cueing.

    PubMed

    Feldmann-Wüstefeld, Tobias; Schubö, Anna

    2014-04-01

    Visual search for a target object is faster if the target is embedded in a repeatedly presented invariant configuration of distractors ('contextual cueing'). It has also been shown that the homogeneity of a context affects the efficiency of visual search: targets receive prioritized processing when presented in a homogeneous context compared to a heterogeneous context, presumably due to grouping processes at early stages of visual processing. The present study investigated in three experiments whether context homogeneity also affects contextual cueing. In Experiment 1, context homogeneity varied on three levels of the task-relevant dimension (orientation) and contextual cueing was most pronounced for context configurations with high orientation homogeneity. When context homogeneity varied on three levels of the task-irrelevant dimension (color) and orientation homogeneity was fixed, no modulation of contextual cueing was observed: high orientation homogeneity led to large contextual cueing effects (Experiment 2) and low orientation homogeneity led to low contextual cueing effects (Experiment 3), irrespective of color homogeneity. Enhanced contextual cueing for homogeneous context configurations suggests that grouping processes affect not only visual search but also implicit learning. We conclude that memory representations of context configurations are more easily acquired when context configurations can be processed as larger, grouped perceptual units. However, this form of implicit perceptual learning is only improved by stimulus homogeneity when stimulus homogeneity facilitates grouping processes on a dimension that is currently relevant in the task. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. An investigation into the effects of excipient particle size, blending techniques and processing parameters on the homogeneity and content uniformity of a blend containing low-dose model drug

    PubMed Central

    Alyami, Hamad; Dahmash, Eman; Bowen, James

    2017-01-01

    Powder blend homogeneity is a critical attribute in formulation development of low-dose and potent active pharmaceutical ingredients (APIs), yet achieving it is a complex process with multiple contributing factors. Excipient characteristics play a key role in an efficient blending process and in final product quality. In this work the effect of excipient type and properties, blending technique and processing time on content uniformity was investigated. Powder characteristics for three commonly used excipients (starch, pregelatinised starch and microcrystalline cellulose) were initially explored using a laser diffraction particle size analyser and angle of repose for flowability, followed by thorough evaluation of surface topography employing scanning electron microscopy and interferometry. Blend homogeneity was evaluated based on content uniformity analysis of the model API, ergocalciferol, using a validated analytical technique. Flowability of the powders was directly related to particle size and shape, while surface topography results revealed the relationship between surface roughness and the ability of an excipient with high surface roughness to lodge fine API particles within surface grooves, resulting in superior uniformity of content. Of the two blending techniques, geometric blending confirmed the ability to produce homogeneous blends at low dilution when processed for longer durations, whereas manual ordered blending failed to achieve the compendial requirement for content uniformity despite mixing for 32 minutes. Employing the novel dry powder hybrid mixer device developed at the Aston University laboratory, results revealed the superiority of the device, which enabled the production of homogeneous blends irrespective of excipient type and particle size. Lower dilutions of the API (1% and 0.5% w/w) were examined using non-sieved excipients, and the dry powder hybrid mixing device enabled the development of successful blends within compendial requirements and with low relative standard deviation. PMID:28609454

  10. An investigation into the effects of excipient particle size, blending techniques and processing parameters on the homogeneity and content uniformity of a blend containing low-dose model drug.

    PubMed

    Alyami, Hamad; Dahmash, Eman; Bowen, James; Mohammed, Afzal R

    2017-01-01

    Powder blend homogeneity is a critical attribute in formulation development of low-dose and potent active pharmaceutical ingredients (APIs), yet achieving it is a complex process with multiple contributing factors. Excipient characteristics play a key role in an efficient blending process and in final product quality. In this work the effect of excipient type and properties, blending technique and processing time on content uniformity was investigated. Powder characteristics for three commonly used excipients (starch, pregelatinised starch and microcrystalline cellulose) were initially explored using a laser diffraction particle size analyser and angle of repose for flowability, followed by thorough evaluation of surface topography employing scanning electron microscopy and interferometry. Blend homogeneity was evaluated based on content uniformity analysis of the model API, ergocalciferol, using a validated analytical technique. Flowability of the powders was directly related to particle size and shape, while surface topography results revealed the relationship between surface roughness and the ability of an excipient with high surface roughness to lodge fine API particles within surface grooves, resulting in superior uniformity of content. Of the two blending techniques, geometric blending confirmed the ability to produce homogeneous blends at low dilution when processed for longer durations, whereas manual ordered blending failed to achieve the compendial requirement for content uniformity despite mixing for 32 minutes. Employing the novel dry powder hybrid mixer device developed at the Aston University laboratory, results revealed the superiority of the device, which enabled the production of homogeneous blends irrespective of excipient type and particle size. Lower dilutions of the API (1% and 0.5% w/w) were examined using non-sieved excipients, and the dry powder hybrid mixing device enabled the development of successful blends within compendial requirements and with low relative standard deviation.
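
    Blend homogeneity in such studies is typically judged by the relative standard deviation (RSD) of assayed unit contents. The sketch below shows that computation; the sample values and the 6% acceptance limit are illustrative assumptions, not data from the record.

        # Content uniformity check via relative standard deviation (RSD).
        import statistics

        def rsd_percent(assays):
            """RSD (%) of assay results expressed as % of label claim."""
            return 100.0 * statistics.stdev(assays) / statistics.mean(assays)

        # Hypothetical assay values from ten blend samples (% of target content).
        samples = [98.2, 101.5, 99.8, 100.4, 97.6, 102.1, 99.1, 100.9, 98.7, 101.2]
        rsd = rsd_percent(samples)
        verdict = "within illustrative 6% limit" if rsd <= 6.0 else "fails"
        print(f"RSD = {rsd:.2f}% ({verdict})")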

  11. Creating Poetry.

    ERIC Educational Resources Information Center

    Drury, John

    Encouraging exploration and practice, this book offers hundreds of exercises and numerous tips covering every step involved in creating poetry. Each chapter is a self-contained unit offering an overview of material in the chapter, a definition of terms, and poetry examples from well-known authors designed to supplement the numerous exercises.…

  12. A study of the process of using Pro/ENGINEER geometry models to create finite element models

    SciTech Connect

    Kistler, B.L.

    1997-02-01

    Methods for building Pro/ENGINEER models which allowed integration with structural and thermal mesh generation and analyses software without recreating geometry were evaluated. This study was not intended to be an in-depth study of the mechanics of Pro/ENGINEER or of mesh generation or analysis software, but instead was a first-cut attempt to provide recommendations for Sandia personnel that would yield useful analytical models in less time than an analyst would require to create a separate model. The study evaluated a wide variety of geometries built in Pro/ENGINEER and provided general recommendations for designers, drafters, and analysts.

  13. Homogeneity and Entropy

    NASA Astrophysics Data System (ADS)

    Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.

    1990-11-01

    RESUMEN (translated from Spanish): We present a methodology for analyzing homogeneity based on Information Theory, applicable to samples of observational data. ABSTRACT: Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given data sample. Key words: DATA ANALYSIS
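
    Although the record gives no formulas, a standard information-theoretic homogeneity measure is the normalized Shannon entropy of a binned sample, which approaches 1 for an evenly spread (homogeneous) sample and 0 for a concentrated one. The sketch below illustrates that idea under this assumption; it is not a reproduction of the authors' methodology.

        # Normalized Shannon entropy of a histogram as a homogeneity score in [0, 1].
        import math
        from collections import Counter

        def normalized_entropy(values, bins=10):
            lo, hi = min(values), max(values)
            width = (hi - lo) / bins or 1.0          # guard against all-equal data
            counts = Counter(min(int((v - lo) / width), bins - 1) for v in values)
            n = len(values)
            h = -sum((c / n) * math.log(c / n) for c in counts.values())
            return h / math.log(bins)                # 1.0 = maximally homogeneous

        print(normalized_entropy(list(range(1, 11))))   # ~1.0: evenly spread sample
        print(normalized_entropy([1] * 9 + [10]))       # ~0.14: concentrated sample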

  14. Phase-shifting of correlation fringes created by image processing as an alternative to improve digital shearography

    NASA Astrophysics Data System (ADS)

    Braga, Roberto A.; González-Peña, Rolando J.; Marcon, Marlon; Magalhães, Ricardo R.; Paiva-Almeida, Thiago; Santos, Igor V. A.; Martins, Moisés

    2016-12-01

    Digital speckle pattern shearing interferometry, or speckle shearography, is widely adopted in many areas where one needs to measure in-plane and out-of-plane micro-displacements in biological and non-biological objects; it is based on the Michelson interferometer, with a piezoelectric transducer (PZT) used to phase-shift the fringes and thereby improve the quality of the final image. Despite its widespread use, creating the shifted images with a PZT has some drawbacks and limitations, such as the cost of the apparatus, the difficulty of applying exactly the same mirror displacement repeatedly, and the inability to apply phase-shifting to dynamic object measurements. The aim of this work was to create phase-shifted images digitally, avoiding the mechanical adjustments of the PZT, and to test them with the digital shearography method. The methodology was tested using a well-known object, an aluminium cantilever beam under deformation. The results documented the ability to create the deformation map and curves with reliability and sensitivity, reducing the cost and improving the robustness and accessibility of digital speckle pattern shearing interferometry.
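
    Digitally generated phase shifts can stand in for the PZT because the standard four-step algorithm recovers the phase from four fringe images shifted by pi/2. The sketch below applies that textbook relation to synthetic fringes; it is a generic illustration, not the authors' specific image-processing pipeline.

        # Four-step phase-shifting on synthetic correlation fringes.
        import numpy as np

        x = np.linspace(0, 4 * np.pi, 512)
        phi = 2.0 * np.sin(x)                 # "true" phase to be recovered
        a, b = 1.0, 0.5                       # background and fringe modulation
        # I_k = a + b*cos(phi + k*pi/2) for k = 0, 1, 2, 3
        I1, I2, I3, I4 = (a + b * np.cos(phi + k * np.pi / 2) for k in range(4))

        # Four-step relation: tan(phi) = (I4 - I2) / (I1 - I3)
        phi_rec = np.arctan2(I4 - I2, I1 - I3)

        # Compare modulo 2*pi (arctan2 returns the wrapped phase).
        err = np.angle(np.exp(1j * (phi_rec - phi)))
        print(f"max wrapped phase error: {np.max(np.abs(err)):.2e} rad")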

  15. Creating bulk nanocrystalline metal.

    SciTech Connect

    Fredenburg, D. Anthony; Saldana, Christopher J.; Gill, David D.; Hall, Aaron Christopher; Roemer, Timothy John; Vogler, Tracy John; Yang, Pin

    2008-10-01

    Nanocrystalline and nanostructured materials offer unique microstructure-dependent properties that are superior to those of coarse-grained materials. These materials have been shown to have very high hardness, strength, and wear resistance. However, most current methods of producing nanostructured forms of weapons-relevant materials create powdered metal that must be consolidated into bulk form to be useful. Conventional consolidation methods are not appropriate due to the need to maintain the nanocrystalline structure. This research investigated new ways of creating nanocrystalline material and new methods of consolidating it, and analyzed these different methods of creation and consolidation to evaluate their applicability to mesoscale weapons applications, where part features are often under 100 µm wide and the material's microstructure must be very small to give homogeneous properties across the feature.

  16. Creating a Whole Greater than the Sum of Its Parts: Fostering Integrative Learning with a Reflective ePortfolio Process

    ERIC Educational Resources Information Center

    McGuinness, Thomas Patrick

    2015-01-01

    This research explores one university's effort to facilitate integrative learning with a reflective ePortfolio process. Integrative learning is conceptualized using a multi-theoretical construct consisting of transfer of learning, reflective practice, and self-authorship. As part of the evaluation of this process, students completed a pre-survey…

  17. Creating a Whole Greater than the Sum of Its Parts: Fostering Integrative Learning with a Reflective ePortfolio Process

    ERIC Educational Resources Information Center

    McGuinness, Thomas Patrick

    2015-01-01

    This research explores one university's effort to facilitate integrative learning with a reflective ePortfolio process. Integrative learning is conceptualized using a multi-theoretical construct consisting of transfer of learning, reflective practice, and self-authorship. As part of the evaluation of this process, students completed a pre-survey…

  18. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    NASA Astrophysics Data System (ADS)

    Bulutsuz, A. G.; Demircioglu, P.; Bogrekci, I.; Durakbasa, M. N.; Katiboglu, A. B.

    2015-03-01

    The interaction between foreign substances placed into the jaw to eliminate tooth loss and the surrounding organic tissue is a highly complex process. Many biological reactions take place, alongside the biomechanical forces that influence this formation. Osseointegration denotes the direct structural and functional association between living bone and the surface of a load-bearing artificial implant. Taking into consideration the requirements of implant manufacturing processes, this study investigates surface characterization with high-precision measurement techniques and emphasizes the importance of these processes for the long-term success of dental implants. In this research, detailed surface characterization was performed to identify how the manufacturing technique determines the surface properties, using image processing methods, scanning electron microscopy (SEM) for 3D morphological properties, and a Taylor Hobson stylus profilometer for 2D roughness properties. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, highlighting the importance of precise 2D and 3D inspection techniques for surface characterization. Different image analysis techniques, such as the dark-light technique, were used to verify the surface measurement results. The computational phase was performed using the Image Processing Toolbox in Matlab, with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful in the determination of surface roughness. The results showed that the number of black pixels in the image increases with increase in surface

  19. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    SciTech Connect

    Bulutsuz, A. G.; Demircioglu, P. Bogrekci, I.; Durakbasa, M. N.

    2015-03-30

    The interaction between foreign substances placed into the jaw to eliminate tooth loss and the surrounding organic tissue is a highly complex process. Many biological reactions take place, alongside the biomechanical forces that influence this formation. Osseointegration denotes the direct structural and functional association between living bone and the surface of a load-bearing artificial implant. Taking into consideration the requirements of implant manufacturing processes, this study investigates surface characterization with high-precision measurement techniques and emphasizes the importance of these processes for the long-term success of dental implants. In this research, detailed surface characterization was performed to identify how the manufacturing technique determines the surface properties, using image processing methods, scanning electron microscopy (SEM) for 3D morphological properties, and a Taylor Hobson stylus profilometer for 2D roughness properties. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, highlighting the importance of precise 2D and 3D inspection techniques for surface characterization. Different image analysis techniques, such as the dark-light technique, were used to verify the surface measurement results. The computational phase was performed using the Image Processing Toolbox in Matlab, with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful in the determination of surface roughness. The results showed that the number of black pixels in the image increases with increase in surface
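
    The black/white pixel analysis described in the two records above reduces to thresholding a grayscale image and counting dark pixels. A minimal NumPy sketch follows; the threshold value and the synthetic images are illustrative assumptions, since the records do not specify them.

        # Fraction of "black" pixels as a simple surface-roughness proxy.
        import numpy as np

        def black_pixel_fraction(gray_image: np.ndarray, threshold: int = 128) -> float:
            """Fraction of pixels darker than `threshold` in an 8-bit grayscale image."""
            return float(np.mean(gray_image < threshold))

        # Illustrative stand-ins: a rougher surface scatters more light into shadow,
        # so its image tends to contain more dark pixels.
        rng = np.random.default_rng(0)
        smooth = rng.normal(180, 15, (256, 256)).clip(0, 255)
        rough = rng.normal(120, 50, (256, 256)).clip(0, 255)
        print(f"smooth: {black_pixel_fraction(smooth):.1%} black")
        print(f"rough:  {black_pixel_fraction(rough):.1%} black")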

  20. Creating Processes Associated with Providing Government Goods and Services Under the Commercial Space Launch Act at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Letchworth, Janet F.

    2011-01-01

    Kennedy Space Center (KSC) has decided to write its agreements under the Commercial Space Launch Act (CSLA) authority to cover a broad range of categories of support that KSC could provide to our commercial partners. Our strategy was to go through the onerous process of getting the agreement in place once, and to allow added specificity and final cost estimates to be documented on a separate Task Order Request (TOR). This paper is written from the implementing engineering team's perspective. It describes how we developed the processes associated with getting Government support to our emerging commercial partners, such as SpaceX, and reports on our success to date.

  1. Are Children's Memory Illusions Created Differently from Those of Adults? Evidence from Levels-of-Processing and Divided Attention Paradigms

    ERIC Educational Resources Information Center

    Wimmer, Marina C.; Howe, Mark L.

    2010-01-01

    In two experiments, we investigated the robustness and automaticity of adults' and children's generation of false memories by using a levels-of-processing paradigm (Experiment 1) and a divided attention paradigm (Experiment 2). The first experiment revealed that when information was encoded at a shallow level, true recognition rates decreased for…

  2. Challenges of hand hygiene in healthcare: the development of a tool kit to create supportive processes and environments.

    PubMed

    Chagpar, Anjum; Banez, Carleene; Lopez, Raquel; Cafazzo, Joseph A

    2010-01-01

    Hand hygiene compliance by healthcare providers has been difficult to achieve due to diverse environments, work culture, processes and task requirements. Because of this complexity, hand hygiene lends itself well to a human factors analysis in order to design a system that matches human cognitive and physical strengths and makes allowances for human limitations.

  3. Decision-Making Processes of SME in Cloud Computing Adoption to Create Disruptive Innovation: Mediating Effect of Collaboration

    ERIC Educational Resources Information Center

    Sonthiprasat, Rattanawadee

    2014-01-01

    THE PROBLEM. The purpose of this quantitative correlation study was to assess the relationship between different Cloud service levels of effective business innovation for SMEs. In addition, the new knowledge gained from the benefits of Cloud adoption with knowledge sharing would enhance the decision making process for businesses to consider the…

  6. A model cerium oxide matrix composite reinforced with a homogeneous dispersion of silver particulate - prepared using the glycine-nitrate process

    SciTech Connect

    Weil, K. Scott; Hardy, John S.

    2005-01-31

    Recently a new method of ceramic brazing has been developed. Based on a two-phase liquid composed of silver and copper oxide, brazing is conducted directly in air without the need for an inert cover gas or surface-reactive fluxes. Because the braze displays excellent wetting characteristics on a number of ceramic surfaces, including alumina, various perovskites, zirconia, and ceria, we were interested in investigating whether a metal-reinforced ceramic matrix composite (CMC) could be developed with this material. In the present study, two sets of homogeneously mixed silver/copper oxide/ceria powders were synthesized using a combustion synthesis technique. The powders were compacted and heat treated in air above the liquidus temperature for the chosen Ag-CuO composition. Metallographic analysis indicates that the resulting composite microstructures are extremely uniform with respect to both the size of the metallic reinforcement and its spatial distribution within the ceramic matrix. The size, morphology, and spacing of the metal particulate in the densified composite appear to depend on the original size and structure of the starting combustion-synthesized powders.

  7. Applying appropriate risk management strategies to improve the Superfund process: Creating mutual gains for PRPs and regulators

    SciTech Connect

    Shultz, S.R.; Forney, J.; Padovani, S.; Jones, T.; Wisenbaker, B.

    1994-12-31

    A new mechanism for developing an appropriate risk management strategy is the "Presumptive Remedy", which allows the Superfund process to be streamlined for certain kinds of sites, e.g., municipal landfill sites. Total Quality Management concepts are also being used to improve risk management decisions and the remediation process. This presentation will describe site-specific case studies and include a panel discussion covering how incorporation of these risk management elements into selection of the site remediation strategy leads to decisions that are acceptable to all stakeholders. The presentation will discuss lessons learned from the following perspectives: private industry; the Department of Defense (DOD); the Department of Energy (DOE); the US Environmental Protection Agency (US EPA); and A-E firms.

  8. HOMOGENEOUS NUCLEAR POWER REACTOR

    DOEpatents

    King, L.D.P.

    1959-09-01

    A homogeneous nuclear power reactor utilizing forced circulation of the liquid fuel is described. The reactor does not require fuel handling outside of the reactor vessel during any normal operation, including complete shutdown to room temperature, the reactor being self-regulating under extreme operating conditions and controlled by the thermal expansion of the liquid fuel. The liquid fuel utilized is a uranium, phosphoric acid, and water solution which requires no gas exhaust system or independent gas recombining system, thereby eliminating the handling of radiolytic gas.

  9. The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.

    2010-01-01

    The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.

  10. Modeling unsaturated zone flow and runoff processes by integrating MODFLOW-LGR and VSF, and creating the new CFL package

    USGS Publications Warehouse

    Borsia, I.; Rossetto, R.; Schifani, C.; Hill, Mary C.

    2013-01-01

    In this paper two modifications to the MODFLOW code are presented. One is an extension of Local Grid Refinement (LGR) to the Variably Saturated Flow (VSF) process capability. This modification allows the user to solve the 3D Richards' equation only in selected parts of the model domain. The second modification introduces a new package, named CFL (Cascading Flow), which improves the computation of overland flow when ground surface saturation is simulated using either VSF or the Unsaturated Zone Flow (UZF) package. The modeling concepts are presented and demonstrated. Programmer documentation is included in appendices.

  11. Optimizing homogenization by chaotic unmixing?

    NASA Astrophysics Data System (ADS)

    Weijs, Joost; Bartolo, Denis

    2016-11-01

    A number of industrial processes rely on the homogeneous dispersion of non-Brownian particles in a viscous fluid. An ideal mixing would yield a so-called hyperuniform particle distribution. Such configurations are characterized by density fluctuations that grow more slowly than the standard √N fluctuations. Even though such distributions have been found in several natural structures, e.g. retina receptors in birds, they remained out of experimental reach until very recently. Over the last 5 years, independent experiments and numerical simulations have shown that periodically driven suspensions can self-assemble hyperuniformly. Simple as the recipe may be, it has one important disadvantage: the emergence of hyperuniform states co-occurs with a critical phase transition from reversible to irreversible particle dynamics. As a consequence, the homogenization dynamics occurs over a time that diverges with the system size (critical slowing down). Here, we discuss how this process can be sped up by exploiting the stirring properties of chaotic advection. Among the questions that we answer are: What are the physical mechanisms in a chaotic flow that are relevant for hyperuniformity? How can we tune the flow parameters so as to obtain optimal hyperuniformity in the fastest way? JW acknowledges funding by NWO (Netherlands Organisation for Scientific Research) through a Rubicon Grant.
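
    A rough illustration of the hyperuniformity diagnostic mentioned above (not the authors' code; the point pattern, window count, and radii are made up): one measures how the variance of the particle count in randomly placed windows grows with window size. Poisson-like patterns in 2D give variance growing like the window area, while hyperuniform patterns grow more slowly.

        import numpy as np

        rng = np.random.default_rng(0)
        points = rng.random((10_000, 2))   # placeholder: substitute real particle positions

        def number_variance(pts, R, n_windows=2_000):
            """Variance of the particle count in randomly placed disks of radius R."""
            centers = R + (1 - 2 * R) * rng.random((n_windows, 2))  # disks stay in the unit box
            counts = [(np.linalg.norm(pts - c, axis=1) < R).sum() for c in centers]
            return np.var(counts)

        # Poisson-like patterns: variance ~ R^2 (i.e. ~N); hyperuniform: slower growth.
        for R in (0.02, 0.04, 0.08):
            print(R, number_variance(points, R))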

  12. Are children's memory illusions created differently from those of adults? Evidence from levels-of-processing and divided attention paradigms.

    PubMed

    Wimmer, Marina C; Howe, Mark L

    2010-09-01

    In two experiments, we investigated the robustness and automaticity of adults' and children's generation of false memories by using a levels-of-processing paradigm (Experiment 1) and a divided attention paradigm (Experiment 2). The first experiment revealed that when information was encoded at a shallow level, true recognition rates decreased for all ages. For false recognition, when information was encoded on a shallow level, we found a different pattern for young children compared with that for older children and adults. False recognition rates were related to the overall amount of correctly remembered information for 7-year-olds, whereas no such association was found for the other age groups. In the second experiment, divided attention decreased true recognition for all ages. In contrast, children's (7- and 11-year-olds) false recognition rates were again dependent on the overall amount of correctly remembered information, whereas adults' false recognition was left unaffected. Overall, children's false recognition rates changed when levels of processing or divided attention was manipulated in comparison with adults. Together, these results suggest that there may be both quantitative and qualitative changes in false memory rates with age.

  13. Miniature Fabry-Perot pressure sensor created by using UV-molding process with an optical fiber based mold.

    PubMed

    Bae, H; Yu, M

    2012-06-18

    We present a miniature Fabry-Perot pressure sensor fabricated at the tip of an optical fiber with a pre-written Bragg grating by using a UV-molding polymer process. The mold is constructed by integrating an optical fiber of 80 μm diameter with a zirconia ferrule. The optical fiber based mold makes it possible to use an optical alignment method to monitor the coupled intensity between the mold-side and replica-side fibers, rendering a maskless alignment process with submicrometer accuracy. A polymer-metal composite thin diaphragm is employed as the pressure transducer. The overall sensor size is around 200 μm in diameter. Experimental study shows that the sensor exhibits good linearity over a pressure range of 1.9-7.9 psi, with a sensitivity of 0.0106 μm/psi. The fiber Bragg grating is exploited for simultaneous temperature measurements or compensation for temperature effects in pressure readings. The sensor is expected to benefit many fields that require miniature and inexpensive sensors for reliable pressure measurement, especially biomedical applications.
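
    For context, the reported linear response can be inverted to read pressure from the measured cavity-length change; a minimal sketch under the stated calibration (the function name, variable names, and reference-pressure handling are ours, not the paper's):

        # Reported sensitivity: 0.0106 um of cavity-length change per psi,
        # linear over roughly 1.9-7.9 psi.
        SENSITIVITY_UM_PER_PSI = 0.0106

        def pressure_psi(delta_cavity_um, p_ref_psi=1.9):
            """Pressure from cavity-length change relative to the reading at p_ref."""
            return p_ref_psi + delta_cavity_um / SENSITIVITY_UM_PER_PSI

        print(pressure_psi(0.042))   # ~5.9 psi for an assumed 0.042 um change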

  14. Homogeneous, bioluminescent proteasome assays.

    PubMed

    O'Brien, Martha A; Moravec, Richard A; Riss, Terry L; Bulleit, Robert F

    2015-01-01

    Protein degradation is mediated predominantly through the ubiquitin-proteasome pathway. The importance of the proteasome in regulating degradation of proteins involved in cell-cycle control, apoptosis, and angiogenesis led to the recognition of the proteasome as a therapeutic target for cancer. The proteasome is also essential for degrading misfolded and aberrant proteins, and impaired proteasome function has been implicated in neurodegenerative and cardiovascular diseases. Robust, sensitive assays are essential for monitoring proteasome activity and for developing inhibitors of the proteasome. Peptide-conjugated fluorophores are widely used as substrates for monitoring proteasome activity, but fluorogenic substrates can exhibit significant background and can be problematic for screening because of cellular autofluorescence or interference from fluorescent library compounds. Furthermore, fluorescent proteasome assays require column-purified 20S or 26S proteasome (typically obtained from erythrocytes), or proteasome extracts from whole cells, as samples. To provide assays more amenable to high-throughput screening, we developed a homogeneous, bioluminescent method that combines peptide-conjugated aminoluciferin substrates and a stabilized luciferase. Using substrates for the chymotrypsin-like, trypsin-like, and caspase-like proteasome activities in combination with a selective membrane permeabilization step, we developed single-step, cell-based assays to measure each of the proteasome catalytic activities. The homogeneous method eliminates the need to prepare individual cell extracts as samples and has adequate sensitivity for 96- and 384-well plates. The simple "add and read" format enables sensitive and rapid proteasome assays ideal for inhibitor screening.

  15. Restoration of overwash processes creates piping plover (Charadrius melodus) habitat on a barrier island (Assateague Island, Maryland)

    NASA Astrophysics Data System (ADS)

    Schupp, Courtney A.; Winn, Neil T.; Pearl, Tami L.; Kumer, John P.; Carruthers, Tim J. B.; Zimmerman, Carl S.

    2013-01-01

    On Assateague Island, an undeveloped barrier island along Maryland and Virginia, a foredune was constructed to protect the island from the erosion and breaching threat caused by permanent jetties built to maintain Ocean City Inlet. Scientists and engineers integrated expertise in vegetation, wildlife, geomorphology, and coastal engineering in order to design a habitat restoration project that would be evaluated in terms of coastal processes rather than static features. Development of specific restoration targets, thresholds for intervention, and criteria to evaluate long-term project success were based on biological and geomorphological data and coastal engineering models. A detailed long-term monitoring plan was established to measure project sustainability. The foredune unexpectedly acted as a near-total barrier to both overwash and wind, and the dynamic ecosystem underwent undesirable habitat changes, including conversion of early-succession beach habitat to herbaceous and shrub communities, diminishing availability of foraging habitat and thereby reducing productivity of the Federally listed Threatened Charadrius melodus (piping plover). To address these impacts, multiple notches were cut through the constructed foredune. The metric for initial geomorphological success (restoration of at least one overwash event per year across the constructed foredune, if occurring elsewhere on the island) was reached. New overwash fans increased island stability by increasing interior island elevation. At every notch, areas of sparse vegetation increased, and the new foraging habitat was utilized by breeding pairs during the 2010 breeding season. However, the metric for long-term biological success (an increase to 37% sparsely vegetated habitat on the North End and an increase in piping plover productivity to 1.25 chicks fledged per breeding pair) has not yet been met: by 2010 there was an overall productivity of 1.2 chicks fledged per breeding pair and a 1.7% decrease in sparsely vegetated habitat.

  16. Sources of Peer Group Homogeneity

    ERIC Educational Resources Information Center

    Cohen, Jere M.

    1977-01-01

    Investigates how adolescent friendship groups become homogeneous. Analysis of 49 student friendship groups indicates that homophilic selection is most important for group homogeneity, conformity pressures are somewhat important, and disproportionate group leaving contributes nothing to homogeneity. The conclusion is that the magnitude of peer…

  17. HOMOGENEOUS NUCLEAR REACTOR

    DOEpatents

    Hammond, R.P.; Busey, H.M.

    1959-02-17

    Nuclear reactors of the homogeneous liquid fuel type are discussed. The reactor is comprised of an elongated closed vessel, vertically oriented, having a critical region at the bottom, a lower chimney structure extending from the critical region vertically upwardly and surrounded by heat exchanger coils, to a baffle region above which is located an upper chimney structure containing a catalyst functioning to recombine radiolytically dissociated moderator gases. In operation the liquid fuel circulates solely by convection from the critical region upwardly through the lower chimney and then downwardly through the heat exchanger to return to the critical region. The gases formed by radiolytic dissociation of the moderator are carried upwardly with the circulating liquid fuel and past the baffle into the region of the upper chimney, where they are recombined by the catalyst and condensed, thence returning through the heat exchanger to the critical region.

  18. Homogeneous quantum electrodynamic turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    1992-01-01

    The electromagnetic field equations and Dirac equations for oppositely charged wave functions are numerically time-integrated using a spatial Fourier method. The numerical approach used, a spectral transform technique, is based on a continuum representation of physical space. The coupled classical field equations contain a dimensionless parameter which sets the strength of the nonlinear interaction (as the parameter increases, interaction volume decreases). For a parameter value of unity, highly nonlinear behavior in the time-evolution of an individual wave function, analogous to ideal fluid turbulence, is observed. In the truncated Fourier representation which is numerically implemented here, the quantum turbulence is homogeneous but anisotropic and manifests itself in the nonlinear evolution of equilibrium modal spatial spectra for the probability density of each particle and also for the electromagnetic energy density. The results show that nonlinearly interacting fermionic wave functions quickly approach a multi-mode, dynamic equilibrium state, and that this state can be determined by numerical means.

  19. Experimental Simulation of the Radionuclide Behaviour in the Process of Creating Additional Safety Barriers in Solid Radioactive Waste Repositories Containing Irradiated Graphite

    NASA Astrophysics Data System (ADS)

    Pavliuk, A. O.; Kotlyarevskiy, S. G.; Bespala, E. V.; Zakarova, E. V.; Rodygina, N. I.; Ermolaev, V. M.; Proshin, I. M.; Volkova, A.

    2016-08-01

    Results of the experimental modeling of radionuclide behavior when creating additional safety barriers in solid radioactive waste repositories are presented. The experiments were run on a repository mockup containing solid radioactive waste fragments, including irradiated graphite. The repository mockup layout is given; the processes involving radionuclides that occur during barrier creation with a clayey solution and during subsequent barrier operation are investigated. The results obtained confirm the high anti-migration and anti-filtration properties of the clay used for the barrier creation, even under long-term excessive water saturation of the rocks confining the repository.

  20. Homogeneous nucleation kinetics

    NASA Technical Reports Server (NTRS)

    Rasmussen, D. H.; Appleby, M. R.; Leedom, G. L.; Babu, S. V.; Naumann, R. J.

    1983-01-01

    Homogeneous nucleation kinetics are rederived in a manner fundamentally similar to the approach of classical nucleation theory with the following modifications and improvements. First, the cluster is a parent phase cluster and does not require energization to the parent state. Second, the thermodynamic potential used to describe phase stability is a continuous function along the pathway of phase decomposition. Third, the kinetics of clustering corresponds directly to the diffusional flux of monomers through the cluster distribution and are formally similar to classical theory with the resulting kinetic equation modified by two terms in the preexponential factor. These terms correct for the influence of a supersaturation dependent clustering within the parent phase and for the influence of an asymmetrical cluster concentration as a function of cluster size at the critical cluster size. Fourth, the supersaturation dependence of the nucleation rate is of the same form as that given by classical nucleation theory. This supersaturation dependence must however be interpreted in terms of a size dependent surface tension. Finally, there are two scaling laws which describe supersaturation to either constant nucleation rate or to the thermodynamically determined physical spinodal.
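
    For reference, the classical nucleation theory rate expression whose preexponential factor the paper modifies has the standard textbook form (generic notation, not the paper's own; in LaTeX):

        J \;=\; J_0 \exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right), \qquad \Delta G^{*} \;=\; \frac{16\pi\,\sigma^{3}}{3\,(\Delta g_v)^{2}}

    where J_0 is the kinetic prefactor, \sigma the surface tension (size-dependent, in the paper's interpretation), and \Delta g_v the bulk free-energy difference per unit volume between the phases.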

  1. Light-created chemiluminescence

    NASA Astrophysics Data System (ADS)

    Vasil'ev, Rostislav F.; Tsaplev, Yuri B.

    2006-11-01

    The results of studies of light-created chemiluminescence are described systematically. Conditions for the transformation of a dark chemical reaction into a chemiluminescence reaction are considered. Examples of photosensitised and photoinduced processes as well as of analytical applications are given.

  2. Creating Collaborative Advantage.

    ERIC Educational Resources Information Center

    Huxham, Chris, Ed.

    Although interorganizational collaboration is becoming increasingly significant as a means of achieving organizational objectives, it is not an easy process to implement. Drawing on the work of authors with extensive experience, an accessible introduction to the theory and practice of creating collaborative advantage is presented in this volume.…

  3. Shear wave splitting hints at dynamical features of mantle convection: a global study of homogeneously processed source and receiver side upper mantle anisotropy

    NASA Astrophysics Data System (ADS)

    Walpole, J.; Wookey, J. M.; Masters, G.; Kendall, J. M.

    2013-12-01

    The asthenosphere is embroiled in the process of mantle convection. Its viscous properties allow it to flow around sinking slabs and deep cratonic roots as it is displaced by intruding material and dragged around by the moving layer above. As the asthenosphere flows it develops a crystalline fabric, with anisotropic crystals preferentially aligned in the direction of flow. Meanwhile, the lithosphere above deforms as it is squeezed and stretched by underlying tectonic processes, enabling anisotropic fabrics to develop, become fossilised in the rigid rock, and persist over vast spans of geological time. As a shear wave passes through an anisotropic medium it splits into two orthogonally polarised quasi shear waves that propagate at different velocities (a phenomenon known as shear wave splitting). By analysing the polarisation and the delay time of many split waves that have passed through a region it is possible to constrain the anisotropy of the medium in that region. This anisotropy is the key to revealing the deformation history of the deep Earth. In this study we present measurements of shear wave splitting recorded on S, SKS, and SKKS waves from earthquakes recorded at stations from the IRIS DMC catalogue (1976-2010). We have used a cluster analysis phase picking technique [1] to pick hundreds of thousands of high signal-to-noise waveforms in long period data. These picks are used to feed the broadband data into an automated processing workflow that recovers shear wave splitting parameters [2,3]. The workflow includes a new method for making source and receiver corrections, whereby the stacked error surfaces are used as input to the correction rather than a single set of parameters; this propagates uncertainty information into the final measurement. Using SKS, SKKS, and source corrected S, we recover good measurements of anisotropy beneath 1,569 stations. Using receiver corrected S we recover good measurements of anisotropy beneath 470 events. We compare

  4. Universum Inference and Corpus Homogeneity

    NASA Astrophysics Data System (ADS)

    Vogel, Carl; Lynch, Gerard; Janssen, Jerom

    Universum Inference is re-interpreted for assessment of corpus homogeneity in computational stylometry. Recent stylometric research quantifies strength of characterization within dramatic works by assessing the homogeneity of corpora associated with dramatic personas. A methodological advance is suggested to mitigate the potential for the assessment of homogeneity to be achieved by chance. Baseline comparison analysis is constructed for contributions to debates by nonfictional participants: the corpus analyzed consists of transcripts of US Presidential and Vice-Presidential debates from the 2000 election cycle. The corpus is also analyzed in translation to Italian, Spanish and Portuguese. Adding randomized categories makes assessments of homogeneity more conservative.

  5. Reciprocity theory of homogeneous reactions

    NASA Astrophysics Data System (ADS)

    Agbormbai, Adolf A.

    1990-03-01

    The reciprocity formalism is applied to the homogeneous gaseous reactions in which the structure of the participating molecules changes upon collision with one another, resulting in a change in the composition of the gas. The approach is applied to various classes of dissociation, recombination, rearrangement, ionizing, and photochemical reactions. It is shown that for the principle of reciprocity to be satisfied it is necessary that all chemical reactions exist in complementary pairs which consist of the forward and backward reactions. The backward reaction may be described by either the reverse or inverse process. The forward and backward processes must satisfy the same reciprocity equation. Because the number of dynamical variables is usually unbalanced on both sides of a chemical equation, it is necessary that this balance be established by including as many of the dynamical variables as needed before the reciprocity equation can be formulated. Statistical transformation models of the reactions are formulated. The models are classified under the titles free exchange, restricted exchange and simplified restricted exchange. The special equations for the forward and backward processes are obtained. The models are consistent with the H theorem and Le Chatelier's principle. The models are also formulated in the context of the direct simulation Monte Carlo method.

  6. Homogeneous Catalysis by Transition Metal Compounds.

    ERIC Educational Resources Information Center

    Mawby, Roger

    1988-01-01

    Examines four processes involving homogeneous catalysis which highlight the contrast between the simplicity of the overall reaction and the complexity of the catalytic cycle. Describes how catalysts provide circuitous routes in which all energy barriers are relatively low rather than lowering the activation energy for a single step reaction.…

  8. A comparison of the source processes of four Boso Peninsula slow slip events from 1996 to 2011 based on nearly homogeneous GNSS stations

    NASA Astrophysics Data System (ADS)

    Hirose, H.; Matsuzawa, T.; Kimura, T.

    2013-12-01

    Around the Boso Peninsula, Japan, slow slip events (SSEs) accompanied by earthquake swarms recur at intervals of four to seven years, associated with the subduction of the Philippine Sea Plate (PHS) from the Sagami trough beneath the Kanto area. The latest event occurred in October 2011 and was likely hastened by the great Tohoku earthquake (magnitude 9.0) in March 2011, while an earlier episode was likely delayed by an intraslab earthquake in 1987 (magnitude 6.7) that occurred just beneath the source area of the SSEs (Hirose et al., 2012). This suggests that the occurrence of the Boso SSEs is largely influenced by stress disturbances of the order of 0.1 MPa, indicating the sensitive nature of the SSE source area. This recurrence history is useful for understanding the frictional properties of the plate interface, because it is rare to observe recurrent slip events occurring on the interface at almost the same place with nearly the same observation coverage. The later four episodes (1996, 2002, 2007, 2011) were observed with the GNSS Earth Observation Network System (GEONET) operated by the Geospatial Information Authority of Japan (GSI) (Sagiya, 2004; Ozawa et al., 2003, 2007; Hirose et al., 2012). We invert displacement data for these four Boso SSEs to obtain the source process for each episode and to discuss the possible relation to the fluctuation in the recurrence intervals. The Network Inversion Filter (Segall and Matthews, 1997; Hirose and Obara, 2010) is applied to the GNSS data sets. We define the PHS plate configuration beneath the Kanto area based on the distribution of repeating earthquakes (Kimura et al., 2006) and a compilation of seismic reflection surveys (Takeda et al., 2007). There is a common slip area among the four SSEs in the eastern offshore region of the study area. Slip always starts in the offshore region and migrates to the west or to the north. This migration pattern roughly corresponds to the migration of the accompanying earthquake activity

  9. A homogeneous fluorometric assay platform based on novel synthetic proteins

    SciTech Connect

    Vardar-Schara, Goenuel; Krab, Ivo M.; Yi, Guohua; Su, Wei Wen (E-mail: wsu@hawaii.edu)

    2007-09-14

    Novel synthetic recombinant sensor proteins have been created to detect analytes in solution, in a rapid single-step 'mix and read' noncompetitive homogeneous assay process, based on modulating the Foerster resonance energy transfer (FRET) property of the sensor proteins upon binding to their targets. The sensor proteins comprise a protein scaffold that incorporates a specific target-capturing element, sandwiched by genetic fusion between two molecules that form a FRET pair. The utility of the sensor proteins was demonstrated via three examples, for detecting an anti-biotin Fab antibody, a His-tagged recombinant protein, and an anti-FLAG peptide antibody, respectively, all done directly in solution. The diversity of sensor-target interactions that we have demonstrated in this study points to a potentially universal applicability of the biosensing concept. The possibilities for integrating a variety of target-capturing elements with a common sensor scaffold predict a broad range of practical applications.

  10. Effects of sample homogenization on solid phase sediment toxicity

    SciTech Connect

    Anderson, B.S.; Hunt, J.W.; Newman, J.W.; Tjeerdema, R.S.; Fairey, W.R.; Stephenson, M.D.; Puckett, H.M.; Taberski, K.M.

    1995-12-31

    Sediment toxicity is typically assessed using homogenized surficial sediment samples. It has been recognized that homogenization alters sediment integrity and may result in changes in chemical bioavailability through oxidation-reduction or other chemical processes. In this study, intact (unhomogenized) sediment cores were taken from a Van Veen grab sampler and tested concurrently with sediment homogenate from the same sample in order to investigate the effect of homogenization on toxicity. Two different solid-phase toxicity test protocols were used for these comparisons. Results of amphipod exposures to samples from San Francisco Bay indicated minimal difference between intact and homogenized samples. Mean amphipod survival in intact cores relative to homogenates was similar at two contaminated sites. Mean survival was 34 and 33% in intact and homogenized samples, respectively, at Castro Cove. Mean survival was 41% and 57%, respectively, in intact and homogenized samples from Islais Creek. Studies using the sea urchin development protocol, modified for testing at the sediment/water interface, indicated considerably more toxicity in intact samples relative to homogenized samples from San Diego Bay. Measures of metal flux into the overlying water demonstrated greater flux of metals from the intact samples. Zinc flux was five times greater, and copper flux was twice as great in some intact samples relative to homogenates. Future experiments will compare flux of metals and organic compounds in intact and homogenized sediments to further evaluate the efficacy of using intact cores for solid phase toxicity assessment.

  11. STEAM STIRRED HOMOGENEOUS NUCLEAR REACTOR

    DOEpatents

    Busey, H.M.

    1958-06-01

    A homogeneous nuclear reactor utilizing a self-circulating liquid fuel is described. The reactor vessel is in the form of a vertically disposed tubular member having the lower end closed by the tube walls and the upper end closed by a removable flanged assembly. A spherical reaction shell is located in the lower end of the vessel and spaced from the inside walls. The reaction shell is perforated on its lower surface and is provided with a bundle of small-diameter tubes extending vertically upward from its top central portion. The reactor vessel is surrounded in the region of the reaction shell by a neutron reflector. The liquid fuel, which may be a solution of enriched uranyl sulfate in ordinary or heavy water, is maintained at a level within the reactor vessel of approximately the top of the tubes. The heat of the reaction which is created in the critical region within the spherical reaction shell forms steam bubbles which move upwardly through the tubes. The upward movement of these bubbles results in the forcing of the liquid fuel out of the top of these tubes, from where the fuel passes downwardly in the space between the tubes and the vessel wall, where it is cooled by heat exchangers. The fuel then re-enters the critical region in the reaction shell through the perforations in the bottom. The upper portion of the reactor vessel is provided with baffles to prevent the liquid fuel from splashing into this region, which is also provided with a recombiner apparatus for recombining the radiolytically dissociated moderator vapor, and a control means.

  12. Homogeneous crystal nucleation in polymers.

    PubMed

    Schick, Christoph; Androsch, R; Schmelzer, Juern W P

    2017-07-14

    The pathway of crystal nucleation significantly influences the structure and properties of semi-crystalline polymers. Crystal nucleation is normally heterogeneous at low supercooling, and homogeneous at high supercooling, of the polymer melt. Homogeneous nucleation in bulk polymers has been, so far, hardly accessible experimentally, and was even doubted to occur at all. This topical review summarizes experimental findings on homogeneous crystal nucleation in polymers. Recently developed fast scanning calorimetry, with cooling and heating rates up to 10^6 K s^-1, allows for detailed investigations of nucleation near and even below the glass transition temperature, including analysis of nuclei stability. As for other materials, the maximum homogeneous nucleation rate for polymers is located close to the glass transition temperature. In the experiments discussed here, it is shown that polymer nucleation is homogeneous at such temperatures. Homogeneous nucleation in polymers is discussed in the framework of classical nucleation theory. The majority of our observations are consistent with the theory. The discrepancies may guide further research, particularly experiments to progress theoretical development. Progress in the understanding of homogeneous nucleation is much needed, since most of the modelling approaches dealing with polymer crystallization exclusively consider homogeneous nucleation. This is also the basis for advancing theoretical approaches to the much more complex phenomena governing heterogeneous nucleation. © 2017 IOP Publishing Ltd.

  13. The Leadership Assignment: Creating Change.

    ERIC Educational Resources Information Center

    Calabrese, Raymond L.

    This book provides change-motivated leaders with an understanding of the change process and the tools to drive change. Eight change principles guide change agents in creating and sustaining change: prepare to lead change; knowledge is power; create empowering mental models; overcome resistance to change; lead change; accelerate the change process;…

  14. Locally homogeneous pp-waves

    NASA Astrophysics Data System (ADS)

    Globke, Wolfgang; Leistner, Thomas

    2016-10-01

    We show that every n-dimensional locally homogeneous pp-wave is a plane wave, provided it is indecomposable and its curvature operator, when acting on 2-forms, has rank greater than one. As a consequence we obtain that indecomposable, Ricci-flat locally homogeneous pp-waves are plane waves. This generalises a classical result by Jordan, Ehlers and Kundt in dimension 4. Several examples show that our assumptions on indecomposability and the rank of the curvature are essential.
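
    For orientation, a pp-wave in Brinkmann coordinates has the standard metric (generic textbook form, not taken from this paper; in LaTeX notation):

        ds^2 \;=\; 2\,du\,dv \;+\; H(u, x^1, \dots, x^{n-2})\,du^2 \;+\; \sum_{i=1}^{n-2} (dx^i)^2

    and the plane waves appearing in the theorem are the special case in which H is a quadratic form in the transverse coordinates x^i with u-dependent coefficients.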

  15. Operator estimates in homogenization theory

    NASA Astrophysics Data System (ADS)

    Zhikov, V. V.; Pastukhova, S. E.

    2016-06-01

    This paper gives a systematic treatment of two methods for obtaining operator estimates: the shift method and the spectral method. Though substantially different in mathematical technique and physical motivation, these methods produce basically the same results. Besides the classical formulation of the homogenization problem, other formulations of the problem are also considered: homogenization in perforated domains, the case of an unbounded diffusion matrix, non-self-adjoint evolution equations, and higher-order elliptic operators. Bibliography: 62 titles.

  16. Integration of a nurse navigator into the triage process for patients with non-small-cell lung cancer: creating systematic improvements in patient care.

    PubMed

    Zibrik, K; Laskin, J; Ho, C

    2016-06-01

    Nurse navigation is a developing facet of oncology care. The concept of patient navigation was originally created in 1990 at the Harlem Hospital Center in New York City as a strategy to assist vulnerable and socially disadvantaged populations with timely access to breast cancer care. Since the mid-1990s, navigation programs have expanded to include many patient populations that require specialized management and prompt access to diagnostic and clinical resources. Advanced non-small-cell lung cancer is ideally suited for navigation to facilitate efficient assessment in this fragile patient population and to ensure timely results of molecular tests for first-line therapy with appropriately targeted agents. At the BC Cancer Agency, nurse navigator involvement with thoracic oncology triage has been demonstrated to increase the proportion of patients receiving systemic treatment, to shorten the time to delivery of systemic treatment, and to increase the rate of molecular testing and the number of patients with molecular testing results available at time of initial consultation. Insights gained through the start-up process are briefly discussed, and a framework for implementation at other institutions is outlined.

  17. Integration of a nurse navigator into the triage process for patients with non-small-cell lung cancer: creating systematic improvements in patient care

    PubMed Central

    Zibrik, K.; Laskin, J.; Ho, C.

    2016-01-01

    Nurse navigation is a developing facet of oncology care. The concept of patient navigation was originally created in 1990 at the Harlem Hospital Center in New York City as a strategy to assist vulnerable and socially disadvantaged populations with timely access to breast cancer care. Since the mid-1990s, navigation programs have expanded to include many patient populations that require specialized management and prompt access to diagnostic and clinical resources. Advanced non-small-cell lung cancer is ideally suited for navigation to facilitate efficient assessment in this fragile patient population and to ensure timely results of molecular tests for first-line therapy with appropriately targeted agents. At the BC Cancer Agency, nurse navigator involvement with thoracic oncology triage has been demonstrated to increase the proportion of patients receiving systemic treatment, to shorten the time to delivery of systemic treatment, and to increase the rate of molecular testing and the number of patients with molecular testing results available at time of initial consultation. Insights gained through the start-up process are briefly discussed, and a framework for implementation at other institutions is outlined. PMID:27330366

  18. Homogeneous coordinates in motion correction.

    PubMed

    Zahneisen, Benjamin; Ernst, Thomas

    2016-01-01

    Prospective motion correction for MRI and other imaging modalities is commonly based on the assumption of affine motion, i.e., rotations, shearing, scaling and translations. In addition it often involves transformations between different reference frames, especially for applications with an external tracking device. The goal of this work is to develop a computational framework for motion correction based on homogeneous transforms. The homogeneous representation of affine transformations uses 4 × 4 transformation matrices applied to four-dimensional augmented vectors. It is demonstrated how homogeneous transforms can be used to describe the motion of slice objects during an MRI scan. Furthermore, we extend the concept of homogeneous transforms to gradient and k-space vectors, and show that the fourth dimension of an augmented k-space vector encodes the complex phase of the corresponding signal sample due to translations. The validity of describing motion tracking in real space and k-space using homogeneous transformations alone is demonstrated in phantom experiments. Homogeneous transformations allow for a conceptually simple, consistent and computationally efficient theoretical framework for motion correction applications. © 2015 Wiley Periodicals, Inc.
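
    A minimal sketch of the homogeneous-transform idea (not the paper's code; the rotation angle, translation, and point are made-up example values): a rotation and a translation are composed into a single 4x4 matrix and applied to an augmented position vector [x, y, z, 1] in one product.

        import numpy as np

        theta = np.deg2rad(10.0)            # example rotation about the z-axis
        R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                      [np.sin(theta),  np.cos(theta), 0.0],
                      [0.0,            0.0,           1.0]])
        t = np.array([1.0, 0.0, 2.0])       # example translation (e.g. mm)

        T = np.eye(4)                       # 4x4 homogeneous transform
        T[:3, :3] = R
        T[:3, 3] = t

        p = np.array([10.0, 5.0, 0.0, 1.0])  # augmented point
        p_moved = T @ p                       # rotation and translation together
        print(p_moved[:3])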

  19. Large Eddy Simulation of Homogeneous Rotating Turbulence

    NASA Technical Reports Server (NTRS)

    Squires, Kyle D.; Mansour, Nagi N.; Cambon, Claude; Chasnov, Jeffrey R.; Kutler, Paul (Technical Monitor)

    1994-01-01

    Study of turbulent flows in rotating reference frames has proven to be one of the more challenging areas of turbulence research. The large number of theoretical, experimental, and computational studies performed over the years have demonstrated that the effect of solid-body rotation on turbulent flows is subtle and remains exceedingly difficult to predict. Because of the complexities associated with non-homogeneous turbulence, it is worthwhile to examine the effect of steady system rotation on the evolution of an initially isotropic turbulent flow. The assumption of statistical homogeneity considerably simplifies analysis and computation; calculation of homogeneous turbulence is further motivated since it possesses the essential physics found in more complex rotating flows. The principal objectives of the present study have therefore been to increase our fundamental understanding of turbulent flows in rotating reference frames through an examination of the asymptotic state of homogeneous rotating turbulence; particularly as to the existence of an asymptotic state which is self similar. Knowledge of an asymptotic similarity state permits prediction of the ultimate statistical evolution of the flow without requiring detailed knowledge of the complex, and not well understood, non-linear transfer processes. Aside from examination of possible similarity states in rotating turbulence, of further interest in this study has been an examination of the degree to which solid-body rotation induces a two-dimensional state in an initially isotropic flow.

  20. Revisiting Shock Initiation Modeling of Homogeneous Explosives

    NASA Astrophysics Data System (ADS)

    Partom, Yehuda

    2013-04-01

    Shock initiation of homogeneous explosives has been a subject of research since the 1960s, with neat and sensitized nitromethane as the main materials for experiments. A shock initiation model of homogeneous explosives was established in the early 1960s. It involves a thermal explosion event at the shock entrance boundary, which develops into a superdetonation that overtakes the initial shock. In recent years, Sheffield and his group, using accurate experimental tools, were able to observe details of buildup of the superdetonation. There are many papers on modeling shock initiation of heterogeneous explosives, but there are only a few papers on modeling shock initiation of homogeneous explosives. In this article, bulk reaction reactive flow equations are used to model homogeneous shock initiation in an attempt to reproduce experimental data of Sheffield and his group. It was possible to reproduce the main features of the shock initiation process, including thermal explosion, superdetonation, input shock overtake, overdriven detonation after overtake, and the beginning of decay toward Chapman-Jouget (CJ) detonation. The time to overtake (TTO) as function of input pressure was also calculated and compared to the experimental TTO.
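
    As a generic reference point (bulk-reaction formulations vary, and this is not necessarily the article's exact rate law), reactive flow models of this kind evolve a reaction progress variable \lambda with an Arrhenius-type bulk rate. In LaTeX notation:

        \frac{d\lambda}{dt} \;=\; Z\,(1-\lambda)\,\exp\!\left(-\frac{T_a}{T}\right)

    where Z is a rate constant and T_a an activation temperature; coupled to the flow equations, a source term of this form can produce the thermal-explosion onset and subsequent superdetonation buildup described above.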

  1. A Homogeneous Billet Layer Casting Fabrication Method

    NASA Astrophysics Data System (ADS)

    Ren, FengLi; Wang, JunGe; Ge, HongHao; Li, Jun; Hu, Qiaodan; Nadendla, Hari-Babu; Xia, MingXu; Li, JianGuo

    2017-10-01

    A novel additive casting approach, termed layer casting (LC), was proposed to fabricate ingots with homogeneous composition and grain structure distribution. Ingots of Al-4.5 wt pct Cu were fabricated using both the conventional and the novel method to verify the feasibility of this approach. The results show that the novel processing not only alleviates macrosegregation but also reduces shrinkage cavities and improves the tensile properties in the as-cast condition.

  2. Using high-performance ¹H NMR (HP-qNMR®) for the certification of organic reference materials under accreditation guidelines--describing the overall process with focus on homogeneity and stability assessment.

    PubMed

    Weber, Michael; Hellriegel, Christine; Rueck, Alexander; Wuethrich, Juerg; Jenks, Peter

    2014-05-01

    Quantitative NMR spectroscopy (qNMR) is gaining interest across both analytical and industrial research applications and has become an essential tool for content assignment and the quantitative determination of impurities. The key benefits of using qNMR as a measurement method for the purity determination of organic molecules are discussed, with emphasis on the ability to establish traceability to the International System of Units (SI). The work describes a routine certification procedure from the point of view of a commercial producer of certified reference materials (CRM) under ISO/IEC 17025 and ISO Guide 34 accreditation, which resulted in a set of essential references for ¹H qNMR measurements; the relevant application data for these substances are given. The overall process includes specific selection criteria, pre-tests, experimental conditions, and homogeneity and stability studies. The advantages of an accelerated stability study over the classical stability-test design are shown with respect to shelf-life determination and shipping conditions. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
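
    For context, content assignment by ¹H qNMR against an internal standard rests on the standard working equation (textbook form; the paper's exact notation may differ). In LaTeX notation:

        P_x \;=\; P_{\mathrm{std}} \cdot \frac{I_x}{I_{\mathrm{std}}} \cdot \frac{N_{\mathrm{std}}}{N_x} \cdot \frac{M_x}{M_{\mathrm{std}}} \cdot \frac{m_{\mathrm{std}}}{m_x}

    where I is the integrated signal area, N the number of ¹H nuclei contributing to the signal, M the molar mass, m the weighed mass, and P the purity of the analyte x and the internal standard; SI traceability follows from the certified purity of the standard.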

  3. (Ultra) high pressure homogenization for continuous high pressure sterilization of pumpable foods - a review.

    PubMed

    Georget, Erika; Miller, Brittany; Callanan, Michael; Heinz, Volker; Mathys, Alexander

    2014-01-01

    Bacterial spores have a strong resistance to both chemical and physical hurdles and create a risk for the food industry, which has been tackled by applying high-thermal-intensity treatments to sterilize food. These strong thermal treatments reduce the organoleptic and nutritional properties of food, and alternatives are actively being sought. Innovative hurdles offer an alternative way to inactivate bacterial spores. In particular, recent technological developments have enabled a new generation of high pressure homogenizers working at pressures up to 400 MPa, thus opening new opportunities for high pressure sterilization of foods. In this short review, we summarize the work conducted on (ultra) high pressure homogenization ((U)HPH) to inactivate endospores in model and food systems. Specific attention is given to process parameters (pressure, inlet, and valve temperatures). This review gathers the current state of the art and underlines the potential of UHPH sterilization of pumpable foods while highlighting the needs for future work.

  4. (Ultra) High Pressure Homogenization for Continuous High Pressure Sterilization of Pumpable Foods – A Review

    PubMed Central

    Georget, Erika; Miller, Brittany; Callanan, Michael; Heinz, Volker; Mathys, Alexander

    2014-01-01

    Bacterial spores have a strong resistance to both chemical and physical hurdles and create a risk for the food industry, which has been tackled by applying high-thermal-intensity treatments to sterilize food. These strong thermal treatments reduce the organoleptic and nutritional properties of food, and alternatives are actively being sought. Innovative hurdles offer an alternative way to inactivate bacterial spores. In particular, recent technological developments have enabled a new generation of high pressure homogenizers working at pressures up to 400 MPa, thus opening new opportunities for high pressure sterilization of foods. In this short review, we summarize the work conducted on (ultra) high pressure homogenization ((U)HPH) to inactivate endospores in model and food systems. Specific attention is given to process parameters (pressure, inlet, and valve temperatures). This review gathers the current state of the art and underlines the potential of UHPH sterilization of pumpable foods while highlighting the needs for future work. PMID:25988118

  5. Political homogeneity can nurture threats to research validity.

    PubMed

    Chambers, John R; Schlenker, Barry R

    2015-01-01

    Political homogeneity within a scientific field nurtures threats to the validity of many research conclusions by allowing ideologically compatible values to influence interpretations, by minimizing skepticism, and by creating premature consensus. Although validity threats can crop up in any research, the usual corrective activities in science are more likely to be minimized and delayed.

  6. Proficiency testing in immunopathology: establishing the homogeneity of test material.

    PubMed

    Esterman, Adrian; Javanovich, Sue; McEvoy, Robert; Roberts-Thomson, Peter

    2005-04-01

    To develop a technique for homogeneity testing of serum aliquot samples suitable for use in the Quality Assurance Program in Clinical Immunology (QAP Pty Ltd), albumin was selected as the surrogate protein marker for the product to be tested, and the coefficient of dispersion (COD) was calculated as the measure of homogeneity. To detect changes in the average level of homogeneity, cumulative sum control (cusum) charts were used. The COD (%) for each triplicate reading of albumin obtained from 34 specimens was normally distributed with a mean of 0.49% and a standard deviation of 0.25%. In industrial quality control schemes the action line is generally set at the upper 99% confidence limit; hence any triplicate sample would be considered to have acceptable homogeneity if the COD was ≤ 1.08%. Cusum charts were created to monitor albumin homogeneity over time. The use of albumin measurement as the surrogate appears statistically suitable for homogeneity testing in QAP programs for immunodiagnostic testing. Cusum charts are particularly useful for monitoring such homogeneity testing.
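
    A minimal sketch of the screening logic described above (assumed definitions: the COD is computed here as the percentage dispersion of each triplicate about its mean, and the albumin readings are hypothetical; the paper's exact formula may differ):

        import numpy as np

        triplicates = np.array([
            [41.2, 41.5, 41.1],
            [40.8, 41.9, 41.0],   # hypothetical albumin readings (g/L)
        ])

        # Per-triplicate coefficient of dispersion, checked against the
        # study's action line of 1.08%.
        cod = 100 * triplicates.std(axis=1, ddof=1) / triplicates.mean(axis=1)
        print("COD (%):", cod, "acceptable:", cod <= 1.08)

        # Cusum of deviations from the study's long-run mean COD (0.49%),
        # used to flag drift in average homogeneity over time.
        cusum = np.cumsum(cod - 0.49)
        print("cusum:", cusum)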

  7. AQUEOUS HOMOGENEOUS REACTOR TECHNICAL PANEL REPORT

    SciTech Connect

    Diamond, D.J.; Bajorek, S.; Bakel, A.; Flanagan, G.; Mubayi, V.; Skarda, R.; Staudenmeier, J.; Taiwo, T.; Tonoike, K.; Tripp, C.; Wei, T.; Yarsky, P.

    2010-12-03

    Considerable interest has been expressed in developing a stable U.S. production capacity for medical isotopes, particularly molybdenum-99 (99Mo). This is motivated by recent reductions in production and supply worldwide. Consistent with U.S. nonproliferation objectives, any new production capability should not use highly enriched uranium fuel or targets. Consequently, Aqueous Homogeneous Reactors (AHRs) are under consideration for potential 99Mo production using low-enriched uranium. Although the Nuclear Regulatory Commission (NRC) has guidance to facilitate the licensing process for non-power reactors, that guidance is focused on reactors with fixed, solid fuel and hence is not applicable to an AHR. A panel was convened to study the technical issues associated with normal operation and potential transients and accidents of an AHR that might be designed for isotope production. The panel has produced the requisite AHR licensing guidance for three chapters that exist now for non-power reactor licensing: Reactor Description, Reactor Coolant Systems, and Accident Analysis. The guidance is in two parts for each chapter: 1) the standard format and content a licensee would use and 2) the standard review plan the NRC staff would use. This guidance takes into account the unique features of an AHR, such as the fuel being in solution; the fission product barriers being the vessel and attached systems; the production and release of radiolytic and fission product gases and their impact on operations and their control by a gas management system; and the movement of fuel into and out of the reactor vessel.

  8. Creating New Incentives for Risk Identification and Insurance Process for the Electric Utility Industry (initial award through Award Modification 2); Energy & Risk Transfer Assessment (Award Modifications 3 - 6)

    SciTech Connect

    Michael Ebert

    2008-02-28

    This is the final report for the DOE-NETL grant entitled 'Creating New Incentives for Risk Identification & Insurance Processes for the Electric Utility Industry' and later, 'Energy & Risk Transfer Assessment'. It reflects work done on projects from 15 August 2004 to 29 February 2008. Projects were on a variety of topics, including commercial insurance for electrical utilities, the Electrical Reliability Organization, cost recovery by Gulf State electrical utilities after major hurricanes, and review of state energy emergency plans. This Final Technical Report documents and summarizes all work performed during the award period, which in this case is from 15 August 2004 (date of notification of original award) through 29 February 2008. This report presents this information in a comprehensive, integrated fashion that clearly shows a logical and synergistic research trajectory, and is augmented with findings and conclusions drawn from the research as a whole. Four major research projects were undertaken and completed during the 42 month period of activities conducted and funded by the award; these are: (1) Creating New Incentives for Risk Identification and Insurance Process for the Electric Utility Industry (also referred to as the 'commercial insurance' research). Three major deliverables were produced: a pre-conference white paper, a two-day facilitated stakeholders workshop conducted at George Mason University, and a post-workshop report with findings and recommendations. All deliverables from this work are published on the CIP website at http://cipp.gmu.edu/projects/DoE-NETL-2005.php. (2) The New Electric Reliability Organization (ERO): an examination of critical issues associated with governance, standards development and implementation, and jurisdiction (also referred to as the 'ERO study'). Four major deliverables were produced: a series of preliminary memoranda for the staff of the Office of Electricity Delivery and Energy Reliability ('OE'), an ERO interview

  9. Dynamics of compact homogeneous universes

    SciTech Connect

    Tanimoto, M.; Koike, T.; Hosoya, A.

    1997-01-01

    A complete description of dynamics of compact locally homogeneous universes is given, which, in particular, includes explicit calculations of Teichmüller deformations and careful counting of dynamical degrees of freedom. We regard each of the universes as a simply connected four-dimensional space–time with identifications by the action of a discrete subgroup of the isometry group. We then reduce the identifications defined by the space–time isometries to ones in a homogeneous section, and find a condition that such spatial identifications must satisfy. This is essential for explicit construction of compact homogeneous universes. Some examples are demonstrated for Bianchi II, VI₀, VII₀, and I universal covers. © 1997 American Institute of Physics.

  10. Harmonic analysis of homogeneous networks.

    PubMed

    Wolfe, W J; Rothman, J A; Chang, E H; Aultman, W; Ripton, G

    1995-01-01

    We introduce a generalization of mutually inhibitory networks called homogeneous networks. Such networks have symmetric connection strength matrices that are circulant (one-dimensional case) or block circulant with circulant blocks (two-dimensional case). Fourier harmonics provide universal eigenvectors, and we apply them to several homogeneous examples: k-wta, k-cluster, on-center/off-surround, and the assignment problem. We also analyze one nonhomogeneous case: the subset-sum problem. We present the results of 10,000 trials on a 50-node k-cluster problem and 100 trials on a 25-node subset-sum problem.
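
    The key property this abstract relies on, that Fourier harmonics are universal eigenvectors of any circulant connection-strength matrix, is easy to verify numerically. A minimal sketch in Python/NumPy (the matrix below is an arbitrary random circulant, not one of the paper's networks):

        import numpy as np

        n = 8
        first_row = np.random.randn(n)  # arbitrary circulant generator
        # Circulant connection-strength matrix: C[i, j] = c[(j - i) mod n]
        C = np.array([[first_row[(j - i) % n] for j in range(n)] for i in range(n)])

        F = np.fft.fft(np.eye(n)) / np.sqrt(n)  # columns are Fourier harmonics
        eigvals = np.fft.fft(first_row)         # DFT of the generating row

        # Each harmonic f_k satisfies C @ f_k = eigvals[k] * f_k
        for k in range(n):
            assert np.allclose(C @ F[:, k], eigvals[k] * F[:, k])
        print("All Fourier harmonics are eigenvectors of the circulant matrix.")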

  11. Broken Ergodicity in Ideal, Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John; Fu, Terry; Nguyen, Phu; Shum, Victor

    2010-01-01

    We discuss the statistical mechanics of numerical models of ideal homogeneous, incompressible turbulence and their relevance for dissipative fluids and magnetofluids. These numerical models are based on Fourier series, and the relevant statistical theory predicts that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. However, numerical simulations clearly show that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We explain this phenomenon in terms of 'broken ergodicity', which is defined to occur when dynamical behavior does not match ensemble predictions on very long time-scales. We review the theoretical basis of broken ergodicity, apply it to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence, and show new results from simulations using GPU (graphical processing unit) computers.

  12. The OPtimising HEalth LIterAcy (Ophelia) process: study protocol for using health literacy profiling and community engagement to create and implement health reform

    PubMed Central

    2014-01-01

    Background Health literacy is a multi-dimensional concept comprising a range of cognitive, affective, social, and personal skills and attributes. This paper describes the research and development protocol for a large communities-based collaborative project in Victoria, Australia that aims to identify and respond to health literacy issues for people with chronic conditions. The project, called Ophelia (OPtimising HEalth LIterAcy) Victoria, is a partnership between two universities, eight service organisations and the Victorian Government. Based on the identified issues, it will develop and pilot health literacy interventions across eight disparate health services to inform the creation of a health literacy response framework to improve health outcomes and reduce health inequalities. Methods/Design The protocol draws on many inputs including the experience of the partners in previous co-creation and roll-out of large-scale health-promotion initiatives. Three key conceptual models/discourses inform the protocol: intervention mapping, quality improvement collaboratives, and realist synthesis. The protocol is outcomes-oriented and focuses on two key questions: 'What are the health literacy strengths and weaknesses of clients of participating sites?', and 'How do sites interpret and respond to these in order to achieve positive health and equity outcomes for their clients?'. The process has six steps in three main phases. The first phase is a needs assessment that uses the Health Literacy Questionnaire (HLQ), a multi-dimensional measure of health literacy, to identify common health literacy needs among clients. The second phase involves front-line staff and management within each service organisation in co-creating intervention plans to strategically respond to the identified local needs. The third phase will trial the interventions within each site to determine if the site can improve identified limitations to service access and/or health outcomes.

  13. Homogeneous cooling state of frictionless rod particles

    NASA Astrophysics Data System (ADS)

    Rubio-Largo, S. M.; Alonso-Marroquin, F.; Weinhart, T.; Luding, S.; Hidalgo, R. C.

    2016-02-01

    In this work, we report some theoretical results on granular gases consisting of frictionless 3D rods with low energy dissipation. We performed simulations on the temporal evolution of soft spherocylinders, using a molecular dynamics algorithm implemented on GPU architecture. A homogeneous cooling state for rods, where the time dependence of the system's intensive variables occurs only through a global granular temperature, has been identified. We have found a homogeneous cooling process in excellent agreement with Haff's law when an adequate rescaling time τ(ξ) is used, whose value depends on the particle elongation ξ and the restitution coefficient. It was further found that scaled particle velocity distributions remain approximately Gaussian regardless of the particle shape. Similarly to a system of ellipsoids, energy equipartition between rotational and translational degrees of freedom was better satisfied as one gets closer to the elastic limit. Taking advantage of scaling properties, we have numerically determined the general functional form of the magnitude Dc(ξ), which describes the efficiency of the energy interchange between rotational and translational degrees of freedom, as well as its dependence on particle shape. We have detected a range of particle elongations (1.5 < ξ < 4.0) where the average energy transfer between the rotational and translational degrees of freedom is greater for spherocylinders than for homogeneous ellipsoids with the same aspect ratio.
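
    Haff's law, which the abstract reports the rods to follow in rescaled time, is compact enough to state in code. A minimal sketch (the τ values are hypothetical stand-ins for the paper's τ(ξ), used only to show the collapse):

        import numpy as np

        def haff_temperature(t, T0, tau):
            """Granular temperature under Haff's law with rescaling time tau."""
            return T0 / (1.0 + t / tau) ** 2

        T0 = 1.0
        t = np.linspace(0.0, 50.0, 6)
        for tau in (5.0, 10.0):          # hypothetical tau(xi) values
            T = haff_temperature(t, T0, tau)
            s = t / tau                  # rescaled time: all curves collapse
            assert np.allclose(T / T0, 1.0 / (1.0 + s) ** 2)
            print(f"tau = {tau:4.1f}:", np.round(T, 4))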

  14. A compact setup to study homogeneous nucleation and condensation

    NASA Astrophysics Data System (ADS)

    Karlsson, Mattias; Alxneit, Ivo; Rütten, Frederik; Wuillemin, Daniel; Tschudi, Hans Rudolf

    2007-03-01

    An experiment is presented to study homogeneous nucleation and the subsequent droplet growth at high temperatures and high pressures in a compact setup that does not use moving parts. Nucleation and condensation are induced in an adiabatic, stationary expansion of the vapor and an inert carrier gas through a Laval nozzle. The adiabatic expansion is driven against atmospheric pressure by pressurized inert gas, its mass flow carefully controlled. This allows us to avoid large pumps or vacuum storage tanks. Because we eventually want to study the homogeneous nucleation and condensation of zinc, the use of carefully chosen materials is required that can withstand pressures of up to 10⁶ Pa, resulting from mass flow rates of up to 600 lN min⁻¹, and temperatures up to 1200 K in the presence of highly corrosive zinc vapor. To observe the formation of droplets, a laser beam propagates along the axis of the nozzle and the light scattered by the droplets is detected perpendicularly to the nozzle axis. An ICCD camera records the scattered light, spatially resolved, through fused silica windows in the diverging part of the nozzle, and detects nucleation and condensation coherently in a single exposure. For the data analysis, a model is needed to describe the isentropic core part of the flow along the nozzle axis. The model must incorporate the laws of fluid dynamics and the nucleation and condensation process, and it has to predict the size distribution of the particles created (PSD) at every position along the nozzle axis. Assuming Rayleigh scattering, the intensity of the scattered light can then be calculated from the second moment of the PSD.
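
    The final remark lends itself to a small numerical check. Under Rayleigh scattering each droplet contributes a signal proportional to r⁶, i.e. to its volume squared, so the total intensity tracks the second moment of the PSD expressed in droplet volume. A sketch with a hypothetical lognormal PSD (this is our reading of the abstract, not the authors' analysis code):

        import numpy as np

        rng = np.random.default_rng(0)
        radii = rng.lognormal(mean=0.0, sigma=0.3, size=100_000)  # hypothetical PSD
        volumes = (4.0 / 3.0) * np.pi * radii**3

        I_direct = np.sum(radii**6)                                # sum of r^6 signals
        I_moment = np.sum(volumes**2) / ((4.0 / 3.0) * np.pi)**2   # via 2nd moment of PSD
        assert np.isclose(I_direct, I_moment)
        print(f"scattered intensity tracks the second moment: {I_moment:.3e}")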

  15. A compact setup to study homogeneous nucleation and condensation.

    PubMed

    Karlsson, Mattias; Alxneit, Ivo; Rütten, Frederik; Wuillemin, Daniel; Tschudi, Hans Rudolf

    2007-03-01

    An experiment is presented to study homogeneous nucleation and the subsequent droplet growth at high temperatures and high pressures in a compact setup that does not use moving parts. Nucleation and condensation are induced in an adiabatic, stationary expansion of the vapor and an inert carrier gas through a Laval nozzle. The adiabatic expansion is driven against atmospheric pressure by pressurized inert gas, its mass flow carefully controlled. This allows us to avoid large pumps or vacuum storage tanks. Because we eventually want to study the homogeneous nucleation and condensation of zinc, the use of carefully chosen materials is required that can withstand pressures of up to 10⁶ Pa, resulting from mass flow rates of up to 600 lN min⁻¹, and temperatures up to 1200 K in the presence of highly corrosive zinc vapor. To observe the formation of droplets, a laser beam propagates along the axis of the nozzle and the light scattered by the droplets is detected perpendicularly to the nozzle axis. An ICCD camera records the scattered light, spatially resolved, through fused silica windows in the diverging part of the nozzle, and detects nucleation and condensation coherently in a single exposure. For the data analysis, a model is needed to describe the isentropic core part of the flow along the nozzle axis. The model must incorporate the laws of fluid dynamics and the nucleation and condensation process, and it has to predict the size distribution of the particles created (PSD) at every position along the nozzle axis. Assuming Rayleigh scattering, the intensity of the scattered light can then be calculated from the second moment of the PSD.

  16. Homogeneous Pt-bimetallic Electrocatalysts

    SciTech Connect

    Wang, Chao; Chi, Miaofang; More, Karren Leslie; Markovic, Nenad; Stamenkovic, Vojislav

    2011-01-01

    Alloying has shown enormous potential for tailoring the atomic and electronic structures, and improving the performance of catalytic materials. Systematic studies of alloy catalysts are, however, often compromised by inhomogeneous distribution of alloying components. Here we introduce a general approach for the synthesis of monodispersed and highly homogeneous Pt-bimetallic alloy nanocatalysts. Pt₃M (where M = Fe, Ni, or Co) nanoparticles were prepared by an organic solvothermal method and then supported on high surface area carbon. These catalysts attained a homogeneous distribution of elements, as demonstrated by atomic-scale elemental analysis using scanning transmission electron microscopy. They also exhibited high catalytic activities for the oxygen reduction reaction (ORR), with improvement factors of 2-3 versus conventional Pt/carbon catalysts. The measured ORR catalytic activities for Pt₃M nanocatalysts validated the volcano curve established on extended surfaces, with Pt₃Co being the most active alloy.

  17. Entanglement Created by Dissipation

    SciTech Connect

    Alharbi, Abdullah F.; Ficek, Zbigniew

    2011-10-27

    A technique for entangling closely separated atoms by the process of dissipative spontaneous emission is presented. The system considered is composed of two non-identical two-level atoms separated by a quarter wavelength of a driven standing wave laser field. At this atomic distance, only one of the atoms can be addressed by the laser field. In addition, we arrange the atomic dipole moments to be oriented relative to the inter-atomic axis such that the dipole-dipole interaction between the atoms is zero at this specific distance. It is shown that entanglement can be created between the atoms on demand by tuning the Rabi frequency of the driving field to the difference between the atomic transition frequencies. The amount of entanglement created depends on the ratio between the damping rates of the atoms, but is independent of the frequency difference between the atoms. We also find that the transient buildup of entanglement between the atoms may differ dramatically for different initial atomic conditions.

  18. The Art of Gymnastics: Creating Sequences.

    ERIC Educational Resources Information Center

    Rovegno, Inez

    1988-01-01

    Offering students opportunities for creating movement sequences in gymnastics allows them to understand the essence of gymnastics, have creative experiences, and learn about themselves. The process of creating sequences is described. (MT)

  19. The Birth and Re-Birth of the ISBDs: Process and Procedures for Creating and Revising the International Standard Bibliographic Descriptions [and] Section on Bibliography--Review of Activities, 1999-2000.

    ERIC Educational Resources Information Center

    Byrum, John D.

    This document contains two papers. The first paper discusses the process and procedures for creating and revising the ISBD (International Standard Bibliographic Description), including historical background from 1969 to the present, a description of revision projects, and a chart that summarizes the history and current status of the full range of…

  20. Multifractal spectra in homogeneous shear flow

    NASA Technical Reports Server (NTRS)

    Deane, A. E.; Keefe, L. R.

    1988-01-01

    Employing numerical simulations of 3-D homogeneous shear flow, we calculated the associated multifractal spectra of the energy dissipation, scalar dissipation, and vorticity fields. The results for 128³ simulations of this flow, and those obtained in recent experiments that analyzed 1- and 2-D intersections of atmospheric and laboratory flows, are in some agreement. A two-scale Cantor set model of the energy cascade process, which describes the experimental results from 1-D intersections quite well, describes the 3-D results only marginally.
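
    The two-scale Cantor set (binomial cascade) model mentioned above has a closed-form multifractal spectrum that is easy to evaluate. A short sketch (the weight p = 0.7 is the value often quoted for turbulence cascades, used here purely as an assumption):

        import numpy as np

        p = 0.7                          # assumed cascade weight
        q = np.linspace(-10.0, 10.0, 2001)

        # Mass exponents tau(q) from the partition sum: sum_i mu_i^q ~ eps^tau(q)
        tau = -np.log2(p ** q + (1.0 - p) ** q)

        # Legendre transform yields the multifractal spectrum f(alpha)
        alpha = np.gradient(tau, q)
        f = q * alpha - tau

        print(f"alpha range: [{alpha.min():.3f}, {alpha.max():.3f}]")
        print(f"max f(alpha) = {f.max():.3f}  (the peak equals the support dimension, 1)")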

  1. Variable valve timing in a homogenous charge compression ignition engine

    DOEpatents

    Lawrence, Keith E.; Faletti, James J.; Funke, Steven J.; Maloney, Ronald P.

    2004-08-03

    The present invention relates generally to the field of homogeneous charge compression ignition engines, in which fuel is injected when the cylinder piston is relatively close to the bottom dead center position of its compression stroke. The fuel mixes with air in the cylinder during the compression stroke to create a relatively lean homogeneous mixture that preferably ignites when the piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, reduced performance, engine misfire, or even engine damage can result. The present invention utilizes internal exhaust gas recirculation and/or compression ratio control to control the timing of ignition events and combustion duration in homogeneous charge compression ignition engines. Thus, at least one electro-hydraulic assist actuator is provided that is capable of mechanically engaging at least one cam-actuated intake and/or exhaust valve.

  2. Creating visual explanations improves learning.

    PubMed

    Bobek, Eliza; Tversky, Barbara

    2016-01-01

    Many topics in science are notoriously difficult for students to learn. Mechanisms and processes outside student experience present particular challenges. While instruction typically involves visualizations, students usually explain in words. Because visual explanations can show parts and processes of complex systems directly, creating them should have benefits beyond creating verbal explanations. We compared learning from creating visual or verbal explanations for two STEM domains, a mechanical system (bicycle pump) and a chemical system (bonding). Both kinds of explanations were analyzed for content, and learning was assessed by a post-test. For the mechanical system, creating a visual explanation increased understanding, particularly for participants of low spatial ability. For the chemical system, creating both visual and verbal explanations improved learning without new teaching. Creating a visual explanation was superior and benefited participants of both high and low spatial ability. Visual explanations often included crucial yet invisible features. The greater effectiveness of visual explanations appears attributable to the checks they provide for completeness and coherence as well as to their roles as platforms for inference. The benefits should generalize to other domains like the social sciences, history, and archeology where important information can be visualized. Together, the findings provide support for the use of learner-generated visual explanations as a powerful learning tool.

  3. Rapid homogeneous endothelialization of high aspect ratio microvascular networks.

    PubMed

    Naik, Nisarga; Hanjaya-Putra, Donny; Haller, Carolyn A; Allen, Mark G; Chaikof, Elliot L

    2015-08-01

    Microvascularization of an engineered tissue construct is necessary to ensure the nourishment and viability of the hosted cells. Microvascular constructs can be created by seeding the luminal surfaces of microfluidic channel arrays with endothelial cells. However, in a conventional flow-based system, the uniformity of endothelialization of such an engineered microvascular network is constrained by mass transfer of the cells through high length-to-diameter (L/D) aspect ratio microchannels. Moreover, given the inherent limitations of the initial seeding process to generate a uniform cell coating, the large surface-area-to-volume ratio of microfluidic systems demands long culture periods for the formation of confluent cellular microconduits. In this report, we describe the design of polydimethylsiloxane (PDMS) and poly(glycerol sebacate) (PGS) microvascular constructs with reentrant microchannels that facilitate rapid, spatially homogeneous endothelial cell seeding of high L/D (2 cm/35 μm; >550:1) aspect ratio microchannels. MEMS technology was employed for the fabrication of a monolithic, elastomeric, reentrant microvascular construct. Isotropic etching and PDMS micromolding yielded a near-cylindrical microvascular channel array. A 'stretch-seed-seal' operation was implemented for uniform incorporation of endothelial cells along the entire microvascular area of the construct, yielding endothelialized microvascular networks in less than 24 h. The feasibility of this endothelialization strategy and the uniformity of cellularization were established using confocal microscope imaging.

  4. ISOTOPE METHODS IN HOMOGENEOUS CATALYSIS.

    SciTech Connect

    BULLOCK,R.M.; BENDER,B.R.

    2000-12-01

    The use of isotope labels has had a fundamentally important role in the determination of mechanisms of homogeneously catalyzed reactions. Mechanistic data are valuable since they can assist in the design and rational improvement of homogeneous catalysts. There are several ways to use isotopes in mechanistic chemistry. Isotopes can be introduced into controlled experiments and followed to see where they go or don't go; in this way, Libby, Calvin, Taube and others used isotopes to elucidate mechanistic pathways for very different, yet important chemistries. Another important isotope method is the study of kinetic isotope effects (KIEs) and equilibrium isotope effects (EIEs). Here the mere observation of where a label winds up is no longer enough - what matters is how much slower (or faster) a labeled molecule reacts than the unlabeled material. The most careful studies essentially involve the measurement of isotope fractionation between a reference ground state and the transition state. Thus kinetic isotope effects provide unique data unavailable from other methods, since information about the transition state of a reaction is obtained. Because getting an experimental glimpse of transition states is really tantamount to understanding catalysis, kinetic isotope effects are very powerful.
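
    For readers unfamiliar with KIE magnitudes, a textbook back-of-envelope estimate (not from this report) shows why primary KIEs near 7 are regarded as the classical room-temperature maximum, using the zero-point-energy difference of C-H versus C-D stretches lost in the transition state:

        import numpy as np

        h, c, kB = 6.62607e-34, 2.99792e10, 1.380649e-23   # J s, cm/s, J/K
        nu_H, nu_D = 2900.0, 2100.0                        # cm^-1, typical stretches
        T = 298.15                                         # K

        dZPE = 0.5 * h * c * (nu_H - nu_D)                 # zero-point-energy difference, J
        kie = np.exp(dZPE / (kB * T))
        print(f"k_H / k_D ≈ {kie:.1f}")                    # ~7, the classic maximum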

  5. Homogeneous Open Quantum Random Walks on a Lattice

    NASA Astrophysics Data System (ADS)

    Carbone, Raffaella; Pautrat, Yan

    2015-09-01

    We study open quantum random walks (OQRWs) for which the underlying graph is a lattice, and the generators of the walk are homogeneous in space. Using the results recently obtained in Carbone and Pautrat (Ann Henri Poincaré, 2015), we study the quantum trajectory associated with the OQRW, which is described by a position process and a state process. We obtain a central limit theorem and a large deviation principle for the position process. We study in detail the case of homogeneous OQRWs on the lattice Z^d, with internal space C^2.
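
    The position-process central limit theorem can be illustrated with a direct quantum-trajectory simulation of a homogeneous OQRW on Z with internal space C^2. A self-contained sketch (the transition operators here are random isometry blocks, purely illustrative, not an example from the paper):

        import numpy as np

        rng = np.random.default_rng(1)

        # Valid transition operators B_plus, B_minus on C^2: stacking them into a
        # 4x2 isometry V (V^dagger V = I) guarantees the OQRW normalization
        # B_plus^dagger B_plus + B_minus^dagger B_minus = I.
        A = rng.normal(size=(4, 2)) + 1j * rng.normal(size=(4, 2))
        V, _ = np.linalg.qr(A)
        B_plus, B_minus = V[:2, :], V[2:, :]

        def trajectory(steps, rho):
            """One quantum trajectory: position x and internal state rho evolve jointly."""
            x = 0
            for _ in range(steps):
                moves = [(+1, B_plus), (-1, B_minus)]
                probs = np.array([np.real(np.trace(B @ rho @ B.conj().T)) for _, B in moves])
                i = rng.choice(2, p=probs / probs.sum())
                step, B = moves[i]
                rho = B @ rho @ B.conj().T / probs[i]   # state update after the jump
                x += step
            return x

        n = 200
        positions = np.array([trajectory(n, np.eye(2) / 2) for _ in range(2000)])
        # CLT: (X_n - n*m) / sqrt(n) is asymptotically Gaussian
        print(f"mean drift per step ≈ {positions.mean() / n:+.4f}")
        print(f"std of X_n / sqrt(n) ≈ {positions.std() / np.sqrt(n):.4f}")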

  6. High School Student Perceptions of the Utility of the Engineering Design Process: Creating Opportunities to Engage in Engineering Practices and Apply Math and Science Content

    NASA Astrophysics Data System (ADS)

    Berland, Leema; Steingut, Rebecca; Ko, Pat

    2014-12-01

    Research and policy documents increasingly advocate for incorporating engineering design into K-12 classrooms in order to accomplish two goals: (1) provide an opportunity to engage with science content in a motivating real-world context; and (2) introduce students to the field of engineering. The present study uses multiple qualitative data sources (i.e., interviews, artifact analysis) in order to examine the ways in which engaging in engineering design can support students in participating in engineering practices and applying math and science knowledge. This study suggests that students better understand and value those aspects of engineering design that are more qualitative (i.e., interviewing users, generating multiple possible solutions) than the more quantitative aspects of design which create opportunities for students to integrate traditional math and science content into their design work (i.e., modeling or systematically choosing between possible design solutions). Recommendations for curriculum design and implementation are discussed.

  7. An approximation for homogeneous freezing temperature of water droplets

    NASA Astrophysics Data System (ADS)

    O, K.-T.; Wood, R.

    2015-11-01

    In this work, based on the well-known formulae of classical nucleation theory (CNT), the temperature T(Nc=1), at which the mean number of critical embryos inside a droplet is unity, is derived and proposed as a new approximation for the homogeneous freezing temperature of water droplets. Without consideration of the time dependence and stochastic nature of the ice nucleation process, the T(Nc=1) approximation is able to reproduce the dependence of homogeneous freezing temperature on drop size and water activity of aqueous drops observed in a wide range of experimental studies. We use the T(Nc=1) approximation to argue that the distribution of homogeneous freezing temperatures observed in the experiments may largely be explained by the spread in the size distribution of droplets used in the particular experiment. It thus appears that this approximation is useful for predicting homogeneous freezing temperatures of water droplets in the atmosphere.
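
    A toy realization of the T(Nc=1) idea, using generic CNT-style ingredients, reproduces the qualitative drop-size dependence. Every constant below is a rough literature-style assumption, not a value from the paper:

        import numpy as np
        from scipy.optimize import brentq

        kB = 1.380649e-23      # J/K
        Tm = 273.15            # K, ice melting point
        sigma = 0.022          # J/m^2, assumed ice-water interfacial energy
        Lv = 3.06e8            # J/m^3, assumed volumetric latent heat of fusion
        n_l = 3.3e28           # 1/m^3, number density of water molecules

        def log_mean_embryos(T, r_drop):
            """log of the mean number of critical embryos in a droplet of radius r_drop (m)."""
            dG_v = Lv * (Tm - T) / Tm                            # driving free energy per volume
            dG_star = 16.0 * np.pi * sigma**3 / (3.0 * dG_v**2)  # CNT nucleation barrier
            V = 4.0 / 3.0 * np.pi * r_drop**3
            return np.log(n_l * V) - dG_star / (kB * T)

        for r in (1e-6, 1e-5, 1e-4):                             # 1, 10, 100 micron droplets
            T_freeze = brentq(lambda T: log_mean_embryos(T, r), 200.0, 272.0)
            print(f"r = {r * 1e6:6.1f} um  ->  T(Nc=1) ≈ {T_freeze:.1f} K")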

  8. Effects of homogenous loading on silicon direct bonding

    NASA Astrophysics Data System (ADS)

    Huang, Li-Yang; Ho, Kuan-Lin; Hu, Chen-Ti

    2011-06-01

    The effect of homogenous loading on the bonding quality of silicon wafer pairs was investigated by employing a Nano-Imprint System to apply a homogenous plane stress over the entire surface area of pre-cleaned wafers. In addition, the effects of variations in the applied homogenous stress (1, 10, 100, 500 psi) on the interface energy of the bonded pairs were examined using a dynamic blade insertion (DBI) method. Infrared imaging was used to evaluate the quality of the bonded interface of each bonded pair immediately after the bonding process and after allowing the bonded pairs to rest at room temperature for 80 h. The results indicated that homogenous loading with the Nano-Imprint System further improved the bonding condition of wafer pairs that had been pre-bonded using an anodic bonder. Furthermore, the bonded pairs exhibited almost identical interfacial energies of about 0.2 J m⁻² when the homogenous stress was varied from 1 psi to 500 psi, which clearly indicates that the interfacial energy of bonded wafers is independent of the amount of stress applied by the homogenous loading process.

  9. 7 CFR 58.623 - Homogenizer.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE ... § 58.623 Homogenizer. Homogenizer shall comply with 3-A Sanitary Standards. ...

  10. Invariant distributions on compact homogeneous spaces

    SciTech Connect

    Gorbatsevich, V V

    2013-12-31

    In this paper, we study distributions on compact homogeneous spaces, including invariant distributions and also distributions admitting a sub-Riemannian structure. We first consider distributions of dimension 1 and 2 on compact homogeneous spaces. After this, we study the cases of compact homogeneous spaces of dimension 2, 3, and 4 in detail. Invariant distributions on simply connected compact homogeneous spaces are also treated. Bibliography: 18 titles.

  11. Coherence delay augmented laser beam homogenizer

    DOEpatents

    Rasmussen, P.; Bernhardt, A.

    1993-06-29

    The geometrical restrictions on a laser beam homogenizer are relaxed by using a coherence delay line to separate a coherent input beam into several components, each having a path length difference equal to a multiple of the coherence length with respect to the other components. The components recombine incoherently at the output of the homogenizer, and the resultant beam has a more uniform spatial intensity suitable for microlithography and laser pantography. Also disclosed is a variable aperture homogenizer, and a liquid filled homogenizer.

  12. Coherence delay augmented laser beam homogenizer

    DOEpatents

    Rasmussen, Paul; Bernhardt, Anthony

    1993-01-01

    The geometrical restrictions on a laser beam homogenizer are relaxed by using a coherence delay line to separate a coherent input beam into several components, each having a path length difference equal to a multiple of the coherence length with respect to the other components. The components recombine incoherently at the output of the homogenizer, and the resultant beam has a more uniform spatial intensity suitable for microlithography and laser pantography. Also disclosed is a variable aperture homogenizer, and a liquid filled homogenizer.

  13. On the decay of homogeneous isotropic turbulence

    NASA Astrophysics Data System (ADS)

    Skrbek, L.; Stalp, Steven R.

    2000-08-01

    wind tunnels and a water channel, the temporal decay of turbulence created by an oscillating grid in water and the decay of energy and vorticity created by a towed grid in a stationary sample of water. We also analyze decaying vorticity data we obtained in superfluid helium and show that decaying superfluid turbulence can be described classically. This paper offers a unified investigation of decaying isotropic, homogeneous turbulence that is based on accepted forms of the three-dimensional turbulent spectra and a variety of experimental decay data obtained in air, water, and superfluid helium.

  14. Homogeneous catalytic hydrogenolysis of biomass

    SciTech Connect

    Vasilakos, N.P.; Barreiros, M.T.

    1984-01-01

    Aqueous hydrogenation of cellulose in the presence of various homogeneous acidic catalysts was studied in a batch autoclave system at 250-350 °C, residence times of 1-4 h, catalyst concentrations of up to 10 wt%, and H₂ pressures of up to 735 psi. The use of heterogeneous cocatalysts was also investigated. Under these conditions, up to 93% of the initial cellulose feed was converted, yielding mainly water-soluble components and gases rich in CO and CO₂. Carbon conversions to water-solubles and gases of up to 57% (on the basis of cellulose) were obtained, constituting in some cases improvements of more than 100% over the noncatalytic experiments. The conversion mechanism involved competing hydrolysis, pyrolysis, and hydrogenation reactions, the relative importance of which was strongly dependent on temperature. FeCl₃ was the best overall catalyst, while molybdates displayed high selectivity toward hydrogenation.

  15. Numerical experiments in homogeneous turbulence

    NASA Technical Reports Server (NTRS)

    Rogallo, R. S.

    1981-01-01

    The direct simulation methods developed by Orszag and Patterson (1972) for isotropic turbulence were extended to homogeneous turbulence in an incompressible fluid subjected to uniform deformation or rotation. The results of simulations for irrotational strain (plane and axisymmetric), shear, rotation, and relaxation toward isotropy following axisymmetric strain are compared with linear theory and experimental data. Emphasis is placed on the shear flow because of its importance and because of the availability of accurate and detailed experimental data. The computed results are used to assess the accuracy of two popular models used in the closure of the Reynolds-stress equations. Data from a variety of the computed fields and the details of the numerical methods used in the simulation are also presented.

  16. An Updated Homogeneous GPS Velocity Field for Studies of Earthquake Hazard Prediction and Assessment in Turkey

    NASA Astrophysics Data System (ADS)

    Ozener, H.; Aktug, B.; Dogru, A.; Tasci, L.; Acar, M.

    2016-12-01

    While the GPS-based crustal deformation studies in Turkey date back to the early 1990s, a homogeneous velocity field utilizing all the available data is still missing. Regional studies employing different site distributions, observation plans, processing software and methodology not only create reference frame variations but also heterogeneous stochastic models. While the reference frame effect between different velocity fields can easily be removed by estimating a set of rotations, the homogenization of the stochastic models of the individual velocity fields requires a more detailed analysis. Using a rigorous Variance Component Estimation (VCE) methodology, we estimated the variance factors for each of the contributing velocity fields and combined them into a single homogeneous velocity field covering the whole of Turkey. Results show that variance factors between velocity fields, including survey-mode and continuous observations, can vary by a few orders of magnitude. In this study, we present the most complete velocity field in Turkey, rigorously combined from 20 individual velocity fields including a 146-station CORS network with 8 years of continuous observations. In addition, two GPS campaigns were performed at 35 stations along the North Anatolian Fault to fill the gap between existing velocity fields. The homogeneously combined new velocity field is nearly complete in terms of geographic coverage, and will serve as the basis for further analyses such as the estimation of deformation rates and the determination of slip rates across main fault zones. As the Active Fault Map of Turkey was recently revised and 500 faults were tagged as having the potential of generating destructive earthquakes, the new velocity field is also expected to have a direct impact on earthquake hazard studies.
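
    The essence of the VCE step can be sketched in a few lines: iterate group variance factors until each field's weighted residuals are statistically consistent with its assigned weight. A minimal illustrative design (hypothetical numbers and a deliberately simple observation model, not the authors' software):

        import numpy as np

        rng = np.random.default_rng(42)

        # Two "fields" re-observe the same 50 station velocities: field A in 3
        # epochs with large noise, field B in 2 epochs with small noise.
        truth = rng.normal(size=50)
        nA, nB = 3, 2
        A = truth + rng.normal(scale=0.5, size=(nA, 50))   # survey-mode field
        B = truth + rng.normal(scale=0.1, size=(nB, 50))   # continuous field

        s2 = np.array([1.0, 1.0])                          # variance factors
        for _ in range(50):
            wA, wB = 1.0 / s2
            W = nA * wA + nB * wB
            est = (wA * A.sum(0) + wB * B.sum(0)) / W      # combined velocities
            vA, vB = A - est, B - est                      # residuals per field
            rA = nA * 50 - 50 * nA * wA / W                # redundancy of field A
            rB = nB * 50 - 50 * nB * wB / W
            s2 = np.array([(vA**2).sum() / rA, (vB**2).sum() / rB])

        print("estimated sigmas:", np.round(np.sqrt(s2), 3))  # ≈ [0.5, 0.1] up to noise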

  17. Homogenization of regional river dynamics by dams and global biodiversity implications.

    PubMed

    Poff, N Leroy; Olden, Julian D; Merritt, David M; Pepin, David M

    2007-04-03

    Global biodiversity in river and riparian ecosystems is generated and maintained by geographic variation in stream processes and fluvial disturbance regimes, which largely reflect regional differences in climate and geology. Extensive construction of dams by humans has greatly dampened the seasonal and interannual streamflow variability of rivers, thereby altering natural dynamics in ecologically important flows on continental to global scales. The cumulative effects of modification to regional-scale environmental templates caused by dams is largely unexplored but of critical conservation importance. Here, we use 186 long-term streamflow records on intermediate-sized rivers across the continental United States to show that dams have homogenized the flow regimes on third- through seventh-order rivers in 16 historically distinctive hydrologic regions over the course of the 20th century. This regional homogenization occurs chiefly through modification of the magnitude and timing of ecologically critical high and low flows. For 317 undammed reference rivers, no evidence for homogenization was found, despite documented changes in regional precipitation over this period. With an estimated average density of one dam every 48 km of third- through seventh-order river channel in the United States, dams arguably have a continental scale effect of homogenizing regionally distinct environmental templates, thereby creating conditions that favor the spread of cosmopolitan, nonindigenous species at the expense of locally adapted native biota. Quantitative analyses such as ours provide the basis for conservation and management actions aimed at restoring and maintaining native biodiversity and ecosystem function and resilience for regionally distinct ecosystems at continental to global scales.

  18. Homogenization of regional river dynamics by dams and global biodiversity implications

    PubMed Central

    Poff, N. LeRoy; Olden, Julian D.; Merritt, David M.; Pepin, David M.

    2007-01-01

    Global biodiversity in river and riparian ecosystems is generated and maintained by geographic variation in stream processes and fluvial disturbance regimes, which largely reflect regional differences in climate and geology. Extensive construction of dams by humans has greatly dampened the seasonal and interannual streamflow variability of rivers, thereby altering natural dynamics in ecologically important flows on continental to global scales. The cumulative effects of modification to regional-scale environmental templates caused by dams is largely unexplored but of critical conservation importance. Here, we use 186 long-term streamflow records on intermediate-sized rivers across the continental United States to show that dams have homogenized the flow regimes on third- through seventh-order rivers in 16 historically distinctive hydrologic regions over the course of the 20th century. This regional homogenization occurs chiefly through modification of the magnitude and timing of ecologically critical high and low flows. For 317 undammed reference rivers, no evidence for homogenization was found, despite documented changes in regional precipitation over this period. With an estimated average density of one dam every 48 km of third- through seventh-order river channel in the United States, dams arguably have a continental scale effect of homogenizing regionally distinct environmental templates, thereby creating conditions that favor the spread of cosmopolitan, nonindigenous species at the expense of locally adapted native biota. Quantitative analyses such as ours provide the basis for conservation and management actions aimed at restoring and maintaining native biodiversity and ecosystem function and resilience for regionally distinct ecosystems at continental to global scales. PMID:17360379

  19. Development of an efficient anaerobic co-digestion process for garbage, excreta, and septic tank sludge to create a resource recycling-oriented society.

    PubMed

    Sun, Zhao-Yong; Liu, Kai; Tan, Li; Tang, Yue-Qin; Kida, Kenji

    2017-03-01

    In order to develop a resource recycling-oriented society, an efficient anaerobic co-digestion process for garbage, excreta and septic tank sludge was studied based on the quantity of each biomass waste type discharged in Ooki machi, Japan. The anaerobic digestion characteristics of garbage, excreta and 5-fold condensed septic tank sludge (hereafter called condensed sludge) were determined separately. In single-stage mesophilic digestion, the excreta with lower C/N ratios yielded lower biogas volumes and accumulated higher levels of volatile fatty acids (VFAs). On the other hand, garbage allowed for a significantly larger volatile total solid (VTS) digestion efficiency as well as biogas yield under thermophilic digestion. Thus, a two-stage anaerobic co-digestion process consisting of thermophilic liquefaction and mesophilic digestion phases was proposed. In the thermophilic liquefaction of mixed condensed sludge and household garbage (wet mass ratio of 2.2:1), a maximum VTS loading rate of 24 g/L/d was achieved. In the mesophilic digestion of mixed liquefied material and excreta (wet mass ratio of 1:1), biogas yield reached approximately 570 ml/g-VTS fed, with a methane content of 55%, at a VTS loading rate of 1.0 g/L/d. The performance of the two-stage process was evaluated by comparing it with a single-stage process in which biomass wastes were treated separately. Biogas production by the two-stage process was found to increase by approximately 22.9%. These results demonstrate the effectiveness of a two-stage anaerobic co-digestion process in enhancement of biogas production.

  20. Rapid Evidence Assessment of the Literature (REAL(©)): streamlining the systematic review process and creating utility for evidence-based health care.

    PubMed

    Crawford, Cindy; Boyd, Courtney; Jain, Shamini; Khorsan, Raheleh; Jonas, Wayne

    2015-11-02

    Systematic reviews (SRs) are widely recognized as the best means of synthesizing clinical research. However, traditional approaches can be costly and time-consuming and can be subject to selection and judgment bias. It can also be difficult to interpret the results of a SR in a meaningful way in order to make research recommendations, clinical or policy decisions, or practice guidelines. Samueli Institute has developed the Rapid Evidence Assessment of the Literature (REAL) SR process to address these issues. REAL provides up-to-date, rigorous, high quality SR information on health care practices, products, or programs in a streamlined, efficient and reliable manner. This process is a component of the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™) program developed by Samueli Institute, which aims at answering the question of "What works?" in health care. The REAL process (1) tailors a standardized search strategy to a specific and relevant research question developed with various stakeholders to survey the available literature; (2) evaluates the quantity and quality of the literature using structured tools and rulebooks to ensure objectivity, reliability and reproducibility of reviewer ratings in an independent fashion; and (3) obtains formalized, balanced input from trained subject matter experts on the implications of the evidence for future research and current practice. Online tools and quality assurance processes are utilized for each step of the review to ensure a rapid, rigorous, reliable, transparent and reproducible SR process. REAL is a rapid SR process developed to streamline and aid in the rigorous and reliable evaluation and review of claims in health care in order to make evidence-based, informed decisions, and has been used by a variety of organizations aiming to gain insight into "what works" in health care. Using the REAL system allows for the facilitation of recommendations on appropriate next steps in policy, funding

  1. Steps Towards a Homogenized Sub-Monthly Temperature Monitoring Tool

    NASA Astrophysics Data System (ADS)

    Rennie, J.; Kunkel, K.

    2015-12-01

    Land surface air temperature products have been essential for monitoring the evolution of the climate system. Before a temperature dataset is included in such products, it is important that non-climatic influences be removed or adjusted so the dataset can be considered homogeneous. These inhomogeneities include changes in station location, instrumentation and observing practices. Very few datasets are free of these influences, and most therefore require homogenization schemes. While many homogenized products exist on the monthly time scale, few daily products exist, due to the complication of removing only those break points that are true inhomogeneities rather than natural abrupt variability (for example, sharp changes due to synoptic conditions). Since there is a high demand for sub-monthly monitoring tools, there is a need to address these issues. The Global Historical Climatology Network - Daily dataset provides a strong foundation of the Earth's climate on the daily scale, and is the official archive of daily data in the United States. While the dataset adheres to a strict set of quality assurance checks, no daily adjustments are applied. However, this dataset lays the groundwork for other products distributed at NCEI-Asheville, including the climate divisional dataset (nClimDiv), the North American monthly homogenized product (Northam) and the 1981-2010 Normals. Since these downstream products already provide homogenization and base period schemes, it makes sense to combine these datasets to provide a sub-monthly monitoring tool for the United States. Using these datasets already in existence, monthly adjustments are applied to daily data, and anomalies are then created using a base climatology defined by the 1981-2010 Normals. Station data are then aggregated to the state level and then to regions defined by the National Climate Assessment. Ranks are then created to provide informational monitoring tools that could be of use for public dissemination. This presentation goes over the product, including
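
    The combination step described above, monthly homogenization adjustments applied to daily values followed by anomalies against the 1981-2010 Normals, can be sketched as follows (all structures and values are illustrative stand-ins, not NCEI's actual formats):

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)

        days = pd.date_range("2015-01-01", "2015-12-31", freq="D")
        daily = pd.DataFrame({"date": days,
                              "tmax": 15 + 10 * np.sin(2 * np.pi * days.dayofyear / 365)
                                      + rng.normal(0, 2, len(days))})

        # Monthly adjustments from a homogenized monthly product (hypothetical):
        monthly_adj = pd.Series({m: 0.3 if m >= 7 else 0.0 for m in range(1, 13)})
        daily["tmax_adj"] = daily["tmax"] + daily["date"].dt.month.map(monthly_adj)

        # Daily normals (hypothetical smooth climatology standing in for 1981-2010):
        normals = 15 + 10 * np.sin(2 * np.pi * days.dayofyear / 365)
        daily["anomaly"] = daily["tmax_adj"] - normals

        print(daily[["date", "tmax_adj", "anomaly"]].head())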

  2. Rh(I)-catalyzed transformation of propargyl vinyl ethers into (E,Z)-dienals: stereoelectronic role of trans effect in a metal-mediated pericyclic process and a shift from homogeneous to heterogeneous catalysis during a one-pot reaction.

    PubMed

    Vidhani, Dinesh V; Krafft, Marie E; Alabugin, Igor V

    2014-01-03

    The combination of experiments and computations reveals unusual features of stereoselective Rh(I)-catalyzed transformation of propargyl vinyl ethers into (E,Z)-dienals. The first step, the conversion of propargyl vinyl ethers into allene aldehydes, proceeds under homogeneous conditions via a "cyclization-mediated" mechanism initiated by Rh(I) coordination at the alkyne. This path agrees well with the small experimental effects of substituents on the carbinol carbon. The key feature revealed by the computational study is the stereoelectronic effect of the ligand arrangement at the catalytic center. The rearrangement barriers significantly decrease due to the greater transfer of electron density from the catalytic metal center to the CO ligand oriented trans to the alkyne. This effect increases electrophilicity of the metal and lowers the calculated barriers by 9.0 kcal/mol. Subsequent evolution of the catalyst leads to the in situ formation of Rh(I) nanoclusters that catalyze stereoselective tautomerization. The intermediacy of heterogeneous catalysis by nanoclusters was confirmed by mercury poisoning, temperature-dependent sigmoidal kinetic curves, and dynamic light scattering. The combination of experiments and computations suggests that the initially formed allene-aldehyde product assists in the transformation of a homogeneous catalyst (or "a cocktail of catalysts") into nanoclusters, which in turn catalyze and control the stereochemistry of subsequent transformations.

  3. Creating a Comprehensive, Efficient, and Sustainable Nuclear Regulatory Structure: A Process Report from the U.S. Department of Energy's Material Protection, Control and Accounting Program

    SciTech Connect

    Wright, Troy L.; O'Brien, Patricia E.; Hazel, Michael J.; Tuttle, John D.; Cunningham, Mitchel E.; Schlegel, Steven C.

    2010-08-11

    With the congressionally mandated January 1, 2013 deadline for the U.S. Department of Energy’s (DOE) Nuclear Material Protection, Control and Accounting (MPC&A) program to complete its transition of MPC&A responsibility to the Russian Federation, National Nuclear Security Administration (NNSA) management directed its MPC&A program managers and team leaders to demonstrate that work in ongoing programs would lead to successful and timely achievement of these milestones. In the spirit of planning for successful project completion, the NNSA review of the Russian regulatory development process confirmed the critical importance of an effective regulatory system to a sustainable nuclear protection regime and called for an analysis of the existing Russian regulatory structure and the identification of a plan to ensure a complete MPC&A regulatory foundation. This paper describes the systematic process used by DOE’s MPC&A Regulatory Development Project (RDP) to develop an effective and sustainable MPC&A regulatory structure in the Russian Federation. This nuclear regulatory system will address all non-military Category I and II nuclear materials at State Corporation for Atomic Energy “Rosatom,” the Federal Service for Ecological, Technological, and Nuclear Oversight (Rostechnadzor), the Federal Agency for Marine and River Transport (FAMRT, within the Ministry of Transportation), and the Ministry of Industry and Trade (Minpromtorg). The approach to ensuring a complete and comprehensive nuclear regulatory structure includes five sequential steps. The approach was adopted from DOE’s project management guidelines and was adapted to the regulatory development task by the RDP. The five steps in the Regulatory Development Process are: 1) Define MPC&A Structural Elements; 2) Analyze the existing regulatory documents using the identified Structural Elements; 3) Validate the analysis with Russian colleagues and define the list of documents to be developed; 4) Prioritize and

  4. BLENDING LOW ENRICHED URANIUM WITH DEPLETED URANIUM TO CREATE A SOURCE MATERIAL ORE THAT CAN BE PROCESSED FOR THE RECOVERY OF YELLOWCAKE AT A CONVENTIONAL URANIUM MILL

    SciTech Connect

    Schutt, Stephen M.; Hochstein, Ron F.; Frydenlund, David C.; Thompson, Anthony J.

    2003-02-27

    Throughout the United States Department of Energy (DOE) complex, there are a number of streams of low enriched uranium (LEU) that contain various trace contaminants. These surplus nuclear materials require processing in order to meet commercial fuel cycle specifications. To date, they have not been designated as waste for disposal at the DOE's Nevada Test Site (NTS). Currently, with no commercial outlet available, the DOE is evaluating treatment and disposal as the ultimate disposition path for these materials. This paper describes an innovative program that offers DOE a disposition path for these materials at a cost competitive with treatment and disposal at the NTS, while at the same time recycling the material to recover a valuable energy resource (yellowcake) for reintroduction into the commercial nuclear fuel cycle. International Uranium (USA) Corporation (IUSA) and Nuclear Fuel Services, Inc. (NFS) have entered into a commercial relationship to pursue the development of this program. The program involves the design of a process and construction of a plant at NFS' site in Erwin, Tennessee, for the blending of contaminated LEU with depleted uranium (DU) to produce a uranium source material ore (USM Ore™). The USM Ore™ will then be further processed at IUSA's White Mesa Mill, located near Blanding, Utah, to produce conventional yellowcake, which can be delivered to conversion facilities in the same manner as yellowcake that is produced from natural ores or other alternate feed materials. The primary source of feed for the business will be the significant sources of trace contaminated materials within the DOE complex. NFS has developed a dry blending process (DRY℠ Process) to blend the surplus LEU material with DU at its Part 70 licensed facility, to produce USM Ore™ with a U235 content within the range of U235 concentrations for source material. By reducing the U235 content to source

  5. Catalytic Parallel Kinetic Resolution under Homogeneous Conditions

    PubMed Central

    Duffey, Trisha A.; MacKay, James A.; Vedejs, Edwin

    2010-01-01

    Two complementary chiral catalysts, the phosphine 8d and the DMAP-derived ent-23b, are used simultaneously to selectively activate one of a mixture of two different achiral anhydrides as acyl donors under homogeneous conditions. The resulting activated intermediates 25 and 26 react with the racemic benzylic alcohol 5 to form enantioenriched esters (R)-24 and (S)-17 by fully catalytic parallel kinetic resolution (PKR). The aroyl ester (R)-24 is obtained with near-ideal enantioselectivity for the PKR process, but (S)-17 is contaminated by ca. 8% of the minor enantiomer (R)-17 resulting from a second pathway via formation of mixed anhydride 24 and its activation by 8d. PMID:20557113

  6. Text-mining of PubMed abstracts by natural language processing to create a public knowledge base on molecular mechanisms of bacterial enteropathogens.

    PubMed

    Zaremba, Sam; Ramos-Santacruz, Mila; Hampton, Thomas; Shetty, Panna; Fedorko, Joel; Whitmore, Jon; Greene, John M; Perna, Nicole T; Glasner, Jeremy D; Plunkett, Guy; Shaker, Matthew; Pot, David

    2009-06-10

    The Enteropathogen Resource Integration Center (ERIC; http://www.ericbrc.org) has a goal of providing bioinformatics support for the scientific community researching enteropathogenic bacteria such as Escherichia coli and Salmonella spp. Rapid and accurate identification of experimental conclusions from the scientific literature is critical to support research in this field. Natural Language Processing (NLP), and in particular Information Extraction (IE) technology, can be a significant aid to this process. We have trained a powerful, state-of-the-art IE technology on a corpus of abstracts from the microbial literature in PubMed to automatically identify and categorize biologically relevant entities and predicative relations. These relations include: Genes/Gene Products and their Roles; Gene Mutations and the resulting Phenotypes; and Organisms and their associated Pathogenicity. Evaluations on blind datasets show an F-measure average of greater than 90% for entities (genes, operons, etc.) and over 70% for relations (gene/gene product to role, etc). This IE capability, combined with text indexing and relational database technologies, constitute the core of our recently deployed text mining application. Our Text Mining application is available online on the ERIC website (http://www.ericbrc.org/portal/eric/articles). The information retrieval interface displays a list of recently published enteropathogen literature abstracts, and also provides a search interface to execute custom queries by keyword, date range, etc. Upon selection, processed abstracts and the entities and relations extracted from them are retrieved from a relational database and marked up to highlight the entities and relations. The abstract also provides links from extracted genes and gene products to the ERIC Annotations database, thus providing access to comprehensive genomic annotations and adding value to both the text-mining and annotations systems.
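
    For reference, the entity-level F-measure quoted above combines precision and recall as follows (the annotation sets here are hypothetical toys):

        gold = {"dnaA", "recA", "lexA", "uvrB"}       # hypothetical gold annotations
        predicted = {"dnaA", "recA", "uvrB", "rpoS"}  # hypothetical extractor output

        tp = len(gold & predicted)
        precision = tp / len(predicted)
        recall = tp / len(gold)
        f_measure = 2 * precision * recall / (precision + recall)
        print(f"P={precision:.2f}  R={recall:.2f}  F={f_measure:.2f}")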

  7. Cryogenic Homogenization and Sampling of Heterogeneous Multi-Phase Feedstock

    SciTech Connect

    Doyle, Glenn M.; Ideker, Virgene D.; Siegwarth, James D.

    1999-09-21

    An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock, reducing the temperature of the feedstock to a temperature below a critical temperature, reducing the size of the feedstock components, blending the reduced-size feedstock to form a homogeneous mixture, and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (-196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.

  8. Cryogenic homogenization and sampling of heterogeneous multi-phase feedstock

    DOEpatents

    Doyle, Glenn Michael; Ideker, Virgene Linda; Siegwarth, James David

    2002-01-01

    An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock, reducing the temperature of the feedstock to a temperature below a critical temperature, reducing the size of the feedstock components, blending the reduced-size feedstock to form a homogeneous mixture, and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (-196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.

  9. Homogeneous modes of cosmological instantons

    SciTech Connect

    Gratton, Steven; Turok, Neil

    2001-06-15

    We discuss the O(4) invariant perturbation modes of cosmological instantons. These modes are spatially homogeneous in Lorentzian spacetime and thus not relevant to density perturbations. But their properties are important in establishing the meaning of the Euclidean path integral. If negative modes are present, the Euclidean path integral is not well defined, but may nevertheless be useful in an approximate description of the decay of an unstable state. When gravitational dynamics is included, counting negative modes requires a careful treatment of the conformal factor problem. We demonstrate that for an appropriate choice of coordinate on phase space, the second order Euclidean action is bounded below for normalized perturbations and has a finite number of negative modes. We prove that there is a negative mode for many gravitational instantons of the Hawking-Moss or Coleman–De Luccia type, and discuss the associated spectral flow. We also investigate Hawking-Turok constrained instantons, which occur in a generic inflationary model. Implementing the regularization and constraint proposed by Kirklin, Turok and Wiseman, we find that those instantons leading to substantial inflation do not possess negative modes. Using an alternate regularization and constraint motivated by reduction from five dimensions, we find a negative mode is present. These investigations shed new light on the suitability of Euclidean quantum gravity as a potential description of our universe.

  10. Phosphoproteomic Analysis of Liver Homogenates

    PubMed Central

    Demirkan, Gokhan; Salomon, Arthur R.; Gruppuso, Philip A.

    2013-01-01

    Regulation of protein function via reversible phosphorylation is an essential component of cell signaling. Our ability to understand complex phosphorylation networks in the physiological context of a whole organism or tissue remains limited. This is largely due to the technical challenge of isolating serine/threonine phosphorylated peptides from a tissue sample. In the present study, we developed a phosphoproteomic strategy to purify and identify phosphopeptides from a tissue sample by employing protein gel filtration, protein SAX (strong anion exchange) and SCX (strong cation exchange) chromatography, peptide SCX chromatography and TiO2 affinity purification. By applying this strategy to the mass spectrometry-based analysis of rat liver homogenates, we were able to identify with high confidence and quantify over four thousand unique phosphopeptides. Finally, the reproducibility of our methodology was demonstrated by its application to analysis of the mammalian Target of Rapamycin (mTOR) signaling pathways in liver samples obtained from rats in which hepatic mTOR was activated by refeeding following a period of fasting. PMID:22903715

  11. Magnetic field homogeneity of a conical coaxial coil pair.

    PubMed

    Salazar, F J; Nieves, F J; Bayón, A; Gascón, F

    2017-09-01

    An analytical study of the magnetic field created by a double-conical conducting sheet is presented. The analysis is based on the expansion of the magnetic field in terms of Legendre polynomials. It is demonstrated analytically that the angle of the conical surface that produces a nearly homogeneous magnetic field coincides with that of a pair of loops that fulfills the Helmholtz condition. From the results obtained, we propose an electric circuit formed by pairs of isolated conducting loops tightly wound around a pair of conical surfaces, calculating numerically the magnetic field produced by this system and its heterogeneity. An experimental setup of the proposed circuit was constructed and its magnetic field was measured. The results were compared with those obtained by numerical calculation, finding good agreement. The numerical results demonstrate a significant improvement in the homogeneity of the field of the proposed pair of conical coils compared with that achieved with a simple pair of Helmholtz loops or with a double solenoid. Moreover, a new design of a double pair of conical coils based on Braunbek's four loops is also proposed to achieve greater homogeneity. Regarding homogeneity, the ranking of the analyzed configurations from best to worst is as follows: (1) double pair of conical coils, (2) pair of conical coils, (3) Braunbek's four loops, (4) Helmholtz pair, and (5) solenoid pair.
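
    As a rough illustration of the Helmholtz condition referenced above, the sketch below evaluates the on-axis field of a pair of circular current loops using the standard Biot-Savart result and quantifies the central field flatness. The radius, current, and evaluation window are arbitrary illustrative choices, not values from the paper.

    ```python
    import numpy as np

    MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

    def loop_axial_field(z, radius, z0, current=1.0):
        """On-axis field of a circular loop centered at z0 (standard Biot-Savart result)."""
        return MU0 * current * radius**2 / (2.0 * (radius**2 + (z - z0)**2) ** 1.5)

    # Helmholtz condition: loop separation equals the loop radius.
    R = 0.1  # loop radius in meters (illustrative)
    z = np.linspace(-0.02, 0.02, 401)
    B = loop_axial_field(z, R, -R / 2) + loop_axial_field(z, R, +R / 2)

    # Heterogeneity over the central region: peak-to-peak deviation relative
    # to the field at the midpoint.
    eta = (B.max() - B.min()) / B[len(B) // 2]
    print(f"relative field variation over +/-2 cm: {eta:.2e}")
    ```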

  12. Creating physics stars

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2013-07-01

    Korea has begun an ambitious $5bn plan to create 50 new institutes dedicated to fundamental research. Michael Banks meets physicist Se-Jung Oh, president of the Institute for Basic Science, to find out more.

  13. Create a Logo.

    ERIC Educational Resources Information Center

    Duchen, Gail

    2002-01-01

    Presents an art lesson that introduced students to graphic art as a career path. Explains that the students met a graphic artist and created a logo for a pretend client. Explains that the students researched logos. (CMK)

  14. The Quality Control Algorithms Used in the Process of Creating the NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    The methodology and the results of the quality control (QC) process for the meteorological data from the Lightning Protection System (LPS) towers located at Kennedy Space Center (KSC) launch complex 39B (LC-39B) are documented in this paper. Meteorological data are used to design a launch vehicle, determine operational constraints, and apply defined constraints on day-of-launch (DOL). To accomplish these tasks properly, a representative climatological database of meteorological records is needed, one that represents the climate the vehicle will encounter. Numerous meteorological measurement towers exist at KSC; however, the engineering tasks need measurements at specific heights, some of which can only be provided by a few towers. Other than the LPS towers, Tower 313 is the only tower that provides observations up to 150 m; it is located approximately 3.5 km from LC-39B. In addition, the data need to be QC'ed to remove erroneous reports that could pollute the results of an engineering analysis, mislead the development of operational constraints, or provide a false image of the atmosphere at the tower's location.
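
    The abstract does not list the individual QC checks, so the following is only a generic sketch of two tests commonly applied to tower wind records (a physical-range check and a rate-of-change check); the thresholds and the qc_flags helper are hypothetical, not values from the LPS database work.

    ```python
    import numpy as np

    def qc_flags(wind_speed, dt_s=1.0, vmin=0.0, vmax=75.0, max_step=10.0):
        """Flag samples failing a range check or a rate-of-change check.

        Thresholds are illustrative, not those used for the LPS towers.
        """
        ws = np.asarray(wind_speed, dtype=float)
        bad_range = (ws < vmin) | (ws > vmax)
        step = np.abs(np.diff(ws, prepend=ws[0])) / dt_s
        bad_step = step > max_step
        return bad_range | bad_step

    series = [3.1, 3.3, 45.0, 3.4, 3.2]  # the 45.0 spike is an obvious outlier
    # The spike and the sample immediately after it both trip the step test:
    print(qc_flags(series))              # [False False  True  True False]
    ```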

  15. Homogeneous catalysts in hypersonic combustion

    SciTech Connect

    Harradine, D.M.; Lyman, J.L.; Oldenborg, R.C.; Pack, R.T.; Schott, G.L.

    1989-01-01

    Density and residence time both become unfavorably small for efficient combustion of hydrogen fuel in ramjet propulsion in air at high altitude and hypersonic speed. Raising the density and increasing the transit time of the air through the engine necessitates stronger contraction of the air flow area. This enhances the kinetic and thermodynamic tendency of H2O to form completely, accompanied only by N2 and any excess H2 (or O2). The by-products to be avoided are the energetically expensive fragment species H and/or O atoms and OH radicals, and residual (2H2 plus O2). However, excessive area contraction raises air temperature and consequent combustion-product temperature by adiabatic compression. This counteracts and ultimately overwhelms the thermodynamic benefit by which higher density favors the triatomic product, H2O, over its monatomic and diatomic alternatives. For static pressures in the neighborhood of 1 atm, static temperature must be kept or brought below ca. 2400 K for acceptable stability of H2O. Another measure, whose requisite chemistry we address here, is to extract propulsive work from the combustion products early in the expansion. The objective is to lower the static temperature of the combustion stream enough for H2O to become adequately stable before the exhaust flow is massively expanded and its composition "frozen." We proceed to address this mechanism and its kinetics, and then examine prospects for enhancing its rate by homogeneous catalysts. 9 refs.

  16. Homogeneity and thermodynamic identities in geometrothermodynamics

    NASA Astrophysics Data System (ADS)

    Quevedo, Hernando; Quevedo, María N.; Sánchez, Alberto

    2017-03-01

    We propose a classification of thermodynamic systems in terms of the homogeneity properties of their fundamental equations. Ordinary systems correspond to homogeneous functions and non-ordinary systems are given by generalized homogeneous functions. This affects the explicit form of the Gibbs-Duhem relation and Euler's identity. We show that these generalized relations can be implemented in the formalism of black hole geometrothermodynamics in order to completely fix the arbitrariness present in Legendre invariant metrics.
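
    For concreteness, the textbook identities being generalized here are the following (standard forms, not quoted from the paper): Euler's identity for a degree-one homogeneous fundamental relation, and its analogue for a generalized homogeneous function.

    ```latex
    % Euler's identity for U(\lambda S, \lambda V, \lambda N) = \lambda\,U(S,V,N):
    U = TS - pV + \mu N
    % A generalized homogeneous function satisfies
    %   \Phi(\lambda^{\beta_1} x^1, \ldots, \lambda^{\beta_n} x^n)
    %     = \lambda^{\beta_\Phi}\,\Phi(x^1, \ldots, x^n),
    % and differentiating with respect to \lambda at \lambda = 1 gives
    % the generalized Euler identity:
    \sum_{i=1}^{n} \beta_i\, x^i \frac{\partial \Phi}{\partial x^i} = \beta_\Phi\,\Phi
    ```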

  17. The Homogenization and Optimization of Thermoelectric Composites

    DTIC Science & Technology

    2015-04-17

    AFRL-OSR-VA-TR-2015-0090. The Homogenization and Optimization of Thermoelectric Composites. Jiangyu Li, University of Washington. Final Report, 04/17/2015. Grant FA9550-12-1-0325. ... behavior of thermoelectric composites using rigorous homogenization techniques in this project. In the last three years, our accomplishments include: (1

  18. Bounds for nonlinear composites via iterated homogenization

    NASA Astrophysics Data System (ADS)

    Ponte Castañeda, P.

    2012-09-01

    Improved estimates of the Hashin-Shtrikman-Willis type are generated for the class of nonlinear composites consisting of two well-ordered, isotropic phases distributed randomly with prescribed two-point correlations, as determined by the H-measure of the microstructure. For this purpose, a novel strategy for generating bounds has been developed utilizing iterated homogenization. The general idea is to make use of bounds that may be available for composite materials in the limit when the concentration of one of the phases (say phase 1) is small. It then follows from the theory of iterated homogenization that it is possible, under certain conditions, to obtain bounds for more general values of the concentration, by gradually adding small amounts of phase 1 in incremental fashion, and sequentially using the available dilute-concentration estimate, up to the final (finite) value of the concentration (of phase 1). Such an approach can also be useful when available bounds are expected to be tighter for certain ranges of the phase volume fractions. This is the case, for example, for the "linear comparison" bounds for porous viscoplastic materials, which are known to be comparatively tighter for large values of the porosity. In this case, the new bounds obtained by the above-mentioned "iterated" procedure can be shown to be much improved relative to the earlier "linear comparison" bounds, especially at low values of the porosity and high triaxialities. Consistent with the way in which they have been derived, the new estimates are, strictly, bounds only for the class of multi-scale, nonlinear composites consisting of two well-ordered, isotropic phases that are distributed with prescribed H-measure at each stage in the incremental process. However, given the facts that the H-measure of the sequential microstructures is conserved (so that the final microstructures can be shown to have the same H-measure), and that H-measures are insensitive to length scales, it is conjectured

  19. Influence of homogenization treatment on physicochemical properties and enzymatic hydrolysis rate of pure cellulose fibers.

    PubMed

    Jacquet, N; Vanderghem, C; Danthine, S; Blecker, C; Paquot, M

    2013-02-01

    The aim of this study was to compare the effect of different homogenization treatments on the physicochemical properties and the hydrolysis rate of a pure bleached cellulose. Results obtained show that homogenization treatments improve the enzymatic hydrolysis rate of the cellulose fibers by 25 to 100%, depending on the homogenization treatment applied. Characterization of the samples also showed that homogenization had an impact on some physicochemical properties of the cellulose. For moderate treatment intensities (pressure below 500 bar and degree of homogenization below 25), an increase in water retention values (WRV) that correlated with the increase in the hydrolysis rate was highlighted. Results also showed that the overall crystallinity of the cellulose appeared not to be affected by the homogenization treatment. For higher treatment intensities, homogenized cellulose samples developed a stable three-dimensional network that decreases cellulase mobility and slows down the hydrolysis process.

  20. Exploring earthquake databases for the creation of magnitude-homogeneous catalogues: tools for application on a regional and global scale

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Pagani, M.; Garcia, J.

    2016-09-01

    The creation of a magnitude-homogenized catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenizing multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Centre (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilize this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonize magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonized into moment magnitude to form a catalogue of more than 562 840 events. This extended catalogue, while not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
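
    A minimal sketch of the conversion step described above, assuming synthetic data: a linear mb-to-Mw model is fitted by orthogonal distance regression (both magnitudes carry measurement error, which is why ODR is generally preferred to ordinary least squares here) and then applied to native-mb events. The coefficients and uncertainties are invented for illustration and are not those produced by the authors' tools.

    ```python
    import numpy as np
    from scipy.odr import ODR, Model, RealData

    rng = np.random.default_rng(0)
    mb_true = rng.uniform(4.0, 6.5, 300)
    mw_true = 0.85 * mb_true + 1.03          # hypothetical "true" relation
    mb = mb_true + rng.normal(0, 0.2, 300)   # noisy observed magnitudes
    mw = mw_true + rng.normal(0, 0.1, 300)

    model = Model(lambda beta, x: beta[0] * x + beta[1])
    fit = ODR(RealData(mb, mw, sx=0.2, sy=0.1), model, beta0=[1.0, 0.0]).run()
    a, b = fit.beta
    print(f"Mw ~ {a:.2f} * mb + {b:.2f}")

    # Harmonize: convert events reported in mb into the target Mw scale.
    catalogue_mb = np.array([4.8, 5.2, 6.1])
    print(a * catalogue_mb + b)
    ```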

  1. Inhibitors of microsomal oxidations in insect homogenates

    PubMed Central

    Chakraborty, J.; Sissons, C. H.; Smith, J. N.

    1967-01-01

    1. Homogenates of insect tissues were assayed for enzymes capable of oxidizing p-nitrotoluene to p-nitrobenzoic acid. 2. Locust fat-body homogenate 10000g supernatant was an effective enzyme and required no added cofactors. 3. Homogenates of other insects or locust organs and 10000g sediment from locust fat-body were not active and inhibited microsomal oxidations carried out by locust fat-body or rabbit liver enzyme. 4. Inhibitory power was high in homogenates of whole flies and of fly heads or thoraces. 5. Inhibition appeared to involve both irreversible inactivation of enzyme and the removal of essential cofactors. PMID:4382105

  2. Comparative Analysis of a MOOC and a Residential Community Using Introductory College Physics: Documenting How Learning Environments Are Created, Lessons Learned in the Process, and Measurable Outcomes

    NASA Astrophysics Data System (ADS)

    Olsen, Jack Ryan

    Higher education institutions, such as the University of Colorado Boulder (CU-Boulder), have as a core mission to advance their students' academic performance. On the frontier of education technologies that hold the promise to address our educational mission are Massively Open Online Courses (MOOCs), which are new enough to not be fully understood or well-researched. MOOCs, in theory, have vast potential for being cost-effective and for reaching diverse audiences across the world. This thesis examines the implementation of one MOOC, Physics 1 for Physical Science Majors, implemented in the inaugural round of institutionally sanctioned MOOCs in Fall 2013. While comparatively inexpensive relative to a brick-and-mortar course, and while it initially enrolled an audience of nearly 16,000 students, this MOOC was found to be time-consuming to implement, and only roughly 1.5% of those who enrolled completed the course, a completion rate approximately one-quarter that of the standard brick-and-mortar course the MOOC was designed around. An established education technology, the residential community, contrasts with MOOCs by being high-touch and highly humanized, but also expensive and locally based. The Andrews Hall Residential College (AHRC) on the CU campus fosters academic success and retention by engaging and networking students outside of the standard brick-and-mortar courses and enculturating students into an environment with vertical integration through the different classes: freshman, sophomore, junior, etc. The physics MOOC and the AHRC were studied to determine how the environments were made and what lessons were learned in the process. Also, student performance was compared for the physics MOOC, a subset of the AHRC students enrolled in a special physics course, and the standard CU Physics 1 brick-and-mortar course. All yielded similar learning gains for Physics 1 performance for those who completed the courses. These environments are presented together to compare and contrast their

  3. Effect of heat and homogenization on in vitro digestion of milk

    USDA-ARS?s Scientific Manuscript database

    Central to commercial fluid milk processing is the use of high temperature, short time (HTST) pasteurization to ensure the safety and quality of milk, and homogenization to prevent creaming of fat-containing milk. UHT processed homogenized milk is also available commercially and is typically used to...

  4. Cell-Laden Poly(ɛ-caprolactone)/Alginate Hybrid Scaffolds Fabricated by an Aerosol Cross-Linking Process for Obtaining Homogeneous Cell Distribution: Fabrication, Seeding Efficiency, and Cell Proliferation and Distribution

    PubMed Central

    Lee, HyeongJin; Ahn, SeungHyun; Bonassar, Lawrence J.; Chun, Wook

    2013-01-01

    Generally, solid-freeform fabricated scaffolds show a controllable pore structure (pore size, porosity, pore connectivity, and permeability) and mechanical properties by using computer-aided techniques. Although the scaffolds can provide repeated and appropriate pore structures for tissue regeneration, they have a low biological activity, such as low cell-seeding efficiency and nonuniform cell density in the scaffold interior after a long culture period, due to a large pore size and completely open pores. Here we fabricated three different poly(ɛ-caprolactone) (PCL)/alginate scaffolds: (1) a rapid prototyped porous PCL scaffold coated with an alginate, (2) the same PCL scaffold coated with a mixture of alginate and cells, and (3) a multidispensed hybrid PCL/alginate scaffold embedded with cell-laden alginate struts. The three scaffolds had similar micropore structures (pore size=430–580 μm, porosity=62%–68%, square pore shape). Preosteoblast cells (MC3T3-E1) were used at the same cell density in each scaffold. By measuring cell-seeding efficiency, cell viability, and cell distribution after various periods of culturing, we sought to determine which scaffold was more appropriate for homogeneously regenerated tissues. PMID:23469894

  5. A tree-based model for homogeneous groupings of multinomials.

    PubMed

    Yang, Tae Young

    2005-11-30

    The motivation of this paper is to provide a tree-based method for grouping multinomial data according to their classification probability vectors. We produce an initial tree by binary recursive partitioning whereby multinomials are successively split into two subsets and the splits are determined by maximizing the likelihood function. If the number of multinomials k is too large, we propose to order the multinomials, and then build the initial tree based on a dramatically smaller number k-1 of possible splits. The tree is then pruned from the bottom up. The pruning process involves a sequence of hypothesis tests of a single homogeneous group against the alternative that there are two distinct, internally homogeneous groups. As pruning criteria, the Bayesian information criterion and the Wilcoxon rank-sum test are proposed. The tree-based model is illustrated on genetic sequence data. Homogeneous groupings of genetic sequences present new opportunities to understand and align these sequences.
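
    The pruning decision can be made concrete with a BIC comparison between a single pooled multinomial and two separate multinomials; the sketch below, with invented count vectors, follows that logic but is not the authors' implementation.

    ```python
    import numpy as np

    def multinomial_loglik(counts):
        """Maximized multinomial log-likelihood (up to the multinomial coefficient)."""
        counts = np.asarray(counts, dtype=float)
        p = counts / counts.sum()
        nz = counts > 0
        return float((counts[nz] * np.log(p[nz])).sum())

    def prefer_two_groups(counts_a, counts_b):
        """BIC test: one homogeneous group vs. two internally homogeneous groups."""
        k = len(counts_a)                     # number of multinomial categories
        n = sum(counts_a) + sum(counts_b)
        ll1 = multinomial_loglik(np.add(counts_a, counts_b))
        ll2 = multinomial_loglik(counts_a) + multinomial_loglik(counts_b)
        bic1 = -2 * ll1 + (k - 1) * np.log(n)
        bic2 = -2 * ll2 + 2 * (k - 1) * np.log(n)
        return bic2 < bic1

    print(prefer_two_groups([50, 30, 20], [20, 30, 50]))  # clearly distinct -> True
    print(prefer_two_groups([50, 30, 20], [48, 33, 19]))  # similar -> False
    ```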

  6. Visual quality of printed surfaces: study of homogeneity

    NASA Astrophysics Data System (ADS)

    Nébouy, D.; Hébert, M.; Fournel, T.; Lesur, J.-L.

    2014-01-01

    This paper introduces a homogeneity assessment method for the printed versions of uniform color images. This parameter has been specifically selected as one of the relevant attributes of printing quality. The method relies on image processing algorithms applied to a scanned image of the printed surface, especially the computation of gray-level co-occurrence matrices and of an objective homogeneity attribute inspired by Haralick's parameters. The viewing distance is also taken into account when computing the homogeneity index. Resizing and filtering of the scanned image are performed in order to keep the level of detail visible to a standard human observer at short and long distances. The combination of the obtained homogeneity scores on both high- and low-resolution images provides a homogeneity index, which can be computed for any printed version of a uniform digital image. We tested the method on several hardcopies of the same image and compared the scores to the empirical evaluations carried out by non-expert observers who were asked to sort the samples and to place them on a metric scale. Our experiments show a good match between the sorting by the observers and the score computed by our algorithm.
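
    A minimal sketch of the co-occurrence computation the method builds on, using scikit-image (the graycomatrix/graycoprops spellings assume skimage >= 0.19). The synthetic "scan", the two distances standing in for short and long viewing, and the averaging over angles are illustrative choices, not the paper's exact Haralick-inspired attribute.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(1)
    # Stand-in for a scanned patch of a printed uniform color (8-bit gray levels).
    scan = (128 + 8 * rng.standard_normal((256, 256))).clip(0, 255).astype(np.uint8)

    def homogeneity_score(img, distance):
        glcm = graycomatrix(img, distances=[distance], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        return graycoprops(glcm, "homogeneity").mean()

    # Fine vs. coarse co-occurrence distance as a crude proxy for viewing distance.
    print(homogeneity_score(scan, 1), homogeneity_score(scan, 8))
    ```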

  7. Context homogeneity facilitates both distractor inhibition and target enhancement.

    PubMed

    Feldmann-Wüstefeld, Tobias; Schubö, Anna

    2013-05-06

    Homogeneous contexts were shown to result in prioritized processing of embedded targets compared to heterogeneous contexts (Duncan & Humphreys, 1989). The present experiment used behavioral and ERP measures to examine whether context homogeneity affects both the enhancement of relevant information and the inhibition of irrelevant information. Targets and distractors were presented laterally or on the vertical midline, which allowed disentangling target- and distractor-related activity in the lateralized ERP (Hickey, Di Lollo, & McDonald, 2009). In homogeneous contexts, targets elicited an NT component from 150 ms on and a PD component from 200 ms on, showing early attention deployment at target locations and active suppression of distractors. In heterogeneous contexts, an NT component was also found from 150 ms on and a PD was found from 250 ms on, suggesting delayed suppression of the distractor. Before 250 ms, distractors in heterogeneous contexts elicited a contralateral negativity, indicating attentional capture by the distractor prior to active suppression. In sum, the present results suggest that top-down control of attention is more pronounced in homogeneous than in heterogeneous contexts.

  8. Creating speech-synchronized animation.

    PubMed

    King, Scott A; Parent, Richard E

    2005-01-01

    We present a facial model designed primarily to support animated speech. Our facial model takes facial geometry as input and transforms it into a parametric deformable model. The facial model uses a muscle-based parameterization, allowing for easier integration between speech synchrony and facial expressions. Our facial model has a highly deformable lip model that is grafted onto the input facial geometry to provide the necessary geometric complexity needed for creating lip shapes and high-quality renderings. Our facial model also includes a highly deformable tongue model that can represent the shapes the tongue undergoes during speech. We add teeth, gums, and upper palate geometry to complete the inner mouth. To decrease the processing time, we hierarchically deform the facial surface. We also present a method to animate the facial model over time to create animated speech using a model of coarticulation that blends visemes together using dominance functions. We treat visemes as a dynamic shaping of the vocal tract by describing visemes as curves instead of keyframes. We show the utility of the techniques described in this paper by implementing them in a text-to-audiovisual-speech system that creates animation of speech from unrestricted text. The facial and coarticulation models must first be interactively initialized. The system then automatically creates accurate real-time animated speech from the input text. It is capable of cheaply producing tremendous amounts of animated speech with very low resource requirements.
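
    A schematic of dominance-function blending in the spirit described above: each viseme exerts a negative-exponential dominance over time, and the animated track is the dominance-weighted average of the viseme targets. The dominance form, rates, and target values are invented for illustration and are not this system's parameters.

    ```python
    import numpy as np

    def dominance(t, center, magnitude=1.0, rate=8.0):
        """Negative-exponential dominance of a viseme centered at `center` (schematic)."""
        return magnitude * np.exp(-rate * np.abs(t - center))

    # Hypothetical targets for one lip parameter (e.g., mouth opening) per viseme,
    # with viseme centers given in seconds.
    centers = np.array([0.10, 0.25, 0.40])
    targets = np.array([0.8, 0.1, 0.6])

    t = np.linspace(0.0, 0.5, 6)                          # animation frame times
    D = dominance(t[:, None], centers[None, :])           # (frames, visemes)
    track = (D * targets).sum(axis=1) / D.sum(axis=1)     # dominance-weighted blend
    print(np.round(track, 3))
    ```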

  9. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratiannil, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.; Willett, K.

    2013-09-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including i) the centered root mean square error relative to the true homogeneous values at various averaging scales, ii) the error in linear trend estimates and iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.
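
    A plausible minimal reading of two of the metrics (the exact HOME definitions are not given in this abstract): centered RMSE after mean removal, and the error in the fitted linear trend. The synthetic temperature series with an unremoved 0.2-unit break is invented for illustration.

    ```python
    import numpy as np

    def centered_rmse(homogenized, truth):
        """RMSE after removing each series' mean (anomaly-style comparison)."""
        h = homogenized - np.mean(homogenized)
        t = truth - np.mean(truth)
        return float(np.sqrt(np.mean((h - t) ** 2)))

    def trend_error(homogenized, truth, years):
        """Difference in fitted linear trends (units per year)."""
        return np.polyfit(years, homogenized, 1)[0] - np.polyfit(years, truth, 1)[0]

    years = np.arange(1951, 2001)
    rng = np.random.default_rng(2)
    truth = 0.01 * (years - years[0]) + rng.normal(0, 0.3, years.size)
    homog = truth + 0.2 * (years > 1975)   # a residual, unremoved break of 0.2
    print(centered_rmse(homog, truth), trend_error(homog, truth, years))
    ```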

  10. A near-perfect invisibility cloak constructed with homogeneous materials.

    PubMed

    Li, Wei; Guan, Jianguo; Sun, Zhigang; Wang, Wei; Zhang, Qingjie

    2009-12-21

    A near-perfect, non-singular cylindrical invisibility cloak with a diamond cross section is achieved by a two-step coordinate transformation. A small line segment is stretched and then blown up into a diamond space, and finally the cloak consisting of four kinds and eight blocks of homogeneous transformation media is obtained. Numerical simulations confirm the good performance of the cloak. The operation bandwidth of the cloak is also investigated. Our scheme is promising for creating a simple and well-performing cloak in practice.

  11. Creating Special Events

    ERIC Educational Resources Information Center

    deLisle, Lee

    2009-01-01

    "Creating Special Events" is organized as a systematic approach to festivals and events for students who seek a career in event management. This book looks at the evolution and history of festivals and events and proceeds to the nuts and bolts of event management. The book presents event management as the means of planning, organizing, directing,…

  12. Creating dedicated bioenergy crops

    USDA-ARS?s Scientific Manuscript database

    Bioenergy is one of the current mechanisms of producing renewable energy to reduce our use of nonrenewable fossil fuels and to reduce carbon emissions into the atmosphere. Humans have been using bioenergy since we first learned to create and control fire - burning manure, peat, and wood to cook food...

  13. Creating a Classroom Newspaper.

    ERIC Educational Resources Information Center

    Buss, Kathleen, Ed.; McClain-Ruelle, Leslie, Ed.

    Based on the premise that students can learn a great deal by reading and writing a newspaper, this book was created by preservice instructors to teach upper elementary students (grades 3-5) newspaper concepts, journalism, and how to write newspaper articles. It shows how to use newspaper concepts to help students integrate knowledge from multiple…

  14. Creating a Virtual Gymnasium

    ERIC Educational Resources Information Center

    Fiorentino, Leah H.; Castelli, Darla

    2005-01-01

    Physical educators struggle with the challenges of assessing student performance, providing feedback about motor skills, and creating opportunities for all students to engage in game-play on a daily basis. The integration of technology in the gymnasium can address some of these challenges by improving teacher efficiency and increasing student…

  15. Creating an Interactive PDF

    ERIC Educational Resources Information Center

    Branzburg, Jeffrey

    2008-01-01

    There are many ways to begin a PDF document using Adobe Acrobat. The easiest and most popular way is to create the document in another application (such as Microsoft Word) and then use the Adobe Acrobat software to convert it to a PDF. In this article, the author describes how he used Acrobat's many tools in his project--an interactive…

  16. Creating a Classroom Makerspace

    ERIC Educational Resources Information Center

    Rivas, Luz

    2014-01-01

    What is a makerspace? Makerspaces are community-operated physical spaces where people (makers) create do-it-yourself projects together. These membership spaces serve as community labs where people learn together and collaborate on projects. Makerspaces often have tools and equipment like 3-D printers, laser cutters, and soldering irons.…

  17. Creating Motivating Job Aids.

    ERIC Educational Resources Information Center

    Tilaro, Angie; Rossett, Allison

    1993-01-01

    Explains how to create job aids that employees will be motivated to use, based on a review of pertinent literature and interviews with professionals. Topics addressed include linking motivation with job aids; Keller's ARCS (Attention, Relevance, Confidence, Satisfaction) model of motivation; and design strategies for job aids based on Keller's…

  18. Creating Dialogue by Storytelling

    ERIC Educational Resources Information Center

    Passila, Anne; Oikarinen, Tuija; Kallio, Anne

    2013-01-01

    Purpose: The objective of this paper is to develop practice and theory from Augusto Boal's dialogue technique (Image Theatre) for organisational use. The paper aims to examine how the members in an organisation create dialogue together by using a dramaturgical storytelling framework where the dialogue emerges from storytelling facilitated by…

  19. Creating Quality Schools.

    ERIC Educational Resources Information Center

    American Association of School Administrators, Arlington, VA.

    This booklet presents information on how total quality management can be applied to school systems to create educational improvement. Total quality management offers education a systemic approach and a new set of assessment tools. Chapter 1 provides a definition and historical overview of total quality management. Chapter 2 views the school…

  20. Creating snags with explosives.

    Treesearch

    Evelyn L. Bull; Arthur D. Partridge; Wayne G. Williams

    1981-01-01

    The tops of ponderosa pine (Pinus ponderosa) trees were blown off with dynamite to create nest sites for cavity-nesting wildlife. The procedure included drilling a hole almost through the trunk, inserting the dynamite, and setting the charge with primacord and fuse. Trees were simultaneously inoculated with a decay organism. The average cost was $...

  1. Creating a Virtual Gymnasium

    ERIC Educational Resources Information Center

    Fiorentino, Leah H.; Castelli, Darla

    2005-01-01

    Physical educators struggle with the challenges of assessing student performance, providing feedback about motor skills, and creating opportunities for all students to engage in game-play on a daily basis. The integration of technology in the gymnasium can address some of these challenges by improving teacher efficiency and increasing student…

  2. Creating Dialogue by Storytelling

    ERIC Educational Resources Information Center

    Passila, Anne; Oikarinen, Tuija; Kallio, Anne

    2013-01-01

    Purpose: The objective of this paper is to develop practice and theory from Augusto Boal's dialogue technique (Image Theatre) for organisational use. The paper aims to examine how the members in an organisation create dialogue together by using a dramaturgical storytelling framework where the dialogue emerges from storytelling facilitated by…

  3. Looking, Writing, Creating.

    ERIC Educational Resources Information Center

    Katzive, Bonnie

    1997-01-01

    Describes how a middle school language arts teacher makes analyzing and creating visual art a partner to reading and writing in her classroom. Describes a project on art and Vietnam which shows how background information can add to and influence interpretation. Describes a unit on Greek mythology and Greek vases which leads to a related visual…

  4. Create Your State

    ERIC Educational Resources Information Center

    Dunham, Kris; Melvin, Samantha

    2011-01-01

    Students are often encouraged to work together with their classmates, sometimes with other classes, occasionally with kids at other schools, but rarely with kids across the country. In this article the authors describe the Create Your State project, a collaborative nationwide project inspired by the Texas Chair Project wherein the artist, Damien…

  5. Creating a Third Culture

    ERIC Educational Resources Information Center

    Weisbuch, Robert A.

    2008-01-01

    In this article, the author laments higher education's lack of concern towards the development of teaching in the public schools over the last half of the 20th century. Most of academe's work on the topic of teacher training has been done at the branches of state universities that needed to make money and create a niche. The author observes that…

  6. Creating a Market.

    ERIC Educational Resources Information Center

    Kazimirski, J.; And Others

    The second in a series of programmed books, "Creating a Market" is published by the International Labour Office as a manual for persons studying marketing. This manual was designed to meet the needs of the labor organization's technical cooperation programs and is primarily concerned with consumer goods industries. Using a fill-in-the-blanks and…

  7. Creating a Classroom Makerspace

    ERIC Educational Resources Information Center

    Rivas, Luz

    2014-01-01

    What is a makerspace? Makerspaces are community-operated physical spaces where people (makers) create do-it-yourself projects together. These membership spaces serve as community labs where people learn together and collaborate on projects. Makerspaces often have tools and equipment like 3-D printers, laser cutters, and soldering irons.…

  8. Creating an Assessments Library

    ERIC Educational Resources Information Center

    Duncan, Greg; Gilbert, Jacqueline; Mackenzie, Mary; Meulener, Carol; Smith, Martin; Yetman, Beatrice; Zeppieri, Rosanne

    2006-01-01

    This article presents the steps taken over three years (2003-2006) by the Consortium for Assessing Performance Standards, a New Jersey Grant Project to create a database of thematically organized, integrated performance assessment tasks at the benchmark levels of proficiency, novice-mid, intermediate-low and pre-advanced as defined by the ACTFL…

  9. Creating Historical Drama.

    ERIC Educational Resources Information Center

    Cassler, Robert

    1990-01-01

    Describes creating for the National Archives Public Education Department a historical drama, "Second in the Realm," based on the story of the Magna Carta. Demonstrates the effectiveness of historical drama as a teaching tool. Explains the difficulties of writing such dramas and provides guidelines for overcoming these problems. (NL)

  10. [Teenagers creating art].

    PubMed

    Ahovi, Jonathan; Viverge, Agathe

    Teenagers need to interpret the world around them, sometimes in a completely different way to that in which, as children, they represented external reality. Some like drawing. They can use it to express their thoughts on death, sexuality or freedom. Their creative capacities are immense: they are creating art.

  11. Creating Special Events

    ERIC Educational Resources Information Center

    deLisle, Lee

    2009-01-01

    "Creating Special Events" is organized as a systematic approach to festivals and events for students who seek a career in event management. This book looks at the evolution and history of festivals and events and proceeds to the nuts and bolts of event management. The book presents event management as the means of planning, organizing, directing,…

  12. Creating Motivating Job Aids.

    ERIC Educational Resources Information Center

    Tilaro, Angie; Rossett, Allison

    1993-01-01

    Explains how to create job aids that employees will be motivated to use, based on a review of pertinent literature and interviews with professionals. Topics addressed include linking motivation with job aids; Keller's ARCS (Attention, Relevance, Confidence, Satisfaction) model of motivation; and design strategies for job aids based on Keller's…

  13. Creating Photo Illustrations.

    ERIC Educational Resources Information Center

    Wilson, Bradley

    2003-01-01

    Explains the uses of photo illustrations. Notes that the key to developing a successful photo illustration is collaborative planning. Outlines the following guidelines for photo illustrations: never set up a photograph to mimic reality; create only abstractions with photo illustrations; clearly label photo illustrations; and never play photo…

  14. Creating Customer Delight.

    ERIC Educational Resources Information Center

    Black, Jim

    1995-01-01

    This article proposes that college admissions officers interested in improving service should focus on creating customer delight rather than simply satisfaction, studying the system when things go wrong rather than placing blame, establishing employee well-being as the highest priority of the organization, providing necessary tools and training…

  15. Creating a Reference Toolbox.

    ERIC Educational Resources Information Center

    Scott, Jane

    1997-01-01

    To help students understand that references are tools used to locate specific information, one librarian has her third-grade students create their own reference toolboxes as she introduces dictionaries, atlases, encyclopedias, and thesauri. Presents a lesson plan to introduce print and nonprint thesauri to third and fourth graders and includes a…

  16. Creating a Study Guide.

    ERIC Educational Resources Information Center

    Kelley, Laura C.

    2001-01-01

    Argues that making a theater study guide is an excellent in-class project, encouraging research, analysis, writing, and creative thinking. Offers a framework for creating one as a classroom project using John Steinbeck's "Of Mice and Men" as an example. Lists further resources. (SR)

  17. Creating an Effective Newsletter

    ERIC Educational Resources Information Center

    Shackelford, Ray; Griffis, Kurt

    2006-01-01

    Newsletters are an important resource or form of media. They offer a cost-effective way to keep people informed, as well as to promote events and programs. Production of a newsletter makes an excellent project, relevant to real-world communication, for technology students. This article presents an activity on how to create a short newsletter. The…

  18. How Banks Create Money.

    ERIC Educational Resources Information Center

    Beale, Lyndi

    This teaching module explains how the U.S. banking system uses excess reserves to create money in the form of new deposits for borrowers. The module is part of a computer-animated series of four-to-five-minute modules illustrating standard concepts in high school economics. Although the module is designed to accompany the video program, it may be…

  19. Creating Pupils' Internet Magazine

    ERIC Educational Resources Information Center

    Bognar, Branko; Šimic, Vesna

    2014-01-01

    This article presents an action research study that aimed to improve pupils' literary creativity and enable them to use computers connected to the internet. The study was conducted in a small district village school in Croatia. Creating a pupils' internet magazine appeared to be an excellent way of achieving the educational aims of almost all…

  20. Creating Multiple Processes from Multiple Intelligences.

    ERIC Educational Resources Information Center

    Wolffe, Robert; Robinson, Helja; Grant, Jean Marie

    1998-01-01

    Howard Gardner's multiple-intelligences theory stresses that all humans possess the various intelligences (linguistic, logical-mathematical, spatial, bodily-kinesthetic, musical, interpersonal, intrapersonal, and naturalist) to differing degrees, and most people can attain adequate competency levels. This article provides a sample checklist for…

  1. Polished homogeneity testing of Corning fused silica boules

    NASA Astrophysics Data System (ADS)

    Fanning, Andrew W.; Ellison, Joseph F.; Green, Daniel E.

    1999-11-01

    Interferometrically measuring the index of refraction variation (index homogeneity) of glass blanks requires that the blanks be made transparent to the interferometer laser. One method for achieving this is to 'sandwich' a rough ground blank between two polished flats while adding an index matching liquid at each surface interface. This is better known as oil-on-flat (OOF) or oil-on-plate testing. Another method requires polishing both surfaces and is better known as polished homogeneity (PHOM) testing or the Schwider method. Corning Inc. historically has used OOF testing to measure the index homogeneity of disk-shaped, fused silica boules over multiple 18 in. diameter apertures. Recently a boule polishing and PHOM testing process was developed by Corning for measuring the homogeneity over 24 in. diameter apertures to support fused silica production for the National Ignition Facility (NIF). Consequently, the PHOM technique has been compared to the OOF process using a number of different methods including repeatability/reproducibility studies, data stitching, and vibration analysis. The analysis performed demonstrates PHOM's advantages over OOF testing.

  2. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2012-01-01

    The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data

  3. Homogeneous near-perfect invisible ground and free space cloak

    NASA Astrophysics Data System (ADS)

    Fazeli, Mohamad; Sedighy, Seyyed Hassan; Hassani, Hamid Reza

    2017-04-01

    A general approach to designing near-perfect invisible ground and free-space cloaks is introduced in this paper. The proposed method, which is based on optical transformation theory, leads to homogeneous constitutive parameters for the cloaks without any singularities. Moreover, the single-step mapping process with linear relations yields an uncomplicated design procedure. The invisibility performance obtained by this approach is also independent of the incident wave direction. The simplicity and design flexibility of the introduced approach, together with the homogeneity of the extracted parameters, greatly facilitate the design and fabrication of both the proposed ground and free-space invisible cloaks. The numerical simulations prove the capability and universality of the proposed design approach.

  4. Creating Geoscience Leaders

    NASA Astrophysics Data System (ADS)

    Buskop, J.; Buskop, W.

    2013-12-01

    The United Nations Educational, Scientific, and Cultural Organization recognizes 21 World Heritage Sites in the United States, ten of which have astounding geological features: Wrangell-St. Elias National Park, Olympic National Park, Mesa Verde National Park, Chaco Canyon, Glacier National Park, Carlsbad Caverns National Park, Mammoth Cave, Great Smoky Mountains National Park, Hawaii Volcanoes National Park, and Everglades National Park. Frustrated with fellow students addicted to smartphones and showing little interest in the geosciences, one student visited each World Heritage Site in the United States and created one e-book chapter per park. Each chapter was created with original photographs and a geological discovery hunt to encourage teen involvement in preserving remarkable geological sites. Each chapter describes at least one way young adults can get involved with the geosciences, such as cave geology, glaciology, hydrology, and volcanology. The e-book covers one park per chapter, each chapter providing a geological discovery hunt, information on how to get involved with conservation of the parks, geological maps of the parks, parallels between archaeological and geological sites, and guidance on how to talk to a ranger. The young author is approaching UNESCO to publish the work as a free e-book to encourage involvement in UNESCO sites and to prove that the geosciences are fun.

  5. Numerical Computation of Homogeneous Slope Stability

    PubMed Central

    Xiao, Shuangshuang; Li, Kemin; Ding, Xiaohua; Liu, Tong

    2015-01-01

    To simplify the computational process of homogeneous slope stability, improve computational accuracy, and find multiple potential slip surfaces of a complex geometric slope, this study utilized the limit equilibrium method to derive expression equations of overall and partial factors of safety. This study transformed the solution of the minimum factor of safety (FOS) to solving of a constrained nonlinear programming problem and applied an exhaustive method (EM) and particle swarm optimization algorithm (PSO) to this problem. In simple slope examples, the computational results using an EM and PSO were close to those obtained using other methods. Compared to the EM, the PSO had a small computation error and a significantly shorter computation time. As a result, the PSO could precisely calculate the slope FOS with high efficiency. The example of the multistage slope analysis indicated that this slope had two potential slip surfaces. The factors of safety were 1.1182 and 1.1560, respectively. The differences between these and the minimum FOS (1.0759) were small, but the positions of the slip surfaces were completely different than the critical slip surface (CSS). PMID:25784927
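
    A bare-bones global-best PSO in the spirit of the approach described, applied to a stand-in objective. In a real analysis the quadratic `fos` placeholder would be replaced by a limit-equilibrium factor-of-safety evaluation over the slip-surface parameters; all algorithm constants here are conventional defaults, not the authors' settings.

    ```python
    import numpy as np

    def pso_minimize(f, bounds, n_particles=40, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        """Bare-bones global-best particle swarm optimizer."""
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds, dtype=float).T
        x = rng.uniform(lo, hi, (n_particles, lo.size))
        v = np.zeros_like(x)
        pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
        g = pbest[pbest_f.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            fx = np.apply_along_axis(f, 1, x)
            better = fx < pbest_f
            pbest[better], pbest_f[better] = x[better], fx[better]
            g = pbest[pbest_f.argmin()].copy()
        return g, float(f(g))

    # Hypothetical stand-in: FOS of a circular slip surface parameterized by
    # center depth and radius, with a minimum of 1.2 at (1.0, 2.5).
    fos = lambda p: 1.2 + 0.3 * (p[0] - 1.0) ** 2 + 0.1 * (p[1] - 2.5) ** 2
    surface, fmin = pso_minimize(fos, bounds=[(0.0, 3.0), (1.0, 5.0)])
    print(surface, fmin)   # approaches (1.0, 2.5) with FOS ~ 1.2
    ```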

  6. Convective mixing in homogeneous porous media flow

    NASA Astrophysics Data System (ADS)

    Ching, Jia-Hau; Chen, Peilong; Tsai, Peichun Amy

    2017-01-01

    Inspired by the flow processes in the technology of carbon dioxide (CO2) storage in saline formations, we modeled a homogeneous porous media flow in a Hele-Shaw cell to investigate density-driven convection due to dissolution. We used an analogy of the fluid system to mimic the diffusion and subsequent convection when CO2 dissolves in brine, which generates a heavier solution. By varying the permeability, we examined the onset of convection, the falling dynamics, the wavelengths of fingers, and the rate of dissolution, for the Rayleigh number Ra (a dimensionless forcing term which is the ratio of buoyancy to diffusivity) in the range of 2.0 ×104≤Ra≤8.26 ×105 . Our results reveal that the effect of permeability influences significantly the initial convective speed, as well as the later coarsening dynamics of the heavier fingering plumes. However, the total dissolved mass, characterized by a nondimensional Nusselt number Nu, has an insignificant dependence on Ra. This implies that the total dissolution rate of CO2 is nearly constant in high Ra geological porous structures.
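
    For orientation, the porous-medium Rayleigh number can be estimated from standard property values using the common definition Ra = k * drho * g * H / (phi * mu * D); every number below is an assumed order-of-magnitude figure, not a value from the paper.

    ```python
    # Porous-medium Rayleigh number for dissolution-driven convection.
    # All property values are assumed, order-of-magnitude numbers for
    # CO2-saturated brine in a permeable layer.
    k    = 1e-10   # permeability (m^2)
    drho = 10.0    # density increase on CO2 dissolution (kg/m^3)
    g    = 9.81    # gravitational acceleration (m/s^2)
    H    = 2.0     # layer thickness (m)
    phi  = 0.35    # porosity
    mu   = 1e-3    # brine viscosity (Pa*s)
    D    = 2e-9    # CO2 diffusivity in brine (m^2/s)

    Ra = k * drho * g * H / (phi * mu * D)
    print(f"Ra ~ {Ra:.1e}")   # ~2.8e4, within the range studied above
    ```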

  7. ANALYSIS OF FISH HOMOGENATES FOR PERFLUORINATED COMPOUNDS

    EPA Science Inventory

    Perfluorinated compounds (PFCs) which include PFOS and PFOA are widely distributed in wildlife. Whole fish homogenates were analyzed for PFCs from the upper Mississippi, the Missouri and the Ohio rivers. Methods development, validation data, and preliminary study results will b...

  8. Producing tritium in a homogenous reactor

    DOEpatents

    Cawley, William E.

    1985-01-01

    A method and apparatus are described for the joint production and separation of tritium. Tritium is produced in an aqueous homogenous reactor and heat from the nuclear reaction is used to distill tritium from the lower isotopes of hydrogen.

  9. Model Misspecification: Finite Mixture or Homogeneous?

    PubMed Central

    Tarpey, Thaddeus; Yun, Dong; Petkova, Eva

    2007-01-01

    A common problem in statistical modelling is to distinguish between a finite mixture distribution and a homogeneous non-mixture distribution. Finite mixture models are widely used in practice, and mixtures of normal densities are often indistinguishable from homogeneous non-normal densities. This paper illustrates what happens when the EM algorithm for normal mixtures is applied to a distribution that is a homogeneous non-mixture distribution. In particular, a population-based EM algorithm for finite mixtures is introduced and applied directly to density functions instead of sample data. The population-based EM algorithm is used to find finite mixture approximations to common homogeneous distributions. An example regarding the nature of a placebo response in drug-treated depressed subjects is used to illustrate the ideas. PMID:18974843
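
    The phenomenon is easy to reproduce with the ordinary sample-based EM algorithm (the paper's population-based variant applies the analogous updates to a density rather than to data): fitting a two-component normal mixture to homogeneous, gamma-distributed data still returns a seemingly sensible "mixture". All settings below are illustrative.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)
    data = rng.gamma(shape=3.0, scale=1.0, size=2000)  # homogeneous, non-normal

    # Standard EM for a two-component normal mixture.
    pi, mu, sd = 0.5, np.array([1.0, 4.0]), np.array([1.0, 1.0])
    for _ in range(200):
        # E-step: responsibilities of component 1
        p1 = pi * norm.pdf(data, mu[0], sd[0])
        p2 = (1 - pi) * norm.pdf(data, mu[1], sd[1])
        r = p1 / (p1 + p2)
        # M-step: update weight, means, and standard deviations
        pi = r.mean()
        mu = np.array([np.average(data, weights=r), np.average(data, weights=1 - r)])
        sd = np.sqrt([np.average((data - mu[0]) ** 2, weights=r),
                      np.average((data - mu[1]) ** 2, weights=1 - r)])

    # A two-component "mixture" emerges even though the data are homogeneous.
    print(pi, mu, sd)
    ```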

  10. Homogeneous cosmological models in Yang's gravitation theory

    NASA Technical Reports Server (NTRS)

    Fennelly, A. J.; Pavelle, R.

    1979-01-01

    We present a dynamic, spatially homogeneous solution of Yang's pure space gravitational field equations which is non-Einsteinian. The predictions of this cosmological model seem to be at variance with observations.

  12. Skyrmion production on demand by homogeneous DC currents

    NASA Astrophysics Data System (ADS)

    Everschor-Sitte, Karin; Sitte, Matthias; Valet, Thierry; Abanov, Artem; Sinova, Jairo

    2017-09-01

    Topological magnetic textures—like skyrmions—are major players in the design of next-generation magnetic storage technology due to their stability and the control of their motion by ultra-low currents. A major challenge in developing new skyrmion-based technologies is the controlled creation of magnetic skyrmions without the need for complex setups. We show how to create skyrmions and other magnetic textures in ferromagnetic thin films by means of a homogeneous DC current and without requiring Dzyaloshinskii–Moriya interactions. This is possible by exploiting a static loss of stability arising from the interplay of current-induced spin-transfer torque and a spatially inhomogeneous magnetization, which can be achieved, e.g., by locally engineering the anisotropy, the magnetic field, or other magnetic interactions. The magnetic textures are created controllably and efficiently with a period that can be tuned by the applied current strength. We propose a specific experimental setup, realizable with simple materials such as cobalt-based compounds, to observe the periodic formation of skyrmions. We show that adding chiral interactions does not change the basic generation mechanism but does affect the subsequent dynamics and the stabilization of the topological textures. Our findings allow for skyrmion production on demand in simple ferromagnetic thin films using homogeneous DC currents.

  13. Effect of homogenization and pasteurization on the structure and thermal stability of whey protein in milk

    USDA-ARS?s Scientific Manuscript database

    The effect of homogenization alone or in combination with high temperature, short time (HTST) pasteurization or UHT processing on the whey fraction of milk was investigated using highly sensitive spectroscopic techniques. In pilot plant trials, 1-L quantities of whole milk were homogenized in a two-...

  14. Generating and controlling homogeneous air turbulence using random jet arrays

    NASA Astrophysics Data System (ADS)

    Carter, Douglas; Petersen, Alec; Amili, Omid; Coletti, Filippo

    2016-12-01

    The use of random jet arrays, already employed in water tank facilities to generate zero-mean-flow homogeneous turbulence, is extended to air as a working fluid. A novel facility is introduced that uses two facing arrays of individually controlled jets (256 in total) to force steady homogeneous turbulence with negligible mean flow, shear, and strain. Quasi-synthetic jet pumps are created by expanding pressurized air through small straight nozzles and are actuated by fast-response low-voltage solenoid valves. Velocity fields, two-point correlations, energy spectra, and second-order structure functions are obtained from 2D PIV and are used to characterize the turbulence from the integral-to-the Kolmogorov scales. Several metrics are defined to quantify how well zero-mean-flow homogeneous turbulence is approximated for a wide range of forcing and geometric parameters. With increasing jet firing time duration, both the velocity fluctuations and the integral length scales are augmented and therefore the Reynolds number is increased. We reach a Taylor-microscale Reynolds number of 470, a large-scale Reynolds number of 74,000, and an integral-to-Kolmogorov length scale ratio of 680. The volume of the present homogeneous turbulence, the largest reported to date in a zero-mean-flow facility, is much larger than the integral length scale, allowing for the natural development of the energy cascade. The turbulence is found to be anisotropic irrespective of the distance between the jet arrays. Fine grids placed in front of the jets are effective at modulating the turbulence, reducing both velocity fluctuations and integral scales. Varying the jet-to-jet spacing within each array has no effect on the integral length scale, suggesting that this is dictated by the length scale of the jets.
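
    As one example of the scale characterization mentioned above, the sketch below computes a second-order structure function S2(r) = <(u(x+r) - u(x))^2> along one spatial direction; the synthetic velocity signal merely stands in for a row of a PIV field.

    ```python
    import numpy as np

    def second_order_structure_function(u, dx, max_sep=None):
        """S2(r) along one direction of a velocity sample with grid spacing dx."""
        u = np.asarray(u, dtype=float)
        nmax = max_sep or u.size // 2
        seps = np.arange(1, nmax)
        s2 = np.array([np.mean((u[n:] - u[:-n]) ** 2) for n in seps])
        return seps * dx, s2

    # Synthetic stand-in for one row of a PIV velocity field (m/s), 1 mm spacing.
    rng = np.random.default_rng(4)
    u = 0.01 * np.cumsum(rng.standard_normal(2048))
    r, s2 = second_order_structure_function(u, dx=1e-3)
    print(r[:3], s2[:3])
    ```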

  15. Creating new growth platforms.

    PubMed

    Laurie, Donald L; Doz, Yves L; Sheer, Claude P

    2006-05-01

    Sooner or later, most companies can't attain the growth rates expected by their boards and CEOs and demanded by investors. To some extent, such businesses are victims of their own successes. Many were able to sustain high growth rates for a long time because they were in high-growth industries. But once those industries slowed down, the businesses could no longer deliver the performance that investors had come to take for granted. Often, companies have resorted to acquisition, though this strategy has a discouraging track record. Over time, 65% of acquisitions destroy more value than they create. So where does real growth come from? For the past 12 years, the authors have been researching and advising companies on this issue. With the support of researchers at Harvard Business School and Insead, they instituted a project titled "The CEO Agenda and Growth". They identified and approached 24 companies that had achieved significant organic growth and interviewed their CEOs, chief strategists, heads of R&D, CFOs, and top-line managers. They asked, "Where does your growth come from?" and found a consistent pattern in the answers. All the businesses grew by creating new growth platforms (NGPs) on which they could build families of products and services and extend their capabilities into multiple new domains. Identifying NGP opportunities calls for executives to challenge conventional wisdom. In all the companies studied, top management believed that NGP innovation differed significantly from traditional product or service innovation. They had independent, senior-level units with a standing responsibility to create NGPs, and their CEOs spent as much as 50% of their time working with these units. The payoff has been spectacular and lasting. For example, from 1985 to 2004, the medical devices company Medtronic grew revenues at 18% per year, earnings at 20%, and market capitalization at 30%.

  16. Cluster Mechanism of Homogeneous Crystallization (Computer Study)

    NASA Astrophysics Data System (ADS)

    Belashchenko, D. K.

    2008-12-01

    A molecular dynamics (MD) study of homogeneous crystallization of liquid rubidium is conducted with an inter-particle pair potential. The equilibrium crystallization temperature of the models was 313 K. Models consisted of 500, 998, and 1968 particles in a basic cube. The main investigation method was as follows: to detect (along the MD run) the atoms with Voronoi polyhedra (VP) of 0608 type (“0608-atoms,” as in a bcc crystal) and to detect the bound groups of 0608-atoms (“0608-clusters”) that could act as seeds for crystallization. Full crystallization was observed only at temperatures lower than 185 K, with the creation of a predominantly bcc crystal. The crystallization mechanism of Rb models differs drastically from the mechanism adopted in classical nucleation theory. It consists of the growth of the total number of 0608-atoms on cooling and the formation of 0608-clusters, analogous to the coagulation of solute in a supersaturated two-component solution. At the first stage of the process the clusters have a very loose structure (resembling a medusa or octopus with many tentacles) and enclose atoms with other Voronoi polyhedron types. The dimensions of clusters quickly increase and approach those of the basic cube. 0608-atoms play the leading role in the crystallization process and activate the transition of the surrounding atoms into 0608 coordination. The fast growth of the maximum cluster begins after it attains a critical size (about 150 0608-atoms). The fluctuations of cluster sizes are very important in the creation of a 0608-cluster of critical (threshold) size. These fluctuations are especially large in the interval from 180 K to 185 K.
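
    The cluster-detection step described above amounts to finding connected components among flagged atoms. A minimal Python sketch (our illustration, not the paper's code; the neighbor list and 0608 flags are hypothetical inputs, e.g. from a Voronoi analysis):

        from collections import deque

        def find_clusters(is_0608, neighbors):
            """Group flagged atoms into bound clusters via breadth-first search."""
            seen = [False] * len(is_0608)
            clusters = []
            for start in range(len(is_0608)):
                if not is_0608[start] or seen[start]:
                    continue
                queue, cluster = deque([start]), []
                seen[start] = True
                while queue:
                    i = queue.popleft()
                    cluster.append(i)
                    for j in neighbors[i]:          # Voronoi-adjacent atoms
                        if is_0608[j] and not seen[j]:
                            seen[j] = True
                            queue.append(j)
                clusters.append(cluster)
            return clusters

        # Toy system: atoms 0-2 flagged and mutually adjacent, atom 5 isolated.
        neighbors = [[1, 2], [0, 2], [0, 1, 3], [2, 4], [3, 5], [4]]
        is_0608 = [True, True, True, False, False, True]
        print(find_clusters(is_0608, neighbors))    # [[0, 1, 2], [5]]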

  17. A study of the homogenization of soils

    SciTech Connect

    Giovine, L.R.S.; Miller, F.L. Jr.

    1993-06-01

    In accordance with US Environmental Protection Agency (US EPA) regulations, areas of land that have been contaminated must be returned to an environmental condition that permits less restrictive forms of use. In anticipation of being listed as an EPA Superfund Site, the United States Department of Energy's (US DOE) Nevada Test Site (NTS) is evaluating existing technologies, and supporting the development of new technologies, for the removal of plutonium contaminants from soils. During the 1950s, DOE conducted a series of tests on the NTS wherein attempts were made to detonate nuclear weapons by igniting charges of high explosives packed around the weapons' warheads. While the warheads never achieved criticality, each test disseminated particulate plutonium over several square miles, principally in Area 11 of the NTS. DOE's Nevada Operations Office has committed to a Plutonium In Soils Integrated Demonstration Project (PuID) to evaluate existing and developmental technologies for the safe removal of plutonium contamination from soils. It is DOE's intention to provide approximately one ton of Area 11 soil, traced with a non-radioactive plutonium surrogate, to each of several companies with expertise in the removal of soil contaminants. These companies have expressed an interest in contracting with DOE for remediation of NTS soils. DOE wishes to evaluate each company's process in an unbiased and statistically justifiable manner. For this reason, DOE must provide to each company a large sample of soil for prototype testing. The soil must be homogenized such that the representativeness of each split is well documented and defensible. The process of uniformly mixing large volumes of soil has not been addressed, to our knowledge, in the hydrogeologic, soil science, or mining literature. Several mixing devices are currently being evaluated by DOE for use in the PuID. This report describes the results of some initial experimentation with a small cement mixer.

  18. Creating a practice website.

    PubMed

    Downes, P K

    2007-05-26

    A website is a window to the outside world. For a dental practice, it may be the first point of contact for a prospective new patient and will therefore provide them with their 'first impression'; this may be days or weeks before actually visiting the practice. This section considers the different ways of creating a dental practice website and lists some of the main dental website design companies. It also describes what factors make a successful website and offers advice on how to ensure that it complies with current regulations and recommendations.

  19. Creating healthy camp experiences.

    PubMed

    Walton, Edward A; Tothy, Alison S

    2011-04-01

    The American Academy of Pediatrics has created recommendations for health appraisal and preparation of young people before participation in day or resident camps and to guide health and safety practices for children at camp. These recommendations are intended for parents, primary health care providers, and camp administration and health center staff. Although camps have diverse environments, there are general guidelines that apply to all situations and specific recommendations that are appropriate under special conditions. This policy statement has been reviewed and is supported by the American Camp Association.

  20. A homogeneous superconducting magnet design using a hybrid optimization algorithm

    NASA Astrophysics Data System (ADS)

    Ni, Zhipeng; Wang, Qiuliang; Liu, Feng; Yan, Luguang

    2013-12-01

    This paper employs a hybrid optimization algorithm combining linear programming (LP) and nonlinear programming (NLP) to design highly homogeneous superconducting magnets for magnetic resonance imaging (MRI). The work is divided into two stages. The first (LP) stage provides a global optimal current map with several non-zero current clusters; the mathematical model for the LP was updated by taking into account the maximum axial and radial magnetic field strength limitations. In the second (NLP) stage, the non-zero current clusters were discretized into practical solenoids. The superconducting conductor consumption was set as the objective function in both the LP and NLP stages to minimize the construction cost. In addition, the peak-to-peak homogeneity over the volume of imaging (VOI), the extent of the 5 Gauss fringe field, and the maximum magnetic field strength within the superconducting coils were set as constraints. The detailed design process for a dedicated 3.0 T animal MRI scanner is presented. The homogeneous magnet produces a magnetic field quality of 6.0 ppm peak-to-peak homogeneity over a 16 cm by 18 cm elliptical VOI, and the 5 Gauss fringe field is confined within a 1.5 m by 2.0 m elliptical region.
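
    The LP stage can be prototyped with an off-the-shelf solver. A minimal Python sketch (an assumed formulation, not the authors'): minimize total |current| over candidate loops while holding the field at VOI sample points within a ppm band of the target; the sensitivity matrix S is a random placeholder for field-per-unit-current values.

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(0)
        n_loops, n_voi = 40, 25
        S = rng.uniform(0.5, 1.5, (n_voi, n_loops)) * 1e-4   # [T/A], placeholder
        B0, tol = 3.0, 6e-6                                  # 3 T target, 6 ppm band

        # Split I = Ip - Im with Ip, Im >= 0 so that |I| = Ip + Im is linear.
        c = np.ones(2 * n_loops)                  # conductor proxy: sum(Ip + Im)
        A = np.hstack([S, -S])
        A_ub = np.vstack([A, -A])                 # B <= B0(1+tol) and B >= B0(1-tol)
        b_ub = np.concatenate([np.full(n_voi, B0 * (1 + tol)),
                               np.full(n_voi, -B0 * (1 - tol))])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * n_loops))
        I = res.x[:n_loops] - res.x[n_loops:]     # signed currents; non-zero clusters
        print(res.status, np.count_nonzero(np.abs(I) > 1e-6))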

  1. Isotopic homogeneity of iron in the early solar nebula.

    PubMed

    Zhu, X K; Guo, Y; O'Nions, R K; Young, E D; Ash, R D

    2001-07-19

    The chemical and isotopic homogeneity of the early solar nebula, and the processes producing fractionation during its evolution, are central issues of cosmochemistry. Studies of the relative abundance variations of three or more isotopes of an element can in principle determine if the initial reservoir of material was a homogeneous mixture or if it contained several distinct sources of precursor material. For example, widespread anomalies observed in the oxygen isotopes of meteorites have been interpreted as resulting from the mixing of a solid phase that was enriched in 16O with a gas phase in which 16O was depleted, or as an isotopic 'memory' of Galactic evolution. In either case, these anomalies are regarded as strong evidence that the early solar nebula was not initially homogeneous. Here we present measurements of the relative abundances of three iron isotopes in meteoritic and terrestrial samples. We show that significant variations of iron isotopes exist in both terrestrial and extraterrestrial materials. But when plotted in a three-isotope diagram, all of the data for these Solar System materials fall on a single mass-fractionation line, showing that homogenization of iron isotopes occurred in the solar nebula before both planetesimal accretion and chondrule formation.
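
    The single mass-fractionation-line argument can be illustrated numerically. A minimal Python sketch (our illustration with synthetic numbers, not the paper's data): mass-dependent fractionation predicts that delta57Fe versus delta56Fe (both relative to 54Fe) falls on a line with slope near (1/54 - 1/57)/(1/54 - 1/56); points off this line would flag distinct precursor sources.

        import numpy as np

        slope_md = (1/54 - 1/57) / (1/54 - 1/56)        # ~1.47, high-T approximation
        rng = np.random.default_rng(6)
        d56 = rng.uniform(-0.5, 0.5, 20)                # hypothetical delta56Fe [permil]
        d57 = slope_md * d56 + rng.normal(0, 0.01, 20)  # analytical scatter only

        fit_slope = np.polyfit(d56, d57, 1)[0]
        print(f"fitted slope {fit_slope:.3f} vs mass-dependent {slope_md:.3f}")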

  2. Creating corporate advantage.

    PubMed

    Collis, D J; Montgomery, C A

    1998-01-01

    What differentiates truly great corporate strategies from the merely adequate? How can executives at the corporate level create tangible advantage for their businesses that makes the whole more than the sum of the parts? This article presents a comprehensive framework for value creation in the multibusiness company. It addresses the most fundamental questions of corporate strategy: What businesses should a company be in? How should it coordinate activities across businesses? What role should the corporate office play? How should the corporation measure and control performance? Through detailed case studies of Tyco International, Sharp, the Newell Company, and Saatchi and Saatchi, the authors demonstrate that the answers to all those questions are driven largely by the nature of a company's special resources--its assets, skills, and capabilities. These range along a continuum from the highly specialized at one end to the very general at the other. A corporation's location on the continuum constrains the set of businesses it should compete in and limits its choices about the design of its organization. Applying the framework, the authors point out the common mistakes that result from misaligned corporate strategies. Companies mistakenly enter businesses based on similarities in products rather than the resources that contribute to competitive advantage in each business. Instead of tailoring organizational structures and systems to the needs of a particular strategy, they create plain-vanilla corporate offices and infrastructures. The company examples demonstrate that one size does not fit all. One can find great corporate strategies all along the continuum.

  3. Creating sustainable performance.

    PubMed

    Spreitzer, Gretchen; Porath, Christine

    2012-01-01

    What makes for sustainable individual and organizational performance? Employees who are thriving--not just satisfied and productive but also engaged in creating the future. The authors found that people who fit this description demonstrated 16% better overall performance, 125% less burnout, 32% more commitment to the organization, and 46% more job satisfaction than their peers. Thriving has two components: vitality, or the sense of being alive and excited, and learning, or the growth that comes from gaining knowledge and skills. Some people naturally build vitality and learning into their jobs, but most employees are influenced by their environment. Four mechanisms, none of which requires heroic effort or major resources, create the conditions for thriving: providing decision-making discretion, sharing information about the organization and its strategy, minimizing incivility, and offering performance feedback. Organizations such as Alaska Airlines, Zingerman's, Quicken Loans, and Caiman Consulting have found that helping people grow and remain energized at work is valuable on its own merits--but it can also boost performance in a sustainable way.

  4. Creating breakthroughs at 3M.

    PubMed

    von Hippel, E; Thomke, S; Sonnack, M

    1999-01-01

    Most senior managers want their product development teams to create breakthroughs--new products that will allow their companies to grow rapidly and maintain high margins. But more often they get incremental improvements to existing products. That's partly because companies must compete in the short term. Searching for breakthroughs is expensive and time consuming; line extensions can help the bottom line immediately. In addition, developers simply don't know how to achieve breakthroughs, and there is usually no system in place to guide them. By the mid-1990s, the lack of such a system was a problem even for an innovative company like 3M. Then a project team in 3M's Medical-Surgical Markets Division became acquainted with a method for developing breakthrough products: the lead user process. The process is based on the fact that many commercially important products are initially thought of and even prototyped by "lead users"--companies, organizations, or individuals that are well ahead of market trends. Their needs are so far beyond those of the average user that lead users create innovations on their own that may later contribute to commercially attractive breakthroughs. The lead user process transforms the job of inventing breakthroughs into a systematic task of identifying lead users and learning from them. The authors explain the process and how the 3M project team successfully navigated through it. In the end, the team proposed three major new product lines and a change in the division's strategy that has led to the development of breakthrough products. And now several more divisions are using the process to break away from incrementalism.

  5. Simulator for SUPO, a Benchmark Aqueous Homogeneous Reactor (AHR)

    SciTech Connect

    Klein, Steven Karl; Determan, John C.

    2015-10-14

    A simulator has been developed for SUPO (Super Power), an aqueous homogeneous reactor (AHR) that operated at Los Alamos National Laboratory (LANL) from 1951 to 1974. During that period SUPO accumulated approximately 600,000 kWh of operation. It is considered the benchmark for steady-state operation of an AHR. The SUPO simulator was developed using the process that resulted in a simulator for an accelerator-driven subcritical system, which has been previously reported.

  6. Energy cost of creating quantum coherence

    NASA Astrophysics Data System (ADS)

    Misra, Avijit; Singh, Uttam; Bhattacharya, Samyadeb; Pati, Arun Kumar

    2016-05-01

    We consider physical situations where the resource theories of coherence and thermodynamics play competing roles. In particular, we study the creation of quantum coherence using unitary operations with limited thermodynamic resources. We find the maximal coherence that can be created under unitary operations starting from a thermal state and find explicitly the unitary transformation that creates the maximal coherence. Since coherence is created by unitary operations starting from a thermal state, it requires some amount of energy. This motivates us to explore the trade-off between the amount of coherence that can be created and the energy cost of the unitary process. We also find the maximal achievable coherence under the constraint on the available energy. Additionally, we compare the maximal coherence and the maximal total correlation that can be created under unitary transformations with the same available energy at our disposal. We find that when maximal coherence is created with limited energy, the total correlation created in the process is upper bounded by the maximal coherence, and vice versa. For two-qubit systems we show that no unitary transformation exists that creates the maximal coherence and maximal total correlation simultaneously with a limited energy cost.
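
    A toy single-qubit version of this trade-off is easy to write down. A minimal Python sketch (our illustration, not the paper's derivation): rotate a thermal qubit state by an angle theta and track the l1-norm coherence created against the energy drawn from the work source; larger rotations create more coherence at a higher energy cost.

        import numpy as np

        E, beta = 1.0, 1.0                    # level splitting, inverse temperature
        p = 1.0 / (1.0 + np.exp(-beta * E))   # thermal ground-state population
        rho = np.diag([p, 1 - p])             # thermal state in the energy basis
        H = np.diag([0.0, E])

        for theta in (0.0, np.pi / 8, np.pi / 4):
            U = np.array([[np.cos(theta), -np.sin(theta)],
                          [np.sin(theta),  np.cos(theta)]])
            out = U @ rho @ U.T
            coherence = 2 * abs(out[0, 1])              # l1-norm coherence
            cost = np.trace(H @ (out - rho)).real       # energy input of the unitary
            print(f"theta={theta:.3f}  C_l1={coherence:.3f}  dE={cost:.3f}")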

  7. Preparation and characterization of paclitaxel nanosuspension using novel emulsification method by combining high speed homogenizer and high pressure homogenization.

    PubMed

    Li, Yong; Zhao, Xiuhua; Zu, Yuangang; Zhang, Yin

    2015-07-25

    The aim of this study was to develop an alternative, more bio-available, better tolerated paclitaxel nanosuspension (PTXNS) for intravenous injection in comparison with the commercially available Taxol(®) formulation. In this study, PTXNS was prepared by an emulsification method combining a high speed homogenizer and high pressure homogenization, followed by a lyophilization process for intravenous administration. The main production parameters, including the volume ratio of organic phase to water and organic phase (Vo:Vw+o), concentration of PTX, content of PTX, emulsification time (Et), homogenization pressure (HP), and passes (Ps) for high pressure homogenization, were optimized and their effects on mean particle size (MPS) and particle size distribution (PSD) of PTXNS were investigated. The characteristics of PTXNS, such as surface morphology, physical status of paclitaxel (PTX) in PTXNS, redispersibility of PTXNS in purified water, in vitro dissolution, and bioavailability in vivo, were all investigated. The PTXNS obtained under optimum conditions had an MPS of 186.8 nm and a zeta potential (ZP) of -6.87 mV. The PTX content in PTXNS was approximately 3.42%. Moreover, the residual amount of chloroform was lower than the International Conference on Harmonization limit (60 ppm) for solvents. The dissolution study indicated that PTXNS dissolved faster than raw PTX and showed a sustained-dissolution character compared with the Taxol(®) formulation. Moreover, the bioavailability of PTXNS increased 14.38-fold and 3.51-fold compared with raw PTX and the Taxol(®) formulation, respectively.

  8. Creating a TQM culture.

    PubMed

    Lynn, G; Curto, C

    1992-11-01

    Creating a culture and environment for quality improvement is hard work that takes time and commitment. It is often frustrating and painful. For an organization to be successful in this transformation, leadership is not just important, it is vital. The leaders in TQM have new roles to play, roles that go against the grain of many of the forces that led to management success. The tasks of the leaders in a TQM organization emphasize building teamwork and removing barriers that prevent the organization from meeting customer needs. When Jamie Houghton, CEO of Corning, was asked where in his job he found the time to commit to TQM, he replied, "Continuous quality improvement is my job; it is the most important thing I do ... Quality is the primary responsibility of the leader."

  9. Creating With Carbon

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A subsidiary of SI Diamond Technology, Inc., Applied Nanotech, of Austin, Texas, is creating a buzz among various technology firms and venture capital groups interested in the company's progressive research on carbon-related field emission devices, including carbon nanotubes, filaments of pure carbon less than one ten-thousandth the width of a human hair. Since their discovery in 1991, carbon nanotubes have gained considerable attention due to their unique physical properties. For example, a single perfect carbon nanotube can range from 10 to 100 times stronger than steel, per unit weight. Recent studies also indicate that the nanotubes may be the best heat-conducting material in existence. These properties, combined with the ease of growing thin films or nanotubes by a variety of deposition techniques, make the carbon-based material one of the most desirable for cold field emission cathodes.

  10. Creating virtual ARDS patients.

    PubMed

    Das, Anup; Haque, Mainul; Chikhani, Marc; Wenfei Wang; Hardman, Jonathan G; Bates, Declan G

    2016-08-01

    This paper presents the methodology used in patient-specific calibration of a novel highly integrated model of the cardiovascular and pulmonary pathophysiology associated with Acute Respiratory Distress Syndrome (ARDS). We focus on data from previously published clinical trials on the static and dynamic cardio-pulmonary responses of three ARDS patients to changes in ventilator settings. From this data, the parameters of the integrated model were identified using an optimization-based methodology in multiple stages. Computational simulations confirm that the resulting model outputs accurately reproduce the available clinical data. Our results open up the possibility of creating in silico a biobank of virtual ARDS patients that could be used to evaluate current, and investigate novel, therapeutic strategies.

  11. Homogeneous anisotropic solutions of topologically massive gravity with a cosmological constant and their homogeneous deformations

    NASA Astrophysics Data System (ADS)

    Moutsopoulos, George

    2013-06-01

    We solve the equations of topologically massive gravity (TMG) with a potentially non-vanishing cosmological constant for homogeneous metrics without isotropy. We only reproduce known solutions. We also discuss their homogeneous deformations, possibly with isotropy. We show that de Sitter space and hyperbolic space cannot be infinitesimally homogeneously deformed in TMG. We clarify some of their Segre-Petrov types and discuss the warped de Sitter spacetime.

  12. Creating the living brand.

    PubMed

    Bendapudi, Neeli; Bendapudi, Venkat

    2005-05-01

    It's easy to conclude from the literature and the lore that top-notch customer service is the province of a few luxury companies and that any retailer outside that rarefied atmosphere is condemned to offer mediocre service at best. But even companies that position themselves for the mass market can provide outstanding customer-employee interactions and profit from them, if they train employees to reflect the brand's core values. The authors studied the convenience store industry in depth and focused on two that have developed a devoted following: QuikTrip (QT) and Wawa. Turnover rates at QT and Wawa are 14% and 22% respectively, much lower than the typical rate in retail. The authors found six principles that both firms embrace to create a strong culture of customer service. Know what you're looking for: A focus on candidates' intrinsic traits allows the companies to hire people who will naturally bring the right qualities to the job. Make the most of talent: In mass-market retail, talent is generally viewed as a commodity, but that outlook becomes a self-fulfilling prophecy. Create pride in the brand: Service quality depends directly on employees' attachment to the brand. Build community: Wawa and QT have made concerted efforts to build customer loyalty through a sense of community. Share the business context: Employees need a clear understanding of how their company operates and how it defines success. Satisfy the soul: To win an employee's passionate engagement, a company must meet his or her needs for security, esteem, and justice.

  13. Creating Griffith Observatory

    NASA Astrophysics Data System (ADS)

    Cook, Anthony

    2013-01-01

    Griffith Observatory has been the iconic symbol of the sky for southern California since it began its public mission on May 15, 1935. While the Observatory is widely known as being the gift of Col. Griffith J. Griffith (1850-1919), the story of how Griffith’s gift became reality involves many of the people better known for other contributions that made the Los Angeles area an important center of astrophysics in the 20th century. Griffith began drawing up his plans for an observatory and science museum for the people of Los Angeles after looking at Saturn through the newly completed 60-inch reflector on Mt. Wilson. He realized the social impact that viewing the heavens could have if made freely available, and discussed the idea of a public observatory with Mt. Wilson Observatory’s founder, George Ellery Hale, and Director, Walter Adams. This resulted, in 1916, in a will specifying many of the features of Griffith Observatory, and establishing a committee-managed trust fund to build it. Astronomy popularizer Mars Baumgardt convinced the committee that the Zeiss planetarium projector would be appropriate for Griffith’s project after the planetarium was introduced in Germany in 1923. In 1930, the trust committee judged funds to be sufficient to start work on creating Griffith Observatory, and letters from the committee requesting help in realizing the project were sent to Hale, Adams, Robert Millikan, and other area experts then engaged in creating the 200-inch telescope eventually destined for Palomar Mountain. A Scientific Advisory Committee, headed by Millikan, recommended that Caltech physicist Edward Kurth be put in charge of building and exhibit design. Kurth, in turn, sought help from artist Russell Porter. The architecture firm of John C. Austin and Fredrick Ashley was selected to design the project, and they adopted the designs of Porter and Kurth. Philip Fox of the Adler Planetarium was enlisted to manage the completion of the Observatory and become its

  14. Identifying homogenous subgroups for individual patient meta-analysis based on Rough Set Theory.

    PubMed

    Gil-Herrera, Eleazar; Tsalatsanis, Athanasios; Kumar, Ambuj; Mhaskar, Rahul; Miladinovic, Branko; Yalcin, Ali; Djulbegovic, Benjamin

    2014-01-01

    Failure to detect and manage heterogeneity between clinical trials included in meta-analysis may lead to misinterpretation of summary effect estimates. This may ultimately compromise the validity of the results of the meta-analysis. Typically, when heterogeneity between trials is detected, researchers use sensitivity or subgroup analysis to manage it. However, both methods fail to explain why heterogeneity existed in the first place. Here we propose a novel methodology that relies on Rough Set Theory (RST) to detect, explain, and manage the sources of heterogeneity applicable to meta-analysis performed on individual patient data (IPD). The method exploits the RST relations of discernibility and indiscernibility to create homogeneous groups of patients. We applied our methodology on a dataset of 1,111 patients enrolled in 9 randomized controlled trials studying the effect of two transplantation procedures in the management of hematologic malignancies. Our method was able to create three subgroups of patients with remarkably low statistical heterogeneity values (16.8%, 0% and 0% respectively). The proposed methodology has the potential to automate and standardize the process of detecting and managing heterogeneity in IPD meta-analysis. Future work involves investigating the applications of the proposed methodology in analyzing treatment effects in patients belonging to different risk groups, which will ultimately assist in personalized healthcare decision making.
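
    The heterogeneity values quoted above are I^2 statistics. A minimal Python sketch (assumed, not the authors' code) of computing Cochran's Q and I^2 from per-trial effect estimates, as one might do before and after subgrouping; the effects and variances are hypothetical.

        import numpy as np

        def cochran_q_i2(effects, variances):
            """Cochran's Q and I^2 (%) from per-trial effects and variances."""
            w = 1.0 / np.asarray(variances)
            effects = np.asarray(effects)
            pooled = np.sum(w * effects) / np.sum(w)   # fixed-effect pooled estimate
            q = np.sum(w * (effects - pooled) ** 2)    # Cochran's Q
            df = len(effects) - 1
            i2 = 0.0 if q <= df else (q - df) / q * 100.0
            return q, i2

        # Hypothetical per-trial effect estimates (e.g., log hazard ratios):
        effects = [0.10, 0.25, -0.05, 0.30, 0.12, 0.40, -0.10, 0.22, 0.05]
        variances = [0.02] * 9
        print(cochran_q_i2(effects, variances))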

  15. Multimode stretched spiral vortex and nonequilibrium energy spectrum in homogeneous shear flow turbulence

    NASA Astrophysics Data System (ADS)

    Horiuti, Kiyosi; Ozawa, Tetsuya

    2011-03-01

    The stretched spiral vortex [T. S. Lundgren, "Strained spiral vortex model for turbulent structures," Phys. Fluids 25, 2193 (1982)] is identified in turbulence in homogeneous shear flow and the spectral properties of this flow are studied using direct-numerical simulation data. The effects of mean shear on the genesis, growth, and annihilation processes of the spiral vortex are elucidated, and the role of the spiral vortex in the generation of turbulence is shown. As in homogeneous isotropic turbulence [K. Horiuti and T. Fujisawa, "The multi mode stretched spiral vortex in homogeneous isotropic turbulence," J. Fluid Mech. 595, 341 (2008)], multimodes of the spiral vortex are extracted. Two symmetric modes of configurations with regard to the vorticity alignment along the vortex tube in the core region and dual vortex sheets spiraling around the tube are often educed. One of the two symmetric modes is created by a conventional rolling-up of a single spanwise shear layer. Another one is created by the convergence of the recirculating flow or streamwise roll [F. Waleffe, "Homotopy of exact coherent structures in plane shear flows," Phys. Fluids 15, 1517 (2003)] caused by the upward and downward motions associated with the streaks. The vortex tube is formed by axial straining and lowering of pressure in the recirculating region. The spanwise shear layers are entrained by the tube and they form spiral turns. The latter symmetric mode tends to be transformed into the former mode with lapse of time due to the action of the pressure Hessian term. The power law in the inertial subrange energy spectrum is studied. The base steady spectrum fits the equilibrium Kolmogorov -5/3 spectrum, to which a nonequilibrium component induced by the fluctuation of the dissipation rate ɛ is added. This component is extracted using the conditional sampling on ɛ, and it is shown that it fits the -7/3 power in accordance with the statistical theory. The correlation between these spectra and

  16. Effect of non-homogenous thermal stress during sub-lethal photodynamic antimicrobial chemotherapy

    NASA Astrophysics Data System (ADS)

    Gadura, N.; Kokkinos, D.; Dehipawala, S.; Cheung, E.; Sullivan, R.; Subramaniam, R.; Schneider, P.; Tremberger, G., Jr.; Holden, T.; Lieberman, D.; Cheung, T.

    2012-03-01

    Pathogens can be inactivated via a light source coupled with a photosensitizing agent in photodynamic antimicrobial chemotherapy (PACT). This project studied the effect of a non-homogenous substrate on cell colonies. The non-homogeneity could be controlled by iron oxide nano-particle doping in porous glassy substrates, such that each cell would experience tens of hot spots when illuminated with an additional light source. The substrate non-homogeneity was characterized by Atomic Force Microscopy, Transmission Electron Microscopy, and Extended X-Ray Absorption Fine Structure at the Brookhaven Synchrotron Light Source. Microscopy images of cell motion were used to study motility. Laboratory cell colonies on non-homogenous substrates exhibit reduced motility similar to that observed with sub-lethal PACT treatment. Such motility reduction on a non-homogenous substrate is interpreted as the presence of thermal stress. The studied pathogens included E. coli and Pseudomonas aeruginosa. The non-pathogenic microbe Bacillus subtilis was also studied for comparison. The results show that sub-lethal PACT could be effective with additional non-homogenous thermal stress. The use of non-uniform illumination on a homogeneous substrate to create thermal stress on the sub-micron length scale is discussed via light correlation in propagation through a random medium. Complementing sub-lethal PACT with such thermal stress would therefore be an appropriate application.

  17. Analysis of homogeneous/non-homogeneous nanofluid models accounting for nanofluid-surface interactions

    NASA Astrophysics Data System (ADS)

    Ahmad, R.

    2016-07-01

    This article reports an unbiased analysis for water-based rod-shaped alumina nanoparticles, considering both the homogeneous and non-homogeneous nanofluid models over the coupled nanofluid-surface interface. The mechanics of the surface are found for both the homogeneous and non-homogeneous models, which were ignored in previous studies. The viscosity and thermal conductivity data are implemented from the international nanofluid property benchmark exercise. All the simulations were performed using experimentally verified results. By considering the homogeneous and non-homogeneous models, the precise movement of the alumina nanoparticles over the surface has been observed by solving the corresponding system of differential equations. For the non-homogeneous model, a uniform temperature and nanofluid volume fraction are assumed at the surface, and the flux of the alumina nanoparticles is taken as zero. The assumption of zero nanoparticle flux at the surface makes the non-homogeneous model physically more realistic. The differences in all profiles between the homogeneous and non-homogeneous models are insignificant, owing to small deviations in the values of the Brownian motion and thermophoresis parameters.

  18. Climate Data Homogenization Using Edge Detection Algorithms

    NASA Astrophysics Data System (ADS)

    Hammann, A. C.; Rennermalm, A. K.

    2015-12-01

    The problem of climate data homogenization has predominantly been addressed by testing the likelihood of one or more breaks inserted into a given time series and modeling the mean to be stationary in between the breaks. We recast the same problem in a slightly different form: that of detecting step-like changes in noisy data, and observe that this problem has spawned a large number of approaches to its solution as the "edge detection" problem in image processing. With respect to climate data, we ask the question: How can we optimally separate step-like from smoothly-varying low-frequency signals? We study the hypothesis that the edge-detection approach makes better use of all information contained in the time series than the "traditional" approach (e.g. Caussinus and Mestre, 2004), which we base on several observations. 1) The traditional formulation of the problem reduces the available information from the outset to that contained in the test statistic. 2) The criterion of local steepness of the low-frequency variability, while at least hypothetically useful, is ignored. 3) The practice of using monthly data corresponds, mathematically, to applying a moving average filter (to reduce noise) and subsequent subsampling of the result; this subsampling reduces the amount of available information beyond what is necessary for noise reduction. Most importantly, the tradeoff between noise reduction (better with filters with wide support in the time domain) and localization of detected changes (better with filters with narrow support) is expressed in the well-known uncertainty principle and can be addressed optimally within a time-frequency framework. Unsurprisingly, a large number of edge-detection algorithms have been proposed that make use of wavelet decompositions and similar techniques. We are developing this framework in part to be applied to a particular set of climate data from Greenland; we will present results from this application as well as from tests with
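
    The core idea--detecting step-like changes against smooth variability--can be sketched with a Haar-like moving difference of means, where the window width w sets the trade-off between noise reduction and localization. A minimal Python sketch (our illustration, not the authors' algorithm), with synthetic monthly data:

        import numpy as np

        def haar_response(x, w):
            """Difference of means over [i-w, i) and [i, i+w) at each interior i."""
            c = np.concatenate([[0.0], np.cumsum(x)])
            i = np.arange(w, len(x) - w + 1)
            return i, (c[i + w] - c[i]) / w - (c[i] - c[i - w]) / w

        rng = np.random.default_rng(1)
        t = np.arange(240)                       # 20 years of monthly values
        x = (0.3 * np.sin(2 * np.pi * t / 12)    # seasonal cycle
             + (t >= 120) * 1.0                  # inserted break at t = 120
             + rng.normal(0, 0.3, t.size))       # noise
        i, r = haar_response(x, w=24)
        print("strongest step near t =", i[np.argmax(np.abs(r))])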

  19. Homogenization in compiling ICRF combined catalogs

    NASA Astrophysics Data System (ADS)

    Marco, F. J.; Martínez, M. J.; López, J. A.

    2013-10-01

    Context. The International Astronomical Union (IAU) recommendations regarding the International Celestial Reference Frame (ICRF) realizations require the construction of radio source catalogs obtained using very-long-baseline interferometry (VLBI) methods. The improvement of these catalogs is a necessary procedure for the further densification of the ICRF over the celestial sphere. Aims: The differing positions obtained from several catalogs for sources common with the ICRF make it necessary to critically revise the different methods employed in improving the ICRF from several radio source catalogs. In this sense, a revision of the analytical and the statistical methods is necessary, in line with their advantages and disadvantages. We have a double goal: first, we propose an adequate treatment of the residuals of several catalogs to obtain a homogeneous catalog; second, we attempt to discern whether a combined catalog is homogeneous. Methods: We define homogeneity as applied to our problem in a dual sense: the first deals with the spatial distribution of the data over the celestial sphere. The second has a statistical meaning, as we consider that homogeneity exists when the residuals between a given catalog and the ICRF behave as a unimodal pure Gaussian. We use a nonparametric method, which enables us to homogeneously extend the statistical properties of the residuals over the entire sphere. This intermediate adjustment allows for subsequent computation of the coefficients for any parametric adjustment model with higher accuracy and greater stability, and it prevents problems related to direct adjustments using the models. On the other hand, the homogeneity of the residuals in a catalog is tested using different weights. Our procedure also serves to propose the most suitable weights to maintain homogeneity in the final results. We perform a test using the ICRF-Ext2, JPL, and USNO quasar catalogs. Results: We show that a combination of catalogs can only
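
    The statistical half of this homogeneity definition can be checked directly. A minimal Python sketch (assumed, not the authors' procedure): test whether catalog-minus-ICRF residuals behave as a single Gaussian using the D'Agostino-Pearson normality test; the residuals here are synthetic placeholders.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        residuals = rng.normal(0.0, 0.25, 300)   # placeholder position offsets [mas]
        stat, p = stats.normaltest(residuals)    # D'Agostino-Pearson K^2 test
        verdict = "consistent with a pure Gaussian" if p > 0.05 else "not Gaussian"
        print(f"K^2 = {stat:.2f}, p = {p:.3f} -> {verdict}")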

  20. Method of Mapping Anomalies in Homogenous Material

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E. (Inventor); Taylor, Bryant D. (Inventor)

    2016-01-01

    An electrical conductor and antenna are positioned in a fixed relationship to one another. Relative lateral movement is generated between the electrical conductor and a homogenous material while maintaining the electrical conductor at a fixed distance from the homogenous material. The antenna supplies a time-varying magnetic field that causes the electrical conductor to resonate and generate harmonic electric and magnetic field responses. Disruptions in at least one of the electric and magnetic field responses during this lateral movement are indicative of a lateral location of a subsurface anomaly. Next, relative out-of-plane movement is generated between the electrical conductor and the homogenous material in the vicinity of the anomaly's lateral location. Disruptions in at least one of the electric and magnetic field responses during this out-of-plane movement are indicative of a depth location of the subsurface anomaly. A recording of the disruptions provides a mapping of the anomaly.

  1. Rapid biotic homogenization of marine fish assemblages.

    PubMed

    Magurran, Anne E; Dornelas, Maria; Moyes, Faye; Gotelli, Nicholas J; McGill, Brian

    2015-09-24

    The role human activities play in reshaping biodiversity is increasingly apparent in terrestrial ecosystems. However, the responses of entire marine assemblages are not well-understood, in part, because few monitoring programs incorporate both spatial and temporal replication. Here, we analyse an exceptionally comprehensive 29-year time series of North Atlantic groundfish assemblages monitored over 5° latitude to the west of Scotland. These fish assemblages show no systematic change in species richness through time, but steady change in species composition, leading to an increase in spatial homogenization: the species identity of colder northern localities increasingly resembles that of warmer southern localities. This biotic homogenization mirrors the spatial pattern of unevenly rising ocean temperatures over the same time period suggesting that climate change is primarily responsible for the spatial homogenization we observe. In this and other ecosystems, apparent constancy in species richness may mask major changes in species composition driven by anthropogenic change.
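
    Spatial homogenization of this kind is commonly quantified as a rise in the mean pairwise similarity of species lists across localities. A minimal Python sketch (our illustration with made-up species sets, not the survey data):

        from itertools import combinations

        def mean_jaccard(site_species):
            """Mean pairwise Jaccard similarity across a list of species sets."""
            pairs = list(combinations(site_species, 2))
            return sum(len(a & b) / len(a | b) for a, b in pairs) / len(pairs)

        early = [{"cod", "haddock"}, {"cod", "saithe"}, {"hake", "saithe"}]
        late = [{"cod", "hake", "saithe"}, {"cod", "hake"}, {"cod", "hake", "saithe"}]
        print(mean_jaccard(early), mean_jaccard(late))   # similarity rises over time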

  2. Computational Homogenization of Defect Driving Forces

    NASA Astrophysics Data System (ADS)

    Ricker, Sarah; Mergheim, Julia; Steinmann, Paul

    Because many engineering materials and also biological tissues possess an underlying (heterogeneous) micro-structure, it is not sufficient to simulate these materials with pre-assumed overall constitutive laws. Therefore, we apply a homogenization scheme, which determines the macroscopic material behavior based on an analysis of the underlying micro-structure. In the work at hand, the focus is on the extension of the classical computational homogenization scheme towards the homogenization of material forces. To this end, volume forces, which may emerge due to inhomogeneities in the material, have to be incorporated. With the assistance of this material formulation and the equivalence of the J-integral and the material force at a crack tip, studies on the influence of the micro-structure on macroscopic crack propagation are carried out.
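
    At the heart of any such scheme is the averaging step that lifts micro-scale fields to the macro scale. A minimal Python sketch (our illustration, not the authors' formulation): the macroscopic stress as the volume average of microscopic stress tensors over a representative volume element (RVE), with placeholder quadrature data.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 1000                                        # integration points in the RVE
        vol = rng.random(n)
        vol /= vol.sum()                                # volume fractions (weights)
        sigma = rng.normal(0, 1, (n, 3, 3))             # placeholder stress tensors
        sigma = (sigma + sigma.transpose(0, 2, 1)) / 2  # symmetrize

        sigma_macro = np.einsum("i,ijk->jk", vol, sigma)  # <sigma> over the RVE
        print(sigma_macro)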

  3. Supported Dendrimer-Encapsulated Metal Clusters: Toward Heterogenizing Homogeneous Catalysts.

    PubMed

    Ye, Rong; Zhukhovitskiy, Aleksandr V; Deraedt, Christophe V; Toste, F Dean; Somorjai, Gabor A

    2017-08-15

    Recyclable catalysts, especially those that display selective reactivity, are vital for the development of sustainable chemical processes. Among available catalyst platforms, heterogeneous catalysts are particularly well-disposed toward separation from the reaction mixture via filtration methods, which renders them readily recyclable. Furthermore, heterogeneous catalysts offer numerous handles-some without homogeneous analogues-for performance and selectivity optimization. These handles include nanoparticle size, pore profile of porous supports, surface ligands and interface with oxide supports, and flow rate through a solid catalyst bed. Despite these available handles, however, conventional heterogeneous catalysts are themselves often structurally heterogeneous compared to homogeneous catalysts, which complicates efforts to optimize and expand the scope of their reactivity and selectivity. Ongoing efforts in our laboratories are aimed to address the above challenge by heterogenizing homogeneous catalysts, which can be defined as the modification of homogeneous catalysts to render them in a separable (solid) phase from the starting materials and products. Specifically, we grow the small nanoclusters in dendrimers, a class of uniform polymers with the connectivity of fractal trees and generally radial symmetry. Thanks to their dense multivalency, shape persistence, and structural uniformity, dendrimers have proven to be versatile scaffolds for the synthesis and stabilization of small nanoclusters. Then these dendrimer-encapsulated metal clusters (DEMCs) are adsorbed onto mesoporous silica. Through this method, we have achieved selective transformations that had been challenging to accomplish in a heterogeneous setting, e.g., π-bond activation and aldol reactions. Extensive investigation into the catalytic systems under reaction conditions allowed us to correlate the structural features (e.g., oxidation states) of the catalysts and their activity. Moreover, we have

  4. Quality Control and Homogeneity of Precipitation Data in the Southwest of Europe.

    NASA Astrophysics Data System (ADS)

    González-Rouco, J. Fidel; Jiménez, J. Luis; Quesada, Vicente; Valero, Francisco

    2001-03-01

    A quality control process involving outliers processing, homogenization, and interpolation has been applied to 95 monthly precipitation series in the Iberian Peninsula, southern France, and northern Africa during the period 1899-1989. A detailed description of the procedure results is provided and the impact of adjustments on trend estimation is discussed. Outliers have been censored by trimming extreme values. Homogeneity adjustments have been developed by applying the Standard Normal Homogeneity Test in combination with an objective methodology to select reference series. The spatial distribution of outliers indicates that they are due to climate variability rather than measurement errors. After carrying out the homogeneity procedure, 40% of the series were found to be homogeneous, 49.5% became homogeneous after one adjustment, and 9.5% after two adjustments. About 30% of the inhomogeneities could be traced to information in the scarce history files. It is shown that these data present severe homogeneity problems and that applying outlier and homogeneity adjustments greatly changes the patterns of trends for this area.
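
    The Standard Normal Homogeneity Test mentioned above has a compact form. A minimal Python sketch (assumed, not the authors' implementation): standardize the candidate series, compute T(k) = k*mean(z[:k])^2 + (n-k)*mean(z[k:])^2, and take the maximum over break positions k; the synthetic series below carries an artificial shift.

        import numpy as np

        def snht(x):
            """Return (max T statistic, most likely break position k)."""
            z = (x - x.mean()) / x.std(ddof=1)
            n = len(z)
            T = [k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                 for k in range(1, n)]
            k_best = int(np.argmax(T)) + 1
            return T[k_best - 1], k_best

        rng = np.random.default_rng(4)
        x = rng.normal(100.0, 10.0, 90)   # 90 years of synthetic precipitation
        x[60:] += 12.0                    # inhomogeneity after year 60
        print(snht(x))                    # large T with k near 60 flags the break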

  5. Creating alternatives in science

    PubMed Central

    2009-01-01

    Traditional scientist training at the PhD level does not prepare students to be competitive in biotechnology or other non-academic science careers. Some universities have developed biotechnology-relevant doctoral programmes, but most have not. Forming a life science career club makes a statement to university administrators that it is time to rework the curriculum to include biotechnology-relevant training. A career club can supplement traditional PhD training by introducing students to available career choices, help them develop a personal network and teach the business skills that they will need to be competitive in science outside of academia. This paper is an instructional guide designed to help students create a science career club at their own university. These suggestions are based on the experience gained in establishing such a club for the Graduate School at the University of Colorado Denver. We describe the activities that can be offered, the job descriptions for the offices required and potential challenges. With determination, a creative spirit, and the guidance of this paper, students should be able to greatly increase awareness of science career options, and begin building the skills necessary to become competitive in non-academic science. PMID:20161069

  6. Creating alternatives in science.

    PubMed

    Gravagna, Nicole G

    2009-04-01

    Traditional scientist training at the PhD level does not prepare students to be competitive in biotechnology or other non-academic science careers. Some universities have developed biotechnology-relevant doctoral programmes, but most have not. Forming a life science career club makes a statement to university administrators that it is time to rework the curriculum to include biotechnology-relevant training. A career club can supplement traditional PhD training by introducing students to available career choices, help them develop a personal network and teach the business skills that they will need to be competitive in science outside of academia. This paper is an instructional guide designed to help students create a science career club at their own university. These suggestions are based on the experience gained in establishing such a club for the Graduate School at the University of Colorado Denver. We describe the activities that can be offered, the job descriptions for the offices required and potential challenges. With determination, a creative spirit, and the guidance of this paper, students should be able to greatly increase awareness of science career options, and begin building the skills necessary to become competitive in non-academic science.

  7. Creating new market space.

    PubMed

    Kim, W C; Mauborgne, R

    1999-01-01

    Most companies focus on matching and beating their rivals. As a result, their strategies tend to take on similar dimensions. What ensues is head-to-head competition based largely on incremental improvements in cost, quality, or both. The authors have studied how innovative companies break free from the competitive pack by staking out fundamentally new market space--that is, by creating products or services for which there are no direct competitors. This path to value innovation requires a different competitive mind-set and a systematic way of looking for opportunities. Instead of looking within the conventional boundaries that define how an industry competes, managers can look methodically across them. By so doing, they can find unoccupied territory that represents real value innovation. Rather than looking at competitors within their own industry, for example, managers can ask why customers make the trade-off between substitute products or services. Home Depot, for example, looked across the substitutes serving home improvement needs. Intuit looked across the substitutes available to individuals managing their personal finances. In both cases, powerful insights were derived from looking at familiar data from a new perspective. Similar insights can be gleaned by looking across strategic groups within an industry; across buyer groups; across complementary product and service offerings; across the functional-emotional orientation of an industry; and even across time. To help readers explore new market space systematically, the authors developed a tool, the value curve, that can be used to represent visually a range of value propositions.

  8. Creating an open mind.

    PubMed

    Monaghan, Duncan

    2011-07-01

    Duncan Monaghan is 33 years old and in his second year of an Arts degree in Creative Writing. He is a published poet and is currently producing a music CD. Duncan has a history of bipolar disorder which was diagnosed when he was nineteen: "It worried me at first a lot. It played on my mind constantly. I felt different from everybody else--I did not understand what was happening to me." Drawing on his life experiences, Duncan has been enhancing his recovery through creativity--in poetry, lyrics, music and story. "Life for me was a constant battle of relying on medication and appointments with my case manager...until I realized I could combine my recovery with my passions as a tool to use as an outlet to many of the "mind traps" I so often found hindering my own recovery." Duncan is Aboriginal and has experience of the mental health systems in most states and territories and now lives in Brisbane. This is a shortened version of his presentation at Creating Futures 2010.

  9. Creating Heliophysics Concept Maps

    NASA Astrophysics Data System (ADS)

    Ali, N. A.; Peticolas, L. M.; Paglierani, R.; Mendez, B. J.

    2011-12-01

    The Center for Science Education at the University of California Berkeley's Space Sciences Laboratory is creating concept maps for Heliophysics and would like to get input from scientists. The purpose of this effort is to identify key concepts related to Heliophysics and map their progression to show how students' understanding of Heliophysics might develop from Kindergarten through higher education. These maps are meant to tie into the AAAS Project 2061 Benchmarks for Scientific Literacy and the National Science Education Standards. It is hoped that the results of this effort will be useful for curriculum designers developing Heliophysics-related curriculum materials and for classroom teachers using Heliophysics materials. The need for concept maps was identified as a result of product analysis undertaken by the NASA Heliophysics Forum Team. The NASA Science Education and Public Outreach Forums have as two of their goals to improve the characterization of the contents of the Science Mission Directorate Education and Public Outreach (SMD E/PO) portfolio (Objective 2.1) and to assist SMD in addressing gaps in the portfolio of SMD E/PO products and project activities (Objective 2.2). An important part of this effort is receiving feedback from solar scientists regarding the inclusion of key concepts and their progression in the maps. This session will introduce the draft concept maps and elicit feedback from scientists.

  10. On gain in homogenized composite materials

    NASA Astrophysics Data System (ADS)

    Mackay, Tom G.; Lakhtakia, Akhlesh

    2016-09-01

    Three theoretical studies were undertaken, each based on the Bruggeman homogenization formalism and each involving homogenized composite materials (HCMs) comprising active component materials. It was found that: (i) HCMs can exhibit higher degrees of amplification than their component materials; (ii) anisotropic HCMs can simultaneously exhibit plane-wave amplification for certain propagation directions and plane-wave attenuation for other propagation directions; and (iii) for isotropic chiral HCMs, left-circularly polarized fields may be amplified while right-circularly polarized fields may be simultaneously attenuated (or vice versa) in any propagation direction.
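
    For two components, the Bruggeman estimate reduces to a quadratic. A minimal Python sketch (assumed, not the paper's full anisotropic/chiral treatment): solve f*(e1 - ee)/(e1 + 2*ee) + (1 - f)*(e2 - ee)/(e2 + 2*ee) = 0 for the effective permittivity ee; the permittivity values, the sign convention for gain (Im < 0 here), and the branch choice are all illustrative assumptions.

        import numpy as np

        def bruggeman(e1, e2, f):
            """Effective permittivity of a fraction f of e1 mixed into e2."""
            b = f * (2 * e1 - e2) + (1 - f) * (2 * e2 - e1)
            roots = np.roots([2.0, -b, -e1 * e2])  # 2*ee^2 - b*ee - e1*e2 = 0
            return roots[np.argmax(roots.real)]    # take the positive-real branch

        e_active = 2.0 - 0.1j     # gain component (assumed convention: Im < 0)
        e_passive = 3.0 + 0.05j   # lossy host component
        for f in (0.2, 0.5, 0.8):
            ee = bruggeman(e_active, e_passive, f)
            print(f"f = {f:.1f}  e_eff = {ee:.4f}")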

  11. A community-based participatory approach and engagement process creates culturally appropriate and community informed pandemic plans after the 2009 H1N1 influenza pandemic: remote and isolated First Nations communities of sub-arctic Ontario, Canada

    PubMed Central

    2012-01-01

    Background: Public health emergencies have the potential to disproportionately impact disadvantaged populations due to pre-established social and economic inequalities. Internationally, prior to the 2009 H1N1 influenza pandemic, existing pandemic plans were created with limited public consultation; therefore, the unique needs and characteristics of some First Nations communities may not be ethically and adequately addressed. Engaging the public in pandemic planning can provide vital information regarding local values and beliefs that may ultimately lead to increased acceptability, feasibility, and implementation of pandemic plans. Thus, the objective of the present study was to elicit and address First Nations community members’ suggested modifications to their community-level pandemic plans after the 2009 H1N1 influenza pandemic. Methods: The study area included three remote and isolated First Nations communities located in sub-arctic Ontario, Canada. A community-based participatory approach and community engagement process (i.e., semi-directed interviews (n = 13), unstructured interviews (n = 4), and meetings (n = 27)) were employed. Participants were purposively sampled and represented various community stakeholders (e.g., local government, health care, clergy, education, etc.) involved in the community’s pandemic response. Collected data were manually transcribed and coded using deductive and inductive thematic analysis. The data subsequently informed the modification of the community-level pandemic plans. Results: The primary modifications incorporated in the community-level pandemic plans involved adding community-specific detail. For example, ‘supplies’ emerged as an additional category of pandemic preparedness and response, since including details about supplies and resources was important due to the geographical remoteness of the study communities. Furthermore, it was important to add details of how, when, where, and who was responsible

  12. A community-based participatory approach and engagement process creates culturally appropriate and community informed pandemic plans after the 2009 H1N1 influenza pandemic: remote and isolated First Nations communities of sub-arctic Ontario, Canada.

    PubMed

    Charania, Nadia A; Tsuji, Leonard J S

    2012-04-03

    Public health emergencies have the potential to disproportionately impact disadvantaged populations due to pre-established social and economic inequalities. Internationally, prior to the 2009 H1N1 influenza pandemic, existing pandemic plans were created with limited public consultation; therefore, the unique needs and characteristics of some First Nations communities may not be ethically and adequately addressed. Engaging the public in pandemic planning can provide vital information regarding local values and beliefs that may ultimately lead to increased acceptability, feasibility, and implementation of pandemic plans. Thus, the objective of the present study was to elicit and address First Nations community members' suggested modifications to their community-level pandemic plans after the 2009 H1N1 influenza pandemic. The study area included three remote and isolated First Nations communities located in sub-arctic Ontario, Canada. A community-based participatory approach and community engagement process (i.e., semi-directed interviews (n = 13), unstructured interviews (n = 4), and meetings (n = 27)) were employed. Participants were purposively sampled and represented various community stakeholders (e.g., local government, health care, clergy, education, etc.) involved in the community's pandemic response. Collected data were manually transcribed and coded using deductive and inductive thematic analysis. The data subsequently informed the modification of the community-level pandemic plans. The primary modifications incorporated in the community-level pandemic plans involved adding community-specific detail. For example, 'supplies' emerged as an additional category of pandemic preparedness and response, since including details about supplies and resources was important due to the geographical remoteness of the study communities. Furthermore, it was important to add details of how, when, where, and who was responsible for implementing recommendations

  13. Contribution of the live-vertebrate trade toward taxonomic homogenization.

    PubMed

    Romagosa, Christina M; Guyer, Craig; Wooten, Michael C

    2009-08-01

    The process of taxonomic homogenization occurs through two mechanisms, extinctions and introductions, and leads to a reduction of global biodiversity. We used available U.S. trade data as a proxy for global trade in live vertebrates to assess the contribution of trade to the process of taxonomic homogenization. Data included all available U.S. importation and exportation records, estimation of extinction risk, and reports of establishment outside the native range for species within six vertebrate groups. Based on Monte Carlo sampling, the number of species traded, established outside of the native range, and threatened with extinction was not randomly distributed among vertebrate families. Twenty-eight percent of vertebrate families that were traded preferentially were also established or threatened with extinction, an unusually high percentage compared with the 7% of families that were not traded preferentially but that became established or threatened with extinction. The importance of trade in homogenization of vertebrates suggests that additional efforts should be made to prevent introductions and extinctions through this medium.
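
    A minimal Python sketch (assumed, not the authors' exact procedure) of the Monte Carlo logic: compare the observed number of families containing traded species against a null distribution in which the traded flags are shuffled across species; the family labels and flags below are randomly generated placeholders.

        import numpy as np

        rng = np.random.default_rng(3)
        family = rng.integers(0, 60, 2000)       # hypothetical family label per species
        traded = rng.random(2000) < 0.15         # hypothetical traded flags

        def n_families(mask):
            return np.unique(family[mask]).size

        observed = n_families(traded)
        null = [n_families(rng.permutation(traded)) for _ in range(2000)]
        p = np.mean([n <= observed for n in null])   # one-sided: family concentration
        print(observed, p)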

  14. On the origin of metal homogeneities in globular clusters

    NASA Technical Reports Server (NTRS)

    Murray, Stephen D.; Lin, Douglas N. C.

    1990-01-01

    Various transport processes which may have affected the chemical homogeneity in protocluster clouds are examined. It is shown that the characteristic diffusion time scale associated with collisions between grains and gas atoms is considerably longer than that on which star formation is expected to occur. Collisions between large grains and gas atoms lead to mass segregation and metallicity gradients on a time scale comparable to the crossing time of the clusters in the Galaxy. One possible mechanism for inducing and maintaining chemical homogeneity is turbulent diffusion in the clouds. The mixing time scale required in this case is comparable to several internal dynamical time scales, longer than the evolutionary time scale of the most massive stars, and shorter than the Galactic orbital time scale of the clouds. Thus, metals in presently observed stars probably did not originate from upper main-sequence stars of a coeval generation.

  15. A comparative study of Casson fluid with homogeneous-heterogeneous reactions.

    PubMed

    Khan, Muhammad Ijaz; Waqas, Muhammad; Hayat, Tasawar; Alsaedi, Ahmed

    2017-03-09

    Magnetohydrodynamic (MHD) stagnation point flow of Casson fluid towards a stretching sheet is addressed. Homogeneous-heterogeneous reactions together with a homogeneous heat effect subject to a resistive force of electromagnetic origin are discussed. It is assumed that the homogeneous process in the ambient fluid is governed by first-order kinetics and the heterogeneous process on the wall surface is given by isothermal cubic autocatalator kinetics. Ordinary differential systems have been considered. Solutions of the problems are presented via a numerical technique, namely the built-in shooting method. Graphical behaviors of velocity, temperature and concentration are analyzed comprehensively. Velocity is observed to be a decreasing function of the Hartmann number.
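
    To illustrate the shooting technique itself, here is a minimal sketch on the classical Blasius boundary-layer problem, used as a stand-in for the paper's Casson system (whose similarity equations carry extra Casson-parameter and Hartmann-number terms):

      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import brentq

      # Blasius boundary-layer problem: f''' + 0.5*f*f'' = 0,
      # f(0) = 0, f'(0) = 0, f'(inf) = 1.  The unknown is s = f''(0).
      def rhs(eta, y):
          f, fp, fpp = y
          return [fp, fpp, -0.5 * f * fpp]

      ETA_INF = 10.0  # numerical stand-in for eta -> infinity

      def residual(s):
          # Far-field mismatch f'(ETA_INF) - 1 for a guessed curvature s.
          sol = solve_ivp(rhs, [0.0, ETA_INF], [0.0, 0.0, s], rtol=1e-8)
          return sol.y[1, -1] - 1.0

      s_star = brentq(residual, 0.1, 1.0)  # shoot on f''(0)
      print(f"f''(0) = {s_star:.5f}")      # classical value ~0.33206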

  16. Hyperelastic bodies under homogeneous Cauchy stress induced by non-homogeneous finite deformations

    NASA Astrophysics Data System (ADS)

    Mihai, L. Angela; Neff, Patrizio

    2017-03-01

    We discuss whether homogeneous Cauchy stress implies homogeneous strain in isotropic nonlinear elasticity. While for linear elasticity the positive answer is clear, we exhibit, through detailed calculations, an example with inhomogeneous continuous deformation but constant Cauchy stress. The example is derived from a non-rank-one convex elastic energy.

  17. Creating your own leadership brand.

    PubMed

    Kerfoot, Karlene

    2002-01-01

    Building equity in a brand happens through many encounters. The initial attraction must be followed by the meeting of expectations. This creates a loyalty that is part of an emotional connection to that brand. This is the same process people go through when they first meet a leader and decide whether this is a person they want to buy into. People will examine your style, your competence, and your standards. If you fail on any of these fronts, your ability to lead will be severely compromised. People expect more of leaders now, because they know and recognize good leaders. And, predictably, people are now more cynical about leaders because of the well-publicized excesses of a few leaders who advanced their own causes at the expense of their people and their financial futures. This will turn out to be a good thing, because it will create a higher standard of leadership that all must aspire to achieve. When the bar is raised for us, our standards of performance are also raised.

  18. Creating Math Videos: Comparing Platforms and Software

    ERIC Educational Resources Information Center

    Abbasian, Reza O.; Sieben, John T.

    2016-01-01

    In this paper we present a short tutorial on creating mini-videos using two platforms--PCs and tablets such as iPads--and software packages that work with these devices. Specifically, we describe the step-by-step process of creating and editing videos using a Wacom Intuos pen-tablet plus Camtasia software on a PC platform and using the software…

  20. Creating electron vortex beams with light.

    PubMed

    Handali, Jonathan; Shakya, Pratistha; Barwick, Brett

    2015-02-23

    We propose an all-optical method of creating electron vortices utilizing the Kapitza-Dirac effect. This technique uses the transfer of orbital angular momentum from photons to free electrons creating electron vortex beams in the process. The laser intensities needed for this experiment can be obtained with available pulsed lasers and the resulting electron beams carrying orbital angular momentum will be particularly useful in the study of magnetic materials and chiral plasmonic structures in ultrafast electron microscopy.

  1. Creating a Desired Future

    ERIC Educational Resources Information Center

    Jenkins-Scott, Jackie

    2008-01-01

    When the author became president of Wheelock College in Boston in 2004, she asked the trustees and the entire campus community to engage in an innovative strategic planning and visioning process. The goal was to achieve consensus on a strategic vision for the future of Wheelock College by the end of her first year. This article discusses how…

  2. Creating a Children's Village

    ERIC Educational Resources Information Center

    Roberts, Paul

    2012-01-01

    Five years ago the author embarked on an odyssey that would fundamentally change his life as an architect. He and his partner, Dave Deppen, were selected through a very competitive process to design a new Child Development and Family Studies Center in the Sierra Foothills, near Yosemite National Park for Columbia College. The Columbia College…

  3. Creating a sling - slideshow

    MedlinePlus


  4. Create a Classroom Blog!

    ERIC Educational Resources Information Center

    Brunsell, Eric; Horejsi, Martin

    2010-01-01

    Science education blogs can serve as powerful digital lab notebooks that contain text, images, and videos. Each blog entry documents a moment in time, but becomes interactive with the addition of readers' comments. Blogs can provide a realistic experience of the peer-review process and generate evolving descriptions of observations through time.…

  8. Spatial Homogeneity and Redshift--Distance Laws

    NASA Astrophysics Data System (ADS)

    Nicoll, J. F.; Segal, I. E.

    1982-06-01

    Spatial homogeneity in the radial direction of low-redshift galaxies is subjected to Kafka-Schmidt V/Vm tests using well-documented samples. Homogeneity is consistent with the assumption of the Lundmark (quadratic redshift-distance) law, but large deviations from homogeneity are implied by the assumption of the Hubble (linear redshift-distance) law. These deviations are similar to what would be expected on the basis of the Lundmark law. Luminosity functions are obtained for each law by a nonparametric statistically optimal method that removes the observational cutoff bias in complete samples. Although the Hubble law correlation of absolute magnitude with redshift is reduced considerably by elimination of the bias, computer simulations show that its bias-free value is nevertheless at a statistically quite significant level, indicating the self-inconsistency of the law. The corresponding Lundmark law correlations are statistically quite satisfactory. The regression of redshift on magnitude also involves radial spatial homogeneity and, according to R. Soneira, has a slope determining the redshift-magnitude exponent independently of the luminosity function. We have, however, rigorously proved the material dependence of the regression on this function, and here exemplify our treatment by using the bias-free functions indicated, with results consistent with the foregoing argument.
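
    The V/Vm statistic is simple enough to sketch. Under radial homogeneity each V/Vm is uniform on [0, 1], so the sample mean should be 0.5 to within about 1/sqrt(12N). The distances below are synthetic Euclidean stand-ins, not the samples used in the paper:

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic flux-limited sample: each object is detectable out to
      # d_max; homogeneity means uniform density, i.e. d**3 uniform.
      d_max = rng.uniform(50.0, 200.0, size=500)
      d = d_max * rng.uniform(0.0, 1.0, size=500) ** (1.0 / 3.0)

      ratios = (d / d_max) ** 3           # V/Vm in Euclidean space
      n = len(ratios)
      # Under homogeneity <V/Vm> = 0.5 with standard error 1/sqrt(12*n).
      print(ratios.mean(), 1.0 / np.sqrt(12 * n))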

  9. Homogeneous Immunoassays: Historical Perspective and Future Promise

    NASA Astrophysics Data System (ADS)

    Ullman, Edwin F.

    1999-06-01

    The founding and growth of Syva Company is examined in the context of its leadership role in the development of homogeneous immunoassays. The simple mix and read protocols of these methods offer advantages in routine analytical and clinical applications. Early homogeneous methods were based on insensitive detection of immunoprecipitation during antigen/antibody binding. The advent of reporter groups in biology provided a means of quantitating immunochemical binding by labeling antibody or antigen and physically separating label incorporated into immune complexes from free label. Although high sensitivity was achieved, quantitative separations were experimentally demanding. Only when it became apparent that reporter groups could provide information, not only about the location of a molecule but also about its microscopic environment, was it possible to design practical non-separation methods. The evolution of early homogeneous immunoassays was driven largely by the development of improved detection strategies. The first commercial spin immunoassays, developed by Syva for drug abuse testing during the Vietnam war, were followed by increasingly powerful methods such as immunochemical modulation of enzyme activity, fluorescence, and photo-induced chemiluminescence. Homogeneous methods that quantify analytes at femtomolar concentrations within a few minutes now offer important new opportunities in clinical diagnostics, nucleic acid detection and drug discovery.

  10. RELIABLE COMPUTATION OF HOMOGENEOUS AZEOTROPES. (R824731)

    EPA Science Inventory

    Abstract

    It is important to determine the existence and composition of homogeneous azeotropes in the analysis of phase behavior and in the synthesis and design of separation systems, from both theoretical and practical standpoints. A new method for reliably locating an...

  11. HSTEP - Homogeneous Studies of Transiting Extrasolar Planets

    NASA Astrophysics Data System (ADS)

    Southworth, John

    2014-04-01

    This paper presents a summary of the HSTEP project: an effort to calculate the physical properties of the known transiting extrasolar planets using a homogeneous approach. I discuss the motivation for the project, list the 83 planets which have already been studied, run through some important aspects of the methodology, and finish with a synopsis of the results.

  12. Reduced-order modelling numerical homogenization.

    PubMed

    Abdulle, A; Bai, Y

    2014-08-06

    A general framework to combine numerical homogenization and reduced-order modelling techniques for partial differential equations (PDEs) with multiple scales is described. Numerical homogenization methods are usually efficient at approximating the effective solution of PDEs with multiple scales. However, classical numerical homogenization techniques require the numerical solution of a large number of so-called microproblems to approximate the effective data at selected grid points of the computational domain. Such computations become particularly expensive for high-dimensional, time-dependent or nonlinear problems. In this paper, we explain how numerical homogenization methods can benefit from reduced-order modelling techniques that allow one to identify offline and online computational procedures. The effective data are only computed accurately at a carefully selected number of grid points (offline stage) and appropriately 'interpolated' in the online stage, resulting in an online cost comparable to that of a single-scale solver. The methodology is presented for a class of PDEs with multiple scales, including elliptic, parabolic, wave and nonlinear problems. Numerical examples, including wave propagation in inhomogeneous media and solute transport in unsaturated porous media, illustrate the proposed method.
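
    The offline/online split can be caricatured in a few lines of Python. The micro-problem below is a stand-in (a 1-D cell problem whose effective coefficient is a harmonic mean), not the authors' solver; the point is only that expensive micro-solves happen offline at a few sample points and cheap interpolation replaces them online:

      import numpy as np
      from scipy.interpolate import interp1d

      def micro_problem(x):
          # Stand-in cell problem: the 1-D effective coefficient of
          # a(x, y) = 2 + x + sin(2*pi*y) is the harmonic mean over y.
          y = np.linspace(0.0, 1.0, 2001)
          return 1.0 / np.mean(1.0 / (2.0 + x + np.sin(2.0 * np.pi * y)))

      # Offline stage: expensive micro-solves at a few macro grid points.
      x_samples = np.linspace(0.0, 1.0, 9)
      a_samples = np.array([micro_problem(x) for x in x_samples])

      # Online stage: cheap interpolation replaces further micro-solves.
      a_eff = interp1d(x_samples, a_samples, kind="cubic")
      print(a_eff(0.37))  # effective coefficient, no new cell problem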

  14. General Theorems about Homogeneous Ellipsoidal Inclusions

    ERIC Educational Resources Information Center

    Korringa, J.; And Others

    1978-01-01

    Mathematical theorems about the properties of ellipsoids are developed. Included are Poisson's theorem concerning the magnetization of a homogeneous body of ellipsoidal shape, the polarization of a dielectric, the transport of heat or electricity through an ellipsoid, and other problems. (BB)

  15. Coherence delay augmented laser beam homogenizer

    SciTech Connect

    Rasmussen, P.; Bernhardt, A.

    1991-12-31

    It is an object of the present invention to provide an apparatus that can reduce the apparent coherence length of a laser beam so the beam can be used with an inexpensive homogenizer to produce an output beam with a uniform spatial intensity across its entire cross section. It is a further object of the invention to provide an improved homogenizer with a variable aperture size that is simple and easily made. It is still an additional object of the invention to provide an improved liquid-filled homogenizer utilizing total internal reflection for improved efficiency. These, and other objects of the invention are realized by using a "coherence delay line," according to the present invention, in series between a laser and a homogenizer. The coherence delay line is an optical "line" that comprises two mirrors, one partially reflecting, and one totally reflecting, arranged so that light incident from the laser first strikes the partially reflecting mirror. A portion of the beam passes through, and a portion is reflected back to the totally reflecting mirror.
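
    The quantitative idea behind the delay line can be stated in two relations (illustrative numbers, not taken from the patent):

      \[
        \ell_c = c\,\tau_c, \qquad \Delta = 2d,
      \]

    where ℓ_c is the laser coherence length, τ_c the coherence time, and Δ the extra round-trip path added by a mirror spacing d. Once 2d > ℓ_c, the successive replicas of the beam no longer interfere, so the beam entering the homogenizer behaves as if its coherence length were reduced; for example, τ_c = 1 ns gives ℓ_c ≈ 30 cm, so a spacing d of roughly 15 cm or more would suffice.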

  16. On the supposed influence of milk homogenization on the risk of CVD, diabetes and allergy.

    PubMed

    Michalski, Marie-Caroline

    2007-04-01

    Commercial milk is homogenized for the purpose of physical stability, thereby reducing fat droplet size and including caseins and some whey proteins at the droplet interface. This seems to result in a better digestibility than untreated milk. Various casein peptides and milk fat globule membrane (MFGM) proteins are reported to present either harmful (e.g. atherogenic) or beneficial bioactivity (e.g. hypotensive, anticarcinogenic and others). Homogenization might enhance either of these effects, but this remains controversial. The effect of homogenization has not been studied regarding the link between early cow's milk consumption and occurrence of type I diabetes in children prone to the disease and no link appears in the general population. Homogenization does not influence milk allergy and intolerance in allergic children and lactose-intolerant or milk-hypersensitive adults. The impact of homogenization, as well as heating and other treatments such as cheesemaking processes, on the health properties of milk and dairy products remains to be fully elucidated.

  17. Numerical homogenization of the viscoplastic behavior of snow based on X-ray tomography images

    NASA Astrophysics Data System (ADS)

    Wautier, Antoine; Geindreau, Christian; Flin, Frédéric

    2017-06-01

    While the homogenization of snow elastic properties has been widely reported in the literature, the homogenized rate-dependent behavior responsible for the densification of the snowpack has hardly ever been upscaled from the snow microstructure. We therefore adapt homogenization techniques developed within the framework of elasticity to the study of snow viscoplastic behavior. Based on the definition of kinematically uniform boundary conditions, homogenization problems are applied to 3-D images obtained from X-ray tomography, and the mechanical response of snow samples is explored for several densities. We propose an original post-processing approach in terms of viscous dissipated powers in order to formulate snow macroscopic behavior. Then, we show that Abouaf models are able to capture snow viscoplastic behavior and we formulate a homogenized constitutive equation based on a density parametrization. Eventually, we demonstrate the ability of the proposed models to account for the macroscopic mechanical response of snow for classical laboratory tests.

  18. Homogeneity of Latvian temperature and precipitation series

    NASA Astrophysics Data System (ADS)

    Lizuma, L.; Briede, A.

    2010-09-01

    During previous years and decades the homogenization of Latvian monthly temperature and precipitation data series was based on direct homogenization methods which relied on metadata and studies of the effects of specific changes in the time of observation as well as methods of observation. However, such methods are not effective for detecting shifts in temperature and precipitation data series caused by relocation of the measurement site or environmental changes. Both temperature and precipitation records are significantly affected by a number of non-climatological factors (station moves, changes in instrumentation, introduction of different observing practices such as a different observing time or the introduction of wetting corrections for precipitation, and changes in the local urban environment). If these non-homogeneities are not accounted for properly, the data are unrepresentative for analyses of climate state, variations and changes. Monthly and daily Latvian station series (1950-2008) of surface air temperature and precipitation are statistically tested with respect to homogeneity. Two homogeneity tests are applied to evaluate monthly series. The Multiple Analysis of Series for Homogenization (MASH v3.02) has been applied to 23 Latvian mean, maximum and minimum daily and monthly temperature series and to daily and monthly precipitation series. The standard normal homogeneity test (SNHT) has been applied to monthly mean temperature and precipitation series. During the tested period the station network is dense enough for efficient homogeneity testing. It has been found that all the time series contain homogeneity breaks in at least one of the months. For some stations multiple breaks were found. For mean temperature time series, 80% of the breaks are generally less than ±0.2°C. The largest detected homogeneity breaks in the mean monthly temperatures are up to ±1.0°C, in mean monthly maximum temperature are up to ±1.3°C and for mean
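
    The SNHT applied above can be sketched compactly: on a standardized series z, the statistic T(k) = k*mean(z[1..k])^2 + (n-k)*mean(z[k+1..n])^2 peaks at a break point. The series below is synthetic, with an artificial 0.8-degree shift standing in for a station move:

      import numpy as np

      def snht(series):
          # T(k) = k*mean(z_1..k)^2 + (n-k)*mean(z_k+1..n)^2 on the
          # standardized series; the break point maximizes T.
          z = (series - series.mean()) / series.std(ddof=1)
          n = len(z)
          T = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                        for k in range(1, n)])
          return T.argmax() + 1, T.max()

      rng = np.random.default_rng(4)
      x = rng.normal(6.0, 1.0, 60)
      x[30:] += 0.8               # artificial shift, e.g. a station move
      print(snht(x))              # break index near 30, large T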

  19. Creating a Social World

    PubMed Central

    Kendler, Kenneth S.; Jacobson, Kristen C.; Gardner, Charles O.; Gillespie, Nathan; Aggen, Steven A.; Prescott, Carol A.

    2014-01-01

    Context: Peer-group deviance is strongly associated with externalizing behaviors. We have limited knowledge of the sources of individual differences in peer-group deviance. Objective: To clarify genetic and environmental contributions to peer-group deviance in twins from mid-childhood through early adulthood. Design: Retrospective assessments using a life-history calendar; analysis by biometric growth curves. Setting: General community. Participants: Members of male-male pairs from the population-based Virginia Twin Registry personally interviewed in 1998–2004 (n=1802). Main Outcome Measure: Self-reported peer-group deviance at ages 8 to 11, 12 to 14, 15 to 17, 18 to 21, and 22 to 25 years. Results: Mean and variance of peer-group deviance increased substantially with age. Genetic effects on peer-group deviance showed a strong and steady increase over time. Family environment generally declined in importance over time. Individual-specific environmental influences on peer-group deviance levels were stable in the first 3 age periods and then increased as most twins left home. When standardized, the heritability of peer-group deviance is approximately 30% at ages 8 to 11 years and rises to approximately 50% across the last 3 time periods. Both genes and shared environment contributed to individual differences in the developmental trajectory of peer-group deviance. However, while the correlation between childhood peer-group deviance levels and the subsequent slope of peer-group deviance over time resulting from genetic factors was positive, the same relationship resulting from shared environmental factors was negative. Conclusions: As male twins mature and create their own social worlds, genetic factors play an increasingly important role in their choice of peers, while shared environment becomes less influential. The individual-specific environment increases in importance when individuals leave home. Individuals who have deviant peers in childhood, as a result of genetic vs

  20. Researchers Create Artificial Mouse 'Embryo'

    MedlinePlus

    Experiment used two types of gene-modified stem ... they've created a kind of artificial mouse embryo using stem cells, which can be coaxed to ...

  1. Creating Chemigrams in the Classroom.

    ERIC Educational Resources Information Center

    Guhin, Paula

    2003-01-01

    Describes an art activity in which students create "chemigrams" using exposed photo paper to create designs. Explains that this activity can be used with middle and high school students as an introduction to photography or use of chemicals. (CMK)

  3. A study on beam homogeneity for a Siemens Primus linac.

    PubMed

    Cutanda Henriquez, F; Vargas-Castrillón, S T

    2007-06-01

    Asymmetric offset fields are an important tool for radiotherapy and their suitability for treatment should be assessed. Dose homogeneity for highly asymmetric fields has been studied for a Siemens PRIMUS clinical linear accelerator. Profiles and absolute dose have been measured in fields with two jaws at the maximal position (20 cm) and the other two at maximal overtravel (10 cm), corresponding to 10 cm x 10 cm fields with extreme offset. Measured profiles have a markedly decreasing gradient towards the beam edge, making these fields unsuitable for treatments. The flattening filter radius is smaller than the primary collimator aperture, and this creates beam inhomogeneities that affect large fields in areas far from the collimator axis, as well as asymmetric fields with large offset. The results presented assess the effect that the design of the primary collimator and flattening filter assembly has on beam homogeneity. This can have clinical consequences for treatments involving fields that include these inhomogeneous areas. Comparison with calculations from a treatment planning system, Philips Pinnacle v6.3, which computes under the hypothesis of a uniformly flattened beam, reveals severe discrepancies.
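
    The reported inhomogeneity can be quantified with a profile flatness figure such as F = 100 (Dmax - Dmin) / (Dmax + Dmin) over the central 80% of the field (one common convention; protocols differ). The profile below is synthetic:

      import numpy as np

      def flatness(profile):
          # F = 100*(Dmax - Dmin)/(Dmax + Dmin) over the central 80%.
          n = len(profile)
          central = np.asarray(profile[int(0.1 * n):int(0.9 * n)], dtype=float)
          return (100.0 * (central.max() - central.min())
                  / (central.max() + central.min()))

      # Synthetic profile with a decreasing gradient towards one edge.
      x = np.linspace(-5.0, 5.0, 201)
      dose = 1.0 - 0.015 * (x + 5.0)      # ~15% droop across the field
      print(f"flatness = {flatness(dose):.1f}%")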

  4. Are geological media homogeneous or heterogeneous for neutron investigations?

    PubMed

    Woźnicka, U; Drozdowicz, K; Gabańska, B; Krynicka, E; Igielski, A

    2003-01-01

    The thermal neutron absorption cross section of a heterogeneous material is lower than that of the corresponding homogeneous one which contains the same components. When rock materials are investigated, the sample usually contains grains which create heterogeneity. The heterogeneity effect depends on the mass contributions of highly and weakly absorbing centers, on the ratio of their absorption cross sections, and on their sizes. The influence of the granulation of silicon and diabase samples on the absorption cross section measured with Czubek's method has been experimentally investigated. A 20% underestimation of the absorption cross section has been observed for diabase grains of sizes from 6.3 to 12.8 mm. Copyright 2002 Elsevier Science Ltd.
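
    For reference, the homogeneous value being compared against is the mass-weighted mixture cross section (a standard relation, not specific to Czubek's method):

      \[
        \sigma_{\mathrm{hom}} \;=\; \sum_i w_i\,\sigma_i,
      \]

    where w_i are the mass fractions and σ_i the mass absorption cross sections of the components. Self-shielding inside strongly absorbing grains makes the measured heterogeneous value fall below this, which is the underestimation reported above.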

  5. The Copenhagen problem with a quasi-homogeneous potential

    NASA Astrophysics Data System (ADS)

    Fakis, Demetrios; Kalvouridis, Tilemahos

    2017-05-01

    The Copenhagen problem is a well-known case of the famous restricted three-body problem. In this work, instead of considering Newtonian potentials and forces, we assume that the two primaries create a quasi-homogeneous potential, which means that we insert into the inverse-square law of gravitation an inverse-cube corrective term in order to approximate various phenomena, such as the radiation pressure of the primaries or their non-sphericity. Based on this new consideration we investigate the equilibrium locations of the small body and their parametric dependence, as well as the zero-velocity curves and surfaces for the planar motion, and the evolution of the regions where this motion is permitted when the Jacobian constant varies.
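
    In the usual notation for quasi-homogeneous potentials (the symbols here are illustrative; the paper's own notation may differ), each primary of mass m_i contributes

      \[
        U_i(r) \;=\; -\frac{G m_i}{r} \;-\; \frac{e_i}{r^{3}},
        \qquad
        -\frac{\partial U_i}{\partial r}
          \;=\; -\left(\frac{G m_i}{r^{2}} + \frac{3 e_i}{r^{4}}\right),
      \]

    so a positive corrective coefficient e_i strengthens the attraction at short range while a negative one (mimicking, e.g., radiation pressure) weakens it, leaving the Newtonian far field unchanged.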

  6. Anthropogenic Matrices Favor Homogenization of Tree Reproductive Functions in a Highly Fragmented Landscape

    PubMed Central

    2016-01-01

    Species homogenization or floristic differentiation are two possible consequences of the fragmentation process in plant communities. Despite the few studies, it seems clear that fragments with low forest cover inserted in anthropogenic matrices are more likely to experience floristic homogenization. However, the homogenization process has two other components, genetic and functional, which have not been investigated. The purpose of this study was to verify whether there was homogenization of tree reproductive functions in a fragmented landscape and, if found, to determine how the process was influenced by landscape composition. The study was conducted in eight fragments in southwestern Brazil. In each fragment, all individual trees were sampled that had a diameter at breast height ≥3 cm, in ten plots (0.2 ha), and classified within 26 reproductive functional types (RFTs). The process of functional homogenization was evaluated using additive partitioning of diversity; a sketch of this partition is given after this abstract. Additionally, the effect of landscape composition on functional diversity and on the number of individuals within each RFT was evaluated using a generalized linear mixed model. The fragments appeared to be in a process of functional homogenization (dominance of RFTs, alpha diversity lower than expected by chance, and low beta diversity). More than 50% of the RFTs and the functional diversity were affected by the landscape parameters. In general, the percentage of forest cover has a positive effect on RFTs while the percentage of coffee matrix has a negative one. The process of functional homogenization has serious consequences for biodiversity conservation because some functions may disappear that, in the long term, would threaten the fragments. This study contributes to a better understanding of how landscape changes affect functional diversity, the abundance of individuals in RFTs and the process of functional homogenization, as well as how to
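
    The additive partitioning used here reduces to beta = gamma - mean(alpha). A toy illustration with hypothetical RFT presence/absence data (not the study's data):

      import numpy as np

      # Hypothetical presence/absence of reproductive functional types:
      # rows = fragments, columns = RFTs.
      communities = np.array([[1, 1, 1, 0, 0],
                              [1, 1, 1, 0, 0],
                              [1, 1, 0, 1, 0]])

      alpha = communities.sum(axis=1).mean()       # mean richness per fragment
      gamma = (communities.sum(axis=0) > 0).sum()  # landscape richness
      beta = gamma - alpha                         # additive beta diversity
      print(alpha, gamma, beta)  # low beta signals homogenized fragments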

  7. Vertically homogeneous stationary tornado-type vortex

    NASA Astrophysics Data System (ADS)

    Rutkevich, P. B.; Rutkevych, P. P.

    2010-05-01

    The tornado is regarded as one of the most dangerous atmospheric phenomena. It has been intensively studied, yet there is still no established and accepted theory of how tornadoes form, and uncertainty still exists concerning extreme winds and pressure drops in tornadoes. It is commonly accepted that a tornado can be described by a set of nonlinear hydrodynamical equations; however, it is still unclear which nonlinear processes are responsible for its formation. Nonlinear terms in the system are associated with either the centrifugal force, entropy transport, or transport of humidity. The amount and spatial distribution of precipitation within the convection are important indicators of the weather phenomena associated with a particular storm. Low-precipitation supercells produce relatively little precipitation and yet show clear visual signs of rotation; they occur most often near the surface dryline, in relatively dry environments with little cloudiness. Low-precipitation storms are frequently non-tornadic, and many are non-severe despite exhibiting persistent rotation. On the other hand, the so-called high-precipitation storms are characterized by substantial precipitation within their mesocyclonic circulations. When high-precipitation storms have a recognizable hook radar echo, reflectivities in the hook are comparable to those in the precipitation core. High-precipitation supercells are probably the most common form of supercell and produce severe weather of all types, including tornadoes. Therefore, in this work we consider a hydrodynamic system with only one nonlinear term, associated with atmospheric humidity, which yields energy to the system. The tornado vortex is usually, to a good approximation, cylindrical, so we use cylindrical geometry and homogeneity in the vertical direction. In this case the problem reduces to a system of ordinary

  8. Photoinduced electron transfer processes in homogeneous and microheterogeneous solutions

    SciTech Connect

    Whitten, D.G.

    1991-10-01

    The focus of the work described in this report is on single electron transfer reactions of excited states which culminate in the formation of stable or metastable even electron species. For the most part the studies have involved even electron organic substrates which are thus converted photochemically to odd electron species and then at some stage reconvert to even electron products. These reactions generally fall into two rather different categories. In one set of studies we have examined reactions in which the metastable reagents generated by single electron transfer quenching of an excited state undergo novel fragmentation reactions, chiefly involving C-C bond cleavage. These reactions often culminate in novel and potentially useful chemical reactions and frequently have the potential for leading to new chemical products otherwise unaffordable by conventional reaction paths. In a rather different investigation we have also studied reactions in which single electron transfer quenching of an excited state is followed by subsequent reactions which lead reversibly to metastable two electron products which, often stable in themselves, can nonetheless be reacted with each other or with other reagents to regenerate the starting materials with release of energy. 66 refs., 9 figs., 1 tab.

  9. Effects of homogenization treatment on recrystallization behavior of 7150 aluminum sheet during post-rolling annealing

    SciTech Connect

    Guo, Zhanying; Zhao, Gang; Chen, X.-Grant

    2016-04-15

    The effects of two homogenization treatments applied to the direct chill (DC) cast billet on the recrystallization behavior in 7150 aluminum alloy during post-rolling annealing have been investigated using the electron backscatter diffraction (EBSD) technique. Following hot and cold rolling to the sheet, measured orientation maps, the recrystallization fraction and grain size, the misorientation angle and the subgrain size were used to characterize the recovery and recrystallization processes at different annealing temperatures. The results were compared between the conventional one-step homogenization and the new two-step homogenization, with the first step being pretreated at 250 °C. Al3Zr dispersoids with higher densities and smaller sizes were obtained after the two-step homogenization, which strongly retarded subgrain/grain boundary mobility and inhibited recrystallization. Compared with the conventional one-step homogenized samples, a significantly lower recrystallized fraction and a smaller recrystallized grain size were obtained under all annealing conditions after cold rolling in the two-step homogenized samples. - Highlights: • Effects of two homogenization treatments on recrystallization in 7150 Al sheets • Quantitative study on the recrystallization evolution during post-rolling annealing • Al3Zr dispersoids with higher densities and smaller sizes after two-step treatment • Higher recrystallization resistance of 7150 sheets with two-step homogenization.

  10. Effect of high-pressure homogenization on different matrices of food supplements.

    PubMed

    Martínez-Sánchez, Ascensión; Tarazona-Díaz, Martha Patricia; García-González, Antonio; Gómez, Perla A; Aguayo, Encarna

    2016-12-01

    There is a growing demand for food supplements containing high amounts of vitamins, phenolic compounds and minerals that provide health benefits. These functional compounds have different solubility properties, and maintaining their content and guaranteeing their homogeneity require the application of novel technologies. The quality of different drinkable functional foods after thermal processing (0.1 MPa) or high-pressure homogenization under two different conditions (80 MPa, 33 ℃ and 120 MPa, 43 ℃) was studied. Physicochemical characteristics and sensory qualities were evaluated throughout six months of accelerated storage at 40 ℃ and 75% relative humidity (RH). Aroma and color were better maintained in high-pressure homogenization-treated samples than in the thermally treated ones, which contributed significantly to extending their shelf life. The small particle size obtained after high-pressure homogenization treatments caused differences in turbidity and viscosity with respect to heat-treated samples. The use of high-pressure homogenization, more specifically 120 MPa, provided active ingredient homogeneity to ensure uniform content in functional food supplements. Although the effect of high-pressure homogenization can be affected by the food matrix, high-pressure homogenization can be implemented as an alternative to conventional heat treatments in a commercial setting within the functional food supplement or pharmaceutical industry.

  11. Numerical analysis of homogeneous and inhomogeneous intermittent search strategies

    NASA Astrophysics Data System (ADS)

    Schwarz, Karsten; Schröder, Yannick; Rieger, Heiko

    2016-10-01

    Random search processes for targets that are inhomogeneously distributed in a search domain require spatially inhomogeneous search strategies to find the target as fast as possible. Here, we compare systematically the efficiency of homogeneous and inhomogeneous strategies for intermittent search, which alternates stochastically between slow, diffusive motion in which the target can be detected and fast ballistic motion during which targets cannot be detected. We analyze the mean first-passage time of homogeneous and inhomogeneous strategies for three paradigmatic search problems: (1) the narrow escape problem, i.e., the searcher looks for a small area on the boundary of the search domain, (2) reaction kinetics, i.e., the detection of an immobile target in the interior of a search domain, and (3) the reaction-escape problem, i.e., the searcher first needs to find a mobile target before it can escape through a narrow area on the boundary. Using families of inhomogeneous strategies, partially motivated by the organization of the cytoskeleton in cells with a centrosome, we show that they are almost always more efficient than homogeneous strategies.
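
    A homogeneous intermittent strategy is fixed by two switching rates. The following 1-D Monte Carlo sketch estimates the mean first-passage time for such a strategy; all parameter values are invented for illustration and the geometry is a periodic interval rather than the domains studied in the paper:

      import numpy as np

      rng = np.random.default_rng(2)

      L, TARGET, HALF = 20.0, 10.0, 0.5   # periodic domain, target centre/half-width
      D, V = 1.0, 5.0                     # diffusivity, ballistic speed
      K1, K2 = 0.2, 1.0                   # rates: diffusive->ballistic, back
      DT = 0.01

      def first_passage_time(max_time=1e4):
          x, ballistic, vel, t = 0.0, False, 0.0, 0.0
          while t < max_time:
              if not ballistic:
                  if abs(x - TARGET) < HALF:      # detection only while slow
                      return t
                  x += np.sqrt(2.0 * D * DT) * rng.standard_normal()
                  if rng.random() < K1 * DT:
                      ballistic, vel = True, V * rng.choice([-1.0, 1.0])
              else:
                  x += vel * DT                   # fast but blind
                  if rng.random() < K2 * DT:
                      ballistic = False
              x %= L
              t += DT
          return np.nan

      times = [first_passage_time() for _ in range(100)]
      print("estimated MFPT:", np.nanmean(times))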

  12. Homogeneous UVA system for corneal cross-linking treatment

    NASA Astrophysics Data System (ADS)

    Ayres Pereira, Fernando R.; Stefani, Mario A.; Otoboni, José A.; Richter, Eduardo H.; Ventura, Liliane

    2010-02-01

    The treatment of keratoconus and corneal ulcers by collagen cross-linking, using ultraviolet type A irradiation combined with the photosensitizer Riboflavin (vitamin B2), is a promising technique. The standard protocol suggests instilling Riboflavin on the pre-scratched cornea every 5 min for 30 min, during UVA irradiation of the cornea at 3 mW/cm2 for 30 min. This process leads to an increase of the biomechanical strength of the cornea, stopping the progression, or sometimes even reversing, keratoconus. Collagen cross-linking can be achieved by many methods, but the utilization of UVA light for this purpose is ideal because of its possibility of a homogeneous treatment, leading to an equal result along the treated area. We have developed a system, to be clinically used for treatment of unhealthy corneas using the cross-linking technique, which consists of a UVA-emitting delivery device controlled by a closed-loop system with high homogeneity. The system is tunable and delivers 3-5 mW/cm2, at 365 nm, for three spots (6 mm, 8 mm and 10 mm in diameter). The electronic closed loop has a precision of 1%, leading to an overall error, after calibration, of less than 10% and approximately 96% homogeneity.
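
    As a consistency check on the quoted protocol, the delivered surface dose (fluence) is irradiance times exposure time:

      \[
        E \;=\; I\,t \;=\; 3\ \mathrm{mW/cm^{2}} \times 1800\ \mathrm{s}
          \;=\; 5.4\ \mathrm{J/cm^{2}},
      \]

    which is the standard corneal cross-linking dose; the device's tunable 3-5 mW/cm2 range spans 5.4-9 J/cm2 over the same 30 min.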

  13. Tissue homogeneity requires inhibition of unequal gene silencing during development

    PubMed Central

    Le, Hai H.; Looney, Monika; Strauss, Benjamin; Bloodgood, Michael

    2016-01-01

    Multicellular organisms can generate and maintain homogeneous populations of cells that make up individual tissues. However, cellular processes that can disrupt homogeneity and how organisms overcome such disruption are unknown. We found that ∼100-fold differences in expression from a repetitive DNA transgene can occur between intestinal cells in Caenorhabditis elegans. These differences are caused by gene silencing in some cells and are actively suppressed by parental and zygotic factors such as the conserved exonuclease ERI-1. If unsuppressed, silencing can spread between some cells in embryos but can be repeat specific and independent of other homologous loci within each cell. Silencing can persist through DNA replication and nuclear divisions, disrupting uniform gene expression in developed animals. Analysis at single-cell resolution suggests that differences between cells arise during early cell divisions upon unequal segregation of an initiator of silencing. Our results suggest that organisms with high repetitive DNA content, which include humans, could use similar developmental mechanisms to achieve and maintain tissue homogeneity. PMID:27458132

  14. Homogenization of tissues via picosecond-infrared laser (PIRL) ablation: Giving a closer view on the in-vivo composition of protein species as compared to mechanical homogenization

    PubMed Central

    Kwiatkowski, M.; Wurlitzer, M.; Krutilin, A.; Kiani, P.; Nimer, R.; Omidi, M.; Mannaa, A.; Bussmann, T.; Bartkowiak, K.; Kruber, S.; Uschold, S.; Steffen, P.; Lübberstedt, J.; Küpker, N.; Petersen, H.; Knecht, R.; Hansen, N.O.; Zarrine-Afsar, A.; Robertson, W.D.; Miller, R.J.D.; Schlüter, H.

    2016-01-01

    Posttranslational modifications and proteolytic processing regulate almost all physiological processes. Dysregulation can potentially result in pathologic protein species causing diseases. Thus, tissue species proteomes of diseased individuals provide diagnostic information. Since the composition of tissue proteomes can rapidly change during tissue homogenization by the action of enzymes released from their compartments, disease specific protein species patterns can vanish. Recently, we described a novel, ultrafast and soft method for cold vaporization of tissue via desorption by impulsive vibrational excitation (DIVE) using a picosecond-infrared-laser (PIRL). Given that DIVE extraction may provide improved access to the original composition of protein species in tissues, we compared the proteome composition of tissue protein homogenates after DIVE homogenization with conventional homogenizations. A higher number of intact protein species was observed in DIVE homogenates. Due to the ultrafast transfer of proteins from tissues via gas phase into frozen condensates of the aerosols, intact protein species were exposed to a lesser extent to enzymatic degradation reactions compared with conventional protein extraction. In addition, the total yield of the number of proteins is higher in DIVE homogenates, because they are very homogeneous and contain almost no insoluble particles, allowing direct analysis with subsequent analytical methods without the necessity of centrifugation. Biological significance: Enzymatic protein modifications during tissue homogenization are responsible for changes of the in-vivo protein species composition. Cold vaporization of tissues by PIRL-DIVE is comparable with taking a snapshot at the time of the laser irradiation of the dynamic changes that occur continuously under in-vivo conditions. At that time point all biomolecules are transferred into an aerosol, which is immediately frozen. PMID:26778141

  15. Applications of High and Ultra High Pressure Homogenization for Food Safety

    PubMed Central

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low-temperature long-time and high-temperature short-time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide “fresh-like” products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350–400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. This review therefore deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species in relation to the food matrix and food chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered. PMID:27536270

  16. Gender training: creating change.

    PubMed

    Craun-selka, P

    1997-01-01

    Over the last 20 years, the Centre for Development and Population Activities (CEDPA) has developed a training program concerning gender policies and practices; it includes a curriculum, "Gender and Development," and a handbook, "Gender Equity: Concepts and Tools for Development." Gender training focuses on increasing individual awareness of gender issues and incorporating gender practices in programs. CEDPA has expanded its programs to include projects promoting increased decision-making power for women regarding their own lives. Family planning and reproductive health projects now include programs designed to increase "women's literacy, credit and income-generation opportunities, and participation in civil society and the political process." Projects address reproductive and human rights, land distribution, economic expansion, credit and savings, and violence against women. Youth programs focus on the changing nature of gender roles, the equal rights of women and girls, and the shared responsibility and mutual respect of the sexes. In the Better Life Options projects, youth of both sexes attend family life and sex education programs. The curriculum "Choose a Future" provides life skills training for young women; a version for young men will be provided in the future. Including men (community health workers and supervisors, educators, trainers, leaders, fathers, and husbands) in the CEDPA programs is essential for the empowerment of women.

  17. Homogeneous freezing nucleation of stratospheric solution droplets

    NASA Technical Reports Server (NTRS)

    Jensen, Eric J.; Toon, Owen B.; Hamill, Patrick

    1991-01-01

    The classical theory of homogeneous nucleation was used to calculate the freezing rate of sulfuric acid solution aerosols under stratospheric conditions. The freezing of stratospheric aerosols would be important for the nucleation of nitric acid trihydrate particles in the Arctic and Antarctic stratospheres. In addition, the rate of heterogeneous chemical reactions on stratospheric aerosols may be very sensitive to their state. The calculations indicate that homogeneous freezing nucleation of pure water ice in the stratospheric solution droplets would occur at temperatures below about 192 K. However, the physical properties of H2SO4 solution at such low temperatures are not well known, and it is possible that sulfuric acid aerosols will freeze out at temperatures ranging from about 180 to 195 K. It is also shown that the temperature at which the aerosols freeze is nearly independent of their size.
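
    The classical theory referred to here predicts a nucleation rate of the standard form (generic classical-nucleation-theory expressions, with symbols as usually defined):

      \[
        J \;=\; J_0 \exp\!\left(-\frac{\Delta G^{\ast}}{k_B T}\right),
        \qquad
        \Delta G^{\ast} \;=\; \frac{16\pi\,\sigma^{3}}{3\,\Delta G_v^{2}},
      \]

    where σ is the ice-solution interfacial energy and ΔG_v the volumetric free-energy difference between the supercooled liquid and ice. Because J depends exponentially on σ³/ΔG_v², the predicted freezing temperature is sharp (≈192 K above), while droplet size enters only linearly through the volume multiplying J, which is why the freezing temperature is nearly independent of size.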

  18. Homogeneous crystal nucleation in binary metallic melts

    NASA Technical Reports Server (NTRS)

    Thompson, C. V.; Spaepen, F.

    1983-01-01

    A method for calculating the homogeneous crystal nucleation frequency in binary metallic melts is developed. The free energy of crystallization is derived from regular solution models for the liquid and solid and is used, together with model-based estimates of the interfacial tension, to calculate the nucleation frequency from the classical theory. The method can account for the composition dependence of the maximum undercooling observed in a number of experiments on small droplet dispersions. It can also be used to calculate the driving force for crystal growth and to obtain more precise estimates of the homogeneous crystal nucleation frequency in glass-forming alloys. This method, although approximate, is simple to apply, and requires only knowledge of the phase diagram and a few readily available thermodynamic quantities as input data.

  19. Detonation in shocked homogeneous high explosives

    SciTech Connect

    Yoo, C.S.; Holmes, N.C.; Souers, P.C.

    1995-11-01

    We have studied shock-induced changes in homogeneous high explosives including nitromethane, tetranitromethane, and single crystals of pentaerythritol tetranitrate (PETN) by using fast time-resolved emission and Raman spectroscopy at a two-stage light-gas gun. The results reveal three distinct steps during which the homogeneous explosives chemically evolve to final detonation products. These are (1) the initiation of shock compressed high explosives after an induction period, (2) thermal explosion of shock-compressed and/or reacting materials, and (3) a decay to a steady-state representing a transition to the detonation of uncompressed high explosives. Based on a gray-body approximation, we have obtained the CJ temperatures: 3800 K for nitromethane, 2950 K for tetranitromethane, and 4100 K for PETN. We compare the data with various thermochemical equilibrium calculations. In this paper we will also show a preliminary result of single-shot time-resolved Raman spectroscopy applied to shock-compressed nitromethane.

  20. Beyond relationships between homogeneous and heterogeneous catalysis

    SciTech Connect

    Dixon, David A.; Katz, Alexander; Arslan, Ilke; Gates, Bruce C.

    2014-08-13

    Scientists who regard catalysis as a coherent field have been striving for decades to articulate the fundamental unifying principles. But because these principles seem to be broader than chemistry, chemical engineering, and materials science combined, catalytic scientists commonly interact within the sub-domains of homogeneous, heterogeneous, and bio-catalysis, and increasingly within even narrower domains such as organocatalysis, phase-transfer catalysis, acid-base catalysis, zeolite catalysis, etc. Attempts to unify catalysis have motivated researchers to find relationships between homogeneous and heterogeneous catalysis and to mimic enzymes. These themes have inspired vibrant international meetings and workshops, and we have benefited from the idea exchanges and have some thoughts about a path forward.

  1. CUDA Simulation of Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John V.; Shum, Victor; Fu, Terry

    2011-01-01

    We discuss very fast Compute Unified Device Architecture (CUDA) simulations of ideal homogeneous incompressible turbulence based on Fourier models. These models have associated statistical theories that predict that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. Prior numerical simulations have shown that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We review the theoretical basis of this "broken ergodicity" as applied to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence. Our new simulations examine the phenomenon of broken ergodicity through very long time and large grid size runs performed on a state-of-the-art CUDA platform. Results comparing various CUDA hardware configurations and grid sizes are discussed. NS and MHD results are compared.
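
    The "broken ergodicity" diagnostic amounts to comparing the time mean of a Fourier coefficient with its standard deviation. The two series below are synthetic, not simulation output; they only illustrate the ratio that distinguishes an ergodic coefficient from a coherent one:

      import numpy as np

      rng = np.random.default_rng(5)

      # Synthetic time series of one complex Fourier coefficient a_k(t):
      # identical fluctuations, with and without a coherent mean.
      n_t = 20000
      ergodic = rng.normal(0, 1, n_t) + 1j * rng.normal(0, 1, n_t)
      broken = ergodic + (8.0 + 3.0j)

      for name, a in [("ergodic", ergodic), ("broken", broken)]:
          print(name, abs(a.mean()) / a.std())  # >> 0 signals coherent structure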

  2. Program Logics for Homogeneous Meta-programming

    NASA Astrophysics Data System (ADS)

    Berger, Martin; Tratt, Laurence

    A meta-program is a program that generates or manipulates another program; in homogeneous meta-programming, a program may generate new parts of, or manipulate, itself. Meta-programming has been used extensively since macros were introduced to Lisp, yet we have little idea how formally to reason about meta-programs. This paper provides the first program logics for homogeneous meta-programming - using a variant of MiniML_e^□ by Davies and Pfenning as the underlying meta-programming language. We show the applicability of our approach by reasoning about example meta-programs from the literature. We also demonstrate that our logics are relatively complete in the sense of Cook, enable the inductive derivation of characteristic formulae, and exactly capture the observational properties induced by the operational semantics.

  3. Kinematical uniqueness of homogeneous isotropic LQC

    NASA Astrophysics Data System (ADS)

    Engle, Jonathan; Hanusch, Maximilian

    2017-01-01

    In a paper by Ashtekar and Campiglia, invariance under volume preserving residual diffeomorphisms has been used to single out the standard representation of the reduced holonomy-flux algebra in homogeneous loop quantum cosmology (LQC). In this paper, we use invariance under all residual diffeomorphisms to single out the standard kinematical Hilbert space of homogeneous isotropic LQC for both the standard configuration space ℝ_Bohr (the Bohr compactification of the real line), as well as for the Fleischhack one ℝ ⊔ ℝ_Bohr. We first determine the scale invariant Radon measures on these spaces, and then show that the Haar measure on ℝ_Bohr is the only such measure for which the momentum operator is hermitian with respect to the corresponding inner product. In particular, the measure is forced to be identically zero on ℝ in the Fleischhack case, so that for both approaches the standard kinematical LQC Hilbert space is singled out.

  4. Homogeneous freezing nucleation of stratospheric solution droplets

    NASA Astrophysics Data System (ADS)

    Jensen, Eric J.; Toon, Owen B.; Hamill, Patrick

    1991-10-01

    The classical theory of homogeneous nucleation was used to calculate the freezing rate of sulfuric acid solution aerosols under stratospheric conditions. The freezing of stratospheric aerosols would be important for the nucleation of nitric acid trihydrate particles in the Arctic and Antarctic stratospheres. In addition, the rate of heterogeneous chemical reactions on stratospheric aerosols may be very sensitive to their state. The calculations indicate that homogeneous freezing nucleation of pure water ice in the stratospheric solution droplets would occur at temperatures below about 192 K. However, the physical properties of H2SO4 solution at such low temperatures are not well known, and it is possible that sulfuric acid aerosols will freeze out at temperatures ranging from about 180 to 195 K. It is also shown that the temperature at which the aerosols freeze is nearly independent of their size.

  5. Coherent Eigenmodes in Homogeneous MHD Turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    2010-01-01

    The statistical mechanics of Fourier models of ideal, homogeneous, incompressible magnetohydrodynamic (MHD) turbulence is discussed, along with their relevance for dissipative magnetofluids. Although statistical theory predicts that Fourier coefficients of fluid velocity and magnetic field are zero-mean random variables, numerical simulations clearly show that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation, i.e., we have coherent structure. We use eigenanalysis of the modal covariance matrices in the probability density function to explain this phenomenon in terms of 'broken ergodicity', which is defined to occur when dynamical behavior does not match ensemble predictions on very long time-scales. We provide examples from 2-D and 3-D magnetohydrodynamic simulations of homogeneous turbulence, and show new results from long-time simulations of MHD turbulence with and without a mean magnetic field.

  6. Genetic homogeneity of Fascioloides magna in Austria.

    PubMed

    Husch, Christian; Sattmann, Helmut; Hörweg, Christoph; Ursprung, Josef; Walochnik, Julia

    2017-08-30

    The large American liver fluke, Fascioloides magna, is an economically relevant parasite of both domestic and wild ungulates. F. magna has been introduced into Europe repeatedly, the first time as early as the 19th century. In Austria, a stable population of F. magna has established itself in the Danube floodplain forests southeast of Vienna. The aim of this study was to determine the genetic diversity of F. magna in Austria. A total of 26 individuals from various regions within the known area of distribution were investigated for their cytochrome oxidase subunit 1 (cox1) and nicotinamide dehydrogenase subunit 1 (nad1) gene haplotypes. Interestingly, all 26 individuals revealed one and the same haplotype, namely concatenated haplotype Ha5. This indicates a homogeneous population of F. magna in Austria and may argue for a single introduction. Alternatively, genetic homogeneity might also be explained by a bottleneck effect and/or genetic drift. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Homogeneous Superpixels from Markov Random Walks

    NASA Astrophysics Data System (ADS)

    Perbet, Frank; Stenger, Björn; Maki, Atsuto

    This paper presents a novel algorithm to generate homogeneous superpixels from Markov random walks. We exploit Markov clustering (MCL), a generic graph clustering method based on stochastic flow circulation, as the underlying methodology. In particular, we introduce a graph pruning strategy called compact pruning in order to capture intrinsic local image structure. The resulting superpixels are homogeneous, i.e. uniform in size and compact in shape. The original MCL algorithm does not scale well to the graph of an image because of the squaring of the Markov matrix that is necessary for circulating the flow. The proposed pruning scheme has the advantages of faster computation, smaller memory footprint, and straightforward parallel implementation. Through comparisons with other recent techniques, we show that the proposed algorithm achieves state-of-the-art performance.
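
    A minimal sketch of the expansion-inflation-pruning loop at the heart of MCL, with a generic top-k pruning step standing in for the paper's compact pruning (function name and parameters are illustrative, not from the paper):

```python
import numpy as np

def mcl(adjacency, expansion=2, inflation=2.0, top_k=8, n_iter=50):
    """Toy Markov clustering on a dense matrix. adjacency is a
    nonnegative pixel-similarity matrix (e.g., a 4-connected image
    grid weighted by colour similarity). Real implementations use
    sparse matrices; this is for illustration only."""
    M = adjacency + np.eye(adjacency.shape[0])   # add self-loops
    M = M / M.sum(axis=0, keepdims=True)         # make column-stochastic
    for _ in range(n_iter):
        M = np.linalg.matrix_power(M, expansion)  # expansion: circulate flow
        M = M ** inflation                        # inflation: sharpen flow
        # prune: keep only the top_k largest entries in each column
        small = np.argsort(M, axis=0)[:-top_k, :]
        np.put_along_axis(M, small, 0.0, axis=0)
        M = M / M.sum(axis=0, keepdims=True)      # renormalize columns
    return M  # nonzero rows indicate cluster attractors

```

    The pruning step is what keeps memory and compute bounded on image-sized graphs; the paper's compact pruning additionally exploits local image structure.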

  8. Homogeneous isolation of nanocellulose from sugarcane bagasse by high pressure homogenization.

    PubMed

    Li, Jihua; Wei, Xiaoyi; Wang, Qinghuang; Chen, Jiacui; Chang, Gang; Kong, Lingxue; Su, Junbo; Liu, Yuhuan

    2012-11-06

    Nanocellulose from sugarcane bagasse was isolated by high pressure homogenization in a homogeneous medium. Pretreatment with an ionic liquid (1-butyl-3-methylimidazolium chloride, [Bmim]Cl) was first used to dissolve the bagasse cellulose. Subsequently, the homogeneous solution was passed through a high pressure homogenizer without any clogging. The nanocellulose was obtained at 80 MPa for 30 cycles, with a recovery of 90% under the optimum refining condition. The nanocellulose was characterized by Fourier transform infrared spectroscopy, X-ray diffraction, thermogravimetric analysis, rheological measurements and transmission electron microscopy. The results showed that the nanocellulose was 10-20 nm in diameter and presented lower thermal stability and crystallinity than the original cellulose. The developed nanocellulose would be a very versatile renewable material.

  9. Metastable states in homogeneous Ising models

    SciTech Connect

    Achilles, M.; Bendisch, J.; von Trotha, H.

    1987-04-01

    Metastable states of homogeneous 2D and 3D Ising models are studied under free boundary conditions. The states are defined in terms of weak and strict local minima of the total interaction energy. The morphology of these minima is characterized locally and globally on square and cubic grids. Furthermore, in the 2D case, transition from any spin configuration that is not a strict minimum to a strict minimum is possible via non-energy-increasing single flips.
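
    As a worked illustration of the definitions above, a minimal sketch that classifies a 2D spin configuration as a weak or strict local minimum under single spin flips, assuming a ferromagnetic nearest-neighbour energy with free boundaries (all names are illustrative):

```python
import numpy as np

def single_flip_energy_changes(spins, J=1.0):
    """Energy change dE for flipping each spin of a 2D Ising
    configuration (entries +/-1) with free boundary conditions.
    E = -J * sum over nearest-neighbour pairs s_i * s_j, so
    flipping spin s changes E by dE = 2*J*s*(sum of neighbours)."""
    nbr = np.zeros_like(spins, dtype=float)
    nbr[1:, :] += spins[:-1, :]   # neighbour above
    nbr[:-1, :] += spins[1:, :]   # neighbour below
    nbr[:, 1:] += spins[:, :-1]   # neighbour to the left
    nbr[:, :-1] += spins[:, 1:]   # neighbour to the right
    return 2.0 * J * spins * nbr

def classify_minimum(spins, J=1.0):
    dE = single_flip_energy_changes(spins, J)
    if np.all(dE > 0):
        return "strict local minimum"   # every flip raises the energy
    if np.all(dE >= 0):
        return "weak local minimum"     # no flip lowers the energy
    return "not a local minimum"
```

    The 2D result quoted above then says that from any configuration that is not strict, a strict minimum is reachable by a chain of flips with dE <= 0.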

  10. Castings, Steel, Homogenization of Steel Castings

    DTIC Science & Technology

    1942-12-05

    diffraction pattern of quenched and tempered steel castings. 2. Calculations based upon known diffusion rates show: A. Practical homogenizing heat ...will be largely eliminated by either the usual heating for quenching or a homogenizing treatment. C. Interdendritic segregation of sulfur will... Appendix A - History of the Heat Treatment and Composition of Centrifugal Gun Castings at Watertown Arsenal.

  11. Recent advances in homogeneous nickel catalysis

    NASA Astrophysics Data System (ADS)

    Tasker, Sarah Z.; Standley, Eric A.; Jamison, Timothy F.

    2014-05-01

    Tremendous advances have been made in nickel catalysis over the past decade. Several key properties of nickel, such as facile oxidative addition and ready access to multiple oxidation states, have allowed the development of a broad range of innovative reactions. In recent years, these properties have been increasingly understood and used to perform transformations long considered exceptionally challenging. Here we discuss some of the most recent and significant developments in homogeneous nickel catalysis, with an emphasis on both synthetic outcome and mechanism.

  12. Homogeneous Biosensing Based on Magnetic Particle Labels

    PubMed Central

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J.; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease of use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation. PMID:27275824

  13. Equilibrium states of homogeneous sheared compressible turbulence

    NASA Astrophysics Data System (ADS)

    Riahi, M.; Lili, T.

    2011-06-01

    Equilibrium states of homogeneous compressible turbulence subjected to rapid shear are studied using rapid distortion theory (RDT). The purpose of this study is to determine numerical solutions of the unsteady linearized equations governing the evolution of double correlation spectra. In this work, an RDT code developed by the authors solves these equations for compressible homogeneous shear flows. Numerical integration of these equations is carried out using a second-order, simple and accurate scheme. The two Mach numbers relevant to homogeneous shear flow are the turbulent Mach number Mt, given by the root mean square turbulent velocity fluctuations divided by the speed of sound, and the gradient Mach number Mg, which is the mean shear rate times the transverse integral scale of the turbulence divided by the speed of sound. Validation of this code is performed by comparing RDT results with the direct numerical simulations (DNS) of [A. Simone, G.N. Coleman, and C. Cambon, J. Fluid Mech. 330, 307 (1997)] and [S. Sarkar, J. Fluid Mech. 282, 163 (1995)] for various values of the initial gradient Mach number Mg0. It was found that RDT is valid for small values of the non-dimensional time St (St < 3.5). It is important to note that RDT is also valid for large values of St (St > 10), in particular for large values of Mg0. This essential feature justifies the resort to RDT in order to determine equilibrium states in the compressible regime.
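
    In symbols, the two Mach numbers described above are (standard definitions; ℓ denotes the transverse integral length scale, S the mean shear rate, and c the speed of sound):

```latex
M_t = \frac{\sqrt{\langle u'_i u'_i \rangle}}{c},
\qquad
M_g = \frac{S\,\ell}{c}
```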

  14. TESTING HOMOGENEITY WITH GALAXY STAR FORMATION HISTORIES

    SciTech Connect

    Hoyle, Ben; Jimenez, Raul; Tojeiro, Rita; Maartens, Roy; Heavens, Alan; Clarkson, Chris

    2013-01-01

    Observationally confirming spatial homogeneity on sufficiently large cosmological scales is of importance to test one of the underpinning assumptions of cosmology, and is also imperative for correctly interpreting dark energy. A challenging aspect of this is that homogeneity must be probed inside our past light cone, while observations take place on the light cone. The star formation history (SFH) in the galaxy fossil record provides a novel way to do this. We calculate the SFH of stacked luminous red galaxy (LRG) spectra obtained from the Sloan Digital Sky Survey. We divide the LRG sample into 12 equal-area contiguous sky patches and 10 redshift slices (0.2 < z < 0.5), which correspond to 120 blocks of volume ≈0.04 Gpc³. Using the SFH in a time period that samples the history of the universe between look-back times 11.5 and 13.4 Gyr as a proxy for homogeneity, we calculate the posterior distribution for the excess large-scale variance due to inhomogeneity, and find that the most likely solution is no extra variance at all. At 95% credibility, there is no evidence of deviations larger than 5.8%.

  15. Tits Satake projections of homogeneous special geometries

    NASA Astrophysics Data System (ADS)

    Fré, Pietro; Gargiulo, Floriana; Rosseel, Jan; Rulik, Ksenya; Trigiante, Mario; Van Proeyen, Antoine

    2007-01-01

    We organize the homogeneous special geometries, which also describe the couplings of D = 6, 5, 4 and 3 supergravities with eight supercharges, into a small number of universality classes. This relates manifolds on which similar types of dynamical solutions can exist. The mathematical ingredient is the Tits Satake projection of real simple Lie algebras, which we extend to all solvable Lie algebras occurring in these homogeneous special geometries. Apart from some exotic cases, all the other 'very special' homogeneous manifolds can be grouped into seven universality classes. The organization of these classes, which capture the essential features of their basic dynamics, commutes with the r- and c-map. Different members are distinguished by different choices of the paint group, a notion discovered in the context of cosmic billiard dynamics of non-maximally supersymmetric supergravities. We comment on the usefulness of this organization in universality classes both in relation with cosmic billiard dynamics and with configurations of branes and orbifolds defining special geometry backgrounds.

  16. A criterion for assessing homogeneity distribution in hyperspectral images. Part 2: application of homogeneity indices to solid pharmaceutical dosage forms.

    PubMed

    Rosas, Juan G; Blanco, Marcelo

    2012-11-01

    This article is the second in a series of two detailing the application of a mixing index to assess homogeneity distribution in oral pharmaceutical solid dosage forms by image analysis. Chemical imaging (CI) is an emerging technique integrating conventional imaging and spectroscopic techniques with a view to obtaining spatial and spectral information from a sample. Near infrared chemical imaging (NIR-CI) has proved an excellent analytical tool for extracting high-quality information from sample surfaces. The primary objective of this second part was to demonstrate that the approach developed in the first part could be successfully applied to near infrared hyperspectral images of oral pharmaceutical solid dosage forms such as coated, uncoated and effervescent tablets, as well as to powder blends. To this end, we assessed a new criterion for establishing mixing homogeneity by using four different methods based on either a three-dimensional (M×N×λ) data array of hyperspectral images (spectral standard deviations and correlation coefficients) or a two-dimensional (M×N) data array (concentration maps and binary images). The four methods applied macropixel analysis to the Poole (M_P) and homogeneity (H%_Poole) indices. Both indices proved useful for assessing the degree of homogeneity of pharmaceutical samples. The results testify that the proposed approach can be effectively used in the pharmaceutical industry, in finished products (e.g., tablets) and in mixing unit operations, for example as a process analytical technology tool for blend monitoring (see Part 1). Copyright © 2012 Elsevier B.V. All rights reserved.
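
    To make the macropixel idea concrete, here is a toy computation on a 2-D concentration map, assuming a simple relative-standard-deviation score rather than the exact M_P or H%_Poole definitions from the paper (names and the block size are illustrative):

```python
import numpy as np

def macropixel_homogeneity(conc_map, block=8):
    """Toy macropixel analysis of an M x N concentration map:
    partition into block x block macropixels, take each macropixel's
    mean concentration, and score homogeneity from the relative
    standard deviation of those means. A simplified stand-in for
    the paper's indices, for illustration only."""
    m, n = conc_map.shape
    mh, nw = m // block, n // block
    blocks = conc_map[:mh * block, :nw * block].reshape(mh, block, nw, block)
    means = blocks.mean(axis=(1, 3))     # mean concentration per macropixel
    rsd = means.std() / means.mean()     # relative SD across macropixels
    return 100.0 * (1.0 - rsd)           # ad hoc 'homogeneity %' score
```

    A perfectly uniform map scores 100%; increasing between-macropixel variation pulls the score down, which is the qualitative behaviour a homogeneity index must capture.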

  17. Effect of homogenization and ultrasonication on the physical properties of insoluble wheat bran fibres

    NASA Astrophysics Data System (ADS)

    Hu, Ran; Zhang, Min; Adhikari, Benu; Liu, Yaping

    2015-10-01

    Wheat bran is rich in dietary fibre and its annual output is abundant but underutilized. Insoluble dietary fibre often influences food quality negatively; therefore, improving the physical and chemical properties of insoluble wheat bran fibre for further processing is a challenge. Insoluble dietary fibre was obtained from wheat bran and micronized using high-pressure homogenization, high-intensity sonication, and a combination of these two methods. The high-pressure homogenization and high-pressure homogenization+high-intensity sonication treatments significantly (p<0.05) improved the solubility, swelling, water-holding, oil-holding, and cation exchange capacities. The improvement of the above properties by high-intensity sonication alone was marginal. In most cases, the high-pressure homogenization process was as good as the high-pressure homogenization+high-intensity sonication process in improving the above-mentioned properties; hence, the contribution of high-intensity sonication in the combined process was minimal. The best results show that the minimum particle size of wheat bran can reach 9 μm, with significant changes in solubility, swelling, water-holding, oil-holding, and cation exchange capacities.

  18. A Common Genetic Variant in the 3′-UTR of Vacuolar H+-ATPase ATP6V0A1 Creates a Micro-RNA Motif to Alter Chromogranin A (CHGA) Processing and Hypertension Risk

    PubMed Central

    Wei, Zhiyun; Biswas, Nilima; Wang, Lei; Courel, Maite; Zhang, Kuixing; Soler-Jover, Alex; Taupenot, Laurent; O’Connor, Daniel T.

    2012-01-01

    Background The catecholamine release-inhibitor catestatin and its precursor chromogranin A (CHGA) may constitute “intermediate phenotypes” in the analysis of genetic risk for cardiovascular disease such as hypertension. Previously, the vacuolar H+-ATPase subunit gene ATP6V0A1 was found within the confidence interval for linkage with catestatin secretion in a genome-wide study, and its 3′-UTR polymorphism T+3246C (rs938671) was associated with both catestatin processing from CHGA and population blood pressure (BP). Here we explored the molecular mechanism of this effect by experiments with transfected chimeric photoproteins in chromaffin cells. Methods and Results Placing the ATP6V0A1 3′-UTR downstream of a luciferase reporter, we found that the C (variant) allele decreased overall gene expression. The 3′-UTR effect was verified by coupled in vitro transcription/translation of the entire/intact human ATP6V0A1 mRNA. Chromaffin granule pH, monitored by fluorescence of a CHGA/EGFP chimera during vesicular H+-ATPase inhibition by bafilomycin A1, was more easily perturbed during co-expression of the ATP6V0A1 3′-UTR C-allele than the T-allele. After bafilomycin A1 treatment, the ratio of CHGA precursor to its catestatin fragments in PC12 cells was substantially diminished, though the qualitative composition of such fragments was not affected (on immunoblot or MALDI mass spectrometry). Bafilomycin A1 treatment also decreased exocytotic secretion from the regulated pathway, monitored by a CHGA chimera tagged with embryonic alkaline phosphatase (EAP). 3′-UTR T+3246C created a binding motif for micro-RNA hsa-miR-637; co-transfection of hsa-miR-637 precursor or antagomir/inhibitor oligonucleotides yielded the predicted changes in expression of luciferase reporter/ATP6V0A1-3′-UTR plasmids varying at T+3246C. Conclusions The results suggest a series of events whereby ATP6V0A1 3′-UTR variant T+3246C functioned: ATP6V0A1 expression was affected likely through

  19. Effect of homogenization and pasteurization on the structure and stability of whey protein in milk.

    PubMed

    Qi, Phoebe X; Ren, Daxi; Xiao, Yingping; Tomasula, Peggy M

    2015-05-01

    The effect of homogenization alone or in combination with high-temperature, short-time (HTST) pasteurization or UHT processing on the whey fraction of milk was investigated using highly sensitive spectroscopic techniques. In pilot plant trials, 1-L quantities of whole milk were homogenized in a 2-stage homogenizer at 35°C (6.9 MPa/10.3 MPa) and, along with skim milk, were subjected to HTST pasteurization (72°C for 15 s) or UHT processing (135°C for 2 s). Other whole milk samples were processed using homogenization followed by either HTST pasteurization or UHT processing. The processed skim and whole milk samples were centrifuged further to remove fat and then acidified to pH 4.6 to isolate the corresponding whey fractions, and centrifuged again. The whey fractions were then purified using dialysis and investigated using the circular dichroism, Fourier transform infrared, and Trp intrinsic fluorescence spectroscopic techniques. Results demonstrated that homogenization combined with UHT processing of milk caused not only changes in protein composition but also significant secondary structural loss, particularly in the amounts of apparent antiparallel β-sheet and α-helix, as well as diminished tertiary structural contact. In both cases of homogenization alone and followed by HTST treatments, neither caused appreciable chemical changes, nor remarkable secondary structural reduction. But disruption was evident in the tertiary structural environment of the whey proteins due to homogenization of whole milk as shown by both the near-UV circular dichroism and Trp intrinsic fluorescence. In-depth structural stability analyses revealed that even though processing of milk imposed little impairment on the secondary structural stability, the tertiary structural stability of whey protein was altered significantly. The following order was derived based on these studies: raw whole > HTST, homogenized, homogenized and pasteurized > skimmed and pasteurized, and skimmed UHT > homogenized

  20. Homogeneous Charge Compression Ignition Free Piston Linear Alternator

    SciTech Connect

    Janson Wu; Nicholas Paradiso; Peter Van Blarigan; Scott Goldsborough

    1998-11-01

    An experimental and theoretical investigation of a homogeneous charge compression ignition (HCCI) free-piston-powered linear alternator has been conducted to determine whether improvements can be made in the thermal and conversion efficiencies of modern electrical generator systems. The performance of a free piston engine was investigated using a rapid compression expansion machine and a full-cycle thermodynamic model. Linear alternator performance was investigated with a computer model. In addition, linear alternator testing and permanent magnet characterization hardware were developed. The development of the two-stroke cycle scavenging process has begun.

  1. Homogeneous and heterogeneous chemistry along air parcel trajectories

    NASA Technical Reports Server (NTRS)

    Jones, R. L.; Mckenna, D. L.; Poole, L. R.; Solomon, S.

    1990-01-01

    The study of coupled heterogeneous and homogeneous chemistry due to polar stratospheric clouds (PSC's) using Lagrangian parcel trajectories for interpretation of the Airborne Arctic Stratosphere Experiment (AASE) is discussed. This approach represents an attempt to quantitatively model the physical and chemical perturbation to stratospheric composition due to formation of PSC's using the fullest possible representation of the relevant processes. Further, the meteorological fields from the United Kingdom Meteorological Office global model were used to deduce potential vorticity and inferred regions of PSC's as an input to flight planning during AASE.

  2. Te homogeneous precipitation in Ge dislocation loop vicinity

    SciTech Connect

    Perrin Toinin, J.; Portavoce, A.; Texier, M.; Bertoglio, M.; Hoummada, K.

    2016-06-06

    High resolution microscopies were used to study the interactions of Te atoms with Ge dislocation loops after a standard n-type doping process in Ge. Te atoms neither segregate nor precipitate on dislocation loops, but form Te-Ge clusters at the same depth as the dislocation loops, in contradiction with usual dopant behavior and thermodynamic expectations. Atomistic kinetic Monte Carlo simulations show that Te atoms are repelled from dislocation loops by elastic interactions, promoting homogeneous Te-Ge nucleation between dislocation loops. This phenomenon is enhanced by coulombic interactions between activated Te²⁺ or Te¹⁺ ions.

  3. Converting Homogeneous to Heterogeneous in Electrophilic Catalysis using Monodisperse Metal Nanoparticles

    SciTech Connect

    Witham, Cole A.; Huang, Wenyu; Tsung, Chia-Kuang; Kuhn, John N.; Somorjai, Gabor A.; Toste, F. Dean

    2009-10-15

    A continuing goal in catalysis is the transformation of processes from homogeneous to heterogeneous. To this end, nanoparticles represent a new frontier in heterogeneous catalysis, where this conversion is supplemented by the ability to obtain new or divergent reactivity and selectivity. We report a novel method for applying heterogeneous catalysts to known homogeneous catalytic reactions through the design and synthesis of electrophilic platinum nanoparticles. These nanoparticles are selectively oxidized by the hypervalent iodine species PhICl{sub 2}, and catalyze a range of {pi}-bond activation reactions previously only homogeneously catalyzed. Multiple experimental methods are utilized to unambiguously verify the heterogeneity of the catalytic process. The discovery of treatments for nanoparticles that induce the desired homogeneous catalytic activity should lead to the further development of reactions previously inaccessible in heterogeneous catalysis. Furthermore, our size and capping agent study revealed that Pt PAMAM dendrimer-capped nanoparticles demonstrate superior activity and recyclability compared to larger, polymer-capped analogues.

  4. Computationally Probing the Performance of Hybrid, Heterogeneous, and Homogeneous Iridium-Based Catalysts for Water Oxidation

    SciTech Connect

    García-Melchor, Max; Vilella, Laia; López, Núria; Vojvodic, Aleksandra

    2016-04-29

    An attractive strategy to improve the performance of water oxidation catalysts would be to anchor a homogeneous molecular catalyst on a heterogeneous solid surface to create a hybrid catalyst. The idea of this combined system is to take advantage of the individual properties of each of the two catalyst components. We use Density Functional Theory to determine the stability and activity of a model hybrid water oxidation catalyst consisting of a dimeric Ir complex attached to the IrO2(110) surface through two oxygen atoms. We find that the homogeneous catalyst can be bound to its oxide matrix without losing significant activity. Hence, designing hybrid systems that benefit from both the highly tunable activity of homogeneous catalysts and the stability of heterogeneous systems seems feasible.
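
    For context, a common way DFT studies of water oxidation gauge activity is the four-step free-energy descriptor, sketched below with placeholder ΔG values (the numbers are illustrative and are not results from this paper):

```python
# Standard four-step OER (oxygen evolution reaction) descriptor analysis:
# the overpotential is set by the largest free-energy step of the four
# proton-coupled electron transfers, which must sum to 4 * 1.23 eV.
dG = [1.60, 1.45, 1.75, 0.12]          # eV per step -- placeholder values
assert abs(sum(dG) - 4.92) < 0.05      # thermodynamic consistency check
eta = max(dG) - 1.23                   # theoretical overpotential in volts
print(f"limiting step: {dG.index(max(dG)) + 1}, overpotential ~ {eta:.2f} V")
```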

  5. Homogenous charge compression ignition engine having a cylinder including a high compression space

    DOEpatents

    Agama, Jorge R.; Fiveland, Scott B.; Maloney, Ronald P.; Faletti, James J.; Clarke, John M.

    2003-12-30

    The present invention relates generally to the field of homogeneous charge compression engines. In these engines, fuel is injected upstream or directly into the cylinder when the power piston is relatively close to its bottom dead center position. The fuel mixes with air in the cylinder as the power piston advances to create a relatively lean homogeneous mixture that preferably ignites when the power piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage, can result. Thus, the present invention divides the homogeneous charge between a controlled volume higher compression space and a lower compression space to better control the start of ignition.

  6. APHRODITE daily precipitation and temperature dataset: Development, QC, Homogenization and Spatial Correlation

    NASA Astrophysics Data System (ADS)

    Yatagai, Akiyo; Zhao, Tianbao

    2014-05-01

    A daily gridded precipitation dataset for the period 1951-2007 was created by collecting and analyzing rain-gauge observation data across Asia through the activities of the Asian Precipitation - Highly Resolved Observational Data Integration Towards Evaluation (APHRODITE) of water resources project. The data are available at http://www.chikyu.ac.jp/precip/. Utilization of station data is ideal for analyses of climatic trends, especially for those of extreme events. However, there has been an increasing demand for accurate high-resolution gauge-based precipitation analyses. Rain-gauge based products are sometimes used for assessing trends of climate models, or those of river runoff through driving hydrological models, because they are convenient and provide long records. On the other hand, some information is lost during the gridding process. Hence, in-house results of testing the interpolation scheme, quality control and homogenization may give important information for the users. We will present such results as well as our quality control (QC) in the APHRODITE project activities. Before gridding, 14 objective QC steps were applied to the rain-gauge data, mainly including position checking, duplicate data checking, and inhomogeneity and spatiotemporal isolation checks. Details are described in Hamada et al. (2011). For Chinese data, basic QC steps such as duplicate checking and position checking had already been made by the local meteorological agency. Hence we performed homogenization tests and spatial correlation analyses separately. For 756 Chinese daily temperature stations, we applied the Multiple Analysis of Series for Homogenization (MASH) developed by Szentimrey (1999, 2008). The results show that this statistical method performs well in detecting discontinuities in climate series caused by station relocation, instrument changes etc., regardless of the absence of metadata. Through the homogenization, most of the discontinuities existing in the original temperature data can be removed, and the
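
    As a rough illustration of what such a homogeneity test looks for (MASH itself is far more sophisticated, using multiple reference series simultaneously), a toy single-break detector over a station series:

```python
import numpy as np

def detect_break(series, min_seg=5):
    """Toy break-point test: scan candidate break positions and
    maximize the two-sample t-statistic between the segments before
    and after the break. A much-simplified stand-in for MASH."""
    x = np.asarray(series, dtype=float)
    best_t, best_k = 0.0, None
    for k in range(min_seg, len(x) - min_seg):
        a, b = x[:k], x[k:]
        pooled = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        t = abs(a.mean() - b.mean()) / pooled if pooled > 0 else 0.0
        if t > best_t:
            best_t, best_k = t, k
    return best_k, best_t  # candidate break index and its t-statistic
```

    A station relocation or instrument change shows up as a step in the mean, so the t-statistic peaks at the change point even when no metadata record the event.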

  7. Turbulent Diffusion in Non-Homogeneous Environments

    NASA Astrophysics Data System (ADS)

    Diez, M.; Redondo, J. M.; Mahjoub, O. B.; Sekula, E.

    2012-04-01

    Many experimental studies have been devoted to the understanding of non-homogeneous turbulent dynamics. Activity in this area intensified when the basic Kolmogorov self-similar theory was extended to two-dimensional or quasi-2D turbulent flows such as those appearing in the environment, which seem to control mixing [1,2]. The statistical description and the dynamics of these geophysical flows depend strongly on the distribution of long-lived organized (coherent) structures. These flows show a complex topology, but may be subdivided in terms of strongly elliptical domains (high vorticity regions), strongly hyperbolic domains (deformation cells with high energy condensations) and a background turbulent field of moderate elliptic and hyperbolic characteristics. It is of fundamental importance to investigate the different influences of these topologically diverse regions. Relevant geometrical information on the different areas is also given by the maximum fractal dimension, which is related to the energy spectrum of the flow. Using all the available information it is possible to investigate the spatial variability of the horizontal eddy diffusivity K(x,y). This information would be very important when trying to model numerically the behaviour in time of oil spills [3,4]. There is a strong dependence of horizontal eddy diffusivities on the wave Reynolds number as well as on the wind stress, measured as the friction velocity from wind profiles taken at the coastline. Natural sea surface oily slicks of diverse origin (plankton, algae or natural emissions and seeps of oil) form complicated structures on the sea surface due to the effects of both multiscale turbulence and Langmuir circulation. It is then possible to use the topological and scaling analysis to discriminate the different physical sea surface processes. We can relate higher order moments of the Lagrangian velocity to effective diffusivity in spite of the need to calibrate the different regions determining the
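
    One standard route from Lagrangian statistics to an effective horizontal eddy diffusivity, stated here for orientation, is the classical single-particle dispersion relation:

```latex
K_x = \tfrac{1}{2}\,\frac{d}{dt}\,\Big\langle \big(x(t) - \langle x(t)\rangle\big)^{2} \Big\rangle
```

    Evaluating this separately inside elliptic, hyperbolic and background regions is one way the spatial variability K(x,y) discussed above can be quantified.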

  8. Can cognitive science create a cognitive economics?

    PubMed

    Chater, Nick

    2015-02-01

    Cognitive science can intersect with economics in at least three productive ways: by providing richer models of individual behaviour for use in economic analysis; by drawing from economic theory in order to model distributed cognition; and jointly to create more powerful 'rational' models of cognitive processes and social interaction. There is the prospect of moving from behavioural economics to a genuinely cognitive economics.

  9. Making Coalitions Work: Creating a Viable Environment.

    ERIC Educational Resources Information Center

    Killacky, Jim; Hulse-Killacky, Diana

    1997-01-01

    Describes community-based programming (CBP), a cooperative process that allows community and technical colleges to address critical issues through coalitions. Provides strategies for creating effective coalitions, focusing on information that new members should know, essential leadership qualities, group rules and activities, and ways to close…

  10. Does Double Loop Learning Create Reliable Knowledge?

    ERIC Educational Resources Information Center

    Blackman, Deborah; Connelly, James; Henderson, Steven

    2004-01-01

    This paper addresses doubts concerning the reliability of knowledge being created by double loop learning processes. Popper's ontological worlds are used to explore the philosophical basis of the way that individual experiences are turned into organisational knowledge, and such knowledge is used to generate organisational learning. The paper…

  11. Designing and Creating Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    McMeen, George R.

    Designed to encourage the use of a defined methodology and careful planning in creating computer-assisted instructional programs, this paper describes the instructional design process, compares computer-assisted instruction (CAI) and programmed instruction (PI), and discusses pragmatic concerns in computer programming. Topics addressed include:…

  12. Observations of Homogeneous and Inhomogeneous Mixing in Warm Cumulus Clouds

    NASA Astrophysics Data System (ADS)

    Lehmann, K.; Siebert, H.; Shaw, R. A.

    2007-12-01

    The helicopter-borne instrument payload ACTOS was used to study the entrainment/mixing process in shallow warm cumulus clouds. Using ACTOS, high resolution measurements of the three-dimensional wind, temperature and humidity fields were made. In addition, cloud microphysical parameters such as the droplet number concentration and size were measured with a modified Fast-FSSP. The effect of entrained subsaturated air on the droplet number size distribution was analyzed using mixing diagrams which correlate droplet number concentration and droplet size. Both homogeneous and inhomogeneous mixing were observed to take place. The character of the mixing process is compared with the Damköhler number, which is given by the ratio of the timescale for turbulent mixing to the reaction timescale, the latter being either the time for droplet evaporation or the phase relaxation timescale. With ACTOS' instrumentation, the Damköhler number can be determined with a spatial resolution of about 15 m. In agreement with the literature, low values of the Damköhler number correlate with the homogeneous mixing scenario, while higher values correlate with the inhomogeneous mixing scenario. It is shown that even within one cloud, different mixing scenarios can take place. The data suggest that homogeneous mixing is more likely to occur in the vicinity of the vigorous cloud core, while inhomogeneous mixing dominates in the outer, less turbulent part of the cloud. A case is presented in which the mixing led to the formation of drops that are larger than in the unmixed adiabatic core. This is of potential importance for precipitation formation in warm cumulus clouds.
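
    In symbols, with the mixing timescale estimated from the turbulent energy dissipation rate ε at entrainment scale ℓ (a standard inertial-range estimate):

```latex
\mathrm{Da} = \frac{\tau_{\mathrm{mix}}}{\tau_{\mathrm{react}}},
\qquad
\tau_{\mathrm{mix}} \sim \left(\frac{\ell^{2}}{\varepsilon}\right)^{1/3},
\qquad
\tau_{\mathrm{react}} \in \{\tau_{\mathrm{evap}},\ \tau_{\mathrm{phase}}\}
```

    Da ≪ 1 (fast mixing) favours the homogeneous scenario; Da ≫ 1 (slow mixing) favours the inhomogeneous one, consistent with the correlations reported above.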

  13. Scaled long rod penetration experiments: Tungsten against rolled homogeneous armour

    NASA Astrophysics Data System (ADS)

    Proud, William; Cross, Daniel

    2012-03-01

    Scaled, reverse ballistic, long-rod experiments were performed at an impact velocity of ~700 m s⁻¹. The targets were tungsten alloy rods and the projectiles either 3 or 6 mm thick rolled homogeneous armour (RHA) plates. The plate was inclined at 60° to the direction of travel and the interaction was recorded using high-speed photography, strain gauges and laser velocimetry. The pitch of the rod was varied in steps of 3° over a total range of 15°. In this range the rod deformation changed dramatically: the bending process moved from a flexing of the tip away from the plate to a marked motion into the surface. Cross comparison of the diagnostic outputs reveals the time windows for these processes and also the varying sensitivity of the measurement system to each process. Post-impact recovery was also performed.

  14. Creating engaging experiences for rehabilitation.

    PubMed

    McClusky, John F

    2008-01-01

    The traditional model of rehabilitation center design based on usability and function falls short of addressing the aspirations of those who use them. To better serve the motivational needs of both patients and therapists, we need to reconsider the gymnasium-inspired designs of current rehabilitation centers. Designers Patricia Moore and David Guynes have drawn inspiration from the everyday to create more engaging rehabilitation experiences with their Easy Street, Independence Square, Rehab 1-2-3, Our Town, and WorkSyms rehabilitation environments. Their designs simulate real-life situations to motivate patients by helping them connect their therapy to the life in which they aspire to return. Utilizing an empathic research process, Moore and Guynes build a deeper understanding of both patients' and therapists' values and apply that understanding to designs that are more directly connected to patients' aspirational goals while still meeting their functional rehabilitation needs. This same research-based design approach is utilized in all of their design work that has included, most recently, the design of the Phoenix Valley Transit Authority's Metro Light Rail Train. The train and stations have won awards for accessibility and will begin public operation in late 2008.

  15. Modeling the homogenization kinetics of as-cast U-10wt% Mo alloys

    NASA Astrophysics Data System (ADS)

    Xu, Zhijie; Joshi, Vineet; Hu, Shenyang; Paxton, Dean; Lavender, Curt; Burkes, Douglas

    2016-04-01

    Low-enriched U-22at% Mo (U-10Mo) alloy has been considered as an alternative material to replace the highly enriched fuels in research reactors. For U-10Mo to work effectively and replace the existing fuel material, a thorough understanding of the microstructure development from as-cast to the final formed structure is required. The as-cast microstructure is typically inhomogeneous, with molybdenum-rich and -lean regions that may affect the processing and possibly the in-reactor performance. This as-cast structure must be homogenized by thermal treatment to produce a uniform Mo distribution. The development of a modeling capability will improve the understanding of the effect of initial microstructures on the Mo homogenization kinetics. In the current work, we investigated the effect of the as-cast microstructure on the homogenization kinetics. The kinetics of the homogenization was modeled based on a rigorous algorithm that relates line scan data of Mo concentration to the gray scale in energy dispersive spectroscopy images, which was used to generate a reconstructed Mo concentration map. The map was then used as realistic microstructure input for physics-based homogenization models, with which the entire homogenization kinetics can be simulated and validated against the available experimental data at different homogenization times and temperatures.
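
    The physics being modeled is diffusive smoothing of the Mo concentration field. A minimal 1-D sketch of that idea on a single reconstructed line scan, assuming a constant diffusivity and placeholder parameter values (not U-10Mo data):

```python
import numpy as np

def homogenize_profile(c, D=1e-14, dx=1e-6, t_total=3600.0, safety=0.4):
    """Explicit finite-difference diffusion of a 1-D Mo concentration
    profile c (e.g. at.% along a line scan), with zero-flux boundaries.
    D (m^2/s), dx (m) and t_total (s) are illustrative placeholders."""
    c = np.asarray(c, dtype=float).copy()
    dt = safety * dx * dx / (2.0 * D)      # below explicit stability limit
    for _ in range(int(t_total / dt)):
        cp = np.pad(c, 1, mode="edge")     # ghost cells -> zero-flux walls
        c += D * dt / dx**2 * (cp[2:] - 2.0 * c + cp[:-2])
    return c

# The decay of np.std(profile) over time tracks the homogenization kinetics.
```

    Full models couple this to temperature-dependent diffusivity and use the reconstructed 2-D concentration map rather than a single line.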

  16. Numerical homogenization of the Richards equation for unsaturated water flow through heterogeneous soils

    NASA Astrophysics Data System (ADS)

    Li, Na; Yue, Xingye; Ren, Li

    2016-11-01

    Homogenized equations and the corresponding effective constitutive relations are generally necessary for numerically modeling large-scale unsaturated flow processes in soils. Recently, based on the Kirchhoff transformation and the two-scale convergence theory, a homogenization method for the Richards equation with the Mualem-van Genuchten model has been proposed, with a constant model parameter α relating to the inverse of the air-entry pressure and the soil pore size distribution. The homogenized model is computationally efficient and convenient to use because of its explicit expression. In this study, we generalize this method, allowing α to be a spatially distributed random field and proposing a homogenized Richards equation in the mixed form (θ/h) under the condition that the effective hydraulic conductivity tensor is diagonal. This generalization eliminates the limitation of a constant α in practical applications; the proposed homogenized model is meaningful in most situations because the flow problems are influenced mainly by the diagonal terms of conductivity and the off-diagonal terms are often neglected. Two-dimensional numerical tests are conducted in soil profiles with different degrees of spatial heterogeneity structure to illustrate that the homogenized model can capture the fine-scale flow behaviors on coarse grids effectively. Homogenization for the Richards equation with other two commonly used constitutive relations—the Brooks-Corey model and the Gardner-Russo model—is also illustrated in this study.
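
    For reference, the mixed (θ-h) form of the Richards equation being homogenized, in standard notation (θ volumetric water content, h pressure head, K unsaturated hydraulic conductivity, z elevation), together with the Kirchhoff transformation on which the earlier method is built:

```latex
\frac{\partial \theta(h)}{\partial t} = \nabla\cdot\left[K(h)\,\nabla(h+z)\right],
\qquad
u(h) = \int_{-\infty}^{h} K(s)\,ds
```

    The transformation replaces the strongly nonlinear conductivity term by a linear diffusion term in u, which is what makes explicit homogenized coefficients tractable.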

  17. Modeling the Homogenization Kinetics of As-Cast U-10wt% Mo alloys

    SciTech Connect

    Xu, Zhijie; Joshi, Vineet; Hu, Shenyang Y.; Paxton, Dean M.; Lavender, Curt A.; Burkes, Douglas

    2016-01-15

    Low-enriched U-22at% Mo (U-10Mo) alloy has been considered as an alternative material to replace the highly enriched fuels in research reactors. For U-10Mo to work effectively and replace the existing fuel material, a thorough understanding of the microstructure development from as-cast to the final formed structure is required. The as-cast microstructure is typically inhomogeneous, with molybdenum-rich and -lean regions that may affect the processing and possibly the in-reactor performance. This as-cast structure must be homogenized by thermal treatment to produce a uniform Mo distribution. The development of a modeling capability will improve the understanding of the effect of initial microstructures on the Mo homogenization kinetics. In the current work, we investigated the effect of the as-cast microstructure on the homogenization kinetics. The kinetics of the homogenization was modeled based on a rigorous algorithm that relates line scan data of Mo concentration to the gray scale in energy dispersive spectroscopy images, which was used to generate a reconstructed Mo concentration map. The map was then used as realistic microstructure input for physics-based homogenization models, with which the entire homogenization kinetics can be simulated and validated against the available experimental data at different homogenization times and temperatures.

  18. Homogeneity computation: How interitem similarity in visual short-term memory alters recognition

    PubMed Central

    Viswanathan, Shivakumar; Perl, Daniel R.; Visscher, Kristina M.; Kahana, Michael J.; Sekuler, Robert

    2010-01-01

    Visual short-term recognition memory for multiple stimuli is strongly influenced by the study items’ similarity to one another—that is, by their homogeneity. However, the mechanism responsible for this homogeneity effect has remained unclear. We evaluated competing explanations of this effect, using controlled sets of Gabor patches as study items and probe stimuli. Our results, based on recognition memory for spatial frequency, rule out the possibility that the homogeneity effect arises because similar study items are encoded and/or maintained with higher fidelity in memory than dissimilar study items are. Instead, our results support the hypothesis that the homogeneity effect reflects trial-by-trial comparisons of study items, which generate a homogeneity signal. This homogeneity signal modulates recognition performance through an adjustment of the subject’s decision criterion. Additionally, it seems the homogeneity signal is computed prior to the presentation of the probe stimulus, by evaluating the familiarity of each new stimulus with respect to the items already in memory. This suggests that recognition-like processes operate not only on the probe stimulus, but on study items as well. PMID:20081162

  19. Homogeneous Characterization of Transiting Exoplanet Systems

    NASA Astrophysics Data System (ADS)

    Gomez Maqueo Chew, Yilen; Faedi, Francesca; Hebb, Leslie; Pollacco, Don; Stassun, Keivan; Ghezzi, Luan; Cargile, Phillip; Barros, Susana; Smalley, Barry; Mack, Claude

    2012-02-01

    We aim to obtain a homogeneous set of high resolution, high signal-to-noise (S/N) spectra for a large and diverse sample of stars with transiting planets, using the Kitt Peak 4-m echelle spectrograph for bright Northern targets (7.7 …). From the homogeneous analysis of this high-quality dataset, we will be able to investigate any systematic uncertainties on the derived stellar properties, and consequently on the planetary properties derived from the iterative combination of our spectral analysis with the best available radial velocity data and transit photometry, to derive a homogeneous set of properties for the transiting systems. The resulting consistent set of physical properties will allow us to further explore known correlations, e.g., core size of the planet and stellar metallicity, and to newly identify subtle relationships providing insight into our understanding of planetary formation, structure, and evolution. Our pilot study, analyzing our WASP-13 HIRES spectrum (R ≈ 48,000, S/N > 150) in combination with high precision light curves, shows an improvement in the precision of the stellar parameters of 60% in T_eff, 75% in [Fe/H], 82% in M_star, and 73% in R_star, which translates into a 64% improvement in the precision of R_pl, and more than 2% in M_pl, relative to the discovery paper's values.

  20. Sulfur isotope homogeneity of lunar mare basalts

    NASA Astrophysics Data System (ADS)

    Wing, Boswell A.; Farquhar, James

    2015-12-01

    We present a new set of high precision measurements of relative 33S/32S, 34S/32S, and 36S/32S values in lunar mare basalts. The measurements are referenced to the Vienna-Canyon Diablo Troilite (V-CDT) scale, on which the international reference material, IAEA-S-1, is characterized by δ33S = -0.061‰, δ34S ≡ -0.3‰ and δ36S = -1.27‰. The present dataset confirms that lunar mare basalts are characterized by a remarkable degree of sulfur isotopic homogeneity, with most new and published SF6-based sulfur isotope measurements consistent with a single mass-dependent mean isotopic composition of δ34S = 0.58 ± 0.05‰, Δ33S = 0.008 ± 0.006‰, and Δ36S = 0.2 ± 0.2‰, relative to V-CDT, where the uncertainties are quoted as 99% confidence intervals on the mean. This homogeneity allows identification of a single sample (12022, 281) with an apparent 33S enrichment, possibly reflecting cosmic-ray-induced spallation reactions. It also reveals that some mare basalts have slightly lower δ34S values than the population mean, which is consistent with sulfur loss from a reduced basaltic melt prior to eruption at the lunar surface. Both the sulfur isotope homogeneity of the lunar mare basalts and the predicted sensitivity of sulfur isotopes to vaporization-driven fractionation suggest that less than ≈1-10% of lunar sulfur was lost after a potential moon-forming impact event.
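
    For reference, the capital-delta notation used above measures deviation from a single mass-dependent fractionation law; with the commonly used exponents (0.515 for ³³S, 1.90 for ³⁶S):

```latex
\Delta^{33}\mathrm{S} = \delta^{33}\mathrm{S} - 1000\left[\left(1+\frac{\delta^{34}\mathrm{S}}{1000}\right)^{0.515}-1\right],
\qquad
\Delta^{36}\mathrm{S} = \delta^{36}\mathrm{S} - 1000\left[\left(1+\frac{\delta^{34}\mathrm{S}}{1000}\right)^{1.90}-1\right]
```

    Near-zero Δ values across the sample are what make the single anomalous ³³S-enriched sample (12022, 281) stand out.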