Science.gov

Sample records for process creates homogenous

  1. Pattern and process of biotic homogenization in the New Pangaea.

    PubMed

    Baiser, Benjamin; Olden, Julian D; Record, Sydne; Lockwood, Julie L; McKinney, Michael L

    2012-12-01

    Human activities have reorganized the Earth's biota, resulting in spatially disparate locales becoming more or less similar in species composition over time through the processes of biotic homogenization and biotic differentiation, respectively. Despite mounting evidence suggesting that this process may be widespread in both aquatic and terrestrial systems, past studies have predominantly focused on single taxonomic groups at a single spatial scale. Furthermore, change in pairwise similarity is itself dependent on two distinct processes: spatial turnover in species composition and changes in gradients of species richness. Most past research has failed to disentangle the effects of these two mechanisms on homogenization patterns. Here, we use recent statistical advances and collate a global database of homogenization studies (20 studies, 50 datasets) to provide the first global investigation of the homogenization process across major faunal and floral groups and to elucidate the relative roles of changes in species richness and turnover. We found evidence of homogenization (change in similarity ranging from -0.02 to 0.09) across nearly all taxonomic groups, spatial extents and grain sizes. Partitioning of change in pairwise similarity shows that overall change in community similarity is driven by changes in species richness. Our results show that biotic homogenization is truly a global phenomenon and call into question many of the ecological mechanisms invoked in previous studies to explain patterns of homogenization. PMID:23055062

  3. Web Pages Created Via SCID Process.

    ERIC Educational Resources Information Center

    Stammen, Ronald M.

    This paper describes the use of a management process, Systematic Curriculum and Instructional Development (SCID), for developing online multimedia modules. The project, "Collaboratively Creating Multimedia Modules for Teachers and Professors," was funded by the USWEST Foundation. The curriculum development process involved teams of experts in…

  4. Spoken Word Processing Creates a Lexical Bottleneck

    ERIC Educational Resources Information Center

    Cleland, Alexandra A.; Tamminen, Jakke; Quinlan, Philip T.; Gaskell, M. Gareth

    2012-01-01

    We report 3 experiments that examined whether presentation of a spoken word creates an attentional bottleneck associated with lexical processing in the absence of a response to that word. A spoken word and a visual stimulus were presented in quick succession, but only the visual stimulus demanded a response. Response times to the visual stimulus…

  5. Use of Atomic Layer Deposition to create homogeneous SRXF/STXM standards

    NASA Astrophysics Data System (ADS)

    Becker, Nicholas; Klug, Jeffrey; Sutton, Steve; Butterworth, Anna; Westphal, Andrew; Zasadzinski, John; Proslier, Thomas

    2014-03-01

    The use of Standard Reference Materials (SRM) from the National Institute of Standards and Technology (NIST) is common for quantitative analysis of chemical composition when analyzing samples with synchrotron-based X-Ray Fluorescence (SR-XRF) and Scanning Transmission X-Ray Microscopy (STXM). However, these standards can suffer from inhomogeneity in chemical composition and often require further corrections to obtain quantitative results. This inhomogeneity can negatively affect the reproducibility of measurements as well as the quantitative measure itself, and the introduction of assumptions for calculations can further limit reliability. Atomic Layer Deposition (ALD) is a deposition technique known for producing uniform, conformal films of a wide range of compounds on nearly any substrate material. These traits make it an ideal deposition method for producing thin films to replace the NIST standards and create SRM on a wide range of relevant substrates. Utilizing Rutherford Backscattering, STXM, and SR-XRF, we will present data showing that ALD is capable of producing films that are homogeneous over scales ranging from 100 μm to 1 nm on TEM windows. This work was supported by the U.S. Department of Energy, Office of Science under contract No. DE-AC02-06CH11357.

  6. Process to create simulated lunar agglutinate particles

    NASA Technical Reports Server (NTRS)

    Gustafson, Robert J. (Inventor); Gustafson, Marty A. (Inventor); White, Brant C. (Inventor)

    2011-01-01

    A method of creating simulated agglutinate particles by applying a heat source sufficient to partially melt a raw material is provided. The raw material is preferably any lunar soil simulant, crushed mineral, mixture of crushed minerals, or similar material, and the heat source creates localized heating of the raw material.

  7. Creep rupture as a non-homogeneous Poissonian process

    PubMed Central

    Danku, Zsuzsa; Kun, Ferenc

    2013-01-01

    Creep rupture of heterogeneous materials occurring under constant sub-critical external loads is responsible for the collapse of engineering constructions and for natural catastrophes. Acoustic monitoring of crackling bursts provides microscopic insight into the failure process. Based on a fiber bundle model, we show that the accelerating bursting activity when approaching failure can be described by the Omori law. For long range load redistribution the time series of bursts proved to be a non-homogeneous Poissonian process with power law distributed burst sizes and waiting times. We demonstrate that limitations of experiments such as finite detection threshold and time resolution have striking effects on the characteristic exponents, which have to be taken into account when comparing model calculations with experiments. Recording events solely within the Omori time to failure the size distribution of bursts has a crossover to a lower exponent which is promising for forecasting the imminent catastrophic failure. PMID:24045539

  8. Experimenting With Ore: Creating the Taconite Process; flow chart of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Experimenting With Ore: Creating the Taconite Process; flow chart of process - Mines Experiment Station, University of Minnesota, Twin Cities Campus, 56 East River Road, Minneapolis, Hennepin County, MN

  9. Competing Contact Processes on Homogeneous Networks with Tunable Clusterization

    NASA Astrophysics Data System (ADS)

    Rybak, Marcin; Kułakowski, Krzysztof

    2013-03-01

    We investigate two homogeneous networks: the Watts-Strogatz network with mean degree ⟨k⟩ = 4 and the Erdős-Rényi network with ⟨k⟩ = 10. In both kinds of networks, the clustering coefficient C is a tunable control parameter. The network is an arena for two competing contact processes, where nodes can be in two states, S or D. A node S becomes D with probability 1 if at least two of its mutually linked neighbors are D. A node D becomes S with a given probability p if at least one of its neighbors is S. The competition between the processes is described by a phase diagram, where the critical probability pc depends on the clustering coefficient C. For p > pc the fraction of nodes in state S increases in time, seemingly coming to dominate the whole system. Below pc, the majority of nodes are in the D-state. The numerical results indicate that for the Watts-Strogatz network the D-process is activated at a finite value of the clustering coefficient C, close to 0.3. By contrast, for the Erdős-Rényi network the transition is observed over the whole investigated range of C.
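
    The update rules above are fully specified, so the competition can be sketched directly. A minimal simulation, under stated assumptions: the un-rewired Watts-Strogatz substrate (a ring lattice with ⟨k⟩ = 4, where clustering is maximal), synchronous updates, and "mutually linked neighbors" read as two D-neighbors that are themselves connected, closing a triangle:

```python
import random
from itertools import combinations

def ring_lattice(n, k=4):
    """Watts-Strogatz substrate before rewiring: each node is linked to
    its k nearest neighbours (k/2 on each side of the ring)."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k // 2 + 1):
            adj[i].add((i + d) % n)
            adj[i].add((i - d) % n)
    return adj

def step(adj, state, p, rng):
    """One synchronous update of the two competing contact processes."""
    new = dict(state)
    for i, nbrs in adj.items():
        if state[i] == "S":
            d_nbrs = [j for j in nbrs if state[j] == "D"]
            # S -> D with probability 1 if two D-neighbours are themselves
            # linked to each other ("mutually linked", closing a triangle).
            if any(b in adj[a] for a, b in combinations(d_nbrs, 2)):
                new[i] = "D"
        elif any(state[j] == "S" for j in nbrs) and rng.random() < p:
            new[i] = "S"           # D -> S with probability p
    return new

rng = random.Random(7)
adj = ring_lattice(200, k=4)
state = {i: rng.choice("SD") for i in adj}
for _ in range(50):
    state = step(adj, state, p=0.9, rng=rng)
frac_S = sum(v == "S" for v in state.values()) / len(state)
```

    Sweeping p and the rewiring probability (which tunes C) would trace out the phase diagram described in the abstract.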

  10. Modeling environmental noise exceedances using non-homogeneous Poisson processes.

    PubMed

    Guarnaccia, Claudio; Quartieri, Joseph; Barrios, Juan M; Rodrigues, Eliane R

    2014-10-01

    In this work a non-homogeneous Poisson model is considered to study noise exposure. The Poisson process, counting the number of times that a sound level surpasses a threshold, is used to estimate the probability that a population is exposed to high levels of noise a certain number of times in a given time interval. The rate function of the Poisson process is assumed to be of a Weibull type. The presented model is applied to community noise data from Messina, Sicily (Italy). Four sets of data are used to estimate the parameters involved in the model. After the estimation and tuning are made, a way of estimating the probability that an environmental noise threshold is exceeded a certain number of times in a given time interval is presented. This estimation can be very useful in the study of noise exposure of a population and also to predict, given the current behavior of the data, the probability of occurrence of high levels of noise in the near future. One of the most important features of the model is that it implicitly takes into account different noise sources, which need to be treated separately when using usual models.

  11. Can An Evolutionary Process Create English Text?

    SciTech Connect

    Bailey, David H.

    2008-10-29

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
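
    The Dawkins demonstration mentioned above (a fixed, pre-specified target phrase, unlike the target-free fitness-landscape scheme of this study) is compact enough to sketch; the target phrase and parameters below are the classic "weasel" illustration:

```python
import random
import string

ALPHABET = string.ascii_uppercase + " "
TARGET = "METHINKS IT IS LIKE A WEASEL"        # Dawkins's example phrase

def fitness(s):
    """Number of characters already matching the target."""
    return sum(a == b for a, b in zip(s, TARGET))

def evolve(pop_size=100, mut_rate=0.05, seed=0):
    """Hill-climb from random gibberish: mutate, keep the fittest."""
    rng = random.Random(seed)
    parent = "".join(rng.choice(ALPHABET) for _ in TARGET)
    generations = 0
    while parent != TARGET:
        # Each offspring copies the parent with per-character mutations.
        offspring = [
            "".join(c if rng.random() > mut_rate else rng.choice(ALPHABET)
                    for c in parent)
            for _ in range(pop_size)
        ]
        # Retaining the parent makes fitness non-decreasing.
        parent = max(offspring + [parent], key=fitness)
        generations += 1
    return generations

generations = evolve()
```

    The phrase emerges in a few hundred generations at most, which is exactly the property critics object to: the target is fixed in advance, motivating the open-ended scheme this study explores instead.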

  12. A criterion for assessing homogeneity distribution in hyperspectral images. Part 1: homogeneity index bases and blending processes.

    PubMed

    Rosas, Juan G; Blanco, Marcelo

    2012-11-01

    The Process Analytical Technologies (PAT) initiative of the US Food and Drug Administration (US FDA) has established a framework for the development of imaging techniques to determine the real-time distribution of mixture components during the production of solid dosage forms. This study, which is the first in a series of two parts, uses existing mixing indices and a new criterion called the "percentage of homogeneity" (H%) to assess image homogeneity. Image analysis techniques use feature extraction procedures to extract information from images subjected to treatments including colour segmentation and binarization. The surface distribution of components was determined by macropixel analysis, which splits an image into non-overlapping blocks of a preset size and calculates several statistical parameters for the resulting divisional structure. Such parameters were used to compute mixing indices. In this work, we explored the potential of image processing in combination with mixing indices and H% for assessing blending end-point and component distribution on images. As a simplified test, an arrangement of binary and ternary systems of coloured particles was mixed collecting at-line multispectral (MSI) and non-invasive RGB pictures at preset intervals. PMID:22818029
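
    The macropixel step described above is straightforward to sketch. A minimal illustration (the blending index below is an illustrative stand-in, not the paper's H% formula):

```python
import random
from statistics import mean, pstdev

def macropixel_means(img, block):
    """Split a 2-D image (list of rows) into non-overlapping block x block
    macropixels and return the mean intensity of each block."""
    h, w = len(img), len(img[0])
    means = []
    for r in range(0, h - h % block, block):
        for c in range(0, w - w % block, block):
            vals = [img[r + i][c + j] for i in range(block) for j in range(block)]
            means.append(mean(vals))
    return means

def homogeneity_percent(img, block):
    """Illustrative mixing-style index (NOT the paper's H% definition):
    100% when every macropixel mean equals the global mean, decreasing
    with the relative spread between blocks."""
    means = macropixel_means(img, block)
    spread = pstdev(means) / (mean(means) + 1e-12)
    return 100.0 * max(0.0, 1.0 - spread)

rng = random.Random(0)
well_mixed = [[rng.uniform(0.4, 0.6) for _ in range(64)] for _ in range(64)]
segregated = [[1.0 if c < 32 else 0.0 for c in range(64)] for _ in range(64)]
# A well-blended field scores near 100%; a fully segregated one far lower.
```

    Tracking such an index over successive frames is one way a blending end-point could be detected.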

  14. A Tool for Creating Healthier Workplaces: The Conducivity Process

    ERIC Educational Resources Information Center

    Karasek, Robert A.

    2004-01-01

    The conducivity process, a methodology for creating healthier workplaces by promoting conducive production, is illustrated through the use of the "conducivity game" developed in the NordNet Project in Sweden, which was an action research project to test a job redesign methodology. The project combined the "conducivity" hypotheses about a…

  15. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess the homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for the development and production of pharmaceutical suspensions in the future.

  16. Process to Create High-Fidelity Lunar Dust Simulants

    NASA Technical Reports Server (NTRS)

    Gustafson, Robert

    2010-01-01

    A method was developed to create high-fidelity lunar dust simulants that better match the unique properties of lunar dust than the existing simulants. The new dust simulant is designed to more closely approximate the size, morphology, composition, and other important properties of lunar dust (including the presence of nanophase iron). A two-step process is required to create this dust simulant. The first step is to prepare a feedstock material that contains a high percentage of agglutinate-like particles with iron globules (including nanophase iron). The raw material selected must have the proper mineralogical composition. In the second processing step, the feedstock material from the first step is jet-milled to reduce the particle size to a range consistent with lunar dust.

  17. Parallel-Processing Software for Creating Mosaic Images

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Deen, Robert; McCauley, Michael; DeJong, Eric

    2008-01-01

    A computer program implements parallel processing for nearly real-time creation of panoramic mosaics of images of terrain acquired by video cameras on an exploratory robotic vehicle (e.g., a Mars rover). Because the original images are typically acquired at various camera positions and orientations, it is necessary to warp the images into the reference frame of the mosaic before stitching them together to create the mosaic. [Also see "Parallel-Processing Software for Correlating Stereo Images," Software Supplement to NASA Tech Briefs, Vol. 31, No. 9 (September 2007) page 26.] The warping algorithm in this computer program reflects the considerations that (1) for every pixel in the desired final mosaic, a good corresponding point must be found in one or more of the original images and (2) for this purpose, one needs a good mathematical model of the cameras and a good correlation of individual pixels with respect to their positions in three dimensions. The desired mosaic is divided into slices, each of which is assigned to one of a number of central processing units (CPUs) operating simultaneously. The results from the CPUs are gathered and placed into the final mosaic. The time taken to create the mosaic depends upon the number of CPUs, the speed of each CPU, and whether a local or a remote data-staging mechanism is used.
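
    The slice-and-gather structure described above can be sketched with a thread pool standing in for the multiple CPUs (the warp step is a placeholder computation, not the program's camera-model warp):

```python
from concurrent.futures import ThreadPoolExecutor

def warp_slice(bounds):
    """Stand-in for the real warp step: each worker fills the output rows
    of its assigned slice from the (hypothetical) source images."""
    lo, hi = bounds
    return lo, [row * 2 for row in range(lo, hi)]   # placeholder computation

def build_mosaic(height, n_workers=4):
    """Divide the mosaic into horizontal slices, hand one slice to each
    worker, then gather the results back in slice order. (The program
    described above uses one CPU per slice; threads stand in here.)"""
    step = -(-height // n_workers)                  # ceiling division
    bounds = [(lo, min(lo + step, height)) for lo in range(0, height, step)]
    mosaic = [None] * height
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        for lo, rows in pool.map(warp_slice, bounds):
            mosaic[lo:lo + len(rows)] = rows
    return mosaic

mosaic = build_mosaic(10)
```

    Because the slices are independent, total time scales with the slowest slice rather than the sum, which is the speed-up the abstract attributes to the number and speed of the CPUs.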

  18. Creating "Intelligent" Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, Noel; Taylor, Patrick

    2014-05-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is used to add value to individual model projections and construct a consensus projection. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, individual models reproduce certain climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequal weighting multi-model ensembles. The intention is to produce improved ("intelligent") unequal-weight ensemble averages. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables—e.g., outgoing longwave radiation and surface temperature. Several climate process metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument in combination with surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing the equal-weighted ensemble average and an ensemble weighted using the process-based metric. Additionally, this study investigates the dependence of the metric weighting scheme on the climate state using a combination of model simulations including a non-forced preindustrial control experiment, historical simulations, and
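
    The weighting question raised in the abstract can be made concrete. In the sketch below the projections, metric errors, and inverse-error weighting rule are all hypothetical illustrations, not values or methods taken from the study:

```python
def metric_weights(errors):
    """Turn per-model metric errors (smaller = better process fidelity)
    into normalised weights via inverse error."""
    inv = [1.0 / e for e in errors]
    total = sum(inv)
    return [w / total for w in inv]

def ensemble_average(projections, weights=None):
    """Equal-weight mean when weights is None, else a weighted mean."""
    n = len(projections)
    if weights is None:
        weights = [1.0 / n] * n
    return sum(w * p for w, p in zip(weights, projections))

# Hypothetical warming projections (K) from three models, and each
# model's (hypothetical) error on a process-based metric.
proj = [2.1, 3.4, 2.8]
errs = [0.2, 0.8, 0.4]
equal = ensemble_average(proj)
skill = ensemble_average(proj, metric_weights(errs))
# The metric-weighted mean down-weights the worst-performing model.
```

    The study's finding is that regional results can differ substantially between the two averages, which is what makes the choice of metric consequential.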

  19. Tests of a homogeneous Poisson process against clustering and other alternatives

    SciTech Connect

    Atwood, C.L.

    1994-05-01

    This report presents three closely related tests of the hypothesis that data points come from a homogeneous Poisson process. If there is too much observed variation among the log-transformed between-point distances, the hypothesis is rejected. The tests are more powerful than the standard chi-squared test against the alternative hypothesis of event clustering, but not against the alternative hypothesis of a Poisson process with smoothly varying intensity.
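
    The report's exact statistics are not reproduced here, but the underlying idea (reject homogeneity when the log-transformed between-point distances vary too much) can be sketched with a generic Monte Carlo calibration:

```python
import math
import random

def log_gap_variance(points):
    """Test statistic: variance of the log-transformed between-point distances."""
    pts = sorted(points)
    logs = [math.log(b - a) for a, b in zip(pts, pts[1:])]
    m = sum(logs) / len(logs)
    return sum((x - m) ** 2 for x in logs) / len(logs)

def poisson_homogeneity_pvalue(points, t_max, n_sim=2000, seed=0):
    """Monte Carlo p-value: conditional on n, a homogeneous Poisson
    process has points uniform on [0, t_max]; too much variation among
    the log gaps (clustering) yields a small p-value."""
    rng = random.Random(seed)
    obs = log_gap_variance(points)
    n = len(points)
    hits = sum(
        log_gap_variance([rng.uniform(0.0, t_max) for _ in range(n)]) >= obs
        for _ in range(n_sim)
    )
    return (hits + 1) / (n_sim + 1)

rng = random.Random(3)
uniform_pts = [rng.uniform(0.0, 100.0) for _ in range(50)]
clustered = ([rng.gauss(10.0, 0.01) for _ in range(25)]
             + [rng.gauss(90.0, 0.01) for _ in range(25)])
# The clustered sample should be rejected; the uniform one usually is not.
```

    As the abstract notes, a one-sided test of this form has power against clustering but not against a smoothly varying intensity, whose gaps can look locally Poissonian.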

  20. Occurrence analysis of daily rainfalls by using non-homogeneous Poissonian processes

    NASA Astrophysics Data System (ADS)

    Sirangelo, B.; Ferrari, E.; de Luca, D. L.

    2009-09-01

    In recent years several temporally homogeneous stochastic models have been applied to describe the rainfall process. In particular, stochastic analysis of daily rainfall time series may help explain the statistical features of the temporal variability of the phenomenon. Due to the evident periodicity of the physical process, these models have to be applied only to short temporal intervals in which the occurrences and intensities of rainfalls can reliably be considered homogeneous. To this aim, occurrences of daily rainfalls can be considered as a stationary stochastic process within monthly periods. In this context point process models are widely used for at-site analysis of daily rainfall occurrence; they are continuous-time models, able to capture the intermittent character of rainfall and to simulate interstorm periods. With a different approach, the periodic features of daily rainfalls can be interpreted by using a temporally non-homogeneous stochastic model characterized by parameters expressed as continuous functions of time. In this case, great attention has to be paid to the parsimony of the models, as regards the number of parameters and the bias introduced into the generation of synthetic series, and to the influence of threshold values in extracting the peak storm database from recorded daily rainfall heights. In this work, a stochastic model based on a non-homogeneous Poisson process, characterized by a time-dependent intensity of rainfall occurrence, is employed to explain seasonal effects of daily rainfalls exceeding prefixed threshold values. In particular, the variation of the rainfall occurrence intensity λ(t) is modelled by using Fourier series analysis, in which the non-homogeneous process is transformed into a homogeneous unit-rate one through a proper transformation of the time domain, and the minimum number of harmonics is chosen by applying available statistical tests. The procedure is applied to a dataset of rain gauges located in

  1. Simple process for building large homogeneous adaptable retarders made from polymeric materials.

    PubMed

    Delplancke, F; Sendrowicz, H; Bernaerd, R; Ebbeni, J

    1995-06-01

    A process for building large, homogeneous, adaptable retarders easily and at low cost is proposed and analyzed. This method is based on the ability of high polymers to exhibit variable birefringence as a function of applied stresses, and on the possibility of freezing these stresses inside the material by a thermal process. Various geometries for the applied forces make it possible to obtain a large range of birefringence profiles. In the process that we describe, composed bending leads to a linear birefringence profile. The superimposition of two pieces with identical profiles in opposite directions gives a homogeneous constant retardation. This retardation can be adjusted by a relative displacement between the pieces. A precision of better than 1% over large areas (more than 3 cm in diameter) for a quarter-wave value has been obtained. The correct choice of material makes many applications possible over a large range of wavelengths.

  2. Pot-in-pot reactions: Heterogenization of homogeneous reaction processes for otherwise impossible cascades

    NASA Astrophysics Data System (ADS)

    Thuo, Martin

    Many excellent examples of homogeneous catalysts have been developed that elegantly and efficiently catalyze one reaction. Although the use of catalysts is ubiquitous in chemical synthesis, reactions must be carried out sequentially; otherwise the catalysts/reagents may poison one another or require incompatible reaction conditions. These limitations make the synthesis of vital molecules a tedious, expensive, and wasteful process. Multi-step synthesis is also not environmentally benign, given the sheer volume of waste generated per step. To overcome some of these limitations, catalysts have been site-isolated from each other, thereby facilitating several steps in one reaction pot. However, available site-isolation methods have major shortcomings. Therefore, a general approach that works with already known chemistry and catalysts, without the need for further modification, is desired. This thesis reports a new approach to catalyst site-isolation. We exploited the advantages of both heterogeneous and homogeneous processes to develop new cascade reaction sequences by employing polydimethylsiloxane thimbles as selective semi-permeable walls. These thimbles allow small organic molecules to diffuse through while retaining polar reagents and/or organometallic catalysts. A felicitous choice of reaction conditions led to the development of pot-in-pot reactions, a new concept in organic catalysis. To demonstrate how versatile this new technique is, we performed 2- and 3-step cascade reactions. This new approach circumvents the need to isolate intermediates, therefore enabling the synthesis of otherwise challenging molecules. The genesis of our work was the occlusion of an organometallic catalyst in polydimethylsiloxane to perform catalysis in water. Also, by simply occluding the catalyst in a polymer matrix, it was possible to dictate whether the catalyst gave a metathesis or an isomerization product. Since the work summarized herein demonstrates site-isolation of a

  3. Selves creating stories creating selves: a process model of self-development.

    PubMed

    McLean, Kate C; Pasupathi, Monisha; Pals, Jennifer L

    2007-08-01

    This article is focused on the growing empirical emphasis on connections between narrative and self-development. The authors propose a process model of self-development in which storytelling is at the heart of both stability and change in the self. Specifically, we focus on how situated stories help develop and maintain the self with reciprocal impacts on enduring aspects of self, specifically self-concept and the life story. This article emphasizes the research that has shown how autobiographical stories affect the self and provides a direction for future work to maximize the potential of narrative approaches to studying processes of self-development.

  4. Occurrence analysis of daily rainfalls through non-homogeneous Poissonian processes

    NASA Astrophysics Data System (ADS)

    Sirangelo, B.; Ferrari, E.; de Luca, D. L.

    2011-06-01

    A stochastic model based on a non-homogeneous Poisson process, characterised by a time-dependent intensity of rainfall occurrence, is employed to explain seasonal effects of daily rainfalls exceeding prefixed threshold values. The data modelling has been performed with a partition of observed daily rainfall data into a calibration period for parameter estimation and a validation period for checking on occurrence process changes. The model has been applied to a set of rain gauges located in different geographical areas of Southern Italy. The results show a good fit for time-varying intensity of rainfall occurrence process by 2-harmonic Fourier law and no statistically significant evidence of changes in the validation period for different threshold values.
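
    The time-domain transformation underlying this model (mapping the non-homogeneous process onto a homogeneous unit-rate one via the integrated intensity) can be sketched directly. One harmonic is shown for brevity (a second adds an analogous term), and the coefficients are hypothetical:

```python
import math
import random

def Lambda(t, a0, a1, period=365.0):
    """Integrated intensity of lam(t) = a0 + a1*cos(2*pi*t/period)."""
    w = 2.0 * math.pi / period
    return a0 * t + (a1 / w) * math.sin(w * t)

def invert_Lambda(tau, a0, a1, period=365.0):
    """Solve Lambda(t) = tau by bisection (Lambda increases when a0 > |a1|)."""
    lo, hi = 0.0, 1.0
    while Lambda(hi, a0, a1, period) < tau:
        hi *= 2.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if Lambda(mid, a0, a1, period) < tau:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def simulate_by_rescaling(a0, a1, t_max, rng):
    """Draw a unit-rate homogeneous process in transformed time tau and
    map each event back through the inverse time transformation."""
    events, tau = [], 0.0
    while True:
        tau += rng.expovariate(1.0)
        t = invert_Lambda(tau, a0, a1)
        if t > t_max:
            return events
        events.append(t)

rng = random.Random(4)
events = simulate_by_rescaling(0.1, 0.05, 10 * 365.0, rng)
# Expected count over ten years is Lambda(3650) ~ a0 * 3650 = 365, since
# the harmonic integrates to ~0 over whole periods.
```

    The same transformation run in reverse (mapping observed event times through Λ) is how the fitted model can be checked: the rescaled gaps should look like unit-rate exponentials in the validation period.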

  5. High-pressure homogenization associated hydrothermal process of palygorskite for enhanced adsorption of Methylene blue

    NASA Astrophysics Data System (ADS)

    Zhang, Zhifang; Wang, Wenbo; Wang, Aiqin

    2015-02-01

    Palygorskite (PAL) was modified by a high-pressure homogenization-assisted hydrothermal process. The effects of modification on the morphology, structure and physicochemical properties of PAL were systematically investigated by Field-emission scanning electron microscopy (FESEM), Transmission electron microscopy (TEM), Fourier transform infrared spectrometry (FTIR), Brunauer-Emmett-Teller (BET) analysis, X-ray diffraction (XRD) and Zeta potential analysis, and the adsorption properties were evaluated using Methylene blue (MB) as the model dye. The results revealed that the crystal bundles were disaggregated and dispersed as individual nanorods, and that the PAL nanorods became more even after the combined high-pressure homogenization and hydrothermal treatment. The intrinsic crystal structure of PAL was retained after hydrothermal treatment, and the pore size calculated by the BET method was increased. The adsorption capacity of PAL for MB was evidently improved (from 119 mg/g to 171 mg/g) after modification, and dispersing the PAL before the hydrothermal reaction is favorable to the adsorption. The desorption evaluation confirms that the modified PAL has a stronger affinity for MB, which is beneficial for fabricating a stable organic-inorganic hybrid pigment.

  6. Process for forming a homogeneous oxide solid phase of catalytically active material

    DOEpatents

    Perry, Dale L.; Russo, Richard E.; Mao, Xianglei

    1995-01-01

    A process is disclosed for forming a homogeneous oxide solid phase reaction product of catalytically active material comprising one or more alkali metals, one or more alkaline earth metals, and one or more Group VIII transition metals. The process comprises reacting together one or more alkali metal oxides and/or salts, one or more alkaline earth metal oxides and/or salts, one or more Group VIII transition metal oxides and/or salts, capable of forming a catalytically active reaction product, in the optional presence of an additional source of oxygen, using a laser beam to ablate from a target such metal compound reactants in the form of a vapor in a deposition chamber, resulting in the deposition, on a heated substrate in the chamber, of the desired oxide phase reaction product. The resulting product may be formed in variable, but reproducible, stoichiometric ratios. The homogeneous oxide solid phase product is useful as a catalyst, and can be produced in many physical forms, including thin films, particulate forms, coatings on catalyst support structures, and coatings on structures used in reaction apparatus in which the reaction product of the invention will serve as a catalyst.

  7. Creating a Standardized Process to Meet Core Measure Compliance.

    PubMed

    Kwan, Sarah; Daniels, Melodie; Ryan, Lindsey; Fields, Willa

    2015-01-01

    A standardized process to improve compliance with venous thromboembolism prophylaxis and hospital-based inpatient psychiatric services Core Measures was developed, implemented, and evaluated by a clinical nurse specialist team. The use of a 1-page tool with the requirements and supporting evidence, combined with concurrent data and feedback, ensured success of improving compliance. The initial robust process of education and concurrent and retrospective review follow-up allowed for this process to be successful. PMID:26274512

  8. 3D-ICs created using oblique processing

    NASA Astrophysics Data System (ADS)

    Burckel, D. Bruce

    2016-03-01

    This paper demonstrates that another class of three-dimensional integrated circuits (3D-ICs) exists, distinct from through silicon via centric and monolithic 3D-ICs. Furthermore, it is possible to create devices that are 3D at the device level (i.e. with active channels oriented in each of the three coordinate axes), by performing standard CMOS fabrication operations at an angle with respect to the wafer surface into high aspect ratio silicon substrates using membrane projection lithography (MPL). MPL requires only minimal fixturing changes to standard CMOS equipment, and no change to current state-of-the-art lithography. Eliminating the constraint of 2D planar device architecture enables a wide range of new interconnect topologies which could help reduce interconnect resistance/capacitance, and potentially improve performance.

  9. Creating Reflective Choreographers: The Eyes See/Mind Sees Process

    ERIC Educational Resources Information Center

    Kimbrell, Sinead

    2012-01-01

    Since 1999, when the author first started teaching creative process-based dance programs in public schools, she has struggled to find the time to teach children the basic concepts and tools of dance while teaching them to be deliberate with their choreographic choices. In this article, the author describes a process that helps students and…

  10. Parallel information processing channels created in the retina.

    PubMed

    Schiller, Peter H

    2010-10-01

    In the retina, several parallel channels originate that extract different attributes from the visual scene. This review describes how these channels arise and what their functions are. Following the introduction four sections deal with these channels. The first discusses the "ON" and "OFF" channels that have arisen for the purpose of rapidly processing images in the visual scene that become visible by virtue of either light increment or light decrement; the ON channel processes images that become visible by virtue of light increment and the OFF channel processes images that become visible by virtue of light decrement. The second section examines the midget and parasol channels. The midget channel processes fine detail, wavelength information, and stereoscopic depth cues; the parasol channel plays a central role in processing motion and flicker as well as motion parallax cues for depth perception. Both these channels have ON and OFF subdivisions. The third section describes the accessory optic system that receives input from the retinal ganglion cells of Dogiel; these cells play a central role, in concert with the vestibular system, in stabilizing images on the retina to prevent the blurring of images that would otherwise occur when an organism is in motion. The last section provides a brief overview of several additional channels that originate in the retina.

  11. Creating knowledge-driven healthcare processes with the Intelligence Continuum.

    PubMed

    Wickramasinghe, Nilmini; Schaffer, Jonathan L

    2006-01-01

    Medical science has made revolutionary changes in the past few decades. Contemporaneously, however, healthcare has made incremental changes at best. One area within healthcare that best exemplifies this is the operating room (OR). The growing discrepancy between the revolutionary changes in medicine and the minimal changes in healthcare processes leads to inefficient and ineffective healthcare delivery, and is one of, if not the, most significant contributors to the exponentially increasing costs plaguing healthcare globally. Significant quantities of data and information permeate the healthcare industry, yet the industry has not maximised this resource by fully embracing key business management processes and techniques (such as Knowledge Management (KM), data mining, Business Intelligence (BI) or Business Analytics (BA)) to realise the full value of its data/information and reengineer processes. The Intelligence Continuum (IC), a Mobius strip of sophisticated tools, techniques and processes, provides a systematic mechanism for healthcare organisations to facilitate superior clinical practice and administrative management. In this paper, the case example of the orthopaedic OR is used to illustrate the power of the IC in effecting more efficient and effective healthcare processes, thereby enabling healthcare to make evolutionary changes.

  12. Parallel information processing channels created in the retina

    PubMed Central

    Schiller, Peter H.

    2010-01-01

    In the retina, several parallel channels originate that extract different attributes from the visual scene. This review describes how these channels arise and what their functions are. Following the introduction four sections deal with these channels. The first discusses the “ON” and “OFF” channels that have arisen for the purpose of rapidly processing images in the visual scene that become visible by virtue of either light increment or light decrement; the ON channel processes images that become visible by virtue of light increment and the OFF channel processes images that become visible by virtue of light decrement. The second section examines the midget and parasol channels. The midget channel processes fine detail, wavelength information, and stereoscopic depth cues; the parasol channel plays a central role in processing motion and flicker as well as motion parallax cues for depth perception. Both these channels have ON and OFF subdivisions. The third section describes the accessory optic system that receives input from the retinal ganglion cells of Dogiel; these cells play a central role, in concert with the vestibular system, in stabilizing images on the retina to prevent the blurring of images that would otherwise occur when an organism is in motion. The last section provides a brief overview of several additional channels that originate in the retina. PMID:20876118

  13. A hybrid process combining homogeneous catalytic ozonation and membrane distillation for wastewater treatment.

    PubMed

    Zhang, Yong; Zhao, Peng; Li, Jie; Hou, Deyin; Wang, Jun; Liu, Huijuan

    2016-10-01

    A novel catalytic ozonation membrane reactor (COMR) coupling homogeneous catalytic ozonation and direct contact membrane distillation (DCMD) was developed for the treatment of refractory saline organic pollutants in wastewater. An ozonation process took place in the reactor to degrade organic pollutants, whilst the DCMD process was used to recover ionic catalysts and produce clean water. It was found that 98.6% of total organic carbon (TOC) and almost 100% of salt were removed, and almost 100% of the metal ion catalyst was recovered. TOC in the permeate water was less than 16 mg/L after 5 h operation, which was considered satisfactory as the TOC in the potassium hydrogen phthalate (KHP) feed water was as high as 1000 mg/L. Meanwhile, the membrane distillation flux in the COMR process was 49.8% higher than that in the DCMD process alone after 60 h operation. Further, scanning electron microscope images showed a smaller amount and size of contaminants on the membrane surface, indicating the mitigation of membrane fouling. The tensile strength and FT-IR spectra tests did not reveal obvious changes in the polyvinylidene fluoride membrane after 60 h operation, indicating good durability. This novel COMR hybrid process exhibited promising application prospects for saline organic wastewater treatment. PMID:27372262

  15. Creating Bespoke COTS solutions for image processing applications

    NASA Astrophysics Data System (ADS)

    Rickman, R.; Hickman, D.; Smith, M.; Page, S.; Sadler, J.

    2011-06-01

    To address the emergent needs of military and security users, a new design approach has been developed to enable the rapid development of high performance and low cost imaging and processing systems. In this paper, information about the "Bespoke COTS" design approach is presented and is illustrated using examples of systems that have been built and delivered. This approach facilitates the integration of standardised COTS components into a customised yet flexible systems architecture to realise user requirements within stringent project timescales and budgets. The paper also discusses the important area of the design trade-off space (performance, flexibility, quality, and cost) and compares the results of the Bespoke COTS approach to design solutions derived from more conventional design processes.

  16. Informativeness ratings of messages created on an AAC processing prosthesis.

    PubMed

    Bartlett, Megan R; Fink, Ruth B; Schwartz, Myrna F; Linebarger, Marcia

    2007-01-01

    BACKGROUND: SentenceShaper (SSR) is a computer program that supports spoken language production in aphasia by recording and storing the fragments that the user speaks into the microphone, making them available for playback and allowing them to be combined and integrated into larger structures (i.e., sentences and narratives). A prior study that measured utterance length and grammatical complexity in story-plot narratives produced with and without the aid of SentenceShaper demonstrated an "aided effect" in some speakers with aphasia, meaning an advantage for the narratives that were produced with the support of this communication aid (Linebarger, Schwartz, Romania, Kohn, & Stephens, 2000). The present study deviated from Linebarger et al.'s methods in key respects and again showed aided effects of SentenceShaper in persons with aphasia. AIMS: Aims were (1) to demonstrate aided effects in "functional narratives" conveying hypothetical real-life situations from a first-person perspective; (2) for the first time, to submit aided and spontaneous speech samples to listener judgements of informativeness; and (3) to produce preliminary evidence on topic-specific carryover from SentenceShaper, i.e., carryover from an aided production to a subsequent unaided production on the same topic. METHODS & PROCEDURES: Five individuals with chronic aphasia created narratives on two topics, under three conditions: Unaided (U), Aided (SSR), and Post-SSR Unaided (Post-U). The 30 samples (5 participants, 2 topics, 3 conditions) were randomised and judged for informativeness by graduate students in speech-language pathology. The method for rating was Direct Magnitude Estimation (DME). OUTCOMES & RESULTS: Repeated measures ANOVAs were performed on DME ratings for each participant on each topic. A main effect of Condition was present for four of the five participants, on one or both topics. Planned contrasts revealed that the aided effect (SSR >U) was

  17. Volume Averaging Study of the Capacitive Deionization Process in Homogeneous Porous Media

    DOE PAGES

    Gabitto, Jorge; Tsouris, Costas

    2015-05-05

    Ion storage in porous electrodes is important in applications such as energy storage by supercapacitors, water purification by capacitive deionization, extraction of energy from a salinity difference and heavy ion purification. In this paper, a model is presented to simulate the charge process in homogeneous porous media comprising big pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Transport between the electrolyte solution and the charged wall is described using the Gouy–Chapman–Stern model. The effective transport parameters for isotropic porous media are calculated solving the corresponding closure problems. Finally, the source terms that appear in the average equations are calculated using numerical computations. An alternative way to deal with the source terms is proposed.

  18. Volume Averaging Study of the Capacitive Deionization Process in Homogeneous Porous Media

    SciTech Connect

    Gabitto, Jorge; Tsouris, Costas

    2015-05-05

    Ion storage in porous electrodes is important in applications such as energy storage by supercapacitors, water purification by capacitive deionization, extraction of energy from a salinity difference and heavy ion purification. In this paper, a model is presented to simulate the charge process in homogeneous porous media comprising big pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Transport between the electrolyte solution and the charged wall is described using the Gouy–Chapman–Stern model. The effective transport parameters for isotropic porous media are calculated solving the corresponding closure problems. Finally, the source terms that appear in the average equations are calculated using numerical computations. An alternative way to deal with the source terms is proposed.
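The Gouy–Chapman–Stern description used in this model can be sketched numerically. In the minimal Python illustration below, the 10 mM concentration, the 0.05 V diffuse-layer potential and the 0.2 F/m² Stern capacitance are illustrative assumptions, not values from the study: the Debye length sets the diffuse-layer thickness, and the Stern and diffuse layers combine as capacitors in series.

```python
import math

# Physical constants (SI units)
e, kB, eps0 = 1.602176634e-19, 1.380649e-23, 8.8541878128e-12

def debye_length(c_mol_per_L, T=298.15, eps_r=78.4, z=1):
    """Debye screening length of a symmetric z:z electrolyte."""
    n0 = c_mol_per_L * 1000 * 6.02214076e23      # ion pairs per m^3
    return math.sqrt(eps_r * eps0 * kB * T / (2 * n0 * (z * e) ** 2))

def diffuse_capacitance(c_mol_per_L, phi_d, T=298.15, eps_r=78.4, z=1):
    """Gouy-Chapman diffuse-layer capacitance per unit area (F/m^2)."""
    lam = debye_length(c_mol_per_L, T, eps_r, z)
    return (eps_r * eps0 / lam) * math.cosh(z * e * phi_d / (2 * kB * T))

def gcs_capacitance(c_stern, c_diffuse):
    """Stern and diffuse layers in series (the GCS double-layer model)."""
    return 1.0 / (1.0 / c_stern + 1.0 / c_diffuse)

lam = debye_length(0.01)                    # 10 mM 1:1 electrolyte
print(round(lam * 1e9, 2))                  # → 3.04 (nm)
c_total = gcs_capacitance(0.2, diffuse_capacitance(0.01, 0.05))
print(round(c_total, 3))                    # series capacitance, F/m^2
```

Note how the series combination is dominated by the smaller of the two capacitances, which is why the Stern layer matters at high electrolyte concentration.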

  19. An empirical Bayesian and Buhlmann approach with non-homogenous Poisson process

    NASA Astrophysics Data System (ADS)

    Noviyanti, Lienda

    2015-12-01

    All general insurance companies in Indonesia have to adjust their current premium rates according to the maximum and minimum limit rates in the new regulation established by the Financial Services Authority (Otoritas Jasa Keuangan / OJK). In this research, we estimated premium rates by means of the Bayesian and the Buhlmann approaches using historical claim frequency and claim severity for five risk groups. We assumed Poisson-distributed claim frequencies and normally distributed claim severities. In particular, we used a non-homogeneous Poisson process for estimating the parameters of claim frequency. We found that the estimated premium rates are higher than the actual current rates. Relative to the OJK upper and lower limit rates, the estimates among the five risk groups vary: some fall within the interval and some fall outside it.
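The claim-frequency component above rests on a non-homogeneous Poisson process. As a sketch of how such a process can be simulated, the standard Lewis–Shedler thinning algorithm is shown below; the seasonal intensity function and its parameters are hypothetical, not taken from the paper.

```python
import math
import random

def sample_nhpp(rate, rate_max, horizon, rng):
    """Simulate event times of a non-homogeneous Poisson process on
    [0, horizon] by Lewis-Shedler thinning: draw candidates from a
    homogeneous process with rate rate_max, then accept each candidate
    at time t with probability rate(t) / rate_max."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_max)          # next candidate arrival
        if t > horizon:
            return events
        if rng.random() < rate(t) / rate_max:   # thinning (acceptance) step
            events.append(t)

# Hypothetical seasonal claim intensity, in claims per month
rate = lambda t: 5.0 + 3.0 * math.sin(2 * math.pi * t / 12.0)

rng = random.Random(42)
claims = sample_nhpp(rate, rate_max=8.0, horizon=12.0, rng=rng)
print(len(claims))   # simulated number of claims over one year (expected ~60)
```

The bound `rate_max` must dominate the intensity everywhere on the horizon; here the sinusoid peaks at exactly 8 claims per month.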

  20. CO2-assisted high pressure homogenization: a solvent-free process for polymeric microspheres and drug-polymer composites.

    PubMed

    Kluge, Johannes; Mazzotti, Marco

    2012-10-15

    The study explores the enabling role of near-critical CO(2) as a reversible plasticizer in the high pressure homogenization of polymer particles, aiming at their comminution as well as at the formation of drug-polymer composites. First, the effect of near-critical CO(2) on the homogenization of aqueous suspensions of poly lactic-co-glycolic acid (PLGA) was investigated. Applying a pressure drop of 900 bar and up to 150 passes across the homogenizer, it was found that particles processed in the presence of CO(2) were generally of microspherical morphology and at all times significantly smaller than those obtained in the absence of a plasticizer. The smallest particles, exhibiting a median x(50) of 1.3 μm, were obtained by adding a small quantity of ethyl acetate, which exerts on PLGA an additional plasticizing effect during the homogenization step. Further, the study concerns the possibility of forming drug-polymer composites through simultaneous high pressure homogenization of the two relevant solids, and particularly the effect of near-critical CO(2) on this process. Therefore, PLGA was homogenized together with crystalline S-ketoprofen (S-KET), a non-steroidal anti-inflammatory drug, at a drug to polymer ratio of 1:10, a pressure drop of 900 bar and up to 150 passes across the homogenizer. When the process was carried out in the presence of CO(2), an impregnation efficiency of 91% has been reached, corresponding to 8.3 wt.% of S-KET in PLGA; moreover, composite particles were of microspherical morphology and significantly smaller than those obtained in the absence of CO(2). The formation of drug-polymer composites through simultaneous homogenization of the two materials is thus greatly enhanced by the presence of CO(2), which increases the efficiency for both homogenization and impregnation.

  1. Creating a national citizen engagement process for energy policy

    PubMed Central

    Pidgeon, Nick; Demski, Christina; Butler, Catherine; Parkhill, Karen; Spence, Alexa

    2014-01-01

    This paper examines some of the science communication challenges involved when designing and conducting public deliberation processes on issues of national importance. We take as our illustrative case study a recent research project investigating public values and attitudes toward future energy system change for the United Kingdom. National-level issues such as this are often particularly difficult to engage the public with because of their inherent complexity, derived from multiple interconnected elements and policy frames, extended scales of analysis, and different manifestations of uncertainty. With reference to the energy system project, we discuss ways of meeting a series of science communication challenges arising when engaging the public with national topics, including the need to articulate systems thinking and problem scale, to provide balanced information and policy framings in ways that open up spaces for reflection and deliberation, and the need for varied methods of facilitation and data synthesis that permit access to participants’ broader values. Although resource intensive, national-level deliberation is possible and can produce useful insights both for participants and for science policy. PMID:25225393

  2. Creating a national citizen engagement process for energy policy.

    PubMed

    Pidgeon, Nick; Demski, Christina; Butler, Catherine; Parkhill, Karen; Spence, Alexa

    2014-09-16

    This paper examines some of the science communication challenges involved when designing and conducting public deliberation processes on issues of national importance. We take as our illustrative case study a recent research project investigating public values and attitudes toward future energy system change for the United Kingdom. National-level issues such as this are often particularly difficult to engage the public with because of their inherent complexity, derived from multiple interconnected elements and policy frames, extended scales of analysis, and different manifestations of uncertainty. With reference to the energy system project, we discuss ways of meeting a series of science communication challenges arising when engaging the public with national topics, including the need to articulate systems thinking and problem scale, to provide balanced information and policy framings in ways that open up spaces for reflection and deliberation, and the need for varied methods of facilitation and data synthesis that permit access to participants' broader values. Although resource intensive, national-level deliberation is possible and can produce useful insights both for participants and for science policy.

  4. People Create Health: Effective Health Promotion is a Creative Process

    PubMed Central

    Cloninger, C. Robert; Cloninger, Kevin M.

    2015-01-01

    Effective health promotion involves the creative cultivation of physical, mental, social, and spiritual well-being. Efforts at health promotion produce weak and inconsistent benefits when they do not engage people to express their own goals and values. Likewise, health promotion has been ineffective when it relies only on instruction about facts regarding a healthy lifestyle, or focuses on reduction of disease rather than the cultivation of well-being. Meta-analysis of longitudinal studies and experimental interventions shows that improvements in subjective well-being lead to short-term and long-term reductions in medical morbidity and mortality, as well as to healthier functioning and longevity. However, these effects are inconsistent and weak (correlations of about 0.15). The most consistent and strong predictor of both subjective well-being and objective health status in longitudinal studies is a creative personality profile characterized by being highly self-directed, cooperative, and self-transcendent. There is a synergy among these personality traits that enhances all aspects of the health and happiness of people. Experimental interventions to cultivate this natural creative potential of people are now just beginning, but available exploratory research has shown that creativity can be enhanced and that the changes are associated with widespread and profound benefits, including greater physical, mental, social, and spiritual well-being. In addition to benefits mediated by choice of diet, physical activity, and health care utilization, the effect of a creative personality on health may be partly mediated by effects on the regulation of heart rate variability. Creativity promotes autonomic balance with parasympathetic dominance, leading to a calm alert state that promotes an awakening of plasticities and intelligences that stress inhibits. We suggest that health, happiness, and meaning can be cultivated by a complex adaptive process that enhances healthy functioning

  5. Effective inactivation of Saccharomyces cerevisiae in minimally processed Makgeolli using low-pressure homogenization-based pasteurization.

    PubMed

    Bak, Jin Seop

    2015-01-01

    In order to address the limitations associated with the inefficient pasteurization platform used to make Makgeolli, such as the presence of turbid colloidal dispersions in suspension, commercially available Makgeolli was minimally processed using a low-pressure homogenization-based pasteurization (LHBP) process. This continuous process demonstrates that promptly reducing the exposure of either large molecules or insoluble particles to excessive heat can dramatically improve internal quality and decrease irreversible damage. Specifically, colloidal stability improved concomitantly with optimal homogenization (65.0% of maximum, with particles below 25 μm) following two repetitions at 25.0 MPa. However, biochemical parameters such as microbial population, acidity, and the presence of fermentable sugars rarely affected Makgeolli quality. Remarkably, there was a 4.5-log reduction in the number of Saccharomyces cerevisiae target cells at 53.5°C for 70 sec in optimally homogenized Makgeolli, higher than the 37.7% reduction measured in traditionally pasteurized Makgeolli. In contrast to the analytical similarity among homogenized Makgeollis, our objective quality evaluation demonstrated significant differences between pasteurized (or unpasteurized) Makgeolli and LHBP-treated Makgeolli. Keywords: low-pressure homogenization-based pasteurization; Makgeolli; minimal processing-preservation; Saccharomyces cerevisiae; suspension stability.

  6. Kappa Distribution in a Homogeneous Medium: Adiabatic Limit of a Super-diffusive Process?

    NASA Astrophysics Data System (ADS)

    Roth, I.

    2015-12-01

    The classical statistical theory predicts that an ergodic, weakly interacting system, like charged particles in the presence of electromagnetic fields performing Brownian motions (characterized by small-range deviations in phase space and short-term microscopic memory), converges to the Gibbs-Boltzmann statistics. Observation of distributions with kappa power-law tails in homogeneous systems contradicts this prediction and necessitates a renewed analysis of the basic axioms of the diffusion process: the characteristics of the transition probability density function (pdf) for a single interaction, with the possibility of a non-Markovian process and non-local interactions. The non-local, Levy-walk deviation is related to the non-extensive statistical framework. Particles bouncing along (solar) magnetic field lines with evolving pitch angles, phases and velocities, as they interact resonantly with waves, undergo energy changes at undetermined time intervals, satisfying these postulates. The dynamic evolution of a general continuous-time random walk is determined by the pdfs of jumps and waiting times, resulting in a fractional Fokker-Planck equation with non-integer derivatives whose solution is given by a Fox H-function. The resulting procedure involves fractional calculus, well known although not frequently used in physics, while a local, Markovian process recasts the evolution into the standard Fokker-Planck equation. Solving the fractional Fokker-Planck equation with the help of the Mellin transform and evaluating the residues at the poles of its Gamma functions results in a slowly converging sum with power laws. It is suggested that these tails form the kappa function. Gradual vs impulsive solar electron distributions serve as prototypes of this description.
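The heavy, non-Maxwellian tail that distinguishes a kappa distribution can be illustrated by comparing an (unnormalized) kappa profile against a Maxwellian with the same thermal speed; the choice κ = 3 and the unit thermal speed below are illustrative assumptions, not values from the abstract.

```python
import math

def kappa_pdf(v, kappa, theta):
    """1-D kappa (generalized Lorentzian) profile, unnormalized.
    As kappa -> infinity this approaches the Maxwellian exp(-v^2/theta^2)."""
    return (1.0 + v * v / (kappa * theta * theta)) ** (-(kappa + 1.0))

def maxwellian(v, theta):
    """Unnormalized Maxwellian with thermal speed theta."""
    return math.exp(-v * v / (theta * theta))

theta = 1.0
for v in (1.0, 3.0, 5.0):
    ratio = kappa_pdf(v, kappa=3.0, theta=theta) / maxwellian(v, theta)
    print(v, round(ratio, 3))   # ratio grows rapidly with v: power law vs exponential
```

At five thermal speeds the kappa tail exceeds the Maxwellian by many orders of magnitude, which is why suprathermal particle populations are poorly fit by Gibbs-Boltzmann statistics.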

  7. An analysis of the trend in ground-level ozone using non-homogeneous poisson processes

    NASA Astrophysics Data System (ADS)

    Shively, Thomas S.

    This paper provides a method for measuring the long-term trend in the frequency with which ground-level ozone present in the ambient air exceeds the U.S. Environmental Protection Agency's National Ambient Air Quality Standard (NAAQS) for ozone. A major weakness of previous studies that estimate the long-term trend in the very high values of ozone, and therefore the long-term trend in the probability of satisfying the NAAQS for ozone, is their failure to account for the confounding effects of meteorological conditions on ozone levels. Meteorological variables such as temperature, wind speed, and frontal passage play an important role in the formation of ground-level ozone. A non-homogeneous Poisson process is used to account for the relationship between very high values of ozone and meteorological conditions. This model provides an estimate of the trend in the ozone values after allowing for the effects of meteorological conditions. Therefore, this model provides a means to measure the effectiveness of pollution control programs after accounting for the effects of changing weather conditions. When our approach is applied to data collected at two sites in Houston, TX, we find evidence of a gradual long-term downward trend in the frequency of high values of ozone. The empirical results indicate how possibly misleading results can be obtained if the analysis does not account for changing weather conditions.
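A toy version of such a covariate-adjusted exceedance-rate model can make the idea concrete. In the sketch below, the log-linear intensity, its coefficients and the synthetic temperature series are illustrative assumptions (not the paper's fitted model); holding the weather inputs fixed across years isolates the long-term trend factor exp(beta_trend * Δt).

```python
import math

# Assumed log-linear intensity for daily NAAQS exceedances:
# a long-term trend term plus a temperature effect.
beta0, beta_trend, beta_temp = -4.0, -0.05, 0.08   # hypothetical coefficients

def intensity(day, temp_c):
    """Expected exceedances per day; the exp link keeps the rate positive."""
    years_elapsed = day / 365.0
    return math.exp(beta0 + beta_trend * years_elapsed + beta_temp * temp_c)

def expected_exceedances(year, temps):
    """Sum the daily intensity over one year of daily temperatures."""
    return sum(intensity(year * 365 + d, t) for d, t in enumerate(temps))

# Identical synthetic weather in both years isolates the trend effect.
temps = [25 + 10 * math.sin(2 * math.pi * d / 365) for d in range(365)]
e0 = expected_exceedances(0, temps)
e5 = expected_exceedances(5, temps)
print(round(e5 / e0, 3))   # → 0.779, i.e. exactly exp(beta_trend * 5)
```

With the same weather in both years, every daily term in year 5 is the year-0 term times exp(beta_trend * 5), so the ratio of expected exceedance counts recovers the pure trend, which is the quantity of interest for judging pollution control programs.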

  8. Aflatoxin M1 in milk powders: processing, homogeneity and stability testing of certified reference materials.

    PubMed

    Josephs, R D; Ulberth, F; Van Egmond, H P; Emons, H

    2005-09-01

    As part of the certification campaign for three candidate reference materials for the determination of aflatoxin M1 (AfM1) in whole milk powders, homogeneity and short- and long-term stability tests of naturally contaminated milk powders were performed. The homogeneity of two AfM1-contaminated milk powders was studied by taking samples at regular intervals of the filling sequences, using random stratified sampling schemes, and analysing them in triplicate for their AfM1 contents by liquid chromatography with fluorescence detection (LC-FLD). The homogeneity testing of an AfM1 'blank' milk powder material was performed by determining the nitrogen content, because AfM1 levels were below the limit of detection of the most sensitive determination method. The short-term stability of the AfM1-contaminated milk powders was evaluated at three different storage temperatures (4, 18 and 40 degrees C). After storage times of 0, 1, 2 and 4 weeks, samples were analysed using LC-FLD. The long-term stability study comprised LC-FLD measurements 0, 6, 12 and 18 months after storage at -20 and 4 degrees C. Based on the homogeneity tests, the materials were sufficiently homogeneous to serve as certified reference materials, with corresponding uncertainty contributions of 0.23-0.89% calculated for the homogeneity. The stability measurements showed no significant trends in either the short- or the long-term stability studies. The long-term stability uncertainties of the AfM1-contaminated milk powders were 7.4 and 6.3%, respectively, for a shelf-life of 6 years and storage at -20 degrees C. Supplementary stability monitoring schemes over a period of several years are currently ongoing.
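The between-bottle homogeneity uncertainty quoted above is conventionally obtained from a one-way ANOVA of the replicate results (the ISO Guide 35 approach). The sketch below uses invented triplicate AfM1 readings for a hypothetical six-bottle design; the data are not from the study.

```python
import math
import statistics

def homogeneity_uncertainty(groups):
    """Relative between-unit standard uncertainty (%) from one-way ANOVA,
    as in ISO Guide 35: u_bb^2 = (MS_between - MS_within) / n_replicates."""
    k, n = len(groups), len(groups[0])
    values = [v for g in groups for v in g]
    grand = statistics.mean(values)
    ms_between = n * sum((statistics.mean(g) - grand) ** 2
                         for g in groups) / (k - 1)
    ms_within = sum((v - statistics.mean(g)) ** 2
                    for g in groups for v in g) / (k * (n - 1))
    u_bb = math.sqrt(max(ms_between - ms_within, 0.0) / n)
    return 100.0 * u_bb / grand

# Invented triplicate AfM1 results (ng/kg) for six bottles
# sampled across the filling run
bottles = [
    [50.1, 49.8, 50.3], [49.9, 50.2, 50.0], [50.4, 50.1, 50.2],
    [49.7, 50.0, 49.9], [50.2, 50.3, 50.1], [49.8, 49.9, 50.1],
]
print(round(homogeneity_uncertainty(bottles), 2))   # → 0.21 (%)
```

The `max(..., 0.0)` guard handles the common case where within-bottle scatter exceeds between-bottle scatter, in which case the between-unit contribution is taken as zero (or replaced by a detection-limit estimate).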

  9. First-Principles Molecular Dynamics Studies of Organometallic Complexes and Homogeneous Catalytic Processes.

    PubMed

    Vidossich, Pietro; Lledós, Agustí; Ujaque, Gregori

    2016-06-21

    Computational chemistry is a valuable aid to complement experimental studies of organometallic systems and their reactivity. It allows probing mechanistic hypotheses and investigating molecular structures, shedding light on the behavior and properties of molecular assemblies at the atomic scale. When approaching a chemical problem, the computational chemist has to decide on the theoretical approach needed to describe electron/nuclear interactions and the composition of the model used to approximate the actual system. Both factors determine the reliability of the modeling study. The community has dedicated much effort to developing and improving the performance and accuracy of theoretical approaches for electronic structure calculations, on which the description of (inter)atomic interactions relies. Here, the importance of the model system used in computational studies is highlighted through examples from our recent research focused on organometallic systems and homogeneous catalytic processes. We show how the inclusion of explicit solvent allows the characterization of molecular events that would otherwise not be accessible in reduced model systems (clusters). These include the stabilization of nascent charged fragments via microscopic solvation (notably, hydrogen bonding), transfer of charge (protons) between distant fragments mediated by solvent molecules, and solvent coordination to unsaturated metal centers. Furthermore, when weak interactions are involved, we show how conformational and solvation properties of organometallic complexes are also affected by the explicit inclusion of solvent molecules. Such extended model systems may be treated under periodic boundary conditions, thus removing the cluster/continuum (or vacuum) boundary, and require a statistical mechanics simulation technique to sample the accessible configurational space. First-principles molecular dynamics, in which atomic forces are computed from electronic structure calculations (namely, density functional theory), is one such technique.

  10. Efficacy of low-temperature high hydrostatic pressure processing in inactivating Vibrio parahaemolyticus in culture suspension and oyster homogenate.

    PubMed

    Phuvasate, Sureerat; Su, Yi-Cheng

    2015-03-01

    Culture suspensions of five clinical and five environmental Vibrio parahaemolyticus strains in 2% NaCl solution were subjected to high pressure processing (HPP) under various conditions (200-300 MPa for 5 and 10 min at 1.5-20°C) to study differences in pressure resistance among the strains. The most pressure-resistant and pressure-sensitive strains were selected to investigate the effects of low temperatures (15, 5 and 1.5°C) on HPP (200 or 250 MPa for 5 min) to inactivate V. parahaemolyticus in sterile oyster homogenates. Inactivation of V. parahaemolyticus cells in culture suspensions and oyster homogenates was greatly enhanced by lowering the processing temperature from 15 to 5 or 1.5°C. A treatment of oyster homogenates at 250 MPa for 5 min at 5°C decreased the populations of V. parahaemolyticus by 6.2 log CFU/g for strains 10290 and 100311Y11 and by >7.4 log CFU/g for strain 10292. Decreasing the processing temperature of the same treatment to 1.5°C reduced all the V. parahaemolyticus strains inoculated into oyster homogenates to non-detectable (<10 CFU/g) levels. Factors including pressure level, processing temperature and time all need to be considered for developing effective HPP for eliminating pathogens from foods. Further studies are needed to validate the efficacy of the HPP (250 MPa for 5 min at 1.5°C) in inactivating V. parahaemolyticus cells in whole oysters.
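
    The reductions reported above are log10 ratios of viable counts before and after treatment. A trivial helper, with invented counts for illustration (the study's raw counts are not given in the abstract):

```python
import math

def log_reduction(n0, n):
    """log10 reduction in viable counts (e.g. CFU/g before vs. after treatment)."""
    return math.log10(n0 / n)

# hypothetical counts, not from the study: 2.5e7 CFU/g before HPP, 16 CFU/g after
reduction = log_reduction(2.5e7, 16.0)
```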

  11. Regional Homogeneity

    PubMed Central

    Jiang, Lili; Zuo, Xi-Nian

    2015-01-01

    Much effort has been made to understand the organizational principles of human brain function using functional magnetic resonance imaging (fMRI) methods, among which resting-state fMRI (rfMRI) is an increasingly recognized technique for measuring the intrinsic dynamics of the human brain. Functional connectivity (FC) with rfMRI is the most widely used method to describe remote or long-distance relationships in studies of cerebral cortex parcellation, interindividual variability, and brain disorders. In contrast, local or short-distance functional interactions, especially at a scale of millimeters, have rarely been investigated or systematically reviewed like remote FC, although some local FC algorithms have been developed and applied to the discovery of brain-based changes under neuropsychiatric conditions. To fill this gap between remote and local FC studies, this review will (1) briefly survey the history of studies on organizational principles of human brain function; (2) propose local functional homogeneity as a network centrality to characterize multimodal local features of the brain connectome; (3) render a neurobiological perspective on local functional homogeneity by linking its temporal, spatial, and individual variability to information processing, anatomical morphology, and brain development; and (4) discuss its role in performing connectome-wide association studies and identify relevant challenges, and recommend its use in future brain connectomics studies. PMID:26170004

  12. Effects of non-homogeneous flow on ADCP data processing in a hydroturbine forebay

    DOE PAGES

    Harding, S. F.; Richmond, M. C.; Romero-Gomez, P.; Serkowski, J. A.

    2016-01-02

    Accurate modeling of the velocity field in the forebay of a hydroelectric power station is important for both power generation and fish passage, and such fields can be increasingly well represented by computational fluid dynamics (CFD) simulations. Acoustic Doppler Current Profilers (ADCPs) are investigated herein as a method of validating the numerical flow solutions, particularly in observed and calculated regions of non-homogeneous flow velocity. By using a numerical model of an ADCP operating in a velocity field calculated using CFD, the errors due to the spatial variation of the flow velocity are quantified. The numerical model of the ADCP is referred to herein as a Virtual ADCP (VADCP).

  13. The Challenges of Creating a Benchmarking Process for Administrative and Support Services

    ERIC Educational Resources Information Center

    Manning, Terri M.

    2007-01-01

    In the current climate of emphasis on outcomes assessment, colleges and universities are working diligently to create assessment processes for student learning outcomes, competence in general education, student satisfaction with services, and electronic tracking media to document evidence of competence in graduates. Benchmarking has become a…

  14. Study on rheo-diecasting process of 7075R alloys by SA-EMS melt homogenized treatment

    NASA Astrophysics Data System (ADS)

    Zhihua, G.; Jun, X.; Zhifeng, Z.; Guojun, L.; Mengou, T.

    2016-03-01

    An advanced melt processing technology, spiral annular electromagnetic stirring (SA-EMS), based on the annular electromagnetic stirring (A-EMS) process, was developed for manufacturing Al-alloy components with high integrity. The SA-EMS process innovatively combines non-contact electromagnetic stirring with a spiral annular chamber with specially designed profiles to make high-quality melt slurry in situ; intensive forced shearing can be achieved under a high shear rate and a high intensity of turbulence inside the spiral annular chamber. In this paper, the solidification microstructure and hardness of a 7075R alloy die-cast connecting rod conditioned by the SA-EMS melt processing technology were investigated. The results indicate that the SA-EMS melt processing technology exhibited superior grain refinement and remarkable structural homogeneity. In addition, it can evidently enhance mechanical performance and reduce the crack tendency.

  15. Comparing Non-homogeneous Gaussian Regression and Bayesian Model Averaging for post-processing hydrological ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Smith, Paul; Pappenberger, Florian

    2015-04-01

    Bayesian Model Averaging and Non-homogeneous Gaussian Regression have been proposed as techniques for post-processing ensemble forecasts into predictive probability distributions. Both methods make use of past forecast data for which observations are available to propose weights for the ensemble members along with bias and dispersion corrections. The mathematical basis and application of these methods though differs significantly. In this work we contrast the forecast results derived using these methods within the European Flood Awareness System, an operational flood forecasting system covering Europe. The performance of the different methods at lead times up to 15 days is compared at multiple sites and for notable flood events.
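
    Non-homogeneous Gaussian Regression issues a predictive normal distribution whose mean and variance are affine functions of the ensemble mean and variance, with coefficients chosen to minimize the continuous ranked probability score (CRPS) over past forecast-observation pairs. The sketch below uses the closed-form CRPS of a Gaussian; the regression coefficients are placeholders, not fitted values:

```python
import math
import statistics

def gaussian_crps(mu, sigma, y):
    """Closed-form CRPS of a N(mu, sigma^2) forecast against observation y:
    sigma * (z*(2*Phi(z) - 1) + 2*phi(z) - 1/sqrt(pi)), z = (y - mu)/sigma."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

def ngr_predictive(ensemble, a=0.1, b=1.0, c=0.05, d=1.2):
    """NGR predictive N(a + b*mean, c + d*var); a, b, c, d are hypothetical
    placeholders that would normally be fitted by CRPS minimization."""
    m = statistics.mean(ensemble)
    v = statistics.variance(ensemble)
    return a + b * m, math.sqrt(c + d * v)
```

    In an operational setting the training step would minimize the mean of `gaussian_crps` over a historical archive, which is where NGR and Bayesian Model Averaging differ most in practice.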

  16. Porcine liver decellularization under oscillating pressure conditions: a technical refinement to improve the homogeneity of the decellularization process.

    PubMed

    Struecker, Benjamin; Hillebrandt, Karl Herbert; Voitl, Robert; Butter, Antje; Schmuck, Rosa B; Reutzel-Selke, Anja; Geisel, Dominik; Joehrens, Korinna; Pickerodt, Philipp A; Raschzok, Nathanael; Puhl, Gero; Neuhaus, Peter; Pratschke, Johann; Sauer, Igor M

    2015-03-01

    Decellularization and recellularization of parenchymal organs may facilitate the generation of autologous functional liver organoids by repopulation of decellularized porcine liver matrices with induced liver cells. We present an accelerated (7 h overall perfusion time) and effective protocol for human-scale liver decellularization by pressure-controlled perfusion with 1% Triton X-100 and 1% sodium dodecyl sulfate via the hepatic artery (120 mmHg) and portal vein (60 mmHg). In addition, we analyzed the effect of oscillating pressure conditions on pig liver decellularization (n=19). The proprietary perfusion device used to generate these pressure conditions mimics intra-abdominal conditions during respiration to optimize microperfusion within livers and thus optimize the homogeneity of the decellularization process. The efficiency of perfusion decellularization was analyzed by macroscopic observation, histological staining (hematoxylin and eosin [H&E], Sirius red, and alcian blue), immunohistochemical staining (collagen IV, laminin, and fibronectin), and biochemical assessment (DNA, collagen, and glycosaminoglycans) of decellularized liver matrices. The integrity of the extracellular matrix (ECM) postdecellularization was visualized by corrosion casting and three-dimensional computed tomography scanning. We found that livers perfused under oscillating pressure conditions (P(+)) showed a more homogenous course of decellularization and contained less DNA compared with livers perfused without oscillating pressure conditions (P(-)). Microscopically, livers from the (P(-)) group showed remnant cell clusters, while no cells were found in livers from the (P(+)) group. The grade of disruption of the ECM was higher in livers from the (P(-)) group, although the perfusion rates and pressure did not significantly differ. Immunohistochemical staining revealed that important matrix components were still present after decellularization. Corrosion casting showed an intact

  17. Challenges in modelling homogeneous catalysis: new answers from ab initio molecular dynamics to the controversy over the Wacker process.

    PubMed

    Stirling, András; Nair, Nisanth N; Lledós, Agustí; Ujaque, Gregori

    2014-07-21

    We present here a review of the mechanistic studies of the Wacker process stressing the long controversy about the key reaction steps. We give an overview of the previous experimental and theoretical studies on the topic. Then we describe the importance of the most recent Ab Initio Molecular Dynamics (AIMD) calculations in modelling organometallic reactivity in water. As a prototypical example of homogeneous catalytic reactions, the Wacker process poses serious challenges to modelling. The adequate description of the multiple role of the water solvent is very difficult by using static quantum chemical approaches including cluster and continuum solvent models. In contrast, such reaction systems are suitable for AIMD, and by combining with rare event sampling techniques, the method provides reaction mechanisms and the corresponding free energy profiles. The review also highlights how AIMD has helped to obtain a novel understanding of the mechanism and kinetics of the Wacker process.

  18. Dense and Homogeneous Compaction of Fine Ceramic and Metallic Powders: High-Speed Centrifugal Compaction Process

    SciTech Connect

    Suzuki, Hiroyuki Y.

    2008-02-15

    High-Speed Centrifugal Compaction Process (HCP) is a variation of the colloidal compacting method in which the powders sediment under a huge centrifugal force. The compacting mechanism of HCP differs from that of conventional colloidal processes such as slip casting. This unique compacting mechanism leads to a number of advantageous characteristics, such as a higher compacting speed, wide applicability for net-shape formation, and a flawless microstructure in the green compacts. However, HCP also has several drawbacks that must be overcome to realize the process's full potential.

  19. Homogeneity Pursuit

    PubMed Central

    Ke, Tracy; Fan, Jianqing; Wu, Yichao

    2014-01-01

    This paper explores the homogeneity of coefficients in high-dimensional regression, which extends the sparsity concept and is more general and suitable for many applications. Homogeneity arises when regression coefficients corresponding to neighboring geographical regions or a similar cluster of covariates are expected to be approximately the same. Sparsity corresponds to a special case of homogeneity with a large cluster of known atom zero. In this article, we propose a new method called clustering algorithm in regression via data-driven segmentation (CARDS) to explore homogeneity. New mathematical results are provided on the gain that can be achieved by exploring homogeneity. Statistical properties of two versions of CARDS are analyzed. In particular, the asymptotic normality of our proposed CARDS estimator is established, which reveals better estimation accuracy for homogeneous parameters than that obtained without homogeneity exploration. When our methods are combined with sparsity exploration, further efficiency can be achieved beyond the exploration of sparsity alone. This provides additional insight into the power of exploring low-dimensional structures in high-dimensional regression: homogeneity and sparsity. Our results also shed light on the properties of the fused Lasso. The newly developed method is further illustrated by simulation studies and applications to real data. Supplementary materials for this article are available online. PMID:26085701
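
    The idea of pooling nearly equal neighboring coefficients can be conveyed by a toy one-pass segmentation. This is a crude stand-in for CARDS, not the actual algorithm, and the tolerance `tol` is an invented tuning parameter:

```python
def merge_homogeneous(coefs, tol):
    """Toy homogeneity exploration: scan ordered coefficient estimates,
    pool adjacent values whose gap is below tol, and replace each pooled
    run with its mean (averaging within a run reduces estimation variance,
    which is the intuition behind exploring homogeneity)."""
    runs, current = [], [coefs[0]]
    for c in coefs[1:]:
        if abs(c - current[-1]) < tol:
            current.append(c)
        else:
            runs.append(current)
            current = [c]
    runs.append(current)
    out = []
    for run in runs:
        m = sum(run) / len(run)
        out.extend([m] * len(run))
    return out
```

    The real method chooses the segmentation in a data-driven way rather than via a fixed tolerance, and comes with the asymptotic guarantees described above.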

  20. Polymer powder processing of cryomilled polycaprolactone for solvent-free generation of homogeneous bioactive tissue engineering scaffolds.

    PubMed

    Lim, Jing; Chong, Mark Seow Khoon; Chan, Jerry Kok Yen; Teoh, Swee-Hin

    2014-06-25

    Synthetic polymers used in tissue engineering require functionalization with bioactive molecules to elicit specific physiological reactions. These additives must be homogeneously dispersed in order to achieve enhanced composite mechanical performance and a uniform cellular response. This work demonstrates the use of a solvent-free powder processing technique to form osteoinductive scaffolds from cryomilled polycaprolactone (PCL) and tricalcium phosphate (TCP). Cryomilling is performed to achieve a micrometer-sized distribution of PCL and reduce melt viscosity, thus improving TCP distribution and structural integrity. A breakthrough is achieved in the successful fabrication of 70 weight percent TCP into a continuous film structure. Following compaction and melting, PCL/TCP composite scaffolds are found to display uniform distribution of TCP throughout the PCL matrix regardless of composition. Homogeneous spatial distribution is also achieved in fabricated 3D scaffolds. When seeded onto powder-processed PCL/TCP films, mesenchymal stem cells are found to undergo robust and uniform osteogenic differentiation, indicating the potential of this approach for biofunctionalizing scaffolds for tissue engineering applications.

  1. The influence of melting process and parameters on the structure and homogeneity of titanium-tantalum alloys

    SciTech Connect

    Dunn, P.S.; Korzewka, D.; Garcia, F.; Damkroger, B.K.; Van Den Avyle, J.A.; Tissot, R.G.

    1995-12-31

    Alloys of titanium with refractory metals are attractive materials for applications requiring high temperature strength and corrosion resistance. However, the widely different characteristics of the component elements have made it difficult to produce sound, compositionally homogeneous ingots using traditional melting techniques. This is particularly critical because the compositional ranges spanned by the micro- and macrosegregation in these systems can easily encompass a number of microconstituents which are detrimental to mechanical properties. This paper presents results of a study of plasma (PAM) and vacuum-arc (VAR) melting of a 60 wt% tantalum, 40 wt% titanium binary alloy. The structural and compositional homogeneity of both PAM consolidated + PAM remelted and PAM consolidated + VAR remelted ingots were characterized and compared using optical and electron microscopy and x-ray fluorescence microanalysis. Additionally, the effects of melting parameters, including melt rate and magnetic stirring, were studied. Results indicate that PAM remelting achieves more complete dissolution of the starting electrode, due to greater local superheat, than does VAR remelting. PAM remelting also produces a finer as-solidified grain structure, due to the smaller molten pool and shorter local solidification times. Conversely, VAR remelting produces an ingot with a more uniform macrostructure, due to the more stable movement of the solidification interface and more uniform material feed rate. Based on these results, a three-step process of PAM consolidation, followed by a PAM intermediate melt and a VAR final melt, has been selected for further development of the alloy and processing sequence.

  2. Homogenous VUV advanced oxidation process for enhanced degradation and mineralization of antibiotics in contaminated water.

    PubMed

    Pourakbar, Mojtaba; Moussavi, Gholamreza; Shekoohiyan, Sakine

    2016-03-01

    This study aimed to evaluate the degradation and mineralization of amoxicillin (AMX) using the VUV advanced oxidation process. The effects of pH, AMX initial concentration, the presence of water ingredients, and HRT, as well as the mineralization level achieved by the VUV process, were taken into consideration. In order to make a direct comparison, the tests were also performed with UVC radiation. The results show that the degradation of AMX followed first-order kinetics. Direct photolysis by UVC degraded 50 mg/L of AMX in 50 min, while the VUV process required 3 min. The removal efficiency of the VUV process was directly influenced by the pH of the solution, and higher removal rates were achieved at high pH values. The results also show that 10 mg/L of AMX was completely degraded and mineralized within 50 s and 100 s, respectively, indicating that the AMX was completely destructed into non-hazardous materials. Operating the photoreactor in continuous-flow mode revealed that 10 mg/L AMX was completely degraded and mineralized at HRT values of 120 s and 300 s. It was concluded that the VUV advanced process is an efficient and viable technique for the degradation and mineralization of water contaminated by antibiotics. PMID:26669695
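
    Because the degradation follows first-order kinetics, rate constants can be backed out from the reported removal times and compared directly. The sketch below assumes 99% removal as the endpoint, which the abstract does not state explicitly:

```python
import math

def first_order_k(c0, c, t):
    """Rate constant k from C = C0 * exp(-k*t), i.e. k = ln(C0/C) / t."""
    return math.log(c0 / c) / t

# hypothetical endpoint: 99% removal of a 50 mg/L solution
k_uvc = first_order_k(50.0, 0.5, 50.0)   # per minute, UVC, 50 min
k_vuv = first_order_k(50.0, 0.5, 3.0)    # per minute, VUV, 3 min
half_life_vuv = math.log(2) / k_vuv      # minutes
```

    Under this assumption the VUV rate constant is 50/3, roughly 17 times, larger than the UVC one, matching the qualitative comparison in the abstract.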

  3. Quality function deployment: A customer-driven process to create and deliver value. Final report

    SciTech Connect

    George, S.S.

    1994-12-01

    Quality function deployment (QFD) is a team-oriented decision-making process used by more than 100 US businesses and industries to develop new products and marketing strategies. This report provides a detailed description of QFD and case study examples of how electric utilities can apply QFD principles in creating successful marketing and demand-side management (DSM) programs. The five-stage QFD process involves identifying customer needs and using this information to systematically develop program features, implementation activities, management procedures, and evaluation plans. QFD is not a deterministic model that provides answers, but a flexible, pragmatic tool for systematically organizing and communicating information to help utilities make better decisions.

  4. Development of a reference material for Staphylococcus aureus enterotoxin A in cheese: feasibility study, processing, homogeneity and stability assessment.

    PubMed

    Zeleny, R; Emteborg, H; Charoud-Got, J; Schimmel, H; Nia, Y; Mutel, I; Ostyn, A; Herbin, S; Hennekinne, J-A

    2015-02-01

    Staphylococcal food poisoning is caused by enterotoxins excreted into foods by strains of staphylococci. Commission Regulation 1441/2007 specifies thresholds for the presence of these toxins in foods. In this article we report on progress towards reference materials (RMs) for Staphylococcal enterotoxin A (SEA) in cheese. RMs are crucial to enforce legislation and to implement and safeguard reliable measurements. First, a feasibility study revealed a suitable processing procedure for cheese powders: the blank material was prepared by cutting, grinding, freeze-drying and milling. For the spiked material, a cheese-water slurry was spiked with SEA solution, freeze-dried and diluted with blank material to the desired SEA concentration. Thereafter, batches of three materials (blank; two SEA concentrations) were processed. The materials were shown to be sufficiently homogeneous, and storage at ambient temperature for 4 weeks did not indicate degradation. These results provide the basis for the development of an RM for SEA in cheese. PMID:25172706

  5. Multiple-pass high-pressure homogenization of milk for the development of pasteurization-like processing conditions.

    PubMed

    Ruiz-Espinosa, H; Amador-Espejo, G G; Barcenas-Pozos, M E; Angulo-Guerrero, J O; Garcia, H S; Welti-Chanes, J

    2013-02-01

    Multiple-pass ultrahigh pressure homogenization (UHPH) was used to reduce the microbial populations of both the indigenous spoilage microflora in whole raw milk and a baroresistant pathogen (Staphylococcus aureus) inoculated into whole sterile milk, in order to define pasteurization-like processing conditions. Response surface methodology was followed, and multiple-response optimization of the UHPH operating pressure (OP) (100, 175, 250 MPa) and number of passes (N) (1-5) was conducted through overlaid contour plot analysis. Increasing OP and N had a significant effect (P < 0.05) on the microbial reduction of both the spoilage microflora and Staph. aureus in milk. Optimized UHPH processes (five 202-MPa passes; four 232-MPa passes) defined a region where a 5-log10 reduction of the total bacterial count of milk and of a baroresistant pathogen is attainable, as a requisite parameter for establishing an alternative method of pasteurization. Multiple-pass UHPH optimized conditions might help in producing safe milk without the detrimental effects associated with thermal pasteurization.

  6. New American Cancer Society process for creating trustworthy cancer screening guidelines.

    PubMed

    Brawley, Otis; Byers, Tim; Chen, Amy; Pignone, Michael; Ransohoff, David; Schenk, Maryjean; Smith, Robert; Sox, Harold; Thorson, Alan G; Wender, Richard

    2011-12-14

    Guidelines for cancer screening written by different organizations often differ, even when they are based on the same evidence. Those dissimilarities can create confusion among health care professionals, the general public, and policy makers. The Institute of Medicine (IOM) recently released 2 reports to establish new standards for developing more trustworthy clinical practice guidelines and conducting systematic evidence reviews that serve as their basis. Because the American Cancer Society (ACS) is an important source of guidance about cancer screening for both health care practitioners and the general public, it has revised its methods to create a more transparent, consistent, and rigorous process for developing and communicating guidelines. The new ACS methods align with the IOM principles for trustworthy clinical guideline development by creating a single generalist group for writing the guidelines, commissioning independent systematic evidence reviews, and clearly articulating the benefits, limitations, and harms associated with a screening test. This new process should ensure that ACS cancer screening guidelines will continue to be a trustworthy source of information for both health care practitioners and the general public to guide clinical practice, personal choice, and public policy about cancer screening.

  7. Laboratory Studies of Homogeneous and Heterogeneous Chemical Processes of Importance in the Upper Atmosphere

    NASA Technical Reports Server (NTRS)

    Molina, Mario J.

    2003-01-01

    The objective of this study was to conduct measurements of chemical kinetics parameters for reactions of importance in the stratosphere and upper troposphere, and to study the interaction of trace gases with ice surfaces in order to elucidate the mechanism of heterogeneous chlorine activation processes, using both a theoretical and an experimental approach. The measurements were carried out under temperature and pressure conditions covering those applicable to the stratosphere and upper troposphere. The main experimental technique employed was turbulent flow-chemical ionization mass spectrometry, which is particularly well suited for investigations of radical-radical reactions.

  8. Laboratory Studies of Homogeneous and Heterogeneous Chemical Processes of Importance in the Upper Atmosphere

    NASA Technical Reports Server (NTRS)

    Molina, Mario J.

    2001-01-01

    The objective of this study is to conduct measurements of chemical kinetics parameters for reactions of importance in the stratosphere and upper troposphere, and to study the interaction of trace gases such as HCl with ice surfaces in order to elucidate the mechanism of heterogeneous chlorine activation processes, using both a theoretical and an experimental approach. The measurements will be carried out under temperature and pressure conditions covering those applicable to the stratosphere and upper troposphere. The techniques to be employed include turbulent flow-chemical ionization mass spectrometry and optical ellipsometry. The next section summarizes our research activities during the second year of the project, and the section that follows consists of the statement of work for the third year.

  9. Degradation mechanism of cyanobacterial toxin cylindrospermopsin by hydroxyl radicals in homogeneous UV/H₂O₂ process.

    PubMed

    He, Xuexiang; Zhang, Geshan; de la Cruz, Armah A; O'Shea, Kevin E; Dionysiou, Dionysios D

    2014-04-15

    The degradation of cylindrospermopsin (CYN), a widely distributed and highly toxic cyanobacterial toxin (cyanotoxin), remains poorly elucidated. In this study, the mechanism of CYN destruction by the UV-254 nm/H2O2 advanced oxidation process (AOP) was investigated by mass spectrometry. The various byproducts identified indicated three common reaction pathways: hydroxyl addition (+16 Da), alcoholic oxidation or dehydrogenation (-2 Da), and elimination of sulfate (-80 Da). Initiation of the degradation was observed at the hydroxymethyl uracil and tricyclic guanidine groups; uracil moiety cleavage/fragmentation and further ring-opening of the alkaloid were also noted at extended reaction times or higher UV fluence. The degradation rates of CYN decreased and fewer byproducts (species) were detected in natural water matrices; however, CYN was effectively eliminated under extended UV irradiation. This study demonstrates the efficiency of CYN degradation and provides a better understanding of the mechanism of CYN degradation by the hydroxyl radical, a reactive oxygen species that can be generated by most AOPs and is present in the natural water environment.

  10. Synthetic river valleys: Creating prescribed topography for form-process inquiry and river rehabilitation design

    NASA Astrophysics Data System (ADS)

    Brown, R. A.; Pasternack, G. B.; Wallender, W. W.

    2014-06-01

    The synthesis of artificial landforms is complementary to geomorphic analysis because it affords a reflection on both the characteristics and intrinsic formative processes of real world conditions. Moreover, the applied terminus of geomorphic theory is commonly manifested in the engineering and rehabilitation of riverine landforms where the goal is to create specific processes associated with specific morphology. To date, the synthesis of river topography has been explored outside of geomorphology through artistic renderings, computer science applications, and river rehabilitation design; while within geomorphology it has been explored using morphodynamic modeling, such as one-dimensional simulation of river reach profiles, two-dimensional simulation of river networks, and three-dimensional simulation of subreach scale river morphology. To date, no approach allows geomorphologists, engineers, or river rehabilitation practitioners to create landforms of prescribed conditions. In this paper a method for creating topography of synthetic river valleys is introduced that utilizes a theoretical framework that draws from fluvial geomorphology, computer science, and geometric modeling. Such a method would be valuable to geomorphologists in understanding form-process linkages as well as to engineers and river rehabilitation practitioners in developing design surfaces that can be rapidly iterated. The method introduced herein relies on the discretization of river valley topography into geometric elements associated with overlapping and orthogonal two-dimensional planes such as the planform, profile, and cross section that are represented by mathematical functions, termed geometric element equations. Topographic surfaces can be parameterized independently or dependently using a geomorphic covariance structure between the spatial series of geometric element equations. To illustrate the approach and overall model flexibility examples are provided that are associated with
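
    The decomposition into overlapping geometric element equations can be sketched by composing a longitudinal profile, a planform-driven thalweg offset, and a cross-section function into one elevation surface. All functional forms and parameter values below are invented for illustration, not taken from the paper:

```python
import math

def valley_elevation(s, n, slope=0.001, amp=2.0, wavelength=200.0,
                     depth=1.5, half_width=10.0):
    """Synthetic valley elevation built from three geometric element equations:
    a linear downstream profile, a sinusoidal planform-driven thalweg offset,
    and a parabolic cross-section capped at the bank height.
    s = downstream distance, n = across-valley distance from the centerline."""
    profile = -slope * s                                   # longitudinal profile
    thalweg = amp * math.sin(2 * math.pi * s / wavelength)  # planform offset
    u = (n - thalweg) / half_width                         # normalized cross coord
    cross_section = depth * min(u * u, 1.0)                # channel, flat past banks
    return profile + cross_section
```

    Evaluating this function on a regular (s, n) grid yields a prescribed topography; a geomorphic covariance structure, as described above, would couple the element equations instead of leaving them independent as here.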

  11. The Parametric Model of the Human Mandible Coronoid Process Created by Method of Anatomical Features

    PubMed Central

    Vitković, Nikola; Mitić, Jelena; Manić, Miodrag; Trajanović, Miroslav; Husain, Karim; Petrović, Slađana; Arsić, Stojanka

    2015-01-01

    Geometrically accurate and anatomically correct 3D models of the human bones are of great importance for medical research and practice in orthopedics and surgery. These geometrical models can be created by the use of techniques which can be based on input geometrical data acquired from volumetric methods of scanning (e.g., Computed Tomography (CT)) or on the 2D images (e.g., X-ray). Geometrical models of human bones created in such way can be applied for education of medical practitioners, preoperative planning, etc. In cases when geometrical data about the human bone is incomplete (e.g., fractures), it may be necessary to create its complete geometrical model. The possible solution for this problem is the application of parametric models. The geometry of these models can be changed and adapted to the specific patient based on the values of parameters acquired from medical images (e.g., X-ray). In this paper, Method of Anatomical Features (MAF) which enables creation of geometrically precise and anatomically accurate geometrical models of the human bones is implemented for the creation of the parametric model of the Human Mandible Coronoid Process (HMCP). The obtained results about geometrical accuracy of the model are quite satisfactory, as it is stated by the medical practitioners and confirmed in the literature. PMID:26064183

  13. Processing of α-chitin nanofibers by dynamic high pressure homogenization: characterization and antifungal activity against A. niger.

    PubMed

    Salaberria, Asier M; Fernandes, Susana C M; Diaz, Rene Herrera; Labidi, Jalel

    2015-02-13

    Chitin nano-objects are a more interesting and attractive material than native chitin because of their usable form, low density, high surface area and promising mechanical properties. This work proposes a straightforward and environmentally friendly method for processing chitin nanofibers using dynamic high-pressure homogenization. This technique proved to be a remarkably simple way to convert α-chitin from yellow lobster wastes into α-chitin nanofibers with a uniform width (below 100 nm) and high aspect ratio, and may contribute to a major breakthrough in chitin applications. Moreover, the resulting α-chitin nanofibers were characterized and compared with native α-chitin in terms of chemical and crystal structure, thermal degradation and antifungal activity. The biological assays highlighted that the nano nature of chitin nanofibers plays an important role in the antifungal activity against Aspergillus niger.

  15. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or Denver AWIPS Risk Reduction and Requirements Evaluation (DARE) Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on-the-fly for AWIPS, by using two examples of AWIPS applications that were created by the Applied Meteorology Unit (AMU) located at Cape Canaveral Air Force Station (CCAFS), Florida. The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) at Johnson Space Center, Texas and 45th Weather Squadron (45 WS) at CCAFS to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and the 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. The presentation will list the advantages and disadvantages of both file types for creating interactive graphical overlays in future AWIPS applications. Shapefiles are a popular format used extensively in Geographical Information Systems. They are usually used in AWIPS to depict static map backgrounds. A shapefile stores the geometry and attribute information of spatial features in a dataset (ESRI 1998). Shapefiles can contain point, line, and polygon features. Each shapefile contains a main file, index file, and a dBASE table. The main file contains a record for each spatial feature, which describes the feature with a list of its vertices. The index file contains the offset of each record from the beginning of the main file. 
The dBASE table contains records for each
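The shapefile layout described above follows the published ESRI specification: the main file begins with a 100-byte header holding a big-endian file code (always 9994), a big-endian file length in 16-bit words, and a little-endian version (1000) and shape type. A minimal sketch of parsing that header, using bytes constructed here rather than a real dataset:

```python
import struct

def parse_shp_header(buf: bytes) -> dict:
    """Read the fixed fields of a shapefile main-file header (ESRI 1998)."""
    file_code, = struct.unpack_from(">i", buf, 0)     # big-endian, always 9994
    file_length, = struct.unpack_from(">i", buf, 24)  # length in 16-bit words
    version, = struct.unpack_from("<i", buf, 28)      # little-endian, always 1000
    shape_type, = struct.unpack_from("<i", buf, 32)   # 1=point, 3=polyline, 5=polygon
    return {"file_code": file_code, "length_words": file_length,
            "version": version, "shape_type": shape_type}

# Build a minimal 100-byte header for a polyline layer (shape type 3)
header = bytearray(100)
struct.pack_into(">i", header, 0, 9994)
struct.pack_into(">i", header, 24, 50)   # 100 bytes = 50 sixteen-bit words
struct.pack_into("<i", header, 28, 1000)
struct.pack_into("<i", header, 32, 3)

print(parse_shp_header(bytes(header)))
```

The mixed byte order (big-endian file code and length, little-endian version and shape type) is a well-known quirk of the format.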

  16. Development of a new cucumber reference material for pesticide residue analysis: feasibility study for material processing, homogeneity and stability assessment.

    PubMed

    Grimalt, Susana; Harbeck, Stefan; Shegunova, Penka; Seghers, John; Sejerøe-Olsen, Berit; Emteborg, Håkan; Dabrio, Marta

    2015-04-01

    The feasibility of the production of a reference material for pesticide residue analysis in a cucumber matrix was investigated. Cucumber was spiked at 0.075 mg/kg with each of the 15 selected pesticides (acetamiprid, azoxystrobin, carbendazim, chlorpyrifos, cypermethrin, diazinon, (α + β)-endosulfan, fenitrothion, imazalil, imidacloprid, iprodione, malathion, methomyl, tebuconazole and thiabendazole). Three different strategies were considered for processing the material, based on the physicochemical properties of the vegetable and the target pesticides: a frozen spiked slurry of fresh cucumber, a spiked freeze-dried cucumber powder, and a freeze-dried cucumber powder spiked by spraying the powder. The effects of processing and aspects related to the reconstitution of the material were evaluated by monitoring the pesticide levels in the three materials. Two separate analytical methods based on LC-MS/MS and GC-MS/MS were developed and validated in-house. The spiked freeze-dried cucumber powder was selected as the most feasible material, and more exhaustive studies on the homogeneity and stability of the pesticide residues in the matrix were carried out. The results suggested that the between-unit homogeneity was satisfactory with a sample intake of dried material as low as 0.1 g. A 9-week isochronous stability study was undertaken at -20 °C, 4 °C and 18 °C, with -70 °C designated as the reference temperature. The pesticides tested exhibited adequate stability at -20 °C during the 9-week period, as well as at -70 °C for a period of 18 months. These results constitute a good basis for the development of a new candidate reference material for selected pesticides in a cucumber matrix. PMID:25627789

  17. Creating OGC Web Processing Service workflows using a web-based editor

    NASA Astrophysics Data System (ADS)

    de Jesus, J.; Walker, P.; Grant, M.

    2012-04-01

    The OGC WPS (Web Processing Service) specifies how geospatial algorithms may be accessed in an SOA (Service Oriented Architecture). Service providers can encode both simple and sophisticated algorithms as WPS processes and publish them as web services. These services are not only useful individually but may be built into complex processing chains (workflows) that can solve complex data analysis and/or scientific problems. The NETMAR project has extended the Web Processing Service (WPS) framework to provide transparent integration between it and the commonly used WSDL (Web Service Description Language), which describes web services, and its default SOAP (Simple Object Access Protocol) binding. The extensions allow WPS services to be orchestrated using commonly used tools (in this case Taverna Workbench, but BPEL-based systems would also be an option). We have also developed a WebGUI service editor, based on HTML5 and the WireIt! Javascript API, that allows users to create these workflows using only a web browser. The editor is coded entirely in Javascript and performs all XSLT transformations needed to produce a Taverna-compatible (T2FLOW) workflow description, which can be exported and run on a local Taverna Workbench or uploaded to a web-based orchestration server and run there. Here we present the NETMAR WebGUI service chain editor and discuss the problems associated with the development of a WebGUI for scientific workflow editing; content transformation into the Taverna orchestration language (T2FLOW/SCUFL); final orchestration in the Taverna engine; and how to deal with the large volumes of data being transferred between different WPS services (possibly running on different servers) during workflow orchestration. We will also demonstrate using the WebGUI to create a simple workflow making use of published web processing services, showing how simple services may be chained together to produce outputs that would previously have required a GIS (Geographic

  18. Inferring the demographic history from DNA sequences: An importance sampling approach based on non-homogeneous processes.

    PubMed

    Ait Kaci Azzou, S; Larribe, F; Froda, S

    2016-10-01

    In Ait Kaci Azzou et al. (2015) we introduced an Importance Sampling (IS) approach for estimating the demographic history of a sample of DNA sequences, the skywis plot. More precisely, we proposed a new nonparametric estimate of a population size that changes over time. We showed on simulated data that the skywis plot can work well in typical situations where the effective population size does not undergo very steep changes. In this paper, we introduce an iterative procedure which extends the previous method and gives good estimates under such rapid variations. In the iterative calibrated skywis plot we approximate the effective population size by a piecewise constant function, whose values are re-estimated at each step. These piecewise constant functions are used to generate the waiting times of non-homogeneous Poisson processes related to a coalescent process with mutation under a variable population size model. Moreover, the present IS procedure is based on a modified version of the Stephens and Donnelly (2000) proposal distribution. Finally, we apply the iterative calibrated skywis plot method to a simulated data set from a rapidly expanding exponential model, and we show that the method based on this new IS strategy correctly reconstructs the demographic history.
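Generating waiting times from a non-homogeneous Poisson process with a piecewise-constant intensity, as the abstract describes, can be sketched by simulating each constant-rate piece as a homogeneous process (exponential waits); by memorylessness the clock may be restarted at each breakpoint. This is an illustrative sketch, not the authors' code, and the breakpoints and rates are made up.

```python
import random

def nhpp_piecewise(breaks, rates, rng):
    """Event times on [breaks[0], breaks[-1]] for a piecewise-constant intensity.

    breaks: interval edges [t0, t1, ..., tk]; rates: k constant intensities.
    """
    events = []
    for (a, b), lam in zip(zip(breaks, breaks[1:]), rates):
        t = a
        while True:
            t += rng.expovariate(lam)  # exponential waiting time at rate lam
            if t >= b:
                break                  # discard the overshoot, move to next piece
            events.append(t)
    return events

rng = random.Random(42)
ev = nhpp_piecewise([0.0, 5.0, 10.0], [0.5, 8.0], rng)
print(len(ev), "events; last at", round(ev[-1], 3))
```

With a low rate on [0, 5) and a high rate on [5, 10), most events fall in the second interval, mimicking a population-size change in the coalescent setting.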

  19. Preparation of cotton linter nanowhiskers by high-pressure homogenization process and its application in thermoplastic starch

    NASA Astrophysics Data System (ADS)

    Savadekar, N. R.; Karande, V. S.; Vigneshwaran, N.; Kadam, P. G.; Mhaske, S. T.

    2015-03-01

    The present work deals with the preparation of cotton linter nanowhiskers (CLNW) by acid hydrolysis and subsequent processing in a high-pressure homogenizer. The prepared CLNW were then used as a reinforcing material in thermoplastic starch (TPS), with the aim of improving its performance properties. The concentration of CLNW was varied as 0, 1, 2, 3, 4 and 5 wt% in TPS. TPS/CLNW nanocomposite films were prepared by a solution-casting process. The nanocomposite films were characterized by tensile, differential scanning calorimetry (DSC), scanning electron microscopy (SEM), water vapor permeability (WVP), oxygen permeability (OP), X-ray diffraction and light transmittance properties. The 3 wt% CLNW-loaded TPS nanocomposite films demonstrated an 88% improvement in tensile strength as compared to the pristine TPS polymer film, whereas WVP and OP decreased by 90% and 92%, respectively, which is highly appreciable given the quantity of CLNW added. DSC thermograms of the nanocomposite films did not show any significant effect on melting temperature as compared to the pristine TPS. The light transmittance (Tr) value of TPS decreased with increasing content of CLNW. Better interaction between CLNW and TPS, due to the hydrophilic nature of both materials, and the uniform distribution of CLNW in TPS were the prime reasons for the improvement in properties observed at 3 wt% loading of CLNW in TPS. However, CLNW formed agglomerates at higher concentrations, as determined from SEM analysis. These nanocomposite films have potential use in food and pharmaceutical packaging applications.

  20. Manufacturing of 9CrMoCoB Steel of Large Ingot with Homogeneity by ESR Process

    NASA Astrophysics Data System (ADS)

    Kim, D. S.; Lee, G. J.; Lee, M. B.; Hur, J. I.; Lee, J. W.

    2016-07-01

    In the case of 9CrMoCoB (COST FB2) steel, the equilibrium relation between the [B]/[Si] ratio and the (B2O3)/(SiO2) ratio is very important for controlling [Si] and [B] within the optimum range. Therefore, in this work, to investigate this thermodynamic equilibrium relation, pilot ESR experiments on 9CrMoCoB steel were carried out using the CaF2-CaO-Al2O3-SiO2-B2O3 slag system while varying the Si content in the electrode and the B2O3 content in the slag. Furthermore, through test melting of a 20-ton-class ESR ingot, the merits and demerits of soft arcing were investigated. From these results, it is concluded that the oxygen content in the ESR ingot decreases with decreasing SiO2 content in the slag, that the relation between the [B]/[Si] ratio and the (B2O3)/(SiO2) ratio derived from the pilot ESR tests agrees well with the calculated line, with the same slope, and that soft arcing worsens the interior and surface quality of the ingot. With the optimized ESR conditions obtained from the present study, 1000 mm diameter (20 ton) and 2200 mm diameter (120 ton) 9CrMoCoB ESR ingots were successfully manufactured with good homogeneity.

  1. Creating "Intelligent" Climate Model Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, N. C.; Taylor, P. C.

    2014-12-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is often used to add value to model projections: consensus projections have been shown to consistently outperform individual models. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, certain models reproduce climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean-state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequally weighting multi-model ensembles. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables, e.g., outgoing longwave radiation and surface temperature. Metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and the Earth's Radiant Energy System (CERES) instrument and surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing weighted and unweighted model ensembles. For example, one tested metric weights the ensemble by how well models reproduce the time-series probability distribution of the cloud forcing component of reflected shortwave radiation. The weighted ensemble for this metric indicates lower simulated precipitation (up to 0.7 mm/day) in tropical regions than the unweighted ensemble: since CMIP5 models have been shown to
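The weighting idea can be sketched with synthetic numbers: score each model against an observational metric, normalize the scores into weights, and compare the weighted ensemble mean with the equal-weighted one. The skill-to-weight mapping and all values here are hypothetical, not the study's actual metrics.

```python
import numpy as np

rng = np.random.default_rng(0)
obs = 1.0                                   # "observed" value of the metric
metric = rng.normal(1.0, 0.3, size=8)       # each model's value of the metric
projection = rng.normal(2.0, 0.5, size=8)   # each model's future projection

# Better agreement with the observation -> larger weight (illustrative choice)
skill = 1.0 / (np.abs(metric - obs) + 1e-6)
weights = skill / skill.sum()               # normalize to sum to 1

equal_mean = projection.mean()
weighted_mean = np.average(projection, weights=weights)
print(equal_mean, weighted_mean)
```

As the abstract notes, the interesting question is not the mechanics of the average but which metric (here a stand-in) best constrains the projection.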

  2. Waste container weighing data processing to create reliable information of household waste generation.

    PubMed

    Korhonen, Pirjo; Kaila, Juha

    2015-05-01

    Household mixed waste container weighing data was processed by knowledge discovery and data mining techniques to create reliable information of household waste generation. The final data set included 27,865 weight measurements covering the whole year 2013, selected from a database of the Helsinki Region Environmental Services Authority, Finland. The data set contains mixed household waste arising in 6 m³ containers, and it was processed by identifying missing values and inconsistently low and high values as errors. The share of missing values and errors in the data set was 0.6%. This provides evidence that the waste weighing data gives reliable information of mixed waste generation at the collection point level. Characteristic of mixed household waste arising at the waste collection point level is a wide variation between pickups. The seasonal variation pattern, a result of collective similarities in the behaviour of households, was clearly detected by smoothed medians of the waste weight time series. The evaluation of the collection time series against the defined distribution range of pickup weights at the waste collection point level shows that 65% of the pickups were from collection points with optimally dimensioned container capacity, while collection points with over- and under-dimensioned container capacities accounted for 9.5% and 3.4% of all pickups, respectively. Occasional extra waste in containers occurred in 21.2% of the pickups, indicating the irregular behaviour of individual households. The results of this analysis show that processing waste weighing data using knowledge discovery and data mining techniques provides trustworthy information of household waste generation and its variations.
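The two steps described above, flagging missing or implausible weights as errors and smoothing the remaining series with running medians, can be sketched as follows. The thresholds, window size, and data are made up for illustration; they are not the study's values.

```python
import statistics

def clean_and_smooth(weights, low=5.0, high=600.0, window=5):
    """Flag missing/implausible pickups, then running-median smooth the rest."""
    errors = [i for i, w in enumerate(weights)
              if w is None or w < low or w > high]
    valid = [w for w in weights if w is not None and low <= w <= high]
    half = window // 2
    smoothed = [statistics.median(valid[max(0, i - half): i + half + 1])
                for i in range(len(valid))]
    return errors, smoothed

# Pickup weights in kg; one missing value and one implausibly high reading
series = [120.0, 118.0, None, 4000.0, 125.0, 130.0, 122.0]
errs, sm = clean_and_smooth(series)
print(errs)  # [2, 3] -- indices of the missing/implausible pickups
```

A median, unlike a mean, is insensitive to the occasional extra-waste pickups the abstract mentions, which is why smoothed medians expose the seasonal pattern.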

  4. Five Important Lessons I Learned during the Process of Creating New Child Care Centers

    ERIC Educational Resources Information Center

    Whitehead, R. Ann

    2005-01-01

    In this article, the author describes her experiences of developing new child care sites and offers five important lessons that she learned through her experiences which helped her to create successful child care centers. These lessons include: (1) Finding an appropriate area and location; (2) Creating realistic financial projections based on real…

  5. Degradation of atrazine by cobalt-mediated activation of peroxymonosulfate: Different cobalt counteranions in homogenous process and cobalt oxide catalysts in photolytic heterogeneous process.

    PubMed

    Chan, K H; Chu, W

    2009-05-01

    The degradation of atrazine (ATZ) by cobalt-mediated activation of peroxymonosulfate (PMS) has been studied in this work. For the homogenous process, different cobalt counteranions were examined: cobalt(II) nitrate Co(NO3)2, cobalt(II) sulfate CoSO4, cobalt(II) chloride CoCl2, and cobalt(II) acetate Co(CH3COO)2. An inhibitory effect was observed in the process initiated by CoCl2. For the pH test, a wide range of pH levels (2-10) was investigated. Higher rates were obtained at normal pH levels; at extreme pH levels, the process was impeded by inactivation of PMS at acidic pH and prohibited by precipitation at basic pH. On the other hand, the recycling capability of cobalt oxide and the oxidative potential of a cobalt-immobilized titanium dioxide (Co-TiO2) catalyst were analyzed in the heterogeneous process. It was found that the higher the cobalt content in the catalyst, the better the removal performance. Finally, the Co-TiO2 catalyst synthesized in this work was found to be very effective in transforming ATZ as well as its intermediates in the presence of UV-vis irradiation.

  6. Mesoscopic homogenization of semi-insulating GaAs by two-step post growth annealing

    SciTech Connect

    Hoffmann, B.; Jurisch, M.; Koehler, A.; Reinhold, T.; Weinert, B.; Kissinger, G.

    1996-12-31

    Mesoscopic homogenization of the electrical properties of s.i. LEC-GaAs is commonly realized by thermal treatment of the crystals, including the steps of dissolution of arsenic precipitates, homogenization of excess As, and re-precipitation by creating a controlled supersaturation. Because of the inhomogeneous distribution of dislocations and the corresponding cellular structure along and across LEC-grown crystals, a proper choice of the time-temperature program is necessary to minimize fluctuations of mesoscopic homogeneity. A modified two-step ingot annealing process is demonstrated to ensure a homogeneous distribution of these mesoscopic properties.

  7. Benchmarking monthly homogenization algorithms

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data
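One of the performance metrics named above, the centered root mean square error, compares a homogenized series with the true homogeneous series after removing each series' mean, so a constant offset is not penalized. A small sketch with synthetic numbers (not the benchmark code):

```python
import math

def centered_rmse(estimate, truth):
    """RMSE between anomaly series (means removed from both inputs)."""
    me = sum(estimate) / len(estimate)
    mt = sum(truth) / len(truth)
    return math.sqrt(sum(((e - me) - (t - mt)) ** 2
                         for e, t in zip(estimate, truth)) / len(truth))

truth = [0.0, 0.1, 0.2, 0.3]
shifted = [1.0, 1.1, 1.2, 1.3]        # same shape, constant offset
print(centered_rmse(shifted, truth))  # ~0: the constant offset is ignored
```

A homogenization algorithm that recovers the shape of the true series up to an additive bias therefore scores perfectly on this metric, while shape distortions are penalized.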

  8. Design Process for Online Websites Created for Teaching Turkish as a Foreign Language in Web Based Environments

    ERIC Educational Resources Information Center

    Türker, Fatih Mehmet

    2016-01-01

    In today's world, where online learning environments have increased their efficiency in education and training, the design of the websites prepared for education and training purposes has become an important process. This study is about the teaching process of the online learning environments created to teach Turkish in web based environments, and…

  9. Detailed homogeneous abundance studies of 14 Galactic s-process enriched post-AGB stars: In search of lead (Pb)

    NASA Astrophysics Data System (ADS)

    De Smedt, K.; Van Winckel, H.; Kamath, D.; Siess, L.; Goriely, S.; Karakas, A. I.; Manick, R.

    2016-03-01

    Context. This paper is part of a larger project in which we systematically study the chemical abundances of Galactic and extragalactic post-asymptotic giant branch (post-AGB) stars. The goal at large is to provide improved observational constraints to the models of the complex interplay between the AGB s-process nucleosynthesis and the associated mixing processes. Aims: Lead (Pb) is the final product of the s-process nucleosynthesis and is predicted to have large overabundances with respect to other s-process elements in AGB stars of low metallicities. However, Pb abundance studies of s-process enriched post-AGB stars in the Magellanic Clouds show a discrepancy between observed and predicted Pb abundances. The determined upper limits based on spectral studies are much lower than what is predicted. In this paper, we focus specifically on the Pb abundance of 14 Galactic s-process enhanced post-AGB stars to check whether the same discrepancy is present in the Galaxy as well. Among these 14 objects, two were not yet subject to a detailed abundance study in the literature. We apply the same method to obtain accurate abundances for the 12 others. Our homogeneous abundance results provide the input of detailed spectral synthesis computations in the spectral regions where Pb lines are located. Methods: We used high-resolution UVES and HERMES spectra for detailed spectral abundance studies of our sample of Galactic post-AGB stars. None of the sample stars display clear Pb lines, and we only deduced upper limits of the Pb abundance by using spectrum synthesis in the spectral ranges of the strongest Pb lines. Results: We do not find any clear evidence of Pb overabundances in our sample. The derived upper limits are strongly correlated with the effective temperature of the stars with increasing upper limits for increasing effective temperatures. We obtain stronger Pb constraints on the cooler objects. 
Moreover, we confirm the s-process enrichment and carbon enhancement of two

  11. Creating Joint Attentional Frames and Pointing to Evidence in the Reading and Writing Process

    ERIC Educational Resources Information Center

    Unger, John A.; Liu, Rong; Scullion, Vicki A.

    2015-01-01

    This theory-into-practice paper integrates Tomasello's concept of Joint Attentional Frames and well-known ideas related to the work of Russian psychologist, Lev Vygotsky, with more recent ideas from social semiotics. Classroom procedures for incorporating student-created Joint Attentional Frames into literacy lessons are explained by links to…

  12. We're Born to Learn: Using the Brain's Natural Learning Process to Create Today's Curriculum. Second Edition

    ERIC Educational Resources Information Center

    Smilkstein, Rita

    2011-01-01

    This updated edition of the bestselling book on the brain's natural learning process brings new research results and applications in a power-packed teacher tool kit. Rita Smilkstein shows teachers how to create and deliver curricula that help students become the motivated, successful, and natural learners they were born to be. Updated features…

  13. Method of removing the effects of electrical shorts and shunts created during the fabrication process of a solar cell

    DOEpatents

    Nostrand, Gerald E.; Hanak, Joseph J.

    1979-01-01

    A method of removing the effects of electrical shorts and shunts created during the fabrication process and improving the performance of a solar cell with a thick film cermet electrode opposite to the incident surface by applying a reverse bias voltage of sufficient magnitude to burn out the electrical shorts and shunts but less than the break down voltage of the solar cell.

  14. It's Who You Know "and" What You Know: The Process of Creating Partnerships between Schools and Communities

    ERIC Educational Resources Information Center

    Hands, Catherine

    2005-01-01

    Based on qualitative research, this article aims to clarify the process of creating school-community partnerships. Two secondary schools with numerous partnerships were selected within a southern Ontario school board characterized by economic and cultural diversity. Drawing on the within- and cross-case analyses of documents, observations, and 25…

  15. Simulation of the Vapor Intrusion Process for Non-Homogeneous Soils Using a Three-Dimensional Numerical Model.

    PubMed

    Bozkurt, Ozgur; Pennell, Kelly G; Suuberg, Eric M

    2009-01-01

    This paper presents model simulation results of vapor intrusion into structures built atop sites contaminated with volatile or semi-volatile chemicals of concern. A three-dimensional finite element model was used to investigate the importance of factors that could influence vapor intrusion when the site is characterized by non-homogeneous soils. Model simulations were performed to examine how soil layers of differing properties alter soil gas concentration profiles and vapor intrusion rates into structures. The results illustrate differences in soil gas concentration profiles and vapor intrusion rates between homogeneous and layered soils. The findings support the need for site conceptual models to adequately represent the site's geology when conducting site characterizations, interpreting field data and assessing the risk of vapor intrusion at a given site. For instance, in layered geologies, a lower permeability and diffusivity soil layer between the source and building often limits vapor intrusion rates, even if a higher permeability layer near the foundation permits increased soil gas flow rates into the building. In addition, the presence of water-saturated clay layers can considerably influence soil gas concentration profiles. Therefore, interpreting field data without accounting for clay layers in the site conceptual model could result in inaccurate risk calculations. Important considerations for developing more accurate conceptual site models are discussed in light of the findings. PMID:20664816
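The limiting effect of a low-diffusivity layer described in this abstract can be illustrated with a back-of-the-envelope series-resistance calculation for steady-state one-dimensional diffusion. This is a minimal sketch, not the paper's 3D finite element model, and the layer thicknesses and diffusivities below are illustrative values, not data from the study:

```python
def diffusive_flux(delta_c, layers):
    """Steady-state 1D diffusive flux through soil layers in series.

    Each layer contributes a resistance L/D (thickness over effective
    diffusivity), analogous to resistors in series, so the layer with
    the largest L/D dominates and limits the flux.
    """
    resistance = sum(thickness / diffusivity for thickness, diffusivity in layers)
    return delta_c / resistance

# Illustrative comparison: a homogeneous sand profile vs. the same
# profile with a 0.5 m clay layer (thicknesses in m, diffusivities in m^2/s).
sand_only = diffusive_flux(1.0, [(3.0, 5e-6)])
with_clay = diffusive_flux(1.0, [(2.5, 5e-6), (0.5, 5e-8)])
print(round(sand_only / with_clay, 1))  # the clay layer cuts the flux ~17x
```

The series-resistance form makes the abstract's point quantitative: a thin layer with a hundredfold lower diffusivity dominates the total resistance regardless of the more permeable layers around it.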

  16. Chemically Patterned Inverse Opal Created by a Selective Photolysis Modification Process.

    PubMed

    Tian, Tian; Gao, Ning; Gu, Chen; Li, Jian; Wang, Hui; Lan, Yue; Yin, Xianpeng; Li, Guangtao

    2015-09-01

    Anisotropic photonic crystal materials have long been pursued for their broad applications. A novel method for creating chemically patterned inverse opals is proposed here. The patterning technique is based on selective photolysis of a photolabile polymer together with postmodification on released amine groups. The patterning method allows regioselective modification within an inverse opal structure, taking advantage of selective chemical reaction. Moreover, combined with the unique signal self-reporting feature of the photonic crystal, the fabricated structure is capable of various applications, including gradient photonic bandgap and dynamic chemical patterns. The proposed method provides the ability to extend the structural and chemical complexity of the photonic crystal, as well as its potential applications.

  17. Atomic processes in plasmas created by an ultra-short laser pulse

    NASA Astrophysics Data System (ADS)

    Audebert, P.; Lecherbourg, L.; Bastiani-Ceccotti, S.; Geindre, J.-P.; Blancard, C.; Cossé, P.; Faussurier, G.; Shepherd, R.; Renaudin, P.

    2008-05-01

    Point projection K-shell absorption spectroscopy has been used to measure absorption spectra of transient aluminum plasma created by an ultra-short laser pulse. 1s-2p and 1s-3p absorption lines of weakly ionized aluminum were measured for an extended range of densities in a relatively low-temperature regime. Independent plasma characterization was obtained from frequency domain interferometry (FDI) diagnostic and allows the interpretation of the absorption spectra in terms of spectral opacities. The experimental spectra are compared with opacity calculations using the density and temperature inferred from the analysis of the FDI data.

  18. Nonadaptive processes can create the appearance of facultative cheating in microbes.

    PubMed

    Smith, Jeff; Van Dyken, J David; Velicer, Gregory J

    2014-03-01

    Adaptations to social life may take the form of facultative cheating, in which organisms cooperate with genetically similar individuals but exploit others. Consistent with this possibility, many strains of social microbes like Myxococcus bacteria and Dictyostelium amoebae have equal fitness in single-genotype social groups but outcompete other strains in mixed-genotype groups. Here we show that these observations are also consistent with an alternative, nonadaptive scenario: kin selection-mutation balance under local competition. Using simple mathematical models, we show that deleterious mutations that reduce competitiveness within social groups (e.g., growth rate) without affecting group productivity can create fitness effects that are only expressed in the presence of other strains. In Myxococcus, mutations that delay sporulation may strongly reduce developmental competitiveness. Deleterious mutations are expected to accumulate when high levels of kin selection relatedness relax selection within groups. Interestingly, local resource competition can create nonzero "cost" and "benefit" terms in Hamilton's rule even in the absence of any cooperative trait. Our results show how deleterious mutations can play a significant role even in organisms with large populations and highlight the need to test evolutionary causes of social competition among microbes.
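The core observation in this abstract, equal fitness in pure groups but unequal fitness in mixed groups, can be reproduced with a toy model. This is a hedged sketch of the general idea, not the authors' actual model, and all parameter values are illustrative:

```python
def spore_share(freq_wt, growth_wt=1.0, growth_mut=0.8, productivity=100.0):
    """Spores produced by each strain in one social group.

    Total group output (productivity) is the same for every group, but
    within a group each strain's share is proportional to its frequency
    times its within-group growth rate. Parameter values are illustrative.
    """
    wt = freq_wt * growth_wt
    mut = (1.0 - freq_wt) * growth_mut
    total = wt + mut
    return productivity * wt / total, productivity * mut / total

# Pure groups: either strain alone recovers the full group productivity,
# so the two strains look equally fit on their own.
assert spore_share(1.0) == (100.0, 0.0)
assert spore_share(0.0) == (0.0, 100.0)

# A 50:50 mixed group: the slower-growing mutant is underrepresented, so
# the wild type appears to "cheat" without any cheating adaptation.
wt_out, mut_out = spore_share(0.5)
print(round(wt_out, 1), round(mut_out, 1))
```

Because group productivity is unaffected, the mutation is invisible in single-genotype assays and only surfaces in mixed groups, exactly the pattern the abstract argues need not reflect facultative cheating.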

  19. Lycopene degradation, isomerization and in vitro bioaccessibility in high pressure homogenized tomato puree containing oil: effect of additional thermal and high pressure processing.

    PubMed

    Knockaert, Griet; Pulissery, Sudheer K; Colle, Ines; Van Buggenhout, Sandy; Hendrickx, Marc; Loey, Ann Van

    2012-12-01

    In the present study, the effect of equivalent thermal and high pressure processes at pasteurization and sterilization intensities on some health-related properties of high pressure homogenized tomato puree containing oil was investigated. Total lycopene concentration, cis-lycopene content and in vitro lycopene bioaccessibility were examined as health-related properties. Results showed that pasteurization hardly affected the health-related properties of tomato puree; only the formation of cis-lycopene during intense thermal pasteurization was observed. Sterilization processes, on the other hand, had a significant effect on the health-related properties. A significant decrease in total lycopene concentration was found after the sterilization processes. In addition to degradation, significant isomerization was also observed: all-trans-lycopene was mainly converted to 9-cis- and 13-cis-lycopene. High pressure sterilization limited the overall lycopene isomerization when compared to the equivalent thermal sterilization processes. The formation of 5-cis-lycopene, on the other hand, seemed to be favoured by high pressure. The in vitro lycopene bioaccessibility of high pressure homogenized tomato puree containing oil decreased during subsequent thermal or high pressure processing, whereby significant changes were observed for all the sterilization processes.

  20. Chemically Patterned Inverse Opal Created by a Selective Photolysis Modification Process.

    PubMed

    Tian, Tian; Gao, Ning; Gu, Chen; Li, Jian; Wang, Hui; Lan, Yue; Yin, Xianpeng; Li, Guangtao

    2015-09-01

    Anisotropic photonic crystal materials have long been pursued for their broad applications. A novel method for creating chemically patterned inverse opals is proposed here. The patterning technique is based on selective photolysis of a photolabile polymer together with postmodification on released amine groups. The patterning method allows regioselective modification within an inverse opal structure, taking advantage of selective chemical reaction. Moreover, combined with the unique signal self-reporting feature of the photonic crystal, the fabricated structure is capable of various applications, including gradient photonic bandgap and dynamic chemical patterns. The proposed method provides the ability to extend the structural and chemical complexity of the photonic crystal, as well as its potential applications. PMID:26269453

  1. Numerical Simulation of Crater Creating Process in Dynamic Replacement Method by Smooth Particle Hydrodynamics

    NASA Astrophysics Data System (ADS)

    Danilewicz, Andrzej; Sikora, Zbigniew

    2015-02-01

    A theoretical base of the SPH method, including the governing equations, a discussion of the importance of the smoothing function length, contact formulation, boundary treatment and, finally, utilization in hydrocode simulations, is presented. An application of SPH to a real case of large penetrations (crater creation) into the soil caused by falling mass in the Dynamic Replacement Method is discussed. The influence of particle spacing on method accuracy is presented. An example calculated with the LS-DYNA software is discussed. The chronological development of Smooth Particle Hydrodynamics is presented. Theoretical basics of SPH method stability, consistency of the SPH formulation, artificial viscosity and boundary treatment are discussed. Time integration techniques with stability conditions, SPH+FEM coupling, the constitutive equation and the equation of state (EOS) are presented as well.
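The smoothing function whose length the abstract highlights is the heart of SPH. As a minimal illustration (not the paper's implementation), the classic one-dimensional cubic spline kernel, with compact support of radius twice the smoothing length h, can be sketched as:

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """1D cubic spline smoothing kernel W(r, h), a standard SPH choice.

    The smoothing length h sets the kernel's reach: W has compact
    support of radius 2h and integrates to one over the line.
    """
    sigma = 2.0 / (3.0 * h)  # 1D normalization constant
    q = np.abs(np.asarray(r, dtype=float)) / h
    w = np.where(q <= 1.0,
                 1.0 - 1.5 * q**2 + 0.75 * q**3,
                 np.where(q <= 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

# A field f at position x is then approximated from particle values f_j:
#   f(x) ~ sum_j (m_j / rho_j) * f_j * W(x - x_j, h)
x = np.linspace(-3.0, 3.0, 60001)
w = cubic_spline_kernel(x, 1.0)
print(w.sum() * (x[1] - x[0]))  # ~ 1.0: the kernel is normalized
```

The particle-spacing sensitivity the abstract reports follows directly from this construction: accuracy depends on how many particles fall inside each kernel's 2h support.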

  2. Managing the Drafting Process: Creating a New Model for the Workplace.

    ERIC Educational Resources Information Center

    Shwom, Barbara L.; Hirsch, Penny L.

    1994-01-01

    Discusses the development of a pragmatic model of the writing process in the workplace, focusing on the importance of "drafting" as part of that process. Discusses writers' attitudes about drafting and the structures of the workplace that drafting has to accommodate. Introduces a drafting model and discusses results of using this model in the…

  3. Rethinking Communication in Innovation Processes: Creating Space for Change in Complex Systems

    ERIC Educational Resources Information Center

    Leeuwis, Cees; Aarts, Noelle

    2011-01-01

    This paper systematically rethinks the role of communication in innovation processes, starting from largely separate theoretical developments in communication science and innovation studies. Literature review forms the basis of the arguments presented. The paper concludes that innovation is a collective process that involves the contextual…

  4. Creating Trauma-Informed Child Welfare Systems Using a Community Assessment Process

    ERIC Educational Resources Information Center

    Hendricks, Alison; Conradi, Lisa; Wilson, Charles

    2011-01-01

    This article describes a community assessment process designed to evaluate a specific child welfare jurisdiction based on the current definition of trauma-informed child welfare and its essential elements. This process has recently been developed and pilot tested within three diverse child welfare systems in the United States. The purpose of the…

  5. The Process of Inclusion and Accommodation: Creating Accessible Groups for Individuals with Disabilities.

    ERIC Educational Resources Information Center

    Patterson, Jeanne Boland; And Others

    1995-01-01

    Supports the important work of group counselors by focusing on the inclusion of individuals with disabilities in nondisability specific groups and addressing disability myths, disability etiquette, architectural accessibility, and group process issues. (LKS)

  6. BrainK for Structural Image Processing: Creating Electrical Models of the Human Head

    PubMed Central

    Li, Kai; Papademetris, Xenophon; Tucker, Don M.

    2016-01-01

    BrainK is a set of automated procedures for characterizing the tissues of the human head from MRI, CT, and photogrammetry images. The tissue segmentation and cortical surface extraction support the primary goal of modeling the propagation of electrical currents through head tissues with a finite difference model (FDM) or finite element model (FEM) created from the BrainK geometries. The electrical head model is necessary for accurate source localization of dense array electroencephalographic (dEEG) measures from head surface electrodes. It is also necessary for accurate targeting of cerebral structures with transcranial current injection from those surface electrodes. BrainK must achieve five major tasks: image segmentation, registration of the MRI, CT, and sensor photogrammetry images, cortical surface reconstruction, dipole tessellation of the cortical surface, and Talairach transformation. We describe the approach to each task, and we compare the accuracies for the key tasks of tissue segmentation and cortical surface extraction in relation to existing research tools (FreeSurfer, FSL, SPM, and BrainVisa). BrainK achieves good accuracy with minimal or no user intervention, it deals well with poor quality MR images and tissue abnormalities, and it provides improved computational efficiency over existing research packages. PMID:27293419

  7. BrainK for Structural Image Processing: Creating Electrical Models of the Human Head.

    PubMed

    Li, Kai; Papademetris, Xenophon; Tucker, Don M

    2016-01-01

    BrainK is a set of automated procedures for characterizing the tissues of the human head from MRI, CT, and photogrammetry images. The tissue segmentation and cortical surface extraction support the primary goal of modeling the propagation of electrical currents through head tissues with a finite difference model (FDM) or finite element model (FEM) created from the BrainK geometries. The electrical head model is necessary for accurate source localization of dense array electroencephalographic (dEEG) measures from head surface electrodes. It is also necessary for accurate targeting of cerebral structures with transcranial current injection from those surface electrodes. BrainK must achieve five major tasks: image segmentation, registration of the MRI, CT, and sensor photogrammetry images, cortical surface reconstruction, dipole tessellation of the cortical surface, and Talairach transformation. We describe the approach to each task, and we compare the accuracies for the key tasks of tissue segmentation and cortical surface extraction in relation to existing research tools (FreeSurfer, FSL, SPM, and BrainVisa). BrainK achieves good accuracy with minimal or no user intervention, it deals well with poor quality MR images and tissue abnormalities, and it provides improved computational efficiency over existing research packages. PMID:27293419

  8. Creating Low Vision and Nonvisual Instructions for Diabetes Technology: An Empirically Validated Process

    PubMed Central

    Williams, Ann S.

    2012-01-01

    Introduction Nearly 20% of the adults with diagnosed diabetes in the United States also have visual impairment. Many individuals in this group perform routine diabetes self-management tasks independently, often using technology that was not specifically designed for use by people with visual impairment (e.g., insulin pumps and pens). Equitable care for persons with disabilities requires providing instructions in formats accessible for nonreaders. However, instructions in accessible formats, such as recordings, braille, or digital documents that are legible to screen readers, are seldom available. Method This article includes a summary of existing guidelines for creating accessible documents. The guidelines are followed by a description of the production of accessible nonvisual instructions for use of insulin pens used in a study of dosing accuracy. The study results indicate that the instructions were used successfully by 40 persons with visual impairment. Discussion and Conclusions Instructions in accessible formats can increase access to the benefits of diabetes technology for persons with visual impairment. Recorded instructions may also be useful to sighted persons who do not read well, such as those with dyslexia, low literacy, or who use English as a second language. Finally, they may have important benefits for fully sighted people who find it easier to learn to use technology by handling the equipment while listening to instructions. Manufacturers may also benefit from marketing to an increased pool of potential users. PMID:22538133

  9. Feasibility study for producing a carrot/potato matrix reference material for 11 selected pesticides at EU MRL level: material processing, homogeneity and stability assessment.

    PubMed

    Saldanha, Helena; Sejerøe-Olsen, Berit; Ulberth, Franz; Emons, Hendrik; Zeleny, Reinhard

    2012-05-01

    The feasibility of producing a matrix reference material for selected pesticides in a carrot/potato matrix was investigated. A commercially available baby food (carrot/potato-based mash) was spiked with 11 pesticides at the respective EU maximum residue limits (MRLs), and further processed by either freezing or freeze-drying. Batches of some 150 units were produced per material type. First, the materials were assessed for the relative amount of pesticide recovered after processing (ratio of pesticide concentration in the processed material to the initially spiked pesticide concentration). In addition, the materials' homogeneity (bottle-to-bottle variation) and the short-term (1 month) and mid-term (5 months) stability at different temperatures were assessed. For this, an in-house validated GC-EI-MS method operated in the SIM mode with a sample preparation procedure based on the QuEChERS ("quick, easy, cheap, effective, rugged, and safe") principle was applied. Measurements on the frozen material provided the most promising results (smallest analyte losses during production), and freeze-drying also proved to be a suitable alternative processing technique for most of the investigated pesticides. Both the frozen and the freeze-dried material proved to be sufficiently homogeneous for the intended use, and storage at -20°C for 5 months did not reveal any detectable material degradation. The results constitute an important step towards the development of a pesticide matrix reference material. PMID:26434333

  10. Not All Analogies Are Created Equal: Associative and Categorical Analogy Processing following Brain Damage

    ERIC Educational Resources Information Center

    Schmidt, Gwenda L.; Cardillo, Eileen R.; Kranjec, Alexander; Lehet, Matthew; Widick, Page; Chatterjee, Anjan

    2012-01-01

    Current research on analogy processing assumes that different conceptual relations are treated similarly. However, just as words and concepts are related in distinct ways, different kinds of analogies may employ distinct types of relationships. An important distinction in how words are related is the difference between associative (dog-bone) and…

  11. 36 CFR 1237.26 - What materials and processes must agencies use to create audiovisual records?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... all other copies) of permanent, long-term temporary, or unscheduled records, use polyester base media...) If using reversal type processing, require full photographic reversal; i.e., develop, bleach, expose... stock and blank optical media (e.g., DVD and CD), for original copies of permanent, long-term...

  12. 36 CFR 1237.26 - What materials and processes must agencies use to create audiovisual records?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... all other copies) of permanent, long-term temporary, or unscheduled records, use polyester base media...) If using reversal type processing, require full photographic reversal; i.e., develop, bleach, expose... stock and blank optical media (e.g., DVD and CD), for original copies of permanent, long-term...

  13. Creating Sustainable Education Projects in Roatán, Honduras through Continuous Process Improvement

    ERIC Educational Resources Information Center

    Raven, Arjan; Randolph, Adriane B.; Heil, Shelli

    2010-01-01

    The investigators worked together with permanent residents of Roatán, Honduras on sustainable initiatives to help improve the island's troubled educational programs. Our initiatives focused on increasing the number of students eligible and likely to attend a university. Using a methodology based in continuous process improvement, we developed…

  14. 36 CFR 1237.26 - What materials and processes must agencies use to create audiovisual records?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...-digital or scanned digital images that are scheduled as permanent or unscheduled, a record (or master... requirements on image format and resolution, see § 1235.48(e) of this subchapter. For temporary digital...) If using reversal type processing, require full photographic reversal; i.e., develop, bleach,...

  15. 36 CFR 1237.26 - What materials and processes must agencies use to create audiovisual records?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... records? Agencies must: (a) For picture negatives and motion picture preprints (negatives, masters, and..., develop, fix, and wash. (b) Avoid using motion pictures in a final “A & B” format (two precisely matched... photographic film does not exceed 0.014 grams per square meter. (2) Require laboratories to process film...

  16. Dynamic Disturbance Processes Create Dynamic Lek Site Selection in a Prairie Grouse.

    PubMed

    Hovick, Torre J; Allred, Brady W; Elmore, R Dwayne; Fuhlendorf, Samuel D; Hamilton, Robert G; Breland, Amber

    2015-01-01

    It is well understood that landscape processes can affect habitat selection patterns, movements, and species persistence. These selection patterns may be altered or even eliminated as a result of changes in disturbance regimes and a concomitant management focus on uniform, moderate disturbance across landscapes. To assess how restored landscape heterogeneity influences habitat selection patterns, we examined 21 years (1991, 1993-2012) of Greater Prairie-Chicken (Tympanuchus cupido) lek location data in tallgrass prairie with restored fire and grazing processes. Our study took place at The Nature Conservancy's Tallgrass Prairie Preserve located at the southern extent of Flint Hills in northeastern Oklahoma. We specifically addressed stability of lek locations in the context of the fire-grazing interaction, and the environmental factors influencing lek locations. We found that lek locations were dynamic in a landscape with interacting fire and grazing. While previous conservation efforts have treated leks as stable with high site fidelity in static landscapes, a majority of lek locations in our study (i.e., 65%) moved by nearly one kilometer on an annual basis in this dynamic setting. Lek sites were in elevated areas with low tree cover and low road density. Additionally, lek site selection was influenced by an interaction of fire and patch edge, indicating that in recently burned patches, leks were located near patch edges. These results suggest that dynamic and interactive processes such as fire and grazing that restore heterogeneity to grasslands do influence habitat selection patterns in prairie grouse, a phenomenon that is likely to apply throughout the Greater Prairie-Chicken's distribution when dynamic processes are restored. As conservation moves toward restoring dynamic historic disturbance patterns, it will be important that siting and planning of anthropogenic structures (e.g., wind energy, oil and gas) and management plans not view lek locations as static.

  17. Dynamic Disturbance Processes Create Dynamic Lek Site Selection in a Prairie Grouse

    PubMed Central

    Hovick, Torre J.; Allred, Brady W.; Elmore, R. Dwayne; Fuhlendorf, Samuel D.; Hamilton, Robert G.; Breland, Amber

    2015-01-01

    It is well understood that landscape processes can affect habitat selection patterns, movements, and species persistence. These selection patterns may be altered or even eliminated as a result of changes in disturbance regimes and a concomitant management focus on uniform, moderate disturbance across landscapes. To assess how restored landscape heterogeneity influences habitat selection patterns, we examined 21 years (1991, 1993–2012) of Greater Prairie-Chicken (Tympanuchus cupido) lek location data in tallgrass prairie with restored fire and grazing processes. Our study took place at The Nature Conservancy’s Tallgrass Prairie Preserve located at the southern extent of Flint Hills in northeastern Oklahoma. We specifically addressed stability of lek locations in the context of the fire-grazing interaction, and the environmental factors influencing lek locations. We found that lek locations were dynamic in a landscape with interacting fire and grazing. While previous conservation efforts have treated leks as stable with high site fidelity in static landscapes, a majority of lek locations in our study (i.e., 65%) moved by nearly one kilometer on an annual basis in this dynamic setting. Lek sites were in elevated areas with low tree cover and low road density. Additionally, lek site selection was influenced by an interaction of fire and patch edge, indicating that in recently burned patches, leks were located near patch edges. These results suggest that dynamic and interactive processes such as fire and grazing that restore heterogeneity to grasslands do influence habitat selection patterns in prairie grouse, a phenomenon that is likely to apply throughout the Greater Prairie-Chicken’s distribution when dynamic processes are restored. As conservation moves toward restoring dynamic historic disturbance patterns, it will be important that siting and planning of anthropogenic structures (e.g., wind energy, oil and gas) and management plans not view lek locations as static.

  18. ArhiNet - A Knowledge-Based System for Creating, Processing and Retrieving Archival eContent

    NASA Astrophysics Data System (ADS)

    Salomie, Ioan; Dinsoreanu, Mihaela; Pop, Cristina; Suciu, Sorin

    This paper addresses the problem of creating, processing and querying semantically enhanced eContent from archives and digital libraries. We present an analysis of the archival domain, resulting in the creation of an archival domain model and of a domain ontology core. Our system adds semantic mark-up to the historical documents' content, thus enabling document and knowledge retrieval in response to natural language ontology-guided queries. The system functionality follows two main workflows: (i) semantically enhanced eContent generation and knowledge acquisition and (ii) knowledge processing and retrieval. Within the first workflow, the relevant domain information is extracted from documents written in natural languages, followed by semantic annotation and domain ontology population. In the second workflow, ontologically guided natural language queries trigger reasoning processes that provide relevant search results. The paper also discusses the transformation of the OWL domain ontology into a hierarchical data model, thus providing support for the efficient ontology processing.

  19. Description of the process used to create 1992 Hanford Mortality Study database

    SciTech Connect

    Gilbert, E. S.; Buchanan, J. A.; Holter, N. A.

    1992-12-01

    An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files maintained by the Hanford Environmental Health Foundation (HEHF) and including demographic data and job histories; the Hanford Mortality (HMO) files also maintained by HEHF and including information of deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files maintained by PNL's Health Physics Department and containing data on external dosimetry; and a file of workers with confirmed internal depositions of radionuclides also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.
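The file linkage this report describes, joining demographic, mortality and dosimetry files on a worker identifier and running between-file consistency edits, can be sketched in modern terms with pandas. The field names and values below are hypothetical stand-ins for illustration only; the actual OHH, HMO and ORE file layouts differ:

```python
import pandas as pd

# Hypothetical miniature stand-ins for the source files.
ohh = pd.DataFrame({"worker_id": [1, 2, 3],            # demographics / job history
                    "hire_year": [1944, 1951, 1960]})
hmo = pd.DataFrame({"worker_id": [2],                  # deaths of Hanford workers
                    "death_year": [1988]})
ore = pd.DataFrame({"worker_id": [1, 1, 2],            # annual external doses
                    "dose_mSv": [1.2, 0.8, 3.1]})

# Link the files on the worker identifier, keeping the full OHH roster.
dose = ore.groupby("worker_id", as_index=False)["dose_mSv"].sum()
study = ohh.merge(hmo, on="worker_id", how="left").merge(dose, on="worker_id", how="left")

# A between-file consistency edit of the kind the report describes:
# no recorded death may precede the worker's hire year.
valid = study.dropna(subset=["death_year"])
assert (valid["death_year"] >= valid["hire_year"]).all()
print(study.shape)  # one row per worker, with death and dose columns attached
```

The left joins preserve workers with no death or dose records, which mirrors the report's need to keep the full cohort while flagging inconsistencies across files.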

  20. Description of the process used to create 1992 Hanford Mortality Study database

    SciTech Connect

    Gilbert, E.S.; Buchanan, J.A.; Holter, N.A.

    1992-12-01

    An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files maintained by the Hanford Environmental Health Foundation (HEHF) and including demographic data and job histories; the Hanford Mortality (HMO) files also maintained by HEHF and including information of deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files maintained by PNL's Health Physics Department and containing data on external dosimetry; and a file of workers with confirmed internal depositions of radionuclides also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.

  1. Moving beyond readmission penalties: creating an ideal process to improve transitional care

    PubMed Central

    Burke, Robert E.; Kripalani, Sunil; Vasilevskis, Eduard E.; Schnipper, Jeffrey L.

    2013-01-01

    Hospital readmissions are common and costly; this has resulted in their emergence as a key target and quality indicator in the current era of renewed focus on cost containment. However, many concerns remain about the use of readmissions as a hospital quality indicator and about how to reduce hospital readmissions. These concerns stem in part from deficiencies in the state of the science of transitional care. A conceptualization of the “ideal” discharge process could help address these deficiencies and move the state of the science forward. We describe an ideal transition in care, explicate the key components, discuss its implications in the context of recent efforts to reduce readmissions, and suggest next steps for policymakers, researchers, health care administrators, practitioners, and educators. PMID:23184714

  2. [Fragments of nursing history: a knowledge created in the web of the theoretical submission process].

    PubMed

    Corbellini, Valéria Lamb

    2007-01-01

    This research recovers, from the 1950s onwards in the State of Rio Grande do Sul, both the discursive and non-discursive practices through which nursing education was progressively redesigned in pursuit of a profession that would be more scientific and less technicist, and examines how nursing theories took part in this process of transformation and contradiction. The survey involved nurse educators who lived through that period, and discourse analysis was used for the documentary analysis. One of the analyses points out the importance of nursing theories in validating the nurse's know-how between the 1950s and the 1980s.

  3. Not all analogies are created equal: Associative and categorical analogy processing following brain damage

    PubMed Central

    Schmidt, Gwenda L.; Cardillo, Eileen R.; Kranjec, Alexander; Lehet, Matthew; Widick, Page; Chatterjee, Anjan

    2012-01-01

    Current research on analogy processing assumes that different conceptual relations are treated similarly. However, just as words and concepts are related in distinct ways, different kinds of analogies may employ distinct types of relationships. An important distinction in how words are related is the difference between associative (dog-bone) and categorical (dog-cat) relations. To test the hypothesis that analogical mapping of different types of relations would have different neural instantiations, we tested patients with left and right hemisphere lesions on their ability to understand two types of analogies, ones expressing an associative relationship and others expressing a categorical relationship. Voxel-based lesion-symptom mapping (VLSM) and behavioral analyses revealed that associative analogies relied on a large left-lateralized language network while categorical analogies relied on both left and right hemispheres. The verbal nature of the task could account for the left hemisphere findings. We argue that categorical relations additionally rely on the right hemisphere because they are more difficult, abstract, and fragile; and contain more distant relationships. PMID:22402184

  4. Integrated assessment of emerging science and technologies as creating learning processes among assessment communities.

    PubMed

    Forsberg, Ellen-Marie; Ribeiro, Barbara; Heyen, Nils B; Nielsen, Rasmus Øjvind; Thorstensen, Erik; de Bakker, Erik; Klüver, Lars; Reiss, Thomas; Beekman, Volkert; Millar, Kate

    2016-12-01

    Emerging science and technologies are often characterised by complexity, uncertainty and controversy. Regulation and governance of such scientific and technological developments needs to build on knowledge and evidence that reflect this complicated situation. This insight is sometimes formulated as a call for integrated assessment of emerging science and technologies, and such a call is analysed in this article. The article addresses two overall questions. The first is: to what extent are emerging science and technologies currently assessed in an integrated way? The second is: if there appears to be a need for further integration, what should such integration consist in? In the article we briefly outline the pedigree of the term 'integrated assessment' and present a number of interpretations of the concept that are useful for informing current analyses and discussions of integration in assessment. Based on four case studies of assessment of emerging science and technologies, studies of assessment traditions, literature analysis and dialogues with assessment professionals, currently under-developed integration dimensions are identified. It is suggested how these dimensions can be addressed in a practical approach to assessment in which representatives of different assessment communities and stakeholders are involved. We call this approach the Trans Domain Technology Evaluation Process (TranSTEP). PMID:27465504

  5. AFRA confronts gender issues: the process of creating a gender strategy.

    PubMed

    Bydawell, M

    1997-02-01

    The Association for Rural Advancement (AFRA), a nongovernmental organization in South Africa affiliated with the National Land Committee (NLC), seeks to redress the legacy of unjust land dispensation during the apartheid period. AFRA is the first organization within NLC to deal openly with issues of race and gender; this process has been conflictual, however. At gender training workshops conducted by White development workers, many staff expressed the view that sexism is an alien Western issue. Moreover, gender sensitivity was interpreted by Black staff as an assault on their race and cultural identity. The staff itself was polarized on racial grounds, with White managers and Black field workers. Staff further expressed concerns that a gender perspective would dilute AFRA's focus on land reform and alienate rural women who want male household heads to continue to hold the title to their land. The organizational structure was reorganized, though, to become more democratic and racially representative. The 1995 appointment of the first field worker assigned to address women's empowerment in both the organization and target communities refueled the controversy, and a gender workshop led by a psychologist was held to build trust and unity. Staff moved toward a shared understanding of gender as an aspect of social differentiation. AFRA has since committed itself to develop an integrated gender strategy sensitive to people's needs and fears.

  6. Not all analogies are created equal: Associative and categorical analogy processing following brain damage.

    PubMed

    Schmidt, Gwenda L; Cardillo, Eileen R; Kranjec, Alexander; Lehet, Matthew; Widick, Page; Chatterjee, Anjan

    2012-06-01

    Current research on analogy processing assumes that different conceptual relations are treated similarly. However, just as words and concepts are related in distinct ways, different kinds of analogies may employ distinct types of relationships. An important distinction in how words are related is the difference between associative (dog-bone) and categorical (dog-cat) relations. To test the hypothesis that analogical mapping of different types of relations would have different neural instantiations, we tested patients with left and right hemisphere lesions on their ability to understand two types of analogies, ones expressing an associative relationship and others expressing a categorical relationship. Voxel-based lesion-symptom mapping (VLSM) and behavioral analyses revealed that associative analogies relied on a large left-lateralized language network while categorical analogies relied on both left and right hemispheres. The verbal nature of the task could account for the left hemisphere findings. We argue that categorical relations additionally rely on the right hemisphere because they are more difficult, abstract, and fragile, and contain more distant relationships.

  7. Creating and maintaining dialogue on climate information - Reflections on the adaptation process in Sweden

    NASA Astrophysics Data System (ADS)

    Nilsson, C.

    2010-09-01

    Climate information can be communicated to various actors within society in many ways. Dialogue is often emphasised as an important tool for delivering and receiving information, but a dialogue can be initiated and maintained in many different ways. In Sweden, the Swedish Meteorological and Hydrological Institute (SMHI) can be seen as one of the actors between climate science experts and society. Together with the Swedish Geotechnical Institute (SGI), SMHI recognised that the County Administrative Boards required climate information, and knowledge of how to use (and how not to use) climate scenarios, in order to start coordinating regional adaptation activities. At the same time, the national authorities called for a more detailed compilation of the counties' needs regarding climate information for decision making. Starting in the autumn of 2008, SMHI and SGI together visited the 21 counties in Sweden, initiating a dialogue with the County Administrative Boards. The process continued with a first seminar on adaptation in spring 2009. That spring the County Administrative Boards were also appointed as the formal foci of adaptation coordination in Sweden, and a new phase in the dialogue started in which the counties had specific goals for the interaction. Personal visits and seminars have continued to be the arena for the dialogue, and new climate information products have been initiated in response to the interaction. Here, reflections are presented on the role of dialogue in Sweden as a tool towards a sustainable way to communicate climate information.

  8. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System (AWIPS) Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or DARE Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on-the-fly for AWIPS, by using two examples of AWIPS applications that were created by the Applied Meteorology Unit (AMU). The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) and 45th Weather Squadron (45 WS) to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and the 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. Advantages of both file types will be listed.

  9. Degradation Mechanism of Cyanobacterial Toxin Cylindrospermopsin by Hydroxyl Radicals in Homogeneous UV/H2O2 Process

    EPA Science Inventory

    The degradation of cylindrospermopsin (CYN), a widely distributed and highly toxic cyanobacterial toxin (cyanotoxin), remains poorly elucidated. In this study, the mechanism of CYN destruction by UV-254 nm/H2O2 advanced oxidation process (AOP) was investigated by mass spectrometry...

  10. A trapped magnetic field of 3 T in homogeneous, bulk MgB2 superconductors fabricated by a modified precursor infiltration and growth process

    NASA Astrophysics Data System (ADS)

    Bhagurkar, A. G.; Yamamoto, A.; Anguilano, L.; Dennis, A. R.; Durrell, J. H.; Babu, N. Hari; Cardwell, D. A.

    2016-03-01

    The wetting of boron by liquid magnesium is a critical factor in the synthesis of MgB2 bulk superconductors by the infiltration and growth (IG) process. Poor wetting characteristics can therefore potentially result in non-uniform infiltration, the formation of defects in the final sample structure, and poor structural homogeneity throughout the bulk material. Here we report the fabrication of near-net-shaped MgB2 bulk superconductors by a modified precursor infiltration and growth (MPIG) technique. A homogeneous bulk microstructure has subsequently been achieved via the uniform infiltration of liquid Mg, by enriching the green precursor pellet with pre-reacted MgB2 powder as a wetting enhancer, leading to relatively little variation in superconducting properties across the entire bulk sample. Almost identical values of trapped magnetic field of 2.12 T have been measured at 5 K at both the top and bottom surfaces of a sample fabricated by the MPIG process, confirming the uniformity of the bulk microstructure. A maximum trapped field of 3 T has been measured at 5 K at the centre of a stack of two bulk MgB2 samples fabricated using this technique. A steady rise in trapped field was observed for this material with decreasing temperature down to 5 K, without the occurrence of flux avalanches and with a relatively low field decay rate (1.5%/d). These properties are attributed to the presence of a fine distribution of residual Mg within the bulk microstructure generated by the MPIG processing technique.

  11. On the Importance of Processing Conditions for the Nutritional Characteristics of Homogenized Composite Meals Intended for Infants

    PubMed Central

    Östman, Elin; Forslund, Anna; Tareke, Eden; Björck, Inger

    2016-01-01

    The nutritional quality of infant food is an important consideration in the effort to prevent a further increase in the rate of childhood obesity. We hypothesized that the canning of composite infant meals would lead to elevated contents of carboxymethyl-lysine (CML) and favor high glycemic and insulinemic responses compared with milder heat treatment conditions. We have compared composite infant pasta Bolognese meals that were either conventionally canned (CANPBol), or prepared by microwave cooking (MWPBol). A meal where the pasta and Bolognese sauce were separate during microwave cooking (MWP_CANBol) was also included. The infant meals were tested at breakfast in healthy adults using white wheat bread (WWB) as reference. A standardized lunch meal was served at 240 min and blood was collected from fasting to 360 min after breakfast. The 2-h glucose response (iAUC) was lower following the test meals than with WWB. The insulin response was lower after the MWP_CANBol (−47%, p = 0.0000) but markedly higher after CANPBol (+40%, p = 0.0019), compared with WWB. A combined measure of the glucose and insulin responses (ISIcomposite) revealed that MWP_CANBol resulted in 94% better insulin sensitivity than CANPBol. Additionally, the separate processing of the meal components in MWP_CANBol resulted in 39% lower CML levels than the CANPBol. It was therefore concluded that intake of commercially canned composite infant meals leads to reduced postprandial insulin sensitivity and increased exposure to oxidative stress promoting agents. PMID:27271662

  12. On the Importance of Processing Conditions for the Nutritional Characteristics of Homogenized Composite Meals Intended for Infants.

    PubMed

    Östman, Elin; Forslund, Anna; Tareke, Eden; Björck, Inger

    2016-01-01

    The nutritional quality of infant food is an important consideration in the effort to prevent a further increase in the rate of childhood obesity. We hypothesized that the canning of composite infant meals would lead to elevated contents of carboxymethyl-lysine (CML) and favor high glycemic and insulinemic responses compared with milder heat treatment conditions. We have compared composite infant pasta Bolognese meals that were either conventionally canned (CANPBol), or prepared by microwave cooking (MWPBol). A meal where the pasta and Bolognese sauce were separate during microwave cooking (MWP_CANBol) was also included. The infant meals were tested at breakfast in healthy adults using white wheat bread (WWB) as reference. A standardized lunch meal was served at 240 min and blood was collected from fasting to 360 min after breakfast. The 2-h glucose response (iAUC) was lower following the test meals than with WWB. The insulin response was lower after the MWP_CANBol (-47%, p = 0.0000) but markedly higher after CANPBol (+40%, p = 0.0019), compared with WWB. A combined measure of the glucose and insulin responses (ISIcomposite) revealed that MWP_CANBol resulted in 94% better insulin sensitivity than CANPBol. Additionally, the separate processing of the meal components in MWP_CANBol resulted in 39% lower CML levels than the CANPBol. It was therefore concluded that intake of commercially canned composite infant meals leads to reduced postprandial insulin sensitivity and increased exposure to oxidative stress promoting agents. PMID:27271662
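
    The 2-h glucose response quoted above is an incremental area under the curve (iAUC), conventionally computed with the trapezoidal rule while counting only the area above the fasting baseline. A minimal sketch of one common variant follows; the glucose readings are illustrative values, not data from the study.

```python
def iauc(times_min, values, baseline=None):
    """Incremental area under the curve (iAUC): trapezoidal rule applied
    to the part of the response that lies above the fasting baseline."""
    if baseline is None:
        baseline = values[0]          # fasting value at t = 0
    area = 0.0
    for i in range(len(times_min) - 1):
        # clip each segment to the baseline before integrating
        a = max(values[i] - baseline, 0.0)
        b = max(values[i + 1] - baseline, 0.0)
        area += (a + b) / 2.0 * (times_min[i + 1] - times_min[i])
    return area

# Illustrative postprandial glucose readings (mmol/L) over 0-120 min
t = [0, 15, 30, 45, 60, 90, 120]
g = [5.0, 6.2, 7.1, 6.8, 6.0, 5.4, 5.0]
print(round(iauc(t, g), 6))  # → 111.0, in (mmol/L)·min
```

    Some iAUC variants instead interpolate the exact baseline-crossing point within each segment; the clipping used here is a simple approximation of that.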

  13. Is cryopreservation a homogeneous process? Ultrastructure and motility of untreated, prefreezing, and postthawed spermatozoa of Diplodus puntazzo (Cetti).

    PubMed

    Taddei, A R; Barbato, F; Abelli, L; Canese, S; Moretti, F; Rana, K J; Fausto, A M; Mazzini, M

    2001-06-01

    This study subdivides the cryopreservation procedure for Diplodus puntazzo spermatozoa into three key phases, fresh, prefreezing (samples equilibrated in cryosolutions), and postthawed stages, and examines the ultrastructural anomalies and motility profiles of spermatozoa in each stage, with different cryodiluents. Two simple cryosolutions were evaluated: 0.17 M sodium chloride containing a final concentration of 15% dimethyl sulfoxide (Me(2)SO) (cryosolution A) and 0.1 M sodium citrate containing a final concentration of 10% Me(2)SO (cryosolution B). Ultrastructural anomalies of the plasmatic and nuclear membranes of the sperm head were common, and the severity of the cryoinjury differed significantly between the pre- and the postfreezing phases and between the two cryosolutions. In spermatozoa diluted with cryosolution A, during the prefreezing phase, the plasmalemma of 61% of the cells was absent or damaged compared with 24% in the fresh sample (P < 0.001). In spermatozoa diluted with cryosolution B, there was a pronounced increase in the number of cells lacking the head plasmatic membrane from the prefreezing to the postthawed stages (from 32 to 52%, P < 0.01). In both cryosolutions, damage to the nuclear membrane was significantly greater after freezing (cryosolution A: 8 to 23%, P < 0.01; cryosolution B: 5 to 38%, P < 0.001). With cryosolution A, the after-activation motility profile showed a consistent drop from the fresh to the prefreezing stage, whereas freezing and thawing did not affect motility much further, and 50% of the cells were immotile by 60-90 s after activation. With cryosolution B, only the postthawing stage showed a sharp drop in the motility profile. This study suggests that the different phases of the cryoprocess should be investigated to better understand the process of sperm damage. PMID:11748933

  14. ABA Southern Region Burn disaster plan: the process of creating and experience with the ABA southern region burn disaster plan.

    PubMed

    Kearns, Randy D; Cairns, Bruce A; Hickerson, William L; Holmes, James H

    2014-01-01

    The Southern Region of the American Burn Association began to craft a regional plan to address a surge of burn-injured patients after a mass casualty event in 2004. Published in 2006, this plan has been tested through modeling, exercise, and actual events. This article focuses on the process of how the plan was created, how it was tested, and how it interfaces with other ongoing efforts on preparedness. One key to success regarding how people respond to a disaster can be traced to preexisting relationships and collaborations. These activities would include training or working together and building trust long before the crisis. Knowing who you can call and rely on when you need help, within the context of your plan, can be pivotal in successfully managing a disaster. This article describes how a coalition of burn center leaders came together. Their ongoing personal association has facilitated the development of planning activities and has kept the process dynamic. This article also includes several of the building blocks for developing a plan from creation to composition, implementation, and testing. The plan discussed here is an example of linking leadership, relationships, process, and documentation together. On the basis of these experiences, the authors believe these elements are present in other regions. The intent of this work is to share an experience and to offer it as a guide to aid others in their regional burn disaster planning efforts.

  15. ABA Southern Region Burn disaster plan: the process of creating and experience with the ABA southern region burn disaster plan.

    PubMed

    Kearns, Randy D; Cairns, Bruce A; Hickerson, William L; Holmes, James H

    2014-01-01

    The Southern Region of the American Burn Association began to craft a regional plan to address a surge of burn-injured patients after a mass casualty event in 2004. Published in 2006, this plan has been tested through modeling, exercise, and actual events. This article focuses on the process of how the plan was created, how it was tested, and how it interfaces with other ongoing efforts on preparedness. One key to success regarding how people respond to a disaster can be traced to preexisting relationships and collaborations. These activities would include training or working together and building trust long before the crisis. Knowing who you can call and rely on when you need help, within the context of your plan, can be pivotal in successfully managing a disaster. This article describes how a coalition of burn center leaders came together. Their ongoing personal association has facilitated the development of planning activities and has kept the process dynamic. This article also includes several of the building blocks for developing a plan from creation to composition, implementation, and testing. The plan discussed here is an example of linking leadership, relationships, process, and documentation together. On the basis of these experiences, the authors believe these elements are present in other regions. The intent of this work is to share an experience and to offer it as a guide to aid others in their regional burn disaster planning efforts. PMID:23666386

  16. Creating Sub-50 nm Nanofluidic Junctions in PDMS Microfluidic Chip via Self-Assembly Process of Colloidal Particles.

    PubMed

    Wei, Xi; Syed, Abeer; Mao, Pan; Han, Jongyoon; Song, Yong-Ak

    2016-03-13

    Polydimethylsiloxane (PDMS) is the prevailing building material to make microfluidic devices due to its ease of molding and bonding as well as its transparency. Due to the softness of the PDMS material, however, it is challenging to use PDMS for building nanochannels. The channels tend to collapse easily during plasma bonding. In this paper, we present an evaporation-driven self-assembly method of silica colloidal nanoparticles to create nanofluidic junctions with sub-50 nm pores between two microchannels. The pore size as well as the surface charge of the nanofluidic junction is tunable simply by changing the colloidal silica bead size and surface functionalization outside of the assembled microfluidic device in a vial before the self-assembly process. Using the self-assembly of nanoparticles with a bead size of 300 nm, 500 nm, and 900 nm, it was possible to fabricate a porous membrane with a pore size of ~45 nm, ~75 nm and ~135 nm, respectively. Under electrical potential, this nanoporous membrane initiated ion concentration polarization (ICP) acting as a cation-selective membrane to concentrate DNA by ~1,700 times within 15 min. This non-lithographic nanofabrication process opens up a new opportunity to build a tunable nanofluidic junction for the study of nanoscale transport processes of ions and molecules inside a PDMS microfluidic chip.
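
    The pore sizes reported above scale linearly with bead diameter: 45/300, 75/500, and 135/900 each give a pore-to-bead ratio of 0.15. This is close to the ~0.155·d throat diameter of the triangular pore between three touching close-packed spheres, though that geometric comparison is our observation, not a claim from the paper. A quick arithmetic check of the quoted figures:

```python
# Pore-to-bead ratios from the sizes reported above (nm)
bead_nm = [300, 500, 900]
pore_nm = [45, 75, 135]

ratios = [p / b for p, b in zip(pore_nm, bead_nm)]
print(ratios)  # → [0.15, 0.15, 0.15]

# Geometric comparison: the throat of the triangular pore between three
# touching spheres of diameter d has diameter (2/sqrt(3) - 1) * d
throat_ratio = 2 / 3**0.5 - 1
print(round(throat_ratio, 3))  # → 0.155
```

    The consistency of the ratio suggests that swapping the bead size is a predictable handle on the junction's pore size, which matches the tunability the authors describe.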

  17. Creating a high-reliability health care system: improving performance on core processes of care at Johns Hopkins Medicine.

    PubMed

    Pronovost, Peter J; Armstrong, C Michael; Demski, Renee; Callender, Tiffany; Winner, Laura; Miller, Marlene R; Austin, J Matthew; Berenholtz, Sean M; Yang, Ting; Peterson, Ronald R; Reitz, Judy A; Bennett, Richard G; Broccolino, Victor A; Davis, Richard O; Gragnolati, Brian A; Green, Gene E; Rothman, Paul B

    2015-02-01

    In this article, the authors describe an initiative that established an infrastructure to manage quality and safety efforts throughout a complex health care system and that improved performance on core measures for acute myocardial infarction, heart failure, pneumonia, surgical care, and children's asthma. The Johns Hopkins Medicine Board of Trustees created a governance structure to establish health care system-wide oversight and hospital accountability for quality and safety efforts throughout Johns Hopkins Medicine. The Armstrong Institute for Patient Safety and Quality was formed; institute leaders used a conceptual model nested in a fractal infrastructure to implement this initiative to improve performance at two academic medical centers and three community hospitals, starting in March 2012. The initiative aimed to achieve ≥ 96% compliance on seven inpatient process-of-care core measures and meet the requirements for the Delmarva Foundation and Joint Commission awards. The primary outcome measure was the percentage of patients at each hospital who received the recommended process of care. The authors compared health system and hospital performance before (2011) and after (2012, 2013) the initiative. The health system achieved ≥ 96% compliance on six of the seven targeted measures by 2013. Of the five hospitals, four received the Delmarva Foundation award and two received The Joint Commission award in 2013. The authors argue that, to improve quality and safety, health care systems should establish a system-wide governance structure and accountability process. They also should define and communicate goals and measures and build an infrastructure to support peer learning.

  18. Homogeneous processes of atmospheric interest

    NASA Technical Reports Server (NTRS)

    Rossi, M. J.; Barker, J. R.; Golden, D. M.

    1983-01-01

    Upper atmospheric research programs in the department of chemical kinetics are reported. Topics discussed include: (1) third-order rate constants of atmospheric importance; (2) a computational study of the HO2 + HO2 and DO2 + DO2 reactions; (3) measurement and estimation of rate constants for modeling reactive systems; (4) kinetics and thermodynamics of ion-molecule association reactions; (5) entropy barriers in ion-molecule reactions; (6) reaction rate constant for OH + HOONO2 yields products over the temperature range 246 to 324 K; (7) very low-pressure photolysis of tert-butyl nitrite at 248 nm; (8) summary of preliminary data for the photolysis of ClONO2 and N2O5 at 285 nm; and (9) heterogeneous reaction of N2O5 and H2O.

  19. Homogeneity and elemental distribution in self-assembled bimetallic Pd-Pt aerogels prepared by a spontaneous one-step gelation process.

    PubMed

    Oezaslan, M; Liu, W; Nachtegaal, M; Frenkel, A I; Rutkowski, B; Werheid, M; Herrmann, A-K; Laugier-Bonnaud, C; Yilmaz, H-C; Gaponik, N; Czyrska-Filemonowicz, A; Eychmüller, A; Schmidt, T J

    2016-07-27

    Multi-metallic aerogels have recently emerged as a novel and promising class of unsupported electrocatalyst materials due to their high catalytic activity and improved durability for various electrochemical reactions. Aerogels can be prepared by a spontaneous one-step gelation process, where the chemical co-reduction of metal precursors and the prompt formation of nanochain-containing hydrogels, as a preliminary stage for the preparation of aerogels, take place. However, detailed knowledge about the homogeneity and chemical distribution of these three-dimensional Pd-Pt aerogels at the nano-scale as well as at the macro-scale is still unclear. Therefore, we used a combination of spectroscopic and microscopic techniques to obtain a better insight into the structure and elemental distribution of the various Pd-rich Pd-Pt aerogels prepared by the spontaneous one-step gelation process. Synchrotron-based extended X-ray absorption fine structure (EXAFS) spectroscopy and high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) in combination with energy-dispersive X-ray spectroscopy (EDX) were employed in this work to uncover the structural architecture and chemical composition of the various Pd-rich Pd-Pt aerogels over a broad length range. The Pd80Pt20, Pd60Pt40 and Pd50Pt50 aerogels showed heterogeneity in the chemical distribution of the Pt and Pd atoms inside the macroscopic nanochain-network. The features of mono-metallic clusters were not detected by EXAFS or STEM-EDX, indicating alloyed nanoparticles. However, the local chemical composition of the Pd-Pt alloys strongly varied along the nanochains and thus within a single aerogel. To determine the electrochemically active surface area (ECSA) of the Pd-Pt aerogels for application in electrocatalysis, we used the electrochemical CO stripping method. 
Due to their high porosity and extended network structure, the resulting values of the ECSA for the Pd-Pt aerogels were higher than that for

  20. High pressure homogenization processing, thermal treatment and milk matrix affect in vitro bioaccessibility of phenolics in apple, grape and orange juice to different extents.

    PubMed

    He, Zhiyong; Tao, Yadan; Zeng, Maomao; Zhang, Shuang; Tao, Guanjun; Qin, Fang; Chen, Jie

    2016-06-01

    The effects of high pressure homogenization processing (HPHP), thermal treatment (TT) and milk matrix (soy, skimmed and whole milk) on the phenolic bioaccessibility and the ABTS scavenging activity of apple, grape and orange juice (AJ, GJ and OJ) were investigated. HPHP and soy milk diminished AJ's total phenolic bioaccessibility by 29.3% and 26.3%, respectively, whereas TT and bovine milk hardly affected it. HPHP had little effect on GJ's and OJ's total phenolic bioaccessibility, while TT enhanced them by 27.3-33.9% and 19.0-29.2%, respectively, and milk matrix increased them by 26.6-31.1% and 13.3-43.4%, respectively. Furthermore, TT (80 °C/30 min) and TT (90 °C/30 s) had similar influences on GJ's and OJ's phenolic bioaccessibility. Skimmed milk showed a stronger enhancing effect on OJ's total phenolic bioaccessibility than soy and whole milk, but had a similar effect on GJ's as whole milk. These results contribute to promoting the health benefits of fruit juices by optimizing the processing and formulas in the food industry.

  1. Thermomechanical process optimization of U-10wt% Mo – Part 2: The effect of homogenization on the mechanical properties and microstructure

    SciTech Connect

    Joshi, Vineet V.; Nyberg, Eric A.; Lavender, Curt A.; Paxton, Dean M.; Burkes, Douglas E.

    2015-07-09

    Low-enriched uranium alloyed with 10 wt% molybdenum (U-10Mo) is currently being investigated as an alternative fuel for the highly enriched uranium used in several of the United States’ high performance research reactors. Development of the methods to fabricate the U-10Mo fuel plates is currently underway and requires fundamental understanding of the mechanical properties at the expected processing temperatures. In the first part of this series, it was determined that the as-cast U-10Mo had a dendritic microstructure with chemical inhomogeneity and underwent eutectoid transformation during hot compression testing. In the present (second) part of the work, the as-cast samples were heat treated at several temperatures and times to homogenize the Mo content. Like the previous as-cast material, the “homogenized” materials were then tested under compression between 500 and 800°C. The as-cast samples and those treated at 800°C for 24 hours had grain sizes of 25-30 μm, whereas those treated at 1000°C for 16 hours had grain sizes around 250 μm before testing. Upon compression testing, it was determined that the heat treatment had effects on the mechanical properties and the precipitation of the lamellar phase at sub-eutectoid temperatures.

  2. Large-area homogeneous periodic surface structures generated on the surface of sputtered boron carbide thin films by femtosecond laser processing

    NASA Astrophysics Data System (ADS)

    Serra, R.; Oliveira, V.; Oliveira, J. C.; Kubart, T.; Vilar, R.; Cavaleiro, A.

    2015-03-01

Amorphous and crystalline sputtered boron carbide thin films have a very high hardness, even surpassing that of bulk crystalline boron carbide (≈41 GPa). However, magnetron sputtered B-C films have high friction coefficients (C.o.F.), which limit their industrial application. Nanopatterning of material surfaces has been proposed as a solution to decrease the C.o.F.: the contact area of a nanopatterned surface is decreased due to the nanometre size of the asperities, which results in a significant reduction of adhesion and friction. In the present work, the surface of amorphous and polycrystalline B-C thin films deposited by magnetron sputtering was nanopatterned using infrared femtosecond laser radiation. Successive parallel laser tracks 10 μm apart were overlapped in order to obtain a processed area of about 3 mm². Sinusoidal-like undulations with the same spatial period as the laser tracks were formed on the surface of the amorphous boron carbide films after laser processing. The undulation amplitude increases with increasing laser fluence. The formation of undulations with a 10 μm period was also observed on the surface of the crystalline boron carbide film processed with a pulse energy of 72 μJ. The amplitude of these undulations is about 10 times higher than in the amorphous films processed at the same pulse energy, due to the higher roughness of the films and the consequent increase in laser radiation absorption. Laser-induced periodic surface structure (LIPSS) formation on the surface of the films was achieved for the three B-C films under study; however, LIPSS are formed under different circumstances. Processing of the amorphous films at low pulse energy (72 μJ) results in LIPSS formation only on localized spots on the film surface. LIPSS formation was also observed on top of the undulations formed after laser processing with 78 μJ of the amorphous film deposited at 800 °C. Finally, large-area homogeneous LIPSS coverage of the crystalline boron carbide film surface was achieved within a large range

  3. Salting-out homogeneous liquid-liquid extraction approach applied in sample pre-processing for the quantitative determination of entecavir in human plasma by LC-MS.

    PubMed

    Zhao, Feng-Juan; Tang, Hong; Zhang, Qing-Hua; Yang, Jin; Davey, Andrew K; Wang, Ji-Ping

    2012-01-15

A convenient, robust, economical and selective sample preparation method for the quantitative determination of entecavir in human plasma by LC-MS was developed and validated. Entecavir and the internal standard, acyclovir, were extracted from 500 μL of human plasma by salting-out homogeneous liquid-liquid extraction (SHLLE) with acetonitrile as the organic extractant and magnesium sulfate as the salting-out reagent. They were analyzed on a Hanbon® Lichrospher RP C18 HPLC column (150 mm×2.0 mm; 5 μm) with gradient elution. The mobile phase comprised 0.1% acetic acid-0.2 mmol ammonium acetate in water (mobile phase A) and acetonitrile (mobile phase B); the flow rate was 0.2 mL/min. The analytes were detected by an LC-MS 2010 single quadrupole mass spectrometer equipped with an electrospray ionization interface, using selected ion monitoring in positive mode. A "post cut" column-switch technique was incorporated into the method to remove interference from matrix components eluting earlier or later than entecavir and the internal standard, including the salting-out reagent used in sample pre-processing. The method was validated over the concentration range of 0.05-20 ng/mL. The intra-day and inter-day precision of the assay, as measured by the coefficient of variation (%CV), was within 3.59%, and the intra-day assay accuracy was within 4.88%. The average recovery of entecavir was about 50% and the ion suppression was approximately 44% over the standard curve. Comparison of the matrix effect between SHLLE and SPE by continuous post-column infusion showed that the two methods produced similarly slight ion suppression. The SHLLE method has been successfully utilized for the analysis of entecavir in post-dose samples from a clinical study.
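The precision (%CV) and recovery figures reported above are simple ratios over replicate measurements. A minimal sketch of how such figures are computed, using hypothetical replicate peak areas rather than the study's data:

```python
import statistics

def percent_cv(values):
    """Precision as coefficient of variation: sample SD over mean, in percent."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def percent_recovery(extracted, neat):
    """Extraction recovery: mean extracted response over the mean response
    of a neat (post-extraction spiked) standard, in percent."""
    return 100 * statistics.mean(extracted) / statistics.mean(neat)

# Hypothetical replicate peak areas at one QC level (illustrative only)
qc = [0.98, 1.02, 1.01, 0.99, 1.00, 1.03]
cv = percent_cv(qc)
```

A %CV within 3.59%, as quoted, simply means this ratio stays below that bound at every QC level.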

  4. High-pressure homogenization as a non-thermal technique for the inactivation of microorganisms.

    PubMed

    Diels, Ann M J; Michiels, Chris W

    2006-01-01

In the pharmaceutical, cosmetic, chemical, and food industries, high-pressure homogenization is used for the preparation or stabilization of emulsions and suspensions, or for creating physical changes, such as viscosity changes, in products. Another well-known application is cell disruption of yeasts or bacteria in order to release intracellular products such as recombinant proteins. The development over the last few years of homogenizing equipment that operates at increasingly higher pressures has also stimulated research into the possible application of high-pressure homogenization as a unit process for microbial load reduction of liquid products. Several studies have indicated that gram-negative bacteria are more sensitive to high-pressure homogenization than gram-positive bacteria, supporting the widely held belief that high-pressure homogenization kills vegetative bacteria mainly through mechanical disruption. However, controversy exists in the literature regarding the exact cause(s) of cell disruption by high-pressure homogenization. The causes that have been proposed include spatial pressure and velocity gradients, turbulence, cavitation, impact with solid surfaces, and extensional stress. The purpose of this review is to give an overview of the existing literature about microbial inactivation by high-pressure homogenization. Particular attention will be devoted to the different proposed microbial inactivation mechanisms. Further, the different parameters that influence microbial inactivation by high-pressure homogenization will be scrutinized.

  5. Homogenization results for various meteorological elements in the Czech Republic

    NASA Astrophysics Data System (ADS)

    Stepanek, P.; Zahradnicek, P.

    2012-04-01

In many scientific disciplines it is necessary to process long time series of meteorological elements. In recent years considerable attention has also been devoted to the analysis of daily data. Prior to any analysis, the data must be homogenized and their quality checked. Unfortunately, most time series of atmospheric data with a resolution of decades to centuries contain inhomogeneities caused by station relocations, exchange of observers, changes in the vicinity of stations (e.g. urbanization), changes of instruments, observing practices (such as a new formula for calculating the daily average, or different observation times), etc. For the period 1961-2007, 1750 series of seven climatological characteristics were tested for homogeneity (on monthly, seasonal and annual scales) and inhomogeneities were found in 42% of them. This value is underestimated due to the low number of detections in precipitation series, in which breaks were detected in only 15% of series; for all other characteristics, this number was above 50%. Before the homogenization itself, quality control of the subdaily data (for the individual observation hours 7, 14 and 21) was performed and all suspicious values were removed from the time series. In our approach, data quality control is carried out by combining several methods (Štěpánek et al. 2009). Detection of inhomogeneities was performed using monthly, seasonal and annual means (or sums in the case of precipitation and sunshine duration). In the homogenization of the time series, the use of various statistical tests and types of reference series made it possible to increase considerably the number of homogeneity test results for each series tested and thus to assess homogeneity more reliably. The relative homogeneity tests applied were: the Standard Normal Homogeneity Test (SNHT), the Maronna and Yohai bivariate test, and the Easterling and Peterson test. Data were corrected for the inhomogeneities found on the daily scale. We created our own correction method (called DAP
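The SNHT mentioned in the abstract has a simple core. The sketch below illustrates the test statistic on a synthetic series with an artificial break; it is a minimal illustration, not the authors' homogenization software, and the series parameters are invented:

```python
import numpy as np

def snht(series):
    """Standard Normal Homogeneity Test statistic (Alexandersson-style sketch).

    For each candidate break point k, T(k) = k*z1^2 + (n-k)*z2^2, where z1, z2
    are the means of the standardized series before and after k. A large
    maximum of T suggests an inhomogeneity at that position."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    z = (x - x.mean()) / x.std(ddof=1)   # standardize the series
    t = np.empty(n - 1)
    for k in range(1, n):
        z1 = z[:k].mean()                # mean before candidate break
        z2 = z[k:].mean()                # mean after candidate break
        t[k - 1] = k * z1 ** 2 + (n - k) * z2 ** 2
    k_max = int(np.argmax(t)) + 1        # most likely break position
    return t, k_max

# Synthetic annual series with an artificial shift of 1.5 units at index 30
rng = np.random.default_rng(0)
x = rng.normal(10.0, 1.0, 60)
x[30:] += 1.5
t, k = snht(x)
```

In practice the statistic is compared against tabulated critical values, and, as the abstract notes, several tests and reference series are combined before a break is accepted.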

  6. Homogeneous complex networks

    NASA Astrophysics Data System (ADS)

    Bogacz, Leszek; Burda, Zdzisław; Wacław, Bartłomiej

    2006-07-01

    We discuss various ensembles of homogeneous complex networks and a Monte-Carlo method of generating graphs from these ensembles. The method is quite general and can be applied to simulate micro-canonical, canonical or grand-canonical ensembles for systems with various statistical weights. It can be used to construct homogeneous networks with desired properties, or to construct a non-trivial scoring function for problems of advanced motif searching.
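A Monte-Carlo sampler of the kind described can be sketched with Metropolis edge rewiring over graphs with fixed numbers of nodes and edges and a degree-dependent statistical weight. The weight function, update rule, and parameters below are illustrative assumptions, not the authors' exact algorithm:

```python
import random

def mc_graph(n, m, w=lambda q: 1.0, steps=20000, seed=1):
    """Sample a graph with n nodes and m edges, statistical weight
    prod_i w(q_i) over node degrees q_i, via Metropolis edge rewiring."""
    rng = random.Random(seed)
    edges = set()
    while len(edges) < m:                       # random initial simple graph
        a, b = rng.sample(range(n), 2)
        edges.add((min(a, b), max(a, b)))
    deg = [0] * n
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    for _ in range(steps):
        a, b = rng.choice(tuple(edges))         # edge to rewire (keep a, move b)
        c = rng.randrange(n)                    # proposed new endpoint
        e_new = (min(a, c), max(a, c))
        if c in (a, b) or e_new in edges:       # forbid loops and multi-edges
            continue
        # Metropolis acceptance ratio from the degree-dependent weight
        ratio = (w(deg[b] - 1) / w(deg[b])) * (w(deg[c] + 1) / w(deg[c]))
        if rng.random() < min(1.0, ratio):
            edges.remove((min(a, b), max(a, b)))
            edges.add(e_new)
            deg[b] -= 1
            deg[c] += 1
    return edges, deg

edges, deg = mc_graph(50, 100)
```

With w ≡ 1 this samples the micro-canonical ensemble uniformly; choosing a non-trivial w(q) biases the degree distribution, which is the mechanism for constructing homogeneous networks with desired properties.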

  7. Creating Tribes.

    ERIC Educational Resources Information Center

    Robyn, Elisa

    2000-01-01

    Suggests the use of the "tribal" metaphor to foster team building and collaborative learning in college classes. Offers examples of how linking students in the classroom in tribes builds identification and interdependence through such activities as creating a group myth and participating in membership rituals. The tribal metaphor has also led to…

  8. Creating Poetry.

    ERIC Educational Resources Information Center

    Drury, John

    Encouraging exploration and practice, this book offers hundreds of exercises and numerous tips covering every step involved in creating poetry. Each chapter is a self-contained unit offering an overview of material in the chapter, a definition of terms, and poetry examples from well-known authors designed to supplement the numerous exercises.…

  9. Creating Community.

    ERIC Educational Resources Information Center

    Ruane, Patricia; And Others

    1994-01-01

    Brookline (Massachusetts) Public Schools has created a telecommunications network that encourages creative thinking, risk taking, thoughtful practice. Interested parties are advised to identify leadership team; rethink resources; identify potentially successful conference groups; learn to make deals; provide training and ongoing support; expect…

  10. Economy, Culture, Public Policy, and the Urban Underclass. A Discussion of Research on Processes and Mechanisms That Create, Maintain, or Overcome Urban Poverty.

    ERIC Educational Resources Information Center

    Pearson, Robert W.

    1989-01-01

    Research is examining the processes by which persistent and concentrated urban poverty is created, maintained, prevented, or overcome. This paper reports on discussion and suggestions generated in a planning meeting of the Social Science Research Council's Committee for Research on the Urban Underclass held on September 21-23, 1988. Issues…

  11. Ecological and evolutionary consequences of biotic homogenization.

    PubMed

    Olden, Julian D; Leroy Poff, N; Douglas, Marlis R; Douglas, Michael E; Fausch, Kurt D

    2004-01-01

    Biotic homogenization, the gradual replacement of native biotas by locally expanding non-natives, is a global process that diminishes floral and faunal distinctions among regions. Although patterns of homogenization have been well studied, their specific ecological and evolutionary consequences remain unexplored. We argue that our current perspective on biotic homogenization should be expanded beyond a simple recognition of species diversity loss, towards a synthesis of higher order effects. Here, we explore three distinct forms of homogenization (genetic, taxonomic and functional), and discuss their immediate and future impacts on ecological and evolutionary processes. Our goal is to initiate future research that investigates the broader conservation implications of homogenization and to promote a proactive style of adaptive management that engages the human component of the anthropogenic blender that is currently mixing the biota on Earth. PMID:16701221

  12. Creating bulk nanocrystalline metal.

    SciTech Connect

    Fredenburg, D. Anthony; Saldana, Christopher J.; Gill, David D.; Hall, Aaron Christopher; Roemer, Timothy John; Vogler, Tracy John; Yang, Pin

    2008-10-01

Nanocrystalline and nanostructured materials offer unique microstructure-dependent properties that are superior to coarse-grained materials. These materials have been shown to have very high hardness, strength, and wear resistance. However, most current methods of producing nanostructured materials in weapons-relevant materials create powdered metal that must be consolidated into bulk form to be useful. Conventional consolidation methods are not appropriate due to the need to maintain the nanocrystalline structure. This research investigated new ways of creating nanocrystalline material, new methods of consolidating nanocrystalline material, and an analysis of these different methods of creation and consolidation to evaluate their applicability to mesoscale weapons applications where part features are often under 100 μm wide and the material's microstructure must be very small to give homogeneous properties across the feature.

  13. A study of the process of using Pro/ENGINEER geometry models to create finite element models

    SciTech Connect

    Kistler, B.L.

    1997-02-01

Methods for building Pro/ENGINEER models that allowed integration with structural and thermal mesh generation and analysis software without recreating geometry were evaluated. This study was not intended to be an in-depth study of the mechanics of Pro/ENGINEER or of mesh generation or analysis software; instead it was a first-cut attempt to provide recommendations for Sandia personnel that would yield useful analytical models in less time than an analyst would require to create a separate model. The study evaluated a wide variety of geometries built in Pro/ENGINEER and provided general recommendations for designers, drafters, and analysts.

  14. Phase-shifting of correlation fringes created by image processing as an alternative to improve digital shearography

    NASA Astrophysics Data System (ADS)

    Braga, Roberto A.; González-Peña, Rolando J.; Marcon, Marlon; Magalhães, Ricardo R.; Paiva-Almeida, Thiago; Santos, Igor V. A.; Martins, Moisés

    2016-12-01

The adoption of digital speckle pattern shearing interferometry, or speckle shearography, is well known in many areas where one needs to measure in-plane and out-of-plane micro-displacements in biological and non-biological objects; it is based on the Michelson interferometer, with a piezoelectric transducer (PZT) used to phase-shift the fringes and thereby improve the quality of the final image. Creating the shifted images with a PZT, despite its widespread use, has some drawbacks and limitations, such as the cost of the apparatus, the difficulty of applying exactly the same displacement to the mirror repeatedly, and the inability to use mechanical phase-shifting for dynamic object measurement. The aim of this work was to create the phase-shifted images digitally, avoiding the mechanical adjustments of the PZT, and to test them with the digital shearography method. The methodology was tested using a well-known object, an aluminium cantilever beam under deformation. The results documented the ability to create the deformation map and curves with reliability and sensitivity, reducing the cost and improving the robustness and accessibility of digital speckle pattern shearing interferometry.
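However the shifted images are produced, mechanically or digitally, they feed a standard phase-retrieval step. A minimal sketch of the classical four-step algorithm on synthetic fringes (illustrative only, not the authors' shearography data):

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Four-step phase-shifting: given intensities I_k = A + B*cos(phi + k*pi/2)
    for k = 0..3, recover the wrapped phase phi via the arctangent formula."""
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic fringe patterns with a known linear phase ramp
x = np.linspace(0, 4 * np.pi, 256)
phi_true = np.angle(np.exp(1j * x))               # ground truth, wrapped to (-pi, pi]
imgs = [5 + 2 * np.cos(x + k * np.pi / 2) for k in range(4)]
phi = four_step_phase(*imgs)
```

Since I4 - I2 = 2B sin(phi) and I1 - I3 = 2B cos(phi), the background A and modulation B cancel, which is why four shifted frames suffice regardless of illumination level.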

  15. Homogeneity and Entropy

    NASA Astrophysics Data System (ADS)

    Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.

    1990-11-01

RESUMEN (translated): We present a methodology for analyzing homogeneity, based on Information Theory, applicable to samples of observational data. ABSTRACT: Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given data sample. Key words: DATA ANALYSIS
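An entropy-based homogeneity measure of the kind described can be sketched as follows. The binning choice and the interpretation threshold are illustrative assumptions, not the authors' methodology:

```python
import numpy as np

def normalized_entropy(sample, bins=10):
    """Bin the sample, compute the Shannon entropy of the bin frequencies,
    and normalize by log(bins). Values near 1 indicate a homogeneous
    (near-uniform) sample; low values indicate concentration."""
    counts, _ = np.histogram(sample, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                         # empty bins contribute 0 (0*log 0 = 0)
    h = -np.sum(p * np.log(p))
    return h / np.log(bins)

rng = np.random.default_rng(42)
uniform = rng.uniform(0, 1, 10000)       # homogeneous sample
clumped = rng.normal(0.5, 0.05, 10000)   # concentrated sample
```

The uniform sample scores close to 1 while the concentrated one scores noticeably lower, which is the sense in which entropy serves as a homogeneity diagnostic.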

  16. The second phase in creating the cardiac center for the next generation: beyond structure to process improvement.

    PubMed

    Woods, J

    2001-01-01

    The third generation cardiac institute will build on the successes of the past in structuring the service line, re-organizing to assimilate specialist interests, and re-positioning to expand cardiac services into cardiovascular services. To meet the challenges of an increasingly competitive marketplace and complex delivery system, the focus for this new model will shift away from improved structures, and toward improved processes. This shift will require a sound methodology for statistically measuring and sustaining process changes related to the optimization of cardiovascular care. In recent years, GE Medical Systems has successfully applied Six Sigma methodologies to enable cardiac centers to control key clinical and market development processes through its DMADV, DMAIC and Change Acceleration processes. Data indicates Six Sigma is having a positive impact within organizations across the United States, and when appropriately implemented, this approach can serve as a solid foundation for building the next generation cardiac institute. PMID:11765624

  17. Strictly homogeneous laterally complete modules

    NASA Astrophysics Data System (ADS)

    Chilin, V. I.; Karimov, J. A.

    2016-03-01

Let A be a laterally complete commutative regular algebra and X be a laterally complete A-module. In this paper we introduce the notions of homogeneous and strictly homogeneous A-modules. It is proved that any homogeneous A-module is a strictly homogeneous A-module if the Boolean algebra of all idempotents in A is multi-σ-finite.

  18. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    SciTech Connect

Bulutsuz, A. G.; Demircioglu, P.; Bogrekci, I.; Durakbasa, M. N.

    2015-03-30

The interaction between a foreign substance placed into the jaw to eliminate tooth loss and the surrounding organic tissue is a highly complex process: many biological reactions take place, alongside the biomechanical forces that influence this formation. Osseointegration denotes the direct structural and functional connection between living bone and the surface of a load-bearing artificial implant. Considering the requirements of implant manufacturing processes, this study investigates surface characterization with high-precision measurement techniques and emphasizes their importance for the long-term success of dental implants. In this research, detailed surface characterization was performed to identify the dependence of surface properties on the manufacturing technique, using image processing methods, scanning electron microscopy (SEM) for 3D morphological properties, and a Taylor Hobson stylus profilometer for 2D roughness properties. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, highlighting the importance of precise 2D and 3D inspection techniques for surface characterization. Different image analysis techniques, such as the dark-light technique, were used to verify the surface measurement results. The computational phase was performed using the image processing toolbox in Matlab, with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful in the determination of surface roughness. The results showed that the number of black pixels in the image increases with increase in surface

  19. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    NASA Astrophysics Data System (ADS)

    Bulutsuz, A. G.; Demircioglu, P.; Bogrekci, I.; Durakbasa, M. N.; Katiboglu, A. B.

    2015-03-01

The interaction between a foreign substance placed into the jaw to eliminate tooth loss and the surrounding organic tissue is a highly complex process: many biological reactions take place, alongside the biomechanical forces that influence this formation. Osseointegration denotes the direct structural and functional connection between living bone and the surface of a load-bearing artificial implant. Considering the requirements of implant manufacturing processes, this study investigates surface characterization with high-precision measurement techniques and emphasizes their importance for the long-term success of dental implants. In this research, detailed surface characterization was performed to identify the dependence of surface properties on the manufacturing technique, using image processing methods, scanning electron microscopy (SEM) for 3D morphological properties, and a Taylor Hobson stylus profilometer for 2D roughness properties. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, highlighting the importance of precise 2D and 3D inspection techniques for surface characterization. Different image analysis techniques, such as the dark-light technique, were used to verify the surface measurement results. The computational phase was performed using the image processing toolbox in Matlab, with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful in the determination of surface roughness. The results showed that the number of black pixels in the image increases with increase in surface
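The black-pixel counting idea described in the abstract can be sketched as follows. The threshold value and the synthetic images are hypothetical, not the study's implant data:

```python
import numpy as np

def black_pixel_fraction(gray, threshold=128):
    """Binarize a grayscale image at a fixed threshold (hypothetical value)
    and report the fraction of black pixels. In the study this count serves
    as a proxy that grows with surface roughness."""
    binary = gray < threshold            # True where the pixel counts as "black"
    return binary.mean()

# Synthetic stand-ins: a bright, nearly uniform "smooth" surface versus a
# "rough" surface with a wide intensity spread (illustrative only)
rng = np.random.default_rng(7)
smooth = np.full((64, 64), 200) + rng.integers(-10, 10, (64, 64))
rough = rng.integers(0, 256, (64, 64))
```

The smooth image yields a near-zero black fraction while the rough one sits near one half, mirroring the reported trend of black-pixel count increasing with roughness.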

  20. Creating Processes Associated with Providing Government Goods and Services Under the Commercial Space Launch Act at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Letchworth, Janet F.

    2011-01-01

    Kennedy Space Center (KSC) has decided to write its agreements under the Commercial Space Launch Act (CSLA) authority to cover a broad range of categories of support that KSC could provide to our commercial partner. Our strategy was to go through the onerous process of getting the agreement in place once and allow added specificity and final cost estimates to be documented on a separate Task Order Request (TOR). This paper is written from the implementing engineering team's perspective. It describes how we developed the processes associated with getting Government support to our emerging commercial partners, such as SpaceX and reports on our success to date.

  1. Decision-Making Processes of SME in Cloud Computing Adoption to Create Disruptive Innovation: Mediating Effect of Collaboration

    ERIC Educational Resources Information Center

    Sonthiprasat, Rattanawadee

    2014-01-01

THE PROBLEM. The purpose of this quantitative correlation study was to assess the relationship between different Cloud service levels and effective business innovation for SMEs. In addition, the new knowledge gained from the benefits of Cloud adoption with knowledge sharing would enhance the decision-making process for businesses to consider the…

  2. Energy of homogeneous cosmologies

    SciTech Connect

    Nester, James M.; So, L.L.; Vargas, T.

    2008-08-15

    An energy for the homogeneous cosmological models is presented. More specifically, using an appropriate natural prescription, we find the energy within any region with any gravitational source for a large class of gravity theories--namely, those with a tetrad description--for all nine Bianchi types. Our energy is given by the value of the Hamiltonian with homogeneous boundary conditions; this value vanishes for all regions in all Bianchi class A models, and it does not vanish for any class B model. This is so not only for Einstein's general relativity but, moreover, for the whole three-parameter class of tetrad-teleparallel theories. For the physically favored one-parameter subclass, which includes the teleparallel equivalent of Einstein's theory as an important special case, the energy for all class B models is, contrary to expectation, negative.

  3. Challenges in creating an opt-in biobank with a registrar-based consent process and a commercial EHR

    PubMed Central

    Corsmo, Jeremy; Barnes, Michael G; Pollick, Carrie; Chalfin, Jamie; Nix, Jeremy; Smith, Christopher; Ganta, Rajesh

    2012-01-01

    Residual clinical samples represent a very appealing source of biomaterial for translational and clinical research. We describe the implementation of an opt-in biobank, with consent being obtained at the time of registration and the decision stored in our electronic health record, Epic. Information on that decision, along with laboratory data, is transferred to an application that signals to biobank staff whether a given sample can be kept for research. Investigators can search for samples using our i2b2 data warehouse. Patient participation has been overwhelmingly positive and much higher than anticipated. Over 86% of patients provided consent and almost 83% requested to be notified of any incidental research findings. In 6 months, we obtained decisions from over 18 000 patients and processed 8000 blood samples for storage in our research biobank. However, commercial electronic health records like Epic lack key functionality required by a registrar-based consent process, although workarounds exist. PMID:22878682

  4. Homogeneous and inhomogeneous eddies

    SciTech Connect

    Pavia, E.G.

    1994-12-31

This work deals with mesoscale warm oceanic eddies, i.e., self-contained bodies of water that transport heat, among other things, for several months and over several hundreds of kilometers. This heat transport is believed to play an important role in the atmospheric and oceanic conditions of the region through which it passes. Here the author examines the difference in evolution between eddies modeled as blobs of homogeneous water and eddies in which density varies in the horizontal. Preliminary results suggest that instability is enhanced by inhomogeneities, which would imply that traditional modeling studies based on homogeneous vortices have underestimated the rate of heat release from oceanic eddies to their surroundings. The modeling approach is the simplest possible, i.e., a single active layer. Although previous studies have shown the drastic effect on stability of two or more dynamically relevant homogeneous layers, the author believes the single-layer eddy model has not been investigated thoroughly.

  5. The application of fluorescent layers, created by the sol-gel process, in an optical cryogenic temperature sensor

    NASA Astrophysics Data System (ADS)

    Bertrand, S.; Bresson, F.; Audebert, P.; Tribillon, G.

    1995-02-01

The sol-gel process is used to elaborate a fluorescent layer containing powdered SrF2:Yb2+ crystal. From this gel, the luminescent properties of SrF2:Yb2+ are exploited to realize a cryogenic temperature sensor. Comparative tests with other fluorescent layers using classical binders such as glues show that the gel is a very promising tool for this sensor application.

  6. Observing and documenting the snow surface processes creating the isotopic signal in the snow at Summit, Greenland

    NASA Astrophysics Data System (ADS)

    Steen-Larsen, H. C.; Noone, D.; Berkelhammer, M.; Schneider, D.; White, J.; Steffen, K.

    2012-04-01

Only very limited understanding exists of the physical processes influencing the formation of the isotopic signal observed in the snow in Greenland and Antarctica. Current knowledge is to a large extent based on more or less ad hoc assumptions and observed empirical relations. During the spring of 2011 a suite of state-of-the-art instruments was installed at the NSF-operated station, Summit, on top of the Greenland Ice Sheet. The instrument package includes measurements performed at several heights (from 0.1 m to 50 m) above the snow surface by sonic anemometers, high-precision temperature sensors, particle size and shape spectrometers, and isotopic water vapor spectrometers. To support the interpretation of the above-surface measurements, an array of temperature and pressure sensors as well as inlets for measuring the interstitial isotopic water vapor composition were installed to a depth of 1.0 m. We present here the setup and the preliminary results obtained from the installed suite of instruments, together with the outlook for these observations. In particular we focus on the following three questions: 1) What is the variation in isotopic composition caused by changes in source conditions? 2) What is the influence of differing cloud microphysics on the isotopic composition of snow? 3) To what degree are aspects of atmospheric hydrology masked in the ice core record by post-depositional processes? The instruments installed at Summit are planned to be continuously operational for the following three years, thereby providing key information on year-round processes.

  7. The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.

    2010-01-01

    The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.
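The transit signal that such simulations inject can be illustrated with a toy box-shaped light-curve model. This is a deliberately oversimplified sketch with invented parameters; ETEM's astrophysical model is far more detailed (limb darkening, noise sources, spacecraft effects):

```python
import numpy as np

def box_transit(t, period, t0, duration, depth):
    """Toy box-shaped transit: flux drops by `depth` for `duration` around
    each transit epoch t0 + n*period; flux is 1 otherwise."""
    phase = ((t - t0 + period / 2) % period) - period / 2   # fold to (-P/2, P/2]
    flux = np.ones_like(t)
    flux[np.abs(phase) < duration / 2] -= depth
    return flux

# Hypothetical target: 2.5-day period, 1% deep, 0.2-day transits
t = np.linspace(0.0, 10.0, 1001)
flux = box_transit(t, period=2.5, t0=1.0, duration=0.2, depth=0.01)
```

Injecting known signals like this into simulated pixel data is what lets the ground-segment pipeline be tested against a known ground truth.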

  8. Effect of homogenization techniques on reducing the size of microcapsules and the survival of probiotic bacteria therein.

    PubMed

    Ding, W K; Shah, N P

    2009-08-01

This study investigated 2 different homogenization techniques for reducing the size of calcium alginate beads during the microencapsulation of 8 probiotic bacterial strains, namely Lactobacillus rhamnosus, L. salivarius, L. plantarum, L. acidophilus, L. paracasei, Bifidobacterium longum, B. lactis type Bi-04, and B. lactis type Bi-07. Two homogenization devices were used: an Ultra-Turrax benchtop homogenizer and a Microfluidics microfluidizer. Various settings of the homogenization equipment were studied, such as the number of passes, speed (rpm), duration (min), and pressure (psi). The traditional mixing method using a magnetic stirrer was used as a control. The sizes of the microcapsules produced by each homogenization technique and setting were measured using a light microscope and a stage micrometer. The smallest capsules (measuring 31.2 μm) were created with the microfluidizer using 26 passes at 1200 psi for 40 min. The greatest loss in viability, 3.21 log CFU/mL, was observed when using the Ultra-Turrax benchtop homogenizer at 1300 rpm for 5 min. Overall, both homogenization techniques reduced capsule sizes; however, homogenization settings at high rpm also greatly reduced the viability of the probiotic organisms.
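The viability loss quoted in log CFU/mL units is a base-10 logarithmic ratio of counts before and after treatment; a minimal sketch with hypothetical counts:

```python
import math

def log_cfu_loss(cfu_before, cfu_after):
    """Viability loss in log CFU/mL: log10 of the before/after count ratio.
    A loss of 3.21 log units means counts fell by a factor of 10**3.21."""
    return math.log10(cfu_before) - math.log10(cfu_after)
```

For example, a drop from 1e9 to about 6.2e5 CFU/mL corresponds to the 3.21-log loss reported for the high-rpm treatment.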

  9. Restoration of overwash processes creates piping plover (Charadrius melodus) habitat on a barrier island (Assateague Island, Maryland)

    NASA Astrophysics Data System (ADS)

    Schupp, Courtney A.; Winn, Neil T.; Pearl, Tami L.; Kumer, John P.; Carruthers, Tim J. B.; Zimmerman, Carl S.

    2013-01-01

    On Assateague Island, an undeveloped barrier island along Maryland and Virginia, a foredune was constructed to protect the island from the erosion and breaching threat caused by permanent jetties built to maintain Ocean City Inlet. Scientists and engineers integrated expertise in vegetation, wildlife, geomorphology, and coastal engineering in order to design a habitat restoration project that would be evaluated in terms of coastal processes rather than static features. Development of specific restoration targets, thresholds for intervention, and criteria to evaluate long-term project success were based on biological and geomorphological data and coastal engineering models. A detailed long-term monitoring plan was established to measure project sustainability. The foredune unexpectedly acted as a near-total barrier to both overwash and wind, and the dynamic ecosystem underwent undesirable habitat changes, including conversion of early-succession beach habitat to herbaceous and shrub communities, diminishing availability of foraging habitat and thereby reducing productivity of the Federally listed Threatened Charadrius melodus (piping plover). To address these impacts, multiple notches were cut through the constructed foredune. The metric for initial geomorphological success (restoration of at least one overwash event per year across the constructed foredune, if overwash occurred elsewhere on the island) was reached. New overwash fans increased island stability by increasing interior island elevation. At every notch, areas of sparse vegetation increased and the new foraging habitat was utilized by breeding pairs during the 2010 breeding season. However, the metric for long-term biological success (an increase to 37% sparsely vegetated habitat on the North End and an increase in piping plover productivity to 1.25 chicks fledged per breeding pair) has not yet been met. By 2010 there was an overall productivity of 1.2 chicks fledged per breeding pair and a 1.7% decrease in sparsely

  10. A model cerium oxide matrix composite reinforced with a homogeneous dispersion of silver particulate - prepared using the glycine-nitrate process

    SciTech Connect

    Weil, K. Scott; Hardy, John S.

    2005-01-31

    Recently a new method of ceramic brazing has been developed. Based on a two-phase liquid composed of silver and copper oxide, brazing is conducted directly in air without the need for an inert cover gas or surface-reactive fluxes. Because the braze displays excellent wetting characteristics on a number of ceramic surfaces, including alumina, various perovskites, zirconia, and ceria, we were interested in investigating whether a metal-reinforced ceramic matrix composite (CMC) could be developed with this material. In the present study, two sets of homogeneously mixed silver/copper oxide/ceria powders were synthesized using a combustion synthesis technique. The powders were compacted and heat treated in air above the liquidus temperature for the chosen Ag-CuO composition. Metallographic analysis indicates that the resulting composite microstructures are extremely uniform with respect to both the size of the metallic reinforcement and its spatial distribution within the ceramic matrix. The size, morphology, and spacing of the metal particulate in the densified composite appear to be dependent on the original size and structure of the starting combustion-synthesized powders.

  11. HOMOGENEOUS NUCLEAR POWER REACTOR

    DOEpatents

    King, L.D.P.

    1959-09-01

    A homogeneous nuclear power reactor utilizing forced circulation of the liquid fuel is described. The reactor does not require fuel handling outside of the reactor vessel during any normal operation, including complete shutdown to room temperature, the reactor being self-regulating under extreme operating conditions and controlled by the thermal expansion of the liquid fuel. The liquid fuel utilized is a uranium, phosphoric acid, and water solution which requires no gas exhaust system or independent gas recombining system, thereby eliminating the handling of radiolytic gas.

  12. Heterogeneous nucleation or homogeneous nucleation?

    NASA Astrophysics Data System (ADS)

    Liu, X. Y.

    2000-06-01

    The generic heterogeneous effect of foreign particles on three-dimensional nucleation was examined both theoretically and experimentally. It shows that the nucleation observed under normal conditions includes a sequence of progressive heterogeneous processes, characterized by different interfacial correlation functions f(m,x). At low supersaturations, nucleation will be controlled by the process with a small interfacial correlation function f(m,x), which results from a strong interaction and good structural match between the foreign bodies and the crystallizing phase. At high supersaturations, nucleation on foreign particles having a weak interaction and poor structural match with the crystallizing phase (f(m,x)→1) will govern the kinetics. This frequently leads to the false identification of homogeneous nucleation. Genuine homogeneous nucleation, which is the upper limit of heterogeneous nucleation, may not be easily achievable under gravity. In order to check these results, the prediction is confronted with nucleation experiments on some organic and inorganic crystals. The results are in excellent agreement with the theory.

  13. Experimental Simulation of the Radionuclide Behaviour in the Process of Creating Additional Safety Barriers in Solid Radioactive Waste Repositories Containing Irradiated Graphite

    NASA Astrophysics Data System (ADS)

    Pavliuk, A. O.; Kotlyarevskiy, S. G.; Bespala, E. V.; Zakarova, E. V.; Rodygina, N. I.; Ermolaev, V. M.; Proshin, I. M.; Volkova, A.

    2016-08-01

    Results of the experimental modeling of radionuclide behavior when creating additional safety barriers in solid radioactive waste repositories are presented. The experiments were run on the repository mockup containing solid radioactive waste fragments including irradiated graphite. The repository mockup layout is given; the processes with radionuclides that occur during the barrier creation with a clayey solution and during the following barrier operation are investigated. The results obtained confirm high anti-migration and anti-filtration properties of clay used for the barrier creation even under the long-term excessive water saturation of rocks confining the repository.

  14. Homogeneous quantum electrodynamic turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    1992-01-01

    The electromagnetic field equations and Dirac equations for oppositely charged wave functions are numerically time-integrated using a spatial Fourier method. The numerical approach used, a spectral transform technique, is based on a continuum representation of physical space. The coupled classical field equations contain a dimensionless parameter which sets the strength of the nonlinear interaction (as the parameter increases, interaction volume decreases). For a parameter value of unity, highly nonlinear behavior in the time-evolution of an individual wave function, analogous to ideal fluid turbulence, is observed. In the truncated Fourier representation which is numerically implemented here, the quantum turbulence is homogeneous but anisotropic and manifests itself in the nonlinear evolution of equilibrium modal spatial spectra for the probability density of each particle and also for the electromagnetic energy density. The results show that nonlinearly interacting fermionic wave functions quickly approach a multi-mode, dynamic equilibrium state, and that this state can be determined by numerical means.

  15. HOMOGENEOUS NUCLEAR REACTOR

    DOEpatents

    Hammond, R.P.; Busey, H.M.

    1959-02-17

    Nuclear reactors of the homogeneous liquid fuel type are discussed. The reactor is comprised of an elongated closed vessel, vertically oriented, having a critical region at the bottom, a lower chimney structure extending from the critical region vertically upwardly and surrounded by heat exchanger coils, to a baffle region above which is located an upper chimney structure containing a catalyst functioning to recombine radiolytically dissociated moderator gases. In operation the liquid fuel circulates solely by convection from the critical region upwardly through the lower chimney and then downwardly through the heat exchanger to return to the critical region. The gases formed by radiolytic dissociation of the moderator are carried upwardly with the circulating liquid fuel and past the baffle into the region of the upper chimney where they are recombined by the catalyst and condensed, thence returning through the heat exchanger to the critical region.

  16. Homogeneous quantum electrodynamic turbulence

    SciTech Connect

    Shebalin, J.V.

    1992-10-01

    The electromagnetic field equations and Dirac equations for oppositely charged wave functions are numerically time-integrated using a spatial Fourier method. The numerical approach used, a spectral transform technique, is based on a continuum representation of physical space. The coupled classical field equations contain a dimensionless parameter which sets the strength of the nonlinear interaction (as the parameter increases, interaction volume decreases). For a parameter value of unity, highly nonlinear behavior in the time-evolution of an individual wave function, analogous to ideal fluid turbulence, is observed. In the truncated Fourier representation which is numerically implemented here, the quantum turbulence is homogeneous but anisotropic and manifests itself in the nonlinear evolution of equilibrium modal spatial spectra for the probability density of each particle and also for the electromagnetic energy density. The results show that nonlinearly interacting fermionic wave functions quickly approach a multi-mode, dynamic equilibrium state, and that this state can be determined by numerical means.

  17. Homogeneity study of candidate reference material in fish matrix

    NASA Astrophysics Data System (ADS)

    Ulrich, J. C.; Sarkis, J. E. S.; Hortellani, M. A.

    2015-01-01

    A material is perfectly homogeneous with respect to a given characteristic, or composition, if there is no difference between the values obtained from one part to another. Homogeneity is usually evaluated using analysis of variance (ANOVA). However, the requirement that the populations of data to be processed have a normal distribution and equal variances greatly limits the use of this statistical tool. A more suitable test for assessing the homogeneity of reference materials (RMs), known as "sufficient homogeneity", was proposed by Fearn and Thompson. In this work, we evaluate the performance of the two statistical treatments for assessing the homogeneity of methylmercury (MeHg) in a candidate reference material of fish tissue.
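
    The classical ANOVA check described above can be sketched in a few lines. This is a minimal pure-Python illustration, not the authors' procedure; the duplicate MeHg values (mg/kg, five hypothetical bottles) are invented for demonstration.

```python
def one_way_anova_f(groups):
    """Return the one-way ANOVA F statistic: between- vs. within-group variance."""
    all_values = [x for g in groups for x in g]
    grand_mean = sum(all_values) / len(all_values)
    group_means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    df_between = len(groups) - 1
    df_within = len(all_values) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Invented duplicate measurements for five bottles of a candidate material.
bottles = [[0.41, 0.43], [0.42, 0.40], [0.44, 0.42], [0.40, 0.41], [0.43, 0.44]]
f_stat = one_way_anova_f(bottles)
# Compare f_stat against a tabulated critical value (F(4, 5) at the 5% level
# is about 5.19): an F below it raises no between-bottle inhomogeneity flag.
```

    Note that this check still inherits the normality and equal-variance assumptions that motivate the "sufficient homogeneity" alternative discussed above.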

  18. Homogeneous nucleation of nitrogen

    NASA Astrophysics Data System (ADS)

    Iland, Kristina; Wedekind, Jan; Wölk, Judith; Strey, Reinhard

    2009-03-01

    We investigated the homogeneous nucleation of nitrogen in a cryogenic expansion chamber [A. Fladerer and R. Strey, J. Chem. Phys. 124, 164710 (2006)]. Gas mixtures of nitrogen and helium as carrier gas were adiabatically expanded and cooled down from an initial temperature of 83 K until nucleation occurred. This onset was detected by constant-angle light scattering at nitrogen vapor pressures of 1.3-14.2 kPa and temperatures of 42-54 K. An analytical fit function describes the experimental onset pressures well, with an error of ±15%. We estimate the size of the critical nucleus with the Gibbs-Thomson equation, yielding critical sizes of about 50 molecules at the lowest and 70 molecules at the highest temperature. In addition, we estimate the nucleation rate and compare it with nucleation theories. The predictions of classical nucleation theory (CNT) are 9 to 19 orders of magnitude below the experimental results and show a stronger temperature dependence. The Reguera-Reiss theory [Phys. Rev. Lett. 93, 165701 (2004)] predicts the correct temperature dependence at low temperatures and decreases the absolute deviation to 7-13 orders of magnitude. We present an empirical correction function to CNT describing our experimental results. These correction parameters are remarkably close to the ones of argon [Iland et al., J. Chem. Phys. 127, 154506 (2007)] and even those of water [J. Wölk and R. Strey, J. Phys. Chem. B 105, 11683 (2001)].
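
    The Gibbs-Thomson estimate of critical cluster size used above can be sketched as follows. All parameter values (surface tension, molecular volume, temperature, supersaturation) are illustrative placeholders, not the paper's fitted values.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def critical_cluster_size(sigma, v_mol, temperature, supersaturation):
    """Gibbs-Thomson: r* = 2*sigma*v_mol / (k_B * T * ln S);
    n* is the number of molecules in a sphere of radius r*."""
    r_star = 2.0 * sigma * v_mol / (K_B * temperature * math.log(supersaturation))
    n_star = (4.0 / 3.0) * math.pi * r_star ** 3 / v_mol
    return r_star, n_star

# Illustrative inputs: sigma ~ 0.01 J/m^2, molecular volume ~ 5.3e-29 m^3
# (roughly liquid nitrogen), T = 50 K, at two supersaturation ratios.
r_lo, n_lo = critical_cluster_size(0.01, 5.3e-29, 50.0, 5.0)
r_hi, n_hi = critical_cluster_size(0.01, 5.3e-29, 50.0, 20.0)
# Raising the supersaturation shrinks the critical nucleus (n_hi < n_lo),
# with critical sizes of tens of molecules in this parameter regime.
```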

  19. Universum Inference and Corpus Homogeneity

    NASA Astrophysics Data System (ADS)

    Vogel, Carl; Lynch, Gerard; Janssen, Jerom

    Universum Inference is re-interpreted for assessment of corpus homogeneity in computational stylometry. Recent stylometric research quantifies strength of characterization within dramatic works by assessing the homogeneity of corpora associated with dramatic personas. A methodological advance is suggested to mitigate the potential for the assessment of homogeneity to be achieved by chance. Baseline comparison analysis is constructed for contributions to debates by nonfictional participants: the corpus analyzed consists of transcripts of US Presidential and Vice-Presidential debates from the 2000 election cycle. The corpus is also analyzed in translation to Italian, Spanish and Portuguese. Adding randomized categories makes assessments of homogeneity more conservative.

  20. Reciprocity theory of homogeneous reactions

    NASA Astrophysics Data System (ADS)

    Agbormbai, Adolf A.

    1990-03-01

    The reciprocity formalism is applied to the homogeneous gaseous reactions in which the structure of the participating molecules changes upon collision with one another, resulting in a change in the composition of the gas. The approach is applied to various classes of dissociation, recombination, rearrangement, ionizing, and photochemical reactions. It is shown that for the principle of reciprocity to be satisfied it is necessary that all chemical reactions exist in complementary pairs which consist of the forward and backward reactions. The backward reaction may be described by either the reverse or inverse process. The forward and backward processes must satisfy the same reciprocity equation. Because the number of dynamical variables is usually unbalanced on both sides of a chemical equation, it is necessary that this balance be established by including as many of the dynamical variables as needed before the reciprocity equation can be formulated. Statistical transformation models of the reactions are formulated. The models are classified under the titles free exchange, restricted exchange and simplified restricted exchange. The special equations for the forward and backward processes are obtained. The models are consistent with the H theorem and Le Chatelier's principle. The models are also formulated in the context of the direct simulation Monte Carlo method.

  1. Homogeneous Catalysis by Transition Metal Compounds.

    ERIC Educational Resources Information Center

    Mawby, Roger

    1988-01-01

    Examines four processes involving homogeneous catalysis which highlight the contrast between the simplicity of the overall reaction and the complexity of the catalytic cycle. Describes how catalysts provide circuitous routes in which all energy barriers are relatively low rather than lowering the activation energy for a single step reaction.…

  2. Shear wave splitting hints at dynamical features of mantle convection: a global study of homogeneously processed source and receiver side upper mantle anisotropy

    NASA Astrophysics Data System (ADS)

    Walpole, J.; Wookey, J. M.; Masters, G.; Kendall, J. M.

    2013-12-01

    The asthenosphere is embroiled in the process of mantle convection. Its viscous properties allow it to flow around sinking slabs and deep cratonic roots as it is displaced by intruding material and dragged around by the moving layer above. As the asthenosphere flows it develops a crystalline fabric with anisotropic crystals preferentially aligned in the direction of flow. Meanwhile, the lithosphere above deforms as it is squeezed and stretched by underlying tectonic processes, enabling anisotropic fabrics to develop and become fossilised in the rigid rock and to persist over vast spans of geological time. As a shear wave passes through an anisotropic medium it splits into two orthogonally polarised quasi shear waves that propagate at different velocities (this phenomenon is known as shear wave splitting). By analysing the polarisation and the delay time of many split waves that have passed through a region it is possible to constrain the anisotropy of the medium in that region. This anisotropy is the key to revealing the deformation history of the deep Earth. In this study we present measurements of shear wave splitting recorded on S, SKS, and SKKS waves from earthquakes recorded at stations from the IRIS DMC catalogue (1976-2010). We have used a cluster analysis phase picking technique [1] to pick hundreds of thousands of high signal to noise waveforms on long period data. These picks are used to feed the broadband data into an automated processing workflow that recovers shear wave splitting parameters [2,3]. The workflow includes a new method for making source and receiver corrections, whereby the stacked error surfaces are used as input to correction rather than a single set of parameters, this propagates uncertainty information into the final measurement. Using SKS, SKKS, and source corrected S, we recover good measurements of anisotropy beneath 1,569 stations. Using receiver corrected S we recover good measurements of anisotropy beneath 470 events. 
We compare

  3. A homogeneous fluorometric assay platform based on novel synthetic proteins

    SciTech Connect

    Vardar-Schara, Goenuel; Krab, Ivo M.; Yi, Guohua; Su, Wei Wen . E-mail: wsu@hawaii.edu

    2007-09-14

    Novel synthetic recombinant sensor proteins have been created to detect analytes in solution, in a rapid single-step 'mix and read' noncompetitive homogeneous assay process, based on modulating the Foerster resonance energy transfer (FRET) property of the sensor proteins upon binding to their targets. The sensor proteins comprise a protein scaffold that incorporates a specific target-capturing element, sandwiched by genetic fusion between two molecules that form a FRET pair. The utility of the sensor proteins was demonstrated via three examples, for detecting an anti-biotin Fab antibody, a His-tagged recombinant protein, and an anti-FLAG peptide antibody, respectively, all done directly in solution. The diversity of sensor-target interactions that we have demonstrated in this study points to a potentially universal applicability of the biosensing concept. The possibilities for integrating a variety of target-capturing elements with a common sensor scaffold predict a broad range of practical applications.

  4. Effects of sample homogenization on solid phase sediment toxicity

    SciTech Connect

    Anderson, B.S.; Hunt, J.W.; Newman, J.W.; Tjeerdema, R.S.; Fairey, W.R.; Stephenson, M.D.; Puckett, H.M.; Taberski, K.M.

    1995-12-31

    Sediment toxicity is typically assessed using homogenized surficial sediment samples. It has been recognized that homogenization alters sediment integrity and may result in changes in chemical bioavailability through oxidation-reduction or other chemical processes. In this study, intact (unhomogenized) sediment cores were taken from a Van Veen grab sampler and tested concurrently with sediment homogenate from the same sample in order to investigate the effect of homogenization on toxicity. Two different solid-phase toxicity test protocols were used for these comparisons. Results of amphipod exposures to samples from San Francisco Bay indicated minimal difference between intact and homogenized samples. Mean amphipod survival in intact cores relative to homogenates was similar at two contaminated sites. Mean survival was 34 and 33% in intact and homogenized samples, respectively, at Castro Cove. Mean survival was 41% and 57%, respectively, in intact and homogenized samples from Islais Creek. Studies using the sea urchin development protocol, modified for testing at the sediment/water interface, indicated considerably more toxicity in intact samples relative to homogenized samples from San Diego Bay. Measures of metal flux into the overlying water demonstrated greater flux of metals from the intact samples. Zinc flux was five times greater, and copper flux was twice as great in some intact samples relative to homogenates. Future experiments will compare flux of metals and organic compounds in intact and homogenized sediments to further evaluate the efficacy of using intact cores for solid phase toxicity assessment.

  5. Doublet-mechanical approach to elastic homogenization

    SciTech Connect

    Ferrari, M.; Hanford, D.

    1996-10-01

    The process of deducing the overall properties of multi-phase media from phase properties and distributional data is referred to as homogenization. Two prominent homogenization modes are (1) the so-called direct, or concentrator-based approaches; and (2) the so-called mathematical homogenization, or cell-based method. Within the direct method one can classify the Eshelby, the Mori-Tanaka, the Voigt, the Reuss, and the poly-inclusion approaches. As was proven by one of the authors (MF) in recent publications, none of the existing approaches satisfies even the most elementary admissibility criteria for the general bi-phase composite, i.e., the search for general concentrators is still far from complete. The mathematical homogenization method, developed by Tartar and Sanchez-Palencia among others, reduces the overall effective property prediction to the numerical solution of a representative cell problem. In this paper, the methods of the Doublet Mechanics (DM) of V.T. Granik and M. Ferrari are employed to address both the concentrator problem of the direct approach, and the cell problem of mathematical homogenization. In the former, a choice of macroscopic concentrator is determined exactly from the closed-form solution of a micromechanical problem. The latter problem is solved by identifying the representative micro-level volume with an assembly of points with translational regularity, and employing the discrete-continuum transition that underlies DM.
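
    Among the direct approaches named above, the Voigt and Reuss rules are the simplest concentrator choices: isostrain and isostress averages of the phase properties. A minimal two-phase sketch (the moduli and volume fraction below are illustrative, not taken from the paper):

```python
def voigt_modulus(f1, e1, e2):
    """Isostrain (rule-of-mixtures) estimate: upper bound on effective modulus."""
    return f1 * e1 + (1.0 - f1) * e2

def reuss_modulus(f1, e1, e2):
    """Isostress (inverse rule-of-mixtures) estimate: lower bound."""
    return 1.0 / (f1 / e1 + (1.0 - f1) / e2)

# Illustrative bi-phase composite: 30% stiff phase (400 GPa) in a 70 GPa matrix.
e_voigt = voigt_modulus(0.3, 400.0, 70.0)
e_reuss = reuss_modulus(0.3, 400.0, 70.0)
# Any admissible effective modulus must lie between the Reuss and Voigt values,
# a basic bound that any homogenization scheme should respect.
```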

  6. STEAM STIRRED HOMOGENEOUS NUCLEAR REACTOR

    DOEpatents

    Busey, H.M.

    1958-06-01

    A homogeneous nuclear reactor utilizing a self-circulating liquid fuel is described. The reactor vessel is in the form of a vertically disposed tubular member having the lower end closed by the tube walls and the upper end closed by a removable flanged assembly. A spherical reaction shell is located in the lower end of the vessel and spaced from the inside walls. The reaction shell is perforated on its lower surface and is provided with a bundle of small-diameter tubes extending vertically upward from its top central portion. The reactor vessel is surrounded in the region of the reaction shell by a neutron reflector. The liquid fuel, which may be a solution of enriched uranyl sulfate in ordinary or heavy water, is maintained at a level within the reactor vessel of approximately the top of the tubes. The heat of the reaction which is created in the critical region within the spherical reaction shell forms steam bubbles which move upwardly through the tubes. The upward movement of these bubbles results in the forcing of the liquid fuel out of the top of these tubes, from where the fuel passes downwardly in the space between the tubes and the vessel wall where it is cooled by heat exchangers. The fuel then re-enters the critical region in the reaction shell through the perforations in the bottom. The upper portion of the reactor vessel is provided with baffles to prevent the liquid fuel from splashing into this region, which is also provided with a recombiner apparatus for recombining the radiolytically dissociated moderator vapor and a control means.

  7. Creating standard resistors based on germanium and silicon single crystals grown under microgravity conditions

    NASA Astrophysics Data System (ADS)

    Kartavykh, A. V.; Rakov, V. V.

    2006-11-01

    Requirements on the creation of standard resistors (SRs), which are necessary for the calibration of microprobe complexes used for the diagnostics of electrical homogeneity of single crystal semiconductors, are considered. SR prototypes have been created based on Sb-doped Ge single crystals grown by float zone melting under microgravity conditions aboard the Photon series satellites, in which the inhomogeneity of the resistivity distribution does not exceed 1%. The main factors influencing the homogeneity of doping for Ge and Si crystals grown from melt under orbital flight conditions are formulated; methods for the optimization of this technological process are described.

  8. Homogeneous global mean temperature time series

    SciTech Connect

    Peterson, T.C.; Easterling, D.R.; Vose, R.S.; Eischeid, J.K.

    1993-11-01

    A multi-agency effort has been underway to create a homogeneous global baseline data set suitable for studying climate change. The joint release of the Global Historical Climatology Network (GHCN; Vose et al., 1992) version 1 in 1992 by the National Climatic Data Center/NOAA and the Carbon Dioxide Information Analysis Center/DOE gave the climate research community the largest monthly land surface global climate data set available to date, with over 6,000 temperature stations, 39% of which have more than 50 years of data and 10% of which have more than 100 years of data (see Figure 1). Fifteen different global or regional data sets were merged to create GHCN version 1. Ten of these source data sets have temperature data, but only two have been tested and adjusted for inhomogeneities in the station time series. The majority of the station temperature time series in GHCN have not been systematically examined for discontinuities.

  9. Rare earth elements in fly ashes created during the coal burning process in certain coal-fired power plants operating in Poland - Upper Silesian Industrial Region.

    PubMed

    Smolka-Danielowska, Danuta

    2010-11-01

    The subject of the study covered fly ashes created during the hard coal burning process in ash furnaces, in power plants operating in the Upper Silesian Industrial Region, Southern Poland. Coal-fired power plants are furnished with dust extracting devices, electro precipitators, with 99-99.6% combustion gas extracting efficiency. Activity concentrations of Th-232, Ra-226, K-40, Ac-228, U-235 and U-238 were measured with a gamma-ray spectrometer. Concentrations of selected rare earth elements (La, Ce, Nd, Sm, Y, Gd, Th, U) were analysed by means of instrumental neutron activation analysis (INAA). Mineral phases of individual ash particles were identified with the use of a scanning electron microscope equipped with an EDS attachment. Laser granulometric analyses were executed with the use of an Analysette analyser. The activity of the investigated fly-ash samples is several times higher than that of the bituminous coal samples; in the coal, the activities are: Ra-226 - 85.4 Bq kg(-1), K-40 - 689 Bq kg(-1), Th-232 - 100.8 Bq kg(-1), U-235 - 13.5 Bq kg(-1), U-238 - 50 Bq kg(-1) and Ac-228 - 82.4 Bq kg(-1).

  10. Creating esthetic composite restorations.

    PubMed

    Grin, D

    2000-05-01

    The purpose of this article is to describe a fabrication technique to assist dental technicians in creating esthetic indirect composite restorations. After the teeth have been prepared and the models completed, the technician can begin the fabrication process. Translucent dentin is selected to reduce opacity and enhance the blend with the remaining dentition. High chroma modifiers can then be placed into the fossa area to replicate dentin seen in natural dentition. Different incisal materials can then be layered into the build-up to regulate the value of the restoration. Special effects such as hypocalcification are placed internally to mimic naturally occurring esthetics. Realistic anatomy is created by working a small-tipped instrument directly into the final layer of uncured enamel material. Fissure characterization is placed in the restoration to match existing dentition. Fit and margins are verified on separate dies to minimize discrepancies. Path of insertion and proximal contacts are established on a solid model to minimize chairside adjustments.

  11. Competition of periodic and homogeneous modes in extended dynamical systems.

    PubMed

    Dressel, B; Joets, A; Pastur, L; Pesch, W; Plaut, E; Ribotta, R

    2002-01-14

    Despite their simple structure, spatially homogeneous modes can participate directly in pattern-formation processes. This is demonstrated by new experimental and theoretical results for thermo- and electroconvection in planar nematic liquid crystals, where two distinct homogeneous modes, twist and splay distortions of the director field, emerge. Their nonlinear excitation is due to certain spontaneous symmetry-breaking bifurcations.

  12. Operator estimates in homogenization theory

    NASA Astrophysics Data System (ADS)

    Zhikov, V. V.; Pastukhova, S. E.

    2016-06-01

    This paper gives a systematic treatment of two methods for obtaining operator estimates: the shift method and the spectral method. Though substantially different in mathematical technique and physical motivation, these methods produce basically the same results. Besides the classical formulation of the homogenization problem, other formulations of the problem are also considered: homogenization in perforated domains, the case of an unbounded diffusion matrix, non-self-adjoint evolution equations, and higher-order elliptic operators. Bibliography: 62 titles.
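
    A representative example of the kind of operator estimate surveyed, stated here from the classical periodic theory as an assumed illustration (not quoted from the paper): for the Dirichlet problem with a rapidly oscillating coefficient matrix,

```latex
% Classical L^2 operator estimate in periodic homogenization (illustrative).
\[
  -\operatorname{div}\bigl(A(x/\varepsilon)\,\nabla u_\varepsilon\bigr) = f
  \quad \text{in } \Omega, \qquad u_\varepsilon \in H^1_0(\Omega),
\]
\[
  \|u_\varepsilon - u_0\|_{L^2(\Omega)} \le C\,\varepsilon\,\|f\|_{L^2(\Omega)},
\]
% where u_0 solves the homogenized problem with the constant effective
% matrix A^0, and the constant C is independent of epsilon and f.
```

    Estimates of this form make the convergence of u_epsilon to the homogenized solution quantitative in the operator norm, which is the theme of both the shift and spectral methods discussed above.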

  13. Creating New Incentives for Risk Identification and Insurance Process for the Electric Utility Industry (initial award through Award Modification 2); Energy & Risk Transfer Assessment (Award Modifications 3 - 6)

    SciTech Connect

    Michael Ebert

    2008-02-28

    This is the final report for the DOE-NETL grant entitled 'Creating New Incentives for Risk Identification & Insurance Processes for the Electric Utility Industry' and later, 'Energy & Risk Transfer Assessment'. It reflects work done on projects from 15 August 2004 to 29 February 2008. Projects were on a variety of topics, including commercial insurance for electrical utilities, the Electrical Reliability Organization, cost recovery by Gulf State electrical utilities after major hurricanes, and review of state energy emergency plans. This Final Technical Report documents and summarizes all work performed during the award period, which in this case is from 15 August 2004 (date of notification of original award) through 29 February 2008. This report presents this information in a comprehensive, integrated fashion that clearly shows a logical and synergistic research trajectory, and is augmented with findings and conclusions drawn from the research as a whole. Four major research projects were undertaken and completed during the 42 month period of activities conducted and funded by the award; these are: (1) Creating New Incentives for Risk Identification and Insurance Process for the Electric Utility Industry (also referred to as the 'commercial insurance' research). Three major deliverables were produced: a pre-conference white paper, a two-day facilitated stakeholders workshop conducted at George Mason University, and a post-workshop report with findings and recommendations. All deliverables from this work are published on the CIP website at http://cipp.gmu.edu/projects/DoE-NETL-2005.php. (2) The New Electric Reliability Organization (ERO): an examination of critical issues associated with governance, standards development and implementation, and jurisdiction (also referred to as the 'ERO study'). 
Four major deliverables were produced: a series of preliminary memoranda for the staff of the Office of Electricity Delivery and Energy Reliability ('OE'), an ERO interview

  14. (Ultra) high pressure homogenization for continuous high pressure sterilization of pumpable foods - a review.

    PubMed

    Georget, Erika; Miller, Brittany; Callanan, Michael; Heinz, Volker; Mathys, Alexander

    2014-01-01

    Bacterial spores have a strong resistance to both chemical and physical hurdles and create a risk for the food industry, which has been tackled by applying high thermal intensity treatments to sterilize food. These strong thermal treatments lead to a reduction of the organoleptic and nutritional properties of food, and alternatives are actively being sought. Innovative hurdles offer an alternative to inactivate bacterial spores. In particular, recent technological developments have enabled a new generation of high pressure homogenizers working at pressures up to 400 MPa, thus opening new opportunities for high pressure sterilization of foods. In this short review, we summarize the work conducted on (ultra) high pressure homogenization ((U)HPH) to inactivate endospores in model and food systems. Specific attention is given to process parameters (pressure, inlet, and valve temperatures). This review gathers the current state of the art and underlines the potential of UHPH sterilization of pumpable foods while highlighting the needs for future work.
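
    The role of the valve temperature noted above can be illustrated with a first-order energy balance: nearly all of the homogenizing pressure energy is dissipated as heat at the valve, so the temperature rise is roughly ΔT ≈ ΔP/(ρ·cp). A minimal sketch assuming water-like properties (the review's own process models are more detailed):

```python
# First-order estimate of the temperature rise across a homogenizing valve.
# Assumes water-like density and specific heat and that all pressure energy
# dissipates as heat; real foods and partial energy recovery shift the number.

RHO = 998.0   # density, kg/m^3 (assumed, water near 20 degC)
CP = 4182.0   # specific heat, J/(kg K) (assumed, water)

def valve_temperature_rise(delta_p_mpa):
    """Approximate adiabatic temperature rise (K) for a pressure drop in MPa."""
    return delta_p_mpa * 1e6 / (RHO * CP)

dt_uhph = valve_temperature_rise(400.0)  # UHPH operating point from the review
```

    At 400 MPa this estimate is close to 96 K, which is why inlet temperature and valve temperature are central process parameters for spore inactivation without overprocessing.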

  15. (Ultra) High Pressure Homogenization for Continuous High Pressure Sterilization of Pumpable Foods – A Review

    PubMed Central

    Georget, Erika; Miller, Brittany; Callanan, Michael; Heinz, Volker; Mathys, Alexander

    2014-01-01

    Bacterial spores have a strong resistance to both chemical and physical hurdles and create a risk for the food industry, which has been tackled by applying high thermal intensity treatments to sterilize food. These strong thermal treatments lead to a reduction of the organoleptic and nutritional properties of food, and alternatives are actively being sought. Innovative hurdles offer an alternative to inactivate bacterial spores. In particular, recent technological developments have enabled a new generation of high pressure homogenizers working at pressures up to 400 MPa, thus opening new opportunities for high pressure sterilization of foods. In this short review, we summarize the work conducted on (ultra) high pressure homogenization ((U)HPH) to inactivate endospores in model and food systems. Specific attention is given to process parameters (pressure, inlet, and valve temperatures). This review gathers the current state of the art and underlines the potential of UHPH sterilization of pumpable foods while highlighting the needs for future work. PMID:25988118

  16. Political homogeneity can nurture threats to research validity.

    PubMed

    Chambers, John R; Schlenker, Barry R

    2015-01-01

    Political homogeneity within a scientific field nurtures threats to the validity of many research conclusions by allowing ideologically compatible values to influence interpretations, by minimizing skepticism, and by creating premature consensus. Although validity threats can crop up in any research, the usual corrective activities in science are more likely to be minimized and delayed.

  17. Using high-performance ¹H NMR (HP-qNMR®) for the certification of organic reference materials under accreditation guidelines--describing the overall process with focus on homogeneity and stability assessment.

    PubMed

    Weber, Michael; Hellriegel, Christine; Rueck, Alexander; Wuethrich, Juerg; Jenks, Peter

    2014-05-01

    Quantitative NMR spectroscopy (qNMR) is gaining interest across both analytical and industrial research applications and has become an essential tool for content assignment and the quantitative determination of impurities. The key benefits of using qNMR as the measurement method for the purity determination of organic molecules are discussed, with emphasis on the ability to establish traceability to "The International System of Units" (SI). The work describes a routine certification procedure from the point of view of a commercial producer of certified reference materials (CRM) under ISO/IEC 17025 and ISO Guide 34 accreditation that resulted in a set of essential references for ¹H qNMR measurements, and the relevant application data for these substances are given. The overall process includes specific selection criteria, pre-tests, experimental conditions, and homogeneity and stability studies. The advantages of an accelerated stability study over the classical stability-test design are shown with respect to shelf-life determination and shipping conditions. PMID:24182847

  18. AQUEOUS HOMOGENEOUS REACTOR TECHNICAL PANEL REPORT

    SciTech Connect

    Diamond, D.J.; Bajorek, S.; Bakel, A.; Flanagan, G.; Mubayi, V.; Skarda, R.; Staudenmeier, J.; Taiwo, T.; Tonoike, K.; Tripp, C.; Wei, T.; Yarsky, P.

    2010-12-03

    Considerable interest has been expressed for developing a stable U.S. production capacity for medical isotopes and particularly for molybdenum-99 (99Mo). This is motivated by recent reductions in production and supply worldwide. Consistent with U.S. nonproliferation objectives, any new production capability should not use highly enriched uranium fuel or targets. Consequently, Aqueous Homogeneous Reactors (AHRs) are under consideration for potential 99Mo production using low-enriched uranium. Although the Nuclear Regulatory Commission (NRC) has guidance to facilitate the licensing process for non-power reactors, that guidance is focused on reactors with fixed, solid fuel and hence not applicable to an AHR. A panel was convened to study the technical issues associated with normal operation and potential transients and accidents of an AHR that might be designed for isotope production. The panel has produced the requisite AHR licensing guidance for three chapters that exist now for non-power reactor licensing: Reactor Description, Reactor Coolant Systems, and Accident Analysis. The guidance is in two parts for each chapter: 1) the standard format and content a licensee would use and 2) the standard review plan the NRC staff would use. This guidance takes into account the unique features of an AHR such as the fuel being in solution; the fission product barriers being the vessel and attached systems; the production and release of radiolytic and fission product gases and their impact on operations and their control by a gas management system; and the movement of fuel into and out of the reactor vessel.

  19. The OPtimising HEalth LIterAcy (Ophelia) process: study protocol for using health literacy profiling and community engagement to create and implement health reform

    PubMed Central

    2014-01-01

    Background Health literacy is a multi-dimensional concept comprising a range of cognitive, affective, social, and personal skills and attributes. This paper describes the research and development protocol for a large communities-based collaborative project in Victoria, Australia that aims to identify and respond to health literacy issues for people with chronic conditions. The project, called Ophelia (OPtimising HEalth LIterAcy) Victoria, is a partnership between two universities, eight service organisations and the Victorian Government. Based on the identified issues, it will develop and pilot health literacy interventions across eight disparate health services to inform the creation of a health literacy response framework to improve health outcomes and reduce health inequalities. Methods/Design The protocol draws on many inputs including the experience of the partners in previous co-creation and roll-out of large-scale health-promotion initiatives. Three key conceptual models/discourses inform the protocol: intervention mapping; quality improvement collaboratives, and realist synthesis. The protocol is outcomes-oriented and focuses on two key questions: ‘What are the health literacy strengths and weaknesses of clients of participating sites?’, and ‘How do sites interpret and respond to these in order to achieve positive health and equity outcomes for their clients?’. The process has six steps in three main phases. The first phase is a needs assessment that uses the Health Literacy Questionnaire (HLQ), a multi-dimensional measure of health literacy, to identify common health literacy needs among clients. The second phase involves front-line staff and management within each service organisation in co-creating intervention plans to strategically respond to the identified local needs. The third phase will trial the interventions within each site to determine if the site can improve identified limitations to service access and/or health outcomes.

  20. Homogenization in micro-plasticity

    NASA Astrophysics Data System (ADS)

    Berdichevsky, Victor L.

    2005-11-01

    Homogenized descriptions of plasticity on micro- and macro-scale are essentially different. A key distinction is that the energy of micron-size specimens, in contrast to that of macro-specimens, is not a functional of integral characteristics of the dislocation networks. Thus, energy must be considered as an independent characteristic of the body which is additional to all other characteristics. In this paper, a homogenized description of dislocation motion on the micro-scale is proposed. The theory is considered for the case of anti-plane constrained shear which admits an analytical treatment.

  1. Nonlocality in homogeneous superfluid turbulence

    NASA Astrophysics Data System (ADS)

    Dix, O. M.; Zieve, R. J.

    2014-10-01

    Simulating superfluid turbulence using the localized induction approximation allows neighboring parallel vortices to proliferate. In many circumstances a turbulent tangle becomes unsustainable, degenerating into a series of parallel, noninteracting vortex lines. Calculating with the fully nonlocal Biot-Savart law prevents this difficulty but also increases computation time. Here we use a truncated Biot-Savart integral to investigate the effects of nonlocality on homogeneous turbulence. We find that including the nonlocal interaction up to roughly the spacing between nearest-neighbor vortex segments prevents the parallel alignment from developing, yielding an accurate model of homogeneous superfluid turbulence with less computation time.
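
    The truncation described above can be sketched numerically: sum the Biot-Savart contributions of discretized filament segments, dropping segments beyond a cutoff radius. A minimal sketch (not the authors' code); the check uses the exact result that the induced speed at the centre of a vortex ring of radius R and circulation Γ is Γ/(2R):

```python
import numpy as np

def biot_savart_velocity(point, filament, gamma=1.0, cutoff=None):
    """Induced velocity at `point` from a closed vortex filament (N x 3 polyline).
    With `cutoff` set, contributions from segments farther away than `cutoff`
    are dropped, giving a truncated Biot-Savart sum."""
    v = np.zeros(3)
    n = len(filament)
    for i in range(n):
        a, b = filament[i], filament[(i + 1) % n]
        seg = b - a                    # segment vector
        r = point - 0.5 * (a + b)      # from segment midpoint to field point
        d = np.linalg.norm(r)
        if d < 1e-12 or (cutoff is not None and d > cutoff):
            continue
        v += gamma / (4.0 * np.pi) * np.cross(seg, r) / d**3
    return v

# Discretized vortex ring of radius 1; exact speed at the centre is gamma/2.
theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
ring = np.stack([np.cos(theta), np.sin(theta), np.zeros_like(theta)], axis=1)
u_full = biot_savart_velocity(np.zeros(3), ring)
```

    Shrinking `cutoff` below the ring radius removes every contribution at the centre, the extreme end of the locality trade-off the abstract investigates.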

  2. Broken Ergodicity in Ideal, Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John; Fu, Terry; Nguyen, Phu; Shum, Victor

    2010-01-01

    We discuss the statistical mechanics of numerical models of ideal homogeneous, incompressible turbulence and their relevance for dissipative fluids and magnetofluids. These numerical models are based on Fourier series and the relevant statistical theory predicts that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. However, numerical simulations clearly show that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We explain this phenomenon in terms of 'broken ergodicity', which is defined to occur when dynamical behavior does not match ensemble predictions on very long time-scales. We review the theoretical basis of broken ergodicity, apply it to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence, and show new results from simulations using GPU (graphical processing unit) computers.

  3. Entanglement Created by Dissipation

    SciTech Connect

    Alharbi, Abdullah F.; Ficek, Zbigniew

    2011-10-27

    A technique for entangling closely separated atoms by the process of dissipative spontaneous emission is presented. The system considered is composed of two non-identical two-level atoms separated at the quarter wavelength of a driven standing wave laser field. At this atomic distance, only one of the atoms can be addressed by the laser field. In addition, we arrange the atomic dipole moments to be oriented relative to the inter-atomic axis such that the dipole-dipole interaction between the atoms is zero at this specific distance. It is shown that an entanglement can be created between the atoms on demand by tuning the Rabi frequency of the driving field to the difference between the atomic transition frequencies. The amount of the entanglement created depends on the ratio between the damping rates of the atoms, but is independent of the frequency difference between the atoms. We also find that the transient buildup of an entanglement between the atoms may differ dramatically for different initial atomic conditions.

  4. Homogeneous cooling state of frictionless rod particles

    NASA Astrophysics Data System (ADS)

    Rubio-Largo, S. M.; Alonso-Marroquin, F.; Weinhart, T.; Luding, S.; Hidalgo, R. C.

    2016-02-01

    In this work, we report some theoretical results on granular gases consisting of frictionless 3D rods with low energy dissipation. We performed simulations on the temporal evolution of soft spherocylinders, using a molecular dynamics algorithm implemented on GPU architecture. A homogeneous cooling state for rods, where the time dependence of the system's intensive variables occurs only through a global granular temperature, has been identified. We have found a homogeneous cooling process, which is in excellent agreement with Haff's law when using an adequate rescaling time τ(ξ), whose value depends on the particle elongation ξ and the restitution coefficient. It was further found that scaled particle velocity distributions remain approximately Gaussian regardless of the particle shape. Similarly to a system of ellipsoids, energy equipartition between rotational and translational degrees of freedom was better satisfied as one gets closer to the elastic limit. Taking advantage of scaling properties, we have numerically determined the general functionality of the magnitude Dc(ξ), which describes the efficiency of the energy interchange between rotational and translational degrees of freedom, as well as its dependence on particle shape. We have detected a range of particle elongations (1.5 < ξ < 4.0) where the average energy transfer between the rotational and translational degrees of freedom is greater for spherocylinders than for homogeneous ellipsoids with the same aspect ratio.
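
    Haff's law, referenced above, states that in the homogeneous cooling state the granular temperature decays as T(t) = T0/(1 + t/τ)². A minimal sketch cross-checking that closed form against direct integration of the cooling equation dT/dt = -A·T^(3/2); the rate A is an illustrative stand-in for the shape-dependent rescaling the authors derive:

```python
import numpy as np

def haff(t, t0, tau):
    """Haff's law for the granular temperature in the homogeneous cooling state."""
    return t0 / (1.0 + t / tau) ** 2

# The cooling equation dT/dt = -A * T**1.5 has Haff's law as its solution,
# with characteristic time tau = 2 / (A * sqrt(T0)). A is illustrative.
A, T0 = 0.5, 1.0
tau = 2.0 / (A * np.sqrt(T0))

# Cross-check the closed form by explicit Euler integration up to t = 10.
T, dt = T0, 1e-4
for _ in range(int(10.0 / dt)):
    T -= A * T ** 1.5 * dt
```

    The characteristic time τ plays the same role as the rescaling time τ(ξ) in the abstract: measured in units of τ, all such systems cool on one universal curve.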

  5. A compact setup to study homogeneous nucleation and condensation

    NASA Astrophysics Data System (ADS)

    Karlsson, Mattias; Alxneit, Ivo; Rütten, Frederik; Wuillemin, Daniel; Tschudi, Hans Rudolf

    2007-03-01

    An experiment is presented to study homogeneous nucleation and the subsequent droplet growth at high temperatures and high pressures in a compact setup that does not use moving parts. Nucleation and condensation are induced in an adiabatic, stationary expansion of the vapor and an inert carrier gas through a Laval nozzle. The adiabatic expansion is driven against atmospheric pressure by pressurized inert gas whose mass flow is carefully controlled. This allows us to avoid large pumps or vacuum storage tanks. Because we eventually want to study the homogeneous nucleation and condensation of zinc, the use of carefully chosen materials is required that can withstand pressures of up to 10⁶ Pa resulting from mass flow rates of up to 600 lN min⁻¹ and temperatures up to 1200 K in the presence of highly corrosive zinc vapor. To observe the formation of droplets, a laser beam propagates along the axis of the nozzle and the light scattered by the droplets is detected perpendicularly to the nozzle axis. An ICCD camera records the scattered light, spatially resolved, through fused silica windows in the diverging part of the nozzle and detects nucleation and condensation coherently in a single exposure. For the data analysis, a model is needed to describe the isentropic core part of the flow along the nozzle axis. The model must incorporate the laws of fluid dynamics and the nucleation and condensation process, and has to predict the size distribution of the particles created (PSD) at every position along the nozzle axis. Assuming Rayleigh scattering, the intensity of the scattered light can then be calculated from the second moment of the PSD.
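
    The isentropic core-flow model mentioned above reduces, in quasi-one-dimensional form, to the classical area-Mach relation, from which the temperature drop that triggers nucleation follows. A minimal sketch assuming a calorically perfect carrier gas with γ = 1.4 (the actual model additionally couples nucleation and condensation to the flow):

```python
import math

GAMMA = 1.4  # ratio of specific heats, assumed for the inert carrier gas

def area_ratio(mach):
    """Quasi-1D isentropic area ratio A/A* at a given Mach number."""
    term = (2.0 / (GAMMA + 1.0)) * (1.0 + 0.5 * (GAMMA - 1.0) * mach**2)
    return term ** ((GAMMA + 1.0) / (2.0 * (GAMMA - 1.0))) / mach

def supersonic_mach(a_ratio):
    """Invert A/A* on the supersonic branch by bisection."""
    lo, hi = 1.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if area_ratio(mid) < a_ratio:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def static_temperature(t0, mach):
    """Static temperature from the stagnation temperature t0 (K)."""
    return t0 / (1.0 + 0.5 * (GAMMA - 1.0) * mach**2)
```

    For an area ratio of 2 the supersonic solution is near Mach 2.2, so a 1200 K stagnation flow cools to roughly 610 K, illustrating how the diverging section drives the vapor into supersaturation.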

  6. A compact setup to study homogeneous nucleation and condensation.

    PubMed

    Karlsson, Mattias; Alxneit, Ivo; Rütten, Frederik; Wuillemin, Daniel; Tschudi, Hans Rudolf

    2007-03-01

    An experiment is presented to study homogeneous nucleation and the subsequent droplet growth at high temperatures and high pressures in a compact setup that does not use moving parts. Nucleation and condensation are induced in an adiabatic, stationary expansion of the vapor and an inert carrier gas through a Laval nozzle. The adiabatic expansion is driven against atmospheric pressure by pressurized inert gas whose mass flow is carefully controlled. This allows us to avoid large pumps or vacuum storage tanks. Because we eventually want to study the homogeneous nucleation and condensation of zinc, the use of carefully chosen materials is required that can withstand pressures of up to 10⁶ Pa resulting from mass flow rates of up to 600 lN min⁻¹ and temperatures up to 1200 K in the presence of highly corrosive zinc vapor. To observe the formation of droplets, a laser beam propagates along the axis of the nozzle and the light scattered by the droplets is detected perpendicularly to the nozzle axis. An ICCD camera records the scattered light, spatially resolved, through fused silica windows in the diverging part of the nozzle and detects nucleation and condensation coherently in a single exposure. For the data analysis, a model is needed to describe the isentropic core part of the flow along the nozzle axis. The model must incorporate the laws of fluid dynamics and the nucleation and condensation process, and has to predict the size distribution of the particles created (PSD) at every position along the nozzle axis. Assuming Rayleigh scattering, the intensity of the scattered light can then be calculated from the second moment of the PSD.

  7. Homogeneous Pt-bimetallic Electrocatalysts

    SciTech Connect

    Wang, Chao; Chi, Miaofang; More, Karren Leslie; Markovic, Nenad; Stamenkovic, Vojislav

    2011-01-01

    Alloying has shown enormous potential for tailoring the atomic and electronic structures, and improving the performance of catalytic materials. Systematic studies of alloy catalysts are, however, often compromised by inhomogeneous distribution of alloying components. Here we introduce a general approach for the synthesis of monodispersed and highly homogeneous Pt-bimetallic alloy nanocatalysts. Pt₃M (where M = Fe, Ni, or Co) nanoparticles were prepared by an organic solvothermal method and then supported on high surface area carbon. These catalysts attained a homogeneous distribution of elements, as demonstrated by atomic-scale elemental analysis using scanning transmission electron microscopy. They also exhibited high catalytic activities for the oxygen reduction reaction (ORR), with improvement factors of 2-3 versus conventional Pt/carbon catalysts. The measured ORR catalytic activities for Pt₃M nanocatalysts validated the volcano curve established on extended surfaces, with Pt₃Co being the most active alloy.

  8. Homogeneous asymmetric catalysis in fragrance chemistry.

    PubMed

    Ciappa, Alessandra; Bovo, Sara; Bertoldini, Matteo; Scrivanti, Alberto; Matteoli, Ugo

    2008-06-01

    Opposite enantiomers of a chiral fragrance may exhibit different olfactory activities making a synthesis in high enantiomeric purity commercially and scientifically interesting. Accordingly, the asymmetric synthesis of four chiral odorants, Fixolide, Phenoxanol, Citralis, and Citralis Nitrile, has been investigated with the aim to develop practically feasible processes. In the devised synthetic schemes, the key step that leads to the formation of the stereogenic center is the homogeneous asymmetric hydrogenation of a prochiral olefin. By an appropriate choice of the catalyst and the reaction conditions, Phenoxanol, Citralis, and Citralis Nitrile were obtained in high enantiomeric purity, and odor profiles of the single enantiomers were determined.

  9. Variable valve timing in a homogenous charge compression ignition engine

    DOEpatents

    Lawrence, Keith E.; Faletti, James J.; Funke, Steven J.; Maloney, Ronald P.

    2004-08-03

    The present invention relates generally to the field of homogeneous charge compression ignition engines, in which fuel is injected when the cylinder piston is relatively close to the bottom dead center position for its compression stroke. The fuel mixes with air in the cylinder during the compression stroke to create a relatively lean homogeneous mixture that preferably ignites when the piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage can result. The present invention utilizes internal exhaust gas recirculation and/or compression ratio control to control the timing of ignition events and combustion duration in homogeneous charge compression ignition engines. Thus, at least one electro-hydraulic assist actuator is provided that is capable of mechanically engaging at least one cam actuated intake and/or exhaust valve.
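
    The role of compression ratio control described above can be illustrated with the adiabatic compression estimate T_TDC ≈ T_BDC · CR^(γ-1): a higher effective compression ratio yields a hotter charge at top dead center and hence earlier autoignition. A minimal sketch with illustrative values (internal exhaust gas recirculation would additionally raise the initial charge temperature):

```python
GAMMA = 1.35  # effective polytropic exponent of the in-cylinder charge (assumed)

def tdc_temperature(t_bdc, compression_ratio):
    """Adiabatic end-of-compression temperature (K): a hotter charge at top
    dead center advances the autoignition timing."""
    return t_bdc * compression_ratio ** (GAMMA - 1.0)

t_low = tdc_temperature(350.0, 14.0)   # lower effective compression ratio
t_high = tdc_temperature(350.0, 16.0)  # higher effective compression ratio
```

    Modulating the effective compression ratio (here via valve timing) thus shifts the end-of-compression temperature by tens of kelvin, enough to advance or retard the ignition event.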

  10. High School Student Perceptions of the Utility of the Engineering Design Process: Creating Opportunities to Engage in Engineering Practices and Apply Math and Science Content

    NASA Astrophysics Data System (ADS)

    Berland, Leema; Steingut, Rebecca; Ko, Pat

    2014-12-01

    Research and policy documents increasingly advocate for incorporating engineering design into K-12 classrooms in order to accomplish two goals: (1) provide an opportunity to engage with science content in a motivating real-world context; and (2) introduce students to the field of engineering. The present study uses multiple qualitative data sources (i.e., interviews, artifact analysis) in order to examine the ways in which engaging in engineering design can support students in participating in engineering practices and applying math and science knowledge. This study suggests that students better understand and value those aspects of engineering design that are more qualitative (i.e., interviewing users, generating multiple possible solutions) than the more quantitative aspects of design which create opportunities for students to integrate traditional math and science content into their design work (i.e., modeling or systematically choosing between possible design solutions). Recommendations for curriculum design and implementation are discussed.

  11. Homogenization and improvement in energy dissipation of nonlinear composites

    NASA Astrophysics Data System (ADS)

    Verma, Luv; Sivakumar, Srinivasan M.; Vedantam, S.

    2016-04-01

    Due to their high strength-to-weight and stiffness-to-weight ratios, composite materials are increasingly displacing conventional metals, but composites have poor damage resistance in the transverse direction. Under impact loads, they can fail in a wide variety of modes which severely reduce the structural integrity of the component. This paper deals with the homogenization of a glass-fiber and epoxy composite together with a material introduced as an inelastic inclusion. This nonlinearity is modelled by a kinematic hardening procedure, and homogenization is performed with a mean-field technique known as the Mori-Tanaka method. The homogenization process considers two phases, the matrix and the inelastic inclusion; thus the glass fibers and epoxy, although two phases themselves, are treated as a single phase that acts as the matrix while homogenizing the nonlinear composite. Homogenization results have been compared to the matrix at volume fraction zero of the inelastic inclusions and to the inelastic material at volume fraction one. After homogenization, the increase in energy dissipation in the composite due to the addition of inelastic material, and the effects on that dissipation of changing the properties of the matrix material, are discussed.
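
    To show the flavor of the Mori-Tanaka estimate used above, here is a minimal sketch of its best-known closed form: the effective bulk modulus of a two-phase composite with spherical elastic inclusions. This is the linear-elastic special case only; the treatment of an inelastic, kinematically hardening inclusion described in the abstract is more involved:

```python
def mori_tanaka_bulk(k_m, g_m, k_i, f):
    """Mori-Tanaka effective bulk modulus for spherical inclusions of bulk
    modulus k_i at volume fraction f in a matrix with moduli (k_m, g_m)."""
    num = f * (k_i - k_m) * (3.0 * k_m + 4.0 * g_m)
    den = (3.0 * k_m + 4.0 * g_m) + 3.0 * (1.0 - f) * (k_i - k_m)
    return k_m + num / den
```

    The estimate interpolates correctly between the pure matrix (f = 0) and the pure inclusion material (f = 1), mirroring the volume-fraction checks in the abstract, and changing the matrix moduli shifts the whole curve, the kind of matrix-property effect discussed there for energy dissipation.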

  12. Matrix shaped pulsed laser deposition: New approach to large area and homogeneous deposition

    NASA Astrophysics Data System (ADS)

    Akkan, C. K.; May, A.; Hammadeh, M.; Abdul-Khaliq, H.; Aktas, O. C.

    2014-05-01

    Pulsed laser deposition (PLD) is one of the well-established physical vapor deposition methods used for the synthesis of ultra-thin layers. PLD is especially suitable for the preparation of thin films of complex alloys and ceramics where conservation of the stoichiometry is critical. Besides the several advantages of PLD, thickness inhomogeneity limits its use in some applications. There are several approaches to achieving homogeneous layers, such as rotating the substrate or scanning the laser beam over the target; however, such movement and transition add further complexity to the process parameters. Here we present a new approach, which we call Matrix Shaped PLD, to control the thickness and homogeneity of deposited layers precisely. This new approach is based on shaping the incoming laser beam with a microlens array and a Fourier lens. The beam is split into a much smaller multi-beam array over the target, which leads to homogeneous plasma formation. The uniform intensity distribution over the target yields a very uniform deposit on the substrate. This approach is used to deposit carbide and oxide thin films for biomedical applications. As a case study, the coating of a stent, which has a complex geometry, is briefly presented.
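
    The homogenizing effect of splitting one beam into a beamlet array can be seen in a toy one-dimensional model: the incoherent sum of several shifted Gaussian beamlets is much flatter over the target region than the original Gaussian. Spacings and widths below are illustrative, not the parameters of the actual microlens array:

```python
import numpy as np

x = np.linspace(-2.0, 2.0, 1001)        # target region (arbitrary units)
single = np.exp(-x**2)                  # original Gaussian beam profile
shifts = np.linspace(-2.0, 2.0, 9)      # nine beamlets from the lens array
multi = sum(np.exp(-(x - s)**2) for s in shifts)  # incoherent beamlet sum

def flatness(profile):
    """Coefficient of variation (std/mean): lower means a flatter profile."""
    return profile.std() / profile.mean()
```

    Comparing `flatness(single)` with `flatness(multi)` quantifies the uniformity gain that, in the actual setup, translates into a homogeneous plasma and a uniform deposit.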

  13. Rapid homogeneous endothelialization of high aspect ratio microvascular networks.

    PubMed

    Naik, Nisarga; Hanjaya-Putra, Donny; Haller, Carolyn A; Allen, Mark G; Chaikof, Elliot L

    2015-08-01

    Microvascularization of an engineered tissue construct is necessary to ensure the nourishment and viability of the hosted cells. Microvascular constructs can be created by seeding the luminal surfaces of microfluidic channel arrays with endothelial cells. However, in a conventional flow-based system, the uniformity of endothelialization of such an engineered microvascular network is constrained by mass transfer of the cells through high length-to-diameter (L/D) aspect ratio microchannels. Moreover, given the inherent limitations of the initial seeding process to generate a uniform cell coating, the large surface-area-to-volume ratio of microfluidic systems demands long culture periods for the formation of confluent cellular microconduits. In this report, we describe the design of polydimethylsiloxane (PDMS) and poly(glycerol sebacate) (PGS) microvascular constructs with reentrant microchannels that facilitate rapid, spatially homogeneous endothelial cell seeding of high L/D (2 cm/35 μm; >550:1) aspect ratio microchannels. MEMS technology was employed for the fabrication of a monolithic, elastomeric, reentrant microvascular construct. Isotropic etching and PDMS micromolding yielded a near-cylindrical microvascular channel array. A 'stretch - seed - seal' operation was implemented for uniform incorporation of endothelial cells along the entire microvascular area of the construct, yielding endothelialized microvascular networks in less than 24 h. The feasibility of this endothelialization strategy and the uniformity of cellularization were established using confocal microscope imaging. PMID:26227213

  14. Rapid homogeneous endothelialization of high aspect ratio microvascular networks.

    PubMed

    Naik, Nisarga; Hanjaya-Putra, Donny; Haller, Carolyn A; Allen, Mark G; Chaikof, Elliot L

    2015-08-01

    Microvascularization of an engineered tissue construct is necessary to ensure the nourishment and viability of the hosted cells. Microvascular constructs can be created by seeding the luminal surfaces of microfluidic channel arrays with endothelial cells. However, in a conventional flow-based system, the uniformity of endothelialization of such an engineered microvascular network is constrained by mass transfer of the cells through high length-to-diameter (L/D) aspect ratio microchannels. Moreover, given the inherent limitations of the initial seeding process to generate a uniform cell coating, the large surface-area-to-volume ratio of microfluidic systems demands long culture periods for the formation of confluent cellular microconduits. In this report, we describe the design of polydimethylsiloxane (PDMS) and poly(glycerol sebacate) (PGS) microvascular constructs with reentrant microchannels that facilitate rapid, spatially homogeneous endothelial cell seeding of high L/D (2 cm/35 μm; >550:1) aspect ratio microchannels. MEMS technology was employed for the fabrication of a monolithic, elastomeric, reentrant microvascular construct. Isotropic etching and PDMS micromolding yielded a near-cylindrical microvascular channel array. A 'stretch - seed - seal' operation was implemented for uniform incorporation of endothelial cells along the entire microvascular area of the construct, yielding endothelialized microvascular networks in less than 24 h. The feasibility of this endothelialization strategy and the uniformity of cellularization were established using confocal microscope imaging.

  15. Homogeneous cooling of mixtures of particle shapes

    NASA Astrophysics Data System (ADS)

    Hidalgo, R. C.; Serero, D.; Pöschel, T.

    2016-07-01

    In this work, we examine theoretically the cooling dynamics of binary mixtures of spheres and rods. To this end, we introduce a generalized mean field analytical theory, which describes the free cooling behavior of the mixture. The relevant characteristic time scale for the cooling process is derived, depending on the mixture composition and the aspect ratio of the rods. We simulate mixtures of spherocylinders and spheres using a molecular dynamics algorithm implemented on graphics processing unit (GPU) architecture. We systematically study mixtures composed of spheres and rods with several aspect ratios and varying the mixture composition. A homogeneous cooling state, where the time dependence of the system's intensive variables occurs only through a global granular temperature, is identified. We find cooling dynamics in excellent agreement with Haff's law, when using an adequate time scale. Using the scaling properties of the homogeneous cooling dynamics, we estimated numerically the efficiency of the energy interchange between rotational and translational degrees of freedom for collisions between spheres and rods.

  16. Creating a Classroom Library.

    ERIC Educational Resources Information Center

    Hepler, Susan; And Others

    1992-01-01

    Presents ideas for creating classroom libraries, noting how to set up a library (create a space, build and organize the collection, and set rules), where to find books at bargain prices (e.g., garage sales, libraries, book clubs, and grants), basic books to include, and information on authors and illustrators. (SM)

  17. ISOTOPE METHODS IN HOMOGENEOUS CATALYSIS.

    SciTech Connect

    BULLOCK,R.M.; BENDER,B.R.

    2000-12-01

    The use of isotope labels has had a fundamentally important role in the determination of mechanisms of homogeneously catalyzed reactions. Mechanistic data is valuable since it can assist in the design and rational improvement of homogeneous catalysts. There are several ways to use isotopes in mechanistic chemistry. Isotopes can be introduced into controlled experiments and followed to see where they go or don't go; in this way, Libby, Calvin, Taube and others used isotopes to elucidate mechanistic pathways for very different, yet important chemistries. Another important isotope method is the study of kinetic isotope effects (KIEs) and equilibrium isotope effects (EIEs). Here the mere observation of where a label winds up is no longer enough: what matters is how much slower (or faster) a labeled molecule reacts than the unlabeled material. The most careful studies essentially involve the measurement of isotope fractionation between a reference ground state and the transition state. Thus kinetic isotope effects provide unique data unavailable from other methods, since information about the transition state of a reaction is obtained. Because getting an experimental glimpse of transition states is really tantamount to understanding catalysis, kinetic isotope effects are very powerful.
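
    A worked example of the KIE logic above: in the simplest semiclassical picture, the maximum primary H/D kinetic isotope effect comes from the zero-point-energy difference of the stretch that is lost at the transition state. A minimal sketch with textbook-typical stretch wavenumbers (illustrative values, not data from a specific study):

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e10    # speed of light, cm/s (wavenumbers given in cm^-1)
KB = 1.380649e-23    # Boltzmann constant, J/K

def zpe_kie(nu_h, nu_d, temperature):
    """Semiclassical maximum KIE, k_H/k_D = exp(h c (nu_H - nu_D) / (2 kB T)),
    for a stretch fully lost at the transition state."""
    return math.exp(H * C * (nu_h - nu_d) / (2.0 * KB * temperature))

kie_room = zpe_kie(2900.0, 2050.0, 298.15)  # typical C-H vs C-D stretches
```

    This gives a KIE of roughly 7 to 8 at room temperature, the familiar semiclassical upper bound for primary H/D effects; measured values well above it are usually taken as evidence of tunneling.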

  18. Homogeneous Open Quantum Random Walks on a Lattice

    NASA Astrophysics Data System (ADS)

    Carbone, Raffaella; Pautrat, Yan

    2015-09-01

    We study open quantum random walks (OQRWs) for which the underlying graph is a lattice, and the generators of the walk are homogeneous in space. Using the results recently obtained in Carbone and Pautrat (Ann Henri Poincaré, 2015), we study the quantum trajectory associated with the OQRW, which is described by a position process and a state process. We obtain a central limit theorem and a large deviation principle for the position process. We study in detail the case of homogeneous OQRWs on the lattice Z^d, with internal space C^2.
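    The position/state quantum-trajectory description lends itself to a small Monte Carlo sketch. The Kraus pair below is a standard example satisfying the normalization condition B+^dag B+ + B-^dag B- = I, not the specific walk analyzed in the paper; the simulation only illustrates the kind of position process to which the central limit theorem applies.

```python
import math
import random

# Monte Carlo sketch of a homogeneous open quantum random walk on the
# integers with a 2-dimensional internal space, in the quantum-trajectory
# picture (a position process plus an internal-state process).  The
# Kraus pair is a standard example with B+^dag B+ + B-^dag B- = I;
# it is NOT the specific walk of the paper.

S = 1.0 / math.sqrt(3.0)
B_PLUS = ((S, S), (0.0, S))      # step +1
B_MINUS = ((S, 0.0), (-S, S))    # step -1

def apply(mat, vec):
    return (mat[0][0] * vec[0] + mat[0][1] * vec[1],
            mat[1][0] * vec[0] + mat[1][1] * vec[1])

def norm2(vec):
    return abs(vec[0]) ** 2 + abs(vec[1]) ** 2

def trajectory(steps, rng):
    """Sample one trajectory; return the final lattice position."""
    psi, x = (1.0, 0.0), 0       # pure internal state, walker at origin
    for _ in range(steps):
        up = apply(B_PLUS, psi)
        p_up = norm2(up)         # Born probability of stepping +1
        if rng.random() < p_up:
            x, new = x + 1, up
        else:
            x, new = x - 1, apply(B_MINUS, psi)
        n = math.sqrt(norm2(new))
        psi = (new[0] / n, new[1] / n)   # renormalized internal state
    return x

rng = random.Random(0)
steps, trials = 200, 1000
positions = [trajectory(steps, rng) for _ in range(trials)]
mean = sum(positions) / trials
var = sum((p - mean) ** 2 for p in positions) / trials
# The CLT says (X_n - n*m)/sqrt(n) is asymptotically Gaussian, so the
# empirical variance should grow roughly linearly in the step count.
print(f"drift per step ~ {mean / steps:+.3f}, variance per step ~ {var / steps:.3f}")
```

    Because the Kraus condition guarantees p_up + p_down = 1 for every normalized internal state, the trajectory sampling is well defined at each step.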

  19. An approximation for homogeneous freezing temperature of water droplets

    NASA Astrophysics Data System (ADS)

    O, K.-T.; Wood, R.

    2015-11-01

    In this work, based on the well-known formulae of classical nucleation theory (CNT), the temperature T_{Nc=1} at which the mean number of critical embryos inside a droplet is unity is derived and proposed as a new approximation for the homogeneous freezing temperature of water droplets. Without consideration of the time dependence and stochastic nature of the ice nucleation process, the approximation T_{Nc=1} is able to reproduce the dependence of homogeneous freezing temperature on drop size and water activity of aqueous drops observed in a wide range of experimental studies. We use the T_{Nc=1} approximation to argue that the distribution of homogeneous freezing temperatures observed in the experiments may largely be explained by the spread in the size distribution of droplets used in the particular experiment. It thus appears that this approximation is useful for predicting homogeneous freezing temperatures of water droplets in the atmosphere.
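    A minimal numerical sketch of the T_{Nc=1} idea: with a hypothetical CNT-like barrier model (the constant A below is an invented stand-in for the full classical-nucleation-theory expressions), bisecting for the temperature at which the mean number of critical embryos reaches unity reproduces the qualitative drop-size dependence noted in the abstract.

```python
import math

# Sketch of the T_{Nc=1} idea: find the temperature at which the mean
# number of critical embryos in a droplet,
#   N(T) = n_l * V * exp(-dG*(T) / kT),
# equals one.  The barrier model dG*/kT = A/(T0 - T)^2 and the constant
# A are HYPOTHETICAL stand-ins for the full CNT expressions; only the
# qualitative drop-size dependence is meant to be illustrative.

N_L = 3.3e28           # molecular number density of liquid water, m^-3
T0 = 273.15            # melting temperature, K
A = 4.0e4              # hypothetical barrier constant, K^2

def mean_embryos(temp_k, radius_m):
    volume = 4.0 / 3.0 * math.pi * radius_m ** 3
    return N_L * volume * math.exp(-A / (T0 - temp_k) ** 2)

def freezing_temperature(radius_m):
    """Bisect for the temperature where mean_embryos == 1.
    N(T) decreases with T: huge at lo, vanishing just below T0."""
    lo, hi = 150.0, T0 - 1e-6
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_embryos(mid, radius_m) > 1.0:
            lo = mid        # too cold: threshold lies warmer
        else:
            hi = mid        # too warm: threshold lies colder
    return 0.5 * (lo + hi)

# Larger droplets contain more molecules, so N reaches unity at a
# warmer temperature -- the drop-size dependence of freezing.
for radius in (1e-6, 1e-5, 1e-4):
    print(f"r = {radius:.0e} m -> T(Nc=1) ~ {freezing_temperature(radius):.1f} K")
```

    With these invented parameters the threshold lands in the observed homogeneous-freezing range near 235-240 K and shifts warmer with drop size, which is the behavior the approximation is designed to capture.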

  20. Coherence delay augmented laser beam homogenizer

    DOEpatents

    Rasmussen, P.; Bernhardt, A.

    1993-06-29

    The geometrical restrictions on a laser beam homogenizer are relaxed by using a coherence delay line to separate a coherent input beam into several components each having a path length difference equal to a multiple of the coherence length with respect to the other components. The components recombine incoherently at the output of the homogenizer, and the resultant beam has a more uniform spatial intensity suitable for microlithography and laser pantogography. Also disclosed is a variable aperture homogenizer, and a liquid filled homogenizer.

  1. Coherence delay augmented laser beam homogenizer

    DOEpatents

    Rasmussen, Paul; Bernhardt, Anthony

    1993-01-01

    The geometrical restrictions on a laser beam homogenizer are relaxed by using a coherence delay line to separate a coherent input beam into several components each having a path length difference equal to a multiple of the coherence length with respect to the other components. The components recombine incoherently at the output of the homogenizer, and the resultant beam has a more uniform spatial intensity suitable for microlithography and laser pantogography. Also disclosed is a variable aperture homogenizer, and a liquid filled homogenizer.

  2. Invariant distributions on compact homogeneous spaces

    SciTech Connect

    Gorbatsevich, V V

    2013-12-31

    In this paper, we study distributions on compact homogeneous spaces, including invariant distributions and also distributions admitting a sub-Riemannian structure. We first consider distributions of dimension 1 and 2 on compact homogeneous spaces. After this, we study the cases of compact homogeneous spaces of dimension 2, 3, and 4 in detail. Invariant distributions on simply connected compact homogeneous spaces are also treated. Bibliography: 18 titles.

  3. Numerical experiments in homogeneous turbulence

    NASA Technical Reports Server (NTRS)

    Rogallo, R. S.

    1981-01-01

    The direct simulation methods developed by Orszag and Patterson (1972) for isotropic turbulence were extended to homogeneous turbulence in an incompressible fluid subjected to uniform deformation or rotation. The results of simulations for irrotational strain (plane and axisymmetric), shear, rotation, and relaxation toward isotropy following axisymmetric strain are compared with linear theory and experimental data. Emphasis is placed on the shear flow because of its importance and because of the availability of accurate and detailed experimental data. The computed results are used to assess the accuracy of two popular models used in the closure of the Reynolds-stress equations. Data from a variety of the computed fields and the details of the numerical methods used in the simulation are also presented.

  4. Homogenization of regional river dynamics by dams and global biodiversity implications.

    PubMed

    Poff, N Leroy; Olden, Julian D; Merritt, David M; Pepin, David M

    2007-04-01

    Global biodiversity in river and riparian ecosystems is generated and maintained by geographic variation in stream processes and fluvial disturbance regimes, which largely reflect regional differences in climate and geology. Extensive construction of dams by humans has greatly dampened the seasonal and interannual streamflow variability of rivers, thereby altering natural dynamics in ecologically important flows on continental to global scales. The cumulative effects of modification to regional-scale environmental templates caused by dams are largely unexplored but of critical conservation importance. Here, we use 186 long-term streamflow records on intermediate-sized rivers across the continental United States to show that dams have homogenized the flow regimes on third- through seventh-order rivers in 16 historically distinctive hydrologic regions over the course of the 20th century. This regional homogenization occurs chiefly through modification of the magnitude and timing of ecologically critical high and low flows. For 317 undammed reference rivers, no evidence for homogenization was found, despite documented changes in regional precipitation over this period. With an estimated average density of one dam every 48 km of third- through seventh-order river channel in the United States, dams arguably have a continental-scale effect of homogenizing regionally distinct environmental templates, thereby creating conditions that favor the spread of cosmopolitan, nonindigenous species at the expense of locally adapted native biota. Quantitative analyses such as ours provide the basis for conservation and management actions aimed at restoring and maintaining native biodiversity and ecosystem function and resilience for regionally distinct ecosystems at continental to global scales.

  5. Homogenization of regional river dynamics by dams and global biodiversity implications.

    PubMed

    Poff, N Leroy; Olden, Julian D; Merritt, David M; Pepin, David M

    2007-04-01

    Global biodiversity in river and riparian ecosystems is generated and maintained by geographic variation in stream processes and fluvial disturbance regimes, which largely reflect regional differences in climate and geology. Extensive construction of dams by humans has greatly dampened the seasonal and interannual streamflow variability of rivers, thereby altering natural dynamics in ecologically important flows on continental to global scales. The cumulative effects of modification to regional-scale environmental templates caused by dams are largely unexplored but of critical conservation importance. Here, we use 186 long-term streamflow records on intermediate-sized rivers across the continental United States to show that dams have homogenized the flow regimes on third- through seventh-order rivers in 16 historically distinctive hydrologic regions over the course of the 20th century. This regional homogenization occurs chiefly through modification of the magnitude and timing of ecologically critical high and low flows. For 317 undammed reference rivers, no evidence for homogenization was found, despite documented changes in regional precipitation over this period. With an estimated average density of one dam every 48 km of third- through seventh-order river channel in the United States, dams arguably have a continental-scale effect of homogenizing regionally distinct environmental templates, thereby creating conditions that favor the spread of cosmopolitan, nonindigenous species at the expense of locally adapted native biota. Quantitative analyses such as ours provide the basis for conservation and management actions aimed at restoring and maintaining native biodiversity and ecosystem function and resilience for regionally distinct ecosystems at continental to global scales. PMID:17360379

  6. Creating physics stars

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2013-07-01

    Korea has begun an ambitious 5bn plan to create 50 new institutes dedicated to fundamental research. Michael Banks meets physicist Se-Jung Oh, president of the Institute for Basic Science, to find out more.

  7. Homogenization patterns of the world’s freshwater fish faunas

    PubMed Central

    Villéger, Sébastien; Blanchet, Simon; Beauchard, Olivier; Oberdorff, Thierry; Brosse, Sébastien

    2011-01-01

    The world is currently undergoing an unprecedented decline in biodiversity, which is mainly attributable to human activities. For instance, nonnative species introduction, combined with the extirpation of native species, affects biodiversity patterns, notably by increasing the similarity among species assemblages. This biodiversity change, called taxonomic homogenization, has rarely been assessed at the world scale. Here, we fill this gap by assessing the current homogenization status of one of the most diverse vertebrate groups (i.e., freshwater fishes) at global and regional scales. We demonstrate that current homogenization of the freshwater fish faunas is still low at the world scale (0.5%) but reaches substantial levels (up to 10%) in some highly invaded river basins from the Nearctic and Palearctic realms. In these realms experiencing high changes, nonnative species introductions rather than native species extirpations drive taxonomic homogenization. Our results suggest that the “Homogocene era” has not yet arrived for freshwater fish faunas at the worldwide scale. However, the distressingly high level of homogenization noted for some biogeographical realms stresses the need for further understanding of the ecological consequences of homogenization processes. PMID:22025692
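    Taxonomic homogenization of the kind quantified above is usually measured as the change in mean pairwise similarity among assemblages. A minimal sketch with Jaccard similarity on invented species sets shows how a single cosmopolitan introduction raises similarity everywhere; the basins and species names are hypothetical.

```python
from itertools import combinations

# Taxonomic homogenization as the change in mean pairwise Jaccard
# similarity among assemblages, before vs after an introduction.
# Basins and species are invented for illustration.

def jaccard(a, b):
    return len(a & b) / len(a | b)

def mean_pairwise_similarity(assemblages):
    pairs = list(combinations(assemblages, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

historical = [{"sp1", "sp2"}, {"sp3", "sp4"}, {"sp5", "sp6"}]
# The same basins after one cosmopolitan species ("carp") invades all three.
current = [s | {"carp"} for s in historical]

delta = mean_pairwise_similarity(current) - mean_pairwise_similarity(historical)
# Positive delta indicates homogenization; negative, differentiation.
print(f"change in mean Jaccard similarity: {delta:+.3f}")
```

    Here three basins with no shared natives go from mean similarity 0 to 0.2 after the shared introduction, which is the directional change reported as percent homogenization in studies like this one.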

  8. Binary homogeneous nucleation of octane isomers

    NASA Astrophysics Data System (ADS)

    Doster, George Jay

    The measurement of the binary homogeneous nucleation of i-octane (2,2,4-trimethylpentane) and n-octane has been performed with a Wilson cloud chamber. This system of octane isomers has been chosen because it exhibits the desirable properties of a nearly ideal system. The octanes are non-polar, do not hydrogen bond, and have a low heat of mixing. The results from this experiment are presented and compared to the binary classical nucleation theory, the diffuse interface theory, and the binary scaled nucleation theory. The data from this experiment include 3 mixtures of the octane isomers in mole fraction ratios of 1:1, 1:3, and 3:1 along with results from the pure octanes. Nucleation rates from approximately 100 to 50,000 cm^-3 s^-1 and nucleation temperatures of 215 K to 260 K are included. This wide range of data is an effort to create a collection of data to which modified or new nucleation theories may be compared.

  9. Rh(I)-catalyzed transformation of propargyl vinyl ethers into (E,Z)-dienals: stereoelectronic role of trans effect in a metal-mediated pericyclic process and a shift from homogeneous to heterogeneous catalysis during a one-pot reaction.

    PubMed

    Vidhani, Dinesh V; Krafft, Marie E; Alabugin, Igor V

    2014-01-01

    The combination of experiments and computations reveals unusual features of stereoselective Rh(I)-catalyzed transformation of propargyl vinyl ethers into (E,Z)-dienals. The first step, the conversion of propargyl vinyl ethers into allene aldehydes, proceeds under homogeneous conditions via a "cyclization-mediated" mechanism initiated by Rh(I) coordination at the alkyne. This path agrees well with the small experimental effects of substituents on the carbinol carbon. The key feature revealed by the computational study is the stereoelectronic effect of the ligand arrangement at the catalytic center. The rearrangement barriers significantly decrease due to the greater transfer of electron density from the catalytic metal center to the CO ligand oriented trans to the alkyne. This effect increases electrophilicity of the metal and lowers the calculated barriers by 9.0 kcal/mol. Subsequent evolution of the catalyst leads to the in situ formation of Rh(I) nanoclusters that catalyze stereoselective tautomerization. The intermediacy of heterogeneous catalysis by nanoclusters was confirmed by mercury poisoning, temperature-dependent sigmoidal kinetic curves, and dynamic light scattering. The combination of experiments and computations suggests that the initially formed allene-aldehyde product assists in the transformation of a homogeneous catalyst (or "a cocktail of catalysts") into nanoclusters, which in turn catalyze and control the stereochemistry of subsequent transformations.

  10. The Quality Control Algorithms Used in the Process of Creating the NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    The methodology and the results of the quality control (QC) process of the meteorological data from the Lightning Protection System (LPS) towers located at Kennedy Space Center (KSC) launch complex 39B (LC-39B) are documented in this paper. Meteorological data are used to design a launch vehicle, determine operational constraints, and to apply defined constraints on day-of-launch (DOL). In order to properly accomplish these tasks, a representative climatological database of meteorological records is needed, one that represents the climate the vehicle will encounter. Numerous meteorological measurement towers exist at KSC; however, the engineering tasks need measurements at specific heights, some of which can only be provided by a few towers. Other than the LPS towers, Tower 313 is the only tower that provides observations up to 150 m. This tower is located approximately 3.5 km from LC-39B. In addition, data need to be QC'ed to remove erroneous reports that could pollute the results of an engineering analysis, mislead the development of operational constraints, or provide a false image of the atmosphere at the tower's location.
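    Two of the most common QC tests for tower records of this kind are a physical-range check and a temporal-consistency (spike) check. The sketch below uses hypothetical thresholds and does not reproduce the actual NASA/KSC algorithms or limits.

```python
# Minimal sketch of two common meteorological QC checks: a physical-range
# test and a temporal-consistency (spike) test against the last accepted
# value.  Thresholds are hypothetical, not the KSC limits.

def qc_flags(series, lo, hi, max_step):
    """Return a parallel list of 'ok'/'range'/'spike' flags."""
    flags, last_good = [], None
    for value in series:
        if not (lo <= value <= hi):
            flags.append("range")          # physically impossible
        elif last_good is not None and abs(value - last_good) > max_step:
            flags.append("spike")          # implausible jump
        else:
            flags.append("ok")
            last_good = value              # only good values update the baseline
    return flags

# Wind speeds (m/s) with one impossible value and one unphysical jump.
wind = [5.2, 5.4, 5.1, 999.0, 5.3, 25.0, 5.2]
print(qc_flags(wind, lo=0.0, hi=75.0, max_step=10.0))
```

    Comparing against the last accepted value (rather than the immediately preceding report) keeps a single bad reading from cascading into false spike flags on the good data that follow it.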

  11. Cryogenic Homogenization and Sampling of Heterogeneous Multi-Phase Feedstock

    SciTech Connect

    Doyle, Glenn M.; Ideker, Virgene D.; Siegwarth, James D.

    1999-09-21

    An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock, reducing the temperature of the feedstock to a temperature below a critical temperature, reducing the size of the feedstock components, blending the reduced size feedstock to form a homogeneous mixture; and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (-196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.

  12. Cryogenic homogenization and sampling of heterogeneous multi-phase feedstock

    DOEpatents

    Doyle, Glenn Michael; Ideker, Virgene Linda; Siegwarth, James David

    2002-01-01

    An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock, reducing the temperature of the feedstock to a temperature below a critical temperature, reducing the size of the feedstock components, blending the reduced size feedstock to form a homogeneous mixture; and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (-196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.

  13. Genomic homogeneity in fibrolamellar carcinomas

    PubMed Central

    Sirivatanauksorn, Y; Sirivatanauksorn, V; Lemoine, N; Williamson, R; Davidson, B

    2001-01-01

    BACKGROUND—Fibrolamellar carcinoma (FLC) is a variant of hepatocellular carcinoma (HCC) with distinctive clinical and histological features. To date there have been few studies on the genotypic aspects of FLC and no previous attempts have been made to use the arbitrarily primed-polymerase chain reaction (AP-PCR) technique to detect genetic alterations in this disease.
AIM—The aim of this study was to assess the degree of genomic heterogeneity of FLC using the AP-PCR technique.
METHODS—A total of 50 tissue samples of primary and metastatic FLCs from seven patients were microdissected. AP-PCR amplification of each genomic DNA sample was carried out using two arbitrary primers.
RESULTS—DNA fingerprints of the primary FLCs and all their metastatic lesions (both synchronous and metachronous disease) were identical in an individual patient. The fingerprints were different between tumours of different patients. No evidence of intratumour heterogeneity was observed.
CONCLUSIONS—Such genomic homogeneity in FLCs may explain their indolent growth. The absence of clonal evolution, which is present in other tumours (particularly HCCs), may explain the distinct behaviour in this tumour. The tumorigenic pathway and degree of somatic genomic changes in this disease may be less complex than in HCC.


Keywords: fibrolamellar carcinoma; hepatocellular carcinoma; DNA fingerprint; arbitrarily primed-polymerase chain reaction; laser capture microdissection PMID:11413114

  14. The production of homogeneous extrudates of microcrystalline cellulose pastes.

    PubMed

    Rough, S L; Wilson, D I

    2004-05-19

    The homogeneity of water-based microcrystalline cellulose (MCC) paste extrudates was investigated during ram extrusion as a function of ram velocity. Variations in the water content of the extrudates were caused by liquid phase migration within the paste. The evolution in water content was measured by sectioning and drying the extrudate, and the subsequent homogeneity was quantified by the standard error in water content. The homogeneity of the extrudates was found to decrease as the ram velocity decreased. This result was also inferred from the rate of increase of the extrusion pressure. The extrudate homogeneity was significantly improved by compensating for water migration in the barrel during the compaction stage. This was achieved using a non-uniform initial paste billet, created by packing the barrel with layers of paste of different water contents. This technique also produced a smaller variation in extrusion pressure over the ram displacement range, and a reduction in water loss from the upstream paste compact into the extrudate and/or through the apparatus tooling.
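    The homogeneity metric used above, the standard error of water content across extrudate sections, is straightforward to compute from the sectioning-and-drying measurements. The wet/dry masses below are invented for illustration of the calculation, not data from the paper.

```python
import statistics

# Extrudate homogeneity quantified as the standard error of the water
# (mass) fraction measured on successive sections.  Hypothetical wet/dry
# masses for five sections, mimicking a drying-and-weighing protocol.

wet = [2.10, 2.08, 2.12, 2.05, 2.11]   # g, section mass before drying
dry = [1.05, 1.02, 1.08, 0.99, 1.06]   # g, section mass after drying

water_frac = [(w - d) / w for w, d in zip(wet, dry)]
sem = statistics.stdev(water_frac) / len(water_frac) ** 0.5
print(f"mean water fraction = {statistics.mean(water_frac):.3f}, "
      f"standard error = {sem:.4f}")
```

    A slower ram gives liquid more time to migrate, widening the section-to-section spread and hence this standard error, which is why homogeneity decreases with ram velocity in the study.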

  15. Comparative Analysis of a MOOC and a Residential Community Using Introductory College Physics: Documenting How Learning Environments Are Created, Lessons Learned in the Process, and Measurable Outcomes

    NASA Astrophysics Data System (ADS)

    Olsen, Jack Ryan

    Higher education institutions, such as the University of Colorado Boulder (CU-Boulder), have as a core mission to advance their students' academic performance. On the frontier of education technologies that hold the promise to address our educational mission are Massively Open Online Courses (MOOCs), which are new enough to not be fully understood or well-researched. MOOCs, in theory, have vast potential for being cost-effective and for reaching diverse audiences across the world. This thesis examines the implementation of one MOOC, Physics 1 for Physical Science Majors, implemented in the inaugural round of institutionally sanctioned MOOCs in Fall 2013. While comparatively inexpensive relative to a brick-and-mortar course, and while it initially enrolled an audience of nearly 16,000 students, this MOOC was found to be time-consuming to implement, and only roughly 1.5% of those who enrolled completed the course---approximately 1/4 of those who completed the standard brick and mortar course that the MOOC was designed around. An established education technology, residential communities, contrasts with MOOCs by being high-touch and highly humanized, but also expensive and locally based. The Andrews Hall Residential College (AHRC) on the CU campus fosters academic success and retention by engaging and networking students outside of the standard brick and mortar courses and enculturating students into an environment with vertical integration through the different classes: freshman, sophomore, junior, etc. The physics MOOC and the AHRC were studied to determine how the environments were made and what lessons were learned in the process. Also, student performance was compared for the physics MOOC, a subset of the AHRC students enrolled in a special physics course, and the standard CU Physics 1 brick and mortar course. All yielded similar learning gains in Physics 1 performance for those who completed the courses.
These environments are presented together to compare and contrast their

  16. Discovery of a Novel Immune Gene Signature with Profound Prognostic Value in Colorectal Cancer: A Model of Cooperativity Disorientation Created in the Process from Development to Cancer

    PubMed Central

    An, Ning; Shi, Xiaoyu; Zhang, Yueming; Lv, Ning; Feng, Lin; Di, Xuebing; Han, Naijun; Wang, Guiqi

    2015-01-01

    Immune response-related genes play a major role in colorectal carcinogenesis by mediating inflammation or immune-surveillance evasion. Although remarkable progress has been made to investigate the underlying mechanism, the understanding of the complicated carcinogenesis process has been enormously hindered by large-scale tumor heterogeneity. Development and carcinogenesis share striking similarities in their cellular behavior and underlying molecular mechanisms. The association between embryonic development and carcinogenesis makes embryonic development a viable reference model for studying cancer, thereby circumventing the potentially misleading complexity of tumor heterogeneity. Here we proposed that the immune genes, responsible for intra-immune cooperativity disorientation (defined in this study as disruption of developmental expression correlation patterns during carcinogenesis), probably contain an untapped prognostic resource for colorectal cancer. In this study, we determined the mRNA expression profile of 137 human biopsy samples, including samples from different stages of human colonic development, colorectal precancerous progression and colorectal cancer samples, among which 60 were also used to generate miRNA expression profile. We originally established a Spearman correlation transition model to quantify the cooperativity disorientation associated with the transition from normal to precancerous to cancer tissue, in conjunction with a miRNA-mRNA regulatory network and a machine learning algorithm to identify genes with prognostic value. Finally, a 12-gene signature was extracted, whose prognostic value was evaluated using Kaplan–Meier survival analysis in five independent datasets. Using the log-rank test, the 12-gene signature was closely related to overall survival in four datasets (GSE17536, n = 177, p = 0.0054; GSE17537, n = 55, p = 0.0039; GSE39582, n = 562, p = 0.13; GSE39084, n = 70, p = 0.11), and significantly associated with disease-free survival in four

  17. Discovery of a Novel Immune Gene Signature with Profound Prognostic Value in Colorectal Cancer: A Model of Cooperativity Disorientation Created in the Process from Development to Cancer.

    PubMed

    An, Ning; Shi, Xiaoyu; Zhang, Yueming; Lv, Ning; Feng, Lin; Di, Xuebing; Han, Naijun; Wang, Guiqi; Cheng, Shujun; Zhang, Kaitai

    2015-01-01

    Immune response-related genes play a major role in colorectal carcinogenesis by mediating inflammation or immune-surveillance evasion. Although remarkable progress has been made to investigate the underlying mechanism, the understanding of the complicated carcinogenesis process has been enormously hindered by large-scale tumor heterogeneity. Development and carcinogenesis share striking similarities in their cellular behavior and underlying molecular mechanisms. The association between embryonic development and carcinogenesis makes embryonic development a viable reference model for studying cancer, thereby circumventing the potentially misleading complexity of tumor heterogeneity. Here we proposed that the immune genes, responsible for intra-immune cooperativity disorientation (defined in this study as disruption of developmental expression correlation patterns during carcinogenesis), probably contain an untapped prognostic resource for colorectal cancer. In this study, we determined the mRNA expression profile of 137 human biopsy samples, including samples from different stages of human colonic development, colorectal precancerous progression and colorectal cancer samples, among which 60 were also used to generate miRNA expression profile. We originally established a Spearman correlation transition model to quantify the cooperativity disorientation associated with the transition from normal to precancerous to cancer tissue, in conjunction with a miRNA-mRNA regulatory network and a machine learning algorithm to identify genes with prognostic value. Finally, a 12-gene signature was extracted, whose prognostic value was evaluated using Kaplan-Meier survival analysis in five independent datasets. Using the log-rank test, the 12-gene signature was closely related to overall survival in four datasets (GSE17536, n = 177, p = 0.0054; GSE17537, n = 55, p = 0.0039; GSE39582, n = 562, p = 0.13; GSE39084, n = 70, p = 0.11), and significantly associated with disease-free survival in four
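    The correlation-transition idea, comparing how strongly a gene pair co-varies across samples in development versus in cancer, can be sketched with a stdlib-only Spearman correlation. The implementation below omits tie handling, and the expression values are invented for illustration, not the paper's 137-sample profiles.

```python
# Sketch of a "cooperativity disorientation" score: the change in
# Spearman correlation of a gene pair between developmental and tumour
# samples.  Stdlib-only Spearman (no tie handling); toy data.

def ranks(values):
    order = sorted(range(len(values)), key=values.__getitem__)
    r = [0.0] * len(values)
    for rank, idx in enumerate(order):
        r[idx] = float(rank)
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# A gene pair tightly co-regulated in development, decoupled in tumours.
dev_gene_a, dev_gene_b = [1, 2, 3, 4, 5], [2, 4, 5, 7, 9]
tum_gene_a, tum_gene_b = [1, 2, 3, 4, 5], [5, 1, 4, 2, 3]

shift = spearman(dev_gene_a, dev_gene_b) - spearman(tum_gene_a, tum_gene_b)
print(f"correlation transition (dev - tumour): {shift:+.2f}")
```

    A large shift marks a gene pair whose developmental co-expression pattern is disrupted during carcinogenesis, which is the raw signal the study's model aggregates before feeding candidates to the survival analysis.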

  18. Iterative and variational homogenization methods for filled elastomers

    NASA Astrophysics Data System (ADS)

    Goudarzi, Taha

    Elastomeric composites have increasingly proved invaluable in commercial technological applications due to their unique mechanical properties, especially their ability to undergo large reversible deformation in response to a variety of stimuli (e.g., mechanical forces, electric and magnetic fields, changes in temperature). Modern advances in organic materials science have revealed that elastomeric composites hold also tremendous potential to enable new high-end technologies, especially as the next generation of sensors and actuators featured by their low cost together with their biocompatibility, and processability into arbitrary shapes. This potential calls for an in-depth investigation of the macroscopic mechanical/physical behavior of elastomeric composites directly in terms of their microscopic behavior with the objective of creating the knowledge base needed to guide their bottom-up design. The purpose of this thesis is to generate a mathematical framework to describe, explain, and predict the macroscopic nonlinear elastic behavior of filled elastomers, arguably the most prominent class of elastomeric composites, directly in terms of the behavior of their constituents --- i.e., the elastomeric matrix and the filler particles --- and their microstructure --- i.e., the content, size, shape, and spatial distribution of the filler particles. This will be accomplished via a combination of novel iterative and variational homogenization techniques capable of accounting for interphasial phenomena and finite deformations. Exact and approximate analytical solutions for the fundamental nonlinear elastic response of dilute suspensions of rigid spherical particles (either firmly bonded or bonded through finite size interphases) in Gaussian rubber are first generated. These results are in turn utilized to construct approximate solutions for the nonlinear elastic response of non-Gaussian elastomers filled with a random distribution of rigid particles (again, either firmly
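    A useful small-strain baseline for the dilute-suspension results described above is the classical Einstein estimate for rigid spheres perfectly bonded to an incompressible matrix: the effective shear modulus stiffens as mu_eff = mu(1 + 2.5c) to first order in the particle volume fraction c. The finite-deformation and interphase effects treated in the thesis go beyond this sketch.

```python
# Classical small-strain, dilute estimate for rigid spheres perfectly
# bonded to an incompressible matrix: mu_eff = mu * (1 + 2.5 * c),
# valid to first order in the particle volume fraction c.  This is the
# linear-elastic baseline, not the finite-deformation results of the
# thesis.

def dilute_shear_modulus(mu_matrix, volume_fraction):
    if not 0.0 <= volume_fraction < 0.1:
        raise ValueError("dilute estimate only meaningful for small c")
    return mu_matrix * (1.0 + 2.5 * volume_fraction)

# 5% rigid filler in a 1 MPa rubber stiffens it by ~12.5%.
print(f"{dilute_shear_modulus(1.0e6, 0.05):.3e} Pa")
```

    Nonlinear homogenization schemes of the kind developed in the thesis must recover this dilute limit at small strains and small volume fractions, which makes it a handy sanity check.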

  19. Creating a Classroom Newspaper.

    ERIC Educational Resources Information Center

    Buss, Kathleen, Ed.; McClain-Ruelle, Leslie, Ed.

    Based on the premise that students can learn a great deal by reading and writing a newspaper, this book was created by preservice instructors to teach upper elementary students (grades 3-5) newspaper concepts, journalism, and how to write newspaper articles. It shows how to use newspaper concepts to help students integrate knowledge from multiple…

  20. Creating a Logo Environment.

    ERIC Educational Resources Information Center

    Riordon, Tim

    1982-01-01

    Discusses creation of computer classroom environment by implementing Logo, a computer program language designed to develop knowledge of programing, mathematics, and problem solving. Five questions are examined concerning Logo environment, attributes, elements absent in Logo environment, reasons for creating environment, and how to begin. Six…

  1. Creating an Effective Newsletter

    ERIC Educational Resources Information Center

    Shackelford, Ray; Griffis, Kurt

    2006-01-01

    Newsletters are an important resource or form of media. They offer a cost-effective way to keep people informed, as well as to promote events and programs. Production of a newsletter makes an excellent project, relevant to real-world communication, for technology students. This article presents an activity on how to create a short newsletter. The…

  2. Creating an Interactive PDF

    ERIC Educational Resources Information Center

    Branzburg, Jeffrey

    2008-01-01

    There are many ways to begin a PDF document using Adobe Acrobat. The easiest and most popular way is to create the document in another application (such as Microsoft Word) and then use the Adobe Acrobat software to convert it to a PDF. In this article, the author describes how he used Acrobat's many tools in his project--an interactive…

  3. Creating dedicated bioenergy crops

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Bioenergy is one of the current mechanisms of producing renewable energy to reduce our use of nonrenewable fossil fuels and to reduce carbon emissions into the atmosphere. Humans have been using bioenergy since we first learned to create and control fire - burning manure, peat, and wood to cook food...

  4. Looking, Writing, Creating.

    ERIC Educational Resources Information Center

    Katzive, Bonnie

    1997-01-01

    Describes how a middle school language arts teacher makes analyzing and creating visual art a partner to reading and writing in her classroom. Describes a project on art and Vietnam which shows how background information can add to and influence interpretation. Describes a unit on Greek mythology and Greek vases which leads to a related visual…

  5. Creating Dialogue by Storytelling

    ERIC Educational Resources Information Center

    Passila, Anne; Oikarinen, Tuija; Kallio, Anne

    2013-01-01

    Purpose: The objective of this paper is to develop practice and theory from Augusto Boal's dialogue technique (Image Theatre) for organisational use. The paper aims to examine how the members in an organisation create dialogue together by using a dramaturgical storytelling framework where the dialogue emerges from storytelling facilitated by…

  6. Creating a Market.

    ERIC Educational Resources Information Center

    Kazimirski, J.; And Others

    The second in a series of programmed books, "Creating a Market" is published by the International Labour Office as a manual for persons studying marketing. This manual was designed to meet the needs of the labor organization's technical cooperation programs and is primarily concerned with consumer goods industries. Using a fill-in-the-blanks and…

  7. Creating Pupils' Internet Magazine

    ERIC Educational Resources Information Center

    Bognar, Branko; Šimic, Vesna

    2014-01-01

    This article presents an action research, which aimed to improve pupils' literary creativity and enable them to use computers connected to the internet. The study was conducted in a small district village school in Croatia. Creating a pupils' internet magazine appeared to be an excellent way for achieving the educational aims of almost all…

  8. Creating an Interactive Globe.

    ERIC Educational Resources Information Center

    Martin, Kurt D.

    1989-01-01

    Describes a hands-on geography activity that is designed to teach longitude and latitude to fifth-grade students. Children create a scale model of the earth from a 300 gram weather balloon. This activity incorporates geography, mathematics, science, art, and homework. Provides information for obtaining materials. (KO)

  9. Creating Photo Illustrations.

    ERIC Educational Resources Information Center

    Wilson, Bradley

    2003-01-01

    Explains the uses of photo illustrations. Notes that the key to developing a successful photo illustration is collaborative planning. Outlines the following guidelines for photo illustrations: never set up a photograph to mimic reality; create only abstractions with photo illustrations; clearly label photo illustrations; and never play photo…

  10. Creating Quality Media Materials.

    ERIC Educational Resources Information Center

    Hortin, John A.; Bailey, Gerald D.

    1982-01-01

    Innovation, imagination, and student creativity are key ingredients in creating quality media materials for the small school. Student-produced media materials, slides without a camera, personalized slide programs and copy work, self-made task cards, self-made overhead transparencies, graphic materials, and utilization of the mass media are some of…

  11. Create a Critter Collector.

    ERIC Educational Resources Information Center

    Hinchey, Elizabeth K.; Nestlerode, Janet A.

    2001-01-01

    Presents methods for creating appropriate ways of collecting live specimens to use for firsthand observation in the classroom. Suggests ecological questions for students to address using these devices. This project is ideal for schools that have access to piers or bridges on a coastal body of water. (NB)

  12. Creating Historical Drama.

    ERIC Educational Resources Information Center

    Cassler, Robert

    1990-01-01

    Describes creating for the National Archives Public Education Department a historical drama, "Second in the Realm," based on the story of the Magna Carta. Demonstrates the effectiveness of historical drama as a teaching tool. Explains the difficulties of writing such dramas and provides guidelines for overcoming these problems. (NL)

  13. Creating Special Events

    ERIC Educational Resources Information Center

    deLisle, Lee

    2009-01-01

    "Creating Special Events" is organized as a systematic approach to festivals and events for students who seek a career in event management. This book looks at the evolution and history of festivals and events and proceeds to the nuts and bolts of event management. The book presents event management as the means of planning, organizing, directing,…

  14. Create Your State

    ERIC Educational Resources Information Center

    Dunham, Kris; Melvin, Samantha

    2011-01-01

    Students are often encouraged to work together with their classmates, sometimes with other classes, occasionally with kids at other schools, but rarely with kids across the country. In this article the authors describe the Create Your State project, a collaborative nationwide project inspired by the Texas Chair Project wherein the artist, Damien…

  15. Creating a Classroom Makerspace

    ERIC Educational Resources Information Center

    Rivas, Luz

    2014-01-01

    What is a makerspace? Makerspaces are community-operated physical spaces where people (makers) create do-it-yourself projects together. These membership spaces serve as community labs where people learn together and collaborate on projects. Makerspaces often have tools and equipment like 3-D printers, laser cutters, and soldering irons.…

  16. How Banks Create Money.

    ERIC Educational Resources Information Center

    Beale, Lyndi

    This teaching module explains how the U.S. banking system uses excess reserves to create money in the form of new deposits for borrowers. The module is part of a computer-animated series of four-to-five-minute modules illustrating standard concepts in high school economics. Although the module is designed to accompany the video program, it may be…
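The deposit-expansion mechanism the module describes can be sketched numerically. This is a minimal illustration of the textbook money multiplier; the reserve ratio and deposit amounts are arbitrary example values, not figures from the module:

```python
def deposit_expansion(initial_deposit, reserve_ratio, rounds=1000):
    """Simulate repeated lending of excess reserves.

    Each round the bank keeps `reserve_ratio` of a new deposit as
    required reserves and lends out the rest, which returns to the
    banking system as another deposit.
    """
    total_deposits = 0.0
    deposit = initial_deposit
    for _ in range(rounds):
        total_deposits += deposit
        deposit *= (1.0 - reserve_ratio)  # excess reserves are lent out
    return total_deposits

# With a 10% reserve requirement, a $1,000 deposit supports roughly
# $10,000 of total deposits: the money multiplier 1 / reserve_ratio.
total = deposit_expansion(1000.0, 0.10)
```

The loop converges to the closed-form limit `initial_deposit / reserve_ratio`, which is the standard-concept arithmetic the module animates.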

  17. Homogeneous catalysts in hypersonic combustion

    SciTech Connect

    Harradine, D.M.; Lyman, J.L.; Oldenborg, R.C.; Pack, R.T.; Schott, G.L.

    1989-01-01

    Density and residence time both become unfavorably small for efficient combustion of hydrogen fuel in ramjet propulsion in air at high altitude and hypersonic speed. Raising the density and increasing the transit time of the air through the engine necessitates stronger contraction of the air flow area. This enhances the kinetic and thermodynamic tendency of H₂O to form completely, accompanied only by N₂ and any excess H₂ (or O₂). The by-products to be avoided are the energetically expensive fragment species H and/or O atoms and OH radicals, and residual (2H₂ plus O₂). However, excessive area contraction raises air temperature and consequent combustion-product temperature by adiabatic compression. This counteracts and ultimately overwhelms the thermodynamic benefit by which higher density favors the triatomic product, H₂O, over its monatomic and diatomic alternatives. For static pressures in the neighborhood of 1 atm, static temperature must be kept or brought below ca. 2400 K for acceptable stability of H₂O. Another measure, whose requisite chemistry we address here, is to extract propulsive work from the combustion products early in the expansion. The objective is to lower the static temperature of the combustion stream enough for H₂O to become adequately stable before the exhaust flow is massively expanded and its composition "frozen." We proceed to address this mechanism and its kinetics, and then examine prospects for enhancing its rate by homogeneous catalysts. 9 refs.

  18. Exploring earthquake databases for the creation of magnitude-homogeneous catalogues: tools for application on a regional and global scale

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Pagani, M.; Garcia, J.

    2016-09-01

    The creation of a magnitude-homogenized catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenizing multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Center (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilize this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonize magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonized into moment magnitude to form a catalogue of more than 562 840 events. This extended catalogue, while not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
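The empirical-conversion step described in this abstract can be sketched in a few lines. This is an illustrative stand-in only (synthetic data and the simplest possible linear model fitted with `numpy.polyfit`), not the released open-source toolkit or its actual conversion relations:

```python
import numpy as np

def fit_conversion(native_mag, target_mw):
    """Fit the simplest empirical magnitude-conversion model,
    Mw ≈ a * M_native + b, by ordinary least squares."""
    a, b = np.polyfit(native_mag, target_mw, deg=1)
    return a, b

def harmonize(native_mag, a, b):
    """Apply the fitted conversion to a catalogue's native magnitudes."""
    return a * np.asarray(native_mag) + b

# Synthetic example: events with solutions in both magnitude scales.
rng = np.random.default_rng(0)
m_native = rng.uniform(4.0, 7.0, 200)
m_w = 0.9 * m_native + 0.6 + rng.normal(0.0, 0.05, 200)  # assumed "true" relation

a, b = fit_conversion(m_native, m_w)
mw_harmonized = harmonize([5.0, 6.0], a, b)
```

In practice the published tools use carefully appraised common events and more elaborate regression models; the sketch only shows where an empirical conversion sits in the homogenization workflow.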

  19. Design and testing of a refractive laser beam homogenizer

    NASA Astrophysics Data System (ADS)

    Fernelius, N. C.; Bradley, K. R.; Hoekstra, B. L.

    1984-09-01

    A survey is made of various techniques to create a homogeneous, or flat-top, laser beam profile. A refractive homogenizer was designed for use with a Nd:YAG laser operating at its fundamental (1.06 micrometer) and frequency-doubled (532 nm) wavelengths. The system consists of a 2X beam expander and two faceted cylindrical lenses with differing focal lengths. Each cylindrical lens focuses its input into a strip the width of a facet. By orienting their axes at a 90 degree angle and focusing them on the same plane, the beam is concentrated into a square focus. Formulae for calculating the facet angles are derived, and a FORTRAN computer program was written to calculate them to a precision greater than that with which the facets can be fabricated.
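The facet-angle calculation can be illustrated with a small-angle sketch. Everything here is an assumption for illustration (thin-prism approximation, each facet steering its strip's center onto the optical axis at the focal plane, and the example parameter values); these are not the formulae derived in the paper:

```python
import math

def facet_angles(n_facets, facet_width, focal_length, refr_index):
    """Thin-prism estimate of the wedge angle of each facet of a
    faceted cylindrical lens that folds an expanded beam into a
    single strip at distance `focal_length`.

    Facet i, centered at offset y_i from the axis, must deflect rays
    by about atan(y_i / focal_length); a thin prism of wedge angle w
    deflects by approximately (n - 1) * w.
    """
    angles = []
    for i in range(n_facets):
        y = (i - (n_facets - 1) / 2.0) * facet_width   # facet-center offset
        deflection = math.atan2(y, focal_length)        # required bend angle
        angles.append(deflection / (refr_index - 1.0))  # prism wedge angle
    return angles

# Example: 5 facets, 4 mm wide, common focus at 100 mm, n ≈ 1.46.
angles = facet_angles(5, 4.0, 100.0, 1.46)
```

By symmetry the central facet needs no wedge and the outer facets carry equal and opposite angles, which is the qualitative behavior any such homogenizer design must reproduce.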

  20. Effect of heat and homogenization on in vitro digestion of milk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Central to commercial fluid milk processing is the use of high temperature, short time (HTST) pasteurization to ensure the safety and quality of milk, and homogenization to prevent creaming of fat-containing milk. UHT processed homogenized milk is also available commercially and is typically used to...

  1. Homogeneous freezing of water starts in the subsurface.

    PubMed

    Vrbka, Lubos; Jungwirth, Pavel

    2006-09-21

    Molecular dynamics simulations of homogeneous ice nucleation in extended aqueous slabs show that freezing preferentially starts in the subsurface. The top surface layer remains disordered during the freezing process. The subsurface accommodates better than the bulk the increase of volume connected with freezing. It also experiences strong electric fields caused by oriented surface water molecules, which can enhance ice nucleation. Our computational results shed new light on the experimental controversy concerning the bulk vs surface origin of homogeneous ice nucleation in water droplets. This has important atmospheric implications for the microphysics of formation of high altitude clouds.

  2. Context homogeneity facilitates both distractor inhibition and target enhancement.

    PubMed

    Feldmann-Wüstefeld, Tobias; Schubö, Anna

    2013-01-01

    Homogeneous contexts were shown to result in prioritized processing of embedded targets compared to heterogeneous contexts (Duncan & Humphreys, 1989). The present experiment used behavioral and ERP measures to examine whether context homogeneity affects both the enhancement of relevant information and the inhibition of irrelevant information. Targets and distractors were presented laterally or on the vertical midline, which allowed disentangling target- and distractor-related activity in the lateralized ERP (Hickey, Di Lollo, & McDonald, 2009). In homogeneous contexts, targets elicited an NT component from 150 ms on and a PD component from 200 ms on, showing early attention deployment at target locations and active suppression of distractors. In heterogeneous contexts, an NT component was also found from 150 ms on, but the PD was found only from 250 ms on, suggesting delayed suppression of the distractor. Before 250 ms, distractors in heterogeneous contexts elicited a contralateral negativity, indicating attentional capture by the distractor prior to active suppression. In sum, the present results suggest that top-down control of attention is more pronounced in homogeneous than in heterogeneous contexts.

  3. Creating Geoscience Leaders

    NASA Astrophysics Data System (ADS)

    Buskop, J.; Buskop, W.

    2013-12-01

    The United Nations Educational, Scientific, and Cultural Organization recognizes 21 World Heritage Sites in the United States, ten of which have astounding geological features: Wrangell-St. Elias National Park, Olympic National Park, Mesa Verde National Park, Chaco Canyon, Glacier National Park, Carlsbad Caverns National Park, Mammoth Cave, Great Smoky Mountains National Park, Hawaii Volcanoes National Park, and Everglades National Park. Frustrated by fellow students who were addicted to smartphones and showed little interest in the geosciences, one student visited each World Heritage Site in the United States and created one e-book chapter per park. Each chapter was created with original photographs and a geological discovery hunt to encourage teen involvement in preserving remarkable geological sites. Each chapter describes at least one way young adults can get involved with the geosciences, such as cave geology, glaciology, hydrology, and volcanology. The e-book devotes one chapter to each park, with each chapter providing a geological discovery hunt, information on how to get involved with conservation of the parks, geological maps of the parks, parallels between archaeological and geological sites, and advice on how to talk to a ranger. The young author is approaching UNESCO to publish the work as a free e-book to encourage involvement in UNESCO sites and to prove that the geosciences are fun.

  4. Cell-Laden Poly(ɛ-caprolactone)/Alginate Hybrid Scaffolds Fabricated by an Aerosol Cross-Linking Process for Obtaining Homogeneous Cell Distribution: Fabrication, Seeding Efficiency, and Cell Proliferation and Distribution

    PubMed Central

    Lee, HyeongJin; Ahn, SeungHyun; Bonassar, Lawrence J.; Chun, Wook

    2013-01-01

    Generally, solid-freeform fabricated scaffolds show a controllable pore structure (pore size, porosity, pore connectivity, and permeability) and mechanical properties by using computer-aided techniques. Although the scaffolds can provide repeated and appropriate pore structures for tissue regeneration, they have a low biological activity, such as low cell-seeding efficiency and nonuniform cell density in the scaffold interior after a long culture period, due to a large pore size and completely open pores. Here we fabricated three different poly(ɛ-caprolactone) (PCL)/alginate scaffolds: (1) a rapid prototyped porous PCL scaffold coated with an alginate, (2) the same PCL scaffold coated with a mixture of alginate and cells, and (3) a multidispensed hybrid PCL/alginate scaffold embedded with cell-laden alginate struts. The three scaffolds had similar micropore structures (pore size=430–580 μm, porosity=62%–68%, square pore shape). Preosteoblast cells (MC3T3-E1) were used at the same cell density in each scaffold. By measuring cell-seeding efficiency, cell viability, and cell distribution after various periods of culturing, we sought to determine which scaffold was more appropriate for homogeneously regenerated tissues. PMID:23469894

  5. Cell-laden poly(ɛ-caprolactone)/alginate hybrid scaffolds fabricated by an aerosol cross-linking process for obtaining homogeneous cell distribution: fabrication, seeding efficiency, and cell proliferation and distribution.

    PubMed

    Lee, HyeongJin; Ahn, SeungHyun; Bonassar, Lawrence J; Chun, Wook; Kim, GeunHyung

    2013-10-01

    Generally, solid-freeform fabricated scaffolds show a controllable pore structure (pore size, porosity, pore connectivity, and permeability) and mechanical properties by using computer-aided techniques. Although the scaffolds can provide repeated and appropriate pore structures for tissue regeneration, they have a low biological activity, such as low cell-seeding efficiency and nonuniform cell density in the scaffold interior after a long culture period, due to a large pore size and completely open pores. Here we fabricated three different poly(ɛ-caprolactone) (PCL)/alginate scaffolds: (1) a rapid prototyped porous PCL scaffold coated with an alginate, (2) the same PCL scaffold coated with a mixture of alginate and cells, and (3) a multidispensed hybrid PCL/alginate scaffold embedded with cell-laden alginate struts. The three scaffolds had similar micropore structures (pore size = 430-580 μm, porosity = 62%-68%, square pore shape). Preosteoblast cells (MC3T3-E1) were used at the same cell density in each scaffold. By measuring cell-seeding efficiency, cell viability, and cell distribution after various periods of culturing, we sought to determine which scaffold was more appropriate for homogeneously regenerated tissues.

  6. Toward site-specific, homogeneous and highly stable fluorescent silver nanoclusters fabrication on triplex DNA scaffolds

    PubMed Central

    Feng, Lingyan; Huang, Zhenzhen; Ren, Jinsong; Qu, Xiaogang

    2012-01-01

    A new strategy to create site-specific, homogeneous, and bright silver nanoclusters (AgNCs) with high stability was demonstrated using triplex DNA as a template. By rational design of the DNA sequence, a homogeneous Ag2 cluster was obtained at the predefined CG·C+ site of the triplex DNA. This strategy was also explored for the controlled alignment of AgNCs on a DNA nanoscaffold. To the best of our knowledge, this was the first example to simultaneously answer the challenges of excellent site-specific nucleation and growth, homogeneity, and stability against salt for DNA-templated AgNCs. PMID:22570417

  7. The Case Against Homogeneous Sets in Mathematics

    ERIC Educational Resources Information Center

    Jackman, M. K.

    1973-01-01

    A point-by-point criticism is made of F. H. Flynn's article, "The Case for Homogeneous Sets in Mathematics" (Mathematics in School, Volume 1, Number 2, 1972), in an attempt to show that the arguments used in trying to justify homogeneous grouping in mathematics are invalid. (Editor/DT)

  8. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratiannil, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.; Willett, K.

    2013-09-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including i) the centered root mean square error relative to the true homogeneous values at various averaging scales, ii) the error in linear trend estimates and iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.
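Two of the performance metrics listed above can be sketched directly. This is an illustrative implementation of the general statistics named in the abstract (centered RMSE against the true homogeneous series, and the error in OLS linear trend estimates), not the HOME benchmark's actual evaluation code:

```python
import numpy as np

def centered_rmse(homogenized, truth):
    """Centered root-mean-square error: the mean of each series is
    removed first, so constant offsets are not penalized and only the
    shape of the remaining errors is scored."""
    h = np.asarray(homogenized) - np.mean(homogenized)
    t = np.asarray(truth) - np.mean(truth)
    return float(np.sqrt(np.mean((h - t) ** 2)))

def trend_error(homogenized, truth):
    """Difference in linear trend (units per time step), with each
    trend estimated by ordinary least squares."""
    x = np.arange(len(truth))
    slope_h = np.polyfit(x, homogenized, 1)[0]
    slope_t = np.polyfit(x, truth, 1)[0]
    return float(slope_h - slope_t)

# Usage on a synthetic monthly series: a residual quasi-periodic
# inhomogeneity raises the centered RMSE but barely moves the trend.
x = np.arange(120)
truth = 10.0 + 0.01 * x
homog = truth + 0.2 * np.sin(x / 7.0)
score = centered_rmse(homog, truth)
bias = trend_error(homog, truth)
```

A homogenized series that differs from the truth only by a constant offset scores zero on both metrics, which is why the benchmark also reports contingency skill scores for break detection.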

  9. Are geological media homogeneous or heterogeneous for neutron investigations?

    PubMed

    Woźnicka, U; Drozdowicz, K; Gabańska, B; Krynicka, E; Igielski, A

    2003-01-01

    The thermal neutron absorption cross section of a heterogeneous material is lower than that of the corresponding homogeneous one containing the same components. When rock materials are investigated, the sample usually contains grains, which create heterogeneity. The heterogeneity effect depends on the mass contributions of the highly absorbing and low-absorbing centers, on the ratio of their absorption cross sections, and on their sizes. The influence of the granulation of silicon and diabase samples on the absorption cross section measured with Czubek's method has been investigated experimentally. A 20% underestimation of the absorption cross section has been observed for diabase grains with sizes from 6.3 to 12.8 mm.
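The direction of the heterogeneity effect can be shown with a toy flux-weighted average: neutrons sample the strong absorber less because the flux is depressed inside absorbing grains (self-shielding). The flux-depression factor below is an arbitrary assumed value for illustration, not Czubek's model or the measured diabase data:

```python
def homogeneous_sigma(fractions, sigmas):
    """Composition-weighted mean absorption cross section of a
    perfectly mixed (homogeneous) sample."""
    return sum(f * s for f, s in zip(fractions, sigmas))

def heterogeneous_sigma(fractions, sigmas, flux):
    """Flux-weighted mean: the effective cross section seen in a
    grainy sample, with the flux depressed inside absorbing grains."""
    num = sum(f * phi * s for f, phi, s in zip(fractions, flux, sigmas))
    den = sum(f * phi for f, phi in zip(fractions, flux))
    return num / den

frac  = [0.3, 0.7]    # strong absorber, weak absorber (mass fractions)
sigma = [10.0, 1.0]   # relative absorption cross sections
flux  = [0.6, 1.0]    # assumed flux depression inside absorber grains

hom = homogeneous_sigma(frac, sigma)        # 3.7
het = heterogeneous_sigma(frac, sigma, flux)
# het < hom: the grainy sample under-reports its absorption,
# as observed for the coarse diabase grains.
```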

  10. Molecular weight enlargement--a molecular approach to continuous homogeneous catalysis.

    PubMed

    Janssen, Michèle; Müller, Christian; Vogt, Dieter

    2010-09-28

    Molecular weight enlargement (MWE) is an attractive method for homogeneous catalyst recycling. Applications of MWE in combination with either catalyst precipitation or nanofiltration have demonstrated their great potential as a method for process intensification in homogeneous catalysis. Selected, recent advances in MWE in combination with catalyst recovery are discussed, together with their implication for future developments. These examples demonstrate that this strategy is applicable in many different homogeneously catalyzed transformations.

  11. Creating healthy camp experiences.

    PubMed

    Walton, Edward A; Tothy, Alison S

    2011-04-01

    The American Academy of Pediatrics has created recommendations for health appraisal and preparation of young people before participation in day or resident camps and to guide health and safety practices for children at camp. These recommendations are intended for parents, primary health care providers, and camp administration and health center staff. Although camps have diverse environments, there are general guidelines that apply to all situations and specific recommendations that are appropriate under special conditions. This policy statement has been reviewed and is supported by the American Camp Association. PMID:21444589

  12. Creating sustainable performance.

    PubMed

    Spreitzer, Gretchen; Porath, Christine

    2012-01-01

    What makes for sustainable individual and organizational performance? Employees who are thriving-not just satisfied and productive but also engaged in creating the future. The authors found that people who fit this description demonstrated 16% better overall performance, 125% less burnout, 32% more commitment to the organization, and 46% more job satisfaction than their peers. Thriving has two components: vitality, or the sense of being alive and excited, and learning, or the growth that comes from gaining knowledge and skills. Some people naturally build vitality and learning into their jobs, but most employees are influenced by their environment. Four mechanisms, none of which requires heroic effort or major resources, create the conditions for thriving: providing decision-making discretion, sharing information about the organization and its strategy, minimizing incivility, and offering performance feedback. Organizations such as Alaska Airlines, Zingerman's, Quicken Loans, and Caiman Consulting have found that helping people grow and remain energized at work is valuable on its own merits-but it can also boost performance in a sustainable way. PMID:22299508

  13. Creating corporate advantage.

    PubMed

    Collis, D J; Montgomery, C A

    1998-01-01

    What differentiates truly great corporate strategies from the merely adequate? How can executives at the corporate level create tangible advantage for their businesses that makes the whole more than the sum of the parts? This article presents a comprehensive framework for value creation in the multibusiness company. It addresses the most fundamental questions of corporate strategy: What businesses should a company be in? How should it coordinate activities across businesses? What role should the corporate office play? How should the corporation measure and control performance? Through detailed case studies of Tyco International, Sharp, the Newell Company, and Saatchi and Saatchi, the authors demonstrate that the answers to all those questions are driven largely by the nature of a company's special resources--its assets, skills, and capabilities. These range along a continuum from the highly specialized at one end to the very general at the other. A corporation's location on the continuum constrains the set of businesses it should compete in and limits its choices about the design of its organization. Applying the framework, the authors point out the common mistakes that result from misaligned corporate strategies. Companies mistakenly enter businesses based on similarities in products rather than the resources that contribute to competitive advantage in each business. Instead of tailoring organizational structures and systems to the needs of a particular strategy, they create plain-vanilla corporate offices and infrastructures. The company examples demonstrate that one size does not fit all. One can find great corporate strategies all along the continuum.

  15. Homogenization method based on the inverse problem

    SciTech Connect

    Tota, A.; Makai, M.

    2013-07-01

    We present a method for deriving homogeneous multi-group cross sections to replace a heterogeneous region's multi-group cross sections, provided that the fluxes and the currents on the external boundary, and the region-averaged fluxes, are preserved. The method is developed using the diffusion approximation to the neutron transport equation in a symmetrical slab geometry. Assuming that the boundary fluxes are given, two response matrices (RMs) can be defined. The first derives the boundary current from the boundary flux; the second derives the flux integral over the region from the boundary flux. Assuming that these RMs are known, we present a formula that reconstructs the multi-group cross-section matrix and the diffusion coefficients from the RMs of a homogeneous slab. Applying this formula to the RMs of a slab with multiple homogeneous regions yields a homogenization method that produces homogenized multi-group cross sections and diffusion coefficients such that the fluxes and the currents on the external boundary, and the region-averaged fluxes, are preserved. The method is based on the determination of the eigenvalues and the eigenvectors of the RMs. We reproduce the four-group cross-section matrix and the diffusion constants from the RMs in numerical examples. We give conditions for replacing a heterogeneous region by a homogeneous one so that the boundary current and the region-averaged flux are preserved for a given boundary flux. (authors)
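The inverse-problem idea can be illustrated in the simplest possible setting: one energy group and one homogeneous slab, where the two "response" quantities reduce to scalars. This sketch is a scalar analogue of the paper's multi-group response-matrix formalism, not its actual method; the one-group diffusion relations used here are standard textbook results for a source-free symmetric slab:

```python
import math

def forward(D, sig_a, a, phi_b=1.0):
    """Responses of a source-free homogeneous slab (half-width a) to a
    symmetric boundary flux phi_b, from one-group diffusion theory:
    flux phi(x) = phi_b * cosh(k x) / cosh(k a), k = sqrt(sig_a / D).
    Returns the boundary inflow current and the region-averaged flux."""
    k = math.sqrt(sig_a / D)
    current = D * k * math.tanh(k * a) * phi_b      # inflow at each boundary
    avg_flux = phi_b * math.tanh(k * a) / (k * a)   # region-averaged flux
    return current, avg_flux

def inverse(current, avg_flux, a, phi_b=1.0):
    """Recover (D, sig_a) of the equivalent homogeneous slab from the
    two responses, by bisection on tanh(k a)/(k a) = avg_flux/phi_b
    (the left side decreases monotonically in k)."""
    target = avg_flux / phi_b
    lo, hi = 1e-9, 1e3
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if math.tanh(mid * a) / (mid * a) > target:
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    D = current / (k * math.tanh(k * a) * phi_b)
    return D, k * k * D

# Round trip: generate responses from known constants, then recover them.
J, phi_avg = forward(D=1.2, sig_a=0.05, a=10.0)
D_rec, sig_rec = inverse(J, phi_avg, a=10.0)
```

The round trip recovers the original constants exactly, which is the one-group, one-region version of the preservation property the paper proves via the eigendecomposition of the RMs.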

  16. Homogenization of precipitation time series with ACMANT

    NASA Astrophysics Data System (ADS)

    Domonkos, Peter

    2015-10-01

    A new method for the time-series homogenization of observed precipitation (PP) totals is presented; this method is a unit of the ACMANT software package. ACMANT is a relative homogenization method; a minimum of four time series with adequate spatial correlations is necessary for its use. The detection of inhomogeneities (IHs) is performed by fitting an optimal step function, while the calculation of adjustment terms is based on the minimization of the residual variance in the homogenized datasets. Together with the presentation of PP homogenization with ACMANT, some peculiarities of PP homogenization, such as the frequency and seasonal variation of IHs in observed PP data and their relation to the performance of homogenization methods, are discussed. In climatic regions with snowy winters, ACMANT distinguishes two seasons, namely a rainy season and a snowy season, and the seasonal IHs are searched for with bivariate detection. ACMANT is a fully automatic method, is freely downloadable from the internet, and accepts either daily or monthly input. Series of observed data in the input dataset may cover different periods, and the occurrence of data gaps is allowed. False zero values entered in place of the missing-data code, as well as physical outliers, should be corrected before running ACMANT. Efficiency tests indicate that ACMANT belongs among the best-performing methods, although further comparative tests of automatic homogenization methods are needed to confirm or reject this finding.
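The core detection idea, fitting an optimal step function by minimizing residual variance, can be sketched for the simplest case of a single break in a single series. This toy is not ACMANT (which handles multiple breaks, multiple reference series, and seasonal bivariate detection); it only shows what "optimal step function" means:

```python
import numpy as np

def best_single_break(series):
    """Find the break index minimizing the residual sum of squares of
    a two-segment constant (one-step) fit, and return it together with
    the adjustment that would align the pre-break segment."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    best_i, best_rss = None, np.inf
    for i in range(2, n - 1):            # at least 2 points per segment
        left, right = x[:i], x[i:]
        rss = (((left - left.mean()) ** 2).sum()
               + ((right - right.mean()) ** 2).sum())
        if rss < best_rss:
            best_i, best_rss = i, rss
    adjustment = x[best_i:].mean() - x[:best_i].mean()
    return best_i, adjustment

# Synthetic monthly anomaly series with a -0.8 shift before index 60.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 0.1, 120)
x[:60] -= 0.8
idx, adj = best_single_break(x)   # idx should land near 60, adj near +0.8
x[:idx] += adj                    # homogenized series
```

Real relative homogenization applies this kind of fit to difference series against neighboring stations, so that climate signal common to the network is not mistaken for an inhomogeneity.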

  17. Deforestation homogenizes tropical parasitoid-host networks.

    PubMed

    Laliberté, Etienne; Tylianakis, Jason M

    2010-06-01

    Human activities drive biotic homogenization (loss of regional diversity) of many taxa. However, whether species interaction networks (e.g., food webs) can also become homogenized remains largely unexplored. Using 48 quantitative parasitoid-host networks replicated through space and time across five tropical habitats, we show that deforestation greatly homogenized network structure at a regional level, such that interaction composition became more similar across rice and pasture sites compared with forested habitats. This was not simply caused by altered consumer and resource community composition, but was associated with altered consumer foraging success, such that parasitoids were more likely to locate their hosts in deforested habitats. Furthermore, deforestation indirectly homogenized networks in time through altered mean consumer and prey body size, which decreased in deforested habitats. Similar patterns were obtained with binary networks, suggesting that interaction (link) presence-absence data may be sufficient to detect network homogenization effects. Our results show that tropical agroforestry systems can support regionally diverse parasitoid-host networks, but that removal of canopy cover greatly homogenizes the structure of these networks in space, and to a lesser degree in time. Spatiotemporal homogenization of interaction networks may alter coevolutionary outcomes and reduce ecological resilience at regional scales, but may not necessarily be predictable from community changes observed within individual trophic levels. PMID:20583715

  18. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2012-01-01

    The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data.
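    Metric (i) above, the centered root-mean-square error, can be sketched as follows: each series is reduced to anomalies about its own mean before the RMSE is taken, so a constant offset in the homogenized series is not penalized while uncorrected breaks still are. The function name is hypothetical, and the HOME study evaluated this metric at several averaging scales.

    ```python
    import numpy as np

    def centered_rmse(homogenized, truth):
        """RMSE between the two series after subtracting each series' own mean,
        so a constant bias does not contribute to the error."""
        a = homogenized - np.mean(homogenized)
        b = truth - np.mean(truth)
        return float(np.sqrt(np.mean((a - b) ** 2)))

    truth = np.sin(np.linspace(0.0, 6.0, 120))
    offset_only = truth + 1.5       # pure constant bias: centered RMSE is zero
    with_break = truth.copy()
    with_break[60:] += 1.0          # an uncorrected break still registers
    ```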

  19. Is the universe homogeneous on large scale?

    NASA Astrophysics Data System (ADS)

    Zhu, Xingfen; Chu, Yaoquan

    Whether the distribution of matter in the universe is homogeneous or fractal on large scales has been vigorously debated in observational cosmology recently. Pietronero and his co-workers have strongly advocated that the fractal behaviour in the galaxy distribution extends to the largest scale observed (≍1000h-1Mpc) with the fractal dimension D ≍ 2. Most cosmologists who hold the standard model, however, maintain that the universe is homogeneous on large scales. The answer to whether the universe is homogeneous on large scales must await the results of the next generation of galaxy redshift surveys.

  20. Bubbles in an isotropic homogeneous turbulent flow

    NASA Astrophysics Data System (ADS)

    Mancilla, F. E.; Martinez, M.; Soto, E.; Ascanio, G.; Zenit, R.

    2011-11-01

    Bubbly turbulent flow plays an important role in many engineering applications and natural phenomena. In this kind of flow the bubbles are dispersed in a turbulent flow and interact with the turbulent structures. The present study focuses on the motion and hydrodynamic interaction of a single bubble in a turbulent environment. Most previous studies analyzed the effect of bubbles on the carrier fluid under the assumption that the bubble size was significantly smaller than the smallest turbulence length scale. An experimental study of the effect of an isotropic and homogeneous turbulent flow on the bubble shape and motion was conducted. Experiments were performed in an isotropic turbulence chamber with nearly zero mean flow, into which a single bubble was injected. The fluid velocity was measured using the Particle Image Velocimetry (PIV) technique. The bubble deformation was determined by video processing of high-speed movies. The fluid disturbances of the bubble shape were studied for bubbles of different sizes. We will present the experimental data obtained and discuss the differences among these results in an attempt to understand the bubble-turbulence interaction mechanisms.

  1. Homogeneous cosmology with aggressively expanding civilizations

    NASA Astrophysics Data System (ADS)

    Olson, S. Jay

    2015-11-01

    In the context of a homogeneous Universe, we note that the appearance of aggressively expanding advanced life is geometrically similar to the process of nucleation and bubble growth in a first-order cosmological phase transition. We exploit this similarity to describe the dynamics of life saturating the Universe on a cosmic scale, adapting the phase transition model to incorporate probability distributions of expansion and resource consumption strategies. Through a series of numerical solutions spanning several orders of magnitude in the input assumption parameters, the resulting cosmological model is used to address basic questions related to the intergalactic spreading of life, dealing with issues such as timescales, observability, competition between strategies, and first-mover advantage. Finally, we examine physical effects on the Universe itself, such as reheating and the backreaction on the evolution of the scale factor, if such life is able to control and convert a significant fraction of the available pressureless matter into radiation. We conclude that the existence of life, if certain advanced technologies are practical, could have a significant influence on the future large-scale evolution of the Universe.

  2. Numerical Computation of Homogeneous Slope Stability

    PubMed Central

    Xiao, Shuangshuang; Li, Kemin; Ding, Xiaohua; Liu, Tong

    2015-01-01

    To simplify the computational process of homogeneous slope stability, improve computational accuracy, and find multiple potential slip surfaces of a complex geometric slope, this study utilized the limit equilibrium method to derive expression equations of overall and partial factors of safety. This study transformed the solution of the minimum factor of safety (FOS) to solving of a constrained nonlinear programming problem and applied an exhaustive method (EM) and particle swarm optimization algorithm (PSO) to this problem. In simple slope examples, the computational results using an EM and PSO were close to those obtained using other methods. Compared to the EM, the PSO had a small computation error and a significantly shorter computation time. As a result, the PSO could precisely calculate the slope FOS with high efficiency. The example of the multistage slope analysis indicated that this slope had two potential slip surfaces. The factors of safety were 1.1182 and 1.1560, respectively. The differences between these and the minimum FOS (1.0759) were small, but the positions of the slip surfaces were completely different than the critical slip surface (CSS). PMID:25784927
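    The PSO stage of the study above treats the minimum factor of safety as a constrained nonlinear minimization. A bare-bones particle swarm optimizer can be sketched as follows; the function name, parameters, and the sphere test objective are all illustrative stand-ins, since the paper's FOS expression and slip-surface parameterization are not reproduced here.

    ```python
    import numpy as np

    def pso_minimize(f, bounds, n_particles=30, iters=200,
                     w=0.7, c1=1.5, c2=1.5, seed=0):
        """Minimal particle swarm optimization: each particle tracks its personal
        best, and all particles are attracted toward the global best."""
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        x = rng.uniform(lo, hi, (n_particles, len(lo)))
        v = np.zeros_like(x)
        pbest = x.copy()
        pbest_val = np.array([f(p) for p in x])
        g = pbest[pbest_val.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2,) + x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            vals = np.array([f(p) for p in x])
            better = vals < pbest_val
            pbest[better], pbest_val[better] = x[better], vals[better]
            g = pbest[pbest_val.argmin()].copy()
        return g, float(pbest_val.min())

    # Sphere function as a stand-in for the factor-of-safety objective.
    g, val = pso_minimize(lambda p: float((p ** 2).sum()),
                          (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
    ```

    An exhaustive method evaluates the objective on a full grid of candidate slip surfaces, which is why the swarm's sampled search is markedly faster at comparable accuracy.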

  3. Creating With Carbon

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A subsidiary of SI Diamond Technology, Inc., Applied Nanotech, of Austin, Texas, is creating a buzz among various technology firms and venture capital groups interested in the company's progressive research on carbon-related field emission devices, including carbon nanotubes, filaments of pure carbon less than one ten-thousandth the width of a human hair. Since their discovery in 1991, carbon nanotubes have gained considerable attention due to their unique physical properties. For example, a single perfect carbon nanotube can range from 10 to 100 times stronger than steel, per unit weight. Recent studies also indicate that the nanotubes may be the best heat-conducting material in existence. These properties, combined with the ease of growing thin films or nanotubes by a variety of deposition techniques, make the carbon-based material one of the most desirable for cold field emission cathodes.

  4. Creating a TQM culture.

    PubMed

    Lynn, G; Curto, C

    1992-11-01

    Creating a culture and environment for quality improvement is hard work that takes time and commitment. It is often frustrating and painful. For an organization to be successful in this transformation, leadership is not just important, it is vital. The leaders in TQM have new roles to play, roles that go against the grain of many of the forces that led to management success. The tasks of the leaders in a TQM organization emphasize building teamwork and removing barriers that prevent the organization from meeting customer needs. When Jamie Houghton, CEO of Corning, was asked where in his job he found the time to commit to TQM, he replied, "Continuous quality improvement is my job; it is the most important thing I do ... Quality is the primary responsibility of the leader."

  5. Creating Griffith Observatory

    NASA Astrophysics Data System (ADS)

    Cook, Anthony

    2013-01-01

    Griffith Observatory has been the iconic symbol of the sky for southern California since it began its public mission on May 15, 1935. While the Observatory is widely known as being the gift of Col. Griffith J. Griffith (1850-1919), the story of how Griffith's gift became reality involves many of the people better known for other contributions that made the Los Angeles area an important center of astrophysics in the 20th century. Griffith began drawing up his plans for an observatory and science museum for the people of Los Angeles after looking at Saturn through the newly completed 60-inch reflector on Mt. Wilson. He realized the social impact that viewing the heavens could have if made freely available, and discussed the idea of a public observatory with Mt. Wilson Observatory's founder, George Ellery Hale, and Director, Walter Adams. This resulted, in 1916, in a will specifying many of the features of Griffith Observatory, and establishing a committee-managed trust fund to build it. Astronomy popularizer Mars Baumgardt convinced the committee that a Zeiss planetarium projector would be appropriate for Griffith's project after the planetarium was introduced in Germany in 1923. In 1930, the trust committee judged funds to be sufficient to start work on creating Griffith Observatory, and letters from the committee requesting help in realizing the project were sent to Hale, Adams, Robert Millikan, and other area experts then engaged in creating the 200-inch telescope eventually destined for Palomar Mountain. A Scientific Advisory Committee, headed by Millikan, recommended that Caltech physicist Edward Kurth be put in charge of building and exhibit design. Kurth, in turn, sought help from artist Russell Porter. The architecture firm of John C. Austin and Fredrick Ashley was selected to design the project, and they adopted the designs of Porter and Kurth. Philip Fox of the Adler Planetarium was enlisted to manage the completion of the Observatory and become its

  6. Creating the living brand.

    PubMed

    Bendapudi, Neeli; Bendapudi, Venkat

    2005-05-01

    It's easy to conclude from the literature and the lore that top-notch customer service is the province of a few luxury companies and that any retailer outside that rarefied atmosphere is condemned to offer mediocre service at best. But even companies that position themselves for the mass market can provide outstanding customer-employee interactions and profit from them, if they train employees to reflect the brand's core values. The authors studied the convenience store industry in depth and focused on two that have developed a devoted following: QuikTrip (QT) and Wawa. Turnover rates at QT and Wawa are 14% and 22% respectively, much lower than the typical rate in retail. The authors found six principles that both firms embrace to create a strong culture of customer service. Know what you're looking for: A focus on candidates' intrinsic traits allows the companies to hire people who will naturally bring the right qualities to the job. Make the most of talent: In mass-market retail, talent is generally viewed as a commodity, but that outlook becomes a self-fulfilling prophesy. Create pride in the brand: Service quality depends directly on employees' attachment to the brand. Build community: Wawa and QT have made concerted efforts to build customer loyalty through a sense of community. Share the business context: Employees need a clear understanding of how their company operates and how it defines success. Satisfy the soul: To win an employee's passionate engagement, a company must meet his or her needs for security, esteem, and justice. PMID:15929408

  7. Energy cost of creating quantum coherence

    NASA Astrophysics Data System (ADS)

    Misra, Avijit; Singh, Uttam; Bhattacharya, Samyadeb; Pati, Arun Kumar

    2016-05-01

    We consider physical situations where the resource theories of coherence and thermodynamics play competing roles. In particular, we study the creation of quantum coherence using unitary operations with limited thermodynamic resources. We find the maximal coherence that can be created under unitary operations starting from a thermal state and find explicitly the unitary transformation that creates the maximal coherence. Since coherence is created by unitary operations starting from a thermal state, it requires some amount of energy. This motivates us to explore the trade-off between the amount of coherence that can be created and the energy cost of the unitary process. We also find the maximal achievable coherence under the constraint on the available energy. Additionally, we compare the maximal coherence and the maximal total correlation that can be created under unitary transformations with the same available energy at our disposal. We find that when maximal coherence is created with limited energy, the total correlation created in the process is upper bounded by the maximal coherence, and vice versa. For two-qubit systems we show that no unitary transformation exists that creates the maximal coherence and maximal total correlation simultaneously with a limited energy cost.

  8. Detecting pop-out targets in contexts of varying homogeneity: investigating homogeneity coding with event-related brain potentials (ERPs).

    PubMed

    Schubö, Anna; Wykowska, Agnieszka; Müller, Hermann J

    2007-03-23

    Searching for a target among many distracting context elements can be an easy or a demanding task. Duncan and Humphreys (Duncan, J., Humphreys, G.W., 1989. Visual search and stimulus similarity. Psychol. Rev. 96, 433-458) showed that the target itself is not the only factor in the difficulty of target detection: similarity among context elements and dissimilarity between target and context are two further factors affecting search efficiency. Moreover, many studies have shown that search becomes particularly efficient with large set sizes and perfectly homogeneous context elements, presumably due to grouping processes involved in target-context segmentation. The N2p amplitude, in particular, has been found to be modulated by the number of context elements and their homogeneity. The aim of the present study was to investigate the influence of context elements of different heterogeneities on search performance using event-related brain potentials (ERPs). Results showed that contexts with perfectly homogeneous elements were indeed special: they were the most efficient in visual search and elicited a large N2p differential amplitude effect. Increasing context heterogeneity led to a decrease in search performance and a reduction in N2p differential amplitude. Reducing the number of context elements led to a marked performance decrease for random heterogeneous contexts but not for grouped heterogeneous contexts. The behavioral and N2p results delivered evidence (a) in favor of specific processing modes operating on different spatial scales and (b) for the existence of the homogeneity coding postulated by Duncan and Humphreys.

  9. Non-Homogeneous Fractal Hierarchical Weighted Networks

    PubMed Central

    Dong, Yujuan; Dai, Meifeng; Ye, Dandan

    2015-01-01

    A model of fractal hierarchical structures that share the property of non-homogeneous weighted networks is introduced. These networks can be completely and analytically characterized in terms of the involved parameters, i.e., the size of the original graph Nk and the non-homogeneous weight scaling factors r1, r2, · · · rM. We also study the average weighted shortest path (AWSP), the average degree and the average node strength, taking place on the non-homogeneous hierarchical weighted networks. Moreover the AWSP is scrupulously calculated. We show that the AWSP depends on the number of copies and the sum of all non-homogeneous weight scaling factors in the infinite network order limit. PMID:25849619
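    The average weighted shortest path (AWSP) studied in this record can be computed directly on any small weighted graph. The sketch below uses a plain Dijkstra implementation on a toy three-node graph rather than the paper's hierarchical construction; all names are illustrative.

    ```python
    import heapq

    def dijkstra(adj, src):
        """Shortest weighted distances from src over an adjacency dict {u: {v: w}}."""
        dist = {src: 0.0}
        heap = [(0.0, src)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue
            for v, w in adj[u].items():
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        return dist

    def average_weighted_shortest_path(adj):
        """AWSP: mean shortest-path weight over all ordered node pairs."""
        nodes = list(adj)
        total, pairs = 0.0, 0
        for u in nodes:
            dist = dijkstra(adj, u)
            for v in nodes:
                if v != u:
                    total += dist[v]
                    pairs += 1
        return total / pairs

    # Toy weighted triangle: the a-c edge (4.0) is bypassed via b (1.0 + 2.0).
    adj = {"a": {"b": 1.0, "c": 4.0},
           "b": {"a": 1.0, "c": 2.0},
           "c": {"a": 4.0, "b": 2.0}}
    awsp = average_weighted_shortest_path(adj)  # (1 + 3 + 2) / 3 = 2.0
    ```

    On the paper's hierarchical networks this quantity is obtained analytically, with its infinite-order limit controlled by the number of copies and the sum of the weight scaling factors.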

  10. ANALYSIS OF FISH HOMOGENATES FOR PERFLUORINATED COMPOUNDS

    EPA Science Inventory

    Perfluorinated compounds (PFCs) which include PFOS and PFOA are widely distributed in wildlife. Whole fish homogenates were analyzed for PFCs from the upper Mississippi, the Missouri and the Ohio rivers. Methods development, validation data, and preliminary study results will b...

  11. Homogeneous cosmological models in Yang's gravitation theory

    NASA Technical Reports Server (NTRS)

    Fennelly, A. J.; Pavelle, R.

    1979-01-01

    We present a dynamic, spatially homogeneous solution of Yang's pure space gravitational field equations which is non-Einsteinian. The predictions of this cosmological model seem to be at variance with observations.

  12. Producing tritium in a homogenous reactor

    DOEpatents

    Cawley, William E.

    1985-01-01

    A method and apparatus are described for the joint production and separation of tritium. Tritium is produced in an aqueous homogeneous reactor, and heat from the nuclear reaction is used to distill tritium from the lighter isotopes of hydrogen.

  13. Non-homogeneous fractal hierarchical weighted networks.

    PubMed

    Dong, Yujuan; Dai, Meifeng; Ye, Dandan

    2015-01-01

    A model of fractal hierarchical structures that share the property of non-homogeneous weighted networks is introduced. These networks can be completely and analytically characterized in terms of the involved parameters, i.e., the size of the original graph Nk and the non-homogeneous weight scaling factors r1, r2, · · · rM. We also study the average weighted shortest path (AWSP), the average degree and the average node strength, taking place on the non-homogeneous hierarchical weighted networks. Moreover the AWSP is scrupulously calculated. We show that the AWSP depends on the number of copies and the sum of all non-homogeneous weight scaling factors in the infinite network order limit.

  14. Effect of homogenization and pasteurization on the structure and thermal stability of whey protein in milk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The effect of homogenization alone or in combination with high temperature, short time (HTST) pasteurization or UHT processing on the whey fraction of milk was investigated using highly sensitive spectroscopic techniques. In pilot plant trials, 1-L quantities of whole milk were homogenized in a two-...

  15. Homogeneous cosmological models and new inflation

    NASA Technical Reports Server (NTRS)

    Turner, Michael S.; Widrow, Lawrence M.

    1986-01-01

    The promise of the inflationary-universe scenario is to free the present state of the universe from extreme dependence upon initial data. Paradoxically, inflation is usually analyzed in the context of the homogeneous and isotropic Robertson-Walker cosmological models. It is shown that all but a small subset of the homogeneous models undergo inflation. Any initial anisotropy is so strongly damped that if sufficient inflation occurs to solve the flatness and horizon problems, the universe today would still be very isotropic.

  16. Noncommutative complex structures on quantum homogeneous spaces

    NASA Astrophysics Data System (ADS)

    Ó Buachalla, Réamonn

    2016-01-01

    A new framework for noncommutative complex geometry on quantum homogeneous spaces is introduced. The main ingredients used are covariant differential calculi and Takeuchi's categorical equivalence for quantum homogeneous spaces. A number of basic results are established, producing a simple set of necessary and sufficient conditions for noncommutative complex structures to exist. Throughout, the framework is applied to the quantum projective spaces endowed with the Heckenberger-Kolb calculus.

  17. Layout optimization using the homogenization method

    NASA Technical Reports Server (NTRS)

    Suzuki, Katsuyuki; Kikuchi, Noboru

    1993-01-01

    A generalized layout problem involving sizing, shape, and topology optimization is solved by using the homogenization method for three-dimensional linearly elastic shell structures in order to seek a possibility of establishment of an integrated design system of automotive car bodies, as an extension of the previous work by Bendsoe and Kikuchi. A formulation of a three-dimensional homogenized shell, a solution algorithm, and several examples of computing the optimum layout are presented in this first part of the two articles.

  18. Layout optimization using the homogenization method

    NASA Astrophysics Data System (ADS)

    Suzuki, Katsuyuki; Kikuchi, Noboru

    A generalized layout problem involving sizing, shape, and topology optimization is solved by using the homogenization method for three-dimensional linearly elastic shell structures in order to seek a possibility of establishment of an integrated design system of automotive car bodies, as an extension of the previous work by Bendsoe and Kikuchi. A formulation of a three-dimensional homogenized shell, a solution algorithm, and several examples of computing the optimum layout are presented in this first part of the two articles.

  19. Improving homogeneity by dynamic speed limit systems.

    PubMed

    van Nes, Nicole; Brandenburg, Stefan; Twisk, Divera

    2010-05-01

    Homogeneity of driving speeds is an important variable in determining road safety; more homogeneous driving speeds increase road safety. This study investigates the effect of introducing dynamic speed limit systems on homogeneity of driving speeds. A total of 46 subjects twice drove a route along 12 road sections in a driving simulator. The speed limit system (static-dynamic), the sophistication of the dynamic speed limit system (basic roadside, advanced roadside, and advanced in-car) and the situational condition (dangerous-non-dangerous) were varied. The homogeneity of driving speed, the rated credibility of the posted speed limit and the acceptance of the different dynamic speed limit systems were assessed. The results show that the homogeneity of individual speeds, defined as the variation in driving speed for an individual subject along a particular road section, was higher with the dynamic speed limit system than with the static speed limit system. The more sophisticated dynamic speed limit system tested within this study led to higher homogeneity than the less sophisticated systems. The acceptance of the dynamic speed limit systems used in this study was positive, they were perceived as quite useful and rather satisfactory.

  20. Cluster Mechanism of Homogeneous Crystallization (Computer Study)

    NASA Astrophysics Data System (ADS)

    Belashchenko, D. K.

    2008-12-01

    A molecular dynamics (MD) study of the homogeneous crystallization of liquid rubidium is conducted with an inter-particle pair potential. The equilibrium crystallization temperature of the models was 313 K. The models consisted of 500, 998, and 1968 particles in a basic cube. The main investigation method was as follows: to detect (along the MD run) the atoms with Voronoi polyhedra (VP) of 0608 type (“0608-atoms,” as in a bcc crystal) and to detect the bound groups of 0608-atoms (“0608-clusters”) that could play the role of seeds in crystallization. Full crystallization was observed only at temperatures lower than 185 K, with the creation of a predominantly bcc crystal. The crystallization mechanism of the Rb models differs drastically from the mechanism adopted in classical nucleation theory. It consists of the growth of the total number of 0608-atoms on cooling and the formation of 0608-clusters, analogous to the coagulation of solute in a supersaturated two-component solution. At the first stage of the process the clusters have a very loose structure (something like a medusa or an octopus with many tentacles) and include atoms with other Voronoi polyhedron types inside. The dimensions of the clusters quickly increase and approach those of the basic cube. 0608-atoms play the leading role in the crystallization process and activate the transition of the atoms involved into 0608-coordination. The fast growth of the maximum cluster begins after it attains a critical size (about 150 0608-atoms). Fluctuations of cluster sizes are very important in the creation of a 0608-cluster of critical (threshold) size. These fluctuations are especially large in the interval from 180 K to 185 K.

  1. A homogeneous superconducting magnet design using a hybrid optimization algorithm

    NASA Astrophysics Data System (ADS)

    Ni, Zhipeng; Wang, Qiuliang; Liu, Feng; Yan, Luguang

    2013-12-01

    This paper employs a hybrid optimization algorithm with a combination of linear programming (LP) and nonlinear programming (NLP) to design the highly homogeneous superconducting magnets for magnetic resonance imaging (MRI). The whole work is divided into two stages. The first LP stage provides a global optimal current map with several non-zero current clusters, and the mathematical model for the LP was updated by taking into account the maximum axial and radial magnetic field strength limitations. In the second NLP stage, the non-zero current clusters were discretized into practical solenoids. The superconducting conductor consumption was set as the objective function both in the LP and NLP stages to minimize the construction cost. In addition, the peak-peak homogeneity over the volume of imaging (VOI), the scope of 5 Gauss fringe field, and maximum magnetic field strength within superconducting coils were set as constraints. The detailed design process for a dedicated 3.0 T animal MRI scanner was presented. The homogeneous magnet produces a magnetic field quality of 6.0 ppm peak-peak homogeneity over a 16 cm by 18 cm elliptical VOI, and the 5 Gauss fringe field was limited within a 1.5 m by 2.0 m elliptical region.

  2. A modified homogeneous freezing rate parameterization for aqueous solution droplets

    NASA Astrophysics Data System (ADS)

    Moehler, O.; Benz, S.; Hoehler, K.; Wagner, R.

    2012-12-01

    It is still a matter of debate whether cirrus cloud formation is dominated by heterogeneous ice nucleation, leading to low ice crystal number concentrations, or is also influenced by homogeneous freezing of solution aerosols, leading to higher ice crystal number concentrations. Part of the discussion is due to the fact that current models seem to overestimate ice crystal numbers from homogeneous freezing compared to measurements, even though the formation rate of cirrus ice crystals by homogeneous freezing of aqueous particles is believed to be well understood and formulated in terms of, e.g., the concept of effective freezing temperatures or water-activity-dependent ice nucleation rates. A series of recent cirrus cloud simulation experiments at the AIDA cloud chamber facility at the Karlsruhe Institute of Technology, at temperatures between -40°C and -80°C, together with process modeling studies, demonstrated that the freezing formulations tend to show a low bias in the humidity onset thresholds for homogeneous ice formation at temperatures below about 210 K, and furthermore overestimate the ice formation rate by at least a factor of 2. The experimental results will be summarized and a new empirical fit to the experimental data will be suggested for use in atmospheric models.

  3. Homogenization of intergranular fracture towards a transient gradient damage model

    NASA Astrophysics Data System (ADS)

    Sun, G.; Poh, L. H.

    2016-10-01

    This paper focuses on the intergranular fracture of polycrystalline materials, where a detailed model at the meso-scale is translated onto the macro-level through a proposed homogenization theory. The bottom-up strategy involves the introduction of an additional macro-kinematic field to characterize the average displacement jump within the unit cell. Together with the standard macro-strain field, the underlying processes are propagated onto the macro-scale by imposing the equivalence of power and energy at the two scales. The set of macro-governing equations and constitutive relations are next extracted naturally as per standard thermodynamics procedure. The resulting homogenized microforce balance recovers the so-called 'implicit' gradient expression with a transient nonlocal interaction. The homogenized gradient damage model is shown to fully regularize the softening behavior, i.e. the structural response is made mesh-independent, with the damage strain correctly localizing into a macroscopic crack, hence resolving the spurious damage growth observed in many conventional gradient damage models. Furthermore, the predictive capability of the homogenized model is demonstrated by benchmarking its solutions against reference meso-solutions, where a good match is obtained with minimal calibrations, for two different grain sizes.

  4. Creating Heliophysics Concept Maps

    NASA Astrophysics Data System (ADS)

    Ali, N. A.; Peticolas, L. M.; Paglierani, R.; Mendez, B. J.

    2011-12-01

    The Center for Science Education at University of California Berkeley's Space Sciences Laboratory is creating concept maps for Heliophysics and would like to get input from scientists. The purpose of this effort is to identify key concepts related to Heliophysics and map their progression to show how students' understanding of Heliophysics might develop from Kindergarten through higher education. These maps are meant to tie into the AAAS Project 2061 Benchmarks for Scientific Literacy and the National Science Education Standards. It is hoped that the results of this effort will be useful for curriculum designers developing Heliophysics-related curriculum materials and classroom teachers using Heliophysics materials. The need for concept maps was identified as a result of product analysis undertaken by the NASA Heliophysics Forum Team. The NASA Science Education and Public Outreach Forums have as two of their goals to improve the characterization of the contents of the Science Mission Directorate Education and Public Outreach (SMD E/PO) portfolio (Objective 2.1) and to assist SMD in addressing gaps in the portfolio of SMD E/PO products and project activities (Objective 2.2). An important part of this effort is receiving feedback from solar scientists regarding the inclusion of key concepts and their progression in the maps. This session will introduce the draft concept maps and elicit feedback from scientists.

  5. Creating alternatives in science

    PubMed Central

    2009-01-01

    Traditional scientist training at the PhD level does not prepare students to be competitive in biotechnology or other non-academic science careers. Some universities have developed biotechnology-relevant doctoral programmes, but most have not. Forming a life science career club makes a statement to university administrators that it is time to rework the curriculum to include biotechnology-relevant training. A career club can supplement traditional PhD training by introducing students to available career choices, help them develop a personal network and teach the business skills that they will need to be competitive in science outside of academia. This paper is an instructional guide designed to help students create a science career club at their own university. These suggestions are based on the experience gained in establishing such a club for the Graduate School at the University of Colorado Denver. We describe the activities that can be offered, the job descriptions for the offices required and potential challenges. With determination, a creative spirit, and the guidance of this paper, students should be able to greatly increase awareness of science career options, and begin building the skills necessary to become competitive in non-academic science. PMID:20161069

  6. Creating Sample Plans

    1999-03-24

    The program has been designed to increase the accuracy and reduce the preparation time of completing sampling plans. It consists of four files: 1. Analyte/Combination (AnalCombo), a list of analytes and combinations of analytes that can be requested of the onsite and offsite labs. Whenever a specific combination of analytes or suite names appears on the same line as the code number, this indicates that one sample can be placed in one bottle to be analyzed for these parameters. A code number is assigned to each analyte and combination of analytes. 2. Sampling Plans Database (SPDb), a database that contains all of the analytes and combinations of analytes along with the basic information required for preparing a sample plan, including the following fields: matrix, hold time, preservation, sample volume, container size, whether the bottle caps are taped, and acceptable choices. 3. Sampling Plans Create (SPcreate), a file that looks up information from the Sampling Plans Database and from 4. the Job Log File (JLF98), a major database used by Sample Management Services for recording more than 100 fields of information.
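
The lookup described above can be sketched as a pair of keyed tables joined by the analyte-combination code. This is a minimal illustration with hypothetical field names and code numbers, not the actual Sample Management Services database schema:

```python
# Illustrative sketch of the SPcreate lookup described above; all field
# names and code numbers are hypothetical, not taken from the real system.

# 1. Analyte/combination codes (AnalCombo): code number -> requested analytes
anal_combo = {
    101: ["pH"],
    102: ["nitrate", "nitrite"],  # one bottle, two analytes
}

# 2. Sampling Plans Database (SPDb): code number -> bottle/handling info
spdb = {
    101: {"matrix": "water", "hold_time_days": 1, "preservation": "none",
          "container_size_mL": 125, "caps_taped": False},
    102: {"matrix": "water", "hold_time_days": 28, "preservation": "H2SO4",
          "container_size_mL": 250, "caps_taped": True},
}

def create_plan(requested_codes):
    """Build one sampling-plan line per requested code (the SPcreate step)."""
    plan = []
    for code in requested_codes:
        entry = dict(spdb[code])           # basic bottle information
        entry["analytes"] = anal_combo[code]
        entry["code"] = code
        plan.append(entry)
    return plan

plan = create_plan([101, 102])
print(len(plan), plan[1]["analytes"])
```

SPcreate's role then reduces to a keyed join: each requested code pulls its analyte list from AnalCombo and its bottle/handling fields from SPDb.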

  7. Simulator for SUPO, a Benchmark Aqueous Homogeneous Reactor (AHR)

    SciTech Connect

    Klein, Steven Karl; Determan, John C.

    2015-10-14

    A simulator has been developed for SUPO (Super Power) an aqueous homogeneous reactor (AHR) that operated at Los Alamos National Laboratory (LANL) from 1951 to 1974. During that period SUPO accumulated approximately 600,000 kWh of operation. It is considered the benchmark for steady-state operation of an AHR. The SUPO simulator was developed using the process that resulted in a simulator for an accelerator-driven subcritical system, which has been previously reported.

  8. Creating a Desired Future

    ERIC Educational Resources Information Center

    Jenkins-Scott, Jackie

    2008-01-01

    When the author became president of Wheelock College in Boston in 2004, she asked the trustees and the entire campus community to engage in an innovative strategic planning and visioning process. The goal was to achieve consensus on a strategic vision for the future of Wheelock College by the end of her first year. This article discusses how…

  9. Create a Classroom Blog!

    ERIC Educational Resources Information Center

    Brunsell, Eric; Horejsi, Martin

    2010-01-01

    Science education blogs can serve as powerful digital lab notebooks that contain text, images, and videos. Each blog entry documents a moment in time, but becomes interactive with the addition of readers' comments. Blogs can provide a realistic experience of the peer-review process and generate evolving descriptions of observations through time.…

  10. Creating Photomontage Videos

    ERIC Educational Resources Information Center

    Nitzberg, Kevan

    2008-01-01

    Several years ago, the author began exploring the use of digital film and video as an art-making media when he took over instructing the video computer art class at the high school where he teaches. He found numerous ways to integrate a variety of multimedia technologies and software with more traditional types of visual art processes and…

  11. Creating a Children's Village

    ERIC Educational Resources Information Center

    Roberts, Paul

    2012-01-01

    Five years ago the author embarked on an odyssey that would fundamentally change his life as an architect. He and his partner, Dave Deppen, were selected through a very competitive process to design a new Child Development and Family Studies Center in the Sierra Foothills, near Yosemite National Park for Columbia College. The Columbia College…

  12. A criterion for assessing homogeneity distribution in hyperspectral images. Part 2: application of homogeneity indices to solid pharmaceutical dosage forms.

    PubMed

    Rosas, Juan G; Blanco, Marcelo

    2012-11-01

    This article is the second in a two-part series detailing the application of a mixing index to assess homogeneity distribution in oral pharmaceutical solid dosage forms by image analysis. Chemical imaging (CI) is an emerging technique that integrates conventional imaging and spectroscopic techniques to obtain spatial and spectral information from a sample. Near infrared chemical imaging (NIR-CI) has proved an excellent analytical tool for extracting high-quality information from sample surfaces. The primary objective of this second part was to demonstrate that the approach developed in the first part could be successfully applied to near infrared hyperspectral images of oral pharmaceutical solid dosage forms such as coated, uncoated and effervescent tablets, as well as to powder blends. To this end, we assessed a new criterion for establishing mixing homogeneity by using four different methods based on a three-dimensional (M×N×λ) data array of hyperspectral images (spectral standard deviations and correlation coefficients) or a two-dimensional (M×N) data array (concentration maps and binary images). The four methods were used by applying macropixel analysis to the Poole (M(P)) and homogeneity (H%(Poole)) indices. Both indices proved useful for assessing the degree of homogeneity of pharmaceutical samples. The results show that the proposed approach can be effectively used in the pharmaceutical industry, both on finished products (e.g., tablets) and in mixing unit operations, for example as a process analytical technology tool for blending monitoring (see part 1).
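
The macropixel idea behind the indices above can be illustrated with a toy computation. This is a simplified stand-in for the published Poole (M(P)) and H%(Poole) formulas, comparing macropixel means of a concentration map against the global mean:

```python
import numpy as np

# Simplified macropixel homogeneity sketch: the published Poole (M_P) and
# H% indices differ in detail; this only illustrates the idea of scoring a
# blend by the relative spread of macropixel means in a concentration map.
def homogeneity_percent(conc_map, macro=4):
    m, n = conc_map.shape
    means = [conc_map[i:i + macro, j:j + macro].mean()
             for i in range(0, m, macro)
             for j in range(0, n, macro)]
    rsd = np.std(means) / np.mean(means)   # relative spread of macropixels
    return 100.0 * (1.0 - rsd)

rng = np.random.default_rng(0)
uniform = rng.normal(10.0, 0.1, (64, 64))   # well-mixed blend
segregated = np.ones((64, 64))              # unmixed blend: two halves
segregated[:, :32] = 19.0
print(homogeneity_percent(uniform) > homogeneity_percent(segregated))
```

A perfectly uniform map scores 100%; segregation drags the score down as macropixel means diverge from the global mean.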

  13. A criterion for assessing homogeneity distribution in hyperspectral images. Part 2: application of homogeneity indices to solid pharmaceutical dosage forms.

    PubMed

    Rosas, Juan G; Blanco, Marcelo

    2012-11-01

    This article is the second in a two-part series detailing the application of a mixing index to assess homogeneity distribution in oral pharmaceutical solid dosage forms by image analysis. Chemical imaging (CI) is an emerging technique that integrates conventional imaging and spectroscopic techniques to obtain spatial and spectral information from a sample. Near infrared chemical imaging (NIR-CI) has proved an excellent analytical tool for extracting high-quality information from sample surfaces. The primary objective of this second part was to demonstrate that the approach developed in the first part could be successfully applied to near infrared hyperspectral images of oral pharmaceutical solid dosage forms such as coated, uncoated and effervescent tablets, as well as to powder blends. To this end, we assessed a new criterion for establishing mixing homogeneity by using four different methods based on a three-dimensional (M×N×λ) data array of hyperspectral images (spectral standard deviations and correlation coefficients) or a two-dimensional (M×N) data array (concentration maps and binary images). The four methods were used by applying macropixel analysis to the Poole (M(P)) and homogeneity (H%(Poole)) indices. Both indices proved useful for assessing the degree of homogeneity of pharmaceutical samples. The results show that the proposed approach can be effectively used in the pharmaceutical industry, both on finished products (e.g., tablets) and in mixing unit operations, for example as a process analytical technology tool for blending monitoring (see part 1). PMID:22840977

  14. Preparation and characterization of paclitaxel nanosuspension using novel emulsification method by combining high speed homogenizer and high pressure homogenization.

    PubMed

    Li, Yong; Zhao, Xiuhua; Zu, Yuangang; Zhang, Yin

    2015-07-25

    The aim of this study was to develop an alternative, more bioavailable, better tolerated paclitaxel nanosuspension (PTXNS) for intravenous injection in comparison with the commercially available Taxol(®) formulation. In this study, PTXNS was prepared by an emulsification method combining a high speed homogenizer with high pressure homogenization, followed by a lyophilization process for intravenous administration. The main production parameters, including the volume ratio of organic phase to water plus organic phase (Vo:Vw+o), concentration of PTX, content of PTX, emulsification time (Et), homogenization pressure (HP) and number of passes (Ps) for high pressure homogenization, were optimized, and their effects on the mean particle size (MPS) and particle size distribution (PSD) of PTXNS were investigated. The characteristics of PTXNS, such as surface morphology, physical status of paclitaxel (PTX) in PTXNS, redispersibility of PTXNS in purified water, in vitro dissolution and in vivo bioavailability, were all investigated. The PTXNS obtained under optimum conditions had an MPS of 186.8 nm and a zeta potential (ZP) of -6.87 mV. The PTX content in PTXNS was approximately 3.42%. Moreover, the residual amount of chloroform was below the International Conference on Harmonization limit (60 ppm) for solvents. The dissolution study indicated that PTXNS dissolved faster than raw PTX and showed a sustained-dissolution character compared with the Taxol(®) formulation. Moreover, the bioavailability of PTXNS increased 14.38-fold and 3.51-fold compared with raw PTX and the Taxol(®) formulation, respectively.

  15. Identifying homogenous subgroups for individual patient meta-analysis based on Rough Set Theory.

    PubMed

    Gil-Herrera, Eleazar; Tsalatsanis, Athanasios; Kumar, Ambuj; Mhaskar, Rahul; Miladinovic, Branko; Yalcin, Ali; Djulbegovic, Benjamin

    2014-01-01

    Failure to detect and manage heterogeneity between clinical trials included in meta-analysis may lead to misinterpretation of summary effect estimates. This may ultimately compromise the validity of the results of the meta-analysis. Typically, when heterogeneity between trials is detected, researchers use sensitivity or subgroup analysis to manage it. However, both methods fail to explain why heterogeneity existed in the first place. Here we propose a novel methodology that relies on Rough Set Theory (RST) to detect, explain, and manage the sources of heterogeneity applicable to meta-analysis performed on individual patient data (IPD). The method exploits the RST relations of discernibility and indiscernibility to create homogeneous groups of patients. We applied our methodology on a dataset of 1,111 patients enrolled in 9 randomized controlled trials studying the effect of two transplantation procedures in the management of hematologic malignancies. Our method was able to create three subgroups of patients with remarkably low statistical heterogeneity values (16.8%, 0% and 0% respectively). The proposed methodology has the potential to automate and standardize the process of detecting and managing heterogeneity in IPD meta-analysis. Future work involves investigating the applications of the proposed methodology in analyzing treatment effects in patients belonging to different risk groups, which will ultimately assist in personalized healthcare decision making.
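
The subgroup heterogeneity percentages quoted above are I²-type statistics. As an illustration of how such values are computed from trial-level data, here are the standard Cochran's Q / I² formulas under a fixed-effect model; the effect estimates and standard errors below are invented:

```python
# Cochran's Q and the I^2 statistic used to quantify between-trial
# heterogeneity; the effect estimates and standard errors are made up.
def i_squared(effects, ses):
    w = [1.0 / se**2 for se in ses]                  # inverse-variance weights
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled)**2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    return max(0.0, 100.0 * (q - df) / q) if q > 0 else 0.0

homogeneous_trials = i_squared([0.50, 0.52, 0.49, 0.51], [0.10, 0.12, 0.11, 0.10])
heterogeneous_trials = i_squared([0.10, 0.80, 0.15, 0.90], [0.10, 0.12, 0.11, 0.10])
print(homogeneous_trials, heterogeneous_trials)
```

Consistent trials yield I² near 0%, as in the subgroups reported above; widely scattered trial effects push I² toward 100%.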

  16. ISO 55000: Creating an asset management system.

    PubMed

    Bradley, Chris; Main, Kevin

    2015-02-01

    In the October 2014 issue of HEJ, Keith Hamer, group vice-president, Asset Management & Engineering, at Sodexo, and Kevin Main, marketing director at Asset Wisdom, argued that the new ISO 55000 standards present facilities managers with an opportunity to create 'a joined-up, whole lifecycle approach' to managing and delivering value from assets. In this article, Kevin Main and Chris Bradley, who runs various asset management projects, examine the process of creating an asset management system.

  17. Creating Math Videos: Comparing Platforms and Software

    ERIC Educational Resources Information Center

    Abbasian, Reza O.; Sieben, John T.

    2016-01-01

    In this paper we present a short tutorial on creating mini-videos using two platforms--PCs and tablets such as iPads--and software packages that work with these devices. Specifically, we describe the step-by-step process of creating and editing videos using a Wacom Intuos pen-tablet plus Camtasia software on a PC platform and using the software…

  18. Analysis of homogeneous/non-homogeneous nanofluid models accounting for nanofluid-surface interactions

    NASA Astrophysics Data System (ADS)

    Ahmad, R.

    2016-07-01

    This article reports an unbiased analysis of water-based rod-shaped alumina nanoparticles, considering both the homogeneous and non-homogeneous nanofluid models over the coupled nanofluid-surface interface. The mechanics of the surface are found for both the homogeneous and non-homogeneous models, which were ignored in previous studies. The viscosity and thermal conductivity data are taken from the international nanofluid property benchmark exercise, and all simulations use experimentally verified results. By considering the homogeneous and non-homogeneous models, the precise movement of the alumina nanoparticles over the surface has been observed by solving the corresponding system of differential equations. For the non-homogeneous model, a uniform temperature and nanofluid volume fraction are assumed at the surface, and the flux of the alumina nanoparticles is taken as zero. The assumption of zero nanoparticle flux at the surface makes the non-homogeneous model physically more realistic. The differences between all profiles for the homogeneous and non-homogeneous models are insignificant, owing to small deviations in the values of the Brownian motion and thermophoresis parameters.

  19. Homogenization in compiling ICRF combined catalogs

    NASA Astrophysics Data System (ADS)

    Marco, F. J.; Martínez, M. J.; López, J. A.

    2013-10-01

    Context. The International Astronomical Union (IAU) recommendations regarding the International Celestial Reference Frame (ICRF) realizations require the construction of radio source catalogs obtained using very-long-baseline interferometry (VLBI) methods. The improvement of these catalogs is a necessary procedure for the further densification of the ICRF over the celestial sphere. Aims: The differing positions obtained from several catalogs for sources common to the ICRF make it necessary to critically revise the different methods employed in improving the ICRF from several radio source catalogs. In this sense, a revision of the analytical and the statistical methods is necessary in line with their advantages and disadvantages. We have a double goal: first, we propose an adequate treatment of the residuals of several catalogs to obtain a homogeneous catalog; second, we attempt to discern whether a combined catalog is homogeneous. Methods: We define homogeneity as applied to our problem in a dual sense: the first deals with the spatial distribution of the data over the celestial sphere; the second has a statistical meaning, as we consider that homogeneity exists when the residual between a given catalog and the ICRF behaves as a unimodal pure Gaussian. We use a nonparametric method, which enables us to homogeneously extend the statistical properties of the residuals over the entire sphere. This intermediate adjustment allows for subsequent computation of the coefficients for any parametric adjustment model with higher accuracy and greater stability, and it prevents problems related to direct adjustments using the models. On the other hand, the homogeneity of the residuals in a catalog is tested using different weights. Our procedure also serves to propose the most suitable weights to maintain homogeneity in the final results. We perform a test using the ICRF-Ext2, JPL, and USNO quasar catalogs. Results: We show that a combination of catalogs can only

  20. Climate Data Homogenization Using Edge Detection Algorithms

    NASA Astrophysics Data System (ADS)

    Hammann, A. C.; Rennermalm, A. K.

    2015-12-01

    The problem of climate data homogenization has predominantly been addressed by testing the likelihood of one or more breaks inserted into a given time series and modeling the mean to be stationary in between the breaks. We recast the same problem in a slightly different form: that of detecting step-like changes in noisy data, and observe that this problem has spawned a large number of approaches to its solution as the "edge detection" problem in image processing. With respect to climate data, we ask the question: How can we optimally separate step-like from smoothly-varying low-frequency signals? We study the hypothesis that the edge-detection approach makes better use of all information contained in the time series than the "traditional" approach (e.g. Caussinus and Mestre, 2004), which we base on several observations. 1) The traditional formulation of the problem reduces the available information from the outset to that contained in the test statistic. 2) The criterion of local steepness of the low-frequency variability, while at least hypothetically useful, is ignored. 3) The practice of using monthly data corresponds, mathematically, to applying a moving average filter (to reduce noise) and subsequent subsampling of the result; this subsampling reduces the amount of available information beyond what is necessary for noise reduction. Most importantly, the tradeoff between noise reduction (better with filters with wide support in the time domain) and localization of detected changes (better with filters with narrow support) is expressed in the well-known uncertainty principle and can be addressed optimally within a time-frequency framework. Unsurprisingly, a large number of edge-detection algorithms have been proposed that make use of wavelet decompositions and similar techniques. We are developing this framework in part to be applied to a particular set of climate data from Greenland; we will present results from this application as well as from tests with
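
The core edge-detection idea, locating a step-like change in a noisy series, can be sketched with a Haar-like sliding filter. This is only a minimal illustration, not the wavelet/time-frequency machinery the abstract proposes:

```python
import numpy as np

# Minimal edge-detection sketch for a break in a noisy series: slide a
# Haar-like step filter (mean of the right window minus mean of the left
# window) along the series and take the location where its magnitude peaks.
def detect_step(x, half=20):
    scores = np.zeros(len(x))
    for t in range(half, len(x) - half):
        scores[t] = x[t:t + half].mean() - x[t - half:t].mean()
    return int(np.argmax(np.abs(scores)))

rng = np.random.default_rng(1)
series = rng.normal(0.0, 0.3, 400)
series[250:] += 2.0            # synthetic inhomogeneity (a station move, say)
print(detect_step(series))
```

The window half-width controls the noise-reduction/localization tradeoff the abstract discusses: wider windows average away more noise but blur the break's position.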

  1. Creating a Toilet Training Plan

    MedlinePlus

    These are the tools you will need to create your own toilet-training plan and implement it at the best time for ...

  2. Rapid biotic homogenization of marine fish assemblages.

    PubMed

    Magurran, Anne E; Dornelas, Maria; Moyes, Faye; Gotelli, Nicholas J; McGill, Brian

    2015-01-01

    The role human activities play in reshaping biodiversity is increasingly apparent in terrestrial ecosystems. However, the responses of entire marine assemblages are not well-understood, in part, because few monitoring programs incorporate both spatial and temporal replication. Here, we analyse an exceptionally comprehensive 29-year time series of North Atlantic groundfish assemblages monitored over 5° latitude to the west of Scotland. These fish assemblages show no systematic change in species richness through time, but steady change in species composition, leading to an increase in spatial homogenization: the species identity of colder northern localities increasingly resembles that of warmer southern localities. This biotic homogenization mirrors the spatial pattern of unevenly rising ocean temperatures over the same time period suggesting that climate change is primarily responsible for the spatial homogenization we observe. In this and other ecosystems, apparent constancy in species richness may mask major changes in species composition driven by anthropogenic change. PMID:26400102

  3. Heterogeneous and homogeneous robot group behavior

    SciTech Connect

    Goldberg, D.

    1996-12-31

    When working with groups of robots it may be very difficult to determine what characteristics the group requires in order to perform a task most efficiently, i.e., in the least time. Some researchers have used groups of behaviorally differentiated robots, where the robots do not perform the same actions, and others have used behaviorally homogeneous groups. None of this research, however, explicitly compares the behavior of heterogeneous and homogeneous groups of robots to determine which performs a task more efficiently. The research described here makes such a comparison and aims at developing guidelines to aid in the design of the heterogeneous/homogeneous characteristics that will allow a group of robots to perform a task efficiently.

  4. Method of Mapping Anomalies in Homogenous Material

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E. (Inventor); Taylor, Bryant D. (Inventor)

    2016-01-01

    An electrical conductor and antenna are positioned in a fixed relationship to one another. Relative lateral movement is generated between the electrical conductor and a homogenous material while maintaining the electrical conductor at a fixed distance from the homogenous material. The antenna supplies a time-varying magnetic field that causes the electrical conductor to resonate and generate harmonic electric and magnetic field responses. Disruptions in at least one of the electric and magnetic field responses during this lateral movement are indicative of a lateral location of a subsurface anomaly. Next, relative out-of-plane movement is generated between the electrical conductor and the homogenous material in the vicinity of the anomaly's lateral location. Disruptions in at least one of the electric and magnetic field responses during this out-of-plane movement are indicative of a depth location of the subsurface anomaly. A recording of the disruptions provides a mapping of the anomaly.

  5. Commensurability effects in holographic homogeneous lattices

    NASA Astrophysics Data System (ADS)

    Andrade, Tomas; Krikun, Alexander

    2016-05-01

    An interesting application of the gauge/gravity duality to condensed matter physics is the description of a lattice via breaking translational invariance on the gravity side. By making use of global symmetries, it is possible to do so without sacrificing homogeneity of the pertinent bulk solutions, which we thus term "homogeneous holographic lattices." Due to their technical simplicity, these configurations have received a great deal of attention in the last few years and have been shown to correctly describe momentum relaxation and hence (finite) DC conductivities.

  6. Contribution of the live-vertebrate trade toward taxonomic homogenization.

    PubMed

    Romagosa, Christina M; Guyer, Craig; Wooten, Michael C

    2009-08-01

    The process of taxonomic homogenization occurs through two mechanisms, extinctions and introductions, and leads to a reduction of global biodiversity. We used available U.S. trade data as a proxy for global trade in live vertebrates to assess the contribution of trade to the process of taxonomic homogenization. Data included all available U.S. importation and exportation records, estimation of extinction risk, and reports of establishment outside the native range for species within six vertebrate groups. Based on Monte Carlo sampling, the number of species traded, established outside of the native range, and threatened with extinction was not randomly distributed among vertebrate families. Twenty-eight percent of vertebrate families that were traded preferentially were also established or threatened with extinction, an unusually high percentage compared with the 7% of families that were not traded preferentially but that became established or threatened with extinction. The importance of trade in homogenization of vertebrates suggests that additional efforts should be made to prevent introductions and extinctions through this medium.
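
A Monte Carlo randomization test of the kind described above can be sketched as follows; the families, trade status, and establishment flags are all invented for illustration:

```python
import random

# Illustrative Monte Carlo test in the spirit of the abstract: are
# established species concentrated in heavily traded families more often
# than random assignment would produce? All counts here are invented.
random.seed(42)

traded = {"A", "B"}                       # families traded preferentially
# species -> family; establishment deliberately clusters in traded families
species_family = ["A"] * 8 + ["B"] * 6 + ["C"] * 2 + ["D"] * 1 + ["E"] * 1
established = [True] * 10 + [False] * 8   # first 10 species are established

def count_in_traded(fams, flags):
    return sum(1 for f, e in zip(fams, flags) if e and f in traded)

observed = count_in_traded(species_family, established)
null = []
for _ in range(5000):                     # shuffle the establishment labels
    shuffled = established[:]
    random.shuffle(shuffled)
    null.append(count_in_traded(species_family, shuffled))
p_value = sum(n >= observed for n in null) / len(null)
print(observed, p_value)
```

A small p-value indicates that establishment is non-randomly distributed with respect to traded families, the pattern the abstract reports for real trade data.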

  7. Homogeneous and heterogeneous reactions of phenanthrene with ozone

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Yang, Bo; Meng, Junwang; Gao, Shaokai; Dong, Xinyu; Shu, Jinian

    2010-02-01

    The reactions of gas-phase phenanthrene and suspended phenanthrene particles with ozone were conducted in a 200 L chamber. Secondary organic aerosol formation was observed in the reaction of gas-phase phenanthrene with ozone, and the size distribution of the secondary organic aerosol was monitored simultaneously with a scanning mobility particle sizer during the formation process. The particulate ozonation products from both reactions were analyzed with a vacuum ultraviolet photoionization aerosol time-of-flight mass spectrometer. 2,2'-Diformylbiphenyl was identified as the dominant product in both homogeneous and heterogeneous reactions of phenanthrene with ozone. GC/MS analysis of ozonation products of phenanthrene in glacial acetic acid was carried out for assigning time-of-flight mass spectra of reaction products formed in the homogeneous and heterogeneous reactions of phenanthrene with ozone.

  8. Improved flexibility with grayscale fabrication of calcium fluoride homogenizers

    NASA Astrophysics Data System (ADS)

    Brown, Jeremiah; Brakhage, Peter; Simmons, Lamarr; Mueller, Ralf

    2012-03-01

    High quality and highly uniform illumination is a critical component for advanced lithography systems and wafer inspection tools. Homogenizer elements fabricated in calcium fluoride have demonstrated good performance for deep UV applications. Grayscale photolithography allows for the fabrication of single-sided micro lens array (MLA) elements with excellent optical performance. The MLA offers some significant advantages over crossed cylinders fabricated using grayscale photolithography processes, including the reduction in the number of fabrication steps and the added flexibility of manufacturing noncylindrical surface geometries. This research presentation reviews the fabrication process and compares grayscale crossed cylindrical arrays and MLAs in terms of their capabilities and performance.

  9. Homogeneously catalyzed synthesis gas transformations to oxygenate fuels

    SciTech Connect

    Mahajan, D.; Mattas, L.; Sanchez, J.

    1992-04-01

    At Brookhaven National Laboratory (BNL), the ongoing oxygenates synthesis program is addressing the catalytic synthesis gas conversion to liquid fuels and fuel additives. The major thrust of this effort is to enhance carbon conversion, reaction rates, product selectivity and overall process efficiency. To this effect, a series of liquid phase homogeneous catalysts have been developed and successfully utilized in the synthesis of methanol and other oxygenates. This paper identifies advantages and uncertainties associated with these newly developed catalysts. The effect of system parameters on the overall process scheme is discussed.

  10. Homogeneous Immunoassays: Historical Perspective and Future Promise

    NASA Astrophysics Data System (ADS)

    Ullman, Edwin F.

    1999-06-01

    The founding and growth of Syva Company is examined in the context of its leadership role in the development of homogeneous immunoassays. The simple mix and read protocols of these methods offer advantages in routine analytical and clinical applications. Early homogeneous methods were based on insensitive detection of immunoprecipitation during antigen/antibody binding. The advent of reporter groups in biology provided a means of quantitating immunochemical binding by labeling antibody or antigen and physically separating label incorporated into immune complexes from free label. Although high sensitivity was achieved, quantitative separations were experimentally demanding. Only when it became apparent that reporter groups could provide information, not only about the location of a molecule but also about its microscopic environment, was it possible to design practical non-separation methods. The evolution of early homogeneous immunoassays was driven largely by the development of improved detection strategies. The first commercial spin immunoassays, developed by Syva for drug abuse testing during the Vietnam war, were followed by increasingly powerful methods such as immunochemical modulation of enzyme activity, fluorescence, and photo-induced chemiluminescence. Homogeneous methods that quantify analytes at femtomolar concentrations within a few minutes now offer important new opportunities in clinical diagnostics, nucleic acid detection and drug discovery.

  11. Spatial Homogeneity and Redshift--Distance Laws

    NASA Astrophysics Data System (ADS)

    Nicoll, J. F.; Segal, I. E.

    1982-06-01

    Spatial homogeneity in the radial direction of low-redshift galaxies is subjected to Kafka-Schmidt V/Vm tests using well-documented samples. Homogeneity is consistent with the assumption of the Lundmark (quadratic redshift-distance) law, but large deviations from homogeneity are implied by the assumption of the Hubble (linear redshift-distance) law. These deviations are similar to what would be expected on the basis of the Lundmark law. Luminosity functions are obtained for each law by a nonparametric statistically optimal method that removes the observational cutoff bias in complete samples. Although the Hubble law correlation of absolute magnitude with redshift is reduced considerably by elimination of the bias, computer simulations show that its bias-free value is nevertheless at a statistically quite significant level, indicating the self-inconsistency of the law. The corresponding Lundmark law correlations are quite satisfactory statistically. The regression of redshift on magnitude also involves radial spatial homogeneity and, according to R. Soneira, has slope determining the redshift-magnitude exponent independently of the luminosity function. We have, however, rigorously proved the material dependence of the regression on this function and here exemplify our treatment by using the bias-free functions indicated, with results consistent with the foregoing argument.
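
The V/Vm idea can be illustrated numerically: for a flux-limited sample, each object's V/Vmax is the ratio of the volume enclosed by its distance to the maximum volume within which it would remain detectable, and spatial homogeneity implies V/Vmax is uniform on [0, 1] with mean 0.5. A toy simulation with synthetic distances, not the catalog data used in the abstract:

```python
import random

# <V/Vmax> homogeneity test (Schmidt-style) on synthetic populations.
random.seed(7)
n = 20000

# Homogeneous population in a sphere of radius rmax = dmax = 1: drawing
# r ~ U^(1/3) gives constant number density, and V/Vmax = (r/dmax)^3.
homog = [(random.random() ** (1.0 / 3.0)) ** 3 for _ in range(n)]

# Centrally concentrated (inhomogeneous) population: r ~ U gives density
# falling as r^-2, which drags <V/Vmax> below 0.5.
clustered = [random.random() ** 3 for _ in range(n)]

mean_homog = sum(homog) / n
mean_clustered = sum(clustered) / n
print(round(mean_homog, 2), round(mean_clustered, 2))   # near 0.5 vs below 0.5
```

Departures of the sample mean from 0.5 in either direction are the "deviations from homogeneity" the abstract weighs against the two redshift-distance laws.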

  12. RELIABLE COMPUTATION OF HOMOGENEOUS AZEOTROPES. (R824731)

    EPA Science Inventory

    Abstract

    It is important to determine the existence and composition of homogeneous azeotropes in the analysis of phase behavior and in the synthesis and design of separation systems, from both theoretical and practical standpoints. A new method for reliably locating an...

  13. General Theorems about Homogeneous Ellipsoidal Inclusions

    ERIC Educational Resources Information Center

    Korringa, J.; And Others

    1978-01-01

    Mathematical theorems about the properties of ellipsoids are developed. Included are Poisson's theorem concerning the magnetization of a homogeneous body of ellipsoidal shape, the polarization of a dielectric, the transport of heat or electricity through an ellipsoid, and other problems. (BB)

  14. Extension theorems for homogenization on lattice structures

    NASA Technical Reports Server (NTRS)

    Miller, Robert E.

    1992-01-01

    When applying homogenization techniques to problems involving lattice structures, it is necessary to extend certain functions defined on a perforated domain to a simply connected domain. This paper provides general extension operators which preserve bounds on derivatives of order l. Only the special case of honeycomb structures is considered.

  15. Homogeneity of Latvian temperature and precipitation series

    NASA Astrophysics Data System (ADS)

    Lizuma, L.; Briede, A.

    2010-09-01

    During previous years and decades the homogenization of Latvian monthly temperature and precipitation data series was based on direct homogenization methods that relied on metadata and studies of the effects of specific changes in time of observation as well as methods of observation. However, this approach is not effective for detecting shifts in temperature and precipitation data series caused by station relocation or environmental changes. Both the climatological temperature and precipitation records are significantly affected by a number of non-climatological factors (station moves; changes in instrumentation; introduction of different observing practices, such as a different observing time or the introduction of wetting corrections for precipitation; changes in the local urban environment). If these non-homogeneities are not properly accounted for, the data are unrepresentative for analyses of climate state, variations and changes. Monthly and daily Latvian station series (1950-2008) of surface air temperature and precipitation are statistically tested with respect to homogeneity. Two homogeneity tests are applied to evaluate the monthly series. The Multiple Analysis of Series for Homogenization (MASH v3.02) has been applied to 23 Latvian mean, maximum and minimum daily and monthly temperature series and to daily and monthly precipitation series. The standard normal homogeneity test (SNHT) has been applied to monthly mean temperature and precipitation series. During the tested period the station network is dense enough for efficient homogeneity testing. It has been found that all the time series contain homogeneity breaks in at least one month; for some stations multiple breaks were found. For mean temperature time series, 80% of the breaks are generally less than ±0.2 °C. The largest detected homogeneity breaks are up to ±1.0 °C in the mean monthly temperatures, up to ±1.3 °C in mean monthly maximum temperature and for mean
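
The SNHT mentioned above has a compact form: standardize the series, then for each candidate break point k compare the mean standardized anomaly before and after via T(k) = k·z̄₁² + (n−k)·z̄₂², taking the maximum over k. A minimal sketch on synthetic monthly temperatures; the break size and noise level are invented:

```python
import math
import random

# Standard normal homogeneity test (Alexandersson-style) sketch: a large
# max T(k) suggests a break at position k. The series below is synthetic.
def snht(series):
    n = len(series)
    mean = sum(series) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    z = [(x - mean) / sd for x in series]
    best_t, best_k = 0.0, 0
    for k in range(1, n):
        z1 = sum(z[:k]) / k              # mean anomaly before the break
        z2 = sum(z[k:]) / (n - k)        # mean anomaly after the break
        t = k * z1 * z1 + (n - k) * z2 * z2
        if t > best_t:
            best_t, best_k = t, k
    return best_t, best_k

random.seed(3)
temps = [random.gauss(5.0, 0.5) for _ in range(60)]
temps = [x + (0.8 if i >= 40 else 0.0) for i, x in enumerate(temps)]  # break
t_stat, k = snht(temps)
print(round(t_stat, 1), k)
```

In practice the statistic is compared against tabulated critical values, and candidate series are usually tested relative to neighboring reference stations rather than in isolation.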

  16. Confocal detection of planar homogeneous and heterogeneous immunosorbent assays

    NASA Astrophysics Data System (ADS)

    Ghafari, Homanaz; Zhou, Yanzhou; Ali, Selman; Hanley, Quentin S.

    2009-11-01

    Optically sectioned detection of fluorescence immunoassays using a confocal microscope enables the creation of both homo- and heterogeneous planar format assays. We report a set of assays requiring optically sectioned detection using a model system and analysis procedures for separating signals of a surface layer from an overlying solution. A model sandwich assay with human immunoglobulin G as the target antigen is created on a glass substrate. The prepared surfaces are exposed to antigen and a FITC-labeled secondary antibody. The resulting preparations are either read directly to provide a homogeneous assay or after wash steps, giving a heterogeneous assay. The simplicity of the object shapes arising from the planar format makes the decomposition of analyte signals from the thin film bound to the surface and overlayer straightforward. Measured response functions of the thin film and overlayer fit well to the Cauchy-Lorentz and cumulative Cauchy-Lorentz functions, respectively, enabling the film and overlayer to be separated. Under the conditions used, the detection limits for the homogeneous and heterogeneous forms of the assay are 2.2 and 5.5 ng/ml, respectively. Planar format, confocally read fluorescence assays enable wash-free detection of antigens and should be applicable to a wide range of assays involving surface-bound species.
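
The decomposition described above can be imitated numerically: model the axial response as a Lorentzian (Cauchy) peak for the surface film plus a cumulative-Lorentzian step for the overlying solution, and fit both amplitudes at once. A hedged sketch (the widths, amplitudes and noise level are invented for illustration; `profile` is a hypothetical helper, not the authors' code):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import cauchy

def profile(z, a_film, a_over, z0, gamma):
    """Axial response: Lorentzian film peak plus cumulative-Lorentzian overlayer."""
    film = a_film * np.pi * gamma * cauchy.pdf(z, loc=z0, scale=gamma)  # peak height a_film
    over = a_over * cauchy.cdf(z, loc=z0, scale=gamma)                  # solution step
    return film + over

# Synthetic axial scan with the surface at z0 = 0 (all values invented)
z = np.linspace(-10.0, 10.0, 201)
truth = (5.0, 2.0, 0.0, 1.5)              # film amp, overlayer amp, surface, width
signal = profile(z, *truth) + np.random.default_rng(1).normal(0.0, 0.05, z.size)

popt, _ = curve_fit(profile, z, signal, p0=(1.0, 1.0, 0.5, 1.0),
                    bounds=([0.0, 0.0, -5.0, 0.01], [np.inf, np.inf, 5.0, 10.0]))
print(np.round(popt, 2))                  # recovered (a_film, a_over, z0, gamma)
```

Because the two response shapes differ, the fit separates the surface-bound signal (`a_film`) from the free-solution background (`a_over`) without any wash step, which is the basis of the homogeneous readout.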

  17. Confocal detection of planar homogeneous and heterogeneous immunosorbent assays.

    PubMed

    Ghafari, Homanaz; Zhou, Yanzhou; Ali, Selman; Hanley, Quentin S

    2009-01-01

    Optically sectioned detection of fluorescence immunoassays using a confocal microscope enables the creation of both homo- and heterogeneous planar format assays. We report a set of assays requiring optically sectioned detection using a model system and analysis procedures for separating signals of a surface layer from an overlying solution. A model sandwich assay with human immunoglobulin G as the target antigen is created on a glass substrate. The prepared surfaces are exposed to antigen and a FITC-labeled secondary antibody. The resulting preparations are either read directly to provide a homogeneous assay or after wash steps, giving a heterogeneous assay. The simplicity of the object shapes arising from the planar format makes the decomposition of analyte signals from the thin film bound to the surface and overlayer straightforward. Measured response functions of the thin film and overlayer fit well to the Cauchy-Lorentz and cumulative Cauchy-Lorentz functions, respectively, enabling the film and overlayer to be separated. Under the conditions used, the detection limits for the homogeneous and heterogeneous forms of the assay are 2.2 and 5.5 ng/ml, respectively. Planar format, confocally read fluorescence assays enable wash-free detection of antigens and should be applicable to a wide range of assays involving surface-bound species.

  18. Are geological media homogeneous or heterogeneous for neutron investigations?

    PubMed

    Woźnicka, U; Drozdowicz, K; Gabańska, B; Krynicka, E; Igielski, A

    2003-01-01

    The thermal neutron absorption cross section of a heterogeneous material is lower than that of the corresponding homogeneous one containing the same components. When rock materials are investigated, the sample usually contains grains which create heterogeneity. The heterogeneity effect depends on the mass contributions of the strongly and weakly absorbing centers, on the ratio of their absorption cross sections, and on their sizes. The influence of the granulation of silicon and diabase samples on the absorption cross section measured with Czubek's method has been investigated experimentally. A 20% underestimation of the absorption cross section has been observed for diabase grains with sizes from 6.3 to 12.8 mm. PMID:12485675
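
In the homogeneous limit, the macroscopic absorption cross section of a mixture is simply the mass-weighted sum of its components, which is the baseline the heterogeneity effect is measured against. A toy sketch (the two-component composition and the flat 20% grain correction are assumptions for illustration, echoing the underestimation reported above):

```python
def macroscopic_sigma_a(components):
    """Homogeneous-limit mass absorption cross section of a mixture (cm^2/g)."""
    assert abs(sum(w for w, _ in components) - 1.0) < 1e-9   # fractions sum to 1
    return sum(w * s for w, s in components)

# Assumed two-component rock: weakly absorbing matrix plus absorbing grains
mix = [(0.9, 0.004), (0.1, 0.060)]        # (mass fraction, cross section cm^2/g)
sigma_hom = macroscopic_sigma_a(mix)
# Assumed flat 20% heterogeneity underestimate, the order reported for
# 6.3-12.8 mm diabase grains in the abstract above
sigma_het = 0.8 * sigma_hom
print(sigma_hom, sigma_het)
```

In reality the deficit grows with grain size and with the contrast between the centers' cross sections, because neutrons are self-shielded inside the strongly absorbing grains.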

  19. Temperature Trends from Homogenized German Radiosonde Data

    NASA Astrophysics Data System (ADS)

    Pattantyús-Ábrahám, Margit; Steinbrecht, Wolfgang

    2015-04-01

    We present a homogenization procedure and results for Germany's historical radiosonde records, dating back to 1950. Upper-air temperature records have been homogenized manually. The method makes use of the different radiosonde networks existing in East and West Germany from the 1950s until 1990. The largest temperature adjustments, up to 2.5 K, apply to Freiberg sondes used in the East in the 1950s and 1960s. Adjustments for Graw H50 and M60 sondes, used in the West from the 1950s to the late 1980s, and for RKZ sondes, used in the East in the 1970s and 1980s, are also significant, 0.3 to 0.5 K. Small differences between Vaisala RS80 and RS92 sondes, used throughout Germany since 1990 and 2005, respectively, were not corrected for at levels from the ground to 300 hPa. Comparison of the homogenized data with other radiosonde datasets, RICH and HadAT2, and with Microwave Sounding Unit satellite data, shows generally good agreement. HadAT2 data exhibit a few suspicious spikes in the 1970s and 1980s, and some suspicious offsets of up to 1 K after 1995. Compared to RICH, our homogenized data show slightly different temperatures in the 1960s and 1970s. We find that the troposphere over Germany has been warming by 0.25 ± 0.1 K per decade since the early 1960s, slightly more than reported in other studies. The stratosphere has been cooling, with the trend increasing from almost no change near 230 hPa (the tropopause) to -0.5 ± 0.2 K per decade near 50 hPa. Trends from the homogenized data are more positive by about 0.1 K per decade than those from the original data, both in the troposphere and the stratosphere.
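
A decadal trend like the 0.25 K per decade quoted above is an ordinary least-squares slope scaled by ten. A minimal sketch on a synthetic anomaly series (the series is fabricated for illustration, not the German radiosonde data):

```python
import numpy as np

def decadal_trend(years, values):
    """Ordinary least-squares linear trend, expressed per decade."""
    return 10.0 * np.polyfit(years, values, 1)[0]

# Synthetic tropospheric anomaly series: 0.25 K/decade warming plus noise (assumed)
years = np.arange(1963, 2014)
rng = np.random.default_rng(2)
anom = 0.025 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)
trend = decadal_trend(years, anom)
print(round(trend, 2))
```

Running the same estimator on the raw and homogenized versions of a record is how the roughly 0.1 K per decade effect of the adjustments would be quantified.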

  20. Anthropogenic Matrices Favor Homogenization of Tree Reproductive Functions in a Highly Fragmented Landscape

    PubMed Central

    2016-01-01

    Species homogenization and floristic differentiation are two possible consequences of the fragmentation process in plant communities. Despite the few studies, it seems clear that fragments with low forest cover inserted in anthropogenic matrices are more likely to experience floristic homogenization. However, the homogenization process has two other components, genetic and functional, which have not been investigated. The purpose of this study was to verify whether there was homogenization of tree reproductive functions in a fragmented landscape and, if found, to determine how the process was influenced by landscape composition. The study was conducted in eight fragments in southwestern Brazil. In each fragment, all individual trees with a diameter at breast height ≥3 cm were sampled in ten plots (0.2 ha) and classified into 26 reproductive functional types (RFTs). The process of functional homogenization was evaluated using additive partitioning of diversity. Additionally, the effect of landscape composition on functional diversity and on the number of individuals within each RFT was evaluated using a generalized linear mixed model. The fragments appeared to be in a process of functional homogenization (dominance of RFTs, alpha diversity lower than expected by chance, and low beta diversity). More than 50% of the RFTs and the functional diversity were affected by the landscape parameters. In general, the percentage of forest cover has a positive effect on RFTs while the percentage of coffee matrix has a negative one. The process of functional homogenization has serious consequences for biodiversity conservation because some functions may disappear that, in the long term, would threaten the fragments. This study contributes to a better understanding of how landscape changes affect functional diversity, the abundance of individuals in RFTs and the process of functional homogenization, as well as how to
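
Additive partitioning, as used above, splits total (gamma) diversity into a mean within-fragment (alpha) component and a between-fragment (beta) remainder; low beta is the signature of homogenization. A richness-based sketch (the fragment-by-RFT matrix is an invented toy example):

```python
import numpy as np

def additive_partition(site_by_type):
    """Richness-based additive partition: gamma = mean(alpha) + beta."""
    m = np.asarray(site_by_type) > 0      # presence/absence of each RFT per fragment
    alpha = m.sum(axis=1)                 # within-fragment richness
    gamma = int(m.any(axis=0).sum())      # landscape-level richness
    beta = gamma - alpha.mean()           # between-fragment component
    return alpha.mean(), beta, gamma

# Invented toy matrix: 4 fragments x 6 reproductive functional types (abundances)
counts = [[5, 2, 0, 1, 0, 0],
          [4, 3, 1, 0, 0, 0],
          [6, 1, 0, 1, 0, 0],
          [3, 2, 1, 0, 1, 0]]
a, b, g = additive_partition(counts)
print(a, b, g)   # low beta relative to gamma signals homogenization
```

In practice the observed alpha and beta are compared with a null model (randomized individuals) to decide whether alpha is lower than expected by chance, as the study does.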

  1. Utilizing Hierarchical Clustering to improve Efficiency of Self-Organizing Feature Map to Identify Hydrological Homogeneous Regions

    NASA Astrophysics Data System (ADS)

    Farsadnia, Farhad; Ghahreman, Bijan

    2016-04-01

    Identification of hydrologically homogeneous groups is both fundamental and applied research in hydrology. Clustering methods are among the conventional methods used to assess hydrological homogeneous regions. Recently, the Self-Organizing feature Map (SOM) method has been applied in some studies. However, the main problem of this method is the interpretation of its output map. Therefore, SOM is used as input to other clustering algorithms. The aim of this study is to apply a two-level Self-Organizing feature map and the Ward hierarchical clustering method to determine hydrologic homogeneous regions in the North and Razavi Khorasan provinces. First, principal component analysis was used to reduce the dimension of the SOM input matrix; the SOM was then used to form a two-dimensional feature map. To determine homogeneous regions for flood frequency analysis, the SOM output nodes were used as input to the Ward method. Generally, the regions identified by clustering algorithms are not statistically homogeneous; consequently, they have to be adjusted to improve their homogeneity. After adjustment of the regions by L-moment tests, five hydrologic homogeneous regions were identified. Finally, adjusted regions were created by a two-level SOM, and the best regional distribution function and associated parameters were selected by the L-moment approach. The results showed that the combination of self-organizing maps and Ward hierarchical clustering with principal components as input is more effective than the hierarchical method alone, with principal components or standardized inputs, in identifying hydrologic homogeneous regions.
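
The two-level scheme can be sketched: a small SOM compresses the site attributes into prototype vectors, and Ward's hierarchical clustering then groups those prototypes into candidate homogeneous regions. A minimal illustration (the `train_som` trainer is a bare-bones stand-in, not the authors' implementation, and the catchment attributes are synthetic):

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def train_som(data, grid=(4, 4), epochs=60, seed=0):
    """Bare-bones SOM: returns prototype vectors on a small 2-D grid."""
    rng = np.random.default_rng(seed)
    gx, gy = np.meshgrid(range(grid[0]), range(grid[1]))
    coords = np.column_stack([gx.ravel(), gy.ravel()]).astype(float)
    W = data[rng.choice(len(data), size=len(coords))].astype(float)
    for t in range(epochs):
        lr = 0.5 * (1.0 - t / epochs)                 # decaying learning rate
        sigma = max(2.0 * (1.0 - t / epochs), 0.5)    # shrinking neighborhood
        for x in data[rng.permutation(len(data))]:
            bmu = int(np.argmin(((W - x) ** 2).sum(axis=1)))   # best-matching unit
            h = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
            W += lr * h[:, None] * (x - W)
    return W

# Synthetic catchment attributes: two distinct hydrologic groups (assumed data)
rng = np.random.default_rng(1)
sites = np.vstack([rng.normal(0.0, 0.3, (20, 3)), rng.normal(2.0, 0.3, (20, 3))])

prototypes = train_som(sites)                         # first level: SOM
labels = fcluster(linkage(prototypes, method="ward"), # second level: Ward
                  t=2, criterion="maxclust")
print(prototypes.shape, sorted(set(labels.tolist())))
```

Each site would then inherit the cluster label of its best-matching SOM node, and the resulting regions would be checked and adjusted with L-moment homogeneity tests as described above.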

  2. Tissue homogeneity requires inhibition of unequal gene silencing during development.

    PubMed

    Le, Hai H; Looney, Monika; Strauss, Benjamin; Bloodgood, Michael; Jose, Antony M

    2016-08-01

    Multicellular organisms can generate and maintain homogenous populations of cells that make up individual tissues. However, cellular processes that can disrupt homogeneity and how organisms overcome such disruption are unknown. We found that ∼100-fold differences in expression from a repetitive DNA transgene can occur between intestinal cells in Caenorhabditis elegans. These differences are caused by gene silencing in some cells and are actively suppressed by parental and zygotic factors such as the conserved exonuclease ERI-1. If unsuppressed, silencing can spread between some cells in embryos but can be repeat specific and independent of other homologous loci within each cell. Silencing can persist through DNA replication and nuclear divisions, disrupting uniform gene expression in developed animals. Analysis at single-cell resolution suggests that differences between cells arise during early cell divisions upon unequal segregation of an initiator of silencing. Our results suggest that organisms with high repetitive DNA content, which include humans, could use similar developmental mechanisms to achieve and maintain tissue homogeneity. PMID:27458132

  3. Homogeneous UVA system for corneal cross-linking treatment

    NASA Astrophysics Data System (ADS)

    Ayres Pereira, Fernando R.; Stefani, Mario A.; Otoboni, José A.; Richter, Eduardo H.; Ventura, Liliane

    2010-02-01

    The treatment of keratoconus and corneal ulcers by collagen cross-linking, using ultraviolet type A (UVA) irradiation combined with the photosensitizer riboflavin (vitamin B2), is a promising technique. The standard protocol suggests instilling riboflavin in the pre-scratched cornea every 5 min for 30 min, during UVA irradiation of the cornea at 3 mW/cm2 for 30 min. This process increases the biomechanical strength of the cornea, stopping, or sometimes even reversing, the progression of keratoconus. Collagen cross-linking can be achieved by many methods, but UVA light is ideal for this purpose because it permits a homogeneous treatment, giving a uniform result across the treated area. We have developed a system, to be used clinically for cross-linking treatment of unhealthy corneas, which consists of a UVA-emitting delivery device controlled by a closed-loop system with high homogeneity. The system is tunable and delivers 3-5 mW/cm2 at 365 nm for three spot sizes (6 mm, 8 mm and 10 mm in diameter). The closed-loop electronics has 1% precision, leading to an overall error after calibration of less than 10%, and approximately 96% homogeneity.
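
The delivered radiant exposure follows directly from irradiance and time; at the standard 3 mW/cm2 for 30 min the cornea receives 5.4 J/cm2. A one-function sketch (the function name is ours):

```python
def uva_dose_J_per_cm2(irradiance_mW_cm2, minutes):
    """Radiant exposure (fluence) delivered at constant irradiance."""
    return irradiance_mW_cm2 * 1e-3 * minutes * 60.0

# Standard protocol quoted above: 3 mW/cm2 for 30 min
print(uva_dose_J_per_cm2(3.0, 30))   # ~5.4 J/cm2
```

A tunable 3-5 mW/cm2 source therefore delivers the same total dose in proportionally less time, which is why dose rather than irradiance is the controlled quantity in such protocols.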

  4. Photoinduced electron transfer processes in homogeneous and microheterogeneous solutions

    SciTech Connect

    Whitten, D.G.

    1991-10-01

    The focus of the work described in this report is on single electron transfer reactions of excited states which culminate in the formation of stable or metastable even electron species. For the most part the studies have involved even electron organic substrates which are thus converted photochemically to odd electron species and then at some stage reconvert to even electron products. These reactions generally fall into two rather different categories. In one set of studies we have examined reactions in which the metastable reagents generated by single electron transfer quenching of an excited state undergo novel fragmentation reactions, chiefly involving C-C bond cleavage. These reactions often culminate in novel and potentially useful chemical reactions and frequently have the potential for leading to new chemical products otherwise unaffordable by conventional reaction paths. In a rather different investigation we have also studied reactions in which single electron transfer quenching of an excited state is followed by subsequent reactions which lead reversibly to metastable two electron products which, often stable in themselves, can nonetheless be reacted with each other or with other reagents to regenerate the starting materials with release of energy. 66 refs., 9 figs., 1 tab.

  5. A Story Approach to Create Online College Courses

    ERIC Educational Resources Information Center

    Romero, Liz

    2016-01-01

    The purpose of this article is to describe the implementation of a story approach to create online courses in a college environment. The article describes the components of the approach and the implementation process used to create a nursing course and a language course. The implementation starts with the identification of the need and is followed by creating a…

  6. Applications of High and Ultra High Pressure Homogenization for Food Safety.

    PubMed

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low temperature long time and high temperature short time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, some non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide "fresh-like" products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Accordingly, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species will be reviewed in relation to the food matrix and to chemico-physical and process variables. The combined use of this alternative technology with other non-thermal technologies will also be considered.

  7. Applications of High and Ultra High Pressure Homogenization for Food Safety

    PubMed Central

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low temperature long time and high temperature short time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, some non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide “fresh-like” products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350–400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Accordingly, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species will be reviewed in relation to the food matrix and to chemico-physical and process variables. The combined use of this alternative technology with other non-thermal technologies will also be considered. PMID:27536270

  8. Applications of High and Ultra High Pressure Homogenization for Food Safety.

    PubMed

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low temperature long time and high temperature short time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, some non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide "fresh-like" products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Accordingly, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species will be reviewed in relation to the food matrix and to chemico-physical and process variables. The combined use of this alternative technology with other non-thermal technologies will also be considered. PMID:27536270

  9. Does Double Loop Learning Create Reliable Knowledge?

    ERIC Educational Resources Information Center

    Blackman, Deborah; Connelly, James; Henderson, Steven

    2004-01-01

    This paper addresses doubts concerning the reliability of knowledge being created by double loop learning processes. Popper's ontological worlds are used to explore the philosophical basis of the way that individual experiences are turned into organisational knowledge, and such knowledge is used to generate organisational learning. The paper…

  10. Instruction: Does It Mean Creating Intelligence?

    ERIC Educational Resources Information Center

    Brethower, Dale

    1990-01-01

    Argues that the mission of the university is to create intelligence. Defines intelligence, discusses research on cognitive processes of learning, and discusses obstacles to using the demonstrate-label-coach-mastery strategy emphasizing the value of the clinical approach used to teach seven specific skills. Presents a classroom illustration of this…

  11. Can cognitive science create a cognitive economics?

    PubMed

    Chater, Nick

    2015-02-01

    Cognitive science can intersect with economics in at least three productive ways: by providing richer models of individual behaviour for use in economic analysis; by drawing from economic theory in order to model distributed cognition; and jointly to create more powerful 'rational' models of cognitive processes and social interaction. There is the prospect of moving from behavioural economics to a genuinely cognitive economics.

  12. Creating engaging experiences for rehabilitation.

    PubMed

    McClusky, John F

    2008-01-01

    The traditional model of rehabilitation center design based on usability and function falls short of addressing the aspirations of those who use them. To better serve the motivational needs of both patients and therapists, we need to reconsider the gymnasium-inspired designs of current rehabilitation centers. Designers Patricia Moore and David Guynes have drawn inspiration from the everyday to create more engaging rehabilitation experiences with their Easy Street, Independence Square, Rehab 1-2-3, Our Town, and WorkSyms rehabilitation environments. Their designs simulate real-life situations to motivate patients by helping them connect their therapy to the life to which they aspire to return. Utilizing an empathic research process, Moore and Guynes build a deeper understanding of both patients' and therapists' values and apply that understanding to designs that are more directly connected to patients' aspirational goals while still meeting their functional rehabilitation needs. This same research-based design approach is utilized in all of their design work, which has included, most recently, the design of the Phoenix Valley Transit Authority's Metro Light Rail Train. The train and stations have won awards for accessibility and will begin public operation in late 2008.

  13. Beyond relationships between homogeneous and heterogeneous catalysis

    SciTech Connect

    Dixon, David A.; Katz, Alexander; Arslan, Ilke; Gates, Bruce C.

    2014-08-13

    Scientists who regard catalysis as a coherent field have been striving for decades to articulate the fundamental unifying principles. But because these principles seem to be broader than chemistry, chemical engineering, and materials science combined, catalytic scientists commonly interact within the sub-domains of homogeneous, heterogeneous, and bio-catalysis, and increasingly within even narrower domains such as organocatalysis, phase-transfer catalysis, acid-base catalysis, zeolite catalysis, etc. Attempts to unify catalysis have motivated researchers to find relationships between homogeneous and heterogeneous catalysis and to mimic enzymes. These themes have inspired vibrant international meetings and workshops, and we have benefited from the idea exchanges and have some thoughts about a path forward.

  14. Homogeneous freezing nucleation of stratospheric solution droplets

    NASA Technical Reports Server (NTRS)

    Jensen, Eric J.; Toon, Owen B.; Hamill, Patrick

    1991-01-01

    The classical theory of homogeneous nucleation was used to calculate the freezing rate of sulfuric acid solution aerosols under stratospheric conditions. The freezing of stratospheric aerosols would be important for the nucleation of nitric acid trihydrate particles in the Arctic and Antarctic stratospheres. In addition, the rate of heterogeneous chemical reactions on stratospheric aerosols may be very sensitive to their state. The calculations indicate that homogeneous freezing nucleation of pure water ice in the stratospheric solution droplets would occur at temperatures below about 192 K. However, the physical properties of H2SO4 solution at such low temperatures are not well known, and it is possible that sulfuric acid aerosols will freeze out at temperatures ranging from about 180 to 195 K. It is also shown that the temperature at which the aerosols freeze is nearly independent of their size.
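
Classical homogeneous nucleation theory, as used above, gives a rate of the form J = J0·exp(−ΔG*/kBT) with a critical-germ barrier ΔG* = 16πσ³/(3ΔGv²). A hedged numerical sketch (σ, ΔGv and the prefactor J0 are assumed order-of-magnitude values, not the paper's inputs):

```python
import math

def freezing_nucleation_rate(sigma_sl, dG_v, T, J0=1e35):
    """Classical homogeneous nucleation rate, events per cm^3 per second.

    sigma_sl : ice/solution interfacial energy in erg/cm^2 (assumed value)
    dG_v     : bulk free-energy gain per unit volume in erg/cm^3 (assumed)
    J0       : kinetic prefactor in cm^-3 s^-1 (assumed order of magnitude)
    """
    k_B = 1.380649e-16                                            # Boltzmann constant, erg/K
    dG_star = 16.0 * math.pi * sigma_sl ** 3 / (3.0 * dG_v ** 2)  # critical-germ barrier
    return J0 * math.exp(-dG_star / (k_B * T))

# Illustrative comparison: deeper supercooling (larger |dG_v|, lower T) vs. milder
J_cold = freezing_nucleation_rate(sigma_sl=20.0, dG_v=2.8e8, T=190.0)
J_warm = freezing_nucleation_rate(sigma_sl=20.0, dG_v=2.0e8, T=200.0)
print(J_cold > J_warm)   # stronger supercooling gives a far higher rate
```

The exponential dependence on the barrier is why the predicted freezing threshold is so sharp in temperature, and also why the poorly known low-temperature properties of H2SO4 solutions translate into the wide 180-195 K uncertainty quoted above.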

  15. Detonation in shocked homogeneous high explosives

    SciTech Connect

    Yoo, C.S.; Holmes, N.C.; Souers, P.C.

    1995-11-01

    We have studied shock-induced changes in homogeneous high explosives including nitromethane, tetranitromethane, and single crystals of pentaerythritol tetranitrate (PETN) by using fast time-resolved emission and Raman spectroscopy at a two-stage light-gas gun. The results reveal three distinct steps during which the homogeneous explosives chemically evolve to final detonation products. These are (1) the initiation of shock compressed high explosives after an induction period, (2) thermal explosion of shock-compressed and/or reacting materials, and (3) a decay to a steady-state representing a transition to the detonation of uncompressed high explosives. Based on a gray-body approximation, we have obtained the CJ temperatures: 3800 K for nitromethane, 2950 K for tetranitromethane, and 4100 K for PETN. We compare the data with various thermochemical equilibrium calculations. In this paper we will also show a preliminary result of single-shot time-resolved Raman spectroscopy applied to shock-compressed nitromethane.

  16. Program Logics for Homogeneous Meta-programming

    NASA Astrophysics Data System (ADS)

    Berger, Martin; Tratt, Laurence

    A meta-program is a program that generates or manipulates another program; in homogeneous meta-programming, a program may generate new parts of, or manipulate, itself. Meta-programming has been used extensively since macros were introduced to Lisp, yet we have little idea how to reason formally about meta-programs. This paper provides the first program logics for homogeneous meta-programming, using a variant of MiniML□_e by Davies and Pfenning as the underlying meta-programming language. We show the applicability of our approach by reasoning about example meta-programs from the literature. We also demonstrate that our logics are relatively complete in the sense of Cook, enable the inductive derivation of characteristic formulae, and exactly capture the observational properties induced by the operational semantics.

  17. CUDA Simulation of Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John V.; Shum, Victor; Fu, Terry

    2011-01-01

    We discuss very fast Compute Unified Device Architecture (CUDA) simulations of ideal homogeneous incompressible turbulence based on Fourier models. These models have associated statistical theories that predict that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. Prior numerical simulations have shown that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We review the theoretical basis of this "broken ergodicity" as applied to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence. Our new simulations examine the phenomenon of broken ergodicity through very long time and large grid size runs performed on a state-of-the-art CUDA platform. Results comparing various CUDA hardware configurations and grid sizes are discussed. NS and MHD results are compared.

  18. A homogenization model of the annulus fibrosus.

    PubMed

    Yin, Luzhong; Elliott, Dawn M

    2005-08-01

    The objective of this study was to use a homogenization model of the anisotropic mechanical behavior of the annulus fibrosus (AF) to address some of the issues raised in structural finite element and fiber-reinforced strain energy models. Homogenization theory describes the effect of microstructure on macroscopic material properties by assuming the material is composed of repeating representative volume elements. We first developed the general homogenization model and then specialized the model to in-plane single-lamella and multi-lamellae AF properties. We compared model predictions to experimentally measured AF properties and performed parametric studies. The predicted tensile moduli (E_θ and E_z) and their dependence on fiber volume fraction and fiber angle were consistent with measured values. However, the model prediction for the shear modulus (G_θz) was two orders of magnitude larger than directly measured values. The values of E_θ and E_z were strongly dependent on the model input for the matrix modulus, much more so than on the fiber modulus. These parametric analyses demonstrated the contribution of the matrix to AF load support, which may play a role when proteoglycans are decreased in disc degeneration, and will also be an important design factor in tissue engineering. We next compared the homogenization model to a 3-D structural finite element model and to fiber-reinforced energy models. Similarities between the three model types provided confidence in the ability of these models to predict AF tissue mechanics. This study provides a direct comparison between the several types of AF models and will be useful for interpreting previous studies and elucidating AF structure-function relationships in disc degeneration and for functional tissue engineering.
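
The matrix's dominant role found in the parametric study can be seen even in the crudest composite estimates: the Voigt (uniform-strain) and Reuss (uniform-stress) bounds on a fiber-reinforced modulus. A sketch with invented illustrative moduli (this is the classical bound pair, not the paper's homogenization model):

```python
def voigt_reuss_bounds(E_fiber, E_matrix, vf):
    """Classical bounds on a fiber/matrix composite modulus at fiber fraction vf:
    Voigt (uniform strain, upper) and Reuss (uniform stress, lower)."""
    upper = vf * E_fiber + (1.0 - vf) * E_matrix
    lower = 1.0 / (vf / E_fiber + (1.0 - vf) / E_matrix)
    return upper, lower

# Invented illustrative moduli (MPa): stiff collagen fibers in a soft matrix
up, lo = voigt_reuss_bounds(E_fiber=500.0, E_matrix=1.0, vf=0.15)
print(up, lo)
```

The lower (series) bound stays pinned near the matrix modulus, consistent with the parametric finding above that the predicted tensile moduli depend far more on the matrix input than on the fiber modulus.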

  19. Spherical cloaking with homogeneous isotropic multilayered structures

    NASA Astrophysics Data System (ADS)

    Qiu, Cheng-Wei; Hu, Li; Xu, Xiaofei; Feng, Yijun

    2009-04-01

    We propose a practical realization of electromagnetic spherical cloaking by layered structure of homogeneous isotropic materials. By mimicking the classic anisotropic cloak by many alternating thin layers of isotropic dielectrics, the permittivity and permeability in each isotropic layer can be properly determined by effective medium theory in order to achieve invisibility. The model greatly facilitates modeling by Mie theory and realization by multilayer coating of dielectrics. Eigenmode analysis is also presented to provide insights of the discretization in multilayers.
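
The layer recipe implied above can be made concrete: two alternating isotropic layers of equal thickness mimic an anisotropic shell whose tangential permittivity is the arithmetic mean and whose radial permittivity is the harmonic mean of the pair, so the pair is the root of a quadratic in the target components. A sketch (radii and sampling radius are arbitrary; this follows the standard effective-medium formulas, not necessarily the authors' exact procedure):

```python
import math

def cloak_profile(r, a, b):
    """Ideal anisotropic spherical-cloak permittivities at radius r (linear map)."""
    f = b / (b - a)
    return f * ((r - a) / r) ** 2, f        # (eps_radial, eps_tangential)

def layer_pair(eps_r, eps_t):
    """Two equal-thickness isotropic layers mimicking (eps_r, eps_t):
    eps_t = (eA + eB)/2 (arithmetic mean), eps_r = 2*eA*eB/(eA + eB) (harmonic mean),
    so eA, eB are the roots of x^2 - 2*eps_t*x + eps_r*eps_t = 0."""
    s, p = 2.0 * eps_t, eps_r * eps_t       # sum and product of the pair
    d = math.sqrt(s * s / 4.0 - p)
    return s / 2.0 + d, s / 2.0 - d

a, b = 1.0, 2.0                             # inner/outer cloak radii (arbitrary units)
eps_r, eps_t = cloak_profile(1.5, a, b)     # target tensor components mid-shell
eA, eB = layer_pair(eps_r, eps_t)
# The pair reproduces the target components in the effective-medium limit:
print(round((eA + eB) / 2.0, 6), round(2.0 * eA * eB / (eA + eB), 6))
```

Repeating this at each discretization radius yields the multilayer coating whose scattering can then be checked against Mie theory, as the abstract describes.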

  20. Spherical cloaking with homogeneous isotropic multilayered structures.

    PubMed

    Qiu, Cheng-Wei; Hu, Li; Xu, Xiaofei; Feng, Yijun

    2009-04-01

    We propose a practical realization of electromagnetic spherical cloaking by layered structure of homogeneous isotropic materials. By mimicking the classic anisotropic cloak by many alternating thin layers of isotropic dielectrics, the permittivity and permeability in each isotropic layer can be properly determined by effective medium theory in order to achieve invisibility. The model greatly facilitates modeling by Mie theory and realization by multilayer coating of dielectrics. Eigenmode analysis is also presented to provide insights of the discretization in multilayers. PMID:19518392

  1. Recent advances in homogeneous nickel catalysis.

    PubMed

    Tasker, Sarah Z; Standley, Eric A; Jamison, Timothy F

    2014-05-15

    Tremendous advances have been made in nickel catalysis over the past decade. Several key properties of nickel, such as facile oxidative addition and ready access to multiple oxidation states, have allowed the development of a broad range of innovative reactions. In recent years, these properties have been increasingly understood and used to perform transformations long considered exceptionally challenging. Here we discuss some of the most recent and significant developments in homogeneous nickel catalysis, with an emphasis on both synthetic outcome and mechanism.

  2. Background: What the States Created

    ERIC Educational Resources Information Center

    Cox, James C.

    2009-01-01

    Prior to 2003, virtual universities were being created at a rate that would call into question the usual perception that higher education rarely changed, or changed (if at all) at a glacial pace. No comprehensive study of what was actually being created had been done; nor had anyone tapped the experiences of the developers in the states to see what was…

  3. Homogeneous large-scale crystalline nanoparticle-covered substrate with high SERS performance

    NASA Astrophysics Data System (ADS)

    Aybeke, E. N.; Lacroute, Y.; Elie-Caille, C.; Bouhelier, A.; Bourillot, E.; Lesniewska, E.

    2015-06-01

    This article details the surface-enhanced Raman scattering (SERS) performance of plasmonic substrates fabricated by a physical metal evaporation technique that uses no precursor or intermediate coating. We outline a cost-effective nanofabrication protocol that uses common laboratory equipment to produce homogeneously covered crystalline nanoparticle substrates. Our fabrication yields a homogeneous SERS response over the whole surface. The platform is tested with methylene blue diluted at various concentrations to estimate the sensitivity, homogeneity, and reproducibility of the process. The capacity of the substrates is also confirmed with spectroscopic investigations of human microsomal cytochrome b5.

  4. TESTING HOMOGENEITY WITH GALAXY STAR FORMATION HISTORIES

    SciTech Connect

    Hoyle, Ben; Jimenez, Raul; Tojeiro, Rita; Maartens, Roy; Heavens, Alan; Clarkson, Chris

    2013-01-01

    Observationally confirming spatial homogeneity on sufficiently large cosmological scales is of importance to test one of the underpinning assumptions of cosmology, and is also imperative for correctly interpreting dark energy. A challenging aspect of this is that homogeneity must be probed inside our past light cone, while observations take place on the light cone. The star formation history (SFH) in the galaxy fossil record provides a novel way to do this. We calculate the SFH of stacked luminous red galaxy (LRG) spectra obtained from the Sloan Digital Sky Survey. We divide the LRG sample into 12 equal-area contiguous sky patches and 10 redshift slices (0.2 < z < 0.5), which correspond to 120 blocks of volume ≈0.04 Gpc³. Using the SFH in a time period that samples the history of the universe between look-back times 11.5 and 13.4 Gyr as a proxy for homogeneity, we calculate the posterior distribution for the excess large-scale variance due to inhomogeneity, and find that the most likely solution is no extra variance at all. At 95% credibility, there is no evidence of deviations larger than 5.8%.
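
The notion of "excess large-scale variance" can be illustrated with a toy calculation: compare the scatter of a proxy statistic (such as a stacked SFH measure per sky/redshift block) with the scatter expected from measurement error alone. This is a deliberately simplified sketch, not the paper's posterior analysis; the function and its inputs are hypothetical:

```python
def excess_variance(values, sigmas):
    """Between-block sample variance of a proxy observable minus the mean
    measurement variance; values well above zero would hint at genuine
    large-scale inhomogeneity."""
    n = len(values)
    mean = sum(values) / n
    between = sum((v - mean) ** 2 for v in values) / (n - 1)
    noise = sum(s ** 2 for s in sigmas) / n
    return between - noise

# toy numbers: three blocks whose scatter is fully explained by noise
excess = excess_variance([1.0, 2.0, 3.0], [1.0, 1.0, 1.0])
```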

  5. Homogeneous Biosensing Based on Magnetic Particle Labels.

    PubMed

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation. PMID:27275824

  7. Equilibrium states of homogeneous sheared compressible turbulence

    NASA Astrophysics Data System (ADS)

    Riahi, M.; Lili, T.

    2011-06-01

    Equilibrium states of homogeneous compressible turbulence subjected to rapid shear are studied using rapid distortion theory (RDT). The purpose of this study is to determine numerical solutions of the unsteady linearized equations governing the evolution of double-correlation spectra. In this work, an RDT code developed by the authors solves these equations for compressible homogeneous shear flows. Numerical integration of these equations is carried out using a simple and accurate second-order scheme. The two Mach numbers relevant to homogeneous shear flow are the turbulent Mach number Mt, given by the root mean square turbulent velocity fluctuation divided by the speed of sound, and the gradient Mach number Mg, which is the mean shear rate times the transverse integral scale of the turbulence divided by the speed of sound. Validation of this code is performed by comparing RDT results with the direct numerical simulations (DNS) of [A. Simone, G.N. Coleman, and C. Cambon, J. Fluid Mech. 330, 307 (1997)] and [S. Sarkar, J. Fluid Mech. 282, 163 (1995)] for various values of the initial gradient Mach number Mg0. It was found that RDT is valid for small values of the non-dimensional time St (St < 3.5). It is important to note that RDT is also valid for large values of St (St > 10), in particular for large values of Mg0. This essential feature justifies the resort to RDT in order to determine equilibrium states in the compressible regime.
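
The two Mach numbers defined in the abstract are simple ratios, and computing them is a one-liner; the numerical values below are invented purely for illustration:

```python
def mach_numbers(u_rms, shear_rate, integral_scale, sound_speed):
    """Turbulent Mach number Mt = u'/c and gradient Mach number
    Mg = S * l / c for a homogeneous sheared compressible flow."""
    return u_rms / sound_speed, shear_rate * integral_scale / sound_speed

Mt, Mg = mach_numbers(u_rms=17.0, shear_rate=100.0,
                      integral_scale=0.85, sound_speed=340.0)
```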

  8. MULTIGRID HOMOGENIZATION OF HETEROGENEOUS POROUS MEDIA

    SciTech Connect

    Dendy, J.E.; Moulton, J.D.

    2000-10-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL); this report, however, covers only two years of research, since the project was terminated at the end of two years in response to the reduction in funding for the LDRD Program at LANL. The numerical simulation of flow through heterogeneous porous media has become a vital tool in forecasting reservoir performance, analyzing groundwater supply and predicting the subsurface flow of contaminants. Consequently, the computational efficiency and accuracy of these simulations is paramount. However, the parameters of the underlying mathematical models (e.g., permeability, conductivity) typically exhibit severe variations over a range of significantly different length scales. Thus the numerical treatment of these problems relies on a homogenization or upscaling procedure to define an approximate coarse-scale problem that adequately captures the influence of the fine-scale structure, with a resultant compromise between the competing objectives of computational efficiency and numerical accuracy. For homogenization in models of flow through heterogeneous porous media, we have developed new, efficient numerical multilevel methods that offer a significant improvement in the compromise between accuracy and efficiency. We recently combined this approach with the work of Dvorak to compute bounded estimates of the homogenized permeability for such flows and demonstrated the effectiveness of this new algorithm with numerical examples.
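
The accuracy/efficiency trade-off in upscaling can be made concrete with the classical Wiener bounds: any homogenized permeability lies between the harmonic mean (flow across layers) and the arithmetic mean (flow along layers) of the fine-scale values. This textbook bracket is only a stand-in for the sharper multilevel estimates developed in the project:

```python
def wiener_bounds(fine_perm):
    """Harmonic (lower) and arithmetic (upper) Wiener bounds on the
    homogenized permeability of a set of fine-scale cell values."""
    n = len(fine_perm)
    arithmetic = sum(fine_perm) / n                   # flow along layers
    harmonic = n / sum(1.0 / k for k in fine_perm)    # flow across layers
    return harmonic, arithmetic

lo, hi = wiener_bounds([1.0, 10.0, 100.0])   # three orders of magnitude
```

For strongly heterogeneous media the two bounds are far apart (here roughly 2.7 versus 37), which is exactly why sharper, structure-aware estimates are worth the extra computation.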

  9. Effect of homogenization and ultrasonication on the physical properties of insoluble wheat bran fibres

    NASA Astrophysics Data System (ADS)

    Hu, Ran; Zhang, Min; Adhikari, Benu; Liu, Yaping

    2015-10-01

    Wheat bran is rich in dietary fibre and its annual output is abundant but underutilized. Insoluble dietary fibre often influences food quality negatively; therefore, improving the physical and chemical properties of insoluble wheat bran fibre for post-processing is a challenge. Insoluble dietary fibre was obtained from wheat bran and micronized using high-pressure homogenization, high-intensity sonication, and a combination of the two methods. The high-pressure homogenization and high-pressure homogenization + high-intensity sonication treatments significantly (p < 0.05) improved the solubility, swelling, water-holding, oil-holding, and cation exchange capacities. The improvement of these properties by high-intensity sonication alone was marginal. In most cases, high-pressure homogenization alone was as good as the combined process in improving the above-mentioned properties; hence, the contribution of high-intensity sonication to the combined process was minimal. At best, the particle size of the wheat bran fibre can be reduced to 9 μm, with significant changes in the solubility, swelling, water-holding, oil-holding, and cation exchange capacities.

  10. Effect of homogenization and pasteurization on the structure and stability of whey protein in milk.

    PubMed

    Qi, Phoebe X; Ren, Daxi; Xiao, Yingping; Tomasula, Peggy M

    2015-05-01

    The effect of homogenization alone or in combination with high-temperature, short-time (HTST) pasteurization or UHT processing on the whey fraction of milk was investigated using highly sensitive spectroscopic techniques. In pilot plant trials, 1-L quantities of whole milk were homogenized in a 2-stage homogenizer at 35°C (6.9 MPa/10.3 MPa) and, along with skim milk, were subjected to HTST pasteurization (72°C for 15 s) or UHT processing (135°C for 2 s). Other whole milk samples were processed using homogenization followed by either HTST pasteurization or UHT processing. The processed skim and whole milk samples were centrifuged further to remove fat and then acidified to pH 4.6 to isolate the corresponding whey fractions, and centrifuged again. The whey fractions were then purified using dialysis and investigated using the circular dichroism, Fourier transform infrared, and Trp intrinsic fluorescence spectroscopic techniques. Results demonstrated that homogenization combined with UHT processing of milk caused not only changes in protein composition but also significant secondary structural loss, particularly in the amounts of apparent antiparallel β-sheet and α-helix, as well as diminished tertiary structural contact. In both cases of homogenization alone and followed by HTST treatments, neither caused appreciable chemical changes, nor remarkable secondary structural reduction. But disruption was evident in the tertiary structural environment of the whey proteins due to homogenization of whole milk as shown by both the near-UV circular dichroism and Trp intrinsic fluorescence. In-depth structural stability analyses revealed that even though processing of milk imposed little impairment on the secondary structural stability, the tertiary structural stability of whey protein was altered significantly. The following order was derived based on these studies: raw whole>HTST, homogenized, homogenized and pasteurized>skimmed and pasteurized, and skimmed UHT>homogenized

  11. Homogeneous Charge Compression Ignition Free Piston Linear Alternator

    SciTech Connect

    Janson Wu; Nicholas Paradiso; Peter Van Blarigan; Scott Goldsborough

    1998-11-01

    An experimental and theoretical investigation of a homogeneous charge compression ignition (HCCI) free-piston-powered linear alternator has been conducted to determine if improvements can be made in the thermal and conversion efficiencies of modern electrical generator systems. Performance of a free piston engine was investigated using a rapid compression expansion machine and a full-cycle thermodynamic model. Linear alternator performance was investigated with a computer model. In addition, linear alternator testing and permanent magnet characterization hardware were developed. The development of the two-stroke cycle scavenging process has begun.

  12. Homogeneous turbulence subjected to mean flow with elliptic streamlines

    NASA Technical Reports Server (NTRS)

    Blaisdell, G. A.; Shariff, K.

    1994-01-01

    Direct numerical simulations are performed for homogeneous turbulence with a mean flow having elliptic streamlines. This flow combines the effects of rotation and strain on the turbulence. Qualitative comparisons are made with linear theory for cases with high Rossby number. The nonlinear transfer process is monitored using a generalized skewness. In general, rotation turns off the nonlinear cascade; however, for moderate ellipticities and rotation rates the nonlinear cascade is turned off and then reestablished. Turbulence statistics of interest in turbulence modeling are calculated, including full Reynolds stress budgets.

  13. Converting Homogeneous to Heterogeneous in Electrophilic Catalysis using Monodisperse Metal Nanoparticles

    SciTech Connect

    Witham, Cole A.; Huang, Wenyu; Tsung, Chia-Kuang; Kuhn, John N.; Somorjai, Gabor A.; Toste, F. Dean

    2009-10-15

    A continuing goal in catalysis is the transformation of processes from homogeneous to heterogeneous. To this end, nanoparticles represent a new frontier in heterogeneous catalysis, where this conversion is supplemented by the ability to obtain new or divergent reactivity and selectivity. We report a novel method for applying heterogeneous catalysts to known homogeneous catalytic reactions through the design and synthesis of electrophilic platinum nanoparticles. These nanoparticles are selectively oxidized by the hypervalent iodine species PhICl₂, and catalyze a range of π-bond activation reactions previously only homogeneously catalyzed. Multiple experimental methods are utilized to unambiguously verify the heterogeneity of the catalytic process. The discovery of treatments for nanoparticles that induce the desired homogeneous catalytic activity should lead to the further development of reactions previously inaccessible in heterogeneous catalysis. Furthermore, our size and capping agent study revealed that Pt PAMAM dendrimer-capped nanoparticles demonstrate superior activity and recyclability compared to larger, polymer-capped analogues.

  14. Homogenous charge compression ignition engine having a cylinder including a high compression space

    DOEpatents

    Agama, Jorge R.; Fiveland, Scott B.; Maloney, Ronald P.; Faletti, James J.; Clarke, John M.

    2003-12-30

    The present invention relates generally to the field of homogeneous charge compression engines. In these engines, fuel is injected upstream or directly into the cylinder when the power piston is relatively close to its bottom dead center position. The fuel mixes with air in the cylinder as the power piston advances to create a relatively lean homogeneous mixture that preferably ignites when the power piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage can result. Thus, the present invention divides the homogeneous charge between a controlled volume higher compression space and a lower compression space to better control the start of ignition.

  15. Turbulent Diffusion in Non-Homogeneous Environments

    NASA Astrophysics Data System (ADS)

    Diez, M.; Redondo, J. M.; Mahjoub, O. B.; Sekula, E.

    2012-04-01

    Many experimental studies have been devoted to the understanding of non-homogeneous turbulent dynamics. Activity in this area intensified when the basic Kolmogorov self-similar theory was extended to two-dimensional or quasi-2D turbulent flows such as those appearing in the environment, which seem to control mixing [1,2]. The statistical description and the dynamics of these geophysical flows depend strongly on the distribution of long-lived organized (coherent) structures. These flows show a complex topology, but may be subdivided in terms of strongly elliptical domains (high vorticity regions), strong hyperbolic domains (deformation cells with high energy condensations) and the background turbulent field of moderate elliptic and hyperbolic characteristics. It is of fundamental importance to investigate the different influence of these topologically diverse regions. Relevant geometrical information on the different areas is also given by the maximum fractal dimension, which is related to the energy spectrum of the flow. Using all the available information it is possible to investigate the spatial variability of the horizontal eddy diffusivity K(x,y). This information would be very important when trying to model numerically the behaviour in time of oil spills [3,4]. There is a strong dependence of horizontal eddy diffusivities on the wave Reynolds number as well as on the wind stress, measured as the friction velocity from wind profiles taken at the coastline. Natural sea surface oily slicks of diverse origin (plankton, algae or natural emissions and seeps of oil) form complicated structures on the sea surface due to the effects of both multiscale turbulence and Langmuir circulation. It is then possible to use the topological and scaling analysis to discriminate the different physical sea surface processes.
    We can relate higher order moments of the Lagrangian velocity to effective diffusivity in spite of the need to calibrate the different regions determining the

  16. Microstructure-Mechanical Properties Relation of TLP-Bonded FSX-414 Superalloy: Effect of Homogenization Design

    NASA Astrophysics Data System (ADS)

    Bakhtiari, R.; Ekrami, A.; Khan, T. I.

    2015-04-01

    Co-based FSX-414 superalloy is used especially for first-stage nozzles of gas turbines. Transient liquid phase (TLP) bonding has potential as a repair process for these nozzles. In this study, homogenization of TLP-bonded FSX-414 superalloy at the optimum bonding condition (1150 °C/5 min) was conducted at 1175, 1200, and 1225 °C for 1, 3, and 6 h. Homogenization at 1175 °C/1 h had no effect on removing the diffusion-affected zone (DAZ) phases. Increasing the time to 6 h was effective in removing the DAZ phases and the boride phases formed due to liquefaction, but compositional homogenization was not complete. Homogenization at 1200 °C for 1 h caused boride phases to form adjacent to the joint and in the base metal; increasing the time to 3 h produced joints free of these phases. For the 1225 °C/3 h homogenization condition, a Ni-etchant and EDS analysis across the joint showed an appropriate combination of compositional and microstructural homogenization, including removal of the DAZ phases. The highest hardness, the most uniform hardness profile across the joint, and the highest shear strength (91% of the base metal strength), in addition to the microstructural features, showed that homogenizing at 1225 °C for 3 h produced the best joints.

  17. Modeling the homogenization kinetics of as-cast U-10wt% Mo alloys

    NASA Astrophysics Data System (ADS)

    Xu, Zhijie; Joshi, Vineet; Hu, Shenyang; Paxton, Dean; Lavender, Curt; Burkes, Douglas

    2016-04-01

    Low-enriched U-22at% Mo (U-10Mo) alloy has been considered as an alternative material to replace the highly enriched fuels in research reactors. For U-10Mo to work effectively and replace the existing fuel material, a thorough understanding of the microstructure development from as-cast to the final formed structure is required. The as-cast microstructure is typically inhomogeneous, with molybdenum-rich and -lean regions that may affect the processing and possibly the in-reactor performance. This as-cast structure must be homogenized by thermal treatment to produce a uniform Mo distribution. The development of a modeling capability will improve the understanding of the effect of initial microstructures on the Mo homogenization kinetics. In the current work, we investigated the effect of the as-cast microstructure on the homogenization kinetics. The kinetics of homogenization was modeled with a rigorous algorithm that relates line-scan data of Mo concentration to the gray scale in energy dispersive spectroscopy images, which was used to generate a reconstructed Mo concentration map. The map was then used as realistic microstructure input for physics-based homogenization models, in which the entire homogenization kinetics can be simulated and validated against the available experimental data at different homogenization times and temperatures.
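
The pipeline sketched in this abstract (grayscale line scan → concentration map → homogenization model) can be caricatured in a few lines: a linear grayscale-to-wt% calibration followed by an explicit finite-difference solution of Fick's second law. All numbers here (calibration endpoints, diffusivity, grid spacing) are hypothetical, for illustration only:

```python
def gray_to_conc(gray, gray_min=0.0, gray_max=255.0, c_min=5.0, c_max=15.0):
    """Map an EDS grayscale value onto a Mo concentration (wt%), assuming
    a linear calibration between two hypothetical reference points."""
    t = (gray - gray_min) / (gray_max - gray_min)
    return c_min + t * (c_max - c_min)

def homogenize_1d(conc, D, dx, dt, steps):
    """Evolve a 1-D concentration profile by Fick's second law with an
    explicit scheme and zero-gradient ends (stable for D*dt/dx**2 <= 0.5)."""
    c = list(conc)
    for _ in range(steps):
        nxt = c[:]
        for i in range(1, len(c) - 1):
            nxt[i] = c[i] + D * dt / dx**2 * (c[i-1] - 2.0 * c[i] + c[i+1])
        nxt[0], nxt[-1] = nxt[1], nxt[-2]   # zero-gradient boundaries
        c = nxt
    return c

profile = [gray_to_conc(g) for g in [40, 200, 60, 220, 50]]   # Mo-lean/rich bands
evolved = homogenize_1d(profile, D=1e-16, dx=1e-6, dt=1e3, steps=500)
```

Longer anneals or higher diffusivities flatten the profile further; a real model would use temperature-dependent D and the full 2-D reconstructed map.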

  18. Creating and Exploring Simple Models

    ERIC Educational Resources Information Center

    Hubbard, Miles J.

    2007-01-01

    Students manipulate data algebraically, and statistically to create models applied to a falling ball. They also borrow tools from arithmetic progressions to examine the relationship between the velocity and the distance the ball falls. (Contains 2 tables and 5 figures.)

  19. Creating and Nurturing Strong Teams.

    ERIC Educational Resources Information Center

    Martin, Kaye M.

    1999-01-01

    Discusses ways to create and sustain strong teaching teams, including matching curriculum goals, complementary professional strengths, and exercise of autonomy. Elaborates the administrator's role in nurturing and supporting teamwork. (JPB)

  20. Homogenization, lyophilization or acid-extraction of meat products improves iron uptake from cereal-meat product combinations in an in vitro digestion/Caco-2 cell model.

    PubMed

    Pachón, Helena; Stoltzfus, Rebecca J; Glahn, Raymond P

    2009-03-01

    The effect of processing (homogenization, lyophilization, acid-extraction) meat products on iron uptake from meat combined with uncooked iron-fortified cereal was evaluated using an in vitro digestion/Caco-2 cell model. Beef was cooked, blended to create smaller meat particles, and combined with electrolytic iron-fortified infant rice cereal. Chicken liver was cooked and blended, lyophilized, or acid-extracted, and combined with FeSO4-fortified wheat flour. In the beef-cereal combination, Caco-2 cell iron uptake, assessed by measuring the ferritin formed by cells, was greater when the beef was blended for the greatest amount of time (360 s) compared with 30 s (P < 0.05). Smaller liver particles (blended for 360 s or lyophilized) significantly enhanced iron uptake compared to liver blended for 60 s (P < 0.001) in the liver-flour combination. Compared to liver blended for 60 s, acid-extraction of liver significantly enhanced iron uptake (P = 0.03) in the liver-flour combination. Homogenization of beef and homogenization, lyophilization, or acid-extraction of chicken liver increases the enhancing effect of meat products on iron absorption in iron-fortified cereals.

  1. Sulfur isotope homogeneity of lunar mare basalts

    NASA Astrophysics Data System (ADS)

    Wing, Boswell A.; Farquhar, James

    2015-12-01

    We present a new set of high precision measurements of relative 33S/32S, 34S/32S, and 36S/32S values in lunar mare basalts. The measurements are referenced to the Vienna-Canyon Diablo Troilite (V-CDT) scale, on which the international reference material, IAEA-S-1, is characterized by δ33S = -0.061‰, δ34S ≡ -0.3‰ and δ36S = -1.27‰. The present dataset confirms that lunar mare basalts are characterized by a remarkable degree of sulfur isotopic homogeneity, with most new and published SF6-based sulfur isotope measurements consistent with a single mass-dependent mean isotopic composition of δ34S = 0.58 ± 0.05‰, Δ33S = 0.008 ± 0.006‰, and Δ36S = 0.2 ± 0.2‰, relative to V-CDT, where the uncertainties are quoted as 99% confidence intervals on the mean. This homogeneity allows identification of a single sample (12022, 281) with an apparent 33S enrichment, possibly reflecting cosmic-ray-induced spallation reactions. It also reveals that some mare basalts have slightly lower δ34S values than the population mean, which is consistent with sulfur loss from a reduced basaltic melt prior to eruption at the lunar surface. Both the sulfur isotope homogeneity of the lunar mare basalts and the predicted sensitivity of sulfur isotopes to vaporization-driven fractionation suggest that less than ≈1-10% of lunar sulfur was lost after a potential moon-forming impact event.
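
The Δ notation above measures the deviation of the minor isotopes from mass-dependent fractionation relative to δ34S. A minimal sketch of the conventional definitions, assuming the usual reference exponents 0.515 (33S) and 1.90 (36S):

```python
def cap_delta(d33, d34, d36):
    """Delta-33S and Delta-36S (per mil): deviation of the measured minor
    isotope deltas from the mass-dependent reference line through d34."""
    ref33 = 1000.0 * ((1.0 + d34 / 1000.0) ** 0.515 - 1.0)
    ref36 = 1000.0 * ((1.0 + d34 / 1000.0) ** 1.90 - 1.0)
    return d33 - ref33, d36 - ref36

# a composition lying exactly on the mass-dependent line gives zeros
md33 = 1000.0 * ((1.0 + 0.58 / 1000.0) ** 0.515 - 1.0)
md36 = 1000.0 * ((1.0 + 0.58 / 1000.0) ** 1.90 - 1.0)
D33, D36 = cap_delta(md33, 0.58, md36)
```

A sample such as 12022, 281 would show up as D33 significantly different from zero, the signature of a non-mass-dependent process such as spallation.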

  2. Creating Cartoons to Promote Leaderships Skills and Explore Leadership Qualities

    ERIC Educational Resources Information Center

    Smith, Latisha L.; Clausen, Courtney K.; Teske, Jolene K.; Ghayoorrad, Maryam; Gray, Phyllis; Al Subia, Sukainah; Atwood-Blaine, Dana; Rule, Audrey C.

    2015-01-01

    This document describes a strategy for increasing student leadership and creativity skills through the creation of cartoons. Creating cartoons engages students in divergent thinking and cognitive processes, such as perception, recall, and mental processing. When students create cartoons focused on a particular topic, they are making connections to…

  3. Utilizing Educational Corporate Culture To Create a Quality School.

    ERIC Educational Resources Information Center

    Osborne, Bill

    Strategies for utilizing educational corporate culture to create a quality school are presented in this paper, which argues that the understanding of the shared belief system of organizational members is crucial to the process. Creating a quality school entails moving from a "teach the process" oriented model to one that internalizes the desired…

  4. Heterogeneity versus homogeneity of multiple sclerosis

    PubMed Central

    Sato, Fumitaka; Martinez, Nicholas E; Omura, Seiichi; Tsunoda, Ikuo

    2011-01-01

    The 10th International Congress of Neuroimmunology, including the 10th European School of Neuroimmunology Course, was held by the International Society of Neuroimmunology in Sitges (Barcelona, Spain) on 26–30 October 2010. The conference covered a wide spectrum of issues and challenges in both basic science and clinical aspects of neuroimmunology. Data and ideas were shared through a variety of programs, including review talks and poster sessions. One of the topics of the congress was whether multiple sclerosis is a homogenous or heterogenous disease, clinically and pathologically, throughout its course. PMID:21426254

  5. Isotropic homogeneous universe with viscous fluid

    SciTech Connect

    Santos, N.O.; Dias, R.S.; Banerjee, A.

    1985-04-01

    Exact solutions are obtained for the isotropic homogeneous cosmological model with a viscous fluid. The fluid has only bulk viscosity, and the viscosity coefficient is taken to be a power function of the mass density. The assumed equation of state obeys a linear relation between mass density and pressure. The models satisfying Hawking's energy conditions are discussed. Murphy's model is only a special case of this general set of solutions, and it is shown that Murphy's conclusion that the introduction of bulk viscosity can avoid the occurrence of a space-time singularity at a finite past is not, in general, valid.
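
The mechanism at issue, bulk viscosity reducing the effective pressure and so slowing the dilution of the cosmic fluid, can be seen in a toy forward-Euler integration of the flat isotropic model with ζ = αρⁿ. This is an illustrative sketch in units with 8πG = c = 1 and invented parameter values, not the paper's exact solutions:

```python
def evolve_density(rho0, gamma, alpha, n, dt, steps):
    """Flat FRW model with bulk viscosity zeta = alpha * rho**n:
        H**2    = rho / 3
        rho_dot = -3*H*(gamma*rho - 3*zeta*H)
    where p = (gamma - 1)*rho and -3*zeta*H is the viscous pressure term."""
    rho = rho0
    history = [rho]
    for _ in range(steps):
        H = (rho / 3.0) ** 0.5
        zeta = alpha * rho ** n
        rho += dt * (-3.0 * H * (gamma * rho - 3.0 * zeta * H))
        history.append(rho)
    return history

inviscid = evolve_density(1.0, gamma=4.0 / 3.0, alpha=0.0, n=0.5, dt=0.01, steps=200)
viscous  = evolve_density(1.0, gamma=4.0 / 3.0, alpha=0.1, n=0.5, dt=0.01, steps=200)
```

The viscous run dilutes more slowly than the inviscid one, the behaviour underlying Murphy-type singularity-free proposals; whether the past singularity is actually avoided depends on the full exact solution, which is the paper's point.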

  6. Relativistic effects in homogeneous gold catalysis.

    PubMed

    Gorin, David J; Toste, F Dean

    2007-03-22

    Transition-metal catalysts containing gold present new opportunities for chemical synthesis, and it is therefore not surprising that these complexes are beginning to capture the attention of the chemical community. Cationic phosphine-gold(i) complexes are especially versatile and selective catalysts for a growing number of synthetic transformations. The reactivity of these species can be understood in the context of theoretical studies on gold; relativistic effects are especially helpful in rationalizing the reaction manifolds available to gold catalysts. This Review draws on experimental and computational data to present our current understanding of homogeneous gold catalysis, focusing on previously unexplored reactivity and its application to the development of new methodology.

  7. Compressible homogeneous shear: Simulation and modeling

    NASA Technical Reports Server (NTRS)

    Sarkar, S.; Erlebacher, G.; Hussaini, M. Y.

    1992-01-01

    Compressibility effects were studied on turbulence by direct numerical simulation of homogeneous shear flow. A primary observation is that the growth of the turbulent kinetic energy decreases with increasing turbulent Mach number. The sinks provided by compressible dissipation and the pressure dilatation, along with reduced Reynolds shear stress, are shown to contribute to the reduced growth of kinetic energy. Models are proposed for these dilatational terms and verified by direct comparison with the simulations. The differences between the incompressible and compressible fields are brought out by the examination of spectra, statistical moments, and structure of the rate of strain tensor.

  8. Effect of ultrasonic homogenization on the Vis/NIR bulk optical properties of milk.

    PubMed

    Aernouts, Ben; Van Beers, Robbe; Watté, Rodrigo; Huybrechts, Tjebbe; Jordens, Jeroen; Vermeulen, Daniel; Van Gerven, Tom; Lammertyn, Jeroen; Saeys, Wouter

    2015-02-01

    The size of colloidal particles in food products has a considerable impact on the product's physicochemical, functional and sensory characteristics. Measurement techniques to monitor the size of suspended particles could, therefore, help to further reduce the variability in production processes and promote the development of new food products with improved properties. Visible and near-infrared (Vis/NIR) spectroscopy is already widely used to measure the composition of agricultural and food products, but the technology can also be used to acquire microstructure-related scattering properties of food products. In this study, the effect of the fat globule size on the Vis/NIR bulk scattering properties of milk was investigated. Variability in fat globule size distribution was created using ultrasonic homogenization of raw milk. Reduction of the fat globule size resulted in a stronger wavelength-dependency of both the Vis/NIR bulk scattering coefficient and the scattering anisotropy factor. Moreover, the anisotropy factor and the bulk scattering coefficients for wavelengths above 600 nm were reduced and were dominated by Rayleigh scattering. Additionally, the bulk scattering properties could be well (R² ≥ 0.990) estimated from measured particle size distributions using an algorithm based on the Mie solution. Future research could aim at the inversion of this model to estimate particle size distributions from Vis/NIR spectroscopic measurements. PMID:25604617

  9. Exploring Earthquake Databases for the Creation of Magnitude-Homogeneous Catalogues: Tools for Application on a Regional and Global Scale

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Pagani, M.; Garcia, J.

    2016-06-01

    The creation of a magnitude-homogenised catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenising multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins, and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Center (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilise this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonise magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonised into moment-magnitude to form a catalogue of more than 562,840 events. This extended catalogue, whilst not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
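The harmonisation step described above (selecting a preferred solution per event and converting native magnitude scales into moment magnitude with empirical models) can be sketched as follows; the conversion coefficients and preference order are placeholders for illustration, not those used by the actual tools.

```python
# Placeholder linear regressions from native scales to Mw (invented values).
CONVERSIONS = {
    "Mw": lambda m: m,                # already in the target scale
    "Ms": lambda m: 0.67 * m + 2.07,  # hypothetical Ms -> Mw model
    "mb": lambda m: 0.85 * m + 1.03,  # hypothetical mb -> Mw model
}

def harmonise(events):
    """Return (event_id, Mw) pairs using the first convertible scale per event."""
    out = []
    for ev in events:
        for scale in ("Mw", "Ms", "mb"):  # preference order (assumed)
            if scale in ev["magnitudes"]:
                mw = CONVERSIONS[scale](ev["magnitudes"][scale])
                out.append((ev["id"], round(mw, 2)))
                break
    return out

catalogue = [
    {"id": "ev1", "magnitudes": {"Mw": 6.1}},
    {"id": "ev2", "magnitudes": {"mb": 5.0, "Ms": 4.8}},
]
result = harmonise(catalogue)
```

In practice each regression would itself be derived from paired observations across bulletins, which is the part the described tools automate.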

  10. Bio-inspired homogeneous multi-scale place recognition.

    PubMed

    Chen, Zetao; Lowry, Stephanie; Jacobson, Adam; Hasselmo, Michael E; Milford, Michael

    2015-12-01

    Robotic mapping and localization systems typically operate at either one fixed spatial scale, or over two, combining a local metric map and a global topological map. In contrast, recent high-profile discoveries in neuroscience have indicated that animals such as rodents navigate the world using multiple parallel maps, with each map encoding the world at a specific spatial scale. While a number of theoretical-only investigations have hypothesized several possible benefits of such a multi-scale mapping system, no one has comprehensively investigated the potential mapping and place recognition performance benefits for navigating robots in large real-world environments, especially using more than two homogeneous map scales. In this paper we present a biologically inspired multi-scale mapping system mimicking the rodent multi-scale map. Unlike hybrid metric-topological multi-scale robot mapping systems, this new system is homogeneous, distinguishable only by scale, like rodent neural maps. We present methods for training each network to learn and recognize places at a specific spatial scale, and techniques for combining the output from each of these parallel networks. This approach differs from traditional probabilistic robotic methods, where place recognition spatial specificity is passively driven by models of sensor uncertainty. Instead we intentionally create parallel learning systems that learn associations between sensory input and the environment at different spatial scales. We also conduct a systematic series of experiments and parameter studies that determine the effect on performance of using different neural map scaling ratios and different numbers of discrete map scales. The results demonstrate that a multi-scale approach universally improves place recognition performance and is capable of producing better performance than existing state-of-the-art robotic navigation algorithms.
We analyze the results and discuss the implications with respect to

  12. Primary healthcare solo practices: homogeneous or heterogeneous?

    PubMed

    Pineault, Raynald; Borgès Da Silva, Roxane; Provost, Sylvie; Beaulieu, Marie-Dominique; Boivin, Antoine; Couture, Audrey; Prud'homme, Alexandre

    2014-01-01

    Introduction. Solo practices have generally been viewed as forming a homogeneous group. However, they may differ on many characteristics. The objective of this paper is to identify different forms of solo practice and to determine the extent to which they are associated with patient experience of care. Methods. Two surveys were carried out in two regions of Quebec in 2010: a telephone survey of 9180 respondents from the general population and a postal survey of 606 primary healthcare (PHC) practices. Data from the two surveys were linked through the respondent's usual source of care. A taxonomy of solo practices was constructed (n = 213), using cluster analysis techniques. Bivariate and multilevel analyses were used to determine the relationship of the taxonomy with patient experience of care. Results. Four models were derived from the taxonomy. Practices in the "resourceful networked" model contrast with those of the "resourceless isolated" model to the extent that the experience of care reported by their patients is more favorable. Conclusion. Solo practice is not a homogeneous group. The four models identified have different organizational features and their patients' experience of care also differs. Some models seem to offer a better organizational potential in the context of current reforms.
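As a purely illustrative aside, the taxonomy step above ("cluster analysis techniques") can be sketched with a minimal k-means over two made-up organizational features; the feature values, starting centres and group labels below are invented, not taken from the study.

```python
def kmeans(points, centers, iters=10):
    """Plain k-means on feature tuples; returns final centres and grouped points."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            groups[nearest].append(p)
        # Recompute each centre as the mean of its group (keep old centre if empty).
        centers = [tuple(sum(col) / len(g) for col in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups

# Hypothetical features: (resources score, network-ties score) per practice.
practices = [(0, 0), (0, 1), (1, 0),      # "resourceless isolated"-like
             (10, 10), (10, 9), (9, 10)]  # "resourceful networked"-like
centers, groups = kmeans(practices, [(1.0, 1.0), (9.0, 9.0)])
```

The study's actual taxonomy would rest on many more organizational variables and a validated choice of cluster number; this only shows the mechanics.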

  13. Oscillating Instantons as Homogeneous Tunneling Channels

    NASA Astrophysics Data System (ADS)

    Lee, Bum-Hoon; Lee, Wonwoo; Yeom, Dong-Han

    2013-07-01

    In this paper, we study Einstein gravity with a minimally coupled scalar field with a potential, assuming an O(4)-symmetric metric ansatz. We call a Euclidean instanton an oscillating instanton if there exists a point where the derivatives of the scale factor and the scalar field vanish simultaneously. We then prove that an oscillating instanton can be analytically continued along both inhomogeneous and homogeneous tunneling channels. Here, we focus especially on the possibility of a homogeneous tunneling channel. For such an instanton to exist, three conditions must hold: (1) the potential should have a local maximum whose curvature is sufficiently large, (2) it should have a local minimum, and (3) the other side of the potential should have a sufficiently deeper vacuum. We then show that there exist a number of oscillating instanton solutions and that their probabilities are higher than those of the Hawking-Moss instantons. We also examine when the oscillating instantons are comparable with the Coleman-de Luccia channels. Thus, for a general vacuum decay problem, the oscillating instanton channels should not be ignored.

  14. Emergence of Leadership within a Homogeneous Group

    PubMed Central

    Eskridge, Brent E.; Valle, Elizabeth; Schlupp, Ingo

    2015-01-01

    Large-scale coordination without dominant, consistent leadership is frequent in nature. How individuals emerge from within the group as leaders, however transitory this position may be, has become an increasingly common question. This question is further complicated by the fact that in many of these aggregations, differences between individuals are minor and the group is largely considered to be homogeneous. In the simulations presented here, we investigate the emergence of leadership in the extreme situation in which all individuals are initially identical. Using a mathematical model developed from observations of natural systems, we show that adding a simple notion of leadership tendency, inspired by observations of natural systems and shaped by experience, can produce distinct leaders and followers through a nonlinear feedback loop. Most importantly, our results show that small differences in experience can promote the rapid emergence of stable roles for leaders and followers. Our findings have implications for our understanding of adaptive behaviors in initially homogeneous groups, the role experience can play in shaping leadership tendencies, and the use of self-assessment in adapting behavior and, ultimately, self-role-assignment. PMID:26226381

  15. The Statistical Mechanics of Ideal Homogeneous Turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    2002-01-01

    Plasmas, such as those found in the space environment or in plasma confinement devices, are often modeled as electrically conducting fluids. When fluids and plasmas are energetically stirred, regions of highly nonlinear, chaotic behavior known as turbulence arise. Understanding the fundamental nature of turbulence is a long-standing theoretical challenge. The present work describes a statistical theory concerning a certain class of nonlinear, finite dimensional, dynamical models of turbulence. These models arise when the partial differential equations describing incompressible, ideal (i.e., nondissipative) homogeneous fluid and magnetofluid (i.e., plasma) turbulence are Fourier transformed into a very large set of ordinary differential equations. These equations define a divergenceless flow in a high-dimensional phase space, which allows for the existence of a Liouville theorem, guaranteeing a distribution function based on constants of the motion (integral invariants). The novelty of these particular dynamical systems is that there are integral invariants other than the energy, and that some of these invariants behave like pseudoscalars under two of the discrete symmetry transformations of physics, parity, and charge conjugation. In this work the 'rugged invariants' of ideal homogeneous turbulence are shown to be the only significant scalar and pseudoscalar invariants. The discovery that pseudoscalar invariants cause symmetries of the original equations to be dynamically broken and induce a nonergodic structure on the associated phase space is the primary result presented here. Applicability of this result to dissipative turbulence is also discussed.
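For the ideal magnetofluid case, the "rugged invariants" referred to above are the energy together with the cross and magnetic helicities, and the Liouville theorem then admits a canonical ("absolute equilibrium") distribution built from them. As a sketch of the general construction in standard notation (not equations quoted from this report):

$$ D \;=\; Z^{-1}\exp\!\left(-\alpha E \,-\, \beta H_C \,-\, \gamma H_M\right), $$

where $Z$ is a normalizing partition function and $\alpha$, $\beta$, $\gamma$ are inverse-temperature-like parameters conjugate to each invariant. Because $H_C$ and $H_M$ behave as pseudoscalars, ensembles with nonzero $\beta$ or $\gamma$ single out a handedness, which is how the broken symmetries and nonergodic phase-space structure described above can arise.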

  16. On shearing fluids with homogeneous densities

    NASA Astrophysics Data System (ADS)

    Srivastava, D. C.; Srivastava, V. C.; Kumar, Rajesh

    2016-06-01

    In this paper, we study shearing spherically symmetric homogeneous density fluids in comoving coordinates. It is found that the expansion of the four-velocity of a perfect fluid is homogeneous, whereas its shear is generated by an arbitrary function of time M( t), related to the mass function of the distribution. This function is found to bear a functional relationship with density. The field equations are reduced to two coupled first order ordinary differential equations for the metric coefficients g_{11} and g_{22}. We have explored a class of solutions assuming that M is a linear function of the density. This class embodies, as a subcase, the complete class of shear-free solutions. We have discussed the oft-quoted work of Kustaanheimo (Comment Phys Math XIII:12, 1, 1947) and have noted that it deals with shear-free fluids having anisotropic pressure. It is shown that the anisotropy of the fluid is characterized by an arbitrary function of time. We have discussed some issues of historical priorities and credentials related to shear-free solutions. Recent controversial claims by Mitra (Astrophys Space Sci 333:351, 2011 and Gravit Cosmol 18:17, 2012) have also been addressed. We found that the singularity and the shearing motion of the fluid are closely related. Hence, there is a need for a fresh look at the solutions obtained earlier in comoving coordinates.

  17. Homogenization in micro-magneto-mechanics

    NASA Astrophysics Data System (ADS)

    Sridhar, A.; Keip, M.-A.; Miehe, C.

    2016-07-01

    Ferromagnetic materials are characterized by a heterogeneous micro-structure that can be altered by external magnetic and mechanical stimuli. The understanding and the description of the micro-structure evolution is of particular importance for the design and the analysis of smart materials with magneto-mechanical coupling. The macroscopic response of the material results from complex magneto-mechanical interactions occurring on smaller length scales, which are driven by magnetization reorientation and associated magnetic domain wall motions. The aim of this work is to directly base the description of the macroscopic magneto-mechanical material behavior on the micro-magnetic domain evolution. This will be realized by the incorporation of a ferromagnetic phase-field formulation into a macroscopic Boltzmann continuum by the use of computational homogenization. The transition conditions between the two scales are obtained via rigorous exploitation of rate-type and incremental variational principles, which incorporate an extended version of the classical Hill-Mandel macro-homogeneity condition covering the phase field on the micro-scale. An efficient two-scale computational scenario is developed based on an operator splitting scheme that includes a predictor for the magnetization on the micro-scale. Two- and three-dimensional numerical simulations demonstrate the performance of the method. They investigate micro-magnetic domain evolution driven by macroscopic fields as well as the associated overall hysteretic response of ferromagnetic solids.

  18. Population dynamics in non-homogeneous environments

    NASA Astrophysics Data System (ADS)

    Alards, Kim M. J.; Tesser, Francesca; Toschi, Federico

    2014-11-01

    For organisms living in aquatic ecosystems the presence of fluid transport can have a strong influence on the dynamics of populations and on evolution of species. In particular, displacements due to self-propulsion, summed up with turbulent dispersion at larger scales, strongly influence the local densities and thus population and genetic dynamics. Real marine environments are furthermore characterized by a high degree of non-homogeneity. In the case of population fronts propagating in "fast" turbulence, with respect to the population duplication time, the flow effect can be studied by replacing the microscopic diffusivity with an effective turbulent diffusivity. In the opposite case of "slow" turbulence the advection by the flow has to be considered locally. Here we employ numerical simulations to study the influence of non-homogeneities in the diffusion coefficient of reacting individuals of different species expanding in a two-dimensional space. Moreover, to explore the influence of advection, we consider a population expanding in the presence of simple velocity fields like cellular flows. The output is analyzed in terms of front roughness, front shape, propagation speed and, concerning the genetics, by means of heterozygosity and local and global extinction probabilities.
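For context, the baseline against which such effects are measured is the classical Fisher-KPP front, whose speed follows from the diffusivity and growth rate; in the "fast turbulence" regime described above the microscopic diffusivity is simply replaced by an effective turbulent one. A minimal sketch with illustrative values (not the paper's parameters):

```python
import math

# Fisher-KPP pulled-front speed for logistic growth rate r and diffusivity D:
#   v = 2 * sqrt(D * r)
def front_speed(diffusivity, growth_rate):
    return 2.0 * math.sqrt(diffusivity * growth_rate)

v_micro = front_speed(1e-3, 1.0)  # microscopic diffusivity only
v_turb = front_speed(1e-1, 1.0)   # enhanced, turbulence-dominated mixing
```

The square-root dependence means even a hundredfold enhancement of mixing speeds the front up only tenfold, which is why local (non-homogeneous) diffusivity variations matter in the "slow" regime.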

  19. Si isotope homogeneity of the solar nebula

    SciTech Connect

    Pringle, Emily A.; Savage, Paul S.; Moynier, Frédéric; Jackson, Matthew G.; Barrat, Jean-Alix E-mail: savage@levee.wustl.edu E-mail: moynier@ipgp.fr E-mail: Jean-Alix.Barrat@univ-brest.fr

    2013-12-20

    The presence or absence of variations in the mass-independent abundances of Si isotopes in bulk meteorites provides important clues concerning the evolution of the early solar system. No Si isotopic anomalies have been found within the level of analytical precision of 15 ppm in {sup 29}Si/{sup 28}Si across a wide range of inner solar system materials, including terrestrial basalts, chondrites, and achondrites. A possible exception is the angrites, which may exhibit small excesses of {sup 29}Si. However, the general absence of anomalies suggests that primitive meteorites and differentiated planetesimals formed in a reservoir that was isotopically homogeneous with respect to Si. Furthermore, the lack of resolvable anomalies in the calcium-aluminum-rich inclusion measured here suggests that any nucleosynthetic anomalies in Si isotopes were erased through mixing in the solar nebula prior to the formation of refractory solids. The homogeneity exhibited by Si isotopes may have implications for the distribution of Mg isotopes in the solar nebula. Based on supernova nucleosynthetic yield calculations, the expected magnitude of heavy-isotope overabundance is larger for Si than for Mg, suggesting that any potential Mg heterogeneity, if present, exists below the 15 ppm level.
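For reference, the quoted 15 ppm precision applies to mass-independent deviations of the 29Si/28Si ratio from a terrestrial standard, conventionally reported as parts-per-million deviations (μ-notation); the ratios below are invented for illustration.

```python
def mu_29si(r_sample, r_standard):
    """Deviation of a sample's 29Si/28Si ratio from the standard, in ppm."""
    return (r_sample / r_standard - 1.0) * 1e6

R_STD = 0.0508                   # illustrative standard 29Si/28Si ratio
sample = R_STD * (1 + 8e-6)      # a sample constructed to sit 8 ppm high
anomaly = mu_29si(sample, R_STD)
resolved = abs(anomaly) > 15     # resolvable at the stated 15 ppm precision?
```

An 8 ppm anomaly like this one would fall below the detection threshold, consistent with the "no resolvable anomalies" conclusion above.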

  20. Computational approaches to homogeneous gold catalysis.

    PubMed

    Faza, Olalla Nieto; López, Carlos Silva

    2015-01-01

    Homogeneous gold catalysis has expanded at an outstanding pace over the last decade. The best described reactivity of Au(I) and Au(III) species is based on gold's properties as a soft Lewis acid, but new reactivity patterns have recently emerged which further expand the range of transformations achievable using gold catalysis, with examples of dual gold activation, hydrogenation reactions, or Au(I)/Au(III) catalytic cycles. In this scenario, to fully develop all these new possibilities, the use of computational tools to understand at an atomistic level of detail the complete role of gold as a catalyst is unavoidable. In this work we aim to provide a comprehensive review of the available benchmark works on methodological options to study homogeneous gold catalysis, in the hope that this effort can help guide the choice of method in future mechanistic studies involving gold complexes. This is relevant because a representative number of current mechanistic studies still use methods which have been reported as inappropriate and dangerously inaccurate for this chemistry. Together with this, we describe a number of recent mechanistic studies where computational chemistry has provided relevant insights into non-conventional reaction paths, unexpected selectivities or novel reactivity, which illustrate the complexity behind gold-mediated organic chemistry.

  1. Role of structural barriers for carotenoid bioaccessibility upon high pressure homogenization.

    PubMed

    Palmero, Paola; Panozzo, Agnese; Colle, Ines; Chigwedere, Claire; Hendrickx, Marc; Van Loey, Ann

    2016-05-15

    A specific approach to investigate the effect of high pressure homogenization on the carotenoid bioaccessibility in tomato-based products was developed. Six different tomato-based model systems were reconstituted in order to target the specific role of the natural structural barriers (chromoplast substructure/cell wall) and of the phases (soluble/insoluble) in determining the carotenoid bioaccessibility and viscosity changes upon high pressure homogenization. Results indicated that in the absence of natural structural barriers (carotenoid-enriched oil), the soluble and insoluble phases determined the carotenoid bioaccessibility upon processing, whereas, in their presence, these barriers governed the bioaccessibility. Furthermore, it was shown that the increase in viscosity upon high pressure homogenization is determined by the presence of the insoluble phase, although this effect was related to the initial ratio of the soluble to insoluble phases in the system. In addition, no relationship between the changes in viscosity and carotenoid bioaccessibility upon high pressure homogenization was found.

  2. Rotational homogeneity in graphene grown on Au(111)

    NASA Astrophysics Data System (ADS)

    Wofford, Joseph; Starodub, Elena; Walter, Andrew; Nie, Shu; Bostwick, Aaron; Bartelt, Norman; Thürmer, Konrad; Rotenberg, Eli; McCarty, Kevin; Dubon, Oscar

    2012-02-01

    The set of properties offered by the (111) surface of gold makes it intriguing as a platform on which to study the fundamental processes that underpin graphene growth on metals. Among these are the low carbon solubility and an interaction strength with graphene that is predicted to be smaller than that of most transition metals. We have investigated this synthesis process using low-energy electron microscopy and diffraction to monitor the sample surface in real time, and found that the resulting graphene film possesses a remarkable degree of rotational homogeneity. The dominant orientation of the graphene is aligned with the Au lattice, with a small minority rotated by 30 degrees. The origins of this in-plane structuring are puzzling because angle-resolved photoemission spectroscopy and scanning tunneling microscopy experiments both suggest only a relatively small interaction between the two materials. Finally, the implications of these findings for the growth of high structural-quality graphene films are discussed.

  3. Creating Spaces for Literacy, Creating Spaces for Learning

    ERIC Educational Resources Information Center

    Howard, Christy

    2016-01-01

    This study represents the practices of a middle school social studies teacher as she focuses on integrating questioning, reading, and writing in her content area. This teacher uses literacy strategies to engage students in practices of reading multiple texts and writing to showcase learning. She creates opportunities for students to make…

  4. Concordance and discordance between taxonomic and functional homogenization: responses of soil mite assemblages to forest conversion.

    PubMed

    Mori, Akira S; Ota, Aino T; Fujii, Saori; Seino, Tatsuyuki; Kabeya, Daisuke; Okamoto, Toru; Ito, Masamichi T; Kaneko, Nobuhiro; Hasegawa, Motohiro

    2015-10-01

    The compositional characteristics of ecological assemblages are often simplified; this process is termed "biotic homogenization." This process of biological reorganization occurs not only taxonomically but also functionally. Testing both aspects of homogenization is essential when ecosystem functioning, supported by a diverse mosaic of functional traits in the landscape, is of concern. Here, we aimed to infer the underlying processes of taxonomic/functional homogenization at the local scale, a scale that is meaningful for this research question. We recorded species of litter-dwelling oribatid mites along a gradient of forest conversion from a natural forest to a monoculture larch plantation in Japan (in total 11 stands), and collected data on the functional traits of the recorded species to quantify functional diversity. We calculated the taxonomic and functional β-diversity, an index of biotic homogenization. We found that both the taxonomic and functional β-diversity decreased with larch dominance (stand homogenization). After further deconstructing β-diversity into the components of turnover and nestedness, which reflect different processes of community organization, a significant decrease in response to larch dominance was observed only for the functional turnover. As a result, there was a steeper decline in the functional β-diversity than in the taxonomic β-diversity. This discordance between the taxonomic and functional responses suggests that species replacement occurs between species that are functionally redundant under environmental homogenization, ultimately leading to stronger homogenization of functional diversity. The insights gained from the community organization of oribatid mites suggest that the functional characteristics of local assemblages, which support the functionality of ecosystems, are of particular concern in human-dominated forest landscapes. PMID:26001603
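The turnover/nestedness decomposition used above follows the standard partitioning of Sørensen dissimilarity into a Simpson (turnover) component and a nestedness-resultant remainder; a minimal sketch with invented species lists, not the study's data:

```python
def partition_beta(site1, site2):
    """Partition pairwise Sorensen dissimilarity into turnover and nestedness."""
    a = len(site1 & site2)                  # shared species
    b = len(site1 - site2)                  # unique to site 1
    c = len(site2 - site1)                  # unique to site 2
    beta_sor = (b + c) / (2 * a + b + c)    # total dissimilarity
    beta_sim = min(b, c) / (a + min(b, c))  # turnover (Simpson) component
    return beta_sor, beta_sim, beta_sor - beta_sim  # last term = nestedness

# Invented assemblages standing in for a natural stand and a larch plantation.
natural = {"sp1", "sp2", "sp3", "sp4"}
plantation = {"sp1", "sp2", "sp5"}
sor, sim, nes = partition_beta(natural, plantation)
```

Averaging these components over all site pairs along the conversion gradient is what lets the study attribute declining β-diversity to turnover rather than nestedness.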

  6. Simple circuit to improve electric field homogeneity in contour-clamped homogeneous electric field chambers.

    PubMed

    Herrera, José A; Canino, Carlos A; López-Cánovas, Lilia; Gigato, Regnar; Riverón, Ana Maria

    2003-04-01

    We redesigned contour-clamped homogeneous electric field (CHEF) circuitry to eliminate crossover distortion, to set identical potentials at the electrodes of each equipotential pair, and to drive the pairs with transistors in emitter-follower stages. An equipotential pair comprises the two electrodes set at the same potential to provide electric field homogeneity inside the hexagonal array. The new circuitry consists of two identical circuits, each having a resistor ladder, diodes and transistors. Both circuits are interconnected by diodes that control the current flow to the electrodes when the array is energized in the 'A' or 'B' direction of the electric field. The total number of transistors is two-thirds of the total number of electrodes. The average voltage deviation from the potentials expected at the electrodes to achieve a homogeneous electric field was 0.06 V, whereas 0.44 V was obtained with another circuit that used transistors in push-pull stages. The new voltage clamp unit is cheap, generates a homogeneous electric field, and gives reproducible and undistorted DNA band patterns. PMID:12707904
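A simplified sketch of the clamping idea (not the published circuit): each electrode on the hexagonal contour is held at the potential an ideal uniform field would have at its position, V(y) = -E*y, so electrodes sharing a y-coordinate form the equipotential pairs mentioned above. The geometry and field values below are illustrative assumptions.

```python
import math

def electrode_potentials(n_per_side, radius, field):
    """Clamp targets for electrodes on a regular hexagon, uniform field along y."""
    electrodes = []
    for side in range(6):
        a0, a1 = math.pi / 3 * side, math.pi / 3 * (side + 1)
        x0, y0 = radius * math.cos(a0), radius * math.sin(a0)
        x1, y1 = radius * math.cos(a1), radius * math.sin(a1)
        for j in range(n_per_side):           # n_per_side electrodes per side
            t = j / n_per_side
            x = round(x0 + t * (x1 - x0), 6)
            y = round(y0 + t * (y1 - y0), 6)
            electrodes.append((x, y, round(-field * y, 6)))
    return electrodes

pots = electrode_potentials(4, 1.0, 10.0)  # 24 electrodes, E = 10 V per unit length
```

The circuit's resistor ladder and emitter followers are what hold each electrode near these target potentials under load; the 0.06 V figure above is the measured deviation from them.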

  8. Homogeneous catalyst formulations for methanol production

    DOEpatents

    Mahajan, Devinder; Sapienza, Richard S.; Slegeir, William A.; O'Hare, Thomas E.

    1990-01-01

    There is disclosed synthesis of CH.sub.3 OH from carbon monoxide and hydrogen using an extremely active homogeneous catalyst for methanol synthesis directly from synthesis gas. The catalyst operates preferably between 100.degree.-150.degree. C. and preferably at 100-150 psia synthesis gas to produce methanol. Use can be made of syngas mixtures which contain considerable quantities of other gases, such as nitrogen, methane or excess hydrogen. The catalyst is composed of two components: (a) a transition metal carbonyl complex and (b) an alkoxide component. In the simplest formulation, component (a) is a complex of nickel tetracarbonyl and component (b) is methoxide (CH.sub.3 O.sup.-), both being dissolved in a methanol solvent system. The presence of a co-solvent such as p-dioxane, THF, polyalcohols, ethers, hydrocarbons, and crown ethers accelerates the methanol synthesis reaction.

  9. Homogeneous catalyst formulations for methanol production

    DOEpatents

    Mahajan, Devinder; Sapienza, Richard S.; Slegeir, William A.; O'Hare, Thomas E.

    1991-02-12

    There is disclosed synthesis of CH.sub.3 OH from carbon monoxide and hydrogen using an extremely active homogeneous catalyst for methanol synthesis directly from synthesis gas. The catalyst operates preferably between 100.degree.-150.degree. C. and preferably at 100-150 psia synthesis gas to produce methanol. Use can be made of syngas mixtures which contain considerable quantities of other gases, such as nitrogen, methane or excess hydrogen. The catalyst is composed of two components: (a) a transition metal carbonyl complex and (b) an alkoxide component. In the simplest formulation, component (a) is a complex of nickel tetracarbonyl and component (b) is methoxide (CH.sub.3 O.sup.-), both being dissolved in a methanol solvent system. The presence of a co-solvent such as p-dioxane, THF, polyalcohols, ethers, hydrocarbons, and crown ethers accelerates the methanol synthesis reaction.

  10. Soliton production with nonlinear homogeneous lines

    DOE PAGES

    Elizondo-Decanini, Juan M.; Coleman, Phillip D.; Moorman, Matthew W.; Petney, Sharon Joy Victor; Dudley, Evan C.; Youngman, Kevin; Penner, Tim Dwight; Fang, Lu; Myers, Katherine M.

    2015-11-24

    Low- and high-voltage Soliton waves were produced and used to demonstrate collision and compression using diode-based nonlinear transmission lines. Experiments demonstrate soliton addition and compression using homogeneous nonlinear lines. We built the nonlinear lines using commercially available diodes. These diodes are chosen after their capacitance versus voltage dependence is used in a model and the line design characteristics are calculated and simulated. Nonlinear ceramic capacitors are then used to demonstrate high-voltage pulse amplification and compression. The line is designed such that a simple capacitor discharge, input signal, develops soliton trains in as few as 12 stages. We also demonstrated output voltages in excess of 40 kV using Y5V-based commercial capacitors. The results show some key features that determine efficient production of trains of solitons in the kilovolt range.
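    The mechanism described — a step input steepening and breaking into a soliton train on a diode-loaded LC ladder — can be illustrated with a toy lattice integration. The C(V) law and all component values below are assumptions for illustration, not the paper's measured diode data:

```python
import numpy as np

def simulate_nltl(n_stages=60, steps=4000, dt=1e-10,
                  L=1e-7, C0=1e-10, V0=5.0, v_in=10.0, t_on=50e-9):
    """Semi-implicit time stepping of an LC ladder whose shunt capacitance
    C(V) = C0 / (1 + V/V0) falls with voltage, a crude stand-in for a
    reverse-biased diode. Returns the final node voltages and the peak
    voltage seen anywhere on the line during the run."""
    V = np.zeros(n_stages)       # node voltages
    I = np.zeros(n_stages + 1)   # inductor (branch) currents
    v_peak = 0.0
    for step in range(steps):
        # Capacitor-discharge-like drive: a step that switches off at t_on.
        v_drive = v_in if step * dt < t_on else 0.0
        I[0] += dt * (v_drive - V[0]) / L
        I[1:-1] += dt * (V[:-1] - V[1:]) / L
        I[-1] = 0.0              # open-circuit termination
        C = C0 / (1.0 + np.clip(V, 0.0, None) / V0)
        V += dt * (I[:-1] - I[1:]) / C
        v_peak = max(v_peak, float(V.max()))
    return V, v_peak
```

    Because C shrinks as V rises, crests travel faster than troughs, so the wavefront steepens and, over enough stages, decomposes into a train of pulses; reflection at the open end further raises the peak, consistent with the pulse-amplification behavior the abstract reports.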

  11. Autophoretic self-propulsion of homogeneous particles

    NASA Astrophysics Data System (ADS)

    Michelin, Sebastien; Lauga, Eric; de Canio, Gabriele

    2014-11-01

    Phoretic mechanisms such as diffusiophoresis exploit short-ranged interactions between solute molecules in the fluid and a rigid wall to generate local slip velocities in the presence of solute gradients along the solid boundary. This boundary flow can result in macroscopic fluid motion or phoretic migration of inert particles. These mechanisms have recently received renewed interest as a route to designing self-propelled ``autophoretic'' systems able to generate the required solute gradients through chemical reaction at their surface. Most existing designs rely on the asymmetric chemical treatment of the particle's surface to guarantee symmetry-breaking and the generation of a net flow. We show here, however, that chemical asymmetry is not necessary for flow generation and that homogeneous particles with asymmetric geometry may lead to self-propulsion in Stokes flow. Similarly, this principle can be used to manufacture micro-pumps using channel walls with uniform chemical properties.

  12. Exact vectorial law for homogeneous rotating turbulence.

    PubMed

    Galtier, Sébastien

    2009-10-01

    Three-dimensional hydrodynamic turbulence is investigated under the assumptions of homogeneity and weak axisymmetry. Following the kinematics developed by E. Lindborg [J. Fluid Mech. 302, 179 (1995)] we rewrite the von Kármán-Howarth equation in terms of measurable correlations and derive the exact relation associated with the flux conservation. This relation is then analyzed in the particular case of turbulence subject to solid-body rotation. We make the ansatz that the development of anisotropy implies an algebraic relation between the axial and the radial components of the separation vector r and we derive an exact vectorial law which is parametrized by the intensity of anisotropy. A simple dimensional analysis allows us to fix this parameter and find a unique expression.

  13. RF Spectroscopy on a Homogeneous Fermi Gas

    NASA Astrophysics Data System (ADS)

    Yan, Zhenjie; Mukherjee, Biswaroop; Patel, Parth; Struck, Julian; Zwierlein, Martin

    2016-05-01

    Over the last two decades, RF spectroscopy has been established as an indispensable tool for probing a large variety of fundamental properties of strongly interacting Fermi gases, ranging from the pairing gap and Tan's contact to the quasi-particle weight of Fermi polarons. So far, most RF spectroscopy experiments have been performed in harmonic traps, resulting in a response averaged over different densities. We have realized a uniform optical potential for ultracold Fermi gases of 6Li atoms, which allows us to avoid the usual problems associated with inhomogeneous systems. Here we present recent results on RF spectroscopy of these homogeneous samples with a high signal-to-noise ratio. In addition, we report progress on measuring the contact of a unitary Fermi gas across the normal-to-superfluid transition.

  14. Soliton production with nonlinear homogeneous lines

    SciTech Connect

    Elizondo-Decanini, Juan M.; Coleman, Phillip D.; Moorman, Matthew W.; Petney, Sharon Joy Victor; Dudley, Evan C.; Youngman, Kevin; Penner, Tim Dwight; Fang, Lu; Myers, Katherine M.

    2015-11-24

    Low- and high-voltage Soliton waves were produced and used to demonstrate collision and compression using diode-based nonlinear transmission lines. Experiments demonstrate soliton addition and compression using homogeneous nonlinear lines. We built the nonlinear lines using commercially available diodes. These diodes are chosen after their capacitance versus voltage dependence is used in a model and the line design characteristics are calculated and simulated. Nonlinear ceramic capacitors are then used to demonstrate high-voltage pulse amplification and compression. The line is designed such that a simple capacitor discharge, input signal, develops soliton trains in as few as 12 stages. We also demonstrated output voltages in excess of 40 kV using Y5V-based commercial capacitors. The results show some key features that determine efficient production of trains of solitons in the kilovolt range.

  15. Cloaking with optimized homogeneous anisotropic layers

    NASA Astrophysics Data System (ADS)

    Popa, Bogdan-Ioan; Cummer, Steven A.

    2009-02-01

    We present a method to reduce the scattering from arbitrary objects by surrounding them with shells composed of several layers of homogeneous anisotropic materials. An optimization procedure is used to find the material parameters for each layer, the starting point of which is a discretized approximation of a coordinate transformation cloaking shell. We show that an optimized, three-layer shell can reduce the maximum scattering of an object by as much as 15 dB more than a 100-layer realization of a coordinate transformation cloaking shell. Moreover, using an optimization procedure can yield high-performance cloaking shell solutions that also meet external constraints, such as the maximum value of permittivity or permeability. This design approach can substantially simplify the fabrication of moderate-size cloaking shells.

  16. Homogeneously dispersed multimetal oxygen-evolving catalysts.

    PubMed

    Zhang, Bo; Zheng, Xueli; Voznyy, Oleksandr; Comin, Riccardo; Bajdich, Michal; García-Melchor, Max; Han, Lili; Xu, Jixian; Liu, Min; Zheng, Lirong; García de Arquer, F Pelayo; Dinh, Cao Thang; Fan, Fengjia; Yuan, Mingjian; Yassitepe, Emre; Chen, Ning; Regier, Tom; Liu, Pengfei; Li, Yuhang; De Luna, Phil; Janmohamed, Alyf; Xin, Huolin L; Yang, Huagui; Vojvodic, Aleksandra; Sargent, Edward H

    2016-04-15

    Earth-abundant first-row (3d) transition metal-based catalysts have been developed for the oxygen-evolution reaction (OER); however, they operate at overpotentials substantially above thermodynamic requirements. Density functional theory suggested that non-3d high-valency metals such as tungsten can modulate 3d metal oxides, providing near-optimal adsorption energies for OER intermediates. We developed a room-temperature synthesis to produce gelled oxyhydroxide materials with an atomically homogeneous metal distribution. These gelled FeCoW oxyhydroxides exhibit the lowest overpotential (191 millivolts) reported at 10 milliamperes per square centimeter in alkaline electrolyte. The catalyst shows no evidence of degradation after more than 500 hours of operation. X-ray absorption and computational studies reveal a synergistic interplay between tungsten, iron, and cobalt in producing a favorable local coordination environment and electronic structure that enhance the energetics for OER. PMID:27013427

  17. Consistency of homogenization schemes in linear poroelasticity

    NASA Astrophysics Data System (ADS)

    Pichler, Bernhard; Dormieux, Luc

    2008-08-01

    In view of extending classical micromechanics of poroelasticity to the non-saturated regime, one has to deal with different pore stresses which may be affected by the size and the shape of the pores. Introducing the macrostrain and these pore stresses as loading parameters, the macrostress of a representative volume element of a porous material can be derived by means of Levin's theorem or by means of the direct formulation of the stress average rule, respectively. A consistency requirement for a given homogenization scheme is obtained from the condition that the two approaches should yield identical results. Classical approaches (Mori-Tanaka scheme, self-consistent scheme) are shown to be only conditionally consistent. In contrast, the Ponte Castañeda-Willis scheme proves to provide consistent descriptions both of porous matrix-inclusion composites and of porous polycrystals. To cite this article: B. Pichler, L. Dormieux, C. R. Mecanique 336 (2008).

  18. Homogenization analysis of complementary waveguide metamaterials

    NASA Astrophysics Data System (ADS)

    Landy, Nathan; Hunt, John; Smith, David R.

    2013-11-01

    We analyze the properties of complementary metamaterials as effective inclusions patterned into the conducting walls of metal waveguide structures. We show that guided wave metamaterials can be homogenized using the same retrieval techniques used for volumetric metamaterials, leading to a description in which a given complementary element is conceptually replaced by a block of material within the waveguide whose effective permittivity and permeability result in equivalent scattering characteristics. The use of effective constitutive parameters for waveguide materials provides an alternative point of view for the design of waveguide and microstrip based components, including planar lenses and filters, as well as devices whose behavior is derived from a bulk material response. In addition to imparting effective constitutive properties to the waveguide, complementary metamaterials also couple energy from waveguide modes into radiation. Thus, complementary waveguide metamaterials can be used to modify and optimize a variety of antenna structures.

  19. Data Homogenization of the NOAA Long-Term Ozonesonde Records

    NASA Astrophysics Data System (ADS)

    Johnson, B.; Cullis, P.; Sterling, C. W.; Jordan, A. F.; Hall, E. G.; Petropavlovskikh, I. V.; Oltmans, S. J.; Mcconville, G.

    2015-12-01

    The NOAA long-term balloon-borne ozonesonde sites at Boulder, Colorado; Hilo, Hawaii; and South Pole Station, Antarctica have measured weekly ozone profiles for more than three decades. The ozonesonde consists of an electrochemical concentration cell (ECC) sensor interfaced with a weather radiosonde that transmits high resolution ozone and meteorological data during ascent from the surface to 30-35 km altitude. Over this 30-year period there have been several model changes in the commercially available ECC ozonesondes and radiosondes, as well as three adjustments in the ozone sensor solution composition at NOAA. These changes were aimed at optimizing the ozonesonde performance. Organized intercomparison campaigns conducted at the environmental simulation facility at the Research Centre Juelich, Germany, and international field site testing have been the primary means of assessing new designs, instruments, or sensor solution changes and developing standard operating procedures. NOAA has also performed in-house laboratory tests and launched 28 dual ozonesondes at various sites since 1994 to provide further comparison data to determine the optimum homogenized data set. The final homogenization effort involved reviewing and editing several thousand individual ozonesonde profiles, followed by applying the optimum correction algorithms for changes in sensor solution composition. The resulting improved data sets will be shown with long-term trends and uncertainties at various altitude levels.

  20. Homogeneous Protein Analysis by Magnetic Core-Shell Nanorod Probes.

    PubMed

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Altantzis, Thomas; Bals, Sara; Schotter, Joerg

    2016-04-13

    Studying protein interactions is of vital importance both to fundamental biology research and to medical applications. Here, we report on the experimental proof of a universally applicable label-free homogeneous platform for rapid protein analysis. It is based on optically detecting changes in the rotational dynamics of magnetically agitated core-shell nanorods upon their specific interaction with proteins. By adjusting the excitation frequency, we are able to optimize the measurement signal for each analyte protein size. In addition, due to the locking of the optical signal to the magnetic excitation frequency, background signals are suppressed, thus allowing exclusive studies of processes at the nanoprobe surface only. We study target proteins (soluble domain of the human epidermal growth factor receptor 2 - sHER2) specifically binding to antibodies (trastuzumab) immobilized on the surface of our nanoprobes and demonstrate direct deduction of their respective sizes. Additionally, we examine the dependence of our measurement signal on the concentration of the analyte protein, and deduce a minimally detectable sHER2 concentration of 440 pM. For our homogeneous measurement platform, good dispersion stability of the applied nanoprobes under physiological conditions is of vital importance. To that end, we support our measurement data by theoretical modeling of the total particle-particle interaction energies. The successful implementation of our platform offers scope for applications in biomarker-based diagnostics as well as for answering basic biology questions.

  1. Direction of unsaturated flow in a homogeneous and isotropic hillslope

    USGS Publications Warehouse

    Lu, N.; Kaya, B.S.; Godt, J.W.

    2011-01-01

    The distribution of soil moisture in a homogeneous and isotropic hillslope is a transient, variably saturated physical process controlled by rainfall characteristics, hillslope geometry, and the hydrological properties of the hillslope materials. The major driving mechanisms for moisture movement are gravity and gradients in matric potential. The latter is solely controlled by gradients of moisture content. In a homogeneous and isotropic saturated hillslope, absent a gradient in moisture content and under the driving force of gravity with a constant pressure boundary at the slope surface, flow is always in the lateral downslope direction, under either transient or steady state conditions. However, under variably saturated conditions, both gravity and moisture content gradients drive fluid motion, leading to complex flow patterns. In general, the flow field near the ground surface is variably saturated and transient, and the direction of flow could be laterally downslope, laterally upslope, or vertically downward. Previous work has suggested that prevailing rainfall conditions are sufficient to completely control these flow regimes. This work, however, shows that under time-varying rainfall conditions, vertical, downslope, and upslope lateral flow can concurrently occur at different depths and locations within the hillslope. More importantly, we show that the state of wetting or drying in a hillslope defines the temporal and spatial regimes of flow and when and where laterally downslope and/or laterally upslope flow occurs. Copyright 2011 by the American Geophysical Union.
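    The directional argument above reduces to Darcy's law: the flux follows the negative gradient of total head, the sum of elevation head and matric potential head. A minimal sketch, with an assumed coordinate convention (x positive in the downslope horizontal direction, z vertical upward) and illustrative gradient values:

```python
import numpy as np

def flow_direction(grad_psi, K=1.0):
    """Unit Darcy flux direction q = -K * grad(psi + z).

    grad_psi = (d(psi)/dx, d(psi)/dz) is the matric-potential-head
    gradient; gravity contributes +1 to the vertical component of the
    total-head gradient through dz/dz = 1.
    """
    gx, gz = grad_psi
    q = -K * np.array([gx, gz + 1.0])
    return q / np.linalg.norm(q)

def classify(q_unit, tol=0.2):
    """Crude labeling of the flux direction, matching the regimes above."""
    qx, qz = q_unit
    if abs(qx) < tol:
        return "vertically downward" if qz < 0 else "vertically upward"
    return "laterally downslope" if qx > 0 else "laterally upslope"
```

    With no moisture-content gradient, gravity alone drives vertically downward flow; a drying front upslope (negative lateral psi gradient) tilts the flux laterally downslope, while wetting near the slope toe can reverse it to laterally upslope — the concurrent regimes the abstract describes.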

  2. Homogeneous Protein Analysis by Magnetic Core-Shell Nanorod Probes.

    PubMed

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Altantzis, Thomas; Bals, Sara; Schotter, Joerg

    2016-04-13

    Studying protein interactions is of vital importance both to fundamental biology research and to medical applications. Here, we report on the experimental proof of a universally applicable label-free homogeneous platform for rapid protein analysis. It is based on optically detecting changes in the rotational dynamics of magnetically agitated core-shell nanorods upon their specific interaction with proteins. By adjusting the excitation frequency, we are able to optimize the measurement signal for each analyte protein size. In addition, due to the locking of the optical signal to the magnetic excitation frequency, background signals are suppressed, thus allowing exclusive studies of processes at the nanoprobe surface only. We study target proteins (soluble domain of the human epidermal growth factor receptor 2 - sHER2) specifically binding to antibodies (trastuzumab) immobilized on the surface of our nanoprobes and demonstrate direct deduction of their respective sizes. Additionally, we examine the dependence of our measurement signal on the concentration of the analyte protein, and deduce a minimally detectable sHER2 concentration of 440 pM. For our homogeneous measurement platform, good dispersion stability of the applied nanoprobes under physiological conditions is of vital importance. To that end, we support our measurement data by theoretical modeling of the total particle-particle interaction energies. The successful implementation of our platform offers scope for applications in biomarker-based diagnostics as well as for answering basic biology questions. PMID:27023370

  3. Direction of unsaturated flow in a homogeneous and isotropic hillslope

    USGS Publications Warehouse

    Lu, Ning; Kaya, Basak Sener; Godt, Jonathan W.

    2011-01-01

    The distribution of soil moisture in a homogeneous and isotropic hillslope is a transient, variably saturated physical process controlled by rainfall characteristics, hillslope geometry, and the hydrological properties of the hillslope materials. The major driving mechanisms for moisture movement are gravity and gradients in matric potential. The latter is solely controlled by gradients of moisture content. In a homogeneous and isotropic saturated hillslope, absent a gradient in moisture content and under the driving force of gravity with a constant pressure boundary at the slope surface, flow is always in the lateral downslope direction, under either transient or steady state conditions. However, under variably saturated conditions, both gravity and moisture content gradients drive fluid motion, leading to complex flow patterns. In general, the flow field near the ground surface is variably saturated and transient, and the direction of flow could be laterally downslope, laterally upslope, or vertically downward. Previous work has suggested that prevailing rainfall conditions are sufficient to completely control these flow regimes. This work, however, shows that under time-varying rainfall conditions, vertical, downslope, and upslope lateral flow can concurrently occur at different depths and locations within the hillslope. More importantly, we show that the state of wetting or drying in a hillslope defines the temporal and spatial regimes of flow and when and where laterally downslope and/or laterally upslope flow occurs.

  4. Homogenization of global radiosonde humidity data

    NASA Astrophysics Data System (ADS)

    Blaschek, Michael; Haimberger, Leopold

    2016-04-01

    The global radiosonde network is an important source of upper-air measurements and is strongly connected to reanalysis efforts of the 20th century. However, measurements are strongly affected by changes in the observing system and require homogenization before they can be considered useful in climate studies. In particular, humidity measurements are known to show spurious trends and biases induced by many sources, e.g. reporting practices or freezing of the sensor. We propose to detect and correct these biases in an automated way, as has been done with temperature and winds. We detect breakpoints in dew point depression (DPD) time series by employing a standard normal homogeneity test (SNHT) on DPD-departures from ERA-Interim. In a next step, we calculate quantile departures between the latter and the earlier part of the time series near each breakpoint, going back in time. These departures adjust the earlier distribution of DPD to the later distribution, called quantile matching, thus removing, for example, a non-climatic shift. We apply this approach to the existing radiosonde network. As a first step to verify our approach, we compare our results with ERA-Interim data and brightness temperatures of humidity-sensitive channels of microwave radiometers (SSMIS) onboard DMSP F16. The results show that some of the biases can be detected and corrected in an automated way; however, large biases that affect the distribution of DPD values and originate from known reporting practices (e.g. 30 DPD at US stations) remain. These biases can be removed but not corrected. Comparing brightness temperatures from satellites and radiosondes proves to be difficult, as large differences result from, for example, representativeness errors.
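    The two-step procedure — SNHT breakpoint detection on departure series, then quantile matching of the earlier segment onto the later one — can be sketched as follows. The series, bias size, and quantile grid are synthetic assumptions, not radiosonde data:

```python
import numpy as np

def snht(departures):
    """SNHT statistic T(k) for every candidate breakpoint k."""
    z = (departures - departures.mean()) / departures.std(ddof=1)
    n = len(z)
    t = np.zeros(n)
    for k in range(1, n - 1):
        # Large T(k) means the means before and after k differ strongly.
        t[k] = k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
    return t

def quantile_match(earlier, later, probs=np.linspace(0.05, 0.95, 19)):
    """Adjust the earlier segment so its distribution matches the later
    one, via a piecewise-linear quantile-to-quantile correction."""
    q_early = np.quantile(earlier, probs)
    q_late = np.quantile(later, probs)
    return earlier + np.interp(earlier, q_early, q_late - q_early)

# Synthetic DPD-like series with an artificial +2 bias before sample 50.
x = np.sin(0.3 * np.arange(100))
x[:50] += 2.0
t = snht(x)
k = int(np.argmax(t))                    # detected breakpoint
adjusted = quantile_match(x[:k], x[k:])  # earlier segment, bias removed
```

    Because the adjustment maps whole quantiles rather than subtracting a single offset, it can also remove non-climatic shifts that change the shape of the DPD distribution, not just its mean.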

  5. Creating an organizational climate for multiculturalism.

    PubMed

    Bruhn, J G

    1996-06-01

    Multiculturalism is an ideal goal for our society, its organizations, and its institutions, involving a continuous process of education and change within organizations. Multiculturalism begins with diversity and requires various steps to achieve changes in attitudes, behaviors, and values. The leadership of organizations must not only commit to diversification but also participate in it and reward efforts toward it. Diversification should be managed by creating a climate of open participation, feedback, and control at the lower organizational levels. Micromanaging the process of becoming diverse increases resistance and paranoia and counters educational efforts. PMID:10157003

  6. Creating a culture of mutual respect.

    PubMed

    Kaplan, Kathryn; Mestel, Pamela; Feldman, David L

    2010-04-01

    The Joint Commission mandates that hospitals seeking accreditation have a process to define and address disruptive behavior. Leaders at Maimonides Medical Center, Brooklyn, New York, took the initiative to create a code of mutual respect that not only requires respectful behavior, but also encourages sensitivity and awareness to the causes of frustration that often lead to inappropriate behavior. Steps to implementing the code included selecting code advocates, setting up a system for mediating disputes, tracking and addressing operational system issues, providing training for personnel, developing a formal accountability process, and measuring the results. PMID:20362215

  7. Create a Polarized Light Show.

    ERIC Educational Resources Information Center

    Conrad, William H.

    1992-01-01

    Presents a lesson that introduces students to polarized light using a problem-solving approach. After illustrating the concept using a slinky and poster board with a vertical slot, students solve the problem of creating a polarized light show using Polya's problem-solving methods. (MDH)

  8. Creating Space for Children's Literature

    ERIC Educational Resources Information Center

    Serafini, Frank

    2011-01-01

    As teachers struggle to balance the needs of their students with the requirements of commercial reading materials, educators need to consider how teachers will create space for children's literature in today's classrooms. In this article, 10 practical recommendations for incorporating children's literature in the reading instructional framework…

  9. Creating Time for Equity Together

    ERIC Educational Resources Information Center

    Renée, Michelle

    2015-01-01

    In urban communities across the nation, a broad range of partners have committed to reinventing educational time together to ensure equitable access to rich learning opportunities for all young people. Education partners are using their creativity, commitment, and unique resources to create new school and system designs that…

  10. Creating Three-Dimensional Scenes

    ERIC Educational Resources Information Center

    Krumpe, Norm

    2005-01-01

    Persistence of Vision Raytracer (POV-Ray), a free computer program for creating photo-realistic, three-dimensional scenes and a link for Mathematica users interested in generating POV-Ray files from within Mathematica, is discussed. POV-Ray has great potential in secondary mathematics classrooms and helps in strengthening students' visualization…

  11. Creating an Innovative Learning Organization

    ERIC Educational Resources Information Center

    Salisbury, Mark

    2010-01-01

    This article describes how to create an innovative learning (iLearning) organization. It begins by discussing the life cycle of knowledge in an organization, followed by a description of the theoretical foundation for iLearning. Next, the article presents an example of iLearning, followed by a description of the distributed nature of work, the…

  12. Creating Highlander Wherever You Are

    ERIC Educational Resources Information Center

    Williams, Susan; Mullett, Cathy

    2016-01-01

    Highlander Research and Education Center serves as a catalyst for grassroots organizing and movement building. This article focuses on an interview with education coordinator Susan Williams who has worked at Highlander for 26 years. We discuss how others can and do create powerful popular education experiences anywhere, whether they have a…

  13. Creating Presentations on ICT Classes

    ERIC Educational Resources Information Center

    Marchis, Iuliana

    2010-01-01

    The article focuses on the creation of presentations in ICT classes. The first part highlights the most important steps when creating a presentation. The main idea is that a computer presentation shouldn't consist only of the technological part, i.e. the editing of the presentation in a computer program. There are many steps before and after…

  14. Creating a Global Perspective Campus

    ERIC Educational Resources Information Center

    Braskamp, Larry A.

    2011-01-01

    The author has written this Guidebook to assist users interested in creating a campus that will be more global in its mission, programs, and people. His approach is to focus on the views and contributions of the people who are engaged in higher education. Thus it has a "person" emphasis rather than a structural or policy point of view. The author…

  15. Can Children Really Create Knowledge?

    ERIC Educational Resources Information Center

    Bereiter, Carl; Scardamalia, Marlene

    2010-01-01

    Can children genuinely create new knowledge, as opposed to merely carrying out activities that resemble those of mature scientists and innovators? The answer is yes, provided the comparison is not to works of genius but to standards that prevail in ordinary research communities. One important product of knowledge creation is concepts and tools…

  16. Creating Adult Basic Education Programs.

    ERIC Educational Resources Information Center

    Harris, Dolores M.

    Adult basic education programs must teach the "social living skills" disadvantaged adults need, as well as basic literacy skills. In creating an ABE program, one must first assess the needs of the target population--through surveys, group meetings, an advisory council of members of the target population, demographic studies, and consideration of…

  17. Effect of cloud-scale vertical velocity on the contribution of homogeneous nucleation to cirrus formation and radiative forcing

    NASA Astrophysics Data System (ADS)

    Shi, X.; Liu, X.

    2016-06-01

    Ice nucleation is a critical process for the ice crystal formation in cirrus clouds. The relative contribution of homogeneous nucleation versus heterogeneous nucleation to cirrus formation differs between measurements and predictions from general circulation models. Here we perform large-ensemble simulations of the ice nucleation process using a cloud parcel model driven by observed vertical motions and find that homogeneous nucleation occurs rather infrequently, in agreement with recent measurement findings. When the effect of observed vertical velocity fluctuations on ice nucleation is considered in the Community Atmosphere Model version 5, the relative contribution of homogeneous nucleation to cirrus cloud occurrences decreases to only a few percent. However, homogeneous nucleation still has strong impacts on the cloud radiative forcing. Hence, the importance of homogeneous nucleation for cirrus cloud formation should not be dismissed on the global scale.

  18. Absorbing metasurface created by diffractionless disordered arrays of nanoantennas

    SciTech Connect

    Chevalier, Paul; Bouchon, Patrick; Jaeck, Julien; Lauwick, Diane; Kattnig, Alain; Bardou, Nathalie; Pardo, Fabrice; Haïdar, Riad

    2015-12-21

    We study disordered arrays of metal-insulator-metal nanoantennas in order to create a diffractionless metasurface able to absorb light in the 3–5 μm spectral range. This study is conducted with angle-resolved reflectivity measurements obtained with a Fourier transform infrared spectrometer. A first design is based on a perturbation of a periodic arrangement, leading to a significant reduction of the radiative losses. Then, a random assembly of nanoantennas is built following a Poisson-disk distribution of given density, in order to obtain a nearly perfect cluttered assembly with the optical properties of a homogeneous material.

  19. Evidence for homogeneous distribution of osmium in the protosolar nebula

    NASA Astrophysics Data System (ADS)

    Walker, Richard J.

    2012-10-01

    Separate s-, r-, and possibly p-process enriched and depleted components have been shown to host Os in low metamorphic grade chondrites, although no measurable Os isotopic anomalies have yet been discovered for bulk chondrites. Here, iron meteorites from groups IAB, IIAB, IIIAB, IVA and IVB, as well as the main group pallasites, are examined. Many of these meteorites show well-resolved anomalies in ɛ190Os, ɛ189Os and ɛ186Osi. The anomalies, however, differ from those observed in chemically extracted components from chondrites, and are interpreted to reflect long-term exposure of the meteorites to cosmic rays, rather than nucleosynthetic effects. A neutron capture model is presented that accounts well for observed isotopic variations in 190Os and 189Os. The same model predicts greater enrichment in 186Osi than is observed for at least one iron, suggesting as-yet-unaccounted-for effects, or failings of the model. Despite the variable anomalies resulting from cosmic ray exposure, each of the major meteorite groups examined contains at least one member with normal Os isotopic compositions that are unresolved from chondritic compositions. This indicates that some domains within these meteorites were little affected by cosmic rays. These domains are excellent candidates for application of the 182Hf-182W system for dating metal-silicate segregation on their parent bodies. The normal Os also implies that Os was homogeneously distributed throughout the protosolar nebula on the scale of planetesimal accretion, within the current level of analytical resolution. The homogeneity in Os contrasts with the isotopic heterogeneity present for other siderophile elements, including Mo, Ru and W. The contrast in the scale of anomalies may reflect a late-stage injection of s- and p-process-rich material into the coalescing nebula. Alternatively, nebular thermal processing and destruction of some presolar host phases of Mo, Ru and W may also be responsible.

  20. Influence of interspecific competition and landscape structure on spatial homogenization of avian assemblages.

    PubMed

    Robertson, Oliver J; McAlpine, Clive; House, Alan; Maron, Martine

    2013-01-01

    Human-induced biotic homogenization, resulting from landscape change and increased competition from widespread generalists or 'winners', is widely recognized as a global threat to biodiversity. However, it remains unclear which aspects of landscape structure influence homogenization. This paper tests the importance of interspecific competition and landscape structure for the spatial homogeneity of avian assemblages within a fragmented agricultural landscape of eastern Australia. We used field observations of the density of 128 diurnal bird species to calculate taxonomic and functional similarity among assemblages. We then examined whether taxonomic and functional similarity varied with patch type, the extent of woodland habitat, land-use intensity, habitat subdivision, and the presence of Manorina colonies (a competitive genus of honeyeaters). We found that the presence of a Manorina colony was the most significant factor positively influencing both taxonomic and functional similarity of bird assemblages. Competition from members of this widespread genus of native honeyeaters, rather than landscape structure, was the main cause of both taxonomic and functional homogenization. These species have not recently expanded their range, but rather have increased in density in response to agricultural landscape change. The negative impacts of Manorina honeyeaters on assemblage similarity were most pronounced in landscapes of moderate land-use intensity. We conclude that in these human-modified landscapes, increased competition from dominant native species, or 'winners', can result in homogeneous avian assemblages and the loss of specialist species. Together, these interacting processes make biotic homogenization resulting from land-use change a global threat to biodiversity in modified agro-ecosystems.
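Taxonomic similarity measures of the kind used above are typically built from pairwise comparisons of species lists. A minimal sketch (toy species names, and the Jaccard index chosen purely for illustration; the study's exact similarity metric is not specified in this record):

```python
# Hypothetical illustration: pairwise taxonomic similarity between two
# bird assemblages via the Jaccard index (shared species / total species).
def jaccard_similarity(assemblage_a, assemblage_b):
    """Jaccard similarity of two species lists (1.0 = identical sets)."""
    a, b = set(assemblage_a), set(assemblage_b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Two toy assemblages sharing 2 of 4 total species.
site1 = ["noisy_miner", "magpie", "fairy_wren"]
site2 = ["noisy_miner", "magpie", "honeyeater"]
similarity = jaccard_similarity(site1, site2)  # 2 shared / 4 total = 0.5
```

Homogenization then shows up as this pairwise similarity increasing over time; richness-independent turnover metrics can be substituted where richness gradients dominate.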

  1. Photonic crystal waveguide created by selective infiltration

    NASA Astrophysics Data System (ADS)

    Casas Bedoya, A.; Domachuk, P.; Grillet, C.; Monat, C.; Mägi, E. C.; Li, E.; Eggleton, B. J.

    2012-06-01

    The marriage of photonics and microfluidics ("optofluidics") uses the inherent mobility of fluids to reversibly tune photonic structures beyond traditional fabrication methods by infiltrating voids in said structures. Photonic crystals (PhCs) strongly control light on the wavelength scale and are well suited to optofluidic tuning because their periodic airhole microstructure is a natural candidate for housing liquids. The infiltration of a single row of holes in the PhC matrix modifies the effective refractive index, allowing optical modes to be guided by the PhC bandgap. In this work we present the first experimental demonstration of a reconfigurable single-mode W1 photonic crystal defect waveguide created by selective liquid infiltration. We modified a hexagonal silicon planar photonic crystal membrane by selectively filling a single row of air holes, with ~300 nm resolution, using a high-refractive-index ionic liquid. The modification creates optical confinement in the infiltrated region and allows propagation of a single optical waveguide mode. We describe the challenges arising from the infiltration process and the liquid/solid surface interaction in the photonic crystal. We include a detailed comparison between analytic and numerical modeling and experimental results, and introduce a new approach to creating an offset photonic crystal cavity by varying the nature of the selective infiltration process.

  2. Galaxies Collide to Create Hot, Huge Galaxy

    NASA Technical Reports Server (NTRS)

    2009-01-01

    This image of a pair of colliding galaxies called NGC 6240 shows them in a rare, short-lived phase of their evolution just before they merge into a single, larger galaxy. The prolonged, violent collision has drastically altered the appearance of both galaxies and created huge amounts of heat, turning NGC 6240 into an 'infrared luminous' active galaxy.

    A rich variety of active galaxies, with different shapes, luminosities and radiation profiles, exists. Because these galaxies may be related, astronomers have suspected that they represent an evolutionary sequence. By catching different galaxies at different stages of merging, a story emerges of how one type of active galaxy changes into another. NGC 6240 provides an important 'missing link' in this process.

    This image was created from combined data from the infrared array camera of NASA's Spitzer Space Telescope at 3.6 and 8.0 microns (red) and visible light from NASA's Hubble Space Telescope (green and blue).

  3. Creating a climate for excellence.

    PubMed

    Lancaster, J

    1985-01-01

    Some people are motivated to achieve in a manner consistent with the goals of their organization, while others pursue individual goals. The attitudes people hold determine their behavior. The manager is therefore charged with creating an environment that fosters employee commitment to organizational goals. To create a climate for achievement, managers must recognize that all employees want recognition. Employees perform more effectively when they understand the goals of the organization, know what is expected of them, and are part of a system that includes feedback and reinforcement. Generally, people perform more effectively in an environment with minimal threat and punishment; individual responsibility should be encouraged, rewards should be based on results, and a climate of trust and open communication should prevail.

  4. Managing resilience by creating purpose.

    PubMed

    Spake, Michael; Thompson, Elaine C

    2013-01-01

    Rapid, disruptive change is today's normal. It comes in all forms and frequencies. To cope and survive, healthcare executives need to build a culture of agility and resilience at all levels and across all domains of the hospital or health system. Lakeland Regional Health Systems Inc. has been transforming its culture in order to manage resilience by creating purpose. To adapt and sustain itself, Lakeland Regional has launched a transformation from a culture characterized by a collection of single values to one whose core is caring relationships through human interaction; human experience; and community values, beliefs, and attitudes. With a clear purpose of caring for ourselves, caring for our patients and families, caring for each other, and caring for our community, Lakeland Regional is creating resilience by building a purpose that sets the stage for a resilient culture defined by purpose; passion; and a healthy work, spiritual, and life balance.

  5. Creating youth leaders: community supports.

    PubMed

    Davidson, Adina; Schwartz, Sarah E O; Noam, Gil G

    2008-01-01

    In order to maximize the effectiveness of prevention and intervention efforts with youth and address the needs of the whole student, it is necessary not only to work directly with youth, but also to partner with the other key adults in a young person's life: parents and guardians, teachers, after-school staff, and clinicians. Inherent in RALLY's philosophy is a dual strategy of working intensively with students and teachers in the school while creating partnerships that bring students' families and a network of community agencies into the school as well. These partnerships bring important resources to school communities and create richer opportunities for young people and their families. Furthermore, a key to working effectively with youth lies in providing them not only with services that match their needs and interests, but also with opportunities for participation and empowerment. Such opportunities can result in significant individual change in the students involved, as well as in the broader community.

  6. Edge-Based Image Compression with Homogeneous Diffusion

    NASA Astrophysics Data System (ADS)

    Mainberger, Markus; Weickert, Joachim

    It is well-known that edges contain semantically important image information. In this paper we present a lossy compression method for cartoon-like images that exploits information at image edges. These edges are extracted with the Marr-Hildreth operator followed by hysteresis thresholding. Their locations are stored in a lossless way using JBIG. Moreover, we encode the grey or colour values at both sides of each edge by applying quantisation, subsampling and PAQ coding. In the decoding step, information outside these encoded data is recovered by solving the Laplace equation, i.e. we inpaint with the steady state of a homogeneous diffusion process. Our experiments show that the suggested method outperforms the widely-used JPEG standard and can even beat the advanced JPEG2000 standard for cartoon-like images.
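The decoding step described above recovers the missing image regions as the steady state of homogeneous diffusion, i.e. by solving the Laplace equation with the encoded edge pixels held fixed. A minimal sketch of that idea (toy one-channel image and plain Jacobi-style relaxation; the paper's actual solver and edge encoding are more elaborate):

```python
import numpy as np

# Sketch: pixels are known only at a few locations (here, two columns);
# unknown pixels relax to the steady state of homogeneous diffusion,
# i.e. the solution of the Laplace equation with the known pixels as
# Dirichlet boundary conditions.
def inpaint_laplace(image, known_mask, iters=2000):
    u = image.astype(float).copy()
    for _ in range(iters):
        # Average of the 4 neighbours = discrete Laplace steady-state update.
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                      + np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u[~known_mask] = avg[~known_mask]   # only unknown pixels evolve
    return u

# Toy example: left column fixed at 0, right column at 100, interior unknown.
img = np.zeros((8, 8))
img[:, -1] = 100.0
mask = np.zeros((8, 8), dtype=bool)
mask[:, 0] = mask[:, -1] = True
restored = inpaint_laplace(img, mask)
```

The fixed pixels act as boundary conditions; the interior converges to the harmonic interpolant between them (here, a linear ramp across the columns).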

  7. Analysis of opinion spreading in homogeneous networks with signed relationships

    NASA Astrophysics Data System (ADS)

    Fan, Pengyi; Wang, Hui; Li, Pei; Li, Wei; Jiang, Zhihong

    2012-08-01

    Recently, significant attention has been devoted to opinion dynamics in social networks, in which all the relationships between individuals are assumed as positive ones (i.e. friend, altruism or trust). However, many realistic social networks include negative relationships (i.e. enemy or distrust) as well as positive ones. In order to find the dynamical behavior of opinion spreading in signed networks, we propose a model taking into account the impacts of positive and negative relationships. Based on this model, we analyze the dynamical process and provide a detailed mathematical analysis for identifying the threshold of opinion spreading in homogeneous networks with signed relationships. By performing numerical simulations for the threshold in three different signed networks, we find that the theoretical and numerical results are in good agreement, confirming the correctness of our exact solution.
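The spreading model is not fully specified in this record, so the following is a toy sketch under stated assumptions: a homogeneous (constant-degree) network, each edge positive with probability q, and opinion transmission only across positive edges with rate lam. It illustrates how negative relationships can suppress spreading:

```python
import random

# Toy sketch (assumed model, not the paper's exact equations): opinion
# spreading on a homogeneous signed network. Positive edges transmit the
# opinion with probability lam per step; negative edges never transmit.
def spread_signed(n=200, degree=6, q=0.8, lam=0.3, steps=50, seed=1):
    rng = random.Random(seed)
    # Homogeneous ring lattice: every node has exactly `degree` neighbours.
    neighbors = {i: [(i + d) % n for d in range(1, degree // 2 + 1)]
                    + [(i - d) % n for d in range(1, degree // 2 + 1)]
                 for i in range(n)}
    # Assign each undirected edge a sign: +1 with probability q, else -1.
    sign = {}
    for i in range(n):
        for j in neighbors[i]:
            key = (min(i, j), max(i, j))
            if key not in sign:
                sign[key] = 1 if rng.random() < q else -1
    holders = {0}                      # one initial opinion holder
    for _ in range(steps):
        new = set(holders)
        for i in holders:
            for j in neighbors[i]:
                key = (min(i, j), max(i, j))
                if sign[key] == 1 and rng.random() < lam:
                    new.add(j)
        holders = new
    return len(holders) / n            # final fraction holding the opinion

frac = spread_signed()
```

Sweeping q (the fraction of positive relationships) against the final fraction is one simple numerical way to locate a spreading threshold of the kind the abstract analyzes.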

  8. Homogeneous optical cloak constructed with uniform layered structures.

    PubMed

    Zhang, Jingjing; Liu, Liu; Luo, Yu; Zhang, Shuang; Mortensen, Niels Asger

    2011-04-25

    The prospect of rendering objects invisible has intrigued researchers for centuries. Transformation-optics-based invisibility cloak design is now bringing this goal from science fiction to reality and has already been demonstrated experimentally at microwave and optical frequencies. However, the majority of the invisibility cloaks reported so far have a spatially varying refractive index, which requires complicated design processes. Besides, the size of the hidden object is usually small relative to that of the cloak device. Here we report the experimental realization of a homogeneous invisibility cloak with a uniform silicon grating structure. The design strategy eliminates the need for spatial variation of the material index, and in terms of size it allows for a very large obstacle/cloak ratio. Broadband invisibility behavior has been verified at near-infrared frequencies, opening up new opportunities for using uniform layered media to realize invisibility in any frequency range where high-quality dielectrics are available. PMID:21643114

  9. Homogeneous analysis: label-free and substrate-free aptasensors.

    PubMed

    Li, Bingling; Dong, Shaojun; Wang, Erkang

    2010-06-01

    In this Focus Review, we introduce a kind of "label-free" and "substrate-free" (LFSF) aptasensor that carries out the whole sensing process in a homogeneous solution. This means that the covalent labeling, separation, and immobilization steps commonly used in biosensors are successfully avoided, which greatly simplifies the sensing operations. After a brief description of the advantages of aptamers and LFSF aptasensors, the main content of the review is divided into fluorescent aptasensors, colorimetric aptasensors, and hemin-aptamer-DNAzyme LFSF aptasensors, which are the three most widely developed sensing systems in this field. It is hoped that this review can provide an overall picture of how aptamers function as ideal recognition elements in smart analysis.

  10. Nature of low-frequency noise in homogeneous semiconductors

    NASA Astrophysics Data System (ADS)

    Palenskis, Vilius; Maknys, Kęstutis

    2015-12-01

    This report deals with 1/f noise in homogeneous classical silicon-based semiconductor samples. We perform detailed calculations of the resistance fluctuations of the silicon sample due to both (a) charge carrier number changes caused by capture-emission processes and (b) the screening effect of the negatively charged centers, and show that the proportionality of the noise level to the square of the mobility appears as a presentation parameter, not as a consequence of mobility fluctuations. The calculation results explain well the observed experimental results of 1/f noise in Si, Ge and GaAs, and exclude mobility fluctuations as the origin of 1/f noise in these materials and their devices. It is also shown how to find, from experimental 1/f noise results, the effective number of defects responsible for this noise in the measured frequency range.

  11. Homogeneously catalyzed oxidation for the destruction of aqueous organic wastes

    SciTech Connect

    Leavitt, D.D.; Horbath, J.S.; Abraham, M.A. )

    1990-11-01

    Several organic species, specifically atrazine, 2,4-dichlorophenoxyacetic acid, and biphenyl, were converted to CO{sub 2} and other non-harmful gases through oxidation catalyzed by inorganic acid. Nearly complete conversion was obtained through homogeneous liquid-phase oxidation with ammonium nitrate. The kinetics of reaction have been investigated and indicate parallel oxidation and thermal degradation of the oxidant, which results in a maximum conversion at an intermediate temperature. Increasing the oxidant concentration accelerates the rate of conversion and shifts the location of the optimum temperature. Reaction at varying acid concentration revealed that conversion increased approximately linearly as the pH of the solution was increased. Conversion was increased to greater than 99% through the addition of small amounts of transition metal salts, demonstrating the suitability of a treatment process based on this technology for wastestreams containing small quantities of heavy metals.

  12. Toward homogenization of Mediterranean lagoons and their loss of hydrodiversity

    NASA Astrophysics Data System (ADS)

    Ferrarin, Christian; Bajo, Marco; Bellafiore, Debora; Cucco, Andrea; De Pascalis, Francesca; Ghezzo, Michol; Umgiesser, Georg

    2014-08-01

    Lagoons are considered to be the most valuable systems of the Mediterranean coastal area, with crucial ecological, historical, economic, and social relevance. Climate change strongly affects coastal areas and can deeply change the status of transitional areas like lagoons. Herein we investigate the hydrological response of 10 Mediterranean lagoons to climate change by means of numerical models. Our results suggest that Mediterranean lagoons amplify the salinity and temperature changes expected for the open sea. Moreover, numerical simulations indicate that there will be a general loss of intralagoon and interlagoon variability in their physical properties. Therefore, as a result of climate change, Mediterranean lagoons offer an example of a common process that in the future may affect many coastal environments: the homogenization of physical characteristics, with a tendency toward marinization.

  13. Nature of low-frequency noise in homogeneous semiconductors

    PubMed Central

    Palenskis, Vilius; Maknys, Kęstutis

    2015-01-01

    This report deals with 1/f noise in homogeneous classical silicon-based semiconductor samples. We perform detailed calculations of the resistance fluctuations of the silicon sample due to both (a) charge carrier number changes caused by capture-emission processes and (b) the screening effect of the negatively charged centers, and show that the proportionality of the noise level to the square of the mobility appears as a presentation parameter, not as a consequence of mobility fluctuations. The calculation results explain well the observed experimental results of 1/f noise in Si, Ge and GaAs, and exclude mobility fluctuations as the origin of 1/f noise in these materials and their devices. It is also shown how to find, from experimental 1/f noise results, the effective number of defects responsible for this noise in the measured frequency range. PMID:26674184

  14. Creating a Mobile Library Website

    ERIC Educational Resources Information Center

    Cutshall, Tom C.; Blake, Lindsay; Bandy, Sandra L.

    2011-01-01

    The overwhelming results were iPhones and Android devices. Since the library wasn't equipped technologically to develop an in-house application platform and because we wanted the content to work across all mobile platforms, we decided to focus on creating a mobile web-based platform. From the NLM page of mobile sites we chose the basic PubMed/…

  15. Gifted Kids Create a Large Scale Museum Exhibit in Indianapolis.

    ERIC Educational Resources Information Center

    Waters, Mary Lou; Bostwick, Alice

    1989-01-01

    Indianapolis gifted students in grades five-seven created a museum exhibit to express the views of youth while giving adults an opportunity to listen. The process involved brainstorming, selecting a topic (drug and alcohol abuse), gathering information, forming committees, building a scale model, and creating the actual exhibit. (JDD)

  16. Homogeneous isolation of nanocelluloses by controlling the shearing force and pressure in microenvironment.

    PubMed

    Li, Jihua; Wang, Yihong; Wei, Xiaoyi; Wang, Fei; Han, Donghui; Wang, Qinghuang; Kong, Lingxue

    2014-11-26

    Nanocelluloses were prepared from sugarcane bagasse celluloses by dynamic high pressure microfluidization (DHPM), with the aim of achieving homogeneous isolation through control of the shearing force and pressure within a microenvironment. In the DHPM process, the homogeneous cellulose solution passed through the chambers at a higher pressure and in fewer cycles than in the high pressure homogenization (HPH) process. X-ray diffraction (XRD) and X-ray photoelectron spectroscopy (XPS) demonstrated that the entangled network structures of the celluloses were well dispersed in the microenvironment, which provided suitable shearing forces and pressure to fracture the hydrogen bonds. Gel permeation chromatography (GPC), CP/MAS (13)C NMR and Fourier transform infrared spectroscopy (FT-IR) measurements suggested that intra-molecular hydrogen bonds were maintained. These nanocelluloses, with their smaller particle size, good dispersion and lower thermal stability, have great potential for application in electronic devices, electrochemistry, medicine, and the packaging and printing industries.

  17. Dynamic contact angle cycling homogenizes heterogeneous surfaces.

    PubMed

    Belibel, R; Barbaud, C; Mora, L

    2016-12-01

    In order to reduce restenosis, developing an appropriate coating material for metallic stents has been a challenge for biomedicine and scientific research over the past decade. To this end, biodegradable copolymers of poly((R,S)-3,3-dimethylmalic acid) (PDMMLA) were prepared in order to develop a new coating exhibiting different custom groups in its side chain and able to carry a drug. This material will be in direct contact with cells and blood. It contains carboxylic acid and hexylic groups, which confer hydrophilic and hydrophobic character, respectively. The study of this material's wettability and dynamic surface properties is important because of the influence of the chemistry, and of the potential motility of these chemical groups, on cell adhesion and polymer hydrolysis kinetics. Cassie theory was used for the theoretical correction of contact angles on these chemically heterogeneous surface coatings. Dynamic Surface Analysis was used as a practical homogenizer of chemically heterogeneous surfaces, by cycling in water over many cycles. In this work, we confirmed that, unlike the receding contact angle, the advancing contact angle is influenced by a difference of only 10% in the content of acidic groups (%A) in the polymer side chains: it decreases linearly with increasing acidity percentage. Hysteresis (H) is also a sensitive parameter, which is discussed in this paper. Finally, we conclude that cycling provides real information, thus avoiding the theoretical Cassie correction. H(10) is the parameter most sensitive to %A. PMID:27612817
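The Cassie correction referred to above weights the cosines of the pure-component contact angles by their surface fractions. A minimal sketch for a two-component (acid/hexyl) surface, with illustrative angles that are not the record's measured values:

```python
import math

# Cassie equation for a two-component chemically heterogeneous surface:
#   cos(theta_c) = f_acid * cos(theta_acid) + (1 - f_acid) * cos(theta_hexyl)
# The component angles below are invented for illustration.
def cassie_angle_deg(f_acid, theta_acid_deg, theta_hexyl_deg):
    cos_c = (f_acid * math.cos(math.radians(theta_acid_deg))
             + (1.0 - f_acid) * math.cos(math.radians(theta_hexyl_deg)))
    return math.degrees(math.acos(cos_c))

# 10% acidic groups on an otherwise hexylic surface (illustrative angles).
theta = cassie_angle_deg(0.10, 55.0, 95.0)
```

At f_acid = 0 or 1 the formula reduces to the pure-component angle, and intermediate fractions interpolate between them in cosine space.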

  18. Homogeneity of Antibody Responses in Tuberculosis Patients

    PubMed Central

    Samanich, K.; Belisle, J. T.; Laal, S.

    2001-01-01

    The goals of the present study were twofold: (i) to compare the repertoires of antigens in culture filtrates of in vitro-grown Mycobacterium tuberculosis that are recognized by antibodies from noncavitary and cavitary tuberculosis (TB) patients and (ii) to determine the extent of variation that exists between the antigen profiles recognized by individual TB patients. Lipoarabinomannan-free culture filtrate proteins of M. tuberculosis were fractionated by one-dimensional (1-D) and 2-D polyacrylamide gel electrophoresis, and the Western blots were probed with sera from non-human immunodeficiency virus (non-HIV)-infected cavitary and noncavitary TB patients and from HIV-infected, noncavitary TB patients. In contrast to earlier studies based on recombinant antigens of M. tuberculosis, which suggested that antibody responses in TB patients were heterogeneous (K. Lyashchenko et al., Infect. Immun. 66:3936-3940, 1998), our studies with native culture filtrate proteins show that the antibody responses in TB patients exhibit significant homogeneity in being directed against a well-defined subset of antigens. Thus, there is a well-defined subset of culture filtrate antigens that elicits antibodies during noncavitary and cavitary disease. In addition, another set of antigens is recognized primarily by cavitary TB patients. The mapping with individual patient sera presented here suggests that serodiagnostic tests based on the subset of antigens recognized during both noncavitary and cavitary TB will enhance the sensitivity of antibody detection in TB patients, especially in difficult-to-diagnose, smear-negative, noncavitary TB patients. PMID:11402004

  19. Inhomogeneous radiative forcing of homogeneous greenhouse gases

    NASA Astrophysics Data System (ADS)

    Huang, Yi; Tan, Xiaoxiao; Xia, Yan

    2016-03-01

    Radiative forcing of a homogeneous greenhouse gas (HGG) can be very inhomogeneous because the forcing depends on other atmospheric and surface variables. In the case of doubling CO2, the monthly mean instantaneous forcing at the top of the atmosphere is found to vary geographically and temporally from positive to negative values, with the range (-2.5 to 5.1 W m-2) being more than 3 times the magnitude of the global mean value (2.3 W m-2). The vertical temperature change across the atmospheric column (the temperature lapse rate) is found to be the best single predictor for explaining the forcing variation. In addition, the masking effects of clouds and water vapor also contribute to forcing inhomogeneity. A regression model that predicts forcing from geophysical variables is constructed. This model can explain more than 90% of the variance of the forcing. Applying this model to analyzing the forcing variation in the Coupled Model Intercomparison Project Phase 5 (CMIP5) models, we find that intermodel discrepancy in CO2 forcing caused by model climatology leads to considerable discrepancy in their projected changes in poleward energy transport.
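A regression model of this kind can be sketched with ordinary least squares on synthetic data (the predictors, coefficients, and noise below are all invented for illustration, not the paper's fitted values):

```python
import numpy as np

# Sketch: predict local forcing from geophysical predictors (lapse rate,
# cloud fraction, column water vapour) by ordinary least squares.
# All numbers here are synthetic.
rng = np.random.default_rng(0)
n = 500
lapse_rate = rng.normal(6.5, 1.0, n)    # K/km, synthetic
cloud_frac = rng.uniform(0.0, 1.0, n)   # dimensionless, synthetic
water_vap = rng.normal(25.0, 5.0, n)    # kg/m^2, synthetic
# Synthetic "true" forcing: dominated by lapse rate, plus masking terms.
forcing = (0.6 * lapse_rate - 1.2 * cloud_frac - 0.04 * water_vap
           + rng.normal(0.0, 0.1, n))

# Fit intercept + three slopes, then compute explained variance (R^2).
X = np.column_stack([np.ones(n), lapse_rate, cloud_frac, water_vap])
coef, *_ = np.linalg.lstsq(X, forcing, rcond=None)
pred = X @ coef
r2 = 1.0 - np.sum((forcing - pred) ** 2) / np.sum((forcing - forcing.mean()) ** 2)
```

With predictors this informative, the fit recovers the generating coefficients and R^2 exceeds 0.9, mirroring the ">90% of variance explained" claim in structure only.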

  20. Homogeneous LED-illumination using microlens arrays

    NASA Astrophysics Data System (ADS)

    Schreiber, Peter; Kudaev, Serge; Dannberg, Peter; Zeitner, Uwe D.

    2005-08-01

    Efficient homogeneous illumination of rectangular or circular areas with LEDs is a promising application for double-sided microlens arrays. Such illumination schemes employ a primary optics - which can be realized with a concentrator or a collimation lens - and a secondary optics with one or more double-sided microlens arrays and a collection optics for superposing the light from the individual array channels. The main advantage of this design is the achievable short system length compared to integrating lightpipe designs with subsequent relay optics. We describe design rules for the secondary optics derived from a simple ABCD-matrix formalism. Based on these rules, sequential raytracing is used for the actual optics system design. Double-sided arrays are manufactured by polymer-on-glass replication of reflow lenses. With cylindrical lens arrays we assembled high-brightness RGB illumination systems for rectangular areas. Hexagonally packed double-sided arrays of spherical lenslets were applied for a miniaturized circular spotlight. Black matrix polymer apertures attached to the lens arrays helped to avoid unwanted stray light.

  1. Simulation and modeling of homogeneous, compressed turbulence

    NASA Technical Reports Server (NTRS)

    Wu, C. T.; Ferziger, J. H.; Chapman, D. R.

    1985-01-01

    Low Reynolds number homogeneous turbulence undergoing low Mach number isotropic and one-dimensional compression was simulated by numerically solving the Navier-Stokes equations. The numerical simulations were performed on a CYBER 205 computer using a 64 x 64 x 64 mesh. A spectral method was used for spatial differencing and the second-order Runge-Kutta method for time advancement. A variety of statistical information was extracted from the computed flow fields. These include three-dimensional energy and dissipation spectra, two-point velocity correlations, one-dimensional energy spectra, turbulent kinetic energy and its dissipation rate, integral length scales, Taylor microscales, and Kolmogorov length scale. Results from the simulated flow fields were used to test one-point closure, two-equation models. A new one-point-closure, three-equation turbulence model which accounts for the effect of compression is proposed. The new model accurately calculates four types of flows (isotropic decay, isotropic compression, one-dimensional compression, and axisymmetric expansion flows) for a wide range of strain rates.
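One of the statistics listed, the three-dimensional energy spectrum, can be extracted from a periodic velocity field by binning spectral kinetic energy into shells of wavenumber magnitude. A minimal sketch on a small synthetic field (the actual simulations used a 64 x 64 x 64 spectral code; the random field here is purely illustrative):

```python
import numpy as np

# Sketch: shell-binned 3D energy spectrum E(k) of a periodic velocity
# field (u, v, w), computed with the FFT. spectrum[k] sums the spectral
# kinetic energy of all modes whose wavenumber magnitude rounds to k.
def energy_spectrum(u, v, w):
    n = u.shape[0]
    ke_hat = sum(np.abs(np.fft.fftn(c) / c.size) ** 2 for c in (u, v, w)) / 2
    k = np.fft.fftfreq(n, d=1.0 / n)                  # integer wavenumbers
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    kmag = np.rint(np.sqrt(kx**2 + ky**2 + kz**2)).astype(int)
    return np.bincount(kmag.ravel(), weights=ke_hat.ravel())

rng = np.random.default_rng(42)
n = 16
u, v, w = (rng.standard_normal((n, n, n)) for _ in range(3))
E = energy_spectrum(u, v, w)
total_ke = 0.5 * np.mean(u**2 + v**2 + w**2)
```

By Parseval's theorem, summing E(k) over all shells recovers the mean turbulent kinetic energy of the field, which is a convenient sanity check on the binning.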

  2. Homogeneously dispersed, multimetal oxygen-evolving catalysts

    DOE PAGES

    Zhang, Bo; Zheng, Xueli; Voznyy, Oleksandr; Comin, Riccardo; Bajdich, Michal; Garcia-Melchor, Max; Han, Lili; Xu, Jixian; Liu, Min; Zheng, Lirong; et al

    2016-03-24

    Earth-abundant first-row (3d) transition-metal-based catalysts have been developed for the oxygen-evolution reaction (OER); however, they operate at overpotentials significantly above thermodynamic requirements. Density functional theory suggested that non-3d high-valency metals such as tungsten can modulate 3d metal oxides, providing near-optimal adsorption energies for OER intermediates. We developed a room-temperature synthesis to produce gelled oxy-hydroxide materials with an atomically homogeneous metal distribution. These gelled FeCoW oxy-hydroxides exhibit the lowest overpotential (191 mV) reported at 10 mA per square centimeter in alkaline electrolyte. The catalyst shows no evidence of degradation following more than 500 hours of operation. X-ray absorption and computational studies reveal a synergistic interplay between W, Fe and Co in producing a favorable local coordination environment and electronic structure that enhances the energetics for OER.

  3. Homogeneous screening assay for human tankyrase.

    PubMed

    Narwal, Mohit; Fallarero, Adyary; Vuorela, Pia; Lehtiö, Lari

    2012-06-01

    Tankyrase, a member of the human PARP protein superfamily, catalyzes a covalent post-translational modification of substrate proteins. This modification, poly(ADP-ribosyl)ation, leads to changes in protein interactions and modifies downstream signaling events. Tankyrase 1 is a potential drug target due to its functions in telomere homeostasis and in Wnt signaling. We describe here the optimization and application of an activity-based homogeneous assay for tankyrase inhibitors in a high-throughput screening format. The method measures the consumption of substrate by the chemical conversion of the remaining NAD(+) into a stable fluorescent condensation product. Conditions were optimized to measure the enzymatic auto-modification of a recombinant catalytic fragment of tankyrase 1. The fluorescence assay is inexpensive, operationally easy, and performs well according to the statistical analysis (Z' = 0.7). A validatory screen with a natural product library confirmed the suitability of the assay for finding new tankyrase inhibitors. Flavone was the most potent (IC(50) = 325 nM) hit among the natural compounds. A flavone derivative, apigenin, and isopropyl gallate showed potency in the micromolar range, but displayed over 30-fold selectivity for tankyrase over the studied isoenzymes PARP1 and PARP2. The assay is robust and will be useful for screening new tankyrase inhibitors. PMID:22357873
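The assay-quality statistic quoted (Z' = 0.7) is the standard Z'-factor of screening assays, computed from positive and negative control wells. A minimal sketch with invented control readings:

```python
import statistics

# Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
# Values above 0.5 are conventionally taken to indicate an excellent assay.
def z_prime(positive, negative):
    mu_p, mu_n = statistics.mean(positive), statistics.mean(negative)
    sd_p, sd_n = statistics.stdev(positive), statistics.stdev(negative)
    return 1.0 - 3.0 * (sd_p + sd_n) / abs(mu_p - mu_n)

pos = [100.0, 98.0, 102.0, 101.0, 99.0]   # invented full-activity controls
neg = [10.0, 12.0, 9.0, 11.0, 8.0]        # invented fully inhibited controls
z = z_prime(pos, neg)
```

A large separation between control means relative to their spreads pushes Z' toward 1; the reported 0.7 sits comfortably in the "excellent" band.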

  4. Digital Documentation: Using Computers to Create Multimedia Reports.

    ERIC Educational Resources Information Center

    Speitel, Tom; And Others

    1996-01-01

    Describes methods for creating integrated multimedia documents using recent advances in print, audio, and video digitization that bring added usefulness to computers as data acquisition, processing, and presentation tools. Discusses advantages of digital documentation. (JRH)

  5. Building communities that create health.

    PubMed Central

    Wilcox, R; Knapp, A

    2000-01-01

    Typically, public health policy, program design, and resource allocation are based on issue-specific, targeted interventions directed at specific populations or sub-populations. The authors argue that this approach fails to meet the goal of public health, improving health for all, and that the key to health improvement is to create a social context in which healthy choices are the norm. The authors present as case studies two Pennsylvania cities that used multisectoral approaches to achieve community health improvements. PMID:10968745

  6. Homogeneous and heterogenized iridium water oxidation catalysts

    NASA Astrophysics Data System (ADS)

    Macchioni, Alceo

    2014-10-01

    The development of an efficient catalyst for the oxidative splitting of water into molecular oxygen, protons and electrons is of key importance for producing solar fuels through artificial photosynthesis. We address the problem by means of a rational approach aimed at understanding how catalytic performance may be optimized through knowledge of the reaction mechanism of water oxidation and of the fate of the catalytic site under the inevitably harsh oxidative conditions. For the purposes of our study we selected iridium water oxidation catalysts exhibiting remarkable performance (TOF > 5 s-1 and TON > 20000). In particular, we recently focused our attention on [Cp*Ir(N,O)X] (N,O = 2-pyridincarboxylate; X = Cl or NO3) and [IrCl(Hedta)]Na water oxidation catalysts. The former exhibited a remarkable TOF, whereas the latter showed a very high TON. Furthermore, [IrCl(Hedta)]Na was heterogenized onto TiO2, taking advantage of the presence of a dangling -COOH functionality. The heterogenized catalyst maintained approximately the same catalytic activity as its homogeneous analogue, with the advantage that it could be reused many times. Mechanistic studies were performed in order to shed some light on the rate-determining step and on the transformation of the catalysts when exposed to "oxidative stress". It was found that the last oxidative step, preceding oxygen liberation, is rate-determining when a small excess of sacrificial oxidant is used. In addition, several intermediates of the oxidative transformation of the catalyst were intercepted and characterized by NMR, X-ray diffractometry and ESI-MS.

  7. Homogenization estimates for texture evolution in halite

    NASA Astrophysics Data System (ADS)

    Liu, Yi; Gilormini, Pierre; Ponte Castañeda, Pedro

    2005-09-01

    In this work, the recently developed "second-order" self-consistent method [Liu, Y., Ponte Castañeda, P., 2004a. Second-order estimates for the effective behavior and field fluctuations in viscoplastic polycrystals. J. Mech. Phys. Solids 52, 467-495] is used to simulate texture evolution in halite polycrystals. This method makes use of a suitably optimized linear comparison polycrystal and has the distinguishing property of being exact to second order in the heterogeneity contrast. The second-order model takes into consideration the effects of hardening and of the evolution of both crystallographic and morphological texture to yield reliable predictions for the macroscopic behavior of the polycrystal. Comparisons of these predictions with full-field numerical simulations [Lebensohn, R.A., Dawson, P.R., Kern, H.M., Wenk, H.R., 2003. Heterogeneous deformation and texture development in halite polycrystals: comparison of different modeling approaches and experimental data. Tectonophysics 370, 287-311], as well as with predictions resulting from the earlier "variational" and "tangent" self-consistent models, included here for comparison purposes, provide insight into how the underlying assumptions of the various models affect slip in the grains, and therefore the texture predictions, in highly anisotropic and nonlinear polycrystalline materials. The "second-order" self-consistent method, while giving a softer stress-strain response than the corresponding full-field results, predicts a pattern of texture evolution that is not captured by the other homogenization models and that agrees reasonably well with the full-field predictions and with the experimental measurements.

  8. Creating new interspecific hybrid and polyploid crops.

    PubMed

    Mason, Annaliese S; Batley, Jacqueline

    2015-08-01

    Agricultural selection of desirable traits in domesticated plant and animal species mimics natural evolutionary selection for the ability of species to survive, thrive, and reproduce in the wild. However, one evolutionary process is currently underutilised for human agricultural purposes: speciation through interspecific hybridisation and polyploid formation. Despite promising successes in the creation of new hybrid and/or polyploid species in many genera, few geneticists and breeders deliberately take advantage of polyploidy and interspecific hybridisation for crop improvement. We outline the possible benefits as well as potential problems and criticisms with this approach, and address how modern advances in technology and knowledge can help to create new crop species for agriculture. PMID:26164645

  9. [Cellular homogeneity in diverse portions of the diaphragm].

    PubMed

    Jiménez-Fuentes, M A; Gea, J; Mariñán, M; Gáldiz, J B; Gallego, F; Broquetas, J M

    1998-02-01

    The diaphragm is the main inspiratory muscle. It is composed of two parts, the costal and crural, with both anatomical and functional differences. The general morphometric characteristics of the diaphragm have been described in various species, but homogeneity throughout the muscle has not been adequately studied. The aim of this study was to evaluate the fiber phenotype of various parts of the diaphragm. The entire diaphragm muscles of five New Zealand rabbits were removed and each was divided into quarters. The specimens were processed for morphometry (hematoxylin-eosin stains, NADH-TR and ATPase at pH levels of 4.2, 4.6 and 9.4). For each portion we measured the percentage and size of fibers, expressing the latter as minimum diameter (Dm), measured area (Ar) and calculated area (Ac). Left and right hemidiaphragms (20 portions examined) were similar in fiber percentages and sizes. For the left and right halves, respectively, 50 ± 2% and 51 ± 4% of fibers were type I; type I Dm measurements were 38 ± 5 and 41 ± 4 µm; type I Ar values were 1798 ± 481 and 2030 ± 390 µm²; type I Ac values were 1181 ± 360 and 1321 ± 382 µm²; type II Dm values were 46 ± 4 and 46 ± 5 µm; type II Ar values were 2466 ± 388 and 2539 ± 456 µm²; type II Ac data were 1642 ± 255 and 1655 ± 382 µm². We likewise found no differences between the costal and crural portions of the muscle (n = 20). For the costal and crural portions, respectively, 50 ± 3% and 50 ± 2% of fibers were type I; type I Dm sizes were 39 ± 5 and 40 ± 4 µm; type I Ar measurements were 1859 ± 521 and 1964 ± 365 µm²; type I Ac figures were 1231 ± 317 and 1266 ± 288 µm²; type II Dm were 47 ± 4 and 44 ± 3 µm; type II Ar were 2563 ± 481 and 2430 ± 331 µm²; type II Ac were 1729 ± 373 and 1557 ± 212 µm². Type II fibers, however, were somewhat larger than type I fibers in all portions (p = 0.001). New Zealand rabbit

  10. A non-asymptotic homogenization theory for periodic electromagnetic structures

    PubMed Central

    Tsukerman, Igor; Markel, Vadim A.

    2014-01-01

    Homogenization of electromagnetic periodic composites is treated as a two-scale problem and solved by approximating the fields on both scales with eigenmodes that satisfy Maxwell's equations and boundary conditions as accurately as possible. Built into this homogenization methodology is an error indicator whose value characterizes the accuracy of homogenization. The proposed theory allows one to define not only bulk, but also position-dependent material parameters (e.g. in proximity to a physical boundary) and to quantify the trade-off between the accuracy of homogenization and its range of applicability to various illumination conditions. PMID:25104912

  11. Homogeneous Freezing of Water Droplets and its Dependence on Droplet Size

    NASA Astrophysics Data System (ADS)

    Schmitt, Thea; Möhler, Ottmar; Höhler, Kristina; Leisner, Thomas

    2014-05-01

    The formulation and parameterisation of microphysical processes in tropospheric clouds, such as phase transitions, is still a challenge for weather and climate models. This includes the homogeneous freezing of supercooled water droplets, an important process in deep convective systems, where almost pure water droplets may stay liquid until homogeneous freezing occurs at temperatures around 238 K. Though homogeneous ice nucleation in supercooled water is considered to be well understood, recent laboratory experiments with typical cloud droplet sizes showed nucleation rate coefficients one to two orders of magnitude smaller than previous literature results, including earlier results from experiments with single levitated water droplets and from cloud simulation experiments at the AIDA (Aerosol Interaction and Dynamics in the Atmosphere) facility. This motivated us to re-analyse homogeneous droplet freezing experiments conducted during previous years at the AIDA cloud chamber. This cloud chamber has a volume of 84 m³ and operates under atmospherically relevant conditions within wide ranges of temperature, pressure and humidity, whereby investigations of both tropospheric mixed-phase clouds and cirrus clouds can be realised. By controlled adiabatic expansions, the ascent of an air parcel in the troposphere can be simulated. According to our new results and their comparison with the results from single levitated droplet experiments, the homogeneous freezing of water droplets seems to be a volume-dependent process, at least for droplets as small as a few micrometres in diameter. A contribution of surface-induced freezing can be ruled out, in agreement with previous conclusions from the single droplet experiments. The obtained volume nucleation rate coefficients are in good agreement, within error bars, with some previous literature data, including our own results from earlier AIDA experiments, but they do not agree with recently published lower volume

  12. Development of Dynamic Explicit Crystallographic Homogenization Finite Element Analysis Code to Assess Sheet Metal Formability

    NASA Astrophysics Data System (ADS)

    Nakamura, Yasunori; Tam, Nguyen Ngoc; Ohata, Tomiso; Morita, Kiminori; Nakamachi, Eiji

    2004-06-01

    The crystallographic texture evolution induced by plastic deformation in the sheet metal forming process has a great influence on formability. In the present study, a dynamic explicit finite element (FE) analysis code is newly developed by introducing a crystallographic homogenization method to estimate polycrystalline sheet metal formability, such as extreme thinning and "earing." This code can simultaneously predict the plastic-deformation-induced texture evolution at the micro scale and the plastic anisotropy at the macro scale. The multi-scale analysis couples the microscopic inhomogeneous crystal plasticity deformation with the macroscopic continuum deformation. In this homogenization process, the stress at the macro scale is defined by the volume average of the stresses of the corresponding microscopic crystal aggregations, satisfying the equation of motion and the compatibility condition in the micro-scale "unit cell," where periodicity of deformation holds. This homogenization algorithm is implemented in a conventional dynamic explicit finite element code by employing the updated Lagrangian formulation and a rate-type elastic/viscoplastic constitutive equation. First, texture evolution analyses for typical deformation modes confirmed that Taylor's "constant strain homogenization algorithm" yields extreme concentration toward the preferred crystal orientations compared with our homogenization algorithm. Second, we study the effects of plastic anisotropy on "earing" in the hemispherical-cup deep drawing of a pure ferrite phase sheet metal. Comparison of the analytical results with those obtained under Taylor's assumption leads to the conclusion that the newly developed dynamic explicit crystallographic homogenization FEM gives a more reasonable prediction of deformation-induced texture evolution and macroscopic plastic anisotropy.
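    The volume-averaging relation at the heart of the homogenization step, the macroscopic stress as the volume average over the crystals of the unit cell, can be illustrated with a minimal sketch (illustrative only, not the authors' code; scalar stresses stand in for the full stress tensors, and the function name is an assumption):

```python
def macro_stress(micro_stresses, volumes):
    """Volume-averaged macroscopic stress of a unit cell.

    micro_stresses[i] is the (scalar, for illustration) stress in
    crystal i of the aggregation; volumes[i] is its volume (or
    volume fraction) within the unit cell.
    """
    V = sum(volumes)  # total cell volume
    return sum(s * v for s, v in zip(micro_stresses, volumes)) / V

# Three grains occupying 50 %, 30 % and 20 % of the cell
sigma = macro_stress([100.0, 80.0, 120.0], [0.5, 0.3, 0.2])
```

In the actual multi-scale scheme the same averaging is applied component-wise to the crystal stress tensors while the micro-scale FE problem enforces equilibrium and periodicity.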

  13. Creating Cross-disciplinary Courses

    PubMed Central

    Reynolds, Elaine R.

    2012-01-01

    Because of its focus on the biological underpinnings of action and behavior, neuroscience intersects with many fields of human endeavor. Some of these cross-disciplinary intersections are long standing, while others, such as neurotheology or neuroeconomics, are more recently formed fields. Many undergraduate institutions have sought to include cross-disciplinary courses in their curriculum because this style of pedagogy is often seen as applicable to real-world problems. However, it can be difficult for faculty with specialized training within their discipline to expand beyond their own fields to offer cross-disciplinary courses. I have been creating a series of multi- or cross-disciplinary courses and have found some strategies that have helped me successfully teach these classes. I will discuss general strategies and tools for developing these types of courses, including: 1) creating mixed-experience classrooms of students and contributing faculty; 2) finding the right tools that allow you to teach a mixed population without prerequisites; 3) examining the topic using multiple disciplinary perspectives; 4) feeding off student experience and interest; and 5) assessing the impact of these courses on student outcomes and your neuroscience program. This last tool in particular is important in establishing the validity of this type of teaching for neuroscience students and the general student population. PMID:23494491

  14. Creating Stop-Motion Videos with iPads to Support Students' Understanding of Cell Processes: "Because You Have to Know What You're Talking about to Be Able to Do It"

    ERIC Educational Resources Information Center

    Deaton, Cynthia C. M.; Deaton, Benjamin E.; Ivankovic, Diana; Norris, Frank A.

    2013-01-01

    The purpose of this qualitative case study is two-fold: (a) describe the implementation of a stop-motion animation video activity to support students' understanding of cell processes, and (b) present research findings about students' beliefs and use of iPads to support their creation of stop-motion videos in an introductory biology course. Data…

  15. Molecular precursors for the preparation of homogenous zirconia-silica materials by hydrolytic sol-gel process in organic media. Crystal structures of [Zr{OSi(O(t)Bu)3}4(H2O)2]·2H2O and [Ti(O(t)Bu){OSi(O(t)Bu)3}3].

    PubMed

    Dhayal, Veena; Chaudhary, Archana; Choudhary, Banwari Lal; Nagar, Meena; Bohra, Rakesh; Mobin, Shaikh M; Mathur, Pradeep

    2012-08-21

    [Zr(OPr(i))4·Pr(i)OH] reacts with [HOSi(O(t)Bu)3] in anhydrous benzene in 1:1 and 1:2 molar ratios to afford alkoxy zirconosiloxane precursors of the types [Zr(OPr(i))3{OSi(O(t)Bu)3}] (A) and [Zr(OPr(i))2{OSi(O(t)Bu)3}2] (B), respectively. Further reactions of A or B with glycols in 1:1 molar ratio afforded six chemically modified precursors of the types [Zr(OPr(i))(OGO){OSi(O(t)Bu)3}] (1A-3A) and [Zr(OGO){OSi(O(t)Bu)3}2] (1B-3B), respectively [where G = (-CH2-)2 (1A, 1B); (-CH2-)3 (2A, 2B) and {-CH2CH2CH(CH3)-} (3A, 3B)]. The precursors A and B are viscous liquids, which solidify on ageing, whereas the other products are all solids, soluble in common organic solvents. These were characterized by elemental analyses, molecular weight measurements, FAB mass, FTIR, and 1H, 13C and 29Si NMR studies. Cryoscopic molecular weight measurements of all the products, as well as the FAB mass studies of 3A and 3B, indicate their monomeric nature. However, the FAB mass spectrum of the solidified B suggests that it exists in dimeric form. Single crystal structure analysis of [Zr{OSi(O(t)Bu)3}4(H2O)2]·2H2O (3b) (Rfac = 11.9%), as well as that of corresponding better-quality crystals of [Ti(O(t)Bu){OSi(O(t)Bu)3}3] (4) (Rfac = 5.97%), indicates the presence of a M-O-Si bond. TG analyses of 3A, B, and 3B indicate the formation of zirconia-silica materials of the type ZrO2·SiO2 from 3A and ZrO2·2SiO2 from B or 3B at low decomposition temperatures (≤200 °C). The desired homogenous nano-sized zirconia-silica materials [ZrO2·nSiO2] have been obtained easily from the precursors A and B, as well as from the glycol-modified precursors 3A and 3B, by a hydrolytic sol-gel process in organic media without using any acid or base catalyst; these were characterized by powder XRD patterns, SEM images, EDX analyses and IR spectroscopy.

  16. Electrothermal atomic absorption spectrophotometry of nickel in tissue homogenates

    SciTech Connect

    Sunderman, F.W. Jr.; Marzouk, A.; Crisostomo, M.C.; Weatherby, D.R.

    1985-01-01

    A method for analysis of Ni concentrations in tissues is described, which involves (a) tissue dissection with metal-free obsidian knives, (b) tissue homogenization in polyethylene bags by use of a Stomacher blender, (c) oxidative digestion with mixed nitric, sulfuric, and perchloric acids, and (d) quantitation of Ni by electrothermal atomic absorption spectrophotometry with Zeeman background correction. The detection limit for Ni in tissues is 10 ng per g, dry weight; the coefficient of variation ranges from 7 to 15%, depending on the tissue Ni concentration; the recovery of Ni added at a concentration of 20 ng per g, dry weight, to kidney homogenates averages 101 ± 8% (mean ± SD). In control rats, Ni concentrations are highest in lung (102 ± 39 ng per g, dry weight) and lowest in spleen (35 ± 16 ng per g, dry weight). In descending order of Ni concentrations, the tissues of control rats rank as follows: lung > heart > bone > kidney > brain > testis > fat > liver > spleen. In rats killed 24 h after sc injection of NiCl2 (0.125 mmol per kg, body weight), Ni concentrations are highest in kidney (17.7 ± 2.5 µg per g, dry weight) and lowest in brain (0.38 ± 0.14 µg per g, dry weight). In descending order of Ni concentrations, the tissues of NiCl2-treated rats rank as follows: kidney >> lung > spleen > testis > heart > fat > liver > bone > brain. The present method fills the need for an accurate, sensitive, and practical technique to determine tissue Ni concentrations, with stringent precautions to minimize Ni contamination during tissue sampling and processing. 35 references, 5 figures, 1 table.

  17. Isomorphism, Homogeneity, and Rationalism in University Retrenchment.

    ERIC Educational Resources Information Center

    Gates, Gordon S.

    1997-01-01

    Describes the process of retrenchment at a medium-sized state university (pseudonyms used) for the purpose of analyzing the isomorphic pressures (mimetic, coercive, normative) surfacing during the process and the role played by rationalism in decision-making. Draws on the literature of organizational theory and change, and makes comparisons with…

  18. Numerical Generation of Dense Plume Fingers in Unsaturated Homogeneous Porous Media

    NASA Astrophysics Data System (ADS)

    Cremer, C.; Graf, T.

    2012-04-01

    In nature, the migration of dense plumes typically results in the formation of vertical plume fingers. Flow in the fingers is downwards, counterbalanced by upwards flow of less dense fluid between the fingers. In heterogeneous media, the heterogeneity itself is known to trigger the formation of fingers. In homogeneous media, however, fingers are also created even if all grains have the same diameter. The reason is that pore-scale heterogeneity leading to different flow velocities also exists in homogeneous media, due to two effects: (i) Grains of identical size may randomly arrange differently, e.g. forming tetrahedrons, hexahedrons or octahedrons. Each arrangement creates pores of varying diameter, thus resulting in different average flow velocities. (ii) Random variations of solute concentration lead to varying buoyancy effects, thus also resulting in different velocities. Continuing previous efforts to incorporate pore-scale heterogeneity into fully saturated soil such that dense fingers are realistically generated (Cremer and Graf, EGU Assembly, 2011), the current paper extends the research scope from saturated to unsaturated soil. Perturbation methods are evaluated by numerically re-simulating a laboratory-scale experiment of plume transport in homogeneous unsaturated sand (Simmons et al., Transp. Porous Media, 2002). The following five methods are discussed: (i) homogeneous sand, (ii) initial perturbation of solute concentration, (iii) spatially random, time-constant perturbation of the solute source, (iv) spatially and temporally random noise of the simulated solute concentration, and (v) a random K-field that introduces physically insignificant but numerically significant heterogeneity. Results demonstrate that, as opposed to saturated flow, perturbing the solute source will not result in plume fingering. This is because the location of the perturbed source (domain top) and the location of finger generation (groundwater surface) do not

  19. Grouping in Primary Schools and Reference Processes.

    ERIC Educational Resources Information Center

    Meijnen, G. W.; Guldemond, H.

    2002-01-01

    Studied reference processes in within-class grouping for elementary school students in the Netherlands in homogeneous (n=16) and heterogeneous (n=14) classes. Findings indicate that homogeneous grouping sets strong reference processes in motion, and processes of comparison have considerably greater effects in homogeneous groups, with negative…

  20. Phase resolved analysis of the homogeneity of a diffuse dielectric barrier discharge

    NASA Astrophysics Data System (ADS)

    Baldus, Sabrina; Kogelheide, Friederike; Bibinov, Nikita; Stapelmann, Katharina; Awakowicz, Peter

    2015-09-01

    Cold atmospheric pressure plasmas have already proven their ability to support the healing process of chronic wounds. Especially simple configurations like a dielectric barrier discharge (DBD), comprising one driven electrode coated with a dielectric layer, are of interest because they are cost-effective and easy to handle. Homogeneity of such plasmas during treatment is necessary since the whole wound should be treated evenly. In this investigation, phase-resolved optical emission spectroscopy is used to investigate the homogeneity of a DBD. Electron densities and reduced electric field distributions are determined with temporal and spatial resolution, and the differences between applied positive and negative voltage pulses are studied.

  1. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production-level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion, as it creates the motion of the foam. This continuum-level model uses a homogenized description of the foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  2. On the supposed influence of milk homogenization on the risk of CVD, diabetes and allergy.

    PubMed

    Michalski, Marie-Caroline

    2007-04-01

    Commercial milk is homogenized for the purpose of physical stability, thereby reducing fat droplet size and including caseins and some whey proteins at the droplet interface. This seems to result in a better digestibility than untreated milk. Various casein peptides and milk fat globule membrane (MFGM) proteins are reported to present either harmful (e.g. atherogenic) or beneficial bioactivity (e.g. hypotensive, anticarcinogenic and others). Homogenization might enhance either of these effects, but this remains controversial. The effect of homogenization has not been studied regarding the link between early cow's milk consumption and occurrence of type I diabetes in children prone to the disease and no link appears in the general population. Homogenization does not influence milk allergy and intolerance in allergic children and lactose-intolerant or milk-hypersensitive adults. The impact of homogenization, as well as heating and other treatments such as cheesemaking processes, on the health properties of milk and dairy products remains to be fully elucidated. PMID:17349070

  3. A Multivariate Technique for Evaluating the Statistical Homogeneity of Jointed Rock Masses

    NASA Astrophysics Data System (ADS)

    Li, Yanyan; Wang, Qing; Chen, Jianping; Song, Shengyuan; Ruan, Yunkai; Zhang, Qi

    2015-09-01

    Various approaches have been developed for identifying statistically homogeneous regions or structural domains in a jointed rock mass based on joint orientations or other joint parameters; however, few studies have integrated both. In this paper, nine parameters are considered for this identification, namely: orientation, spacing, aperture, roughness, trace length, trace type, filling, groundwater condition, and weathering. A statistical parameter, the correlation coefficient, is used to quantify the degree of similarity between the joint parameters collected from four adjacent adits at the Songta dam site on the upper reaches of the Nu River in southwest China. Based on the analytic hierarchy process, the weights of the parameters are obtained. The overall homogeneity of the rock mass around the studied regions is then determined from the resulting homogeneity index.
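    The combination step, in which per-parameter similarity scores are merged with AHP weights into a single index, can be sketched as follows (a hypothetical illustration, not the authors' code; the parameter names, weights, toy frequency vectors, and the use of the Pearson correlation coefficient as the similarity measure are all assumptions):

```python
import math

def pearson(u, v):
    """Pearson correlation coefficient between two equal-length vectors."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv)

def homogeneity_index(site_a, site_b, weights):
    """Weighted similarity of two survey regions.

    site_a / site_b map each joint parameter to its class-frequency
    vector; weights (e.g. derived from AHP pairwise comparisons)
    sum to 1.  The closer the result is to 1, the more statistically
    homogeneous the two regions are with respect to these parameters.
    """
    return sum(w * pearson(site_a[p], site_b[p])
               for p, w in weights.items())

# Toy data: two adits described by just two parameters
a = {"spacing": [0.5, 0.3, 0.2], "roughness": [0.1, 0.6, 0.3]}
b = {"spacing": [0.4, 0.4, 0.2], "roughness": [0.2, 0.5, 0.3]}
idx = homogeneity_index(a, b, {"spacing": 0.6, "roughness": 0.4})
```

A threshold on the index (chosen per site) would then decide whether two regions belong to the same structural domain.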

  4. Microstructural evolution in Al-Zn-Mg-Cu-Sc-Zr alloys during short-time homogenization

    NASA Astrophysics Data System (ADS)

    Liu, Tao; He, Chun-nian; Li, Gen; Meng, Xin; Shi, Chun-sheng; Zhao, Nai-qin

    2015-05-01

    Microstructural evolution in a new kind of aluminum (Al) alloy with the chemical composition of Al-8.82Zn-2.08Mg-0.80Cu-0.31Sc-0.3Zr was investigated. It is found that the secondary phase MgZn2 is completely dissolved into the matrix during a short homogenization treatment (470°C, 1 h), while the primary phase Al3(Sc,Zr) remains stable. This is due to Sc and Zr additions into the Al alloy, high Zn/Mg mass ratio, and low Cu content. The experimental findings fit well with the results calculated by the homogenization diffusion kinetics equation. The alloy shows an excellent mechanical performance after the short homogenization process followed by hot-extrusion and T6 treatment. Consequently, a good combination of low energy consumption and favorable mechanical properties is obtained.

  5. Synthesis of cyclic sulfites from epoxides and sulfur dioxide with silica-immobilized homogeneous catalysts.

    PubMed

    Takenaka, Yasumasa; Kiyosu, Takahiro; Mori, Goro; Choi, Jun-Chul; Fukaya, Norihisa; Sakakura, Toshiyasu; Yasuda, Hiroyuki

    2012-01-01

    Quaternary ammonium- and amino-functionalized silica catalysts have been prepared for the selective synthesis of cyclic sulfites from epoxides and sulfur dioxide, demonstrating the effects of immobilizing the homogeneous catalysts on silica. The cycloaddition of sulfur dioxide to various epoxides was conducted under solvent-free conditions at 100 °C. The quaternary ammonium- and amino-functionalized silica catalysts produced cyclic sulfites in high yields (79-96 %) that are comparable to those produced by the homogeneous catalysts. The functionalized silica catalysts could be separated from the product solution by filtration, thereby avoiding the catalytic decomposition of the cyclic sulfite products upon distillation of the product solution. Heterogenization of a homogeneous catalyst by immobilization can, therefore, improve the efficiency of the purification of crude reaction products. Despite a decrease in catalytic activity after each recycling step, the heterogeneous pyridine-functionalized silica catalyst provided high yields after as many as five recycling processes.

  6. Creating Maps of Forbush Decreases

    NASA Astrophysics Data System (ADS)

    Santiago, A.; Lara, A.; Niembro, T.

    2013-05-01

    The flux of galactic cosmic rays (GCR) to the inner heliosphere, and in particular to the Earth's surroundings, is modulated by solar activity. On a time scale of hours the GCR flux may diminish abruptly, reach a minimum value and then follow a slow recovery phase lasting one or two days. These so-called Forbush decreases (FD) are caused by large-scale structures of plasma and magnetic field traveling at high speed, i.e. interplanetary coronal mass ejections (ICMEs). Using the new observational capability of imaging interplanetary space (e.g. the STEREO spacecraft) and assuming a direct relationship between density and magnetic field inside ICMEs, in this work we create maps of ICMEs as GCR sinks seen by an observer at the Earth's surface. The objective is to survey the observational requirements of new cosmic ray detectors in order to produce such maps.

  7. Creating genetic resistance to HIV.

    PubMed

    Burnett, John C; Zaia, John A; Rossi, John J

    2012-10-01

    HIV/AIDS remains a chronic and incurable disease, in spite of the notable successes of combination antiretroviral therapy. Gene therapy offers the prospect of creating genetic resistance to HIV that supplants the need for antiviral drugs. In sight of this goal, a variety of anti-HIV genes have reached clinical testing, including gene-editing enzymes, protein-based inhibitors, and RNA-based therapeutics. Combinations of therapeutic genes against viral and host targets are designed to improve the overall antiviral potency and reduce the likelihood of viral resistance. In cell-based therapies, therapeutic genes are expressed in gene modified T lymphocytes or in hematopoietic stem cells that generate an HIV-resistant immune system. Such strategies must promote the selective proliferation of the transplanted cells and the prolonged expression of therapeutic genes. This review focuses on the current advances and limitations in genetic therapies against HIV, including the status of several recent and ongoing clinical studies.

  8. Pi overlapping ring systems contained in a homogeneous assay: a novel homogeneous assay for antigens

    NASA Astrophysics Data System (ADS)

    Kidwell, David A.

    1993-05-01

    A novel immunoassay, Pi overlapping ring systems contained in a homogeneous assay (PORSCHA), is described. This assay relies upon the change in fluorescence spectral properties that pyrene and its derivatives show with varying concentration. Because antibodies and other biomolecules can bind two molecules simultaneously, they can change the local concentration of the molecules that they bind. This concentration change may be detected spectrally as a change in the fluorescence emission wavelength of an appropriately labeled biomolecule. Several tests of PORSCHA have been performed which demonstrate this principle. For example, with streptavidin as the binding biomolecule and a biotin-labeled pyrene derivative, production of the excimer emitting at 470 nm is observed. Without streptavidin present, only the monomer emitting at 378 and 390 nm is observed. The ratio of monomer to excimer provides the concentration of unlabeled biotin in the sample. Approximately 1 ng/mL of biotin may be detected with this system using a 50 µL sample (2 × 10-16 moles of biotin). The principles behind PORSCHA and the results with the streptavidin/biotin system are discussed, and extensions of the PORSCHA concept to antibodies as the binding partner and to DNA in homogeneous assays are suggested.

  9. A FORTRAN program for testing trend and homogeneity in proportions.

    PubMed

    Thakur, A K; Berry, K J; Mielke, P W

    1985-01-01

    A FORTRAN program is provided for testing linear trend and homogeneity in proportions. Trend is evaluated by the Cochran-Armitage method, and homogeneity is tested by an overall χ² test as well as by multiple pairwise comparisons using the Fisher-Irwin exact method. The program should be easy to implement on any computer with a FORTRAN compiler.
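    The Cochran-Armitage trend statistic that such a program computes can be sketched in a few lines (a minimal Python sketch, not the FORTRAN original; the default dose scores 0, 1, 2, … are an assumption, and the homogeneity tests are omitted):

```python
import math

def cochran_armitage_trend(successes, totals, scores=None):
    """Cochran-Armitage test for linear trend in proportions.

    successes[i] / totals[i] is the observed proportion in group i;
    scores[i] is that group's dose score (defaults to 0, 1, 2, ...).
    Returns the approximately standard-normal test statistic Z.
    """
    k = len(successes)
    scores = scores if scores is not None else list(range(k))
    N = sum(totals)
    p_bar = sum(successes) / N  # pooled proportion under H0
    # Numerator: score-weighted deviations from expected counts
    t = sum(x * (r - n * p_bar)
            for x, r, n in zip(scores, successes, totals))
    # Variance of the numerator under H0 (no trend)
    var = p_bar * (1 - p_bar) * (
        sum(n * x * x for x, n in zip(scores, totals))
        - sum(n * x for x, n in zip(scores, totals)) ** 2 / N)
    return t / math.sqrt(var)

# Example: proportions 0.1, 0.5, 0.9 rising with dose -> strong trend
z = cochran_armitage_trend([1, 5, 9], [10, 10, 10])
```

A two-sided p-value follows from the standard normal distribution of Z; the overall homogeneity test would be an ordinary chi-square test on the same 2×k table.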

  10. Converting homogeneous to heterogeneous in electrophilic catalysis using monodisperse metal nanoparticles.

    PubMed

    Witham, Cole A; Huang, Wenyu; Tsung, Chia-Kuang; Kuhn, John N; Somorjai, Gabor A; Toste, F Dean

    2010-01-01

    A continuing goal in catalysis is to unite the advantages of homogeneous and heterogeneous catalytic processes. To this end, nanoparticles represent a new frontier in heterogeneous catalysis, where this unification can also be supplemented by the ability to obtain new or divergent reactivity and selectivity. We report a novel method for applying heterogeneous catalysts to known homogeneous catalytic reactions through the design and synthesis of electrophilic platinum nanoparticles. These nanoparticles are selectively oxidized by the hypervalent iodine species PhICl2, and catalyse a range of π-bond activation reactions previously only catalysed through homogeneous processes. Multiple experimental methods are used to unambiguously verify the heterogeneity of the catalytic process. The discovery of treatments for nanoparticles that induce the desired homogeneous catalytic activity should lead to the further development of reactions previously inaccessible in heterogeneous catalysis. Furthermore, a size and capping agent study revealed that Pt PAMAM dendrimer-capped nanoparticles demonstrate superior activity and recyclability compared with larger, polymer-capped analogues.

  11. Converting homogeneous to heterogeneous in electrophilic catalysis using monodisperse metal nanoparticles

    NASA Astrophysics Data System (ADS)

    Witham, Cole A.; Huang, Wenyu; Tsung, Chia-Kuang; Kuhn, John N.; Somorjai, Gabor A.; Toste, F. Dean

    2010-01-01

    A continuing goal in catalysis is to unite the advantages of homogeneous and heterogeneous catalytic processes. To this end, nanoparticles represent a new frontier in heterogeneous catalysis, where this unification can also be supplemented by the ability to obtain new or divergent reactivity and selectivity. We report a novel method for applying heterogeneous catalysts to known homogeneous catalytic reactions through the design and synthesis of electrophilic platinum nanoparticles. These nanoparticles are selectively oxidized by the hypervalent iodine species PhICl2, and catalyse a range of π-bond activation reactions previously only catalysed through homogeneous processes. Multiple experimental methods are used to unambiguously verify the heterogeneity of the catalytic process. The discovery of treatments for nanoparticles that induce the desired homogeneous catalytic activity should lead to the further development of reactions previously inaccessible in heterogeneous catalysis. Furthermore, a size and capping agent study revealed that Pt PAMAM dendrimer-capped nanoparticles demonstrate superior activity and recyclability compared with larger, polymer-capped analogues.

  12. Creating bicultural experiences in nursing.

    PubMed

    Hezekiah, J

    1993-01-01

    This article describes the process (activities) involved in helping Registered Nurse students from Pakistan in an international health project adjust to Canadian culture and readjust to their home culture. The process, involving both structured and informal activities in Pakistan and in Canada, was designed to assist the students in adapting to both the foreign and home cultures, and drew on both human and material resources. Predeparture and reentry workshops, support systems in the form of Karachi-based faculty advisers, and intensive orientation programs were identified as important factors in the students' adjustment. PMID:8227597

  13. Dynamics of spiking neurons: between homogeneity and synchrony.

    PubMed

    Rangan, Aaditya V; Young, Lai-Sang

    2013-06-01

    Randomly connected networks of neurons driven by Poisson inputs are often assumed to produce "homogeneous" dynamics, characterized by largely independent firing and approximable by diffusion processes. At the same time, it is well known that such networks can fire synchronously. Between these two much-studied scenarios lies a vastly complex dynamical landscape that is relatively unexplored. In this paper, we discuss a phenomenon that commonly manifests in these intermediate regimes, namely brief spurts of spiking activity which we call multiple firing events (MFEs). These events depend neither on structured network architecture nor on structured input; they are an emergent property of the system. We came upon them in an earlier modeling paper, in which we discovered, through a careful benchmarking process, that MFEs are the single most important dynamical mechanism behind many of the V1 phenomena we were able to replicate. In this paper we explain in a simpler setting how MFEs come about, as well as their potential dynamic consequences. Although the mechanism underlying MFEs cannot easily be captured by current population dynamics models, this phenomenon should not be ignored during analysis; there is a growing body of evidence that such collaborative activity may be key to unlocking the possible functional properties of many neuronal networks. PMID:23096934
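
    The Poisson-driven "homogeneous" regime the abstract starts from can be illustrated with a single leaky integrate-and-fire neuron receiving Poisson input. This is a minimal sketch, not the network model from the paper; all parameter values are illustrative assumptions:

```python
import random

def lif_poisson(rate_hz=800.0, w=0.5, tau_m=20.0, v_th=5.0, v_reset=0.0,
                dt=0.1, t_max_ms=1000.0, seed=1):
    """Leaky integrate-and-fire neuron driven by a Poisson spike train.
    Returns the number of output spikes in t_max_ms. Units: ms and
    arbitrary voltage; parameters are illustrative, not from the paper."""
    rng = random.Random(seed)
    v, spikes = 0.0, 0
    p_in = rate_hz * dt / 1000.0       # input spike probability per time step
    for _ in range(int(t_max_ms / dt)):
        v += -(v / tau_m) * dt         # leak toward rest
        if rng.random() < p_in:        # Poisson input arrival
            v += w
        if v >= v_th:                  # threshold crossing -> output spike
            spikes += 1
            v = v_reset
    return spikes

spikes = lif_poisson()
```

    In a full network each neuron would also receive recurrent input from the others; MFEs arise precisely when that recurrent coupling briefly correlates firing, which this single-neuron sketch by construction cannot show.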

  14. Improving non-homogeneous regression for probabilistic precipitation forecasts

    NASA Astrophysics Data System (ADS)

    Presser, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2016-04-01

    Non-homogeneous regression is a state-of-the-art ensemble post-processing technique that statistically corrects ensemble forecasts and predicts a full probability distribution. Originally, a Gaussian model is employed that linearly links the predicted distribution mean and variance to the ensemble mean and variance, respectively. For non-normally distributed precipitation data, this model can be censored at zero to account for periods without precipitation. We improve this regression approach in several directions. First, we consider link functions in the variance sub-model that assure positivity of the model variance. Second, we consider a censored logistic (instead of censored Gaussian) distribution to accommodate more frequent events with high precipitation. Third, we introduce a splitting procedure, which appropriately accounts for perfect-prediction cases, i.e., where no precipitation is observed when all ensemble members predict no precipitation. The approach is applied to different accumulation periods (3, 6, 12, 24 hours) for short-range precipitation forecasts in Northern Italy. The choice of link function for the variance parameter, the splitting procedure, and an appropriate distributional assumption for precipitation data significantly improve the probabilistic forecast skill, especially for shorter accumulation periods. KEYWORDS: heteroscedastic ensemble post-processing, censored distribution, maximum likelihood estimation, probabilistic precipitation forecasting
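
    The predictive step of censored non-homogeneous regression can be sketched in a few lines: the mean is linear in the ensemble mean, a log link keeps the variance positive (the first improvement described above), and censoring at zero yields a probability of precipitation. A minimal Python sketch; the coefficients are illustrative placeholders, not values fitted in the study:

```python
from math import exp, log, sqrt, erf

def norm_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def censored_gaussian_forecast(ens_mean, ens_sd, a, b, c, d):
    """Predictive censored-at-zero Gaussian from non-homogeneous regression.
    mu is linear in the ensemble mean; a log link on the scale parameter
    guarantees sigma > 0. Coefficients a, b, c, d would normally be fitted
    by maximum likelihood on training data; here they are placeholders.
    Returns (mu, sigma, probability of nonzero precipitation)."""
    mu = a + b * ens_mean
    sigma = exp(c + d * log(ens_sd))    # log link ensures positivity
    pop = norm_cdf(mu / sigma)          # P(precip > 0) under censoring at 0
    return mu, sigma, pop

# illustrative ensemble summary: mean 5 mm, spread 2 mm
mu, sigma, pop = censored_gaussian_forecast(5.0, 2.0, 0.1, 0.9, 0.0, 1.0)
```

    Swapping the Gaussian CDF for a logistic CDF gives the censored-logistic variant, and the splitting procedure would route all-members-dry cases to a separate model before this predictive step.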

  15. Fuel mixture stratification as a method for improving homogeneous charge compression ignition engine operation

    DOEpatents

    Dec, John E.; Sjoberg, Carl-Magnus G.

    2006-10-31

    A method for slowing the heat-release rate in homogeneous charge compression ignition ("HCCI") engines that allows operation without excessive knock at higher engine loads than are possible with conventional HCCI. This method comprises injecting the fuel in a manner that creates a stratified charge in the engine cylinder, providing a range of fuel concentrations in the in-cylinder gases (typically with enough oxygen for complete combustion), and using a two-stage-ignition fuel with appropriate cool-flame chemistry so that regions of different fuel concentration autoignite sequentially.

  16. Creating universes with thick walls

    NASA Astrophysics Data System (ADS)

    Ulvestad, Andrew; Albrecht, Andreas

    2012-05-01

    We study the dynamics of a spherically symmetric false vacuum bubble embedded in a true vacuum region separated by a "thick wall", which is generated by a scalar field in a quartic potential. We study the "Farhi-Guth-Guven" (FGG) quantum tunneling process by constructing numerical solutions relevant to this process. The Arnowitt-Deser-Misner mass of the spacetime is calculated, and we show that there is a lower bound that is a significant fraction of the scalar field mass. We argue that the zero-mass solutions used by some to argue against the physicality of the FGG process are artifacts of the thin-wall approximation used in earlier work, and that they should not be used to question the viability of the FGG process.

  17. Creating a winning organizational culture.

    PubMed

    Campbell, Robert James

    2009-01-01

    This article explores how to create a winning organizational culture. By definition, a winning organizational culture is one that is able to make current innovations stick, while continuously changing based on the demands of the marketplace. More importantly, the article explores the notion that a winning organizational culture can have a profound impact on the consciousness of the workforce, helping each individual become a better, more productive person who provides important services and products to the community. To form a basis for defining the structure of a winning organizational culture, 4 experts were asked 12 questions related to the development of an organizational culture. Three of the experts have worked intimately within the health care industry, while a fourth has been charged with turning around an organization that has had a losing culture for 17 years. The article provides insight into the role that values, norms, goals, leadership style, familiarity, and hiring practices play in developing a winning organizational culture. The article also emphasizes the important role that leaders perform in developing an organizational culture.

  18. Laser Created Relativistic Positron Jets

    SciTech Connect

    Chen, H; Wilks, S C; Meyerhofer, D D; Bonlie, J; Chen, C D; Chen, S N; Courtois, C; Elberson, L; Gregori, G; Kruer, W; Landoas, O; Mithen, J; Murphy, C; Nilson, P; Price, D; Scheider, M; Shepherd, R; Stoeckl, C; Tabak, M; Tommasini, R; Beiersdorder, P

    2009-10-08

    Electron-positron jets with MeV temperature are thought to be present in a wide variety of astrophysical phenomena such as active galaxies, quasars, gamma ray bursts and black holes. They have now been created in the laboratory in a controlled fashion by irradiating a gold target with an intense picosecond duration laser pulse. About 10^11 MeV positrons are emitted from the rear surface of the target in a 15- to 22-degree cone for a duration comparable to the laser pulse. These positron jets are quasi-monoenergetic (E/ΔE ≈ 5) with peak energies controllable from 3-19 MeV. They have temperatures from 1-4 MeV in the beam frame in both the longitudinal and transverse directions. Positron production has been studied extensively in recent decades at low energies (sub-MeV) in areas related to surface science, positron emission tomography, basic antimatter science such as antihydrogen experiments, Bose-Einstein condensed positronium, and basic plasma physics. However, the experimental tools to produce very high temperature positrons and high-flux positron jets needed to simulate astrophysical positron conditions have so far been absent. The MeV temperature jets of positrons and electrons produced in our experiments offer a first step to evaluate the physics models used to explain some of the most energetic phenomena in the universe.

  19. Creating experimental color harmony map

    NASA Astrophysics Data System (ADS)

    Chamaret, Christel; Urban, Fabrice; Lepinel, Josselin

    2014-02-01

    Starting in the 17th century with Newton, color harmony is a topic that has not reached a consensus on definition, representation or modeling so far. Previous work highlighted specific characteristics of color harmony for combinations of color doublets or triplets by means of human ratings on a harmony scale. However, no investigation has involved complex stimuli or pointed out how harmony is spatially located within a picture. The modeling of such a concept, as well as a reliable ground-truth, would be of high value, since the applications are wide and concern several communities, from psychology to computer graphics. We propose a protocol for creating color harmony maps from a controlled experiment. Through an eye-tracking protocol, we focus on the identification of disharmonious colors in pictures. The experiment was composed of a free-viewing pass, in order to let the observer become familiar with the content, before a second pass where we asked observers "to search for the most disharmonious areas in the picture". Twenty-seven observers participated in the experiment, which comprised a total of 30 different stimuli. The high inter-observer agreement as well as a cross-validation confirm the validity of the proposed ground-truth.

  20. Creating a urine black hole

    NASA Astrophysics Data System (ADS)

    Hurd, Randy; Pan, Zhao; Meritt, Andrew; Belden, Jesse; Truscott, Tadd

    2015-11-01

    Since the mid-nineteenth century, both enlisted and fashion-conscious owners of khaki trousers have been plagued by undesired speckle patterns resulting from splash-back while urinating. In recent years, industrial designers and hygiene-driven entrepreneurs have sought to limit this splashing by creating urinal inserts, with the effectiveness of their inventions varying drastically. From this large assortment of inserts, designs consisting of macroscopic pillar arrays seem to be the most effective splash suppressors. Interestingly, this design partially mimics the geometry of the water-capturing moss Syntrichia caninervis, which exhibits a notable ability to suppress splash and quickly absorb water from impacting rain droplets. With this natural splash suppressor in mind, we search for the ideal urine black hole by performing experiments of simulated urine streams (water droplet streams) impacting macroscopic pillar arrays with varying parameters, including pillar height and spacing, draining, and material properties. We propose improved urinal insert designs based on our experimental data in hopes of reducing the potential embarrassment inherent in wearing khakis.