Science.gov

Sample records for process creates homogenous

  1. Creating a Flexible Budget Process

    ERIC Educational Resources Information Center

    Frew, James; Olson, Robert; Pelton, M. Lee

    2009-01-01

    The budget process is often an especially thorny area in communication between administrators and faculty members. Last year, Willamette University took a step toward reducing tensions surrounding the budget. As university administrators planned for the current year, they faced the high degree of uncertainty that the financial crisis has forced on…

  2. Web Pages Created Via SCID Process.

    ERIC Educational Resources Information Center

    Stammen, Ronald M.

    This paper describes the use of a management process, Systematic Curriculum and Instructional Development (SCID), for developing online multimedia modules. The project, "Collaboratively Creating Multimedia Modules for Teachers and Professors," was funded by the USWEST Foundation. The curriculum development process involved teams of experts in…

  3. Pattern and process of biotic homogenization in the New Pangaea.

    PubMed

    Baiser, Benjamin; Olden, Julian D; Record, Sydne; Lockwood, Julie L; McKinney, Michael L

    2012-12-01

    Human activities have reorganized the earth's biota, resulting in spatially disparate locales becoming more or less similar in species composition over time through the processes of biotic homogenization and biotic differentiation, respectively. Despite mounting evidence suggesting that these processes may be widespread in both aquatic and terrestrial systems, past studies have predominantly focused on single taxonomic groups at a single spatial scale. Furthermore, change in pairwise similarity is itself dependent on two distinct processes: spatial turnover in species composition and changes in gradients of species richness. Most past research has failed to disentangle the effects of these two mechanisms on homogenization patterns. Here, we use recent statistical advances and collate a global database of homogenization studies (20 studies, 50 datasets) to provide the first global investigation of the homogenization process across major faunal and floral groups and to elucidate the relative roles of changes in species richness and turnover. We found evidence of homogenization (change in similarity ranging from -0.02 to 0.09) across nearly all taxonomic groups, spatial extents, and grain sizes. Partitioning of the change in pairwise similarity shows that the overall change in community similarity is driven by changes in species richness. Our results show that biotic homogenization is truly a global phenomenon and call into question many of the ecological mechanisms invoked in previous studies to explain patterns of homogenization. PMID:23055062
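
    The pairwise-similarity change at the heart of this analysis can be illustrated with a toy Jaccard calculation (the study itself uses more elaborate partitioning of similarity into turnover and richness components); the species lists and invader name below are invented:

```python
def jaccard(a, b):
    """Jaccard similarity of two species sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

# Hypothetical species lists for two sites, before and after change.
site1_before = {"sp1", "sp2", "sp3"}
site2_before = {"sp3", "sp4", "sp5"}
site1_after = {"sp1", "sp2", "sp3", "inv1"}   # shared invader gained
site2_after = {"sp3", "sp5", "inv1"}          # invader gained, sp4 lost

# Positive change in pairwise similarity indicates biotic homogenization;
# negative change would indicate biotic differentiation.
delta = jaccard(site1_after, site2_after) - jaccard(site1_before, site2_before)
print(round(delta, 3))  # → 0.2
```

A full analysis would average such changes over all site pairs and then decompose them into turnover and richness-gradient contributions.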

  4. Spoken Word Processing Creates a Lexical Bottleneck

    ERIC Educational Resources Information Center

    Cleland, Alexandra A.; Tamminen, Jakke; Quinlan, Philip T.; Gaskell, M. Gareth

    2012-01-01

    We report 3 experiments that examined whether presentation of a spoken word creates an attentional bottleneck associated with lexical processing in the absence of a response to that word. A spoken word and a visual stimulus were presented in quick succession, but only the visual stimulus demanded a response. Response times to the visual stimulus…

  5. Creating Only Isotropic Homogeneous Turbulence in Liquid Helium near Absolute Zero

    NASA Astrophysics Data System (ADS)

    Ihas, G. G.; Thompson, K. J.; Labbe, G.; McClintock, P. V. E.

    2012-02-01

    Flow through a grid is a standard method of producing isotropic, homogeneous turbulence for laboratory study. This technique has been used to generate quantum turbulence (QT) above 1 K in superfluid helium [S. R. Stalp, L. Skrbek, and R. J. Donnelly, Phys. Rev. Lett. 82, 4831 (1999)], where QT seems to mimic classical turbulence. Efforts have been made recently [G. G. Ihas, G. Labbe, S-c. Liu, and K. J. Thompson, J. Low Temp. Phys. 150, 384 (2008)] to make similar measurements near absolute zero, where there is an almost total absence of normal fluid and hence of classical viscosity. This presents the difficulty that most motive-force devices produce heat, which overwhelms the phenomena being investigated. The process of designing and implementing a "dissipation-free" motor for pulling a grid through superfluid helium at millikelvin temperatures has resulted in the development of new techniques with broad application in low-temperature research. Some of these, such as Meissner-effect magnetic drives, capacitive and inductive position sensors, and magnetic centering devices, will be described. Heating results for devices that can move in a controlled fashion from very low speed up to 10 cm/s will be presented. Acknowledgement: We thank W.F. Vinen for many useful discussions.

  6. Process to create simulated lunar agglutinate particles

    NASA Technical Reports Server (NTRS)

    Gustafson, Robert J. (Inventor); Gustafson, Marty A. (Inventor); White, Brant C. (Inventor)

    2011-01-01

    A method of creating simulated agglutinate particles by applying a heat source sufficient to partially melt a raw material is provided. The raw material is preferably any lunar soil simulant, crushed mineral, mixture of crushed minerals, or similar material, and the heat source creates localized heating of the raw material.

  7. Creep rupture as a non-homogeneous Poissonian process

    PubMed Central

    Danku, Zsuzsa; Kun, Ferenc

    2013-01-01

    Creep rupture of heterogeneous materials occurring under constant sub-critical external loads is responsible for the collapse of engineering constructions and for natural catastrophes. Acoustic monitoring of crackling bursts provides microscopic insight into the failure process. Based on a fiber bundle model, we show that the accelerating bursting activity when approaching failure can be described by the Omori law. For long-range load redistribution the time series of bursts proved to be a non-homogeneous Poissonian process with power-law distributed burst sizes and waiting times. We demonstrate that limitations of experiments, such as a finite detection threshold and finite time resolution, have striking effects on the characteristic exponents, which have to be taken into account when comparing model calculations with experiments. Recording events solely within the Omori time to failure, the size distribution of bursts has a crossover to a lower exponent, which is promising for forecasting imminent catastrophic failure. PMID:24045539
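
    The non-homogeneous Poissonian picture with an Omori-type acceleration toward failure can be sketched with the Lewis-Shedler thinning algorithm; the failure time, rate prefactor, and observation cutoff below are invented for illustration:

```python
import random

random.seed(1)

# Burst occurrences as a non-homogeneous Poisson process whose rate
# accelerates toward the failure time t_f following an Omori-type law.
t_f = 100.0

def rate(t):
    return 50.0 / (t_f - t)   # events per unit time, diverging at t_f

# Thinning: simulate on [0, t_end] with t_end short of t_f so the rate
# stays bounded by rate_max on the whole window.
t, t_end = 0.0, 99.0
rate_max = rate(t_end)
events = []
while t < t_end:
    t += random.expovariate(rate_max)    # candidate arrival at bounding rate
    if t < t_end and random.random() < rate(t) / rate_max:
        events.append(t)                 # accept with prob. rate(t)/rate_max

early = sum(1 for e in events if e < t_end / 2)
late = len(events) - early
print(early, late)   # activity concentrates in the half nearer to failure
```

The expected counts follow the integral of the rate: far more events fall in the later half of the window, mirroring the accelerating bursting activity described above.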

  8. On homogenization of diffusion processes in microperiodic stratified bodies

    SciTech Connect

    Matysiak, S.J.; Mieszkowski, R.

    1999-05-01

    The bodies with microperiodic layered structures can be made by man (laminated composites) or can be found in nature (warved clays, sandstone-slates, sandstone-shales, thin-layered limestones). The knowledge of diffusion processes in the microperiodic stratified bodies is very important in the chemical engineering, material technology and environmental engineering. The warved clays are applied as natural barriers in the construction of waste dumps. Here, the aim of this contribution is to present the homogenized model of the diffusion processes in microperiodic stratified bodies. The considerations are based on the linear Fick`s theory of diffusion and the procedure of microlocal modeling. The obtained model takes into account certain microlocal structure of the body. As the illustration of the application of presented model, a simple example is given.

  9. Experimenting With Ore: Creating the Taconite Process; flow chart of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Experimenting With Ore: Creating the Taconite Process; flow chart of process - Mines Experiment Station, University of Minnesota, Twin Cities Campus, 56 East River Road, Minneapolis, Hennepin County, MN

  10. Competing Contact Processes on Homogeneous Networks with Tunable Clusterization

    NASA Astrophysics Data System (ADS)

    Rybak, Marcin; Kułakowski, Krzysztof

    2013-03-01

    We investigate two homogeneous networks: the Watts-Strogatz network with mean degree ⟨k⟩ = 4 and the Erdös-Rényi network with ⟨k⟩ = 10. In both kinds of networks, the clustering coefficient C is a tunable control parameter. The network is the arena of two competing contact processes, in which nodes can be in one of two states, S or D. A node in state S becomes D with probability 1 if at least two of its mutually linked neighbors are D. A node in state D becomes S with a given probability p if at least one of its neighbors is S. The competition between the processes is described by a phase diagram in which the critical probability pc depends on the clustering coefficient C. For p > pc the fraction of nodes in state S increases in time, apparently coming to dominate the whole system. Below pc, the majority of nodes are in the D state. The numerical results indicate that for the Watts-Strogatz network the D-process is activated at a finite value of the clustering coefficient C, close to 0.3. By contrast, for the Erdös-Rényi network the transition is observed over the whole investigated range of C.
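
    The two update rules can be sketched as follows, assuming a plain ring lattice with ⟨k⟩ = 4 (a Watts-Strogatz network before rewiring, so triangles are present); the network size, p, and run length are invented:

```python
import random

random.seed(0)

# Ring lattice with mean degree 4: each node links to its two nearest
# neighbors on each side. This is an assumed, simplified topology.
N = 200
neigh = {i: {(i - 2) % N, (i - 1) % N, (i + 1) % N, (i + 2) % N}
         for i in range(N)}

def step(state, p):
    new = list(state)
    for i in range(N):
        if state[i] == "S":
            # S -> D with probability 1 if at least two D-neighbors of i
            # are themselves mutually linked (i.e., close a triangle with i)
            d_nb = [j for j in neigh[i] if state[j] == "D"]
            if any(b in neigh[a]
                   for idx, a in enumerate(d_nb) for b in d_nb[idx + 1:]):
                new[i] = "D"
        elif any(state[j] == "S" for j in neigh[i]) and random.random() < p:
            # D -> S with probability p if at least one neighbor is S
            new[i] = "S"
    return new

state = ["S" if random.random() < 0.5 else "D" for _ in range(N)]
for _ in range(100):
    state = step(state, p=0.9)   # large p should favor the S-process
print(state.count("S"), state.count("D"))
```

Sweeping p (and rewiring the lattice to tune C) would trace out the phase diagram described in the abstract.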

  11. Creating Documentary Theatre as Educational Process.

    ERIC Educational Resources Information Center

    Hirschfeld-Medalia, Adeline

    With the celebration of the United States bicentennial as impetus, university students and faculty attempted several approaches to the creation of a touring documentary production composed almost completely from primary sources. This paper describes the process involved in producing a traveling show which featured groups relatively excluded from…

  12. Can An Evolutionary Process Create English Text?

    SciTech Connect

    Bailey, David H.

    2008-10-29

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).

  13. [Chemiluminescence spectroscopic analysis of homogeneous charge compression ignition combustion processes].

    PubMed

    Liu, Hai-feng; Yao, Ming-fa; Jin, Chao; Zhang, Peng; Li, Zhe-ming; Zheng, Zun-qing

    2010-10-01

    To study the combustion reaction kinetics of homogeneous charge compression ignition (HCCI) under different port injection strategies and intake temperature conditions, tests were carried out on a modified single-cylinder optical engine using chemiluminescence spectroscopic analysis. The experimental conditions were: constant fuel mass; n-heptane as the fuel; an engine speed of 600 r x min(-1); an inlet pressure of 0.1 MPa; and inlet temperatures of 95 degrees C and 125 degrees C, respectively. The chemiluminescence spectra show that the emission is quite faint during low temperature heat release (LTHR), and these band spectra originate from formaldehyde (CH2O) chemiluminescence. During the phase spanning later LTHR, the negative temperature coefficient (NTC) region, and early high temperature heat release (HTHR), the band spectra also originate from formaldehyde (CH2O) chemiluminescence. The CO--O* continuum is strong during HTHR, with radicals such as OH, HCO, CH and CH2O superimposed on it. After the HTHR, the chemiluminescence intensity is again quite faint. Compared with a start of injection (SOI) of -30 degrees ATDC, the chemiluminescence intensity is higher under the SOI = -300 degrees ATDC condition owing to more intense emission of the CO--O* continuum. More HCO and OH radicals are also formed, indicating a more intense combustion reaction. Similarly, a more intense CO--O* continuum and more HCO and OH radicals are observed at the higher intake temperature. PMID:21137383

  14. A Tool for Creating Healthier Workplaces: The Conducivity Process

    ERIC Educational Resources Information Center

    Karasek, Robert A.

    2004-01-01

    The conducivity process, a methodology for creating healthier workplaces by promoting conducive production, is illustrated through the use of the "conducivity game" developed in the NordNet Project in Sweden, which was an action research project to test a job redesign methodology. The project combined the "conducivity" hypotheses about a…

  15. A study of the role of homogeneous process in heterogeneous high explosives

    SciTech Connect

    Tang, P.K.

    1993-05-01

    In a new hydrodynamic formulation of shock-induced chemical reaction, we show formally that certain homogeneous reaction characteristics become more evident as shock pressure increases, even in heterogeneous high explosives. The homogeneous reaction pathway includes nonequilibrium excitation and deactivation stages prior to chemical reaction. The excitation process leads to an intermediate state at a higher energy level than the equilibrium state; as a result, the effective activation energy appears lower than the value based on thermal experiments. As the pressure increases further, the homogeneous reaction can even surpass the heterogeneous process and become the dominant mechanism.

  16. A study of the role of homogeneous process in heterogeneous high explosives

    SciTech Connect

    Tang, P.K.

    1993-01-01

    In a new hydrodynamic formulation of shock-induced chemical reaction, we show formally that certain homogeneous reaction characteristics become more evident as shock pressure increases, even in heterogeneous high explosives. The homogeneous reaction pathway includes nonequilibrium excitation and deactivation stages prior to chemical reaction. The excitation process leads to an intermediate state at a higher energy level than the equilibrium state; as a result, the effective activation energy appears lower than the value based on thermal experiments. As the pressure increases further, the homogeneous reaction can even surpass the heterogeneous process and become the dominant mechanism.

  17. Design and fabrication of optical homogenizer with micro structure by injection molding process

    NASA Astrophysics Data System (ADS)

    Chen, C.-C. A.; Chang, S.-W.; Weng, C.-J.

    2008-08-01

    This paper presents the design and fabrication of an optical homogenizer with a hybrid design of collimator, toroidal lens array, and projection lens for shaping a Gaussian beam into a uniform cylindrical beam. TracePro software was used to design the geometry of the homogenizer, and injection molding simulation was carried out in Moldflow MPI to evaluate the mold design for the injection molding process. The optical homogenizer is a cylindrical part with a thickness of 8.03 mm and a diameter of 5 mm. The toroidal-array microstructure has groove heights designed from 12 μm to 99 μm. An electric injection molding machine and PMMA (n = 1.4747) were selected for the experiment. Experimental results show that the optical homogenizer achieved a transfer ratio of grooves (TRG) of 88.98%, an optical uniformity of 68%, and an optical efficiency of 91.88%. Future work will focus on development of an optical homogenizer for LED light sources.

  18. Effect of homogenization process on the hardness of Zn-Al-Cu alloys

    NASA Astrophysics Data System (ADS)

    Villegas-Cardenas, Jose D.; Saucedo-Muñoz, Maribel L.; Lopez-Hirata, Victor M.; De Ita-De la Torre, Antonio; Avila-Davila, Erika O.; Gonzalez-Velazquez, Jorge Luis

    2015-10-01

    The effect of a homogenizing treatment on the hardness of as-cast Zn-Al-Cu alloys was investigated. Eight alloy compositions were prepared and homogenized at 350 °C for 180 h, and their Rockwell "B" hardness was subsequently measured. All the specimens were analyzed by X-ray diffraction and metallographically prepared for observation by optical microscopy and scanning electron microscopy. The results of the present work indicated that the hardness of both alloys (as-cast and homogenized) increased with increasing Al and Cu contents; this increased hardness is likely related to the presence of the θ and τ' phases. A regression equation was obtained to determine the hardness of the homogenized alloys as a function of their chemical composition and processing parameters, such as homogenization time and temperature, used in their preparation.
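
    A hardness-versus-composition regression of the kind mentioned above can be sketched with ordinary least squares on synthetic data; the composition ranges, coefficients, and noise level below are invented, not the paper's:

```python
import numpy as np

# Synthetic illustration: hardness rising with Al and Cu content (made up).
rng = np.random.default_rng(0)
n = 20
al = rng.uniform(5, 40, size=n)    # wt.% Al (assumed range)
cu = rng.uniform(1, 10, size=n)    # wt.% Cu (assumed range)
hrb = 40 + 0.8 * al + 2.5 * cu + rng.normal(0, 1.0, size=n)  # HRB, made up

# Least-squares fit of HRB = b0 + b1*Al + b2*Cu
X = np.column_stack([np.ones(n), al, cu])
coef, *_ = np.linalg.lstsq(X, hrb, rcond=None)
b0, b1, b2 = coef
print(f"HRB ~ {b0:.1f} + {b1:.2f}*Al + {b2:.2f}*Cu")
```

The paper's actual equation additionally includes processing parameters (homogenization time and temperature) as regressors, which would simply add columns to X.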

  19. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study investigated first the homogenization kinetics of the different systems to focus then on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was equally sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for development and production of pharmaceutical suspensions in the future. PMID:21412782

  20. Process to Create High-Fidelity Lunar Dust Simulants

    NASA Technical Reports Server (NTRS)

    Gustafson, Robert

    2010-01-01

    A method was developed to create high-fidelity lunar dust simulants that better match the unique properties of lunar dust than the existing simulants. The new dust simulant is designed to more closely approximate the size, morphology, composition, and other important properties of lunar dust (including the presence of nanophase iron). A two-step process is required to create this dust simulant. The first step is to prepare a feedstock material that contains a high percentage of agglutinate-like particles with iron globules (including nanophase iron). The raw material selected must have the proper mineralogical composition. In the second processing step, the feedstock material from the first step is jet-milled to reduce the particle size to a range consistent with lunar dust.

  1. Parallel-Processing Software for Creating Mosaic Images

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Deen, Robert; McCauley, Michael; DeJong, Eric

    2008-01-01

    A computer program implements parallel processing for nearly real-time creation of panoramic mosaics of images of terrain acquired by video cameras on an exploratory robotic vehicle (e.g., a Mars rover). Because the original images are typically acquired at various camera positions and orientations, it is necessary to warp the images into the reference frame of the mosaic before stitching them together to create the mosaic. [Also see "Parallel-Processing Software for Correlating Stereo Images," Software Supplement to NASA Tech Briefs, Vol. 31, No. 9 (September 2007) page 26.] The warping algorithm in this computer program reflects the considerations that (1) for every pixel in the desired final mosaic, a good corresponding point must be found in one or more of the original images and (2) for this purpose, one needs a good mathematical model of the cameras and a good correlation of individual pixels with respect to their positions in three dimensions. The desired mosaic is divided into slices, each of which is assigned to one of a number of central processing units (CPUs) operating simultaneously. The results from the CPUs are gathered and placed into the final mosaic. The time taken to create the mosaic depends upon the number of CPUs, the speed of each CPU, and whether a local or a remote data-staging mechanism is used.
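
    The slice-and-gather scheme described above can be sketched as follows; the dimensions, worker count, and stubbed-out warp are invented, and threads stand in for the CPUs of the original system:

```python
from concurrent.futures import ThreadPoolExecutor

# The mosaic is divided into row slices, each assigned to one worker; the
# results are gathered into the final mosaic. The warp itself is stubbed
# out (a real implementation needs a camera model and pixel correlation).
WIDTH, HEIGHT, N_WORKERS = 8, 6, 3

def fill_slice(rows):
    lo, hi = rows
    # Placeholder "warp": each pixel records its own mosaic coordinates.
    return [[(r, c) for c in range(WIDTH)] for r in range(lo, hi)]

def make_mosaic():
    step = HEIGHT // N_WORKERS
    slices = [(i, min(i + step, HEIGHT)) for i in range(0, HEIGHT, step)]
    with ThreadPoolExecutor(max_workers=N_WORKERS) as ex:
        parts = list(ex.map(fill_slice, slices))    # one slice per worker
    return [row for part in parts for row in part]  # gather into the mosaic

mosaic = make_mosaic()
print(len(mosaic), len(mosaic[0]))  # → 6 8
```

Because the slices are independent, wall-clock time scales with the number of workers, matching the abstract's note that mosaic creation time depends on the number and speed of CPUs.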

  2. Creating "Intelligent" Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, Noel; Taylor, Patrick

    2014-05-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is used to add value to individual model projections and construct a consensus projection. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, individual models reproduce certain climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequal weighting multi-model ensembles. The intention is to produce improved ("intelligent") unequal-weight ensemble averages. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables—e.g., outgoing longwave radiation and surface temperature. Several climate process metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument in combination with surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing the equal-weighted ensemble average and an ensemble weighted using the process-based metric. Additionally, this study investigates the dependence of the metric weighting scheme on the climate state using a combination of model simulations including a non-forced preindustrial control experiment, historical simulations, and
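
    The unequal-weighting idea can be sketched as follows; the model names, projections, the "observed" process metric (a slope between two related variables), and the inverse-error weighting rule are all invented for illustration:

```python
# Models are scored by how well a process-based metric matches observations;
# weights are taken proportional to 1/error (an assumed choice).
obs_slope = 2.0                              # "observed" slope (made up)
models = {
    "modelA": {"proj": 3.1, "slope": 2.1},   # projection + diagnosed slope
    "modelB": {"proj": 4.0, "slope": 1.2},   # (all values made up)
    "modelC": {"proj": 2.5, "slope": 1.9},
}

errors = {m: abs(v["slope"] - obs_slope) for m, v in models.items()}
weights = {m: 1.0 / (e + 1e-6) for m, e in errors.items()}
total = sum(weights.values())

weighted_avg = sum(weights[m] * models[m]["proj"] for m in models) / total
equal_avg = sum(v["proj"] for v in models.values()) / len(models)
print(round(equal_avg, 3), round(weighted_avg, 3))
```

Here the poorly performing modelB is down-weighted, pulling the weighted average away from the equal-weighted one, which is the kind of regional difference the study reports.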

  3. Analysis of daily rainfall processes in lower extremadura (Spain) and homogenization of the data

    NASA Astrophysics Data System (ADS)

    Garcia, J. A.; Marroquin, A.; Garrido, J.; Mateos, V. L.

    1995-03-01

    In this paper we analyze, from the point of view of stochastic processes, daily rainfall data recorded at the Badajoz Observatory (Southwestern Spain) since the beginning of the century. We attempt to identify any periodicities or trends in the daily rainfall occurrences and their dependence structure, and to select an appropriate point stochastic model for the daily rainfall series. Standard regression analysis, graphical methods and the Cramer statistic show a rise in the number of cases of light rain (between 0.1 and 5 mm/d) and a decline in the number of cases of moderate to heavy rain (> 5 mm/d) in the daily rainfall, at least at the 5% significance level. The adequacy of the homogenization process is confirmed by the mean interarrival time of the homogenized series and by a test of the rate of homogenized daily rainfall occurrences. Our analysis also shows that the spectral behavior of the homogenized daily rainfall counts is completely different from that of a Poisson process, so the hypothesis of a non-homogeneous Poisson process is rejected.

  4. Occurrence analysis of daily rainfalls by using non-homogeneous Poissonian processes

    NASA Astrophysics Data System (ADS)

    Sirangelo, B.; Ferrari, E.; de Luca, D. L.

    2009-09-01

    In recent years several temporally homogeneous stochastic models have been applied to describe the rainfall process. In particular, stochastic analysis of daily rainfall time series may help explain the statistical features of the temporal variability of the phenomenon. Owing to the evident periodicity of the physical process, these models should be applied only over short temporal intervals within which the occurrences and intensities of rainfall can reliably be considered homogeneous. To this end, occurrences of daily rainfall can be treated as a stationary stochastic process within monthly periods. In this context point process models are widely used for at-site analysis of daily rainfall occurrence; they are continuous-time models, able to capture the intermittent character of rainfall and to simulate interstorm periods. With a different approach, the periodic features of daily rainfall can be described by a temporally non-homogeneous stochastic model whose parameters are expressed as continuous functions of time. In this case, great attention must be paid to the parsimony of the model, as regards both the number of parameters and the bias introduced into the generation of synthetic series, and to the influence of threshold values used to extract the peak-storm database from recorded daily rainfall depths. In this work, a stochastic model based on a non-homogeneous Poisson process, characterized by a time-dependent intensity of rainfall occurrence, is employed to explain seasonal effects of daily rainfalls exceeding prefixed threshold values. In particular, the variation of the rainfall occurrence intensity λ(t) is modelled using Fourier series analysis, in which the non-homogeneous process is transformed into a homogeneous unit-rate one through a proper transformation of the time domain, and the minimum number of harmonics is chosen by applying available statistical tests. The procedure is applied to a dataset of rain gauges located in
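
    A non-homogeneous Poisson process with a Fourier-series intensity can be simulated by thinning; the single harmonic and its coefficients below are invented (the abstracts describe fits with up to two harmonics chosen by statistical tests):

```python
import math
import random

random.seed(42)

# Annual-cycle occurrence intensity via a 1-harmonic Fourier law (made up).
def lam(t_days):
    return 0.3 + 0.2 * math.cos(2 * math.pi * t_days / 365.25)  # events/day

# Thinning algorithm for the non-homogeneous Poisson process: generate
# candidates at the bounding rate, accept each with probability lam/lam_max.
lam_max = 0.5
t, horizon, events = 0.0, 5 * 365.25, []
while t < horizon:
    t += random.expovariate(lam_max)
    if t < horizon and random.random() < lam(t) / lam_max:
        events.append(t)

# Under this lam(t), days near t = 0 (mod 365) are the rainy season.
winter = sum(1 for e in events
             if (e % 365.25) < 91 or (e % 365.25) > 274)
summer = len(events) - winter
print(winter, summer)
```

Fitting would go the other way: estimate the Fourier coefficients from observed occurrence times, then validate via the time-rescaling to a unit-rate process mentioned in the abstract.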

  5. Creating an Equity State of Mind: A Learning Process

    ERIC Educational Resources Information Center

    Pickens, Augusta Maria

    2012-01-01

    The Diversity Scorecard Project evaluated in this study was created by the University of Southern California's Center for Urban Education. It was designed to create awareness among institutional members about the state of inequities in educational outcomes for underrepresented students. The Diversity Scorecard Project facilitators aimed…

  6. Creating a Standardized Process to Meet Core Measure Compliance.

    PubMed

    Kwan, Sarah; Daniels, Melodie; Ryan, Lindsey; Fields, Willa

    2015-01-01

    A standardized process to improve compliance with venous thromboembolism prophylaxis and hospital-based inpatient psychiatric services Core Measures was developed, implemented, and evaluated by a clinical nurse specialist team. The use of a 1-page tool with the requirements and supporting evidence, combined with concurrent data and feedback, ensured success of improving compliance. The initial robust process of education and concurrent and retrospective review follow-up allowed for this process to be successful. PMID:26274512

  7. Novel particulate production processes to create unique security materials

    NASA Astrophysics Data System (ADS)

    Hampden-Smith, Mark; Kodas, Toivo; Haubrich, Scott; Oljaca, Miki; Einhorn, Rich; Williams, Darryl

    2006-02-01

    Particles are frequently used to impart security features to high value items. These particles are typically produced by traditional methods, and therefore the security must be derived from the chemical composition of the particles rather than the particle production process. Here, we present new and difficult-to-reproduce particle production processes based on spray pyrolysis that can produce unique particles and features that are dependent on the use of these new-to-the-world processes and process trade secrets. Specifically two examples of functional materials are described, luminescent materials and electrocatalytic materials.

  8. Creating Reflective Choreographers: The Eyes See/Mind Sees Process

    ERIC Educational Resources Information Center

    Kimbrell, Sinead

    2012-01-01

    Since 1999, when the author first started teaching creative process-based dance programs in public schools, she has struggled to find the time to teach children the basic concepts and tools of dance while teaching them to be deliberate with their choreographic choices. In this article, the author describes a process that helps students and…

  9. Occurrence analysis of daily rainfalls through non-homogeneous Poissonian processes

    NASA Astrophysics Data System (ADS)

    Sirangelo, B.; Ferrari, E.; de Luca, D. L.

    2011-06-01

    A stochastic model based on a non-homogeneous Poisson process, characterised by a time-dependent intensity of rainfall occurrence, is employed to explain seasonal effects of daily rainfalls exceeding prefixed threshold values. The data modelling has been performed with a partition of observed daily rainfall data into a calibration period for parameter estimation and a validation period for checking on occurrence process changes. The model has been applied to a set of rain gauges located in different geographical areas of Southern Italy. The results show a good fit for time-varying intensity of rainfall occurrence process by 2-harmonic Fourier law and no statistically significant evidence of changes in the validation period for different threshold values.

  10. Process for forming a homogeneous oxide solid phase of catalytically active material

    DOEpatents

    Perry, Dale L.; Russo, Richard E.; Mao, Xianglei

    1995-01-01

    A process is disclosed for forming a homogeneous oxide solid phase reaction product of catalytically active material comprising one or more alkali metals, one or more alkaline earth metals, and one or more Group VIII transition metals. The process comprises reacting together one or more alkali metal oxides and/or salts, one or more alkaline earth metal oxides and/or salts, one or more Group VIII transition metal oxides and/or salts, capable of forming a catalytically active reaction product, in the optional presence of an additional source of oxygen, using a laser beam to ablate from a target such metal compound reactants in the form of a vapor in a deposition chamber, resulting in the deposition, on a heated substrate in the chamber, of the desired oxide phase reaction product. The resulting product may be formed in variable, but reproducible, stoichiometric ratios. The homogeneous oxide solid phase product is useful as a catalyst, and can be produced in many physical forms, including thin films, particulate forms, coatings on catalyst support structures, and coatings on structures used in reaction apparatus in which the reaction product of the invention will serve as a catalyst.

  11. Parallel information processing channels created in the retina

    PubMed Central

    Schiller, Peter H.

    2010-01-01

    In the retina, several parallel channels originate that extract different attributes from the visual scene. This review describes how these channels arise and what their functions are. Following the introduction four sections deal with these channels. The first discusses the “ON” and “OFF” channels that have arisen for the purpose of rapidly processing images in the visual scene that become visible by virtue of either light increment or light decrement; the ON channel processes images that become visible by virtue of light increment and the OFF channel processes images that become visible by virtue of light decrement. The second section examines the midget and parasol channels. The midget channel processes fine detail, wavelength information, and stereoscopic depth cues; the parasol channel plays a central role in processing motion and flicker as well as motion parallax cues for depth perception. Both these channels have ON and OFF subdivisions. The third section describes the accessory optic system that receives input from the retinal ganglion cells of Dogiel; these cells play a central role, in concert with the vestibular system, in stabilizing images on the retina to prevent the blurring of images that would otherwise occur when an organism is in motion. The last section provides a brief overview of several additional channels that originate in the retina. PMID:20876118

  12. Ask--Think--Create: The Process of Inquiry

    ERIC Educational Resources Information Center

    Diggs, Valerie

    2009-01-01

    Today's students find it difficult to develop an understanding of what it is they need to know, and more importantly, why they need to know it. Framing this "need to know" has been called by various names, such as "inquiry," "inquiry process," "essential questions," "knowledge construction." Inquiry, however, goes much deeper than casual…

  13. Informativeness ratings of messages created on an AAC processing prosthesis.

    PubMed

    Bartlett, Megan R; Fink, Ruth B; Schwartz, Myrna F; Linebarger, Marcia

    2007-01-01

    BACKGROUND: SentenceShaper (SSR) is a computer program that supports spoken language production in aphasia by recording and storing the fragments that the user speaks into the microphone, making them available for playback and allowing them to be combined and integrated into larger structures (i.e., sentences and narratives). A prior study that measured utterance length and grammatical complexity in story-plot narratives produced with and without the aid of SentenceShaper demonstrated an "aided effect" in some speakers with aphasia, meaning an advantage for the narratives that were produced with the support of this communication aid (Linebarger, Schwartz, Romania, Kohn, & Stephens, 2000). The present study deviated from Linebarger et al.'s methods in key respects and again showed aided effects of SentenceShaper in persons with aphasia. AIMS: Aims were (1) to demonstrate aided effects in "functional narratives" conveying hypothetical real-life situations from a first-person perspective; (2) for the first time, to submit aided and spontaneous speech samples to listener judgements of informativeness; and (3) to produce preliminary evidence on topic-specific carryover from SentenceShaper, i.e., carryover from an aided production to a subsequent unaided production on the same topic. METHODS & PROCEDURES: Five individuals with chronic aphasia created narratives on two topics, under three conditions: Unaided (U), Aided (SSR), and Post-SSR Unaided (Post-U). The 30 samples (5 participants, 2 topics, 3 conditions) were randomised and judged for informativeness by graduate students in speech-language pathology. The method for rating was Direct Magnitude Estimation (DME). OUTCOMES & RESULTS: Repeated measures ANOVAs were performed on DME ratings for each participant on each topic. A main effect of Condition was present for four of the five participants, on one or both topics. Planned contrasts revealed that the aided effect (SSR > U) was

  14. A hybrid process combining homogeneous catalytic ozonation and membrane distillation for wastewater treatment.

    PubMed

    Zhang, Yong; Zhao, Peng; Li, Jie; Hou, Deyin; Wang, Jun; Liu, Huijuan

    2016-10-01

    A novel catalytic ozonation membrane reactor (COMR) coupling homogeneous catalytic ozonation and direct contact membrane distillation (DCMD) was developed to treat refractory saline organic pollutants in wastewater. An ozonation process took place in the reactor to degrade organic pollutants, whilst the DCMD process was used to recover ionic catalysts and produce clean water. It was found that 98.6% of total organic carbon (TOC) and almost 100% of salt were removed, and almost 100% of the metal ion catalyst was recovered. TOC in the permeate water was less than 16 mg/L after 5 h of operation, which was considered satisfactory as the TOC in the potassium hydrogen phthalate (KHP) feed water was as high as 1000 mg/L. Meanwhile, the membrane distillation flux in the COMR process was 49.8% higher than that in the DCMD process alone after 60 h of operation. Further, scanning electron microscope images showed a smaller amount and size of contaminants on the membrane surface, indicating mitigation of membrane fouling. Tensile strength and FT-IR spectra tests did not reveal obvious changes in the polyvinylidene fluoride membrane after 60 h of operation, indicating good durability. This novel COMR hybrid process exhibited promising application prospects for saline organic wastewater treatment. PMID:27372262

  15. Creating a national citizen engagement process for energy policy.

    PubMed

    Pidgeon, Nick; Demski, Christina; Butler, Catherine; Parkhill, Karen; Spence, Alexa

    2014-09-16

    This paper examines some of the science communication challenges involved when designing and conducting public deliberation processes on issues of national importance. We take as our illustrative case study a recent research project investigating public values and attitudes toward future energy system change for the United Kingdom. National-level issues such as this are often particularly difficult to engage the public with because of their inherent complexity, derived from multiple interconnected elements and policy frames, extended scales of analysis, and different manifestations of uncertainty. With reference to the energy system project, we discuss ways of meeting a series of science communication challenges arising when engaging the public with national topics, including the need to articulate systems thinking and problem scale, to provide balanced information and policy framings in ways that open up spaces for reflection and deliberation, and the need for varied methods of facilitation and data synthesis that permit access to participants' broader values. Although resource intensive, national-level deliberation is possible and can produce useful insights both for participants and for science policy. PMID:25225393

  17. Volume Averaging Study of the Capacitive Deionization Process in Homogeneous Porous Media

    DOE PAGES

    Gabitto, Jorge; Tsouris, Costas

    2015-05-05

    Ion storage in porous electrodes is important in applications such as energy storage by supercapacitors, water purification by capacitive deionization, extraction of energy from a salinity difference and heavy ion purification. In this paper, a model is presented to simulate the charge process in homogeneous porous media comprising big pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Transport between the electrolyte solution and the charged wall is described using the Gouy–Chapman–Stern model. The effective transport parameters for isotropic porous media are calculated solving the corresponding closure problems. Finally, the source terms that appear in the average equations are calculated using numerical computations. An alternative way to deal with the source terms is proposed.
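The Gouy-Chapman-Stern wall model named in the abstract treats the double layer as a Stern layer in series with a diffuse (Gouy-Chapman) layer. The sketch below is illustrative only: the electrolyte concentration, temperature, and Stern-layer thickness are assumed values, not parameters from the paper.

```python
import math

# Physical constants and assumed electrolyte properties (illustrative)
e = 1.602e-19            # elementary charge, C
kB = 1.381e-23           # Boltzmann constant, J/K
NA = 6.022e23            # Avogadro's number, 1/mol
eps = 80.0 * 8.854e-12   # permittivity of water, F/m
T = 298.0                # temperature, K
c0 = 10.0 * NA           # 10 mM (= 10 mol/m^3) 1:1 electrolyte, ions/m^3

def gcs_capacitance(phi_d, stern_thickness=5e-10):
    """Area-specific double-layer capacitance (F/m^2): Stern layer in
    series with the potential-dependent Gouy-Chapman diffuse layer."""
    lam_d = math.sqrt(eps * kB * T / (2.0 * c0 * e * e))     # Debye length
    c_diffuse = (eps / lam_d) * math.cosh(e * phi_d / (2.0 * kB * T))
    c_stern = eps / stern_thickness
    return 1.0 / (1.0 / c_stern + 1.0 / c_diffuse)           # series combination

c_low = gcs_capacitance(0.01)   # near the potential of zero charge
c_high = gcs_capacitance(0.20)  # diffuse layer saturates toward the Stern limit
```

At small diffuse-layer potential the diffuse layer dominates the series combination; at larger potential its capacitance grows as cosh and the total approaches the Stern limit.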

  18. An empirical Bayesian and Buhlmann approach with non-homogenous Poisson process

    NASA Astrophysics Data System (ADS)

    Noviyanti, Lienda

    2015-12-01

    All general insurance companies in Indonesia have to adjust their current premium rates according to the maximum and minimum limit rates in the new regulation established by the Financial Services Authority (Otoritas Jasa Keuangan / OJK). In this research, we estimated premium rates by means of the Bayesian and the Buhlmann approaches using historical claim frequency and claim severity in five risk groups. We assumed a Poisson-distributed claim frequency and a Normally distributed claim severity. In particular, we used a non-homogeneous Poisson process for estimating the parameters of claim frequency. We found that the estimated premium rates are higher than the actual current rates. With regard to the OJK upper and lower limit rates, the estimates among the five risk groups vary; some fall within the interval and some fall outside it.
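The Buhlmann credibility step referenced in the abstract can be sketched for a balanced design. The claim figures below are invented for illustration; they are not the OJK data or the study's five-group estimates.

```python
# Buhlmann credibility premiums for several risk groups (balanced design).
def buhlmann_premiums(claims):
    """claims: list of per-group lists of yearly aggregate claims."""
    n = len(claims[0])                       # years observed per group
    means = [sum(g) / n for g in claims]
    overall = sum(means) / len(means)
    # expected process (within-group) variance, averaged across groups
    s2 = sum(sum((x - m) ** 2 for x in g) / (n - 1)
             for g, m in zip(claims, means)) / len(claims)
    # variance of hypothetical means (bias-corrected between-group variance)
    a = sum((m - overall) ** 2 for m in means) / (len(means) - 1) - s2 / n
    a = max(a, 0.0)
    z = n * a / (n * a + s2) if (n * a + s2) > 0 else 0.0   # credibility factor
    return [z * m + (1 - z) * overall for m in means]

premiums = buhlmann_premiums([[10, 12, 11], [20, 25, 22], [15, 14, 16]])
```

Each group's premium is its own mean shrunk toward the collective mean, with the credibility factor `z` growing as between-group variance dominates within-group variance.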

  20. Specific multiple-scattering process in acoustic cloak with multilayered homogeneous isotropic materials

    NASA Astrophysics Data System (ADS)

    Cheng, Ying; Liu, XiaoJun

    2008-11-01

    It was qualitatively demonstrated through finite-element full-wave simulations that an acoustic cloak can be constructed by using a concentric multilayered structure with alternating homogeneous isotropic materials [Y. Cheng et al., Appl. Phys. Lett. 92, 151913 (2008)]. Here we present a sequential in-depth analysis of the proposed cloak by means of multiple-scattering algorithms. Calculated pressure fields demonstrate that the cloak possesses low-reflection and wavefront-bending properties. The scattering patterns further characterize the directional cloaking performance in the far field, which is consistent with the pressure fields. The cloaking mechanism is ascribed to a specific multiple-scattering process determined by the microscopic material distribution and structural details of the cloak. We also discuss the behavior of the multilayered cloak as a function of wavelength.

  1. People Create Health: Effective Health Promotion is a Creative Process

    PubMed Central

    Cloninger, C. Robert; Cloninger, Kevin M.

    2015-01-01

    Effective health promotion involves the creative cultivation of physical, mental, social, and spiritual well-being. Efforts at health promotion produce weak and inconsistent benefits when they do not engage people to express their own goals and values. Likewise, health promotion has been ineffective when it relies only on instruction about facts regarding a healthy lifestyle, or focuses on reduction of disease rather than the cultivation of well-being. Meta-analysis of longitudinal studies and experimental interventions shows that improvements in subjective well-being lead to short-term and long-term reductions in medical morbidity and mortality, as well as to healthier functioning and longevity. However, these effects are inconsistent and weak (correlations of about 0.15). The most consistent and strong predictor of both subjective well-being and objective health status in longitudinal studies is a creative personality profile characterized by being highly self-directed, cooperative, and self-transcendent. There is a synergy among these personality traits that enhances all aspects of the health and happiness of people. Experimental interventions to cultivate this natural creative potential of people are just beginning, but available exploratory research has shown that creativity can be enhanced and that the changes are associated with widespread and profound benefits, including greater physical, mental, social, and spiritual well-being. In addition to benefits mediated by choice of diet, physical activity, and health care utilization, the effect of a creative personality on health may be partly mediated by effects on the regulation of heart rate variability. Creativity promotes autonomic balance with parasympathetic dominance, leading to a calm alert state that promotes an awakening of plasticities and intelligences that stress inhibits. We suggest that health, happiness, and meaning can be cultivated by a complex adaptive process that enhances healthy functioning

  2. CO2-assisted high pressure homogenization: a solvent-free process for polymeric microspheres and drug-polymer composites.

    PubMed

    Kluge, Johannes; Mazzotti, Marco

    2012-10-15

    The study explores the enabling role of near-critical CO2 as a reversible plasticizer in the high pressure homogenization of polymer particles, aiming at their comminution as well as at the formation of drug-polymer composites. First, the effect of near-critical CO2 on the homogenization of aqueous suspensions of poly(lactic-co-glycolic acid) (PLGA) was investigated. Applying a pressure drop of 900 bar and up to 150 passes across the homogenizer, it was found that particles processed in the presence of CO2 were generally of microspherical morphology and at all times significantly smaller than those obtained in the absence of a plasticizer. The smallest particles, exhibiting a median x50 of 1.3 μm, were obtained by adding a small quantity of ethyl acetate, which exerts an additional plasticizing effect on PLGA during the homogenization step. Further, the study concerns the possibility of forming drug-polymer composites through simultaneous high pressure homogenization of the two relevant solids, and particularly the effect of near-critical CO2 on this process. Therefore, PLGA was homogenized together with crystalline S-ketoprofen (S-KET), a non-steroidal anti-inflammatory drug, at a drug to polymer ratio of 1:10, a pressure drop of 900 bar and up to 150 passes across the homogenizer. When the process was carried out in the presence of CO2, an impregnation efficiency of 91% was reached, corresponding to 8.3 wt.% of S-KET in PLGA; moreover, composite particles were of microspherical morphology and significantly smaller than those obtained in the absence of CO2. The formation of drug-polymer composites through simultaneous homogenization of the two materials is thus greatly enhanced by the presence of CO2, which increases the efficiency of both homogenization and impregnation. PMID:22750408

  3. Homogeneous sonophotolysis of food processing industry wastewater: Study of synergistic effects, mineralization and toxicity removal.

    PubMed

    Durán, A; Monteagudo, J M; Sanmartín, I; Gómez, P

    2013-03-01

    The mineralization of industrial wastewater coming from the food industry using an emerging homogeneous sonophotolytic oxidation process was evaluated as an alternative to, or a rapid pretreatment step for, conventional anaerobic digestion, with the aim of considerably reducing the total treatment time. At the selected operating conditions ([H2O2] = 11,750 ppm, pH = 8, amplitude = 50%, pulse length (cycles) = 1), 60% of TOC is removed after 60 min and 98% after 180 min when treating an industrial effluent with 2114 ppm of total organic carbon (TOC). This process completely removed the toxicity generated during storage or due to intermediate compounds. An important synergistic effect between sonolysis and photolysis (H2O2/UV) was observed. Thus the sonophotolysis (ultrasound/H2O2/UV) technique significantly increases TOC removal when compared with each individual process. Finally, a preliminary economic analysis confirms that sonophotolysis with H2O2 and pretreated water is a profitable system when compared with the same process without ultrasound waves and with no pretreatment. PMID:23122709

  4. Kappa Distribution in a Homogeneous Medium: Adiabatic Limit of a Super-diffusive Process?

    NASA Astrophysics Data System (ADS)

    Roth, I.

    2015-12-01

    The classical statistical theory predicts that an ergodic, weakly interacting system like charged particles in the presence of electromagnetic fields, performing Brownian motions (characterized by small-range deviations in phase space and short-term microscopic memory), converges to the Gibbs-Boltzmann statistics. Observation of distributions with kappa power-law tails in homogeneous systems contradicts this prediction and necessitates a renewed analysis of the basic axioms of the diffusion process: the characteristics of the transition probability density function (pdf) for a single interaction, with the possibility of a non-Markovian process and non-local interaction. The non-local, Levy walk deviation is related to the non-extensive statistical framework. Particles bouncing along the (solar) magnetic field with evolving pitch angles, phases and velocities, as they interact resonantly with waves, undergo energy changes at undetermined time intervals, satisfying these postulates. The dynamic evolution of a general continuous time random walk is determined by the pdfs of jumps and waiting times, resulting in a fractional Fokker-Planck equation with non-integer derivatives whose solution is given by a Fox H-function. The resulting procedure involves fractional calculus, which is known although not frequently used in physics, while the local, Markovian process recasts the evolution into the standard Fokker-Planck equation. Solution of the fractional Fokker-Planck equation with the help of the Mellin transform and evaluation of its residues at the poles of its Gamma functions results in a slowly converging sum with power laws. It is suggested that these tails form the Kappa function. Gradual vs impulsive solar electron distributions serve as prototypes of this description.
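The contrast the abstract draws between kappa power-law tails and the Gibbs-Boltzmann limit can be illustrated numerically. The shapes below are the standard textbook forms of the (unnormalised) 1-D kappa and Maxwellian velocity distributions, not the paper's fractional Fokker-Planck solution.

```python
import math

# Unnormalised 1-D kappa velocity distribution; as kappa -> infinity it
# approaches the Maxwellian exp(-v^2 / theta^2).
def kappa_pdf_shape(v, kappa, theta=1.0):
    return (1.0 + v * v / (kappa * theta * theta)) ** (-(kappa + 1.0))

def maxwellian_shape(v, theta=1.0):
    return math.exp(-v * v / (theta * theta))

# Power-law tail: at v = 5 thermal speeds the kappa form decays far more
# slowly than the Maxwellian
tail_kappa = kappa_pdf_shape(5.0, kappa=3.0)
tail_maxw = maxwellian_shape(5.0)

# Convergence to the Gibbs-Boltzmann (Maxwellian) limit for large kappa
diff = abs(kappa_pdf_shape(1.5, kappa=1e6) - maxwellian_shape(1.5))
```

The heavy tail at small kappa is what the super-diffusive (Levy-walk) picture in the abstract seeks to explain; the large-kappa limit recovers the classical Brownian prediction.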

  5. Study on Flow Stress Model and Processing Map of Homogenized Mg-Gd-Y-Zn-Zr Alloy During Thermomechanical Processes

    NASA Astrophysics Data System (ADS)

    Xue, Yong; Zhang, Zhimin; Lu, Guang; Xie, Zhiping; Yang, Yongbiao; Cui, Ya

    2015-02-01

    Quantities of billets were compressed with 50% height reduction on a hot process simulator to study the plastic flow behavior of homogenized as-cast Mg-13Gd-4Y-2Zn-0.6Zr alloy. The test alloy was heat treated at 520 °C for 12 h before the thermomechanical experiments. The process temperature ranged from 300 to 480 °C, and the strain rate was varied between 0.001 and 0.5 s-1. A flow stress model was established according to the Arrhenius-type equation; in this model, flow stress is regarded as a function of the peak stress, peak strain, and the strain, and a softening factor is used to characterize the dynamic softening that occurred during deformation. Meanwhile, processing maps based on dynamic material modeling were constructed. The optimum temperature and strain rate for hot working of the test alloy were 480 °C and 0.01 s-1, respectively. Furthermore, flow instability occurred in two regions: temperatures from 350 to 480 °C at strain rates of 0.01-0.1 s-1, and temperatures from 450 to 480 °C at a strain rate of 0.1 s-1. According to the determined hot deformation parameters, four components were successfully formed, and the ultimate tensile strength, yield strength, and elongation of the component were 386 MPa, 331 MPa, and 6.3%, respectively.
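An Arrhenius-type flow stress relation of the kind referenced in the abstract is commonly written in the hyperbolic-sine (Sellars-Tegart) form via the Zener-Hollomon parameter. This is a sketch only: all material constants below are placeholders, not the fitted values for the Mg-Gd-Y-Zn-Zr alloy.

```python
import math

# Assumed (placeholder) material constants for the sinh-Arrhenius model
R = 8.314        # gas constant, J/(mol*K)
Q = 230e3        # activation energy, J/mol (assumed)
A = 1.0e16       # material constant, s^-1 (assumed)
alpha = 0.01     # stress multiplier, 1/MPa (assumed)
n = 5.0          # stress exponent (assumed)

def peak_stress(strain_rate, temp_c):
    """Peak flow stress (MPa) from the Zener-Hollomon parameter
    Z = strain_rate * exp(Q / (R*T)), with sigma = asinh((Z/A)^(1/n)) / alpha."""
    T = temp_c + 273.15
    Z = strain_rate * math.exp(Q / (R * T))
    return math.asinh((Z / A) ** (1.0 / n)) / alpha

# Flow stress falls with temperature and rises with strain rate:
s_hot = peak_stress(0.01, 480.0)
s_cold = peak_stress(0.01, 300.0)
s_fast = peak_stress(0.5, 480.0)
```

The monotonic trends (softer at the 480 °C / 0.01 s-1 optimum, harder at 300 °C or 0.5 s-1) mirror the qualitative behavior the processing maps capture.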

  6. First-Principles Molecular Dynamics Studies of Organometallic Complexes and Homogeneous Catalytic Processes.

    PubMed

    Vidossich, Pietro; Lledós, Agustí; Ujaque, Gregori

    2016-06-21

    Computational chemistry is a valuable aid to complement experimental studies of organometallic systems and their reactivity. It allows probing mechanistic hypotheses and investigating molecular structures, shedding light on the behavior and properties of molecular assemblies at the atomic scale. When approaching a chemical problem, the computational chemist has to decide on the theoretical approach needed to describe electron/nuclear interactions and the composition of the model used to approximate the actual system. Both factors determine the reliability of the modeling study. The community dedicated much effort to developing and improving the performance and accuracy of theoretical approaches for electronic structure calculations, on which the description of (inter)atomic interactions rely. Here, the importance of the model system used in computational studies is highlighted through examples from our recent research focused on organometallic systems and homogeneous catalytic processes. We show how the inclusion of explicit solvent allows the characterization of molecular events that would otherwise not be accessible in reduced model systems (clusters). These include the stabilization of nascent charged fragments via microscopic solvation (notably, hydrogen bonding), transfer of charge (protons) between distant fragments mediated by solvent molecules, and solvent coordination to unsaturated metal centers. Furthermore, when weak interactions are involved, we show how conformational and solvation properties of organometallic complexes are also affected by the explicit inclusion of solvent molecules. Such extended model systems may be treated under periodic boundary conditions, thus removing the cluster/continuum (or vacuum) boundary, and require a statistical mechanics simulation technique to sample the accessible configurational space. First-principles molecular dynamics, in which atomic forces are computed from electronic structure calculations (namely, density

  7. Gravitational influences on the liquid-state homogenization and solidification of aluminum antimonide. [space processing of solar cell material

    NASA Technical Reports Server (NTRS)

    Ang, C.-Y.; Lacy, L. L.

    1979-01-01

    Typical commercial or laboratory-prepared samples of polycrystalline AlSb contain microstructural inhomogeneities of Al- or Sb-rich phases in addition to the primary AlSb grains. The paper reports on gravitational influences, such as density-driven convection or sedimentation, that cause microscopic phase separation and nonequilibrium conditions to exist in earth-based melts of AlSb. A triple-cavity electric furnace is used to homogenize the multiphase AlSb samples in space and on earth. A comparative characterization of identically processed low- and one-gravity samples of commercial AlSb reveals major improvements in the homogeneity of the low-gravity homogenized material.

  8. Creating a Context for the Learning of Science Process Skills through Picture Books

    ERIC Educational Resources Information Center

    Monhardt, Leigh; Monhardt, Rebecca

    2006-01-01

    This article provides suggestions on ways in which science process skills can be taught in a meaningful context through children's literature. It is hoped that the following examples of how process skills can be taught using children's books will provide a starting point from which primary teachers can create additional examples. Many…

  9. Homogeneous and heterogeneous distributed cluster processing for two- and three-dimensional viscoelastic flows

    NASA Astrophysics Data System (ADS)

    Baloch, A.; Grant, P. W.; Webster, M. F.

    2002-12-01

    A finite-element study of two- and three-dimensional incompressible viscoelastic flows in a planar lid-driven cavity and concentric rotating cylinders is presented. The hardware platforms consist of both homogeneous and heterogeneous clusters of workstations. A semi-implicit time-stepping Taylor-Galerkin scheme is employed, using the message-passing mechanism provided by the Parallel Virtual Machine libraries. DEC-alpha, Intel Solaris and AMD-K7(Athlon) Linux clusters are utilized. Parallel results are compared against single-processor (sequential) solutions, using the parallelism paradigm of domain decomposition. Communication is effectively masked, and practically ideal, linear speed-up with the number of processors is realized.

  10. Effects of non-homogeneous flow on ADCP data processing in a hydroturbine forebay

    DOE PAGES

    Harding, S. F.; Richmond, M. C.; Romero-Gomez, P.; Serkowski, J. A.

    2016-01-02

    Accurate modeling of the velocity field in the forebay of a hydroelectric power station is important for both power generation and fish passage, and the field is increasingly well represented by computational fluid dynamics (CFD) simulations. Acoustic Doppler Current Profilers (ADCPs) are investigated herein as a method of validating the numerical flow solutions, particularly in observed and calculated regions of non-homogeneous flow velocity. By using a numerical model of an ADCP operating in a velocity field calculated using CFD, the errors due to the spatial variation of the flow velocity are quantified. The numerical model of the ADCP is referred to herein as a Virtual ADCP (VADCP).

  11. Study on rheo-diecasting process of 7075R alloys by SA-EMS melt homogenized treatment

    NASA Astrophysics Data System (ADS)

    Zhihua, G.; Jun, X.; Zhifeng, Z.; Guojun, L.; Mengou, T.

    2016-03-01

    An advanced melt processing technology, spiral annular electromagnetic stirring (SA-EMS), based on the annular electromagnetic stirring (A-EMS) process, was developed for manufacturing Al-alloy components with high integrity. The SA-EMS process innovatively combines non-contact electromagnetic stirring with a spiral annular chamber with specially designed profiles to make high-quality melt slurry in situ; intensive forced shearing can be achieved under a high shear rate and high-intensity turbulence inside the spiral annular chamber. In this paper, the solidification microstructure and hardness of a 7075R alloy die-cast connecting rod conditioned by the SA-EMS melt processing technology were investigated. The results indicate that the SA-EMS melt processing technology exhibits superior grain refinement and remarkable structural homogeneity. In addition, it evidently enhances mechanical performance and reduces the cracking tendency.

  12. Quality function deployment: A customer-driven process to create and deliver value. Final report

    SciTech Connect

    George, S.S.

    1994-12-01

    Quality function deployment (QFD) is a team-oriented decision-making process used by more than 100 US businesses and industries to develop new products and marketing strategies. This report provides a detailed description of QFD and case study examples of how electric utilities can apply QFD principles in creating successful marketing and demand-side management (DSM) programs. The five-stage QFD process involves identifying customer needs and using this information to systematically develop program features, implementation activities, management procedures, and evaluation plans. QFD is not a deterministic model that provides answers, but a flexible, pragmatic tool for systematically organizing and communicating information to help utilities make better decisions.

  13. Porcine liver decellularization under oscillating pressure conditions: a technical refinement to improve the homogeneity of the decellularization process.

    PubMed

    Struecker, Benjamin; Hillebrandt, Karl Herbert; Voitl, Robert; Butter, Antje; Schmuck, Rosa B; Reutzel-Selke, Anja; Geisel, Dominik; Joehrens, Korinna; Pickerodt, Philipp A; Raschzok, Nathanael; Puhl, Gero; Neuhaus, Peter; Pratschke, Johann; Sauer, Igor M

    2015-03-01

    Decellularization and recellularization of parenchymal organs may facilitate the generation of autologous functional liver organoids by repopulation of decellularized porcine liver matrices with induced liver cells. We present an accelerated (7 h overall perfusion time) and effective protocol for human-scale liver decellularization by pressure-controlled perfusion with 1% Triton X-100 and 1% sodium dodecyl sulfate via the hepatic artery (120 mmHg) and portal vein (60 mmHg). In addition, we analyzed the effect of oscillating pressure conditions on pig liver decellularization (n=19). The proprietary perfusion device used to generate these pressure conditions mimics intra-abdominal conditions during respiration to optimize microperfusion within livers and thus optimize the homogeneity of the decellularization process. The efficiency of perfusion decellularization was analyzed by macroscopic observation, histological staining (hematoxylin and eosin [H&E], Sirius red, and alcian blue), immunohistochemical staining (collagen IV, laminin, and fibronectin), and biochemical assessment (DNA, collagen, and glycosaminoglycans) of decellularized liver matrices. The integrity of the extracellular matrix (ECM) postdecellularization was visualized by corrosion casting and three-dimensional computed tomography scanning. We found that livers perfused under oscillating pressure conditions (P+) showed a more homogeneous course of decellularization and contained less DNA compared with livers perfused without oscillating pressure conditions (P-). Microscopically, livers from the (P-) group showed remnant cell clusters, while no cells were found in livers from the (P+) group. The grade of disruption of the ECM was higher in livers from the (P-) group, although the perfusion rates and pressure did not significantly differ. Immunohistochemical staining revealed that important matrix components were still present after decellularization. Corrosion casting showed an intact

  14. Dense and Homogeneous Compaction of Fine Ceramic and Metallic Powders: High-Speed Centrifugal Compaction Process

    NASA Astrophysics Data System (ADS)

    Suzuki, Hiroyuki Y.

    2008-02-01

    High-Speed Centrifugal Compaction Process (HCP) is a variation of the colloidal compaction method, in which the powders sediment under a large centrifugal force. The compacting mechanism of HCP differs from that of conventional colloidal processes such as slip casting. This unique compacting mechanism gives HCP a number of advantageous characteristics, such as a higher compacting speed, wide applicability for net-shape formation, and a flawless microstructure in the green compacts. However, HCP also has several detrimental characteristics that must be overcome to realize the process's full potential.

  15. Dense and Homogeneous Compaction of Fine Ceramic and Metallic Powders: High-Speed Centrifugal Compaction Process

    SciTech Connect

    Suzuki, Hiroyuki Y.

    2008-02-15

    High-Speed Centrifugal Compaction Process (HCP) is a variation of the colloidal compaction method, in which the powders sediment under a large centrifugal force. The compacting mechanism of HCP differs from that of conventional colloidal processes such as slip casting. This unique compacting mechanism gives HCP a number of advantageous characteristics, such as a higher compacting speed, wide applicability for net-shape formation, and a flawless microstructure in the green compacts. However, HCP also has several detrimental characteristics that must be overcome to realize the process's full potential.

  16. The influence of melting process and parameters on the structure and homogeneity of titanium-tantalum alloys

    SciTech Connect

    Dunn, P.S.; Korzewka, D.; Garcia, F.; Damkroger, B.K.; Van Den Avyle, J.A.; Tissot, R.G.

    1995-12-31

    Alloys of titanium with refractory metals are attractive materials for applications requiring high temperature strength and corrosion resistance. However, the widely different characteristics of the component elements have made it difficult to produce sound, compositionally homogeneous ingots using traditional melting techniques. This is particularly critical because the compositional ranges spanned by the micro- and macrosegregation in these systems can easily encompass a number of microconstituents which are detrimental to mechanical properties. This paper presents results of a study of plasma (PAM) and vacuum-arc (VAR) melting of a 60 wt% tantalum, 40 wt% titanium binary alloy. The structural and compositional homogeneity of both PAM consolidated + PAM remelted, and PAM consolidated + VAR remelted ingots were characterized and compared using optical and electron microscopy and x-ray fluorescence microanalysis. Additionally, the effect of melting parameters, including melt rate and magnetic stirring, was studied. Results indicate that PAM remelting achieves more complete dissolution of the starting electrode, due to greater local superheat, than does VAR remelting. PAM remelting also produces a finer as-solidified grain structure, due to the smaller molten pool and shorter local solidification times. Conversely, VAR remelting produces an ingot with a more uniform macrostructure, due to the more stable movement of the solidification interface and more uniform material feed rate. Based on these results, a three-step process of PAM consolidation, followed by a PAM intermediate melt and a VAR final melt, has been selected for further development of the alloy and processing sequence.

  17. Homogeneity Pursuit

    PubMed Central

    Ke, Tracy; Fan, Jianqing; Wu, Yichao

    2014-01-01

    This paper explores the homogeneity of coefficients in high-dimensional regression, which extends the sparsity concept and is more general and suitable for many applications. Homogeneity arises when regression coefficients corresponding to neighboring geographical regions or a similar cluster of covariates are expected to be approximately the same. Sparsity corresponds to a special case of homogeneity in which one large cluster of coefficients is known to be zero. In this article, we propose a new method called clustering algorithm in regression via data-driven segmentation (CARDS) to explore homogeneity. New mathematical results are provided on the gain that can be achieved by exploring homogeneity. Statistical properties of two versions of CARDS are analyzed. In particular, the asymptotic normality of our proposed CARDS estimator is established, which reveals better estimation accuracy for homogeneous parameters than estimation without homogeneity exploration. When our methods are combined with sparsity exploration, further efficiency can be achieved beyond the exploration of sparsity alone. This provides additional insights into the power of exploring low-dimensional structures in high-dimensional regression: homogeneity and sparsity. Our results also shed light on the properties of the fused Lasso. The newly developed method is further illustrated by simulation studies and applications to real data. Supplementary materials for this article are available online. PMID:26085701
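The core idea above — that nearly equal coefficients can be pooled into clusters and estimated by a shared value — can be illustrated with a toy sketch. This is not the CARDS algorithm itself (which uses data-driven segmentation and penalized fitting); it is a minimal stand-in where a hypothetical gap threshold `tau` decides which sorted estimates belong to the same cluster.

```python
# Toy illustration (NOT the CARDS algorithm): exploit homogeneity by
# merging nearly-equal coefficient estimates into clusters and replacing
# each cluster by its mean, reducing the effective parameter count.
# `tau` is a hypothetical tuning parameter for this sketch.

def merge_homogeneous(coefs, tau):
    """Cluster coefficients whose sorted gaps are below tau, then
    assign each coefficient its cluster mean."""
    order = sorted(range(len(coefs)), key=lambda i: coefs[i])
    groups, current = [], [order[0]]
    for prev, idx in zip(order, order[1:]):
        if coefs[idx] - coefs[prev] < tau:
            current.append(idx)
        else:
            groups.append(current)
            current = [idx]
    groups.append(current)
    fused = list(coefs)
    for g in groups:
        mean = sum(coefs[i] for i in g) / len(g)
        for i in g:
            fused[i] = mean
    return fused, len(groups)

# Noisy estimates around two true values (0 and 1) collapse to two clusters:
est = [0.02, -0.03, 0.01, 0.98, 1.03, 1.01]
fused, k = merge_homogeneous(est, tau=0.5)
```

Pooling the three estimates near zero and the three near one leaves only two free parameters, which is the source of the efficiency gain the abstract describes.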

  18. Experimental development of processes to produce homogenized alloys of immiscible metals, phase 3

    NASA Technical Reports Server (NTRS)

    Reger, J. L.

    1976-01-01

    An experimental drop tower package was designed and built for use in a drop tower. This effort consisted of a thermal analysis, container/heater fabrication, and assembly of an expulsion device for rapid quenching of heated specimens during low gravity conditions. Six gallium-bismuth specimens with compositions in the immiscibility region (50 a/o of each element) were processed in the experimental package: four during low gravity conditions and two under a one gravity environment. One of the one gravity processed specimens lacked telemetry data and was therefore excluded from analysis, since its processing conditions were not known. Metallurgical, Hall effect, resistivity, and superconductivity examinations were performed on the five remaining specimens. Examination of the specimens showed that the gallium was dispersed in the bismuth. The low gravity processed specimens showed a relatively uniform distribution of gallium, with particle sizes of 1 micrometer or less, in contrast to the one gravity control specimen. Comparison of the cooling rates of the dropped specimens versus microstructure indicated that low cooling rates are more desirable.

  19. Deactivation processes of homogeneous Pd catalysts using in situ time resolved spectroscopic techniques.

    PubMed

    Tromp, Moniek; Sietsma, Jelle R A; van Bokhoven, Jeroen A; van Strijdonck, Gino P F; van Haaren, Richard J; van der Eerden, Ad M J; van Leeuwen, Piet W N M; Koningsberger, Diek C

    2003-01-01

    UV-Vis spectroscopy combined with ED-XAFS shows, for the first time, the evolution of inactive Pd dimers and trimers, which are a possible first stage in the deactivation process of important palladium-catalysed reactions, leading to larger palladium clusters and eventually palladium black. PMID:12610999

  20. Development of a reference material for Staphylococcus aureus enterotoxin A in cheese: feasibility study, processing, homogeneity and stability assessment.

    PubMed

    Zeleny, R; Emteborg, H; Charoud-Got, J; Schimmel, H; Nia, Y; Mutel, I; Ostyn, A; Herbin, S; Hennekinne, J-A

    2015-02-01

    Staphylococcal food poisoning is caused by enterotoxins excreted into foods by strains of staphylococci. Commission Regulation 1441/2007 specifies thresholds for the presence of these toxins in foods. In this article we report on the progress towards reference materials (RMs) for Staphylococcal enterotoxin A (SEA) in cheese. RMs are crucial to enforce legislation and to implement and safeguard reliable measurements. First, a feasibility study revealed a suitable processing procedure for cheese powders: the blank material was prepared by cutting, grinding, freeze-drying and milling. For the spiked material, a cheese-water slurry was spiked with SEA solution, freeze-dried and diluted with blank material to the desired SEA concentration. Thereafter, batches of three materials (blank; two SEA concentrations) were processed. The materials were shown to be sufficiently homogeneous, and storage at ambient temperature for 4 weeks did not indicate degradation. These results provide the basis for the development of an RM for SEA in cheese. PMID:25172706

  1. The Distribution of Family Sizes Under a Time-Homogeneous Birth and Death Process.

    PubMed

    Moschopoulos, Panagis; Shpak, Max

    2010-05-11

    The number of extant individuals within a lineage, as exemplified by counts of species numbers across genera in a higher taxonomic category, is known to be a highly skewed distribution. Because the sublineages (such as genera in a clade) themselves follow a random birth process, deriving the distribution of lineage sizes involves averaging the solutions to a birth and death process over the distribution of time intervals separating the origin of the lineages. In this article, we show that the resulting distributions can be represented by hypergeometric functions of the second kind. We also provide approximations of these distributions up to the second order, and compare these results to the asymptotic distributions and numerical approximations used in previous studies. For two limiting cases, one with a relatively high rate of lineage origin, one with a low rate, the cumulative probability densities and percentiles are compared to show that the approximations are robust over a wide range of parameters. It is proposed that the probability distributions of lineage size may have a number of relevant applications to biological problems such as the coalescence of genetic lineages and the prediction of the number of species in living and extinct higher taxa, as these systems are special instances of the underlying process analyzed in this article. PMID:23543815
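The setting described above — lineages growing under a birth and death process, with origin times themselves random — is easy to reproduce by Monte Carlo, and the simulation makes the skew of the size distribution visible. The rates, horizon, and sample count below are illustrative choices, not values from the paper.

```python
# Monte Carlo sketch of lineage sizes under a time-homogeneous linear
# birth-death process (per-capita birth rate b, death rate d), with
# lineage origin times drawn uniformly over [0, T].  All parameter
# values are illustrative assumptions.
import random

def birth_death_size(b, d, t_max, rng):
    """Gillespie simulation of one lineage's size after time t_max."""
    n, t = 1, 0.0
    while n > 0:
        rate = n * (b + d)          # total event rate for n individuals
        t += rng.expovariate(rate)
        if t > t_max:
            break
        n += 1 if rng.random() < b / (b + d) else -1
    return n

rng = random.Random(42)
T, b, d = 10.0, 0.5, 0.3            # supercritical: b > d
sizes = []
for _ in range(2000):
    origin = rng.uniform(0.0, T)    # random origin time of the lineage
    n = birth_death_size(b, d, T - origin, rng)
    if n > 0:                       # keep surviving lineages only
        sizes.append(n)
sizes.sort()
median = sizes[len(sizes) // 2]
mean = sum(sizes) / len(sizes)
# Right skew: a few old lineages grow large, so the maximum (and mean)
# sit well above the median.
```

Averaging over the random origin times is exactly the integration step the abstract describes; the closed-form result replaces this simulation with hypergeometric functions.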

  2. Synthetic river valleys: Creating prescribed topography for form-process inquiry and river rehabilitation design

    NASA Astrophysics Data System (ADS)

    Brown, R. A.; Pasternack, G. B.; Wallender, W. W.

    2014-06-01

    The synthesis of artificial landforms is complementary to geomorphic analysis because it affords a reflection on both the characteristics and intrinsic formative processes of real world conditions. Moreover, the applied terminus of geomorphic theory is commonly manifested in the engineering and rehabilitation of riverine landforms where the goal is to create specific processes associated with specific morphology. To date, the synthesis of river topography has been explored outside of geomorphology through artistic renderings, computer science applications, and river rehabilitation design; while within geomorphology it has been explored using morphodynamic modeling, such as one-dimensional simulation of river reach profiles, two-dimensional simulation of river networks, and three-dimensional simulation of subreach scale river morphology. However, no existing approach allows geomorphologists, engineers, or river rehabilitation practitioners to create landforms of prescribed conditions. In this paper a method for creating topography of synthetic river valleys is introduced that utilizes a theoretical framework that draws from fluvial geomorphology, computer science, and geometric modeling. Such a method would be valuable to geomorphologists in understanding form-process linkages as well as to engineers and river rehabilitation practitioners in developing design surfaces that can be rapidly iterated. The method introduced herein relies on the discretization of river valley topography into geometric elements associated with overlapping and orthogonal two-dimensional planes such as the planform, profile, and cross section that are represented by mathematical functions, termed geometric element equations. Topographic surfaces can be parameterized independently or dependently using a geomorphic covariance structure between the spatial series of geometric element equations. To illustrate the approach and overall model flexibility examples are provided that are associated with
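The "geometric element equation" idea can be sketched concretely: each orthogonal plane (profile, cross section) is an explicit function of downstream station, and the bed elevation is their composition. This is a minimal illustration only — it uses a hypothetical linear-plus-sinusoidal profile and a parabolic cross section, and omits the planform and geomorphic covariance structure for brevity; all coefficients are invented.

```python
# Minimal sketch of composing geometric element equations: bed
# elevation z(s, n) at downstream station s and cross-channel offset n
# is the sum of a longitudinal-profile element and a cross-section
# element.  All coefficients are hypothetical illustration values.
import math

def bed_elevation(s, n, slope=0.002, amp=0.5, wavelength=100.0,
                  half_width=10.0, depth=2.0):
    """Elevation (m) at station s (m downstream) and offset n (m from
    the channel centerline)."""
    # Profile element: mean slope plus a riffle-pool oscillation.
    profile = -slope * s + amp * math.sin(2 * math.pi * s / wavelength)
    # Cross-section element: parabolic channel, deepest at centerline.
    xsect = depth * ((n / half_width) ** 2 - 1.0)
    return profile + xsect

# By construction, the thalweg (n=0) sits `depth` below the bank line
# (|n| = half_width) at every station:
z_thalweg = bed_elevation(50.0, 0.0)
z_bank = bed_elevation(50.0, 10.0)
```

Because each element is an independent function of station, any one element (e.g. the riffle-pool amplitude) can be re-parameterized and the surface rebuilt immediately, which is the rapid design iteration the abstract motivates.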

  3. The Parametric Model of the Human Mandible Coronoid Process Created by Method of Anatomical Features

    PubMed Central

    Vitković, Nikola; Mitić, Jelena; Manić, Miodrag; Trajanović, Miroslav; Husain, Karim; Petrović, Slađana; Arsić, Stojanka

    2015-01-01

    Geometrically accurate and anatomically correct 3D models of the human bones are of great importance for medical research and practice in orthopedics and surgery. These geometrical models can be created by the use of techniques which can be based on input geometrical data acquired from volumetric methods of scanning (e.g., Computed Tomography (CT)) or on the 2D images (e.g., X-ray). Geometrical models of human bones created in such way can be applied for education of medical practitioners, preoperative planning, etc. In cases when geometrical data about the human bone is incomplete (e.g., fractures), it may be necessary to create its complete geometrical model. The possible solution for this problem is the application of parametric models. The geometry of these models can be changed and adapted to the specific patient based on the values of parameters acquired from medical images (e.g., X-ray). In this paper, Method of Anatomical Features (MAF) which enables creation of geometrically precise and anatomically accurate geometrical models of the human bones is implemented for the creation of the parametric model of the Human Mandible Coronoid Process (HMCP). The obtained results about geometrical accuracy of the model are quite satisfactory, as it is stated by the medical practitioners and confirmed in the literature. PMID:26064183

  4. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or Denver AWIPS Risk Reduction and Requirements Evaluation (DARE) Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on-the-fly for AWIPS, by using two examples of AWIPS applications that were created by the Applied Meteorology Unit (AMU) located at Cape Canaveral Air Force Station (CCAFS), Florida. The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) at Johnson Space Center, Texas and 45th Weather Squadron (45 WS) at CCAFS to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and the 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. The presentation will list the advantages and disadvantages of both file types for creating interactive graphical overlays in future AWIPS applications. Shapefiles are a popular format used extensively in Geographical Information Systems. They are usually used in AWIPS to depict static map backgrounds. A shapefile stores the geometry and attribute information of spatial features in a dataset (ESRI 1998). Shapefiles can contain point, line, and polygon features. Each shapefile contains a main file, index file, and a dBASE table. The main file contains a record for each spatial feature, which describes the feature with a list of its vertices. The index file contains the offset of each record from the beginning of the main file. 
The dBASE table contains records for each

  5. Laboratory Studies of Homogeneous and Heterogeneous Chemical Processes of Importance in the Upper Atmosphere

    NASA Technical Reports Server (NTRS)

    Molina, Mario J.

    2003-01-01

    The objective of this study was to conduct measurements of chemical kinetics parameters for reactions of importance in the stratosphere and upper troposphere, and to study the interaction of trace gases with ice surfaces in order to elucidate the mechanism of heterogeneous chlorine activation processes, using both a theoretical and an experimental approach. The measurements were carried out under temperature and pressure conditions covering those applicable to the stratosphere and upper troposphere. The main experimental technique employed was turbulent flow-chemical ionization mass spectrometry, which is particularly well suited for investigations of radical-radical reactions.

  6. Effects of homogenization process parameters on physicochemical properties of astaxanthin nanodispersions prepared using a solvent-diffusion technique

    PubMed Central

    Anarjan, Navideh; Jafarizadeh-Malmiri, Hoda; Nehdi, Imededdine Arbi; Sbihi, Hassen Mohamed; Al-Resayes, Saud Ibrahim; Tan, Chin Ping

    2015-01-01

    Nanodispersion systems allow incorporation of lipophilic bioactives, such as astaxanthin (a fat soluble carotenoid) into aqueous systems, which can improve their solubility, bioavailability, and stability, and widen their uses in water-based pharmaceutical and food products. In this study, response surface methodology was used to investigate the influences of homogenization time (0.5–20 minutes) and speed (1,000–9,000 rpm) on the formation of astaxanthin nanodispersions via the solvent-diffusion process. The product was characterized for particle size and astaxanthin concentration using laser diffraction particle size analysis and high performance liquid chromatography, respectively. Relatively high determination coefficients (ranging from 0.896 to 0.969) were obtained for all suggested polynomial regression models. The overall optimal homogenization conditions were determined by multiple response optimization analysis to be 6,000 rpm for 7 minutes. In vitro cellular uptake of astaxanthin from the suggested individual and multiple optimized astaxanthin nanodispersions was also evaluated. The cellular uptake of astaxanthin was found to be considerably increased (by more than five times) as it became incorporated into optimum nanodispersion systems. The lack of a significant difference between predicted and experimental values confirms the suitability of the regression equations connecting the response variables studied to the independent parameters. PMID:25709435
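The response-surface step above can be sketched in a few lines: fit a second-order polynomial for a response (say, mean particle size) in homogenization time and speed, then search the fitted surface for the optimum. The synthetic data below are illustrative assumptions, not the paper's measurements; they are constructed with a known optimum near 7 min and 6 krpm so the recovery can be checked.

```python
# Hedged sketch of response surface methodology: least-squares fit of a
# full quadratic model z = b0 + b1*t + b2*v + b3*t^2 + b4*v^2 + b5*t*v
# for homogenization time t (min) and speed v (krpm), then a grid
# search for the minimizing settings.  Data are synthetic.

def solve(A, y):
    """Gaussian elimination with partial pivoting for A x = y."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def features(t, v):
    return [1.0, t, v, t * t, v * v, t * v]

# Synthetic response with a known optimum at t=7 min, v=6 krpm:
data = [(t, v, (t - 7) ** 2 + 2 * (v - 6) ** 2 + 50)
        for t in range(1, 14) for v in range(1, 10)]
X = [features(t, v) for t, v, _ in data]
y = [z for _, _, z in data]
# Normal equations X'X beta = X'y:
XtX = [[sum(row[i] * row[j] for row in X) for j in range(6)] for i in range(6)]
Xty = [sum(row[i] * zi for row, zi in zip(X, y)) for i in range(6)]
beta = solve(XtX, Xty)

def predict(t, v):
    return sum(b * f for b, f in zip(beta, features(t, v)))

best = min(((t, v) for t in range(1, 14) for v in range(1, 10)),
           key=lambda tv: predict(*tv))
```

Because the synthetic response is exactly quadratic, the fitted surface recovers the planted optimum; with real, noisy data the determination coefficient (as reported in the abstract) indicates how well the surface explains the measurements.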

  7. Laboratory Studies of Homogeneous and Heterogeneous Chemical Processes of Importance in the Upper Atmosphere

    NASA Technical Reports Server (NTRS)

    Molina, Mario J.

    2001-01-01

    The objective of this study is to conduct measurements of chemical kinetics parameters for reactions of importance in the stratosphere and upper troposphere, and to study the interaction of trace gases such as HCl with ice surfaces in order to elucidate the mechanism of heterogeneous chlorine activation processes, using both a theoretical and an experimental approach. The measurements will be carried out under temperature and pressure conditions covering those applicable to the stratosphere and upper troposphere. The techniques to be employed include turbulent flow - chemical ionization mass spectrometry, and optical ellipsometry. The next section summarizes our research activities during the second year of the project, and the section that follows consists of the statement of work for the third year.

  8. Creating OGC Web Processing Service workflows using a web-based editor

    NASA Astrophysics Data System (ADS)

    de Jesus, J.; Walker, P.; Grant, M.

    2012-04-01

    The OGC WPS (Web Processing Service) specifies how geospatial algorithms may be accessed in an SOA (Service Oriented Architecture). Service providers can encode both simple and sophisticated algorithms as WPS processes and publish them as web services. These services are not only useful individually but may be built into complex processing chains (workflows) that can solve complex data analysis and/or scientific problems. The NETMAR project has extended the Web Processing Service (WPS) framework to provide transparent integration between it and the commonly used WSDL (Web Service Description Language) that describes the web services and its default SOAP (Simple Object Access Protocol) binding. The extensions allow WPS services to be orchestrated using commonly used tools (in this case Taverna Workbench, but BPEL based systems would also be an option). We have also developed a WebGUI service editor, based on HTML5 and the WireIt! JavaScript API, that allows users to create these workflows using only a web browser. The editor is coded entirely in JavaScript and performs all XSLT transformations needed to produce a Taverna compatible (T2FLOW) workflow description which can be exported and run on a local Taverna Workbench or uploaded to a web-based orchestration server and run there. Here we present the NETMAR WebGUI service chain editor and discuss the problems associated with the development of a WebGUI for scientific workflow editing; content transformation into the Taverna orchestration language (T2FLOW/SCUFL); final orchestration in the Taverna engine and how to deal with the large volumes of data being transferred between different WPS services (possibly running on different servers) during workflow orchestration. We will also demonstrate using the WebGUI for creating a simple workflow making use of published web processing services, showing how simple services may be chained together to produce outputs that would previously have required a GIS (Geographic

  9. Processing of α-chitin nanofibers by dynamic high pressure homogenization: characterization and antifungal activity against A. niger.

    PubMed

    Salaberria, Asier M; Fernandes, Susana C M; Diaz, Rene Herrera; Labidi, Jalel

    2015-02-13

    Chitin nano-objects are a more interesting and attractive material than native chitin because of their usable form, low density, high surface area and promising mechanical properties. This work suggests a straightforward and environmentally friendly method for processing chitin nanofibers using dynamic high pressure homogenization. This technique proved to be a remarkably simple way to convert α-chitin from yellow lobster wastes into α-chitin nanofibers with a uniform width (below 100 nm) and high aspect ratio, and may contribute to a major breakthrough in chitin applications. Moreover, the resulting α-chitin nanofibers were characterized and compared with native α-chitin in terms of chemical and crystal structure, thermal degradation and antifungal activity. The biological assays highlighted that the nano nature of chitin nanofibers plays an important role in the antifungal activity against Aspergillus niger. PMID:25458302

  10. Development of a new cucumber reference material for pesticide residue analysis: feasibility study for material processing, homogeneity and stability assessment.

    PubMed

    Grimalt, Susana; Harbeck, Stefan; Shegunova, Penka; Seghers, John; Sejerøe-Olsen, Berit; Emteborg, Håkan; Dabrio, Marta

    2015-04-01

    The feasibility of the production of a reference material for pesticide residue analysis in a cucumber matrix was investigated. Cucumber was spiked at 0.075 mg/kg with each of the 15 selected pesticides (acetamiprid, azoxystrobin, carbendazim, chlorpyrifos, cypermethrin, diazinon, (α + β)-endosulfan, fenitrothion, imazalil, imidacloprid, iprodione, malathion, methomyl, tebuconazole and thiabendazole). Three different strategies were considered for processing the material, based on the physicochemical properties of the vegetable and the target pesticides. As a result, a frozen spiked slurry of fresh cucumber, a spiked freeze-dried cucumber powder and a freeze-dried cucumber powder spiked by spraying the powder were studied. The effects of processing and aspects related to the reconstitution of the material were evaluated by monitoring the pesticide levels in the three materials. Two separate analytical methods based on LC-MS/MS and GC-MS/MS were developed and validated in-house. The spiked freeze-dried cucumber powder was selected as the most feasible material and more exhaustive studies on homogeneity and stability of the pesticide residues in the matrix were carried out. The results suggested that the between-unit homogeneity was satisfactory with a sample intake of dried material as low as 0.1 g. A 9-week isochronous stability study was undertaken at -20 °C, 4 °C and 18 °C, with -70 °C designated as the reference temperature. The pesticides tested exhibited adequate stability at -20 °C during the 9-week period as well as at -70 °C for a period of 18 months. These results constitute a good basis for the development of a new candidate reference material for selected pesticides in a cucumber matrix. PMID:25627789

  11. Nonstationary homogeneous nucleation

    NASA Technical Reports Server (NTRS)

    Harstad, K. G.

    1974-01-01

    The theory of homogeneous condensation is reviewed and equations describing this process are presented. Numerical computer solutions to transient problems in nucleation (relaxation to steady state) are presented and compared to a prior computation.

  12. Preparation of cotton linter nanowhiskers by high-pressure homogenization process and its application in thermoplastic starch

    NASA Astrophysics Data System (ADS)

    Savadekar, N. R.; Karande, V. S.; Vigneshwaran, N.; Kadam, P. G.; Mhaske, S. T.

    2015-03-01

    The present work deals with the preparation of cotton linter nanowhiskers (CLNW) by acid hydrolysis and subsequent processing in a high-pressure homogenizer. Prepared CLNW were then used as a reinforcing material in thermoplastic starch (TPS), with an aim to improve its performance properties. Concentration of CLNW was varied as 0, 1, 2, 3, 4 and 5 wt% in TPS. TPS/CLNW nanocomposite films were prepared by solution-casting process. The nanocomposite films were characterized by tensile, differential scanning calorimetry, scanning electron microscopy (SEM), water vapor permeability (WVP), oxygen permeability (OP), X-ray diffraction and light transmittance properties. 3 wt% CLNW-loaded TPS nanocomposite films demonstrated 88% improvement in the tensile strength as compared to the pristine TPS polymer film; whereas, WVP and OP decreased by 90% and 92%, respectively, a considerable improvement given the small quantity of CLNW added. DSC thermograms of nanocomposite films did not show any significant effect on melting temperature as compared to the pristine TPS. Light transmittance (Tr) value of TPS decreased with increased content of CLNW. Better interaction between CLNW and TPS, owing to the hydrophilic nature of both materials, and the uniform distribution of CLNW in TPS were the prime reasons for the improvement in properties observed at 3 wt% loading of CLNW in TPS. However, CLNW was seen to have formed agglomerates at higher concentrations, as determined from SEM analysis. These nanocomposite films can have potential use in food and pharmaceutical packaging applications.

  13. Creating "Intelligent" Climate Model Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, N. C.; Taylor, P. C.

    2014-12-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is often used to add value to model projections: consensus projections have been shown to consistently outperform individual models. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, certain models reproduce climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequally weighting multi-model ensembles. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables—e.g., outgoing longwave radiation and surface temperature. Metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument and surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing weighted and unweighted model ensembles. For example, one tested metric weights the ensemble by how well models reproduce the time-series probability distribution of the cloud forcing component of reflected shortwave radiation. The weighted ensemble for this metric indicates lower simulated precipitation (up to .7 mm/day) in tropical regions than the unweighted ensemble: since CMIP5 models have been shown to
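The weighting idea above can be sketched with a minimal example: each model receives a weight that decays with its error on an evaluation metric, and the consensus projection is the weighted mean. The model names, errors, projections, and Gaussian weighting kernel below are all hypothetical illustrations, not the project's actual framework.

```python
# Minimal sketch of metric-weighted vs. equal-weighted ensemble
# averaging.  Each model carries (metric error vs. observations,
# projected change); weights exp(-(e/sigma)^2) down-weight poor
# performers.  All numbers and names are hypothetical.
import math

models = {
    "model_A": (0.2, 2.1),   # (metric error, projected change)
    "model_B": (1.5, 3.8),   # large error -> should be down-weighted
    "model_C": (0.4, 2.4),
}

def ensemble_mean(models, sigma=None):
    """Equal-weighted mean if sigma is None, else metric-weighted."""
    if sigma is None:
        w = {m: 1.0 for m in models}
    else:
        w = {m: math.exp(-(e / sigma) ** 2) for m, (e, _) in models.items()}
    total = sum(w.values())
    return sum(w[m] * p for m, (_, p) in models.items()) / total

equal = ensemble_mean(models)                # simple average of projections
weighted = ensemble_mean(models, sigma=0.5)  # model_B nearly dropped
```

Here the weighted consensus falls below the equal-weighted one because the poorly performing model happens to project the largest change — exactly the kind of regional divergence between weighted and unweighted ensembles the abstract reports.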

  14. Using a critical reflection process to create an effective learning community in the workplace.

    PubMed

    Walker, Rachel; Cooke, Marie; Henderson, Amanda; Creedy, Debra K

    2013-05-01

    Learning circles are an enabling process to critically examine and reflect on practices with the purpose of promoting individual and organizational growth and change. The authors adapted and developed a learning circle strategy to facilitate open discourse between registered nurses, clinical leaders, clinical facilitators and students, to critically reflect on practice experiences to promote a positive learning environment. This paper reports on an analysis of field notes taken during a critical reflection process used to create an effective learning community in the workplace. A total of 19 learning circles were conducted during in-service periods (that is, the time allocated for professional education between morning and afternoon shifts) over a 3 month period with 56 nurses, 33 students and 1 university-employed clinical supervisor. Participation rates ranged from 3 to 12 individuals per discussion. Ten themes emerged from content analysis of the clinical learning issues identified through the four-step model of critical reflection used in learning circle discussions. The four-step model of critical reflection allowed participants to reflect on clinical learning issues, and raise them in a safe environment that enabled topics to be challenged and explored in a shared and cooperative manner. PMID:22459911

  15. Waste container weighing data processing to create reliable information of household waste generation.

    PubMed

    Korhonen, Pirjo; Kaila, Juha

    2015-05-01

    Household mixed waste container weighing data was processed by knowledge discovery and data mining techniques to create reliable information of household waste generation. The final data set included 27,865 weight measurements covering the whole year 2013 and it was selected from a database of Helsinki Region Environmental Services Authority, Finland. The data set contains mixed household waste arising in 6 m³ containers, and it was processed by identifying missing values and inconsistently low and high values as errors. The share of missing values and errors in the data set was 0.6%. This provides evidence that the waste weighing data gives reliable information of mixed waste generation at collection point level. Characteristic of mixed household waste arising at the waste collection point level is a wide variation between pickups. The seasonal variation pattern as a result of collective similarities in behaviour of households was clearly detected by smoothed medians of waste weight time series. The evaluation of the collection time series against the defined distribution range of pickup weights on the waste collection point level shows that 65% of the pickups were from collection points with optimally dimensioned container capacity and the collection points with over- and under-dimensioned container capacities were noted in 9.5% and 3.4% of all pickups, respectively. Occasional extra waste in containers occurred in 21.2% of the pickups indicating the irregular behaviour of individual households. The results of this analysis show that processing waste weighing data using knowledge discovery and data mining techniques provides trustworthy information of household waste generation and its variations. PMID:25765610
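The two processing steps named above — flagging missing and inconsistently low/high weights as errors, then smoothing the series with rolling medians — can be sketched as follows. The plausibility bounds and window size are illustrative assumptions, not the study's actual thresholds.

```python
# Sketch of the cleaning and smoothing steps: flag missing values and
# implausibly low/high pickup weights as errors, then smooth the valid
# series with a rolling median.  Bounds and window are assumptions.

def clean(weights, low=5.0, high=2000.0):
    """Split raw pickup weights (kg) into valid values and an error count."""
    valid, errors = [], 0
    for w in weights:
        if w is None or not (low <= w <= high):
            errors += 1
        else:
            valid.append(w)
    return valid, errors

def rolling_median(xs, window=3):
    """Median over each sliding window; robust to isolated spikes."""
    out = []
    for i in range(len(xs) - window + 1):
        chunk = sorted(xs[i:i + window])
        out.append(chunk[window // 2])
    return out

raw = [120.0, 130.0, None, 125.0, 900.0, 128.0, 0.0, 122.0]
valid, n_err = clean(raw)       # drops the None and the 0.0 reading
smooth = rolling_median(valid)  # the 900.0 spike never reaches the medians
```

The median (rather than a mean) is what makes the smoothing robust to the occasional extra-waste spikes the abstract describes, while leaving the underlying seasonal pattern intact.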

  16. Manufacturing of 9CrMoCoB Steel of Large Ingot with Homogeneity by ESR Process

    NASA Astrophysics Data System (ADS)

    Kim, D. S.; Lee, G. J.; Lee, M. B.; Hur, J. I.; Lee, J. W.

    2016-07-01

    In the case of 9CrMoCoB (COST FB2) steel, the equilibrium relation between the [B]/[Si] ratio in the metal and the (B2O3)/(SiO2) ratio in the slag is critical for keeping [Si] and [B] within their optimum ranges. In this work, pilot ESR experiments on 9CrMoCoB steel were therefore carried out with the CaF2-CaO-Al2O3-SiO2-B2O3 slag system, varying the Si content of the electrode and the B2O3 content of the slag, to investigate this thermodynamic equilibrium relation. In addition, the merits and demerits of soft arcing were investigated through test melting of a 20-ton-class ESR ingot. The results show that the oxygen content of the ESR ingot decreases with decreasing SiO2 content in the slag, that the relation between the [B]/[Si] ratio and the (B2O3)/(SiO2) ratio derived from the pilot ESR tests agrees well with the calculated line, showing the same slope, and that soft arcing degrades both the interior and the surface quality of the ingot. With the optimized ESR conditions obtained in this study, 1000 mm diameter (20 ton) and 2200 mm diameter (120 ton) 9CrMoCoB ESR ingots were successfully manufactured with good homogeneity.

  17. Regional Homogeneity of Resting-State Brain Activity Suppresses the Effect of Dopamine-Related Genes on Sensory Processing Sensitivity

    PubMed Central

    Chen, Chuansheng; Moyzis, Robert; Xia, Mingrui; He, Yong; Xue, Gui; Li, Jin; He, Qinghua; Lei, Xuemei; Wang, Yunxin; Liu, Bin; Chen, Wen; Zhu, Bi; Dong, Qi

    2015-01-01

    Sensory processing sensitivity (SPS) is an intrinsic personality trait whose genetic and neural bases have recently been studied. The current study used a neural mediation model to explore whether resting-state brain functions mediated the effects of dopamine-related genes on SPS. 298 healthy Chinese college students (96 males, mean age = 20.42 years, SD = 0.89) were scanned with magnetic resonance imaging during resting state, genotyped for 98 loci within the dopamine system, and administered the Highly Sensitive Person Scale. We extracted a “gene score” that summarized the genetic variations representing the 10 loci that were significantly linked to SPS, and then used path analysis to search for brain regions whose resting-state data would help explain the gene-behavior association. Mediation analysis revealed that temporal homogeneity of regional spontaneous activity (ReHo) in the precuneus actually suppressed the effect of dopamine-related genes on SPS. The path model explained 16% of the variance of SPS. This study represents the first attempt at using a multi-gene voxel-based neural mediation model to explore the complex relations among genes, brain, and personality. PMID:26308205
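
The suppression result reported here follows the standard mediation path decomposition: the total effect c of the gene score on SPS splits into a direct effect c' plus an indirect effect a*b routed through the mediator (here, precuneus ReHo), and suppression is indicated when the indirect and direct effects have opposite signs. A minimal ordinary-least-squares sketch of that decomposition, with all variable names hypothetical and no relation to the study's actual data:

```python
def slope(x, y):
    """OLS slope of y on x (single predictor, centered form)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

def ols2(y, x1, x2):
    """OLS of y on [1, x1, x2]; returns (coef_x1, coef_x2) via the
    centered 2x2 normal equations."""
    n = len(y)
    m1, m2, my = sum(x1) / n, sum(x2) / n, sum(y) / n
    s11 = sum((a - m1) ** 2 for a in x1)
    s22 = sum((a - m2) ** 2 for a in x2)
    s12 = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
    s1y = sum((a - m1) * (b - my) for a, b in zip(x1, y))
    s2y = sum((a - m2) * (b - my) for a, b in zip(x2, y))
    det = s11 * s22 - s12 ** 2
    return (s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det

def mediation(gene_score, reho, sps):
    """Decompose the gene -> SPS effect into direct and ReHo-mediated parts.
    Suppression corresponds to indirect (a*b) and direct (c') effects of
    opposite sign, so that |total| < |direct|."""
    c = slope(gene_score, sps)                 # total effect c
    a = slope(gene_score, reho)                # path a: gene -> mediator
    c_prime, b = ols2(sps, gene_score, reho)   # direct effect c', path b
    return {"total": c, "direct": c_prime, "indirect": a * b}
```

For OLS estimates the identity c = c' + a*b holds exactly, which makes the decomposition easy to check on synthetic data.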

  18. Five Important Lessons I Learned during the Process of Creating New Child Care Centers

    ERIC Educational Resources Information Center

    Whitehead, R. Ann

    2005-01-01

    In this article, the author describes her experiences of developing new child care sites and offers five important lessons that she learned through her experiences which helped her to create successful child care centers. These lessons include: (1) Finding an appropriate area and location; (2) Creating realistic financial projections based on real…

  19. Design Process for Online Websites Created for Teaching Turkish as a Foreign Language in Web Based Environments

    ERIC Educational Resources Information Center

    Türker, Fatih Mehmet

    2016-01-01

    In today's world, where online learning environments have increased their efficiency in education and training, the design of the websites prepared for education and training purposes has become an important process. This study is about the teaching process of the online learning environments created to teach Turkish in web based environments, and…

  20. Mechanical homogenization increases bacterial homogeneity in sputum.

    PubMed

    Stokell, Joshua R; Khan, Ammad; Steck, Todd R

    2014-07-01

    Sputum obtained from patients with cystic fibrosis (CF) is highly viscous and often heterogeneous in bacterial distribution. Adding dithiothreitol (DTT) is the standard method for liquefaction prior to processing sputum for molecular detection assays. To determine if DTT treatment homogenizes the bacterial distribution within sputum, we measured the difference in mean total bacterial abundance and abundance of Burkholderia multivorans between aliquots of DTT-treated sputum samples with and without a mechanical homogenization (MH) step using a high-speed dispersing element. Additionally, we measured the effect of MH on bacterial abundance. We found a significant difference between the mean bacterial abundances in aliquots that were subjected to only DTT treatment and those of the aliquots which included an MH step (all bacteria, P = 0.04; B. multivorans, P = 0.05). There was no significant effect of MH on bacterial abundance in sputum. Although our results are from a single CF patient, they indicate that mechanical homogenization increases the homogeneity of bacteria in sputum. PMID:24759710

  1. Detailed homogeneous abundance studies of 14 Galactic s-process enriched post-AGB stars: In search of lead (Pb)

    NASA Astrophysics Data System (ADS)

    De Smedt, K.; Van Winckel, H.; Kamath, D.; Siess, L.; Goriely, S.; Karakas, A. I.; Manick, R.

    2016-03-01

    Context. This paper is part of a larger project in which we systematically study the chemical abundances of Galactic and extragalactic post-asymptotic giant branch (post-AGB) stars. The goal at large is to provide improved observational constraints to the models of the complex interplay between the AGB s-process nucleosynthesis and the associated mixing processes. Aims: Lead (Pb) is the final product of the s-process nucleosynthesis and is predicted to have large overabundances with respect to other s-process elements in AGB stars of low metallicities. However, Pb abundance studies of s-process enriched post-AGB stars in the Magellanic Clouds show a discrepancy between observed and predicted Pb abundances. The determined upper limits based on spectral studies are much lower than what is predicted. In this paper, we focus specifically on the Pb abundance of 14 Galactic s-process enhanced post-AGB stars to check whether the same discrepancy is present in the Galaxy as well. Among these 14 objects, two were not yet subject to a detailed abundance study in the literature. We apply the same method to obtain accurate abundances for the 12 others. Our homogeneous abundance results provide the input of detailed spectral synthesis computations in the spectral regions where Pb lines are located. Methods: We used high-resolution UVES and HERMES spectra for detailed spectral abundance studies of our sample of Galactic post-AGB stars. None of the sample stars display clear Pb lines, and we only deduced upper limits of the Pb abundance by using spectrum synthesis in the spectral ranges of the strongest Pb lines. Results: We do not find any clear evidence of Pb overabundances in our sample. The derived upper limits are strongly correlated with the effective temperature of the stars with increasing upper limits for increasing effective temperatures. We obtain stronger Pb constraints on the cooler objects. 
Moreover, we confirm the s-process enrichment and carbon enhancement of two

  3. Using the "New Planning for Results" Process To Create Local Standards of Library Service.

    ERIC Educational Resources Information Center

    Kotch, Marianne

    2002-01-01

    Discusses "The New Planning for Results" manual published by the American Library Association that helps create local standards of public library service, and provides implementation examples based on experiences in Vermont. Highlights include evaluating community needs; service responses to those needs; developing library objectives; and…

  4. Creating Joint Attentional Frames and Pointing to Evidence in the Reading and Writing Process

    ERIC Educational Resources Information Center

    Unger, John A.; Liu, Rong; Scullion, Vicki A.

    2015-01-01

    This theory-into-practice paper integrates Tomasello's concept of Joint Attentional Frames and well-known ideas related to the work of Russian psychologist, Lev Vygotsky, with more recent ideas from social semiotics. Classroom procedures for incorporating student-created Joint Attentional Frames into literacy lessons are explained by links to…

  5. Thermomechanical process optimization of U-10wt% Mo - Part 2: The effect of homogenization on the mechanical properties and microstructure

    NASA Astrophysics Data System (ADS)

    Joshi, Vineet V.; Nyberg, Eric A.; Lavender, Curt A.; Paxton, Dean; Burkes, Douglas E.

    2015-10-01

    In the first part of this series, it was determined that as-cast U-10Mo had a dendritic microstructure with chemical inhomogeneity and underwent eutectoid transformation during hot compression testing. In the present (second) part of the work, the as-cast samples were heat treated at several temperatures and times to homogenize the Mo content. Like the as-cast material before them, the "homogenized" materials were then tested under compression between 500 and 800 °C. The as-cast samples and those treated at 800 °C for 24 h had grain sizes of 25-30 μm, whereas those treated at 1000 °C for 16 h had grain sizes around 250 μm before testing. Compression testing showed that the homogenization heat treatment affected both the mechanical properties and the precipitation of the lamellar phase at sub-eutectoid temperatures.

  6. Orthogonality Measurement for Homogenous Projects-Bases

    ERIC Educational Resources Information Center

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  7. We're Born to Learn: Using the Brain's Natural Learning Process to Create Today's Curriculum. Second Edition

    ERIC Educational Resources Information Center

    Smilkstein, Rita

    2011-01-01

    This updated edition of the bestselling book on the brain's natural learning process brings new research results and applications in a power-packed teacher tool kit. Rita Smilkstein shows teachers how to create and deliver curricula that help students become the motivated, successful, and natural learners they were born to be. Updated features…

  8. It's Who You Know "and" What You Know: The Process of Creating Partnerships between Schools and Communities

    ERIC Educational Resources Information Center

    Hands, Catherine

    2005-01-01

    Based on qualitative research, this article aims to clarify the process of creating school-community partnerships. Two secondary schools with numerous partnerships were selected within a southern Ontario school board characterized by economic and cultural diversity. Drawing on the within- and cross-case analyses of documents, observations, and 25…

  9. Method of removing the effects of electrical shorts and shunts created during the fabrication process of a solar cell

    DOEpatents

    Nostrand, Gerald E.; Hanak, Joseph J.

    1979-01-01

    A method of removing the effects of electrical shorts and shunts created during the fabrication process, thereby improving the performance of a solar cell with a thick-film cermet electrode opposite the incident surface, by applying a reverse bias voltage of sufficient magnitude to burn out the shorts and shunts but below the breakdown voltage of the solar cell.

  10. Atomic processes in plasmas created by an ultra-short laser pulse

    NASA Astrophysics Data System (ADS)

    Audebert, P.; Lecherbourg, L.; Bastiani-Ceccotti, S.; Geindre, J.-P.; Blancard, C.; Cossé, P.; Faussurier, G.; Shepherd, R.; Renaudin, P.

    2008-05-01

    Point projection K-shell absorption spectroscopy has been used to measure absorption spectra of transient aluminum plasma created by an ultra-short laser pulse. 1s-2p and 1s-3p absorption lines of weakly ionized aluminum were measured for an extended range of densities in a relatively low-temperature regime. Independent plasma characterization was obtained from frequency domain interferometry (FDI) diagnostic and allows the interpretation of the absorption spectra in terms of spectral opacities. The experimental spectra are compared with opacity calculations using the density and temperature inferred from the analysis of the FDI data.

  11. All varieties of encoding variability are not created equal: Separating variable processing from variable tasks

    PubMed Central

    Huff, Mark J.; Bodner, Glen E.

    2014-01-01

    Whether encoding variability facilitates memory is shown to depend on whether item-specific and relational processing are both performed across study blocks, and whether study items are weakly versus strongly related. Variable-processing groups studied a word list once using an item-specific task and once using a relational task. Variable-task groups’ two different study tasks recruited the same type of processing each block. Repeated-task groups performed the same study task each block. Recall and recognition were greatest in the variable-processing group, but only with weakly related lists. A variable-processing benefit was also found when task-based processing and list-type processing were complementary (e.g., item-specific processing of a related list) rather than redundant (e.g., relational processing of a related list). That performing both item-specific and relational processing across trials, or within a trial, yields encoding-variability benefits may help reconcile decades of contradictory findings in this area. PMID:25018583

  12. Simulation of the Vapor Intrusion Process for Non-Homogeneous Soils Using a Three-Dimensional Numerical Model

    PubMed Central

    Bozkurt, Ozgur; Pennell, Kelly G.; Suuberg, Eric M.

    2010-01-01

    This paper presents model simulation results of vapor intrusion into structures built atop sites contaminated with volatile or semi-volatile chemicals of concern. A three-dimensional finite element model was used to investigate the importance of factors that could influence vapor intrusion when the site is characterized by non-homogeneous soils. Model simulations were performed to examine how soil layers of differing properties alter soil gas concentration profiles and vapor intrusion rates into structures. The results illustrate difference in soil gas concentration profiles and vapor intrusion rates between homogeneous and layered soils. The findings support the need for site conceptual models to adequately represent the site’s geology when conducting site characterizations, interpreting field data and assessing the risk of vapor intrusion at a given site. For instance, in layered geologies, a lower permeability and diffusivity soil layer between the source and building often limits vapor intrusion rates, even if a higher permeability layer near the foundation permits increased soil gas flow rates into the building. In addition, the presence of water-saturated clay layers can considerably influence soil gas concentration profiles. Therefore, interpreting field data without accounting for clay layers in the site conceptual model could result in inaccurate risk calculations. Important considerations for developing more accurate conceptual site models are discussed in light of the findings. PMID:20664816
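
The layering effect described above, where a low-diffusivity layer between the source and the building limits vapor intrusion rates, can be illustrated with a drastically simplified steady-state 1-D diffusion estimate. This is a sketch for intuition only, not the paper's 3-D finite element model, and all numbers are hypothetical:

```python
def diffusive_flux(c_source, c_surface, layers):
    """Steady-state 1-D diffusive flux through soil layers in series:
    J = (c_source - c_surface) / sum(L_i / D_i), where layer i has
    thickness L_i (m) and effective diffusivity D_i (m^2/s).
    Each layer contributes a resistance L_i / D_i, exactly as
    resistors in series."""
    resistance = sum(thickness / diffusivity for thickness, diffusivity in layers)
    return (c_source - c_surface) / resistance

# Hypothetical sand column vs. the same column with a thin clay layer added.
sand_only = diffusive_flux(1.0, 0.0, [(2.0, 1e-6)])
with_clay = diffusive_flux(1.0, 0.0, [(2.0, 1e-6), (0.1, 1e-8)])
```

In this example a 0.1 m clay layer with a diffusivity two orders of magnitude lower dominates the total resistance and cuts the flux by a factor of six, mirroring the paper's point that clay layers can control soil gas concentration profiles and must appear in the site conceptual model.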

  13. Chemically Patterned Inverse Opal Created by a Selective Photolysis Modification Process.

    PubMed

    Tian, Tian; Gao, Ning; Gu, Chen; Li, Jian; Wang, Hui; Lan, Yue; Yin, Xianpeng; Li, Guangtao

    2015-09-01

    Anisotropic photonic crystal materials have long been pursued for their broad applications. A novel method for creating chemically patterned inverse opals is proposed here. The patterning technique is based on selective photolysis of a photolabile polymer together with postmodification on released amine groups. The patterning method allows regioselective modification within an inverse opal structure, taking advantage of selective chemical reaction. Moreover, combined with the unique signal self-reporting feature of the photonic crystal, the fabricated structure is capable of various applications, including gradient photonic bandgap and dynamic chemical patterns. The proposed method provides the ability to extend the structural and chemical complexity of the photonic crystal, as well as its potential applications. PMID:26269453

  14. Numerical Simulation of Crater Creating Process in Dynamic Replacement Method by Smooth Particle Hydrodynamics

    NASA Astrophysics Data System (ADS)

    Danilewicz, Andrzej; Sikora, Zbigniew

    2015-02-01

    The theoretical basis of the SPH method is presented, including the governing equations, a discussion of the importance of the smoothing function length, contact formulation, boundary treatment and, finally, use in hydrocode simulations. The chronological development of Smoothed Particle Hydrodynamics is outlined, and theoretical fundamentals (stability and consistency of the SPH formulation, artificial viscosity and boundary treatment) are discussed. Time integration techniques with their stability conditions, SPH+FEM coupling, the constitutive equation and the equation of state (EOS) are presented as well. An application of SPH to a real case of large penetration (crater formation) into soil caused by a falling mass in the Dynamic Replacement Method is discussed, including the influence of particle spacing on method accuracy, with an example calculated using LS-DYNA software.
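
The role of the smoothing function discussed above can be made concrete with the standard cubic-spline kernel and the SPH summation density. This 1-D sketch is purely illustrative and unrelated to the LS-DYNA model used in the paper:

```python
def w_cubic_spline(r, h):
    """Standard cubic-spline SPH smoothing kernel in 1-D, with support 2h
    and normalization factor 2/(3h) so that the kernel integrates to 1."""
    q = r / h
    sigma = 2.0 / (3.0 * h)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

def sph_density(positions, masses, h):
    """Summation density: rho_i = sum_j m_j * W(|x_i - x_j|, h).
    Particle spacing relative to h controls the accuracy of this sum."""
    return [sum(m_j * w_cubic_spline(abs(x_i - x_j), h)
                for x_j, m_j in zip(positions, masses))
            for x_i in positions]
```

For uniformly spaced particles with spacing equal to h, the interior density is recovered exactly; near the boundary the kernel support is truncated and the density is underestimated, which is one motivation for the boundary treatments the paper discusses.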

  15. Rethinking Communication in Innovation Processes: Creating Space for Change in Complex Systems

    ERIC Educational Resources Information Center

    Leeuwis, Cees; Aarts, Noelle

    2011-01-01

    This paper systematically rethinks the role of communication in innovation processes, starting from largely separate theoretical developments in communication science and innovation studies. Literature review forms the basis of the arguments presented. The paper concludes that innovation is a collective process that involves the contextual…

  16. Creating Trauma-Informed Child Welfare Systems Using a Community Assessment Process

    ERIC Educational Resources Information Center

    Hendricks, Alison; Conradi, Lisa; Wilson, Charles

    2011-01-01

    This article describes a community assessment process designed to evaluate a specific child welfare jurisdiction based on the current definition of trauma-informed child welfare and its essential elements. This process has recently been developed and pilot tested within three diverse child welfare systems in the United States. The purpose of the…

  17. The Process of Inclusion and Accommodation: Creating Accessible Groups for Individuals with Disabilities.

    ERIC Educational Resources Information Center

    Patterson, Jeanne Boland; And Others

    1995-01-01

    Supports the important work of group counselors by focusing on the inclusion of individuals with disabilities in nondisability specific groups and addressing disability myths, disability etiquette, architectural accessibility, and group process issues. (LKS)

  18. BrainK for Structural Image Processing: Creating Electrical Models of the Human Head

    PubMed Central

    Li, Kai; Papademetris, Xenophon; Tucker, Don M.

    2016-01-01

    BrainK is a set of automated procedures for characterizing the tissues of the human head from MRI, CT, and photogrammetry images. The tissue segmentation and cortical surface extraction support the primary goal of modeling the propagation of electrical currents through head tissues with a finite difference model (FDM) or finite element model (FEM) created from the BrainK geometries. The electrical head model is necessary for accurate source localization of dense array electroencephalographic (dEEG) measures from head surface electrodes. It is also necessary for accurate targeting of cerebral structures with transcranial current injection from those surface electrodes. BrainK must achieve five major tasks: image segmentation, registration of the MRI, CT, and sensor photogrammetry images, cortical surface reconstruction, dipole tessellation of the cortical surface, and Talairach transformation. We describe the approach to each task, and we compare the accuracies for the key tasks of tissue segmentation and cortical surface extraction in relation to existing research tools (FreeSurfer, FSL, SPM, and BrainVisa). BrainK achieves good accuracy with minimal or no user intervention, it deals well with poor quality MR images and tissue abnormalities, and it provides improved computational efficiency over existing research packages. PMID:27293419

  19. Creating Low Vision and Nonvisual Instructions for Diabetes Technology: An Empirically Validated Process

    PubMed Central

    Williams, Ann S.

    2012-01-01

    Introduction Nearly 20% of the adults with diagnosed diabetes in the United States also have visual impairment. Many individuals in this group perform routine diabetes self-management tasks independently, often using technology that was not specifically designed for use by people with visual impairment (e.g., insulin pumps and pens). Equitable care for persons with disabilities requires providing instructions in formats accessible for nonreaders. However, instructions in accessible formats, such as recordings, braille, or digital documents that are legible to screen readers, are seldom available. Method This article includes a summary of existing guidelines for creating accessible documents. The guidelines are followed by a description of the production of accessible nonvisual instructions for use of insulin pens used in a study of dosing accuracy. The study results indicate that the instructions were used successfully by 40 persons with visual impairment. Discussion and Conclusions Instructions in accessible formats can increase access to the benefits of diabetes technology for persons with visual impairment. Recorded instructions may also be useful to sighted persons who do not read well, such as those with dyslexia, low literacy, or who use English as a second language. Finally, they may have important benefits for fully sighted people who find it easier to learn to use technology by handling the equipment while listening to instructions. Manufacturers may also benefit from marketing to an increased pool of potential users. PMID:22538133

  1. Creating aging-enriched social work education: a process of curricular and organizational change.

    PubMed

    Hooyman, Nancy; St Peter, Suzanne

    2006-01-01

    The CSWE Geriatric Enrichment in Social Work Education Project, funded by the John A. Hartford Foundation, aimed to change curricula and organizational structures in 67 GeroRich projects so that all students would graduate with the foundation knowledge and skills to work effectively with older adults and their families. The emphasis was on change processes that infuse and sustain gerontological competencies and curricular resources in foundation courses. This article presents lessons learned and strategies for engaging faculty, practitioners and students in the curricular and organizational change process. PMID:17200068

  2. Study of stirred layers on 316L steel created by friction stir processing

    NASA Astrophysics Data System (ADS)

    Langlade, C.; Roman, A.; Schlegel, D.; Gete, E.; Folea, M.

    2014-08-01

    Nanostructured materials are known to exhibit attractive properties, especially in the mechanical field, where high hardness is of great interest. The friction stir process (FSP) is a recent surface engineering technique derived from the friction stir welding (FSW) method. In this study, the FSP of a 316L austenitic stainless steel has been evaluated. The treated layers have been characterized in terms of hardness and microstructure, and these results have been related to the FSP operational parameters. The process has been analysed using a response surface method (RSM) to enable prediction of the stirred-layer thickness.
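
A response surface method of the kind mentioned above typically fits a second-order polynomial to the measured responses and then uses it for prediction. A minimal ordinary-least-squares sketch follows; the factor names (tool rotation speed, traverse feed) and all data are hypothetical, and NumPy is assumed to be available:

```python
import numpy as np

def fit_response_surface(speed, feed, thickness):
    """Fit a second-order response surface
        t = b0 + b1*s + b2*f + b3*s**2 + b4*f**2 + b5*s*f
    to measured stirred-layer thicknesses by ordinary least squares."""
    X = np.column_stack([np.ones_like(speed), speed, feed,
                         speed**2, feed**2, speed * feed])
    beta, *_ = np.linalg.lstsq(X, thickness, rcond=None)
    return beta

def predict(beta, speed, feed):
    """Evaluate the fitted surface at new operating points."""
    return (beta[0] + beta[1] * speed + beta[2] * feed
            + beta[3] * speed**2 + beta[4] * feed**2 + beta[5] * speed * feed)
```

With at least six well-spread design points the six coefficients are identifiable, and the fitted surface can then be searched for operating conditions that give a target layer thickness.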

  3. Not All Analogies Are Created Equal: Associative and Categorical Analogy Processing following Brain Damage

    ERIC Educational Resources Information Center

    Schmidt, Gwenda L.; Cardillo, Eileen R.; Kranjec, Alexander; Lehet, Matthew; Widick, Page; Chatterjee, Anjan

    2012-01-01

    Current research on analogy processing assumes that different conceptual relations are treated similarly. However, just as words and concepts are related in distinct ways, different kinds of analogies may employ distinct types of relationships. An important distinction in how words are related is the difference between associative (dog-bone) and…

  4. Creating Sustainable Education Projects in Roatán, Honduras through Continuous Process Improvement

    ERIC Educational Resources Information Center

    Raven, Arjan; Randolph, Adriane B.; Heil, Shelli

    2010-01-01

    The investigators worked together with permanent residents of Roatán, Honduras on sustainable initiatives to help improve the island's troubled educational programs. Our initiatives focused on increasing the number of students eligible and likely to attend a university. Using a methodology based in continuous process improvement, we developed…

  5. Feasibility study for producing a carrot/potato matrix reference material for 11 selected pesticides at EU MRL level: material processing, homogeneity and stability assessment.

    PubMed

    Saldanha, Helena; Sejerøe-Olsen, Berit; Ulberth, Franz; Emons, Hendrik; Zeleny, Reinhard

    2012-05-01

    The feasibility of producing a matrix reference material for selected pesticides in a carrot/potato matrix was investigated. A commercially available baby food (carrot/potato-based mash) was spiked with 11 pesticides at the respective EU maximum residue limits (MRLs) and further processed by either freezing or freeze-drying. Batches of some 150 units were produced per material type. First, the materials were assessed for the relative amount of pesticide recovered after processing (the ratio of the pesticide concentration in the processed material to the initially spiked concentration). In addition, the materials' homogeneity (bottle-to-bottle variation) and their short-term (1 month) and mid-term (5 months) stability at different temperatures were assessed. For this, an in-house validated GC-EI-MS method operated in SIM mode, with a sample preparation procedure based on the QuEChERS ("quick, easy, cheap, effective, rugged, and safe") principle, was applied. Measurements on the frozen material gave the most promising results (smallest analyte losses during production), and freeze-drying also proved to be a suitable alternative processing technique for most of the investigated pesticides. Both the frozen and the freeze-dried materials proved sufficiently homogeneous for the intended use, and storage at -20°C for 5 months did not reveal any detectable material degradation. The results constitute an important step towards the development of a pesticide matrix reference material. PMID:26434333
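
Between-bottle homogeneity assessments of the kind mentioned above are conventionally based on a one-way ANOVA that separates between-bottle variation from within-bottle (repeatability) variation, in the spirit of ISO Guide 35. A minimal sketch, assuming a balanced design (same number of replicates per bottle); the function name and structure are illustrative, not taken from the study:

```python
from statistics import mean

def homogeneity_anova(bottles):
    """One-way ANOVA for a balanced homogeneity study.
    `bottles`: one list of replicate measurements per bottle.
    Returns (ms_between, ms_within, s_bb), where s_bb estimates the
    between-bottle standard deviation (clamped at zero when the
    between-bottle mean square falls below the within-bottle one)."""
    k = len(bottles)                 # number of bottles
    n = len(bottles[0])              # replicates per bottle (balanced design)
    bottle_means = [mean(b) for b in bottles]
    grand = mean(bottle_means)
    ms_between = n * sum((m - grand) ** 2 for m in bottle_means) / (k - 1)
    ms_within = (sum((x - m) ** 2
                     for b, m in zip(bottles, bottle_means) for x in b)
                 / (k * (n - 1)))
    s_bb_sq = max((ms_between - ms_within) / n, 0.0)
    return ms_between, ms_within, s_bb_sq ** 0.5
```

The material is judged sufficiently homogeneous when s_bb is small relative to the target uncertainty of the certified value.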

  6. Dynamic Disturbance Processes Create Dynamic Lek Site Selection in a Prairie Grouse.

    PubMed

    Hovick, Torre J; Allred, Brady W; Elmore, R Dwayne; Fuhlendorf, Samuel D; Hamilton, Robert G; Breland, Amber

    2015-01-01

    It is well understood that landscape processes can affect habitat selection patterns, movements, and species persistence. These selection patterns may be altered or even eliminated as a result of changes in disturbance regimes and a concomitant management focus on uniform, moderate disturbance across landscapes. To assess how restored landscape heterogeneity influences habitat selection patterns, we examined 21 years (1991, 1993-2012) of Greater Prairie-Chicken (Tympanuchus cupido) lek location data in tallgrass prairie with restored fire and grazing processes. Our study took place at The Nature Conservancy's Tallgrass Prairie Preserve, located at the southern extent of the Flint Hills in northeastern Oklahoma. We specifically addressed the stability of lek locations in the context of the fire-grazing interaction, and the environmental factors influencing lek locations. We found that lek locations were dynamic in a landscape with interacting fire and grazing. While previous conservation efforts have treated leks as stable with high site fidelity in static landscapes, a majority of lek locations in our study (i.e., 65%) moved by nearly one kilometer on an annual basis in this dynamic setting. Lek sites were in elevated areas with low tree cover and low road density. Additionally, lek site selection was influenced by an interaction of fire and patch edge, indicating that in recently burned patches, leks were located near patch edges. These results suggest that dynamic and interactive processes such as fire and grazing that restore heterogeneity to grasslands do influence habitat selection patterns in prairie grouse, a phenomenon that is likely to apply throughout the Greater Prairie-Chicken's distribution when dynamic processes are restored. As conservation moves toward restoring dynamic historic disturbance patterns, it will be important that the siting and planning of anthropogenic structures (e.g., wind energy, oil and gas) and management plans not view lek locations as static.

  7. Dynamic Disturbance Processes Create Dynamic Lek Site Selection in a Prairie Grouse

    PubMed Central

    Hovick, Torre J.; Allred, Brady W.; Elmore, R. Dwayne; Fuhlendorf, Samuel D.; Hamilton, Robert G.; Breland, Amber

    2015-01-01

    It is well understood that landscape processes can affect habitat selection patterns, movements, and species persistence. These selection patterns may be altered or even eliminated as a result of changes in disturbance regimes and a concomitant management focus on uniform, moderate disturbance across landscapes. To assess how restored landscape heterogeneity influences habitat selection patterns, we examined 21 years (1991, 1993–2012) of Greater Prairie-Chicken (Tympanuchus cupido) lek location data in tallgrass prairie with restored fire and grazing processes. Our study took place at The Nature Conservancy’s Tallgrass Prairie Preserve, located at the southern extent of the Flint Hills in northeastern Oklahoma. We specifically addressed the stability of lek locations in the context of the fire-grazing interaction, and the environmental factors influencing lek locations. We found that lek locations were dynamic in a landscape with interacting fire and grazing. While previous conservation efforts have treated leks as stable with high site fidelity in static landscapes, a majority of lek locations in our study (i.e., 65%) moved by nearly one kilometer on an annual basis in this dynamic setting. Lek sites were in elevated areas with low tree cover and low road density. Additionally, lek site selection was influenced by an interaction of fire and patch edge, indicating that in recently burned patches, leks were located near patch edges. These results suggest that dynamic and interactive processes such as fire and grazing that restore heterogeneity to grasslands do influence habitat selection patterns in prairie grouse, a phenomenon that is likely to apply throughout the Greater Prairie-Chicken’s distribution when dynamic processes are restored. As conservation moves toward restoring dynamic historic disturbance patterns, it will be important that the siting and planning of anthropogenic structures (e.g., wind energy, oil and gas) and management plans not view lek locations as static.

  8. Near InfraRed Spectroscopy homogeneity evaluation of complex powder blends in a small-scale pharmaceutical preformulation process, a real-life application.

    PubMed

    Storme-Paris, I; Clarot, I; Esposito, S; Chaumeil, J C; Nicolas, A; Brion, F; Rieutord, A; Chaminade, P

    2009-05-01

    Near InfraRed Spectroscopy (NIRS) is a potentially powerful tool for assessing the homogeneity of industrial powder blends. In the particular context of hospital manufacturing, we considered the introduction of the technique at a small pharmaceutical process scale, with the objective of following blend homogeneity in mixtures of seven components. This article investigates the performance of various NIRS-based methodologies to assess powder blending. The formulation studied is prescribed in a haematology unit as part of the treatment for digestive decontamination in children receiving stem-cell transplantation. It is composed of the active pharmaceutical ingredients (APIs) colimycin and tobramycin and five excipients. We evaluated 39 different blends comprising 14 different formulations, with uncorrelated proportions of constituents between these 14 formulations. The reference methods used to establish the NIRS models were gravimetry and a High Performance Liquid Chromatography method coupled to Evaporative Light Scattering Detection. Unsupervised and supervised qualitative and quantitative chemometric methods were performed to assess powder blend homogeneity using a benchtop instrument equipped with an optical fibre. For qualitative evaluations, unsupervised Moving Block Standard Deviation, autocorrelation functions and Partial Least Square Discriminant Analysis (PLS-DA) were used. For quantitative evaluations, Partial Least Square Cross-Validated models were chosen. Results are expressed as API and major excipient percentages of theoretical values as a function of blending time. The 14 different formulations were only satisfactorily discriminated by supervised algorithms, such as an optimised PLS-DA model. The homogeneity state was demonstrated after 16 min of blending, quantifying three components with a precision between 1.2% and 1.4% w/w. This study demonstrates, for the first time, the effective implementation of NIRS for blend homogeneity evaluation.
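
    Of the chemometric screens named above, the unsupervised Moving Block Standard Deviation (MBSD) is the simplest to illustrate: a window of consecutive spectra is pooled, the standard deviation is computed at each wavelength and averaged to a single value, and a low, stable trace indicates homogeneity. A minimal sketch, assuming nothing about the study's actual spectra or software (the array shapes and synthetic data are illustrative):

```python
import numpy as np

def moving_block_std(spectra, block_size=5):
    """Moving Block Standard Deviation (MBSD) over a time-ordered
    stack of NIR spectra, shape (n_spectra, n_wavelengths).

    For each block of `block_size` consecutive spectra, the standard
    deviation is computed at every wavelength and averaged into one
    value; a low, flat MBSD trace suggests the blend is homogeneous."""
    spectra = np.asarray(spectra, dtype=float)
    n = spectra.shape[0]
    trace = []
    for start in range(n - block_size + 1):
        block = spectra[start:start + block_size]
        trace.append(block.std(axis=0, ddof=1).mean())
    return np.array(trace)
```

    In practice a blend would be judged homogeneous once the trace levels off near the instrument noise floor.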

  9. Creating a process for incorporating epidemiological modelling into outbreak management decisions.

    PubMed

    Akselrod, Hana; Mercon, Monica; Kirkeby Risoe, Petter; Schlegelmilch, Jeffrey; McGovern, Joanne; Bogucki, Sandy

    2012-01-01

    Modern computational models of infectious diseases greatly enhance our ability to understand new infectious threats and assess the effects of different interventions. The recently-released CDC Framework for Preventing Infectious Diseases calls for increased use of predictive modelling of epidemic emergence for public health preparedness. Currently, the utility of these technologies in preparedness and response to outbreaks is limited by gaps between modelling output and information requirements for incident management. The authors propose an operational structure that will facilitate integration of modelling capabilities into action planning for outbreak management, using the Incident Command System (ICS) and Synchronization Matrix framework. It is designed to be adaptable and scalable for use by state and local planners under the National Response Framework (NRF) and Emergency Support Function #8 (ESF-8). Specific epidemiological modelling requirements are described, and integrated with the core processes for public health emergency decision support. These methods can be used in checklist format to align prospective or real-time modelling output with anticipated decision points, and guide strategic situational assessments at the community level. It is anticipated that formalising these processes will facilitate translation of the CDC's policy guidance from theory to practice during public health emergencies involving infectious outbreaks. PMID:22948107

  10. ArhiNet - A Knowledge-Based System for Creating, Processing and Retrieving Archival eContent

    NASA Astrophysics Data System (ADS)

    Salomie, Ioan; Dinsoreanu, Mihaela; Pop, Cristina; Suciu, Sorin

    This paper addresses the problem of creating, processing and querying semantically enhanced eContent from archives and digital libraries. We present an analysis of the archival domain, resulting in the creation of an archival domain model and of a domain ontology core. Our system adds semantic mark-up to the content of historical documents, thus enabling document and knowledge retrieval in response to natural language ontology-guided queries. The system functionality follows two main workflows: (i) semantically enhanced eContent generation and knowledge acquisition and (ii) knowledge processing and retrieval. Within the first workflow, the relevant domain information is extracted from documents written in natural languages, followed by semantic annotation and domain ontology population. In the second workflow, ontologically guided natural language queries trigger reasoning processes that provide relevant search results. The paper also discusses the transformation of the OWL domain ontology into a hierarchical data model, thus providing support for efficient ontology processing.
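
    The closing step, flattening an OWL domain ontology into a hierarchical data model, can be sketched in miniature. The archival class names and the dict-of-dicts target model below are invented for illustration; the paper does not specify its actual data model:

```python
def build_hierarchy(subclass_of):
    """Flatten class -> parent ('subClassOf') pairs into a nested
    hierarchical data model (dict of dicts). A parent of None marks
    a root class. Class names here are purely illustrative."""
    children = {}
    roots = []
    for cls, parent in subclass_of.items():
        if parent is None:
            roots.append(cls)
        else:
            children.setdefault(parent, []).append(cls)

    def subtree(cls):
        return {c: subtree(c) for c in sorted(children.get(cls, []))}

    return {r: subtree(r) for r in sorted(roots)}
```

    A real system would read the subclass axioms from the OWL file with an ontology library rather than from a hand-written dict, but the flattening logic is the same.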

  11. Description of the process used to create the 1992 Hanford Mortality Study database

    SciTech Connect

    Gilbert, E. S.; Buchanan, J. A.; Holter, N. A.

    1992-12-01

    An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document that process. The primary sources of data were the Occupational Health History (OHH) files, maintained by the Hanford Environmental Health Foundation (HEHF), which include demographic data and job histories; the Hanford Mortality (HMO) files, also maintained by HEHF, which include information on the deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files, maintained by PNL's Health Physics Department, which contain external dosimetry data; and a file of workers with confirmed internal depositions of radionuclides, also maintained by PNL's Health Physics Department. This report describes each of these files in detail and describes the many edits that were performed to check the consistency and accuracy of data within and between these files.

  13. Climate for Learning: A Symposium. Creating a Climate for Learning, and the Humanizing Process. The Principal and School Discipline. Curriculum Bulletin Vol. XXXII, No. 341.

    ERIC Educational Resources Information Center

    Johnson, Simon O.; Chaky, June

    This publication contains two articles focusing on creating a climate for learning. In "Creating a Climate for Learning, and the Humanizing Process," Simon O. Johnson offers practical suggestions for creating a humanistic learning environment. The author begins by defining the basic concepts--humanism, affective education, affective situation,…

  14. Not all analogies are created equal: Associative and categorical analogy processing following brain damage

    PubMed Central

    Schmidt, Gwenda L.; Cardillo, Eileen R.; Kranjec, Alexander; Lehet, Matthew; Widick, Page; Chatterjee, Anjan

    2012-01-01

    Current research on analogy processing assumes that different conceptual relations are treated similarly. However, just as words and concepts are related in distinct ways, different kinds of analogies may employ distinct types of relationships. An important distinction in how words are related is the difference between associative (dog-bone) and categorical (dog-cat) relations. To test the hypothesis that analogical mapping of different types of relations would have different neural instantiations, we tested patients with left and right hemisphere lesions on their ability to understand two types of analogies, ones expressing an associative relationship and others expressing a categorical relationship. Voxel-based lesion-symptom mapping (VLSM) and behavioral analyses revealed that associative analogies relied on a large left-lateralized language network while categorical analogies relied on both left and right hemispheres. The verbal nature of the task could account for the left hemisphere findings. We argue that categorical relations additionally rely on the right hemisphere because they are more difficult, abstract, and fragile; and contain more distant relationships. PMID:22402184

  15. Integrated assessment of emerging science and technologies as creating learning processes among assessment communities.

    PubMed

    Forsberg, Ellen-Marie; Ribeiro, Barbara; Heyen, Nils B; Nielsen, Rasmus Øjvind; Thorstensen, Erik; de Bakker, Erik; Klüver, Lars; Reiss, Thomas; Beekman, Volkert; Millar, Kate

    2016-12-01

    Emerging science and technologies are often characterised by complexity, uncertainty and controversy. Regulation and governance of such scientific and technological developments need to build on knowledge and evidence that reflect this complicated situation. This insight is sometimes formulated as a call for integrated assessment of emerging science and technologies, and such a call is analysed in this article. The article addresses two overall questions. The first is: to what extent are emerging science and technologies currently assessed in an integrated way? The second is: if there appears to be a need for further integration, what should such integration consist in? In the article we briefly outline the pedigree of the term 'integrated assessment' and present a number of interpretations of the concept that are useful for informing current analyses and discussions of integration in assessment. Based on four case studies of assessment of emerging science and technologies, studies of assessment traditions, literature analysis and dialogues with assessment professionals, currently under-developed integration dimensions are identified. It is suggested how these dimensions can be addressed in a practical approach to assessment where representatives of different assessment communities and stakeholders are involved. We call this approach the Trans Domain Technology Evaluation Process (TranSTEP). PMID:27465504

  16. AFRA confronts gender issues: the process of creating a gender strategy.

    PubMed

    Bydawell, M

    1997-02-01

    The Association for Rural Advancement (AFRA), a nongovernmental organization in South Africa affiliated with the National Land Committee (NLC), seeks to redress the legacy of unjust land dispensation during the apartheid period. AFRA is the first organization within NLC to deal openly with issues of race and gender; this process has been conflictual, however. At gender training workshops conducted by White development workers, many staff expressed the view that sexism is an alien Western issue. Moreover, gender sensitivity was interpreted by Black staff as an assault on their race and cultural identity. The staff itself was polarized on racial grounds, with White managers and Black field workers. Staff further expressed concerns that a gender perspective would dilute AFRA's focus on land reform and alienate rural women who want male household heads to continue to hold the title to their land. The organizational structure was reorganized, though, to become more democratic and racially representative. The 1995 appointment of the first field worker assigned to address women's empowerment in both the organization and target communities refueled the controversy, and a gender workshop led by a psychologist was held to build trust and unity. Staff moved toward a shared understanding of gender as an aspect of social differentiation. AFRA has since committed itself to develop an integrated gender strategy sensitive to people's needs and fears. PMID:12320741

  17. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System (AWIPS) Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or DARE Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on the fly for AWIPS using two examples of AWIPS applications created by the Applied Meteorology Unit (AMU). The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) and the 45th Weather Squadron (45 WS) to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. Advantages of both file types will be listed.
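
    The Anvil Threat Corridor Forecast Tool is described only at a high level here. Purely as an illustration of turning an upper-level wind into a downwind threat polygon, a sketch follows; the rectangular shape, names, units and flat-earth conversion are all assumptions for the example, not the AMU tool's actual geometry:

```python
import math

def anvil_corridor(lon, lat, wind_dir_deg, wind_speed_kt,
                   hours=3.0, half_width_nm=10.0):
    """Hypothetical rectangular threat corridor: a box extending
    downwind of a site for `hours` of anvil drift at the upper-level
    wind speed, half_width_nm either side of the drift axis.
    Flat-earth approximation: 1 degree of latitude ~ 60 nautical miles."""
    length_nm = wind_speed_kt * hours
    # Meteorological wind direction is where the wind blows FROM,
    # so the anvil drifts toward the opposite heading.
    heading = math.radians((wind_dir_deg + 180.0) % 360.0)
    ux, uy = math.sin(heading), math.cos(heading)  # downwind unit vector
    px, py = -uy, ux                               # cross-axis unit vector

    def to_deg(x_nm, y_nm):
        # Convert nautical-mile offsets from the site to lon/lat degrees.
        return (lon + x_nm / (60.0 * math.cos(math.radians(lat))),
                lat + y_nm / 60.0)

    return [
        to_deg(px * half_width_nm, py * half_width_nm),
        to_deg(px * half_width_nm + ux * length_nm,
               py * half_width_nm + uy * length_nm),
        to_deg(-px * half_width_nm + ux * length_nm,
               -py * half_width_nm + uy * length_nm),
        to_deg(-px * half_width_nm, -py * half_width_nm),
    ]
```

    The resulting ring of lon/lat vertices is the kind of geometry a shapefile polygon record stores, so a shapefile writer library could output it directly.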

  18. Dimensional Methods: Dimensions, Units and the Principle of Dimensional Homogeneity. Physical Processes in Terrestrial and Aquatic Ecosystems, Applied Mathematics.

    ERIC Educational Resources Information Center

    Fletcher, R. Ian

    These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. The module is concerned with conventional techniques such as concepts of measurement,…
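
    The module's central principle, dimensional homogeneity (every term of a physical equation must carry the same dimensions), can be checked mechanically by representing each quantity as a vector of base-dimension exponents. A toy sketch, not taken from the module itself:

```python
# Represent dimensions as (M, L, T) exponent tuples; an equation is
# dimensionally homogeneous iff every term reduces to the same tuple.
def dims_mul(a, b):
    """Multiply two quantities: add their dimension exponents."""
    return tuple(x + y for x, y in zip(a, b))

def dims_pow(a, n):
    """Raise a quantity to a power: scale its dimension exponents."""
    return tuple(x * n for x in a)

def homogeneous(*terms):
    """True iff all terms share the same dimensions."""
    return all(t == terms[0] for t in terms)

# Example: kinetic energy (1/2) m v^2 vs. potential energy m g h.
M, L, T = (1, 0, 0), (0, 1, 0), (0, 0, 1)
velocity = dims_mul(L, dims_pow(T, -1))       # L T^-1
accel = dims_mul(L, dims_pow(T, -2))          # L T^-2
kinetic = dims_mul(M, dims_pow(velocity, 2))  # M L^2 T^-2
potential = dims_mul(dims_mul(M, accel), L)   # M L^2 T^-2
```

    Both energy terms reduce to M L^2 T^-2, so the equation passes the check; mixing in a bare velocity term would fail it.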

  19. Degradation Mechanism of Cyanobacterial Toxin Cylindrospermopsin by Hydroxyl Radicals in Homogeneous UV/H2O2 Process

    EPA Science Inventory

    The degradation of cylindrospermopsin (CYN), a widely distributed and highly toxic cyanobacterial toxin (cyanotoxin), remains poorly elucidated. In this study, the mechanism of CYN destruction by UV-254 nm/H2O2 advanced oxidation process (AOP) was investigated by mass spectrometr...

  20. A trapped magnetic field of 3 T in homogeneous, bulk MgB2 superconductors fabricated by a modified precursor infiltration and growth process

    NASA Astrophysics Data System (ADS)

    Bhagurkar, A. G.; Yamamoto, A.; Anguilano, L.; Dennis, A. R.; Durrell, J. H.; Babu, N. Hari; Cardwell, D. A.

    2016-03-01

    The wetting of boron with liquid magnesium is a critical factor in the synthesis of MgB2 bulk superconductors by the infiltration and growth (IG) process. Poor wetting characteristics can therefore potentially result in non-uniform infiltration, the formation of defects in the final sample structure, and poor structural homogeneity throughout the bulk material. Here we report the fabrication of near-net-shaped MgB2 bulk superconductors by a modified precursor infiltration and growth (MPIG) technique. A homogeneous bulk microstructure has subsequently been achieved via the uniform infiltration of liquid Mg by enriching the green precursor pellet with pre-reacted MgB2 powder as a wetting enhancer, leading to relatively little variation in superconducting properties across the entire bulk sample. Almost identical values of trapped magnetic field of 2.12 T have been measured at 5 K at both the top and bottom surfaces of a sample fabricated by the MPIG process, confirming the uniformity of the bulk microstructure. A maximum trapped field of 3 T has been measured at 5 K at the centre of a stack of two bulk MgB2 samples fabricated using this technique. A steady rise in trapped field was observed for this material with decreasing temperature down to 5 K, without the occurrence of flux avalanches and with a relatively low field decay rate (1.5%/d). These properties are attributed to the presence of a fine distribution of residual Mg within the bulk microstructure generated by the MPIG processing technique.

  1. On the Importance of Processing Conditions for the Nutritional Characteristics of Homogenized Composite Meals Intended for Infants.

    PubMed

    Östman, Elin; Forslund, Anna; Tareke, Eden; Björck, Inger

    2016-01-01

    The nutritional quality of infant food is an important consideration in the effort to prevent a further increase in the rate of childhood obesity. We hypothesized that the canning of composite infant meals would lead to elevated contents of carboxymethyl-lysine (CML) and favor high glycemic and insulinemic responses compared with milder heat treatment conditions. We have compared composite infant pasta Bolognese meals that were either conventionally canned (CANPBol), or prepared by microwave cooking (MWPBol). A meal where the pasta and Bolognese sauce were separate during microwave cooking (MWP_CANBol) was also included. The infant meals were tested at breakfast in healthy adults using white wheat bread (WWB) as reference. A standardized lunch meal was served at 240 min and blood was collected from fasting to 360 min after breakfast. The 2-h glucose response (iAUC) was lower following the test meals than with WWB. The insulin response was lower after the MWP_CANBol (-47%, p = 0.0000) but markedly higher after CANPBol (+40%, p = 0.0019), compared with WWB. A combined measure of the glucose and insulin responses (ISIcomposite) revealed that MWP_CANBol resulted in 94% better insulin sensitivity than CANPBol. Additionally, the separate processing of the meal components in MWP_CANBol resulted in 39% lower CML levels than the CANPBol. It was therefore concluded that intake of commercially canned composite infant meals leads to reduced postprandial insulin sensitivity and increased exposure to oxidative stress promoting agents. PMID:27271662

  2. On the Importance of Processing Conditions for the Nutritional Characteristics of Homogenized Composite Meals Intended for Infants

    PubMed Central

    Östman, Elin; Forslund, Anna; Tareke, Eden; Björck, Inger

    2016-01-01

    The nutritional quality of infant food is an important consideration in the effort to prevent a further increase in the rate of childhood obesity. We hypothesized that the canning of composite infant meals would lead to elevated contents of carboxymethyl-lysine (CML) and favor high glycemic and insulinemic responses compared with milder heat treatment conditions. We have compared composite infant pasta Bolognese meals that were either conventionally canned (CANPBol), or prepared by microwave cooking (MWPBol). A meal where the pasta and Bolognese sauce were separate during microwave cooking (MWP_CANBol) was also included. The infant meals were tested at breakfast in healthy adults using white wheat bread (WWB) as reference. A standardized lunch meal was served at 240 min and blood was collected from fasting to 360 min after breakfast. The 2-h glucose response (iAUC) was lower following the test meals than with WWB. The insulin response was lower after the MWP_CANBol (−47%, p = 0.0000) but markedly higher after CANPBol (+40%, p = 0.0019), compared with WWB. A combined measure of the glucose and insulin responses (ISIcomposite) revealed that MWP_CANBol resulted in 94% better insulin sensitivity than CANPBol. Additionally, the separate processing of the meal components in MWP_CANBol resulted in 39% lower CML levels than the CANPBol. It was therefore concluded that intake of commercially canned composite infant meals leads to reduced postprandial insulin sensitivity and increased exposure to oxidative stress promoting agents. PMID:27271662
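
    The abstract reports a combined glucose-insulin measure, ISIcomposite, without stating its formula. A common choice for such a composite is the Matsuda-style index, sketched here under that assumption; the formula attribution and the input values are illustrative, not taken from the study:

```python
import math

def isi_composite(g_fast, i_fast, g_mean, i_mean):
    """Matsuda-style composite insulin sensitivity index:
    10000 / sqrt(G_fasting * I_fasting * G_mean * I_mean),
    with glucose in mg/dL and insulin in uU/mL. The study names
    'ISIcomposite' but does not give its formula; this classic
    form is an assumption for illustration."""
    return 10000.0 / math.sqrt(g_fast * i_fast * g_mean * i_mean)
```

    Higher values indicate better insulin sensitivity, which is consistent with the abstract's reading that the meal with the lowest insulin response scored best.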

  3. Is cryopreservation a homogeneous process? Ultrastructure and motility of untreated, prefreezing, and postthawed spermatozoa of Diplodus puntazzo (Cetti).

    PubMed

    Taddei, A R; Barbato, F; Abelli, L; Canese, S; Moretti, F; Rana, K J; Fausto, A M; Mazzini, M

    2001-06-01

    This study subdivides the cryopreservation procedure for Diplodus puntazzo spermatozoa into three key phases, fresh, prefreezing (samples equilibrated in cryosolutions), and postthawed stages, and examines the ultrastructural anomalies and motility profiles of spermatozoa in each stage, with different cryodiluents. Two simple cryosolutions were evaluated: 0.17 M sodium chloride containing a final concentration of 15% dimethyl sulfoxide (Me(2)SO) (cryosolution A) and 0.1 M sodium citrate containing a final concentration of 10% Me(2)SO (cryosolution B). Ultrastructural anomalies of the plasmatic and nuclear membranes of the sperm head were common and the severity of the cryoinjury differed significantly between the pre- and the postfreezing phases and between the two cryosolutions. In spermatozoa diluted with cryosolution A, during the prefreezing phase, the plasmalemma of 61% of the cells was absent or damaged compared with 24% in the fresh sample (P < 0.001). In spermatozoa diluted with cryosolution B, there was a pronounced increase in the number of cells lacking the head plasmatic membrane from the prefreezing to the postthawed stages (from 32 to 52%, P < 0.01). In both cryosolutions, damages to nuclear membrane were significantly higher after freezing (cryosolution A: 8 to 23%, P < 0.01; cryosolution B: 5 to 38%, P < 0.001). With cryosolution A, the after-activation motility profile confirmed a consistent drop from fresh at the prefreezing stage, whereas freezing and thawing did not affect the motility much further and 50% of the cells were immotile by 60-90 s after activation. With cryosolution B, only the postthawing stage showed a sharp drop of motility profile. This study suggests that the different phases of the cryoprocess should be investigated to better understand the process of sperm damage. PMID:11748933

  4. Creating Sub-50 nm Nanofluidic Junctions in a PDMS Microfluidic Chip via Self-Assembly Process of Colloidal Particles.

    PubMed

    Wei, Xi; Syed, Abeer; Mao, Pan; Han, Jongyoon; Song, Yong-Ak

    2016-01-01

    Polydimethylsiloxane (PDMS) is the prevailing building material to make microfluidic devices due to its ease of molding and bonding as well as its transparency. Due to the softness of the PDMS material, however, it is challenging to use PDMS for building nanochannels. The channels tend to collapse easily during plasma bonding. In this paper, we present an evaporation-driven self-assembly method of silica colloidal nanoparticles to create nanofluidic junctions with sub-50 nm pores between two microchannels. The pore size as well as the surface charge of the nanofluidic junction is tunable simply by changing the colloidal silica bead size and surface functionalization outside of the assembled microfluidic device in a vial before the self-assembly process. Using the self-assembly of nanoparticles with a bead size of 300 nm, 500 nm, and 900 nm, it was possible to fabricate a porous membrane with a pore size of ~45 nm, ~75 nm and ~135 nm, respectively. Under electrical potential, this nanoporous membrane initiated ion concentration polarization (ICP) acting as a cation-selective membrane to concentrate DNA by ~1,700 times within 15 min. This non-lithographic nanofabrication process opens up a new opportunity to build a tunable nanofluidic junction for the study of nanoscale transport processes of ions and molecules inside a PDMS microfluidic chip. PMID:27023724
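
    A small consistency check on the reported pore sizes: ~45, ~75 and ~135 nm are each close to 15% of the bead diameter, which is what simple sphere-packing geometry predicts for the circle inscribed between three touching beads, (2/sqrt(3) - 1) * d ~ 0.155 d. The geometric model below is our reading of that pattern, not something stated by the authors:

```python
import math

def pore_throat_diameter(bead_diameter_nm):
    """Inscribed-circle diameter of the triangular pore between three
    touching spheres of equal diameter d: (2/sqrt(3) - 1) * d ~ 0.155 d.
    A geometric estimate, offered only as a plausibility check against
    the reported ~45/~75/~135 nm pores for 300/500/900 nm beads."""
    return (2.0 / math.sqrt(3.0) - 1.0) * bead_diameter_nm
```

    The estimate lands within a few nanometres of all three reported pore sizes, supporting the paper's point that pore size is tunable simply through bead diameter.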

  5. ABA Southern Region Burn disaster plan: the process of creating and experience with the ABA southern region burn disaster plan.

    PubMed

    Kearns, Randy D; Cairns, Bruce A; Hickerson, William L; Holmes, James H

    2014-01-01

    The Southern Region of the American Burn Association began to craft a regional plan to address a surge of burn-injured patients after a mass casualty event in 2004. Published in 2006, this plan has been tested through modeling, exercise, and actual events. This article focuses on the process of how the plan was created, how it was tested, and how it interfaces with other ongoing efforts on preparedness. One key to success regarding how people respond to a disaster can be traced to preexisting relationships and collaborations. These activities would include training or working together and building trust long before the crisis. Knowing who you can call and rely on when you need help, within the context of your plan, can be pivotal in successfully managing a disaster. This article describes how a coalition of burn center leaders came together. Their ongoing personal association has facilitated the development of planning activities and has kept the process dynamic. This article also includes several of the building blocks for developing a plan from creation to composition, implementation, and testing. The plan discussed here is an example of linking leadership, relationships, process, and documentation together. On the basis of these experiences, the authors believe these elements are present in other regions. The intent of this work is to share an experience and to offer it as a guide to aid others in their regional burn disaster planning efforts. PMID:23666386

  6. Creating a high-reliability health care system: improving performance on core processes of care at Johns Hopkins Medicine.

    PubMed

    Pronovost, Peter J; Armstrong, C Michael; Demski, Renee; Callender, Tiffany; Winner, Laura; Miller, Marlene R; Austin, J Matthew; Berenholtz, Sean M; Yang, Ting; Peterson, Ronald R; Reitz, Judy A; Bennett, Richard G; Broccolino, Victor A; Davis, Richard O; Gragnolati, Brian A; Green, Gene E; Rothman, Paul B

    2015-02-01

    In this article, the authors describe an initiative that established an infrastructure to manage quality and safety efforts throughout a complex health care system and that improved performance on core measures for acute myocardial infarction, heart failure, pneumonia, surgical care, and children's asthma. The Johns Hopkins Medicine Board of Trustees created a governance structure to establish health care system-wide oversight and hospital accountability for quality and safety efforts throughout Johns Hopkins Medicine. The Armstrong Institute for Patient Safety and Quality was formed; institute leaders used a conceptual model nested in a fractal infrastructure to implement this initiative to improve performance at two academic medical centers and three community hospitals, starting in March 2012. The initiative aimed to achieve ≥ 96% compliance on seven inpatient process-of-care core measures and meet the requirements for the Delmarva Foundation and Joint Commission awards. The primary outcome measure was the percentage of patients at each hospital who received the recommended process of care. The authors compared health system and hospital performance before (2011) and after (2012, 2013) the initiative. The health system achieved ≥ 96% compliance on six of the seven targeted measures by 2013. Of the five hospitals, four received the Delmarva Foundation award and two received The Joint Commission award in 2013. The authors argue that, to improve quality and safety, health care systems should establish a system-wide governance structure and accountability process. They also should define and communicate goals and measures and build an infrastructure to support peer learning. PMID:25517699

  7. The Denali EarthScope Education Partnership: Creating Opportunities for Learning About Solid Earth Processes in Alaska and Beyond.

    NASA Astrophysics Data System (ADS)

    Roush, J. J.; Hansen, R. A.

    2003-12-01

    The Geophysical Institute of the University of Alaska Fairbanks, in partnership with Denali National Park and Preserve, has begun an education outreach program that will create learning opportunities in solid earth geophysics for a wide sector of the public. We will capitalize upon a unique coincidence of heightened public interest in earthquakes (due to the M 7.9 Denali Fault event of Nov. 3rd, 2002), the startup of the EarthScope experiment, and the construction of the Denali Science & Learning Center, a premiere facility for science education located just 43 miles from the epicenter of the Denali Fault earthquake. Real-time data and current research results from EarthScope installations and science projects in Alaska will be used to engage students and teachers, national park visitors, and the general public in a discovery process that will enhance public understanding of tectonics, seismicity and volcanism along the boundary between the Pacific and North American plates. Activities will take place in five program areas, which are: 1) museum displays and exhibits, 2) outreach via print publications and electronic media, 3) curriculum development to enhance K-12 earth science education, 4) teacher training to develop earth science expertise among K-12 educators, and 5) interaction between scientists and the public. In order to engage the over 1 million annual visitors to Denali, as well as people throughout Alaska, project activities will correspond with the opening of the Denali Science and Learning Center in 2004. An electronic interactive kiosk is being constructed to provide public access to real-time data from seismic and geodetic monitoring networks in Alaska, as well as cutting edge visualizations of solid earth processes. A series of print publications and a website providing access to real-time seismic and geodetic data will be developed for park visitors and the general public, highlighting EarthScope science in Alaska. A suite of curriculum modules

  8. Homogeneous processes of atmospheric interest

    NASA Technical Reports Server (NTRS)

    Rossi, M. J.; Barker, J. R.; Golden, D. M.

    1983-01-01

    Upper atmospheric research programs in the department of chemical kinetics are reported. Topics discussed include: (1) third-order rate constants of atmospheric importance; (2) a computational study of the HO2 + HO2 and DO2 + DO2 reactions; (3) measurement and estimation of rate constants for modeling reactive systems; (4) kinetics and thermodynamics of ion-molecule association reactions; (5) entropy barriers in ion-molecule reactions; (6) reaction rate constant for OH + HOONO2 yields products over the temperature range 246 to 324 K; (7) very low-pressure photolysis of tert-butyl nitrite at 248 nm; (8) summary of preliminary data for the photolysis of ClONO2 and N2O5 at 285 nm; and (9) heterogeneous reaction of N2O5 and H2O.

  9. Homogeneity and elemental distribution in self-assembled bimetallic Pd-Pt aerogels prepared by a spontaneous one-step gelation process.

    PubMed

    Oezaslan, M; Liu, W; Nachtegaal, M; Frenkel, A I; Rutkowski, B; Werheid, M; Herrmann, A-K; Laugier-Bonnaud, C; Yilmaz, H-C; Gaponik, N; Czyrska-Filemonowicz, A; Eychmüller, A; Schmidt, T J

    2016-07-27

    Multi-metallic aerogels have recently emerged as a novel and promising class of unsupported electrocatalyst materials due to their high catalytic activity and improved durability for various electrochemical reactions. Aerogels can be prepared by a spontaneous one-step gelation process, where the chemical co-reduction of metal precursors and the prompt formation of nanochain-containing hydrogels, as a preliminary stage for the preparation of aerogels, take place. However, detailed knowledge about the homogeneity and chemical distribution of these three-dimensional Pd-Pt aerogels at the nano-scale as well as at the macro-scale is still lacking. Therefore, we used a combination of spectroscopic and microscopic techniques to obtain better insight into the structure and elemental distribution of the various Pd-rich Pd-Pt aerogels prepared by the spontaneous one-step gelation process. Synchrotron-based extended X-ray absorption fine structure (EXAFS) spectroscopy and high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) in combination with energy-dispersive X-ray spectroscopy (EDX) were employed in this work to uncover the structural architecture and chemical composition of the various Pd-rich Pd-Pt aerogels over a broad length range. The Pd80Pt20, Pd60Pt40 and Pd50Pt50 aerogels showed heterogeneity in the chemical distribution of the Pt and Pd atoms inside the macroscopic nanochain-network. The features of mono-metallic clusters were not detected by EXAFS or STEM-EDX, indicating alloyed nanoparticles. However, the local chemical composition of the Pd-Pt alloys strongly varied along the nanochains and thus within a single aerogel. To determine the electrochemically active surface area (ECSA) of the Pd-Pt aerogels for application in electrocatalysis, we used the electrochemical CO stripping method. 
Due to their high porosity and extended network structure, the resulting values of the ECSA for the Pd-Pt aerogels were higher than that for

  10. High pressure homogenization processing, thermal treatment and milk matrix affect in vitro bioaccessibility of phenolics in apple, grape and orange juice to different extents.

    PubMed

    He, Zhiyong; Tao, Yadan; Zeng, Maomao; Zhang, Shuang; Tao, Guanjun; Qin, Fang; Chen, Jie

    2016-06-01

    The effects of high pressure homogenization processing (HPHP), thermal treatment (TT) and milk matrix (soy, skimmed and whole milk) on the phenolic bioaccessibility and the ABTS scavenging activity of apple, grape and orange juice (AJ, GJ and OJ) were investigated. HPHP and soy milk diminished AJ's total phenolic bioaccessibility by 29.3% and 26.3%, respectively, whereas TT and bovine milk hardly affected it. HPHP had little effect on GJ's and OJ's total phenolic bioaccessibility, while TT enhanced them by 27.3-33.9% and 19.0-29.2%, respectively, and the milk matrix increased them by 26.6-31.1% and 13.3-43.4%, respectively. Furthermore, TT (80 °C/30 min) and TT (90 °C/30 s) had similar influences on GJ's and OJ's phenolic bioaccessibility. Skimmed milk showed a better enhancing effect on OJ's total phenolic bioaccessibility than soy and whole milk, but had a similar effect on GJ's as whole milk. These results contribute to promoting the health benefits of fruit juices by optimizing the processing and formulas in the food industry. PMID:26830567

  11. Thermomechanical process optimization of U-10wt% Mo – Part 2: The effect of homogenization on the mechanical properties and microstructure

    SciTech Connect

    Joshi, Vineet V.; Nyberg, Eric A.; Lavender, Curt A.; Paxton, Dean M.; Burkes, Douglas E.

    2015-07-09

    Low-enriched uranium alloyed with 10 wt% molybdenum (U-10Mo) is currently being investigated as an alternative fuel for the highly enriched uranium used in several of the United States’ high performance research reactors. Development of the methods to fabricate the U-10Mo fuel plates is currently underway and requires fundamental understanding of the mechanical properties at the expected processing temperatures. In the first part of this series, it was determined that the as-cast U-10Mo had a dendritic microstructure with chemical inhomogeneity and underwent eutectoid transformation during hot compression testing. In the present (second) part of the work, the as-cast samples were heat treated at several temperatures and times to homogenize the Mo content. Like the previous as-cast material, the “homogenized” materials were then tested under compression between 500 and 800°C. The as-cast samples and those treated at 800°C for 24 hours had grain sizes of 25-30 μm, whereas those treated at 1000°C for 16 hours had grain sizes around 250 μm before testing. Upon compression testing, it was determined that the heat treatment had effects on the mechanical properties and the precipitation of the lamellar phase at sub-eutectoid temperatures.

  12. Is the Universe homogeneous?

    PubMed

    Maartens, Roy

    2011-12-28

    The standard model of cosmology is based on the existence of homogeneous surfaces as the background arena for structure formation. Homogeneity underpins both general relativistic and modified gravity models and is central to the way in which we interpret observations of the cosmic microwave background (CMB) and the galaxy distribution. However, homogeneity cannot be directly observed in the galaxy distribution or CMB, even with perfect observations, since we observe on the past light cone and not on spatial surfaces. We can directly observe and test for isotropy, but to link this to homogeneity we need to assume the Copernican principle (CP). First, we discuss the link between isotropic observations on the past light cone and isotropic space-time geometry: what observations do we need to be isotropic in order to deduce space-time isotropy? Second, we discuss what we can say with the Copernican assumption. The most powerful result is based on the CMB: the vanishing of the dipole, quadrupole and octupole of the CMB is sufficient to impose homogeneity. Real observations lead to near-isotropy on large scales; does this lead to near-homogeneity? There are important partial results, and we discuss why this remains a difficult open question. Thus, we are currently unable to prove homogeneity of the Universe on large scales, even with the CP. However, we can use observations of the cosmic microwave background, galaxies and clusters to test homogeneity itself. PMID:22084298

  13. Creating Poetry.

    ERIC Educational Resources Information Center

    Drury, John

    Encouraging exploration and practice, this book offers hundreds of exercises and numerous tips covering every step involved in creating poetry. Each chapter is a self-contained unit offering an overview of material in the chapter, a definition of terms, and poetry examples from well-known authors designed to supplement the numerous exercises.…

  14. Creating Community

    PubMed Central

    Budin, Wendy C.

    2009-01-01

    In this column, the editor of The Journal of Perinatal Education describes ways that Lamaze International is helping to create a community for those who share a common interest in promoting, supporting, and protecting natural, safe, and healthy childbirth. The editor also describes the contents of this issue, which offer a broad range of resources, research, and inspiration for childbirth educators in their efforts to promote normal birth. PMID:19936112

  15. Spatial homogenization methods for pin-by-pin neutron transport calculations

    NASA Astrophysics Data System (ADS)

    Kozlowski, Tomasz

    For practical reactor core applications, low-order transport approximations such as SP3 have been shown to provide sufficient accuracy for both static and transient calculations with considerably less computational expense than the discrete ordinates or full spherical harmonics methods. These methods have been applied in several core simulators where homogenization was performed at the level of the pin cell. One of the principal problems has been to recover the error introduced by pin-cell homogenization. Two basic approaches to treating pin-cell homogenization error have been proposed: Superhomogenization (SPH) factors and Pin-Cell Discontinuity Factors (PDF). These methods are based on the well-established Equivalence Theory and Generalized Equivalence Theory to generate appropriate group constants. They are able to treat all sources of error together, allowing even few-group diffusion with one mesh per cell to reproduce the reference solution. A detailed investigation and consistent comparison of both homogenization techniques showed the potential of the PDF approach to improve the accuracy of core calculations, but also revealed its limitations. In principle, the method is applicable only for the boundary conditions at which it was created, i.e. for the boundary conditions considered during the homogenization process, normally zero current. Therefore, there exists a need to improve this method, making it more general and environment independent. The goal of the proposed general homogenization technique is to create a function that is able to correctly predict the appropriate correction factor with only homogeneous information available, i.e. a function based on the heterogeneous solution that could approximate PDFs using the homogeneous solution. It has been shown that the PDF can be well approximated by a least-squares polynomial fit of the non-dimensional heterogeneous solution and later used for PDF prediction using the homogeneous solution. This shows promise for PDF prediction for off
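The least-squares polynomial fit described above can be sketched in a few lines. The data points, polynomial degree, and variable names below are hypothetical stand-ins for illustration, not values from the thesis:

```python
import numpy as np

# Hypothetical training data: non-dimensional heterogeneous flux shape at
# pin-cell interfaces (x_het) and the corresponding reference PDFs (pdf_ref).
x_het = np.array([0.82, 0.88, 0.94, 1.00, 1.06, 1.12, 1.18])
pdf_ref = np.array([1.09, 1.06, 1.03, 1.00, 0.97, 0.95, 0.93])

# Least-squares polynomial fit (degree 2) built from the heterogeneous solution.
coeffs = np.polyfit(x_het, pdf_ref, deg=2)

# In a core calculation only the homogeneous solution is available; its
# non-dimensional interface flux feeds the fitted polynomial to predict the PDF.
x_hom = 1.04
pdf_predicted = np.polyval(coeffs, x_hom)
```

The key design point is that the fit is constructed offline from the heterogeneous reference, but evaluated online with only homogeneous information.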

  16. Creating bulk nanocrystalline metal.

    SciTech Connect

    Fredenburg, D. Anthony; Saldana, Christopher J.; Gill, David D.; Hall, Aaron Christopher; Roemer, Timothy John; Vogler, Tracy John; Yang, Pin

    2008-10-01

    Nanocrystalline and nanostructured materials offer unique microstructure-dependent properties that are superior to coarse-grained materials. These materials have been shown to have very high hardness, strength, and wear resistance. However, most current methods of producing nanostructured materials in weapons-relevant materials create powdered metal that must be consolidated into bulk form to be useful. Conventional consolidation methods are not appropriate due to the need to maintain the nanocrystalline structure. This research investigated new ways of creating nanocrystalline material, new methods of consolidating nanocrystalline material, and an analysis of these different methods of creation and consolidation to evaluate their applicability to mesoscale weapons applications where part features are often under 100 μm wide and the material's microstructure must be very small to give homogeneous properties across the feature.

  17. A study of the process of using Pro/ENGINEER geometry models to create finite element models

    SciTech Connect

    Kistler, B.L.

    1997-02-01

    Methods for building Pro/ENGINEER models which allowed integration with structural and thermal mesh generation and analyses software without recreating geometry were evaluated. This study was not intended to be an in-depth study of the mechanics of Pro/ENGINEER or of mesh generation or analysis software, but instead was a first cut attempt to provide recommendations for Sandia personnel which would yield useful analytical models in less time than an analyst would require to create a separate model. The study evaluated a wide variety of geometries built in Pro/ENGINEER and provided general recommendations for designers, drafters, and analysts.

  18. Phase-shifting of correlation fringes created by image processing as an alternative to improve digital shearography

    NASA Astrophysics Data System (ADS)

    Braga, Roberto A.; González-Peña, Rolando J.; Marcon, Marlon; Magalhães, Ricardo R.; Paiva-Almeida, Thiago; Santos, Igor V. A.; Martins, Moisés

    2016-12-01

    The adoption of digital speckle pattern shearing interferometry, or speckle shearography, is well known in many areas where one needs to measure micro-displacements in-plane and out of plane in biological and non-biological objects; it is based on the Michelson interferometer, with a piezoelectric transducer (PZT) used to phase-shift the fringes and thereby improve the quality of the final image. The creation of the shifted images using a PZT, despite its widespread use, has some drawbacks and limitations, such as the cost of the apparatus, the difficulty of applying exactly the same mirror displacement repeatedly, and the inability to apply the phase-shift in dynamic object measurement. The aim of this work was to create phase-shifted images digitally, avoiding the mechanical adjustments of the PZT, and to test them with the digital shearography method. The methodology was tested using a well-known object, an aluminium cantilever beam under deformation. The results documented the ability to create the deformation map and curves with reliability and sensitivity, reducing the cost and improving the robustness and accessibility of digital speckle pattern shearing interferometry.
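The digital phase-shifting idea can be illustrated with the classic four-step algorithm; the fringe model and deformation phase map below are synthetic assumptions for illustration, not the authors' data or exact method:

```python
import numpy as np

# Synthetic correlation fringes: I_n = A + B*cos(phi + n*pi/2).  The four
# phase-shifted frames are created numerically, standing in for the images
# a PZT would otherwise produce by stepping the reference mirror.
h, w = 64, 64
ramp = np.linspace(-np.pi, np.pi, w, endpoint=False)
phi_true = np.tile(ramp, (h, 1))   # hypothetical deformation phase map
A, B = 0.5, 0.4                    # background intensity and fringe modulation

I1, I2, I3, I4 = (A + B * np.cos(phi_true + n * np.pi / 2) for n in range(4))

# Classic four-step recovery: since I4 - I2 = 2B*sin(phi) and
# I1 - I3 = 2B*cos(phi), the wrapped phase follows from arctan2.
phi_rec = np.arctan2(I4 - I2, I1 - I3)
```

Because the shifts are applied numerically rather than mechanically, the same recovery formula works without any PZT calibration step.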

  19. Ecological and evolutionary consequences of biotic homogenization.

    PubMed

    Olden, Julian D; Leroy Poff, N; Douglas, Marlis R; Douglas, Michael E; Fausch, Kurt D

    2004-01-01

    Biotic homogenization, the gradual replacement of native biotas by locally expanding non-natives, is a global process that diminishes floral and faunal distinctions among regions. Although patterns of homogenization have been well studied, their specific ecological and evolutionary consequences remain unexplored. We argue that our current perspective on biotic homogenization should be expanded beyond a simple recognition of species diversity loss, towards a synthesis of higher order effects. Here, we explore three distinct forms of homogenization (genetic, taxonomic and functional), and discuss their immediate and future impacts on ecological and evolutionary processes. Our goal is to initiate future research that investigates the broader conservation implications of homogenization and to promote a proactive style of adaptive management that engages the human component of the anthropogenic blender that is currently mixing the biota on Earth. PMID:16701221

  20. A rapid method for creating drug implants: translating laboratory-based methods into a scalable manufacturing process.

    PubMed

    Wang, Cheng-Kuo; Wang, Wan-Yi; Meyer, Robert F; Liang, Yuling; Winey, Karen I; Siegel, Steven J

    2010-05-01

    Low compliance with medication is the major cause of poor outcomes in schizophrenia treatment. While surgically implantable solvent-cast pellets have been produced to improve outcomes through increased compliance with medication, that process is laborious and time-consuming, inhibiting its broader application (Siegel et al., Eur J Pharm Biopharm 2006;64:287-293). In this study, the previous fabrication process was translated to a continuous and scalable extrusion method. Extrusion processes were modified based on in vitro release studies, drug load consistency examination, and surface morphology analysis using scanning electron microscopy. Afterward, optimized haloperidol implants were implanted into rats for preliminary analysis of biocompatibility. Barrel temperature, screw speed and the resulting processing pressure influenced surface morphology and drug release. The data suggest that fewer surface pores shift the mechanism from bulk to surface PLGA degradation and lengthen the lag period. The results demonstrate that extrusion is a viable process for manufacturing antipsychotic implants. PMID:20225251

  1. The second phase in creating the cardiac center for the next generation: beyond structure to process improvement.

    PubMed

    Woods, J

    2001-01-01

    The third generation cardiac institute will build on the successes of the past in structuring the service line, re-organizing to assimilate specialist interests, and re-positioning to expand cardiac services into cardiovascular services. To meet the challenges of an increasingly competitive marketplace and complex delivery system, the focus for this new model will shift away from improved structures, and toward improved processes. This shift will require a sound methodology for statistically measuring and sustaining process changes related to the optimization of cardiovascular care. In recent years, GE Medical Systems has successfully applied Six Sigma methodologies to enable cardiac centers to control key clinical and market development processes through its DMADV, DMAIC and Change Acceleration processes. Data indicates Six Sigma is having a positive impact within organizations across the United States, and when appropriately implemented, this approach can serve as a solid foundation for building the next generation cardiac institute. PMID:11765624

  2. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    NASA Astrophysics Data System (ADS)

    Bulutsuz, A. G.; Demircioglu, P.; Bogrekci, I.; Durakbasa, M. N.; Katiboglu, A. B.

    2015-03-01

    The interaction between foreign substances placed into the jaw to eliminate tooth loss and the surrounding organic tissue is a highly complex process, involving many biological reactions as well as the biomechanical forces that influence this formation. Osseointegration denotes the direct structural and functional connection between living bone and the surface of a load-bearing artificial implant. Taking into consideration the requirements of implant manufacturing processes, this study investigates surface characterization with high-precision measurement techniques and emphasizes the importance of these processes for the long-term success of dental implants. Detailed surface characterization was performed to identify the dependence of surface properties on the manufacturing technique, using image processing methods together with scanning electron microscopy (SEM) for morphological properties in 3D and a Taylor Hobson stylus profilometer for roughness properties in 2D. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, highlighting the importance of precise 2D and 3D inspection techniques for surface characterization. Image analysis techniques such as the dark-light technique were used to verify the surface measurement results. The computational phase was performed using the image processing toolbox in Matlab, with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful in the determination of surface roughness. The results showed that the number of black pixels in the image increases with increase in surface
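The black-pixel analysis described above can be sketched as follows; the synthetic images, the fixed threshold value, and the function name are assumptions for illustration, not the authors' Matlab implementation:

```python
import numpy as np

# Hypothetical 8-bit grayscale patches standing in for SEM images of implant
# surfaces; the rougher surface scatters more light into dark regions.
rng = np.random.default_rng(0)
smooth = np.clip(rng.normal(200, 10, (128, 128)), 0, 255)
rough = np.clip(rng.normal(140, 60, (128, 128)), 0, 255)

def black_pixel_fraction(img, threshold=100):
    """Binarize with a fixed threshold; return the fraction of black pixels."""
    return float(np.mean(img < threshold))

frac_smooth = black_pixel_fraction(smooth)
frac_rough = black_pixel_fraction(rough)
# Consistent with the abstract: the rougher surface yields more black pixels,
# so the black-pixel count can serve as a simple roughness proxy.
```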

  3. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    SciTech Connect

    Bulutsuz, A. G.; Demircioglu, P. Bogrekci, I.; Durakbasa, M. N.

    2015-03-30

    The interaction between foreign substances placed into the jaw to eliminate tooth loss and the surrounding organic tissue is a highly complex process, involving many biological reactions as well as the biomechanical forces that influence this formation. Osseointegration denotes the direct structural and functional connection between living bone and the surface of a load-bearing artificial implant. Taking into consideration the requirements of implant manufacturing processes, this study investigates surface characterization with high-precision measurement techniques and emphasizes the importance of these processes for the long-term success of dental implants. Detailed surface characterization was performed to identify the dependence of surface properties on the manufacturing technique, using image processing methods together with scanning electron microscopy (SEM) for morphological properties in 3D and a Taylor Hobson stylus profilometer for roughness properties in 2D. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, highlighting the importance of precise 2D and 3D inspection techniques for surface characterization. Image analysis techniques such as the dark-light technique were used to verify the surface measurement results. The computational phase was performed using the image processing toolbox in Matlab, with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful in the determination of surface roughness. The results showed that the number of black pixels in the image increases with increase in surface

  4. Creating Processes Associated with Providing Government Goods and Services Under the Commercial Space Launch Act at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Letchworth, Janet F.

    2011-01-01

    Kennedy Space Center (KSC) has decided to write its agreements under the Commercial Space Launch Act (CSLA) authority to cover a broad range of categories of support that KSC could provide to our commercial partners. Our strategy was to go through the onerous process of getting the agreement in place once, and to allow added specificity and final cost estimates to be documented in a separate Task Order Request (TOR). This paper is written from the implementing engineering team's perspective. It describes how we developed the processes associated with providing Government support to our emerging commercial partners, such as SpaceX, and reports on our success to date.

  5. Homogeneity and Entropy

    NASA Astrophysics Data System (ADS)

    Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.

    1990-11-01

    RESUMEN: We present a methodology for the analysis of homogeneity based on Information Theory, applicable to samples of observational data. ABSTRACT: Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given data sample. Key words: DATA ANALYSIS
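One plausible reading of such an information-theoretic homogeneity measure is the normalized Shannon entropy of a binned sample; this is a hedged sketch of the general idea, with hypothetical data, not the authors' exact methodology:

```python
import numpy as np

def homogeneity_index(sample, bins=10):
    """Normalized Shannon entropy of the binned sample: values near 1.0 mean
    the data fill the bins uniformly (maximally homogeneous); values near 0
    mean the data are concentrated in a few bins."""
    counts, _ = np.histogram(sample, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins; the 0*log(0) terms contribute nothing
    return float(-(p * np.log(p)).sum() / np.log(bins))

rng = np.random.default_rng(1)
uniform_sample = rng.uniform(0, 1, 10000)       # homogeneous
clumped_sample = rng.normal(0.5, 0.05, 10000)   # concentrated

h_uniform = homogeneity_index(uniform_sample)
h_clumped = homogeneity_index(clumped_sample)
```

Dividing by log(bins) makes the index independent of the bin count, so samples binned at different resolutions remain comparable.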

  6. Are Children's Memory Illusions Created Differently from Those of Adults? Evidence from Levels-of-Processing and Divided Attention Paradigms

    ERIC Educational Resources Information Center

    Wimmer, Marina C.; Howe, Mark L.

    2010-01-01

    In two experiments, we investigated the robustness and automaticity of adults' and children's generation of false memories by using a levels-of-processing paradigm (Experiment 1) and a divided attention paradigm (Experiment 2). The first experiment revealed that when information was encoded at a shallow level, true recognition rates decreased for…

  7. The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.

    2010-01-01

    The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.

  8. Strictly homogeneous laterally complete modules

    NASA Astrophysics Data System (ADS)

    Chilin, V. I.; Karimov, J. A.

    2016-03-01

    Let A be a laterally complete commutative regular algebra and X a laterally complete A-module. In this paper we introduce the notions of homogeneous and strictly homogeneous A-modules. It is proved that every homogeneous A-module is a strictly homogeneous A-module if the Boolean algebra of all idempotents in A is multi-σ-finite.

  9. SP CREATE. Creating Sample Plans

    SciTech Connect

    Spears, J.H.; Seebode, L.

    1998-11-10

    The program has been designed to increase the accuracy and reduce the preparation time for completing sampling plans. It consists of four files: 1. Analyte/Combination (AnalCombo): a list of analytes and combinations of analytes that can be requested of the onsite and offsite labs. Whenever a specific combination of analytes or suite names appears on the same line as a code number, this indicates that one sample can be placed in one bottle to be analyzed for these parameters. A code number is assigned to each analyte and combination of analytes. 2. Sampling Plans Database (SPDb): a database that contains all of the analytes and combinations of analytes along with the basic information required for preparing a sample plan, including the following fields: matrix, hold time, preservation, sample volume, container size, whether the bottle caps are taped, and acceptable choices. 3. Sampling Plans Create (SPcreate): a file that looks up information from the Sampling Plans Database and the Job Log File. 4. Job Log File (JLF98): a major database used by Sample Management Services for recording more than 100 fields of information.
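The lookup the program performs might be sketched as follows; the code numbers, field names, and sample values are hypothetical illustrations, not entries from the actual AnalCombo or SPDb databases:

```python
# Hypothetical stand-ins for the AnalCombo code table and the SPDb plan
# database described above, keyed by analyte-combination code number.
anal_combo = {
    101: "VOCs",
    205: "Metals + Mercury",  # one bottle covers this combination of analytes
}

spdb = {
    101: {"matrix": "water", "hold_time_days": 14, "preservation": "HCl, 4 C",
          "sample_volume_mL": 40, "container": "40 mL VOA vial"},
    205: {"matrix": "water", "hold_time_days": 28, "preservation": "HNO3, 4 C",
          "sample_volume_mL": 500, "container": "500 mL poly bottle"},
}

def create_sample_plan(codes):
    """Look up each requested code and emit one sampling-plan row per bottle."""
    return [{"code": c, "analytes": anal_combo[c], **spdb[c]} for c in codes]

plan = create_sample_plan([101, 205])
```

The point of the code-number indirection is that one lookup resolves both which analytes share a bottle and all the bottle-preparation details.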

  10. Homogeneous and inhomogeneous eddies

    SciTech Connect

    Pavia, E.G.

    1994-12-31

    This work deals with mesoscale warm oceanic eddies, i.e., self-contained bodies of water that transport heat, among other things, for several months and over several hundreds of kilometers. This heat transport is believed to play an important role in the atmospheric and oceanic conditions of the region through which it passes. Here the author examines the difference in evolution between eddies modeled as blobs of homogeneous water and eddies in which density varies in the horizontal. Preliminary results suggest that instability is enhanced by inhomogeneities, which would imply that traditional modeling studies, based on homogeneous vortices, have underestimated the rate of heat release from oceanic eddies to their surroundings. The modeling approach is of the simplest form, i.e., a single active layer. Although previous studies have shown the drastic effect on stability brought by two or more dynamically relevant homogeneous layers, the author believes the single-layer eddy model has not been investigated thoroughly.

  11. Restoration of overwash processes creates piping plover (Charadrius melodus) habitat on a barrier island (Assateague Island, Maryland)

    NASA Astrophysics Data System (ADS)

    Schupp, Courtney A.; Winn, Neil T.; Pearl, Tami L.; Kumer, John P.; Carruthers, Tim J. B.; Zimmerman, Carl S.

    2013-01-01

    On Assateague Island, an undeveloped barrier island along Maryland and Virginia, a foredune was constructed to protect the island from the erosion and breaching threat caused by permanent jetties built to maintain Ocean City Inlet. Scientists and engineers integrated expertise in vegetation, wildlife, geomorphology, and coastal engineering to design a habitat restoration project that would be evaluated in terms of coastal processes rather than static features. Development of specific restoration targets, thresholds for intervention, and criteria to evaluate long-term project success were based on biological and geomorphological data and coastal engineering models. A detailed long-term monitoring plan was established to measure project sustainability. The foredune unexpectedly acted as a near-total barrier to both overwash and wind, and the dynamic ecosystem underwent undesirable habitat changes, including conversion of early-succession beach habitat to herbaceous and shrub communities, diminishing the availability of foraging habitat and thereby reducing the productivity of the Federally listed Threatened Charadrius melodus (piping plover). To address these impacts, multiple notches were cut through the constructed foredune. The metric for initial geomorphological success, restoration of at least one overwash event per year across the constructed foredune if overwash occurred elsewhere on the island, was reached. New overwash fans increased island stability by increasing interior island elevation. At every notch, areas of sparse vegetation increased, and the new foraging habitat was utilized by breeding pairs during the 2010 breeding season. However, the metric for long-term biological success, an increase to 37% sparsely vegetated habitat on the North End and an increase in piping plover productivity to 1.25 chicks fledged per breeding pair, has not yet been met. By 2010 there was an overall productivity of 1.2 chicks fledged per breeding pair and a 1.7% decrease in sparsely

  12. Star formation in the filament of S254-S258 OB complex: a cluster in the process of being created

    NASA Astrophysics Data System (ADS)

    Samal, M. R.; Ojha, D. K.; Jose, J.; Zavagno, A.; Takahashi, S.; Neichel, B.; Kim, J. S.; Chauhan, N.; Pandey, A. K.; Zinchenko, I.; Tamura, M.; Ghosh, S. K.

    2015-09-01

    Infrared dark clouds are ideal laboratories for studying the initial processes of high-mass star and star-cluster formation. We investigated the star formation activity of an unexplored filamentary dark cloud (size ~5.7 pc × 1.9 pc), which is itself part of a large filament (~20 pc) located in the S254-S258 OB complex at a distance of 2.5 kpc. Using Multi-band Imaging Photometer (MIPS) Spitzer 24 μm data, we uncovered 49 sources with signal-to-noise ratios greater than 5. We identified 45 sources as candidate young stellar objects (YSOs) of Class I, flat-spectrum, and Class II natures. An additional 17 candidate YSOs (9 Class I and 8 Class II) are also identified using JHK and Wide-field Infrared Survey Explorer (WISE) photometry. We find that the protostar-to-Class II sources ratio (~2) and the protostar fraction (~70%) of the region are high. Comparison of the protostar fraction to other young clusters suggests that star formation in the dark cloud possibly started only 1 Myr ago. Combining the near-infrared photometry of the YSO candidates with theoretical evolutionary models, we infer that most of the candidate YSOs formed in the dark cloud are low-mass (<2 M⊙). We examine the spatial distribution of the YSOs and find that the majority of them are linearly aligned along the highest column density line (N(H2)~1 × 10^22 cm^-2) of the dark cloud along its long axis, at a mean nearest-neighbour separation of ~0.2 pc. Using the observed properties of the YSOs, the physical conditions of the cloud, and a simple cylindrical model, we explore the possible star formation process of this filamentary dark cloud and suggest that gravitational fragmentation within the filament should have played a dominant role in the formation of the YSOs. From the total mass of the YSOs, the gaseous mass associated with the dark cloud, and the surrounding environment, we infer that the region is presently forming stars at an efficiency of ~3% and a rate of ~30 M⊙ Myr^-1, and it may emerge
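The quoted efficiency and rate follow from simple mass ratios; the masses and age below are hypothetical values chosen only to reproduce the ~3% and ~30 solar masses per Myr figures, not numbers taken from the paper:

```python
# Hypothetical inputs (solar masses and Myr) for the two standard ratios.
m_ysos = 30.0   # total stellar mass locked in the candidate YSOs
m_gas = 970.0   # gaseous mass associated with the dark cloud
age_myr = 1.0   # inferred age of the star-forming event

# Star formation efficiency: fraction of the total (stars + gas) mass
# that has been converted into stars so far.
efficiency = m_ysos / (m_ysos + m_gas)

# Star formation rate: stellar mass formed per unit time, in Msun/Myr.
rate = m_ysos / age_myr
```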

  13. Effect of homogenization techniques on reducing the size of microcapsules and the survival of probiotic bacteria therein.

    PubMed

    Ding, W K; Shah, N P

    2009-08-01

    This study investigated 2 different homogenization techniques for reducing the size of calcium alginate beads during the microencapsulation process of 8 probiotic bacteria strains, namely, Lactobacillus rhamnosus, L. salivarius, L. plantarum, L. acidophilus, L. paracasei, Bifidobacterium longum, B. lactis type Bi-04, and B. lactis type Bi-07. The two homogenization techniques used were an Ultra-Turrax benchtop homogenizer and a Microfluidics microfluidizer. Various settings of the homogenization equipment were studied, such as the number of passes, speed (rpm), duration (min), and pressure (psi). The traditional mixing method using a magnetic stirrer was used as a control. The sizes of the microcapsules resulting from each homogenization technique and setting were measured using a light microscope and a stage micrometer. The smallest capsules (measuring 31.2 microm) were created with the microfluidizer using 26 passes at 1200 psi for 40 min. The greatest loss in viability, 3.21 log CFU/mL, was observed when using the Ultra-Turrax benchtop homogenizer at 1300 rpm for 5 min. Overall, both homogenization techniques reduced capsule sizes; however, homogenization settings at high rpm also greatly reduced the viability of the probiotic organisms. PMID:19723206

  14. Microfluidic Generation of Monodisperse, Structurally Homogeneous Alginate Microgels for Cell Encapsulation and 3D Cell Culture.

    PubMed

    Utech, Stefanie; Prodanovic, Radivoje; Mao, Angelo S; Ostafe, Raluca; Mooney, David J; Weitz, David A

    2015-08-01

    Monodisperse alginate microgels (10-50 μm) are created via droplet-based microfluidics by a novel crosslinking procedure. Ionic crosslinking of alginate is induced by release of chelated calcium ions. The process separates droplet formation and gelation reaction enabling excellent control over size and homogeneity under mild reaction conditions. Living mesenchymal stem cells are encapsulated and cultured in the generated 3D microenvironments. PMID:26039892

  15. A model cerium oxide matrix composite reinforced with a homogeneous dispersion of silver particulate - prepared using the glycine-nitrate process

    SciTech Connect

    Weil, K. Scott; Hardy, John S.

    2005-01-31

    Recently a new method of ceramic brazing has been developed. Based on a two-phase liquid composed of silver and copper oxide, brazing is conducted directly in air without the need for an inert cover gas or surface-reactive fluxes. Because the braze displays excellent wetting characteristics on a number of ceramic surfaces, including alumina, various perovskites, zirconia, and ceria, we were interested in investigating whether a metal-reinforced ceramic matrix composite (CMC) could be developed with this material. In the present study, two sets of homogeneously mixed silver/copper oxide/ceria powders were synthesized using a combustion synthesis technique. The powders were compacted and heat treated in air above the liquidus temperature for the chosen Ag-CuO composition. Metallographic analysis indicates that the resulting composite microstructures are extremely uniform with respect to both the size of the metallic reinforcement and its spatial distribution within the ceramic matrix. The size, morphology, and spacing of the metal particulate in the densified composite appear to depend on the original size and structure of the starting combustion-synthesized powders.

  16. HOMOGENEOUS NUCLEAR POWER REACTOR

    DOEpatents

    King, L.D.P.

    1959-09-01

    A homogeneous nuclear power reactor utilizing forced circulation of the liquid fuel is described. The reactor does not require fuel handling outside of the reactor vessel during any normal operation, including complete shutdown to room temperature; the reactor is self-regulating under extreme operating conditions and is controlled by the thermal expansion of the liquid fuel. The liquid fuel utilized is a uranium, phosphoric acid, and water solution which requires no gas exhaust system or independent gas recombining system, thereby eliminating the handling of radiolytic gas.

  17. Heterogeneous nucleation or homogeneous nucleation?

    NASA Astrophysics Data System (ADS)

    Liu, X. Y.

    2000-06-01

    The generic heterogeneous effect of foreign particles on three-dimensional nucleation was examined both theoretically and experimentally. It shows that nucleation observed under normal conditions includes a sequence of progressive heterogeneous processes, characterized by different interfacial correlation functions f(m,x). At low supersaturations, nucleation will be controlled by the process with a small interfacial correlation function f(m,x), which results from a strong interaction and good structural match between the foreign bodies and the crystallizing phase. At high supersaturations, nucleation on foreign particles having a weak interaction and poor structural match with the crystallizing phase (f(m,x)→1) will govern the kinetics. This frequently leads to the false identification of homogeneous nucleation. Genuine homogeneous nucleation, which is the upper limit of heterogeneous nucleation, may not be easily achievable under gravity. To check these results, the prediction is confronted with nucleation experiments on some organic and inorganic crystals. The results are in excellent agreement with the theory.

  18. Creating Happy Memories.

    ERIC Educational Resources Information Center

    Weeks, Denise Jarrett

    2001-01-01

    Some teachers are building and sharing their wisdom and know-how through lesson study, in the process creating memorable learning experiences for students and for each other. This paper describes how lesson study can transform teaching and how schools are implementing lesson study. A sidebar presents questions to consider in lesson study. (SM)

  19. Influence of Gas Flow and Improvement of Homogeneity on the Distribution of Critical Current Density in YBCO Coated Conductor Processed by TFA-MOD Method

    NASA Astrophysics Data System (ADS)

    Shiohara, Kei; Higashikawa, Kohei; Kawaguchi, Teppei; Inoue, Masayoshi; Kiss, Takanobu; Yoshizumi, Masateru; Izumi, Teruo

    Using scanning Hall-probe microscopy, we have investigated the in-plane distribution of critical current density in TFA-MOD processed YBCO coated conductors. We compared the distributions of critical current density for two kinds of coated conductors processed with different directions of gas flow during calcination. As a result, it was found that the direction of the gas flow largely influenced the distribution of critical current density. For example, the maximum value of critical current density was 1.5 times higher than the average for a sample processed with gas flow in the width direction. On the other hand, the distribution of critical current density was relatively uniform for the one processed with gas flow in the axial direction perpendicular to the surface of the conductor. These findings are very important for the optimization of the manufacturing processes for the conductors. Indeed, a very uniform distribution of critical current density has been observed for a coated conductor produced by an optimized process. This demonstrates the high potential of TFA-MOD processed YBCO coated conductors for practical applications.

  20. Homogeneous quantum electrodynamic turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    1992-01-01

    The electromagnetic field equations and Dirac equations for oppositely charged wave functions are numerically time-integrated using a spatial Fourier method. The numerical approach used, a spectral transform technique, is based on a continuum representation of physical space. The coupled classical field equations contain a dimensionless parameter which sets the strength of the nonlinear interaction (as the parameter increases, interaction volume decreases). For a parameter value of unity, highly nonlinear behavior in the time-evolution of an individual wave function, analogous to ideal fluid turbulence, is observed. In the truncated Fourier representation which is numerically implemented here, the quantum turbulence is homogeneous but anisotropic and manifests itself in the nonlinear evolution of equilibrium modal spatial spectra for the probability density of each particle and also for the electromagnetic energy density. The results show that nonlinearly interacting fermionic wave functions quickly approach a multi-mode, dynamic equilibrium state, and that this state can be determined by numerical means.
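    The spatial Fourier (spectral transform) technique described above evaluates spatial derivatives by multiplying each Fourier mode by its wavenumber. As an illustration only (not the authors' code), a minimal pure-Python sketch of a spectral derivative on a periodic grid, using a naive O(n^2) DFT for self-containedness:

    ```python
    import cmath


    def dft(x):
        """Naive discrete Fourier transform (O(n^2); for illustration only)."""
        n = len(x)
        return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
                for k in range(n)]


    def idft(X):
        """Inverse DFT matching dft() above."""
        n = len(X)
        return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
                for t in range(n)]


    def spectral_derivative(x, domain_length):
        """d/dx of a periodic signal: transform, multiply mode m by i*k_m, invert."""
        n = len(x)
        X = dft(x)
        for k in range(n):
            m = k if k <= n // 2 else k - n        # signed wavenumber index
            X[k] *= 2j * cmath.pi * m / domain_length
        if n % 2 == 0:
            X[n // 2] = 0                          # drop the unmatched Nyquist mode
        return [v.real for v in idft(X)]
    ```

    For sin(x) sampled on a 2π-periodic grid, this returns cos(x) to near machine precision, which is the spectral accuracy that motivates such methods for turbulence simulation.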

  1. HOMOGENEOUS NUCLEAR REACTOR

    DOEpatents

    Hammond, R.P.; Busey, H.M.

    1959-02-17

    Nuclear reactors of the homogeneous liquid fuel type are discussed. The reactor is comprised of an elongated closed vessel, vertically oriented, having a critical region at the bottom, a lower chimney structure extending from the critical region vertically upwardly and surrounded by heat exchanger coils, to a baffle region above which is located an upper chimney structure containing a catalyst functioning to recombine radiolytically dissociated moderator gases. In operation the liquid fuel circulates solely by convection from the critical region upwardly through the lower chimney and then downwardly through the heat exchanger to return to the critical region. The gases formed by radiolytic dissociation of the moderator are carried upwardly with the circulating liquid fuel and past the baffle into the region of the upper chimney, where they are recombined by the catalyst and condensed, thence returning through the heat exchanger to the critical region.

  2. Homogeneous quantum electrodynamic turbulence

    SciTech Connect

    Shebalin, J.V.

    1992-10-01

    The electromagnetic field equations and Dirac equations for oppositely charged wave functions are numerically time-integrated using a spatial Fourier method. The numerical approach used, a spectral transform technique, is based on a continuum representation of physical space. The coupled classical field equations contain a dimensionless parameter which sets the strength of the nonlinear interaction (as the parameter increases, interaction volume decreases). For a parameter value of unity, highly nonlinear behavior in the time-evolution of an individual wave function, analogous to ideal fluid turbulence, is observed. In the truncated Fourier representation which is numerically implemented here, the quantum turbulence is homogeneous but anisotropic and manifests itself in the nonlinear evolution of equilibrium modal spatial spectra for the probability density of each particle and also for the electromagnetic energy density. The results show that nonlinearly interacting fermionic wave functions quickly approach a multi-mode, dynamic equilibrium state, and that this state can be determined by numerical means.

  3. Homogeneity study of candidate reference material in fish matrix

    NASA Astrophysics Data System (ADS)

    Ulrich, J. C.; Sarkis, J. E. S.; Hortellani, M. A.

    2015-01-01

    A material is perfectly homogeneous with respect to a given characteristic, or composition, if there is no difference between the values obtained from one part to another. Homogeneity is usually evaluated using analysis of variance (ANOVA). However, the requirement that populations of data to be processed must have a normal distribution and equal variances greatly limits the use of this statistical tool. A more suitable test for assessing the homogeneity of RMs, known as "sufficient homogeneity", was proposed by Fearn and Thompson. In this work, we evaluate the performance of the two statistical treatments for assessing homogeneity of methylmercury (MeHg) in candidate reference material of fish tissue.
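    The ANOVA screen mentioned above compares between-unit and within-unit variability: an F statistic near 1 is consistent with a homogeneous material, while a large F flags unit-to-unit differences. A minimal sketch in plain Python (the bottle data below are hypothetical, not from the study):

    ```python
    def one_way_anova_F(groups):
        """One-way ANOVA F statistic: between-group vs within-group mean squares."""
        k = len(groups)
        n_total = sum(len(g) for g in groups)
        grand_mean = sum(v for g in groups for v in g) / n_total
        means = [sum(g) / len(g) for g in groups]
        ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
        ss_within = sum((v - m) ** 2 for g, m in zip(groups, means) for v in g)
        return (ss_between / (k - 1)) / (ss_within / (n_total - k))


    # Hypothetical MeHg replicates (three bottles, three measurements each):
    bottles = [[5.0, 5.25, 4.75], [5.1, 5.2, 4.8], [4.9, 5.3, 4.85]]
    F = one_way_anova_F(bottles)
    ```

    In practice F is compared against an F-distribution critical value with (k-1, N-k) degrees of freedom; as the abstract notes, the normality and equal-variance assumptions behind this comparison are exactly what limit ANOVA for reference-material work.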

  4. The Leadership Assignment: Creating Change.

    ERIC Educational Resources Information Center

    Calabrese, Raymond L.

    This book provides change-motivated leaders with an understanding of the change process and the tools to drive change. Eight change principles guide change agents in creating and sustaining change: prepare to lead change; knowledge is power; create empowering mental models; overcome resistance to change; lead change; accelerate the change process;…

  5. Shear wave splitting hints at dynamical features of mantle convection: a global study of homogeneously processed source and receiver side upper mantle anisotropy

    NASA Astrophysics Data System (ADS)

    Walpole, J.; Wookey, J. M.; Masters, G.; Kendall, J. M.

    2013-12-01

    The asthenosphere is embroiled in the process of mantle convection. Its viscous properties allow it to flow around sinking slabs and deep cratonic roots as it is displaced by intruding material and dragged around by the moving layer above. As the asthenosphere flows it develops a crystalline fabric, with anisotropic crystals preferentially aligned in the direction of flow. Meanwhile, the lithosphere above deforms as it is squeezed and stretched by underlying tectonic processes, enabling anisotropic fabrics to develop, become fossilised in the rigid rock, and persist over vast spans of geological time. As a shear wave passes through an anisotropic medium it splits into two orthogonally polarised quasi-shear waves that propagate at different velocities (a phenomenon known as shear wave splitting). By analysing the polarisation and delay time of many split waves that have passed through a region it is possible to constrain the anisotropy of the medium in that region. This anisotropy is the key to revealing the deformation history of the deep Earth. In this study we present measurements of shear wave splitting recorded on S, SKS, and SKKS waves from earthquakes recorded at stations from the IRIS DMC catalogue (1976-2010). We used a cluster-analysis phase-picking technique [1] to pick hundreds of thousands of high signal-to-noise waveforms in long-period data. These picks are used to feed the broadband data into an automated processing workflow that recovers shear wave splitting parameters [2,3]. The workflow includes a new method for making source and receiver corrections, whereby the stacked error surfaces are used as input to the correction rather than a single set of parameters; this propagates uncertainty information into the final measurement. Using SKS, SKKS, and source-corrected S, we recover good measurements of anisotropy beneath 1,569 stations. Using receiver-corrected S we recover good measurements of anisotropy beneath 470 events. 
We compare

  6. Homogeneous spaces of Dirac groupoids

    NASA Astrophysics Data System (ADS)

    Jotz Lean, Madeleine

    2016-06-01

    A Poisson structure on a homogeneous space of a Poisson groupoid is homogeneous if the action of the Lie groupoid on the homogeneous space is compatible with the Poisson structures. According to a result of Liu, Weinstein and Xu, Poisson homogeneous spaces of a Poisson groupoid are in correspondence with suitable Dirac structures in the Courant algebroid defined by the Lie bialgebroid of the Poisson groupoid. We show that this correspondence result fits into a more natural context: the one of Dirac groupoids, which are objects generalizing Poisson groupoids and multiplicative closed 2-forms on groupoids.

  7. Homogeneous Catalysis by Transition Metal Compounds.

    ERIC Educational Resources Information Center

    Mawby, Roger

    1988-01-01

    Examines four processes involving homogeneous catalysis which highlight the contrast between the simplicity of the overall reaction and the complexity of the catalytic cycle. Describes how catalysts provide circuitous routes in which all energy barriers are relatively low rather than lowering the activation energy for a single step reaction.…

  8. STEAM STIRRED HOMOGENEOUS NUCLEAR REACTOR

    DOEpatents

    Busey, H.M.

    1958-06-01

    A homogeneous nuclear reactor utilizing a self-circulating liquid fuel is described. The reactor vessel is in the form of a vertically disposed tubular member having the lower end closed by the tube walls and the upper end closed by a removable flanged assembly. A spherical reaction shell is located in the lower end of the vessel and spaced from the inside walls. The reaction shell is perforated on its lower surface and is provided with a bundle of small-diameter tubes extending vertically upward from its top central portion. The reactor vessel is surrounded in the region of the reaction shell by a neutron reflector. The liquid fuel, which may be a solution of enriched uranyl sulfate in ordinary or heavy water, is maintained at a level within the reactor vessel of approximately the top of the tubes. The heat of the reaction created in the critical region within the spherical reaction shell forms steam bubbles which move upwardly through the tubes. The upward movement of these bubbles forces the liquid fuel out of the top of the tubes, from where the fuel passes downwardly in the space between the tubes and the vessel wall, where it is cooled by heat exchangers. The fuel then re-enters the critical region in the reaction shell through the perforations in the bottom. The upper portion of the reactor vessel is provided with baffles to prevent the liquid fuel from splashing into this region, which is also provided with a recombiner apparatus for recombining the radiolytically dissociated moderator vapor, and a control means.

  9. Effects of sample homogenization on solid phase sediment toxicity

    SciTech Connect

    Anderson, B.S.; Hunt, J.W.; Newman, J.W.; Tjeerdema, R.S.; Fairey, W.R.; Stephenson, M.D.; Puckett, H.M.; Taberski, K.M.

    1995-12-31

    Sediment toxicity is typically assessed using homogenized surficial sediment samples. It has been recognized that homogenization alters sediment integrity and may result in changes in chemical bioavailability through oxidation-reduction or other chemical processes. In this study, intact (unhomogenized) sediment cores were taken from a Van Veen grab sampler and tested concurrently with sediment homogenate from the same sample in order to investigate the effect of homogenization on toxicity. Two different solid-phase toxicity test protocols were used for these comparisons. Results of amphipod exposures to samples from San Francisco Bay indicated minimal difference between intact and homogenized samples. Mean amphipod survival in intact cores relative to homogenates was similar at two contaminated sites. Mean survival was 34 and 33% in intact and homogenized samples, respectively, at Castro Cove. Mean survival was 41% and 57%, respectively, in intact and homogenized samples from Islais Creek. Studies using the sea urchin development protocol, modified for testing at the sediment/water interface, indicated considerably more toxicity in intact samples relative to homogenized samples from San Diego Bay. Measures of metal flux into the overlying water demonstrated greater flux of metals from the intact samples. Zinc flux was five times greater, and copper flux was twice as great in some intact samples relative to homogenates. Future experiments will compare flux of metals and organic compounds in intact and homogenized sediments to further evaluate the efficacy of using intact cores for solid phase toxicity assessment.

  10. Integration of a nurse navigator into the triage process for patients with non-small-cell lung cancer: creating systematic improvements in patient care

    PubMed Central

    Zibrik, K.; Laskin, J.; Ho, C.

    2016-01-01

    Nurse navigation is a developing facet of oncology care. The concept of patient navigation was originally created in 1990 at the Harlem Hospital Center in New York City as a strategy to assist vulnerable and socially disadvantaged populations with timely access to breast cancer care. Since the mid-1990s, navigation programs have expanded to include many patient populations that require specialized management and prompt access to diagnostic and clinical resources. Advanced non-small-cell lung cancer is ideally suited for navigation to facilitate efficient assessment in this fragile patient population and to ensure timely results of molecular tests for first-line therapy with appropriately targeted agents. At the BC Cancer Agency, nurse navigator involvement with thoracic oncology triage has been demonstrated to increase the proportion of patients receiving systemic treatment, to shorten the time to delivery of systemic treatment, and to increase the rate of molecular testing and the number of patients with molecular testing results available at time of initial consultation. Insights gained through the start-up process are briefly discussed, and a framework for implementation at other institutions is outlined. PMID:27330366

  11. Creating New Incentives for Risk Identification and Insurance Process for the Electric Utility Industry (initial award through Award Modification 2); Energy & Risk Transfer Assessment (Award Modifications 3 - 6)

    SciTech Connect

    Michael Ebert

    2008-02-28

    This is the final report for the DOE-NETL grant entitled 'Creating New Incentives for Risk Identification & Insurance Processes for the Electric Utility Industry' and later, 'Energy & Risk Transfer Assessment'. It reflects work done on projects from 15 August 2004 to 29 February 2008. Projects were on a variety of topics, including commercial insurance for electrical utilities, the Electrical Reliability Organization, cost recovery by Gulf State electrical utilities after major hurricanes, and review of state energy emergency plans. This Final Technical Report documents and summarizes all work performed during the award period, which in this case is from 15 August 2004 (date of notification of original award) through 29 February 2008. This report presents this information in a comprehensive, integrated fashion that clearly shows a logical and synergistic research trajectory, and is augmented with findings and conclusions drawn from the research as a whole. Four major research projects were undertaken and completed during the 42 month period of activities conducted and funded by the award; these are: (1) Creating New Incentives for Risk Identification and Insurance Process for the Electric Utility Industry (also referred to as the 'commercial insurance' research). Three major deliverables were produced: a pre-conference white paper, a two-day facilitated stakeholders workshop conducted at George Mason University, and a post-workshop report with findings and recommendations. All deliverables from this work are published on the CIP website at http://cipp.gmu.edu/projects/DoE-NETL-2005.php. (2) The New Electric Reliability Organization (ERO): an examination of critical issues associated with governance, standards development and implementation, and jurisdiction (also referred to as the 'ERO study'). 
Four major deliverables were produced: a series of preliminary memoranda for the staff of the Office of Electricity Delivery and Energy Reliability ('OE'), an ERO interview

  12. Thermocouple homogeneity scanning

    NASA Astrophysics Data System (ADS)

    Webster, E.; White, D. R.

    2015-02-01

    The inhomogeneities within a thermocouple influence the measured temperature and contribute the largest component of uncertainty. Currently there is no accepted best practice for measuring the inhomogeneities or for forecasting their effects on real-world measurements. The aim of this paper is to provide guidance on the design and performance assessment of thermocouple inhomogeneity scanners by characterizing the qualitative performance of the various designs reported in the literature, and developing a quantitative measure of scanner resolution. Numerical simulations incorporating Fourier transforms and convolutions are used to gauge the levels of attenuation and distortion present in single- and double-gradient scanners. Single-gradient scanners are found to be far superior to double-gradient scanners, which are unsuitable for quantitative measurements due to their blindness to inhomogeneities at many spatial frequencies and severe attenuation of signals at other frequencies. It is recommended that the standard deviation of the temperature gradient within the scanner be used as a measure of the scanner resolution and spatial bandwidth. Recommendations for the design of scanners are presented, and include advice on the basic design of scanners, the media employed, operating temperature, scan rates, construction of survey probes, data processing, gradient symmetry, and the spatial resolution required for research and calibration applications.
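    The convolution-based simulation approach can be illustrated with a toy model (my sketch, not the paper's code): a scanner's output is modelled as the inhomogeneity profile convolved with a kernel, and a double-gradient kernel, being the difference of two shifted single gradients, is exactly blind to inhomogeneities whose spatial wavelength matches the gradient separation:

    ```python
    import math


    def convolve(signal, kernel):
        """Full discrete convolution in pure Python."""
        n, m = len(signal), len(kernel)
        return [sum(kernel[j] * signal[i - j] for j in range(m) if 0 <= i - j < n)
                for i in range(n + m - 1)]


    # Idealized kernels: a single-gradient scanner differentiates the profile;
    # a double-gradient scanner applies two opposed gradients `sep` samples apart.
    single = [1.0, -1.0]
    sep = 8
    double = [0.0] * (sep + len(single))
    for j, v in enumerate(single):
        double[j] += v          # first gradient
        double[j + sep] -= v    # opposed gradient, shifted by the separation

    # Sinusoidal inhomogeneity whose wavelength equals the gradient separation:
    profile = [math.sin(2 * math.pi * i / sep) for i in range(64)]

    resp_single = convolve(profile, single)   # clearly responds
    resp_double = convolve(profile, double)   # cancels in the interior (blind)
    ```

    Away from the edges the double-gradient response vanishes because the two gradients see identical copies of the periodic profile, which is the frequency-blindness the paper's simulations quantify.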

  13. Homogeneity analysis of precipitation series in Iran

    NASA Astrophysics Data System (ADS)

    Hosseinzadeh Talaee, P.; Kouchakzadeh, Mahdi; Shifteh Some'e, B.

    2014-10-01

    Assessment of the reliability and quality of historical precipitation data is required in the modeling of hydrology and water resource processes and for climate change studies. The homogeneity of the annual and monthly precipitation data sets throughout Iran was tested using the Bayesian, Cumulative Deviations, and von Neumann tests at a significance level of 0.05. The precipitation records from 41 meteorological stations covering the years between 1966 and 2005 were considered. The annual series of Iranian precipitation were found to be homogeneous by applying the Bayesian and Cumulative Deviations tests, while the von Neumann test detected inhomogeneities at seven stations. Almost all the monthly precipitation data sets are homogeneous and considered as "useful." The outputs of the statistical tests for the homogeneity analysis of the precipitation time series had discrepancies in some cases which are related to different sensitivities of the tests to break in the time series. It was found that the von Neumann test is more sensitive than the Bayesian and Cumulative Deviations tests in the determination of inhomogeneity in the precipitation series.
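    The von Neumann test used above is based on the ratio of the mean squared successive difference to the series variance: for a homogeneous (serially independent) series the ratio is near 2, while a trend or break point pulls it below 2. A minimal sketch on synthetic data (not the Iranian precipitation records):

    ```python
    def von_neumann_ratio(series):
        """Von Neumann ratio N: ~2 for homogeneous data, <2 for trends or breaks."""
        n = len(series)
        mean = sum(series) / n
        numerator = sum((series[i + 1] - series[i]) ** 2 for i in range(n - 1))
        denominator = sum((v - mean) ** 2 for v in series)
        return numerator / denominator
    ```

    A steadily trending series gives a ratio well below 2 (successive values are close while total spread is large), whereas a rapidly alternating series gives a ratio above 2; the decision against the tabulated critical value at the 0.05 level is what the study applies station by station.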

  14. (Ultra) High Pressure Homogenization for Continuous High Pressure Sterilization of Pumpable Foods – A Review

    PubMed Central

    Georget, Erika; Miller, Brittany; Callanan, Michael; Heinz, Volker; Mathys, Alexander

    2014-01-01

    Bacterial spores have a strong resistance to both chemical and physical hurdles and create a risk for the food industry, which has been tackled by applying high-intensity thermal treatments to sterilize food. These strong thermal treatments lead to a reduction of the organoleptic and nutritional properties of food, and alternatives are actively being sought. Innovative hurdles offer an alternative to inactivate bacterial spores. In particular, recent technological developments have enabled a new generation of high pressure homogenizers working at pressures up to 400 MPa, thus opening new opportunities for high pressure sterilization of foods. In this short review, we summarize the work conducted on (ultra) high pressure homogenization ((U)HPH) to inactivate endospores in model and food systems. Specific attention is given to process parameters (pressure, inlet, and valve temperatures). This review gathers the current state of the art and underlines the potential of UHPH sterilization of pumpable foods while highlighting the needs for future work. PMID:25988118

  15. The Architecture of a Homogeneous Vector Supercomputer

    NASA Astrophysics Data System (ADS)

    Gustafson, J. L.; Hawkinson, S.; Scott, K.

    A new homogeneous computer architecture combines two fundamental techniques for high-speed computing: parallelism based on the binary n-cube interconnect, and pipelined vector arithmetic. The design makes extensive use of VLSI technology, resulting in a processing node that can be economically replicated. The new system achieves a careful balance between high-speed communication and floating-point computation. This paper describes the new architecture in detail and explores some of the issues in developing effective software.

  16. The OPtimising HEalth LIterAcy (Ophelia) process: study protocol for using health literacy profiling and community engagement to create and implement health reform

    PubMed Central

    2014-01-01

    Background Health literacy is a multi-dimensional concept comprising a range of cognitive, affective, social, and personal skills and attributes. This paper describes the research and development protocol for a large communities-based collaborative project in Victoria, Australia that aims to identify and respond to health literacy issues for people with chronic conditions. The project, called Ophelia (OPtimising HEalth LIterAcy) Victoria, is a partnership between two universities, eight service organisations and the Victorian Government. Based on the identified issues, it will develop and pilot health literacy interventions across eight disparate health services to inform the creation of a health literacy response framework to improve health outcomes and reduce health inequalities. Methods/Design The protocol draws on many inputs including the experience of the partners in previous co-creation and roll-out of large-scale health-promotion initiatives. Three key conceptual models/discourses inform the protocol: intervention mapping; quality improvement collaboratives, and realist synthesis. The protocol is outcomes-oriented and focuses on two key questions: ‘What are the health literacy strengths and weaknesses of clients of participating sites?’, and ‘How do sites interpret and respond to these in order to achieve positive health and equity outcomes for their clients?’. The process has six steps in three main phases. The first phase is a needs assessment that uses the Health Literacy Questionnaire (HLQ), a multi-dimensional measure of health literacy, to identify common health literacy needs among clients. The second phase involves front-line staff and management within each service organisation in co-creating intervention plans to strategically respond to the identified local needs. The third phase will trial the interventions within each site to determine if the site can improve identified limitations to service access and/or health outcomes. Discussion

  17. Using high-performance ¹H NMR (HP-qNMR®) for the certification of organic reference materials under accreditation guidelines--describing the overall process with focus on homogeneity and stability assessment.

    PubMed

    Weber, Michael; Hellriegel, Christine; Rueck, Alexander; Wuethrich, Juerg; Jenks, Peter

    2014-05-01

    Quantitative NMR spectroscopy (qNMR) is gaining interest across both analytical and industrial research applications and has become an essential tool for the content assignment and quantitative determination of impurities. The key benefits of using qNMR as measurement method for the purity determination of organic molecules are discussed, with emphasis on the ability to establish traceability to "The International System of Units" (SI). The work describes a routine certification procedure from the point of view of a commercial producer of certified reference materials (CRM) under ISO/IEC 17025 and ISO Guide 34 accreditation, that resulted in a set of essential references for (1)H qNMR measurements, and the relevant application data for these substances are given. The overall process includes specific selection criteria, pre-tests, experimental conditions, homogeneity and stability studies. The advantages of an accelerated stability study over the classical stability-test design are shown with respect to shelf-life determination and shipping conditions. PMID:24182847

  18. AQUEOUS HOMOGENEOUS REACTOR TECHNICAL PANEL REPORT

    SciTech Connect

    Diamond, D.J.; Bajorek, S.; Bakel, A.; Flanagan, G.; Mubayi, V.; Skarda, R.; Staudenmeier, J.; Taiwo, T.; Tonoike, K.; Tripp, C.; Wei, T.; Yarsky, P.

    2010-12-03

    Considerable interest has been expressed for developing a stable U.S. production capacity for medical isotopes and particularly for molybdenum-99 (99Mo). This is motivated by recent reductions in production and supply worldwide. Consistent with U.S. nonproliferation objectives, any new production capability should not use highly enriched uranium fuel or targets. Consequently, Aqueous Homogeneous Reactors (AHRs) are under consideration for potential 99Mo production using low-enriched uranium. Although the Nuclear Regulatory Commission (NRC) has guidance to facilitate the licensing process for non-power reactors, that guidance is focused on reactors with fixed, solid fuel and hence is not applicable to an AHR. A panel was convened to study the technical issues associated with normal operation and potential transients and accidents of an AHR that might be designed for isotope production. The panel has produced the requisite AHR licensing guidance for three chapters that exist now for non-power reactor licensing: Reactor Description, Reactor Coolant Systems, and Accident Analysis. The guidance is in two parts for each chapter: 1) the standard format and content a licensee would use and 2) the standard review plan the NRC staff would use. This guidance takes into account the unique features of an AHR such as the fuel being in solution; the fission product barriers being the vessel and attached systems; the production and release of radiolytic and fission product gases and their impact on operations and their control by a gas management system; and the movement of fuel into and out of the reactor vessel.

  19. Creating Sub-50 nm Nanofluidic Junctions in PDMS Microchip via Self-Assembly Process of Colloidal Silica Beads for Electrokinetic Concentration of Biomolecules

    PubMed Central

    Syed, A.; Mangano, L.; Mao, P.; Han, J.

    2014-01-01

    In this work we describe a novel and simple self-assembly of colloidal silica beads to create a nanofluidic junction between two microchannels. The nanoporous membrane was used to induce ion concentration polarization inside the microchannel, and this electrokinetic preconcentration system allowed rapid concentration of DNA samples by ∼1700 times and protein samples by ∼100 times within 5 minutes. PMID:25254651

  20. Entanglement Created by Dissipation

    SciTech Connect

    Alharbi, Abdullah F.; Ficek, Zbigniew

    2011-10-27

    A technique for entangling closely separated atoms via dissipative spontaneous emission is presented. The system considered is composed of two non-identical two-level atoms separated by a quarter wavelength of a driven standing-wave laser field. At this atomic distance, only one of the atoms can be addressed by the laser field. In addition, we arrange the atomic dipole moments to be oriented relative to the inter-atomic axis such that the dipole-dipole interaction between the atoms is zero at this specific distance. It is shown that entanglement can be created between the atoms on demand by tuning the Rabi frequency of the driving field to the difference between the atomic transition frequencies. The amount of entanglement created depends on the ratio between the damping rates of the atoms, but is independent of the frequency difference between the atoms. We also find that the transient buildup of entanglement between the atoms may differ dramatically for different initial atomic conditions.

  1. Strongly Interacting Homogeneous Fermi Gases

    NASA Astrophysics Data System (ADS)

    Mukherjee, Biswaroop; Patel, Parth; Yan, Zhenjie; Struck, Julian; Zwierlein, Martin

    2016-05-01

    We present a homogeneous box potential for strongly interacting Fermi gases. The local density approximation (LDA) allows measurements on traditional inhomogeneous traps to observe a continuous distribution of Fermi gases in a single shot, but such measurements also suffer from a broadened response due to line-of-sight averaging over varying densities. We trap ultracold fermionic lithium (⁶Li) in a homogeneous optical potential and characterize its flatness through in-situ tomography. A hybrid approach combining a cylindrical optical potential with a harmonic magnetic trap allows us to exploit the LDA and measure local RF spectra without requiring significant image reconstruction. We extract various quantities from the RF spectra, such as Tan's contact, and discuss further measurements of homogeneous Fermi systems under spin imbalance and at finite temperature.

  2. The Art of Gymnastics: Creating Sequences.

    ERIC Educational Resources Information Center

    Rovegno, Inez

    1988-01-01

    Offering students opportunities for creating movement sequences in gymnastics allows them to understand the essence of gymnastics, have creative experiences, and learn about themselves. The process of creating sequences is described. (MT)

  3. Broken Ergodicity in Ideal, Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John; Fu, Terry; Nguyen, Phu; Shum, Victor

    2010-01-01

    We discuss the statistical mechanics of numerical models of ideal, homogeneous, incompressible turbulence and their relevance for dissipative fluids and magnetofluids. These numerical models are based on Fourier series, and the relevant statistical theory predicts that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. However, numerical simulations clearly show that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We explain this phenomenon in terms of 'broken ergodicity', which is defined to occur when dynamical behavior does not match ensemble predictions on very long time-scales. We review the theoretical basis of broken ergodicity, apply it to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence, and show new results from simulations using GPU (graphical processing unit) computers.
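
    The notion of broken ergodicity (a time average failing to match the ensemble prediction on long but finite time scales) can be illustrated with a toy model unrelated to the turbulence simulations themselves. In the double-well Langevin sketch below (all parameters arbitrary), the ensemble mean of x is zero by symmetry, yet a single long trajectory stays pinned to one well:

```python
import numpy as np

rng = np.random.default_rng(0)

def time_averaged_x(x0, steps, dt=1e-3, temp=0.05):
    """Overdamped Langevin dynamics in the double well V(x) = (x^2 - 1)^2.
    Returns the time average of x for each trajectory (x0 may be an array)."""
    x = np.array(x0, dtype=float, ndmin=1).copy()
    acc = np.zeros_like(x)
    for _ in range(steps):
        force = -4.0 * x * (x * x - 1.0)                     # -dV/dx
        x += force * dt + np.sqrt(2.0 * temp * dt) * rng.standard_normal(x.shape)
        acc += x
    return acc / steps

# Ensemble prediction: by the x -> -x symmetry of V, the mean of x is zero.
starts = rng.choice([-1.0, 1.0], size=200)
ensemble_mean = time_averaged_x(starts, 5000).mean()

# A single long run started in one well stays trapped there on this time
# scale, so its time average is far from the ensemble prediction.
single_time_average = time_averaged_x(1.0, 50000)[0]
```

    With the barrier much higher than the noise temperature, well-hopping is essentially never observed on the simulated time scale, which is precisely the mismatch between time and ensemble averages described in the abstract.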

  4. Homogenizing Developmental Studies and ESL.

    ERIC Educational Resources Information Center

    Weaver, Margaret E.

    A discussion of pragmatic issues in both developmental studies (DS) and English-as-a-second-language (ESL) instruction at the college level argues that because the two fields have common problems, challenges, and objectives, they have become homogenized as one in many institutions. Because full-time college faculty avoid teaching developmental…

  5. High frequency homogenization for structural mechanics

    NASA Astrophysics Data System (ADS)

    Nolde, E.; Craster, R. V.; Kaplunov, J.

    2011-03-01

    We consider a net created from elastic strings as a model structure to investigate the propagation of waves through semi-discrete media. We are particularly interested in the development of continuum models, valid at high frequencies, when the wavelength and each cell of the net are of similar order. Net structures are chosen as these form a general two-dimensional example, encapsulating the essential physics involved in the two-dimensional excitation of a lattice structure whilst retaining the simplicity of dealing with elastic strings. Homogenization techniques are developed here for wavelengths commensurate with the cellular scale. Unlike previous theories, these techniques are not limited to low frequency or static regimes, and lead to effective continuum equations valid on a macroscale with the details of the cellular structure encapsulated only through integrated quantities. The asymptotic procedure is based upon a two-scale approach and the physical observation that there are frequencies that give standing waves, periodic with the period or double-period of the cell. A specific example of a net created by a lattice of elastic strings is constructed; the theory is general and not reliant upon the net being infinite, but nonetheless the infinite net is a useful special case for which Bloch theory can be applied. This special case is explored in detail, allowing for verification of the theory, and highlights the importance of degenerate cases; the specific example of a square net is treated in detail. An additional illustration of the versatility of the method is the response to point forcing, which provides a stringent test of the homogenized equations; an exact Green's function for the net is deduced and compared to the asymptotics.

  6. Homogeneous cooling state of frictionless rod particles

    NASA Astrophysics Data System (ADS)

    Rubio-Largo, S. M.; Alonso-Marroquin, F.; Weinhart, T.; Luding, S.; Hidalgo, R. C.

    2016-02-01

    In this work, we report some theoretical results on granular gases consisting of frictionless 3D rods with low energy dissipation. We performed simulations on the temporal evolution of soft spherocylinders, using a molecular dynamics algorithm implemented on GPU architecture. A homogeneous cooling state for rods, where the time dependence of the system's intensive variables occurs only through a global granular temperature, has been identified. We have found a homogeneous cooling process in excellent agreement with Haff's law when using an adequate rescaling time τ(ξ), the value of which depends on the particle elongation ξ and the restitution coefficient. It was further found that scaled particle velocity distributions remain approximately Gaussian regardless of the particle shape. Similarly to a system of ellipsoids, energy equipartition between rotational and translational degrees of freedom was better satisfied as one gets closer to the elastic limit. Taking advantage of scaling properties, we have numerically determined the general functionality of the magnitude Dc(ξ), which describes the efficiency of the energy interchange between rotational and translational degrees of freedom, as well as its dependence on particle shape. We have detected a range of particle elongations (1.5 < ξ < 4.0) where the average energy transfer between the rotational and translational degrees of freedom is greater for spherocylinders than for homogeneous ellipsoids with the same aspect ratio.
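
    The Haff's-law decay referred to above has a simple closed form; as a minimal sketch (τ here is just a parameter; the abstract's rescaling τ(ξ) with elongation and restitution is not reproduced):

```python
def haff_temperature(t, T0, tau):
    """Haff's law for the homogeneous cooling state: the granular
    temperature decays as T(t) = T0 / (1 + t/tau)^2, where tau is the
    characteristic cooling time (for the rods of the abstract, tau is
    rescaled by the elongation and the restitution coefficient)."""
    return T0 / (1.0 + t / tau) ** 2
```

    At t = τ the temperature has dropped to a quarter of its initial value, and at long times it decays as t⁻².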

  7. Self Creating Universe

    NASA Astrophysics Data System (ADS)

    Terry, Bruce

    2001-04-01

    Cosmology has deduced that our existence began 15 billion years ago, but that does not constitute a true story. When compared against infinity, the true question one must ask is, 'why did creation begin now (a mere 15 billion years ago, give or take) and not at some infinite point before? What could keep the one common original source static for an infinity, and then spring forth into existence?' Also, accelerators are actually creating atmospheres much like those within quasars, black holes and stars. This destructive/creative environment is not that of original creation; it is that which occurs in a later stage of cosmic evolution. Knowing that it is only a matter of movement or change, understanding what is moving is the key. Regardless of how much power is used to alter the character of a particle's matter, it does not make its essence go away, nor does it make the understanding of original essence clearer. To find the true answer of what occurred, one must look back in time and think carefully over the process of elimination to find the original creation of matter, albeit different than that of the later processes. Matter and the physical laws formed themselves in an absolute infinity of blackness prior to light, and no Big Bang scenario was necessary.

  8. A compact setup to study homogeneous nucleation and condensation.

    PubMed

    Karlsson, Mattias; Alxneit, Ivo; Rütten, Frederik; Wuillemin, Daniel; Tschudi, Hans Rudolf

    2007-03-01

    An experiment is presented to study homogeneous nucleation and the subsequent droplet growth at high temperatures and high pressures in a compact setup that does not use moving parts. Nucleation and condensation are induced in an adiabatic, stationary expansion of the vapor and an inert carrier gas through a Laval nozzle. The adiabatic expansion is driven against atmospheric pressure by pressurized inert gas whose mass flow is carefully controlled. This allows us to avoid large pumps or vacuum storage tanks. Because we eventually want to study the homogeneous nucleation and condensation of zinc, carefully chosen materials are required that can withstand pressures of up to 10⁶ Pa resulting from mass flow rates of up to 600 lN min⁻¹ and temperatures up to 1200 K in the presence of highly corrosive zinc vapor. To observe the formation of droplets, a laser beam propagates along the axis of the nozzle and the light scattered by the droplets is detected perpendicularly to the nozzle axis. An ICCD camera records the scattered light, spatially resolved, through fused silica windows in the diverging part of the nozzle, and detects nucleation and condensation coherently in a single exposure. For the data analysis, a model is needed that describes the isentropic core part of the flow along the nozzle axis. The model must incorporate the laws of fluid dynamics and the nucleation and condensation process, and has to predict the size distribution of the particles created (PSD) at every position along the nozzle axis. Assuming Rayleigh scattering, the intensity of the scattered light can then be calculated from the second moment of the PSD. PMID:17411197
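
    Assuming, as the abstract does, Rayleigh scattering, each droplet contributes a signal proportional to the square of its volume, so the total signal follows from the second moment of a volume-based PSD. A hedged numerical sketch (function name and prefactor handling are illustrative):

```python
import numpy as np

def scattered_intensity(volumes, counts, prefactor=1.0):
    """Relative Rayleigh-scattered signal from an ensemble of droplets.

    In the Rayleigh limit each droplet scatters proportionally to the
    square of its volume, so the total signal is proportional to the
    second moment of the volume-based size distribution. 'prefactor'
    lumps together wavelength, refractive-index and geometry factors,
    which are not modelled here.
    """
    volumes = np.asarray(volumes, dtype=float)
    counts = np.asarray(counts, dtype=float)
    return prefactor * np.sum(counts * volumes ** 2)
```

    The strong volume-squared weighting means the signal is dominated by the largest droplets, which is why a flow model predicting the full PSD along the nozzle axis is needed to interpret the images.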

  9. Creating improved ASTER DEMs over glacierized terrain

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Khalsa, S. S.; Armstrong, R.

    2006-12-01

    Digital elevation models (DEMs) produced from ASTER stereo imagery over glacierized terrain frequently contain data voids, which some software packages fill by interpolation. Even when interpolation is applied, the results are often not accurate enough for studies of glacier thickness changes. DEMs are created by automatic cross-correlation between the image pairs, and rely on spatial variability in the digital number (DN) values for this process. Voids occur in radiometrically homogeneous regions, such as glacier accumulation areas covered with uniform snow, due to lack of correlation. The same property that leads to lack of correlation makes possible the derivation of elevation information from photoclinometry, also known as shape-from-shading. We demonstrate a technique to produce improved DEMs from ASTER data by combining the results from conventional cross-correlation DEM-generation software with elevation information produced from shape-from-shading in the accumulation areas of glacierized terrain. The resulting DEMs incorporate more information from the imagery, and the filled voids more accurately represent the glacier surface. This will allow for more accurate determination of glacier hypsometry and thickness changes, leading to better predictions of response to climate change.
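
    Shape-from-shading in its simplest 1-D form can be sketched as below, assuming a Lambertian surface, a profile aligned with the sun azimuth, and a known albedo-irradiance product; this is a generic illustration, not the authors' actual pipeline:

```python
import numpy as np

def profile_from_brightness(I, albedo_flux, sun_zenith_rad, dx=1.0):
    """1-D photoclinometry along the sun azimuth for a Lambertian surface.

    Model: I = albedo_flux * cos(incidence), with incidence =
    sun_zenith - slope for a profile aligned with the sun direction.
    The slope angle at each pixel is recovered from the brightness and
    then integrated (cumulative sum of tan(slope) * dx) into a relative
    height profile.
    """
    I = np.asarray(I, dtype=float)
    incidence = np.arccos(np.clip(I / albedo_flux, -1.0, 1.0))
    slope = sun_zenith_rad - incidence
    return np.cumsum(np.tan(slope)) * dx
```

    The radiometric uniformity of snow that defeats cross-correlation is exactly what makes the constant-albedo assumption above plausible over accumulation areas.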

  10. Homogeneous melting of superheated crystals: Molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Forsblom, Mattias; Grimvall, Göran

    2005-08-01

    The homogeneous melting mechanism in a superheated fcc lattice is studied through molecular dynamics simulations, usually for about 20 000 atoms, with the Ercolessi and Adams interaction that represents aluminum. The periodic boundary conditions for the simulation cell suppress the usual surface-initiated melting at Tm = 939 K, and the solid-to-liquid transition takes place at the temperature Ts = 1.3 Tm. By logging the position of each atom at every time step in the simulation, we can follow the melting process in detail at the atomic level. Thermal fluctuations close to Ts create interstitial-vacancy pairs, which occasionally separate into mobile interstitials and almost immobile vacancies. There is an attraction between two interstitials, with a calculated maximum interaction energy of about 0.7 eV. When three to four migrating interstitials have come close enough to form a bound aggregate of point defects, and a few thermally created interstitial-vacancy pairs have been added to the aggregate, such a defect configuration usually continues to grow irreversibly to the liquid state. For 20 000 atoms in the simulation cell, the growth process takes about 10²τ to be completed, where τ is the period of a typical atomic vibration in the solid phase. This melting mechanism involves fewer atoms in its crucial initial phase than has been suggested in other melting models. The elastic shear moduli c44 and c' = (c11 - c12)/2 were calculated as a function of temperature and were shown to be finite at the onset of melting.

  11. Homogeneous Pt-bimetallic Electrocatalysts

    SciTech Connect

    Wang, Chao; Chi, Miaofang; More, Karren Leslie; Markovic, Nenad; Stamenkovic, Vojislav

    2011-01-01

    Alloying has shown enormous potential for tailoring the atomic and electronic structures, and improving the performance of catalytic materials. Systematic studies of alloy catalysts are, however, often compromised by inhomogeneous distribution of alloying components. Here we introduce a general approach for the synthesis of monodispersed and highly homogeneous Pt-bimetallic alloy nanocatalysts. Pt{sub 3}M (where M = Fe, Ni, or Co) nanoparticles were prepared by an organic solvothermal method and then supported on high surface area carbon. These catalysts attained a homogeneous distribution of elements, as demonstrated by atomic-scale elemental analysis using scanning transmission electron microscopy. They also exhibited high catalytic activities for the oxygen reduction reaction (ORR), with improvement factors of 2-3 versus conventional Pt/carbon catalysts. The measured ORR catalytic activities for Pt{sub 3}M nanocatalysts validated the volcano curve established on extended surfaces, with Pt{sub 3}Co being the most active alloy.

  12. Homogeneous enzyme immunoassay for netilmicin.

    PubMed Central

    Wenk, M; Hemmann, R; Follath, F

    1982-01-01

    A newly developed homogeneous enzyme immunoassay for the determination of netilmicin in serum was evaluated and compared with a radioenzymatic assay. A total of 102 serum samples from patients treated with netilmicin were measured by both methods. This comparison showed an excellent correlation (r = 0.993). The enzyme immunoassay has proved to be precise, accurate, and specific. Because of its rapidity and the ease of performance, this method is a useful alternative to current assays for monitoring serum netilmicin concentrations. PMID:6760807

  13. High School Student Perceptions of the Utility of the Engineering Design Process: Creating Opportunities to Engage in Engineering Practices and Apply Math and Science Content

    NASA Astrophysics Data System (ADS)

    Berland, Leema; Steingut, Rebecca; Ko, Pat

    2014-12-01

    Research and policy documents increasingly advocate for incorporating engineering design into K-12 classrooms in order to accomplish two goals: (1) provide an opportunity to engage with science content in a motivating real-world context; and (2) introduce students to the field of engineering. The present study uses multiple qualitative data sources (i.e., interviews, artifact analysis) in order to examine the ways in which engaging in engineering design can support students in participating in engineering practices and applying math and science knowledge. This study suggests that students better understand and value those aspects of engineering design that are more qualitative (i.e., interviewing users, generating multiple possible solutions) than the more quantitative aspects of design which create opportunities for students to integrate traditional math and science content into their design work (i.e., modeling or systematically choosing between possible design solutions). Recommendations for curriculum design and implementation are discussed.

  14. Multifractal spectra in homogeneous shear flow

    NASA Technical Reports Server (NTRS)

    Deane, A. E.; Keefe, L. R.

    1988-01-01

    Employing numerical simulations of 3-D homogeneous shear flow, the associated multifractal spectra of the energy dissipation, scalar dissipation and vorticity fields were calculated. The results for 128³ simulations of this flow, and those obtained in recent experiments that analyzed 1- and 2-D intersections of atmospheric and laboratory flows, are in some agreement. A two-scale Cantor set model of the energy cascade process, which describes the experimental results from 1-D intersections quite well, describes the 3-D results only marginally.
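
    For the equal-length special case of such a two-scale Cantor (binomial) cascade, the multifractal spectrum f(α) follows in closed form from the partition function; a hedged sketch (p = 0.7 is an arbitrary illustration, not a fitted value from the paper):

```python
import numpy as np

def cantor_multifractal_spectrum(p=0.7, q=np.linspace(-10.0, 10.0, 2001)):
    """f(alpha) spectrum of a binomial multiplicative cascade: at each
    refinement step the measure splits with weights p and 1 - p onto two
    subintervals of length 1/2 (the equal-length special case of a
    two-scale Cantor-set cascade).

    tau(q) = -log2(p**q + (1-p)**q); alpha = dtau/dq; f = q*alpha - tau.
    """
    tau = -np.log2(p ** q + (1.0 - p) ** q)
    alpha = np.gradient(tau, q)        # numerical derivative d tau / d q
    f = q * alpha - tau                # Legendre transform
    return alpha, f
```

    The maximum of f(α) equals the support dimension (here 1), and the width of the α range quantifies the intermittency that the dissipation-field spectra in the abstract measure.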

  15. Variable valve timing in a homogenous charge compression ignition engine

    DOEpatents

    Lawrence, Keith E.; Faletti, James J.; Funke, Steven J.; Maloney, Ronald P.

    2004-08-03

    The present invention relates generally to the field of homogeneous charge compression ignition engines, in which fuel is injected when the cylinder piston is relatively close to the bottom dead center position for its compression stroke. The fuel mixes with air in the cylinder during the compression stroke to create a relatively lean homogeneous mixture that preferably ignites when the piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage can result. The present invention utilizes internal exhaust gas recirculation and/or compression ratio control to control the timing of ignition events and combustion duration in homogeneous charge compression ignition engines. Thus, at least one electro-hydraulic assist actuator is provided that is capable of mechanically engaging at least one cam-actuated intake and/or exhaust valve.

  16. Homogenization and improvement in energy dissipation of nonlinear composites

    NASA Astrophysics Data System (ADS)

    Verma, Luv; Sivakumar, Srinivasan M.; Vedantam, S.

    2016-04-01

    Due to their high strength-to-weight and stiffness-to-weight ratios, composite materials are increasingly replacing conventional metals, but composites have poor damage resistance in the transverse direction. Under impact loads, they can fail in a wide variety of modes that severely reduce the structural integrity of the component. This paper deals with the homogenization of a glass-fiber and epoxy composite with an added inelastic inclusion phase. The nonlinearity is modelled by a kinematic hardening procedure, and homogenization is carried out by a mean-field homogenization technique known as the Mori-Tanaka method. The homogenization process considers two phases, the matrix and the inelastic inclusion; the glass fibers and epoxy are treated together as a single phase that acts as the matrix while homogenizing the nonlinear composite. Homogenization results have been compared to the matrix alone at an inclusion volume fraction of zero and to the inelastic material at a volume fraction of one. After homogenization, the increase in energy dissipation in the composite due to the addition of inelastic material, and the effect on this of changing the matrix material properties, are discussed.
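
    The Mori-Tanaka step can be illustrated in its simplest scalar form, the effective bulk modulus of spherical inclusions in an isotropic matrix; this is a generic textbook expression, not the paper's tensor implementation for an inelastic phase:

```python
def mori_tanaka_bulk(K_m, G_m, K_i, f):
    """Mori-Tanaka estimate of the effective bulk modulus of a two-phase
    composite: spherical inclusions (bulk modulus K_i, volume fraction f)
    embedded in a matrix (bulk modulus K_m, shear modulus G_m).

    Reduces to the matrix at f = 0 and to the inclusion at f = 1,
    matching the limiting checks described in the abstract.
    """
    denom = 1.0 + (1.0 - f) * (K_i - K_m) / (K_m + 4.0 * G_m / 3.0)
    return K_m + f * (K_i - K_m) / denom
```

    The two limiting cases mirror the validation in the abstract: the homogenized response must recover the matrix at volume fraction zero and the inclusion material at volume fraction one.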

  17. Reduction of pantethine in rabbit ocular lens homogenate.

    PubMed

    Fisher, D H; Szulc, M E

    1997-02-01

    In several animal models, preliminary studies have indicated that pantethine may inhibit cataract formation. Therefore, preclinical trials need to be conducted to study the pharmacology of pantethine in the ocular lens and to establish its efficacy. Since pantethine, which is a disulfide, can undergo a variety of chemical modifications such as reduction and formation of mixed disulfides, a detailed study was first conducted to determine the stability of pantethine in rabbit lens homogenate. A knowledge of the stability of pantethine in lens homogenate was necessary to establish whether pantethine could be metabolized in the time it takes to harvest and homogenize a lens. The results of this study will be used to establish a protocol for harvesting and homogenizing lens samples. Pantethine (100 microM) is completely reduced to pantetheine in rabbit lens homogenate in about 16 min. About 1.5% of the pantethine added to lens homogenate forms a mixed disulfide with lens proteins, and the remainder is found in the supernatant. The supernatant pantethine concentration decreases exponentially as a function of time, and the terminal half-life for this process is 3.3 min. The free supernatant pantetheine concentration increases in a pseudo-first-order manner as a function of time with a rate constant of 4.3 min. Pantethinase activity is not significant, because the free supernatant pantetheine concentration did not decrease. The exact mechanism of pantethine reduction in rabbit lens homogenate remains to be determined. PMID:9127277
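
    The reported 3.3 min terminal half-life corresponds to a first-order rate constant k = ln 2 / t½ ≈ 0.21 min⁻¹; a minimal sketch showing that this is consistent with near-complete reduction in about 16 min:

```python
import numpy as np

HALF_LIFE = 3.3                    # min, supernatant pantethine (from the abstract)
K = np.log(2.0) / HALF_LIFE        # first-order rate constant, ~0.21 min^-1
C0 = 100.0                         # microM, initial pantethine

def pantethine_conc(t_min):
    """Supernatant pantethine concentration (microM) after t_min minutes,
    assuming simple first-order (exponential) decay."""
    return C0 * np.exp(-K * t_min)
```

    After 16 min about 4.8 half-lives have elapsed, leaving only a few percent of the starting disulfide, consistent with the observed 'complete' reduction in about 16 min.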

  18. Matrix shaped pulsed laser deposition: New approach to large area and homogeneous deposition

    NASA Astrophysics Data System (ADS)

    Akkan, C. K.; May, A.; Hammadeh, M.; Abdul-Khaliq, H.; Aktas, O. C.

    2014-05-01

    Pulsed laser deposition (PLD) is one of the well-established physical vapor deposition methods used for the synthesis of ultra-thin layers. PLD is especially suitable for the preparation of thin films of complex alloys and ceramics where conservation of the stoichiometry is critical. Despite the several advantages of PLD, thickness inhomogeneity limits its use in some applications. There are several approaches to achieving homogeneous layers, such as rotating the substrate or scanning the laser beam over the target. However, such movement introduces further complexity into the process parameters. Here we present a new approach, which we call Matrix Shaped PLD, to control the thickness and homogeneity of deposited layers precisely. This new approach is based on shaping the incoming laser beam with a microlens array and a Fourier lens. The beam is split into an array of many smaller beams over the target, which leads to homogeneous plasma formation. The uniform intensity distribution over the target yields a very uniform deposit on the substrate. This approach is used to deposit carbide and oxide thin films for biomedical applications. As a case study, the coating of a stent, which has a complex geometry, is briefly presented.

  19. Homogeneous cooling of mixtures of particle shapes

    NASA Astrophysics Data System (ADS)

    Hidalgo, R. C.; Serero, D.; Pöschel, T.

    2016-07-01

    In this work, we examine theoretically the cooling dynamics of binary mixtures of spheres and rods. To this end, we introduce a generalized mean field analytical theory, which describes the free cooling behavior of the mixture. The relevant characteristic time scale for the cooling process is derived, depending on the mixture composition and the aspect ratio of the rods. We simulate mixtures of spherocylinders and spheres using a molecular dynamics algorithm implemented on graphics processing unit (GPU) architecture. We systematically study mixtures composed of spheres and rods with several aspect ratios and varying the mixture composition. A homogeneous cooling state, where the time dependence of the system's intensive variables occurs only through a global granular temperature, is identified. We find cooling dynamics in excellent agreement with Haff's law, when using an adequate time scale. Using the scaling properties of the homogeneous cooling dynamics, we estimated numerically the efficiency of the energy interchange between rotational and translational degrees of freedom for collisions between spheres and rods.

  20. Creating a Comprehensive, Efficient, and Sustainable Nuclear Regulatory Structure: A Process Report from the U.S. Department of Energy's Material Protection, Control and Accounting Program

    SciTech Connect

    Wright, Troy L.; O'Brien, Patricia E.; Hazel, Michael J.; Tuttle, John D.; Cunningham, Mitchel E.; Schlegel, Steven C.

    2010-08-11

    With the congressionally mandated January 1, 2013 deadline for the U.S. Department of Energy’s (DOE) Nuclear Material Protection, Control and Accounting (MPC&A) program to complete its transition of MPC&A responsibility to the Russian Federation, National Nuclear Security Administration (NNSA) management directed its MPC&A program managers and team leaders to demonstrate that work in ongoing programs would lead to successful and timely achievement of these milestones. In the spirit of planning for successful project completion, the NNSA review of the Russian regulatory development process confirmed the critical importance of an effective regulatory system to a sustainable nuclear protection regime and called for an analysis of the existing Russian regulatory structure and the identification of a plan to ensure a complete MPC&A regulatory foundation. This paper describes the systematic process used by DOE’s MPC&A Regulatory Development Project (RDP) to develop an effective and sustainable MPC&A regulatory structure in the Russian Federation. This nuclear regulatory system will address all non-military Category I and II nuclear materials at State Corporation for Atomic Energy “Rosatom,” the Federal Service for Ecological, Technological, and Nuclear Oversight (Rostechnadzor), the Federal Agency for Marine and River Transport (FAMRT, within the Ministry of Transportation), and the Ministry of Industry and Trade (Minpromtorg). The approach to ensuring a complete and comprehensive nuclear regulatory structure includes five sequential steps. The approach was adopted from DOE’s project management guidelines and was adapted to the regulatory development task by the RDP. The five steps in the Regulatory Development Process are: 1) Define MPC&A Structural Elements; 2) Analyze the existing regulatory documents using the identified Structural Elements; 3) Validate the analysis with Russian colleagues and define the list of documents to be developed; 4) Prioritize and

  1. ISOTOPE METHODS IN HOMOGENEOUS CATALYSIS.

    SciTech Connect

    BULLOCK,R.M.; BENDER,B.R.

    2000-12-01

    The use of isotope labels has had a fundamentally important role in the determination of mechanisms of homogeneously catalyzed reactions. Mechanistic data is valuable since it can assist in the design and rational improvement of homogeneous catalysts. There are several ways to use isotopes in mechanistic chemistry. Isotopes can be introduced into controlled experiments and followed where they go or don't go; in this way, Libby, Calvin, Taube and others used isotopes to elucidate mechanistic pathways for very different, yet important chemistries. Another important isotope method is the study of kinetic isotope effects (KIEs) and equilibrium isotope effects (EIEs). Here the mere observation of where a label winds up is no longer enough - what matters is how much slower (or faster) a labeled molecule reacts than the unlabeled material. The most careful studies essentially involve the measurement of isotope fractionation between a reference ground state and the transition state. Thus kinetic isotope effects provide unique data unavailable from other methods, since information about the transition state of a reaction is obtained. Because getting an experimental glimpse of transition states is really tantamount to understanding catalysis, kinetic isotope effects are very powerful.
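
    A common back-of-the-envelope use of KIEs is the semiclassical zero-point-energy estimate for a primary H/D effect; the sketch below uses typical C-H and C-D stretch wavenumbers (illustrative values, not from this report):

```python
import numpy as np

H = 6.62607015e-34    # Planck constant, J s
C = 2.99792458e10     # speed of light, cm/s
KB = 1.380649e-23     # Boltzmann constant, J/K

def kie_from_zpe(nu_H=2900.0, nu_D=2100.0, T=298.0):
    """Semiclassical primary H/D kinetic isotope effect, assuming the
    isotope-sensitive stretch (wavenumbers in cm^-1) is completely lost
    at the transition state: KIE = exp(delta_ZPE / kT)."""
    d_zpe = 0.5 * H * C * (nu_H - nu_D)     # zero-point-energy difference, J
    return np.exp(d_zpe / (KB * T))
```

    With these typical wavenumbers the estimate comes out near 7 at room temperature, the familiar textbook maximum for a primary C-H/C-D effect; tunneling or secondary effects, which this simple model ignores, shift measured values.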

  2. Creating a framework for experimentally testing early visual processing: a response to Nurmoja, et al. (2012) on trait perception from pixelized faces.

    PubMed

    Carbon, Claus-Christian

    2013-08-01

    Nurmoja, Eamets, Härma, and Bachmann (2012) revealed that strongly pixelated pictures of faces still provide relevant cues for reliably assessing the apparent (i.e., subjectively perceived) traits of the portrayed. The present article responds to the paper by developing the outline of a framework for future research to reveal certain steps in processing complex visual stimuli. This framework combines the approach of degradation of the stimuli with the so-called microgenetic approach of percepts based on presentation time limitations. The proposed combination of a particular kind of stimulus manipulation and a specific experimental procedure allows testing targeted assumptions concerning visual processing, not only in the domain of face perception, but in all domains involving complex visual stimuli, for example, art perception. PMID:24422351
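
    The pixelation degradation itself is usually implemented as block averaging; a minimal sketch (generic, not the stimulus-generation code of the original study):

```python
import numpy as np

def pixelate(img, block):
    """Degrade a 2-D grayscale image by block averaging: each block x block
    tile is replaced by its mean, and the coarse image is re-expanded to
    the (cropped) original size - the classic 'pixelized face' manipulation.
    """
    h, w = img.shape
    h2, w2 = h - h % block, w - w % block            # crop to a multiple of block
    tiles = img[:h2, :w2].reshape(h2 // block, block, w2 // block, block)
    coarse = tiles.mean(axis=(1, 3))
    return np.kron(coarse, np.ones((block, block)))
```

    Increasing the block size gives exactly the kind of graded stimulus degradation that the proposed framework would cross with presentation-time limits.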

  3. 3D modeling of the molten zone shape created by an asymmetric HF EM field during the FZ crystal growth process

    NASA Astrophysics Data System (ADS)

    Rudevics, A.; Muiznieks, A.; Ratnieks, G.; Riemann, H.

    2005-06-01

    In the modern industrial floating zone (FZ) silicon crystal growth process by the needle-eye technique, the high frequency (HF) electromagnetic (EM) field plays a crucial role. The EM field melts a rotating polysilicon feed rod and maintains the zone of molten silicon, which is held by the rotating single crystal. To model such a system, 2D axisymmetric models can be used; however, because of the system's asymmetries (e.g., the asymmetry of the HF inductor), the applicability of such models is limited. Modeling the FZ process in three dimensions (3D) is therefore necessary. This paper describes a new, comprehensive 3D mathematical model of FZ crystal growth and the correspondingly developed software package Shape3D. A 3D calculation example for a realistic FZ system is also presented.

  4. Homogeneous Open Quantum Random Walks on a Lattice

    NASA Astrophysics Data System (ADS)

    Carbone, Raffaella; Pautrat, Yan

    2015-09-01

    We study open quantum random walks (OQRWs) for which the underlying graph is a lattice, and the generators of the walk are homogeneous in space. Using the results recently obtained in Carbone and Pautrat (Ann Henri Poincaré, 2015), we study the quantum trajectory associated with the OQRW, which is described by a position process and a state process. We obtain a central limit theorem and a large deviation principle for the position process. We study in detail the case of homogeneous OQRWs on the lattice, with internal space.
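    The position and state processes can be illustrated with a minimal quantum-trajectory sketch. This is not the construction of the paper, just a toy instance: an OQRW on the integers with a two-dimensional internal space, where the transition operators are diagonal with arbitrarily chosen entries (so the density-matrix algebra stays real), subject only to the normalization constraint:

```python
import random

# Toy homogeneous OQRW on the integers with internal space C^2.
# The transition operators are diagonal (illustrative values only), chosen so that
# B_plus^dagger B_plus + B_minus^dagger B_minus = I.
B_PLUS = (0.5 ** 0.5, 0.7 ** 0.5)    # diagonal entries of B_plus
B_MINUS = (0.5 ** 0.5, 0.3 ** 0.5)   # diagonal entries of B_minus

def trajectory(n_steps, rng):
    """Sample one quantum trajectory: a position process plus an internal state process."""
    x = 0
    rho = [0.5, 0.5]  # diagonal of the internal density matrix (stays diagonal here)
    for _ in range(n_steps):
        p_plus = B_PLUS[0] ** 2 * rho[0] + B_PLUS[1] ** 2 * rho[1]  # tr(B+ rho B+^dagger)
        if rng.random() < p_plus:  # measurement selects the jump direction
            x += 1
            rho = [B_PLUS[0] ** 2 * rho[0] / p_plus,
                   B_PLUS[1] ** 2 * rho[1] / p_plus]
        else:
            x -= 1
            p_minus = 1.0 - p_plus
            rho = [B_MINUS[0] ** 2 * rho[0] / p_minus,
                   B_MINUS[1] ** 2 * rho[1] / p_minus]
    return x

rng = random.Random(42)
n = 400
finals = [trajectory(n, rng) for _ in range(300)]
mean_speed = sum(finals) / len(finals) / n
print(f"empirical speed X_n/n ~ {mean_speed:.3f}")
```

    Over many trajectories the empirical distribution of the centered, rescaled position is what the central limit theorem of the paper describes.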

  5. Create a Logo.

    ERIC Educational Resources Information Center

    Duchen, Gail

    2002-01-01

    Presents an art lesson that introduced students to graphic art as a career path. Explains that the students met a graphic artist and created a logo for a pretend client. Explains that the students researched logos. (CMK)

  6. Invariant distributions on compact homogeneous spaces

    SciTech Connect

    Gorbatsevich, V V

    2013-12-31

    In this paper, we study distributions on compact homogeneous spaces, including invariant distributions and also distributions admitting a sub-Riemannian structure. We first consider distributions of dimension 1 and 2 on compact homogeneous spaces. After this, we study the cases of compact homogeneous spaces of dimension 2, 3, and 4 in detail. Invariant distributions on simply connected compact homogeneous spaces are also treated. Bibliography: 18 titles.

  7. Numerical experiments in homogeneous turbulence

    NASA Technical Reports Server (NTRS)

    Rogallo, R. S.

    1981-01-01

    The direct simulation methods developed by Orszag and Patterson (1972) for isotropic turbulence were extended to homogeneous turbulence in an incompressible fluid subjected to uniform deformation or rotation. The results of simulations for irrotational strain (plane and axisymmetric), shear, rotation, and relaxation toward isotropy following axisymmetric strain are compared with linear theory and experimental data. Emphasis is placed on the shear flow because of its importance and because accurate and detailed experimental data are available. The computed results are used to assess the accuracy of two popular models used in the closure of the Reynolds-stress equations. Data from a variety of the computed fields and the details of the numerical methods used in the simulation are also presented.

  8. Homogenization of regional river dynamics by dams and global biodiversity implications.

    PubMed

    Poff, N Leroy; Olden, Julian D; Merritt, David M; Pepin, David M

    2007-04-01

    Global biodiversity in river and riparian ecosystems is generated and maintained by geographic variation in stream processes and fluvial disturbance regimes, which largely reflect regional differences in climate and geology. Extensive construction of dams by humans has greatly dampened the seasonal and interannual streamflow variability of rivers, thereby altering natural dynamics in ecologically important flows on continental to global scales. The cumulative effects of such modification to regional-scale environmental templates are largely unexplored but of critical conservation importance. Here, we use 186 long-term streamflow records on intermediate-sized rivers across the continental United States to show that dams have homogenized the flow regimes of third- through seventh-order rivers in 16 historically distinctive hydrologic regions over the course of the 20th century. This regional homogenization occurs chiefly through modification of the magnitude and timing of ecologically critical high and low flows. For 317 undammed reference rivers, no evidence of homogenization was found, despite documented changes in regional precipitation over this period. With an estimated average density of one dam every 48 km of third- through seventh-order river channel in the United States, dams arguably have a continental-scale effect of homogenizing regionally distinct environmental templates, thereby creating conditions that favor the spread of cosmopolitan, nonindigenous species at the expense of locally adapted native biota. Quantitative analyses such as ours provide the basis for conservation and management actions aimed at restoring and maintaining native biodiversity and ecosystem function and resilience for regionally distinct ecosystems at continental to global scales. PMID:17360379
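    The dampening of flow variability that drives this homogenization can be quantified with a simple dispersion statistic. The sketch below uses hypothetical annual peak flows (invented values, not the study's data) and the coefficient of variation:

```python
import statistics

def cv(values):
    """Coefficient of variation: interannual variability relative to the mean flow."""
    return statistics.stdev(values) / statistics.mean(values)

# Hypothetical annual peak flows (m^3/s) for one river, before and after regulation
pre_dam = [820, 1450, 610, 2100, 980, 1750, 540, 1300]
post_dam = [700, 760, 680, 820, 710, 790, 660, 740]  # peaks clipped by reservoir storage

print(f"CV pre-dam:  {cv(pre_dam):.2f}")
print(f"CV post-dam: {cv(post_dam):.2f}")
# A systematic drop in CV across many gauges is one signature of regional flow homogenization.
```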

  9. Challenges of daily data homogenization

    NASA Astrophysics Data System (ADS)

    Gruber, C.; Auer, I.; Mestre, O.

    2009-04-01

    In recent years the growing demand for extreme-value studies has led to the development of methods for the homogenisation of daily data. The behaviour of some of these methods has been investigated: two methods that adjust the whole distribution of the climate element (especially minimum and maximum temperature), HOM (Della-Marta and Wanner, 2006) and SPLIDHOM (Mestre et al., submitted), were compared with the simpler method of Vincent et al. (2002), which interpolates monthly adjustment factors onto daily data. The results indicate that HOM and SPLIDHOM behave very similarly, although their complexity differs. They can improve on Vincent's method when inhomogeneities occur in higher-order moments. However, their applicability is limited, since highly correlated neighbour series are required. Moreover, more data are needed in the intervals before and after breaks if the whole distribution, rather than only the mean, is to be adjusted. Because of these limitations, a combination of distribution-dependent adjustment methods and the Vincent method seems necessary for the homogenisation of many time series. A dataset of Austrian daily maximum and minimum temperatures is used to illustrate the challenges of distribution-dependent homogenisation methods. Emphasis is placed on estimating the (sampling) uncertainty of these methods, for which a bootstrap approach is used. The accuracy of the calculated adjustments varies between about 0.5°C for mean temperatures and more than one degree Celsius at the margins of the distribution. These uncertainty estimates can be valuable for extreme-value studies.
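    The bootstrap estimate of sampling uncertainty mentioned above can be sketched as follows. The series is synthetic (not the Austrian dataset), and the percentile bootstrap on a mean candidate-minus-reference adjustment is just one reasonable choice:

```python
import random
import statistics

def bootstrap_ci(diffs, n_boot=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for the mean adjustment."""
    rng = random.Random(seed)
    means = sorted(statistics.mean(rng.choices(diffs, k=len(diffs)))
                   for _ in range(n_boot))
    return means[int(n_boot * alpha / 2)], means[int(n_boot * (1 - alpha / 2))]

# Synthetic daily candidate-minus-reference temperature differences (degC) around a break
rng = random.Random(0)
diffs = [0.8 + rng.gauss(0.0, 1.2) for _ in range(120)]

lo, hi = bootstrap_ci(diffs)
print(f"adjustment = {statistics.mean(diffs):.2f} degC, 95% CI [{lo:.2f}, {hi:.2f}]")
```

    Applied separately per quantile bin instead of to the overall mean, the same resampling idea yields the distribution-dependent uncertainty bands discussed in the record.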

  10. The Quality Control Algorithms Used in the Process of Creating the NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    The methodology and the results of the quality control (QC) process for the meteorological data from the Lightning Protection System (LPS) towers located at Kennedy Space Center (KSC) launch complex 39B (LC-39B) are documented in this paper. Meteorological data are used to design a launch vehicle, determine operational constraints, and apply defined constraints on day-of-launch (DOL). To accomplish these tasks properly, a climatological database of meteorological records is needed that accurately represents the climate the vehicle will encounter. Numerous meteorological measurement towers exist at KSC; however, the engineering tasks require measurements at specific heights, some of which only a few towers provide. Other than the LPS towers, Tower 313 is the only tower that provides observations up to 150 m, and it is located approximately 3.5 km from LC-39B. In addition, the data must be QC'ed to remove erroneous reports that could pollute the results of an engineering analysis, mislead the development of operational constraints, or paint a false picture of the atmosphere at the tower's location.

  11. Polyurethane phantoms with homogeneous and nearly homogeneous optical properties

    NASA Astrophysics Data System (ADS)

    Keränen, Ville T.; Mäkynen, Anssi J.; Dayton, Amanda L.; Prahl, Scott A.

    2010-02-01

    Phantoms with controlled optical properties are often used for calibration and standardization. The phantoms are typically prepared by adding absorbers and scatterers to a clear host material, and it is usually assumed that the scatterers and absorbers are uniformly dispersed within the medium. To explore the effects of this assumption, we prepared paired sets of polyurethane phantoms, each pair containing identical masses of absorber (India ink) and scatterer (titanium dioxide). Polyurethane phantoms were made by mixing two polyurethane parts (a and b) together and letting them cure in a polypropylene container. The mixture was degassed before curing to ensure a sample without bubbles. The optical properties were controlled by mixing titanium dioxide or India ink into one polyurethane part (a or b) before blending the parts together. By changing the mixing sequence, we could change the aggregation of the scattering and absorbing particles. Each set had one sample with homogeneously dispersed scatterers and absorbers and a second sample with slightly aggregated scatterers or absorbers. We found that the measured transmittance could easily vary by a factor of twenty. The estimated optical properties (obtained using the inverse adding-doubling method) indicate that when aggregation is present, the optical properties are no longer proportional to the concentrations of absorbers or scatterers.

  12. An observation of homogeneous and heterogeneous catalysis processes in the decomposition of H₂O₂ over MnO₂ and Mn(OH)₂

    SciTech Connect

    Jiang, S.P.; Ashton, W.R.; Tseung, A.C.C.

    1991-09-01

    The kinetics of peroxide decomposition by manganese dioxide (MnO₂) and manganese hydroxide (Mn(OH)₂) have been studied in alkaline solutions. The activity for peroxide decomposition on Mn(OH)₂ was generally higher than on MnO₂, and the kinetics of H₂O₂ decomposition were first-order for MnO₂ catalysts but 1.3-order for Mn(OH)₂ catalysts. It is suggested that, in the case of Mn(OH)₂, H₂O₂ is mainly decomposed homogeneously by Mn²⁺ ions (in the form of HMnO₂⁻ ions in concentrated alkaline solutions) dissolved in the solution. Compared with the results reported for the decomposition of H₂O₂ in the presence of 1 ppm Co²⁺ ions, it is concluded that the kinetics of the homogeneous decomposition of H₂O₂ are directly influenced by the concentration of the active species in solution.
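    The reaction-order comparison (first-order for MnO₂ versus 1.3-order for Mn(OH)₂) can be illustrated with a log-log fit of initial-rate data; the data and rate constant below are invented for the sketch:

```python
import math

def reaction_order(concentrations, rates):
    """Slope of ln(rate) versus ln(C) estimates the order n in rate = k * C**n."""
    xs = [math.log(c) for c in concentrations]
    ys = [math.log(r) for r in rates]
    m = len(xs)
    xbar, ybar = sum(xs) / m, sum(ys) / m
    return sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
           / sum((x - xbar) ** 2 for x in xs)

# Hypothetical initial-rate data generated with rate = k * C**1.3, k = 0.02
conc = [0.1, 0.2, 0.4, 0.8]
rates = [0.02 * c ** 1.3 for c in conc]
print(f"estimated order = {reaction_order(conc, rates):.2f}")  # prints: estimated order = 1.30
```

    An estimated order near 1 points to a surface (heterogeneous) pathway, while a fractional order above 1 is consistent with an added homogeneous contribution from dissolved species.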

  13. Rh(I)-catalyzed transformation of propargyl vinyl ethers into (E,Z)-dienals: stereoelectronic role of trans effect in a metal-mediated pericyclic process and a shift from homogeneous to heterogeneous catalysis during a one-pot reaction.

    PubMed

    Vidhani, Dinesh V; Krafft, Marie E; Alabugin, Igor V

    2014-01-01

    The combination of experiments and computations reveals unusual features of the stereoselective Rh(I)-catalyzed transformation of propargyl vinyl ethers into (E,Z)-dienals. The first step, the conversion of propargyl vinyl ethers into allene aldehydes, proceeds under homogeneous conditions via a "cyclization-mediated" mechanism initiated by Rh(I) coordination at the alkyne. This path agrees well with the small experimental effects of substituents on the carbinol carbon. The key feature revealed by the computational study is the stereoelectronic effect of the ligand arrangement at the catalytic center. The rearrangement barriers decrease significantly due to the greater transfer of electron density from the catalytic metal center to the CO ligand oriented trans to the alkyne. This effect increases the electrophilicity of the metal and lowers the calculated barriers by 9.0 kcal/mol. Subsequent evolution of the catalyst leads to the in situ formation of Rh(I) nanoclusters that catalyze stereoselective tautomerization. The intermediacy of heterogeneous catalysis by nanoclusters was confirmed by mercury poisoning, temperature-dependent sigmoidal kinetic curves, and dynamic light scattering. The combination of experiments and computations suggests that the initially formed allene-aldehyde product assists in the transformation of a homogeneous catalyst (or "a cocktail of catalysts") into nanoclusters, which in turn catalyze and control the stereochemistry of subsequent transformations. PMID:24304338

  14. Homogenization patterns of the world’s freshwater fish faunas

    PubMed Central

    Villéger, Sébastien; Blanchet, Simon; Beauchard, Olivier; Oberdorff, Thierry; Brosse, Sébastien

    2011-01-01

    The world is currently undergoing an unprecedented decline in biodiversity, which is mainly attributable to human activities. For instance, nonnative species introduction, combined with the extirpation of native species, affects biodiversity patterns, notably by increasing the similarity among species assemblages. This biodiversity change, called taxonomic homogenization, has rarely been assessed at the world scale. Here, we fill this gap by assessing the current homogenization status of one of the most diverse vertebrate groups (i.e., freshwater fishes) at global and regional scales. We demonstrate that current homogenization of freshwater fish faunas is still low at the world scale (0.5%) but reaches substantial levels (up to 10%) in some highly invaded river basins of the Nearctic and Palearctic realms. In these highly altered realms, nonnative species introductions, rather than native species extirpations, drive taxonomic homogenization. Our results suggest that the “Homogocene era” has not yet arrived for freshwater fish faunas at the worldwide scale. However, the distressingly high level of homogenization noted for some biogeographical realms stresses the need for further understanding of the ecological consequences of homogenization processes. PMID:22025692
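    Change-in-similarity figures like those above are typically computed from pairwise presence/absence similarity between assemblages before and after introductions and extirpations. A minimal sketch with hypothetical faunas (species names invented) using the Jaccard index:

```python
def jaccard(a, b):
    """Jaccard similarity between two presence/absence species assemblages."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Hypothetical fish faunas of two basins before and after introductions/extirpations
basin1_before = {"sp1", "sp2", "sp3", "sp4"}
basin2_before = {"sp5", "sp6", "sp7", "sp8"}
basin1_after = {"sp1", "sp2", "carp", "tilapia"}  # natives lost, nonnatives gained
basin2_after = {"sp5", "sp6", "carp", "tilapia"}

change = jaccard(basin1_after, basin2_after) - jaccard(basin1_before, basin2_before)
print(f"change in pairwise similarity: {change:+.2f}")  # positive change -> homogenization
```

    Averaging this change over all basin pairs in a realm gives a realm-level homogenization estimate of the kind reported in the record.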

  15. Discovery of a Novel Immune Gene Signature with Profound Prognostic Value in Colorectal Cancer: A Model of Cooperativity Disorientation Created in the Process from Development to Cancer

    PubMed Central

    An, Ning; Shi, Xiaoyu; Zhang, Yueming; Lv, Ning; Feng, Lin; Di, Xuebing; Han, Naijun; Wang, Guiqi

    2015-01-01

    Immune response-related genes play a major role in colorectal carcinogenesis by mediating inflammation or immune-surveillance evasion. Although remarkable progress has been made in investigating the underlying mechanism, understanding of the complicated carcinogenesis process has been enormously hindered by large-scale tumor heterogeneity. Development and carcinogenesis share striking similarities in their cellular behavior and underlying molecular mechanisms. This association makes embryonic development a viable reference model for studying cancer, thereby circumventing the potentially misleading complexity of tumor heterogeneity. Here we proposed that the immune genes responsible for intra-immune cooperativity disorientation (defined in this study as disruption of developmental expression correlation patterns during carcinogenesis) probably contain an untapped prognostic resource for colorectal cancer. In this study, we determined the mRNA expression profile of 137 human biopsy samples, including samples from different stages of human colonic development, colorectal precancerous progression, and colorectal cancer, 60 of which were also used to generate a miRNA expression profile. We established a Spearman correlation transition model to quantify the cooperativity disorientation associated with the transition from normal to precancerous to cancer tissue, in conjunction with a miRNA-mRNA regulatory network and a machine learning algorithm, to identify genes with prognostic value. Finally, a 12-gene signature was extracted, whose prognostic value was evaluated using Kaplan-Meier survival analysis in five independent datasets. Using the log-rank test, the 12-gene signature was closely related to overall survival in four datasets (GSE17536, n = 177, p = 0.0054; GSE17537, n = 55, p = 0.0039; GSE39582, n = 562, p = 0.13; GSE39084, n = 70, p = 0.11), and significantly associated with disease-free survival in four

  16. Comparative Analysis of a MOOC and a Residential Community Using Introductory College Physics: Documenting How Learning Environments Are Created, Lessons Learned in the Process, and Measurable Outcomes

    NASA Astrophysics Data System (ADS)

    Olsen, Jack Ryan

    Higher education institutions, such as the University of Colorado Boulder (CU-Boulder), have as a core mission to advance their students' academic performance. On the frontier of education technologies that promise to address this mission are Massively Open Online Courses (MOOCs), which are new enough not to be fully understood or well researched. MOOCs, in theory, have vast potential for being cost-effective and for reaching diverse audiences across the world. This thesis examines one MOOC, Physics 1 for Physical Science Majors, offered in the inaugural round of institutionally sanctioned MOOCs in Fall 2013. While comparatively inexpensive relative to a brick-and-mortar course, and while it initially enrolled an audience of nearly 16,000 students, this MOOC was found to be time-consuming to implement, and only roughly 1.5% of those who enrolled completed the course, approximately one quarter of the number who completed the standard brick-and-mortar course the MOOC was designed around. An established education technology, residential communities, contrasts with MOOCs by being high-touch and highly humanized, but also expensive and locally based. The Andrews Hall Residential College (AHRC) on the CU campus fosters academic success and retention by engaging and networking students outside of standard brick-and-mortar courses and by enculturating students into an environment with vertical integration across class years: freshman, sophomore, junior, etc. The physics MOOC and the AHRC were studied to determine how the environments were made and what lessons were learned in the process. Student performance was also compared across the physics MOOC, a subset of AHRC students enrolled in a special physics course, and the standard CU Physics 1 brick-and-mortar course. All yielded similar learning gains for Physics 1 performance, for those who completed the courses.
These environments are presented together to compare and contrast their

  17. Discovery of a Novel Immune Gene Signature with Profound Prognostic Value in Colorectal Cancer: A Model of Cooperativity Disorientation Created in the Process from Development to Cancer.

    PubMed

    An, Ning; Shi, Xiaoyu; Zhang, Yueming; Lv, Ning; Feng, Lin; Di, Xuebing; Han, Naijun; Wang, Guiqi; Cheng, Shujun; Zhang, Kaitai

    2015-01-01

    Immune response-related genes play a major role in colorectal carcinogenesis by mediating inflammation or immune-surveillance evasion. Although remarkable progress has been made in investigating the underlying mechanism, understanding of the complicated carcinogenesis process has been enormously hindered by large-scale tumor heterogeneity. Development and carcinogenesis share striking similarities in their cellular behavior and underlying molecular mechanisms. This association makes embryonic development a viable reference model for studying cancer, thereby circumventing the potentially misleading complexity of tumor heterogeneity. Here we proposed that the immune genes responsible for intra-immune cooperativity disorientation (defined in this study as disruption of developmental expression correlation patterns during carcinogenesis) probably contain an untapped prognostic resource for colorectal cancer. In this study, we determined the mRNA expression profile of 137 human biopsy samples, including samples from different stages of human colonic development, colorectal precancerous progression, and colorectal cancer, 60 of which were also used to generate a miRNA expression profile. We established a Spearman correlation transition model to quantify the cooperativity disorientation associated with the transition from normal to precancerous to cancer tissue, in conjunction with a miRNA-mRNA regulatory network and a machine learning algorithm, to identify genes with prognostic value. Finally, a 12-gene signature was extracted, whose prognostic value was evaluated using Kaplan-Meier survival analysis in five independent datasets. Using the log-rank test, the 12-gene signature was closely related to overall survival in four datasets (GSE17536, n = 177, p = 0.0054; GSE17537, n = 55, p = 0.0039; GSE39582, n = 562, p = 0.13; GSE39084, n = 70, p = 0.11), and significantly associated with disease-free survival in four
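    The Spearman correlation at the heart of the transition model described above is simply a Pearson correlation computed on ranks. A minimal stdlib sketch (gene names and expression values are hypothetical):

```python
def ranks(xs):
    """Average ranks, 1-based; tied values receive the mean of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical expression of two genes across five developmental stages
gene_a = [1.2, 2.0, 2.9, 4.1, 5.0]
gene_b = [0.8, 1.1, 1.9, 2.5, 3.9]  # monotonically co-varying with gene_a
print(spearman(gene_a, gene_b))  # prints: 1.0
```

    A developmental gene pair whose correlation collapses (or flips sign) in tumor tissue is a candidate for the "cooperativity disorientation" the record defines.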

  18. Creating Pupils' Internet Magazine

    ERIC Educational Resources Information Center

    Bognar, Branko; Šimic, Vesna

    2014-01-01

    This article presents an action research, which aimed to improve pupils' literary creativity and enable them to use computers connected to the internet. The study was conducted in a small district village school in Croatia. Creating a pupils' internet magazine appeared to be an excellent way for achieving the educational aims of almost all…

  19. Creating an Interactive Globe.

    ERIC Educational Resources Information Center

    Martin, Kurt D.

    1989-01-01

    Describes a hands-on geography activity that is designed to teach longitude and latitude to fifth-grade students. Children create a scale model of the earth from a 300 gram weather balloon. This activity incorporates geography, mathematics, science, art, and homework. Provides information for obtaining materials. (KO)

  20. How Banks Create Money.

    ERIC Educational Resources Information Center

    Beale, Lyndi

    This teaching module explains how the U.S. banking system uses excess reserves to create money in the form of new deposits for borrowers. The module is part of a computer-animated series of four-to-five-minute modules illustrating standard concepts in high school economics. Although the module is designed to accompany the video program, it may be…

  1. Creating Quality Media Materials.

    ERIC Educational Resources Information Center

    Hortin, John A.; Bailey, Gerald D.

    1982-01-01

    Innovation, imagination, and student creativity are key ingredients in creating quality media materials for the small school. Student-produced media materials, slides without a camera, personalized slide programs and copy work, self-made task cards, self-made overhead transparencies, graphic materials, and utilization of the mass media are some of…

  2. Creating a Reference Toolbox.

    ERIC Educational Resources Information Center

    Scott, Jane

    1997-01-01

    To help students understand that references are tools used to locate specific information, one librarian has her third-grade students create their own reference toolboxes as she introduces dictionaries, atlases, encyclopedias, and thesauri. Presents a lesson plan to introduce print and nonprint thesauri to third and fourth graders and includes a…

  3. Creating Photo Illustrations.

    ERIC Educational Resources Information Center

    Wilson, Bradley

    2003-01-01

    Explains the uses of photo illustrations. Notes that the key to developing a successful photo illustration is collaborative planning. Outlines the following guidelines for photo illustrations: never set up a photograph to mimic reality; create only abstractions with photo illustrations; clearly label photo illustrations; and never play photo…

  4. Creating dedicated bioenergy crops

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Bioenergy is one of the current mechanisms of producing renewable energy to reduce our use of nonrenewable fossil fuels and to reduce carbon emissions into the atmosphere. Humans have been using bioenergy since we first learned to create and control fire - burning manure, peat, and wood to cook food...

  5. Create a Critter Collector.

    ERIC Educational Resources Information Center

    Hinchey, Elizabeth K.; Nestlerode, Janet A.

    2001-01-01

    Presents methods for creating appropriate ways of collecting live specimens to use for firsthand observation in the classroom. Suggests ecological questions for students to address using these devices. This project is ideal for schools that have access to piers or bridges on a coastal body of water. (NB)

  6. Creating a Market.

    ERIC Educational Resources Information Center

    Kazimirski, J.; And Others

    The second in a series of programmed books, "Creating a Market" is published by the International Labour Office as a manual for persons studying marketing. This manual was designed to meet the needs of the labor organization's technical cooperation programs and is primarily concerned with consumer goods industries. Using a fill-in-the-blanks and…

  7. Looking, Writing, Creating.

    ERIC Educational Resources Information Center

    Katzive, Bonnie

    1997-01-01

    Describes how a middle school language arts teacher makes analyzing and creating visual art a partner to reading and writing in her classroom. Describes a project on art and Vietnam which shows how background information can add to and influence interpretation. Describes a unit on Greek mythology and Greek vases which leads to a related visual…

  8. Creating an Interactive PDF

    ERIC Educational Resources Information Center

    Branzburg, Jeffrey

    2008-01-01

    There are many ways to begin a PDF document using Adobe Acrobat. The easiest and most popular way is to create the document in another application (such as Microsoft Word) and then use the Adobe Acrobat software to convert it to a PDF. In this article, the author describes how he used Acrobat's many tools in his project--an interactive…

  9. Creating a Classroom Makerspace

    ERIC Educational Resources Information Center

    Rivas, Luz

    2014-01-01

    What is a makerspace? Makerspaces are community-operated physical spaces where people (makers) create do-it-yourself projects together. These membership spaces serve as community labs where people learn together and collaborate on projects. Makerspaces often have tools and equipment like 3-D printers, laser cutters, and soldering irons.…

  10. Creating a Virtual Gymnasium

    ERIC Educational Resources Information Center

    Fiorentino, Leah H.; Castelli, Darla

    2005-01-01

    Physical educators struggle with the challenges of assessing student performance, providing feedback about motor skills, and creating opportunities for all students to engage in game-play on a daily basis. The integration of technology in the gymnasium can address some of these challenges by improving teacher efficiency and increasing student…

  11. Creating Special Events

    ERIC Educational Resources Information Center

    deLisle, Lee

    2009-01-01

    "Creating Special Events" is organized as a systematic approach to festivals and events for students who seek a career in event management. This book looks at the evolution and history of festivals and events and proceeds to the nuts and bolts of event management. The book presents event management as the means of planning, organizing, directing,…

  12. Creating Dialogue by Storytelling

    ERIC Educational Resources Information Center

    Passila, Anne; Oikarinen, Tuija; Kallio, Anne

    2013-01-01

    Purpose: The objective of this paper is to develop practice and theory from Augusto Boal's dialogue technique (Image Theatre) for organisational use. The paper aims to examine how the members in an organisation create dialogue together by using a dramaturgical storytelling framework where the dialogue emerges from storytelling facilitated by…

  13. Create Your State

    ERIC Educational Resources Information Center

    Dunham, Kris; Melvin, Samantha

    2011-01-01

    Students are often encouraged to work together with their classmates, sometimes with other classes, occasionally with kids at other schools, but rarely with kids across the country. In this article the authors describe the Create Your State project, a collaborative nationwide project inspired by the Texas Chair Project wherein the artist, Damien…

  14. Creating Quality Schools.

    ERIC Educational Resources Information Center

    American Association of School Administrators, Arlington, VA.

    This booklet presents information on how total quality management can be applied to school systems to create educational improvement. Total quality management offers education a systemic approach and a new set of assessment tools. Chapter 1 provides a definition and historical overview of total quality management. Chapter 2 views the school…

  15. Cryogenic Homogenization and Sampling of Heterogeneous Multi-Phase Feedstock

    SciTech Connect

    Doyle, Glenn M.; Ideker, Virgene D.; Siegwarth, James D.

    1999-09-21

    An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock; reducing the temperature of the feedstock to a temperature below a critical temperature; reducing the size of the feedstock components; blending the reduced-size feedstock to form a homogeneous mixture; and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (-196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.

  16. Cryogenic homogenization and sampling of heterogeneous multi-phase feedstock

    DOEpatents

    Doyle, Glenn Michael; Ideker, Virgene Linda; Siegwarth, James David

    2002-01-01

    An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock; reducing the temperature of the feedstock to a temperature below a critical temperature; reducing the size of the feedstock components; blending the reduced-size feedstock to form a homogeneous mixture; and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (-196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.

  17. Theoretical studies of homogeneous catalysts mimicking nitrogenase.

    PubMed

    Sgrignani, Jacopo; Franco, Duvan; Magistrato, Alessandra

    2011-01-01

    The conversion of molecular nitrogen to ammonia is a key biological and chemical process and represents one of the most challenging topics in chemistry and biology. In nature, the Mo-containing nitrogenase enzymes perform nitrogen 'fixation' via an iron molybdenum cofactor (FeMo-co) under ambient conditions. In contrast, industrially, the Haber-Bosch process reduces molecular nitrogen and hydrogen to ammonia with a heterogeneous iron catalyst under drastic conditions of temperature and pressure. This process accounts for the production of millions of tons of nitrogen compounds used for agricultural and industrial purposes, but the high temperature and pressure required result in a large energy loss, leading to several economic and environmental issues. During the last 40 years many attempts have been made to synthesize simple homogeneous catalysts that can activate dinitrogen under the same mild conditions as the nitrogenase enzymes. Several compounds, almost all containing transition metals, have been shown to bind and activate N₂ to various degrees. However, to date, Mo(N₂)(HIPTN)₃N, where (HIPTN)₃N = hexaisopropyl-terphenyl-triamidoamine, is the only compound performing this process catalytically. In this review we describe how Density Functional Theory calculations have helped to elucidate the reaction mechanisms of the inorganic compounds that activate or fix N₂. These studies have provided important insights that rationalize and complement the experimental findings on the reaction mechanisms of known catalysts, predict the reactivity of potential new catalysts, and help in tailoring new, efficient catalytic compounds. PMID:21221062

  18. Iterative and variational homogenization methods for filled elastomers

    NASA Astrophysics Data System (ADS)

    Goudarzi, Taha

    Elastomeric composites have increasingly proved invaluable in commercial technological applications due to their unique mechanical properties, especially their ability to undergo large reversible deformation in response to a variety of stimuli (e.g., mechanical forces, electric and magnetic fields, changes in temperature). Modern advances in organic materials science have revealed that elastomeric composites also hold tremendous potential to enable new high-end technologies, especially as the next generation of sensors and actuators, characterized by their low cost, biocompatibility, and processability into arbitrary shapes. This potential calls for an in-depth investigation of the macroscopic mechanical/physical behavior of elastomeric composites directly in terms of their microscopic behavior, with the objective of creating the knowledge base needed to guide their bottom-up design. The purpose of this thesis is to generate a mathematical framework to describe, explain, and predict the macroscopic nonlinear elastic behavior of filled elastomers, arguably the most prominent class of elastomeric composites, directly in terms of the behavior of their constituents --- i.e., the elastomeric matrix and the filler particles --- and their microstructure --- i.e., the content, size, shape, and spatial distribution of the filler particles. This will be accomplished via a combination of novel iterative and variational homogenization techniques capable of accounting for interphasial phenomena and finite deformations. Exact and approximate analytical solutions for the fundamental nonlinear elastic response of dilute suspensions of rigid spherical particles (either firmly bonded or bonded through finite size interphases) in Gaussian rubber are first generated. These results are in turn utilized to construct approximate solutions for the nonlinear elastic response of non-Gaussian elastomers filled with a random distribution of rigid particles (again, either firmly bonded or bonded through finite size interphases).

  19. Creating Geoscience Leaders

    NASA Astrophysics Data System (ADS)

    Buskop, J.; Buskop, W.

    2013-12-01

    The United Nations Educational, Scientific, and Cultural Organization recognizes 21 World Heritage Sites in the United States, ten of which have astounding geological features: Wrangell-St. Elias National Park, Olympic National Park, Mesa Verde National Park, Chaco Canyon, Glacier National Park, Carlsbad Caverns National Park, Mammoth Cave, Great Smoky Mountains National Park, Hawaii Volcanoes National Park, and Everglades National Park. Frustrated by fellow students who were addicted to smartphones and showed little interest in the geosciences, one student visited each World Heritage site in the United States and created one e-book chapter per park. Each chapter was created with original photographs and a geological discovery hunt to encourage teen involvement in preserving remarkable geological sites. Each chapter describes at least one way young adults can get involved with the geosciences, such as cave geology, glaciology, hydrology, and volcanology. The e-book covers one park per chapter, each chapter providing a geological discovery hunt, information on how to get involved with conservation of the parks, geological maps of the parks, parallels between archaeological and geological sites, and guidance on how to talk to a ranger. The young author is approaching UNESCO to publish the work as a free e-book to encourage involvement in UNESCO sites and to prove that the geosciences are fun.

  20. Turbulence in homogeneous shear flows

    NASA Astrophysics Data System (ADS)

    Pumir, Alain

    1996-11-01

    Homogeneous shear flows with an imposed mean velocity U=Syx̂ are studied in a periodic box of size Lx×Ly×Lz, in the statistically stationary turbulent state. In contrast with unbounded shear flows, the finite size of the system constrains the large-scale dynamics. The Reynolds number, defined by Re≡SLy²/ν, varies in the range 2600⩽Re⩽11300. The total kinetic energy and enstrophy in the volume of numerical integration have large peaks, resulting in fluctuations of kinetic energy of order 30%-50%. The mechanism leading to these fluctuations is very reminiscent of the "streaks" responsible for the violent bursts observed in turbulent boundary layers. The large-scale anisotropy of the flow, characterized by the two-point correlation tensor, depends on the aspect ratio of the system. The probability distribution functions (PDF) of the components of the velocity are found to be close to Gaussian. The physics of the Reynolds stress tensor, uv, is very similar to what is found experimentally in wall-bounded shear flows. The study of the two-point correlation tensor of the vorticity <ωiωj> suggests that the small scales become isotropic when the Reynolds number increases, as observed in high Reynolds number turbulent boundary layers. However, the skewness of the z component of vorticity is independent of the Reynolds number in this range, suggesting that some small-scale anisotropy remains even at very high Reynolds numbers. An analogy is drawn with the problem of turbulent mixing, where a similar anisotropy is observed.

  1. Homogeneous catalysts in hypersonic combustion

    SciTech Connect

    Harradine, D.M.; Lyman, J.L.; Oldenborg, R.C.; Pack, R.T.; Schott, G.L.

    1989-01-01

    Density and residence time both become unfavorably small for efficient combustion of hydrogen fuel in ramjet propulsion in air at high altitude and hypersonic speed. Raising the density and increasing the transit time of the air through the engine necessitates stronger contraction of the air flow area. This enhances the kinetic and thermodynamic tendency of H₂O to form completely, accompanied only by N₂ and any excess H₂ (or O₂). The by-products to be avoided are the energetically expensive fragment species H and/or O atoms and OH radicals, and residual (2H₂ plus O₂). However, excessive area contraction raises air temperature and consequent combustion-product temperature by adiabatic compression. This counteracts and ultimately overwhelms the thermodynamic benefit by which higher density favors the triatomic product, H₂O, over its monatomic and diatomic alternatives. For static pressures in the neighborhood of 1 atm, static temperature must be kept or brought below ca. 2400 K for acceptable stability of H₂O. Another measure, whose requisite chemistry we address here, is to extract propulsive work from the combustion products early in the expansion. The objective is to lower the static temperature of the combustion stream enough for H₂O to become adequately stable before the exhaust flow is massively expanded and its composition "frozen." We proceed to address this mechanism and its kinetics, and then examine prospects for enhancing its rate by homogeneous catalysts. 9 refs.

  2. Exploring earthquake databases for the creation of magnitude-homogeneous catalogues: tools for application on a regional and global scale

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Pagani, M.; Garcia, J.

    2016-09-01

    The creation of a magnitude-homogenized catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenizing multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Centre (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilize this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonize magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonized into moment magnitude to form a catalogue of more than 562 840 events. This extended catalogue, while not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
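
    The harmonization step described above amounts to applying a per-scale empirical conversion so that every event carries a magnitude on the common target scale. A minimal sketch in Python (the linear coefficients, event values, and helper names are illustrative assumptions, not the relations built by the authors' tools):

```python
# Hypothetical linear conversion from body-wave magnitude (mb) to moment
# magnitude (Mw). The slope and intercept are illustrative assumptions.
def mb_to_mw(mb, slope=1.0, intercept=0.0):
    """Apply a generic linear model Mw = slope * mb + intercept."""
    return slope * mb + intercept

def homogenize(events, converters):
    """Convert each event's native magnitude to the target (Mw) scale.

    events: list of dicts with 'scale' and 'mag' keys
    converters: mapping from native scale name to a conversion function;
                scales already in Mw pass through unchanged.
    """
    out = []
    for ev in events:
        convert = converters.get(ev["scale"], lambda m: m)
        out.append({"mag_mw": convert(ev["mag"]), **ev})
    return out

# Invented two-event catalogue for illustration
catalogue = [
    {"scale": "Mw", "mag": 6.1},
    {"scale": "mb", "mag": 5.4},
]
unified = homogenize(catalogue, {"mb": lambda m: mb_to_mw(m, 1.05, -0.3)})
```

In practice the conversion functions would be fitted (e.g., by orthogonal regression) from events common to both bulletins rather than assumed.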

  3. Design and testing of a refractive laser beam homogenizer

    NASA Astrophysics Data System (ADS)

    Fernelius, N. C.; Bradley, K. R.; Hoekstra, B. L.

    1984-09-01

    A survey is made of various techniques to create a homogeneous or flat-top laser beam profile. A refractive homogenizer was designed for use with a Nd:YAG laser with output at its fundamental (1.06 micrometer) and frequency-doubled (532 nm) modes. The system consists of a 2X beam expander and two faceted cylindrical lenses with differing focal lengths. Each cylindrical lens focuses its input into a strip the width of a facet. By orienting their axes at a 90 degree angle and focusing them on the same plane, the beam is concentrated into a square focus. Formulae for calculating the facet angles are derived, and a FORTRAN computer program was written to calculate them with a precision greater than one is able to fabricate them.
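
    The abstract does not reproduce the derived facet-angle formulae. As a rough geometric illustration only, under a thin-prism approximation a facet centered at lateral offset h must deflect rays by arctan(h/f) to reach the common focal plane, which for refractive index n requires a wedge angle of about arctan(h/f)/(n - 1). A sketch in Python with assumed parameter values (not the paper's derivation):

```python
import math

def facet_angles(n_facets, aperture, focal_length, n_index=1.45):
    """Thin-prism estimate of the wedge angle (radians) for each facet
    of one cylindrical faceted lens. All parameter values are assumed."""
    angles = []
    for i in range(n_facets):
        # lateral offset of this facet's centre from the optical axis
        h = (i - (n_facets - 1) / 2) * (aperture / n_facets)
        # deflection needed to send the facet's strip to the focal plane
        deflection = math.atan2(h, focal_length)
        angles.append(deflection / (n_index - 1.0))
    return angles

# 6 facets across a 20 mm aperture, 100 mm to the focal plane (assumed)
angles = facet_angles(n_facets=6, aperture=0.02, focal_length=0.1)
```

By symmetry, facets on opposite sides of the axis get equal and opposite wedge angles.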

  4. Homogenization of Heterogeneous Elastic Materials with Applications to Seismic Anisotropy

    NASA Astrophysics Data System (ADS)

    Vel, S. S.; Johnson, S. E.; Okaya, D. A.; Cook, A. C.

    2014-12-01

    The velocities of seismic waves passing through a complex Earth volume can be influenced by heterogeneities at length scales shorter than the seismic wavelength. As such, seismic wave propagation analyses can be performed by replacing the actual Earth volume with a homogeneous, i.e., "effective", elastic medium. Homogenization refers to the process by which the elastic stiffness tensor of the effective medium is "averaged" from the elastic properties, orientations, modal proportions and spatial distributions of the finer heterogeneities. When computing the homogenized properties of a heterogeneous material, the goal is to compute an effective or bulk elastic stiffness tensor that relates the average stresses to the average strains in the material. Tensor averaging schemes such as the Voigt and Reuss methods are based on certain simplifying assumptions. The Voigt method assumes spatially uniform strains while the Reuss method assumes spatially uniform stresses within the heterogeneous material. Although they are both physically unrealistic, they provide upper and lower bounds for the actual homogenized elastic stiffness tensor. In order to more precisely determine the homogenized stiffness tensor, the stress and strain distributions must be computed by solving the three-dimensional equations of elasticity over the heterogeneous region. Asymptotic expansion homogenization (AEH) is one such structure-based approach for the comprehensive micromechanical analysis of heterogeneous materials. Unlike modal volume methods, the AEH method takes into account how geometrical orientation and alignment can increase elastic stiffness in certain directions. We use the AEH method in conjunction with finite element analysis to calculate the bulk elastic stiffnesses of heterogeneous materials. In our presentation, wave speeds computed using the AEH method are compared with those generated using stiffness tensors derived from commonly used analytical estimates. The method is illustrated
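
    For a scalar modulus, the Voigt and Reuss bounds described above reduce to volume-weighted arithmetic and harmonic means. A minimal sketch in Python (the phase moduli and volume fractions below are illustrative assumptions):

```python
def voigt(moduli, fractions):
    # Uniform-strain assumption: arithmetic volume-weighted mean (upper bound)
    return sum(f * m for m, f in zip(moduli, fractions))

def reuss(moduli, fractions):
    # Uniform-stress assumption: harmonic volume-weighted mean (lower bound)
    return 1.0 / sum(f / m for m, f in zip(moduli, fractions))

K = [37.0, 75.0]   # bulk moduli of two phases in GPa (illustrative values)
f = [0.6, 0.4]     # volume fractions (must sum to 1)

K_upper = voigt(K, f)
K_lower = reuss(K, f)
assert K_lower <= K_upper  # the Reuss bound never exceeds the Voigt bound
```

The true homogenized modulus, e.g. from an AEH/finite-element computation, falls between these two values.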

  5. Effect of heat and homogenization on in vitro digestion of milk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Central to commercial fluid milk processing is the use of high temperature, short time (HTST) pasteurization to ensure the safety and quality of milk, and homogenization to prevent creaming of fat-containing milk. UHT processed homogenized milk is also available commercially and is typically used to...

  6. Creating new growth platforms.

    PubMed

    Laurie, Donald L; Doz, Yves L; Sheer, Claude P

    2006-05-01

    Sooner or later, most companies can't attain the growth rates expected by their boards and CEOs and demanded by investors. To some extent, such businesses are victims of their own successes. Many were able to sustain high growth rates for a long time because they were in high-growth industries. But once those industries slowed down, the businesses could no longer deliver the performance that investors had come to take for granted. Often, companies have resorted to acquisition, though this strategy has a discouraging track record. Over time, 65% of acquisitions destroy more value than they create. So where does real growth come from? For the past 12 years, the authors have been researching and advising companies on this issue. With the support of researchers at Harvard Business School and Insead, they instituted a project titled "The CEO Agenda and Growth". They identified and approached 24 companies that had achieved significant organic growth and interviewed their CEOs, chief strategists, heads of R&D, CFOs, and top-line managers. They asked, "Where does your growth come from?" and found a consistent pattern in the answers. All the businesses grew by creating new growth platforms (NGPs) on which they could build families of products and services and extend their capabilities into multiple new domains. Identifying NGP opportunities calls for executives to challenge conventional wisdom. In all the companies studied, top management believed that NGP innovation differed significantly from traditional product or service innovation. They had independent, senior-level units with a standing responsibility to create NGPs, and their CEOs spent as much as 50% of their time working with these units. The payoff has been spectacular and lasting. For example, from 1985 to 2004, the medical devices company Medtronic grew revenues at 18% per year, earnings at 20%, and market capitalization at 30%. PMID:16649700

  7. Cell-Laden Poly(ɛ-caprolactone)/Alginate Hybrid Scaffolds Fabricated by an Aerosol Cross-Linking Process for Obtaining Homogeneous Cell Distribution: Fabrication, Seeding Efficiency, and Cell Proliferation and Distribution

    PubMed Central

    Lee, HyeongJin; Ahn, SeungHyun; Bonassar, Lawrence J.; Chun, Wook

    2013-01-01

    Generally, solid-freeform fabricated scaffolds show a controllable pore structure (pore size, porosity, pore connectivity, and permeability) and mechanical properties by using computer-aided techniques. Although the scaffolds can provide repeated and appropriate pore structures for tissue regeneration, they have a low biological activity, such as low cell-seeding efficiency and nonuniform cell density in the scaffold interior after a long culture period, due to a large pore size and completely open pores. Here we fabricated three different poly(ɛ-caprolactone) (PCL)/alginate scaffolds: (1) a rapid prototyped porous PCL scaffold coated with an alginate, (2) the same PCL scaffold coated with a mixture of alginate and cells, and (3) a multidispensed hybrid PCL/alginate scaffold embedded with cell-laden alginate struts. The three scaffolds had similar micropore structures (pore size=430–580 μm, porosity=62%–68%, square pore shape). Preosteoblast cells (MC3T3-E1) were used at the same cell density in each scaffold. By measuring cell-seeding efficiency, cell viability, and cell distribution after various periods of culturing, we sought to determine which scaffold was more appropriate for homogeneously regenerated tissues. PMID:23469894

  8. Creating healthy camp experiences.

    PubMed

    Walton, Edward A; Tothy, Alison S

    2011-04-01

    The American Academy of Pediatrics has created recommendations for health appraisal and preparation of young people before participation in day or resident camps and to guide health and safety practices for children at camp. These recommendations are intended for parents, primary health care providers, and camp administration and health center staff. Although camps have diverse environments, there are general guidelines that apply to all situations and specific recommendations that are appropriate under special conditions. This policy statement has been reviewed and is supported by the American Camp Association. PMID:21444589

  9. Creating corporate advantage.

    PubMed

    Collis, D J; Montgomery, C A

    1998-01-01

    What differentiates truly great corporate strategies from the merely adequate? How can executives at the corporate level create tangible advantage for their businesses that makes the whole more than the sum of the parts? This article presents a comprehensive framework for value creation in the multibusiness company. It addresses the most fundamental questions of corporate strategy: What businesses should a company be in? How should it coordinate activities across businesses? What role should the corporate office play? How should the corporation measure and control performance? Through detailed case studies of Tyco International, Sharp, the Newell Company, and Saatchi and Saatchi, the authors demonstrate that the answers to all those questions are driven largely by the nature of a company's special resources--its assets, skills, and capabilities. These range along a continuum from the highly specialized at one end to the very general at the other. A corporation's location on the continuum constrains the set of businesses it should compete in and limits its choices about the design of its organization. Applying the framework, the authors point out the common mistakes that result from misaligned corporate strategies. Companies mistakenly enter businesses based on similarities in products rather than the resources that contribute to competitive advantage in each business. Instead of tailoring organizational structures and systems to the needs of a particular strategy, they create plain-vanilla corporate offices and infrastructures. The company examples demonstrate that one size does not fit all. One can find great corporate strategies all along the continuum. PMID:10179655

  10. Creating sustainable performance.

    PubMed

    Spreitzer, Gretchen; Porath, Christine

    2012-01-01

    What makes for sustainable individual and organizational performance? Employees who are thriving--not just satisfied and productive but also engaged in creating the future. The authors found that people who fit this description demonstrated 16% better overall performance, 125% less burnout, 32% more commitment to the organization, and 46% more job satisfaction than their peers. Thriving has two components: vitality, or the sense of being alive and excited, and learning, or the growth that comes from gaining knowledge and skills. Some people naturally build vitality and learning into their jobs, but most employees are influenced by their environment. Four mechanisms, none of which requires heroic effort or major resources, create the conditions for thriving: providing decision-making discretion, sharing information about the organization and its strategy, minimizing incivility, and offering performance feedback. Organizations such as Alaska Airlines, Zingerman's, Quicken Loans, and Caiman Consulting have found that helping people grow and remain energized at work is worthwhile on its own merits--but it can also boost performance in a sustainable way. PMID:22299508

  11. Creating breakthroughs at 3M.

    PubMed

    von Hippel, E; Thomke, S; Sonnack, M

    1999-01-01

    Most senior managers want their product development teams to create breakthroughs--new products that will allow their companies to grow rapidly and maintain high margins. But more often they get incremental improvements to existing products. That's partly because companies must compete in the short term. Searching for breakthroughs is expensive and time consuming; line extensions can help the bottom line immediately. In addition, developers simply don't know how to achieve breakthroughs, and there is usually no system in place to guide them. By the mid-1990s, the lack of such a system was a problem even for an innovative company like 3M. Then a project team in 3M's Medical-Surgical Markets Division became acquainted with a method for developing breakthrough products: the lead user process. The process is based on the fact that many commercially important products are initially thought of and even prototyped by "lead users"--companies, organizations, or individuals that are well ahead of market trends. Their needs are so far beyond those of the average user that lead users create innovations on their own that may later contribute to commercially attractive breakthroughs. The lead user process transforms the job of inventing breakthroughs into a systematic task of identifying lead users and learning from them. The authors explain the process and how the 3M project team successfully navigated through it. In the end, the team proposed three major new product lines and a change in the division's strategy that has led to the development of breakthrough products. And now several more divisions are using the process to break away from incrementalism. PMID:10621267

  12. Creating innovative departments.

    PubMed

    von Segesser, Ludwig K

    2004-12-01

    'Creating an innovative department' as an objective implies further improvements in organization, function, and progression of a surgical unit active in patient care, research, and education. It is of prime importance to stress here the mutual benefits of patient care, research (the basis for future patient care), and education (the channel for training health care professionals in future patient care). Neither innovation (from Latin innovare: to renew, revive) nor creation (from Latin creare: to make, produce) is something that will fall from heaven without effort any time soon. Hence, a pro-active attitude towards progress is indicated. This requires searching for new ideas, allocation of resources, finding allies, getting focused, and being persistent. One word says it all: WORK! PMID:15776856

  13. Creating With Carbon

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A subsidiary of SI Diamond Technology, Inc., Applied Nanotech, of Austin, Texas, is creating a buzz among various technology firms and venture capital groups interested in the company's progressive research on carbon-related field emission devices, including carbon nanotubes, filaments of pure carbon less than one ten-thousandth the width of a human hair. Since their discovery in 1991, carbon nanotubes have gained considerable attention due to their unique physical properties. For example, a single perfect carbon nanotube can range from 10 to 100 times stronger than steel, per unit weight. Recent studies also indicate that the nanotubes may be the best heat-conducting material in existence. These properties, combined with the ease of growing thin films or nanotubes by a variety of deposition techniques, make the carbon-based material one of the most desirable for cold field emission cathodes.

  14. Creating the living brand.

    PubMed

    Bendapudi, Neeli; Bendapudi, Venkat

    2005-05-01

    It's easy to conclude from the literature and the lore that top-notch customer service is the province of a few luxury companies and that any retailer outside that rarefied atmosphere is condemned to offer mediocre service at best. But even companies that position themselves for the mass market can provide outstanding customer-employee interactions and profit from them if they train employees to reflect the brand's core values. The authors studied the convenience store industry in depth and focused on two that have developed a devoted following: QuikTrip (QT) and Wawa. Turnover rates at QT and Wawa are 14% and 22% respectively, much lower than the typical rate in retail. The authors found six principles that both firms embrace to create a strong culture of customer service. Know what you're looking for: A focus on candidates' intrinsic traits allows the companies to hire people who will naturally bring the right qualities to the job. Make the most of talent: In mass-market retail, talent is generally viewed as a commodity, but that outlook becomes a self-fulfilling prophecy. Create pride in the brand: Service quality depends directly on employees' attachment to the brand. Build community: Wawa and QT have made concerted efforts to build customer loyalty through a sense of community. Share the business context: Employees need a clear understanding of how their company operates and how it defines success. Satisfy the soul: To win an employee's passionate engagement, a company must meet his or her needs for security, esteem, and justice. PMID:15929408

  15. Creating Griffith Observatory

    NASA Astrophysics Data System (ADS)

    Cook, Anthony

    2013-01-01

    Griffith Observatory has been the iconic symbol of the sky for southern California since it began its public mission on May 15, 1935. While the Observatory is widely known as being the gift of Col. Griffith J. Griffith (1850-1919), the story of how Griffith’s gift became reality involves many of the people better known for other contributions that made the Los Angeles area an important center of astrophysics in the 20th century. Griffith began drawing up his plans for an observatory and science museum for the people of Los Angeles after looking at Saturn through the newly completed 60-inch reflector on Mt. Wilson. He realized the social impact that viewing the heavens could have if made freely available, and discussed the idea of a public observatory with Mt. Wilson Observatory’s founder, George Ellery Hale, and its director, Walter Adams. This resulted, in 1916, in a will specifying many of the features of Griffith Observatory and establishing a committee-managed trust fund to build it. Astronomy popularizer Mars Baumgardt convinced the committee that the Zeiss planetarium projector would be appropriate for Griffith’s project after the planetarium was introduced in Germany in 1923. In 1930, the trust committee judged funds to be sufficient to start work on creating Griffith Observatory, and letters from the committee requesting help in realizing the project were sent to Hale, Adams, Robert Millikan, and other area experts then engaged in creating the 200-inch telescope eventually destined for Palomar Mountain. A Scientific Advisory Committee, headed by Millikan, recommended that Caltech physicist Edward Kurth be put in charge of building and exhibit design. Kurth, in turn, sought help from artist Russell Porter. The architecture firm of John C. Austin and Fredrick Ashley was selected to design the project, and they adopted the designs of Porter and Kurth. Philip Fox of the Adler Planetarium was enlisted to manage the completion of the Observatory and become its

  16. Energy cost of creating quantum coherence

    NASA Astrophysics Data System (ADS)

    Misra, Avijit; Singh, Uttam; Bhattacharya, Samyadeb; Pati, Arun Kumar

    2016-05-01

    We consider physical situations where the resource theories of coherence and thermodynamics play competing roles. In particular, we study the creation of quantum coherence using unitary operations with limited thermodynamic resources. We find the maximal coherence that can be created under unitary operations starting from a thermal state and find explicitly the unitary transformation that creates the maximal coherence. Since coherence is created by unitary operations starting from a thermal state, it requires some amount of energy. This motivates us to explore the trade-off between the amount of coherence that can be created and the energy cost of the unitary process. We also find the maximal achievable coherence under the constraint on the available energy. Additionally, we compare the maximal coherence and the maximal total correlation that can be created under unitary transformations with the same available energy at our disposal. We find that when maximal coherence is created with limited energy, the total correlation created in the process is upper bounded by the maximal coherence, and vice versa. For two-qubit systems we show that no unitary transformation exists that creates the maximal coherence and maximal total correlation simultaneously with a limited energy cost.
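
    The trade-off described above can be made concrete for a single qubit: rotate a thermal state and compare the coherence created with the energy spent. A toy sketch in Python (the Hamiltonian, temperature, restriction to real rotations, and the l1-norm coherence measure are illustrative choices, not the paper's general construction):

```python
import numpy as np

E, beta = 1.0, 1.0                    # level splitting and inverse temperature (assumed)
p = 1.0 / (1.0 + np.exp(-beta * E))   # ground-state population of the thermal qubit
rho = np.diag([p, 1.0 - p])           # thermal state (diagonal, incoherent)
H = np.diag([0.0, E])                 # Hamiltonian

def rotate(rho, theta):
    """Apply a real rotation (a simple unitary) to the state."""
    U = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return U @ rho @ U.T

def l1_coherence(rho):
    """Sum of absolute values of the off-diagonal elements."""
    return np.abs(rho - np.diag(np.diag(rho))).sum()

theta = np.pi / 4                     # maximizes the coherence within this family
rho_out = rotate(rho, theta)
coherence = l1_coherence(rho_out)                    # coherence created
energy_cost = np.trace(H @ (rho_out - rho)).real     # Tr[H(rho' - rho)] > 0
```

Sweeping theta from 0 to pi/4 traces out the trade-off: more coherence always costs more energy for this family of unitaries.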

  17. Homogenization of precipitation time series with ACMANT

    NASA Astrophysics Data System (ADS)

    Domonkos, Peter

    2015-10-01

    A new method for the time series homogenization of observed precipitation (PP) totals is presented; this method is a unit of the ACMANT software package. ACMANT is a relative homogenization method; a minimum of four time series with adequate spatial correlations is necessary for its use. The detection of inhomogeneities (IHs) is performed by fitting an optimal step function, while the calculation of adjustment terms is based on the minimization of the residual variance in the homogenized datasets. Together with the presentation of PP homogenization with ACMANT, some peculiarities of PP homogenization, such as the frequency and seasonal variation of IHs in observed PP data and their relation to the performance of homogenization methods, are discussed. In climatic regions with snowy winters, ACMANT distinguishes two seasons, namely a rainy season and a snowy season, and the seasonal IHs are searched for with bivariate detection. ACMANT is a fully automatic method, is freely downloadable from the internet, and treats either daily or monthly input. Series of observed data in the input dataset may cover different periods, and the occurrence of data gaps is allowed. False zero values instead of missing-data codes, as well as physical outliers, should be corrected before running ACMANT. Efficiency tests indicate that ACMANT belongs to the best-performing methods, although further comparative tests of automatic homogenization methods are needed to confirm or reject this finding.
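
    The step-function fitting used for detection can be illustrated in miniature: for a single candidate break, choose the split point that minimizes the summed squared deviation of each segment from its own mean. A hedged sketch in Python (ACMANT itself fits multi-break step functions to relative difference series; the synthetic record below is an invented example):

```python
def best_breakpoint(series):
    """Return (index, sse) of the single split minimizing total squared error
    when each segment is replaced by its own mean."""
    def sse(seg):
        if not seg:
            return 0.0
        mean = sum(seg) / len(seg)
        return sum((x - mean) ** 2 for x in seg)

    best = min(range(1, len(series)),
               key=lambda k: sse(series[:k]) + sse(series[k:]))
    return best, sse(series[:best]) + sse(series[best:])

# Synthetic record with an artificial shift of +2.0 starting at index 5
record = [10.1, 9.9, 10.0, 10.2, 9.8, 12.1, 11.9, 12.0, 12.2, 11.8]
k, residual = best_breakpoint(record)
```

A full method would extend this recursively (or by dynamic programming) to multiple breaks and penalize model complexity.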

  18. String pair production in non homogeneous backgrounds

    NASA Astrophysics Data System (ADS)

    Bolognesi, S.; Rabinovici, E.; Tallarita, G.

    2016-04-01

    We consider string pair production in non-homogeneous electric backgrounds. We study several particular configurations which can be addressed with the Euclidean world-sheet instanton technique, the analogue of the world-line instanton for particles. In the first case the string is suspended between two D-branes in flat space-time, in the second case the string lives in AdS and terminates on one D-brane (this realizes the holographic Schwinger effect). In some regions of parameter space the result is well approximated by the known analytical formulas, either the particle pair production in non-homogeneous background or the string pair production in homogeneous background. In other cases we see effects which are intrinsically stringy and related to the non-homogeneity of the background. The pair production is enhanced already for particles in time-dependent electric field backgrounds. The string nature enhances this even further. For spatially varying electric background fields the string pair production is less suppressed than the rate of particle pair production. We discuss in some detail how the critical field is affected by the non-homogeneity, for both time and space dependent electric field backgrounds. We also comment on what could be an interesting new prediction for the small field limit. The third case we consider is pair production in holographic confining backgrounds with homogeneous and non-homogeneous fields.

  19. Deforestation homogenizes tropical parasitoid-host networks.

    PubMed

    Laliberté, Etienne; Tylianakis, Jason M

    2010-06-01

    Human activities drive biotic homogenization (loss of regional diversity) of many taxa. However, whether species interaction networks (e.g., food webs) can also become homogenized remains largely unexplored. Using 48 quantitative parasitoid-host networks replicated through space and time across five tropical habitats, we show that deforestation greatly homogenized network structure at a regional level, such that interaction composition became more similar across rice and pasture sites compared with forested habitats. This was not simply caused by altered consumer and resource community composition, but was associated with altered consumer foraging success, such that parasitoids were more likely to locate their hosts in deforested habitats. Furthermore, deforestation indirectly homogenized networks in time through altered mean consumer and prey body size, which decreased in deforested habitats. Similar patterns were obtained with binary networks, suggesting that interaction (link) presence-absence data may be sufficient to detect network homogenization effects. Our results show that tropical agroforestry systems can support regionally diverse parasitoid-host networks, but that removal of canopy cover greatly homogenizes the structure of these networks in space, and to a lesser degree in time. Spatiotemporal homogenization of interaction networks may alter coevolutionary outcomes and reduce ecological resilience at regional scales, but may not necessarily be predictable from community changes observed within individual trophic levels. PMID:20583715

  20. Homogenization method based on the inverse problem

    SciTech Connect

    Tota, A.; Makai, M.

    2013-07-01

    We present a method for deriving homogeneous multi-group cross sections to replace a heterogeneous region's multi-group cross sections, provided that the fluxes and the currents on the external boundary, as well as the region-averaged fluxes, are preserved. The method is developed using the diffusion approximation to the neutron transport equation in a symmetrical slab geometry. Assuming that the boundary fluxes are given, two response matrices (RMs) can be defined: the first derives the boundary current from the boundary flux, the second derives the flux integral over the region from the boundary flux. Assuming that these RMs are known, we present a formula that reconstructs the multi-group cross-section matrix and the diffusion coefficients from the RMs of a homogeneous slab. Applying this formula to the RMs of a slab with multiple homogeneous regions yields a homogenization method that produces homogenized multi-group cross sections and diffusion coefficients such that the fluxes and the currents on the external boundary, and the region-averaged fluxes, are preserved. The method is based on the determination of the eigenvalues and eigenvectors of the RMs. We reproduce the four-group cross-section matrix and the diffusion constants from the RMs in numerical examples. We give conditions for replacing a heterogeneous region by a homogeneous one so that the boundary current and the region-averaged flux are preserved for a given boundary flux. (authors)
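
    The inverse-problem idea can be illustrated with a one-group, single-region analogue (a simplification for illustration, not the authors' multi-group eigenvalue method). For a homogeneous slab of half-width a with symmetric boundary flux phi_b, diffusion theory gives the two response ratios r_current = D*kappa*tanh(kappa*a) and r_avg = tanh(kappa*a)/(kappa*a), with kappa = sqrt(Sigma_a/D); these can be inverted to recover D and Sigma_a:

```python
import math

def recover_slab_constants(r_current, r_avg, a):
    """Invert the one-group diffusion response relations of a homogeneous
    slab of half-width a:
        r_current = |J| / phi_b   = D * kappa * tanh(kappa * a)
        r_avg     = phi_bar/phi_b = tanh(kappa * a) / (kappa * a)
    The second relation is monotone in kappa*a, so solve it by bisection,
    then back out D and the absorption cross section Sigma_a = D*kappa**2."""
    f = lambda x: math.tanh(x) / x - r_avg
    lo, hi = 1e-9, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    x = 0.5 * (lo + hi)          # kappa * a
    kappa = x / a
    D = r_current / (kappa * math.tanh(x))
    return D, D * kappa ** 2

# Forward model with known constants, then recover them from the responses.
D, sigma, a = 1.2, 0.05, 10.0
kappa = math.sqrt(sigma / D)
r1 = D * kappa * math.tanh(kappa * a)
r2 = math.tanh(kappa * a) / (kappa * a)
D_rec, sigma_rec = recover_slab_constants(r1, r2, a)
print(round(D_rec, 6), round(sigma_rec, 6))  # -> 1.2 0.05
```

    The paper's method generalizes this scalar inversion to matrices: the multi-group RMs are diagonalized, and the cross-section matrix is rebuilt from their eigenvalues and eigenvectors.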

  1. Encapsulation of volatiles by homogenized partially-cross linked alginates.

    PubMed

    Inguva, Pavan K; Ooi, Shing Ming; Desai, Parind M; Heng, Paul W S

    2015-12-30

    Cross-linked calcium alginate gels are too viscous to be efficaciously incorporated into spray-dried formulations. Thus, viscosity reduction is essential to make calcium alginate gels processable for spraying. Viscosity reduction by high-pressure homogenization can open new formulation possibilities. At present, testing of microcapsule integrity is also limited: single-particle tests neglect collective particle behaviour in bulk, while bulk testing methods often involve a single compression, which may not fully characterize individual particle strengths. The aim of this study was threefold. The first objective was to evaluate the impact of high-pressure homogenization on gel viscosity. The second objective was to explore the use of the homogenized gels with modified starch for microencapsulation by spray drying. The final objective was to develop a stamping system as a microcapsule strength tester that can assess microcapsules in bulk and evaluate the impact of multiple compressions. Collectively, this study works toward a pressure-activated patch of microcapsules with encapsulated volatiles and a method to assess the patch's efficacy. The alginate gels largely exhibited an exponential decay in viscosity when homogenized. Furthermore, the homogenized gels were successfully incorporated in spray-drying formulations for microencapsulation. The custom-designed microcapsule strength tester was successfully used and shown to possess the sensitivity required to discern different release profiles among batches of microcapsules containing volatiles. Addition of homogenized gels strengthened the microcapsules only at high wall-to-core ratios with low mass-load alginate gels. High mass-load gels weakened the microcapsules, exhibiting a higher release at low stamping pressures and wrinkling on the microcapsule surface. PMID:26581772

  2. Higher Order Macro Coefficients in Periodic Homogenization

    NASA Astrophysics Data System (ADS)

    Conca, Carlos; San Martin, Jorge; Smaranda, Loredana; Vanninathan, Muthusamy

    2011-09-01

    A first set of macro coefficients known as the homogenized coefficients appear in the homogenization of PDE on periodic structures. If energy is increased or scale is decreased, these coefficients do not provide adequate approximation. Using Bloch decomposition, it is first realized that the above coefficients correspond to the lowest energy and the largest scale. This naturally paves the way to introduce other sets of macro coefficients corresponding to higher energies and lower scales which yield better approximation. The next task is to compare their properties with those of the homogenized coefficients. This article reviews these developments along with some new results yet to be published.

  3. Numerical computation of homogeneous slope stability.

    PubMed

    Xiao, Shuangshuang; Li, Kemin; Ding, Xiaohua; Liu, Tong

    2015-01-01

    To simplify the computational process of homogeneous slope stability analysis, improve computational accuracy, and find multiple potential slip surfaces of a complex geometric slope, this study utilized the limit equilibrium method to derive expressions for the overall and partial factors of safety. The study transformed the search for the minimum factor of safety (FOS) into a constrained nonlinear programming problem and applied an exhaustive method (EM) and a particle swarm optimization algorithm (PSO) to it. In simple slope examples, the computational results using the EM and PSO were close to those obtained using other methods. Compared to the EM, the PSO had a small computation error and a significantly shorter computation time; as a result, the PSO could precisely calculate the slope FOS with high efficiency. The multistage slope example indicated that this slope had two potential slip surfaces, with factors of safety of 1.1182 and 1.1560, respectively. The differences between these and the minimum FOS (1.0759) were small, but the positions of the slip surfaces were completely different from the critical slip surface (CSS). PMID:25784927
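
    The PSO search described above can be sketched in bare-bones form. The quadratic objective below is a hypothetical smooth stand-in for a factor-of-safety surface, not the paper's limit-equilibrium expressions:

```python
import random

def pso_minimize(f, bounds, n_particles=30, iters=200, seed=0):
    """Plain particle swarm optimization: each particle tracks a personal
    best while the swarm shares a global best (here a surrogate for the
    search over candidate slip surfaces)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical stand-in for a FOS surface with minimum 1.0 at (1.5, -0.5).
fos = lambda p: 1.0 + (p[0] - 1.5) ** 2 + (p[1] + 0.5) ** 2
best, val = pso_minimize(fos, [(-5, 5), (-5, 5)])
print(round(val, 3))
```

    A real slope analysis would evaluate f by building a trial slip surface from the parameter vector and computing its limit-equilibrium FOS, which is what makes the derivative-free PSO search attractive.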

  4. Homogeneous cosmology with aggressively expanding civilizations

    NASA Astrophysics Data System (ADS)

    Olson, S. Jay

    2015-11-01

    In the context of a homogeneous Universe, we note that the appearance of aggressively expanding advanced life is geometrically similar to the process of nucleation and bubble growth in a first-order cosmological phase transition. We exploit this similarity to describe the dynamics of life saturating the Universe on a cosmic scale, adapting the phase transition model to incorporate probability distributions of expansion and resource consumption strategies. Through a series of numerical solutions spanning several orders of magnitude in the input assumption parameters, the resulting cosmological model is used to address basic questions related to the intergalactic spreading of life, dealing with issues such as timescales, observability, competition between strategies, and first-mover advantage. Finally, we examine physical effects on the Universe itself, such as reheating and the backreaction on the evolution of the scale factor, if such life is able to control and convert a significant fraction of the available pressureless matter into radiation. We conclude that the existence of life, if certain advanced technologies are practical, could have a significant influence on the future large-scale evolution of the Universe.

  5. Creating new market space.

    PubMed

    Kim, W C; Mauborgne, R

    1999-01-01

    Most companies focus on matching and beating their rivals. As a result, their strategies tend to take on similar dimensions. What ensues is head-to-head competition based largely on incremental improvements in cost, quality, or both. The authors have studied how innovative companies break free from the competitive pack by staking out fundamentally new market space--that is, by creating products or services for which there are no direct competitors. This path to value innovation requires a different competitive mind-set and a systematic way of looking for opportunities. Instead of looking within the conventional boundaries that define how an industry competes, managers can look methodically across them. By so doing, they can find unoccupied territory that represents real value innovation. Rather than looking at competitors within their own industry, for example, managers can ask why customers make the trade-off between substitute products or services. Home Depot, for example, looked across the substitutes serving home improvement needs. Intuit looked across the substitutes available to individuals managing their personal finances. In both cases, powerful insights were derived from looking at familiar data from a new perspective. Similar insights can be gleaned by looking across strategic groups within an industry; across buyer groups; across complementary product and service offerings; across the functional-emotional orientation of an industry; and even across time. To help readers explore new market space systematically, the authors developed a tool, the value curve, that can be used to represent visually a range of value propositions. PMID:10345394

  6. Creating alternatives in science

    PubMed Central

    2009-01-01

    Traditional scientist training at the PhD level does not prepare students to be competitive in biotechnology or other non-academic science careers. Some universities have developed biotechnology-relevant doctoral programmes, but most have not. Forming a life science career club makes a statement to university administrators that it is time to rework the curriculum to include biotechnology-relevant training. A career club can supplement traditional PhD training by introducing students to available career choices, help them develop a personal network and teach the business skills that they will need to be competitive in science outside of academia. This paper is an instructional guide designed to help students create a science career club at their own university. These suggestions are based on the experience gained in establishing such a club for the Graduate School at the University of Colorado Denver. We describe the activities that can be offered, the job descriptions for the offices required and potential challenges. With determination, a creative spirit, and the guidance of this paper, students should be able to greatly increase awareness of science career options, and begin building the skills necessary to become competitive in non-academic science. PMID:20161069

  7. Creating Sample Plans

    SciTech Connect

    Spears, Joseph H.; Seebode, Linda C.

    1999-03-24

    The program has been designed to increase the accuracy and reduce the preparation time for completing sampling plans. It consists of four files: 1. Analyte/Combination (AnalCombo): a list of analytes and combinations of analytes that can be requested from the onsite and offsite labs. Whenever a specific combination of analytes or suite names appears on the same line as a code number, this indicates that one sample can be placed in one bottle to be analyzed for these parameters. A code number is assigned to each analyte and combination of analytes. 2. Sampling Plans Database (SPDb): a database that contains all of the analytes and combinations of analytes along with the basic information required for preparing a sample plan. That basic information includes the following fields: matrix, hold time, preservation, sample volume, container size, whether the bottle caps are taped, and acceptable choices. 3. Sampling Plans Create (SPcreate): a file that looks up information from the Sampling Plans Database and from the Job Log File (JLF98), a major database used by Sample Management Services for recording more than 100 fields of information.

  8. Creating Sample Plans

    Energy Science and Technology Software Center (ESTSC)

    1999-03-24

    The program has been designed to increase the accuracy and reduce the preparation time for completing sampling plans. It consists of four files: 1. Analyte/Combination (AnalCombo): a list of analytes and combinations of analytes that can be requested from the onsite and offsite labs. Whenever a specific combination of analytes or suite names appears on the same line as a code number, this indicates that one sample can be placed in one bottle to be analyzed for these parameters. A code number is assigned to each analyte and combination of analytes. 2. Sampling Plans Database (SPDb): a database that contains all of the analytes and combinations of analytes along with the basic information required for preparing a sample plan. That basic information includes the following fields: matrix, hold time, preservation, sample volume, container size, whether the bottle caps are taped, and acceptable choices. 3. Sampling Plans Create (SPcreate): a file that looks up information from the Sampling Plans Database and from the Job Log File (JLF98), a major database used by Sample Management Services for recording more than 100 fields of information.

  9. Non-Homogeneous Fractal Hierarchical Weighted Networks

    PubMed Central

    Dong, Yujuan; Dai, Meifeng; Ye, Dandan

    2015-01-01

    A model of fractal hierarchical structures with the properties of non-homogeneous weighted networks is introduced. These networks can be completely and analytically characterized in terms of the involved parameters, i.e., the size of the original graph Nk and the non-homogeneous weight scaling factors r1, r2, ..., rM. We also study the average weighted shortest path (AWSP), the average degree and the average node strength on these non-homogeneous hierarchical weighted networks. Moreover, the AWSP is calculated exactly. We show that, in the infinite network order limit, the AWSP depends on the number of copies and on the sum of all non-homogeneous weight scaling factors. PMID:25849619
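
    The AWSP quantity can be illustrated on a toy weighted hierarchy. The construction below is a generic two-level example with a single weight scaling factor, not the paper's recursive fractal model:

```python
import heapq

def shortest_dists(adj, src):
    """Dijkstra over a weighted adjacency dict {node: [(nbr, weight), ...]}."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return dist

def awsp(adj):
    """Average weighted shortest path over all unordered node pairs."""
    nodes = sorted(adj)
    total, pairs = 0.0, 0
    for i, u in enumerate(nodes):
        dist = shortest_dists(adj, u)
        for v in nodes[i + 1:]:
            total += dist[v]
            pairs += 1
    return total / pairs

# Toy two-level hierarchy: a root joins two hubs (edge weight 1); each hub
# carries one leaf whose edge is scaled by a factor r = 0.5.
adj = {0: [(1, 1.0), (2, 1.0)],
       1: [(0, 1.0), (3, 0.5)],
       2: [(0, 1.0), (4, 0.5)],
       3: [(1, 0.5)],
       4: [(2, 0.5)]}
print(awsp(adj))  # -> 1.6
```

    For the self-similar networks in the paper this average is obtained analytically by exploiting the recursion between network orders rather than by explicit all-pairs search.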

  10. Producing tritium in a homogenous reactor

    DOEpatents

    Cawley, William E.

    1985-01-01

    A method and apparatus are described for the joint production and separation of tritium. Tritium is produced in an aqueous homogenous reactor and heat from the nuclear reaction is used to distill tritium from the lower isotopes of hydrogen.

  11. On homogeneous Einstein (α , β) -metrics

    NASA Astrophysics Data System (ADS)

    Yan, Zaili; Deng, Shaoqiang

    2016-05-01

    In this paper, we study homogeneous Einstein (α , β)-metrics. First, we deduce a formula for the Ricci curvature of a homogeneous (α , β)-metric. Based on this formula, we obtain a necessary and sufficient condition for a compact homogeneous (α , β)-metric to be Einstein with vanishing S-curvature. Moreover, we prove that any homogeneous Ricci-flat (α , β) space with vanishing S-curvature must be a Minkowski space. Finally, we consider left-invariant Einstein (α , β)-metrics on Lie groups with negative Ricci constant. Under some appropriate conditions, we show that the underlying Lie groups must be two-step solvable. We also present a more convenient necessary and sufficient condition for the metric to be Einstein in this special case.

  12. Homogeneous cosmological models in Yang's gravitation theory

    NASA Technical Reports Server (NTRS)

    Fennelly, A. J.; Pavelle, R.

    1979-01-01

    We present a dynamic, spatially homogeneous solution of Yang's pure space gravitational field equations which is non-Einsteinian. The predictions of this cosmological model seem to be at variance with observations.

  13. Create a Classroom Blog!

    ERIC Educational Resources Information Center

    Brunsell, Eric; Horejsi, Martin

    2010-01-01

    Science education blogs can serve as powerful digital lab notebooks that contain text, images, and videos. Each blog entry documents a moment in time, but becomes interactive with the addition of readers' comments. Blogs can provide a realistic experience of the peer-review process and generate evolving descriptions of observations through time.…

  14. Creating a Children's Village

    ERIC Educational Resources Information Center

    Roberts, Paul

    2012-01-01

    Five years ago the author embarked on an odyssey that would fundamentally change his life as an architect. He and his partner, Dave Deppen, were selected through a very competitive process to design a new Child Development and Family Studies Center in the Sierra Foothills, near Yosemite National Park for Columbia College. The Columbia College…

  15. Creating Photomontage Videos

    ERIC Educational Resources Information Center

    Nitzberg, Kevan

    2008-01-01

    Several years ago, the author began exploring the use of digital film and video as an art-making media when he took over instructing the video computer art class at the high school where he teaches. He found numerous ways to integrate a variety of multimedia technologies and software with more traditional types of visual art processes and…

  16. Effect of homogenization and pasteurization on the structure and thermal stability of whey protein in milk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The effect of homogenization alone or in combination with high temperature, short time (HTST) pasteurization or UHT processing on the whey fraction of milk was investigated using highly sensitive spectroscopic techniques. In pilot plant trials, 1-L quantities of whole milk were homogenized in a two-...

  17. Creating your own leadership brand.

    PubMed

    Kerfoot, Karlene

    2002-01-01

    Building equity in a brand happens through many encounters. The initial attraction must be followed by the meeting of expectations. This creates a loyalty that is part of an emotional connection to that brand. This is the same process people go through when they first meet a leader and decide if this is a person they want to buy into. People will examine your style, your competence, and your standards. If you fail on any of these fronts, your ability to lead will be severely compromised. People expect more of leaders now, because they know and recognize good leaders. And, predictably, people are now more cynical of leaders because of the well-publicized excess of a few leaders who advanced their own causes at the expense of their people and their financial future. This will turn out to be a good thing, because it will create a higher standard of leadership that all must aspire to achieve. When the bar is raised for us, our standards of performance are also raised. PMID:12424994

  18. Creating your own leadership brand.

    PubMed

    Kerfoot, Karlene

    2002-01-01

    Building equity in a brand happens through many encounters. The initial attraction must be followed by the meeting of expectations. This creates a loyalty that is part of an emotional connection to that brand. This is the same process people go through when they first meet a leader and decide if this is a person they want to buy into. People will examine your style, your competence, and your standards. If you fail on any of these fronts, your ability to lead will be severely compromised. People expect more of leaders now, because they know and recognize good leaders. And, predictably, people are now more cynical of leaders because of the well-publicized excess of a few leaders who advanced their own causes at the expense of their people and their financial future. This will turn out to be a good thing, because it will create a higher standard of leadership that all must aspire to achieve. When the bar is raised for us, our standards of performance are also raised. PMID:12382542

  19. Layout optimization using the homogenization method

    NASA Technical Reports Server (NTRS)

    Suzuki, Katsuyuki; Kikuchi, Noboru

    1993-01-01

    A generalized layout problem involving sizing, shape, and topology optimization is solved by using the homogenization method for three-dimensional linearly elastic shell structures in order to seek a possibility of establishment of an integrated design system of automotive car bodies, as an extension of the previous work by Bendsoe and Kikuchi. A formulation of a three-dimensional homogenized shell, a solution algorithm, and several examples of computing the optimum layout are presented in this first part of the two articles.

  20. Layout optimization using the homogenization method

    NASA Astrophysics Data System (ADS)

    Suzuki, Katsuyuki; Kikuchi, Noboru

    A generalized layout problem involving sizing, shape, and topology optimization is solved by using the homogenization method for three-dimensional linearly elastic shell structures in order to seek a possibility of establishment of an integrated design system of automotive car bodies, as an extension of the previous work by Bendsoe and Kikuchi. A formulation of a three-dimensional homogenized shell, a solution algorithm, and several examples of computing the optimum layout are presented in this first part of the two articles.

  1. Noncommutative complex structures on quantum homogeneous spaces

    NASA Astrophysics Data System (ADS)

    Ó Buachalla, Réamonn

    2016-01-01

    A new framework for noncommutative complex geometry on quantum homogeneous spaces is introduced. The main ingredients used are covariant differential calculi and Takeuchi's categorical equivalence for quantum homogeneous spaces. A number of basic results are established, producing a simple set of necessary and sufficient conditions for noncommutative complex structures to exist. Throughout, the framework is applied to the quantum projective spaces endowed with the Heckenberger-Kolb calculus.

  2. Recent advances in the understanding of homogeneous dielectric barrier discharges

    NASA Astrophysics Data System (ADS)

    Massines, F.; Gherardi, N.; Naudé, N.; Ségur, P.

    2009-08-01

    This paper reviews the state of the art in the physics of homogeneous dielectric barrier discharges at atmospheric pressure. It is based on an analysis of present and previous work on the behavior of these discharges and the conditions needed to obtain them. Mechanisms controlling the homogeneity during gas breakdown and during discharge development are discussed in turn. The breakdown has to be of the Townsend type: ionization has to be slow enough to avoid the development of a large avalanche. During breakdown, discharge homogeneity is governed by the ratio of secondary emission at the cathode (γ coefficient) to ionization in the gas bulk (α coefficient). The higher this ratio, the higher the pressure × gas gap product (Pd) value for which a Townsend breakdown is obtained. Phenomena enhancing secondary emission include the negative charging of the dielectric on the cathode surface, the trapping of ions in the gas, and the existence of excited states with lifetimes long compared to the time between two consecutive discharges. The first phenomenon is always present when the electrodes are covered by a solid dielectric, the second is related to the formation of a positive column, and the third is specific to the gas. During discharge development, homogeneity is mainly controlled by the voltage or the current imposed by the electrical circuit and electrode configuration, and by the gas's ability to be slowly ionized. The larger the contribution of a multi-step ionization process such as Penning ionization, the wider the working domain of the discharge. Decreasing the gas voltage during discharge development is one way to enhance the contribution of this process. After 20 years of research many mechanisms have been understood, but open questions remain, such as the nature of the inhibited homogeneous DBD, surface energy transfers, and the roles of attachment and detachment.
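
    The γ/α balance discussed above enters through the standard Townsend self-sustainment criterion, γ(e^{αd} − 1) ≥ 1. A quick numerical check with illustrative values (not taken from the paper):

```python
import math

def townsend_selfsustained(alpha, d, gamma):
    """Townsend self-sustainment criterion: the discharge sustains itself
    when gamma * (exp(alpha * d) - 1) >= 1, i.e. each avalanche of
    exp(alpha*d) electrons frees at least one new electron at the cathode."""
    return gamma * (math.exp(alpha * d) - 1.0) >= 1.0

# For a weak-avalanche gap (alpha*d = 4), secondary emission must supply
# roughly 1/(e^4 - 1) ~ 1.9% of electrons to sustain a Townsend discharge.
print(townsend_selfsustained(alpha=4.0, d=1.0, gamma=0.02))  # -> True
print(townsend_selfsustained(alpha=4.0, d=1.0, gamma=0.01))  # -> False
```

    This is why enhancing γ (dielectric surface charging, trapped ions, long-lived excited states) lets a Townsend breakdown persist at smaller αd, and hence at larger Pd, as described above.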

  3. ISO 55000: Creating an asset management system.

    PubMed

    Bradley, Chris; Main, Kevin

    2015-02-01

    In the October 2014 issue of HEJ, Keith Hamer, group vice-president, Asset Management & Engineering at Sodexo, and marketing director at Asset Wisdom, Kevin Main, argued that the new ISO 55000 standards present facilities managers with an opportunity to create 'a joined-up, whole lifecycle approach' to managing and delivering value from assets. In this article, Kevin Main and Chris Bradley, who runs various asset management projects, examine the process of creating an asset management system. PMID:26268021

  4. Creating Math Videos: Comparing Platforms and Software

    ERIC Educational Resources Information Center

    Abbasian, Reza O.; Sieben, John T.

    2016-01-01

    In this paper we present a short tutorial on creating mini-videos using two platforms--PCs and tablets such as iPads--and software packages that work with these devices. Specifically, we describe the step-by-step process of creating and editing videos using a Wacom Intuos pen-tablet plus Camtasia software on a PC platform and using the software…

  5. A modified homogeneous freezing rate parameterization for aqueous solution droplets

    NASA Astrophysics Data System (ADS)

    Moehler, O.; Benz, S.; Hoehler, K.; Wagner, R.

    2012-12-01

    It is still a matter of debate whether cirrus cloud formation is dominated by heterogeneous ice nucleation, leading to low ice crystal number concentrations, or is also influenced by homogeneous freezing of solution aerosols, leading to higher ice crystal number concentrations. Part of the discussion stems from the fact that current models seem to overestimate ice crystal numbers from homogeneous freezing compared to measurements, even though the formation rate of cirrus ice crystals by homogeneous freezing of aqueous particles is believed to be well understood and formulated in terms of, e.g., the concept of effective freezing temperatures or water-activity-dependent ice nucleation rates. A series of recent cirrus cloud simulation experiments at the AIDA cloud chamber facility of the Karlsruhe Institute of Technology, at temperatures between -40°C and -80°C, together with process modeling studies, demonstrated that the freezing formulations tend to show a low bias in the humidity onset thresholds for homogeneous ice formation at temperatures below about 210 K, and furthermore overestimate the ice formation rate by at least a factor of 2. The experimental results will be summarized and a new empirical fit to the experimental data will be suggested for use in atmospheric models.

  6. A homogeneous superconducting magnet design using a hybrid optimization algorithm

    NASA Astrophysics Data System (ADS)

    Ni, Zhipeng; Wang, Qiuliang; Liu, Feng; Yan, Luguang

    2013-12-01

    This paper employs a hybrid optimization algorithm combining linear programming (LP) and nonlinear programming (NLP) to design highly homogeneous superconducting magnets for magnetic resonance imaging (MRI). The work is divided into two stages. The first, LP stage provides a global optimal current map with several non-zero current clusters; the mathematical model for the LP was updated to take into account the maximum axial and radial magnetic field strength limitations. In the second, NLP stage, the non-zero current clusters were discretized into practical solenoids. Superconducting conductor consumption was set as the objective function in both the LP and NLP stages to minimize construction cost. In addition, the peak-to-peak homogeneity over the volume of imaging (VOI), the extent of the 5 gauss fringe field, and the maximum magnetic field strength within the superconducting coils were set as constraints. The detailed design process for a dedicated 3.0 T animal MRI scanner is presented. The homogeneous magnet produces a field quality of 6.0 ppm peak-to-peak homogeneity over a 16 cm by 18 cm elliptical VOI, and the 5 gauss fringe field is confined within a 1.5 m by 2.0 m elliptical region.

  7. Creating a Toilet Training Plan

    MedlinePlus

    These are the tools ... will need to create your own toilet-training plan and implement it at the best time for ...

  8. Simulator for SUPO, a Benchmark Aqueous Homogeneous Reactor (AHR)

    SciTech Connect

    Klein, Steven Karl; Determan, John C.

    2015-10-14

    A simulator has been developed for SUPO (Super Power) an aqueous homogeneous reactor (AHR) that operated at Los Alamos National Laboratory (LANL) from 1951 to 1974. During that period SUPO accumulated approximately 600,000 kWh of operation. It is considered the benchmark for steady-state operation of an AHR. The SUPO simulator was developed using the process that resulted in a simulator for an accelerator-driven subcritical system, which has been previously reported.

  9. A criterion for assessing homogeneity distribution in hyperspectral images. Part 2: application of homogeneity indices to solid pharmaceutical dosage forms.

    PubMed

    Rosas, Juan G; Blanco, Marcelo

    2012-11-01

    This article is the second in a series of two detailing the application of mixing indices to assess homogeneity distribution in oral pharmaceutical solid dosage forms by image analysis. Chemical imaging (CI) is an emerging technique integrating conventional imaging and spectroscopic techniques with a view to obtaining spatial and spectral information from a sample. Near infrared chemical imaging (NIR-CI) has proved an excellent analytical tool for extracting high-quality information from sample surfaces. The primary objective of this second part was to demonstrate that the approach developed in the first part could be successfully applied to near infrared hyperspectral images of oral pharmaceutical solid dosage forms such as coated, uncoated and effervescent tablets, as well as to powder blends. To this end, we assessed a new criterion for establishing mixing homogeneity by using four different methods based on a three-dimensional (M×N×λ) data array of hyperspectral images (spectral standard deviations and correlation coefficients) or a two-dimensional (M×N) data array (concentration maps and binary images). The four methods applied macropixel analysis to the Poole (M(P)) and homogeneity (H%(Poole)) indices. Both indices proved useful for assessing the degree of homogeneity of pharmaceutical samples. The results testify that the proposed approach can be effectively used in the pharmaceutical industry, both on finished products (e.g., tablets) and in mixing unit operations, for example as a process analytical technology tool for blend monitoring (see part 1). PMID:22840977
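The macropixel idea can be illustrated with a toy index: partition a concentration map into macropixels and score how much their means disagree. The percent index below is a simplified stand-in for the paper's M(P) and H%(Poole) indices, not their published formulas, and the maps are synthetic.

```python
import numpy as np

def macropixel_means(conc_map, m):
    """Average an (M, N) concentration map over non-overlapping m x m macropixels."""
    M, N = conc_map.shape
    trimmed = conc_map[:M - M % m, :N - N % m]
    return trimmed.reshape(M // m, m, N // m, m).mean(axis=(1, 3))

def homogeneity_percent(conc_map, m=4):
    """100% when all macropixel means agree; smaller for patchier maps.
    A simplified stand-in for the H%(Poole) index discussed above."""
    means = macropixel_means(conc_map, m)
    return 100.0 * (1.0 - means.std() / means.mean())

rng = np.random.default_rng(0)
uniform = 0.5 + 0.01 * rng.standard_normal((64, 64))   # well-mixed blend
patchy = uniform.copy()
patchy[:32, :32] += 0.4                                # segregated component
print(homogeneity_percent(uniform), homogeneity_percent(patchy))
```

A well-mixed map scores close to 100%, while the map with a segregated quadrant scores markedly lower, mirroring how the indices discriminate blend quality.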

  10. Preparation and characterization of paclitaxel nanosuspension using novel emulsification method by combining high speed homogenizer and high pressure homogenization.

    PubMed

    Li, Yong; Zhao, Xiuhua; Zu, Yuangang; Zhang, Yin

    2015-07-25

    The aim of this study was to develop an alternative, more bioavailable, better tolerated paclitaxel nanosuspension (PTXNS) for intravenous injection, in comparison with the commercially available Taxol(®) formulation. PTXNS was prepared by an emulsification method combining a high speed homogenizer with high pressure homogenization, followed by a lyophilization process for intravenous administration. The main production parameters, including the volume ratio of organic phase to water plus organic phase (Vo:Vw+o), concentration of PTX, content of PTX, emulsification time (Et), homogenization pressure (HP) and number of passes (Ps) for high pressure homogenization, were optimized, and their effects on the mean particle size (MPS) and particle size distribution (PSD) of PTXNS were investigated. The characteristics of PTXNS, such as surface morphology, physical status of paclitaxel (PTX) in PTXNS, redispersibility of PTXNS in purified water, in vitro dissolution and in vivo bioavailability, were all investigated. The PTXNS obtained under optimum conditions had an MPS of 186.8 nm and a zeta potential (ZP) of -6.87 mV. The PTX content in PTXNS was approximately 3.42%. Moreover, the residual amount of chloroform was lower than the International Conference on Harmonization limit (60 ppm) for solvents. The dissolution study indicated that PTXNS dissolved faster than raw PTX and showed a sustained-dissolution character compared with the Taxol(®) formulation. Moreover, the bioavailability of PTXNS was 14.38 and 3.51 times that of raw PTX and the Taxol(®) formulation, respectively. PMID:26027492

  11. Effect of non-homogenous thermal stress during sub-lethal photodynamic antimicrobial chemotherapy

    NASA Astrophysics Data System (ADS)

    Gadura, N.; Kokkinos, D.; Dehipawala, S.; Cheung, E.; Sullivan, R.; Subramaniam, R.; Schneider, P.; Tremberger, G., Jr.; Holden, T.; Lieberman, D.; Cheung, T.

    2012-03-01

    Pathogens can be inactivated by a light source coupled with a photosensitizing agent in photodynamic antimicrobial chemotherapy (PACT). This project studied the effect of a non-homogeneous substrate on cell colonies. The non-homogeneity could be controlled by iron oxide nanoparticle doping in porous glassy substrates, such that each cell would experience tens of hot spots when illuminated with an additional light source. The substrate non-homogeneity was characterized by Atomic Force Microscopy, Transmission Electron Microscopy and Extended X-Ray Absorption Fine Structure at the Brookhaven Synchrotron Light Source. Microscopy images of cell motion were used to study motility. Laboratory cell colonies on non-homogeneous substrates exhibit reduced motility similar to that observed with sub-lethal PACT treatment. Such motility reduction on a non-homogeneous substrate is interpreted as the presence of thermal stress. The studied pathogens included E. coli and Pseudomonas aeruginosa. The non-pathogenic microbe Bacillus subtilis was also studied for comparison. The results show that sub-lethal PACT could be effective with additional non-homogeneous thermal stress. The use of non-uniform illumination on a homogeneous substrate to create thermal stress on a sub-micron length scale is discussed via light correlation in propagation through a random medium. Extending sub-lethal PACT with complementary thermal stress would be an appropriate application.

  12. Analysis of homogeneous/non-homogeneous nanofluid models accounting for nanofluid-surface interactions

    NASA Astrophysics Data System (ADS)

    Ahmad, R.

    2016-07-01

    This article reports an unbiased analysis of water-based, rod-shaped alumina nanoparticles, considering both the homogeneous and non-homogeneous nanofluid models over the coupled nanofluid-surface interface. The mechanics of the surface are found for both the homogeneous and non-homogeneous models, which were ignored in previous studies. The viscosity and thermal conductivity data are taken from the international nanofluid property benchmark exercise. All simulations use experimentally verified results. By considering the homogeneous and non-homogeneous models, the precise movement of the alumina nanoparticles over the surface has been observed by solving the corresponding system of differential equations. For the non-homogeneous model, a uniform temperature and nanofluid volume fraction are assumed at the surface, and the flux of the alumina nanoparticles is taken as zero. The assumption of zero nanoparticle flux at the surface makes the non-homogeneous model physically more realistic. The differences between all profiles for the homogeneous and non-homogeneous models are insignificant, owing to small deviations in the values of the Brownian motion and thermophoresis parameters.

  13. Homogenization of Periodic Systems with Large Potentials

    NASA Astrophysics Data System (ADS)

    Allaire, Grégoire; Capdeboscq, Yves; Piatnitski, Andrey; Siess, Vincent; Vanninathan, M.

    2004-11-01

    We consider the homogenization of a system of second-order equations with a large potential in a periodic medium. Denoting by ɛ the period, the potential is scaled as ɛ^-2. Under a generic assumption on the spectral properties of the associated cell problem, we prove that the solution can be approximately factorized as the product of a fast oscillating cell eigenfunction and of a slowly varying solution of a scalar second-order equation. This result applies to various types of equations, such as parabolic, hyperbolic or eigenvalue problems, as well as the fourth-order plate equation. We also prove that, for well-prepared initial data concentrating at the bottom of a Bloch band, the resulting homogenized tensor depends on the chosen Bloch band. Our method is based on a combination of classical homogenization techniques (two-scale convergence and suitable oscillating test functions) and of Bloch wave decomposition.
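In schematic form (notation chosen here for illustration, not taken from the paper), the factorization result for the parabolic case reads:

```latex
% Periodic coefficients with period \varepsilon and an \varepsilon^{-2} potential:
\partial_t u_\varepsilon
  - \operatorname{div}\!\left(A\!\left(\tfrac{x}{\varepsilon}\right)\nabla u_\varepsilon\right)
  + \varepsilon^{-2}\, c\!\left(\tfrac{x}{\varepsilon}\right) u_\varepsilon = 0 .
% With (\lambda, \psi) an eigenpair of the associated periodic cell problem,
% the solution approximately factorizes as
u_\varepsilon(t,x) \;\approx\; e^{-\lambda t/\varepsilon^{2}}\,
  \psi\!\left(\tfrac{x}{\varepsilon}\right)\, v(t,x),
% where the slowly varying v solves a scalar second-order homogenized
% equation whose effective tensor depends on the chosen Bloch band.
```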

  14. Rapid biotic homogenization of marine fish assemblages.

    PubMed

    Magurran, Anne E; Dornelas, Maria; Moyes, Faye; Gotelli, Nicholas J; McGill, Brian

    2015-01-01

    The role human activities play in reshaping biodiversity is increasingly apparent in terrestrial ecosystems. However, the responses of entire marine assemblages are not well understood, in part because few monitoring programs incorporate both spatial and temporal replication. Here, we analyse an exceptionally comprehensive 29-year time series of North Atlantic groundfish assemblages monitored over 5° of latitude to the west of Scotland. These fish assemblages show no systematic change in species richness through time, but steady change in species composition, leading to an increase in spatial homogenization: the species identity of colder northern localities increasingly resembles that of warmer southern localities. This biotic homogenization mirrors the spatial pattern of unevenly rising ocean temperatures over the same time period, suggesting that climate change is primarily responsible for the spatial homogenization we observe. In this and other ecosystems, apparent constancy in species richness may mask major changes in species composition driven by anthropogenic change. PMID:26400102
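Spatial homogenization of this kind is usually quantified as a rise, over time, in pairwise compositional similarity between localities. A minimal sketch with Jaccard similarity on hypothetical species lists (the species and years below are invented, not the study's data):

```python
def jaccard(a, b):
    """Jaccard similarity between two species sets: |A∩B| / |A∪B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Hypothetical northern and southern groundfish assemblages at two times.
north_1985 = {"cod", "haddock", "ling"}
south_1985 = {"hake", "megrim", "ling"}
north_2013 = {"cod", "hake", "ling", "megrim"}   # warm-water species arrive
south_2013 = {"hake", "megrim", "ling"}

print(jaccard(north_1985, south_1985))  # similarity between localities, early
print(jaccard(north_2013, south_2013))  # similarity between localities, late
```

A rising value means the two localities have become more alike in composition, i.e. biotic homogenization, even if each locality's species richness stayed roughly constant.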

  15. Method of Mapping Anomalies in Homogenous Material

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E. (Inventor); Taylor, Bryant D. (Inventor)

    2016-01-01

    An electrical conductor and antenna are positioned in a fixed relationship to one another. Relative lateral movement is generated between the electrical conductor and a homogenous material while maintaining the electrical conductor at a fixed distance from the homogenous material. The antenna supplies a time-varying magnetic field that causes the electrical conductor to resonate and generate harmonic electric and magnetic field responses. Disruptions in at least one of the electric and magnetic field responses during this lateral movement are indicative of a lateral location of a subsurface anomaly. Next, relative out-of-plane movement is generated between the electrical conductor and the homogenous material in the vicinity of the anomaly's lateral location. Disruptions in at least one of the electric and magnetic field responses during this out-of-plane movement are indicative of a depth location of the subsurface anomaly. A recording of the disruptions provides a mapping of the anomaly.
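The lateral-location step of the method above can be caricatured numerically: scan a position axis, compare the resonant response to its undisturbed baseline, and take the point of largest disruption as the anomaly's lateral location. The response shape and all numbers are invented for illustration.

```python
import numpy as np

# Toy lateral scan: the resonant response dips where a subsurface anomaly
# disrupts the electric/magnetic field response (shapes invented).
x = np.linspace(0.0, 100.0, 501)             # lateral position (mm)
baseline = np.ones_like(x)                   # undisturbed resonance amplitude
response = baseline - 0.3 * np.exp(-((x - 62.0) / 3.0) ** 2)  # disruption

x_anomaly = x[np.argmax(baseline - response)]   # position of largest disruption
print(x_anomaly)
```

A second, out-of-plane scan in the vicinity of `x_anomaly` would then play the same game along the depth axis to complete the mapping.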

  16. How was Taiwan created?

    NASA Astrophysics Data System (ADS)

    Sibuet, Jean-Claude; Hsu, Shu-Kun

    2004-02-01

    Since the beginning of formation of proto-Taiwan during late Miocene (9 Ma), the subducting Philippine (PH) Sea plate moved continuously through time in the N307° direction at a 5.6 cm/year velocity with respect to Eurasia (EU), tearing the Eurasian plate. Strain states within the EU crust are different on each side of the western PH Sea plate boundary (extensional in the Okinawa Trough and northeastern Taiwan versus contractional for the rest of Taiwan Island). The B feature corresponds to the boundary between the continental and oceanic parts of the subducting Eurasian plate and lies in the prolongation of the ocean-continent boundary of the northern South China Sea. Strain rates in the Philippines to northern Taiwan accretionary prism are similar on each side of B (contractional), though with different strain directions, perhaps in relation with the change of nature of the EU slab across B. Consequently, in the process of Taiwan mountain building, the deformation style was probably not changing continuously from the Manila to the Ryukyu subduction zones. The Luzon intra-oceanic arc only formed south of B, above the subducting Eurasian oceanic lithosphere. North of B, the Luzon arc collided with EU simultaneously with the eastward subduction of a portion of EU continental lithosphere beneath the Luzon arc. In its northern portion, the lower part of the Luzon arc was subducting beneath Eurasia while the upper part accreted against the Ryukyu forearc. Among the consequences of such a simple geodynamic model: (i) The notion of continuum from subduction to collision might be questioned. (ii) Traces of the Miocene volcanic arc were never found in the southwestern Ryukyu arc. We suggest that the portion of EU continental lithosphere, which has subducted beneath the Coastal Range, might include the Miocene Ryukyu arc volcanoes formed west of 126°E longitude and which are missing today. (iii) The 150-km-wide oceanic domain located south of B between the Luzon arc and

  17. Commensurability effects in holographic homogeneous lattices

    NASA Astrophysics Data System (ADS)

    Andrade, Tomas; Krikun, Alexander

    2016-05-01

    An interesting application of the gauge/gravity duality to condensed matter physics is the description of a lattice via breaking translational invariance on the gravity side. By making use of global symmetries, it is possible to do so without sacrificing homogeneity of the pertinent bulk solutions, which we thus term "homogeneous holographic lattices." Owing to their technical simplicity, these configurations have received a great deal of attention in the last few years and have been shown to correctly describe momentum relaxation and hence (finite) DC conductivities.

  18. Homogeneous and heterogeneous reactions of phenanthrene with ozone

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Yang, Bo; Meng, Junwang; Gao, Shaokai; Dong, Xinyu; Shu, Jinian

    2010-02-01

    The reactions of gas-phase phenanthrene and suspended phenanthrene particles with ozone were conducted in a 200 L chamber. Secondary organic aerosol formation was observed in the reaction of gas-phase phenanthrene with ozone, and the size distribution of the secondary organic aerosol was simultaneously monitored with a scanning mobility particle sizer during the formation process. The particulate ozonation products from both reactions were analyzed with a vacuum ultraviolet photoionization aerosol time-of-flight mass spectrometer. 2,2'-Diformylbiphenyl was identified as the dominant product in both the homogeneous and heterogeneous reactions of phenanthrene with ozone. GC/MS analysis of the ozonation products of phenanthrene in glacial acetic acid was carried out to assign the time-of-flight mass spectra of reaction products formed in the homogeneous and heterogeneous reactions.

  19. Homogeneity study of ointment dosage forms by infrared imaging spectroscopy.

    PubMed

    Carneiro, Renato Lajarim; Poppi, Ronei Jesus

    2012-01-25

    Ointment dosage forms are semi-solid preparations intended for local or transdermal delivery of active substances, usually applied to the skin, and it is important that they present a homogeneous appearance. In this work, the homogeneity of a tacrolimus ointment dosage form was studied using infrared imaging spectroscopy coupled with principal component analysis (PCA) and multivariate curve resolution with alternating least squares (MCR-ALS) to interpret the imaging data. Optical visible microscopy images indicated possible phase separation in the ointment and, based on the distribution concentration maps from infrared imaging, it was possible to conclude that there was in fact phase separation in the ointment. Thus, infrared imaging spectroscopy associated with PCA and MCR-ALS is demonstrated to be a powerful tool for the development process of ointment dosage forms. PMID:22018891

  20. Low-gravity homogenization and solidification of aluminum antimonide. [Apollo-Soyuz test project

    NASA Technical Reports Server (NTRS)

    Ang, C.-Y.; Lacy, L. L.

    1976-01-01

    The III-V semiconducting compound AlSb shows promise as a highly efficient solar cell material, but it has not been commercially exploited because of difficulties in compound synthesis. Liquid state homogenization and solidification of AlSb were carried out in the Apollo-Soyuz Test Project Experiment MA-044 in the hope that compositional homogeneity would be improved by negating the large density difference between the two constituents. Post-flight analysis and comparative characterization of the space-processed and ground-processed samples indicate that there are major homogeneity improvements in the low-gravity solidified material.

  1. A Story Approach to Create Online College Courses

    ERIC Educational Resources Information Center

    Romero, Liz

    2016-01-01

    The purpose of this article is to describe the implementation of a story approach to create online courses in a college environment. The article describes the components of the approach and the implementation process to create a nursing and a language course. The implementation starts with the identification of the need and follows by creating a…

  2. General Theorems about Homogeneous Ellipsoidal Inclusions

    ERIC Educational Resources Information Center

    Korringa, J.; And Others

    1978-01-01

    Mathematical theorems about the properties of ellipsoids are developed. Included are Poisson's theorem concerning the magnetization of a homogeneous body of ellipsoidal shape, the polarization of a dielectric, the transport of heat or electricity through an ellipsoid, and other problems. (BB)

  3. Homogeneous Immunoassays: Historical Perspective and Future Promise

    NASA Astrophysics Data System (ADS)

    Ullman, Edwin F.

    1999-06-01

    The founding and growth of Syva Company is examined in the context of its leadership role in the development of homogeneous immunoassays. The simple mix and read protocols of these methods offer advantages in routine analytical and clinical applications. Early homogeneous methods were based on insensitive detection of immunoprecipitation during antigen/antibody binding. The advent of reporter groups in biology provided a means of quantitating immunochemical binding by labeling antibody or antigen and physically separating label incorporated into immune complexes from free label. Although high sensitivity was achieved, quantitative separations were experimentally demanding. Only when it became apparent that reporter groups could provide information, not only about the location of a molecule but also about its microscopic environment, was it possible to design practical non-separation methods. The evolution of early homogeneous immunoassays was driven largely by the development of improved detection strategies. The first commercial spin immunoassays, developed by Syva for drug abuse testing during the Vietnam War, were followed by increasingly powerful methods such as immunochemical modulation of enzyme activity, fluorescence, and photo-induced chemiluminescence. Homogeneous methods that quantify analytes at femtomolar concentrations within a few minutes now offer important new opportunities in clinical diagnostics, nucleic acid detection and drug discovery.

  4. Extension theorems for homogenization on lattice structures

    NASA Technical Reports Server (NTRS)

    Miller, Robert E.

    1992-01-01

    When applying homogenization techniques to problems involving lattice structures, it is necessary to extend certain functions defined on a perforated domain to a simply connected domain. This paper provides general extension operators which preserve bounds on derivatives of order l. Only the special case of honeycomb structures is considered.

  5. RELIABLE COMPUTATION OF HOMOGENEOUS AZEOTROPES. (R824731)

    EPA Science Inventory

    Abstract

    It is important to determine the existence and composition of homogeneous azeotropes in the analysis of phase behavior and in the synthesis and design of separation systems, from both theoretical and practical standpoints. A new method for reliably locating an...

  6. A Common Genetic Variant in the 3′-UTR of Vacuolar H+-ATPase ATP6V0A1 Creates a Micro-RNA Motif to Alter Chromogranin A (CHGA) Processing and Hypertension Risk

    PubMed Central

    Wei, Zhiyun; Biswas, Nilima; Wang, Lei; Courel, Maite; Zhang, Kuixing; Soler-Jover, Alex; Taupenot, Laurent; O’Connor, Daniel T.

    2012-01-01

    Background: The catecholamine release-inhibitor catestatin and its precursor chromogranin A (CHGA) may constitute “intermediate phenotypes” in the analysis of genetic risk for cardiovascular disease such as hypertension. Previously, the vacuolar H+-ATPase subunit gene ATP6V0A1 was found within the confidence interval for linkage with catestatin secretion in a genome-wide study, and its 3′-UTR polymorphism T+3246C (rs938671) was associated with both catestatin processing from CHGA and population blood pressure (BP). Here we explored the molecular mechanism of this effect in experiments with transfected chimeric photoproteins in chromaffin cells. Methods and Results: Placing the ATP6V0A1 3′-UTR downstream of a luciferase reporter, we found that the C (variant) allele decreased overall gene expression. The 3′-UTR effect was verified by coupled in vitro transcription/translation of the entire/intact human ATP6V0A1 mRNA. Chromaffin granule pH, monitored by fluorescence of a CHGA/EGFP chimera during vesicular H+-ATPase inhibition by bafilomycin A1, was more easily perturbed during co-expression of the ATP6V0A1 3′-UTR C-allele than the T-allele. After bafilomycin A1 treatment, the ratio of CHGA precursor to its catestatin fragments in PC12 cells was substantially diminished, though the qualitative composition of such fragments was not affected (on immunoblot or MALDI mass spectrometry). Bafilomycin A1 treatment also decreased exocytotic secretion from the regulated pathway, monitored by a CHGA chimera tagged with embryonic alkaline phosphatase (EAP). 3′-UTR T+3246C created a binding motif for micro-RNA hsa-miR-637; co-transfection of hsa-miR-637 precursor or antagomir/inhibitor oligonucleotides yielded the predicted changes in expression of luciferase reporter/ATP6V0A1-3′-UTR plasmids varying at T+3246C. Conclusions: The results suggest a series of events whereby ATP6V0A1 3′-UTR variant T+3246C functioned: ATP6V0A1 expression was affected likely through

  7. Confocal detection of planar homogeneous and heterogeneous immunosorbent assays

    NASA Astrophysics Data System (ADS)

    Ghafari, Homanaz; Zhou, Yanzhou; Ali, Selman; Hanley, Quentin S.

    2009-11-01

    Optically sectioned detection of fluorescence immunoassays using a confocal microscope enables the creation of both homogeneous and heterogeneous planar format assays. We report a set of assays requiring optically sectioned detection, using a model system, and analysis procedures for separating the signals of a surface layer from an overlying solution. A model sandwich assay with human immunoglobulin G as the target antigen is created on a glass substrate. The prepared surfaces are exposed to antigen and a FITC-labeled secondary antibody. The resulting preparations are either read directly, to provide a homogeneous assay, or after wash steps, giving a heterogeneous assay. The simplicity of the object shapes arising from the planar format makes the decomposition of analyte signals from the thin film bound to the surface and the overlayer straightforward. Measured response functions of the thin film and overlayer fit well to the Cauchy-Lorentz and cumulative Cauchy-Lorentz functions, respectively, enabling the film and overlayer to be separated. Under the conditions used, the detection limits for the homogeneous and heterogeneous forms of the assay are 2.2 and 5.5 ng/ml, respectively. Planar format, confocally read fluorescence assays enable wash-free detection of antigens and should be applicable to a wide range of assays involving surface-bound species.
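The film/overlayer separation described above can be sketched as a curve fit: model the axial confocal profile as a Cauchy-Lorentz peak (surface film) plus a cumulative Cauchy-Lorentz step (solution overlayer), then fit both amplitudes at once. The profile, noise level, and parameter values below are synthetic, not the paper's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def profile(z, a_film, a_over, z0, gamma):
    """Cauchy-Lorentz peak (film) + cumulative Cauchy-Lorentz step (overlayer)."""
    film = a_film / (1.0 + ((z - z0) / gamma) ** 2)
    over = a_over * (0.5 + np.arctan((z - z0) / gamma) / np.pi)
    return film + over

z = np.linspace(-20.0, 20.0, 401)              # axial position (synthetic units)
rng = np.random.default_rng(1)
true = dict(a_film=3.0, a_over=1.0, z0=0.0, gamma=2.0)
signal = profile(z, **true) + 0.01 * rng.standard_normal(z.size)

# Fit both components simultaneously; popt[0]/popt[1] separate film/overlayer.
popt, _ = curve_fit(profile, z, signal, p0=[1.0, 1.0, 1.0, 1.0])
print(popt.round(2))
```

Because the two response shapes differ (peak vs. step), the fit can apportion signal between the surface-bound film and the free solution without any wash step, which is the essence of the homogeneous read-out.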

  8. Are geological media homogeneous or heterogeneous for neutron investigations?

    PubMed

    Woźnicka, U; Drozdowicz, K; Gabańska, B; Krynicka, E; Igielski, A

    2003-01-01

    The thermal neutron absorption cross section of a heterogeneous material is lower than that of the corresponding homogeneous one containing the same components. When rock materials are investigated, the sample usually contains grains which create heterogeneity. The heterogeneity effect depends on the mass contributions of strongly and weakly absorbing centers, on the ratio of their absorption cross sections, and on their sizes. The influence of the granulation of silicon and diabase samples on the absorption cross section measured with Czubek's method has been experimentally investigated. A 20% underestimation of the absorption cross section has been observed for diabase grains of sizes from 6.3 to 12.8 mm. PMID:12485675
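Why grains lower the measured cross section can be illustrated with a textbook self-shielding argument: inside an absorbing grain the neutron flux is depressed, so the grain contributes less than its homogenized share, and the effect grows with grain size. The slab-averaged flux factor and all numbers below are a generic illustration, not the paper's model of diabase.

```python
import math

def shielding_factor(sigma, size):
    """Average flux in a slab 'grain' relative to the incident flux,
    (1 - exp(-sigma*size)) / (sigma*size): 1 for tiny grains, small for thick ones."""
    tau = sigma * size
    return (1.0 - math.exp(-tau)) / tau

def effective_sigma(sigma_grain, sigma_matrix, w_grain, grain_size):
    """Flux-weighted mixture cross section with self-shielded grains (toy model)."""
    g = shielding_factor(sigma_grain, grain_size)
    return w_grain * g * sigma_grain + (1.0 - w_grain) * sigma_matrix

sigma_hom = 0.4 * 2.0 + 0.6 * 0.1          # homogeneous mixture (invented values)
for size in (0.1, 1.0, 10.0):              # growing grain size
    print(size, effective_sigma(2.0, 0.1, 0.4, size))
```

The effective cross section falls monotonically below the homogeneous value as the grains grow, which is the qualitative trend behind the 20% underestimation reported for coarse diabase grains.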

  9. Designing and Creating Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    McMeen, George R.

    Designed to encourage the use of a defined methodology and careful planning in creating computer-assisted instructional programs, this paper describes the instructional design process, compares computer-assisted instruction (CAI) and programmed instruction (PI), and discusses pragmatic concerns in computer programming. Topics addressed include:…

  10. Instruction: Does It Mean Creating Intelligence?

    ERIC Educational Resources Information Center

    Brethower, Dale

    1990-01-01

    Argues that the mission of the university is to create intelligence. Defines intelligence, discusses research on cognitive processes of learning, and discusses obstacles to using the demonstrate-label-coach-mastery strategy emphasizing the value of the clinical approach used to teach seven specific skills. Presents a classroom illustration of this…

  11. Creating engaging experiences for rehabilitation.

    PubMed

    McClusky, John F

    2008-01-01

    The traditional model of rehabilitation center design based on usability and function falls short of addressing the aspirations of those who use them. To better serve the motivational needs of both patients and therapists, we need to reconsider the gymnasium-inspired designs of current rehabilitation centers. Designers Patricia Moore and David Guynes have drawn inspiration from the everyday to create more engaging rehabilitation experiences with their Easy Street, Independence Square, Rehab 1-2-3, Our Town, and WorkSyms rehabilitation environments. Their designs simulate real-life situations to motivate patients by helping them connect their therapy to the life in which they aspire to return. Utilizing an empathic research process, Moore and Guynes build a deeper understanding of both patients' and therapists' values and apply that understanding to designs that are more directly connected to patients' aspirational goals while still meeting their functional rehabilitation needs. This same research-based design approach is utilized in all of their design work that has included, most recently, the design of the Phoenix Valley Transit Authority's Metro Light Rail Train. The train and stations have won awards for accessibility and will begin public operation in late 2008. PMID:18430671

  12. Photoinduced electron transfer processes in homogeneous and microheterogeneous solutions

    SciTech Connect

    Whitten, D.G.

    1991-10-01

    The focus of the work described in this report is on single electron transfer reactions of excited states which culminate in the formation of stable or metastable even electron species. For the most part the studies have involved even electron organic substrates which are thus converted photochemically to odd electron species and then at some stage reconvert to even electron products. These reactions generally fall into two rather different categories. In one set of studies we have examined reactions in which the metastable reagents generated by single electron transfer quenching of an excited state undergo novel fragmentation reactions, chiefly involving C-C bond cleavage. These reactions often culminate in novel and potentially useful chemical reactions and frequently have the potential for leading to new chemical products otherwise unaffordable by conventional reaction paths. In a rather different investigation we have also studied reactions in which single electron transfer quenching of an excited state is followed by subsequent reactions which lead reversibly to metastable two electron products which, often stable in themselves, can nonetheless be reacted with each other or with other reagents to regenerate the starting materials with release of energy. 66 refs., 9 figs., 1 tab.

  13. Pd/C catalyzed Suzuki-Miyaura cross coupling reaction: Is it heterogeneous or homogeneous?

    NASA Astrophysics Data System (ADS)

    Hoang, Tony Phuc

    The Suzuki-Miyaura cross-coupling reaction is a popular industrial method of creating covalent bonds between two carbons. This reaction can be catalyzed by a myriad of palladium catalysts, both heterogeneous and homogeneous. The objective of this research is to study whether the Suzuki cross coupling reaction catalyzed by solid-supported palladium catalysts is truly heterogeneous in nature (i.e., does the reaction occur on the surface of the catalyst, or does palladium leach from the solid support and catalyze the reaction in a homogeneous manner?).

  14. Homogeneous magnitude system of the Eurasian continent: S and L waves

    NASA Astrophysics Data System (ADS)

    Christoskov, L.; Kondorskaya, N. V.; Vanek, J.

    1983-07-01

    A research project was started by the Commission of Academies of Socialist Countries on Planetary Geophysics (KAPG) to establish a system of seismic reference stations for the Eurasian continent for determining reliable earthquake magnitudes. This system was called the Homogeneous Magnitude System (HMS); seismologists from 13 institutions in Bulgaria, Czechoslovakia, the German Democratic Republic, Poland, and the U.S.S.R. participated. The project was sponsored by the Commission on Practice of the International Association of Seismology and Physics of the Earth's Interior, which created a special working group for the homogeneous magnitude system within its Subcommission on Magnitude.

  15. Tissue homogeneity requires inhibition of unequal gene silencing during development.

    PubMed

    Le, Hai H; Looney, Monika; Strauss, Benjamin; Bloodgood, Michael; Jose, Antony M

    2016-08-01

    Multicellular organisms can generate and maintain homogeneous populations of cells that make up individual tissues. However, the cellular processes that can disrupt homogeneity, and how organisms overcome such disruption, are unknown. We found that ∼100-fold differences in expression from a repetitive DNA transgene can occur between intestinal cells in Caenorhabditis elegans. These differences are caused by gene silencing in some cells and are actively suppressed by parental and zygotic factors such as the conserved exonuclease ERI-1. If unsuppressed, silencing can spread between some cells in embryos but can be repeat specific and independent of other homologous loci within each cell. Silencing can persist through DNA replication and nuclear divisions, disrupting uniform gene expression in developed animals. Analysis at single-cell resolution suggests that differences between cells arise during early cell divisions upon unequal segregation of an initiator of silencing. Our results suggest that organisms with high repetitive DNA content, which include humans, could use similar developmental mechanisms to achieve and maintain tissue homogeneity. PMID:27458132

  16. Homogeneous UVA system for corneal cross-linking treatment

    NASA Astrophysics Data System (ADS)

    Ayres Pereira, Fernando R.; Stefani, Mario A.; Otoboni, José A.; Richter, Eduardo H.; Ventura, Liliane

    2010-02-01

    The treatment of keratoconus and corneal ulcers by collagen cross-linking, using ultraviolet type A (UVA) irradiation combined with the photosensitizer riboflavin (vitamin B2), is a promising technique. The standard protocol suggests instilling riboflavin in the pre-scratched cornea every 5 min for 30 min, during UVA irradiation of the cornea at 3 mW/cm2 for 30 min. This process increases the biomechanical strength of the cornea, stopping the progression of, or sometimes even reversing, keratoconus. Collagen cross-linking can be achieved by many methods, but UVA light is ideal for this purpose because it permits a homogeneous treatment, leading to a uniform result across the treated area. We have developed a system, to be used clinically for the cross-linking treatment of unhealthy corneas, consisting of a UVA-emitting delivery device controlled by a closed-loop system with high homogeneity. The system is tunable and delivers 3-5 mW/cm2 at 365 nm for three spot sizes (6 mm, 8 mm and 10 mm in diameter). The electronic closed loop has 1% precision, leading to an overall error after calibration of less than 10%, and approximately 96% homogeneity.
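The closed-loop irradiance control can be sketched with a toy proportional-integral loop holding a measured irradiance at its setpoint. The lamp model, gains, and setpoint below are invented for illustration; they are not the device's actual control law or parameters.

```python
# Toy PI loop in the spirit of the closed-loop UVA control described above.
def run_loop(setpoint=3.0, gain_p=0.4, gain_i=0.2, steps=200):
    """Drive a simple lamp model until the measured irradiance (mW/cm2,
    invented units) settles at the setpoint; returns the final reading."""
    drive, integral, measured = 0.0, 0.0, 0.0
    for _ in range(steps):
        measured = 0.9 * drive          # lamp model: output droops below drive
        error = setpoint - measured     # deviation from the 3 mW/cm2 target
        integral += error               # integral term removes steady-state error
        drive = gain_p * error + gain_i * integral
    return measured

print(run_loop())
```

The integral term is what lets the loop cancel the lamp's droop exactly, which is the point of closing the loop rather than driving the LEDs open-loop from a calibration table.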

  17. Homogenization of tissues via picosecond-infrared laser (PIRL) ablation: Giving a closer view on the in-vivo composition of protein species as compared to mechanical homogenization

    PubMed Central

    Kwiatkowski, M.; Wurlitzer, M.; Krutilin, A.; Kiani, P.; Nimer, R.; Omidi, M.; Mannaa, A.; Bussmann, T.; Bartkowiak, K.; Kruber, S.; Uschold, S.; Steffen, P.; Lübberstedt, J.; Küpker, N.; Petersen, H.; Knecht, R.; Hansen, N.O.; Zarrine-Afsar, A.; Robertson, W.D.; Miller, R.J.D.; Schlüter, H.

    2016-01-01

    Posttranslational modifications and proteolytic processing regulate almost all physiological processes. Dysregulation can potentially result in pathologic protein species causing diseases. Thus, tissue species proteomes of diseased individuals provide diagnostic information. Since the composition of tissue proteomes can rapidly change during tissue homogenization by the action of enzymes released from their compartments, disease specific protein species patterns can vanish. Recently, we described a novel, ultrafast and soft method for cold vaporization of tissue via desorption by impulsive vibrational excitation (DIVE) using a picosecond-infrared-laser (PIRL). Given that DIVE extraction may provide improved access to the original composition of protein species in tissues, we compared the proteome composition of tissue protein homogenates after DIVE homogenization with conventional homogenizations. A higher number of intact protein species was observed in DIVE homogenates. Due to the ultrafast transfer of proteins from tissues via gas phase into frozen condensates of the aerosols, intact protein species were exposed to a lesser extent to enzymatic degradation reactions compared with conventional protein extraction. In addition, the total protein yield is higher in DIVE homogenates because they are very homogenous and contain almost no insoluble particles, allowing direct analysis with subsequent analytical methods without the need for centrifugation. Biological significance: Enzymatic protein modifications during tissue homogenization are responsible for changes of the in-vivo protein species composition. Cold vaporization of tissues by PIRL-DIVE is comparable with taking a snapshot at the time of the laser irradiation of the dynamic changes that occur continuously under in-vivo conditions. At that time point all biomolecules are transferred into an aerosol, which is immediately frozen. PMID:26778141

  18. Applications of High and Ultra High Pressure Homogenization for Food Safety.

    PubMed

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low temperature long time and high temperature short time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, the thermal treatments can reduce the product quality and freshness. Consequently, some non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). The last of these techniques has demonstrated great potential to provide "fresh-like" products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350-400 MPa have opened new opportunities to homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. For this reason, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on foodborne pathogenic species inactivation in relation to the food matrix and food chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered. PMID:27536270

  20. Homogeneous crystal nucleation in binary metallic melts

    NASA Technical Reports Server (NTRS)

    Thompson, C. V.; Spaepen, F.

    1983-01-01

    A method for calculating the homogeneous crystal nucleation frequency in binary metallic melts is developed. The free energy of crystallization is derived from regular solution models for the liquid and solid and is used, together with model-based estimates of the interfacial tension, to calculate the nucleation frequency from the classical theory. The method can account for the composition dependence of the maximum undercooling observed in a number of experiments on small droplet dispersions. It can also be used to calculate the driving force for crystal growth and to obtain more precise estimates of the homogeneous crystal nucleation frequency in glass-forming alloys. This method, although approximate, is simple to apply, and requires only knowledge of the phase diagram and a few readily available thermodynamic quantities as input data.
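
    The classical-theory calculation the abstract describes can be sketched in a few lines: the work of forming a critical nucleus follows from the interfacial tension and the volumetric free energy of crystallization, and the nucleation frequency is an Arrhenius factor times a kinetic prefactor. The parameter values and the prefactor below are illustrative assumptions, not values from the paper.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def nucleation_rate(sigma, dG_v, T, prefactor=1e39):
    """Steady-state homogeneous nucleation frequency (classical theory).

    sigma     : crystal-melt interfacial tension, J/m^2 (assumed value)
    dG_v      : free energy of crystallization per unit volume, J/m^3 (< 0)
    T         : temperature, K
    prefactor : kinetic prefactor, events per m^3 s (order-of-magnitude guess)
    """
    # Work of forming the critical nucleus: dG* = 16*pi*sigma^3 / (3*dG_v^2)
    dG_star = 16.0 * math.pi * sigma ** 3 / (3.0 * dG_v ** 2)
    return prefactor * math.exp(-dG_star / (K_B * T))

rate = nucleation_rate(sigma=0.18, dG_v=-1.0e9, T=800.0)
```

    In the full method the driving force dG_v itself comes from the regular solution models of the liquid and solid, so it varies with both composition and undercooling; here it is held fixed for illustration.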

  1. Beyond relationships between homogeneous and heterogeneous catalysis

    SciTech Connect

    Dixon, David A.; Katz, Alexander; Arslan, Ilke; Gates, Bruce C.

    2014-08-13

    Scientists who regard catalysis as a coherent field have been striving for decades to articulate the fundamental unifying principles. But because these principles seem to be broader than chemistry, chemical engineering, and materials science combined, catalytic scientists commonly interact within the sub-domains of homogeneous, heterogeneous, and bio-catalysis, and increasingly within even narrower domains such as organocatalysis, phase-transfer catalysis, acid-base catalysis, zeolite catalysis, etc. Attempts to unify catalysis have motivated researchers to find relationships between homogeneous and heterogeneous catalysis and to mimic enzymes. These themes have inspired vibrant international meetings and workshops, and we have benefited from the idea exchanges and have some thoughts about a path forward.

  2. Homogeneous Superpixels from Markov Random Walks

    NASA Astrophysics Data System (ADS)

    Perbet, Frank; Stenger, Björn; Maki, Atsuto

    This paper presents a novel algorithm to generate homogeneous superpixels from Markov random walks. We exploit Markov clustering (MCL) as the methodology, a generic graph clustering method based on stochastic flow circulation. In particular, we introduce a graph pruning strategy called compact pruning in order to capture intrinsic local image structure. The resulting superpixels are homogeneous, i.e., uniform in size and compact in shape. The original MCL algorithm does not scale well to a graph of an image due to the squaring of the Markov matrix, which is necessary for circulating the flow. The proposed pruning scheme has the advantages of faster computation, smaller memory footprint, and straightforward parallel implementation. Through comparisons with other recent techniques, we show that the proposed algorithm achieves state-of-the-art performance.
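
    The core MCL iteration that the paper builds on, expansion by matrix squaring followed by inflation (elementwise power and column renormalization), can be sketched compactly. The toy graph, inflation parameter, and iteration count below are illustrative assumptions, and the paper's compact-pruning step is omitted for brevity.

```python
import numpy as np

def mcl(adjacency, r=2.0, iterations=20):
    """Minimal Markov clustering: expansion + inflation until columns settle."""
    # Add self-loops and column-normalize to get a column-stochastic matrix.
    M = adjacency + np.eye(adjacency.shape[0])
    M = M / M.sum(axis=0, keepdims=True)
    for _ in range(iterations):
        M = M @ M                              # expansion: circulate flow
        M = M ** r                             # inflation: strengthen strong flows
        M = M / M.sum(axis=0, keepdims=True)   # renormalize columns
    return M

# Two triangles joined by a single edge: MCL should split them at the bridge.
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
M = mcl(A)
```

    At convergence each column's flow concentrates on an attractor node inside its own cluster, which is how cluster membership is read off the matrix.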

  3. Detonation in shocked homogeneous high explosives

    SciTech Connect

    Yoo, C.S.; Holmes, N.C.; Souers, P.C.

    1995-11-01

    We have studied shock-induced changes in homogeneous high explosives including nitromethane, tetranitromethane, and single crystals of pentaerythritol tetranitrate (PETN) by using fast time-resolved emission and Raman spectroscopy at a two-stage light-gas gun. The results reveal three distinct steps during which the homogeneous explosives chemically evolve to final detonation products. These are (1) the initiation of shock compressed high explosives after an induction period, (2) thermal explosion of shock-compressed and/or reacting materials, and (3) a decay to a steady-state representing a transition to the detonation of uncompressed high explosives. Based on a gray-body approximation, we have obtained the CJ temperatures: 3800 K for nitromethane, 2950 K for tetranitromethane, and 4100 K for PETN. We compare the data with various thermochemical equilibrium calculations. In this paper we will also show a preliminary result of single-shot time-resolved Raman spectroscopy applied to shock-compressed nitromethane.

  4. Resonant ultrasound spectroscopy and homogeneity in polycrystals.

    PubMed

    Kaplan, Gunes; Darling, T W; McCall, K R

    2009-01-01

    Resonant ultrasound spectroscopy (RUS) is capable of determining the bulk elastic properties of a solid from its characteristic vibration frequencies, given the dimensions, density and shape of the sample. The model used for extracting values of the elastic constants assumes perfect homogeneity, which can be approximated by average-isotropic polycrystals. This approximation is excellent in the small grain regime assumed for most averaging procedures, but for real samples with indeterminate grain size distributions, it is not clear where the approximation breaks down. RUS measurements were made on pure copper samples where the grain size distribution was changed by progressive heat treatments in order to find a quantitative limit for the loss of homogeneity. It is found that when a measure of the largest grains is 15% of the sample's smallest dimension, the deviation in RUS fits indicates elastic inhomogeneity. PMID:18804831

  5. CUDA Simulation of Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John V.; Shum, Victor; Fu, Terry

    2011-01-01

    We discuss very fast Compute Unified Device Architecture (CUDA) simulations of ideal homogeneous incompressible turbulence based on Fourier models. These models have associated statistical theories that predict that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. Prior numerical simulations have shown that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We review the theoretical basis of this "broken ergodicity" as applied to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence. Our new simulations examine the phenomenon of broken ergodicity through very long time and large grid size runs performed on a state-of-the-art CUDA platform. Results comparing various CUDA hardware configurations and grid sizes are discussed. Navier-Stokes (NS) and magnetohydrodynamic (MHD) results are compared.

  6. Homogeneous freezing nucleation of stratospheric solution droplets

    NASA Technical Reports Server (NTRS)

    Jensen, Eric J.; Toon, Owen B.; Hamill, Patrick

    1991-01-01

    The classical theory of homogeneous nucleation was used to calculate the freezing rate of sulfuric acid solution aerosols under stratospheric conditions. The freezing of stratospheric aerosols would be important for the nucleation of nitric acid trihydrate particles in the Arctic and Antarctic stratospheres. In addition, the rate of heterogeneous chemical reactions on stratospheric aerosols may be very sensitive to their state. The calculations indicate that homogeneous freezing nucleation of pure water ice in the stratospheric solution droplets would occur at temperatures below about 192 K. However, the physical properties of H2SO4 solution at such low temperatures are not well known, and it is possible that sulfuric acid aerosols will freeze out at temperatures ranging from about 180 to 195 K. It is also shown that the temperature at which the aerosols freeze is nearly independent of their size.

  7. Homogeneity of kappa statistics in multiple samples.

    PubMed

    Reed, J F

    2000-08-01

    The measurement of intra-observer agreement when the data are categorical has been the subject of several investigators since Cohen first proposed the kappa (κ) as a chance-corrected coefficient of agreement for nominal scales. Subsequent procedures have been developed to assess the agreement of several raters using a dichotomous classification scheme, to assess majority agreement among several raters using a polytomous classification scheme, and to use kappa as an indicator of the quality of a measurement. Further developments include inference procedures for testing the homogeneity of k ≥ 2 independent kappa statistics. An executable FORTRAN code for testing the homogeneity of kappa statistics (κh) across multiple sites or studies is given. The FORTRAN program listing and/or executable programs are available from the author on request. PMID:10927153
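
    A homogeneity test of this kind is typically a chi-square statistic built from inverse-variance weights: the weighted squared deviations of the individual kappas from their pooled estimate are summed and compared to a chi-square distribution with k - 1 degrees of freedom. The sketch below follows that standard large-sample approach (it is not the paper's FORTRAN code), and the example kappas and standard errors are made-up numbers.

```python
import math

def kappa_homogeneity(kappas, std_errors):
    """Chi-square statistic for H0: all underlying kappas are equal."""
    weights = [1.0 / se ** 2 for se in std_errors]   # inverse-variance weights
    kappa_bar = sum(w * k for w, k in zip(weights, kappas)) / sum(weights)
    stat = sum(w * (k - kappa_bar) ** 2 for w, k in zip(weights, kappas))
    return stat, len(kappas) - 1                     # statistic, degrees of freedom

stat, df = kappa_homogeneity([0.62, 0.55, 0.71], [0.08, 0.10, 0.07])
# For df = 2 the chi-square survival function reduces to exp(-x/2):
p_value = math.exp(-stat / 2.0)
```

    With these illustrative inputs the statistic is small and the p-value comfortably exceeds 0.05, i.e., no evidence that the kappas differ across sites.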

  8. A homogenization model of the annulus fibrosus.

    PubMed

    Yin, Luzhong; Elliott, Dawn M

    2005-08-01

    The objective of this study was to use a homogenization model of the anisotropic mechanical behavior of annulus fibrosus (AF) to address some of the issues raised in structural finite element and fiber-reinforced strain energy models. Homogenization theory describes the effect of microstructure on macroscopic material properties by assuming the material is composed of repeating representative volume elements. We first developed the general homogenization model and then specifically prescribed the model to in-plane single lamella and multi-lamellae AF properties. We compared model predictions to experimentally measured AF properties and performed parametric studies. The predicted tensile moduli (Eθ and Ez) and their dependence on fiber volume fraction and fiber angle were consistent with measured values. However, the model prediction for shear modulus (Gθz) was two orders of magnitude larger than directly measured values. The values of Eθ and Ez were strongly dependent on the model input for matrix modulus, much more so than on the fiber modulus. These parametric analyses demonstrated the contribution of the matrix to AF load support, which may play a role when proteoglycans are decreased in disc degeneration, and will also be an important design factor in tissue engineering. We next compared the homogenization model to a 3-D structural finite element model and fiber-reinforced energy models. Similarities between the three model types provided confidence in the ability of these models to predict AF tissue mechanics. This study provides a direct comparison between the several types of AF models and will be useful for interpreting previous studies and elucidating AF structure-function relationships in disc degeneration and for functional tissue engineering. PMID:15958225

  9. Spherical cloaking with homogeneous isotropic multilayered structures.

    PubMed

    Qiu, Cheng-Wei; Hu, Li; Xu, Xiaofei; Feng, Yijun

    2009-04-01

    We propose a practical realization of electromagnetic spherical cloaking by layered structure of homogeneous isotropic materials. By mimicking the classic anisotropic cloak by many alternating thin layers of isotropic dielectrics, the permittivity and permeability in each isotropic layer can be properly determined by effective medium theory in order to achieve invisibility. The model greatly facilitates modeling by Mie theory and realization by multilayer coating of dielectrics. Eigenmode analysis is also presented to provide insights of the discretization in multilayers. PMID:19518392
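
    The effective-medium construction the abstract describes can be sketched numerically: each anisotropic shell of the classic transformation-optics spherical cloak (radial permittivity eps_r, tangential eps_t) is mimicked by two alternating thin isotropic layers whose arithmetic mean reproduces eps_t and whose harmonic mean reproduces eps_r. The cloak radii and layer count below are illustrative assumptions, not the paper's design values.

```python
import math

def cloak_profile(a, b, n_shells=10):
    """Isotropic layer pairs (eps_A, eps_B) approximating a spherical cloak a < r < b."""
    layers = []
    for i in range(n_shells):
        r = a + (i + 0.5) * (b - a) / n_shells        # shell midpoint radius
        eps_t = b / (b - a)                           # tangential permittivity
        eps_r = eps_t * ((r - a) / r) ** 2            # radial permittivity
        # Thin alternating layers: (eps_A + eps_B)/2 = eps_t (arithmetic mean),
        # 2*eps_A*eps_B/(eps_A + eps_B) = eps_r (harmonic mean).
        root = math.sqrt(eps_t ** 2 - eps_r * eps_t)
        layers.append((r, eps_t + root, eps_t - root))
    return layers

layers = cloak_profile(a=1.0, b=2.0)
```

    Solving the two mean conditions gives the closed form eps_A,B = eps_t ± sqrt(eps_t^2 - eps_r*eps_t), which is well defined here because eps_r < eps_t throughout the shell.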

  10. [Create or copy... Which is the difference?].

    PubMed

    López P, Ricardo

    2009-01-01

    Creating and copying are two different processes; we must not confuse creativity with plagiarism. This distinction is nevertheless problematic, because there is no possibility of creating from scratch: any creative act necessarily arises from accumulated experience, inevitably producing a continuity between old and new. Even so, it is necessary to establish clearly the difference between creating and copying. It is not desirable that a matter of such importance remain nebulous, or that the relationship between creativity and ethics go unexamined. There are many cases of plagiarism, but this cannot be a consolation. Nothing is gained when the existence of a plagiarism is ignored or concealed, and even less when it is excused. PMID:19399333

  11. MULTIGRID HOMOGENIZATION OF HETEROGENEOUS POROUS MEDIA

    SciTech Connect

    Dendy, J.E.; Moulton, J.D.

    2000-10-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL); this report, however, covers only two years of research, since the project was terminated at the end of two years in response to the reduction in funding for the LDRD Program at LANL. The numerical simulation of flow through heterogeneous porous media has become a vital tool in forecasting reservoir performance, analyzing groundwater supply and predicting the subsurface flow of contaminants. Consequently, the computational efficiency and accuracy of these simulations is paramount. However, the parameters of the underlying mathematical models (e.g., permeability, conductivity) typically exhibit severe variations over a range of significantly different length scales. Thus the numerical treatment of these problems relies on a homogenization or upscaling procedure to define an approximate coarse-scale problem that adequately captures the influence of the fine-scale structure, with a resultant compromise between the competing objectives of computational efficiency and numerical accuracy. For homogenization in models of flow through heterogeneous porous media, we have developed new, efficient numerical multilevel methods that offer a significant improvement in the compromise between accuracy and efficiency. We recently combined this approach with the work of Dvorak to compute bounded estimates of the homogenized permeability for such flows and demonstrated the effectiveness of this new algorithm with numerical examples.
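
    The idea of bounded estimates for a homogenized permeability can be illustrated with the classical Wiener bounds: whatever upscaling procedure is used, the effective value of a heterogeneous block must lie between the harmonic and arithmetic means of the fine-scale cell values. This is a generic sanity check, not the report's multilevel algorithm, and the permeability field below is a made-up example.

```python
def wiener_bounds(perms):
    """Harmonic (lower) and arithmetic (upper) bounds on effective permeability."""
    n = len(perms)
    arithmetic = sum(perms) / n                  # flow parallel to layering
    harmonic = n / sum(1.0 / k for k in perms)   # flow perpendicular to layering
    return harmonic, arithmetic

k_cells = [1e-12, 5e-14, 2e-13, 8e-12]           # fine-scale permeabilities, m^2
lower, upper = wiener_bounds(k_cells)
```

    The two bounds correspond physically to flow across and along a layered medium; a multilevel homogenization scheme should always return an effective permeability inside this interval.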

  12. Homogeneous Biosensing Based on Magnetic Particle Labels

    PubMed Central

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J.; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation. PMID:27275824

  13. TESTING HOMOGENEITY WITH GALAXY STAR FORMATION HISTORIES

    SciTech Connect

    Hoyle, Ben; Jimenez, Raul; Tojeiro, Rita; Maartens, Roy; Heavens, Alan; Clarkson, Chris

    2013-01-01

    Observationally confirming spatial homogeneity on sufficiently large cosmological scales is of importance to test one of the underpinning assumptions of cosmology, and is also imperative for correctly interpreting dark energy. A challenging aspect of this is that homogeneity must be probed inside our past light cone, while observations take place on the light cone. The star formation history (SFH) in the galaxy fossil record provides a novel way to do this. We calculate the SFH of stacked luminous red galaxy (LRG) spectra obtained from the Sloan Digital Sky Survey. We divide the LRG sample into 12 equal-area contiguous sky patches and 10 redshift slices (0.2 < z < 0.5), which correspond to 120 blocks of volume ≈0.04 Gpc³. Using the SFH in a time period that samples the history of the universe between look-back times 11.5 and 13.4 Gyr as a proxy for homogeneity, we calculate the posterior distribution for the excess large-scale variance due to inhomogeneity, and find that the most likely solution is no extra variance at all. At 95% credibility, there is no evidence of deviations larger than 5.8%.

  15. Effect of homogenization and ultrasonication on the physical properties of insoluble wheat bran fibres

    NASA Astrophysics Data System (ADS)

    Hu, Ran; Zhang, Min; Adhikari, Benu; Liu, Yaping

    2015-10-01

    Wheat bran is rich in dietary fibre and its annual output is abundant, but underutilized. Insoluble dietary fibre often influences food quality negatively; therefore, how to improve the physical and chemical properties of insoluble dietary fibre from wheat bran for post processing is a challenge. Insoluble dietary fibre was obtained from wheat bran and micronized using high-pressure homogenization, high-intensity sonication, and a combination of these two methods. The high-pressure homogenization and high-pressure homogenization + high-intensity sonication treatments significantly (p<0.05) improved the solubility, swelling, water-holding, oil-holding, and cation exchange capacities. The improvement of these properties by high-intensity sonication alone was marginal. In most cases, the high-pressure homogenization process was as good as the combined process in improving the above-mentioned properties; hence, the contribution of high-intensity sonication to the combined process was minimal. At best, the particle size of wheat bran could be reduced to 9 μm, with significant changes in the solubility, swelling, water-holding, oil-holding, and cation exchange capacities.

  16. Homogeneous and heterogeneous chemistry along air parcel trajectories

    NASA Technical Reports Server (NTRS)

    Jones, R. L.; Mckenna, D. L.; Poole, L. R.; Solomon, S.

    1990-01-01

    The study of coupled heterogeneous and homogeneous chemistry due to polar stratospheric clouds (PSCs), using Lagrangian parcel trajectories for interpretation of the Airborne Arctic Stratosphere Experiment (AASE), is discussed. This approach represents an attempt to quantitatively model the physical and chemical perturbation to stratospheric composition due to formation of PSCs, using the fullest possible representation of the relevant processes. Further, the meteorological fields from the United Kingdom Meteorological Office global model were used to deduce potential vorticity and inferred regions of PSCs as input to flight planning during the AASE.

  17. Te homogeneous precipitation in Ge dislocation loop vicinity

    NASA Astrophysics Data System (ADS)

    Perrin Toinin, J.; Portavoce, A.; Texier, M.; Bertoglio, M.; Hoummada, K.

    2016-06-01

    High resolution microscopies were used to study the interactions of Te atoms with Ge dislocation loops, after a standard n-type doping process in Ge. Te atoms neither segregate nor precipitate on dislocation loops, but form Te-Ge clusters at the same depth as the dislocation loops, in contradiction with usual dopant behavior and thermodynamic expectations. Atomistic kinetic Monte Carlo simulations show that Te atoms are repelled from dislocation loops due to elastic interactions, promoting homogeneous Te-Ge nucleation between dislocation loops. This phenomenon is enhanced by Coulombic interactions between activated Te2+ or Te1+ ions.

  18. Homogeneous turbulence subjected to mean flow with elliptic streamlines

    NASA Technical Reports Server (NTRS)

    Blaisdell, G. A.; Shariff, K.

    1994-01-01

    Direct numerical simulations are performed for homogeneous turbulence with a mean flow having elliptic streamlines. This flow combines the effects of rotation and strain on the turbulence. Qualitative comparisons are made with linear theory for cases with high Rossby number. The nonlinear transfer process is monitored using a generalized skewness. In general, rotation turns off the nonlinear cascade; however, for moderate ellipticities and rotation rates the nonlinear cascade is turned off and then reestablished. Turbulence statistics of interest in turbulence modeling are calculated, including full Reynolds stress budgets.

  19. Homogeneous Charge Compression Ignition Free Piston Linear Alternator

    SciTech Connect

    Janson Wu; Nicholas Paradiso; Peter Van Blarigan; Scott Goldsborough

    1998-11-01

    An experimental and theoretical investigation of a homogeneous charge compression ignition (HCCI) free piston powered linear alternator has been conducted to determine if improvements can be made in the thermal and conversion efficiencies of modern electrical generator systems. Performance of a free piston engine was investigated using a rapid compression expansion machine and a full cycle thermodynamic model. Linear alternator performance was investigated with a computer model. In addition, linear alternator testing and permanent-magnet characterization hardware were developed. The development of the two-stroke cycle scavenging process has begun.

  20. Creating and Exploring Simple Models

    ERIC Educational Resources Information Center

    Hubbard, Miles J.

    2007-01-01

    Students manipulate data algebraically and statistically to create models applied to a falling ball. They also borrow tools from arithmetic progressions to examine the relationship between the velocity and the distance the ball falls. (Contains 2 tables and 5 figures.)
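
    The arithmetic-progression angle the article mentions can be made concrete: under constant acceleration, the distances fallen in successive equal time steps form an arithmetic progression, so the second differences of position are constant. The time step and use of g = 9.8 m/s^2 below are illustrative choices, not values from the article.

```python
# Falling-ball model: d = g * t^2 / 2 sampled at equal time steps.
g, dt = 9.8, 0.1
positions = [0.5 * g * (i * dt) ** 2 for i in range(6)]

# First differences (distance fallen per interval) grow linearly ...
steps = [b - a for a, b in zip(positions, positions[1:])]
# ... so second differences are constant and equal to g * dt^2.
second_diffs = [b - a for a, b in zip(steps, steps[1:])]
```

    Recovering g from the constant second difference (g = second_diff / dt^2) is the kind of model-fitting exercise the article has students perform on measured data.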

  1. Converting Homogeneous to Heterogeneous in Electrophilic Catalysis using Monodisperse Metal Nanoparticles

    SciTech Connect

    Witham, Cole A.; Huang, Wenyu; Tsung, Chia-Kuang; Kuhn, John N.; Somorjai, Gabor A.; Toste, F. Dean

    2009-10-15

    A continuing goal in catalysis is the transformation of processes from homogeneous to heterogeneous. To this end, nanoparticles represent a new frontier in heterogeneous catalysis, where this conversion is supplemented by the ability to obtain new or divergent reactivity and selectivity. We report a novel method for applying heterogeneous catalysts to known homogeneous catalytic reactions through the design and synthesis of electrophilic platinum nanoparticles. These nanoparticles are selectively oxidized by the hypervalent iodine species PhICl2, and catalyze a range of π-bond activation reactions previously only homogeneously catalyzed. Multiple experimental methods are utilized to unambiguously verify the heterogeneity of the catalytic process. The discovery of treatments for nanoparticles that induce the desired homogeneous catalytic activity should lead to the further development of reactions previously inaccessible in heterogeneous catalysis. Furthermore, our size and capping agent study revealed that Pt PAMAM dendrimer-capped nanoparticles demonstrate superior activity and recyclability compared to larger, polymer-capped analogues.

  2. Creating Cartoons to Promote Leaderships Skills and Explore Leadership Qualities

    ERIC Educational Resources Information Center

    Smith, Latisha L.; Clausen, Courtney K.; Teske, Jolene K.; Ghayoorrad, Maryam; Gray, Phyllis; Al Subia, Sukainah; Atwood-Blaine, Dana; Rule, Audrey C.

    2015-01-01

    This document describes a strategy for increasing student leadership and creativity skills through the creation of cartoons. Creating cartoons engages students in divergent thinking and cognitive processes, such as perception, recall, and mental processing. When students create cartoons focused on a particular topic, they are making connections to…

  3. Homogenous charge compression ignition engine having a cylinder including a high compression space

    DOEpatents

    Agama, Jorge R.; Fiveland, Scott B.; Maloney, Ronald P.; Faletti, James J.; Clarke, John M.

    2003-12-30

    The present invention relates generally to the field of homogeneous charge compression engines. In these engines, fuel is injected upstream or directly into the cylinder when the power piston is relatively close to its bottom dead center position. The fuel mixes with air in the cylinder as the power piston advances to create a relatively lean homogeneous mixture that preferably ignites when the power piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage, can result. Thus, the present invention divides the homogeneous charge between a controlled volume higher compression space and a lower compression space to better control the start of ignition.

  4. APHRODITE daily precipitation and temperature dataset: Development, QC, Homogenization and Spatial Correlation

    NASA Astrophysics Data System (ADS)

    Yatagai, Akiyo; Zhao, Tianbao

    2014-05-01

    A daily gridded precipitation dataset for the period 1951-2007 was created by collecting and analyzing rain-gauge observation data across Asia through the activities of the Asian Precipitation - Highly Resolved Observational Data Integration Towards Evaluation (APHRODITE) of water resources project. The data are available at http://www.chikyu.ac.jp/precip/. Utilization of station data is ideal for analyses of climatic trends, especially for those of extreme events. However, there has been increasing demand for accurate high-resolution gauge-based precipitation analyses. Rain-gauge-based products are sometimes used for assessing trends in climate models or in river runoff through driving hydrological models, because they are convenient and provide long records. On the other hand, some information is lost during the gridding process. Hence, in-house results of testing interpolation schemes, quality control, and homogenization may give important information to users. We present such results, as well as our quality control (QC) procedures, within the APHRODITE project activities. Before gridding, 14 objective QC steps were applied to the rain-gauge data, mainly including position checks, duplicate-data checks, and tests for inhomogeneity and spatiotemporal isolation. Details are described in Hamada et al. (2011). For Chinese data, basic QC steps such as duplicate checking and position checking had already been made by the local meteorological agency, so we performed homogenization tests and spatial correlation analyses separately. For 756 Chinese daily temperature stations, we applied the Multiple Analysis of Series for Homogenization (MASH) method developed by Szentimrey (1999, 2008). The results show that this statistical method performs well at detecting discontinuities in climate series caused by station relocation, instrument changes, etc., even in the absence of metadata. Through the homogenization, most of the discontinuities in the original temperature data can be removed, and the
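
    The 14 QC steps above are not published as code in the abstract; as a rough illustrative sketch only, two of the named checks (duplicate-data detection and station-position checking) might look like the following, where the record fields, bounds, and sample data are hypothetical and not the actual APHRODITE implementation:

```python
# Sketch of two rain-gauge QC checks named in the abstract: duplicate-record
# detection and station-position checking. Field names and bounds are
# hypothetical illustrations, not the APHRODITE code.

def qc_duplicates(records):
    """Drop records whose (station_id, date) key repeats, keeping the first."""
    seen, clean = set(), []
    for rec in records:
        key = (rec["station_id"], rec["date"])
        if key not in seen:
            seen.add(key)
            clean.append(rec)
    return clean

def qc_position(records, lat_range=(-90.0, 90.0), lon_range=(-180.0, 180.0)):
    """Drop records whose station coordinates fall outside valid bounds."""
    return [r for r in records
            if lat_range[0] <= r["lat"] <= lat_range[1]
            and lon_range[0] <= r["lon"] <= lon_range[1]]

records = [
    {"station_id": "A1", "date": "1951-01-01", "lat": 35.0, "lon": 139.0, "mm": 4.2},
    {"station_id": "A1", "date": "1951-01-01", "lat": 35.0, "lon": 139.0, "mm": 4.2},  # duplicate
    {"station_id": "B7", "date": "1951-01-01", "lat": 95.0, "lon": 139.0, "mm": 1.0},  # bad latitude
]
clean = qc_position(qc_duplicates(records))
print(len(clean))
```

    Real pipelines chain many more such filters (temporal isolation, inhomogeneity tests), but each stage follows this same record-in, record-out pattern.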

  5. Turbulent Diffusion in Non-Homogeneous Environments

    NASA Astrophysics Data System (ADS)

    Diez, M.; Redondo, J. M.; Mahjoub, O. B.; Sekula, E.

    2012-04-01

    Many experimental studies have been devoted to the understanding of non-homogeneous turbulent dynamics. Activity in this area intensified when the basic Kolmogorov self-similar theory was extended to two-dimensional or quasi-2D turbulent flows such as those appearing in the environment, which seem to control mixing [1,2]. The statistical description and the dynamics of these geophysical flows depend strongly on the distribution of long-lived organized (coherent) structures. These flows show a complex topology, but may be subdivided in terms of strongly elliptical domains (high-vorticity regions), strongly hyperbolic domains (deformation cells with high energy condensations), and a background turbulent field of moderate elliptic and hyperbolic character. It is of fundamental importance to investigate the different influence of these topologically diverse regions. Relevant geometrical information on the different areas is also given by the maximum fractal dimension, which is related to the energy spectrum of the flow. Using all the available information it is possible to investigate the spatial variability of the horizontal eddy diffusivity K(x,y). This information is very important when trying to model numerically the behaviour in time of oil spills [3,4]. There is a strong dependence of horizontal eddy diffusivities on the wave Reynolds number as well as on the wind stress, measured as the friction velocity from wind profiles taken at the coastline. Natural sea-surface oily slicks of diverse origin (plankton, algae, or natural emissions and seeps of oil) form complicated structures on the sea surface due to the effects of both multiscale turbulence and Langmuir circulation. It is then possible to use the topological and scaling analysis to discriminate the different physical sea-surface processes. We can relate higher-order moments of the Lagrangian velocity to effective diffusivity, in spite of the need to calibrate the different regions determining the

  6. Homogeneity computation: How interitem similarity in visual short-term memory alters recognition

    PubMed Central

    Viswanathan, Shivakumar; Perl, Daniel R.; Visscher, Kristina M.; Kahana, Michael J.; Sekuler, Robert

    2010-01-01

    Visual short-term recognition memory for multiple stimuli is strongly influenced by the study items’ similarity to one another—that is, by their homogeneity. However, the mechanism responsible for this homogeneity effect has remained unclear. We evaluated competing explanations of this effect, using controlled sets of Gabor patches as study items and probe stimuli. Our results, based on recognition memory for spatial frequency, rule out the possibility that the homogeneity effect arises because similar study items are encoded and/or maintained with higher fidelity in memory than dissimilar study items are. Instead, our results support the hypothesis that the homogeneity effect reflects trial-by-trial comparisons of study items, which generate a homogeneity signal. This homogeneity signal modulates recognition performance through an adjustment of the subject’s decision criterion. Additionally, it seems the homogeneity signal is computed prior to the presentation of the probe stimulus, by evaluating the familiarity of each new stimulus with respect to the items already in memory. This suggests that recognition-like processes operate not only on the probe stimulus, but on study items as well. PMID:20081162

  7. Modeling the homogenization kinetics of as-cast U-10wt% Mo alloys

    NASA Astrophysics Data System (ADS)

    Xu, Zhijie; Joshi, Vineet; Hu, Shenyang; Paxton, Dean; Lavender, Curt; Burkes, Douglas

    2016-04-01

    Low-enriched U-22at% Mo (U-10Mo) alloy has been considered as an alternative material to replace the highly enriched fuels in research reactors. For the U-10Mo to work effectively and replace the existing fuel material, a thorough understanding of the microstructure development from as-cast to the final formed structure is required. The as-cast microstructure typically resembles an inhomogeneous microstructure with regions containing molybdenum-rich and -lean regions, which may affect the processing and possibly the in-reactor performance. This as-cast structure must be homogenized by thermal treatment to produce a uniform Mo distribution. The development of a modeling capability will improve the understanding of the effect of initial microstructures on the Mo homogenization kinetics. In the current work, we investigated the effect of as-cast microstructure on the homogenization kinetics. The kinetics of the homogenization was modeled based on a rigorous algorithm that relates the line scan data of Mo concentration to the gray scale in energy dispersive spectroscopy images, which was used to generate a reconstructed Mo concentration map. The map was then used as realistic microstructure input for physics-based homogenization models, where the entire homogenization kinetics can be simulated and validated against the available experiment data at different homogenization times and temperatures.
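
    The physics-based homogenization model itself is not specified in the abstract; as a minimal one-dimensional analogue, treating Mo homogenization during thermal treatment as Fickian diffusion acting on a reconstructed concentration line scan (with an assumed constant diffusivity and an invented initial profile, not the paper's data) could be sketched as:

```python
# Minimal 1-D sketch of homogenization kinetics: explicit finite-difference
# Fickian diffusion smoothing an inhomogeneous Mo concentration profile.
# Diffusivity, grid spacing, and the initial banded profile are illustrative
# assumptions, not the model or measurements of the paper.
import math

def diffuse(profile, d_coeff, dx, dt, steps):
    """Advance c_t = D c_xx with no-flux boundaries; return final profile."""
    c = list(profile)
    r = d_coeff * dt / dx**2          # explicit-scheme stability needs r <= 0.5
    assert r <= 0.5
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            new[i] = c[i] + r * (c[i-1] - 2*c[i] + c[i+1])
        new[0], new[-1] = new[1], new[-2]   # no-flux ends
        c = new
    return c

# Mo-rich / Mo-lean banding as a cosine perturbation around 10 wt% Mo
n = 50
c0 = [10.0 + 2.0 * math.cos(2 * math.pi * i / n) for i in range(n)]
c1 = diffuse(c0, d_coeff=1.0, dx=1.0, dt=0.4, steps=500)
spread0 = max(c0) - min(c0)   # initial Mo spread (wt%)
spread1 = max(c1) - min(c1)   # spread after "annealing"
print(round(spread0, 2), round(spread1, 4))
```

    The decay of the concentration spread with step count plays the role of the homogenization kinetics that the paper validates against experiment at different times and temperatures.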

  8. The Chemical Homogeneity of Open Clusters

    NASA Astrophysics Data System (ADS)

    Bovy, Jo

    2016-01-01

    Determining the level of chemical homogeneity in open clusters is of fundamental importance in the study of the evolution of star-forming clouds and that of the Galactic disk. Yet limiting the initial abundance spread in clusters has been hampered by difficulties in obtaining consistent spectroscopic abundances for different stellar types. Without reference to any specific model of stellar photospheres, a model for a homogeneous cluster is that it forms a one-dimensional sequence, with any differences between members due to variations in stellar mass and observational uncertainties. I present a novel method for investigating the abundance spread in open clusters that tests this one-dimensional hypothesis at the level of observed stellar spectra, rather than constraining homogeneity using derived abundances as is traditionally done. Using high-resolution APOGEE spectra for 49 giants in M67, NGC 6819, and NGC 2420 I demonstrate that these spectra form one-dimensional sequences for each cluster. With detailed forward modeling of the spectra and Approximate Bayesian Computation, I derive strong limits on the initial abundance spread of 15 elements: <0.01 (0.02) dex for C and Fe, ≲0.015 (0.03) dex for N, O, Mg, Si, and Ni, ≲0.02 (0.03) dex for Al, Ca, and Mn, and ≲0.03 (0.05) dex for Na, S, K, Ti, and V (at 68% and 95% confidence, respectively). The strong limits on C and O imply that no pollution by massive core-collapse supernovae occurred during star formation in open clusters, which, thus, need to form within ≲6 Myr. Further development of this and related techniques will bring the power of differential abundances to stars other than solar twins in large spectroscopic surveys and will help unravel the history of star formation and chemical enrichment in the Milky Way through chemical tagging.

  9. Sulfur isotope homogeneity of lunar mare basalts

    NASA Astrophysics Data System (ADS)

    Wing, Boswell A.; Farquhar, James

    2015-12-01

    We present a new set of high precision measurements of relative 33S/32S, 34S/32S, and 36S/32S values in lunar mare basalts. The measurements are referenced to the Vienna-Canyon Diablo Troilite (V-CDT) scale, on which the international reference material, IAEA-S-1, is characterized by δ33S = -0.061‰, δ34S ≡ -0.3‰ and δ36S = -1.27‰. The present dataset confirms that lunar mare basalts are characterized by a remarkable degree of sulfur isotopic homogeneity, with most new and published SF6-based sulfur isotope measurements consistent with a single mass-dependent mean isotopic composition of δ34S = 0.58 ± 0.05‰, Δ33S = 0.008 ± 0.006‰, and Δ36S = 0.2 ± 0.2‰, relative to V-CDT, where the uncertainties are quoted as 99% confidence intervals on the mean. This homogeneity allows identification of a single sample (12022, 281) with an apparent 33S enrichment, possibly reflecting cosmic-ray-induced spallation reactions. It also reveals that some mare basalts have slightly lower δ34S values than the population mean, which is consistent with sulfur loss from a reduced basaltic melt prior to eruption at the lunar surface. Both the sulfur isotope homogeneity of the lunar mare basalts and the predicted sensitivity of sulfur isotopes to vaporization-driven fractionation suggest that less than ≈1-10% of lunar sulfur was lost after a potential moon-forming impact event.
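
    The δ-notation used above has a standard definition: the per mil deviation of a sample's isotope ratio from a reference ratio. A small worked illustration (the sample ratio below is invented; the V-CDT 34S/32S reference value is the commonly quoted ~0.0441626, an assumption rather than a number from the paper):

```python
# Standard delta-notation: permil deviation of an isotope ratio from a
# reference. The sample ratio is invented for illustration; the V-CDT
# 34S/32S reference ratio used here is the commonly quoted ~0.0441626.
R_VCDT_34_32 = 0.0441626        # assumed reference 34S/32S ratio

def delta_permil(r_sample, r_reference):
    """delta = (R_sample / R_reference - 1) * 1000, in permil."""
    return (r_sample / r_reference - 1.0) * 1000.0

r_sample = R_VCDT_34_32 * 1.00058   # a sample enriched by 0.58 permil
d34S = delta_permil(r_sample, R_VCDT_34_32)
print(round(d34S, 2))
```

    The mass-independent quantities Δ33S and Δ36S quoted in the abstract are then defined as departures of δ33S and δ36S from the values expected from δ34S under mass-dependent fractionation.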

  10. Compressible homogeneous shear: Simulation and modeling

    NASA Technical Reports Server (NTRS)

    Sarkar, S.; Erlebacher, G.; Hussaini, M. Y.

    1992-01-01

    Compressibility effects were studied on turbulence by direct numerical simulation of homogeneous shear flow. A primary observation is that the growth of the turbulent kinetic energy decreases with increasing turbulent Mach number. The sinks provided by compressible dissipation and the pressure dilatation, along with reduced Reynolds shear stress, are shown to contribute to the reduced growth of kinetic energy. Models are proposed for these dilatational terms and verified by direct comparison with the simulations. The differences between the incompressible and compressible fields are brought out by the examination of spectra, statistical moments, and structure of the rate of strain tensor.

  11. Einstein billiards and spatially homogeneous cosmological models

    NASA Astrophysics Data System (ADS)

    de Buyl, Sophie; Pinardi, Gaïa; Schomblond, Christiane

    2003-12-01

    In this paper, we analyse the Einstein and Einstein-Maxwell billiards for all spatially homogeneous cosmological models corresponding to three- and four-dimensional real unimodular Lie algebras and provide a list of those models which are chaotic in the Belinskii, Khalatnikov and Lifschitz (BKL) limit. Through the billiard picture, we confirm that, in D = 5 spacetime dimensions, chaos is present if off-diagonal metric elements are kept: the finite-volume billiards can be identified with the fundamental Weyl chambers of hyperbolic Kac-Moody algebras. The most generic cases bring in the same algebras as in the inhomogeneous case, but other algebras appear through special initial conditions.

  12. Homogeneous system UTBLI for 1964 - 1986.

    NASA Astrophysics Data System (ADS)

    Jovanović, B.; Durović, L.; Jovanović, M.

    1993-09-01

    Homogeneous results of universal time determinations derived from the observations by the Transit Instrument of Belgrade Astronomical Observatory (BLI) for the interval 1964 - 1986 are presented. They were prepared in accordance with IERS standards and listed in a table. In addition, using the smoothed values of monthly averaged UT1BLI-UT1BIH, an analysis on the variation of the local system UT1BLI is carried out, and also, systematic deviations after the adopted BIH model are shown. Undoubtedly, there exists a significant 11 - 14 year periodic change of UT1BLI system.

  13. Isotropic homogeneous universe with viscous fluid

    SciTech Connect

    Santos, N.O.; Dias, R.S.; Banerjee, A.

    1985-04-01

    Exact solutions are obtained for the isotropic homogeneous cosmological model with viscous fluid. The fluid has only bulk viscosity and the viscosity coefficient is taken to be a power function of the mass density. The equation of state assumed obeys a linear relation between mass density and pressure. The models satisfying Hawking's energy conditions are discussed. Murphy's model is only a special case of this general set of solutions and it is shown that Murphy's conclusion that the introduction of bulk viscosity can avoid the occurrence of a space-time singularity at finite past is not, in general, valid.

  14. Mirror Symmetry for Quasi-Homogeneous Singularities

    NASA Astrophysics Data System (ADS)

    Rathnakumara, Himal; Jarvis, Tyler

    2008-10-01

    I will present an introduction to mirror symmetry in the context of string theory. Then I will describe an instance of mirror symmetry for singularities defined by quasi-homogeneous polynomials in weighted projective spaces. Milnor rings and the FJRW (Fan-Jarvis-Ruan-Witten) rings associated with these singularities and their relation to the Landau-Ginzburg A model and the Landau-Ginzburg B model will be explained. Results of the calculations for certain singularities for which the mirror symmetry conjecture has been verified will be presented.

  15. Heterogeneity versus homogeneity of multiple sclerosis

    PubMed Central

    Sato, Fumitaka; Martinez, Nicholas E; Omura, Seiichi; Tsunoda, Ikuo

    2011-01-01

    The 10th International Congress of Neuroimmunology, including the 10th European School of Neuroimmunology Course, was held by the International Society of Neuroimmunology in Sitges (Barcelona, Spain) on 26–30 October 2010. The conference covered a wide spectrum of issues and challenges in both basic science and clinical aspects of neuroimmunology. Data and ideas were shared through a variety of programs, including review talks and poster sessions. One of the topics of the congress was whether multiple sclerosis is a homogenous or heterogenous disease, clinically and pathologically, throughout its course. PMID:21426254

  16. Local structures of homogeneous Hall MHD turbulence

    NASA Astrophysics Data System (ADS)

    Miura, H.; Araki, K.

    2011-12-01

    Local structures of decaying homogeneous and isotropic Hall MHD turbulence are studied by means of direct numerical simulations. Regions of strong vorticity and strong current density in Hall MHD turbulence are compared to those of single-fluid MHD turbulence. An analysis by the use of a low-pass filter reveals that the introduction of the Hall term can modify not only small-scale structures of the current density but also structures of the vorticity field, especially at the scales smaller than the ion skin depth.

  17. Exploring Earthquake Databases for the Creation of Magnitude-Homogeneous Catalogues: Tools for Application on a Regional and Global Scale

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Pagani, M.; Garcia, J.

    2016-06-01

    The creation of a magnitude-homogenised catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenising multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins, and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Center (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilise this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonise magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonised into moment-magnitude to form a catalogue of more than 562,840 events. This extended catalogue, whilst not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
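
    The empirical conversion models such tools build are typically simple regressions between pairs of magnitude estimates for events common to two bulletins. A minimal sketch of the idea (with synthetic mb-Mw pairs and an ordinary least-squares fit; the data and resulting coefficients are invented, not the project's actual calibration):

```python
# Sketch of harmonizing a native magnitude scale (here body-wave magnitude
# mb) to moment magnitude Mw via ordinary least squares on paired events.
# The event pairs and the fitted coefficients are synthetic illustrations.

def ols_fit(x, y):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

# Synthetic paired magnitudes for events common to two bulletins
mb = [4.5, 5.0, 5.5, 6.0, 6.5]
mw = [4.6, 5.2, 5.8, 6.4, 7.0]
a, b = ols_fit(mb, mw)

def mb_to_mw(m):
    """Convert a native-scale magnitude to the harmonized target scale."""
    return a * m + b

print(round(a, 3), round(b, 3), round(mb_to_mw(5.2), 3))
```

    In practice such fits are often done with errors-in-variables (orthogonal) regression rather than plain OLS, since both magnitude estimates carry uncertainty; the structure of the conversion step is the same.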

  18. Homogeneous and hypersurface-homogeneous shear-free perfect fluids ingeneral relativity.

    NASA Astrophysics Data System (ADS)

    Collins, C. B.

    1988-08-01

    Shear-free, general-relativistic perfect fluids are investigated in the case where they are either homogeneous or hypersurface-homogeneous (and, in particular, spatially homogeneous). It is assumed that the energy density μ and the pressure p of the fluid are related by a barotropic equation of state p = p(μ), where μ + p ≠ 0. Under such circumstances, it follows that either the fluid's volume expansion rate θ or the fluid's vorticity (i.e., rotation) ω must vanish. In the homogeneous case, this leads to only two possibilities: either ω = θ = 0 (the Einstein static solution), or ω ≠ 0, θ = 0 (the Gödel solution). In the hypersurface-homogeneous case, the situation is more complicated: either ω = 0, θ ≠ 0 (as exemplified, inter alia, by the Friedmann-Robertson-Walker models), or ω ≠ 0, θ = 0 (which pertains, for example, in general stationary cylindrically symmetric fluids with rigid rotation), or ω = θ = 0 (as occurs for static spherically symmetric solutions). Each possibility is further subdivided in an invariant way, and related to the studies of other authors, thereby unifying and extending these earlier works.

  19. Plasma And Beam Homogeneity Of The RF-Driven Negative Hydrogen Ion Source For ITER NBI

    SciTech Connect

    Fantz, U.; Franzen, P.; Kraus, W.; Wuenderlich, D.; Gutser, R.; Berger, M.

    2009-03-12

    The neutral beam injection (NBI) system of ITER is based on a large RF driven negative hydrogen ion source. For good beam transmission ITER requires a beam homogeneity of better than 10%. The plasma uniformity and the correlation with the beam homogeneity are being investigated at the prototype ion sources at IPP. Detailed studies are carried out at the long pulse test facility MANITU with a source of roughly 1/8 of the ITER source size. The plasma homogeneity close to the plasma grid is measured by optical emission spectroscopy and by fixed Langmuir probes working in the ion saturation region. The beam homogeneity is measured with a spatially resolved Hα Doppler-shifted beam spectroscopy system. The plasma top-to-bottom symmetry improves with increasing RF power and increasing bias voltage, which is applied to suppress the co-extracted electron current. The symmetry is better in deuterium than in hydrogen. The boundary layer near the plasma grid determines the plasma symmetry. At high ion currents with a low amount of co-extracted electrons the plasma is symmetrical and the beam homogeneity is typically 5-10% (RMS). The size scaling and the influence of the magnetic field strength of the filter field created by a plasma grid current is studied at the test facility RADI (roughly a 1/2 size ITER source) at ITER relevant RF power levels. In volume operation in deuterium (non-cesiated source), the plasma illumination of the grid is satisfying.

  20. Bio-inspired homogeneous multi-scale place recognition.

    PubMed

    Chen, Zetao; Lowry, Stephanie; Jacobson, Adam; Hasselmo, Michael E; Milford, Michael

    2015-12-01

    Robotic mapping and localization systems typically operate at either one fixed spatial scale, or over two, combining a local metric map and a global topological map. In contrast, recent high profile discoveries in neuroscience have indicated that animals such as rodents navigate the world using multiple parallel maps, with each map encoding the world at a specific spatial scale. While a number of theoretical-only investigations have hypothesized several possible benefits of such a multi-scale mapping system, no one has comprehensively investigated the potential mapping and place recognition performance benefits for navigating robots in large real world environments, especially using more than two homogeneous map scales. In this paper we present a biologically-inspired multi-scale mapping system mimicking the rodent multi-scale map. Unlike hybrid metric-topological multi-scale robot mapping systems, this new system is homogeneous, distinguishable only by scale, like rodent neural maps. We present methods for training each network to learn and recognize places at a specific spatial scale, and techniques for combining the output from each of these parallel networks. This approach differs from traditional probabilistic robotic methods, where place recognition spatial specificity is passively driven by models of sensor uncertainty. Instead we intentionally create parallel learning systems that learn associations between sensory input and the environment at different spatial scales. We also conduct a systematic series of experiments and parameter studies that determine the effect on performance of using different neural map scaling ratios and different numbers of discrete map scales. The results demonstrate that a multi-scale approach universally improves place recognition performance and is capable of producing better than state of the art performance compared to existing robotic navigation algorithms. 
We analyze the results and discuss the implications with respect to
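
    The multi-scale mapping architecture described above can be caricatured in a few lines: the same sensory "view" is compared against stored places at several spatial scales, and the per-scale similarities are combined into one recognition decision. Everything below (the 1-D signals, box-filter scales, and similarity measure) is an invented toy, not the paper's neural network model:

```python
# Toy sketch of homogeneous multi-scale place recognition: one 1-D "view"
# is matched against stored places at several smoothing scales, and the
# per-scale similarity scores are summed. All data and scale choices are
# invented illustrations, not the paper's system.

def smooth(signal, width):
    """Box-filter a 1-D signal; width sets the spatial scale of the map."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - width), min(n, i + width + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def similarity(a, b):
    """Negative mean absolute difference (higher = more similar)."""
    return -sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def recognize(view, places, scales=(0, 2, 5)):
    """Score each stored place by summing similarity across all scales."""
    best, best_score = None, float("-inf")
    for name, stored in places.items():
        score = sum(similarity(smooth(view, w), smooth(stored, w))
                    for w in scales)
        if score > best_score:
            best, best_score = name, score
    return best

places = {
    "corridor": [0, 0, 1, 1, 0, 0, 1, 1, 0, 0],
    "atrium":   [1, 1, 1, 0, 0, 0, 0, 1, 1, 1],
}
noisy_view = [0, 0.1, 0.9, 1, 0, 0.1, 1, 0.9, 0, 0]  # corridor plus noise
print(recognize(noisy_view, places))
```

    Coarse scales tolerate viewpoint shift while fine scales disambiguate similar places, which is the intuition behind combining parallel homogeneous maps rather than a single metric-topological pair.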

  1. Journaling: creating space for "I".

    PubMed

    Charles, Jennell P

    2010-01-01

    As nurses engaged in a caring profession, it is critical that we learn not only to care for others but also to care for ourselves. To care effectively for ourselves, we must create the space and time in which to do this. Journaling is one tool that scholars offer as a way to create this space. Although there is no clear consensus about the best techniques for journaling, there is evidence that journaling, as a reflective, meditative activity, can promote creativity, self-awareness, and personal development. PMID:21140872

  2. The Statistical Mechanics of Ideal Homogeneous Turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    2002-01-01

    Plasmas, such as those found in the space environment or in plasma confinement devices, are often modeled as electrically conducting fluids. When fluids and plasmas are energetically stirred, regions of highly nonlinear, chaotic behavior known as turbulence arise. Understanding the fundamental nature of turbulence is a long-standing theoretical challenge. The present work describes a statistical theory concerning a certain class of nonlinear, finite dimensional, dynamical models of turbulence. These models arise when the partial differential equations describing incompressible, ideal (i.e., nondissipative) homogeneous fluid and magnetofluid (i.e., plasma) turbulence are Fourier transformed into a very large set of ordinary differential equations. These equations define a divergenceless flow in a high-dimensional phase space, which allows for the existence of a Liouville theorem, guaranteeing a distribution function based on constants of the motion (integral invariants). The novelty of these particular dynamical systems is that there are integral invariants other than the energy, and that some of these invariants behave like pseudoscalars under two of the discrete symmetry transformations of physics: parity and charge conjugation. In this work the 'rugged invariants' of ideal homogeneous turbulence are shown to be the only significant scalar and pseudoscalar invariants. The discovery that pseudoscalar invariants cause symmetries of the original equations to be dynamically broken and induce a nonergodic structure on the associated phase space is the primary result presented here. Applicability of this result to dissipative turbulence is also discussed.

  3. Homogenization in micro-magneto-mechanics

    NASA Astrophysics Data System (ADS)

    Sridhar, A.; Keip, M.-A.; Miehe, C.

    2016-07-01

    Ferromagnetic materials are characterized by a heterogeneous micro-structure that can be altered by external magnetic and mechanical stimuli. The understanding and the description of the micro-structure evolution is of particular importance for the design and the analysis of smart materials with magneto-mechanical coupling. The macroscopic response of the material results from complex magneto-mechanical interactions occurring on smaller length scales, which are driven by magnetization reorientation and associated magnetic domain wall motions. The aim of this work is to directly base the description of the macroscopic magneto-mechanical material behavior on the micro-magnetic domain evolution. This will be realized by the incorporation of a ferromagnetic phase-field formulation into a macroscopic Boltzmann continuum by the use of computational homogenization. The transition conditions between the two scales are obtained via rigorous exploitation of rate-type and incremental variational principles, which incorporate an extended version of the classical Hill-Mandel macro-homogeneity condition covering the phase field on the micro-scale. An efficient two-scale computational scenario is developed based on an operator splitting scheme that includes a predictor for the magnetization on the micro-scale. Two- and three-dimensional numerical simulations demonstrate the performance of the method. They investigate micro-magnetic domain evolution driven by macroscopic fields as well as the associated overall hysteretic response of ferromagnetic solids.

  4. Emergence of Leadership within a Homogeneous Group

    PubMed Central

    Eskridge, Brent E.; Valle, Elizabeth; Schlupp, Ingo

    2015-01-01

    Large scale coordination without dominant, consistent leadership is frequent in nature. How individuals emerge from within the group as leaders, however transitory this position may be, has become an increasingly common question asked. This question is further complicated by the fact that in many of these aggregations, differences between individuals are minor and the group is largely considered to be homogeneous. In the simulations presented here, we investigate the emergence of leadership in the extreme situation in which all individuals are initially identical. Using a mathematical model developed using observations of natural systems, we show that the addition of a simple concept of leadership tendencies which is inspired by observations of natural systems and is affected by experience can produce distinct leaders and followers using a nonlinear feedback loop. Most importantly, our results show that small differences in experience can promote the rapid emergence of stable roles for leaders and followers. Our findings have implications for our understanding of adaptive behaviors in initially homogeneous groups, the role experience can play in shaping leadership tendencies, and the use of self-assessment in adapting behavior and, ultimately, self-role-assignment. PMID:26226381

  6. Si isotope homogeneity of the solar nebula

    SciTech Connect

    Pringle, Emily A.; Savage, Paul S.; Moynier, Frédéric; Jackson, Matthew G.; Barrat, Jean-Alix

    2013-12-20

    The presence or absence of variations in the mass-independent abundances of Si isotopes in bulk meteorites provides important clues concerning the evolution of the early solar system. No Si isotopic anomalies have been found within the level of analytical precision of 15 ppm in 29Si/28Si across a wide range of inner solar system materials, including terrestrial basalts, chondrites, and achondrites. A possible exception is the angrites, which may exhibit small excesses of 29Si. However, the general absence of anomalies suggests that primitive meteorites and differentiated planetesimals formed in a reservoir that was isotopically homogeneous with respect to Si. Furthermore, the lack of resolvable anomalies in the calcium-aluminum-rich inclusion measured here suggests that any nucleosynthetic anomalies in Si isotopes were erased through mixing in the solar nebula prior to the formation of refractory solids. The homogeneity exhibited by Si isotopes may have implications for the distribution of Mg isotopes in the solar nebula. Based on supernova nucleosynthetic yield calculations, the expected magnitude of heavy-isotope overabundance is larger for Si than for Mg, suggesting that any potential Mg heterogeneity, if present, exists below the 15 ppm level.

  7. On shearing fluids with homogeneous densities

    NASA Astrophysics Data System (ADS)

    Srivastava, D. C.; Srivastava, V. C.; Kumar, Rajesh

    2016-06-01

    In this paper, we study shearing spherically symmetric homogeneous density fluids in comoving coordinates. It is found that the expansion of the four-velocity of a perfect fluid is homogeneous, whereas its shear is generated by an arbitrary function of time M( t), related to the mass function of the distribution. This function is found to bear a functional relationship with density. The field equations are reduced to two coupled first order ordinary differential equations for the metric coefficients g_{11} and g_{22}. We have explored a class of solutions assuming that M is a linear function of the density. This class embodies, as a subcase, the complete class of shear-free solutions. We have discussed the oft-quoted work of Kustaanheimo (Comment Phys Math XIII:12, 1, 1947) and have noted that it deals with shear-free fluids having anisotropic pressure. It is shown that the anisotropy of the fluid is characterized by an arbitrary function of time. We have discussed some issues of historical priorities and credentials related to shear-free solutions. Recent controversial claims by Mitra (Astrophys Space Sci 333:351, 2011 and Gravit Cosmol 18:17, 2012) have also been addressed. We found that the singularity and the shearing motion of the fluid are closely related. Hence, there is a need for a fresh look at the solutions obtained earlier in comoving coordinates.

  8. Role of structural barriers for carotenoid bioaccessibility upon high pressure homogenization.

    PubMed

    Palmero, Paola; Panozzo, Agnese; Colle, Ines; Chigwedere, Claire; Hendrickx, Marc; Van Loey, Ann

    2016-05-15

    A specific approach to investigate the effect of high pressure homogenization on the carotenoid bioaccessibility in tomato-based products was developed. Six different tomato-based model systems were reconstituted in order to target the specific role of the natural structural barriers (chromoplast substructure/cell wall) and of the phases (soluble/insoluble) in determining the carotenoid bioaccessibility and viscosity changes upon high pressure homogenization. Results indicated that in the absence of natural structural barriers (carotenoid-enriched oil), the soluble and insoluble phases determined the carotenoid bioaccessibility upon processing, whereas in their presence these barriers governed the bioaccessibility. Furthermore, it was shown that the increase in viscosity upon high pressure homogenization is determined by the presence of the insoluble phase; however, this result was related to the initial ratio of the soluble to insoluble phases in the system. In addition, no relationship between the changes in viscosity and carotenoid bioaccessibility upon high pressure homogenization was found. PMID:26775991

  9. A Model for an Object Created

    NASA Astrophysics Data System (ADS)

    Jeong, Hyeok-Je

    2006-04-01

    Before going into the model treated here, it is necessary to understand the nature of energy. Energy itself is active and constantly moves, which results in the phenomenon of energy spread. The phenomenon of energy spread is governed by the law of energy conservation, and confining energy requires additional energy. Suppose energies were gathered for some reason. The creation of an object is the result of the gathered energy and of energy spread. In the case where a new object is more stable, after some fluctuation, energy leaves the object so that the new object remains behind. For this, an energy E larger than the sum of the energy barrier Eb and the difference dE between the energy state of the object and the initial energy state is required: E > Eb + dE. Thus, a new object is created; it is an irreversible process. Adaptation is a sort of creation with no energy barrier. In the case where there is no energy source near the object, the created object is relatively inactive; this is matter. To reduce the increased energy state due to gravitation, matter gathers. In the case where there is an energy source near the matter, a new object can be created around or within it. The created object will be active; this is life.

  10. Creating an organizational climate for multiculturalism.

    PubMed

    Bruhn, J G

    1996-06-01

    Multiculturalism is an ideal goal for our society, its organizations, and its institutions, involving a continuous process of education and change within organizations. Multiculturalism begins with diversity and requires various steps to achieve changes in attitudes, behaviors, and values. The leadership of organizations must not only commit to diversification but also participate in it and reward its efforts. Diversification should be managed by creating a climate of open participation, feedback, and control at the lower organizational levels. Micromanaging the process of becoming diverse increases resistance and paranoia and counters educational efforts. PMID:10157003

  11. Creating a New Professional Association

    ERIC Educational Resources Information Center

    Journal of College Reading and Learning, 2009

    2009-01-01

    This position paper investigates the merits and potential benefits of creating a new, more comprehensive professional association for members of the learning assistance and developmental education profession. This was the task assigned to the College Reading and Learning Association/National Association for Developmental Education (CRLA/NADE)…

  12. Creating Three-Dimensional Scenes

    ERIC Educational Resources Information Center

    Krumpe, Norm

    2005-01-01

    Persistence of Vision Raytracer (POV-Ray), a free computer program for creating photo-realistic, three-dimensional scenes and a link for Mathematica users interested in generating POV-Ray files from within Mathematica, is discussed. POV-Ray has great potential in secondary mathematics classrooms and helps in strengthening students' visualization…

  13. Creating Frameworks for Reflective Teaching

    ERIC Educational Resources Information Center

    Carter, Margie

    2007-01-01

    The task of creating organizational policies and systems that promote and support reflective teaching is multifaceted and seldom enumerated in early childhood professional literature. One of the best overviews the author has found comes from Carol Brunson Phillips and Sue Bredekamp (1998). The author opines that if the early childhood profession…

  14. Creating an Innovative Learning Organization

    ERIC Educational Resources Information Center

    Salisbury, Mark

    2010-01-01

    This article describes how to create an innovative learning (iLearning) organization. It begins by discussing the life cycle of knowledge in an organization, followed by a description of the theoretical foundation for iLearning. Next, the article presents an example of iLearning, followed by a description of the distributed nature of work, the…

  15. Creating Presentations on ICT Classes

    ERIC Educational Resources Information Center

    Marchis, Iuliana

    2010-01-01

    The article focuses on the creation of presentations on ICT classes. The first part highlights the most important steps when creating a presentation. The main idea is, that the computer presentation shouldn't consist only from the technological part, i.e. the editing of the presentation in a computer program. There are many steps before and after…

  16. Creating Highlander Wherever You Are

    ERIC Educational Resources Information Center

    Williams, Susan; Mullett, Cathy

    2016-01-01

    Highlander Research and Education Center serves as a catalyst for grassroots organizing and movement building. This article focuses on an interview with education coordinator Susan Williams who has worked at Highlander for 26 years. We discuss how others can and do create powerful popular education experiences anywhere, whether they have a…

  17. Can Children Really Create Knowledge?

    ERIC Educational Resources Information Center

    Bereiter, Carl; Scardamalia, Marlene

    2010-01-01

    Can children genuinely create new knowledge, as opposed to merely carrying out activities that resemble those of mature scientists and innovators? The answer is yes, provided the comparison is not to works of genius but to standards that prevail in ordinary research communities. One important product of knowledge creation is concepts and tools…

  18. Creating Space for Children's Literature

    ERIC Educational Resources Information Center

    Serafini, Frank

    2011-01-01

    As teachers struggle to balance the needs of their students with the requirements of commercial reading materials, educators need to consider how teachers will create space for children's literature in today's classrooms. In this article, 10 practical recommendations for incorporating children's literature in the reading instructional framework…

  19. Creating Time for Equity Together

    ERIC Educational Resources Information Center

    Renée, Michelle

    2015-01-01

    In urban communities across the nation, a broad range of partners have committed to reinventing educational time together to ensure equitable access to rich learning opportunities for all young people. Education partners are using their creativity, commitment, and unique resources to create new school and system designs that…

  20. Creating a Global Perspective Campus

    ERIC Educational Resources Information Center

    Braskamp, Larry A.

    2011-01-01

    The author has written this Guidebook to assist users interested in creating a campus that will be more global in its mission, programs, and people. His approach is to focus on the views and contributions of the people who are engaged in higher education. Thus it has a "person" emphasis rather than a structural or policy point of view. The author…

  1. Creating a Culture of Leadership.

    ERIC Educational Resources Information Center

    Lewis, Phyllis H.

    1994-01-01

    In financially troubled times, the college or university must develop a culture of leadership. Leadership development programming can strengthen the institution by fostering a team approach to solving institutional problems, by increasing the effectiveness and efficiency of human resources, and by creating a pool of qualified professionals for…

  2. Concordance and discordance between taxonomic and functional homogenization: responses of soil mite assemblages to forest conversion.

    PubMed

    Mori, Akira S; Ota, Aino T; Fujii, Saori; Seino, Tatsuyuki; Kabeya, Daisuke; Okamoto, Toru; Ito, Masamichi T; Kaneko, Nobuhiro; Hasegawa, Motohiro

    2015-10-01

    The compositional characteristics of ecological assemblages are often simplified; this process is termed "biotic homogenization." This process of biological reorganization occurs not only taxonomically but also functionally. Testing both aspects of homogenization is essential when ecosystem functioning, which is supported by a diverse mosaic of functional traits in the landscape, is of concern. Here, we aimed to infer the underlying processes of taxonomic/functional homogenization at the local scale, which is a scale that is meaningful for this research question. We recorded species of litter-dwelling oribatid mites along a gradient of forest conversion from a natural forest to a monoculture larch plantation in Japan (in total 11 stands), and collected data on the functional traits of the recorded species to quantify functional diversity. We calculated the taxonomic and functional β-diversity, an index of biotic homogenization. We found that both the taxonomic and functional β-diversity decreased with larch dominance (stand homogenization). After further deconstructing β-diversity into the components of turnover and nestedness, which reflect different processes of community organization, a significant decrease in the response to larch dominance was observed only for the functional turnover. As a result, there was a steeper decline in the functional β-diversity than the taxonomic β-diversity. This discordance between the taxonomic and functional response suggests that species replacement occurs between species that are functionally redundant under environmental homogenization, ultimately leading to the stronger homogenization of functional diversity. The insights gained from community organization of oribatid mites suggest that the functional characteristics of local assemblages, which support the functionality of ecosystems, deserve greater attention in human-dominated forest landscapes. PMID:26001603
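The deconstruction of β-diversity into turnover and nestedness components can be illustrated with the pairwise Baselga-style partition of Sørensen dissimilarity; the study itself uses multi-site indices, and the species names here are placeholders:

```python
def partition_beta(site1, site2):
    """Pairwise Baselga-style partition of Sorensen dissimilarity into a
    turnover (Simpson) component and a nestedness-resultant component."""
    s1, s2 = set(site1), set(site2)
    a = len(s1 & s2)                          # species shared by both sites
    b, c = len(s1 - s2), len(s2 - s1)         # species unique to each site
    beta_sor = (b + c) / (2 * a + b + c)      # total dissimilarity
    beta_sim = min(b, c) / (a + min(b, c))    # turnover component
    beta_sne = beta_sor - beta_sim            # nestedness component
    return beta_sor, beta_sim, beta_sne

# Perfectly nested assemblages: all dissimilarity is due to nestedness
sor, sim, sne = partition_beta(["sp1", "sp2", "sp3", "sp4"], ["sp1", "sp2"])
```

Here the turnover component is zero and the nestedness term carries all of the dissimilarity; genuine species replacement between sites would instead appear in the turnover component, which is the distinction the abstract's analysis relies on.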

  3. Absorbing metasurface created by diffractionless disordered arrays of nanoantennas

    SciTech Connect

    Chevalier, Paul; Bouchon, Patrick; Jaeck, Julien; Lauwick, Diane; Kattnig, Alain; Bardou, Nathalie; Pardo, Fabrice; Haïdar, Riad

    2015-12-21

    We study disordered arrays of metal-insulator-metal nanoantennas in order to create a diffractionless metasurface able to absorb light in the 3–5 μm spectral range. This study is conducted with angle-resolved reflectivity measurements obtained with a Fourier transform infrared spectrometer. A first design is based on a perturbation of a periodic arrangement, leading to a significant reduction of the radiative losses. Then, a random assembly of nanoantennas is built following a Poisson-disk distribution of given density, in order to obtain a nearly perfect cluttered assembly with optical properties of a homogeneous material.

  4. Absorbing metasurface created by diffractionless disordered arrays of nanoantennas

    NASA Astrophysics Data System (ADS)

    Chevalier, Paul; Bouchon, Patrick; Jaeck, Julien; Lauwick, Diane; Bardou, Nathalie; Kattnig, Alain; Pardo, Fabrice; Haïdar, Riad

    2015-12-01

    We study disordered arrays of metal-insulator-metal nanoantennas in order to create a diffractionless metasurface able to absorb light in the 3-5 μm spectral range. This study is conducted with angle-resolved reflectivity measurements obtained with a Fourier transform infrared spectrometer. A first design is based on a perturbation of a periodic arrangement, leading to a significant reduction of the radiative losses. Then, a random assembly of nanoantennas is built following a Poisson-disk distribution of given density, in order to obtain a nearly perfect cluttered assembly with optical properties of a homogeneous material.

  5. Can you help create the next generation of Land Surface Air Temperature products?

    NASA Astrophysics Data System (ADS)

    Thorne, Peter; Venema, Victor

    2013-04-01

    The International Surface Temperature Initiative comprises a group of multi-disciplinary researchers constituted in 2010 with the remit of creating a suite of open, transparent Land Surface Air Temperature products suitable for meeting 21st Century science and societal needs and expectations. Since its inception, significant progress has been made in the creation of an improved set of 'raw' Land Surface Air Temperature data holdings (to be released in first version in February 2013), comprising in excess of 30,000 stations, many going back over a century, and towards the creation of a rigorous benchmarking framework. What is now requested is that multiple independent groups take up the challenge of creating global and regional products from the databank and submit their algorithms to the benchmarking framework. Key here is to rigorously assess structural uncertainty - it is not sufficient to assume that because one group has tackled the problem it is in any meaningful sense mission accomplished. There undoubtedly exist a myriad of issues in the raw data, and it is of vital importance to see how sensitive data homogenization is to the set of processing choices independent groups will undertake. This uncertainty will almost certainly be larger at the station or regional level - yet as we move into the 21st Century it is these scales that are of increasing import to end users. It is essential that we serve the right data in the right way with the correct caveats. This can only be achieved if a sufficient number of groups take up the challenge of creating new products from the raw databank. This poster will outline progress to date in the creation of the databank and global benchmarks and outline how investigators and groups can now get involved in creating products from the databank and participate in the benchmarking exercise. Further details upon the Initiative and its aims can be found at www.surfacetemperatures.org and http://surfacetemperatures.blogspot.com/
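The homogenization step that such benchmarking exercises stress-test can be sketched with a minimal single-breakpoint statistic in the spirit of the standard normal homogeneity test (SNHT). This is a simplified illustration, not the Initiative's benchmarking code; real homogenization works on difference series against reference stations and applies significance thresholds:

```python
def snht_break(series):
    """Minimal single-breakpoint SNHT-style statistic: standardize the
    series, then find the split k maximizing T(k) = k*z1^2 + (n-k)*z2^2,
    where z1, z2 are the mean standardized values before/after the split."""
    n = len(series)
    mean = sum(series) / n
    sd = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    z = [(x - mean) / sd for x in series]
    best_k, best_t = 1, 0.0
    for k in range(1, n):
        z1 = sum(z[:k]) / k
        z2 = sum(z[k:]) / (n - k)
        t = k * z1 ** 2 + (n - k) * z2 ** 2
        if t > best_t:
            best_k, best_t = k, t
    return best_k, best_t

# Synthetic annual anomaly series with an artificial 1.0-unit shift
data = [0.0] * 30 + [1.0] * 30
k, t = snht_break(data)
```

On this synthetic series the statistic peaks exactly at the inserted break; the structural uncertainty the abstract emphasizes comes from the many defensible variations of this step (test choice, references, thresholds) across independent groups.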

  6. Creating false memories for visual scenes.

    PubMed

    Miller, M B; Gazzaniga, M S

    1998-06-01

    Creating false memories has become an important tool to investigate the processes underlying true memories. In the course of investigating the constructive and/or reconstructive processes underlying the formation of false memories, it has become clear that paradigms are needed that can create false memories reliably in a variety of laboratory settings. In particular, neuroimaging techniques impose certain constraints, in terms of subject response and timing of stimuli, with which a false memory paradigm needs to comply. We have developed a picture paradigm in which scene items that did not occur are falsely recognized almost as often as items that did occur are truly recognized. It uses a single presentation of pictures with thematic, stereotypical scenes (e.g. a beach scene). Some of the exemplars from the scene were removed (e.g. a beach ball) and used as lures during an auditory recognition test. Subjects' performance on this paradigm was compared with their performance on the word paradigm reintroduced by Roediger and McDermott. The word paradigm has been useful in creating false memories in several neuroimaging studies because of the high frequency of false recognition for critical lures (words not presented but closely associated with lists of words that were presented) and the strong subjective sense of remembering accompanying these false recognitions. However, it has several limitations, including small numbers of lures and a particular source confusion. The picture paradigm avoids these limitations and produces identical effects on normal subjects. PMID:9705061

  7. Photonic crystal waveguide created by selective infiltration

    NASA Astrophysics Data System (ADS)

    Casas Bedoya, A.; Domachuk, P.; Grillet, C.; Monat, C.; Mägi, E. C.; Li, E.; Eggleton, B. J.

    2012-06-01

    The marriage of photonics and microfluidics ("optofluidics") uses the inherent mobility of fluids to reversibly tune photonic structures beyond traditional fabrication methods by infiltrating voids in said structures. Photonic crystals (PhCs) strongly control light on the wavelength scale and are well suited to optofluidic tuning because their periodic airhole microstructure is a natural candidate for housing liquids. The infiltration of a single row of holes in the PhC matrix modifies the effective refractive index, allowing optical modes to be guided by the PhC bandgap. In this work we present the first experimental demonstration of a reconfigurable single-mode W1 photonic crystal defect waveguide created by selective liquid infiltration. We modified a hexagonal silicon planar photonic crystal membrane by selectively filling a single row of air holes with ~300 nm resolution, using a high-refractive-index ionic liquid. The modification creates optical confinement in the infiltrated region and allows propagation of a single optical waveguide mode. We describe the challenges arising from the infiltration process and the liquid/solid surface interaction in the photonic crystal. We include a detailed comparison between analytic and numerical modeling and experimental results, and introduce a new approach to create an offset photonic crystal cavity by varying the nature of the selective infiltration process.

  8. Nanodosimetric track structure in homogeneous extended beams.

    PubMed

    Conte, V; Moro, D; Colautti, P; Grosswendt, B

    2015-09-01

    Physical aspects of particle track structure are important in determining the induction of clustered damage in relevant subcellular structures like the DNA and higher-order genomic structures. The direct measurement of track-structure properties of ionising radiation is feasible today by counting the number of ionisations produced inside a small gas volume. In particular, the so-called track-nanodosimeter, installed at the TANDEM-ALPI accelerator complex of LNL, measures ionisation cluster-size distributions in a simulated subcellular structure of dimensions 20 nm, corresponding approximately to the diameter of the chromatin fibre. The target volume is irradiated by pencil beams of primary particles passing at specified impact parameter. To directly relate these measured track-structure data to radiobiological measurements performed in broad homogeneous particle beams, these data can be integrated over the impact parameter. This procedure was successfully applied to 240 MeV carbon ions and compared with Monte Carlo simulations for extended fields. PMID:25848108
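The integration over impact parameter described above can be sketched as a 2πb-weighted average of pencil-beam results over a uniformly irradiated disc; the exponential fall-off model and all parameter values below are illustrative assumptions, not measured data:

```python
import math

def broad_beam_mean(b_values, mean_cluster_size, b_max):
    """Approximate a broad-beam mean ionisation cluster size by weighting
    pencil-beam results at impact parameter b with 2*pi*b db (uniform
    fluence over a disc of radius b_max), using the trapezoidal rule."""
    def integrand(b):
        return 2.0 * math.pi * b * mean_cluster_size(b)
    total = 0.0
    for b0, b1 in zip(b_values, b_values[1:]):
        total += 0.5 * (integrand(b0) + integrand(b1)) * (b1 - b0)
    return total / (math.pi * b_max ** 2)

# Illustrative fall-off: mean cluster size decays exponentially with b (nm)
bs = [0.5 * i for i in range(41)]                       # 0 to 20 nm
m = broad_beam_mean(bs, lambda b: 5.0 * math.exp(-b / 5.0), b_max=20.0)
```

The 2πb weight reflects that an annulus at larger impact parameter contributes proportionally more tracks in a homogeneous extended field, which is why the disc-averaged value lands well below the on-axis pencil-beam result.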

  9. Homogeneously dispersed multimetal oxygen-evolving catalysts.

    PubMed

    Zhang, Bo; Zheng, Xueli; Voznyy, Oleksandr; Comin, Riccardo; Bajdich, Michal; García-Melchor, Max; Han, Lili; Xu, Jixian; Liu, Min; Zheng, Lirong; García de Arquer, F Pelayo; Dinh, Cao Thang; Fan, Fengjia; Yuan, Mingjian; Yassitepe, Emre; Chen, Ning; Regier, Tom; Liu, Pengfei; Li, Yuhang; De Luna, Phil; Janmohamed, Alyf; Xin, Huolin L; Yang, Huagui; Vojvodic, Aleksandra; Sargent, Edward H

    2016-04-15

    Earth-abundant first-row (3d) transition metal-based catalysts have been developed for the oxygen-evolution reaction (OER); however, they operate at overpotentials substantially above thermodynamic requirements. Density functional theory suggested that non-3d high-valency metals such as tungsten can modulate 3d metal oxides, providing near-optimal adsorption energies for OER intermediates. We developed a room-temperature synthesis to produce gelled oxyhydroxide materials with an atomically homogeneous metal distribution. These gelled FeCoW oxyhydroxides exhibit the lowest overpotential (191 millivolts) reported at 10 milliamperes per square centimeter in alkaline electrolyte. The catalyst shows no evidence of degradation after more than 500 hours of operation. X-ray absorption and computational studies reveal a synergistic interplay between tungsten, iron, and cobalt in producing a favorable local coordination environment and electronic structure that enhance the energetics for OER. PMID:27013427

  10. The homogeneity conjecture for supergravity backgrounds

    NASA Astrophysics Data System (ADS)

    Figueroa-O'Farrill, José Miguel

    2009-06-01

    These notes record three lectures given at the workshop "Higher symmetries in Physics", held at the Universidad Complutense de Madrid in November 2008. In them we explain how to construct a Lie (super)algebra associated to a spin manifold, perhaps with extra geometric data, and a notion of privileged spinors. The typical examples are supersymmetric supergravity backgrounds; although there are more classical instances of this construction. We focus on two results: the geometric constructions of compact real forms of the simple Lie algebras of type B4, F4 and E8 from S7, S8 and S15, respectively; and the construction of the Killing superalgebra of eleven-dimensional supergravity backgrounds. As an application of this latter construction we show that supersymmetric supergravity backgrounds with enough supersymmetry are necessarily locally homogeneous.

  11. RF Spectroscopy on a Homogeneous Fermi Gas

    NASA Astrophysics Data System (ADS)

    Yan, Zhenjie; Mukherjee, Biswaroop; Patel, Parth; Struck, Julian; Zwierlein, Martin

    2016-05-01

    Over the last two decades, RF spectroscopy has been established as an indispensable tool to probe a large variety of fundamental properties of strongly interacting Fermi gases, from measurements of the pairing gap and Tan's contact to the quasi-particle weight of Fermi polarons. So far, most RF spectroscopy experiments have been performed in harmonic traps, resulting in a response averaged over different densities. We have realized an optical uniform potential for ultracold Fermi gases of 6Li atoms, which allows us to avoid the usual problems associated with inhomogeneous systems. Here we present recent results on RF spectroscopy of these homogeneous samples with a high signal-to-noise ratio. In addition, we report progress on measuring the contact of a unitary Fermi gas across the normal-to-superfluid transition.

  12. Analysis of homogeneous turbulent reacting flows

    NASA Technical Reports Server (NTRS)

    Leonard, A. D.; Hill, J. C.; Mahalingam, S.; Ferziger, J. H.

    1988-01-01

    Full turbulence simulations at low Reynolds numbers were made for the single-step, irreversible, bimolecular reaction between non-premixed reactants in isochoric, decaying homogeneous turbulence. Various initial conditions for the scalar field were used in the simulations to control the initial scalar dissipation length scale, and simulations were also made for temperature-dependent reaction rates and for non-stoichiometric and unequal diffusivity conditions. Joint probability density functions (pdf's), conditional pdf's, and various statistical quantities appearing in the moment equations were computed. Preliminary analysis of the results indicates that compressive strain-rate correlates better than other dynamical quantities with local reaction rate, and the locations of peak reaction rates seem to be insensitive to the scalar field initial conditions.

  13. Soliton production with nonlinear homogeneous lines

    SciTech Connect

    Elizondo-Decanini, Juan M.; Coleman, Phillip D.; Moorman, Matthew W.; Petney, Sharon Joy Victor; Dudley, Evan C.; Youngman, Kevin; Penner, Tim Dwight; Fang, Lu; Myers, Katherine M.

    2015-11-24

    Low- and high-voltage soliton waves were produced and used to demonstrate collision and compression using diode-based nonlinear transmission lines. Experiments demonstrate soliton addition and compression using homogeneous nonlinear lines. We built the nonlinear lines using commercially available diodes. These diodes are chosen after their capacitance versus voltage dependence is used in a model and the line design characteristics are calculated and simulated. Nonlinear ceramic capacitors are then used to demonstrate high-voltage pulse amplification and compression. The line is designed such that a simple capacitor discharge, input signal, develops soliton trains in as few as 12 stages. We also demonstrated output voltages in excess of 40 kV using Y5V-based commercial capacitors. The results show some key features that determine efficient production of trains of solitons in the kilovolt range.

  14. Soliton production with nonlinear homogeneous lines

    DOE PAGESBeta

    Elizondo-Decanini, Juan M.; Coleman, Phillip D.; Moorman, Matthew W.; Petney, Sharon Joy Victor; Dudley, Evan C.; Youngman, Kevin; Penner, Tim Dwight; Fang, Lu; Myers, Katherine M.

    2015-11-24

    Low- and high-voltage soliton waves were produced and used to demonstrate collision and compression using diode-based nonlinear transmission lines. Experiments demonstrate soliton addition and compression using homogeneous nonlinear lines. We built the nonlinear lines using commercially available diodes. These diodes are chosen after their capacitance versus voltage dependence is used in a model and the line design characteristics are calculated and simulated. Nonlinear ceramic capacitors are then used to demonstrate high-voltage pulse amplification and compression. The line is designed such that a simple capacitor discharge, input signal, develops soliton trains in as few as 12 stages. We also demonstrated output voltages in excess of 40 kV using Y5V-based commercial capacitors. The results show some key features that determine efficient production of trains of solitons in the kilovolt range.

  15. Homogeneous catalyst formulations for methanol production

    DOEpatents

    Mahajan, Devinder; Sapienza, Richard S.; Slegeir, William A.; O'Hare, Thomas E.

    1991-02-12

    There is disclosed synthesis of CH3OH from carbon monoxide and hydrogen using an extremely active homogeneous catalyst for methanol synthesis directly from synthesis gas. The catalyst operates preferably between 100-150 °C and preferably at 100-150 psia synthesis gas to produce methanol. Use can be made of syngas mixtures which contain considerable quantities of other gases, such as nitrogen, methane or excess hydrogen. The catalyst is composed of two components: (a) a transition metal carbonyl complex and (b) an alkoxide component. In the simplest formulation, component (a) is a complex of nickel tetracarbonyl and component (b) is methoxide (CH3O-), both being dissolved in a methanol solvent system. The presence of a co-solvent such as p-dioxane, THF, polyalcohols, ethers, hydrocarbons, and crown ethers accelerates the methanol synthesis reaction.

  16. Homogeneous catalyst formulations for methanol production

    DOEpatents

    Mahajan, Devinder; Sapienza, Richard S.; Slegeir, William A.; O'Hare, Thomas E.

    1990-01-01

    There is disclosed synthesis of CH3OH from carbon monoxide and hydrogen using an extremely active homogeneous catalyst for methanol synthesis directly from synthesis gas. The catalyst operates preferably between 100-150 °C and preferably at 100-150 psia synthesis gas to produce methanol. Use can be made of syngas mixtures which contain considerable quantities of other gases, such as nitrogen, methane or excess hydrogen. The catalyst is composed of two components: (a) a transition metal carbonyl complex and (b) an alkoxide component. In the simplest formulation, component (a) is a complex of nickel tetracarbonyl and component (b) is methoxide (CH3O-), both being dissolved in a methanol solvent system. The presence of a co-solvent such as p-dioxane, THF, polyalcohols, ethers, hydrocarbons, and crown ethers accelerates the methanol synthesis reaction.

  17. Direction of unsaturated flow in a homogeneous and isotropic hillslope

    USGS Publications Warehouse

    Lu, N.; Kaya, B.S.; Godt, J.W.

    2011-01-01

    The distribution of soil moisture in a homogeneous and isotropic hillslope is a transient, variably saturated physical process controlled by rainfall characteristics, hillslope geometry, and the hydrological properties of the hillslope materials. The major driving mechanisms for moisture movement are gravity and gradients in matric potential. The latter is solely controlled by gradients of moisture content. In a homogeneous and isotropic saturated hillslope, absent a gradient in moisture content and under the driving force of gravity with a constant pressure boundary at the slope surface, flow is always in the lateral downslope direction, under either transient or steady state conditions. However, under variably saturated conditions, both gravity and moisture content gradients drive fluid motion, leading to complex flow patterns. In general, the flow field near the ground surface is variably saturated and transient, and the direction of flow could be laterally downslope, laterally upslope, or vertically downward. Previous work has suggested that prevailing rainfall conditions are sufficient to completely control these flow regimes. This work, however, shows that under time-varying rainfall conditions, vertical, downslope, and upslope lateral flow can concurrently occur at different depths and locations within the hillslope. More importantly, we show that the state of wetting or drying in a hillslope defines the temporal and spatial regimes of flow and when and where laterally downslope and/or laterally upslope flow occurs. Copyright 2011 by the American Geophysical Union.

  19. Homogeneous Protein Analysis by Magnetic Core-Shell Nanorod Probes.

    PubMed

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Altantzis, Thomas; Bals, Sara; Schotter, Joerg

    2016-04-13

    Studying protein interactions is of vital importance both to fundamental biology research and to medical applications. Here, we report on the experimental proof of a universally applicable label-free homogeneous platform for rapid protein analysis. It is based on optically detecting changes in the rotational dynamics of magnetically agitated core-shell nanorods upon their specific interaction with proteins. By adjusting the excitation frequency, we are able to optimize the measurement signal for each analyte protein size. In addition, due to the locking of the optical signal to the magnetic excitation frequency, background signals are suppressed, thus allowing exclusive studies of processes at the nanoprobe surface only. We study target proteins (soluble domain of the human epidermal growth factor receptor 2 - sHER2) specifically binding to antibodies (trastuzumab) immobilized on the surface of our nanoprobes and demonstrate direct deduction of their respective sizes. Additionally, we examine the dependence of our measurement signal on the concentration of the analyte protein, and deduce a minimally detectable sHER2 concentration of 440 pM. For our homogeneous measurement platform, good dispersion stability of the applied nanoprobes under physiological conditions is of vital importance. To that end, we support our measurement data by theoretical modeling of the total particle-particle interaction energies. The successful implementation of our platform offers scope for applications in biomarker-based diagnostics as well as for answering basic biology questions. PMID:27023370

  20. Homogenization of global radiosonde humidity data

    NASA Astrophysics Data System (ADS)

    Blaschek, Michael; Haimberger, Leopold

    2016-04-01

    The global radiosonde network is an important source of upper-air measurements and is strongly connected to reanalysis efforts of the 20th century. However, measurements are strongly affected by changes in the observing system and require homogenization before they can be considered useful in climate studies. In particular, humidity measurements are known to show spurious trends and biases induced by many sources, e.g. reporting practices or freezing of the sensor. We propose to detect and correct these biases in an automated way, as has been done for temperature and winds. We detect breakpoints in dew point depression (DPD) time series by applying a standard normal homogeneity test (SNHT) to DPD departures from ERA-Interim. In a next step, going back in time, we calculate quantile departures between the latter and the earlier part of the time series near each breakpoint. These departures adjust the earlier distribution of DPD to the latter distribution (quantile matching), thus removing, for example, a non-climatic shift. We apply this approach to the existing radiosonde network. As a first step to verify our approach, we compare our results with ERA-Interim data and with brightness temperatures of humidity-sensitive channels of microwave radiometers (SSMIS) onboard DMSP F16. The results show that some of the biases can be detected and corrected in an automated way; however, large biases that affect the distribution of DPD values and originate from known reporting practices (e.g. 30 DPD at US stations) remain. These biases can be removed but not corrected. The comparison of brightness temperatures from satellites and radiosondes proves to be difficult, as large differences result from, for example, representativeness errors.
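    The detection-and-adjustment scheme described here, SNHT on a departure series followed by quantile matching across each breakpoint, can be sketched as follows. This is an illustrative reimplementation under our own simplifications, not the authors' processing code:

```python
import numpy as np

def snht(series):
    """Standard normal homogeneity test statistic T(k) for every split point.

    `series` holds departures (e.g. radiosonde DPD minus ERA-Interim).
    A pronounced maximum of T marks a candidate breakpoint.
    """
    x = np.asarray(series, dtype=float)
    n = x.size
    z = (x - x.mean()) / x.std()
    t = np.empty(n - 1)
    for k in range(1, n):
        t[k - 1] = k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
    return t

def quantile_match(before, after, n_quantiles=20):
    """Map the pre-break segment onto the post-break distribution.

    Interpolating each value from the quantiles of `before` to those of
    `after` removes a non-climatic shift while preserving variability.
    """
    probs = np.linspace(0.05, 0.95, n_quantiles)
    return np.interp(before, np.quantile(before, probs),
                     np.quantile(after, probs))
```

    For a synthetic series with an artificial shift, the argmax of the SNHT statistic falls near the true break, and the matched pre-break segment takes on the post-break distribution.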

  1. Galaxies Collide to Create Hot, Huge Galaxy

    NASA Technical Reports Server (NTRS)

    2009-01-01

    This image of a pair of colliding galaxies called NGC 6240 shows them in a rare, short-lived phase of their evolution just before they merge into a single, larger galaxy. The prolonged, violent collision has drastically altered the appearance of both galaxies and created huge amounts of heat, turning NGC 6240 into an 'infrared luminous' active galaxy.

    A rich variety of active galaxies exists, with different shapes, luminosities, and radiation profiles. These galaxies may be related; astronomers have suspected that they represent an evolutionary sequence. By catching different galaxies in different stages of merging, a story emerges as one type of active galaxy changes into another. NGC 6240 provides an important 'missing link' in this process.

    This image was created from combined data from the infrared array camera of NASA's Spitzer Space Telescope at 3.6 and 8.0 microns (red) and visible light from NASA's Hubble Space Telescope (green and blue).

  2. Creating and Analyzing a Mirage

    NASA Astrophysics Data System (ADS)

    Richey, Lauren; Stewart, Bailey; Peatross, Justin

    2006-10-01

    Most people have witnessed mirages such as the distant "puddles" that appear on a highway when the pavement is warmed by the Sun. The warmed surface heats the nearby air creating a temperature gradient with the cooler (and more dense) air above. The apparent displacement of distant objects occurs as light refracts through the different air densities. Rays of light from the sky that are originally directed toward the ground can be bent upward, appearing to a viewer as though coming from the ground. This effect is known as an inferior mirage; a superior mirage occurs when cooler air is underneath [1,2]. In this paper, a mirage is created indoors using an electric hotplate and a saucepan filled with ice water.

  3. Creating small transcription activating RNAs.

    PubMed

    Chappell, James; Takahashi, Melissa K; Lucks, Julius B

    2015-03-01

    We expanded the mechanistic capability of small RNAs by creating an entirely synthetic mode of regulation: small transcription activating RNAs (STARs). Using two strategies, we engineered synthetic STAR regulators to disrupt the formation of an intrinsic transcription terminator placed upstream of a gene in Escherichia coli. This resulted in a group of four highly orthogonal STARs that had up to 94-fold activation. By systematically modifying sequence features of this group, we derived design principles for STAR function, which we then used to forward engineer a STAR that targets a terminator found in the Escherichia coli genome. Finally, we showed that STARs could be combined in tandem to create previously unattainable RNA-only transcriptional logic gates. STARs provide a new mechanism of regulation that will expand our ability to use small RNAs to construct synthetic gene networks that precisely control gene expression. PMID:25643173

  4. Creating advanced health informatics certification.

    PubMed

    Gadd, Cynthia S; Williamson, Jeffrey J; Steen, Elaine B; Fridsma, Douglas B

    2016-07-01

    In 2005, AMIA leaders and members concluded that certification of advanced health informatics professionals would offer value to individual practitioners, organizations that hire them, and society at large. AMIA's work to create advanced informatics certification began by leading a successful effort to create the clinical informatics subspecialty for American Board of Medical Specialties board-certified physicians. Since 2012, AMIA has been working to establish advanced health informatics certification (AHIC) for all health informatics practitioners regardless of their primary discipline. In November 2015, AMIA completed the first of 3 key tasks required to establish AHIC, with the AMIA Board of Directors' endorsement of proposed eligibility requirements. This AMIA Board white paper describes efforts to establish AHIC, reports on the current status of AHIC components, and provides a context for the proposed AHIC eligibility requirements. PMID:27358327

  5. Effect of cloud-scale vertical velocity on the contribution of homogeneous nucleation to cirrus formation and radiative forcing

    NASA Astrophysics Data System (ADS)

    Shi, X.; Liu, X.

    2016-06-01

    Ice nucleation is a critical process for the ice crystal formation in cirrus clouds. The relative contribution of homogeneous nucleation versus heterogeneous nucleation to cirrus formation differs between measurements and predictions from general circulation models. Here we perform large-ensemble simulations of the ice nucleation process using a cloud parcel model driven by observed vertical motions and find that homogeneous nucleation occurs rather infrequently, in agreement with recent measurement findings. When the effect of observed vertical velocity fluctuations on ice nucleation is considered in the Community Atmosphere Model version 5, the relative contribution of homogeneous nucleation to cirrus cloud occurrences decreases to only a few percent. However, homogeneous nucleation still has strong impacts on the cloud radiative forcing. Hence, the importance of homogeneous nucleation for cirrus cloud formation should not be dismissed on the global scale.

  6. Creating a Mobile Library Website

    ERIC Educational Resources Information Center

    Cutshall, Tom C.; Blake, Lindsay; Bandy, Sandra L.

    2011-01-01

    The overwhelming results were iPhones and Android devices. Since the library wasn't equipped technologically to develop an in-house application platform and because we wanted the content to work across all mobile platforms, we decided to focus on creating a mobile web-based platform. From the NLM page of mobile sites we chose the basic PubMed/…

  7. To homogenize, or not to homogenize, that is the question: Quartz-hosted melt inclusion analysis avenues

    NASA Astrophysics Data System (ADS)

    Mercer, C. N.; Roberge, J.; Todorov, T. I.; Hofstra, A. H.

    2013-12-01

    Melt inclusions hosted in quartz can provide the only direct information about the pressure, temperature, and melt composition of pre-eruptive rhyolitic magmas, many of which are the precursors to mineralizing aqueous fluids [1]. With ideal, rapidly-quenched pumice samples, analysis of glassy quartz-hosted melt inclusions is relatively straightforward. These data can be directly interpreted to represent snapshots of metal and volatile concentrations during magma crystallization and degassing. However, most ore deposit-related igneous samples are non-ideal; being older, potentially hydrothermally altered, and often crystallized due to slow cooling in subvolcanic regions (e.g., porphyry-type deposits). In this case, analysis of crystalline melt inclusions in quartz is not straightforward and resulting data must be meticulously examined before interpretation. Many melt inclusions may have experienced post-entrapment modifications [1] such as diffusion of elements (e.g., H, Li, Na, Ag, Cu) [2], which may lead to changes in measured oxygen fugacity. Slowly cooled inclusions may crystallize, producing a heterogeneous "micro-rock" that cannot be analyzed by spectroscopic methods or electron microprobe. While crystallized inclusions can be homogenized in a high-temperature furnace, many new problems may arise such as inclusion decrepitation [3], diffusion of elements [2], and incorporation of too little or too much Si from the inclusion rim or host crystal. However, if unwanted homogenization effects are minimized by choosing ideal experimental conditions, then these homogenized inclusions can be analyzed by traditional FTIR and electron microprobe methods. The electron microprobe data from homogenized inclusions can be used as accurate internal standards for laser ablation-ICP-MS data reduction. Alternatively, crystalline inclusions can be directly analyzed for major and trace elements by laser ablation-ICP-MS [4], which considerably reduces sample preparation time, but

  8. Creating circularly polarized light with a phase-shifting mirror

    NASA Astrophysics Data System (ADS)

    Aurand, Bastian; Kuschel, Stephan; Rödel, Christian; Heyer, Martin; Wunderlich, Frank; Jäckel, Oliver; Kaluza, Malte C.; Paulus, Gerhard G.; Kühl, Thomas

    2011-08-01

    We report on the performance of a system employing a multi-layer coated mirror creating circularly polarized light in a fully reflective setup. With one specially designed mirror we are able to create laser pulses with an ellipticity of more than ɛ = 98% over the entire spectral bandwidth from initially linearly polarized Titanium:Sapphire femtosecond laser pulses. We tested the homogeneity of the polarization with beam sizes of the order of approximately 10 cm. The damage threshold was determined to be nearly 400 times higher than for a transmissive quartz-wave plate which suggests applications in high intensity laser experiments. Another advantage of the reflective scheme is the absence of nonlinear effects changing the spectrum or the pulse-form and the scalability of coating fabrication to large aperture mirrors.

  9. Engineering the yeast Yarrowia lipolytica for the production of therapeutic proteins homogeneously glycosylated with Man8GlcNAc2 and Man5GlcNAc2

    PubMed Central

    2012-01-01

    Background: Protein-based therapeutics represent the fastest growing class of compounds in the pharmaceutical industry. This has created an increasing demand for powerful expression systems. Yeast systems are widely used, convenient, and cost-effective. Yarrowia lipolytica is a suitable host that is generally regarded as safe (GRAS). Yeasts, however, modify their glycoproteins with heterogeneous glycans containing mainly mannoses, which complicates downstream processing and often interferes with protein function in man. Our aim was to glyco-engineer Y. lipolytica to abolish the heterogeneous, yeast-specific glycosylation and to obtain homogeneous human high-mannose type glycosylation.

    Results: We engineered Y. lipolytica to produce homogeneous human-type terminal-mannose glycosylated proteins, i.e. glycosylated with Man8GlcNAc2 or Man5GlcNAc2. First, we inactivated the yeast-specific Golgi α-1,6-mannosyltransferases YlOch1p and YlMnn9p; the former inactivation yielded a strain producing homogeneous Man8GlcNAc2 glycoproteins. We tested this strain by expressing glucocerebrosidase and found that the hypermannosylation-related heterogeneity was eliminated. Furthermore, detailed analysis of N-glycans showed that YlOch1p and YlMnn9p, despite some initial uncertainty about their function, are most likely the α-1,6-mannosyltransferases responsible for the addition of the first and second mannose residue, respectively, to the glycan backbone. Second, introduction of an ER-retained α-1,2-mannosidase yielded a strain producing proteins homogeneously glycosylated with Man5GlcNAc2. The use of the endogenous LIP2pre signal sequence and codon optimization greatly improved the efficiency of this enzyme.

    Conclusions: We generated a Y. lipolytica expression platform for the production of heterologous glycoproteins that are homogeneously glycosylated with either Man8GlcNAc2 or Man5GlcNAc2 N-glycans. This platform expands the utility of Y. lipolytica as a heterologous expression host.

  10. Stochastic Effects in the Bistable Homogeneous Semenov Model

    NASA Astrophysics Data System (ADS)

    Nowakowski, B.; Lemarchand, A.; Nowakowska, E.

    2002-04-01

    We present the mesoscopic description of stochastic effects in a thermochemical bistable diluted gas system subject to the Newtonian heat exchange with a thermostat. We apply the master equation including a transition rate for the Newtonian thermal transfer process, derived on the basis of kinetic theory. As temperature is a continuous variable, this master equation has a complicated integro-differential form. We perform Monte Carlo simulations based on this equation to study the stochastic effects in a homogeneous Semenov model (which neglects reactant consumption) in the bistable regime. The mean first passage time is computed as a function of the number of particles in the system and the distance from the bifurcation associated with the emergence of bistability. An approximate analytical prediction is deduced from the Fokker-Planck equation associated with the master equation. The results of the master equation approach are successfully compared with those of direct simulations of the microscopic particle dynamics.
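    The mean-first-passage-time computation can be illustrated with a generic stand-in: an overdamped Langevin particle in a double well, rather than the paper's full thermochemical master equation. Weaker noise plays the role of a larger particle number, and the escape time grows accordingly:

```python
import numpy as np

def mean_first_passage_time(noise, n_paths=200, dt=1e-3, seed=1):
    """Mean time for Euler-Maruyama paths started in the left well (x = -1)
    of V(x) = x**4/4 - x**2/2 to first reach the barrier top at x = 0."""
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, -1.0)
    t = np.zeros(n_paths)
    alive = np.ones(n_paths, dtype=bool)   # paths that have not yet crossed
    while alive.any():
        drift = x[alive] - x[alive] ** 3               # -V'(x)
        kick = noise * np.sqrt(dt) * rng.normal(size=alive.sum())
        x[alive] += drift * dt + kick
        t[alive] += dt
        alive &= x < 0.0                   # freeze paths once they cross
    return t.mean()
```

    Consistent with Kramers-type behavior, the escape time rises steeply as fluctuations weaken, the same qualitative trend the abstract reports for increasing particle number.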

  11. Analysis of opinion spreading in homogeneous networks with signed relationships

    NASA Astrophysics Data System (ADS)

    Fan, Pengyi; Wang, Hui; Li, Pei; Li, Wei; Jiang, Zhihong

    2012-08-01

    Recently, significant attention has been devoted to opinion dynamics in social networks, in which all the relationships between individuals are assumed as positive ones (i.e. friend, altruism or trust). However, many realistic social networks include negative relationships (i.e. enemy or distrust) as well as positive ones. In order to find the dynamical behavior of opinion spreading in signed networks, we propose a model taking into account the impacts of positive and negative relationships. Based on this model, we analyze the dynamical process and provide a detailed mathematical analysis for identifying the threshold of opinion spreading in homogeneous networks with signed relationships. By performing numerical simulations for the threshold in three different signed networks, we find that the theoretical and numerical results are in good agreement, confirming the correctness of our exact solution.

  12. Molecular Dynamics Simulations of Homogeneous Crystallization in Polymer Melt

    NASA Astrophysics Data System (ADS)

    Kong, Bin

    2015-03-01

    Molecular mechanisms of homogeneous nucleation and crystal growth from the melt of a polyethylene-like polymer were investigated by molecular dynamics simulations. The crystallinity was determined by using the site order parameter (SOP) method, which describes the degree of local order around an atom. Snapshots of the simulations clearly showed the evolution of nucleation and crystal growth through SOP images. The isothermal crystallization kinetics was determined at different temperatures. The rate of crystallization, Kc, and the Avrami exponent, n, were determined as a function of temperature. The formation of nuclei was traced, revealing that nuclei form with more ordered cores and less ordered shells. A detailed statistical analysis of the MD snapshots and trajectories suggested that conformations of the polymer chains changed smoothly from random coil to chain-folded lamella during crystallization.
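    A site order parameter of this kind can be illustrated in two dimensions with the bond-orientational parameter ψ6, a simpler analogue of the SOP used in the study (this sketch is ours, not the authors' method): ψ6 is near 1 when a site's neighbours form a hexagonal shell and markedly smaller when the bond angles are disordered.

```python
import numpy as np

def psi6_site(points, i, k=6):
    """Local bond-orientational order |<exp(6j*theta)>| for site i,
    computed over its k nearest neighbours in a 2-D point set."""
    d = points - points[i]
    r = np.hypot(d[:, 0], d[:, 1])
    nn = np.argsort(r)[1:k + 1]        # skip the site itself (r = 0)
    theta = np.arctan2(d[nn, 1], d[nn, 0])
    return abs(np.exp(6j * theta).mean())
```

    On a perfect triangular lattice an interior site gives ψ6 = 1, while random point sets give much lower values, so thresholding the parameter separates "crystalline" from "melt-like" sites, which is how an SOP-style crystallinity measure is typically used.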

  13. Toward homogenization of Mediterranean lagoons and their loss of hydrodiversity

    NASA Astrophysics Data System (ADS)

    Ferrarin, Christian; Bajo, Marco; Bellafiore, Debora; Cucco, Andrea; De Pascalis, Francesca; Ghezzo, Michol; Umgiesser, Georg

    2014-08-01

    Lagoons are considered to be the most valuable systems of the Mediterranean coastal area, with crucial ecological, historical, economical, and social relevance. Climate change strongly affects coastal areas and can deeply change the status of transitional areas like lagoons. Herein we investigate the hydrological response of 10 Mediterranean lagoons to climate change by means of numerical models. Our results suggest that Mediterranean lagoons amplify the salinity and temperature changes expected for the open sea. Moreover, numerical simulations indicate that there will be a general loss of intralagoon and interlagoon variability of their physical properties. Therefore, as a result of climate change, Mediterranean lagoons offer an example of a common process that may in the future affect many coastal environments: homogenization of their physical characteristics, with a tendency toward marinization.

  14. Homogeneously catalyzed oxidation for the destruction of aqueous organic wastes

    SciTech Connect

    Leavitt, D.D.; Horbath, J.S.; Abraham, M.A.

    1990-11-01

    Several organic species, specifically atrazine, 2,4-dichlorophenoxyacetic acid, and biphenyl, were converted to CO₂ and other non-harmful gases through oxidation catalyzed by inorganic acid. Nearly complete conversion was obtained through homogeneous liquid-phase oxidation with ammonium nitrate. The kinetics of the reaction have been investigated and indicate parallel oxidation and thermal degradation of the oxidant. This results in a maximum conversion at an intermediate temperature. Increasing the oxidant concentration accelerates the rate of conversion and shifts the location of the optimum temperature. Reaction at varying acid concentration revealed that conversion increased approximately linearly as the pH of the solution was increased. Conversion was increased to greater than 99% through the addition of small amounts of transition metal salts, demonstrating the suitability of a treatment process based on this technology for waste streams containing small quantities of heavy metals.

  15. Homogeneous optical cloak constructed with uniform layered structures.

    PubMed

    Zhang, Jingjing; Liu, Liu; Luo, Yu; Zhang, Shuang; Mortensen, Niels Asger

    2011-04-25

    The prospect of rendering objects invisible has intrigued researchers for centuries. Transformation-optics-based invisibility cloak design is now bringing this goal from science fiction to reality and has already been demonstrated experimentally at microwave and optical frequencies. However, the majority of the invisibility cloaks reported so far have a spatially varying refractive index, which requires complicated design processes. Besides, the size of the hidden object is usually small relative to that of the cloak device. Here we report the experimental realization of a homogeneous invisibility cloak with a uniform silicon grating structure. The design strategy eliminates the need for spatial variation of the material index, and in terms of size it allows for a very large obstacle/cloak ratio. Broadband invisibility behavior has been verified at near-infrared frequencies, opening up new opportunities for using uniform layered media to realize invisibility at any frequency range where high-quality dielectrics are available. PMID:21643114

  16. Edge-Based Image Compression with Homogeneous Diffusion

    NASA Astrophysics Data System (ADS)

    Mainberger, Markus; Weickert, Joachim

    It is well-known that edges contain semantically important image information. In this paper we present a lossy compression method for cartoon-like images that exploits information at image edges. These edges are extracted with the Marr-Hildreth operator followed by hysteresis thresholding. Their locations are stored in a lossless way using JBIG. Moreover, we encode the grey or colour values at both sides of each edge by applying quantisation, subsampling and PAQ coding. In the decoding step, information outside these encoded data is recovered by solving the Laplace equation, i.e. we inpaint with the steady state of a homogeneous diffusion process. Our experiments show that the suggested method outperforms the widely-used JPEG standard and can even beat the advanced JPEG2000 standard for cartoon-like images.
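    The decoding step, inpainting with the steady state of homogeneous diffusion, amounts to solving the Laplace equation with the stored edge values as Dirichlet data. A minimal sketch using Jacobi iteration (our simplification of the idea, not the paper's codec):

```python
import numpy as np

def diffusion_inpaint(known_mask, known_values, n_iter=5000):
    """Fill unknown pixels by iterating homogeneous diffusion to its
    steady state (the Laplace equation); pixels where `known_mask` is
    True are held fixed at `known_values` (Dirichlet data at edges)."""
    img = np.where(known_mask, known_values, known_values[known_mask].mean())
    for _ in range(n_iter):
        # Jacobi update: each pixel becomes the average of its 4 neighbours.
        # np.roll wraps around; keep the image border inside `known_mask`
        # so the wrap-around never influences unknown pixels.
        avg = 0.25 * (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
                      np.roll(img, 1, 1) + np.roll(img, -1, 1))
        img = np.where(known_mask, known_values, avg)
    return img
```

    For a cartoon-like test image of two flat regions separated by one edge, storing only the border and the pixel values on both sides of the edge reconstructs the image almost exactly, which is why this approach suits cartoon-like content.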

  17. Homogeneous deposition of particles by absorption on hydrogels

    NASA Astrophysics Data System (ADS)

    Boulogne, François; Ingremeau, François; Dervaux, Julien; Limat, Laurent; Stone, Howard A.

    2015-11-01

    When a drop containing colloidal particles evaporates on a surface, a circular stain made of these particles is often observed due to an internal flow toward the contact line. To hinder this effect, several approaches have been proposed such as flow modification by addition of surfactants or control of the interactions between the particles. All of these strategies involve the liquid phase while maintaining the drying process. However, substitution of evaporation by absorption into the substrate of the solvent has been investigated less. Here, we show that a droplet containing colloidal particles deposited on swelling hydrogels can lead to a nearly uniform coating. We report experiments and theory to explore the relation between the gel swelling, uniformity of deposition and the adsorption dynamics of the particles at the substrate. Our findings suggest that draining the solvent by absorption provides a robust route to homogeneous coatings.

  18. Drillstring vibrations create crooked holes

    SciTech Connect

    Dareing, D.W.

    1984-01-01

    Boreholes in hard formations sometimes deviate when the drillstring runs rough or the kelly bounces severely. This article explains how drillstring vibrations produce crooked holes in hard formations. It shows how to reduce dog-leg severity through vibration control. Dog-legs are known to produce cyclic bending-type fatigue loads in drill pipe and collars. Longitudinal and torsional vibrational stresses are additive to rotational bending and further reduce the life of drillstring tubulars. Vibration-induced dog-legs are therefore more damaging to drillstrings than other dog-leg producing mechanisms because total cyclic fatigue loading is the combined effect of bending stress reversal due to rotation plus vibrational stress variations. The vibration-induced dog-leg concept is based on overall vibration response of drillstrings, resultant dynamic displacements of roller cone drill bits, and corresponding dynamic forces between bit and formation. The concept explains how dynamic forces generated by roller cone rock bits might produce helical bore holes in hard homogeneous formations. Dog-legs in hard formations may be due in part to drillstring vibrations. The wellbore deviation concept relates only to roller cone rock bits and is based on dynamically reorienting three-lobed formation pattern hammered out by bottomhole assembly resonance. Analytical studies are needed to determine the effect of bit force impact point location on chip formation and rock removal. Field studies of various bottom hole assemblies operating at critical rotary speeds coupled with directional surveys are needed to test the validity of this theory.

  19. Creating new interspecific hybrid and polyploid crops.

    PubMed

    Mason, Annaliese S; Batley, Jacqueline

    2015-08-01

    Agricultural selection of desirable traits in domesticated plant and animal species mimics natural evolutionary selection for ability of species to survive, thrive, and reproduce in the wild. However, one evolutionary process is currently underutilised for human agricultural purposes: speciation through interspecific hybridisation and polyploid formation. Despite promising successes in creation of new hybrid and or polyploid species in many genera, few geneticists and breeders deliberately take advantage of polyploidy and interspecific hybridisation for crop improvement. We outline the possible benefits as well as potential problems and criticisms of this approach, and address how modern advances in technology and knowledge can help to create new crop species for agriculture. PMID:26164645

  20. Creating semantic maps from laser terrestrial data

    NASA Astrophysics Data System (ADS)

    Będkowski, J.; Majek, K.; Musialik, P.; Masłowski, A.; Adamek, A.

    2013-12-01

    In this paper, the creation of semantic maps from terrestrial laser data is presented. A semantic map is built by transforming geometric data (from a 3D laser range finder) into data with assigned labels. These labels can support several applications, such as mobile robot navigation, by identifying traversable and non-traversable regions. Computation over large 3D data sets requires high computational power; therefore, we propose a GPU-based (Graphics Processing Unit) implementation to decrease the computation time. As a result, we demonstrate a computed semantic map for mobile robot navigation.

  1. Foam Generation in Homogeneous Porous Media

    SciTech Connect

    Gauglitz, Phillip A.; Friedman, F.; Kam, S. I.; Rossen, W. R.

    2002-10-01

    In steady gas-liquid flow in homogeneous porous media with surfactant present, there is often observed a critical injection velocity or pressure gradient, ∇p_min, at which "weak" or "coarse" foam is abruptly converted into "strong" foam, with a reduction of one to two orders of magnitude in total mobility: i.e., "foam generation." Earlier research on foam generation is extended here with extensive data for a variety of porous media, permeabilities, gases (N₂ and CO₂), surfactants, and temperatures. For bead packs and sandpacks, ∇p_min scales as 1/k, where k is permeability, over 2 1/2 orders of magnitude in k; for consolidated media the relation is more complex. For dense CO₂ foam, ∇p_min exists but can be less than 1 psi/ft. If pressure drop, rather than flow rates, is fixed, one observes an unstable regime between the stable "strong" and "coarse" foam regimes; in the unstable regime ∇p is nonuniform in space or variable in time.

  2. Homogeneously dispersed, multimetal oxygen-evolving catalysts

    DOE PAGES

    Zhang, Bo; Zheng, Xueli; Voznyy, Oleksandr; Comin, Riccardo; Bajdich, Michal; Garcia-Melchor, Max; Han, Lili; Xu, Jixian; Liu, Min; Zheng, Lirong; et al

    2016-03-24

    Earth-abundant first-row (3d) transition-metal-based catalysts have been developed for the oxygen-evolution reaction (OER); however, they operate at overpotentials significantly above thermodynamic requirements. Density functional theory suggested that non-3d high-valency metals such as tungsten can modulate 3d metal oxides, providing near-optimal adsorption energies for OER intermediates. We developed a room-temperature synthesis to produce gelled oxy-hydroxide materials with an atomically homogeneous metal distribution. These gelled FeCoW oxy-hydroxides exhibit the lowest overpotential (191 mV) reported at 10 mA per square centimeter in alkaline electrolyte. The catalyst shows no evidence of degradation following more than 500 hours of operation. X-ray absorption and computational studies reveal a synergistic interplay between W, Fe, and Co in producing a favorable local coordination environment and electronic structure that enhance the energetics for OER.

  3. Nearsightedness of Finite Homogeneous Model Systems

    NASA Astrophysics Data System (ADS)

    Mitsuta, Yuki; Yamanaka, Shusuke; Kawakami, Takashi; Okumura, Mitsutaka; Yamaguchi, Kizashi; Nakamura, Haruki

    On the basis of linear response function (LRF) analysis, the nearsightedness of finite systems is examined for nearly homogeneous molecular systems. We first treated Hn (n = 2-100) to inspect the local or nonlocal responses of these systems, which are, in other words, the magnitudes of nearsightedness of the finite systems. Further, the LRFs of H100n+ (n = 0-98) were examined in order to clarify whether the magnitude of nearsightedness depends on the size of the systems or on the number of electrons in them. From our calculations, we conjectured that the number of electrons is essential for the nearsightedness of electronic matter (NEM) of this type of system. This conjecture is confirmed by the fact that the LRFs of H100n+ (n = 0-98) are similar to those of N electrons (N = 2-100) in a square well potential, showing that the attractive potentials of H100n+ (n = 0-98) do not significantly change the dependence of the magnitude of NEM on the number of electrons.

  4. Simulation and modeling of homogeneous, compressed turbulence

    NASA Technical Reports Server (NTRS)

    Wu, C. T.; Ferziger, J. H.; Chapman, D. R.

    1985-01-01

    Low Reynolds number homogeneous turbulence undergoing low Mach number isotropic and one-dimensional compression was simulated by numerically solving the Navier-Stokes equations. The numerical simulations were performed on a CYBER 205 computer using a 64 x 64 x 64 mesh. A spectral method was used for spatial differencing and the second-order Runge-Kutta method for time advancement. A variety of statistical information was extracted from the computed flow fields. These include three-dimensional energy and dissipation spectra, two-point velocity correlations, one-dimensional energy spectra, turbulent kinetic energy and its dissipation rate, integral length scales, Taylor microscales, and Kolmogorov length scale. Results from the simulated flow fields were used to test one-point closure, two-equation models. A new one-point-closure, three-equation turbulence model which accounts for the effect of compression is proposed. The new model accurately calculates four types of flows (isotropic decay, isotropic compression, one-dimensional compression, and axisymmetric expansion flows) for a wide range of strain rates.
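The time advancement described above can be illustrated with a generic second-order Runge-Kutta (midpoint) step. The test equation dy/dt = -y and the step size are illustrative stand-ins, not the Navier-Stokes solver used in the study:

```python
import math

def rk2_step(f, t, y, dt):
    """One second-order Runge-Kutta (midpoint) step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + 0.5 * dt, y + 0.5 * dt * k1)
    return y + dt * k2

# Advance dy/dt = -y from y(0) = 1 to t = 1; the exact solution is exp(-1).
f = lambda t, y: -y
y, t, dt = 1.0, 0.0, 0.001
for _ in range(1000):
    y = rk2_step(f, t, y, dt)
    t += dt
print(abs(y - math.exp(-1.0)))  # small O(dt^2) global truncation error
```

The midpoint form is one common choice of second-order Runge-Kutta; the abstract does not specify which variant was used.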

  5. Numerical study of homogeneous nanodroplet growth.

    PubMed

    Quang, Tran Si Bui; Leong, Fong Yew; Mirsaidov, Utkur M

    2015-01-15

    We investigate the axisymmetric homogeneous growth of 10-100 nm water nanodroplets on a substrate surface. The main mechanism of droplet growth is attributed to the accumulation of laterally diffusing water monomers, formed by the absorption of water vapour in the environment onto the substrate. Under assumptions of quasi-steady thermodynamic equilibrium, the nanodroplet evolves according to the augmented Young-Laplace equation. Using continuum theory, we model the dynamics of nanodroplet growth including the coupled effects of disjoining pressure, contact angle and monomer diffusion. Our numerical results show that the initial droplet growth is dominated by monomer diffusion, and the steady late growth rate of droplet radius follows a power law of 1/3, which is unaffected by the substrate disjoining pressure. Instead, the disjoining pressure modifies the growth rate of the droplet height, which then follows a power law of 1/4. We demonstrate how spatial depletion of monomers could lead to a growth arrest of the nanodroplet, as observed experimentally. This work has further implications on the growth kinetics, transport and phase transition of liquids at the nanoscale. PMID:25454424
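The reported late-stage power laws (radius ~ t^(1/3), height ~ t^(1/4)) can be recovered from growth data as the slope of a log-log fit. The sketch below uses synthetic data; the times and unit prefactors are arbitrary, not values from the study:

```python
import math

def growth_exponent(times, values):
    """Least-squares slope of log(value) vs log(time), i.e. the power-law exponent."""
    xs = [math.log(t) for t in times]
    ys = [math.log(v) for v in values]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic late-stage data following the reported laws R ~ t**(1/3), h ~ t**(1/4).
times = [10.0, 20.0, 40.0, 80.0, 160.0]
radius = [t ** (1 / 3) for t in times]
height = [t ** (1 / 4) for t in times]
print(round(growth_exponent(times, radius), 3))  # ~0.333
print(round(growth_exponent(times, height), 3))  # ~0.25
```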

  6. Redatuming Operators Analysis in Homogeneous Media

    NASA Astrophysics Data System (ADS)

    Oliveira, Fransisco de Souza; Figueiredo, Jose J. S. de; Freitas, Lucas

    2015-04-01

    A redatuming operation is used to simulate the acquisition of data at new levels, avoiding distortions produced by near-surface irregularities related to either geometric or material property heterogeneities. In this work, the application of the true-amplitude Kirchhoff redatuming (TAKR) operator on homogeneous media is compared with the conventional Kirchhoff redatuming (KR) operator, restricted to the zero-offset case. The TAKR and KR operators are analytically and numerically compared in order to verify their impacts on the data at a new level. Analyses of the amplitude and velocity sensitivity of the TAKR and KR operators were performed: one concerning the difference between the weight functions and the other related to velocity variation. The comparisons between the operators were performed using numerical examples. The feasibility of the KR and TAKR operators was demonstrated not only kinematically but also dynamically for their respective purposes: one preserves amplitude (KR), and the other corrects the amplitude (TAKR). In the end, we applied the operators to a GPR data set.

  7. Homogenization models for 2-D grid structures

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Cioranescu, D.; Rebnord, D. A.

    1992-01-01

    In the past several years, we have pursued efforts related to the development of accurate models for the dynamics of flexible structures made of composite materials. Rather than viewing periodicity and sparseness as obstacles to be overcome, we exploit them to our advantage. We consider a variational problem on a domain that has large, periodically distributed holes. Using homogenization techniques we show that the solution to this problem is in some topology 'close' to the solution of a similar problem that holds on a much simpler domain. We study the behavior of the solution of the variational problem as the holes increase in number, but decrease in size in such a way that the total amount of material remains constant. The result is an equation that is in general more complex, but with a domain that is simply connected rather than perforated. We study the limit of the solution as the amount of material goes to zero. This second limit will, in most cases, retrieve much of the simplicity that was lost in the first limit without sacrificing the simplicity of the domain. Finally, we show that these results can be applied to the case of a vibrating Love-Kirchhoff plate with Kelvin-Voigt damping. We rely heavily on earlier results of (Du), (CS) for the static, undamped Love-Kirchhoff equation. Our efforts here result in a modification of those results to include both time dependence and Kelvin-Voigt damping.

  8. Fluorescent homogeneous immunosensors for detecting pathogenic bacteria.

    PubMed

    Heyduk, Ewa; Heyduk, Tomasz

    2010-01-15

    We developed a straightforward antibody-based assay for rapid homogeneous detection of bacteria. Our sensors utilize antibodies recognizing cell-surface epitopes of the target cell. Two samples of the antibody are prepared, each labeled via nanometer-size flexible linkers with short complementary oligonucleotides that are modified with fluorochromes that can participate in fluorescence resonance energy transfer (FRET). The length of the complementary oligonucleotide sequences was designed such that very little annealing occurred in the absence of the target cells. In the presence of the target cells, the two labeled antibodies bind to the surface of the cell, resulting in a large local concentration of the complementary oligonucleotides attached to the antibodies. This in turn drives the annealing of the complementary oligonucleotides, which brings the fluorescent probes into close proximity, producing large FRET signals proportional to the amount of target cells. Long flexible linkers used to attach the oligonucleotides to the antibodies enable target-induced oligonucleotide annealing even if the density of surface antigens is only modest. We used Escherichia coli O157:H7 and Salmonella typhimurium to demonstrate that this design produced sensors exhibiting rapid response time, high specificity, and sensitivity in detecting the target bacteria. PMID:19782039
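The proximity dependence these sensors exploit follows the standard Förster relation E = 1/(1 + (r/R0)^6). The sketch below uses a hypothetical Förster radius of 5 nm and illustrative inter-fluorophore distances, not values measured in the study:

```python
def fret_efficiency(r_nm, r0_nm=5.0):
    """Foerster energy-transfer efficiency E = 1 / (1 + (r/R0)^6).
    R0 = 5 nm is a hypothetical Foerster radius for illustration."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

# Annealing pulls the fluorophores from far apart (free linkers) to ~2 nm (duplex),
# switching the transfer efficiency from negligible to near unity.
print(fret_efficiency(20.0))  # unbound: negligible FRET
print(fret_efficiency(2.0))   # annealed: strong FRET
```

The sixth-power distance dependence is why target-driven annealing produces such a large signal contrast over background.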

  9. Homogeneity of Antibody Responses in Tuberculosis Patients

    PubMed Central

    Samanich, K.; Belisle, J. T.; Laal, S.

    2001-01-01

    The goals of the present study were twofold: (i) to compare the repertoires of antigens in culture filtrates of in vitro-grown Mycobacterium tuberculosis that are recognized by antibodies from noncavitary and cavitary tuberculosis (TB) patients and (ii) to determine the extent of variation that exists between the antigen profiles recognized by individual TB patients. Lipoarabinomannan-free culture filtrate proteins of M. tuberculosis were fractionated by one-dimensional (1-D) and 2-D polyacrylamide gel electrophoresis, and the Western blots were probed with sera from non-human immunodeficiency virus (non-HIV)-infected cavitary and noncavitary TB patients and from HIV-infected, noncavitary TB patients. In contrast to earlier studies based on recombinant antigens of M. tuberculosis which suggested that antibody responses in TB patients were heterogeneous (K. Lyashchenko et al., Infect. Immun. 66:3936–3940, 1998), our studies with native culture filtrate proteins show that the antibody responses in TB patients show significant homogeneity in being directed against a well-defined subset of antigens. Thus, there is a well-defined subset of culture filtrate antigens that elicits antibodies during noncavitary and cavitary disease. In addition, another set of antigens is recognized primarily by cavitary TB patients. The mapping with individual patient sera presented here suggests that serodiagnostic tests based on the subset of antigens recognized during both noncavitary and cavitary TB will enhance the sensitivity of antibody detection in TB patients, especially in difficult-to-diagnose, smear-negative, noncavitary TB patients. PMID:11402004

  10. Inhomogeneous radiative forcing of homogeneous greenhouse gases

    NASA Astrophysics Data System (ADS)

    Huang, Yi; Tan, Xiaoxiao; Xia, Yan

    2016-03-01

    Radiative forcing of a homogeneous greenhouse gas (HGG) can be very inhomogeneous because the forcing is dependent on other atmospheric and surface variables. In the case of doubling CO2, the monthly mean instantaneous forcing at the top of the atmosphere is found to vary geographically and temporally from positive to negative values, with the range (-2.5 to 5.1 W m-2) being more than 3 times the magnitude of the global mean value (2.3 W m-2). The vertical temperature change across the atmospheric column (temperature lapse rate) is found to be the best single predictor for explaining forcing variation. In addition, the masking effects of clouds and water vapor also contribute to forcing inhomogeneity. A regression model that predicts forcing from geophysical variables is constructed. This model can explain more than 90% of the variance of the forcing. Applying this model to analyzing the forcing variation in the Coupled Model Intercomparison Project Phase 5 models, we find that intermodel discrepancy in CO2 forcing caused by model climatology leads to considerable discrepancy in their projected change in poleward energy transport.
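A single-predictor version of such a regression model can be sketched with ordinary least squares. The synthetic lapse-rate data, coefficients, and noise level below are hypothetical, chosen only to mimic a predictor that explains most of the forcing variance (as the lapse rate does in the study):

```python
import random

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def r_squared(xs, ys, a, b):
    """Fraction of variance explained by the fitted line."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# Hypothetical monthly-mean samples: forcing (W m^-2) tied to lapse rate (K km^-1)
# plus noise standing in for cloud and water-vapor masking effects.
random.seed(0)
lapse = [random.uniform(4.0, 8.0) for _ in range(200)]
forcing = [-1.0 + 0.55 * x + random.gauss(0.0, 0.2) for x in lapse]
a, b = fit_line(lapse, forcing)
print(round(r_squared(lapse, forcing, a, b), 2))  # most variance explained
```

The full model in the paper uses several geophysical predictors; this single-variable sketch only illustrates the fitting step.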

  11. Simulation and modeling of homogeneous, compressed turbulence

    NASA Astrophysics Data System (ADS)

    Wu, C. T.; Ferziger, J. H.; Chapman, D. R.

    1985-05-01

    Low Reynolds number homogeneous turbulence undergoing low Mach number isotropic and one-dimensional compression was simulated by numerically solving the Navier-Stokes equations. The numerical simulations were performed on a CYBER 205 computer using a 64 x 64 x 64 mesh. A spectral method was used for spatial differencing and the second-order Runge-Kutta method for time advancement. A variety of statistical information was extracted from the computed flow fields. These include three-dimensional energy and dissipation spectra, two-point velocity correlations, one-dimensional energy spectra, turbulent kinetic energy and its dissipation rate, integral length scales, Taylor microscales, and Kolmogorov length scale. Results from the simulated flow fields were used to test one-point closure, two-equation models. A new one-point-closure, three-equation turbulence model which accounts for the effect of compression is proposed. The new model accurately calculates four types of flows (isotropic decay, isotropic compression, one-dimensional compression, and axisymmetric expansion flows) for a wide range of strain rates.

  12. Dynamic contact angle cycling homogenizes heterogeneous surfaces.

    PubMed

    Belibel, R; Barbaud, C; Mora, L

    2016-12-01

    In order to reduce restenosis, developing an appropriate coating material for metallic stents has been a challenge for biomedicine and scientific research over the past decade. Therefore, biodegradable copolymers of poly((R,S)-3,3-dimethylmalic acid) (PDMMLA) were prepared in order to develop a new coating exhibiting different custom groups in its side chain and able to carry a drug. This material will be in direct contact with cells and blood. It contains carboxylic acid and hexylic groups used for hydrophilic and hydrophobic character, respectively. The study of the wettability and dynamic surface properties of this material is important because of the influence of the chemistry and the potential motility of these chemical groups on cell adhesion and polymer hydrolysis kinetics. Cassie theory was used for the theoretical correction of the contact angles of these chemically heterogeneous surface coatings. Dynamic Surface Analysis was used as a practical homogenizer of chemically heterogeneous surfaces through repeated cycling in water. In this work, we confirmed that, unlike the receding contact angle, the advancing contact angle is influenced by a difference of only 10% in acidic groups (%A) in the side chain of the polymers. It decreases linearly with increasing acidity percentage. Hysteresis (H) is also a sensitive parameter, which is discussed in this paper. Finally, we conclude that cycling provides real information, thus avoiding the theoretical Cassie correction. H(10) is the parameter most sensitive to %A. PMID:27612817

  13. Homogeneous assay technology based on upconverting phosphors.

    PubMed

    Kuningas, Katri; Rantanen, Terhi; Ukonaho, Telle; Lövgren, Timo; Soukka, Tero

    2005-11-15

    Upconversion photoluminescence can eliminate problems associated with autofluorescence and scattered excitation light in homogeneous luminescence-based assays without the need for temporal resolution. We have demonstrated a luminescence resonance energy-transfer-based assay utilizing an inorganic upconverting (UPC) lanthanide phosphor as a donor and a fluorescent protein as an acceptor. UPC phosphors are excited at near-infrared wavelengths and have narrow-banded anti-Stokes emission at visible wavelengths, enabling measurement of the proximity-dependent sensitized emission with minimal background. The acceptor alone does not generate any direct emission at shorter wavelengths under near-infrared excitation. A competitive model assay for biotin was constructed using streptavidin-conjugated Er3+,Yb3+-doped UPC phosphor as a donor and biotinylated phycobiliprotein as an acceptor. The UPC phosphor was excited at near-infrared (980 nm) and sensitized acceptor emission was measured at a red wavelength (600 nm) by using a microtitration plate fluorometer equipped with an infrared laser diode and suitable excitation and emission filters. The lower limit of detection was in the subnanomolar concentration range. Compared to time-resolved fluorometry, the developed assay technology enabled simplified instrumentation. Excitation at near-infrared and emission at red wavelengths also render the technology suitable for the analysis of strongly colored and fluorescent samples, which are often of concern in clinical immunoassays and in high-throughput screening. PMID:16285685

  14. Pressure-strain-rate events in homogeneous turbulent shear flow

    NASA Technical Reports Server (NTRS)

    Brasseur, James G.; Lee, Moon J.

    1988-01-01

    A detailed study of the intercomponent energy transfer processes by the pressure-strain-rate in homogeneous turbulent shear flow is presented. Probability density functions (pdf's) and contour plots of the rapid and slow pressure-strain-rate show that the energy transfer processes are extremely peaky, with high-magnitude events dominating low-magnitude fluctuations, as reflected by very high flatness factors of the pressure-strain-rate. A concept of the energy transfer class was applied to investigate details of the direction as well as the magnitude of the energy transfer processes. In incompressible flow, six disjoint energy transfer classes exist. Examination of contours in instantaneous fields, pdf's and weighted pdf's of the pressure-strain-rate indicates that in the low-magnitude regions all six classes play an important role, but in the high-magnitude regions four classes of transfer processes dominate. The contribution to the average slow pressure-strain-rate from the high-magnitude fluctuations is only 50 percent or less. The relative significance of high- and low-magnitude transfer events is discussed.

  15. Creating Cross-disciplinary Courses

    PubMed Central

    Reynolds, Elaine R.

    2012-01-01

    Because of its focus on the biological underpinnings of action and behavior, neuroscience intersects with many fields of human endeavor. Some of these cross-disciplinary intersections have been long standing, while others, such as neurotheology or neuroeconomics, are more recently formed fields. Many undergraduate institutions have sought to include cross-disciplinary courses in their curriculum because this style of pedagogy is often seen as applicable to real world problems. However, it can be difficult for faculty with specialized training within their discipline to expand beyond their own fields to offer cross-disciplinary courses. I have been creating a series of multi- or cross-disciplinary courses and have found some strategies that have helped me successfully teach these classes. I will discuss general strategies and tools in developing these types of courses including: 1) creating mixed experience classrooms of students and contributing faculty 2) finding the right tools that will allow you to teach to a mixed population without prerequisites 3) examining the topic using multiple disciplinary perspectives 4) feeding off student experience and interest 5) assessing the impact of these courses on student outcomes and your neuroscience program. This last tool in particular is important in establishing the validity of this type of teaching for neuroscience students and the general student population. PMID:23494491

  16. Creating Stop-Motion Videos with iPads to Support Students' Understanding of Cell Processes: "Because You Have to Know What You're Talking about to Be Able to Do It"

    ERIC Educational Resources Information Center

    Deaton, Cynthia C. M.; Deaton, Benjamin E.; Ivankovic, Diana; Norris, Frank A.

    2013-01-01

    The purpose of this qualitative case study is two-fold: (a) describe the implementation of a stop-motion animation video activity to support students' understanding of cell processes, and (b) present research findings about students' beliefs and use of iPads to support their creation of stop-motion videos in an introductory biology course. Data…

  17. Modification of homogeneous and isotropic turbulence by solid particles

    NASA Astrophysics Data System (ADS)

    Hwang, Wontae

    2005-12-01

    Particle-laden flows are prevalent in natural and industrial environments. Dilute loadings of small, heavy particles have been observed to attenuate the turbulence levels of the carrier-phase flow, up to 80% in some cases. We attempt to increase the physical understanding of this complex phenomenon by studying the interaction of solid particles with the most fundamental type of turbulence, which is homogeneous and isotropic with no mean flow. A flow facility was developed that could create air turbulence in a nearly-spherical chamber by means of synthetic jet actuators mounted on the corners. Loudspeakers were used as the actuators. Stationary turbulence and natural decaying turbulence were investigated using two-dimensional particle image velocimetry for the base flow qualification. Results indicated that the turbulence was fairly homogeneous throughout the measurement domain and very isotropic, with small mean flow. The particle-laden flow experiments were conducted in two different environments, the lab and in micro-gravity, to examine the effects of particle wakes and flow structure distortion caused by settling particles. The laboratory experiments showed that glass particles with diameters on the order of the turbulence Kolmogorov length scale attenuated the fluid turbulent kinetic energy (TKE) and dissipation rate with increasing particle mass loadings. The main source of fluid TKE production in the chamber was the speakers, but the loss of potential energy of the settling particles also resulted in a significant amount of production of extra TKE. The sink of TKE in the chamber was due to the ordinary fluid viscous dissipation and extra dissipation caused by particles. This extra dissipation could be divided into "unresolved" dissipation caused by local velocity disturbances in the vicinity of the small particles and dissipation caused by large-scale flow distortions from particle wakes and particle clusters. The micro-gravity experiments in NASA's KC-135

  18. The Combustion Synthesis of ZnS-Doped Materials to Create Ultra-Electroluminescent Materials in Microgravity

    NASA Astrophysics Data System (ADS)

    Castillo, Martin; Steinberg, Theodore

    2012-07-01

    Self-propagating high temperature synthesis (SHS) utilises a rapid exothermic process involving high energy and nonlinearity coupled with a high cooling rate to produce materials formed outside normal equilibrium boundaries, thus possessing unique properties. The elimination of gravity during this process allows capillary forces to dominate mixing of the reactants, which results in superior and enhanced homogeneity in the product materials. The SHS of ZnS-type materials has previously been conducted in both reduced gravity and normal gravity. It has been claimed in the literature that a near-perfect phase of ZnS wurtzite was produced. Although the SHS of this material is possible at high pressures, there have been no advancements in refining this structure to create ultra-electroluminescent materials. Utilising this process with ZnS doped with Cu, Mn, or rare earth metals such as Eu and Pr leads to electroluminescent properties, thus making this an attractive electroluminescent material. The work described here revisits the SHS of ZnS and re-examines the work performed in both normal gravity and reduced gravity within the Queensland University of Technology Drop Tower Facility. Quantifications of the lattice parameters, crystal structures, and phases produced are presented to further explore the unique structure-property-performance relationships produced by the SHS of ZnS materials.

  19. Homogeneity of metal matrix composites deposited by plasma transferred arc welding

    NASA Astrophysics Data System (ADS)

    Wolfe, Tonya Brett Bunton

    Tungsten carbide-based metal matrix composite coatings are deposited by PTAW (Plasma Transferred Arc Welding) on production-critical components in oil sands mining. A homogeneous distribution of the reinforcement particles is desirable for optimal wear resistance in order to reduce unplanned maintenance shutdowns. The homogeneity of the coating can be improved by controlling the heat transfer, the solidification rate of the process, and the volume fraction of carbide. The degree of settling of the particles in the deposit was quantified using image analysis. The volume fraction of carbide was the most significant factor in obtaining a homogeneous coating. Lowering the current made a modest improvement in homogeneity. Changes made in other operational parameters did not effect significant changes in homogeneity. Infrared thermography was used to measure the temperature of the surface of the deposit during the welding process. Knowledge of the emissivity of the materials was required to acquire true temperature readings. The emissivity of the deposit was measured using laser reflectometry and was found to decrease from 0.8 to 0.2 as the temperature increased from 900°C to 1200°C. A correction algorithm was applied to calculate the actual temperature of the surface of the deposit. The corrected temperature did increase as the heat input of the weld increased. A one-dimensional mathematical model of the settling profile and solidification of the coatings was developed. The model considers convective and radiative heat input from the plasma, the build-up of the deposit, solidification of the deposit, and the settling of the WC particles within the deposit. The model showed very good agreement with the experimental results for the homogeneity of the carbide as a function of depth. This fundamental model was able to accurately predict the particle homogeneity of an MMC deposited by an extremely complicated process. It was shown that the most important variable leading to a homogeneous coating
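The emissivity correction step can be sketched under a simplified gray-body, total-radiation assumption (a real thermographic correction operates on band radiance, and the study's actual algorithm is not given). The linear emissivity model interpolates the reported 0.8-to-0.2 trend between 900°C and 1200°C:

```python
def emissivity(temp_c):
    """Linear fit to the reported trend: 0.8 at 900 C falling to 0.2 at 1200 C (clamped)."""
    eps = 0.8 + (temp_c - 900.0) * (0.2 - 0.8) / (1200.0 - 900.0)
    return max(0.2, min(0.8, eps))

def corrected_temperature_k(brightness_k, eps):
    """Gray-body, total-radiation approximation:
    sigma*Tb^4 = eps*sigma*T^4  =>  T = Tb * eps**-0.25.
    A simplified sketch, not the band-radiance correction a real IR camera applies."""
    return brightness_k * eps ** -0.25

tb = 1100.0 + 273.15             # apparent (brightness) temperature, K
eps = emissivity(1100.0)         # emissivity at the apparent temperature
print(round(corrected_temperature_k(tb, eps)))  # true surface temperature exceeds tb
```

Because the deposit's emissivity is well below 1, the uncorrected reading underestimates the surface temperature, which is why the correction matters for the heat-input analysis.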

  20. Simulation and Modeling of Homogeneous, Compressed Turbulence.

    NASA Astrophysics Data System (ADS)

    Wu, Chung-Teh

    Low Reynolds number homogeneous turbulence undergoing low Mach number isotropic and one-dimensional compression has been simulated by numerically solving the Navier-Stokes equations. The numerical simulations were carried out on a CYBER 205 computer using a 64 x 64 x 64 mesh. A spectral method was used for spatial differencing and the second -order Runge-Kutta method for time advancement. A variety of statistical information was extracted from the computed flow fields. These include three-dimensional energy and dissipation spectra, two-point velocity correlations, one -dimensional energy spectra, turbulent kinetic energy and its dissipation rate, integral length scales, Taylor microscales, and Kolmogorov length scale. It was found that the ratio of the turbulence time scale to the mean-flow time scale is an important parameter in these flows. When this ratio is large, the flow is immediately affected by the mean strain in a manner similar to that predicted by rapid distortion theory. When this ratio is small, the flow retains the character of decaying isotropic turbulence initially; only after the strain has been applied for a long period does the flow accumulate a significant reflection of the effect of mean strain. In these flows, the Kolmogorov length scale decreases rapidly with increasing total strain, due to the density increase that accompanies compression. Results from the simulated flow fields were used to test one-point-closure, two-equation turbulence models. The two-equation models perform well only when the compression rate is small compared to the eddy turn-over rate. A new one-point-closure, three-equation turbulence model which accounts for the effect of compression is proposed. The new model accurately calculates four types of flows (isotropic decay, isotropic compression, one-dimensional compression, and axisymmetric expansion flows) for a wide range of strain rates.
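The reported shrinkage of the Kolmogorov scale under compression follows from eta = (nu^3 / eps)^(1/4) with kinematic viscosity nu = mu / rho: as compression raises the density, nu falls and eta falls with it. The viscosity and dissipation values below are hypothetical, chosen only to illustrate the trend:

```python
def kolmogorov_scale(mu, rho, eps):
    """Kolmogorov length eta = (nu^3 / eps)^(1/4), with nu = mu / rho."""
    nu = mu / rho
    return (nu ** 3 / eps) ** 0.25

# Compression raises density; at fixed dynamic viscosity and dissipation rate
# (hypothetical values), the Kolmogorov scale shrinks as rho grows.
mu, eps = 1.8e-5, 1.0
for rho in [1.0, 2.0, 4.0]:
    print(rho, kolmogorov_scale(mu, rho, eps))
```

In the simulations both rho and eps change during compression; this sketch isolates only the density effect cited in the abstract.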

  1. Homogeneous and heterogenized iridium water oxidation catalysts

    NASA Astrophysics Data System (ADS)

    Macchioni, Alceo

    2014-10-01

    The development of an efficient catalyst for the oxidative splitting of water into molecular oxygen, protons and electrons is of key importance for producing solar fuels through artificial photosynthesis. We are approaching the problem by means of a rational approach aimed at understanding how catalytic performance may be optimized through knowledge of the reaction mechanism of water oxidation and the fate of the catalytic site under the inevitably harsh oxidative conditions. For the purposes of our study we selected iridium water oxidation catalysts, which exhibit remarkable performance (TOF > 5 s-1 and TON > 20000). In particular, we recently focused our attention on [Cp*Ir(N,O)X] (N,O = 2-pyridincarboxylate; X = Cl or NO3) and [IrCl(Hedta)]Na water oxidation catalysts. The former exhibited a remarkable TOF whereas the latter showed a very high TON. Furthermore, [IrCl(Hedta)]Na was heterogenized onto TiO2, taking advantage of the presence of a dangling -COOH functionality. The heterogenized catalyst maintained approximately the same catalytic activity as its homogeneous analogue, with the advantage that it could be reused many times. Mechanistic studies were performed in order to shed some light on the rate-determining step and the transformation of the catalysts when exposed to "oxidative stress". It was found that the last oxidative step, preceding oxygen liberation, is the rate-determining step when a small excess of sacrificial oxidant is used. In addition, several intermediates of the oxidative transformation of the catalyst were intercepted and characterized by NMR, X-ray diffractometry and ESI-MS.

  2. Methodologies in creating skin substitutes.

    PubMed

    Nicholas, Mathew N; Jeschke, Marc G; Amini-Nik, Saeid

    2016-09-01

    The creation of skin substitutes has significantly decreased the morbidity and mortality of skin wounds. Although currently available skin substitutes still have a number of disadvantages, research advances aimed at improving them have declined significantly over the past several years. Clinically, most skin substitutes used are acellular and do not use growth factors to assist wound healing, key areas of potential in this field of research. This article discusses the five necessary attributes of an ideal skin substitute. It comprehensively discusses the three major basic components of currently available skin substitutes: scaffold materials, growth factors, and cells, comparing and contrasting what has been used so far. It then examines a variety of techniques for incorporating these basic components together, to act as a guide for further research in the field to create cellular skin substitutes with better clinical results. PMID:27154041

  3. Creating an environment for learning.

    PubMed

    Houghton, Trish

    2016-03-16

    This article, the third in a series of 11, provides guidance to new and existing mentors and practice teachers to enable them to progress in their role and develop a portfolio of evidence that meets the Nursing and Midwifery Council's Standards to Support Learning and Assessment in Practice (SSLAP). The importance of developing a high quality practice placement is discussed in relation to the fifth domain of the SSLAP, 'creating an environment for learning'. The article provides learning activities and suggests ways in which mentors and practice teachers can undertake various self-assessments, enabling them to gather relevant evidence to demonstrate how they can meet and maintain the requirements of this domain. PMID:26982867

  4. [Cellular homogeneity in diverse portions of the diaphragm].

    PubMed

    Jiménez-Fuentes, M A; Gea, J; Mariñán, M; Gáldiz, J B; Gallego, F; Broquetas, J M

    1998-02-01

    The diaphragm is the main inspiratory muscle. It is composed of two parts, the costal and crural, with both anatomical and functional differences. The general morphometric characteristics of the diaphragm have been described in various species but homogeneity throughout the muscle has not been adequately studied. The aim of this study was to evaluate the fiber phenotype of various parts of the diaphragm. The entire diaphragm muscles of five New Zealand rabbits were removed and each was divided into quarters. The specimens were processed for morphometry (hematoxylin-eosin stains, NADH-TR and ATPase at pH levels of 4.2, 4.6 and 9.4). For each portion we measured the percentage and size of fibers, expressing the latter as minimum diameter (Dm), measured area (Ar) and calculated area (Ac). Left and right diaphragm halves (20 portions examined) were similar in fiber percentages and sizes. For left and right halves, respectively, 50 +/- 2 and 51 +/- 4% of fibers were type I; type I Dm measurements were 38 +/- 5 and 41 +/- 4 microns; type I Ar values were 1798 +/- 481 and 2030 +/- 390 micron 2; type I Ac values were 1181 +/- 360 and 1321 +/- 382 micron 2; type II Dm values were 46 +/- 4 and 46 +/- 5 microns; type II Ar values were 2466 +/- 388 and 2539 +/- 456 micron 2; type II Ac data were 1642 +/- 255 and 1655 +/- 382 micron 2. We likewise found no differences between the costal and crural portions of the muscle (n = 20). For costal and crural portions, respectively, 50 +/- 3 and 50 +/- 2% of fibers were type I; type I Dm sizes were 39 +/- 5 and 40 +/- 4 microns; type I Ar measurements were 1859 +/- 521 and 1964 +/- 365 micron 2; type I Ac figures were 1231 +/- 317 and 1266 +/- 288 micron 2; type II Dm were 47 +/- 4 and 44 +/- 3 microns; type II Ar were 2563 +/- 481 and 2430 +/- 331 micron 2; type II Ac were 1729 +/- 373 and 1557 +/- 212 micron 2. Type II fibers, however, were somewhat larger than type I fibers in all portions (p = 0.001). New Zealand rabbit

  5. Philanthropy's new agenda: creating value.

    PubMed

    Porter, M E; Kramer, M R

    1999-01-01

    During the past two decades, the number of charitable foundations in the United States has doubled while the value of their assets has increased more than 1,100%. As new wealth continues to pour into foundations, the authors take a timely look at the field and conclude that radical change is needed. First, they explain why. Compared with direct giving, foundations are strongly favored through tax preferences whose value increases in rising stock markets. As a nation, then, we make a substantial investment in foundation philanthropy that goes well beyond the original gifts of private donors. We should therefore expect foundations to achieve a social impact disproportionate to their spending. If foundations serve merely as passive conduits for giving, then they not only fall far short of their potential but also fail to meet an important societal obligation. Drawing on Porter's work on competition and strategy, the authors then present a framework for thinking systematically about how foundations create value and how the various approaches to value creation can be deployed within the context of an overarching strategy. Although many foundations talk about "strategic" giving, much current practice is at odds with strategy. Among the common problems, foundations scatter their funding too broadly, they overlook the value-creating potential of longer and closer working relationships with grantees, and they pay insufficient attention to the ultimate results of the work they fund. This article lays out a blueprint for change, challenging foundation leaders to spearhead the evolution of philanthropy from private acts of conscience into a professional field. PMID:10662001

  6. A non-asymptotic homogenization theory for periodic electromagnetic structures

    PubMed Central

    Tsukerman, Igor; Markel, Vadim A.

    2014-01-01

    Homogenization of electromagnetic periodic composites is treated as a two-scale problem and solved by approximating the fields on both scales with eigenmodes that satisfy Maxwell's equations and boundary conditions as accurately as possible. Built into this homogenization methodology is an error indicator whose value characterizes the accuracy of homogenization. The proposed theory allows one to define not only bulk, but also position-dependent material parameters (e.g. in proximity to a physical boundary) and to quantify the trade-off between the accuracy of homogenization and its range of applicability to various illumination conditions. PMID:25104912

  7. Operator Algebra Quantum Homogeneous Spaces of Universal Gauge Groups

    NASA Astrophysics Data System (ADS)

    Mahanta, Snigdhayan; Mathai, Varghese

    2011-09-01

    In this paper, we quantize universal gauge groups such as SU(∞), as well as their homogeneous spaces, in the σ-C*-algebra setting. More precisely, we propose concise definitions of σ-C*-quantum groups and σ-C*-quantum homogeneous spaces and explain these concepts. At the same time, we place these definitions in the mathematical context of countably compactly generated spaces as well as C*-compact quantum groups and homogeneous spaces. We also study the representable K-theory of these spaces and compute it for the quantum homogeneous spaces associated to the quantum version of the universal gauge group SU(∞).

  8. Development of Dynamic Explicit Crystallographic Homogenization Finite Element Analysis Code to Assess Sheet Metal Formability

    NASA Astrophysics Data System (ADS)

    Nakamura, Yasunori; Tam, Nguyen Ngoc; Ohata, Tomiso; Morita, Kiminori; Nakamachi, Eiji

    2004-06-01

    The crystallographic texture evolution induced by plastic deformation during sheet metal forming has a great influence on formability. In the present study, a dynamic explicit finite element (FE) analysis code is newly developed by introducing a crystallographic homogenization method to estimate polycrystalline sheet metal formability, such as extreme thinning and "earing." The code predicts the deformation-induced texture evolution at the micro scale and the plastic anisotropy at the macro scale simultaneously. This multi-scale analysis couples the microscopic, inhomogeneous crystal-plasticity deformation with the macroscopic continuum deformation. In this homogenization process, the stress at the macro scale is defined as the volume average of the stresses of the corresponding microscopic crystal aggregates, satisfying the equation of motion and the compatibility condition in the micro-scale "unit cell," where periodicity of deformation holds. The homogenization algorithm is implemented in a conventional dynamic explicit finite element code employing the updated Lagrangian formulation and a rate-type elastic/viscoplastic constitutive equation. First, texture evolution analyses for typical deformation modes confirmed that Taylor's "constant strain homogenization algorithm" yields extreme concentration toward the preferred crystal orientations compared with the present homogenization scheme. Second, we study the effects of plastic anisotropy on "earing" in the hemispherical-cup deep drawing of a pure ferrite phase sheet metal. Comparison of the analytical results with those obtained under Taylor's assumption leads to the conclusion that the newly developed dynamic explicit crystallographic homogenization FEM gives a more reasonable prediction of deformation-induced texture evolution and of plastic anisotropy at the macro scale.
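    The volume-averaging step at the heart of this homogenization can be sketched in a few lines. The per-grain stresses and volume fractions below are random stand-ins for the output of a microscale crystal-plasticity solve, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(1)
n_grains = 8

# Hypothetical per-grain Cauchy stress tensors from a micro-scale solve.
sigma_micro = rng.normal(size=(n_grains, 3, 3))
sigma_micro = 0.5 * (sigma_micro + sigma_micro.transpose(0, 2, 1))  # symmetrize

# Grain volume fractions within the unit cell (sum to 1).
vol_frac = rng.dirichlet(np.ones(n_grains))

# Macro stress = volume average of the microscopic stresses.
sigma_macro = np.einsum("g,gij->ij", vol_frac, sigma_micro)
```

The macro stress inherits the symmetry of the per-grain tensors, which is exactly the property the continuum-scale equilibrium equations require.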

  9. A Clash between Cultures: An Approach To Reducing Cultural Hostility in a Homogeneous Classroom.

    ERIC Educational Resources Information Center

    Hunt, David Marshall

    2000-01-01

    Discusses the need to teach cultural awareness and reduce cultural hostility with culturally homogeneous college classes. Recounts use, in a class of graduate business students, of the case of Salman Rushdie being placed under an Iranian death threat for his writings. Explains the six-step process used to structure reading and discussion of the…

  10. A method to eliminate wetting during the homogenization of HgCdTe

    NASA Technical Reports Server (NTRS)

    Su, Ching-Hua; Lehoczky, S. L.; Szofran, F. R.

    1986-01-01

    Adhesion of HgCdTe samples to fused silica ampoule walls, or 'wetting', during the homogenization process was eliminated by adopting a slower heating rate. The idea is to decrease Cd activity in the sample so as to reduce the rate of reaction between Cd and the silica wall.

  11. Enhanced HFIR overpower margin through improvements in fuel plate homogeneity inspection

    SciTech Connect

    Rothrock, R.B.; Hale, R.E.; Knight, R.W.; Cheverton, R.D.

    1995-09-01

    Fuel homogeneity inspection techniques used on the HFIR fuel plates have recently been improved through conversion of the X-ray inspection device to acquire, store, and process data digitally. This paper reports some early results from using the improved equipment and describes future plans for obtaining enhanced fuel thermal performance by exploiting this improved inspection capability.

  12. Create a Pint-Sized Photo Book.

    ERIC Educational Resources Information Center

    Gathright, Pat

    2003-01-01

    Explains a project, which involves creating a book using digital images. Notes that teachers can create books with samples of their work. Provides other suggestions for using this project, such as teaching scanning, creating a photo portfolio as a semester exam project, or creating introduction pieces for yearbook or newspaper staffers. (PM)

  13. Electrothermal atomic absorption spectrophotometry of nickel in tissue homogenates

    SciTech Connect

    Sunderman, F.W. Jr.; Marzouk, A.; Crisostomo, M.C.; Weatherby, D.R.

    1985-01-01

    A method for analysis of Ni concentrations in tissues is described, which involves (a) tissue dissection with metal-free obsidian knives, (b) tissue homogenization in polyethylene bags by use of a Stomacher blender, (c) oxidative digestion with mixed nitric, sulfuric, and perchloric acids, and (d) quantitation of Ni by electrothermal atomic absorption spectrophotometry with Zeeman background correction. The detection limit for Ni in tissues is 10 ng per g, dry weight; the coefficient of variation ranges from 7 to 15%, depending on the tissue Ni concentration; the recovery of Ni added in a concentration of 20 ng per g, dry weight, to kidney homogenates averages 101 +/- 8% (mean +/- SD). In control rats, Ni concentrations are highest in lung (102 +/- 39 ng per g, dry weight) and lowest in spleen (35 +/- 16 ng per g, dry weight). In descending order of Ni concentration, the tissues of control rats rank as follows: lung > heart > bone > kidney > brain > testis > fat > liver > spleen. In rats killed 24 h after sc injection of NiCl₂ (0.125 mmol per kg, body weight), Ni concentrations are highest in kidney (17.7 +/- 2.5 µg per g, dry weight) and lowest in brain (0.38 +/- 0.14 µg per g, dry weight). In descending order of Ni concentration, the tissues of NiCl₂-treated rats rank as follows: kidney >> lung > spleen > testis > heart > fat > liver > bone > brain. The present method fills the need for an accurate, sensitive, and practical technique to determine tissue Ni concentrations, with stringent precautions to minimize Ni contamination during tissue sampling and processing. 35 references, 5 figures, 1 table.

  14. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion, as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
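    A minimal stand-in for such an empirical time- and temperature-dependent density model might look like the following; the functional form and all coefficients are illustrative assumptions, not the SIERRA/ARIA model.

```python
import numpy as np

# Empirical foam density model sketch: density relaxes from the initial
# resin density rho0 toward a final foam density rhof at an Arrhenius
# (temperature-dependent) rate. All coefficients are placeholders.
def foam_density(t_s, T_K, rho0=1200.0, rhof=80.0, k0=5.0e4, Ea_J_mol=50.0e3):
    R = 8.314  # gas constant, J/(mol K)
    k = k0 * np.exp(-Ea_J_mol / (R * T_K))  # expansion rate, 1/s
    return rhof + (rho0 - rhof) * np.exp(-k * t_s)
```

The time derivative of such a density law is what drives the free-surface motion in a level-set front-tracking scheme: faster density drop, faster expansion of the foam front.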

  15. Phase resolved analysis of the homogeneity of a diffuse dielectric barrier discharge

    NASA Astrophysics Data System (ADS)

    Baldus, Sabrina; Kogelheide, Friederike; Bibinov, Nikita; Stapelmann, Katharina; Awakowicz, Peter

    2015-09-01

    Cold atmospheric-pressure plasmas have already proven their ability to support the healing of chronic wounds. Especially simple configurations such as the dielectric barrier discharge (DBD), comprising a single driven electrode coated with a dielectric layer, are of interest because they are cost-effective and easy to handle. The homogeneity of such plasmas during treatment matters because the whole wound should be treated evenly. In this investigation, phase-resolved optical emission spectroscopy is used to investigate the homogeneity of a DBD. Electron densities and reduced electric field distributions are determined with temporal and spatial resolution, and the differences between applied positive and negative voltage pulses are studied.

  16. Arc melting and homogenization of ZrC and ZrC + B alloys

    NASA Technical Reports Server (NTRS)

    Darolia, R.; Archbold, T. F.

    1973-01-01

    A description is given of the methods used to arc-melt and to homogenize near-stoichiometric ZrC and ZrC-boron alloys, giving attention to the oxygen contamination problem. The starting material for the carbide preparation was ZrC powder with an average particle size of 4.6 micron. Pellets weighing approximately 3 g each were prepared at room temperature from the powder by the use of an isostatic press operated at 50,000 psi. These pellets were individually melted in an arc furnace containing a static atmosphere of purified argon. A graphite resistance furnace was used for the homogenization process.

  17. Analyzing melt homogeneity in a single screw plasticizing unit of an injection molding machine

    NASA Astrophysics Data System (ADS)

    Straka, K.; Praher, B.; Steinbichler, G.

    2013-10-01

    In injection molding, investigations of the mixing efficiency and thermal homogeneity of the melt in the screw chamber are of great interest, as they directly affect the quality of the molded parts. For most injection molding applications, mixing is performed in the single screw plasticizing unit of the injection molding machine. In this work, a CFD approach with two coupled fluid domains is used to describe the plasticizing process in an injection molding machine: one domain rotates and translates in the axial direction (the screw), while the other increases in length (the chamber). On the basis of the calculated pressure, velocity, and temperature fields of the polymer melt, the thermal homogeneity of the melt is investigated. To analyze the optical-mechanical homogeneity of the melt, an Eulerian-Lagrangian method is used to calculate the distribution of tracer particles within the screw chamber.
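    The Lagrangian tracer step can be sketched independently of the CFD solve. The analytic rotation field below is a stand-in for the computed melt velocity field, and explicit Euler time stepping is used purely for brevity; none of this is the authors' implementation.

```python
import numpy as np

def velocity(x):
    # Illustrative stand-in for the CFD velocity field: solid-body
    # rotation about the origin in the x-y plane.
    return np.stack([-x[:, 1], x[:, 0]], axis=1)

def advect_tracers(x0, dt, n_steps):
    # Explicit-Euler advection of passive tracer particles through
    # the (frozen) velocity field.
    x = x0.copy()
    for _ in range(n_steps):
        x = x + dt * velocity(x)
    return x

tracers = np.array([[1.0, 0.0], [0.5, 0.5]])
moved = advect_tracers(tracers, dt=0.01, n_steps=100)
```

In a real plasticizing simulation the velocity would be interpolated from the finite-volume solution at each step, and the spread of the tracer cloud over time quantifies the mixing efficiency.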

  18. Microstructural evolution in Al-Zn-Mg-Cu-Sc-Zr alloys during short-time homogenization

    NASA Astrophysics Data System (ADS)

    Liu, Tao; He, Chun-nian; Li, Gen; Meng, Xin; Shi, Chun-sheng; Zhao, Nai-qin

    2015-05-01

    Microstructural evolution in a new kind of aluminum (Al) alloy with the chemical composition Al-8.82Zn-2.08Mg-0.80Cu-0.31Sc-0.3Zr was investigated. It is found that the secondary phase MgZn2 is completely dissolved into the matrix during a short homogenization treatment (470°C, 1 h), while the primary phase Al3(Sc,Zr) remains stable. This is attributed to the Sc and Zr additions to the Al alloy, the high Zn/Mg mass ratio, and the low Cu content. The experimental findings fit well with the results calculated from the homogenization diffusion kinetics equation. The alloy shows excellent mechanical performance after the short homogenization process followed by hot extrusion and T6 treatment. Consequently, a good combination of low energy consumption and favorable mechanical properties is obtained.
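    Homogenization diffusion kinetics of this kind follow a standard textbook form: a sinusoidal solute segregation of wavelength L decays as exp(-4π²Dt/L²). The sketch below evaluates that decay for a 470°C, 1 h treatment; the Arrhenius diffusivity parameters and the dendrite-arm spacing are illustrative assumptions, not the paper's measured values.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def diffusivity(T_K, D0_m2_s=1.0e-4, Q_J_mol=1.3e5):
    # Arrhenius solute diffusivity; D0 and Q are placeholder values
    # of the right order for a solute in Al, not fitted data.
    return D0_m2_s * np.exp(-Q_J_mol / (R * T_K))

def residual_segregation(t_s, T_K, L_m):
    # Fractional amplitude remaining of a sinusoidal segregation of
    # wavelength L after time t at temperature T.
    D = diffusivity(T_K)
    return np.exp(-4.0 * np.pi**2 * D * t_s / L_m**2)

# 1 h at 470 C, assuming a 20 um segregation wavelength.
delta = residual_segregation(3600.0, 743.15, 20e-6)
```

With these placeholder numbers the residual segregation after one hour is negligible, which is at least qualitatively consistent with the complete MgZn2 dissolution the abstract reports; note the strong L² dependence, so coarser microstructures homogenize far more slowly.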

  19. Creating experimental color harmony map

    NASA Astrophysics Data System (ADS)

    Chamaret, Christel; Urban, Fabrice; Lepinel, Josselin

    2014-02-01

    Starting in the 17th century with Newton, color harmony is a topic that has not yet reached a consensus on definition, representation, or modeling. Previous work highlighted specific characteristics of color harmony for combinations of color doublets or triplets by means of human ratings on a harmony scale. However, no investigation has involved complex stimuli or pointed out how harmony is spatially located within a picture. Modeling such a concept, together with a reliable ground truth, would be of high value, since the applications are wide and concern several communities, from psychology to computer graphics. We propose a protocol for creating color harmony maps from a controlled experiment. Through an eye-tracking protocol, we focus on the identification of disharmonious colors in pictures. The experiment was composed of a free-viewing pass, to let the observer become familiar with the content, followed by a second pass in which we asked observers "to search for the most disharmonious areas in the picture". Twenty-seven observers participated in the experiment, which comprised a total of 30 different stimuli. The high inter-observer agreement as well as a cross-validation confirm the validity of the proposed ground truth.

  20. Creating a winning organizational culture.

    PubMed

    Campbell, Robert James

    2009-01-01

    This article explores how to create a winning organizational culture. By definition, a winning organizational culture is one that is able to make current innovations stick while continuously changing based on the demands of the marketplace. More importantly, the article explores the notion that a winning organizational culture can have a profound impact on the consciousness of the workforce, helping each individual become a better, more productive person who provides important services and products to the community. To form a basis for defining the structure of a winning organizational culture, 4 experts were asked 12 questions related to the development of an organizational culture. Three of the experts have worked intimately within the health care industry, while the fourth has been charged with turning around an organization that had a losing culture for 17 years. The article provides insight into the roles that values, norms, goals, leadership style, familiarity, and hiring practices play in developing a winning organizational culture. The article also emphasizes the important role that leaders perform in developing an organizational culture. PMID:19910709

  1. Creating healthy and just bioregions.

    PubMed

    Pezzoli, Keith; Leiter, Robert Allen

    2016-03-01

    Dramatic changes taking place locally, regionally, and globally demand that we rethink strategies to improve public health, especially in disadvantaged communities where the cumulative impacts of toxicant exposure and other environmental and social stressors are most damaging. The emergent field of Sustainability Science, including a new bioregionalism for the 21st century, is giving rise to promising place-based (territorially rooted) approaches. Embedded in this bioregional approach is an integrated planning framework (IPF) that enables people to map and develop plans and strategies that cut across various scales (e.g. from regional to citywide to neighborhood scale) and various topical areas (e.g. urban land use planning, water resource planning, food systems planning, and "green infrastructure" planning) with the specific intent of reducing the impacts of toxicants on public health and the natural environment. This paper describes a case of bioregionally inspired integrated planning in San Diego, California (USA). The paper highlights food-water-energy linkages and the importance of "rooted" community-university partnerships and knowledge-action collaboratives in creating healthy and just bioregions. PMID:26812849

  2. Creating a urine black hole

    NASA Astrophysics Data System (ADS)

    Hurd, Randy; Pan, Zhao; Meritt, Andrew; Belden, Jesse; Truscott, Tadd

    2015-11-01

    Since the mid-nineteenth century, both enlisted and fashion-conscious owners of khaki trousers have been plagued by undesired speckle patterns resulting from splash-back while urinating. In recent years, industrial designers and hygiene-driven entrepreneurs have sought to limit this splashing by creating urinal inserts, with the effectiveness of their inventions varying drastically. From this large assortment of inserts, designs consisting of macroscopic pillar arrays seem to be the most effective splash suppressors. Interestingly, this design partially mimics the geometry of the water-capturing moss Syntrichia caninervis, which exhibits a notable ability to suppress splash and quickly absorb water from impacting rain droplets. With this natural splash suppressor in mind, we search for the ideal urine black hole by performing experiments of simulated urine streams (water droplet streams) impacting macroscopic pillar arrays with varying parameters, including pillar height and spacing, draining, and material properties. We propose improved urinal insert designs based on our experimental data in hopes of reducing the potential embarrassment inherent in wearing khakis.

  3. Laser Created Relativistic Positron Jets

    SciTech Connect

    Chen, H; Wilks, S C; Meyerhofer, D D; Bonlie, J; Chen, C D; Chen, S N; Courtois, C; Elberson, L; Gregori, G; Kruer, W; Landoas, O; Mithen, J; Murphy, C; Nilson, P; Price, D; Scheider, M; Shepherd, R; Stoeckl, C; Tabak, M; Tommasini, R; Beiersdorder, P

    2009-10-08

    Electron-positron jets with MeV temperature are thought to be present in a wide variety of astrophysical phenomena such as active galaxies, quasars, gamma ray bursts and black holes. They have now been created in the laboratory in a controlled fashion by irradiating a gold target with an intense picosecond duration laser pulse. About 10¹¹ MeV positrons are emitted from the rear surface of the target in a 15 to 22-degree cone for a duration comparable to the laser pulse. These positron jets are quasi-monoenergetic (E/ΔE ≈ 5) with peak energies controllable from 3-19 MeV. They have temperatures from 1-4 MeV in the beam frame in both the longitudinal and transverse directions. Positron production has been studied extensively in recent decades at low energies (sub-MeV) in areas related to surface science, positron emission tomography, basic antimatter science such as antihydrogen experiments, Bose-Einstein condensed positronium, and basic plasma physics. However, the experimental tools to produce very high temperature positrons and high-flux positron jets needed to simulate astrophysical positron conditions have so far been absent. The MeV temperature jets of positrons and electrons produced in our experiments offer a first step to evaluate the physics models used to explain some of the most energetic phenomena in the universe.

  4. Creating Effective K-12 Outreach

    NASA Astrophysics Data System (ADS)

    Hopkins, J.

    2011-12-01

    Grant opportunities require investigators to provide 'broader impacts' for their scientific research. For most researchers this involves some kind of educational outreach for the K-12 community. I have been able to participate in many different types of grant funded science teacher professional development programs. The most valuable have been outreach where the research seamlessly integrated with my classroom curriculum and was sustainable with my future classes. To accomplish these types of programs, the investigators needed to research the K-12 community and identify several key aspects of the K-12 environment where their expertise would benefit me and my students. There are a lot of different K-12 learning environments, so researchers need to be sure to match up with the right grade level and administrative environment. You might want to consider non-main stream school settings, such as magnet programs, STEM academies, and distance learning. The goal is to try to make your outreach seem natural and productive. This presentation will illustrate how researchers can create an educational outreach project that will be a win-win situation for everyone involved.

  5. Adolescents and HIV: creating partnerships.

    PubMed

    Tierney, S

    1998-05-01

    Despite the President's directive on youth and HIV in 1997 to focus the nation's attention on adolescents and the battle against AIDS, prevention programs continue to be ineffective. The number of seropositive youth, ages 13 to 24 years old, is unclear due to inconsistent definitions of age ranges and inadequate access to testing. Youth have not sought testing for many reasons, including failing to perceive their vulnerability to HIV, confidentiality concerns, and not realizing the effectiveness of early treatment. Adolescents are creating independence, establishing relationships, and learning about drugs and alcohol. Young gay and bisexual men, drug-using youth, and youth of color are at high risk of HIV transmission. Identifying the population involved in risk-taking behavior and eliminating the behavior is an ineffective strategy for adolescent HIV prevention programs. Complicating the issue further, the goals and expectations of adolescents differ from the adults who design and deliver prevention programs. HIV education and prevention efforts need to address solutions to hopelessness, isolation, and violence, rather than focusing on the negative effects risky behaviors will have in the future. Effective programs combine a youth/adult partnership to take advantage of the strengths of each individual. Strategies for implementing prevention programs that address the specific needs of adolescents are suggested. PMID:11365416

  6. Homogeneous solar photodegradation of contaminants in water

    SciTech Connect

    Bolton, J.R.; Ravel, M.; Cater, S.R.; Safarzadeh-Amiri, A.

    1996-12-31

    Solarchem has developed a new patented process (Rayox®-A), involving the use of ferrioxalate, which absorbs out to 500 nm and thus uses a much larger fraction of the spectral output of a medium pressure mercury lamp, compared to the UV/H₂O₂ process. The new process generates hydroxyl radicals, so the chemistry is very similar to that in UV/H₂O₂ treatments. A variant of this process (called Solaqua®) uses solar energy as the light source. This paper describes the use of solar radiation to destroy contaminants in ground and process waters. Results will be presented of the treatment of 1,4-dioxane spiked into tap water in a 1.45 m² test solar reactor, where they have obtained quantum yields of ≈ 1 for the removal of 1,4-dioxane. Figures-of-merit, based on the collector area (m²) required to achieve a certain treatment under standard solar conditions, are introduced, defined, and applied to the Solaqua® process for the treatment of 1,4-dioxane.
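    A collector-area figure of merit of this general kind can be computed as below. The exact definition used in the paper, including any standardization to a reference solar irradiance, may differ, so treat this as a hedged sketch with hypothetical numbers.

```python
import numpy as np

def area_per_order(area_m2, time_h, volume_m3, c_initial, c_final):
    # Collector area x exposure time needed per unit volume to reduce
    # the contaminant by one order of magnitude; sketch definition only.
    return area_m2 * time_h / (volume_m3 * np.log10(c_initial / c_final))

# Hypothetical run: a 1.45 m^2 collector treating 1 m^3 for 2 h,
# achieving 90% removal (one order of magnitude) of 1,4-dioxane.
fom = area_per_order(1.45, 2.0, 1.0, 100.0, 10.0)
print(fom)  # m^2 h per m^3 per order
```

A lower value means a more efficient solar process: less collector area and time are needed per order of magnitude of contaminant destroyed.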

  7. Converting homogeneous to heterogeneous in electrophilic catalysis using monodisperse metal nanoparticles

    NASA Astrophysics Data System (ADS)

    Witham, Cole A.; Huang, Wenyu; Tsung, Chia-Kuang; Kuhn, John N.; Somorjai, Gabor A.; Toste, F. Dean

    2010-01-01

    A continuing goal in catalysis is to unite the advantages of homogeneous and heterogeneous catalytic processes. To this end, nanoparticles represent a new frontier in heterogeneous catalysis, where this unification can also be supplemented by the ability to obtain new or divergent reactivity and selectivity. We report a novel method for applying heterogeneous catalysts to known homogeneous catalytic reactions through the design and synthesis of electrophilic platinum nanoparticles. These nanoparticles are selectively oxidized by the hypervalent iodine species PhICl2, and catalyse a range of π-bond activation reactions previously only catalysed through homogeneous processes. Multiple experimental methods are used to unambiguously verify the heterogeneity of the catalytic process. The discovery of treatments for nanoparticles that induce the desired homogeneous catalytic activity should lead to the further development of reactions previously inaccessible in heterogeneous catalysis. Furthermore, a size and capping agent study revealed that Pt PAMAM dendrimer-capped nanoparticles demonstrate superior activity and recyclability compared with larger, polymer-capped analogues.

  8. Feeding premature infants banked human milk homogenized by ultrasonic treatment.

    PubMed

    Rayol, M R; Martinez, F E; Jorge, S M; Gonçalves, A L; Desai, I D

    1993-12-01

    Premature neonates fed ultrasonically homogenized human milk had better weight gain and triceps skin-fold thickness than did a control group given untreated human milk (p < 0.01) and also had lower fat loss during tube feeding (p < 0.01). Ultrasonic homogenization of human milk appears to minimize loss of fat and thus allows better growth of premature infants. PMID:8229535

  9. Sensitivity of liquid clouds to homogenous freezing parameterizations

    PubMed Central

    Herbert, Ross J; Murray, Benjamin J; Dobbie, Steven J; Koop, Thomas

    2015-01-01

    Water droplets in some clouds can supercool to temperatures where homogeneous ice nucleation becomes the dominant freezing mechanism. In many cloud resolving and mesoscale models, it is assumed that homogeneous ice nucleation in water droplets only occurs below some threshold temperature typically set at −40°C. However, laboratory measurements show that there is a finite rate of nucleation at warmer temperatures. In this study we use a parcel model with detailed microphysics to show that cloud properties can be sensitive to homogeneous ice nucleation as warm as −30°C. Thus, homogeneous ice nucleation may be more important for cloud development, precipitation rates, and key cloud radiative parameters than is often assumed. Furthermore, we show that cloud development is particularly sensitive to the temperature dependence of the nucleation rate. In order to better constrain the parameterization of homogeneous ice nucleation, laboratory measurements are needed at both high (>−35°C) and low (<−38°C) temperatures. Key points: homogeneous freezing may be significant as warm as −30°C; homogeneous freezing should not be represented by a threshold approximation; there is a need for an improved parameterization of homogeneous ice nucleation. PMID:26074652
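    The contrast between a hard temperature threshold and a finite nucleation rate can be illustrated with the standard volume-dependent frozen fraction f = 1 − exp(−J(T)·V·t). The rate parameterization below is a deliberately crude placeholder with made-up coefficients, not the laboratory fit the authors call for.

```python
import numpy as np

def log10_J(T_c):
    # Placeholder steep temperature dependence of the homogeneous
    # nucleation rate coefficient J (cm^-3 s^-1). The coefficients
    # are illustrative only, NOT a validated parameterization.
    return 8.0 - 3.0 * (T_c + 35.0)

def frozen_fraction(T_c, radius_um=10.0, t_s=1.0):
    # Probability that a supercooled droplet of the given radius
    # freezes homogeneously within t_s seconds at T_c (Celsius).
    V_cm3 = (4.0 / 3.0) * np.pi * (radius_um * 1.0e-4) ** 3
    J = 10.0 ** log10_J(T_c)
    return -np.expm1(-J * V_cm3 * t_s)  # expm1 keeps tiny values exact

def frozen_fraction_threshold(T_c, threshold_c=-40.0):
    # The threshold treatment used in many models: all-or-nothing.
    return 1.0 if T_c < threshold_c else 0.0
```

The rate-based form gives a small but nonzero frozen fraction well above the −40°C threshold, which is exactly the behavior the threshold approximation discards.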

  10. Improving non-homogeneous regression for probabilistic precipitation forecasts

    NASA Astrophysics Data System (ADS)

    Presser, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2016-04-01

    Non-homogeneous regression is a state-of-the-art ensemble post-processing technique that statistically corrects ensemble forecasts and predicts a full probability distribution. Originally, a Gaussian model is employed that linearly links the predicted distribution mean and variance to the ensemble mean and variance, respectively. For non-normally distributed precipitation data, this model can be censored at zero to account for periods without precipitation. We improve this regression approach in several directions. First, we consider link functions in the variance sub-model that assure positivity of the model variance. Second, we consider a censored logistic (instead of censored Gaussian) distribution to accommodate more frequent events with high precipitation. Third, we introduce a splitting procedure, which appropriately accounts for perfect-prediction cases, i.e., where no precipitation is observed when all ensemble members predict no precipitation. The approach is applied to different accumulation periods (3, 6, 12, 24 hours) for short-range precipitation forecasts in Northern Italy. The choice of link function for the variance parameter, the splitting procedure, and an appropriate distribution assumption for precipitation data significantly improve the probabilistic forecast skill, especially for shorter accumulation periods. KEYWORDS: heteroscedastic ensemble post-processing, censored distribution, maximum likelihood estimation, probabilistic precipitation forecasting
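    A minimal maximum-likelihood sketch of the censored Gaussian variant on synthetic data is shown below. The variable names, the log link that keeps the variance positive, and the synthetic ensemble statistics are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic ensemble statistics for 500 forecast cases.
n = 500
ens_mean = rng.gamma(2.0, 2.0, n)      # ensemble mean precipitation
ens_sd = 0.5 + 0.3 * ens_mean          # ensemble spread
# Latent truth: Gaussian around the ensemble mean, censored at zero.
latent = ens_mean + ens_sd * rng.standard_normal(n)
obs = np.maximum(latent, 0.0)          # observed precipitation >= 0

def nll(params):
    # Negative log-likelihood of a Gaussian left-censored at zero,
    # with mean linear in ens_mean and a log link on the std dev.
    a, b, c, d = params
    mu = a + b * ens_mean
    sigma = np.exp(c + d * np.log(ens_sd))  # log link keeps sigma > 0
    censored = obs <= 0.0
    ll = np.where(
        censored,
        norm.logcdf(-mu / sigma),           # P(latent <= 0)
        norm.logpdf(obs, mu, sigma),        # density for positive obs
    )
    return -ll.sum()

x0 = np.array([0.0, 1.0, 0.0, 0.0])
fit = minimize(nll, x0=x0, method="Nelder-Mead")
```

The fitted slope on the ensemble mean should come out near the true value of 1 used to generate the data; swapping the Gaussian for a logistic distribution, as the paper proposes, only changes the logcdf/logpdf terms.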

  11. Powerful laser pulse absorption in partly homogenized foam plasma

    NASA Astrophysics Data System (ADS)

    Cipriani, M.; Gus'kov, S. Yu.; De Angelis, R.; Andreoli, P.; Consoli, F.; Cristofari, G.; Di Giorgio, G.; Ingenito, F.; Rupasov, A. A.

    2016-03-01

    The internal volume structure of a porous medium of light elements determines unique features of the absorption of laser radiation; the characteristics of relaxation and transport processes in the produced plasma are affected as well. Porous materials with an average density larger than the critical density play a central role in enhancing the pressure produced during ablation by the laser pulse; this pressure can exceed that produced by direct irradiation of the target. The problem of the absorption of powerful laser radiation in a porous material is examined both analytically and numerically. The behavior of the medium during the process of pore filling in the heated region is described by a model of viscous homogenization. An expression describing the time and space dependence of the absorption coefficient of laser radiation is obtained from this model. A numerical investigation of the absorption of a nanosecond laser pulse is performed within the present model, considering porous media with an average density larger than the critical density of the laser-produced plasma. Preliminary results on the inclusion of the developed absorption model in a hydrodynamic code are presented.

  12. Dynamics behavior of homogeneous dielectric barrier discharge at atmospheric pressure

    NASA Astrophysics Data System (ADS)

    Zhang, Yan; Gu, Biao; Wang, Wenchun; Wang, Dezhen; Peng, Xuwen

    2009-07-01

    An experimental study of the dynamics of homogeneous dielectric barrier discharge (HDBD) at atmospheric pressure is described in this paper. Two discharge modes, the glow and Townsend modes, can be readily identified from the differential conductivity of the current-voltage relationship during the rising stage of the discharge current. A three-dimensional (3D) phase space spanned by the discharge current, gas-gap voltage, and surface charge density on the dielectric plate was utilized in the study. When the discharge evolution is projected into this 3D space, the trajectory of a multiple-current-peak discharge in atmospheric helium is a limit cycle with convolutions that undergoes a series of bifurcations, whereas the trajectory of an atmospheric N2 HDBD is a limit cycle without any convolutions or bifurcations. In addition, the first ionization coefficient of the working gas plays a key role in determining the discharge mode of an atmospheric HDBD, the transition between modes, and the dynamical stability of the discharge.
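The phase-space coordinates used in this record (discharge current, gas-gap voltage, dielectric surface charge) can be reconstructed from measured waveforms with a standard equivalent-circuit calculation. The sketch below uses toy sinusoidal waveforms and assumed component values (barrier capacitance, electrode area); it only illustrates the bookkeeping, not the experiment.

```python
import numpy as np

# Toy waveforms standing in for measured data (hypothetical values)
t = np.linspace(0.0, 1.0e-4, 2000)                    # s, one 10 kHz period
v_applied = 2000.0 * np.sin(2.0 * np.pi * 1.0e4 * t)  # applied voltage, V
i = 1.0e-3 * np.cos(2.0 * np.pi * 1.0e4 * t)          # discharge current, A

C_d = 50e-12   # assumed series capacitance of the dielectric barrier, F
area = 1.0e-4  # assumed electrode area, m^2

# Transferred charge by trapezoidal integration of the current
q = np.concatenate(([0.0], np.cumsum(0.5 * (i[1:] + i[:-1]) * np.diff(t))))

v_gap = v_applied - q / C_d  # voltage across the gas gap
sigma = q / area             # surface charge density on the dielectric

# One row per sample of the 3D trajectory (current, gap voltage, charge density)
trajectory = np.column_stack((i, v_gap, sigma))
```

Plotting `trajectory` over many periods would reveal whether the orbit closes into a simple limit cycle or develops convolutions and bifurcations.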

  13. Dynamics behavior of homogeneous dielectric barrier discharge at atmospheric pressure

    SciTech Connect

    Zhang Yan; Gu Biao; Wang Wenchun; Wang Dezhen; Peng Xuwen

    2009-07-15

    An experimental study of the dynamics of homogeneous dielectric barrier discharge (HDBD) at atmospheric pressure is described in this paper. Two discharge modes, the glow and Townsend modes, can be readily identified from the differential conductivity of the current-voltage relationship during the rising stage of the discharge current. A three-dimensional (3D) phase space spanned by the discharge current, gas-gap voltage, and surface charge density on the dielectric plate was utilized in the study. When the discharge evolution is projected into this 3D space, the trajectory of a multiple-current-peak discharge in atmospheric helium is a limit cycle with convolutions that undergoes a series of bifurcations, whereas the trajectory of an atmospheric N2 HDBD is a limit cycle without any convolutions or bifurcations. In addition, the first ionization coefficient of the working gas plays a key role in determining the discharge mode of an atmospheric HDBD, the transition between modes, and the dynamical stability of the discharge.

  14. Dynamics of spiking neurons: between homogeneity and synchrony.

    PubMed

    Rangan, Aaditya V; Young, Lai-Sang

    2013-06-01

    Randomly connected networks of neurons driven by Poisson inputs are often assumed to produce "homogeneous" dynamics, characterized by largely independent firing and approximable by diffusion processes. At the same time, it is well known that such networks can fire synchronously. Between these two much-studied scenarios lies a vastly complex dynamical landscape that is relatively unexplored. In this paper, we discuss a phenomenon that commonly manifests in these intermediate regimes, namely brief spurts of spiking activity that we call multiple firing events (MFEs). These events depend on neither structured network architecture nor structured input; they are an emergent property of the system. We came upon them in an earlier modeling paper, in which we discovered, through a careful benchmarking process, that MFEs are the single most important dynamical mechanism behind many of the V1 phenomena we were able to replicate. In this paper we explain in a simpler setting how MFEs come about, as well as their potential dynamical consequences. Although the mechanism underlying MFEs cannot easily be captured by current population dynamics models, this phenomenon should not be ignored during analysis; there is a growing body of evidence that such collaborative activity may be key to unlocking the possible functional properties of many neuronal networks. PMID:23096934

  15. Fuel mixture stratification as a method for improving homogeneous charge compression ignition engine operation

    DOEpatents

    Dec, John E.; Sjoberg, Carl-Magnus G.

    2006-10-31

    A method for slowing the heat-release rate in homogeneous charge compression ignition ("HCCI") engines that allows operation without excessive knock at higher engine loads than are possible with conventional HCCI. This method comprises injecting a fuel charge in a manner that creates a stratified fuel charge in the engine cylinder to provide a range of fuel concentrations in the in-cylinder gases (typically with enough oxygen for complete combustion), using a two-stage-ignition fuel having appropriate cool-flame chemistry so that regions of different fuel concentrations autoignite sequentially.

  16. How to become an authentic speaker. Even sincere speeches often come across as contrived. A four-step process will help you create a true emotional connection with your audience.

    PubMed

    Morgan, Nick

    2008-11-01

    Like the best-laid schemes of mice and men, the best-rehearsed speeches go oft astray. No amount of preparation can counter an audience's perception that the speaker is calculating or insincere. Why do so many managers have trouble communicating authenticity to their listeners? Morgan, a communications coach for more than two decades, offers advice for overcoming this difficulty. Recent brain research shows that natural, unstudied gestures--what Morgan calls the "second conversation"--express emotions or impulses a split second before our thought processes have turned them into words. So the timing of practiced gestures will always be subtly off--just enough to be picked up by listeners' unconscious ability to read body language. If you can't practice the unspoken part of your delivery, what can you do? Tap into four basic impulses underlying your speech--to be open to the audience, to connect with it, to be passionate, and to "listen" to how the audience is responding--and then rehearse your presentation with each in mind. You can become more open, for instance, by imagining that you're speaking to your spouse or close friend. To more readily connect, focus on needing to engage your listeners and then to keep their attention, as if you were speaking to a child who isn't heeding your words. To convey your passion, identify the feelings behind your speech and let them come through. To listen, think about what the audience is probably feeling when you step up to the podium and be alert to the nonverbal messages of its members. Internalizing these four impulses as you practice will help you come across as relaxed and authentic--your body language will take care of itself. PMID:19009725

  17. India creates social marketing organization.

    PubMed

    1984-01-01

    India, in a major policy shift toward reversible birth control methods, will form a new organization to promote private sector contraceptive sales. The government, through a recently signed agreement with the Agency for International Development (AID), plans to establish a private nonprofit Contraceptive Marketing Organization (CMO) in fiscal year 1984. This momentous move marks a full-circle return to a 1969 proposal by AID and Ford Foundation consultants. Funded at about $500 million over a 7-year period, the CMO will function as a semi-autonomous entity run by a board of governors representing government and such public and private sectors as health, communications, management, manufacturing, marketing, advertising, and market research. According to the agreement, called the India Family Planning Communications and Marketing Plan, the CMO's activities will cover procurement and distribution of condoms, oral contraceptives (OCs), and other yet-to-be-determined contraceptive methods. Of the $500 million in funds, the government of India has pledged 2/3, AID roughly $50 million in grants and loans, with the balance expected from such sources as the UN Fund for Population Activities. The CMO's goal is a marked increase in contraceptive use by married couples of reproductive age, from the current 6% rate to 20% by 1990. As of 1982, India had 122 million such couples, with 1% purchasing commercial products, 2% buying Nirodh Marketing Program condoms, and 3% relying on free government contraceptives. Besides creating the CMO, the India/AID pact outlines intensified public sector family planning promotions and activities. Some Indian health experts believe the government's decision to expand social marketing's role rests with a significant decade-long decline in the popularity of such permanent birth control measures as vasectomy and tubal ligation. PMID:12313308

  18. Creating the Home Field Advantage

    ERIC Educational Resources Information Center

    Perna, Mark C.

    2007-01-01

    Far too many career and technical education schools overlook the opportunity to establish and cultivate real long-term emotional attachment and loyalty with prospective students and their parents. This occurs because of a basic misconception at the school that they are not in control of their own marketing and recruitment process. Community…

  19. 6 Ways to Create Change

    ERIC Educational Resources Information Center

    Schaffhauser, Dian

    2013-01-01

    With so many disruptive forces at work in higher education, colleges and universities are faced with the imperative to change not just technologies and processes, but behaviors and mindsets. In part one of a two-part series, change-management experts share six ways to foster large-scale transformations on campus. "Campus Technology"…

  20. Creating relationships with persons with moderate to severe dementia

    PubMed Central

    Kjellström, Sofia; Hellström, Ingrid

    2013-01-01

    The study describes how relationships are created with persons with moderate to severe dementia. The material comprises 24 video sequences of Relational Time (RT) sessions, 24 interviews with persons with dementia and eight interviews with professional caregivers. The study method was Constructivist Grounded Theory. The categories of ‘Assigning time’, ‘Establishing security and trust’ and ‘Communicating equality’ were strategies for arriving at the core category, ‘Opening up’, which was the process that led to creating relationships. Both parties had to contribute to create a relationship; the professional caregiver controlled the process, but the person with dementia permitted the caregiver's overtures and opened up, thus making the relationship possible. Interpersonal relationships are significant to enhancing the well-being of persons with dementia. Small measures like RT that do not require major resources can open paths to creating relationships. PMID:24336663

  1. Creating relationships with persons with moderate to severe dementia.

    PubMed

    Ericsson, Iréne; Kjellström, Sofia; Hellström, Ingrid

    2013-01-01

    The study describes how relationships are created with persons with moderate to severe dementia. The material comprises 24 video sequences of Relational Time (RT) sessions, 24 interviews with persons with dementia and eight interviews with professional caregivers. The study method was Constructivist Grounded Theory. The categories of 'Assigning time', 'Establishing security and trust' and 'Communicating equality' were strategies for arriving at the core category, 'Opening up', which was the process that led to creating relationships. Both parties had to contribute to create a relationship; the professional caregiver controlled the process, but the person with dementia permitted the caregiver's overtures and opened up, thus making the relationship possible. Interpersonal relationships are significant to enhancing the well-being of persons with dementia. Small measures like RT that do not require major resources can open paths to creating relationships. PMID:24336663

  2. Method and Apparatus for Creating a Topography at a Surface

    DOEpatents

    Adams, David P.; Sinclair, Michael B.; Mayer, Thomas M.; Vasile, Michael J.; Sweatt, William C.

    2008-11-11

    Methods and apparatus whereby an optical interferometer is utilized to monitor and provide feedback control to an integrated energetic particle column, to create desired topographies, including the depth, shape, and/or roughness of features, at a surface of a specimen. Energetic particle columns can direct energetic species, including ions, photons, and/or neutral particles, to a surface to create features having in-plane dimensions on the order of 1 micron and a height or depth on the order of 1 nanometer. Energetic processes can include subtractive processes, such as sputtering, ablation, and focused ion beam milling, and additive processes, such as energetic-beam-induced chemical vapor deposition. The integration of interferometric methods with processing by energetic species offers the ability to create desired topographies at surfaces, including planar and curved shapes.

  3. Creating Science Picture Books for an Authentic Audience

    ERIC Educational Resources Information Center

    DeFauw, Danielle L.; Saad, Klodia

    2014-01-01

    This article presents an authentic writing opportunity to help ninth-grade students use the writing process in a science classroom to write and illustrate picture books for fourth-grade students to demonstrate and share their understanding of a biology unit on cells. By creating a picture book, students experience the writing process, understand…

  4. Making It CLICK: Planning, Creating, and Using CPCC Libraries' Logo

    ERIC Educational Resources Information Center

    Moore, Gena

    2005-01-01

    The Central Piedmont Community College Libraries have been successful in creating positive expectations from the CPCC community by connecting an official library logo with quality library service. The creation of the CPCC Libraries' logo CLICK was a process that spanned several months. A history of this process details the meetings and design work…

  5. Creating outcomes with redesign efforts.

    PubMed

    Cole, D A

    1999-09-01

    Integrating principles from a variety of theories, managers have developed a conceptual framework for reengineering processes in an endoscopy unit to improve the value of services provided to customers. A major goal of this redesign was to enhance or maintain quality of care, increase efficiency, and maintain or reduce costs. This was accomplished by analyzing data and outcome measures related to patient, physician, and staff member satisfaction, as well as resource allocation. The departmental results were tangible, positive, and visible almost immediately. With the right team and the right techniques, tools, methodologies, and decision-making processes, redesign projects can and do lead to dramatic improvements in productivity, service, customer and staff member satisfaction, cost control, and innovation. PMID:10514888

  6. Mentoring matters: creating, connecting, empowering.

    PubMed

    McKinley, Mary G

    2004-01-01

    In the current chaotic healthcare environment, growth and development of nursing staff is essential to maintain quality outcomes. The purpose of this article is to highlight the concept of mentoring, explain the benefits of mentoring in fostering the development of novice nurses, and present a primer for how advanced practice nurses could implement a mentoring relationship. A three-step mentoring process of reflecting, reframing, and resolving is described with examples of implementation of these steps. PMID:15461037

  7. Creating Effective Dialogue Around Climate Change

    NASA Astrophysics Data System (ADS)

    Kiehl, J. T.

    2015-12-01

    Communicating climate change to people from diverse sectors of society has proven to be difficult in the United States. It is widely recognized that the difficulties arise from a number of sources, including basic science understanding, the psychologically affect-laden content surrounding climate change, and the diversity of value systems that exist in our society. I explore ways of working with the affect that arises around climate change and describe specific methods for working with the resistance often encountered when communicating this important issue. The techniques I describe are rooted in psychology and group process and provide means for creating more effective narratives to break through the barriers to communicating climate change science. Examples are given from personal experiences in presenting climate change to diverse groups.

  8. Creating Constructive Dialogues Around Climate Change

    NASA Astrophysics Data System (ADS)

    Kiehl, J. T.

    2014-12-01

    Presenting scientific facts to the general public often creates strong emotional responses in listeners. This is especially the case around issues like climate change, in which strong resistance can arise in individuals and groups. This is an inherent psychological characteristic when conveying disturbing information to people. In this presentation, I will describe personal experiences of presenting climate change science in various public forums. In particular, I will describe two experiences: one in which I was able to effectively work with the emotional reactions to the scientific information and another in which the resistance was difficult to resolve within the group. Based on these experiences and others, I describe an innovative four-stage process for working with situations in which there is strong resistance to the science of climate change (or other challenging scientific issues). I conclude by discussing how this approach can be employed and potential pitfalls with such an approach.

  9. Towards creating the perfect electronic prescription.

    PubMed

    Dhavle, Ajit A; Rupp, Michael T

    2015-04-01

    Significant strides have been made in electronic (e)-prescribing standards and software applications that have further fueled the adoption and use of e-prescribing. However, for e-prescribing to realize its full potential for improving the safety, effectiveness, and efficiency of prescription drug delivery, important work remains to be carried out. This perspective describes the ultimate goal of all e-prescribing stakeholders including prescribers and dispensing pharmacists: a clear, complete, and unambiguous e-prescription order that can be seamlessly received, processed, and fulfilled at the dispensing pharmacy without the need for additional clarification from the prescriber. We discuss the challenges to creating the perfect e-prescription by focusing on selected data segments and data fields that are available in the new e-prescription transaction as defined in the NCPDP SCRIPT Standard and suggest steps that could be taken to move the industry closer to achieving this vision. PMID:25038197

  10. Creating catastrophes in the classroom

    NASA Astrophysics Data System (ADS)

    Andersson, Thommy

    2013-04-01

    Buildings, infrastructure, and human lives are destroyed by wind and landslides. Practical experiments can interest and motivate pupils and help them understand abstract knowledge. The experiments described here show why strong winds circulate around tropical cyclones and how fluvial geological processes affect nature and communities; they are easy to set up and the equipment is inexpensive. Experiment 1: Exogenic water processes are often slow. This experiment simulates, in less than 40 minutes, water processes that can take thousands of years, and it can be presented to and understood by pupils at all levels. Letting the pupils build up the scenery makes them more curious about the course of events. They will see the geomorphological genesis of landforms such as landslides, sandurs, deltas, canyons, sedimentation, and selective erosion. By placing small houses, bridges, etc., we can lead into discussions about natural catastrophes and community planning. Material needed for the experiment: a water bucket, an erosion gutter, clay (simulating rock), sand and smaller pebbles (simulating the soil), houses of "Monopoly" size, and tubes. Using a table with wheels makes it easy to reuse the result for other lessons, and installing a pump can turn the experiment into a closed-loop system that can be used for presentations outside the classroom. Experiment 2: The Coriolis effect explains why the wind (a moving object) deflects: clockwise in the northern hemisphere and anti-clockwise in the southern hemisphere. This abstract effect is often hard for upper-secondary pupils to understand; this experiment makes the theory real and visible. Material needed for this experiment: a bucket, pipes, and a string. At my school we cooperated with pupils from the Industrial Technology programme who made a copper pipe construction. During the
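To complement Experiment 2 with numbers, the magnitude of the horizontal Coriolis acceleration for a moving air parcel follows from a = 2·Ω·v·sin(latitude); the speed and latitudes below are arbitrary example values.

```python
import math

OMEGA = 7.2921159e-5  # Earth's rotation rate, rad/s

def coriolis_accel(speed_ms, latitude_deg):
    """Magnitude of the horizontal Coriolis acceleration, m/s^2."""
    return 2.0 * OMEGA * speed_ms * math.sin(math.radians(latitude_deg))

a_midlat = coriolis_accel(10.0, 45.0)  # ~1e-3 m/s^2: tiny, so only large-scale,
                                       # long-lived flows deflect visibly
a_equator = coriolis_accel(10.0, 0.0)  # zero at the equator
```

The tiny magnitude is the point worth making in class: the deflection matters only for motions that persist for hours over large distances, such as winds around a tropical cyclone.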

  11. Creating a New System for Principal Preparation: Reflections on Efforts to Transcend Tradition and Create New Cultures

    ERIC Educational Resources Information Center

    Reed, Cynthia J.; Kensler, Lisa A. W.

    2010-01-01

    When selected as a pilot redesign site, we decided both to refocus the underlying assumptions guiding our program and to engage in processes that allowed us to model best practices while creating a new program. This article summarizes key aspects of our redesign work and offers reflections on the processes used and challenges faced. Murphy's (2006)…

  12. [Creating an integrated nursing curriculum].

    PubMed

    Romano, R A; Papa, L M; Lopes, G T

    1997-01-01

    During the last two decades, Brazilian society has undergone great changes in the political, ideological, and economic fields. These changes have left their mark on society, especially on the population's health. Nurse education based on Law No. 5540/68 and Statement No. 163/72 no longer meets the population's demands. Since 1992, the Nursing Faculty of UERJ (FEUerj) has intensified its movement of reflection upon the teaching-learning process, seeking to transform its own reality. Two complementary and important reasons motivate this project: the desire of FEUerj faculty and students to elaborate a curriculum aimed at forming nurses who articulate teaching, work, and community, and theory and practice, based on a critical theory of education along the line of PROBLEMATIZATION; and compliance with Statement No. 314/94 of the CFE and Letter of Order MEC No. 1171 of 15 Dec 1994. Through debate, the professional profile was defined from the social environment in which the profession is practiced and from the students' characteristics; areas or groups of attributions were determined according to professional praxis, concepts were hierarchized, processes were defined, etc., which, through 'classification and synthesis' of knowledge, results in a netlike, chained, and related tree. The first phase of the curriculum study diagnosed, as the principal problem, the 'DECONTEXTUALIZATION' of the current curriculum and the obstacles to be faced in carrying the Curriculum Reformulation Proposal through to its end. A process of pedagogical qualification for professors, workshops, research on the desired and present profile, seminars, systematization of performance, abilities, and principles, identification of the areas composing the integrated curriculum, localization of subjects within areas, and articulation between professional subjects and other activities have been implemented. Based on this work on the first step of problematization pedagogy, an instrument 'Research on the

  13. Non-homogeneous models of sequence evolution in the Bio++ suite of libraries and programs

    PubMed Central

    2008-01-01

    Background Accurately modeling the sequence substitution process is required for the correct estimation of evolutionary parameters, be they phylogenetic relationships, substitution rates or ancestral states; it is also crucial to simulate realistic data sets. Such simulation procedures are needed to estimate the null-distribution of complex statistics, an approach referred to as parametric bootstrapping, and are also used to test the quality of phylogenetic reconstruction programs. It has often been observed that homologous sequences can vary widely in their nucleotide or amino-acid compositions, revealing that sequence evolution has changed substantially among lineages, and may therefore be most appropriately approached through non-homogeneous models. Several programs implementing such models have been developed, but they are limited in their possibilities: only a few particular models are available for likelihood optimization, and data sets cannot be easily generated using the resulting estimated parameters. Results We hereby present a general implementation of non-homogeneous models of substitutions. It is available as dedicated classes in the Bio++ libraries and can hence be used in any C++ program. Two programs that use these classes are also presented. The first one, Bio++ Maximum Likelihood (BppML), estimates parameters of any non-homogeneous model and the second one, Bio++ Sequence Generator (BppSeqGen), simulates the evolution of sequences from these models. These programs allow the user to describe non-homogeneous models through a property file with a simple yet powerful syntax, without any programming required. Conclusion We show that the general implementation introduced here can accommodate virtually any type of non-homogeneous models of sequence evolution, including heterotachous ones, while being computationally efficient. We furthermore illustrate the use of such general models for parametric bootstrapping, using tests of non-homogeneity applied to an
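To see why composition heterogeneity calls for non-homogeneous models, consider a deliberately simplified simulation (this toy script is not Bio++'s rate-matrix machinery; the substitution scheme and all parameters are invented): two lineages start from one root sequence but evolve under different equilibrium GC contents, so their nucleotide compositions diverge.

```python
import random

NUC = "ACGT"

def evolve(seq, n_subs, gc_bias, rng):
    """Apply n_subs substitutions at random sites; the replacement base is
    G or C with probability gc_bias, otherwise A or T."""
    seq = list(seq)
    for _ in range(n_subs):
        i = rng.randrange(len(seq))
        seq[i] = rng.choice("GC") if rng.random() < gc_bias else rng.choice("AT")
    return "".join(seq)

def gc_content(seq):
    return (seq.count("G") + seq.count("C")) / len(seq)

rng = random.Random(42)
root = "".join(rng.choice(NUC) for _ in range(2000))

# Non-homogeneous regime: each lineage has its own equilibrium composition,
# so homologous sequences drift toward different GC contents
lineage_gc_rich = evolve(root, 3000, 0.8, rng)
lineage_at_rich = evolve(root, 3000, 0.3, rng)
```

A homogeneous model would force a single composition on both lineages and misfit data like this; the non-homogeneous models implemented in Bio++ let such parameters vary across the tree.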

  14. Influence of interspecific competition and landscape structure on spatial homogenization of avian assemblages.

    PubMed

    Robertson, Oliver J; McAlpine, Clive; House, Alan; Maron, Martine

    2013-01-01

    Human-induced biotic homogenization resulting from landscape change and increased competition from widespread generalists or 'winners', is widely recognized as a global threat to biodiversity. However, it remains unclear what aspects of landscape structure influence homogenization. This paper tests the importance of interspecific competition and landscape structure, for the spatial homogeneity of avian assemblages within a fragmented agricultural landscape of eastern Australia. We used field observations of the density of 128 diurnal bird species to calculate taxonomic and functional similarity among assemblages. We then examined whether taxonomic and functional similarity varied with patch type, the extent of woodland habitat, land-use intensity, habitat subdivision, and the presence of Manorina colonies (a competitive genus of honeyeaters). We found the presence of a Manorina colony was the most significant factor positively influencing both taxonomic and functional similarity of bird assemblages. Competition from members of this widespread genus of native honeyeater, rather than landscape structure, was the main cause of both taxonomic and functional homogenization. These species have not recently expanded their range, but rather have increased in density in response to agricultural landscape change. The negative impacts of Manorina honeyeaters on assemblage similarity were most pronounced in landscapes of moderate land-use intensity. We conclude that in these human-modified landscapes, increased competition from dominant native species, or 'winners', can result in homogeneous avian assemblages and the loss of specialist species. These interacting processes make biotic homogenization resulting from land-use change a global threat to biodiversity in modified agro-ecosystems. PMID:23724136

  15. Influence of Interspecific Competition and Landscape Structure on Spatial Homogenization of Avian Assemblages

    PubMed Central

    Robertson, Oliver J.; McAlpine, Clive; House, Alan; Maron, Martine

    2013-01-01

    Human-induced biotic homogenization resulting from landscape change and increased competition from widespread generalists or ‘winners’, is widely recognized as a global threat to biodiversity. However, it remains unclear what aspects of landscape structure influence homogenization. This paper tests the importance of interspecific competition and landscape structure, for the spatial homogeneity of avian assemblages within a fragmented agricultural landscape of eastern Australia. We used field observations of the density of 128 diurnal bird species to calculate taxonomic and functional similarity among assemblages. We then examined whether taxonomic and functional similarity varied with patch type, the extent of woodland habitat, land-use intensity, habitat subdivision, and the presence of Manorina colonies (a competitive genus of honeyeaters). We found the presence of a Manorina colony was the most significant factor positively influencing both taxonomic and functional similarity of bird assemblages. Competition from members of this widespread genus of native honeyeater, rather than landscape structure, was the main cause of both taxonomic and functional homogenization. These species have not recently expanded their range, but rather have increased in density in response to agricultural landscape change. The negative impacts of Manorina honeyeaters on assemblage similarity were most pronounced in landscapes of moderate land-use intensity. We conclude that in these human-modified landscapes, increased competition from dominant native species, or ‘winners’, can result in homogeneous avian assemblages and the loss of specialist species. These interacting processes make biotic homogenization resulting from land-use change a global threat to biodiversity in modified agro-ecosystems. PMID:23724136

  16. Toward a mechanistic understanding and prediction of biotic homogenization.

    PubMed

    Olden, Julian D; Poff, N LeRoy

    2003-10-01

    The widespread replacement of native species with cosmopolitan, nonnative species is homogenizing the global fauna and flora. While the empirical study of biotic homogenization is substantial and growing, theoretical aspects have yet to be explored. Consequently, the breadth of possible ecological mechanisms that can shape current and future patterns and rates of homogenization remains largely unknown. Here, we develop a conceptual model that describes 14 potential scenarios by which species invasions and/or extinctions can lead to various trajectories of biotic homogenization (increased community similarity) or differentiation (decreased community similarity); we then use a simulation approach to explore the model's predictions. We found changes in community similarity to vary with the type and number of nonnative and native species, the historical degree of similarity among the communities, and, to a lesser degree, the richness of the recipient communities. Homogenization is greatest when similar species invade communities, causing either no extinction or differential extinction of native species. The model predictions are consistent with current empirical data for fish, bird, and plant communities and therefore may represent the dominant mechanisms of contemporary homogenization. We present a unifying model illustrating how the balance between invading and extinct species dictates the outcome of biotic homogenization. We conclude by discussing a number of critical but largely unrecognized issues that bear on the empirical study of biotic homogenization, including the importance of spatial scale, temporal scale, and data resolution. We argue that the study of biotic homogenization needs to be placed in a more mechanistic and predictive framework in order for studies to provide adequate guidance in conservation efforts to maintain regional distinctness of the global biota. PMID:14582007
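The bookkeeping behind these similarity trajectories can be made concrete with pairwise Jaccard similarity (the species lists and the invasion/extinction scenario below are invented for illustration): a shared invader plus differential loss of endemics raises between-site similarity, i.e., homogenization.

```python
def jaccard(a, b):
    """Pairwise community similarity: shared species / total species."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

site1 = {"sp1", "sp2", "sp3", "sp4"}
site2 = {"sp3", "sp4", "sp5", "sp6"}
before = jaccard(site1, site2)  # 2 shared / 6 total = 1/3

# Scenario: the same cosmopolitan invader establishes at both sites while
# each site loses one endemic -- invasion plus differential extinction
site1_after = (site1 - {"sp1"}) | {"invader"}
site2_after = (site2 - {"sp6"}) | {"invader"}
after = jaccard(site1_after, site2_after)  # 3 shared / 5 total = 0.6

homogenized = after > before  # similarity increased: biotic homogenization
```

Other scenarios from the conceptual model (e.g., distinct invaders at each site) can be expressed the same way and drive the similarity down instead, i.e., differentiation.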

  17. Creating and Testing Simulation Software

    NASA Technical Reports Server (NTRS)

    Heinich, Christina M.

    2013-01-01

    The goal of this project is to learn about the software development process, specifically the process of testing and fixing components of the software. The paper will cover techniques for testing code and the benefits of using one style of testing over another. It will also discuss the overall software design and development lifecycle, and how code testing plays an integral role in it. Code almost always requires debugging, whether because of coding errors or faulty program design. Writing tests, either before or during program creation, that cover all aspects of the code provides a relatively easy way to locate and fix errors, which in turn decreases the need to fix a program after it is released for common use. The backdrop for this paper is the Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI), a project whose goal is to simulate a launch using simulated models of the ground systems and the connections between them and the control room. The simulations will be used for training and to ensure that all possible outcomes and complications are prepared for before the actual launch day. The code being tested is the Programmable Logic Controller Interface (PLCIF) code, the component responsible for transferring information from the models to the model Programmable Logic Controllers (PLCs), basic computers that are used for very simple tasks.
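    The test-first pattern the abstract describes looks roughly like the following. The SCCS/PLCIF code itself is not public, so the `transfer` function, the tag names, and the register map below are entirely hypothetical; only the testing pattern is the point:

    ```python
    import unittest

    def transfer(model_state, plc_registers):
        """Copy model values into a PLC register map (hypothetical PLCIF step)."""
        for tag, value in model_state.items():
            if tag not in plc_registers:
                raise KeyError(f"unknown PLC tag: {tag}")
            plc_registers[tag] = value
        return plc_registers

    class TransferTest(unittest.TestCase):
        def test_values_reach_plc(self):
            # A model value should land in the matching PLC register.
            regs = transfer({"valve_open": 1}, {"valve_open": 0})
            self.assertEqual(regs["valve_open"], 1)

        def test_unknown_tag_rejected(self):
            # Tags the PLC does not know about must fail loudly, not silently.
            with self.assertRaises(KeyError):
                transfer({"bogus": 7}, {"valve_open": 0})

    if __name__ == "__main__":
        unittest.main(exit=False, argv=["plcif_test"])
    ```

    Writing the failure case (`test_unknown_tag_rejected`) before the code exists is what forces the error-handling path to be designed in rather than patched in after release.
    
    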

  18. Process

    SciTech Connect

    Geenen, P.V.; Bennis, J.

    1989-04-04

    A process is described for minimizing the cracking tendency and uncontrolled dimensional change, and improving the strength of a rammed plastic refractory reactor liner comprising phosphate-bonded silicon carbide or phosphate-bonded alumina. It consists of heating the reactor liner placed or mounted in a reactor, prior to its first use, from ambient temperature up to a temperature of from about 490 °C to about 510 °C, the heating being carried out by heating the liner at a rate to produce a temperature increase of the liner not greater than about 6 °C per hour.
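    The rate limit implies a long cure. A back-of-the-envelope check, assuming a 20 °C ambient start (the patent does not fix the starting temperature) and the midpoint of the stated target range:

    ```python
    ambient_c = 20.0        # assumed ambient starting temperature
    target_c = 500.0        # midpoint of the 490-510 degC target range
    max_rate_c_per_h = 6.0  # maximum allowed ramp rate from the claim

    # Minimum heat-up time at the maximum permitted rate.
    hours = (target_c - ambient_c) / max_rate_c_per_h
    print(f"minimum heat-up time: {hours:.0f} h (~{hours / 24:.1f} days)")
    ```

    At the 6 °C/h ceiling the initial firing takes at least 80 hours, i.e. a bit over three days of continuous, controlled heating.
    
    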

  19. Gum tragacanth dispersions: Particle size and rheological properties affected by high-shear homogenization.

    PubMed

    Farzi, Mina; Yarmand, Mohammad Saeed; Safari, Mohammad; Emam-Djomeh, Zahra; Mohammadifar, Mohammad Amin

    2015-08-01

    The effect of high-shear homogenization on the rheological and particle size characteristics of three species of gum tragacanth (GT) was investigated. Dispersions were subjected to 0-20 min of treatment. Static light scattering techniques and rheological tests were used to study the effect of treatment. The results showed that the process caused a decrease in particle size parameters for all three species, but interestingly, the apparent viscosities increased. The highest increase in apparent viscosity was found for solutions containing Astragalus gossypinus, which possessed the highest insoluble fraction. The viscoelastic behaviors of the dispersions were also significantly influenced by the process. Homogenization caused an increase in both G' and G″ in all three species. The alterations appear to be highly dependent on GT species and structure. These results may be of considerable industrial importance, since the process leads to textural modification of food products containing GT. PMID:25987462

  20. Monte Carlo simulation of near-infrared light propagation through homogeneous mixed media.

    PubMed

    Maughan, Nichole M; Moody, Joseph W; Miller, David R

    2013-10-01

    Noninvasive blood analysis devices that can measure levels of small constituents of blood are of interest in the medical community. An important step in creating these devices is to understand the interaction of photons with human tissue in increasingly greater physiological detail. Models based on layered biological materials give excellent results for many applications but may not be as accurate as needed when those materials are finely intertwined to the point of resembling a homogeneous mixture. To explore the ramifications of treating materials as layers versus a mixture, we have modeled, using a Monte Carlo technique, the interaction of photons through epidermis, blood, and water arranged both in layers and in a homogeneous blend. We confirm the expected linear relation between photon attenuation and material volumetric percentage in two-layer models. However, when the materials are homogeneously mixed together and volumetric percentage is replaced with interaction volume percentage, this relationship becomes nonlinear. These nonlinearities become significant when the values of the interaction coefficient, μt, differ by an order of magnitude or more. PMID:24145661
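    The Monte Carlo ingredient underlying such models can be sketched in a few lines. This is not the authors' code, only a minimal single-medium illustration: free path lengths to the next interaction are drawn from an exponential distribution governed by the interaction coefficient μt, and a photon that interacts inside the slab is counted as attenuated. The simulated transmission should converge to the Beer-Lambert prediction exp(-μt·d):

    ```python
    import math
    import random

    def transmission(mu_t, thickness, n_photons=100_000, seed=1):
        """Fraction of photons crossing a homogeneous slab without interacting."""
        rng = random.Random(seed)
        passed = 0
        for _ in range(n_photons):
            # Distance to first interaction: exponential with mean 1 / mu_t.
            # (1 - U) lies in (0, 1], so the log is always defined.
            step = -math.log(1.0 - rng.random()) / mu_t
            if step > thickness:
                passed += 1
        return passed / n_photons

    mu_t, d = 2.0, 1.0                 # arbitrary units
    mc = transmission(mu_t, d)
    analytic = math.exp(-mu_t * d)     # Beer-Lambert, ~0.135
    print(f"MC: {mc:.3f}  Beer-Lambert: {analytic:.3f}")
    ```

    Extending this to layered versus mixed media amounts to changing which μt applies at each sampled step, which is where the layered and homogeneous-blend geometries begin to diverge.
    
    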