Science.gov

Sample records for process creates homogenous

  1. Children Creating Ways To Represent Changing Situations: On the Development of Homogeneous Spaces.

    ERIC Educational Resources Information Center

    Nemirovsky, Ricardo; Tierney, Cornelia

    2001-01-01

    Focuses on children creating representations on paper for situations that change over time. Articulates the distinction between homogeneous and heterogeneous spaces and reflects on children's tendency to create hybrids between them. (Author/MM)

  2. Creating a Flexible Budget Process

    ERIC Educational Resources Information Center

    Frew, James; Olson, Robert; Pelton, M. Lee

    2009-01-01

    The budget process is often an especially thorny area in communication between administrators and faculty members. Last year, Willamette University took a step toward reducing tensions surrounding the budget. As university administrators planned for the current year, they faced the high degree of uncertainty that the financial crisis has forced on…

  3. Pattern and process of biotic homogenization in the New Pangaea.

    PubMed

    Baiser, Benjamin; Olden, Julian D; Record, Sydne; Lockwood, Julie L; McKinney, Michael L

    2012-12-07

Human activities have reorganized the earth's biota, resulting in spatially disparate locales becoming more or less similar in species composition over time through the processes of biotic homogenization and biotic differentiation, respectively. Despite mounting evidence suggesting that these processes may be widespread in both aquatic and terrestrial systems, past studies have predominantly focused on single taxonomic groups at a single spatial scale. Furthermore, change in pairwise similarity is itself dependent on two distinct processes: spatial turnover in species composition and changes in gradients of species richness. Most past research has failed to disentangle the effect of these two mechanisms on homogenization patterns. Here, we use recent statistical advances and collate a global database of homogenization studies (20 studies, 50 datasets) to provide the first global investigation of the homogenization process across major faunal and floral groups and to elucidate the relative roles of changes in species richness and turnover. We found evidence of homogenization (change in similarity ranging from -0.02 to 0.09) across nearly all taxonomic groups, spatial extents, and grain sizes. Partitioning of change in pairwise similarity shows that overall change in community similarity is driven by changes in species richness. Our results show that biotic homogenization is truly a global phenomenon and call into question many of the ecological mechanisms invoked in previous studies to explain patterns of homogenization.
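    The partitioning of pairwise similarity into turnover and richness-difference components that the abstract describes can be sketched with a Baselga-style decomposition of Sørensen dissimilarity. This is one common decomposition, not necessarily the exact statistical method of the study, and the species lists below are invented for illustration:

```python
# Hypothetical illustration: change in pairwise similarity between two
# sites, with Sorensen dissimilarity partitioned into a spatial-turnover
# (Simpson) component and a richness-difference (nestedness) component.

def sorensen_partition(site1, site2):
    a = len(site1 & site2)          # shared species
    b = len(site1 - site2)          # unique to site 1
    c = len(site2 - site1)          # unique to site 2
    beta_sor = (b + c) / (2 * a + b + c)        # total dissimilarity
    beta_sim = min(b, c) / (a + min(b, c))      # turnover component
    beta_sne = beta_sor - beta_sim              # nestedness component
    return beta_sor, beta_sim, beta_sne

# Invented historical vs. current species lists for two sites.
historical = ({"trout", "dace", "sculpin"}, {"trout", "minnow", "shiner"})
current    = ({"trout", "carp", "dace"},    {"trout", "carp", "minnow"})

before = sorensen_partition(*historical)
after  = sorensen_partition(*current)
# A positive change in similarity (= drop in dissimilarity) over time
# indicates biotic homogenization of the two sites.
print("change in similarity:", round(before[0] - after[0], 3))
```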

  4. The Largest Fragment of a Homogeneous Fragmentation Process

    NASA Astrophysics Data System (ADS)

    Kyprianou, Andreas; Lane, Francis; Mörters, Peter

    2017-03-01

We show that in homogeneous fragmentation processes the largest fragment at time t has size e^{-tΦ'(p̄)} t^{-(3/2)(log Φ)'(p̄)+o(1)}, where Φ is the Lévy exponent of the fragmentation process and p̄ is the unique solution of the equation (log Φ)'(p̄) = 1/(1+p̄). We argue that this result is in line with predictions arising from the classification of homogeneous fragmentation processes as logarithmically correlated random fields.

  5. Process to create simulated lunar agglutinate particles

    NASA Technical Reports Server (NTRS)

    Gustafson, Robert J. (Inventor); Gustafson, Marty A. (Inventor); White, Brant C. (Inventor)

    2011-01-01

    A method of creating simulated agglutinate particles by applying a heat source sufficient to partially melt a raw material is provided. The raw material is preferably any lunar soil simulant, crushed mineral, mixture of crushed minerals, or similar material, and the heat source creates localized heating of the raw material.

  6. Experimenting With Ore: Creating the Taconite Process; flow chart of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Experimenting With Ore: Creating the Taconite Process; flow chart of process - Mines Experiment Station, University of Minnesota, Twin Cities Campus, 56 East River Road, Minneapolis, Hennepin County, MN

  7. Competing Contact Processes on Homogeneous Networks with Tunable Clusterization

    NASA Astrophysics Data System (ADS)

    Rybak, Marcin; Kułakowski, Krzysztof

    2013-03-01

We investigate two homogeneous networks: the Watts-Strogatz network with mean degree ⟨k⟩ = 4 and the Erdős-Rényi network with ⟨k⟩ = 10. In both kinds of network, the clustering coefficient C is a tunable control parameter. The network is an arena for two competing contact processes, where nodes can be in two states, S or D. A node S becomes D with probability 1 if at least two of its mutually linked neighbors are D. A node D becomes S with a given probability p if at least one of its neighbors is S. The competition between the processes is described by a phase diagram, where the critical probability pc depends on the clustering coefficient C. For p > pc the fraction of nodes in state S increases in time, seemingly coming to dominate the whole system. Below pc, the majority of nodes are in the D-state. The numerical results indicate that for the Watts-Strogatz network the D-process is activated at a finite value of the clustering coefficient C, close to 0.3. By contrast, for the Erdős-Rényi network the transition is observed over the whole investigated range of C.
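    The competing S/D dynamics can be sketched as a small stdlib-only simulation. The network construction, the synchronous update scheme, the parameter values, and the reading of "two mutually linked neighbors" as a pair of D-neighbors linked to each other are all simplifying assumptions, not the authors' exact procedure:

```python
import random

def watts_strogatz(n, k=4, beta=0.2, rng=random):
    """Ring lattice with k neighbors per node; each edge rewired w.p. beta."""
    edges = set()
    for i in range(n):
        for j in range(1, k // 2 + 1):
            edges.add((i, (i + j) % n))
    adj = {i: set() for i in range(n)}
    for u, v in edges:
        if rng.random() < beta:             # rewire one endpoint
            choices = [w for w in range(n) if w != u and w not in adj[u]]
            v = rng.choice(choices)
        adj[u].add(v); adj[v].add(u)
    return adj

def step(adj, state, p, rng=random):
    """One synchronous update of the two competing contact processes."""
    new = dict(state)
    for i, nbrs in adj.items():
        d_nbrs = [j for j in nbrs if state[j] == "D"]
        if state[i] == "S":
            # S -> D if at least two D-neighbors are linked to each other
            if any(b in adj[a] for a in d_nbrs for b in d_nbrs if a != b):
                new[i] = "D"
        elif any(state[j] == "S" for j in nbrs) and rng.random() < p:
            new[i] = "S"                    # D -> S with probability p
    return new

rng = random.Random(1)
adj = watts_strogatz(200, k=4, beta=0.2, rng=rng)
state = {i: rng.choice("SD") for i in adj}
for _ in range(50):
    state = step(adj, state, p=0.9, rng=rng)
frac_S = sum(v == "S" for v in state.values()) / len(state)
print("fraction in state S after 50 steps:", frac_S)
```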

  8. Can An Evolutionary Process Create English Text?

    SciTech Connect

    Bailey, David H.

    2008-10-29

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
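    The fixed-target demonstration attributed to Dawkins, which the abstract contrasts with its own open-ended scheme, can be sketched as the classic "weasel" program. Parameter values here are invented; Bailey's scheme is considerably more sophisticated than this baseline:

```python
import random
import string

ALPHABET = string.ascii_uppercase + " "
TARGET = "METHINKS IT IS LIKE A WEASEL"

def score(candidate):
    """Number of characters matching the pre-specified target ('fitness')."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def evolve(pop_size=100, mut_rate=0.05, rng=random):
    """Mutation plus selection toward the fixed target phrase."""
    parent = [rng.choice(ALPHABET) for _ in TARGET]
    generation = 0
    while score(parent) < len(TARGET):
        children = [
            [rng.choice(ALPHABET) if rng.random() < mut_rate else ch
             for ch in parent]
            for _ in range(pop_size)
        ]
        # keep the fittest of children and parent (monotone improvement)
        parent = max(children + [parent], key=score)
        generation += 1
    return generation

print("generations to reach target:", evolve(rng=random.Random(42)))
```

    The flaw the abstract points out is visible in the code: `score` is defined against a fixed future target, whereas real evolution has only a fitness landscape with no pre-specified endpoint.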

  9. Improved microbiological diagnostic due to utilization of a high-throughput homogenizer for routine tissue processing.

    PubMed

    Redanz, Sylvio; Podbielski, Andreas; Warnke, Philipp

    2015-07-01

Tissue specimens are valuable materials for microbiological diagnostics and require swift and accurate processing. Established processing methods are complex, labor intensive, hardly if at all standardizable, and prone to introducing contaminants. To improve analysis of tissue samples in routine microbiological diagnostics by simplifying, accelerating, and standardizing processing, as well as by increasing the microbial yield, the performance of the Precellys 24 high-throughput tissue homogenizer was evaluated. For this purpose, tissue samples were artificially inoculated with Staphylococcus aureus, Escherichia coli, and Candida albicans in 3 different ways, on the surface and within the material. Microbial yield from homogenized samples was compared with the direct plating method. Further, as proof of principle, routine tissue samples from knee and hip endoprosthesis infections were analyzed. Tissue homogenization with the Precellys 24 homogenizer is easy and fast to perform and allows a high degree of standardization. Microbial yield after homogenization was significantly higher than with the conventional plating technique.

  10. [Chemiluminescence spectroscopic analysis of homogeneous charge compression ignition combustion processes].

    PubMed

    Liu, Hai-feng; Yao, Ming-fa; Jin, Chao; Zhang, Peng; Li, Zhe-ming; Zheng, Zun-qing

    2010-10-01

To study the combustion reaction kinetics of homogeneous charge compression ignition (HCCI) under different port injection strategies and intake temperature conditions, tests were carried out on a modified single-cylinder optical engine using chemiluminescence spectroscopic analysis. The experimental conditions were: constant fuel mass; n-heptane fuel; engine speed of 600 r x min(-1); inlet pressure of 0.1 MPa; and inlet temperatures of 95 degrees C and 125 degrees C, respectively. The chemiluminescence spectra show that the chemiluminescence is quite faint during low temperature heat release (LTHR), and these band spectra originate from formaldehyde (CH2O) chemiluminescence. During the phase spanning later LTHR, the negative temperature coefficient (NTC) region, and early high temperature heat release (HTHR), these band spectra likewise originate from formaldehyde (CH2O) chemiluminescence. The CO-O* continuum is strong during HTHR, and radicals such as OH, HCO, CH and CH2O appear superimposed on this continuum. After the HTHR, the chemiluminescence intensity is quite faint. Compared with a start of injection (SOI) of -30 degrees ATDC, the chemiluminescence intensity is higher under the SOI = -300 degrees ATDC condition owing to more intense CO-O* continuum emission; more HCO and OH radicals are also formed, indicating a more intense combustion reaction. Similarly, a more intense CO-O* continuum and more HCO and OH radicals are observed at the higher intake temperature.

  11. A Tool for Creating Healthier Workplaces: The Conducivity Process

    ERIC Educational Resources Information Center

    Karasek, Robert A.

    2004-01-01

    The conducivity process, a methodology for creating healthier workplaces by promoting conducive production, is illustrated through the use of the "conducivity game" developed in the NordNet Project in Sweden, which was an action research project to test a job redesign methodology. The project combined the "conducivity" hypotheses about a…

  12. Effect of homogenization process on the hardness of Zn-Al-Cu alloys

    NASA Astrophysics Data System (ADS)

    Villegas-Cardenas, Jose D.; Saucedo-Muñoz, Maribel L.; Lopez-Hirata, Victor M.; De Ita-De la Torre, Antonio; Avila-Davila, Erika O.; Gonzalez-Velazquez, Jorge Luis

    2015-10-01

The effect of a homogenizing treatment on the hardness of as-cast Zn-Al-Cu alloys was investigated. Eight alloy compositions were prepared and homogenized at 350 °C for 180 h, and their Rockwell "B" hardness was subsequently measured. All the specimens were analyzed by X-ray diffraction and metallographically prepared for observation by optical microscopy and scanning electron microscopy. The results indicated that hardness in both conditions (as-cast and homogenized) increased with increasing Al and Cu contents; this increased hardness is likely related to the presence of the θ and τ' phases. A regression equation was obtained to determine the hardness of the homogenized alloys as a function of their chemical composition and of processing parameters, such as homogenization time and temperature, used in their preparation.
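    A regression of hardness on composition, in the spirit of the equation the abstract mentions, can be sketched with ordinary least squares via the normal equations. The data and resulting coefficients below are invented for illustration and are not the paper's:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for small linear systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_hardness(samples):
    """OLS fit of hardness ~ b0 + b1*Al + b2*Cu via the normal equations."""
    X = [[1.0, al, cu] for al, cu, _ in samples]
    y = [h for _, _, h in samples]
    XtX = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(3)]
           for i in range(3)]
    Xty = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(3)]
    return solve(XtX, Xty)

# Invented (Al wt%, Cu wt%, HRB) data mimicking "hardness rises with Al
# and Cu"; the coefficients printed are illustrative only.
data = [(5, 1, 60), (10, 1, 68), (15, 2, 78), (20, 2, 85),
        (25, 3, 95), (30, 3, 102), (35, 4, 112), (40, 4, 118)]
b0, b1, b2 = fit_hardness(data)
print(f"HRB ~ {b0:.1f} + {b1:.2f}*Al + {b2:.2f}*Cu")
```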

  13. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry for monitoring the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess the homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for the development and production of pharmaceutical suspensions in the future.

  14. Homogeneous and Heterogeneous Catalytic Processes Promoted by Organoactinides

    NASA Astrophysics Data System (ADS)

    Burns, Carol J.; Eisen, Moris S.

    During the last two decades, the chemistry of organoactinides has flourished, reaching a high level of sophistication. The use of organoactinide complexes as stoichiometric or catalytic compounds to promote synthetically important organic transformations has matured due to their rich, complex, and uniquely informative organometallic chemistry. Compared to early or late transition metal complexes, the actinides sometimes exhibit parallel and sometimes totally different reactivities for similar processes. In many instances the regiospecific and chemical selectivities displayed by organoactinide complexes are complementary to that observed for other transition metal complexes. Several recent review articles (Edelman et al., 1995; Edelmann and Gun'ko, 1997; Ephritikhine, 1997; Hitchcock et al., 1997; Berthet and Ephritikhine, 1998; Blake et al., 1998; Edelmann and Lorenz, 2000), dealing mostly with the synthesis of new actinide complexes, confirm the broad and rapidly expanding scope of this field.

  15. Parallel-Processing Software for Creating Mosaic Images

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Deen, Robert; McCauley, Michael; DeJong, Eric

    2008-01-01

    A computer program implements parallel processing for nearly real-time creation of panoramic mosaics of images of terrain acquired by video cameras on an exploratory robotic vehicle (e.g., a Mars rover). Because the original images are typically acquired at various camera positions and orientations, it is necessary to warp the images into the reference frame of the mosaic before stitching them together to create the mosaic. [Also see "Parallel-Processing Software for Correlating Stereo Images," Software Supplement to NASA Tech Briefs, Vol. 31, No. 9 (September 2007) page 26.] The warping algorithm in this computer program reflects the considerations that (1) for every pixel in the desired final mosaic, a good corresponding point must be found in one or more of the original images and (2) for this purpose, one needs a good mathematical model of the cameras and a good correlation of individual pixels with respect to their positions in three dimensions. The desired mosaic is divided into slices, each of which is assigned to one of a number of central processing units (CPUs) operating simultaneously. The results from the CPUs are gathered and placed into the final mosaic. The time taken to create the mosaic depends upon the number of CPUs, the speed of each CPU, and whether a local or a remote data-staging mechanism is used.
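    The slice-per-worker decomposition described above can be sketched as follows. This is not the NASA code: fake_warp is a stand-in for the real camera-model-based warping, and a thread pool stands in for the multiple CPUs of the original implementation:

```python
from concurrent.futures import ThreadPoolExecutor

def fake_warp(slice_rows):
    # Stand-in for per-pixel warping: here we just brighten each pixel,
    # capped at the 8-bit maximum of 255.
    return [[min(255, px + 10) for px in row] for row in slice_rows]

def build_mosaic(image, n_workers=4):
    """Split the mosaic into horizontal slices, warp each in parallel,
    then gather the results back in order."""
    rows_per_slice = max(1, len(image) // n_workers)
    slices = [image[i:i + rows_per_slice]
              for i in range(0, len(image), rows_per_slice)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        warped = list(pool.map(fake_warp, slices))  # one slice per worker
    return [row for chunk in warped for row in chunk]  # reassemble in order

# Tiny synthetic 8x8 "image" of pixel intensities.
image = [[(r * 16 + c) % 256 for c in range(8)] for r in range(8)]
mosaic = build_mosaic(image)
print("mosaic size:", len(mosaic), "x", len(mosaic[0]))
```

    As in the abstract's description, wall-clock time scales with the number of workers and the per-slice cost; `pool.map` preserves slice order, which is what makes the final gather step a simple concatenation.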

  16. Markov processes and partial differential equations on a group: the space-homogeneous case

    NASA Astrophysics Data System (ADS)

    Bendikov, A. D.

    1987-10-01

CONTENTS: Introduction. Terminology and notation. Chapter I. Potential theory of conjugate processes: § 1.1. Markov processes and harmonic spaces; § 1.2. Processes of class A and Brelot spaces; § 1.3. Processes of class B and Bauer spaces. Chapter II. Space-homogeneous processes on a group: § 2.1. Space-homogeneous processes and harmonic structures; § 2.2. Quasidiagonal processes; § 2.3. An example of a non-quasidiagonal process. Chapter III. Elliptic equations on a group: § 3.1. Admissible distributions and multipliers; § 3.2. Weak solutions of elliptic equations (L_p-theory); § 3.3. Weyl's lemma and the hypoelliptic property. References.

  17. Process spectroscopy in microemulsions—Raman spectroscopy for online monitoring of a homogeneous hydroformylation process

    NASA Astrophysics Data System (ADS)

    Paul, Andrea; Meyer, Klas; Ruiken, Jan-Paul; Illner, Markus; Müller, David-Nicolas; Esche, Erik; Wozny, Günther; Westad, Frank; Maiwald, Michael

    2017-03-01

A major industrial reaction based on homogeneous catalysis is hydroformylation, the production of aldehydes from alkenes and syngas. Hydroformylation in microemulsions, currently under investigation at Technische Universität Berlin on a mini-plant scale, was identified as a cost-efficient approach that also enhances product selectivity. Herein, we present the application of online Raman spectroscopy to the reaction of 1-dodecene to 1-tridecanal within a microemulsion. To achieve a good representation of the operating range of the mini-plant with regard to reactant concentrations, a design of experiments was used. Based on initial Raman spectra, partial least squares regression (PLSR) models were calibrated for the prediction of 1-dodecene and 1-tridecanal. Limits of prediction arise from nonlinear correlations between Raman intensity and the mass fractions of compounds in the microemulsion system. Furthermore, the predictive power of the PLSR models becomes limited by unexpected by-product formation. Application of the lab-scale calibration spectra and PLSR models to online spectra from mini-plant operation yielded promising estimates of 1-tridecanal and acceptable predictions of 1-dodecene mass fractions, suggesting Raman spectroscopy is a suitable technique for process analytics in microemulsions.

  18. Low-risk gasoline alkylation process using a homogeneous liquid phase catalyst

    SciTech Connect

    Nelson, S.R.; Nelson, L.G.

    1996-12-31

Kerr-McGee's interest in finding additional applications for its ROSES technology has led to a promising new alkylation process for the production of gasoline. The technology is timely due to its inherent environmental safety. The Homogeneous Alkylation Technology (HAT(TM)) process uses a soluble alkylaluminum chloride-based catalyst at less than 1 percent of the acid concentrations used in conventional alkylation processes. The patented process greatly reduces the environmental risks associated with accidental acid releases from HF and sulfuric acid alkylation units. In addition, the process is projected to operate at lower cost than sulfuric acid alkylation and is a retrofit option for existing HF and sulfuric acid alkylation units. Kerr-McGee has entered into a relationship with a major U.S. refiner to carry on the development of the HAT process. A gallon-per-day-scale pilot unit has been constructed for use in developing the process. 1 fig., 1 tab.

  19. Novel particulate production processes to create unique security materials

    NASA Astrophysics Data System (ADS)

    Hampden-Smith, Mark; Kodas, Toivo; Haubrich, Scott; Oljaca, Miki; Einhorn, Rich; Williams, Darryl

    2006-02-01

Particles are frequently used to impart security features to high value items. These particles are typically produced by traditional methods, and therefore the security must be derived from the chemical composition of the particles rather than from the particle production process. Here, we present new and difficult-to-reproduce particle production processes based on spray pyrolysis that can produce unique particles and features dependent on the use of these new-to-the-world processes and process trade secrets. Specifically, two classes of functional materials are described: luminescent materials and electrocatalytic materials.

  20. Creating Reflective Choreographers: The Eyes See/Mind Sees Process

    ERIC Educational Resources Information Center

    Kimbrell, Sinead

    2012-01-01

    Since 1999, when the author first started teaching creative process-based dance programs in public schools, she has struggled to find the time to teach children the basic concepts and tools of dance while teaching them to be deliberate with their choreographic choices. In this article, the author describes a process that helps students and…

  1. Parallel information processing channels created in the retina

    PubMed Central

    Schiller, Peter H.

    2010-01-01

In the retina, several parallel channels originate that extract different attributes from the visual scene. This review describes how these channels arise and what their functions are. Following the introduction, four sections deal with these channels. The first discusses the “ON” and “OFF” channels that have arisen for the purpose of rapidly processing images in the visual scene that become visible by virtue of either light increment or light decrement; the ON channel processes images that become visible by virtue of light increment and the OFF channel processes images that become visible by virtue of light decrement. The second section examines the midget and parasol channels. The midget channel processes fine detail, wavelength information, and stereoscopic depth cues; the parasol channel plays a central role in processing motion and flicker as well as motion parallax cues for depth perception. Both these channels have ON and OFF subdivisions. The third section describes the accessory optic system that receives input from the retinal ganglion cells of Dogiel; these cells play a central role, in concert with the vestibular system, in stabilizing images on the retina to prevent the blurring of images that would otherwise occur when an organism is in motion. The last section provides a brief overview of several additional channels that originate in the retina. PMID:20876118

  2. Cyclization of 1,4-hydroxycarbonyls is not a homogenous gas phase process

    NASA Astrophysics Data System (ADS)

    Dibble, Theodore S.

    2007-10-01

Previous studies of 1,4-hydroxycarbonyls derived from alkanes have suggested that they can cyclize to saturated furans, which can subsequently eliminate water to form the corresponding dihydrofurans. CBS-QB3 and G3 studies of 5-hydroxy-2-pentanone and 2-hydroxypentanal show that both steps have activation barriers far too large for these reactions to occur as homogeneous gas-phase reactions. Similar results were obtained in CBS-QB3 studies of the analogous process leading from 2- and 3-methyl-4-hydroxy-2-butenal (species posited to form in the degradation of isoprene) to 3-methylfuran. The latter two processes are much more favorable, thermodynamically, than the formation of dihydrofurans from the saturated 1,4-hydroxycarbonyls.

  3. Creating the Virtual Work: Readers' Processes in Understanding Literary Texts.

    ERIC Educational Resources Information Center

    Earthman, Elise Ann

A study examined the ways in which college readers interact with literary texts. The method of interviews and think-along protocols, in which subjects read a text aloud while simultaneously verbalizing their thoughts, was used to compare the reading processes of eight college freshmen to those of eight masters students in literature who…

  4. Redistribution Mechanisms and Quantification of Homogeneity in Friction Stir Welding and Processing of an Aluminum Silicon Alloy

    DTIC Science & Technology

    2012-09-01

…wide range of particle-containing materials. Materials such as Nickel Aluminum Bronze (NAB), high-yield (HY) steels, and AA5083 are common in many… (From "Redistribution Mechanisms and Quantification of Homogeneity in Friction Stir Welding and Processing of an Aluminum Silicon Alloy," by Jeffrey C. Woertz, September 2012.)

  5. A hybrid process combining homogeneous catalytic ozonation and membrane distillation for wastewater treatment.

    PubMed

    Zhang, Yong; Zhao, Peng; Li, Jie; Hou, Deyin; Wang, Jun; Liu, Huijuan

    2016-10-01

A novel catalytic ozonation membrane reactor (COMR) coupling homogeneous catalytic ozonation and direct contact membrane distillation (DCMD) was developed for the treatment of refractory saline organic pollutants in wastewater. An ozonation process took place in the reactor to degrade organic pollutants, whilst the DCMD process was used to recover ionic catalysts and produce clean water. It was found that 98.6% of total organic carbon (TOC) and almost 100% of salt were removed, and almost 100% of the metal ion catalyst was recovered. TOC in the permeate water was less than 16 mg/L after 5 h of operation, which was considered satisfactory given that the TOC of the potassium hydrogen phthalate (KHP) feed water was as high as 1000 mg/L. Meanwhile, the membrane distillation flux in the COMR process was 49.8% higher than in the DCMD process alone after 60 h of operation. Further, scanning electron microscope images showed fewer and smaller contaminants on the membrane surface, indicating mitigation of membrane fouling. Tensile strength and FT-IR spectra tests did not reveal obvious changes in the polyvinylidene fluoride membrane after 60 h of operation, indicating good durability. This novel COMR hybrid process exhibits promising prospects for the treatment of saline organic wastewater.

  6. An empirical Bayesian and Buhlmann approach with non-homogenous Poisson process

    NASA Astrophysics Data System (ADS)

    Noviyanti, Lienda

    2015-12-01

All general insurance companies in Indonesia have to adjust their current premium rates to the maximum and minimum limit rates in the new regulation established by the Financial Services Authority (Otoritas Jasa Keuangan / OJK). In this research, we estimated premium rates by means of the Bayesian and the Bühlmann approaches, using historical claim frequencies and claim severities for five risk groups. We assumed a Poisson-distributed claim frequency and a normally distributed claim severity. In particular, we used a non-homogeneous Poisson process for estimating the parameters of the claim frequency. We found that the estimated premium rates are higher than the actual current rates. Relative to the OJK upper and lower limit rates, the estimates vary among the five risk groups: some fall inside the interval and some outside.
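    Sampling arrival times from a non-homogeneous Poisson process, as used here to model claim frequency, is commonly done by Lewis-Shedler thinning. The intensity function below is invented for illustration; the paper's actual intensity would be estimated from historical claim data:

```python
import math
import random

def nhpp_thinning(lam, lam_max, horizon, rng=random):
    """Sample event times on [0, horizon] for an intensity lam(t) <= lam_max.

    Candidates are drawn from a homogeneous Poisson process of rate
    lam_max, and each is accepted with probability lam(t) / lam_max.
    """
    times, t = [], 0.0
    while True:
        t += rng.expovariate(lam_max)       # next candidate arrival
        if t > horizon:
            return times
        if rng.random() < lam(t) / lam_max: # thinning (acceptance) step
            times.append(t)

lam = lambda t: 2.0 + math.sin(t)           # illustrative seasonal intensity
events = nhpp_thinning(lam, lam_max=3.0, horizon=100.0, rng=random.Random(7))
print("claims simulated:", len(events))
# The expected count is the integral of lam over [0, 100]:
print("expected count:", round(200.0 - math.cos(100.0) + 1.0, 1))
```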

  7. Volume Averaging Study of the Capacitive Deionization Process in Homogeneous Porous Media

    DOE PAGES

    Gabitto, Jorge; Tsouris, Costas

    2015-05-05

Ion storage in porous electrodes is important in applications such as energy storage by supercapacitors, water purification by capacitive deionization, extraction of energy from a salinity difference and heavy ion purification. In this paper, a model is presented to simulate the charge process in homogeneous porous media comprising big pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Transport between the electrolyte solution and the charged wall is described using the Gouy–Chapman–Stern model. The effective transport parameters for isotropic porous media are calculated solving the corresponding closure problems. Finally, the source terms that appear in the average equations are calculated using numerical computations. An alternative way to deal with the source terms is proposed.

  8. Volume Averaging Study of the Capacitive Deionization Process in Homogeneous Porous Media

    SciTech Connect

    Gabitto, Jorge; Tsouris, Costas

    2015-05-05

    Ion storage in porous electrodes is important in applications such as energy storage by supercapacitors, water purification by capacitive deionization, extraction of energy from a salinity difference and heavy ion purification. In this paper, a model is presented to simulate the charge process in homogeneous porous media comprising big pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Transport between the electrolyte solution and the charged wall is described using the Gouy–Chapman–Stern model. The effective transport parameters for isotropic porous media are calculated solving the corresponding closure problems. Finally, the source terms that appear in the average equations are calculated using numerical computations. An alternative way to deal with the source terms is proposed.

  9. Creating a national citizen engagement process for energy policy

    PubMed Central

    Pidgeon, Nick; Demski, Christina; Butler, Catherine; Parkhill, Karen; Spence, Alexa

    2014-01-01

    This paper examines some of the science communication challenges involved when designing and conducting public deliberation processes on issues of national importance. We take as our illustrative case study a recent research project investigating public values and attitudes toward future energy system change for the United Kingdom. National-level issues such as this are often particularly difficult to engage the public with because of their inherent complexity, derived from multiple interconnected elements and policy frames, extended scales of analysis, and different manifestations of uncertainty. With reference to the energy system project, we discuss ways of meeting a series of science communication challenges arising when engaging the public with national topics, including the need to articulate systems thinking and problem scale, to provide balanced information and policy framings in ways that open up spaces for reflection and deliberation, and the need for varied methods of facilitation and data synthesis that permit access to participants’ broader values. Although resource intensive, national-level deliberation is possible and can produce useful insights both for participants and for science policy. PMID:25225393

  10. Creating a national citizen engagement process for energy policy.

    PubMed

    Pidgeon, Nick; Demski, Christina; Butler, Catherine; Parkhill, Karen; Spence, Alexa

    2014-09-16

    This paper examines some of the science communication challenges involved when designing and conducting public deliberation processes on issues of national importance. We take as our illustrative case study a recent research project investigating public values and attitudes toward future energy system change for the United Kingdom. National-level issues such as this are often particularly difficult to engage the public with because of their inherent complexity, derived from multiple interconnected elements and policy frames, extended scales of analysis, and different manifestations of uncertainty. With reference to the energy system project, we discuss ways of meeting a series of science communication challenges arising when engaging the public with national topics, including the need to articulate systems thinking and problem scale, to provide balanced information and policy framings in ways that open up spaces for reflection and deliberation, and the need for varied methods of facilitation and data synthesis that permit access to participants' broader values. Although resource intensive, national-level deliberation is possible and can produce useful insights both for participants and for science policy.

  11. Homogeneous sonophotolysis of food processing industry wastewater: Study of synergistic effects, mineralization and toxicity removal.

    PubMed

    Durán, A; Monteagudo, J M; Sanmartín, I; Gómez, P

    2013-03-01

    The mineralization of industrial wastewater from the food industry using an emerging homogeneous sonophotolytic oxidation process was evaluated as an alternative to, or a rapid pretreatment step for, conventional anaerobic digestion, with the aim of considerably reducing the total treatment time. At the selected operating conditions ([H2O2] = 11,750 ppm, pH = 8, amplitude = 50%, pulse length (cycles) = 1), 60% of total organic carbon (TOC) is removed after 60 min and 98% after 180 min when treating an industrial effluent with 2114 ppm TOC. The process completely removed the toxicity generated during storage or due to intermediate compounds. An important synergistic effect between sonolysis and photolysis (H2O2/UV) was observed; thus, the sonophotolysis (ultrasound/H2O2/UV) technique significantly increases TOC removal compared with each individual process. Finally, a preliminary economic analysis confirms that sonophotolysis with H2O2 and pretreated water is a profitable system compared with the same process without ultrasound and without pretreatment.
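The synergy reported between sonolysis and photolysis is commonly quantified with a synergy index comparing the combined rate constant with the sum of the individual ones. A minimal sketch with illustrative rate constants (placeholders, not values from the study):

```python
# Synergy index for combined advanced oxidation processes (AOPs):
# S = k_combined / (k_sono + k_photo); S > 1 indicates a synergistic effect.
# The rate constants below are illustrative placeholders, not the study's data.
k_sono = 0.004      # min^-1, ultrasound alone
k_photo = 0.012     # min^-1, H2O2/UV alone
k_combined = 0.022  # min^-1, ultrasound + H2O2/UV

synergy_index = k_combined / (k_sono + k_photo)
```

With these numbers the index is 1.375, i.e. the combined process removes TOC faster than the sum of its parts would suggest.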

  12. People Create Health: Effective Health Promotion is a Creative Process

    PubMed Central

    Cloninger, C. Robert; Cloninger, Kevin M.

    2015-01-01

    Effective health promotion involves the creative cultivation of physical, mental, social, and spiritual well-being. Efforts at health promotion produce weak and inconsistent benefits when they do not engage people to express their own goals and values. Likewise, health promotion has been ineffective when it relies only on instruction about facts regarding a healthy lifestyle, or focuses on reduction of disease rather than the cultivation of well-being. Meta-analysis of longitudinal studies and experimental interventions shows that improvements in subjective well-being lead to short-term and long-term reductions in medical morbidity and mortality, as well as to healthier functioning and longevity. However, these effects are inconsistent and weak (correlations of about 0.15). The most consistent and strong predictor of both subjective well-being and objective health status in longitudinal studies is a creative personality profile characterized by being highly self-directed, cooperative, and self-transcendent. There is a synergy among these personality traits that enhances all aspects of the health and happiness of people. Experimental interventions to cultivate this natural creative potential are just beginning, but available exploratory research has shown that creativity can be enhanced and that the changes are associated with widespread and profound benefits, including greater physical, mental, social, and spiritual well-being. In addition to benefits mediated by choice of diet, physical activity, and health care utilization, the effect of a creative personality on health may be partly mediated by effects on the regulation of heart rate variability. Creativity promotes autonomic balance with parasympathetic dominance, leading to a calm alert state that promotes an awakening of plasticities and intelligences that stress inhibits. We suggest that health, happiness, and meaning can be cultivated by a complex adaptive process that enhances healthy functioning

  13. Evidence of linked biogeochemical and hydrological processes in homogeneous and layered vadose zone systems

    NASA Astrophysics Data System (ADS)

    McGuire, J. T.; Hansen, D. J.; Mohanty, B. P.

    2010-12-01

    Understanding chemical fate and transport in the vadose zone is critical to protect groundwater resources and preserve ecosystem health. However, prediction can be challenging due to the dynamic hydrologic and biogeochemical nature of the vadose zone. Additional controls on hydrobiogeochemical processes are added by subsurface structural heterogeneity. This study uses repacked soil column experiments to quantify linkages between microbial activity, geochemical cycling, and hydrologic flow. Three "short" laboratory soil columns were constructed to evaluate the effects of soil layering: a homogenized medium-grained sand, a homogenized organic-rich loam, and a sand-over-loam layered column. In addition, two "long" columns were constructed using either gamma-irradiated (sterilized) or untreated sediments to evaluate the effects of both soil layers and the presence of microorganisms. The long columns were packed identically: a medium-grained sand matrix with two vertically separated and horizontally offset lenses of organic-rich loam. In all five columns, downward and upward infiltration of water was evaluated to simulate rainfall and rising water table events, respectively. In situ co-located probes were used to measure soil water content, matric potential, Eh, major anions, ammonium, Fe2+, and total sulfide. Enhanced biogeochemical cycling was observed in the short layered column versus the short homogeneous columns, and enumerations of iron- and sulfate-reducing bacteria were 1-2 orders of magnitude greater. In the long columns, microbial activity caused mineral bands and produced insoluble gases that impeded water flow through the pores of the sediment. Capillary barriers, formed around the lenses due to soil textural differences, retarded water flow rates through the lenses. This allowed reducing conditions to develop, evidenced by the production of Fe2+ and S2-. At the fringes of the lenses, Fe2+ oxidized to form Fe(III)-oxide bands that further retarded water

  14. Spatial Division Multiplexed Microwave Signal processing by selective grating inscription in homogeneous multicore fibers

    PubMed Central

    Gasulla, Ivana; Barrera, David; Hervás, Javier; Sales, Salvador

    2017-01-01

    The use of Spatial Division Multiplexing for Microwave Photonics signal processing is proposed and experimentally demonstrated, for the first time to our knowledge, based on the selective inscription of Bragg gratings in homogeneous multicore fibers. The fabricated devices behave as sampled true time delay elements for radiofrequency signals offering a wide range of operation possibilities within the same optical fiber. The key to processing flexibility comes from the implementation of novel multi-cavity configurations by inscribing a variety of different fiber Bragg gratings along the different cores of a 7-core fiber. This entails the development of the first fabrication method to inscribe high-quality gratings characterized by arbitrary frequency spectra and located in arbitrary longitudinal positions along the individual cores of a multicore fiber. Our work opens the way towards the development of unique compact fiber-based solutions that enable the implementation of a wide variety of 2D (spatial and wavelength diversity) signal processing functionalities that will be key in future fiber-wireless communications scenarios. We envisage that Microwave Photonics systems and networks will benefit from this technology in terms of compactness, operation versatility and performance stability. PMID:28134304

  15. Spatial Division Multiplexed Microwave Signal processing by selective grating inscription in homogeneous multicore fibers.

    PubMed

    Gasulla, Ivana; Barrera, David; Hervás, Javier; Sales, Salvador

    2017-01-30

    The use of Spatial Division Multiplexing for Microwave Photonics signal processing is proposed and experimentally demonstrated, for the first time to our knowledge, based on the selective inscription of Bragg gratings in homogeneous multicore fibers. The fabricated devices behave as sampled true time delay elements for radiofrequency signals offering a wide range of operation possibilities within the same optical fiber. The key to processing flexibility comes from the implementation of novel multi-cavity configurations by inscribing a variety of different fiber Bragg gratings along the different cores of a 7-core fiber. This entails the development of the first fabrication method to inscribe high-quality gratings characterized by arbitrary frequency spectra and located in arbitrary longitudinal positions along the individual cores of a multicore fiber. Our work opens the way towards the development of unique compact fiber-based solutions that enable the implementation of a wide variety of 2D (spatial and wavelength diversity) signal processing functionalities that will be key in future fiber-wireless communications scenarios. We envisage that Microwave Photonics systems and networks will benefit from this technology in terms of compactness, operation versatility and performance stability.

  16. Spatial Division Multiplexed Microwave Signal processing by selective grating inscription in homogeneous multicore fibers

    NASA Astrophysics Data System (ADS)

    Gasulla, Ivana; Barrera, David; Hervás, Javier; Sales, Salvador

    2017-01-01

    The use of Spatial Division Multiplexing for Microwave Photonics signal processing is proposed and experimentally demonstrated, for the first time to our knowledge, based on the selective inscription of Bragg gratings in homogeneous multicore fibers. The fabricated devices behave as sampled true time delay elements for radiofrequency signals offering a wide range of operation possibilities within the same optical fiber. The key to processing flexibility comes from the implementation of novel multi-cavity configurations by inscribing a variety of different fiber Bragg gratings along the different cores of a 7-core fiber. This entails the development of the first fabrication method to inscribe high-quality gratings characterized by arbitrary frequency spectra and located in arbitrary longitudinal positions along the individual cores of a multicore fiber. Our work opens the way towards the development of unique compact fiber-based solutions that enable the implementation of a wide variety of 2D (spatial and wavelength diversity) signal processing functionalities that will be key in future fiber-wireless communications scenarios. We envisage that Microwave Photonics systems and networks will benefit from this technology in terms of compactness, operation versatility and performance stability.

  17. Decolorization of Reactive Red 2 by advanced oxidation processes: Comparative studies of homogeneous and heterogeneous systems.

    PubMed

    Wu, Chung-Hsin; Chang, Chung-Liang

    2006-02-06

    This study investigated the decolorization of Reactive Red 2 in water using advanced oxidation processes (AOPs): UV/TiO2, UV/SnO2, UV/TiO2+SnO2, O3, O3+MnO2, UV/O3, and UV/O3+TiO2+SnO2. Kinetic analyses indicated that the decolorization of Reactive Red 2 could be approximated as pseudo-first-order kinetics in both homogeneous and heterogeneous systems. The decolorization rate at pH 7 exceeded that at pH 4 and pH 10 in the UV/TiO2 and UV/TiO2+SnO2 systems, respectively. However, the rate constants in the ozone-containing systems followed the order pH 10 > pH 7 > pH 4. The UV/TiO2+SnO2 and O3+MnO2 systems exhibited greater decolorization rates than the UV/TiO2 and O3 systems, respectively; additionally, the degree of rate enhancement depended on pH. Variation in dye concentration influenced the decolorization efficiency of heterogeneous systems more significantly than that of homogeneous systems. Experimental results verified that decolorization and desulfuration occurred at nearly the same rate. Moreover, the decolorization rate constants at pH 7 in the various systems followed the order UV/O3 ≥ O3+MnO2 ≥ UV/O3+TiO2+SnO2 > O3 > UV/TiO2+SnO2 ≥ UV/TiO2 > UV/SnO2.
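Pseudo-first-order rate constants of the kind compared here are typically estimated by a linear fit of ln(C0/C) against time. A minimal sketch with hypothetical concentration data (not measurements from the study):

```python
import numpy as np

# Hypothetical dye concentrations over time (mg/L); not data from the study.
t = np.array([0.0, 5.0, 10.0, 20.0, 30.0])        # time, min
c = np.array([100.0, 77.88, 60.65, 36.79, 22.31])  # concentration, mg/L

# Pseudo-first-order model: C(t) = C0 * exp(-k*t)  =>  ln(C0 / C) = k * t
# np.polyfit returns coefficients highest degree first: [slope, intercept].
k, intercept = np.polyfit(t, np.log(c[0] / c), 1)
# k is the apparent decolorization rate constant (min^-1), ~0.05 for these data
```

The fitted intercept should be near zero if the model holds; a large intercept signals a lag phase or a poor pseudo-first-order approximation.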

  18. We're Born To Learn: Using the Brain's Natural Learning Process To Create Today's Curriculum.

    ERIC Educational Resources Information Center

    Smilkstein, Rita

    This book provides research-based, concrete strategies for creating a student-centered curriculum in which every student can learn. It breaks down the Natural Human Learning Process (NHLP) into six stages, providing guidelines and models showing educators how to create learning experiences at each stage of the process for individuals, small…

  19. Effective inactivation of Saccharomyces cerevisiae in minimally processed Makgeolli using low-pressure homogenization-based pasteurization.

    PubMed

    Bak, Jin Seop

    2015-01-01

    In order to address the limitations of the inefficient pasteurization platform used to make Makgeolli, such as the presence of turbid colloidal dispersions in suspension, commercially available Makgeolli was minimally processed using a low-pressure homogenization-based pasteurization (LHBP) process. This continuous process demonstrates that promptly reducing the exposure time of either large molecules or insoluble particles to excessive heat can dramatically improve internal quality and decrease irreversible damage. Specifically, optimal homogenization increased concomitantly with physical parameters such as colloidal stability (65.0% of maximum and below 25-μm particles) following two repetitions at 25.0 MPa. However, biochemical parameters such as microbial population, acidity, and the presence of fermentable sugars rarely affected Makgeolli quality. Remarkably, there was a 4.5-log reduction in the number of Saccharomyces cerevisiae target cells at 53.5°C for 70 sec in optimally homogenized Makgeolli. This value was higher than the 37.7% measured from traditionally pasteurized Makgeolli. In contrast to the analytical similarity among homogenized Makgeollis, our objective quality evaluation demonstrated significant differences between pasteurized (or unpasteurized) Makgeolli and LHBP-treated Makgeolli. Keywords: low-pressure homogenization-based pasteurization; Makgeolli; minimal processing-preservation; Saccharomyces cerevisiae; suspension stability.

  20. Kappa Distribution in a Homogeneous Medium: Adiabatic Limit of a Super-diffusive Process?

    NASA Astrophysics Data System (ADS)

    Roth, I.

    2015-12-01

    The classical statistical theory predicts that an ergodic, weakly interacting system, like charged particles in the presence of electromagnetic fields performing Brownian motions (characterized by small-range deviations in phase space and short-term microscopic memory), converges to the Gibbs-Boltzmann statistics. Observation of distributions with kappa power-law tails in homogeneous systems contradicts this prediction and necessitates a renewed analysis of the basic axioms of the diffusion process: the characteristics of the transition probability density function (pdf) for a single interaction, with the possibility of a non-Markovian process and non-local interaction. The non-local Levy-walk deviation is related to the non-extensive statistical framework. Particles bouncing along the (solar) magnetic field with evolving pitch angles, phases, and velocities, as they interact resonantly with waves, undergo energy changes at undetermined time intervals, satisfying these postulates. The dynamic evolution of a general continuous-time random walk is determined by the pdfs of jumps and waiting times, resulting in a fractional Fokker-Planck equation with non-integer derivatives whose solution is given by a Fox H-function. The resulting procedure involves fractional calculus, which is known although not frequently used in physics, while the local, Markovian process recasts the evolution into the standard Fokker-Planck equation. Solving the fractional Fokker-Planck equation with the help of the Mellin transform and evaluating its residues at the poles of its Gamma functions results in a slowly converging sum with power laws. It is suggested that these tails form the kappa function. Gradual vs. impulsive solar electron distributions serve as prototypes of this description.
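The kappa distribution discussed above reduces to a Maxwellian in the limit κ → ∞ while retaining a power-law tail for finite κ. A numerical sketch of the standard (unnormalized) form, with illustrative parameter values not taken from the abstract:

```python
import math

def kappa_tail(v, theta=1.0, kappa=4.0):
    # Unnormalized kappa distribution: f(v) ∝ (1 + v^2/(kappa*theta^2))^-(kappa+1).
    # theta and kappa values here are illustrative, not from the abstract.
    return (1.0 + v**2 / (kappa * theta**2)) ** (-(kappa + 1.0))

def maxwellian_tail(v, theta=1.0):
    # Gaussian (Maxwellian) limit recovered as kappa -> infinity.
    return math.exp(-(v**2) / theta**2)

# At suprathermal speeds the finite-kappa tail dominates the Maxwellian by
# orders of magnitude -- the observational signature the abstract refers to.
ratio = kappa_tail(5.0) / maxwellian_tail(5.0)
```

For very large κ the two functions agree closely at thermal speeds, which is the adiabatic/Gaussian limit of the diffusive description.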

  1. Process spectroscopy in microemulsions—setup and multi-spectral approach for reaction monitoring of a homogeneous hydroformylation process

    NASA Astrophysics Data System (ADS)

    Meyer, K.; Ruiken, J.-P.; Illner, M.; Paul, A.; Müller, D.; Esche, E.; Wozny, G.; Maiwald, M.

    2017-03-01

    Reaction monitoring in disperse systems, such as emulsions, is of significant technical importance in various disciplines like biotechnological engineering, the chemical industry, food science, and a growing number of other technical fields. These systems pose several challenges for process analytics, such as heterogeneity of mixtures, changes in optical behavior, and low optical activity. In this respect, online nuclear magnetic resonance (NMR) spectroscopy is a powerful technique for process monitoring in complex reaction mixtures due to its unique direct comparison abilities, while at the same time being non-invasive and independent of the optical properties of the sample. In this study, the applicability of online spectroscopic methods to the homogeneously catalyzed hydroformylation of 1-dodecene to tridecanal is investigated; the system is operated at mini-plant scale at Technische Universität Berlin. The design of a laboratory setup for process-like calibration experiments is presented, including a 500 MHz online NMR spectrometer, a benchtop NMR device with 43 MHz proton frequency, as well as two Raman probes and a flow-cell assembly for an ultraviolet and visible light (UV/VIS) spectrometer. Results of high-resolution online NMR spectroscopy are shown, and technical as well as process-specific problems observed during the measurements are discussed.

  2. A monolith purification process for virus-like particles from yeast homogenate.

    PubMed

    Burden, Claire S; Jin, Jing; Podgornik, Aleš; Bracewell, Daniel G

    2012-01-01

    Monoliths are an alternative stationary-phase format to conventional particle-based media for large biomolecules. Conventional resins suffer from limited capacities and flow rates when used for viruses, virus-like particles (VLPs), and other nanoplex materials. The monolith structure provides a more open pore structure to improve accessibility for these materials, with better mass transport from convective flow and reduced pressure drops. To examine the performance of this format for bioprocessing, we selected the challenging capture of a VLP from clarified yeast homogenate. Using a recombinant Saccharomyces cerevisiae host, it was found that hydrophobic-interaction-based separation using a hydroxyl-derivatised monolith had the best performance. The monolith was then compared with a known beaded-resin method, where the dynamic binding capacity was shown to be three-fold higher for the monolith, with an equivalent 90% recovery of the VLP. To understand the impact of the crude feed material, confocal microscopy was used to visualise lipid contaminants deriving from the homogenised yeast. The lipid was seen to form a layer on top of the column, even after regeneration of the column with isopropanol, resulting in increasing pressure drops with the number of operational cycles. Removing the lipid pre-column significantly reduces the amount and rate of this fouling process. Using Amberlite XAD-4 beads, around 70% of the lipid was removed, with a VLP loss of around 20%. Applying the reduced-lipid feed versus an untreated feed further increased the dynamic binding capacity of the monolith from 0.11 mg/mL column to 0.25 mg/mL column.

  3. Modeling of HIV/AIDS dynamic evolution using non-homogeneous semi-markov process.

    PubMed

    Dessie, Zelalem Getahun

    2014-01-01

    The purpose of this study is to model the progression of HIV/AIDS disease in individual patients under ART follow-up using non-homogeneous semi-Markov processes. The model focuses on the patient's age as a relevant factor to forecast the transitions among the different levels of seriousness of the disease. A sample of 1456 patients was taken from hospital records at Amhara Referral Hospitals, Amhara Region, Ethiopia, of patients under ART follow-up from June 2006 to August 2013. The states of disease progression adopted in the model were defined based on the following CD4 cell counts: >500 cells/mm(3) (SI); 349 to 500 cells/mm(3) (SII); 199 to 350 cells/mm(3) (SIII); ≤200 cells/mm(3) (SIV); and death (D). The first four states are referred to as living states. The probability that an HIV/AIDS patient in any one of the living states will transition to the death state is greater with increasing age, irrespective of the current state of the patient. More generally, the probability of dying decreases with increasing CD4 counts over time. For an HIV/AIDS patient in a specific state of the disease, the probability of remaining in the same state decreases with increasing age. Within the living states, the results show that the probability of being in a better state is non-zero, but less than the probability of being in a worse state, for all ages. A reliability analysis also revealed that the survival probabilities all decline over time. Computed conditional probabilities show differential subject response depending on the age of the patient. The dynamic nature of AIDS progression is confirmed, with the particular finding that patients are more likely to be in a worse state than a better one unless interventions are made. Our findings suggest that ongoing ART treatment services could be provided more effectively with careful consideration of the recent disease status of patients.
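The multi-state structure of such a model can be illustrated with a plain time-homogeneous Markov chain over the five states (the study itself uses a non-homogeneous semi-Markov process, which additionally conditions on patient age and waiting times). The transition probabilities below are invented for illustration, not the study's estimates:

```python
import numpy as np

# States: SI, SII, SIII, SIV (living states, by CD4 count) and D (death, absorbing).
# Entries are illustrative placeholders, NOT estimates from the study.
P = np.array([
    [0.80, 0.15, 0.03, 0.01, 0.01],   # from SI
    [0.10, 0.75, 0.10, 0.03, 0.02],   # from SII
    [0.03, 0.12, 0.70, 0.10, 0.05],   # from SIII
    [0.01, 0.04, 0.10, 0.70, 0.15],   # from SIV
    [0.00, 0.00, 0.00, 0.00, 1.00],   # D is absorbing
])

# k-step transition probabilities are matrix powers of P; for example, the
# probability of having died within 5 periods starting from SI is P5[0, 4].
P5 = np.linalg.matrix_power(P, 5)
```

In the non-homogeneous semi-Markov setting, P would instead depend on age and on the time already spent in the current state, so matrix powers are replaced by age-indexed kernel convolutions.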

  4. Efficacy of low-temperature high hydrostatic pressure processing in inactivating Vibrio parahaemolyticus in culture suspension and oyster homogenate.

    PubMed

    Phuvasate, Sureerat; Su, Yi-Cheng

    2015-03-02

    Culture suspensions of five clinical and five environmental Vibrio parahaemolyticus strains in 2% NaCl solution were subjected to high pressure processing (HPP) under various conditions (200-300 MPa for 5 and 10 min at 1.5-20°C) to study differences in pressure resistance among the strains. The most pressure-resistant and most pressure-sensitive strains were selected to investigate the effects of low temperatures (15, 5, and 1.5°C) on HPP (200 or 250 MPa for 5 min) inactivation of V. parahaemolyticus in sterile oyster homogenates. Inactivation of V. parahaemolyticus cells in culture suspensions and oyster homogenates was greatly enhanced by lowering the processing temperature from 15 to 5 or 1.5°C. A treatment of oyster homogenates at 250 MPa for 5 min at 5°C decreased the populations of V. parahaemolyticus by 6.2 log CFU/g for strains 10290 and 100311Y11 and by >7.4 log CFU/g for strain 10292. Decreasing the processing temperature of the same treatment to 1.5°C reduced all the V. parahaemolyticus strains inoculated into oyster homogenates to non-detectable (<10 CFU/g) levels. Factors including pressure level, processing temperature, and time all need to be considered in developing effective HPP for eliminating pathogens from foods. Further studies are needed to validate the efficacy of the HPP treatment (250 MPa for 5 min at 1.5°C) in inactivating V. parahaemolyticus cells in whole oysters.
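Log reductions like the 6.2-log figure reported here are computed from plate counts before and after treatment. A sketch with made-up counts (not the study's raw data):

```python
import math

# Plate counts before and after treatment (CFU/g); illustrative numbers only.
n_before = 2.5e7
n_after = 1.6e1

# Log reduction = log10(N_before / N_after); ~6.2 log CFU/g for these counts.
log_reduction = math.log10(n_before / n_after)
```

A result below the detection limit (<10 CFU/g here) can only be reported as a "greater than" reduction, which is why the abstract writes >7.4 log CFU/g for the most sensitive strain.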

  5. Effects of non-homogeneous flow on ADCP data processing in a hydroturbine forebay

    SciTech Connect

    Harding, S. F.; Richmond, M. C.; Romero-Gomez, P.; Serkowski, J. A.

    2016-01-02

    Accurate modeling of the velocity field in the forebay of a hydroelectric power station is important for both power generation and fish passage, and can be increasingly well represented by computational fluid dynamics (CFD) simulations. Acoustic Doppler Current Profilers (ADCPs) are investigated herein as a method of validating the numerical flow solutions, particularly in observed and calculated regions of non-homogeneous flow velocity. By using a numerical model of an ADCP operating in a velocity field calculated using CFD, the errors due to the spatial variation of the flow velocity are quantified. The numerical model of the ADCP is referred to herein as a Virtual ADCP (VADCP).

  6. Effects of non-homogeneous flow on ADCP data processing in a hydroturbine forebay

    SciTech Connect

    Harding, S. F.; Richmond, M. C.; Romero-Gomez, P.; Serkowski, J. A.

    2016-12-01

    Observations of the flow conditions in the forebay of a hydroelectric power station indicate significant regions of non-homogeneous velocities near the intakes and shoreline. The effect of these non-homogeneous regions on the velocity measurement of an acoustic Doppler current profiler (ADCP) is investigated. By using a numerical model of an ADCP operating in a velocity field calculated using computational fluid dynamics (CFD), the errors due to the spatial variation of the flow velocity are identified. The numerical model of the ADCP is referred to herein as a Virtual ADCP (VADCP). Two scenarios are modeled in the numerical analyses presented. First, the measurement error of the VADCP is calculated for a single instrument adjacent to the short converging intake of the powerhouse. Second, the flow discharge through the forebay is estimated from a transect of VADCP instruments at different distances from the powerhouse. The influence of instrument location and orientation is investigated for both cases. A velocity error of up to 94% of the reference velocity is calculated for a VADCP modeled adjacent to an operating intake. Qualitative agreement between the calculated VADCP velocities and the reference velocities is observed at an offset of one intake height upstream of the powerhouse.

  7. PROCESS WATER BUILDING, TRA605. FORMS ARE SET TO CREATE THREE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PROCESS WATER BUILDING, TRA-605. FORMS ARE SET TO CREATE THREE SHIELDED CELLS FOR THE PUMPS THAT WILL BE IN WEST HALF OF THE BUILDING. PUMPS WILL LIFT WATER TO WORKING RESERVOIR. CAMERA FACES NORTHEAST. INL NEGATIVE NO. 1465. Unknown Photographer, 2/13/1951 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  8. Effects of non-homogeneous flow on ADCP data processing in a hydroturbine forebay

    DOE PAGES

    Harding, S. F.; Richmond, M. C.; Romero-Gomez, P.; ...

    2016-01-02

    Accurate modeling of the velocity field in the forebay of a hydroelectric power station is important for both power generation and fish passage, and can be increasingly well represented by computational fluid dynamics (CFD) simulations. Acoustic Doppler Current Profilers (ADCPs) are investigated herein as a method of validating the numerical flow solutions, particularly in observed and calculated regions of non-homogeneous flow velocity. By using a numerical model of an ADCP operating in a velocity field calculated using CFD, the errors due to the spatial variation of the flow velocity are quantified. The numerical model of the ADCP is referred to herein as a Virtual ADCP (VADCP).

  9. Optimization of homogenization-evaporation process for lycopene nanoemulsion production and its beverage applications.

    PubMed

    Kim, Sang Oh; Ha, Thi Van Anh; Choi, Young Jin; Ko, Sanghoon

    2014-08-01

    Lycopene is a natural antioxidant with several health benefits. Undesirable oxidation of lycopene compromises these benefits and also affects the sensory quality of food products containing lycopene. The health benefits associated with lycopene in food preparations can be enhanced by preventing its degradation through incorporation into the oil phase of an oil-in-water nanoemulsion. In this study, lycopene nanoemulsions were prepared from a low-concentration lycopene extract using an emulsification-evaporation technique. The effects of the concentrations of lycopene extract (0.015 to 0.085 mg/mL) and emulsifier (0.3 to 0.7 mg/mL), and of the number of homogenization cycles (2 to 4), on the droplet size, emulsification efficiency (EE), and nanoemulsion stability were investigated and optimized by statistical analysis using a Box-Behnken design. Regression analysis was used to determine the second-order polynomial relationship between the independent and dependent variables, with multiple regression coefficients (R²) of 0.924, 0.933, and 0.872 for the droplet size, EE, and nanoemulsion stability, respectively. Analysis of variance showed that the lycopene extract concentration has the most significant effect on all the response variables. Response surface methodology predicted that a formulation containing 0.085 mg/mL of lycopene extract and 0.7 mg/mL of emulsifier, subjected to 3 homogenization cycles, is optimal for achieving the smallest droplet size, greatest emulsion stability, and acceptable EE. The observed responses agreed with the values predicted for the optimized formulation. This study provides important information about the statistical design of lycopene nanoemulsion preparation.
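The response-surface step, fitting a second-order polynomial and locating its stationary point, can be sketched for a single factor (the study fits a full three-factor Box-Behnken model; the data below are hypothetical):

```python
import numpy as np

# Hypothetical droplet size (nm) vs. emulsifier concentration (mg/mL);
# these are NOT measurements from the study.
x = np.array([0.3, 0.4, 0.5, 0.6, 0.7])
y = np.array([190.0, 165.0, 150.0, 145.0, 150.0])

# Second-order polynomial fit: y = b2*x^2 + b1*x + b0
b2, b1, b0 = np.polyfit(x, y, 2)

# Stationary point of the fitted parabola (a candidate optimum when b2 > 0):
x_opt = -b1 / (2.0 * b2)   # 0.6 mg/mL for these data
```

With three factors the same idea generalizes: the fitted quadratic surface is differentiated with respect to each factor and the stationary point is checked against the experimental region before being declared an optimum.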

  10. Regional Homogeneity

    PubMed Central

    Jiang, Lili; Zuo, Xi-Nian

    2015-01-01

    Much effort has been made to understand the organizational principles of human brain function using functional magnetic resonance imaging (fMRI) methods, among which resting-state fMRI (rfMRI) is an increasingly recognized technique for measuring the intrinsic dynamics of the human brain. Functional connectivity (FC) with rfMRI is the most widely used method to describe remote or long-distance relationships in studies of cerebral cortex parcellation, interindividual variability, and brain disorders. In contrast, local or short-distance functional interactions, especially at a scale of millimeters, have rarely been investigated or systematically reviewed like remote FC, although some local FC algorithms have been developed and applied to the discovery of brain-based changes under neuropsychiatric conditions. To fill this gap between remote and local FC studies, this review will (1) briefly survey the history of studies on organizational principles of human brain function; (2) propose local functional homogeneity as a network centrality to characterize multimodal local features of the brain connectome; (3) render a neurobiological perspective on local functional homogeneity by linking its temporal, spatial, and individual variability to information processing, anatomical morphology, and brain development; and (4) discuss its role in performing connectome-wide association studies and identify relevant challenges, and recommend its use in future brain connectomics studies. PMID:26170004
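Local functional homogeneity of the kind reviewed here is commonly computed as Kendall's coefficient of concordance (KCC) over the time series of a voxel and its neighbors (the ReHo approach). A minimal sketch, assuming no tied ranks:

```python
import numpy as np

def kendalls_w(ts):
    """Kendall's coefficient of concordance across k time series of length n.

    ts has shape (k, n). Each series is ranked over time; ties are ignored
    for simplicity. Returns W in [0, 1], where W = 1 means the k series are
    perfectly synchronous in rank order."""
    k, n = ts.shape
    ranks = ts.argsort(axis=1).argsort(axis=1) + 1.0  # rank within each series
    rank_sums = ranks.sum(axis=0)                     # rank sum per time point
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (k**2 * (n**3 - n))

# 27 identical series (e.g. a perfectly synchronous 3x3x3 voxel neighborhood):
sync = np.tile(np.sin(np.linspace(0.0, 6.0, 40)), (27, 1))
w_sync = kendalls_w(sync)

# 27 independent noise series: concordance should be near zero.
rng = np.random.default_rng(0)
w_noise = kendalls_w(rng.standard_normal((27, 40)))
```

For voxel-wise ReHo maps, `ts` stacks the time series of a voxel and its 26 neighbors, and the resulting W is assigned to the central voxel.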

  11. Study on rheo-diecasting process of 7075R alloys by SA-EMS melt homogenized treatment

    NASA Astrophysics Data System (ADS)

    Zhihua, G.; Jun, X.; Zhifeng, Z.; Guojun, L.; Mengou, T.

    2016-03-01

    An advanced melt processing technology, spiral annular electromagnetic stirring (SA-EMS), based on the annular electromagnetic stirring (A-EMS) process, was developed for manufacturing Al-alloy components with high integrity. The SA-EMS process innovatively combines non-contact electromagnetic stirring with a spiral annular chamber with specially designed profiles to make high-quality melt slurry in situ; intensive forced shearing can be achieved under a high shear rate and high intensity of turbulence inside the spiral annular chamber. In this paper, the solidification microstructure and hardness of a 7075R alloy die-cast connecting rod conditioned by the SA-EMS melt processing technology were investigated. The results indicate that the SA-EMS melt processing technology exhibited superior grain refinement and remarkable structural homogeneity. In addition, it can markedly enhance the mechanical performance and reduce the cracking tendency.

  12. Porcine liver decellularization under oscillating pressure conditions: a technical refinement to improve the homogeneity of the decellularization process.

    PubMed

    Struecker, Benjamin; Hillebrandt, Karl Herbert; Voitl, Robert; Butter, Antje; Schmuck, Rosa B; Reutzel-Selke, Anja; Geisel, Dominik; Joehrens, Korinna; Pickerodt, Philipp A; Raschzok, Nathanael; Puhl, Gero; Neuhaus, Peter; Pratschke, Johann; Sauer, Igor M

    2015-03-01

Decellularization and recellularization of parenchymal organs may facilitate the generation of autologous functional liver organoids by repopulation of decellularized porcine liver matrices with induced liver cells. We present an accelerated (7 h overall perfusion time) and effective protocol for human-scale liver decellularization by pressure-controlled perfusion with 1% Triton X-100 and 1% sodium dodecyl sulfate via the hepatic artery (120 mmHg) and portal vein (60 mmHg). In addition, we analyzed the effect of oscillating pressure conditions on pig liver decellularization (n=19). The proprietary perfusion device used to generate these pressure conditions mimics intra-abdominal conditions during respiration to optimize microperfusion within livers and thus optimize the homogeneity of the decellularization process. The efficiency of perfusion decellularization was analyzed by macroscopic observation, histological staining (hematoxylin and eosin [H&E], Sirius red, and alcian blue), immunohistochemical staining (collagen IV, laminin, and fibronectin), and biochemical assessment (DNA, collagen, and glycosaminoglycans) of decellularized liver matrices. The integrity of the extracellular matrix (ECM) postdecellularization was visualized by corrosion casting and three-dimensional computed tomography scanning. We found that livers perfused under oscillating pressure conditions (P(+)) showed a more homogeneous course of decellularization and contained less DNA compared with livers perfused without oscillating pressure conditions (P(-)). Microscopically, livers from the (P(-)) group showed remnant cell clusters, while no cells were found in livers from the (P(+)) group. The grade of disruption of the ECM was higher in livers from the (P(-)) group, although the perfusion rates and pressure did not significantly differ. Immunohistochemical staining revealed that important matrix components were still present after decellularization. Corrosion casting showed an intact

  13. Challenges in modelling homogeneous catalysis: new answers from ab initio molecular dynamics to the controversy over the Wacker process.

    PubMed

    Stirling, András; Nair, Nisanth N; Lledós, Agustí; Ujaque, Gregori

    2014-07-21

We present here a review of the mechanistic studies of the Wacker process, stressing the long controversy about the key reaction steps. We give an overview of the previous experimental and theoretical studies on the topic. Then we describe the importance of the most recent ab initio molecular dynamics (AIMD) calculations in modelling organometallic reactivity in water. As a prototypical example of homogeneous catalytic reactions, the Wacker process poses serious challenges to modelling. The adequate description of the multiple roles of the water solvent is very difficult using static quantum chemical approaches, including cluster and continuum solvent models. In contrast, such reaction systems are well suited to AIMD, and by combining it with rare event sampling techniques, the method provides reaction mechanisms and the corresponding free energy profiles. The review also highlights how AIMD has helped to obtain a novel understanding of the mechanism and kinetics of the Wacker process.

  14. Dense and Homogeneous Compaction of Fine Ceramic and Metallic Powders: High-Speed Centrifugal Compaction Process

    SciTech Connect

    Suzuki, Hiroyuki Y.

    2008-02-15

High-Speed Centrifugal Compaction Process (HCP) is a variation of the colloidal compacting method, in which the powders sediment under a huge centrifugal force. The compacting mechanism of HCP differs from conventional colloidal processes such as slip casting. The unique compacting mechanism of HCP leads to a number of advantages, such as a higher compacting speed, wide applicability for net-shape formation, and flawless microstructure of the green compacts. However, HCP also has several drawbacks that must be overcome to realize the process's full potential.

  15. Flexible printed circuit boards laser bonding using a laser beam homogenization process

    NASA Astrophysics Data System (ADS)

    Kim, Joohan; Choi, Haewoon

    2012-11-01

    A laser micro-bonding process using laser beam shaping is successfully demonstrated for flexible printed circuit boards. A CW Ytterbium fiber laser with a wavelength of 1070 nm and a laser power density of 1-7 W/mm2 is employed as a local heat source for bonding flexible printed circuit boards to rigid printed circuit boards. To improve the bonding quality, a micro-lens array is used to modify the Gaussian laser beam for the bonding process. An electromagnetic modeling and heat transfer simulation is conducted to verify the effect of the micro-lens array on the laser bonding process. The optimal bonding parameters are found experimentally. As the measured temperature ramp rate of the boards exceeds 1100 K/s, bonding occurs within 100-200 ms at a laser power density of 5 W/mm2. The bonding quality of the FPCB is verified with a shear strength test. Process characteristics are also discussed.

  16. Polymer powder processing of cryomilled polycaprolactone for solvent-free generation of homogeneous bioactive tissue engineering scaffolds.

    PubMed

    Lim, Jing; Chong, Mark Seow Khoon; Chan, Jerry Kok Yen; Teoh, Swee-Hin

    2014-06-25

Synthetic polymers used in tissue engineering require functionalization with bioactive molecules to elicit specific physiological reactions. These additives must be homogeneously dispersed in order to achieve enhanced composite mechanical performance and uniform cellular response. This work demonstrates the use of a solvent-free powder processing technique to form osteoinductive scaffolds from cryomilled polycaprolactone (PCL) and tricalcium phosphate (TCP). Cryomilling is performed to achieve micrometer-sized distribution of PCL and reduce melt viscosity, thus improving TCP distribution and improving structural integrity. A breakthrough is achieved in the successful fabrication of a continuous film structure containing 70 weight percent TCP. Following compaction and melting, PCL/TCP composite scaffolds are found to display uniform distribution of TCP throughout the PCL matrix regardless of composition. Homogeneous spatial distribution is also achieved in fabricated 3D scaffolds. When seeded onto powder-processed PCL/TCP films, mesenchymal stem cells are found to undergo robust and uniform osteogenic differentiation, indicating the potential application of this approach to biofunctionalize scaffolds for tissue engineering applications.

  17. New American Cancer Society process for creating trustworthy cancer screening guidelines.

    PubMed

    Brawley, Otis; Byers, Tim; Chen, Amy; Pignone, Michael; Ransohoff, David; Schenk, Maryjean; Smith, Robert; Sox, Harold; Thorson, Alan G; Wender, Richard

    2011-12-14

    Guidelines for cancer screening written by different organizations often differ, even when they are based on the same evidence. Those dissimilarities can create confusion among health care professionals, the general public, and policy makers. The Institute of Medicine (IOM) recently released 2 reports to establish new standards for developing more trustworthy clinical practice guidelines and conducting systematic evidence reviews that serve as their basis. Because the American Cancer Society (ACS) is an important source of guidance about cancer screening for both health care practitioners and the general public, it has revised its methods to create a more transparent, consistent, and rigorous process for developing and communicating guidelines. The new ACS methods align with the IOM principles for trustworthy clinical guideline development by creating a single generalist group for writing the guidelines, commissioning independent systematic evidence reviews, and clearly articulating the benefits, limitations, and harms associated with a screening test. This new process should ensure that ACS cancer screening guidelines will continue to be a trustworthy source of information for both health care practitioners and the general public to guide clinical practice, personal choice, and public policy about cancer screening.

  18. Homogeneity Pursuit

    PubMed Central

    Ke, Tracy; Fan, Jianqing; Wu, Yichao

    2014-01-01

This paper explores the homogeneity of coefficients in high-dimensional regression, which extends the sparsity concept and is more general and suitable for many applications. Homogeneity arises when regression coefficients corresponding to neighboring geographical regions or a similar cluster of covariates are expected to be approximately the same. Sparsity corresponds to a special case of homogeneity with a large cluster of coefficients at the known atom zero. In this article, we propose a new method called clustering algorithm in regression via data-driven segmentation (CARDS) to explore homogeneity. New mathematical results are provided on the gain that can be achieved by exploring homogeneity. Statistical properties of two versions of CARDS are analyzed. In particular, the asymptotic normality of our proposed CARDS estimator is established, which reveals better estimation accuracy for homogeneous parameters than that without homogeneity exploration. When our methods are combined with sparsity exploration, further efficiency can be achieved beyond the exploration of sparsity alone. This provides additional insights into the power of exploring low-dimensional structures in high-dimensional regression: homogeneity and sparsity. Our results also shed light on the properties of the fused Lasso. The newly developed method is further illustrated by simulation studies and applications to real data. Supplementary materials for this article are available online. PMID:26085701
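
The segmentation idea behind CARDS can be illustrated with a deliberately simplified toy: sort preliminary coefficient estimates, split wherever an adjacent gap exceeds a threshold, and average within each group. This is only a sketch of the intuition, not the actual CARDS estimator or its penalty:

```python
def segment_and_average(estimates, gap_threshold):
    # Sort coefficient estimates, split at large gaps, average within groups
    order = sorted(range(len(estimates)), key=lambda i: estimates[i])
    groups, current = [], [order[0]]
    for prev, idx in zip(order, order[1:]):
        if estimates[idx] - estimates[prev] > gap_threshold:
            groups.append(current)
            current = []
        current.append(idx)
    groups.append(current)
    # Replace each coefficient by its group mean ("fusing" the group)
    fused = [0.0] * len(estimates)
    for g in groups:
        avg = sum(estimates[i] for i in g) / len(g)
        for i in g:
            fused[i] = avg
    return fused
```

For noisy estimates clustered near 0 and near 2, the sketch fuses each cluster to a common value, mimicking homogeneity exploration.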

  19. Homogenous VUV advanced oxidation process for enhanced degradation and mineralization of antibiotics in contaminated water.

    PubMed

    Pourakbar, Mojtaba; Moussavi, Gholamreza; Shekoohiyan, Sakine

    2016-03-01

This study aimed to evaluate the degradation and mineralization of amoxicillin (AMX) using the VUV advanced oxidation process. The effects of pH, initial AMX concentration, presence of water matrix constituents, and HRT, as well as the level of mineralization achieved by the VUV process, were considered. For direct comparison, the tests were also performed with UVC radiation. The results show that AMX degradation followed first-order kinetics. Direct photolysis by UVC degraded 50 mg/L of AMX in 50 min, versus 3 min for the VUV process. Removal efficiency in the VUV process was directly influenced by solution pH, with higher removal rates achieved at high pH values. The results also show that 10 mg/L of AMX was completely degraded and mineralized within 50 s and 100 s, respectively, indicating that the AMX was fully converted into non-hazardous products. Operating the photoreactor in continuous-flow mode revealed that 10 mg/L AMX was completely degraded and mineralized at HRT values of 120 s and 300 s. It was concluded that the VUV advanced oxidation process is an efficient and viable technique for the degradation and mineralization of water contaminated with antibiotics.
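
First-order kinetics means C(t) = C0·exp(−kt), so a rate constant follows from any observed concentration pair via k = ln(C0/C)/t. A small sketch; the 99% removal endpoint used to interpret the 50 min (UVC) and 3 min (VUV) figures is an assumption for illustration, not stated in the abstract:

```python
import math

def rate_constant(c0, c, t):
    # First-order decay: C(t) = C0 * exp(-k t)  =>  k = ln(C0/C) / t
    return math.log(c0 / c) / t

def time_to_removal(k, fraction_removed):
    # Time needed to remove a given fraction under first-order kinetics
    return math.log(1.0 / (1.0 - fraction_removed)) / k

# Assume 99% removal of 50 mg/L AMX in 50 min (UVC) vs 3 min (VUV)
k_uvc = rate_constant(50.0, 0.5, 50.0)   # ~0.092 min^-1
k_vuv = rate_constant(50.0, 0.5, 3.0)    # ~1.535 min^-1
```

The ratio k_vuv/k_uvc (about 17) reflects the faster degradation reported for the VUV process.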

  20. Experimental development of processes to produce homogenized alloys of immiscible metals, phase 3

    NASA Technical Reports Server (NTRS)

    Reger, J. L.

    1976-01-01

An experimental drop tower package was designed and built for use in a drop tower. This effort consisted of a thermal analysis, container/heater fabrication, and assembly of an expulsion device for rapid quenching of heated specimens during low gravity conditions. Six gallium-bismuth specimens with compositions in the immiscibility region (50 a/o of each element) were processed in the experimental package: four during low gravity conditions and two under a one gravity environment. One of the one gravity processed specimens lacked telemetry data and was excluded from analysis since its processing conditions were not known. Metallurgical, Hall effect, resistivity, and superconductivity examinations were performed on the five remaining specimens. Examination of the specimens showed that the gallium was dispersed in the bismuth. The low gravity processed specimens showed a relatively uniform distribution of gallium, with particle sizes of 1 micrometer or less, in contrast to the one gravity control specimen. Comparison of the cooling rates of the dropped specimens versus microstructure indicated that low cooling rates are more desirable.

  1. Development of a reference material for Staphylococcus aureus enterotoxin A in cheese: feasibility study, processing, homogeneity and stability assessment.

    PubMed

    Zeleny, R; Emteborg, H; Charoud-Got, J; Schimmel, H; Nia, Y; Mutel, I; Ostyn, A; Herbin, S; Hennekinne, J-A

    2015-02-01

Staphylococcal food poisoning is caused by enterotoxins excreted into foods by strains of staphylococci. Commission Regulation 1441/2007 specifies thresholds for the presence of these toxins in foods. In this article we report on the progress towards reference materials (RMs) for Staphylococcal enterotoxin A (SEA) in cheese. RMs are crucial to enforce legislation and to implement and safeguard reliable measurements. First, a feasibility study revealed a suitable processing procedure for cheese powders: the blank material was prepared by cutting, grinding, freeze-drying and milling. For the spiked material, a cheese-water slurry was spiked with SEA solution, freeze-dried and diluted with blank material to the desired SEA concentration. Thereafter, batches of three materials (blank; two SEA concentrations) were processed. The materials were shown to be sufficiently homogeneous, and storage at ambient temperature for 4 weeks did not indicate degradation. These results provide the basis for the development of a RM for SEA in cheese.

  2. Multiple-pass high-pressure homogenization of milk for the development of pasteurization-like processing conditions.

    PubMed

    Ruiz-Espinosa, H; Amador-Espejo, G G; Barcenas-Pozos, M E; Angulo-Guerrero, J O; Garcia, H S; Welti-Chanes, J

    2013-02-01

Multiple-pass ultrahigh pressure homogenization (UHPH) was used for reducing microbial population of both indigenous spoilage microflora in whole raw milk and a baroresistant pathogen (Staphylococcus aureus) inoculated in whole sterile milk to define pasteurization-like processing conditions. Response surface methodology was followed and multiple response optimization of UHPH operating pressure (OP) (100, 175, 250 MPa) and number of passes (N) (1-5) was conducted through overlaid contour plot analysis. Increasing OP and N had a significant effect (P < 0.05) on microbial reduction of both spoilage microflora and Staph. aureus in milk. Optimized UHPH processes (five 202-MPa passes; four 232-MPa passes) defined a region where a 5-log10 reduction of both the total bacterial count of milk and a baroresistant pathogen is attainable, a requisite parameter for establishing an alternative method of pasteurization. Multiple-pass UHPH optimized conditions might help in producing safe milk without the detrimental effects associated with thermal pasteurization.

  3. Clinical perspective: creating an effective practice peer review process-a primer.

    PubMed

    Gandhi, Manisha; Louis, Frances S; Wilson, Shae H; Clark, Steven L

    2017-03-01

    Peer review serves as an important adjunct to other hospital quality and safety programs. Despite its importance, the available literature contains virtually no guidance regarding the structure and function of effective peer review committees. This Clinical Perspective provides a summary of the purposes, structure, and functioning of effective peer review committees. We also discuss important legal considerations that are a necessary component of such processes. This discussion includes useful templates for case selection and review. Proper committee structure, membership, work flow, and leadership as well as close cooperation with the hospital medical executive committee and legal representatives are essential to any effective peer review process. A thoughtful, fair, systematic, and organized approach to creating a peer review process will lead to confidence in the committee by providers, hospital leadership, and patients. If properly constructed, such committees may also assist in monitoring and enforcing compliance with departmental protocols, thus reducing harm and promoting high-quality practice.

  4. Synthetic river valleys: Creating prescribed topography for form-process inquiry and river rehabilitation design

    NASA Astrophysics Data System (ADS)

    Brown, R. A.; Pasternack, G. B.; Wallender, W. W.

    2014-06-01

    The synthesis of artificial landforms is complementary to geomorphic analysis because it affords a reflection on both the characteristics and intrinsic formative processes of real world conditions. Moreover, the applied terminus of geomorphic theory is commonly manifested in the engineering and rehabilitation of riverine landforms where the goal is to create specific processes associated with specific morphology. To date, the synthesis of river topography has been explored outside of geomorphology through artistic renderings, computer science applications, and river rehabilitation design; while within geomorphology it has been explored using morphodynamic modeling, such as one-dimensional simulation of river reach profiles, two-dimensional simulation of river networks, and three-dimensional simulation of subreach scale river morphology. To date, no approach allows geomorphologists, engineers, or river rehabilitation practitioners to create landforms of prescribed conditions. In this paper a method for creating topography of synthetic river valleys is introduced that utilizes a theoretical framework that draws from fluvial geomorphology, computer science, and geometric modeling. Such a method would be valuable to geomorphologists in understanding form-process linkages as well as to engineers and river rehabilitation practitioners in developing design surfaces that can be rapidly iterated. The method introduced herein relies on the discretization of river valley topography into geometric elements associated with overlapping and orthogonal two-dimensional planes such as the planform, profile, and cross section that are represented by mathematical functions, termed geometric element equations. Topographic surfaces can be parameterized independently or dependently using a geomorphic covariance structure between the spatial series of geometric element equations. To illustrate the approach and overall model flexibility examples are provided that are associated with
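
The discretization into geometric elements can be sketched by composing independent functions for the longitudinal profile, planform, and cross section; the specific functional forms and parameter values below are illustrative assumptions, not the paper's geometric element equations:

```python
import math

def valley_elevation(x, y, slope=0.001, amp=20.0, wavelength=200.0,
                     half_width=50.0, depth=5.0):
    # Profile element: thalweg elevation falls linearly downstream
    z_thalweg = -slope * x
    # Planform element: centerline meanders sinusoidally about y = 0
    y_center = amp * math.sin(2.0 * math.pi * x / wavelength)
    # Cross-section element: parabola rising from thalweg to valley edge
    offset = min(abs(y - y_center), half_width)
    return z_thalweg + depth * (offset / half_width) ** 2

# Build a small synthetic valley surface on a regular grid
surface = [[valley_elevation(x, y) for y in range(-60, 61, 10)]
           for x in range(0, 400, 20)]
```

Because each element is an explicit function, a design surface can be re-parameterized and regenerated rapidly, which is the iteration property the abstract emphasizes.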

  5. Effects of homogenization process parameters on physicochemical properties of astaxanthin nanodispersions prepared using a solvent-diffusion technique.

    PubMed

    Anarjan, Navideh; Jafarizadeh-Malmiri, Hoda; Nehdi, Imededdine Arbi; Sbihi, Hassen Mohamed; Al-Resayes, Saud Ibrahim; Tan, Chin Ping

    2015-01-01

    Nanodispersion systems allow incorporation of lipophilic bioactives, such as astaxanthin (a fat soluble carotenoid) into aqueous systems, which can improve their solubility, bioavailability, and stability, and widen their uses in water-based pharmaceutical and food products. In this study, response surface methodology was used to investigate the influences of homogenization time (0.5-20 minutes) and speed (1,000-9,000 rpm) in the formation of astaxanthin nanodispersions via the solvent-diffusion process. The product was characterized for particle size and astaxanthin concentration using laser diffraction particle size analysis and high performance liquid chromatography, respectively. Relatively high determination coefficients (ranging from 0.896 to 0.969) were obtained for all suggested polynomial regression models. The overall optimal homogenization conditions were determined by multiple response optimization analysis to be 6,000 rpm for 7 minutes. In vitro cellular uptake of astaxanthin from the suggested individual and multiple optimized astaxanthin nanodispersions was also evaluated. The cellular uptake of astaxanthin was found to be considerably increased (by more than five times) as it became incorporated into optimum nanodispersion systems. The lack of a significant difference between predicted and experimental values confirms the suitability of the regression equations connecting the response variables studied to the independent parameters.

  6. Effects of homogenization process parameters on physicochemical properties of astaxanthin nanodispersions prepared using a solvent-diffusion technique

    PubMed Central

    Anarjan, Navideh; Jafarizadeh-Malmiri, Hoda; Nehdi, Imededdine Arbi; Sbihi, Hassen Mohamed; Al-Resayes, Saud Ibrahim; Tan, Chin Ping

    2015-01-01

    Nanodispersion systems allow incorporation of lipophilic bioactives, such as astaxanthin (a fat soluble carotenoid) into aqueous systems, which can improve their solubility, bioavailability, and stability, and widen their uses in water-based pharmaceutical and food products. In this study, response surface methodology was used to investigate the influences of homogenization time (0.5–20 minutes) and speed (1,000–9,000 rpm) in the formation of astaxanthin nanodispersions via the solvent-diffusion process. The product was characterized for particle size and astaxanthin concentration using laser diffraction particle size analysis and high performance liquid chromatography, respectively. Relatively high determination coefficients (ranging from 0.896 to 0.969) were obtained for all suggested polynomial regression models. The overall optimal homogenization conditions were determined by multiple response optimization analysis to be 6,000 rpm for 7 minutes. In vitro cellular uptake of astaxanthin from the suggested individual and multiple optimized astaxanthin nanodispersions was also evaluated. The cellular uptake of astaxanthin was found to be considerably increased (by more than five times) as it became incorporated into optimum nanodispersion systems. The lack of a significant difference between predicted and experimental values confirms the suitability of the regression equations connecting the response variables studied to the independent parameters. PMID:25709435
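
A second-order response surface of the kind fitted here can be written as y = b0 + b1·t + b2·s + b11·t² + b22·s² + b12·t·s and optimized over the experimental ranges. In this sketch the coefficients are hypothetical, chosen so the minimum (smallest particle size) lands near the reported optimum of 7 minutes at 6,000 rpm:

```python
def response(time_min, speed_krpm, b):
    # Second-order (quadratic) response surface in two factors
    t, s = time_min, speed_krpm
    return b[0] + b[1]*t + b[2]*s + b[3]*t*t + b[4]*s*s + b[5]*t*s

# Hypothetical coefficients for particle size (smaller is better),
# chosen for illustration so the minimum falls near (7 min, 6 krpm)
b_size = [400.0, -40.0, -60.0, 2.86, 5.0, 0.0]

def grid_optimum(b, times, speeds):
    # Exhaustive grid search over the experimental ranges
    best = min((response(t, s, b), t, s) for t in times for s in speeds)
    return best[1], best[2]

times = [0.5 * i for i in range(1, 41)]    # 0.5-20 min
speeds = [0.5 * j for j in range(2, 19)]   # 1-9 krpm (1,000-9,000 rpm)
```

In the study itself, multiple responses (particle size and astaxanthin concentration) were optimized jointly; this single-response grid search only illustrates the mechanics.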

  7. The Parametric Model of the Human Mandible Coronoid Process Created by Method of Anatomical Features.

    PubMed

    Vitković, Nikola; Mitić, Jelena; Manić, Miodrag; Trajanović, Miroslav; Husain, Karim; Petrović, Slađana; Arsić, Stojanka

    2015-01-01

Geometrically accurate and anatomically correct 3D models of human bones are of great importance for medical research and practice in orthopedics and surgery. These geometrical models can be created by techniques based on input geometrical data acquired from volumetric scanning methods (e.g., computed tomography (CT)) or from 2D images (e.g., X-ray). Geometrical models of human bones created in this way can be applied to the education of medical practitioners, preoperative planning, etc. In cases when geometrical data about a human bone are incomplete (e.g., fractures), it may be necessary to create its complete geometrical model. A possible solution for this problem is the application of parametric models. The geometry of these models can be changed and adapted to the specific patient based on the values of parameters acquired from medical images (e.g., X-ray). In this paper, the Method of Anatomical Features (MAF), which enables the creation of geometrically precise and anatomically accurate models of human bones, is applied to the creation of a parametric model of the Human Mandible Coronoid Process (HMCP). The results obtained for the geometrical accuracy of the model are quite satisfactory, as stated by medical practitioners and confirmed in the literature.

  8. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or Denver AWIPS Risk Reduction and Requirements Evaluation (DARE) Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on-the-fly for AWIPS, by using two examples of AWIPS applications that were created by the Applied Meteorology Unit (AMU) located at Cape Canaveral Air Force Station (CCAFS), Florida. The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) at Johnson Space Center, Texas and 45th Weather Squadron (45 WS) at CCAFS to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and the 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. The presentation will list the advantages and disadvantages of both file types for creating interactive graphical overlays in future AWIPS applications. Shapefiles are a popular format used extensively in Geographical Information Systems. They are usually used in AWIPS to depict static map backgrounds. A shapefile stores the geometry and attribute information of spatial features in a dataset (ESRI 1998). Shapefiles can contain point, line, and polygon features. Each shapefile contains a main file, index file, and a dBASE table. The main file contains a record for each spatial feature, which describes the feature with a list of its vertices. The index file contains the offset of each record from the beginning of the main file. The dBASE table contains records for each
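
The fixed 100-byte main-file header described above can be packed and parsed with Python's stdlib struct module (file code 9994 stored big-endian, version 1000 and shape type little-endian, per the ESRI shapefile specification):

```python
import struct

def make_main_header(file_length_words, shape_type, bbox):
    # Bytes 0-3: file code 9994 (big-endian); 4-23 unused; 24-27: length
    head = struct.pack(">7i", 9994, 0, 0, 0, 0, 0, file_length_words)
    # Bytes 28-35: version 1000 and shape type (little-endian)
    head += struct.pack("<2i", 1000, shape_type)
    # Bytes 36-99: xmin, ymin, xmax, ymax, then zmin/zmax, mmin/mmax
    head += struct.pack("<8d", *bbox, 0.0, 0.0, 0.0, 0.0)
    return head

def parse_main_header(head):
    (file_code,) = struct.unpack(">i", head[:4])
    (file_length,) = struct.unpack(">i", head[24:28])
    version, shape_type = struct.unpack("<2i", head[28:36])
    bbox = struct.unpack("<4d", head[36:68])
    return file_code, file_length, version, shape_type, bbox
```

Shape type 5 denotes a polygon; the file length is counted in 16-bit words, per the specification.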

  9. Laboratory Studies of Homogeneous and Heterogeneous Chemical Processes of Importance in the Upper Atmosphere

    NASA Technical Reports Server (NTRS)

    Molina, Mario J.

    2003-01-01

    The objective of this study was to conduct measurements of chemical kinetics parameters for reactions of importance in the stratosphere and upper troposphere, and to study the interaction of trace gases with ice surfaces in order to elucidate the mechanism of heterogeneous chlorine activation processes, using both a theoretical and an experimental approach. The measurements were carried out under temperature and pressure conditions covering those applicable to the stratosphere and upper troposphere. The main experimental technique employed was turbulent flow-chemical ionization mass spectrometry, which is particularly well suited for investigations of radical-radical reactions.

  10. Small-scale variability in solute transport processes in a homogeneous clay loam soil

    SciTech Connect

    Garrido, F.; Ghodrati, M.; Chendorain, M.; Campbell, C.G.

    1999-12-01

Small-scale variations in transport parameters may have a profound influence on larger scale flow processes. Fiber-optic miniprobes (FOMPs) provide the opportunity to continuously measure solute resident concentration in small soil volumes. A 20-channel multiplexed-FOMP system was used in repeated miscible displacements in a repacked clay loam soil column to examine small-scale, point-to-point variability in convective-dispersive transport processes. Transport parameters, measured 10 cm below the surface, were compared at two drip irrigation point densities and two fluxes. Irrigation densities of one irrigation drip point per 4 cm² and per 11 cm² of column surface area produced similar results. The breakthrough curves measured at 0.10 cm h⁻¹ had a larger immobile phase than at a flux of 1.07 cm h⁻¹. In the clay loam soil the mobile-immobile model fit the breakthrough curves better than the convective-dispersive equation (CDE), with r² values of 99.6 and 97.1, respectively. This analysis demonstrated that dispersion and mass recovery were much more variable than pore water velocity in this repacked clay loam soil. However, even in the most variable transport conditions encountered, only 17 sampling points were necessary to describe the column-average transport parameters within 20% of the mean.
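
For reference, the step-input breakthrough curve of the CDE is often approximated by the leading term of its analytical solution (the second term of the full solution is usually negligible at typical column Peclet numbers):

```python
import math

def cde_breakthrough(x, t, v, D):
    # Leading term of the CDE step-input solution:
    # C/C0 = 1/2 * erfc((x - v t) / (2 sqrt(D t)))
    # x: depth, t: time, v: pore water velocity, D: dispersion coefficient
    return 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))
```

At one pore volume (t = x/v) the relative concentration is exactly 0.5, which is a convenient sanity check when fitting observed breakthrough curves.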

  11. Microstructural Homogeneity and Hot Deformation of Various Friction-Stir-Processed 5083 Al Alloys

    NASA Astrophysics Data System (ADS)

    García-Bernal, M. A.; Mishra, R. S.; Hernández-Silva, D.; Sauce-Rangel, V. M.

    2016-12-01

Diverse studies on FSP of 5083 Al alloys have been conducted, and some have made comparisons with previous studies of similar alloys, but such comparisons may be invalid because of differences in the parameters used during FSP, above all the tool profile. Five 5083 Al alloys produced by different production routes were friction-stir-processed and compared among themselves and with two other superplastic forming (SPF) grade 5083 Al alloys. The results suggest that grain size refinement is independent of the original microstructure and that there is a relationship between the size of the second phase before and after FSP. The combination of continuous-casting 5083 Al alloys + FSP showed outstanding hot-deformation behavior compared with rolled or extruded 5083 Al alloys + FSP, and even with SPF 5083 Al alloys.

  12. Microstructural Homogeneity and Hot Deformation of Various Friction-Stir-Processed 5083 Al Alloys

    NASA Astrophysics Data System (ADS)

    García-Bernal, M. A.; Mishra, R. S.; Hernández-Silva, D.; Sauce-Rangel, V. M.

    2017-01-01

Diverse studies on FSP of 5083 Al alloys have been conducted, and some have made comparisons with previous studies of similar alloys, but such comparisons may be invalid because of differences in the parameters used during FSP, above all the tool profile. Five 5083 Al alloys produced by different production routes were friction-stir-processed and compared among themselves and with two other superplastic forming (SPF) grade 5083 Al alloys. The results suggest that grain size refinement is independent of the original microstructure and that there is a relationship between the size of the second phase before and after FSP. The combination of continuous-casting 5083 Al alloys + FSP showed outstanding hot-deformation behavior compared with rolled or extruded 5083 Al alloys + FSP, and even with SPF 5083 Al alloys.

  13. Degradation mechanism of cyanobacterial toxin cylindrospermopsin by hydroxyl radicals in homogeneous UV/H₂O₂ process.

    PubMed

    He, Xuexiang; Zhang, Geshan; de la Cruz, Armah A; O'Shea, Kevin E; Dionysiou, Dionysios D

    2014-04-15

    The degradation of cylindrospermopsin (CYN), a widely distributed and highly toxic cyanobacterial toxin (cyanotoxin), remains poorly elucidated. In this study, the mechanism of CYN destruction by UV-254 nm/H2O2 advanced oxidation process (AOP) was investigated by mass spectrometry. Various byproducts identified indicated three common reaction pathways: hydroxyl addition (+16 Da), alcoholic oxidation or dehydrogenation (-2 Da), and elimination of sulfate (-80 Da). The initiation of the degradation was observed at the hydroxymethyl uracil and tricyclic guanidine groups; uracil moiety cleavage/fragmentation and further ring-opening of the alkaloid were also noted at an extended reaction time or higher UV fluence. The degradation rates of CYN decreased and less byproducts (species) were detected using natural water matrices; however, CYN was effectively eliminated under extended UV irradiation. This study demonstrates the efficiency of CYN degradation and provides a better understanding of the mechanism of CYN degradation by hydroxyl radical, a reactive oxygen species that can be generated by most AOPs and is present in natural water environment.
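
The three pathways correspond to nominal mass shifts of +16 Da (hydroxyl addition), −2 Da (dehydrogenation), and −80 Da (sulfate elimination); candidate byproduct masses can be enumerated by combining them. The parent nominal mass of 415 Da for CYN (C15H21N5O7S) is used here for illustration:

```python
from itertools import combinations

PARENT = 415  # nominal mass of CYN, used for illustration
SHIFTS = {"+OH": +16, "-2H": -2, "-SO3": -80}

def candidate_masses(max_steps=2):
    # Apply up to max_steps distinct pathway shifts to the parent mass
    results = {(): PARENT}
    for n in range(1, max_steps + 1):
        for combo in combinations(SHIFTS, n):
            results[combo] = PARENT + sum(SHIFTS[k] for k in combo)
    return results
```

Such a table of expected masses is how candidate byproduct peaks in the mass spectra can be assigned to pathway combinations; exact (monoisotopic) masses would be used in practice.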

  14. Processing of α-chitin nanofibers by dynamic high pressure homogenization: characterization and antifungal activity against A. niger.

    PubMed

    Salaberria, Asier M; Fernandes, Susana C M; Diaz, Rene Herrera; Labidi, Jalel

    2015-02-13

    Chitin nano-objects are more interesting and attractive materials than native chitin because of their usable form, low density, high surface area and promising mechanical properties. This work presents a straightforward and environmentally friendly method for processing chitin nanofibers using dynamic high pressure homogenization. This technique proved to be a remarkably simple way to convert α-chitin from yellow lobster wastes into α-chitin nanofibers with a uniform width (below 100 nm) and high aspect ratio, and may contribute to a major breakthrough in chitin applications. Moreover, the resulting α-chitin nanofibers were characterized and compared with native α-chitin in terms of chemical and crystal structure, thermal degradation and antifungal activity. The biological assays highlighted that the nanoscale nature of chitin nanofibers plays an important role in the antifungal activity against Aspergillus niger.

  15. Processing sleep data created with the Drosophila Activity Monitoring (DAM) System.

    PubMed

    Pfeiffenberger, Cory; Lear, Bridget C; Keegan, Kevin P; Allada, Ravi

    2010-11-01

    Adult behavioral assays have been used with great success in Drosophila melanogaster to identify circadian rhythm genes. In particular, the locomotor activity assay can identify altered behavior patterns over the course of several days in small populations, or even individual flies. Sleep is a highly conserved behavior that is required for optimal performance and, in many cases, life of an organism. Drosophila demonstrate a behavioral state that shows traits consistent with sleep: periods of relative behavioral immobility that coincide with an increased arousal threshold after ~5 min of inactivity, regulated by circadian and homeostatic mechanisms. However, because flies do not produce brain waves recordable by electroencephalography, sleep researchers use behavior-based paradigms to infer when a fly is asleep, as opposed to awake but immobile. Data on Drosophila activity can be collected using an automated monitoring system to provide insight into sleep duration, consolidation, and latency, as well as sleep deprivation and rebound. This protocol details the use of Counting Macro, an Excel-based program, to process data created with the Drosophila Activity Monitoring (DAM) System from TriKinetics for sleep analyses. Specifically, it details the steps necessary to convert the raw data created by the DAM System into sleep duration and consolidation data, broken down into the light (L), dark (D), light:dark cycling (LD), and constant darkness (DD) phases of a behavior experiment.
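    The core scoring rule described above (inactivity of at least ~5 min is inferred as sleep) can be sketched in a few lines. This is a conceptual illustration of the rule, assuming per-minute activity bins; the function name and interface are illustrative assumptions, not Counting Macro's actual API:

```python
# Score per-minute DAM-style activity counts for sleep: a run of >=5
# consecutive minutes with zero activity is counted as a sleep bout.

def sleep_minutes(counts, threshold=5):
    """Total minutes scored as sleep, given per-minute activity counts."""
    total = run = 0
    for c in counts:
        run = run + 1 if c == 0 else 0
        if run == threshold:      # bout just reached the threshold: count all of it
            total += threshold
        elif run > threshold:     # bout continues: count each extra minute
            total += 1
    return total

# 7 quiescent minutes -> one 7-min sleep bout; the lone zero is too short.
activity = [3, 0, 2] + [0] * 7 + [1, 4]
print(sleep_minutes(activity))  # 7
```

    The same pass can be extended to record bout starts and lengths, which is what sleep consolidation and latency measures are built from.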

  16. Preparation of cotton linter nanowhiskers by high-pressure homogenization process and its application in thermoplastic starch

    NASA Astrophysics Data System (ADS)

    Savadekar, N. R.; Karande, V. S.; Vigneshwaran, N.; Kadam, P. G.; Mhaske, S. T.

    2015-03-01

    The present work deals with the preparation of cotton linter nanowhiskers (CLNW) by acid hydrolysis and subsequent processing in a high-pressure homogenizer. The prepared CLNW were then used as a reinforcing material in thermoplastic starch (TPS), with the aim of improving its performance properties. The concentration of CLNW was varied as 0, 1, 2, 3, 4 and 5 wt% in TPS. TPS/CLNW nanocomposite films were prepared by a solution-casting process. The nanocomposite films were characterized by tensile testing, differential scanning calorimetry (DSC), scanning electron microscopy (SEM), water vapor permeability (WVP), oxygen permeability (OP), X-ray diffraction and light transmittance measurements. The 3 wt% CLNW-loaded TPS nanocomposite films demonstrated an 88% improvement in tensile strength as compared to the pristine TPS polymer film, whereas WVP and OP decreased by 90 and 92%, respectively, which is highly appreciable given the quantity of CLNW added. DSC thermograms of the nanocomposite films did not show any significant change in melting temperature as compared to pristine TPS. The light transmittance (Tr) value of TPS decreased with increasing content of CLNW. Better interaction between CLNW and TPS, due to the hydrophilic nature of both materials, and uniform distribution of CLNW in TPS were the prime reasons for the improvement in properties observed at 3 wt% loading of CLNW in TPS. However, CLNW formed agglomerates at higher concentrations, as determined by SEM analysis. These nanocomposite films have potential uses in food and pharmaceutical packaging applications.

  17. Using a critical reflection process to create an effective learning community in the workplace.

    PubMed

    Walker, Rachel; Cooke, Marie; Henderson, Amanda; Creedy, Debra K

    2013-05-01

    Learning circles are an enabling process to critically examine and reflect on practices with the purpose of promoting individual and organizational growth and change. The authors adapted and developed a learning circle strategy to facilitate open discourse between registered nurses, clinical leaders, clinical facilitators and students, to critically reflect on practice experiences to promote a positive learning environment. This paper reports on an analysis of field notes taken during a critical reflection process used to create an effective learning community in the workplace. A total of 19 learning circles were conducted during in-service periods (that is, the time allocated for professional education between morning and afternoon shifts) over a 3 month period with 56 nurses, 33 students and 1 university-employed clinical supervisor. Participation rates ranged from 3 to 12 individuals per discussion. Ten themes emerged from content analysis of the clinical learning issues identified through the four-step model of critical reflection used in learning circle discussions. The four-step model of critical reflection allowed participants to reflect on clinical learning issues, and raise them in a safe environment that enabled topics to be challenged and explored in a shared and cooperative manner.

  18. Creating Economic Incentives for Waste Disposal in Developing Countries Using the MixAlco Process.

    PubMed

    Lonkar, Sagar; Fu, Zhihong; Wales, Melinda; Holtzapple, Mark

    2017-01-01

    In rapidly growing developing countries, waste disposal is a major challenge. Current waste disposal methods (e.g., landfills and sewage treatment) incur costs and often are not employed; thus, wastes accumulate in the environment. To address this challenge, it is advantageous to create economic incentives to collect and process wastes. One approach is the MixAlco process, which uses methane-inhibited anaerobic fermentation to convert waste biomass into carboxylate salts, which are chemically converted to industrial chemicals and fuels. In this paper, humanure (raw human feces and urine) is explored as a possible nutrient source for fermentation. This work focuses on fermenting municipal solid waste (energy source) and humanure (nutrient source) in batch fermentations. Using the Continuum Particle Distribution Model (CPDM), the performance of continuous countercurrent fermentation was predicted at different volatile solid loading rates (VSLR) and liquid residence times (LRT). For a four-stage countercurrent fermentation system at VSLR = 4 g/(L∙day), LRT = 30 days, and solids concentration = 100 g/L liquid, the model predicts carboxylic acid concentration of 68 g/L and conversion of 78.5 %.

  19. Regional Homogeneity of Resting-State Brain Activity Suppresses the Effect of Dopamine-Related Genes on Sensory Processing Sensitivity

    PubMed Central

    Chen, Chuansheng; Moyzis, Robert; Xia, Mingrui; He, Yong; Xue, Gui; Li, Jin; He, Qinghua; Lei, Xuemei; Wang, Yunxin; Liu, Bin; Chen, Wen; Zhu, Bi; Dong, Qi

    2015-01-01

    Sensory processing sensitivity (SPS) is an intrinsic personality trait whose genetic and neural bases have recently been studied. The current study used a neural mediation model to explore whether resting-state brain functions mediated the effects of dopamine-related genes on SPS. 298 healthy Chinese college students (96 males, mean age = 20.42 years, SD = 0.89) were scanned with magnetic resonance imaging during resting state, genotyped for 98 loci within the dopamine system, and administered the Highly Sensitive Person Scale. We extracted a “gene score” that summarized the genetic variations representing the 10 loci that were significantly linked to SPS, and then used path analysis to search for brain regions whose resting-state data would help explain the gene-behavior association. Mediation analysis revealed that temporal homogeneity of regional spontaneous activity (ReHo) in the precuneus actually suppressed the effect of dopamine-related genes on SPS. The path model explained 16% of the variance of SPS. This study represents the first attempt at using a multi-gene voxel-based neural mediation model to explore the complex relations among genes, brain, and personality. PMID:26308205

  20. Development of a web-based support system for both homogeneous and heterogeneous air quality control networks: process and product.

    PubMed

    Andrade, J; Ares, J; García, R; Presa, J; Rodríguez, S; Piñeiro-Iglesias, M; López-Mahía, P; Muniategui, S; Prada, D

    2007-10-01

    The Environmental Laboratories Automation Software System, or PALMA (Spanish abbreviation), was developed by a multidisciplinary team to support the main tasks of heterogeneous air quality control networks. The software process for PALMA development, which can readily be applied to similar multidisciplinary projects, was (a) well defined, (b) agreed between environmental technicians and software engineers, (c) based on quality guides, and (d) clearly user-centred. Moreover, it introduces some interesting advantages with regard to classical step-by-step approaches. PALMA is a web-based system that allows 'off-line' and automated telematic data acquisition from distributed immission stations belonging to homogeneous as well as heterogeneous air quality control networks. It provides graphic and tabular representations for a comprehensive and centralised analysis of the acquired data, and supports the daily work associated with such networks: validation of the acquired data, alerts with regard to (periodical) tasks (e.g., analyser verification), downloading of files with environmental information (e.g., dust forecasts), etc. The deployment of PALMA has provided qualitative and quantitative improvements in the work performed by the people in charge of the considered control network.

  1. Waste container weighing data processing to create reliable information of household waste generation.

    PubMed

    Korhonen, Pirjo; Kaila, Juha

    2015-05-01

    Household mixed waste container weighing data were processed by knowledge discovery and data mining techniques to create reliable information on household waste generation. The final data set included 27,865 weight measurements covering the whole of 2013 and was selected from a database of the Helsinki Region Environmental Services Authority, Finland. The data set contains mixed household waste arising in 6 m³ containers and was processed by identifying missing values and inconsistently low and high values as errors. The share of missing values and errors in the data set was 0.6%. This provides evidence that the waste weighing data give reliable information on mixed waste generation at the collection point level. Mixed household waste arising at the collection point level is characterized by wide variation between pickups. The seasonal variation pattern resulting from collective similarities in household behaviour was clearly detected by smoothed medians of the waste weight time series. The evaluation of the collection time series against the defined distribution range of pickup weights at the collection point level shows that 65% of the pickups were from collection points with optimally dimensioned container capacity, while over- and under-dimensioned container capacities were noted in 9.5% and 3.4% of all pickups, respectively. Occasional extra waste in containers occurred in 21.2% of the pickups, indicating the irregular behaviour of individual households. The results of this analysis show that processing waste weighing data using knowledge discovery and data mining techniques provides trustworthy information on household waste generation and its variations.
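    The error-screening step described above (flagging missing values and inconsistently low or high weights, then reporting their share of the data) can be sketched as follows. The weight bounds are illustrative assumptions, not the study's actual thresholds:

```python
# Screen container pickup weights: treat missing values (None) and
# out-of-range values as errors, and report the error share.

def screen_pickups(weights, low=1.0, high=500.0):
    """Split weights (kg; None = missing) into valid values and an error count."""
    valid, errors = [], 0
    for w in weights:
        if w is None or w < low or w > high:
            errors += 1
        else:
            valid.append(w)
    return valid, errors

data = [120.5, None, 95.0, 0.2, 210.0, 999.0, 88.4]
valid, errors = screen_pickups(data)
print(round(errors / len(data), 3))  # share of missing values and errors
```

    In the study this share was only 0.6%, which is what supports treating the weighing data as reliable; the smoothed-median seasonal analysis would then run on the `valid` series.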

  2. Two-Step Process To Create "Roll-Off" Superamphiphobic Paper Surfaces.

    PubMed

    Jiang, Lu; Tang, Zhenguan; Clinton, Rahmat M; Breedveld, Victor; Hess, Dennis W

    2017-03-15

    Surface modification of cellulose-based paper, which displays roll-off properties for water and oils (surface tension ≥23.8 mN·m⁻¹) and good repellency toward n-heptane (20.1 mN·m⁻¹), is reported. Droplets of water, diiodomethane, motor oil, hexadecane, and decane all "bead up", i.e., exhibit high contact angles, and roll off the treated surface under the influence of gravity. Unlike widely used approaches that rely on the deposition of nanoparticles or electrospun nanofibers to create superamphiphobic surfaces, our method generates a hierarchical structure as an inherent property of the substrate and displays good adhesion between the film and substrate. The two-step combination of plasma etching and vapor deposition used in this study enables fine-tuning of the nanoscale roughness and thereby facilitates enhanced fundamental understanding of the effect of micro- and nanoscale roughness on the paper wetting properties. The surfaces maintain their "roll-off" properties after dynamic impact tests, demonstrating their mechanical robustness. Furthermore, the superamphiphobic paper has high gas permeability due to pore-volume enhancement by plasma etching but maintains the mechanical flexibility and strength of untreated paper, despite the presence of nanostructures. The unique combination of the chemical and physical properties of the resulting superamphiphobic paper is of practical interest for a range of applications such as breathable and disposable medical apparel, antifouling biomedical devices, antifingerprint paper, liquid packaging, microfluidic devices, and medical testing strips through a simple surface etching plus coating process.

  3. Five Important Lessons I Learned during the Process of Creating New Child Care Centers

    ERIC Educational Resources Information Center

    Whitehead, R. Ann

    2005-01-01

    In this article, the author describes her experiences of developing new child care sites and offers five important lessons that she learned through her experiences which helped her to create successful child care centers. These lessons include: (1) Finding an appropriate area and location; (2) Creating realistic financial projections based on real…

  4. Design Process for Online Websites Created for Teaching Turkish as a Foreign Language in Web Based Environments

    ERIC Educational Resources Information Center

    Türker, Fatih Mehmet

    2016-01-01

    In today's world, where online learning environments have increased their efficiency in education and training, the design of the websites prepared for education and training purposes has become an important process. This study is about the teaching process of the online learning environments created to teach Turkish in web based environments, and…

  5. Noble metal-catalyzed homogeneous and heterogeneous processes in treating simulated nuclear waste media with formic acid

    SciTech Connect

    King, R.B.; Bhattacharyya, N.K.; Smith, H.D.

    1995-09-01

    Simulants for the Hanford Waste Vitrification Plant feed containing the major non-radioactive components Al, Cd, Fe, Mn, Nd, Ni, Si, Zr, Na, CO₃²⁻, NO₃⁻, and NO₂⁻ were used to study reactions of formic acid at 90°C catalyzed by the noble metals Ru, Rh, and/or Pd found in significant quantities in uranium fission products. Such reactions were monitored using gas chromatography to analyze the CO₂, H₂, NO, and N₂O in the gas phase and a micro-ammonia electrode to analyze the NH₄⁺/NH₃ in the liquid phase as a function of time. The following reactions have been studied in these systems since they are undesirable side reactions in nuclear waste processing: (1) Decomposition of formic acid to CO₂ + H₂ is undesirable because of the potential fire and explosion hazard of H₂. Rhodium, which was introduced as soluble RhCl₃·3H₂O, was found to be the most active catalyst for H₂ generation from formic acid above ~80°C in the presence of nitrite ion. The H₂ production rate has an approximate pseudo-first-order dependence on the Rh concentration. (2) Generation of NH₃ from the formic acid reduction of nitrate and/or nitrite is undesirable because of a possible explosion hazard from NH₄NO₃ accumulation in a waste processing plant off-gas system. The Rh-catalyzed reduction of nitrogen-oxygen compounds to ammonia by formic acid was found to exhibit the following features: (a) Nitrate rather than nitrite is the principal source of NH₃. (b) Ammonia production occurs at the expense of hydrogen production. (c) Supported rhodium metal catalysts are more active than rhodium in any other form, suggesting that ammonia production involves heterogeneous rather than homogeneous catalysis.

  6. Benchmarking monthly homogenization algorithms

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2011-08-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data.
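    The first performance metric above, the centered root mean square error of a homogenized series relative to the true homogeneous series, removes any constant offset before computing the error. A minimal sketch in pure Python; the series below are illustrative, not benchmark data:

```python
# Centered RMSE: RMSE between two series after subtracting each series' mean,
# so a constant offset between homogenized and true data does not count as error.

from math import sqrt

def centered_rmse(homogenized, truth):
    """RMSE of the mean-removed series."""
    mh = sum(homogenized) / len(homogenized)
    mt = sum(truth) / len(truth)
    return sqrt(sum(((h - mh) - (t - mt)) ** 2
                    for h, t in zip(homogenized, truth)) / len(truth))

truth = [10.0, 10.5, 11.0, 10.2, 10.8]
homog = [10.4, 10.9, 11.4, 10.6, 11.2]   # constant +0.4 offset only
print(centered_rmse(homog, truth))       # ~0: the offset is removed by centering
```

    Evaluating this both per station and on the network-average series, as the benchmark does, only requires feeding in the corresponding series.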

  7. Detailed homogeneous abundance studies of 14 Galactic s-process enriched post-AGB stars: In search of lead (Pb)

    NASA Astrophysics Data System (ADS)

    De Smedt, K.; Van Winckel, H.; Kamath, D.; Siess, L.; Goriely, S.; Karakas, A. I.; Manick, R.

    2016-03-01

    Context. This paper is part of a larger project in which we systematically study the chemical abundances of Galactic and extragalactic post-asymptotic giant branch (post-AGB) stars. The goal at large is to provide improved observational constraints to the models of the complex interplay between the AGB s-process nucleosynthesis and the associated mixing processes. Aims: Lead (Pb) is the final product of the s-process nucleosynthesis and is predicted to have large overabundances with respect to other s-process elements in AGB stars of low metallicities. However, Pb abundance studies of s-process enriched post-AGB stars in the Magellanic Clouds show a discrepancy between observed and predicted Pb abundances. The determined upper limits based on spectral studies are much lower than what is predicted. In this paper, we focus specifically on the Pb abundance of 14 Galactic s-process enhanced post-AGB stars to check whether the same discrepancy is present in the Galaxy as well. Among these 14 objects, two were not yet subject to a detailed abundance study in the literature. We apply the same method to obtain accurate abundances for the 12 others. Our homogeneous abundance results provide the input of detailed spectral synthesis computations in the spectral regions where Pb lines are located. Methods: We used high-resolution UVES and HERMES spectra for detailed spectral abundance studies of our sample of Galactic post-AGB stars. None of the sample stars display clear Pb lines, and we only deduced upper limits of the Pb abundance by using spectrum synthesis in the spectral ranges of the strongest Pb lines. Results: We do not find any clear evidence of Pb overabundances in our sample. The derived upper limits are strongly correlated with the effective temperature of the stars with increasing upper limits for increasing effective temperatures. We obtain stronger Pb constraints on the cooler objects. 
Moreover, we confirm the s-process enrichment and carbon enhancement of two

  8. Creating Joint Attentional Frames and Pointing to Evidence in the Reading and Writing Process

    ERIC Educational Resources Information Center

    Unger, John A.; Liu, Rong; Scullion, Vicki A.

    2015-01-01

    This theory-into-practice paper integrates Tomasello's concept of Joint Attentional Frames and well-known ideas related to the work of Russian psychologist, Lev Vygotsky, with more recent ideas from social semiotics. Classroom procedures for incorporating student-created Joint Attentional Frames into literacy lessons are explained by links to…

  9. Effects of ultrasonication and conventional mechanical homogenization processes on the structures and dielectric properties of BaTiO3 ceramics.

    PubMed

    Akbas, Hatice Zehra; Aydin, Zeki; Yilmaz, Onur; Turgut, Selvin

    2017-01-01

    The effects of the homogenization process on the structures and dielectric properties of pure and Nb-doped BaTiO3 ceramics have been investigated using ultrasonic homogenization and conventional mechanical methods. The reagents were homogenized using an ultrasonic processor with high-intensity ultrasonic waves and using a compact mixer-shaker. The components and crystal types of the powders were determined by Fourier-transform infrared spectroscopy (FTIR) and X-ray diffraction (XRD) analyses. The complex permittivity (ε′, ε″) and AC conductivity (σ′) of the samples were analyzed in a wide frequency range of 20 Hz to 2 MHz at room temperature. The structures and dielectric properties of pure and Nb-doped BaTiO3 ceramics strongly depend on the homogenization process in a solid-state reaction method. Using an ultrasonic processor with high-intensity ultrasonic waves based on acoustic cavitation phenomena can yield a significant improvement in producing high-purity BaTiO3 ceramics without carbonate impurities and with a small dielectric loss.
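    The AC conductivity reported above is linked to the dielectric loss by the standard relation σ′ = ω ε₀ ε″. A small sketch over the 20 Hz to 2 MHz range of the study; the ε″ value used is an illustrative assumption, not a measured one:

```python
# Compute AC conductivity sigma' (S/m) from frequency and the imaginary part
# of the complex permittivity, via sigma' = omega * eps0 * eps''.

from math import pi

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def ac_conductivity(freq_hz, eps_imag):
    """sigma' in S/m at angular frequency 2*pi*freq_hz."""
    return 2 * pi * freq_hz * EPS0 * eps_imag

# Sweep the endpoints and midpoint of the study's frequency range:
for f in (20, 2e3, 2e6):
    print(f"{f:>9.0f} Hz -> {ac_conductivity(f, eps_imag=50.0):.3e} S/m")
```

    With a frequency-independent ε″, σ′ grows linearly with frequency, which is why conductivity and loss data are usually read together across the sweep.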

  10. Orthogonality Measurement for Homogenous Projects-Bases

    ERIC Educational Resources Information Center

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  11. Novel Processing for Creating 3D Architectured Porous Shape Memory Alloy

    DTIC Science & Technology

    2013-03-01

    Subject terms: shape memory alloy, powder metallurgy, digital image… Steel wires were used as a spaceholder in powder metallurgy processing. High-carbon steel wires were chosen with a wire diameter of 400 μm. The wires were orthogonally… similar to the method detailed above, in that a composite of NiTi powders and a steel spaceholder frame is created, and the frame is electrochemically…

  12. Method of removing the effects of electrical shorts and shunts created during the fabrication process of a solar cell

    DOEpatents

    Nostrand, Gerald E.; Hanak, Joseph J.

    1979-01-01

    A method of removing the effects of electrical shorts and shunts created during the fabrication process, and of improving the performance of a solar cell with a thick-film cermet electrode opposite the incident surface, by applying a reverse bias voltage of sufficient magnitude to burn out the electrical shorts and shunts but less than the breakdown voltage of the solar cell.

  13. Homogeneous Atomic Fermi Gases

    NASA Astrophysics Data System (ADS)

    Mukherjee, Biswaroop; Yan, Zhenjie; Patel, Parth B.; Hadzibabic, Zoran; Yefsah, Tarik; Struck, Julian; Zwierlein, Martin W.

    2017-03-01

    We report on the creation of homogeneous Fermi gases of ultracold atoms in a uniform potential. In the momentum distribution of a spin-polarized gas, we observe the emergence of the Fermi surface and the saturated occupation of one particle per momentum state: the striking consequence of Pauli blocking in momentum space for a degenerate gas. Cooling a spin-balanced Fermi gas at unitarity, we create homogeneous superfluids and observe spatially uniform pair condensates. For thermodynamic measurements, we introduce a hybrid potential that is harmonic in one dimension and uniform in the other two. The spatially resolved compressibility reveals the superfluid transition in a spin-balanced Fermi gas, saturation in a fully polarized Fermi gas, and strong attraction in the polaronic regime of a partially polarized Fermi gas.
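    The saturated occupation of one particle per momentum state observed above is the zero-temperature limit of the Fermi-Dirac distribution. As a standard textbook statement (not taken from the paper itself), for a spin-polarized homogeneous gas of density n:

```latex
n(\mathbf{k}) = \theta\!\left(k_F - |\mathbf{k}|\right), \qquad
k_F = \left(6\pi^2 n\right)^{1/3}, \qquad
E_F = \frac{\hbar^2 k_F^2}{2m},
```

    so every momentum state below the Fermi surface is singly occupied and every state above it is empty, which is the Pauli-blocking signature the momentum distribution measurement reveals.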

  14. The difference between implicit and explicit associative processes at study in creating false memory in the DRM paradigm.

    PubMed

    Kawasaki, Yayoi; Yama, Hiroshi

    2006-01-01

    The effects of implicit and explicit associative processes on false recognition were examined by manipulating the exposure duration of studied items: 20 ms or 2000 ms. Participants studied lists of words that were high associates of a nonpresented word (critical lure) in either condition. After learning each list, they took a recognition test with remember/know judgements immediately (Experiment 1) or 1 minute later (Experiment 2). In Experiment 1, know responses for critical lures were more frequent in the 20 ms condition than in the 2000 ms condition, while remember responses were more frequent in the 2000 ms condition. Implicit associative processes create familiarity with critical lures, and explicit associative processes create details of false memories. Comparing the results of Experiment 1 with those of Experiment 2, remember responses for critical lures increased with the delay only in the 20 ms condition. The characteristics of false memories created by implicit associative processes could thus change over a delay.

  15. Atomic processes in plasmas created by an ultra-short laser pulse

    NASA Astrophysics Data System (ADS)

    Audebert, P.; Lecherbourg, L.; Bastiani-Ceccotti, S.; Geindre, J.-P.; Blancard, C.; Cossé, P.; Faussurier, G.; Shepherd, R.; Renaudin, P.

    2008-05-01

    Point projection K-shell absorption spectroscopy has been used to measure absorption spectra of transient aluminum plasma created by an ultra-short laser pulse. 1s-2p and 1s-3p absorption lines of weakly ionized aluminum were measured for an extended range of densities in a relatively low-temperature regime. Independent plasma characterization was obtained from frequency domain interferometry (FDI) diagnostic and allows the interpretation of the absorption spectra in terms of spectral opacities. The experimental spectra are compared with opacity calculations using the density and temperature inferred from the analysis of the FDI data.

  16. Measurement of USMC Logistics Processes: Creating a Baseline to Support Precision Logistics Implementation

    DTIC Science & Technology

    1998-01-01

    …unavailability of parts. ORDER AND SHIP TIMES FROM RETAIL SUPPLY: We turn now to the supply of parts, beginning with measurement of the order and ship (O&S)… Point, according to archived supply data. Defining the Order and Ship Process from Retail Supply: The retail O&S process begins with the identification… take more than two weeks for the entire O&S process, even though backorders are not at issue here. What is not clear from these results is what…

  17. Simulation of the Vapor Intrusion Process for Non-Homogeneous Soils Using a Three-Dimensional Numerical Model

    PubMed Central

    Bozkurt, Ozgur; Pennell, Kelly G.; Suuberg, Eric M.

    2010-01-01

    This paper presents model simulation results of vapor intrusion into structures built atop sites contaminated with volatile or semi-volatile chemicals of concern. A three-dimensional finite element model was used to investigate the importance of factors that could influence vapor intrusion when the site is characterized by non-homogeneous soils. Model simulations were performed to examine how soil layers of differing properties alter soil gas concentration profiles and vapor intrusion rates into structures. The results illustrate differences in soil gas concentration profiles and vapor intrusion rates between homogeneous and layered soils. The findings support the need for site conceptual models to adequately represent the site’s geology when conducting site characterizations, interpreting field data and assessing the risk of vapor intrusion at a given site. For instance, in layered geologies, a lower permeability and diffusivity soil layer between the source and building often limits vapor intrusion rates, even if a higher permeability layer near the foundation permits increased soil gas flow rates into the building. In addition, the presence of water-saturated clay layers can considerably influence soil gas concentration profiles. Therefore, interpreting field data without accounting for clay layers in the site conceptual model could result in inaccurate risk calculations. Important considerations for developing more accurate conceptual site models are discussed in light of the findings. PMID:20664816
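    The claim above that a low-diffusivity layer between source and building limits vapor intrusion has a simple one-dimensional analogue: for steady diffusion through layered soil, the layer resistances L/D add in series, so the tightest layer dominates. A minimal sketch with illustrative layer values (assumptions, not site data or the paper's 3D model):

```python
# Steady-state 1-D diffusive flux through soil layers in series:
# resistance of each layer is thickness / effective diffusivity,
# and resistances add, so flux = delta_c / total_resistance.

def diffusive_flux(dc, layers):
    """Flux (g/m^2/s) for concentration difference dc (g/m^3) across
    layers given as (thickness_m, effective_diffusivity_m2_per_s)."""
    resistance = sum(L / D for L, D in layers)
    return dc / resistance

sandy = [(2.0, 1e-6)]                    # 2 m of uniform sand
layered = [(1.5, 1e-6), (0.5, 1e-8)]    # same depth, with a 0.5 m clay layer
print(diffusive_flux(10.0, sandy) / diffusive_flux(10.0, layered))
# ratio of fluxes: how strongly the clay layer throttles vapor transport
```

    Even though clay makes up only a quarter of the soil column here, it contributes most of the total resistance, which is the intuition behind the layered-geology finding.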

  18. Simulation of the Vapor Intrusion Process for Non-Homogeneous Soils Using a Three-Dimensional Numerical Model.

    PubMed

    Bozkurt, Ozgur; Pennell, Kelly G; Suuberg, Eric M

    2009-01-01

    This paper presents model simulation results of vapor intrusion into structures built atop sites contaminated with volatile or semi-volatile chemicals of concern. A three-dimensional finite element model was used to investigate the importance of factors that could influence vapor intrusion when the site is characterized by non-homogeneous soils. Model simulations were performed to examine how soil layers of differing properties alter soil gas concentration profiles and vapor intrusion rates into structures. The results illustrate differences in soil gas concentration profiles and vapor intrusion rates between homogeneous and layered soils. The findings support the need for site conceptual models to adequately represent the site's geology when conducting site characterizations, interpreting field data and assessing the risk of vapor intrusion at a given site. For instance, in layered geologies, a lower permeability and diffusivity soil layer between the source and building often limits vapor intrusion rates, even if a higher permeability layer near the foundation permits increased soil gas flow rates into the building. In addition, the presence of water-saturated clay layers can considerably influence soil gas concentration profiles. Therefore, interpreting field data without accounting for clay layers in the site conceptual model could result in inaccurate risk calculations. Important considerations for developing more accurate conceptual site models are discussed in light of the findings.

  19. Creating Space Force Structure Through Strategic Planning: The Air Force Reserve Visioning Process

    DTIC Science & Technology

    2007-11-02

    strategic visioning and strategic planning processes together to achieve military mission objectives. It focuses on our current National and Military...dissimilar and similar organizations to work together through the strategic planning process to achieve common objectives. Finally, a case study will be

  20. Rethinking Communication in Innovation Processes: Creating Space for Change in Complex Systems

    ERIC Educational Resources Information Center

    Leeuwis, Cees; Aarts, Noelle

    2011-01-01

    This paper systematically rethinks the role of communication in innovation processes, starting from largely separate theoretical developments in communication science and innovation studies. Literature review forms the basis of the arguments presented. The paper concludes that innovation is a collective process that involves the contextual…

  1. 36 CFR 1237.26 - What materials and processes must agencies use to create audiovisual records?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... photographic film does not exceed 0.014 grams per square meter. (2) Require laboratories to process film in accordance with this standard. Process color film in accordance with the manufacturer's recommendations. (3...) version of each image must be comparable in quality to a 35 mm film photograph or better, and must...

  2. 36 CFR 1237.26 - What materials and processes must agencies use to create audiovisual records?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... photographic film does not exceed 0.014 grams per square meter. (2) Require laboratories to process film in accordance with this standard. Process color film in accordance with the manufacturer's recommendations. (3...) version of each image must be comparable in quality to a 35 mm film photograph or better, and must...

  3. 36 CFR 1237.26 - What materials and processes must agencies use to create audiovisual records?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... photographic film does not exceed 0.014 grams per square meter. (2) Require laboratories to process film in accordance with this standard. Process color film in accordance with the manufacturer's recommendations. (3...) version of each image must be comparable in quality to a 35 mm film photograph or better, and must...

  4. Creating Trauma-Informed Child Welfare Systems Using a Community Assessment Process

    ERIC Educational Resources Information Center

    Hendricks, Alison; Conradi, Lisa; Wilson, Charles

    2011-01-01

    This article describes a community assessment process designed to evaluate a specific child welfare jurisdiction based on the current definition of trauma-informed child welfare and its essential elements. This process has recently been developed and pilot tested within three diverse child welfare systems in the United States. The purpose of the…

  5. 36 CFR 1237.26 - What materials and processes must agencies use to create audiovisual records?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... records? Agencies must: (a) For picture negatives and motion picture preprints (negatives, masters, and... photographic film does not exceed 0.014 grams per square meter. (2) Require laboratories to process film...

6. The Contribution of Prefrontal Executive Processes to Creating a Sense of Self

    PubMed Central

    Hirstein, William

    2011-01-01

    According to several current theories, executive processes help achieve various mental actions such as remembering, planning and decision-making, by executing cognitive operations on representations held in consciousness. I plan to argue that these executive processes are partly responsible for our sense of self, because of the way they produce the impression of an active, controlling presence in consciousness. If we examine what philosophers have said about the “ego” (Descartes), “the Self” (Locke and Hume), the “self of all selves” (William James), we will find that it fits what is now known about executive processes. Hume, for instance, famously argued that he could not detect the self in consciousness, and this would correspond to the claim (made by Crick and Koch, for instance) that we are not conscious of the executive processes themselves, but rather of their results. PMID:21694967

  7. Lycopene degradation, isomerization and in vitro bioaccessibility in high pressure homogenized tomato puree containing oil: effect of additional thermal and high pressure processing.

    PubMed

    Knockaert, Griet; Pulissery, Sudheer K; Colle, Ines; Van Buggenhout, Sandy; Hendrickx, Marc; Loey, Ann Van

    2012-12-01

In the present study, the effect of equivalent thermal and high pressure processes at pasteurization and sterilization intensities on some health related properties of high pressure homogenized tomato puree containing oil was investigated. Total lycopene concentration, cis-lycopene content and in vitro lycopene bioaccessibility were examined as health related properties. Results showed that pasteurization hardly affected the health related properties of tomato puree. Only the formation of cis-lycopene during intense thermal pasteurization was observed. Sterilization processes on the other hand had a significant effect on the health related properties. A significant decrease in total lycopene concentration was found after the sterilization processes. Next to degradation, significant isomerization was also observed: all-trans-lycopene was mainly converted to 9-cis- and 13-cis-lycopene. High pressure sterilization limited the overall lycopene isomerization when compared to the equivalent thermal sterilization processes. The formation of 5-cis-lycopene on the other hand seemed to be favoured by high pressure. The in vitro lycopene bioaccessibility of high pressure homogenized tomato puree containing oil was decreased during subsequent thermal or high pressure processing, with significant changes observed for all the sterilization processes.

  8. Disruption of Pseudomonas putida by high pressure homogenization: a comparison of the predictive capacity of three process models for the efficient release of arginine deiminase.

    PubMed

    Patil, Mahesh D; Patel, Gopal; Surywanshi, Balaji; Shaikh, Naeem; Garg, Prabha; Chisti, Yusuf; Banerjee, Uttam Chand

    2016-12-01

Disruption of Pseudomonas putida KT2440 by high-pressure homogenization in a French press is discussed for the release of arginine deiminase (ADI). The enzyme release response of the disruption process was modelled for the experimental factors of biomass concentration in the broth being disrupted, the homogenization pressure and the number of passes of the cell slurry through the homogenizer. For the same data, the response surface method (RSM), the artificial neural network (ANN) and the support vector machine (SVM) models were compared for their ability to predict the performance parameters of the cell disruption. The ANN model proved to be best for predicting the ADI release. The fractional disruption of the cells was best modelled by the RSM. The fraction of the cells disrupted depended mainly on the operating pressure of the homogenizer. The concentration of the biomass in the slurry was the most influential factor in determining the total protein release. Nearly 27 U/mL of ADI was released in a single pass from a slurry with a biomass concentration of 260 g/L at an operating pressure of 510 bar. Using a biomass concentration of 100 g/L, the ADI release by the French press was 2.7-fold greater than in a conventional high-speed bead mill. In the French press, the total protein release was 5.8-fold greater than in the bead mill. Statistical analysis of completely unseen data showed ANN and SVM modelling to be proficient alternatives to RSM for prediction and generalization of the cell disruption process in the French press.
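The response-surface idea used here (fitting a low-order polynomial to the disruption response and using it for prediction) can be illustrated with a toy one-factor fit. The data points below, which mimic disruption saturating with pressure, are invented and are not the study's measurements:

```python
# Toy response-surface fit: model disruption fraction as a quadratic in
# homogenization pressure by solving the least-squares normal equations.
# All data points are invented for illustration only.

def fit_quadratic(xs, ys):
    """Least-squares coefficients for y = b0 + b1*x + b2*x^2."""
    # Build X^T X and X^T y for the design matrix [1, x, x^2].
    powers = [sum(x ** k for x in xs) for k in range(5)]
    A = [[powers[i + j] for j in range(3)] for i in range(3)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for row in range(col + 1, 3):
            f = A[row][col] / A[col][col]
            for j in range(col, 3):
                A[row][j] -= f * A[col][j]
            b[row] -= f * b[col]
    coeffs = [0.0, 0.0, 0.0]
    for row in (2, 1, 0):        # back-substitution
        s = sum(A[row][j] * coeffs[j] for j in range(row + 1, 3))
        coeffs[row] = (b[row] - s) / A[row][row]
    return coeffs

# Illustrative points: disruption rises with pressure, then saturates.
pressures = [100, 200, 300, 400, 500, 600]            # bar
disruption = [0.15, 0.38, 0.55, 0.68, 0.76, 0.80]     # fraction disrupted
b0, b1, b2 = fit_quadratic(pressures, disruption)

def predict(p):
    return b0 + b1 * p + b2 * p * p
```

The ANN and SVM alternatives compared in the study would replace `fit_quadratic` with more flexible regressors trained on the same experimental factors.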

  9. BrainK for Structural Image Processing: Creating Electrical Models of the Human Head.

    PubMed

    Li, Kai; Papademetris, Xenophon; Tucker, Don M

    2016-01-01

    BrainK is a set of automated procedures for characterizing the tissues of the human head from MRI, CT, and photogrammetry images. The tissue segmentation and cortical surface extraction support the primary goal of modeling the propagation of electrical currents through head tissues with a finite difference model (FDM) or finite element model (FEM) created from the BrainK geometries. The electrical head model is necessary for accurate source localization of dense array electroencephalographic (dEEG) measures from head surface electrodes. It is also necessary for accurate targeting of cerebral structures with transcranial current injection from those surface electrodes. BrainK must achieve five major tasks: image segmentation, registration of the MRI, CT, and sensor photogrammetry images, cortical surface reconstruction, dipole tessellation of the cortical surface, and Talairach transformation. We describe the approach to each task, and we compare the accuracies for the key tasks of tissue segmentation and cortical surface extraction in relation to existing research tools (FreeSurfer, FSL, SPM, and BrainVisa). BrainK achieves good accuracy with minimal or no user intervention, it deals well with poor quality MR images and tissue abnormalities, and it provides improved computational efficiency over existing research packages.

  10. BrainK for Structural Image Processing: Creating Electrical Models of the Human Head

    PubMed Central

    Li, Kai; Papademetris, Xenophon; Tucker, Don M.

    2016-01-01

    BrainK is a set of automated procedures for characterizing the tissues of the human head from MRI, CT, and photogrammetry images. The tissue segmentation and cortical surface extraction support the primary goal of modeling the propagation of electrical currents through head tissues with a finite difference model (FDM) or finite element model (FEM) created from the BrainK geometries. The electrical head model is necessary for accurate source localization of dense array electroencephalographic (dEEG) measures from head surface electrodes. It is also necessary for accurate targeting of cerebral structures with transcranial current injection from those surface electrodes. BrainK must achieve five major tasks: image segmentation, registration of the MRI, CT, and sensor photogrammetry images, cortical surface reconstruction, dipole tessellation of the cortical surface, and Talairach transformation. We describe the approach to each task, and we compare the accuracies for the key tasks of tissue segmentation and cortical surface extraction in relation to existing research tools (FreeSurfer, FSL, SPM, and BrainVisa). BrainK achieves good accuracy with minimal or no user intervention, it deals well with poor quality MR images and tissue abnormalities, and it provides improved computational efficiency over existing research packages. PMID:27293419

  11. Study of stirred layers on 316L steel created by friction stir processing

    NASA Astrophysics Data System (ADS)

    Langlade, C.; Roman, A.; Schlegel, D.; Gete, E.; Folea, M.

    2014-08-01

Nanostructured materials are known to exhibit attractive properties, especially in the mechanical field, where high hardness is of great interest. The friction stir process (FSP) is a recent surface engineering technique derived from the friction stir welding method (FSW). In this study, the FSP of a 316L austenitic stainless steel has been evaluated. The treated layers have been characterized in terms of hardness and microstructure, and these results have been related to the FSP operational parameters. The process has been analysed using a Response Surface Method (RSM) to enable prediction of the stirred-layer thickness.

  12. Partners in Process: How Museum Educators and Classroom Teachers Can Create Outstanding Results

    ERIC Educational Resources Information Center

    Moisan, Heidi

    2009-01-01

Collaborative processes are by nature not neat and tidy, and if mismanaged they can lead to chaos rather than creative productivity. However, when a museum and a group of teachers establish a respectful peer community that maximizes all the members' talents, truly impactful teaching and learning result. This article analyzes the "Great…

  13. Creating an Administrative Structure to Support Faculty Governance: A Participatory Process.

    ERIC Educational Resources Information Center

    Littlefield, Vivian M.

    1989-01-01

    Principles of organizational change are examined as they apply to academic units in general, and the way in which one well-established academic department in nursing changed its administrative structure is described. The process used faculty participation in decision-making. (Author/MSE)

  14. Creating Sustainable Education Projects in Roatán, Honduras through Continuous Process Improvement

    ERIC Educational Resources Information Center

    Raven, Arjan; Randolph, Adriane B.; Heil, Shelli

    2010-01-01

    The investigators worked together with permanent residents of Roatán, Honduras on sustainable initiatives to help improve the island's troubled educational programs. Our initiatives focused on increasing the number of students eligible and likely to attend a university. Using a methodology based in continuous process improvement, we developed…

  15. Creating Professional Communities in Schools through Organizational Learning: An Evaluation of a School Improvement Process.

    ERIC Educational Resources Information Center

    Scribner, Jay Paredes; Cockrell, Karen Sunday; Cockrell, Dan H.; Valentine, Jerry W.

    1999-01-01

    Analyzes a school-improvement process's potential to foster professional community in three rural middle schools through organizational learning. Findings of a two-year qualitative case study reveal bureaucracy/community tensions and isolate four influential community-building factors: principal leadership, organizational history, organizational…

  16. Not All Analogies Are Created Equal: Associative and Categorical Analogy Processing following Brain Damage

    ERIC Educational Resources Information Center

    Schmidt, Gwenda L.; Cardillo, Eileen R.; Kranjec, Alexander; Lehet, Matthew; Widick, Page; Chatterjee, Anjan

    2012-01-01

    Current research on analogy processing assumes that different conceptual relations are treated similarly. However, just as words and concepts are related in distinct ways, different kinds of analogies may employ distinct types of relationships. An important distinction in how words are related is the difference between associative (dog-bone) and…

  17. Feasibility study for producing a carrot/potato matrix reference material for 11 selected pesticides at EU MRL level: material processing, homogeneity and stability assessment.

    PubMed

    Saldanha, Helena; Sejerøe-Olsen, Berit; Ulberth, Franz; Emons, Hendrik; Zeleny, Reinhard

    2012-05-01

The feasibility of producing a matrix reference material for selected pesticides in a carrot/potato matrix was investigated. A commercially available baby food (carrot/potato-based mash) was spiked with 11 pesticides at the respective EU maximum residue limits (MRLs), and further processed by either freezing or freeze-drying. Batches of some 150 units were produced per material type. First, the materials were assessed for the relative amount of pesticide recovered after processing (the ratio of the pesticide concentration in the processed material to the initially spiked pesticide concentration). In addition, the materials' homogeneity (bottle-to-bottle variation) and their short-term (1 month) and mid-term (5 months) stability at different temperatures were assessed. For this, an in-house validated GC-EI-MS method operated in the SIM mode, with a sample preparation procedure based on the QuEChERS ("quick, easy, cheap, effective, rugged, and safe") principle, was applied. Measurements on the frozen material provided the most promising results (smallest analyte losses during production), and freeze-drying also proved to be a suitable alternative processing technique for most of the investigated pesticides. Both the frozen and the freeze-dried materials were sufficiently homogeneous for the intended use, and storage at -20°C for 5 months did not reveal any detectable material degradation. The results constitute an important step towards the development of a pesticide matrix reference material.
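Bottle-to-bottle homogeneity in studies like this is typically judged with a one-way ANOVA over replicate measurements per bottle (the approach described in ISO Guide 35). The sketch below follows that standard layout, with invented concentration values rather than the paper's data:

```python
# Minimal one-way ANOVA sketch for a homogeneity study: k bottles with
# n replicate measurements each. A between-bottle variance component
# that is small relative to the measurement scatter supports the
# "sufficiently homogeneous" conclusion. Values are invented.

def anova_homogeneity(bottles):
    """bottles: list of equal-length replicate lists, one per bottle."""
    k = len(bottles)
    n = len(bottles[0])
    grand_mean = sum(sum(b) for b in bottles) / (k * n)
    bottle_means = [sum(b) / n for b in bottles]
    # Mean squares within and among bottles.
    ms_within = sum(
        sum((x - m) ** 2 for x in b) for b, m in zip(bottles, bottle_means)
    ) / (k * (n - 1))
    ms_among = n * sum((m - grand_mean) ** 2 for m in bottle_means) / (k - 1)
    # Between-bottle variance component (clamped at zero).
    s_bb_sq = max((ms_among - ms_within) / n, 0.0)
    return ms_within, ms_among, s_bb_sq

# Hypothetical pesticide concentration (mg/kg): 4 bottles, 3 replicates.
data = [
    [0.101, 0.099, 0.100],
    [0.102, 0.100, 0.101],
    [0.099, 0.098, 0.100],
    [0.100, 0.101, 0.099],
]
ms_w, ms_a, s_bb_sq = anova_homogeneity(data)
```

A between-bottle standard deviation that is small relative to the target concentration (here well under 1% of 0.1 mg/kg) is the kind of evidence behind a sufficient-homogeneity claim.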

  18. Dynamic Disturbance Processes Create Dynamic Lek Site Selection in a Prairie Grouse

    PubMed Central

    Hovick, Torre J.; Allred, Brady W.; Elmore, R. Dwayne; Fuhlendorf, Samuel D.; Hamilton, Robert G.; Breland, Amber

    2015-01-01

    It is well understood that landscape processes can affect habitat selection patterns, movements, and species persistence. These selection patterns may be altered or even eliminated as a result of changes in disturbance regimes and a concomitant management focus on uniform, moderate disturbance across landscapes. To assess how restored landscape heterogeneity influences habitat selection patterns, we examined 21 years (1991, 1993–2012) of Greater Prairie-Chicken (Tympanuchus cupido) lek location data in tallgrass prairie with restored fire and grazing processes. Our study took place at The Nature Conservancy’s Tallgrass Prairie Preserve located at the southern extent of Flint Hills in northeastern Oklahoma. We specifically addressed stability of lek locations in the context of the fire-grazing interaction, and the environmental factors influencing lek locations. We found that lek locations were dynamic in a landscape with interacting fire and grazing. While previous conservation efforts have treated leks as stable with high site fidelity in static landscapes, a majority of lek locations in our study (i.e., 65%) moved by nearly one kilometer on an annual basis in this dynamic setting. Lek sites were in elevated areas with low tree cover and low road density. Additionally, lek site selection was influenced by an interaction of fire and patch edge, indicating that in recently burned patches, leks were located near patch edges. These results suggest that dynamic and interactive processes such as fire and grazing that restore heterogeneity to grasslands do influence habitat selection patterns in prairie grouse, a phenomenon that is likely to apply throughout the Greater Prairie-Chicken’s distribution when dynamic processes are restored. As conservation moves toward restoring dynamic historic disturbance patterns, it will be important that siting and planning of anthropogenic structures (e.g., wind energy, oil and gas) and management plans not view lek locations as

  19. Dynamic Disturbance Processes Create Dynamic Lek Site Selection in a Prairie Grouse.

    PubMed

    Hovick, Torre J; Allred, Brady W; Elmore, R Dwayne; Fuhlendorf, Samuel D; Hamilton, Robert G; Breland, Amber

    2015-01-01

It is well understood that landscape processes can affect habitat selection patterns, movements, and species persistence. These selection patterns may be altered or even eliminated as a result of changes in disturbance regimes and a concomitant management focus on uniform, moderate disturbance across landscapes. To assess how restored landscape heterogeneity influences habitat selection patterns, we examined 21 years (1991, 1993-2012) of Greater Prairie-Chicken (Tympanuchus cupido) lek location data in tallgrass prairie with restored fire and grazing processes. Our study took place at The Nature Conservancy's Tallgrass Prairie Preserve located at the southern extent of Flint Hills in northeastern Oklahoma. We specifically addressed stability of lek locations in the context of the fire-grazing interaction, and the environmental factors influencing lek locations. We found that lek locations were dynamic in a landscape with interacting fire and grazing. While previous conservation efforts have treated leks as stable with high site fidelity in static landscapes, a majority of lek locations in our study (i.e., 65%) moved by nearly one kilometer on an annual basis in this dynamic setting. Lek sites were in elevated areas with low tree cover and low road density. Additionally, lek site selection was influenced by an interaction of fire and patch edge, indicating that in recently burned patches, leks were located near patch edges. These results suggest that dynamic and interactive processes such as fire and grazing that restore heterogeneity to grasslands do influence habitat selection patterns in prairie grouse, a phenomenon that is likely to apply throughout the Greater Prairie-Chicken's distribution when dynamic processes are restored. As conservation moves toward restoring dynamic historic disturbance patterns, it will be important that siting and planning of anthropogenic structures (e.g., wind energy, oil and gas) and management plans not view lek locations as static.

  20. ArhiNet - A Knowledge-Based System for Creating, Processing and Retrieving Archival eContent

    NASA Astrophysics Data System (ADS)

    Salomie, Ioan; Dinsoreanu, Mihaela; Pop, Cristina; Suciu, Sorin

This paper addresses the problem of creating, processing and querying semantically enhanced eContent from archives and digital libraries. We present an analysis of the archival domain, resulting in the creation of an archival domain model and of a domain ontology core. Our system adds semantic mark-up to the historical documents' content, thus enabling document and knowledge retrieval in response to ontology-guided natural language queries. The system functionality follows two main workflows: (i) semantically enhanced eContent generation and knowledge acquisition and (ii) knowledge processing and retrieval. Within the first workflow, the relevant domain information is extracted from documents written in natural languages, followed by semantic annotation and domain ontology population. In the second workflow, ontologically guided natural language queries trigger reasoning processes that provide relevant search results. The paper also discusses the transformation of the OWL domain ontology into a hierarchical data model, thus providing support for efficient ontology processing.
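The ontology-to-hierarchy transformation mentioned above can be sketched, in minimal form, as building a tree from (subclass, superclass) pairs, which are the skeleton of an OWL class hierarchy. The archival class names below are hypothetical and not taken from ArhiNet:

```python
# Hedged sketch: turn (subclass, superclass) pairs into a nested dict,
# i.e. a hierarchical data model suitable for fast top-down traversal.
# Class names are made up for illustration.

def build_hierarchy(subclass_pairs, root):
    children = {}
    for child, parent in subclass_pairs:
        children.setdefault(parent, []).append(child)

    def subtree(node):
        # Each node maps to the sorted list of its child subtrees.
        return {node: [subtree(c) for c in sorted(children.get(node, []))]}

    return subtree(root)

pairs = [
    ("Document", "ArchivalResource"),
    ("Photograph", "ArchivalResource"),
    ("Letter", "Document"),
    ("Decree", "Document"),
]
tree = build_hierarchy(pairs, "ArchivalResource")
```

A real OWL ontology would contribute these pairs via its subclass axioms; properties and annotations would hang off the same nodes.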

  1. Creating a process for incorporating epidemiological modelling into outbreak management decisions.

    PubMed

    Akselrod, Hana; Mercon, Monica; Kirkeby Risoe, Petter; Schlegelmilch, Jeffrey; McGovern, Joanne; Bogucki, Sandy

    2012-01-01

    Modern computational models of infectious diseases greatly enhance our ability to understand new infectious threats and assess the effects of different interventions. The recently-released CDC Framework for Preventing Infectious Diseases calls for increased use of predictive modelling of epidemic emergence for public health preparedness. Currently, the utility of these technologies in preparedness and response to outbreaks is limited by gaps between modelling output and information requirements for incident management. The authors propose an operational structure that will facilitate integration of modelling capabilities into action planning for outbreak management, using the Incident Command System (ICS) and Synchronization Matrix framework. It is designed to be adaptable and scalable for use by state and local planners under the National Response Framework (NRF) and Emergency Support Function #8 (ESF-8). Specific epidemiological modelling requirements are described, and integrated with the core processes for public health emergency decision support. These methods can be used in checklist format to align prospective or real-time modelling output with anticipated decision points, and guide strategic situational assessments at the community level. It is anticipated that formalising these processes will facilitate translation of the CDC's policy guidance from theory to practice during public health emergencies involving infectious outbreaks.

  2. Methodology for Creating UMLS Content Views Appropriate for Biomedical Natural Language Processing

    PubMed Central

    Aronson, Alan R.; Mork, James G.; Névéol, Aurélie; Shooshan, Sonya E.; Demner-Fushman, Dina

    2008-01-01

    Given the growth in UMLS Metathesaurus content and the consequent growth in language complexity, it is not surprising that NLP applications that depend on the UMLS are experiencing increased difficulty in maintaining adequate levels of performance. This phenomenon underscores the need for UMLS content views which can support NLP processing of both the biomedical literature and clinical text. We report on experiments designed to provide guidance as to whether to adopt a conservative vs. an aggressive approach to the construction of UMLS content views. We tested three conservative views and two new aggressive views against two NLP applications and found that the conservative views consistently performed better for the literature application, but the most aggressive view performed best for the clinical application. PMID:18998883

3. Description of the process used to create the 1992 Hanford Mortality Study database

    SciTech Connect

    Gilbert, E.S.; Buchanan, J.A.; Holter, N.A.

    1992-12-01

An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files maintained by the Hanford Environmental Health Foundation (HEHF), including demographic data and job histories; the Hanford Mortality (HMO) files also maintained by HEHF, including information on deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files maintained by PNL's Health Physics Department, containing data on external dosimetry; and a file of workers with confirmed internal depositions of radionuclides also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.

4. Description of the process used to create the 1992 Hanford Mortality Study database

    SciTech Connect

    Gilbert, E. S.; Buchanan, J. A.; Holter, N. A.

    1992-12-01

An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files maintained by the Hanford Environmental Health Foundation (HEHF), including demographic data and job histories; the Hanford Mortality (HMO) files also maintained by HEHF, including information on deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files maintained by PNL's Health Physics Department, containing data on external dosimetry; and a file of workers with confirmed internal depositions of radionuclides also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.

  5. Integrated assessment of emerging science and technologies as creating learning processes among assessment communities.

    PubMed

    Forsberg, Ellen-Marie; Ribeiro, Barbara; Heyen, Nils B; Nielsen, Rasmus Øjvind; Thorstensen, Erik; de Bakker, Erik; Klüver, Lars; Reiss, Thomas; Beekman, Volkert; Millar, Kate

    2016-12-01

Emerging science and technologies are often characterised by complexity, uncertainty and controversy. Regulation and governance of such scientific and technological developments needs to build on knowledge and evidence that reflect this complicated situation. This insight is sometimes formulated as a call for integrated assessment of emerging science and technologies, and such a call is analysed in this article. The article addresses two overall questions. The first is: to what extent are emerging science and technologies currently assessed in an integrated way? The second is: if there appears to be a need for further integration, what should such integration consist in? In the article we briefly outline the pedigree of the term 'integrated assessment' and present a number of interpretations of the concept that are useful for informing current analyses and discussions of integration in assessment. Based on four case studies of assessment of emerging science and technologies, studies of assessment traditions, literature analysis and dialogues with assessment professionals, currently under-developed integration dimensions are identified. It is suggested how these dimensions can be addressed in a practical approach to assessment where representatives of different assessment communities and stakeholders are involved. We call this approach the Trans Domain Technology Evaluation Process (TranSTEP).

  6. Not all analogies are created equal: Associative and categorical analogy processing following brain damage

    PubMed Central

    Schmidt, Gwenda L.; Cardillo, Eileen R.; Kranjec, Alexander; Lehet, Matthew; Widick, Page; Chatterjee, Anjan

    2012-01-01

    Current research on analogy processing assumes that different conceptual relations are treated similarly. However, just as words and concepts are related in distinct ways, different kinds of analogies may employ distinct types of relationships. An important distinction in how words are related is the difference between associative (dog-bone) and categorical (dog-cat) relations. To test the hypothesis that analogical mapping of different types of relations would have different neural instantiations, we tested patients with left and right hemisphere lesions on their ability to understand two types of analogies, ones expressing an associative relationship and others expressing a categorical relationship. Voxel-based lesion-symptom mapping (VLSM) and behavioral analyses revealed that associative analogies relied on a large left-lateralized language network while categorical analogies relied on both left and right hemispheres. The verbal nature of the task could account for the left hemisphere findings. We argue that categorical relations additionally rely on the right hemisphere because they are more difficult, abstract, and fragile; and contain more distant relationships. PMID:22402184

  7. Rapid monitoring for the enhanced definition and control of a selective cell homogenate purification by a batch-flocculation process.

    PubMed

    Habib, G; Zhou, Y; Hoare, M

    2000-10-20

Downstream bioprocess operations, for example selective flocculation, are inherently variable due to fluctuations in feed material, equipment performance, and quality of additives such as flocculating agents. Because of these fluctuations in operating conditions, some form of process control is essential for reproducible and satisfactory process performance and, hence, product quality. Both the product (alcohol dehydrogenase) and key contaminants (RNA, protein, cell debris) within a Saccharomyces cerevisiae system were monitored in real time every 135 seconds, using an at-line enzymatic reaction and a rapid UV-VIS spectral-analysis technique. The real-time measurements were implemented within two control configurations to regulate the batch-flocculation process according to prespecified control objectives, using the flocculant dose as the sole manipulated variable. An adaptive, model-based control arrangement was studied, which combined the rapid measurements with a process model and two model parameter-identification techniques for real-time prediction of process behavior. Based on an up-to-date mathematical description of the flocculation system, process optimization was attained, and subsequent feedback control to this optimum operating set point was reproducibly demonstrated with 92% accuracy. A simpler control configuration was also investigated, adopting the cell debris concentration as the control variable. Both control arrangements resulted in superior flocculation-process performance in terms of contaminant removal, product recovery, and excess flocculant usage compared to an uncontrolled system.
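The simpler configuration described here, feedback on a measured contaminant concentration with flocculant dose as the sole manipulated variable, can be caricatured as a proportional control loop. The linear plant response, gain and set point below are all invented for illustration:

```python
# Toy proportional feedback loop for a batch flocculation step: each
# "measurement cycle" nudges the flocculant dose so that the measured
# cell-debris concentration approaches its set point. The linear plant
# response and every parameter value are invented for illustration.

def debris_response(dose):
    """Hypothetical plant: more flocculant -> less suspended debris (g/L)."""
    return max(8.0 - 1.5 * dose, 0.5)

def run_control(set_point, dose=0.0, gain=0.3, cycles=30):
    for _ in range(cycles):
        measured = debris_response(dose)          # at-line measurement
        error = measured - set_point
        dose = max(dose + gain * error, 0.0)      # only positive doses
    return dose, debris_response(dose)

dose, debris = run_control(set_point=2.0)
```

In the paper the loop closes on at-line measurements arriving every 135 seconds; here each iteration stands in for one measurement cycle, and the loop settles at the dose whose debris response equals the set point.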

  8. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System (AWIPS) Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or DARE Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on the fly for AWIPS, using two examples of AWIPS applications created by the Applied Meteorology Unit (AMU). The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) and 45th Weather Squadron (45 WS) to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. Advantages of both file types will be listed.

  9. Degradation Mechanism of Cyanobacterial Toxin Cylindrospermopsin by Hydroxyl Radicals in Homogeneous UV/H2O2 Process

    EPA Science Inventory

    The degradation of cylindrospermopsin (CYN), a widely distributed and highly toxic cyanobacterial toxin (cyanotoxin), remains poorly elucidated. In this study, the mechanism of CYN destruction by UV-254 nm/H2O2 advanced oxidation process (AOP) was investigated by mass spectrometr...

  10. Dimensional Methods: Dimensions, Units and the Principle of Dimensional Homogeneity. Physical Processes in Terrestrial and Aquatic Ecosystems, Applied Mathematics.

    ERIC Educational Resources Information Center

    Fletcher, R. Ian

    These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. The module is concerned with conventional techniques such as concepts of measurement,…

  11. PRO-Elicere: A Study for Create a New Process of Dependability Analysis of Space Computer Systems

    NASA Astrophysics Data System (ADS)

    da Silva, Glauco; Netto Lahoz, Carlos Henrique

    2013-09-01

    This paper presents a new approach to computer-system dependability analysis, called PRO-ELICERE, which introduces data mining concepts and intelligent decision-support mechanisms to analyze the potential hazards and failures of a critical computer system. It also presents some techniques and tools that support traditional dependability analysis, and briefly discusses the concept of knowledge discovery in intelligent databases for critical computer systems. It then introduces the PRO-ELICERE process, an intelligent approach to automating ELICERE, a process created to extract non-functional requirements for critical computer systems. PRO-ELICERE can be used in the V&V activities of projects at the Institute of Aeronautics and Space, such as the Brazilian Satellite Launcher (VLS-1).

  12. Magnifying absolute instruments for optically homogeneous regions

    SciTech Connect

    Tyc, Tomas

    2011-09-15

    We propose a class of magnifying absolute optical instruments with a positive isotropic refractive index. They create magnified stigmatic images, either virtual or real, of optically homogeneous three-dimensional spatial regions within geometrical optics.

  13. Open system Hf isotope homogenization by a DISPOREP process under amphibolite-facies conditions, an example from the Limpopo Belt (South Africa)

    NASA Astrophysics Data System (ADS)

    Zeh, Armin; Gerdes, Axel

    2013-04-01

    Isotope homogenization in metamorphic rocks is a prerequisite for precise isochron dating. However, whether or not homogenization occurs during a metamorphic overprint depends on several parameters and processes, which compete with each other and comprise at least (i) volume diffusion, (ii) dissolution-reprecipitation, (iii) intergranular diffusive or fluid-enhanced transport, and (iv) metamorphic mineral reactions. Isotope homogenization is commonly reached in high-grade (granulite-facies) metamorphic rocks, where diffusion is fast and mineral reactions and dissolution-reprecipitation are accompanied or sustained by a melt phase, but it is incomplete in low-grade to amphibolite-facies rocks in the presence of an aqueous fluid phase. This holds true, in particular, for the Lu-Hf isotope system, which is mainly controlled by accessory zircon; zircon is very resistant to dissolution in aqueous fluids and has slow diffusivities for Hf, U and Pb. Thus zircon often retains the primary U-Pb-Hf isotope composition acquired during earlier magmatic crystallization (i.e., magmatic grains in orthogneisses or detrital magmatic grains in paragneisses), even under very high-grade metamorphic conditions >1000 °C. However, results of recent isotope studies show that the U-Pb and Lu-Hf isotope systems of zircon-bearing ortho- and paragneisses can homogenize completely (on the hand-specimen scale) even under amphibolite-facies P-T conditions of

  14. An approach for extraction of kernel oil from Pinus pumila using homogenate-circulating ultrasound in combination with an aqueous enzymatic process and evaluation of its antioxidant activity.

    PubMed

    Chen, Fengli; Zhang, Qiang; Gu, Huiyan; Yang, Lei

    2016-11-04

    In this study, a novel approach involving homogenate-circulating ultrasound in combination with aqueous enzymatic extraction (H-CUAEE) was developed for extraction of kernel oil from Pinus pumila. Following comparison of enzyme types and concentrations, an enzyme mixture consisting of cellulase, pectinase and hemicellulase (1:1:1, w/w/w) at a concentration of 2.5% was selected and applied for effective oil extraction and release. Several variables potentially influencing extraction yields, namely, homogenization time, incubation temperature, incubation time, mark-space ratio of ultrasound irradiation, ultrasound irradiation power, liquid-solid ratio, pH and stirring rate, were optimized by Plackett-Burman design. Among the eight variables, incubation temperature, incubation time and liquid-solid ratio were statistically significant and were further optimized by Box-Behnken design to predict optimum extraction conditions and ascertain operability ranges for maximum extraction yield. Under optimum operating conditions, extraction yields of P. pumila kernel oil were 31.89±1.12% with a Δ5-unsaturated polymethylene interrupted fatty acid content of 20.07% and an unsaturated fatty acid content of 93.47%. Our study results indicate that the proposed H-CUAEE process has enormous potential for efficient and environmentally friendly extraction of edible oils.
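
As a sketch of the screening step above: a 12-run Plackett-Burman design can estimate main effects for up to 11 two-level factors, so the eight variables listed would occupy 8 of the 11 columns. The generator row is the standard PB12 one; the response vector here is synthetic, purely for illustration.

```python
import numpy as np

# Standard 12-run Plackett-Burman design: 11 cyclic shifts of the
# generator row plus one all-minus run; columns are pairwise orthogonal.
gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
X = np.array([np.roll(gen, i) for i in range(11)] + [-np.ones(11, dtype=int)])

def main_effects(y):
    """Effect of factor j = mean(y at +1) - mean(y at -1); 6 runs per level."""
    return X.T @ np.asarray(y, float) / 6.0

# Synthetic linear response y = X @ b: orthogonality recovers effects as 2*b.
b = np.arange(11, dtype=float)
effects = main_effects(X @ b)
```

The orthogonal columns are what let a single 12-run experiment rank all screened factors by main-effect size, as in the Plackett-Burman stage described above.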

  15. On the Importance of Processing Conditions for the Nutritional Characteristics of Homogenized Composite Meals Intended for Infants

    PubMed Central

    Östman, Elin; Forslund, Anna; Tareke, Eden; Björck, Inger

    2016-01-01

    The nutritional quality of infant food is an important consideration in the effort to prevent a further increase in the rate of childhood obesity. We hypothesized that the canning of composite infant meals would lead to elevated contents of carboxymethyl-lysine (CML) and favor high glycemic and insulinemic responses compared with milder heat treatment conditions. We have compared composite infant pasta Bolognese meals that were either conventionally canned (CANPBol), or prepared by microwave cooking (MWPBol). A meal where the pasta and Bolognese sauce were separate during microwave cooking (MWP_CANBol) was also included. The infant meals were tested at breakfast in healthy adults using white wheat bread (WWB) as reference. A standardized lunch meal was served at 240 min and blood was collected from fasting to 360 min after breakfast. The 2-h glucose response (iAUC) was lower following the test meals than with WWB. The insulin response was lower after the MWP_CANBol (−47%, p = 0.0000) but markedly higher after CANPBol (+40%, p = 0.0019), compared with WWB. A combined measure of the glucose and insulin responses (ISIcomposite) revealed that MWP_CANBol resulted in 94% better insulin sensitivity than CANPBol. Additionally, the separate processing of the meal components in MWP_CANBol resulted in 39% lower CML levels than the CANPBol. It was therefore concluded that intake of commercially canned composite infant meals leads to reduced postprandial insulin sensitivity and increased exposure to oxidative stress promoting agents. PMID:27271662
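
The 2-h glucose iAUC reported above is the area under the response curve that lies above the fasting baseline, conventionally computed with the trapezoid rule while ignoring dips below baseline. A minimal sketch with invented numbers:

```python
import numpy as np

def iauc(t_min, conc, baseline):
    """Incremental area under the curve above baseline (trapezoid rule)."""
    t = np.asarray(t_min, float)
    inc = np.clip(np.asarray(conc, float) - baseline, 0.0, None)  # drop dips below baseline
    return float(np.sum((inc[1:] + inc[:-1]) / 2.0 * np.diff(t)))

t = [0, 30, 60, 90, 120]               # minutes after the meal
glucose = [5.0, 7.0, 6.0, 5.5, 5.0]    # mmol/L, fasting baseline 5.0
area = iauc(t, glucose, baseline=5.0)  # mmol/L x min
```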

  16. Creating Sub-50 Nm Nanofluidic Junctions in PDMS Microfluidic Chip via Self-Assembly Process of Colloidal Particles.

    PubMed

    Wei, Xi; Syed, Abeer; Mao, Pan; Han, Jongyoon; Song, Yong-Ak

    2016-03-13

    Polydimethylsiloxane (PDMS) is the prevailing building material to make microfluidic devices due to its ease of molding and bonding as well as its transparency. Due to the softness of the PDMS material, however, it is challenging to use PDMS for building nanochannels. The channels tend to collapse easily during plasma bonding. In this paper, we present an evaporation-driven self-assembly method of silica colloidal nanoparticles to create nanofluidic junctions with sub-50 nm pores between two microchannels. The pore size as well as the surface charge of the nanofluidic junction is tunable simply by changing the colloidal silica bead size and surface functionalization outside of the assembled microfluidic device in a vial before the self-assembly process. Using the self-assembly of nanoparticles with a bead size of 300 nm, 500 nm, and 900 nm, it was possible to fabricate a porous membrane with a pore size of ~45 nm, ~75 nm and ~135 nm, respectively. Under electrical potential, this nanoporous membrane initiated ion concentration polarization (ICP) acting as a cation-selective membrane to concentrate DNA by ~1,700 times within 15 min. This non-lithographic nanofabrication process opens up a new opportunity to build a tunable nanofluidic junction for the study of nanoscale transport processes of ions and molecules inside a PDMS microfluidic chip.
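
The reported pore sizes scale linearly with bead size at roughly 15% of the bead diameter, which matches the geometry of close-packed spheres: the throat between three touching beads admits a circle of diameter (2/sqrt(3) - 1)·d ≈ 0.155·d. A quick check against the reported bead/pore pairs (this geometric reading is our interpretation, not stated in the abstract):

```python
import math

# Throat between three mutually touching spheres of diameter d:
# inscribed-circle diameter = (2/sqrt(3) - 1) * d ~ 0.155 * d
throat_ratio = 2 / math.sqrt(3) - 1

pairs_nm = [(300, 45), (500, 75), (900, 135)]   # (bead diameter, reported pore)
predictions = [throat_ratio * bead for bead, _ in pairs_nm]
# each prediction lands within ~5% of the reported pore size
```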

  17. Important components to create personal working alliances with clients in the mental health sector to support the recovery process.

    PubMed

    Klockmo, Carolina; Marnetoft, Sven-Uno; Selander, John; Nordenmark, Mikael

    2014-03-01

    Personligt ombud (PO) is a Swedish version of case management that aims to support individuals with psychiatric disabilities. Guidelines to the PO service emphasize the different role that the PO plays with respect to the relationship with clients. The aim of this study was to investigate the components that POs found to be important in the relationship with clients. Telephone interviews with 22 POs across Sweden were carried out. The interviews were recorded, transcribed, and analyzed using qualitative content analysis. The relationship with each client was described as the foundation of the POs' work; it was the only 'tool' they had. The findings were reflected in a main theme, which showed the importance of creating personal working alliances with each client where POs put the client at the center of the work and adjusted their support according to the client's needs at the time. Important components were that the PO and the client trusted each other, that the power between the PO and the client was balanced, and to be a personal support. Many of the components that POs found to be important are shown as essential in recovery-oriented services. POs followed the client in the process and remained as long as necessary and this is one way of bringing hope to the client's recovery process. However, the personal tone can be fraught with difficulties and to maintain professionalism, it is necessary to reflect, through discussions with colleagues, with the leader and in supervision.

  18. Study of fundamental chemical processes in explosive decomposition by laser-powered homogeneous pyrolysis. Final report, 1 Jul 78 - 31 Aug 81

    SciTech Connect

    McMillen, D.F.; Golden, D.M.

    1981-11-12

    Very Low-Pressure Pyrolysis studies of 2,4-dinitrotoluene decomposition resulted in decomposition rates consistent with log (k/s) = 12.1 - 43.9/2.3 RT. These results support the conclusion that previously reported 'anomalously' low Arrhenius parameters for the homogeneous gas-phase decomposition of ortho-nitrotoluene actually represent surface-catalyzed reactions. Preliminary qualitative results for pyrolysis of ortho-nitrotoluene in the absence of hot reactor walls, using the Laser-Powered Homogeneous Pyrolysis (LPHP) technique, provide further support for this conclusion: only products resulting from Ph-NO2 bond scission were observed; no products indicating complex intramolecular oxidation-reduction or elimination processes could be detected. The LPHP technique was successfully modified to use a pulsed laser and a heated flow system, so that the technique becomes suitable for study of surface-sensitive, low vapor pressure substrates such as TNT. The validity and accuracy of the technique was demonstrated by applying it to the decomposition of substances whose Arrhenius parameters for decomposition were already well known. IR-fluorescence measurements show that the temperature-space-time behavior under the present LPHP conditions is in agreement with expectations and with requirements which must be met if the method is to have quantitative validity. LPHP studies of azoisopropane decomposition, chosen as a radical-forming test reaction, show the accepted literature parameters to be substantially in error and indicate that the correct values are in all probability much closer to those measured in this work: log (k/s) = 13.9 - 41.2/2.3 RT.
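
The reported Arrhenius expressions can be evaluated directly, assuming (as is conventional for the 2.3RT notation) that the activation energy is in kcal/mol and R = 1.987e-3 kcal mol^-1 K^-1:

```python
R = 1.987e-3  # gas constant, kcal / (mol K)

def rate_constant(T, log_A=12.1, Ea=43.9):
    """First-order decomposition rate constant k (s^-1) at temperature T (K),
    from log10(k/s^-1) = log_A - Ea / (2.303 * R * T)."""
    return 10 ** (log_A - Ea / (2.303 * R * T))

k_1000 = rate_constant(1000.0)  # roughly 3e2 s^-1 at 1000 K
```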

  19. Homogeneous processes of atmospheric interest

    NASA Technical Reports Server (NTRS)

    Rossi, M. J.; Barker, J. R.; Golden, D. M.

    1983-01-01

    Upper atmospheric research programs in the department of chemical kinetics are reported. Topics discussed include: (1) third-order rate constants of atmospheric importance; (2) a computational study of the HO2 + HO2 and DO2 + DO2 reactions; (3) measurement and estimation of rate constants for modeling reactive systems; (4) kinetics and thermodynamics of ion-molecule association reactions; (5) entropy barriers in ion-molecule reactions; (6) reaction rate constant for OH + HOONO2 yields products over the temperature range 246 to 324 K; (7) very low-pressure photolysis of tert-butyl nitrite at 248 nm; (8) summary of preliminary data for the photolysis of ClONO2 and N2O5 at 285 nm; and (9) heterogeneous reaction of N2O5 and H2O.

  20. Homogeneity and elemental distribution in self-assembled bimetallic Pd-Pt aerogels prepared by a spontaneous one-step gelation process.

    PubMed

    Oezaslan, M; Liu, W; Nachtegaal, M; Frenkel, A I; Rutkowski, B; Werheid, M; Herrmann, A-K; Laugier-Bonnaud, C; Yilmaz, H-C; Gaponik, N; Czyrska-Filemonowicz, A; Eychmüller, A; Schmidt, T J

    2016-07-27

    Multi-metallic aerogels have recently emerged as a novel and promising class of unsupported electrocatalyst materials due to their high catalytic activity and improved durability for various electrochemical reactions. Aerogels can be prepared by a spontaneous one-step gelation process, in which the chemical co-reduction of metal precursors and the prompt formation of nanochain-containing hydrogels, as a preliminary stage for the preparation of aerogels, take place. However, detailed knowledge about the homogeneity and chemical distribution of these three-dimensional Pd-Pt aerogels at the nano-scale as well as at the macro-scale is still lacking. Therefore, we used a combination of spectroscopic and microscopic techniques to obtain better insight into the structure and elemental distribution of the various Pd-rich Pd-Pt aerogels prepared by the spontaneous one-step gelation process. Synchrotron-based extended X-ray absorption fine structure (EXAFS) spectroscopy and high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) in combination with energy-dispersive X-ray spectroscopy (EDX) were employed in this work to uncover the structural architecture and chemical composition of the various Pd-rich Pd-Pt aerogels over a broad length range. The Pd80Pt20, Pd60Pt40 and Pd50Pt50 aerogels showed heterogeneity in the chemical distribution of the Pt and Pd atoms inside the macroscopic nanochain-network. The features of mono-metallic clusters were not detected by EXAFS or STEM-EDX, indicating alloyed nanoparticles. However, the local chemical composition of the Pd-Pt alloys strongly varied along the nanochains and thus within a single aerogel. To determine the electrochemically active surface area (ECSA) of the Pd-Pt aerogels for application in electrocatalysis, we used the electrochemical CO stripping method. Due to their high porosity and extended network structure, the resulting values of the ECSA for the Pd-Pt aerogels were higher than that for

  1. High pressure homogenization processing, thermal treatment and milk matrix affect in vitro bioaccessibility of phenolics in apple, grape and orange juice to different extents.

    PubMed

    He, Zhiyong; Tao, Yadan; Zeng, Maomao; Zhang, Shuang; Tao, Guanjun; Qin, Fang; Chen, Jie

    2016-06-01

    The effects of high pressure homogenization processing (HPHP), thermal treatment (TT) and milk matrix (soy, skimmed and whole milk) on the phenolic bioaccessibility and the ABTS scavenging activity of apple, grape and orange juice (AJ, GJ and OJ) were investigated. HPHP and soy milk diminished AJ's total phenolic bioaccessibility by 29.3% and 26.3%, respectively, whereas TT and bovine milk hardly affected it. HPHP had little effect on GJ's and OJ's total phenolic bioaccessibility, while TT enhanced them by 27.3-33.9% and 19.0-29.2%, respectively, and milk matrix increased them by 26.6-31.1% and 13.3-43.4%, respectively. Furthermore, TT (80 °C/30 min) and TT (90 °C/30 s) had similar effects on GJ's and OJ's phenolic bioaccessibility. Skimmed milk showed a better enhancing effect on OJ's total phenolic bioaccessibility than soy and whole milk, but had a similar effect on GJ's as whole milk. These results contribute to promoting the health benefits of fruit juices by optimizing the processing and formulas in the food industry.

  2. Thermomechanical process optimization of U-10wt% Mo – Part 2: The effect of homogenization on the mechanical properties and microstructure

    SciTech Connect

    Joshi, Vineet V.; Nyberg, Eric A.; Lavender, Curt A.; Paxton, Dean M.; Burkes, Douglas E.

    2015-07-09

    Low-enriched uranium alloyed with 10 wt% molybdenum (U-10Mo) is currently being investigated as an alternative fuel for the highly enriched uranium used in several of the United States’ high performance research reactors. Development of the methods to fabricate the U-10Mo fuel plates is currently underway and requires fundamental understanding of the mechanical properties at the expected processing temperatures. In the first part of this series, it was determined that the as-cast U-10Mo had a dendritic microstructure with chemical inhomogeneity and underwent eutectoid transformation during hot compression testing. In the present (second) part of the work, the as-cast samples were heat treated at several temperatures and times to homogenize the Mo content. Like the previous as-cast material, the “homogenized” materials were then tested under compression between 500 and 800°C. The as-cast samples and those treated at 800°C for 24 hours had grain sizes of 25-30 μm, whereas those treated at 1000°C for 16 hours had grain sizes around 250 μm before testing. Upon compression testing, it was determined that the heat treatment had effects on the mechanical properties and the precipitation of the lamellar phase at sub-eutectoid temperatures.

  3. Don't homogenize, synchronize.

    PubMed

    Sawhney, M

    2001-01-01

    To be more responsive to customers, companies often break down organizational walls between their units--setting up all manner of cross-business and cross-functional task forces and working groups and promoting a "one-company" culture. But such attempts can backfire terribly by distracting business and functional units and by contaminating their strategies and processes. Fortunately, there's a better way, says the author. Rather than tear down organizational walls, a company can make them permeable to information. It can synchronize all its data on products, filtering the information through linked databases and applications and delivering it in a coordinated, meaningful form to customers. As a result, the organization can present a single, unified face to the customer--one that can change as market conditions warrant--without imposing homogeneity on its people. Such synchronization can lead not just to stronger customer relationships and more sales but also to greater operational efficiency. It allows a company, for example, to avoid the high costs of maintaining many different information systems with redundant data. The decoupling of product control from customer control in a synchronized company reflects a fundamental fact about business: While companies have to focus on creating great products, customers think in terms of the activities they perform and the benefits they seek. For companies, products are ends, but for customers, products are means. The disconnect between how customers think and how companies organize themselves is what leads to inefficiencies and missed opportunities, and that's exactly the problem that synchronization solves. Synchronized companies can get closer to customers, sustain product innovation, and improve operational efficiency--goals that have traditionally been very difficult to achieve simultaneously.

  4. Is the Universe homogeneous?

    PubMed

    Maartens, Roy

    2011-12-28

    The standard model of cosmology is based on the existence of homogeneous surfaces as the background arena for structure formation. Homogeneity underpins both general relativistic and modified gravity models and is central to the way in which we interpret observations of the cosmic microwave background (CMB) and the galaxy distribution. However, homogeneity cannot be directly observed in the galaxy distribution or CMB, even with perfect observations, since we observe on the past light cone and not on spatial surfaces. We can directly observe and test for isotropy, but to link this to homogeneity we need to assume the Copernican principle (CP). First, we discuss the link between isotropic observations on the past light cone and isotropic space-time geometry: what observations do we need to be isotropic in order to deduce space-time isotropy? Second, we discuss what we can say with the Copernican assumption. The most powerful result is based on the CMB: the vanishing of the dipole, quadrupole and octupole of the CMB is sufficient to impose homogeneity. Real observations lead to near-isotropy on large scales--does this lead to near-homogeneity? There are important partial results, and we discuss why this remains a difficult open question. Thus, we are currently unable to prove homogeneity of the Universe on large scales, even with the CP. However, we can use observations of the cosmic microwave background, galaxies and clusters to test homogeneity itself.

  5. Fe2O3-loaded activated carbon fiber/polymer materials and their photocatalytic activity for methylene blue mineralization by combined heterogeneous-homogeneous photocatalytic processes

    NASA Astrophysics Data System (ADS)

    Kadirova, Zukhra C.; Hojamberdiev, Mirabbos; Katsumata, Ken-Ichi; Isobe, Toshihiro; Matsushita, Nobuhiro; Nakajima, Akira; Okada, Kiyoshi

    2017-04-01

    Fe2O3-supported activated carbon felts (Fe-ACFTs) were prepared by impregnating felts consisting of activated carbon fibers (ACFs) and either polyester fibers (PS-A20) or polyethylene pulp (PE-W15) in Fe(III) nitrate solution and calcining at 250 °C for 1 h. The prepared Fe-ACFTs with 31-35 wt% Fe were characterized by N2-adsorption, scanning electron microscopy, and X-ray diffraction. The Fe-ACFT(PS-A20) samples with 5-31 wt% Fe were microporous with specific surface areas (SBET) ranging from 750 to 150 m2/g, whereas the Fe-ACFT(PE-W15) samples with 2-35 wt% Fe were mesoporous with SBET ranging from 830 to 320 m2/g. The deposition of iron oxide resulted in a decrease in the SBET and methylene blue (MB) adsorption capacity while increasing the photodegradation of MB. The optimum MB degradation conditions included 0.98 mM oxalic acid, pH = 3, 0.02-0.05 mM MB, and 100 mg/L photocatalyst. The negative impact of MB desorption during the photodegradation reaction was more pronounced for mesoporous PE-W15 samples and can be neglected by adding oxalic acid in cyclic experiments. Almost complete and simultaneous mineralization of oxalate and MB was achieved by the combined heterogeneous-homogeneous photocatalytic processes. The leached Fe ions in aqueous solution [Fe3+]f were measured after 60 min for every cycle and found to be about 2 ppm in all four successive cycles. The developed photocatalytic materials have shown good performance even at low content of iron oxide (2-5 wt% Fe-ACFT). Moreover, it is easy to re-impregnate the ACF when the content of iron oxide is reduced during the cyclic process. Thus, low leaching of Fe ions and possibility of cyclic usage are the advantages of the photocatalytic materials developed in this study.

  6. Report: Recipient Subawards to Fellows Did Not Comply With Federal Requirements and EPA’s Involvement in Fellow Selection Process Creates the Appearance EPA Could Be Circumventing the Hiring Process

    EPA Pesticide Factsheets

    Report #14-P-0357, September 17, 2014. ASPH’s subawards to fellows made under the CA are contrary to federal requirements ... and ... creates an appearance that the EPA could be circumventing the hiring process.

  7. Creating bulk nanocrystalline metal.

    SciTech Connect

    Fredenburg, D. Anthony; Saldana, Christopher J.; Gill, David D.; Hall, Aaron Christopher; Roemer, Timothy John; Vogler, Tracy John; Yang, Pin

    2008-10-01

    Nanocrystalline and nanostructured materials offer unique microstructure-dependent properties that are superior to coarse-grained materials. These materials have been shown to have very high hardness, strength, and wear resistance. However, most current methods of producing nanostructured materials in weapons-relevant materials create powdered metal that must be consolidated into bulk form to be useful. Conventional consolidation methods are not appropriate due to the need to maintain the nanocrystalline structure. This research investigated new ways of creating nanocrystalline material, new methods of consolidating nanocrystalline material, and an analysis of these different methods of creation and consolidation to evaluate their applicability to mesoscale weapons applications where part features are often under 100 {micro}m wide and the material's microstructure must be very small to give homogeneous properties across the feature.

  8. Phase-shifting of correlation fringes created by image processing as an alternative to improve digital shearography

    NASA Astrophysics Data System (ADS)

    Braga, Roberto A.; González-Peña, Rolando J.; Marcon, Marlon; Magalhães, Ricardo R.; Paiva-Almeida, Thiago; Santos, Igor V. A.; Martins, Moisés

    2016-12-01

    Digital speckle pattern shearing interferometry, or speckle shearography, is well established in many areas where one needs to measure micro-displacements in and out of the plane in biological and non-biological objects; it is based on the Michelson interferometer, using a piezoelectric transducer (PZT) to provide the phase shift of the fringes and thereby improve the quality of the final image. Creating the shifted images with a PZT, despite its widespread use, has some drawbacks and limitations, such as the cost of the apparatus, the difficulty of applying exactly the same displacement to the mirror repeated times, and the fact that phase-shifting cannot be used in dynamic object measurement. The aim of this work was to create the phase-shifted images digitally, avoiding the mechanical adjustments of the PZT, and to test them with the digital shearography method. The methodology was tested on a well-known object, an aluminium cantilever beam under deformation. The results documented the ability to create the deformation map and curves with reliability and sensitivity, reducing cost and improving the robustness and accessibility of digital speckle pattern shearing interferometry.
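
The standard four-step phase-shifting recovery that a digital approach must reproduce can be sketched as follows: with four fringe intensities I_k = A + B·cos(phi + k·pi/2), the phase follows from phi = atan2(I3 - I1, I0 - I2). The values below are synthetic; this illustrates the recovery formula, not the paper's fringe-generation method.

```python
import math

def recover_phase(I0, I1, I2, I3):
    """Four-step phase shifting: I3 - I1 = 2B*sin(phi), I0 - I2 = 2B*cos(phi)."""
    return math.atan2(I3 - I1, I0 - I2)

A, B, phi = 10.0, 4.0, 0.7                               # synthetic fringe parameters
I = [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]
phi_rec = recover_phase(*I)                              # recovers ~0.7 rad
```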

  9. Creating a Whole Greater than the Sum of Its Parts: Fostering Integrative Learning with a Reflective ePortfolio Process

    ERIC Educational Resources Information Center

    McGuinness, Thomas Patrick

    2015-01-01

    This research explores one university's effort to facilitate integrative learning with a reflective ePortfolio process. Integrative learning is conceptualized using a multi-theoretical construct consisting of transfer of learning, reflective practice, and self-authorship. As part of the evaluation of this process, students completed a pre-survey…

  10. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    NASA Astrophysics Data System (ADS)

    Bulutsuz, A. G.; Demircioglu, P.; Bogrekci, I.; Durakbasa, M. N.; Katiboglu, A. B.

    2015-03-01

    The interaction between foreign substances and organic tissue placed into the jaw in order to eliminate tooth loss involves a highly complex process. Many biological reactions take place, as well as the biomechanical forces that influence this formation. Osseointegration denotes the direct structural and functional connection between living bone and the surface of a load-bearing artificial implant. Taking into consideration the requirements of the implant manufacturing processes, surface characterization with highly precise measurement techniques is investigated, and thus this study emphasizes the importance of these processes for the long-term success of dental implants. In this research, detailed surface characterization was performed to identify the dependence of the surface properties on the manufacturing techniques, using image processing methods together with scanning electron microscopy (SEM) for morphological properties in 3D and a Taylor Hobson stylus profilometer for roughness properties in 2D. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, highlighting the importance of precise 2D and 3D inspection techniques for surface characterization. Different image analysis techniques, such as the dark-light technique, were used to verify the surface measurement results. The computational phase was performed using the image processing toolbox in Matlab with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful in the determination of surface roughness. The results showed that the number of black pixels in the image increases with increase in surface
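
The black/white pixel measure described above amounts to thresholding a grayscale image and counting the fraction of "black" pixels. The synthetic images below are illustrative (not from the study): a rougher surface scatters more intensity into dark pixels, so its black-pixel fraction is higher.

```python
import numpy as np

def black_pixel_fraction(img, thresh=100):
    """Fraction of pixels darker than the threshold."""
    return float(np.mean(img < thresh))

rng = np.random.default_rng(0)
smooth = 128 + 5 * rng.standard_normal((64, 64))    # low roughness: narrow intensity spread
rough = 128 + 30 * rng.standard_normal((64, 64))    # high roughness: wide intensity spread

f_smooth = black_pixel_fraction(smooth)
f_rough = black_pixel_fraction(rough)               # larger than f_smooth
```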

  11. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    SciTech Connect

    Bulutsuz, A. G.; Demircioglu, P. Bogrekci, I.; Durakbasa, M. N.

    2015-03-30

    The interaction between foreign substances placed into the jaw to eliminate tooth loss and the surrounding organic tissue involves a highly complex process. Many biological reactions take place, and biomechanical forces also influence this formation. Osseointegration denotes the direct structural and functional association between living bone and the surface of a load-bearing artificial implant. Taking into account the requirements of implant manufacturing processes, this study investigates surface characterization with high-precision measurement techniques and emphasizes the importance of these processes for the long-term success of dental implants. In this research, detailed surface characterization was performed to identify the dependence of surface properties on the manufacturing technique, using image processing methods together with scanning electron microscopy (SEM) for morphological properties in 3D and a Taylor Hobson stylus profilometer for roughness properties in 2D. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, highlighting the importance of precise 2D and 3D inspection techniques for surface characterization. Different image analysis techniques, such as the dark-light technique, were used to verify the surface measurement results. The computational phase was performed using the Image Processing Toolbox in Matlab with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful in the determination of surface roughness. The results showed that the number of black pixels in the image increases with increasing surface roughness.
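The dark-light pixel-counting and FFT ideas described in this abstract can be sketched generically. This is an illustrative reconstruction, not the authors' Matlab code: the function names, threshold value, and synthetic test images are assumptions.

```python
import numpy as np

def black_pixel_fraction(img, threshold=128):
    """Binarize a grayscale image (dark-light technique) and return
    the fraction of 'black' pixels, i.e. those below the threshold."""
    return (img < threshold).mean()

def radial_fft_energy(img):
    """Fraction of 2-D FFT spectral energy at high spatial frequencies,
    used here as a crude surface-texture indicator."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    h, w = spec.shape
    y, x = np.ogrid[:h, :w]
    r = np.sqrt((y - h // 2) ** 2 + (x - w // 2) ** 2)
    return spec[r > min(h, w) / 4].sum() / spec.sum()

# synthetic "smooth" vs "rough" 64x64 surfaces (hypothetical data)
rng = np.random.default_rng(0)
smooth = np.full((64, 64), 150.0)
rough = smooth + rng.normal(0, 60, size=(64, 64))

print(black_pixel_fraction(smooth))                                # 0.0
print(black_pixel_fraction(rough) > black_pixel_fraction(smooth))  # True
print(radial_fft_energy(rough) > radial_fft_energy(smooth))        # True
```

On these toy images the rougher surface yields both more sub-threshold pixels and more high-frequency FFT energy, which is the qualitative trend the abstract reports.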

  12. Creating Processes Associated with Providing Government Goods and Services Under the Commercial Space Launch Act at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Letchworth, Janet F.

    2011-01-01

    Kennedy Space Center (KSC) has decided to write its agreements under the Commercial Space Launch Act (CSLA) authority to cover a broad range of categories of support that KSC could provide to our commercial partner. Our strategy was to go through the onerous process of getting the agreement in place once and allow added specificity and final cost estimates to be documented on a separate Task Order Request (TOR). This paper is written from the implementing engineering team's perspective. It describes how we developed the processes associated with getting Government support to our emerging commercial partners, such as SpaceX, and reports on our success to date.

  13. Homogeneity and Entropy

    NASA Astrophysics Data System (ADS)

    Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.

    1990-11-01

    RESUMEN (translated): We present a methodology for analyzing homogeneity, based on Information Theory, applicable to samples of observational data. ABSTRACT: Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given data sample. Key words: DATA ANALYSIS
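One simple information-theoretic homogeneity measure along the lines this abstract describes is the Shannon entropy of a binned sample relative to its maximum. This is a generic sketch, not the authors' methodology; the index definition and toy samples are assumptions.

```python
import numpy as np

def shannon_entropy(counts):
    """Shannon entropy (in nats) of a histogram of counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def homogeneity_index(sample, bins=10):
    """Entropy of the binned sample divided by the maximum possible
    entropy log(bins); 1.0 means a perfectly even spread over bins."""
    counts, _ = np.histogram(sample, bins=bins)
    return shannon_entropy(counts) / np.log(bins)

rng = np.random.default_rng(1)
uniform = rng.uniform(0, 1, 10000)          # homogeneous sample
clumped = rng.normal(0.5, 0.05, 10000)      # strongly concentrated sample
print(homogeneity_index(uniform) > homogeneity_index(clumped))  # True
```

The uniform sample scores near 1, while the concentrated sample scores visibly lower, which is the kind of contrast such an entropy-based analysis is meant to expose.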

  14. Decision-Making Processes of SME in Cloud Computing Adoption to Create Disruptive Innovation: Mediating Effect of Collaboration

    ERIC Educational Resources Information Center

    Sonthiprasat, Rattanawadee

    2014-01-01

    THE PROBLEM. The purpose of this quantitative correlation study was to assess the relationship between different Cloud service levels and effective business innovation for SMEs. In addition, the new knowledge gained from the benefits of Cloud adoption with knowledge sharing would enhance the decision-making process for businesses to consider the…

  15. Are Children's Memory Illusions Created Differently from Those of Adults? Evidence from Levels-of-Processing and Divided Attention Paradigms

    ERIC Educational Resources Information Center

    Wimmer, Marina C.; Howe, Mark L.

    2010-01-01

    In two experiments, we investigated the robustness and automaticity of adults' and children's generation of false memories by using a levels-of-processing paradigm (Experiment 1) and a divided attention paradigm (Experiment 2). The first experiment revealed that when information was encoded at a shallow level, true recognition rates decreased for…

  16. The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.

    2010-01-01

    The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.

  17. Are children's memory illusions created differently from those of adults? Evidence from levels-of-processing and divided attention paradigms.

    PubMed

    Wimmer, Marina C; Howe, Mark L

    2010-09-01

    In two experiments, we investigated the robustness and automaticity of adults' and children's generation of false memories by using a levels-of-processing paradigm (Experiment 1) and a divided attention paradigm (Experiment 2). The first experiment revealed that when information was encoded at a shallow level, true recognition rates decreased for all ages. For false recognition, when information was encoded on a shallow level, we found a different pattern for young children compared with that for older children and adults. False recognition rates were related to the overall amount of correctly remembered information for 7-year-olds, whereas no such association was found for the other age groups. In the second experiment, divided attention decreased true recognition for all ages. In contrast, children's (7- and 11-year-olds) false recognition rates were again dependent on the overall amount of correctly remembered information, whereas adults' false recognition was left unaffected. Overall, children's false recognition rates changed when levels of processing or divided attention was manipulated in comparison with adults. Together, these results suggest that there may be both quantitative and qualitative changes in false memory rates with age.

  18. Restoration of overwash processes creates piping plover (Charadrius melodus) habitat on a barrier island (Assateague Island, Maryland)

    NASA Astrophysics Data System (ADS)

    Schupp, Courtney A.; Winn, Neil T.; Pearl, Tami L.; Kumer, John P.; Carruthers, Tim J. B.; Zimmerman, Carl S.

    2013-01-01

    On Assateague Island, an undeveloped barrier island along Maryland and Virginia, a foredune was constructed to protect the island from the erosion and breaching threat caused by permanent jetties built to maintain Ocean City Inlet. Scientists and engineers integrated expertise in vegetation, wildlife, geomorphology, and coastal engineering in order to design a habitat restoration project that would be evaluated in terms of coastal processes rather than static features. Development of specific restoration targets, thresholds for intervention, and criteria to evaluate long-term project success were based on biological and geomorphological data and coastal engineering models. A detailed long-term monitoring plan was established to measure project sustainability. The foredune unexpectedly acted as a near-total barrier to both overwash and wind, and the dynamic ecosystem underwent undesirable habitat changes, including conversion of early-succession beach habitat to herbaceous and shrub communities, diminishing the availability of foraging habitat and thereby reducing the productivity of the Federally-listed Threatened Charadrius melodus (piping plover). To address these impacts, multiple notches were cut through the constructed foredune. The metric for initial geomorphological success (restoration of at least one overwash event per year across the constructed foredune, if occurring elsewhere on the island) was reached. New overwash fans increased island stability by increasing interior island elevation. At every notch, areas of sparse vegetation increased and the new foraging habitat was utilized by breeding pairs during the 2010 breeding season. However, the metric for long-term biological success (an increase to 37% sparsely vegetated habitat on the North End and an increase in piping plover productivity to 1.25 chicks fledged per breeding pair) has not yet been met. By 2010 there was an overall productivity of 1.2 chicks fledged per breeding pair and a 1.7% decrease in sparsely

  19. Experimental Simulation of the Radionuclide Behaviour in the Process of Creating Additional Safety Barriers in Solid Radioactive Waste Repositories Containing Irradiated Graphite

    NASA Astrophysics Data System (ADS)

    Pavliuk, A. O.; Kotlyarevskiy, S. G.; Bespala, E. V.; Zakarova, E. V.; Rodygina, N. I.; Ermolaev, V. M.; Proshin, I. M.; Volkova, A.

    2016-08-01

    Results of the experimental modeling of radionuclide behavior when creating additional safety barriers in solid radioactive waste repositories are presented. The experiments were run on the repository mockup containing solid radioactive waste fragments including irradiated graphite. The repository mockup layout is given; the processes with radionuclides that occur during the barrier creation with a clayey solution and during the following barrier operation are investigated. The results obtained confirm high anti-migration and anti-filtration properties of clay used for the barrier creation even under the long-term excessive water saturation of rocks confining the repository.

  20. A model cerium oxide matrix composite reinforced with a homogeneous dispersion of silver particulate - prepared using the glycine-nitrate process

    SciTech Connect

    Weil, K. Scott; Hardy, John S.

    2005-01-31

    Recently, a new method of ceramic brazing has been developed. Based on a two-phase liquid composed of silver and copper oxide, brazing is conducted directly in air without the need of an inert cover gas or the use of surface reactive fluxes. Because the braze displays excellent wetting characteristics on a number of ceramic surfaces, including alumina, various perovskites, zirconia, and ceria, we were interested in investigating whether a metal-reinforced ceramic matrix composite (CMC) could be developed with this material. In the present study, two sets of homogeneously mixed silver/copper oxide/ceria powders were synthesized using a combustion synthesis technique. The powders were compacted and heat treated in air above the liquidus temperature for the chosen Ag-CuO composition. Metallographic analysis indicates that the resulting composite microstructures are extremely uniform with respect to both the size of the metallic reinforcement and its spatial distribution within the ceramic matrix. The size, morphology, and spacing of the metal particulate in the densified composite appear to be dependent on the original size and the structure of the starting combustion-synthesized powders.

  1. HOMOGENEOUS NUCLEAR POWER REACTOR

    DOEpatents

    King, L.D.P.

    1959-09-01

    A homogeneous nuclear power reactor utilizing forced circulation of the liquid fuel is described. The reactor does not require fuel handling outside of the reactor vessel during any normal operation, including complete shutdown to room temperature, the reactor being self-regulating under extreme operating conditions and controlled by the thermal expansion of the liquid fuel. The liquid fuel utilized is a uranium, phosphoric acid, and water solution which requires no gas exhaust system or independent gas recombining system, thereby eliminating the handling of radiolytic gas.

  2. Optimizing homogenization by chaotic unmixing?

    NASA Astrophysics Data System (ADS)

    Weijs, Joost; Bartolo, Denis

    2016-11-01

    A number of industrial processes rely on the homogeneous dispersion of non-Brownian particles in a viscous fluid. An ideal mixing would yield a so-called hyperuniform particle distribution. Such configurations are characterized by density fluctuations that grow more slowly than the standard √N fluctuations. Even though such distributions have been found in several natural structures, e.g. retina receptors in birds, they have remained out of experimental reach until very recently. Over the last 5 years independent experiments and numerical simulations have shown that periodically driven suspensions can self-assemble hyperuniformly. Simple as the recipe may be, it has one important disadvantage. The emergence of hyperuniform states co-occurs with a critical phase transition from reversible to non-reversible particle dynamics. As a consequence the homogenization dynamics occurs over a time that diverges with the system size (critical slowing down). Here, we discuss how this process can be sped up by exploiting the stirring properties of chaotic advection. Among the questions that we answer are: What are the physical mechanisms in a chaotic flow that are relevant for hyperuniformity? How can we tune the flow parameters so as to obtain optimal hyperuniformity in the fastest way? JW acknowledges funding by NWO (Netherlands Organisation for Scientific Research) through a Rubicon Grant.

  3. Homogeneous, bioluminescent proteasome assays.

    PubMed

    O'Brien, Martha A; Moravec, Richard A; Riss, Terry L; Bulleit, Robert F

    2015-01-01

    Protein degradation is mediated predominantly through the ubiquitin-proteasome pathway. The importance of the proteasome in regulating degradation of proteins involved in cell-cycle control, apoptosis, and angiogenesis led to the recognition of the proteasome as a therapeutic target for cancer. The proteasome is also essential for degrading misfolded and aberrant proteins, and impaired proteasome function has been implicated in neurodegenerative and cardiovascular diseases. Robust, sensitive assays are essential for monitoring proteasome activity and for developing inhibitors of the proteasome. Peptide-conjugated fluorophores are widely used as substrates for monitoring proteasome activity, but fluorogenic substrates can exhibit significant background and can be problematic for screening because of cellular autofluorescence or interference from fluorescent library compounds. Furthermore, fluorescent proteasome assays require column-purified 20S or 26S proteasome (typically obtained from erythrocytes), or proteasome extracts from whole cells, as their samples. To provide assays more amenable to high-throughput screening, we developed a homogeneous, bioluminescent method that combines peptide-conjugated aminoluciferin substrates and a stabilized luciferase. Using substrates for the chymotrypsin-like, trypsin-like, and caspase-like proteasome activities in combination with a selective membrane permeabilization step, we developed single-step, cell-based assays to measure each of the proteasome catalytic activities. The homogeneous method eliminates the need to prepare individual cell extracts as samples and has adequate sensitivity for 96- and 384-well plates. The simple "add and read" format enables sensitive and rapid proteasome assays ideal for inhibitor screening.

  4. Light-created chemiluminescence

    NASA Astrophysics Data System (ADS)

    Vasil'ev, Rostislav F.; Tsaplev, Yuri B.

    2006-11-01

    The results of studies of light-created chemiluminescence are described systematically. Conditions for the transformation of a dark chemical reaction into a chemiluminescence reaction are considered. Examples of photosensitised and photoinduced processes as well as of analytical applications are given.

  5. Creating Collaborative Advantage.

    ERIC Educational Resources Information Center

    Huxham, Chris, Ed.

    Although interorganizational collaboration is becoming increasingly significant as a means of achieving organizational objectives, it is not an easy process to implement. Drawing on the work of authors with extensive experience, an accessible introduction to the theory and practice of creating collaborative advantage is presented in this volume.…

  6. HOMOGENEOUS NUCLEAR REACTOR

    DOEpatents

    Hammond, R.P.; Busey, H.M.

    1959-02-17

    Nuclear reactors of the homogeneous liquid fuel type are discussed. The reactor is comprised of an elongated closed vessel, vertically oriented, having a critical region at the bottom, a lower chimney structure extending from the critical region vertically upwardly and surrounded by heat exchanger coils, to a baffle region above which is located an upper chimney structure containing a catalyst functioning to recombine radiolytically dissociated moderator gases. In operation the liquid fuel circulates solely by convection from the critical region upwardly through the lower chimney and then downwardly through the heat exchanger to return to the critical region. The gases formed by radiolytic dissociation of the moderator are carried upwardly with the circulating liquid fuel and past the baffle into the region of the upper chimney where they are recombined by the catalyst and condensed, thence returning through the heat exchanger to the critical region.

  7. Homogeneous quantum electrodynamic turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    1992-01-01

    The electromagnetic field equations and Dirac equations for oppositely charged wave functions are numerically time-integrated using a spatial Fourier method. The numerical approach used, a spectral transform technique, is based on a continuum representation of physical space. The coupled classical field equations contain a dimensionless parameter which sets the strength of the nonlinear interaction (as the parameter increases, interaction volume decreases). For a parameter value of unity, highly nonlinear behavior in the time-evolution of an individual wave function, analogous to ideal fluid turbulence, is observed. In the truncated Fourier representation which is numerically implemented here, the quantum turbulence is homogeneous but anisotropic and manifests itself in the nonlinear evolution of equilibrium modal spatial spectra for the probability density of each particle and also for the electromagnetic energy density. The results show that nonlinearly interacting fermionic wave functions quickly approach a multi-mode, dynamic equilibrium state, and that this state can be determined by numerical means.
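The spatial Fourier (spectral transform) technique named in this abstract rests on evaluating derivatives in wavenumber space. The following is a generic one-dimensional illustration of that core operation, not the authors' field solver; the grid size and test function are arbitrary choices.

```python
import numpy as np

# Pseudo-spectral evaluation of a spatial derivative: transform to
# Fourier space, multiply by ik, transform back. This is the building
# block of spectral-transform time-integration schemes.
N = 64
L = 2 * np.pi
x = np.linspace(0, L, N, endpoint=False)
u = np.sin(3 * x)

k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)     # integer wavenumbers here
du = np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

print(np.allclose(du, 3 * np.cos(3 * x)))       # True (spectral accuracy)
```

For smooth periodic fields the error of this derivative is at machine-precision level, which is why spectral methods suit long time integrations of field equations.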

  8. Homogeneous nucleation kinetics

    NASA Technical Reports Server (NTRS)

    Rasmussen, D. H.; Appleby, M. R.; Leedom, G. L.; Babu, S. V.; Naumann, R. J.

    1983-01-01

    Homogeneous nucleation kinetics are rederived in a manner fundamentally similar to the approach of classical nucleation theory with the following modifications and improvements. First, the cluster is a parent phase cluster and does not require energization to the parent state. Second, the thermodynamic potential used to describe phase stability is a continuous function along the pathway of phase decomposition. Third, the kinetics of clustering corresponds directly to the diffusional flux of monomers through the cluster distribution and are formally similar to classical theory with the resulting kinetic equation modified by two terms in the preexponential factor. These terms correct for the influence of a supersaturation dependent clustering within the parent phase and for the influence of an asymmetrical cluster concentration as a function of cluster size at the critical cluster size. Fourth, the supersaturation dependence of the nucleation rate is of the same form as that given by classical nucleation theory. This supersaturation dependence must however be interpreted in terms of a size dependent surface tension. Finally, there are two scaling laws which describe supersaturation to either constant nucleation rate or to the thermodynamically determined physical spinodal.
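The classical supersaturation dependence that this abstract says is retained can be made concrete with the textbook classical-nucleation-theory rate. This is standard CNT, not the modified kinetics derived in the paper, and the water-like parameter values are purely illustrative assumptions.

```python
import math

def critical_barrier(sigma, v_mol, kT, S):
    """Classical free-energy barrier for homogeneous nucleation:
    dG* = 16*pi*sigma^3*v^2 / (3*(kT*ln S)^2)."""
    return 16 * math.pi * sigma**3 * v_mol**2 / (3 * (kT * math.log(S))**2)

def nucleation_rate(J0, sigma, v_mol, kT, S):
    """J = J0 * exp(-dG*/kT): the classical supersaturation dependence."""
    return J0 * math.exp(-critical_barrier(sigma, v_mol, kT, S) / kT)

# illustrative, water-like numbers (hypothetical)
sigma = 0.072        # surface tension, J/m^2
v_mol = 3.0e-29      # molecular volume, m^3
kT = 1.38e-23 * 300  # thermal energy at 300 K, J
for S in (2, 3, 4):
    print(S, nucleation_rate(1e33, sigma, v_mol, kT, S))
```

The barrier falls as 1/(ln S)^2, so the rate rises extremely steeply with supersaturation; the paper's modifications enter through the preexponential factor and a size-dependent surface tension, not this functional form.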

  9. Universum Inference and Corpus Homogeneity

    NASA Astrophysics Data System (ADS)

    Vogel, Carl; Lynch, Gerard; Janssen, Jerom

    Universum Inference is re-interpreted for assessment of corpus homogeneity in computational stylometry. Recent stylometric research quantifies strength of characterization within dramatic works by assessing the homogeneity of corpora associated with dramatic personas. A methodological advance is suggested to mitigate the potential for the assessment of homogeneity to be achieved by chance. Baseline comparison analysis is constructed for contributions to debates by nonfictional participants: the corpus analyzed consists of transcripts of US Presidential and Vice-Presidential debates from the 2000 election cycle. The corpus is also analyzed in translation to Italian, Spanish and Portuguese. Adding randomized categories makes assessments of homogeneity more conservative.
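The randomized-baseline idea in this abstract, i.e. guarding a homogeneity assessment against chance, can be sketched with a generic permutation test on term counts. This is not Universum Inference itself; the chi-square statistic, toy "speakers", and counts are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def chi2_stat(rows):
    """Pearson chi-square statistic for a contingency table."""
    table = np.asarray(rows, float)
    expected = (table.sum(1, keepdims=True) @ table.sum(0, keepdims=True)
                / table.sum())
    return ((table - expected) ** 2 / expected).sum()

def randomized_baseline(docs, labels, trials=500):
    """Compare observed between-category heterogeneity against random
    relabelings; returns (statistic, permutation p-value)."""
    observed = chi2_stat([docs[labels == g].sum(0) for g in np.unique(labels)])
    null = [chi2_stat([docs[p == g].sum(0) for g in np.unique(p)])
            for p in (rng.permutation(labels) for _ in range(trials))]
    return observed, np.mean(np.array(null) >= observed)

# toy term-count vectors for two stylistically distinct "speakers"
a = rng.poisson([9, 1, 5], size=(20, 3))
b = rng.poisson([1, 9, 5], size=(20, 3))
docs = np.vstack([a, b])
labels = np.repeat([0, 1], 20)
stat, p = randomized_baseline(docs, labels)
print(p < 0.05)  # the two speakers are detectably heterogeneous
```

If the observed statistic is no larger than what shuffled category labels produce, an apparent homogeneity difference should be attributed to chance, which is the conservatism the abstract argues for.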

  10. Homogeneous Catalysis by Transition Metal Compounds.

    ERIC Educational Resources Information Center

    Mawby, Roger

    1988-01-01

    Examines four processes involving homogeneous catalysis which highlight the contrast between the simplicity of the overall reaction and the complexity of the catalytic cycle. Describes how catalysts provide circuitous routes in which all energy barriers are relatively low rather than lowering the activation energy for a single step reaction.…

  11. Reciprocity theory of homogeneous reactions

    NASA Astrophysics Data System (ADS)

    Agbormbai, Adolf A.

    1990-03-01

    The reciprocity formalism is applied to the homogeneous gaseous reactions in which the structure of the participating molecules changes upon collision with one another, resulting in a change in the composition of the gas. The approach is applied to various classes of dissociation, recombination, rearrangement, ionizing, and photochemical reactions. It is shown that for the principle of reciprocity to be satisfied it is necessary that all chemical reactions exist in complementary pairs which consist of the forward and backward reactions. The backward reaction may be described by either the reverse or inverse process. The forward and backward processes must satisfy the same reciprocity equation. Because the number of dynamical variables is usually unbalanced on both sides of a chemical equation, it is necessary that this balance be established by including as many of the dynamical variables as needed before the reciprocity equation can be formulated. Statistical transformation models of the reactions are formulated. The models are classified under the titles free exchange, restricted exchange and simplified restricted exchange. The special equations for the forward and backward processes are obtained. The models are consistent with the H theorem and Le Chatelier's principle. The models are also formulated in the context of the direct simulation Monte Carlo method.

  12. A homogeneous fluorometric assay platform based on novel synthetic proteins

    SciTech Connect

    Vardar-Schara, Goenuel; Krab, Ivo M.; Yi, Guohua; Su, Wei Wen . E-mail: wsu@hawaii.edu

    2007-09-14

    Novel synthetic recombinant sensor proteins have been created to detect analytes in solution, in a rapid single-step 'mix and read' noncompetitive homogeneous assay process, based on modulating the Foerster resonance energy transfer (FRET) property of the sensor proteins upon binding to their targets. The sensor proteins comprise a protein scaffold that incorporates a specific target-capturing element, sandwiched by genetic fusion between two molecules that form a FRET pair. The utility of the sensor proteins was demonstrated via three examples, for detecting an anti-biotin Fab antibody, a His-tagged recombinant protein, and an anti-FLAG peptide antibody, respectively, all done directly in solution. The diversity of sensor-target interactions that we have demonstrated in this study points to a potentially universal applicability of the biosensing concept. The possibilities for integrating a variety of target-capturing elements with a common sensor scaffold predict a broad range of practical applications.

  13. A comparison of the source processes of four Boso Peninsula slow slip events from 1996 to 2011 based on nearly homogeneous GNSS stations

    NASA Astrophysics Data System (ADS)

    Hirose, H.; Matsuzawa, T.; Kimura, T.

    2013-12-01

    Around the Boso Peninsula, Japan, slow slip events (SSEs) accompanied by earthquake swarms recur at intervals of four to seven years, associated with the subduction of the Philippine Sea Plate (PHS) from the Sagami trough beneath the Kanto area. The latest event occurred in October 2011; it was likely hastened by the great Tohoku earthquake (magnitude 9.0) in March 2011, while an earlier episode was likely delayed by an intraslab earthquake in 1987 (magnitude 6.7) that occurred just beneath the source area of the SSEs (Hirose et al., 2012). This suggests that the occurrence of the Boso SSEs is largely influenced by stress disturbances of the order of 0.1 MPa, indicating the sensitive nature of the SSE source area. This recurrence history is useful for understanding the frictional properties on the plate interface because it is rare to observe recurrent slip events on the interface at almost the same place with nearly the same observation coverage. The later four episodes (1996, 2002, 2007, 2011) were observed with the GNSS Earth Observation Network System (GEONET) operated by the Geospatial Information Authority of Japan (GSI) (Sagiya, 2004; Ozawa et al., 2003, 2007; Hirose et al., 2012). We invert displacement data for these four Boso SSEs to obtain the source process for each episode and to discuss the possible relation to the fluctuation in the recurrence intervals. A Network Inversion Filter (Segall and Matthews, 1997; Hirose and Obara, 2010) is applied to the GNSS data sets. We define the PHS plate configuration beneath the Kanto area based on the distribution of repeating earthquakes (Kimura et al., 2006) and a compilation of seismic reflection surveys (Takeda et al., 2007). There is a common slip area among the four SSEs in the eastern offshore region of the study area. Slip always starts in the offshore region and migrates to the west or to the north. This migration pattern roughly corresponds to the migration of the accompanying earthquake activity

  14. Effects of sample homogenization on solid phase sediment toxicity

    SciTech Connect

    Anderson, B.S.; Hunt, J.W.; Newman, J.W.; Tjeerdema, R.S.; Fairey, W.R.; Stephenson, M.D.; Puckett, H.M.; Taberski, K.M.

    1995-12-31

    Sediment toxicity is typically assessed using homogenized surficial sediment samples. It has been recognized that homogenization alters sediment integrity and may result in changes in chemical bioavailability through oxidation-reduction or other chemical processes. In this study, intact (unhomogenized) sediment cores were taken from a Van Veen grab sampler and tested concurrently with sediment homogenate from the same sample in order to investigate the effect of homogenization on toxicity. Two different solid-phase toxicity test protocols were used for these comparisons. Results of amphipod exposures to samples from San Francisco Bay indicated minimal difference between intact and homogenized samples. Mean amphipod survival in intact cores relative to homogenates was similar at two contaminated sites. Mean survival was 34 and 33% in intact and homogenized samples, respectively, at Castro Cove. Mean survival was 41% and 57%, respectively, in intact and homogenized samples from Islais Creek. Studies using the sea urchin development protocol, modified for testing at the sediment/water interface, indicated considerably more toxicity in intact samples relative to homogenized samples from San Diego Bay. Measures of metal flux into the overlying water demonstrated greater flux of metals from the intact samples. Zinc flux was five times greater, and copper flux was twice as great in some intact samples relative to homogenates. Future experiments will compare flux of metals and organic compounds in intact and homogenized sediments to further evaluate the efficacy of using intact cores for solid phase toxicity assessment.

  15. STEAM STIRRED HOMOGENEOUS NUCLEAR REACTOR

    DOEpatents

    Busey, H.M.

    1958-06-01

    A homogeneous nuclear reactor utilizing a self-circulating liquid fuel is described. The reactor vessel is in the form of a vertically disposed tubular member having the lower end closed by the tube walls and the upper end closed by a removable flanged assembly. A spherical reaction shell is located in the lower end of the vessel and spaced from the inside walls. The reaction shell is perforated on its lower surface and is provided with a bundle of small-diameter tubes extending vertically upward from its top central portion. The reactor vessel is surrounded in the region of the reaction shell by a neutron reflector. The liquid fuel, which may be a solution of enriched uranyl sulfate in ordinary or heavy water, is maintained at a level within the reactor vessel of approximately the top of the tubes. The heat of the reaction which is created in the critical region within the spherical reaction shell forms steam bubbles which move upwardly through the tubes. The upward movement of these bubbles results in the forcing of the liquid fuel out of the top of these tubes, from where the fuel passes downwardly in the space between the tubes and the vessel wall where it is cooled by heat exchangers. The fuel then re-enters the critical region in the reaction shell through the perforations in the bottom. The upper portion of the reactor vessel is provided with baffles to prevent the liquid fuel from splashing into this region, which is also provided with a recombiner apparatus for recombining the radiolytically dissociated moderator vapor and a control means.

  16. Creating New Incentives for Risk Identification and Insurance Process for the Electric Utility Industry (initial award through Award Modification 2); Energy & Risk Transfer Assessment (Award Modifications 3 - 6)

    SciTech Connect

    Michael Ebert

    2008-02-28

    This is the final report for the DOE-NETL grant entitled 'Creating New Incentives for Risk Identification & Insurance Processes for the Electric Utility Industry' and later, 'Energy & Risk Transfer Assessment'. It reflects work done on projects from 15 August 2004 to 29 February 2008. Projects were on a variety of topics, including commercial insurance for electrical utilities, the Electrical Reliability Organization, cost recovery by Gulf State electrical utilities after major hurricanes, and review of state energy emergency plans. This Final Technical Report documents and summarizes all work performed during the award period, which in this case is from 15 August 2004 (date of notification of original award) through 29 February 2008. This report presents this information in a comprehensive, integrated fashion that clearly shows a logical and synergistic research trajectory, and is augmented with findings and conclusions drawn from the research as a whole. Four major research projects were undertaken and completed during the 42 month period of activities conducted and funded by the award; these are: (1) Creating New Incentives for Risk Identification and Insurance Process for the Electric Utility Industry (also referred to as the 'commercial insurance' research). Three major deliverables were produced: a pre-conference white paper, a two-day facilitated stakeholders workshop conducted at George Mason University, and a post-workshop report with findings and recommendations. All deliverables from this work are published on the CIP website at http://cipp.gmu.edu/projects/DoE-NETL-2005.php. (2) The New Electric Reliability Organization (ERO): an examination of critical issues associated with governance, standards development and implementation, and jurisdiction (also referred to as the 'ERO study'). 
Four major deliverables were produced: a series of preliminary memoranda for the staff of the Office of Electricity Delivery and Energy Reliability ('OE'), an ERO interview

  17. Creating a Health Journal

    MedlinePlus

    Creating a Personal Health Journal (Health Diary). Guidance listed under Health Resources > Healthcare Management > Working With Your Doctor.

  18. Locally homogeneous pp-waves

    NASA Astrophysics Data System (ADS)

    Globke, Wolfgang; Leistner, Thomas

    2016-10-01

    We show that every n-dimensional locally homogeneous pp-wave is a plane wave, provided it is indecomposable and its curvature operator, when acting on 2-forms, has rank greater than one. As a consequence we obtain that indecomposable, Ricci-flat locally homogeneous pp-waves are plane waves. This generalises a classical result by Jordan, Ehlers and Kundt in dimension 4. Several examples show that our assumptions on indecomposability and the rank of the curvature are essential.

  19. Operator estimates in homogenization theory

    NASA Astrophysics Data System (ADS)

    Zhikov, V. V.; Pastukhova, S. E.

    2016-06-01

    This paper gives a systematic treatment of two methods for obtaining operator estimates: the shift method and the spectral method. Though substantially different in mathematical technique and physical motivation, these methods produce basically the same results. Besides the classical formulation of the homogenization problem, other formulations of the problem are also considered: homogenization in perforated domains, the case of an unbounded diffusion matrix, non-self-adjoint evolution equations, and higher-order elliptic operators. Bibliography: 62 titles.

  20. Revisiting Shock Initiation Modeling of Homogeneous Explosives

    NASA Astrophysics Data System (ADS)

    Partom, Yehuda

    2013-04-01

    Shock initiation of homogeneous explosives has been a subject of research since the 1960s, with neat and sensitized nitromethane as the main materials for experiments. A shock initiation model of homogeneous explosives was established in the early 1960s. It involves a thermal explosion event at the shock entrance boundary, which develops into a superdetonation that overtakes the initial shock. In recent years, Sheffield and his group, using accurate experimental tools, were able to observe details of buildup of the superdetonation. There are many papers on modeling shock initiation of heterogeneous explosives, but there are only a few papers on modeling shock initiation of homogeneous explosives. In this article, bulk reaction reactive flow equations are used to model homogeneous shock initiation in an attempt to reproduce experimental data of Sheffield and his group. It was possible to reproduce the main features of the shock initiation process, including thermal explosion, superdetonation, input shock overtake, overdriven detonation after overtake, and the beginning of decay toward Chapman-Jouget (CJ) detonation. The time to overtake (TTO) as function of input pressure was also calculated and compared to the experimental TTO.

  1. (Ultra) High Pressure Homogenization for Continuous High Pressure Sterilization of Pumpable Foods – A Review

    PubMed Central

    Georget, Erika; Miller, Brittany; Callanan, Michael; Heinz, Volker; Mathys, Alexander

    2014-01-01

    Bacterial spores have a strong resistance to both chemical and physical hurdles and pose a risk for the food industry, which has been tackled by applying high-intensity thermal treatments to sterilize food. These strong thermal treatments reduce the organoleptic and nutritional properties of food, and alternatives are actively being sought. Innovative hurdles offer an alternative to inactivate bacterial spores. In particular, recent technological developments have enabled a new generation of high pressure homogenizers working at pressures up to 400 MPa, thus opening new opportunities for high pressure sterilization of foods. In this short review, we summarize the work conducted on (ultra) high pressure homogenization ((U)HPH) to inactivate endospores in model and food systems. Specific attention is given to process parameters (pressure, inlet, and valve temperatures). This review gathers the current state of the art and underlines the potential of UHPH sterilization of pumpable foods while highlighting the needs for future work. PMID:25988118

  2. (Ultra) high pressure homogenization for continuous high pressure sterilization of pumpable foods - a review.

    PubMed

    Georget, Erika; Miller, Brittany; Callanan, Michael; Heinz, Volker; Mathys, Alexander

    2014-01-01

    Bacterial spores have a strong resistance to both chemical and physical hurdles and pose a risk for the food industry, which has been tackled by applying high-intensity thermal treatments to sterilize food. These strong thermal treatments reduce the organoleptic and nutritional properties of food, and alternatives are actively being sought. Innovative hurdles offer an alternative to inactivate bacterial spores. In particular, recent technological developments have enabled a new generation of high pressure homogenizers working at pressures up to 400 MPa, thus opening new opportunities for high pressure sterilization of foods. In this short review, we summarize the work conducted on (ultra) high pressure homogenization ((U)HPH) to inactivate endospores in model and food systems. Specific attention is given to process parameters (pressure, inlet, and valve temperatures). This review gathers the current state of the art and underlines the potential of UHPH sterilization of pumpable foods while highlighting the needs for future work.

  3. Political homogeneity can nurture threats to research validity.

    PubMed

    Chambers, John R; Schlenker, Barry R

    2015-01-01

    Political homogeneity within a scientific field nurtures threats to the validity of many research conclusions by allowing ideologically compatible values to influence interpretations, by minimizing skepticism, and by creating premature consensus. Although validity threats can crop up in any research, the usual corrective activities in science are more likely to be minimized and delayed.

  4. AQUEOUS HOMOGENEOUS REACTOR TECHNICAL PANEL REPORT

    SciTech Connect

    Diamond, D.J.; Bajorek, S.; Bakel, A.; Flanagan, G.; Mubayi, V.; Skarda, R.; Staudenmeier, J.; Taiwo, T.; Tonoike, K.; Tripp, C.; Wei, T.; Yarsky, P.

    2010-12-03

    Considerable interest has been expressed for developing a stable U.S. production capacity for medical isotopes and particularly for molybdenum-99 (99Mo). This is motivated by recent reductions in production and supply worldwide. Consistent with U.S. nonproliferation objectives, any new production capability should not use highly enriched uranium fuel or targets. Consequently, Aqueous Homogeneous Reactors (AHRs) are under consideration for potential 99Mo production using low-enriched uranium. Although the Nuclear Regulatory Commission (NRC) has guidance to facilitate the licensing process for non-power reactors, that guidance is focused on reactors with fixed, solid fuel and hence is not applicable to an AHR. A panel was convened to study the technical issues associated with normal operation and potential transients and accidents of an AHR that might be designed for isotope production. The panel has produced the requisite AHR licensing guidance for three chapters that exist now for non-power reactor licensing: Reactor Description, Reactor Coolant Systems, and Accident Analysis. The guidance is in two parts for each chapter: 1) the standard format and content a licensee would use and 2) the standard review plan the NRC staff would use. This guidance takes into account the unique features of an AHR such as the fuel being in solution; the fission product barriers being the vessel and attached systems; the production and release of radiolytic and fission product gases and their impact on operations and their control by a gas management system; and the movement of fuel into and out of the reactor vessel.

  5. Entanglement Created by Dissipation

    SciTech Connect

    Alharbi, Abdullah F.; Ficek, Zbigniew

    2011-10-27

    A technique for entangling closely separated atoms by the process of dissipative spontaneous emission is presented. The system considered is composed of two non-identical two-level atoms separated at the quarter wavelength of a driven standing wave laser field. At this atomic distance, only one of the atoms can be addressed by the laser field. In addition, we arrange the atomic dipole moments to be oriented relative to the inter-atomic axis such that the dipole-dipole interaction between the atoms is zero at this specific distance. It is shown that an entanglement can be created between the atoms on demand by tuning the Rabi frequency of the driving field to the difference between the atomic transition frequencies. The amount of the entanglement created depends on the ratio between the damping rates of the atoms, but is independent of the frequency difference between the atoms. We also find that the transient buildup of an entanglement between the atoms may differ dramatically for different initial atomic conditions.

  6. Dynamics of compact homogeneous universes

    SciTech Connect

    Tanimoto, M.; Koike, T.; Hosoya, A.

    1997-01-01

    A complete description of dynamics of compact locally homogeneous universes is given, which, in particular, includes explicit calculations of Teichmüller deformations and careful counting of dynamical degrees of freedom. We regard each of the universes as a simply connected four-dimensional space-time with identifications by the action of a discrete subgroup of the isometry group. We then reduce the identifications defined by the space-time isometries to ones in a homogeneous section, and find a condition that such spatial identifications must satisfy. This is essential for explicit construction of compact homogeneous universes. Some examples are demonstrated for Bianchi II, VI₀, VII₀, and I universal covers. © 1997 American Institute of Physics.

  7. The Art of Gymnastics: Creating Sequences.

    ERIC Educational Resources Information Center

    Rovegno, Inez

    1988-01-01

    Offering students opportunities for creating movement sequences in gymnastics allows them to understand the essence of gymnastics, have creative experiences, and learn about themselves. The process of creating sequences is described. (MT)

  8. Broken Ergodicity in Ideal, Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John; Fu, Terry; Nguyen, Phu; Shum, Victor

    2010-01-01

    We discuss the statistical mechanics of numerical models of ideal homogeneous, incompressible turbulence and their relevance for dissipative fluids and magnetofluids. These numerical models are based on Fourier series and the relevant statistical theory predicts that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. However, numerical simulations clearly show that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We explain this phenomenon in terms of 'broken ergodicity', which is defined to occur when dynamical behavior does not match ensemble predictions on very long time-scales. We review the theoretical basis of broken ergodicity, apply it to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence, and show new results from simulations using GPU (graphical processing unit) computers.

  9. The Birth and Re-Birth of the ISBDs: Process and Procedures for Creating and Revising the International Standard Bibliographic Descriptions [and] Section on Bibliography--Review of Activities, 1999-2000.

    ERIC Educational Resources Information Center

    Byrum, John D.

    This document contains two papers. The first paper discusses the process and procedures for creating and revising the ISBD (International Standard Bibliographic Description), including historical background from 1969 to the present, a description of revision projects, and a chart that summarizes the history and current status of the full range of…

  10. Creating visual explanations improves learning.

    PubMed

    Bobek, Eliza; Tversky, Barbara

    2016-01-01

    Many topics in science are notoriously difficult for students to learn. Mechanisms and processes outside student experience present particular challenges. While instruction typically involves visualizations, students usually explain in words. Because visual explanations can show parts and processes of complex systems directly, creating them should have benefits beyond creating verbal explanations. We compared learning from creating visual or verbal explanations for two STEM domains, a mechanical system (bicycle pump) and a chemical system (bonding). Both kinds of explanations were analyzed for content, and learning was assessed by a post-test. For the mechanical system, creating a visual explanation increased understanding particularly for participants of low spatial ability. For the chemical system, creating both visual and verbal explanations improved learning without new teaching. Creating a visual explanation was superior and benefitted participants of both high and low spatial ability. Visual explanations often included crucial yet invisible features. The greater effectiveness of visual explanations appears attributable to the checks they provide for completeness and coherence as well as to their roles as platforms for inference. The benefits should generalize to other domains like the social sciences, history, and archeology where important information can be visualized. Together, the findings provide support for the use of learner-generated visual explanations as a powerful learning tool.

  11. A compact setup to study homogeneous nucleation and condensation

    NASA Astrophysics Data System (ADS)

    Karlsson, Mattias; Alxneit, Ivo; Rütten, Frederik; Wuillemin, Daniel; Tschudi, Hans Rudolf

    2007-03-01

    An experiment is presented to study homogeneous nucleation and the subsequent droplet growth at high temperatures and high pressures in a compact setup that does not use moving parts. Nucleation and condensation are induced in an adiabatic, stationary expansion of the vapor and an inert carrier gas through a Laval nozzle. The adiabatic expansion is driven against atmospheric pressure by pressurized inert gas whose mass flow is carefully controlled. This allows us to avoid large pumps or vacuum storage tanks. Because we eventually want to study the homogeneous nucleation and condensation of zinc, carefully chosen materials are required that can withstand pressures of up to 10⁶ Pa resulting from mass flow rates of up to 600 lN min⁻¹ and temperatures up to 1200 K in the presence of highly corrosive zinc vapor. To observe the formation of droplets, a laser beam propagates along the axis of the nozzle and the light scattered by the droplets is detected perpendicularly to the nozzle axis. An ICCD camera records the scattered light, spatially resolved, through fused silica windows in the diverging part of the nozzle and detects nucleation and condensation coherently in a single exposure. For the data analysis, a model is needed to describe the isentropic core part of the flow along the nozzle axis. The model must incorporate the laws of fluid dynamics and the nucleation and condensation process, and has to predict the size distribution of the particles created (PSD) at every position along the nozzle axis. Assuming Rayleigh scattering, the intensity of the scattered light can then be calculated from the second moment of the PSD.

  12. A compact setup to study homogeneous nucleation and condensation.

    PubMed

    Karlsson, Mattias; Alxneit, Ivo; Rütten, Frederik; Wuillemin, Daniel; Tschudi, Hans Rudolf

    2007-03-01

    An experiment is presented to study homogeneous nucleation and the subsequent droplet growth at high temperatures and high pressures in a compact setup that does not use moving parts. Nucleation and condensation are induced in an adiabatic, stationary expansion of the vapor and an inert carrier gas through a Laval nozzle. The adiabatic expansion is driven against atmospheric pressure by pressurized inert gas whose mass flow is carefully controlled. This allows us to avoid large pumps or vacuum storage tanks. Because we eventually want to study the homogeneous nucleation and condensation of zinc, carefully chosen materials are required that can withstand pressures of up to 10⁶ Pa resulting from mass flow rates of up to 600 lN min⁻¹ and temperatures up to 1200 K in the presence of highly corrosive zinc vapor. To observe the formation of droplets, a laser beam propagates along the axis of the nozzle and the light scattered by the droplets is detected perpendicularly to the nozzle axis. An ICCD camera records the scattered light, spatially resolved, through fused silica windows in the diverging part of the nozzle and detects nucleation and condensation coherently in a single exposure. For the data analysis, a model is needed to describe the isentropic core part of the flow along the nozzle axis. The model must incorporate the laws of fluid dynamics and the nucleation and condensation process, and has to predict the size distribution of the particles created (PSD) at every position along the nozzle axis. Assuming Rayleigh scattering, the intensity of the scattered light can then be calculated from the second moment of the PSD.
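    The last step of this analysis, relating scattered intensity to the second moment of the PSD, can be sketched numerically. In the Rayleigh limit each droplet scatters in proportion to r^6, i.e. to the square of its volume, so total intensity tracks the second moment of the volume distribution. A minimal sketch, assuming a hypothetical lognormal size distribution (values illustrative, not from the paper):

```python
import numpy as np

def mean_sq_volume(radii_m):
    """Second moment <v^2> of the droplet volume distribution.

    In the Rayleigh limit (radius << wavelength) each droplet scatters
    light in proportion to r^6, i.e. to the square of its volume, so
    the total scattered intensity is proportional to N * <v^2>.
    """
    volumes = (4.0 / 3.0) * np.pi * np.asarray(radii_m) ** 3
    return float(np.mean(volumes ** 2))

# Hypothetical lognormal PSD around 50 nm radius, just for illustration.
rng = np.random.default_rng(0)
radii = rng.lognormal(mean=np.log(50e-9), sigma=0.3, size=100_000)
print(f"relative scattered intensity per droplet: {mean_sq_volume(radii):.3e}")
```

    Any model PSD produced by the nozzle-flow calculation can be substituted for the lognormal sample; only its second moment enters the predicted signal.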

  13. Homogeneous Pt-bimetallic Electrocatalysts

    SciTech Connect

    Wang, Chao; Chi, Miaofang; More, Karren Leslie; Markovic, Nenad; Stamenkovic, Vojislav

    2011-01-01

    Alloying has shown enormous potential for tailoring the atomic and electronic structures, and improving the performance of catalytic materials. Systematic studies of alloy catalysts are, however, often compromised by inhomogeneous distribution of alloying components. Here we introduce a general approach for the synthesis of monodispersed and highly homogeneous Pt-bimetallic alloy nanocatalysts. Pt₃M (where M = Fe, Ni, or Co) nanoparticles were prepared by an organic solvothermal method and then supported on high surface area carbon. These catalysts attained a homogeneous distribution of elements, as demonstrated by atomic-scale elemental analysis using scanning transmission electron microscopy. They also exhibited high catalytic activities for the oxygen reduction reaction (ORR), with improvement factors of 2-3 versus conventional Pt/carbon catalysts. The measured ORR catalytic activities for Pt₃M nanocatalysts validated the volcano curve established on extended surfaces, with Pt₃Co being the most active alloy.

  14. High School Student Perceptions of the Utility of the Engineering Design Process: Creating Opportunities to Engage in Engineering Practices and Apply Math and Science Content

    NASA Astrophysics Data System (ADS)

    Berland, Leema; Steingut, Rebecca; Ko, Pat

    2014-12-01

    Research and policy documents increasingly advocate for incorporating engineering design into K-12 classrooms in order to accomplish two goals: (1) provide an opportunity to engage with science content in a motivating real-world context; and (2) introduce students to the field of engineering. The present study uses multiple qualitative data sources (i.e., interviews, artifact analysis) in order to examine the ways in which engaging in engineering design can support students in participating in engineering practices and applying math and science knowledge. This study suggests that students better understand and value those aspects of engineering design that are more qualitative (i.e., interviewing users, generating multiple possible solutions) than the more quantitative aspects of design which create opportunities for students to integrate traditional math and science content into their design work (i.e., modeling or systematically choosing between possible design solutions). Recommendations for curriculum design and implementation are discussed.

  15. Multifractal spectra in homogeneous shear flow

    NASA Technical Reports Server (NTRS)

    Deane, A. E.; Keefe, L. R.

    1988-01-01

    Numerical simulations of 3-D homogeneous shear flow were employed to calculate the associated multifractal spectra of the energy dissipation, scalar dissipation, and vorticity fields. The results for 128³ simulations of this flow, and those obtained in recent experiments that analyzed 1- and 2-D intersections of atmospheric and laboratory flows, are in some agreement. A two-scale Cantor set model of the energy cascade process, which describes the experimental results from 1-D intersections quite well, describes the 3-D results only marginally.
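    The two-scale Cantor set model of the cascade has a closed-form multifractal spectrum obtained by Legendre transform of the mass exponents tau(q). A minimal sketch, assuming the common form in which the measure splits into fractions p and 1 - p on two halves of each interval, with a hypothetical weight p = 0.7 (not a value from the paper):

```python
import numpy as np

def cantor_multifractal(p, q):
    """f(alpha) spectrum of a two-scale Cantor measure.

    At each cascade step the measure splits into fractions p and 1 - p on
    the two halves of the interval, giving mass exponents
    tau(q) = -log2(p**q + (1-p)**q). alpha = d tau/d q and
    f = q*alpha - tau are evaluated analytically below.
    """
    q = np.asarray(q, dtype=float)
    z = p**q + (1.0 - p) ** q
    tau = -np.log2(z)
    alpha = -(p**q * np.log(p) + (1.0 - p) ** q * np.log(1.0 - p)) / (z * np.log(2.0))
    return alpha, q * alpha - tau

q = np.linspace(-10.0, 10.0, 401)
alpha, f = cantor_multifractal(0.7, q)   # p = 0.7 is a hypothetical weight
print(f"alpha in [{alpha.min():.3f}, {alpha.max():.3f}], peak f = {f.max():.3f}")
```

    The spectrum peaks at f = 1 (the support dimension in 1-D), and the width of the alpha interval measures the intermittency the model encodes.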

  16. Variable valve timing in a homogenous charge compression ignition engine

    DOEpatents

    Lawrence, Keith E.; Faletti, James J.; Funke, Steven J.; Maloney, Ronald P.

    2004-08-03

    The present invention relates generally to the field of homogeneous charge compression ignition engines, in which fuel is injected when the cylinder piston is relatively close to the bottom dead center position for its compression stroke. The fuel mixes with air in the cylinder during the compression stroke to create a relatively lean homogeneous mixture that preferably ignites when the piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage can result. The present invention utilizes internal exhaust gas recirculation and/or compression ratio control to control the timing of ignition events and combustion duration in homogeneous charge compression ignition engines. Thus, at least one electro-hydraulic assist actuator is provided that is capable of mechanically engaging at least one cam actuated intake and/or exhaust valve.
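    Why compression-ratio control shifts the ignition event can be illustrated with the ideal-gas adiabatic relation T2 = T1 * CR^(gamma - 1): a higher effective ratio raises the end-of-compression temperature, advancing autoignition. A minimal sketch with an assumed intake temperature and an assumed polytropic exponent (all values illustrative, not from the patent):

```python
def compression_temperature(t_intake_k, compression_ratio, gamma=1.35):
    """End-of-compression gas temperature for an ideal adiabatic stroke,
    T2 = T1 * CR**(gamma - 1). gamma = 1.35 is an assumed effective
    polytropic exponent for an air/fuel charge (illustrative only)."""
    return t_intake_k * compression_ratio ** (gamma - 1.0)

# Raising the effective compression ratio raises the temperature reached
# near top dead center, which advances the autoignition point.
for cr in (14, 16, 18):
    print(f"CR {cr}: {compression_temperature(330.0, cr):.0f} K")
```

    Trapping hot residual gas via internal EGR raises the effective T1 in the same relation, which is the other timing lever the patent describes.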

  17. Development of an efficient anaerobic co-digestion process for garbage, excreta, and septic tank sludge to create a resource recycling-oriented society.

    PubMed

    Sun, Zhao-Yong; Liu, Kai; Tan, Li; Tang, Yue-Qin; Kida, Kenji

    2017-03-01

    In order to develop a resource recycling-oriented society, an efficient anaerobic co-digestion process for garbage, excreta and septic tank sludge was studied based on the quantity of each biomass waste type discharged in Ooki machi, Japan. The anaerobic digestion characteristics of garbage, excreta and 5-fold condensed septic tank sludge (hereafter called condensed sludge) were determined separately. In single-stage mesophilic digestion, the excreta with lower C/N ratios yielded lower biogas volumes and accumulated higher volumes of volatile fatty acid (VFA). On the other hand, garbage allowed for a significantly larger volatile total solid (VTS) digestion efficiency as well as biogas yield by thermophilic digestion. Thus, a two-stage anaerobic co-digestion process consisting of thermophilic liquefaction and mesophilic digestion phases was proposed. In the thermophilic liquefaction of mixed condensed sludge and household garbage (wet mass ratio of 2.2:1), a maximum VTS loading rate of 24 g/L/d was achieved. In the mesophilic digestion of mixed liquefied material and excreta (wet mass ratio of 1:1), biogas yield reached approximately 570 ml/g-VTS fed with a methane content of 55% at a VTS loading rate of 1.0 g/L/d. The performance of the two-stage process was evaluated by comparing it with a single-stage process in which biomass wastes were treated separately. Biogas production by the two-stage process was found to increase by approximately 22.9%. These results demonstrate the effectiveness of a two-stage anaerobic co-digestion process in enhancing biogas production.

  18. Rapid homogeneous endothelialization of high aspect ratio microvascular networks.

    PubMed

    Naik, Nisarga; Hanjaya-Putra, Donny; Haller, Carolyn A; Allen, Mark G; Chaikof, Elliot L

    2015-08-01

    Microvascularization of an engineered tissue construct is necessary to ensure the nourishment and viability of the hosted cells. Microvascular constructs can be created by seeding the luminal surfaces of microfluidic channel arrays with endothelial cells. However, in a conventional flow-based system, the uniformity of endothelialization of such an engineered microvascular network is constrained by mass transfer of the cells through high length-to-diameter (L/D) aspect ratio microchannels. Moreover, given the inherent limitations of the initial seeding process to generate a uniform cell coating, the large surface-area-to-volume ratio of microfluidic systems demands long culture periods for the formation of confluent cellular microconduits. In this report, we describe the design of polydimethylsiloxane (PDMS) and poly(glycerol sebacate) (PGS) microvascular constructs with reentrant microchannels that facilitate rapid, spatially homogeneous endothelial cell seeding of high L/D (2 cm/35 μm; >550:1) aspect ratio microchannels. MEMS technology was employed for the fabrication of a monolithic, elastomeric, reentrant microvascular construct. Isotropic etching and PDMS micromolding yielded a near-cylindrical microvascular channel array. A 'stretch - seed - seal' operation was implemented for uniform incorporation of endothelial cells along the entire microvascular area of the construct, yielding endothelialized microvascular networks in less than 24 h. The feasibility of this endothelialization strategy and the uniformity of cellularization were established using confocal microscope imaging.

  19. Creating a Comprehensive, Efficient, and Sustainable Nuclear Regulatory Structure: A Process Report from the U.S. Department of Energy's Material Protection, Control and Accounting Program

    SciTech Connect

    Wright, Troy L.; O'Brien, Patricia E.; Hazel, Michael J.; Tuttle, John D.; Cunningham, Mitchel E.; Schlegel, Steven C.

    2010-08-11

    With the congressionally mandated January 1, 2013 deadline for the U.S. Department of Energy’s (DOE) Nuclear Material Protection, Control and Accounting (MPC&A) program to complete its transition of MPC&A responsibility to the Russian Federation, National Nuclear Security Administration (NNSA) management directed its MPC&A program managers and team leaders to demonstrate that work in ongoing programs would lead to successful and timely achievement of these milestones. In the spirit of planning for successful project completion, the NNSA review of the Russian regulatory development process confirmed the critical importance of an effective regulatory system to a sustainable nuclear protection regime and called for an analysis of the existing Russian regulatory structure and the identification of a plan to ensure a complete MPC&A regulatory foundation. This paper describes the systematic process used by DOE’s MPC&A Regulatory Development Project (RDP) to develop an effective and sustainable MPC&A regulatory structure in the Russian Federation. This nuclear regulatory system will address all non-military Category I and II nuclear materials at State Corporation for Atomic Energy “Rosatom,” the Federal Service for Ecological, Technological, and Nuclear Oversight (Rostechnadzor), the Federal Agency for Marine and River Transport (FAMRT, within the Ministry of Transportation), and the Ministry of Industry and Trade (Minpromtorg). The approach to ensuring a complete and comprehensive nuclear regulatory structure includes five sequential steps. The approach was adopted from DOE’s project management guidelines and was adapted to the regulatory development task by the RDP. The five steps in the Regulatory Development Process are: 1) Define MPC&A Structural Elements; 2) Analyze the existing regulatory documents using the identified Structural Elements; 3) Validate the analysis with Russian colleagues and define the list of documents to be developed; 4) Prioritize and

  20. BLENDING LOW ENRICHED URANIUM WITH DEPLETED URANIUM TO CREATE A SOURCE MATERIAL ORE THAT CAN BE PROCESSED FOR THE RECOVERY OF YELLOWCAKE AT A CONVENTIONAL URANIUM MILL

    SciTech Connect

    Schutt, Stephen M.; Hochstein, Ron F.; Frydenlund, David C.; Thompson, Anthony J.

    2003-02-27

    Throughout the United States Department of Energy (DOE) complex, there are a number of streams of low enriched uranium (LEU) that contain various trace contaminants. These surplus nuclear materials require processing in order to meet commercial fuel cycle specifications. To date, they have not been designated as waste for disposal at the DOE's Nevada Test Site (NTS). Currently, with no commercial outlet available, the DOE is evaluating treatment and disposal as the ultimate disposition path for these materials. This paper will describe an innovative program that will provide a solution to DOE that will allow disposition of these materials at a cost that will be competitive with treatment and disposal at the NTS, while at the same time recycling the material to recover a valuable energy resource (yellowcake) for reintroduction into the commercial nuclear fuel cycle. International Uranium (USA) Corporation (IUSA) and Nuclear Fuel Services, Inc. (NFS) have entered into a commercial relationship to pursue the development of this program. The program involves the design of a process and construction of a plant at NFS' site in Erwin, Tennessee, for the blending of contaminated LEU with depleted uranium (DU) to produce a uranium source material ore (USM Ore™). The USM Ore™ will then be further processed at IUSA's White Mesa Mill, located near Blanding, Utah, to produce conventional yellowcake, which can be delivered to conversion facilities, in the same manner as yellowcake that is produced from natural ores or other alternate feed materials. The primary source of feed for the business will be the significant sources of trace contaminated materials within the DOE complex. NFS has developed a dry blending process (DRYSM Process) to blend the surplus LEU material with DU at its Part 70 licensed facility, to produce USM Ore™ with a U235 content within the range of U235 concentrations for source material. By reducing the U235 content to source
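    The down-blending described here is governed by a simple U-235 mass balance; the sketch below solves it for the depleted-uranium mass required to reach a target assay. The enrichment values used are hypothetical placeholders, not figures from the program:

```python
def du_mass_needed(m_leu_kg, e_leu, e_du, e_target):
    """Depleted-uranium mass (kg) to blend with LEU so the mixture hits a
    target U-235 weight fraction, from the U-235 mass balance:
        m_leu*e_leu + m_du*e_du = (m_leu + m_du)*e_target
    """
    if not e_du < e_target < e_leu:
        raise ValueError("target assay must lie between the DU and LEU assays")
    return m_leu_kg * (e_leu - e_target) / (e_target - e_du)

# Hypothetical assays: 4.95% LEU diluted to 0.7% with 0.2% DU tails.
m_du = du_mass_needed(1.0, 0.0495, 0.002, 0.007)
print(f"{m_du:.2f} kg DU per kg LEU")
```

    The same balance, run in reverse, checks that a proposed blend ratio lands the product within the source-material assay range before milling.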

  1. Homogeneous Open Quantum Random Walks on a Lattice

    NASA Astrophysics Data System (ADS)

    Carbone, Raffaella; Pautrat, Yan

    2015-09-01

    We study open quantum random walks (OQRWs) for which the underlying graph is a lattice, and the generators of the walk are homogeneous in space. Using the results recently obtained in Carbone and Pautrat (Ann Henri Poincaré, 2015), we study the quantum trajectory associated with the OQRW, which is described by a position process and a state process. We obtain a central limit theorem and a large deviation principle for the position process. We study in detail the case of homogeneous OQRWs on the lattice, with internal space.
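    The quantum trajectory described here, a coupled position and internal-state process, can be simulated directly: at each step the internal density matrix rho is updated by one of the transition operators B_s, chosen with probability Tr(B_s rho B_s†). A minimal sketch on the one-dimensional lattice with a two-dimensional internal space and a hypothetical pair of transition operators (not those studied in the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Transition operators on the internal space C^2 for steps +1 and -1.
# Any pair satisfying B[+1]^† B[+1] + B[-1]^† B[-1] = I defines a valid
# homogeneous OQRW; this particular pair is a hypothetical example.
U = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
B = {+1: U @ np.diag([np.sqrt(0.6), np.sqrt(0.4)]),
     -1: U @ np.diag([np.sqrt(0.4), np.sqrt(0.6)])}

def trajectory(steps, rho):
    """One quantum trajectory: returns the final lattice position."""
    x = 0
    for _ in range(steps):
        probs = [float(np.trace(B[s] @ rho @ B[s].conj().T).real) for s in (+1, -1)]
        s = int(rng.choice([+1, -1], p=probs))
        rho = B[s] @ rho @ B[s].conj().T / probs[0 if s == 1 else 1]
        x += s
    return x

rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])          # pure internal state |0><0|
positions = [trajectory(100, rho0.copy()) for _ in range(200)]
print("sample mean of X_100:", np.mean(positions))
```

    Histogramming many such trajectories is the empirical counterpart of the central limit theorem for the position process.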

  2. ISOTOPE METHODS IN HOMOGENEOUS CATALYSIS.

    SciTech Connect

    BULLOCK,R.M.; BENDER,B.R.

    2000-12-01

    The use of isotope labels has had a fundamentally important role in the determination of mechanisms of homogeneously catalyzed reactions. Mechanistic data is valuable since it can assist in the design and rational improvement of homogeneous catalysts. There are several ways to use isotopes in mechanistic chemistry. Isotopes can be introduced into controlled experiments and followed to see where they go or don't go; in this way, Libby, Calvin, Taube and others used isotopes to elucidate mechanistic pathways for very different, yet important chemistries. Another important isotope method is the study of kinetic isotope effects (KIEs) and equilibrium isotope effects (EIEs). Here the mere observation of where a label winds up is no longer enough - what matters is how much slower (or faster) a labeled molecule reacts than the unlabeled material. The most careful studies essentially involve the measurement of isotope fractionation between a reference ground state and the transition state. Thus kinetic isotope effects provide unique data unavailable from other methods, since information about the transition state of a reaction is obtained. Because getting an experimental glimpse of transition states is really tantamount to understanding catalysis, kinetic isotope effects are very powerful.
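    As a concrete example of the transition-state information carried by a KIE, the semiclassical upper bound for a primary H/D effect follows from the zero-point-energy difference of the stretching mode lost at the transition state, k_H/k_D = exp(dZPE / kB T). A minimal sketch with typical C-H and C-D stretch wavenumbers (illustrative values, not tied to any system in this report):

```python
import math

H = 6.62607015e-34      # Planck constant, J s
C_CM = 2.99792458e10    # speed of light, cm/s (to pair with cm^-1)
KB = 1.380649e-23       # Boltzmann constant, J/K

def semiclassical_kie(nu_h_cm, nu_d_cm, temp_k):
    """Semiclassical primary H/D KIE from the zero-point-energy difference
    of one stretching mode assumed fully lost at the transition state
    (no tunnelling correction): k_H/k_D = exp(dZPE / kB T)."""
    dzpe = 0.5 * H * C_CM * (nu_h_cm - nu_d_cm)
    return math.exp(dzpe / (KB * temp_k))

# Typical C-H stretch ~2900 cm^-1; the C-D frequency scales by ~1/sqrt(2).
print(f"KIE at 298 K: {semiclassical_kie(2900.0, 2900.0 / math.sqrt(2.0), 298.0):.2f}")
```

    A measured KIE well above this bound is the classic signature of tunnelling; one well below it points to partial bond retention at the transition state.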

  3. An approximation for homogeneous freezing temperature of water droplets

    NASA Astrophysics Data System (ADS)

    O, K.-T.; Wood, R.

    2015-11-01

In this work, based on the well-known formulae of classical nucleation theory (CNT), the temperature T_{Nc=1}, at which the mean number of critical embryos inside a droplet is unity, is derived and proposed as a new approximation for the homogeneous freezing temperature of water droplets. Without consideration of the time dependence and stochastic nature of the ice nucleation process, the approximation T_{Nc=1} is able to reproduce the dependence of homogeneous freezing temperature on drop size and water activity of aqueous drops observed in a wide range of experimental studies. We use the T_{Nc=1} approximation to argue that the distribution of homogeneous freezing temperatures observed in the experiments may largely be explained by the spread in the size distribution of droplets used in the particular experiment. It thus appears that this approximation is useful for predicting homogeneous freezing temperatures of water droplets in the atmosphere.
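
Finding the temperature at which the mean critical-embryo count crosses unity is a one-dimensional root-finding problem. The sketch below solves it by bisection against a purely hypothetical embryo-count model (the functional form and constants are placeholders, not the CNT formulae from the paper); it reproduces the qualitative drop-size dependence described above.

```python
import math

def freezing_temperature(mean_embryos, t_lo=150.0, t_hi=273.15, tol=1e-6):
    """Bisect for the temperature T (K) at which mean_embryos(T)
    equals one.  mean_embryos must decrease monotonically with T,
    as the CNT embryo count does (more supercooling, more embryos)."""
    f = lambda t: mean_embryos(t) - 1.0
    assert f(t_lo) > 0.0 > f(t_hi), "root must be bracketed"
    while t_hi - t_lo > tol:
        mid = 0.5 * (t_lo + t_hi)
        if f(mid) > 0.0:
            t_lo = mid
        else:
            t_hi = mid
    return 0.5 * (t_lo + t_hi)

def toy_mean_embryos(temp_k, volume_um3=1000.0):
    """Hypothetical embryo count: grows exponentially with
    supercooling and scales with droplet volume."""
    return volume_um3 * math.exp((235.0 - temp_k) / 1.5)

big = freezing_temperature(toy_mean_embryos)
small = freezing_temperature(lambda t: toy_mean_embryos(t, volume_um3=10.0))
assert small < big   # larger droplets freeze at a warmer temperature
```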

  4. Create a Logo.

    ERIC Educational Resources Information Center

    Duchen, Gail

    2002-01-01

    Presents an art lesson that introduced students to graphic art as a career path. Explains that the students met a graphic artist and created a logo for a pretend client. Explains that the students researched logos. (CMK)

  5. Creating physics stars

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2013-07-01

    Korea has begun an ambitious 5bn plan to create 50 new institutes dedicated to fundamental research. Michael Banks meets physicist Se-Jung Oh, president of the Institute for Basic Science, to find out more.

  6. 7 CFR 58.623 - Homogenizer.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE....623 Homogenizer. Homogenizer shall comply with 3-A Sanitary Standards....

  7. On the decay of homogeneous isotropic turbulence

    NASA Astrophysics Data System (ADS)

    Skrbek, L.; Stalp, Steven R.

    2000-08-01

    wind tunnels and a water channel, the temporal decay of turbulence created by an oscillating grid in water and the decay of energy and vorticity created by a towed grid in a stationary sample of water. We also analyze decaying vorticity data we obtained in superfluid helium and show that decaying superfluid turbulence can be described classically. This paper offers a unified investigation of decaying isotropic, homogeneous turbulence that is based on accepted forms of the three-dimensional turbulent spectra and a variety of experimental decay data obtained in air, water, and superfluid helium.

  8. Invariant distributions on compact homogeneous spaces

    SciTech Connect

    Gorbatsevich, V V

    2013-12-31

    In this paper, we study distributions on compact homogeneous spaces, including invariant distributions and also distributions admitting a sub-Riemannian structure. We first consider distributions of dimension 1 and 2 on compact homogeneous spaces. After this, we study the cases of compact homogeneous spaces of dimension 2, 3, and 4 in detail. Invariant distributions on simply connected compact homogeneous spaces are also treated. Bibliography: 18 titles.

  9. Coherence delay augmented laser beam homogenizer

    DOEpatents

    Rasmussen, P.; Bernhardt, A.

    1993-06-29

The geometrical restrictions on a laser beam homogenizer are relaxed by using a coherence delay line to separate a coherent input beam into several components each having a path length difference equal to a multiple of the coherence length with respect to the other components. The components recombine incoherently at the output of the homogenizer, and the resultant beam has a more uniform spatial intensity suitable for microlithography and laser pantography. Also disclosed is a variable aperture homogenizer, and a liquid filled homogenizer.

  10. Coherence delay augmented laser beam homogenizer

    DOEpatents

    Rasmussen, Paul; Bernhardt, Anthony

    1993-01-01

The geometrical restrictions on a laser beam homogenizer are relaxed by using a coherence delay line to separate a coherent input beam into several components each having a path length difference equal to a multiple of the coherence length with respect to the other components. The components recombine incoherently at the output of the homogenizer, and the resultant beam has a more uniform spatial intensity suitable for microlithography and laser pantography. Also disclosed is a variable aperture homogenizer, and a liquid filled homogenizer.

  11. Homogenization of regional river dynamics by dams and global biodiversity implications.

    PubMed

    Poff, N Leroy; Olden, Julian D; Merritt, David M; Pepin, David M

    2007-04-03

Global biodiversity in river and riparian ecosystems is generated and maintained by geographic variation in stream processes and fluvial disturbance regimes, which largely reflect regional differences in climate and geology. Extensive construction of dams by humans has greatly dampened the seasonal and interannual streamflow variability of rivers, thereby altering natural dynamics in ecologically important flows on continental to global scales. The cumulative effects of modification to regional-scale environmental templates caused by dams are largely unexplored but of critical conservation importance. Here, we use 186 long-term streamflow records on intermediate-sized rivers across the continental United States to show that dams have homogenized the flow regimes on third- through seventh-order rivers in 16 historically distinctive hydrologic regions over the course of the 20th century. This regional homogenization occurs chiefly through modification of the magnitude and timing of ecologically critical high and low flows. For 317 undammed reference rivers, no evidence for homogenization was found, despite documented changes in regional precipitation over this period. With an estimated average density of one dam every 48 km of third- through seventh-order river channel in the United States, dams arguably have a continental-scale effect of homogenizing regionally distinct environmental templates, thereby creating conditions that favor the spread of cosmopolitan, nonindigenous species at the expense of locally adapted native biota. Quantitative analyses such as ours provide the basis for conservation and management actions aimed at restoring and maintaining native biodiversity and ecosystem function and resilience for regionally distinct ecosystems at continental to global scales.
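
Homogenization of this kind can be quantified as a shrinking distance between regional flow-regime signatures. A minimal sketch, using made-up metric vectors rather than the study's 186-river dataset:

```python
import itertools, math

def mean_pairwise_distance(regions):
    """Mean Euclidean distance between per-region flow-metric
    vectors (e.g. magnitudes of ecologically critical high and low
    flows).  A decline over time indicates homogenization of
    regionally distinct flow regimes."""
    pairs = list(itertools.combinations(regions, 2))
    return sum(math.dist(a, b) for a, b in pairs) / len(pairs)

# Hypothetical metric vectors: [high-flow magnitude, low-flow magnitude]
pre_dam = [[10.0, 1.0], [4.0, 2.5], [7.0, 0.5]]   # distinct regional regimes
post_dam = [[6.0, 1.8], [5.0, 2.0], [6.5, 1.5]]   # dampened, more alike
assert mean_pairwise_distance(post_dam) < mean_pairwise_distance(pre_dam)
```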

  12. Numerical experiments in homogeneous turbulence

    NASA Technical Reports Server (NTRS)

    Rogallo, R. S.

    1981-01-01

The direct simulation methods developed by Orszag and Patterson (1972) for isotropic turbulence were extended to homogeneous turbulence in an incompressible fluid subjected to uniform deformation or rotation. The results of simulations for irrotational strain (plane and axisymmetric), shear, rotation, and relaxation toward isotropy following axisymmetric strain are compared with linear theory and experimental data. Emphasis is placed on the shear flow because of its importance and because of the availability of accurate and detailed experimental data. The computed results are used to assess the accuracy of two popular models used in the closure of the Reynolds-stress equations. Data from a variety of the computed fields and the details of the numerical methods used in the simulation are also presented.

  13. Steps Towards a Homogenized Sub-Monthly Temperature Monitoring Tool

    NASA Astrophysics Data System (ADS)

    Rennie, J.; Kunkel, K.

    2015-12-01

Land surface air temperature products have been essential for monitoring the evolution of the climate system. Before a temperature dataset is included in such products, it is important that non-climatic influences be removed or corrected so that the dataset can be considered homogeneous. These inhomogeneities include changes in station location, instrumentation and observing practices. Very few datasets are free of these influences and therefore require homogenization schemes. While many homogenized products exist on the monthly time scale, few daily products exist, due to the complication of distinguishing break points that are truly inhomogeneous from those that occur by chance (for example, sharp changes due to synoptic conditions). Since there is a high demand for sub-monthly monitoring tools, there is a need to address these issues. The Global Historical Climatology Network - Daily dataset provides a strong foundation for monitoring the Earth's climate on the daily scale, and is the official archive of daily data in the United States. While the dataset adheres to a strict set of quality assurance checks, no daily adjustments are applied. However, this dataset lays the groundwork for other products distributed at NCEI-Asheville, including the climate divisional dataset (nClimDiv), the North American monthly homogenized product (Northam) and the 1981-2010 Normals. Since these downstream products already provide homogenization and base period schemes, it makes sense to combine these datasets to provide a sub-monthly monitoring tool for the United States. Using these existing datasets, monthly adjustments are applied to daily data, and anomalies are then created using a base climatology defined by the 1981-2010 Normals. Station data are then aggregated to the state level and then to regions defined by the National Climate Assessment. Ranks are then created to provide informational monitoring tools that could be of use for public dissemination. This presentation goes over the product, including
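
The combination step described here, monthly adjustments applied to daily data with anomalies taken against the 1981-2010 Normals, can be sketched as follows. The data layout and the helper function are hypothetical; the NCEI products do not expose this exact interface.

```python
def homogenized_daily_anomalies(daily, monthly_adjustment, normals):
    """Apply a station's monthly homogenization offsets to its daily
    temperatures, then express each day as an anomaly from its
    1981-2010 normal.  daily and normals map (month, day) to degrees;
    monthly_adjustment maps month to an offset."""
    anomalies = {}
    for (month, day), temp in daily.items():
        adjusted = temp + monthly_adjustment[month]   # same shift for every day in the month
        anomalies[(month, day)] = adjusted - normals[(month, day)]
    return anomalies

daily = {(7, 1): 30.2, (7, 2): 28.9}
adjustment = {7: -0.4}                 # e.g. correcting a station move
normals = {(7, 1): 28.5, (7, 2): 28.6}
print(homogenized_daily_anomalies(daily, adjustment, normals))
```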

  14. The Quality Control Algorithms Used in the Process of Creating the NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

The methodology and the results of the quality control (QC) process of the meteorological data from the Lightning Protection System (LPS) towers located at Kennedy Space Center (KSC) launch complex 39B (LC-39B) are documented in this paper. Meteorological data are used to design a launch vehicle, determine operational constraints, and apply defined constraints on day-of-launch (DOL). To accomplish these tasks properly, a representative climatological database of meteorological records is needed, one that represents the climate the vehicle will encounter. Numerous meteorological measurement towers exist at KSC; however, the engineering tasks need measurements at specific heights, some of which can only be provided by a few towers. Other than the LPS towers, Tower 313 is the only tower that provides observations up to 150 m. This tower is located approximately 3.5 km from LC-39B. In addition, the data need to be QC'ed to remove erroneous reports that could pollute the results of an engineering analysis, mislead the development of operational constraints, or provide a false image of the atmosphere at the tower's location.
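
Two of the simplest checks in a QC scheme of this kind are a gross range test and a step (rate-of-change) test. The thresholds below are illustrative placeholders, not the values used for the LPS tower database.

```python
def qc_flags(temps_c, lo=-10.0, hi=45.0, max_step=5.0):
    """Flag suspect tower temperature reports: values outside a
    plausible climatological range, or jumps between consecutive
    reports larger than max_step."""
    flags = []
    for i, t in enumerate(temps_c):
        bad_range = not (lo <= t <= hi)
        bad_step = i > 0 and abs(t - temps_c[i - 1]) > max_step
        flags.append(bad_range or bad_step)
    return flags

# The 57.0 spike fails the range check, and the report after it
# fails the step check because it jumps back away from the spike.
print(qc_flags([24.1, 24.3, 57.0, 24.6]))  # -> [False, False, True, True]
```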

  15. Rh(I)-catalyzed transformation of propargyl vinyl ethers into (E,Z)-dienals: stereoelectronic role of trans effect in a metal-mediated pericyclic process and a shift from homogeneous to heterogeneous catalysis during a one-pot reaction.

    PubMed

    Vidhani, Dinesh V; Krafft, Marie E; Alabugin, Igor V

    2014-01-03

    The combination of experiments and computations reveals unusual features of stereoselective Rh(I)-catalyzed transformation of propargyl vinyl ethers into (E,Z)-dienals. The first step, the conversion of propargyl vinyl ethers into allene aldehydes, proceeds under homogeneous conditions via a "cyclization-mediated" mechanism initiated by Rh(I) coordination at the alkyne. This path agrees well with the small experimental effects of substituents on the carbinol carbon. The key feature revealed by the computational study is the stereoelectronic effect of the ligand arrangement at the catalytic center. The rearrangement barriers significantly decrease due to the greater transfer of electron density from the catalytic metal center to the CO ligand oriented trans to the alkyne. This effect increases electrophilicity of the metal and lowers the calculated barriers by 9.0 kcal/mol. Subsequent evolution of the catalyst leads to the in situ formation of Rh(I) nanoclusters that catalyze stereoselective tautomerization. The intermediacy of heterogeneous catalysis by nanoclusters was confirmed by mercury poisoning, temperature-dependent sigmoidal kinetic curves, and dynamic light scattering. The combination of experiments and computations suggests that the initially formed allene-aldehyde product assists in the transformation of a homogeneous catalyst (or "a cocktail of catalysts") into nanoclusters, which in turn catalyze and control the stereochemistry of subsequent transformations.

  16. Comparative Analysis of a MOOC and a Residential Community Using Introductory College Physics: Documenting How Learning Environments Are Created, Lessons Learned in the Process, and Measurable Outcomes

    NASA Astrophysics Data System (ADS)

    Olsen, Jack Ryan

Higher education institutions, such as the University of Colorado Boulder (CU-Boulder), have as a core mission to advance their students' academic performance. On the frontier of education technologies that hold the promise to address our educational mission are Massively Open Online Courses (MOOCs), which are new enough to not be fully understood or well-researched. MOOCs, in theory, have vast potential for being cost-effective and for reaching diverse audiences across the world. This thesis examines the implementation of one MOOC, Physics 1 for Physical Science Majors, implemented in the inaugural round of institutionally sanctioned MOOCs in Fall 2013. While comparatively inexpensive relative to a brick-and-mortar course, and while it initially enrolled nearly 16,000 students, this MOOC was found to be time-consuming to implement, and only roughly 1.5% of those who enrolled completed the course---approximately 1/4 of those who completed the standard brick-and-mortar course that the MOOC was designed around. An established education technology, residential communities, contrasts with MOOCs by being high-touch and highly humanized, but also expensive and locally based. The Andrews Hall Residential College (AHRC) on the CU campus fosters academic success and retention by engaging and networking students outside of the standard brick-and-mortar courses and enculturating students into an environment with vertical integration through the different classes: freshman, sophomore, junior, etc. The physics MOOC and the AHRC were studied to determine how the environments were made and what lessons were learned in the process. Also, student performance was compared for the physics MOOC, a subset of the AHRC students enrolled in a special physics course, and the standard CU Physics 1 brick-and-mortar course. All yielded similar learning gains for Physics 1 performance, for those who completed the courses.
These environments are presented together to compare and contrast their

  17. Creating Dialogue by Storytelling

    ERIC Educational Resources Information Center

    Passila, Anne; Oikarinen, Tuija; Kallio, Anne

    2013-01-01

    Purpose: The objective of this paper is to develop practice and theory from Augusto Boal's dialogue technique (Image Theatre) for organisational use. The paper aims to examine how the members in an organisation create dialogue together by using a dramaturgical storytelling framework where the dialogue emerges from storytelling facilitated by…

  18. Creating Photo Illustrations.

    ERIC Educational Resources Information Center

    Wilson, Bradley

    2003-01-01

    Explains the uses of photo illustrations. Notes that the key to developing a successful photo illustration is collaborative planning. Outlines the following guidelines for photo illustrations: never set up a photograph to mimic reality; create only abstractions with photo illustrations; clearly label photo illustrations; and never play photo…

  19. Looking, Writing, Creating.

    ERIC Educational Resources Information Center

    Katzive, Bonnie

    1997-01-01

    Describes how a middle school language arts teacher makes analyzing and creating visual art a partner to reading and writing in her classroom. Describes a project on art and Vietnam which shows how background information can add to and influence interpretation. Describes a unit on Greek mythology and Greek vases which leads to a related visual…

  20. [Teenagers creating art].

    PubMed

    Ahovi, Jonathan; Viverge, Agathe

    Teenagers need to interpret the world around them, sometimes in a completely different way to that in which, as children, they represented external reality. Some like drawing. They can use it to express their thoughts on death, sexuality or freedom. Their creative capacities are immense: they are creating art.

  1. Creating a Classroom Makerspace

    ERIC Educational Resources Information Center

    Rivas, Luz

    2014-01-01

    What is a makerspace? Makerspaces are community-operated physical spaces where people (makers) create do-it-yourself projects together. These membership spaces serve as community labs where people learn together and collaborate on projects. Makerspaces often have tools and equipment like 3-D printers, laser cutters, and soldering irons.…

  2. Creating a Market.

    ERIC Educational Resources Information Center

    Kazimirski, J.; And Others

    The second in a series of programmed books, "Creating a Market" is published by the International Labour Office as a manual for persons studying marketing. This manual was designed to meet the needs of the labor organization's technical cooperation programs and is primarily concerned with consumer goods industries. Using a fill-in-the-blanks and…

  3. Creating Quality Schools.

    ERIC Educational Resources Information Center

    American Association of School Administrators, Arlington, VA.

    This booklet presents information on how total quality management can be applied to school systems to create educational improvement. Total quality management offers education a systemic approach and a new set of assessment tools. Chapter 1 provides a definition and historical overview of total quality management. Chapter 2 views the school…

  4. Creating Special Events

    ERIC Educational Resources Information Center

    deLisle, Lee

    2009-01-01

    "Creating Special Events" is organized as a systematic approach to festivals and events for students who seek a career in event management. This book looks at the evolution and history of festivals and events and proceeds to the nuts and bolts of event management. The book presents event management as the means of planning, organizing, directing,…

  5. Creating an Assessments Library

    ERIC Educational Resources Information Center

    Duncan, Greg; Gilbert, Jacqueline; Mackenzie, Mary; Meulener, Carol; Smith, Martin; Yetman, Beatrice; Zeppieri, Rosanne

    2006-01-01

    This article presents the steps taken over three years (2003-2006) by the Consortium for Assessing Performance Standards, a New Jersey Grant Project to create a database of thematically organized, integrated performance assessment tasks at the benchmark levels of proficiency, novice-mid, intermediate-low and pre-advanced as defined by the ACTFL…

  6. Creating Historical Drama.

    ERIC Educational Resources Information Center

    Cassler, Robert

    1990-01-01

    Describes creating for the National Archives Public Education Department a historical drama, "Second in the Realm," based on the story of the Magna Carta. Demonstrates the effectiveness of historical drama as a teaching tool. Explains the difficulties of writing such dramas and provides guidelines for overcoming these problems. (NL)

  7. Creating dedicated bioenergy crops

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Bioenergy is one of the current mechanisms of producing renewable energy to reduce our use of nonrenewable fossil fuels and to reduce carbon emissions into the atmosphere. Humans have been using bioenergy since we first learned to create and control fire - burning manure, peat, and wood to cook food...

  8. Creating a Virtual Gymnasium

    ERIC Educational Resources Information Center

    Fiorentino, Leah H.; Castelli, Darla

    2005-01-01

    Physical educators struggle with the challenges of assessing student performance, providing feedback about motor skills, and creating opportunities for all students to engage in game-play on a daily basis. The integration of technology in the gymnasium can address some of these challenges by improving teacher efficiency and increasing student…

  9. Creating Pupils' Internet Magazine

    ERIC Educational Resources Information Center

    Bognar, Branko; Šimic, Vesna

    2014-01-01

    This article presents an action research, which aimed to improve pupils' literary creativity and enable them to use computers connected to the internet. The study was conducted in a small district village school in Croatia. Creating a pupils' internet magazine appeared to be an excellent way for achieving the educational aims of almost all…

  10. Creating a Classroom Newspaper.

    ERIC Educational Resources Information Center

    Buss, Kathleen, Ed.; McClain-Ruelle, Leslie, Ed.

    Based on the premise that students can learn a great deal by reading and writing a newspaper, this book was created by preservice instructors to teach upper elementary students (grades 3-5) newspaper concepts, journalism, and how to write newspaper articles. It shows how to use newspaper concepts to help students integrate knowledge from multiple…

  11. Creating Motivating Job Aids.

    ERIC Educational Resources Information Center

    Tilaro, Angie; Rossett, Allison

    1993-01-01

    Explains how to create job aids that employees will be motivated to use, based on a review of pertinent literature and interviews with professionals. Topics addressed include linking motivation with job aids; Keller's ARCS (Attention, Relevance, Confidence, Satisfaction) model of motivation; and design strategies for job aids based on Keller's…

  12. How Banks Create Money.

    ERIC Educational Resources Information Center

    Beale, Lyndi

    This teaching module explains how the U.S. banking system uses excess reserves to create money in the form of new deposits for borrowers. The module is part of a computer-animated series of four-to-five-minute modules illustrating standard concepts in high school economics. Although the module is designed to accompany the video program, it may be…

  13. Cryogenic homogenization and sampling of heterogeneous multi-phase feedstock

    DOEpatents

    Doyle, Glenn Michael; Ideker, Virgene Linda; Siegwarth, James David

    2002-01-01

An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock, reducing the temperature of the feedstock to a temperature below a critical temperature, reducing the size of the feedstock components, blending the reduced size feedstock to form a homogeneous mixture; and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (-196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.

  14. Homogeneous modes of cosmological instantons

    SciTech Connect

    Gratton, Steven; Turok, Neil

    2001-06-15

We discuss the O(4) invariant perturbation modes of cosmological instantons. These modes are spatially homogeneous in Lorentzian spacetime and thus not relevant to density perturbations. But their properties are important in establishing the meaning of the Euclidean path integral. If negative modes are present, the Euclidean path integral is not well defined, but may nevertheless be useful in an approximate description of the decay of an unstable state. When gravitational dynamics is included, counting negative modes requires a careful treatment of the conformal factor problem. We demonstrate that for an appropriate choice of coordinate on phase space, the second order Euclidean action is bounded below for normalized perturbations and has a finite number of negative modes. We prove that there is a negative mode for many gravitational instantons of the Hawking-Moss or Coleman-De Luccia type, and discuss the associated spectral flow. We also investigate Hawking-Turok constrained instantons, which occur in a generic inflationary model. Implementing the regularization and constraint proposed by Kirklin, Turok and Wiseman, we find that those instantons leading to substantial inflation do not possess negative modes. Using an alternate regularization and constraint motivated by reduction from five dimensions, we find a negative mode is present. These investigations shed new light on the suitability of Euclidean quantum gravity as a potential description of our universe.

  15. Creating Multiple Processes from Multiple Intelligences.

    ERIC Educational Resources Information Center

    Wolffe, Robert; Robinson, Helja; Grant, Jean Marie

    1998-01-01

    Howard Gardner's multiple-intelligences theory stresses that all humans possess the various intelligences (linguistic, logical-mathematical, spatial, bodily-kinesthetic, musical, interpersonal, intrapersonal, and naturalist) to differing degrees, and most people can attain adequate competency levels. This article provides a sample checklist for…

  16. Exploring earthquake databases for the creation of magnitude-homogeneous catalogues: tools for application on a regional and global scale

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Pagani, M.; Garcia, J.

    2016-09-01

The creation of a magnitude-homogenized catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenizing multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Centre (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilize this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonize magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonized into moment magnitude to form a catalogue of more than 562 840 events. This extended catalogue, while not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
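
The core of such a homogenization workflow, fitting an empirical conversion on events common to two bulletins and then applying it to the rest, can be sketched in a few lines. This is a minimal illustration with made-up magnitude pairs, not the API of the tools described in the abstract.

```python
def fit_conversion(pairs):
    """Ordinary least squares for Mw = a * Ml + b, fitted on events
    common to a local bulletin (Ml) and a moment-magnitude
    catalogue (Mw)."""
    n = len(pairs)
    sx = sum(x for x, _ in pairs)
    sy = sum(y for _, y in pairs)
    sxx = sum(x * x for x, _ in pairs)
    sxy = sum(x * y for x, y in pairs)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Hypothetical paired magnitudes for events present in both catalogues
pairs = [(4.0, 3.8), (5.0, 4.9), (6.0, 6.0)]
a, b = fit_conversion(pairs)
# Convert the remaining local-magnitude events into the target scale
harmonized = [round(a * ml + b, 2) for ml in [4.5, 5.5]]
print(harmonized)  # -> [4.35, 5.45]
```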

  17. Creating Geoscience Leaders

    NASA Astrophysics Data System (ADS)

    Buskop, J.; Buskop, W.

    2013-12-01

The United Nations Educational, Scientific, and Cultural Organization recognizes 21 World Heritage Sites in the United States, ten of which have astounding geological features: Wrangell-St. Elias National Park, Olympic National Park, Mesa Verde National Park, Chaco Canyon, Glacier National Park, Carlsbad Caverns National Park, Mammoth Cave, Great Smoky Mountains National Park, Hawaii Volcanoes National Park, and Everglades National Park. Frustrated by fellow students who were addicted to smartphones and showed an extreme lack of interest in the geosciences, one student visited each World Heritage Site in the United States and created one e-book chapter per park. Each chapter was created with original photographs and a geological discovery hunt to encourage teen involvement in preserving remarkable geological sites. Each chapter describes at least one way young adults can get involved with the geosciences, such as cave geology, glaciology, hydrology, and volcanology. The e-book describes one park per chapter, each chapter providing a geological discovery hunt, information on how to get involved with conservation of the parks, geological maps of the parks, parallels between archaeological and geological sites, and how to talk to a ranger. The young author is approaching UNESCO to publish the work as a free e-book to encourage involvement in UNESCO sites and to prove that the geosciences are fun.

  18. The Homogenization and Optimization of Thermoelectric Composites

    DTIC Science & Technology

    2015-04-17

AFRL-OSR-VA-TR-2015-0090. The Homogenization and Optimization of Thermoelectric Composites. Jiangyu Li, University of Washington. Final report, 04/17/2015; grant number FA9550-12-1-0325. ...behavior of thermoelectric composites using rigorous homogenization technique in this project. In the last three years, our accomplishment includes: (1

  19. Homogeneous catalysts in hypersonic combustion

    SciTech Connect

    Harradine, D.M.; Lyman, J.L.; Oldenborg, R.C.; Pack, R.T.; Schott, G.L.

    1989-01-01

Density and residence time both become unfavorably small for efficient combustion of hydrogen fuel in ramjet propulsion in air at high altitude and hypersonic speed. Raising the density and increasing the transit time of the air through the engine necessitates stronger contraction of the air flow area. This enhances the kinetic and thermodynamic tendency of H2O to form completely, accompanied only by N2 and any excess H2 (or O2). The by-products to be avoided are the energetically expensive fragment species H and/or O atoms and OH radicals, and residual (2H2 plus O2). However, excessive area contraction raises air temperature and consequent combustion-product temperature by adiabatic compression. This counteracts and ultimately overwhelms the thermodynamic benefit by which higher density favors the triatomic product, H2O, over its monatomic and diatomic alternatives. For static pressures in the neighborhood of 1 atm, static temperature must be kept or brought below ca. 2400 K for acceptable stability of H2O. Another measure, whose requisite chemistry we address here, is to extract propulsive work from the combustion products early in the expansion. The objective is to lower the static temperature of the combustion stream enough for H2O to become adequately stable before the exhaust flow is massively expanded and its composition "frozen." We proceed to address this mechanism and its kinetics, and then examine prospects for enhancing its rate by homogeneous catalysts. 9 refs.

  20. Influence of homogenization treatment on physicochemical properties and enzymatic hydrolysis rate of pure cellulose fibers.

    PubMed

    Jacquet, N; Vanderghem, C; Danthine, S; Blecker, C; Paquot, M

    2013-02-01

    The aim of this study is to compare the effect of different homogenization treatments on the physicochemical properties and the hydrolysis rate of a pure bleached cellulose. Results obtained show that homogenization treatments improve the enzymatic hydrolysis rate of the cellulose fibers by 25 to 100%, depending on the homogenization treatment applied. Characterization of the samples also showed that homogenization had an impact on some physicochemical properties of the cellulose. For moderate treatment intensities (pressure below 500 bar and degree of homogenization below 25), an increase in water retention values (WRV) that correlated with the increase in the hydrolysis rate was highlighted. Results also showed that the overall crystallinity of the cellulose appeared not to be impacted by the homogenization treatment. For higher treatment intensities, homogenized cellulose samples developed a stable three-dimensional network that decreases cellulase mobility and slows down the hydrolysis process.

  1. Effect of heat and homogenization on in vitro digestion of milk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Central to commercial fluid milk processing is the use of high temperature, short time (HTST) pasteurization to ensure the safety and quality of milk, and homogenization to prevent creaming of fat-containing milk. UHT processed homogenized milk is also available commercially and is typically used to...

  2. Cell-Laden Poly(ɛ-caprolactone)/Alginate Hybrid Scaffolds Fabricated by an Aerosol Cross-Linking Process for Obtaining Homogeneous Cell Distribution: Fabrication, Seeding Efficiency, and Cell Proliferation and Distribution

    PubMed Central

    Lee, HyeongJin; Ahn, SeungHyun; Bonassar, Lawrence J.; Chun, Wook

    2013-01-01

    Generally, solid-freeform fabricated scaffolds show a controllable pore structure (pore size, porosity, pore connectivity, and permeability) and mechanical properties by using computer-aided techniques. Although the scaffolds can provide repeated and appropriate pore structures for tissue regeneration, they have a low biological activity, such as low cell-seeding efficiency and nonuniform cell density in the scaffold interior after a long culture period, due to a large pore size and completely open pores. Here we fabricated three different poly(ɛ-caprolactone) (PCL)/alginate scaffolds: (1) a rapid prototyped porous PCL scaffold coated with an alginate, (2) the same PCL scaffold coated with a mixture of alginate and cells, and (3) a multidispensed hybrid PCL/alginate scaffold embedded with cell-laden alginate struts. The three scaffolds had similar micropore structures (pore size=430–580 μm, porosity=62%–68%, square pore shape). Preosteoblast cells (MC3T3-E1) were used at the same cell density in each scaffold. By measuring cell-seeding efficiency, cell viability, and cell distribution after various periods of culturing, we sought to determine which scaffold was more appropriate for homogeneously regenerated tissues. PMID:23469894

  3. A tree-based model for homogeneous groupings of multinomials.

    PubMed

    Yang, Tae Young

    2005-11-30

    The motivation of this paper is to provide a tree-based method for grouping multinomial data according to their classification probability vectors. We produce an initial tree by binary recursive partitioning whereby multinomials are successively split into two subsets and the splits are determined by maximizing the likelihood function. If the number of multinomials k is too large, we propose to order the multinomials, and then build the initial tree based on a dramatically smaller number k-1 of possible splits. The tree is then pruned from the bottom up. The pruning process involves a sequence of hypothesis tests of a single homogeneous group against the alternative that there are two distinct, internally homogeneous groups. As pruning criteria, the Bayesian information criterion and the Wilcoxon rank-sum test are proposed. The tree-based model is illustrated on genetic sequence data. Homogeneous groupings of genetic sequences present new opportunities to understand and align these sequences.
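    The order-then-split idea described above can be sketched as follows; the function names and the exact BIC penalty form are illustrative assumptions, not the paper's actual implementation. Each of the k-1 ordered splits is scored by the summed log-likelihood of two pooled groups, and a split is kept only if the two-group BIC beats the one-group BIC.

```python
import numpy as np

def loglik(counts):
    # Log-likelihood of a group of multinomials under one pooled MLE vector
    obs = counts.sum(axis=0).astype(float)
    p = obs / obs.sum()
    mask = obs > 0
    return float((obs[mask] * np.log(p[mask])).sum())

def best_split(counts):
    # Try the k-1 splits of the (pre-ordered) multinomials into two groups
    k = counts.shape[0]
    best_s, best_ll = 1, -np.inf
    for s in range(1, k):
        ll = loglik(counts[:s]) + loglik(counts[s:])
        if ll > best_ll:
            best_s, best_ll = s, ll
    return best_s, best_ll

def bic_prefers_split(counts, m_free):
    # Prune test: one homogeneous group vs. two internally homogeneous groups,
    # compared by BIC (m_free = free parameters per group, i.e. m - 1 categories)
    n = counts.sum()
    s, ll2 = best_split(counts)
    bic_one = -2 * loglik(counts) + m_free * np.log(n)
    bic_two = -2 * ll2 + 2 * m_free * np.log(n)
    return bic_two < bic_one, s
```

    With two clearly distinct clusters of count vectors, the split between them wins the BIC comparison; with homogeneous data, the single group is retained.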

  4. Creating new growth platforms.

    PubMed

    Laurie, Donald L; Doz, Yves L; Sheer, Claude P

    2006-05-01

    Sooner or later, most companies can't attain the growth rates expected by their boards and CEOs and demanded by investors. To some extent, such businesses are victims of their own successes. Many were able to sustain high growth rates for a long time because they were in high-growth industries. But once those industries slowed down, the businesses could no longer deliver the performance that investors had come to take for granted. Often, companies have resorted to acquisition, though this strategy has a discouraging track record. Over time, 65% of acquisitions destroy more value than they create. So where does real growth come from? For the past 12 years, the authors have been researching and advising companies on this issue. With the support of researchers at Harvard Business School and Insead, they instituted a project titled "The CEO Agenda and Growth". They identified and approached 24 companies that had achieved significant organic growth and interviewed their CEOs, chief strategists, heads of R&D, CFOs, and top-line managers. They asked, "Where does your growth come from?" and found a consistent pattern in the answers. All the businesses grew by creating new growth platforms (NGPs) on which they could build families of products and services and extend their capabilities into multiple new domains. Identifying NGP opportunities calls for executives to challenge conventional wisdom. In all the companies studied, top management believed that NGP innovation differed significantly from traditional product or service innovation. They had independent, senior-level units with a standing responsibility to create NGPs, and their CEOs spent as much as 50% of their time working with these units. The payoff has been spectacular and lasting. For example, from 1985 to 2004, the medical devices company Medtronic grew revenues at 18% per year, earnings at 20%, and market capitalization at 30%.

  5. Context homogeneity facilitates both distractor inhibition and target enhancement.

    PubMed

    Feldmann-Wüstefeld, Tobias; Schubö, Anna

    2013-05-06

    Homogeneous contexts were shown to result in prioritized processing of embedded targets compared to heterogeneous contexts (Duncan & Humphreys, 1989). The present experiment used behavioral and ERP measures to examine whether context homogeneity affects both the enhancement of relevant information and the inhibition of irrelevant information. Targets and distractors were presented laterally or on the vertical midline, which allowed disentangling target- and distractor-related activity in the lateralized ERP (Hickey, Di Lollo, & McDonald, 2009). In homogeneous contexts, targets elicited an NT component from 150 ms on and a PD component from 200 ms on, showing early attention deployment at target locations and active suppression of distractors. In heterogeneous contexts, an NT component was also found from 150 ms on, and a PD was found from 250 ms on, suggesting delayed suppression of the distractor. Before 250 ms, distractors in heterogeneous contexts elicited a contralateral negativity, indicating attentional capture by the distractor prior to active suppression. In sum, the present results suggest that top-down control of attention is more pronounced in homogeneous than in heterogeneous contexts.

  6. Creating healthy camp experiences.

    PubMed

    Walton, Edward A; Tothy, Alison S

    2011-04-01

    The American Academy of Pediatrics has created recommendations for health appraisal and preparation of young people before participation in day or resident camps and to guide health and safety practices for children at camp. These recommendations are intended for parents, primary health care providers, and camp administration and health center staff. Although camps have diverse environments, there are general guidelines that apply to all situations and specific recommendations that are appropriate under special conditions. This policy statement has been reviewed and is supported by the American Camp Association.

  7. Creating a practice website.

    PubMed

    Downes, P K

    2007-05-26

    A website is a window to the outside world. For a dental practice, it may be the first point of contact for a prospective new patient and will therefore provide them with their 'first impression'; this may be days or weeks before actually visiting the practice. This section considers the different ways of creating a dental practice website and lists some of the main dental website design companies. It also describes what factors make a successful website and offers advice on how to ensure that it complies with current regulations and recommendations.

  8. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratiannil, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.; Willett, K.

    2013-09-01

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including i) the centered root mean square error relative to the true homogeneous values at various averaging scales, ii) the error in linear trend estimates and iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that currently automatic algorithms can perform as well as manual ones.
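    The first two performance metrics named above can be illustrated with a minimal sketch; the function names are assumptions for illustration, not HOME's actual scoring code. The centered RMSE ignores a constant offset between the homogenized and true series, and the trend error compares ordinary least-squares slopes.

```python
import numpy as np

def centered_rmse(homogenized, truth):
    # Centered RMSE: subtract each series' own mean, so a constant offset
    # (an arbitrary reference level) does not count as an error
    h = homogenized - np.mean(homogenized)
    t = truth - np.mean(truth)
    return float(np.sqrt(np.mean((h - t) ** 2)))

def trend_error(homogenized, truth):
    # Error in the OLS linear trend estimate, in units per time step
    x = np.arange(len(truth))
    return float(np.polyfit(x, homogenized, 1)[0] - np.polyfit(x, truth, 1)[0])
```

    A homogenized series that differs from the truth only by a constant scores zero on both metrics, while a spurious drift shows up directly as a trend error.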

  9. Creating corporate advantage.

    PubMed

    Collis, D J; Montgomery, C A

    1998-01-01

    What differentiates truly great corporate strategies from the merely adequate? How can executives at the corporate level create tangible advantage for their businesses that makes the whole more than the sum of the parts? This article presents a comprehensive framework for value creation in the multibusiness company. It addresses the most fundamental questions of corporate strategy: What businesses should a company be in? How should it coordinate activities across businesses? What role should the corporate office play? How should the corporation measure and control performance? Through detailed case studies of Tyco International, Sharp, the Newell Company, and Saatchi and Saatchi, the authors demonstrate that the answers to all those questions are driven largely by the nature of a company's special resources--its assets, skills, and capabilities. These range along a continuum from the highly specialized at one end to the very general at the other. A corporation's location on the continuum constrains the set of businesses it should compete in and limits its choices about the design of its organization. Applying the framework, the authors point out the common mistakes that result from misaligned corporate strategies. Companies mistakenly enter businesses based on similarities in products rather than the resources that contribute to competitive advantage in each business. Instead of tailoring organizational structures and systems to the needs of a particular strategy, they create plain-vanilla corporate offices and infrastructures. The company examples demonstrate that one size does not fit all. One can find great corporate strategies all along the continuum.

  10. Are geological media homogeneous or heterogeneous for neutron investigations?

    PubMed

    Woźnicka, U; Drozdowicz, K; Gabańska, B; Krynicka, E; Igielski, A

    2003-01-01

    The thermal neutron absorption cross section of a heterogeneous material is lower than that of the corresponding homogeneous one which contains the same components. When rock materials are investigated, the sample usually contains grains, which create heterogeneity. The heterogeneity effect depends on the mass contributions of the highly and weakly absorbing centers, on the ratio of their absorption cross sections, and on their sizes. The influence of the granulation of silicon and diabase samples on the absorption cross section measured with Czubek's method has been experimentally investigated. A 20% underestimation of the absorption cross section has been observed for diabase grains of sizes from 6.3 to 12.8 mm.

  11. Creating breakthroughs at 3M.

    PubMed

    von Hippel, E; Thomke, S; Sonnack, M

    1999-01-01

    Most senior managers want their product development teams to create break-throughs--new products that will allow their companies to grow rapidly and maintain high margins. But more often they get incremental improvements to existing products. That's partly because companies must compete in the short term. Searching for breakthroughs is expensive and time consuming; line extensions can help the bottom line immediately. In addition, developers simply don't know how to achieve breakthroughs, and there is usually no system in place to guide them. By the mid-1990s, the lack of such a system was a problem even for an innovative company like 3M. Then a project team in 3M's Medical-Surgical Markets Division became acquainted with a method for developing breakthrough products: the lead user process. The process is based on the fact that many commercially important products are initially thought of and even prototyped by "lead users"--companies, organizations, or individuals that are well ahead of market trends. Their needs are so far beyond those of the average user that lead users create innovations on their own that may later contribute to commercially attractive breakthroughs. The lead user process transforms the job of inventing breakthroughs into a systematic task of identifying lead users and learning from them. The authors explain the process and how the 3M project team successfully navigated through it. In the end, the team proposed three major new product lines and a change in the division's strategy that has led to the development of breakthrough products. And now several more divisions are using the process to break away from incrementalism.

  12. Creating virtual ARDS patients.

    PubMed

    Das, Anup; Haque, Mainul; Chikhani, Marc; Wenfei Wang; Hardman, Jonathan G; Bates, Declan G

    2016-08-01

    This paper presents the methodology used in patient-specific calibration of a novel highly integrated model of the cardiovascular and pulmonary pathophysiology associated with Acute Respiratory Distress Syndrome (ARDS). We focus on data from previously published clinical trials on the static and dynamic cardio-pulmonary responses of three ARDS patients to changes in ventilator settings. From this data, the parameters of the integrated model were identified using an optimization-based methodology in multiple stages. Computational simulations confirm that the resulting model outputs accurately reproduce the available clinical data. Our results open up the possibility of creating in silico a biobank of virtual ARDS patients that could be used to evaluate current, and investigate novel, therapeutic strategies.

  13. Creating With Carbon

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A subsidiary of SI Diamond Technology, Inc., Applied Nanotech, of Austin, Texas, is creating a buzz among various technology firms and venture capital groups interested in the company s progressive research on carbon-related field emission devices, including carbon nanotubes, filaments of pure carbon less than one ten-thousandth the width of human hair. Since their discovery in 1991, carbon nanotubes have gained considerable attention due to their unique physical properties. For example, a single perfect carbon nanotube can range from 10 to 100 times stronger than steel, per unit weight. Recent studies also indicate that the nanotubes may be the best heat-conducting material in existence. These properties, combined with the ease of growing thin films or nanotubes by a variety of deposition techniques, make the carbon-based material one of the most desirable for cold field emission cathodes.

  14. Energy cost of creating quantum coherence

    NASA Astrophysics Data System (ADS)

    Misra, Avijit; Singh, Uttam; Bhattacharya, Samyadeb; Pati, Arun Kumar

    2016-05-01

    We consider physical situations where the resource theories of coherence and thermodynamics play competing roles. In particular, we study the creation of quantum coherence using unitary operations with limited thermodynamic resources. We find the maximal coherence that can be created under unitary operations starting from a thermal state and find explicitly the unitary transformation that creates the maximal coherence. Since coherence is created by unitary operations starting from a thermal state, it requires some amount of energy. This motivates us to explore the trade-off between the amount of coherence that can be created and the energy cost of the unitary process. We also find the maximal achievable coherence under the constraint on the available energy. Additionally, we compare the maximal coherence and the maximal total correlation that can be created under unitary transformations with the same available energy at our disposal. We find that when maximal coherence is created with limited energy, the total correlation created in the process is upper bounded by the maximal coherence, and vice versa. For two-qubit systems we show that no unitary transformation exists that creates the maximal coherence and maximal total correlation simultaneously with a limited energy cost.
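    The trade-off described above can be illustrated for the simplest case of a single qubit; the energy gap E, the real-rotation parametrization of the unitary, and the l1-norm coherence measure below are illustrative assumptions, not the paper's general d-dimensional treatment. A rotation by π/4 applied to the (diagonal, incoherent) thermal state creates off-diagonal elements at the cost of raising the mean energy.

```python
import numpy as np

E = 1.0                      # assumed qubit energy gap
H = np.diag([0.0, E])        # qubit Hamiltonian

def thermal_qubit(beta):
    # Gibbs state diag(p, 1-p): diagonal, hence zero coherence
    p = 1.0 / (1.0 + np.exp(-beta * E))
    return np.diag([p, 1.0 - p])

def rotate(rho, theta):
    # Unitary (real rotation) acting on the state
    U = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return U @ rho @ U.T

def l1_coherence(rho):
    # l1-norm coherence: sum of absolute off-diagonal elements
    return float(np.abs(rho).sum() - np.abs(np.diag(rho)).sum())

def energy_cost(rho_in, rho_out):
    # Energy cost of the unitary: change in mean energy
    return float(np.trace(H @ (rho_out - rho_in)))
```

    For this parametrization the θ = π/4 rotation maximizes the created coherence, 2p - 1 for ground-state population p, at an energy cost of half that amount.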

  15. Creating Griffith Observatory

    NASA Astrophysics Data System (ADS)

    Cook, Anthony

    2013-01-01

    Griffith Observatory has been the iconic symbol of the sky for southern California since it began its public mission on May 15, 1935. While the Observatory is widely known as being the gift of Col. Griffith J. Griffith (1850-1919), the story of how Griffith’s gift became reality involves many of the people better known for other contributions that made the Los Angeles area an important center of astrophysics in the 20th century. Griffith began drawing up his plans for an observatory and science museum for the people of Los Angeles after looking at Saturn through the newly completed 60-inch reflector on Mt. Wilson. He realized the social impact that viewing the heavens could have if made freely available, and discussed the idea of a public observatory with Mt. Wilson Observatory’s founder, George Ellery Hale, and Director, Walter Adams. This resulted, in 1916, in a will specifying many of the features of Griffith Observatory, and establishing a committee-managed trust fund to build it. Astronomy popularizer Mars Baumgardt convinced the committee that the Zeiss planetarium projector would be appropriate for Griffith’s project after the planetarium was introduced in Germany in 1923. In 1930, the trust committee judged funds to be sufficient to start work on creating Griffith Observatory, and letters from the Committee requesting help in realizing the project were sent to Hale, Adams, Robert Millikan, and other area experts then engaged in creating the 200-inch telescope eventually destined for Palomar Mountain. A Scientific Advisory Committee, headed by Millikan, recommended that Caltech physicist Edward Kurth be put in charge of building and exhibit design. Kurth, in turn, sought help from artist Russell Porter. The architecture firm of John C. Austin and Fredrick Ashley was selected to design the project, and they adopted the designs of Porter and Kurth. Philip Fox of the Adler Planetarium was enlisted to manage the completion of the Observatory and become its

  16. Creating the living brand.

    PubMed

    Bendapudi, Neeli; Bendapudi, Venkat

    2005-05-01

    It's easy to conclude from the literature and the lore that top-notch customer service is the province of a few luxury companies and that any retailer outside that rarefied atmosphere is condemned to offer mediocre service at best. But even companies that position themselves for the mass market can provide outstanding customer-employee interactions and profit from them, if they train employees to reflect the brand's core values. The authors studied the convenience store industry in depth and focused on two that have developed a devoted following: QuikTrip (QT) and Wawa. Turnover rates at QT and Wawa are 14% and 22%, respectively, much lower than the typical rate in retail. The authors found six principles that both firms embrace to create a strong culture of customer service. Know what you're looking for: A focus on candidates' intrinsic traits allows the companies to hire people who will naturally bring the right qualities to the job. Make the most of talent: In mass-market retail, talent is generally viewed as a commodity, but that outlook becomes a self-fulfilling prophecy. Create pride in the brand: Service quality depends directly on employees' attachment to the brand. Build community: Wawa and QT have made concerted efforts to build customer loyalty through a sense of community. Share the business context: Employees need a clear understanding of how their company operates and how it defines success. Satisfy the soul: To win an employee's passionate engagement, a company must meet his or her needs for security, esteem, and justice.

  17. Deforestation homogenizes tropical parasitoid-host networks.

    PubMed

    Laliberté, Etienne; Tylianakis, Jason M

    2010-06-01

    Human activities drive biotic homogenization (loss of regional diversity) of many taxa. However, whether species interaction networks (e.g., food webs) can also become homogenized remains largely unexplored. Using 48 quantitative parasitoid-host networks replicated through space and time across five tropical habitats, we show that deforestation greatly homogenized network structure at a regional level, such that interaction composition became more similar across rice and pasture sites compared with forested habitats. This was not simply caused by altered consumer and resource community composition, but was associated with altered consumer foraging success, such that parasitoids were more likely to locate their hosts in deforested habitats. Furthermore, deforestation indirectly homogenized networks in time through altered mean consumer and prey body size, which decreased in deforested habitats. Similar patterns were obtained with binary networks, suggesting that interaction (link) presence-absence data may be sufficient to detect network homogenization effects. Our results show that tropical agroforestry systems can support regionally diverse parasitoid-host networks, but that removal of canopy cover greatly homogenizes the structure of these networks in space, and to a lesser degree in time. Spatiotemporal homogenization of interaction networks may alter coevolutionary outcomes and reduce ecological resilience at regional scales, but may not necessarily be predictable from community changes observed within individual trophic levels.
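    The kind of spatial homogenization measured above can be sketched with binary (link presence-absence) networks, which the abstract notes may be sufficient to detect the effect; the site data and function names below are illustrative assumptions (the study's quantitative networks weight links by interaction frequency). Each site's network is reduced to a set of (parasitoid, host) links, and homogenization appears as a rise in mean pairwise similarity.

```python
from itertools import combinations

def jaccard(links_a, links_b):
    # Jaccard similarity of two sets of (consumer, resource) links
    return len(links_a & links_b) / len(links_a | links_b)

def mean_pairwise_similarity(networks):
    # Mean similarity over all site pairs; an increase along a disturbance
    # gradient (or over time) indicates homogenization of network structure
    pairs = list(combinations(networks, 2))
    return sum(jaccard(a, b) for a, b in pairs) / len(pairs)
```

    Sites sharing most of their interactions score near 1, sites with disjoint interaction compositions score 0, so deforested habitats with repeated parasitoid-host links yield higher mean similarity than forests.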

  18. Benchmarking homogenization algorithms for monthly data

    NASA Astrophysics Data System (ADS)

    Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M. J.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.

    2012-01-01

    The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data

  19. Create an Emergency Kit

    MedlinePlus


  20. Convective mixing in homogeneous porous media flow

    NASA Astrophysics Data System (ADS)

    Ching, Jia-Hau; Chen, Peilong; Tsai, Peichun Amy

    2017-01-01

    Inspired by the flow processes in the technology of carbon dioxide (CO2) storage in saline formations, we modeled a homogeneous porous media flow in a Hele-Shaw cell to investigate density-driven convection due to dissolution. We used an analogy of the fluid system to mimic the diffusion and subsequent convection when CO2 dissolves in brine, which generates a heavier solution. By varying the permeability, we examined the onset of convection, the falling dynamics, the wavelengths of fingers, and the rate of dissolution, for the Rayleigh number Ra (a dimensionless forcing term which is the ratio of buoyancy to diffusivity) in the range of 2.0 ×104≤Ra≤8.26 ×105 . Our results reveal that the effect of permeability influences significantly the initial convective speed, as well as the later coarsening dynamics of the heavier fingering plumes. However, the total dissolved mass, characterized by a nondimensional Nusselt number Nu, has an insignificant dependence on Ra. This implies that the total dissolution rate of CO2 is nearly constant in high Ra geological porous structures.
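    The dimensionless forcing term Ra is commonly defined for porous-medium convection as Ra = k Δρ g H / (φ μ D), the ratio of buoyant forcing to diffusive damping; this definition and the parameter values below are illustrative assumptions, not the paper's experimental conditions.

```python
def rayleigh_porous(k, delta_rho, g, H, phi, mu, D):
    # Rayleigh-Darcy number: buoyant forcing over diffusive damping
    return k * delta_rho * g * H / (phi * mu * D)

# Illustrative CO2-brine-like values (assumed for this sketch):
Ra = rayleigh_porous(k=1e-11,        # permeability, m^2
                     delta_rho=10.0, # density increase on dissolution, kg/m^3
                     g=9.81,         # gravity, m/s^2
                     H=1.0,          # layer height, m
                     phi=0.2,        # porosity
                     mu=1e-3,        # brine viscosity, Pa s
                     D=1e-9)         # solute diffusivity, m^2/s
```

    Varying the permeability k at fixed fluid properties, as in the experiments, changes Ra proportionally, which is why it controls the onset and speed of the fingering convection.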

  1. Creating alternatives in science.

    PubMed

    Gravagna, Nicole G

    2009-04-01

    Traditional scientist training at the PhD level does not prepare students to be competitive in biotechnology or other non-academic science careers. Some universities have developed biotechnology-relevant doctoral programmes, but most have not. Forming a life science career club makes a statement to university administrators that it is time to rework the curriculum to include biotechnology-relevant training. A career club can supplement traditional PhD training by introducing students to available career choices, help them develop a personal network and teach the business skills that they will need to be competitive in science outside of academia. This paper is an instructional guide designed to help students create a science career club at their own university. These suggestions are based on the experience gained in establishing such a club for the Graduate School at the University of Colorado Denver. We describe the activities that can be offered, the job descriptions for the offices required and potential challenges. With determination, a creative spirit, and the guidance of this paper, students should be able to greatly increase awareness of science career options, and begin building the skills necessary to become competitive in non-academic science.

  2. Creating new market space.

    PubMed

    Kim, W C; Mauborgne, R

    1999-01-01

    Most companies focus on matching and beating their rivals. As a result, their strategies tend to take on similar dimensions. What ensues is head-to-head competition based largely on incremental improvements in cost, quality, or both. The authors have studied how innovative companies break free from the competitive pack by staking out fundamentally new market space--that is, by creating products or services for which there are no direct competitors. This path to value innovation requires a different competitive mind-set and a systematic way of looking for opportunities. Instead of looking within the conventional boundaries that define how an industry competes, managers can look methodically across them. By so doing, they can find unoccupied territory that represents real value innovation. Rather than looking at competitors within their own industry, for example, managers can ask why customers make the trade-off between substitute products or services. Home Depot, for example, looked across the substitutes serving home improvement needs. Intuit looked across the substitutes available to individuals managing their personal finances. In both cases, powerful insights were derived from looking at familiar data from a new perspective. Similar insights can be gleaned by looking across strategic groups within an industry; across buyer groups; across complementary product and service offerings; across the functional-emotional orientation of an industry; and even across time. To help readers explore new market space systematically, the authors developed a tool, the value curve, that can be used to represent visually a range of value propositions.

  3. Creating Heliophysics Concept Maps

    NASA Astrophysics Data System (ADS)

    Ali, N. A.; Peticolas, L. M.; Paglierani, R.; Mendez, B. J.

    2011-12-01

    The Center for Science Education at University of California Berkeley's Space Sciences Laboratory is creating concept maps for Heliophysics and would like to get input from scientists. The purpose of this effort is to identify key concepts related to Heliophysics and map their progression to show how students' understanding of Heliophysics might develop from Kindergarten through higher education. These maps are meant to tie into the AAAS Project 2061 Benchmarks for Scientific Literacy and National Science Education Standards. It is hoped that the results of this effort will be useful for curriculum designers developing Heliophysics-related curriculum materials and classroom teachers using Heliophysics materials. The need for concept maps was identified as a result of product analysis undertaken by the NASA Heliophysics Forum Team. The NASA Science Education and Public Outreach Forums have as two of their goals to improve the characterization of the contents of the Science Mission Directorate and Public Outreach (SMD E/PO) portfolio (Objective 2.1) and assist SMD in addressing gaps in the portfolio of SMD E/PO products and project activities (Objective 2.2). An important part of this effort is receiving feedback from solar scientists regarding the inclusion of key concepts and their progression in the maps. This session will introduce the draft concept maps and elicit feedback from scientists.

  4. Homogeneous cosmological models in Yang's gravitation theory

    NASA Technical Reports Server (NTRS)

    Fennelly, A. J.; Pavelle, R.

    1979-01-01

    We present a dynamic, spatially homogeneous solution of Yang's pure space gravitational field equations which is non-Einsteinian. The predictions of this cosmological model seem to be at variance with observations.

  5. Producing tritium in a homogenous reactor

    DOEpatents

    Cawley, William E.

    1985-01-01

    A method and apparatus are described for the joint production and separation of tritium. Tritium is produced in an aqueous homogeneous reactor, and heat from the nuclear reaction is used to distill tritium from the lighter isotopes of hydrogen.

  6. Model Misspecification: Finite Mixture or Homogeneous?

    PubMed Central

    Tarpey, Thaddeus; Yun, Dong; Petkova, Eva

    2007-01-01

    A common problem in statistical modelling is to distinguish between a finite mixture distribution and a homogeneous non-mixture distribution. Finite mixture models are widely used in practice, and mixtures of normal densities are often indistinguishable from homogeneous non-normal densities. This paper illustrates what happens when the EM algorithm for normal mixtures is applied to a distribution that is a homogeneous non-mixture distribution. In particular, a population-based EM algorithm for finite mixtures is introduced and applied directly to density functions instead of sample data. The population-based EM algorithm is used to find finite mixture approximations to common homogeneous distributions. An example regarding the nature of the placebo response in drug-treated depressed subjects is used to illustrate these ideas. PMID:18974843
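
    The phenomenon the paper studies can be reproduced with the standard sample-based EM algorithm for a two-component normal mixture (a minimal sketch, not the authors' population-based variant; the function names are invented for this illustration):

```python
import numpy as np

def norm_pdf(x, mu, s):
    """Univariate normal density."""
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def em_normal_mixture(x, n_iter=200):
    """Plain sample-based EM for a two-component normal mixture."""
    # Moment-based starting values.
    w = 0.5
    mu1, mu2 = x.mean() - x.std(), x.mean() + x.std()
    s1 = s2 = x.std()
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each observation.
        p1 = w * norm_pdf(x, mu1, s1)
        p2 = (1.0 - w) * norm_pdf(x, mu2, s2)
        r = p1 / (p1 + p2)
        # M-step: responsibility-weighted parameter updates.
        w = r.mean()
        mu1 = (r * x).sum() / r.sum()
        mu2 = ((1.0 - r) * x).sum() / (1.0 - r).sum()
        s1 = np.sqrt((r * (x - mu1) ** 2).sum() / r.sum())
        s2 = np.sqrt(((1.0 - r) * (x - mu2) ** 2).sum() / (1.0 - r).sum())
    return w, mu1, s1, mu2, s2
```

    Run on data drawn from a skewed homogeneous density such as a gamma distribution, the loop still converges to a two-component fit, which is exactly the identifiability problem the paper examines.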

  7. Generating and controlling homogeneous air turbulence using random jet arrays

    NASA Astrophysics Data System (ADS)

    Carter, Douglas; Petersen, Alec; Amili, Omid; Coletti, Filippo

    2016-12-01

    The use of random jet arrays, already employed in water tank facilities to generate zero-mean-flow homogeneous turbulence, is extended to air as a working fluid. A novel facility is introduced that uses two facing arrays of individually controlled jets (256 in total) to force steady homogeneous turbulence with negligible mean flow, shear, and strain. Quasi-synthetic jet pumps are created by expanding pressurized air through small straight nozzles and are actuated by fast-response low-voltage solenoid valves. Velocity fields, two-point correlations, energy spectra, and second-order structure functions are obtained from 2D PIV and are used to characterize the turbulence from the integral to the Kolmogorov scales. Several metrics are defined to quantify how well zero-mean-flow homogeneous turbulence is approximated for a wide range of forcing and geometric parameters. With increasing jet firing duration, both the velocity fluctuations and the integral length scales are augmented and therefore the Reynolds number is increased. We reach a Taylor-microscale Reynolds number of 470, a large-scale Reynolds number of 74,000, and an integral-to-Kolmogorov length scale ratio of 680. The volume of the present homogeneous turbulence, the largest reported to date in a zero-mean-flow facility, is much larger than the integral length scale, allowing for the natural development of the energy cascade. The turbulence is found to be anisotropic irrespective of the distance between the jet arrays. Fine grids placed in front of the jets are effective at modulating the turbulence, reducing both velocity fluctuations and integral scales. Varying the jet-to-jet spacing within each array has no effect on the integral length scale, suggesting that this is dictated by the length scale of the jets.
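
    One of the diagnostics named above, the second-order structure function, can be estimated from a one-dimensional velocity record as follows (a sketch only; the study works on 2D PIV fields, and the function name is invented here):

```python
import numpy as np

def structure_function_2(u, max_lag):
    """Second-order structure function S2(r) = <(u(x + r) - u(x))^2>,
    estimated at integer separations r from a 1-D velocity record."""
    return np.array([np.mean((u[r:] - u[:-r]) ** 2)
                     for r in range(1, max_lag + 1)])
```

    For an uncorrelated signal S2 saturates at twice the variance at every lag; in the inertial subrange of real turbulence it instead follows the Kolmogorov r^(2/3) scaling.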

  8. Effect of homogenization and pasteurization on the structure and thermal stability of whey protein in milk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The effect of homogenization alone or in combination with high temperature, short time (HTST) pasteurization or UHT processing on the whey fraction of milk was investigated using highly sensitive spectroscopic techniques. In pilot plant trials, 1-L quantities of whole milk were homogenized in a two-...

  9. Creating your own leadership brand.

    PubMed

    Kerfoot, Karlene

    2002-01-01

    Building equity in a brand happens through many encounters. The initial attraction must be followed by the meeting of expectations. This creates a loyalty that is part of an emotional connection to that brand. This is the same process people go through when they first meet a leader and decide if this is a person they want to buy into. People will examine your style, your competence, and your standards. If you fail on any of these fronts, your ability to lead will be severely compromised. People expect more of leaders now, because they know and recognize good leaders. And, predictably, people are now more cynical of leaders because of the well-publicized excess of a few leaders who advanced their own causes at the expense of their people and their financial future. This will turn out to be a good thing, because it will create a higher standard of leadership that all must aspire to achieve. When the bar is raised for us, our standards of performance are also raised.

  10. Creating a sling - slideshow

    MedlinePlus


  11. Create a Classroom Blog!

    ERIC Educational Resources Information Center

    Brunsell, Eric; Horejsi, Martin

    2010-01-01

    Science education blogs can serve as powerful digital lab notebooks that contain text, images, and videos. Each blog entry documents a moment in time, but becomes interactive with the addition of readers' comments. Blogs can provide a realistic experience of the peer-review process and generate evolving descriptions of observations through time.…

  12. Creating a Desired Future

    ERIC Educational Resources Information Center

    Jenkins-Scott, Jackie

    2008-01-01

    When the author became president of Wheelock College in Boston in 2004, she asked the trustees and the entire campus community to engage in an innovative strategic planning and visioning process. The goal was to achieve consensus on a strategic vision for the future of Wheelock College by the end of her first year. This article discusses how…

  13. Creating a Children's Village

    ERIC Educational Resources Information Center

    Roberts, Paul

    2012-01-01

    Five years ago the author embarked on an odyssey that would fundamentally change his life as an architect. He and his partner, Dave Deppen, were selected through a very competitive process to design a new Child Development and Family Studies Center in the Sierra Foothills, near Yosemite National Park for Columbia College. The Columbia College…

  14. Cluster Mechanism of Homogeneous Crystallization (Computer Study)

    NASA Astrophysics Data System (ADS)

    Belashchenko, D. K.

    2008-12-01

    A molecular dynamics (MD) study of homogeneous crystallization of liquid rubidium is conducted with an inter-particle pair potential. The equilibrium crystallization temperature of the models was 313 K. Models consisted of 500, 998, and 1968 particles in a basic cube. The main investigation method was as follows: to detect (along the MD run) the atoms with Voronoi polyhedrons (VP) of 0608 type (“0608-atoms,” as in a bcc crystal) and to detect the bound groups of 0608-atoms (“0608-clusters”) that could play the role of seeds in crystallization. Full crystallization was observed only at temperatures lower than 185 K, with the creation of a predominantly bcc crystal. The crystallization mechanism of the Rb models differs drastically from the mechanism adopted in classical nucleation theory. It consists of the growth of the total number of 0608-atoms on cooling and the formation of 0608-clusters, analogous to the coagulation of solute in a supersaturated two-component solution. At the first stage of the process the clusters have a very loose structure (something like a medusa or octopus with many tentacles) and enclose atoms with other Voronoi polyhedron types. The dimensions of the clusters quickly increase and approach those of the basic cube. 0608-atoms play the leading role in the crystallization process and activate the transition of neighboring atoms into the 0608 coordination. The fast growth of the maximum cluster begins after it attains a critical size (about 150 0608-atoms). The fluctuations of cluster sizes are very important in the creation of a 0608-cluster of critical (threshold) size. These fluctuations are especially large in the interval from 180 K to 185 K.

  15. Creating electron vortex beams with light.

    PubMed

    Handali, Jonathan; Shakya, Pratistha; Barwick, Brett

    2015-02-23

    We propose an all-optical method of creating electron vortices utilizing the Kapitza-Dirac effect. This technique uses the transfer of orbital angular momentum from photons to free electrons creating electron vortex beams in the process. The laser intensities needed for this experiment can be obtained with available pulsed lasers and the resulting electron beams carrying orbital angular momentum will be particularly useful in the study of magnetic materials and chiral plasmonic structures in ultrafast electron microscopy.

  16. Creating Math Videos: Comparing Platforms and Software

    ERIC Educational Resources Information Center

    Abbasian, Reza O.; Sieben, John T.

    2016-01-01

    In this paper we present a short tutorial on creating mini-videos using two platforms--PCs and tablets such as iPads--and software packages that work with these devices. Specifically, we describe the step-by-step process of creating and editing videos using a Wacom Intuos pen-tablet plus Camtasia software on a PC platform and using the software…

  17. Isotopic homogeneity of iron in the early solar nebula.

    PubMed

    Zhu, X K; Guo, Y; O'Nions, R K; Young, E D; Ash, R D

    2001-07-19

    The chemical and isotopic homogeneity of the early solar nebula, and the processes producing fractionation during its evolution, are central issues of cosmochemistry. Studies of the relative abundance variations of three or more isotopes of an element can in principle determine if the initial reservoir of material was a homogeneous mixture or if it contained several distinct sources of precursor material. For example, widespread anomalies observed in the oxygen isotopes of meteorites have been interpreted as resulting from the mixing of a solid phase that was enriched in 16O with a gas phase in which 16O was depleted, or as an isotopic 'memory' of Galactic evolution. In either case, these anomalies are regarded as strong evidence that the early solar nebula was not initially homogeneous. Here we present measurements of the relative abundances of three iron isotopes in meteoritic and terrestrial samples. We show that significant variations of iron isotopes exist in both terrestrial and extraterrestrial materials. But when plotted in a three-isotope diagram, all of the data for these Solar System materials fall on a single mass-fractionation line, showing that homogenization of iron isotopes occurred in the solar nebula before both planetesimal accretion and chondrule formation.

  18. Researchers Create Artificial Mouse 'Embryo'

    MedlinePlus

    Experiment used two types of gene-modified stem ... they've created a kind of artificial mouse embryo using stem cells, which can be coaxed to ...

  19. Creating Chemigrams in the Classroom.

    ERIC Educational Resources Information Center

    Guhin, Paula

    2003-01-01

    Describes an art activity in which students create "chemigrams" using exposed photo paper to create designs. Explains that this activity can be used with middle and high school students as an introduction to photography or use of chemicals. (CMK)

  20. Creating a Toilet Training Plan

    MedlinePlus

    These are the tools ... will need to create your own toilet-training plan and implement it at the best time for ...

  1. Creating a Family Health History

    MedlinePlus

    Why Create a Family Health History? A Family Tree for ... What a Family Health History May Reveal: You can use a family health ...

  2. Simulator for SUPO, a Benchmark Aqueous Homogeneous Reactor (AHR)

    SciTech Connect

    Klein, Steven Karl; Determan, John C.

    2015-10-14

    A simulator has been developed for SUPO (Super Power) an aqueous homogeneous reactor (AHR) that operated at Los Alamos National Laboratory (LANL) from 1951 to 1974. During that period SUPO accumulated approximately 600,000 kWh of operation. It is considered the benchmark for steady-state operation of an AHR. The SUPO simulator was developed using the process that resulted in a simulator for an accelerator-driven subcritical system, which has been previously reported.

  3. Multimode stretched spiral vortex and nonequilibrium energy spectrum in homogeneous shear flow turbulence

    NASA Astrophysics Data System (ADS)

    Horiuti, Kiyosi; Ozawa, Tetsuya

    2011-03-01

    The stretched spiral vortex [T. S. Lundgren, "Strained spiral vortex model for turbulent structures," Phys. Fluids 25, 2193 (1982)] is identified in turbulence in homogeneous shear flow and the spectral properties of this flow are studied using direct-numerical simulation data. The effects of mean shear on the genesis, growth, and annihilation processes of the spiral vortex are elucidated, and the role of the spiral vortex in the generation of turbulence is shown. As in homogeneous isotropic turbulence [K. Horiuti and T. Fujisawa, "The multi mode stretched spiral vortex in homogeneous isotropic turbulence," J. Fluid Mech. 595, 341 (2008)], multimodes of the spiral vortex are extracted. Two symmetric modes of configurations with regard to the vorticity alignment along the vortex tube in the core region and dual vortex sheets spiraling around the tube are often educed. One of the two symmetric modes is created by a conventional rolling-up of a single spanwise shear layer. Another one is created by the convergence of the recirculating flow or streamwise roll [F. Waleffe, "Homotopy of exact coherent structures in plane shear flows," Phys. Fluids 15, 1517 (2003)] caused by the upward and downward motions associated with the streaks. The vortex tube is formed by axial straining and lowering of pressure in the recirculating region. The spanwise shear layers are entrained by the tube and they form spiral turns. The latter symmetric mode tends to be transformed into the former mode with lapse of time due to the action of the pressure Hessian term. The power law in the inertial subrange energy spectrum is studied. The base steady spectrum fits the equilibrium Kolmogorov -5/3 spectrum, to which a nonequilibrium component induced by the fluctuation of the dissipation rate ɛ is added. This component is extracted using the conditional sampling on ɛ, and it is shown that it fits the -7/3 power in accordance with the statistical theory. The correlation between these spectra and

  4. Preparation and characterization of paclitaxel nanosuspension using novel emulsification method by combining high speed homogenizer and high pressure homogenization.

    PubMed

    Li, Yong; Zhao, Xiuhua; Zu, Yuangang; Zhang, Yin

    2015-07-25

    The aim of this study was to develop an alternative, more bioavailable, better tolerated paclitaxel nanosuspension (PTXNS) for intravenous injection in comparison with the commercially available Taxol(®) formulation. In this study, PTXNS was prepared by an emulsification method combining a high speed homogenizer and high pressure homogenization, followed by a lyophilization process for intravenous administration. The main production parameters, including the volume ratio of organic phase to the water and organic phases (Vo:Vw+o), concentration of PTX, content of PTX, emulsification time (Et), homogenization pressure (HP) and passes (Ps) for high pressure homogenization, were optimized and their effects on the mean particle size (MPS) and particle size distribution (PSD) of PTXNS were investigated. The characteristics of PTXNS, such as surface morphology, physical status of paclitaxel (PTX) in PTXNS, redispersibility of PTXNS in purified water, in vitro dissolution and in vivo bioavailability, were all investigated. The PTXNS obtained under optimum conditions had an MPS of 186.8 nm and a zeta potential (ZP) of -6.87 mV. The PTX content in PTXNS was approximately 3.42%. Moreover, the residual amount of chloroform was lower than the International Conference on Harmonization limit (60 ppm) for solvents. The dissolution study indicated that PTXNS dissolved faster than raw PTX and showed a sustained-dissolution character compared with the Taxol(®) formulation. Moreover, the bioavailability of PTXNS increased 14.38-fold and 3.51-fold compared with raw PTX and the Taxol(®) formulation, respectively.

  5. A criterion for assessing homogeneity distribution in hyperspectral images. Part 2: application of homogeneity indices to solid pharmaceutical dosage forms.

    PubMed

    Rosas, Juan G; Blanco, Marcelo

    2012-11-01

    This article is the second of a series of two articles detailing the application of a mixing index to assess homogeneity distribution in oral pharmaceutical solid dosage forms by image analysis. Chemical imaging (CI) is an emerging technique integrating conventional imaging and spectroscopic techniques with a view to obtaining spatial and spectral information from a sample. Near infrared chemical imaging (NIR-CI) has proved an excellent analytical tool for extracting high-quality information from sample surfaces. The primary objective of this second part was to demonstrate that the approach developed in the first part could be successfully applied to near infrared hyperspectral images of oral pharmaceutical solid dosage forms such as coated, uncoated and effervescent tablets, as well as to powder blends. To this end, we assessed a new criterion for establishing mixing homogeneity by using four different methods based on a three-dimensional (M×N×λ) data array of hyperspectral images (spectral standard deviations and correlation coefficients) or a two-dimensional (M×N) data array (concentration maps and binary images). The four methods were used by applying macropixel analysis to the Poole (M(P)) and homogeneity (H%(Poole)) indices. Both indices proved useful for assessing the degree of homogeneity of pharmaceutical samples. The results testify that the proposed approach can be effectively used in the pharmaceutical industry on finished products (e.g., tablets) and in mixing unit operations, for example as a process analytical technology tool for blending monitoring (see Part 1).
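
    The macropixel idea can be illustrated on a plain concentration map with a simplified index of our own construction (for illustration only, not the exact M(P) or H%(Poole) definition used in the article):

```python
import numpy as np

def macropixel_homogeneity(conc_map, m=8):
    """Simplified macropixel homogeneity index: tile the M×N
    concentration map into m×m macropixels and report 100·(1 − RSD),
    where RSD is the relative standard deviation of the macropixel
    means."""
    M, N = conc_map.shape
    M, N = M - M % m, N - N % m              # crop to full macropixels
    blocks = conc_map[:M, :N].reshape(M // m, m, N // m, m)
    means = blocks.mean(axis=(1, 3))         # one mean per macropixel
    rsd = means.std() / means.mean()
    return 100.0 * (1.0 - rsd)
```

    A well-mixed map scores close to 100, while a segregated map scores much lower.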

  6. Effect of non-homogenous thermal stress during sub-lethal photodynamic antimicrobial chemotherapy

    NASA Astrophysics Data System (ADS)

    Gadura, N.; Kokkinos, D.; Dehipawala, S.; Cheung, E.; Sullivan, R.; Subramaniam, R.; Schneider, P.; Tremberger, G., Jr.; Holden, T.; Lieberman, D.; Cheung, T.

    2012-03-01

    Pathogens could be inactivated via a light source coupled with a photosensitizing agent in photodynamic antimicrobial chemotherapy (PACT). This project studied the effect of a non-homogeneous substrate on cell colonies. The non-homogeneity could be controlled by iron oxide nanoparticle doping in porous glassy substrates, such that each cell would experience tens of hot spots when illuminated with an additional light source. The substrate non-homogeneity was characterized by Atomic Force Microscopy, Transmission Electron Microscopy and Extended X-Ray Absorption Fine Structure at the Brookhaven Synchrotron Light Source. Microscopy images of cell motion were used to study motility. Laboratory cell colonies on non-homogeneous substrates exhibited reduced motility similar to that observed with sub-lethal PACT treatment. Such motility reduction on a non-homogeneous substrate is interpreted as the presence of thermal stress. The studied pathogens included E. coli and Pseudomonas aeruginosa. The non-pathogenic microbe Bacillus subtilis was also studied for comparison. The results show that sub-lethal PACT could be effective with additional non-homogeneous thermal stress. The use of non-uniform illumination on a homogeneous substrate to create thermal stress on the sub-micron length scale is discussed via light correlation in propagation through a random medium. Complementing sub-lethal PACT with such thermal stress would be an appropriate extension.

  7. Homogeneous anisotropic solutions of topologically massive gravity with a cosmological constant and their homogeneous deformations

    NASA Astrophysics Data System (ADS)

    Moutsopoulos, George

    2013-06-01

    We solve the equations of topologically massive gravity (TMG) with a potentially non-vanishing cosmological constant for homogeneous metrics without isotropy. We only reproduce known solutions. We also discuss their homogeneous deformations, possibly with isotropy. We show that de Sitter space and hyperbolic space cannot be infinitesimally homogeneously deformed in TMG. We clarify some of their Segre-Petrov types and discuss the warped de Sitter spacetime.

  8. Analysis of homogeneous/non-homogeneous nanofluid models accounting for nanofluid-surface interactions

    NASA Astrophysics Data System (ADS)

    Ahmad, R.

    2016-07-01

    This article reports an unbiased analysis for the water based rod shaped alumina nanoparticles by considering both the homogeneous and non-homogeneous nanofluid models over the coupled nanofluid-surface interface. The mechanics of the surface are found for both the homogeneous and non-homogeneous models, which were ignored in previous studies. The viscosity and thermal conductivity data are implemented from the international nanofluid property benchmark exercise. All simulations are performed using experimentally verified results. By considering the homogeneous and non-homogeneous models, the precise movement of the alumina nanoparticles over the surface has been observed by solving the corresponding system of differential equations. For the non-homogeneous model, a uniform temperature and nanofluid volume fraction are assumed at the surface, and the flux of the alumina nanoparticles is taken as zero. The assumption of zero nanoparticle flux at the surface makes the non-homogeneous model physically more realistic. The differences in all profiles for the homogeneous and non-homogeneous models are insignificant, and this is due to small deviations in the values of the Brownian motion and thermophoresis parameters.

  9. Climate Data Homogenization Using Edge Detection Algorithms

    NASA Astrophysics Data System (ADS)

    Hammann, A. C.; Rennermalm, A. K.

    2015-12-01

    The problem of climate data homogenization has predominantly been addressed by testing the likelihood of one or more breaks inserted into a given time series and modeling the mean to be stationary in between the breaks. We recast the same problem in a slightly different form: that of detecting step-like changes in noisy data, and observe that this problem has spawned a large number of approaches to its solution as the "edge detection" problem in image processing. With respect to climate data, we ask the question: How can we optimally separate step-like from smoothly-varying low-frequency signals? We study the hypothesis that the edge-detection approach makes better use of all information contained in the time series than the "traditional" approach (e.g. Caussinus and Mestre, 2004), which we base on several observations. 1) The traditional formulation of the problem reduces the available information from the outset to that contained in the test statistic. 2) The criterion of local steepness of the low-frequency variability, while at least hypothetically useful, is ignored. 3) The practice of using monthly data corresponds, mathematically, to applying a moving average filter (to reduce noise) and subsequent subsampling of the result; this subsampling reduces the amount of available information beyond what is necessary for noise reduction. Most importantly, the tradeoff between noise reduction (better with filters with wide support in the time domain) and localization of detected changes (better with filters with narrow support) is expressed in the well-known uncertainty principle and can be addressed optimally within a time-frequency framework. Unsurprisingly, a large number of edge-detection algorithms have been proposed that make use of wavelet decompositions and similar techniques. We are developing this framework in part to be applied to a particular set of climate data from Greenland; we will present results from this application as well as from tests with
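
    The step-detection view described above can be reduced to a minimal sketch: slide a Haar-like (step-shaped) kernel along the series and score each index by the difference of the means on either side (illustrative only; the function name and window choice are ours, not from the study):

```python
import numpy as np

def detect_step(y, w=30):
    """Score every index by the difference between the mean of the w
    points after it and the mean of the w points before it (a sliding
    Haar-like step kernel); the largest |score| marks the break."""
    n = len(y)
    score = np.zeros(n)
    for i in range(w, n - w):
        score[i] = y[i:i + w].mean() - y[i - w:i].mean()
    k = int(np.argmax(np.abs(score)))
    return k, score[k]
```

    The window width w expresses exactly the tradeoff discussed above: wide windows suppress noise but localize the break poorly, narrow windows localize well but are noisy.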

  10. Method of Mapping Anomalies in Homogenous Material

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E. (Inventor); Taylor, Bryant D. (Inventor)

    2016-01-01

    An electrical conductor and antenna are positioned in a fixed relationship to one another. Relative lateral movement is generated between the electrical conductor and a homogenous material while maintaining the electrical conductor at a fixed distance from the homogenous material. The antenna supplies a time-varying magnetic field that causes the electrical conductor to resonate and generate harmonic electric and magnetic field responses. Disruptions in at least one of the electric and magnetic field responses during this lateral movement are indicative of a lateral location of a subsurface anomaly. Next, relative out-of-plane movement is generated between the electrical conductor and the homogenous material in the vicinity of the anomaly's lateral location. Disruptions in at least one of the electric and magnetic field responses during this out-of-plane movement are indicative of a depth location of the subsurface anomaly. A recording of the disruptions provides a mapping of the anomaly.

  11. Computational Homogenization of Defect Driving Forces

    NASA Astrophysics Data System (ADS)

    Ricker, Sarah; Mergheim, Julia; Steinmann, Paul

    Because many engineering materials, and also biological tissues, possess an underlying heterogeneous micro-structure, it is not sufficient to simulate these materials with pre-assumed overall constitutive assumptions. Therefore, we apply a homogenization scheme, which determines the macroscopic material behavior based on an analysis of the underlying micro-structure. In the work at hand, the focus is put on the extension of the classical computational homogenization scheme towards the homogenization of material forces. Volume forces, which may emerge due to inhomogeneities in the material, therefore have to be incorporated. With the assistance of this material formulation and the equivalence of the J-integral and the material force at a crack tip, studies on the influence of the micro-structure on macroscopic crack propagation are carried out.
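
    The idea of replacing pre-assumed constitutive laws with averages over the micro-structure can be illustrated by the most elementary homogenization estimates, the Voigt and Reuss bounds for a two-phase composite (a textbook sketch, far simpler than the computational scheme discussed here):

```python
def voigt_reuss_bounds(E1, E2, f1):
    """Elementary homogenization bounds on the effective modulus of a
    two-phase composite: the Voigt (uniform-strain) arithmetic average
    and the Reuss (uniform-stress) harmonic average."""
    f2 = 1.0 - f1
    E_voigt = f1 * E1 + f2 * E2
    E_reuss = 1.0 / (f1 / E1 + f2 / E2)
    return E_voigt, E_reuss
```

    Any admissible effective modulus, including one delivered by a full computational homogenization of the micro-structure, must lie between the Reuss and Voigt averages.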

  12. Quality Control and Homogeneity of Precipitation Data in the Southwest of Europe.

    NASA Astrophysics Data System (ADS)

    González-Rouco, J. Fidel; Jiménez, J. Luis; Quesada, Vicente; Valero, Francisco

    2001-03-01

    A quality control process involving outliers processing, homogenization, and interpolation has been applied to 95 monthly precipitation series in the Iberian Peninsula, southern France, and northern Africa during the period 1899-1989. A detailed description of the procedure results is provided and the impact of adjustments on trend estimation is discussed. Outliers have been censored by trimming extreme values. Homogeneity adjustments have been developed by applying the Standard Normal Homogeneity Test in combination with an objective methodology to select reference series. The spatial distribution of outliers indicates that they are due to climate variability rather than measurement errors. After carrying out the homogeneity procedure, 40% of the series were found to be homogeneous, 49.5% became homogeneous after one adjustment, and 9.5% after two adjustments. About 30% of the inhomogeneities could be traced to information in the scarce history files. It is shown that these data present severe homogeneity problems and that applying outlier and homogeneity adjustments greatly changes the patterns of trends for this area.
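
    The Standard Normal Homogeneity Test at the heart of the procedure can be sketched in a few lines (an illustrative implementation of Alexandersson's single-break statistic; the function name is ours):

```python
import numpy as np

def snht_statistic(q):
    """Alexandersson's single-break SNHT: standardize the series q
    (typically a candidate-minus-reference difference series), then
    compare the mean level before and after every candidate break
    position k; return the maximum statistic and its position."""
    z = (q - q.mean()) / q.std(ddof=1)
    n = len(z)
    T = np.zeros(n - 1)
    for k in range(1, n):
        z1, z2 = z[:k].mean(), z[k:].mean()
        T[k - 1] = k * z1 ** 2 + (n - k) * z2 ** 2
    k_best = int(np.argmax(T)) + 1
    return float(T.max()), k_best
```

    The maximum over candidate break positions is compared with the critical value appropriate for the series length; values above it flag an inhomogeneity at the argmax.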

  13. Contribution of the live-vertebrate trade toward taxonomic homogenization.

    PubMed

    Romagosa, Christina M; Guyer, Craig; Wooten, Michael C

    2009-08-01

    The process of taxonomic homogenization occurs through two mechanisms, extinctions and introductions, and leads to a reduction of global biodiversity. We used available U.S. trade data as a proxy for global trade in live vertebrates to assess the contribution of trade to the process of taxonomic homogenization. Data included all available U.S. importation and exportation records, estimation of extinction risk, and reports of establishment outside the native range for species within six vertebrate groups. Based on Monte Carlo sampling, the number of species traded, established outside of the native range, and threatened with extinction was not randomly distributed among vertebrate families. Twenty-eight percent of vertebrate families that were traded preferentially were also established or threatened with extinction, an unusually high percentage compared with the 7% of families that were not traded preferentially but that became established or threatened with extinction. The importance of trade in homogenization of vertebrates suggests that additional efforts should be made to prevent introductions and extinctions through this medium.
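
    The Monte Carlo reasoning can be sketched as a simple resampling test: draw the flagged (traded, established, or threatened) species at random from the species pool and ask how often a focal family receives at least the observed number (a schematic of the approach with invented names and toy numbers, not the authors' code or data):

```python
import numpy as np

def overrepresentation_pvalue(family_sizes, focal, n_flagged, observed,
                              n_sim=10000, seed=1):
    """Monte Carlo p-value: draw n_flagged species at random (without
    replacement) from a pool whose families have the given sizes, and
    count how often the focal family receives at least `observed` of
    them under this null of random assignment."""
    rng = np.random.default_rng(seed)
    pool = np.repeat(np.arange(len(family_sizes)), family_sizes)
    hits = 0
    for _ in range(n_sim):
        draw = rng.choice(pool, size=n_flagged, replace=False)
        hits += int(np.sum(draw == focal) >= observed)
    return hits / n_sim
```

    A small p-value indicates that the family is flagged preferentially, i.e. far more often than a random draw from the pool would produce.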

  14. A comparative study of Casson fluid with homogeneous-heterogeneous reactions.

    PubMed

    Khan, Muhammad Ijaz; Waqas, Muhammad; Hayat, Tasawar; Alsaedi, Ahmed

    2017-03-09

    Magnetohydrodynamic (MHD) stagnation point flow of Casson fluid towards a stretching sheet is addressed. Homogeneous-heterogeneous reactions together with a homogeneous heat effect, subject to a resistive force of electromagnetic origin, are discussed. It is assumed that the homogeneous process in the ambient fluid is governed by first-order kinetics and the heterogeneous process on the wall surface by isothermal cubic autocatalator kinetics. The governing equations are reduced to ordinary differential systems. Solutions of the problems are obtained via a numerical technique, namely the built-in shooting method. Graphical behaviors of velocity, temperature and concentration are analyzed comprehensively. Velocity is found to be a decreasing function of the Hartman number.
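The shooting method named above reduces a two-point boundary value problem to root-finding on an unknown initial slope. A minimal sketch on a toy BVP with a known exact solution (not the Casson-fluid equations themselves), assuming SciPy is available:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Toy BVP: y'' = 6*y^2 with y(0) = 1, y(1) = 1/4.
# Exact solution y = 1/(1+x)^2, so the true initial slope is y'(0) = -2.
def residual(slope):
    sol = solve_ivp(lambda x, y: [y[1], 6.0 * y[0] ** 2],
                    (0.0, 1.0), [1.0, slope], rtol=1e-9, atol=1e-12)
    return sol.y[0, -1] - 0.25     # miss distance at the far boundary

slope = brentq(residual, -2.5, -1.5)   # root-find on the unknown initial slope
# slope is close to the exact value -2
```

Commercial packages wrap exactly this loop (integrate, measure the boundary miss, adjust the guess) behind a "built-in" shooting routine.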

  15. Hyperelastic bodies under homogeneous Cauchy stress induced by non-homogeneous finite deformations

    NASA Astrophysics Data System (ADS)

    Mihai, L. Angela; Neff, Patrizio

    2017-03-01

    We discuss whether homogeneous Cauchy stress implies homogeneous strain in isotropic nonlinear elasticity. While for linear elasticity the positive answer is clear, we exhibit, through detailed calculations, an example with inhomogeneous continuous deformation but constant Cauchy stress. The example is derived from an elastic energy that is not rank-one convex.

  16. RELIABLE COMPUTATION OF HOMOGENEOUS AZEOTROPES. (R824731)

    EPA Science Inventory

    Abstract

    It is important to determine the existence and composition of homogeneous azeotropes in the analysis of phase behavior and in the synthesis and design of separation systems, from both theoretical and practical standpoints. A new method for reliably locating an...

  17. HSTEP - Homogeneous Studies of Transiting Extrasolar Planets

    NASA Astrophysics Data System (ADS)

    Southworth, John

    2014-04-01

    This paper presents a summary of the HSTEP project: an effort to calculate the physical properties of the known transiting extrasolar planets using a homogeneous approach. I discuss the motivation for the project, list the 83 planets which have already been studied, run through some important aspects of the methodology, and finish with a synopsis of the results.

  18. Homogeneous Immunoassays: Historical Perspective and Future Promise

    NASA Astrophysics Data System (ADS)

    Ullman, Edwin F.

    1999-06-01

    The founding and growth of Syva Company is examined in the context of its leadership role in the development of homogeneous immunoassays. The simple mix and read protocols of these methods offer advantages in routine analytical and clinical applications. Early homogeneous methods were based on insensitive detection of immunoprecipitation during antigen/antibody binding. The advent of reporter groups in biology provided a means of quantitating immunochemical binding by labeling antibody or antigen and physically separating label incorporated into immune complexes from free label. Although high sensitivity was achieved, quantitative separations were experimentally demanding. Only when it became apparent that reporter groups could provide information, not only about the location of a molecule but also about its microscopic environment, was it possible to design practical non-separation methods. The evolution of early homogeneous immunoassays was driven largely by the development of improved detection strategies. The first commercial spin immunoassays, developed by Syva for drug abuse testing during the Vietnam war, were followed by increasingly powerful methods such as immunochemical modulation of enzyme activity, fluorescence, and photo-induced chemiluminescence. Homogeneous methods that quantify analytes at femtomolar concentrations within a few minutes now offer important new opportunities in clinical diagnostics, nucleic acid detection and drug discovery.

  19. Spatial Homogeneity and Redshift--Distance Laws

    NASA Astrophysics Data System (ADS)

    Nicoll, J. F.; Segal, I. E.

    1982-06-01

    Spatial homogeneity in the radial direction of low-redshift galaxies is subjected to Kafka-Schmidt V/Vm tests using well-documented samples. Homogeneity is consistent with the assumption of the Lundmark (quadratic redshift-distance) law, but large deviations from homogeneity are implied by the assumption of the Hubble (linear redshift-distance) law. These deviations are similar to what would be expected on the basis of the Lundmark law. Luminosity functions are obtained for each law by a nonparametric statistically optimal method that removes the observational cutoff bias in complete samples. Although the Hubble law correlation of absolute magnitude with redshift is reduced considerably by elimination of the bias, computer simulations show that its bias-free value is nevertheless at a statistically quite significant level, indicating the self-inconsistency of the law. The corresponding Lundmark law correlations are quite satisfactory statistically. The regression of redshift on magnitude also involves radial spatial homogeneity and, according to R. Soneira, has slope determining the redshift-magnitude exponent independently of the luminosity function. We have, however, rigorously proved the material dependence of the regression on this function and here exemplify our treatment by using the bias-free functions indicated, with results consistent with the foregoing argument.
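The V/Vm test has a simple intuition: for a population distributed uniformly in Euclidean volume, V/Vm is uniform on [0, 1] with mean 1/2, and a systematic departure from 0.5 signals radial inhomogeneity (or a wrong distance law). A toy numerical check with a single distance limit (a real survey uses a per-object Vm set by its flux limit):

```python
import numpy as np

rng = np.random.default_rng(0)
r_max = 100.0                             # survey distance limit (arbitrary units)
# Uniform spatial density inside the limit: P(R < r) proportional to r^3
r = r_max * rng.random(200_000) ** (1.0 / 3.0)
v_over_vm = (r / r_max) ** 3              # enclosed volume over maximum volume
# For a homogeneous sample the mean of V/Vm is ~0.5
```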

  20. Reduced-order modelling numerical homogenization.

    PubMed

    Abdulle, A; Bai, Y

    2014-08-06

    A general framework to combine numerical homogenization and reduced-order modelling techniques for partial differential equations (PDEs) with multiple scales is described. Numerical homogenization methods are usually efficient to approximate the effective solution of PDEs with multiple scales. However, classical numerical homogenization techniques require the numerical solution of a large number of so-called microproblems to approximate the effective data at selected grid points of the computational domain. Such computations become particularly expensive for high-dimensional, time-dependent or nonlinear problems. In this paper, we explain how numerical homogenization methods can benefit from reduced-order modelling techniques that allow one to identify offline and online computational procedures. The effective data are computed accurately at only a carefully selected number of grid points (offline stage) and appropriately 'interpolated' in the online stage, resulting in an online cost comparable to that of a single-scale solver. The methodology is presented for a class of PDEs with multiple scales, including elliptic, parabolic, wave and nonlinear problems. Numerical examples, including wave propagation in inhomogeneous media and solute transport in unsaturated porous media, illustrate the proposed method.
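The offline/online split described above can be illustrated on a toy 1-D problem. The "expensive" microproblem below stands in for a cell-problem solve (here just the harmonic average of an oscillatory coefficient, which is the correct homogenized value in 1-D); it is evaluated at a few offline points and cheaply interpolated online. The coefficient and all names are illustrative:

```python
import numpy as np

def expensive_microproblem(x):
    """Stand-in for a microscale solve: homogenized (harmonic-mean)
    value of the oscillatory 1-D coefficient a(x, y) = 2 + x + sin(2*pi*y)."""
    y = np.linspace(0.0, 1.0, 10001)
    return 1.0 / np.mean(1.0 / (2.0 + x + np.sin(2.0 * np.pi * y)))

# Offline stage: solve the microproblem at a few selected grid points only
offline_x = np.linspace(0.0, 1.0, 5)
offline_vals = np.array([expensive_microproblem(x) for x in offline_x])

def a_eff_online(x):
    """Online stage: interpolate the stored effective data (cost of a lookup)."""
    return np.interp(x, offline_x, offline_vals)
```

A macroscale single-scale solver then queries `a_eff_online` wherever it needs the effective coefficient, instead of re-solving a microproblem at every quadrature point.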

  1. Coherence delay augmented laser beam homogenizer

    SciTech Connect

    Rasmussen, P.; Bernhardt, A.

    1991-12-31

    It is an object of the present invention to provide an apparatus that can reduce the apparent coherence length of a laser beam so the beam can be used with an inexpensive homogenizer to produce an output beam with a uniform spatial intensity across its entire cross section. It is a further object of the invention to provide an improved homogenizer with a variable aperture size that is simple and easily made. It is still an additional object of the invention to provide an improved liquid filled homogenizer utilizing total internal reflection for improved efficiency. These, and other objects of the invention are realized by using a "coherence delay line," according to the present invention, in series between a laser and a homogenizer. The coherence delay line is an optical "line" that comprises two mirrors, one partially reflecting, and one totally reflecting, arranged so that light incident from the laser first strikes the partially reflecting mirror. A portion of the beam passes through, and a portion is reflected back to the totally reflecting mirror.

  2. General Theorems about Homogeneous Ellipsoidal Inclusions

    ERIC Educational Resources Information Center

    Korringa, J.; And Others

    1978-01-01

    Mathematical theorems about the properties of ellipsoids are developed. Included are Poisson's theorem concerning the magnetization of a homogeneous body of ellipsoidal shape, the polarization of a dielectric, the transport of heat or electricity through an ellipsoid, and other problems. (BB)

  3. On the supposed influence of milk homogenization on the risk of CVD, diabetes and allergy.

    PubMed

    Michalski, Marie-Caroline

    2007-04-01

    Commercial milk is homogenized for the purpose of physical stability, thereby reducing fat droplet size and including caseins and some whey proteins at the droplet interface. This seems to result in a better digestibility than untreated milk. Various casein peptides and milk fat globule membrane (MFGM) proteins are reported to present either harmful (e.g. atherogenic) or beneficial bioactivity (e.g. hypotensive, anticarcinogenic and others). Homogenization might enhance either of these effects, but this remains controversial. The effect of homogenization has not been studied regarding the link between early cow's milk consumption and occurrence of type I diabetes in children prone to the disease and no link appears in the general population. Homogenization does not influence milk allergy and intolerance in allergic children and lactose-intolerant or milk-hypersensitive adults. The impact of homogenization, as well as heating and other treatments such as cheesemaking processes, on the health properties of milk and dairy products remains to be fully elucidated.

  4. A study on beam homogeneity for a Siemens Primus linac.

    PubMed

    Cutanda Henriquez, F; Vargas-Castrillón, S T

    2007-06-01

    Asymmetric offset fields are an important tool for radiotherapy and their suitability for treatment should be assessed. Dose homogeneity for highly asymmetric fields has been studied for a Siemens PRIMUS clinical linear accelerator. Profiles and absolute dose have been measured in fields with two jaws at maximal position (20 cm) and the other two at maximal overtravel (10 cm), corresponding to 10 cm x 10 cm fields with extreme offset. Measured profiles have a marked decreasing gradient towards the beam edge, making these fields unsuitable for treatments. The flattening filter radius is smaller than the primary collimator aperture, and this creates beam inhomogeneities that affect large fields in areas far from the collimator axis, and asymmetric fields with large offset. The results presented assess the effect that the design of the primary collimator and flattening filter assembly has on beam homogeneity. This can have clinical consequences for treatments involving fields that include these inhomogeneous areas. Comparison with calculations from a treatment planning system, Philips Pinnacle v6.3, which computes under the hypotheses of a uniformly flattened beam, results in severe discrepancies.

  5. Anthropogenic Matrices Favor Homogenization of Tree Reproductive Functions in a Highly Fragmented Landscape

    PubMed Central

    2016-01-01

    Species homogenization or floristic differentiation are two possible consequences of the fragmentation process in plant communities. Despite the few studies, it seems clear that fragments with low forest cover inserted in anthropogenic matrices are more likely to experience floristic homogenization. However, the homogenization process has two other components, genetic and functional, which have not been investigated. The purpose of this study was to verify whether there was homogenization of tree reproductive functions in a fragmented landscape and, if found, to determine how the process was influenced by landscape composition. The study was conducted in eight fragments in southwestern Brazil. In each fragment, all individual trees with a diameter at breast height ≥3 cm were sampled in ten plots (0.2 ha) and classified within 26 reproductive functional types (RFTs). The process of functional homogenization was evaluated using additive partitioning of diversity. Additionally, the effect of landscape composition on functional diversity and on the number of individuals within each RFT was evaluated using a generalized linear mixed model. The fragments appeared to be in a process of functional homogenization (dominance of some RFTs, alpha diversity lower than expected by chance, and low beta diversity). More than 50% of the RFTs and the functional diversity were affected by the landscape parameters. In general, the percentage of forest cover has a positive effect on RFTs while the percentage of coffee matrix has a negative one. The process of functional homogenization has serious consequences for biodiversity conservation because some functions may disappear that, in the long term, would threaten the fragments. This study contributes to a better understanding of how landscape changes affect functional diversity, the abundance of individuals in RFTs and the process of functional homogenization.
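Additive partitioning, as used above, decomposes landscape-level (gamma) richness into a mean within-fragment (alpha) component plus a between-fragment (beta) component, so a low beta signals homogenization. A minimal sketch on made-up RFT counts (not the study's data):

```python
import numpy as np

# Made-up data: rows are fragments, columns are reproductive functional types
counts = np.array([[30, 5, 0, 1],
                   [25, 8, 2, 0],
                   [28, 4, 1, 1]])

alpha = (counts > 0).sum(axis=1).mean()   # mean RFT richness per fragment
gamma = (counts.sum(axis=0) > 0).sum()    # RFT richness of the pooled landscape
beta = gamma - alpha                      # additive between-fragment component
# beta close to 0 means fragments share nearly all RFTs, i.e. homogenization
```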

  6. Can cognitive science create a cognitive economics?

    PubMed

    Chater, Nick

    2015-02-01

    Cognitive science can intersect with economics in at least three productive ways: by providing richer models of individual behaviour for use in economic analysis; by drawing from economic theory in order to model distributed cognition; and jointly to create more powerful 'rational' models of cognitive processes and social interaction. There is the prospect of moving from behavioural economics to a genuinely cognitive economics.

  7. Designing and Creating Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    McMeen, George R.

    Designed to encourage the use of a defined methodology and careful planning in creating computer-assisted instructional programs, this paper describes the instructional design process, compares computer-assisted instruction (CAI) and programmed instruction (PI), and discusses pragmatic concerns in computer programming. Topics addressed include:…

  8. Making Coalitions Work: Creating a Viable Environment.

    ERIC Educational Resources Information Center

    Killacky, Jim; Hulse-Killacky, Diana

    1997-01-01

    Describes community-based programming (CBP), a cooperative process that allows community and technical colleges to address critical issues through coalitions. Provides strategies for creating effective coalitions, focusing on information that new members should know, essential leadership qualities, group rules and activities, and ways to close…

  9. Effect of high-pressure homogenization on different matrices of food supplements.

    PubMed

    Martínez-Sánchez, Ascensión; Tarazona-Díaz, Martha Patricia; García-González, Antonio; Gómez, Perla A; Aguayo, Encarna

    2016-12-01

    There is a growing demand for food supplements containing high amounts of vitamins, phenolic compounds and minerals that provide health benefits. These functional compounds have different solubility properties, and maintaining them while guaranteeing a homogeneous product requires the application of novel technologies. The quality of different drinkable functional foods after thermal processing (0.1 MPa) or high-pressure homogenization under two different conditions (80 MPa, 33 ℃ and 120 MPa, 43 ℃) was studied. Physicochemical characteristics and sensory qualities were evaluated throughout six months of accelerated storage at 40 ℃ and 75% relative humidity (RH). Aroma and color were better maintained in high-pressure homogenization-treated samples than in the thermally treated ones, which contributed significantly to extending their shelf life. The small particle size obtained after high-pressure homogenization treatments caused differences in turbidity and viscosity with respect to heat-treated samples. The use of high-pressure homogenization, more specifically at 120 MPa, provided active-ingredient homogeneity that ensures uniform content in functional food supplements. Although the effect of high-pressure homogenization can be affected by the food matrix, it can be implemented as an alternative to conventional heat treatments in a commercial setting within the functional food supplement or pharmaceutical industry.

  10. Photoinduced electron transfer processes in homogeneous and microheterogeneous solutions

    SciTech Connect

    Whitten, D.G.

    1991-10-01

    The focus of the work described in this report is on single electron transfer reactions of excited states which culminate in the formation of stable or metastable even electron species. For the most part the studies have involved even electron organic substrates which are thus converted photochemically to odd electron species and then at some stage reconvert to even electron products. These reactions generally fall into two rather different categories. In one set of studies we have examined reactions in which the metastable reagents generated by single electron transfer quenching of an excited state undergo novel fragmentation reactions, chiefly involving C-C bond cleavage. These reactions often culminate in novel and potentially useful chemical reactions and frequently have the potential for leading to new chemical products otherwise unaffordable by conventional reaction paths. In a rather different investigation we have also studied reactions in which single electron transfer quenching of an excited state is followed by subsequent reactions which lead reversibly to metastable two electron products which, often stable in themselves, can nonetheless be reacted with each other or with other reagents to regenerate the starting materials with release of energy. 66 refs., 9 figs., 1 tab.

  11. Background: What the States Created

    ERIC Educational Resources Information Center

    Cox, James C.

    2009-01-01

    Prior to 2003, virtual universities were being created at a rate that would question the usual perception that higher education rarely changed, or changed (if at all) at a glacial speed. No comprehensive study of what was actually being created had been done; nor had anyone tapped the experiences of the developers in the states to see what was…

  12. Homogeneous UVA system for corneal cross-linking treatment

    NASA Astrophysics Data System (ADS)

    Ayres Pereira, Fernando R.; Stefani, Mario A.; Otoboni, José A.; Richter, Eduardo H.; Ventura, Liliane

    2010-02-01

    The treatment of keratoconus and corneal ulcers by collagen cross-linking using ultraviolet type A (UVA) irradiation, combined with the photosensitizer riboflavin (vitamin B2), is a promising technique. The standard protocol suggests instilling riboflavin in the pre-scratched cornea every 5 min for 30 min, during UVA irradiation of the cornea at 3 mW/cm² for 30 min. This process leads to an increase of the biomechanical strength of the cornea, stopping the progression of, or sometimes even reversing, keratoconus. Collagen cross-linking can be achieved by many methods, but the utilization of UVA light for this purpose is ideal because it permits a homogeneous treatment, leading to an equal result along the treated area. We have developed a system, to be used clinically for treatment of unhealthy corneas using the cross-linking technique, which consists of a UVA-emitting delivery device controlled by a closed-loop system with high homogeneity. The system is tunable and delivers 3-5 mW/cm², at 365 nm, for three spots (6 mm, 8 mm and 10 mm in diameter). The electronic closed loop achieves 1% precision, leading to an overall error after calibration of less than 10%, and approximately 96% homogeneity.
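The protocol figures above fix the total radiant exposure delivered to the cornea; the arithmetic is simply irradiance times time:

```python
irradiance_mW_cm2 = 3.0      # protocol irradiance
exposure_min = 30            # irradiation time
dose_J_cm2 = irradiance_mW_cm2 * exposure_min * 60 / 1000   # mW*s -> J
print(dose_J_cm2)            # 5.4 J/cm2 total radiant exposure
```

This 5.4 J/cm² figure is why a tunable 3-5 mW/cm² source must hold its output within tight tolerances: any irradiance error scales the delivered dose proportionally.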

  13. Tissue homogeneity requires inhibition of unequal gene silencing during development

    PubMed Central

    Le, Hai H.; Looney, Monika; Strauss, Benjamin; Bloodgood, Michael

    2016-01-01

    Multicellular organisms can generate and maintain homogenous populations of cells that make up individual tissues. However, cellular processes that can disrupt homogeneity and how organisms overcome such disruption are unknown. We found that ∼100-fold differences in expression from a repetitive DNA transgene can occur between intestinal cells in Caenorhabditis elegans. These differences are caused by gene silencing in some cells and are actively suppressed by parental and zygotic factors such as the conserved exonuclease ERI-1. If unsuppressed, silencing can spread between some cells in embryos but can be repeat specific and independent of other homologous loci within each cell. Silencing can persist through DNA replication and nuclear divisions, disrupting uniform gene expression in developed animals. Analysis at single-cell resolution suggests that differences between cells arise during early cell divisions upon unequal segregation of an initiator of silencing. Our results suggest that organisms with high repetitive DNA content, which include humans, could use similar developmental mechanisms to achieve and maintain tissue homogeneity. PMID:27458132

  14. Homogenization of tissues via picosecond-infrared laser (PIRL) ablation: Giving a closer view on the in-vivo composition of protein species as compared to mechanical homogenization

    PubMed Central

    Kwiatkowski, M.; Wurlitzer, M.; Krutilin, A.; Kiani, P.; Nimer, R.; Omidi, M.; Mannaa, A.; Bussmann, T.; Bartkowiak, K.; Kruber, S.; Uschold, S.; Steffen, P.; Lübberstedt, J.; Küpker, N.; Petersen, H.; Knecht, R.; Hansen, N.O.; Zarrine-Afsar, A.; Robertson, W.D.; Miller, R.J.D.; Schlüter, H.

    2016-01-01

    Posttranslational modifications and proteolytic processing regulate almost all physiological processes. Dysregulation can potentially result in pathologic protein species causing diseases. Thus, tissue species proteomes of diseased individuals provide diagnostic information. Since the composition of tissue proteomes can rapidly change during tissue homogenization by the action of enzymes released from their compartments, disease specific protein species patterns can vanish. Recently, we described a novel, ultrafast and soft method for cold vaporization of tissue via desorption by impulsive vibrational excitation (DIVE) using a picosecond-infrared-laser (PIRL). Given that DIVE extraction may provide improved access to the original composition of protein species in tissues, we compared the proteome composition of tissue protein homogenates after DIVE homogenization with conventional homogenizations. A higher number of intact protein species was observed in DIVE homogenates. Due to the ultrafast transfer of proteins from tissues via gas phase into frozen condensates of the aerosols, intact protein species were exposed to a lesser extent to enzymatic degradation reactions compared with conventional protein extraction. In addition, the total yield of the number of proteins is higher in DIVE homogenates, because they are very homogeneous and contain almost no insoluble particles, allowing direct analysis with subsequent analytical methods without the necessity of centrifugation. Biological significance: Enzymatic protein modifications during tissue homogenization are responsible for changes of the in-vivo protein species composition. Cold vaporization of tissues by PIRL-DIVE is comparable with taking a snapshot at the time of the laser irradiation of the dynamic changes that occur continuously under in-vivo conditions. At that time point all biomolecules are transferred into an aerosol, which is immediately frozen. PMID:26778141

  15. Applications of High and Ultra High Pressure Homogenization for Food Safety

    PubMed Central

    Patrignani, Francesca; Lanciotti, Rosalba

    2016-01-01

    Traditionally, the shelf-life and safety of foods have been achieved by thermal processing. Low-temperature long-time and high-temperature short-time treatments are the most commonly used hurdles for the pasteurization of fluid foods and raw materials. However, thermal treatments can reduce product quality and freshness. Consequently, several non-thermal pasteurization processes have been proposed during the last decades, including high hydrostatic pressure, pulsed electric field, ultrasound (US), and high pressure homogenization (HPH). This last technique has been demonstrated to have great potential to provide “fresh-like” products with prolonged shelf-life. Moreover, the recent developments in high-pressure-homogenization technology and the design of new homogenization valves able to withstand pressures up to 350–400 MPa have opened new opportunities for homogenization processing in the food industries and, consequently, permitted the development of new products differentiated from traditional ones by sensory and structural characteristics or functional properties. Therefore, this review deals with the principal mechanisms of action of HPH against microorganisms of food concern in relation to the adopted homogenizer and process parameters. In addition, the effects of homogenization on the inactivation of foodborne pathogenic species in relation to the food matrix and to chemico-physical and process variables will be reviewed. The combined use of this alternative technology with other non-thermal technologies will also be considered. PMID:27536270

  16. Kinematical uniqueness of homogeneous isotropic LQC

    NASA Astrophysics Data System (ADS)

    Engle, Jonathan; Hanusch, Maximilian

    2017-01-01

    In a paper by Ashtekar and Campiglia, invariance under volume preserving residual diffeomorphisms has been used to single out the standard representation of the reduced holonomy-flux algebra in homogeneous loop quantum cosmology (LQC). In this paper, we use invariance under all residual diffeomorphisms to single out the standard kinematical Hilbert space of homogeneous isotropic LQC for both the standard configuration space R_Bohr, as well as for the Fleischhack one R ⊔ R_Bohr. We first determine the scale invariant Radon measures on these spaces, and then show that the Haar measure on R_Bohr is the only such measure for which the momentum operator is hermitian w.r.t. the corresponding inner product. In particular, the measure is forced to be identically zero on R in the Fleischhack case, so that for both approaches, the standard kinematical LQC-Hilbert space is singled out.

  17. Detonation in shocked homogeneous high explosives

    SciTech Connect

    Yoo, C.S.; Holmes, N.C.; Souers, P.C.

    1995-11-01

    We have studied shock-induced changes in homogeneous high explosives including nitromethane, tetranitromethane, and single crystals of pentaerythritol tetranitrate (PETN) by using fast time-resolved emission and Raman spectroscopy at a two-stage light-gas gun. The results reveal three distinct steps during which the homogeneous explosives chemically evolve to final detonation products. These are (1) the initiation of shock compressed high explosives after an induction period, (2) thermal explosion of shock-compressed and/or reacting materials, and (3) a decay to a steady-state representing a transition to the detonation of uncompressed high explosives. Based on a gray-body approximation, we have obtained the CJ temperatures: 3800 K for nitromethane, 2950 K for tetranitromethane, and 4100 K for PETN. We compare the data with various thermochemical equilibrium calculations. In this paper we will also show a preliminary result of single-shot time-resolved Raman spectroscopy applied to shock-compressed nitromethane.

  18. Coherent Eigenmodes in Homogeneous MHD Turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    2010-01-01

    The statistical mechanics of Fourier models of ideal, homogeneous, incompressible magnetohydrodynamic (MHD) turbulence is discussed, along with their relevance for dissipative magnetofluids. Although statistical theory predicts that Fourier coefficients of fluid velocity and magnetic field are zero-mean random variables, numerical simulations clearly show that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation, i.e., we have coherent structure. We use eigenanalysis of the modal covariance matrices in the probability density function to explain this phenomenon in terms of 'broken ergodicity', which is defined to occur when dynamical behavior does not match ensemble predictions on very long time-scales. We provide examples from 2-D and 3-D magnetohydrodynamic simulations of homogeneous turbulence, and show new results from long-time simulations of MHD turbulence with and without a mean magnetic field.
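The coherence diagnostic described above (a time-averaged Fourier coefficient whose magnitude is large compared with its standard deviation) can be sketched on synthetic time series; the numbers below are illustrative stand-ins, not simulation output:

```python
import numpy as np

rng = np.random.default_rng(7)
n_steps = 50_000
# Synthetic stand-ins for the time history of two Fourier coefficients:
ergodic_mode = rng.normal(0.0, 1.0, n_steps)         # zero mean, as ensemble theory predicts
coherent_mode = 5.0 + rng.normal(0.0, 1.0, n_steps)  # large non-zero mean: coherent structure

def mean_to_std(x):
    """Broken-ergodicity indicator: |time mean| relative to fluctuation level."""
    return abs(x.mean()) / x.std(ddof=1)

# An ergodic mode has ratio ~ 1/sqrt(n_steps); a coherent mode has ratio >> 1
```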

  19. Program Logics for Homogeneous Meta-programming

    NASA Astrophysics Data System (ADS)

    Berger, Martin; Tratt, Laurence

    A meta-program is a program that generates or manipulates another program; in homogeneous meta-programming, a program may generate new parts of, or manipulate, itself. Meta-programming has been used extensively since macros were introduced to Lisp, yet we have little idea how formally to reason about meta-programs. This paper provides the first program logics for homogeneous meta-programming, using a variant of Mini-ML□ by Davies and Pfenning as the underlying meta-programming language. We show the applicability of our approach by reasoning about example meta-programs from the literature. We also demonstrate that our logics are relatively complete in the sense of Cook, enable the inductive derivation of characteristic formulae, and exactly capture the observational properties induced by the operational semantics.

  20. CUDA Simulation of Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John V.; Shum, Victor; Fu, Terry

    2011-01-01

    We discuss very fast Compute Unified Device Architecture (CUDA) simulations of ideal homogeneous incompressible turbulence based on Fourier models. These models have associated statistical theories that predict that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. Prior numerical simulations have shown that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We review the theoretical basis of this "broken ergodicity" as applied to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence. Our new simulations examine the phenomenon of broken ergodicity through very long time and large grid size runs performed on a state-of-the-art CUDA platform. Results comparing various CUDA hardware configurations and grid sizes are discussed. Navier-Stokes (NS) and MHD results are compared.

  1. Homogeneous isolation of nanocellulose from sugarcane bagasse by high pressure homogenization.

    PubMed

    Li, Jihua; Wei, Xiaoyi; Wang, Qinghuang; Chen, Jiacui; Chang, Gang; Kong, Lingxue; Su, Junbo; Liu, Yuhuan

    2012-11-06

    Nanocellulose from sugarcane bagasse was isolated by high pressure homogenization in a homogeneous medium. Pretreatment with an ionic liquid (1-butyl-3-methylimidazolium chloride, [Bmim]Cl) was first used to dissolve the bagasse cellulose. Subsequently, the homogeneous solution was passed through a high pressure homogenizer without any clogging. The nanocellulose was obtained at 80 MPa for 30 cycles with a recovery of 90% under the optimum refining condition. The nanocellulose was characterized by Fourier transform infrared spectra, X-ray diffraction, thermogravimetric analysis, rheological measurements and transmission electron microscopy. The results showed that the nanocellulose was 10-20 nm in diameter and presented lower thermal stability and crystallinity than the original cellulose. The nanocellulose developed here promises to be a versatile renewable material.

  2. Recent advances in homogeneous nickel catalysis

    NASA Astrophysics Data System (ADS)

    Tasker, Sarah Z.; Standley, Eric A.; Jamison, Timothy F.

    2014-05-01

    Tremendous advances have been made in nickel catalysis over the past decade. Several key properties of nickel, such as facile oxidative addition and ready access to multiple oxidation states, have allowed the development of a broad range of innovative reactions. In recent years, these properties have been increasingly understood and used to perform transformations long considered exceptionally challenging. Here we discuss some of the most recent and significant developments in homogeneous nickel catalysis, with an emphasis on both synthetic outcome and mechanism.

  3. Effect of homogenization and ultrasonication on the physical properties of insoluble wheat bran fibres

    NASA Astrophysics Data System (ADS)

    Hu, Ran; Zhang, Min; Adhikari, Benu; Liu, Yaping

    2015-10-01

    Wheat bran is rich in dietary fibre and produced in abundance, yet it remains underutilized. Insoluble dietary fibre often influences food quality negatively; therefore, improving the physical and chemical properties of insoluble wheat bran fibre for post-processing is a challenge. Insoluble dietary fibre was obtained from wheat bran and micronized using high-pressure homogenization, high-intensity sonication, and a combination of these two methods. The high-pressure homogenization and high-pressure homogenization+high-intensity sonication treatments significantly (p<0.05) improved the solubility, swelling, water-holding, oil-holding, and cation exchange capacities. The improvement of the above properties by high-intensity sonication alone was marginal. In most cases, the high-pressure homogenization process was as good as the high-pressure homogenization+high-intensity sonication process in improving the above-mentioned properties; hence, the contribution of high-intensity sonication in the combined process was minimal. At best, the particle size of the wheat bran fibre was reduced to 9 μm, with significant changes in the solubility, swelling, water-holding, oil-holding, and cation exchange capacities.

  4. TESTING HOMOGENEITY WITH GALAXY STAR FORMATION HISTORIES

    SciTech Connect

    Hoyle, Ben; Jimenez, Raul; Tojeiro, Rita; Maartens, Roy; Heavens, Alan; Clarkson, Chris

    2013-01-01

    Observationally confirming spatial homogeneity on sufficiently large cosmological scales is of importance to test one of the underpinning assumptions of cosmology, and is also imperative for correctly interpreting dark energy. A challenging aspect of this is that homogeneity must be probed inside our past light cone, while observations take place on the light cone. The star formation history (SFH) in the galaxy fossil record provides a novel way to do this. We calculate the SFH of stacked luminous red galaxy (LRG) spectra obtained from the Sloan Digital Sky Survey. We divide the LRG sample into 12 equal-area contiguous sky patches and 10 redshift slices (0.2 < z < 0.5), which correspond to 120 blocks of volume ≈0.04 Gpc³. Using the SFH in a time period that samples the history of the universe between look-back times 11.5 and 13.4 Gyr as a proxy for homogeneity, we calculate the posterior distribution for the excess large-scale variance due to inhomogeneity, and find that the most likely solution is no extra variance at all. At 95% credibility, there is no evidence of deviations larger than 5.8%.
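The logic of an excess-variance test can be sketched with synthetic numbers (this is an illustration of the idea, not the SDSS/LRG analysis; the block count matches the record but the proxy values and noise level are hypothetical):

```python
import numpy as np

# Synthetic sketch (not the SDSS/LRG analysis): under homogeneity, the
# variance of a star-formation-history proxy across sky/redshift blocks
# should be fully explained by per-block measurement noise, leaving no
# excess large-scale variance. All numbers here are hypothetical.
rng = np.random.default_rng(2)

n_blocks = 120          # e.g. 12 sky patches x 10 redshift slices
noise_sigma = 0.03      # assumed per-block measurement uncertainty

# A homogeneous universe: one true value plus measurement noise per block.
proxy = rng.normal(loc=1.0, scale=noise_sigma, size=n_blocks)

observed_var = proxy.var(ddof=1)
excess_var = max(observed_var - noise_sigma**2, 0.0)
print(excess_var)  # consistent with zero for a homogeneous sky
```

In the actual analysis the excess variance is treated as a parameter with a posterior distribution rather than a point estimate, but the comparison of observed scatter against the noise budget is the same.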

  5. Tits Satake projections of homogeneous special geometries

    NASA Astrophysics Data System (ADS)

    Fré, Pietro; Gargiulo, Floriana; Rosseel, Jan; Rulik, Ksenya; Trigiante, Mario; Van Proeyen, Antoine

    2007-01-01

    We organize the homogeneous special geometries, describing as well the couplings of D = 6, 5, 4 and 3 supergravities with eight supercharges, in a small number of universality classes. This relates manifolds on which similar types of dynamical solutions can exist. The mathematical ingredient is the Tits Satake projection of real simple Lie algebras, which we extend to all solvable Lie algebras occurring in these homogeneous special geometries. Apart from some exotic cases, all the other 'very special' homogeneous manifolds can be grouped into seven universality classes. The organization of these classes, which capture the essential features of their basic dynamics, commutes with the r- and c-map. Different members are distinguished by different choices of the paint group, a notion discovered in the context of cosmic billiard dynamics of non-maximally supersymmetric supergravities. We comment on the usefulness of this organization into universality classes both in relation with cosmic billiard dynamics and with configurations of branes and orbifolds defining special geometry backgrounds.

  6. Homogeneous Biosensing Based on Magnetic Particle Labels

    PubMed Central

    Schrittwieser, Stefan; Pelaz, Beatriz; Parak, Wolfgang J.; Lentijo-Mozo, Sergio; Soulantica, Katerina; Dieckhoff, Jan; Ludwig, Frank; Guenther, Annegret; Tschöpe, Andreas; Schotter, Joerg

    2016-01-01

    The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation. PMID:27275824

  7. Effect of homogenization and pasteurization on the structure and stability of whey protein in milk.

    PubMed

    Qi, Phoebe X; Ren, Daxi; Xiao, Yingping; Tomasula, Peggy M

    2015-05-01

    The effect of homogenization alone or in combination with high-temperature, short-time (HTST) pasteurization or UHT processing on the whey fraction of milk was investigated using highly sensitive spectroscopic techniques. In pilot plant trials, 1-L quantities of whole milk were homogenized in a 2-stage homogenizer at 35°C (6.9 MPa/10.3 MPa) and, along with skim milk, were subjected to HTST pasteurization (72°C for 15 s) or UHT processing (135°C for 2 s). Other whole milk samples were processed using homogenization followed by either HTST pasteurization or UHT processing. The processed skim and whole milk samples were centrifuged further to remove fat and then acidified to pH 4.6 to isolate the corresponding whey fractions, and centrifuged again. The whey fractions were then purified using dialysis and investigated using the circular dichroism, Fourier transform infrared, and Trp intrinsic fluorescence spectroscopic techniques. Results demonstrated that homogenization combined with UHT processing of milk caused not only changes in protein composition but also significant secondary structural loss, particularly in the amounts of apparent antiparallel β-sheet and α-helix, as well as diminished tertiary structural contact. Neither homogenization alone nor homogenization followed by HTST treatment caused appreciable chemical changes or notable secondary structural reduction. But disruption was evident in the tertiary structural environment of the whey proteins due to homogenization of whole milk, as shown by both the near-UV circular dichroism and Trp intrinsic fluorescence. In-depth structural stability analyses revealed that even though processing of milk imposed little impairment on the secondary structural stability, the tertiary structural stability of whey protein was altered significantly. The following order was derived based on these studies: raw whole > HTST, homogenized, homogenized and pasteurized > skimmed and pasteurized, and skimmed UHT > homogenized

  8. Converting Homogeneous to Heterogeneous in Electrophilic Catalysis using Monodisperse Metal Nanoparticles

    SciTech Connect

    Witham, Cole A.; Huang, Wenyu; Tsung, Chia-Kuang; Kuhn, John N.; Somorjai, Gabor A.; Toste, F. Dean

    2009-10-15

    A continuing goal in catalysis is the transformation of processes from homogeneous to heterogeneous. To this end, nanoparticles represent a new frontier in heterogeneous catalysis, where this conversion is supplemented by the ability to obtain new or divergent reactivity and selectivity. We report a novel method for applying heterogeneous catalysts to known homogeneous catalytic reactions through the design and synthesis of electrophilic platinum nanoparticles. These nanoparticles are selectively oxidized by the hypervalent iodine species PhICl₂, and catalyze a range of π-bond activation reactions previously only homogeneously catalyzed. Multiple experimental methods are utilized to unambiguously verify the heterogeneity of the catalytic process. The discovery of treatments for nanoparticles that induce the desired homogeneous catalytic activity should lead to the further development of reactions previously inaccessible in heterogeneous catalysis. Furthermore, our size and capping agent study revealed that Pt PAMAM dendrimer-capped nanoparticles demonstrate superior activity and recyclability compared to larger, polymer-capped analogues.

  9. Homogeneous and heterogeneous chemistry along air parcel trajectories

    NASA Technical Reports Server (NTRS)

    Jones, R. L.; Mckenna, D. L.; Poole, L. R.; Solomon, S.

    1990-01-01

    The study of coupled heterogeneous and homogeneous chemistry due to polar stratospheric clouds (PSCs) using Lagrangian parcel trajectories for interpretation of the Airborne Arctic Stratosphere Experiment (AASE) is discussed. This approach represents an attempt to quantitatively model the physical and chemical perturbation to stratospheric composition due to the formation of PSCs using the fullest possible representation of the relevant processes. Further, the meteorological fields from the United Kingdom Meteorological Office global model were used to deduce potential vorticity and inferred regions of PSCs as an input to flight planning during AASE.

  10. Homogeneous Charge Compression Ignition Free Piston Linear Alternator

    SciTech Connect

    Janson Wu; Nicholas Paradiso; Peter Van Blarigan; Scott Goldsborough

    1998-11-01

    An experimental and theoretical investigation of a homogeneous charge compression ignition (HCCI) free piston powered linear alternator has been conducted to determine if improvements can be made in the thermal and conversion efficiencies of modern electrical generator systems. Performance of a free piston engine was investigated using a rapid compression expansion machine and a full cycle thermodynamic model. Linear alternator performance was investigated with a computer model. In addition, linear alternator testing and permanent magnet characterization hardware were developed. The development of the two-stroke cycle scavenging process has begun.

  11. Creating and Nurturing Strong Teams.

    ERIC Educational Resources Information Center

    Martin, Kaye M.

    1999-01-01

    Discusses ways to create and sustain strong teaching teams, including matching curriculum goals, complementary professional strengths, and exercise of autonomy. Elaborates the administrator's role in nurturing and supporting teamwork. (JPB)

  12. Creating a family health history

    MedlinePlus

    Family health history; Create a family health history; Family medical history ... Many factors affect your health. These include your: Genes Diet and exercise habits Environment Family members tend to share certain behaviors, genetic traits, and habits. ...

  13. Creating and Exploring Simple Models

    ERIC Educational Resources Information Center

    Hubbard, Miles J.

    2007-01-01

    Students manipulate data algebraically and statistically to create models applied to a falling ball. They also borrow tools from arithmetic progressions to examine the relationship between the velocity and the distance the ball falls. (Contains 2 tables and 5 figures.)

  14. Creating Cartoons to Promote Leaderships Skills and Explore Leadership Qualities

    ERIC Educational Resources Information Center

    Smith, Latisha L.; Clausen, Courtney K.; Teske, Jolene K.; Ghayoorrad, Maryam; Gray, Phyllis; Al Subia, Sukainah; Atwood-Blaine, Dana; Rule, Audrey C.

    2015-01-01

    This document describes a strategy for increasing student leadership and creativity skills through the creation of cartoons. Creating cartoons engages students in divergent thinking and cognitive processes, such as perception, recall, and mental processing. When students create cartoons focused on a particular topic, they are making connections to…

  15. Homogenous charge compression ignition engine having a cylinder including a high compression space

    DOEpatents

    Agama, Jorge R.; Fiveland, Scott B.; Maloney, Ronald P.; Faletti, James J.; Clarke, John M.

    2003-12-30

    The present invention relates generally to the field of homogeneous charge compression engines. In these engines, fuel is injected upstream or directly into the cylinder when the power piston is relatively close to its bottom dead center position. The fuel mixes with air in the cylinder as the power piston advances to create a relatively lean homogeneous mixture that preferably ignites when the power piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage, can result. Thus, the present invention divides the homogeneous charge between a controlled volume higher compression space and a lower compression space to better control the start of ignition.

  16. Computationally Probing the Performance of Hybrid, Heterogeneous, and Homogeneous Iridium-Based Catalysts for Water Oxidation

    SciTech Connect

    García-Melchor, Max; Vilella, Laia; López, Núria; Vojvodic, Aleksandra

    2016-04-29

    An attractive strategy to improve the performance of water oxidation catalysts would be to anchor a homogeneous molecular catalyst on a heterogeneous solid surface to create a hybrid catalyst. The idea of this combined system is to take advantage of the individual properties of each of the two catalyst components. We use Density Functional Theory to determine the stability and activity of a model hybrid water oxidation catalyst consisting of a dimeric Ir complex attached to the IrO2(110) surface through two oxygen atoms. We find that the homogeneous catalyst can be bound to its oxide support without losing significant activity. Hence, designing hybrid systems that benefit from both the high tunability of activity of homogeneous catalysts and the stability of heterogeneous systems seems feasible.

  17. Turbulent Diffusion in Non-Homogeneous Environments

    NASA Astrophysics Data System (ADS)

    Diez, M.; Redondo, J. M.; Mahjoub, O. B.; Sekula, E.

    2012-04-01

    Many experimental studies have been devoted to the understanding of non-homogeneous turbulent dynamics. Activity in this area intensified when the basic Kolmogorov self-similar theory was extended to two-dimensional or quasi 2D turbulent flows such as those appearing in the environment, which seem to control mixing [1,2]. The statistical description and the dynamics of these geophysical flows depend strongly on the distribution of long-lived organized (coherent) structures. These flows show a complex topology, but may be subdivided in terms of strongly elliptical domains (high vorticity regions), strong hyperbolic domains (deformation cells with high energy condensations) and the background turbulent field of moderate elliptic and hyperbolic characteristics. It is of fundamental importance to investigate the different influence of these topologically diverse regions. Relevant geometrical information on different areas is also given by the maximum fractal dimension, which is related to the energy spectrum of the flow. Using all the available information, it is possible to investigate the spatial variability of the horizontal eddy diffusivity K(x,y). This information would be very important when trying to numerically model the time behaviour of oil spills [3,4]. There is a strong dependence of horizontal eddy diffusivities on the Wave Reynolds number as well as on the wind stress, measured as the friction velocity from wind profiles taken at the coastline. Natural sea surface oily slicks of diverse origin (plankton, algae or natural emissions and seeps of oil) form complicated structures on the sea surface due to the effects of both multiscale turbulence and Langmuir circulation. It is then possible to use the topological and scaling analysis to discriminate the different physical sea surface processes. We can relate higher order moments of the Lagrangian velocity to the effective diffusivity in spite of the need to calibrate the different regions determining the

  18. Homogenization, lyophilization or acid-extraction of meat products improves iron uptake from cereal-meat product combinations in an in vitro digestion/Caco-2 cell model.

    PubMed

    Pachón, Helena; Stoltzfus, Rebecca J; Glahn, Raymond P

    2009-03-01

    The effect of processing (homogenization, lyophilization, acid-extraction) meat products on iron uptake from meat combined with uncooked iron-fortified cereal was evaluated using an in vitro digestion/Caco-2 cell model. Beef was cooked, blended to create smaller meat particles, and combined with electrolytic iron-fortified infant rice cereal. Chicken liver was cooked and blended, lyophilized, or acid-extracted, and combined with FeSO4-fortified wheat flour. In the beef-cereal combination, Caco-2 cell iron uptake, assessed by measuring the ferritin formed by cells, was greater when the beef was blended for the greatest amount of time (360 s) compared with 30 s (P < 0.05). Smaller liver particles (blended for 360 s or lyophilized) significantly enhanced iron uptake compared to liver blended for 60 s (P < 0.001) in the liver-flour combination. Compared to liver blended for 60 s, acid-extraction of liver significantly enhanced iron uptake (P = 0.03) in the liver-flour combination. Homogenization of beef and homogenization, lyophilization, or acid-extraction of chicken liver increases the enhancing effect of meat products on iron absorption in iron-fortified cereals.

  19. Numerical homogenization of the Richards equation for unsaturated water flow through heterogeneous soils

    NASA Astrophysics Data System (ADS)

    Li, Na; Yue, Xingye; Ren, Li

    2016-11-01

    Homogenized equations and the corresponding effective constitutive relations are generally necessary for numerically modeling large-scale unsaturated flow processes in soils. Recently, based on the Kirchhoff transformation and the two-scale convergence theory, a homogenization method for the Richards equation with the Mualem-van Genuchten model has been proposed, with a constant model parameter α relating to the inverse of the air-entry pressure and the soil pore size distribution. The homogenized model is computationally efficient and convenient to use because of its explicit expression. In this study, we generalize this method, allowing α to be a spatially distributed random field and proposing a homogenized Richards equation in the mixed form (θ/h) under the condition that the effective hydraulic conductivity tensor is diagonal. This generalization eliminates the limitation of a constant α in practical applications; the proposed homogenized model is meaningful in most situations because the flow problems are influenced mainly by the diagonal terms of conductivity and the off-diagonal terms are often neglected. Two-dimensional numerical tests are conducted in soil profiles with different degrees of spatial heterogeneity structure to illustrate that the homogenized model can capture the fine-scale flow behaviors on coarse grids effectively. Homogenization for the Richards equation with other two commonly used constitutive relations—the Brooks-Corey model and the Gardner-Russo model—is also illustrated in this study.
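For reference, the fine-scale model underlying this homogenization can be written (in standard notation, which may differ in detail from the paper's) as the mixed-form Richards equation together with the van Genuchten retention curve, whose parameter α is the spatially random field discussed above:

```latex
% Mixed-form Richards equation: pressure head h, water content \theta,
% hydraulic conductivity K, vertical coordinate z
\frac{\partial \theta(h)}{\partial t}
  = \nabla \cdot \bigl[ K(h)\, \nabla (h + z) \bigr]

% van Genuchten retention curve; \alpha relates to the inverse air-entry
% pressure, n > 1 is a pore-size parameter, and m = 1 - 1/n
\theta(h) = \theta_r + (\theta_s - \theta_r)\,
            \bigl[ 1 + |\alpha h|^{n} \bigr]^{-m}
```

The Kirchhoff transformation mentioned in the abstract linearizes the flux term of the first equation, which is what makes an explicit homogenized expression tractable.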

  20. Modeling the homogenization kinetics of as-cast U-10wt% Mo alloys

    NASA Astrophysics Data System (ADS)

    Xu, Zhijie; Joshi, Vineet; Hu, Shenyang; Paxton, Dean; Lavender, Curt; Burkes, Douglas

    2016-04-01

    Low-enriched U-22at% Mo (U-10Mo) alloy has been considered as an alternative material to replace the highly enriched fuels in research reactors. For the U-10Mo to work effectively and replace the existing fuel material, a thorough understanding of the microstructure development from as-cast to the final formed structure is required. The as-cast microstructure is typically inhomogeneous, containing molybdenum-rich and molybdenum-lean regions, which may affect the processing and possibly the in-reactor performance. This as-cast structure must be homogenized by thermal treatment to produce a uniform Mo distribution. The development of a modeling capability will improve the understanding of the effect of initial microstructures on the Mo homogenization kinetics. In the current work, we investigated the effect of as-cast microstructure on the homogenization kinetics. The kinetics of the homogenization was modeled based on a rigorous algorithm that relates the line scan data of Mo concentration to the gray scale in energy dispersive spectroscopy images, which was used to generate a reconstructed Mo concentration map. The map was then used as realistic microstructure input for physics-based homogenization models, where the entire homogenization kinetics can be simulated and validated against the available experiment data at different homogenization times and temperatures.

  2. Effect of ultrasonic homogenization on the Vis/NIR bulk optical properties of milk.

    PubMed

    Aernouts, Ben; Van Beers, Robbe; Watté, Rodrigo; Huybrechts, Tjebbe; Jordens, Jeroen; Vermeulen, Daniel; Van Gerven, Tom; Lammertyn, Jeroen; Saeys, Wouter

    2015-02-01

    The size of colloidal particles in food products has a considerable impact on the product's physicochemical, functional and sensory characteristics. Measurement techniques to monitor the size of suspended particles could, therefore, help to further reduce the variability in production processes and promote the development of new food products with improved properties. Visible and near-infrared (Vis/NIR) spectroscopy is already widely used to measure the composition of agricultural and food products. However, this technology can also be used to acquire microstructure-related scattering properties of food products. In this study, the effect of the fat globule size on the Vis/NIR bulk scattering properties of milk was investigated. Variability in fat globule size distribution was created using ultrasonic homogenization of raw milk. Reduction of the fat globule size resulted in a higher wavelength-dependency of both the Vis/NIR bulk scattering coefficient and the scattering anisotropy factor. Moreover, the anisotropy factor and the bulk scattering coefficients for wavelengths above 600 nm were reduced and were dominated by Rayleigh scattering. Additionally, the bulk scattering properties could be accurately estimated (R² ≥ 0.990) from measured particle size distributions using an algorithm based on the Mie solution. Future research could aim at the inversion of this model to estimate particle size distributions from Vis/NIR spectroscopic measurements.
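The qualitative trend can be sketched with the Rayleigh limit of scattering (a hedged illustration, not the authors' Mie-based model; the globule radii and units below are hypothetical):

```python
import numpy as np

# Illustrative sketch (not the authors' Mie model): in the Rayleigh limit
# (particle radius much smaller than the wavelength), scattering strength
# scales roughly as r^6 / lambda^4, so reducing fat-globule size both
# lowers the scattering level and steepens its wavelength dependence.
def rayleigh_scattering(wavelength_nm: np.ndarray, radius_nm: float) -> np.ndarray:
    """Relative Rayleigh scattering strength (arbitrary units)."""
    return radius_nm ** 6 / wavelength_nm ** 4

wavelengths = np.linspace(600.0, 1000.0, 5)   # Vis/NIR range above 600 nm
raw = rayleigh_scattering(wavelengths, radius_nm=400.0)          # larger globules
homogenized = rayleigh_scattering(wavelengths, radius_nm=100.0)  # after sonication

print(raw / homogenized)  # smaller globules scatter far less at all wavelengths
```

The full Mie solution is needed for globules comparable in size to the wavelength; the Rayleigh form is only the small-particle asymptote that the abstract says dominates after homogenization.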

  3. Sulfur isotope homogeneity of lunar mare basalts

    NASA Astrophysics Data System (ADS)

    Wing, Boswell A.; Farquhar, James

    2015-12-01

    We present a new set of high precision measurements of relative 33S/32S, 34S/32S, and 36S/32S values in lunar mare basalts. The measurements are referenced to the Vienna-Canyon Diablo Troilite (V-CDT) scale, on which the international reference material, IAEA-S-1, is characterized by δ33S = -0.061‰, δ34S ≡ -0.3‰ and δ36S = -1.27‰. The present dataset confirms that lunar mare basalts are characterized by a remarkable degree of sulfur isotopic homogeneity, with most new and published SF6-based sulfur isotope measurements consistent with a single mass-dependent mean isotopic composition of δ34S = 0.58 ± 0.05‰, Δ33S = 0.008 ± 0.006‰, and Δ36S = 0.2 ± 0.2‰, relative to V-CDT, where the uncertainties are quoted as 99% confidence intervals on the mean. This homogeneity allows identification of a single sample (12022, 281) with an apparent 33S enrichment, possibly reflecting cosmic-ray-induced spallation reactions. It also reveals that some mare basalts have slightly lower δ34S values than the population mean, which is consistent with sulfur loss from a reduced basaltic melt prior to eruption at the lunar surface. Both the sulfur isotope homogeneity of the lunar mare basalts and the predicted sensitivity of sulfur isotopes to vaporization-driven fractionation suggest that less than ≈1-10% of lunar sulfur was lost after a potential moon-forming impact event.
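The kind of summary statistic quoted above (a mean with a confidence interval across many samples) can be sketched as follows, using synthetic values rather than the actual measurements:

```python
import numpy as np
from statistics import NormalDist

# Synthetic sketch (hypothetical values, not the actual lunar data):
# summarize a set of delta-34S measurements by their mean and a 99%
# confidence interval, the kind of summary used to argue for homogeneity.
rng = np.random.default_rng(1)
d34s = rng.normal(loc=0.58, scale=0.1, size=25)   # per-sample values, permil

mean = d34s.mean()
sem = d34s.std(ddof=1) / np.sqrt(d34s.size)       # standard error of the mean
z99 = NormalDist().inv_cdf(0.995)                 # two-sided 99% quantile
ci = (mean - z99 * sem, mean + z99 * sem)

print(mean, ci)  # a narrow interval indicates a homogeneous composition
```

A sample lying well outside such an interval, like the ³³S-enriched sample 12022, 281, is then flagged as anomalous rather than folded into the population mean.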

  4. The Chemical Homogeneity of Open Clusters

    NASA Astrophysics Data System (ADS)

    Bovy, Jo

    2016-01-01

    Determining the level of chemical homogeneity in open clusters is of fundamental importance in the study of the evolution of star-forming clouds and that of the Galactic disk. Yet limiting the initial abundance spread in clusters has been hampered by difficulties in obtaining consistent spectroscopic abundances for different stellar types. Without reference to any specific model of stellar photospheres, a model for a homogeneous cluster is that it forms a one-dimensional sequence, with any differences between members due to variations in stellar mass and observational uncertainties. I present a novel method for investigating the abundance spread in open clusters that tests this one-dimensional hypothesis at the level of observed stellar spectra, rather than constraining homogeneity using derived abundances as is traditionally done. Using high-resolution APOGEE spectra for 49 giants in M67, NGC 6819, and NGC 2420 I demonstrate that these spectra form one-dimensional sequences for each cluster. With detailed forward modeling of the spectra and Approximate Bayesian Computation, I derive strong limits on the initial abundance spread of 15 elements: <0.01 (0.02) dex for C and Fe, ≲0.015 (0.03) dex for N, O, Mg, Si, and Ni, ≲0.02 (0.03) dex for Al, Ca, and Mn, and ≲0.03 (0.05) dex for Na, S, K, Ti, and V (at 68% and 95% confidence, respectively). The strong limits on C and O imply that no pollution by massive core-collapse supernovae occurred during star formation in open clusters, which, thus, need to form within ≲6 Myr. Further development of this and related techniques will bring the power of differential abundances to stars other than solar twins in large spectroscopic surveys and will help unravel the history of star formation and chemical enrichment in the Milky Way through chemical tagging.

  5. Isotropic homogeneous universe with viscous fluid

    SciTech Connect

    Santos, N.O.; Dias, R.S.; Banerjee, A.

    1985-04-01

    Exact solutions are obtained for the isotropic homogeneous cosmological model with viscous fluid. The fluid has only bulk viscosity and the viscosity coefficient is taken to be a power function of the mass density. The equation of state assumed obeys a linear relation between mass density and pressure. The models satisfying Hawking's energy conditions are discussed. Murphy's model is only a special case of this general set of solutions and it is shown that Murphy's conclusion that the introduction of bulk viscosity can avoid the occurrence of a space-time singularity at finite past is not, in general, valid.

  6. Homogenization and Numerical Methods for Hyperbolic Equations

    NASA Astrophysics Data System (ADS)

    Liu, Jian-Guo

    1990-01-01

    This dissertation studies three aspects of analysis and numerical methods for partial differential equations with oscillatory solutions. 1. Homogenization theory for certain linear hyperbolic equations is developed. We derive the homogenized convection equations for linear convection problems with rapidly varying velocity in space and time. We find that the oscillatory solutions are very sensitive to the arithmetic properties of certain parameters, such as the corresponding rotation number and the ratio between the components of the mean velocity field in linear convection. We also show that the oscillatory velocity field in two dimensional incompressible flow behaves like shear flows. 2. The homogenization of scalar nonlinear conservation laws in several space variables with oscillatory initial data is also discussed. We prove that the initial oscillations will be eliminated for any positive time when the equations are non-degenerate. This is also true for degenerate equations if there is enough mixing among the initial oscillations in the degenerate direction. Otherwise, the initial oscillation, for which the homogenized equation is obtained, will survive and be propagated. The large-time behavior of conservation laws with several space variables is studied. We show that, under a new nondegenerate condition (the second derivatives of the flux functions are linearly independent in any interval), a piecewise smooth periodic solution will converge strongly to the mean value of the initial data. This generalizes Glimm and Lax's result for the one dimensional problem (3). 3. Numerical simulations of the oscillatory solutions are also carried out. We give some error estimates for ε-h resonance (ε: oscillation wavelength, h: numerical step) and prove essential convergence (24) of order α < 1 for some numerical schemes. These include upwind schemes and particle methods for linear hyperbolic equations with oscillatory coefficients. 

  7. Homogeneous sphere packings with triclinic symmetry.

    PubMed

    Fischer, W; Koch, E

    2002-11-01

    All homogeneous sphere packings with triclinic symmetry have been derived by studying the characteristic Wyckoff positions P -1 1a and P -1 2i of the two triclinic lattice complexes. These sphere packings belong to 30 different types. Only one type exists that has exclusively triclinic sphere packings and no higher-symmetry ones. The inherent symmetry of part of the sphere packings is triclinic for 18 types. Sphere packings of all but six of the 30 types may be realized as stackings of parallel planar nets.

  8. Heterogeneity versus homogeneity of multiple sclerosis

    PubMed Central

    Sato, Fumitaka; Martinez, Nicholas E; Omura, Seiichi; Tsunoda, Ikuo

    2011-01-01

    The 10th International Congress of Neuroimmunology, including the 10th European School of Neuroimmunology Course, was held by the International Society of Neuroimmunology in Sitges (Barcelona, Spain) on 26–30 October 2010. The conference covered a wide spectrum of issues and challenges in both basic science and clinical aspects of neuroimmunology. Data and ideas were shared through a variety of programs, including review talks and poster sessions. One of the topics of the congress was whether multiple sclerosis is a homogeneous or heterogeneous disease, clinically and pathologically, throughout its course. PMID:21426254

  9. Compressible homogeneous shear: Simulation and modeling

    NASA Technical Reports Server (NTRS)

    Sarkar, S.; Erlebacher, G.; Hussaini, M. Y.

    1992-01-01

    Compressibility effects were studied on turbulence by direct numerical simulation of homogeneous shear flow. A primary observation is that the growth of the turbulent kinetic energy decreases with increasing turbulent Mach number. The sinks provided by compressible dissipation and the pressure dilatation, along with reduced Reynolds shear stress, are shown to contribute to the reduced growth of kinetic energy. Models are proposed for these dilatational terms and verified by direct comparison with the simulations. The differences between the incompressible and compressible fields are brought out by the examination of spectra, statistical moments, and structure of the rate of strain tensor.

  10. Bio-inspired homogeneous multi-scale place recognition.

    PubMed

    Chen, Zetao; Lowry, Stephanie; Jacobson, Adam; Hasselmo, Michael E; Milford, Michael

    2015-12-01

    Robotic mapping and localization systems typically operate at either one fixed spatial scale, or over two, combining a local metric map and a global topological map. In contrast, recent high profile discoveries in neuroscience have indicated that animals such as rodents navigate the world using multiple parallel maps, with each map encoding the world at a specific spatial scale. While a number of theoretical-only investigations have hypothesized several possible benefits of such a multi-scale mapping system, no one has comprehensively investigated the potential mapping and place recognition performance benefits for navigating robots in large real world environments, especially using more than two homogeneous map scales. In this paper we present a biologically-inspired multi-scale mapping system mimicking the rodent multi-scale map. Unlike hybrid metric-topological multi-scale robot mapping systems, this new system is homogeneous, distinguishable only by scale, like rodent neural maps. We present methods for training each network to learn and recognize places at a specific spatial scale, and techniques for combining the output from each of these parallel networks. This approach differs from traditional probabilistic robotic methods, where place recognition spatial specificity is passively driven by models of sensor uncertainty. Instead we intentionally create parallel learning systems that learn associations between sensory input and the environment at different spatial scales. We also conduct a systematic series of experiments and parameter studies that determine the effect on performance of using different neural map scaling ratios and different numbers of discrete map scales. The results demonstrate that a multi-scale approach universally improves place recognition performance and is capable of producing better than state of the art performance compared to existing robotic navigation algorithms. 
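The parallel maps described above must ultimately be fused into one place decision. As a hedged illustration only (this is not the authors' neural implementation; the scores and the multiplicative fusion rule are assumptions for the sketch), normalized match scores from each scale can be combined so that a winning place needs support at every scale:

```python
def combine_scales(scores_by_scale):
    """Fuse per-place match scores from parallel maps at different
    spatial scales: normalize each scale's scores, then multiply,
    so a place must be supported at every scale to score highly."""
    n = len(scores_by_scale[0])
    combined = [1.0] * n
    for scores in scores_by_scale:
        total = sum(scores) or 1.0  # guard against an all-zero scale
        for i, s in enumerate(scores):
            combined[i] *= s / total
    return combined

# hypothetical match scores for three candidate places at two scales
fine = [0.1, 0.7, 0.2]
coarse = [0.3, 0.6, 0.1]
best = max(range(3), key=lambda i: combine_scales([fine, coarse])[i])
# place 1 wins because it is supported at both scales
```

A product is the strictest choice; a weighted sum would instead let one very confident scale override disagreement at the others.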

  11. Si isotope homogeneity of the solar nebula

    SciTech Connect

    Pringle, Emily A.; Savage, Paul S.; Moynier, Frédéric; Jackson, Matthew G.; Barrat, Jean-Alix

    2013-12-20

    The presence or absence of variations in the mass-independent abundances of Si isotopes in bulk meteorites provides important clues concerning the evolution of the early solar system. No Si isotopic anomalies have been found within the level of analytical precision of 15 ppm in ²⁹Si/²⁸Si across a wide range of inner solar system materials, including terrestrial basalts, chondrites, and achondrites. A possible exception is the angrites, which may exhibit small excesses of ²⁹Si. However, the general absence of anomalies suggests that primitive meteorites and differentiated planetesimals formed in a reservoir that was isotopically homogeneous with respect to Si. Furthermore, the lack of resolvable anomalies in the calcium-aluminum-rich inclusion measured here suggests that any nucleosynthetic anomalies in Si isotopes were erased through mixing in the solar nebula prior to the formation of refractory solids. The homogeneity exhibited by Si isotopes may have implications for the distribution of Mg isotopes in the solar nebula. Based on supernova nucleosynthetic yield calculations, the expected magnitude of heavy-isotope overabundance is larger for Si than for Mg, suggesting that any potential Mg heterogeneity, if present, exists below the 15 ppm level.

  12. On shearing fluids with homogeneous densities

    NASA Astrophysics Data System (ADS)

    Srivastava, D. C.; Srivastava, V. C.; Kumar, Rajesh

    2016-06-01

    In this paper, we study shearing spherically symmetric homogeneous-density fluids in comoving coordinates. It is found that the expansion of the four-velocity of a perfect fluid is homogeneous, whereas its shear is generated by an arbitrary function of time M(t), related to the mass function of the distribution. This function is found to bear a functional relationship with the density. The field equations are reduced to two coupled first-order ordinary differential equations for the metric coefficients g_{11} and g_{22}. We have explored a class of solutions assuming that M is a linear function of the density. This class embodies, as a subcase, the complete class of shear-free solutions. We have discussed the oft-quoted work of Kustaanheimo (Comment Phys Math XIII:12, 1, 1947) and have noted that it deals with shear-free fluids having anisotropic pressure. It is shown that the anisotropy of the fluid is characterized by an arbitrary function of time. We have discussed some issues of historical priorities and credentials related to shear-free solutions. Recent controversial claims by Mitra (Astrophys Space Sci 333:351, 2011 and Gravit Cosmol 18:17, 2012) have also been addressed. We found that the singularity and the shearing motion of the fluid are closely related. Hence, there is a need for a fresh look at the solutions obtained earlier in comoving coordinates.

  13. Modified Homogeneous Data Set of Coronal Intensities

    NASA Astrophysics Data System (ADS)

    Dorotovič, I.; Minarovjech, M.; Lorenc, M.; Rybanský, M.

    2014-07-01

    The Astronomical Institute of the Slovak Academy of Sciences has published the intensities, recalibrated with respect to a common intensity scale, of the 530.3 nm (Fe XIV) green coronal line observed at ground-based stations up to the year 2008. The name of this publication is the Homogeneous Data Set (HDS). We have developed a method that allows one to substitute satellite observations for the ground-based observations and, thus, to continue the publication of the HDS. For this purpose, the observations of the Extreme-ultraviolet Imaging Telescope (EIT), onboard the Solar and Heliospheric Observatory (SOHO) satellite, were exploited. Among other data, the EIT instrument provides almost daily 28.4 nm (Fe XV) emission-line snapshots of the corona. The Fe XIV and Fe XV data (4051 observation days) taken in the period 1996 - 2008 have been compared and good agreement was found. The method to obtain the individual data for the HDS follows from the correlation analysis described in this article. The resulting data, now under the name of the Modified Homogeneous Data Set (MHDS), are identical up to 1996 to those in the HDS. The MHDS can be used further for studies of coronal solar activity and its cycle. These data are available at http://www.suh.sk.
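The correlation analysis that maps the Fe XV satellite series onto the Fe XIV ground-based scale is not reproduced in the abstract. As a generic sketch only (hypothetical numbers; ordinary least squares is assumed, not necessarily the authors' exact procedure), substituting one well-correlated series for another amounts to fitting a calibration on the overlap period and applying it afterwards:

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y ≈ a*x + b, a stand-in for the
    correlation analysis that maps one intensity series onto another."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# hypothetical overlapping-period samples: (satellite proxy, ground intensity)
fe15 = [1.0, 2.0, 3.0, 4.0]
fe14 = [2.1, 3.9, 6.1, 7.9]
a, b = linear_fit(fe15, fe14)
substituted = [a * v + b for v in fe15]  # proxy mapped onto the ground scale
```

Once fitted on the 1996 - 2008 overlap, the same (a, b) can convert later satellite-only observations onto the common scale.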

  14. Microstructure and homogeneity of dental porcelain frits.

    PubMed

    Ban, S; Matsuo, K; Mizutani, N; Iwase, H; Kani, T; Hasegawa, J

    1998-12-01

    The microstructure and homogeneity of three commercial dentin and incisal unfired porcelain frits (one conventional and two ultra-low fusing types, fused to metal) were analyzed by X-ray diffractometry, scanning electron microscopy, and wavelength- and energy-dispersive X-ray microspectroscopy. The average contents of tetragonal and cubic leucite for the conventional frits and one of the ultra-low fusing type frits were 20.1-22.6 wt% and 0-2.6 wt%, respectively, whereas those of the other ultra-low fusing type frits were about 11.5-11.6 wt% and 2.9-4.6 wt%, respectively. The conventional type frits seemed to be admixtures of three kinds of glass frits. One of the ultra-low fusing type frits seemed to be an admixture of four kinds of glass frits. The other ultra-low fusing frit seemed to be a single kind of glass frit dispersed with small leucite crystals, less than 1 micron in size. There were no remarkable differences in microstructure and homogeneity between dentin and incisal porcelain frits within each brand.

  15. Emergence of Leadership within a Homogeneous Group

    PubMed Central

    Eskridge, Brent E.; Valle, Elizabeth; Schlupp, Ingo

    2015-01-01

    Large-scale coordination without dominant, consistent leadership is frequent in nature. How individuals emerge from within the group as leaders, however transitory this position may be, has become an increasingly common question. This question is further complicated by the fact that in many of these aggregations, differences between individuals are minor and the group is largely considered to be homogeneous. In the simulations presented here, we investigate the emergence of leadership in the extreme situation in which all individuals are initially identical. Using a mathematical model inspired by observations of natural systems, we show that adding a simple concept of leadership tendency, one that is affected by experience, can produce distinct leaders and followers through a nonlinear feedback loop. Most importantly, our results show that small differences in experience can promote the rapid emergence of stable roles for leaders and followers. Our findings have implications for our understanding of adaptive behaviors in initially homogeneous groups, the role experience can play in shaping leadership tendencies, and the use of self-assessment in adapting behavior and, ultimately, self-role-assignment. PMID:26226381
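The feedback mechanism described, initially identical individuals differentiating through experience, can be sketched in a few lines. This is a minimal toy model in the spirit of the abstract, not the authors' published model; the interaction rule, update step `delta`, and parameter values are assumptions:

```python
import random

def emerge_roles(n=20, rounds=5000, delta=0.05, seed=1):
    """Agents start with identical leadership tendencies; in each
    pairwise interaction the agent with the higher tendency leads
    (ties broken at random), and experience feeds back: leading
    raises the tendency, following lowers it."""
    rng = random.Random(seed)
    w = [0.5] * n  # initially identical individuals
    for _ in range(rounds):
        i, j = rng.sample(range(n), 2)
        if (w[i], rng.random()) > (w[j], rng.random()):
            leader, follower = i, j
        else:
            leader, follower = j, i
        w[leader] = min(1.0, w[leader] + delta)    # positive feedback
        w[follower] = max(0.0, w[follower] - delta)
    return w

w = emerge_roles()
# tendencies polarize: stable leaders near 1, followers near 0
```

The nonlinearity is in the winner-take-more rule: once a small random asymmetry appears, it is amplified rather than averaged away.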

  16. The Statistical Mechanics of Ideal Homogeneous Turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    2002-01-01

    Plasmas, such as those found in the space environment or in plasma confinement devices, are often modeled as electrically conducting fluids. When fluids and plasmas are energetically stirred, regions of highly nonlinear, chaotic behavior known as turbulence arise. Understanding the fundamental nature of turbulence is a long-standing theoretical challenge. The present work describes a statistical theory concerning a certain class of nonlinear, finite dimensional, dynamical models of turbulence. These models arise when the partial differential equations describing incompressible, ideal (i.e., nondissipative) homogeneous fluid and magnetofluid (i.e., plasma) turbulence are Fourier transformed into a very large set of ordinary differential equations. These equations define a divergenceless flow in a high-dimensional phase space, which allows for the existence of a Liouville theorem, guaranteeing a distribution function based on constants of the motion (integral invariants). The novelty of these particular dynamical systems is that there are integral invariants other than the energy, and that some of these invariants behave like pseudoscalars under two of the discrete symmetry transformations of physics: parity and charge conjugation. In this work the 'rugged invariants' of ideal homogeneous turbulence are shown to be the only significant scalar and pseudoscalar invariants. The discovery that pseudoscalar invariants cause symmetries of the original equations to be dynamically broken and induce a nonergodic structure on the associated phase space is the primary result presented here. Applicability of this result to dissipative turbulence is also discussed.

  17. Planar factors of proper homogeneous Lorentz transformations

    SciTech Connect

    Fahnline, D.E.

    1985-02-01

    This article discusses two constructions factoring proper homogeneous Lorentz transformations H into the product of two planar transformations. A planar transformation is a proper homogeneous Lorentz transformation changing vectors in a two-flat through the origin, called the transformation two-flat, into new vectors in the same two-flat and which leaves unchanged vectors in the orthogonal two-flat, called the pointwise invariant two-flat. The first construction provides two planar factors such that a given timelike vector lies in the transformation two-flat of one and in the pointwise invariant two-flat of the other; it leads to several basic conditions on the trace of H and to necessary and sufficient conditions for H to be planar. The second construction yields explicit formulas for the orthogonal factors of H when they exist and are unique, where two planar transformations are orthogonal if the transformation two-flat of one is the pointwise invariant two-flat of the other.

  18. Primary Healthcare Solo Practices: Homogeneous or Heterogeneous?

    PubMed Central

    Beaulieu, Marie-Dominique; Boivin, Antoine; Prud'homme, Alexandre

    2014-01-01

    Introduction. Solo practices have generally been viewed as forming a homogeneous group. However, they may differ on many characteristics. The objective of this paper is to identify different forms of solo practice and to determine the extent to which they are associated with patient experience of care. Methods. Two surveys were carried out in two regions of Quebec in 2010: a telephone survey of 9180 respondents from the general population and a postal survey of 606 primary healthcare (PHC) practices. Data from the two surveys were linked through the respondent's usual source of care. A taxonomy of solo practices was constructed (n = 213), using cluster analysis techniques. Bivariate and multilevel analyses were used to determine the relationship of the taxonomy with patient experience of care. Results. Four models were derived from the taxonomy. Practices in the “resourceful networked” model contrast with those of the “resourceless isolated” model to the extent that the experience of care reported by their patients is more favorable. Conclusion. Solo practice is not a homogeneous group. The four models identified have different organizational features and their patients' experience of care also differs. Some models seem to offer a better organizational potential in the context of current reforms. PMID:24523964
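The taxonomy above was derived with cluster analysis techniques, but the abstract does not give the variables or the algorithm. A minimal sketch, assuming plain k-means on two hypothetical practice features (a resource score and a network-ties score; both the features and the data are invented for illustration), shows how practices would separate into "resourceful networked" versus "resourceless isolated" groups:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: alternate assigning points to the nearest
    center and recomputing each center as its cluster mean."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        centers = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centers[c]
                   for c, cl in enumerate(clusters)]
    return centers, clusters

# hypothetical practice features: (resource score, network ties)
practices = [(0.9, 0.8), (0.85, 0.9), (0.1, 0.2), (0.15, 0.1)]
centers, clusters = kmeans(practices, 2)
# the two resource-rich, networked practices fall in one cluster
```

A real analysis would cluster many more practices on many more organizational variables and validate the number of clusters, but the mechanics are the same.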

  19. Population dynamics in non-homogeneous environments

    NASA Astrophysics Data System (ADS)

    Alards, Kim M. J.; Tesser, Francesca; Toschi, Federico

    2014-11-01

    For organisms living in aquatic ecosystems, the presence of fluid transport can have a strong influence on the dynamics of populations and on the evolution of species. In particular, displacements due to self-propulsion, summed with turbulent dispersion at larger scales, strongly influence local densities and thus population and genetic dynamics. Real marine environments are furthermore characterized by a high degree of non-homogeneity. In the case of population fronts propagating in "fast" turbulence, with respect to the population duplication time, the flow effect can be studied by replacing the microscopic diffusivity with an effective turbulent diffusivity. In the opposite case of "slow" turbulence, the advection by the flow has to be considered locally. Here we employ numerical simulations to study the influence of non-homogeneities in the diffusion coefficient of reacting individuals of different species expanding in a two-dimensional space. Moreover, to explore the influence of advection, we consider a population expanding in the presence of simple velocity fields such as cellular flows. The output is analyzed in terms of front roughness, front shape, and propagation speed and, concerning the genetics, by means of heterozygosity and local and global extinction probabilities.
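A minimal sketch of the kind of simulation described, reduced to one dimension and a single species (the study itself is two-dimensional and multi-species, and its numerics are not specified here): a Fisher-KPP population front propagating through a patchy, non-homogeneous diffusivity D(x):

```python
def fkpp_front(nx=200, nt=500, dx=1.0, dt=0.1, r=0.5):
    """Explicit finite-difference sketch of a Fisher-KPP front,
    u_t = (D(x) u_x)_x + r u (1 - u), with a patchy, non-homogeneous
    diffusivity D(x). No-flux boundaries; the front starts at the left."""
    D = [0.5 + 0.4 * ((i // 20) % 2) for i in range(nx)]  # alternating patches
    u = [1.0 if i < 10 else 0.0 for i in range(nx)]
    for _ in range(nt):
        un = u[:]
        for i in range(1, nx - 1):
            # flux form keeps D(x) inside the spatial derivative
            flux = (D[i] * (u[i + 1] - u[i]) - D[i - 1] * (u[i] - u[i - 1])) / dx ** 2
            un[i] = u[i] + dt * (flux + r * u[i] * (1.0 - u[i]))
        un[0], un[-1] = un[1], un[-2]  # no-flux boundaries
        u = un
    return u

u = fkpp_front()
# the front advances at a speed set locally by 2*sqrt(r*D(x))
```

Tracking the front position over time in each patch exposes the speed modulation that a homogeneous effective diffusivity would average away.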

  20. HYPERLEDA. II. The homogenized HI data

    NASA Astrophysics Data System (ADS)

    Paturel, G.; Theureau, G.; Bottinelli, L.; Gouguenheim, L.; Coudreau-Durand, N.; Hallet, N.; Petit, C.

    2003-12-01

    After a compilation of HI data from 611 references and new observations made in Nançay, we produce a catalog of homogenized HI data for 16781 galaxies. The homogenization is made using the EPIDEMIC method, by which all data are progressively converted into the adopted standard. The result is a catalog giving: 1) the logarithm of twice the maximum rotation velocity, log 2V_M sin i, converted to the system of Mathewson et al. (1996); this quantity is given without correction for inclination; 2) the HI magnitude, m21 (the area of the 21-cm line expressed in magnitude), converted to the flux system of Theureau et al. (1998); 3) the HI velocity, V_HI, expressed with the optical definition (i.e., using wavelengths instead of frequencies). The typical uncertainties are 0.04 for log 2V_M sin i, 0.25 mag for m21, and 9 km s-1 for V_HI. The full tables and profile figures are available in electronic form at http://www.edpsciences.org and at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsweb.u-strasbg.fr/cgi-bin/qcat?J/A+A/412/57

  1. Homogenization in micro-magneto-mechanics

    NASA Astrophysics Data System (ADS)

    Sridhar, A.; Keip, M.-A.; Miehe, C.

    2016-07-01

    Ferromagnetic materials are characterized by a heterogeneous micro-structure that can be altered by external magnetic and mechanical stimuli. The understanding and the description of the micro-structure evolution is of particular importance for the design and the analysis of smart materials with magneto-mechanical coupling. The macroscopic response of the material results from complex magneto-mechanical interactions occurring on smaller length scales, which are driven by magnetization reorientation and associated magnetic domain wall motions. The aim of this work is to directly base the description of the macroscopic magneto-mechanical material behavior on the micro-magnetic domain evolution. This will be realized by the incorporation of a ferromagnetic phase-field formulation into a macroscopic Boltzmann continuum by the use of computational homogenization. The transition conditions between the two scales are obtained via rigorous exploitation of rate-type and incremental variational principles, which incorporate an extended version of the classical Hill-Mandel macro-homogeneity condition covering the phase field on the micro-scale. An efficient two-scale computational scenario is developed based on an operator splitting scheme that includes a predictor for the magnetization on the micro-scale. Two- and three-dimensional numerical simulations demonstrate the performance of the method. They investigate micro-magnetic domain evolution driven by macroscopic fields as well as the associated overall hysteretic response of ferromagnetic solids.
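For reference, the classical Hill-Mandel macro-homogeneity condition that the abstract extends can be stated in its standard mechanical form (without the phase-field terms of the extended version) as the requirement that the volume average of the microscopic stress power equal the macroscopic stress power:

```latex
\langle \boldsymbol{\sigma} : \dot{\boldsymbol{\varepsilon}} \rangle
  = \overline{\boldsymbol{\sigma}} : \dot{\overline{\boldsymbol{\varepsilon}}},
\qquad
\langle\,\cdot\,\rangle \equiv \frac{1}{|\mathcal{V}|} \int_{\mathcal{V}} (\,\cdot\,)\, \mathrm{d}V ,
```

where the overbars denote macroscopic quantities and the average is taken over the representative volume element. Boundary conditions on the micro-scale problem (linear displacements, uniform tractions, or periodicity) are chosen precisely so that this identity holds.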

  2. Creating a culture of mutual respect.

    PubMed

    Kaplan, Kathryn; Mestel, Pamela; Feldman, David L

    2010-04-01

    The Joint Commission mandates that hospitals seeking accreditation have a process to define and address disruptive behavior. Leaders at Maimonides Medical Center, Brooklyn, New York, took the initiative to create a code of mutual respect that not only requires respectful behavior, but also encourages sensitivity and awareness to the causes of frustration that often lead to inappropriate behavior. Steps to implementing the code included selecting code advocates, setting up a system for mediating disputes, tracking and addressing operational system issues, providing training for personnel, developing a formal accountability process, and measuring the results.

  3. Creating Time for Equity Together

    ERIC Educational Resources Information Center

    Renée, Michelle

    2015-01-01

    In urban communities across the nation, a broad range of partners have committed to reinventing educational time together to ensure equitable access to rich learning opportunities for all young people. Education partners are using their creativity, commitment, and unique resources to create new school and system designs that…

  4. Creating a Global Perspective Campus

    ERIC Educational Resources Information Center

    Braskamp, Larry A.

    2011-01-01

    The author has written this Guidebook to assist users interested in creating a campus that will be more global in its mission, programs, and people. His approach is to focus on the views and contributions of the people who are engaged in higher education. Thus it has a "person" emphasis rather than a structural or policy point of view. The author…

  5. Creating a Positive Work Environment

    ERIC Educational Resources Information Center

    Anderson, Susan

    2010-01-01

    The author believes happy staff make for happy classrooms and happy classrooms make for happy children. However, with all the pressures facing child care programs--from the economy to state requirements--creating and maintaining a positive work environment becomes tougher and tougher. In this article, the author discusses the importance of…

  6. Can Children Really Create Knowledge?

    ERIC Educational Resources Information Center

    Bereiter, Carl; Scardamalia, Marlene

    2010-01-01

    Can children genuinely create new knowledge, as opposed to merely carrying out activities that resemble those of mature scientists and innovators? The answer is yes, provided the comparison is not to works of genius but to standards that prevail in ordinary research communities. One important product of knowledge creation is concepts and tools…

  7. Creating Adult Basic Education Programs.

    ERIC Educational Resources Information Center

    Harris, Dolores M.

    Adult basic education programs must teach the "social living skills" disadvantaged adults need, as well as basic literacy skills. In creating an ABE program, one must first assess the needs of the target population--through surveys, group meetings, an advisory council of members of the target population, demographic studies, and consideration of…

  8. Creating a New Professional Association

    ERIC Educational Resources Information Center

    Journal of College Reading and Learning, 2009

    2009-01-01

    This position paper investigates the merits and potential benefits of creating a new, more comprehensive professional association for members of the learning assistance and developmental education profession. This was the task assigned to the College Reading and Learning Association/National Association for Developmental Education (CRLA/NADE)…

  9. Creating Three-Dimensional Scenes

    ERIC Educational Resources Information Center

    Krumpe, Norm

    2005-01-01

    Persistence of Vision Raytracer (POV-Ray), a free computer program for creating photo-realistic, three-dimensional scenes and a link for Mathematica users interested in generating POV-Ray files from within Mathematica, is discussed. POV-Ray has great potential in secondary mathematics classrooms and helps in strengthening students' visualization…

  10. Creating Presentations on ICT Classes

    ERIC Educational Resources Information Center

    Marchis, Iuliana

    2010-01-01

    The article focuses on the creation of presentations in ICT classes. The first part highlights the most important steps in creating a presentation. The main idea is that a computer presentation shouldn't consist only of the technological part, i.e., the editing of the presentation in a computer program. There are many steps before and after…

  11. Creating Frameworks for Reflective Teaching

    ERIC Educational Resources Information Center

    Carter, Margie

    2007-01-01

    The task of creating organizational policies and systems that promote and support reflective teaching is multifaceted and seldom enumerated in early childhood professional literature. One of the best overviews the author has found comes from Carol Brunson Phillips and Sue Bredekamp (1998). The author opines that if the early childhood profession…

  12. Creating Highlander Wherever You Are

    ERIC Educational Resources Information Center

    Williams, Susan; Mullett, Cathy

    2016-01-01

    Highlander Research and Education Center serves as a catalyst for grassroots organizing and movement building. This article focuses on an interview with education coordinator Susan Williams who has worked at Highlander for 26 years. We discuss how others can and do create powerful popular education experiences anywhere, whether they have a…

  13. Creating Space for Children's Literature

    ERIC Educational Resources Information Center

    Serafini, Frank

    2011-01-01

    As teachers struggle to balance the needs of their students with the requirements of commercial reading materials, educators need to consider how teachers will create space for children's literature in today's classrooms. In this article, 10 practical recommendations for incorporating children's literature in the reading instructional framework…

  14. Creating Great Overheads with Computers.

    ERIC Educational Resources Information Center

    Gribas, Cyndy; And Others

    1996-01-01

    Steps in preparing effective overhead projector transparencies for college instruction are outlined, using the PowerPoint program for Windows. They include thinking analogically in translating from concept to visual form; using the features of the presentation program to create a polished product; and assuring readability (visibility, typeface…

  15. Creating an Innovative Learning Organization

    ERIC Educational Resources Information Center

    Salisbury, Mark

    2010-01-01

    This article describes how to create an innovative learning (iLearning) organization. It begins by discussing the life cycle of knowledge in an organization, followed by a description of the theoretical foundation for iLearning. Next, the article presents an example of iLearning, followed by a description of the distributed nature of work, the…

  16. Creating Valuable Class Web Sites

    ERIC Educational Resources Information Center

    Baker, Elizabeth A.

    2008-01-01

    Even those teachers with the best intentions of taking advantage of the Internet to support learning may have obstacles before them. In researching the problem, the author has heard their complaints and understands some of the difficulties. However, creating a classroom Web site is not as difficult as one might think. In this article, the author…

  17. Absorbing metasurface created by diffractionless disordered arrays of nanoantennas

    SciTech Connect

    Chevalier, Paul; Bouchon, Patrick; Jaeck, Julien; Lauwick, Diane; Kattnig, Alain; Bardou, Nathalie; Pardo, Fabrice; Haïdar, Riad

    2015-12-21

    We study disordered arrays of metal-insulator-metal nanoantennas in order to create a diffractionless metasurface able to absorb light in the 3–5 μm spectral range. This study is conducted with angle-resolved reflectivity measurements obtained with a Fourier transform infrared spectrometer. A first design is based on a perturbation of a periodic arrangement, leading to a significant reduction of the radiative losses. Then, a random assembly of nanoantennas is built following a Poisson-disk distribution of given density, in order to obtain a nearly perfect cluttered assembly with the optical properties of a homogeneous material.
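A Poisson-disk distribution of the kind mentioned can be generated, in its simplest form, by dart throwing: propose uniform random sites and keep only those at least a minimum distance from every accepted site. This is a generic sketch, not the paper's actual generator; the domain size, spacing, and dart count are arbitrary illustrative values:

```python
import random

def poisson_disk(width, height, r, darts=10000, seed=0):
    """Naive dart-throwing Poisson-disk sampler: propose uniform
    random sites and accept one only if it lies at least r from
    every previously accepted site."""
    rng = random.Random(seed)
    sites = []
    for _ in range(darts):
        p = (rng.uniform(0, width), rng.uniform(0, height))
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= r * r for q in sites):
            sites.append(p)
    return sites

sites = poisson_disk(10.0, 10.0, 1.0)
# a disordered but evenly spread set of antenna positions
```

The minimum spacing suppresses the clumping of a pure Poisson process, which is what gives the array homogeneous far-field properties without a diffraction-producing lattice.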

  18. Can you help create the next generation of Land Surface Air Temperature products?

    NASA Astrophysics Data System (ADS)

    Thorne, Peter; Venema, Victor

    2013-04-01

    The International Surface Temperature Initiative comprises a group of multi-disciplinary researchers constituted in 2010 with the remit of creating a suite of open, transparent Land Surface Air Temperature products suitable for meeting 21st Century science and societal needs and expectations. Since its inception, significant progress has been made in the creation of an improved set of 'raw' Land Surface Air Temperature data holdings (to be released in first version in February 2013), constituting in excess of 30,000 stations, many going back over a century, and towards the creation of a rigorous benchmarking framework. What is now requested is that multiple independent groups take up the challenge of creating global and regional products from the databank and submit their algorithms to the benchmarking framework. Key here is to rigorously assess structural uncertainty: it is not sufficient to assume that, because one group has tackled the problem, it is in any meaningful sense mission accomplished. There undoubtedly exist a myriad of issues in the raw data, and it is of vital importance to see how sensitive data homogenization is to the set of processing choices independent groups will undertake. This uncertainty will almost certainly be larger at the station or regional level, yet as we move into the 21st Century it is these scales that are of increasing import to end users. It is essential that we serve the right data in the right way with the correct caveats. This can only be achieved if a sufficient number of groups take up the challenge of creating new products from the raw databank. This poster will outline progress to date in the creation of the databank and global benchmarks and outline how investigators and groups can now get involved in creating products from the databank and participate in the benchmarking exercise. Further details upon the Initiative and its aims can be found at www.surfacetemperatures.org and http://surfacetemperatures.blogspot.com/

  19. Photonic crystal waveguide created by selective infiltration

    NASA Astrophysics Data System (ADS)

    Casas Bedoya, A.; Domachuk, P.; Grillet, C.; Monat, C.; Mägi, E. C.; Li, E.; Eggleton, B. J.

    2012-06-01

    The marriage of photonics and microfluidics ("optofluidics") uses the inherent mobility of fluids to reversibly tune photonic structures beyond traditional fabrication methods by infiltrating voids in those structures. Photonic crystals (PhCs) strongly control light on the wavelength scale and are well suited to optofluidic tuning because their periodic air-hole microstructure is a natural candidate for housing liquids. The infiltration of a single row of holes in the PhC matrix modifies the effective refractive index, allowing optical modes to be guided by the PhC bandgap. In this work we present the first experimental demonstration of a reconfigurable single-mode W1 photonic crystal defect waveguide created by selective liquid infiltration. We modified a hexagonal silicon planar photonic crystal membrane by selectively filling a single row of air holes with ~300 nm resolution, using a high-refractive-index ionic liquid. The modification creates optical confinement in the infiltrated region and allows propagation of a single optical waveguide mode. We describe the challenges arising from the infiltration process and the liquid/solid surface interaction in the photonic crystal. We include a detailed comparison between analytic and numerical modeling and experimental results, and introduce a new approach to create an offset photonic crystal cavity by varying the nature of the selective infiltration process.

  20. Homogeneous catalyst formulations for methanol production

    DOEpatents

    Mahajan, Devinder; Sapienza, Richard S.; Slegeir, William A.; O'Hare, Thomas E.

    1990-01-01

    There is disclosed synthesis of CH3OH from carbon monoxide and hydrogen using an extremely active homogeneous catalyst for methanol synthesis directly from synthesis gas. The catalyst operates preferably between 100-150 °C and preferably at 100-150 psia synthesis gas to produce methanol. Use can be made of syngas mixtures which contain considerable quantities of other gases, such as nitrogen, methane or excess hydrogen. The catalyst is composed of two components: (a) a transition metal carbonyl complex and (b) an alkoxide component. In the simplest formulation, component (a) is a complex of nickel tetracarbonyl and component (b) is methoxide (CH3O-), both being dissolved in a methanol solvent system. The presence of a co-solvent such as p-dioxane, THF, polyalcohols, ethers, hydrocarbons, and crown ethers accelerates the methanol synthesis reaction.

  1. Homogeneous catalyst formulations for methanol production

    DOEpatents

    Mahajan, Devinder; Sapienza, Richard S.; Slegeir, William A.; O'Hare, Thomas E.

    1991-02-12

    There is disclosed synthesis of CH3OH from carbon monoxide and hydrogen using an extremely active homogeneous catalyst for methanol synthesis directly from synthesis gas. The catalyst operates preferably between 100-150 °C and preferably at 100-150 psia synthesis gas to produce methanol. Use can be made of syngas mixtures which contain considerable quantities of other gases, such as nitrogen, methane or excess hydrogen. The catalyst is composed of two components: (a) a transition metal carbonyl complex and (b) an alkoxide component. In the simplest formulation, component (a) is a complex of nickel tetracarbonyl and component (b) is methoxide (CH3O-), both being dissolved in a methanol solvent system. The presence of a co-solvent such as p-dioxane, THF, polyalcohols, ethers, hydrocarbons, and crown ethers accelerates the methanol synthesis reaction.

  2. Instability of Homogeneous State in Magnetic Semiconductors

    NASA Astrophysics Data System (ADS)

    Sinkkonen, J.; Kuivalainen, P.; Stubb, T.

    1982-06-01

    The instability of the homogeneous state in a ferromagnetic semiconductor is studied. The electronic part of the free energy is determined using the Thomas-Fermi statistical model, and the magnetic part is calculated by the molecular field approximation including the RKKY interaction. The inhomogeneity consists of a small magnetically polarized region with a high electron density surrounded by a less polarized, positively charged depletion layer. The inhomogeneous state is found to be stable in a relatively broad temperature range around the Curie temperature at low and intermediate doping densities. The stability range shrinks in an applied magnetic field. At fields exceeding about 3 T or at doping densities larger than 10^21 cm^-3 the inhomogeneous state is no longer stable.

  3. A homogeneous survey of red supergiants

    NASA Astrophysics Data System (ADS)

    Marco, Amparo; Dorda, Ricardo; González-Fernández, Carlos; Negueruela, Ignacio

    2015-08-01

    We have carried out a comprehensive, homogeneous spectroscopic and photometric study of a sample of a few hundred red supergiants in the Milky Way, the Large Magellanic Cloud and the Small Magellanic Cloud. Our results show that global trends can be derived for many spectroscopic features independently of metallicity. The intensity of atomic Ti lines is directly correlated with spectral type, suggesting a real change in photospheric temperature. We find that the shape of the spectral energy distribution stops being directly related to surface temperature around mid-K spectral types, and becomes strongly correlated with mass loss. The distribution of spectral types is markedly different for the subset of red supergiants above a given luminosity cut, giving very strong hints of a separate evolutionary phase.

  4. Soliton production with nonlinear homogeneous lines

    DOE PAGES

    Elizondo-Decanini, Juan M.; Coleman, Phillip D.; Moorman, Matthew W.; ...

    2015-11-24

    Low- and high-voltage soliton waves were produced and used to demonstrate collision and compression using diode-based nonlinear transmission lines. Experiments demonstrate soliton addition and compression using homogeneous nonlinear lines. We built the nonlinear lines using commercially available diodes. These diodes are chosen after their capacitance versus voltage dependence is used in a model and the line design characteristics are calculated and simulated. Nonlinear ceramic capacitors are then used to demonstrate high-voltage pulse amplification and compression. The line is designed such that a simple capacitor-discharge input signal develops soliton trains in as few as 12 stages. We also demonstrated output voltages in excess of 40 kV using Y5V-based commercial capacitors. The results show some key features that determine efficient production of trains of solitons in the kilovolt range.

  5. Homogeneously dispersed, multimetal oxygen-evolving catalysts

    SciTech Connect

    Zhang, Bo; Zheng, Xueli; Voznyy, Oleksandr; Comin, Riccardo; Bajdich, Michal; Garcia-Melchor, Max; Han, Lili; Xu, Jixian; Liu, Min; Zheng, Lirong; F. Pelayo Garcia de Arquer; Dinh, Cao Thang; Fan, Fengjia; Yuan, Mingjian; Yassitepe, Emre; Chen, Ning; Regier, Tom; Liu, Pengfei; Li, Yuhang; De Luna, Phil; Janmohamed, Alyf; Xin, Huolin L.; Yang, Huagui; Vojvodic, Aleksandra; Sargent, Edward H.

    2016-03-24

    Earth-abundant first-row (3d) transition-metal-based catalysts have been developed for the oxygen-evolution reaction (OER); however, they operate at overpotentials significantly above thermodynamic requirements. Density functional theory suggested that non-3d high-valency metals such as tungsten can modulate 3d metal oxides, providing near-optimal adsorption energies for OER intermediates. We developed a room-temperature synthesis to produce gelled oxy-hydroxide materials with an atomically homogeneous metal distribution. The gelled FeCoW oxy-hydroxides exhibit the lowest overpotential (191 mV) reported at 10 mA per square centimeter in alkaline electrolyte. The catalyst shows no evidence of degradation following more than 500 hours of operation. X-ray absorption and computational studies reveal a synergistic interplay between W, Fe and Co in producing a favorable local coordination environment and electronic structure that enhances the energetics for OER.

  6. Leith diffusion model for homogeneous anisotropic turbulence

    SciTech Connect

    Rubinstein, Robert; Clark, Timothy T.; Kurien, Susan

    2016-07-19

    Here we propose a spectral closure model for homogeneous anisotropic turbulence. The systematic development begins by closing the third-order correlation describing nonlinear interactions by an anisotropic generalization of the Leith diffusion model for isotropic turbulence. The correlation tensor is then decomposed into a tensorially isotropic part, or directional anisotropy, and a trace-free remainder, or polarization anisotropy. The directional and polarization components are then decomposed using irreducible representations of the SO(3) symmetry group. Under the ansatz that the decomposition is truncated at quadratic order, evolution equations are derived for the directional and polarization pieces of the correlation tensor. Numerical simulations of the model equations for a freely decaying anisotropic flow illustrate the non-trivial effects of spectral dependencies on the different return-to-isotropy rates of the directional and polarization contributions.

  7. An inhomogeneous model universe behaving homogeneously

    NASA Astrophysics Data System (ADS)

    Khosravi, Sh.; Kourkchi, E.; Mansouri, R.; Akrami, Y.

    2008-05-01

    We present a new model universe based on the junction of FRW to flat Lemaitre-Tolman-Bondi (LTB) solutions of the Einstein equations along our past light cone, bringing structures within the FRW models. The model is assumed to be globally homogeneous, i.e. the cosmological principle is valid. Local inhomogeneities within the past light cone are modeled as a flat LTB, whereas those outside the light cone are assumed to be smoothed out and represented by a FRW model. The model is singularity-free, is always FRW far from the observer along the past light cone, leads to a luminosity distance relation different from that of the ΛCDM/FRW models, gives a negative deceleration parameter near the observer, and yields the correct linear and non-linear density contrast. As a whole, the model behaves like a FRW model on the past light cone with a special behavior of the scale factor, Hubble and deceleration parameters, mimicking dark energy.

  8. Leith diffusion model for homogeneous anisotropic turbulence

    DOE PAGES

    Rubinstein, Robert; Clark, Timothy T.; Kurien, Susan

    2016-07-19

    Here we propose a spectral closure model for homogeneous anisotropic turbulence. The systematic development begins by closing the third-order correlation describing nonlinear interactions by an anisotropic generalization of the Leith diffusion model for isotropic turbulence. The correlation tensor is then decomposed into a tensorially isotropic part, or directional anisotropy, and a trace-free remainder, or polarization anisotropy. The directional and polarization components are then decomposed using irreducible representations of the SO(3) symmetry group. Under the ansatz that the decomposition is truncated at quadratic order, evolution equations are derived for the directional and polarization pieces of the correlation tensor. Numerical simulations of the model equations for a freely decaying anisotropic flow illustrate the non-trivial effects of spectral dependencies on the different return-to-isotropy rates of the directional and polarization contributions.

  9. The homogeneity conjecture for supergravity backgrounds

    NASA Astrophysics Data System (ADS)

    Figueroa-O'Farrill, José Miguel

    2009-06-01

    These notes record three lectures given at the workshop "Higher symmetries in Physics", held at the Universidad Complutense de Madrid in November 2008. In them we explain how to construct a Lie (super)algebra associated to a spin manifold, perhaps with extra geometric data, and a notion of privileged spinors. The typical examples are supersymmetric supergravity backgrounds; although there are more classical instances of this construction. We focus on two results: the geometric constructions of compact real forms of the simple Lie algebras of type B4, F4 and E8 from S7, S8 and S15, respectively; and the construction of the Killing superalgebra of eleven-dimensional supergravity backgrounds. As an application of this latter construction we show that supersymmetric supergravity backgrounds with enough supersymmetry are necessarily locally homogeneous.

  10. Soliton production with nonlinear homogeneous lines

    SciTech Connect

    Elizondo-Decanini, Juan M.; Coleman, Phillip D.; Moorman, Matthew W.; Petney, Sharon Joy Victor; Dudley, Evan C.; Youngman, Kevin; Penner, Tim Dwight; Fang, Lu; Myers, Katherine M.

    2015-11-24

    Low- and high-voltage soliton waves were produced and used to demonstrate collision and compression using diode-based nonlinear transmission lines. Experiments demonstrate soliton addition and compression using homogeneous nonlinear lines. We built the nonlinear lines using commercially available diodes. These diodes are chosen after their capacitance versus voltage dependence is used in a model and the line design characteristics are calculated and simulated. Nonlinear ceramic capacitors are then used to demonstrate high-voltage pulse amplification and compression. The line is designed such that a simple capacitor-discharge input signal develops soliton trains in as few as 12 stages. We also demonstrated output voltages in excess of 40 kV using Y5V-based commercial capacitors. The results show some key features that determine efficient production of trains of solitons in the kilovolt range.
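
The behavior described above can be understood by treating the diode-loaded line as an LC ladder whose shunt capacitance falls with voltage, so that higher-amplitude portions of a pulse travel faster and steepen. The sketch below is a minimal illustration of that mechanism, not the authors' design: the 12-stage count comes from the abstract, but the component values, the capacitance-voltage law `c_of_v`, the exponential capacitor-discharge drive, and the resistive termination are all illustrative assumptions.

```python
import math

def c_of_v(v, c0=1.0, v0=1.0):
    """Hypothetical diode-like capacitance that drops with voltage;
    the real line uses measured C(V) curves of commercial diodes."""
    return c0 / (1.0 + max(v, 0.0) / v0)

def simulate(n=12, L=1.0, dt=0.02, steps=900, amp=2.0, tau=3.0, r_load=1.0):
    """Semi-implicit Euler integration of an n-stage nonlinear LC ladder."""
    v = [0.0] * (n + 1)  # node voltages; node 0 is driven by the source
    i = [0.0] * n        # inductor currents between adjacent nodes
    out = []
    for k in range(steps):
        v[0] = amp * math.exp(-k * dt / tau)  # capacitor-discharge drive
        for j in range(n):                    # currents from old voltages
            i[j] += dt * (v[j] - v[j + 1]) / L
        for j in range(1, n):                 # interior nodes
            v[j] += dt * (i[j - 1] - i[j]) / c_of_v(v[j])
        # resistively terminated last node
        v[n] += dt * (i[n - 1] - v[n] / r_load) / c_of_v(v[n])
        out.append(v[n])
    return out

wave = simulate()   # output-node voltage versus time
peak = max(wave)
```

Because the wave speed 1/sqrt(L*C(V)) grows with amplitude in such a ladder, the leading edge of the input sharpens as it propagates; with the stronger nonlinearities of real varactor diodes this steepening is what breaks a discharge pulse into a soliton train.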

  11. Analysis of homogeneous turbulent reacting flows

    NASA Technical Reports Server (NTRS)

    Leonard, A. D.; Hill, J. C.; Mahalingam, S.; Ferziger, J. H.

    1988-01-01

    Full turbulence simulations at low Reynolds numbers were made for the single-step, irreversible, bimolecular reaction between non-premixed reactants in isochoric, decaying homogeneous turbulence. Various initial conditions for the scalar field were used in the simulations to control the initial scalar dissipation length scale, and simulations were also made for temperature-dependent reaction rates and for non-stoichiometric and unequal diffusivity conditions. Joint probability density functions (pdf's), conditional pdf's, and various statistical quantities appearing in the moment equations were computed. Preliminary analysis of the results indicates that compressive strain-rate correlates better than other dynamical quantities with local reaction rate, and the locations of peak reaction rates seem to be insensitive to the scalar field initial conditions.

  12. Leith diffusion model for homogeneous anisotropic turbulence

    NASA Astrophysics Data System (ADS)

    Rubinstein, Robert; Clark, Timothy; Kurien, Susan

    2016-11-01

    A new spectral closure model for homogeneous anisotropic turbulence is proposed. The systematic development begins by closing the third-order correlation describing nonlinear interactions by an anisotropic generalization of the Leith diffusion model for isotropic turbulence. The correlation tensor is then decomposed into a tensorially isotropic part, or directional anisotropy, and a trace-free remainder, or polarization anisotropy. The directional and polarization components are then decomposed using irreducible representations of the SO(3) symmetry group. Under the ansatz that the decomposition is truncated at quadratic order, evolution equations are derived for the directional and polarization pieces of the correlation tensor. Numerical simulations of the model equations for a freely decaying anisotropic flow illustrate the non-trivial effects of spectral dependencies on the different return-to-isotropy rates of the directional and polarization contributions.

  13. Nanodosimetric track structure in homogeneous extended beams.

    PubMed

    Conte, V; Moro, D; Colautti, P; Grosswendt, B

    2015-09-01

    Physical aspects of particle track structure are important in determining the induction of clustered damage in relevant subcellular structures like the DNA and higher-order genomic structures. The direct measurement of track-structure properties of ionising radiation is feasible today by counting the number of ionisations produced inside a small gas volume. In particular, the so-called track-nanodosimeter, installed at the TANDEM-ALPI accelerator complex of LNL, measures ionisation cluster-size distributions in a simulated subcellular structure 20 nm in size, corresponding approximately to the diameter of the chromatin fibre. The target volume is irradiated by pencil beams of primary particles passing at a specified impact parameter. To directly relate these measured track-structure data to radiobiological measurements performed in broad homogeneous particle beams, the pencil-beam data can be integrated over the impact parameter. This procedure was successfully applied to 240 MeV carbon ions and compared with Monte Carlo simulations for extended fields.

  14. Iodothyronine Metabolism in Rat Liver Homogenates

    PubMed Central

    Kaplan, Michael M.; Utiger, Robert D.

    1978-01-01

    To investigate mechanisms of extrathyroidal thyroid hormone metabolism, conversion of thyroxine (T4) to 3,5,3′-triiodothyronine (T3) and degradation of 3,3′,5′-triiodothyronine (rT3) were studied in rat liver homogenates. Both reactions were enzymatic. For conversion of T4 to T3, the Km of T4 was 7.7 μM, and the Vmax was 0.13 pmol T3/min per mg protein. For rT3 degradation, the Km of rT3 was 7.5 nM, and the Vmax was 0.36 pmol rT3/min per mg protein. Production of rT3 or degradation of T4 or T3 was not detected under the conditions employed. rT3 was a potent competitive inhibitor of T4 to T3 conversion with a Ki of 4.5 nM; 3,3′-diiodothyronine was a less potent inhibitor of this reaction. T4 was a competitive inhibitor of rT3 degradation with a Ki of 10.2 μM. Agents which inhibited both reactions included propylthiouracil, which appeared to be an allosteric inhibitor, 2,4-dinitrophenol, and iopanoic acid. Sodium diatrizoate had a weak inhibitory effect. No inhibition was found with α-methylparatyrosine, Fe2+, Fe3+, reduced glutathione, β-hydroxybutyrate, or oleic acid. Fasting resulted in inhibition of T4 to T3 conversion and of rT3 degradation by rat liver homogenates which was reversible after refeeding. Serum T4, T3, and thyrotropin concentrations fell during fasting, with no decrease in serum protein binding as assessed by a T3-charcoal uptake. There was no consistent change in serum rT3 concentrations. Dexamethasone had no effect in vitro. In vivo dexamethasone administration resulted in elevated serum rT3 concentrations after 1 day, and after 5 days, in inhibition of T4 to T3 conversion and rT3 degradation without altering serum T4, T3, or thyrotropin concentrations. Endotoxin treatment had no effect on iodothyronine metabolism in liver homogenates. In kidney homogenates the reaction rates and response to propylthiouracil in vitro were similar to those in liver. No significant T4 to T3 conversion or rT3 production or degradation could be detected
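
The kinetic constants above fit the standard Michaelis-Menten form with a competitive inhibitor, v = Vmax·S / (Km·(1 + I/Ki) + S). As a sketch, the reported T4-to-T3 parameters (Km = 7.7 μM, Vmax = 0.13 pmol T3/min per mg protein, Ki = 4.5 nM for rT3) can be plugged in to see how strongly nanomolar rT3 suppresses the reaction; the substrate and inhibitor concentrations chosen below are illustrative, not values from the study.

```python
def mm_rate(S, Vmax, Km, I=0.0, Ki=float("inf")):
    """Michaelis-Menten rate with a competitive inhibitor:
    v = Vmax * S / (Km * (1 + I/Ki) + S)."""
    return Vmax * S / (Km * (1.0 + I / Ki) + S)

# Parameters reported for T4 -> T3 conversion (concentrations in uM)
Vmax = 0.13    # pmol T3 / min / mg protein
Km = 7.7       # uM (T4)
Ki = 0.0045    # uM (rT3; 4.5 nM)

# Illustrative concentrations, not from the study
v_free = mm_rate(5.0, Vmax, Km)                  # 5 uM T4, no inhibitor
v_inh = mm_rate(5.0, Vmax, Km, I=0.010, Ki=Ki)   # plus 10 nM rT3
```

At these illustrative concentrations, 10 nM rT3 more than halves the conversion rate (from about 0.051 to about 0.022 pmol/min per mg protein), consistent with the abstract's description of rT3 as a potent competitive inhibitor.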

  15. Homogeneous and heterogeneous reactions of anthracene with selected atmospheric oxidants.

    PubMed

    Zhang, Yang; Shu, Jinian; Zhang, Yuanxun; Yang, Bo

    2013-09-01

    Reactions of gas-phase anthracene and of suspended anthracene particles with O3 and with O3-NO were conducted in a 200-L reaction chamber. Secondary organic aerosol (SOA) formation from the gas-phase reactions of anthracene with O3 and O3-NO was observed. Meanwhile, the size distributions and mass concentrations of SOA were monitored with a scanning mobility particle sizer (SMPS) during the formation processes. The rapid exponential growth of SOA reveals that the atmospheric lifetimes of gas-phase anthracene towards O3 and O3-NO are less than 20.5 and 4.34 hr, respectively. The particulate oxidation products from homogeneous and heterogeneous reactions were analyzed with a vacuum ultraviolet photoionization aerosol time-of-flight mass spectrometer (VUV-ATOFMS). Gas chromatography/mass spectrometry (GC/MS) analyses of the oxidation products of anthracene were carried out for assigning the time-of-flight (TOF) mass spectra of products from homogeneous and heterogeneous reactions. Anthrone, anthraquinone, 9,10-dihydroxyanthracene, and 1,9,10-trihydroxyanthracene were the ozonation products of anthracene, while anthrone, anthraquinone, 9-nitroanthracene, and 1,8-dihydroxyanthraquinone were the main products of the reaction of anthracene with O3-NO.

  16. Direction of unsaturated flow in a homogeneous and isotropic hillslope

    USGS Publications Warehouse

    Lu, Ning; Kaya, Basak Sener; Godt, Jonathan W.

    2011-01-01

    The distribution of soil moisture in a homogeneous and isotropic hillslope is a transient, variably saturated physical process controlled by rainfall characteristics, hillslope geometry, and the hydrological properties of the hillslope materials. The major driving mechanisms for moisture movement are gravity and gradients in matric potential. The latter is solely controlled by gradients of moisture content. In a homogeneous and isotropic saturated hillslope, absent a gradient in moisture content and under the driving force of gravity with a constant pressure boundary at the slope surface, flow is always in the lateral downslope direction, under either transient or steady state conditions. However, under variably saturated conditions, both gravity and moisture content gradients drive fluid motion, leading to complex flow patterns. In general, the flow field near the ground surface is variably saturated and transient, and the direction of flow could be laterally downslope, laterally upslope, or vertically downward. Previous work has suggested that prevailing rainfall conditions are sufficient to completely control these flow regimes. This work, however, shows that under time-varying rainfall conditions, vertical, downslope, and upslope lateral flow can concurrently occur at different depths and locations within the hillslope. More importantly, we show that the state of wetting or drying in a hillslope defines the temporal and spatial regimes of flow and when and where laterally downslope and/or laterally upslope flow occurs.
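
The direction argument above reduces to the sign of the total-head gradient: Darcy's law gives q = -K(ψ)∇(ψ + z), and since K is always positive, the flux direction depends only on how the matric-potential gradient compares with the unit gravity gradient. The following is a minimal sketch of that decomposition; the coordinate convention and the example gradient values are illustrative assumptions, not values from the study.

```python
import math

def flow_direction(dpsi_dx, dpsi_dz):
    """Direction of the unsaturated Darcy flux q = -K * grad(psi + z).
    Because K > 0, only the head gradient sets the direction.
    Convention: x is lateral (positive downslope), z is vertical (positive up).
    Gradients are dimensionless (head change per unit length)."""
    qx = -dpsi_dx          # lateral component of -grad(h), up to the factor K
    qz = -(dpsi_dz + 1.0)  # gravity contributes dz/dz = 1 to the head gradient
    angle = math.degrees(math.atan2(qz, qx))
    return qx, qz, angle

# Wetting from above: psi increases upward, so flow is vertically downward
qx1, qz1, a1 = flow_direction(0.0, 2.0)   # qz1 = -3.0, straight down
# Drying: a lateral matric-potential gradient can drive upslope flow
qx2, qz2, a2 = flow_direction(0.5, -1.0)  # qx2 = -0.5, laterally upslope
```

The second case illustrates the abstract's point: when the vertical matric-potential gradient happens to cancel gravity, even a modest lateral moisture-content gradient is enough to turn the flux laterally upslope.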

  17. Direction of unsaturated flow in a homogeneous and isotropic hillslope

    USGS Publications Warehouse

    Lu, N.; Kaya, B.S.; Godt, J.W.

    2011-01-01

    The distribution of soil moisture in a homogeneous and isotropic hillslope is a transient, variably saturated physical process controlled by rainfall characteristics, hillslope geometry, and the hydrological properties of the hillslope materials. The major driving mechanisms for moisture movement are gravity and gradients in matric potential. The latter is solely controlled by gradients of moisture content. In a homogeneous and isotropic saturated hillslope, absent a gradient in moisture content and under the driving force of gravity with a constant pressure boundary at the slope surface, flow is always in the lateral downslope direction, under either transient or steady state conditions. However, under variably saturated conditions, both gravity and moisture content gradients drive fluid motion, leading to complex flow patterns. In general, the flow field near the ground surface is variably saturated and transient, and the direction of flow could be laterally downslope, laterally upslope, or vertically downward. Previous work has suggested that prevailing rainfall conditions are sufficient to completely control these flow regimes. This work, however, shows that under time-varying rainfall conditions, vertical, downslope, and upslope lateral flow can concurrently occur at different depths and locations within the hillslope. More importantly, we show that the state of wetting or drying in a hillslope defines the temporal and spatial regimes of flow and when and where laterally downslope and/or laterally upslope flow occurs. Copyright 2011 by the American Geophysical Union.

  18. Data Homogenization of the NOAA Long-Term Ozonesonde Records

    NASA Astrophysics Data System (ADS)

    Johnson, B.; Cullis, P.; Sterling, C. W.; Jordan, A. F.; Hall, E. G.; Petropavlovskikh, I. V.; Oltmans, S. J.; Mcconville, G.

    2015-12-01

    The NOAA long-term balloon-borne ozonesonde sites at Boulder, Colorado; Hilo, Hawaii; and South Pole Station, Antarctica have measured weekly ozone profiles for more than three decades. The ozonesonde consists of an electrochemical concentration cell (ECC) sensor interfaced with a weather radiosonde which transmits high-resolution ozone and meteorological data during ascent from the surface to 30-35 km altitude. During this 30-year period there have been several model changes in the commercially available ECC ozonesondes and radiosondes, as well as three adjustments in the ozone sensor solution composition at NOAA. These changes were aimed at optimizing the ozonesonde performance. Organized intercomparison campaigns conducted at the environmental simulation facility at the Research Centre Juelich, Germany, and international field-site testing have been the primary process for assessing new designs, instruments, or sensor solution changes and developing standard operating procedures. NOAA has also performed in-house laboratory tests and launched 28 dual ozonesondes at various sites since 1994 to provide further comparison data to determine the optimum homogenized data set. The final homogenization effort involved reviewing and editing several thousand individual ozonesonde profiles, followed by applying the optimum correction algorithms for changes in sensor solution composition. Results from the improved data sets will be shown, with long-term trends and uncertainties at various altitude levels.

  19. Homogeneous cosmologies as group field theory condensates

    NASA Astrophysics Data System (ADS)

    Gielen, Steffen; Oriti, Daniele; Sindoni, Lorenzo

    2014-06-01

    We give a general procedure, in the group field theory (GFT) formalism for quantum gravity, for constructing states that describe macroscopic, spatially homogeneous universes. These states are close to coherent (condensate) states used in the description of Bose-Einstein condensates. The condition on such states to be (approximate) solutions to the quantum equations of motion of GFT is used to extract an effective dynamics for homogeneous cosmologies directly from the underlying quantum theory. The resulting description in general gives nonlinear and nonlocal equations for the `condensate wavefunction' which are analogous to the Gross-Pitaevskii equation in Bose-Einstein condensates. We show the general form of the effective equations for current quantum gravity models, as well as some concrete examples. We identify conditions under which the dynamics becomes linear, admitting an interpretation as a quantum-cosmological Wheeler-DeWitt equation, and give its semiclassical (WKB) approximation in the case of a kinetic term that includes a Laplace-Beltrami operator. For isotropic states, this approximation reproduces the classical Friedmann equation in vacuum with positive spatial curvature. We show how the formalism can be consistently extended from Riemannian signature to Lorentzian signature models, and discuss the addition of matter fields, obtaining the correct coupling of a massless scalar in the Friedmann equation from the most natural extension of the GFT action. We also outline the procedure for extending our condensate states to include cosmological perturbations. Our results form the basis of a general programme for extracting effective cosmological dynamics directly from a microscopic non-perturbative theory of quantum gravity.

  20. Homogenization of global radiosonde humidity data

    NASA Astrophysics Data System (ADS)

    Blaschek, Michael; Haimberger, Leopold

    2016-04-01

    The global radiosonde network is an important source of upper-air measurements and is strongly connected to reanalysis efforts for the 20th century. However, the measurements are strongly affected by changes in the observing system and require homogenization before they can be considered useful in climate studies. In particular, humidity measurements are known to show spurious trends and biases induced by many sources, e.g. reporting practices or freezing of the sensor. We propose to detect and correct these biases in an automated way, as has been done with temperature and winds. We detect breakpoints in dew point depression (DPD) time series by employing a standard normal homogeneity test (SNHT) on DPD departures from ERA-Interim. In a next step, going back in time, we calculate quantile departures between the segments after and before each breakpoint. These departures adjust the earlier distribution of DPD to the later distribution (quantile matching), thus removing, for example, a non-climatic shift. We apply this approach to the existing radiosonde network. As a first step to verify our approach, we compare our results with ERA-Interim data and with brightness temperatures of humidity-sensitive channels of the microwave radiometer (SSMIS) onboard DMSP F16. The results show that some of the biases can be detected and corrected in an automated way; however, large biases that affect the distribution of DPD values and originate from known reporting practices (e.g. 30 DPD at US stations) remain. These biases can be removed but not corrected. The comparison of brightness temperatures from satellites and radiosondes proves to be difficult, as large differences result from, for example, representativeness errors.
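
The two-step procedure described here, SNHT breakpoint detection on DPD departures followed by quantile matching across the break, can be sketched as follows. This is an illustrative toy implementation on synthetic data, not the authors' code: the statistic is the classical single-shift SNHT, and `quantile_match` applies a single median offset of matched quartiles rather than a full per-quantile adjustment.

```python
import statistics

def snht(series):
    """Single-shift SNHT statistic T(k) for every split point k."""
    n = len(series)
    mu = statistics.mean(series)
    sd = statistics.pstdev(series) or 1.0
    z = [(x - mu) / sd for x in series]
    t = []
    for k in range(1, n):
        z1 = sum(z[:k]) / k          # mean of standardized early segment
        z2 = sum(z[k:]) / (n - k)    # mean of standardized late segment
        t.append(k * z1 * z1 + (n - k) * z2 * z2)
    return t  # candidate breakpoint = argmax(t) + 1

def quantile_match(before, after):
    """Adjust the pre-break segment toward the post-break distribution
    using a single median offset of the matched quartiles."""
    qb = statistics.quantiles(before, n=4)
    qa = statistics.quantiles(after, n=4)
    offset = statistics.median(a - b for a, b in zip(qa, qb))
    return [x + offset for x in before]

# Synthetic DPD-departure series with a +2.0 shift at index 30
series = [0.1 * (i % 5) for i in range(30)] + \
         [2.0 + 0.1 * (i % 5) for i in range(30)]
t = snht(series)
k_hat = t.index(max(t)) + 1  # detected breakpoint
adjusted = quantile_match(series[:k_hat], series[k_hat:])
```

On this toy series the statistic peaks exactly at the imposed shift, and the adjustment brings the earlier segment onto the later distribution; the operational version would match the full set of quantiles rather than a single offset.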

  1. Effect of cloud-scale vertical velocity on the contribution of homogeneous nucleation to cirrus formation and radiative forcing

    NASA Astrophysics Data System (ADS)

    Shi, X.; Liu, X.

    2016-06-01

    Ice nucleation is a critical process for the ice crystal formation in cirrus clouds. The relative contribution of homogeneous nucleation versus heterogeneous nucleation to cirrus formation differs between measurements and predictions from general circulation models. Here we perform large-ensemble simulations of the ice nucleation process using a cloud parcel model driven by observed vertical motions and find that homogeneous nucleation occurs rather infrequently, in agreement with recent measurement findings. When the effect of observed vertical velocity fluctuations on ice nucleation is considered in the Community Atmosphere Model version 5, the relative contribution of homogeneous nucleation to cirrus cloud occurrences decreases to only a few percent. However, homogeneous nucleation still has strong impacts on the cloud radiative forcing. Hence, the importance of homogeneous nucleation for cirrus cloud formation should not be dismissed on the global scale.

  2. Creating a climate for excellence.

    PubMed

    Lancaster, J

    1985-01-01

    Some people are motivated to achieve in a manner consistent with the goals of their organization while others pursue individual goals. The attitudes people hold determine their behavior. Therefore, the manager is charged with creating an environment that fosters employee commitment to organizational goals. To create a climate for achievement, managers must recognize that all employees want recognition. Employees perform more effectively when they understand the goals of the organization, know what is expected of them, and are part of a system that includes feedback and reinforcement. Generally, people perform more effectively in an environment with minimal threat and punishment; individual responsibility should be encouraged, rewards based on results, and a climate of trust and open communication should prevail.

  3. Creating a Mobile Library Website

    ERIC Educational Resources Information Center

    Cutshall, Tom C.; Blake, Lindsay; Bandy, Sandra L.

    2011-01-01

    The overwhelming results were iPhones and Android devices. Since the library wasn't equipped technologically to develop an in-house application platform and because we wanted the content to work across all mobile platforms, we decided to focus on creating a mobile web-based platform. From the NLM page of mobile sites we chose the basic PubMed/…

  4. Evidence for homogeneous distribution of osmium in the protosolar nebula

    NASA Astrophysics Data System (ADS)

    Walker, Richard J.

    2012-10-01

    Separate s-, r-, and possibly p-process enriched and depleted components have been shown to host Os in low metamorphic grade chondrites, although no measurable Os isotopic anomalies have yet been discovered for bulk chondrites. Here, iron meteorites from groups IAB, IIAB, IIIAB, IVA and IVB, as well as the main group pallasites are examined. Many of these meteorites show well-resolved anomalies in ɛ190Os, ɛ189Os and ɛ186Osi. The anomalies, however, differ from those observed in chemically extracted components from chondrites, and are interpreted to reflect long-term exposure of the meteorites to cosmic rays, rather than nucleosynthetic effects. A neutron capture model is presented that can account well for the observed isotopic variations in 190Os and 189Os. The same model predicts greater enrichment in 186Osi than is observed for at least one iron, suggesting as yet unaccounted-for effects, or failings of the model. Despite the variable anomalies resulting from cosmic ray exposure, each of the major meteorite groups examined contains at least one member with normal Os isotopic compositions that are unresolved from chondritic compositions. This indicates that some domains within these meteorites were little affected by cosmic rays. These domains are excellent candidates for application of the 182Hf-182W system for dating metal-silicate segregation on their parent bodies. The normal Os also implies that Os was homogeneously distributed throughout the protosolar nebula on the scale of planetesimal accretion, within the current level of analytical resolution. The homogeneity in Os contrasts with the isotopic heterogeneity present for other siderophile elements, including Mo, Ru and W. The contrast in the scale of anomalies may reflect a late-stage injection of s- and p-process-rich material into the coalescing nebula. Alternatively, nebular thermal processing and destruction of some presolar host phases of Mo, Ru and W may also be responsible.

  5. Influence of interspecific competition and landscape structure on spatial homogenization of avian assemblages.

    PubMed

    Robertson, Oliver J; McAlpine, Clive; House, Alan; Maron, Martine

    2013-01-01

Human-induced biotic homogenization, resulting from landscape change and increased competition from widespread generalists or 'winners', is widely recognized as a global threat to biodiversity. However, it remains unclear which aspects of landscape structure influence homogenization. This paper tests the importance of interspecific competition and landscape structure for the spatial homogeneity of avian assemblages within a fragmented agricultural landscape of eastern Australia. We used field observations of the density of 128 diurnal bird species to calculate taxonomic and functional similarity among assemblages. We then examined whether taxonomic and functional similarity varied with patch type, the extent of woodland habitat, land-use intensity, habitat subdivision, and the presence of Manorina colonies (a competitive genus of honeyeaters). We found that the presence of a Manorina colony was the most significant factor positively influencing both taxonomic and functional similarity of bird assemblages. Competition from members of this widespread genus of native honeyeaters, rather than landscape structure, was the main cause of both taxonomic and functional homogenization. These species have not recently expanded their range, but rather have increased in density in response to agricultural landscape change. The negative impacts of Manorina honeyeaters on assemblage similarity were most pronounced in landscapes of moderate land-use intensity. We conclude that in these human-modified landscapes, increased competition from dominant native species, or 'winners', can result in homogeneous avian assemblages and the loss of specialist species. These interacting processes make biotic homogenization resulting from land-use change a global threat to biodiversity in modified agro-ecosystems.

  6. The transition from inhomogeneous to homogeneous kinetics in CO binding to myoglobin.

    PubMed Central

    Agmon, N; Doster, W; Post, F

    1994-01-01

Heme proteins react inhomogeneously with ligands at cryogenic temperatures and homogeneously at room temperature. We have identified and characterized a transition from inhomogeneous to homogeneous behavior at intermediate temperatures in the time dependence of CO binding to horse myoglobin. The turnover is attributed to a functionally important tertiary protein relaxation process during which the barrier increases dynamically. This is verified by a combination of theory and multipulse measurements. A likely biological significance of this effect is in the autocatalysis of the ligand release process. PMID:8061210

  7. Homogenizing and diversifying effects of intensive agricultural land-use on plant species beta diversity in Central Europe - A call to adapt our conservation measures.

    PubMed

    Buhk, Constanze; Alt, Martin; Steinbauer, Manuel J; Beierkuhnlein, Carl; Warren, Steven D; Jentsch, Anke

    2017-01-15

The prevention of biodiversity loss in agricultural landscapes, to protect ecosystem stability and functions, is of major importance for stabilizing overall diversity. Intense agriculture leads to a loss in species richness and homogenization of species pools, but the processes behind this are poorly understood due to a lack of systematic case studies: the specific impacts of agriculture, in contrast to other land-uses that create open habitat, are rarely studied because such contrasting landscapes hardly exist in temperate regions. Applying systematic grids, we compared plant species distribution at the landscape scale between an active military training area in Europe and an adjacent, rather intensively used agricultural landscape. As the study areas differ mainly in the type of disturbance regime (agricultural vs. non-agricultural), differences in species patterns can be traced back more or less directly to the management. Species trait analyses and multiple measures of beta diversity were applied to differentiate between species similarities between plots, distance-decay, and nestedness. Contrary to our expectation, overall beta diversity in the agricultural area was not reduced but increased under agricultural land-use. This was probably the result of species nestedness due to fragmentation. The natural process of increasing dissimilarity with distance (distance-decay) was suppressed by intense agricultural land-use; generalists and long-distance dispersers gained importance, while rare species lost continuity. There are two independent processes that need to be addressed separately to halt biodiversity loss in agricultural land. There is a need to conserve semi-natural open habitat patches of diverse sizes to favor poor dispersers and specialist species. At the same time, we stress the importance of mitigating the biotic homogenization caused by the decrease of distance-decay: the spread of long-distance dispersers in agricultural fields may be acceptable, however, optimized fertilizer input and erosion

  8. Investigation of subgrid models in homogeneous incompressible turbulence

    NASA Astrophysics Data System (ADS)

    Teissedre, C.

    1987-08-01

A database of simulated homogeneous, incompressible turbulence in an anisotropic regime was derived using a direct simulation code on a parallel-processing computer. The simulated distributions were used to validate subgrid models of the turbulent-viscosity and similarity ('similitude') type, the latter based on the analogy between the near field of the cut-off and the subgrid field. The first type of model accounts well for the evolution of turbulent kinetic energy, while the second, although it better represents the exact value of the subgrid stress, appears to be insufficiently dissipative. Tests of a model perturbing the nonlinear terms were performed in an isotropic situation with large structures; the results show the same kind of under-dissipative behavior as the similarity model.
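The turbulent-viscosity family of subgrid closures evaluated here can be illustrated by the classic Smagorinsky form, ν_t = (C_s Δ)² |S|, where |S| is the resolved strain-rate magnitude. The abstract does not name the specific model tested, so the sketch below is generic; the function name, the value C_s = 0.17, and the use of simple central differences on a cubic grid are illustrative assumptions.

```python
import numpy as np

def smagorinsky_viscosity(u, v, w, dx, cs=0.17):
    """Smagorinsky eddy viscosity nu_t = (cs*dx)**2 * |S| on a cubic grid."""
    # velocity gradients d(u_i)/d(x_j) via central differences
    grads = [[np.gradient(f, dx, axis=a) for a in range(3)] for f in (u, v, w)]
    # |S|^2 = 2 * S_ij S_ij, with S_ij = 0.5*(du_i/dx_j + du_j/dx_i)
    s2 = np.zeros_like(u)
    for i in range(3):
        for j in range(3):
            sij = 0.5 * (grads[i][j] + grads[j][i])
            s2 += 2.0 * sij * sij
    return (cs * dx) ** 2 * np.sqrt(s2)
```

A uniform velocity field has zero strain rate and hence zero subgrid viscosity, which makes a convenient sanity check.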

  9. Homogeneously catalyzed oxidation for the destruction of aqueous organic wastes

    SciTech Connect

Leavitt, D.D.; Horbath, J.S.; Abraham, M.A.

    1990-11-01

Several organic species, specifically atrazine, 2,4-dichlorophenoxyacetic acid, and biphenyl, were converted to CO{sub 2} and other non-harmful gases through oxidation catalyzed by inorganic acid. Nearly complete conversion was obtained through homogeneous liquid-phase oxidation with ammonium nitrate. The kinetics of reaction have been investigated and indicate parallel oxidation and thermal degradation of the oxidant, which results in a maximum conversion at an intermediate temperature. Increasing the oxidant concentration accelerates the rate of conversion and shifts the location of the optimum temperature. Reaction at varying acid concentration revealed that conversion increased approximately linearly as the pH of the solution was increased. Conversion was increased to greater than 99% through the addition of small amounts of transition metal salts, demonstrating the suitability of a treatment process based on this technology for waste streams containing small quantities of heavy metals.

  10. Molecular Dynamics Simulations of Homogeneous Crystallization in Polymer Melt

    NASA Astrophysics Data System (ADS)

    Kong, Bin

    2015-03-01

Molecular mechanisms of homogeneous nucleation and crystal growth from the melt of a polyethylene-like polymer were investigated by molecular dynamics simulations. The crystallinity was determined using the site order parameter (SOP) method, which describes the degree of local order around an atom. Snapshots of the simulations clearly showed the evolution of nucleation and crystal growth through SOP images. The isothermal crystallization kinetics was determined at different temperatures, and the rate of crystallization, Kc, and the Avrami exponent, n, were determined as functions of temperature. Tracing the formation of nuclei revealed that they form with more ordered cores and less ordered shells. A detailed statistical analysis of the MD snapshots and trajectories suggested that the conformations of the polymer chains changed smoothly from random coil to chain-folded lamella during crystallization.
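The rate Kc and Avrami exponent n mentioned above come from the standard Avrami equation for isothermal crystallization, X(t) = 1 − exp(−K tⁿ). A minimal sketch of recovering both parameters from crystallinity data via the usual double-logarithm linearization; the function name and the synthetic K and n values are illustrative, not taken from the paper.

```python
import numpy as np

def avrami_fit(t, x):
    """Fit X(t) = 1 - exp(-K * t**n) via the linearized Avrami plot:
    ln(-ln(1 - X)) = ln(K) + n*ln(t).  Returns (K, n)."""
    y = np.log(-np.log(1.0 - np.asarray(x)))
    n, log_k = np.polyfit(np.log(t), y, 1)
    return np.exp(log_k), n

# synthetic isothermal crystallinity data with K = 0.02, n = 3
t = np.linspace(0.5, 10.0, 40)
x = 1.0 - np.exp(-0.02 * t**3)
K, n = avrami_fit(t, x)   # recovers K ~ 0.02, n ~ 3
```

On real simulation data the double-log plot is only linear in the primary-crystallization regime, so the fit window matters.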

  11. Nature of low-frequency noise in homogeneous semiconductors

    NASA Astrophysics Data System (ADS)

    Palenskis, Vilius; Maknys, Kęstutis

    2015-12-01

This report deals with 1/f noise in homogeneous classical semiconductor samples based on silicon. We perform detailed calculations of the resistance fluctuations of a silicon sample due to (a) changes in charge carrier number caused by capture-emission processes and (b) the screening effect of the negatively charged centers, and show that the proportionality of the noise level to the square of the mobility appears as a presentation parameter, not as a consequence of mobility fluctuations. The calculated results explain well the experimental observations of 1/f noise in Si, Ge and GaAs, and exclude mobility fluctuations as the origin of 1/f noise in these materials and their devices. It is also shown how to find, from experimental 1/f noise results, the effective number of defects responsible for this noise in the measured frequency range.
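The capture-emission processes invoked here are the classic route to a 1/f spectrum: each trap contributes a Lorentzian, and a broad, log-uniform distribution of trap time constants sums to 1/f (the McWhorter picture). A minimal numerical check; the frequency and time-constant ranges are illustrative.

```python
import numpy as np

def gr_noise_spectrum(f, taus):
    """Sum of generation-recombination Lorentzians
    S(f) ~ tau / (1 + (2*pi*f*tau)**2), one term per trap time constant."""
    f = np.asarray(f, dtype=float)[:, None]
    return (taus / (1.0 + (2.0 * np.pi * f * taus) ** 2)).sum(axis=1)

f = np.logspace(0, 4, 50)        # 1 Hz .. 10 kHz
taus = np.logspace(-6, 0, 200)   # log-uniform trap time constants, 1 us .. 1 s
S = gr_noise_spectrum(f, taus)
slope, _ = np.polyfit(np.log10(f), np.log10(S), 1)  # close to -1, i.e. 1/f
```

The 1/f slope holds only between 1/(2π·tau_max) and 1/(2π·tau_min); outside that band the spectrum flattens or rolls off as 1/f².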

  12. Complexation of Cp2MCl2 in a Chloroaluminate Molten Salt: Relevance to Homogeneous Ziegler-Natta Catalysis

    DTIC Science & Technology

    1990-01-01

Molecular aspects of heterogeneous catalysis and catalytic fundamentals of industrial processes. Editors: Professor VV. Marconi, Assoreni, via E. Ramarini 32... The focus is heterogeneous catalysis, but homogeneous catalysis and enzymatic catalysis may also be included, as may methods of catalyst characterization.

  13. Nanoparticles and networks created within liquid crystals

    NASA Astrophysics Data System (ADS)

    Kang, Shin-Wong; Kundu, Sudarshan

We report the in situ creation of growing polymer nanoparticles and the resulting polymer networks formed in liquid crystals. Depending on the monomer concentration, polymerization-induced phase separation proceeds in two distinct regimes. For a high monomer concentration with good miscibility, phase separation is initiated through the nucleation-and-growth mechanism in the binodal decomposition regime and rapidly crosses over to the spinodal decomposition process, resulting in interpenetrating polymer networks. For a dilute system, however, the phase separation mainly proceeds and completes in the binodal decomposition regime, and the system resembles the aggregation process of colloidal particles. In the dilute case, the reaction kinetics is limited by the reaction between in situ created polymer aggregates, and hence the network morphologies are greatly influenced by the diffusion of the reactive growing polymer particles. Thin polymer layers localized at the substrate surface are frequently observed and can be understood as interfacial adsorption followed by further cross-linking of in situ created polymer aggregates at the interface. This process provides direct insight into polymer-stabilized liquid crystals formed by the interfacial polymer layer created through polymerization of dilute reactive monomers in a liquid crystal (LC) host.

  14. An efficient, reliable and inexpensive device for the rapid homogenization of multiple tissue samples by centrifugation.

    PubMed

    Ilyin, S E; Plata-Salamán, C R

    2000-02-15

    Homogenization of tissue samples is a common first step in the majority of current protocols for RNA, DNA, and protein isolation. This report describes a simple device for centrifugation-mediated homogenization of tissue samples. The method presented is applicable to RNA, DNA, and protein isolation, and we show examples where high quality total cell RNA, DNA, and protein were obtained from brain and other tissue samples. The advantages of the approach presented include: (1) a significant reduction in time investment relative to hand-driven or individual motorized-driven pestle homogenization; (2) easy construction of the device from inexpensive parts available in any laboratory; (3) high replicability in the processing; and (4) the capacity for the parallel processing of multiple tissue samples, thus allowing higher efficiency, reliability, and standardization.

  15. Creating a digital medical illustration.

    PubMed

    Culley, Joanna

    2016-01-01

This paper covers the steps required to complete a medical illustration in a digital format using Adobe Illustrator and Photoshop. The project example is the surgical procedure for the release of the glenohumeral joint for the condition known as 'frozen shoulder'. The purpose is to demonstrate one method that an artist can use within digital media to create a colour illustration, such as the release of the glenohumeral joint. Also included is a general overview of how to handle the administration of a medical illustration commission, drawn from the experience of a professional freelance artist.

  16. Creating an environment for learning.

    PubMed

    Houghton, Trish

    2016-03-16

    This article, the third in a series of 11, provides guidance to new and existing mentors and practice teachers to enable them to progress in their role and develop a portfolio of evidence that meets the Nursing and Midwifery Council's Standards to Support Learning and Assessment in Practice (SSLAP). The importance of developing a high quality practice placement is discussed in relation to the fifth domain of the SSLAP, 'creating an environment for learning'. The article provides learning activities and suggests ways in which mentors and practice teachers can undertake various self-assessments, enabling them to gather relevant evidence to demonstrate how they can meet and maintain the requirements of this domain.

  17. Auditory Spatial Receptive Fields Created by Multiplication

    NASA Astrophysics Data System (ADS)

    Peña, José Luis; Konishi, Masakazu

    2001-04-01

    Examples of multiplication by neurons or neural circuits are scarce, although many computational models use this basic operation. The owl's auditory system computes interaural time (ITD) and level (ILD) differences to create a two-dimensional map of auditory space. Space-specific neurons are selective for combinations of ITD and ILD, which define, respectively, the horizontal and vertical dimensions of their receptive fields. A multiplication of separate postsynaptic potentials tuned to ITD and ILD, rather than an addition, can account for the subthreshold responses of these neurons to ITD-ILD pairs. Other nonlinear processes improve the spatial tuning of the spike output and reduce the fit to the multiplicative model.
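The multiplicative model described above can be sketched as a separable product of an ITD tuning curve and an ILD tuning curve, which yields a two-dimensional receptive field of rank one. All tuning parameters below are illustrative placeholders, not values from the paper.

```python
import numpy as np

def subthreshold_response(itd, ild, itd0=50.0, ild0=3.0,
                          itd_sigma=20.0, ild_sigma=2.0):
    """Separable multiplicative model: response = f(ITD) * g(ILD).
    Gaussian tuning curves and all parameter values are illustrative."""
    f_itd = np.exp(-0.5 * ((itd - itd0) / itd_sigma) ** 2)  # ITD tuning (us)
    g_ild = np.exp(-0.5 * ((ild - ild0) / ild_sigma) ** 2)  # ILD tuning (dB)
    return f_itd * g_ild

itd = np.linspace(-200.0, 200.0, 81)[:, None]  # horizontal dimension
ild = np.linspace(-20.0, 20.0, 81)[None, :]    # vertical dimension
rf = subthreshold_response(itd, ild)           # 2-D spatial receptive field
```

An additive model f(ITD) + g(ILD) would instead produce ridge-shaped fields; the rank-one structure is what distinguishes multiplication.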

  18. Simulation and modeling of homogeneous, compressed turbulence

    NASA Technical Reports Server (NTRS)

    Wu, C. T.; Ferziger, J. H.; Chapman, D. R.

    1985-01-01

    Low Reynolds number homogeneous turbulence undergoing low Mach number isotropic and one-dimensional compression was simulated by numerically solving the Navier-Stokes equations. The numerical simulations were performed on a CYBER 205 computer using a 64 x 64 x 64 mesh. A spectral method was used for spatial differencing and the second-order Runge-Kutta method for time advancement. A variety of statistical information was extracted from the computed flow fields. These include three-dimensional energy and dissipation spectra, two-point velocity correlations, one-dimensional energy spectra, turbulent kinetic energy and its dissipation rate, integral length scales, Taylor microscales, and Kolmogorov length scale. Results from the simulated flow fields were used to test one-point closure, two-equation models. A new one-point-closure, three-equation turbulence model which accounts for the effect of compression is proposed. The new model accurately calculates four types of flows (isotropic decay, isotropic compression, one-dimensional compression, and axisymmetric expansion flows) for a wide range of strain rates.

  19. Dynamic contact angle cycling homogenizes heterogeneous surfaces.

    PubMed

    Belibel, R; Barbaud, C; Mora, L

    2016-12-01

Developing an appropriate coating material for metallic stents in order to reduce restenosis has been a challenge for biomedicine and scientific research over the past decade. Biodegradable copolymers of poly((R,S)-3,3-dimethylmalic acid) (PDMMLA) were therefore prepared in order to develop a new coating exhibiting different custom groups in its side chain and able to carry a drug. This material will be in direct contact with cells and blood. It contains carboxylic acid and hexylic groups, used for hydrophilic and hydrophobic character, respectively. The study of this material's wettability and dynamic surface properties is important because the chemistry and the potential motility of these chemical groups influence cell adhesion and the kinetics of polymer hydrolysis. Cassie theory was used for the theoretical correction of contact angles on these chemically heterogeneous surface coatings. Dynamic surface analysis was used as a practical means of homogenizing chemically heterogeneous surfaces, by repeated cycling in water. In this work, we confirmed that, unlike the receding contact angle, the advancing contact angle is influenced by a difference of only 10% in acidic groups (%A) in the polymer side chains: it decreases linearly with increasing acid percentage. Hysteresis (H) is also a sensitive parameter, which is discussed in this paper. Finally, we conclude that cycling provides real information, thus avoiding the theoretical Cassie correction. H(10) is the parameter most sensitive to %A.
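The Cassie correction referred to here averages the cosines of the component contact angles, weighted by their surface fractions: cos θ_c = f₁ cos θ₁ + f₂ cos θ₂. A minimal sketch; the component angles in the example are placeholders, not measured PDMMLA values.

```python
import math

def cassie_angle(frac_acid, theta_acid_deg, theta_hexyl_deg):
    """Cassie average for a two-component chemically heterogeneous surface:
    cos(theta_c) = f1*cos(theta1) + f2*cos(theta2), with f1 + f2 = 1."""
    c = (frac_acid * math.cos(math.radians(theta_acid_deg))
         + (1.0 - frac_acid) * math.cos(math.radians(theta_hexyl_deg)))
    return math.degrees(math.acos(c))

# 10% acidic (hydrophilic) groups on an otherwise hydrophobic surface;
# the 55 and 100 degree component angles are hypothetical
theta_c = cassie_angle(0.10, 55.0, 100.0)
```

The predicted angle always lies between the two component angles and moves toward the hydrophilic value as %A grows, matching the linear trend reported for the advancing angle.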

  20. Converter film technology for homogeneous white light

    NASA Astrophysics Data System (ADS)

    Jordan, Rafael C.; Bauer, Jörg; Oppermann, Hermann

    2007-09-01

An important issue for white ultra-high-power LEDs is the generation of homogeneous light with high efficiency and a good color rendering index. Unlike hot light sources, LEDs do not emit the whole range of visible wavelengths; only a certain wavelength with a limited full width at half maximum is emitted. Therefore a combination of wavelengths must be used to make the light appear white to the human eye. The CIE chromaticity diagram (Fig. 1) shows that several combinations of wavelengths are perceived by the brain as white. Even the combination of two wavelengths (e.g. cyan and red, or blue and yellow) appears white when it reaches our receptors. The situation is completely different when the light illuminates an object. The reflection spectrum of the object, which determines our color perception of it, cannot be stimulated over the whole visible range. For example, a red stop sign, which absorbs all wavelengths except red, will absorb the blue and yellow light from such a "white" source, and due to the missing red the sign appears dark grey or black.

  1. Davydov's solitons in a homogeneous nucleotide chain

    NASA Astrophysics Data System (ADS)

    Lakhno, Victor D.

Charge transfer in homogeneous nucleotide chains is modeled on the basis of the Holstein Hamiltonian. The path length of Davydov solitons in these chains is studied. It is shown that in the dispersionless case, when the soliton velocity V is small, the path length grows exponentially as V decreases; in this case, the state of a moving soliton is quasisteady. In the presence of dispersion determined by the dependence Ω² = Ω₀² + V₀²κ², the path length in the region 0 < V < V₀ is infinite; in this case, the phonon environment follows the charge motion. In the region V > V₀, the soliton motion is accompanied by emission of phonons, which leads to a finite soliton path length; the latter tends to infinity as V → V₀ + 0 and as V → ∞. The presence of dissipation leads to a finite soliton path length. An equilibrium soliton velocity in an external electric field is calculated. It is shown that there is a maximum electric field intensity at which steady motion of a soliton is possible. The soliton mobility is calculated for the stable, or ohmic, branch.

  2. Magnetic field homogeneity for neutron EDM experiment

    NASA Astrophysics Data System (ADS)

    Anderson, Melissa

    2016-09-01

The neutron electric dipole moment (nEDM) is an observable which, if non-zero, would violate time-reversal symmetry, and thereby charge-parity symmetry of nature. New sources of CP violation beyond those found in the standard model of particle physics are already tightly constrained by nEDM measurements. Our future nEDM experiment seeks to improve the precision on the nEDM by a factor of 30, using a new ultracold neutron (UCN) source that is being constructed at TRIUMF. Systematic errors in the nEDM experiment are driven by magnetic field inhomogeneity and instability. The goal field inhomogeneity averaged over the experimental measurement cell (order of 1 m) is 1 nT/m at a total magnetic field of 1 microtesla, which equates to roughly 10⁻³ homogeneity. A particularly challenging aspect of the design problem is that nearby magnetic materials will also affect the magnetic inhomogeneity, and this must be taken into account in completing the design. This poster will present the design methodology and status of the main coil for the experiment, where we use FEA software (COMSOL) to simulate and analyze the magnetic field. Natural Sciences and Engineering Research Council.

  3. Homogeneously dispersed, multimetal oxygen-evolving catalysts

    DOE PAGES

    Zhang, Bo; Zheng, Xueli; Voznyy, Oleksandr; ...

    2016-03-24

Earth-abundant first-row (3d) transition-metal-based catalysts have been developed for the oxygen-evolution reaction (OER); however, they operate at overpotentials significantly above thermodynamic requirements. Density functional theory suggested that non-3d high-valency metals such as tungsten can modulate 3d metal oxides, providing near-optimal adsorption energies for OER intermediates. We developed a room-temperature synthesis to produce gelled oxy-hydroxide materials with an atomically homogeneous metal distribution. This gelled FeCoW oxy-hydroxide exhibits the lowest overpotential (191 mV) reported at 10 mA per square centimeter in alkaline electrolyte, and the catalyst shows no evidence of degradation following more than 500 hours of operation. X-ray absorption and computational studies reveal a synergistic interplay between W, Fe and Co in producing a favorable local coordination environment and electronic structure that enhance the energetics for OER.

  4. Pressure-strain-rate events in homogeneous turbulent shear flow

    NASA Technical Reports Server (NTRS)

    Brasseur, James G.; Lee, Moon J.

    1988-01-01

A detailed study of the intercomponent energy transfer processes by the pressure-strain-rate in homogeneous turbulent shear flow is presented. Probability density functions (pdf's) and contour plots of the rapid and slow pressure-strain-rate show that the energy transfer processes are extremely peaky, with high-magnitude events dominating low-magnitude fluctuations, as reflected by very high flatness factors of the pressure-strain-rate. The concept of an energy transfer class was applied to investigate the direction as well as the magnitude of the energy transfer processes. In incompressible flow, six disjoint energy transfer classes exist. Examination of contours in instantaneous fields, pdf's and weighted pdf's of the pressure-strain-rate indicates that in the low-magnitude regions all six classes play an important role, but in the high-magnitude regions four classes of transfer processes dominate. The contribution to the average slow pressure-strain-rate from the high-magnitude fluctuations is only 50 percent or less. The relative significance of high- and low-magnitude transfer events is discussed.
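The flatness factor used above to quantify peakiness is F = ⟨x'⁴⟩/⟨x'²⟩², which equals 3 for a Gaussian signal and grows rapidly when rare high-magnitude events dominate. A sketch on synthetic data; the sparse "peaky" field is an illustrative stand-in, not DNS data.

```python
import numpy as np

def flatness(x):
    """Flatness factor F = <x'^4> / <x'^2>^2 of a fluctuating field;
    F = 3 for a Gaussian signal, much larger for intermittent ones."""
    x = x - np.mean(x)
    return np.mean(x**4) / np.mean(x**2) ** 2

rng = np.random.default_rng(0)
gaussian = rng.standard_normal(1_000_000)
# intermittent stand-in: mostly quiescent, with rare high-magnitude events
peaky = gaussian * (rng.random(1_000_000) < 0.01)

f_gauss, f_peaky = flatness(gaussian), flatness(peaky)
```

Keeping only 1% of the samples multiplies the flatness by roughly 100, which mimics the very high flatness factors reported for the pressure-strain-rate.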

  5. Creating Cross-disciplinary Courses

    PubMed Central

    Reynolds, Elaine R.

    2012-01-01

Because of its focus on the biological underpinnings of action and behavior, neuroscience intersects with many fields of human endeavor. Some of these cross-disciplinary intersections are long-standing, while others, such as neurotheology or neuroeconomics, are more recently formed fields. Many undergraduate institutions have sought to include cross-disciplinary courses in their curriculum because this style of pedagogy is often seen as applicable to real-world problems. However, it can be difficult for faculty with specialized training within their discipline to expand beyond their own fields to offer cross-disciplinary courses. I have been creating a series of multi- or cross-disciplinary courses and have found some strategies that have helped me successfully teach these classes. I will discuss general strategies and tools for developing these types of courses, including: (1) creating mixed-experience classrooms of students and contributing faculty; (2) finding the right tools that allow you to teach a mixed population without prerequisites; (3) examining the topic using multiple disciplinary perspectives; (4) feeding off student experience and interest; and (5) assessing the impact of these courses on student outcomes and your neuroscience program. This last tool in particular is important in establishing the validity of this type of teaching for neuroscience students and the general student population. PMID:23494491

  6. Creating Stop-Motion Videos with iPads to Support Students' Understanding of Cell Processes: "Because You Have to Know What You're Talking about to Be Able to Do It"

    ERIC Educational Resources Information Center

    Deaton, Cynthia C. M.; Deaton, Benjamin E.; Ivankovic, Diana; Norris, Frank A.

    2013-01-01

    The purpose of this qualitative case study is two-fold: (a) describe the implementation of a stop-motion animation video activity to support students' understanding of cell processes, and (b) present research findings about students' beliefs and use of iPads to support their creation of stop-motion videos in an introductory biology course. Data…

  7. Creating a Fellowship Curriculum in Patient Safety and Quality.

    PubMed

    Abookire, Susan A; Gandhi, Tejal K; Kachalia, Allen; Sands, Kenneth; Mort, Elizabeth; Bommarito, Grace; Gagne, Jane; Sato, Luke; Weingart, Saul N

    2016-01-01

The authors sought to create a curriculum for a newly established 2-year clinical fellowship in quality and safety offered across Harvard Medical School-affiliated teaching hospitals, described in the companion article "Design and Implementation of the Harvard Fellowship in Patient Safety and Quality." The aim of the curriculum development process was to define, coordinate, design, and implement a set of essential skills for future physician-scholars of any specialty to lead operational quality and patient safety efforts. The process of curriculum development and the ultimate content are described in this article.

  8. Modification of homogeneous and isotropic turbulence by solid particles

    NASA Astrophysics Data System (ADS)

    Hwang, Wontae

    2005-12-01

    Particle-laden flows are prevalent in natural and industrial environments. Dilute loadings of small, heavy particles have been observed to attenuate the turbulence levels of the carrier-phase flow, up to 80% in some cases. We attempt to increase the physical understanding of this complex phenomenon by studying the interaction of solid particles with the most fundamental type of turbulence, which is homogeneous and isotropic with no mean flow. A flow facility was developed that could create air turbulence in a nearly-spherical chamber by means of synthetic jet actuators mounted on the corners. Loudspeakers were used as the actuators. Stationary turbulence and natural decaying turbulence were investigated using two-dimensional particle image velocimetry for the base flow qualification. Results indicated that the turbulence was fairly homogeneous throughout the measurement domain and very isotropic, with small mean flow. The particle-laden flow experiments were conducted in two different environments, the lab and in micro-gravity, to examine the effects of particle wakes and flow structure distortion caused by settling particles. The laboratory experiments showed that glass particles with diameters on the order of the turbulence Kolmogorov length scale attenuated the fluid turbulent kinetic energy (TKE) and dissipation rate with increasing particle mass loadings. The main source of fluid TKE production in the chamber was the speakers, but the loss of potential energy of the settling particles also resulted in a significant amount of production of extra TKE. The sink of TKE in the chamber was due to the ordinary fluid viscous dissipation and extra dissipation caused by particles. This extra dissipation could be divided into "unresolved" dissipation caused by local velocity disturbances in the vicinity of the small particles and dissipation caused by large-scale flow distortions from particle wakes and particle clusters. The micro-gravity experiments in NASA's KC-135

9. The Combustion Synthesis of Doped ZnS Materials to Create Ultra-Electroluminescent Materials in Microgravity

    NASA Astrophysics Data System (ADS)

    Castillo, Martin; Steinberg, Theodore

    2012-07-01

Self-propagating high-temperature synthesis (SHS) utilises a rapid exothermic process involving high energy and nonlinearity, coupled with a high cooling rate, to produce materials formed outside normal equilibrium boundaries and thus possessing unique properties. Eliminating gravity during this process allows capillary forces to dominate mixing of the reactants, which results in superior, enhanced homogeneity in the product materials. SHS of ZnS-type materials has previously been conducted in both reduced and normal gravity, and it has been claimed in the literature that a near-perfect wurtzite phase of ZnS was produced. Although SHS of this material is possible at high pressures, there have been no advancements in refining this structure to create ultra-electroluminescent materials. Doping ZnS with Cu, Mn, or rare-earth metals such as Eu and Pr leads to electroluminescent properties, making it an attractive electroluminescent material. The work described here revisits the SHS of ZnS and re-examines the work performed in both normal gravity and in reduced gravity within the Queensland University of Technology Drop Tower Facility. Quantification of the lattice parameters, crystal structures, and phases produced is presented to further explore the unique structure-property-performance relationships produced by the SHS of ZnS materials.

  10. Can the Universe create itself?

    NASA Astrophysics Data System (ADS)

    Gott, J. Richard, III; Li, Li-Xin

    1998-07-01

The question of first cause has troubled philosophers and cosmologists alike. Now that it is apparent that our universe began in a big bang explosion, the question of what happened before the big bang arises. Inflation seems like a very promising answer, but as Borde and Vilenkin have shown, the inflationary state preceding the big bang could not have been infinite in duration; it must have had a beginning also. Where did it come from? Ultimately, the difficult question seems to be how to make something out of nothing. This paper explores the idea that this is the wrong question, that that is not how the Universe got here. Instead, we explore whether there is anything in the laws of physics that would prevent the Universe from creating itself. Because spacetimes can be curved and multiply connected, general relativity allows for the possibility of closed timelike curves (CTCs). Thus, tracing backwards in time through the original inflationary state, we may eventually encounter a region of CTCs, giving no first cause. This region of CTCs may well be over by now (being bounded toward the future by a Cauchy horizon). We illustrate that such models, with CTCs, are not necessarily inconsistent by demonstrating self-consistent vacuums for Misner space and a multiply connected de Sitter space in which the renormalized energy-momentum tensor does not diverge as one approaches the Cauchy horizon and solves Einstein's equations. Some specific scenarios (out of many possible ones) for this type of model are described. For example, a metastable vacuum inflates, producing an infinite number of (big-bang-type) bubble universes. In many of these, either by natural causes or by action of advanced civilizations, a number of bubbles of metastable vacuum are created at late times by high-energy events. These bubbles will usually collapse and form black holes, but occasionally one will tunnel to create an expanding metastable vacuum (a baby universe) on the other side of the

  11. Line segments in homogeneous scalar turbulence

    NASA Astrophysics Data System (ADS)

    Gauding, Michael; Goebbert, Jens Henrik; Hasse, Christian; Peters, Norbert

    2015-09-01

    The local structure of a turbulent scalar field in homogeneous isotropic turbulence is analyzed by direct numerical simulations (DNS) with different Taylor micro-scale based Reynolds numbers between 119 and 529. A novel signal decomposition approach is introduced in which the signal of the scalar along a straight line is partitioned into segments based on the local extremal points of the scalar field. These segments are then parameterized by the distance ℓ between adjacent extremal points and the scalar difference Δϕ at the extrema. Both variables are statistical quantities, and their joint distribution function contains most of the information needed to describe the scalar field statistically. It is highlighted that the marginal distribution function of the length becomes independent of Reynolds number when normalized by the mean length ℓm. From a statistical approach, it is further shown that the mean length scales with the Kolmogorov length, which is also confirmed by DNS. For turbulent mixing, the scalar gradient plays a paramount role. Turbulent scalar fields are characterized by cliff-ramp-like structures manifesting the occurrence of localized large scalar gradients. To study turbulent mixing, a segment-based gradient is defined as Δϕ/ℓ. Joint statistics of the length and the segment-based gradient provide novel understanding of cliff-ramp-like structures. Ramp-like structures are revealed by the asymmetry of the joint distribution function of the segment-based gradient and the length. Cliff-like structures are further analyzed by conditional statistics, and it is shown from DNS that the width of cliffs scales with the Kolmogorov length scale.
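The decomposition described above is straightforward to sketch: split the signal at its local extremal points, then record each segment's length ℓ, scalar difference Δϕ, and segment-based gradient Δϕ/ℓ. A minimal illustration on a synthetic signal (not the authors' DNS code):

```python
import numpy as np

def segment_decomposition(phi, dx=1.0):
    """Partition a 1D scalar signal into segments bounded by local extrema.

    Returns per-segment lengths ell, scalar differences dphi at the extrema,
    and the segment-based gradient dphi / ell described in the abstract.
    """
    d = np.diff(phi)
    # an extremum sits where the discrete derivative changes sign
    extrema = np.where(np.sign(d[:-1]) * np.sign(d[1:]) < 0)[0] + 1
    ell = np.diff(extrema) * dx                    # segment lengths
    dphi = phi[extrema[1:]] - phi[extrema[:-1]]    # scalar jump across a segment
    grad = dphi / ell                              # segment-based gradient
    return ell, dphi, grad

# synthetic example: a two-frequency sinusoid stands in for a turbulent scalar
x = np.linspace(0.0, 10.0 * np.pi, 2000)
phi = np.sin(x) + 0.3 * np.sin(7.3 * x)
ell, dphi, grad = segment_decomposition(phi, dx=x[1] - x[0])
```

Normalizing `ell` by its mean would give the Reynolds-number-independent marginal distribution discussed above.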

  12. Homogeneous and heterogenized iridium water oxidation catalysts

    NASA Astrophysics Data System (ADS)

    Macchioni, Alceo

    2014-10-01

    The development of an efficient catalyst for the oxidative splitting of water into molecular oxygen, protons and electrons is of key importance for producing solar fuels through artificial photosynthesis. We approach the problem rationally, aiming to understand how catalytic performance may be optimized through knowledge of the reaction mechanism of water oxidation and of the fate of the catalytic site under the inevitably harsh oxidative conditions. For the purposes of our study we selected iridium water oxidation catalysts exhibiting remarkable performance (TOF > 5 s-1 and TON > 20000). In particular, we recently focused our attention on [Cp*Ir(N,O)X] (N,O = 2-pyridincarboxylate; X = Cl or NO3) and [IrCl(Hedta)]Na water oxidation catalysts. The former exhibited a remarkable TOF whereas the latter showed a very high TON. Furthermore, [IrCl(Hedta)]Na was heterogenized onto TiO2, taking advantage of the presence of a dangling -COOH functionality. The heterogenized catalyst maintained approximately the same catalytic activity as its homogeneous analogue, with the advantage that it could be reused many times. Mechanistic studies were performed in order to shed some light on the rate-determining step and the transformation of the catalysts when exposed to "oxidative stress". It was found that the last oxidative step, preceding oxygen liberation, is the rate-determining step when a small excess of sacrificial oxidant is used. In addition, several intermediates of the oxidative transformation of the catalyst were intercepted and characterized by NMR, X-ray diffractometry and ESI-MS.

  13. Homogeneity of passively ventilated waste tanks

    SciTech Connect

    Huckaby, J.L.; Jensen, L.; Cromar, R.D.; Hayes, J.C.

    1997-07-01

    Gases and vapors in the high-level radioactive waste underground storage tanks at the Hanford Site are being characterized to help resolve waste storage safety issues and estimate air emissions. Characterization is accomplished by collecting and analyzing air samples from the headspaces of the tanks. Samples are generally collected from a single central location within the headspace, and it is assumed that they are representative of the entire headspace. The validity of this assumption appears to be very good for most tanks, because thermally induced convection currents within the headspaces mix constituents continuously. In the coolest waste tanks, however, thermally induced convection may be suppressed for several months of each year because of the seasonal soil temperature cycle. To determine whether composition does vary significantly with location in a cool tank, the headspaces of three waste tanks have been sampled at different horizontal and vertical locations during that part of the year when thermally induced convection is minimized. This report describes the bases for tank selection and the sampling and analytical methods used, then analyzes and discusses the results. Headspace composition data from two risers at three elevations in Tanks 241-B-103, TY-103, and U-112 have been analyzed by standard analysis of variance (ANOVA) methods, which indicate that these tank headspaces are essentially homogeneous. No stratification of denser vapors (e.g., carbon tetrachloride, dodecane) or lighter gases (e.g., ammonia, hydrogen) was detected in any of the three tanks. A qualitative examination of all tentatively identified organic vapors in SUMMA(TM) and TST samples supported this conclusion.
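The ANOVA described above tests whether mean concentration differs across sampling locations; a homogeneous headspace should show no significant location effect (F near 1). A minimal one-way ANOVA sketch with hypothetical concentration values, not the tank data:

```python
def one_way_anova(*groups):
    """One-way ANOVA F statistic for k groups of measurements."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    # between-group and within-group sums of squares
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical vapor concentrations (ppmv) at three elevations in one riser
low  = [48.7, 50.2, 49.5, 51.0]
mid  = [49.9, 50.4, 48.8, 50.7]
high = [50.1, 49.3, 51.2, 49.8]
F = one_way_anova(low, mid, high)   # F near 1 is consistent with homogeneity
```

The F value would then be compared against the F distribution with (k-1, n-k) degrees of freedom to decide significance.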

  14. Molecular precursors for the preparation of homogenous zirconia-silica materials by hydrolytic sol-gel process in organic media. Crystal structures of [Zr{OSi(O(t)Bu)3}4(H2O)2]·2H2O and [Ti(O(t)Bu){OSi(O(t)Bu)3}3].

    PubMed

    Dhayal, Veena; Chaudhary, Archana; Choudhary, Banwari Lal; Nagar, Meena; Bohra, Rakesh; Mobin, Shaikh M; Mathur, Pradeep

    2012-08-21

    [Zr(OPr(i))(4)·Pr(i)OH] reacts with [HOSi(O(t)Bu)(3)] in anhydrous benzene in 1:1 and 1:2 molar ratios to afford alkoxy zirconosiloxane precursors of the types [Zr(OPr(i))(3){OSi(O(t)Bu)(3)}] (A) and [Zr(OPr(i))(2){OSi(O(t)Bu)(3)}(2)] (B), respectively. Further reactions of A or B with glycols in 1:1 molar ratio afforded six chemically modified precursors of the types [Zr(OPr(i))(OGO){OSi(O(t)Bu)(3)}] (1A-3A) and [Zr(OGO){OSi(O(t)Bu)(3)}(2)] (1B-3B), respectively [where G = (-CH(2)-)(2) (1A, 1B); (-CH(2)-)(3) (2A, 2B) and (-CH(2)CH(2)CH(CH(3)-)} (3A, 3B)]. The precursors A and B are viscous liquids, which solidify on ageing whereas the other products are all solids, soluble in common organic solvents. These were characterized by elemental analyses, molecular weight measurements, FAB mass, FTIR, (1)H, (13)C and (29)Si-NMR studies. Cryoscopic molecular weight measurements of all the products, as well as the FAB mass studies of 3A and 3B, indicate their monomeric nature. However, FAB mass spectrum of the solidified B suggests that it exists in dimeric form. Single crystal structure analysis of [Zr{OSi(O(t)Bu)(3)}(4)(H(2)O)(2)]·2H(2)O (3b) (R(fac) = 11.9%) as well as that of corresponding better quality crystals of [Ti(O(t)Bu){OSi(O(t)Bu)(3)}(3)] (4) (R(fac) = 5.97%) indicate the presence of a M-O-Si bond. TG analyses of 3A, B, and 3B indicate the formation of zirconia-silica materials of the type ZrO(2)·SiO(2) from 3A and ZrO(2)·2SiO(2) from B or 3B at low decomposition temperatures (≤200 °C). The desired homogenous nano-sized zirconia-silica materials [ZrO(2)·nSiO(2)] have been obtained easily from the precursors A and B as well as from the glycol modified precursors 3A and 3B by hydrolytic sol-gel process in organic media without using any acid or base catalyst, and these were characterized by powder XRD patterns, SEM images, EDX analyses and IR spectroscopy.

  15. Creating genetic resistance to HIV.

    PubMed

    Burnett, John C; Zaia, John A; Rossi, John J

    2012-10-01

    HIV/AIDS remains a chronic and incurable disease, in spite of the notable successes of combination antiretroviral therapy. Gene therapy offers the prospect of creating genetic resistance to HIV that supplants the need for antiviral drugs. In pursuit of this goal, a variety of anti-HIV genes have reached clinical testing, including gene-editing enzymes, protein-based inhibitors, and RNA-based therapeutics. Combinations of therapeutic genes against viral and host targets are designed to improve the overall antiviral potency and reduce the likelihood of viral resistance. In cell-based therapies, therapeutic genes are expressed in gene-modified T lymphocytes or in hematopoietic stem cells that generate an HIV-resistant immune system. Such strategies must promote the selective proliferation of the transplanted cells and the prolonged expression of therapeutic genes. This review focuses on the current advances and limitations in genetic therapies against HIV, including the status of several recent and ongoing clinical studies.

  16. Creating an effective poster presentation.

    PubMed

    Taggart, H M; Arslanian, C

    2000-01-01

    One way to build knowledge in nursing is to share research findings or clinical program outcomes. The dissemination of these findings is often a difficult final step in a project that has taken months or years to complete. One method of sharing findings in a relaxed and informal setting is a poster presentation. This method is an effective form for presenting findings using an interactive approach; the milieu of a poster presentation enables the presenters to interact and dialogue with colleagues. Guidelines for size and format require that the poster be clear and informative, and the application of design principles helps to create visually appealing posters. This article summarizes the elements of designing and conducting a poster presentation.

  17. HOMOGENEITY-HETEROGENEITY OF GROUP MEMBERSHIP.

    DTIC Science & Technology

    GROUP DYNAMICS, *PERSONALITY, SCIENTIFIC RESEARCH, INTERACTIONS, PERFORMANCE(HUMAN), PROBLEM SOLVING, SOCIAL PSYCHOLOGY, MATHEMATICAL MODELS, ANALYSIS OF VARIANCE, BEHAVIOR, STATISTICAL PROCESSES, NAVAL RESEARCH, CONFINED ENVIRONMENTS.

  18. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
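The expansion described above is driven entirely by a prescribed density decrease over time. As a purely illustrative stand-in for such an empirical model (first-order decay toward a final foam density with an Arrhenius-type rate; every parameter value below is a placeholder, not the calibrated SIERRA/ARIA model):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def foam_density(t, T, rho0=1200.0, rho_f=100.0, A=5.0e3, Ea=5.0e4):
    """Illustrative time- and temperature-dependent foam density.

    rho(t) = rho_f + (rho0 - rho_f) * exp(-k(T) * t), with an Arrhenius
    rate k = A * exp(-Ea / (R * T)). All parameter values are placeholders.
    """
    k = A * math.exp(-Ea / (R * T))
    return rho_f + (rho0 - rho_f) * math.exp(-k * t)

rho_start = foam_density(0.0, 330.0)     # initial (unexpanded) density
rho_later = foam_density(600.0, 330.0)   # lower density drives expansion
```

In a continuum simulation, the falling density at fixed mass is what generates the volume change and hence the motion of the foam front.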

  19. Mechanized syringe homogenization of human and animal tissues.

    PubMed

    Kurien, Biji T; Porter, Andrew C; Patel, Nisha C; Kurono, Sadamu; Matsumoto, Hiroyuki; Scofield, R Hal

    2004-06-01

    Tissue homogenization is a prerequisite to any fractionation schedule. A plethora of hands-on methods are available to homogenize tissues. Here we report a mechanized method for homogenizing animal and human tissues rapidly and easily. The Bio-Mixer 1200 (manufactured by Innovative Products, Inc., Oklahoma City, OK) utilizes the back-and-forth movement of two motor-driven disposable syringes, connected to each other through a three-way stopcock, to homogenize animal or human tissue. Using this method, we were able to homogenize human or mouse tissues (brain, liver, heart, and salivary glands) in 5 min. From sodium dodecyl sulfate-polyacrylamide gel electrophoresis analysis and a matrix-assisted laser desorption/ionization time-of-flight mass spectrometric enzyme assay for prolidase, we have found that the homogenates obtained were as good as or better than those obtained using a manual glass-on-Teflon (DuPont, Wilmington, DE) homogenization protocol (all-glass tube and Teflon pestle). Use of the Bio-Mixer 1200 to homogenize animal or human tissue precludes the need to stay in the cold room, as is the case with the other hands-on homogenization methods available, in addition to freeing up time for other experiments.

  20. A non-asymptotic homogenization theory for periodic electromagnetic structures

    PubMed Central

    Tsukerman, Igor; Markel, Vadim A.

    2014-01-01

    Homogenization of electromagnetic periodic composites is treated as a two-scale problem and solved by approximating the fields on both scales with eigenmodes that satisfy Maxwell's equations and boundary conditions as accurately as possible. Built into this homogenization methodology is an error indicator whose value characterizes the accuracy of homogenization. The proposed theory allows one to define not only bulk, but also position-dependent material parameters (e.g. in proximity to a physical boundary) and to quantify the trade-off between the accuracy of homogenization and its range of applicability to various illumination conditions. PMID:25104912

  1. Homogeneous Freezing of Water Droplets and its Dependence on Droplet Size

    NASA Astrophysics Data System (ADS)

    Schmitt, Thea; Möhler, Ottmar; Höhler, Kristina; Leisner, Thomas

    2014-05-01

    The formulation and parameterisation of microphysical processes in tropospheric clouds, such as phase transitions, is still a challenge for weather and climate models. This includes the homogeneous freezing of supercooled water droplets, since this is an important process in deep convective systems, where almost pure water droplets may stay liquid until homogeneous freezing occurs at temperatures around 238 K. Though the homogeneous ice nucleation in supercooled water is considered to be well understood, recent laboratory experiments with typical cloud droplet sizes showed one to two orders of magnitude smaller nucleation rate coefficients than previous literature results, including earlier results from experiments with single levitated water droplets and from cloud simulation experiments at the AIDA (Aerosol Interaction and Dynamics in the Atmosphere) facility. This motivated us to re-analyse homogeneous droplet freezing experiments conducted during the previous years at the AIDA cloud chamber. This cloud chamber has a volume of 84 m3 and operates under atmospherically relevant conditions within wide ranges of temperature, pressure and humidity, whereby investigations of both tropospheric mixed-phase clouds and cirrus clouds can be realised. By controlled adiabatic expansions, the ascent of an air parcel in the troposphere can be simulated. According to our new results and their comparison to the results from single levitated droplet experiments, the homogeneous freezing of water droplets seems to be a volume-dependent process, at least for droplets as small as a few micrometers in diameter. A contribution of surface induced freezing can be ruled out, in agreement to previous conclusions from the single droplet experiments. The obtained volume nucleation rate coefficients are in good agreement, within error bars, with some previous literature data, including our own results from earlier AIDA experiments, but they do not agree with recently published lower volume

  2. Homogenization of historical time series on a subdaily scale

    NASA Astrophysics Data System (ADS)

    Kocen, Renate; Brönnimann, Stefan; Breda, Leila; Spadin, Reto; Begert, Michael; Füllemann, Christine

    2010-05-01

    Homogeneous long-term climatological time series provide useful information on climate back to the preindustrial era. High temporal resolution of climate data is desirable to address trends and variability in the mean climate and in climatic extremes. For Switzerland, three long (~250 yrs) historical time series (Basel, Geneva, Gr. St. Bernhard) that were hitherto available in the form of monthly means only have recently been digitized (in cooperation with MeteoSwiss) on a subdaily scale. The digitized time series contain subdaily data (from two to five measurements per day) on temperature, precipitation/snow height, pressure and humidity, as well as subdaily descriptions of wind direction, wind speed and cloud cover. Long-term climatological records often contain inhomogeneities due to non-climatic changes such as station relocations, changes in instrumentation and instrument exposure, changes in observing schedules/practices and environmental changes in the proximity of the observation site. Those disturbances can distort or hide the true climatic signal and could seriously affect the correct assessment and analysis of climate trends, variability and climatic extremes. It is therefore crucial to detect and eliminate artificial shifts and trends, to the extent possible, in the climate data prior to its application. Detailed information on the station history and instruments (metadata) can be of fundamental importance in the process of homogenization in order to support the determination of the exact time of inhomogeneities and the interpretation of statistical test results. While similar methods can be used for the detection of inhomogeneities in subdaily or monthly mean data, quite different correction methods can be chosen. The wealth of information in a high temporal resolution allows more physics-based correction methods.
For instance, a detected radiation error in temperature can be corrected with an error model that incorporates radiation and ventilation terms using

  3. Electrothermal atomic absorption spectrophotometry of nickel in tissue homogenates

    SciTech Connect

    Sunderman, F.W. Jr.; Marzouk, A.; Crisostomo, M.C.; Weatherby, D.R.

    1985-01-01

    A method for analysis of Ni concentrations in tissues is described, which involves (a) tissue dissection with metal-free obsidian knives, (b) tissue homogenization in polyethylene bags by use of a Stomacher blender, (c) oxidative digestion with mixed nitric, sulfuric, and perchloric acids, and (d) quantitation of Ni by electrothermal atomic absorption spectrophotometry with Zeeman background correction. The detection limit for Ni in tissues is 10 ng per g, dry weight; the coefficient of variation ranges from 7 to 15%, depending on the tissue Ni concentration; the recovery of Ni added in concentration of 20 ng per g, dry weight, to kidney homogenates averages 101 +/- 8% (mean +/- SD). In control rats, Ni concentrations are highest in lung (102 +/- 39 ng per g, dry weight) and lowest in spleen (35 +/- 16 ng per g, dry weight). In descending order of Ni concentrations, the tissues of control rats rank as follows: lung > heart > bone > kidney > brain > testis > fat > liver > spleen. In rats killed 24 h after sc injection of NiCl2 (0.125 mmol per kg, body weight) Ni concentrations are highest in kidney (17.7 +/- 2.5 µg per g, dry weight) and lowest in brain (0.38 +/- 0.14 µg per g, dry weight). In descending order of Ni concentrations, the tissues of NiCl2-treated rats rank as follows: kidney >> lung > spleen > testis > heart > fat > liver > bone > brain. The present method fills the need for an accurate, sensitive, and practical technique to determine tissue Ni concentrations, with stringent precautions to minimize Ni contamination during tissue sampling and processing. 35 references, 5 figures, 1 table.

  4. Create a Pint-Sized Photo Book.

    ERIC Educational Resources Information Center

    Gathright, Pat

    2003-01-01

    Explains a project, which involves creating a book using digital images. Notes that teachers can create books with samples of their work. Provides other suggestions for using this project, such as teaching scanning, creating a photo portfolio as a semester exam project, or creating introduction pieces for yearbook or newspaper staffers. (PM)

  5. Creating bicultural experiences in nursing.

    PubMed

    Hezekiah, J

    1993-01-01

    This article describes the process (activities) involved in helping Registered Nurse students from Pakistan in an international health project adjust to Canadian culture and readjust to their home culture. The process, involving both structured and informal activities in Pakistan and in Canada, was designed to assist the students in adapting to both the foreign and home cultures, and it drew on both human and material resources. Predeparture and reentry workshops, support systems in the form of Karachi-based faculty advisers, and intensive orientation programs were identified as important factors in the students' adjustment.

  6. Propagation of the light generated by quasi-homogeneous sources through quasi-homogeneous media

    NASA Astrophysics Data System (ADS)

    Li, Jia; Chen, Yan-Ru; Zhao, Qi; Zhou, Mu-Chun; Xu, Shi-Xue

    2010-01-01

    The spectral density of quasi-homogeneous (QH) light is known for the cases in which it scatters from QH media or propagates in free space. This paper considers the case in which QH sources are surrounded by QH media. Under the paraxial approximation, the spectral density of QH light propagating through QH media is derived. A modified scaling law for the propagation of QH light through QH media is also obtained; this law holds true in the far field even beyond the paraxial approximation.

  7. Creating Class Rules: A Beginning to Creating Community.

    ERIC Educational Resources Information Center

    Goularte, Renee

    The shared writing program described in this lesson engages K-2 students in thinking about the process of learning and the behavioral and community needs which support a productive classroom environment. During the two 30-minute lessons, students will: participate in group discussions about learning; identify and agree on classroom goals and…

  8. Dynamics of the light-induced atomic desorption at homogeneous illumination

    NASA Astrophysics Data System (ADS)

    Tsvetkov, S.; Taslakov, M.; Gateva, S.

    2017-03-01

    An experimental investigation of Light-Induced Atomic Desorption (LIAD) under homogeneous illumination in an uncoated Rb glass cell is reported. The dynamics parameters of LIAD and their dependences on the illumination intensity in the uncoated cell are measured and compared with those in a paraffin-coated cell and with the theoretical dependences for a coated cell under homogeneous illumination. The homogeneous illumination not only increases the yield of LIAD but also increases the rates of desorption and adsorption. The results are of interest for a better understanding of the LIAD process and the atom-surface interaction, and for the development of new LIAD-loaded atomic devices, all-optical control of light, optical sensor miniaturization, and new methods for surface and coating diagnostics and nanostructuring.

  9. The role of exotic species in homogenizing the North American flora.

    PubMed

    Qian, Hong; Ricklefs, Robert E

    2006-12-01

    Exotic species have begun to homogenize the global biota, yet few data are available to assess the extent of this process or factors that constrain its advance at global or continental scales. We evaluate homogenization of vascular plants across America north of Mexico by comparing similarity in the complete native and exotic floras between states and provinces of the USA and Canada. Compared with native species, exotic plants are distributed haphazardly among areas but spread more widely, producing differentiation of floras among neighbouring areas but homogenization at greater distance. The number of exotic species is more closely associated with the size of the human population than with ecological conditions, as in the case of native species, and their distributions are less influenced by climate than those of native species.
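Homogenization analyses of this kind rest on pairwise similarity between regional species lists, for example the Jaccard index |A ∩ B| / |A ∪ B| computed separately for native and exotic floras. A sketch with hypothetical species lists (illustrative only, not the study's data):

```python
def jaccard(a, b):
    """Jaccard similarity between two species sets: |A & B| / |A | B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

# hypothetical state floras: widely shared exotics raise similarity
native_A = {"Quercus alba", "Acer rubrum", "Carya ovata"}
native_B = {"Quercus rubra", "Acer rubrum", "Pinus strobus"}
exotic_A = {"Bromus tectorum", "Lythrum salicaria"}
exotic_B = {"Bromus tectorum", "Lythrum salicaria", "Ailanthus altissima"}

s_native = jaccard(native_A, native_B)   # 1 shared of 5 total = 0.2
s_exotic = jaccard(exotic_A, exotic_B)   # 2 shared of 3 total = 2/3
```

Higher similarity among exotic than native floras for a pair of regions is the signature of homogenization; comparing the two indices across many region pairs and distances gives the pattern described above.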

  10. Synthesis of cyclic sulfites from epoxides and sulfur dioxide with silica-immobilized homogeneous catalysts.

    PubMed

    Takenaka, Yasumasa; Kiyosu, Takahiro; Mori, Goro; Choi, Jun-Chul; Fukaya, Norihisa; Sakakura, Toshiyasu; Yasuda, Hiroyuki

    2012-01-09

    Quaternary ammonium- and amino-functionalized silica catalysts have been prepared for the selective synthesis of cyclic sulfites from epoxides and sulfur dioxide, demonstrating the effects of immobilizing the homogeneous catalysts on silica. The cycloaddition of sulfur dioxide to various epoxides was conducted under solvent-free conditions at 100 °C. The quaternary ammonium- and amino-functionalized silica catalysts produced cyclic sulfites in high yields (79-96 %) that are comparable to those produced by the homogeneous catalysts. The functionalized silica catalysts could be separated from the product solution by filtration, thereby avoiding the catalytic decomposition of the cyclic sulfite products upon distillation of the product solution. Heterogenization of a homogeneous catalyst by immobilization can, therefore, improve the efficiency of the purification of crude reaction products. Despite a decrease in catalytic activity after each recycling step, the heterogeneous pyridine-functionalized silica catalyst provided high yields after as many as five recycling processes.

  11. Homogenizing Advanced Alloys: Thermodynamic and Kinetic Simulations Followed by Experimental Results

    NASA Astrophysics Data System (ADS)

    Jablonski, Paul D.; Hawk, Jeffrey A.

    2016-11-01

    Segregation of solute elements occurs in nearly all metal alloys during solidification. The resultant elemental partitioning can severely degrade as-cast material properties and lead to difficulties during post-processing (e.g., hot shorts and incipient melting). Many cast articles are subjected to a homogenization heat treatment in order to minimize segregation and improve their performance. Traditionally, homogenization heat treatments are based upon past practice or time-consuming trial and error experiments. Through the use of thermodynamic and kinetic modeling software, NETL has designed a systematic method to optimize homogenization heat treatments. Use of the method allows engineers and researchers to homogenize casting chemistries to levels appropriate for a given application. The method also allows for the adjustment of heat treatment schedules to fit limitations on in-house equipment (capability, reliability, etc.) while maintaining clear numeric targets for segregation reduction. In this approach, the Scheil module within Thermo-Calc is used to predict the as-cast segregation present within an alloy, and then diffusion controlled transformations is used to model homogenization kinetics as a function of time and temperature. Examples of computationally designed heat treatments and verification of their effects on segregation and properties of real castings are presented.
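The diffusion half of such a calculation has a classical closed-form approximation: a sinusoidal interdendritic composition profile of wavelength λ decays as δ(t) = exp(-4π²Dt/λ²). A sketch estimating the hold time to reach a segregation target (illustrative D and λ values; this is the textbook 1D result, not the NETL thermodynamic/kinetic workflow described above):

```python
import math

def homogenization_time(D, wavelength, target_delta):
    """Time for residual sinusoidal segregation to decay to target_delta.

    delta(t) = exp(-4 * pi^2 * D * t / lambda^2)   (classical 1D result)
    D: solute diffusivity at the homogenization temperature [m^2/s]
    wavelength: segregation wavelength, e.g. dendrite arm spacing [m]
    target_delta: desired residual segregation fraction (0 < delta < 1)
    """
    return -wavelength ** 2 * math.log(target_delta) / (4.0 * math.pi ** 2 * D)

# illustrative values: D = 1e-14 m^2/s, 100 um arm spacing, 1% residual
t = homogenization_time(1e-14, 100e-6, 0.01)
hours = t / 3600.0
```

The quadratic dependence on λ is why finer as-cast structures homogenize dramatically faster, and why the simulated heat treatments can trade temperature (through D) against hold time.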

  13. Homogeneous eutectic of Pb-Sb

    NASA Technical Reports Server (NTRS)

    Winter, J. M., Jr.

    1977-01-01

    A dendrite-free eutectic mixture of Pb-Sb is expected to be a superelastic material that can be used in the formation of shaped-charge liners for industrial explosive metal-forming processes and other applications.

  14. Creating experimental color harmony map

    NASA Astrophysics Data System (ADS)

    Chamaret, Christel; Urban, Fabrice; Lepinel, Josselin

    2014-02-01

    Starting in the 17th century with Newton, color harmony is a topic that has not yet reached a consensus on definition, representation or modeling. Previous work highlighted specific characteristics of color harmony in combinations of color doublets or triplets by means of human ratings on a harmony scale. However, no investigation has involved complex stimuli or addressed how harmony is spatially located within a picture. A model of this concept, as well as a reliable ground-truth, would be of high value since the applications are wide and concern several communities, from psychology to computer graphics. We propose a protocol for creating color harmony maps from a controlled experiment. Through an eye-tracking protocol, we focus on the identification of disharmonious colors in pictures. The experiment was composed of a free-viewing pass, to let each observer become familiar with the content, followed by a second pass in which we asked observers "to search for the most disharmonious areas in the picture". Twenty-seven observers participated in the experiment, which comprised a total of 30 different stimuli. The high inter-observer agreement as well as a cross-validation confirm the validity of the proposed ground-truth.

  15. Creating a urine black hole

    NASA Astrophysics Data System (ADS)

    Hurd, Randy; Pan, Zhao; Meritt, Andrew; Belden, Jesse; Truscott, Tadd

    2015-11-01

    Since the mid-nineteenth century, both enlisted and fashion-conscious owners of khaki trousers have been plagued by undesired speckle patterns resulting from splash-back while urinating. In recent years, industrial designers and hygiene-driven entrepreneurs have sought to limit this splashing by creating urinal inserts, with the effectiveness of their inventions varying drastically. From this large assortment of inserts, designs consisting of macroscopic pillar arrays seem to be the most effective splash suppressors. Interestingly, this design partially mimics the geometry of the water-capturing moss Syntrichia caninervis, which exhibits a notable ability to suppress splash and quickly absorb water from impacting rain droplets. With this natural splash suppressor in mind, we search for the ideal urine black hole by performing experiments of simulated urine streams (water droplet streams) impacting macroscopic pillar arrays with varying parameters including pillar height and spacing, draining and material properties. We propose improved urinal insert designs based on our experimental data in hopes of reducing the potential embarrassment inherent in wearing khakis.

  16. Creating Effective K-12 Outreach

    NASA Astrophysics Data System (ADS)

    Hopkins, J.

    2011-12-01

    Grant opportunities require investigators to provide 'broader impacts' for their scientific research. For most researchers this involves some kind of educational outreach for the K-12 community. I have been able to participate in many different types of grant-funded science teacher professional development programs. The most valuable have been outreach programs in which the research integrated seamlessly with my classroom curriculum and was sustainable with my future classes. To accomplish these types of programs, the investigators needed to research the K-12 community and identify several key aspects of the K-12 environment where their expertise would benefit me and my students. There are many different K-12 learning environments, so researchers need to be sure to match up with the right grade level and administrative environment. You might want to consider non-mainstream school settings, such as magnet programs, STEM academies, and distance learning. The goal is to make your outreach seem natural and productive. This presentation will illustrate how researchers can create an educational outreach project that will be a win-win situation for everyone involved.

  17. Nanocrystals encapsulated in SiO2 particles: silanization and homogenous coating for bio applications.

    PubMed

    Yang, Ping; Li, Xiaoyu; Zhang, Ruili; Liu, Ning; Zhang, Yulan

    2013-03-01

    Sol-gel procedures have been developed to encapsulate inorganic nanocrystals, including metallic Au and II-VI semiconductor materials (CdSe/Cd(1-x)Zn(x)S), in SiO2 particles by using tetraethyl orthosilicate. The key strategy was the control of the sol-gel procedure. Anisotropic deposition of SiO2 monomers occurs because well-developed crystal facets have different affinities for SiO2 monomers; as a result, SiO2 monomers were not homogeneously deposited on nonspherical Au and CdSe/Cd(1-x)Zn(x)S nanocrystals. A surface silanization process, in which partly hydrolyzed tetraethyl orthosilicate is attached to the nanocrystals in place of the initial ligands, plays an important role in coating the nanocrystals homogeneously with a SiO2 layer. Using this surface silanization process followed by a reverse micelle route, CdSe/Cd(1-x)Zn(x)S nanocrystals were homogeneously coated with a thin SiO2 layer. Colloidal Au nanocrystals were homogeneously coated with a SiO2 shell by the surface silanization process and a subsequent Stöber synthesis, without using a silane coupling agent or bulk polymer as the surface primer to render the Au surface vitreophilic. These results indicated that partly hydrolyzed tetraethyl orthosilicate can replace the ligands on the nanocrystals. After surface modification, the SiO2 particles containing nanocrystals were conjugated with antibodies for bioapplications.

  18. Feeding premature infants banked human milk homogenized by ultrasonic treatment.

    PubMed

    Rayol, M R; Martinez, F E; Jorge, S M; Gonçalves, A L; Desai, I D

    1993-12-01

    Premature neonates fed ultrasonically homogenized human milk had better weight gain and triceps skin-fold thickness than did a control group given untreated human milk (p < 0.01) and also had lower fat loss during tube feeding (p < 0.01). Ultrasonic homogenization of human milk appears to minimize loss of fat and thus allows better growth of premature infants.

  19. Pi overlapping ring systems contained in a homogeneous assay: a novel homogeneous assay for antigens

    NASA Astrophysics Data System (ADS)

    Kidwell, David A.

    1993-05-01

    A novel immunoassay, Pi overlapping ring systems contained in a homogeneous assay (PORSCHA), is described. This assay relies upon the change in fluorescent spectral properties that pyrene and its derivatives show with varying concentration. Because antibodies and other biomolecules can bind two molecules simultaneously, they can change the local concentration of the molecules that they bind. This concentration change may be detected spectrally as a change in the fluorescence emission wavelength of an appropriately labeled biomolecule. Several tests of PORSCHA have been performed which demonstrate this principle. For example: with streptavidin as the binding biomolecule and a biotin-labeled pyrene derivative, production of the excimer emitting at 470 nm is observed. Without the streptavidin present, only the monomer emitting at 378 and 390 nm is observed. The ratio of monomer to excimer provides the concentration of unlabeled biotin in the sample. Approximately 1 ng/mL of biotin may be detected with this system using a 50 µL sample (2 × 10⁻¹⁶ mol of biotin). The principles behind PORSCHA and the results with the streptavidin/biotin system are discussed, and extensions of the PORSCHA concept to antibodies as the binding partner and to DNA in homogeneous assays are suggested.

  20. Fuel mixture stratification as a method for improving homogeneous charge compression ignition engine operation

    DOEpatents

    Dec, John E.; Sjoberg, Carl-Magnus G.

    2006-10-31

    A method for slowing the heat-release rate in homogeneous charge compression ignition ("HCCI") engines that allows operation without excessive knock at higher engine loads than are possible with conventional HCCI. This method comprises injecting a fuel charge in a manner that creates a stratified fuel charge in the engine cylinder, providing a range of fuel concentrations in the in-cylinder gases (typically with enough oxygen for complete combustion), using a two-stage-ignition fuel with appropriate cool-flame chemistry so that regions of different fuel concentrations autoignite sequentially.

  1. Improving non-homogeneous regression for probabilistic precipitation forecasts

    NASA Astrophysics Data System (ADS)

    Presser, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2016-04-01

    Non-homogeneous regression is a state-of-the-art ensemble post-processing technique that statistically corrects ensemble forecasts and predicts a full probability distribution. Originally, a Gaussian model is employed that linearly links the predicted distribution mean and variance to the ensemble mean and variance, respectively. For non-normally distributed precipitation data, this model can be censored at zero to account for periods without precipitation. We improve this regression approach in several directions. First, we consider link functions in the variance sub-model that assure positivity of the model variance. Second, we consider a censored logistic (instead of censored Gaussian) distribution to accommodate more frequent events with high precipitation. Third, we introduce a splitting procedure, which appropriately accounts for perfect-prediction cases, i.e., where no precipitation is observed when all ensemble members predict no precipitation. The study covers different accumulation periods (3, 6, 12, 24 hours) for short-range precipitation forecasts in Northern Italy. The choice of link function for the variance parameter, the splitting procedure, and an appropriate distribution assumption for precipitation data significantly improve the probabilistic forecast skill, especially for shorter accumulation periods. KEYWORDS: heteroscedastic ensemble post-processing, censored distribution, maximum likelihood estimation, probabilistic precipitation forecasting
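The censored-Gaussian baseline that this abstract builds on can be sketched in a few lines: a Tobit-style likelihood whose location is linear in the ensemble mean and whose scale gets a log link (one of the positivity-assuring links mentioned). Variable names, starting values, and the optimizer choice below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def censored_nll(params, ens_mean, ens_logsd, y):
    """Negative log-likelihood of a Gaussian model left-censored at zero.

    The location is linear in the ensemble mean; the log of the scale is
    linear in the log ensemble spread, a link that keeps sigma positive.
    """
    a0, a1, b0, b1 = params
    mu = a0 + a1 * ens_mean
    sigma = np.exp(b0 + b1 * ens_logsd)
    ll = np.where(
        y > 0,
        norm.logpdf(y, mu, sigma),    # wet cases: density of observed amount
        norm.logcdf(0.0, mu, sigma),  # dry cases: probability mass below zero
    )
    return -ll.sum()

def fit_censored_regression(ens_mean, ens_sd, y):
    """Fit the four coefficients by maximum likelihood."""
    res = minimize(censored_nll, x0=[0.0, 1.0, 0.0, 0.0],
                   args=(ens_mean, np.log(ens_sd), y),
                   method="Nelder-Mead",
                   options={"maxiter": 4000, "xatol": 1e-6, "fatol": 1e-6})
    return res.x
```

Swapping `norm` for `scipy.stats.logistic`, as the abstract proposes, only changes the two distribution calls.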

  2. Powerful laser pulse absorption in partly homogenized foam plasma

    NASA Astrophysics Data System (ADS)

    Cipriani, M.; Gus'kov, S. Yu.; De Angelis, R.; Andreoli, P.; Consoli, F.; Cristofari, G.; Di Giorgio, G.; Ingenito, F.; Rupasov, A. A.

    2016-03-01

    The internal volume structure of a porous medium of light elements determines unique features of the absorption mechanism of laser radiation; the characteristics of relaxation and transport processes in the produced plasma are affected as well. Porous materials with an average density larger than the critical density have a central role in enhancing the pressure produced during the ablation by the laser pulse; this pressure can exceed the one produced by target direct irradiation. The problem of the absorption of powerful laser radiation in a porous material is examined both analytically and numerically. The behavior of the medium during the process of pore filling in the heated region is described by a model of viscous homogenization. An expression describing the time and space dependence of the absorption coefficient of laser radiation is therefore obtained from the model. A numerical investigation of the absorption of a nanosecond laser pulse is performed within the present model. In the context of numerical calculations, porous media with an average density larger than the critical density of the laser-produced plasma are considered. Preliminary results about the inclusion of the developed absorption model into a hydrodynamic code are presented.

  3. How to become an authentic speaker. Even sincere speeches often come across as contrived. A four-step process will help you create a true emotional connection with your audience.

    PubMed

    Morgan, Nick

    2008-11-01

    Like the best-laid schemes of mice and men, the best-rehearsed speeches go oft astray. No amount of preparation can counter an audience's perception that the speaker is calculating or insincere. Why do so many managers have trouble communicating authenticity to their listeners? Morgan, a communications coach for more than two decades, offers advice for overcoming this difficulty. Recent brain research shows that natural, unstudied gestures--what Morgan calls the "second conversation"--express emotions or impulses a split second before our thought processes have turned them into words. So the timing of practiced gestures will always be subtly off--just enough to be picked up by listeners' unconscious ability to read body language. If you can't practice the unspoken part of your delivery, what can you do? Tap into four basic impulses underlying your speech--to be open to the audience, to connect with it, to be passionate, and to "listen" to how the audience is responding--and then rehearse your presentation with each in mind. You can become more open, for instance, by imagining that you're speaking to your spouse or close friend. To more readily connect, focus on needing to engage your listeners and then to keep their attention, as if you were speaking to a child who isn't heeding your words. To convey your passion, identify the feelings behind your speech and let them come through. To listen, think about what the audience is probably feeling when you step up to the podium and be alert to the nonverbal messages of its members. Internalizing these four impulses as you practice will help you come across as relaxed and authentic--your body language will take care of itself.

  4. Sensitivity of liquid clouds to homogenous freezing parameterizations.

    PubMed

    Herbert, Ross J; Murray, Benjamin J; Dobbie, Steven J; Koop, Thomas

    2015-03-16

    Water droplets in some clouds can supercool to temperatures where homogeneous ice nucleation becomes the dominant freezing mechanism. In many cloud resolving and mesoscale models, it is assumed that homogeneous ice nucleation in water droplets only occurs below some threshold temperature, typically set at -40°C. However, laboratory measurements show that there is a finite rate of nucleation at warmer temperatures. In this study we use a parcel model with detailed microphysics to show that cloud properties can be sensitive to homogeneous ice nucleation as warm as -30°C. Thus, homogeneous ice nucleation may be more important for cloud development, precipitation rates, and key cloud radiative parameters than is often assumed. Furthermore, we show that cloud development is particularly sensitive to the temperature dependence of the nucleation rate. In order to better constrain the parameterization of homogeneous ice nucleation, laboratory measurements are needed at both high (>-35°C) and low (<-38°C) temperatures.
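A toy calculation illustrates why a finite nucleation rate at warmer temperatures matters: integrating even a modest J(T) over droplet volume and exposure time yields a non-zero frozen fraction above the usual -35°C cutoff. The exponential fit below uses placeholder coefficients, not the laboratory parameterization discussed in the paper.

```python
import numpy as np

def nucleation_rate(temp_c, log10_j35=6.0, slope=-3.5):
    """Illustrative homogeneous nucleation rate J(T) in cm^-3 s^-1.

    A simple exponential-in-temperature fit anchored at -35 deg C; both
    coefficients are placeholders, not a published parameterization.
    """
    return 10.0 ** (log10_j35 + slope * (temp_c + 35.0))

def frozen_fraction(temp_c, radius_um=10.0, time_s=60.0):
    """Fraction of droplets frozen after time_s seconds.

    Assumes Poisson freezing statistics: f = 1 - exp(-J * V * t),
    where V is the droplet volume in cm^3.
    """
    v_cm3 = (4.0 / 3.0) * np.pi * (radius_um * 1e-4) ** 3  # um -> cm
    return 1.0 - np.exp(-nucleation_rate(temp_c) * v_cm3 * time_s)
```

With these placeholder numbers a 10 µm droplet already freezes with non-negligible probability within a minute at -35°C, while the fraction falls steeply toward zero at -30°C, which is why the steepness of J(T) dominates the cloud response.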

  5. Fibrations and globalizations of compact homogeneous CR-manifolds

    NASA Astrophysics Data System (ADS)

    Gilligan, B.; Huckleberry, Alan T.

    2009-06-01

    Fibration methods which were previously used for complex homogeneous spaces and CR-homogeneous spaces of special types [1]-[4] are developed in a general framework. These include the \mathfrak{g}-anticanonical fibration in the CR-setting, which reduces certain considerations to the compact projective algebraic case, where a Borel-Remmert type splitting theorem is proved. This leads to a reduction to spaces homogeneous under actions of compact Lie groups. General globalization theorems are proved which enable one to regard a homogeneous CR-manifold as an orbit of a real Lie group in a complex homogeneous space of a complex Lie group. In the special case of CR-codimension at most two, precise classification results are proved and are applied to show that in most cases there exists such a globalization.

  6. 6 Ways to Create Change

    ERIC Educational Resources Information Center

    Schaffhauser, Dian

    2013-01-01

    With so many disruptive forces at work in higher education, colleges and universities are faced with the imperative to change not just technologies and processes, but behaviors and mindsets. In part one of a two-part series, change-management experts share six ways to foster large-scale transformations on campus. "Campus Technology"…

  7. Creating the Home Field Advantage

    ERIC Educational Resources Information Center

    Perna, Mark C.

    2007-01-01

    Far too many career and technical education schools overlook the opportunity to establish and cultivate real long-term emotional attachment and loyalty with prospective students and their parents. This occurs because of a basic misconception at the school that they are not in control of their own marketing and recruitment process. Community…

  8. Method and Apparatus for Creating a Topography at a Surface

    DOEpatents

    Adams, David P.; Sinclair, Michael B.; Mayer, Thomas M.; Vasile, Michael J.; Sweatt, William C.

    2008-11-11

    Methods and apparatus whereby an optical interferometer is utilized to monitor and provide feedback control to an integrated energetic particle column, to create desired topographies, including the depth, shape and/or roughness of features, at a surface of a specimen. Energetic particle columns can direct energetic species including, ions, photons and/or neutral particles to a surface to create features having in-plane dimensions on the order of 1 micron, and a height or depth on the order of 1 nanometer. Energetic processes can include subtractive processes such as sputtering, ablation, focused ion beam milling and, additive processes, such as energetic beam induced chemical vapor deposition. The integration of interferometric methods with processing by energetic species offers the ability to create desired topographies at surfaces, including planar and curved shapes.

  9. Creating CHEA: Building a New National Organization on Accrediting.

    ERIC Educational Resources Information Center

    Bloland, Harland G.

    1999-01-01

    Describes the process by which the higher education community, working through national higher education association presidents, regional accreditation commission directors, and college/university presidents, created the Council for Higher Education Accreditation (CHEA), a new national organization on accrediting. The process illustrates the…

  10. Making It CLICK: Planning, Creating, and Using CPCC Libraries' Logo

    ERIC Educational Resources Information Center

    Moore, Gena

    2005-01-01

    The Central Piedmont Community College Libraries have been successful in creating positive expectations from the CPCC community by connecting an official library logo with quality library service. The creation of the CPCC Libraries' logo CLICK was a process that spanned several months. A history of this process details the meetings and design work…

  11. Creating the Link between Institutional Effectiveness and Assessment.

    ERIC Educational Resources Information Center

    Easterling, Doug; And Others

    The self-examination process used by Sinclair Community College (SCC), in Dayton, Ohio, is designed to improve student learning and the processes that contribute to effective and efficient learning. In 1988, SCC created a college-wide Assessment Steering Committee (ASC) charged with reviewing the status of assessment practices at SCC and making…

  12. Creating Science Picture Books for an Authentic Audience

    ERIC Educational Resources Information Center

    DeFauw, Danielle L.; Saad, Klodia

    2014-01-01

    This article presents an authentic writing opportunity to help ninth-grade students use the writing process in a science classroom to write and illustrate picture books for fourth-grade students to demonstrate and share their understanding of a biology unit on cells. By creating a picture book, students experience the writing process, understand…

  13. Creating environments of care with transgender communities.

    PubMed

    Thornhill, Lee; Klein, Pamela

    2010-01-01

    Partnerships between transgender individuals and community health nurses have been a primary source of monitoring and responding to the impact of the HIV epidemic on transgender communities, specifically transgender women. This article provides two perspectives: first, from a transgender service provider, and second, from a public health nurse, on forming partnerships that brought consumers and providers together to create environments of care in which many transgender persons living with and at high risk of HIV were able to engage with medical providers who believed in their right to self-determination. The process led to an increased understanding of HIV prevention and treatment needs, better individual-level health outcomes, and institutional change, including the creation of a transgender medical clinic serving homeless transgender individuals in greater Boston.

  14. Creating Effective Dialogue Around Climate Change

    NASA Astrophysics Data System (ADS)

    Kiehl, J. T.

    2015-12-01

    Communicating climate change to people from diverse sectors of society has proven to be difficult in the United States. It is widely recognized that difficulties arise from a number of sources, including: basic science understanding, the psychologically affect-laden content surrounding climate change, and the diversity of value systems that exist in our society. I explore ways of working with the affect that arises around climate change and describe specific methods to work with the resistance often encountered when communicating this important issue. The techniques I describe are rooted in psychology and group process and provide means for creating more effective narratives to break through the barriers to communicating climate change science. Examples are given from personal experiences in presenting climate change to diverse groups.

  15. Creating semantics in tool use.

    PubMed

    Badets, Arnaud; Michelet, Thomas; de Rugy, Aymar; Osiurak, François

    2017-02-21

    This article presents the first evidence for a functional link between tool use and the processing of abstract symbols like Arabic numbers. Participants were required to perform a tool-use task after the processing of an Arabic number. These numbers represented either a small (2 or 3) or a large magnitude (8 or 9). The tool-use task consisted in using inverse pliers for gripping either a small or a large object. The inverse pliers make it possible to dissociate the hand action from the tool action in relation to the object (i.e., closing the hand leads to an opening of the tool and vice versa). The number/tool hypothesis predicts that the quantity representation associated with Arabic numbers will interact with the action of the tool toward the object. Conversely, the number/hand hypothesis predicts that the quantity associated with numbers will interact with the action of the hand toward the tool. Results confirmed the first hypothesis and rejected the second. Indeed, large numbers interacted with the action of the tool, such that participants took longer to perform an "opening-hand/closing-tool" action after the processing of large numbers. Moreover, no effect was detected for small numbers, confirming previous studies which used only finger movements. Altogether, our finding suggests that the well-known finger/number interaction can be reversed with tool use.

  16. Creating catastrophes in the classroom

    NASA Astrophysics Data System (ADS)

    Andersson, Thommy

    2013-04-01

    Buildings, infrastructure and human life are being destroyed by wind and landslides. To interest and motivate pupils and to help them understand abstract knowledge, a practical experiment could be useful. These experiments will show why strong winds circulate around tropical cyclones and how fluvial geological processes affect nature and communities. The experiments are easy to set up and the equipment is not expensive. Experiment 1: Exogenic processes of water are often slow processes. This experiment will simulate water processes that can take thousands of years, in less than 40 minutes. This experiment can be presented for and understood by pupils at all levels. Letting the pupils build up the scenery will make them more curious about the course of events. During that time they will see the geomorphological genesis of landforms such as landslides, sandurs, deltas, canyons, sedimentation, and selective erosion. Placing small houses, bridges, etc., in the scene can lead to discussions about natural catastrophes and community planning. Material needed for the experiment is a water bucket, an erosion gutter, clay (simulating rock), sand and smaller pebbles (simulating the soil), houses of "Monopoly" size, and tubes. By using a table with wheels it is easy to reuse the result for other lessons. Installation of a pump can turn the experiment into a closed-loop system. This installation can be used for presentations outside the classroom. Experiment 2: The Coriolis effect explains why the wind (moving objects) deflects when moving. In the northern hemisphere the deflection is clockwise and anti-clockwise in the southern hemisphere. This abstract effect is often hard for upper secondary pupils to understand. This experiment will show the effect and thus make the theory real and visible. Material needed for this experiment is a bucket, pipes, and a string. At my school we had cooperation with pupils from the Industrial Technology programme who made a copper pipe construction. During the

  17. Scientists Create Mosquitoes Resistant to Dengue Virus

    MedlinePlus

    Scientists say they have created mosquitoes resistant to the dengue virus; the hope is to eventually make the bugs help control the spread …

  18. The effect of high pressure homogenization on the activity of a commercial β-galactosidase.

    PubMed

    Tribst, Alline A L; Augusto, Pedro E D; Cristianini, Marcelo

    2012-11-01

    High pressure homogenization (HPH) has been proposed as a promising method for changing the activity and stability of enzymes. Therefore, this research studied the activity of β-galactosidase before and after HPH. The enzyme solution at pH values of 6.4, 7.0, and 8.0 was processed at pressures of up to 150 MPa, and the effects of HPH were determined from the residual enzyme activity measured at 5, 30, and 45 °C immediately after homogenization and after 1 day of refrigerated storage. The results indicated that at neutral pH the enzyme remained active at 30 °C (optimum temperature) even after homogenization at pressures of up to 150 MPa. On the contrary, when the β-galactosidase was homogenized at pH 6.4 and 8.0, a gradual loss of activity was observed, reaching a minimum activity (around 30 %) after HPH at 150 MPa and pH 8.0. After storage, only β-galactosidase that underwent HPH at pH 7.0 retained similar activity to the native sample. Thus, HPH did not affect the activity and stability of β-galactosidase only when the process was carried out at neutral pH; for the other conditions, HPH resulted in partial inactivation of the enzyme. Considering the use of β-galactosidase to produce low lactose milk, it was concluded that HPH can be applied with no deleterious effects on enzyme activity.

  19. At tank Low Activity Feed Homogeneity Analysis Verification

    SciTech Connect

    DOUGLAS, J.G.

    2000-09-28

    This report evaluates the merit of selecting sodium, aluminum, and cesium-137 as analytes to indicate homogeneity of soluble species in low-activity waste (LAW) feed and recommends possible analytes and physical properties that could serve as rapid screening indicators for LAW feed homogeneity. The three analytes are adequate as screening indicators of soluble species homogeneity for tank waste when a mixing pump is used to thoroughly mix the waste in the waste feed staging tank and when all dissolved species are present at concentrations well below their solubility limits. If either of these conditions is violated, then the three indicators may not be sufficiently chemically representative of other waste constituents to reliably indicate homogeneity in the feed supernatant. Additional homogeneity indicators that should be considered are anions such as fluoride, sulfate, and phosphate, total organic carbon/total inorganic carbon, and total alpha to estimate the transuranic species. Physical property measurements such as gamma profiling, conductivity, specific gravity, and total suspended solids are recommended as possible at-tank methods for indicating homogeneity. Indicators of LAW feed homogeneity are needed to reduce the U.S. Department of Energy, Office of River Protection (ORP) Program's contractual risk by assuring that the waste feed is within the contractual composition and can be supplied to the waste treatment plant within the schedule requirements.
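A rapid screening indicator of the kind recommended here reduces, in the simplest case, to checking the relative standard deviation of each candidate analyte across sampling locations in the staging tank. The 5% acceptance limit and the concentrations below are made-up illustrations; actual acceptance criteria would come from the contractual requirements.

```python
def is_homogeneous(concentrations, rsd_limit_pct=5.0):
    """Screen one analyte for homogeneity of soluble species.

    The percent relative standard deviation (RSD) of concentrations
    sampled at different tank locations must fall below the limit.
    The 5% default is illustrative only, not a contractual value.
    """
    n = len(concentrations)
    mean = sum(concentrations) / n
    var = sum((c - mean) ** 2 for c in concentrations) / (n - 1)
    rsd_pct = 100.0 * var ** 0.5 / mean
    return rsd_pct <= rsd_limit_pct

# Hypothetical concentrations (ug/mL) from three tank locations:
samples = {"sodium": [101.0, 99.5, 100.3], "aluminum": [12.1, 15.8, 9.9]}
verdict = {analyte: is_homogeneous(c) for analyte, c in samples.items()}
```

A waste feed would be flagged for further analysis whenever any screening analyte fails the check, since the three indicators are only representative when the tank is well mixed.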

  20. Creating a New System for Principal Preparation: Reflections on Efforts to Transcend Tradition and Create New Cultures

    ERIC Educational Resources Information Center

    Reed, Cynthia J.; Kensler, Lisa A. W.

    2010-01-01

    When selected as a pilot redesign site, we decided to both refocus the underlying assumptions guiding our program and to engage in processes allowing us to model best practices while creating a new program. This article summarizes key aspects of our redesign work and offers reflections on the processes used and challenges faced. Murphy's (2006)…

  1. Influence of Interspecific Competition and Landscape Structure on Spatial Homogenization of Avian Assemblages

    PubMed Central

    Robertson, Oliver J.; McAlpine, Clive; House, Alan; Maron, Martine

    2013-01-01

    Human-induced biotic homogenization, resulting from landscape change and increased competition from widespread generalists or ‘winners’, is widely recognized as a global threat to biodiversity. However, it remains unclear what aspects of landscape structure influence homogenization. This paper tests the importance of interspecific competition and landscape structure for the spatial homogeneity of avian assemblages within a fragmented agricultural landscape of eastern Australia. We used field observations of the density of 128 diurnal bird species to calculate taxonomic and functional similarity among assemblages. We then examined whether taxonomic and functional similarity varied with patch type, the extent of woodland habitat, land-use intensity, habitat subdivision, and the presence of Manorina colonies (a competitive genus of honeyeaters). We found that the presence of a Manorina colony was the most significant factor positively influencing both taxonomic and functional similarity of bird assemblages. Competition from members of this widespread genus of native honeyeater, rather than landscape structure, was the main cause of both taxonomic and functional homogenization. These species have not recently expanded their range, but rather have increased in density in response to agricultural landscape change. The negative impacts of Manorina honeyeaters on assemblage similarity were most pronounced in landscapes of moderate land-use intensity. We conclude that in these human-modified landscapes, increased competition from dominant native species, or ‘winners’, can result in homogeneous avian assemblages and the loss of specialist species. These interacting processes make biotic homogenization resulting from land-use change a global threat to biodiversity in modified agro-ecosystems. PMID:23724136
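The change-in-similarity measure underlying such homogenization studies can be illustrated with presence/absence data: compute mean pairwise similarity among sites before and after a change, and a positive difference indicates homogenization. The Jaccard index and the species lists below are illustrative choices, not the study's own metric or data.

```python
def jaccard(a, b):
    """Jaccard similarity between two species sets (presence/absence)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def mean_pairwise_similarity(assemblages):
    """Mean Jaccard similarity over all distinct site pairs."""
    sims = [jaccard(assemblages[i], assemblages[j])
            for i in range(len(assemblages))
            for j in range(i + 1, len(assemblages))]
    return sum(sims) / len(sims)

# Hypothetical assemblages at three sites: a widespread generalist
# ("miner") invades every site while specialists drop out.
before = [{"robin", "wren"}, {"wren", "pardalote"}, {"robin", "thornbill"}]
after = [{"miner", "robin"}, {"miner", "robin"}, {"miner", "thornbill"}]
delta = mean_pairwise_similarity(after) - mean_pairwise_similarity(before)
# delta > 0: the sites became more alike, i.e. biotic homogenization
```

Functional similarity works the same way once species are replaced by trait categories, which is why a single dominant competitor can drive both measures at once.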

  2. Standardization of sample homogenization for mosquito identification using an innovative proteomic tool based on protein profiling.

    PubMed

    Nebbak, Amira; Willcox, Alexandra C; Bitam, Idir; Raoult, Didier; Parola, Philippe; Almeras, Lionel

    2016-12-01

    The rapid spread of vector-borne diseases demands the development of an innovative strategy for arthropod monitoring. The emergence of MALDI-TOF MS as a rapid, low-cost, and accurate tool for arthropod identification is revolutionizing medical entomology. However, as MS spectra from an arthropod can vary according to the body part selected, the sample homogenization method used and the mode and duration of sample storage, standardization of protocols is indispensable prior to the creation and sharing of an MS reference spectra database. In the present study, manual grinding of Anopheles gambiae Giles and Aedes albopictus mosquitoes at the adult and larval (L3) developmental stages was compared to automated homogenization. Settings for each homogenizer were optimized, and glass powder was found to be the best sample disruptor based on its ability to create reproducible and intense MS spectra. In addition, the suitability of common arthropod storage conditions for further MALDI-TOF MS analysis was kinetically evaluated. The conditions that best preserved samples for accurate species identification by MALDI-TOF MS were freezing at -20°C or in liquid nitrogen for up to 6 months. The optimized conditions were objectified based on the reproducibility and stability of species-specific MS profiles. The automation and standardization of mosquito sample preparation methods for MALDI-TOF MS analyses will popularize the use of this innovative tool for the rapid identification of arthropods with medical interest.

  3. Self-dispersible nanocrystals of albendazole produced by high pressure homogenization and spray-drying.

    PubMed

    Paredes, Alejandro Javier; Llabot, Juan Manuel; Sánchez Bruni, Sergio; Allemandi, Daniel; Palma, Santiago Daniel

    2016-10-01

    Albendazole (ABZ) is a broad-spectrum antiparasitic drug used in the treatment of human or animal infections. Although ABZ has shown a high efficacy for repeated doses in monogastric mammals, its low aqueous solubility leads to erratic bioavailability. The aim of this work was to optimize a procedure in order to obtain ABZ self-dispersible nanocrystals (SDNC) by combining high pressure homogenization (HPH) and spray-drying (SD). The material thus obtained was characterized and the variables affecting both the HPH and SD processes were studied. As expected, the homogenizing pressure and number of cycles influenced the final particle size, while the stabilizer concentration had a strong impact on SD output and redispersion of powders upon contact with water. ABZ SDNC were successfully obtained with high process yield and redispersibility. The characteristic peaks of ABZ were clearly identified in the X-ray patterns of the processed samples. A noticeable increase in the dissolution rate was observed in the aqueous environment.

  4. Optimizing SLN and NLC by 2² full factorial design: effect of homogenization technique.

    PubMed

    Severino, Patrícia; Santana, Maria Helena A; Souto, Eliana B

    2012-08-01

    Solid lipid nanoparticles (SLN) and nanostructured lipid carriers (NLC) have been employed in pharmaceutical and biomedical formulations. The present study focuses on the optimization of the production process of SLN and NLC by High Shear Homogenization (HSH) and High Pressure Homogenization (HPH). To build up the surface response charts, a 2² full factorial design based on two independent variables was used to obtain an optimized formulation. The effects of the production process on the mean particle size, polydispersity index (PI) and zeta potential (ZP) were investigated. Optimized SLN were produced applying 20,000 rpm HSH and 500 bar HPH pressure, while optimized NLC required 15,000 rpm HSH and 700 bar HPH pressure. This factorial design study has proven to be a useful tool in optimizing SLN (~100 nm) and NLC (~300 nm) formulations. The present results highlight the benefit of applying statistical designs in the preparation of lipid nanoparticles.
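    The 2² full factorial analysis described above can be sketched as follows; the coded factor levels and particle-size responses below are hypothetical, purely to illustrate how main effects and the interaction are computed from the four runs:

```python
# Hypothetical 2^2 full factorial analysis in the spirit of the study:
# factors are HSH speed and HPH pressure in coded units (-1 = low, +1 = high);
# the particle-size responses (nm) are invented for illustration.
runs = [
    (-1, -1, 310.0),
    (+1, -1, 240.0),
    (-1, +1, 220.0),
    (+1, +1, 120.0),
]

def effect(data, column):
    """Mean response at the high level minus mean response at the low level."""
    hi = [r[2] for r in data if r[column] == +1]
    lo = [r[2] for r in data if r[column] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

hsh_effect = effect(runs, 0)  # main effect of homogenization speed
hph_effect = effect(runs, 1)  # main effect of homogenization pressure

# The interaction effect uses the product of the two coded columns.
interaction = effect([(a * b, None, y) for a, b, y in runs], 0)

print(hsh_effect, hph_effect, interaction)  # -85.0 -105.0 -15.0
```

    With these invented numbers, both factors reduce particle size and the interaction is small; a real study would follow such estimates with the surface-response fit mentioned in the abstract.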

  5. Creating and Testing Simulation Software

    NASA Technical Reports Server (NTRS)

    Heinich, Christina M.

    2013-01-01

    The goal of this project is to learn about the software development process, specifically the process of testing and fixing components of the software. The paper will cover techniques for testing code and the benefits of using one style of testing over another. It will also discuss the overall software design and development lifecycle, and how code testing plays an integral role in it. Code is notorious for needing to be debugged due to coding errors or faulty program design. Writing tests either before or during program creation that cover all aspects of the code provides a relatively easy way to locate and fix errors, which will in turn decrease the need to fix a program after it is released for common use. The backdrop for this paper is the Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI), a project whose goal is to simulate a launch using simulated models of the ground systems and the connections between them and the control room. The simulations will be used for training and to ensure that all possible outcomes and complications are prepared for before the actual launch day. The code being tested is the Programmable Logic Controller Interface (PLCIF) code, the component responsible for transferring information from the models to the model Programmable Logic Controllers (PLCs), basic computers used for very simple tasks.
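    The test-first idea described above can be illustrated with a minimal sketch; the function and its expected values are hypothetical and unrelated to the actual SCCS/PLCIF code:

```python
# A minimal test-first sketch: the tests encode the expected behavior
# alongside (or before) the code they exercise. The function and numbers
# are hypothetical, not taken from the SCCS project.
def scale_sensor_reading(raw, gain=2.0, offset=-1.0):
    """Convert a raw sensor count into an engineering value."""
    return raw * gain + offset

def test_zero_raw_value():
    assert scale_sensor_reading(0) == -1.0

def test_typical_reading():
    assert scale_sensor_reading(10) == 19.0

# Tiny hand-rolled runner; a real project would use unittest or pytest.
for test in (test_zero_raw_value, test_typical_reading):
    test()
    print(test.__name__, "passed")
```

    If a later change breaks the conversion, the failing assertion points directly at the error, which is the payoff the abstract describes.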

  6. Heterogenization of Homogeneous Catalysts: the Effect of the Support

    SciTech Connect

    Earl, W.L.; Ott, K.C.; Hall, K.A.; de Rege, F.M.; Morita, D.K.; Tumas, W.; Brown, G.H.; Broene, R.D.

    1999-06-29

    We have studied the influence of placing a soluble, homogeneous catalyst onto a solid support. We determined that such a 'heterogenized' homogeneous catalyst can have improved activity and selectivity for the asymmetric hydrogenation of enamides to amino acid derivatives. Heterogenization of RhDuPhos(COD)⁺ cations occurs via electrostatic interactions with anions that are capable of strong hydrogen bonding to silica surfaces. This is a novel approach to supported catalysis. Supported RhDuPhos(COD)⁺ is a recyclable, non-leaching catalyst in non-polar media. This is one of the few heterogenized catalysts that exhibit improved catalytic performance relative to the homogeneous analog.

  7. A Locally-Exact Homogenization Approach for Periodic Heterogeneous Materials

    SciTech Connect

    Drago, Anthony S.; Pindera, Marek-Jerzy

    2008-02-15

    Elements of the homogenization theory are utilized to develop a new micromechanics approach for unit cells of periodic heterogeneous materials based on locally-exact elasticity solutions. Closed-form expressions for the homogenized moduli of unidirectionally-reinforced heterogeneous materials are obtained in terms of Hill's strain concentration matrices valid under arbitrary combined loading, which yield the homogenized Hooke's law. Results for simple unit cells with off-set fibers, which require the use of periodic boundary conditions, are compared with corresponding finite-element results demonstrating excellent correlation.
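    The homogenized Hooke's law mentioned above takes, in standard micromechanics notation (a generic sketch of the relation, not reproduced from the paper), the following form:

```latex
% Homogenized Hooke's law via Hill's strain concentration matrices A_q:
% phase q occupies volume fraction v_q and has stiffness C_q, and the
% concentration matrices average to the identity.
\bar{\boldsymbol{\sigma}} = \mathbf{C}^{*}\,\bar{\boldsymbol{\varepsilon}},
\qquad
\mathbf{C}^{*} = \sum_{q} v_q\, \mathbf{C}_q \mathbf{A}_q,
\qquad
\sum_{q} v_q\, \mathbf{A}_q = \mathbf{I}.
```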

  8. Hydrogen storage materials and method of making by dry homogenization

    DOEpatents

    Jensen, Craig M.; Zidan, Ragaiy A.

    2002-01-01

    Dry homogenized metal hydrides, in particular aluminum hydride compounds, are provided as a material for reversible hydrogen storage. The reversible hydrogen storage material comprises a dry homogenized material having transition metal catalytic sites on a metal aluminum hydride compound, or mixtures of metal aluminum hydride compounds. A method of making such reversible hydrogen storage materials by dry doping is also provided and comprises the steps of dry homogenizing metal hydrides by mechanical mixing, such as by crushing or ball milling a powder of a metal aluminum hydride with a transition metal catalyst. In another aspect of the invention, a method of powering a vehicle apparatus with the reversible hydrogen storage material is provided.

  9. Stochastic homogenization of a front propagation problem with unbounded velocity

    NASA Astrophysics Data System (ADS)

    Hajej, A.

    2017-04-01

    We study the homogenization of Hamilton-Jacobi equations which arise in front propagation problems in stationary ergodic media. Our results are obtained for fronts moving with possibly unbounded velocity. We show, by an example, that the homogenized Hamiltonian, which always exists, may be unbounded. In this context, we show convergence results if we start with a compact initial front. On the other hand, if the medium satisfies a finite-range-of-dependence condition, we prove that the effective Hamiltonian is bounded and obtain classical homogenization in this context.
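    In the standard formulation of this setting (a generic sketch, not the paper's exact statement), the oscillatory Hamilton-Jacobi equation converges to an effective equation with homogenized Hamiltonian:

```latex
% Front propagation as a Hamilton-Jacobi equation with medium oscillating
% at scale epsilon; as epsilon -> 0 the solutions converge to those of an
% effective (homogenized) equation with Hamiltonian \overline{H}.
u^{\varepsilon}_t + H\!\left(\tfrac{x}{\varepsilon}, Du^{\varepsilon}\right) = 0
\quad \xrightarrow[\varepsilon \to 0]{} \quad
\bar{u}_t + \overline{H}\!\left(D\bar{u}\right) = 0.
```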

  10. No hypocalcemic action of Stannius corpuscle homogenates in rats.

    PubMed

    Ukawa, K; Sasayama, Y

    1993-04-01

    1. Serum Ca level of goldfish administered with homogenate of the corpuscles of Stannius (CS) taken from 1/3 seawater-acclimated goldfish was significantly lower than that of the control goldfish up to 2 hr after administration. 2. Serum Ca, Mg, Pi, Na and K levels of rats administered with CS homogenates of freshwater eels, 1/3 seawater-acclimated goldfish, or seawater-inhabited wrasse were not statistically different from those of control rats during the 3 hr investigation. 3. It was concluded that in rats, CS homogenates did not decrease the serum mineral levels under the present conditions.

  11. Homogeneous Iron Phosphate Nanoparticles by Combustion of Sprays

    PubMed Central

    Rudin, Thomas; Pratsinis, Sotiris E.

    2013-01-01

    Low-cost synthesis of iron phosphate nanostructured particles is attractive for large scale fortification of basic foods (rice, bread, etc.) as well as for Li-battery materials. This is achieved here by flame-assisted and flame spray pyrolysis (FASP and FSP) of inexpensive precursors (iron nitrate, phosphate), solvents (ethanol), and support gases (acetylene and methane). The iron phosphate powders produced here were mostly amorphous and exhibited excellent solubility in dilute acid, an indicator of relative iron bioavailability. The amorphous and crystalline fractions of such powders were determined by X-ray diffraction (XRD) and their cumulative size distribution by X-ray disk centrifuge. Fine and coarse size fractions were obtained also by sedimentation and characterized by microscopy and XRD. The coarse size fraction contained maghemite Fe2O3 while the fine fraction was amorphous iron phosphate. Furthermore, the effect of increased production rate (up to 11 g/h) on product morphology and solubility was explored. Using increased methane flow rates through the ignition/pilot flame of the FSP burner and inexpensive powder precursors also resulted in homogeneous iron phosphate nanoparticles, essentially converting the FSP into a FASP process. The powders produced by FSP at increased methane flow also had excellent solubility in dilute acid. Such use of methane or even natural gas might be economically attractive for large-scale flame synthesis of nanoparticles. PMID:23407874

  12. Homogeneous Iron Phosphate Nanoparticles by Combustion of Sprays.

    PubMed

    Rudin, Thomas; Pratsinis, Sotiris E

    2012-06-13

    Low-cost synthesis of iron phosphate nanostructured particles is attractive for large scale fortification of basic foods (rice, bread, etc.) as well as for Li-battery materials. This is achieved here by flame-assisted and flame spray pyrolysis (FASP and FSP) of inexpensive precursors (iron nitrate, phosphate), solvents (ethanol), and support gases (acetylene and methane). The iron phosphate powders produced here were mostly amorphous and exhibited excellent solubility in dilute acid, an indicator of relative iron bioavailability. The amorphous and crystalline fractions of such powders were determined by X-ray diffraction (XRD) and their cumulative size distribution by X-ray disk centrifuge. Fine and coarse size fractions were obtained also by sedimentation and characterized by microscopy and XRD. The coarse size fraction contained maghemite Fe2O3 while the fine fraction was amorphous iron phosphate. Furthermore, the effect of increased production rate (up to 11 g/h) on product morphology and solubility was explored. Using increased methane flow rates through the ignition/pilot flame of the FSP burner and inexpensive powder precursors also resulted in homogeneous iron phosphate nanoparticles, essentially converting the FSP into a FASP process. The powders produced by FSP at increased methane flow also had excellent solubility in dilute acid. Such use of methane or even natural gas might be economically attractive for large-scale flame synthesis of nanoparticles.

  13. A homogeneous biochemiluminescent assay for detection of influenza

    NASA Astrophysics Data System (ADS)

    Hui, Kwok Min; Li, Xiao Jing; Pan, Lu; Li, X. J.

    2015-05-01

    Current methods for rapid detection of influenza are based on detection of the nucleic acids or antigens of influenza viruses. Since influenza viruses constantly mutate, leading to the appearance of new strains or variants, these detection methods are susceptible to genetic changes in influenza viruses. Type A and B influenza viruses contain neuraminidase, an essential enzyme for virus replication that enables progeny influenza viruses to leave the host cells and infect new cells. Here we describe an assay method, the homogeneous biochemiluminescent assay (HBA), for rapid detection of influenza by detecting viral neuraminidase activity. The assay mimics the light production process of a firefly: a viral neuraminidase-specific substrate containing a luciferin moiety is cleaved in the presence of influenza virus to release luciferin, which becomes a substrate for firefly luciferase in a light production system. All reagents can be formulated in a single reaction mix so that the assay involves only one manual step, i.e., sample addition. Presence of Type A or B influenza virus in the sample leads to production of a strong, stable, and easily detectable light signal, which lasts for hours. Thus, this influenza virus assay is suitable for use in point-of-care settings.

  14. Reynolds and Atwood Numbers Effects on Homogeneous Rayleigh-Taylor Instability

    NASA Astrophysics Data System (ADS)

    Aslangil, Denis; Livescu, Daniel; Banerjee, Arindam

    2015-11-01

    The effects of Reynolds and Atwood numbers on turbulent mixing of a heterogeneous mixture of two incompressible, miscible fluids with different densities are investigated using high-resolution Direct Numerical Simulations (DNS). The flow occurs in a triply periodic 3D domain, with the two fluids initially segregated in random patches, and turbulence is generated in response to buoyancy. In turn, stirring produced by turbulence breaks down the scalar structures, accelerating the molecular mixing. Statistically homogeneous variable-density (VD) mixing, with density variations due to compositional changes, is a basic mixing problem and aims to mimic the core of the mixing layer of the acceleration-driven Rayleigh-Taylor instability (RTI). We present results covering a large range of kinematic viscosity values for density contrasts including small (A = 0.04), moderate (A = 0.5), and high (A = 0.75 and 0.9) Atwood numbers. Particular attention is given to the structure of the turbulence and the mixing process, including the alignment between various turbulence and scalar quantities, as well as to providing high-fidelity data for verification and validation of mix models. Arindam Banerjee acknowledges support from NSF CAREER award # 1453056.
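    The Atwood number quoted above is the standard nondimensional density contrast for Rayleigh-Taylor flows; a minimal sketch of the definition and of the heavy-to-light density ratio it implies (the values mirror those in the abstract):

```python
# Atwood number: the standard nondimensional density contrast,
# A = (rho_heavy - rho_light) / (rho_heavy + rho_light).
def atwood(rho_heavy, rho_light):
    return (rho_heavy - rho_light) / (rho_heavy + rho_light)

def density_ratio(a):
    """Heavy-to-light density ratio implied by Atwood number a."""
    return (1.0 + a) / (1.0 - a)

print(atwood(3.0, 1.0))              # 0.5, the moderate case in the abstract
print(round(density_ratio(0.9), 6))  # 19.0: at A = 0.9 the heavy fluid is 19x denser
```

    The ratio makes clear why the high-Atwood cases (A = 0.75 and 0.9) probe a much stronger density contrast than the small-Atwood case (A = 0.04, a ratio of about 1.08).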

  15. Modeling of homogeneous charge compression ignition (HCCI) of methane

    SciTech Connect

    Smith, J.R.; Aceves, S.M.; Westbrook, C.; Pitz, W.

    1997-05-01

    The operation of piston engines on a compression ignition cycle using a lean, homogeneous charge has many potentially attractive features. These include the potential for extremely low NOx and particulate emissions while maintaining high thermal efficiency and not requiring the expensive high-pressure injection system of the typical modern diesel engine. Using the HCT chemical kinetics code to simulate autoignition of methane-air mixtures, we have explored the ignition timing, burn duration, NOx production, indicated efficiency and power output of an engine with a compression ratio of 15:1 at 1200 and 2400 rpm. HCT was modified to include the effects of heat transfer. This study used a single control volume reaction zone that varies as a function of crank angle. The ignition process is controlled by varying the intake equivalence ratio and the residual gas trapping (RGT). RGT is internal exhaust gas recirculation which recycles both heat and combustion product species; it is accomplished by varying the timing of the exhaust valve closure. Inlet manifold temperature was held constant at 330 K. Results show that there is a narrow range of operational conditions that shows promise of achieving the control necessary to vary power output while keeping indicated efficiency above 50% and NOx levels below 100 ppm.
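    A back-of-envelope check on the quoted conditions: a simple isentropic-compression estimate (an illustrative sketch with an assumed ratio of specific heats, not the HCT kinetics model itself) gives the compressed-gas temperature for a 15:1 compression ratio and a 330 K intake:

```python
# Illustrative isentropic estimate of the compressed-gas temperature for the
# quoted conditions (15:1 compression ratio, 330 K intake manifold).
# gamma = 1.35 is an assumed value for a lean methane-air charge; this is a
# rough sketch, not the HCT chemical-kinetics simulation.
def isentropic_compression_temp(t_intake_k, compression_ratio, gamma=1.35):
    """T_compressed = T_intake * CR**(gamma - 1) for an ideal adiabatic stroke."""
    return t_intake_k * compression_ratio ** (gamma - 1.0)

t_c = isentropic_compression_temp(330.0, 15.0)
print(round(t_c))  # roughly 850 K, where methane autoignition chemistry becomes fast
```

    Estimates like this show why intake temperature and effective compression (including retained hot residual gas) are the levers used to control ignition timing in HCCI operation.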

  16. Successful strategic planning: creating clarity.

    PubMed

    Adams, Jim

    2005-01-01

    Most healthcare organizations have a strategic plan of some kind. Many of these organizations also have difficulty translating their strategic plan into specific actions that result in successful performance. In the worst cases, this can jeopardize the viability of the organization. The trouble lies in a lack of clarity in what a strategic plan is and what it should do for the organization. This article will answer key questions such as: What is strategy and how does it fit with other commonly used constructs such as mission, vision, and goals? What criteria can be used to determine if something is truly strategic to the organization? What are the phases of the strategy lifecycle? How do approaches for dealing with uncertainty, such as scenario planning, fit with organizational strategic planning? How can a meaningful IT strategy be developed if the organization strategy is lacking? What principles should guide a good IT planning process?

  17. Moving beyond gender: processes that create relationship equality.

    PubMed

    Knudson-Martin, Carmen; Mahoney, Anne Rankin

    2005-04-01

    Equality is related to relationship success, yet few couples achieve it. In this qualitative study, we examine how couples with children in two time cohorts (1982 and 2001) moved toward equality. The analysis identifies three types of couples: postgender, gender legacy, and traditional. Movement toward equality is facilitated by (a) stimulus for change, including awareness of gender, commitment to family and work, and situational pressures; and (b) patterns that promote change, including active negotiation, challenges to gender entitlement, development of new competencies, and mutual attention to relationship and family tasks. Implications for practice are discussed.

  18. Process and Product: Creating Stories with Deaf Students

    ERIC Educational Resources Information Center

    Enns, Catherine; Hall, Ricki; Isaac, Becky; MacDonald, Patricia

    2007-01-01

    This article describes the implementation of one element of an adapted language arts curriculum for Deaf students in a bilingual (American Sign Language and English) educational setting. It examines the implementation of writing workshops in three elementary classrooms in a school for Deaf students. The typical steps of preparing/planning,…

  19. Using the Comer Process To Create a Successful Middle School.

    ERIC Educational Resources Information Center

    Malloy, William; Rayle, Joseph

    2000-01-01

    Presents a case study showing how one middle school in decline used the Comer school development program of integrative partnership to restructure education and support services to improve the school's performance. Describes how the school enacted the three-team approach of the Comer model, including governance, student support services, and…

  20. Creating a Conference and Workshop Attendance Process with Teacher Collaboration

    ERIC Educational Resources Information Center

    Hironaka, Janet Hiroko

    2009-01-01

    This study aimed to build a professional learning community collaboratively among high school teachers for lifelong growth to minimize teacher isolation, discomfort, and fear. The study was driven by a compelling desire to extend personal learning in ways to impact the school learning community. The foundation for this study was the body of…