Science.gov

Sample records for process creates homogenous

  1. Creating a Flexible Budget Process

    ERIC Educational Resources Information Center

    Frew, James; Olson, Robert; Pelton, M. Lee

    2009-01-01

    The budget process is often an especially thorny area in communication between administrators and faculty members. Last year, Willamette University took a step toward reducing tensions surrounding the budget. As university administrators planned for the current year, they faced the high degree of uncertainty that the financial crisis has forced on…

  2. Web Pages Created Via SCID Process.

    ERIC Educational Resources Information Center

    Stammen, Ronald M.

    This paper describes the use of a management process, Systematic Curriculum and Instructional Development (SCID), for developing online multimedia modules. The project, "Collaboratively Creating Multimedia Modules for Teachers and Professors," was funded by the USWEST Foundation. The curriculum development process involved teams of experts in…

  3. Pattern and process of biotic homogenization in the New Pangaea.

    PubMed

    Baiser, Benjamin; Olden, Julian D; Record, Sydne; Lockwood, Julie L; McKinney, Michael L

    2012-12-01

    Human activities have reorganized the earth's biota, resulting in spatially disparate locales becoming more or less similar in species composition over time through the processes of biotic homogenization and biotic differentiation, respectively. Despite mounting evidence suggesting that these processes may be widespread in both aquatic and terrestrial systems, past studies have predominantly focused on single taxonomic groups at a single spatial scale. Furthermore, change in pairwise similarity is itself dependent on two distinct processes: spatial turnover in species composition and changes in gradients of species richness. Most past research has failed to disentangle the effects of these two mechanisms on homogenization patterns. Here, we use recent statistical advances and collate a global database of homogenization studies (20 studies, 50 datasets) to provide the first global investigation of the homogenization process across major faunal and floral groups and to elucidate the relative roles of changes in species richness and turnover. We found evidence of homogenization (change in similarity ranging from -0.02 to 0.09) across nearly all taxonomic groups, spatial extents and grain sizes. Partitioning of change in pairwise similarity shows that overall change in community similarity is driven by changes in species richness. Our results show that biotic homogenization is truly a global phenomenon and call into question many of the ecological mechanisms invoked in previous studies to explain patterns of homogenization. PMID:23055062
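
    The partitioning described above can be made concrete. A minimal sketch, assuming the "recent statistical advances" are Baselga-style partitioning of Sørensen dissimilarity into turnover and nestedness (richness-difference) components; the function and the example communities below are illustrative, not the authors' code or data:

      # Baselga-style partitioning of pairwise Sorensen dissimilarity into
      # turnover and nestedness (richness-difference) components.
      def partition_sorensen(site1, site2):
          """site1, site2: sets of species observed at two locales."""
          a = len(site1 & site2)   # shared species
          b = len(site1 - site2)   # unique to site 1
          c = len(site2 - site1)   # unique to site 2
          beta_sor = (b + c) / (2 * a + b + c)    # total dissimilarity
          beta_sim = min(b, c) / (a + min(b, c))  # spatial turnover
          beta_nes = beta_sor - beta_sim          # richness-difference part
          return beta_sor, beta_sim, beta_nes

      # Homogenization = pairwise similarity rises, i.e. beta_sor falls.
      before = partition_sorensen({"sp1", "sp2", "sp3"}, {"sp4", "sp5", "sp6"})
      after = partition_sorensen({"sp1", "sp2", "sp7"}, {"sp1", "sp2", "sp6"})
      print("change in similarity:", before[0] - after[0])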

  4. Spoken Word Processing Creates a Lexical Bottleneck

    ERIC Educational Resources Information Center

    Cleland, Alexandra A.; Tamminen, Jakke; Quinlan, Philip T.; Gaskell, M. Gareth

    2012-01-01

    We report 3 experiments that examined whether presentation of a spoken word creates an attentional bottleneck associated with lexical processing in the absence of a response to that word. A spoken word and a visual stimulus were presented in quick succession, but only the visual stimulus demanded a response. Response times to the visual stimulus…

  5. Creating Only Isotropic Homogeneous Turbulence in Liquid Helium near Absolute Zero

    NASA Astrophysics Data System (ADS)

    Ihas, G. G.; Thompson, K. J.; Labbe, G.; McClintock, P. V. E.

    2012-02-01

    Flow through a grid is a standard method to produce isotropic, homogeneous turbulence for laboratory study. This technique has been used to generate quantum turbulence (QT) above 1 K in superfluid helium [S. R. Stalp, L. Skrbek, and R. J. Donnelly, Phys. Rev. Lett. 82, 4831 (1999)], where QT seems to mimic classical turbulence. Efforts have been made recently [G. G. Ihas, G. Labbe, S-c. Liu, and K. J. Thompson, J. Low Temp. Phys. 150, 384 (2008)] to make similar measurements near absolute zero, where there is an almost total absence of normal fluid and hence classical viscosity. This presents the difficulty that most motive force devices produce heat which overwhelms the phenomena being investigated. The process of designing and implementing a "dissipation-free" motor for pulling a grid through superfluid helium at millikelvin temperatures has resulted in the development of new techniques which have broad application in low temperature research. Some of these, such as Meissner-effect magnetic drives, capacitive and inductive position sensors, and magnetic centering devices, will be described. Heating results for devices which can move in a controlled fashion from very low speed up to 10 cm/s will be presented. Acknowledgement: We thank W.F. Vinen for many useful discussions.

  6. Process to create simulated lunar agglutinate particles

    NASA Technical Reports Server (NTRS)

    Gustafson, Robert J. (Inventor); Gustafson, Marty A. (Inventor); White, Brant C. (Inventor)

    2011-01-01

    A method of creating simulated agglutinate particles by applying a heat source sufficient to partially melt a raw material is provided. The raw material is preferably any lunar soil simulant, crushed mineral, mixture of crushed minerals, or similar material, and the heat source creates localized heating of the raw material.

  7. Creep rupture as a non-homogeneous Poissonian process

    PubMed Central

    Danku, Zsuzsa; Kun, Ferenc

    2013-01-01

    Creep rupture of heterogeneous materials occurring under constant sub-critical external loads is responsible for the collapse of engineering constructions and for natural catastrophes. Acoustic monitoring of crackling bursts provides microscopic insight into the failure process. Based on a fiber bundle model, we show that the accelerating bursting activity when approaching failure can be described by the Omori law. For long range load redistribution the time series of bursts proved to be a non-homogeneous Poissonian process with power law distributed burst sizes and waiting times. We demonstrate that limitations of experiments such as finite detection threshold and time resolution have striking effects on the characteristic exponents, which have to be taken into account when comparing model calculations with experiments. Recording events solely within the Omori time to failure the size distribution of bursts has a crossover to a lower exponent which is promising for forecasting the imminent catastrophic failure. PMID:24045539
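
    The time series described here, a non-homogeneous Poisson process whose rate accelerates toward failure, can be simulated with standard Lewis-Shedler thinning. A minimal sketch; the inverse-Omori rate form and all parameter values are illustrative assumptions, not the paper's fitted model:

      import random

      # Thinning simulation of a non-homogeneous Poisson process whose rate
      # rises toward the failure time t_f (inverse Omori law, assumed form).
      def omori_rate(t, K=50.0, c=0.5, t_f=100.0):
          return K / (c + (t_f - t))      # diverges as t -> t_f

      def simulate_bursts(t_end=99.0):
          lam_max = omori_rate(t_end)     # rate is monotone increasing here
          t, events = 0.0, []
          while t < t_end:
              t += random.expovariate(lam_max)          # candidate event
              if t < t_end and random.random() < omori_rate(t) / lam_max:
                  events.append(t)        # keep with prob lambda(t)/lam_max
          return events

      bursts = simulate_bursts()
      waits = [b - a for a, b in zip(bursts, bursts[1:])]
      print(len(bursts), "bursts; mean waiting time", sum(waits) / len(waits))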

  8. On homogenization of diffusion processes in microperiodic stratified bodies

    SciTech Connect

    Matysiak, S.J.; Mieszkowski, R.

    1999-05-01

    The bodies with microperiodic layered structures can be made by man (laminated composites) or can be found in nature (warved clays, sandstone-slates, sandstone-shales, thin-layered limestones). The knowledge of diffusion processes in the microperiodic stratified bodies is very important in the chemical engineering, material technology and environmental engineering. The warved clays are applied as natural barriers in the construction of waste dumps. Here, the aim of this contribution is to present the homogenized model of the diffusion processes in microperiodic stratified bodies. The considerations are based on the linear Fick`s theory of diffusion and the procedure of microlocal modeling. The obtained model takes into account certain microlocal structure of the body. As the illustration of the application of presented model, a simple example is given.

  9. Experimenting With Ore: Creating the Taconite Process; flow chart of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Experimenting With Ore: Creating the Taconite Process; flow chart of process - Mines Experiment Station, University of Minnesota, Twin Cities Campus, 56 East River Road, Minneapolis, Hennepin County, MN

  10. Competing Contact Processes on Homogeneous Networks with Tunable Clusterization

    NASA Astrophysics Data System (ADS)

    Rybak, Marcin; Kułakowski, Krzysztof

    2013-03-01

    We investigate two homogeneous networks: the Watts-Strogatz network with mean degree ⟨k⟩ = 4 and the Erdös-Rényi network with ⟨k⟩ = 10. In both kinds of networks, the clustering coefficient C is a tunable control parameter. The network is an arena for two competing contact processes, where nodes can be in two states, S or D. A node in state S becomes D with probability 1 if at least two of its mutually linked neighbors are D. A node in state D becomes S with a given probability p if at least one of its neighbors is S. The competition between the processes is described by a phase diagram, where the critical probability pc depends on the clustering coefficient C. For p > pc the density of state S increases in time and appears to dominate the whole system. Below pc, the majority of nodes are in the D state. The numerical results indicate that for the Watts-Strogatz network the D process is activated at a finite value of the clustering coefficient C, close to 0.3. By contrast, for the Erdös-Rényi network the transition is observed over the whole investigated range of C.
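
    The update rules above translate directly into a synchronous simulation. A minimal sketch using networkx for the Watts-Strogatz substrate; the system size, number of steps, and the value of p are illustrative choices, not the paper's settings:

      import random
      import networkx as nx  # third-party: pip install networkx

      def step(G, state, p):
          """One synchronous update of the two competing contact processes."""
          new = state.copy()
          for v in G:
              nbrs = list(G[v])
              if state[v] == "S":
                  d_nbrs = [u for u in nbrs if state[u] == "D"]
                  # S -> D if two D-neighbors are themselves linked
                  if any(G.has_edge(u, w) for i, u in enumerate(d_nbrs)
                         for w in d_nbrs[i + 1:]):
                      new[v] = "D"
              elif any(state[u] == "S" for u in nbrs) and random.random() < p:
                  new[v] = "S"        # D -> S, needs at least one S neighbor
          return new

      G = nx.watts_strogatz_graph(n=1000, k=4, p=0.1)   # mean degree 4
      state = {v: random.choice("SD") for v in G}
      for _ in range(200):
          state = step(G, state, p=0.5)
      print("fraction S:", sum(s == "S" for s in state.values()) / len(state))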

  11. Creating Documentary Theatre as Educational Process.

    ERIC Educational Resources Information Center

    Hirschfeld-Medalia, Adeline

    With the celebration of the United States bicentennial as impetus, university students and faculty attempted several approaches to the creation of a touring documentary production composed almost completely from primary sources. This paper describes the process involved in producing a traveling show which featured groups relatively excluded from…

  12. Can An Evolutionary Process Create English Text?

    SciTech Connect

    Bailey, David H.

    2008-10-29

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
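
    For contrast, the fixed-target demonstration the abstract criticizes (Dawkins' "weasel" program) fits in a few lines; this sketch makes visible exactly the feature the study avoids, namely a pre-specified target phrase:

      import random
      import string

      # Dawkins-style "weasel": selection toward a FIXED target phrase.
      TARGET = "METHINKS IT IS LIKE A WEASEL"
      ALPHABET = string.ascii_uppercase + " "

      def mutate(parent, rate=0.05):
          return "".join(random.choice(ALPHABET) if random.random() < rate
                         else ch for ch in parent)

      def fitness(s):
          return sum(a == b for a, b in zip(s, TARGET))

      parent = "".join(random.choice(ALPHABET) for _ in TARGET)
      generation = 0
      while parent != TARGET:
          brood = [parent] + [mutate(parent) for _ in range(100)]
          parent = max(brood, key=fitness)   # keep the fittest each round
          generation += 1
      print("reached fixed target in", generation, "generations")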

  13. [Chemiluminescence spectroscopic analysis of homogeneous charge compression ignition combustion processes].

    PubMed

    Liu, Hai-feng; Yao, Ming-fa; Jin, Chao; Zhang, Peng; Li, Zhe-ming; Zheng, Zun-qing

    2010-10-01

    To study the combustion reaction kinetics of homogeneous charge compression ignition (HCCI) under different port injection strategies and intake temperature conditions, tests were carried out on a modified single-cylinder optical engine using chemiluminescence spectroscopic analysis. The experimental conditions were: constant fuel mass, n-heptane fuel, a speed of 600 r x min(-1), an inlet pressure of 0.1 MPa, and inlet temperatures of 95 °C and 125 °C. The chemiluminescence spectra show that the chemiluminescence is quite faint during low temperature heat release (LTHR), and this band spectrum originates from formaldehyde (CH2O) chemiluminescence. During the phase spanning later LTHR, the negative temperature coefficient (NTC) region, and early high temperature heat release (HTHR), the band spectrum likewise originates from formaldehyde (CH2O) chemiluminescence. The CO-O* continuum is strong during HTHR, and radicals such as OH, HCO, CH and CH2O appear superimposed on this continuum. After the HTHR, the chemiluminescence intensity is quite faint. Compared with a start of injection (SOI) of -30 degrees ATDC, the chemiluminescence intensity is higher under the SOI = -300 degrees ATDC condition due to more intense emission of the CO-O* continuum, and more HCO and OH radicals are formed, which indicates a more intense combustion reaction. Similarly, a more intense CO-O* continuum and more HCO and OH radicals are observed at the higher intake temperature. PMID:21137383

  14. A Tool for Creating Healthier Workplaces: The Conducivity Process

    ERIC Educational Resources Information Center

    Karasek, Robert A.

    2004-01-01

    The conducivity process, a methodology for creating healthier workplaces by promoting conducive production, is illustrated through the use of the "conducivity game" developed in the NordNet Project in Sweden, which was an action research project to test a job redesign methodology. The project combined the "conducivity" hypotheses about a…

  15. A study of the role of homogeneous process in heterogeneous high explosives

    SciTech Connect

    Tang, P.K.

    1993-05-01

    In a new hydrodynamic formulation of shock-induced chemical reaction, we can show formally that the presence of certain homogeneous reaction characteristics becomes more evident as shock pressure increases, even in heterogeneous high explosives. The homogeneous reaction pathway includes nonequilibrium excitation and deactivation stages prior to chemical reaction. The excitation process leads to an intermediate state at a higher energy level than the equilibrium state, and as a result, the effective activation energy appears to be lower than the value based on thermal experiments. As the pressure increases further, the homogeneous reaction can even surpass the heterogeneous process and become the dominant mechanism.

  16. Effect of homogenization process on the hardness of Zn-Al-Cu alloys

    NASA Astrophysics Data System (ADS)

    Villegas-Cardenas, Jose D.; Saucedo-Muñoz, Maribel L.; Lopez-Hirata, Victor M.; De Ita-De la Torre, Antonio; Avila-Davila, Erika O.; Gonzalez-Velazquez, Jorge Luis

    2015-10-01

    The effect of a homogenizing treatment on the hardness of as-cast Zn-Al-Cu alloys was investigated. Eight alloy compositions were prepared and homogenized at 350 °C for 180 h, and their Rockwell "B" hardness was subsequently measured. All the specimens were analyzed by X-ray diffraction and metallographically prepared for observation by optical microscopy and scanning electron microscopy. The results of the present work indicated that the hardness in both conditions (as-cast and homogenized) increased with increasing Al and Cu contents; this increased hardness is likely related to the presence of the θ and τ' phases. A regression equation was obtained to determine the hardness of the homogenized alloys as a function of their chemical composition and of processing parameters, such as the homogenization time and temperature used in their preparation.
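
    A regression equation of the kind reported can be reproduced with an ordinary least-squares fit. A minimal sketch; the composition and hardness rows below are made-up placeholders, not the paper's measurements:

      import numpy as np

      # columns: wt% Al, wt% Cu (placeholder values, NOT the paper's data)
      X = np.array([[10, 1], [10, 3], [20, 1], [20, 3],
                    [30, 1], [30, 3], [40, 1], [40, 3]], dtype=float)
      hardness = np.array([55, 60, 62, 68, 70, 77, 79, 88], dtype=float)

      A = np.column_stack([np.ones(len(X)), X])   # intercept + Al + Cu terms
      coef, *_ = np.linalg.lstsq(A, hardness, rcond=None)
      b0, b_al, b_cu = coef
      print(f"HRB = {b0:.1f} + {b_al:.2f}*(wt% Al) + {b_cu:.2f}*(wt% Cu)")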

  17. Design and fabrication of optical homogenizer with micro structure by injection molding process

    NASA Astrophysics Data System (ADS)

    Chen, C.-C. A.; Chang, S.-W.; Weng, C.-J.

    2008-08-01

    This paper presents the design and fabrication of an optical homogenizer with a hybrid design of collimator, toroidal lens array, and projection lens for shaping a Gaussian beam into a uniform cylindrical beam. TracePro software was used to design the geometry of the homogenizer, and injection molding simulation was performed with Moldflow MPI to evaluate the mold design for the injection molding process. The optical homogenizer is a cylindrical part with a thickness of 8.03 mm and a diameter of 5 mm. The toroidal-array microstructure has groove heights designed from 12 μm to 99 μm. An electrical injection molding machine and PMMA (n = 1.4747) were selected to perform the experiment. Experimental results show that the optical homogenizer achieved a transfer ratio of grooves (TRG) of 88.98%, an optical uniformity of 68%, and an optical efficiency of 91.88%. Future study focuses on development of an optical homogenizer for LED light sources.

  18. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry for monitoring the homogenization process. The obtained data were fitted using a novel heuristic model, and it was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for the development and production of pharmaceutical suspensions in the future. PMID:21412782
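
    The abstract does not give the form of the "novel heuristic model"; as a stand-in, the sketch below fits sound velocity versus homogenization time with a generic exponential approach to a plateau and reads off a characteristic time. The data points are made-up placeholders:

      import numpy as np
      from scipy.optimize import curve_fit

      def relaxation(t, v0, dv, tau):
          """Exponential approach to a plateau with time constant tau."""
          return v0 + dv * (1.0 - np.exp(-t / tau))

      t = np.array([0, 2, 5, 10, 20, 40, 60], dtype=float)        # minutes
      v = np.array([1480, 1484, 1489, 1494, 1497, 1499, 1499.5])  # m/s

      (v0, dv, tau), _ = curve_fit(relaxation, t, v, p0=(1480, 20, 10))
      print(f"characteristic homogenization time: {tau:.1f} min")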

  19. Process to Create High-Fidelity Lunar Dust Simulants

    NASA Technical Reports Server (NTRS)

    Gustafson, Robert

    2010-01-01

    A method was developed to create high-fidelity lunar dust simulants that better match the unique properties of lunar dust than the existing simulants. The new dust simulant is designed to more closely approximate the size, morphology, composition, and other important properties of lunar dust (including the presence of nanophase iron). A two-step process is required to create this dust simulant. The first step is to prepare a feedstock material that contains a high percentage of agglutinate-like particles with iron globules (including nanophase iron). The raw material selected must have the proper mineralogical composition. In the second processing step, the feedstock material from the first step is jet-milled to reduce the particle size to a range consistent with lunar dust.

  1. Parallel-Processing Software for Creating Mosaic Images

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Deen, Robert; McCauley, Michael; DeJong, Eric

    2008-01-01

    A computer program implements parallel processing for nearly real-time creation of panoramic mosaics of images of terrain acquired by video cameras on an exploratory robotic vehicle (e.g., a Mars rover). Because the original images are typically acquired at various camera positions and orientations, it is necessary to warp the images into the reference frame of the mosaic before stitching them together to create the mosaic. [Also see "Parallel-Processing Software for Correlating Stereo Images," Software Supplement to NASA Tech Briefs, Vol. 31, No. 9 (September 2007) page 26.] The warping algorithm in this computer program reflects the considerations that (1) for every pixel in the desired final mosaic, a good corresponding point must be found in one or more of the original images and (2) for this purpose, one needs a good mathematical model of the cameras and a good correlation of individual pixels with respect to their positions in three dimensions. The desired mosaic is divided into slices, each of which is assigned to one of a number of central processing units (CPUs) operating simultaneously. The results from the CPUs are gathered and placed into the final mosaic. The time taken to create the mosaic depends upon the number of CPUs, the speed of each CPU, and whether a local or a remote data-staging mechanism is used.
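
    The slice-and-gather structure described above maps naturally onto a process pool. A minimal sketch, with a trivial placeholder standing in for the camera-model-based warping:

      import numpy as np
      from multiprocessing import Pool

      HEIGHT, WIDTH, N_WORKERS = 1024, 4096, 8

      def warp_slice(bounds):
          """Fill one row slice of the mosaic (placeholder for real warping)."""
          r0, r1 = bounds
          return r0, np.full((r1 - r0, WIDTH), r0 % 256, dtype=np.uint8)

      if __name__ == "__main__":
          edges = np.linspace(0, HEIGHT, N_WORKERS + 1, dtype=int)
          slices = list(zip(edges[:-1], edges[1:]))
          mosaic = np.empty((HEIGHT, WIDTH), dtype=np.uint8)
          with Pool(N_WORKERS) as pool:           # one slice per CPU
              for r0, strip in pool.map(warp_slice, slices):
                  mosaic[r0:r0 + strip.shape[0]] = strip   # gather results
          print("assembled mosaic:", mosaic.shape)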

  2. Creating "Intelligent" Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, Noel; Taylor, Patrick

    2014-05-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is used to add value to individual model projections and construct a consensus projection. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, individual models reproduce certain climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequal weighting multi-model ensembles. The intention is to produce improved ("intelligent") unequal-weight ensemble averages. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables—e.g., outgoing longwave radiation and surface temperature. Several climate process metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument in combination with surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing the equal-weighted ensemble average and an ensemble weighted using the process-based metric. Additionally, this study investigates the dependence of the metric weighting scheme on the climate state using a combination of model simulations including a non-forced preindustrial control experiment, historical simulations, and
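
    A metric-weighted ensemble mean of the kind proposed can be sketched in a few lines; the weighting rule (normalized inverse error in a process metric) and all numbers are illustrative assumptions, not the study's data:

      import numpy as np

      obs_metric = 2.0    # "observed" process metric, e.g. OLR-vs-T slope
      model_metric = np.array([1.7, 2.1, 2.6, 1.9, 3.0])  # per-model values
      projections = np.array([2.8, 3.1, 4.0, 2.9, 4.4])   # projected warming, K

      weights = 1.0 / np.abs(model_metric - obs_metric)   # reward good physics
      weights /= weights.sum()

      print(f"equal-weight mean: {projections.mean():.2f} K")
      print(f"process-weighted mean: {np.dot(weights, projections):.2f} K")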

  3. Analysis of daily rainfall processes in Lower Extremadura (Spain) and homogenization of the data

    NASA Astrophysics Data System (ADS)

    Garcia, J. A.; Marroquin, A.; Garrido, J.; Mateos, V. L.

    1995-03-01

    In this paper we analyze, from the point of view of stochastic processes, daily rainfall data recorded at the Badajoz Observatory (southwestern Spain) since the beginning of the century. We attempt to identify any periodicities or trends in the daily rainfall occurrences and their dependence structure, and to select an appropriate point stochastic model for the daily rainfall series. Standard regression analysis, graphical methods and the Cramer statistic show a rise in the number of cases of light rain (between 0.1 and 5 mm/d) and a decline in the number of cases of moderate to heavy rain (> 5 mm/d) in the daily rainfall, at least at the 5% significance level. That the homogenization process was satisfactory is shown by the mean interarrival time of the homogenized series and by a test of the rate of homogenized daily rainfall occurrences. Our analysis also shows that the spectra of the homogenized daily rainfall counts behave completely differently from those of a Poisson process, so the hypothesis of a non-homogeneous Poisson process is rejected.

  4. Occurrence analysis of daily rainfalls by using non-homogeneous Poissonian processes

    NASA Astrophysics Data System (ADS)

    Sirangelo, B.; Ferrari, E.; de Luca, D. L.

    2009-09-01

    In recent years several temporally homogeneous stochastic models have been applied to describe the rainfall process. In particular, stochastic analysis of daily rainfall time series may contribute to explaining the statistical features of the temporal variability of the phenomenon. Due to the evident periodicity of the physical process, these models can be applied only to short temporal intervals in which the occurrences and intensities of rainfall can reliably be considered homogeneous. To this end, occurrences of daily rainfall can be treated as a stationary stochastic process within monthly periods. In this context point process models are widely used for at-site analysis of daily rainfall occurrence; they are continuous-time models, able to capture the intermittent character of rainfall and to simulate interstorm periods. With a different approach, the periodic features of daily rainfall can be described using a temporally non-homogeneous stochastic model whose parameters are expressed as continuous functions of time. In this case, great attention has to be paid to the parsimony of the model, as regards the number of parameters and the bias introduced into the generation of synthetic series, and to the influence of threshold values used in extracting the peak storm database from recorded daily rainfall depths. In this work, a stochastic model based on a non-homogeneous Poisson process, characterized by a time-dependent intensity of rainfall occurrence, is employed to explain seasonal effects of daily rainfalls exceeding prefixed threshold values. In particular, the variation of the rainfall occurrence intensity λ(t) is modelled using Fourier series analysis, in which the non-homogeneous process is transformed into a homogeneous unit-rate one through a proper transformation of the time domain, and the minimum number of harmonics is chosen by applying available statistical tests. The procedure is applied to a dataset of rain gauges located in
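
    In symbols, the construction outlined above is commonly written as follows (a standard form; the paper's exact parameterization is not given in the abstract):

      \lambda(t) = \beta_0 + \sum_{k=1}^{m}\left[a_k \cos\frac{2\pi k t}{T} + b_k \sin\frac{2\pi k t}{T}\right],
      \qquad \Lambda(t) = \int_0^t \lambda(s)\,ds,

    so that the time change \tau_i = \Lambda(t_i) maps the occurrence times t_i of the non-homogeneous process onto a homogeneous unit-rate Poisson process, on which standard statistical tests can be applied.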

  5. Creating an Equity State of Mind: A Learning Process

    ERIC Educational Resources Information Center

    Pickens, Augusta Maria

    2012-01-01

    The Diversity Scorecard Project evaluated in this study was created by the University of Southern California's Center for Urban Education. It was designed to create awareness among institutional members about the state of inequities in educational outcomes for underrepresented students. The Diversity Scorecard Project facilitators aimed…

  6. Creating a Standardized Process to Meet Core Measure Compliance.

    PubMed

    Kwan, Sarah; Daniels, Melodie; Ryan, Lindsey; Fields, Willa

    2015-01-01

    A standardized process to improve compliance with venous thromboembolism prophylaxis and hospital-based inpatient psychiatric services Core Measures was developed, implemented, and evaluated by a clinical nurse specialist team. The use of a 1-page tool with the requirements and supporting evidence, combined with concurrent data and feedback, ensured success of improving compliance. The initial robust process of education and concurrent and retrospective review follow-up allowed for this process to be successful. PMID:26274512

  7. Novel particulate production processes to create unique security materials

    NASA Astrophysics Data System (ADS)

    Hampden-Smith, Mark; Kodas, Toivo; Haubrich, Scott; Oljaca, Miki; Einhorn, Rich; Williams, Darryl

    2006-02-01

    Particles are frequently used to impart security features to high value items. These particles are typically produced by traditional methods, and therefore the security must be derived from the chemical composition of the particles rather than the particle production process. Here, we present new and difficult-to-reproduce particle production processes based on spray pyrolysis that can produce unique particles and features that are dependent on the use of these new-to-the-world processes and process trade secrets. Specifically two examples of functional materials are described, luminescent materials and electrocatalytic materials.

  8. Occurrence analysis of daily rainfalls through non-homogeneous Poissonian processes

    NASA Astrophysics Data System (ADS)

    Sirangelo, B.; Ferrari, E.; de Luca, D. L.

    2011-06-01

    A stochastic model based on a non-homogeneous Poisson process, characterised by a time-dependent intensity of rainfall occurrence, is employed to explain seasonal effects of daily rainfalls exceeding prefixed threshold values. The data modelling has been performed with a partition of the observed daily rainfall data into a calibration period for parameter estimation and a validation period for checking occurrence process changes. The model has been applied to a set of rain gauges located in different geographical areas of Southern Italy. The results show a good fit of the time-varying intensity of the rainfall occurrence process by a 2-harmonic Fourier law and no statistically significant evidence of changes in the validation period for different threshold values.

  9. Creating Reflective Choreographers: The Eyes See/Mind Sees Process

    ERIC Educational Resources Information Center

    Kimbrell, Sinead

    2012-01-01

    Since 1999, when the author first started teaching creative process-based dance programs in public schools, she has struggled to find the time to teach children the basic concepts and tools of dance while teaching them to be deliberate with their choreographic choices. In this article, the author describes a process that helps students and…

  10. Process for forming a homogeneous oxide solid phase of catalytically active material

    DOEpatents

    Perry, Dale L.; Russo, Richard E.; Mao, Xianglei

    1995-01-01

    A process is disclosed for forming a homogeneous oxide solid phase reaction product of catalytically active material comprising one or more alkali metals, one or more alkaline earth metals, and one or more Group VIII transition metals. The process comprises reacting together one or more alkali metal oxides and/or salts, one or more alkaline earth metal oxides and/or salts, one or more Group VIII transition metal oxides and/or salts, capable of forming a catalytically active reaction product, in the optional presence of an additional source of oxygen, using a laser beam to ablate from a target such metal compound reactants in the form of a vapor in a deposition chamber, resulting in the deposition, on a heated substrate in the chamber, of the desired oxide phase reaction product. The resulting product may be formed in variable, but reproducible, stoichiometric ratios. The homogeneous oxide solid phase product is useful as a catalyst, and can be produced in many physical forms, including thin films, particulate forms, coatings on catalyst support structures, and coatings on structures used in reaction apparatus in which the reaction product of the invention will serve as a catalyst.

  11. Parallel information processing channels created in the retina

    PubMed Central

    Schiller, Peter H.

    2010-01-01

    In the retina, several parallel channels originate that extract different attributes from the visual scene. This review describes how these channels arise and what their functions are. Following the introduction four sections deal with these channels. The first discusses the “ON” and “OFF” channels that have arisen for the purpose of rapidly processing images in the visual scene that become visible by virtue of either light increment or light decrement; the ON channel processes images that become visible by virtue of light increment and the OFF channel processes images that become visible by virtue of light decrement. The second section examines the midget and parasol channels. The midget channel processes fine detail, wavelength information, and stereoscopic depth cues; the parasol channel plays a central role in processing motion and flicker as well as motion parallax cues for depth perception. Both these channels have ON and OFF subdivisions. The third section describes the accessory optic system that receives input from the retinal ganglion cells of Dogiel; these cells play a central role, in concert with the vestibular system, in stabilizing images on the retina to prevent the blurring of images that would otherwise occur when an organism is in motion. The last section provides a brief overview of several additional channels that originate in the retina. PMID:20876118

  12. Ask--Think--Create: The Process of Inquiry

    ERIC Educational Resources Information Center

    Diggs, Valerie

    2009-01-01

    Today's students find it difficult to develop an understanding of what it is they need to know, and more importantly, why they need to know it. Framing this "need to know" has been called by various names, such as "inquiry," "inquiry process," "essential questions," "knowledge construction." Inquiry, however, goes much deeper than casual…

  13. Informativeness ratings of messages created on an AAC processing prosthesis.

    PubMed

    Bartlett, Megan R; Fink, Ruth B; Schwartz, Myrna F; Linebarger, Marcia

    2007-01-01

    BACKGROUND: SentenceShaper (SSR) is a computer program that supports spoken language production in aphasia by recording and storing the fragments that the user speaks into the microphone, making them available for playback and allowing them to be combined and integrated into larger structures (i.e., sentences and narratives). A prior study that measured utterance length and grammatical complexity in story-plot narratives produced with and without the aid of SentenceShaper demonstrated an "aided effect" in some speakers with aphasia, meaning an advantage for the narratives that were produced with the support of this communication aid (Linebarger, Schwartz, Romania, Kohn, & Stephens, 2000). The present study deviated from Linebarger et al.'s methods in key respects and again showed aided effects of SentenceShaper in persons with aphasia. AIMS: Aims were (1) to demonstrate aided effects in "functional narratives" conveying hypothetical real-life situations from a first person perspective; (2) for the first time, to submit aided and spontaneous speech samples to listener judgements of informativeness; and (3) to produce preliminary evidence on topic-specific carryover from SentenceShaper, i.e., carryover from an aided production to a subsequent unaided production on the same topic. METHODS & PROCEDURES: Five individuals with chronic aphasia created narratives on two topics, under three conditions: Unaided (U), Aided (SSR), and Post-SSR Unaided (Post-U). The 30 samples (5 participants, 2 topics, 3 conditions) were randomised and judged for informativeness by graduate students in speech-language pathology. The method for rating was Direct Magnitude Estimation (DME). OUTCOMES & RESULTS: Repeated measures ANOVAs were performed on DME ratings for each participant on each topic. A main effect of Condition was present for four of the five participants, on one or both topics. Planned contrasts revealed that the aided effect (SSR > U) was

  14. A hybrid process combining homogeneous catalytic ozonation and membrane distillation for wastewater treatment.

    PubMed

    Zhang, Yong; Zhao, Peng; Li, Jie; Hou, Deyin; Wang, Jun; Liu, Huijuan

    2016-10-01

    A novel catalytic ozonation membrane reactor (COMR) coupling homogeneous catalytic ozonation and direct contact membrane distillation (DCMD) was developed for treating refractory saline organic pollutants in wastewater. An ozonation process took place in the reactor to degrade organic pollutants, whilst the DCMD process was used to recover ionic catalysts and produce clean water. It was found that 98.6% of total organic carbon (TOC) and almost 100% of salt were removed, and almost 100% of the metal ion catalyst was recovered. TOC in the permeate water was less than 16 mg/L after 5 h of operation, which was considered satisfactory as the TOC in the potassium hydrogen phthalate (KHP) feed water was as high as 1000 mg/L. Meanwhile, the membrane distillation flux in the COMR process was 49.8% higher than that in the DCMD process alone after 60 h of operation. Further, scanning electron microscope images showed a smaller amount and size of contaminants on the membrane surface, indicating mitigation of membrane fouling. The tensile strength and FT-IR spectra tests did not reveal obvious changes in the polyvinylidene fluoride membrane after 60 h of operation, indicating good durability. This novel COMR hybrid process exhibited promising application prospects for saline organic wastewater treatment. PMID:27372262

  15. Volume Averaging Study of the Capacitive Deionization Process in Homogeneous Porous Media

    DOE PAGES

    Gabitto, Jorge; Tsouris, Costas

    2015-05-05

    Ion storage in porous electrodes is important in applications such as energy storage by supercapacitors, water purification by capacitive deionization, extraction of energy from a salinity difference and heavy ion purification. In this paper, a model is presented to simulate the charge process in homogeneous porous media comprising big pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Transport between the electrolyte solution and the charged wall is described using the Gouy–Chapman–Stern model. The effective transport parameters for isotropic porous media are calculated solving the corresponding closure problems. Finally, the source terms that appear in the average equations are calculated using numerical computations. An alternative way to deal with the source terms is proposed.

  16. Specific multiple-scattering process in acoustic cloak with multilayered homogeneous isotropic materials

    NASA Astrophysics Data System (ADS)

    Cheng, Ying; Liu, XiaoJun

    2008-11-01

    It was qualitatively demonstrated through finite-element full-wave simulations that an acoustic cloak can be constructed using a concentric multilayered structure with alternating homogeneous isotropic materials [Y. Cheng et al., Appl. Phys. Lett. 92, 151913 (2008)]. Here we present a sequential in-depth analysis of the proposed cloak by means of multiple-scattering algorithms. Calculated pressure fields demonstrate that the cloak possesses low-reflection and wavefront-bending properties. The scattering patterns further characterize the directional cloaking performance in the far field, which is consistent with the pressure fields. The mechanism of the cloaking is ascribed to a specific multiple-scattering process determined by the microscopic material distribution and structural details of the cloak. We also discuss the behavior of the multilayered cloak as a function of wavelength.

  17. An empirical Bayesian and Buhlmann approach with non-homogenous Poisson process

    NASA Astrophysics Data System (ADS)

    Noviyanti, Lienda

    2015-12-01

    All general insurance companies in Indonesia have to adjust their current premium rates to the maximum and minimum limit rates in the new regulation established by the Financial Services Authority (Otoritas Jasa Keuangan / OJK). In this research, we estimated premium rates by means of the Bayesian and the Buhlmann approaches, using historical claim frequency and claim severity in five risk groups. We assumed a Poisson-distributed claim frequency and a Normally distributed claim severity. In particular, we used a non-homogeneous Poisson process for estimating the parameters of the claim frequency. We found that the estimated premium rates are higher than the actual current rates. With regard to the OJK upper and lower limit rates, the estimates for the five risk groups vary; some fall inside the interval and some fall outside it.
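
    The Buhlmann premium referred to above has a compact standard form, P_i = Z*Xbar_i + (1 - Z)*mu with credibility factor Z = n/(n + k). A minimal sketch; the claim matrix below is a made-up placeholder for five risk groups:

      import numpy as np

      # rows: 5 risk groups; columns: 6 years of claims (placeholder data)
      X = np.array([[11, 13, 12, 14, 12, 13],
                    [22, 25, 24, 23, 26, 24],
                    [ 8,  9,  7, 10,  9,  8],
                    [30, 28, 33, 31, 29, 32],
                    [15, 17, 16, 18, 16, 17]], dtype=float)

      n = X.shape[1]
      group_means = X.mean(axis=1)
      mu = group_means.mean()               # collective premium
      s2 = X.var(axis=1, ddof=1).mean()     # expected process variance
      a = group_means.var(ddof=1) - s2 / n  # variance of hypothetical means
      Z = n / (n + s2 / a)                  # credibility factor
      premiums = Z * group_means + (1 - Z) * mu
      print("Z =", round(Z, 3), "premiums:", premiums.round(2))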

  18. Creating a national citizen engagement process for energy policy.

    PubMed

    Pidgeon, Nick; Demski, Christina; Butler, Catherine; Parkhill, Karen; Spence, Alexa

    2014-09-16

    This paper examines some of the science communication challenges involved when designing and conducting public deliberation processes on issues of national importance. We take as our illustrative case study a recent research project investigating public values and attitudes toward future energy system change for the United Kingdom. National-level issues such as this are often particularly difficult to engage the public with because of their inherent complexity, derived from multiple interconnected elements and policy frames, extended scales of analysis, and different manifestations of uncertainty. With reference to the energy system project, we discuss ways of meeting a series of science communication challenges arising when engaging the public with national topics, including the need to articulate systems thinking and problem scale, to provide balanced information and policy framings in ways that open up spaces for reflection and deliberation, and the need for varied methods of facilitation and data synthesis that permit access to participants' broader values. Although resource intensive, national-level deliberation is possible and can produce useful insights both for participants and for science policy. PMID:25225393

  1. CO2-assisted high pressure homogenization: a solvent-free process for polymeric microspheres and drug-polymer composites.

    PubMed

    Kluge, Johannes; Mazzotti, Marco

    2012-10-15

    The study explores the enabling role of near-critical CO(2) as a reversible plasticizer in the high pressure homogenization of polymer particles, aiming at their comminution as well as at the formation of drug-polymer composites. First, the effect of near-critical CO(2) on the homogenization of aqueous suspensions of poly lactic-co-glycolic acid (PLGA) was investigated. Applying a pressure drop of 900 bar and up to 150 passes across the homogenizer, it was found that particles processed in the presence of CO(2) were generally of microspherical morphology and at all times significantly smaller than those obtained in the absence of a plasticizer. The smallest particles, exhibiting a median x(50) of 1.3 μm, were obtained by adding a small quantity of ethyl acetate, which exerts an additional plasticizing effect on PLGA during the homogenization step. Further, the study concerns the possibility of forming drug-polymer composites through simultaneous high pressure homogenization of the two relevant solids, and particularly the effect of near-critical CO(2) on this process. Therefore, PLGA was homogenized together with crystalline S-ketoprofen (S-KET), a non-steroidal anti-inflammatory drug, at a drug to polymer ratio of 1:10, a pressure drop of 900 bar and up to 150 passes across the homogenizer. When the process was carried out in the presence of CO(2), an impregnation efficiency of 91% was reached, corresponding to 8.3 wt.% of S-KET in PLGA; moreover, composite particles were of microspherical morphology and significantly smaller than those obtained in the absence of CO(2). The formation of drug-polymer composites through simultaneous homogenization of the two materials is thus greatly enhanced by the presence of CO(2), which increases the efficiency of both homogenization and impregnation. PMID:22750408

  2. Homogeneous sonophotolysis of food processing industry wastewater: Study of synergistic effects, mineralization and toxicity removal.

    PubMed

    Durán, A; Monteagudo, J M; Sanmartín, I; Gómez, P

    2013-03-01

    The mineralization of industrial wastewater from the food industry using an emerging homogeneous sonophotolytic oxidation process was evaluated as an alternative to, or a rapid pretreatment step for, conventional anaerobic digestion, with the aim of considerably reducing the total treatment time. At the selected operating conditions ([H(2)O(2)] = 11,750 ppm, pH = 8, amplitude = 50%, pulse length (cycles) = 1), 60% of TOC is removed after 60 min and 98% after 180 min when treating an industrial effluent with 2114 ppm of total organic carbon (TOC). This process completely removed the toxicity generated during storage or due to intermediate compounds. An important synergistic effect between sonolysis and photolysis (H(2)O(2)/UV) was observed; thus the sonophotolysis (ultrasound/H(2)O(2)/UV) technique significantly increases TOC removal when compared with each individual process. Finally, a preliminary economic analysis confirms that sonophotolysis with H(2)O(2) and pretreated water is a profitable system when compared with the same process without ultrasound waves and with no pretreatment. PMID:23122709

  3. People Create Health: Effective Health Promotion is a Creative Process

    PubMed Central

    Cloninger, C. Robert; Cloninger, Kevin M.

    2015-01-01

    Effective health promotion involves the creative cultivation of physical, mental, social, and spiritual well-being. Efforts at health promotion produce weak and inconsistent benefits when they do not engage people to express their own goals and values. Likewise, health promotion has been ineffective when it relies only on instruction about facts regarding a healthy lifestyle, or focuses on reduction of disease rather than the cultivation of well-being. Meta-analysis of longitudinal studies and experimental interventions shows that improvements in subjective well-being lead to short-term and long-term reductions in medical morbidity and mortality, as well as to healthier functioning and longevity. However, these effects are inconsistent and weak (correlations of about 0.15). The most consistent and strong predictor of both subjective well-being and objective health status in longitudinal studies is a creative personality profile characterized by being highly self-directed, cooperative, and self-transcendent. There is a synergy among these personality traits that enhances all aspects of the health and happiness of people. Experimental interventions to cultivate this natural creative potential of people are just beginning, but available exploratory research has shown that creativity can be enhanced and that the changes are associated with widespread and profound benefits, including greater physical, mental, social, and spiritual well-being. In addition to benefits mediated by choice of diet, physical activity, and health care utilization, the effect of a creative personality on health may be partly mediated by effects on the regulation of heart rate variability. Creativity promotes autonomic balance with parasympathetic dominance, leading to a calm alert state that promotes an awakening of plasticities and intelligences that stress inhibits. We suggest that health, happiness, and meaning can be cultivated by a complex adaptive process that enhances healthy functioning

  4. Kappa Distribution in a Homogeneous Medium: Adiabatic Limit of a Super-diffusive Process?

    NASA Astrophysics Data System (ADS)

    Roth, I.

    2015-12-01

    The classical statistical theory predicts that an ergodic, weakly interacting system, like charged particles in the presence of electromagnetic fields performing Brownian motions (characterized by small-range deviations in phase space and short-term microscopic memory), converges to the Gibbs-Boltzmann statistics. Observation of distributions with kappa power-law tails in homogeneous systems contradicts this prediction and necessitates a renewed analysis of the basic axioms of the diffusion process: the characteristics of the transition probability density function (pdf) for a single interaction, with the possibility of a non-Markovian process and non-local interactions. The non-local, Lévy-walk deviation is related to the non-extensive statistical framework. Particles bouncing along (solar) magnetic field lines with evolving pitch angles, phases and velocities, as they interact resonantly with waves, undergo energy changes at undetermined time intervals, satisfying these postulates. The dynamic evolution of a general continuous-time random walk is determined by the pdfs of jumps and waiting times, resulting in a fractional Fokker-Planck equation with non-integer derivatives whose solution is given by a Fox H-function. The resulting procedure involves fractional calculus, well known although not frequently used in physics, while the local, Markovian process recasts the evolution into the standard Fokker-Planck equation. Solution of the fractional Fokker-Planck equation with the help of the Mellin transform and evaluation of its residues at the poles of its Gamma functions results in a slowly converging sum with power laws. It is suggested that these tails form the Kappa function. Gradual vs. impulsive solar electron distributions serve as prototypes of this description.
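
    The fractional Fokker-Planck equation referred to here is commonly written in the Metzler-Klafter form (an assumed standard form; the abstract does not fix the exponents):

      \frac{\partial W(v,t)}{\partial t} = {}_{0}D_{t}^{1-\alpha}\left[\frac{\partial}{\partial v}\,\frac{v}{\tau} + K\,\frac{\partial^{\mu}}{\partial |v|^{\mu}}\right] W(v,t),

    where {}_{0}D_{t}^{1-\alpha} is a Riemann-Liouville fractional derivative, \alpha < 1 encodes the non-Markovian waiting times, and \mu < 2 encodes the non-local (Lévy) jumps; setting \alpha = 1 and \mu = 2 recovers the standard Fokker-Planck equation, consistent with the Markovian limit mentioned above.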

  5. Study on Flow Stress Model and Processing Map of Homogenized Mg-Gd-Y-Zn-Zr Alloy During Thermomechanical Processes

    NASA Astrophysics Data System (ADS)

    Xue, Yong; Zhang, Zhimin; Lu, Guang; Xie, Zhiping; Yang, Yongbiao; Cui, Ya

    2015-02-01

    Billets were compressed with 50% height reduction on a hot process simulator to study the plastic flow behavior of homogenized as-cast Mg-13Gd-4Y-2Zn-0.6Zr alloy. The test alloy was heat treated at 520 °C for 12 h before the thermomechanical experiments. The process temperature ranged from 300 to 480 °C, and the strain rate was varied between 0.001 and 0.5 s-1. Based on the Arrhenius-type equation, a flow stress model was established in which the flow stress is expressed as a function of the peak stress, peak strain, and strain; a softening factor characterizes the dynamic softening that occurs during deformation. Meanwhile, processing maps based on dynamic material modeling were constructed. The optimum temperature and strain rate for hot working of the test alloy were 480 °C and 0.01 s-1, respectively. Furthermore, flow instability occurred in two regions: at temperatures from 350 to 480 °C with strain rates of 0.01-0.1 s-1, and at temperatures from 450 to 480 °C with a strain rate of 0.1 s-1. Using the determined hot deformation parameters, four components were successfully formed; the ultimate tensile strength, yield strength, and elongation of the component were 386 MPa, 331 MPa, and 6.3%, respectively.
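
    The Arrhenius-type description mentioned above is conventionally written with the Zener-Hollomon parameter (a standard hot-deformation form; the paper's fitted constants are not given in the abstract):

      \dot{\varepsilon} = A\left[\sinh(\alpha\sigma)\right]^{n}\exp\!\left(-\frac{Q}{RT}\right),
      \qquad Z = \dot{\varepsilon}\,\exp\!\left(\frac{Q}{RT}\right),

    from which the peak stress follows as \sigma_p = \frac{1}{\alpha}\ln\left\{(Z/A)^{1/n} + \left[(Z/A)^{2/n} + 1\right]^{1/2}\right\}.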

  6. First-Principles Molecular Dynamics Studies of Organometallic Complexes and Homogeneous Catalytic Processes.

    PubMed

    Vidossich, Pietro; Lledós, Agustí; Ujaque, Gregori

    2016-06-21

    Computational chemistry is a valuable aid to complement experimental studies of organometallic systems and their reactivity. It allows probing mechanistic hypotheses and investigating molecular structures, shedding light on the behavior and properties of molecular assemblies at the atomic scale. When approaching a chemical problem, the computational chemist has to decide on the theoretical approach needed to describe electron/nuclear interactions and the composition of the model used to approximate the actual system. Both factors determine the reliability of the modeling study. The community dedicated much effort to developing and improving the performance and accuracy of theoretical approaches for electronic structure calculations, on which the description of (inter)atomic interactions rely. Here, the importance of the model system used in computational studies is highlighted through examples from our recent research focused on organometallic systems and homogeneous catalytic processes. We show how the inclusion of explicit solvent allows the characterization of molecular events that would otherwise not be accessible in reduced model systems (clusters). These include the stabilization of nascent charged fragments via microscopic solvation (notably, hydrogen bonding), transfer of charge (protons) between distant fragments mediated by solvent molecules, and solvent coordination to unsaturated metal centers. Furthermore, when weak interactions are involved, we show how conformational and solvation properties of organometallic complexes are also affected by the explicit inclusion of solvent molecules. Such extended model systems may be treated under periodic boundary conditions, thus removing the cluster/continuum (or vacuum) boundary, and require a statistical mechanics simulation technique to sample the accessible configurational space. First-principles molecular dynamics, in which atomic forces are computed from electronic structure calculations (namely, density

  7. Gravitational influences on the liquid-state homogenization and solidification of aluminum antimonide. [space processing of solar cell material

    NASA Technical Reports Server (NTRS)

    Ang, C.-Y.; Lacy, L. L.

    1979-01-01

    Typical commercial or laboratory-prepared samples of polycrystalline AlSb contain microstructural inhomogeneities of Al- or Sb-rich phases in addition to the primary AlSb grains. The paper reports on gravitational influences, such as density-driven convection or sedimentation, that cause microscopic phase separation and nonequilibrium conditions to exist in earth-based melts of AlSb. A triple-cavity electric furnace is used to homogenize the multiphase AlSb samples in space and on earth. A comparative characterization of identically processed low- and one-gravity samples of commercial AlSb reveals major improvements in the homogeneity of the low-gravity homogenized material.

  8. Creating a Context for the Learning of Science Process Skills through Picture Books

    ERIC Educational Resources Information Center

    Monhardt, Leigh; Monhardt, Rebecca

    2006-01-01

    This article provides suggestions on ways in which science process skills can be taught in a meaningful context through children's literature. It is hoped that the following examples of how process skills can be taught using children's books will provide a starting point from which primary teachers can create additional examples. Many…

  9. Homogeneous and heterogeneous distributed cluster processing for two- and three-dimensional viscoelastic flows

    NASA Astrophysics Data System (ADS)

    Baloch, A.; Grant, P. W.; Webster, M. F.

    2002-12-01

    A finite-element study of two- and three-dimensional incompressible viscoelastic flows in a planar lid-driven cavity and concentric rotating cylinders is presented. The hardware platforms consist of both homogeneous and heterogeneous clusters of workstations. A semi-implicit time-stepping Taylor-Galerkin scheme is employed, using the message-passing mechanism provided by the Parallel Virtual Machine libraries. DEC-alpha, Intel Solaris and AMD-K7 (Athlon) Linux clusters are utilized. Parallel results are compared against single-processor (sequential) solutions, using the parallelism paradigm of domain decomposition. Communication is effectively masked, and practically ideal, linear speed-up with the number of processors is realized.

  10. Effects of non-homogeneous flow on ADCP data processing in a hydroturbine forebay

    DOE PAGESBeta

    Harding, S. F.; Richmond, M. C.; Romero-Gomez, P.; Serkowski, J. A.

    2016-01-02

    Accurate modeling of the velocity field in the forebay of a hydroelectric power station is important for both power generation and fish passage, and the flow field can be increasingly well represented by computational fluid dynamics (CFD) simulations. Acoustic Doppler Current Profiler (ADCP) measurements are investigated herein as a method of validating the numerical flow solutions, particularly in observed and calculated regions of non-homogeneous flow velocity. By using a numerical model of an ADCP operating in a velocity field calculated using CFD, the errors due to the spatial variation of the flow velocity are quantified. This numerical model of the ADCP is referred to herein as a Virtual ADCP (VADCP).
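
    As a hedged sketch of the virtual-ADCP idea, the following Python fragment samples a spatially varying velocity field along the two beams of a Janus pair and reconstructs the velocity under the usual homogeneous-flow assumption; the quadratic lateral variation and 20° beam angle are illustrative assumptions, not the paper's configuration.

        import numpy as np

        def u_field(x):
            """Illustrative non-homogeneous streamwise velocity, m/s."""
            return 1.0 + 0.05 * x**2

        theta = np.radians(20.0)                 # beam angle from vertical (assumed)
        depth = np.linspace(1.0, 10.0, 50)       # measurement cells along the beams
        offset = depth * np.tan(theta)           # lateral displacement of each cell

        # Radial (along-beam) velocities seen by the two opposing beams.
        b1 = u_field(+offset) * np.sin(theta)
        b2 = -u_field(-offset) * np.sin(theta)

        # Standard reconstruction assumes the flow is identical at both beams.
        u_est = (b1 - b2) / (2.0 * np.sin(theta))
        bias = u_est - u_field(0.0)              # error caused by spatial variation
        print(bias.max())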

  11. Study on rheo-diecasting process of 7075R alloys by SA-EMS melt homogenized treatment

    NASA Astrophysics Data System (ADS)

    Zhihua, G.; Jun, X.; Zhifeng, Z.; Guojun, L.; Mengou, T.

    2016-03-01

    An advanced melt processing technology, spiral annular electromagnetic stirring (SA-EMS), based on the annular electromagnetic stirring (A-EMS) process, was developed for manufacturing Al-alloy components with high integrity. The SA-EMS process innovatively combines non-contact electromagnetic stirring with a spiral annular chamber of specially designed profile to prepare high-quality melt slurry in situ; intensive forced shearing is achieved at high shear rate and high turbulence intensity inside the spiral annular chamber. In this paper, the solidification microstructure and hardness of a 7075R alloy die-cast connecting rod conditioned by the SA-EMS melt processing technology were investigated. The results indicate that SA-EMS melt processing exhibited superior grain refinement and remarkable structural homogeneity. In addition, it evidently enhanced the mechanical performance and reduced the cracking tendency.

  12. Porcine liver decellularization under oscillating pressure conditions: a technical refinement to improve the homogeneity of the decellularization process.

    PubMed

    Struecker, Benjamin; Hillebrandt, Karl Herbert; Voitl, Robert; Butter, Antje; Schmuck, Rosa B; Reutzel-Selke, Anja; Geisel, Dominik; Joehrens, Korinna; Pickerodt, Philipp A; Raschzok, Nathanael; Puhl, Gero; Neuhaus, Peter; Pratschke, Johann; Sauer, Igor M

    2015-03-01

    Decellularization and recellularization of parenchymal organs may facilitate the generation of autologous functional liver organoids by repopulation of decellularized porcine liver matrices with induced liver cells. We present an accelerated (7 h overall perfusion time) and effective protocol for human-scale liver decellularization by pressure-controlled perfusion with 1% Triton X-100 and 1% sodium dodecyl sulfate via the hepatic artery (120 mmHg) and portal vein (60 mmHg). In addition, we analyzed the effect of oscillating pressure conditions on pig liver decellularization (n=19). The proprietary perfusion device used to generate these pressure conditions mimics intra-abdominal conditions during respiration, to improve microperfusion within livers and thus the homogeneity of the decellularization process. The efficiency of perfusion decellularization was analyzed by macroscopic observation, histological staining (hematoxylin and eosin [H&E], Sirius red, and alcian blue), immunohistochemical staining (collagen IV, laminin, and fibronectin), and biochemical assessment (DNA, collagen, and glycosaminoglycans) of decellularized liver matrices. The integrity of the extracellular matrix (ECM) postdecellularization was visualized by corrosion casting and three-dimensional computed tomography scanning. We found that livers perfused under oscillating pressure conditions (P(+)) showed a more homogeneous course of decellularization and contained less DNA compared with livers perfused without oscillating pressure conditions (P(-)). Microscopically, livers from the (P(-)) group showed remnant cell clusters, while no cells were found in livers from the (P(+)) group. The grade of disruption of the ECM was higher in livers from the (P(-)) group, although the perfusion rates and pressure did not significantly differ. Immunohistochemical staining revealed that important matrix components were still present after decellularization. Corrosion casting showed an intact vascular network.

  13. Quality function deployment: A customer-driven process to create and deliver value. Final report

    SciTech Connect

    George, S.S.

    1994-12-01

    Quality function deployment (QFD) is a team-oriented decision-making process used by more than 100 US businesses and industries to develop new products and marketing strategies. This report provides a detailed description of QFD and case study examples of how electric utilities can apply QFD principles in creating successful marketing and demand-side management (DSM) programs. The five-stage QFD process involves identifying customer needs and using this information to systematically develop program features, implementation activities, management procedures, and evaluation plans. QFD is not a deterministic model that provides answers, but a flexible, pragmatic tool for systematically organizing and communicating information to help utilities make better decisions.

  14. Dense and Homogeneous Compaction of Fine Ceramic and Metallic Powders: High-Speed Centrifugal Compaction Process

    NASA Astrophysics Data System (ADS)

    Suzuki, Hiroyuki Y.

    2008-02-01

    High-Speed Centrifugal Compaction Process (HCP) is a variation of the colloidal compacting method, in which the powders sediment under a huge centrifugal force. The compacting mechanism of HCP differs from that of conventional colloidal processes such as slip casting. This unique compacting mechanism gives HCP a number of advantageous characteristics, such as higher compacting speed, wide applicability for net-shape formation, and flawless microstructure of the green compacts. However, HCP also has several deteriorative characteristics that must be overcome to realize the process's full potential.

  15. Dense and Homogeneous Compaction of Fine Ceramic and Metallic Powders: High-Speed Centrifugal Compaction Process

    SciTech Connect

    Suzuki, Hiroyuki Y.

    2008-02-15

    High-Speed Centrifugal Compaction Process (HCP) is a variation of the colloidal compacting method, in which the powders sediment under a huge centrifugal force. The compacting mechanism of HCP differs from that of conventional colloidal processes such as slip casting. This unique compacting mechanism gives HCP a number of advantageous characteristics, such as higher compacting speed, wide applicability for net-shape formation, and flawless microstructure of the green compacts. However, HCP also has several deteriorative characteristics that must be overcome to realize the process's full potential.

  16. The influence of melting process and parameters on the structure and homogeneity of titanium-tantalum alloys

    SciTech Connect

    Dunn, P.S.; Korzewka, D.; Garcia, F.; Damkroger, B.K.; Van Den Avyle, J.A.; Tissot, R.G.

    1995-12-31

    Alloys of titanium with refractory metals are attractive materials for applications requiring high-temperature strength and corrosion resistance. However, the widely different characteristics of the component elements have made it difficult to produce sound, compositionally homogeneous ingots using traditional melting techniques. This is particularly critical because the compositional ranges spanned by the micro- and macrosegregation in these systems can easily encompass a number of microconstituents which are detrimental to mechanical properties. This paper presents results of a study of plasma (PAM) and vacuum-arc (VAR) melting of a 60 wt% tantalum, 40 wt% titanium binary alloy. The structural and compositional homogeneity of both PAM consolidated + PAM remelted and PAM consolidated + VAR remelted ingots were characterized and compared using optical and electron microscopy and x-ray fluorescence microanalysis. Additionally, the effect of melting parameters, including melt rate and magnetic stirring, was studied. Results indicate that PAM remelting achieves more complete dissolution of the starting electrode, due to greater local superheat, than does VAR remelting. PAM remelting also produces a finer as-solidified grain structure, due to the smaller molten pool and shorter local solidification times. Conversely, VAR remelting produces an ingot with a more uniform macrostructure, due to the more stable movement of the solidification interface and more uniform material feed rate. Based on these results, a three-step process of PAM consolidation, followed by a PAM intermediate melt and a VAR final melt, has been selected for further development of the alloy and processing sequence.

  17. Homogeneity Pursuit

    PubMed Central

    Ke, Tracy; Fan, Jianqing; Wu, Yichao

    2014-01-01

    This paper explores the homogeneity of coefficients in high-dimensional regression, which extends the sparsity concept and is more general and suitable for many applications. Homogeneity arises when regression coefficients corresponding to neighboring geographical regions or a similar cluster of covariates are expected to be approximately the same. Sparsity corresponds to a special case of homogeneity with one large cluster at the known atom zero. In this article, we propose a new method called clustering algorithm in regression via data-driven segmentation (CARDS) to explore homogeneity. New mathematical results are provided on the gain that can be achieved by exploring homogeneity. Statistical properties of two versions of CARDS are analyzed. In particular, the asymptotic normality of our proposed CARDS estimator is established, which reveals better estimation accuracy for homogeneous parameters than that achieved without homogeneity exploration. When our methods are combined with sparsity exploration, further efficiency can be achieved beyond the exploration of sparsity alone. This provides additional insights into the power of exploring low-dimensional structures in high-dimensional regression: homogeneity and sparsity. Our results also shed light on the properties of the fused Lasso. The newly developed method is further illustrated by simulation studies and applications to real data. Supplementary materials for this article are available online. PMID:26085701
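
    For orientation, the fused Lasso that the authors relate their results to solves a penalized least-squares problem of the following standard form (stated here for reference; it is not the CARDS objective itself):

        \min_{\beta}\; \tfrac{1}{2}\,\lVert y - X\beta \rVert_2^2
          + \lambda_1 \sum_{j=1}^{p} \lvert \beta_j \rvert
          + \lambda_2 \sum_{j=2}^{p} \lvert \beta_j - \beta_{j-1} \rvert ,

    where the first penalty induces sparsity and the second encourages homogeneity by fusing neighboring coefficients to a common value.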

  18. Deactivation processes of homogeneous Pd catalysts using in situ time resolved spectroscopic techniques.

    PubMed

    Tromp, Moniek; Sietsma, Jelle R A; van Bokhoven, Jeroen A; van Strijdonck, Gino P F; van Haaren, Richard J; van der Eerden, Ad M J; van Leeuwen, Piet W N M; Koningsberger, Diek C

    2003-01-01

    UV-Vis combined with ED-XAFS shows, for the first time, the evolution of inactive Pd dimers and trimers, which are a possible first stage in the deactivation process of important palladium-catalysed reactions, leading to larger palladium clusters and eventually palladium black. PMID:12610999

  19. Experimental development of processes to produce homogenized alloys of immiscible metals, phase 3

    NASA Technical Reports Server (NTRS)

    Reger, J. L.

    1976-01-01

    An experimental drop-tower package was designed and built. This effort consisted of a thermal analysis, container/heater fabrication, and assembly of an expulsion device for rapid quenching of heated specimens during low-gravity conditions. Six gallium-bismuth specimens with compositions in the immiscibility region (50 a/o of each element) were processed in the experimental package: four under low-gravity conditions and two in a one-gravity environment. One of the one-gravity specimens lacked telemetry data and was subsequently excluded from analysis, since its processing conditions were not known. Metallurgical, Hall effect, resistivity, and superconductivity examinations were performed on the five remaining specimens. Examination of the specimens showed that the gallium was dispersed in the bismuth. The low-gravity specimens showed a relatively uniform distribution of gallium, with particle sizes of 1 micrometer or less, in contrast to the one-gravity control specimen. Comparison of the cooling rates of the dropped specimens with their microstructures indicated that low cooling rates are more desirable.

  20. Development of a reference material for Staphylococcus aureus enterotoxin A in cheese: feasibility study, processing, homogeneity and stability assessment.

    PubMed

    Zeleny, R; Emteborg, H; Charoud-Got, J; Schimmel, H; Nia, Y; Mutel, I; Ostyn, A; Herbin, S; Hennekinne, J-A

    2015-02-01

    Staphylococcal food poisoning is caused by enterotoxins excreted into foods by strains of staphylococci. Commission Regulation 1441/2007 specifies thresholds for the presence of these toxins in foods. In this article we report on the progress towards reference materials (RMs) for Staphylococcal enterotoxin A (SEA) in cheese. RMs are crucial to enforce legislation and to implement and safeguard reliable measurements. First, a feasibility study revealed a suitable processing procedure for cheese powders: the blank material was prepared by cutting, grinding, freeze-drying and milling. For the spiked material, a cheese-water slurry was spiked with SEA solution, freeze-dried and diluted with blank material to the desired SEA concentration. Thereafter, batches of three materials (blank; two SEA concentrations) were processed. The materials were shown to be sufficiently homogeneous, and storage at ambient temperature for 4 weeks did not indicate degradation. These results provide the basis for the development of a RM for SEA in cheese. PMID:25172706

  1. The Distribution of Family Sizes Under a Time-Homogeneous Birth and Death Process.

    PubMed

    Moschopoulos, Panagis; Shpak, Max

    2010-05-11

    The number of extant individuals within a lineage, as exemplified by counts of species numbers across genera in a higher taxonomic category, is known to be a highly skewed distribution. Because the sublineages (such as genera in a clade) themselves follow a random birth process, deriving the distribution of lineage sizes involves averaging the solutions to a birth and death process over the distribution of time intervals separating the origin of the lineages. In this article, we show that the resulting distributions can be represented by hypergeometric functions of the second kind. We also provide approximations of these distributions up to the second order, and compare these results to the asymptotic distributions and numerical approximations used in previous studies. For two limiting cases, one with a relatively high rate of lineage origin, one with a low rate, the cumulative probability densities and percentiles are compared to show that the approximations are robust over a wide range of parameters. It is proposed that the probability distributions of lineage size may have a number of relevant applications to biological problems such as the coalescence of genetic lineages and in predicting the number of species in living and extinct higher taxa, as these systems are special instances of the underlying process analyzed in this article. PMID:23543815
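
    For reference, the classical solution being averaged is the textbook linear birth-death distribution for a lineage founded by a single individual at time t in the past, with birth rate \lambda and death rate \mu (a standard result stated under the usual assumptions, not the article's hypergeometric representation):

        P\{N(t)=0\} = \alpha(t), \qquad
        P\{N(t)=n\} = \bigl(1-\alpha(t)\bigr)\bigl(1-\beta(t)\bigr)\,\beta(t)^{n-1}, \quad n \ge 1,

        \alpha(t) = \frac{\mu\bigl(e^{(\lambda-\mu)t}-1\bigr)}{\lambda e^{(\lambda-\mu)t}-\mu},
        \qquad
        \beta(t) = \frac{\lambda\bigl(e^{(\lambda-\mu)t}-1\bigr)}{\lambda e^{(\lambda-\mu)t}-\mu}.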

  2. Effects of homogenization process parameters on physicochemical properties of astaxanthin nanodispersions prepared using a solvent-diffusion technique

    PubMed Central

    Anarjan, Navideh; Jafarizadeh-Malmiri, Hoda; Nehdi, Imededdine Arbi; Sbihi, Hassen Mohamed; Al-Resayes, Saud Ibrahim; Tan, Chin Ping

    2015-01-01

    Nanodispersion systems allow incorporation of lipophilic bioactives, such as astaxanthin (a fat soluble carotenoid) into aqueous systems, which can improve their solubility, bioavailability, and stability, and widen their uses in water-based pharmaceutical and food products. In this study, response surface methodology was used to investigate the influences of homogenization time (0.5–20 minutes) and speed (1,000–9,000 rpm) in the formation of astaxanthin nanodispersions via the solvent-diffusion process. The product was characterized for particle size and astaxanthin concentration using laser diffraction particle size analysis and high performance liquid chromatography, respectively. Relatively high determination coefficients (ranging from 0.896 to 0.969) were obtained for all suggested polynomial regression models. The overall optimal homogenization conditions were determined by multiple response optimization analysis to be 6,000 rpm for 7 minutes. In vitro cellular uptake of astaxanthin from the suggested individual and multiple optimized astaxanthin nanodispersions was also evaluated. The cellular uptake of astaxanthin was found to be considerably increased (by more than five times) as it became incorporated into optimum nanodispersion systems. The lack of a significant difference between predicted and experimental values confirms the suitability of the regression equations connecting the response variables studied to the independent parameters. PMID:25709435
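
    A minimal sketch of the response-surface step described above, assuming made-up data: fit a full quadratic model for mean particle size as a function of homogenization time and speed, then search a grid for the optimum. The data values are invented for illustration only.

        import numpy as np

        t = np.array([0.5, 5.0, 10.0, 15.0, 20.0, 7.0, 7.0, 12.0, 3.0, 18.0])   # min
        s = np.array([1e3, 3e3, 5e3, 7e3, 9e3, 6e3, 2e3, 8e3, 4e3, 5e3])        # rpm
        z = np.array([420, 210, 150, 170, 260, 120, 300, 190, 240, 160])        # nm (invented)

        # Full quadratic response surface: z ~ 1, t, s, t^2, s^2, t*s
        X = np.column_stack([np.ones_like(t), t, s, t**2, s**2, t * s])
        coef, *_ = np.linalg.lstsq(X, z, rcond=None)

        # Locate the predicted minimum particle size on a dense grid.
        tt, ss = np.meshgrid(np.linspace(0.5, 20, 200), np.linspace(1e3, 9e3, 200))
        G = np.column_stack([np.ones(tt.size), tt.ravel(), ss.ravel(),
                             tt.ravel()**2, ss.ravel()**2, (tt * ss).ravel()])
        i = (G @ coef).argmin()
        print("optimum ~", tt.ravel()[i], "min at", ss.ravel()[i], "rpm")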

  3. Laboratory Studies of Homogeneous and Heterogeneous Chemical Processes of Importance in the Upper Atmosphere

    NASA Technical Reports Server (NTRS)

    Molina, Mario J.

    2003-01-01

    The objective of this study was to conduct measurements of chemical kinetics parameters for reactions of importance in the stratosphere and upper troposphere, and to study the interaction of trace gases with ice surfaces in order to elucidate the mechanism of heterogeneous chlorine activation processes, using both a theoretical and an experimental approach. The measurements were carried out under temperature and pressure conditions covering those applicable to the stratosphere and upper troposphere. The main experimental technique employed was turbulent flow-chemical ionization mass spectrometry, which is particularly well suited for investigations of radical-radical reactions.

  4. Synthetic river valleys: Creating prescribed topography for form-process inquiry and river rehabilitation design

    NASA Astrophysics Data System (ADS)

    Brown, R. A.; Pasternack, G. B.; Wallender, W. W.

    2014-06-01

    The synthesis of artificial landforms is complementary to geomorphic analysis because it affords a reflection on both the characteristics and intrinsic formative processes of real world conditions. Moreover, the applied terminus of geomorphic theory is commonly manifested in the engineering and rehabilitation of riverine landforms, where the goal is to create specific processes associated with specific morphology. To date, the synthesis of river topography has been explored outside of geomorphology through artistic renderings, computer science applications, and river rehabilitation design; while within geomorphology it has been explored using morphodynamic modeling, such as one-dimensional simulation of river reach profiles, two-dimensional simulation of river networks, and three-dimensional simulation of subreach scale river morphology. Yet no existing approach allows geomorphologists, engineers, or river rehabilitation practitioners to create landforms of prescribed conditions. In this paper a method for creating topography of synthetic river valleys is introduced that utilizes a theoretical framework drawing from fluvial geomorphology, computer science, and geometric modeling. Such a method would be valuable to geomorphologists in understanding form-process linkages as well as to engineers and river rehabilitation practitioners in developing design surfaces that can be rapidly iterated. The method introduced herein relies on the discretization of river valley topography into geometric elements associated with overlapping and orthogonal two-dimensional planes such as the planform, profile, and cross section that are represented by mathematical functions, termed geometric element equations. Topographic surfaces can be parameterized independently or dependently using a geomorphic covariance structure between the spatial series of geometric element equations. To illustrate the approach and overall model flexibility, examples are provided that are associated with

  5. The Parametric Model of the Human Mandible Coronoid Process Created by Method of Anatomical Features

    PubMed Central

    Vitković, Nikola; Mitić, Jelena; Manić, Miodrag; Trajanović, Miroslav; Husain, Karim; Petrović, Slađana; Arsić, Stojanka

    2015-01-01

    Geometrically accurate and anatomically correct 3D models of human bones are of great importance for medical research and practice in orthopedics and surgery. These geometrical models can be created by techniques based on input geometrical data acquired from volumetric scanning methods (e.g., Computed Tomography (CT)) or from 2D images (e.g., X-ray). Geometrical models of human bones created in this way can be applied in the education of medical practitioners, preoperative planning, etc. In cases where the geometrical data about a human bone are incomplete (e.g., fractures), it may be necessary to create its complete geometrical model. A possible solution to this problem is the application of parametric models. The geometry of these models can be changed and adapted to the specific patient based on parameter values acquired from medical images (e.g., X-ray). In this paper, the Method of Anatomical Features (MAF), which enables the creation of geometrically precise and anatomically accurate models of human bones, is implemented for the creation of a parametric model of the Human Mandible Coronoid Process (HMCP). The obtained results regarding the geometrical accuracy of the model are quite satisfactory, as stated by medical practitioners and confirmed in the literature. PMID:26064183

  6. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or Denver AWIPS Risk Reduction and Requirements Evaluation (DARE) Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on-the-fly for AWIPS, by using two examples of AWIPS applications that were created by the Applied Meteorology Unit (AMU) located at Cape Canaveral Air Force Station (CCAFS), Florida. The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) at Johnson Space Center, Texas and 45th Weather Squadron (45 WS) at CCAFS to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and the 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. The presentation will list the advantages and disadvantages of both file types for creating interactive graphical overlays in future AWIPS applications. Shapefiles are a popular format used extensively in Geographical Information Systems. They are usually used in AWIPS to depict static map backgrounds. A shapefile stores the geometry and attribute information of spatial features in a dataset (ESRI 1998). Shapefiles can contain point, line, and polygon features. Each shapefile contains a main file, index file, and a dBASE table. The main file contains a record for each spatial feature, which describes the feature with a list of its vertices. The index file contains the offset of each record from the beginning of the main file. The dBASE table contains records for each
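
    As a hedged illustration of the shapefile structure described above (not the AMU tools themselves), the following sketch uses the open-source pyshp package to write and read a small polygon overlay; the file name, attribute field, and coordinates are invented.

        import shapefile  # pyshp

        # Write a polygon "threat corridor" with one dBASE attribute.
        w = shapefile.Writer("anvil_corridor", shapeType=shapefile.POLYGON)
        w.field("NAME", "C", size=40)             # attribute column in the .dbf table
        w.poly([[[-80.6, 28.4], [-80.2, 28.9], [-79.9, 28.7],
                 [-80.5, 28.2], [-80.6, 28.4]]])  # one closed lon/lat ring
        w.record("anvil threat corridor")
        w.close()                                 # emits the .shp, .shx and .dbf files

        # Read back: each spatial feature pairs a shape with its attribute record.
        r = shapefile.Reader("anvil_corridor")
        print(r.shape(0).points, r.record(0)[0])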

  7. Laboratory Studies of Homogeneous and Heterogeneous Chemical Processes of Importance in the Upper Atmosphere

    NASA Technical Reports Server (NTRS)

    Molina, Mario J.

    2001-01-01

    The objective of this study is to conduct measurements of chemical kinetics parameters for reactions of importance in the stratosphere and upper troposphere, and to study the interaction of trace gases such as HCl with ice surfaces in order to elucidate the mechanism of heterogeneous chlorine activation processes, using both a theoretical and an experimental approach. The measurements will be carried out under temperature and pressure conditions covering those applicable to the stratosphere and upper troposphere. The techniques to be employed include turbulent flow - chemical ionization mass spectrometry, and optical ellipsometry. The next section summarizes our research activities during the second year of the project, and the section that follows consists of the statement of work for the third year.

  8. Creating OGC Web Processing Service workflows using a web-based editor

    NASA Astrophysics Data System (ADS)

    de Jesus, J.; Walker, P.; Grant, M.

    2012-04-01

    The OGC WPS (Web Processing Service) specifies how geospatial algorithms may be accessed in an SOA (Service Oriented Architecture). Service providers can encode both simple and sophisticated algorithms as WPS processes and publish them as web services. These services are not only useful individually but may be built into complex processing chains (workflows) that can solve complex data analysis and/or scientific problems. The NETMAR project has extended the Web Processing Service (WPS) framework to provide transparent integration between it and the commonly used WSDL (Web Service Description Language) that describes the web services and its default SOAP (Simple Object Access Protocol) binding. The extensions allow WPS services to be orchestrated using commonly used tools (in this case Taverna Workbench, but BPEL based systems would also be an option). We have also developed a WebGUI service editor, based on HTML5 and the WireIt! Javascript API, that allows users to create these workflows using only a web browser. The editor is coded entirely in Javascript and performs all XSLT transformations needed to produce a Taverna compatible (T2FLOW) workflow description which can be exported and run on a local Taverna Workbench or uploaded to a web-based orchestration server and run there. Here we present the NETMAR WebGUI service chain editor and discuss the problems associated with the development of a WebGUI for scientific workflow editing; content transformation into the Taverna orchestration language (T2FLOW/SCUFL); final orchestration in the Taverna engine and how to deal with the large volumes of data being transferred between different WPS services (possibly running on different servers) during workflow orchestration. We will also demonstrate using the WebGUI for creating a simple workflow making use of published web processing services, showing how simple services may be chained together to produce outputs that would previously have required a GIS (Geographic

  9. Processing of α-chitin nanofibers by dynamic high pressure homogenization: characterization and antifungal activity against A. niger.

    PubMed

    Salaberria, Asier M; Fernandes, Susana C M; Diaz, Rene Herrera; Labidi, Jalel

    2015-02-13

    Chitin nano-objects are more interesting and attractive materials than native chitin because of their usable form, low density, high surface area, and promising mechanical properties. This work suggests a straightforward and environmentally friendly method for processing chitin nanofibers using dynamic high-pressure homogenization. This technique proved to be a remarkably simple way to convert α-chitin from yellow lobster wastes into α-chitin nanofibers with uniform width (below 100 nm) and high aspect ratio, and may contribute to a major breakthrough in chitin applications. Moreover, the resulting α-chitin nanofibers were characterized and compared with native α-chitin in terms of chemical and crystal structure, thermal degradation, and antifungal activity. The biological assays highlighted that the nano nature of the chitin nanofibers plays an important role in the antifungal activity against Aspergillus niger. PMID:25458302

  10. Development of a new cucumber reference material for pesticide residue analysis: feasibility study for material processing, homogeneity and stability assessment.

    PubMed

    Grimalt, Susana; Harbeck, Stefan; Shegunova, Penka; Seghers, John; Sejerøe-Olsen, Berit; Emteborg, Håkan; Dabrio, Marta

    2015-04-01

    The feasibility of producing a reference material for pesticide residue analysis in a cucumber matrix was investigated. Cucumber was spiked at 0.075 mg/kg with each of 15 selected pesticides (acetamiprid, azoxystrobin, carbendazim, chlorpyrifos, cypermethrin, diazinon, (α + β)-endosulfan, fenitrothion, imazalil, imidacloprid, iprodione, malathion, methomyl, tebuconazole and thiabendazole). Three different strategies were considered for processing the material, based on the physicochemical properties of the vegetable and the target pesticides. As a result, a frozen spiked slurry of fresh cucumber, a spiked freeze-dried cucumber powder and a freeze-dried cucumber powder spiked by spraying the powder were studied. The effects of processing and aspects related to the reconstitution of the material were evaluated by monitoring the pesticide levels in the three materials. Two separate analytical methods based on LC-MS/MS and GC-MS/MS were developed and validated in-house. The spiked freeze-dried cucumber powder was selected as the most feasible material, and more exhaustive studies on the homogeneity and stability of the pesticide residues in the matrix were carried out. The results suggested that the between-unit homogeneity was satisfactory with a sample intake of dried material as low as 0.1 g. A 9-week isochronous stability study was undertaken at -20 °C, 4 °C and 18 °C, with -70 °C designated as the reference temperature. The pesticides tested exhibited adequate stability at -20 °C during the 9-week period, as well as at -70 °C for a period of 18 months. These results constitute a good basis for the development of a new candidate reference material for selected pesticides in a cucumber matrix. PMID:25627789

  11. Nonstationary homogeneous nucleation

    NASA Technical Reports Server (NTRS)

    Harstad, K. G.

    1974-01-01

    The theory of homogeneous condensation is reviewed and equations describing this process are presented. Numerical computer solutions to transient problems in nucleation (relaxation to steady state) are presented and compared to a prior computation.
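
    For context, the steady state toward which such transient solutions relax is usually written in classical nucleation theory as (standard textbook forms, not the paper's specific equations):

        J_{ss} = J_0 \exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right),
        \qquad
        \Delta G^{*} = \frac{16\pi\,\sigma^{3} v_m^{2}}{3\,\bigl(k_B T \ln S\bigr)^{2}},

    where \sigma is the surface tension, v_m the molecular volume, and S the supersaturation; the transient (nonstationary) behavior is often approximated by J(t) \approx J_{ss}\exp(-\tau/t) with a characteristic time lag \tau.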

  12. Preparation of cotton linter nanowhiskers by high-pressure homogenization process and its application in thermoplastic starch

    NASA Astrophysics Data System (ADS)

    Savadekar, N. R.; Karande, V. S.; Vigneshwaran, N.; Kadam, P. G.; Mhaske, S. T.

    2015-03-01

    The present work deals with the preparation of cotton linter nanowhiskers (CLNW) by acid hydrolysis and subsequent processing in a high-pressure homogenizer. The prepared CLNW were then used as a reinforcing material in thermoplastic starch (TPS), with the aim of improving its performance properties. The concentration of CLNW was varied as 0, 1, 2, 3, 4 and 5 wt% in TPS, and TPS/CLNW nanocomposite films were prepared by a solution-casting process. The nanocomposite films were characterized by tensile testing, differential scanning calorimetry (DSC), scanning electron microscopy (SEM), water vapor permeability (WVP), oxygen permeability (OP), X-ray diffraction, and light transmittance measurements. The 3 wt% CLNW-loaded TPS nanocomposite films demonstrated an 88% improvement in tensile strength compared to the pristine TPS film, whereas WVP and OP decreased by 90% and 92%, respectively, which is remarkable given the small quantity of CLNW added. DSC thermograms of the nanocomposite films did not show any significant shift in melting temperature compared to pristine TPS. The light transmittance (Tr) of TPS decreased with increasing CLNW content. Better interaction between CLNW and TPS, owing to the hydrophilic nature of both materials, and uniform distribution of CLNW in TPS were the prime reasons for the property improvements observed at 3 wt% CLNW loading. However, SEM analysis showed that CLNW formed agglomerates at higher concentrations. These nanocomposite films have potential use in food and pharmaceutical packaging applications.

  13. Creating "Intelligent" Climate Model Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, N. C.; Taylor, P. C.

    2014-12-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is often used to add value to model projections: consensus projections have been shown to consistently outperform individual models. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, certain models reproduce climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequally weighting multi-model ensembles. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables, e.g., outgoing longwave radiation and surface temperature. Metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument and surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing weighted and unweighted model ensembles. For example, one tested metric weights the ensemble by how well models reproduce the time-series probability distribution of the cloud forcing component of reflected shortwave radiation. The weighted ensemble for this metric indicates lower simulated precipitation (up to 0.7 mm/day) in tropical regions than the unweighted ensemble: since CMIP5 models have been shown to
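
    A minimal sketch of the weighting framework, with invented numbers: convert per-model metric scores into normalized weights and compare the weighted consensus against the equal-weighted one.

        import numpy as np

        rng = np.random.default_rng(0)
        n_models = 12
        # Metric score per model (1.0 = best agreement with the observed
        # process-based relationship); values are invented placeholders.
        skill = rng.uniform(0.2, 1.0, n_models)
        # Projected quantity per model, e.g. regional precipitation change (mm/day).
        projection = rng.normal(3.0, 0.8, n_models)

        weights = skill / skill.sum()            # performance-based weights
        equal = np.full(n_models, 1.0 / n_models)

        print("equal-weighted mean:", projection @ equal)
        print("metric-weighted mean:", projection @ weights)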

  14. Using a critical reflection process to create an effective learning community in the workplace.

    PubMed

    Walker, Rachel; Cooke, Marie; Henderson, Amanda; Creedy, Debra K

    2013-05-01

    Learning circles are an enabling process to critically examine and reflect on practices with the purpose of promoting individual and organizational growth and change. The authors adapted and developed a learning circle strategy to facilitate open discourse between registered nurses, clinical leaders, clinical facilitators and students, to critically reflect on practice experiences to promote a positive learning environment. This paper reports on an analysis of field notes taken during a critical reflection process used to create an effective learning community in the workplace. A total of 19 learning circles were conducted during in-service periods (that is, the time allocated for professional education between morning and afternoon shifts) over a 3 month period with 56 nurses, 33 students and 1 university-employed clinical supervisor. Participation rates ranged from 3 to 12 individuals per discussion. Ten themes emerged from content analysis of the clinical learning issues identified through the four-step model of critical reflection used in learning circle discussions. The four-step model of critical reflection allowed participants to reflect on clinical learning issues, and raise them in a safe environment that enabled topics to be challenged and explored in a shared and cooperative manner. PMID:22459911

  15. Manufacturing of 9CrMoCoB Steel of Large Ingot with Homogeneity by ESR Process

    NASA Astrophysics Data System (ADS)

    Kim, D. S.; Lee, G. J.; Lee, M. B.; Hur, J. I.; Lee, J. W.

    2016-07-01

    In case of 9CrMoCoB (COST FB2) steel, equilibrium relation between [B]/[Si] ratio and (B2O3)/(SiO2) ratio is very important to control [Si] and [B] in optimum range. Therefore, in this work, to investigate the thermodynamic equilibrium relation between [B]/[Si] ratio and (B2O3)/(SiO2) ratio, pilot ESR experiments of 9CrMoCoB steel were carried out using the CaF2-CaO-Al2O3-SiO2-B2O3 slag system according to change of Si content in electrode and B2O3 content in the slag. Furthermore, through the test melting of the 20ton-class ESR ingot, the merits and demerits of soft arcing were investigated. From these results, it is concluded that oxygen content in the ESR ingot decrease with decreasing SiO2 content in the slag, relation function between [B]/[Si] ratio and (B2O3)/(SiO2) ratio derived by Pilot ESR test shows a good agreement as compared to the calculated line with a same slope and soft arcing makes interior and surface quality of ingot worse. With the optimized ESR conditions obtained from the present study, a 1000mm diameter (20 tons) and 2200mm diameter (120ton) 9CrMoCoB steel of the ESR ingot were successfully manufactured with good homogeneity by the ESR process.

  16. Regional Homogeneity of Resting-State Brain Activity Suppresses the Effect of Dopamine-Related Genes on Sensory Processing Sensitivity

    PubMed Central

    Chen, Chuansheng; Moyzis, Robert; Xia, Mingrui; He, Yong; Xue, Gui; Li, Jin; He, Qinghua; Lei, Xuemei; Wang, Yunxin; Liu, Bin; Chen, Wen; Zhu, Bi; Dong, Qi

    2015-01-01

    Sensory processing sensitivity (SPS) is an intrinsic personality trait whose genetic and neural bases have recently been studied. The current study used a neural mediation model to explore whether resting-state brain functions mediated the effects of dopamine-related genes on SPS. 298 healthy Chinese college students (96 males, mean age = 20.42 years, SD = 0.89) were scanned with magnetic resonance imaging during resting state, genotyped for 98 loci within the dopamine system, and administered the Highly Sensitive Person Scale. We extracted a “gene score” that summarized the genetic variations representing the 10 loci that were significantly linked to SPS, and then used path analysis to search for brain regions whose resting-state data would help explain the gene-behavior association. Mediation analysis revealed that temporal homogeneity of regional spontaneous activity (ReHo) in the precuneus actually suppressed the effect of dopamine-related genes on SPS. The path model explained 16% of the variance of SPS. This study represents the first attempt at using a multi-gene voxel-based neural mediation model to explore the complex relations among genes, brain, and personality. PMID:26308205

  17. Waste container weighing data processing to create reliable information of household waste generation.

    PubMed

    Korhonen, Pirjo; Kaila, Juha

    2015-05-01

    Household mixed-waste container weighing data were processed by knowledge discovery and data mining techniques to create reliable information on household waste generation. The final data set included 27,865 weight measurements covering the whole of 2013 and was selected from a database of the Helsinki Region Environmental Services Authority, Finland. The data set contains mixed household waste arising in 6 m³ containers and was processed by identifying missing values and inconsistently low and high values as errors. The share of missing values and errors in the data set was 0.6%, which provides evidence that the waste weighing data give reliable information on mixed waste generation at the collection point level. Mixed household waste arising at the collection point level is characterized by wide variation between pickups. The seasonal variation pattern, a result of collective similarities in household behaviour, was clearly detected by smoothed medians of the waste weight time series. Evaluation of the collection time series against the defined distribution range of pickup weights at the collection point level shows that 65% of the pickups were from collection points with optimally dimensioned container capacity, while collection points with over- and under-dimensioned container capacities accounted for 9.5% and 3.4% of all pickups, respectively. Occasional extra waste in containers occurred in 21.2% of the pickups, indicating the irregular behaviour of individual households. The results of this analysis show that processing waste weighing data using knowledge discovery and data mining techniques provides trustworthy information on household waste generation and its variations. PMID:25765610
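
    A hedged sketch of the cleaning and smoothing steps described above; the column names, file name, and plausibility bounds are assumptions, not the study's actual values.

        import pandas as pd

        df = pd.read_csv("pickups_2013.csv", parse_dates=["pickup_date"])

        # Flag missing values and inconsistently low/high weights as errors.
        low, high = 5.0, 2000.0                  # assumed plausible kg range
        bad = df["weight_kg"].isna() | ~df["weight_kg"].between(low, high)
        print("share of errors:", bad.mean())    # the study reports 0.6%

        # Smoothed medians expose the seasonal variation pattern.
        clean = df.loc[~bad].sort_values("pickup_date").set_index("pickup_date")
        seasonal = clean["weight_kg"].rolling("28D").median()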

  18. Five Important Lessons I Learned during the Process of Creating New Child Care Centers

    ERIC Educational Resources Information Center

    Whitehead, R. Ann

    2005-01-01

    In this article, the author describes her experiences of developing new child care sites and offers five important lessons that she learned through her experiences which helped her to create successful child care centers. These lessons include: (1) Finding an appropriate area and location; (2) Creating realistic financial projections based on real…

  19. Design Process for Online Websites Created for Teaching Turkish as a Foreign Language in Web Based Environments

    ERIC Educational Resources Information Center

    Türker, Fatih Mehmet

    2016-01-01

    In today's world, where online learning environments have increased their efficiency in education and training, the design of the websites prepared for education and training purposes has become an important process. This study is about the teaching process of the online learning environments created to teach Turkish in web based environments, and…

  20. Mechanical homogenization increases bacterial homogeneity in sputum.

    PubMed

    Stokell, Joshua R; Khan, Ammad; Steck, Todd R

    2014-07-01

    Sputum obtained from patients with cystic fibrosis (CF) is highly viscous and often heterogeneous in bacterial distribution. Adding dithiothreitol (DTT) is the standard method for liquefaction prior to processing sputum for molecular detection assays. To determine if DTT treatment homogenizes the bacterial distribution within sputum, we measured the difference in mean total bacterial abundance and abundance of Burkholderia multivorans between aliquots of DTT-treated sputum samples with and without a mechanical homogenization (MH) step using a high-speed dispersing element. Additionally, we measured the effect of MH on bacterial abundance. We found a significant difference between the mean bacterial abundances in aliquots that were subjected to only DTT treatment and those of the aliquots which included an MH step (all bacteria, P = 0.04; B. multivorans, P = 0.05). There was no significant effect of MH on bacterial abundance in sputum. Although our results are from a single CF patient, they indicate that mechanical homogenization increases the homogeneity of bacteria in sputum. PMID:24759710

  1. Detailed homogeneous abundance studies of 14 Galactic s-process enriched post-AGB stars: In search of lead (Pb)

    NASA Astrophysics Data System (ADS)

    De Smedt, K.; Van Winckel, H.; Kamath, D.; Siess, L.; Goriely, S.; Karakas, A. I.; Manick, R.

    2016-03-01

    Context. This paper is part of a larger project in which we systematically study the chemical abundances of Galactic and extragalactic post-asymptotic giant branch (post-AGB) stars. The goal at large is to provide improved observational constraints to the models of the complex interplay between the AGB s-process nucleosynthesis and the associated mixing processes. Aims: Lead (Pb) is the final product of the s-process nucleosynthesis and is predicted to have large overabundances with respect to other s-process elements in AGB stars of low metallicities. However, Pb abundance studies of s-process enriched post-AGB stars in the Magellanic Clouds show a discrepancy between observed and predicted Pb abundances. The determined upper limits based on spectral studies are much lower than what is predicted. In this paper, we focus specifically on the Pb abundance of 14 Galactic s-process enhanced post-AGB stars to check whether the same discrepancy is present in the Galaxy as well. Among these 14 objects, two were not yet subject to a detailed abundance study in the literature. We apply the same method to obtain accurate abundances for the 12 others. Our homogeneous abundance results provide the input of detailed spectral synthesis computations in the spectral regions where Pb lines are located. Methods: We used high-resolution UVES and HERMES spectra for detailed spectral abundance studies of our sample of Galactic post-AGB stars. None of the sample stars display clear Pb lines, and we only deduced upper limits of the Pb abundance by using spectrum synthesis in the spectral ranges of the strongest Pb lines. Results: We do not find any clear evidence of Pb overabundances in our sample. The derived upper limits are strongly correlated with the effective temperature of the stars with increasing upper limits for increasing effective temperatures. We obtain stronger Pb constraints on the cooler objects. Moreover, we confirm the s-process enrichment and carbon enhancement of two

  3. Using the "New Planning for Results" Process To Create Local Standards of Library Service.

    ERIC Educational Resources Information Center

    Kotch, Marianne

    2002-01-01

    Discusses "The New Planning for Results" manual published by the American Library Association that helps create local standards of public library service, and provides implementation examples based on experiences in Vermont. Highlights include evaluating community needs; service responses to those needs; developing library objectives; and…

  4. Creating Joint Attentional Frames and Pointing to Evidence in the Reading and Writing Process

    ERIC Educational Resources Information Center

    Unger, John A.; Liu, Rong; Scullion, Vicki A.

    2015-01-01

    This theory-into-practice paper integrates Tomasello's concept of Joint Attentional Frames and well-known ideas related to the work of Russian psychologist, Lev Vygotsky, with more recent ideas from social semiotics. Classroom procedures for incorporating student-created Joint Attentional Frames into literacy lessons are explained by links to…

  5. Thermomechanical process optimization of U-10wt% Mo - Part 2: The effect of homogenization on the mechanical properties and microstructure

    NASA Astrophysics Data System (ADS)

    Joshi, Vineet V.; Nyberg, Eric A.; Lavender, Curt A.; Paxton, Dean; Burkes, Douglas E.

    2015-10-01

    In the first part of this series, it was determined that as-cast U-10Mo had a dendritic microstructure with chemical inhomogeneity and underwent eutectoid transformation during hot compression testing. In the present (second) part of the work, the as-cast samples were heat treated at several temperatures and times to homogenize the Mo content. Like the as-cast material, the "homogenized" materials were then tested under compression between 500 and 800 °C. The as-cast samples and those treated at 800 °C for 24 h had grain sizes of 25-30 μm, whereas those treated at 1000 °C for 16 h had grain sizes around 250 μm before testing. Compression testing determined that the heat treatment affected the mechanical properties and the precipitation of the lamellar phase at sub-eutectoid temperatures.

  6. Orthogonality Measurement for Homogenous Projects-Bases

    ERIC Educational Resources Information Center

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  7. It's Who You Know "and" What You Know: The Process of Creating Partnerships between Schools and Communities

    ERIC Educational Resources Information Center

    Hands, Catherine

    2005-01-01

    Based on qualitative research, this article aims to clarify the process of creating school-community partnerships. Two secondary schools with numerous partnerships were selected within a southern Ontario school board characterized by economic and cultural diversity. Drawing on the within- and cross-case analyses of documents, observations, and 25…

  8. We're Born to Learn: Using the Brain's Natural Learning Process to Create Today's Curriculum. Second Edition

    ERIC Educational Resources Information Center

    Smilkstein, Rita

    2011-01-01

    This updated edition of the bestselling book on the brain's natural learning process brings new research results and applications in a power-packed teacher tool kit. Rita Smilkstein shows teachers how to create and deliver curricula that help students become the motivated, successful, and natural learners they were born to be. Updated features…

  9. Method of removing the effects of electrical shorts and shunts created during the fabrication process of a solar cell

    DOEpatents

    Nostrand, Gerald E.; Hanak, Joseph J.

    1979-01-01

    A method of removing the effects of electrical shorts and shunts created during the fabrication process, and of improving the performance of a solar cell having a thick-film cermet electrode opposite the incident surface, by applying a reverse bias voltage of sufficient magnitude to burn out the electrical shorts and shunts but less than the breakdown voltage of the solar cell.

  10. Atomic processes in plasmas created by an ultra-short laser pulse

    NASA Astrophysics Data System (ADS)

    Audebert, P.; Lecherbourg, L.; Bastiani-Ceccotti, S.; Geindre, J.-P.; Blancard, C.; Cossé, P.; Faussurier, G.; Shepherd, R.; Renaudin, P.

    2008-05-01

    Point-projection K-shell absorption spectroscopy has been used to measure absorption spectra of a transient aluminum plasma created by an ultra-short laser pulse. The 1s-2p and 1s-3p absorption lines of weakly ionized aluminum were measured over an extended range of densities in a relatively low-temperature regime. Independent plasma characterization was obtained from a frequency-domain interferometry (FDI) diagnostic and allowed the interpretation of the absorption spectra in terms of spectral opacities. The experimental spectra are compared with opacity calculations using the density and temperature inferred from the analysis of the FDI data.

  11. All varieties of encoding variability are not created equal: Separating variable processing from variable tasks

    PubMed Central

    Huff, Mark J.; Bodner, Glen E.

    2014-01-01

    Whether encoding variability facilitates memory is shown to depend on whether item-specific and relational processing are both performed across study blocks, and whether study items are weakly versus strongly related. Variable-processing groups studied a word list once using an item-specific task and once using a relational task. Variable-task groups’ two different study tasks recruited the same type of processing each block. Repeated-task groups performed the same study task each block. Recall and recognition were greatest in the variable-processing group, but only with weakly related lists. A variable-processing benefit was also found when task-based processing and list-type processing were complementary (e.g., item-specific processing of a related list) rather than redundant (e.g., relational processing of a related list). That performing both item-specific and relational processing across trials, or within a trial, yields encoding-variability benefits may help reconcile decades of contradictory findings in this area. PMID:25018583

  12. Simulation of the Vapor Intrusion Process for Non-Homogeneous Soils Using a Three-Dimensional Numerical Model

    PubMed Central

    Bozkurt, Ozgur; Pennell, Kelly G.; Suuberg, Eric M.

    2010-01-01

    This paper presents model simulation results of vapor intrusion into structures built atop sites contaminated with volatile or semi-volatile chemicals of concern. A three-dimensional finite element model was used to investigate the importance of factors that could influence vapor intrusion when the site is characterized by non-homogeneous soils. Model simulations were performed to examine how soil layers of differing properties alter soil gas concentration profiles and vapor intrusion rates into structures. The results illustrate difference in soil gas concentration profiles and vapor intrusion rates between homogeneous and layered soils. The findings support the need for site conceptual models to adequately represent the site’s geology when conducting site characterizations, interpreting field data and assessing the risk of vapor intrusion at a given site. For instance, in layered geologies, a lower permeability and diffusivity soil layer between the source and building often limits vapor intrusion rates, even if a higher permeability layer near the foundation permits increased soil gas flow rates into the building. In addition, the presence of water-saturated clay layers can considerably influence soil gas concentration profiles. Therefore, interpreting field data without accounting for clay layers in the site conceptual model could result in inaccurate risk calculations. Important considerations for developing more accurate conceptual site models are discussed in light of the findings. PMID:20664816

  13. Numerical Simulation of Crater Creating Process in Dynamic Replacement Method by Smooth Particle Hydrodynamics

    NASA Astrophysics Data System (ADS)

    Danilewicz, Andrzej; Sikora, Zbigniew

    2015-02-01

    A theoretical basis of the SPH method, including the governing equations, a discussion of the importance of the smoothing length, the contact formulation, boundary treatment and, finally, utilization in hydrocode simulations is presented. An application of SPH to a real case of large penetrations (crater creation) into soil caused by a falling mass in the Dynamic Replacement Method is discussed, the influence of particle spacing on the accuracy of the method is presented, and an example calculated with the LS-DYNA software is examined. The chronological development of Smoothed Particle Hydrodynamics is outlined, and the stability and consistency of the SPH formulation, artificial viscosity and boundary treatment are discussed. Time-integration techniques with stability conditions, SPH+FEM coupling, the constitutive equation and the equation of state (EOS) are presented as well.
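
    The building blocks named above can be made concrete with a short Python sketch of textbook SPH: the standard cubic-spline smoothing kernel and the summation density. This is generic SPH, not the paper's LS-DYNA model; the particle spacing, smoothing length and masses are illustrative.

      import numpy as np

      def cubic_spline_kernel(r, h):
          """Standard 3-D cubic-spline smoothing kernel W(r, h)."""
          q = r / h
          sigma = 1.0 / (np.pi * h**3)  # 3-D normalization constant
          if q <= 1.0:
              return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
          if q <= 2.0:
              return sigma * 0.25 * (2.0 - q)**3
          return 0.0  # compact support: W = 0 beyond 2h

      def summation_density(positions, masses, h):
          """rho_i = sum_j m_j W(|r_i - r_j|, h)."""
          rho = np.zeros(len(positions))
          for i, r_i in enumerate(positions):
              for r_j, m_j in zip(positions, masses):
                  rho[i] += m_j * cubic_spline_kernel(np.linalg.norm(r_i - r_j), h)
          return rho

      # Particles on a small regular grid; as the abstract notes, particle
      # spacing controls accuracy.  h is typically ~1.3x the spacing.
      dx = 0.1
      pts = dx * np.array([[x, y, z] for x in range(5)
                           for y in range(5) for z in range(5)], dtype=float)
      masses = np.full(len(pts), 1000.0 * dx**3)  # target density 1000 kg/m^3
      print(summation_density(pts, masses, h=1.3 * dx).mean())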

  14. Chemically Patterned Inverse Opal Created by a Selective Photolysis Modification Process.

    PubMed

    Tian, Tian; Gao, Ning; Gu, Chen; Li, Jian; Wang, Hui; Lan, Yue; Yin, Xianpeng; Li, Guangtao

    2015-09-01

    Anisotropic photonic crystal materials have long been pursued for their broad applications. A novel method for creating chemically patterned inverse opals is proposed here. The patterning technique is based on selective photolysis of a photolabile polymer together with postmodification on released amine groups. The patterning method allows regioselective modification within an inverse opal structure, taking advantage of selective chemical reaction. Moreover, combined with the unique signal self-reporting feature of the photonic crystal, the fabricated structure is capable of various applications, including gradient photonic bandgap and dynamic chemical patterns. The proposed method provides the ability to extend the structural and chemical complexity of the photonic crystal, as well as its potential applications. PMID:26269453

  15. Rethinking Communication in Innovation Processes: Creating Space for Change in Complex Systems

    ERIC Educational Resources Information Center

    Leeuwis, Cees; Aarts, Noelle

    2011-01-01

    This paper systematically rethinks the role of communication in innovation processes, starting from largely separate theoretical developments in communication science and innovation studies. Literature review forms the basis of the arguments presented. The paper concludes that innovation is a collective process that involves the contextual…

  16. Creating Trauma-Informed Child Welfare Systems Using a Community Assessment Process

    ERIC Educational Resources Information Center

    Hendricks, Alison; Conradi, Lisa; Wilson, Charles

    2011-01-01

    This article describes a community assessment process designed to evaluate a specific child welfare jurisdiction based on the current definition of trauma-informed child welfare and its essential elements. This process has recently been developed and pilot tested within three diverse child welfare systems in the United States. The purpose of the…

  17. The Process of Inclusion and Accommodation: Creating Accessible Groups for Individuals with Disabilities.

    ERIC Educational Resources Information Center

    Patterson, Jeanne Boland; And Others

    1995-01-01

    Supports the important work of group counselors by focusing on the inclusion of individuals with disabilities in nondisability specific groups and addressing disability myths, disability etiquette, architectural accessibility, and group process issues. (LKS)

  18. BrainK for Structural Image Processing: Creating Electrical Models of the Human Head

    PubMed Central

    Li, Kai; Papademetris, Xenophon; Tucker, Don M.

    2016-01-01

    BrainK is a set of automated procedures for characterizing the tissues of the human head from MRI, CT, and photogrammetry images. The tissue segmentation and cortical surface extraction support the primary goal of modeling the propagation of electrical currents through head tissues with a finite difference model (FDM) or finite element model (FEM) created from the BrainK geometries. The electrical head model is necessary for accurate source localization of dense array electroencephalographic (dEEG) measures from head surface electrodes. It is also necessary for accurate targeting of cerebral structures with transcranial current injection from those surface electrodes. BrainK must achieve five major tasks: image segmentation, registration of the MRI, CT, and sensor photogrammetry images, cortical surface reconstruction, dipole tessellation of the cortical surface, and Talairach transformation. We describe the approach to each task, and we compare the accuracies for the key tasks of tissue segmentation and cortical surface extraction in relation to existing research tools (FreeSurfer, FSL, SPM, and BrainVisa). BrainK achieves good accuracy with minimal or no user intervention, it deals well with poor quality MR images and tissue abnormalities, and it provides improved computational efficiency over existing research packages. PMID:27293419
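
    As a toy illustration of the kind of computation an FDM electrical head model performs (this is not BrainK's solver; the grid, conductivities and electrode placement below are invented), the following Python sketch relaxes the current-conservation equation div(sigma grad V) = 0 on a small 2-D grid containing a low-conductivity "skull" layer:

      import numpy as np

      # Jacobi relaxation of div(sigma * grad V) = 0 on a 2-D grid.
      n = 32
      sigma = np.full((n, n), 0.33)   # "brain-like" conductivity, S/m
      sigma[12:16, :] = 0.01          # low-conductivity "skull" layer

      V = np.zeros((n, n))
      V[0, :], V[-1, :] = 1.0, -1.0   # injection/return electrode rows
      # (side columns are simply held at 0 in this toy example)

      # Face conductivities: arithmetic means of neighboring cells.
      w_n = 0.5 * (sigma[1:-1, 1:-1] + sigma[:-2, 1:-1])
      w_s = 0.5 * (sigma[1:-1, 1:-1] + sigma[2:, 1:-1])
      w_w = 0.5 * (sigma[1:-1, 1:-1] + sigma[1:-1, :-2])
      w_e = 0.5 * (sigma[1:-1, 1:-1] + sigma[1:-1, 2:])

      for _ in range(5000):
          num = (w_n * V[:-2, 1:-1] + w_s * V[2:, 1:-1]
                 + w_w * V[1:-1, :-2] + w_e * V[1:-1, 2:])
          V[1:-1, 1:-1] = num / (w_n + w_s + w_w + w_e)

      # Most of the potential drop occurs across the resistive layer.
      print(V[10, n // 2] - V[17, n // 2])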

  19. Creating Low Vision and Nonvisual Instructions for Diabetes Technology: An Empirically Validated Process

    PubMed Central

    Williams, Ann S.

    2012-01-01

    Introduction Nearly 20% of the adults with diagnosed diabetes in the United States also have visual impairment. Many individuals in this group perform routine diabetes self-management tasks independently, often using technology that was not specifically designed for use by people with visual impairment (e.g., insulin pumps and pens). Equitable care for persons with disabilities requires providing instructions in formats accessible for nonreaders. However, instructions in accessible formats, such as recordings, braille, or digital documents that are legible to screen readers, are seldom available. Method This article includes a summary of existing guidelines for creating accessible documents. The guidelines are followed by a description of the production of accessible nonvisual instructions for use of insulin pens used in a study of dosing accuracy. The study results indicate that the instructions were used successfully by 40 persons with visual impairment. Discussion and Conclusions Instructions in accessible formats can increase access to the benefits of diabetes technology for persons with visual impairment. Recorded instructions may also be useful to sighted persons who do not read well, such as those with dyslexia, low literacy, or who use English as a second language. Finally, they may have important benefits for fully sighted people who find it easier to learn to use technology by handling the equipment while listening to instructions. Manufacturers may also benefit from marketing to an increased pool of potential users. PMID:22538133

  1. Creating aging-enriched social work education: a process of curricular and organizational change.

    PubMed

    Hooyman, Nancy; St Peter, Suzanne

    2006-01-01

    The CSWE Geriatric Enrichment in Social Work Education Project, funded by the John A. Hartford Foundation, aimed to change curricula and organizational structure in 67 GeroRich projects so that all students would graduate with foundation knowledge and skills to work effectively with older adults and their families. The emphasis was on change processes to infuse and sustain gerontological competencies and curricular resources in foundation courses. This article presents lessons learned and strategies for engaging faculty, practitioners and students in the curriculum and organizational change process. PMID:17200068

  2. Study of stirred layers on 316L steel created by friction stir processing

    NASA Astrophysics Data System (ADS)

    Langlade, C.; Roman, A.; Schlegel, D.; Gete, E.; Folea, M.

    2014-08-01

    Nanostructured materials are known to exhibit attractive properties, especially in the mechanical field where high hardness is of great interest. The friction stir process (FSP) is a recent surface engineering technique derived from the friction stir welding method (FSW). In this study, the FSP of a 316L austenitic stainless steel has been evaluated. The treated layers have been characterized in terms of hardness and microstructure, and these results have been related to the FSP operational parameters. The process has been analysed using a Response Surface Method (RSM) to enable prediction of the stirred-layer thickness.

  3. Not All Analogies Are Created Equal: Associative and Categorical Analogy Processing following Brain Damage

    ERIC Educational Resources Information Center

    Schmidt, Gwenda L.; Cardillo, Eileen R.; Kranjec, Alexander; Lehet, Matthew; Widick, Page; Chatterjee, Anjan

    2012-01-01

    Current research on analogy processing assumes that different conceptual relations are treated similarly. However, just as words and concepts are related in distinct ways, different kinds of analogies may employ distinct types of relationships. An important distinction in how words are related is the difference between associative (dog-bone) and…

  4. Creating Sustainable Education Projects in Roatán, Honduras through Continuous Process Improvement

    ERIC Educational Resources Information Center

    Raven, Arjan; Randolph, Adriane B.; Heil, Shelli

    2010-01-01

    The investigators worked together with permanent residents of Roatán, Honduras on sustainable initiatives to help improve the island's troubled educational programs. Our initiatives focused on increasing the number of students eligible and likely to attend a university. Using a methodology based in continuous process improvement, we developed…

  5. Feasibility study for producing a carrot/potato matrix reference material for 11 selected pesticides at EU MRL level: material processing, homogeneity and stability assessment.

    PubMed

    Saldanha, Helena; Sejerøe-Olsen, Berit; Ulberth, Franz; Emons, Hendrik; Zeleny, Reinhard

    2012-05-01

    The feasibility of producing a matrix reference material for selected pesticides in a carrot/potato matrix was investigated. A commercially available baby food (carrot/potato-based mash) was spiked with 11 pesticides at the respective EU maximum residue limits (MRLs), and further processed by either freezing or freeze-drying. Batches of some 150 units were produced per material type. First, the materials were assessed for the relative amount of pesticide recovered after processing (ratio of pesticide concentration in the processed material to the initially spiked pesticide concentration). In addition, the materials' homogeneity (bottle-to-bottle variation) and the short-term (1 month) and mid-term (5 months) stability at different temperatures were assessed. For this, an in-house validated GC-EI-MS method operated in the SIM mode with a sample preparation procedure based on the QuEChERS ("quick, easy, cheap, effective, rugged, and safe") principle was applied. Measurements on the frozen material provided the most promising results (smallest analyte losses during production), and freeze-drying also proved to be a suitable alternative processing technique for most of the investigated pesticides. Both the frozen and the freeze-dried material proved to be sufficiently homogeneous for the intended use, and storage at -20°C for 5 months did not reveal any detectable material degradation. The results constitute an important step towards the development of a pesticide matrix reference material. PMID:26434333
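
    The bottle-to-bottle homogeneity assessment mentioned above is conventionally based on a one-way analysis of variance, with the between-bottle standard deviation estimated from the between- and within-bottle mean squares (the ISO Guide 35 approach). The Python sketch below shows that standard calculation on invented replicate data; it does not reproduce the authors' dataset or software.

      import numpy as np

      # rows = bottles, columns = replicate measurements (mg/kg); invented.
      data = np.array([
          [0.101, 0.103],
          [0.099, 0.100],
          [0.102, 0.104],
          [0.098, 0.101],
          [0.100, 0.102],
      ])
      a, n = data.shape                     # bottles, replicates per bottle
      grand_mean = data.mean()
      bottle_means = data.mean(axis=1)

      ms_between = n * ((bottle_means - grand_mean) ** 2).sum() / (a - 1)
      ms_within = ((data - bottle_means[:, None]) ** 2).sum() / (a * (n - 1))

      # Between-bottle SD, clamped at 0 if MS_within exceeds MS_between.
      s_bb = np.sqrt(max(ms_between - ms_within, 0.0) / n)
      print(f"s_bb = {s_bb:.2e} mg/kg ({100 * s_bb / grand_mean:.2f}% of the mean)")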

  6. Near InfraRed Spectroscopy homogeneity evaluation of complex powder blends in a small-scale pharmaceutical preformulation process, a real-life application.

    PubMed

    Storme-Paris, I; Clarot, I; Esposito, S; Chaumeil, J C; Nicolas, A; Brion, F; Rieutord, A; Chaminade, P

    2009-05-01

    Near InfraRed Spectroscopy (NIRS) is a potentially powerful tool for assessing the homogeneity of industrial powder blends. In the particular context of hospital manufacturing, we considered the introduction of the technique at a small pharmaceutical process scale, with the objective of following blend homogeneity in mixtures of seven components. This article investigates the performance of various NIRS-based methodologies to assess powder blending. The formulation studied is prescribed in a haematology unit as part of the treatment for digestive decontamination in children receiving stem-cell transplantation. It is composed of the active pharmaceutical ingredients (APIs) colimycin and tobramycin and five excipients. We evaluated 39 different blends comprising 14 different formulations, with uncorrelated proportions of constituents between these 14 formulations. The reference methods used to establish the NIRS models were gravimetry and a High Performance Liquid Chromatography method coupled to Evaporative Light Scattering Detection. Unsupervised and supervised qualitative and quantitative chemometric methods were performed to assess powder blend homogeneity using a bench top instrument equipped with an optical fibre. For qualitative evaluations, unsupervised Moving Block Standard Deviation, autocorrelation functions and Partial Least Squares Discriminant Analysis (PLS-DA) were used. For quantitative evaluations, Partial Least Squares cross-validated models were chosen. Results are expressed as API and major-excipient percentages of theoretical values as a function of blending time. The 14 different formulations were only satisfactorily discriminated by supervised algorithms, such as an optimised PLS-DA model. The homogeneity state was demonstrated after 16 min of blending, quantifying three components with a precision between 1.2% and 1.4% w/w. This study demonstrates, for the first time, the effective implementation of NIRS for blend homogeneity evaluation, as
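
    Of the unsupervised qualitative methods named above, the Moving Block Standard Deviation is the simplest: the standard deviation over a sliding block of consecutive spectra, averaged across wavelengths, falls to a low plateau once the blend is homogeneous. The following Python sketch runs the calculation on synthetic spectra; the block size and noise model are illustrative, not the authors' settings.

      import numpy as np

      def moving_block_std(spectra, block=10):
          """Mean (over wavelengths) SD of each block of consecutive spectra."""
          out = []
          for start in range(spectra.shape[0] - block + 1):
              out.append(spectra[start:start + block].std(axis=0, ddof=1).mean())
          return np.array(out)

      # Synthetic spectra that converge toward a common profile over time.
      rng = np.random.default_rng(0)
      target = rng.random(200)
      spectra = np.array([target + rng.normal(0.0, 1.0 / (1 + t), 200)
                          for t in range(60)])

      curve = moving_block_std(spectra)
      print(curve[0], curve[-1])  # high early on, near the noise floor at the end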

  7. Dynamic Disturbance Processes Create Dynamic Lek Site Selection in a Prairie Grouse.

    PubMed

    Hovick, Torre J; Allred, Brady W; Elmore, R Dwayne; Fuhlendorf, Samuel D; Hamilton, Robert G; Breland, Amber

    2015-01-01

    It is well understood that landscape processes can affect habitat selection patterns, movements, and species persistence. These selection patterns may be altered or even eliminated as a result of changes in disturbance regimes and a concomitant management focus on uniform, moderate disturbance across landscapes. To assess how restored landscape heterogeneity influences habitat selection patterns, we examined 21 years (1991, 1993-2012) of Greater Prairie-Chicken (Tympanuchus cupido) lek location data in tallgrass prairie with restored fire and grazing processes. Our study took place at The Nature Conservancy's Tallgrass Prairie Preserve located at the southern extent of Flint Hills in northeastern Oklahoma. We specifically addressed stability of lek locations in the context of the fire-grazing interaction, and the environmental factors influencing lek locations. We found that lek locations were dynamic in a landscape with interacting fire and grazing. While previous conservation efforts have treated leks as stable with high site fidelity in static landscapes, a majority of lek locations in our study (i.e., 65%) moved by nearly one kilometer on an annual basis in this dynamic setting. Lek sites were in elevated areas with low tree cover and low road density. Additionally, lek site selection was influenced by an interaction of fire and patch edge, indicating that in recently burned patches, leks were located near patch edges. These results suggest that dynamic and interactive processes such as fire and grazing that restore heterogeneity to grasslands do influence habitat selection patterns in prairie grouse, a phenomenon that is likely to apply throughout the Greater Prairie-Chicken's distribution when dynamic processes are restored. As conservation moves toward restoring dynamic historic disturbance patterns, it will be important that siting and planning of anthropogenic structures (e.g., wind energy, oil and gas) and management plans not view lek locations as static

  9. ArhiNet - A Knowledge-Based System for Creating, Processing and Retrieving Archival eContent

    NASA Astrophysics Data System (ADS)

    Salomie, Ioan; Dinsoreanu, Mihaela; Pop, Cristina; Suciu, Sorin

    This paper addresses the problem of creating, processing and querying semantically enhanced eContent from archives and digital libraries. We present an analysis of the archival domain, resulting in the creation of an archival domain model and of a domain ontology core. Our system adds semantic mark-up to the content of historical documents, thus enabling document and knowledge retrieval in response to ontology-guided natural language queries. The system functionality follows two main workflows: (i) semantically enhanced eContent generation and knowledge acquisition and (ii) knowledge processing and retrieval. Within the first workflow, the relevant domain information is extracted from documents written in natural languages, followed by semantic annotation and domain ontology population. In the second workflow, ontologically guided natural language queries trigger reasoning processes that provide relevant search results. The paper also discusses the transformation of the OWL domain ontology into a hierarchical data model, thus providing support for efficient ontology processing.
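
    A generic sketch of the retrieval step in the second workflow, using Python with the rdflib package: a populated OWL ontology is loaded and queried with SPARQL. The file name, namespace and class/property names are hypothetical; ArhiNet's actual ontology and query pipeline are not reproduced here.

      from rdflib import Graph

      g = Graph()
      g.parse("archival_ontology.owl", format="xml")  # hypothetical populated ontology

      results = g.query("""
          PREFIX arh: <http://example.org/archive#>
          SELECT ?doc ?title
          WHERE {
              ?doc a arh:HistoricalDocument ;
                   arh:hasTitle ?title ;
                   arh:mentionsPerson ?person .
              ?person arh:hasName "Example Person" .
          }
      """)
      for doc, title in results:
          print(doc, title)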

  10. Creating a process for incorporating epidemiological modelling into outbreak management decisions.

    PubMed

    Akselrod, Hana; Mercon, Monica; Kirkeby Risoe, Petter; Schlegelmilch, Jeffrey; McGovern, Joanne; Bogucki, Sandy

    2012-01-01

    Modern computational models of infectious diseases greatly enhance our ability to understand new infectious threats and assess the effects of different interventions. The recently-released CDC Framework for Preventing Infectious Diseases calls for increased use of predictive modelling of epidemic emergence for public health preparedness. Currently, the utility of these technologies in preparedness and response to outbreaks is limited by gaps between modelling output and information requirements for incident management. The authors propose an operational structure that will facilitate integration of modelling capabilities into action planning for outbreak management, using the Incident Command System (ICS) and Synchronization Matrix framework. It is designed to be adaptable and scalable for use by state and local planners under the National Response Framework (NRF) and Emergency Support Function #8 (ESF-8). Specific epidemiological modelling requirements are described, and integrated with the core processes for public health emergency decision support. These methods can be used in checklist format to align prospective or real-time modelling output with anticipated decision points, and guide strategic situational assessments at the community level. It is anticipated that formalising these processes will facilitate translation of the CDC's policy guidance from theory to practice during public health emergencies involving infectious outbreaks. PMID:22948107

  11. Description of the process used to create the 1992 Hanford Mortality Study database

    SciTech Connect

    Gilbert, E. S.; Buchanan, J. A.; Holter, N. A.

    1992-12-01

    An updated and expanded database for the Hanford Mortality Study has been developed by PNL's Epidemiology and Biometry Department. The purpose of this report is to document this process. The primary sources of data were the Occupational Health History (OHH) files, maintained by the Hanford Environmental Health Foundation (HEHF), which include demographic data and job histories; the Hanford Mortality (HMO) files, also maintained by HEHF, which include information on deaths of Hanford workers; the Occupational Radiation Exposure (ORE) files, maintained by PNL's Health Physics Department, which contain external dosimetry data; and a file of workers with confirmed internal depositions of radionuclides, also maintained by PNL's Health Physics Department. This report describes each of these files in detail, and also describes the many edits that were performed to address the consistency and accuracy of data within and between these files.

  13. Climate for Learning: A Symposium. Creating a Climate for Learning, and the Humanizing Process. The Principal and School Discipline. Curriculum Bulletin Vol. XXXII, No. 341.

    ERIC Educational Resources Information Center

    Johnson, Simon O.; Chaky, June

    This publication contains two articles focusing on creating a climate for learning. In "Creating a Climate for Learning, and the Humanizing Process," Simon O. Johnson offers practical suggestions for creating a humanistic learning environment. The author begins by defining the basic concepts--humanism, affective education, affective situation,…

  14. Integrated assessment of emerging science and technologies as creating learning processes among assessment communities.

    PubMed

    Forsberg, Ellen-Marie; Ribeiro, Barbara; Heyen, Nils B; Nielsen, Rasmus Øjvind; Thorstensen, Erik; de Bakker, Erik; Klüver, Lars; Reiss, Thomas; Beekman, Volkert; Millar, Kate

    2016-12-01

    Emerging science and technologies are often characterised by complexity, uncertainty and controversy. Regulation and governance of such scientific and technological developments needs to build on knowledge and evidence that reflect this complicated situation. This insight is sometimes formulated as a call for integrated assessment of emerging science and technologies, and such a call is analysed in this article. The article addresses two overall questions. The first is: to what extent are emerging science and technologies currently assessed in an integrated way? The second is: if there appears to be a need for further integration, what should such integration consist in? In the article we briefly outline the pedigree of the term 'integrated assessment' and present a number of interpretations of the concept that are useful for informing current analyses and discussions of integration in assessment. Based on four case studies of assessment of emerging science and technologies, studies of assessment traditions, literature analysis and dialogues with assessment professionals, currently under-developed integration dimensions are identified. It is suggested how these dimensions can be addressed in a practical approach to assessment where representatives of different assessment communities and stakeholders are involved. We call this approach the Trans Domain Technology Evaluation Process (TranSTEP). PMID:27465504

  15. Not all analogies are created equal: Associative and categorical analogy processing following brain damage

    PubMed Central

    Schmidt, Gwenda L.; Cardillo, Eileen R.; Kranjec, Alexander; Lehet, Matthew; Widick, Page; Chatterjee, Anjan

    2012-01-01

    Current research on analogy processing assumes that different conceptual relations are treated similarly. However, just as words and concepts are related in distinct ways, different kinds of analogies may employ distinct types of relationships. An important distinction in how words are related is the difference between associative (dog-bone) and categorical (dog-cat) relations. To test the hypothesis that analogical mapping of different types of relations would have different neural instantiations, we tested patients with left and right hemisphere lesions on their ability to understand two types of analogies, ones expressing an associative relationship and others expressing a categorical relationship. Voxel-based lesion-symptom mapping (VLSM) and behavioral analyses revealed that associative analogies relied on a large left-lateralized language network while categorical analogies relied on both left and right hemispheres. The verbal nature of the task could account for the left hemisphere findings. We argue that categorical relations additionally rely on the right hemisphere because they are more difficult, abstract, and fragile; and contain more distant relationships. PMID:22402184

  16. AFRA confronts gender issues: the process of creating a gender strategy.

    PubMed

    Bydawell, M

    1997-02-01

    The Association for Rural Advancement (AFRA), a nongovernmental organization in South Africa affiliated with the National Land Committee (NLC), seeks to redress the legacy of unjust land dispensation during the apartheid period. AFRA is the first organization within NLC to deal openly with issues of race and gender; this process has been conflictual, however. At gender training workshops conducted by White development workers, many staff expressed the view that sexism is an alien Western issue. Moreover, gender sensitivity was interpreted by Black staff as an assault on their race and cultural identity. The staff itself was polarized on racial grounds, with White managers and Black field workers. Staff further expressed concerns that a gender perspective would dilute AFRA's focus on land reform and alienate rural women who want male household heads to continue to hold the title to their land. The organizational structure was reorganized, though, to become more democratic and racially representative. The 1995 appointment of the first field worker assigned to address women's empowerment in both the organization and target communities refueled the controversy, and a gender workshop led by a psychologist was held to build trust and unity. Staff moved toward a shared understanding of gender as an aspect of social differentiation. AFRA has since committed itself to develop an integrated gender strategy sensitive to people's needs and fears. PMID:12320741

  17. Creating Interactive Graphical Overlays in the Advanced Weather Interactive Processing System (AWIPS) Using Shapefiles and DGM Files

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Lafosse, Richard; Hood, Doris; Hoeth, Brian

    2007-01-01

    Graphical overlays can be created in real-time in the Advanced Weather Interactive Processing System (AWIPS) using shapefiles or DARE Graphics Metafile (DGM) files. This presentation describes how to create graphical overlays on-the-fly for AWIPS by using two examples of AWIPS applications that were created by the Applied Meteorology Unit (AMU). The first example is the Anvil Threat Corridor Forecast Tool, which produces a shapefile that depicts a graphical threat corridor of the forecast movement of thunderstorm anvil clouds, based on the observed or forecast upper-level winds. This tool is used by the Spaceflight Meteorology Group (SMG) and 45th Weather Squadron (45 WS) to analyze the threat of natural or space vehicle-triggered lightning over a location. The second example is a launch and landing trajectory tool that produces a DGM file that plots the ground track of space vehicles during launch or landing. The trajectory tool can be used by SMG and the 45 WS forecasters to analyze weather radar imagery along a launch or landing trajectory. Advantages of both file types will be listed.
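
    As an illustration of the shapefile half of this approach, the Python sketch below writes a single-polygon shapefile with the pyshp package. The coordinates and attribute values are invented; the fields and geometry actually produced by the AMU tools are not reproduced here.

      import shapefile  # the pyshp package

      w = shapefile.Writer("anvil_threat_corridor", shapeType=shapefile.POLYGON)
      w.field("NAME", "C", size=40)
      # One ring, closed by repeating the first vertex (lon, lat pairs).
      w.poly([[(-80.70, 28.40), (-80.20, 28.90), (-79.90, 28.70),
               (-80.40, 28.20), (-80.70, 28.40)]])
      w.record("Anvil threat corridor")
      w.close()  # writes the .shp, .shx and .dbf files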

  18. Dimensional Methods: Dimensions, Units and the Principle of Dimensional Homogeneity. Physical Processes in Terrestrial and Aquatic Ecosystems, Applied Mathematics.

    ERIC Educational Resources Information Center

    Fletcher, R. Ian

    These materials were designed to be used by life science students for instruction in the application of physical theory to ecosystem operation. Most modules contain computer programs which are built around a particular application of a physical process. The module is concerned with conventional techniques such as concepts of measurement,…
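
    As a one-line illustration of the principle of dimensional homogeneity that the module covers (the example is ours, not the module's), every term of a physically meaningful equation must carry the same dimensions:

      % Dimensional check of the kinematic relation v^2 = v_0^2 + 2as,
      % with [v] = L T^{-1}, [a] = L T^{-2}, [s] = L:
      \[
        [v^2] = L^2 T^{-2}, \qquad
        [v_0^2] = L^2 T^{-2}, \qquad
        [2as] = (L\,T^{-2})(L) = L^2 T^{-2},
      \]
      % so every term matches; an expression such as v^2 = v_0 + 2as
      % would fail the check and cannot be physically meaningful.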

  19. Degradation Mechanism of Cyanobacterial Toxin Cylindrospermopsin by Hydroxyl Radicals in Homogeneous UV/H2O2 Process

    EPA Science Inventory

    The degradation of cylindrospermopsin (CYN), a widely distributed and highly toxic cyanobacterial toxin (cyanotoxin), remains poorly elucidated. In this study, the mechanism of CYN destruction by UV-254 nm/H2O2 advanced oxidation process (AOP) was investigated by mass spectrometr...

  20. A trapped magnetic field of 3 T in homogeneous, bulk MgB2 superconductors fabricated by a modified precursor infiltration and growth process

    NASA Astrophysics Data System (ADS)

    Bhagurkar, A. G.; Yamamoto, A.; Anguilano, L.; Dennis, A. R.; Durrell, J. H.; Babu, N. Hari; Cardwell, D. A.

    2016-03-01

    The wetting of boron with liquid magnesium is a critical factor in the synthesis of MgB2 bulk superconductors by the infiltration and growth (IG) process. Poor wetting characteristics can therefore potentially result in non-uniform infiltration, the formation of defects in the final sample structure, and poor structural homogeneity throughout the bulk material. Here we report the fabrication of near-net-shaped MgB2 bulk superconductors by a modified precursor infiltration and growth (MPIG) technique. A homogeneous bulk microstructure has subsequently been achieved via the uniform infiltration of Mg liquid by enriching pre-reacted MgB2 powder within the green precursor pellet as a wetting enhancer, leading to relatively little variation in superconducting properties across the entire bulk sample. Almost identical values of trapped magnetic field of 2.12 T have been measured at 5 K at both the top and bottom surfaces of a sample fabricated by the MPIG process, confirming the uniformity of the bulk microstructure. A maximum trapped field of 3 T has been measured at 5 K at the centre of a stack of two bulk MgB2 samples fabricated using this technique. A steady rise in trapped field was observed for this material with decreasing temperature down to 5 K without the occurrence of flux avalanches and with a relatively low field decay rate (1.5%/d). These properties are attributed to the presence of a fine distribution of residual Mg within the bulk microstructure generated by the MPIG processing technique.

  1. On the Importance of Processing Conditions for the Nutritional Characteristics of Homogenized Composite Meals Intended for Infants.

    PubMed

    Östman, Elin; Forslund, Anna; Tareke, Eden; Björck, Inger

    2016-01-01

    The nutritional quality of infant food is an important consideration in the effort to prevent a further increase in the rate of childhood obesity. We hypothesized that the canning of composite infant meals would lead to elevated contents of carboxymethyl-lysine (CML) and favor high glycemic and insulinemic responses compared with milder heat treatment conditions. We have compared composite infant pasta Bolognese meals that were either conventionally canned (CANPBol), or prepared by microwave cooking (MWPBol). A meal where the pasta and Bolognese sauce were separate during microwave cooking (MWP_CANBol) was also included. The infant meals were tested at breakfast in healthy adults using white wheat bread (WWB) as reference. A standardized lunch meal was served at 240 min and blood was collected from fasting to 360 min after breakfast. The 2-h glucose response (iAUC) was lower following the test meals than with WWB. The insulin response was lower after the MWP_CANBol (-47%, p = 0.0000) but markedly higher after CANPBol (+40%, p = 0.0019), compared with WWB. A combined measure of the glucose and insulin responses (ISIcomposite) revealed that MWP_CANBol resulted in 94% better insulin sensitivity than CANPBol. Additionally, the separate processing of the meal components in MWP_CANBol resulted in 39% lower CML levels than the CANPBol. It was therefore concluded that intake of commercially canned composite infant meals leads to reduced postprandial insulin sensitivity and increased exposure to oxidative stress promoting agents. PMID:27271662
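
    The 2-h glucose response (iAUC) reported above is conventionally computed as the trapezoidal area of the glucose curve above the fasting baseline. The Python sketch below shows that calculation on invented data; clipping sub-baseline dips to zero is a common simplification of the standard (Wolever) method and is not necessarily the authors' exact procedure.

      import numpy as np

      def iauc(times_min, conc, baseline=None):
          """Incremental area under the curve above the fasting baseline."""
          t = np.asarray(times_min, dtype=float)
          c = np.asarray(conc, dtype=float)
          base = c[0] if baseline is None else baseline
          above = np.clip(c - base, 0.0, None)  # ignore dips below baseline
          return np.trapz(above, t)

      t = [0, 15, 30, 45, 60, 90, 120]               # minutes after the meal
      glucose = [5.0, 6.1, 7.2, 6.8, 6.0, 5.4, 5.1]  # mmol/L, invented
      print(f"2-h glucose iAUC = {iauc(t, glucose):.1f} mmol*min/L")  # 112.5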

  3. Is cryopreservation a homogeneous process? Ultrastructure and motility of untreated, prefreezing, and postthawed spermatozoa of Diplodus puntazzo (Cetti).

    PubMed

    Taddei, A R; Barbato, F; Abelli, L; Canese, S; Moretti, F; Rana, K J; Fausto, A M; Mazzini, M

    2001-06-01

    This study subdivides the cryopreservation procedure for Diplodus puntazzo spermatozoa into three key phases, fresh, prefreezing (samples equilibrated in cryosolutions), and postthawed stages, and examines the ultrastructural anomalies and motility profiles of spermatozoa in each stage, with different cryodiluents. Two simple cryosolutions were evaluated: 0.17 M sodium chloride containing a final concentration of 15% dimethyl sulfoxide (Me(2)SO) (cryosolution A) and 0.1 M sodium citrate containing a final concentration of 10% Me(2)SO (cryosolution B). Ultrastructural anomalies of the plasmatic and nuclear membranes of the sperm head were common, and the severity of the cryoinjury differed significantly between the pre- and the postfreezing phases and between the two cryosolutions. In spermatozoa diluted with cryosolution A, during the prefreezing phase, the plasmalemma of 61% of the cells was absent or damaged, compared with 24% in the fresh sample (P < 0.001). In spermatozoa diluted with cryosolution B, there was a pronounced increase in the number of cells lacking the head plasmatic membrane from the prefreezing to the postthawed stages (from 32 to 52%, P < 0.01). In both cryosolutions, damage to the nuclear membrane was significantly greater after freezing (cryosolution A: 8 to 23%, P < 0.01; cryosolution B: 5 to 38%, P < 0.001). With cryosolution A, the after-activation motility profile showed a consistent drop from the fresh to the prefreezing stage, whereas freezing and thawing did not affect the motility much further, and 50% of the cells were immotile by 60-90 s after activation. With cryosolution B, only the postthawing stage showed a sharp drop in the motility profile. This study suggests that the different phases of the cryoprocess should be investigated to better understand the process of sperm damage. PMID:11748933

  4. Creating a high-reliability health care system: improving performance on core processes of care at Johns Hopkins Medicine.

    PubMed

    Pronovost, Peter J; Armstrong, C Michael; Demski, Renee; Callender, Tiffany; Winner, Laura; Miller, Marlene R; Austin, J Matthew; Berenholtz, Sean M; Yang, Ting; Peterson, Ronald R; Reitz, Judy A; Bennett, Richard G; Broccolino, Victor A; Davis, Richard O; Gragnolati, Brian A; Green, Gene E; Rothman, Paul B

    2015-02-01

    In this article, the authors describe an initiative that established an infrastructure to manage quality and safety efforts throughout a complex health care system and that improved performance on core measures for acute myocardial infarction, heart failure, pneumonia, surgical care, and children's asthma. The Johns Hopkins Medicine Board of Trustees created a governance structure to establish health care system-wide oversight and hospital accountability for quality and safety efforts throughout Johns Hopkins Medicine. The Armstrong Institute for Patient Safety and Quality was formed; institute leaders used a conceptual model nested in a fractal infrastructure to implement this initiative to improve performance at two academic medical centers and three community hospitals, starting in March 2012. The initiative aimed to achieve ≥ 96% compliance on seven inpatient process-of-care core measures and meet the requirements for the Delmarva Foundation and Joint Commission awards. The primary outcome measure was the percentage of patients at each hospital who received the recommended process of care. The authors compared health system and hospital performance before (2011) and after (2012, 2013) the initiative. The health system achieved ≥ 96% compliance on six of the seven targeted measures by 2013. Of the five hospitals, four received the Delmarva Foundation award and two received The Joint Commission award in 2013. The authors argue that, to improve quality and safety, health care systems should establish a system-wide governance structure and accountability process. They also should define and communicate goals and measures and build an infrastructure to support peer learning. PMID:25517699

  5. ABA Southern Region Burn disaster plan: the process of creating and experience with the ABA southern region burn disaster plan.

    PubMed

    Kearns, Randy D; Cairns, Bruce A; Hickerson, William L; Holmes, James H

    2014-01-01

    The Southern Region of the American Burn Association began to craft a regional plan to address a surge of burn-injured patients after a mass casualty event in 2004. Published in 2006, this plan has been tested through modeling, exercise, and actual events. This article focuses on the process of how the plan was created, how it was tested, and how it interfaces with other ongoing efforts on preparedness. One key to success regarding how people respond to a disaster can be traced to preexisting relationships and collaborations. These activities would include training or working together and building trust long before the crisis. Knowing who you can call and rely on when you need help, within the context of your plan, can be pivotal in successfully managing a disaster. This article describes how a coalition of burn center leaders came together. Their ongoing personal association has facilitated the development of planning activities and has kept the process dynamic. This article also includes several of the building blocks for developing a plan from creation to composition, implementation, and testing. The plan discussed here is an example of linking leadership, relationships, process, and documentation together. On the basis of these experiences, the authors believe these elements are present in other regions. The intent of this work is to share an experience and to offer it as a guide to aid others in their regional burn disaster planning efforts. PMID:23666386

  6. Creating Sub-50 Nm Nanofluidic Junctions in PDMS Microfluidic Chip via Self-Assembly Process of Colloidal Particles.

    PubMed

    Wei, Xi; Syed, Abeer; Mao, Pan; Han, Jongyoon; Song, Yong-Ak

    2016-01-01

    Polydimethylsiloxane (PDMS) is the prevailing building material to make microfluidic devices due to its ease of molding and bonding as well as its transparency. Due to the softness of the PDMS material, however, it is challenging to use PDMS for building nanochannels. The channels tend to collapse easily during plasma bonding. In this paper, we present an evaporation-driven self-assembly method of silica colloidal nanoparticles to create nanofluidic junctions with sub-50 nm pores between two microchannels. The pore size as well as the surface charge of the nanofluidic junction is tunable simply by changing the colloidal silica bead size and surface functionalization outside of the assembled microfluidic device in a vial before the self-assembly process. Using the self-assembly of nanoparticles with a bead size of 300 nm, 500 nm, and 900 nm, it was possible to fabricate a porous membrane with a pore size of ~45 nm, ~75 nm and ~135 nm, respectively. Under electrical potential, this nanoporous membrane initiated ion concentration polarization (ICP) acting as a cation-selective membrane to concentrate DNA by ~1,700 times within 15 min. This non-lithographic nanofabrication process opens up a new opportunity to build a tunable nanofluidic junction for the study of nanoscale transport processes of ions and molecules inside a PDMS microfluidic chip. PMID:27023724

  7. The Denali EarthScope Education Partnership: Creating Opportunities for Learning About Solid Earth Processes in Alaska and Beyond.

    NASA Astrophysics Data System (ADS)

    Roush, J. J.; Hansen, R. A.

    2003-12-01

    The Geophysical Institute of the University of Alaska Fairbanks, in partnership with Denali National Park and Preserve, has begun an education outreach program that will create learning opportunities in solid earth geophysics for a wide sector of the public. We will capitalize upon a unique coincidence of heightened public interest in earthquakes (due to the M 7.9 Denali Fault event of Nov. 3rd, 2002), the startup of the EarthScope experiment, and the construction of the Denali Science & Learning Center, a premiere facility for science education located just 43 miles from the epicenter of the Denali Fault earthquake. Real-time data and current research results from EarthScope installations and science projects in Alaska will be used to engage students and teachers, national park visitors, and the general public in a discovery process that will enhance public understanding of tectonics, seismicity and volcanism along the boundary between the Pacific and North American plates. Activities will take place in five program areas, which are: 1) museum displays and exhibits, 2) outreach via print publications and electronic media, 3) curriculum development to enhance K-12 earth science education, 4) teacher training to develop earth science expertise among K-12 educators, and 5) interaction between scientists and the public. In order to engage the over 1 million annual visitors to Denali, as well as people throughout Alaska, project activities will correspond with the opening of the Denali Science and Learning Center in 2004. An electronic interactive kiosk is being constructed to provide public access to real-time data from seismic and geodetic monitoring networks in Alaska, as well as cutting edge visualizations of solid earth processes. A series of print publications and a website providing access to real-time seismic and geodetic data will be developed for park visitors and the general public, highlighting EarthScope science in Alaska. A suite of curriculum modules

  8. Homogeneous processes of atmospheric interest

    NASA Technical Reports Server (NTRS)

    Rossi, M. J.; Barker, J. R.; Golden, D. M.

    1983-01-01

    Upper atmospheric research programs in the department of chemical kinetics are reported. Topics discussed include: (1) third-order rate constants of atmospheric importance; (2) a computational study of the HO2 + HO2 and DO2 + DO2 reactions; (3) measurement and estimation of rate constants for modeling reactive systems; (4) kinetics and thermodynamics of ion-molecule association reactions; (5) entropy barriers in ion-molecule reactions; (6) reaction rate constant for OH + HOONO2 yields products over the temperature range 246 to 324 K; (7) very low-pressure photolysis of tert-butyl nitrite at 248 nm; (8) summary of preliminary data for the photolysis of ClONO2 and N2O5 at 285 nm; and (9) heterogeneous reaction of N2O5 and H2O.

  9. Homogeneity and elemental distribution in self-assembled bimetallic Pd-Pt aerogels prepared by a spontaneous one-step gelation process.

    PubMed

    Oezaslan, M; Liu, W; Nachtegaal, M; Frenkel, A I; Rutkowski, B; Werheid, M; Herrmann, A-K; Laugier-Bonnaud, C; Yilmaz, H-C; Gaponik, N; Czyrska-Filemonowicz, A; Eychmüller, A; Schmidt, T J

    2016-07-27

    Multi-metallic aerogels have recently emerged as a novel and promising class of unsupported electrocatalyst materials due to their high catalytic activity and improved durability for various electrochemical reactions. Aerogels can be prepared by a spontaneous one-step gelation process, where the chemical co-reduction of metal precursors and the prompt formation of nanochain-containing hydrogels, as a preliminary stage for the preparation of aerogels, take place. However, detailed knowledge about the homogeneity and chemical distribution of these three-dimensional Pd-Pt aerogels at the nano-scale as well as at the macro-scale is still unclear. Therefore, we used a combination of spectroscopic and microscopic techniques to obtain a better insight into the structure and elemental distribution of the various Pd-rich Pd-Pt aerogels prepared by the spontaneous one-step gelation process. Synchrotron-based extended X-ray absorption fine structure (EXAFS) spectroscopy and high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) in combination with energy-dispersive X-ray spectroscopy (EDX) were employed in this work to uncover the structural architecture and chemical composition of the various Pd-rich Pd-Pt aerogels over a broad length range. The Pd80Pt20, Pd60Pt40 and Pd50Pt50 aerogels showed heterogeneity in the chemical distribution of the Pt and Pd atoms inside the macroscopic nanochain-network. The features of mono-metallic clusters were not detected by EXAFS or STEM-EDX, indicating alloyed nanoparticles. However, the local chemical composition of the Pd-Pt alloys strongly varied along the nanochains and thus within a single aerogel. To determine the electrochemically active surface area (ECSA) of the Pd-Pt aerogels for application in electrocatalysis, we used the electrochemical CO stripping method. Due to their high porosity and extended network structure, the resulting values of the ECSA for the Pd-Pt aerogels were higher than that for

  10. High pressure homogenization processing, thermal treatment and milk matrix affect in vitro bioaccessibility of phenolics in apple, grape and orange juice to different extents.

    PubMed

    He, Zhiyong; Tao, Yadan; Zeng, Maomao; Zhang, Shuang; Tao, Guanjun; Qin, Fang; Chen, Jie

    2016-06-01

    The effects of high pressure homogenization processing (HPHP), thermal treatment (TT) and milk matrix (soy, skimmed and whole milk) on the phenolic bioaccessibility and the ABTS scavenging activity of apple, grape and orange juice (AJ, GJ and OJ) were investigated. HPHP and soy milk diminished AJ's total phenolic bioaccessibility by 29.3% and 26.3%, respectively, whereas TT and bovine milk hardly affected it. HPHP had little effect on GJ's and OJ's total phenolic bioaccessibility, while TT enhanced them by 27.3-33.9% and 19.0-29.2%, respectively, and the milk matrix increased them by 26.6-31.1% and 13.3-43.4%, respectively. Furthermore, TT (80 °C/30 min) and TT (90 °C/30 s) had similar influences on GJ's and OJ's phenolic bioaccessibility. Skimmed milk showed a better enhancing effect on OJ's total phenolic bioaccessibility than soy and whole milk, but had a similar effect on GJ's as whole milk. These results contribute to promoting the health benefits of fruit juices by optimizing processing and formulations in the food industry. PMID:26830567

  11. Thermomechanical process optimization of U-10wt% Mo – Part 2: The effect of homogenization on the mechanical properties and microstructure

    SciTech Connect

    Joshi, Vineet V.; Nyberg, Eric A.; Lavender, Curt A.; Paxton, Dean M.; Burkes, Douglas E.

    2015-07-09

    Low-enriched uranium alloyed with 10 wt% molybdenum (U-10Mo) is currently being investigated as an alternative to the highly enriched uranium fuel used in several of the United States’ high performance research reactors. Development of the methods to fabricate the U-10Mo fuel plates is currently underway and requires fundamental understanding of the mechanical properties at the expected processing temperatures. In the first part of this series, it was determined that the as-cast U-10Mo had a dendritic microstructure with chemical inhomogeneity and underwent eutectoid transformation during hot compression testing. In the present (second) part of the work, the as-cast samples were heat treated at several temperatures and times to homogenize the Mo content. Like the previous as-cast material, the “homogenized” materials were then tested under compression between 500 and 800°C. The as-cast samples and those treated at 800°C for 24 hours had grain sizes of 25-30 μm, whereas those treated at 1000°C for 16 hours had grain sizes around 250 μm before testing. Upon compression testing, it was determined that the heat treatment affected both the mechanical properties and the precipitation of the lamellar phase at sub-eutectoid temperatures.

  12. Is the Universe homogeneous?

    PubMed

    Maartens, Roy

    2011-12-28

    The standard model of cosmology is based on the existence of homogeneous surfaces as the background arena for structure formation. Homogeneity underpins both general relativistic and modified gravity models and is central to the way in which we interpret observations of the cosmic microwave background (CMB) and the galaxy distribution. However, homogeneity cannot be directly observed in the galaxy distribution or CMB, even with perfect observations, since we observe on the past light cone and not on spatial surfaces. We can directly observe and test for isotropy, but to link this to homogeneity we need to assume the Copernican principle (CP). First, we discuss the link between isotropic observations on the past light cone and isotropic space-time geometry: what observations do we need to be isotropic in order to deduce space-time isotropy? Second, we discuss what we can say with the Copernican assumption. The most powerful result is based on the CMB: the vanishing of the dipole, quadrupole and octupole of the CMB is sufficient to impose homogeneity. Real observations lead to near-isotropy on large scales; does this lead to near-homogeneity? There are important partial results, and we discuss why this remains a difficult open question. Thus, we are currently unable to prove homogeneity of the Universe on large scales, even with the CP. However, we can use observations of the cosmic microwave background, galaxies and clusters to test homogeneity itself. PMID:22084298

  13. Creating Poetry.

    ERIC Educational Resources Information Center

    Drury, John

    Encouraging exploration and practice, this book offers hundreds of exercises and numerous tips covering every step involved in creating poetry. Each chapter is a self-contained unit offering an overview of material in the chapter, a definition of terms, and poetry examples from well-known authors designed to supplement the numerous exercises.…

  14. Spatial homogenization methods for pin-by-pin neutron transport calculations

    NASA Astrophysics Data System (ADS)

    Kozlowski, Tomasz

    For practical reactor core applications, low-order transport approximations such as SP3 have been shown to provide sufficient accuracy for both static and transient calculations, with considerably less computational expense than the discrete ordinates or full spherical harmonics methods. These methods have been applied in several core simulators where homogenization was performed at the level of the pin cell. One of the principal problems has been to recover the error introduced by pin-cell homogenization. Two basic approaches to treat pin-cell homogenization error have been proposed: Superhomogenization (SPH) factors and Pin-Cell Discontinuity Factors (PDFs). These methods are based on the well-established Equivalence Theory and Generalized Equivalence Theory to generate appropriate group constants, and they are able to treat all sources of error together, allowing even few-group diffusion with one mesh per cell to reproduce the reference solution. A detailed investigation and consistent comparison of both homogenization techniques showed the potential of the PDF approach to improve the accuracy of core calculations, but also revealed its limitations. In principle, the method is applicable only for the boundary conditions at which it was created, i.e., the boundary conditions considered during the homogenization process (normally zero current). There is therefore a need to improve the method, making it more general and environment-independent. The goal of the proposed general homogenization technique is to create a function that can correctly predict the appropriate correction factor with only homogeneous information available, i.e., a function based on the heterogeneous solution that can approximate PDFs using the homogeneous solution. It has been shown that the PDF can be well approximated by a least-squares polynomial fit of the non-dimensional heterogeneous solution and later used for PDF prediction using the homogeneous solution. This shows promise for PDF prediction for off
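
    The fitting-and-prediction step described above can be illustrated with a short sketch. This is a minimal illustration, not the author's implementation: the data points, polynomial degree, and variable names are hypothetical, standing in for the non-dimensional heterogeneous solution and the PDFs computed from it.

        import numpy as np

        # Hypothetical training data: values of the non-dimensionalized
        # heterogeneous solution (x) and the pin-cell discontinuity
        # factors (PDFs) computed for them during homogenization.
        x = np.array([0.85, 0.90, 0.95, 1.00, 1.05, 1.10, 1.15])
        pdf = np.array([1.08, 1.05, 1.02, 1.00, 0.98, 0.97, 0.96])

        # Least-squares polynomial fit, as described in the abstract.
        coeffs = np.polyfit(x, pdf, deg=2)

        # In a core calculation only the homogeneous solution is available;
        # the fitted polynomial then predicts the appropriate PDF.
        x_homogeneous = 0.92
        pdf_predicted = np.polyval(coeffs, x_homogeneous)
        print(f"predicted PDF: {pdf_predicted:.4f}")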

  15. Creating Community

    PubMed Central

    Budin, Wendy C.

    2009-01-01

    In this column, the editor of The Journal of Perinatal Education describes ways that Lamaze International is helping to create a community for those who share a common interest in promoting, supporting, and protecting natural, safe, and healthy childbirth. The editor also describes the contents of this issue, which offer a broad range of resources, research, and inspiration for childbirth educators in their efforts to promote normal birth. PMID:19936112

  16. Creating bulk nanocrystalline metal.

    SciTech Connect

    Fredenburg, D. Anthony; Saldana, Christopher J.; Gill, David D.; Hall, Aaron Christopher; Roemer, Timothy John; Vogler, Tracy John; Yang, Pin

    2008-10-01

    Nanocrystalline and nanostructured materials offer unique microstructure-dependent properties that are superior to coarse-grained materials. These materials have been shown to have very high hardness, strength, and wear resistance. However, most current methods of producing nanostructured materials in weapons-relevant materials create powdered metal that must be consolidated into bulk form to be useful. Conventional consolidation methods are not appropriate due to the need to maintain the nanocrystalline structure. This research investigated new ways of creating nanocrystalline material, new methods of consolidating nanocrystalline material, and an analysis of these different methods of creation and consolidation to evaluate their applicability to mesoscale weapons applications where part features are often under 100 μm wide and the material's microstructure must be very small to give homogeneous properties across the feature.

  17. A study of the process of using Pro/ENGINEER geometry models to create finite element models

    SciTech Connect

    Kistler, B.L.

    1997-02-01

    Methods for building Pro/ENGINEER models which allowed integration with structural and thermal mesh generation and analysis software without recreating geometry were evaluated. This study was not intended to be an in-depth study of the mechanics of Pro/ENGINEER or of mesh generation or analysis software, but instead was a first-cut attempt to provide recommendations for Sandia personnel that would yield useful analytical models in less time than an analyst would require to create a separate model. The study evaluated a wide variety of geometries built in Pro/ENGINEER and provided general recommendations for designers, drafters, and analysts.

  18. Phase-shifting of correlation fringes created by image processing as an alternative to improve digital shearography

    NASA Astrophysics Data System (ADS)

    Braga, Roberto A.; González-Peña, Rolando J.; Marcon, Marlon; Magalhães, Ricardo R.; Paiva-Almeida, Thiago; Santos, Igor V. A.; Martins, Moisés

    2016-12-01

    Digital speckle pattern shearing interferometry, or speckle shearography, is well established in many areas where one needs to measure in-plane and out-of-plane micro-displacements in biological and non-biological objects. It is based on the Michelson interferometer, with a piezoelectric transducer (PZT) used to phase-shift the fringes and thereby improve the quality of the final image. Creating the shifted images with a PZT, despite its widespread use, has some drawbacks and limitations, such as the cost of the apparatus, the difficulty of applying exactly the same mirror displacement repeated times, and the inability to apply phase-shifting to dynamic object measurement. The aim of this work was to create the phase-shifted images digitally, avoiding the mechanical adjustment of the PZT, and to test them with the digital shearography method. The methodology was tested on a well-known object, an aluminium cantilever beam under deformation. The results documented the ability to create reliable and sensitive deformation maps and curves, reducing the cost and improving the robustness and accessibility of digital speckle pattern shearing interferometry.
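
    As a generic illustration of phase extraction from digitally generated shifts, the sketch below applies the standard four-step phase-stepping formula to synthetic fringe patterns. The intensities and variable names are assumed for the demonstration; this is not the authors' processing chain.

        import numpy as np

        def four_step_phase(i1, i2, i3, i4):
            # Standard four-step formula for frames shifted by
            # 0, pi/2, pi and 3*pi/2.
            return np.arctan2(i4 - i2, i1 - i3)

        # Synthetic correlation fringes with digitally imposed phase shifts.
        x = np.linspace(0, 4 * np.pi, 512)
        true_phase = 0.5 * x
        frames = [1.0 + np.cos(true_phase + k * np.pi / 2) for k in range(4)]

        wrapped_phase = four_step_phase(*frames)  # wrapped to (-pi, pi]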

  19. Ecological and evolutionary consequences of biotic homogenization.

    PubMed

    Olden, Julian D; Leroy Poff, N; Douglas, Marlis R; Douglas, Michael E; Fausch, Kurt D

    2004-01-01

    Biotic homogenization, the gradual replacement of native biotas by locally expanding non-natives, is a global process that diminishes floral and faunal distinctions among regions. Although patterns of homogenization have been well studied, their specific ecological and evolutionary consequences remain unexplored. We argue that our current perspective on biotic homogenization should be expanded beyond a simple recognition of species diversity loss, towards a synthesis of higher order effects. Here, we explore three distinct forms of homogenization (genetic, taxonomic and functional), and discuss their immediate and future impacts on ecological and evolutionary processes. Our goal is to initiate future research that investigates the broader conservation implications of homogenization and to promote a proactive style of adaptive management that engages the human component of the anthropogenic blender that is currently mixing the biota on Earth. PMID:16701221

  20. A rapid method for creating drug implants: translating laboratory-based methods into a scalable manufacturing process.

    PubMed

    Wang, Cheng-Kuo; Wang, Wan-Yi; Meyer, Robert F; Liang, Yuling; Winey, Karen I; Siegel, Steven J

    2010-05-01

    Low compliance with medication is the major cause of poor outcomes in schizophrenia treatment. While surgically implantable solvent-cast pellets were produced to improve outcomes through increased compliance with medication, the fabrication process is laborious and time-consuming, inhibiting its broader application (Siegel et al., Eur J Pharm Biopharm 2006;64:287-293). In this study, the previous fabrication process was translated to a continuous and scalable extrusion method. Extrusion processes were modified based on in vitro release studies, drug load consistency examination, and surface morphology analysis using scanning electron microscopy. Afterward, optimized haloperidol implants were implanted into rats for preliminary analysis of biocompatibility. Barrel temperature, screw speed, and the resulting processing pressure influenced surface morphology and drug release. Data suggest that fewer surface pores shift the mechanism from bulk to surface PLGA degradation and produce a longer lag period. Results demonstrate that extrusion is a viable process for manufacturing antipsychotic implants. PMID:20225251

  1. The second phase in creating the cardiac center for the next generation: beyond structure to process improvement.

    PubMed

    Woods, J

    2001-01-01

    The third-generation cardiac institute will build on the successes of the past in structuring the service line, re-organizing to assimilate specialist interests, and re-positioning to expand cardiac services into cardiovascular services. To meet the challenges of an increasingly competitive marketplace and complex delivery system, the focus for this new model will shift away from improved structures and toward improved processes. This shift will require a sound methodology for statistically measuring and sustaining process changes related to the optimization of cardiovascular care. In recent years, GE Medical Systems has successfully applied Six Sigma methodologies to enable cardiac centers to control key clinical and market development processes through its DMADV, DMAIC and Change Acceleration processes. Data indicate that Six Sigma is having a positive impact within organizations across the United States, and when appropriately implemented, this approach can serve as a solid foundation for building the next generation cardiac institute. PMID:11765624

  2. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    NASA Astrophysics Data System (ADS)

    Bulutsuz, A. G.; Demircioglu, P.; Bogrekci, I.; Durakbasa, M. N.; Katiboglu, A. B.

    2015-03-01

    The interaction between foreign substances and organic tissue in implants placed into the jaw to eliminate tooth loss is a highly complex process, involving many biological reactions as well as the biomechanical forces that influence this formation. Osseointegration denotes the direct structural and functional association between living bone and the surface of a load-bearing artificial implant. Taking into consideration the requirements of implant manufacturing processes, this study investigates surface characterization with high-precision measurement techniques and emphasizes the importance of these processes for the long-term success of dental implants. Detailed surface characterization was performed to identify the dependence of surface properties on the manufacturing technique, using image processing methods together with scanning electron microscopy (SEM) for morphological properties in 3D and a Taylor Hobson stylus profilometer for roughness properties in 2D. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, highlighting the importance of precise 2D and 3D inspection techniques for surface characterization. Different image analysis techniques, such as the dark-light technique, were used to verify the surface measurement results. The computational phase was performed using the Image Processing Toolbox in Matlab, with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful in the determination of surface roughness. The results showed that the number of black pixels in the image increases with increasing surface roughness.
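
    The black/white pixel analysis lends itself to a compact sketch, given here in Python rather than Matlab. The synthetic image, threshold value, and variable names are assumptions for illustration; in the study the input would be SEM micrographs of the implant surfaces.

        import numpy as np

        # Synthetic grayscale "surface image" (0 = dark, 255 = bright).
        rng = np.random.default_rng(0)
        image = rng.normal(loc=128, scale=40, size=(256, 256)).clip(0, 255)

        # Dark-light segmentation: pixels below the threshold count as black.
        threshold = 100
        black_pixels = int(np.sum(image < threshold))

        # The abstract reports that the black-pixel count rises with surface
        # roughness, so this fraction serves as a simple roughness indicator.
        black_fraction = black_pixels / image.size
        print(f"black fraction: {black_fraction:.3f}")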

  3. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    SciTech Connect

    Bulutsuz, A. G.; Demircioglu, P. Bogrekci, I.; Durakbasa, M. N.

    2015-03-30

    The interaction between foreign substances and organic tissue in implants placed into the jaw to eliminate tooth loss is a highly complex process, involving many biological reactions as well as the biomechanical forces that influence this formation. Osseointegration denotes the direct structural and functional association between living bone and the surface of a load-bearing artificial implant. Taking into consideration the requirements of implant manufacturing processes, this study investigates surface characterization with high-precision measurement techniques and emphasizes the importance of these processes for the long-term success of dental implants. Detailed surface characterization was performed to identify the dependence of surface properties on the manufacturing technique, using image processing methods together with scanning electron microscopy (SEM) for morphological properties in 3D and a Taylor Hobson stylus profilometer for roughness properties in 2D. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, highlighting the importance of precise 2D and 3D inspection techniques for surface characterization. Different image analysis techniques, such as the dark-light technique, were used to verify the surface measurement results. The computational phase was performed using the Image Processing Toolbox in Matlab, with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful in the determination of surface roughness. The results showed that the number of black pixels in the image increases with increasing surface roughness.

  4. Creating Processes Associated with Providing Government Goods and Services Under the Commercial Space Launch Act at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Letchworth, Janet F.

    2011-01-01

    Kennedy Space Center (KSC) has decided to write its agreements under the Commercial Space Launch Act (CSLA) authority to cover a broad range of categories of support that KSC could provide to our commercial partners. Our strategy was to go through the onerous process of getting the agreement in place once and to allow added specificity and final cost estimates to be documented on a separate Task Order Request (TOR). This paper is written from the implementing engineering team's perspective. It describes how we developed the processes associated with providing Government support to our emerging commercial partners, such as SpaceX, and reports on our success to date.

  5. Homogeneity and Entropy

    NASA Astrophysics Data System (ADS)

    Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.

    1990-11-01

    RESUMEN: We present a methodology for homogeneity analysis based on Information Theory, applicable to samples of observational data. ABSTRACT: Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given data sample. Key words: DATA ANALYSIS
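
    A minimal sketch of an information-theoretic homogeneity measure of the kind described: the normalized Shannon entropy of a binned sample. The bin count and synthetic data are illustrative assumptions, not the authors' exact methodology.

        import numpy as np

        def normalized_entropy(sample, bins=10):
            # Shannon entropy of the binned sample, normalized so that a
            # perfectly homogeneous (uniform) sample scores close to 1.
            counts, _ = np.histogram(sample, bins=bins)
            p = counts[counts > 0] / counts.sum()
            return float(-np.sum(p * np.log(p)) / np.log(bins))

        rng = np.random.default_rng(1)
        print(normalized_entropy(rng.uniform(0, 1, 1000)))     # near 1: homogeneous
        print(normalized_entropy(rng.normal(0.5, 0.05, 1000))) # lower: clustered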

  6. Are Children's Memory Illusions Created Differently from Those of Adults? Evidence from Levels-of-Processing and Divided Attention Paradigms

    ERIC Educational Resources Information Center

    Wimmer, Marina C.; Howe, Mark L.

    2010-01-01

    In two experiments, we investigated the robustness and automaticity of adults' and children's generation of false memories by using a levels-of-processing paradigm (Experiment 1) and a divided attention paradigm (Experiment 2). The first experiment revealed that when information was encoded at a shallow level, true recognition rates decreased for…

  7. Strictly homogeneous laterally complete modules

    NASA Astrophysics Data System (ADS)

    Chilin, V. I.; Karimov, J. A.

    2016-03-01

    Let A be a laterally complete commutative regular algebra and X a laterally complete A-module. In this paper we introduce the notions of homogeneous and strictly homogeneous A-modules. It is proved that every homogeneous A-module is a strictly homogeneous A-module if the Boolean algebra of all idempotents in A is multi-σ-finite.

  8. The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.

    2010-01-01

    The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.

  9. SP CREATE. Creating Sample Plans

    SciTech Connect

    Spears, J.H.; Seebode, L.

    1998-11-10

    The program has been designed to increase the accuracy and reduce the preparation time for completing sampling plans. It consists of the following files: (1) Analyte/Combination (AnalCombo), a list of analytes and combinations of analytes that can be requested of the onsite and offsite labs; whenever a specific combination of analytes or suite names appears on the same line as a code number, one sample can be placed in one bottle to be analyzed for those parameters, and a code number is assigned to each analyte and combination of analytes. (2) Sampling Plans Database (SPDb), a database that contains all of the analytes and combinations of analytes along with the basic information required for preparing a sample plan, including the following fields: matrix, hold time, preservation, sample volume, container size, whether the bottle caps are taped, and acceptable choices. (3) Sampling Plans Create (SPcreate), a file that looks up information from the Sampling Plans Database and the Job Log File (JLF98), a major database used by Sample Management Services for recording more than 100 fields of information.

  10. Homogeneous and inhomogeneous eddies

    SciTech Connect

    Pavia, E.G.

    1994-12-31

    This work deals with mesoscale warm oceanic eddies, i.e., self-contained bodies of water which transport heat, among other things, for several months and over several hundreds of kilometers. This heat transport is believed to play an important role in the atmospheric and oceanic conditions of the region through which it passes. Here the author examines the difference in evolution between eddies modeled as blobs of homogeneous water and eddies in which density varies in the horizontal. Preliminary results suggest that instability is enhanced by inhomogeneities, which would imply that traditional modeling studies based on homogeneous vortices have underestimated the rate of heat release from oceanic eddies to their surroundings. The approach uses modeling in its simplest form, i.e., a single active layer. Although previous studies have shown the drastic effect on stability brought by two or more dynamically relevant homogeneous layers, the author believes the single-layer eddy model has not been investigated thoroughly.

  11. Restoration of overwash processes creates piping plover (Charadrius melodus) habitat on a barrier island (Assateague Island, Maryland)

    NASA Astrophysics Data System (ADS)

    Schupp, Courtney A.; Winn, Neil T.; Pearl, Tami L.; Kumer, John P.; Carruthers, Tim J. B.; Zimmerman, Carl S.

    2013-01-01

    On Assateague Island, an undeveloped barrier island along Maryland and Virginia, a foredune was constructed to protect the island from the erosion and breaching threat caused by permanent jetties built to maintain Ocean City Inlet. Scientists and engineers integrated expertise in vegetation, wildlife, geomorphology, and coastal engineering in order to design a habitat restoration project that would be evaluated in terms of coastal processes rather than static features. Specific restoration targets, thresholds for intervention, and criteria to evaluate long-term project success were based on biological and geomorphological data and coastal engineering models. A detailed long-term monitoring plan was established to measure project sustainability. The foredune unexpectedly acted as a near-total barrier to both overwash and wind, and the dynamic ecosystem underwent undesirable habitat changes, including conversion of early-succession beach habitat to herbaceous and shrub communities, diminishing the availability of foraging habitat and thereby reducing the productivity of the Federally listed Threatened Charadrius melodus (piping plover). To address these impacts, multiple notches were cut through the constructed foredune. The metric for initial geomorphological success (restoration of at least one overwash event per year across the constructed foredune, provided overwash occurred elsewhere on the island) was reached. New overwash fans increased island stability by increasing interior island elevation. At every notch, areas of sparse vegetation increased, and the new foraging habitat was utilized by breeding pairs during the 2010 breeding season. However, the metric for long-term biological success (an increase to 37% sparsely vegetated habitat on the North End and an increase in piping plover productivity to 1.25 chicks fledged per breeding pair) has not yet been met. By 2010 there was an overall productivity of 1.2 chicks fledged per breeding pair and a 1.7% decrease in sparsely

  12. Star formation in the filament of S254-S258 OB complex: a cluster in the process of being created

    NASA Astrophysics Data System (ADS)

    Samal, M. R.; Ojha, D. K.; Jose, J.; Zavagno, A.; Takahashi, S.; Neichel, B.; Kim, J. S.; Chauhan, N.; Pandey, A. K.; Zinchenko, I.; Tamura, M.; Ghosh, S. K.

    2015-09-01

    Infrared dark clouds are ideal laboratories for studying the initial processes of high-mass star and star-cluster formation. We investigated the star formation activity of an unexplored filamentary dark cloud (size ~5.7 pc × 1.9 pc), which is itself part of a large filament (~20 pc) located in the S254-S258 OB complex at a distance of 2.5 kpc. Using 24 μm data from the Multiband Imaging Photometer for Spitzer (MIPS), we uncovered 49 sources with signal-to-noise ratios greater than 5, and identified 45 of them as candidate young stellar objects (YSOs) of Class I, flat-spectrum, and Class II natures. An additional 17 candidate YSOs (9 Class I and 8 Class II) were identified using JHK and Wide-field Infrared Survey Explorer (WISE) photometry. We find that the protostar-to-Class II source ratio (~2) and the protostar fraction (~70%) of the region are high. Comparison of the protostar fraction with that of other young clusters suggests that star formation in the dark cloud possibly started only ~1 Myr ago. Combining the near-infrared photometry of the YSO candidates with theoretical evolutionary models, we infer that most of the candidate YSOs formed in the dark cloud are low-mass (<2 M⊙). We examine the spatial distribution of the YSOs and find that the majority of them are linearly aligned along the highest column density line (N(H2) ~ 1 × 10^22 cm^-2) of the dark cloud, along its long axis, with a mean nearest-neighbour separation of ~0.2 pc. Using the observed properties of the YSOs, the physical conditions of the cloud, and a simple cylindrical model, we explore the possible star formation process of this filamentary dark cloud and suggest that gravitational fragmentation within the filament should have played a dominant role in the formation of the YSOs. From the total mass of the YSOs, the gaseous mass associated with the dark cloud, and the surrounding environment, we infer that the region is presently forming stars at an efficiency of ~3% and a rate of ~30 M⊙ Myr^-1, and it may emerge

  13. Effect of homogenization techniques on reducing the size of microcapsules and the survival of probiotic bacteria therein.

    PubMed

    Ding, W K; Shah, N P

    2009-08-01

    This study investigated 2 different homogenization techniques for reducing the size of calcium alginate beads during the microencapsulation process of 8 probiotic bacteria strains, namely, Lactobacillus rhamnosus, L. salivarius, L. plantarum, L. acidophilus, L. paracasei, Bifidobacterium longum, B. lactis type Bi-04, and B. lactis type Bi-07. Two different homogenization devices were used, namely, an Ultra-Turrax benchtop homogenizer and a Microfluidics microfluidizer. Various settings of the homogenization equipment were studied, such as the number of passes, speed (rpm), duration (min), and pressure (psi). The traditional mixing method using a magnetic stirrer was used as a control. The sizes of the microcapsules resulting from each homogenization technique and setting were measured using a light microscope and a stage micrometer. The smallest capsules (measuring 31.2 μm) were created with the microfluidizer using 26 passes at 1200 psi for 40 min. The greatest loss in viability, 3.21 log CFU/mL, was observed when using the Ultra-Turrax benchtop homogenizer at 1300 rpm for 5 min. Overall, both homogenization techniques reduced capsule sizes; however, homogenization settings at high rpm also greatly reduced the viability of the probiotic organisms. PMID:19723206

  14. Microfluidic Generation of Monodisperse, Structurally Homogeneous Alginate Microgels for Cell Encapsulation and 3D Cell Culture.

    PubMed

    Utech, Stefanie; Prodanovic, Radivoje; Mao, Angelo S; Ostafe, Raluca; Mooney, David J; Weitz, David A

    2015-08-01

    Monodisperse alginate microgels (10-50 μm) are created via droplet-based microfluidics by a novel crosslinking procedure. Ionic crosslinking of alginate is induced by release of chelated calcium ions. The process separates droplet formation and gelation reaction enabling excellent control over size and homogeneity under mild reaction conditions. Living mesenchymal stem cells are encapsulated and cultured in the generated 3D microenvironments. PMID:26039892

  15. A model cerium oxide matrix composite reinforced with a homogeneous dispersion of silver particulate - prepared using the glycine-nitrate process

    SciTech Connect

    Weil, K. Scott; Hardy, John S.

    2005-01-31

    Recently a new method of ceramic brazing has been developed. Based on a two-phase liquid composed of silver and copper oxide, brazing is conducted directly in air without the need for an inert cover gas or the use of surface-reactive fluxes. Because the braze displays excellent wetting characteristics on a number of ceramic surfaces, including alumina, various perovskites, zirconia, and ceria, we were interested in investigating whether a metal-reinforced ceramic matrix composite (CMC) could be developed with this material. In the present study, two sets of homogeneously mixed silver/copper oxide/ceria powders were synthesized using a combustion synthesis technique. The powders were compacted and heat treated in air above the liquidus temperature for the chosen Ag-CuO composition. Metallographic analysis indicates that the resulting composite microstructures are extremely uniform with respect to both the size of the metallic reinforcement and its spatial distribution within the ceramic matrix. The size, morphology, and spacing of the metal particulate in the densified composite appear to be dependent on the original size and structure of the starting combustion-synthesized powders.

  16. HOMOGENEOUS NUCLEAR POWER REACTOR

    DOEpatents

    King, L.D.P.

    1959-09-01

    A homogeneous nuclear power reactor utilizing forced circulation of the liquid fuel is described. The reactor does not require fuel handling outside of the reactor vessel during any normal operation, including complete shutdown to room temperature; the reactor is self-regulating under extreme operating conditions and is controlled by the thermal expansion of the liquid fuel. The liquid fuel utilized is a solution of uranium, phosphoric acid, and water, which requires no gas exhaust system or independent gas recombining system, thereby eliminating the handling of radiolytic gas.

  17. Heterogeneous nucleation or homogeneous nucleation?

    NASA Astrophysics Data System (ADS)

    Liu, X. Y.

    2000-06-01

    The generic heterogeneous effect of foreign particles on three-dimensional nucleation was examined both theoretically and experimentally. It is shown that the nucleation observed under normal conditions includes a sequence of progressive heterogeneous processes, characterized by different interfacial correlation functions f(m,x). At low supersaturations, nucleation will be controlled by the process with a small interfacial correlation function f(m,x), which results from a strong interaction and good structural match between the foreign bodies and the crystallizing phase. At high supersaturations, nucleation on foreign particles having a weak interaction and poor structural match with the crystallizing phase (f(m,x)→1) will govern the kinetics. This frequently leads to the false identification of homogeneous nucleation. Genuine homogeneous nucleation, which is the upper limit of heterogeneous nucleation, may not be easily achievable under gravity. To check these results, the prediction is confronted with nucleation experiments on some organic and inorganic crystals. The results are in excellent agreement with the theory.
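
    In classical nucleation theory, the role of the interfacial correlation function can be summarized by the standard relation (a textbook form consistent with the abstract's notation, not a formula quoted from the paper):

        \Delta G^{*}_{\mathrm{het}} = f(m, x)\,\Delta G^{*}_{\mathrm{hom}}, \qquad 0 < f(m, x) \le 1

    where m characterizes the interaction and structural match between the foreign body and the crystallizing phase, and x is the ratio of the foreign-particle radius to the critical nucleus radius. A strong interaction and good structural match drive f(m,x) toward small values, lowering the nucleation barrier, while f(m,x) → 1 recovers the homogeneous limit.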

  18. Creating Happy Memories.

    ERIC Educational Resources Information Center

    Weeks, Denise Jarrett

    2001-01-01

    Some teachers are building and sharing their wisdom and know-how through lesson study, in the process creating memorable learning experiences for students and for each other. This paper describes how lesson study can transform teaching and how schools are implementing lesson study. A sidebar presents questions to consider in lesson study. (SM)

  19. Influence of Gas Flow and Improvement of Homogeneity on the Distribution of Critical Current Density in YBCO Coated Conductor Processed by TFA-MOD Method

    NASA Astrophysics Data System (ADS)

    Shiohara, Kei; Higashikawa, Kohei; Kawaguchi, Teppei; Inoue, Masayoshi; Kiss, Takanobu; Yoshizumi, Masateru; Izumi, Teruo

    Using scanning Hall-probe microscopy, we have investigated the in-plane distribution of critical current density in TFA-MOD processed YBCO coated conductors. We compared the distributions of critical current density for two kinds of coated conductors processed with different directions of gas flow during calcination. As a result, it was found that the direction of the gas flow strongly influenced the distribution of critical current density. For example, the maximum value of critical current density was 1.5 times higher than the average for a sample processed with gas flow in the width direction. On the other hand, the distribution of critical current density was relatively uniform for the one with gas flow in the axial direction, perpendicular to the surface of the conductor. These findings provide important information for optimizing the manufacturing processes for such conductors. Indeed, a very uniform distribution of critical current density has been observed for a coated conductor produced by an optimized process. This demonstrates the high potential of TFA-MOD processed YBCO coated conductors for practical applications.

  20. Homogeneous quantum electrodynamic turbulence

    NASA Technical Reports Server (NTRS)

    Shebalin, John V.

    1992-01-01

    The electromagnetic field equations and Dirac equations for oppositely charged wave functions are numerically time-integrated using a spatial Fourier method. The numerical approach used, a spectral transform technique, is based on a continuum representation of physical space. The coupled classical field equations contain a dimensionless parameter which sets the strength of the nonlinear interaction (as the parameter increases, interaction volume decreases). For a parameter value of unity, highly nonlinear behavior in the time-evolution of an individual wave function, analogous to ideal fluid turbulence, is observed. In the truncated Fourier representation which is numerically implemented here, the quantum turbulence is homogeneous but anisotropic and manifests itself in the nonlinear evolution of equilibrium modal spatial spectra for the probability density of each particle and also for the electromagnetic energy density. The results show that nonlinearly interacting fermionic wave functions quickly approach a multi-mode, dynamic equilibrium state, and that this state can be determined by numerical means.
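
    The core operation of such a spectral transform technique, evaluating spatial derivatives in Fourier space on a periodic grid, can be sketched as follows. This is a one-dimensional toy example, not the coupled Maxwell-Dirac solver itself; the grid size and test function are assumptions.

        import numpy as np

        # Spectral (Fourier) evaluation of d/dx on a periodic grid.
        n = 128
        length = 2 * np.pi
        x = np.linspace(0, length, n, endpoint=False)
        u = np.sin(3 * x)

        k = 2 * np.pi * np.fft.fftfreq(n, d=length / n)  # wavenumbers
        du_dx = np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

        # Agrees with the analytic derivative 3*cos(3x) to machine precision.
        assert np.allclose(du_dx, 3 * np.cos(3 * x), atol=1e-10)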

  1. HOMOGENEOUS NUCLEAR REACTOR

    DOEpatents

    Hammond, R.P.; Busey, H.M.

    1959-02-17

    Nuclear reactors of the homogeneous liquid fuel type are discussed. The reactor is comprised of an elongated closed vessel, vertically oriented, having a critical region at the bottom, a lower chimney structure extending from the critical region vertically upward and surrounded by heat exchanger coils, up to a baffle region above which is located an upper chimney structure containing a catalyst functioning to recombine radiolytically dissociated moderator gases. In operation the liquid fuel circulates solely by convection from the critical region upwardly through the lower chimney and then downwardly through the heat exchanger to return to the critical region. The gases formed by radiolytic dissociation of the moderator are carried upwardly with the circulating liquid fuel and past the baffle into the region of the upper chimney, where they are recombined by the catalyst and condensed, thence returning through the heat exchanger to the critical region.

  2. Homogeneous quantum electrodynamic turbulence

    SciTech Connect

    Shebalin, J.V.

    1992-10-01

    The electromagnetic field equations and Dirac equations for oppositely charged wave functions are numerically time-integrated using a spatial Fourier method. The numerical approach used, a spectral transform technique, is based on a continuum representation of physical space. The coupled classical field equations contain a dimensionless parameter which sets the strength of the nonlinear interaction (as the parameter increases, interaction volume decreases). For a parameter value of unity, highly nonlinear behavior in the time-evolution of an individual wave function, analogous to ideal fluid turbulence, is observed. In the truncated Fourier representation which is numerically implemented here, the quantum turbulence is homogeneous but anisotropic and manifests itself in the nonlinear evolution of equilibrium modal spatial spectra for the probability density of each particle and also for the electromagnetic energy density. The results show that nonlinearly interacting fermionic wave functions quickly approach a multi-mode, dynamic equilibrium state, and that this state can be determined by numerical means.

  3. Homogeneity study of candidate reference material in fish matrix

    NASA Astrophysics Data System (ADS)

    Ulrich, J. C.; Sarkis, J. E. S.; Hortellani, M. A.

    2015-01-01

    A material is perfectly homogeneous with respect to a given characteristic, or composition, if there is no difference between the values obtained from one part to another. Homogeneity is usually evaluated using analysis of variance (ANOVA). However, the requirement that the populations of data to be processed have a normal distribution and equal variances greatly limits the use of this statistical tool. A more suitable test for assessing the homogeneity of RMs, known as "sufficient homogeneity", was proposed by Fearn and Thompson. In this work, we evaluate the performance of the two statistical treatments for assessing the homogeneity of methylmercury (MeHg) in a candidate reference material of fish tissue.
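
    For reference, a minimal sketch of the classical ANOVA approach mentioned above, with hypothetical MeHg measurements from three sub-samples of the material; scipy's one-way ANOVA stands in for whatever software the authors used.

        import numpy as np
        from scipy import stats

        # Hypothetical MeHg results (mg/kg) from three bottles of the
        # candidate fish-tissue reference material.
        bottle_a = np.array([0.412, 0.405, 0.418])
        bottle_b = np.array([0.399, 0.410, 0.403])
        bottle_c = np.array([0.421, 0.408, 0.415])

        # One-way ANOVA: a large p-value gives no evidence of between-bottle
        # inhomogeneity, but note the normality and equal-variance
        # assumptions criticized in the abstract.
        f_stat, p_value = stats.f_oneway(bottle_a, bottle_b, bottle_c)
        print(f"F = {f_stat:.2f}, p = {p_value:.3f}")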

  4. The Leadership Assignment: Creating Change.

    ERIC Educational Resources Information Center

    Calabrese, Raymond L.

    This book provides change-motivated leaders with an understanding of the change process and the tools to drive change. Eight change principles guide change agents in creating and sustaining change: prepare to lead change; knowledge is power; create empowering mental models; overcome resistance to change; lead change; accelerate the change process;…

  5. Shear wave splitting hints at dynamical features of mantle convection: a global study of homogeneously processed source and receiver side upper mantle anisotropy

    NASA Astrophysics Data System (ADS)

    Walpole, J.; Wookey, J. M.; Masters, G.; Kendall, J. M.

    2013-12-01

    The asthenosphere is embroiled in the process of mantle convection. Its viscous properties allow it to flow around sinking slabs and deep cratonic roots as it is displaced by intruding material and dragged around by the moving layer above. As the asthenosphere flows it develops a crystalline fabric, with anisotropic crystals preferentially aligned in the direction of flow. Meanwhile, the lithosphere above deforms as it is squeezed and stretched by underlying tectonic processes, enabling anisotropic fabrics to develop, become fossilised in the rigid rock, and persist over vast spans of geological time. As a shear wave passes through an anisotropic medium it splits into two orthogonally polarised quasi-shear waves that propagate at different velocities (a phenomenon known as shear wave splitting). By analysing the polarisation and delay time of many split waves that have passed through a region it is possible to constrain the anisotropy of the medium in that region. This anisotropy is the key to revealing the deformation history of the deep Earth. In this study we present measurements of shear wave splitting made on S, SKS, and SKKS waves from earthquakes recorded at stations in the IRIS DMC catalogue (1976-2010). We used a cluster-analysis phase-picking technique [1] to pick hundreds of thousands of high signal-to-noise waveforms in long-period data. These picks are used to feed the broadband data into an automated processing workflow that recovers shear wave splitting parameters [2,3]. The workflow includes a new method for making source and receiver corrections, whereby the stacked error surfaces, rather than a single set of parameters, are used as input to the correction; this propagates uncertainty information into the final measurement. Using SKS, SKKS, and source-corrected S, we recover good measurements of anisotropy beneath 1,569 stations. Using receiver-corrected S we recover good measurements of anisotropy beneath 470 events. We compare

  6. Homogeneous spaces of Dirac groupoids

    NASA Astrophysics Data System (ADS)

    Jotz Lean, Madeleine

    2016-06-01

    A Poisson structure on a homogeneous space of a Poisson groupoid is homogeneous if the action of the Lie groupoid on the homogeneous space is compatible with the Poisson structures. According to a result of Liu, Weinstein and Xu, Poisson homogeneous spaces of a Poisson groupoid are in correspondence with suitable Dirac structures in the Courant algebroid defined by the Lie bialgebroid of the Poisson groupoid. We show that this correspondence result fits into a more natural context: the one of Dirac groupoids, which are objects generalizing Poisson groupoids and multiplicative closed 2-forms on groupoids.

  7. Homogeneous Catalysis by Transition Metal Compounds.

    ERIC Educational Resources Information Center

    Mawby, Roger

    1988-01-01

    Examines four processes involving homogeneous catalysis which highlight the contrast between the simplicity of the overall reaction and the complexity of the catalytic cycle. Describes how catalysts provide circuitous routes in which all energy barriers are relatively low rather than lowering the activation energy for a single step reaction.…

  8. STEAM STIRRED HOMOGENEOUS NUCLEAR REACTOR

    DOEpatents

    Busey, H.M.

    1958-06-01

    A homogeneous nuclear reactor utilizing a self-circulating liquid fuel is described. The reactor vessel is in the form of a vertically disposed tubular member having the lower end closed by the tube walls and the upper end closed by a removable flanged assembly. A spherical reaction shell is located in the lower end of the vessel and spaced from the inside walls. The reaction shell is perforated on its lower surface and is provided with a bundle of small-diameter tubes extending vertically upward from its top central portion. The reactor vessel is surrounded in the region of the reaction shell by a neutron reflector. The liquid fuel, which may be a solution of enriched uranyl sulfate in ordinary or heavy water, is maintained at a level within the reactor vessel of approximately the top of the tubes. The heat of the reaction created in the critical region within the spherical reaction shell forms steam bubbles which move upwardly through the tubes. The upward movement of these bubbles forces the liquid fuel out of the top of the tubes, from where it passes downwardly in the space between the tubes and the vessel wall and is cooled by heat exchangers. The fuel then re-enters the critical region in the reaction shell through the perforations in the bottom. The upper portion of the reactor vessel is provided with baffles to prevent the liquid fuel from splashing into this region, which is also provided with a recombiner apparatus for recombining the radiolytically dissociated moderator vapor, and a control means.

  9. Effects of sample homogenization on solid phase sediment toxicity

    SciTech Connect

    Anderson, B.S.; Hunt, J.W.; Newman, J.W.; Tjeerdema, R.S.; Fairey, W.R.; Stephenson, M.D.; Puckett, H.M.; Taberski, K.M.

    1995-12-31

    Sediment toxicity is typically assessed using homogenized surficial sediment samples. It has been recognized that homogenization alters sediment integrity and may result in changes in chemical bioavailability through oxidation-reduction or other chemical processes. In this study, intact (unhomogenized) sediment cores were taken from a Van Veen grab sampler and tested concurrently with sediment homogenate from the same sample in order to investigate the effect of homogenization on toxicity. Two different solid-phase toxicity test protocols were used for these comparisons. Results of amphipod exposures to samples from San Francisco Bay indicated minimal difference between intact and homogenized samples. Mean amphipod survival in intact cores relative to homogenates was similar at two contaminated sites. Mean survival was 34 and 33% in intact and homogenized samples, respectively, at Castro Cove. Mean survival was 41% and 57%, respectively, in intact and homogenized samples from Islais Creek. Studies using the sea urchin development protocol, modified for testing at the sediment/water interface, indicated considerably more toxicity in intact samples relative to homogenized samples from San Diego Bay. Measures of metal flux into the overlying water demonstrated greater flux of metals from the intact samples. Zinc flux was five times greater, and copper flux was twice as great in some intact samples relative to homogenates. Future experiments will compare flux of metals and organic compounds in intact and homogenized sediments to further evaluate the efficacy of using intact cores for solid phase toxicity assessment.

  10. Integration of a nurse navigator into the triage process for patients with non-small-cell lung cancer: creating systematic improvements in patient care

    PubMed Central

    Zibrik, K.; Laskin, J.; Ho, C.

    2016-01-01

    Nurse navigation is a developing facet of oncology care. The concept of patient navigation was originally created in 1990 at the Harlem Hospital Center in New York City as a strategy to assist vulnerable and socially disadvantaged populations with timely access to breast cancer care. Since the mid-1990s, navigation programs have expanded to include many patient populations that require specialized management and prompt access to diagnostic and clinical resources. Advanced non-small-cell lung cancer is ideally suited for navigation to facilitate efficient assessment in this fragile patient population and to ensure timely results of molecular tests for first-line therapy with appropriately targeted agents. At the BC Cancer Agency, nurse navigator involvement with thoracic oncology triage has been demonstrated to increase the proportion of patients receiving systemic treatment, to shorten the time to delivery of systemic treatment, and to increase the rate of molecular testing and the number of patients with molecular testing results available at time of initial consultation. Insights gained through the start-up process are briefly discussed, and a framework for implementation at other institutions is outlined. PMID:27330366

  11. Creating New Incentives for Risk Identification and Insurance Process for the Electric Utility Industry (initial award through Award Modification 2); Energy & Risk Transfer Assessment (Award Modifications 3 - 6)

    SciTech Connect

    Michael Ebert

    2008-02-28

    This is the final report for the DOE-NETL grant entitled 'Creating New Incentives for Risk Identification & Insurance Processes for the Electric Utility Industry' and later, 'Energy & Risk Transfer Assessment'. It reflects work done on projects from 15 August 2004 to 29 February 2008. Projects were on a variety of topics, including commercial insurance for electrical utilities, the Electrical Reliability Organization, cost recovery by Gulf State electrical utilities after major hurricanes, and review of state energy emergency plans. This Final Technical Report documents and summarizes all work performed during the award period, which in this case is from 15 August 2004 (date of notification of original award) through 29 February 2008. This report presents this information in a comprehensive, integrated fashion that clearly shows a logical and synergistic research trajectory, and is augmented with findings and conclusions drawn from the research as a whole. Four major research projects were undertaken and completed during the 42 month period of activities conducted and funded by the award; these are: (1) Creating New Incentives for Risk Identification and Insurance Process for the Electric Utility Industry (also referred to as the 'commercial insurance' research). Three major deliverables were produced: a pre-conference white paper, a two-day facilitated stakeholders workshop conducted at George Mason University, and a post-workshop report with findings and recommendations. All deliverables from this work are published on the CIP website at http://cipp.gmu.edu/projects/DoE-NETL-2005.php. (2) The New Electric Reliability Organization (ERO): an examination of critical issues associated with governance, standards development and implementation, and jurisdiction (also referred to as the 'ERO study'). Four major deliverables were produced: a series of preliminary memoranda for the staff of the Office of Electricity Delivery and Energy Reliability ('OE'), an ERO interview

  12. Thermocouple homogeneity scanning

    NASA Astrophysics Data System (ADS)

    Webster, E.; White, D. R.

    2015-02-01

    The inhomogeneities within a thermocouple influence the measured temperature and contribute the largest component of uncertainty. Currently there is no accepted best practice for measuring the inhomogeneities or for forecasting their effects on real-world measurements. The aim of this paper is to provide guidance on the design and performance assessment of thermocouple inhomogeneity scanners by characterizing the qualitative performance of the various designs reported in the literature, and by developing a quantitative measure of scanner resolution. Numerical simulations incorporating Fourier transforms and convolutions are used to gauge the levels of attenuation and distortion present in single- and double-gradient scanners. Single-gradient scanners are found to be far superior to double-gradient scanners, which are unsuitable for quantitative measurements due to their blindness to inhomogeneities at many spatial frequencies and severe attenuation of signals at other frequencies. It is recommended that the standard deviation of the temperature gradient within the scanner be used as a measure of the scanner resolution and spatial bandwidth. Recommendations for the design of scanners are presented, including advice on the basic design of scanners, the media employed, operating temperature, scan rates, construction of survey probes, data processing, gradient symmetry, and the spatial resolution required for research and calibration applications.
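
    The attenuation effect described above can be illustrated with a toy convolution model: the measured signal is approximated as the wire's inhomogeneity profile smoothed by the scanner's gradient zone. All profiles, widths, and amplitudes below are assumed for illustration and are not taken from the paper.

        import numpy as np

        # Sinusoidal inhomogeneity along the wire, 50 mm spatial period.
        n = 2000
        position = np.linspace(0.0, 1.0, n)  # metres
        inhomogeneity = 0.002 * np.sin(2 * np.pi * position / 0.05)

        # Single-gradient scanner: one localized gradient zone, modelled as
        # a normalized Gaussian kernel of 20 mm characteristic width.
        width = 0.02
        kernel = np.exp(-0.5 * ((position - 0.5) / width) ** 2)
        kernel /= kernel.sum()

        measured = np.convolve(inhomogeneity, kernel, mode="same")

        # Amplitude ratio << 1: this spatial frequency is strongly attenuated.
        print(measured.max() / inhomogeneity.max())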

  13. Homogeneity analysis of precipitation series in Iran

    NASA Astrophysics Data System (ADS)

    Hosseinzadeh Talaee, P.; Kouchakzadeh, Mahdi; Shifteh Some'e, B.

    2014-10-01

    Assessment of the reliability and quality of historical precipitation data is required in the modeling of hydrology and water resource processes and for climate change studies. The homogeneity of the annual and monthly precipitation data sets throughout Iran was tested using the Bayesian, Cumulative Deviations, and von Neumann tests at a significance level of 0.05. Precipitation records from 41 meteorological stations covering the years 1966 to 2005 were considered. The annual series of Iranian precipitation were found to be homogeneous by the Bayesian and Cumulative Deviations tests, while the von Neumann test detected inhomogeneities at seven stations. Almost all the monthly precipitation data sets are homogeneous and considered "useful." The outputs of the statistical tests for the homogeneity analysis of the precipitation time series showed discrepancies in some cases, which are related to the different sensitivities of the tests to breaks in the time series. It was found that the von Neumann test is more sensitive than the Bayesian and Cumulative Deviations tests in detecting inhomogeneity in precipitation series.
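
    Of the three tests, the von Neumann test is the simplest to state: its ratio compares successive differences with the overall variance, and has an expected value of 2 for a homogeneous series. A minimal sketch with synthetic annual precipitation series (all values assumed for illustration):

        import numpy as np

        def von_neumann_ratio(series):
            # N = sum of squared successive differences divided by the sum
            # of squared deviations from the mean; E[N] = 2 for a
            # homogeneous series, values well below 2 suggest a break.
            x = np.asarray(series, dtype=float)
            return np.sum(np.diff(x) ** 2) / np.sum((x - x.mean()) ** 2)

        rng = np.random.default_rng(42)
        homogeneous = rng.normal(300, 50, 40)  # 40 years of annual totals (mm)
        shifted = np.concatenate([rng.normal(300, 50, 20),
                                  rng.normal(400, 50, 20)])  # break at year 20

        print(von_neumann_ratio(homogeneous))  # close to 2
        print(von_neumann_ratio(shifted))      # well below 2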

  14. (Ultra) High Pressure Homogenization for Continuous High Pressure Sterilization of Pumpable Foods – A Review

    PubMed Central

    Georget, Erika; Miller, Brittany; Callanan, Michael; Heinz, Volker; Mathys, Alexander

    2014-01-01

    Bacterial spores have a strong resistance to both chemical and physical hurdles and create a risk for the food industry, which has been tackled by applying high-intensity thermal treatments to sterilize food. These strong thermal treatments reduce the organoleptic and nutritional properties of food, and alternatives are actively being sought. Innovative hurdles offer an alternative means of inactivating bacterial spores. In particular, recent technological developments have enabled a new generation of high pressure homogenizers working at pressures up to 400 MPa, opening new opportunities for high pressure sterilization of foods. In this short review, we summarize the work conducted on (ultra) high pressure homogenization ((U)HPH) to inactivate endospores in model and food systems. Specific attention is given to process parameters (pressure, inlet, and valve temperatures). This review gathers the current state of the art and underlines the potential of UHPH sterilization of pumpable foods while highlighting the needs for future work. PMID:25988118

  15. The Architecture of a Homogeneous Vector Supercomputer

    NASA Astrophysics Data System (ADS)

    Gustafson, J. L.; Hawkinson, S.; Scott, K.

    A new homogeneous computer architecture combines two fundamental techniques for high-speed computing: parallelism based on the binary n-cube interconnect, and pipelined vector arithmetic. The design makes extensive use of VLSI technology, resulting in a processing node that can be economically replicated. The new system achieves a careful balance between high-speed communication and floating-point computation. This paper describes the new architecture in detail and explores some of the issues in developing effective software.
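
    The binary n-cube interconnect mentioned above has a simple addressing property: each node connects to the n nodes whose addresses differ in exactly one bit. A small illustrative sketch (not tied to the machine's actual software):

        def hypercube_neighbors(node, dimension):
            # Neighbors in a binary n-cube: flip each address bit in turn.
            return [node ^ (1 << bit) for bit in range(dimension)]

        # In a 4-cube (16 nodes), node 5 (0101) links to 4, 7, 1 and 13.
        print(hypercube_neighbors(5, 4))  # [4, 7, 1, 13]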

  16. Using high-performance ¹H NMR (HP-qNMR®) for the certification of organic reference materials under accreditation guidelines--describing the overall process with focus on homogeneity and stability assessment.

    PubMed

    Weber, Michael; Hellriegel, Christine; Rueck, Alexander; Wuethrich, Juerg; Jenks, Peter

    2014-05-01

    Quantitative NMR spectroscopy (qNMR) is gaining interest across both analytical and industrial research applications and has become an essential tool for content assignment and the quantitative determination of impurities. The key benefits of using qNMR as a measurement method for the purity determination of organic molecules are discussed, with emphasis on the ability to establish traceability to the International System of Units (SI). The work describes a routine certification procedure from the point of view of a commercial producer of certified reference materials (CRMs) under ISO/IEC 17025 and ISO Guide 34 accreditation, which resulted in a set of essential references for (1)H qNMR measurements; the relevant application data for these substances are given. The overall process includes specific selection criteria, pre-tests, experimental conditions, and homogeneity and stability studies. The advantages of an accelerated stability study over the classical stability-test design are shown with respect to shelf-life determination and shipping conditions. PMID:24182847
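
    For context, content assignment by (1)H qNMR against an internal calibrant conventionally rests on the relation below (the standard textbook form, not quoted from this paper):

        P_{a} = \frac{I_{a}}{I_{\mathrm{cal}}} \cdot \frac{N_{\mathrm{cal}}}{N_{a}} \cdot \frac{M_{a}}{M_{\mathrm{cal}}} \cdot \frac{m_{\mathrm{cal}}}{m_{a}} \cdot P_{\mathrm{cal}}

    where I is the integrated signal area, N the number of nuclei giving rise to the signal, M the molar mass, m the weighed mass, and P the purity of the analyte (a) and calibrant (cal). Traceability to the SI follows from the certified purity of the calibrant.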

  17. The OPtimising HEalth LIterAcy (Ophelia) process: study protocol for using health literacy profiling and community engagement to create and implement health reform

    PubMed Central

    2014-01-01

    Background Health literacy is a multi-dimensional concept comprising a range of cognitive, affective, social, and personal skills and attributes. This paper describes the research and development protocol for a large communities-based collaborative project in Victoria, Australia that aims to identify and respond to health literacy issues for people with chronic conditions. The project, called Ophelia (OPtimising HEalth LIterAcy) Victoria, is a partnership between two universities, eight service organisations and the Victorian Government. Based on the identified issues, it will develop and pilot health literacy interventions across eight disparate health services to inform the creation of a health literacy response framework to improve health outcomes and reduce health inequalities. Methods/Design The protocol draws on many inputs including the experience of the partners in previous co-creation and roll-out of large-scale health-promotion initiatives. Three key conceptual models/discourses inform the protocol: intervention mapping, quality improvement collaboratives, and realist synthesis. The protocol is outcomes-oriented and focuses on two key questions: ‘What are the health literacy strengths and weaknesses of clients of participating sites?’, and ‘How do sites interpret and respond to these in order to achieve positive health and equity outcomes for their clients?’. The process has six steps in three main phases. The first phase is a needs assessment that uses the Health Literacy Questionnaire (HLQ), a multi-dimensional measure of health literacy, to identify common health literacy needs among clients. The second phase involves front-line staff and management within each service organisation in co-creating intervention plans to strategically respond to the identified local needs. The third phase will trial the interventions within each site to determine if the site can improve identified limitations to service access and/or health outcomes. Discussion

  18. AQUEOUS HOMOGENEOUS REACTOR TECHNICAL PANEL REPORT

    SciTech Connect

    Diamond, D.J.; Bajorek, S.; Bakel, A.; Flanagan, G.; Mubayi, V.; Skarda, R.; Staudenmeier, J.; Taiwo, T.; Tonoike, K.; Tripp, C.; Wei, T.; Yarsky, P.

    2010-12-03

    Considerable interest has been expressed for developing a stable U.S. production capacity for medical isotopes and particularly for molybdenum-99 (99Mo). This is motivated by recent reductions in production and supply worldwide. Consistent with U.S. nonproliferation objectives, any new production capability should not use highly enriched uranium fuel or targets. Consequently, Aqueous Homogeneous Reactors (AHRs) are under consideration for potential 99Mo production using low-enriched uranium. Although the Nuclear Regulatory Commission (NRC) has guidance to facilitate the licensing process for non-power reactors, that guidance is focused on reactors with fixed, solid fuel and hence is not applicable to an AHR. A panel was convened to study the technical issues associated with normal operation and potential transients and accidents of an AHR that might be designed for isotope production. The panel has produced the requisite AHR licensing guidance for three chapters that exist now for non-power reactor licensing: Reactor Description, Reactor Coolant Systems, and Accident Analysis. The guidance is in two parts for each chapter: 1) the standard format and content a licensee would use and 2) the standard review plan the NRC staff would use. This guidance takes into account the unique features of an AHR, such as the fuel being in solution; the fission product barriers being the vessel and attached systems; the production and release of radiolytic and fission product gases and their impact on operations and their control by a gas management system; and the movement of fuel into and out of the reactor vessel.

  19. Creating Sub-50 nm Nanofluidic Junctions in PDMS Microchip via Self-Assembly Process of Colloidal Silica Beads for Electrokinetic Concentration of Biomolecules

    PubMed Central

    Syed, A.; Mangano, L.; Mao, P.; Han, J.

    2014-01-01

    In this work we describe a novel and simple self-assembly of colloidal silica beads to create a nanofluidic junction between two microchannels. The nanoporous membrane was used to induce ion concentration polarization inside the microchannel, and this electrokinetic preconcentration system allowed rapid concentration of DNA samples by ∼1700 times and protein samples by ∼100 times within 5 minutes. PMID:25254651

  20. Strongly Interacting Homogeneous Fermi Gases

    NASA Astrophysics Data System (ADS)

    Mukherjee, Biswaroop; Patel, Parth; Yan, Zhenjie; Struck, Julian; Zwierlein, Martin

    2016-05-01

    We present a homogeneous box potential for strongly interacting Fermi gases. The local density approximation (LDA) allows measurements on traditional inhomogeneous traps to observe a continuous distribution of Fermi gases in a single shot, but such measurements also suffer from a broadened response due to line-of-sight averaging over varying densities. We trap ultracold fermionic ⁶Li in an optical homogeneous potential and characterize its flatness through in-situ tomography. A hybrid approach combining a cylindrical optical potential with a harmonic magnetic trap allows us to exploit the LDA and measure local RF spectra without requiring significant image reconstruction. We extract various quantities from the RF spectra, such as Tan's contact, and discuss further measurements of homogeneous Fermi systems under spin imbalance and at finite temperature.
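
    The LDA invoked above treats each point of an inhomogeneous trap as a homogeneous gas with a locally shifted chemical potential; in standard notation (a textbook relation, not a formula quoted from this abstract),

        \mu(\mathbf{r}) = \mu_{0} - V(\mathbf{r})

    so line-of-sight imaging averages over a range of local chemical potentials, which is exactly the broadened response that a flat box potential avoids.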

  1. Entanglement Created by Dissipation

    SciTech Connect

    Alharbi, Abdullah F.; Ficek, Zbigniew

    2011-10-27

    A technique for entangling closely separated atoms by the process of dissipative spontaneous emission is presented. The system considered is composed of two non-identical two-level atoms separated by a quarter wavelength of a driven standing wave laser field. At this atomic distance, only one of the atoms can be addressed by the laser field. In addition, we arrange the atomic dipole moments to be oriented relative to the inter-atomic axis such that the dipole-dipole interaction between the atoms is zero at this specific distance. It is shown that entanglement can be created between the atoms on demand by tuning the Rabi frequency of the driving field to the difference between the atomic transition frequencies. The amount of entanglement created depends on the ratio between the damping rates of the atoms, but is independent of the frequency difference between the atoms. We also find that the transient buildup of entanglement between the atoms may differ dramatically for different initial atomic conditions.

  2. Broken Ergodicity in Ideal, Homogeneous, Incompressible Turbulence

    NASA Technical Reports Server (NTRS)

    Morin, Lee; Shebalin, John; Fu, Terry; Nguyen, Phu; Shum, Victor

    2010-01-01

    We discuss the statistical mechanics of numerical models of ideal homogeneous, incompressible turbulence and their relevance for dissipative fluids and magnetofluids. These numerical models are based on Fourier series and the relevant statistical theory predicts that Fourier coefficients of fluid velocity and magnetic fields (if present) are zero-mean random variables. However, numerical simulations clearly show that certain coefficients have a non-zero mean value that can be very large compared to the associated standard deviation. We explain this phenomenon in terms of 'broken ergodicity', which is defined to occur when dynamical behavior does not match ensemble predictions on very long time-scales. We review the theoretical basis of broken ergodicity, apply it to 2-D and 3-D fluid and magnetohydrodynamic simulations of homogeneous turbulence, and show new results from simulations using GPU (graphical processing unit) computers.
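
    Operationally, broken ergodicity can be spotted by testing whether the time average of a coefficient is nonzero relative to its standard error. The sketch below is an illustrative diagnostic under that definition, not the authors' analysis code; the effective sample count is an input the user must supply from the correlation time.

        import numpy as np

        def ergodicity_ratio(coeff_series, n_independent):
            """Compare the time mean of one Fourier coefficient with the
            zero-mean ensemble prediction. coeff_series: samples of the
            coefficient over a long run; n_independent: effective number
            of independent samples (set by the correlation time).
            A ratio much greater than 1 suggests broken ergodicity."""
            mean = np.mean(coeff_series)
            stderr = np.std(coeff_series, ddof=1) / np.sqrt(n_independent)
            return abs(mean) / stderr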

  3. The Art of Gymnastics: Creating Sequences.

    ERIC Educational Resources Information Center

    Rovegno, Inez

    1988-01-01

    Offering students opportunities for creating movement sequences in gymnastics allows them to understand the essence of gymnastics, have creative experiences, and learn about themselves. The process of creating sequences is described. (MT)

  4. Homogenizing Developmental Studies and ESL.

    ERIC Educational Resources Information Center

    Weaver, Margaret E.

    A discussion of pragmatic issues in both developmental studies (DS) and English-as-a-second-language (ESL) instruction at the college level argues that because the two fields have common problems, challenges, and objectives, they have become homogenized as one in many institutions. Because full-time college faculty avoid teaching developmental…

  5. High frequency homogenization for structural mechanics

    NASA Astrophysics Data System (ADS)

    Nolde, E.; Craster, R. V.; Kaplunov, J.

    2011-03-01

    We consider a net created from elastic strings as a model structure to investigate the propagation of waves through semi-discrete media. We are particularly interested in the development of continuum models, valid at high frequencies, when the wavelength and each cell of the net are of similar order. Net structures are chosen as these form a general two-dimensional example, encapsulating the essential physics involved in the two-dimensional excitation of a lattice structure whilst retaining the simplicity of dealing with elastic strings. Homogenization techniques are developed here for wavelengths commensurate with the cellular scale. Unlike previous theories, these techniques are not limited to low frequency or static regimes, and lead to effective continuum equations valid on a macroscale with the details of the cellular structure encapsulated only through integrated quantities. The asymptotic procedure is based upon a two-scale approach and the physical observation that there are frequencies that give standing waves, periodic with the period or double-period of the cell. A specific example of a net created by a lattice of elastic strings is constructed; the theory is general and not reliant upon the net being infinite; nonetheless, the infinite net is a useful special case for which Bloch theory can be applied. This special case is explored in detail, allowing for verification of the theory, and highlights the importance of degenerate cases; the specific example of a square net is treated in detail. An additional illustration of the versatility of the method is the response to point forcing, which provides a stringent test of the homogenized equations; an exact Green's function for the net is deduced and compared to the asymptotics.
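
    For the infinite-net special case, the Bloch condition used for verification takes the standard form (standard lattice-wave theory, not a formula quoted from the paper):

        u(\mathbf{x} + \mathbf{l}) = e^{\,i\boldsymbol{\kappa}\cdot\mathbf{l}}\, u(\mathbf{x})

    so the standing waves that are periodic with the period or double-period of the cell correspond to Bloch wavevectors at the centre or edge of the Brillouin zone, which is where the two-scale expansion is anchored.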

  6. Homogeneous cooling state of frictionless rod particles

    NASA Astrophysics Data System (ADS)

    Rubio-Largo, S. M.; Alonso-Marroquin, F.; Weinhart, T.; Luding, S.; Hidalgo, R. C.

    2016-02-01

    In this work, we report some theoretical results on granular gases consisting of frictionless 3D rods with low energy dissipation. We performed simulations on the temporal evolution of soft spherocylinders, using a molecular dynamics algorithm implemented on GPU architecture. A homogeneous cooling state for rods, where the time dependence of the system's intensive variables occurs only through a global granular temperature, has been identified. We have found a homogeneous cooling process, which is in excellent agreement with Haff's law, when using an adequate rescaling time τ(ξ), the value of which depends on the particle elongation ξ and the restitution coefficient. It was further found that scaled particle velocity distributions remain approximately Gaussian regardless of the particle shape. Similarly to a system of ellipsoids, energy equipartition between rotational and translational degrees of freedom was better satisfied as one gets closer to the elastic limit. Taking advantage of scaling properties, we have numerically determined the general functional form of the quantity Dc(ξ), which describes the efficiency of the energy interchange between rotational and translational degrees of freedom, as well as its dependence on particle shape. We have detected a range of particle elongations (1.5 < ξ < 4.0) where the average energy transfer between the rotational and translational degrees of freedom is greater for spherocylinders than for homogeneous ellipsoids with the same aspect ratio.
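
    Haff's law, against which the cooling curves above are checked, is commonly written as (a standard result, not restated in the abstract)

        T(t) = \frac{T_0}{\left(1 + t/\tau\right)^{2}}

    with the rescaled time τ(ξ) of the abstract playing the role of the characteristic cooling time for rods of elongation ξ.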

  7. A compact setup to study homogeneous nucleation and condensation.

    PubMed

    Karlsson, Mattias; Alxneit, Ivo; Rütten, Frederik; Wuillemin, Daniel; Tschudi, Hans Rudolf

    2007-03-01

    An experiment is presented to study homogeneous nucleation and the subsequent droplet growth at high temperatures and high pressures in a compact setup that does not use moving parts. Nucleation and condensation are induced in an adiabatic, stationary expansion of the vapor and an inert carrier gas through a Laval nozzle. The adiabatic expansion is driven against atmospheric pressure by pressurized inert gas, its mass flow carefully controlled. This allows us to avoid large pumps or vacuum storage tanks. Because we eventually want to study the homogeneous nucleation and condensation of zinc, carefully chosen materials are required that can withstand pressures of up to 10⁶ Pa, resulting from mass flow rates of up to 600 l(N) min⁻¹, and temperatures up to 1200 K in the presence of highly corrosive zinc vapor. To observe the formation of droplets, a laser beam propagates along the axis of the nozzle and the light scattered by the droplets is detected perpendicularly to the nozzle axis. An ICCD camera allows the scattered light to be recorded, spatially resolved, through fused silica windows in the diverging part of the nozzle, and nucleation and condensation to be detected coherently in a single exposure. For the data analysis, a model is needed to describe the isentropic core part of the flow along the nozzle axis. The model must incorporate the laws of fluid dynamics and the nucleation and condensation process, and it has to predict the size distribution of the particles created (PSD) at every position along the nozzle axis. Assuming Rayleigh scattering, the intensity of the scattered light can then be calculated from the second moment of the PSD. PMID:17411197
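
    One way to read the closing statement: in the Rayleigh regime each droplet scatters in proportion to r⁶, i.e. to the square of its volume, so a PSD expressed over droplet volume enters the signal through its second moment. The sketch below assumes that reading and a discretized PSD; prefactors and variable names are not from the paper.

        import numpy as np

        def rayleigh_intensity(radii, counts):
            """Relative scattered intensity for Rayleigh-sized droplets:
            per-droplet intensity scales as r**6, i.e. volume squared, so
            the total is proportional to the second moment of a
            volume-based PSD. Wavelength and refractive-index prefactors
            are omitted."""
            volumes = (4.0 / 3.0) * np.pi * np.asarray(radii) ** 3
            return np.sum(np.asarray(counts) * volumes**2)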

  8. Self Creating Universe

    NASA Astrophysics Data System (ADS)

    Terry, Bruce

    2001-04-01

    Cosmology has deduced that our existence began 15 billion years ago but that does not constitute a true story. When compared against infinity, the true question one must ask is, ‘why did creation begin now (a mere 15 billion years ago, give or take) and not at some infinite point before? What could keep the one common original source static for an infinity, and then spring forth into existence?’ Also, accelerators are actually creating atmospheres much like those within quasars, black holes and stars. This destructive/creative environment is not that of original creation; it is that which occurs in a later stage of cosmic evolution. Knowing that it is only a matter of movement or change, understanding what is moving is the key. Regardless of how much power is used to alter the character of a particle’s matter, it does not make its essence go away, nor does it make the understanding of original essence clearer. To find the true answer of what occurred, one must look back in time and think carefully over the process of elimination to find the original creation of matter, albeit different than that of the later processes. Matter and the physical laws formed themselves in an absolute infinity of blackness prior to light and no Big Bang scenario was necessary.

  9. Creating improved ASTER DEMs over glacierized terrain

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Khalsa, S. S.; Armstrong, R.

    2006-12-01

    Digital elevation models (DEMs) produced from ASTER stereo imagery over glacierized terrain frequently contain data voids, which some software packages fill by interpolation. Even when interpolation is applied, the results are often not accurate enough for studies of glacier thickness changes. DEMs are created by automatic cross-correlation between the image pairs, and rely on spatial variability in the digital number (DN) values for this process. Voids occur in radiometrically homogeneous regions, such as glacier accumulation areas covered with uniform snow, due to lack of correlation. The same property that leads to lack of correlation makes possible the derivation of elevation information from photoclinometry, also known as shape-from-shading. We demonstrate a technique to produce improved DEMs from ASTER data by combining the results from conventional cross-correlation DEM-generation software with elevation information produced from shape-from-shading in the accumulation areas of glacierized terrain. The resulting DEMs incorporate more information from the imagery, and the filled voids more accurately represent the glacier surface. This will allow for more accurate determination of glacier hypsometry and thickness changes, leading to better predictions of response to climate change.
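
    A minimal sketch of the combination step described above: void pixels in the cross-correlation DEM are replaced by shape-from-shading elevations after removing any systematic offset between the two surfaces. Array names are hypothetical, and the offset handling is an assumption; the paper does not specify how the surfaces are tied together.

        import numpy as np

        def fill_voids(dem_xcorr, dem_sfs):
            """Fill NaN voids in a cross-correlation DEM with
            shape-from-shading elevations, first shifting the SfS surface
            by the median offset measured where both DEMs are valid."""
            voids = np.isnan(dem_xcorr)
            both_valid = ~voids & ~np.isnan(dem_sfs)
            offset = np.median(dem_xcorr[both_valid] - dem_sfs[both_valid])
            filled = dem_xcorr.copy()
            filled[voids] = dem_sfs[voids] + offset
            return filled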

  10. Homogeneous melting of superheated crystals: Molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Forsblom, Mattias; Grimvall, Göran

    2005-08-01

    The homogeneous melting mechanism in a superheated fcc lattice is studied through molecular dynamics simulations, usually for about 20 000 atoms, with the Ercolessi and Adams interaction that represents aluminum. The periodic boundary conditions for the simulation cell suppress the usual surface-initiated melting at Tm=939K, and the solid-to-liquid transition takes place at the temperature Ts=1.3Tm. By logging the position of each atom at every time step in the simulation, we can follow the melting process in detail at the atomic level. Thermal fluctuations close to Ts create interstitial-vacancy pairs, which occasionally separate into mobile interstitials and almost immobile vacancies. There is an attraction between two interstitials, with a calculated maximum interaction energy of about 0.7 eV. When three to four migrating interstitials have come close enough to form a bound aggregate of point defects, and a few thermally created interstitial-vacancy pairs have been added to the aggregate, such a defect configuration usually continues to grow irreversibly to the liquid state. For 20 000 atoms in the simulation cell, the growth process takes about 10²τ to be completed, where τ is the period of a typical atomic vibration in the solid phase. This melting mechanism involves fewer atoms in its crucial initial phase than has been suggested in other melting models. The elastic shear moduli c44 and c'=(c11-c12)/2 were calculated as a function of temperature and were shown to be finite at the onset of melting.

  11. Homogeneous Pt-bimetallic Electrocatalysts

    SciTech Connect

    Wang, Chao; Chi, Miaofang; More, Karren Leslie; Markovic, Nenad; Stamenkovic, Vojislav

    2011-01-01

    Alloying has shown enormous potential for tailoring the atomic and electronic structures, and improving the performance of catalytic materials. Systematic studies of alloy catalysts are, however, often compromised by inhomogeneous distribution of alloying components. Here we introduce a general approach for the synthesis of monodispersed and highly homogeneous Pt-bimetallic alloy nanocatalysts. Pt₃M (where M = Fe, Ni, or Co) nanoparticles were prepared by an organic solvothermal method and then supported on high surface area carbon. These catalysts attained a homogeneous distribution of elements, as demonstrated by atomic-scale elemental analysis using scanning transmission electron microscopy. They also exhibited high catalytic activities for the oxygen reduction reaction (ORR), with improvement factors of 2-3 versus conventional Pt/carbon catalysts. The measured ORR catalytic activities for Pt₃M nanocatalysts validated the volcano curve established on extended surfaces, with Pt₃Co being the most active alloy.

  12. Homogeneous enzyme immunoassay for netilmicin.

    PubMed Central

    Wenk, M; Hemmann, R; Follath, F

    1982-01-01

    A newly developed homogeneous enzyme immunoassay for the determination of netilmicin in serum was evaluated and compared with a radioenzymatic assay. A total of 102 serum samples from patients treated with netilmicin were measured by both methods. This comparison showed an excellent correlation (r = 0.993). The enzyme immunoassay has proved to be precise, accurate, and specific. Because of its rapidity and the ease of performance, this method is a useful alternative to current assays for monitoring serum netilmicin concentrations. PMID:6760807

  13. High School Student Perceptions of the Utility of the Engineering Design Process: Creating Opportunities to Engage in Engineering Practices and Apply Math and Science Content

    NASA Astrophysics Data System (ADS)

    Berland, Leema; Steingut, Rebecca; Ko, Pat

    2014-12-01

    Research and policy documents increasingly advocate for incorporating engineering design into K-12 classrooms in order to accomplish two goals: (1) provide an opportunity to engage with science content in a motivating real-world context; and (2) introduce students to the field of engineering. The present study uses multiple qualitative data sources (i.e., interviews, artifact analysis) in order to examine the ways in which engaging in engineering design can support students in participating in engineering practices and applying math and science knowledge. This study suggests that students better understand and value those aspects of engineering design that are more qualitative (i.e., interviewing users, generating multiple possible solutions) than the more quantitative aspects of design which create opportunities for students to integrate traditional math and science content into their design work (i.e., modeling or systematically choosing between possible design solutions). Recommendations for curriculum design and implementation are discussed.

  14. Multifractal spectra in homogeneous shear flow

    NASA Technical Reports Server (NTRS)

    Deane, A. E.; Keefe, L. R.

    1988-01-01

    Employing numerical simulations of 3-D homogeneous shear flow, the associated multifractal spectra of the energy dissipation, scalar dissipation and vorticity fields were calculated. The results for 128³ simulations of this flow, and those obtained in recent experiments that analyzed 1- and 2-D intersections of atmospheric and laboratory flows, are in some agreement. A two-scale Cantor set model of the energy cascade process, which describes the experimental results from 1-D intersections quite well, describes the 3-D results only marginally.
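
    Two-scale Cantor models of the cascade have closed-form multifractal spectra. The sketch below computes f(α) for the simplest binomial version (equal halves carrying weights p and 1-p); the paper's model may use unequal scales, so this is illustrative only.

        import numpy as np

        # Binomial cascade: at each step the measure on an interval is
        # split onto its two halves with weights p and 1 - p.
        p = 0.3
        q = np.linspace(-10.0, 10.0, 401)
        tau = -np.log2(p**q + (1.0 - p) ** q)   # Z(q, eps) ~ eps**tau(q)
        alpha = np.gradient(tau, q)              # singularity strength
        f = q * alpha - tau                      # Legendre transform: f(alpha)
        print(f"alpha range: [{alpha.min():.3f}, {alpha.max():.3f}], "
              f"max f = {f.max():.3f}")          # max f = 1 for the full interval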

  15. Variable valve timing in a homogenous charge compression ignition engine

    DOEpatents

    Lawrence, Keith E.; Faletti, James J.; Funke, Steven J.; Maloney, Ronald P.

    2004-08-03

    The present invention relates generally to the field of homogenous charge compression ignition engines, in which fuel is injected when the cylinder piston is relatively close to the bottom dead center position for its compression stroke. The fuel mixes with air in the cylinder during the compression stroke to create a relatively lean homogeneous mixture that preferably ignites when the piston is relatively close to the top dead center position. However, if the ignition event occurs either earlier or later than desired, lowered performance, engine misfire, or even engine damage, can result. The present invention utilizes internal exhaust gas recirculation and/or compression ratio control to control the timing of ignition events and combustion duration in homogeneous charge compression ignition engines. Thus, at least one electro-hydraulic assist actuator is provided that is capable of mechanically engaging at least one cam actuated intake and/or exhaust valve.
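
    The sensitivity of HCCI ignition timing to the two levers named above (internal EGR, which raises the charge temperature at intake valve closing, and effective compression ratio) can be seen from a polytropic compression estimate. Numbers and the polytropic exponent are purely illustrative, not from the patent.

        def end_of_compression_temp(t_ivc_k, compression_ratio, gamma=1.35):
            """Polytropic estimate of charge temperature near TDC:
            T_tdc = T_ivc * CR**(gamma - 1). Hot internal EGR raises
            T_ivc; varying the effective CR changes the compression
            factor -- together they move the autoignition point."""
            return t_ivc_k * compression_ratio ** (gamma - 1.0)

        for cr in (14, 16, 18):
            print(cr, round(end_of_compression_temp(350.0, cr)), "K")
        # ~881 K, ~924 K, ~963 K: a modest CR change shifts the charge
        # well across a typical autoignition threshold.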

  16. Reduction of pantethine in rabbit ocular lens homogenate.

    PubMed

    Fisher, D H; Szulc, M E

    1997-02-01

    In several animal models, preliminary studies have indicated that pantethine may inhibit cataract formation. Therefore, preclinical trials need to be conducted to study the pharmacology of pantethine in the ocular lens and to establish its efficacy. Since pantethine, which is a disulfide, can undergo a variety of chemical modifications such as reduction and formation of mixed disulfides, a detailed study was first conducted to determine the stability of pantethine in rabbit lens homogenate. A knowledge of the stability of pantethine in lens homogenate was necessary to establish whether pantethine could be metabolized in the time it takes to harvest and homogenize a lens. The results of this study will be used to establish a protocol for harvesting and homogenizing lens samples. Pantethine (100 microM) is completely reduced to pantetheine in rabbit lens homogenate in about 16 min. About 1.5% of the pantethine added to lens homogenate forms a mixed disulfide with lens proteins, and the remainder is found in the supernatant. The supernatant pantethine concentration decreases exponentially as a function of time, and the terminal half-life for this process is 3.3 min. The free supernatant pantetheine concentration increases in a pseudo-first-order manner as a function of time with a rate constant of 4.3 min. Pantethinase activity is not significant, because the free supernatant pantetheine concentration did not decrease. The exact mechanism of pantethine reduction in rabbit lens homogenate remains to be determined. PMID:9127277
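
    Taking the reported 3.3 min terminal half-life at face value, standard first-order kinetics (not the paper's own computation) connect the numbers in the abstract:

        import numpy as np

        t_half_min = 3.3                    # terminal half-life from the abstract
        k = np.log(2.0) / t_half_min        # first-order rate constant, 1/min

        def fraction_remaining(t_min):
            """First-order decay of supernatant pantethine."""
            return np.exp(-k * t_min)

        # 16 min is about 4.8 half-lives, so only a few percent remains,
        # consistent with near-complete reduction on that timescale.
        print(f"k = {k:.3f} 1/min, remaining at 16 min: "
              f"{fraction_remaining(16):.1%}")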

  17. Homogenization and improvement in energy dissipation of nonlinear composites

    NASA Astrophysics Data System (ADS)

    Verma, Luv; Sivakumar, Srinivasan M.; Vedantam, S.

    2016-04-01

    Due to their high strength-to-weight and stiffness-to-weight ratios, there has been a large shift from conventional metals towards composite materials, but composites have poor damage resistance in the transverse direction. Under impact loads, they can fail in a wide variety of modes, which severely reduces the structural integrity of the component. This paper deals with the homogenization of a glass-fiber and epoxy composite with an added material introduced as an inelastic inclusion. The nonlinearity is modelled by a kinematic hardening procedure, and homogenization is done by one of the mean-field homogenization techniques, known as the Mori-Tanaka method. The homogenization process considers two phases, one being the matrix and the other the inelastic inclusion; thus the glass fibers and epoxy are treated as a single phase that acts as the matrix while homogenizing the nonlinear composite. Homogenization results have been compared to the matrix at volume fraction zero of the inelastic inclusions and to the inelastic material at volume fraction one. After homogenization, the increase in energy dissipation in the composite due to the addition of inelastic material, and the effects on it of changing the properties of the matrix material, are discussed.
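
    The abstract does not reproduce the Mori-Tanaka estimate itself. For spherical, isotropic inclusions it has a standard closed form for the effective bulk modulus (the shear modulus is analogous but lengthier); the sketch below uses that textbook form with illustrative, hypothetical moduli.

        def mori_tanaka_bulk(k_m, g_m, k_i, f):
            """Mori-Tanaka effective bulk modulus for volume fraction f of
            spherical inclusions (bulk modulus k_i) in an isotropic matrix
            (bulk k_m, shear g_m). For this geometry the estimate
            coincides with a Hashin-Shtrikman bound."""
            return k_m + f * (k_i - k_m) * (3 * k_m + 4 * g_m) / (
                3 * k_m + 4 * g_m + 3 * (1 - f) * (k_i - k_m))

        # Hypothetical moduli in GPa: epoxy-like matrix, stiff inclusion,
        # 20% volume fraction.
        print(round(mori_tanaka_bulk(k_m=4.0, g_m=1.5, k_i=40.0, f=0.2), 2))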

  18. Matrix shaped pulsed laser deposition: New approach to large area and homogeneous deposition

    NASA Astrophysics Data System (ADS)

    Akkan, C. K.; May, A.; Hammadeh, M.; Abdul-Khaliq, H.; Aktas, O. C.

    2014-05-01

    Pulsed laser deposition (PLD) is one of the well-established physical vapor deposition methods used for the synthesis of ultra-thin layers. PLD is especially suitable for the preparation of thin films of complex alloys and ceramics where conservation of the stoichiometry is critical. Despite the several advantages of PLD, inhomogeneity in thickness limits its use in some applications. Several approaches exist to achieve homogeneous layers, such as rotation of the substrate or scanning of the laser beam over the target; on the other hand, such movement and transitions create further complexity in the process parameters. Here we present a new approach, which we call Matrix Shaped PLD, to control the thickness and homogeneity of deposited layers precisely. This new approach is based on shaping of the incoming laser beam by a microlens array and a Fourier lens. The beam is split into a much smaller multi-beam array over the target, and this leads to homogeneous plasma formation. The uniform intensity distribution over the target yields a very uniform deposit on the substrate. This approach is used to deposit carbide and oxide thin films for biomedical applications. As a case study, the coating of a stent, which has a complex geometry, is presented briefly.

  19. Homogeneous cooling of mixtures of particle shapes

    NASA Astrophysics Data System (ADS)

    Hidalgo, R. C.; Serero, D.; Pöschel, T.

    2016-07-01

    In this work, we examine theoretically the cooling dynamics of binary mixtures of spheres and rods. To this end, we introduce a generalized mean field analytical theory, which describes the free cooling behavior of the mixture. The relevant characteristic time scale for the cooling process is derived, depending on the mixture composition and the aspect ratio of the rods. We simulate mixtures of spherocylinders and spheres using a molecular dynamics algorithm implemented on graphics processing unit (GPU) architecture. We systematically study mixtures composed of spheres and rods with several aspect ratios and varying the mixture composition. A homogeneous cooling state, where the time dependence of the system's intensive variables occurs only through a global granular temperature, is identified. We find cooling dynamics in excellent agreement with Haff's law, when using an adequate time scale. Using the scaling properties of the homogeneous cooling dynamics, we estimated numerically the efficiency of the energy interchange between rotational and translational degrees of freedom for collisions between spheres and rods.

  20. Creating a Comprehensive, Efficient, and Sustainable Nuclear Regulatory Structure: A Process Report from the U.S. Department of Energy's Material Protection, Control and Accounting Program

    SciTech Connect

    Wright, Troy L.; O'Brien, Patricia E.; Hazel, Michael J.; Tuttle, John D.; Cunningham, Mitchel E.; Schlegel, Steven C.

    2010-08-11

    With the congressionally mandated January 1, 2013 deadline for the U.S. Department of Energy’s (DOE) Nuclear Material Protection, Control and Accounting (MPC&A) program to complete its transition of MPC&A responsibility to the Russian Federation, National Nuclear Security Administration (NNSA) management directed its MPC&A program managers and team leaders to demonstrate that work in ongoing programs would lead to successful and timely achievement of these milestones. In the spirit of planning for successful project completion, the NNSA review of the Russian regulatory development process confirmed the critical importance of an effective regulatory system to a sustainable nuclear protection regime and called for an analysis of the existing Russian regulatory structure and the identification of a plan to ensure a complete MPC&A regulatory foundation. This paper describes the systematic process used by DOE’s MPC&A Regulatory Development Project (RDP) to develop an effective and sustainable MPC&A regulatory structure in the Russian Federation. This nuclear regulatory system will address all non-military Category I and II nuclear materials at State Corporation for Atomic Energy “Rosatom,” the Federal Service for Ecological, Technological, and Nuclear Oversight (Rostechnadzor), the Federal Agency for Marine and River Transport (FAMRT, within the Ministry of Transportation), and the Ministry of Industry and Trade (Minpromtorg). The approach to ensuring a complete and comprehensive nuclear regulatory structure includes five sequential steps. The approach was adopted from DOE’s project management guidelines and was adapted to the regulatory development task by the RDP. The five steps in the Regulatory Development Process are: 1) Define MPC&A Structural Elements; 2) Analyze the existing regulatory documents using the identified Structural Elements; 3) Validate the analysis with Russian colleagues and define the list of documents to be developed; 4) Prioritize and

  1. Homogeneous Open Quantum Random Walks on a Lattice

    NASA Astrophysics Data System (ADS)

    Carbone, Raffaella; Pautrat, Yan

    2015-09-01

    We study open quantum random walks (OQRWs) for which the underlying graph is a lattice, and the generators of the walk are homogeneous in space. Using the results recently obtained in Carbone and Pautrat (Ann Henri Poincaré, 2015), we study the quantum trajectory associated with the OQRW, which is described by a position process and a state process. We obtain a central limit theorem and a large deviation principle for the position process. We study in detail the case of homogeneous OQRWs on the lattice, with internal space.

  2. ISOTOPE METHODS IN HOMOGENEOUS CATALYSIS.

    SciTech Connect

    BULLOCK,R.M.; BENDER,B.R.

    2000-12-01

    The use of isotope labels has had a fundamentally important role in the determination of mechanisms of homogeneously catalyzed reactions. Mechanistic data is valuable since it can assist in the design and rational improvement of homogeneous catalysts. There are several ways to use isotopes in mechanistic chemistry. Isotopes can be introduced into controlled experiments and followed where they go or don't go; in this way, Libby, Calvin, Taube and others used isotopes to elucidate mechanistic pathways for very different, yet important chemistries. Another important isotope method is the study of kinetic isotope effects (KIEs) and equilibrium isotope effects (EIEs). Here the mere observation of where a label winds up is no longer enough - what matters is how much slower (or faster) a labeled molecule reacts than the unlabeled material. The most careful studies essentially involve the measurement of isotope fractionation between a reference ground state and the transition state. Thus kinetic isotope effects provide unique data unavailable from other methods, since information about the transition state of a reaction is obtained. Because getting an experimental glimpse of transition states is really tantamount to understanding catalysis, kinetic isotope effects are very powerful.
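
    For orientation (standard definitions, not specific to this report): the kinetic isotope effect is the rate ratio KIE = k_H/k_D, the EIE is the analogous ratio of equilibrium constants, and a classic primary KIE follows semiclassically from the zero-point energy of the stretching mode lost at the transition state,

        \frac{k_\mathrm{H}}{k_\mathrm{D}} \approx \exp\!\left[\frac{h c\,(\tilde{\nu}_\mathrm{H} - \tilde{\nu}_\mathrm{D})}{2 k_\mathrm{B} T}\right]

    which is of order 7 at room temperature for a C-H stretch near 2900 cm⁻¹ (with ν̃_D ≈ ν̃_H/√2).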

  3. Creating a framework for experimentally testing early visual processing: a response to Nurmoja, et al. (2012) on trait perception from pixelized faces.

    PubMed

    Carbon, Claus-Christian

    2013-08-01

    Nurmoja, Eamets, Härma, and Bachmann (2012) revealed that strongly pixelated pictures of faces still provide relevant cues for reliably assessing the apparent (i.e., subjectively perceived) traits of the portrayed. The present article responds to the paper by developing the outline of a framework for future research to reveal certain steps in processing complex visual stimuli. This framework combines the approach of degradation of the stimuli with the so-called microgenetic approach of percepts based on presentation time limitations. The proposed combination of a particular kind of stimulus manipulation and a specific experimental procedure allows testing targeted assumptions concerning visual processing, not only in the domain of face perception, but in all domains involving complex visual stimuli, for example, art perception. PMID:24422351

  4. 3D modeling of the molten zone shape created by an asymmetric HF EM field during the FZ crystal growth process

    NASA Astrophysics Data System (ADS)

    Rudevics, A.; Muiznieks, A.; Ratnieks, G.; Riemann, H.

    2005-06-01

    In the modern industrial floating zone (FZ) silicon crystal growth process by the needle-eye technique, the high frequency (HF) electromagnetic (EM) field plays a crucial role. The EM field melts a rotating polysilicon feed rod and maintains the zone of molten silicon, which is held by the rotating single crystal. To model such a system, 2D axi-symmetric models can be used; however, due to the system's asymmetry (e.g., the asymmetry of the HF inductor), the applicability of such models is restricted. Therefore, modeling of the FZ process in three dimensions (3D) is necessary. This paper describes a new complex 3D mathematical model of FZ crystal growth and a correspondingly developed software package, Shape3D. A 3D calculation example for a realistic FZ system is also presented.

  5. Create a Logo.

    ERIC Educational Resources Information Center

    Duchen, Gail

    2002-01-01

    Presents an art lesson that introduced students to graphic art as a career path. Explains that the students met a graphic artist and created a logo for a pretend client. Explains that the students researched logos. (CMK)

  6. Invariant distributions on compact homogeneous spaces

    SciTech Connect

    Gorbatsevich, V V

    2013-12-31

    In this paper, we study distributions on compact homogeneous spaces, including invariant distributions and also distributions admitting a sub-Riemannian structure. We first consider distributions of dimension 1 and 2 on compact homogeneous spaces. After this, we study the cases of compact homogeneous spaces of dimension 2, 3, and 4 in detail. Invariant distributions on simply connected compact homogeneous spaces are also treated. Bibliography: 18 titles.

  7. Numerical experiments in homogeneous turbulence

    NASA Technical Reports Server (NTRS)

    Rogallo, R. S.

    1981-01-01

    The direct simulation methods developed by Orszag and Patterson (1972) for isotropic turbulence were extended to homogeneous turbulence in an incompressible fluid subjected to uniform deformation or rotation. The results of simulations for irrotational strain (plane and axisymmetric), shear, rotation, and relaxation toward isotropy following axisymmetric strain are compared with linear theory and experimental data. Emphasis is placed on the shear flow because of its importance and because of the availability of accurate and detailed experimental data. The computed results are used to assess the accuracy of two popular models used in the closure of the Reynolds-stress equations. Data from a variety of the computed fields and the details of the numerical methods used in the simulation are also presented.

  8. Homogenization of regional river dynamics by dams and global biodiversity implications.

    PubMed

    Poff, N Leroy; Olden, Julian D; Merritt, David M; Pepin, David M

    2007-04-01

    Global biodiversity in river and riparian ecosystems is generated and maintained by geographic variation in stream processes and fluvial disturbance regimes, which largely reflect regional differences in climate and geology. Extensive construction of dams by humans has greatly dampened the seasonal and interannual streamflow variability of rivers, thereby altering natural dynamics in ecologically important flows on continental to global scales. The cumulative effects of modification to regional-scale environmental templates caused by dams is largely unexplored but of critical conservation importance. Here, we use 186 long-term streamflow records on intermediate-sized rivers across the continental United States to show that dams have homogenized the flow regimes on third- through seventh-order rivers in 16 historically distinctive hydrologic regions over the course of the 20th century. This regional homogenization occurs chiefly through modification of the magnitude and timing of ecologically critical high and low flows. For 317 undammed reference rivers, no evidence for homogenization was found, despite documented changes in regional precipitation over this period. With an estimated average density of one dam every 48 km of third- through seventh-order river channel in the United States, dams arguably have a continental scale effect of homogenizing regionally distinct environmental templates, thereby creating conditions that favor the spread of cosmopolitan, nonindigenous species at the expense of locally adapted native biota. Quantitative analyses such as ours provide the basis for conservation and management actions aimed at restoring and maintaining native biodiversity and ecosystem function and resilience for regionally distinct ecosystems at continental to global scales. PMID:17360379

  9. Challenges of daily data homogenization

    NASA Astrophysics Data System (ADS)

    Gruber, C.; Auer, I.; Mestre, O.

    2009-04-01

    In recent years the growing demand for extreme value studies has led to the development of methods for the homogenisation of daily data. The behaviour of some of these methods has been investigated: two methods (HOM: Della-Marta and Wanner, 2006, and SPLIDHOM: Mestre et al., submitted) which adjust the whole distribution of the climate element (especially minimum and maximum temperature) have been compared to the simpler Vincent method (Vincent et al., 2002), which interpolates monthly adjustment factors onto daily data. The results indicate that the behaviour of HOM and SPLIDHOM is very similar, although the complexity of these methods differs. They can improve the results compared to the Vincent method when inhomogeneities in higher-order moments occur. However, their applicability is limited since highly correlated neighbour series are required. Moreover, more data in the intervals before and after breaks are needed if the whole distribution, rather than the mean only, is to be adjusted. Due to these limitations a combination of distribution-dependent adjustment methods and the Vincent method seems to be necessary for the homogenization of many time series. A dataset of Austrian daily maximum and minimum temperature data is used to illustrate the challenges of distribution-dependent homogenization methods. Emphasis is placed on the estimation of the (sampling) uncertainty of these methods, for which a bootstrap approach is used. The accuracy of the calculated adjustments varies between about 0.5°C for mean temperatures and more than one degree Celsius for the margins of the distribution. These uncertainty estimates can be valuable for extreme value studies.
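
    A toy version of a distribution-dependent adjustment, quantile mapping between the segments before and after a detected break, illustrates why the margins of the distribution carry more uncertainty than the mean. This is a sketch only, not the HOM or SPLIDHOM algorithm; both of those additionally use a highly correlated reference series.

        import numpy as np

        def quantile_adjust(before, after, probs=np.linspace(0.05, 0.95, 19)):
            """Adjust the pre-break segment so its quantiles match the
            post-break segment. Extreme quantiles rest on few
            observations, which is one source of the larger uncertainty
            at the margins of the distribution. Assumes the estimated
            quantiles are strictly increasing."""
            qb = np.quantile(before, probs)
            qa = np.quantile(after, probs)
            adjustment = np.interp(before, qb, qa - qb)
            return before + adjustment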

  10. The Quality Control Algorithms Used in the Process of Creating the NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    The methodology and the results of the quality control (QC) process of the meteorological data from the Lightning Protection System (LPS) towers located at Kennedy Space Center (KSC) launch complex 39B (LC-39B) are documented in this paper. Meteorological data are used to design a launch vehicle, determine operational constraints, and to apply defined constraints on day-of-launch (DOL). In order to properly accomplish these tasks, a representative climatological database of meteorological records is needed because the database needs to represent the climate the vehicle will encounter. Numerous meteorological measurement towers exist at KSC; however, the engineering tasks need measurements at specific heights, some of which can only be provided by a few towers. Other than the LPS towers, Tower 313 is the only tower that provides observations up to 150 m. This tower is located approximately 3.5 km from LC-39B. In addition, data need to be QC'ed to remove erroneous reports that could pollute the results of an engineering analysis, mislead the development of operational constraints, or provide a false image of the atmosphere at the tower's location.
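
    The report's specific algorithms are not reproduced in the abstract; below is a generic sketch of the kinds of checks a tower-data QC pass applies to remove erroneous reports, with all thresholds hypothetical.

        import numpy as np

        def qc_flags(values, lo, hi, max_step, window=12):
            """Three common meteorological QC tests: climatological range,
            step (spike) between consecutive reports, and persistence
            (a stuck sensor repeats values, giving near-zero variance
            over a window). Returns True where a report is suspect."""
            v = np.asarray(values, dtype=float)
            range_bad = (v < lo) | (v > hi)
            step_bad = np.zeros_like(range_bad)
            step_bad[1:] = np.abs(np.diff(v)) > max_step
            persist_bad = np.zeros_like(range_bad)
            for i in range(window, len(v)):
                if np.std(v[i - window:i]) < 1e-6:
                    persist_bad[i] = True
            return range_bad | step_bad | persist_bad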

  11. Homogenization patterns of the world’s freshwater fish faunas

    PubMed Central

    Villéger, Sébastien; Blanchet, Simon; Beauchard, Olivier; Oberdorff, Thierry; Brosse, Sébastien

    2011-01-01

    The world is currently undergoing an unprecedented decline in biodiversity, which is mainly attributable to human activities. For instance, nonnative species introduction, combined with the extirpation of native species, affects biodiversity patterns, notably by increasing the similarity among species assemblages. This biodiversity change, called taxonomic homogenization, has rarely been assessed at the world scale. Here, we fill this gap by assessing the current homogenization status of one of the most diverse vertebrate groups (i.e., freshwater fishes) at global and regional scales. We demonstrate that current homogenization of the freshwater fish faunas is still low at the world scale (0.5%) but reaches substantial levels (up to 10%) in some highly invaded river basins from the Nearctic and Palearctic realms. In these realms experiencing high changes, nonnative species introductions rather than native species extirpations drive taxonomic homogenization. Our results suggest that the “Homogocene era” is not yet the case for freshwater fish fauna at the worldwide scale. However, the distressingly high level of homogenization noted for some biogeographical realms stresses the need for further understanding of the ecological consequences of homogenization processes. PMID:22025692
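
    The homogenization percentages above are changes in mean pairwise similarity among assemblages. With Jaccard similarity as an illustrative metric (the study's own index may differ), the computation reduces to the sketch below; the species sets are invented toy data.

        from itertools import combinations

        def jaccard(a, b):
            """Jaccard similarity of two species sets."""
            return len(a & b) / len(a | b)

        def mean_pairwise_similarity(faunas):
            """Mean Jaccard similarity over all pairs of basin faunas."""
            pairs = list(combinations(faunas, 2))
            return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

        historical = [{"a", "b"}, {"c", "d"}, {"b", "e"}]
        current = [{"a", "b", "x"}, {"c", "d", "x"}, {"b", "e", "x"}]  # shared invader
        # Positive difference = homogenization (here about +0.19).
        print(mean_pairwise_similarity(current) - mean_pairwise_similarity(historical))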

  12. Polyurethane phantoms with homogeneous and nearly homogeneous optical properties

    NASA Astrophysics Data System (ADS)

    Keränen, Ville T.; Mäkynen, Anssi J.; Dayton, Amanda L.; Prahl, Scott A.

    2010-02-01

    Phantoms with controlled optical properties are often used for calibration and standardization. The phantoms are typically prepared by adding absorbers and scatterers to a clear host material. It is usually assumed that the scatterers and absorbers are uniformly dispersed within the medium. To explore the effects of this assumption, we prepared paired sets of polyurethane phantoms, each pair with identical masses of absorber (India ink) and scatterer (titanium dioxide). Polyurethane phantoms were made by mixing two polyurethane parts (a and b) together and letting them cure in a polypropylene container. The mixture was degassed before curing to ensure a sample without bubbles. The optical properties were controlled by mixing titanium dioxide or India ink into polyurethane part (a or b) before blending the parts together. By changing the mixing sequence, we could change the aggregation of the scattering and absorbing particles. Each set had one sample with homogeneously dispersed scatterers and absorbers, and a second sample with slightly aggregated scatterers or absorbers. We found that the measured transmittance could easily vary by a factor of twenty. The estimated optical properties (using the inverse adding-doubling method) indicate that when aggregation is present, the optical properties are no longer proportional to the concentrations of absorbers or scatterers.

  13. An observation of homogeneous and heterogeneous catalysis processes in the decomposition of H₂O₂ over MnO₂ and Mn(OH)₂

    SciTech Connect

    Jiang, S.P.; Ashton, W.R.; Tseung, A.C.C. )

    1991-09-01

    The kinetics of peroxide decomposition by manganese dioxide (MnO₂) and manganese hydroxide (Mn(OH)₂) have been studied in alkaline solutions. The activity for peroxide decomposition on Mn(OH)₂ was generally higher than on MnO₂, and the kinetics of the decomposition of H₂O₂ were first-order in the case of MnO₂ catalysts, but 1.3-order for Mn(OH)₂ catalysts. It is suggested that H₂O₂ is mainly homogeneously decomposed by Mn²⁺ ions (in the form of HMnO₂⁻ ions in concentrated alkaline solutions) dissolved in the solution in the case of Mn(OH)₂. Compared with the results reported for the decomposition of H₂O₂ in the presence of 1 ppm Co²⁺ ions, it is concluded that the kinetics of the homogeneous decomposition of H₂O₂ are directly influenced by the concentration of the active species in the solution.
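
    The reported first-order versus 1.3-order kinetics can be extracted from concentration-time data by a log-log fit of rate against concentration. A minimal sketch of that standard analysis follows; data names are hypothetical and this is not the authors' procedure.

        import numpy as np

        def reaction_order(t, conc):
            """Estimate order n and rate constant k for rate = k * C**n
            by regressing log(-dC/dt) on log(C); the slope is the
            apparent reaction order."""
            rate = -np.gradient(conc, t)
            ok = (rate > 0) & (conc > 0)
            n, log_k = np.polyfit(np.log(conc[ok]), np.log(rate[ok]), 1)
            return n, np.exp(log_k)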

  14. Rh(I)-catalyzed transformation of propargyl vinyl ethers into (E,Z)-dienals: stereoelectronic role of trans effect in a metal-mediated pericyclic process and a shift from homogeneous to heterogeneous catalysis during a one-pot reaction.

    PubMed

    Vidhani, Dinesh V; Krafft, Marie E; Alabugin, Igor V

    2014-01-01

    The combination of experiments and computations reveals unusual features of stereoselective Rh(I)-catalyzed transformation of propargyl vinyl ethers into (E,Z)-dienals. The first step, the conversion of propargyl vinyl ethers into allene aldehydes, proceeds under homogeneous conditions via a "cyclization-mediated" mechanism initiated by Rh(I) coordination at the alkyne. This path agrees well with the small experimental effects of substituents on the carbinol carbon. The key feature revealed by the computational study is the stereoelectronic effect of the ligand arrangement at the catalytic center. The rearrangement barriers significantly decrease due to the greater transfer of electron density from the catalytic metal center to the CO ligand oriented trans to the alkyne. This effect increases electrophilicity of the metal and lowers the calculated barriers by 9.0 kcal/mol. Subsequent evolution of the catalyst leads to the in situ formation of Rh(I) nanoclusters that catalyze stereoselective tautomerization. The intermediacy of heterogeneous catalysis by nanoclusters was confirmed by mercury poisoning, temperature-dependent sigmoidal kinetic curves, and dynamic light scattering. The combination of experiments and computations suggests that the initially formed allene-aldehyde product assists in the transformation of a homogeneous catalyst (or "a cocktail of catalysts") into nanoclusters, which in turn catalyze and control the stereochemistry of subsequent transformations. PMID:24304338

  15. Discovery of a Novel Immune Gene Signature with Profound Prognostic Value in Colorectal Cancer: A Model of Cooperativity Disorientation Created in the Process from Development to Cancer.

    PubMed

    An, Ning; Shi, Xiaoyu; Zhang, Yueming; Lv, Ning; Feng, Lin; Di, Xuebing; Han, Naijun; Wang, Guiqi; Cheng, Shujun; Zhang, Kaitai

    2015-01-01

    Immune response-related genes play a major role in colorectal carcinogenesis by mediating inflammation or immune-surveillance evasion. Although remarkable progress has been made to investigate the underlying mechanism, the understanding of the complicated carcinogenesis process was enormously hindered by large-scale tumor heterogeneity. Development and carcinogenesis share striking similarities in their cellular behavior and underlying molecular mechanisms. The association between embryonic development and carcinogenesis makes embryonic development a viable reference model for studying cancer thereby circumventing the potentially misleading complexity of tumor heterogeneity. Here we proposed that the immune genes, responsible for intra-immune cooperativity disorientation (defined in this study as disruption of developmental expression correlation patterns during carcinogenesis), probably contain untapped prognostic resource of colorectal cancer. In this study, we determined the mRNA expression profile of 137 human biopsy samples, including samples from different stages of human colonic development, colorectal precancerous progression and colorectal cancer samples, among which 60 were also used to generate miRNA expression profile. We originally established Spearman correlation transition model to quantify the cooperativity disorientation associated with the transition from normal to precancerous to cancer tissue, in conjunction with miRNA-mRNA regulatory network and machine learning algorithm to identify genes with prognostic value. Finally, a 12-gene signature was extracted, whose prognostic value was evaluated using Kaplan-Meier survival analysis in five independent datasets. Using the log-rank test, the 12-gene signature was closely related to overall survival in four datasets (GSE17536, n = 177, p = 0.0054; GSE17537, n = 55, p = 0.0039; GSE39582, n = 562, p = 0.13; GSE39084, n = 70, p = 0.11), and significantly associated with disease-free survival in four

  17. Comparative Analysis of a MOOC and a Residential Community Using Introductory College Physics: Documenting How Learning Environments Are Created, Lessons Learned in the Process, and Measurable Outcomes

    NASA Astrophysics Data System (ADS)

    Olsen, Jack Ryan

    Higher education institutions, such as the University of Colorado Boulder (CU-Boulder), have as a core mission to advance their students' academic performance. On the frontier of education technologies that hold promise for addressing our educational mission are Massively Open Online Courses (MOOCs), which are new enough to not be fully understood or well-researched. MOOCs, in theory, have vast potential for being cost-effective and for reaching diverse audiences across the world. This thesis examines the implementation of one MOOC, Physics 1 for Physical Science Majors, implemented in the inaugural round of institutionally sanctioned MOOCs in Fall 2013. While comparatively inexpensive relative to a brick-and-mortar course, and while it initially enrolled an audience of nearly 16,000 students, this MOOC was found to be time-consuming to implement, and only roughly 1.5% of those who enrolled completed the course---approximately one quarter of the completion rate of the standard brick-and-mortar course that the MOOC was designed around. An established education technology, residential communities, contrasts with MOOCs by being high-touch and highly humanized, but also expensive and locally based. The Andrews Hall Residential College (AHRC) on the CU campus fosters academic success and retention by engaging and networking students outside of the standard brick-and-mortar courses and enculturating students into an environment with vertical integration through the different classes: freshman, sophomore, junior, etc. The physics MOOC and the AHRC were studied to determine how the environments were made and what lessons were learned in the process. Also, student performance was compared for the physics MOOC, a subset of the AHRC students enrolled in a special physics course, and the standard CU Physics 1 brick-and-mortar course. All yielded similar learning gains for Physics 1 performance, for those who completed the courses. These environments are presented together to compare and contrast their

  18. Creating Special Events

    ERIC Educational Resources Information Center

    deLisle, Lee

    2009-01-01

    "Creating Special Events" is organized as a systematic approach to festivals and events for students who seek a career in event management. This book looks at the evolution and history of festivals and events and proceeds to the nuts and bolts of event management. The book presents event management as the means of planning, organizing, directing,…

  19. Creating Photo Illustrations.

    ERIC Educational Resources Information Center

    Wilson, Bradley

    2003-01-01

    Explains the uses of photo illustrations. Notes that the key to developing a successful photo illustration is collaborative planning. Outlines the following guidelines for photo illustrations: never set up a photograph to mimic reality; create only abstractions with photo illustrations; clearly label photo illustrations; and never play photo…

  20. Creating dedicated bioenergy crops

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Bioenergy is one of the current mechanisms of producing renewable energy to reduce our use of nonrenewable fossil fuels and to reduce carbon emissions into the atmosphere. Humans have been using bioenergy since we first learned to create and control fire - burning manure, peat, and wood to cook food...

  1. Create a Critter Collector.

    ERIC Educational Resources Information Center

    Hinchey, Elizabeth K.; Nestlerode, Janet A.

    2001-01-01

    Presents methods for creating appropriate ways of collecting live specimens to use for firsthand observation in the classroom. Suggests ecological questions for students to address using these devices. This project is ideal for schools that have access to piers or bridges on a coastal body of water. (NB)

  2. Creating a Market.

    ERIC Educational Resources Information Center

    Kazimirski, J.; And Others

    The second in a series of programmed books, "Creating a Market" is published by the International Labour Office as a manual for persons studying marketing. This manual was designed to meet the needs of the labor organization's technical cooperation programs and is primarily concerned with consumer goods industries. Using a fill-in-the-blanks and…

  3. Looking, Writing, Creating.

    ERIC Educational Resources Information Center

    Katzive, Bonnie

    1997-01-01

    Describes how a middle school language arts teacher makes analyzing and creating visual art a partner to reading and writing in her classroom. Describes a project on art and Vietnam which shows how background information can add to and influence interpretation. Describes a unit on Greek mythology and Greek vases which leads to a related visual…

  4. Creating Dialogue by Storytelling

    ERIC Educational Resources Information Center

    Passila, Anne; Oikarinen, Tuija; Kallio, Anne

    2013-01-01

    Purpose: The objective of this paper is to develop practice and theory from Augusto Boal's dialogue technique (Image Theatre) for organisational use. The paper aims to examine how the members in an organisation create dialogue together by using a dramaturgical storytelling framework where the dialogue emerges from storytelling facilitated by…

  5. Create Your State

    ERIC Educational Resources Information Center

    Dunham, Kris; Melvin, Samantha

    2011-01-01

    Students are often encouraged to work together with their classmates, sometimes with other classes, occasionally with kids at other schools, but rarely with kids across the country. In this article the authors describe the Create Your State project, a collaborative nationwide project inspired by the Texas Chair Project wherein the artist, Damien…

  6. Creating Pupils' Internet Magazine

    ERIC Educational Resources Information Center

    Bognar, Branko; Šimic, Vesna

    2014-01-01

    This article presents an action research study that aimed to improve pupils' literary creativity and enable them to use computers connected to the internet. The study was conducted in a small district village school in Croatia. Creating a pupils' internet magazine appeared to be an excellent way of achieving the educational aims of almost all…

  7. Creating an Interactive Globe.

    ERIC Educational Resources Information Center

    Martin, Kurt D.

    1989-01-01

    Describes a hands-on geography activity that is designed to teach longitude and latitude to fifth-grade students. Children create a scale model of the earth from a 300 gram weather balloon. This activity incorporates geography, mathematics, science, art, and homework. Provides information for obtaining materials. (KO)

  8. How Banks Create Money.

    ERIC Educational Resources Information Center

    Beale, Lyndi

    This teaching module explains how the U.S. banking system uses excess reserves to create money in the form of new deposits for borrowers. The module is part of a computer-animated series of four-to-five-minute modules illustrating standard concepts in high school economics. Although the module is designed to accompany the video program, it may be…

  9. Creating Quality Media Materials.

    ERIC Educational Resources Information Center

    Hortin, John A.; Bailey, Gerald D.

    1982-01-01

    Innovation, imagination, and student creativity are key ingredients in creating quality media materials for the small school. Student-produced media materials, slides without a camera, personalized slide programs and copy work, self-made task cards, self-made overhead transparencies, graphic materials, and utilization of the mass media are some of…

  10. Creating a Reference Toolbox.

    ERIC Educational Resources Information Center

    Scott, Jane

    1997-01-01

    To help students understand that references are tools used to locate specific information, one librarian has her third-grade students create their own reference toolboxes as she introduces dictionaries, atlases, encyclopedias, and thesauri. Presents a lesson plan to introduce print and nonprint thesauri to third and fourth graders and includes a…

  11. Creating a Classroom Makerspace

    ERIC Educational Resources Information Center

    Rivas, Luz

    2014-01-01

    What is a makerspace? Makerspaces are community-operated physical spaces where people (makers) create do-it-yourself projects together. These membership spaces serve as community labs where people learn together and collaborate on projects. Makerspaces often have tools and equipment like 3-D printers, laser cutters, and soldering irons.…

  12. Creating a Virtual Gymnasium

    ERIC Educational Resources Information Center

    Fiorentino, Leah H.; Castelli, Darla

    2005-01-01

    Physical educators struggle with the challenges of assessing student performance, providing feedback about motor skills, and creating opportunities for all students to engage in game-play on a daily basis. The integration of technology in the gymnasium can address some of these challenges by improving teacher efficiency and increasing student…

  13. Creating an Interactive PDF

    ERIC Educational Resources Information Center

    Branzburg, Jeffrey

    2008-01-01

    There are many ways to begin a PDF document using Adobe Acrobat. The easiest and most popular way is to create the document in another application (such as Microsoft Word) and then use the Adobe Acrobat software to convert it to a PDF. In this article, the author describes how he used Acrobat's many tools in his project--an interactive…

  14. Creating Quality Schools.

    ERIC Educational Resources Information Center

    American Association of School Administrators, Arlington, VA.

    This booklet presents information on how total quality management can be applied to school systems to create educational improvement. Total quality management offers education a systemic approach and a new set of assessment tools. Chapter 1 provides a definition and historical overview of total quality management. Chapter 2 views the school…

  15. Cryogenic Homogenization and Sampling of Heterogeneous Multi-Phase Feedstock

    SciTech Connect

    Doyle, Glenn M.; Ideker, Virgene D.; Siegwarth, James D.

    1999-09-21

    An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock; reducing the temperature of the feedstock to below a critical temperature; reducing the size of the feedstock components; blending the reduced-size feedstock to form a homogeneous mixture; and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (−196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.
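
    The claimed process reduces to a guarded sequence: chill, grind, blend, and sample, with every comminution step gated on staying below the critical temperature. A minimal Python sketch of that control flow, using a simple simulated feedstock (all names and the simulation itself are illustrative, not from the patent):

        from dataclasses import dataclass

        CRITICAL_TEMP_K = 77.0  # preferred upper bound cited in the record

        @dataclass
        class Feedstock:
            temperature_k: float
            piece_size_mm: float
            blended: bool = False

        def cool(f: Feedstock, target_k: float = CRITICAL_TEMP_K - 5.0) -> None:
            f.temperature_k = min(f.temperature_k, target_k)

        def reduce_size(f: Feedstock, target_mm: float = 1.0) -> None:
            # grinding below the critical temperature keeps organics bound to solids
            assert f.temperature_k < CRITICAL_TEMP_K
            f.piece_size_mm = target_mm

        def blend(f: Feedstock) -> None:
            assert f.temperature_k < CRITICAL_TEMP_K
            f.blended = True

        def prepare_sample(f: Feedstock) -> Feedstock:
            cool(f)
            reduce_size(f)
            blend(f)
            return f  # representative sample; keep below CRITICAL_TEMP_K until analysis

        sample = prepare_sample(Feedstock(temperature_k=293.0, piece_size_mm=50.0))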

  16. Cryogenic homogenization and sampling of heterogeneous multi-phase feedstock

    DOEpatents

    Doyle, Glenn Michael; Ideker, Virgene Linda; Siegwarth, James David

    2002-01-01

    An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock; reducing the temperature of the feedstock to below a critical temperature; reducing the size of the feedstock components; blending the reduced-size feedstock to form a homogeneous mixture; and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (−196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.

  17. Theoretical studies of homogeneous catalysts mimicking nitrogenase.

    PubMed

    Sgrignani, Jacopo; Franco, Duvan; Magistrato, Alessandra

    2011-01-01

    The conversion of molecular nitrogen to ammonia is a key biological and chemical process and represents one of the most challenging topics in chemistry and biology. In Nature, the Mo-containing nitrogenase enzymes perform nitrogen 'fixation' via an iron-molybdenum cofactor (FeMo-co) under ambient conditions. In contrast, industrially, the Haber-Bosch process reduces molecular nitrogen and hydrogen to ammonia with a heterogeneous iron catalyst under drastic conditions of temperature and pressure. This process accounts for the production of millions of tons of nitrogen compounds used for agricultural and industrial purposes, but the high temperature and pressure required result in a large energy loss, leading to several economic and environmental issues. During the last 40 years many attempts have been made to synthesize simple homogeneous catalysts that can activate dinitrogen under the same mild conditions as the nitrogenase enzymes. Several compounds, almost all containing transition metals, have been shown to bind and activate N₂ to various degrees. However, to date Mo(N₂)(HIPTN)₃N, where (HIPTN)₃N = hexaisopropyl-terphenyl-triamidoamine, is the only compound performing this process catalytically. In this review we describe how Density Functional Theory calculations have helped to elucidate the reaction mechanisms of the inorganic compounds that activate or fix N₂. These studies provided important insights that rationalize and complement the experimental findings about the reaction mechanisms of known catalysts, predicting the reactivity of new potential catalysts and helping to tailor new efficient catalytic compounds. PMID:21221062

  18. Iterative and variational homogenization methods for filled elastomers

    NASA Astrophysics Data System (ADS)

    Goudarzi, Taha

    Elastomeric composites have increasingly proved invaluable in commercial technological applications due to their unique mechanical properties, especially their ability to undergo large reversible deformation in response to a variety of stimuli (e.g., mechanical forces, electric and magnetic fields, changes in temperature). Modern advances in organic materials science have revealed that elastomeric composites also hold tremendous potential to enable new high-end technologies, especially as the next generation of sensors and actuators, featured by their low cost together with their biocompatibility and processability into arbitrary shapes. This potential calls for an in-depth investigation of the macroscopic mechanical/physical behavior of elastomeric composites directly in terms of their microscopic behavior, with the objective of creating the knowledge base needed to guide their bottom-up design. The purpose of this thesis is to generate a mathematical framework to describe, explain, and predict the macroscopic nonlinear elastic behavior of filled elastomers, arguably the most prominent class of elastomeric composites, directly in terms of the behavior of their constituents (the elastomeric matrix and the filler particles) and their microstructure (the content, size, shape, and spatial distribution of the filler particles). This is accomplished via a combination of novel iterative and variational homogenization techniques capable of accounting for interphasial phenomena and finite deformations. Exact and approximate analytical solutions for the fundamental nonlinear elastic response of dilute suspensions of rigid spherical particles (either firmly bonded or bonded through finite-size interphases) in Gaussian rubber are first generated. These results are in turn utilized to construct approximate solutions for the nonlinear elastic response of non-Gaussian elastomers filled with a random distribution of rigid particles (again, either firmly…
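
    The classical small-strain counterpart of the dilute-suspension results described above is Smallwood's amplification factor for rigid spheres firmly bonded to an incompressible matrix, μ_eff = μ(1 + 2.5c). A quick sketch of that textbook estimate (it is not the finite-deformation solution derived in the thesis):

        def smallwood_mu_eff(mu_matrix: float, c: float) -> float:
            """Effective shear modulus of a dilute suspension of rigid spheres."""
            if not 0.0 <= c < 0.1:
                raise ValueError("the dilute estimate is only meaningful for small c")
            return mu_matrix * (1.0 + 2.5 * c)

        # 5% filler stiffens a 1 MPa rubber to about 1.125 MPa
        print(smallwood_mu_eff(1.0e6, 0.05))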

  19. Exploring earthquake databases for the creation of magnitude-homogeneous catalogues: tools for application on a regional and global scale

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Pagani, M.; Garcia, J.

    2016-09-01

    The creation of a magnitude-homogenized catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenizing multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins, and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Centre (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilize this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonize magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region, where the spatial coverage and magnitude scales of different local recording networks are compared and their relation to global magnitude scales explored. In the second application, the tools are used on a global scale to create an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonized into moment magnitude to form a catalogue of more than 562,840 events. This extended catalogue, while not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
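
    As a toy illustration of the conversion step (not the API of the released tools), the sketch below fits a linear empirical model between a network's native magnitude and the target moment magnitude using events common to both catalogues, then applies it to harmonize new records; the synthetic data and coefficients are assumptions:

        import numpy as np

        rng = np.random.default_rng(0)
        mb_common = rng.uniform(4.0, 6.5, 200)  # stand-in for matched common events
        mw_common = 0.85 * mb_common + 1.03 + rng.normal(0.0, 0.15, 200)

        # least-squares fit; real homogenization work often prefers orthogonal
        # regression, since both magnitude estimates carry uncertainty
        slope, intercept = np.polyfit(mb_common, mw_common, 1)

        def to_mw(mb):
            """Convert native-scale magnitudes to the harmonized Mw scale."""
            return slope * np.asarray(mb) + intercept

        print(to_mw([4.5, 5.8]))  # harmonized magnitudes for two new events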

  20. Turbulence in homogeneous shear flows

    NASA Astrophysics Data System (ADS)

    Pumir, Alain

    1996-11-01

    Homogeneous shear flows with an imposed mean velocity U = Sy x̂ are studied in a periodic box of size L_x×L_y×L_z, in the statistically stationary turbulent state. In contrast with unbounded shear flows, the finite size of the system constrains the large-scale dynamics. The Reynolds number, defined by Re ≡ SL_y²/ν, varies in the range 2600 ⩽ Re ⩽ 11300. The total kinetic energy and enstrophy in the volume of numerical integration have large peaks, resulting in fluctuations of kinetic energy of order 30%-50%. The mechanism leading to these fluctuations is very reminiscent of the "streaks" responsible for the violent bursts observed in turbulent boundary layers. The large-scale anisotropy of the flow, characterized by the two-point correlation tensor, depends on the aspect ratio of the system. The probability distribution functions (PDF) of the components of the velocity are found to be close to Gaussian. The physics of the Reynolds stress tensor, ⟨uv⟩, is very similar to what is found experimentally in wall-bounded shear flows. The study of the two-point correlation tensor of the vorticity, ⟨ω_iω_j⟩, suggests that the small scales become isotropic when the Reynolds number increases, as observed in high-Reynolds-number turbulent boundary layers. However, the skewness of the z component of vorticity is independent of the Reynolds number in this range, suggesting that some small-scale anisotropy remains even at very high Reynolds numbers. An analogy is drawn with the problem of turbulent mixing, where a similar anisotropy is observed.
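
    To make the quoted statistics concrete, the snippet below evaluates the Reynolds number definition Re = S·L_y²/ν and a standard skewness estimator on a synthetic vorticity sample; the numerical values are illustrative, not taken from the simulations:

        import numpy as np

        S, Ly, nu = 1.0, 2.0, 3.5e-4   # shear rate, box height, kinematic viscosity
        Re = S * Ly**2 / nu
        print(f"Re = {Re:.0f}")        # ~11400, near the top of the studied range

        def skewness(x: np.ndarray) -> float:
            """Third standardized moment, <x'^3> / <x'^2>^(3/2)."""
            xp = x - x.mean()
            return float((xp**3).mean() / (xp**2).mean() ** 1.5)

        omega_z = np.random.default_rng(1).gamma(2.0, 1.0, 100_000) - 2.0
        print(f"vorticity skewness = {skewness(omega_z):.2f}")  # ~1.41 for this sample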

  1. Homogeneous catalysts in hypersonic combustion

    SciTech Connect

    Harradine, D.M.; Lyman, J.L.; Oldenborg, R.C.; Pack, R.T.; Schott, G.L.

    1989-01-01

    Density and residence time both become unfavorably small for efficient combustion of hydrogen fuel in ramjet propulsion in air at high altitude and hypersonic speed. Raising the density and increasing the transit time of the air through the engine necessitates stronger contraction of the air flow area. This enhances the kinetic and thermodynamic tendency of H₂O to form completely, accompanied only by N₂ and any excess H₂ (or O₂). The by-products to be avoided are the energetically expensive fragment species H and/or O atoms and OH radicals, and residual (2H₂ + O₂). However, excessive area contraction raises air temperature and consequent combustion-product temperature by adiabatic compression. This counteracts and ultimately overwhelms the thermodynamic benefit by which higher density favors the triatomic product, H₂O, over its monatomic and diatomic alternatives. For static pressures in the neighborhood of 1 atm, static temperature must be kept or brought below ca. 2400 K for acceptable stability of H₂O. Another measure, whose requisite chemistry we address here, is to extract propulsive work from the combustion products early in the expansion. The objective is to lower the static temperature of the combustion stream enough for H₂O to become adequately stable before the exhaust flow is massively expanded and its composition "frozen." We proceed to address this mechanism and its kinetics, and then examine prospects for enhancing its rate by homogeneous catalysts. 9 refs.
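
    The ~2400 K stability threshold can be sanity-checked with the constant-ΔH/ΔS (van 't Hoff) approximation for H₂ + ½O₂ ⇌ H₂O; the thermochemical constants below are standard textbook values, not numbers from this report:

        import numpy as np

        R = 8.314       # J/(mol K)
        dH = -241.8e3   # J/mol, formation enthalpy of H2O(g)
        dS = -44.4      # J/(mol K), formation entropy change

        for T in (1800.0, 2400.0, 3000.0):
            dG = dH - T * dS            # Gibbs energy of formation at temperature T
            K = np.exp(-dG / (R * T))   # equilibrium constant (1 atm reference)
            print(f"T = {T:4.0f} K  ->  K = {K:9.2e}")

        # K drops by orders of magnitude as T rises: H2O is still strongly favored
        # near 2400 K, but dissociation grows rapidly at higher static temperatures.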

  2. Creating Geoscience Leaders

    NASA Astrophysics Data System (ADS)

    Buskop, J.; Buskop, W.

    2013-12-01

    The United Nations Educational, Scientific, and Cultural Organization recognizes 21 World Heritage Sites in the United States, ten of which have astounding geological features: Wrangell-St. Elias National Park, Olympic National Park, Mesa Verde National Park, Chaco Canyon, Glacier National Park, Carlsbad Caverns National Park, Mammoth Cave, Great Smoky Mountains National Park, Hawaii Volcanoes National Park, and Everglades National Park. Frustrated by fellow students who were addicted to smartphones and showed little interest in the geosciences, one student visited each World Heritage Site in the United States and created one e-book chapter per park. Each chapter was created with original photographs and a geological discovery hunt to encourage teen involvement in preserving remarkable geological sites. Each chapter also describes at least one way young adults can get involved with the geosciences, such as cave geology, glaciology, hydrology, and volcanology. The e-book covers one park per chapter, each chapter providing a geological discovery hunt, information on how to get involved with conservation of the parks, geological maps of the parks, parallels between archaeological and geological sites, and how to talk to a ranger. The young author is approaching UNESCO to publish the work as a free e-book, to encourage involvement in UNESCO sites and to prove that the geosciences are fun.

  3. Design and testing of a refractive laser beam homogenizer

    NASA Astrophysics Data System (ADS)

    Fernelius, N. C.; Bradley, K. R.; Hoekstra, B. L.

    1984-09-01

    A survey is made of various techniques to create a homogeneous, or flat-top, laser beam profile. A refractive homogenizer was designed for use with a Nd:YAG laser operating at its fundamental (1.06 μm) and frequency-doubled (532 nm) wavelengths. The system consists of a 2X beam expander and two faceted cylindrical lenses with differing focal lengths. Each cylindrical lens focuses its input into a strip the width of a facet. By orienting the lens axes at 90 degrees to each other and focusing both on the same plane, the beam is concentrated into a square focus. Formulae for calculating the facet angles are derived, and a FORTRAN computer program was written to calculate them to a precision finer than the facets can be fabricated.
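
    The facet-angle geometry can be sketched with thin-prism optics: a facet centered at height x_i must steer its strip through angle arctan(x_i/f) toward a common focus, and a thin wedge of index n deviates light by roughly (n−1)α. The values below (index, focal distance, facet positions) are assumptions for illustration, not the report's derived formulae:

        import numpy as np

        n = 1.45        # assumed refractive index of the lens material
        f = 200.0       # mm, assumed common focal distance
        facet_x = np.array([-15.0, -5.0, 5.0, 15.0])   # mm, facet centers

        deviation = np.arctan(facet_x / f)        # steering angle required per facet
        wedge_deg = np.degrees(deviation / (n - 1.0))

        for x, a in zip(facet_x, wedge_deg):
            print(f"facet at {x:+5.1f} mm -> wedge angle {a:+6.3f} deg")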

  4. Homogenization of Heterogeneous Elastic Materials with Applications to Seismic Anisotropy

    NASA Astrophysics Data System (ADS)

    Vel, S. S.; Johnson, S. E.; Okaya, D. A.; Cook, A. C.

    2014-12-01

    The velocities of seismic waves passing through a complex Earth volume can be influenced by heterogeneities at length scales shorter than the seismic wavelength. As such, seismic wave propagation analyses can be performed by replacing the actual Earth volume with a homogeneous, i.e., "effective," elastic medium. Homogenization refers to the process by which the elastic stiffness tensor of the effective medium is "averaged" from the elastic properties, orientations, modal proportions, and spatial distributions of the finer heterogeneities. When computing the homogenized properties of a heterogeneous material, the goal is to compute an effective or bulk elastic stiffness tensor that relates the average stresses to the average strains in the material. Tensor averaging schemes such as the Voigt and Reuss methods are based on certain simplifying assumptions: the Voigt method assumes spatially uniform strains, while the Reuss method assumes spatially uniform stresses within the heterogeneous material. Although both are physically unrealistic, they provide upper and lower bounds for the actual homogenized elastic stiffness tensor. In order to determine the homogenized stiffness tensor more precisely, the stress and strain distributions must be computed by solving the three-dimensional equations of elasticity over the heterogeneous region. Asymptotic expansion homogenization (AEH) is one such structure-based approach for the comprehensive micromechanical analysis of heterogeneous materials. Unlike modal volume methods, the AEH method takes into account how geometrical orientation and alignment can increase elastic stiffness in certain directions. We use the AEH method in conjunction with finite element analysis to calculate the bulk elastic stiffnesses of heterogeneous materials. In our presentation, wave speeds computed using the AEH method are compared with those generated using stiffness tensors derived from commonly used analytical estimates. The method is illustrated…
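
    The Voigt and Reuss bounds mentioned above are simple volume-fraction averages of the stiffness and compliance tensors, respectively. A minimal numpy sketch in 6x6 Voigt notation, using two assumed isotropic phases as a stand-in for a heterogeneous Earth volume (the AEH result would lie between the two bounds):

        import numpy as np

        def iso_stiffness(E: float, nu: float) -> np.ndarray:
            """6x6 Voigt-notation stiffness of an isotropic phase."""
            lam = E * nu / ((1 + nu) * (1 - 2 * nu))
            mu = E / (2 * (1 + nu))
            C = np.zeros((6, 6))
            C[:3, :3] = lam
            C[np.diag_indices(3)] = lam + 2 * mu
            C[3:, 3:] = mu * np.eye(3)
            return C

        # (stiffness, volume fraction) for two assumed phases
        phases = [(iso_stiffness(70e9, 0.25), 0.6),
                  (iso_stiffness(30e9, 0.30), 0.4)]

        C_voigt = sum(f * C for C, f in phases)  # uniform-strain average
        C_reuss = np.linalg.inv(sum(f * np.linalg.inv(C) for C, f in phases))  # uniform stress
        print(C_voigt[0, 0] / 1e9, C_reuss[0, 0] / 1e9)  # GPa; Voigt bound >= Reuss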

  5. Effect of heat and homogenization on in vitro digestion of milk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Central to commercial fluid milk processing is the use of high-temperature, short-time (HTST) pasteurization to ensure the safety and quality of milk, and homogenization to prevent creaming of fat-containing milk. UHT-processed homogenized milk is also available commercially and is typically used to...

  6. Cell-Laden Poly(ɛ-caprolactone)/Alginate Hybrid Scaffolds Fabricated by an Aerosol Cross-Linking Process for Obtaining Homogeneous Cell Distribution: Fabrication, Seeding Efficiency, and Cell Proliferation and Distribution

    PubMed Central

    Lee, HyeongJin; Ahn, SeungHyun; Bonassar, Lawrence J.; Chun, Wook

    2013-01-01

    Generally, solid-freeform fabricated scaffolds show a controllable pore structure (pore size, porosity, pore connectivity, and permeability) and mechanical properties by using computer-aided techniques. Although the scaffolds can provide repeated and appropriate pore structures for tissue regeneration, they have a low biological activity, such as low cell-seeding efficiency and nonuniform cell density in the scaffold interior after a long culture period, due to a large pore size and completely open pores. Here we fabricated three different poly(ɛ-caprolactone) (PCL)/alginate scaffolds: (1) a rapid prototyped porous PCL scaffold coated with an alginate, (2) the same PCL scaffold coated with a mixture of alginate and cells, and (3) a multidispensed hybrid PCL/alginate scaffold embedded with cell-laden alginate struts. The three scaffolds had similar micropore structures (pore size=430–580 μm, porosity=62%–68%, square pore shape). Preosteoblast cells (MC3T3-E1) were used at the same cell density in each scaffold. By measuring cell-seeding efficiency, cell viability, and cell distribution after various periods of culturing, we sought to determine which scaffold was more appropriate for homogeneously regenerated tissues. PMID:23469894

  7. Creating new growth platforms.

    PubMed

    Laurie, Donald L; Doz, Yves L; Sheer, Claude P

    2006-05-01

    Sooner or later, most companies can't attain the growth rates expected by their boards and CEOs and demanded by investors. To some extent, such businesses are victims of their own successes. Many were able to sustain high growth rates for a long time because they were in high-growth industries. But once those industries slowed down, the businesses could no longer deliver the performance that investors had come to take for granted. Often, companies have resorted to acquisition, though this strategy has a discouraging track record. Over time, 65% of acquisitions destroy more value than they create. So where does real growth come from? For the past 12 years, the authors have been researching and advising companies on this issue. With the support of researchers at Harvard Business School and Insead, they instituted a project titled "The CEO Agenda and Growth". They identified and approached 24 companies that had achieved significant organic growth and interviewed their CEOs, chief strategists, heads of R&D, CFOs, and top-line managers. They asked, "Where does your growth come from?" and found a consistent pattern in the answers. All the businesses grew by creating new growth platforms (NGPs) on which they could build families of products and services and extend their capabilities into multiple new domains. Identifying NGP opportunities calls for executives to challenge conventional wisdom. In all the companies studied, top management believed that NGP innovation differed significantly from traditional product or service innovation. They had independent, senior-level units with a standing responsibility to create NGPs, and their CEOs spent as much as 50% of their time working with these units. The payoff has been spectacular and lasting. For example, from 1985 to 2004, the medical devices company Medtronic grew revenues at 18% per year, earnings at 20%, and market capitalization at 30%. PMID:16649700

  8. Creating healthy camp experiences.

    PubMed

    Walton, Edward A; Tothy, Alison S

    2011-04-01

    The American Academy of Pediatrics has created recommendations for health appraisal and preparation of young people before participation in day or resident camps and to guide health and safety practices for children at camp. These recommendations are intended for parents, primary health care providers, and camp administration and health center staff. Although camps have diverse environments, there are general guidelines that apply to all situations and specific recommendations that are appropriate under special conditions. This policy statement has been reviewed and is supported by the American Camp Association. PMID:21444589

  9. Creating corporate advantage.

    PubMed

    Collis, D J; Montgomery, C A

    1998-01-01

    What differentiates truly great corporate strategies from the merely adequate? How can executives at the corporate level create tangible advantage for their businesses that makes the whole more than the sum of the parts? This article presents a comprehensive framework for value creation in the multibusiness company. It addresses the most fundamental questions of corporate strategy: What businesses should a company be in? How should it coordinate activities across businesses? What role should the corporate office play? How should the corporation measure and control performance? Through detailed case studies of Tyco International, Sharp, the Newell Company, and Saatchi and Saatchi, the authors demonstrate that the answers to all those questions are driven largely by the nature of a company's special resources--its assets, skills, and capabilities. These range along a continuum from the highly specialized at one end to the very general at the other. A corporation's location on the continuum constrains the set of businesses it should compete in and limits its choices about the design of its organization. Applying the framework, the authors point out the common mistakes that result from misaligned corporate strategies. Companies mistakenly enter businesses based on similarities in products rather than the resources that contribute to competitive advantage in each business. Instead of tailoring organizational structures and systems to the needs of a particular strategy, they create plain-vanilla corporate offices and infrastructures. The company examples demonstrate that one size does not fit all. One can find great corporate strategies all along the continuum. PMID:10179655

  10. Creating sustainable performance.

    PubMed

    Spreitzer, Gretchen; Porath, Christine

    2012-01-01

    What makes for sustainable individual and organizational performance? Employees who are thriving: not just satisfied and productive but also engaged in creating the future. The authors found that people who fit this description demonstrated 16% better overall performance, 125% less burnout, 32% more commitment to the organization, and 46% more job satisfaction than their peers. Thriving has two components: vitality, or the sense of being alive and excited, and learning, or the growth that comes from gaining knowledge and skills. Some people naturally build vitality and learning into their jobs, but most employees are influenced by their environment. Four mechanisms, none of which requires heroic effort or major resources, create the conditions for thriving: providing decision-making discretion, sharing information about the organization and its strategy, minimizing incivility, and offering performance feedback. Organizations such as Alaska Airlines, Zingerman's, Quicken Loans, and Caiman Consulting have found that helping people grow and remain energized at work is valuable on its own merits, but it can also boost performance in a sustainable way. PMID:22299508