Sample records for highly variable process

  1. The componential processing of fractions in adults and children: effects of stimuli variability and contextual interference

    PubMed Central

    Gabriel, Florence C.; Szücs, Dénes

    2014-01-01

    Recent studies have indicated that people have a strong tendency to compare fractions based on constituent numerators or denominators. This is called componential processing. This study explored whether componential processing was preferred in tasks involving high stimuli variability and high contextual interference, when fractions could be compared based either on the holistic values of fractions or on their denominators. Here, stimuli variability referred to the fact that fractions were not monotonous but diversiform. Contextual interference referred to the fact that the processing of fractions was interfered with by other stimuli. To these ends, three tasks were used. In Task 1, participants compared a standard fraction 1/5 to unit fractions. This task was used as a low stimuli variability and low contextual interference task. In Task 2, stimuli variability was increased by mixing unit and non-unit fractions. In Task 3, high contextual interference was created by incorporating decimals into fractions. The RT results showed that the processing patterns of fractions were very similar for adults and children. In Tasks 1 and 3, only componential processing was utilized. In contrast, both holistic processing and componential processing were utilized in Task 2. These results suggest that, if individuals are presented with the opportunity to perform componential processing, both adults and children will tend to do so, even if they are faced with high variability of fractions or high contextual interference. PMID:25249995

  2. The componential processing of fractions in adults and children: effects of stimuli variability and contextual interference.

    PubMed

    Zhang, Li; Fang, Qiaochu; Gabriel, Florence C; Szücs, Dénes

    2014-01-01

    Recent studies have indicated that people have a strong tendency to compare fractions based on constituent numerators or denominators. This is called componential processing. This study explored whether componential processing was preferred in tasks involving high stimuli variability and high contextual interference, when fractions could be compared based either on the holistic values of fractions or on their denominators. Here, stimuli variability referred to the fact that fractions were not monotonous but diversiform. Contextual interference referred to the fact that the processing of fractions was interfered with by other stimuli. To these ends, three tasks were used. In Task 1, participants compared a standard fraction 1/5 to unit fractions. This task was used as a low stimuli variability and low contextual interference task. In Task 2, stimuli variability was increased by mixing unit and non-unit fractions. In Task 3, high contextual interference was created by incorporating decimals into fractions. The RT results showed that the processing patterns of fractions were very similar for adults and children. In Tasks 1 and 3, only componential processing was utilized. In contrast, both holistic processing and componential processing were utilized in Task 2. These results suggest that, if individuals are presented with the opportunity to perform componential processing, both adults and children will tend to do so, even if they are faced with high variability of fractions or high contextual interference.

  3. Spatial and temporal variability of rainfall and their effects on hydrological response in urban areas - a review

    NASA Astrophysics Data System (ADS)

    Cristiano, Elena; ten Veldhuis, Marie-claire; van de Giesen, Nick

    2017-07-01

    In urban areas, hydrological processes are characterized by high variability in space and time, making them sensitive to small-scale temporal and spatial rainfall variability. In the last decades new instruments, techniques, and methods have been developed to capture rainfall and hydrological processes at high resolution. Weather radars have been introduced to estimate high spatial and temporal rainfall variability. At the same time, new models have been proposed to reproduce hydrological response, based on small-scale representation of urban catchment spatial variability. Despite these efforts, interactions between rainfall variability, catchment heterogeneity, and hydrological response remain poorly understood. This paper presents a review of our current understanding of hydrological processes in urban environments as reported in the literature, focusing on their spatial and temporal variability aspects. We review recent findings on the effects of rainfall variability on hydrological response and identify gaps where knowledge needs to be further developed to improve our understanding of and capability to predict urban hydrological response.

  4. An investigation of the influence of process and formulation variables on mechanical properties of high shear granules using design of experiment.

    PubMed

    Mangwandi, Chirangano; Adams, Michael J; Hounslow, Michael J; Salman, Agba D

    2012-05-10

    Being able to predict the properties of granules from knowledge of the process and formulation variables is what most industries are striving for. This research uses experimental design to investigate the effect of process variables and formulation variables on the mechanical properties of pharmaceutical granules manufactured from a classical blend of lactose and starch using hydroxypropyl cellulose (HPC) as the binder. The process parameters investigated were granulation time and impeller speed, whilst the formulation variables were starch-to-lactose ratio and HPC concentration. The granule properties investigated include granule packing coefficient and granule strength. The effect of some formulation components on mechanical properties also depends on the process variables used in the granulation process. This implies that subjecting the same formulation to different process conditions results in products with different properties. Copyright © 2012 Elsevier B.V. All rights reserved.

  5. A Systematic Process for Developing High Quality SaaS Cloud Services

    NASA Astrophysics Data System (ADS)

    La, Hyun Jung; Kim, Soo Dong

    Software-as-a-Service (SaaS) is a type of cloud service which provides software functionality through the Internet. Its benefits are well received in academia and industry. To fully utilize these benefits, there should be effective methodologies to support the development of SaaS services which provide high reusability and applicability. Conventional approaches such as object-oriented methods do not effectively support SaaS-specific engineering activities such as modeling common features, variability, and designing quality services. In this paper, we present a systematic process for developing high quality SaaS and highlight the essentiality of commonality and variability (C&V) modeling to maximize reusability. We first define criteria for designing the process model and provide a theoretical foundation for SaaS: its meta-model and C&V model. We clarify the notion of commonality and variability in SaaS, and propose a SaaS development process accompanied by engineering instructions. Using the proposed process, SaaS services with high quality can be effectively developed.

  6. Including long-range dependence in integrate-and-fire models of the high interspike-interval variability of cortical neurons.

    PubMed

    Jackson, B Scott

    2004-10-01

    Many different types of integrate-and-fire models have been designed in order to explain how it is possible for a cortical neuron to integrate over many independent inputs while still producing highly variable spike trains. Within this context, the variability of spike trains has been almost exclusively measured using the coefficient of variation of interspike intervals. However, another important statistical property that has been found in cortical spike trains and is closely associated with their high firing variability is long-range dependence. We investigate the conditions, if any, under which such models produce output spike trains with both interspike-interval variability and long-range dependence similar to those that have previously been measured from actual cortical neurons. We first show analytically that a large class of high-variability integrate-and-fire models is incapable of producing such outputs based on the fact that their output spike trains are always mathematically equivalent to renewal processes. This class of models subsumes a majority of previously published models, including those that use excitation-inhibition balance, correlated inputs, partial reset, or nonlinear leakage to produce outputs with high variability. Next, we study integrate-and-fire models that have (non-Poissonian) renewal point process inputs instead of the Poisson point process inputs used in the preceding class of models. The confluence of our analytical and simulation results implies that the renewal-input model is capable of producing high variability and long-range dependence comparable to that seen in spike trains recorded from cortical neurons, but only if the interspike intervals of the inputs have infinite variance, a physiologically unrealistic condition. Finally, we suggest a new integrate-and-fire model that does not suffer any of the previously mentioned shortcomings. By analyzing simulation results for this model, we show that it is capable of producing output spike trains with interspike-interval variability and long-range dependence that match empirical data from cortical spike trains. This model is similar to the other models in this study, except that its inputs are fractional-Gaussian-noise-driven Poisson processes rather than renewal point processes. In addition to this model's success in producing realistic output spike trains, its inputs have long-range dependence similar to that found in most subcortical neurons in sensory pathways, including the inputs to cortex. Analysis of output spike trains from simulations of this model also shows that a tight balance between the amounts of excitation and inhibition at the inputs to cortical neurons is not necessary for high interspike-interval variability at their outputs. Furthermore, in our analysis of this model, we show that the superposition of many fractional-Gaussian-noise-driven Poisson processes does not approximate a Poisson process, which challenges the common assumption that the total effect of a large number of inputs on a neuron is well represented by a Poisson process.
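
    The renewal-process argument above is easier to follow with a toy example. The following is a minimal sketch, not any of the models analyzed in the paper: a leaky integrate-and-fire neuron driven by pooled excitatory and inhibitory Poisson inputs, used only to show how the interspike-interval coefficient of variation (CV) is computed; all rates, weights, and time constants are arbitrary illustrative values.

    ```python
    # Minimal illustrative LIF simulation (not the paper's models): pooled Poisson
    # excitation/inhibition, threshold-and-reset spiking, and the ISI CV statistic.
    import numpy as np

    rng = np.random.default_rng(0)
    dt, T = 1e-4, 20.0                   # time step (s), simulated duration (s)
    tau, v_th, v_reset = 0.02, 1.0, 0.0  # membrane time constant (s), threshold, reset
    w = 0.01                             # membrane jump per input event (arbitrary units)
    n_steps = int(T / dt)

    # Pooled Poisson input counts per time step (illustrative rates).
    exc = rng.poisson(12000 * dt, n_steps)   # total excitatory events per step
    inh = rng.poisson(5000 * dt, n_steps)    # total inhibitory events per step

    v, spike_times = 0.0, []
    for i in range(n_steps):
        v += (-v / tau) * dt + w * (exc[i] - inh[i])   # leaky integration
        if v >= v_th:                                  # threshold crossing -> spike
            spike_times.append(i * dt)
            v = v_reset

    isi = np.diff(spike_times)
    cv = isi.std(ddof=1) / isi.mean()
    print(f"{len(spike_times)} spikes, mean ISI = {isi.mean()*1e3:.1f} ms, CV = {cv:.2f}")
    ```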

  7. Multivariate fault isolation of batch processes via variable selection in partial least squares discriminant analysis.

    PubMed

    Yan, Zhengbing; Kuang, Te-Hui; Yao, Yuan

    2017-09-01

    In recent years, multivariate statistical monitoring of batch processes has become a popular research topic, wherein multivariate fault isolation is an important step aiming at the identification of the faulty variables contributing most to the detected process abnormality. Although contribution plots have been commonly used in statistical fault isolation, such methods suffer from the smearing effect between correlated variables. In particular, in batch process monitoring, the high autocorrelations and cross-correlations that exist in variable trajectories make the smearing effect unavoidable. To address this problem, a variable selection-based fault isolation method is proposed in this research, which transforms the fault isolation problem into a variable selection problem in partial least squares discriminant analysis and solves it by calculating a sparse partial least squares model. Unlike traditional methods, the proposed method emphasizes the relative importance of each process variable. Such information may help process engineers in conducting root-cause diagnosis. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
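
    The paper's method is a sparse PLS discriminant model; as a rough, hedged illustration of the underlying idea only (discriminating normal from faulty batches and ranking process variables by discriminative weight), the sketch below uses ordinary PLS regression on a binary class label with synthetic data. The fault location and all names are invented.

    ```python
    # Illustrative only: rank candidate faulty variables by PLS-DA weight magnitude
    # on synthetic batch data (the paper itself uses a sparse PLS formulation).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    n_normal, n_faulty, n_vars = 40, 40, 10
    X_normal = rng.normal(size=(n_normal, n_vars))
    X_faulty = rng.normal(size=(n_faulty, n_vars))
    X_faulty[:, 3] += 2.0                      # hypothetical fault shifting variable 3

    X = np.vstack([X_normal, X_faulty])
    y = np.r_[np.zeros(n_normal), np.ones(n_faulty)]   # 0 = normal, 1 = faulty

    pls = PLSRegression(n_components=2).fit(X, y)
    importance = np.abs(pls.x_weights_[:, 0])          # first-component weight magnitude
    ranking = np.argsort(importance)[::-1]
    print("variables ranked by discriminative weight:", ranking)
    ```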

  8. High-efficiency cell concepts on low-cost silicon sheets

    NASA Technical Reports Server (NTRS)

    Bell, R. O.; Ravi, K. V.

    1985-01-01

    The limitations on sheet growth material in terms of defect structure and minority carrier lifetime are discussed. The effects of various defects on performance are estimated. Given these limitations, designs for a sheet growth cell that make the best use of the material characteristics are proposed. Achievement of optimum synergy between base material quality and device processing variables is proposed. A strong coupling exists between material quality, the variables during crystal growth, and device processing variables. Two objectives are outlined: (1) optimization of the coupling for maximum performance at minimal cost; and (2) decoupling of materials from processing by improvement in base material quality to make it less sensitive to processing variables.

  9. Potential interactions among linguistic, autonomic, and motor factors in speech.

    PubMed

    Kleinow, Jennifer; Smith, Anne

    2006-05-01

    Though anecdotal reports link certain speech disorders to increases in autonomic arousal, few studies have described the relationship between arousal and speech processes. Additionally, it is unclear how increases in arousal may interact with other cognitive-linguistic processes to affect speech motor control. In this experiment we examine potential interactions between autonomic arousal, linguistic processing, and speech motor coordination in adults and children. Autonomic responses (heart rate, finger pulse volume, tonic skin conductance, and phasic skin conductance) were recorded simultaneously with upper and lower lip movements during speech. The lip aperture variability (LA variability index) across multiple repetitions of sentences that varied in length and syntactic complexity was calculated under low- and high-arousal conditions. High arousal conditions were elicited by performance of the Stroop color word task. Children had significantly higher lip aperture variability index values across all speaking tasks, indicating more variable speech motor coordination. Increases in syntactic complexity and utterance length were associated with increases in speech motor coordination variability in both speaker groups. There was a significant effect of Stroop task, which produced increases in autonomic arousal and increased speech motor variability in both adults and children. These results provide novel evidence that high arousal levels can influence speech motor control in both adults and children. (c) 2006 Wiley Periodicals, Inc.

  10. Tungsten isotopes in bulk meteorites and their inclusions—Implications for processing of presolar components in the solar protoplanetary disk

    PubMed Central

    Holst, J. C.; Paton, C.; Wielandt, D.; Bizzarro, M.

    2016-01-01

    We present high precision, low- and high-resolution tungsten isotope measurements of iron meteorites Cape York (IIIAB), Rhine Villa (IIIE), Bendego (IC), and the IVB iron meteorites Tlacotepec, Skookum, and Weaver Mountains, as well as CI chondrite Ivuna, a CV3 chondrite refractory inclusion (CAI BE), and terrestrial standards. Our high precision tungsten isotope data show that the distribution of the rare p-process nuclide 180W is homogeneous among chondrites, iron meteorites, and the refractory inclusion. One exception to this pattern is the IVB iron meteorite group, which displays variable excesses relative to the terrestrial standard, possibly related to decay of rare 184Os. Such anomalies are not the result of analytical artifacts and cannot be caused by sampling of a protoplanetary disk characterized by p-process isotope heterogeneity. In contrast, we find that 183W is variable due to a nucleosynthetic s-process deficit/r-process excess among chondrites and iron meteorites. This variability supports the widespread nucleosynthetic s/r-process heterogeneity in the protoplanetary disk inferred from other isotope systems and we show that W and Ni isotope variability is correlated. Correlated isotope heterogeneity for elements of distinct nucleosynthetic origin (183W and 58Ni) is best explained by thermal processing in the protoplanetary disk during which thermally labile carrier phases are unmixed by vaporization thereby imparting isotope anomalies on the residual processed reservoir. PMID:27445452

  11. Spatial pattern analysis of Cu, Zn and Ni and their interpretation in the Campania region (Italy)

    NASA Astrophysics Data System (ADS)

    Petrik, Attila; Albanese, Stefano; Jordan, Gyozo; Rolandi, Roberto; De Vivo, Benedetto

    2017-04-01

    The uniquely abundant Campanian topsoil dataset enabled us to perform a spatial pattern analysis of three potentially toxic elements, Cu, Zn and Ni. This study focuses on revealing the spatial texture and distribution of these elements by spatial point pattern and image processing analysis, such as lineament density and spatial variability index calculation. The application of these methods to geochemical data provides a new and efficient tool to understand the spatial variation of concentrations and their background/baseline values. The determination and quantification of spatial variability is crucial to understand how fast concentration changes in a certain area and what processes might govern the variation. The spatial variability index calculation and image processing analysis, including lineament density, enable us to delineate homogeneous areas and analyse them with respect to lithology and land use. Identification of spatial outliers and their patterns was also investigated by local spatial autocorrelation and image processing analysis, including the determination of local minima and maxima points and singularity index analysis. The spatial variability of Cu and Zn reveals the highest zone (Cu: 0.5 MAD, Zn: 0.8-0.9 MAD, Median Deviation Index) along the coast between Campi Flegrei and the Sorrento Peninsula, with the vast majority of statistically identified outliers and high-high spatial clustered points. The background/baseline maps of Cu and Zn reveal a moderate to high variability (Cu: 0.3 MAD, Zn: 0.4-0.5 MAD) NW-SE oriented zone, including disrupted patches from Bisaccia to Mignano, following the alluvial plains of the Apennine rivers. This zone has a high abundance of anomalous concentrations identified using singularity analysis, and it also has a high density of lineaments. The spatial variability of Ni shows the highest variability zone (0.6-0.7 MAD) around Campi Flegrei, where the majority of low outliers are concentrated. The background/baseline variability map of Ni reveals an eastward shift of the highest variability zones, which coincide with limestone outcrops. The highly segmented area between Mignano and Bisaccia partially follows the alluvial plains of the Apennine rivers, which seem to play a crucial role in the distribution and redistribution pattern of Cu, Zn and Ni in Campania. The high spatial variability zones of the latter elements are located in topsoils on volcanoclastic rocks and are mostly related to cultivation and urbanised areas.
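
    The abstract does not define the Median Deviation (MAD-based) spatial variability index exactly; purely as an illustration of the general idea, the sketch below computes a moving-window median absolute deviation over a synthetic gridded element map. The window size and grid values are assumptions.

    ```python
    # Rough sketch: moving-window median absolute deviation (MAD) as a simple
    # spatial variability surface for a gridded concentration map (synthetic data).
    import numpy as np
    from scipy.ndimage import generic_filter

    rng = np.random.default_rng(2)
    grid = rng.lognormal(mean=3.0, sigma=0.5, size=(100, 100))   # synthetic Cu map (mg/kg)

    def mad(window):
        return np.median(np.abs(window - np.median(window)))

    variability = generic_filter(grid, mad, size=5, mode="nearest")  # 5x5 moving window
    print("median windowed MAD:", round(float(np.median(variability)), 2))
    ```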

  12. High-speed limnology: using advanced sensors to investigate spatial variability in biogeochemistry and hydrology.

    PubMed

    Crawford, John T; Loken, Luke C; Casson, Nora J; Smith, Colin; Stone, Amanda G; Winslow, Luke A

    2015-01-06

    Advanced sensor technology is widely used in aquatic monitoring and research. Most applications focus on temporal variability, whereas spatial variability has been challenging to document. We assess the capability of water chemistry sensors embedded in a high-speed water intake system to document spatial variability. This new sensor platform continuously samples surface water at a range of speeds (0 to >45 km h-1) resulting in high-density, mesoscale spatial data. These novel observations reveal previously unknown variability in physical, chemical, and biological factors in streams, rivers, and lakes. By combining multiple sensors into one platform, we were able to detect terrestrial-aquatic hydrologic connections in a small dystrophic lake, to infer the role of main-channel vs backwater nutrient processing in a large river and to detect sharp chemical changes across aquatic ecosystem boundaries in a stream/lake complex. Spatial sensor data were verified in our examples by comparing with standard lab-based measurements of selected variables. Spatial fDOM data showed strong correlation with wet chemistry measurements of DOC, and optical NO3 concentrations were highly correlated with lab-based measurements. High-frequency spatial data similar to our examples could be used to further understand aquatic biogeochemical fluxes, ecological patterns, and ecosystem processes, and will both inform and benefit from fixed-site data.

  13. High-speed limnology: Using advanced sensors to investigate spatial variability in biogeochemistry and hydrology

    USGS Publications Warehouse

    Crawford, John T.; Loken, Luke C.; Casson, Nora J.; Smith, Collin; Stone, Amanda G.; Winslow, Luke A.

    2015-01-01

    Advanced sensor technology is widely used in aquatic monitoring and research. Most applications focus on temporal variability, whereas spatial variability has been challenging to document. We assess the capability of water chemistry sensors embedded in a high-speed water intake system to document spatial variability. This new sensor platform continuously samples surface water at a range of speeds (0 to >45 km h–1) resulting in high-density, mesoscale spatial data. These novel observations reveal previously unknown variability in physical, chemical, and biological factors in streams, rivers, and lakes. By combining multiple sensors into one platform, we were able to detect terrestrial–aquatic hydrologic connections in a small dystrophic lake, to infer the role of main-channel vs backwater nutrient processing in a large river and to detect sharp chemical changes across aquatic ecosystem boundaries in a stream/lake complex. Spatial sensor data were verified in our examples by comparing with standard lab-based measurements of selected variables. Spatial fDOM data showed strong correlation with wet chemistry measurements of DOC, and optical NO3 concentrations were highly correlated with lab-based measurements. High-frequency spatial data similar to our examples could be used to further understand aquatic biogeochemical fluxes, ecological patterns, and ecosystem processes, and will both inform and benefit from fixed-site data.

  14. Problems and programming for analysis of IUE high resolution data for variability

    NASA Technical Reports Server (NTRS)

    Grady, C. A.

    1981-01-01

    Observations of variability in stellar winds provide an important probe of their dynamics. It is crucial, however, to know that any variability seen in a data set can be clearly attributed to the star and not to instrumental or data processing effects. In the course of analysis of IUE high resolution data of alpha Cam and other O, B and Wolf-Rayet stars, several effects were found which cause spurious variability or spurious spectral features in our data. Programming was developed to partially compensate for these effects using the Interactive Data Language (IDL) on the LASP PDP 11/34. Use of an interactive language such as IDL is particularly suited to the analysis of variability data as it permits the use of efficient programs coupled with the judgement of the scientist at each stage of processing.

  15. Quality-by-Design (QbD): An integrated process analytical technology (PAT) approach for a dynamic pharmaceutical co-precipitation process characterization and process design space development.

    PubMed

    Wu, Huiquan; White, Maury; Khan, Mansoor A

    2011-02-28

    The aim of this work was to develop an integrated process analytical technology (PAT) approach for dynamic pharmaceutical co-precipitation process characterization and design space development. A dynamic co-precipitation process, induced by gradually introducing water to the ternary system of naproxen-Eudragit L100-alcohol, was monitored in real time in situ via Lasentec FBRM and PVM. A 3D map of count-time-chord length revealed three distinguishable process stages: incubation, transition, and steady state. The effects of high-risk process variables (slurry temperature, stirring rate, and water addition rate) on both the derived co-precipitation process rates and the final chord-length distribution were evaluated systematically using a 3^3 full factorial design. Critical process variables were identified via ANOVA for both the transition and steady states. General linear models (GLM) were then used for parameter estimation for each critical variable. Clear trends in the effects of each critical variable during transition and steady state were found by GLM and were interpreted using fundamental process principles and Nyvlt's transfer model. Neural network models were able to link process variables with response variables at transition and steady state with R^2 of 0.88-0.98. PVM images evidenced nucleation and crystal growth. Contour plots illustrated the design space via the ranges of the critical process variables. This work demonstrated the utility of an integrated PAT approach for QbD development. Published by Elsevier B.V.
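
    As a generic illustration of the 3^3 full factorial design mentioned above (not the study's runs, responses, or units), the sketch below enumerates the 27 coded runs for the three process variables and fits a main-effects linear model to a synthetic response by least squares.

    ```python
    # Generic sketch: 3^3 full factorial design in coded levels (-1, 0, +1) and a
    # least-squares fit of a main-effects model to a synthetic response.
    import itertools
    import numpy as np

    levels = (-1, 0, 1)
    design = np.array(list(itertools.product(levels, repeat=3)))   # 27 runs x 3 factors

    rng = np.random.default_rng(3)
    beta_true = np.array([10.0, 2.0, -1.5, 0.5])                   # intercept + 3 main effects
    X = np.column_stack([np.ones(len(design)), design])
    y = X @ beta_true + rng.normal(scale=0.3, size=len(design))    # synthetic response

    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("estimated effects (intercept, temperature, stirring, water rate):",
          np.round(beta_hat, 2))
    ```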

  16. Variability of Attention Processes in ADHD: Observations from the Classroom

    ERIC Educational Resources Information Center

    Rapport, Mark D.; Kofler, Michael J.; Alderson, R. Matt; Timko, Thomas M., Jr.; DuPaul, George J.

    2009-01-01

    Objective: Classroom- and laboratory-based efforts to study the attentional problems of children with ADHD are incongruent in elucidating attentional deficits; however, none have explored within- or between-minute variability in the classroom attentional processing in children with ADHD. Method: High and low attention groups of ADHD children…

  17. Mechanisms of the 40-70 Day Variability in the Yucatan Channel Volume Transport

    NASA Astrophysics Data System (ADS)

    van Westen, René M.; Dijkstra, Henk A.; Klees, Roland; Riva, Riccardo E. M.; Slobbe, D. Cornelis; van der Boog, Carine G.; Katsman, Caroline A.; Candy, Adam S.; Pietrzak, Julie D.; Zijlema, Marcel; James, Rebecca K.; Bouma, Tjeerd J.

    2018-02-01

    The Yucatan Channel connects the Caribbean Sea with the Gulf of Mexico and is the main outflow region of the Caribbean Sea. Moorings in the Yucatan Channel show high-frequency variability in kinetic energy (50-100 days) and transport (20-40 days), but the physical mechanisms controlling this variability are poorly understood. In this study, we show that the short-term variability in the Yucatan Channel transport has an upstream origin and arises from processes in the North Brazil Current. To establish this connection, we use data from altimetry and model output from several high-resolution global models. A significant 40-70 day variability is found in the sea surface height in the North Brazil Current retroflection region, with a propagation toward the Lesser Antilles. The frequency of variability is generated by intrinsic processes associated with the shedding of eddies, rather than by atmospheric forcing. This sea surface height variability passes the Lesser Antilles, propagates westward with the background ocean flow in the Caribbean Sea, and finally affects the variability in the Yucatan Channel volume transport.
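
    As a minimal, hedged illustration of how a 40-70 day band can be picked out of a sea surface height record (not the study's altimetry or model analysis), the sketch below computes a periodogram of a synthetic daily SSH series containing a 55-day oscillation.

    ```python
    # Illustrative only: find the dominant intraseasonal period in a synthetic daily
    # sea surface height series using a periodogram.
    import numpy as np
    from scipy.signal import periodogram

    rng = np.random.default_rng(4)
    days = np.arange(3650)                                    # ten years, daily sampling
    ssh = 0.05 * np.sin(2 * np.pi * days / 55.0) + 0.02 * rng.normal(size=days.size)

    freq, power = periodogram(ssh, fs=1.0)                    # frequency in cycles per day
    band = (freq > 1 / 70) & (freq < 1 / 40)                  # 40-70 day band
    peak = freq[band][np.argmax(power[band])]
    print(f"dominant period in the 40-70 day band: {1 / peak:.1f} days")
    ```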

  18. The contribution of local and transport processes to phytoplankton biomass variability over different timescales in the Upper James River, Virginia

    NASA Astrophysics Data System (ADS)

    Qin, Qubin; Shen, Jian

    2017-09-01

    Although both local processes (photosynthesis, respiration, grazing, and settling), and transport processes (advective transport and diffusive transport) significantly affect local phytoplankton dynamics, it is difficult to separate their contributions and to investigate the relative importance of each process to the local variability of phytoplankton biomass over different timescales. A method of using the transport rate is introduced to quantify the contribution of transport processes. By combining the time-varying transport rate and high-frequency observed chlorophyll a data, we can explicitly examine the impact of local and transport processes on phytoplankton biomass over a range of timescales from hourly to annually. For the Upper James River, results show that the relative importance of local and transport processes differs on different timescales. Local processes dominate phytoplankton variability on daily to weekly timescales, whereas the contribution of transport processes increases on seasonal to annual timescales and reaches equilibrium with local processes. With the use of the transport rate and high-frequency chlorophyll a data, a method similar to the open water oxygen method for metabolism is also presented to estimate phytoplankton primary production.
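
    The paper's transport-rate formulation is not given in the abstract; the sketch below only illustrates the bookkeeping idea with synthetic numbers: given an observed chlorophyll a series and an independently estimated transport contribution, the part of the observed rate of change attributed to local processes is taken as the residual.

    ```python
    # Illustrative bookkeeping only (not the paper's method): split the observed rate
    # of change of chlorophyll a into a transport term and a residual local term.
    import numpy as np

    hours = np.arange(0, 24 * 7, dtype=float)                  # one week, hourly
    chl = 10 + 2 * np.sin(2 * np.pi * hours / 24)              # synthetic chlorophyll a (ug/L)
    transport_rate = 0.05 * np.cos(2 * np.pi * hours / 24)     # synthetic net transport (ug/L/h)

    observed_change = np.gradient(chl, hours)                  # d(chl)/dt from observations
    local_change = observed_change - transport_rate            # residual assigned to local processes

    print("mean |local| contribution (ug/L/h):", round(float(np.mean(np.abs(local_change))), 3))
    print("mean |transport| contribution (ug/L/h):", round(float(np.mean(np.abs(transport_rate))), 3))
    ```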

  19. Can Process Understanding Help Elucidate The Structure Of The Critical Zone? Comparing Process-Based Soil Formation Models With Digital Soil Mapping.

    NASA Astrophysics Data System (ADS)

    Vanwalleghem, T.; Román, A.; Peña, A.; Laguna, A.; Giráldez, J. V.

    2017-12-01

    There is a need to better understand the processes influencing soil formation and the resulting distribution of soil properties in the critical zone. Soil properties can exhibit strong spatial variation, even at the small catchment scale. Soil carbon pools in semi-arid, mountainous areas are especially uncertain because bulk density and stoniness are very heterogeneous and rarely measured explicitly. In this study, we explore the spatial variability in key soil properties (soil carbon stocks, stoniness, bulk density and soil depth) as a function of the processes shaping the critical zone (weathering, erosion, soil water fluxes and vegetation patterns). We also compare the potential of traditional digital soil mapping versus a mechanistic soil formation model (MILESD) for predicting these key soil properties. Soil core samples were collected from 67 locations at 6 depths. Total soil organic carbon stocks were 4.38 kg m-2. Solar radiation proved to be the key variable controlling soil carbon distribution. Stone content was mostly controlled by slope, indicating the importance of erosion. The spatial distribution of bulk density was found to be highly random. Finally, total carbon stocks were predicted using a random forest model whose main covariates were solar radiation and NDVI. The model predicts carbon stocks that are twice as high on north-facing as on south-facing slopes. However, validation showed that these covariates explained only 25% of the variation in the dataset. Apparently, present-day landscape and vegetation properties are not sufficient to fully explain the variability in soil carbon stocks in this complex terrain under natural vegetation. This is attributed to high spatial variability in bulk density and stoniness, key variables controlling carbon stocks. Similar results were obtained with the mechanistic soil formation model MILESD, suggesting that more complex models might be needed to further explore this high spatial variability.
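
    As a small, self-contained analogue of the random forest step described above (synthetic covariates standing in for solar radiation and NDVI, not the study's data), the sketch below fits a random forest regression and reports held-out accuracy and covariate importances.

    ```python
    # Synthetic analogue of the covariate-based carbon stock model: random forest
    # regression on two stand-in covariates (solar radiation, NDVI).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    n = 200
    solar = rng.uniform(0.5, 1.5, n)          # stand-in for annual solar radiation (relative)
    ndvi = rng.uniform(0.2, 0.8, n)           # stand-in for NDVI
    carbon = 6 - 2.5 * solar + 3 * ndvi + rng.normal(scale=1.5, size=n)   # synthetic kg C m-2

    X = np.column_stack([solar, ndvi])
    X_tr, X_te, y_tr, y_te = train_test_split(X, carbon, random_state=0)
    rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
    print("R^2 on held-out sites:", round(rf.score(X_te, y_te), 2))
    print("covariate importances (solar, NDVI):", np.round(rf.feature_importances_, 2))
    ```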

  20. Variable sensory perception in autism.

    PubMed

    Haigh, Sarah M

    2018-03-01

    Autism is associated with sensory and cognitive abnormalities. Individuals with autism generally show normal or superior early sensory processing abilities compared to healthy controls, but deficits in complex sensory processing. In the current opinion paper, it will be argued that sensory abnormalities impact cognition by limiting the amount of signal that can be used to interpret and interact with the environment. There is a growing body of literature showing that individuals with autism exhibit greater trial-to-trial variability in behavioural and cortical sensory responses. If multiple sensory signals that are highly variable are added together to process more complex sensory stimuli, then this might destabilise later perception and impair cognition. Methods to improve sensory processing have shown improvements in more general cognition. Studies that specifically investigate differences in sensory trial-to-trial variability in autism, and the potential changes in variability before and after treatment, could ascertain whether trial-to-trial variability is a good mechanism to target for treatment in autism. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  1. Sensory-processing sensitivity and its relation to introversion and emotionality.

    PubMed

    Aron, E N; Aron, A

    1997-08-01

    Over a series of 7 studies that used diverse samples and measures, this research identified a unidimensional core variable of high sensory-processing sensitivity and demonstrated its partial independence from social introversion and emotionality, variables with which it had been confused or subsumed in most previous theorizing by personality researchers. Additional findings were that there appear to be 2 distinct clusters of highly sensitive individuals (a smaller group with an unhappy childhood and related variables, and a larger group similar to nonhighly sensitive individuals except for their sensitivity) and that sensitivity moderates, at least for men, the relation of parental environment to reporting having had an unhappy childhood. This research also demonstrated adequate reliability and content, convergent, and discriminant validity for a 27-item Highly Sensitive Person Scale.

  2. Speed but not amplitude of visual feedback exacerbates force variability in older adults.

    PubMed

    Kim, Changki; Yacoubi, Basma; Christou, Evangelos A

    2018-06-23

    Magnification of visual feedback (VF) impairs force control in older adults. In this study, we aimed to determine whether the age-associated increase in force variability with magnification of visual feedback is a consequence of increased amplitude or speed of visual feedback. Seventeen young and 18 older adults performed a constant isometric force task with the index finger at 5% of MVC. We manipulated the vertical (force gain) and horizontal (time gain) aspects of the visual feedback so that participants performed the task under the following VF conditions: (1) high amplitude-fast speed; (2) low amplitude-slow speed; (3) high amplitude-slow speed. Changing the visual feedback from low amplitude-slow speed to high amplitude-fast speed increased force variability in older adults but decreased it in young adults (P < 0.01). Changing the visual feedback from low amplitude-slow speed to high amplitude-slow speed did not alter force variability in older adults (P > 0.2), but decreased it in young adults (P < 0.01). Changing the visual feedback from high amplitude-slow speed to high amplitude-fast speed increased force variability in older adults (P < 0.01) but did not alter force variability in young adults (P > 0.2). In summary, increased force variability in older adults with magnification of visual feedback was evident only when the speed of visual feedback increased. Thus, we conclude that in older adults, deficits in the rate of processing visual information, and not deficits in the processing of more visual information, impair force control.

  3. The Importance of Freshwater to Spatial Variability of Aragonite Saturation State in the Gulf of Alaska

    NASA Astrophysics Data System (ADS)

    Siedlecki, Samantha A.; Pilcher, Darren J.; Hermann, Albert J.; Coyle, Ken; Mathis, Jeremy

    2017-11-01

    High-latitude and subpolar regions like the Gulf of Alaska (GOA) are more vulnerable than equatorial regions to rising carbon dioxide (CO2) levels, in part due to local processes that amplify the global signal. Recent field observations have shown that the shelf of the GOA is currently experiencing seasonal corrosive events (carbonate mineral saturation states Ω, Ω < 1), including suppressed Ω in response to ocean acidification as well as local processes like increased low-alkalinity glacial meltwater discharge. While the glacial discharge mainly influences the inner shelf, on the outer shelf, upwelling brings corrosive waters from the deep GOA. In this work, we develop a high-resolution model for carbon dynamics in the GOA, identify regions of high variability of Ω, and test the sensitivity of those regions to changes in the chemistry of glacial meltwater discharge. Results indicate the importance of this climatically sensitive and relatively unconstrained regional freshwater forcing for Ω variability in the nearshore. The increase was nearly linear at 0.002 Ω per 100 µmol/kg increase in alkalinity in the freshwater runoff. We find that the local winds, biological processes, and freshwater forcing all contribute to the spatial distribution of Ω and identify which of these three is highly correlated to the variability in Ω. Given that the timing and magnitude of these processes will likely change during the next few decades, it is critical to elucidate the effect of local processes on the background ocean acidification signal using robust models, such as the one described here.

  4. [Optimal extraction of effective constituents from Aralia elata by central composite design and response surface methodology].

    PubMed

    Lv, Shao-Wa; Liu, Dong; Hu, Pan-Pan; Ye, Xu-Yan; Xiao, Hong-Bin; Kuang, Hai-Xue

    2010-03-01

    To optimize the process of extracting effective constituents from Aralia elata by response surface methodology. The independent variables were ethanol concentration, reflux time and solvent fold; the dependent variable was the extraction rate of total saponins from Aralia elata. Linear or nonlinear mathematical models were used to estimate the relationship between the independent and dependent variables. Response surface methodology was used to optimize the extraction process. The prediction was verified by comparing the observed and predicted values. The regression coefficient of the fitted binomial model was as high as 0.9617, and the optimum extraction conditions were 70% ethanol, 2.5 hours of reflux, 20-fold solvent and 3 extraction cycles. The bias between observed and predicted values was -2.41%. This shows that the optimized model is highly predictive.

  5. Spatial Variability of Dissolved Organic Carbon in Headwater Wetlands in Central Pennsylvania

    NASA Astrophysics Data System (ADS)

    Reichert-Eberhardt, A. J.; Wardrop, D.; Boyer, E. W.

    2011-12-01

    Dissolved organic carbon (DOC) is known to be an important factor in many microbially mediated biochemical processes, such as denitrification, that occur in wetlands. The spatial variability of DOC within a wetland could impact the microbes that fuel these processes, which in turn can affect the ecosystem services provided by wetlands. However, the amount of spatial variability of DOC in wetlands is generally unknown. Furthermore, it is unknown how disturbance to wetlands can affect the spatial variability of DOC. Previous research in central Pennsylvania headwater wetland soils has shown that wetlands with increased human disturbance had decreased heterogeneity in soil biochemistry. To address groundwater chemical variability, 20 monitoring wells were installed in a random pattern in a 400 square meter plot in a low-disturbance headwater wetland and in a high-disturbance headwater wetland in central Pennsylvania. Water samples from these wells will be analyzed for DOC, dissolved inorganic carbon, nitrate, ammonia, and sulfate concentrations, as well as pH, conductivity, and temperature on a seasonal basis. It is hypothesized that there will be greater spatial variability of groundwater chemistry in the low-disturbance wetland than in the high-disturbance wetland. This poster will present the initial data concerning DOC spatial variability in both the low- and high-impact headwater wetlands.

  6. Adolescent Decision-Making Processes regarding University Entry: A Model Incorporating Cultural Orientation, Motivation and Occupational Variables

    ERIC Educational Resources Information Center

    Jung, Jae Yup

    2013-01-01

    This study tested a newly developed model of the cognitive decision-making processes of senior high school students related to university entry. The model incorporated variables derived from motivation theory (i.e. expectancy-value theory and the theory of reasoned action), literature on cultural orientation and occupational considerations. A…

  7. Factors influencing spatial variability in nitrogen processing in nitrogen-saturated soils

    Treesearch

    Frank S. Gilliam; Charles C. Somerville; Nikki L. Lyttle; Mary Beth Adams

    2001-01-01

    Nitrogen (N) saturation is an environmental concern for forests in the eastern U.S. Although several watersheds of the Fernow Experimental Forest (FEF), West Virginia exhibit symptoms of N saturation, many watersheds display a high degree of spatial variability in soil N processing. This study examined the effects of temperature on net N mineralization and...

  8. Manufacturing challenge: An employee perception of the impact of BEM variables on motivation

    NASA Astrophysics Data System (ADS)

    Nyaude, Alaster

    The study examines the impact of Thomas F. Gilbert's Behavior Engineering Model (BEM) variables on employee perception of motivation at an aerospace equipment manufacturing plant in Georgia. The research process involved a literature review and the determination of an appropriate survey instrument for the study. The Hersey-Chevalier modified PROBE instrument (Appendix C) was used with Dr Roger Chevalier's validation. The participants' responses were further examined to determine the influence of the demographic control variables of age, gender, length of service with the company and education on employee perception of motivation. The results indicated that the top three highly motivating variables were knowledge and skills, capacity and resources. Knowledge and skills was perceived as the most highly motivating variable, capacity as the second, and resources as the third. The fourth most motivating variable was information, the fifth was motives and the sixth was incentives. The results also showed that the demographic control variables had no influence on employee perception of motivation. Further research may be required to understand to what extent these BEM variables impact employee perceptions of motivation.

  9. Development of process data capturing, analysis and controlling for thermal spray techniques - SprayTracker

    NASA Astrophysics Data System (ADS)

    Kelber, C.; Marke, S.; Trommler, U.; Rupprecht, C.; Weis, S.

    2017-03-01

    Thermal spraying processes are becoming increasingly important in high-technology areas, such as automotive engineering and medical technology. The method offers the advantage of a local layer application with different materials and high deposition rates. Challenges in the application of thermal spraying result from the complex interaction of different influencing variables, which can be attributed to the properties of different materials, operating equipment supply, electrical parameters, flow mechanics, plasma physics and automation. In addition, spraying systems are subject to constant wear. Due to the process specification and the high demands on the produced coatings, innovative quality assurance tools are necessary. A central aspect, which has not yet been considered, is the data management in relation to the present measured variables, in particular the spraying system, the handling system, working safety devices and additional measuring sensors. Both the recording of all process-characterizing variables, their linking and evaluation as well as the use of the data for the active process control presuppose a novel, innovative control system (hardware and software) that was to be developed within the scope of the research project. In addition, new measurement methods and sensors are to be developed and qualified in order to improve the process reliability of thermal spraying.

  10. Cognitive inconsistency in bipolar patients is determined by increased intra-individual variability in initial phase of task performance.

    PubMed

    Krukow, Paweł; Szaniawska, Ola; Harciarek, Michał; Plechawska-Wójcik, Małgorzata; Jonak, Kamil

    2017-03-01

    Bipolar patients show high intra-individual variability during cognitive processing. However, it is not known whether there are specific fluctuations of variability contributing to the overall high cognitive inconsistency. The objective was to compare the dynamic profiles of patients and healthy controls to identify hypothetical differences and their associations with overall variability and processing speed. Changes in reaction time iSD over the course of processing speed test performance were measured by dividing the iSD for the whole task into four consecutive parts. Motor speed and cognitive effort were controlled. Patients with BD exhibited significantly lower processing speed and higher intra-individual variability compared with HC. The profile of intra-individual variability changes over the time of performance was significantly different in the BD versus HC groups: F(3, 207) = 8.60, p < 0.0001, partial eta squared = 0.11. The iSD of BD patients in the initial phase of performance was three times higher than in the last. There were no significant differences between the four intervals in the HC group. The inter-group difference in the initial part of the profiles remained significant after controlling for several cognitive and clinical variables. The applied computer version of the Cognitive Speed Test was relatively new and, thus, replication studies are needed. The effect seen in the present study is driven mainly by BD type I. Patients with BD exhibit problems with establishing a stimulus-response association in the starting phase of cognitive processing. This deficit may negatively interfere with other cognitive functions, decreasing the level of psychosocial functioning, and should therefore be explored in future studies. Copyright © 2017 Elsevier B.V. All rights reserved.
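
    To make the quartering of the task concrete, here is a minimal sketch with synthetic reaction times (not patient data): a single participant's trial sequence is split into four consecutive parts and the intra-individual SD (iSD) is computed for each part.

    ```python
    # Minimal sketch: intra-individual SD of reaction times computed separately for
    # four consecutive parts of one participant's trial sequence (synthetic data).
    import numpy as np

    rng = np.random.default_rng(6)
    n_trials = 120
    # Synthetic participant whose responses are more variable early on and stabilize later.
    scale = np.linspace(150.0, 50.0, n_trials)          # trial-wise SD in ms
    rt = 600 + rng.normal(loc=0.0, scale=scale)         # reaction times (ms)

    parts = np.array_split(rt, 4)                       # four consecutive quarters
    isd_per_part = [part.std(ddof=1) for part in parts]
    print("iSD per quarter (ms):", np.round(isd_per_part, 1))
    ```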

  11. Design of quantum efficiency measurement system for variable doping GaAs photocathode

    NASA Astrophysics Data System (ADS)

    Chen, Liang; Yang, Kai; Liu, HongLin; Chang, Benkang

    2008-03-01

    Achieving high quantum efficiency and good stability has recently been a main direction in the development of GaAs photocathodes. Our earlier research showed that the variable doping structure is feasible and practical, and has great potential. In order to optimize the preparation technique for variable doping GaAs photocathodes and to study variable doping theory in depth, a real-time quantum efficiency measurement system for GaAs photocathodes has been designed. The system uses an FPGA (field-programmable gate array) device and a high-speed A/D converter in a data acquisition card with a high signal-to-noise ratio and high speed. An ARM (Advanced RISC Machines) core processor, the s3c2410, and a real-time embedded system are used to acquire and display measurement results. The photocurrent measurement precision can reach 1 nA, and the measurement range of the spectral response curve is 400-1000 nm. The GaAs photocathode preparation process can be monitored in real time using this system. Additional functions could easily be added to this system in the future to show the physical changes of the photocathode during the preparation process more comprehensively.

  12. New gentle-wing high-shear granulator: impact of processing variables on granules and tablets characteristics of high-drug loading formulation using design of experiment approach.

    PubMed

    Fayed, Mohamed H; Abdel-Rahman, Sayed I; Alanazi, Fars K; Ahmed, Mahrous O; Tawfeek, Hesham M; Al-Shdefat, Ramadan I

    2017-10-01

    The aim of this work was to study the application of a design of experiment (DoE) approach in defining the design space for granulation and tableting processes using a novel gentle-wing high-shear granulator. From a quality-by-design (QbD) perspective, critical attributes of granules and tablets should be ensured by manufacturing process design. A face-centered central composite design was employed in order to investigate the effect of water amount (X1), impeller speed (X2), wet massing time (X3), and water addition rate (X4) as independent process variables on granule and tablet characteristics. Acetaminophen was used as a model drug, and granulation experiments were carried out using dry addition of povidone K30. The dried granules were analyzed for their size distribution, density, and flow pattern. Additionally, the produced tablets were investigated for weight uniformity, breaking force, friability and percent capping, disintegration time, and drug dissolution. Results of regression analysis showed that water amount, impeller speed and wet massing time have significant (p < .05) effects on granule and tablet characteristics. However, the water amount had the most pronounced effect, as indicated by its higher parameter estimate. On the other hand, water addition rate showed a minimal impact on granule and tablet properties. In conclusion, water amount, impeller speed, and wet massing time can be considered critical process variables. Understanding the relationship between these variables and the quality attributes of granules and the corresponding tablets provides the basis for adjusting granulation variables in order to optimize product performance.
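
    For readers unfamiliar with the design, the sketch below enumerates the coded runs of a face-centered central composite design for four factors (2^4 factorial corners, 8 axial points on the cube faces, plus center replicates). It is a generic construction in coded units, not the study's actual run list; the number of center replicates is an assumption.

    ```python
    # Generic face-centered central composite design in coded units for four factors
    # (X1 water amount, X2 impeller speed, X3 wet massing time, X4 water addition rate).
    import itertools
    import numpy as np

    k = 4
    corners = np.array(list(itertools.product((-1, 1), repeat=k)))    # 16 factorial runs
    axial = np.vstack([sign * np.eye(k) for sign in (-1, 1)])         # 8 face-centered axial runs
    center = np.zeros((6, k))                                         # 6 assumed center replicates

    design = np.vstack([corners, axial, center])
    print(f"{design.shape[0]} runs x {design.shape[1]} coded factors")
    print(design[:5])
    ```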

  13. Punishment induced behavioural and neurophysiological variability reveals dopamine-dependent selection of kinematic movement parameters

    PubMed Central

    Galea, Joseph M.; Ruge, Diane; Buijink, Arthur; Bestmann, Sven; Rothwell, John C.

    2013-01-01

    Action selection describes the high-level process which selects between competing movements. In animals, behavioural variability is critical for the motor exploration required to select the action which optimizes reward and minimizes cost/punishment, and is guided by dopamine (DA). The aim of this study was to test in humans whether low-level movement parameters are affected by punishment and reward in ways similar to high-level action selection. Moreover, we addressed the proposed dependence of behavioural and neurophysiological variability on DA, and whether this may underpin the exploration of kinematic parameters. Participants performed an out-and-back index finger movement and were instructed that monetary reward and punishment were based on its maximal acceleration (MA). In fact, the feedback was not contingent on the participant’s behaviour but pre-determined. Blocks highly-biased towards punishment were associated with increased MA variability relative to blocks with either reward or without feedback. This increase in behavioural variability was positively correlated with neurophysiological variability, as measured by changes in cortico-spinal excitability with transcranial magnetic stimulation over the primary motor cortex. Following the administration of a DA-antagonist, the variability associated with punishment diminished and the correlation between behavioural and neurophysiological variability no longer existed. Similar changes in variability were not observed when participants executed a pre-determined MA, nor did DA influence resting neurophysiological variability. Thus, under conditions of punishment, DA-dependent processes influence the selection of low-level movement parameters. We propose that the enhanced behavioural variability reflects the exploration of kinematic parameters for less punishing, or conversely more rewarding, outcomes. PMID:23447607

  14. Artificial Intelligence Tools for Scaling Up of High Shear Wet Granulation Process.

    PubMed

    Landin, Mariana

    2017-01-01

    The results presented in this article demonstrate the potential of artificial intelligence tools for predicting the endpoint of the granulation process in high-speed mixer granulators of different scales from 25 L to 600 L. The combination of neurofuzzy logic and gene expression programming technologies allowed the modeling of the impeller power as a function of operation conditions and wet granule properties, establishing the critical variables that affect the response and obtaining a unique experimental polynomial equation (transparent model) of high predictability (R^2 > 86.78%) for all equipment sizes. Gene expression programming allowed the modeling of the granulation process for granulators of similar and dissimilar geometries and can be improved by implementing additional characteristics of the process, such as composition variables or operation parameters (e.g., batch size, chopper speed). The principles and the methodology proposed here can be applied to understand and control the manufacturing process using any other granulation equipment, including continuous granulation processes. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  15. Continuous Flow in Labour-Intensive Manufacturing Process

    NASA Astrophysics Data System (ADS)

    Pacheco Eng., Jhonny; Carbajal MSc., Eduardo; Stoll-Ing., Cesar, Dr.

    2017-06-01

    Continuous-flow manufacturing represents the peak of standardized production and usually implies high output on a strict production line. Low-tech industry, however, is highly labour-intensive, and in this context the efficiency of the production line is tied to the job-shop organization. Labour-intensive manufacturing processes are a common characteristic of developing countries. This research proposes a methodology for production planning that fulfils a variable monthly production quota. The main idea is to use a clock as an orchestra director to synchronize the rate (takt time) of customer demand with the manufacturing time. In this way, the study proposes a stark reduction of work-in-process stock, over-processing, and unnecessary variability.
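
    The synchronizing "clock" described in the abstract is the takt time. As a tiny worked example with invented figures, takt time is the available production time divided by the required output, and the line is balanced so that no station's cycle time exceeds it.

    ```python
    # Tiny worked takt-time example (all figures invented for illustration).
    available_minutes_per_shift = 450            # e.g. an 8 h shift minus breaks
    shifts_per_day = 2
    monthly_quota_units = 9000                   # the variable monthly production quota
    working_days_per_month = 20

    daily_demand = monthly_quota_units / working_days_per_month              # 450 units/day
    takt_time = available_minutes_per_shift * shifts_per_day / daily_demand  # minutes per unit
    print(f"takt time = {takt_time:.2f} min/unit")                           # 2.00 min/unit

    # Any station whose cycle time exceeds the takt time is a bottleneck to rebalance.
    station_cycle_times = [1.6, 1.9, 2.3, 1.7]                               # minutes per unit
    bottlenecks = [i for i, c in enumerate(station_cycle_times) if c > takt_time]
    print("stations over takt:", bottlenecks)
    ```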

  16. Effects of in-sewer processes: a stochastic model approach.

    PubMed

    Vollertsen, J; Nielsen, A H; Yang, W; Hvitved-Jacobsen, T

    2005-01-01

    Transformations of organic matter, nitrogen and sulfur in sewers can be simulated taking into account the relevant transformation and transport processes. One objective of such simulation is the assessment and management of hydrogen sulfide formation and corrosion. Sulfide is formed in the biofilms and sediments of the water phase, but corrosion occurs on the moist surfaces of the sewer gas phase. Consequently, both phases and the transport of volatile substances between these phases must be included. Furthermore, wastewater composition and transformations in sewers are complex and subject to high, natural variability. This paper presents the latest developments of the WATS model concept, allowing integrated aerobic, anoxic and anaerobic simulation of the water phase and of gas phase processes. The resulting model is complex and with high parameter variability. An example applying stochastic modeling shows how this complexity and variability can be taken into account.

  17. How Differences Between Manager and Clinician Perceptions of Safety Culture Impact Hospital Processes of Care.

    PubMed

    Richter, Jason; Mazurenko, Olena; Kazley, Abby Swanson; Ford, Eric W

    2017-11-04

    Evidence-based processes of care improve patient outcomes, yet universal compliance is lacking, and perceptions of the quality of care are highly variable. The purpose of this study is to examine how differences in clinician and management perceptions of teamwork and communication relate to adherence to hospital processes of care. Hospitals submitted identifiable data for the 2012 Hospital Survey on Patient Safety Culture and the Centers for Medicare and Medicaid Services' Hospital Compare. The dependent variable was a composite developed from the scores on adherence to acute myocardial infarction, heart failure, and pneumonia process of care measures. The primary independent variables reflected 4 safety culture domains: communication openness, feedback about errors, teamwork within units, and teamwork between units. We assigned each hospital to one of 4 groups based on agreement between managers and clinicians on each domain. Each hospital was categorized as "high" (above the median) or "low" (below) for clinicians and managers in communication and teamwork. We found a positive relationship between perceived teamwork and communication climate and processes of care measures. If managers and clinicians perceived communication openness as high, the hospital was more likely to adhere to processes of care. Similarly, if clinicians perceived teamwork across units as high, the hospital was more likely to adhere to processes of care. Manager and staff perceptions about teamwork and communication impact adherence to processes of care. Policies should recognize the importance of the perceptions of both clinicians and managers regarding teamwork and communication and seek to improve organizational climate and practices. Clinician perceptions of teamwork across units are more closely linked to processes of care, so managers should be cognizant of this and try to improve their perceptions.

  18. Process optimization for osmo-dehydrated carambola (Averrhoa carambola L) slices and its storage studies.

    PubMed

    Roopa, N; Chauhan, O P; Raju, P S; Das Gupta, D K; Singh, R K R; Bawa, A S

    2014-10-01

    An osmotic-dehydration process protocol for carambola (Averrhoa carambola L.), an exotic star-shaped tropical fruit, was developed. The process was optimized using Response Surface Methodology (RSM) following a Central Composite Rotatable Design (CCRD). The experimental variables selected for the optimization were soak solution concentration (°Brix), soaking temperature (°C) and soaking time (min), with 6 experiments at the central point. The effect of the process variables on solid gain and water loss during the osmotic dehydration process was studied. The data obtained were analyzed employing multiple regression techniques to generate suitable mathematical models. Quadratic models were found to fit well (R², 95.58-98.64%) in describing the effect of the variables on the responses studied. The optimized levels of the process variables were 70 °Brix, 48 °C and 144 min for soak solution concentration, soaking temperature and soaking time, respectively. The predicted and experimental results at the optimized levels of the variables showed high correlation. The osmo-dehydrated product prepared at the optimized conditions showed a shelf-life of 10, 8 and 6 months at 5 °C, ambient (30 ± 2 °C) and 37 °C, respectively.
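
    A hedged sketch of the response-surface step described above: fitting a quadratic model to designed experimental data and locating the predicted optimum. The data generator and the simple three-level grid below are stand-ins, since the record reports only the variable ranges and optimized levels, not the raw CCRD measurements:

    ```python
    import numpy as np
    from itertools import product
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression

    # Coded factor levels for concentration (°Brix), temperature (°C), time (min).
    levels = [(-1, 0, 1)] * 3
    X = np.array(list(product(*levels)), dtype=float)

    # Stand-in response (water loss); real designed-experiment data would replace this.
    rng = np.random.default_rng(0)
    y = 30 + 5*X[:, 0] + 3*X[:, 1] + 2*X[:, 2] - 2*X[:, 0]**2 + rng.normal(0, 0.5, len(X))

    # Quadratic (second-order) response surface, as used in RSM.
    quad = PolynomialFeatures(degree=2, include_bias=False)
    model = LinearRegression().fit(quad.fit_transform(X), y)
    print(f"R^2 = {model.score(quad.transform(X), y):.3f}")

    # Grid search over the coded design space for the predicted optimum.
    grid = np.array(list(product(np.linspace(-1, 1, 21), repeat=3)))
    best = grid[np.argmax(model.predict(quad.transform(grid)))]
    print("Predicted optimum (coded units):", best.round(2))
    ```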

  19. Kpejigaou: an indigenous, high-protein, low-fat, cowpea-based griddled food proposed for coastal West Africa.

    PubMed

    Amonsou, Eric Oscar; Sakyi-Dawson, Esther; Saalia, Firibu Kwesi; Houssou, Paul

    2008-12-01

    Griddled cowpea paste foods have high nutritional potential because they are low in fat but high in protein. A good understanding of process and product characteristics of kpejigaou is necessary to improve its quality and enhance acceptability. To describe the product, evaluate critical variables in traditional processing, and determine consumer quality criteria and preferences for kpejigaou. A survey of kpejigaou processing was carried out among processors and regular consumers of kpejigaou. Kpejigaou is flat and circular in shape, with uniform thickness and porous structure. The production process of kpejigaou was found to be simple and rapid, but the quality of the finished product varied among processors and among batches. Critical processing variables affecting quality were dehulling of the cowpeas, type of griddling equipment, and griddling temperature. Texture (sponginess) is the most important quality index that determines the preference and acceptability of kpejigaou by consumers. Traditionally processed kpejigaou does not meet current standards for high-quality foods. This study provides the basis for efforts to standardize the kpejigaou process to ensure consistent product quality and enhance the acceptability of kpejigaou among consumers. Kpejigaou has a potential for success if marketed as a low-fat, nutritious fast food.

  20. Small scale denitrification variability in riparian zones: Results from a high-resolution dataset

    NASA Astrophysics Data System (ADS)

    Gassen, Niklas; Knöller, Kay; Musolff, Andreas; Popp, Felix; Lüders, Tillmann; Stumpp, Christine

    2017-04-01

    Riparian zones are important compartments at the interface between groundwater and surface water where biogeochemical processes like denitrification are often enhanced. Nitrate loads of either groundwater entering a stream through the riparian zone or streamwater infiltrating into the riparian zone can be substantially reduced. These processes are spatially and temporally highly variable, making it difficult to capture solute variability, estimate realistic turnover rates and thus quantify integral mass removal. A crucial step towards a more detailed characterization is to monitor solutes on a scale which adequately resembles the highly heterogeneous distribution and on which the processes occur. We measured biogeochemical parameters at high spatial resolution within a riparian corridor of a German lowland river system over the course of one year. Samples were taken from three newly developed high-resolution multi-level wells with a maximum vertical resolution of 5 cm and analyzed for major ions, DOC and N-O isotopes. Sediment retrieved during installation of the wells was analyzed for specific denitrifying enzymes. Results showed a distinct depth zonation of hydrochemistry within the shallow alluvial aquifer, with a 1 m thick zone just below the water table with lower nitrate concentrations and EC values similar to the nearby river. Conservative parameters were consistent between the three wells, but nitrate was highly variable. In addition, spots with low nitrate concentrations showed isotopic and microbial evidence of higher denitrification activity. The depth zonation was observed throughout the year, with stronger temporal variations of nitrate concentrations just below the water table compared to deeper layers. Nitrate isotopes showed a clear seasonal trend in denitrification activity (high in summer, low in winter). Our dataset gives new insight into river-groundwater exchange processes and shows the highly heterogeneous distribution of denitrification in riparian zones, both in time and space. With these new insights, we are able to improve our understanding of the spatial scaling of denitrification processes. This leads to better prediction and improved management strategies for buffer mechanisms in riparian zones.

  1. Inter-individual cognitive variability in children with Asperger's syndrome

    PubMed Central

    Gonzalez-Gadea, Maria Luz; Tripicchio, Paula; Rattazzi, Alexia; Baez, Sandra; Marino, Julian; Roca, Maria; Manes, Facundo; Ibanez, Agustin

    2014-01-01

    Multiple studies have tried to establish the distinctive profile of individuals with Asperger's syndrome (AS). However, recent reports suggest that adults with AS feature heterogeneous cognitive profiles. The present study explores inter-individual variability in children with AS through group comparison and multiple case series analysis. All participants completed an extended battery including measures of fluid and crystallized intelligence, executive functions, theory of mind, and classical neuropsychological tests. Significant group differences were found in theory of mind and other domains related to global information processing. However, the AS group showed high inter-individual variability (both sub- and supra-normal performance) on most cognitive tasks. Furthermore, high fluid intelligence correlated with less general cognitive impairment, high cognitive flexibility, and speed of motor processing. In light of these findings, we propose that children with AS are characterized by a distinct, uneven pattern of cognitive strengths and weaknesses. PMID:25132817

  2. Collaborative Research: Process-resolving Decomposition of the Global Temperature Response to Modes of Low Frequency Variability in a Changing Climate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Ming; Deng, Yi

    2015-02-06

    El Niño-Southern Oscillation (ENSO) and Annular Modes (AMs) represent respectively the most important modes of low frequency variability in the tropical and extratropical circulations. The future projection of the ENSO and AM variability, however, remains highly uncertain with the state-of-the-art coupled general circulation models. A comprehensive understanding of the factors responsible for the inter-model discrepancies in projecting future changes in the ENSO and AM variability, in terms of multiple feedback processes involved, has yet to be achieved. The proposed research aims to identify sources of such uncertainty and establish a set of process-resolving quantitative evaluations of the existing predictions of the future ENSO and AM variability. The proposed process-resolving evaluations are based on a feedback analysis method formulated in Lu and Cai (2009), which is capable of partitioning 3D temperature anomalies/perturbations into components linked to 1) radiation-related thermodynamic processes such as cloud and water vapor feedbacks, 2) local dynamical processes including convection and turbulent/diffusive energy transfer and 3) non-local dynamical processes such as the horizontal energy transport in the oceans and atmosphere. Taking advantage of the high-resolution, multi-model ensemble products from the Coupled Model Intercomparison Project Phase 5 (CMIP5) soon to be available at the Lawrence Livermore National Lab, we will conduct a process-resolving decomposition of the global three-dimensional (3D) temperature (including SST) response to the ENSO and AM variability in the preindustrial, historical and future climate simulated by these models. Specific research tasks include 1) identifying the model-observation discrepancies in the global temperature response to ENSO and AM variability and attributing such discrepancies to specific feedback processes, 2) delineating the influence of anthropogenic radiative forcing on the key feedback processes operating on ENSO and AM variability and quantifying their relative contributions to the changes in the temperature anomalies associated with different phases of ENSO and AMs, and 3) investigating the linkages between model feedback processes that lead to inter-model differences in time-mean temperature projection and model feedback processes that cause inter-model differences in the simulated ENSO and AM temperature response. Through a thorough model-observation and inter-model comparison of the multiple energetic processes associated with ENSO and AM variability, the proposed research serves to identify key uncertainties in model representation of ENSO and AM variability, and investigate how the model uncertainty in predicting time-mean response is related to the uncertainty in predicting response of the low-frequency modes. The proposal is thus a direct response to the first topical area of the solicitation: Interaction of Climate Change and Low Frequency Modes of Natural Climate Variability. It ultimately supports the accomplishment of the BER climate science activity Long Term Measure (LTM): "Deliver improved scientific data and models about the potential response of the Earth's climate and terrestrial biosphere to increased greenhouse gas levels for policy makers to determine safe levels of greenhouse gases in the atmosphere."

  3. Controls on the spatial variability of key soil properties: comparing field data with a mechanistic soilscape evolution model

    NASA Astrophysics Data System (ADS)

    Vanwalleghem, T.; Román, A.; Giraldez, J. V.

    2016-12-01

    There is a need to better understand the processes influencing soil formation and the resulting distribution of soil properties. Soil properties can exhibit strong spatial variation, even at the small catchment scale. Soil carbon pools in semi-arid, mountainous areas are especially uncertain because bulk density and stoniness are very heterogeneous and rarely measured explicitly. In this study, we explore the spatial variability in key soil properties (soil carbon stocks, stoniness, bulk density and soil depth) as a function of the processes shaping the critical zone (weathering, erosion, soil water fluxes and vegetation patterns). We also compare the potential of a geostatistical versus a mechanistic soil formation model (MILESD) for predicting these key soil properties. Soil core samples were collected from 67 locations at 6 depths. Total soil organic carbon stocks were 4.38 kg m-2. Solar radiation proved to be the key variable controlling soil carbon distribution. Stone content was mostly controlled by slope, indicating the importance of erosion. The spatial distribution of bulk density was found to be highly random. Finally, total carbon stocks were predicted using a random forest model whose main covariates were solar radiation and NDVI. The model predicts carbon stocks that are twice as high on north-facing as on south-facing slopes. However, validation showed that these covariates explained only 25% of the variation in the dataset. Apparently, present-day landscape and vegetation properties are not sufficient to fully explain the variability in soil carbon stocks in this complex terrain under natural vegetation. This is attributed to high spatial variability in bulk density and stoniness, key variables controlling carbon stocks. Similar results were obtained with the mechanistic soil formation model MILESD, suggesting that more complex models might be needed to further explore this high spatial variability.
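
    A minimal sketch of the covariate-based prediction step described above (a random forest with solar radiation and NDVI as the main covariates). The synthetic table stands in for the 67 field samples, which are not reproduced in the record:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    n = 67                                           # number of sampling locations

    # Stand-in covariates and response (kg C m^-2); field data would replace these.
    solar_radiation = rng.uniform(0.6, 1.4, n)       # relative annual insolation
    ndvi = rng.uniform(0.2, 0.8, n)
    carbon_stock = 6 - 2.5*solar_radiation + 1.5*ndvi + rng.normal(0, 1.2, n)

    X = np.column_stack([solar_radiation, ndvi])
    rf = RandomForestRegressor(n_estimators=500, random_state=0)

    # Cross-validated R^2: with noisy terrain data this can stay low (~0.25 in the
    # study), even when the model reproduces broad aspect-driven contrasts.
    r2 = cross_val_score(rf, X, carbon_stock, cv=5, scoring="r2").mean()
    print(f"cross-validated R^2 ≈ {r2:.2f}")

    rf.fit(X, carbon_stock)
    print("covariate importances (radiation, NDVI):", rf.feature_importances_.round(2))
    ```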

  4. HIGH-SHEAR GRANULATION PROCESS: INFLUENCE OF PROCESSING PARAMETERS ON CRITICAL QUALITY ATTRIBUTES OF ACETAMINOPHEN GRANULES AND TABLETS USING DESIGN OF EXPERIMENT APPROACH.

    PubMed

    Fayed, Mohamed H; Abdel-Rahman, Sayed I; Alanazi, Fars K; Ahmed, Mahrous O; Tawfeek, Hesham M; Al-Shedfat, Ramadan I

    2017-01-01

    Application of quality by design (QbD) to the high-shear granulation process is critical and requires recognizing the correlation between the granulation process parameters and the properties of the intermediate (granules) and the corresponding final product (tablets). The present work examined the influence of water amount (X1) and wet massing time (X2) as independent process variables on the critical quality attributes of granules and the corresponding tablets using a design of experiments (DoE) technique. A two-factor, three-level (3²) full factorial design was performed; each of these variables was investigated at three levels to characterize their strength and interaction. The dried granules were analyzed for their size distribution, density and flow pattern. Additionally, the produced tablets were investigated for weight uniformity, crushing strength, friability and percent capping, disintegration time and drug dissolution. A statistically significant impact (p < 0.05) of water amount was identified for granule growth, percent fines, distribution width and flow behavior. Granule density and compressibility were found to be significantly influenced (p < 0.05) by the two operating conditions. Also, water amount had a significant effect (p < 0.05) on tablet weight uniformity, friability and percent capping. Moreover, tablet disintegration time and drug dissolution appeared to be significantly influenced (p < 0.05) by the two process variables. On this basis, the relationship of the process parameters with the critical quality attributes of the granules and the final tablet product was identified and correlated. Ultimately, a judicious selection of process parameters in the high-shear granulation process will allow a product of the desired quality to be obtained.
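
    A hedged sketch of the 3² full-factorial layout behind the study: generating the nine runs for water amount (X1) and wet massing time (X2) and fitting main effects plus their interaction. The factor coding and the response generator are illustrative placeholders, not the published data:

    ```python
    import numpy as np
    import pandas as pd
    from itertools import product
    import statsmodels.formula.api as smf

    # 3^2 full factorial: each factor at three coded levels (-1, 0, +1).
    runs = pd.DataFrame(list(product((-1, 0, 1), repeat=2)), columns=["X1", "X2"])

    # Stand-in response (granule mean size, µm); real measurements would replace this.
    rng = np.random.default_rng(3)
    runs["d50"] = 250 + 60*runs.X1 + 20*runs.X2 + 15*runs.X1*runs.X2 + rng.normal(0, 5, len(runs))

    # Main effects plus interaction, the usual model for a two-factor factorial.
    fit = smf.ols("d50 ~ X1 * X2", data=runs).fit()
    print(fit.params.round(1))
    print(fit.pvalues.round(3))      # p < 0.05 flags statistically significant terms
    ```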

  5. Moving forward socio-economically focused models of deforestation.

    PubMed

    Dezécache, Camille; Salles, Jean-Michel; Vieilledent, Ghislain; Hérault, Bruno

    2017-09-01

    Whilst high-resolution spatial variables contribute to a good fit of spatially explicit deforestation models, socio-economic processes are often beyond the scope of these models. Such a low level of interest in the socio-economic dimension of deforestation limits the relevance of these models for decision-making and may be the cause of their failure to accurately predict observed deforestation trends in the medium term. This study aims to propose a flexible methodology for taking into account multiple drivers of deforestation in tropical forested areas, where the intensity of deforestation is explicitly predicted based on socio-economic variables. By coupling a model of deforestation location based on spatial environmental variables with several sub-models of deforestation intensity based on socio-economic variables, we were able to create a map of predicted deforestation over the period 2001-2014 in French Guiana. This map was compared to a reference map for accuracy assessment, not only at the pixel scale but also over cells ranging from 1 to approximately 600 sq. km. Highly significant relationships were explicitly established between deforestation intensity and several socio-economic variables: population growth, the amount of agricultural subsidies, and gold and wood production. Such a precise characterization of socio-economic processes makes it possible to avoid overestimation biases in high-deforestation areas, suggesting a better integration of socio-economic processes in the models. Whilst considering deforestation as a purely geographical process leads to conservative models unable to effectively assess changes in the socio-economic and political contexts influencing deforestation trends, this explicit characterization of the socio-economic dimension of deforestation is critical for the creation of deforestation scenarios in REDD+ projects. © 2017 John Wiley & Sons Ltd.
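
    A hedged sketch of the coupling idea described above: a spatial model allocates where deforestation occurs, while an intensity sub-model predicts how much, from socio-economic covariates. The Poisson regression and the synthetic administrative-unit table below are illustrative choices, not the authors' implementation:

    ```python
    import numpy as np
    from sklearn.linear_model import PoissonRegressor

    rng = np.random.default_rng(7)
    n = 40                                        # hypothetical administrative units

    # Stand-in socio-economic covariates per unit.
    pop_growth = rng.normal(0.02, 0.01, n)        # annual population growth rate
    subsidies  = rng.uniform(0, 1, n)             # normalized agricultural subsidies
    gold_prod  = rng.uniform(0, 1, n)             # normalized gold production

    X = np.column_stack([pop_growth, subsidies, gold_prod])
    deforested_ha = rng.poisson(np.exp(3 + 40*pop_growth + 1.2*gold_prod))

    # Intensity sub-model: expected deforested area from socio-economic drivers.
    intensity = PoissonRegressor(alpha=1e-3, max_iter=500).fit(X, deforested_ha)
    print("coefficients (pop growth, subsidies, gold):", intensity.coef_.round(2))

    # A spatial location model would then distribute each unit's predicted total
    # over pixels ranked by environmental suitability (not shown here).
    predicted_total = intensity.predict(X).sum()
    print(f"predicted total deforestation: {predicted_total:.0f} ha")
    ```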

  6. Variable high pressure processing sensitivities for GII human noroviruses

    USDA-ARS?s Scientific Manuscript database

    Human norovirus (HuNoV) is the leading cause of foodborne diseases worldwide. High pressure processing (HPP) is one of the most promising non-thermal technologies for decontamination of viral pathogens in foods. However, the survival of HuNoVs by HPP is poorly understood because these viruses cann...

  7. Relationship between operational variables, fundamental physics and foamed cement properties in lab and field generated foamed cement slurries

    DOE PAGES

    Glosser, D.; Kutchko, B.; Benge, G.; ...

    2016-03-21

    Foamed cement is a critical component for wellbore stability. The mechanical performance of a foamed cement depends on its microstructure, which in turn depends on the preparation method and attendant operational variables. Determination of cement stability for field use is based on laboratory testing protocols governed by API Recommended Practice 10B-4 (API RP 10B-4, 2015). However, laboratory and field operational variables contrast considerably in terms of scale, as well as slurry mixing and foaming processes. In this paper, laboratory and field operational processes are characterized within a physics-based framework. It is shown that the "atomization energy" imparted by the high pressure injection of nitrogen gas into the field-mixed foamed cement slurry is, by a significant margin, the highest energy process, and has a major impact on the void system in the cement slurry. There is no analog for this high energy exchange in current laboratory cement preparation and testing protocols. Quantifying the energy exchanges across the laboratory and field processes provides a basis for understanding the relative impacts of these variables on cement structure, and can ultimately lead to the development of practices to improve cement testing and performance.

  8. Defining process design space for a hydrophobic interaction chromatography (HIC) purification step: application of quality by design (QbD) principles.

    PubMed

    Jiang, Canping; Flansburg, Lisa; Ghose, Sanchayita; Jorjorian, Paul; Shukla, Abhinav A

    2010-12-15

    The concept of design space has been taking root under the quality by design paradigm as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. This paper outlines the development of a design space for a hydrophobic interaction chromatography (HIC) process step. The design space included the impact of raw material lot-to-lot variability and variations in the feed stream from cell culture. A failure modes and effects analysis was employed as the basis for the process characterization exercise. During mapping of the process design space, the multi-dimensional combination of operational variables were studied to quantify the impact on process performance in terms of yield and product quality. Variability in resin hydrophobicity was found to have a significant influence on step yield and high-molecular weight aggregate clearance through the HIC step. A robust operating window was identified for this process step that enabled a higher step yield while ensuring acceptable product quality. © 2010 Wiley Periodicals, Inc.

  9. Influence of Processing Parameters on the Flow Path in Friction Stir Welding

    NASA Technical Reports Server (NTRS)

    Schneider, J. A.; Nunes, A. C., Jr.

    2006-01-01

    Friction stir welding (FSW) is a solid phase welding process that unites thermal and mechanical aspects to produce a high quality joint. The process variables are rpm, translational weld speed, and downward plunge force. The strain-temperature history of a metal element at each point on the cross-section of the weld is determined by the individual flow path taken by the particular filament of metal flowing around the tool as influenced by the process variables. The resulting properties of the weld are determined by the strain-temperature history. Thus to control FSW properties, improved understanding of the processing parameters on the metal flow path is necessary.

  10. Intradaily variability of water quality in a shallow tidal lagoon: Mechanisms and implications

    USGS Publications Warehouse

    Lucas, L.V.; Sereno, D.M.; Burau, J.R.; Schraga, T.S.; Lopez, C.B.; Stacey, M.T.; Parchevsky, K.V.; Parchevsky, V.P.

    2006-01-01

    Although surface water quality and its underlying processes vary over time scales ranging from seconds to decades, they have historically been studied at the lower (weekly to interannual) frequencies. The aim of this study was to investigate intradaily variability of three water quality parameters in a small freshwater tidal lagoon (Mildred Island, California). High frequency time series of specific conductivity, water temperature, and chlorophyll a at two locations within the habitat were analyzed in conjunction with supporting hydrodynamic, meteorological, biological, and spatial mapping data. All three constituents exhibited large amplitude intradaily (e.g., semidiurnal tidal and diurnal) oscillations, and periodicity varied across constituents, space, and time. Like other tidal embayments, this habitat is influenced by several processes with distinct periodicities including physical controls, such as tides, solar radiation, and wind, and biological controls, such as photosynthesis, growth, and grazing. A scaling approach was developed to estimate individual process contributions to the observed variability. Scaling results were generally consistent with observations and together with detailed examination of time series and time derivatives, revealed specific mechanisms underlying the observed periodicities, including interactions between the tidal variability, heating, wind, and biology. The implications for monitoring were illustrated through subsampling of the data set. This exercise demonstrated how quantities needed by scientists and managers (e.g., mean or extreme concentrations) may be misrepresented by low frequency data and how short-duration high frequency measurements can aid in the design and interpretation of temporally coarser sampling programs. The dispersive export of chlorophyll a from the habitat exhibited a fortnightly variability corresponding to the modulation of semidiurnal tidal currents with the diurnal cycle of phytoplankton variability, demonstrating how high frequency interactions can govern long-term trends. Process identification, as through the scaling analysis here, can help us anticipate changes in system behavior and adapt our own interactions with the system. © 2006 Estuarine Research Federation.
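
    A minimal sketch of the subsampling exercise described above: a high-frequency signal with tidal and diurnal components is sampled weekly, showing how means and extremes can be misrepresented by coarse sampling. The synthetic chlorophyll signal is illustrative only:

    ```python
    import numpy as np

    t = np.arange(0, 90*24, 0.25)                       # 90 days at 15-min resolution (hours)
    semidiurnal = 4*np.sin(2*np.pi*t/12.42)             # tidal component
    diurnal     = 6*np.sin(2*np.pi*t/24.0)              # light-driven component
    chl = 10 + semidiurnal + diurnal + np.random.default_rng(0).normal(0, 1, t.size)

    weekly = chl[::4*24*7]                              # one sample per week

    print(f"high-frequency mean {chl.mean():.2f}, weekly-sampled mean {weekly.mean():.2f}")
    print(f"high-frequency max  {chl.max():.2f}, weekly-sampled max  {weekly.max():.2f}")
    # The weekly series aliases the tidal/diurnal oscillations, so extremes (and,
    # with unlucky phasing, even the mean) are poorly represented by coarse data.
    ```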

  11. Using weather data to improve decision-making

    USDA-ARS?s Scientific Manuscript database

    Weather in the western United States is relatively dry and highly variable. The consequences of this variability can be effectively dealt with through the process of adaptive management which includes contingency planning for partial restoration success or restoration failure in any given year. Pr...

  12. Initiating an ergonomic analysis. A process for jobs with highly variable tasks.

    PubMed

    Conrad, K M; Lavender, S A; Reichelt, P A; Meyer, F T

    2000-09-01

    Occupational health nurses play a vital role in addressing ergonomic problems in the workplace. Describing and documenting exposure to ergonomic risk factors is a relatively straightforward process in jobs in which the work is repetitive. In other types of work, the analysis becomes much more challenging because tasks may be repeated infrequently, or at irregular time intervals, or under different environmental and temporal conditions, thereby making it difficult to observe a "representative" sample of the work performed. This article describes a process used to identify highly variable job tasks for ergonomic analyses. The identification of tasks for ergonomic analysis was a two-step process involving interviews and a survey of firefighters and paramedics from a consortium of 14 suburban fire departments. The interviews were used to generate a list of frequently performed, physically strenuous job tasks and to capture clear descriptions of those tasks and associated roles. The goals of the survey were to confirm the interview findings across the entire target population and to quantify the frequency and degree of strenuousness of each task. In turn, the quantitative results from the survey were used to prioritize job tasks for simulation. Although this process was used to study firefighters and paramedics, the approach is likely to be suitable for many other types of occupations in which the tasks are highly variable in content and irregular in frequency.

  13. Cognitive Performance and Heart Rate Variability: The Influence of Fitness Level

    PubMed Central

    Luque-Casado, Antonio; Zabala, Mikel; Morales, Esther; Mateo-March, Manuel; Sanabria, Daniel

    2013-01-01

    In the present study, we investigated the relation between cognitive performance and heart rate variability as a function of fitness level. We measured the effect of three cognitive tasks (the psychomotor vigilance task, a temporal orienting task, and a duration discrimination task) on the heart rate variability of two groups of participants: a high-fit group and a low-fit group. Two major novel findings emerged from this study. First, the lowest values of heart rate variability were found during performance of the duration discrimination task, compared to the other two tasks. Second, the results showed a decrement in heart rate variability as a function of the time on task, although only in the low-fit group. Moreover, the high-fit group showed overall faster reaction times than the low-fit group in the psychomotor vigilance task, while there were not significant differences in performance between the two groups of participants in the other two cognitive tasks. In sum, our results highlighted the influence of cognitive processing on heart rate variability. Importantly, both behavioral and physiological results suggested that the main benefit obtained as a result of fitness level appeared to be associated with processes involving sustained attention. PMID:23437276
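
    The record reports heart rate variability only qualitatively; as a hedged illustration of how such values are typically derived, the sketch below computes two standard time-domain HRV metrics (SDNN and RMSSD) from a synthetic series of R-R intervals. The interval values are assumptions, not study data:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    rr_ms = 800 + rng.normal(0, 50, 300)        # synthetic R-R intervals (ms), ~75 bpm

    sdnn  = np.std(rr_ms, ddof=1)               # overall variability of R-R intervals
    rmssd = np.sqrt(np.mean(np.diff(rr_ms)**2)) # short-term (beat-to-beat) variability

    print(f"SDNN  = {sdnn:.1f} ms")
    print(f"RMSSD = {rmssd:.1f} ms")
    # Lower values across a block of trials would correspond to the reduced heart
    # rate variability reported during the more demanding cognitive task.
    ```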

  14. Prediction of Indian Summer-Monsoon Onset Variability: A Season in Advance.

    PubMed

    Pradhan, Maheswar; Rao, A Suryachandra; Srivastava, Ankur; Dakate, Ashish; Salunke, Kiran; Shameera, K S

    2017-10-27

    Monsoon onset is an inherently transient phenomenon of the Indian Summer Monsoon, and it was never envisaged that this transience could be predicted at long lead times. Though onset is precipitous, its variability exhibits strong teleconnections with large-scale forcing such as ENSO and the IOD and hence may be predictable. Despite the tremendous skill achieved by state-of-the-art models in predicting such large-scale processes, the prediction of monsoon onset variability by the models is still limited to just 2-3 weeks in advance. Using an objective definition of onset in a global coupled ocean-atmosphere model, it is shown that skillful prediction of onset variability is feasible under a seasonal prediction framework. Better representation of not only the large-scale processes but also the synoptic and intraseasonal features during the evolution of monsoon onset underlies the skillful simulation of monsoon onset variability. The changes observed in convection, tropospheric circulation and moisture availability prior to and after the onset are evident in the model simulations, which results in a high hit rate for early/delayed monsoon onset in the high-resolution model.

  15. Standard cell electrical and physical variability analysis based on automatic physical measurement for design-for-manufacturing purposes

    NASA Astrophysics Data System (ADS)

    Shauly, Eitan; Parag, Allon; Khmaisy, Hafez; Krispil, Uri; Adan, Ofer; Levi, Shimon; Latinski, Sergey; Schwarzband, Ishai; Rotstein, Israel

    2011-04-01

    A fully automated system for process variability analysis of high-density standard cells was developed. The system consists of layout analysis with device mapping: device type, location, configuration and more. The mapping step was created by a simple DRC run-set. This database was then used as an input for choosing locations for SEM images and for extraction of specific layout parameters used in SPICE simulation. This method was used to analyze large arrays of standard cell blocks, manufactured using the Tower TS013LV (Low Voltage for high-speed applications) platform. Variability of physical parameters such as Lgate and line-width roughness, as well as of electrical parameters such as drive current (Ion) and off current (Ioff), was calculated and statistically analyzed in order to understand the variability root cause. A comparison between transistors having the same W/L but different layout configurations and different layout environments (around the transistor) was made in terms of performance as well as process variability. We successfully defined "robust" and "less-robust" transistor configurations, and updated guidelines for Design-for-Manufacturing (DfM).

  16. Temperature-Robust Neural Function from Activity-Dependent Ion Channel Regulation.

    PubMed

    O'Leary, Timothy; Marder, Eve

    2016-11-07

    Many species of cold-blooded animals experience substantial and rapid fluctuations in body temperature. Because biological processes are differentially temperature dependent, it is difficult to understand how physiological processes in such animals can be temperature robust [1-8]. Experiments have shown that core neural circuits, such as the pyloric circuit of the crab stomatogastric ganglion (STG), exhibit robust neural activity in spite of large (20°C) temperature fluctuations [3, 5, 7, 8]. This robustness is surprising because (1) each neuron has many different kinds of ion channels with different temperature dependencies (Q10s) that interact in a highly nonlinear way to produce firing patterns and (2) across animals there is substantial variability in conductance densities that nonetheless produce almost identical firing properties. The high variability in conductance densities in these neurons [9, 10] appears to contradict the possibility that robustness is achieved through precise tuning of key temperature-dependent processes. In this paper, we develop a theoretical explanation for how temperature robustness can emerge from a simple regulatory control mechanism that is compatible with highly variable conductance densities [11-13]. The resulting model suggests a general mechanism for how nervous systems and excitable tissues can exploit degenerate relationships among temperature-sensitive processes to achieve robust function. Copyright © 2016 Elsevier Ltd. All rights reserved.
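
    A hedged, minimal sketch of the kind of activity-dependent regulation the paper models: conductance densities are slowly adjusted by an integral controller tracking the error between an activity sensor and its target, so very different final densities can yield the same activity level. The one-variable "activity" proxy, gains, and time constants below are illustrative, not the published model:

    ```python
    import numpy as np

    def simulate(g_init, target=1.0, tau_g=500.0, dt=0.1, steps=20000):
        """Integral control of two conductances toward an activity target."""
        g = np.array(g_init, dtype=float)
        for _ in range(steps):
            activity = 0.6*g[0] + 0.4*g[1]        # crude proxy for a Ca-dependent sensor
            error = target - activity
            # Each conductance integrates the same error with its own gain,
            # so the ratio of final densities depends on the starting point.
            g += dt/tau_g * error * np.array([1.0, 0.7])
            g = np.clip(g, 0.0, None)
        return g, activity

    for start in ([0.1, 0.1], [2.0, 0.2], [0.2, 2.0]):
        g, act = simulate(start)
        print(f"start {start} -> final g {g.round(2)}, activity {act:.2f}")
    # Different initial conditions converge to different conductance combinations
    # that all satisfy the same activity target - the degeneracy invoked above.
    ```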

  17. Spray-congealed microparticles for drug delivery - an overview of factors influencing their production and characteristics.

    PubMed

    Oh, Ching Mien; Guo, Qiyun; Wan Sia Heng, Paul; Chan, Lai Wah

    2014-07-01

    In any manufacturing process, the success of producing an end product with the desired properties and yield depends on a range of factors that include the equipment, process and formulation variables. It is the interest of manufacturers and researchers to understand each manufacturing process better and ascertain the effects of various manufacturing-associated factors on the properties of the end product. Unless the manufacturing process is well understood, it would be difficult to set realistic limits for the process variables and raw material specifications to ensure consistently high-quality and reproducible end products. Over the years, spray congealing has been used to produce particulates by the food and pharmaceutical industries. The latter have used this technology to develop specialized drug delivery systems. In this review, basic principles as well as advantages and disadvantages of the spray congealing process will be covered. Recent developments in spray congealing equipment, process variables and formulation variables such as the matrix material, encapsulated material and additives will also be discussed. Innovative equipment designs and formulations for spray congealing have emerged. Judicious choice of atomizers, polymers and additives is the key to achieve the desired properties of the microparticles for drug delivery.

  18. Non-Gaussian Multi-resolution Modeling of Magnetosphere-Ionosphere Coupling Processes

    NASA Astrophysics Data System (ADS)

    Fan, M.; Paul, D.; Lee, T. C. M.; Matsuo, T.

    2016-12-01

    The most dynamic coupling between the magnetosphere and ionosphere occurs in the Earth's polar atmosphere. Our objective is to model scale-dependent stochastic characteristics of high-latitude ionospheric electric fields that originate from solar wind magnetosphere-ionosphere interactions. The Earth's high-latitude ionospheric electric field exhibits considerable variability, with increasing non-Gaussian characteristics at decreasing spatio-temporal scales. Accurately representing the underlying stochastic physical process through random field modeling is crucial not only for scientific understanding of the energy, momentum and mass exchanges between the Earth's magnetosphere and ionosphere, but also for modern technological systems including telecommunication, navigation, positioning and satellite tracking. While considerable effort has been made to characterize the large-scale variability of the electric field in the context of Gaussian processes, no attempt has been made so far to model the small-scale non-Gaussian stochastic process observed in the high-latitude ionosphere. We construct a novel random field model using spherical needlets as building blocks. The double localization of spherical needlets in both spatial and frequency domains enables the model to capture the non-Gaussian and multi-resolutional characteristics of the small-scale variability. The estimation procedure is computationally feasible due to the utilization of an adaptive Gibbs sampler. We apply the proposed methodology to the computational simulation output from the Lyon-Fedder-Mobarry (LFM) global magnetohydrodynamics (MHD) magnetosphere model. Our non-Gaussian multi-resolution model results in characterizing significantly more energy associated with the small-scale ionospheric electric field variability in comparison to Gaussian models. By accurately representing unaccounted-for additional energy and momentum sources to the Earth's upper atmosphere, our novel random field modeling approach will provide a viable remedy to the current numerical models' systematic biases resulting from the underestimation of high-latitude energy and momentum sources.

  19. Pure s-Process Molybdenum Found in PreSolar Silicon Carbide Grains

    NASA Astrophysics Data System (ADS)

    Stephan, T.; Trappitsch, R.; Boehnke, P.; Davis, A. M.; Pellin, M. J.; Pardo, O. S.

    2017-07-01

    Molybdenum isotopes analyzed with high precision in 18 presolar SiC grains using CHILI (Chicago Instrument for Laser Ionization) reflect variability of conditions in stellar environments during s-process nucleosynthesis.

  20. Study of Variable Frequency Induction Heating in Steel Making Process

    NASA Astrophysics Data System (ADS)

    Fukutani, Kazuhiko; Umetsu, Kenji; Itou, Takeo; Isobe, Takanori; Kitahara, Tadayuki; Shimada, Ryuichi

    Induction heating technologies have been the standard technologies employed in steel making processes because they are clean, they have a high energy density, and they are highly controllable. However, there is a problem in using them: in general, the frequencies of the electric circuits have to be kept fixed to improve their power factors, and this constraint makes the processes inflexible. In order to overcome this problem, we have developed a new heating technique: a variable-frequency power supply with magnetic energy recovery switching. This technique helps to improve the quality of steel products as well as productivity. We have also performed numerical calculations and experiments to evaluate its effect on the temperature distributions of heated steel plates. The obtained results indicate that application of the technique in steel making processes would be advantageous.

  1. Variables that Predict Serve Efficacy in Elite Men’s Volleyball with Different Quality of Opposition Sets

    PubMed Central

    Valhondo, Álvaro; Fernández-Echeverría, Carmen; González-Silva, Jara; Claver, Fernando; Moreno, M. Perla

    2018-01-01

    Abstract The objective of this study was to determine the variables that predicted serve efficacy in elite men’s volleyball, in sets with different quality of opposition. 3292 serve actions were analysed, of which 2254 were carried out in high quality of opposition sets and 1038 actions were in low quality of opposition sets, corresponding to a total of 24 matches played during the Men’s European Volleyball Championships held in 2011. The independent variables considered in this study were the serve zone, serve type, serving player, serve direction, reception zone, receiving player and reception type; the dependent variable was serve efficacy and the situational variable was quality of opposition sets. The variables that acted as predictors in both high and low quality of opposition sets were the serving player, reception zone and reception type. The serve type variable only acted as a predictor in high quality of opposition sets, while the serve zone variable only acted as a predictor in low quality of opposition sets. These results may provide important guidance in men’s volleyball training processes. PMID:29599869

  2. Richer concepts are better remembered: number of features effects in free recall

    PubMed Central

    Hargreaves, Ian S.; Pexman, Penny M.; Johnson, Jeremy C.; Zdrazilova, Lenka

    2012-01-01

    Many models of memory build in a term for encoding variability, the observation that there can be variability in the richness or extensiveness of processing at encoding, and that this variability has consequences for retrieval. In four experiments, we tested the expectation that encoding variability could be driven by the properties of the to-be-remembered item. Specifically, that concepts associated with more semantic features would be better remembered than concepts associated with fewer semantic features. Using feature listing norms we selected sets of items for which people tend to list higher numbers of features (high NoF) and items for which people tend to list lower numbers of features (low NoF). Results showed more accurate free recall for high NoF concepts than for low NoF concepts in expected memory tasks (Experiments 1–3) and also in an unexpected memory task (Experiment 4). This effect was not the result of associative chaining between study items (Experiment 3), and can be attributed to the amount of item-specific processing that occurs at study (Experiment 4). These results provide evidence that stimulus-specific differences in processing at encoding have consequences for explicit memory retrieval. PMID:22514526

  3. Contrasting controls of pH climatology in an open coast versus urban fjord estuary

    EPA Science Inventory

    Interactions of physical, chemical, and biological processes in the coastal zone can result in a highly variable carbonate chemistry regime. This characteristic variability in coastal areas has garnered renewed interest within the context of ocean acidification, yet the relative...

  4. Variability in Proactive and Reactive Cognitive Control Processes Across the Adult Lifespan

    PubMed Central

    Karayanidis, Frini; Whitson, Lisa Rebecca; Heathcote, Andrew; Michie, Patricia T.

    2011-01-01

    Task-switching paradigms produce a highly consistent age-related increase in mixing cost [longer response time (RT) on repeat trials in mixed-task than single-task blocks] but a less consistent age effect on switch cost (longer RT on switch than repeat trials in mixed-task blocks). We use two approaches to examine the adult lifespan trajectory of control processes contributing to mixing cost and switch cost: latent variables derived from an evidence accumulation model of choice, and event-related potentials (ERP) that temporally differentiate proactive (cue-driven) and reactive (target-driven) control processes. Under highly practiced and prepared task conditions, aging was associated with increasing RT mixing cost but reducing RT switch cost. Both effects were largely due to the same cause: an age effect for mixed-repeat trials. In terms of latent variables, increasing age was associated with slower non-decision processes, slower rate of evidence accumulation about the target, and higher response criterion. Age effects on mixing costs were evident only on response criterion, the amount of evidence required to trigger a decision, whereas age effects on switch cost were present for all three latent variables. ERPs showed age-related increases in preparation for mixed-repeat trials, anticipatory attention, and post-target interference. Cue-locked ERPs that are linked to proactive control were associated with early emergence of age differences in response criterion. These results are consistent with age effects on strategic processes controlling decision caution. Consistent with an age-related decline in cognitive flexibility, younger adults flexibly adjusted response criterion from trial-to-trial on mixed-task blocks, whereas older adults maintained a high criterion for all trials. PMID:22073037
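
    The latent variables mentioned above (non-decision time, rate of evidence accumulation, response criterion) come from an evidence-accumulation model of choice RT. The sketch below simulates a generic drift-diffusion-style accumulator to show how raising the criterion lengthens RT without changing the accumulation rate; it is a hedged illustration, not the specific model fitted in the study:

    ```python
    import numpy as np

    def simulate_rt(drift, criterion, non_decision=0.3, dt=0.001, noise=1.0, rng=None):
        """Single-trial RT from a one-boundary accumulator (seconds)."""
        rng = rng or np.random.default_rng()
        evidence, t = 0.0, 0.0
        while evidence < criterion:
            evidence += drift*dt + noise*np.sqrt(dt)*rng.normal()
            evidence = max(evidence, 0.0)          # keep the accumulator non-negative
            t += dt
        return non_decision + t

    rng = np.random.default_rng(11)
    for criterion in (0.8, 1.4):                   # low vs. high response caution
        rts = [simulate_rt(drift=2.0, criterion=criterion, rng=rng) for _ in range(500)]
        print(f"criterion {criterion}: mean RT = {np.mean(rts)*1000:.0f} ms")
    # A higher criterion (as inferred for older adults on mixed-task blocks)
    # produces slower responses even with an unchanged accumulation rate.
    ```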

  5. Nonlinear intrinsic variables and state reconstruction in multiscale simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dsilva, Carmeline J., E-mail: cdsilva@princeton.edu; Talmon, Ronen, E-mail: ronen.talmon@yale.edu; Coifman, Ronald R., E-mail: coifman@math.yale.edu

    2013-11-14

    Finding informative low-dimensional descriptions of high-dimensional simulation data (like the ones arising in molecular dynamics or kinetic Monte Carlo simulations of physical and chemical processes) is crucial to understanding physical phenomena, and can also dramatically assist in accelerating the simulations themselves. In this paper, we discuss and illustrate the use of nonlinear intrinsic variables (NIV) in the mining of high-dimensional multiscale simulation data. In particular, we focus on the way NIV allows us to functionally merge different simulation ensembles, and different partial observations of these ensembles, as well as to infer variables not explicitly measured. The approach relies on certain simple features of the underlying process variability to filter out measurement noise and systematically recover a unique reference coordinate frame. We illustrate the approach through two distinct sets of atomistic simulations: a stochastic simulation of an enzyme reaction network exhibiting both fast and slow time scales, and a molecular dynamics simulation of alanine dipeptide in explicit water.

  6. Nonlinear intrinsic variables and state reconstruction in multiscale simulations

    NASA Astrophysics Data System (ADS)

    Dsilva, Carmeline J.; Talmon, Ronen; Rabin, Neta; Coifman, Ronald R.; Kevrekidis, Ioannis G.

    2013-11-01

    Finding informative low-dimensional descriptions of high-dimensional simulation data (like the ones arising in molecular dynamics or kinetic Monte Carlo simulations of physical and chemical processes) is crucial to understanding physical phenomena, and can also dramatically assist in accelerating the simulations themselves. In this paper, we discuss and illustrate the use of nonlinear intrinsic variables (NIV) in the mining of high-dimensional multiscale simulation data. In particular, we focus on the way NIV allows us to functionally merge different simulation ensembles, and different partial observations of these ensembles, as well as to infer variables not explicitly measured. The approach relies on certain simple features of the underlying process variability to filter out measurement noise and systematically recover a unique reference coordinate frame. We illustrate the approach through two distinct sets of atomistic simulations: a stochastic simulation of an enzyme reaction network exhibiting both fast and slow time scales, and a molecular dynamics simulation of alanine dipeptide in explicit water.
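
    The NIV construction itself is not spelled out in these records; as a hedged stand-in, the sketch below implements a plain diffusion-map embedding (a related kernel-eigenvector technique) to show how a low-dimensional description can be recovered from high-dimensional, noisy observations of an intrinsically one-dimensional process. The synthetic data and bandwidth heuristic are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 400
    t = np.sort(rng.uniform(0.0, 1.0, n))               # hidden one-dimensional variable

    # High-dimensional, noisy observations of the one-dimensional process.
    curve = np.column_stack([t, t**2])
    X = np.hstack([curve, rng.normal(0, 0.05, (n, 8))])

    # Diffusion-map embedding: Gaussian kernel, row-normalized, leading eigenvectors.
    d2 = ((X[:, None, :] - X[None, :, :])**2).sum(-1)   # pairwise squared distances
    eps = np.median(d2)                                 # simple bandwidth heuristic
    P = np.exp(-d2/eps)
    P /= P.sum(axis=1, keepdims=True)                   # Markov transition matrix
    vals, vecs = np.linalg.eig(P)
    psi1 = vecs[:, np.argsort(-vals.real)[1]].real      # first non-trivial coordinate

    # The leading diffusion coordinate should vary monotonically with the hidden variable.
    rank_corr = np.corrcoef(np.argsort(np.argsort(psi1)), np.arange(n))[0, 1]
    print(f"rank correlation with hidden variable: {abs(rank_corr):.2f}")
    ```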

  7. Climate variability and extremes, interacting with nitrogen storage, amplify eutrophication risk

    USGS Publications Warehouse

    Lee, Minjin; Shevliakova, Elena; Malyshev, Sergey; Milly, P.C.D.; Jaffe, Peter R.

    2016-01-01

    Despite 30 years of basin-wide nutrient-reduction efforts, severe hypoxia continues to be observed in the Chesapeake Bay. Here we demonstrate the critical influence of climate variability, interacting with nitrogen (N) accumulated over multiple decades, on Susquehanna River dissolved nitrogen (DN) loads, known precursors of hypoxia in the Bay. We used the process model LM3-TAN (Terrestrial and Aquatic Nitrogen), which is capable of capturing both seasonal and decadal-to-century changes in vegetation-soil-river N storage, and produced nine scenarios of DN-load distributions under different short-term scenarios of climate variability and extremes. We illustrate that after 1- to 3-year-long dry spells, the likelihood of exceeding a threshold DN load (56 kt yr−1) increases by 40 to 65% due to flushing of N accumulated throughout the dry spells and altered microbial processes. Our analyses suggest that possible future increases in climate variability/extremes, specifically high precipitation occurring after multiyear dry spells, could likely lead to high DN-load anomalies and hypoxia.
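
    A minimal sketch of the exceedance calculation implied above: given scenario ensembles of annual dissolved-nitrogen loads, the likelihood of exceeding the 56 kt yr−1 threshold is simply the fraction of ensemble members above it. The two log-normal ensembles are illustrative placeholders, not LM3-TAN output:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    threshold = 56.0                                    # kt N per year

    # Stand-in DN-load ensembles (kt/yr) for a baseline year and a wet year
    # that follows a multiyear dry spell (flushing of accumulated nitrogen).
    baseline = rng.lognormal(mean=np.log(45), sigma=0.25, size=10_000)
    post_dry = rng.lognormal(mean=np.log(58), sigma=0.25, size=10_000)

    p_base = (baseline > threshold).mean()
    p_post = (post_dry > threshold).mean()
    print(f"P(load > {threshold} kt/yr): baseline {p_base:.2f}, after dry spell {p_post:.2f}")
    print(f"increase in exceedance likelihood: {100*(p_post - p_base):.0f} percentage points")
    ```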

  8. Analysis of a municipal wastewater treatment plant using a neural network-based pattern analysis

    USGS Publications Warehouse

    Hong, Y.-S.T.; Rosen, Michael R.; Bhamidimarri, R.

    2003-01-01

    This paper addresses the problem of how to capture the complex relationships that exist between process variables and to diagnose the dynamic behaviour of a municipal wastewater treatment plant (WTP). Due to the complex biological reaction mechanisms and the highly time-varying, multivariable nature of a real WTP, diagnosis of the WTP is still difficult in practice. The application of intelligent techniques, which can analyse multi-dimensional process data using sophisticated visualisation techniques, can be useful for analysing and diagnosing the activated-sludge WTP. In this paper, the Kohonen Self-Organising Feature Map (KSOFM) neural network is applied to analyse the multi-dimensional process data and to diagnose the inter-relationships of the process variables in a real activated-sludge WTP. By using component planes, some detailed local relationships between the process variables, e.g., responses of the process variables under different operating conditions, as well as global information, are discovered. The operating conditions and the inter-relationships among the process variables in the WTP have been diagnosed and extracted from the information obtained by clustering analysis of the maps. It is concluded that the KSOFM technique provides an effective analysis and diagnosis tool for understanding system behaviour and extracting knowledge contained in multi-dimensional data from a large-scale WTP. © 2003 Elsevier Science Ltd. All rights reserved.
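
    A hedged, minimal sketch of a Kohonen self-organising feature map of the kind applied above, written directly in NumPy; the synthetic three-variable "process data", map size, and training schedule are illustrative choices, not the plant dataset or the authors' configuration:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Stand-in multi-dimensional process data (e.g. influent load, DO, MLSS), standardized.
    data = rng.normal(size=(500, 3))

    # 8x8 map of weight vectors ("codebooks"), trained by the classic SOM update rule.
    rows, cols, dim = 8, 8, data.shape[1]
    weights = rng.normal(size=(rows, cols, dim))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

    n_iter = 5000
    for i in range(n_iter):
        x = data[rng.integers(len(data))]
        # Best-matching unit: node whose weight vector is closest to the sample.
        bmu = np.unravel_index(np.argmin(((weights - x)**2).sum(-1)), (rows, cols))
        # Learning rate and neighbourhood radius both decay over training.
        lr = 0.5 * (1 - i/n_iter)
        sigma = 3.0 * (1 - i/n_iter) + 0.5
        dist2 = ((grid - np.array(bmu))**2).sum(-1)
        h = np.exp(-dist2 / (2*sigma**2))[..., None]     # neighbourhood function
        weights += lr * h * (x - weights)

    # Component planes (one per input variable) are what the paper inspects to read
    # off local relationships between process variables across the map.
    component_planes = [weights[:, :, k] for k in range(dim)]
    print("component plane 0 (first variable):\n", component_planes[0].round(2))
    ```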

  9. [Imputing missing data in public health: general concepts and application to dichotomous variables].

    PubMed

    Hernández, Gilma; Moriña, David; Navarro, Albert

    The presence of missing data in collected variables is common in health surveys, but the subsequent imputation thereof at the time of analysis is not. Working with imputed data may have certain benefits regarding the precision of the estimators and the unbiased identification of associations between variables. The imputation process is probably still little understood by many non-statisticians, who view this process as highly complex and with an uncertain goal. To clarify these questions, this note aims to provide a straightforward, non-exhaustive overview of the imputation process to enable public health researchers to ascertain its strengths. All this is done in the context of dichotomous variables, which are commonplace in public health. To illustrate these concepts, an example in which missing data are handled by means of simple and multiple imputation is introduced. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
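
    A hedged sketch of simple versus multiple imputation for a dichotomous variable, in the spirit of the overview above. The toy survey data, the logistic imputation model, and the choice of M = 20 imputations are illustrative assumptions; they do not reproduce the example in the article:

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(8)
    n = 1000
    age = rng.normal(45, 12, n)
    smoker = (rng.random(n) < 1/(1 + np.exp(-(age - 45)/10))).astype(int)   # dichotomous outcome
    df = pd.DataFrame({"age": age, "smoker": smoker.astype(float)})
    df.loc[rng.random(n) < 0.3, "smoker"] = np.nan                          # 30% missing

    obs = df.dropna()
    mis = df[df.smoker.isna()]

    # Simple imputation: fill every missing value with the observed mode (one crude value).
    mode_fill = df.smoker.fillna(obs.smoker.mode()[0])

    # Multiple imputation (sketch): draw M completed datasets from a fitted model,
    # analyse each, then pool the estimates.
    model = LogisticRegression().fit(obs[["age"]], obs.smoker)
    p_mis = model.predict_proba(mis[["age"]])[:, 1]
    estimates = []
    for m in range(20):                               # M = 20 imputed datasets
        draws = (rng.random(len(mis)) < p_mis).astype(float)
        completed = pd.concat([obs.smoker, pd.Series(draws, index=mis.index)])
        estimates.append(completed.mean())            # quantity of interest: prevalence

    print(f"true prevalence:              {smoker.mean():.3f}")
    print(f"mode-imputed prevalence:      {mode_fill.mean():.3f}")
    print(f"multiply-imputed prevalence:  {np.mean(estimates):.3f} ± {np.std(estimates):.3f}")
    ```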

  10. Temporal dynamics of estuarine phytoplankton: A case study of San Francisco Bay

    USGS Publications Warehouse

    Cloern, J.E.; Cole, B.E.; Wong, R.L.J.; Alpine, A.E.

    1985-01-01

    Detailed surveys throughout San Francisco Bay over an annual cycle (1980) show that seasonal variations of phytoplankton biomass, community composition, and productivity can differ markedly among estuarine habitat types. For example, in the river-dominated northern reach (Suisun Bay) phytoplankton seasonality is characterized by a prolonged summer bloom of netplanktonic diatoms that results from the accumulation of suspended particulates at the convergence of nontidal currents (i.e. where residence time is long). Here turbidity is persistently high such that phytoplankton growth and productivity are severely limited by light availability, the phytoplankton population turns over slowly, and biological processes appear to be less important mechanisms of temporal change than physical processes associated with freshwater inflow and turbulent mixing. The South Bay, in contrast, is a lagoon-type estuary less directly coupled to the influence of river discharge. Residence time is long (months) in this estuary, turbidity is lower and estimated rates of population growth are high (up to 1-2 doublings d-1), but the rapid production of phytoplankton biomass is presumably balanced by grazing losses to benthic herbivores. Exceptions occur for brief intervals (days to weeks) during spring when the water column stratifies so that algae retained in the surface layer are uncoupled from benthic grazing, and phytoplankton blooms develop. The degree of stratification varies over the neap-spring tidal cycle, so the South Bay represents an estuary where (1) biological processes (growth, grazing) and a physical process (vertical mixing) interact to cause temporal variability of phytoplankton biomass, and (2) temporal variability is highly dynamic because of the short-term variability of tides. Other mechanisms of temporal variability in estuarine phytoplankton include: zooplankton grazing, exchanges of microalgae between the sediment and water column, and horizontal dispersion which transports phytoplankton from regions of high productivity (shallows) to regions of low productivity (deep channels). Multi-year records of phytoplankton biomass show that large deviations from the typical annual cycles observed in 1980 can occur, and that interannual variability is driven by variability of annual precipitation and river discharge. Here, too, the nature of this variability differs among estuary types. Blooms occur only in the northern reach when river discharge falls within a narrow range, and the summer biomass increase was absent during years of extreme drought (1977) or years of exceptionally high discharge (1982). In South Bay, however, there is a direct relationship between phytoplankton biomass and river discharge. As discharge increases so does the buoyancy input required for density stratification, and wet years are characterized by persistent and intense spring blooms. © 1985 Dr W. Junk Publishers.

  11. An ecological alternative to Snodgrass & Vanderwart: 360 high quality colour images with norms for seven psycholinguistic variables.

    PubMed

    Moreno-Martínez, Francisco Javier; Montoro, Pedro R

    2012-01-01

    This work presents a new set of 360 high-quality colour images belonging to 23 semantic subcategories. Two hundred and thirty-six Spanish speakers named the items and also provided data on seven relevant psycholinguistic variables: age of acquisition, familiarity, manipulability, name agreement, typicality and visual complexity. Furthermore, we also present lexical frequency data derived from Internet search hits. Apart from the high number of variables evaluated, which are known to affect the processing of stimuli, this new set presents important advantages over other similar image corpora: (a) this corpus offers a broad number of subcategories and images; for example, this will permit researchers to select stimuli of appropriate difficulty as required (e.g., to deal with problems derived from ceiling effects); (b) the use of coloured stimuli provides a more realistic, ecologically valid representation of real-life objects. In sum, this set of stimuli provides a useful tool for research on visual object- and word-processing, both in neurological patients and in healthy controls.

  12. Optimized design on condensing tubes high-speed TIG welding technology magnetic control based on genetic algorithm

    NASA Astrophysics Data System (ADS)

    Lu, Lin; Chang, Yunlong; Li, Yingmin; Lu, Ming

    2013-05-01

    An orthogonal experiment, analysed with a multivariate nonlinear regression equation, was conducted to assess the influence of an external transverse magnetic field and Ar flow rate on weld quality when welding condenser pipe by high-speed argon tungsten-arc welding (TIG). Based on genetic algorithm theory, the magnetic induction and Ar flow rate were used as the optimization variables and the tensile strength of the weld was set as the objective function, and an optimal design was then carried out. The optimization variables were constrained according to the requirements of actual production. The genetic algorithm in MATLAB was used for the computation, and the optimum results were compared with the experimental parameters. The results showed that, when there are many optimization variables in high-speed welding, suitable process parameters can be chosen by means of a genetic algorithm, and the optimized welding parameters agreed with the experimental results.
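
    The abstract reports using MATLAB's genetic algorithm; as a hedged stand-in, the sketch below runs a small genetic algorithm in Python over two bounded decision variables (magnetic induction, Ar flow rate) against a placeholder objective. The fitness surface, variable bounds, and GA settings are illustrative assumptions, not the fitted regression model from the study:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    bounds = np.array([[0.0, 20.0],      # magnetic induction (mT), assumed range
                       [5.0, 25.0]])     # Ar flow rate (L/min), assumed range

    def tensile_strength(x):
        """Placeholder objective standing in for the regression model of weld strength."""
        b, q = x[..., 0], x[..., 1]
        return 400 + 8*b - 0.45*b**2 + 6*q - 0.25*q**2 - 0.1*b*q

    pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(60, 2))
    for generation in range(100):
        fitness = tensile_strength(pop)
        # Tournament selection: keep the better of random pairs.
        i, j = rng.integers(len(pop), size=(2, len(pop)))
        parents = np.where((fitness[i] > fitness[j])[:, None], pop[i], pop[j])
        # Uniform crossover and Gaussian mutation, clipped to the variable bounds.
        mask = rng.random(pop.shape) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        children += rng.normal(0, 0.3, pop.shape)
        pop = np.clip(children, bounds[:, 0], bounds[:, 1])

    best = pop[np.argmax(tensile_strength(pop))]
    print(f"best induction {best[0]:.1f} mT, flow {best[1]:.1f} L/min, "
          f"predicted strength {tensile_strength(best):.0f} MPa")
    ```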

  13. The Development of a High-Throughput/Combinatorial Workflow for the Study of Porous Polymer Networks

    DTIC Science & Technology

    2012-04-05

    ... in polymer network cross-link density, poragen composition, poragen level, and cure temperature. A total of 216 unique compositions were prepared. Changes in opacity of the blends as they cured ... allowed for the identification of compositional variables and process variables that enabled the production of porous networks. Keywords: high...

  14. Data Processing Aspects of MEDLARS

    PubMed Central

    Austin, Charles J.

    1964-01-01

    The speed and volume requirements of MEDLARS necessitate the use of high-speed data processing equipment, including paper-tape typewriters, a digital computer, and a special device for producing photo-composed output. Input to the system is of three types: variable source data, including citations from the literature and search requests; changes to such master files as the medical subject headings list and the journal record file; and operating instructions such as computer programs and procedures for machine operators. MEDLARS builds two major stores of data on magnetic tape. The Processed Citation File includes bibliographic citations in expanded form for high-quality printing at periodic intervals. The Compressed Citation File is a coded, time-sequential citation store which is used for high-speed searching against demand request input. Major design considerations include converting variable-length, alphanumeric data to mechanical form quickly and accurately; serial searching by the computer within a reasonable period of time; high-speed printing that must be of graphic quality; and efficient maintenance of various complex computer files. PMID:14119287

  15. DATA PROCESSING ASPECTS OF MEDLARS.

    PubMed

    AUSTIN, C J

    1964-01-01

    The speed and volume requirements of MEDLARS necessitate the use of high-speed data processing equipment, including paper-tape typewriters, a digital computer, and a special device for producing photo-composed output. Input to the system is of three types: variable source data, including citations from the literature and search requests; changes to such master files as the medical subject headings list and the journal record file; and operating instructions such as computer programs and procedures for machine operators. MEDLARS builds two major stores of data on magnetic tape. The Processed Citation File includes bibliographic citations in expanded form for high-quality printing at periodic intervals. The Compressed Citation File is a coded, time-sequential citation store which is used for high-speed searching against demand request input. Major design considerations include converting variable-length, alphanumeric data to mechanical form quickly and accurately; serial searching by the computer within a reasonable period of time; high-speed printing that must be of graphic quality; and efficient maintenance of various complex computer files.

  16. Assessment of co-composting process with high load of an inorganic industrial waste.

    PubMed

    Soares, Micaela A R; Quina, Margarida J; Reis, Marco S; Quinta-Ferreira, Rosa

    2017-01-01

    This study investigates the co-composting of an inorganic industrial waste (eggshell, ES) at very high incorporation levels (up to 60% w/w). Since composting is a process in which solid, liquid and gaseous phases interact in a very complex way, statistical tools are needed to unravel the main relationships structuring the variability associated with the process; in this study, PCA and data visualisation were used for that purpose. The co-composting tests were designed with increasing quantities of ES (0, 10, 20, 30 and 60% ES w/w) mixed with industrial potato peel and rice husks. Principal component analysis showed that physical properties such as free air space, bulk density and moisture are the most relevant variables for explaining the variability due to ES content. On the other hand, variability in the time dynamics is mostly driven by chemical and phytotoxicological parameters, such as organic matter decay and nitrate content. Higher ES incorporation (60% ES) enhanced the initial biological activity of the mixture, but the higher bulk density and lower water holding capacity had a negative effect on the aerobic biological activity as the process evolved. Nevertheless, pathogen-killing temperatures (>70 °C for 11 h) were attained. All the final products obtained after 90 days were stable and non-phytotoxic. This work proved that valorisation of high amounts of eggshell by co-composting is feasible, but prone to be influenced by the physical properties of the mixtures. Copyright © 2016 Elsevier Ltd. All rights reserved.
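    A minimal sketch of the PCA step, assuming a small table of mixture descriptors; the variable names follow the abstract, but the values are invented stand-ins for the composting measurements.

```python
# Sketch of the PCA step: standardize the mixture descriptors and inspect
# which variables load on the leading components. The data frame is an
# invented stand-in for the composting measurements.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = pd.DataFrame({
    "free_air_space": rng.normal(60, 8, 25),       # %
    "bulk_density":   rng.normal(0.45, 0.07, 25),  # g/cm3
    "moisture":       rng.normal(55, 6, 25),       # %
    "organic_matter": rng.normal(70, 10, 25),      # % dry basis
    "nitrate":        rng.normal(120, 30, 25),     # mg/kg
})

pca = PCA(n_components=2).fit(StandardScaler().fit_transform(X))
loadings = pd.DataFrame(pca.components_.T, index=X.columns,
                        columns=["PC1", "PC2"])
print(loadings.round(2))                        # which variables drive each PC
print(pca.explained_variance_ratio_.round(2))   # variance explained
```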

  17. USING CMAQ FOR EXPOSURE MODELING AND CHARACTERIZING THE SUB-GRID VARIABILITY FOR EXPOSURE ESTIMATES

    EPA Science Inventory

    Atmospheric processes and the associated transport and dispersion of atmospheric pollutants are known to be highly variable in time and space. Current air quality models that characterize atmospheric chemistry effects, e.g. the Community Multi-scale Air Quality (CMAQ), provide vo...

  18. Monitoring Dissolved Oxygen in New Jersey Coastal Waters Using Autonomous Gliders

    EPA Science Inventory

    The coastal ocean is a highly variable system with processes that have significant implications on the hydrographic and oxygen characteristics of the water column. The spatial and temporal variability of these fields can cause dramatic changes to water quality and in turn the h...

  19. The role of context in preschool learning: a multilevel examination of the contribution of context-specific problem behaviors and classroom process quality to low-income children's approaches to learning.

    PubMed

    Domínguez, Ximena; Vitiello, Virginia E; Fuccillo, Janna M; Greenfield, Daryl B; Bulotsky-Shearer, Rebecca J

    2011-04-01

    Research suggests that promoting adaptive approaches to learning early in childhood may help close the gap between advantaged and disadvantaged children. Recent research has identified specific child-level and classroom-level variables that are significantly associated with preschoolers' approaches to learning. However, further research is needed to understand the interactive effects of these variables and determine whether classroom-level variables buffer the detrimental effects of child-level risk variables. Using a largely urban and minority sample (N=275) of preschool children, the present study examined the additive and interactive effects of children's context-specific problem behaviors and classroom process quality dimensions on children's approaches to learning. Teachers rated children's problem behavior and approaches to learning and independent assessors conducted classroom observations to assess process quality. Problem behaviors in structured learning situations and in peer and teacher interactions were found to negatively predict variance in approaches to learning. Classroom process quality domains did not independently predict variance in approaches to learning. Nonetheless, classroom process quality played an important role in these associations; high emotional support buffered the detrimental effects of problem behavior, whereas high instructional support exacerbated them. The findings of this study have important implications for classroom practices aimed at helping children who exhibit problem behaviors. Copyright © 2010 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  20. Ultra-high-speed variable focus optics for novel applications in advanced imaging

    NASA Astrophysics Data System (ADS)

    Kang, S.; Dotsenko, E.; Amrhein, D.; Theriault, C.; Arnold, C. B.

    2018-02-01

    With the advancement of ultra-fast manufacturing technologies, high speed imaging with high 3D resolution has become increasingly important. Here we show the use of an ultra-high-speed variable focus optical element, the TAG Lens, to enable new ways to acquire 3D information from an object. The TAG Lens uses sound to adjust the index of refraction profile in a liquid and thereby can achieve focal scanning rates greater than 100 kHz. When combined with a high-speed pulsed LED and a high-speed camera, we can exploit this phenomenon to achieve high-resolution imaging through large depths. By combining the image acquisition with digital image processing, we can extract relevant parameters such as tilt and angle information from objects in the image. Due to the high speeds at which images can be collected and processed, we believe this technique can be used as an efficient method of industrial inspection and metrology for high throughput applications.

  1. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects on the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes, but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, the quality results of the finished parts under different combinations of process variables are evaluated. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust the sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
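    A minimal sketch of the statistical screening described, assuming simulated overall vibration readings for three hypothetical process setups: check normality and homoscedasticity first, then apply one-way ANOVA or fall back to a non-parametric Kruskal-Wallis comparison.

```python
# Sketch of the screening described: check normality and homoscedasticity of
# the overall vibration readings per process setup, then apply one-way ANOVA
# or fall back to Kruskal-Wallis. The readings are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
setups = {                                 # overall vibration displacement, um
    "setup_A": rng.normal(3.0, 0.4, 30),
    "setup_B": rng.normal(3.4, 0.4, 30),
    "setup_C": rng.normal(3.9, 0.5, 30),
}

normal = all(stats.shapiro(x).pvalue > 0.05 for x in setups.values())
homoscedastic = stats.bartlett(*setups.values()).pvalue > 0.05

if normal and homoscedastic:
    stat, p = stats.f_oneway(*setups.values())   # parametric comparison
    test = "ANOVA"
else:
    stat, p = stats.kruskal(*setups.values())    # non-parametric fallback
    test = "Kruskal-Wallis"
print(f"{test}: statistic = {stat:.2f}, p = {p:.4f}")
```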

  2. Batch process fault detection and identification based on discriminant global preserving kernel slow feature analysis.

    PubMed

    Zhang, Hanyuan; Tian, Xuemin; Deng, Xiaogang; Cao, Yuping

    2018-05-16

    As an attractive nonlinear dynamic data analysis tool, global preserving kernel slow feature analysis (GKSFA) has achieved great success in extracting the high nonlinearity and inherently time-varying dynamics of batch process. However, GKSFA is an unsupervised feature extraction method and lacks the ability to utilize batch process class label information, which may not offer the most effective means for dealing with batch process monitoring. To overcome this problem, we propose a novel batch process monitoring method based on the modified GKSFA, referred to as discriminant global preserving kernel slow feature analysis (DGKSFA), by closely integrating discriminant analysis and GKSFA. The proposed DGKSFA method can extract discriminant feature of batch process as well as preserve global and local geometrical structure information of observed data. For the purpose of fault detection, a monitoring statistic is constructed based on the distance between the optimal kernel feature vectors of test data and normal data. To tackle the challenging issue of nonlinear fault variable identification, a new nonlinear contribution plot method is also developed to help identifying the fault variable after a fault is detected, which is derived from the idea of variable pseudo-sample trajectory projection in DGKSFA nonlinear biplot. Simulation results conducted on a numerical nonlinear dynamic system and the benchmark fed-batch penicillin fermentation process demonstrate that the proposed process monitoring and fault diagnosis approach can effectively detect fault and distinguish fault variables from normal variables. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
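    The full DGKSFA method (kernelized, discriminant and globality-preserving) is more than a short snippet can carry; as background, the sketch below implements plain linear slow feature analysis, the building block that GKSFA extends: whiten the data, then keep the directions whose time derivatives vary least. The toy batch-process signal is invented.

```python
# Plain linear slow feature analysis (SFA), the building block behind
# GKSFA/DGKSFA; the kernelized, discriminant version of the paper is not
# reproduced here. X is a (time x variables) array of process measurements.
import numpy as np

def linear_sfa(X, n_features=1):
    X = X - X.mean(axis=0)
    # Whiten the data via an eigen-decomposition of its covariance.
    evals, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
    keep = evals > 1e-10
    W = evecs[:, keep] / np.sqrt(evals[keep])
    Z = X @ W
    # Slow features are the whitened directions whose time derivatives
    # have the smallest variance (eigh returns ascending eigenvalues).
    d_evals, d_evecs = np.linalg.eigh(np.cov(np.diff(Z, axis=0), rowvar=False))
    P = d_evecs[:, :n_features]
    return Z @ P, W @ P          # slow features and the projection matrix

# Toy batch-process data: a slow trend buried among faster, noisier variables.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500)
slow = np.sin(2 * np.pi * t)
X = np.column_stack([slow + 0.1 * rng.normal(size=t.size),
                     rng.normal(size=t.size),
                     0.5 * slow + rng.normal(size=t.size)])
features, projection = linear_sfa(X, n_features=1)
print(features.shape, projection.shape)   # (500, 1) (3, 1)
```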

  3. Functional variability of habitats within the Sacramento-San Joaquin Delta: Restoration implications

    USGS Publications Warehouse

    Lucas, L.V.; Cloern, J.E.; Thompson, J.K.; Monsen, N.E.

    2002-01-01

    We have now entered an era of large-scale attempts to restore ecological functions and biological communities in impaired ecosystems. Our knowledge base of complex ecosystems and interrelated functions is limited, so the outcomes of specific restoration actions are highly uncertain. One approach for exploring that uncertainty and anticipating the range of possible restoration outcomes is comparative study of existing habitats similar to future habitats slated for construction. Here we compare two examples of one habitat type targeted for restoration in the Sacramento-San Joaquin River Delta. We compare one critical ecological function provided by these shallow tidal habitats - production and distribution of phytoplankton biomass as the food supply to pelagic consumers. We measured spatial and short-term temporal variability of phytoplankton biomass and growth rate and quantified the hydrodynamic and biological processes governing that variability. Results show that the production and distribution of phytoplankton biomass can be highly variable within and between nearby habitats of the same type, due to variations in phytoplankton sources, sinks, and transport. Therefore, superficially similar, geographically proximate habitats can function very differently, and that functional variability introduces large uncertainties into the restoration process. Comparative study of existing habitats is one way ecosystem science can elucidate and potentially minimize restoration uncertainties, by identifying processes shaping habitat functionality, including those that can be controlled in the restoration design.

  4. A new method for defining and managing process alarms and for correcting process operation when an alarm occurs.

    PubMed

    Brooks, Robin; Thorpe, Richard; Wilson, John

    2004-11-11

    A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms, and hence in alarm annunciation rates, in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework, so that operations improvement and hence economic benefits are obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral to the method is a new-format Process Operator Display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for the investments about to be made, or already made, in process historian systems. Field trials of the new geometric process control (GPC) method have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, to improve the quality of both process operations and products by providing process alarms and alerts of much higher quality than ever before. The paper describes the methods used, including a simple visual method for alarm rationalisation that quickly delivers large sets of consistent alarm limits, and the extension to full alert management, with highlights from the field trials to indicate the overall effectiveness of the method in practice.
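    The paper's equation-free Best Operating Zone construction is not described here in enough detail to reproduce, so the sketch below uses a plain Mahalanobis-distance envelope learned from historical data as a stand-in way to flag multi-variable excursions; the historical distribution and the test points are invented.

```python
# A simple multi-variable envelope learned from historical data: a
# Mahalanobis-distance limit flags operating points that fall outside the
# historically observed region. This is only a stand-in illustration, not the
# GPC "Best Operating Zone" construction; the data and limit are invented.
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical history of two interacting process variables.
history = rng.multivariate_normal([50.0, 1.2], [[4.0, 0.3], [0.3, 0.09]], 2000)

mean = history.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(history, rowvar=False))
d_hist = np.einsum("ij,jk,ik->i", history - mean, cov_inv, history - mean)
limit = np.quantile(d_hist, 0.999)       # alarm limit from the history itself

def alarm(sample):
    """True if the operating point lies outside the historical envelope."""
    d = (sample - mean) @ cov_inv @ (sample - mean)
    return d > limit

print(alarm(np.array([51.0, 1.25])))   # inside the envelope -> False
print(alarm(np.array([60.0, 0.60])))   # well outside        -> True
```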

  5. Does ecosystem variability explain phytoplankton diversity? Solving an ecological puzzle with long-term data sets

    NASA Astrophysics Data System (ADS)

    Sarker, Subrata; Lemke, Peter; Wiltshire, Karen H.

    2018-05-01

    Explaining species diversity as a function of ecosystem variability is a long-standing discussion in community-ecology research. Here, we aimed to establish a causal relationship between ecosystem variability and phytoplankton diversity in a shallow-sea ecosystem. We used long-term data on biotic and abiotic factors from Helgoland Roads, along with climate data, to assess the effect of ecosystem variability on phytoplankton diversity. A point cumulative semi-variogram method was used to estimate long-term ecosystem variability. A Markov chain model was used to estimate the dynamical behaviour of species, i.e. the probabilities of occurrence, absence and being outcompeted. We identified the 1980s as a period of high ecosystem variability, while the last two decades were comparatively less variable. Ecosystem variability was found to be an important predictor of phytoplankton diversity at Helgoland Roads. High diversity was related to low ecosystem variability, reflecting a non-significant relationship between the probability of a species' occurrence and its absence, a significant negative relationship between the probability of a species' occurrence and the probability of its being outcompeted by others, and high species occurrence at low ecosystem variability. Using an exceptional marine long-term data set, this study established a causal relationship between ecosystem variability and phytoplankton diversity.
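    A minimal sketch of the Markov-chain idea, assuming a yearly presence/absence record for a single species: transition counts give the probabilities of remaining present (occurrence) and of dropping out (absence). The series is invented, and the outcompetition probability, which needs multi-species data, is not modelled.

```python
# Transition probabilities for one species estimated from a yearly
# presence/absence series (1 = occurs). The series is an invented placeholder.
import numpy as np

presence = np.array([1, 1, 0, 1, 1, 1, 0, 0, 1, 1, 1, 1, 0, 1])

counts = np.zeros((2, 2))
for a, b in zip(presence[:-1], presence[1:]):
    counts[a, b] += 1                      # transition a -> b

transition = counts / counts.sum(axis=1, keepdims=True)
print(transition.round(2))                 # rows: from-state, columns: to-state
print(f"P(remains present) = {transition[1, 1]:.2f}, "
      f"P(present -> absent) = {transition[1, 0]:.2f}")
```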

  6. Ecosystem functioning is enveloped by hydrometeorological variability.

    PubMed

    Pappas, Christoforos; Mahecha, Miguel D; Frank, David C; Babst, Flurin; Koutsoyiannis, Demetris

    2017-09-01

    Terrestrial ecosystem processes, and the associated vegetation carbon dynamics, respond differently to hydrometeorological variability across timescales, and so does our scientific understanding of the underlying mechanisms. Long-term variability of the terrestrial carbon cycle is not yet well constrained and the resulting climate-biosphere feedbacks are highly uncertain. Here we present a comprehensive overview of hydrometeorological and ecosystem variability from hourly to decadal timescales integrating multiple in situ and remote-sensing datasets characterizing extra-tropical forest sites. We find that ecosystem variability at all sites is confined within a hydrometeorological envelope across sites and timescales. Furthermore, ecosystem variability demonstrates long-term persistence, highlighting ecological memory and slow ecosystem recovery rates after disturbances. However, simulation results with state-of-the-art process-based models do not reflect this long-term persistent behaviour in ecosystem functioning. Accordingly, we develop a cross-time-scale stochastic framework that captures hydrometeorological and ecosystem variability. Our analysis offers a perspective for terrestrial ecosystem modelling and paves the way for new model-data integration opportunities in Earth system sciences.

  7. Identification of Important Process Variables for Fiber Spinning of Protein Nanotubes Generated from Waste Materials

    DTIC Science & Technology

    2012-01-11

    nanotubes, which, if sold at the same current cost as carbon nanotubes, would equate to a $788 million industry. In the USA, the potential to source eye ... advantages over carbon nanotubes due to the ability to functionalize them. The nanotubes are a highly ordered, insoluble form of protein. Fibrils ... Identification of important process variables for fiber spinning of protein nanotubes generated from waste materials.

  8. Compositional variability of nutrients and phytochemicals in corn after processing.

    PubMed

    Prasanthi, P S; Naveena, N; Vishnuvardhana Rao, M; Bhaskarachary, K

    2017-04-01

    The effects of various processing strategies on the nutrient and phytochemical composition of corn samples were studied. Fresh and cooked baby corn, sweet corn and dent corn, as well as industrially processed and cooked popcorn, corn grits, corn flour and corn flakes, were analysed to determine their proximate composition and their mineral, xanthophyll and phenolic acid contents. The study revealed that the proximate composition of popcorn is high compared to the other corn products analysed, while the mineral composition of these maize products showed higher concentrations of magnesium, phosphorus and potassium and lower concentrations of calcium, manganese, zinc, iron, copper and sodium. Popcorn was high in iron, zinc, copper, manganese, sodium, magnesium and phosphorus. The xanthophylls lutein and zeaxanthin were predominant in dent corn, and the total polyphenolic content was highest in dent corn, while the distribution of phenolic acids was variable across the different corn products. This study showed that preparation and processing brought about a significant reduction in xanthophylls and polyphenols.

  9. Habitat connectivity and in-stream vegetation control temporal variability of benthic invertebrate communities.

    PubMed

    Huttunen, K-L; Mykrä, H; Oksanen, J; Astorga, A; Paavola, R; Muotka, T

    2017-05-03

    One of the key challenges to understanding patterns of β diversity is to disentangle deterministic patterns from stochastic ones. Stochastic processes may mask the influence of deterministic factors on community dynamics, hindering identification of the mechanisms causing variation in community composition. We studied temporal β diversity (among-year dissimilarity) of macroinvertebrate communities in near-pristine boreal streams across 14 years. To assess whether the observed β diversity deviates from that expected by chance, and to identify processes (deterministic vs. stochastic) through which different explanatory factors affect community variability, we used a null model approach. We observed that at the majority of sites temporal β diversity was low indicating high community stability. When stochastic variation was unaccounted for, connectivity was the only variable explaining temporal β diversity, with weakly connected sites exhibiting higher community variability through time. After accounting for stochastic effects, connectivity lost importance, suggesting that it was related to temporal β diversity via random colonization processes. Instead, β diversity was best explained by in-stream vegetation, community variability decreasing with increasing bryophyte cover. These results highlight the potential of stochastic factors to dampen the influence of deterministic processes, affecting our ability to understand and predict changes in biological communities through time.

  10. Influence of estuarine processes on spatiotemporal variation in bioavailable selenium

    USGS Publications Warehouse

    Stewart, Robin; Luoma, Samuel N.; Elrick, Kent A.; Carter, James L.; van der Wegen, Mick

    2013-01-01

    Dynamic processes (physical, chemical and biological) challenge our ability to quantify and manage the ecological risk of chemical contaminants in estuarine environments. Selenium (Se) bioavailability (defined by bioaccumulation), stable isotopes and molar carbon-to-nitrogen ratios in the benthic clam Potamocorbula amurensis, an important food source for predators, were determined monthly for 17 yr in northern San Francisco Bay. Se concentrations in the clams ranged from a low of 2 to a high of 22 μg g-1 over space and time. Little of that variability was stochastic, however. Statistical analyses and preliminary hydrodynamic modeling showed that a constant mid-estuarine input of Se, which was dispersed up- and down-estuary by tidal currents, explained the general spatial patterns in accumulated Se among stations. Regression of Se bioavailability against river inflows suggested that processes driven by inflows were the primary driver of seasonal variability. River inflow also appeared to explain interannual variability but within the range of Se enrichment established at each station by source inputs. Evaluation of risks from Se contamination in estuaries requires the consideration of spatial and temporal variability on multiple scales and of the processes that drive that variability.

  11. Soil temperature variability in complex terrain measured using fiber-optic distributed temperature sensing

    USDA-ARS?s Scientific Manuscript database

    Soil temperature (Ts) exerts critical controls on hydrologic and biogeochemical processes but magnitude and nature of Ts variability in a landscape setting are rarely documented. Fiber optic distributed temperature sensing systems (FO-DTS) potentially measure Ts at high density over a large extent. ...

  12. Sedimentation in the chaparral: how do you handle unusual events?

    Treesearch

    Raymond M. Rice

    1982-01-01

    Processes of erosion and sedimentation in steep chaparral drainage basins of southern California are described. The word "hyperschedastic" is coined to describe the sedimentation regime, which is highly variable because of the interaction of marginally stable drainage basins, great variability in storm inputs, and the random occurrence...

  13. Spatial and Temporal Monitoring of Dissolved Oxygen in NJ Coastal Waters using AUVs (Presentation)

    EPA Science Inventory

    The coastal ocean is a highly variable system with processes that have significant implications on the hydrographic and oxygen characteristics of the water column. The spatial and temporal variability of these fields can cause dramatic changes to water quality and in turn the h...

  14. Multiscale variability of soil aggregate stability: implications for rangeland hydrology and erosion

    USDA-ARS?s Scientific Manuscript database

    Conservation of soil and water resources in rangelands is a crucial step in stopping desertification processes. The formation of water-stable soil aggregates reduces soil erodibility and can increase infiltration capacity in many soils. Soil aggregate stability is highly variable at scales ranging f...

  15. Online Processing of Sentences Containing Noun Modification in Young Children with High-Functioning Autism

    ERIC Educational Resources Information Center

    Bavin, Edith L.; Prendergast, Luke A.; Kidd, Evan; Baker, Emma; Dissanayake, Cheryl

    2016-01-01

    Background: There is variability in the language of children with autism, even those who are high functioning. However, little is known about how they process language structures in real time, including how they handle potential ambiguity, and whether they follow referential constraints. Previous research with older autism spectrum disorder (ASD)…

  16. Contextual mediation of perceptions in hauntings and poltergeist-like experiences.

    PubMed

    Lange, R; Houran, J; Harte, T M; Havens, R A

    1996-06-01

    The content of perceived apparitions, e.g., bereavement hallucinations, cannot be explained entirely in terms of electromagnetically induced neurochemical processes. It was shown that contextual variables influential in hallucinatory and hypnotic states also structured reported haunting experiences. As predicted, high congruency was found between the experiential content and the nature of the contextual variables. Further, the number of contextual variables involved in an experience was related to the type of experience and the state of arousal preceding the experience. Based on these findings we argue that a more complete explanation of haunting experiences should take into account both electromagnetically induced neurochemical processes and factors related to contextual mediation.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glosser, D.; Kutchko, B.; Benge, G.

    Foamed cement is a critical component for wellbore stability. The mechanical performance of a foamed cement depends on its microstructure, which in turn depends on the preparation method and attendant operational variables. Determination of cement stability for field use is based on laboratory testing protocols governed by API Recommended Practice 10B-4 (API RP 10B-4, 2015). However, laboratory and field operational variables contrast considerably in terms of scale, as well as in slurry mixing and foaming processes. In this paper, laboratory and field operational processes are characterized within a physics-based framework. It is shown that the “atomization energy” imparted by the high-pressure injection of nitrogen gas into the field-mixed foamed cement slurry is, by a significant margin, the highest-energy process, and has a major impact on the void system in the cement slurry. There is no analog for this high-energy exchange in current laboratory cement preparation and testing protocols. Quantifying the energy exchanges across the laboratory and field processes provides a basis for understanding the relative impacts of these variables on cement structure, and can ultimately lead to the development of practices to improve cement testing and performance.

  18. Ocean acidification in the coastal zone from an organism's perspective: multiple system parameters, frequency domains, and habitats.

    PubMed

    Waldbusser, George G; Salisbury, Joseph E

    2014-01-01

    Multiple natural and anthropogenic processes alter the carbonate chemistry of the coastal zone in ways that either exacerbate or mitigate ocean acidification effects. Freshwater inputs and multiple acid-base reactions change carbonate chemistry conditions, sometimes synergistically. The shallow nature of these systems results in strong benthic-pelagic coupling, and marine invertebrates at different life history stages rely on both benthic and pelagic habitats. Carbonate chemistry in coastal systems can be highly variable, responding to processes with temporal modes ranging from seconds to centuries. Identifying scales of variability relevant to levels of biological organization requires a fuller characterization of both the frequency and magnitude domains of processes contributing to or reducing acidification in pelagic and benthic habitats. We review the processes that contribute to coastal acidification with attention to timescales of variability and habitats relevant to marine bivalves.

  19. A Conceptual Model of Natural and Anthropogenic Drivers and Their Influence on the Prince William Sound, Alaska, Ecosystem

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Cummins, Kenneth W.; Highsmith, Raymond C.; Hilborn, Ray; McRoy, C. Peter; Parrish, Julia; Weingartner, Thomas

    2010-01-01

    Prince William Sound (PWS) is a semi-enclosed fjord estuary on the coast of Alaska adjoining the northern Gulf of Alaska (GOA). PWS is highly productive and diverse, with primary productivity strongly coupled to nutrient dynamics driven by variability in the climate and oceanography of the GOA and North Pacific Ocean. The pelagic and nearshore primary productivity supports a complex and diverse trophic structure, including large populations of forage and large fish that support many species of marine birds and mammals. High intra-annual, inter-annual, and interdecadal variability in climatic and oceanographic processes drives high variability in the biological populations. A risk-based conceptual ecosystem model (CEM) is presented describing the natural processes, anthropogenic drivers, and resultant stressors that affect PWS, including stressors caused by the Great Alaska Earthquake of 1964 and the Exxon Valdez oil spill of 1989. A trophodynamic model incorporating PWS valued ecosystem components is integrated into the CEM. By representing the relative strengths of drivers/stressors/effects, the CEM graphically demonstrates the fundamental dynamics of the PWS ecosystem, the natural forces that control the ecological condition of the Sound, and the relative contribution of natural processes and human activities to the health of the ecosystem. The CEM illustrates the dominance of natural processes in shaping the structure and functioning of the GOA and PWS ecosystems. PMID:20862192

  20. A Conceptual Model of Natural and Anthropogenic Drivers and Their Influence on the Prince William Sound, Alaska, Ecosystem.

    PubMed

    Harwell, Mark A; Gentile, John H; Cummins, Kenneth W; Highsmith, Raymond C; Hilborn, Ray; McRoy, C Peter; Parrish, Julia; Weingartner, Thomas

    2010-07-01

    Prince William Sound (PWS) is a semi-enclosed fjord estuary on the coast of Alaska adjoining the northern Gulf of Alaska (GOA). PWS is highly productive and diverse, with primary productivity strongly coupled to nutrient dynamics driven by variability in the climate and oceanography of the GOA and North Pacific Ocean. The pelagic and nearshore primary productivity supports a complex and diverse trophic structure, including large populations of forage and large fish that support many species of marine birds and mammals. High intra-annual, inter-annual, and interdecadal variability in climatic and oceanographic processes drives high variability in the biological populations. A risk-based conceptual ecosystem model (CEM) is presented describing the natural processes, anthropogenic drivers, and resultant stressors that affect PWS, including stressors caused by the Great Alaska Earthquake of 1964 and the Exxon Valdez oil spill of 1989. A trophodynamic model incorporating PWS valued ecosystem components is integrated into the CEM. By representing the relative strengths of drivers/stressors/effects, the CEM graphically demonstrates the fundamental dynamics of the PWS ecosystem, the natural forces that control the ecological condition of the Sound, and the relative contribution of natural processes and human activities to the health of the ecosystem. The CEM illustrates the dominance of natural processes in shaping the structure and functioning of the GOA and PWS ecosystems.

  1. Characterization of eco-hydraulic habitats for examining biogeochemical processes in rivers

    NASA Astrophysics Data System (ADS)

    McPhillips, L. E.; O'Connor, B. L.; Harvey, J. W.

    2009-12-01

    Spatial variability in biogeochemical reaction rates in streams is often attributed to sediment characteristics such as particle size, organic material content, and biota attached to or embedded within the sediments. Also important in controlling biogeochemical reaction rates are hydraulic conditions, which influence mass transfer of reactants from the stream to the bed, as well as hyporheic exchange within near-surface sediments. This combination of physical and ecological variables has the potential to create habitats that are unique not only in sediment texture but also in their biogeochemical processes and metabolism rates. In this study, we examine the two-dimensional (2D) variability of these habitats in an agricultural river in central Iowa. The streambed substratum was assessed using a grid-based survey identifying dominant particle size classes, as well as aerial coverage of green algae, benthic organic material, and coarse woody debris. Hydraulic conditions were quantified using a calibrated 2D model, and hyporheic exchange was assessed using a scaling relationship based on sediment and hydraulic characteristics. Point-metabolism rates were inferred from measured sediment dissolved oxygen profiles using an effective diffusion model and compared to traditional whole-stream measurements of metabolism. The 185 m study reach had contrasting geomorphologic and hydraulic characteristics in the upstream and downstream portions of an otherwise relatively straight run of a meandering river. The upstream portion contained a large central gravel bar (50 m in length) flanked by riffle-run segments and the downstream portion contained a deeper, fairly uniform channel cross-section. While relatively high flow velocities and gravel sediments were characteristic of the study river, the upstream island bar separated channels that differed with sandy gravels on one side and cobbley gravels on the other. Additionally, green algae was almost exclusively found in riffle portions of the cobbley gravel channel sediments while fine benthic organic material was concentrated at channel margins, regardless of the underlying sediments. A high degree of spatial variability in hyporheic exchange potential was the result of the complex 2D nature of topography and hydraulics. However, sediment texture classifications did a reasonable job in characterizing variability in hyporheic exchange potential because sediment texture mapping incorporates qualitative aspects of bed shear stress and hydraulic conductivity that control hyporheic exchange. Together these variables greatly influenced point-metabolism measurements in different sediment texture habitats separated by only 1 to 2 m. Results from this study suggest that spatial variability and complex interactions between geomorphology, hydraulics, and biological communities generate eco-hydraulic habitats that control variability in biogeochemical processes. The processes controlling variability are highly two-dimensional in nature and are not often accounted for in traditional one-dimensional analysis approaches of biogeochemical processes.

  2. Small-scale variability in tropical tropopause layer humidity

    NASA Astrophysics Data System (ADS)

    Jensen, E. J.; Ueyama, R.; Pfister, L.; Karcher, B.; Podglajen, A.; Diskin, G. S.; DiGangi, J. P.; Thornberry, T. D.; Rollins, A. W.; Bui, T. V.; Woods, S.; Lawson, P.

    2016-12-01

    Recent advances in statistical parameterizations of cirrus cloud processes for use in global models are highlighting the need for information about small-scale fluctuations in upper tropospheric humidity and the physical processes that control the humidity variability. To address these issues, we have analyzed high-resolution airborne water vapor measurements obtained in the Airborne Tropical TRopopause EXperiment over the tropical Pacific between 14 and 20 km. Using accurate and precise 1-Hz water vapor measurements along approximately-level aircraft flight legs, we calculate structure functions spanning horizontal scales ranging from about 0.2 to 50 km, and we compare the water vapor variability in the lower (about 14 km) and upper (16-19 km) Tropical Tropopause Layer (TTL). We also compare the magnitudes and scales of variability inside TTL cirrus versus in clear-sky regions. The measurements show that in the upper TTL, water vapor concentration variance is stronger inside cirrus than in clear-sky regions. Using simulations of TTL cirrus formation, we show that small variability in clear-sky humidity is amplified by the strong sensitivity of ice nucleation rate to supersaturation, which results in highly-structured clouds that subsequently drive variability in the water vapor field. In the lower TTL, humidity variability is correlated with recent detrainment from deep convection. The structure functions indicate approximately power-law scaling with spectral slopes ranging from about -5/3 to -2.
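    A minimal sketch of the structure-function calculation described, assuming a 1-Hz scalar series along a level flight leg and a nominal 200 m/s airspeed to convert sample lags to horizontal distance; both the synthetic series and the airspeed are placeholders.

```python
# Second-order structure function of a 1-Hz scalar series along a level flight
# leg; the synthetic signal and the 200 m/s airspeed are placeholders.
import numpy as np

rng = np.random.default_rng(7)
speed_m_s = 200.0                        # assumed true airspeed
q = np.cumsum(rng.normal(size=3600))     # toy 1-Hz water vapor series (1 hour)

def structure_function(x, lags, order=2):
    """S_p(r) = <|x(t + r) - x(t)|^p> evaluated at integer sample lags."""
    return np.array([np.mean(np.abs(x[lag:] - x[:-lag]) ** order)
                     for lag in lags])

lags = np.unique(np.logspace(0, np.log10(250), 20).astype(int))
S2 = structure_function(q, lags)
r_km = lags * speed_m_s / 1000.0

# In a power-law regime the spectral slope is -(1 + scaling exponent).
exponent = np.polyfit(np.log(r_km), np.log(S2), 1)[0]
print(f"structure-function scaling exponent ~ {exponent:.2f} "
      f"(implied spectral slope ~ {-(1 + exponent):.2f})")
```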

  3. Geometrical accuracy improvement in flexible roll forming lines

    NASA Astrophysics Data System (ADS)

    Larrañaga, J.; Berner, S.; Galdos, L.; Groche, P.

    2011-01-01

    The general interest in producing profiles with variable cross-section in a cost-effective way has increased in the last few years. The flexible roll forming process allows profiles with a lengthwise-variable cross-section to be produced in a continuous way. Until now, only a few flexible roll forming lines have been developed and built. Apart from flange wrinkling along the transition zone of U-profiles with variable cross-section, the process limits have not been investigated and solutions for shape deviations are unknown. During the PROFOM project, a flexible roll forming machine has been developed with the objective of producing high-technology components for automotive body structures. In order to investigate the limits of the process, different profile geometries and steel grades, including high-strength steels, have been applied. During the first experimental tests, several errors were identified as a result of the complex stress states generated during the forming process. In order to improve the accuracy of the target profiles and to meet the tolerance demands of the automotive industry, a thermo-mechanical solution has been proposed. Additional mechanical devices supporting the flexible roll forming process have been implemented in the roll forming line, together with local heating techniques. The combination of both methods shows a significant increase in accuracy. In the present investigation, the experimental results of the validation process are presented.

  4. Process mapping as a framework for performance improvement in emergency general surgery.

    PubMed

    DeGirolamo, Kristin; D'Souza, Karan; Hall, William; Joos, Emilie; Garraway, Naisan; Sing, Chad Kim; McLaughlin, Patrick; Hameed, Morad

    2017-12-01

    Emergency general surgery conditions are often thought of as being too acute for the development of standardized approaches to quality improvement. However, process mapping, a concept that has been applied extensively in manufacturing quality improvement, is now being used in health care. The objective of this study was to create process maps for small bowel obstruction in an effort to identify potential areas for quality improvement. We used the American College of Surgeons Emergency General Surgery Quality Improvement Program pilot database to identify patients who received nonoperative or operative management of small bowel obstruction between March 2015 and March 2016. This database, patient charts and electronic health records were used to create process maps from the time of presentation to discharge. Eighty-eight patients with small bowel obstruction (33 operative; 55 nonoperative) were identified. Patients who received surgery had a complication rate of 32%. The processes of care from the time of presentation to the time of follow-up were highly elaborate and variable in terms of duration; however, the sequences of care were found to be consistent. We used data visualization strategies to identify bottlenecks in care, and they showed substantial variability in terms of operating room access. Variability in the operative care of small bowel obstruction is high and represents an important improvement opportunity in general surgery. Process mapping can identify common themes, even in acute care, and suggest specific performance improvement measures.

  5. Process mapping as a framework for performance improvement in emergency general surgery.

    PubMed

    DeGirolamo, Kristin; D'Souza, Karan; Hall, William; Joos, Emilie; Garraway, Naisan; Sing, Chad Kim; McLaughlin, Patrick; Hameed, Morad

    2018-02-01

    Emergency general surgery conditions are often thought of as being too acute for the development of standardized approaches to quality improvement. However, process mapping, a concept that has been applied extensively in manufacturing quality improvement, is now being used in health care. The objective of this study was to create process maps for small bowel obstruction in an effort to identify potential areas for quality improvement. We used the American College of Surgeons Emergency General Surgery Quality Improvement Program pilot database to identify patients who received nonoperative or operative management of small bowel obstruction between March 2015 and March 2016. This database, patient charts and electronic health records were used to create process maps from the time of presentation to discharge. Eighty-eight patients with small bowel obstruction (33 operative; 55 nonoperative) were identified. Patients who received surgery had a complication rate of 32%. The processes of care from the time of presentation to the time of follow-up were highly elaborate and variable in terms of duration; however, the sequences of care were found to be consistent. We used data visualization strategies to identify bottlenecks in care, and they showed substantial variability in terms of operating room access. Variability in the operative care of small bowel obstruction is high and represents an important improvement opportunity in general surgery. Process mapping can identify common themes, even in acute care, and suggest specific performance improvement measures.

  6. Sometimes processes don't matter: the general effect of short term climate variability on erosional systems.

    NASA Astrophysics Data System (ADS)

    Deal, Eric; Braun, Jean

    2017-04-01

    Climatic forcing undoubtedly plays an important role in shaping the Earth's surface. However, precisely how climate affects erosion rates, landscape morphology and the sedimentary record is highly debated. Recently there has been a focus on the influence of short-term variability in rainfall and river discharge on the relationship between climate and erosion rates. Here, we present a simple probabilistic argument, backed by modelling, that demonstrates that the way the Earth's surface responds to short-term climatic forcing variability is primarily determined by the existence and magnitude of erosional thresholds. We find that it is the ratio between the threshold magnitude and the mean magnitude of climatic forcing that determines whether variability matters or not and in which way. This is a fundamental result that applies regardless of the nature of the erosional process. This means, for example, that we can understand the role that discharge variability plays in determining fluvial erosion efficiency despite doubts about the processes involved in fluvial erosion. We can use this finding to reproduce the main conclusions of previous studies on the role of discharge variability in determining long-term fluvial erosion efficiency. Many aspects of the landscape known to influence discharge variability are affected by human activity, such as land use and river damming. Another important control on discharge variability, rainfall intensity, is also expected to increase with warmer temperatures. Among many other implications, our findings help provide a general framework to understand and predict the response of the Earth's surface to changes in the mean and variability of rainfall and river discharge associated with anthropogenic activity. In addition, the process-independent nature of our findings suggests that previous work on river discharge variability and erosion thresholds can be applied to other erosional systems.
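    A minimal sketch of the probabilistic argument, under stated assumptions (lognormal discharge, unit erodibility, erosion proportional to the excess of discharge over a threshold): when the threshold is small compared with the mean forcing, variability barely changes the long-term erosion rate, but when the threshold exceeds the mean, the high-variability climate erodes far more.

```python
# Long-term erosion under a detachment threshold, E ~ <max(Q - Qc, 0)>, for
# low- and high-variability discharge with the same mean. Lognormal forcing
# and unit erodibility are illustrative choices, not the paper's model.
import numpy as np

rng = np.random.default_rng(1)
mean_q = 1.0

def long_term_erosion(cv, q_crit, n=200_000):
    """Mean erosion rate for lognormal discharge with coefficient of
    variation cv and erosion threshold q_crit (erodibility set to 1)."""
    sigma2 = np.log(1.0 + cv**2)
    mu = np.log(mean_q) - 0.5 * sigma2
    q = rng.lognormal(mu, np.sqrt(sigma2), n)
    return np.mean(np.clip(q - q_crit, 0.0, None))

for q_crit in (0.1, 1.0, 3.0):           # threshold well below, at, above mean
    low = long_term_erosion(cv=0.3, q_crit=q_crit)
    high = long_term_erosion(cv=2.0, q_crit=q_crit)
    print(f"Qc/mean = {q_crit:.1f}: E(low variability) = {low:.4f}, "
          f"E(high variability) = {high:.4f}")
```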

  7. Optimization of conditions for isolation of high quality chitin from shrimp processing raw byproducts using response surface methodology and its characterization.

    PubMed

    Nidheesh, T; Suresh, P V

    2015-06-01

    Chitin is one of the most abundant bioactive biopolymers on earth. It is commercially extracted from crustacean shell byproducts of seafood processing by harsh thermochemical treatments. The extraction conditions and the source and pretreatment of the raw material significantly affect its quality and bioactivity. In this investigation, response surface methodology (RSM) was applied to optimize, and to evaluate the interaction of, the variables for extraction of high-quality chitin from raw shrimp-processing byproducts. A concentration of HCl (%, v/v) of 4.5 (for wet) and 4.9 (for dry) material, a reaction time of 3 h, and a solid-liquid ratio in HCl (w/v) of 1:5.5 (for wet) and 1:7.9 (for dry), applied in two treatments, achieved >98% demineralization of the shrimp byproduct. A NaOH concentration of 3.6% (w/v), a reaction time of 2.5 h, a temperature of 69.0 ± 1 °C and a solid-liquid ratio in NaOH of 7.4 (w/v), applied in two treatments, accomplished >98% deproteinization of the demineralized byproduct. Significant (p ≤ 0.05-0.001) interactive effects were observed between the different variables. The chitin obtained under these conditions had residual contents (%, w/w) of ash <0.4 and protein <0.8, and its degree of N-acetylation was >93% with a purity of >98%. In conclusion, the conditions optimized by RSM can be applied for the large-scale preparation of high-quality chitin from raw shrimp byproducts.
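    A minimal sketch of the RSM step for one response, assuming a two-variable demineralization surface: fit a second-order polynomial to the design points and locate its optimum within the experimental region. The design points, responses and bounds are invented, not the study's data.

```python
# Fit a second-order response surface to (HCl concentration, reaction time)
# vs. demineralization and locate its optimum inside the experimental region.
# The design points, responses and bounds are invented placeholders.
import numpy as np
from scipy.optimize import minimize

X = np.array([[3.0, 2.0], [6.0, 2.0], [3.0, 4.0], [6.0, 4.0],
              [2.4, 3.0], [6.6, 3.0], [4.5, 1.6], [4.5, 4.4],
              [4.5, 3.0], [4.5, 3.0], [4.5, 3.0]])     # HCl %, time h
y = np.array([88.0, 95.0, 92.0, 96.0, 85.0, 96.0,
              90.0, 97.0, 98.0, 97.5, 98.2])           # demineralization %

def design_matrix(X):
    a, t = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(a), a, t, a * t, a**2, t**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

def predicted(x):
    return design_matrix(np.atleast_2d(x))[0] @ beta

res = minimize(lambda x: -predicted(x), x0=[4.5, 3.0],
               bounds=[(2.4, 6.6), (1.6, 4.4)])
print(f"optimum: HCl = {res.x[0]:.2f} %, time = {res.x[1]:.2f} h, "
      f"predicted demineralization = {predicted(res.x):.1f} %")
```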

  8. Motion makes sense: an adaptive motor-sensory strategy underlies the perception of object location in rats.

    PubMed

    Saraf-Sinik, Inbar; Assa, Eldad; Ahissar, Ehud

    2015-06-10

    Tactile perception is obtained by coordinated motor-sensory processes. We studied the processes underlying the perception of object location in freely moving rats. We trained rats to identify the relative location of two vertical poles placed in front of them and measured at high resolution the motor and sensory variables (19 and 2 variables, respectively) associated with this whiskers-based perceptual process. We found that the rats developed stereotypic head and whisker movements to solve this task, in a manner that can be described by several distinct behavioral phases. During two of these phases, the rats' whiskers coded object position by first temporal and then angular coding schemes. We then introduced wind (in two opposite directions) and remeasured their perceptual performance and motor-sensory variables. Our rats continued to perceive object location in a consistent manner under wind perturbations while maintaining all behavioral phases and relatively constant sensory coding. Constant sensory coding was achieved by keeping one group of motor variables (the "controlled variables") constant, despite the perturbing wind, at the cost of strongly modulating another group of motor variables (the "modulated variables"). The controlled variables included coding-relevant variables, such as head azimuth and whisker velocity. These results indicate that consistent perception of location in the rat is obtained actively, via a selective control of perception-relevant motor variables. Copyright © 2015 the authors 0270-6474/15/358777-13$15.00/0.

  9. Analyzing EFL Teachers' Initial Job Motivation and Factors Effecting Their Motivation in Fezalar Educational Institutions in Iraq

    ERIC Educational Resources Information Center

    Koran, Selcuk

    2015-01-01

    Teacher motivation is one of the primary variables behind students' high performance. Experience shows that students whose teachers are highly motivated are more engaged in the learning process. Therefore, it is mostly the teacher who determines the level of success or failure in achieving the institution's goals in the educational process. Thus, teachers…

  10. Time-division multiplexer uses digital gates

    NASA Technical Reports Server (NTRS)

    Myers, C. E.; Vreeland, A. E.

    1977-01-01

    Device eliminates errors caused by analog gates in multiplexing a large number of channels at high frequency. System was designed for use in aerospace work to multiplex signals for monitoring such variables as fuel consumption, pressure, temperature, strain, and stress. Circuit may be useful in monitoring variables in process control and medicine as well.

  11. Spatial and Temporal Monitoring of Dissolved Oxygen (DO) in New Jersey Coastal Waters Using Autonomous Gliders

    EPA Science Inventory

    The coastal ocean is a highly variable system with processes that have significant implications on the hydrographic and oxygen characteristics of the water column. The spatial and temporal variability of these fields can cause dramatic changes to water quality and in turn the h...

  12. High sub-seasonal variability in water volume transports, revealed through a new ocean monitoring initiative using autonomous gliders

    NASA Astrophysics Data System (ADS)

    Heslop, E.; Ruiz, S.; Allen, J.; Tintoré, J.

    2012-04-01

    One of the clear challenges facing oceanography today is to define variability in ocean processes at a seasonal and sub-seasonal scale, in order to clearly identify the signature of both natural large-scale climatic oscillations and the long-term trends brought about by the human-induced change in atmospheric composition. Without visibility of this variance, which helps to determine the margins of significance for long-term trends and decipher cause and effect, the inferences drawn from sparse data points can be misleading. The cyclonic basin-scale circulation pattern in the Western Mediterranean has long been known; the role that processes in the Balearic Basin play in modifying this is less well defined. The Balearic Channels (channels between the Balearic Islands) are constriction points on this basin-scale circulation that appear to exert a controlling influence on the north/south exchange of water masses. Understanding the variability in current flows through these channels is important, not just for the transport of heat and salt, but also for ocean biology that responds to physical variability at the scale of that variability. Earlier studies at a seasonal scale identified an interannual summer/winter variation of 1 Sv in the strength of the main circulation pattern, and a high cruise-to-cruise variability in the pattern and strength of the flows through the channels brought about by mesoscale activity. Initial results using new high-resolution data from glider-based monitoring missions across the Ibiza Channel (the main exchange channel in the Balearic Basin), combined with ship and contemporaneous satellite data, indicate surprisingly high and rapid changes in the flows of surface and intermediate waters imposed on the broad seasonal cycle. To date, the data suggest that there are three potential 'modes' of water volume transport, generated from the interplay between basin and mesoscale circulation. We will review the concept of transport modes as seen through the earlier seasonal ship-based studies and demonstrate that the scales of variability captured by the glider monitoring provide a unique view of variability in this circulation system, which is as high on a weekly timescale as the previously identified seasonal cycle.

  13. Spectroscopy of T Tauri stars with UVES. Observations and analysis of RU Lup

    NASA Astrophysics Data System (ADS)

    Stempels, H. C.; Piskunov, N.

    2002-08-01

    We present the first results of our observations of classical T Tauri Stars with UVES/VLT. The data consists of high signal-to-noise (≥150) and high spectral resolution (R ~ 60 000) spectra. A large simultaneous wavelength coverage throughout most of the visible spectrum and comparatively short integration times allow us to study variability on short time-scales, using a number of diagnostics reflecting a wide range of physical processes. In particular we concentrate on the properties and geometry of the accretion process in the strongly accreting and highly variable CTTS RU Lup. We use the evolution of the level of veiling, the shapes of absorption and emission lines, and correlations between these diagnostics, to make new measurements of the fundamental stellar parameters as well as constraints on the accretion process and its geometry. We also derive the shortest time-scale of incoherent changes, which has implications for the nature of the accretion process in RU Lup. Based on observations collected at the European Southern Observatory, Chile (proposal 65.I-0404).

  14. Intraindividual variability is related to cognitive change in older adults: evidence for within-person coupling.

    PubMed

    Bielak, Allison A M; Hultsch, David F; Strauss, Esther; MacDonald, Stuart W S; Hunter, Michael A

    2010-09-01

    In this study, the authors addressed the longitudinal nature of intraindividual variability over 3 years. A sample of 304 community-dwelling older adults, initially between the ages of 64 and 92 years, completed 4 waves of annual testing on a battery of accuracy- and latency-based tests covering a wide range of cognitive complexity. Increases in response-time inconsistency on moderately and highly complex tasks were associated with increasing age, but there were significant individual differences in change across the entire sample. The time-varying covariation between cognition and inconsistency was significant across the 1-year intervals and remained stable across both time and age. On occasions when intraindividual variability was high, participants' cognitive performance was correspondingly low. The strength of the coupling relationship was greater for more fluid cognitive domains such as memory, reasoning, and processing speed than for more crystallized domains such as verbal ability. Variability based on moderately and highly complex tasks provided the strongest prediction. These results suggest that intraindividual variability is highly sensitive to even subtle changes in cognitive ability. (c) 2010 APA, all rights reserved.

  15. Simulation of multivariate stationary stochastic processes using dimension-reduction representation methods

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui; Peng, Yongbo

    2018-03-01

    In view of the Fourier-Stieltjes integral formula of multivariate stationary stochastic processes, a unified formulation accommodating the spectral representation method (SRM) and proper orthogonal decomposition (POD) is deduced. By introducing random functions as constraints correlating the orthogonal random variables involved in the unified formulation, the dimension-reduction spectral representation method (DR-SRM) and the dimension-reduction proper orthogonal decomposition (DR-POD) are addressed. The proposed schemes are capable of representing the multivariate stationary stochastic process with a few elementary random variables, bypassing the challenges of high-dimensional random variables inherent in the conventional Monte Carlo methods. In order to accelerate the numerical simulation, the technique of Fast Fourier Transform (FFT) is integrated with the proposed schemes. For illustrative purposes, the simulation of the horizontal wind velocity field along the deck of a large-span bridge is carried out using the proposed methods with 2 and 3 elementary random variables. Numerical simulation reveals the usefulness of the dimension-reduction representation methods.
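    As background to the dimension-reduction variants discussed, a minimal sketch of the classical (non-reduced) spectral representation of a univariate stationary process; the target spectrum is an illustrative low-pass shape, not the bridge wind spectrum considered in the paper.

```python
# Classical spectral representation of a univariate stationary process:
# x(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k) with random phases.
import numpy as np

def srm_sample(S, omega_max, n_terms, T, rng):
    """One realization of the spectral representation for a one-sided PSD S."""
    dw = omega_max / n_terms
    w = (np.arange(n_terms) + 0.5) * dw            # midpoint frequencies
    phi = rng.uniform(0.0, 2.0 * np.pi, n_terms)   # independent random phases
    t = np.linspace(0.0, T, 2000)
    amp = np.sqrt(2.0 * S(w) * dw)
    x = (amp[:, None] * np.cos(np.outer(w, t) + phi[:, None])).sum(axis=0)
    return t, x

S = lambda w: 1.0 / (1.0 + w**2)                   # illustrative target PSD
rng = np.random.default_rng(5)
t, x = srm_sample(S, omega_max=20.0, n_terms=512, T=600.0, rng=rng)

# The sample variance should approach the integral of S over [0, omega_max],
# which is arctan(20) for the chosen spectrum.
print(f"sample variance = {x.var():.3f}, target = {np.arctan(20.0):.3f}")
```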

  16. Combinatorial techniques to efficiently investigate and optimize organic thin film processing and properties.

    PubMed

    Wieberger, Florian; Kolb, Tristan; Neuber, Christian; Ober, Christopher K; Schmidt, Hans-Werner

    2013-04-08

    In this article we present several developed and improved combinatorial techniques to optimize the processing conditions and material properties of organic thin films. The combinatorial approach allows multi-variable dependencies to be investigated and is well suited to studying organic thin films intended for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore, we demonstrate how combinations of composition and processing gradients can be applied to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is then carried out in very small areas arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow precise trends to be identified for the optimization of multi-variable-dependent processes, which is demonstrated here for the lithographic patterning process. We thereby verify the strong interaction, and thus the interdependency, of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning; the reported combinatorial techniques can be transferred to other multi-variable-dependent processes and used to investigate and optimize thin film layers and devices for optical, electro-optical, and electronic applications.

  17. Suspect/foil identification in actual crimes and in the laboratory: a reality monitoring analysis.

    PubMed

    Behrman, Bruce W; Richards, Regina E

    2005-06-01

    Four reality monitoring variables were used to discriminate suspect from foil identifications in 183 actual criminal cases. Four hundred sixty-one identification attempts based on five- and six-person lineups were analyzed. These identification attempts resulted in 238 suspect identifications and 68 foil identifications. Confidence, automatic processing, eliminative processing, and feature use comprised the set of reality monitoring variables. Thirty-five verbal confidence phrases taken from police reports were assigned numerical values on a 10-point confidence scale. Automatic processing identifications were those that occurred "immediately" or "without hesitation." Eliminative processing identifications occurred when witnesses compared or eliminated persons in the lineups. Confidence, automatic processing, and eliminative processing were significant predictors, but feature use was not. Confidence was the most effective discriminator. In cases that involved substantial evidence extrinsic to the identification, 43% of the suspect identifications were made with high confidence, whereas only 10% of the foil identifications were made with high confidence. The results of a laboratory study using the same predictors generally paralleled the archival results. Forensic implications are discussed.

  18. Yielding physically-interpretable emulators - A Sparse PCA approach

    NASA Astrophysics Data System (ADS)

    Galelli, S.; Alsahaf, A.; Giuliani, M.; Castelletti, A.

    2015-12-01

    Projection-based techniques, such as Proper Orthogonal Decomposition (POD), are a common approach to surrogating high-fidelity process-based models with lower-order dynamic emulators. With POD, the dimensionality reduction is achieved by using observations, or 'snapshots', generated with the high-fidelity model to project the entire set of input and state variables of this model onto a smaller set of basis functions that account for most of the variability in the data. While the reduction efficiency and variance control of POD techniques are usually very high, the resulting emulators are structurally complex and can hardly be given a physically meaningful interpretation, as each basis is a projection of the entire set of inputs and states. In this work, we propose a novel approach based on Sparse Principal Component Analysis (SPCA) that combines the assets of POD methods with the potential for ex-post interpretation of the emulator structure. SPCA reduces the number of non-zero coefficients in the basis functions by identifying a sparse matrix of coefficients. While the resulting set of basis functions may retain less variance of the snapshots, the presence of only a few non-zero coefficients assists in the interpretation of the underlying physical processes. The SPCA approach is tested on the reduction of a 1D hydro-ecological model (DYRESM-CAEDYM) used to describe the main ecological and hydrodynamic processes in Tono Dam, Japan. An experimental comparison against a standard POD approach shows that SPCA achieves the same accuracy in emulating a given output variable, for the same level of dimensionality reduction, while yielding better insights into the main process dynamics.
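
    As a purely illustrative sketch of the contrast described above (toy snapshot matrix and scikit-learn's PCA and SparsePCA, not the DYRESM-CAEDYM setup), the code below counts how many state variables load on each basis function; sparse components contain many exact zeros, which is what supports a physical reading of the reduced model.

```python
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

rng = np.random.default_rng(1)
# Toy snapshot matrix: 3 latent processes, each driving a small block of 30 model states.
latent = rng.normal(size=(200, 3))
loadings = np.zeros((3, 30))
loadings[0, 0:6] = 1.0
loadings[1, 10:16] = 1.0
loadings[2, 20:26] = 1.0
snapshots = latent @ loadings + 0.1 * rng.normal(size=(200, 30))

pca = PCA(n_components=3).fit(snapshots)
spca = SparsePCA(n_components=3, alpha=1.0, random_state=0).fit(snapshots)

# Count how many of the 30 state variables each basis function touches.
print("non-zero loadings per PCA component:      ",
      np.count_nonzero(pca.components_, axis=1))
print("non-zero loadings per SparsePCA component:",
      np.count_nonzero(spca.components_, axis=1))
```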

  19. The nature and use of prediction skills in a biological computer simulation

    NASA Astrophysics Data System (ADS)

    Lavoie, Derrick R.; Good, Ron

    The primary goal of this study was to examine the science process skill of prediction using qualitative research methodology. The think-aloud interview, modeled after Ericsson and Simon (1984), led to the identification of 63 program exploration and prediction behaviors. The performances of seven formal-operational and seven concrete-operational high-school biology students were videotaped during a three-phase learning sequence on water pollution. Subjects explored the effects of five independent variables on two dependent variables over time using a computer-simulation program. Predictions were made concerning the effect of the independent variables upon the dependent variables through time. Subjects were identified according to initial knowledge of the subject matter and success at solving three selected prediction problems. Successful predictors generally had high initial knowledge of the subject matter and were formal operational. Unsuccessful predictors generally had low initial knowledge and were concrete operational. High initial knowledge seemed to be more important to predictive success than stage of Piagetian cognitive development. Successful prediction behaviors involved systematic manipulation of the independent variables, note taking, identification and use of appropriate independent-dependent variable relationships, high interest and motivation, and, in general, higher-level thinking skills. Behaviors characteristic of unsuccessful predictors were nonsystematic manipulation of independent variables, lack of motivation and persistence, misconceptions, and the identification and use of inappropriate independent-dependent variable relationships.

  20. An Ecological Alternative to Snodgrass & Vanderwart: 360 High Quality Colour Images with Norms for Seven Psycholinguistic Variables

    PubMed Central

    Moreno-Martínez, Francisco Javier; Montoro, Pedro R.

    2012-01-01

    This work presents a new set of 360 high quality colour images belonging to 23 semantic subcategories. Two hundred and thirty-six Spanish speakers named the items and also provided data for seven relevant psycholinguistic variables: age of acquisition, familiarity, manipulability, name agreement, typicality and visual complexity. Furthermore, we also present lexical frequency data derived from Internet search hits. Apart from the high number of variables evaluated, all of which are known to affect the processing of stimuli, this new set presents important advantages over other similar image corpora: (a) this corpus includes a broad number of subcategories and images, which will permit researchers to select stimuli of appropriate difficulty as required (e.g., to deal with problems derived from ceiling effects); (b) the use of coloured stimuli provides a more realistic, ecologically valid representation of real-life objects. In sum, this set of stimuli provides a useful tool for research on visual object- and word-processing, both in neurological patients and in healthy controls. PMID:22662166

  1. Effects of long-term voluntary exercise on learning and memory processes: dependency of the task and level of exercise.

    PubMed

    García-Capdevila, Sílvia; Portell-Cortés, Isabel; Torras-Garcia, Meritxell; Coll-Andreu, Margalida; Costa-Miserachs, David

    2009-09-14

    The effect of long-term voluntary exercise (running wheel) on anxiety-like behaviour (plus maze and open field) and on learning and memory processes (object recognition and two-way active avoidance) was examined in Wistar rats. Because major individual differences in running wheel behaviour were observed, the data were analysed considering the exercising animals both as a whole and grouped according to the time spent in the running wheel (low, high, and very-high running). Although some variables related to anxiety-like behaviour seem to reflect an anxiogenic-compatible effect, the complete set of variables could be interpreted as an enhancement of defensive and risk-assessment behaviours in exercised animals, without major differences depending on the exercise level. Effects on learning and memory processes were dependent on the task and the level of exercise. Two-way avoidance was not affected in either the acquisition or the retention session, while retention of the object recognition task was affected. In this latter task, an enhancement in low running subjects and an impairment in high and very-high running animals were observed.

  2. eClims: An Extensible and Dynamic Integration Framework for Biomedical Information Systems.

    PubMed

    Savonnet, Marinette; Leclercq, Eric; Naubourg, Pierre

    2016-11-01

    Biomedical information systems (BIS) require consideration of three types of variability: data variability induced by new high throughput technologies, schema or model variability induced by large scale studies or new fields of research, and knowledge variability resulting from new discoveries. Beyond data heterogeneity, managing variabilities in the context of BIS requires an extensible and dynamic integration process. In this paper, we focus on data and schema variabilities and we propose an integration framework based on ontologies, master data, and semantic annotations. The framework addresses issues related to: 1) collaborative work through a dynamic integration process; 2) variability among studies through an annotation mechanism; and 3) quality control over data and semantic annotations. Our approach relies on two levels of knowledge: BIS-related knowledge is modeled using an application ontology coupled with UML models that allow data completeness and consistency to be controlled, and domain knowledge is described by a domain ontology, which ensures data coherence. A system built with the eClims framework has been implemented and evaluated in the context of a proteomic platform.

  3. The impact of inter-annual rainfall variability on food production in the Ganges basin

    NASA Astrophysics Data System (ADS)

    Siderius, Christian; Biemans, Hester; van Walsum, Paul; hellegers, Petra; van Ierland, Ekko; Kabat, Pavel

    2014-05-01

    Rainfall variability is expected to increase in the coming decades as the world warms. Especially in regions that are already water stressed, higher rainfall variability will jeopardize food security. Recently, the impact of inter-annual rainfall variability has received increasing attention in regional to global analyses of water availability and food security, but the description of the dynamics behind it is still incomplete in most models. Contemporary land surface and hydrological models used for such analyses describe variability in production primarily as a function of yield, a process driven by biophysical parameters, thereby neglecting yearly variations in cropped area, a process driven largely by management decisions. Agricultural statistics for northern India show that the latter process could explain up to 40% of the observed inter-annual variation in food production in various states. We added a simple dynamic land use decision module to a land surface model (LPJmL) and analyzed to what extent this improved the estimation of variability in food production. Using this improved modelling framework we then assessed whether, and at which scale, rainfall variability affects meeting the food self-sufficiency threshold. Early results for the Ganges Basin indicate that, while at basin level variability in crop production is still relatively low, several districts and states are highly affected (RSTD > 50%). Such insight can contribute to better recommendations on the most effective measures, at the most appropriate scale, to buffer variability in food production.

  4. Modulation of brain activity by multiple lexical and word form variables in visual word recognition: A parametric fMRI study.

    PubMed

    Hauk, Olaf; Davis, Matthew H; Pulvermüller, Friedemann

    2008-09-01

    Psycholinguistic research has documented a range of variables that influence visual word recognition performance. Many of these variables are highly intercorrelated. Most previous studies have used factorial designs, which do not exploit the full range of values available for continuous variables, and are prone to skewed stimulus selection as well as to effects of the baseline (e.g. when contrasting words with pseudowords). In our study, we used a parametric approach to study the effects of several psycholinguistic variables on brain activation. We focussed on the variable word frequency, which has been used in numerous previous behavioural, electrophysiological and neuroimaging studies, in order to investigate the neuronal network underlying visual word processing. Furthermore, we investigated the variable orthographic typicality as well as a combined variable for word length and orthographic neighbourhood size (N), for which neuroimaging results are still either scarce or inconsistent. Data were analysed using multiple linear regression analysis of event-related fMRI data acquired from 21 subjects in a silent reading paradigm. The frequency variable correlated negatively with activation in left fusiform gyrus, bilateral inferior frontal gyri and bilateral insulae, indicating that word frequency can affect multiple aspects of word processing. N correlated positively with brain activity in left and right middle temporal gyri as well as right inferior frontal gyrus. Thus, our analysis revealed multiple distinct brain areas involved in visual word processing within one data set.

  5. Child involvement, alliance, and therapist flexibility: process variables in cognitive-behavioural therapy for anxiety disorders in childhood.

    PubMed

    Hudson, Jennifer L; Kendall, Philip C; Chu, Brian C; Gosch, Elizabeth; Martin, Erin; Taylor, Alan; Knight, Ashleigh

    2014-01-01

    This study examined the relations between treatment process variables and child anxiety outcomes. Independent raters watched/listened to taped therapy sessions of 151 anxiety-disordered children (6-14 years old; M = 10.71; 43% boys) and assessed process variables (child alliance, therapist alliance, child involvement, therapist flexibility and therapist functionality) within a manual-based cognitive-behavioural treatment. Latent growth modelling examined three latent variables (intercept, slope, and quadratic) for each process variable. Child age, gender, family income and ethnicity were examined as potential antecedents. Outcome was analyzed using factorially derived clinician, mother, father, child and teacher scores from questionnaires and structured diagnostic interviews at pretreatment, posttreatment and 12-month follow-up. Latent growth models demonstrated a concave quadratic curve for child involvement and therapist flexibility over time. A predominantly linear, downward slope was observed for alliance, and functional flexibility remained consistent over time. Increased alliance, child involvement and therapist flexibility showed some, albeit inconsistent, associations with positive treatment outcome. Findings support the notion that maintaining the initial high level of alliance or involvement is important for clinical improvement. There is some support that progressively increasing alliance/involvement also positively impacts treatment outcome. These findings were not consistent across outcome measurement points or reporters. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Variability of attention processes in ADHD: observations from the classroom.

    PubMed

    Rapport, Mark D; Kofler, Michael J; Alderson, R Matt; Timko, Thomas M; Dupaul, George J

    2009-05-01

    Classroom- and laboratory-based efforts to study the attentional problems of children with ADHD have been incongruent in elucidating attentional deficits, and none have explored within- or between-minute variability in classroom attentional processing in children with ADHD. High and low attention groups of children with ADHD, defined via cluster analysis, and 36 typically developing children were observed while completing academic assignments in their general education classrooms. All children oscillated between attentive and inattentive states; however, children in both ADHD groups switched states more frequently and remained attentive for shorter durations relative to typically developing children. Overall differences in attention and in the optimal ability to maintain attention among the groups are consistent with laboratory studies of increased ADHD-related interindividual and intergroup variability, but inconsistent with laboratory results of increased intra-individual variability and attention decrements over time.

  7. Climate change and water table fluctuation: Implications for raised bog surface variability

    NASA Astrophysics Data System (ADS)

    Taminskas, Julius; Linkevičienė, Rita; Šimanauskienė, Rasa; Jukna, Laurynas; Kibirkštis, Gintautas; Tamkevičiūtė, Marija

    2018-03-01

    Cyclic peatland surface variability is influenced by hydrological conditions that depend strongly on climate and/or anthropogenic activities. A low water level leads to a decrease of peatland surface and an increase of C emissions into the atmosphere, whereas a high water level leads to an increase of peatland surface and carbon sequestration in peatlands. The main aim of this article is to evaluate the influence of hydrometeorological conditions on the peatland surface and its feedback on the water regime. A regional survey of raised bog water table fluctuation and surface variability was made in one of the largest peatlands in Lithuania. Two appropriate indicators for different peatland surface variability periods (increase and decrease) were detected. The first is an average net rainfall of 200 mm y⁻¹ over a three-year range. The second is an average annual water depth of 25-30 cm. The application of these indicators enabled the reconstruction of Čepkeliai peatland surface variability over a 100-year period. Processes of peatland surface variability differ in time and between separate parts of the peatland; therefore, internal subbasins are formed within the peatland. Subbasins involve autogenic processes that can later affect their internal hydrology, nutrient status, and vegetation succession. Internal hydrological conditions, surface fluctuation, and vegetation succession in peatland subbasins should be taken into account when evaluating their state, in nature management projects, and in other peatland research.

  8. High resolution climate scenarios for snowmelt modelling in small alpine catchments

    NASA Astrophysics Data System (ADS)

    Schirmer, M.; Peleg, N.; Burlando, P.; Jonas, T.

    2017-12-01

    Snow in the Alps is affected by climate change with regard to duration, timing and amount. This has implications for important societal issues such as drinking water supply and hydropower generation. In Switzerland, the latter has received much attention following the political decision to phase out nuclear electricity production. An increasing number of authorization requests for small hydropower plants located in small alpine catchments has been observed in recent years. This situation generates ecological conflicts, while the expected climate change poses a threat to water availability, thus putting at risk investments in such hydropower plants. Reliable high-resolution climate scenarios are therefore required, which account for small-scale processes to achieve realistic predictions of snowmelt runoff and its variability in small alpine catchments. We therefore used a novel model chain, coupling a stochastic two-dimensional weather generator (AWE-GEN-2d) with a state-of-the-art energy balance snow cover model (FSM). AWE-GEN-2d was applied to generate ensembles of climate variables at very fine temporal and spatial resolution, thus providing all climatic input variables required for the energy balance modelling. The land-surface model FSM was used to describe spatially variable snow cover accumulation and melt processes. The FSM was refined to allow applications at very high spatial resolution by specifically accounting for small-scale processes, such as a subgrid parametrization of snow-covered area and an improved representation of forest-snow processes. For the present study, the model chain was tested for current climate conditions using extensive observational datasets of different spatial and temporal coverage. Small-scale spatial processes such as elevation gradients or aspect differences in the snow distribution were evaluated using airborne LiDAR data. Forty years of monitoring data for snow water equivalent, snowmelt and snow-covered area for the whole of Switzerland were used to verify snow distribution patterns at coarser spatial and temporal scales. The ability of the model chain to reproduce current climate conditions in small alpine catchments makes this model combination an outstanding candidate for producing high-resolution climate scenarios of snowmelt in small alpine catchments.

  9. Velocity fields and spectrum peculiarities in Beta Cephei stars

    NASA Technical Reports Server (NTRS)

    Lesh, J. R.

    1980-01-01

    The acquisition of short wavelength spectra of Beta Cephei variable stars from the International Ultraviolet Explorer is reported. A total of 122 images of 10 variable stars and 3 comparison stars were obtained. All of the images were observed in the high dispersion mode through a small aperture. The development of image processing methods is also briefly discussed.

  10. Cigarette Smoking Outcomes at Four Years of Follow-Up, Psychosocial Factors, and Reactions to Group Intervention.

    ERIC Educational Resources Information Center

    Benfari, Robert C.; Eaker, Elaine

    1984-01-01

    Studied male smokers (N=182) at high risk of coronary heart disease to determine variables that discriminated between successful and nonsuccessful quitters. Analysis revealed that baseline level of smoking, life events, personal security, and selected group process variables were predictive of success or failure in the intervention program.…

  11. The Use of Artificial Neural Networks to Estimate Speech Intelligibility from Acoustic Variables: A Preliminary Analysis.

    ERIC Educational Resources Information Center

    Metz, Dale Evan; And Others

    1992-01-01

    A preliminary scheme for estimating the speech intelligibility of hearing-impaired speakers from acoustic parameters, using a computerized artificial neural network to process mathematically the acoustic input variables, is outlined. Tests with 60 hearing-impaired speakers found the scheme to be highly accurate in identifying speakers separated by…
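
    A minimal sketch of the general idea, under the assumption of synthetic data and a small scikit-learn network (the study's actual acoustic variables and network architecture are not reproduced here): map a vector of acoustic measurements to a continuous intelligibility score with a feedforward regressor.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
acoustic = rng.normal(size=(60, 8))            # 8 hypothetical acoustic variables, 60 speakers
weights = rng.normal(size=8)
intelligibility = acoustic @ weights + rng.normal(scale=0.1, size=60)  # toy target score

X_tr, X_te, y_tr, y_te = train_test_split(acoustic, intelligibility, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
net.fit(X_tr, y_tr)
print("held-out R^2:", round(net.score(X_te, y_te), 3))
```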

  12. Environmental variability and population dynamics: Do European and North American ducks play by the same rules?

    USGS Publications Warehouse

    Pöysä, Hannu; Rintala, Jukka; Johnson, Douglas H.; Kauppinen, Jukka; Lammi, Esa; Nudds, Thomas D.; Väänänen, Veli-Matti

    2016-01-01

    Density dependence, population regulation, and variability in population size are fundamental population processes, the manifestation and interrelationships of which are affected by environmental variability. However, there are surprisingly few empirical studies that distinguish the effect of environmental variability from the effects of population processes. We took advantage of a unique system, in which populations of the same duck species or close ecological counterparts live in highly variable (North American prairies) and in stable (north European lakes) environments, to distinguish the relative contributions of environmental variability (measured as between-year fluctuations in wetland numbers) and intraspecific interactions (density dependence) in driving population dynamics. We tested whether populations living in stable environments (in northern Europe) were more strongly governed by density dependence than populations living in variable environments (in North America). We also addressed whether relative population dynamical responses to environmental variability versus density corresponded to differences in life history strategies between dabbling (relatively “fast species” and governed by environmental variability) and diving (relatively “slow species” and governed by density) ducks. As expected, the variance component of population fluctuations caused by changes in breeding environments was greater in North America than in Europe. Contrary to expectations, however, populations in more stable environments were neither less variable nor clearly more strongly density dependent than populations in highly variable environments. Also, contrary to expectations, populations of diving ducks were neither more stable nor more strongly density dependent than populations of dabbling ducks, and the effect of environmental variability on population dynamics was greater in diving than in dabbling ducks. In general, irrespective of continent and species life history, environmental variability contributed more to variation in species abundances than did density. Our findings underscore the need for more studies on populations of the same species in different environments to verify the generality of current explanations about population dynamics and its association with species life history.

  13. Environmental variability and population dynamics: do European and North American ducks play by the same rules?

    PubMed

    Pöysä, Hannu; Rintala, Jukka; Johnson, Douglas H; Kauppinen, Jukka; Lammi, Esa; Nudds, Thomas D; Väänänen, Veli-Matti

    2016-10-01

    Density dependence, population regulation, and variability in population size are fundamental population processes, the manifestation and interrelationships of which are affected by environmental variability. However, there are surprisingly few empirical studies that distinguish the effect of environmental variability from the effects of population processes. We took advantage of a unique system, in which populations of the same duck species or close ecological counterparts live in highly variable (North American prairies) and in stable (north European lakes) environments, to distinguish the relative contributions of environmental variability (measured as between-year fluctuations in wetland numbers) and intraspecific interactions (density dependence) in driving population dynamics. We tested whether populations living in stable environments (in northern Europe) were more strongly governed by density dependence than populations living in variable environments (in North America). We also addressed whether relative population dynamical responses to environmental variability versus density corresponded to differences in life history strategies between dabbling (relatively "fast species" and governed by environmental variability) and diving (relatively "slow species" and governed by density) ducks. As expected, the variance component of population fluctuations caused by changes in breeding environments was greater in North America than in Europe. Contrary to expectations, however, populations in more stable environments were neither less variable nor clearly more strongly density dependent than populations in highly variable environments. Also, contrary to expectations, populations of diving ducks were neither more stable nor more strongly density dependent than populations of dabbling ducks, and the effect of environmental variability on population dynamics was greater in diving than in dabbling ducks. In general, irrespective of continent and species life history, environmental variability contributed more to variation in species abundances than did density. Our findings underscore the need for more studies on populations of the same species in different environments to verify the generality of current explanations about population dynamics and its association with species life history.

  14. Process for Operating a Dual-Mode Combustor

    NASA Technical Reports Server (NTRS)

    Trefny, Charles J. (Inventor); Dippold, Vance F. (Inventor)

    2017-01-01

    A new dual-mode ramjet combustor used for operation over a wide flight Mach number range is described. Subsonic combustion mode is usable to lower flight Mach numbers than current dual-mode scramjets. High speed mode is characterized by supersonic combustion in a free-jet that traverses the subsonic combustion chamber to a variable nozzle throat. Although a variable combustor exit aperture is required, the need for fuel staging to accommodate the combustion process is eliminated. Local heating from shock-boundary-layer interactions on combustor walls is also eliminated.

  15. Hybrid robust model based on an improved functional link neural network integrating with partial least square (IFLNN-PLS) and its application to predicting key process variables.

    PubMed

    He, Yan-Lin; Xu, Yuan; Geng, Zhi-Qiang; Zhu, Qun-Xiong

    2016-03-01

    In this paper, a hybrid robust model based on an improved functional link neural network integrating with partial least squares (IFLNN-PLS) is proposed. Firstly, an improved functional link neural network with small norm of expanded weights and high input-output correlation (SNEWHIOC-FLNN) was proposed for enhancing the generalization performance of FLNN. Unlike the traditional FLNN, the expanded variables of the original inputs are not directly used as the inputs in the proposed SNEWHIOC-FLNN model. The original inputs are instead attached to expanded weights of small norm. As a result, the correlation coefficient between some of the expanded variables and the outputs is enhanced. The larger the correlation coefficient is, the more relevant the expanded variables tend to be. In the end, the expanded variables with larger correlation coefficients are selected as inputs to improve the performance of the traditional FLNN. In order to test the proposed SNEWHIOC-FLNN model, three UCI (University of California, Irvine) regression datasets named Housing, Concrete Compressive Strength (CCS), and Yacht Hydro Dynamics (YHD) were selected. Then a hybrid model based on the improved FLNN integrating with partial least squares (IFLNN-PLS) was built. In the IFLNN-PLS model, the connection weights are calculated using the partial least squares method rather than the error back propagation algorithm. Lastly, IFLNN-PLS was developed as an intelligent measurement model for accurately predicting the key variables in the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. Simulation results illustrated that the IFLNN-PLS could significantly improve the prediction performance. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
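
    The sketch below is not the paper's IFLNN-PLS implementation; it only illustrates, on synthetic data, the general recipe of a functional link network whose weights are obtained by partial least squares rather than by back-propagation: trigonometrically expand the inputs, then fit PLS weights on the expanded variables.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def functional_link_expand(X):
    """Classic FLNN expansion: original inputs plus sin/cos(k*pi*x) for k = 1, 2."""
    parts = [X]
    for k in (1, 2):
        parts.append(np.sin(k * np.pi * X))
        parts.append(np.cos(k * np.pi * X))
    return np.hstack(parts)

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(300, 4))          # toy process inputs (assumption)
y = np.sin(np.pi * X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.05, size=300)

Z = functional_link_expand(X)                  # expanded variables
pls = PLSRegression(n_components=6).fit(Z, y)  # weights by PLS, not back-propagation
print("training R^2:", round(pls.score(Z, y), 3))
```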

  16. 16 CFR 1107.21 - Periodic testing.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... samples selected for testing pass the test, there is a high degree of assurance that the other untested... determining the testing interval include, but are not limited to, the following: (i) High variability in test... process management techniques and tests provide a high degree of assurance of compliance if they are not...

  17. 16 CFR § 1107.21 - Periodic testing.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... samples selected for testing pass the test, there is a high degree of assurance that the other untested... determining the testing interval include, but are not limited to, the following: (i) High variability in test... process management techniques and tests provide a high degree of assurance of compliance if they are not...

  18. 16 CFR 1107.21 - Periodic testing.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... samples selected for testing pass the test, there is a high degree of assurance that the other untested... determining the testing interval include, but are not limited to, the following: (i) High variability in test... process management techniques and tests provide a high degree of assurance of compliance if they are not...

  19. Processes Affecting the Annual Surface Energy Budget at High-Latitude Terrestrial Sites

    NASA Astrophysics Data System (ADS)

    Persson, P. O. G.; Stone, R. S.; Grachev, A.; Matrosova, L.

    2012-04-01

    Instrumentation at four Study of Environmental Arctic Change (SEARCH) sites (Barrow, Eureka, Alert, and Tiksi) has been enhanced in the past 6 years, including during the 2007-2008 IPY. Data from these sites are used to investigate the annual cycle of the surface energy budget (SEB), its coupling to atmospheric processes, and, for Alert, its interannual variability. The comprehensive data sets are useful for showing interactions between the atmosphere, surface, and soil at high temporal resolution throughout the annual cycle. Processes that govern the SEB variability at each site are identified, and their impacts on the SEB are quantified. For example, mesoscale modulation of the SEB caused by forcing from the local terrain (downslope wind events) and coastlines (sea and land breezes) is significant at Alert and Eureka, with these processes affecting the radiative, turbulent, and ground heat flux terms in the SEB. Sub-seasonal and interannual variations in atmospheric processes and the SEB impact soil thermal structure, such as the depth and timing of the summer active layer. These analyses provide an improved understanding of the processes producing changes in surface and soil temperature, linking them through the SEB as affected by atmospheric processes.

  20. Isolation of fish skin and bone gelatin from tilapia (Oreochromis niloticus): Response surface approach

    NASA Astrophysics Data System (ADS)

    Arpi, N.; Fahrizal; Novita, M.

    2018-03-01

    In this study, gelatin from fish collagen, a halal source, was extracted from tilapia (Oreochromis niloticus) skin and bone, using Response Surface Methodology to optimize the gelatin extraction conditions. The concentrations of NaOH (alkaline) and HCl (acid) in the pretreatment process and the temperature in the extraction process were chosen as independent variables, while the dependent variables were yield, gel strength, and emulsion activity index (EAI). The results showed that lower NaOH pretreatment concentrations provided suitable pH conditions for extraction which, combined with higher extraction temperatures, resulted in high gelatin yield. The gelatin emulsion activity index, however, increased as NaOH concentration and extraction temperature decreased. None of the three independent variables had a significant effect on gelatin gel strength. The RSM optimization resulted in optimum extraction conditions of 0.77 N NaOH, 0.59 N HCl, and an extraction temperature of 66.80 °C. The optimal solution formula met its optimization targets at 94.38%.
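
    As a hedged illustration of the response-surface step (synthetic data and made-up coefficients, not the study's measurements), the sketch below fits a second-order polynomial in three coded factors and searches for the setting that maximizes predicted yield inside the design region.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import minimize

rng = np.random.default_rng(4)
X = rng.uniform(-1, 1, size=(30, 3))      # coded factors: NaOH, HCl, temperature (toy design)
# Synthetic yield surface with a maximum inside the design region (assumption).
yield_pct = 80 - 5 * (X[:, 0] - 0.2) ** 2 - 3 * (X[:, 2] - 0.6) ** 2 \
            + rng.normal(scale=0.5, size=30)

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), yield_pct)

# Maximize the fitted second-order surface inside the coded region [-1, 1]^3.
res = minimize(lambda x: -model.predict(poly.transform(x.reshape(1, -1)))[0],
               x0=np.zeros(3), bounds=[(-1, 1)] * 3)
print("predicted optimum (coded factors):", np.round(res.x, 2))
```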

  1. Statistical Study to Evaluate the Effect of Processing Variables on Shrinkage Incidence During Solidification of Nodular Cast Irons

    NASA Astrophysics Data System (ADS)

    Gutiérrez, J. M.; Natxiondo, A.; Nieves, J.; Zabala, A.; Sertucha, J.

    2017-04-01

    The study of shrinkage incidence variations in nodular cast irons is an important aspect of manufacturing processes. These variations change the feeding requirements of castings, and the optimization of riser size is consequently affected when avoiding the formation of shrinkage defects. The effect of a number of processing variables on shrinkage size has been studied using a layout specifically designed for this purpose. The β parameter has been defined as the relative volume reduction from the pouring temperature down to room temperature. It is observed that shrinkage size and β decrease as effective carbon content increases and when inoculant is added to the pouring stream. A similar effect is found when the parameters selected from cooling curves show high graphite nucleation during solidification of cast irons for a given inoculation level. Pearson statistical analysis has been used to analyze the correlations among all of the variables involved, and a group of Bayesian networks has subsequently been built to obtain the most accurate model for predicting β as a function of the input processing variables. The developed models can be used in foundry plants to study shrinkage incidence variations in the manufacturing process and to optimize the related costs.

  2. Apparatus and method for microwave processing of materials

    DOEpatents

    Johnson, A.C.; Lauf, R.J.; Bible, D.W.; Markunas, R.J.

    1996-05-28

    Disclosed is a variable frequency microwave heating apparatus designed to allow modulation of the frequency of the microwaves introduced into a furnace cavity for testing or other selected applications. The variable frequency heating apparatus is used in the method of the present invention to monitor the resonant processing frequency within the furnace cavity depending upon the material, including the state thereof, from which the workpiece is fabricated. The variable frequency microwave heating apparatus includes a microwave signal generator and a high-power microwave amplifier or a microwave voltage-controlled oscillator. A power supply is provided for operation of the high-power microwave oscillator or microwave amplifier. A directional coupler is provided for detecting the direction and amplitude of signals incident upon and reflected from the microwave cavity. A first power meter is provided for measuring the power delivered to the microwave furnace. A second power meter detects the magnitude of reflected power. Reflected power is dissipated in the reflected power load. 10 figs.

  3. A Strategy for Autogeneration of Space Shuttle Ground Processing Simulation Models for Project Makespan Estimations

    NASA Technical Reports Server (NTRS)

    Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.

    2005-01-01

    Space Shuttle processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource-Constrained Stochastic Project Scheduling Problem (RC-SPSP), has a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation modeling is a useful tool for estimating the makespan of the overall process. However, simulation requires a model to be developed, which is itself a labor- and time-consuming effort. With such a dynamic process, the model would often be out of synchronization with the actual process, limiting the applicability of the simulation answers in solving the actual estimation problem. Integration of TEAMS model-enabling software with our existing schedule program software is the basis of our solution. This paper explains the approach used to develop an auto-generated simulation model from planning and schedule efforts and available data.

  4. Variability of the institutional review board process within a national research network.

    PubMed

    Khan, Muhammad A; Barratt, Michelle S; Krugman, Scott D; Serwint, Janet R; Dumont-Driscoll, Marilyn

    2014-06-01

    To determine the variability of the institutional review board (IRB) process for a minimal risk multicenter study. Participants included 24 Continuity Research Network (CORNET) sites of the Academic Pediatric Association that participated in a cross-sectional study. Each site obtained individual institutional IRB approval. An anonymous questionnaire went to site investigators about the IRB process at their institution. Twenty-two of 24 sites (92%) responded. Preparation time ranged from 1 to 20 hours, mean of 7.1 hours. Individuals submitting ≤3 IRB applications/year required more time for completion than those submitting >3/year (P < .05). Thirteen of 22 (59%) study sites received approval with "exempt" status, and 6 (27%) approved as "expedited" studies. IRB experiences were highly variable across study sites. These findings indicate that multicenter research projects should anticipate barriers to timely study implementation. Improved IRB standardization or centralization for multicenter clinical studies would facilitate this type of practice-based clinical research.

  5. Behavioral variability in an evolutionary theory of behavior dynamics.

    PubMed

    Popa, Andrei; McDowell, J J

    2016-03-01

    McDowell's evolutionary theory of behavior dynamics (McDowell, 2004) instantiates populations of behaviors (abstractly represented by integers) that evolve under the selection pressure of the environment in the form of positive reinforcement. Each generation gives rise to the next via low-level Darwinian processes of selection, recombination, and mutation. The emergent patterns can be analyzed and compared to those produced by biological organisms. The purpose of this project was to explore the effects of high mutation rates on behavioral variability in environments that arranged different reinforcer rates and magnitudes. Behavioral variability increased with the rate of mutation. High reinforcer rates and magnitudes reduced these effects; low reinforcer rates and magnitudes augmented them. These results are in agreement with live-organism research on behavioral variability. Various combinations of mutation rates, reinforcer rates, and reinforcer magnitudes produced similar high-level outcomes (equifinality). These findings suggest that the independent variables that describe an experimental condition interact; that is, they do not influence behavior independently. These conclusions have implications for the interpretation of high levels of variability, mathematical undermatching, and the matching theory. The last part of the discussion centers on a potential biological counterpart for the rate of mutation, namely spontaneous fluctuations in the brain's default mode network. © 2016 Society for the Experimental Analysis of Behavior.
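
    The following toy sketch (its parameters and integer coding are assumptions, not McDowell's implementation) illustrates the qualitative point: a population of integer-coded behaviors evolves by selection and mutation under a reinforcement-like fitness rule, and raising the mutation rate raises the moment-to-moment variability of the emitted behavior.

```python
import numpy as np

def run(mutation_rate, generations=500, pop_size=100, target=500, rng=None):
    """Return the standard deviation of the behaviors emitted across generations."""
    if rng is None:
        rng = np.random.default_rng(0)
    pop = rng.integers(0, 1024, pop_size)                   # integer-coded behaviors
    emitted = []
    for _ in range(generations):
        emitted.append(rng.choice(pop))                      # one behavior is emitted
        fitness = -np.abs(pop - target)                      # reinforcement favors the target class
        parents = pop[np.argsort(fitness)][-pop_size // 2:]  # select the fitter half
        children = rng.choice(parents, pop_size)             # stand-in for recombination
        mutate = rng.random(pop_size) < mutation_rate
        children[mutate] = rng.integers(0, 1024, mutate.sum())  # mutation reintroduces variability
        pop = children
    return np.std(emitted)

for rate in (0.01, 0.10, 0.40):
    print(f"mutation rate {rate:.2f}: behavioral SD = {run(rate):.1f}")
```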

  6. The variability puzzle in human memory.

    PubMed

    Kahana, Michael J; Aggarwal, Eash V; Phan, Tung D

    2018-04-26

    Memory performance exhibits a high level of variability from moment to moment. Much of this variability may reflect inadequately controlled experimental variables, such as word memorability, past practice and subject fatigue. Alternatively, stochastic variability in performance may largely reflect the efficiency of endogenous neural processes that govern memory function. To help adjudicate between these competing views, the authors conducted a multisession study in which subjects completed 552 trials of a delayed free-recall task. Applying a statistical model to predict variability in each subject's recall performance uncovered modest effects of word memorability, proactive interference, and other variables. In contrast to the limited explanatory power of these experimental variables, performance on the prior list strongly predicted current list recall. These findings suggest that endogenous factors underlying successful encoding and retrieval drive variability in performance. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  7. Removal of iron ore slimes from a highly turbid water by DAF.

    PubMed

    Faustino, L M; Braga, A S; Sacchi, G D; Whitaker, W; Reali, M A P; Leal Filho, L S; Daniel, L A

    2018-05-30

    This paper addresses Dissolved Air Flotation (DAF) process variables, such as the flocculation parameters and the recycle water addition, as well as the pretreatment chemical variables (coagulation conditions), to determine the optimal values for the flotation of iron ore slimes found in a highly turbid water sample from the Gualaxo do Norte River, a tributary of the Doce River Basin in Minas Gerais, Brazil. This work was conducted using a flotatest batch laboratory-scale device to evaluate the effectiveness of DAF for cleaning the water polluted by the Samarco tailings dam leakage and to determine the ability of DAF to reduce the water turbidity from 358 NTU to values below 100 NTU, aiming to comply with current legislation. The results showed that the four types of tested coagulants (PAC, ferric chloride, Tanfloc SG and Tanfloc SL) provided adequate conditions for coagulation, flocculation and flotation (in the range of 90-99.6% turbidity reduction). Although the process variables were optimized and low residual turbidity values were achieved, the results revealed that a portion of the flocs settled at the bottom of the flotatest columns, which indicated that the turbidity results reflected removal by a combination of simultaneous flotation and sedimentation processes.

  8. Combined Effect of Levels in Personal Self-Regulation and Regulatory Teaching on Meta-Cognitive, on Meta-Motivational, and on Academic Achievement Variables in Undergraduate Students.

    PubMed

    de la Fuente, Jesús; Sander, Paul; Martínez-Vicente, José M; Vera, Mariano; Garzón, Angélica; Fadda, Salvattore

    2017-01-01

    The Theory of Self- vs. Externally-Regulated Learning™ (SRL vs. ERL) proposed different types of relationships among levels of variables in Personal Self-Regulation (PSR) and Regulatory Teaching (RT) to predict the meta-cognitive, meta-motivational and -emotional variables of learning, as well as Academic Achievement, in Higher Education. The aim of this investigation was to empirically validate the model of the combined effect of low-medium-high levels of PSR and RT on the dependent variables. For the analysis of combinations, a selected sample of 544 undergraduate students from two Spanish universities was used. Data were collected with validated instruments, in Spanish versions. Using an ex-post-facto design, different Univariate and Multivariate Analyses (3 × 1, 3 × 3, and 4 × 1) were conducted. Results provide evidence for a consistent effect of low-medium-high levels of PSR and of RT, thus giving significant partial confirmation of the proposed rational model. As predicted, (1) the levels of PSR and RT positively and significantly affected the levels of learning approaches, resilience, engagement, academic confidence, test anxiety, and procedural and attitudinal academic achievement; (2) the most favorable type of interaction was a high level of PSR combined with a high level of RT. The limitations and implications of these results for the design of effective teaching are analyzed, with a view to improving university teaching-learning processes.

  9. Combined Effect of Levels in Personal Self-Regulation and Regulatory Teaching on Meta-Cognitive, on Meta-Motivational, and on Academic Achievement Variables in Undergraduate Students

    PubMed Central

    de la Fuente, Jesús; Sander, Paul; Martínez-Vicente, José M.; Vera, Mariano; Garzón, Angélica; Fadda, Salvattore

    2017-01-01

    The Theory of Self- vs. Externally-Regulated Learning™ (SRL vs. ERL) proposed different types of relationships among levels of variables in Personal Self-Regulation (PSR) and Regulatory Teaching (RT) to predict the meta-cognitive, meta-motivational and -emotional variables of learning, as well as Academic Achievement, in Higher Education. The aim of this investigation was to empirically validate the model of the combined effect of low-medium-high levels of PSR and RT on the dependent variables. For the analysis of combinations, a selected sample of 544 undergraduate students from two Spanish universities was used. Data were collected with validated instruments, in Spanish versions. Using an ex-post-facto design, different Univariate and Multivariate Analyses (3 × 1, 3 × 3, and 4 × 1) were conducted. Results provide evidence for a consistent effect of low-medium-high levels of PSR and of RT, thus giving significant partial confirmation of the proposed rational model. As predicted, (1) the levels of PSR and RT positively and significantly affected the levels of learning approaches, resilience, engagement, academic confidence, test anxiety, and procedural and attitudinal academic achievement; (2) the most favorable type of interaction was a high level of PSR combined with a high level of RT. The limitations and implications of these results for the design of effective teaching are analyzed, with a view to improving university teaching-learning processes. PMID:28280473

  10. Reduction of tablet weight variability by optimizing paddle speed in the forced feeder of a high-speed rotary tablet press.

    PubMed

    Peeters, Elisabeth; De Beer, Thomas; Vervaet, Chris; Remon, Jean-Paul

    2015-04-01

    Tableting is a complex process due to the large number of process parameters that can be varied. Knowledge and understanding of the influence of these parameters on final product quality are of great importance for the industry, allowing economic efficiency and parametric release. The aim of this study was to investigate the influence of paddle speed and fill depth, at different tableting speeds, on the weight and weight variability of tablets. Two excipients possessing different flow behavior, microcrystalline cellulose (MCC) and dibasic calcium phosphate dihydrate (DCP), were selected as model powders. Tablets were manufactured on a high-speed rotary tablet press using design of experiments (DoE). The volume of powder in the forced feeder was also measured during each experiment. Analysis of the DoE revealed that paddle speed is of minor importance for tablet weight but significantly affects the volume of powder inside the feeder for powders with excellent flowability (DCP). The opposite effect of paddle speed was observed for fairly flowing powders (MCC). Tableting speed played a role in weight and weight variability, whereas changing the fill depth exclusively influenced tablet weight. The DoE approach allowed prediction of the optimum combination of process parameters leading to minimum tablet weight variability. Monte Carlo simulations allowed assessing the probability of exceeding the acceptable response limits if factor settings were varied around their optimum. This multi-dimensional combination and interaction of input variables, leading to response criteria met with acceptable probability, reflects the design space.
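
    The sketch below illustrates the Monte Carlo step in a hedged way: the quadratic model, factor names, optimum and acceptance limit are placeholders, not the fitted model from the study. Factor settings are perturbed around their set-points and the probability of exceeding a weight-variability limit is estimated from the simulated responses.

```python
import numpy as np

rng = np.random.default_rng(5)

def predicted_rsd(paddle_speed, fill_depth, tableting_speed):
    """Placeholder quadratic model for tablet weight RSD (%) in coded factors (assumption)."""
    return (1.0 + 0.3 * paddle_speed ** 2
                + 0.2 * (fill_depth - 0.1) ** 2
                + 0.4 * tableting_speed ** 2)

optimum = np.array([0.0, 0.1, 0.0])                  # hypothetical coded optimum
noise = rng.normal(scale=0.2, size=(100_000, 3))     # variation around the set-points
settings = optimum + noise
rsd = predicted_rsd(settings[:, 0], settings[:, 1], settings[:, 2])
print("P(RSD > 1.1 %):", np.mean(rsd > 1.1))
```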

  11. Variability of Massive Young Stellar Objects in Cygnus-X

    NASA Astrophysics Data System (ADS)

    Thomas, Nancy H.; Hora, J. L.; Smith, H. A.

    2013-01-01

    Young stellar objects (YSOs) are stars in the process of formation. Several recent investigations have shown a high rate of photometric variability in YSOs at near- and mid-infrared wavelengths. Theoretical models for the formation of massive stars (1-10 solar masses) remain highly idealized, and little is known about the mechanisms that produce the variability. An ongoing Spitzer Space Telescope program is studying massive star formation in the Cygnus-X region. In conjunction with the Spitzer observations, we have conducted a ground-based near-infrared observing program of the Cygnus-X DR21 field using PAIRITEL, the automated infrared telescope at Whipple Observatory. Using the Stetson index for variability, we identified variable objects, including a number of variable YSOs, in our time-series PAIRITEL data of DR21. We searched for periodicity among the variable objects using the Lomb-Scargle algorithm and identified periodic variable objects with an average period of 8.07 days. Characterization of these variable and periodic objects will help constrain models of star formation. This work is supported in part by the NSF REU and DOD ASSURE programs under NSF grant no. 0754568 and by the Smithsonian Institution.
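
    A minimal sketch of the period search (simulated light curve and astropy's LombScargle; the cadence, amplitude and 8-day period are illustrative assumptions, not the DR21 data):

```python
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(6)
t = np.sort(rng.uniform(0, 120, 300))            # irregular sampling times in days
true_period = 8.0
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / true_period) \
           + rng.normal(scale=0.05, size=t.size)  # toy periodic light curve with noise

frequency, power = LombScargle(t, mag).autopower(maximum_frequency=2.0)
best_period = 1.0 / frequency[np.argmax(power)]
print(f"best period: {best_period:.2f} d")
```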

  12. Reduction of verotoxigenic Escherichia coli by process and recipe optimisation in dry-fermented sausages.

    PubMed

    Heir, E; Holck, A L; Omer, M K; Alvseike, O; Høy, M; Måge, I; Axelsson, L

    2010-07-15

    Outbreaks of verotoxigenic Escherichia coli (VTEC) linked to dry-fermented sausages (DFSs) have emphasized the need for DFS manufacturers to introduce measures to obtain enhanced safety and still maintain the sensory qualities of their products. To our knowledge, no data have yet been reported on non-O157:H7 VTEC survival in DFS. Here, the importance of recipe and process variables on VTEC (O157:H7 and O103:H25) reductions in two types of DFS, morr and salami, was determined through three statistically designed experiments. Linear regression and ANOVA analyses showed that no single variable had a dominant effect on VTEC reductions. High levels of NaCl, NaNO₂, glucose (low pH) and fermentation temperature gave enhanced VTEC reduction, while high fat content and large casing diameter (a_w) gave the opposite effect. Interaction effects were small. The process and recipe variables showed similar effects in morr and salami. In general, recipes combining high batter levels of salt (NaCl and NaNO₂) and glucose with high fermentation temperature, giving DFS with low final pH and a_w, provided approximately 3 log₁₀ reductions, compared to approximately 1.5 log₁₀ reductions obtained for standard-recipe DFS. Storage at 4 °C for 2 months provided an additional 0.33-0.95 log₁₀ VTEC reduction and was only marginally affected by recipe type. Sensory tests revealed only small differences between the various recipes of morr and salami. By optimisation of recipe and process parameters, it is possible to obtain increased microbial safety of DFS while maintaining the sensory qualities of the sausages. 2010 Elsevier B.V. All rights reserved.

  13. Proteomic analysis of duck fatty liver during post-mortem storage related to the variability of fat loss during cooking of "foie gras".

    PubMed

    Theron, Laetitia; Fernandez, Xavier; Marty-Gasset, Nathalie; Chambon, Christophe; Viala, Didier; Pichereaux, Carole; Rossignol, Michel; Astruc, Thierry; Molette, Caroline

    2013-01-30

    Fat loss during cooking of duck "foie gras" is the main problem for both manufacturers and consumers. Despite the efforts of the processing industry to control fat loss, the variability of fatty liver cooking yields remains high and uncontrolled. To understand the biochemical effects of post-slaughter processing on fat loss during cooking, this study characterizes for the first time the protein expression of fatty liver during chilling using a proteomic approach. For this purpose the proteins were separated according to their solubility: the protein fraction soluble in a buffer of low ionic strength (S) and the protein fraction insoluble in the same buffer (IS). Two-dimensional electrophoresis was used to analyze the S fraction, with mass spectrometry for the identification of spots of interest. This analysis revealed 36 (21 identified proteins) and 34 (26 identified proteins) spots of interest in the low-fat-loss and high-fat-loss groups, respectively. The expression of proteins was lower after chilling, revealing a suppressive effect of chilling on biological processes. A shotgun strategy was used to analyze the IS fraction, with the identification of all proteins by mass spectrometry. This allowed the identification of 554 and 562 proteins in the low-fat-loss and high-fat-loss groups, respectively. Among these proteins, only those up-regulated in the high-fat-loss group were significant (p value = 3.17 × 10⁻³), and they corresponded to cytoskeletal proteins and their associated proteins. Taken together, these results suggest that the variability of technological yield observed in processing plants could be explained by different aging states of fatty livers during chilling, most likely associated with different proteolytic patterns.

  14. Optimization of tocopherol concentration process from soybean oil deodorized distillate using response surface methodology.

    PubMed

    Ito, Vanessa Mayumi; Batistella, César Benedito; Maciel, Maria Regina Wolf; Maciel Filho, Rubens

    2007-04-01

    Soybean oil deodorized distillate is a product derived from the refining process and is rich in high value-added products. The recovery of these unsaponifiable fractions is of great commercial interest because in many cases the "valuable products" have vitamin activity, such as the tocopherols (vitamin E), or anticarcinogenic properties, such as the sterols. Molecular distillation has great potential for concentrating tocopherols, as it uses very low temperatures owing to the high vacuum and short operating times for separation, and it does not use solvents; it can therefore be used to separate and purify thermosensitive materials such as vitamins. In this work, the molecular distillation process was applied to tocopherol concentration, and response surface methodology was used to optimize free fatty acid (FFA) elimination and tocopherol concentration in the residue and distillate streams, both of which are products of the molecular distiller. The independent variables studied were feed flow rate (F) and evaporator temperature (T), because previous experience indicates that they are the most important process variables. The experimental range was 4-12 mL/min for F and 130-200 °C for T. Feed flow rate and evaporator temperature are both important operating variables in FFA elimination. To decrease the loss of FFA to the residue stream, the operating range should be shifted toward higher evaporator temperature and lower feed flow rate; the distillate-to-feed (D/F) ratio likewise increases with increasing evaporator temperature and decreasing feed flow rate. A high concentration of tocopherols was obtained in the residue stream at low feed flow rates and high evaporator temperatures. These conclusions were drawn from experiments based on a statistical experimental design.
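
    As a rough illustration of the response-surface step described above, the Python sketch below fits a quadratic surface in feed flow rate and evaporator temperature and searches it for an optimum inside the stated experimental range. The data points and fitted coefficients are hypothetical placeholders, not the study's measurements.

      import numpy as np

      # Hypothetical runs: feed flow F (mL/min), evaporator temperature T (degC),
      # response y = tocopherol concentration in the residue stream (arbitrary units)
      F = np.array([4, 4, 12, 12, 4, 12, 8, 8, 8], dtype=float)
      T = np.array([130, 200, 130, 200, 165, 165, 130, 200, 165], dtype=float)
      y = np.array([8.1, 11.9, 6.3, 9.8, 10.0, 8.2, 7.0, 10.9, 9.1])   # assumed measurements

      # Quadratic response surface: y ~ b0 + b1*F + b2*T + b3*F^2 + b4*T^2 + b5*F*T
      X = np.column_stack([np.ones_like(F), F, T, F**2, T**2, F * T])
      b, *_ = np.linalg.lstsq(X, y, rcond=None)

      # Evaluate the fitted surface on a grid inside the experimental region and report its maximum
      Fg, Tg = np.meshgrid(np.linspace(4, 12, 81), np.linspace(130, 200, 71))
      yg = b[0] + b[1]*Fg + b[2]*Tg + b[3]*Fg**2 + b[4]*Tg**2 + b[5]*Fg*Tg
      i = np.unravel_index(np.argmax(yg), yg.shape)
      print(f"predicted optimum: F = {Fg[i]:.1f} mL/min, T = {Tg[i]:.0f} degC, y = {yg[i]:.2f}")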

  15. Dimensional control of die castings

    NASA Astrophysics Data System (ADS)

    Karve, Aniruddha Ajit

    The demand for net shape die castings, which require little or no machining, is steadily increasing. Stringent customer requirements are forcing die casters to deliver high quality castings in increasingly short lead times. Dimensional conformance to customer specifications is an inherent part of die casting quality. The dimensional attributes of a die casting depend upon many factors, the quality of the die and the degree of control over the process variables being the two major sources of dimensional error in die castings. This study focused on investigating the nature and the causes of dimensional error in die castings. The two major components of dimensional error, i.e., dimensional variability and die allowance, were studied. The major effort of this study was to qualitatively and quantitatively assess the effects of casting geometry and process variables on die casting dimensional variability and die allowance. This was accomplished by detailed dimensional data collection at production die casting sites. Robust feature characterization schemes were developed to describe complex casting geometry in quantitative terms. Empirical modeling was utilized to quantify the effects of the casting variables on dimensional variability and die allowance for die casting features. A number of casting geometry and process variables were found to affect dimensional variability in die castings. The dimensional variability was evaluated by comparison with current published dimensional tolerance standards. The casting geometry was found to play a significant role in influencing the die allowance of the features measured. The predictive models developed for dimensional variability and die allowance were evaluated to test their effectiveness. Finally, the relative impact of all the components of dimensional error in die castings was put into perspective, and general guidelines for effective dimensional control in the die casting plant were laid out. The results of this study will contribute to the enhancement of dimensional quality and to lead-time compression in the die casting industry, thus making it competitive with other net shape manufacturing processes.

  16. High-frequency, long-duration water sampling in acid mine drainage studies: a short review of current methods and recent advances in automated water samplers

    USGS Publications Warehouse

    Chapin, Thomas

    2015-01-01

    Hand-collected grab samples are the most common water sampling method but using grab sampling to monitor temporally variable aquatic processes such as diel metal cycling or episodic events is rarely feasible or cost-effective. Currently available automated samplers are a proven, widely used technology and typically collect up to 24 samples during a deployment. However, these automated samplers are not well suited for long-term sampling in remote areas or in freezing conditions. There is a critical need for low-cost, long-duration, high-frequency water sampling technology to improve our understanding of the geochemical response to temporally variable processes. This review article will examine recent developments in automated water sampler technology and utilize selected field data from acid mine drainage studies to illustrate the utility of high-frequency, long-duration water sampling.

  17. Counter-propagation network with variable degree variable step size LMS for single switch typing recognition.

    PubMed

    Yang, Cheng-Huei; Luo, Ching-Hsing; Yang, Cheng-Hong; Chuang, Li-Yeh

    2004-01-01

    Morse code is now being harnessed for use in rehabilitation applications of augmentative-alternative communication and assistive technology, including mobility, environmental control and adapted worksite access. In this paper, Morse code is selected as an adaptive communication device for disabled persons who suffer from muscle atrophy, cerebral palsy or other severe handicaps. A stable typing rate is strictly required for Morse code to be effective as a communication tool, and this restriction is a major hindrance. Therefore, a switch-adaptive automatic recognition method with a high recognition rate is needed. The proposed system combines counter-propagation networks with a variable degree variable step size LMS algorithm. It is divided into five stages: space recognition, tone recognition, learning process, adaptive processing, and character recognition. Statistical analyses demonstrated that the proposed method elicited a better recognition rate than alternative methods in the literature.
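
    For readers unfamiliar with adaptive step-size filtering, the sketch below shows a generic variable step size LMS update (in the style of Kwong and Johnston), where the step grows with the squared error and shrinks as the filter converges. It is only a plain VS-LMS illustration, not the paper's "variable degree variable step size" variant or its counter-propagation stages, and the channel-identification example is hypothetical.

      import numpy as np

      def vs_lms(x, d, n_taps=8, mu0=0.01, alpha=0.97, gamma=5e-4, mu_min=1e-4, mu_max=0.05):
          """Generic variable step size LMS: mu is adapted from the instantaneous squared error."""
          w = np.zeros(n_taps)
          mu = mu0
          y = np.zeros(len(x))
          for n in range(n_taps - 1, len(x)):
              u = x[n - n_taps + 1:n + 1][::-1]     # most recent samples first
              y[n] = w @ u
              e = d[n] - y[n]
              mu = np.clip(alpha * mu + gamma * e**2, mu_min, mu_max)
              w += mu * e * u                       # LMS weight update with adaptive step size
          return w, y

      # Toy usage: identify a short FIR channel from noisy observations
      rng = np.random.default_rng(1)
      x = rng.normal(size=4000)
      h = np.array([0.6, -0.3, 0.1])
      d = np.convolve(x, h)[:len(x)] + 0.01 * rng.normal(size=len(x))
      w, _ = vs_lms(x, d)
      print("estimated leading taps:", np.round(w[:3], 3))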

  18. Identifying the interferences of irrigation on evapotranspiration variability over the Northern High Plains

    NASA Astrophysics Data System (ADS)

    Zeng, R.; Cai, X.

    2016-12-01

    Irrigation has considerably interfered with hydrological processes in arid and semi-arid areas with heavy irrigated agriculture. With the increasing demand for food production and the rising evaporative demand due to climate change, irrigation water consumption is expected to increase, which would aggravate the interference with hydrologic processes. Current studies focus on the impact of irrigation on the mean value of evapotranspiration (ET) at either the local or the regional scale; however, how irrigation changes the variability of ET has not been well understood. This study analyzes the impact of extensive irrigation on ET variability in the Northern High Plains. We apply an ET variance decomposition framework developed from our previous work to quantify the effects of both climate and irrigation on ET variance in Northern High Plains watersheds. Based on climate and water table observations, we assess the monthly ET variance and its components for two periods: the 1930s-1960s, with less irrigation development, and the 1970s-2010s, with more development. It is found that irrigation not only caused the well-recognized groundwater drawdown and stream depletion problems in the region, but also buffered ET variance from climatic fluctuations. In addition to increasing food productivity, irrigation also stabilizes crop yield by mitigating the impact of hydroclimatic variability. With complementary water supply from irrigation, ET often approaches the potential ET, and thus the observed ET variance is attributed more to climatic variables, especially temperature; meanwhile, irrigation causes significant seasonal fluctuations in groundwater storage. For sustainable water resources management in the Northern High Plains, we argue that both the mean value and the variance of ET should be considered together in regulating irrigation in this region.

  19. Influences on Academic Achievement Across High and Low Income Countries: A Re-Analysis of IEA Data.

    ERIC Educational Resources Information Center

    Heyneman, S.; Loxley, W.

    Previous international studies of science achievement put the data through a process of winnowing to decide which variables to keep in the final regressions. Variables were allowed to enter the final regressions if they met a minimum beta coefficient criterion of 0.05 averaged across rich and poor countries alike. The criterion was an average…

  20. Sharpening method of satellite thermal image based on the geographical statistical model

    NASA Astrophysics Data System (ADS)

    Qi, Pengcheng; Hu, Shixiong; Zhang, Haijun; Guo, Guangmeng

    2016-04-01

    To improve the effectiveness of thermal sharpening in mountainous regions, while paying closer attention to the laws of land surface energy balance, a thermal sharpening method based on a geographical statistical model (GSM) is proposed. Explanatory variables were selected from the processes of the land surface energy budget and thermal infrared electromagnetic radiation transmission, and high spatial resolution (57 m) raster layers were then generated for these variables through spatial simulation or by using other raster data as proxies. Based on this, the locally adapted statistical relationship between brightness temperature (BT) and the explanatory variables, i.e., the GSM, was built at 1026-m resolution using multivariate adaptive regression splines. Finally, the GSM was applied to the high-resolution (57-m) explanatory variables; thus, the high-resolution (57-m) BT image was obtained. This method produced a sharpening result with low error and good visual quality. The method avoids a blind choice of explanatory variables and removes the dependence on synchronous imagery at visible and near-infrared bands. The influences of the explanatory-variable combination, the sampling method, and the residual error correction on the sharpening results were analyzed in detail, and their influence mechanisms are reported herein.
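
    The workflow described above (fit a statistical relationship at coarse resolution, then apply it to fine-resolution explanatory layers) can be sketched as follows. The paper uses multivariate adaptive regression splines; because MARS is not available in scikit-learn, the sketch substitutes a gradient-boosted regressor as a stand-in, and all raster values are synthetic placeholders rather than real 1026-m or 57-m layers.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(0)

      # Hypothetical explanatory layers (e.g., elevation, a solar-radiation proxy, a vegetation proxy)
      # flattened to (n_pixels, n_variables) at coarse and fine resolution.
      X_coarse = rng.normal(size=(2000, 3))
      bt_coarse = (290 + 4*X_coarse[:, 0] - 2*X_coarse[:, 1] + 1.5*X_coarse[:, 2]
                   + rng.normal(0, 0.5, 2000))          # simulated coarse-resolution BT (K)
      X_fine = rng.normal(size=(50000, 3))              # stands in for the 57-m raster stack

      # Fit the coarse-resolution statistical model (stand-in for MARS),
      # then apply it to the fine-resolution explanatory variables to sharpen BT.
      model = GradientBoostingRegressor(random_state=0).fit(X_coarse, bt_coarse)
      bt_sharpened = model.predict(X_fine)
      print("sharpened BT range (K): %.1f - %.1f" % (bt_sharpened.min(), bt_sharpened.max()))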

  1. Blazar Variability from Turbulence in Jets Launched by Magnetically Arrested Accretion Flows

    NASA Astrophysics Data System (ADS)

    O' Riordan, Michael; Pe'er, Asaf; McKinney, Jonathan C.

    2017-07-01

    Blazars show variability on timescales ranging from minutes to years, the former being comparable to, and in some cases even shorter than, the light-crossing time of the central black hole. The observed γ-ray light curves can be described by a power-law power density spectrum (PDS), with a similar index for both BL Lacs and flat-spectrum radio quasars. We show that this variability can be produced by turbulence in relativistic jets launched by magnetically arrested accretion flows (MADs). We perform radiative transport calculations on the turbulent, highly magnetized jet-launching region of a MAD with a rapidly rotating supermassive black hole. The resulting synchrotron and synchrotron self-Compton emission, originating from close to the black hole horizon, is highly variable. This variability is characterized by a PDS that is remarkably similar to the observed power-law spectrum at frequencies below a few per day. Furthermore, turbulence in the jet-launching region naturally produces fluctuations in the plasma on scales much smaller than the horizon radius. We speculate that similar turbulent processes, operating in the jet at large radii (and therefore at high bulk Lorentz factor), are responsible for blazar variability over many decades in frequency, including on minute timescales.
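
    As a small illustration of how a power-law PDS index is typically estimated from a light curve, the Python sketch below computes a periodogram of a toy red-noise series and fits the log-log slope. The series, cadence and expected slope are illustrative only and are unrelated to the simulations in the paper.

      import numpy as np
      from scipy.signal import periodogram

      # Toy red-noise light curve standing in for a variable gamma-ray flux series
      rng = np.random.default_rng(2)
      n, dt = 4096, 600.0                       # 4096 samples at 10-minute cadence (seconds)
      flux = np.cumsum(rng.normal(size=n))      # random walk: expected PDS ~ f^-2

      f, pxx = periodogram(flux, fs=1.0 / dt)
      mask = f > 0                              # drop the zero-frequency bin before taking logs
      slope, _ = np.polyfit(np.log10(f[mask]), np.log10(pxx[mask]), 1)
      print(f"fitted PDS power-law index: {slope:.2f} (random-walk expectation is about -2)")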

  2. Performance of an ultrafiltration membrane bioreactor (UF-MBR) as a processing strategy for the synthesis of galacto-oligosaccharides at high substrate concentrations.

    PubMed

    Córdova, Andrés; Astudillo, Carolina; Vera, Carlos; Guerrero, Cecilia; Illanes, Andrés

    2016-04-10

    The performance of an ultrafiltration membrane bioreactor for galacto-oligosaccharide (GOS) synthesis using high lactose concentrations (470 g/L) and β-galactosidase from Aspergillus oryzae was assessed. The processing variables tested were transmembrane pressure (PT), crossflow velocity (CFV) and temperature. Results showed that the processing variables had a significant effect on the yield, the enzyme productivity and the flux, but not on the GOS concentration or the reaction conversion obtained. As expected, high turbulence improved mass transfer and reduced membrane fouling, but very high crossflow velocities caused operational instability due to vortex formation and lactose precipitation. A desirability function was used to determine the optimal conditions, which were: PT (4.38 bar), CFV (7.35 m/s) and temperature (53.1 °C), simultaneously optimizing flux and specific enzyme productivity. Under these optimal processing conditions, shear stress and temperature did not affect the enzyme, but long-term operation was limited by flux decay. In comparison to a conventional batch system, at 12.5 h of processing time, continuous GOS synthesis in the UF-MBR significantly increased the amount of processed substrate and gave a 2.44-fold increase in the amount of GOS produced per unit mass of catalyst. Furthermore, these results can be improved substantially by tuning the membrane area/reaction volume ratio, showing that the UF-MBR is an attractive alternative for GOS synthesis at very high lactose concentrations. Copyright © 2016 Elsevier B.V. All rights reserved.
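
    The "desirability function" optimization mentioned above combines several responses into a single score between 0 and 1. Below is a minimal Derringer-Suich-style sketch for two larger-is-better responses (flux and specific enzyme productivity); the candidate operating points, response values and acceptable ranges are all hypothetical.

      import numpy as np

      def d_larger_is_better(y, lo, hi, weight=1.0):
          """Desirability for a response to be maximized: 0 at/below lo, 1 at/above hi."""
          return np.clip((y - lo) / (hi - lo), 0.0, 1.0) ** weight

      # Hypothetical responses predicted at three candidate (PT, CFV, T) operating points
      flux = np.array([12.0, 18.5, 15.2])           # L/(m2 h), assumed acceptable range 10-20
      productivity = np.array([0.8, 0.6, 0.9])      # g GOS per g enzyme per h, assumed range 0.5-1.0

      # Overall desirability is the geometric mean of the individual desirabilities
      D = np.sqrt(d_larger_is_better(flux, 10, 20) * d_larger_is_better(productivity, 0.5, 1.0))
      print("overall desirability per candidate:", np.round(D, 2))
      print("best candidate index:", int(np.argmax(D)))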

  3. A model for the flux-r.m.s. correlation in blazar variability or the minijets-in-a-jet statistical model

    NASA Astrophysics Data System (ADS)

    Biteau, J.; Giebels, B.

    2012-12-01

    Very high energy gamma-ray variability of blazar emission remains of puzzling origin. Fast flux variations down to the minute time scale, as observed with H.E.S.S. during flares of the blazar PKS 2155-304, suggest that the variability originates from the jet, where Doppler boosting can be invoked to relax causal constraints on the size of the emission region. The observation of log-normality in the flux distributions should rule out additive processes, such as those resulting from uncorrelated multiple-zone emission models, and favour an origin of the variability in multiplicative processes not unlike those observed in a broad class of accreting systems. We show, using a simple kinematic model, that Doppler boosting of randomly oriented emitting regions generates flux distributions following a Pareto law, that the linear flux-r.m.s. relation found for a single zone holds for a large number of emitting regions, and that the skewed distribution of the total flux is close to a log-normal, despite arising from an additive process.
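
    The kinematic argument above is easy to reproduce numerically: sample isotropic orientations, boost each zone's flux by its Doppler factor, and compare the single-zone and summed distributions. The Python sketch below does that for an assumed bulk Lorentz factor and a delta**4 boosting index; both choices are illustrative, not the paper's.

      import numpy as np

      rng = np.random.default_rng(3)
      gamma = 10.0                                  # assumed bulk Lorentz factor of the emitting regions
      beta = np.sqrt(1.0 - 1.0 / gamma**2)

      def total_boosted_flux(n_zones, n_samples):
          """Total flux from n_zones randomly oriented regions, each boosted by delta**4."""
          cos_theta = rng.uniform(-1.0, 1.0, size=(n_samples, n_zones))    # isotropic orientations
          delta = 1.0 / (gamma * (1.0 - beta * cos_theta))                 # Doppler factor
          return (delta**4).sum(axis=1)

      single = total_boosted_flux(1, 200000)
      many = total_boosted_flux(100, 20000)

      # Heavy, Pareto-like tail for a single zone; roughly log-normal-looking total for many zones
      print("single zone: median %.2e, 99.9th percentile %.2e"
            % (np.median(single), np.quantile(single, 0.999)))
      log_many = np.log(many)
      skew = np.mean((log_many - log_many.mean())**3) / log_many.std()**3
      print("100 zones: skewness of log(total flux) = %.2f (near 0 suggests approx. log-normal)" % skew)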

  4. Accounting for Advanced High School Coursework in College Admission Decisions

    ERIC Educational Resources Information Center

    Sadler, Philip M.; Tai, Robert H.

    2007-01-01

    The purpose of the current study is to investigate the feasibility of accounting for student performance in advanced high school coursework through the adjustment of high school grade point average (HSGPA) while separating out variables that are independently considered in the admission process, e.g., SAT/ACT scores, community affluence, type of…

  5. Climate SPHINX: High-resolution present-day and future climate simulations with an improved representation of small-scale variability

    NASA Astrophysics Data System (ADS)

    Davini, Paolo; von Hardenberg, Jost; Corti, Susanna; Subramanian, Aneesh; Weisheimer, Antje; Christensen, Hannah; Juricke, Stephan; Palmer, Tim

    2016-04-01

    The PRACE Climate SPHINX project investigates the sensitivity of climate simulations to model resolution and stochastic parameterization. The EC-Earth Earth-System Model is used to explore the impact of stochastic physics in 30-year climate integrations as a function of model resolution (from 80 km down to 16 km for the atmosphere). The experiments include more than 70 simulations in both a historical scenario (1979-2008) and a climate change projection (2039-2068), using RCP8.5 CMIP5 forcing. A total of 20 million core hours will have been used by the end of the project (March 2016), and about 150 TBytes of post-processed data will be available to the climate community. Preliminary results show a clear improvement in the representation of climate variability over the Euro-Atlantic sector with increasing resolution. More specifically, the well-known negative atmospheric blocking bias over Europe is resolved. High-resolution runs also show improved fidelity in the representation of tropical variability - such as the MJO and its propagation - compared with the low-resolution simulations. It is shown that including stochastic parameterization in the low-resolution runs helps to improve some aspects of the MJO propagation further. These findings show the importance of representing the impact of small-scale processes on large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).

  6. Research in Stochastic Processes.

    DTIC Science & Technology

    1983-10-01

    increases. A more detailed investigation of the exceedances themselves (rather than just the cluster centers) was undertaken, together with J. Hüsler and... J. Hüsler and M.R. Leadbetter, Compound Poisson limit theorems for high level exceedances by stationary sequences, Center for Stochastic Processes... stability by a random linear operator. C.D. Hardin, General (asymmetric) stable variables and processes. T. Hsing, J. Hüsler and M.R. Leadbetter, Compound

  7. A multi-fidelity framework for physics based rotor blade simulation and optimization

    NASA Astrophysics Data System (ADS)

    Collins, Kyle Brian

    New helicopter rotor designs are desired that offer increased efficiency, reduced vibration, and reduced noise. Rotor Designers in industry need methods that allow them to use the most accurate simulation tools available to search for these optimal designs. Computer based rotor analysis and optimization have been advanced by the development of industry standard codes known as "comprehensive" rotorcraft analysis tools. These tools typically use table look-up aerodynamics, simplified inflow models and perform aeroelastic analysis using Computational Structural Dynamics (CSD). Due to the simplified aerodynamics, most design studies are performed varying structural related design variables like sectional mass and stiffness. The optimization of shape related variables in forward flight using these tools is complicated and results are viewed with skepticism because rotor blade loads are not accurately predicted. The most accurate methods of rotor simulation utilize Computational Fluid Dynamics (CFD) but have historically been considered too computationally intensive to be used in computer based optimization, where numerous simulations are required. An approach is needed where high fidelity CFD rotor analysis can be utilized in a shape variable optimization problem with multiple objectives. Any approach should be capable of working in forward flight in addition to hover. An alternative is proposed and founded on the idea that efficient hybrid CFD methods of rotor analysis are ready to be used in preliminary design. In addition, the proposed approach recognizes the usefulness of lower fidelity physics based analysis and surrogate modeling. Together, they are used with high fidelity analysis in an intelligent process of surrogate model building of parameters in the high fidelity domain. Closing the loop between high and low fidelity analysis is a key aspect of the proposed approach. This is done by using information from higher fidelity analysis to improve predictions made with lower fidelity models. This thesis documents the development of automated low and high fidelity physics based rotor simulation frameworks. The low fidelity framework uses a comprehensive code with simplified aerodynamics. The high fidelity model uses a parallel processor capable CFD/CSD methodology. Both low and high fidelity frameworks include an aeroacoustic simulation for prediction of noise. A synergistic process is developed that uses both the low and high fidelity frameworks together to build approximate models of important high fidelity metrics as functions of certain design variables. To test the process, a 4-bladed hingeless rotor model is used as a baseline. The design variables investigated include tip geometry and spanwise twist distribution. Approximation models are built for metrics related to rotor efficiency and vibration using the results from 60+ high fidelity (CFD/CSD) experiments and 400+ low fidelity experiments. Optimization using the approximation models found the Pareto Frontier anchor points, or the design having maximum rotor efficiency and the design having minimum vibration. Various Pareto generation methods are used to find designs on the frontier between these two anchor designs. When tested in the high fidelity framework, the Pareto anchor designs are shown to be very good designs when compared with other designs from the high fidelity database. This provides evidence that the process proposed has merit. 
Ultimately, this process can be utilized by industry rotor designers with their existing tools to bring high fidelity analysis into the preliminary design stage of rotors. In conclusion, the methods developed and documented in this thesis have made several novel contributions. First, an automated high fidelity CFD based forward flight simulation framework has been built for use in preliminary design optimization. The framework was built around an integrated, parallel processor capable CFD/CSD/AA process. Second, a novel method of building approximate models of high fidelity parameters has been developed. The method uses a combination of low and high fidelity results and combines Design of Experiments, statistical effects analysis, and aspects of approximation model management. And third, the determination of rotor blade shape variables through optimization using CFD based analysis in forward flight has been performed. This was done using the high fidelity CFD/CSD/AA framework and method mentioned above. While the low and high fidelity prediction methods used in the work still have inaccuracies that can affect the absolute levels of the results, a framework has been successfully developed and demonstrated that allows for an efficient process to improve rotor blade designs in terms of a selected choice of objective function(s). Using engineering judgment, this methodology could be applied today to investigate opportunities to improve existing designs. With improvements in the low and high fidelity prediction components that will certainly occur, this framework could become a powerful tool for future rotorcraft design work. (Abstract shortened by UMI.)
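
    One common way to "close the loop" between fidelities, in the spirit of the framework described above, is an additive-correction surrogate: fit a cheap surrogate to many low-fidelity runs and a second model to the high-minus-low discrepancy at the few high-fidelity points. The Python sketch below shows that idea on a made-up one-dimensional design variable; the functions, sample counts and Gaussian-process settings are assumptions for illustration and do not reproduce the thesis's CFD/CSD framework.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(4)

      # Hypothetical 1-D design variable (e.g., a normalized twist parameter) and two fidelities
      def low_fi(x):  return np.sin(8 * x) + 0.4 * x                    # cheap comprehensive-code metric
      def high_fi(x): return low_fi(x) + 0.3 * (x - 0.5)**2 - 0.05      # expensive CFD/CSD-like metric

      x_lo = rng.uniform(0, 1, 400)[:, None]      # many cheap low-fidelity runs
      x_hi = np.linspace(0, 1, 12)[:, None]       # a dozen expensive high-fidelity runs

      # Surrogate of the low-fidelity metric, then a second GP for the high-minus-low discrepancy
      gp_lo = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-6).fit(x_lo, low_fi(x_lo).ravel())
      gp_dc = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-6).fit(
          x_hi, high_fi(x_hi).ravel() - gp_lo.predict(x_hi))

      x_test = np.linspace(0, 1, 5)[:, None]
      pred = gp_lo.predict(x_test) + gp_dc.predict(x_test)
      print("abs error vs high fidelity:", np.round(np.abs(pred - high_fi(x_test).ravel()), 4))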

  8. Effects of process variables and kinetics on the degradation of 2,4-dichlorophenol using advanced reduction processes (ARP).

    PubMed

    Yu, Xingyue; Cabooter, Deirdre; Dewil, Raf

    2018-05-24

    This study investigates the efficiency and kinetics of 2,4-DCP degradation via advanced reduction processes (ARP). Using UV light as the activation method, the highest degradation efficiency of 2,4-DCP was obtained when using sulphite as the reducing agent. The highest degradation efficiency was observed under alkaline conditions (pH = 10.0), for high sulphite dosage and UV intensity, and low 2,4-DCP concentration. For all process conditions, first-order reaction rate kinetics were applicable. A quadratic polynomial equation fitted by a Box-Behnken design was used as a statistical model and proved to be precise and reliable in describing the significance of the different process variables. The analysis of variance demonstrated that the experimental results were in good agreement with the predicted model (R² = 0.9343), and solution pH, sulphite dose and UV intensity were found to be key process variables in the sulphite/UV ARP. Consequently, the present study provides a promising approach for the efficient degradation of 2,4-DCP with fast degradation kinetics. Copyright © 2018 Elsevier B.V. All rights reserved.
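
    Since the abstract notes that first-order kinetics applied under all conditions, a minimal example of extracting a pseudo-first-order rate constant from a concentration-time profile may be useful. The time points and concentrations below are invented for illustration; they are not the study's data.

      import numpy as np

      # Hypothetical 2,4-DCP concentrations during a sulphite/UV run (first-order decay assumed)
      t = np.array([0, 5, 10, 20, 30, 45, 60], dtype=float)            # min
      c = np.array([20.0, 15.6, 12.3, 7.5, 4.7, 2.2, 1.1])             # mg/L (illustrative)

      # Pseudo-first-order kinetics: ln(C/C0) = -k t, so k is the negative slope of ln C versus t
      k = -np.polyfit(t, np.log(c / c[0]), 1)[0]
      half_life = np.log(2) / k
      print(f"k = {k:.3f} 1/min, half-life = {half_life:.1f} min")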

  9. Expanding the Political Philosophy Dimension of the RISP Model: Examining the Conditional Indirect Effects of Cultural Cognition.

    PubMed

    Hmielowski, Jay D; Wang, Meredith Y; Donaway, Rebecca R

    2018-04-25

    This article attempts to connect literatures from the Risk Information Seeking and Processing (RISP) model and cultural cognition theory. We do this by assessing the relationship between the two prominent cultural cognition variables (i.e., group and grid) and risk perceptions. We then examine whether these risk perceptions are associated with three outcomes important to the RISP model: information seeking, systematic processing, and heuristic processing, through a serial mediation model. We used 2015 data collected from 10 communities across the United States to test our hypotheses. Our results show that people high on group and low on grid (egalitarian communitarians) show greater risk perceptions regarding water quality issues. Moreover, these higher levels of perceived risk translate into increased information seeking, systematic processing of information, and lower heuristic processing through intervening variables from the RISP model (e.g., negative emotions and information insufficiency). These results extend the extant literature by expanding on the treatment of political ideology within the RISP model literature and taking a more nuanced approach to political beliefs in accordance with the cultural cognitions literature. Our article also expands on the RISP literature by looking at information-processing variables. © 2018 Society for Risk Analysis.

  10. All varieties of encoding variability are not created equal: Separating variable processing from variable tasks

    PubMed Central

    Huff, Mark J.; Bodner, Glen E.

    2014-01-01

    Whether encoding variability facilitates memory is shown to depend on whether item-specific and relational processing are both performed across study blocks, and whether study items are weakly versus strongly related. Variable-processing groups studied a word list once using an item-specific task and once using a relational task. Variable-task groups’ two different study tasks recruited the same type of processing each block. Repeated-task groups performed the same study task each block. Recall and recognition were greatest in the variable-processing group, but only with weakly related lists. A variable-processing benefit was also found when task-based processing and list-type processing were complementary (e.g., item-specific processing of a related list) rather than redundant (e.g., relational processing of a related list). That performing both item-specific and relational processing across trials, or within a trial, yields encoding-variability benefits may help reconcile decades of contradictory findings in this area. PMID:25018583

  11. Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells

    NASA Technical Reports Server (NTRS)

    Miller, L.

    1974-01-01

    A two-year study of the major process variables associated with the manufacturing process for sealed, nickel-cadmium, aerospace cells is summarized. Effort was directed toward identifying the major process variables associated with the manufacturing process, experimentally assessing each variable's effect, and imposing the necessary changes (optimization) and controls for the critical process variables to improve results and uniformity. A critical process variable associated with the sintered nickel plaque manufacturing process was identified as the manual forming operation. Critical process variables identified with the positive electrode impregnation/polarization process were impregnation solution temperature, free acid content, vacuum impregnation, and sintered plaque strength. Positive and negative electrodes were identified as a major source of carbonate contamination in sealed cells.

  12. On the use of internal state variables in thermoviscoplastic constitutive equations

    NASA Technical Reports Server (NTRS)

    Allen, D. H.; Beek, J. M.

    1985-01-01

    The general theory of internal state variables is reviewed with a view to applying it to inelastic metals used in high-temperature environments. In this process, certain constraints and clarifications are made regarding internal state variables. It is shown that the Helmholtz free energy can be utilized to construct constitutive equations which are appropriate for metallic superalloys. Internal state variables are shown to represent locally averaged measures of dislocation arrangement, dislocation density, and intergranular fracture. The internal state variable model is demonstrated to be a suitable framework for comparing several currently proposed models for metals and can therefore be used to exhibit history dependence, nonlinearity, and rate as well as temperature sensitivity.
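
    For readers who want the skeleton of the framework the abstract refers to, the following LaTeX fragment sketches the standard Coleman-Gurtin-style internal state variable formulation: the Helmholtz free energy depends on elastic strain, temperature and a set of internal state variables, stress and entropy follow from its derivatives, and each internal variable obeys its own evolution (growth) law. The symbols are generic; the specific functional forms for superalloys are developed in the paper itself.

      % Helmholtz free energy with elastic strain, temperature, and internal state variables alpha_k
      \psi = \psi\!\left(\varepsilon^{e}_{ij},\, T,\, \alpha_1,\dots,\alpha_n\right)

      % Stress and entropy follow from the free energy
      \sigma_{ij} = \rho\,\frac{\partial \psi}{\partial \varepsilon^{e}_{ij}}, \qquad
      s = -\,\frac{\partial \psi}{\partial T}

      % Each internal state variable evolves by its own rate law, carrying the history
      % dependence, rate sensitivity, and temperature sensitivity of the material
      \dot{\alpha}_k = f_k\!\left(\varepsilon^{e}_{ij},\, T,\, \alpha_1,\dots,\alpha_n\right),
      \qquad k = 1,\dots,n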

  13. Using high-frequency nitrogen and carbon measurements to decouple temporal dynamics of catchment and in-stream transport and reaction processes in a headwater stream

    NASA Astrophysics Data System (ADS)

    Blaen, P.; Riml, J.; Khamis, K.; Krause, S.

    2017-12-01

    Within river catchments across the world, headwater streams represent important sites of nutrient transformation and uptake because of their high rates of microbial community processing and their relative abundance in the landscape. However, separating the combined influence of in-stream transport and reaction processes from the overall catchment response can be difficult due to spatio-temporal variability in nutrient and organic matter inputs, flow regimes, and reaction rates. Recent developments in optical sensor technologies enable high-frequency, in situ nutrient measurements, and thus provide opportunities for greater insight into in-stream processes. Here, we use hourly in-stream nitrate (NO3-N), dissolved organic carbon (DOC) and dissolved oxygen (DO) measurements from paired in situ sensors that bound a 1 km headwater stream reach in a mixed-use catchment in central England. We employ a spectral approach to decompose (1) variances in solute loading from the surrounding landscape, and (2) variances in reach-scale in-stream nutrient transport and reaction processes. In addition, we estimate continuous rates of reach-scale NO3-N and DOC assimilation/dissimilation, ecosystem respiration and primary production. Comparison of these results over a range of hydrological conditions (baseflow, variable storm events) and timescales (event-based, diel, seasonal) facilitates new insights into the physical and biogeochemical processes that drive in-stream nutrient dynamics in headwater streams.

  14. Implementation of in-line infrared monitor in full-scale anaerobic digestion process.

    PubMed

    Spanjers, H; Bouvier, J C; Steenweg, P; Bisschops, I; van Gils, W; Versprille, B

    2006-01-01

    During start up but also during normal operation, anaerobic reactor systems should be run and monitored carefully to secure trouble-free operation, because the process is vulnerable to disturbances such as temporary overloading, biomass wash out and influent toxicity. The present method of monitoring is usually by manual sampling and subsequent laboratory analysis. Data collection, processing and feedback to system operation is manual and ad hoc, and involves high-level operator skills and attention. As a result, systems tend to be designed at relatively conservative design loading rates resulting in significant over-sizing of reactors and thus increased systems cost. It is therefore desirable to have on-line and continuous access to performance data on influent and effluent quality. Relevant variables to indicate process performance include VFA, COD, alkalinity, sulphate, and, if aerobic post-treatment is considered, total nitrogen, ammonia and nitrate. Recently, mid-IR spectrometry was demonstrated on a pilot scale to be suitable for in-line simultaneous measurement of these variables. This paper describes a full-scale application of the technique to test its ability to monitor continuously and without human intervention the above variables simultaneously in two process streams. For VFA, COD, sulphate, ammonium and TKN good agreement was obtained between in-line and manual measurements. During a period of six months the in-line measurements had to be interrupted several times because of clogging. It appeared that the sample pre-treatment unit was not able to cope with high solids concentrations all the time.

  15. Monthly streamflow forecasting using continuous wavelet and multi-gene genetic programming combination

    NASA Astrophysics Data System (ADS)

    Hadi, Sinan Jasim; Tombul, Mustafa

    2018-06-01

    Streamflow is an essential component of the hydrologic cycle at the regional and global scale and the main source of fresh water supply. It is highly associated with natural disasters such as droughts and floods. Therefore, accurate streamflow forecasting is essential. Forecasting streamflow in general, and monthly streamflow in particular, is a complex process that cannot be handled by data-driven models (DDMs) alone and requires pre-processing. Wavelet transformation is a pre-processing technique; however, applying continuous wavelet transformation (CWT) produces many scales that degrade the performance of any DDM because of the high number of redundant variables. This study proposes multigene genetic programming (MGGP) as a selection tool that, after the CWT analysis, selects the important scales to be fed into the artificial neural network (ANN). A basin located in the southeast of Turkey is selected as a case study to demonstrate the forecasting ability of the proposed model. One-month-ahead downstream flow is used as the output, and downstream flow, upstream flow, rainfall, temperature, and potential evapotranspiration with associated lags are used as inputs. Before modeling, wavelet coherence transformation (WCT) analysis was conducted to analyze the relationship between the variables in the time-frequency domain. Several combinations were developed to investigate the effect of the variables on streamflow forecasting. The results indicated a high localized correlation between streamflow and the other variables, especially the upstream flow. In the standalone layout, where the data were fed to the ANN and MGGP without CWT, performance was poor. In the best-scale layout, where the single CWT scale with the highest correlation is chosen and fed to the ANN and MGGP, performance increased slightly. Using the proposed model, performance improved dramatically, particularly in forecasting the peak values, because of the inclusion of several scales in which seasonality and irregularity can be captured. Using hydrological and meteorological variables also improved the ability to forecast the streamflow.
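
    To make the scale-selection idea concrete, the Python sketch below decomposes a synthetic monthly predictor with a continuous wavelet transform (PyWavelets) and ranks the scales by their correlation with the one-month-ahead target. Ranking by correlation is only a simple stand-in for the multigene genetic programming selection used in the paper, and the series are synthetic.

      import numpy as np
      import pywt

      rng = np.random.default_rng(5)

      # Hypothetical monthly upstream-flow series with an annual cycle plus noise
      n = 240
      months = np.arange(n)
      upstream = 50 + 20 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, n)
      target = np.roll(upstream, -1) * 0.8 + rng.normal(0, 4, n)       # 1-month-ahead flow

      # Continuous wavelet transform of the predictor: one coefficient series per scale
      scales = np.arange(1, 33)
      coeffs, freqs = pywt.cwt(upstream, scales, "morl")

      # Rank scales by absolute correlation with the target (stand-in for the MGGP selection step)
      valid = slice(0, n - 1)          # drop the wrapped last sample introduced by np.roll
      corr = [abs(np.corrcoef(coeffs[i, valid], target[valid])[0, 1]) for i in range(len(scales))]
      best = int(np.argmax(corr))
      print(f"most informative scale: {scales[best]} (|r| = {corr[best]:.2f})")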

  16. Potential Impact of North Atlantic Climate Variability on Ocean Biogeochemical Processes

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Muhling, B.; Lee, S. K.; Muller-Karger, F. E.; Enfield, D. B.; Lamkin, J. T.; Roffer, M. A.

    2016-02-01

    Previous studies have shown that upper ocean circulation largely determines primary production in the euphotic layers. Here, a global ocean model with biogeochemistry (GFDL's Modular Ocean Model with TOPAZ biogeochemistry) forced with ERA-Interim is used to simulate the natural variability of biogeochemical processes in the global ocean from 1979 to the present. Preliminary results show that surface chlorophyll is overall underestimated in MOM-TOPAZ, but its spatial pattern is fairly realistic. Relatively high chlorophyll variability is found in the subpolar North Atlantic, the northeastern tropical Atlantic, and the equatorial Atlantic. Further analysis suggests that the chlorophyll variability in the North Atlantic Ocean is affected by long-term climate variability. For the subpolar North Atlantic region, the chlorophyll variability is light-limited and is significantly correlated with the North Atlantic Oscillation. A dipole pattern of chlorophyll variability is found between the northeastern tropical Atlantic and the equatorial Atlantic. For the northeastern North Atlantic, the chlorophyll variability is significantly correlated with the Atlantic Meridional Mode (AMM) and the Atlantic Multidecadal Oscillation (AMO). During the negative phase of the AMM and AMO, increased trade winds in the northeastern North Atlantic can lead to increased upwelling of nutrients. In the equatorial Atlantic region, the chlorophyll variability is largely linked to the Atlantic Niño and the associated equatorial upwelling of nutrients. The potential impact of climate variability on the distribution of pelagic fishes (e.g., yellowfin tuna) is discussed.

  17. Formulation and process factors influencing product quality and in vitro performance of ophthalmic ointments.

    PubMed

    Xu, Xiaoming; Al-Ghabeish, Manar; Rahman, Ziyaur; Krishnaiah, Yellela S R; Yerlikaya, Firat; Yang, Yang; Manda, Prashanth; Hunt, Robert L; Khan, Mansoor A

    2015-09-30

    Owing to its unique anatomical and physiological functions, the ocular surface presents special challenges for both the design and the performance evaluation of ophthalmic ointment drug products formulated with a variety of bases. The current investigation was carried out to understand and identify appropriate in vitro methods for the quality and performance evaluation of ophthalmic ointments, and to study the effect of formulation and process variables on their critical quality attributes (CQAs). The critical formulation variables evaluated were initial API particle size, drug percentage, and mineral oil percentage, while the critical process parameters were mixing rate, temperature, time and cooling rate. The quality and performance attributes investigated were drug assay, content uniformity, API particle size in the ointment, rheological characteristics, in vitro drug release and in vitro transcorneal drug permeation. Using design of experiments (DoE) as well as a novel principal component analysis approach, five of the quality and performance attributes (API particle size, storage modulus of the ointment, high-shear viscosity of the ointment, in vitro drug release constant and in vitro transcorneal drug permeation rate constant) were found to be highly influenced by the formulation, in particular the strength of the API, and to a lesser degree by the processing variables. Correlating the ocular physiology with the physicochemical characteristics of acyclovir ophthalmic ointment suggested that in vitro quality metrics could be a valuable predictor of its in vivo performance. Published by Elsevier B.V.

  18. RAP workshop : Buda-TxAPA, Texas, August 27, 2009.

    DOT National Transportation Integrated Search

    2009-08-27

    Presentation Outline : RAP overview : RAP stockpile survey: state of practice : RAP processing and RAP variability : RAP characterization : RAP mix design : Field performance of Texas high RAP test sections

  19. The variation of polar firn subject to percolation - characterizing processes and glacier mass budget uncertainty using high-resolution instruments

    NASA Astrophysics Data System (ADS)

    Demuth, M. N.; Marshall, H.; Morris, E. M.; Burgess, D. O.; Gray, L.

    2009-12-01

    As the Earth's glaciers and ice sheets are subjected to the effects of recent and predicted warming, the distribution of their glaciological facies zones will alter. Percolation and wet snow facies zones will, in general, move upwards, encroaching upon, for some glacier configurations, regions of dry snow facies. Meltwater percolation and internal accumulation processes that characterize these highly variable facies may confound reliable estimates of surface mass budgets based on traditional point measurements alone. If the extents of these zones are indeed increasing, as has been documented through recent analysis of QuikSCAT data for the ice caps of the Canadian Arctic, then the certainty of glacier mass budget estimates using traditional techniques may be degraded to an as yet unquantified degree. Indeed, the application of remote sensing, in particular repeat altimetry used to retrieve surface mass budget estimates, is also subject to the complexity of glacier facies in terms of their near-surface stratigraphy, density variations and rates of compaction. We first review the problem of measuring glacier mass budgets in the context of nested scales of variability, where the auto-correlation structure varies with the scale of observation. We then consider specifically firn subject to percolation and describe the application of high-resolution instruments to characterize variability at the field scale. The data collected include measurements of micro-topography, snow hardness, and snow density and texture, retrieved using airborne scanning lidar, a snow micro-penetrometer, a neutron probe and ground-penetrating radars. The analysis suggests corresponding scales of correlation with respect to the influence of antecedent conditions (surface roughness and hardness, and stratigraphic variability) and post-depositional processes (percolation and refreezing of surface meltwater).

  20. 28nm node process optimization: a lithography centric view

    NASA Astrophysics Data System (ADS)

    Seltmann, Rolf

    2014-10-01

    Many experts claim that the 28nm technology node will be the most cost-effective technology node forever. This results primarily from the cost of manufacturing, since 28nm is the last true single patterning (SP) node. It is also affected by the dramatic increase in design costs and the limited shrink factor of the following nodes. Thus, it is assumed that this technology will still be alive for many years. To be cost competitive, high yields are mandatory. Meanwhile, leading-edge foundries have optimized the yield of the 28nm node to such a level that it is nearly exclusively defined by random defectivity. However, it was a long way to reach that level. In my talk I will concentrate on the contribution of lithography to this yield learning curve, choosing a critical metal patterning application as the example. I will show what was needed to optimize the process window to a level beyond the usual OPC model work that was common on previous nodes. Reducing the process variability (in particular focus variability) is a complementary need. It will be shown which improvements were needed in tooling, process control and design-mask-wafer interaction to remove all systematic yield detractors. Over the last couple of years new scanner platforms were introduced that were targeted at both better productivity and better parametric performance. But this was not a straightforward path: it took extra effort from the tool suppliers together with the fab to bring the tool variability down to the necessary level. Another important topic for reducing variability is the interaction of wafer non-planarity and lithography optimization. Having accurate knowledge of within-die topography is essential for optimum patterning. By completing both the variability reduction work and the process window enhancement work, we were able to turn the original marginal process budget into a robust positive budget, thus ensuring high yield and low cost.

  1. Collaborative Research: Process-Resolving Decomposition of the Global Temperature Response to Modes of Low Frequency Variability in a Changing Climate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Yi

    2014-11-24

    DOE-GTRC-05596 11/24/2014 Collaborative Research: Process-Resolving Decomposition of the Global Temperature Response to Modes of Low Frequency Variability in a Changing Climate PI: Dr. Yi Deng (PI) School of Earth and Atmospheric Sciences Georgia Institute of Technology 404-385-1821, yi.deng@eas.gatech.edu El Niño-Southern Oscillation (ENSO) and Annular Modes (AMs) represent, respectively, the most important modes of low frequency variability in the tropical and extratropical circulations. The projection of future changes in ENSO and AM variability, however, remains highly uncertain with the state-of-the-science climate models. This project conducted a process-resolving, quantitative evaluation of ENSO and AM variability in modern reanalysis observations and in climate model simulations. The goal was to identify and understand the sources of uncertainty and biases in the models' representation of ENSO and AM variability. Using a feedback analysis method originally formulated by one of the collaborative PIs, we partitioned the 3D atmospheric temperature anomalies and surface temperature anomalies associated with ENSO and AM variability into components linked to 1) radiation-related thermodynamic processes such as cloud and water vapor feedbacks, 2) local dynamical processes including convection and turbulent/diffusive energy transfer, and 3) non-local dynamical processes such as horizontal energy transport in the oceans and atmosphere. In the past 4 years, the research conducted at Georgia Tech under the support of this project has led to 15 peer-reviewed publications and 9 conference/workshop presentations. Two graduate students and one postdoctoral fellow also received research training through participating in the project activities. This final technical report summarizes the key scientific discoveries we made and provides a list of all publications and conference presentations resulting from research activities at Georgia Tech. The main findings include: 1) the distinctly different roles played by atmospheric dynamical processes in establishing the surface temperature response to ENSO in the tropics and extratropics (i.e., atmospheric dynamics disperses energy out of the tropics during ENSO warm events and modulates surface temperature at mid- and high latitudes through controlling downward longwave radiation); 2) the representations of the ENSO-related temperature response in climate models fail to converge at the process level, particularly over the extratropics (i.e., models produce the right temperature responses to ENSO but for the wrong reasons); 3) water vapor feedback contributes substantially to the temperature anomalies found over the U.S. during different phases of the Northern Annular Mode (NAM), which adds new insight to the traditional picture that cold/warm advective processes are the main drivers of local temperature responses to the NAM; 4) the overall land surface temperature biases in the latest NCAR model (CESM1) are caused by biases in surface albedo, while the surface temperature biases over the ocean are related to multiple factors including biases in model albedo, cloud and oceanic dynamics, and the temperature biases over different ocean basins are induced by different process biases. These results provide detailed guidance for process-level model tuning and improvement, and thus contribute directly to the overall goal of reducing model uncertainty in projecting future changes in the Earth's climate system, especially in ENSO and AM variability.

  2. Sources of biomass feedstock variability and the potential impact on biofuels production

    DOE PAGES

    Williams, C. Luke; Westover, Tyler L.; Emerson, Rachel M.; ...

    2015-11-23

    In this study, terrestrial lignocellulosic biomass has the potential to be a carbon neutral and domestic source of fuels and chemicals. However, the innate variability of biomass resources, such as herbaceous and woody materials, and the inconsistency within a single resource due to disparate growth and harvesting conditions, presents challenges for downstream processes, which often require materials that are physically and chemically consistent. Intrinsic biomass characteristics, including moisture content, carbohydrate and ash compositions, bulk density, and particle size/shape distributions, are highly variable and can impact the economics of transforming biomass into value-added products. For instance, ash content increases by an order of magnitude between woody and herbaceous feedstocks (from ~0.5 to 5%, respectively) while lignin content drops by a factor of two (from ~30 to 15%, respectively). This increase in ash and reduction in lignin leads to biofuel conversion consequences, such as reduced pyrolysis oil yields for herbaceous products as compared to woody material. In this review, the sources of variability for key biomass characteristics are presented for multiple types of biomass. Additionally, this review investigates the major impacts of the variability in biomass composition on four conversion processes: fermentation, hydrothermal liquefaction, pyrolysis, and direct combustion. Finally, future research processes aimed at reducing the detrimental impacts of biomass variability on conversion to fuels and chemicals are proposed.

  3. A Method to Estimate the Masses of Asymptotic Giant Branch Variable Stars

    NASA Astrophysics Data System (ADS)

    Takeuti, Mine; Nakagawa, Akiharu; Kurayama, Tomoharu; Honma, Mareki

    2013-06-01

    AGB variable stars are at the transient phase between low and high mass-loss rates; estimating the masses of these stars is necessary to study the evolutionary and mass-loss processes during the AGB stage. We applied the pulsation constant theoretically derived by Xiong and Deng (2007, MNRAS, 378, 1270) to 15 galactic AGB stars in order to estimate their masses. We found that the pulsation constant is effective for estimating the mass of a star pulsating in two different modes, such as S Crt and RX Boo, and that it provides mass estimates comparable to theoretical results of AGB star evolution. We also extended the use of the pulsation constant to single-mode variables and analyzed the properties of AGB stars in relation to their masses.
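
    For reference, the mass estimate in this kind of analysis rests on the period-mean-density relation that defines the pulsation constant; written in solar units it can be rearranged for the mass once Q (taken from theory for the relevant mode), the period and the radius are known. The LaTeX fragment below states only that generic relation; the mode-dependent Q values and radii used for the 15 stars come from the paper and its references.

      % Period-mean-density relation defining the pulsation constant Q
      Q = P\,\sqrt{\bar{\rho}/\bar{\rho}_{\odot}}
        = P\left(\frac{M}{M_{\odot}}\right)^{1/2}\left(\frac{R}{R_{\odot}}\right)^{-3/2}

      % Rearranged for the stellar mass
      \frac{M}{M_{\odot}} = \left(\frac{Q}{P}\right)^{2}\left(\frac{R}{R_{\odot}}\right)^{3}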

  4. Does habituation matter? Emotional processing theory and exposure therapy for acrophobia.

    PubMed

    Baker, Aaron; Mystkowski, Jayson; Culver, Najwa; Yi, Rena; Mortazavi, Arezou; Craske, Michelle G

    2010-11-01

    Clinically, there is wide subscription to emotional processing theory (EPT; Foa & Kozak, 1986) as a model of therapeutic effectiveness of exposure therapy: EPT purports that exposure is maximal when (1) fear is activated (IFA), (2) fear subsides within sessions (WSH), and (3) fear subsides between sessions (BSH). This study examined these assumptions, using in vivo exposure therapy for 44 students scoring high on acrophobia measures. Results indicated that no EPT variables were consistently predictive of treatment outcome. No support was found for IFA or WSH; measures of BSH were predictive of short-term change, but these effects were attenuated at follow-up. Furthermore, EPT variables were not predictive of each other as previously hypothesized, indicating the variables are not functionally related. Copyright © 2010 Elsevier Ltd. All rights reserved.

  5. Breaking the trade-off between efficiency and service.

    PubMed

    Frei, Frances X

    2006-11-01

    For manufacturers, customers are the open wallets at the end of the supply chain. But for most service businesses, they are key inputs to the production process. Customers introduce tremendous variability to that process, but they also complain about any lack of consistency and don't care about the company's profit agenda. Managing customer-introduced variability, the author argues, is a central challenge for service companies. The first step is to diagnose which type of variability is causing mischief: Customers may arrive at different times, request different kinds of service, possess different capabilities, make varying degrees of effort, and have different personal preferences. Should companies accommodate variability or reduce it? Accommodation often involves asking employees to compensate for the variations among customers--a potentially costly solution. Reduction often means offering a limited menu of options, which may drive customers away. Some companies have learned to deal with customer-introduced variability without damaging either their operating environments or customers' service experiences. Starbucks, for example, handles capability variability among its customers by teaching them the correct ordering protocol. Dell deals with arrival and request variability in its high-end server business by outsourcing customer service while staying in close touch with customers to discuss their needs and assess their experiences with third-party providers. The effective management of variability often requires a company to influence customers' behavior. Managers attempting that kind of intervention can follow a three-step process: diagnosing the behavioral problem, designing an operating role for customers that creates new value for both parties, and testing and refining approaches for influencing behavior.

  6. Stochastic variation in avian survival rates: Life-history predictions, population consequences, and the potential responses to human perturbations and climate change

    USGS Publications Warehouse

    Schmutz, Joel A.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.

    2009-01-01

    Stochastic variation in survival rates is expected to decrease long-term population growth rates. This expectation influences both life-history theory and the conservation of species. From this expectation, Pfister (1998) developed the important life-history prediction that natural selection will have minimized variability in those elements of the annual life cycle (such as adult survival rate) with high sensitivity. This prediction has not been rigorously evaluated for bird populations, in part due to statistical difficulties related to variance estimation. I here overcome these difficulties, and in an analysis of 62 populations, I confirm her prediction by showing a negative relationship between the proportional sensitivity (elasticity) of adult survival and the proportional variance (CV) of adult survival. However, several species deviated significantly from this expectation, with more process variance in survival than predicted. For instance, projecting the magnitude of process variance in annual survival for American redstarts (Setophaga ruticilla) for 25 years resulted in a 44% decline in abundance without assuming any change in mean survival rate. For most of these species with high process variance, recent changes in harvest, habitats, or changes in climate patterns are the likely sources of environmental variability causing this variability in survival. Because of climate change, environmental variability is increasing on regional and global scales, which is expected to increase stochasticity in vital rates of species. Increased stochasticity in survival will depress population growth rates, and this result will magnify the conservation challenges we face.
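
    The central point (that process variance in survival depresses long-run growth even when the mean is unchanged) is easy to demonstrate with a small Monte Carlo projection. The Python sketch below uses invented survival and recruitment values; it illustrates the mechanism only and does not reproduce the redstart calculation in the abstract.

      import numpy as np

      rng = np.random.default_rng(6)

      # Hypothetical vital rates: adult survival mean 0.55 with process SD 0.10, constant recruitment 0.45
      mean_s, sd_s, recruitment, years, n_reps = 0.55, 0.10, 0.45, 25, 20000

      # Stochastic projection: annual multiplier = survival(t) + recruitment
      s = np.clip(rng.normal(mean_s, sd_s, size=(n_reps, years)), 0.0, 1.0)
      log_growth = np.log(s + recruitment)
      stochastic_multiplier = np.exp(log_growth.sum(axis=1).mean())     # typical 25-yr multiplier

      deterministic_multiplier = (mean_s + recruitment) ** years        # same means, no variance
      print(f"deterministic 25-yr multiplier: {deterministic_multiplier:.2f}")
      print(f"stochastic (geometric-mean) 25-yr multiplier: {stochastic_multiplier:.2f}")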

  7. A New High Resolution Climate Dataset for Climate Change Impacts Assessments in New England

    NASA Astrophysics Data System (ADS)

    Komurcu, M.; Huber, M.

    2016-12-01

    Assessing regional impacts of climate change (such as changes in extreme events, land surface hydrology, water resources, energy, ecosystems and economy) requires much higher resolution climate variables than those available from global model projections. While it is possible to run global models in higher resolution, the high computational cost associated with these simulations prevent their use in such manner. To alleviate this problem, dynamical downscaling offers a method to deliver higher resolution climate variables. As part of an NSF EPSCoR funded interdisciplinary effort to assess climate change impacts on New Hampshire ecosystems, hydrology and economy (the New Hampshire Ecosystems and Society project), we create a unique high-resolution climate dataset for New England. We dynamically downscale global model projections under a high impact emissions scenario using the Weather Research and Forecasting model (WRF) with three nested grids of 27, 9 and 3 km horizontal resolution with the highest resolution innermost grid focusing over New England. We prefer dynamical downscaling over other methods such as statistical downscaling because it employs physical equations to progressively simulate climate variables as atmospheric processes interact with surface processes, emissions, radiation, clouds, precipitation and other model components, hence eliminates fix relationships between variables. In addition to simulating mean changes in regional climate, dynamical downscaling also allows for the simulation of climate extremes that significantly alter climate change impacts. We simulate three time slices: 2006-2015, 2040-2060 and 2080-2100. This new high-resolution climate dataset (with more than 200 variables saved in hourly (six hourly) intervals for the highest resolution domain (outer two domains)) along with model input and restart files used in our WRF simulations will be publicly available for use to the broader scientific community to support in-depth climate change impacts assessments for New England. We present results focusing on future changes in New England extreme events.

  8. Effect of input data variability on estimations of the equivalent constant temperature time for microbial inactivation by HTST and retort thermal processing.

    PubMed

    Salgado, Diana; Torres, J Antonio; Welti-Chanes, Jorge; Velazquez, Gonzalo

    2011-08-01

    Consumer demand for food safety and quality improvements, combined with new regulations, requires determining the processor's confidence level that processes lowering safety risks while retaining quality will meet consumer expectations and regulatory requirements. Monte Carlo calculation procedures incorporate input data variability to obtain the statistical distribution of the output of prediction models. This advantage was used to analyze the survival risk of Mycobacterium avium subspecies paratuberculosis (M. paratuberculosis) and Clostridium botulinum spores in high-temperature short-time (HTST) milk and canned mushrooms, respectively. The results showed an estimated 68.4% probability that the 15 sec HTST process would not achieve at least 5 decimal reductions in M. paratuberculosis counts. Although estimates of the raw milk load of this pathogen are not available to estimate the probability of finding it in pasteurized milk, the wide range of the estimated decimal reductions, reflecting the variability of the experimental data available, should be a concern to dairy processors. Knowledge of the C. botulinum initial load and decimal thermal time variability was used to estimate an 8.5 min thermal process time at 110 °C for canned mushrooms reducing the risk to 10⁻⁹ spores/container with a 95% confidence. This value was substantially higher than the one estimated using average values (6.0 min) with an unacceptable 68.6% probability of missing the desired processing objective. Finally, the benefit of reducing the variability in initial load and decimal thermal time was confirmed, achieving a 26.3% reduction in processing time when standard deviation values were lowered by 90%. In spite of novel technologies, commercialized or under development, thermal processing continues to be the most reliable and cost-effective alternative to deliver safe foods. However, the severity of the process should be assessed to avoid under- and over-processing and determine opportunities for improvement. This should include a systematic approach to consider variability in the parameters for the models used by food process engineers when designing a thermal process. The Monte Carlo procedure here presented is a tool to facilitate this task for the determination of process time at a constant lethal temperature. © 2011 Institute of Food Technologists®
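    As a minimal sketch of the kind of Monte Carlo calculation described above (not the authors' implementation; all distributions and values are hypothetical), the snippet below propagates variability in the initial spore load and in the decimal reduction time into the distribution of the heating time required at a constant lethal temperature, and compares the 95th-percentile time with the time obtained from average inputs.

```python
import numpy as np

rng = np.random.default_rng(42)

def required_process_time(n_sims=100_000,
                          log10_N0_mean=2.0, log10_N0_sd=0.5,   # initial load, log10(spores/container)
                          D_mean=1.0, D_sd=0.15,                # decimal reduction time at the process temperature (min)
                          target_log10=-9.0):                   # e.g. 10^-9 spores/container
    """Monte Carlo distribution of the time t satisfying log10(N0) - t/D <= target,
    i.e. t >= D * (log10(N0) - target)."""
    log10_N0 = rng.normal(log10_N0_mean, log10_N0_sd, n_sims)
    D = np.clip(rng.normal(D_mean, D_sd, n_sims), 1e-3, None)
    return D * (log10_N0 - target_log10)

times = required_process_time()
print("time from average inputs : %.1f min" % (1.0 * (2.0 - (-9.0))))
print("time at 95%% confidence   : %.1f min" % np.percentile(times, 95))
```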

  9. Highly crosslinked silicon polymers for gas chromatography columns

    NASA Technical Reports Server (NTRS)

    Shen, Thomas C. (Inventor)

    1994-01-01

    A new highly crosslinked silicone polymer particle for gas chromatography applications and a process for synthesizing such a copolymer are described. The new copolymer comprises vinyltriethoxysilane and octadecyltrichlorosilane. The copolymer has a high degree of crosslinking and a good balance of polar to nonpolar sites in the porous silicone polymer, assuring fast separation of compounds of variable polarity.

  10. The Dynamic between Knowledge Production and Faculty Evaluation: Perceptions of the Promotion and Tenure Process across Disciplines

    ERIC Educational Resources Information Center

    Jackson, J. Kasi; Latimer, Melissa; Stoiko, Rachel

    2017-01-01

    This study sought to understand predictors of faculty satisfaction with promotion and tenure processes and reasonableness of expectations in the context of a striving institution. The factors we investigated included discipline (high-consensus [science and math] vs. low-consensus [humanities and social sciences]); demographic variables; and…

  11. Extrusion-spheronization: process variables and characterization.

    PubMed

    Sinha, V R; Agrawal, M K; Agarwal, A; Singh, G; Ghai, D

    2009-01-01

    Multiparticulate systems have undergone great development in the past decade fueled by the better understanding of their multiple roles as a suitable delivery system. With the passage of time, significant advances have been made in the process of pelletization due to the incorporation of specialized techniques for their development. Extrusion-spheronization seems to be the most promising process for the optimum delivery of many potent drugs having high systemic toxicity. It also offers immense pharmaceutical applicability due to the benefits of high loading capacity of active ingredient(s), narrow size distribution, and cost-effectiveness. On application of a specific coat, these systems can also aid in site-specific delivery, thereby enhancing the bioavailability of many drugs. The current review focuses on the process of extrusion-spheronization and the operational (extruder types, screen pressure, screw speed, temperature, moisture content, spheronization load, speed and time) and formulation (excipients and drugs) variables, which may affect the quality of the final pellets. Various methods for the evaluation of the quality of the pellets with regard to the size distribution, shape, friability, granule strength, density, porosity, flow properties, and surface texture are discussed.

  12. An Exploratory Study of the Influence of Load and Practice on Segmental and Articulatory Variability in Children with Speech Sound Disorders

    PubMed Central

    Vuolo, Janet; Goffman, Lisa

    2017-01-01

    This exploratory treatment study used phonetic transcription and speech kinematics to examine changes in segmental and articulatory variability. Nine children, ages 4 to 8 years, served as participants, including two with childhood apraxia of speech (CAS), five with speech sound disorder (SSD), and two who were typically developing (TD). Children practised producing agent + action phrases in an imitation task (low linguistic load) and a retrieval task (high linguistic load) over five sessions. In the imitation task in session one, both participants with CAS showed high degrees of segmental and articulatory variability. After five sessions, imitation practice resulted in increased articulatory variability for five participants. Retrieval practice resulted in decreased articulatory variability in three participants with SSD. These results suggest that short-term speech production practice in rote imitation disrupts articulatory control in children with and without CAS. In contrast, tasks that require linguistic processing may scaffold learning for children with SSD but not CAS. PMID:27960554

  13. Multivariate statistical analysis of a high rate biofilm process treating kraft mill bleach plant effluent.

    PubMed

    Goode, C; LeRoy, J; Allen, D G

    2007-01-01

    This study reports on a multivariate analysis of the moving bed biofilm reactor (MBBR) wastewater treatment system at a Canadian pulp mill. The modelling approach involved a data overview by principal component analysis (PCA) followed by partial least squares (PLS) modelling with the objective of explaining and predicting changes in the BOD output of the reactor. Over two years of data with 87 process measurements were used to build the models. Variables were collected from the MBBR control scheme as well as upstream in the bleach plant and in digestion. To account for process dynamics, a variable lagging approach was used for variables with significant temporal correlations. It was found that wood type pulped at the mill was a significant variable governing reactor performance. Other important variables included flow parameters, faults in the temperature or pH control of the reactor, and some potential indirect indicators of biomass activity (residual nitrogen and pH out). The most predictive model was found to have an RMSEP value of 606 kgBOD/d, representing a 14.5% average error. This was a good fit, given the measurement error of the BOD test. Overall, the statistical approach was effective in describing and predicting MBBR treatment performance.
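    A minimal sketch of this kind of workflow (a PCA overview followed by a PLS model on lagged process tags) is given below using scikit-learn; the data, tag names and the fixed two-day lag are invented for illustration and do not reproduce the mill dataset.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

# Hypothetical daily data: 87 process measurements (X) and reactor BOD output (y)
rng = np.random.default_rng(1)
X = pd.DataFrame(rng.normal(size=(700, 87)), columns=[f"tag_{i}" for i in range(87)])
y = 4000 + 600 * X["tag_0"].shift(2).fillna(0) + rng.normal(scale=500, size=700)

# Lag selected variables to account for process dynamics (here a fixed 2-day lag)
for col in ["tag_0", "tag_1"]:
    X[f"{col}_lag2"] = X[col].shift(2)
X = X.dropna()
y = y.loc[X.index]

# Data overview with PCA, then a PLS model explaining/predicting BOD
scores = PCA(n_components=2).fit_transform((X - X.mean()) / X.std())
pls = PLSRegression(n_components=5).fit(X, y)
rmse = mean_squared_error(y, pls.predict(X).ravel()) ** 0.5
print("PCA score matrix:", scores.shape, "| training RMSE (kg BOD/d):", round(rmse))
```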

  14. Assessment of the Suitability of High Resolution Numerical Weather Model Outputs for Hydrological Modelling in Mountainous Cold Regions

    NASA Astrophysics Data System (ADS)

    Rasouli, K.; Pomeroy, J. W.; Hayashi, M.; Fang, X.; Gutmann, E. D.; Li, Y.

    2017-12-01

    The hydrology of mountainous cold regions has a large spatial variability that is driven both by climate variability and near-surface process variability associated with complex terrain and patterns of vegetation, soils, and hydrogeology. There is a need to downscale large-scale atmospheric circulations to the fine scales at which cold regions hydrological processes operate, in order to assess their spatial variability in complex terrain and quantify uncertainties by comparison to field observations. In this research, three high resolution numerical weather prediction models, namely, the Intermediate Complexity Atmosphere Research (ICAR), Weather Research and Forecasting (WRF), and Global Environmental Multiscale (GEM) models, are used to represent spatial and temporal patterns of atmospheric conditions appropriate for hydrological modelling. An area covering high mountains and foothills of the Canadian Rockies was selected to assess and compare high resolution ICAR (1 km × 1 km), WRF (4 km × 4 km), and GEM (2.5 km × 2.5 km) model outputs with station-based meteorological measurements. ICAR, with its very low computational cost, was run with different initial and boundary conditions and with finer spatial resolution, which allowed an assessment of modelling uncertainty and scaling that was difficult with WRF. Results show that ICAR, when compared with WRF and GEM, performs very well in precipitation and air temperature modelling in the Canadian Rockies, while all three models show a fair performance in simulating wind and humidity fields. Representation of local-scale atmospheric dynamics leading to realistic fields of temperature and precipitation by ICAR, WRF, and GEM makes these models suitable for high resolution cold regions hydrological predictions in complex terrain, which is a key factor in estimating water security in western Canada.

  15. Effects of short-term variability of meteorological variables on soil temperature in permafrost regions

    NASA Astrophysics Data System (ADS)

    Beer, Christian; Porada, Philipp; Ekici, Altug; Brakebusch, Matthias

    2018-03-01

    Effects of the short-term temporal variability of meteorological variables on soil temperature in northern high-latitude regions have been investigated. For this, a process-oriented land surface model has been driven using an artificially manipulated climate dataset. Short-term climate variability mainly impacts snow depth and the thermal diffusivity of lichens and bryophytes. These impacts of climate variability on insulating surface layers together substantially alter the heat exchange between atmosphere and soil. As a result, soil temperature is 0.1 to 0.8 °C higher when climate variability is reduced. Earth system models project warming of the Arctic region but also increasing variability of meteorological variables and more frequent extreme meteorological events. Therefore, our results show that projected future increases in permafrost temperature and active-layer thickness in response to climate change will be lower (i) when taking into account future changes in short-term variability of meteorological variables and (ii) when representing dynamic snow and lichen and bryophyte functions in land surface models.

  16. A critical assessment of in-flight particle state during plasma spraying of YSZ and its implications on coating properties and process reliability

    NASA Astrophysics Data System (ADS)

    Srinivasan, Vasudevan

    Air plasma spray is inherently complex due to its deviation from equilibrium conditions, its three-dimensional nature, the multitude of interrelated (controllable) parameters and (uncontrollable) variables involved, and stochastic variability at different stages. The resultant coatings are complex due to the layered, high-defect-density microstructure. Despite the widespread use and commercial success for decades in earthmoving, automotive, aerospace and power generation industries, plasma spray has not been completely understood, and prime reliance for critical applications such as thermal barrier coatings on gas turbines is yet to be accomplished. This dissertation is aimed at understanding the in-flight particle state of the plasma spray process towards designing coatings and achieving coating reliability with the aid of noncontact in-flight particle and spray stream sensors. Key issues such as the phenomena of optimum particle injection and the definition of the spray stream using particle state are investigated. A few strategies to modify the microstructure and properties of Yttria Stabilized Zirconia coatings are examined systematically using the framework of process maps. An approach to designing a process window based on design-relevant coating properties is presented. Options to control the process for enhanced reproducibility and reliability are examined and the resultant variability is evaluated systematically at the different stages in the process. The 3D variability due to the difference in plasma characteristics has been critically examined by investigating splats collected from the entire spray footprint.

  17. Process Development in the Teaching Laboratory

    NASA Astrophysics Data System (ADS)

    Klein, Leonard C.; Dana, Susanne M.

    1998-06-01

    Many experiences in high school and undergraduate laboratories are well-tested cookbook recipes that have already been designed to yield optimal results; the well-known synthesis of aspirin is such an example. In this project for advanced placement or second-year high school chemistry students, students mimic the process development in industrial laboratories by investigating the effect of varying conditions in the synthesis of aspirin. The class decides on criteria that should be explored (quantity of catalyst, temperature of reaction, etc.). The class is then divided into several teams with each team assigned a variable to study. Each team must submit a proposal describing how they will explore the variable before they start their study. After data on yield and purity has been gathered and evaluated, students discuss which method is most desirable, based on their agreed-upon criteria. This exercise provides an opportunity for students to review many topics from the course (rate of reaction, limiting reagents, Beer's Law) while participating in a cooperative exercise designed to imitate industrial process development.

  18. Properties of Rolled AZ31 Magnesium Alloy Sheet Fabricated by Continuous Variable Cross-Section Direct Extrusion

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Li, Feng; Li, Xue Wen; Shi, Wen Yong

    2018-03-01

    Rolling is currently a widely used method for manufacturing and processing high-performance magnesium alloy sheets and has received widespread attention in recent years. Here, we combined continuous variable cross-section direct extrusion (CVCDE) and rolling processes. The microstructure and mechanical properties of the resulting sheets rolled at different temperatures from CVCDE extrudate were investigated by optical microscopy, scanning electron microscopy, transmission electron microscopy and electron backscatter diffraction. The results showed that a fine-grained microstructure was present with an average grain size of 3.62 μm in sheets rolled from CVCDE extrudate at 623 K. Dynamic recrystallization and a large strain were induced by the multi-pass rolling, which resulted in grain refinement. In the 573-673 K range, the yield strength, tensile strength and elongation initially increased and then declined as the CVCDE temperature increased. The above results provide an important scientific basis for the processing, manufacturing and active control of the microstructure and properties of high-performance magnesium alloy sheets.

  19. Validating the Airspace Concept Evaluation System for Different Weather Days

    NASA Technical Reports Server (NTRS)

    Zelinski, Shannon; Meyn, Larry

    2006-01-01

    This paper extends the process for validating the Airspace Concept Evaluation System using real-world historical flight operational data. System inputs, such as flight plans and airport en-route capacities, are generated and processed to create a realistic reproduction of a single day's operations within the National Airspace System. System outputs such as airport throughput, delays, and en-route sector loads are then compared to real-world operational metrics and delay statistics for the reproduced day. The process is repeated for 4 historical days with high and low traffic volume and delay attributed to weather. These 4 days are simulated using default en-route capacities and variable en-route capacities used to emulate weather. The validation results show that default en-route capacity simulations are closer to real-world data for low weather days than high weather days. The use of reduced variable en-route capacities adds a large delay bias to ACES, but delay trends between weather days are better represented.

  20. High taxonomic variability despite stable functional structure across microbial communities.

    PubMed

    Louca, Stilianos; Jacques, Saulo M S; Pires, Aliny P F; Leal, Juliana S; Srivastava, Diane S; Parfrey, Laura Wegener; Farjalla, Vinicius F; Doebeli, Michael

    2016-12-05

    Understanding the processes that are driving variation of natural microbial communities across space or time is a major challenge for ecologists. Environmental conditions strongly shape the metabolic function of microbial communities; however, other processes such as biotic interactions, random demographic drift or dispersal limitation may also influence community dynamics. The relative importance of these processes and their effects on community function remain largely unknown. To address this uncertainty, here we examined bacterial and archaeal communities in replicate 'miniature' aquatic ecosystems contained within the foliage of wild bromeliads. We used marker gene sequencing to infer the taxonomic composition within nine metabolic functional groups, and shotgun environmental DNA sequencing to estimate the relative abundances of these groups. We found that all of the bromeliads exhibited remarkably similar functional community structures, but that the taxonomic composition within individual functional groups was highly variable. Furthermore, using statistical analyses, we found that non-neutral processes, including environmental filtering and potentially biotic interactions, at least partly shaped the composition within functional groups and were more important than spatial dispersal limitation and demographic drift. Hence both the functional structure and taxonomic composition within functional groups of natural microbial communities may be shaped by non-neutral and roughly separate processes.

  1. Circumpulsar Asteroids: Inferences from Nulling Statistics and High Energy Correlations

    NASA Astrophysics Data System (ADS)

    Shannon, Ryan; Cordes, J. M.

    2006-12-01

    We have proposed that some classes of radio pulsar variability are associated with the entry of neutral asteroidal material into the pulsar magnetosphere. The region surrounding neutron stars is polluted with supernova fall-back material, which collapses and condenses into an asteroid-bearing disk that is stable for millions of years. Over time, collisional and radiative processes cause the asteroids to migrate inward until they are heated to the point of ionization. For older and cooler pulsars, asteroids ionize within the large magnetospheres and inject a sufficient amount of charged particles to alter the electrodynamics of the gap regions and modulate emission processes. This extrinsic model unifies many observed phenomena of variability that occur on time scales that are disparate from the much shorter time scales associated with pulsars and their magnetospheres. One such type of variability is nulling, in which certain pulsars exhibit episodes of quiescence that for some objects may be as short as a few pulse periods but for others may last longer than days. Here, in the context of this model, we examine the nulling phenomenon. We analyze the relationship between in-falling material and the statistics of nulling. In addition, as motivation for further high energy observations, we consider the relationship between nulling and other magnetospheric processes.

  2. Transition of NOAA's GPS-Met Data Acquisition and Processing System to the Commercial Sector: Initial Results

    NASA Astrophysics Data System (ADS)

    Jackson, Michael; Blatt, Stephan; Holub, Kirk

    2015-04-01

    In April of 2014, NOAA/OAR/ESRL Global Systems Division (GSD) and Trimble, in collaboration with Earth Networks, Inc. (ENI), signed a Cooperative Research and Development Agreement (CRADA) to transfer the existing NOAA GPS-Met Data Acquisition and Processing System (GPS-Met DAPS) technology to a commercial Trimble/ENI partnership. NOAA's GPS-Met DAPS is currently operated in a pseudo-operational mode but has proven highly reliable, running at over 95% uptime. The DAPS uses the GAMIT software to ingest dual frequency carrier phase GPS/GNSS observations and ancillary information such as real-time satellite orbits to estimate the zenith-scaled tropospheric signal delays (ZTD) and, where surface MET data are available, retrieve integrated precipitable water vapor (PWV). The NOAA data and products are made available to end users in near real-time. The Trimble/ENI partnership will use the Trimble Pivot™ software with the Atmosphere App to calculate zenith tropospheric delay (ZTD), tropospheric slant delay, and integrated precipitable water vapor (PWV). Evaluation of the Trimble software is underway, starting with a comparison of ZTD and PWV values determined from four sub-networks of GPS stations: (1) stations near NOAA Radiosonde Observation (Upper-Air Observation) launch sites; (2) stations with low terrain/high moisture variability (Gulf Coast); (3) stations with high terrain/low moisture variability (Southern California); and (4) stations with high terrain/high moisture variability (elev. > 1000 m). For each network, GSD and T/ENI run the same stations for 30 days, compare results, and perform an evaluation of the long-term solution accuracy, precision and reliability. Metrics for success include T/ENI PWV estimates within 1.5 mm of ESRL/GSD's estimates 95% of the time (ZTD uncertainty of less than 10 mm 95% of the time). The threshold for allowable variations in ZTD between NOAA GPS-Met and T/ENI processing is 10 mm. The CRADA 1 and 2 Trimble processing runs show variations of 4±2 mm and 3±8 mm, respectively. The threshold for allowable variations in PWV between NOAA GPS-Met and T/ENI processing is 15 mm. The CRADA 1 and 2 Trimble processing runs show variations of 2±4 mm and 10±13 mm, respectively. The T/ENI PWV and ZTD values meet and exceed the requirements outlined in the CRADA for the first two networks processed. The T/ENI partnership brings a footprint of GNSS and meteorological stations that could significantly improve the latency and the temporal and geographic density of ZTD and PWV estimates over the US and Europe. We will provide a brief overview of the Trimble Pivot™ software and the Atmosphere App and present results from further testing along with a timeline for the transition of the GPS-Met DAPS to an operational commercial service.
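    The comparison logic described above (differences between the two processing streams summarized as mean ± standard deviation and checked against an allowable threshold) can be sketched as follows; the series and the offset are simulated stand-ins, not CRADA data.

```python
import numpy as np

def compare_to_threshold(reference, candidate, threshold_mm, label):
    """Summarize candidate-minus-reference differences (mean +/- std) and the
    fraction of epochs whose absolute difference is within the threshold."""
    diff = np.asarray(candidate) - np.asarray(reference)
    within = np.mean(np.abs(diff) <= threshold_mm)
    print(f"{label}: {diff.mean():+.1f} +/- {diff.std():.1f} mm, "
          f"{100 * within:.0f}% within +/-{threshold_mm} mm")

# Hypothetical 30 days of hourly ZTD for one sub-network (values in mm)
rng = np.random.default_rng(7)
ztd_ref = 2400 + rng.normal(0, 20, 24 * 30)
ztd_tst = ztd_ref + rng.normal(4, 2, 24 * 30)   # an offset resembling a 4 +/- 2 mm bias
compare_to_threshold(ztd_ref, ztd_tst, threshold_mm=10, label="ZTD")
```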

  3. A downscaling scheme for atmospheric variables to drive soil-vegetation-atmosphere transfer models

    NASA Astrophysics Data System (ADS)

    Schomburg, A.; Venema, V.; Lindau, R.; Ament, F.; Simmer, C.

    2010-09-01

    For driving soil-vegetation-atmosphere transfer models or hydrological models, high-resolution atmospheric forcing data is needed. For most applications the resolution of atmospheric model output is too coarse. To avoid biases due to the non-linear processes, a downscaling system should predict the unresolved variability of the atmospheric forcing. For this purpose we derived a disaggregation system consisting of three steps: (1) a bi-quadratic spline-interpolation of the low-resolution data, (2) a so-called 'deterministic' part, based on statistical rules between high-resolution surface variables and the desired atmospheric near-surface variables, and (3) an autoregressive noise-generation step. The disaggregation system has been developed and tested based on high-resolution model output (400 m horizontal grid spacing). A novel automatic search algorithm has been developed for deriving the deterministic downscaling rules of step 2. When applied to the atmospheric variables of the lowest layer of the atmospheric COSMO-model, the disaggregation is able to adequately reconstruct the reference fields. Applying downscaling steps 1 and 2 decreases root mean square errors. Step 3 finally leads to a close match of the subgrid variability and temporal autocorrelation with the reference fields. The scheme can be applied to the output of atmospheric models, both for stand-alone offline simulations and for a fully coupled model system.
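    The three-step structure of the scheme can be illustrated with the short Python sketch below; the coarse field, the elevation-based 'deterministic' rule (a simple lapse-rate relation standing in for the automatically derived rules) and the noise parameters are all invented for illustration.

```python
import numpy as np
from scipy.ndimage import zoom

rng = np.random.default_rng(3)

# Hypothetical fields: a 20x20 coarse temperature grid disaggregated to a 140x140 grid (7x)
elev_hi = np.cumsum(rng.normal(0, 5, (140, 140)), axis=0) + 500          # high-res surface variable
elev_lo = elev_hi.reshape(20, 7, 20, 7).mean(axis=(1, 3))                # its coarse-scale average
T_lo = 288.0 - 0.0065 * elev_lo + rng.normal(0, 0.5, (20, 20))           # coarse near-surface temperature

# Step 1: smooth interpolation of the coarse field (order-2 spline, i.e. (bi-)quadratic)
T_step1 = zoom(T_lo, 7, order=2)

# Step 2: 'deterministic' correction from a rule linking the high-resolution surface
# variable to the near-surface atmospheric variable (here a fixed lapse rate)
T_step2 = T_step1 - 0.0065 * (elev_hi - zoom(elev_lo, 7, order=2))

# Step 3: add autoregressive noise to restore unresolved variability
phi, sigma = 0.8, 0.2
noise = rng.normal(0, sigma, T_step2.shape)
for i in range(1, noise.shape[0]):                                       # AR(1) along one axis
    noise[i] = phi * noise[i - 1] + np.sqrt(1 - phi**2) * noise[i]
T_step3 = T_step2 + noise

print("subgrid std: step1 %.2f, step2 %.2f, step3 %.2f K"
      % (T_step1.std(), T_step2.std(), T_step3.std()))
```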

  4. Blazar Variability from Turbulence in Jets Launched by Magnetically Arrested Accretion Flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riordan, Michael O’; Pe’er, Asaf; McKinney, Jonathan C., E-mail: michael_oriordan@umail.ucc.ie

    2017-07-10

    Blazars show variability on timescales ranging from minutes to years, the former being comparable to and in some cases even shorter than the light-crossing time of the central black hole. The observed γ-ray light curves can be described by a power-law power density spectrum (PDS), with a similar index for both BL Lacs and flat-spectrum radio quasars. We show that this variability can be produced by turbulence in relativistic jets launched by magnetically arrested accretion flows (MADs). We perform radiative transport calculations on the turbulent, highly magnetized jet launching region of a MAD with a rapidly rotating supermassive black hole. The resulting synchrotron and synchrotron self-Compton emission, originating from close to the black hole horizon, is highly variable. This variability is characterized by a PDS that is remarkably similar to the observed power-law spectrum at frequencies less than a few per day. Furthermore, turbulence in the jet launching region naturally produces fluctuations in the plasma on scales much smaller than the horizon radius. We speculate that similar turbulent processes, operating in the jet at large radii (and therefore at a high bulk Lorentz factor), are responsible for blazar variability over many decades in frequency, including on minute timescales.

  5. Apparatus and method for microwave processing of materials

    DOEpatents

    Johnson, Arvid C.; Lauf, Robert J.; Bible, Don W.; Markunas, Robert J.

    1996-01-01

    A variable frequency microwave heating apparatus (10) designed to allow modulation of the frequency of the microwaves introduced into a furnace cavity (34) for testing or other selected applications. The variable frequency heating apparatus (10) is used in the method of the present invention to monitor the resonant processing frequency within the furnace cavity (34) depending upon the material, including the state thereof, from which the workpiece (36) is fabricated. The variable frequency microwave heating apparatus (10) includes a microwave signal generator (12) and a high-power microwave amplifier (20) or a microwave voltage-controlled oscillator (14). A power supply (22) is provided for operation of the high-power microwave oscillator (14) or microwave amplifier (20). A directional coupler (24) is provided for detecting the direction and amplitude of signals incident upon and reflected from the microwave cavity (34). A first power meter (30) is provided for measuring the power delivered to the microwave furnace (32). A second power meter (26) detects the magnitude of reflected power. Reflected power is dissipated in the reflected power load (28).

  6. Proactive vs. reactive car driving: EEG evidence for different driving strategies of older drivers

    PubMed Central

    Wascher, Edmund; Getzmann, Stephan

    2018-01-01

    Aging is associated with a large heterogeneity in the extent of age-related changes in sensory, motor, and cognitive functions. All these functions can influence performance in complex tasks like car driving. The present study aims to identify potential differences in underlying cognitive processes that may explain inter-individual variability in driving performance. Younger and older participants performed a one-hour monotonous driving task in a driving simulator under varying crosswind conditions, while behavioral and electrophysiological data were recorded. Overall, younger and older drivers showed comparable driving performance (lane keeping). However, there was a large difference in driving lane variability within the older group. Dividing the older group into two subgroups with low vs. high driving lane variability revealed differences between the two groups in electrophysiological correlates of mental workload, consumption of mental resources, and activation and sustaining of attention: Older drivers with high driving lane variability showed higher frontal Alpha and Theta activity than older drivers with low driving lane variability and, with increasing crosswind, a more pronounced decrease in Beta activity. These results suggest differences in driving strategies of older and younger drivers, with the older drivers using either a rather proactive and alert driving strategy (indicated by low driving lane variability and lower Alpha and Beta activity), or a rather reactive strategy (indicated by high driving lane variability and higher Alpha activity). PMID:29352314

  7. Emotion dysregulation in alexithymia: Startle reactivity to fearful affective imagery and its relation to heart rate variability.

    PubMed

    Panayiotou, Georgia; Constantinou, Elena

    2017-09-01

    Alexithymia is associated with deficiencies in recognizing and expressing emotions and impaired emotion regulation, though few studies have verified the latter assertion using objective measures. This study examined startle reflex modulation by fearful imagery and its associations with heart rate variability in alexithymia. Fifty-four adults (27 alexithymic) imagined previously normed fear scripts. Startle responses were assessed during baseline, first exposure, and reexposure. During first exposure, participants, in separate trials, engaged in either shallow or deep emotion processing, giving emphasis on descriptive or affective aspects of imagery, respectively. Resting heart rate variability was assessed during 2 min of rest prior to the experiment, with high alexithymic participants demonstrating significantly higher LF/HF (low frequency/high frequency) ratio than controls. Deep processing was associated with nonsignificantly larger and faster startle responses at first exposure for alexithymic participants. Lower LF/HF ratio, reflecting higher parasympathetic cardiac activity, predicted greater startle amplitude habituation for alexithymia but lower habituation for controls. Results suggest that, when exposed to prolonged threat, alexithymics may adjust poorly, showing a smaller initial defensive response but slower habituation. This pattern seems related to their low emotion regulation ability as indexed by heart rate variability. © 2017 Society for Psychophysiological Research.
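    For readers unfamiliar with the LF/HF index, the sketch below shows one common way to compute it from a resting RR-interval series (resampling to an even grid, Welch spectral estimation, then integrating the low- and high-frequency bands); the simulated intervals and settings are illustrative, not the study's data or exact analysis.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def lf_hf_ratio(rr_ms, fs=4.0):
    """LF/HF ratio from RR intervals (ms): resample to an even grid, estimate the
    power spectrum with Welch's method, integrate 0.04-0.15 Hz (LF) and 0.15-0.40 Hz (HF)."""
    rr_s = np.asarray(rr_ms) / 1000.0
    t = np.cumsum(rr_s)                                   # beat times (s)
    t_even = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = interp1d(t, rr_s, kind="cubic")(t_even)
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(rr_even)))
    df = f[1] - f[0]
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df
    hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df
    return lf / hf

# Roughly two minutes of simulated resting RR intervals (illustrative only)
rng = np.random.default_rng(5)
rr = 800 + 30 * np.sin(2 * np.pi * 0.1 * np.arange(150)) + rng.normal(0, 15, 150)
print(f"LF/HF: {lf_hf_ratio(rr):.2f}")
```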

  8. Thermo-Mechanical Processing in Friction Stir Welds

    NASA Technical Reports Server (NTRS)

    Schneider, Judy

    2003-01-01

    Friction stir welding is a solid-phase joining, or welding process that was invented in 1991 at The Welding Institute (TWI). The process is potentially capable of joining a wide variety of aluminum alloys that are traditionally difficult to fusion weld. The friction stir welding (FSW) process produces welds by moving a non-consumable rotating pin tool along a seam between work pieces that are firmly clamped to an anvil. At the start of the process, the rotating pin is plunged into the material to a pre-determined load. The required heat is produced by a combination of frictional and deformation heating. The shape of the tool shoulder and supporting anvil promotes a high hydrostatic pressure along the joint line as the tool shears and literally stirs the metal together. To produce a defect free weld, process variables (RPM, transverse speed, and downward force) and tool pin design must be chosen carefully. An accurate model of the material flow during the process is necessary to guide process variable selection. At MSFC a plastic slip line model of the process has been synthesized based on macroscopic images of the resulting weld material. Although this model appears to have captured the main features of the process, material specific interactions are not understood. The objective of the present research was to develop a basic understanding of the evolution of the microstructure to be able to relate it to the deformation process variables of strain, strain rate, and temperature.

  9. Sunlight, iron and radicals to tackle the resistant leftovers of biotreated winery wastewater.

    PubMed

    Ioannou, Lida; Velegraki, Theodora; Michael, Costas; Mantzavinos, Dionissios; Fatta-Kassinos, Despo

    2013-04-01

    Winery wastewater is characterized by high organic content consisting of alcohols, acids and recalcitrant high-molecular-weight compounds (e.g. polyphenols, tannins and lignins). So far, biological treatment constitutes the best available technology for such effluents that are characterized by high seasonal variability; however the strict legislation applied on the reclamation and reuse of wastewaters for irrigation purposes introduces the need for further treatment of the bioresistant fraction of winery effluents. In this context, the use of alternative treatment technologies, aiming to mineralize or transform refractory molecules into others which could be further biodegraded, is a matter of great concern. In this study, a winery effluent that had already been treated in a sequencing batch reactor was subjected to further purification by homogeneous and heterogeneous solar Fenton oxidation processes. The effect of various operating variables such as catalyst and oxidant concentration, initial pH, temperature and lamp power on the abatement of chemical oxygen demand (COD), dissolved organic carbon (DOC), color, total phenolics and ecotoxicity has been assessed in the homogeneous solar Fenton process. In addition, a comparative assessment between homogeneous and heterogeneous solar Fenton processes was performed. In the present study the homogeneous solar Fenton process has been demonstrated to be the most effective process, yielding COD, DOC and total phenolics removal of about 69%, 48% and 71% in 120 min of the photocatalytic treatment, respectively.

  10. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    PubMed

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and recording lengths. HAPPE software is freely available under the terms of the GNU General Public License at https://github.com/lcnhappe/happe.

  11. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    PubMed Central

    Gabard-Durnam, Laurel J.; Mendez Leal, Adriana S.; Wilkinson, Carol L.; Levin, April R.

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and recording lengths. HAPPE software is freely available under the terms of the GNU General Public License at https://github.com/lcnhappe/happe. PMID:29535597

  12. The Spatial and Temporal Variability of Meltwater Flow Paths: Insights From a Grid of Over 100 Snow Lysimeters

    NASA Astrophysics Data System (ADS)

    Webb, R. W.; Williams, M. W.; Erickson, T. A.

    2018-02-01

    Snowmelt is an important part of the hydrologic cycle and ecosystem dynamics for headwater systems. However, the physical process of water flow through snow is a poorly understood aspect of snow hydrology as meltwater flow paths tend to be highly complex. Meltwater flow paths diverge and converge as percolating meltwater reaches stratigraphic layer interfaces creating high spatial variability. Additionally, a snowpack is temporally heterogeneous due to rapid localized metamorphism that occurs during melt. This study uses a snowmelt lysimeter array at tree line in the Niwot Ridge study area of northern Colorado. The array is designed to address the issue of spatial and temporal variability of basal discharge at 105 locations over an area of 1,300 m2. Observed coefficients of variation ranged from 0 to almost 10 indicating more variability than previously observed, though this variability decreased throughout each melt season. Snowmelt basal discharge also significantly increases as snow depth decreases displaying a cluster pattern that peaks during weeks 3-5 of the snowmelt season. These results are explained by the flow of meltwater along snow layer interfaces. As the snowpack becomes less stratified through the melt season, the pattern transforms from preferential flow paths to uniform matrix flow. Correlation ranges of the observed basal discharge correspond to a mean representative elementary area of 100 m2, or a characteristic length of 10 m. Snowmelt models representing processes at scales less than this will need to explicitly incorporate the spatial variability of snowmelt discharge and meltwater flow paths through snow between model pixels.
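    The central statistic here, the coefficient of variation of basal discharge across the lysimeter grid, can be computed as in the short sketch below; the simulated discharge values, and the way their spread shrinks through the melt season, are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical daily basal discharge (mm/day) at 105 lysimeters over a 40-day melt season,
# with the spatial spread shrinking as preferential flow gives way to matrix flow.
days, n_lys = 40, 105
sigma = np.linspace(2.0, 0.2, days)[:, None]              # lognormal shape parameter per day
discharge = rng.lognormal(mean=1.0, sigma=sigma, size=(days, n_lys))

cv = discharge.std(axis=1) / discharge.mean(axis=1)       # daily CV across the grid
print("mean CV, first 5 days: %.1f | last 5 days: %.1f" % (cv[:5].mean(), cv[-5:].mean()))
```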

  13. Spatiotemporal Variability of Hillslope Soil Moisture Across Steep, Highly Dissected Topography

    NASA Astrophysics Data System (ADS)

    Jarecke, K. M.; Wondzell, S. M.; Bladon, K. D.

    2016-12-01

    Hillslope ecohydrological processes, including subsurface water flow and plant water uptake, are strongly influenced by soil moisture. However, the factors controlling spatial and temporal variability of soil moisture in steep, mountainous terrain are poorly understood. We asked: How do topography and soils interact to control the spatial and temporal variability of soil moisture in steep, Douglas-fir dominated hillslopes in the western Cascades? We will present a preliminary analysis of bimonthly soil moisture variability from July-November 2016 at 0-30 and 0-60 cm depth across spatially extensive convergent and divergent topographic positions in Watershed 1 of the H.J. Andrews Experimental Forest in central Oregon. Soil moisture monitoring locations were selected following a 5 m LIDAR analysis of topographic position, aspect, and slope. Topographic position index (TPI) was calculated as the difference in elevation to the mean elevation within a 30 m radius. Convergent (negative TPI values) and divergent (positive TPI values) monitoring locations were established along northwest to northeast-facing aspects and within 25-55 degree slopes. We hypothesized that topographic position (convergent vs. divergent), as well as soil physical properties (e.g., texture, bulk density), control variation in hillslope soil moisture at the sub-watershed scale. In addition, we expected the relative importance of hillslope topography to the spatial variability in soil moisture to differ seasonally. By comparing the spatiotemporal variability of hillslope soil moisture across topographic positions, our research provides a foundation for additional understanding of subsurface flow processes and plant-available soil-water in forests with steep, highly dissected terrain.
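    A TPI computation of the kind described (elevation minus the mean elevation within a 30 m radius on a 5 m grid) can be sketched as below; the synthetic DEM and the threshold of zero for separating convergent from divergent positions are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

def tpi(dem, cellsize=5.0, radius=30.0):
    """Topographic position index: elevation minus the mean elevation within a
    circular neighborhood (negative ~ convergent/hollow, positive ~ divergent/ridge)."""
    r = int(radius / cellsize)
    y, x = np.ogrid[-r:r + 1, -r:r + 1]
    footprint = (x**2 + y**2) <= r**2
    kernel = footprint / footprint.sum()
    return dem - convolve(dem, kernel, mode="nearest")

# Synthetic 5 m DEM tile (random-walk terrain, for illustration only)
rng = np.random.default_rng(2)
dem = np.cumsum(np.cumsum(rng.normal(0, 1.0, (200, 200)), axis=0), axis=1)
index = tpi(dem)
print("fraction of cells classed convergent (TPI < 0): %.2f" % (index < 0).mean())
```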

  14. A century of hydrological variability and trends in the Fraser River Basin

    NASA Astrophysics Data System (ADS)

    Déry, Stephen J.; Hernández-Henríquez, Marco A.; Owens, Philip N.; Parkes, Margot W.; Petticrew, Ellen L.

    2012-06-01

    This study examines the 1911-2010 variability and trends in annual streamflow at 139 sites across the Fraser River Basin (FRB) of British Columbia (BC), Canada. The Fraser River is the largest Canadian waterway flowing to the Pacific Ocean and is one of the world’s greatest salmon rivers. Our analyses reveal high runoff rates and low interannual variability in alpine and coastal rivers, and low runoff rates and high interannual variability in most streams in BC’s interior. The interannual variability in streamflow is also low in rivers such as the Adams, Chilko, Quesnel and Stuart where the principal salmon runs of the Fraser River occur. A trend analysis shows a spatially coherent signal with increasing interannual variability in streamflow across the FRB in recent decades, most notably in spring and summer. The upward trend in the coefficient of variation in annual runoff coincides with a period of near-normal annual runoff for the Fraser River at Hope. The interannual variability in streamflow is greater in regulated rather than natural systems; however, it is unclear whether it is predominantly flow regulation that leads to these observed differences. Environmental changes such as rising air temperatures, more frequent polarity changes in large-scale climate teleconnections such as El Niño-Southern Oscillation and Pacific Decadal Oscillation, and retreating glaciers may be contributing to the greater range in annual runoff fluctuations across the FRB. This has implications for ecological processes throughout the basin, for example affecting migrating and spawning salmon, a keystone species vital to First Nations communities as well as to commercial and recreational fisheries. To exemplify this linkage between variable flows and biological responses, the unusual FRB runoff anomalies observed in 2010 are discussed in the context of that year’s sockeye salmon run. As the climate continues to warm, greater variability in annual streamflow, and hence in hydrological extremes, may influence ecological processes and human usage throughout the FRB in the 21st century.

  15. Holographic femtosecond laser processing and its application to biological materials (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Hayasaki, Yoshio

    2017-02-01

    Femtosecond laser processing is a promising tool for fabricating novel and useful structures on the surfaces of and inside materials. An enormous number of pulse irradiation points will be required for fabricating actual structures at the millimeter scale, and therefore the throughput of femtosecond laser processing must be improved for practical adoption of this technique. One promising method to improve throughput is parallel pulse generation based on a computer-generated hologram (CGH) displayed on a spatial light modulator (SLM), a technique called holographic femtosecond laser processing. The holographic method has advantages such as high throughput, high light use efficiency, and variable, instantaneous, and 3D patterning. Furthermore, the use of an SLM makes it possible to correct unknown imperfections of the optical system and inhomogeneity in a sample using in-system optimization of the CGH. In addition, the CGH can adaptively compensate for dynamic, unpredictable mechanical movements, air and liquid disturbances, and shape variation and deformation of the target sample, as well as provide adaptive wavefront control for environmental changes. It is therefore a powerful tool for the processing of biological cells and tissues, because they have free-form, variable, and deformable structures. In this paper, we present the principle and the experimental setup of holographic femtosecond laser processing, and effective ways of processing biological samples. We demonstrate the femtosecond laser processing of biological materials and the resulting processing properties.

  16. Spatial heterogeneity of within-stream methane concentrations

    NASA Astrophysics Data System (ADS)

    Crawford, John T.; Loken, Luke C.; West, William E.; Crary, Benjamin; Spawn, Seth A.; Gubbins, Nicholas; Jones, Stuart E.; Striegl, Robert G.; Stanley, Emily H.

    2017-05-01

    Streams, rivers, and other freshwater features may be significant sources of CH4 to the atmosphere. However, high spatial and temporal variabilities hinder our ability to understand the underlying processes of CH4 production and delivery to streams and also challenge the use of scaling approaches across large areas. We studied a stream having high geomorphic variability to assess the underlying scale of CH4 spatial variability and to examine whether the physical structure of a stream can explain the variation in surface CH4. A combination of high-resolution CH4 mapping, a survey of groundwater CH4 concentrations, quantitative analysis of methanogen DNA, and sediment CH4 production potentials illustrates the spatial and geomorphic controls on CH4 emissions to the atmosphere. We observed significant spatial clustering with high CH4 concentrations in organic-rich stream reaches and lake transitions. These sites were also enriched in the methane-producing mcrA gene and had highest CH4 production rates in the laboratory. In contrast, mineral-rich reaches had significantly lower concentrations and had lesser abundances of mcrA. Strong relationships between CH4 and the physical structure of this aquatic system, along with high spatial variability, suggest that future investigations will benefit from viewing streams as landscapes, as opposed to ecosystems simply embedded in larger terrestrial mosaics. In light of such high spatial variability, we recommend that future workers evaluate stream networks first by using similar spatial tools in order to build effective sampling programs.

  17. Bioreactor process parameter screening utilizing a Plackett-Burman design for a model monoclonal antibody.

    PubMed

    Agarabi, Cyrus D; Schiel, John E; Lute, Scott C; Chavez, Brittany K; Boyne, Michael T; Brorson, Kurt A; Khan, Mansoora; Read, Erik K

    2015-06-01

    Consistent high-quality antibody yield is a key goal for cell culture bioprocessing. This endpoint is typically achieved in commercial settings through product and process engineering of bioreactor parameters during development. When the process is complex and not optimized, small changes in composition and control may yield a finished product of less desirable quality. Therefore, changes proposed to currently validated processes usually require justification and are reported to the US FDA for approval. Recently, design-of-experiments-based approaches have been explored to rapidly and efficiently achieve this goal of optimized yield with a better understanding of product and process variables that affect a product's critical quality attributes. Here, we present a laboratory-scale model culture where we apply a Plackett-Burman screening design to parallel cultures to study the main effects of 11 process variables. This exercise allowed us to determine the relative importance of these variables and identify the most important factors to be further optimized in order to control both desirable and undesirable glycan profiles. We found engineering changes relating to culture temperature and nonessential amino acid supplementation significantly impacted glycan profiles associated with fucosylation, β-galactosylation, and sialylation. All of these are important for monoclonal antibody product quality. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
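    The design-of-experiments logic can be sketched as follows: a 12-run Plackett-Burman matrix for 11 two-level factors is built from the standard generator row, and each main effect is estimated as the difference between the mean response at the high and low levels. The responses and the two 'active' factors are simulated for illustration and are not the study's data.

```python
import numpy as np

# 12-run Plackett-Burman design for 11 two-level factors: cyclic shifts of the
# standard generator row plus a final all-minus run.
generator = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
design = np.vstack([np.roll(generator, i) for i in range(11)] + [-np.ones(11, dtype=int)])

# Hypothetical responses (e.g. relative abundance of one glycan species) for the 12 cultures;
# pretend factors 2 and 7 (say, temperature and an amino acid feed) are the active ones.
rng = np.random.default_rng(4)
beta = np.zeros(11)
beta[[2, 7]] = [0.75, -0.40]                      # coefficients on the coded (+/-1) factors
y = 10 + design @ beta + rng.normal(0, 0.2, 12)

# Main effect of factor j = mean(y at +1) - mean(y at -1)  (about 2 * beta_j here)
effects = np.array([y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
                    for j in range(11)])
for j in np.argsort(-np.abs(effects))[:3]:
    print(f"factor {j}: estimated main effect {effects[j]:+.2f}")
```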

  18. Ozone Lidar Observations for Air Quality Studies

    NASA Technical Reports Server (NTRS)

    Wang, Lihua; Newchurch, Mike; Kuang, Shi; Burris, John F.; Huang, Guanyu; Pour-Biazar, Arastoo; Koshak, William; Follette-Cook, Melanie B.; Pickering, Kenneth E.; McGee, Thomas J.

    2015-01-01

    Tropospheric ozone lidars are well suited to measuring the high spatio-temporal variability of this important trace gas. Furthermore, lidar measurements in conjunction with balloon soundings, aircraft, and satellite observations provide substantial information about a variety of atmospheric chemical and physical processes. Examples of processes elucidated by ozone-lidar measurements are presented, and modeling studies using WRF-Chem, RAQMS, and DALES/LES models illustrate our current understanding and shortcomings of these processes.

  19. Effect of process variables on the density and durability of the pellets made from high moisture corn stover

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaya Shankar Tumuluru

    2014-03-01

    A flat die pellet mill was used to understand the effect of high levels of feedstock moisture content in the range of 28–38% (w.b.), with die rotational speeds of 40–60 Hz, and preheating temperatures of 30–110 °C on the pelleting characteristics of 4.8 mm screen size ground corn stover using an 8 mm pellet die. The physical properties of the pelletised biomass studied are: (a) pellet moisture content, (b) unit, bulk and tapped density, and (c) durability. Pelletisation experiments were conducted based on a central composite design. Analysis of variance (ANOVA) indicated that feedstock moisture content influenced all of the physical properties at P < 0.001. Pellet moisture content decreased with increasing preheating temperature up to about 110 °C and with decreasing feedstock moisture content down to about 28% (w.b.). Response surface models developed for quality attributes with respect to process variables have adequately described the process with coefficient of determination (R2) values of >0.88. The other pellet quality attributes, such as unit, bulk and tapped density, were maximised at feedstock moisture contents of 30–33% (w.b.), die speeds of >50 Hz and preheating temperatures of >90 °C. In the case of durability, a medium moisture content of 33–34% (w.b.), preheating temperatures of >70 °C and higher die speeds of >50 Hz resulted in highly durable pellets. It can be concluded from the present study that feedstock moisture content, followed by preheating, and die rotational speed are the interacting process variables influencing pellet moisture content, unit, bulk and tapped density and durability.
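    A second-order response surface of the kind reported (a quality attribute as a quadratic function of moisture, die speed and preheating temperature, with R2 assessed on the fitted model) can be sketched with scikit-learn as below; the simulated runs and coefficients are illustrative only.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Hypothetical runs spanning the ranges in the study: moisture 28-38% (w.b.),
# die speed 40-60 Hz, preheating 30-110 degC, with pellet durability (%) as the response.
rng = np.random.default_rng(6)
X = np.column_stack([rng.uniform(28, 38, 30),     # feedstock moisture content (% w.b.)
                     rng.uniform(40, 60, 30),     # die rotational speed (Hz)
                     rng.uniform(30, 110, 30)])   # preheating temperature (degC)
durability = (95 - 0.4 * (X[:, 0] - 33.5) ** 2    # invented optimum near 33-34% moisture
              + 0.05 * X[:, 1] + 0.03 * X[:, 2]
              + rng.normal(0, 1.0, 30))

# Second-order response surface: linear, interaction and squared terms
rsm = make_pipeline(PolynomialFeatures(degree=2, include_bias=False), LinearRegression())
rsm.fit(X, durability)
print(f"R^2 of the fitted response surface: {rsm.score(X, durability):.2f}")
```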

  20. Response of Bacterioplankton Communities to Cadmium Exposure in Coastal Water Microcosms with High Temporal Variability

    PubMed Central

    Wang, Kai; Xiong, Jinbo; Chen, Xinxin; Zheng, Jialai; Hu, Changju; Yang, Yina; Zhu, Jianlin

    2014-01-01

    Multiple anthropogenic disturbances to bacterial diversity have been investigated in coastal ecosystems, in which temporal variability in the bacterioplankton community has been considered a ubiquitous process. However, far less is known about the temporal dynamics of a bacterioplankton community responding to pollution disturbances such as toxic metals. We used coastal water microcosms perturbed with 0, 10, 100, and 1,000 μg liter−1 of cadmium (Cd) for 2 weeks to investigate temporal variability, Cd-induced patterns, and their interaction in the coastal bacterioplankton community and to reveal whether the bacterial community structure would reflect the Cd gradient in a temporally varying system. Our results showed that the bacterioplankton community structure shifted along the Cd gradient consistently after a 4-day incubation, although it exhibited some resistance to Cd at low concentration (10 μg liter−1). A process akin to an arms race between temporal variability and Cd exposure was observed, and the temporal variability overwhelmed Cd-induced patterns in the bacterial community. The temporal succession of the bacterial community was correlated with pH, dissolved oxygen, NO3−-N, NO2−-N, PO43−-P, dissolved organic carbon, and chlorophyll a, and each of these parameters contributed more to community variance than Cd did. However, elevated Cd levels did decrease the temporal turnover rate of community. Furthermore, key taxa, affiliated to the families Flavobacteriaceae, Rhodobacteraceae, Erythrobacteraceae, Piscirickettsiaceae, and Alteromonadaceae, showed a high frequency of being associated with Cd levels during 2 weeks. This study provides direct evidence that specific Cd-induced patterns in bacterioplankton communities exist in highly varying manipulated coastal systems. Future investigations on an ecosystem scale across longer temporal scales are needed to validate the observed pattern. PMID:25326310

  1. Multi-platform validation of a high-resolution model in the Western Mediterranean Sea: insight into spatial-temporal variability

    NASA Astrophysics Data System (ADS)

    Aguiar, Eva; Mourre, Baptiste; Heslop, Emma; Juza, Mélanie; Escudier, Romain; Tintoré, Joaquín

    2017-04-01

    This study focuses on the validation of the high resolution Western Mediterranean Operational model (WMOP) developed at SOCIB, the Balearic Islands Coastal Observing and Forecasting System. The Mediterranean Sea is often seen as a small scale ocean laboratory where energetic eddies, fronts and circulation features have important ecological consequences. The Medclic project is a program between "La Caixa" Foundation and SOCIB which aims at characterizing and forecasting the "oceanic weather" in the Western Mediterranean Sea, specifically investigating the interactions between the general circulation and mesoscale processes. We use a WMOP 2009-2015 free run hindcast simulation and available observational datasets (altimetry, moorings and gliders) to both assess the numerical simulation and investigate the ocean variability. WMOP has a 2-km spatial resolution and uses CMEMS Mediterranean products as initial and boundary conditions, with surface forcing from the high-resolution Spanish Meteorological Agency model HIRLAM. Different aspects of the spatial and temporal variability in the model are validated from local to regional and basin scales: (1) the principal axis of variability of the surface circulation using altimetry and moorings along the Iberian coast, (2) the inter-annual changes of the surface flows incorporating also glider data, (3) the propagation of mesoscale eddies formed in the Algerian sub-basin using altimetry, and (4) the statistical properties of eddies (number, rotation, size) applying an eddy tracker detection method in the Western Mediterranean Sea. With these key points evaluated in the model, EOF analysis of sea surface height maps is used to investigate spatial patterns of variability associated with eddies, gyres and the basin-scale circulation and so gain insight into the interconnections between sub-basins, as well as the interactions between physical processes at different scales.

  2. Save money by understanding variance and tolerancing.

    PubMed

    Stuart, K

    2007-01-01

    Manufacturing processes are inherently variable, which results in component and assembly variance. Unless process capability, variance and tolerancing are fully understood, incorrect design tolerances may be applied, which will lead to more expensive tooling, inflated production costs, high reject rates, product recalls and excessive warranty costs. A methodology is described for correctly allocating tolerances and performing appropriate analyses.
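
    The abstract does not spell out the tolerancing methodology, so the following is a generic illustration with made-up dimensions of the point it makes: a statistical (root-sum-square) stack-up exploits knowledge of process variance and is less conservative than a worst-case stack-up, which is one way incorrect assumptions about variance translate into unnecessarily tight, expensive tolerances.

    ```python
    # Generic illustration, not the paper's method: worst-case versus
    # root-sum-square (statistical) tolerance stack-up for a linear assembly
    # of independently varying components. All numbers are made up.
    import math

    # (nominal_mm, tolerance_mm) for each component in the stack
    components = [(10.0, 0.05), (25.0, 0.10), (8.0, 0.04), (12.0, 0.06)]

    nominal = sum(n for n, _ in components)
    worst_case = sum(t for _, t in components)            # every part at its limit
    rss = math.sqrt(sum(t ** 2 for _, t in components))   # statistical combination

    print(f"nominal assembly length: {nominal:.2f} mm")
    print(f"worst-case tolerance   : +/- {worst_case:.3f} mm")
    print(f"RSS tolerance          : +/- {rss:.3f} mm")
    # Independent variations rarely all align, so the RSS band is narrower;
    # understanding process variance can therefore relax component tolerances
    # without degrading assembly quality.
    ```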

  3. Spatial and Temporal Soil Moisture Behavior in a Headwater Watershed of the Mantiqueira Range, Minas Gerais, Brazil

    USDA-ARS?s Scientific Manuscript database

    The characterization of temporal and spatial variability of soil moisture is highly relevant in watersheds for understanding the many hydrological and erosion processes, to better model the processes and apply them to conservation planning. The goal of this study was to map soil moisture of the surf...

  4. Strength and processing properties of wet-formed hardboards from recycled corrugated containers and commercial hardboard fibers

    Treesearch

    J. F. Hunt; C. B. Vick

    1999-01-01

    Recycled paper fiber recovered from our municipal solid waste stream could potentially be used in structural hardboard products. This study compares strength properties and processing variables of wet-formed high-density hardboard panels made from recycled old corrugated container (OCC) fibers and virgin hardboard fibers using continuous pressure during drying. The...

  5. Statistical optics

    NASA Astrophysics Data System (ADS)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
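
    One of the topics listed above, random phasor sums, lends itself to a short numerical illustration. The simulation below is our own example (not taken from the book): summing many unit phasors with uniformly random phases yields a resultant whose amplitude is approximately Rayleigh distributed, the fully developed speckle regime.

    ```python
    # Random phasor sum: many unit-amplitude phasors with uniform random phases.
    # The resultant amplitude tends to a Rayleigh distribution and the intensity
    # contrast tends to 1 (fully developed speckle). Illustrative example only.
    import numpy as np

    rng = np.random.default_rng(0)
    n_phasors, n_trials = 100, 50_000

    phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_trials, n_phasors))
    resultant = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_phasors)
    amplitude = np.abs(resultant)
    intensity = amplitude ** 2

    # With variance 1/2 per quadrature: mean |A| ~ sqrt(pi)/2 ~ 0.886, mean |A|^2 ~ 1
    print("mean amplitude  :", amplitude.mean())
    print("mean intensity  :", intensity.mean())
    print("speckle contrast:", intensity.std() / intensity.mean())  # ~ 1
    ```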

  6. A 2.7 Myr record of sedimentary processes on a high-latitude continental slope: 3D seismic evidence from the mid-Norwegian margin

    NASA Astrophysics Data System (ADS)

    Montelli, A.; Dowdeswell, J. A.; Ottesen, D.; Johansen, S. E.

    2017-12-01

    An extensive three-dimensional seismic dataset is used to investigate the sedimentary processes and morphological evolution of the mid-Norwegian continental slope through the Quaternary. These data reveal hundreds of buried landforms, including channels and debris flows of variable morphology, as well as gullies, iceberg ploughmarks, slide scars and sediment waves. Slide scars, turbidity currents and debris flows comprise slope systems controlled by local slope morphology, showing the spatial variability of high-latitude sedimentation. Channels dominate the Early Pleistocene ( 2.7-0.8 Ma) morphological record of the mid-Norwegian slope. During Early Plesitocene, glacimarine sedimentation on the slope was influenced by dense bottom-water flow and turbidity currents. Glacigenic debris-flows appear within the Middle-Late Pleistocene ( 0.8-0 Ma) succession. Their abundance increases on Late Pleistocene palaeo-surfaces, marking a paleo-environmental change characterised by decreasing role for channelized turbidity currents and dense water flows. This transition coincides with the gradual shift to full-glacial ice-sheet conditions marked by the appearance of the first erosive fast-flowing ice streams and an associated increase in sediment flux to the shelf edge, emphasizing first-order climate control on the temporal variability of high-latitude sedimentary slope records.

  7. Simulating variable source problems via post processing of individual particle tallies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bleuel, D.L.; Donahue, R.J.; Ludewigt, B.A.

    2000-10-20

    Monte Carlo is an extremely powerful method of simulating complex, three dimensional environments without excessive problem simplification. However, it is often time consuming to simulate models in which the source can be highly varied. Similarly difficult are optimization studies involving sources in which many input parameters are variable, such as particle energy, angle, and spatial distribution. Such studies are often approached using brute force methods or intelligent guesswork. One field in which these problems are often encountered is accelerator-driven Boron Neutron Capture Therapy (BNCT) for the treatment of cancers. Solving the reverse problem of determining the best neutron source for optimal BNCT treatment can be accomplished by separating the time-consuming particle-tracking process of a full Monte Carlo simulation from the calculation of the source weighting factors which is typically performed at the beginning of a Monte Carlo simulation. By post-processing these weighting factors on a recorded file of individual particle tally information, the effect of changing source variables can be realized in a matter of seconds, instead of requiring hours or days for additional complete simulations. By intelligent source biasing, any number of different source distributions can be calculated quickly from a single Monte Carlo simulation. The source description can be treated as variable and the effect of changing multiple interdependent source variables on the problem's solution can be determined. Though the focus of this study is on BNCT applications, this procedure may be applicable to any problem that involves a variable source.
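
    The key idea, recording one entry per source particle and re-weighting those entries for any new source distribution, can be sketched in a few lines. The record fields and source spectra below are hypothetical stand-ins, not the format used in the cited work.

    ```python
    # Sketch of post-processing individual particle tallies: re-weight stored
    # contributions by the ratio of a new source density to the density actually
    # sampled, so new source distributions are evaluated without re-running the
    # transport. All data and distributions below are illustrative.
    import numpy as np

    rng = np.random.default_rng(1)

    # Pretend "tally file": one record per source particle, here just
    # (sampled_energy_MeV, dose_contribution), generated by a single run that
    # sampled energies uniformly on [0, 10] MeV.
    n_records = 200_000
    sampled_energy = rng.uniform(0.0, 10.0, n_records)
    dose_contribution = np.exp(-0.3 * sampled_energy) * rng.exponential(1.0, n_records)

    uniform_pdf = lambda e: np.full_like(e, 1.0 / 10.0)   # density used in the run

    def reweighted_dose(source_pdf, sampling_pdf=uniform_pdf):
        """Estimate the tally for a new source spectrum without re-simulating."""
        weights = source_pdf(sampled_energy) / sampling_pdf(sampled_energy)
        return np.mean(weights * dose_contribution)

    # Two candidate source spectra evaluated in seconds rather than via new runs
    flat = uniform_pdf
    soft = lambda e: 0.5 * np.exp(-0.5 * e) / (1.0 - np.exp(-5.0))  # truncated exponential

    print("flat-spectrum dose estimate:", reweighted_dose(flat))
    print("soft-spectrum dose estimate:", reweighted_dose(soft))
    ```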

  8. Imperfect physician assistant and physical therapist admissions processes in the United States

    PubMed Central

    2014-01-01

    We compared and contrasted physician assistant and physical therapy profession admissions processes based on the similar number of accredited programs in the United States and the co-existence of many programs in the same school of health professions, because both professions conduct similar centralized application procedures administered by the same organization. Many studies are critical of the fallibility and inadequate scientific rigor of the high-stakes nature of health professions admissions decisions, yet typical admission processes remain very similar. Cognitive variables, most notably undergraduate grade point averages, have been shown to be the best predictors of academic achievement in the health professions. The variability of non-cognitive attributes assessed and the methods used to measure them have come under increasing scrutiny in the literature. The variance in health professions students’ performance in the classroom and on certifying examinations remains unexplained, and cognitive considerations vary considerably between and among programs that describe them. One uncertainty resulting from this review is whether or not desired candidate attributes highly sought after by individual programs are more student-centered or graduate-centered. Based on the findings from the literature, we suggest that student success in the classroom versus the clinic is based on a different set of variables. Given the range of positions and general lack of reliability and validity in studies of non-cognitive admissions attributes, we think that health professions admissions processes remain imperfect works in progress. PMID:24810020

  9. Attentional and physiological processing of food images in functional dyspepsia patients: A pilot study.

    PubMed

    Lee, In-Seon; Preissl, Hubert; Giel, Katrin; Schag, Kathrin; Enck, Paul

    2018-01-23

    The food-related behavior of functional dyspepsia has been attracting more interest of late. This pilot study aims to provide evidence of the physiological, emotional, and attentional aspects of food processing in functional dyspepsia patients. The study was performed in 15 functional dyspepsia patients and 17 healthy controls after a standard breakfast. We measured autonomic nervous system activity using skin conductance response and heart rate variability, emotional response using facial electromyography, and visual attention using eyetracking during the visual stimuli of food/non-food images. In comparison to healthy controls, functional dyspepsia patients showed a greater craving for food, a decreased intake of food, more dyspeptic symptoms, lower pleasantness rating of food images (particularly of high fat), decreased low frequency/high frequency ratio of heart rate variability, and suppressed total processing time of food images. There were no significant differences of skin conductance response and facial electromyography data between groups. The results suggest that high level cognitive functions rather than autonomic and emotional mechanisms are more liable to function differently in functional dyspepsia patients. Abnormal dietary behavior, reduced subjective rating of pleasantness and visual attention to food should be considered as important pathophysiological characteristics in functional dyspepsia.

  10. Do team processes really have an effect on clinical performance? A systematic literature review.

    PubMed

    Schmutz, J; Manser, T

    2013-04-01

    There is a growing literature on the relationship between team processes and clinical performance. The purpose of this review is to summarize these articles and examine the impact of team process behaviours on clinical performance. We conducted a literature search in five major databases. Inclusion criteria were: English peer-reviewed papers published between January 2001 and May 2012, which showed or tried to show (i) a statistical relationship of a team process variable and clinical performance or (ii) an improvement of a performance variable through a team process intervention. Study quality was assessed using predefined quality indicators. For every study, we calculated the relevant effect sizes. We included 28 studies in the review, seven of which were intervention studies. Every study reported at least one significant relationship between team processes or an intervention and performance. Also, some non-significant effects were reported. Most of the reported effect sizes were large or medium. The study quality ranged from medium to high. The studies are highly diverse regarding the specific team process behaviours investigated and also regarding the methods used. However, they suggest that team process behaviours do influence clinical performance and that training results in increased performance. Future research should rely on existing theoretical frameworks, valid, and reliable methods to assess processes such as teamwork or coordination and focus on the development of adequate tools to assess process performance, linking them with outcomes in the clinical setting.

  11. ERP correlates of word production predictors in picture naming: a trial by trial multiple regression analysis from stimulus onset to response.

    PubMed

    Valente, Andrea; Bürki, Audrey; Laganaro, Marina

    2014-01-01

    A major effort in cognitive neuroscience of language is to define the temporal and spatial characteristics of the core cognitive processes involved in word production. One approach consists in studying the effects of linguistic and pre-linguistic variables in picture naming tasks. So far, studies have analyzed event-related potentials (ERPs) during word production by examining one or two variables with factorial designs. Here we extended this approach by investigating simultaneously the effects of multiple theoretically relevant predictors in a picture naming task. High-density EEG was recorded on 31 participants during overt naming of 100 pictures. ERPs were extracted on a trial by trial basis from picture onset to 100 ms before the onset of articulation. Mixed-effects regression analyses were conducted to examine which variables affected production latencies and the duration of periods of stable electrophysiological patterns (topographic maps). Results revealed an effect of a pre-linguistic variable, visual complexity, on an early period of stable electric field at scalp, from 140 to 180 ms after picture presentation, a result consistent with the proposal that this time period is associated with visual object recognition processes. Three other variables, word Age of Acquisition, Name Agreement, and Image Agreement, influenced response latencies and modulated ERPs from ~380 ms to the end of the analyzed period. These results demonstrate that a topographic analysis fitted into the single trial ERPs and covering the entire processing period allows one to associate the cost generated by psycholinguistic variables with the duration of specific stable electrophysiological processes and to pinpoint the precise time-course of multiple word production predictors at once.

  12. Compensation for Lithography Induced Process Variations during Physical Design

    NASA Astrophysics Data System (ADS)

    Chin, Eric Yiow-Bing

    This dissertation addresses the challenge of designing robust integrated circuits in the deep sub-micron regime in the presence of lithography process variability. By extending and combining existing process and circuit analysis techniques, flexible software frameworks are developed to provide detailed studies of circuit performance in the presence of lithography variations such as focus and exposure. Applications of these software frameworks to select circuits demonstrate the electrical impact of these variations and provide insight into variability aware compact models that capture the process dependent circuit behavior. These variability aware timing models abstract lithography variability from the process level to the circuit level and are used to estimate path level circuit performance with high accuracy with very little overhead in runtime. The Interconnect Variability Characterization (IVC) framework maps lithography induced geometrical variations at the interconnect level to electrical delay variations. This framework is applied to one dimensional repeater circuits patterned with both 90nm single patterning and 32nm double patterning technologies, under the presence of focus, exposure, and overlay variability. Studies indicate that single and double patterning layouts generally exhibit small variations in delay (between 1 and 3%) due to self-compensating RC effects associated with dense layouts, and overlay errors for layouts without self-compensating RC effects. The delay response of each double patterned interconnect structure is fit with a second order polynomial model with focus, exposure, and misalignment parameters with 12 coefficients and residuals of less than 0.1 ps. The IVC framework is also applied to a repeater circuit with cascaded interconnect structures to emulate more complex layout scenarios, and it is observed that the variations on each segment average out to reduce the overall delay variation. The Standard Cell Variability Characterization (SCVC) framework advances existing layout-level lithography aware circuit analysis by extending it to cell-level applications utilizing a physically accurate approach that integrates process simulation, compact transistor models, and circuit simulation to characterize electrical cell behavior. This framework is applied to combinational and sequential cells in the Nangate 45nm Open Cell Library, and the timing response of these cells to lithography focus and exposure variations demonstrates Bossung-like behavior. This behavior permits the process parameter dependent response to be captured in a nine-term variability aware compact model based on Bossung fitting equations. For a two input NAND gate, the variability aware compact model captures the simulated response to an accuracy of 0.3%. The SCVC framework is also applied to investigate advanced process effects including misalignment and layout proximity. The abstraction of process variability from the layout level to the cell level opens up an entire new realm of circuit analysis and optimization and provides a foundation for path level variability analysis without the computationally expensive costs associated with joint process and circuit simulation. The SCVC framework is used with slight modification to illustrate the speedup and accuracy tradeoffs of using compact models. With variability aware compact models, the process dependent performance of a three stage logic circuit can be estimated to an accuracy of 0.7% with a speedup of over 50,000.
Path level variability analysis also provides an accurate estimate (within 1%) of ring oscillator period in well under a second. Another significant advantage of variability aware compact models is that they can be easily incorporated into existing design methodologies for design optimization. This is demonstrated by applying cell swapping on a logic circuit to reduce the overall delay variability along a circuit path. By including these variability aware compact models in cell characterization libraries, design metrics such as circuit timing, power, area, and delay variability can be quickly assessed to optimize for the correct balance of all design metrics, including delay variability. Deterministic lithography variations can be easily captured using the variability aware compact models described in this dissertation. However, another prominent source of variability is random dopant fluctuations, which affect transistor threshold voltage and in turn circuit performance. The SCVC framework is utilized to investigate the interactions between deterministic lithography variations and random dopant fluctuations. Monte Carlo studies show that the output delay distribution in the presence of random dopant fluctuations is dependent on lithography focus and exposure conditions, with a 3.6 ps change in standard deviation across the focus exposure process window. This indicates that the electrical impact of random variations is dependent on systematic lithography variations, and this dependency should be included for precise analysis.
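
    The dissertation's exact nine-term Bossung-based model is not reproduced here; the sketch below only illustrates the general workflow with a generic second-order polynomial in focus and exposure and entirely made-up characterization data: fit the compact model once, then evaluate process corners with a dot product instead of a full process-plus-circuit simulation.

    ```python
    # Hedged sketch of a "variability-aware compact model": fit cell delay as a
    # low-order polynomial in lithography focus and exposure dose (Bossung-like
    # behaviour). Functional form and data are illustrative only.
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical characterization grid: focus (um), exposure dose (mJ/cm^2)
    focus = np.linspace(-0.10, 0.10, 9)
    dose = np.linspace(28.0, 32.0, 9)
    F, D = np.meshgrid(focus, dose)
    f, d = F.ravel(), D.ravel()

    # Pretend "simulated" delays (ps) with a Bossung-like bowl shape plus noise
    delay = (22.0 + 180.0 * f**2 + 0.6 * (d - 30.0)
             + 4.0 * f * (d - 30.0) + rng.normal(0.0, 0.05, f.size))

    # Second-order compact model in (focus, dose)
    A = np.column_stack([np.ones_like(f), f, d, f**2, d**2, f * d])
    coef, *_ = np.linalg.lstsq(A, delay, rcond=None)

    def compact_delay(focus_um, dose_mj):
        x = np.array([1.0, focus_um, dose_mj, focus_um**2, dose_mj**2, focus_um * dose_mj])
        return float(x @ coef)

    # Evaluating a process corner is now a dot product instead of a simulation
    print("delay at nominal        :", compact_delay(0.00, 30.0))
    print("delay at defocus corner :", compact_delay(0.08, 28.5))
    ```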

  13. Waste generated in high-rise buildings construction: a quantification model based on statistical multiple regression.

    PubMed

    Parisi Kern, Andrea; Ferreira Dias, Michele; Piva Kulakowski, Marlova; Paulo Gomes, Luciana

    2015-05-01

    Reducing construction waste is becoming a key environmental issue in the construction industry. The quantification of waste generation rates in the construction sector is an invaluable management tool in supporting mitigation actions. However, the quantification of waste can be a difficult process because of the specific characteristics and the wide range of materials used in different construction projects. Large variations are observed in the methods used to predict the amount of waste generated because of the range of variables involved in construction processes and the different contexts in which these methods are employed. This paper proposes a statistical model to determine the amount of waste generated in the construction of high-rise buildings by assessing the influence of design process and production system, often mentioned as the major culprits behind the generation of waste in construction. Multiple regression was used to conduct a case study based on multiple sources of data from eighteen residential buildings. The resulting statistical model relates the dependent variable (i.e. amount of waste generated) to independent variables associated with the design and the production system used. The best regression model obtained from the sample data resulted in an adjusted R² value of 0.694, meaning that it explains approximately 69% of the variance in waste generation for similar constructions. Most independent variables showed a low determination coefficient when assessed in isolation, which emphasizes the importance of assessing their joint influence on the response (dependent) variable. Copyright © 2015 Elsevier Ltd. All rights reserved.
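
    For readers unfamiliar with the approach, a minimal multiple-regression sketch of this kind of model (with invented predictors and data, not the eighteen buildings studied) looks like the following; the adjusted R² is the statistic quoted above.

    ```python
    # Minimal multiple-regression sketch: predict waste generated from a few
    # design/production predictors and report the adjusted R^2. Predictor names
    # and values are invented for illustration only.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 18  # eighteen buildings, as in the study

    # Hypothetical predictors: built area (m^2), number of floors, % masonry walls
    X = np.column_stack([
        rng.uniform(2_000, 15_000, n),
        rng.integers(8, 30, n).astype(float),
        rng.uniform(20, 80, n),
    ])
    waste = 0.02 * X[:, 0] + 3.0 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(0, 40, n)  # tonnes

    A = np.column_stack([np.ones(n), X])
    coef, *_ = np.linalg.lstsq(A, waste, rcond=None)
    residuals = waste - A @ coef

    p = X.shape[1]
    r2 = 1 - np.sum(residuals ** 2) / np.sum((waste - waste.mean()) ** 2)
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
    print(f"R^2 = {r2:.3f}, adjusted R^2 = {adj_r2:.3f}")
    ```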

  14. Drugs meeting the molecular basis of diabetic kidney disease: bridging from molecular mechanism to personalized medicine.

    PubMed

    Lambers Heerspink, Hiddo J; Oberbauer, Rainer; Perco, Paul; Heinzel, Andreas; Heinze, Georg; Mayer, Gert; Mayer, Bernd

    2015-08-01

    Diabetic kidney disease (DKD) is a complex, multifactorial disease and is associated with a high risk of renal and cardiovascular morbidity and mortality. Clinical practice guidelines for diabetes recommend essentially identical treatments for all patients without taking into account how the individual responds to the instituted therapy. Yet, individuals vary widely in how they respond to medications and therefore optimal therapy differs between individuals. Understanding the underlying molecular mechanisms of variability in drug response will help tailor optimal therapy. Polymorphisms in genes related to drug pharmacokinetics have been used to explore mechanisms of response variability in DKD, but with limited success. The complex interaction between genetic make-up and environmental factors on the abundance of proteins and metabolites renders pharmacogenomics alone insufficient to fully capture response variability. A complementary approach is to attribute drug response variability to individual variability in underlying molecular mechanisms involved in the progression of disease. The interplay of different processes (e.g. inflammation, fibrosis, angiogenesis, oxidative stress) appears to drive disease progression, but the individual contribution of each process varies. Drugs, on the other hand, address specific targets and thereby interfere in certain disease-associated processes. At this level, biomarkers may help to gain insight into which specific pathophysiological processes are involved in an individual, followed by a rational assessment of whether a specific drug's mode of action indeed targets the relevant process at hand. This article describes the conceptual background and data-driven workflow developed by the SysKid consortium aimed at improving characterization of the molecular mechanisms underlying DKD at the interference of the molecular impact of individual drugs in order to tailor optimal therapy to individual patients. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.

  15. Method of and apparatus for thermomagnetically processing a workpiece

    DOEpatents

    Kisner, Roger A.; Rios, Orlando; Wilgen, John B.; Ludtka, Gerard M.; Ludtka, Gail M.

    2014-08-05

    A method of thermomagnetically processing a material includes disposing a workpiece within a bore of a magnet; exposing the workpiece to a magnetic field of at least about 1 Tesla generated by the magnet; and, while exposing the workpiece to the magnetic field, applying heat energy to the workpiece at a plurality of frequencies to achieve spatially-controlled heating of the workpiece. An apparatus for thermomagnetically processing a material comprises: a high field strength magnet having a bore extending therethrough for insertion of a workpiece therein; and an energy source disposed adjacent to an entrance to the bore. The energy source is an emitter of variable frequency heat energy, and the bore comprises a waveguide for propagation of the variable frequency heat energy from the energy source to the workpiece.

  16. Regional variation of flow duration curves in the eastern United States: Process-based analyses of the interaction between climate and landscape properties

    NASA Astrophysics Data System (ADS)

    Chouaib, Wafa; Caldwell, Peter V.; Alila, Younes

    2018-04-01

    This paper advances the physical understanding of regional variation in the flow duration curve (FDC). It provides a process-based analysis of the interaction between climate and landscape properties to explain disparities in FDC shapes. We used (i) long-term measured flow and precipitation data from 73 catchments in the eastern US and (ii) the calibrated Sacramento model (SAC-SMA) to simulate soil moisture and flow-component FDCs. The classification of catchments based on storm characteristics pointed to the effect of catchment landscape properties on precipitation variability and, consequently, on FDC shapes. The landscape effect was pronounced: low values of the slope of the FDC (SFDC), indicating limited flow variability, occurred in regions of high precipitation variability, whereas in regions with low precipitation variability the SFDC values were larger. The topographic index distribution at the catchment scale indicated that saturation excess overland flow mitigated flow variability at low elevations with large soil moisture storage capacity and high infiltration rates. The SFDC increased where subsurface stormflow predominated, in catchments at high elevations with limited soil moisture storage capacity and low infiltration rates. Our analyses also highlighted the major role of soil infiltration rates on the FDC beyond the effects of the predominant runoff generation mechanism and catchment elevation. Under slow infiltration rates in soils of large moisture storage capacity (at low elevations) with predominant saturation excess, SFDC values were larger; conversely, the SFDC decreased in catchments with prevalent subsurface stormflow and poorly drained soils of small moisture storage capacity. The analysis of the flow-component FDCs showed that the interflow contribution to the response was highest in catchments with large SFDC values. The surface flow FDC was the most affected by precipitation, as it tracked the precipitation duration curve (PDC). In catchments with low SFDC, this became less applicable, as the surface flow FDC diverged from the PDC at the upper tail (>40% flow percentile). The interflow and baseflow FDCs most clearly illustrated the filtering effect on precipitation. The process understanding achieved in this study is key for flow simulation and assessment, as well as for future work on process-based FDC prediction.
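
    A small worked example of the two quantities at the centre of this abstract, the FDC and its mid-segment slope (SFDC), is given below. The slope definition (log-flow difference between the 33% and 66% exceedance flows) is a common convention assumed here, not necessarily the paper's exact formula, and the daily flow series is synthetic.

    ```python
    # Flow duration curve (FDC) and its mid-segment slope (SFDC) for a synthetic
    # daily flow series. The SFDC formula below is a commonly used convention.
    import numpy as np

    rng = np.random.default_rng(4)
    daily_flow = rng.lognormal(mean=1.0, sigma=0.8, size=365 * 10)  # m^3/s, synthetic

    def flow_duration_curve(q):
        """Return exceedance probabilities and the corresponding sorted flows."""
        q_sorted = np.sort(q)[::-1]
        exceedance = np.arange(1, q.size + 1) / (q.size + 1)
        return exceedance, q_sorted

    def slope_fdc(q):
        """Mid-segment slope of the FDC between the 33% and 66% exceedance flows."""
        q33 = np.percentile(q, 100 - 33)   # flow exceeded 33% of the time
        q66 = np.percentile(q, 100 - 66)   # flow exceeded 66% of the time
        return (np.log(q33) - np.log(q66)) / (0.66 - 0.33)

    exceedance, q_sorted = flow_duration_curve(daily_flow)
    print("flow exceeded 10% of the time:", np.percentile(daily_flow, 90))
    print("SFDC:", slope_fdc(daily_flow))
    ```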

  17. Microbial production of polyhydroxybutyrate with tailor-made properties: an integrated modelling approach and experimental validation.

    PubMed

    Penloglou, Giannis; Chatzidoukas, Christos; Kiparissides, Costas

    2012-01-01

    The microbial production of polyhydroxybutyrate (PHB) is a complex process in which the final quantity and quality of the PHB depend on a large number of process operating variables. Consequently, the design and optimal dynamic operation of a microbial process for the efficient production of PHB with tailor-made molecular properties is an extremely interesting problem. The present study investigates how key process operating variables (i.e., nutritional and aeration conditions) affect the biomass production rate and the PHB accumulation in the cells and its associated molecular weight distribution. A combined metabolic/polymerization/macroscopic modelling approach, relating the process performance and product quality with the process variables, was developed and validated using an extensive series of experiments and measurements. The model predicts the dynamic evolution of the biomass growth, the polymer accumulation, the consumption of carbon and nitrogen sources and the average molecular weights of the PHB in a bioreactor, under batch and fed-batch operating conditions. The proposed integrated model was used for the model-based optimization of the production of PHB with tailor-made molecular properties in Azohydromonas lata bacteria. The process optimization led to a high intracellular PHB accumulation (up to 95% g of PHB per g of DCW) and the production of different grades (i.e., different molecular weight distributions) of PHB. Copyright © 2011 Elsevier Inc. All rights reserved.

  18. How do formulation and process parameters impact blend and unit dose uniformity? Further analysis of the product quality research institute blend uniformity working group industry survey.

    PubMed

    Hancock, Bruno C; Garcia-Munoz, Salvador

    2013-03-01

    Responses from the second Product Quality Research Institute (PQRI) Blend Uniformity Working Group (BUWG) survey of industry have been reanalyzed to identify potential links between formulation and processing variables and the measured uniformity of blends and unit dosage forms. As expected, the variability of the blend potency and tablet potency data increased with a decrease in the loading of the active pharmaceutical ingredient (API). There was also an inverse relationship between the nominal strength of the unit dose and the blend uniformity data. The data from the PQRI industry survey do not support the commonly held viewpoint that granulation processes are necessary to create and sustain tablet and capsule formulations with a high degree of API uniformity. There was no correlation between the blend or tablet potency variability and the type of process used to manufacture the product. Although it is commonly believed that direct compression processes should be avoided for low API loading formulations because of blend and tablet content uniformity concerns, the data for direct compression processes reported by the respondents to the PQRI survey suggest that such processes are being used routinely to manufacture solid dosage forms of acceptable quality even when the drug loading is quite low. Copyright © 2012 Wiley Periodicals, Inc.

  19. Characterizing the variability of food waste quality: A need for efficient valorisation through anaerobic digestion.

    PubMed

    Fisgativa, Henry; Tremier, Anne; Dabert, Patrick

    2016-04-01

    In order to determine the variability of food waste (FW) characteristics and the influence of these variable values on the anaerobic digestion (AD) process, FW characteristics from 70 papers were compiled and analysed statistically. Results indicated that FW characteristic values are indeed highly variable and that 24% of these variations may be explained by the geographical origin, the type of collection source and the season of collection. Considering the whole range of values for physicochemical characteristics (especially volatile solids (VS), chemical oxygen demand (COD) and biomethane potential (BMP)), FW show good potential for AD treatment. However, the high carbohydrate content (36.4%VS) and the low pH (5.1) might cause inhibitions by the rapid acidification of the digesters. Based on this variation in FW characteristics, FW categories were proposed. Moreover, the adequacy of FW characteristics with AD treatment was discussed. Four FW categories were identified with characteristic values critical for AD performance: (1) the high dry matter (DM) and total ammonia nitrogen (TAN) content of FW collected with green waste, (2) the high cellulose (CEL) content of FW from the organic fraction of municipal solid waste, (3) the low carbon-to-nitrogen (C/N) ratio of FW collected during summer, (4) the high value of TAN and Na of FW from Asia. For these cases, an aerobic pre-treatment or a corrective treatment seems advisable to avoid instabilities during digestion. Finally, the results of this review-paper provide a database of FW characteristic values that could be used for AD process design and environmental assessment. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Parent-child attachment, academic performance and the process of high-school dropout: a narrative review.

    PubMed

    Ramsdal, Gro; Bergvik, Svein; Wynn, Rolf

    2015-01-01

    Poor academic performance is a strong predictor of school dropout. Researchers have tried to disentangle variables influencing academic performance. However, studies on preschool and early care variables are seldom examined when explaining the school dropout process. We reviewed the literature on the relationship between caregiver-child attachment and academic performance, including attachment studies from preschool years, seeking out potential contributions to academic performance and the dropout process. The review was organized according to a model of four main mediating hypotheses: the attachment-teaching hypothesis, the social network hypothesis, the attachment-cooperation hypothesis, and the attachment self-regulation hypothesis. The results of the review are summed up in a model. There is some support for all four hypotheses. The review indicates that attachment and early care contribute substantially to dropout and graduation processes. Mediation effects should be given far more attention in future research.

  1. Effect of initial bulk density on high-solids anaerobic digestion of MSW: General mechanism.

    PubMed

    Caicedo, Luis M; Wang, Hongtao; Lu, Wenjing; De Clercq, Djavan; Liu, Yanjun; Xu, Sai; Ni, Zhe

    2017-06-01

    Initial bulk density (IBD) is an important variable in anaerobic digestion since it defines and optimizes the treatment capacity of a system. This study reveals the mechanism by which IBD might affect anaerobic digestion of waste. Four different IBD values: D1 (500-700 kg m−3), D2 (900-1000 kg m−3), D3 (1100-1200 kg m−3) and D4 (1200-1400 kg m−3) were set and tested over a period of 90 days in simulated landfill reactors. The main variables affected by the IBD are the methane generation, saturation degree, extraction of organic matter, and the total population of methanogens. The study identified that IBD >1000 kg m−3 may have a significant effect on methane generation, either prolonging the lag time or completely inhibiting the process. This study provides a new understanding of the anaerobic digestion process in saturated high-solids systems. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Mediators and treatment matching in behavior therapy, cognitive therapy and cognitive behavior therapy for chronic insomnia.

    PubMed

    Harvey, Allison G; Dong, Lu; Bélanger, Lynda; Morin, Charles M

    2017-10-01

    To examine the mediators and the potential of treatment matching to improve outcome for cognitive behavior therapy (CBT) for insomnia. Participants were 188 adults (117 women; mean age = 47.4 years, SD = 12.6) meeting the Diagnostic and Statistical Manual of Mental Disorders (4th ed.; text rev.; DSM-IV-TR; American Psychiatric Association [APA], 2000) diagnostic criteria for chronic insomnia (mean duration: 14.5 years, SD: 12.8). Participants were randomized to behavior therapy (BT; n = 63), cognitive therapy (CT; n = 65), or CBT (n = 60). The outcome measure was the Insomnia Severity Index (ISI). Hypothesized BT mediators were sleep-incompatible behaviors, bedtime variability (BTv), risetime variability (RTv) and time in bed (TIB). Hypothesized CT mediators were worry, unhelpful beliefs, and monitoring for sleep-related threat. The behavioral processes mediated outcome for BT but not CT. The cognitive processes mediated outcome in both BT and CT. The subgroup scoring high on both behavioral and cognitive processes had a marginally significant better outcome if they received CBT relative to BT or CT. The subgroup scoring relatively high on behavioral but low on cognitive processes who received BT or CBT did not differ from those who received CT. The subgroup scoring relatively high on cognitive but low on behavioral processes who received CT or CBT did not differ from those who received BT. The behavioral mediators were specific to BT relative to CT. The cognitive mediators were significant for both BT and CT outcomes. Patients exhibiting high levels of both behavioral and cognitive processes achieve better outcome if they receive CBT relative to BT or CT alone. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Hot mill process parameters impacting on hot mill tertiary scale formation

    NASA Astrophysics Data System (ADS)

    Kennedy, Jonathan Ian

    For high-end steel applications, surface quality is paramount to delivering a suitable product. A major cause of surface quality issues is the formation of tertiary scale. The scale formation depends on numerous factors such as thermo-mechanical processing routes, chemical composition, thickness and rolls used. This thesis utilises a collection of data mining techniques to better understand the influence of Hot Mill process parameters on scale formation at Port Talbot Hot Strip Mill in South Wales. The dataset to which these data mining techniques were applied was carefully chosen to reduce process variation. Several main factors were considered to minimise this variability, including the time period, grade and gauge investigated. The following data mining techniques were chosen to investigate this dataset: Partial Least Squares (PLS); Logit Analysis; Principal Component Analysis (PCA); Multinomial Logistic Regression (MLR); Adaptive Neuro-Fuzzy Inference Systems (ANFIS). The analysis indicated that the most significant variable for scale formation is the temperature entering the finishing mill. If the temperature is controlled on entering the finishing mill, scale will not be formed. Values greater than 1070 °C for the average Roughing Mill temperature and above 1050 °C for the average Crop Shear temperature are considered high, with greater values increasing the chance of scale formation. As the temperature increases, more scale suppression measures are required to limit scale formation, with high temperatures more likely to generate a greater amount of scale even with fully functional scale suppression systems in place. Chemistry is also a significant factor in scale formation, with phosphorus being the most significant of the chemistry variables. It is recommended that the chemistry specification for phosphorus be limited to a maximum value of 0.015% rather than 0.020% to limit scale formation. Slabs with higher values should be treated with particular care when being processed through the Hot Mill to limit scale formation.
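
    As a purely illustrative companion to the list of techniques above, the snippet below applies one of them (partial least squares) to fabricated hot-mill-style data; the variable names and relationships are assumptions, not Port Talbot measurements.

    ```python
    # Partial least squares sketch: relate hot-mill process variables to a
    # scale-severity measure. Data and relationships are fabricated.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(5)
    n = 500

    roughing_mill_temp = rng.normal(1060, 20, n)   # degC
    crop_shear_temp = rng.normal(1040, 15, n)      # degC
    phosphorus = rng.normal(0.012, 0.004, n)       # wt %
    strip_gauge = rng.normal(3.0, 0.3, n)          # mm

    X = np.column_stack([roughing_mill_temp, crop_shear_temp, phosphorus, strip_gauge])
    # Pretend scale-severity index dominated by finishing-mill entry temperature
    scale = 0.05 * (crop_shear_temp - 1050) + 40 * phosphorus + rng.normal(0, 0.3, n)

    pls = PLSRegression(n_components=2)
    pls.fit(X, scale)

    # Weights on the first latent variable indicate which process variables carry
    # most of the covariance with scale formation
    print("X weights (component 1):", pls.x_weights_[:, 0])
    print("R^2 on training data   :", pls.score(X, scale))
    ```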

  4. Very fast optical flaring from a possible new Galactic magnetar.

    PubMed

    Stefanescu, A; Kanbach, G; Słowikowska, A; Greiner, J; McBreen, S; Sala, G

    2008-09-25

    Highly luminous rapid flares are characteristic of processes around compact objects like white dwarfs, neutron stars and black holes. In the high-energy regime of X-rays and gamma-rays, outbursts with variabilities on timescales of seconds or less are routinely observed, for example in gamma-ray bursts or soft gamma-ray repeaters. At optical wavelengths, flaring activity on such timescales has not been observed, other than from the prompt phase of one exceptional gamma-ray burst. This is mostly due to the fact that outbursts with strong, fast flaring are usually discovered in the high-energy regime; most optical follow-up observations of such transients use instruments with integration times exceeding tens of seconds, which are therefore unable to resolve fast variability. Here we show the observation of extremely bright and rapid optical flaring in the Galactic transient SWIFT J195509.6+261406. Our optical light curves are phenomenologically similar to high-energy light curves of soft gamma-ray repeaters and anomalous X-ray pulsars, which are thought to be neutron stars with extremely high magnetic fields (magnetars). This suggests that similar processes are in operation, but with strong emission in the optical, unlike in the case of other known magnetars.

  5. Small-scale temporal and spatial variability in the abundance of plastic pellets on sandy beaches: Methodological considerations for estimating the input of microplastics.

    PubMed

    Moreira, Fabiana Tavares; Prantoni, Alessandro Lívio; Martini, Bruno; de Abreu, Michelle Alves; Stoiev, Sérgio Biato; Turra, Alexander

    2016-01-15

    Microplastics such as pellets have been reported for many years on sandy beaches around the globe. Nevertheless, high variability is observed in their estimates and distribution patterns across the beach environment are still to be unravelled. Here, we investigate the small-scale temporal and spatial variability in the abundance of pellets in the intertidal zone of a sandy beach and evaluate factors that can increase the variability in data sets. The abundance of pellets was estimated during twelve consecutive tidal cycles, identifying the position of the high tide between cycles and sampling drift-lines across the intertidal zone. We demonstrate that beach dynamic processes such as the overlap of strandlines and artefacts of the methods can increase the small-scale variability. The results obtained are discussed in terms of the methodological considerations needed to understand the distribution of pellets in the beach environment, with special implications for studies focused on patterns of input. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Genetic variability in captive populations of the stingless bee Tetragonisca angustula.

    PubMed

    Santiago, Leandro R; Francisco, Flávio O; Jaffé, Rodolfo; Arias, Maria C

    2016-08-01

    Low genetic variability has normally been considered a consequence of animal husbandry and a major contributing factor to declining bee populations. Here, we performed a molecular analysis of captive and wild populations of the stingless bee Tetragonisca angustula, one of the most commonly kept species across South America. Microsatellite analyses showed similar genetic variability between wild and captive populations. However, captive populations showed lower mitochondrial genetic variability. Male-mediated gene flow, transport and division of nests are suggested as the most probable explanations for the observed patterns of genetic structure. We conclude that increasing the number of colonies kept through nest divisions does not negatively affect nuclear genetic variability, which seems to be maintained by small-scale male dispersal and human-mediated nest transport. However, the transport of nests from distant localities should be practiced with caution given the high genetic differentiation observed between samples from western and eastern areas. The high genetic structure observed is the result of a long-term evolutionary process, and bees from distant localities may represent unique evolutionary lineages.

  7. Recollection is a continuous process: implications for dual-process theories of recognition memory.

    PubMed

    Mickes, Laura; Wais, Peter E; Wixted, John T

    2009-04-01

    Dual-process theory, which holds that recognition decisions can be based on recollection or familiarity, has long seemed incompatible with signal detection theory, which holds that recognition decisions are based on a singular, continuous memory-strength variable. Formal dual-process models typically regard familiarity as a continuous process (i.e., familiarity comes in degrees), but they construe recollection as a categorical process (i.e., recollection either occurs or does not occur). A continuous process is characterized by a graded relationship between confidence and accuracy, whereas a categorical process is characterized by a binary relationship such that high confidence is associated with high accuracy but all lower degrees of confidence are associated with chance accuracy. Using a source-memory procedure, we found that the relationship between confidence and source-recollection accuracy was graded. Because recollection, like familiarity, is a continuous process, dual-process theory is more compatible with signal detection theory than previously thought.

  8. Advances in the Development of a WCl6 CVD System for Coating UO2 Powders with Tungsten

    NASA Technical Reports Server (NTRS)

    Mireles, Omar R.; Tieman, Alyssa; Broadway, Jeramie; Hickman, Robert

    2013-01-01

    Demonstrated viability and utilization of: (a) a fluidized powder bed; (b) the WCl6 CVD process; (c) coating of spherical particles with tungsten. The highly corrosive nature of the WCl6 solid reagent limits the materials of construction. There are indications that identifying optimized process variables will require substantial effort and that the optima will likely vary with changes in fuel requirements.

  9. The swiss army knife of job submission tools: grid-control

    NASA Astrophysics Data System (ADS)

    Stober, F.; Fischer, M.; Schleper, P.; Stadie, H.; Garbers, C.; Lange, J.; Kovalchuk, N.

    2017-10-01

    grid-control is a lightweight and highly portable open source submission tool that supports all common workflows in high energy physics (HEP). It has been used by a sizeable number of HEP analyses to process tasks that sometimes consist of up to 100k jobs. grid-control is built around a powerful plugin and configuration system, that allows users to easily specify all aspects of the desired workflow. Job submission to a wide range of local or remote batch systems or grid middleware is supported. Tasks can be conveniently specified through the parameter space that will be processed, which can consist of any number of variables and data sources with complex dependencies on each other. Dataset information is processed through a configurable pipeline of dataset filters, partition plugins and partition filters. The partition plugins can take the number of files, size of the work units, metadata or combinations thereof into account. All changes to the input datasets or variables are propagated through the processing pipeline and can transparently trigger adjustments to the parameter space and the job submission. While the core functionality is completely experiment independent, full integration with the CMS computing environment is provided by a small set of plugins.

  10. The Variability of Atmospheric Deuterium Brightness at Mars: Evidence for Seasonal Dependence

    NASA Astrophysics Data System (ADS)

    Mayyasi, Majd; Clarke, John; Bhattacharyya, Dolon; Deighan, Justin; Jain, Sonal; Chaffin, Michael; Thiemann, Edward; Schneider, Nick; Jakosky, Bruce

    2017-10-01

    The enhanced ratio of deuterium to hydrogen on Mars has been widely interpreted as indicating the loss of a large column of water into space, and the hydrogen content of the upper atmosphere is now known to be highly variable. The variation in the properties of both deuterium and hydrogen in the upper atmosphere of Mars is indicative of the dynamical processes that produce these species and propagate them to altitudes where they can escape the planet. Understanding the seasonal variability of D is key to understanding the variability of the escape rate of water from Mars. Data from a 15 month observing campaign, made by the Mars Atmosphere and Volatile Evolution Imaging Ultraviolet Spectrograph high-resolution echelle channel, are used to determine the brightness of deuterium as observed at the limb of Mars. The D emission is highly variable, with a peak in brightness just after southern summer solstice. The trends of D brightness are examined against extrinsic as well as intrinsic sources. It is found that the fluctuations in deuterium brightness in the upper atmosphere of Mars (up to 400 km), corrected for periodic solar variations, vary on timescales that are similar to those of water vapor fluctuations lower in the atmosphere (20-80 km). The observed variability in deuterium may be attributed to seasonal factors such as regional dust storm activity and subsequent circulation lower in the atmosphere.

  11. Linking interannual variability in shelf bottom water properties to the California Undercurrent and local processes in the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Stone, H. B.; Banas, N. S.; Hickey, B. M.; MacCready, P.

    2016-02-01

    The Pacific Northwest coast is an unusually productive area with a strong river influence and highly variable upwelling-favorable and downwelling-favorable winds, but recent trends in hypoxia and ocean acidification in this region are troubling to both scientists and the general public. A new ROMS hindcast model of this region makes possible a study of interannual variability. This study of the interannual temperature and salinity variability on the Pacific Northwest coast is conducted using a coastal hindcast model (43°N - 50°N) spanning 2002-2009 from the University of Washington Coastal Modeling Group, with a resolution of 1.5 km over the shelf and slope. Analysis of hindcast model results was used to assess the relative importance of source water variability, including the poleward California Undercurrent, local and remote wind forcing, winter wind-driven mixing, and river influence in explaining the interannual variations in the shelf bottom layer (40 - 80 m depth, 10 m thick) and over the slope (150 - 250 m depth, <100 km from shelf break) at each latitude within the model domain. Characterized through tracking of the fraction of Pacific Equatorial Water (PEW) relative to Pacific Subarctic Upper Water (PSUW) present on the slope, slope water properties at all latitudes varied little throughout the time series, with the largest variability due to patterns of large north-south advection of water masses over the slope. Over the time series, the standard deviation of slope temperature was 0.09 ˚C, while slope salinity standard deviation was 0.02 psu. Results suggest that shelf bottom water interannual variability is not driven primarily by interannual variability in slope water as shelf bottom water temperature and salinity vary nearly 10 times more than those over the slope. Instead, interannual variability in shelf bottom water properties is likely driven by other processes, such as local and remote wind forcing, and winter wind-driven mixing. The relative contributions of these processes to interannual variability in shelf bottom water properties will be addressed. Overall, these results highlight the importance of shelf processes relative to large-scale influences on the interannual timescale in particular. Implications for variability in hypoxia and ocean acidification impacts will be discussed.

  12. The relationship of working memory, inhibition, and response variability in child psychopathology.

    PubMed

    Verté, Sylvie; Geurts, Hilde M; Roeyers, Herbert; Oosterlaan, Jaap; Sergeant, Joseph A

    2006-02-15

    The aim of this study was to investigate the relationship between working memory and inhibition in children with attention deficit hyperactivity disorder (ADHD), high-functioning autism (HFA), and Tourette syndrome (TS), compared to normally developing children. Furthermore, the contribution of variation in processing speed on working memory and inhibition was investigated in these childhood psychopathologies. Four groups of children are reported in this study: 65 children with ADHD, 66 children with HFA, 24 children with TS, and 82 normal control children. All children were in the age range of 6-13 years. The relationship between working memory and inhibition was similar in children with ADHD, HFA, TS, and normally developing children. The relationship between both domains did not alter significantly for any of the groups, when variation in processing speed was taken into account. More symptoms of hyperactivity/impulsivity are related to a poorer inhibitory process and greater response variability. More symptoms of autism are related to a poorer working memory process. The current study showed that working memory, inhibition, and response variability, are distinct, but related cognitive domains in children with developmental psychopathologies. Research with experimental manipulations is needed to tackle the exact relationship between these cognitive domains.

  13. Efficient SRAM yield optimization with mixture surrogate modeling

    NASA Astrophysics Data System (ADS)

    Zhongjian, Jiang; Zuochang, Ye; Yan, Wang

    2016-12-01

    Massively repeated cells such as SRAM cells usually require an extremely low failure rate to ensure a moderate chip yield. Though fast Monte Carlo methods such as importance sampling and its variants can be used for yield estimation, they are still very expensive if one needs to perform optimization based on such estimations. Typically, the yield calculation requires a large number of SPICE simulations, and circuit SPICE simulation accounts for the largest proportion of the computation time. In the paper, a new method is proposed to address this issue. The key idea is to establish an efficient mixture surrogate model. The surrogate model is based on the design variables and process variables. The model is constructed by running SPICE simulations to obtain a set of sample points, which are then used to train the mixture surrogate model with the lasso algorithm. Experimental results show that the proposed model calculates yield accurately and brings significant speed-ups to the calculation of the failure rate. Based on the model, we developed a further accelerated algorithm to enhance the speed of the yield calculation. It is suitable for high-dimensional process variables and multi-performance applications.
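
    The overall workflow described above (a limited number of expensive simulations, a sparse surrogate fitted with the lasso, then cheap Monte Carlo on the surrogate) can be sketched as follows; the "SPICE" evaluation is replaced by a toy analytic function, so everything below is illustrative rather than the paper's circuit or model.

    ```python
    # Surrogate-accelerated yield estimation sketch: train a lasso-regularized
    # polynomial surrogate on a few hundred "simulator" runs, then estimate the
    # failure rate by Monte Carlo on the surrogate. Entirely illustrative.
    import numpy as np
    from sklearn.linear_model import Lasso
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(6)

    def fake_spice(v):
        """Stand-in for an expensive SPICE margin evaluation over 6 process variables."""
        return (0.15 - 0.04 * v[:, 0] + 0.03 * v[:, 1] - 0.02 * v[:, 2] * v[:, 3]
                + 0.01 * v[:, 4] ** 2 + 0.002 * rng.normal(size=v.shape[0]))

    # A few hundred expensive samples to train the surrogate
    v_train = rng.normal(size=(400, 6))
    y_train = fake_spice(v_train)

    poly = PolynomialFeatures(degree=2, include_bias=False)
    surrogate = Lasso(alpha=1e-3, max_iter=10_000).fit(poly.fit_transform(v_train), y_train)

    # Many cheap surrogate evaluations for the failure-rate estimate
    v_mc = rng.normal(size=(500_000, 6))
    margin = surrogate.predict(poly.transform(v_mc))
    print(f"estimated failure rate per cell: {np.mean(margin < 0.0):.2e}")
    ```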

  14. Optimization of L-asparaginase production from novel Enterobacter sp., by submerged fermentation using response surface methodology.

    PubMed

    Erva, Rajeswara Reddy; Goswami, Ajgebi Nath; Suman, Priyanka; Vedanabhatla, Ravali; Rajulapati, Satish Babu

    2017-03-16

    The culture conditions and nutritional rations influencing the production of extracellular antileukemic enzyme by novel Enterobacter aerogenes KCTC2190/MTCC111 were optimized in shake-flask culture. Process variables like pH, temperature, incubation time, carbon and nitrogen sources, inducer concentration, and inoculum size were taken into account. In the present study, the highest enzyme activity achieved by the traditional one-variable-at-a-time method was 7.6 IU/mL, a 2.6-fold increase over the initial value. Further, L-asparaginase production was optimized using response surface methodology, and the validated experimental result at the optimized process variables gave 18.35 IU/mL of L-asparaginase activity, 2.4 times higher than that obtained with the traditional optimization approach. The study establishes E. aerogenes MTCC111 as a potential bacterial source for high yields of this antileukemic drug.
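
    As a generic illustration of the response-surface step (with invented coded design points and activities, not the study's data), one can fit a quadratic model and locate its stationary point, which is how RSM identifies the predicted optimum operating conditions:

    ```python
    # Response-surface sketch: fit a quadratic model of enzyme activity versus two
    # coded process variables and locate its stationary point. Data are invented.
    import numpy as np

    # Coded levels for two factors from a small central composite design
    X = np.array([
        [-1, -1], [1, -1], [-1, 1], [1, 1],
        [-1.414, 0], [1.414, 0], [0, -1.414], [0, 1.414],
        [0, 0], [0, 0], [0, 0],
    ])
    activity = np.array([9.1, 11.8, 10.2, 13.5, 8.9, 13.0, 9.5, 12.1, 17.9, 18.3, 18.1])  # IU/mL

    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])
    b0, b1, b2, b11, b22, b12 = np.linalg.lstsq(A, activity, rcond=None)[0]

    # Stationary point of y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    H = np.array([[2 * b11, b12], [b12, 2 * b22]])
    g = np.array([b1, b2])
    x_opt = np.linalg.solve(H, -g)
    y_opt = b0 + g @ x_opt + 0.5 * x_opt @ H @ x_opt
    print("optimum (coded units):", x_opt, " predicted activity:", y_opt)
    ```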

  15. Sensitivity analysis and nonlinearity assessment of steam cracking furnace process

    NASA Astrophysics Data System (ADS)

    Rosli, M. N.; Sudibyo; Aziz, N.

    2017-11-01

    In this paper, a sensitivity analysis and nonlinearity assessment of the steam cracking furnace process are presented. For the sensitivity analysis, the fractional factorial design method is employed to analyze the effect of the input parameters, which consist of four manipulated variables and two disturbance variables, on the output variables, and to identify the interactions between parameters. The result of the factorial design is used to screen out insignificant parameters and thereby reduce the complexity of the model; of the six input parameters, four are significant. After screening, a step test is performed on the significant input parameters to assess the degree of nonlinearity of the system. The result shows that the system is highly nonlinear with respect to changes in the air-to-fuel ratio (AFR) and the feed composition.
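
    A minimal sketch of the screening idea, under assumed numbers: a two-level 2^(6-2) fractional factorial design is generated for six factors and main effects are estimated from a stand-in response function. The generators, factor labels, and simulate_response function are placeholders, not the furnace model used in the paper.

    ```python
    # Illustrative two-level fractional factorial screen (2^(6-2), 16 runs) and
    # main-effect estimation; the factors and response function are invented here.
    import itertools
    import numpy as np

    # Full factorial in 4 base factors; generate the remaining 2 from interactions
    # (generators E = ABC, F = BCD), giving a 16-run 2^(6-2) design.
    base = np.array(list(itertools.product([-1, 1], repeat=4)), float)
    A, B, C, D = base.T
    design = np.column_stack([A, B, C, D, A * B * C, B * C * D])

    def simulate_response(x):
        """Stand-in for running the process model at one design point."""
        a, b, c, d, e, f = x
        return 5.0 + 2.0 * a + 1.5 * c - 1.0 * e + 0.8 * a * c + np.random.normal(0, 0.1)

    y = np.array([simulate_response(row) for row in design])

    # Main effect of a factor = mean(response at +1) - mean(response at -1).
    effects = {name: y[design[:, i] == 1].mean() - y[design[:, i] == -1].mean()
               for i, name in enumerate("ABCDEF")}
    for name, eff in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
        print(f"factor {name}: effect = {eff:+.2f}")
    ```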

  16. Working memory and intraindividual variability in processing speed: A lifespan developmental and individual-differences study.

    PubMed

    Mella, Nathalie; Fagot, Delphine; Lecerf, Thierry; de Ribaupierre, Anik

    2015-04-01

    Working memory (WM) and intraindividual variability (IIV) in processing speed are both hypothesized to reflect general attentional processes. In the present study, we aimed at exploring the relationship between WM capacity and IIV in reaction times (RTs) and its possible variation with development across the lifespan. Two WM tasks and six RT tasks of varying complexity were analyzed in a sample of 539 participants, consisting of five age groups: two groups of children (9-10 and 11-12 years of age), one group of young adults, and two groups of older adults (59-69 and 70-89 years of age). Two approaches were adopted. First, low-span and high-span individuals were identified, and analyses of variance were conducted comparing these two groups within each age group and for each RT task. The results consistently showed a span effect in the youngest children and oldest adults: High-span individuals were significantly faster and less variable than low-span individuals. In contrast, in young adults no difference was observed between high- and low-span individuals, whether in terms of their means or IIV. Second, multivariate analyses were conducted on the entire set of tasks, to determine whether IIV in RTs brought different information than the mean RT. The results showed that, although very strongly correlated, the mean and IIV in speed should be kept separate in terms of how they account for individual differences in WM. Overall, our results support the assumption of a link between WM capacity and IIV in RT, more strongly so in childhood and older adulthood.

  17. Perceptual Plasticity for Auditory Object Recognition

    PubMed Central

    Heald, Shannon L. M.; Van Hedger, Stephen C.; Nusbaum, Howard C.

    2017-01-01

    In our auditory environment, we rarely experience the exact acoustic waveform twice. This is especially true for communicative signals that have meaning for listeners. In speech and music, the acoustic signal changes as a function of the talker (or instrument), speaking (or playing) rate, and room acoustics, to name a few factors. Yet, despite this acoustic variability, we are able to recognize a sentence or melody as the same across various kinds of acoustic inputs and determine meaning based on listening goals, expectations, context, and experience. The recognition process relates acoustic signals to prior experience despite variability in signal-relevant and signal-irrelevant acoustic properties, some of which could be considered as “noise” in service of a recognition goal. However, some acoustic variability, if systematic, is lawful and can be exploited by listeners to aid in recognition. Perceivable changes in systematic variability can herald a need for listeners to reorganize perception and reorient their attention to more immediately signal-relevant cues. This view is not incorporated currently in many extant theories of auditory perception, which traditionally reduce psychological or neural representations of perceptual objects and the processes that act on them to static entities. While this reduction is likely done for the sake of empirical tractability, such a reduction may seriously distort the perceptual process to be modeled. We argue that perceptual representations, as well as the processes underlying perception, are dynamically determined by an interaction between the uncertainty of the auditory signal and constraints of context. This suggests that the process of auditory recognition is highly context-dependent in that the identity of a given auditory object may be intrinsically tied to its preceding context. To argue for the flexible neural and psychological updating of sound-to-meaning mappings across speech and music, we draw upon examples of perceptual categories that are thought to be highly stable. This framework suggests that the process of auditory recognition cannot be divorced from the short-term context in which an auditory object is presented. Implications for auditory category acquisition and extant models of auditory perception, both cognitive and neural, are discussed. PMID:28588524

  18. A Tire-Sulfur Hybrid Adsorption Denitrification (T-SHAD) process for decentralized wastewater treatment.

    PubMed

    Krayzelova, Lucie; Lynn, Thomas J; Banihani, Qais; Bartacek, Jan; Jenicek, Pavel; Ergas, Sarina J

    2014-09-15

    Nitrogen discharges from decentralized wastewater treatment (DWT) systems contribute to surface and groundwater contamination. However, the high variability in loading rates, long idle periods, and lack of regular maintenance present a challenge for biological nitrogen removal in DWT. A Tire-Sulfur Hybrid Adsorption Denitrification (T-SHAD) process was developed that combines nitrate (NO3(-)) adsorption to scrap tire chips with sulfur-oxidizing denitrification. This allows the tire chips to adsorb NO3(-) when the influent loading exceeds the denitrification capacity of the biofilm and release it when NO3(-) loading rates are low (e.g. at night). Three waste products, scrap tire chips, elemental sulfur pellets and crushed oyster shells, were used as media in adsorption, leaching, microcosm and up-flow packed bed bioreactor studies of NO3(-) removal from synthetic nitrified DWT wastewater. Adsorption isotherms showed that scrap tire chips have an adsorption capacity of 0.66 g NO3(-)-N kg(-1) of scrap tires. Leaching and microcosm studies showed that scrap tires leach bioavailable organic carbon that can support mixotrophic metabolism, resulting in lower effluent SO4(2-) concentrations than sulfur-oxidizing denitrification alone. In column studies, the T-SHAD process achieved high NO3(-)-N removal efficiencies under steady state (90%), variable flow (89%) and variable concentration (94%) conditions. Copyright © 2014 Elsevier Ltd. All rights reserved.
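
    The adsorption-capacity figure quoted above comes from isotherm fitting; a hedged sketch of that kind of fit is shown below using a Langmuir form. The model choice and the equilibrium data points are assumptions for illustration only — the study itself reports just the resulting capacity of 0.66 g NO3(-)-N kg(-1).

    ```python
    # Illustrative isotherm fit: estimate an adsorption capacity from batch
    # equilibrium data using a Langmuir model. The data points below are invented.
    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(c_eq, q_max, k_l):
        """q = q_max * K_L * C / (1 + K_L * C), with q in g N per kg of medium."""
        return q_max * k_l * c_eq / (1.0 + k_l * c_eq)

    c_eq = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])      # mg N / L at equilibrium
    q_obs = np.array([0.10, 0.21, 0.33, 0.45, 0.55, 0.62])   # g N / kg adsorbed

    (q_max, k_l), _ = curve_fit(langmuir, c_eq, q_obs, p0=[0.7, 0.05])
    print(f"fitted capacity q_max = {q_max:.2f} g N/kg, K_L = {k_l:.3f} L/mg")
    ```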

  19. Effect of feed moisture, extrusion temperature and screw speed on properties of soy white flakes based aquafeed: a response surface analysis.

    PubMed

    Singh, Sushil K; Muthukumarappan, Kasiviswanathan

    2016-04-01

    Soy white flakes (SWF) is an intermediate product during soy bean processing. It is an inexpensive, untoasted product containing around 51% crude protein, and it can be a potential source of protein to replace fish meal in aquafeed. The extrusion process is versatile and is used for the development of aquafeed. Our objective was to study the effects of inclusion of SWF (up to 50%) and other extrusion processing parameters, such as barrel temperature and screw speed, on the properties of aquafeed extrudates using a single-screw extruder. Extrudate properties, including pellet durability index, bulk density, water absorption and solubility indices and mass flow rate, were significantly (P < 0.05) affected by the process variables. SWF was the most significant variable, with quadratic effects on most of the properties. Increasing temperature and screw speed resulted in an increase in the durability and mass flow rate of the extrudates. Response surface regression models were established to correlate the properties of extrudates to the process variables. SWF was used as an alternative protein source to fish meal. Our study shows that aquafeed with high durability, lower bulk density, lower water absorption index and higher water solubility index can be obtained by adding up to 40% SWF. © 2015 Society of Chemical Industry.
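
    A hedged sketch of the kind of response-surface regression referred to above: a quadratic model with an interaction term is fitted to a synthetic data set so that the significance of each term can be read from the coefficient table. The variable names (swf, temp, speed, durability) and all values are placeholders, not the study's measurements.

    ```python
    # Hedged sketch of a response-surface regression for one extrudate property.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 30
    df = pd.DataFrame({
        "swf": rng.uniform(0, 50, n),        # % soy white flakes
        "temp": rng.uniform(100, 140, n),    # barrel temperature, deg C
        "speed": rng.uniform(100, 200, n),   # screw speed, rpm
    })
    # Synthetic response with a quadratic SWF effect, mimicking the reported pattern.
    df["durability"] = (80 + 0.9 * df.swf - 0.012 * df.swf ** 2
                        + 0.05 * df.temp + 0.02 * df.speed + rng.normal(0, 1, n))

    model = smf.ols(
        "durability ~ swf + I(swf**2) + temp + speed + swf:temp", data=df
    ).fit()
    print(model.summary().tables[1])   # coefficients with p-values for each term
    ```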

  20. Non-native Listeners’ Recognition of High-Variability Speech Using PRESTO

    PubMed Central

    Tamati, Terrin N.; Pisoni, David B.

    2015-01-01

    Background Natural variability in speech is a significant challenge to robust successful spoken word recognition. In everyday listening environments, listeners must quickly adapt and adjust to multiple sources of variability in both the signal and listening environments. High-variability speech may be particularly difficult to understand for non-native listeners, who have less experience with the second language (L2) phonological system and less detailed knowledge of sociolinguistic variation of the L2. Purpose The purpose of this study was to investigate the effects of high-variability sentences on non-native speech recognition and to explore the underlying sources of individual differences in speech recognition abilities of non-native listeners. Research Design Participants completed two sentence recognition tasks involving high-variability and low-variability sentences. They also completed a battery of behavioral tasks and self-report questionnaires designed to assess their indexical processing skills, vocabulary knowledge, and several core neurocognitive abilities. Study Sample Native speakers of Mandarin (n = 25) living in the United States recruited from the Indiana University community participated in the current study. A native comparison group consisted of scores obtained from native speakers of English (n = 21) in the Indiana University community taken from an earlier study. Data Collection and Analysis Speech recognition in high-variability listening conditions was assessed with a sentence recognition task using sentences from PRESTO (Perceptually Robust English Sentence Test Open-Set) mixed in 6-talker multitalker babble. Speech recognition in low-variability listening conditions was assessed using sentences from HINT (Hearing In Noise Test) mixed in 6-talker multitalker babble. Indexical processing skills were measured using a talker discrimination task, a gender discrimination task, and a forced-choice regional dialect categorization task. Vocabulary knowledge was assessed with the WordFam word familiarity test, and executive functioning was assessed with the BRIEF-A (Behavioral Rating Inventory of Executive Function – Adult Version) self-report questionnaire. Scores from the non-native listeners on behavioral tasks and self-report questionnaires were compared with scores obtained from native listeners tested in a previous study and were examined for individual differences. Results Non-native keyword recognition scores were significantly lower on PRESTO sentences than on HINT sentences. Non-native listeners’ keyword recognition scores were also lower than native listeners’ scores on both sentence recognition tasks. Differences in performance on the sentence recognition tasks between non-native and native listeners were larger on PRESTO than on HINT, although group differences varied by signal-to-noise ratio. The non-native and native groups also differed in the ability to categorize talkers by region of origin and in vocabulary knowledge. Individual non-native word recognition accuracy on PRESTO sentences in multitalker babble at more favorable signal-to-noise ratios was found to be related to several BRIEF-A subscales and composite scores. However, non-native performance on PRESTO was not related to regional dialect categorization, talker and gender discrimination, or vocabulary knowledge. Conclusions High-variability sentences in multitalker babble were particularly challenging for non-native listeners. 
Difficulty under high-variability testing conditions was related to lack of experience with the L2, especially L2 sociolinguistic information, compared with native listeners. Individual differences among the non-native listeners were related to weaknesses in core neurocognitive abilities affecting behavioral control in everyday life. PMID:25405842

  1. Freshwater ecosystems and resilience of Pacific salmon: Habitat Management based on natural variability

    USGS Publications Warehouse

    Bisson, P.A.; Dunham, J.B.; Reeves, G.H.

    2009-01-01

    In spite of numerous habitat restoration programs in fresh waters with an aggregate annual funding of millions of dollars, many populations of Pacific salmon remain significantly imperiled. Habitat restoration strategies that address limited environmental attributes and partial salmon life-history requirements or approaches that attempt to force aquatic habitat to conform to idealized but ecologically unsustainable conditions may partly explain this lack of response. Natural watershed processes generate highly variable environmental conditions and population responses, i.e., multiple life histories, that are often not considered in restoration. Examples from several locations underscore the importance of natural variability to the resilience of Pacific salmon. The implication is that habitat restoration efforts will be more likely to foster salmon resilience if they consider processes that generate and maintain natural variability in fresh water. We identify three specific criteria for management based on natural variability: the capacity of aquatic habitat to recover from disturbance, a range of habitats distributed across stream networks through time sufficient to fulfill the requirements of diverse salmon life histories, and ecological connectivity. In light of these considerations, we discuss current threats to habitat resilience and describe how regulatory and restoration approaches can be modified to better incorporate natural variability. © 2009 by the author(s).

  2. Process for applying control variables having fractal structures

    DOEpatents

    Bullock, IV, Jonathan S.; Lawson, Roger L.

    1996-01-01

    A process and apparatus for the application of a control variable having a fractal structure to a body or process. The process of the present invention comprises the steps of generating a control variable having a fractal structure and applying the control variable to a body or process reacting in accordance with the control variable. The process is applicable to electroforming where first, second and successive pulsed-currents are applied to cause the deposition of material onto a substrate, such that the first pulsed-current, the second pulsed-current, and successive pulsed currents form a fractal pulsed-current waveform.

  3. Process for applying control variables having fractal structures

    DOEpatents

    Bullock, J.S. IV; Lawson, R.L.

    1996-01-23

    A process and apparatus are disclosed for the application of a control variable having a fractal structure to a body or process. The process of the present invention comprises the steps of generating a control variable having a fractal structure and applying the control variable to a body or process reacting in accordance with the control variable. The process is applicable to electroforming where first, second and successive pulsed-currents are applied to cause the deposition of material onto a substrate, such that the first pulsed-current, the second pulsed-current, and successive pulsed currents form a fractal pulsed-current waveform. 3 figs.

  4. The role of discharge variability in the formation and preservation of alluvial sediment bodies

    NASA Astrophysics Data System (ADS)

    Fielding, Christopher R.; Alexander, Jan; Allen, Jonathan P.

    2018-03-01

    Extant, planform-based facies models for alluvial deposits are not fully fit for purpose, because they over-emphasise planform whereas there is little in the alluvial rock record that is distinctive of any particular planform, and because the planform of an individual river varies in both time and space. Accordingly, existing facies models have limited predictive capability. In this paper, we explore the role of inter-annual peak discharge variability as a possible control on the character of the preserved alluvial record. Data from a suite of modern rivers, for which long-term gauging records are available, and for which there are published descriptions of subsurface sedimentary architecture, are analysed. The selected rivers are categorized according to their variance in peak discharge, or the coefficient of variation (CVQp = standard deviation of the annual peak flood discharge over the mean annual peak flood discharge). This parameter ranges between 0.18 and 1.22 over the rivers studied, allowing classification of rivers as having very low (< 0.20), low (0.20-0.40), moderate (0.40-0.60), high (0.60-0.90), or very high (> 0.90) annual peak discharge variance. Deposits of rivers with very low and low peak discharge variability are dominated by cross-bedding on various scales and preserve macroform bedding structure, allowing the interpretation of bar construction processes. Rivers with moderate values preserve mostly cross-bedding, but records of macroform processes are in places muted and considerably modified by reworking. Rivers with high and very high values of annual peak discharge variability show a wide range of bedding structures commonly including critical and supercritical flow structures, abundant in situ trees and transported large woody debris, and their deposits contain pedogenically modified mud partings and generally lack macroform structure. Such a facies assemblage is distinctively different from the conventional fluvial style recorded in published facies models but is widely developed both in modern and ancient alluvial deposits. This high-peak-variance style is also distinctive of rivers that are undergoing contraction in discharge over time because of the gradual annexation of the channel belt by the establishment of woody vegetation. We propose that discharge variability, both inter-annual peak variation and "flashiness", may be a more reliable basis for classifying the alluvial rock record than planform, and we provide some examples of three classes of alluvial sediment bodies (representing low, intermediate, and high/very high discharge variability) from the rock record that illustrate this point.
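
    The variability metric and class boundaries quoted above translate directly into a few lines of code; the sketch below computes CVQp from an invented annual peak-flow series and assigns one of the five classes. Only the thresholds come from the text.

    ```python
    # Coefficient of variation of annual peak discharge (CVQp) and the five-class
    # scheme quoted above. The peak-flow series here is invented for illustration.
    import numpy as np

    annual_peaks = np.array([310., 270., 1450., 520., 880., 240., 1980., 430.,
                             760., 1210.])  # m3/s, one value per year

    cv_qp = annual_peaks.std(ddof=1) / annual_peaks.mean()

    def classify(cv):
        if cv < 0.20:
            return "very low"
        if cv < 0.40:
            return "low"
        if cv < 0.60:
            return "moderate"
        if cv <= 0.90:
            return "high"
        return "very high"

    print(f"CVQp = {cv_qp:.2f} -> {classify(cv_qp)} annual peak discharge variance")
    ```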

  5. Constituent loads in small streams: the process and problems of estimating sediment flux

    Treesearch

    R. B. Thomas

    1989-01-01

    Constituent loads in small streams are often estimated poorly. This is especially true for discharge-related constituents like sediment, since their flux is highly variable and mainly occurs during infrequent high-flow events. One reason for low-quality estimates is that most prevailing data collection methods ignore sampling probabilities and only partly account for...

  6. Spatially complex distribution of dissolved manganese in a fjord as revealed by high-resolution in situ sensing using the autonomous underwater vehicle Autosub.

    PubMed

    Statham, P J; Connelly, D P; German, C R; Brand, T; Overnell, J O; Bulukin, E; Millard, N; McPhail, S; Pebody, M; Perrett, J; Squire, M; Stevenson, P; Webb, A

    2005-12-15

    Loch Etive is a fjordic system on the west coast of Scotland. The deep waters of the upper basin are periodically isolated, and during these periods oxygen is lost through benthic respiration and concentrations of dissolved manganese increase. In April 2000 the autonomous underwater vehicle (AUV) Autosub was fitted with an in situ dissolved manganese analyzer and was used to study the spatial variability of this element together with oxygen, salinity, and temperature throughout the basin. Six along-loch transects were completed at either constant height above the seafloor or at constant depth below the surface. The ca. 4000 in situ 10-s-average dissolved Mn (Mnd) data points obtained provide a new quasi-synoptic and highly detailed view of the distribution of manganese in this fjordic environment not possible using conventional (water bottle) sampling. There is substantial variability in concentrations (<25 to >600 nM) and distributions of Mnd. Surface waters are characteristically low in Mnd reflecting mixing of riverine and marine end-member waters, both of which are low in Mnd. The deeper waters are enriched in Mnd, and as the water column always contains some oxygen, this must reflect primarily benthic inputs of reduced dissolved Mn. However, this enrichment of Mnd is spatially very variable, presumably as a result of variability in release of Mn coupled with mixing of water in the loch and removal processes. This work demonstrates how AUVs coupled with chemical sensors can reveal substantial small-scale variability of distributions of chemical species in coastal environments that would not be resolved by conventional sampling approaches. Such information is essential if we are to improve our understanding of the nature and significance of the underlying processes leading to this variability.

  7. Time-Variable Transit Time Distributions in the Hyporheic Zone of a Headwater Mountain Stream

    NASA Astrophysics Data System (ADS)

    Ward, Adam S.; Schmadel, Noah M.; Wondzell, Steven M.

    2018-03-01

    Exchange of water between streams and their hyporheic zones is known to be dynamic in response to hydrologic forcing, variable in space, and to exist in a framework with nested flow cells. The expected result of heterogeneous geomorphic setting, hydrologic forcing, and between-feature interaction is hyporheic transit times that are highly variable in both space and time. Transit time distributions (TTDs) are important as they reflect the potential for hyporheic processes to impact biogeochemical transformations and ecosystems. In this study we simulate time-variable transit time distributions based on dynamic vertical exchange in a headwater mountain stream with observed, heterogeneous step-pool morphology. Our simulations include hyporheic exchange over a 600 m river corridor reach driven by continuously observed, time-variable hydrologic conditions for more than 1 year. We found that spatial variability at an instance in time is typically larger than temporal variation for the reach. Furthermore, we found reach-scale TTDs were marginally variable under all but the most extreme hydrologic conditions, indicating that TTDs are highly transferable in time. Finally, we found that aggregation of annual variation in space and time into a "master TTD" reasonably represents most of the hydrologic dynamics simulated, suggesting that this aggregation approach may provide a relevant basis for scaling from features or short reaches to entire networks.

  8. Investigation of clinical pharmacokinetic variability of an opioid antagonist through physiologically based absorption modeling.

    PubMed

    Ding, Xuan; He, Minxia; Kulkarni, Rajesh; Patel, Nita; Zhang, Xiaoyu

    2013-08-01

    Identifying the source of inter- and/or intrasubject variability in pharmacokinetics (PK) provides fundamental information for understanding the pharmacokinetic-pharmacodynamic relationship of a drug and projecting its efficacy and safety in clinical populations. This identification process can be challenging given that a large number of potential causes could lead to PK variability. Here we present an integrated physiologically based absorption modeling approach to investigate the root cause of the unexpectedly high PK variability of a Phase I clinical trial drug. LY2196044 exhibited high intersubject variability in the absorption phase of plasma concentration-time profiles in humans. This could not be explained by in vitro measurements of drug properties or by the excellent bioavailability, with low variability, observed in preclinical species. GastroPlus™ modeling suggested that the compound's optimal solubility and permeability characteristics would enable rapid and complete absorption in preclinical species and in humans. However, simulations of human plasma concentration-time profiles indicated that despite sufficient solubility and rapid dissolution of LY2196044 in humans, permeability and/or transit in the gastrointestinal (GI) tract may have been negatively affected. It was concluded that the clinical PK variability was potentially due to the drug's antagonism of opioid receptors, which affected its transit and absorption in the GI tract. Copyright © 2013 Wiley Periodicals, Inc.
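
    GastroPlus™ itself is commercial software, so the sketch below is only a generic stand-in: a one-compartment model with first-order absorption, used to illustrate how a change in the effective absorption rate (a crude proxy for altered GI transit or permeability) reshapes the absorption phase of a concentration-time profile. All parameter values are assumptions.

    ```python
    # Not GastroPlus: a minimal one-compartment model with first-order absorption,
    # showing how a slower effective absorption rate (ka) lowers and delays Cmax.
    import numpy as np
    from scipy.integrate import solve_ivp

    def model(t, y, ka, ke, v):
        a_gut, c_plasma = y
        return [-ka * a_gut, ka * a_gut / v - ke * c_plasma]

    dose, ke, v = 100.0, 0.1, 50.0          # mg, 1/h, L (illustrative)
    t_eval = np.linspace(0, 24, 200)

    for ka in (1.5, 0.3):                    # fast vs. impaired absorption, 1/h
        sol = solve_ivp(model, (0, 24), [dose, 0.0], args=(ka, ke, v), t_eval=t_eval)
        cmax = sol.y[1].max()
        tmax = sol.t[np.argmax(sol.y[1])]
        print(f"ka = {ka}: Cmax = {cmax:5.2f} mg/L at t = {tmax:4.1f} h")
    ```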

  9. Spatial heterogeneity of within-stream methane concentrations

    USGS Publications Warehouse

    Crawford, John T.; Loken, Luke C.; West, William E.; Crary, Benjamin; Spawn, Seth A.; Gubbins, Nicholas; Jones, Stuart E.; Striegl, Robert G.; Stanley, Emily H.

    2017-01-01

    Streams, rivers, and other freshwater features may be significant sources of CH4 to the atmosphere. However, high spatial and temporal variabilities hinder our ability to understand the underlying processes of CH4 production and delivery to streams and also challenge the use of scaling approaches across large areas. We studied a stream having high geomorphic variability to assess the underlying scale of CH4 spatial variability and to examine whether the physical structure of a stream can explain the variation in surface CH4. A combination of high-resolution CH4 mapping, a survey of groundwater CH4 concentrations, quantitative analysis of methanogen DNA, and sediment CH4 production potentials illustrates the spatial and geomorphic controls on CH4 emissions to the atmosphere. We observed significant spatial clustering with high CH4 concentrations in organic-rich stream reaches and lake transitions. These sites were also enriched in the methane-producing mcrA gene and had highest CH4 production rates in the laboratory. In contrast, mineral-rich reaches had significantly lower concentrations and had lesser abundances of mcrA. Strong relationships between CH4 and the physical structure of this aquatic system, along with high spatial variability, suggest that future investigations will benefit from viewing streams as landscapes, as opposed to ecosystems simply embedded in larger terrestrial mosaics. In light of such high spatial variability, we recommend that future workers evaluate stream networks first by using similar spatial tools in order to build effective sampling programs.

  10. Advances in photonic MOEMS-MEMS device thinning and polishing

    NASA Astrophysics Data System (ADS)

    McAneny, James J.; Kennedy, Mark; McGroggan, Tom

    2010-02-01

    As devices continue to increase in density and complexity, ever more stringent specifications are placed on the wafer-scale equipment manufacturers to produce higher quality and higher output. This results in greater investment and more resources being diverted into producing tools and processes which can meet the latest demanding criteria. Substrate materials employed in the fabrication process range from silicon to InP and include GaAs, InSb and other optical networking or waveguide materials. With this diversity of substrate materials, controlling the geometries and surfaces grows progressively more challenging. This article highlights the key parameters which require close monitoring and control in order to produce highly precise wafers as part of the fabrication process. Several as-cut and commercially available standard polished wafer materials were used in empirical trials to test tooling options for generating high levels of geometric control over the dimensions while producing high-quality surface finishes. Specific attention was given to the measurement and control of flatness, parallelism/TTV, surface roughness and final target thickness, as common specifications required by the industry. By varying the process variables of plate speed, down-force pressure, slurry flow rate and concentration, pad type and wafer travel path across the polish pad, the effect of altering these variables was recorded and analysed to determine the optimum process conditions for the materials under test. The results were then used to design improved methods and tooling for the thinning and polishing of photonic materials applied to MOEMS-MEMS device fabrication.

  11. Space-Time Variability in River Flow Regimes of Northeast Turkey

    NASA Astrophysics Data System (ADS)

    Saris, F.; Hannah, D. M.; Eastwood, W. J.

    2011-12-01

    The northeast region of Turkey is characterised by relatively high annual precipitation totals and river flow. It is a mountainous region with high ecological status and is of prime interest to the energy sector. These characteristics make the region an important area for hydroclimatological research on the future availability and management of water resources, yet no previous research has characterized hydroclimatological variability across it. This study provides the first comprehensive and detailed information on the river flow regimes of northeast Turkey, which is covered by two major river basins, the East Black Sea (EBS) and Çoruh River (ÇRB) basins. A novel river flow classification is used that yields a large-scale perspective on the hydroclimatological patterns of the region and allows interpretation of the factors controlling river flow variability. River flow regimes are classified (with respect to the timing and magnitude of flow) to examine spatial variability based on long-term average regimes, and annual regimes are also grouped for each station-year to identify temporal (between-year) variability. Results indicate that rivers in northeast Turkey are characterised by marked seasonal flow variation with an April-May-June maximum flow period. Spatial variability in flow regime seasonality depends largely on the topography of the study area: the EBS Basin, whose eastern part is covered by the North Anatolian Mountains, is characterised by a May-June peak, whereas the ÇRB is defined by an April-May flow peak. The timing of river flows indicates that snowmelt is an important process and a major contributor to river flow maxima in both basins. The low-flow season is January and February. Intermediate and low regime magnitude classes dominate in the ÇRB and EBS basins, respectively, while a high flow magnitude class is observed for only one station across the region. The regime stability analysis (year-to-year variation) shows that the April-May and May-June peak shape classes, together with the low and intermediate magnitude classes, are the most frequent and persistent flow regimes. This research has advanced understanding of hydroclimatological processes in northeast Turkey by identifying river flow regimes and the factors controlling river flow variability.

  12. Late Holocene sea level variability and Atlantic Meridional Overturning Circulation

    USGS Publications Warehouse

    Cronin, Thomas M.; Farmer, Jesse R.; Marzen, R. E.; Thomas, E.; Varekamp, J.C.

    2014-01-01

    Pre-twentieth century sea level (SL) variability remains poorly understood due to limits of tide gauge records, low temporal resolution of tidal marsh records, and regional anomalies caused by dynamic ocean processes, notably multidecadal changes in Atlantic Meridional Overturning Circulation (AMOC). We examined SL and AMOC variability along the eastern United States over the last 2000 years, using an SL curve constructed from proxy sea surface temperature (SST) records from Chesapeake Bay, and twentieth century SL-SST relations derived from tide gauges and instrumental SST. The SL curve shows multidecadal-scale variability (20–30 years) during the Medieval Climate Anomaly (MCA) and Little Ice Age (LIA), as well as the twentieth century. During these SL oscillations, short-term rates ranged from 2 to 4 mm yr−1, roughly similar to those of the last few decades. These oscillations likely represent internal modes of climate variability related to AMOC variability and originating at high latitudes, although the exact mechanisms remain unclear. Results imply that dynamic ocean changes, in addition to thermosteric, glacio-eustatic, or glacio-isostatic processes, are an inherent part of SL variability in coastal regions, even during millennial-scale climate oscillations such as the MCA and LIA, and should be factored into efforts that use tide gauges and tidal marsh sediments to understand global sea level rise.

  13. A priori and a posteriori analyses of the flamelet/progress variable approach for supersonic combustion

    NASA Astrophysics Data System (ADS)

    Saghafian, Amirreza; Pitsch, Heinz

    2012-11-01

    A compressible flamelet/progress variable approach (CFPV) has been devised for high-speed flows. Temperature is computed from the transported total energy and the tabulated species mass fractions, and the source term of the progress variable is rescaled with pressure and temperature. Combustion is thus modeled by three additional scalar equations and a chemistry table that is computed in a pre-processing step. Three-dimensional direct numerical simulation (DNS) databases of a reacting supersonic turbulent mixing layer with detailed chemistry are analyzed to assess the underlying assumptions of CFPV. Large eddy simulations (LES) of the same configuration using the CFPV method have been performed and compared with the DNS results. The LES computations are based on presumed subgrid PDFs of the mixture fraction and the progress variable, a beta function and a delta function respectively, which are assessed using the DNS databases. The flamelet equation budget is also computed to verify the validity of the CFPV method for high-speed flows.
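
    The presumed-PDF step of the CFPV table lookup can be sketched as follows: a tabulated flamelet quantity is averaged against a beta PDF in mixture fraction whose shape is set by the resolved mean and variance, while the delta PDF in the progress variable leaves that coordinate as a direct table index. The toy phi_table profile and the numbers below are assumptions, not a real chemistry table.

    ```python
    # Sketch of the presumed beta-PDF convolution used in flamelet/progress-variable
    # models. The "table" below is a toy one-dimensional profile for illustration.
    import numpy as np
    from scipy.stats import beta

    def phi_table(z):
        """Toy flamelet profile peaking near a stoichiometric mixture fraction."""
        return np.exp(-((z - 0.3) / 0.1) ** 2)

    def beta_pdf_average(z_mean, z_var, n=400):
        # Beta shape parameters from the resolved mean and variance; the variance
        # must satisfy 0 < z_var < z_mean * (1 - z_mean).
        gamma = z_mean * (1.0 - z_mean) / z_var - 1.0
        a, b = z_mean * gamma, (1.0 - z_mean) * gamma
        z = np.linspace(1e-6, 1.0 - 1e-6, n)
        w = beta.pdf(z, a, b)
        return np.average(phi_table(z), weights=w)

    print("no subgrid fluctuations  :", phi_table(0.3))
    print("with subgrid fluctuations:", beta_pdf_average(z_mean=0.3, z_var=0.01))
    ```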

  14. Comparison of pre-processing methods for multiplex bead-based immunoassays.

    PubMed

    Rausch, Tanja K; Schillert, Arne; Ziegler, Andreas; Lüking, Angelika; Zucht, Hans-Dieter; Schulz-Knappe, Peter

    2016-08-11

    High throughput protein expression studies can be performed using bead-based protein immunoassays, such as the Luminex® xMAP® technology. Technical variability is inherent to these experiments and may lead to systematic bias and reduced power. To reduce technical variability, data pre-processing is performed. However, no recommendations exist for the pre-processing of Luminex® xMAP® data. We compared 37 different pre-processing combinations of transformation and normalization methods in 42 samples on 384 analytes obtained from a multiplex immunoassay based on the Luminex® xMAP® technology. We evaluated the performance of each pre-processing approach with 6 different performance criteria. Three of the performance criteria were graphical (plots), and all plots were evaluated by 15 independent and blinded readers. Four different combinations of transformation and normalization methods performed well as pre-processing procedures for this bead-based protein immunoassay. The following combinations of transformation and normalization were suitable for pre-processing Luminex® xMAP® data in this study: weighted Box-Cox followed by quantile or robust spline normalization (rsn), asinh transformation followed by loess normalization, and Box-Cox followed by rsn.
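
    As a hedged illustration of one of the well-performing combinations named above, the sketch below applies a per-analyte Box-Cox transformation followed by quantile normalization across samples to a synthetic 42 x 384 MFI matrix. It is not the weighted Box-Cox or robust spline normalization implementation evaluated in the paper.

    ```python
    # Hedged sketch: Box-Cox transformation per analyte, then quantile
    # normalization across samples, on a synthetic fluorescence-intensity matrix.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    mfi = rng.lognormal(mean=6.0, sigma=0.8, size=(42, 384))   # samples x analytes

    # 1. Box-Cox per analyte (requires strictly positive values).
    transformed = np.column_stack(
        [stats.boxcox(mfi[:, j])[0] for j in range(mfi.shape[1])]
    )

    # 2. Quantile normalization across samples: give every sample the same
    #    empirical distribution (the mean of the sorted values at each rank).
    ranks = transformed.argsort(axis=1).argsort(axis=1)
    reference = np.sort(transformed, axis=1).mean(axis=0)
    normalized = reference[ranks]

    print(normalized.shape, normalized.mean(axis=1)[:3])  # identical row means
    ```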

  15. Reward speeds up and increases consistency of visual selective attention: a lifespan comparison.

    PubMed

    Störmer, Viola; Eppinger, Ben; Li, Shu-Chen

    2014-06-01

    Children and older adults often show less favorable reward-based learning and decision making, relative to younger adults. It is unknown, however, whether reward-based processes that influence relatively early perceptual and attentional processes show similar lifespan differences. In this study, we investigated whether stimulus-reward associations affect selective visual attention differently across the human lifespan. Children, adolescents, younger adults, and older adults performed a visual search task in which the target colors were associated with either high or low monetary rewards. We discovered that high reward value speeded up response times across all four age groups, indicating that reward modulates attentional selection across the lifespan. This speed-up in response time was largest in younger adults, relative to the other three age groups. Furthermore, only younger adults benefited from high reward value in increasing response consistency (i.e., reduction of trial-by-trial reaction time variability). Our findings suggest that reward-based modulations of relatively early and implicit perceptual and attentional processes are operative across the lifespan, and the effects appear to be greater in adulthood. The age-specific effect of reward on reducing intraindividual response variability in younger adults likely reflects mechanisms underlying the development and aging of reward processing, such as lifespan age differences in the efficacy of dopaminergic modulation. Overall, the present results indicate that reward shapes visual perception across different age groups by biasing attention to motivationally salient events.

  16. A Bayesian methodological framework for accommodating interannual variability of nutrient loading with the SPARROW model

    NASA Astrophysics Data System (ADS)

    Wellen, Christopher; Arhonditsis, George B.; Labencki, Tanya; Boyd, Duncan

    2012-10-01

    Regression-type, hybrid empirical/process-based models (e.g., SPARROW, PolFlow) have assumed a prominent role in efforts to estimate the sources and transport of nutrient pollution at river basin scales. However, almost no attempts have been made to explicitly accommodate interannual nutrient loading variability in their structure, despite empirical and theoretical evidence indicating that the associated source/sink processes are quite variable at annual timescales. In this study, we present two methodological approaches to accommodate interannual variability with the Spatially Referenced Regressions on Watershed attributes (SPARROW) nonlinear regression model. The first strategy uses the SPARROW model to estimate a static baseline load and climatic variables (e.g., precipitation) to drive the interannual variability. The second approach allows the source/sink processes within the SPARROW model to vary at annual timescales using dynamic parameter estimation techniques akin to those used in dynamic linear models. Model parameterization is founded upon Bayesian inference techniques that explicitly consider calibration data and model uncertainty. Our case study is the Hamilton Harbor watershed, a mixed agricultural and urban residential area located at the western end of Lake Ontario, Canada. Our analysis suggests that dynamic parameter estimation is the more parsimonious of the two strategies tested and can offer insights into the temporal structural changes associated with watershed functioning. Consistent with empirical and theoretical work, model estimated annual in-stream attenuation rates varied inversely with annual discharge. Estimated phosphorus source areas were concentrated near the receiving water body during years of high in-stream attenuation and dispersed along the main stems of the streams during years of low attenuation, suggesting that nutrient source areas are subject to interannual variability.
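
    A much-reduced illustration of the first strategy described above (a static baseline load whose interannual variability is driven by a climate covariate) is sketched below; it is not the SPARROW model or the Bayesian estimation used in the study, and the baseline, covariate, and data are invented.

    ```python
    # Toy version of "static baseline + climate-driven interannual variability":
    # fit a baseline annual load and a climate sensitivity to synthetic data.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(3)
    years = np.arange(2000, 2012)
    precip_anom = rng.normal(0.0, 1.0, years.size)   # standardized annual anomaly

    def annual_load(p_anom, baseline, gamma):
        """Static baseline load (which in SPARROW would come from source and
        delivery terms) modulated by an annual climate covariate."""
        return baseline * np.exp(gamma * p_anom)

    true_baseline, true_gamma = 185.0, 0.35          # t/yr and dimensionless, invented
    observed = annual_load(precip_anom, true_baseline, true_gamma) \
        * rng.lognormal(0.0, 0.1, years.size)

    params, _ = curve_fit(annual_load, precip_anom, observed, p0=[100.0, 0.0])
    print("estimated (baseline, gamma):", np.round(params, 2))
    ```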

  17. How does a newly encountered face become familiar? The effect of within-person variability on adults' and children's perception of identity.

    PubMed

    Baker, Kristen A; Laurence, Sarah; Mondloch, Catherine J

    2017-04-01

    Adults and children aged 6 years and older easily recognize multiple images of a familiar face, but often perceive two images of an unfamiliar face as belonging to different identities. Here we examined the process by which a newly encountered face becomes familiar, defined as accurate recognition of multiple images that capture natural within-person variability in appearance. In Experiment 1 we examined whether exposure to within-person variability in appearance helps children learn a new face. Children aged 6-13 years watched a 10-min video of a woman reading a story; she was filmed on a single day (low variability) or over three days, across which her appearance and filming conditions (e.g., camera, lighting) varied (high variability). After familiarization, participants sorted a set of images comprising novel images of the target identity intermixed with distractors. Compared to participants who received no familiarization, children showed evidence of learning only in the high-variability condition, in contrast to adults who showed evidence of learning in both the low- and high-variability conditions. Experiment 2 highlighted the efficiency with which adults learn a new face; their accuracy was comparable across training conditions despite variability in duration (1 vs. 10 min) and type (video vs. static images) of training. Collectively, our findings show that exposure to variability leads to the formation of a robust representation of facial identity, consistent with perceptual learning in other domains (e.g., language), and that the development of face learning is protracted throughout childhood. We discuss possible underlying mechanisms. Copyright © 2016. Published by Elsevier B.V.

  18. Coastal vulnerability assessment with the use of environmental and socio-economic indicators

    NASA Astrophysics Data System (ADS)

    Alexandrakis, George; Petrakis, Stelios; Vousdoukas, Mixalis; Ghionis, George; Hatziyanni, Eleni; Kampanis, Nikolaos

    2014-05-01

    Climate change has significant repercussions on the natural environment, triggering changes in natural processes that have a severe socio-economic impact on the coastal zone, where a great number of human activities are concentrated. So far, the estimation of coastal vulnerability has been based primarily on natural processes and less on socio-economic variables, which would assist in the identification of vulnerable areas. The present investigation proposes a methodology to examine the vulnerability of a highly touristic area on the Island of Crete to an expected sea level rise of up to ~40 cm by the year 2100, according to the A1B scenario of IPCC 2007. The methodology combines socio-economic indicators into a GIS-based coastal vulnerability index for wave-induced erosion. This approach includes three sub-indices that contribute equally to the overall index. The sub-indices refer to coastal forcing, socio-economic factors, and coastal characteristics. All variables are ranked on a 1-5 scale, with 5 indicating higher vulnerability. The socio-economic sub-index includes, as indicators, the population of the study area, cultural heritage sites, transport networks, land use, and protection measures. The coastal forcing sub-index includes the frequency of extreme events, while the Coastal Vulnerability Index includes the geological variables (coastal geomorphology, historical coastline changes, and regional coastal slope) and the variables representing the marine processes (relative sea level rise, mean significant wave height, and tidal range). The main difficulty in estimating the index lies in assessing and ranking the socio-economic indicators. The whole approach was tested and validated through field and desktop studies, using Elouda Bay, Crete, as a case study: an area of high cultural and economic value that combines monuments from ancient and medieval times with very high touristic development since the 1970s.
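
    The abstract specifies only that variables are ranked 1-5 and that three sub-indices contribute equally; assuming a simple equal-weight mean for both aggregation steps, the combination can be sketched as below. The variable lists and example rankings are hypothetical.

    ```python
    # Hedged sketch: combine ranked variables (1-5) into three sub-indices and an
    # overall vulnerability score, using equal-weight means as an assumed rule.
    from statistics import mean

    coastal_characteristics = {"geomorphology": 4, "shoreline_change": 3,
                               "coastal_slope": 5, "sea_level_rise": 4,
                               "wave_height": 2, "tidal_range": 1}
    coastal_forcing = {"extreme_event_frequency": 3}
    socio_economic = {"population": 4, "cultural_heritage": 5, "transport": 3,
                      "land_use": 4, "protection_measures": 2}

    sub_indices = {name: mean(vals.values()) for name, vals in
                   [("characteristics", coastal_characteristics),
                    ("forcing", coastal_forcing),
                    ("socio-economic", socio_economic)]}

    overall = mean(sub_indices.values())   # equal contribution of the three sub-indices
    print(sub_indices, "-> overall vulnerability:", round(overall, 2))
    ```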

  19. Temporal and Spatial Variation in Peatland Carbon Cycling and Implications for Interpreting Responses of an Ecosystem-Scale Warming Experiment

    DOE PAGES

    Griffiths, Natalie A.; Hanson, Paul J.; Ricciuto, Daniel M.; ...

    2017-11-22

    Here, we are conducting a large-scale, long-term climate change response experiment in an ombrotrophic peat bog in Minnesota to evaluate the effects of warming and elevated CO2 on ecosystem processes using empirical and modeling approaches. To better frame future assessments of peatland responses to climate change, we characterized and compared spatial vs. temporal variation in measured C cycle processes and their environmental drivers. We also conducted a sensitivity analysis of a peatland C model to identify how variation in ecosystem parameters contributes to model prediction uncertainty. High spatial variability in C cycle processes resulted in the inability to determine if the bog was a C source or sink, as the 95% confidence interval ranged from a source of 50 g C m−2 yr−1 to a sink of 67 g C m−2 yr−1. Model sensitivity analysis also identified that spatial variation in tree and shrub photosynthesis, allocation characteristics, and maintenance respiration all contributed to large variations in the pretreatment estimates of net C balance. Variation in ecosystem processes can be more thoroughly characterized if more measurements are collected for parameters that are highly variable over space and time, and especially if those measurements encompass environmental gradients that may be driving the spatial and temporal variation (e.g., hummock vs. hollow microtopographies, and wet vs. dry years). Together, the coupled modeling and empirical approaches indicate that variability in C cycle processes and their drivers must be taken into account when interpreting the significance of experimental warming and elevated CO2 treatments.

  20. Temporal and Spatial Variation in Peatland Carbon Cycling and Implications for Interpreting Responses of an Ecosystem-Scale Warming Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffiths, Natalie A.; Hanson, Paul J.; Ricciuto, Daniel M.

    Here, we are conducting a large-scale, long-term climate change response experiment in an ombrotrophic peat bog in Minnesota to evaluate the effects of warming and elevated CO2 on ecosystem processes using empirical and modeling approaches. To better frame future assessments of peatland responses to climate change, we characterized and compared spatial vs. temporal variation in measured C cycle processes and their environmental drivers. We also conducted a sensitivity analysis of a peatland C model to identify how variation in ecosystem parameters contributes to model prediction uncertainty. High spatial variability in C cycle processes resulted in the inability to determine if the bog was a C source or sink, as the 95% confidence interval ranged from a source of 50 g C m−2 yr−1 to a sink of 67 g C m−2 yr−1. Model sensitivity analysis also identified that spatial variation in tree and shrub photosynthesis, allocation characteristics, and maintenance respiration all contributed to large variations in the pretreatment estimates of net C balance. Variation in ecosystem processes can be more thoroughly characterized if more measurements are collected for parameters that are highly variable over space and time, and especially if those measurements encompass environmental gradients that may be driving the spatial and temporal variation (e.g., hummock vs. hollow microtopographies, and wet vs. dry years). Together, the coupled modeling and empirical approaches indicate that variability in C cycle processes and their drivers must be taken into account when interpreting the significance of experimental warming and elevated CO2 treatments.

  1. Estimating and mapping ecological processes influencing microbial community assembly

    PubMed Central

    Stegen, James C.; Lin, Xueju; Fredrickson, Jim K.; Konopka, Allan E.

    2015-01-01

    Ecological community assembly is governed by a combination of (i) selection resulting from among-taxa differences in performance; (ii) dispersal resulting from organismal movement; and (iii) ecological drift resulting from stochastic changes in population sizes. The relative importance and nature of these processes can vary across environments. Selection can be homogeneous or variable, and while dispersal is a rate, we conceptualize extreme dispersal rates as two categories; dispersal limitation results from limited exchange of organisms among communities, and homogenizing dispersal results from high levels of organism exchange. To estimate the influence and spatial variation of each process we extend a recently developed statistical framework, use a simulation model to evaluate the accuracy of the extended framework, and use the framework to examine subsurface microbial communities over two geologic formations. For each subsurface community we estimate the degree to which it is influenced by homogeneous selection, variable selection, dispersal limitation, and homogenizing dispersal. Our analyses revealed that the relative influences of these ecological processes vary substantially across communities even within a geologic formation. We further identify environmental and spatial features associated with each ecological process, which allowed mapping of spatial variation in ecological-process-influences. The resulting maps provide a new lens through which ecological systems can be understood; in the subsurface system investigated here they revealed that the influence of variable selection was associated with the rate at which redox conditions change with subsurface depth. PMID:25983725

  2. A review of blood sample handling and pre-processing for metabolomics studies.

    PubMed

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there is still a fundamental need to consider pre-analytical variability, which can introduce bias into the subsequent analytical process, decrease the reliability of the results, and confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood-derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Soil moisture and soil temperature variability among three plant communities in a High Arctic Lake Basin

    NASA Astrophysics Data System (ADS)

    Davis, M. L.; Konkel, J.; Welker, J. M.; Schaeffer, S. M.

    2017-12-01

    Soil moisture and soil temperature are critical to plant community distribution and soil carbon cycle processes in High Arctic tundra. As environmental drivers of soil biochemical processes, the predictability of soil moisture and soil temperature by vegetation zone in High Arctic landscapes has significant implications for the use of satellite imagery and vegetation distribution maps to estimate soil gas flux rates. During the 2017 growing season, we monitored soil moisture and soil temperature weekly at 48 sites in dry tundra, moist tundra, and wet grassland vegetation zones in a High Arctic lake basin. Soil temperature in all three communities reflected fluctuations in air temperature throughout the season. Mean soil temperature was highest in the dry tundra community at 10.5±0.6 °C, but did not differ between the moist tundra and wet grassland communities (2.7±0.6 and 3.1±0.5 °C, respectively). Mean volumetric soil moisture differed significantly among all three plant communities, with the lowest and highest soil moisture measured in the dry tundra and wet grassland (30±1.2 and 65±2.7%), respectively. For all three communities, soil moisture was highest during the early-season snow melt. Soil moisture in wet grassland remained high with no significant change throughout the season, while significant drying occurred in dry tundra. The most significant change in soil moisture was measured in moist tundra, ranging from 61 to 35%. Our results show different gradients in soil moisture variability within each plant community: 1) soil moisture was lowest in dry tundra, with little change; 2) highest in wet grassland, with negligible change; and 3) variable in moist tundra, which slowly dried but remained moist. Consistently high soil moisture in wet grassland restricts this plant community to areas with no significant drying during summer. The moist tundra occupies the intermediary areas between wet grassland and dry tundra and experiences the widest range of soil moisture variability. As climate projections predict wetter summers in the High Arctic, expansion of areas with seasonally inundated soils and increased soil moisture variability could result in an expansion of wet grassland and moist tundra communities with a commensurate decrease in dry tundra area.

  4. Dynamic control of remelting processes

    DOEpatents

    Bertram, Lee A.; Williamson, Rodney L.; Melgaard, David K.; Beaman, Joseph J.; Evans, David G.

    2000-01-01

    An apparatus and method of controlling a remelting process by providing measured process variable values to a process controller; estimating process variable values using a process model of a remelting process; and outputting estimated process variable values from the process controller. Feedback and feedforward control devices receive the estimated process variable values and adjust inputs to the remelting process. Electrode weight, electrode mass, electrode gap, process current, process voltage, electrode position, electrode temperature, electrode thermal boundary layer thickness, electrode velocity, electrode acceleration, slag temperature, melting efficiency, cooling water temperature, cooling water flow rate, crucible temperature profile, slag skin temperature, and/or drip short events are employed, as are parameters representing physical constraints of electroslag remelting or vacuum arc remelting, as applicable.
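
    As a hedged, toy illustration of the feedforward-plus-feedback arrangement described above, the sketch below drives an estimated melt rate toward a setpoint using a simple first-order process model as the estimator. The model form, gains, and numbers are invented and do not reproduce the patent's estimator or control laws.

    ```python
    # Illustrative only: a toy melt-rate loop in which a simple process model
    # supplies an estimated process variable and feedforward plus feedback terms
    # adjust the input current. All parameters below are assumptions.
    dt = 1.0                    # s, control interval
    k_process = 5e-5            # kg/s of melt rate per ampere (assumed model gain)
    tau = 60.0                  # s, assumed process time constant
    setpoint = 0.15             # kg/s, target melt rate

    melt_rate_est = 0.0         # model-estimated process variable
    integral = 0.0
    kp, ki = 400.0, 8.0         # feedback gains (illustrative)

    for _ in range(600):
        # Feedforward: the current the model predicts is needed at steady state.
        feedforward = setpoint / k_process
        # Feedback on the estimated process variable.
        error = setpoint - melt_rate_est
        integral += error * dt
        current = feedforward + kp * error + ki * integral
        # First-order model update, standing in for the furnace plus the
        # measurement/estimation step of a real controller.
        melt_rate_est += dt / tau * (k_process * current - melt_rate_est)

    print(f"final current ~ {current:.0f} A, melt rate ~ {melt_rate_est:.3f} kg/s")
    ```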

  5. Shifts in Audiovisual Processing in Healthy Aging.

    PubMed

    Baum, Sarah H; Stevenson, Ryan

    2017-09-01

    The integration of information across sensory modalities into unified percepts is a fundamental sensory process upon which a multitude of cognitive processes are based. We review the body of literature exploring aging-related changes in audiovisual integration published over the last five years. Specifically, we review the impact of changes in temporal processing, the influence of the effectiveness of sensory inputs, the role of working memory, and the newer studies of intra-individual variability during these processes. Work in the last five years on bottom-up influences of sensory perception has garnered significant attention. Temporal processing, a driving factor of multisensory integration, has now been shown to decouple from multisensory integration in aging, despite their co-decline with age. The impact of stimulus effectiveness also changes with age, where older adults show maximal benefit from multisensory gain at high signal-to-noise ratios. Following sensory decline, high working memory capacity has now been shown to be somewhat of a protective factor against age-related declines in audiovisual speech perception, particularly in noise. Finally, newer research is emerging focusing on the general intra-individual variability observed with aging. Overall, the studies of the past five years have replicated and expanded on previous work that highlights the role of bottom-up sensory changes with aging and their influence on audiovisual integration, as well as the top-down influence of working memory.

  6. Correlates of compliance with national comprehensive smoke-free laws.

    PubMed

    Peruga, Armando; Hayes, Luminita S; Aguilera, Ximena; Prasad, Vinayak; Bettcher, Douglas W

    2017-12-05

    To explore correlates of high compliance with smoking bans in a cross-sectional data set from the 41 countries with national comprehensive smoke-free laws in 2014 and complete data on compliance and enforcement. Outcome variable: compliance with a national comprehensive smoke-free law in each country was obtained for 2014 from the WHO report on the global tobacco epidemic. Explanatory variables: legal enforcement requirements, penalties, infrastructure and strategy were obtained through a separate survey of governments; country socioeconomic and demographic characteristics, including the level of corruption control, were also included. An initial bivariate analysis determined the significance of each potentially relevant explanatory variable of high compliance. Differences in compliance were tested using exact logistic regression. High compliance with the national comprehensive smoke-free law was associated with the involvement of local jurisdictions in providing training and/or guidance for inspections (OR=10.3, 95% CI 1.7 to 117.7) and a perception of high corruption control efforts in the country (OR=7.2, 95% CI 1.1 to 85.8). The results show the importance of the depth of the enforcement infrastructure and effort, represented by the degree to which local government is involved in enforcement. They also show the significance of fighting corruption in the enforcement process, including the attempts of the tobacco industry to undermine the process, to achieve high levels of compliance with the law. The results point to the need to invest minimal but essential enforcement resources, given that national comprehensive smoke-free laws are self-enforcing in many but not all countries and sectors.

  7. A high-throughput platform for population reformatting and mammalian expression of phage display libraries to enable functional screening as full-length IgG.

    PubMed

    Xiao, Xiaodong; Douthwaite, Julie A; Chen, Yan; Kemp, Ben; Kidd, Sara; Percival-Alwyn, Jennifer; Smith, Alison; Goode, Kate; Swerdlow, Bonnie; Lowe, David; Wu, Herren; Dall'Acqua, William F; Chowdhury, Partha S

    Phage display antibody libraries are a rich resource for discovery of potential therapeutic antibodies. Single-chain variable fragment (scFv) libraries are the most common format due to the efficient display of scFv by phage particles and the ease by which soluble scFv antibodies can be expressed for high-throughput screening. Typically, a cascade of screening and triaging activities is performed, beginning with the assessment of large numbers of E. coli-expressed scFv, and progressing through additional assays with individual reformatting of the most promising scFv to full-length IgG. However, use of high-throughput screening of scFv for the discovery of full-length IgG is not ideal because of the differences between these molecules. Furthermore, the reformatting step represents a bottleneck in the process because each antibody has to be handled individually to preserve the unique VH and VL pairing. These problems could be resolved if populations of scFv could be reformatted to full-length IgG before screening without disrupting the variable region pairing. Here, we describe a novel strategy that allows the reformatting of diverse populations of scFv from phage selections to full-length IgG in a batch format. The reformatting process maintains the diversity and variable region pairing with high fidelity, and the resulting IgG pool enables high-throughput expression of IgG in mammalian cells and cell-based functional screening. The improved process led to the discovery of potent candidates that are comparable to or better than those obtained by traditional methods. This strategy should also be readily applicable to Fab-based phage libraries. Our approach, Screening in Product Format (SiPF), represents a substantial improvement in the field of antibody discovery using phage display.

  8. Anomalous CO2 Emissions in Different Ecosystems Around the World

    NASA Astrophysics Data System (ADS)

    Sanchez-Canete, E. P.; Moya Jiménez, M. R.; Kowalski, A. S.; Serrano-Ortiz, P.; López-Ballesteros, A.; Oyonarte, C.; Domingo, F.

    2016-12-01

    As an important tool for understanding and monitoring ecosystem dynamics, the eddy covariance (EC) technique allows the assessment of the diurnal and seasonal variation of the net ecosystem exchange (NEE). Despite the high temporal resolution data available, there are still many processes (in addition to photosynthesis and respiration) that, although they are being monitored, have been neglected. Only a few authors have studied anomalous (non-biological) CO2 emissions, and have related them to soil ventilation, photodegradation or geochemical processes. The aims of this study are: 1) to identify anomalous short-term CO2 emissions in different ecosystems distributed around the world, 2) to determine the meteorological variables that influence these emissions, and 3) to explore the potential processes that can be involved. We have studied EC data together with other meteorological ancillary variables obtained from the FLUXNET database (version 2015) and have found more than 50 sites with anomalous CO2 emissions in different ecosystem types such as grasslands, croplands or savannas. Data were filtered according to the FLUXNET quality control flags (only data with a quality control flag equal to 0 were used) and correlation analyses were performed with NEE and ancillary data. Preliminary results showed strong and highly significant correlations between meteorological variables and anomalous CO2 emissions. Correlation results showed clearly differing behaviors between ecosystem types, which could be related to the different processes involved in the anomalous CO2 emissions. We suggest that anomalous CO2 emissions are happening globally and, therefore, their contribution to the global net ecosystem carbon balance requires further investigation in order to better understand their drivers.
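
    As a rough illustration of the screening described above (quality-flag filtering followed by correlation of NEE with meteorological drivers), the sketch below uses FLUXNET2015-style column names; the exact variable names, file layout, and the definition of an "anomalous emission" event are assumptions and would need to match the actual analysis.

```python
# Hedged sketch: keep only half-hours whose NEE quality flag equals 0 (measured,
# not gap-filled), then correlate NEE with candidate meteorological drivers.
import pandas as pd
from scipy import stats

def screen_and_correlate(csv_path):
    df = pd.read_csv(csv_path)
    good = df[df["NEE_VUT_REF_QC"] == 0]          # quality-controlled data only
    drivers = ["TA_F", "WS_F", "P_F", "SW_IN_F"]  # air temp, wind speed, precip, shortwave
    results = {}
    for var in drivers:
        sub = good[["NEE_VUT_REF", var]].dropna()
        r, p = stats.pearsonr(sub["NEE_VUT_REF"], sub[var])
        results[var] = (r, p)
    return results   # inspect which drivers correlate with the emission episodes
```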

  9. Characterizing the γ-ray long-term variability of PKS 2155-304 with H.E.S.S. and Fermi-LAT

    NASA Astrophysics Data System (ADS)

    H.E.S.S. Collaboration; Abdalla, H.; Abramowski, A.; Aharonian, F.; Ait Benkhali, F.; Akhperjanian, A. G.; Andersson, T.; Angüner, E. O.; Arrieta, M.; Aubert, P.; Backes, M.; Balzer, A.; Barnard, M.; Becherini, Y.; Becker Tjus, J.; Berge, D.; Bernhard, S.; Bernlöhr, K.; Blackwell, R.; Böttcher, M.; Boisson, C.; Bolmont, J.; Bordas, P.; Bregeon, J.; Brun, F.; Brun, P.; Bryan, M.; Bulik, T.; Capasso, M.; Carr, J.; Casanova, S.; Cerruti, M.; Chakraborty, N.; Chalme-Calvet, R.; Chaves, R. C. G.; Chen, A.; Chevalier, J.; Chrétien, M.; Colafrancesco, S.; Cologna, G.; Condon, B.; Conrad, J.; Cui, Y.; Davids, I. D.; Decock, J.; Degrange, B.; Deil, C.; Devin, J.; deWilt, P.; Dirson, L.; Djannati-Ataï, A.; Domainko, W.; Donath, A.; Drury, L. O.'C.; Dubus, G.; Dutson, K.; Dyks, J.; Edwards, T.; Egberts, K.; Eger, P.; Ernenwein, J.-P.; Eschbach, S.; Farnier, C.; Fegan, S.; Fernandes, M. V.; Fiasson, A.; Fontaine, G.; Förster, A.; Funk, S.; Füßling, M.; Gabici, S.; Gajdus, M.; Gallant, Y. A.; Garrigoux, T.; Giavitto, G.; Giebels, B.; Glicenstein, J. F.; Gottschall, D.; Goyal, A.; Grondin, M.-H.; Hadasch, D.; Hahn, J.; Haupt, M.; Hawkes, J.; Heinzelmann, G.; Henri, G.; Hermann, G.; Hervet, O.; Hinton, J. A.; Hofmann, W.; Hoischen, C.; Holler, M.; Horns, D.; Ivascenko, A.; Jacholkowska, A.; Jamrozy, M.; Janiak, M.; Jankowsky, D.; Jankowsky, F.; Jingo, M.; Jogler, T.; Jouvin, L.; Jung-Richardt, I.; Kastendieck, M. A.; Katarzyński, K.; Katz, U.; Kerszberg, D.; Khélifi, B.; Kieffer, M.; King, J.; Klepser, S.; Klochkov, D.; Kluźniak, W.; Kolitzus, D.; Komin, Nu.; Kosack, K.; Krakau, S.; Kraus, M.; Krayzel, F.; Krüger, P. P.; Laffon, H.; Lamanna, G.; Lau, J.; Lees, J.-P.; Lefaucheur, J.; Lefranc, V.; Lemière, A.; Lemoine-Goumard, M.; Lenain, J.-P.; Leser, E.; Lohse, T.; Lorentz, M.; Liu, R.; López-Coto, R.; Lypova, I.; Marandon, V.; Marcowith, A.; Mariaud, C.; Marx, R.; Maurin, G.; Maxted, N.; Mayer, M.; Meintjes, P. J.; Meyer, M.; Mitchell, A. M. W.; Moderski, R.; Mohamed, M.; Mohrmann, L.; Morå, K.; Moulin, E.; Murach, T.; de Naurois, M.; Niederwanger, F.; Niemiec, J.; Oakes, L.; O'Brien, P.; Odaka, H.; Öttl, S.; Ohm, S.; Ostrowski, M.; Oya, I.; Padovani, M.; Panter, M.; Parsons, R. D.; Pekeur, N. W.; Pelletier, G.; Perennes, C.; Petrucci, P.-O.; Peyaud, B.; Piel, Q.; Pita, S.; Poon, H.; Prokhorov, D.; Prokoph, H.; Pühlhofer, G.; Punch, M.; Quirrenbach, A.; Raab, S.; Reimer, A.; Reimer, O.; Renaud, M.; de los Reyes, R.; Rieger, F.; Romoli, C.; Rosier-Lees, S.; Rowell, G.; Rudak, B.; Rulten, C. B.; Sahakian, V.; Salek, D.; Sanchez, D. A.; Santangelo, A.; Sasaki, M.; Schlickeiser, R.; Schüssler, F.; Schulz, A.; Schwanke, U.; Schwemmer, S.; Settimo, M.; Seyffert, A. S.; Shafi, N.; Shilon, I.; Simoni, R.; Sol, H.; Spanier, F.; Spengler, G.; Spies, F.; Stawarz, Ł.; Steenkamp, R.; Stegmann, C.; Stinzing, F.; Stycz, K.; Sushch, I.; Tavernet, J.-P.; Tavernier, T.; Taylor, A. M.; Terrier, R.; Tibaldo, L.; Tiziani, D.; Tluczykont, M.; Trichard, C.; Tuffs, R.; Uchiyama, Y.; van der Walt, D. J.; van Eldik, C.; van Rensburg, C.; van Soelen, B.; Vasileiadis, G.; Veh, J.; Venter, C.; Viana, A.; Vincent, P.; Vink, J.; Voisin, F.; Völk, H. J.; Vuillaume, T.; Wadiasingh, Z.; Wagner, S. J.; Wagner, P.; Wagner, R. M.; White, R.; Wierzcholska, A.; Willmann, P.; Wörnlein, A.; Wouters, D.; Yang, R.; Zabalza, V.; Zaborov, D.; Zacharias, M.; Zdziarski, A. A.; Zech, A.; Zefi, F.; Ziegler, A.; Żywucka, N.

    2017-02-01

    Studying the temporal variability of BL Lac objects at the highest energies provides unique insights into the extreme physical processes occurring in relativistic jets and in the vicinity of super-massive black holes. To this end, the long-term variability of the BL Lac object PKS 2155-304 is analyzed in the high (HE, 100 MeV < E < 300 GeV) and very high energy (VHE, E > 200 GeV) γ-ray domain. Over the course of 9 yr of H.E.S.S. observations the VHE light curve in the quiescent state is consistent with a log-normal behavior. The VHE variability in this state is well described by flicker noise (power-spectral-density index ) on timescales larger than one day. An analysis of 5.5 yr of HE Fermi-LAT data gives consistent results (, on timescales larger than 10 days) compatible with the VHE findings. The HE and VHE power spectral densities show a scale invariance across the probed time ranges. A direct linear correlation between the VHE and HE fluxes could neither be excluded nor firmly established. These long-term-variability properties are discussed and compared to the red noise behavior (β 2) seen on shorter timescales during VHE-flaring states. The difference in power spectral noise behavior at VHE energies during quiescent and flaring states provides evidence that these states are influenced by different physical processes, while the compatibility of the HE and VHE long-term results is suggestive of a common physical link as it might be introduced by an underlying jet-disk connection.
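
    For readers unfamiliar with the power-spectral-density index quoted above, the sketch below shows the basic idea of estimating the index β of a light curve with P(f) ∝ f^(-β); the published H.E.S.S./Fermi-LAT analysis handles uneven sampling, gaps and measurement noise far more carefully, so this evenly sampled example is illustrative only.

```python
# Illustrative sketch: estimate the power-spectral-density index beta of an
# evenly sampled, gap-free flux light curve by a log-log straight-line fit.
import numpy as np
from scipy.signal import periodogram

def psd_index(flux, dt_days=1.0):
    freqs, power = periodogram(flux, fs=1.0 / dt_days)
    mask = freqs > 0                       # drop the zero-frequency (DC) term
    slope, _ = np.polyfit(np.log10(freqs[mask]), np.log10(power[mask]), 1)
    return -slope   # beta: ~1 for flicker noise, ~2 for red noise
```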

  10. Density-dependent resistance of the gypsy moth, Lymantria dispar, to its nucleopolyhedrovirus

    Treesearch

    James R. Reilly; Ann E. Hajek

    2007-01-01

    The processes controlling disease resistance can strongly influence the population dynamics of insect outbreaks. Evidence that disease resistance is density-dependent is accumulating, but the exact form of this relationship is highly variable from species to species.

  11. Collaborative adaptive landscape management (CALM) in rangelands: Discussion of general principles

    USDA-ARS?s Scientific Manuscript database

    The management of rangeland landscapes involves broad spatial extents, mixed land ownership, and multiple resource objectives. Management outcomes depend on biophysical heterogeneity, highly variable weather conditions, land use legacies, and spatial processes such as wildlife movement, hydrological...

  12. Laser welding process in PP moulding parts: Evaluation of seam performance

    NASA Astrophysics Data System (ADS)

    Oliveira, N.; Pontes, A. J.

    2015-12-01

    Polypropylene (PP) is one of the most versatile polymer materials used in industry. Due to this versatility, it can be used in many different products. The material can also be mixed with several additives, namely glass fiber, carbon nanotubes, etc. This compatibility with different additives also allows products to be obtained with characteristics ranging from impact absorbers to electrical conductors. When it is necessary to join PP components, they can be welded by hot plate, ultrasonic or laser welding. The objective of this study was to evaluate the influence of several variables capable of affecting the final quality of the weld seam. Injection molding process variables such as mold temperature and cooling time were studied, together with laser welding variables and different materials. The results showed that the variables with the most influence were mould temperature, laser velocity and laser diameter. The seams were analyzed using optical microscopy. The seams showed perfect contact between the materials analyzed, despite the high standard deviation observed in the mechanical tests.

  13. A Global Lake Ecological Observatory Network (GLEON) for synthesising high-frequency sensor data for validation of deterministic ecological models

    USGS Publications Warehouse

    David, Hamilton P; Carey, Cayelan C.; Arvola, Lauri; Arzberger, Peter; Brewer, Carol A.; Cole, Jon J; Gaiser, Evelyn; Hanson, Paul C.; Ibelings, Bas W; Jennings, Eleanor; Kratz, Tim K; Lin, Fang-Pang; McBride, Christopher G.; de Motta Marques, David; Muraoka, Kohji; Nishri, Ami; Qin, Boqiang; Read, Jordan S.; Rose, Kevin C.; Ryder, Elizabeth; Weathers, Kathleen C.; Zhu, Guangwei; Trolle, Dennis; Brookes, Justin D

    2014-01-01

    A Global Lake Ecological Observatory Network (GLEON; www.gleon.org) has formed to provide a coordinated response to the need for scientific understanding of lake processes, utilising technological advances available from autonomous sensors. The organisation embraces a grassroots approach to engage researchers from varying disciplines, sites spanning geographic and ecological gradients, and novel sensor and cyberinfrastructure to synthesise high-frequency lake data at scales ranging from local to global. The high-frequency data provide a platform to rigorously validate process-based ecological models because model simulation time steps are better aligned with sensor measurements than with lower-frequency, manual samples. Two case studies from Trout Bog, Wisconsin, USA, and Lake Rotoehu, North Island, New Zealand, are presented to demonstrate that in the past, ecological model outputs (e.g., temperature, chlorophyll) have been relatively poorly validated based on a limited number of directly comparable measurements, both in time and space. The case studies demonstrate some of the difficulties of mapping sensor measurements directly to model state variable outputs as well as the opportunities to use deviations between sensor measurements and model simulations to better inform process understanding. Well-validated ecological models provide a mechanism to extrapolate high-frequency sensor data in space and time, thereby potentially creating a fully 3-dimensional simulation of key variables of interest.
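
    A minimal sketch of the validation step discussed above: align high-frequency sensor data with model output on a common time grid and compute simple skill metrics. The column names, datetime index, and the hourly averaging interval are assumptions.

```python
# Hedged sketch of sensor-vs-model comparison for one state variable.
import numpy as np
import pandas as pd

def validate(model_df, sensor_df, variable="temperature"):
    # both data frames are assumed to have a datetime "time" column and the variable
    m = model_df.set_index("time")[variable].resample("1H").mean()
    s = sensor_df.set_index("time")[variable].resample("1H").mean()
    joined = pd.concat([m, s], axis=1, keys=["model", "sensor"]).dropna()
    bias = (joined["model"] - joined["sensor"]).mean()
    rmse = np.sqrt(((joined["model"] - joined["sensor"]) ** 2).mean())
    return bias, rmse   # deviations can then be examined to inform process understanding
```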

  14. Land cover trajectory in the Caatinga biome: analysing trends, drivers and consequences with high resolution satellite time series in Paraiba, Brazil

    NASA Astrophysics Data System (ADS)

    Rufino, Iana; Cunha, John; Carlos, Galvão; Nailson, Silva

    2017-04-01

    The Caatinga Biome is a unique Earth ecosystem with only 1% of its area conserved and protected (Oliveira et al., 2012). Pressure from human activities strongly threatens Caatinga biodiversity. Over the last decades, native vegetation has been replaced by crops and livestock or encroached upon by urban areas (Oliveira et al., 2012; Fiaschi and Pianni, 2009; Sivakumar, 2007; Castelleti et al., 2004; Pereira et al., 2013; le Polain de Waroux & Lambin, 2012; Apgaua et al., 2013). Precipitation rates have high variability in space and time. High temperatures with small interannual variability drive evapotranspiration up and make water scarcity the main challenge for sustainable life in rural areas. Sánchez-Azofeifa et al. (2005), in establishing research priorities for tropical dry forests, recommended that the scientific community focus on ecological and social aspects and on the possibilities of remote sensing techniques in such studies. Satellite images derived from several sensors can be processed with specific algorithms to produce estimates of energy balance and evapotranspiration of water to the atmosphere. These estimates, combined with the analysis of historical time series, allow the detection of changes in terrestrial plant systems and can be used to discriminate the influences of human occupation from those of climate variability and/or change on energy fluxes and land cover. The algorithms have to be calibrated and validated using ground-based data. Thus, a large, multi-source set of satellite and ground data has to be processed and comparatively analyzed. However, the high computational cost of image processing introduces further challenges. In order to face those challenges, this research explores the possibilities of using medium resolution remote sensing products (30 meters), presenting a multitemporal long-term analysis (24 months) to identify the land trajectory of one semi-arid pilot area in the Caatinga biome. All processing steps use the statistical R package and GIS-based tools in an automatic approach for the SEBAL (Bastiaanssen, 2000) and Fmask (Zhu and Woodcock, 2012) algorithms. The main goal is to develop and provide an efficient remote sensing approach for a better understanding of "land cover trajectory" in an extremely vulnerable ecosystem driven by shifts in precipitation seasonality and extreme weather conditions.

  15. Report to the High Order Language Working Group (HOLWG)

    DTIC Science & Technology

    1977-01-14

    as running, runnable, suspended or dormant, may be synchronized by semaphore variables, may be scheduled using clock and duration data types and may...Recursive and non-recursive routines G6. Parallel processes, synchronization, critical regions G7. User defined parameterized exception handling G8...typed and lacks extensibility, parallel processing, synchronization and real-time features. Overall Evaluation IBM strongly recommended PL/I as a

  16. Variables Control Charts: A Measurement Tool to Detect Process Problems within Housing

    ERIC Educational Resources Information Center

    Luna, Andrew

    1999-01-01

    The purpose of this study was to use quality improvement tools to determine if the current process of supplying hot water to a high-rise residence hall for women at a southeastern Doctoral I granting institution was in control. After a series of focus groups among the residents in the hall, it was determined that they were mostly concerned about…

  17. Processes Affecting Variability of Fluorescence Signals from Benthic Targets in Shallow Waters

    DTIC Science & Technology

    1997-09-30

    processes in the Department of Chemistry at Brookhaven National Laboratory. The model organisms used are primarily cultured zooxanthellae obtained from...and closed (Fm) photosystem II reaction centers in the zooxanthellae isolated from the fire coral, Montipora. The short lifetime curve corresponds...individual zooxanthellae strains, is highly correlated (Figure 2). The correlation between the average fluorescence lifetimes, calculated from a four

  18. Real-time parameter optimization based on neural network for smart injection molding

    NASA Astrophysics Data System (ADS)

    Lee, H.; Liau, Y.; Ryu, K.

    2018-03-01

    The manufacturing industry has been facing several challenges, including sustainability, performance and quality of production. Manufacturers attempt to enhance the competitiveness of companies by implementing CPS (Cyber-Physical Systems) through the convergence of IoT (Internet of Things) and ICT (Information & Communication Technology) at the manufacturing process level. The injection molding process has a short cycle time and high productivity, features that make it suitable for mass production. In addition, this process is used to produce precise parts in various industry fields such as automobiles, optics and medical devices. The injection molding process involves a mixture of discrete and continuous variables. In order to optimize quality, the variables generated in the injection molding process must be considered. Furthermore, optimal parameter setting is time-consuming work for predicting the optimum quality of the product, since process parameters cannot easily be corrected during process execution. In this research, we propose a neural network based real-time process parameter optimization methodology that sets optimal process parameters by using mold data, molding machine data, and response data. This paper is expected to make an academic contribution as a novel study of parameter optimization during production, compared with the pre-production parameter optimization of typical studies.
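
    A hedged sketch of the general idea (not the authors' exact method): train a neural network on historical mold, machine and response data to predict a quality deviation, then search candidate parameter settings through the trained model. The feature names and the "minimise deviation from target" objective are illustrative assumptions.

```python
# Sketch of neural-network-based process parameter selection for injection molding.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def train_quality_model(X, y):
    # X: rows of [melt_temp, injection_speed, packing_pressure, cooling_time, cavity_pressure_peak]
    # y: measured deviation of a quality attribute from its target (assumed encoding)
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
    )
    model.fit(X, y)
    return model

def best_setting(model, candidate_grid):
    preds = model.predict(candidate_grid)             # predicted deviation per candidate
    return candidate_grid[np.argmin(np.abs(preds))]   # pick the setting closest to target

# usage: model = train_quality_model(X_hist, y_dev); best = best_setting(model, grid)
```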

  19. A Variable Resolution Atmospheric General Circulation Model for a Megasite at the North Slope of Alaska

    NASA Astrophysics Data System (ADS)

    Dennis, L.; Roesler, E. L.; Guba, O.; Hillman, B. R.; McChesney, M.

    2016-12-01

    The Atmospheric Radiation Measurement (ARM) climate research facility has three sites located on the North Slope of Alaska (NSA): Barrow, Oliktok, and Atqasuk. These sites, in combination with one other at Toolik Lake, have the potential to become a "megasite" which would combine observational data and high resolution modeling to produce high resolution data products for the climate community. Such a data product requires high resolution modeling over the area of the megasite. We present three variable resolution atmospheric general circulation model (AGCM) configurations as potential alternatives to stand-alone high-resolution regional models. Each configuration is based on a global cubed-sphere grid with effective resolution of 1 degree, with a refinement in resolution down to 1/8 degree over an area surrounding the ARM megasite. The three grids vary in the size of the refined area, with 13k, 9k, and 7k elements. SquadGen, NCL, and GIMP are used to create the grids. Grids vary based upon the selection of areas of refinement which capture climate and weather processes that may affect a proposed NSA megasite. A smaller area of high resolution may not fully resolve climate and weather processes before they reach the NSA; however, grids with smaller areas of refinement have a significantly reduced computational cost compared with grids with larger areas of refinement. The optimal size and shape of the area of refinement for a variable resolution model at the NSA is investigated.

  20. Development of uniform and predictable battery materials for nickel-cadmium aerospace cells

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Battery materials and manufacturing methods were analyzed with the aim of developing uniform and predictable battery plates for nickel cadmium aerospace cells. A study is presented of the high temperature electrochemical impregnation process for the preparation of nickel cadmium battery plates. This comparative study is set up as a factorially designed experiment to examine both manufacturing and operational variables and any interactions that might exist between them. The manufacturing variables in the factorial design include plaque preparative method, plaque porosity and thickness, impregnation method, and loading. The operational variables are type of duty cycle, charge and discharge rate, extent of overcharge, and depth of discharge.

  1. The Influence of a Western Boundary Current on Continental Shelf Processes Along Southeastern Australia.

    NASA Astrophysics Data System (ADS)

    Roughan, M.

    2016-02-01

    The East Australian Current (EAC) flows as a jet over the narrow shelf of southeastern Australia, dominating shelf circulation and shedding vast eddies at the highly variable separation point. These characteristics alone make it a dynamically challenging region to measure, model and predict. In recent years a significant effort has been placed on understanding continental shelf processes along the coast of SE Australia, adjacent to the EAC, our major western boundary current. We have used a multi-pronged approach combining state-of-the-art in situ observations and data assimilation modelling. Observations are obtained from a network of moorings, HF radar and ocean gliders deployed in shelf waters along SE Australia, made possible through Australia's Integrated Marine Observing System (IMOS). We have also developed a high resolution reanalysis of the East Australian Current using ROMS and 4DVar data assimilation. In addition to the traditional data streams (SST, SSH and Argo), we assimilate the newly available IMOS observations in the region. These include velocity and hydrographic observations from the EAC transport array, 1 km HF radar measurements of surface currents, CTD casts from ocean gliders, and temperature, salinity and velocity measurements from a network of shelf mooring arrays. We use these vast data sets and numerical modelling tools, combined with satellite remote sensed data, to understand the spatio-temporal variability of shelf processes and water mass distributions on synoptic, seasonal and inter-annual timescales. We have quantified the cross-shelf transport variability inshore of the EAC, its driving mechanisms, the seasonal cycles in shelf waters and, to some extent, variability in the biological (phytoplankton) response. I will present a review of some of the key results from a number of recent studies.

  2. Spontaneous Fluctuations in Sensory Processing Predict Within-Subject Reaction Time Variability.

    PubMed

    Ribeiro, Maria J; Paiva, Joana S; Castelo-Branco, Miguel

    2016-01-01

    When engaged in a repetitive task our performance fluctuates from trial-to-trial. In particular, inter-trial reaction time variability has been the subject of considerable research. It has been claimed to be a strong biomarker of attention deficits, increases with frontal dysfunction, and predicts age-related cognitive decline. Thus, rather than being just a consequence of noise in the system, it appears to be under the control of a mechanism that breaks down under certain pathological conditions. Although the underlying mechanism is still an open question, consensual hypotheses are emerging regarding the neural correlates of reaction time inter-trial intra-individual variability. Sensory processing, in particular, has been shown to covary with reaction time, yet the spatio-temporal profile of the moment-to-moment variability in sensory processing is still poorly characterized. The goal of this study was to characterize the intra-individual variability in the time course of single-trial visual evoked potentials and its relationship with inter-trial reaction time variability. For this, we chose to take advantage of the high temporal resolution of the electroencephalogram (EEG) acquired while participants were engaged in a 2-choice reaction time task. We studied the link between single trial event-related potentials (ERPs) and reaction time using two different analyses: (1) time point by time point correlation analyses thereby identifying time windows of interest; and (2) correlation analyses between single trial measures of peak latency and amplitude and reaction time. To improve extraction of single trial ERP measures related with activation of the visual cortex, we used an independent component analysis (ICA) procedure. Our ERP analysis revealed a relationship between the N1 visual evoked potential and reaction time. The earliest time point presenting a significant correlation of its respective amplitude with reaction time occurred 175 ms after stimulus onset, just after the onset of the N1 peak. Interestingly, single trial N1 latency correlated significantly with reaction time, while N1 amplitude did not. In conclusion, our findings suggest that inter-trial variability in the timing of extrastriate visual processing contributes to reaction time variability.
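
    The time-point-by-time-point analysis described above can be sketched as follows; the epoch and reaction-time array shapes are assumptions, and in practice the correlations would be computed on the ICA-cleaned component and corrected for multiple comparisons.

```python
# Sketch: for every EEG sample, correlate single-trial amplitude with reaction time
# across trials. epochs: (n_trials, n_times) single-trial amplitudes; rts: (n_trials,).
import numpy as np
from scipy import stats

def timepoint_rt_correlation(epochs, rts):
    n_times = epochs.shape[1]
    r = np.empty(n_times)
    p = np.empty(n_times)
    for t in range(n_times):
        r[t], p[t] = stats.pearsonr(epochs[:, t], rts)
    return r, p   # time windows where p falls below the (corrected) threshold are of interest
```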

  3. Spontaneous Fluctuations in Sensory Processing Predict Within-Subject Reaction Time Variability

    PubMed Central

    Ribeiro, Maria J.; Paiva, Joana S.; Castelo-Branco, Miguel

    2016-01-01

    When engaged in a repetitive task our performance fluctuates from trial-to-trial. In particular, inter-trial reaction time variability has been the subject of considerable research. It has been claimed to be a strong biomarker of attention deficits, increases with frontal dysfunction, and predicts age-related cognitive decline. Thus, rather than being just a consequence of noise in the system, it appears to be under the control of a mechanism that breaks down under certain pathological conditions. Although the underlying mechanism is still an open question, consensual hypotheses are emerging regarding the neural correlates of reaction time inter-trial intra-individual variability. Sensory processing, in particular, has been shown to covary with reaction time, yet the spatio-temporal profile of the moment-to-moment variability in sensory processing is still poorly characterized. The goal of this study was to characterize the intra-individual variability in the time course of single-trial visual evoked potentials and its relationship with inter-trial reaction time variability. For this, we chose to take advantage of the high temporal resolution of the electroencephalogram (EEG) acquired while participants were engaged in a 2-choice reaction time task. We studied the link between single trial event-related potentials (ERPs) and reaction time using two different analyses: (1) time point by time point correlation analyses thereby identifying time windows of interest; and (2) correlation analyses between single trial measures of peak latency and amplitude and reaction time. To improve extraction of single trial ERP measures related with activation of the visual cortex, we used an independent component analysis (ICA) procedure. Our ERP analysis revealed a relationship between the N1 visual evoked potential and reaction time. The earliest time point presenting a significant correlation of its respective amplitude with reaction time occurred 175 ms after stimulus onset, just after the onset of the N1 peak. Interestingly, single trial N1 latency correlated significantly with reaction time, while N1 amplitude did not. In conclusion, our findings suggest that inter-trial variability in the timing of extrastriate visual processing contributes to reaction time variability. PMID:27242470

  4. A data mining approach to optimize pellets manufacturing process based on a decision tree algorithm.

    PubMed

    Ronowicz, Joanna; Thommes, Markus; Kleinebudde, Peter; Krysiński, Jerzy

    2015-06-20

    The present study is focused on a thorough analysis of cause-effect relationships between pellet formulation characteristics (pellet composition as well as process parameters) and the selected quality attribute of the final product. The quality of the pellets was expressed by their shape, using the aspect ratio value. A data matrix for chemometric analysis consisted of 224 pellet formulations prepared with eight different active pharmaceutical ingredients and several various excipients, using different extrusion/spheronization process conditions. The data set contained 14 input variables (both formulation and process variables) and one output variable (pellet aspect ratio). A tree regression algorithm consistent with the Quality by Design concept was applied to obtain deeper understanding and knowledge of the formulation and process parameters affecting final pellet sphericity. A clear, interpretable set of decision rules was generated. The spheronization speed, spheronization time, number of holes and water content of the extrudate were recognized as the key factors influencing pellet aspect ratio. The most spherical pellets were achieved by using a large number of holes during extrusion, a high spheronizer speed and a longer spheronization time. The described data mining approach enhances knowledge about the pelletization process and simultaneously facilitates the search for the optimal process conditions necessary to achieve ideally spherical pellets with good flow characteristics. This data mining approach can be taken into consideration by industrial formulation scientists to support rational decision making in the field of pellet technology.
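
    A hedged sketch of the tree-regression step, with illustrative stand-ins for the 14 input variables; the actual study used a full chemometric workflow, so treat this only as an outline of how interpretable decision rules can be generated from such a data matrix.

```python
# Sketch: fit a regression tree on formulation/process variables and print the splits
# that drive pellet aspect ratio as human-readable rules.
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

def fit_aspect_ratio_tree(df):
    features = ["spheronization_speed", "spheronization_time", "n_holes", "water_content"]
    X, y = df[features], df["aspect_ratio"]
    tree = DecisionTreeRegressor(max_depth=4, min_samples_leaf=10, random_state=0)
    tree.fit(X, y)
    print(export_text(tree, feature_names=features))  # the decision rules
    return tree
```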

  5. Potential sources of variability in mesocosm experiments on the response of phytoplankton to ocean acidification

    NASA Astrophysics Data System (ADS)

    Moreno de Castro, Maria; Schartau, Markus; Wirtz, Kai

    2017-04-01

    Mesocosm experiments on phytoplankton dynamics under high CO2 concentrations mimic the response of marine primary producers to future ocean acidification. However, potential acidification effects can be hindered by the high standard deviation typically found in the replicates of the same CO2 treatment level. In experiments with multiple unresolved factors and a sub-optimal number of replicates, post-processing statistical inference tools might fail to detect an effect that is present. We propose that in such cases, data-based model analyses might be suitable tools to unearth potential responses to the treatment and identify the uncertainties that could produce the observed variability. As test cases, we used data from two independent mesocosm experiments. Both experiments showed high standard deviations and, according to statistical inference tools, biomass appeared insensitive to changing CO2 conditions. Conversely, our simulations showed earlier and more intense phytoplankton blooms in modeled replicates at high CO2 concentrations and suggested that uncertainties in average cell size, phytoplankton biomass losses, and initial nutrient concentration potentially outweigh acidification effects by triggering strong variability during the bloom phase. We also estimated the thresholds below which uncertainties do not escalate to high variability. This information might help in designing future mesocosm experiments and interpreting controversial results on the effect of acidification or other pressures on ecosystem functions.
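
    The argument can be illustrated with a toy Monte Carlo exercise: a very simple bloom model in which small spreads in initial nutrient and loss rate, together with an assumed CO2 growth bonus, produce replicate-to-replicate variability during the bloom that can mask the treatment effect. All parameter values below are invented and are not those of the cited experiments.

```python
# Toy illustration of replicate variability outweighing a modest treatment effect.
import numpy as np

rng = np.random.default_rng(1)

def bloom_peak(growth_bonus, days=30, dt=0.1):
    n = rng.normal(10.0, 1.5)       # initial nutrient, uncertain across replicates
    p = 0.1                         # initial phytoplankton biomass
    loss = rng.normal(0.12, 0.03)   # loss rate, uncertain across replicates
    mu_max = 0.8 * (1.0 + growth_bonus)
    peak = p
    for _ in range(int(days / dt)):
        mu = mu_max * n / (n + 1.0)          # Monod-type nutrient limitation
        dp = (mu - loss) * p * dt
        n = max(n - mu * p * dt, 0.0)
        p += dp
        peak = max(peak, p)
    return peak

ambient = [bloom_peak(0.0) for _ in range(3)]
high_co2 = [bloom_peak(0.1) for _ in range(3)]   # assumed +10% growth under high CO2
print("ambient :", np.mean(ambient), "+/-", np.std(ambient))
print("high CO2:", np.mean(high_co2), "+/-", np.std(high_co2))
```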

  6. Numerical Simulation Analysis of High-precision Dispensing Needles for Solid-liquid Two-phase Grinding

    NASA Astrophysics Data System (ADS)

    Li, Junye; Hu, Jinglei; Wang, Binyu; Sheng, Liang; Zhang, Xinming

    2018-03-01

    In order to investigate the effect of abrasive flow polishing on variable-diameter pipe parts, high-precision dispensing needles were taken as the research object and the polishing process was numerically simulated. The distributions of dynamic pressure and turbulent viscosity of the abrasive flow field inside the high-precision dispensing needle were analyzed under different volume fraction conditions. The comparative analysis demonstrated the effectiveness of abrasive grain polishing of high-precision dispensing needles and showed that controlling the volume fraction of silicon carbide can change the viscosity characteristics of the abrasive flow during the polishing process, so that the polishing quality of the abrasive grains can be controlled.

  7. Guiding Biogeochemical Campaigns with High Resolution Altimetry: Waiting for the SWOT Mission

    NASA Astrophysics Data System (ADS)

    d'Ovidio, Francesco; Zhou, Meng; Park, Young Hyang; Nencioli, Francesco; Resplandy, Laure; Doglioli, Andrea; Petrenko, Anne; Blain, Stephane; Queguiner, Bernard

    2013-09-01

    Biogeochemical processes in the ocean are strongly affected by the horizontal mesoscale (10-100 km) and submesoscale (1-10 km) circulation. Eddies and filaments can create strong inhomogeneity, either amplifying small-scale diffusion processes (mixing) or creating tracer reservoirs. This variability has a direct effect on biogeochemical budgets, controlling for instance tracer fluxes across climatological fronts or part of the vertical exchanges. This variability also poses a challenge to in situ studies, because sites a few tens of kilometers or a few weeks apart may be representative of very different situations. Here we discuss how altimetry observations can be exploited in order to track transport barriers and mixing regions in near-real-time and guide a biogeochemical adaptive sampling strategy. As a case study, we focus on the recent KEOPS2 campaign (Kerguelen region, October-November 2012), which employed a specifically designed high resolution, regional altimetric product produced by CLS (with support from CNES) analyzed with several Lagrangian diagnostics. Such an approach anticipates possible uses of incoming high resolution altimetry data for biogeochemical studies.
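
    The first step behind such altimetry-based Lagrangian diagnostics is the derivation of geostrophic surface velocities from sea level, u = -(g/f) ∂η/∂y and v = (g/f) ∂η/∂x. The sketch below assumes a regular latitude-longitude grid of sea-level anomaly in metres, away from the equator; the KEOPS2 product itself was a dedicated CLS/CNES regional dataset.

```python
# Sketch: geostrophic velocities from a gridded sea-level anomaly field.
import numpy as np

def geostrophic_velocity(sla, lat, lon):
    g, omega, R = 9.81, 7.2921e-5, 6.371e6
    f = 2 * omega * np.sin(np.deg2rad(lat))[:, None]          # Coriolis parameter per row
    dy = (R * np.deg2rad(np.gradient(lat)))[:, None]          # metres per grid step in latitude
    dx = (R * np.cos(np.deg2rad(lat))[:, None]
          * np.deg2rad(np.gradient(lon))[None, :])            # metres per grid step in longitude
    deta_dy = np.gradient(sla, axis=0) / dy
    deta_dx = np.gradient(sla, axis=1) / dx
    u = -g / f * deta_dy
    v = g / f * deta_dx
    return u, v   # these fields are then integrated to obtain particle trajectories / diagnostics
```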

  8. Dual-Mode Combustor

    NASA Technical Reports Server (NTRS)

    Trefny, Charles J (Inventor); Dippold, Vance F (Inventor)

    2013-01-01

    A new dual-mode ramjet combustor used for operation over a wide flight Mach number range is described. Subsonic combustion mode is usable to lower flight Mach numbers than current dual-mode scramjets. High speed mode is characterized by supersonic combustion in a free-jet that traverses the subsonic combustion chamber to a variable nozzle throat. Although a variable combustor exit aperture is required, the need for fuel staging to accommodate the combustion process is eliminated. Local heating from shock-boundary-layer interactions on combustor walls is also eliminated.

  9. Exploring the Unknown: Detection of Fast Variability of Starlight (Abstract)

    NASA Astrophysics Data System (ADS)

    Stanton, R. H.

    2017-12-01

    (Abstract only) In previous papers the author described a photometer designed for observing high-speed events such as lunar and asteroid occultations, and for searching for new varieties of fast stellar variability. A significant challenge presented by such a system is how one deals with the large quantity of data generated in order to process it efficiently and reveal any hidden information that might be present. This paper surveys some of the techniques used to achieve this goal.

  10. Study of flexural rigidity of weavable powder-coated towpreg

    NASA Technical Reports Server (NTRS)

    Hirt, Douglas E.; Marchello, Joseph M.; Baucom, Robert M.

    1990-01-01

    An effort has been made to weave powder-impregnated tow into a two-dimensional preform, controlling process variables to obtain high flexural rigidity in the warp direction and greater flexibility in the fill direction. The resulting prepregs have been consolidated into laminates with LaRC-TPI matrices. Complementary SEM and DSC studies have been performed to deepen understanding of the relationship between tow flexibility and heat treatment. Attention is also given to the effects of the oven temperature and residence time variables on powder/fiber fusion.

  11. Do submesoscale frontal processes ventilate the oxygen minimum zone off Peru?

    NASA Astrophysics Data System (ADS)

    Thomsen, S.; Kanzow, T.; Colas, F.; Echevin, V.; Krahmann, G.; Engel, A.

    2016-02-01

    The Peruvian upwelling region shows pronounced near-surface submesoscale variability including filaments and sharp density fronts. Submesoscale frontal processes can drive large vertical velocities and enhance vertical tracer fluxes in the upper ocean. The associated high temporal and spatial variability poses a large challenge to observational approaches targeting these processes. In this study the role of submesoscale processes in the ventilation of the near-coastal oxygen minimum zone off Peru is investigated. We use satellite based sea surface temperature measurements and multiple high-resolution glider observations of temperature, salinity, oxygen and chlorophyll fluorescence carried out in January and February 2013 off Peru near 14°S during active upwelling. Additionally, outputs from a high-resolution regional ocean circulation model (ROMS) are analysed. At the beginning of our observational survey a previously upwelled, productive and highly oxygenated water body is found in the mixed layer. Subsequently, a cold filament forms and the waters are moved offshore. After the decay of the filament and the relaxation of the upwelling front, the oxygen enriched surface water is found in the previously less oxygenated thermocline, suggesting the occurrence of frontal subduction. A numerical model simulation is used to analyse the evolution of Lagrangian numerical floats in several upwelling filaments, whose vertical structure and hydrographic properties agree well with the observations. The float trajectories support our interpretation that the subduction of previously upwelled water occurs in filaments off Peru. We find that 40-60% of the floats seeded in the newly upwelled water are subducted within a time period of 5 days. This highlights the importance of this process in ventilating the oxycline off Peru.

  12. Boundary Layer Remote Sensing with Combined Active and Passive Techniques: GPS Radio Occultation and High-Resolution Stereo Imaging (WindCam) Small Satellite Concept

    NASA Technical Reports Server (NTRS)

    Mannucci, A.J.; Wu, D.L.; Teixeira, J.; Ao, C.O.; Xie, F.; Diner, D.J.; Wood, R.; Turk, Joe

    2012-01-01

    Objective: significant progress in understanding low-cloud boundary layer processes. This is the single largest uncertainty in climate projections. Radio occultation has unique features suited to boundary layer remote sensing: (1) cloud penetrating, (2) very high vertical resolution (approximately 50-100 m), and (3) sensitivity to thermodynamic variables.

  13. Epigenetic modification of the oxytocin receptor gene influences the perception of anger and fear in the human brain

    PubMed Central

    Puglia, Meghan H.; Lillard, Travis S.; Morris, James P.; Connelly, Jessica J.

    2015-01-01

    In humans, the neuropeptide oxytocin plays a critical role in social and emotional behavior. The actions of this molecule are dependent on a protein that acts as its receptor, which is encoded by the oxytocin receptor gene (OXTR). DNA methylation of OXTR, an epigenetic modification, directly influences gene transcription and is variable in humans. However, the impact of this variability on specific social behaviors is unknown. We hypothesized that variability in OXTR methylation impacts social perceptual processes often linked with oxytocin, such as perception of facial emotions. Using an imaging epigenetic approach, we established a relationship between OXTR methylation and neural activity in response to emotional face processing. Specifically, high levels of OXTR methylation were associated with greater amounts of activity in regions associated with face and emotion processing including amygdala, fusiform, and insula. Importantly, we found that these higher levels of OXTR methylation were also associated with decreased functional coupling of amygdala with regions involved in affect appraisal and emotion regulation. These data indicate that the human endogenous oxytocin system is involved in attenuation of the fear response, corroborating research implicating intranasal oxytocin in the same processes. Our findings highlight the importance of including epigenetic mechanisms in the description of the endogenous oxytocin system and further support a central role for oxytocin in social cognition. This approach linking epigenetic variability with neural endophenotypes may broadly explain individual differences in phenotype including susceptibility or resilience to disease. PMID:25675509

  14. HIGH SHEAR GRANULATION PROCESS: ASSESSING IMPACT OF FORMULATION VARIABLES ON GRANULES AND TABLETS CHARACTERISTICS OF HIGH DRUG LOADING FORMULATION USING DESIGN OF EXPERIMENT METHODOLOGY.

    PubMed

    Fayed, Mohamed H; Abdel-Rahman, Sayed I; Alanazi, Fars K; Ahmed, Mahrous O; Tawfeek, Hesham M; Ali, Bahaa E

    2017-03-01

    High shear wet granulation is a significant component procedure in the pharmaceutical industry. The objective of the study was to investigate the influence of two independent formulation variables, polyvinylpyrrolidone (PVP) as a binder (X1) and croscarmellose sodium (CCS) as a disintegrant (X2), on the critical quality attributes of acetaminophen granules and their corresponding tablets using a design of experiment (DoE) approach. A two-factor, three-level (3²) full factorial design was applied; each variable was investigated at three levels to characterize their strength and interaction. The dried granules were analyzed for their density, granule size and flowability. Additionally, the produced tablets were investigated for breaking force, friability, disintegration time and t. of drug dissolution. The analysis of variance (ANOVA) showed that the two variables had a significant impact (p < 0.05) on granule and tablet characteristics, while only the binder concentration influenced the tablet friability. Furthermore, significant interactions (p < 0.05) between the two variables were also found for granule and tablet attributes. However, the variable interaction showed minimal effect for granule flowability as well as tablet friability. A desirability function was used to optimize the variables under study to obtain a product within the USP limit. It was found that the highest desirability (0.985) could be obtained at the medium level of PVP and the low level of CCS. Ultimately, this study supplies the formulator with useful tools for selecting the proper level of binder and disintegrant to attain a product with the desired characteristics.
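
    The 3² factorial analysis described above can be outlined as follows, assuming one row per batch with coded PVP and CCS levels and a measured response; the response column name is a placeholder, and the desirability step is omitted.

```python
# Sketch: main effects and interaction of two three-level factors via OLS + ANOVA.
import statsmodels.api as sm
import statsmodels.formula.api as smf

def factorial_anova(df, response="disintegration_time"):
    # PVP and CCS each take three coded levels (-1, 0, +1); treated as categorical factors
    model = smf.ols(f"{response} ~ C(PVP) * C(CCS)", data=df).fit()
    table = sm.stats.anova_lm(model, typ=2)   # p-values for main effects and interaction
    return model, table
```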

  15. High-throughput assay for optimising microbial biological control agent production and delivery

    USDA-ARS?s Scientific Manuscript database

    Lack of technologies to produce and deliver effective biological control agents (BCAs) is a major barrier to their commercialization. A myriad of variables associated with BCA cultivation, formulation, drying, storage, and reconstitution processes complicates agent quality maximization. An efficie...

  16. MODELING LONG-TERM DYNAMICS OF LITTER ACCUMULATION IN RESPONSE TO STATIC AND VARIABLE HYDROPERIODS

    EPA Science Inventory

    Accumulated litter from emergent species like the cattail hybrid (Typha glauca Godr.) can influence local abiotic conditions, other biota, and ecosystem processes. Litter accumulation results from high production coupled with slow breakdown rates. Wetland managers regularly mani...

  17. Improved silicon carbide for advanced heat engines. I - Process development for injection molding

    NASA Technical Reports Server (NTRS)

    Whalen, Thomas J.; Trela, Walter

    1989-01-01

    Alternate processing methods have been investigated as a means of improving the mechanical properties of injection-molded SiC. Various mixing processes (dry, high-shear, and fluid) were evaluated along with the morphology and particle size of the starting beta-SiC powder. Statistically-designed experiments were used to determine significant effects and interactions of variables in the mixing, injection molding, and binder removal process steps. Improvements in mechanical strength can be correlated with the reduction in flaw size observed in the injection molded green bodies obtained with improved processing methods.

  18. The academic and nonacademic characteristics of science and nonscience majors in Yemeni high schools

    NASA Astrophysics Data System (ADS)

    Anaam, Mahyoub Ali

    The purposes of this study were: (a) to identify the variables associated with selection of majors; (b) to determine the differences between science and nonscience majors in general, and high and low achievers in particular, with respect to attitudes toward science, integrated science process skills, and logical thinking abilities; and (c) to determine if a significant relationship exists between students' majors and their personality types and learning styles. Data were gathered from 188 twelfth grade male and female high school students in Yemen, who were enrolled in science (45 males and 47 females) and art and literature (47 males and 49 females) tracks. Data were collected with the following instruments: past math and science achievement (taken from school records), Kolb's Learning Styles Inventory (1985), the Integrated Science Process Skills Test, the Myers-Briggs Type Indicator, the Attitude Toward Science in School Assessment, the Group Assessment of Logical Thinking, and the Yemeni High School Students Questionnaire. The logistic regression model and linear discriminant analysis identified several variables that are associated with selection of majors. Moreover, some of the characteristics of science and nonscience majors revealed by these models include the following: science majors tend to have a higher degree of curiosity about science, higher interest in science at the high school level, a greater tendency to believe that their major will help them find a job in the future, higher achievement in science subjects, and higher ratings of their math teachers than nonscience majors. In contrast, nonscience majors tend to have a higher degree of curiosity about nonscience subjects, higher interest in science at the elementary school level, and higher anxiety during science lessons than science majors. In addition, general linear models showed that science majors generally demonstrate more positive attitudes toward science than nonscience majors and outperform them on integrated science process skills and logical thinking abilities. High achievers among science majors have significantly more positive attitudes toward science, higher integrated science process skills, and higher logical thinking abilities than high and low achievers among nonscience majors. No gender differences were found on these variables. Chi-square tests indicate that no significant relationships exist between students' majors and their personality types and learning styles. However, it was found that the majority of students prefer extroversion over introversion, sensing over intuition, thinking over feeling, and judging over perceiving. Moreover, the most common learning styles among science and nonscience majors were the divergent and assimilative learning styles. Finally, the educational implications of these findings were discussed and future research needs were proposed.

  19. An internal variable constitutive model for the large deformation of metals at high temperatures

    NASA Technical Reports Server (NTRS)

    Brown, Stuart; Anand, Lallit

    1988-01-01

    The advent of large deformation finite element methodologies is beginning to permit the numerical simulation of hot working processes whose design until recently has been based on prior industrial experience. Proper application of such finite element techniques requires realistic constitutive equations which more accurately model material behavior during hot working. A simple constitutive model for hot working is the single scalar internal variable model for isotropic thermal elastoplasticity proposed by Anand. The model is recalled, and the specific scalar functions presented for the equivalent plastic strain rate and for the evolution equation of the internal variable are slight modifications of those proposed by Anand; the modified functions are better able to represent high temperature material behavior. Monotonic constant true strain rate and strain rate jump compression experiments on a 2 percent silicon iron are briefly described. The model is implemented in the general purpose finite element program ABAQUS.
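
    For reference, the widely quoted baseline form of Anand's single-scalar-internal-variable model is reproduced below; the paper uses slightly modified functions, so these equations should be read as the starting point rather than the authors' exact formulation.

```latex
% Baseline Anand model: flow rule and evolution of the deformation resistance s.
\begin{align}
  \dot{\bar{\varepsilon}}^{p} &= A \exp\!\left(-\frac{Q}{R\,T}\right)
      \left[\sinh\!\left(\xi\,\frac{\bar{\sigma}}{s}\right)\right]^{1/m}, \\
  \dot{s} &= h_0 \left|\,1-\frac{s}{s^{*}}\right|^{a}
      \operatorname{sign}\!\left(1-\frac{s}{s^{*}}\right)\dot{\bar{\varepsilon}}^{p},
  \qquad
  s^{*} = \hat{s}\left[\frac{\dot{\bar{\varepsilon}}^{p}}{A}
      \exp\!\left(\frac{Q}{R\,T}\right)\right]^{n},
\end{align}
% where s is the scalar internal variable (deformation resistance), Q the activation
% energy, T the absolute temperature, and A, xi, m, h_0, s_hat, n, a material constants.
```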

  20. 25 MHz clock continuous-variable quantum key distribution system over 50 km fiber channel

    PubMed Central

    Wang, Chao; Huang, Duan; Huang, Peng; Lin, Dakai; Peng, Jinye; Zeng, Guihua

    2015-01-01

    In this paper, a practical continuous-variable quantum key distribution system is developed and it runs in the real-world conditions with 25 MHz clock rate. To reach high-rate, we have employed a homodyne detector with maximal bandwidth to 300 MHz and an optimal high-efficiency error reconciliation algorithm with processing speed up to 25 Mbps. To optimize the stability of the system, several key techniques are developed, which include a novel phase compensation algorithm, a polarization feedback algorithm, and related stability method on the modulators. Practically, our system is tested for more than 12 hours with a final secret key rate of 52 kbps over 50 km transmission distance, which is the highest rate so far in such distance. Our system may pave the road for practical broadband secure quantum communication with continuous variables in the commercial conditions. PMID:26419413

  1. Systems, methods, and software for determining spatially variable distributions of the dielectric properties of a heterogeneous material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farrington, Stephen P.

    Systems, methods, and software for measuring the spatially variable relative dielectric permittivity of materials along a linear or otherwise configured sensor element, and more specifically the spatial variability of soil moisture in one dimension as inferred from the dielectric profile of the soil matrix surrounding a linear sensor element. Various methods provided herein combine advances in the processing of time domain reflectometry data with innovations in physical sensing apparatuses. These advancements enable high temporal (and thus spatial) resolution of electrical reflectance continuously along an insulated waveguide that is permanently emplaced in contact with adjacent soils. The spatially resolved reflectance is directly related to impedance changes along the waveguide that are dominated by electrical permittivity contrast due to variations in soil moisture. Various methods described herein are thus able to monitor soil moisture in profile with high spatial resolution.
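
    The inference chain implied by the record, from reflection travel time to apparent permittivity to volumetric water content, can be sketched as follows; the Topp et al. (1980) polynomial is shown only as a generic example, since a permanently emplaced insulated waveguide would require its own calibration.

```python
# Sketch: two-way travel time along a waveguide segment -> apparent permittivity -> water content.
def apparent_permittivity(two_way_time_s, segment_length_m):
    c = 2.998e8                                   # speed of light in vacuum (m/s)
    return (c * two_way_time_s / (2.0 * segment_length_m)) ** 2

def topp_water_content(ka):
    # Topp et al. (1980) empirical polynomial for mineral soils (illustrative calibration)
    return -5.3e-2 + 2.92e-2 * ka - 5.5e-4 * ka**2 + 4.3e-6 * ka**3

ka = apparent_permittivity(2.0e-9, 0.15)          # e.g. 2 ns over a 15 cm segment (assumed values)
print(ka, topp_water_content(ka))
```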

  2. Variability at the edge: highly accreting objects in Taurus

    NASA Astrophysics Data System (ADS)

    Abraham, Peter; Kospal, Agnes; Szabo, Robert

    2017-04-01

    In Kepler K2, Campaign 13, we will obtain 80-days-long optical light curves of seven highly accreting T Tauri stars in the benchmark Taurus star forming region. Here we propose to monitor our sample simultaneously with Kepler and Spitzer, to be able to separate variability patterns related to different physical processes. Monitoring our targets with Spitzer during the final 11 days of the K2 campaign, we will clean the light curves from non-accretion effects (rotating stellar spots, dips due to passing dust structures), and construct, for the first time, a variability curve which reflects the time-dependent accretion only. We will then study and understand how time-dependent mass accretion affects the density and temperature structure of the protoplanetary disk, which sets the initial conditions for planet formation. The proposed work cannot be done without the unparalleled precision of Kepler and Spitzer. This unique and one-time opportunity motivated our DDT proposal.

  3. 25 MHz clock continuous-variable quantum key distribution system over 50 km fiber channel.

    PubMed

    Wang, Chao; Huang, Duan; Huang, Peng; Lin, Dakai; Peng, Jinye; Zeng, Guihua

    2015-09-30

    In this paper, a practical continuous-variable quantum key distribution system is developed and it runs in the real-world conditions with 25 MHz clock rate. To reach high-rate, we have employed a homodyne detector with maximal bandwidth to 300 MHz and an optimal high-efficiency error reconciliation algorithm with processing speed up to 25 Mbps. To optimize the stability of the system, several key techniques are developed, which include a novel phase compensation algorithm, a polarization feedback algorithm, and related stability method on the modulators. Practically, our system is tested for more than 12 hours with a final secret key rate of 52 kbps over 50 km transmission distance, which is the highest rate so far in such distance. Our system may pave the road for practical broadband secure quantum communication with continuous variables in the commercial conditions.

  4. Multi-step rhodopsin inactivation schemes can account for the size variability of single photon responses in Limulus ventral photoreceptors

    PubMed Central

    1994-01-01

    Limulus ventral photoreceptors generate highly variable responses to the absorption of single photons. We have obtained data on the size distribution of these responses, derived the distribution predicted from simple transduction cascade models and compared the theory and data. In the simplest of models, the active state of the visual pigment (defined by its ability to activate G protein) is turned off in a single reaction. The output of such a cascade is predicted to be highly variable, largely because of stochastic variation in the number of G proteins activated. The exact distribution predicted is exponential, but we find that an exponential does not adequately account for the data. The data agree much better with the predictions of a cascade model in which the active state of the visual pigment is turned off by a multi-step process. PMID:8057085
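
    The argument can be reproduced with a small Monte Carlo, under the usual simplification that G proteins are activated at a constant rate while the pigment is active: a single-step shutoff gives an exponentially distributed active lifetime (coefficient of variation 1), whereas an n-step shutoff gives a gamma-distributed lifetime with CV 1/sqrt(n), so the spread of single-photon response sizes shrinks. The rates and step counts below are illustrative, not fitted to the Limulus data.

      import numpy as np

      rng = np.random.default_rng(0)
      N_PHOTONS = 100_000
      G_RATE = 50.0        # assumed G-protein activations per unit active time
      SHUTOFF_RATE = 10.0  # assumed total shutoff rate (1 / mean active lifetime)

      def response_sizes(n_steps):
          # Active lifetime = sum of n_steps exponential stages whose rates sum to SHUTOFF_RATE.
          lifetimes = rng.gamma(shape=n_steps, scale=1.0 / (n_steps * SHUTOFF_RATE), size=N_PHOTONS)
          # G proteins activated during the active period (Poisson given the lifetime).
          return rng.poisson(G_RATE * lifetimes)

      for n in (1, 3, 6):
          g = response_sizes(n)
          print(f"{n}-step shutoff: mean = {g.mean():.1f}, CV = {g.std() / g.mean():.2f}")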

  5. The Effect of Pre-Analytical Variability on the Measurement of MRM-MS-Based Mid- to High-Abundance Plasma Protein Biomarkers and a Panel of Cytokines

    PubMed Central

    Aguilar-Mahecha, Adriana; Kuzyk, Michael A.; Domanski, Dominik; Borchers, Christoph H.; Basik, Mark

    2012-01-01

    Blood sample processing and handling can have a significant impact on the stability and levels of proteins measured in biomarker studies. Such pre-analytical variability needs to be well understood in the context of the different proteomics platforms available for biomarker discovery and validation. In the present study we evaluated different types of blood collection tubes including the BD P100 tube containing protease inhibitors as well as CTAD tubes, which prevent platelet activation. We studied the effect of different processing protocols as well as delays in tube processing on the levels of 55 mid and high abundance plasma proteins using novel multiple-reaction monitoring-mass spectrometry (MRM-MS) assays as well as 27 low abundance cytokines using a commercially available multiplexed bead-based immunoassay. The use of P100 tubes containing protease inhibitors only conferred proteolytic protection for 4 cytokines and only one MRM-MS-measured peptide. Mid and high abundance proteins measured by MRM are highly stable in plasma left unprocessed for up to six hours although platelet activation can also impact the levels of these proteins. The levels of cytokines were elevated when tubes were centrifuged at cold temperature, while low levels were detected when samples were collected in CTAD tubes. Delays in centrifugation also had an impact on the levels of cytokines measured depending on the type of collection tube used. Our findings can help in the development of guidelines for blood collection and processing for proteomic biomarker studies. PMID:22701622

  6. The effect of pre-analytical variability on the measurement of MRM-MS-based mid- to high-abundance plasma protein biomarkers and a panel of cytokines.

    PubMed

    Aguilar-Mahecha, Adriana; Kuzyk, Michael A; Domanski, Dominik; Borchers, Christoph H; Basik, Mark

    2012-01-01

    Blood sample processing and handling can have a significant impact on the stability and levels of proteins measured in biomarker studies. Such pre-analytical variability needs to be well understood in the context of the different proteomics platforms available for biomarker discovery and validation. In the present study we evaluated different types of blood collection tubes including the BD P100 tube containing protease inhibitors as well as CTAD tubes, which prevent platelet activation. We studied the effect of different processing protocols as well as delays in tube processing on the levels of 55 mid and high abundance plasma proteins using novel multiple-reaction monitoring-mass spectrometry (MRM-MS) assays as well as 27 low abundance cytokines using a commercially available multiplexed bead-based immunoassay. The use of P100 tubes containing protease inhibitors only conferred proteolytic protection for 4 cytokines and only one MRM-MS-measured peptide. Mid and high abundance proteins measured by MRM are highly stable in plasma left unprocessed for up to six hours although platelet activation can also impact the levels of these proteins. The levels of cytokines were elevated when tubes were centrifuged at cold temperature, while low levels were detected when samples were collected in CTAD tubes. Delays in centrifugation also had an impact on the levels of cytokines measured depending on the type of collection tube used. Our findings can help in the development of guidelines for blood collection and processing for proteomic biomarker studies.

  7. How temperament and personality contribute to the maladjustment of children with autism.

    PubMed

    De Pauw, Sarah S W; Mervielde, Ivan; Van Leeuwen, Karla G; De Clercq, Barbara J

    2011-02-01

    To test the spectrum hypothesis, which postulates that clinical and non-clinical samples are primarily differentiated by mean-level differences, this study evaluates differences in parent-rated temperament, personality and maladjustment among a low-symptom (N = 81) ASD group, a high-symptom (N = 94) ASD group, and a comparison group (N = 500). These classic spectrum hypothesis tests are extended by adding tests for similarity in variances, reliabilities and patterns of covariation between relevant variables. Children with ASD exhibit more extreme means, except for dominance. The low- and high-symptom ASD groups are primarily differentiated by mean sociability and internal distress. Striking similarities in the reliability and pattern of covariation of variables suggest that comparable processes link traits to maladaptation in low- and high-symptom children with ASD and in children with and without autism.

  8. Within-Students Variability in Learning Experiences, and Teachers' Perceptions of Students' Task-Focus

    ERIC Educational Resources Information Center

    Malmberg, Lars-Erik; Lim, Wee H. T.; Tolvanen, Asko; Nurmi, Jari-Erik

    2016-01-01

    In order to advance our understanding of educational processes, we present a tutorial on intraindividual variability. An adaptive educational process is characterised by stable (less variability), and a maladaptive process is characterised by unstable (more variability) learning experiences from one learning situation to the next. We outline step…

  9. Cortical Action Potential Backpropagation Explains Spike Threshold Variability and Rapid-Onset Kinetics

    PubMed Central

    Yu, Yuguo; Shu, Yousheng; McCormick, David A.

    2008-01-01

    Neocortical action potential responses in vivo are characterized by considerable threshold variability, and thus timing and rate variability, even under seemingly identical conditions. This finding suggests that cortical ensembles are required for accurate sensorimotor integration and processing. Intracellularly, trial-to-trial variability results not only from variation in synaptic activities, but also in the transformation of these into patterns of action potentials. Through simultaneous axonal and somatic recordings and computational simulations, we demonstrate that the initiation of action potentials in the axon initial segment followed by backpropagation of these spikes throughout the neuron results in a distortion of the relationship between the timing of synaptic and action potential events. In addition, this backpropagation also results in an unusually high rate of rise of membrane potential at the foot of the action potential. The distortion of the relationship between the amplitude time course of synaptic inputs and action potential output caused by spike back-propagation results in the appearance of high spike threshold variability at the level of the soma. At the point of spike initiation, the axon initial segment, threshold variability is considerably less. Our results indicate that spike generation in cortical neurons is largely as expected by Hodgkin—Huxley theory and is more precise than previously thought. PMID:18632930

  10. Variables associated with peripherally inserted central catheter related infection in high risk newborn infants 1

    PubMed Central

    Rangel, Uesliz Vianna; Gomes, Saint Clair dos Santos; Costa, Ana Maria Aranha Magalhães; Moreira, Maria Elisabeth Lopes

    2014-01-01

    OBJECTIVE: to relate the variables from a surveillance form for intravenous devices in high-risk newborn infants to peripherally inserted central catheter related infection. METHODOLOGY: approximately 15 variables were studied for their association with peripherally inserted central catheter related infection, defined by blood culture results. The variables analyzed were obtained from the surveillance forms used with intravenous devices, attached to the medical records of newborn infants weighing between 500 and 1,499 g. Statistical association was assessed using the Chi-squared and Student t tests. The study was approved by the Research Ethics Committee of the Instituto Fernandes Figueira under process N. 140.703/12. RESULTS: 63 medical records were analyzed. The infection rate observed was 25.4%. Of the variables analyzed, only three had a statistically significant relationship with the blood culture: the use of drugs capable of inhibiting acid secretion, post-natal steroid use, and undergoing more than one invasive procedure (p-values of 0.0141, 0.0472 and 0.0277, respectively). CONCLUSION: the lack of significance of the remaining variables on the form may be related to the quality of the records and to the absence of standardization. It is recommended that the teams be encouraged to adhere to the protocol and fill out the form. PMID:25493681

  11. Quantum information processing with a travelling wave of light

    NASA Astrophysics Data System (ADS)

    Serikawa, Takahiro; Shiozawa, Yu; Ogawa, Hisashi; Takanashi, Naoto; Takeda, Shuntaro; Yoshikawa, Jun-ichi; Furusawa, Akira

    2018-02-01

    We exploit quantum information processing on a traveling wave of light, expecting emancipation from thermal noise, easy coupling to fiber communication, and potentially high operation speed. Although optical memories are technically challenging, we have an alternative approach to apply multi-step operations on traveling light, that is, continuous-variable one-way computation. So far our achievements include generation of a one-million-mode entangled chain in the time domain, mode engineering of nonlinear resource states, and real-time nonlinear feedforward. Although these are implemented with free-space optics, we are also investigating photonic integration and have performed quantum teleportation with a passive linear waveguide chip as a demonstration of entangling, measurement, and feedforward. We also suggest a loop-based architecture as another model of continuous-variable computing.

  12. Shortwave surface radiation network for observing small-scale cloud inhomogeneity fields

    NASA Astrophysics Data System (ADS)

    Lakshmi Madhavan, Bomidi; Kalisch, John; Macke, Andreas

    2016-03-01

    As part of the High Definition Clouds and Precipitation for advancing Climate Prediction Observational Prototype Experiment (HOPE), a high-density network of 99 silicon photodiode pyranometers was set up around Jülich (10 km × 12 km area) from April to July 2013 to capture the small-scale variability of cloud-induced radiation fields at the surface. In this paper, we provide the details of this unique setup of the pyranometer network, data processing, quality control, and uncertainty assessment under variable conditions. Some exemplary days with clear, broken cloudy, and overcast skies were explored to assess the spatiotemporal observations from the network along with other collocated radiation and sky imager measurements available during the HOPE period.

  13. Observations and Models of Highly Intermittent Phytoplankton Distributions

    PubMed Central

    Mandal, Sandip; Locke, Christopher; Tanaka, Mamoru; Yamazaki, Hidekatsu

    2014-01-01

    The measurement of phytoplankton distributions in ocean ecosystems provides the basis for elucidating the influences of physical processes on plankton dynamics. Technological advances allow phytoplankton to be measured at ever greater resolution, revealing high spatial variability. In conventional mathematical models, the mean value of the measured variable is approximated to compare with the model output, which may misrepresent the reality of planktonic ecosystems, especially at the microscale level. To account for the intermittency of variables, this work applies a new modelling approach to the planktonic ecosystem, called the closure approach. Using this approach for a simple nutrient-phytoplankton model, we have shown how consideration of the fluctuating parts of model variables can affect system dynamics. Also, we have found a critical value of the variance of the overall fluctuating terms below which the conventional non-closure model and the mean value from the closure model exhibit the same result. This analysis gives an idea about the importance of the fluctuating parts of model variables and about when to use the closure approach. Comparisons of plots of mean versus standard deviation of phytoplankton at different depths, obtained using this new approach, with real observations show good agreement. PMID:24787740
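
    The core of the closure argument is that, for a nonlinear term, the mean of the function is not the function of the mean, so dropping the fluctuating parts biases the model once the variance is large enough. A minimal sketch with a Michaelis-Menten style uptake term; the uptake parameters and the lognormal fluctuations of the nutrient field are illustrative assumptions, not the paper's model.

      import numpy as np

      rng = np.random.default_rng(1)

      def uptake(nutrient, vmax=1.0, half_sat=0.5):
          """Michaelis-Menten style nutrient uptake (a typical nonlinear model term)."""
          return vmax * nutrient / (half_sat + nutrient)

      mean_n = 1.0
      for cv in (0.1, 0.5, 1.0, 2.0):  # increasing intermittency of the nutrient field
          sigma = np.sqrt(np.log(1.0 + cv**2))
          n = rng.lognormal(mean=np.log(mean_n) - 0.5 * sigma**2, sigma=sigma, size=200_000)
          print(f"CV = {cv:3.1f}: uptake(mean N) = {uptake(mean_n):.3f}, "
                f"mean uptake = {uptake(n).mean():.3f}")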

  14. Interstellar scintillation as the origin of the rapid radio variability of the quasar J1819+3845.

    PubMed

    Dennett-Thorpe, J; de Bruyn, A G

    2002-01-03

    The liberation of gravitational energy as matter falls onto a supermassive black hole at the centre of a galaxy is believed to explain the high luminosity of quasars. The variability of this emission from quasars and other types of active galactic nuclei can provide information on the size of the emitting regions and the physical process of fuelling the black hole. Some active galactic nuclei are variable at optical (and shorter) wavelengths, and display radio outbursts over years and decades. These active galactic nuclei often also show faster intraday variability at radio wavelengths. The origin of this rapid variability has been extensively debated, but a correlation between optical and radio variations in some sources suggests that both are intrinsic. This would, however, require radiation brightness temperatures that seem physically implausible, leading to the suggestion that the rapid variations are caused by scattering of the emission by the interstellar medium inside our Galaxy. Here we show that the rapid variations in the extreme case of quasar J1819+3845 (ref. 10) indeed arise from interstellar scintillation. The transverse velocity of the scattering material reveals the presence of plasma with a surprisingly high velocity close to the Solar System.

  15. Burgers or tofu? Eating between two worlds: risk information seeking and processing during dietary acculturation.

    PubMed

    Lu, Hang

    2015-01-01

    This study attempted to examine what factors might motivate Chinese international students, the fastest growing ethnic student group in the United States, to seek and process information about potential health risks from eating American-style food. This goal was accomplished by applying the Risk Information Seeking and Processing (RISP) model to this study. An online 2 (severity: high vs. low) × 2 (coping strategies: present vs. absent) between-subjects experiment was conducted via Qualtrics to evaluate the effects of the manipulated variables on the dependent variables of interest as well as various relationships proposed in the RISP model. A convenience sample of 635 participants was recruited online. Data were analyzed primarily using structural equation modeling (SEM) in AMOS 21.0 with maximum likelihood estimation. The final conceptual model has a good model fit to the data given the sample size. The results showed that although the experimentally manipulated variables failed to cause any significant differences in individuals' perceived severity and self-efficacy, this study largely supported the RISP model's propositions about the sociopsychological factors that explain individual variations in information seeking and processing. More specifically, the findings indicated a prominent role of informational subjective norms and affective responses (both negative and positive emotions) in predicting individuals' information seeking and processing. Future implications and limitations are also discussed.

  16. Describing rainfall in northern Australia using multiple climate indices

    NASA Astrophysics Data System (ADS)

    Wilks Rogers, Cassandra Denise; Beringer, Jason

    2017-02-01

    Savanna landscapes are globally extensive and highly sensitive to climate change, yet the physical processes and climate phenomena which affect them remain poorly understood and therefore poorly represented in climate models. Both human populations and natural ecosystems are highly susceptible to precipitation variation in these regions due to the effects on water and food availability and atmosphere-biosphere energy fluxes. Here we quantify the relationship between climate phenomena and historical rainfall variability in Australian savannas and, in particular, how these relationships changed across a strong rainfall gradient, namely the North Australian Tropical Transect (NATT). Climate phenomena were described by 16 relevant climate indices and correlated against precipitation from 1900 to 2010 to determine the relative importance of each climate index on seasonal, annual and decadal timescales. Precipitation trends, climate index trends and wet season characteristics have also been investigated using linear statistical methods. In general, climate index-rainfall correlations were stronger in the north of the NATT where annual rainfall variability was lower and a high proportion of rainfall fell during the wet season. This is consistent with a decreased influence of the Indian-Australian monsoon from the north to the south. Seasonal variation was most strongly correlated with the Australian Monsoon Index, whereas yearly variability was related to a greater number of climate indices, predominately the Tasman Sea and Indonesian sea surface temperature indices (both of which experienced a linear increase over the duration of the study) and the El Niño-Southern Oscillation indices. These findings highlight the importance of understanding the climatic processes driving variability and, subsequently, the importance of understanding the relationships between rainfall and climatic phenomena in the Northern Territory in order to project future rainfall patterns in the region.

  17. Carbon fluxes in tropical forest ecosystems: the value of Eddy-covariance data for individual-based dynamic forest gap models

    NASA Astrophysics Data System (ADS)

    Roedig, Edna; Cuntz, Matthias; Huth, Andreas

    2015-04-01

    The effects of climatic inter-annual fluctuations and human activities on the global carbon cycle are uncertain and currently a major issue in global vegetation models. Individual-based forest gap models, on the other hand, model vegetation structure and dynamics on a small spatial (<100 ha) and large temporal scale (>1000 years). They are well-established tools to reproduce successions of highly-diverse forest ecosystems and investigate disturbances such as logging or fire events. However, the parameterizations of the relationships between short-term climate variability and forest model processes are often uncertain in these models (e.g. daily variable temperature and gross primary production (GPP)) and cannot be constrained from forest inventories. We addressed this uncertainty and linked high-resolution Eddy-covariance (EC) data with an individual-based forest gap model. The forest model FORMIND was applied to three diverse tropical forest sites in the Amazonian rainforest. Species diversity was categorized into three plant functional types. The parameterizations for the steady state of biomass and forest structure were calibrated and validated with different forest inventories. The parameterizations of relationships between short-term climate variability and forest model processes were evaluated with EC-data on a daily time step. The validations of the steady state showed that the forest model could reproduce biomass and forest structures from forest inventories. The daily estimations of carbon fluxes showed that the forest model reproduces GPP as observed by the EC-method. Daily fluctuations of GPP were clearly reflected as a response to daily climate variability. Ecosystem respiration remains a challenge on a daily time step due to a simplified soil respiration approach. In the long term, however, the dynamic forest model is expected to estimate carbon budgets for highly-diverse tropical forests where EC-measurements are rare.

  18. A regional modeling framework of phosphorus sources and transport in streams of the southeastern United States

    USGS Publications Warehouse

    Garcia, Ana Maria.; Hoos, Anne B.; Terziotti, Silvia

    2011-01-01

    We applied the SPARROW model to estimate phosphorus transport from catchments to stream reaches and subsequent delivery to major receiving water bodies in the Southeastern United States (U.S.). We show that six source variables and five land-to-water transport variables are significant (p < 0.05) in explaining 67% of the variability in long-term log-transformed mean annual phosphorus yields. Three land-to-water variables are a subset of landscape characteristics that have been used as transport factors in phosphorus indices developed by state agencies and are identified through experimental research as influencing land-to-water phosphorus transport at field and plot scales. Two land-to-water variables – soil organic matter and soil pH – are associated with phosphorus sorption, a significant finding given that most state-developed phosphorus indices do not explicitly contain variables for sorption processes. Our findings for Southeastern U.S. streams emphasize the importance of accounting for phosphorus present in the soil profile to predict attainable instream water quality. Regional estimates of phosphorus associated with soil-parent rock were highly significant in explaining instream phosphorus yield variability. Model predictions associate 31% of phosphorus delivered to receiving water bodies to geology and the highest total phosphorus yields in the Southeast were catchments with already high background levels that have been impacted by human activity.
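
    The "67% of the variability in long-term log-transformed mean annual phosphorus yields" quoted above is the R-squared of a regression that combines source terms and land-to-water delivery terms. A minimal sketch of that bookkeeping on synthetic data; the purely log-linear form, the predictor count, and the coefficients are illustrative stand-ins, not the calibrated SPARROW model.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 400                                # synthetic catchments

      X = rng.normal(size=(n, 11))           # 6 source + 5 land-to-water predictors (standardized)
      coef = rng.uniform(0.1, 0.6, size=11)  # arbitrary "true" effects
      log_yield = X @ coef + rng.normal(scale=1.0, size=n)  # log-transformed P yield

      # Ordinary least squares fit and coefficient of determination.
      A = np.column_stack([np.ones(n), X])
      beta, *_ = np.linalg.lstsq(A, log_yield, rcond=None)
      resid = log_yield - A @ beta
      r2 = 1.0 - np.sum(resid**2) / np.sum((log_yield - log_yield.mean())**2)
      print(f"R^2 of the log-linear model: {r2:.2f}")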

  19. Influence of rice straw cooking conditions in the soda-ethanol-water pulping on the mechanical properties of produced paper sheets.

    PubMed

    Navaee-Ardeh, S; Mohammadi-Rovshandeh, J; Pourjoozi, M

    2004-03-01

    A normalized design was used to examine the influence of independent variables (alcohol concentration, cooking time and temperature) in the catalytic soda-ethanol pulping of rice straw on various mechanical properties (breaking length, burst, tear index and folding endurance) of paper sheets obtained from each pulping process. An equation for each dependent variable as a function of the cooking variables (independent variables) was obtained by multiple non-linear regression using the least squares method in MATLAB to develop empirical models. The ranges of alcohol concentration, cooking time and temperature were 40-65% (w/w), 150-180 min and 195-210 degrees C, respectively. Three-dimensional graphs of the dependent variables were also plotted versus the independent variables. The optimum values of breaking length, burst and tear index, and folding endurance were 4683.7 (m), 30.99 (kN/g), 376.93 (mN m2/g) and 27.31, respectively. A short cooking time (150 min), high ethanol concentration (65%) and high temperature (210 degrees C) could be used to produce papers with suitable burst and tear indices, whereas for papers with the best breaking length and folding endurance a low temperature (195 degrees C) was desirable. Differences between the optimum values of the dependent variables obtained by the normalized design and the experimental data were less than 20%.
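
    A minimal sketch of the regression step described above (an equation for a paper property as a function of the three cooking variables, fitted by least squares), written in Python rather than MATLAB; the response-surface form and the handful of data points are invented for illustration, not taken from the study.

      import numpy as np
      from scipy.optimize import curve_fit

      # Illustrative data: (alcohol %, time min, temperature C) -> breaking length (m).
      X = np.array([[40, 150, 195], [65, 150, 195], [40, 180, 195], [65, 180, 210],
                    [52, 165, 202], [40, 150, 210], [65, 150, 210], [52, 180, 202]], float)
      y = np.array([3900, 4300, 4100, 4500, 4600, 4000, 4400, 4550], float)

      def model(x, a0, a1, a2, a3, a13):
          alcohol, time, temp = x
          return a0 + a1 * alcohol + a2 * time + a3 * temp + a13 * alcohol * temp

      params, _ = curve_fit(model, (X[:, 0], X[:, 1], X[:, 2]), y)
      print("fitted coefficients:", np.round(params, 3))
      print("predicted breaking length at (65%, 150 min, 210 C):",
            round(float(model((65, 150, 210), *params)), 1), "m")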

  20. The relationship between Urbanisation and changes in flood regimes: the British case.

    NASA Astrophysics Data System (ADS)

    Prosdocimi, Ilaria; Miller, James; Kjeldsen, Thomas

    2013-04-01

    This pilot study investigates whether long-term changes in observed series of extreme flood events can be attributed to changes in climate and land-use drivers. We investigate, in particular, changes in winter and summer peaks extracted from gauged instantaneous flow records in selected British catchments. Using a Poisson process framework, the frequency and magnitude of extreme events above a threshold can be modelled simultaneously under the standard stationarity assumptions of constant location and scale. In the case of a non-stationary process, the framework was extended to include covariates to account for changes in the process parameters. By including covariates related to the physical process, such as increased urbanization or North Atlantic Oscillation (NAO) Index levels, rather than just time, an enhanced understanding of the changes in high flows is obtainable. Indeed some variability is expected in any natural process and can be partially explained by large-scale measures such as the NAO Index. The focus of this study is to understand, once natural variability is taken into account, how much of the remaining variability can be explained by increased urbanization levels. For this study, catchments are selected that have experienced significant growth in urbanisation in the past decades, typically 1960s to present, and for which concurrent good quality high flow data are available. Temporal change in the urban extent within catchments is obtained using novel processing of historical mapping sources, whereby the urban, suburban and rural fractions are obtained for decadal periods. Suitable flow data from localised rural catchments are also included as control cases against which to compare observed changes in the flood regime of urbanised catchments, and to provide evidence of changes in regional climate. Initial results suggest that the effect of urbanisation can be detected in the rate of occurrence of flood events, especially in summer, whereas the impact on flood magnitude is less pronounced. Further tests across a greater number of catchments are necessary to validate these results.
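
    A minimal sketch of the non-stationary peaks-over-threshold idea: model the yearly count of threshold exceedances as Poisson with a rate that depends on covariates such as urban fraction and the NAO index. The synthetic record and the GLM call below are illustrative, not the study's fitted model.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      years = 50
      urban = np.linspace(0.05, 0.30, years)             # growing urban fraction
      nao = rng.normal(size=years)                       # winter NAO index
      true_rate = np.exp(0.2 + 3.0 * urban + 0.3 * nao)  # exceedances per year
      counts = rng.poisson(true_rate)

      X = sm.add_constant(np.column_stack([urban, nao]))
      fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
      print(fit.params)   # log-rate intercept and covariate effects
      print(fit.pvalues)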

  1. So Many Brands and Varieties to Choose from: Does This Compromise the Control of Food Intake in Humans?

    PubMed

    Hardman, Charlotte A; Ferriday, Danielle; Kyle, Lesley; Rogers, Peter J; Brunstrom, Jeffrey M

    2015-01-01

    The recent rise in obesity is widely attributed to changes in the dietary environment (e.g., increased availability of energy-dense foods and larger portion sizes). However, a critical feature of our "obesogenic environment" may have been overlooked - the dramatic increase in "dietary variability" (the tendency for specific mass-produced foods to be available in numerous varieties that differ in energy content). In this study we tested the hypothesis that dietary variability compromises the control of food intake in humans. Specifically, we examined the effects of dietary variability in pepperoni pizza on two key outcome variables; i) compensation for calories in pepperoni pizza and ii) expectations about the satiating properties of pepperoni pizza (expected satiation). We reasoned that dietary variability might generate uncertainty about the postingestive effects of a food. An internet-based questionnaire was completed by 199 adults. This revealed substantial variation in exposure to different varieties of pepperoni pizza. In a follow-up study (n= 66; 65% female), high pizza variability was associated with i) poorer compensation for calories in pepperoni pizza and ii) lower expected satiation for pepperoni pizza. Furthermore, the effect of uncertainty on caloric compensation was moderated by individual differences in decision making (loss aversion). For the first time, these findings highlight a process by which dietary variability may compromise food-intake control in humans. This is important because it exposes a new feature of Western diets (processed foods in particular) that might contribute to overeating and obesity.

  2. Bacterial contamination of ex vivo processed PBPC products under clean room conditions.

    PubMed

    Ritter, Markus; Schwedler, Joachim; Beyer, Jörg; Movassaghi, Kamran; Mutters, Reinier; Neubauer, Andreas; Schwella, Nimrod

    2003-11-01

    Patients undergoing high-dose radio- and/or chemotherapy and autologous or allogeneic PBPC transplantation are at high risk for infections owing to profound immunosuppression. In this study, the rate of microbial contamination of ex vivo processed PBPC products was analyzed, comparing preparation under clean room conditions to standard laboratory conditions. After implementation of good manufacturing practice conditions in the two participating institutions, the microbial contamination rate of 366 PBPC harvests from 198 patients was determined under certified clean room conditions (Group A) from 2000 until 2002. To investigate the influence of improved environmental conditions along with other parameters, this set of samples was compared with a historical control set of 1413 PBPC products, which had been processed ex vivo under a clean bench in a regular laboratory room and were harvested from 626 patients (Group B) from 1989 until 2000. In Group B microbial contamination was found in 74 PBPC products (5.2%) from 57 patients. In Group A microbial growth was detected in 3 leukapheresis products (0.8%) from 3 patients. After exclusion of PBPC products that were probably contaminated before manipulation, statistical analysis showed a significant difference (chi-square = 10.339; p < 0.001). These data suggest an impact of clean room conditions on the bacterial contamination rate of PBPC products. To identify confounders, variables such as leukapheresis technique, culture methodology, and microbial colonization of central venous catheters were taken into account. Further variables might be identified in future studies.
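
    The group comparison reduces to a 2 x 2 contingency table. A minimal sketch using the raw counts quoted above (3 of 366 versus 74 of 1413); because the authors excluded products that were probably contaminated before manipulation, this will not exactly reproduce their chi-square of 10.339.

      from scipy.stats import chi2_contingency

      # Rows: clean-room group A, standard-laboratory group B.
      # Columns: contaminated products, non-contaminated products.
      table = [[3, 366 - 3],
               [74, 1413 - 74]]

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi-square = {chi2:.3f}, dof = {dof}, p = {p:.4f}")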

  3. Studies on Hot-Melt Prepregging of PMR-II-50 Polyimide Resin with Graphite Fibers

    NASA Technical Reports Server (NTRS)

    Shin, E. Eugene; Sutter, James K.; Juhas, John; Veverka, Adrienne; Klans, Ojars; Inghram, Linda; Scheiman, Dan; Papadopoulos, Demetrios; Zoha, John; Bubnick, Jim

    2004-01-01

    A second-generation PMR (in situ Polymerization of Monomer Reactants) polyimide resin, PMR-II-50, has been considered for high temperature and high stiffness space propulsion composites applications for its improved high temperature performance. As part of composite processing optimization, two commercial prepregging methods: solution vs. hot-melt processes were investigated with M40J fabrics from Toray. In a previous study a systematic chemical, physical, thermal and mechanical characterization of these composites indicated that poor resin-fiber interfacial wetting, especially for the hot-melt process, resulted in poor composite quality. In order to improve the interfacial wetting, optimization of the resin viscosity and process variables were attempted in a commercial hot-melt prepregging line. In addition to presenting the results from the prepreg quality optimization trials, the combined effects of the prepregging method and two different composite cure methods, i.e. hot press vs. autoclave, on composite quality and properties are discussed.

  4. Studies on Hot-Melt Prepregging of PMR-II-50 Polyimide Resin with Graphite Fibers

    NASA Technical Reports Server (NTRS)

    Shin, E. Eugene; Sutter, James K.; Juhas, John; Veverka, Adrienne; Klans, Ojars; Inghram, Linda; Scheiman, Dan; Papadopoulos, Demetrios; Zoha, John; Bubnick, Jim

    2003-01-01

    A Second generation PMR (in situ Polymerization of Monomer Reactants) polyimide resin, PMR-II-50, has been considered for high temperature and high stiffness space propulsion composites applications for its improved high temperature performance. As part of composite processing optimization, two commercial prepregging methods: solution vs. hot-melt processes were investigated with M40J fabrics from Toray. In a previous study a systematic chemical, physical, thermal and mechanical characterization of these composites indicated that poor resin-fiber interfacial wetting, especially for the hot-melt process, resulted in poor composite quality. In order to improve the interfacial wetting, optimization of the resin viscosity and process variables were attempted in a commercial hot-melt prepregging line. In addition to presenting the results from the prepreg quality optimization trials, the combined effects of the prepregging method and two different composite cure methods, i.e., hot press vs. autoclave on composite quality and properties are discussed.

  5. Post-secretion processing influences spider silk performance

    PubMed Central

    Blamires, Sean J.; Wu, Chung-Lin; Blackledge, Todd A.; Tso, I-Min

    2012-01-01

    Phenotypic variation facilitates adaptations to novel environments. Silk is an example of a highly variable biomaterial. The two-spidroin (MaSp) model suggests that spider major ampullate (MA) silk is composed of two proteins: MaSp1 predominantly contains alanine and glycine and forms strength enhancing β-sheet crystals, while MaSp2 contains proline and forms elastic spirals. Nonetheless, mechanical properties can vary in spider silks without congruent amino acid compositional changes. We predicted that post-secretion processing causes variation in the mechanical performance of wild MA silk independent of protein composition or spinning speed across 10 species of spider. We used supercontraction to remove post-secretion effects and compared the mechanics of silk in this ‘ground state’ with wild native silks. Native silk mechanics varied less among species compared with ‘ground state’ silks. Variability in the mechanics of ‘ground state’ silks was associated with proline composition. However, variability in native silks was not. We attribute interspecific similarities in the mechanical properties of native silks, regardless of amino acid compositions, to glandular processes altering molecular alignment of the proteins prior to extrusion. Such post-secretion processing may enable MA silk to maintain functionality across environments, facilitating its function as a component of an insect-catching web. PMID:22628213

  6. Effects of Process-Oriented and Product-Oriented Worked Examples and Prior Knowledge on Learner Problem Solving and Attitude: A Study in the Domain of Microeconomics

    ERIC Educational Resources Information Center

    Brooks, Christopher Darren

    2009-01-01

    The purpose of this study was to investigate the effectiveness of process-oriented and product-oriented worked example strategies and the mediating effect of prior knowledge (high versus low) on problem solving and learner attitude in the domain of microeconomics. In addition, the effect of these variables on learning efficiency as well as the…

  7. Single Side Electrolytic In-Process Dressing (ELID) Grinding with Lapping Kinematics of Silicon Carbide

    NASA Astrophysics Data System (ADS)

    Khoshaim, Ahmed Bakr

    The demand for silicon carbide (SiC) ceramics has increased significantly in the last decade due to their reliable physical and chemical properties. Silicon carbide is widely used in aerospace segments in addition to many other industrial uses. Single-side grinding is sometimes preferable to conventional grinding because of its ability to produce flat ceramics. However, the manufacturing cost is still high because of high tool wear and long machining times. Part of the solution is to use electrolytic in-process dressing (ELID) to reduce the processing time. ELID single-side grinding of ceramics has not been studied before. The study involves four variables with three levels each. One of the variables, the eccentricity, is investigated for the first time on ceramics. A full factorial design for both surface roughness and material removal rate is used to derive mathematical models that can predict future results. Three grinding wheel mesh sizes are used, so the influence of different grain sizes on the results can be evaluated. The kinematics of the process was studied as a function of eccentricity in order to optimize the pattern of the diamond grains. The experiment is performed with the assistance of the proposed specialized ELID fluid, TRIM C270E.
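
    The experimental layout described (four variables at three levels each) is a full factorial design, which can be enumerated directly; the factor names and level values below, including the eccentricity settings, are illustrative placeholders rather than the dissertation's actual levels.

      from itertools import product

      # Hypothetical level settings for the four ELID grinding variables.
      factors = {
          "wheel_mesh":      (600, 1200, 2000),
          "eccentricity_mm": (0.0, 5.0, 10.0),
          "current_A":       (5, 10, 15),
          "feed_rate":       (1, 2, 3),
      }

      runs = list(product(*factors.values()))
      print(f"full factorial design: {len(runs)} runs")   # 3**4 = 81
      for run in runs[:3]:
          print(dict(zip(factors.keys(), run)))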

  8. Patterns in Temporal Variability of Temperature, Oxygen and pH along an Environmental Gradient in a Coral Reef

    PubMed Central

    Guadayol, Òscar; Silbiger, Nyssa J.; Donahue, Megan J.; Thomas, Florence I. M.

    2014-01-01

    Spatial and temporal environmental variability are important drivers of ecological processes at all scales. As new tools allow the in situ exploration of individual responses to fluctuations, ecologically meaningful ways of characterizing environmental variability at organism scales are needed. We investigated the fine-scale spatial heterogeneity of high-frequency temporal variability in temperature, dissolved oxygen concentration, and pH experienced by benthic organisms in a shallow coastal coral reef. We used a spatio-temporal sampling design, consisting of 21 short-term time-series located along a reef flat-to-reef slope transect, coupled to a long-term station monitoring water column changes. Spectral analyses revealed sharp gradients in variance decomposed by frequency, as well as differences between physically-driven and biologically-reactive parameters. These results highlight the importance of environmental variance at organismal scales and present a new sampling scheme for exploring this variability in situ. PMID:24416364

  9. Multicollinearity may lead to artificial interaction: an example from a cross sectional study of biomarkers.

    PubMed

    Sithisarankul, P; Weaver, V M; Diener-West, M; Strickland, P T

    1997-06-01

    Collinearity is the situation which arises in multiple regression when some or all of the explanatory variables are so highly correlated with one another that it becomes very difficult, if not impossible, to disentangle their influences and obtain a reasonably precise estimate of their effects. A suppressor variable is an extreme case of collinearity in which one variable can substantially increase the multiple correlation when combined with a variable that is only modestly correlated with the response variable. In this study, we describe the process by which we disentangled and discovered multicollinearity and its consequences, namely artificial interaction, using data from the cross-sectional quantification of several biomarkers. We showed how collinearity between one biomarker (blood lead level) and another (urinary trans,trans-muconic acid) and their interaction (blood lead level * urinary trans,trans-muconic acid) can lead to an observed artificial interaction on a third biomarker (urinary 5-aminolevulinic acid).
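
    A minimal simulation of the situation described: two highly correlated, non-centred explanatory variables plus their product term become nearly linearly dependent, which can be diagnosed with variance inflation factors (VIF = 1/(1 - R^2) from regressing each predictor on the others). The correlation level and sample size are illustrative, not the biomarker data.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 200
      x1 = rng.normal(loc=3.0, size=n)             # e.g. blood lead level (non-centred)
      x2 = 0.95 * x1 + 0.1 * rng.normal(size=n)    # highly collinear second biomarker
      x3 = x1 * x2                                 # interaction term

      X = np.column_stack([x1, x2, x3])

      def vif(X, j):
          """Variance inflation factor of column j: regress it on the remaining columns."""
          others = np.delete(X, j, axis=1)
          A = np.column_stack([np.ones(len(X)), others])
          beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
          resid = X[:, j] - A @ beta
          r2 = 1.0 - resid.var() / X[:, j].var()
          return 1.0 / (1.0 - r2)

      for j, name in enumerate(["x1", "x2", "x1*x2"]):
          print(f"VIF({name}) = {vif(X, j):.1f}")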

  10. Networks for image acquisition, processing and display

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.

    1990-01-01

    The human visual system comprises layers of networks which sample, process, and code images. Understanding these networks is a valuable means of understanding human vision and of designing autonomous vision systems based on network processing. Ames Research Center has an ongoing program to develop computational models of such networks. The models predict human performance in detection of targets and in discrimination of displayed information. In addition, the models are artificial vision systems sharing properties with biological vision that has been tuned by evolution for high performance. Properties include variable density sampling, noise immunity, multi-resolution coding, and fault-tolerance. The research stresses analysis of noise in visual networks, including sampling, photon, and processing unit noises. Specific accomplishments include: models of sampling array growth with variable density and irregularity comparable to that of the retinal cone mosaic; noise models of networks with signal-dependent and independent noise; models of network connection development for preserving spatial registration and interpolation; multi-resolution encoding models based on hexagonal arrays (HOP transform); and mathematical procedures for simplifying analysis of large networks.

  11. Effect of a Dispersant Agent in Fine Coal Recovery from Washery Tailings by Oil Agglomeration (Preliminary Study)

    NASA Astrophysics Data System (ADS)

    Yasar, Özüm; Uslu, Tuncay

    2017-12-01

    Among fine coal cleaning methods, the oil agglomeration process has important advantages such as high process recovery, a cleaner product, and a simple dewatering stage. Several coal agglomeration studies have been undertaken recently and the effects of different variables on process performance have been investigated. However, unlike flotation studies, most previous agglomeration studies have not used dispersing agents to minimize the slime-coating effects of clays. In this study, the agglomeration process was applied to recover fine coal from coal washery tailings containing a remarkable amount of fine coal. An attempt was made to eliminate the negative effect of fine clays during recovery by using a dispersing agent instead of de-sliming. Although ash reductions over 90% were achieved, performance remained below expectations in terms of combustible matter recovery. This is, however, a preliminary study, and more satisfactory results are expected in subsequent studies by changing variables such as solids ratio, oil dosage, and dispersant type and dosage.

  12. Work environment risk factors for injuries in wood processing.

    PubMed

    Holcroft, Christina A; Punnett, Laura

    2009-01-01

    The reported injury rate for wood product manufacturing in Maine, 1987-2004, was almost twice the state-wide average for all jobs. A case-control study was conducted in wood processing plants to determine preventable risk factors for injury. A total of 157 cases with injuries reported to workers' compensation and 251 controls were interviewed. In multivariable analyses, variables associated with injury risk were high physical workload, machine-paced work or inability to take a break, lack of training, absence of a lockout/tagout program, low seniority, and male gender. Different subsets of these variables were significant when acute incidents and overexertions were analyzed separately and when all injuries were stratified by industry sub-sector. Generalizability may be limited somewhat by non-representative participation of workplaces and individuals. Nevertheless, these findings provide evidence that many workplace injuries occurring in wood processing could be prevented by application of ergonomics principles and improved work organization.
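
    The multivariable analysis described is typically a logistic regression of case/control status on the workplace factors. The sketch below fits such a model on synthetic data, so the exposures and effect sizes are placeholders rather than the study's estimates.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      n = 408  # of the same order as the 157 cases + 251 controls interviewed

      # Hypothetical binary exposures.
      workload = rng.binomial(1, 0.5, n)        # high physical workload
      machine_paced = rng.binomial(1, 0.4, n)   # machine-paced work / no breaks
      training = rng.binomial(1, 0.6, n)        # received safety training

      logit = -1.0 + 0.9 * workload + 0.7 * machine_paced - 0.8 * training
      injury = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      X = sm.add_constant(np.column_stack([workload, machine_paced, training]))
      fit = sm.Logit(injury, X).fit(disp=False)
      print(np.exp(fit.params[1:]))   # odds ratios for the three exposures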

  13. Optimization of a thermal hydrolysis process for sludge pre-treatment.

    PubMed

    Sapkaite, I; Barrado, E; Fdz-Polanco, F; Pérez-Elvira, S I

    2017-05-01

    At industrial scale, thermal hydrolysis (TH) is the most widely used process to enhance the biodegradability of the sludge produced in wastewater treatment plants. Through a statistically guided Box-Behnken experimental design, the present study analyses the effect of TH as a pre-treatment applied to activated sludge. The selected process variables were temperature (130-180 °C), time (5-50 min) and decompression mode (slow or steam-explosion effect), and the parameters evaluated were sludge solubilisation and methane production by anaerobic digestion. A quadratic polynomial model was generated to compare process performance across the 15 different combinations of operating conditions obtained by varying the process variables. The statistical analysis showed that methane production and solubility were significantly affected by pre-treatment time and temperature. During high-intensity pre-treatment (high temperature and long times), the solubility increased sharply while the methane production exhibited the opposite behaviour, indicating the formation of some soluble but non-biodegradable materials. Therefore, solubilisation is not a reliable parameter to quantify the efficiency of a thermal hydrolysis pre-treatment, since it is not directly related to methane production. Based on the optimization of the operational parameters, the estimated optimal thermal hydrolysis conditions to enhance sewage sludge digestion were: 140-170 °C heating temperature, 5-35 min residence time, and one sudden decompression.
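
    The "15 different combinations of operating conditions" correspond to a three-factor Box-Behnken design: for each pair of factors, all four (+/-1, +/-1) combinations with the third factor held at its centre level, plus centre points. A minimal construction in coded units; treating the two-level decompression mode as a coded factor and using three centre points are assumptions made for illustration.

      from itertools import combinations
      import numpy as np

      runs = []
      for i, j in combinations(range(3), 2):      # each pair of the three factors
          for a in (-1, 1):
              for b in (-1, 1):
                  run = [0, 0, 0]
                  run[i], run[j] = a, b
                  runs.append(run)
      runs += [[0, 0, 0]] * 3                     # centre points
      design = np.array(runs)                     # 12 + 3 = 15 runs

      # Columns (coded): temperature, time, decompression mode.
      print(design.shape)
      print(design)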

  14. The role of effort in moderating the anxiety-performance relationship: Testing the prediction of processing efficiency theory in simulated rally driving.

    PubMed

    Wilson, Mark; Smith, Nickolas C; Chattington, Mark; Ford, Mike; Marple-Horvat, Dilwyn E

    2006-11-01

    We tested some of the key predictions of processing efficiency theory using a simulated rally driving task. Two groups of participants were classified as either dispositionally high or low anxious based on trait anxiety scores and trained on a simulated driving task. Participants then raced individually on two similar courses under counterbalanced experimental conditions designed to manipulate the level of anxiety experienced. The effort exerted on the driving tasks was assessed through self-report (RSME), psychophysiological measures (pupil dilation) and visual gaze data. Efficiency was measured in terms of efficiency of visual processing (search rate) and driving control (variability of wheel and accelerator pedal) indices. Driving performance was measured as the time taken to complete the course. As predicted, increased anxiety had a negative effect on processing efficiency as indexed by the self-report, pupillary response and variability of gaze data. Predicted differences due to dispositional levels of anxiety were also found in the driving control and effort data. Although both groups of drivers performed worse under the threatening condition, the performance of the high trait anxious individuals was affected to a greater extent by the anxiety manipulation than the performance of the low trait anxious drivers. The findings suggest that processing efficiency theory holds promise as a theoretical framework for examining the relationship between anxiety and performance in sport.

  15. Gamma rays from blazars

    NASA Astrophysics Data System (ADS)

    Tavecchio, Fabrizio

    2017-01-01

    Blazars are high-energy engines providing us with natural laboratories to study particle acceleration, relativistic plasma processes, magnetic field dynamics and black hole physics. Key information is provided by observations at high energies (in particular by Fermi/LAT) and very high energies (by Cherenkov telescopes). I give a short account of the current status of the field, with particular emphasis on the theoretical challenges connected to the observed ultra-fast variability events and to the emission of flat spectrum radio quasars in the very-high-energy band.

  16. Process connectivity in a naturally prograding river delta

    NASA Astrophysics Data System (ADS)

    Sendrowski, Alicia; Passalacqua, Paola

    2017-03-01

    River deltas are lowland systems that can display high hydrological connectivity. This connectivity can be structural (morphological connections), functional (control of fluxes), or process-based (information flow from system drivers to sinks). In this work, we quantify hydrological process connectivity in Wax Lake Delta, coastal Louisiana, by analyzing couplings among external drivers (discharge, tides, and wind) and water levels recorded at five islands and one channel over summer 2014. We quantify process connections with information theory, a branch of mathematics concerned with the communication of information. We represent process connections as a network; variables serve as network nodes and couplings as network links describing the strength, direction, and time scale of information flow. Comparing process connections at long (105 days) and short (10 days) time scales, we show that tides exhibit daily synchronization with water level, with decreasing strength from downstream to upstream, and that tides transfer information as they transition from spring to neap. Discharge synchronizes with water level and the time scale of its information transfer compares well to physical travel times through the system, computed with a hydrodynamic model. Information transfer and physical transport show similar spatial patterns, although information transfer time scales are larger than physical travel times. Wind events associated with water level setup lead to increased process connectivity with highly variable information transfer time scales. We discuss the information theory results in the context of the hydrologic behavior of the delta, the role of vegetation as a connector/disconnector on islands, and the applicability of process networks as tools for delta modeling results.
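
    A minimal sketch of the coupling estimate: lagged mutual information between a driver series and a water-level series from binned histograms, with the lag of peak information read off as the transfer time scale. The synthetic series, bin count, and lag grid are illustrative; the study itself uses transfer-entropy-style measures on the field data.

      import numpy as np

      rng = np.random.default_rng(6)

      def mutual_information(x, y, bins=8):
          """Histogram estimate of mutual information (bits) between two series."""
          joint, _, _ = np.histogram2d(x, y, bins=bins)
          p_xy = joint / joint.sum()
          p_x = p_xy.sum(axis=1, keepdims=True)
          p_y = p_xy.sum(axis=0, keepdims=True)
          nz = p_xy > 0
          return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

      # Synthetic hourly discharge and a water level that lags it by 36 hours.
      n = 5000
      discharge = np.cumsum(rng.normal(size=n))
      water_level = np.concatenate([np.zeros(36), discharge[:-36]]) + rng.normal(scale=0.5, size=n)

      lags = range(0, 73, 6)
      mi = [mutual_information(discharge[: -lag or None], water_level[lag:]) for lag in lags]
      peak_mi, peak_lag = max(zip(mi, lags))
      print(f"peak MI = {peak_mi:.2f} bits at lag = {peak_lag} h")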

  17. Macroscale water fluxes 3. Effects of land processes on variability of monthly river discharge

    USGS Publications Warehouse

    Milly, P.C.D.; Wetherald, R.T.

    2002-01-01

    A salient characteristic of river discharge is its temporal variability. The time series of flow at a point on a river can be viewed as the superposition of a smooth seasonal cycle and an irregular, random variation. Viewing the random component in the spectral domain facilitates both its characterization and an interpretation of its major physical controls from a global perspective. The power spectral density functions of monthly flow anomalies of many large rivers worldwide are typified by a "red noise" process: the density is higher at low frequencies (e.g., <1 y-1) than at high frequencies, indicating disproportionate (relative to uncorrelated "white noise") contribution of low frequencies to variability of monthly flow. For many high-latitude and arid-region rivers, however, the power is relatively evenly distributed across the frequency spectrum. The power spectrum of monthly flow can be interpreted as the product of the power spectrum of monthly basin total precipitation (which is typically white or slightly red) and several filters that have physical significance. The filters are associated with (1) the conversion of total precipitation (sum of rainfall and snowfall) to effective rainfall (liquid flux to the ground surface from above), (2) the conversion of effective rainfall to soil water excess (runoff), and (3) the conversion of soil water excess to river discharge. Inferences about the roles of each filter can be made through an analysis of observations, complemented by information from a global model of the ocean-atmosphere-land system. The first filter causes a snowmelt-related amplification of high-frequency variability in those basins that receive substantial snowfall. The second filter causes a relatively constant reduction in variability across all frequencies and can be predicted well by means of a semiempirical water balance relation. The third filter, associated with groundwater and surface water storage in the river basin, causes a strong reduction in high-frequency variability of many basins. The strength of this reduction can be quantified by an average residence time of water in storage, which is typically on the order of 20-50 days. The residence time is demonstrably influenced by freezing conditions in the basin, fractional cover of the basin by lakes, and runoff ratio (ratio of mean runoff to mean precipitation). Large lake areas enhance storage and can greatly increase total residence times (100 to several hundred days). Freezing conditions appear to cause bypassing of subsurface storage, thus reducing residence times (0-30 days). Small runoff ratios tend to be associated with arid regions, where the water table is deep, and consequently, most of the runoff is produced by processes that bypass the saturated zone, leading to relatively small residence times for such basins (0-40 days).
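
    The third filter above (basin storage with a roughly 20-50 day residence time) acts as a first-order low-pass filter on runoff. A minimal sketch: route white-noise monthly soil-water excess through a linear reservoir dS/dt = R - S/tau and compare the low- and high-frequency power of input and output; the residence time and record length are illustrative.

      import numpy as np

      rng = np.random.default_rng(7)
      n_months = 1200                     # 100 years of monthly values
      tau = 1.5                           # residence time in months (~45 days)
      excess = rng.normal(loc=1.0, scale=0.3, size=n_months)   # "white" soil water excess

      # Linear reservoir: S[t+1] = S[t] + R[t] - S[t]/tau, discharge Q[t] = S[t]/tau.
      discharge = np.zeros(n_months)
      s = tau * excess.mean()             # start near equilibrium storage
      for t in range(n_months):
          s = s + excess[t] - s / tau
          discharge[t] = s / tau

      def psd(x):
          x = x - x.mean()
          spec = np.abs(np.fft.rfft(x)) ** 2
          freqs = np.fft.rfftfreq(len(x), d=1.0)   # cycles per month
          return freqs[1:], spec[1:]

      f, p_in = psd(excess)
      _, p_out = psd(discharge)
      low, high = f < 1 / 24, f > 1 / 3   # periods > 2 years vs < 3 months
      print("input  low/high power ratio:", round(p_in[low].mean() / p_in[high].mean(), 1))
      print("output low/high power ratio:", round(p_out[low].mean() / p_out[high].mean(), 1))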

  18. Selecting the process variables for filament winding

    NASA Technical Reports Server (NTRS)

    Calius, E.; Springer, G. S.

    1986-01-01

    A model is described which can be used to determine the appropriate values of the process variables for filament winding cylinders. The process variables which can be selected by the model include the winding speed, fiber tension, initial resin degree of cure, and the temperatures applied during winding, curing, and post-curing. The effects of these process variables on the properties of the cylinder during and after manufacture are illustrated by a numerical example.

  19. Assessing the effects of multiple stressors on the functioning of Mediterranean rivers using poplar wood breakdown.

    PubMed

    Aristi, Ibon; Díez, Jose Ramon; Larrañaga, Aitor; Navarro-Ortega, Alícia; Barceló, Damià; Elosegi, Arturo

    2012-12-01

    Mediterranean rivers in the Iberian Peninsula are being increasingly affected by human activities, which threaten their ecological status. A clear picture of how these multiple stressors affect river ecosystem functioning is still lacking. We addressed this question by measuring a key ecosystem process, namely breakdown of organic matter, at 66 sites distributed across Mediterranean Spain. We performed breakdown experiments by measuring the mass lost by wood sticks for 54 to 106 days. Additionally, we gathered data on physico-chemical, biological and geomorphological characteristics of study sites. Study sites spanned a broad range of environmental characteristics and breakdown rates varied fiftyfold across sites. No clear geographic patterns were found between or within basins. 90th quantile regressions performed to link breakdown rates with environmental characteristics included the following 7 variables in the model, in decreasing order of importance: altitude, phosphorus content of the water, catchment area, toxicity, invertebrate-based biotic index, riparian buffer width, and diatom-based quality index. Breakdown rate was systematically low in high-altitude rivers with few human impacts, but showed high variability in areas affected by human activity. This increase in variability is the result of the influence of multiple stressors acting simultaneously, as some of these can promote whereas others slow down the breakdown of organic matter. Therefore, stick breakdown gives information on the intensity of a key ecosystem process, which would otherwise be very difficult to predict based on environmental variables.
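
    A minimal sketch of the 90th-quantile regression used to relate breakdown rates to environmental predictors, using the statsmodels formula interface on synthetic data; the predictor subset and effect sizes are illustrative, not the study's fitted model.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(8)
      n = 66   # number of study sites in the paper

      df = pd.DataFrame({
          "altitude": rng.uniform(0, 1500, n),
          "phosphorus": rng.lognormal(mean=-1.0, sigma=0.8, size=n),
          "toxicity": rng.uniform(0, 1, n),
      })
      # Synthetic breakdown rate whose upper envelope depends on the predictors.
      df["k"] = (0.02 - 1e-5 * df["altitude"] + 0.01 * df["phosphorus"]
                 + rng.uniform(0, 0.01, n) * (1 - df["toxicity"]))

      fit = smf.quantreg("k ~ altitude + phosphorus + toxicity", df).fit(q=0.9)
      print(fit.params)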

  20. Assessment of local variability by high-throughput e-beam metrology for prediction of patterning defect probabilities

    NASA Astrophysics Data System (ADS)

    Wang, Fuming; Hunsche, Stefan; Anunciado, Roy; Corradi, Antonio; Tien, Hung Yu; Tang, Peng; Wei, Junwei; Wang, Yongjun; Fang, Wei; Wong, Patrick; van Oosten, Anton; van Ingen Schenau, Koen; Slachter, Bram

    2018-03-01

    We present an experimental study of pattern variability and defectivity, based on a large data set with more than 112 million SEM measurements from an HMI high-throughput e-beam tool. The test case is a 10 nm node SRAM via array patterned with a DUV immersion LELE process, where we see a variation in mean size and litho sensitivities between different unique via patterns that leads to seemingly qualitative differences in defectivity. The large available data volume enables further analysis to reliably distinguish global and local CDU variations, including a breakdown into local systematics and stochastics. A closer inspection of the tail end of the distributions and estimation of defect probabilities concludes that there is a common defect mechanism and defect threshold despite the observed differences in specific pattern characteristics. We expect that the analysis methodology can be applied for defect probability modeling as well as general process qualification in the future.
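
    A minimal sketch of turning a measured CD distribution into a defect probability: fit a normal to the per-via measurements and integrate the tail below a failure threshold, which is where a data set of order 10^8 measurements makes probabilities of order 1e-7 observable. The CD numbers and threshold below are illustrative, not the 10 nm node data.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(9)

      # Hypothetical via CD measurements (nm): global mean/CDU plus local stochastics.
      cd = rng.normal(loc=20.0, scale=1.5, size=1_000_000)

      mu, sigma = cd.mean(), cd.std()
      threshold = 12.0   # assumed "missing via" threshold (nm)

      p_defect = norm.cdf(threshold, loc=mu, scale=sigma)   # lower-tail probability per via
      print(f"estimated defect probability per via: {p_defect:.2e}")
      print(f"expected defects per 112 million vias: {p_defect * 112e6:.0f}")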

  1. Individual differences in the processing of smoking-cessation video messages: An imaging genetics study.

    PubMed

    Shi, Zhenhao; Wang, An-Li; Aronowitz, Catherine A; Romer, Daniel; Langleben, Daniel D

    2017-09-01

    Studies testing the benefits of enriching smoking-cessation video ads with attention-grabbing sensory features have yielded variable results. Dopamine transporter gene (DAT1) has been implicated in attention deficits. We hypothesized that DAT1 polymorphism is partially responsible for this variability. Using functional magnetic resonance imaging, we examined brain responses to videos high or low in attention-grabbing features, indexed by "message sensation value" (MSV), in 53 smokers genotyped for DAT1. Compared to other smokers, 10/10 homozygotes showed greater neural response to High- vs. Low-MSV smoking-cessation videos in two a priori regions of interest: the right temporoparietal junction and the right ventrolateral prefrontal cortex. These regions are known to underlie stimulus-driven attentional processing. Exploratory analysis showed that the right temporoparietal response positively predicted follow-up smoking behavior indexed by urine cotinine. Our findings suggest that responses to attention-grabbing features in smoking-cessation messages are affected by the DAT1 genotype. Copyright © 2017. Published by Elsevier B.V.

  2. Self-distancing Buffers High Trait Anxious Pediatric Cancer Caregivers against Short- and Longer-term Distress

    PubMed Central

    Penner, Louis A; Guevarra, Darwin A.; Harper, Felicity W. K.; Taub, Jeffrey; Phipps, Sean; Albrecht, Terrance L.; Kross, Ethan

    2015-01-01

    Pediatric cancer caregivers are typically present at their child’s frequent, invasive treatments, and such treatments elicit substantial distress. Yet, variability exists in how even the most anxious caregivers cope. Here we examined one potential source of this variability: caregivers’ tendencies to self-distance when reflecting on their feelings surrounding their child’s treatments. We measured caregivers’ self-distancing and trait anxiety at baseline, anticipatory anxiety during their child’s treatment procedures, and psychological distress and avoidance three months later. Self-distancing buffered high (but not low) trait anxious caregivers against short- and long-term distress without promoting avoidance. These findings held when controlling for other buffers, highlighting the unique benefits of self-distancing. These results identify a coping process that buffers vulnerable caregivers against a chronic life stressor while also demonstrating the ecological validity of laboratory research on self-distancing. Future research is needed to explicate causality and the cognitive and physiological processes that mediate these results. PMID:27617183

  3. Evaporation of LOX under supercritical and subcritical conditions

    NASA Technical Reports Server (NTRS)

    Yang, A. S.; Hsieh, W. H.; Kuo, K. K.; Brown, J. J.

    1993-01-01

    The evaporation of LOX under supercritical and subcritical conditions was studied experimentally and theoretically. In experiments, the evaporation rate and surface temperature were measured for a LOX strand vaporizing in helium environments at pressures ranging from 5 to 68 atmospheres. Gas sampling and chromatography analysis were also employed to profile the gas composition above the LOX surface for the purpose of model validation. A comprehensive theoretical model was formulated and solved numerically to simulate the evaporation process of LOX at high pressures. The model was based on the conservation equations of mass, momentum, energy, and species concentrations for a multicomponent system, with consideration of gravitational body force, solubility of ambient gases in liquid, and variable thermophysical properties. Good agreement between predictions and measured oxygen mole fraction profiles was obtained. The effects of pressure on the distribution of the Lewis number and of the variable diffusion coefficient were further examined to elucidate the high-pressure transport behavior exhibited in the LOX vaporization process.

  4. Estimating and mapping ecological processes influencing microbial community assembly

    DOE PAGES

    Stegen, James C.; Lin, Xueju; Fredrickson, Jim K.; ...

    2015-05-01

    Ecological community assembly is governed by a combination of (i) selection resulting from among-taxa differences in performance; (ii) dispersal resulting from organismal movement; and (iii) ecological drift resulting from stochastic changes in population sizes. The relative importance and nature of these processes can vary across environments. Selection can be homogeneous or variable, and while dispersal is a rate, we conceptualize extreme dispersal rates as two categories; dispersal limitation results from limited exchange of organisms among communities, and homogenizing dispersal results from high levels of organism exchange. To estimate the influence and spatial variation of each process we extend a recently developed statistical framework, use a simulation model to evaluate the accuracy of the extended framework, and use the framework to examine subsurface microbial communities over two geologic formations. For each subsurface community we estimate the degree to which it is influenced by homogeneous selection, variable selection, dispersal limitation, and homogenizing dispersal. Our analyses revealed that the relative influences of these ecological processes vary substantially across communities even within a geologic formation. We further identify environmental and spatial features associated with each ecological process, which allowed mapping of spatial variation in ecological-process-influences. The resulting maps provide a new lens through which ecological systems can be understood; in the subsurface system investigated here they revealed that the influence of variable selection was associated with the rate at which redox conditions change with subsurface depth.

  5. Quantifying spatiotemporal variability of fine particles in an urban environment using combined fixed and mobile measurements

    NASA Astrophysics Data System (ADS)

    Sullivan, R. C.; Pryor, S. C.

    2014-06-01

    Spatiotemporal variability of fine particle concentrations in Indianapolis, Indiana is quantified using a combination of high temporal resolution measurements at four fixed sites and mobile measurements with instruments attached to bicycles during transects of the city. Urban PM2.5 concentrations average ˜3.9-5.1 μg m-3 above the regional background. The influence of atmospheric conditions on ambient PM2.5 concentrations is evident with the greatest temporal variability occurring at periods of one day and 5-10 days corresponding to diurnal and synoptic meteorological processes, and lower mean wind speeds are associated with episodes of high PM2.5 concentrations. An anthropogenic signal is also evident. Higher PM2.5 concentrations coincide with morning rush hour, the frequencies of PM2.5 variability co-occur with those for carbon monoxide, and higher extreme concentrations occur mid-week compared to weekends. On shorter time scales (
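
    The diurnal and synoptic (5-10 day) variability periods mentioned above can be identified with a spectral estimate; the sketch below applies Welch's method to a synthetic hourly PM2.5 series that merely stands in for the fixed-site measurements.

      # Identify dominant variability periods in an hourly PM2.5 series.
      import numpy as np
      from scipy.signal import welch

      rng = np.random.default_rng(1)
      hours = np.arange(24 * 365)
      pm25 = (
          10
          + 3 * np.sin(2 * np.pi * hours / 24)        # diurnal cycle
          + 4 * np.sin(2 * np.pi * hours / (24 * 7))  # synoptic-scale cycle
          + rng.normal(0, 2, hours.size)
      )

      freqs, power = welch(pm25, fs=24.0, nperseg=24 * 60)  # fs in samples per day
      periods_days = 1 / freqs[1:]                          # skip the zero frequency
      top = np.argsort(power[1:])[::-1][:3]
      print("dominant periods (days):", periods_days[top])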

  6. High-cadence, High-resolution Spectroscopic Observations of Herbig Stars HD 98922 and V1295 Aquila

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aarnio, Alicia N.; Monnier, John D.; Calvet, Nuria

    Recent observational work has indicated that mechanisms for accretion and outflow in Herbig Ae/Be star–disk systems may differ from magnetospheric accretion (MA) as it is thought to occur in T Tauri star–disk systems. In this work, we assess the temporal evolution of spectral lines probing accretion and mass loss in Herbig Ae/Be systems and test for consistency with the MA paradigm. For two Herbig Ae/Be stars, HD 98922 (B9e) and V1295 Aql (A2e), we have gathered multi-epoch (∼years) and high-cadence (∼minutes) high-resolution optical spectra to probe a wide range of kinematic processes. Employing a line equivalent width evolution correlation metric introduced here, we identify species co-evolving (indicative of common line origin) via novel visualization. We interferometrically constrain often problematically degenerate parameters, inclination and inner-disk radius, allowing us to focus on the structure of the wind, magnetosphere, and inner gaseous disk in radiative transfer models. Over all timescales sampled, the strongest variability occurs within the blueshifted absorption components of the Balmer series lines; the strength of variability increases with the cadence of the observations. Finally, high-resolution spectra allow us to probe substructure within the Balmer series’ blueshifted absorption components: we observe static, low-velocity features and time-evolving features at higher velocities. Overall, we find the observed line morphologies and variability are inconsistent with a scaled-up T Tauri MA scenario. We suggest that as magnetic field structure and strength change dramatically with increasing stellar mass from T Tauri to Herbig Ae/Be stars, so too may accretion and outflow processes.

  7. Redintegration and the Benefits of Long-Term Knowledge in Verbal Short-Term Memory: An Evaluation of Schweickert's (1993) Multinomial Processing Tree Model

    ERIC Educational Resources Information Center

    Thorn, Annabel S. C.; Gathercole, Susan E.; Frankish, Clive R.

    2005-01-01

    The impact of four long-term knowledge variables on serial recall accuracy was investigated. Serial recall was tested for high and low frequency words and high and low phonotactic frequency nonwords in 2 groups: monolingual English speakers and French-English bilinguals. For both groups the recall advantage for words over nonwords reflected more…

  8. Soil variability along a nitrogen mineralization and nitrification gradient in a nitrogen-saturated hardwood forest

    Treesearch

    Frank S. Gilliam; Nikki L. Lyttle; Ashley Thomas; Mary Beth Adams

    2005-01-01

    Some N-saturated watersheds of the Fernow Experimental Forest (FEF), West Virginia, exhibit a high degree of spatial heterogeneity in soil N processing. We used soils from four sites at FEF representing a gradient in net N mineralization and nitrification to consider the causes and consequences of such spatial heterogeneity. We collected soils with extremely high vs....

  9. Motion Picture Attendance and Factors Influencing Movie Selection among High School Students.

    ERIC Educational Resources Information Center

    Austin, Bruce A.

    In an audience research study, 64 high school students responded to a questionnaire concerning their movie attendance habits and the importance of ten variables to their decision-making process when choosing a movie to see. The results indicated that 26.6% attended movies once a month, 23.4% twice monthly, 6.3% three times a month, 4.7% four times…

  10. Vocal acoustic analysis as a biometric indicator of information processing: implications for neurological and psychiatric disorders.

    PubMed

    Cohen, Alex S; Dinzeo, Thomas J; Donovan, Neila J; Brown, Caitlin E; Morrison, Sean C

    2015-03-30

    Vocal expression reflects an integral component of communication that varies considerably within individuals across contexts and is disrupted in a range of neurological and psychiatric disorders. There is reason to suspect that variability in vocal expression reflects, in part, the availability of "on-line" resources (e.g., working memory, attention). Thus, understanding vocal expression is a potentially important biometric index of information processing, not only across but within individuals over time. A first step in this line of research involves establishing a link between vocal expression and information processing systems in healthy adults. The present study employed a dual attention experimental task where participants provided natural speech while simultaneously engaged in a baseline, medium or high nonverbal processing-load task. Objective, automated, and computerized analysis was employed to measure vocal expression in 226 adults. Increased processing load resulted in longer pauses, fewer utterances, greater silence overall and less variability in frequency and intensity levels. These results provide compelling evidence of a link between information processing resources and vocal expression, and provide important information for the development of an automated, inexpensive and noninvasive biometric measure of information processing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    PubMed

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units, namely a twin-screw high-shear granulator, a fluid bed dryer and a product control unit, were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed and liquid mass flow and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations on process continuity was also evaluated using Hotelling's T2 and Q residual statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
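
    A minimal sketch of the monitoring scheme described above: a PCA model fitted to normal-operating-condition (NOC) data, with Hotelling's T2 and Q (squared prediction error) statistics computed for new observations. The array shapes, number of components, and random stand-in data are assumptions, not the authors' implementation.

      # PCA-based MSPC with Hotelling's T2 and Q statistics.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      noc = np.random.default_rng(2).normal(size=(500, 35))  # stand-in for 35 logged variables
      new = np.random.default_rng(3).normal(size=(50, 35))   # stand-in for a disturbed run

      scaler = StandardScaler().fit(noc)
      pca = PCA(n_components=5).fit(scaler.transform(noc))

      def t2_and_q(X):
          Xs = scaler.transform(X)
          scores = pca.transform(Xs)
          t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)  # Hotelling's T2
          residual = Xs - pca.inverse_transform(scores)
          q = np.sum(residual**2, axis=1)                           # Q / SPE
          return t2, q

      t2, q = t2_and_q(new)
      print(t2[:5], q[:5])  # compare against control limits derived from the NOC runs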

  12. A first-principle model of 300 mm Czochralski single-crystal Si production process for predicting crystal radius and crystal growth rate

    NASA Astrophysics Data System (ADS)

    Zheng, Zhongchao; Seto, Tatsuru; Kim, Sanghong; Kano, Manabu; Fujiwara, Toshiyuki; Mizuta, Masahiko; Hasebe, Shinji

    2018-06-01

    The Czochralski (CZ) process is the dominant method for manufacturing large cylindrical single-crystal ingots for the electronics industry. Although many models and control methods for the CZ process have been proposed, they have only been tested with small equipment and only a few industrial applications have been reported. In this research, we constructed a first-principle model for controlling industrial CZ processes that produce 300 mm single-crystal silicon ingots. The developed model, which consists of energy, mass balance, hydrodynamic, and geometrical equations, calculates the crystal radius and the crystal growth rate as output variables by using the heater input, the crystal pulling rate, and the crucible rise rate as input variables. To improve accuracy, we modeled the CZ process by considering factors such as changes in the positions of the crucible and the melt level. The model was validated with operation data from an industrial 300 mm CZ process. We compared the calculated and actual values of the crystal radius and the crystal growth rate, and the results demonstrated that the developed model simulated the industrial process with high accuracy.

  13. Analysis and modeling of wafer-level process variability in 28 nm FD-SOI using split C-V measurements

    NASA Astrophysics Data System (ADS)

    Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard

    2018-07-01

    This work details the analysis of wafer-level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on-wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.

  14. Effectiveness of Cognitive Processing Therapy and Prolonged Exposure in the Department of Veterans Affairs.

    PubMed

    Rutt, Benjamin T; Oehlert, Mary E; Krieshok, Thomas S; Lichtenberg, James W

    2018-04-01

    Objective: This study evaluated the effectiveness of cognitive processing therapy and prolonged exposure in conditions reflective of current clinical practice within the Veterans Health Administration. Method: This study involved a retrospective review of 2030 charts. A total of 750 veterans from 10 U.S. states who received cognitive processing therapy or prolonged exposure in individual psychotherapy were included in the study (participants in cognitive processing therapy, N = 376; participants in prolonged exposure, N = 374). The main dependent variable was self-reported posttraumatic stress disorder symptoms as measured by total scores on the Posttraumatic Stress Disorder Checklist. The study used multilevel modeling to evaluate the absolute and relative effectiveness of both treatments and determine the relationship between patient-level variables and total Posttraumatic Stress Disorder Checklist scores during treatment. Results: Cognitive processing therapy and prolonged exposure were equally effective at reducing total Posttraumatic Stress Disorder Checklist scores. Veterans who completed therapy reported significantly larger reductions in the Posttraumatic Stress Disorder Checklist than patients who did not complete therapy. There were no significant differences in the improvement of posttraumatic stress disorder symptoms with respect to age and three racial/ethnic groups (Caucasian, African American, and Hispanic). Conclusions: Cognitive processing therapy and prolonged exposure were shown to be effective in conditions highly reflective of clinical practice and with a highly diverse sample of veterans. Challenges related to dropout from trauma focused therapy should continue to be researched.
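
    A minimal sketch of the multilevel modeling step, assuming a long-format table with one row per patient per session; the column names are hypothetical and do not reflect the study's actual coding.

      # Mixed-effects growth model of repeated PCL scores nested within patients.
      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("pcl_long.csv")  # hypothetical: patient_id, session, treatment, ...

      model = smf.mixedlm(
          "pcl_total ~ session * treatment + age + completer",
          data=df,
          groups=df["patient_id"],
          re_formula="~session",  # random intercept and slope per patient
      )
      result = model.fit()
      print(result.summary())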

  15. Landsat-8 TIRS thermal radiometric calibration status

    USGS Publications Warehouse

    Barsi, Julia A.; Markham, Brian L.; Montanaro, Matthew; Gerace, Aaron; Hook, Simon; Schott, John R.; Raqueno, Nina G.; Morfitt, Ron

    2017-01-01

    The Thermal Infrared Sensor (TIRS) instrument is the thermal-band imager on the Landsat-8 platform. The initial on-orbit calibration estimates of the two TIRS spectral bands indicated large average radiometric calibration errors, -0.29 and -0.51 W/m2 sr μm or -2.1K and -4.4K at 300K in Bands 10 and 11, respectively, as well as high variability in the errors, 0.87K and 1.67K (1-σ), respectively. The average error was corrected in operational processing in January 2014, though this adjustment did not improve the variability. The source of the variability was determined to be stray light from far outside the field of view of the telescope. An algorithm for modeling the stray light effect was developed and implemented in the Landsat-8 processing system in February 2017. The new process has improved the overall calibration of the two TIRS bands, reducing the residual variability in the calibration from 0.87K to 0.51K at 300K for Band 10 and from 1.67K to 0.84K at 300K for Band 11. There are residual average lifetime bias errors in each band: 0.04 W/m2 sr μm (0.30K) and -0.04 W/m2 sr μm (-0.29K), for Bands 10 and 11, respectively.

  16. Modelling Escherichia coli concentrations in the tidal Scheldt river and estuary.

    PubMed

    de Brauwere, Anouk; de Brye, Benjamin; Servais, Pierre; Passerat, Julien; Deleersnijder, Eric

    2011-04-01

    Recent observations in the tidal Scheldt River and Estuary revealed a poor microbiological water quality and substantial variability of this quality which can hardly be assigned to a single factor. To assess the importance of tides, river discharge, point sources, upstream concentrations, mortality and settling a new model (SLIM-EC) was built. This model was first validated by comparison with the available field measurements of Escherichia coli (E. coli, a common fecal bacterial indicator) concentrations. The model simulations agreed well with the observations, and in particular were able to reproduce the observed long-term median concentrations and variability. Next, the model was used to perform sensitivity runs in which one process/forcing was removed at a time. These simulations revealed that the tide, upstream concentrations and the mortality process are the primary factors controlling the long-term median E. coli concentrations and the observed variability. The tide is crucial to explain the increased concentrations upstream of important inputs, as well as a generally increased variability. Remarkably, the wastewater treatment plants discharging in the study domain do not seem to have a significant impact. This is due to a dilution effect, and to the fact that the concentrations coming from upstream (where large cities are located) are high. Overall, the settling process as it is presently described in the model does not significantly affect the simulated E. coli concentrations. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. Complexity in relational processing predicts changes in functional brain network dynamics.

    PubMed

    Cocchi, Luca; Halford, Graeme S; Zalesky, Andrew; Harding, Ian H; Ramm, Brentyn J; Cutmore, Tim; Shum, David H K; Mattingley, Jason B

    2014-09-01

    The ability to link variables is critical to many high-order cognitive functions, including reasoning. It has been proposed that limits in relating variables depend critically on relational complexity, defined formally as the number of variables to be related in solving a problem. In humans, the prefrontal cortex is known to be important for reasoning, but recent studies have suggested that such processes are likely to involve widespread functional brain networks. To test this hypothesis, we used functional magnetic resonance imaging and a classic measure of deductive reasoning to examine changes in brain networks as a function of relational complexity. As expected, behavioral performance declined as the number of variables to be related increased. Likewise, increments in relational complexity were associated with proportional enhancements in brain activity and task-based connectivity within and between 2 cognitive control networks: A cingulo-opercular network for maintaining task set, and a fronto-parietal network for implementing trial-by-trial control. Changes in effective connectivity as a function of increased relational complexity suggested a key role for the left dorsolateral prefrontal cortex in integrating and implementing task set in a trial-by-trial manner. Our findings show that limits in relational processing are manifested in the brain as complexity-dependent modulations of large-scale networks. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. High-resolution monitoring across the soil-groundwater interface - Revealing small-scale hydrochemical patterns with a novel multi-level well

    NASA Astrophysics Data System (ADS)

    Gassen, Niklas; Griebler, Christian; Stumpp, Christine

    2016-04-01

    Biogeochemical turnover processes in the subsurface are highly variable both in time and space. In order to capture this variability, high resolution monitoring systems are required. Particularly in riparian zones, the understanding of small-scale biogeochemical processes is of interest, as these zones are regarded as important buffers for nutrients and contaminants with high turnover rates. To date, riparian research has focused on influences of groundwater-surface water interactions on element cycling, but little is known about processes occurring at the interface between the saturated and the unsaturated zone during dynamic flow conditions. Therefore, we developed a new type of high resolution multi-level well (HR-MLW) that has been installed in the riparian zone of the Selke river. This HR-MLW for the first time enables water samples to be collected from both the unsaturated and the saturated zone along a single vertical profile, with a vertical resolution of 0.05 to 0.5 m to a depth of 4 m b.l.s. Water samples from the unsaturated zone are extracted via suction cup sampling. Samples from the saturated zone are withdrawn through glass filters and steel capillaries. Both ceramic cups and glass filters are installed along a 1" HDPE piezometer tube. The first high resolution hydrochemical profiles revealed a distinct depth-zonation in the riparian alluvial aquifer. A shallow zone beneath the water table carried a signature isotopically and hydrochemically similar to the nearby river, while layers below 1.5 m were influenced by regional groundwater. This zonation showed temporal dynamics related to groundwater table fluctuations and microbial turnover processes. The HR-MLW delivered new insight into mixing and turnover processes between river water and groundwater in riparian zones, in both temporal and spatial dimensions. With these new insights, we are able to improve our understanding of dynamic turnover processes at the soil-groundwater interface and of surface water-groundwater interactions in riparian zones. In the future, a better prediction and targeted management of buffer mechanisms in riparian zones will be possible.

  19. Aconitum alkaloid content and the high toxicity of aconite tincture.

    PubMed

    Chan, Thomas Y K

    2012-10-10

    Although proprietary medicines and decoction of processed aconite roots are the most widely used, tincture accounts for the great majority of aconite poisoning cases in China, indicating that it is much more toxic than other formulations. Aconite tincture is often self-prepared at home and raw aconite plants or roots are often used. Even if processed aconite roots were used to make the tincture, the amount of Aconitum alkaloids is highly variable, depending on the adequacy of processing and quality control. Aconitum alkaloids dissolve efficiently in alcohol. For these reasons, tincture contains very high concentrations of Aconitum alkaloids. Despite its high intrinsic toxicity, overdose of aconite tincture by the users has been common. Severe aconite poisoning can be complicated by fatal ventricular tachyarrhythmias and asystole. The public should be repeatedly warned of the danger of taking aconite tincture by mouth. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  20. Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells

    NASA Technical Reports Server (NTRS)

    Miller, L.; Doan, D. J.; Carr, E. S.

    1971-01-01

    A program to determine and study the critical process variables associated with the manufacture of aerospace, hermetically-sealed, nickel-cadmium cells is described. The determination and study of the process variables associated with the positive and negative plaque impregnation/polarization process are emphasized. The experimental data resulting from the implementation of fractional factorial design experiments are analyzed by means of a linear multiple regression analysis technique. This analysis permits the selection of preferred levels for certain process variables to achieve desirable impregnated plaque characteristics.
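
    A minimal sketch of the regression step described above: relating a plaque characteristic to coded factor levels from a fractional factorial experiment by linear multiple regression. The factor and response names are hypothetical.

      # Multiple regression on coded (-1/+1) factor levels from a fractional factorial design.
      import pandas as pd
      import statsmodels.formula.api as smf

      runs = pd.read_csv("impregnation_runs.csv")  # assumed: one row per experimental run

      fit = smf.ols(
          "active_material_loading ~ bath_temperature + current_density"
          " + nitrate_concentration + immersion_time",
          data=runs,
      ).fit()
      print(fit.summary())  # coefficient signs suggest preferred levels for each variable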

  1. Rivers and Floodplains as Key Components of Global Terrestrial Water Storage Variability

    NASA Astrophysics Data System (ADS)

    Getirana, Augusto; Kumar, Sujay; Girotto, Manuela; Rodell, Matthew

    2017-10-01

    This study quantifies the contribution of rivers and floodplains to terrestrial water storage (TWS) variability. We use state-of-the-art models to simulate land surface processes and river dynamics and to separate TWS into its main components. Based on a proposed impact index, we show that surface water storage (SWS) contributes 8% of TWS variability globally, but that contribution differs widely among climate zones. Changes in SWS are a principal component of TWS variability in the tropics, where major rivers flow over arid regions, and at high latitudes. SWS accounts for 22-27% of TWS variability in both the Amazon and Nile Basins. Changes in SWS are negligible in the Western U.S., Northern Africa, Middle East, and central Asia. Based on comparisons with Gravity Recovery and Climate Experiment-based TWS, we conclude that accounting for SWS improves simulated TWS in most of South America, Africa, and Southern Asia, confirming that SWS is a key component of TWS variability.
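
    The exact form of the impact index is not given in the abstract; as one simple stand-in, the sketch below compares each storage component's temporal standard deviation with the sum over components, using synthetic anomaly series.

      # Rough contribution of each storage component to TWS variability (illustrative only).
      import numpy as np

      rng = np.random.default_rng(4)
      months = 180
      components = {
          "SWS": rng.normal(0, 2, months),           # surface water storage anomaly
          "soil moisture": rng.normal(0, 5, months),
          "groundwater": rng.normal(0, 3, months),
          "SWE": rng.normal(0, 1, months),           # snow water equivalent anomaly
      }

      total_std = sum(np.std(series) for series in components.values())
      for name, series in components.items():
          share = 100 * np.std(series) / total_std
          print(f"{name}: {share:.1f}% of summed variability")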

  2. Some considerations concerning the challenge of incorporating social variables into epidemiological models of infectious disease transmission.

    PubMed

    Barnett, Tony; Fournié, Guillaume; Gupta, Sunetra; Seeley, Janet

    2015-01-01

    Incorporation of 'social' variables into epidemiological models remains a challenge. Too much detail and models cease to be useful; too little and the very notion of infection - a highly social process in human populations - may be considered with little reference to the social. The French sociologist Émile Durkheim proposed that the scientific study of society required identification and study of 'social currents'. Such 'currents' are what we might today describe as 'emergent properties', specifiable variables appertaining to individuals and groups, which represent the perspectives of social actors as they experience the environment in which they live their lives. Here we review the ways in which one particular emergent property, hope, relevant to a range of epidemiological situations, might be used in epidemiological modelling of infectious diseases in human populations. We also indicate how such an approach might be extended to include a range of other potential emergent properties to represent complex social and economic processes bearing on infectious disease transmission.

  3. Role of relative humidity in processing and storage of seeds and assessment of variability in storage behaviour in Brassica spp. and Eruca sativa.

    PubMed

    Suma, A; Sreenivasan, Kalyani; Singh, A K; Radhamani, J

    2013-01-01

    The role of relative humidity (RH) while processing and storing seeds of Brassica spp. and Eruca sativa was investigated by creating different levels of relative humidity, namely, 75%, 50%, 32%, and 11% using different saturated salt solutions and 1% RH using concentrated sulphuric acid. The variability in seed storage behaviour of different species of Brassica was also evaluated. The samples were stored at 40 ± 2°C in sealed containers and various physiological parameters were assessed at different intervals up to three months. The seed viability and seedling vigour parameters were considerably reduced in all accessions at high relative humidity irrespective of the species. Storage at intermediate relative humidities caused minimal decline in viability. All the accessions performed better at relative humidity level of 32% maintaining seed moisture content of 3%. On analyzing the variability in storage behaviour, B. rapa and B. juncea were better performers than B. napus and Eruca sativa.

  4. Technical variables in high-throughput miRNA expression profiling: much work remains to be done.

    PubMed

    Nelson, Peter T; Wang, Wang-Xia; Wilfred, Bernard R; Tang, Guiliang

    2008-11-01

    MicroRNA (miRNA) gene expression profiling has provided important insights into plant and animal biology. However, there has not been ample published work about pitfalls associated with technical parameters in miRNA gene expression profiling. One source of pertinent information about technical variables in gene expression profiling is the separate and more well-established literature regarding mRNA expression profiling. However, many aspects of miRNA biochemistry are unique. For example, the cellular processing and compartmentation of miRNAs, the differential stability of specific miRNAs, and aspects of global miRNA expression regulation require specific consideration. Additional possible sources of systematic bias in miRNA expression studies include the differential impact of pre-analytical variables, substrate specificity of nucleic acid processing enzymes used in labeling and amplification, and issues regarding new miRNA discovery and annotation. We conclude that greater focus on technical parameters is required to bolster the validity, reliability, and cultural credibility of miRNA gene expression profiling studies.

  5. New parameters in adaptive testing of ferromagnetic materials utilizing magnetic Barkhausen noise

    NASA Astrophysics Data System (ADS)

    Pal'a, Jozef; Ušák, Elemír

    2016-03-01

    A new method of magnetic Barkhausen noise (MBN) measurement and optimization of the measured data processing, with respect to non-destructive evaluation of ferromagnetic materials, was tested. Using this method, we examined whether the sensitivity and stability of measurement results can be enhanced by replacing the traditional MBN parameter (root mean square) with a new parameter. In the tested method, a complex set of MBN signals from minor hysteresis loops is measured. Afterward, the MBN data are collected into suitably designed matrices, and MBN parameters that maximize sensitivity to the evaluated variable are sought. The method was verified on plastically deformed steel samples. It was shown that the proposed measurement method and data processing improve sensitivity to the evaluated variable compared with the traditional MBN parameter. Moreover, we found an MBN parameter that is highly resistant to changes in the applied field amplitude and, at the same time, noticeably more sensitive to the evaluated variable.
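
    The traditional parameter referred to above is the root mean square (RMS) of the MBN burst; a minimal computation on a stand-in digitized signal is shown below.

      # RMS of a magnetic Barkhausen noise burst (synthetic stand-in signal).
      import numpy as np

      rng = np.random.default_rng(5)
      mbn_burst = rng.normal(0.0, 0.1, 50_000)  # stand-in for one digitized MBN burst (V)

      rms = np.sqrt(np.mean(mbn_burst**2))
      print(f"MBN RMS: {rms:.4f} V")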

  6. A new neural observer for an anaerobic bioreactor.

    PubMed

    Belmonte-Izquierdo, R; Carlos-Hernandez, S; Sanchez, E N

    2010-02-01

    In this paper, a recurrent high order neural observer (RHONO) for anaerobic processes is proposed. The main objective is to estimate variables of methanogenesis: biomass, substrate and inorganic carbon in a completely stirred tank reactor (CSTR). The recurrent high order neural network (RHONN) structure is based on the hyperbolic tangent as activation function. The learning algorithm is based on an extended Kalman filter (EKF). The applicability of the proposed scheme is illustrated via simulation. A validation using real data from a lab scale process is included. Thus, this observer can be successfully implemented for control purposes.

  7. Performance characteristics of a novel blood bag in-line closure device and subsequent product quality assessment

    PubMed Central

    Serrano, Katherine; Levin, Elena; Culibrk, Brankica; Weiss, Sandra; Scammell, Ken; Boecker, Wolfgang F; Devine, Dana V

    2010-01-01

    BACKGROUND: In high-volume processing environments, manual breakage of in-line closures can result in repetitive strain injury (RSI). Furthermore, these closures may be incorrectly opened causing shear-induced hemolysis. To overcome the variability of in-line closure use and minimize RSI, Fresenius Kabi developed a new in-line closure, the CompoFlow, with mechanical openers. STUDY DESIGN AND METHODS: The consistency of the performance of the CompoFlow closure device was assessed, as was its effect on component quality. A total of 188 RBC units using CompoFlow blood bag systems and 43 using the standard bag systems were produced using the buffy coat manufacturing method. Twenty-six CompoFlow platelet (PLT) concentrates and 10 control concentrates were prepared from pools of four buffy coats. RBCs were assessed on Days 1, 21, and 42 for cellular variables and hemolysis. PLTs were assessed on Days 1, 3, and 7 for morphology, CD62P expression, glucose, lactate, and pH. A total of 308 closures were excised after processing and the apertures were measured using digital image analysis. RESULTS: The use of the CompoFlow device significantly improved the mean extraction time with 0.46 ± 0.11 sec/mL for the CompoFlow units and 0.52 ± 0.13 sec/mL for the control units. The CompoFlow closures showed a highly reproducible aperture after opening (coefficient of variation, 15%) and the device always remained opened. PLT and RBC products showed acceptable storage variables with no differences between CompoFlow and control. CONCLUSIONS: The CompoFlow closure devices improved the level of process control and processing time of blood component production with no negative effects on product quality. PMID:20529007

  8. Yield impact for wafer shape misregistration-based binning for overlay APC diagnostic enhancement

    NASA Astrophysics Data System (ADS)

    Jayez, David; Jock, Kevin; Zhou, Yue; Govindarajulu, Venugopal; Zhang, Zhen; Anis, Fatima; Tijiwa-Birk, Felipe; Agarwal, Shivam

    2018-03-01

    The importance of traditionally acceptable sources of variation has become more critical as semiconductor technologies continue to push into smaller technology nodes. New metrology techniques are needed to pursue the process uniformity requirements needed for controllable lithography. Process control for lithography has the advantage of being able to adjust for cross-wafer variability, but this requires that all processes are closely matched between process tools/chambers for each process. When this is not the case, the cumulative line variability creates identifiable groups of wafers. This cumulative shape-based effect is described as impacting overlay measurements and alignment by creating misregistration of the overlay marks. It is necessary to understand what requirements might go into developing a high volume manufacturing approach which leverages this grouping methodology, the key inputs and outputs, and what can be extracted from such an approach. It will be shown that this line variability can be quantified as a loss of electrical yield primarily at the edge of the wafer, and a methodology for root cause identification and improvement is proposed. This paper will cover the concept of wafer shape-based grouping as a diagnostic tool for overlay control and containment, the challenges in implementing this in a manufacturing setting, and the limitations of this approach. This will be accomplished by showing that there are identifiable wafer shape-based signatures. These shape-based wafer signatures will be shown to be correlated to overlay misregistration, primarily at the edge. It will also be shown that by adjusting for this wafer shape signal, improvements can be made to both overlay and electrical yield. These improvements show an increase in edge yield and a reduction in yield variability.

  9. Hubby and Lewontin on Protein Variation in Natural Populations: When Molecular Genetics Came to the Rescue of Population Genetics.

    PubMed

    Charlesworth, Brian; Charlesworth, Deborah; Coyne, Jerry A; Langley, Charles H

    2016-08-01

    The 1966 GENETICS papers by John Hubby and Richard Lewontin were a landmark in the study of genome-wide levels of variability. They used the technique of gel electrophoresis of enzymes and proteins to study variation in natural populations of Drosophila pseudoobscura, at a set of loci that had been chosen purely for technical convenience, without prior knowledge of their levels of variability. Together with the independent study of human populations by Harry Harris, this seminal study provided the first relatively unbiased picture of the extent of genetic variability in protein sequences within populations, revealing that many genes had surprisingly high levels of diversity. These papers stimulated a large research program that found similarly high electrophoretic variability in many different species and led to statistical tools for interpreting the data in terms of population genetics processes such as genetic drift, balancing and purifying selection, and the effects of selection on linked variants. The current use of whole-genome sequences in studies of variation is the direct descendant of this pioneering work. Copyright © 2016 by the Genetics Society of America.

  10. Mediator effect of statistical process control between Total Quality Management (TQM) and business performance in Malaysian Automotive Industry

    NASA Astrophysics Data System (ADS)

    Ahmad, M. F.; Rasi, R. Z.; Zakuan, N.; Hisyamudin, M. N. N.

    2015-12-01

    In today's highly competitive market, Total Quality Management (TQM) is a vital management tool in ensuring that a company can succeed in its business. In order to survive in the global market with intense competition amongst regions and enterprises, the adoption of tools and techniques is essential in improving business performance. Results relating TQM to business performance have been consistent. However, only a few previous studies have examined the mediator effect of statistical process control (SPC) between TQM and business performance. A mediator is a third variable that changes the association between an independent variable and an outcome variable. This study proposes a TQM performance model with the mediator effect of SPC, using structural equation modelling, which provides a more comprehensive model for developing countries, specifically for Malaysia. A questionnaire was prepared and sent to 1500 companies from the automotive industry and related vendors in Malaysia, giving a 21.8 per cent response rate. The findings show a significant mediating effect of SPC between TQM practices and business performance, indicating that SPC is an important tool and technique in TQM implementation. The results conclude that SPC partially mediates the relationship between TQM and business performance, with an indirect effect (IE) of 0.25, which can be categorised as a high mediation effect.
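
    The study used structural equation modelling; as a simplified, regression-based illustration of an indirect (mediation) effect, the sketch below uses the product-of-coefficients approach with hypothetical survey columns.

      # Product-of-coefficients mediation sketch: TQM -> SPC -> business performance.
      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("tqm_survey.csv")  # hypothetical composite scores per company

      a = smf.ols("spc ~ tqm", data=df).fit().params["tqm"]                # TQM -> SPC
      b = smf.ols("performance ~ spc + tqm", data=df).fit().params["spc"]  # SPC -> BP given TQM
      print(f"indirect effect (a*b): {a * b:.2f}")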

  11. Evolution and Control of 2219 Aluminum Microstructural Features Through Electron Beam Freeform Fabrication

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M.; Hafley, Robert A.; Domack, Marcia S.

    2006-01-01

    The layer-additive nature of the electron beam freeform fabrication (EBF3) process results in a tortuous thermal path producing complex microstructures including: small homogeneous equiaxed grains; dendritic growth contained within larger grains; and/or pervasive dendritic formation in the interpass regions of the deposits. Several process control variables contribute to the formation of these different microstructures, including translation speed, wire feed rate, beam current and accelerating voltage. In electron beam processing, higher accelerating voltages embed the energy deeper below the surface of the substrate. Two EBF3 systems have been established at NASA Langley, one with a low-voltage (10-30kV) and the other a high-voltage (30-60 kV) electron beam gun. Aluminum alloy 2219 was processed over a range of different variables to explore the design space and correlate the resultant microstructures with the processing parameters. This report is specifically exploring the impact of accelerating voltage. Of particular interest is correlating energy to the resultant material characteristics to determine the potential of achieving microstructural control through precise management of the heat flux and cooling rates during deposition.

  12. High-temperature optical fiber instrumentation for gas flow monitoring in gas turbine engines

    NASA Astrophysics Data System (ADS)

    Roberts, Adrian; May, Russell G.; Pickrell, Gary R.; Wang, Anbo

    2002-02-01

    In the design and testing of gas turbine engines, real-time data about such physical variables as temperature, pressure and acoustics are of critical importance. The high temperature environment experienced in the engines makes conventional electronic sensor devices difficult to apply. Therefore, there is a need for innovative sensors that can reliably operate under high-temperature conditions with the desired resolution and frequency response. A fiber optic high temperature sensor system for dynamic pressure measurement is presented in this paper. This sensor is based on a new sensor technology - the self-calibrated interferometric/intensity-based (SCIIB) sensor, recently developed at Virginia Tech. State-of-the-art digital signal processing (DSP) methods are applied to process the signal from the sensor to acquire high-speed frequency response.

  13. Redintegration and the benefits of long-term knowledge in verbal short-term memory: an evaluation of Schweickert's (1993) multinomial processing tree model.

    PubMed

    Thorn, Annabel S C; Gathercole, Susan E; Frankish, Clive R

    2005-03-01

    The impact of four long-term knowledge variables on serial recall accuracy was investigated. Serial recall was tested for high and low frequency words and high and low phonotactic frequency nonwords in 2 groups: monolingual English speakers and French-English bilinguals. For both groups the recall advantage for words over nonwords reflected more fully correct recalls with fewer recall attempts that consisted of fragments of the target memory items (one or two of the three target phonemes recalled correctly); completely incorrect recalls were equivalent for the 2 list types. However, word frequency (for both groups), nonword phonotactic frequency (for the monolingual group), and language familiarity all influenced the proportions of completely incorrect recalls that were made. These results are not consistent with the view that long-term knowledge influences on immediate recall accuracy can be exclusively attributed to a redintegration process of the type specified in multinomial processing tree model of immediate recall. The finding of a differential influence on completely incorrect recalls of these four long-term knowledge variables suggests instead that the beneficial effects of long-term knowledge on short-term recall accuracy are mediated by more than one mechanism.

  14. The role of personal self-regulation and regulatory teaching to predict motivational-affective variables, achievement, and satisfaction: a structural model

    PubMed Central

    De la Fuente, Jesus; Zapata, Lucía; Martínez-Vicente, Jose M.; Sander, Paul; Cardelle-Elawar, María

    2014-01-01

    The present investigation examines how personal self-regulation (presage variable) and regulatory teaching (process variable of teaching) relate to learning approaches, strategies for coping with stress, and self-regulated learning (process variables of learning) and, finally, how they relate to performance and satisfaction with the learning process (product variables). The objective was to clarify the associative and predictive relations between these variables, as contextualized in two different models that use the presage-process-product paradigm (the Biggs and DEDEPRO models). A total of 1101 university students participated in the study. The design was cross-sectional and retrospective with attributional (or selection) variables, using correlations and structural analysis. The results provide consistent and significant empirical evidence for the relationships hypothesized, incorporating variables that are part of and influence the teaching–learning process in Higher Education. Findings confirm the importance of interactive relationships within the teaching–learning process, where personal self-regulation is assumed to take place in connection with regulatory teaching. Variables that are involved in the relationships validated here reinforce the idea that both personal factors and teaching and learning factors should be taken into consideration when dealing with a formal teaching–learning context at university. PMID:25964764

  15. Use of osmotic dehydration to improve fruits and vegetables quality during processing.

    PubMed

    Maftoonazad, Neda

    2010-11-01

    Osmotic treatment is a preparation step prior to further processing of foods, involving simultaneous transient moisture loss and solids gain when foods are immersed in osmotic solutions; it results in partial drying and improves the overall quality of food products. This review discusses the different aspects of osmotic dehydration (OD) technology, namely the solutes employed, the characteristics of the solutions used, the influence of process variables, and the quality characteristics of the osmodehydrated products. As the process is carried out at mild temperatures and moisture is removed by liquid diffusion, the phase change present in other drying processes is avoided, resulting in high-quality products and potentially substantial energy savings. Modeling of the mass transfer phenomenon can help optimize the process and maintain high product quality. Several techniques such as microwave heating, vacuum, high pressure, and pulsed electric fields may be employed during or after osmotic treatment to enhance the performance of the osmotic dehydration. Moreover, new technologies used in osmotic dehydration are discussed. Patents on osmotic dehydration of fruits and vegetables are also discussed in this article.

  16. Scale and legacy controls on catchment nutrient export regimes

    NASA Astrophysics Data System (ADS)

    Howden, N. J. K.; Burt, T.; Worrall, F.

    2017-12-01

    Nutrient dynamics in river catchments are complex: water and chemical fluxes are highly variable in low-order streams, but this variability declines as fluxes move through higher-order reaches. This poses a major challenge for process understanding as much effort is focussed on long-term monitoring of the main river channel (a high-order reach), and therefore the data available to support process understanding are predominantly derived from sites where much of the transient response of nutrient export is masked by the effect of averaging over both space and time. This may be further exacerbated at all scales by the accumulation of legacy nutrient sources in soils, aquifers and pore waters, where historical activities have led to nutrient accumulation where the catchment system is transport limited. Therefore it is of particular interest to investigate how the variability of nutrient export changes both with catchment scale (from low to high-order catchment streams) and with the presence of legacy sources, such that the context of infrequent monitoring on high-order streams can be better understood. This is not only a question of characterising nutrient export regimes per se, but also developing a more thorough understanding of how the concepts of scale and legacy may modify the statistical characteristics of observed responses across scales in both space and time. In this paper, we use synthetic data series and develop a model approach to consider how space and timescales combine with impacts of legacy sources to influence observed variability in catchment export. We find that: increasing space and timescales tend to reduce the observed variance in nutrient exports, due to an increase in travel times and greater mixing, and therefore averaging, of sources; increasing the influence of legacy sources inflates the variance, with the level of inflation dictated by the residence time of the respective sources.
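
    The effect described above can be illustrated with a toy simulation: averaging many noisy low-order signals reduces the variance seen downstream, while a slowly varying legacy source inflates it again. All parameter values below are arbitrary.

      # Toy model: scale-driven variance reduction and legacy-driven variance inflation.
      import numpy as np

      rng = np.random.default_rng(6)
      weeks, n_headwaters = 520, 50

      headwater_conc = rng.lognormal(mean=1.0, sigma=0.8, size=(n_headwaters, weeks))
      downstream_conc = headwater_conc.mean(axis=0)  # mixing/averaging of tributaries

      legacy = 2.0 * np.sin(2 * np.pi * np.arange(weeks) / 520)  # slow legacy release
      downstream_with_legacy = downstream_conc + legacy

      print("mean headwater variance:  ", round(headwater_conc.var(axis=1).mean(), 2))
      print("downstream variance:      ", round(downstream_conc.var(), 2))
      print("downstream + legacy var.: ", round(downstream_with_legacy.var(), 2))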

  17. FHR Process Instruments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holcomb, David Eugene

    2015-01-01

    Fluoride salt-cooled High temperature Reactors (FHRs) are entering into early phase engineering development. Initial candidate technologies have been identified to measure all of the required process variables. The purpose of this paper is to describe the proposed measurement techniques in sufficient detail to enable assessment of the proposed instrumentation suite and to support development of the component technologies. This paper builds upon the instrumentation chapter of the recently published FHR technology development roadmap. Locating instruments outside of the intense core radiation and high-temperature fluoride salt environment significantly decreases their environmental tolerance requirements. Under operating conditions, FHR primary coolant salt is a transparent, low-vapor-pressure liquid. Consequently, FHRs can employ standoff optical measurements from above the salt pool to assess in-vessel conditions. For example, the core outlet temperature can be measured by observing the fuel's blackbody emission. Similarly, the intensity of the core's Cerenkov glow indicates the fission power level. Short-lived activation of the primary coolant provides another means for standoff measurements of process variables. The primary coolant flow and neutron flux can be measured using gamma spectroscopy along the primary coolant piping. FHR operation entails a number of process measurements. Reactor thermal power and core reactivity are the most significant variables for process control. Thermal power can be determined by measuring the primary coolant mass flow rate and temperature rise across the core. The leading candidate technologies for primary coolant temperature measurement are Au-Pt thermocouples and Johnson noise thermometry. Clamp-on ultrasonic flow measurement, that includes high-temperature tolerant standoffs, is a potential coolant flow measurement technique. Also, the salt redox condition will be monitored as an indicator of its corrosiveness. Both electrochemical techniques and optical spectroscopy are candidate fluoride salt redox measurement methods. Coolant level measurement can be performed using radar-level gauges located in standpipes above the reactor vessel. While substantial technical development remains for most of the instruments, industrially compatible instruments based upon proven technology can be reasonably extrapolated from the current state of the art.
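
    As a small illustration of the thermal power determination mentioned above (power = mass flow rate x heat capacity x core temperature rise), the sketch below uses illustrative values; the flow rate, temperatures, and coolant heat capacity are assumptions, not design data.

      # Core thermal power from primary coolant mass flow and temperature rise.
      def thermal_power_mw(mass_flow_kg_s: float, t_in_c: float, t_out_c: float,
                           cp_j_per_kg_k: float = 2386.0) -> float:
          """Thermal power in MW; the default cp is a nominal value assumed for FLiBe salt."""
          return mass_flow_kg_s * cp_j_per_kg_k * (t_out_c - t_in_c) / 1e6

      print(thermal_power_mw(mass_flow_kg_s=1200.0, t_in_c=600.0, t_out_c=700.0))  # ~286 MW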

  18. Processing-Structure-Property Relationships for Lignin-Based Carbonaceous Materials Used in Energy-Storage Applications

    DOE PAGES

    García-Negrón, Valerie; Phillip, Nathan D.; Li, Jianlin; ...

    2016-11-18

    Lignin, an abundant organic polymer and a byproduct of pulp and biofuel production, has potential applications owing to its high carbon content and aromatic structure. Processing-structure relationships are difficult to predict because of the heterogeneity of lignin. Here, this work discusses the roles of unit operations in the carbonization process of softwood lignin, and their resulting impacts on the material structure and electrochemical properties in application as the anode in lithium-ion cells. The processing variables include the lignin source, temperature, and duration of thermal stabilization, pyrolysis, and reduction. Materials are characterized at the atomic and microscales. High-temperature carbonization, at 2000 °C, produces larger graphitic domains than at 1050 °C, but results in a reduced capacity. Coulombic efficiencies over 98 % are achieved for extended galvanostatic cycling. Consequently, a properly designed carbonization process for lignin is well suited for the generation of low-cost, high-efficiency electrodes.

  19. Multivariate Analysis of Ladle Vibration

    NASA Astrophysics Data System (ADS)

    Yenus, Jaefer; Brooks, Geoffrey; Dunn, Michelle

    2016-08-01

    The homogeneity of composition and uniformity of temperature of the steel melt before it is transferred to the tundish are crucial in making high-quality steel products. The homogenization process is performed by stirring the melt using inert gas in ladles. Continuous monitoring of this process is important to make sure the action of stirring is constant throughout the ladle. Currently, the stirring process is monitored by process operators who largely rely on visual and acoustic phenomena from the ladle. However, due to the lack of measurable signals, the accuracy and suitability of this manual monitoring are problematic. The actual flow of argon gas to the ladle may not be the same as the flow gage reading due to leakage along the gas line components. As a result, the actual degree of stirring may not be correctly known. Various researchers have used one-dimensional vibration, sound, and image signals measured from the ladle to predict the degree of stirring inside, and have developed online sensors intended to monitor the stirring phenomena. In this investigation, triaxial vibration signals were measured from a cold water model of an industrial ladle. Three flow rate ranges and varying bath heights were used to collect vibration signals. The Fast Fourier Transform was applied to the dataset before it was analyzed using principal component analysis (PCA) and partial least squares (PLS). PCA was used to unveil the structure in the experimental data. PLS was mainly applied to predict the stirring from the vibration response. It was found that for each flow rate range considered in this study, the informative signals reside in different frequency ranges. The first latent variables in these frequency ranges explain more than 95 pct of the variation in the stirring process for the entire single layer and double layer data collected from the cold model. PLS analysis in these identified frequency ranges demonstrated that the latent variables of the response and predictor variables are highly correlated. The predicted variable showed a linear relationship with the stirring energy and bath recirculation speed. This outcome can improve the predictability of the mixing status in ladle metallurgy and make online control of the process easier. Industrial testing of this approach will follow.
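
    A minimal sketch of the spectra-to-stirring pipeline: FFT magnitude spectra of vibration records regressed against gas flow rate with PLS. The synthetic signals, sampling rate, and flow-dependent 800 Hz component are stand-ins for the cold-model measurements, not the authors' data.

      # FFT spectra of vibration records regressed on stirring gas flow rate with PLS.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(7)
      n_records, n_samples, fs = 60, 4096, 10_000.0
      flow_rate = rng.uniform(5, 50, n_records)  # stand-in stirring gas flow (L/min)

      t = np.arange(n_samples) / fs
      signals = np.array([
          q * 0.01 * np.sin(2 * np.pi * 800 * t) + rng.normal(0, 0.1, n_samples)
          for q in flow_rate
      ])  # amplitude of the 800 Hz component grows with flow rate

      spectra = np.abs(np.fft.rfft(signals, axis=1))  # magnitude spectra as predictors

      pls = PLSRegression(n_components=3).fit(spectra, flow_rate)
      print("R^2 on training spectra:", round(pls.score(spectra, flow_rate), 3))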

  20. Coupling of snow and permafrost processes using the Basic Modeling Interface (BMI)

    NASA Astrophysics Data System (ADS)

    Wang, K.; Overeem, I.; Jafarov, E. E.; Piper, M.; Stewart, S.; Clow, G. D.; Schaefer, K. M.

    2017-12-01

    We developed a permafrost modeling tool by implementing the Kudryavtsev empirical active layer depth model (the so-called "Ku" component). The model is specifically set up with a Basic Model Interface (BMI), which enhances its potential coupling to other earth surface process model components. The model is accessible through the Web Modeling Tool of the Community Surface Dynamics Modeling System (CSDMS). The Kudryavtsev model has been applied to all of Alaska to model permafrost distribution at high spatial resolution, and model predictions have been verified against Circumpolar Active Layer Monitoring (CALM) in-situ observations. The Ku component uses monthly meteorological forcing, including air temperature, snow depth, and snow density, and predicts active layer thickness (ALT) and the temperature at the top of permafrost (TTOP), which are important factors in snow-hydrological processes. BMI provides a straightforward way to couple models with one another. Here, we present a case of coupling the Ku component to snow process components, namely the Snow-Degree-Day (SDD) and Snow-Energy-Balance (SEB) methods, which are existing components of the hydrological model TOPOFLOW. The workflow is: (1) get variables from the meteorology component, set their values in the snow process component, and advance the snow process component; (2) get variables from the meteorology and snow components, provide these to the Ku component, and advance it; (3) get variables from the snow process component, set their values in the meteorology component, and advance the meteorology component; a sketch of this coupling loop is given below. The next phase is to couple the permafrost component with the fully BMI-compliant TOPOFLOW hydrological model, which could provide a useful tool to investigate permafrost effects on hydrology.
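
    The coupling workflow above translates almost directly into a BMI-style update loop. The sketch below is a simplified illustration assuming three BMI-like components and generic variable and config names; exact BMI method signatures differ between versions, so it should not be read as the CSDMS implementation.

    ```python
    # Simplified sketch of the three-step coupling workflow described above.
    # Method names follow the BMI convention (initialize / get_value / set_value /
    # update / finalize); signatures and variable names are illustrative only.

    def run_coupled(met, snow, ku, n_months):
        for comp, cfg in ((met, "met.cfg"), (snow, "snow.cfg"), (ku, "ku.cfg")):
            comp.initialize(cfg)                       # hypothetical config file names
        for _ in range(n_months):
            # (1) meteorology -> snow component, then advance the snow component
            snow.set_value("air_temperature", met.get_value("air_temperature"))
            snow.update()
            # (2) meteorology + snow -> Ku permafrost component, then advance it
            ku.set_value("air_temperature", met.get_value("air_temperature"))
            ku.set_value("snow_depth", snow.get_value("snow_depth"))
            ku.set_value("snow_density", snow.get_value("snow_density"))
            ku.update()                                # yields ALT and TTOP
            # (3) snow feedback -> meteorology, then advance the meteorology component
            met.set_value("snow_cover_fraction", snow.get_value("snow_cover_fraction"))
            met.update()
        for comp in (met, snow, ku):
            comp.finalize()
    ```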

  1. Time course of word production in fast and slow speakers: a high density ERP topographic study.

    PubMed

    Laganaro, Marina; Valente, Andrea; Perret, Cyril

    2012-02-15

    The transformation of an abstract concept into an articulated word is achieved through a series of encoding processes, whose time course has been repeatedly investigated in the psycholinguistic and neuroimaging literature on single word production. The estimates of the time course from previous investigations represent process durations at mean processing speed: as production speed varies significantly across speakers, a crucial question is how the timing of encoding processes varies with speed. Here we investigated whether between-subject variability in the speed of speech production is distributed across all encoding processes or is accounted for by a specific processing stage. We analysed event-related potential (ERP) correlates during overt picture naming in 45 subjects divided into three speed subgroups according to their production latencies. Production speed modulated waveform amplitudes in the time window ranging from about 200 to 350 ms after picture presentation and the duration of a stable electrophysiological spatial configuration in the same time period. The remaining time windows from picture onset to 200 ms before articulation were unaffected by speed. By contrast, the manipulation of a psycholinguistic variable, word age of acquisition, modulated ERPs in all speed subgroups in a different and later time period, starting at around 400 ms after picture presentation, associated with phonological encoding processes. These results indicate that the between-subject variability in the speed of single word production is principally accounted for by the timing of a stable electrophysiological activity in the 200-350 ms time period, presumably associated with lexical selection. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Hybrid photonic signal processing

    NASA Astrophysics Data System (ADS)

    Ghauri, Farzan Naseer

    This thesis proposes research on novel hybrid photonic signal processing systems in the areas of optical communications, test and measurement, RF signal processing, and extreme-environment optical sensors. It is shown that the use of innovative hybrid techniques allows the design of photonic signal processing systems with superior performance parameters and enhanced capabilities. These applications can be divided into the domains of analog-digital hybrid signal processing applications and free-space and fiber-coupled hybrid optical sensors. The analog-digital hybrid signal processing applications include a high-performance analog-digital hybrid MEMS variable optical attenuator that can simultaneously provide high dynamic range and high-resolution attenuation control; an analog-digital hybrid MEMS beam profiler that allows high-power, watt-level laser beam profiling and provides both submicron-level resolution and wide-area profiling coverage; and all-optical transversal RF filters that operate on the principle of broadband optical spectral control using MEMS and/or acousto-optic tunable filter (AOTF) devices, which can provide continuous, digital, or hybrid signal time delay and weight selection. The hybrid optical sensors presented in the thesis are extreme-environment pressure sensors and dual temperature-pressure sensors. The sensors employ hybrid free-space and fiber-coupled techniques for remotely monitoring a system under simultaneously extreme temperatures and pressures.

  3. Improved process robustness by using closed loop control in deep drawing applications

    NASA Astrophysics Data System (ADS)

    Barthau, M.; Liewald, M.; Held, Christian

    2017-09-01

    The production of irregularly shaped deep-drawing parts with high quality requirements, which are common in today's automotive production, permanently challenges production processes. Lightweight-construction requirements for passenger car bodies, driven by European regulations up to 2020, have been substantially increasing the use of high-strength steels for years and are leading to greater challenges in sheet metal part production. The increasingly complex shapes of today's car body shells, following modern and future design criteria, further intensify the issue. Metal forming technology tries to meet these challenges with highly sophisticated deep-drawing die layouts that consider part quality requirements, process robustness, and controlled material flow during the deep or stretch drawing process phase. A new method for controlling material flow using a closed-loop system was developed at the IFU Stuttgart. In contrast to previous approaches, this new method allows a control intervention during the deep-drawing stroke. The blank holder force around the outline of the drawn part is used as the control variable. The closed loop is designed as a trajectory follow-up with feed-forward control. The command variable is the part-wall stress, which is measured with a piezoelectric measuring pin. In this paper the control loop is described in detail. The experimental tool built for testing the new control approach and its features are explained. A method for deriving the follow-up trajectories from simulation is also presented. Furthermore, experimental results on the robustness of the deep-drawing process and the gain in process performance with the developed control loop are shown. Finally, a new procedure for the industrial application of the new control method is presented, using a new kind of active element to influence the local blank holder pressure on the part flange.

  4. Crystallization using reverse micelles and water-in-oil microemulsion systems: the highly selective tool for the purification of organic compounds from complex mixtures.

    PubMed

    Kljajic, Alen; Bester-Rogac, Marija; Klobcar, Andrej; Zupet, Rok; Pejovnik, Stane

    2013-02-01

    The active pharmaceutical ingredient orlistat is usually manufactured using a semi-synthetic procedure, producing a crude product and complex mixtures of highly related impurities with minimal side-chain structure variability. It is therefore crucial for the overall success of industrial/pharmaceutical application to develop an effective purification process. In this communication, we present a newly developed crystallization process based on water-in-oil reverse micelles and microemulsion systems. Physicochemical properties of the crystallization media were varied through surfactant and water composition, and the impact of these two parameters on separation efficiency was measured. Using precisely defined properties of the dispersed water phase in the crystallization media, a highly efficient separation process in terms of selectivity and yield was developed. Small-angle X-ray scattering, high-performance liquid chromatography, mass spectrometry, and scanning electron microscopy were used to monitor and analyze the separation processes and the orlistat products obtained. Typical process characteristics, especially selectivity and yield with regard to reference examples, were compared and discussed. Copyright © 2012 Wiley Periodicals, Inc.

  5. Understanding the Process of Acculturation for Primary Prevention.

    ERIC Educational Resources Information Center

    Berry, J. W.

    This paper reviews the concepts of acculturation and adaptation to provide a framework for understanding the highly variable relationship between acculturation and mental health in refugee populations. It begins with an extended definition and discussion of the concepts of acculturation and adaptation. The characteristics of acculturating groups…

  6. The value of crossdating to retain high-frequency variability, climate signals, and extreme events in environmental proxies.

    PubMed

    Black, Bryan A; Griffin, Daniel; van der Sleen, Peter; Wanamaker, Alan D; Speer, James H; Frank, David C; Stahle, David W; Pederson, Neil; Copenheaver, Carolyn A; Trouet, Valerie; Griffin, Shelly; Gillanders, Bronwyn M

    2016-07-01

    High-resolution biogenic and geologic proxies in which one increment or layer is formed per year are crucial to describing natural ranges of environmental variability in Earth's physical and biological systems. However, dating controls are necessary to ensure temporal precision and accuracy; simple counts cannot ensure that all layers are placed correctly in time. Originally developed for tree-ring data, crossdating is the only such procedure that ensures all increments have been assigned the correct calendar year of formation. Here, we use growth-increment data from two tree species, two marine bivalve species, and a marine fish species to illustrate sensitivity of environmental signals to modest dating error rates. When falsely added or missed increments are induced at one and five percent rates, errors propagate back through time and eliminate high-frequency variability, climate signals, and evidence of extreme events while incorrectly dating and distorting major disturbances or other low-frequency processes. Our consecutive Monte Carlo experiments show that inaccuracies begin to accumulate in as little as two decades and can remove all but decadal-scale processes after as little as two centuries. Real-world scenarios may have even greater consequence in the absence of crossdating. Given this sensitivity to signal loss, the fundamental tenets of crossdating must be applied to fully resolve environmental signals, a point we underscore as the frontiers of growth-increment analysis continue to expand into tropical, freshwater, and marine environments. © 2016 John Wiley & Sons Ltd.
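
    The consecutive Monte Carlo experiments can be pictured with a toy simulation: randomly deleting (missed increments) or duplicating (false increments) layers at a fixed rate and tracking how quickly the misdated series decorrelates from the true annual signal. The sketch below uses synthetic data and a simplified error model; it is illustrative only, not the study's code.

    ```python
    # Minimal sketch (synthetic data): induce missing/false increments at a given
    # error rate and measure how quickly the misdated series loses correlation
    # with the true annual signal.
    import numpy as np

    rng = np.random.default_rng(42)

    def misdate(series, error_rate):
        """Randomly delete (missed increment) or duplicate (false increment) layers."""
        out = []
        for value in series:
            u = rng.random()
            if u < error_rate / 2:          # missed increment: skip this year
                continue
            out.append(value)
            if u > 1 - error_rate / 2:      # false increment: insert an extra one
                out.append(value + rng.normal(scale=0.1))
        return np.array(out)

    true_signal = rng.normal(size=300)       # 300 "years" of a proxy record
    for rate in (0.01, 0.05):
        shifted = misdate(true_signal, rate)
        n = min(len(true_signal), len(shifted))
        r = np.corrcoef(true_signal[:n], shifted[:n])[0, 1]
        print(f"error rate {rate:.0%}: correlation with true series = {r:.2f}")
    ```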

  7. Norwegian fjord sediments reveal NAO related winter temperature and precipitation changes of the past 2800 years

    NASA Astrophysics Data System (ADS)

    Faust, Johan; Fabian, Karl; Giraudeau, Jacques; Knies, Jochen

    2016-04-01

    The North Atlantic Oscillation (NAO) is the leading mode of atmospheric circulation variability in the North Atlantic region. Associated shifts of storm tracks, precipitation, and temperature patterns affect energy supply and demand, fisheries and agriculture, as well as marine and terrestrial ecological dynamics. Long-term NAO reconstructions are crucial to better understand NAO variability and its response to climate forcing factors, and to assess predictability and possible shifts associated with ongoing climate change. Fjord deposits have great potential for providing high-resolution sedimentary records that reflect local terrestrial and marine processes and, therefore, offer unique opportunities for the investigation of climatically induced sedimentological and geochemical processes. A recent study of instrumental time series revealed the NAO as the main factor behind a strong relation between winter temperature, precipitation, and river discharge in central Norway over the past 50 years. Here we use this knowledge to establish the first high-resolution NAO proxy record from marine sediments. By comparing geochemical measurements from a short sediment core with instrumental data, we show that marine primary productivity proxies are sensitive to NAO changes. Conditioned on a stationary relation between our climate proxy and the NAO, we establish the first high-resolution NAO proxy record (NAO-TFJ) from marine sediments covering the past 2,800 years. The NAO-TFJ shows distinct co-variability with climate changes over Greenland, solar activity, and Northern Hemisphere glacier dynamics, as well as with climatically associated paleo-demographic trends.

  8. Temporal and vertical variability in optical properties of New England shelf waters during late summer and spring

    NASA Astrophysics Data System (ADS)

    Sosik, Heidi M.; Green, Rebecca E.; Pegau, W. Scott; Roesler, Collin S.

    2001-05-01

    Relationships between optical and physical properties were examined on the basis of intensive sampling at a site on the New England continental shelf during late summer 1996 and spring 1997. During both seasons, particles were found to be the primary source of temporal and vertical variability in optical properties since light absorption by dissolved material, though significant in magnitude, was relatively constant. Within the particle pool, changes in phytoplankton were responsible for much of the observed optical variability. Physical processes associated with characteristic seasonal patterns in stratification and mixing contributed to optical variability mostly through effects on phytoplankton. An exception to this generalization occurred during summer as the passage of a hurricane led to a breakdown in stratification and substantial resuspension of nonphytoplankton particulate material. Prior to the hurricane, conditions in summer were highly stratified with subsurface maxima in absorption and scattering coefficients. In spring, stratification was much weaker but increased over the sampling period, and a modest phytoplankton bloom caused surface layer maxima in absorption and scattering coefficients. These seasonal differences in the vertical distribution of inherent optical properties were evident in surface reflectance spectra, which were elevated and shifted toward blue wavelengths in the summer. Some seasonal differences in optical properties, including reflectance spectra, suggest that a significant shift toward a smaller particle size distribution occurred in summer. Shorter timescale optical variability was consistent with a variety of influences including episodic events such as the hurricane, physical processes associated with shelfbreak frontal dynamics, biological processes such as phytoplankton growth, and horizontal patchiness combined with water mass advection.

  9. Using high-frequency sensors to identify hydroclimatological controls on storm-event variability in catchment nutrient fluxes and source zone activation

    NASA Astrophysics Data System (ADS)

    Blaen, Phillip; Khamis, Kieran; Lloyd, Charlotte; Krause, Stefan

    2017-04-01

    At the river catchment scale, storm events can drive highly variable behaviour in nutrient and water fluxes, yet short-term dynamics are frequently missed by low-resolution sampling regimes. In addition, nutrient source contributions can vary significantly within and between storm events. Our inability to identify and characterise time-dynamic source zone contributions severely hampers the adequate design of land use management practices in order to control nutrient exports from agricultural landscapes. Here, we utilise an 8-month high-frequency (hourly) time series of streamflow, nitrate concentration (NO3) and fluorescent dissolved organic matter concentration (FDOM) derived from optical in-situ sensors located in a headwater agricultural catchment. We characterised variability in flow and nutrient dynamics across 29 storm events. Storm events represented 31% of the time series and contributed disproportionately to nutrient loads (43% of NO3 and 36% of FDOM) relative to their duration. Principal components analysis of potential hydroclimatological controls on nutrient fluxes demonstrated that a small number of components, representing >90% of variance in the dataset, were highly significant model predictors of inter-event variability in catchment nutrient export. Hysteresis analysis of nutrient concentration-discharge relationships suggested spatially discrete source zones existed for NO3 and FDOM, and that activation of these zones varied on an event-specific basis. Our results highlight the benefits of high-frequency in-situ monitoring for characterising complex short-term nutrient dynamics and unravelling connections between hydroclimatological variability and river nutrient export and source zone activation under extreme flow conditions. These new process-based insights are fundamental to underpinning the development of targeted management measures to reduce nutrient loading of surface waters.
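
    As an illustration of the kind of hysteresis analysis mentioned above, a common approach is to compare rising- and falling-limb concentrations at matched, normalized discharge levels and summarize the difference as a single index per storm event. The sketch below implements one generic formulation of such an index on synthetic data; it is not necessarily the exact metric used in this study.

    ```python
    # Hedged sketch: a generic concentration-discharge hysteresis index for one
    # storm event. Positive values indicate clockwise hysteresis (concentration
    # peaks on the rising limb), negative values anticlockwise.
    import numpy as np

    def hysteresis_index(q, c, n_levels=20):
        q = np.asarray(q, float)
        c = np.asarray(c, float)
        peak = int(np.argmax(q))
        qn = (q - q.min()) / (q.max() - q.min())          # normalize discharge to 0-1
        cn = (c - c.min()) / (c.max() - c.min())          # normalize concentration to 0-1
        levels = np.linspace(0.1, 0.9, n_levels)
        rise = np.interp(levels, qn[:peak + 1], cn[:peak + 1])
        fall = np.interp(levels, qn[peak:][::-1], cn[peak:][::-1])  # reverse so q increases
        return float(np.mean(rise - fall))

    # Example with a synthetic clockwise event (concentration peaks before discharge):
    t = np.linspace(0, 1, 200)
    q = np.exp(-((t - 0.3) / 0.15) ** 2)
    c = np.exp(-((t - 0.25) / 0.15) ** 2)
    print(hysteresis_index(q, c))   # > 0 -> clockwise
    ```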

  10. USAF/SCEEE (United States Air Force/Southeastern Center for Electrical Engineering Education) Research Initiation Program Research Reports. Volume 1.

    DTIC Science & Technology

    1985-03-01

    comparison of samples would be difficult. (5) A restrictive random sample allows the sample to be irregularly spaced throughout the auxiliary variable space ... looking or downward-looking probes and the very low background radiation from space contribute to a high signal-to-noise ratio and allow the ... sunshine and earthshine, chemiluminescent processes, and radiation to space, in addition to collisional processes, determine the vibrational ...

  11. Application of quality by design concept to develop a dual gradient elution stability-indicating method for cloxacillin forced degradation studies using combined mixture-process variable models.

    PubMed

    Zhang, Xia; Hu, Changqin

    2017-09-08

    Penicillins are typical of complex ionic samples that likely contain a large number of degradation-related impurities (DRIs) with different polarities and charge properties. It is often a challenge to develop selective and robust high performance liquid chromatography (HPLC) methods for the efficient separation of all DRIs. In this study, an analytical quality by design (AQbD) approach was proposed for stability-indicating method development for cloxacillin. Rules relating the structures, retention, and UV characteristics of penicillins and their impurities were summarized and served as useful prior knowledge. Through quality risk assessment and a screening design, 3 critical process parameters (CPPs) were defined, including 2 mixture variables (MVs) and 1 process variable (PV). A combined mixture-process variable (MPV) design was conducted to evaluate the 3 CPPs simultaneously, and response surface methodology (RSM) was used to find the optimal experimental parameters. A dual gradient elution was performed to change buffer pH, mobile-phase type, and strength simultaneously. The design spaces (DSs) were evaluated using Monte Carlo simulation to estimate their probability of meeting the specifications of the critical quality attributes (CQAs). A Plackett-Burman design was performed to test the robustness around the working points and to determine the normal operating ranges (NORs). Finally, validation was performed following International Conference on Harmonisation (ICH) guidelines. To our knowledge, this is the first study to use an MPV design and dual gradient elution to develop HPLC methods and improve separations for complex ionic samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. [The Brazilian Hospital Information System and the acute myocardial infarction hospital care].

    PubMed

    Escosteguy, Claudia Caminha; Portela, Margareth Crisóstomo; Medronho, Roberto de Andrade; de Vasconcellos, Maurício Teixeira Leite

    2002-08-01

    To analyze the applicability of the Brazilian Unified Health System's national hospital database for evaluating the quality of acute myocardial infarction hospital care. A total of 1,936 hospital admission forms with acute myocardial infarction (AMI) as the primary diagnosis in the municipality of Rio de Janeiro, Brazil, in 1997 were evaluated. Data were collected from the national hospital database. A stratified random sample of 391 medical records was also evaluated. AMI diagnosis agreement followed literature criteria. Variable accuracy analysis was performed using the kappa agreement index. The quality of the AMI diagnosis registered in hospital admission forms was satisfactory according to the gold standard of the literature. In general, the accuracy of the demographic (sex, age group), process (medical procedures and interventions), and outcome (hospital death) variables was satisfactory. The accuracy of the demographic and outcome variables was higher than that of the process variables. Underregistration of secondary diagnoses in the forms was high and was the main limiting factor. Given the study findings and the widespread availability of the national hospital database, its use as an instrument for evaluating the quality of AMI medical care is pertinent.

  13. Optimization of process variables for decolorization of Disperse Yellow 211 by Bacillus subtilis using Box-Behnken design.

    PubMed

    Sharma, Praveen; Singh, Lakhvinder; Dilbaghi, Neeraj

    2009-05-30

    Decolorization of the textile azo dye Disperse Yellow 211 (DY 211) was carried out from simulated aqueous solution by the bacterial strain Bacillus subtilis. Response surface methodology (RSM) with a Box-Behnken design matrix in the three most important operating variables (temperature, pH, and initial dye concentration) was successfully employed to study and optimize the decolorization process. A total of 17 experiments were conducted to construct a quadratic model. According to the analysis of variance (ANOVA) results, the proposed model can be used to navigate the design space. Under optimized conditions the bacterial strain was able to decolorize DY 211 up to 80%. The model indicated that an initial dye concentration of 100 mg l(-1), pH 7, and a temperature of 32.5 °C were optimal for maximum decolorization. A very high regression coefficient between the variables and the response (R² = 0.9930) indicated excellent fitting of the experimental data by the polynomial regression model. The combination of the three variables predicted by RSM was confirmed through verification experiments; hence, the bacterial strain holds great potential for the treatment of colored textile effluents.
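
    For readers unfamiliar with the design, a three-factor Box-Behnken matrix consists of 12 edge midpoints plus center points (here 5, giving the 17 runs mentioned above), to which a full quadratic response surface is fitted. The following Python sketch generates the coded design and fits the quadratic model by least squares on placeholder data; it is illustrative only and does not reproduce the study's measurements.

    ```python
    # Minimal sketch (placeholder data): build a three-factor Box-Behnken design
    # (12 edge points + 5 center points = 17 runs, coded -1/0/+1) and fit a full
    # quadratic response-surface model by least squares.
    import itertools
    import numpy as np

    def box_behnken_3(n_center=5):
        runs = []
        for i, j in itertools.combinations(range(3), 2):   # vary two factors, hold third at 0
            for a, b in itertools.product((-1, 1), repeat=2):
                row = [0, 0, 0]
                row[i], row[j] = a, b
                runs.append(row)
        runs += [[0, 0, 0]] * n_center
        return np.array(runs, float)

    def quadratic_terms(X):
        x1, x2, x3 = X.T
        return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                x1 * x2, x1 * x3, x2 * x3, x1**2, x2**2, x3**2])

    X = box_behnken_3()                       # coded temperature, pH, dye concentration
    y = np.random.default_rng(1).normal(60, 5, size=len(X))   # placeholder % decolorization
    beta, *_ = np.linalg.lstsq(quadratic_terms(X), y, rcond=None)
    pred = quadratic_terms(X) @ beta
    r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"R^2 of quadratic model on placeholder data: {r2:.3f}")
    ```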

  14. A Catchment-Based Approach to Modeling Land Surface Processes in a GCM. Part 1; Model Structure

    NASA Technical Reports Server (NTRS)

    Koster, Randal D.; Suarez, Max J.; Ducharne, Agnes; Stieglitz, Marc; Kumar, Praveen

    2000-01-01

    A new strategy for modeling the land surface component of the climate system is described. The strategy is motivated by an arguable deficiency in most state-of-the-art land surface models (LSMs), namely the disproportionately higher emphasis given to the formulation of one-dimensional, vertical physics relative to the treatment of horizontal heterogeneity in surface properties -- particularly subgrid soil moisture variability and its effects on runoff generation. The new strategy calls for the partitioning of the continental surface into a mosaic of hydrologic catchments, delineated through analysis of high-resolution surface elevation data. The effective "grid" used for the land surface is therefore not specified by the overlying atmospheric grid. Within each catchment, the variability of soil moisture is related to characteristics of the topography and to three bulk soil moisture variables through a well-established model of catchment processes. This modeled variability allows the partitioning of the catchment into several areas representing distinct hydrological regimes, wherein distinct (regime-specific) evaporation and runoff parameterizations are applied. Care is taken to ensure that the deficiencies of the catchment model in regions of little to moderate topography are minimized.

  15. Global patterns in the poleward expansion of mangrove forests

    NASA Astrophysics Data System (ADS)

    Cavanaugh, K. C.; Feller, I. C.

    2016-12-01

    Understanding the processes that limit the geographic ranges of species is one of the central goals of ecology and biogeography. This issue is particularly relevant for coastal wetlands given that climate change is expected to lead to a `tropicalization' of temperate coastal and marine ecosystems. In coastal wetlands around the world, there have already been observations of mangroves expanding into salt marshes near the current poleward range limits of mangroves. However, there is still uncertainty regarding regional variability in the factors that control mangrove range limits. Here we used time series of Landsat satellite imagery to characterize patterns of mangrove abundance near their poleward range limits around the world. We tested the commonly held assumption that temporal variation in abundance should increase towards the edge of the range. We also compared variability in mangrove abundance to climate factors thought to set mangrove range limits (air temperature, water temperature, and aridity). In general, variability in mangrove abundance at range edges was high relative to range centers, and this variability was correlated with one or more climate factors. However, the strength of these relationships varied among poleward range limits, suggesting that some mangrove range limits are controlled by processes other than climate, such as dispersal limitation.

  16. VARIABILITY IN ASSOCIATIONS OF PHOSPHATIDYLCHOLINE MOLECULAR SPECIES WITH METABOLIC SYNDROME IN MEXICAN-AMERICAN FAMILIES

    PubMed Central

    Kulkarni, Hemant; Meikle, Peter J.; Mamtani, Manju; Weir, Jacquelyn M.; Barlow, Christopher K.; Jowett, Jeremy B.; Bellis, Claire; Dyer, Thomas D.; Johnson, Matthew P.; Rainwater, David L.; Almasy, Laura; Mahaney, Michael C.; Comuzzie, Anthony G.; Blangero, John; Curran, Joanne E.

    2013-01-01

    Plasma lipidomic studies using high performance liquid chromatography and mass spectrometry offer detailed insights into metabolic processes. Taking the example of the most abundant plasma lipid class (phosphatidylcholines), we used the rich phenotypic and lipidomic data from the ongoing San Antonio Family Heart Study of large extended Mexican American families to assess the variability of association of the plasma phosphatidylcholine species with metabolic syndrome. Using robust statistical analytical methods, our study made two important observations. First, there was a wide variability in the association of phosphatidylcholine species with risk measures of metabolic syndrome. Phosphatidylcholine 40:7 was associated with a low risk while phosphatidylcholines 32:1 and 38:3 were associated with a high risk of metabolic syndrome. Second, all the odd chain phosphatidylcholines were associated with a reduced risk of metabolic syndrome, implying that phosphatidylcholines derived from dairy products might be beneficial against metabolic syndrome. Our results demonstrate the value of lipid species-specific information provided by the upcoming array of lipidomic studies and open potential avenues for prevention and control of metabolic syndrome in high prevalence settings. PMID:23494580

  17. Investigation of Acoustic Vector Sensor Data Processing in the Presence of Highly Variable Bathymetry

    DTIC Science & Technology

    2014-06-01

    shelf region to the north of the canyon. The impact of this 3-dimensional (3D) variable bathymetry, which may be combined with the effects of ... weaker arrivals at large negative angles, consistent with the earliest bottom reflections on the left. The impact of the bottom-path reflections from ...

  18. Explanatory Variables Associated with Campylobacter and Escherichia coli Concentrations on Broiler Chicken Carcasses during Processing in Two Slaughterhouses.

    PubMed

    Pacholewicz, Ewa; Swart, Arno; Wagenaar, Jaap A; Lipman, Len J A; Havelaar, Arie H

    2016-12-01

    This study aimed to identify explanatory variables associated with Campylobacter and Escherichia coli concentrations throughout processing in two commercial broiler slaughterhouses. Quantitative data on Campylobacter and E. coli along the processing line were collected. Moreover, information on batch characteristics, slaughterhouse practices, process performance, and environmental variables was collected through questionnaires, observations, and measurements, resulting in data on 19 potential explanatory variables. Analysis was conducted separately in each slaughterhouse to identify which variables were related to changes in concentrations of Campylobacter and E. coli during the processing steps: scalding, defeathering, evisceration, and chilling. Associations with explanatory variables were different in the slaughterhouses studied. In the first slaughterhouse, there was only one significant association: poorer uniformity of carcass weight within a batch was associated with a smaller decrease in E. coli concentrations after defeathering. In the second slaughterhouse, significant statistical associations were found with variables including age, uniformity, average weight of carcasses, Campylobacter concentrations in excreta and ceca, and E. coli concentrations in excreta. Bacterial concentrations in excreta and ceca were found to be the most prominent variables, because they were associated with concentrations on carcasses at various processing points. Although the slaughterhouses produced specific products and had different batch characteristics and processing parameters, the effect of the significant variables was not always the same for each slaughterhouse. Therefore, each slaughterhouse needs to determine its particular relevant measures for hygiene control and process management. This identification could be supported by monitoring changes in bacterial concentrations during processing in individual slaughterhouses. In addition, the possibility that management and food handling practices in slaughterhouses contribute to the differences in bacterial contamination between slaughterhouses needs further investigation.

  19. A high-density relativistic reflection origin for the soft and hard X-ray excess emission from Mrk 1044

    NASA Astrophysics Data System (ADS)

    Mallick, L.; Alston, W. N.; Parker, M. L.; Fabian, A. C.; Pinto, C.; Dewangan, G. C.; Markowitz, A.; Gandhi, P.; Kembhavi, A. K.; Misra, R.

    2018-06-01

    We present the first results from a detailed spectral-timing analysis of a long (~130 ks) XMM-Newton observation and quasi-simultaneous NuSTAR and Swift observations of the highly-accreting narrow-line Seyfert 1 galaxy Mrk 1044. The broadband (0.3-50 keV) spectrum reveals the presence of a strong soft X-ray excess emission below ~1.5 keV, iron Kα emission complex at ~6-7 keV and a `Compton hump' at ~15-30 keV. We find that the relativistic reflection from a high-density accretion disc with a broken power-law emissivity profile can simultaneously explain the soft X-ray excess, highly ionized broad iron line and the Compton hump. At low frequencies ([2-6] × 10⁻⁵ Hz), the power-law continuum dominated 1.5-5 keV band lags behind the reflection dominated 0.3-1 keV band, which is explained with a combination of propagation fluctuation and Comptonization processes, while at higher frequencies ([1-2] × 10⁻⁴ Hz), we detect a soft lag which is interpreted as a signature of X-ray reverberation from the accretion disc. The fractional root-mean-squared (rms) variability of the source decreases with energy and is well described by two variable components: a less variable relativistic disc reflection and a more variable direct coronal emission. Our combined spectral-timing analyses suggest that the observed broadband X-ray variability of Mrk 1044 is mainly driven by variations in the location or geometry of the optically thin, hot corona.

  20. High performance reconciliation for continuous-variable quantum key distribution with LDPC code

    NASA Astrophysics Data System (ADS)

    Lin, Dakai; Huang, Duan; Huang, Peng; Peng, Jinye; Zeng, Guihua

    2015-03-01

    Reconciliation is a significant procedure in a continuous-variable quantum key distribution (CV-QKD) system. It is employed to extract a secure secret key from the raw string shared through the quantum channel between two users. However, the efficiency and speed of previous reconciliation algorithms are low. These problems limit the secure communication distance and the secure key rate of CV-QKD systems. In this paper, we propose a high-speed reconciliation algorithm employing a well-structured decoding scheme based on low-density parity-check (LDPC) codes. The complexity of the proposed algorithm is substantially reduced. By using a graphics processing unit (GPU) device, our method may reach a reconciliation speed of 25 Mb/s for a CV-QKD system, which is currently the highest level and paves the way to high-speed CV-QKD.
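
    For context, the role of reconciliation efficiency can be seen in the standard asymptotic key-rate expression used in the CV-QKD literature (not necessarily the exact form adopted in this paper):

    ```latex
    % beta: reconciliation efficiency, I(A:B): mutual information between the users,
    % chi(B:E): Holevo bound on the eavesdropper's information.
    K = \beta \, I(A\!:\!B) - \chi(B\!:\!E)
    ```

    Because the achievable secret key rate grows with the reconciliation efficiency β, and the key must be distilled at least as fast as raw data are produced, both the efficiency and the throughput of the LDPC decoding stage directly constrain the secure distance and key rate.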

  1. Regolith-Derived Heat Shield for Planetary Body Entry and Descent System with In Situ Fabrication

    NASA Technical Reports Server (NTRS)

    Hogue, Michael D.; Mueller, Robert P.; Rasky, Daniel J.; Hintze, Paul E.; Sibille, Laurent

    2011-01-01

    In this paper we will discuss a new mass-efficient and innovative way of protecting high-mass spacecraft during planetary Entry, Descent & Landing (EDL). Heat shields fabricated in situ can provide a thermal-protection system (TPS) for spacecraft that routinely enter a planetary atmosphere. By fabricating the heat shield with space resources from regolith materials available on moons and asteroids, it is possible to avoid launching the heat-shield mass from Earth. Three regolith processing and manufacturing methods will be discussed: 1) oxygen & metal extraction ISRU processes produce glassy melts enriched in alumina and titania, processed to obtain variable density, high melting point and heat-resistance; 2) compression and sintering of the regolith yield low density materials; 3) in-situ derived high-temperature polymers are created to bind regolith particles together, with a lower energy budget.

  2. An advanced stochastic weather generator for simulating 2-D high-resolution climate variables

    NASA Astrophysics Data System (ADS)

    Peleg, Nadav; Fatichi, Simone; Paschalis, Athanasios; Molnar, Peter; Burlando, Paolo

    2017-07-01

    A new stochastic weather generator, the Advanced WEather GENerator for a two-dimensional grid (AWE-GEN-2d), is presented. The model combines physical and stochastic approaches to simulate key meteorological variables at high spatial and temporal resolution: 2 km × 2 km and 5 min for precipitation and cloud cover and 100 m × 100 m and 1 h for near-surface air temperature, solar radiation, vapor pressure, atmospheric pressure, and near-surface wind. The model requires spatially distributed data for the calibration process, which can nowadays be obtained from remote sensing devices (weather radar and satellites), reanalysis data sets, and ground stations. AWE-GEN-2d is parsimonious in terms of computational demand and therefore is particularly suitable for studies where exploring internal climatic variability at multiple spatial and temporal scales is fundamental. Applications of the model include models of environmental systems, such as hydrological and geomorphological models, where high-resolution spatial and temporal meteorological forcing is crucial. The weather generator was calibrated and validated for the Engelberg region, an area with complex topography in the Swiss Alps. Model tests show that the climate variables are generated by AWE-GEN-2d with a level of accuracy that is sufficient for many practical applications.

  3. Variation of sperm head shape and tail length in a species of Australian hydromyine rodent: the spinifex hopping mouse, Notomys alexis.

    PubMed

    Bauer, M; Breed, W G

    2006-01-01

    In Australia, there are around 60 species of murid rodents in the subfamily Hydromyinae, most of which produce highly complex, monomorphic spermatozoa in which the head has an apical hook together with two ventral processes containing filamentous actin, and a long tail of species-specific length. One of the few exceptions to this is the spinifex hopping mouse, Notomys alexis, whose spermatozoa have previously been shown to have pleiomorphic heads. In this study, the structural organisation of the sperm head has been investigated in more detail and the variability in length of the midpiece and total length of the sperm tail has been determined for this species. The findings confirm that pleiomorphic sperm heads are invariably present in these animals and that this variability is associated with that of the nucleus, although nuclear vacuoles were not evident. The total length of the sperm tail, as well as that of the midpiece, was also highly variable both within, as well as between, individual animals. The reason(s) for this high degree of variability in sperm morphology is not known, but it may relate to a relaxation of the genetic control of sperm form owing to depressed levels of inter-male sperm competition.

  4. Integrating black liquor gasification with pulping - Process simulation, economics and potential benefits

    NASA Astrophysics Data System (ADS)

    Lindstrom, Erik Vilhelm Mathias

    Gasification of black liquor could drastically increase the flexibility and improve the profit potential of a mature industry. The completed work focused on the economics and benefits of its implementation, utilizing laboratory pulping experiments and process simulation. The separation of sodium and sulfur achieved through gasification of recovered black liquor can be utilized in processes such as modified continuous cooking, split sulfidity and green liquor pretreatment pulping, and polysulfide-anthraquinone pulping, to improve pulp yield and properties. Laboratory pulping protocols have been developed for these modified pulping technologies and different process options evaluated. The process simulation work around BLG has led to the development of a WinGEMS module for the low-temperature MTCI steam reforming process, and to case studies comparing a simulated conventional kraft process to different process options built around the implementation of a BLG unit operation in the kraft recovery cycle. Pulp yield increases of 1-3 percentage points with improved product quality, and the potential for capital and operating cost savings relative to the conventional kraft process, have been demonstrated. Process simulation work has shown that the net variable operating cost for a pulping process using BLGCC is highly dependent on the cost of lime kiln fuel and the selling price of green power to the grid. Under the assumptions taken in the performed case study, the BLGCC process combined with split sulfidity or PSAQ pulping operations had a net variable operating cost 2-4% greater than the kraft reference. The influence of the sales price of power to the grid is the most significant cost factor. If a sales price increase to 6 ¢/kWh for green power could be achieved, cost savings of about $40/ODtP could be realized in all investigated BLG processes. Other alternatives to improve the process economics around BLG would be to modify or eliminate the lime kiln unit operations by utilizing high-sulfidity green liquor pretreatment or PSAQ with auto-causticization, or by converting the process to mini-sulfide sulfite-AQ.

  5. Heralded processes on continuous-variable spaces as quantum maps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreyrol, Franck; Spagnolo, Nicolò; Blandino, Rémi

    2014-12-04

    Heralded processes, which only succeed when a measurement on part of the system gives the desired result, are particularly interesting for continuous variables. They permit non-Gaussian transformations that are necessary for several continuous-variable quantum information tasks. However, although maps and quantum process tomography are commonly used to describe quantum transformations in discrete-variable spaces, they are much rarer in the continuous-variable domain. Also, no convenient tool for representing maps in a way more adapted to the particularities of continuous variables has yet been explored. In this paper we try to fill this gap by presenting such a tool.

  6. Control of variable speed variable pitch wind turbine based on a disturbance observer

    NASA Astrophysics Data System (ADS)

    Ren, Haijun; Lei, Xin

    2017-11-01

    In this paper, a novel sliding mode controller based on a disturbance observer (DOB) is developed and analyzed to optimize the efficiency of a variable speed variable pitch (VSVP) wind turbine. Because the VSVP system is highly nonlinear, the model is linearized to obtain a state-space model of the system. Then, a conventional sliding mode controller is designed and a DOB is added to estimate the wind speed. The proposed control strategy can successfully deal with the random nature of wind speed, the nonlinearity of the VSVP system, parameter uncertainty, and external disturbances. Adding the observer to the sliding mode controller greatly reduces the chattering produced by the sliding mode switching gain. The simulation results show that the proposed control system is effective and robust.
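
    The following Python sketch illustrates the general idea of combining a sliding mode controller with a disturbance observer on a simple first-order speed-tracking model: the observer estimates the unknown wind-driven torque so that the switching gain, and hence the chattering, can be kept small. The model and all parameter values are illustrative assumptions, not the paper's turbine model.

    ```python
    # Hedged sketch: sliding mode control of a first-order rotor-speed model
    # (J * domega/dt = u + d) with a disturbance observer estimating the unknown
    # wind-driven torque d. All parameters are illustrative.
    import numpy as np

    J, dt = 5.0, 0.01                      # inertia, time step
    k_smc, eps = 2.0, 0.05                 # switching gain, boundary-layer width
    L = 20.0                               # observer gain

    def simulate(n_steps=2000, omega_ref=1.0):
        omega, d_hat, z = 0.0, 0.0, 0.0    # speed, disturbance estimate, observer state
        history = []
        for k in range(n_steps):
            d_true = 0.5 + 0.3 * np.sin(0.01 * k)          # unknown wind torque
            s = omega - omega_ref                          # sliding surface
            # control: cancel estimated disturbance + small smoothed switching term
            u = -d_hat - J * k_smc * np.tanh(s / eps)
            omega += dt * (u + d_true) / J                 # plant update
            # reduced-order disturbance observer: d_hat converges to d_true at rate L
            d_hat = z + L * J * omega
            z += dt * (-L * (u + d_hat))
            history.append((omega, d_hat, d_true))
        return np.array(history)

    hist = simulate()
    print(hist[-1])   # speed near omega_ref, d_hat near d_true
    ```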

  7. Using hypnosis to disrupt face processing: mirrored-self misidentification delusion and different visual media

    PubMed Central

    Connors, Michael H.; Barnier, Amanda J.; Coltheart, Max; Langdon, Robyn; Cox, Rochelle E.; Rivolta, Davide; Halligan, Peter W.

    2014-01-01

    Mirrored-self misidentification delusion is the belief that one’s reflection in the mirror is not oneself. This experiment used hypnotic suggestion to impair normal face processing in healthy participants and recreate key aspects of the delusion in the laboratory. From a pool of 439 participants, 22 high hypnotisable participants (“highs”) and 20 low hypnotisable participants were selected on the basis of their extreme scores on two separately administered measures of hypnotisability. These participants received a hypnotic induction and a suggestion for either impaired (i) self-face recognition or (ii) impaired recognition of all faces. Participants were tested on their ability to recognize themselves in a mirror and other visual media – including a photograph, live video, and handheld mirror – and their ability to recognize other people, including the experimenter and famous faces. Both suggestions produced impaired self-face recognition and recreated key aspects of the delusion in highs. However, only the suggestion for impaired other-face recognition disrupted recognition of other faces, albeit in a minority of highs. The findings confirm that hypnotic suggestion can disrupt face processing and recreate features of mirrored-self misidentification. The variability seen in participants’ responses also corresponds to the heterogeneity seen in clinical patients. An important direction for future research will be to examine sources of this variability within both clinical patients and the hypnotic model. PMID:24994973

  8. AVHRR channel selection for land cover classification

    USGS Publications Warehouse

    Maxwell, S.K.; Hoffer, R.M.; Chapman, P.L.

    2002-01-01

    Mapping land cover of large regions often requires processing of satellite images collected from several time periods at many spectral wavelength channels. However, manipulating and processing large amounts of image data increases the complexity and time, and hence the cost, that it takes to produce a land cover map. Very few studies have evaluated the importance of individual Advanced Very High Resolution Radiometer (AVHRR) channels for discriminating cover types, especially the thermal channels (channels 3, 4 and 5). Studies rarely perform a multi-year analysis to determine the impact of inter-annual variability on the classification results. We evaluated 5 years of AVHRR data using combinations of the original AVHRR spectral channels (1-5) to determine which channels are most important for cover type discrimination, yet stabilize inter-annual variability. Particular attention was placed on the channels in the thermal portion of the spectrum. Fourteen cover types over the entire state of Colorado were evaluated using a supervised classification approach on all two-, three-, four- and five-channel combinations for seven AVHRR biweekly composite datasets covering the entire growing season for each of 5 years. Results show that all three of the major portions of the electromagnetic spectrum represented by the AVHRR sensor are required to discriminate cover types effectively and stabilize inter-annual variability. Of the two-channel combinations, channels 1 (red visible) and 2 (near-infrared) had, by far, the highest average overall accuracy (72.2%), yet the inter-annual classification accuracies were highly variable. Including a thermal channel (channel 4) significantly increased the average overall classification accuracy by 5.5% and stabilized interannual variability. Each of the thermal channels gave similar classification accuracies; however, because of the problems in consistently interpreting channel 3 data, either channel 4 or 5 was found to be a more appropriate choice. Substituting the thermal channel with a single elevation layer resulted in equivalent classification accuracies and inter-annual variability.
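
    A channel-selection experiment of the kind described above can be sketched as an exhaustive loop over channel combinations with a simple supervised classifier and grouped (per-year) cross-validation. The Python sketch below uses synthetic placeholder data and scikit-learn; the classifier choice and data layout are assumptions for illustration only.

    ```python
    # Hedged sketch (synthetic data): evaluate every 2-5 channel combination of
    # five AVHRR channels with a supervised classifier, using per-year grouped
    # cross-validation so that inter-annual variability shows up in the fold spread.
    from itertools import combinations
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import GroupKFold, cross_val_score

    rng = np.random.default_rng(0)
    n_pixels, n_channels, n_classes = 3000, 5, 14
    X = rng.normal(size=(n_pixels, n_channels))            # placeholder composites
    y = rng.integers(0, n_classes, size=n_pixels)          # placeholder cover types
    years = rng.integers(0, 5, size=n_pixels)              # placeholder acquisition year

    results = {}
    for k in range(2, n_channels + 1):
        for combo in combinations(range(n_channels), k):
            scores = cross_val_score(LinearDiscriminantAnalysis(),
                                     X[:, list(combo)], y,
                                     groups=years, cv=GroupKFold(n_splits=5))
            results[combo] = (scores.mean(), scores.std())   # accuracy and inter-fold spread

    best = max(results, key=lambda c: results[c][0])
    print("best channel combination (0-indexed):", best, results[best])
    ```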

  9. Optimization of nanoparticles for cardiovascular tissue engineering.

    PubMed

    Izadifar, Mohammad; Kelly, Michael E; Haddadi, Azita; Chen, Xiongbiao

    2015-06-12

    Nano-particulate delivery systems have increasingly been playing important roles in cardiovascular tissue engineering. Properties of nanoparticles (e.g. size, polydispersity, loading capacity, zeta potential, morphology) are essential to system functions. Notably, these characteristics are regulated by fabrication variables, but in a complicated manner. This raises a great need to optimize fabrication process variables to ensure the desired nanoparticle characteristics. This paper presents a comprehensive experimental study on this matter, along with a novel method, the so-called Geno-Neural approach, to analyze, predict, and optimize fabrication variables for desired nanoparticle characteristics. Specifically, ovalbumin was used as a protein model of the growth factors used in cardiovascular tissue regeneration, and six fabrication variables were examined with regard to their influence on the characteristics of nanoparticles made from high molecular weight poly(lactide-co-glycolide). A six-factor, five-level central composite rotatable design was applied to the conduct of the experiments, and based on the experimental results, a geno-neural model was developed to determine the optimum fabrication conditions. For desired particle sizes of 150, 200, 250, and 300 nm, respectively, the optimum conditions for achieving a low polydispersity index, a more negative zeta potential, and a higher loading capacity were identified based on the developed geno-neural model and then evaluated experimentally. The experimental results revealed that the polymer and external aqueous phase concentrations and their interactions with other fabrication variables were the most significant variables affecting the size, polydispersity index, zeta potential, loading capacity, and initial burst release of the nanoparticles, while electron microscopy images of the nanoparticles showed spherical geometries with no sign of large pores or cracks on their surfaces. The release study revealed that the onset of the third phase of release can be affected by the polymer concentration. Circular dichroism spectroscopy indicated that ovalbumin structural integrity is preserved during the encapsulation process. Findings from this study would greatly contribute to the design of high molecular weight poly(lactide-co-glycolide) nanoparticles for prolonged release patterns in cardiovascular engineering.

  10. Determinants of safety outcomes and performance: A systematic literature review of research in four high-risk industries.

    PubMed

    Cornelissen, Pieter A; Van Hoof, Joris J; De Jong, Menno D T

    2017-09-01

    In spite of increasing governmental and organizational efforts, organizations still struggle to improve the safety of their employees, as evidenced by the yearly 2.3 million work-related deaths worldwide. Occupational safety research is scattered and inaccessible, especially for practitioners. Through systematically reviewing the safety literature, this study aims to provide a comprehensive overview of behavioral and circumstantial factors that endanger or support employee safety. A broad search of the occupational safety literature using four online bibliographical databases yielded 27,527 articles. Through a systematic reviewing process, 176 articles were identified that met the inclusion criteria (e.g., original peer-reviewed research; conducted in selected high-risk industries; published between 1980 and 2016). Variables and the nature of their interrelationships (i.e., positive, negative, or nonsignificant) were extracted, and then grouped and classified through a process of bottom-up coding. The results indicate that safety outcomes and performance prevail as dependent research areas, dependent on variables related to management & colleagues, work(place) characteristics & circumstances, employee demographics, climate & culture, and external factors. Consensus was found for five variables related to safety outcomes and seven variables related to performance, while there is debate about 31 other relationships. Last, 21 variables related to safety outcomes and performance appear understudied. The majority of safety research has focused on addressing negative safety outcomes and performance through variables related to others within the organization, the work(place) itself, employee demographics, and, to a lesser extent, climate & culture and external factors. This systematic literature review provides both scientists and safety practitioners with an overview of the (under)studied behavioral and circumstantial factors related to occupational safety behavior. Scientists could use this overview to study gaps, and to validate or falsify relationships. Safety practitioners could use the insights to evaluate organizational safety policies and to further the development of safety interventions. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.

  11. NEW METHODOLOGY FOR DEVELOPMENT OF ORODISPERSIBLE TABLETS USING HIGH-SHEAR GRANULATION PROCESS.

    PubMed

    Ali, Bahaa E; Al-Shedfat, Ramadan I; Fayed, Mohamed H; Alanazi, Fars K

    2017-05-01

    Development of an orodispersible delivery system with high mechanical strength and a short disintegration time is a major challenge. The aim of the current work was to assess and optimize the high-shear granulation process as a new methodology for the development of orodispersible tablets with high quality attributes, using a design-of-experiments approach. A two-factor, three-level (3²) full factorial design was carried out to investigate the main and interaction effects of the independent variables, water amount (X1) and granulation time (X2), on the characteristics of the granules and the final product, the tablet. The produced granules were analyzed for granule size, density, and flowability. Furthermore, the produced tablets were tested for weight variation, breaking force/crushing strength, friability, disintegration time, and drug dissolution. Regression analysis of the multiple linear models showed a high correlation between the adjusted R-squared and predicted R-squared for all granule and tablet characteristics, with a difference of less than 0.2. All dependent responses of the granules and tablets were found to be significantly impacted (p < 0.05) by the two independent variables. However, water amount demonstrated the most dominant effect on all granule and tablet characteristics, as shown by its higher coefficient estimates for all selected responses. Numerical optimization using a desirability function was performed to optimize the variables under study to provide an orodispersible system within the USP limits with respect to mechanical properties and disintegration time. It was found that the highest desirability (0.915) could be attained at the low water level (180 g) and a short granulation time (1.65 min). Overall, this study provides the formulator with helpful information for selecting the proper water level and granulation time to obtain an orodispersible system of high crushing strength and very short disintegration time when high-shear granulation is used as the method of manufacture.
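
    The 3² design itself is small enough to enumerate directly. The Python sketch below generates the nine runs for assumed factor levels (only the 180 g water amount and the 1.65 min granulation time appear in the abstract; the other levels are hypothetical), attaches fabricated responses, and ranks the runs with a simple two-response desirability function analogous in spirit to the numerical optimization described above.

    ```python
    # Hedged sketch (illustrative numbers, not the study's data): enumerate a 3^2
    # full factorial design and rank the nine runs with a simple desirability
    # function (maximize crushing strength, minimize disintegration time).
    import itertools
    import numpy as np

    # Only 180 g and 1.65 min are taken from the abstract; other levels are assumed.
    levels = {"water_g": [180, 210, 240], "granulation_min": [1.65, 3.0, 4.35]}
    design = np.array(list(itertools.product(levels["water_g"], levels["granulation_min"])))

    rng = np.random.default_rng(3)
    strength = 40 + 0.2 * design[:, 0] + 2 * design[:, 1] + rng.normal(0, 2, 9)   # fabricated
    disint = 5 + 0.15 * design[:, 0] + 4 * design[:, 1] + rng.normal(0, 3, 9)     # fabricated

    def desirability(y, lo, hi, maximize=True):
        d = np.clip((y - lo) / (hi - lo), 0, 1)
        return d if maximize else 1 - d

    D = np.sqrt(desirability(strength, strength.min(), strength.max()) *
                desirability(disint, disint.min(), disint.max(), maximize=False))
    best = design[np.argmax(D)]
    print("run with highest overall desirability:", best, round(float(D.max()), 3))
    ```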

  12. ARTiiFACT: a tool for heart rate artifact processing and heart rate variability analysis.

    PubMed

    Kaufmann, Tobias; Sütterlin, Stefan; Schulz, Stefan M; Vögele, Claus

    2011-12-01

    The importance of appropriate handling of artifacts in interbeat interval (IBI) data must not be underestimated. Even a single artifact may cause unreliable heart rate variability (HRV) results. Thus, a robust artifact detection algorithm and the option for manual intervention by the researcher form key components for confident HRV analysis. Here, we present ARTiiFACT, a software tool for processing electrocardiogram and IBI data. Both automated and manual artifact detection and correction are available in a graphical user interface. In addition, ARTiiFACT includes time- and frequency-based HRV analyses and descriptive statistics, thus offering the basic tools for HRV analysis. Notably, all program steps can be executed separately and allow for data export, thus offering high flexibility and interoperability with a whole range of applications.
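
    To illustrate why artifact handling must precede HRV analysis, the sketch below applies a generic deviation-from-local-median screen to an interbeat-interval series and then computes two common time-domain HRV measures; the 25% threshold and the example data are illustrative assumptions, not ARTiiFACT's algorithm.

    ```python
    # Hedged sketch: simple IBI artifact screen followed by two time-domain HRV
    # measures, showing how a single artifact distorts the statistics.
    import numpy as np

    def screen_artifacts(ibi_ms, window=5, tol=0.25):
        """Flag IBIs deviating more than `tol` from the local median."""
        ibi = np.asarray(ibi_ms, float)
        flags = np.zeros(len(ibi), dtype=bool)
        for i in range(len(ibi)):
            lo, hi = max(0, i - window), min(len(ibi), i + window + 1)
            local_median = np.median(ibi[lo:hi])
            flags[i] = abs(ibi[i] - local_median) > tol * local_median
        return flags

    def time_domain_hrv(ibi_ms):
        ibi = np.asarray(ibi_ms, float)
        sdnn = ibi.std(ddof=1)                           # overall variability
        rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))      # short-term variability
        return sdnn, rmssd

    # Example: one ectopic-like beat (400 ms) inflates SDNN and RMSSD until removed.
    ibi = np.array([800, 810, 790, 805, 400, 815, 795, 800, 810], float)
    clean = ibi[~screen_artifacts(ibi)]
    print("raw:", time_domain_hrv(ibi), "cleaned:", time_domain_hrv(clean))
    ```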

  13. The Role of Microstructural Variability on the Very High-Cycle Fatigue Behavior of Discontinuously-Reinforced Aluminum Metal Matrix Composites using Ultrasonic Fatigue (Preprint)

    DTIC Science & Technology

    2008-05-01

    controlled processing. Bhanu-Prasad et al. conducted a systematic study of PM-processed 2124/SiC/30p aluminum composites in which matrix alloy ... Mater., 27, 173-178. [5] Wang A, Rack HJ (1991). Transition wear behavior of SiC-particulate- and SiC-whisker-reinforced 7091 Al metal matrix ... modeling of particle distribution effects on fatigue in Al-SiCp composites. Mater. Sci. Eng. A, Struct. Mater. Prop. Microstruct. Process., 300, 113-124

  14. High throughput electrospinning of high-quality nanofibers via an aluminum disk spinneret

    NASA Astrophysics Data System (ADS)

    Zheng, Guokuo

    In this work, a simple and efficient needleless high-throughput electrospinning process using an aluminum disk spinneret with 24 holes is described. Electrospun mats produced by this setup consisted of fine (nano-sized) fibers of the highest quality, while the productivity (yield) was many times that obtained from conventional single-needle electrospinning. The goal was to produce scaled-up amounts of nanofibers of the same or better quality than those produced with a single-needle laboratory setup, under varying concentration, voltage, and working distance. The fiber mats produced were either polymer or ceramic (such as molybdenum trioxide nanofibers). Through experimentation, the optimum process conditions were found to be 24 kV and a distance to collector of 15 cm. More dilute solutions resulted in smaller-diameter fibers. The morphologies of the MoO3 nanofibers produced by the traditional and the high-throughput setups were found to be very similar. Moreover, the nanofiber production rate is nearly 10 times that of traditional needle electrospinning. Thus, the high-throughput process has the potential to become an industrial nanomanufacturing process, and the materials processed by it may be used as filtration devices, in tissue engineering, and as sensors.

  15. A method for predicting optimized processing parameters for surfacing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dupont, J.N.; Marder, A.R.

    1994-12-31

    Welding is used extensively for surfacing applications. To operate a surfacing process efficiently, the variables must be optimized to produce low levels of dilution with the substrate while maintaining high deposition rates. An equation for dilution in terms of the welding variables, thermal efficiency factors, and thermophysical properties of the overlay and substrate was developed by balancing energy and mass terms across the welding arc. To test the validity of the resultant dilution equation, the PAW, GTAW, GMAW, and SAW processes were used to deposit austenitic stainless steel onto carbon steel over a wide range of parameters. Arc efficiency measurements were conducted using a Seebeck arc welding calorimeter. Melting efficiency was determined based on knowledge of the arc efficiency. Dilution was determined for each set of processing parameters using a quantitative image analysis system. The pertinent equations indicate dilution is a function of arc power (corrected for arc efficiency), filler metal feed rate, melting efficiency, and thermophysical properties of the overlay and substrate. With the aid of the dilution equation, the effect of processing parameters on dilution is presented by a new processing diagram. A new method is proposed for determining dilution from welding variables. Dilution is shown to depend on the arc power, filler metal feed rate, arc and melting efficiency, and the thermophysical properties of the overlay and substrate. Calculated dilution levels were compared with measured values over a large range of processing parameters and good agreement was obtained. The results have been applied to generate a processing diagram which can be used to: (1) predict the maximum deposition rate for a given arc power while maintaining adequate fusion with the substrate, and (2) predict the resultant level of dilution with the substrate.
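
    The abstract states only the functional dependence of dilution, not the equation itself. The sketch below is therefore a hedged, generic energy/mass balance in the same spirit (net melting power split between filler and substrate); the functional form and the example numbers are assumptions for illustration, not the authors' equation.

```python
# Hedged energy/mass balance sketch (not the authors' exact equation): the net
# melting power sets a total melting rate; melting the filler wire consumes part
# of it, and the remainder melts substrate, which defines the dilution.
def dilution(arc_power_w, arc_eff, melt_eff, filler_feed_mm3_s,
             dH_filler_j_mm3, dH_substrate_j_mm3):
    melt_power = arc_eff * melt_eff * arc_power_w                       # W available for melting
    p_filler = filler_feed_mm3_s * dH_filler_j_mm3                      # W used to melt the filler
    v_substrate = max(melt_power - p_filler, 0.0) / dH_substrate_j_mm3  # mm^3/s of substrate melted
    return v_substrate / (v_substrate + filler_feed_mm3_s)              # dilution = substrate / total melt

# Example with GTAW-like, purely illustrative numbers (steel enthalpy ~10.5 J/mm^3)
print(round(dilution(5000, 0.7, 0.3, 20.0, 10.5, 10.5), 2))
```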

  16. Relationships between atypical sensory processing patterns, maladaptive behaviour and maternal stress in Spanish children with autism spectrum disorder.

    PubMed

    Nieto, C; López, B; Gandía, H

    2017-12-01

    This study investigated sensory processing in a sample of Spanish children with autism spectrum disorder (ASD). Specifically, the study aimed to explore (1) the prevalence and distribution of atypical sensory processing patterns, (2) the relationship between adaptive and maladaptive behaviour with atypical sensory processing and (3) the possible relationship between sensory subtype and maternal stress. The Short Sensory Profile 2 (Dunn 2014) and the Vineland Adaptive Behavior Scale (Sparrow et al. 1984) were administered to examine the sensory processing difficulties and maladaptive behaviours of 45 children with ASD aged 3 to 14; their mothers also completed the Parenting Stress Index-Short Form (Abidin 1995). Atypical sensory features were found in 86.7% of the children; avoider and sensor being the two most common patterns. No significant relationship was found between atypical sensory processing and adaptive behaviour. However, the analysis showed a strong relationship between sensory processing and maladaptive behaviour. Both maladaptive behaviour and sensory processing difficulties correlated significantly with maternal stress, although maternal stress was predicted only by the sensory variable, and in particular by the avoider pattern. The findings suggest that sensory features in ASD may be driving the high prevalence of parental stress in carers. They also suggest that the effect on parental stress that has been attributed traditionally to maladaptive behaviours may be driven by sensory difficulties. The implications of these findings are discussed in relation to the development of interventions and the need to explore contextual and cultural variables as possible sources of variability. © 2017 MENCAP and International Association of the Scientific Study of Intellectual and Developmental Disabilities and John Wiley & Sons Ltd.

  17. EPE analysis of sub-N10 BEoL flow with and without fully self-aligned via using Coventor SEMulator3D

    NASA Astrophysics Data System (ADS)

    Franke, Joern-Holger; Gallagher, Matt; Murdoch, Gayle; Halder, Sandip; Juncker, Aurelie; Clark, William

    2017-03-01

    During the last few decades, the semiconductor industry has been able to scale device performance up while driving costs down. What started off as simple geometrical scaling, driven mostly by advances in lithography, has recently been accompanied by advances in processing techniques and in device architectures. The trend to combine efforts using process technology and lithography is expected to intensify, as further scaling becomes ever more difficult. One promising component of future nodes is "scaling boosters", i.e. processing techniques that enable further scaling. An indispensable component in developing these ever more complex processing techniques is semiconductor process modeling software. Visualization of complex 3D structures in SEMulator3D, along with budget analysis on film thicknesses, CD and etch budgets, allows process integrators to compare flows before any physical wafers are run. Hundreds of "virtual" wafers allow comparison of different processing approaches, along with EUV or DUV patterning options for defined layers and different overlay schemes. This "virtual fabrication" technology produces massively parallel process variation studies that would be highly time-consuming or expensive to run experimentally. Here, we focus on one particular scaling booster, the fully self-aligned via (FSAV). We compare metal-via-metal (me-via-me) chains with self-aligned and fully self-aligned vias using a calibrated model for imec's N7 BEoL flow. To model overall variability, 3D Monte Carlo modeling of as many variability sources as possible is critical. We use Coventor SEMulator3D to extract minimum me-me distances and contact areas and show how fully self-aligned vias allow better me-via distance control and tighter via-me contact area variability compared with the standard self-aligned via (SAV) approach.
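
    A hedged illustration of the kind of Monte Carlo variability budgeting such virtual-fabrication studies perform is sketched below. The 1-D geometry, dimensions, and variation sigmas are invented for illustration (they are not imec N7 values), and full self-alignment is idealized as completely removing the overlay contribution to the neighbour spacing.

```python
# Hedged Monte Carlo sketch of spacing from a via edge to the neighbouring metal
# line under overlay and CD variation, with and without (idealized) full
# self-alignment. All numbers and the 1-D geometry are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
pitch, metal_cd, via_cd = 40.0, 20.0, 18.0                         # nm, illustrative
nominal_space = (pitch - metal_cd) / 2 + (metal_cd - via_cd) / 2   # via centred on its line

overlay = rng.normal(0.0, 3.0, n)       # via-to-metal overlay error (nm, 1 sigma)
via_cd_err = rng.normal(0.0, 1.5, n)    # via CD variation (nm, 1 sigma)

# Standard SAV: overlay shifts the via towards the neighbouring line
space_sav = nominal_space - np.abs(overlay) - via_cd_err / 2
# Fully self-aligned via: the via edge is clamped to its own line, so overlay
# no longer eats into the neighbour spacing (an idealization)
space_fsav = nominal_space - via_cd_err / 2

for name, s in [("SAV", space_sav), ("FSAV", space_fsav)]:
    print(f"{name}: mean={s.mean():.1f} nm, 3-sigma min={s.mean() - 3 * s.std():.1f} nm")
```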

  18. Efficacious insect and disease control with laser-guided air-assisted sprayer

    USDA-ARS?s Scientific Manuscript database

    Efficacy of a newly developed air-assisted variable-rate sprayer was investigated for the control of arthropod pests and plant diseases in six commercial fields. The sprayer was integrated with a high-speed laser scanning sensor, a custom-designed signal processing program, an automatic flow control...

  19. Criterion 1: Conservation of biological diversity

    Treesearch

    Stephen R. Shifley; Francisco X. Aguilar; Nianfu Song; Susan I. Stewart; David J. Nowak; Dale D. Gormanson; W. Keith Moser; Sherri Wormstead; Eric J. Greenfield

    2012-01-01

    Biological diversity, or biodiversity, is the variety of life. It encompasses the variability among living organisms and includes diversity within species, among species, and among ecosystems. High biodiversity enables a forest ecosystem to respond to external influences, absorb and recover from disturbances, and still maintain essential ecosystem processes such as...

  20. Rising synchrony controls western North American ecosystems

    Treesearch

    Bryan A. Black; Peter van der Sleen; Emanuele Di Lorenzo; Daniel Griffin; William J. Sydeman; Jason B. Dunham; Ryan R. Rykaczewski; Marisol García-Reyes; Mohammad Safeeq; Ivan Arismendi; Steven J. Bograd

    2018-01-01

    Along the western margin of North America, the winter expression of the North Pacific High (NPH) strongly influences interannual variability in coastal upwelling, storm track position, precipitation, and river discharge. Coherence among these factors induces covariance among physical and biological processes across adjacent marine and terrestrial ecosystems. Here, we...

  1. Miniature spinning: an improved cotton research tools

    USDA-ARS?s Scientific Manuscript database

    Cotton is a natural fiber and is highly variable. Researchers need to evaluate cotton fiber properties to aid in the development of improved varieties and to ensure that changes in agronomic practices do not harm fiber quality or processing propensity. There is a need for fiber quality evaluation be...

  2. Evaluation of thermal processing variables for reducing acrylamide in canned black ripe olives

    USDA-ARS?s Scientific Manuscript database

    Acrylamide formed in plant foods at elevated cooking temperatures has been identified as a probable carcinogen. A wide variation and high acrylamide concentration in commercial canned black ripe olives has been reported. The objective of this study was to determine if different safe sterilization co...

  3. Creative Thinking: Processes, Strategies, and Knowledge

    ERIC Educational Resources Information Center

    Mumford, Michael D.; Medeiros, Kelsey E.; Partlow, Paul J.

    2012-01-01

    Creative achievements are the basis for progress in our world. Although creative achievement is influenced by many variables, the basis for creativity is held to lie in the generation of high-quality, original, and elegant solutions to complex, novel, ill-defined problems. In the present effort, we examine the cognitive capacities that make…

  4. Theory and High-Energy-Density Laser Experiments Relevant to Accretion Processes in Cataclysmic Variables

    NASA Astrophysics Data System (ADS)

    Krauland, Christine; Drake, R.; Loupias, B.; Falize, E.; Busschaert, C.; Ravasio, A.; Yurchak, R.; Pelka, A.; Koenig, M.; Kuranz, C. C.; Plewa, T.; Huntington, C. M.; Kaczala, D. N.; Klein, S.; Sweeney, R.; Villete, B.; Young, R.; Keiter, P. A.

    2012-05-01

    We present results from high-energy-density (HED) laboratory experiments that explore the contribution of radiative shock waves to the evolving dynamics of the cataclysmic variable (CV) systems in which they reside. CVs can be classified under two main categories, non-magnetic and magnetic. In the process of accretion, both types involve strongly radiating shocks that provide the main source of radiation in the binary systems. This radiation can cause varying structure to develop depending on the optical properties of the material on either side of the shock. The ability of high-intensity lasers to create large energy densities in targets of millimeter-scale volume makes it feasible to create similar radiative shocks in the laboratory. We provide an overview of both CV systems and their connection to the designed and executed laboratory experiments performed on two laser facilities. Available data and accompanying simulations will likewise be shown. Funded by the NNSA-DS and SC-OFES Joint Prog. in High-Energy-Density Lab. Plasmas, by the Nat. Laser User Facility Prog. in NNSA-DS and by the Predictive Sci. Acad. Alliances Prog. in NNSA-ASC, under grant numbers DE-FG52-09NA29548, DE-FG52-09NA29034, and DE-FC52-08NA28616.

  5. Investigation of Recombination Processes In A Magnetized Plasma

    NASA Technical Reports Server (NTRS)

    Chavers, Greg; Chang-Diaz, Franklin; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

    Interplanetary travel requires propulsion systems that can provide high specific impulse (Isp), while also having sufficient thrust to rapidly accelerate large payloads. One such propulsion system is the Variable Specific Impulse Magneto-plasma Rocket (VASIMR), which creates, heats, and exhausts plasma to provide variable thrust and Isp, optimally meeting the mission requirements. A large fraction of the energy to create the plasma is frozen in the exhaust in the form of ionization energy. This loss mechanism is common to all electromagnetic plasma thrusters and has an impact on their efficiency. When the device operates at high Isp, where the exhaust kinetic energy is high compared to the ionization energy, the frozen flow component is of little consequence; however, at low Isp, the effect of the frozen flow may be important. If some of this energy could be recovered through recombination processes, and re-injected as neutral kinetic energy, the efficiency of VASIMR, in its low Isp/high thrust mode may be improved. In this operating regime, the ionization energy is a large portion of the total plasma energy. An experiment is being conducted to investigate the possibility of recovering some of the energy used to create the plasma. This presentation will cover the progress and status of the experiment involving surface recombination of the plasma.

  6. Nutrient characteristics of the water masses and their seasonal variability in the eastern equatorial Indian Ocean.

    PubMed

    Sardessai, S; Shetye, Suhas; Maya, M V; Mangala, K R; Prasanna Kumar, S

    2010-01-01

    Nutrient characteristics of four water masses in the light of their thermohaline properties are examined in the eastern Equatorial Indian Ocean during winter, spring and summer monsoon. The presence of low salinity water mass with "Surface enrichments" of inorganic nutrients was observed relative to 20 m in the mixed layer. Lowest oxygen levels of 19 microM at 3 degrees N in the euphotic zone indicate mixing of low oxygen high salinity Arabian Sea waters with the equatorial Indian Ocean. The seasonal variability of nutrients was regulated by seasonally varying physical processes like thermocline elevation, meridional and zonal transport, the equatorial undercurrent and biological processes of uptake and remineralization. Circulation of Arabian Sea high salinity waters with nitrate deficit could also be seen from low N/P ratio with a minimum of 8.9 in spring and a maximum of 13.6 in winter. This large deviation from Redfield N/P ratio indicates the presence of denitrified high salinity waters with a seasonal nitrate deficit ranging from -4.85 to 1.52 in the Eastern Equatorial Indian Ocean. 2010 Elsevier Ltd. All rights reserved.

  7. Application of AIS Technology to Forest Mapping

    NASA Technical Reports Server (NTRS)

    Yool, S. R.; Star, J. L.

    1985-01-01

    Concerns about the environmental effects of large-scale deforestation have prompted efforts to map forests over large areas using various remote sensing data and image processing techniques. Basic research on the spectral characteristics of forest vegetation is required to form a basis for the development of new techniques and for image interpretation. Examination of LANDSAT data and image processing algorithms over a portion of boreal forest has demonstrated the complexity of relations between the various expressions of forest canopies, environmental variability, and the relative capacities of different image processing algorithms to achieve high classification accuracies under these conditions. Airborne Imaging Spectrometer (AIS) data may in part provide the means to interpret the responses of standard data and techniques to the vegetation, based on their relatively high spectral resolution.

  8. Scaling Limit of Symmetric Random Walk in High-Contrast Periodic Environment

    NASA Astrophysics Data System (ADS)

    Piatnitski, A.; Zhizhina, E.

    2017-11-01

    The paper deals with the asymptotic properties of a symmetric random walk in a high contrast periodic medium in Z^d, d≥1. From the existing homogenization results it follows that under diffusive scaling the limit behaviour of this random walk need not be Markovian. The goal of this work is to show that if in addition to the coordinate of the random walk in Z^d we introduce an extra variable that characterizes the position of the random walk inside the period then the limit dynamics of this two-component process is Markov. We describe the limit process and observe that the components of the limit process are coupled. We also prove the convergence in the path space for the said random walk.

  9. Increased Intra-Participant Variability in Children with Autistic Spectrum Disorders: Evidence from Single-Trial Analysis of Evoked EEG

    PubMed Central

    Milne, Elizabeth

    2011-01-01

    Intra-participant variability in clinical conditions such as autistic spectrum disorder (ASD) is an important indicator of pathophysiological processing. The data reported here illustrate that trial-by-trial variability can be reliably measured from EEG, and that intra-participant EEG variability is significantly greater in those with ASD than in neuro-typical matched controls. EEG recorded at the scalp is a linear mixture of activity arising from muscle artifacts and numerous concurrent brain processes. To minimize these additional sources of variability, EEG data were subjected to two different methods of spatial filtering. (i) The data were decomposed using infomax independent component analysis, a method of blind source separation which un-mixes the EEG signal into components with maximally independent time-courses, and (ii) a surface Laplacian transform was performed (current source density interpolation) in order to reduce the effects of volume conduction. Data are presented from 13 high functioning adolescents with ASD without co-morbid ADHD, and 12 neuro-typical age-, IQ-, and gender-matched controls. Comparison of variability between the ASD and neuro-typical groups indicated that intra-participant variability of P1 latency and P1 amplitude was greater in the participants with ASD, and inter-trial α-band phase coherence was lower in the participants with ASD. These data support the suggestion that individuals with ASD are less able to synchronize the activity of stimulus-related cell assemblies than neuro-typical individuals, and provide empirical evidence in support of theories of increased neural noise in ASD. PMID:21716921
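
    The two single-trial measures described above, trial-to-trial variability of an evoked peak and inter-trial phase coherence, can be illustrated on synthetic data as sketched below. The simulated epochs, jitter values, and the 10 Hz band are arbitrary stand-ins, not the study's EEG recordings or its ICA/surface-Laplacian preprocessing.

```python
# Synthetic-data sketch of single-trial variability measures: peak-latency SD
# across trials and inter-trial phase coherence (ITC) at 10 Hz.
import numpy as np

rng = np.random.default_rng(2)
fs, n_trials, n_samp = 250, 100, 250                  # 1 s epochs at 250 Hz
t = np.arange(n_samp) / fs

def make_trials(latency_jitter_ms):
    """Evoked 'P1-like' bump plus 10 Hz activity with jittered latency/phase."""
    trials = []
    for _ in range(n_trials):
        lat = 0.1 + rng.normal(0, latency_jitter_ms / 1000)
        evoked = np.exp(-((t - lat) ** 2) / (2 * 0.01 ** 2))
        alpha = np.cos(2 * np.pi * 10 * (t - lat))
        trials.append(evoked + 0.5 * alpha + 0.3 * rng.standard_normal(n_samp))
    return np.array(trials)

def peak_latency_sd(trials):
    return np.std(t[np.argmax(trials, axis=1)]) * 1000   # ms

def itc_10hz(trials):
    spec = np.fft.rfft(trials, axis=1)
    k = int(round(10 * n_samp / fs))                     # FFT bin closest to 10 Hz
    return np.abs(np.mean(spec[:, k] / np.abs(spec[:, k])))

for label, jitter in [("low jitter (control-like)", 5), ("high jitter (ASD-like)", 25)]:
    x = make_trials(jitter)
    print(label, "latency SD =", round(peak_latency_sd(x), 1), "ms, ITC =", round(itc_10hz(x), 2))
```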

  10. Comparative Genome Analysis of Ciprofloxacin-Resistant Pseudomonas aeruginosa Reveals Genes Within Newly Identified High Variability Regions Associated With Drug Resistance Development

    PubMed Central

    Su, Hsun-Cheng; Khatun, Jainab; Kanavy, Dona M.

    2013-01-01

    The alarming rise of ciprofloxacin-resistant Pseudomonas aeruginosa has been reported in several clinical studies. Though the mutation of resistance genes and their role in drug resistance has been researched, the process by which the bacterium acquires high-level resistance is still not well understood. How does the genomic evolution of P. aeruginosa affect resistance development? Could the exposure of antibiotics to the bacteria enrich genomic variants that lead to the development of resistance, and if so, how are these variants distributed through the genome? To answer these questions, we performed 454 pyrosequencing and a whole genome analysis both before and after exposure to ciprofloxacin. The comparative sequence data revealed 93 unique resistance strain variation sites, which included a mutation in the DNA gyrase subunit A gene. We generated variation-distribution maps comparing the wild and resistant types, and isolated 19 candidates from three discrete resistance-associated high variability regions that had available transposon mutants, to perform a ciprofloxacin exposure assay. Of these region candidates with transposon disruptions, 79% (15/19) showed a reduction in the ability to gain high-level resistance, suggesting that genes within these high variability regions might enrich for certain functions associated with resistance development. PMID:23808957

  11. Using present day observations to detect when ocean acidification exceeds natural variability of surface seawater Ωaragonite

    NASA Astrophysics Data System (ADS)

    Sutton, A.; Sabine, C. L.; Feely, R. A.

    2016-02-01

    One of the major challenges to assessing the impact of ocean acidification on marine life is the need to better understand the magnitude of long-term change in the context of natural variability. High-frequency moored observations can be highly effective in defining interannual, seasonal, and subseasonal variability at key locations. Here we present a monthly aragonite saturation state (Ωaragonite) climatology for 15 open-ocean, coastal, and coral reef locations using 3-hourly moored observations of surface seawater pCO2 and pH collected together since as early as 2009. We then use these present-day surface mooring observations to estimate pre-industrial variability at each location and compare these results to previous modeling studies addressing global-scale variability and change. Our observations suggest that open-ocean sites, especially in the subtropics, are experiencing Ωaragonite values outside the range of pre-industrial values throughout much of the year. In coastal and coral reef ecosystems, which have higher natural variability, seasonal patterns are emerging in which present-day Ωaragonite values exceed pre-industrial bounds, with some sites exhibiting subseasonal conditions approaching Ωaragonite = 1. Linking these seasonal patterns in carbonate chemistry to biological processes in these regions is critical to identify when and where marine life may encounter Ωaragonite values outside the conditions to which they have adapted.

  12. Assessing Precipitation Isotope Variations during Atmospheric River Events to Reveal Dominant Atmospheric/Hydrologic Processes

    NASA Astrophysics Data System (ADS)

    McCabe-Glynn, S. E.; Johnson, K. R.; Yoshimura, K.; Buenning, N. H.; Welker, J. M.

    2015-12-01

    Extreme precipitation events across the Western US commonly associated with atmospheric rivers (ARs), whereby extensive fluxes of moisture are transported from the subtropics, can result in major damage and are projected by most climate models to increase in frequency and severity. However, they are difficult to project beyond ~ten days and the location of landfall and topographically induced precipitation is even more uncertain. Water isotopes, often used to reconstruct past rainfall variability, are useful natural tracers of atmospheric hydrologic processes. Because of the typical tropical and sub-tropical origins, ARs can carry unique water isotope (δ18O and δ2H, d-excess) signatures that can be utilized to provide source and process information that can lead to improving AR predictions. Recent analysis of the top 10 weekly precipitation total samples from Sequoia National Park, CA, of which 9 contained AR events, shows a high variability in the isotopic values. NOAA Hysplit back trajectory analyses reveals a variety of trajectories and varying latitudinal source regions contributed to moisture delivered to this site, which may explain part of the high variability (δ2H = -150.03 to -49.52 ‰, δ18O = -19.27 to -7.20 ‰, d-excess = 4.1 to 25.8). Here we examine the top precipitation totals occurring during AR events and the associated isotopic composition of precipitation samples from several sites across the Western US. We utilize IsoGSM, an isotope-enabled atmospheric general circulation model, to characterize the hydrologic processes and physical dynamics contributing to the observed isotopic variations. We investigate isotopic influences from moisture source location, AR speed, condensation height, and associated temperature. We explore the dominant controls on spatial and temporal variations of the isotopic composition of AR precipitation which highlights different physical processes for different AR events.

  13. Process monitoring using automatic physical measurement based on electrical and physical variability analysis

    NASA Astrophysics Data System (ADS)

    Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey

    2015-04-01

    A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created from dedicated structures such as a static random-access memory (SRAM) array or a standard cell library, or by using a simple design-rule-checking run-set. The resulting database was then used as an input for choosing locations for critical dimension scanning electron microscope images and for specific layout parameter extraction, which was then input to SPICE compact modeling simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to insufficient N-well to P-well spacing. Based on this example, for process monitoring and variability analyses, we extensively used this method to analyze transistor gates having different shapes. In addition, analysis of a large area of a high-density standard cell library was done. Another set of monitoring runs, focused on a high-density SRAM array, is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified asymmetric transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was done on samples after Cu M1 etch. Process monitoring information on M1-enclosed contacts was extracted using contact resistance as feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data showed the successful in-field implementation of our methodology as a useful process monitoring method.

  14. A Regional Modeling Framework of Phosphorus Sources and Transport in Streams of the Southeastern United States

    USGS Publications Warehouse

    Garcia, A.M.; Hoos, A.B.; Terziotti, S.

    2011-01-01

    We applied the SPARROW model to estimate phosphorus transport from catchments to stream reaches and subsequent delivery to major receiving water bodies in the Southeastern United States (U.S.). We show that six source variables and five land-to-water transport variables are significant (p<0.05) in explaining 67% of the variability in long-term log-transformed mean annual phosphorus yields. Three land-to-water variables are a subset of landscape characteristics that have been used as transport factors in phosphorus indices developed by state agencies and are identified through experimental research as influencing land-to-water phosphorus transport at field and plot scales. Two land-to-water variables - soil organic matter and soil pH - are associated with phosphorus sorption, a significant finding given that most state-developed phosphorus indices do not explicitly contain variables for sorption processes. Our findings for Southeastern U.S. streams emphasize the importance of accounting for phosphorus present in the soil profile to predict attainable instream water quality. Regional estimates of phosphorus associated with soil-parent rock were highly significant in explaining instream phosphorus yield variability. Model predictions associate 31% of the phosphorus delivered to receiving water bodies with geology, and the highest total phosphorus yields in the Southeast were from catchments with already high background levels that have been impacted by human activity. © 2011 American Water Resources Association. This article is a US Government work and is in the public domain in the USA.

  15. Spatial variability in plankton biomass and hydrographic variables along an axial transect in Chesapeake Bay

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Roman, M.; Kimmel, D.; McGilliard, C.; Boicourt, W.

    2006-05-01

    High-resolution, axial sampling surveys were conducted in Chesapeake Bay during April, July, and October from 1996 to 2000 using a towed sampling device equipped with sensors for depth, temperature, conductivity, oxygen, fluorescence, and an optical plankton counter (OPC). The results suggest that the axial distribution and variability of hydrographic and biological parameters in Chesapeake Bay were primarily influenced by the source and magnitude of freshwater input. Bay-wide spatial trends in the water column-averaged values of salinity were linear functions of distance from the main source of freshwater, the Susquehanna River, at the head of the bay. However, spatial trends in the water column-averaged values of temperature, dissolved oxygen, chlorophyll-a and zooplankton biomass were nonlinear along the axis of the bay. Autocorrelation analysis and the residuals of linear and quadratic regressions between each variable and latitude were used to quantify the patch sizes for each axial transect. The patch sizes of each variable depended on whether the data were detrended, and the detrending techniques applied. However, the patch size of each variable was generally larger using the original data compared to the detrended data. The patch sizes of salinity were larger than those for dissolved oxygen, chlorophyll-a and zooplankton biomass, suggesting that more localized processes influence the production and consumption of plankton. This high-resolution quantification of the zooplankton spatial variability and patch size can be used for more realistic assessments of the zooplankton forage base for larval fish species.
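
    The patch-size approach described above (detrend each variable against distance, then examine the autocorrelation of the residuals) is sketched below on a synthetic transect. The 40 km patch scale, the sampling resolution, and the use of the first zero-crossing lag as the patch-size estimate are illustrative assumptions, not the study's exact procedure.

```python
# Patch-size estimation sketch on a synthetic along-transect series: remove a
# linear or quadratic trend, then take the first zero-crossing lag of the
# residual autocorrelation as a patch-size estimate.
import numpy as np

rng = np.random.default_rng(3)
dist = np.linspace(0, 300, 601)                       # km along the transect, 0.5 km spacing
trend = 25 - 0.05 * dist                              # salinity-like linear gradient
patches = 2.0 * np.sin(2 * np.pi * dist / 40)         # ~40 km patch structure
y = trend + patches + 0.5 * rng.standard_normal(dist.size)

def patch_size(x, values, order):
    resid = values - np.polyval(np.polyfit(x, values, order), x)
    r = resid - resid.mean()
    acf = np.correlate(r, r, mode="full")[r.size - 1:]
    acf /= acf[0]
    first_zero = np.argmax(acf < 0)                   # first lag where autocorrelation crosses zero
    return first_zero * (x[1] - x[0])

print("patch size, linear detrend   :", patch_size(dist, y, 1), "km")
print("patch size, quadratic detrend:", patch_size(dist, y, 2), "km")
```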

  16. Identifying the fingerprints of the anthropogenic component of land use/land cover changes on regional climate of the USA high plains

    NASA Astrophysics Data System (ADS)

    Mutiibwa, D.; Irmak, S.

    2011-12-01

    The majority of recent climate change studies have focused largely on the detection and attribution of anthropogenic forcings from greenhouse gases, aerosols, and stratospheric and tropospheric ozone. However, there is growing evidence that land cover/land use (LULC) change can significantly impact atmospheric processes and, through them, local to regional weather and climate variability. Human activities such as the conversion of natural ecosystems to croplands and urban centers, deforestation, and afforestation alter the biophysical properties of the land surface, including albedo, energy balance, soil moisture-holding capacity, and surface roughness. Alterations in these properties affect the heat and moisture exchanges between the land surface and the atmospheric boundary layer, and ultimately impact the climate system. The challenge is to demonstrate that LULC changes produce a signal that can be discerned from natural climate noise. In this study, we attempt to detect the signature of the anthropogenic forcing of LULC change on climate at the regional scale. The signal projector investigated for detecting the signature of LULC changes on the regional climate of the High Plains of the USA is the Normalized Difference Vegetation Index (NDVI). NDVI is an indicator that captures the short- and long-term geographical distribution of vegetated surfaces. The study develops an enhanced signal processing procedure to maximize the signal-to-noise ratio by introducing an ARMA-based pre-filtering technique applied to the investigated climate and signal variables before the optimal fingerprinting technique is used to detect the signal of LULC changes in the observed climate variable, temperature, in the High Plains. The intent is to filter out as much noise as possible while still retaining the essential features of the signal, by making use of the known characteristics of the noise and the anticipated signal. The study discusses the approach of identifying and suppressing autocorrelation in optimal fingerprint analysis by applying linear transformations of ARMA processes to the analysis variables. With the assumption that natural climate variability is a near-stationary process, the pre-filters are developed to generate stationary residuals. The High Plains region, although impacted by droughts over the last three decades, has seen an increase in agricultural lands, both irrigated and non-irrigated. The study shows that, over most of the High Plains region, evaporative cooling has a significant influence on regional climate during the summer months. As vegetation coverage increases, coupled with an increase in irrigation, regional daytime surface energy in summer is increasingly redistributed into latent heat flux, which strengthens the effect of evaporative cooling on summer temperatures. We included the anthropogenic forcing of CO2 on regional climate with the main purpose of separating the radiative heating effect of greenhouse gases from natural climate noise and thereby enhancing the LULC signal-to-noise ratio. The warming signal due to greenhouse gas forcing is observed to be weakest in the central part of the High Plains. The results showed that the CO2 signal in the region was weak or masked by the evaporative cooling effect.
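
    The ARMA pre-filtering step described above can be illustrated with a simple AR(1) pre-whitening of both the observed series and the candidate fingerprint before estimating the scaling factor. The synthetic series, the AR(1) noise model, and the ordinary-least-squares amplitude estimate below are illustrative simplifications of the optimal fingerprinting procedure, not the study's implementation.

```python
# AR(1) pre-whitening sketch before a fingerprint regression (synthetic data).
import numpy as np

rng = np.random.default_rng(4)
n = 60                                                # years
signal = np.linspace(0, 1, n)                         # stand-in for the NDVI/LULC fingerprint
phi_true = 0.6
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = phi_true * noise[i - 1] + rng.normal(0, 0.3)
temp = 0.8 * signal + noise                           # observed temperature anomaly

# Estimate the AR(1) coefficient from detrended observations (lag-1 autocorrelation)
resid = temp - np.polyval(np.polyfit(signal, temp, 1), signal)
phi = np.corrcoef(resid[:-1], resid[1:])[0, 1]

def prewhiten(x, phi):
    return x[1:] - phi * x[:-1]                       # the ARMA-type linear transformation

yw, xw = prewhiten(temp, phi), prewhiten(signal, phi)
beta = np.sum(xw * yw) / np.sum(xw * xw)              # scaling factor of the fingerprint
se = np.sqrt(np.sum((yw - beta * xw) ** 2) / (xw.size - 1) / np.sum(xw * xw))
print(f"phi ~ {phi:.2f}, signal amplitude ~ {beta:.2f} +/- {2 * se:.2f} (2 s.e.)")
```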

  17. Brain Signal Variability Differentially Affects Cognitive Flexibility and Cognitive Stability.

    PubMed

    Armbruster-Genç, Diana J N; Ueltzhöffer, Kai; Fiebach, Christian J

    2016-04-06

    Recent research yielded the intriguing conclusion that, in healthy adults, higher levels of variability in neuronal processes are beneficial for cognitive functioning. Beneficial effects of variability in neuronal processing can also be inferred from neurocomputational theories of working memory, albeit this holds only for tasks requiring cognitive flexibility. However, cognitive stability, i.e., the ability to maintain a task goal in the face of irrelevant distractors, should suffer under high levels of brain signal variability. To directly test this prediction, we studied both behavioral and brain signal variability during cognitive flexibility (i.e., task switching) and cognitive stability (i.e., distractor inhibition) in a sample of healthy human subjects and developed an efficient and easy-to-implement analysis approach to assess BOLD-signal variability in event-related fMRI task paradigms. Results show a general positive effect of neural variability on task performance as assessed by accuracy measures. However, higher levels of BOLD-signal variability in the left inferior frontal junction area result in reduced error rate costs during task switching and thus facilitate cognitive flexibility. In contrast, variability in the same area has a detrimental effect on cognitive stability, as shown in a negative effect of variability on response time costs during distractor inhibition. This pattern was mirrored at the behavioral level, with higher behavioral variability predicting better task switching but worse distractor inhibition performance. Our data extend previous results on brain signal variability by showing a differential effect of brain signal variability that depends on task context, in line with predictions from computational theories. Recent neuroscientific research showed that the human brain signal is intrinsically variable and suggested that this variability improves performance. Computational models of prefrontal neural networks predict differential effects of variability for different behavioral situations requiring either cognitive flexibility or stability. However, this hypothesis has so far not been put to an empirical test. In this study, we assessed cognitive flexibility and cognitive stability, and, besides a generally positive effect of neural variability on accuracy measures, we show that neural variability in a prefrontal brain area at the inferior frontal junction is differentially associated with performance: higher levels of variability are beneficial for the effectiveness of task switching (cognitive flexibility) but detrimental for the efficiency of distractor inhibition (cognitive stability). Copyright © 2016 the authors 0270-6474/16/363978-10$15.00/0.

  18. Advanced metrology by offline SEM data processing

    NASA Astrophysics Data System (ADS)

    Lakcher, Amine; Schneider, Loïc; Le-Gratiet, Bertrand; Ducoté, Julien; Farys, Vincent; Besacier, Maxime

    2017-06-01

    Today's technology nodes contain more and more complex designs bringing increasing challenges to chip manufacturing process steps. It is necessary to have an efficient metrology to assess process variability of these complex patterns and thus extract relevant data to generate process aware design rules and to improve OPC models. Today process variability is mostly addressed through the analysis of in-line monitoring features which are often designed to support robust measurements and as a consequence are not always very representative of critical design rules. CD-SEM is the main CD metrology technique used in chip manufacturing process but it is challenged when it comes to measure metrics like tip to tip, tip to line, areas or necking in high quantity and with robustness. CD-SEM images contain a lot of information that is not always used in metrology. Suppliers have provided tools that allow engineers to extract the SEM contours of their features and to convert them into a GDS. Contours can be seen as the signature of the shape as it contains all the dimensional data. Thus the methodology is to use the CD-SEM to take high quality images then generate SEM contours and create a data base out of them. Contours are used to feed an offline metrology tool that will process them to extract different metrics. It was shown in two previous papers that it is possible to perform complex measurements on hotspots at different process steps (lithography, etch, copper CMP) by using SEM contours with an in-house offline metrology tool. In the current paper, the methodology presented previously will be expanded to improve its robustness and combined with the use of phylogeny to classify the SEM images according to their geometrical proximities.

  19. The definition of polytrauma revisited: An international consensus process and proposal of the new 'Berlin definition'

    PubMed

    Pape, Hans-Christoph; Lefering, Rolf; Butcher, Nerida; Peitzman, Andrew; Leenen, Luke; Marzi, Ingo; Lichte, Philip; Josten, Christoph; Bouillon, Bertil; Schmucker, Uli; Stahel, Philip; Giannoudis, Peter; Balogh, Zsolt

    2014-11-01

    The nomenclature for patients with multiple injuries and high mortality rates is highly variable, and a uniform definition of the term polytrauma is lacking. A consensus process was therefore initiated by a panel of international experts with the goal of developing an improved, database-supported definition for the polytraumatized patient. The consensus process combined a series of expert meetings with an analysis of a trauma registry, with the following results: A total of 28,211 patients in the trauma registry met the inclusion criteria. The mean (SD) age of the study cohort was 42.9 (20.2) years (72% males, 28% females). The mean (SD) ISS was 30.5 (12.2), with an overall mortality rate of 18.7% (n = 5,277) and an incidence of 3% of penetrating injuries (n = 886). Five independent physiologic variables were identified, and their individual cutoff values were calculated based on a set mortality rate of 30%: hypotension (systolic blood pressure ≤ 90 mm Hg), level of consciousness (Glasgow Coma Scale [GCS] score ≤ 8), acidosis (base excess ≤ -6.0), coagulopathy (international normalized ratio ≥ 1.4/partial thromboplastin time ≥ 40 seconds), and age (≥70 years). Based on several consensus meetings and a database analysis, the expert panel proposes the following parameters for a definition of "polytrauma": significant injuries of three or more points in two or more different anatomic AIS regions in conjunction with one or more additional variables from the five physiologic parameters. Further validation of this proposal should occur, preferably by multivariate analyses of these parameters in a separate data set.

  20. Evaluating Spatial Variability in Sediment and Phosphorus Concentration-Discharge Relationships Using Bayesian Inference and Self-Organizing Maps

    NASA Astrophysics Data System (ADS)

    Underwood, Kristen L.; Rizzo, Donna M.; Schroth, Andrew W.; Dewoolkar, Mandar M.

    2017-12-01

    Given the variable biogeochemical, physical, and hydrological processes driving fluvial sediment and nutrient export, the water science and management communities need data-driven methods to identify regions prone to production and transport under variable hydrometeorological conditions. We use Bayesian analysis to segment concentration-discharge linear regression models for total suspended solids (TSS) and particulate and dissolved phosphorus (PP, DP) using 22 years of monitoring data from 18 Lake Champlain watersheds. Bayesian inference was leveraged to estimate segmented regression model parameters and identify threshold position. The identified threshold positions demonstrated a considerable range below and above the median discharge—which has been used previously as the default breakpoint in segmented regression models to discern differences between pre and post-threshold export regimes. We then applied a Self-Organizing Map (SOM), which partitioned the watersheds into clusters of TSS, PP, and DP export regimes using watershed characteristics, as well as Bayesian regression intercepts and slopes. A SOM defined two clusters of high-flux basins, one where PP flux was predominantly episodic and hydrologically driven; and another in which the sediment and nutrient sourcing and mobilization were more bimodal, resulting from both hydrologic processes at post-threshold discharges and reactive processes (e.g., nutrient cycling or lateral/vertical exchanges of fine sediment) at prethreshold discharges. A separate DP SOM defined two high-flux clusters exhibiting a bimodal concentration-discharge response, but driven by differing land use. Our novel framework shows promise as a tool with broad management application that provides insights into landscape drivers of riverine solute and sediment export.
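
    A simplified sketch of a segmented (threshold) concentration-discharge model is given below. The study estimated the segmented-regression parameters and the threshold position with Bayesian inference; the sketch instead uses synthetic log-transformed data and a simple grid search over the breakpoint, so it illustrates only the model form, not the inference machinery or the SOM clustering step.

```python
# Segmented concentration-discharge (C-Q) regression in log-log space with a
# grid-searched breakpoint (synthetic data; not the Lake Champlain records).
import numpy as np

rng = np.random.default_rng(5)
logq = np.sort(rng.uniform(-1, 2, 200))               # log10 discharge
bp_true = 0.8
logc = np.where(logq < bp_true,
                0.2 + 0.1 * logq,
                0.2 + 0.1 * bp_true + 0.9 * (logq - bp_true)) + rng.normal(0, 0.08, logq.size)

def fit_segmented(x, y, bp):
    """Continuous two-segment linear fit with a hinge at bp."""
    X = np.column_stack([np.ones_like(x), x, np.clip(x - bp, 0, None)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return beta, rss

candidates = np.linspace(-0.5, 1.5, 201)
rss = [fit_segmented(logq, logc, bp)[1] for bp in candidates]
best_bp = candidates[int(np.argmin(rss))]
beta, _ = fit_segmented(logq, logc, best_bp)
print(f"breakpoint ~ {best_bp:.2f}, pre-threshold slope ~ {beta[1]:.2f}, "
      f"post-threshold slope ~ {beta[1] + beta[2]:.2f}")
```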

  1. Radio and γ-Ray Variability in the BL Lac PKS 0219−164: Detection of Quasi-periodic Oscillations in the Radio Light Curve

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhatta, Gopal, E-mail: gopalbhatta716@gmail.com; Mt. Suhora Observatory, Pedagogical University, ul. Podchorazych 2, 30-084 Kraków

    In this work, we explore the long-term variability properties of the blazar PKS 0219−164 in the radio and the γ-ray regime, utilizing the OVRO 15 GHz and the Fermi/LAT observations from the period 2008–2017. We found that the γ-ray emission is more variable than the radio emission, implying that the γ-ray emission possibly originated in more compact regions while the radio emission represented continuum emission from the large-scale jets. Also, in the γ-ray band, the source exhibited spectral variability, characterized by a softer-when-brighter trend, a less frequently observed feature in the high-energy emission of BL Lacs. In radio, using the Lomb–Scargle periodogram and the weighted wavelet z-transform, we detected a strong signal of quasi-periodic oscillation (QPO) with a periodicity of 270 ± 26 days and possible harmonics of 550 ± 42 and 1150 ± 157 day periods. At a time when detections of QPOs in blazars are still under debate, the observed QPO with high statistical significance (∼97%–99% global significance over underlying red-noise processes) and persistence over nearly 10 oscillations could make one of the strongest cases for the detection of QPOs in blazar light curves. We discuss various blazar models that might lead to the γ-ray and radio variability, the QPO, and the achromatic behavior seen in the high-energy emission from the source.
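
    A minimal sketch of a Lomb–Scargle period search on an unevenly sampled light curve is shown below, using a synthetic 270-day sinusoid rather than the OVRO data. The reported false-alarm probability assumes white noise; as the abstract emphasizes, a defensible QPO significance also requires comparison against simulated red-noise light curves, which is omitted here.

```python
# Lomb-Scargle period search on a synthetic, unevenly sampled light curve.
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(6)
t = np.sort(rng.uniform(0, 3000, 400))                # days, uneven sampling
period = 270.0
flux = 1.0 + 0.15 * np.sin(2 * np.pi * t / period) + 0.1 * rng.standard_normal(t.size)

ls = LombScargle(t, flux)
freq, power = ls.autopower(minimum_frequency=1 / 2000, maximum_frequency=1 / 50)
best = freq[np.argmax(power)]
print(f"best period ~ {1 / best:.0f} d, "
      f"white-noise FAP ~ {ls.false_alarm_probability(power.max()):.1e}")
```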

  2. On-chip continuous-variable quantum entanglement

    NASA Astrophysics Data System (ADS)

    Masada, Genta; Furusawa, Akira

    2016-09-01

    Entanglement is an essential feature of quantum theory and the core of the majority of quantum information science and technologies. Quantum computing is one of the most important fruits of quantum entanglement and requires not only a bipartite entangled state but also more complicated multipartite entanglement. In previous experimental works to demonstrate various entanglement-based quantum information processing, light has been extensively used. Experiments utilizing such a complicated state need highly complex optical circuits to propagate optical beams and a high level of spatial interference between different light beams to generate quantum entanglement or to efficiently perform balanced homodyne measurement. Current experiments have been performed in conventional free-space optics with large numbers of optical components and a relatively large-sized optical setup. Therefore, they are limited in stability and scalability. Integrated photonics offer new tools and additional capabilities for manipulating light in quantum information technology. Owing to integrated waveguide circuits, it is possible to stabilize and miniaturize complex optical circuits and achieve high interference of light beams. The integrated circuits have been firstly developed for discrete-variable systems and then applied to continuous-variable systems. In this article, we review the currently developed scheme for generation and verification of continuous-variable quantum entanglement such as Einstein-Podolsky-Rosen beams using a photonic chip where waveguide circuits are integrated. This includes balanced homodyne measurement of a squeezed state of light. As a simple example, we also review an experiment for generating discrete-variable quantum entanglement using integrated waveguide circuits.

  3. Study on creep of fiber reinforced ultra-high strength concrete based on strength

    NASA Astrophysics Data System (ADS)

    Peng, Wenjun; Wang, Tao

    2018-04-01

    To complement existing data on the creep performance of ultra-high-strength concrete, the long-term creep of fiber-reinforced concrete was studied in this paper. The long-term creep behavior of ultra-high-strength concrete with 0.5% PVA fiber under the same axial compression was analyzed using concrete strength (C80/C100/C120) as a variable. The results show that the creep coefficient of ultra-high-strength concrete decreases with increasing concrete strength. Compared with the ACI209R (92) and GL2000 models, the values predicted by ACI209R (92) are close to the experimental values, and a creep prediction model suitable for this experiment is proposed based on ACI209R (92).

  4. Prompt optical emission from gamma-ray bursts with multiple timescale variability of central engine activities

    NASA Astrophysics Data System (ADS)

    Xu, Si-Yao; Li, Zhuo

    2014-04-01

    Complete high-resolution light curves of GRB 080319B observed by Swift present an opportunity for detailed temporal analysis of prompt optical emission. With a two-component distribution of initial Lorentz factors, we simulate the dynamical process of shells being ejected from the central engine in the framework of the internal shock model. The emitted radiations are decomposed into different frequency ranges for a temporal correlation analysis between the light curves in different energy bands. The resulting prompt optical and gamma-ray emissions show similar temporal profiles, with both showing a superposition of a component with slow variability and a component with fast variability, except that the gamma-ray light curve is much more variable than its optical counterpart. The variability in the simulated light curves and the strong correlation with a time lag between the optical and gamma-ray emissions are in good agreement with observations of GRB 080319B. Our simulations suggest that the variations seen in the light curves stem from the temporal structure of the shells injected from the central engine of gamma-ray bursts. Future observations with high temporal resolution of prompt optical emission from GRBs, e.g., by UFFO-Pathfinder and SVOM-GWAC, will provide a useful tool for investigating the central engine activity.

  5. Controlling the COD removal of an A-stage pilot study with instrumentation and automatic process control.

    PubMed

    Miller, Mark W; Elliott, Matt; DeArmond, Jon; Kinyua, Maureen; Wett, Bernhard; Murthy, Sudhir; Bott, Charles B

    2017-06-01

    The pursuit of fully autotrophic nitrogen removal via the anaerobic ammonium oxidation (anammox) pathway has led to an increased interest in carbon removal technologies, particularly the A-stage of the adsorption/bio-oxidation (A/B) process. The high-rate operation of the A-stage and the lack of automatic process control often result in wide variations of chemical oxygen demand (COD) removal that can ultimately impact nitrogen removal in the downstream B-stage process. This study evaluated the use of dissolved oxygen (DO)- and mixed liquor suspended solids (MLSS)-based automatic control strategies, implemented with in situ on-line sensors, in the A-stage of an A/B pilot study. The objective of using these control strategies was to reduce the variability of COD removal by the A-stage and thus the variability of the effluent C/N. The use of cascade DO control in the A-stage did not impact COD removal under the conditions tested in this study, likely because the bulk DO concentration (>0.5 mg/L) was maintained above the DO half-saturation coefficient of heterotrophic organisms. MLSS-based solids retention time (SRT) control, where MLSS was used as a surrogate for SRT, did not significantly reduce the effluent C/N variability, but it was able to reduce COD removal variation in the A-stage by 90%.
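
    The two control ideas evaluated above can be caricatured in a few lines: an MLSS-based wasting controller that uses MLSS as a surrogate for SRT, and a DO loop that adjusts airflow (the pilot used cascade DO control; only a single PI loop is sketched here). All gains, limits, and setpoints below are illustrative assumptions, not the pilot's tuning.

```python
# Hedged control sketches: proportional MLSS-based wasting (SRT surrogate) and a
# simple PI loop holding DO by adjusting airflow. Numbers are illustrative only.
def mlss_srt_controller(mlss_meas, mlss_set, was_flow, gain=0.002, was_min=5.0, was_max=50.0):
    """More solids than the setpoint -> waste a little more, and vice versa."""
    was_new = was_flow + gain * (mlss_meas - mlss_set)          # m3/h
    return min(max(was_new, was_min), was_max)

class DoPI:
    """PI controller producing an airflow demand (Nm3/h) from the DO error (mg/L)."""
    def __init__(self, kp=200.0, ki=20.0, dt=1.0 / 60):
        self.kp, self.ki, self.dt, self.integral = kp, ki, dt, 0.0
    def step(self, do_meas, do_set):
        err = do_set - do_meas
        self.integral += err * self.dt
        return max(self.kp * err + self.ki * self.integral, 0.0)

print(mlss_srt_controller(mlss_meas=3200, mlss_set=3000, was_flow=20.0))  # waste a bit more
ctrl = DoPI()
print(round(ctrl.step(do_meas=0.3, do_set=0.5), 1))                        # increase airflow
```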

  6. Virtual sensors for on-line wheel wear and part roughness measurement in the grinding process.

    PubMed

    Arriandiaga, Ander; Portillo, Eva; Sánchez, Jose A; Cabanes, Itziar; Pombo, Iñigo

    2014-05-19

    Grinding is an advanced machining process for the manufacturing of valuable complex and accurate parts for high added value sectors such as aerospace, wind generation, etc. Due to the extremely severe conditions inside grinding machines, critical process variables such as part surface finish or grinding wheel wear cannot be easily and cheaply measured on-line. In this paper a virtual sensor for on-line monitoring of those variables is presented. The sensor is based on the modelling ability of Artificial Neural Networks (ANNs) for stochastic and non-linear processes such as grinding; the selected architecture is the Layer-Recurrent neural network. The sensor makes use of the relation between the variables to be measured and power consumption in the wheel spindle, which can be easily measured. A sensor calibration methodology is presented, and the levels of error that can be expected are discussed. Validation of the new sensor is carried out by comparing the sensor's results with actual measurements carried out in an industrial grinding machine. Results show excellent estimation performance for both wheel wear and surface roughness. In the case of wheel wear, the absolute error is within the range of microns (average value 32 μm). In the case of surface finish, the absolute error is well below Ra 1 μm (average value 0.32 μm). The present approach can be easily generalized to other grinding operations.
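
    The virtual-sensor idea described above can be sketched with a small recurrent network that maps a spindle-power time series to wheel wear and surface roughness. The original work used Layer-Recurrent ANNs (a MATLAB architecture); the PyTorch Elman RNN below is an analogous stand-in, trained on synthetic data rather than grinding measurements.

```python
# Recurrent-network virtual sensor sketch in PyTorch (synthetic stand-in data).
import torch
import torch.nn as nn

torch.manual_seed(0)
B, T = 64, 50                                          # grinding cycles, samples per cycle
power = torch.rand(B, T, 1)                            # normalized spindle power signal
# Synthetic targets: wear and roughness loosely tied to accumulated power
wear = power.cumsum(dim=1)[:, -1, :] / T
rough = 0.5 * wear + 0.1 * torch.rand(B, 1)
targets = torch.cat([wear, rough], dim=1)              # [wheel wear, Ra]

class VirtualSensor(nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.rnn = nn.RNN(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)               # two outputs: wear and roughness
    def forward(self, x):
        out, _ = self.rnn(x)
        return self.head(out[:, -1, :])                # predict from the last time step

model = VirtualSensor()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(power), targets)
    loss.backward()
    opt.step()
print("final training MSE:", round(loss.item(), 4))
```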

  7. Multiscale Model Simulations of Temperature and Relative Humidity for the License Application of the Proposed Yucca Mountain Repository

    NASA Astrophysics Data System (ADS)

    Buscheck, T.; Glascoe, L.; Sun, Y.; Gansemer, J.; Lee, K.

    2003-12-01

    For the proposed Yucca Mountain geologic repository for high-level nuclear waste, the planned method of disposal involves the emplacement of cylindrical packages containing the waste inside horizontal tunnels, called emplacement drifts, bored several hundred meters below the ground surface. The emplacement drifts reside in highly fractured, partially saturated volcanic tuff. An important phenomenological consideration for the licensing of the proposed repository at Yucca Mountain is the generation of decay heat by the emplaced waste and the consequences of this decay heat. Changes in temperature will affect the hydrologic and chemical environment at Yucca Mountain. A thermohydrologic-modeling tool is necessary to support the performance assessment of the Engineered Barrier System (EBS) of the proposed repository. This modeling tool must simultaneously account for processes occurring at a scale of a few tens of centimeters around individual waste packages, for processes occurring around the emplacement drifts themselves, and for processes occurring at the multi-kilometer scale of the mountain. Additionally, many other features must be considered including non-isothermal, multiphase-flow in fractured porous rock of variable liquid-phase saturation and thermal radiation and convection in open cavities. The Multiscale Thermohydrologic Model (MSTHM) calculates the following thermohydrologic (TH) variables: temperature, relative humidity, liquid-phase saturation, evaporation rate, air-mass fraction, gas-phase pressure, capillary pressure, and liquid- and gas-phase fluxes. The TH variables are determined as a function of position along each of the emplacement drifts in the repository and as a function of waste-package (WP) type. These variables are determined at various generic locations within the emplacement drifts, including the waste package and drip-shield surfaces and in the invert; they are also determined at various generic locations in the adjoining host rock; these variables are determined every 20 m for each emplacement drift in the repository. The MSTHM accounts for 3-D drift-scale and mountain-scale heat flow and captures the influence of the key engineering-design variables and natural-system factors affecting TH conditions in the emplacement drifts and adjoining host rock. Presented is a synopsis of recent MSTHM calculations conducted to support the Total System Performance Assessment for the License Application (TSPA-LA). This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.

  8. Exploring heterogeneity in clinical trials with latent class analysis

    PubMed Central

    Abarda, Abdallah; Contractor, Ateka A.; Wang, Juan; Dayton, C. Mitchell

    2018-01-01

    Case-mix is common in clinical trials and treatment effect can vary across different subgroups. Conventionally, a subgroup analysis is performed by dividing the overall study population by one or two grouping variables. It is usually impossible to explore complex high-order intersections among confounding variables. Latent class analysis (LCA) provides a framework to identify latent classes by observed manifest variables. Distal clinical outcomes and treatment effect can be different across these classes. This paper provides a step-by-step tutorial on how to perform LCA with R. A simulated dataset is generated to illustrate the process. In the example, the classify-analyze approach is employed to explore the differential treatment effects on distal outcomes across latent classes. PMID:29955579
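
    The paper's tutorial is written for R, but the underlying latent class model can be sketched in Python as well. The example below simulates binary manifest variables from two known classes and recovers the class proportions and item probabilities with a basic EM algorithm; it illustrates the model only, not the classify-analyze workflow or the R code from the paper.

```python
# Basic EM for a latent class model with binary manifest variables (simulated data).
import numpy as np

rng = np.random.default_rng(7)
n, k, j = 1000, 2, 4                                   # subjects, latent classes, manifest items
true_pi = np.array([0.6, 0.4])                         # class proportions
true_p = np.array([[0.9, 0.8, 0.7, 0.9],               # item-endorsement probs, class 0
                   [0.2, 0.3, 0.1, 0.2]])              # class 1
z = rng.choice(k, size=n, p=true_pi)
x = (rng.random((n, j)) < true_p[z]).astype(float)

pi = np.full(k, 1.0 / k)                               # initial class proportions
p = rng.uniform(0.3, 0.7, size=(k, j))                 # initial item probabilities
for _ in range(200):
    # E-step: posterior class membership given the observed responses
    logpost = x @ np.log(p).T + (1 - x) @ np.log(1 - p).T + np.log(pi)
    logpost -= logpost.max(axis=1, keepdims=True)
    resp = np.exp(logpost)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update class sizes and conditional item probabilities
    pi = resp.mean(axis=0)
    p = np.clip((resp.T @ x) / resp.sum(axis=0)[:, None], 1e-4, 1 - 1e-4)

print("estimated class proportions:", np.round(pi, 2))
print("estimated item probabilities:\n", np.round(p, 2))
```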

  9. Climate variability drives recent tree mortality in Europe.

    PubMed

    Neumann, Mathias; Mues, Volker; Moreno, Adam; Hasenauer, Hubert; Seidl, Rupert

    2017-11-01

    Tree mortality is an important process in forest ecosystems, frequently hypothesized to be highly climate sensitive. Yet, tree death remains one of the least understood processes of forest dynamics. Recently, changes in tree mortality have been observed in forests around the globe, which could profoundly affect ecosystem functioning and the provisioning of services to society. We describe continental-scale patterns of recent tree mortality from the only consistent pan-European forest monitoring network, identifying recent mortality hotspots in southern and northern Europe. Analyzing 925,462 annual observations of 235,895 trees between 2000 and 2012, we determine the influence of climate variability and tree age on interannual variation in tree mortality using Cox proportional hazard models. Warm summers as well as high seasonal variability in precipitation increased the likelihood of tree death. However, our data also suggest that reduced cold-induced mortality could compensate for increased mortality related to peak temperatures in a warming climate. Besides climate variability, age was an important driver of tree mortality, with individual mortality probability decreasing with age over the first century of a tree's life. A considerable portion of the observed variation in tree mortality could be explained by satellite-derived net primary productivity, suggesting that widely available remote sensing products can be used as an early warning indicator of widespread tree mortality. Our findings advance the understanding of patterns of large-scale tree mortality by demonstrating the influence of seasonal and diurnal climate variation, and highlight the potential of state-of-the-art remote sensing to anticipate an increased likelihood of tree mortality in space and time. © 2017 John Wiley & Sons Ltd.
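
    A hedged sketch of a Cox proportional hazards fit in the spirit of the analysis above is given below, using the lifelines package on simulated tree-level covariates (summer temperature anomaly, precipitation variability, age) and censored survival times; the covariates, effect sizes, and censoring scheme are invented, not derived from the monitoring network data.

```python
# Cox proportional hazards sketch with lifelines on simulated tree-level data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(8)
n = 2000
df = pd.DataFrame({
    "summer_temp_anom": rng.normal(0, 1, n),
    "precip_variability": rng.normal(0, 1, n),
    "age_100yr": rng.uniform(0.2, 3.0, n),
})
# Simulate death times: hazard rises with warm summers and precipitation
# variability, and falls with age (as reported for the first century of life)
lin = (0.5 * df["summer_temp_anom"] + 0.3 * df["precip_variability"]
       - 0.4 * df["age_100yr"]).to_numpy()
t = rng.exponential(scale=10.0 / np.exp(lin))
df["years_observed"] = np.minimum(t, 12)              # administrative censoring at 12 years
df["died"] = (t <= 12).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="years_observed", event_col="died")
print(cph.summary[["coef", "exp(coef)", "p"]])
```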

  10. Progress in Modeling Global Atmospheric CO2 Fluxes and Transport: Results from Simulations with Diurnal Fluxes

    NASA Technical Reports Server (NTRS)

    Collatz, G. James; Kawa, R.

    2007-01-01

    Progress in better determining CO2 sources and sinks will almost certainly rely on utilization of more extensive and intensive CO2 and related observations including those from satellite remote sensing. Use of advanced data requires improved modeling and analysis capability. Under NASA Carbon Cycle Science support we seek to develop and integrate improved formulations for 1) atmospheric transport, 2) terrestrial uptake and release, 3) biomass and 4) fossil fuel burning, and 5) observational data analysis including inverse calculations. The transport modeling is based on meteorological data assimilation analysis from the Goddard Modeling and Assimilation Office. Use of assimilated met data enables model comparison to CO2 and other observations across a wide range of scales of variability. In this presentation we focus on the short end of the temporal variability spectrum: hourly to synoptic to seasonal. Using CO2 fluxes at varying temporal resolution from the SIB 2 and CASA biosphere models, we examine the model's ability to simulate CO2 variability in comparison to observations at different times, locations, and altitudes. We find that the model can resolve much of the variability in the observations, although there are limits imposed by vertical resolution of boundary layer processes. The influence of key process representations is inferred. The high degree of fidelity in these simulations leads us to anticipate incorporation of realtime, highly resolved observations into a multiscale carbon cycle analysis system that will begin to bridge the gap between top-down and bottom-up flux estimation, which is a primary focus of NACP.

  11. Adult Demography and Larval Processes in Coastal Benthic Populations: Intertidal Barnacles in Southern California and Baja California

    DTIC Science & Technology

    2005-09-01

    The dynamics of coastal populations is highly dependent on the mechanisms and the physical processes that are likely to affect the distribution...

  12. MODFLOW-2000, the U.S. Geological Survey Modular Ground-Water Model--Documentation of the SEAWAT-2000 Version with the Variable-Density Flow Process (VDF) and the Integrated MT3DMS Transport Process (IMT)

    USGS Publications Warehouse

    Langevin, Christian D.; Shoemaker, W. Barclay; Guo, Weixing

    2003-01-01

    SEAWAT-2000 is the latest release of the SEAWAT computer program for simulation of three-dimensional, variable-density, transient ground-water flow in porous media. SEAWAT-2000 was designed by combining a modified version of MODFLOW-2000 and MT3DMS into a single computer program. The code was developed using the MODFLOW-2000 concept of a process, which is defined as "part of the code that solves a fundamental equation by a specified numerical method." SEAWAT-2000 contains all of the processes distributed with MODFLOW-2000 and also includes the Variable-Density Flow Process (as an alternative to the constant-density Ground-Water Flow Process) and the Integrated MT3DMS Transport Process. Processes may be active or inactive, depending on simulation objectives; however, not all processes are compatible. For example, the Sensitivity and Parameter Estimation Processes are not compatible with the Variable-Density Flow and Integrated MT3DMS Transport Processes. The SEAWAT-2000 computer code was tested with the common variable-density benchmark problems and also with problems representing evaporation from a salt lake and rotation of immiscible fluids.

  13. Processes governing transient responses of the deep ocean buoyancy budget to a doubling of CO2

    NASA Astrophysics Data System (ADS)

    Palter, J. B.; Griffies, S. M.; Hunter Samuels, B. L.; Galbraith, E. D.; Gnanadesikan, A.

    2012-12-01

    Recent observational analyses suggest there is a temporal trend and high-frequency variability in deep ocean buoyancy over the last twenty years, a phenomenon reproduced even in low-mixing models. Here we use an earth system model (GFDL's ESM2M) to evaluate the physical processes that influence the buoyancy (and thus steric sea level) budget of the deep ocean in quasi-steady state and under a doubling of CO2. A new suite of model diagnostics allows us to quantitatively assess every process that influences the buoyancy budget and its temporal evolution, revealing surprising dynamics governing both the equilibrium budget and its transient response to climate change. The results suggest that the temporal evolution of the deep ocean contribution to sea level rise is due to a diversity of processes at high latitudes, whose net effect is then advected in the Eulerian mean flow to mid and low latitudes. In the Southern Ocean, a slowdown in convection and spin up of the residual mean advection are approximately equal players in the deep steric sea level rise. In the North Atlantic, the region of greatest deep steric sea level variability in our simulations, a decrease in mixing of cold, dense waters from the marginal seas and a reduction in open ocean convection cause an accumulation of buoyancy in the deep subpolar gyre, which is then advected equatorward.

  14. QbD for pediatric oral lyophilisates development: risk assessment followed by screening and optimization.

    PubMed

    Casian, Tibor; Iurian, Sonia; Bogdan, Catalina; Rus, Lucia; Moldovan, Mirela; Tomuta, Ioan

    2017-12-01

    This study proposed the development of oral lyophilisates with respect to pediatric medicine development guidelines, by applying risk management strategies and DoE as an integrated QbD approach. Product critical quality attributes were overviewed by generating Ishikawa diagrams for risk assessment purposes, considering process, formulation and methodology related parameters. Failure Mode Effect Analysis was applied to highlight critical formulation and process parameters with an increased probability of occurrence and with a high impact on the product performance. To investigate the effect of qualitative and quantitative formulation variables, D-optimal designs were used for screening and optimization purposes. Process parameters related to suspension preparation and lyophilization were classified as significant factors, and were controlled by implementing risk mitigation strategies. Both quantitative and qualitative formulation variables introduced in the experimental design influenced the product's disintegration time, mechanical resistance and dissolution properties, which were selected as CQAs. The optimum formulation selected through the Design Space presented an ultra-fast disintegration time (5 seconds) and a good dissolution rate (above 90%), combined with a high mechanical resistance (above 600 g load). Combining FMEA and DoE allowed the science-based development of a product with respect to the defined quality target profile by providing better insights into the relevant parameters throughout the development process. The utility of risk management tools in pharmaceutical development was demonstrated.
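
    FMEA prioritization of the kind mentioned above is usually bookkept through a risk priority number, RPN = severity x occurrence x detectability, with high-RPN parameters flagged as critical. The sketch below shows that calculation for a few hypothetical formulation and process parameters; the failure modes and scores are illustrative, not taken from the study.

      # FMEA bookkeeping sketch: rank failure modes by risk priority number (RPN).
      # Scores (1-10) are hypothetical, not taken from the cited study.
      failure_modes = [
          # (parameter, severity, occurrence, detectability)
          ("lyophilization shelf temperature drift", 8, 4, 6),
          ("suspension viscosity out of range",       6, 5, 4),
          ("matrix former concentration error",       7, 3, 5),
          ("filling volume variability",              5, 6, 3),
      ]

      ranked = sorted(
          ((p, s * o * d) for p, s, o, d in failure_modes),
          key=lambda item: item[1],
          reverse=True,
      )
      for parameter, rpn in ranked:
          flag = "CRITICAL" if rpn >= 100 else "monitor"
          print(f"{parameter:<42} RPN={rpn:<4} {flag}")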

  15. Intrinsic vs. spurious long-range memory in high-frequency records of environmental radioactivity. Critical re-assessment and application to indoor 222Rn concentrations from Coimbra, Portugal

    NASA Astrophysics Data System (ADS)

    Donner, R. V.; Potirakis, S. M.; Barbosa, S. M.; Matos, J. A. O.; Pereira, A. J. S. C.; Neves, L. J. P. F.

    2015-05-01

    The presence or absence of long-range correlations in the environmental radioactivity fluctuations has recently attracted considerable interest. Among a multiplicity of practically relevant applications, identifying and disentangling the environmental factors controlling the variable concentrations of the radioactive noble gas radon is important for estimating its effect on human health and the efficiency of possible measures for reducing the corresponding exposure. In this work, we present a critical re-assessment of a multiplicity of complementary methods that have been previously applied for evaluating the presence of long-range correlations and fractal scaling in environmental radon variations with a particular focus on the specific properties of the underlying time series. As an illustrative case study, we subsequently re-analyze two high-frequency records of indoor radon concentrations from Coimbra, Portugal, each of which spans several weeks of continuous measurements at a high temporal resolution of five minutes. Our results reveal that at the study site, radon concentrations exhibit complex multi-scale dynamics with qualitatively different properties at different time-scales: (i) essentially white noise in the high-frequency part (up to time-scales of about one hour), (ii) spurious indications of a non-stationary, apparently long-range correlated process (at time scales between some hours and one day) arising from marked periodic components, and (iii) low-frequency variability indicating a true long-range dependent process. In the presence of such multi-scale variability, common estimators of long-range memory in time series are prone to fail if applied to the raw data without previous separation of time-scales with qualitatively different dynamics.
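
    One widely used estimator of long-range memory in such records is detrended fluctuation analysis (DFA), whose scaling exponent is about 0.5 for white noise and rises above 0.5 for long-range correlated signals. The sketch below is a minimal NumPy implementation applied to synthetic series, not to the Coimbra radon records, and illustrates why scale separation matters: the exponent differs sharply between an uncorrelated and a strongly persistent signal.

      # Minimal detrended fluctuation analysis (DFA) sketch on synthetic data.
      # alpha ~ 0.5 indicates white noise; alpha > 0.5 indicates persistence.
      import numpy as np

      def dfa_exponent(x, scales):
          profile = np.cumsum(x - np.mean(x))          # integrated, mean-removed signal
          fluct = []
          for s in scales:
              n_seg = len(profile) // s
              segs = profile[: n_seg * s].reshape(n_seg, s)
              t = np.arange(s)
              rms = []
              for seg in segs:
                  coef = np.polyfit(t, seg, 1)         # local linear detrending
                  rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
              fluct.append(np.mean(rms))
          # Scaling exponent alpha is the slope of log F(s) versus log s.
          alpha, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
          return alpha

      rng = np.random.default_rng(2)
      scales = np.array([16, 32, 64, 128, 256, 512])
      white = rng.normal(size=20000)
      walk = np.cumsum(rng.normal(size=20000))         # integrated noise: strongly persistent
      print("white noise alpha      ~", round(dfa_exponent(white, scales), 2))
      print("integrated noise alpha ~", round(dfa_exponent(walk, scales), 2))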

  16. Human Stressors Are Driving Coastal Benthic Long-Lived Sessile Fan Mussel Pinna nobilis Population Structure More than Environmental Stressors.

    PubMed

    Deudero, Salud; Vázquez-Luis, Maite; Álvarez, Elvira

    2015-01-01

    Coastal degradation and habitat disruption are severely compromising sessile marine species. The fan shell Pinna nobilis is an endemic, vulnerable species and the largest bivalve in the Mediterranean basin. In spite of the species' legal protection, fan shell populations are declining. Models analyzed the contributions of environmental stressors (mean depth, wave height, maximum wave height, period of waves with high energy and mean direction of wave source) versus human-derived stressors (anchoring, protection status, sewage effluents, fishing activity and diving) as explanatory variables depicting Pinna nobilis populations at a mesoscale level. Human stressors explained most of the variability in the spatial distribution of fan shell density, significantly disturbing benthic communities. Habitat protection affected P. nobilis population structure, and physical aggression by anchoring had a high impact on densities. Environmental variables instead played a secondary role, indicating that global change processes are less relevant to coastal benthic communities than human-derived impacts.

  17. A Marked Poisson Process Driven Latent Shape Model for 3D Segmentation of Reflectance Confocal Microscopy Image Stacks of Human Skin.

    PubMed

    Ghanta, Sindhu; Jordan, Michael I; Kose, Kivanc; Brooks, Dana H; Rajadhyaksha, Milind; Dy, Jennifer G

    2017-01-01

    Segmenting objects of interest from 3D data sets is a common problem encountered in biological data. Small field of view and intrinsic biological variability combined with optically subtle changes of intensity, resolution, and low contrast in images make the task of segmentation difficult, especially for microscopy of unstained living or freshly excised thick tissues. Incorporating shape information in addition to the appearance of the object of interest can often help improve segmentation performance. However, the shapes of objects in tissue can be highly variable and design of a flexible shape model that encompasses these variations is challenging. To address such complex segmentation problems, we propose a unified probabilistic framework that can incorporate the uncertainty associated with complex shapes, variable appearance, and unknown locations. The driving application that inspired the development of this framework is a biologically important segmentation problem: the task of automatically detecting and segmenting the dermal-epidermal junction (DEJ) in 3D reflectance confocal microscopy (RCM) images of human skin. RCM imaging allows noninvasive observation of cellular, nuclear, and morphological detail. The DEJ is an important morphological feature as it is where disorder, disease, and cancer usually start. Detecting the DEJ is challenging, because it is a 2D surface in a 3D volume which has strong but highly variable number of irregularly spaced and variably shaped "peaks and valleys." In addition, RCM imaging resolution, contrast, and intensity vary with depth. Thus, a prior model needs to incorporate the intrinsic structure while allowing variability in essentially all its parameters. We propose a model which can incorporate objects of interest with complex shapes and variable appearance in an unsupervised setting by utilizing domain knowledge to build appropriate priors of the model. Our novel strategy to model this structure combines a spatial Poisson process with shape priors and performs inference using Gibbs sampling. Experimental results show that the proposed unsupervised model is able to automatically detect the DEJ with physiologically relevant accuracy in the range 10- 20 μm .

  18. A Marked Poisson Process Driven Latent Shape Model for 3D Segmentation of Reflectance Confocal Microscopy Image Stacks of Human Skin

    PubMed Central

    Ghanta, Sindhu; Jordan, Michael I.; Kose, Kivanc; Brooks, Dana H.; Rajadhyaksha, Milind; Dy, Jennifer G.

    2016-01-01

    Segmenting objects of interest from 3D datasets is a common problem encountered in biological data. Small field of view and intrinsic biological variability combined with optically subtle changes of intensity, resolution and low contrast in images make the task of segmentation difficult, especially for microscopy of unstained living or freshly excised thick tissues. Incorporating shape information in addition to the appearance of the object of interest can often help improve segmentation performance. However, shapes of objects in tissue can be highly variable and design of a flexible shape model that encompasses these variations is challenging. To address such complex segmentation problems, we propose a unified probabilistic framework that can incorporate the uncertainty associated with complex shapes, variable appearance and unknown locations. The driving application which inspired the development of this framework is a biologically important segmentation problem: the task of automatically detecting and segmenting the dermal-epidermal junction (DEJ) in 3D reflectance confocal microscopy (RCM) images of human skin. RCM imaging allows noninvasive observation of cellular, nuclear and morphological detail. The DEJ is an important morphological feature as it is where disorder, disease and cancer usually start. Detecting the DEJ is challenging because it is a 2D surface in a 3D volume which has strong but highly variable number of irregularly spaced and variably shaped “peaks and valleys”. In addition, RCM imaging resolution, contrast and intensity vary with depth. Thus a prior model needs to incorporate the intrinsic structure while allowing variability in essentially all its parameters. We propose a model which can incorporate objects of interest with complex shapes and variable appearance in an unsupervised setting by utilizing domain knowledge to build appropriate priors of the model. Our novel strategy to model this structure combines a spatial Poisson process with shape priors and performs inference using Gibbs sampling. Experimental results show that the proposed unsupervised model is able to automatically detect the DEJ with physiologically relevant accuracy in the range 10 – 20µm. PMID:27723590
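
    The prior described above builds on a marked spatial Poisson process: the number of surface features in a region follows a Poisson law with some intensity per unit area, locations are uniform, and each point carries marks such as local shape parameters. The sketch below shows only the sampling step for such a process with an illustrative intensity and mark distribution; the papers' actual priors and the Gibbs-sampling inference are not reproduced.

      # Sampling sketch for a marked homogeneous spatial Poisson process:
      # point count ~ Poisson(intensity * area), locations uniform, marks attached per point.
      # The intensity and mark distribution are illustrative placeholders.
      import numpy as np

      rng = np.random.default_rng(3)

      def sample_marked_poisson(intensity, width, height):
          n_points = rng.poisson(intensity * width * height)   # count over the region
          xy = rng.uniform([0.0, 0.0], [width, height], size=(n_points, 2))
          # Each point carries a mark, e.g. a local "peak amplitude" for the shape prior.
          marks = rng.gamma(shape=2.0, scale=5.0, size=n_points)
          return xy, marks

      locations, amplitudes = sample_marked_poisson(intensity=0.02, width=100.0, height=100.0)
      print(f"sampled {len(locations)} peaks; mean amplitude {amplitudes.mean():.1f}")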

  19. Evolutionary search for new high-k dielectric materials: methodology and applications to hafnia-based oxides.

    PubMed

    Zeng, Qingfeng; Oganov, Artem R; Lyakhov, Andriy O; Xie, Congwei; Zhang, Xiaodong; Zhang, Jin; Zhu, Qiang; Wei, Bingqing; Grigorenko, Ilya; Zhang, Litong; Cheng, Laifei

    2014-02-01

    High-k dielectric materials are important as gate oxides in microelectronics and as potential dielectrics for capacitors. In order to enable computational discovery of novel high-k dielectric materials, we propose a fitness model (energy storage density) that includes the dielectric constant, bandgap, and intrinsic breakdown field. This model, used as a fitness function in conjunction with first-principles calculations and the global optimization evolutionary algorithm USPEX, efficiently leads to practically important results. We found a number of high-fitness structures of SiO2 and HfO2, some of which correspond to known phases and some of which are new. The results allow us to propose characteristics (genes) common to high-fitness structures--these are the coordination polyhedra and their degree of distortion. Our variable-composition searches in the HfO2-SiO2 system uncovered several high-fitness states. This hybrid algorithm opens up a new avenue for discovering novel high-k dielectrics with both fixed and variable compositions, and will speed up the process of materials discovery.
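
    The fitness combines dielectric constant, bandgap, and intrinsic breakdown field into an energy-storage figure of merit. The paper's exact functional form is not given here; as an illustration of how candidates might be ranked, the sketch below evaluates the textbook electrostatic energy density u = (1/2) * eps0 * eps_r * E_bd^2 at the breakdown field, with a simple bandgap screen. The threshold and candidate values are hypothetical.

      # Hedged sketch of a high-k screening fitness: electrostatic energy density at the
      # breakdown field, u = 0.5 * eps0 * eps_r * E_bd**2, with a simple bandgap screen.
      # This is an illustrative ranking criterion, not the paper's exact fitness model.
      EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

      def fitness(eps_r, band_gap_ev, breakdown_field_v_per_m, min_gap_ev=4.0):
          if band_gap_ev < min_gap_ev:          # leaky candidates screened out
              return 0.0
          return 0.5 * EPS0 * eps_r * breakdown_field_v_per_m ** 2   # J/m^3

      # Hypothetical candidate structures (eps_r, gap in eV, breakdown field in V/m).
      candidates = {
          "structure_A": (25.0, 5.6, 4.0e8),
          "structure_B": (40.0, 3.1, 6.0e8),    # large eps_r but gap too small
          "structure_C": (16.0, 5.9, 7.0e8),
      }
      for name, params in sorted(candidates.items(),
                                 key=lambda kv: fitness(*kv[1]), reverse=True):
          print(f"{name}: energy density = {fitness(*params):.3e} J/m^3")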

  20. Investigating performance variability of processing, exploitation, and dissemination using a socio-technical systems analysis approach

    NASA Astrophysics Data System (ADS)

    Danczyk, Jennifer; Wollocko, Arthur; Farry, Michael; Voshell, Martin

    2016-05-01

    Data collection processes supporting Intelligence, Surveillance, and Reconnaissance (ISR) missions have recently undergone a technological transition accomplished by investment in sensor platforms. Various agencies have made these investments to increase the resolution, duration, and quality of data collection, to provide more relevant and recent data to warfighters. However, while sensor improvements have increased the volume of high-resolution data, they often fail to improve situational awareness and actionable intelligence for the warfighter because the analysis enterprise lacks efficient Processing, Exploitation, and Dissemination (PED) and filtering methods for mission-relevant information needs. The volume of collected ISR data often overwhelms manual and automated processes in modern analysis enterprises, resulting in underexploited data and insufficient or missing answers to information requests. The outcome is a significant breakdown in the analytical workflow. To cope with this data overload, many intelligence organizations have sought to re-organize their general staffing requirements and workflows to enhance team communication and coordination, with hopes of exploiting as much high-value data as possible and understanding the value of actionable intelligence well before its relevance has passed. Through this effort we have taken a scholarly approach to this problem by studying the evolution of Processing, Exploitation, and Dissemination, with a specific focus on the Army's most recent evolutions, using the Functional Resonance Analysis Method. This method investigates socio-technical processes by analyzing their intended functions and aspects to determine performance variabilities. Gaps are identified, and recommendations about force structure and future R&D priorities to increase the throughput of the intelligence enterprise are discussed.

  1. The Solid Phase Curing Time Effect of Asbuton with Texapon Emulsifier at the Optimum Bitumen Content

    NASA Astrophysics Data System (ADS)

    Sarwono, D.; Surya D, R.; Setyawan, A.; Djumari

    2017-07-01

    Buton asphalt (asbuton) has not been utilized optimally in Indonesia. The asbuton utilization rate is still low because processed asbuton products remain impractical to use and require high processing costs. This research aimed to obtain asphalt products from asbuton that are practical to use, through an extraction process that does not require expensive processing. The research was done with an experimental method in the laboratory. The components of the emulsified asbuton were asbuton 5/20 grain, premium, texapon, HCl, and aquades. The solid phase was the mixture of asbuton 5/20 grain and premium with a 3-minute mixing time. The liquid phase consisted of texapon, HCl, and aquades. An aging step was applied after the solid-phase mixing so that the reaction and binding within the mixed solid phase could become more complete, yielding a higher asphalt solubility level. The aging times were 30, 60, 90, 120, and 150 minutes. The solid and liquid phases were mixed to produce emulsified asbuton, which was then extracted for 25 minutes. The asphalt solubility level, water content, and asphalt characteristics were tested on the extraction product of the emulsified asbuton with the most optimal asphalt level. Analysis of the test data showed an asphalt solubility level of 94.77% for the extracted asbuton at the 120-minute aging time. The water-content test showed that the water content of the emulsified asbuton decreased as the solid-phase aging time increased. Examination of the asphalt characteristics of the extracted emulsified asbuton with the optimum asphalt solubility level yielded specimens with a rigid, strong texture, whose ductility and penetration values were therefore insufficient.

  2. Characterizing hyporheic exchange processes using high-frequency electrical conductivity-discharge relationships on subhourly to interannual timescales

    NASA Astrophysics Data System (ADS)

    Singley, Joel G.; Wlostowski, Adam N.; Bergstrom, Anna J.; Sokol, Eric R.; Torrens, Christa L.; Jaros, Chris; Wilson, Colleen E.; Hendrickson, Patrick J.; Gooseff, Michael N.

    2017-05-01

    Concentration-discharge (C-Q) relationships are often used to quantify source water contributions and biogeochemical processes occurring within catchments, especially during discrete hydrological events. Yet, the interpretation of C-Q hysteresis is often confounded by complexity of the critical zone, such as numerous source waters and hydrochemical nonstationarity. Consequently, researchers must often ignore important runoff pathways and geochemical sources/sinks, especially the hyporheic zone because it lacks a distinct hydrochemical signature. Such simplifications limit efforts to identify processes responsible for the transience of C-Q hysteresis over time. To address these limitations, we leverage the hydrologic simplicity and long-term, high-frequency Q and electrical conductivity (EC) data from streams in the McMurdo Dry Valleys, Antarctica. In this two end-member system, EC can serve as a proxy for the concentration of solutes derived from the hyporheic zone. We utilize a novel approach to decompose loops into subhysteretic EC-Q dynamics to identify individual mechanisms governing hysteresis across a wide range of timescales. We find that hydrologic and hydraulic processes govern EC response to diel and seasonal Q variability and that the effects of hyporheic mixing processes on C-Q transience differ in short and long streams. We also observe that variable hyporheic turnover rates govern EC-Q patterns at daily to interannual timescales. Last, subhysteretic analysis reveals a period of interannual freshening of glacial meltwater streams related to the effects of unsteady flow on hyporheic exchange. The subhysteretic analysis framework we introduce may be applied more broadly to constrain the processes controlling C-Q transience and advance understanding of catchment evolution.
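
    A simple way to quantify EC-Q hysteresis, related in spirit to the loop decomposition described above, is to normalize both variables over an event and difference the rising- and falling-limb values at matched discharge levels (a Lloyd-type hysteresis index). The sketch below applies that generic index to a synthetic event; it is not the authors' subhysteretic decomposition.

      # Generic hysteresis-index sketch for an EC-Q event: normalize EC and Q, then
      # difference rising- and falling-limb EC at matched discharge levels.
      # Synthetic event data; not the authors' subhysteretic decomposition.
      import numpy as np

      def hysteresis_index(q, c, levels=np.linspace(0.1, 0.9, 9)):
          qn = (q - q.min()) / (q.max() - q.min())
          cn = (c - c.min()) / (c.max() - c.min())
          peak = np.argmax(qn)
          rise_q, rise_c = qn[: peak + 1], cn[: peak + 1]
          fall_q, fall_c = qn[peak:][::-1], cn[peak:][::-1]   # reverse so Q is increasing
          c_rise = np.interp(levels, rise_q, rise_c)
          c_fall = np.interp(levels, fall_q, fall_c)
          # Positive: rising limb carries higher EC than falling limb at matched Q.
          return float(np.mean(c_rise - c_fall))

      # Synthetic diel event: a discharge pulse with EC dilution lagging the flow peak.
      t = np.linspace(0, 1, 200)
      q = np.exp(-((t - 0.4) / 0.15) ** 2)
      ec = 1.0 - 0.6 * np.exp(-((t - 0.55) / 0.2) ** 2)
      print("hysteresis index:", round(hysteresis_index(q, ec), 3))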

  3. Processing and property evaluation of tungsten-based mixed oxides for photovoltaics and optoelectronics

    NASA Astrophysics Data System (ADS)

    Vargas, Mirella

    Tungsten Oxide (WO3) films and low-dimensional structures have proven to be promising candidates in the fields of photonics and electronics. WO3 is a well-established n-type semiconductor characterized by unique electrochromic behavior, an ideal optical band gap that permits transparency over a wide spectral range, and high chemical integrity. The plethora of diverse properties endow WO3 to be highly effective in applications related to electrochromism, gas sensing, and deriving economical energy. Compared to the bulk films, a materials system involving WO3 and a related species (elements or metal oxides) offer the opportunity to tailor the electrochromic response, and an overall enhancement of the physio-chemical and optical properties. In the present case, WO3 and TiO2 composite films have been fabricated by reactive magnetron sputtering employing W/Ti alloy targets, and individual W and Ti targets for co-sputtering. Composite WO3-TiO2 films were fabricated with variable chemical composition and the effect of variable bulk chemistry on film structure, surface/interface chemistry and chemical valence state of the W and Ti cations was investigated in detail. The process-property relationships between composition and physical properties for the films deposited by using W/Ti alloy targets of variable Ti content are associated with decreases in the deposition rate of the WO3-TiO2 films due to the lower sputter yield of the strongly bonded TiO2 formed on the target surface. Additionally, for the co-sputtered films using variable tungsten power, the optical properties demonstrate unique optical modulation. The changes associated with the physical color of the films demonstrate the potential to tailor the optical behavior for the design and fabrication of multilayer photovoltaic and catalytic devices. The process-structure-property correlation derived in this work will provide a road-map to optimize and produce W-Ti-O thin films with desired properties for a given technological application.

  4. Polishing tool and the resulting TIF for three variable machine parameters as input for the removal simulation

    NASA Astrophysics Data System (ADS)

    Schneider, Robert; Haberl, Alexander; Rascher, Rolf

    2017-06-01

    The trend in the optics industry shows that it is increasingly important to be able to manufacture complex lens geometries at a high level of precision. Beyond a certain required shape accuracy of optical workpieces, processing changes from areal (two-dimensional) to point-contact processing, and it is very important that the process remains as stable as possible during point-contact processing. To ensure stability, usually only one process parameter is varied during processing; commonly this parameter is the feed rate, which corresponds to the dwell time. In the research project ArenA-FOi (application-oriented analysis of resource-saving and energy-efficient design of industrial facilities for the optical industry), a contacting tool is used in point-contact mode, and this work examines whether varying several process parameters during a single processing run is worthwhile. The commercially available ADAPT tool in size R20 from Satisloh AG is used. The behavior of the tool is tested under constant conditions in the MCP 250 CNC by OptoTech GmbH. A series of experiments is intended to determine the TIF (tool influence function) as a function of three variable parameters. Furthermore, the maximum error frequency that can be processed is calculated as an example for one parameter set and serves as an outlook for further investigations. The test results serve as the basis for the later removal simulation, which must be able to handle a variable TIF; such a simulation has already been implemented successfully in another research project of the Institute for Precision Manufacturing and High-Frequency Technology (IPH), and that algorithm can therefore be reused. The next step is the practical application of the collected knowledge: the TIF must be selected on the basis of the measured data, and knowing the error frequencies is essential for selecting the optimal TIF. This makes it possible to compare the simulated results with real measurement data and to carry out a revision. From this point onwards, the potential of this approach can be evaluated, and in the ideal case it will be investigated further and eventually adopted in production.
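
    Removal simulations of this kind typically model material removal as the convolution of the TIF with a dwell-time map (a Preston-type assumption). As an illustration, and not the IPH algorithm referred to above, the sketch below convolves a hypothetical Gaussian TIF with a dwell-time map; in practice the measured, parameter-dependent TIF discussed in the abstract would be substituted.

      # Dwell-time removal sketch: removal map = TIF convolved with dwell-time map.
      # Gaussian TIF and dwell map are hypothetical; a measured TIF would be used in practice.
      import numpy as np
      from scipy.signal import fftconvolve

      # Hypothetical measured TIF: peak removal rate 1 nm/s, Gaussian footprint.
      x = np.linspace(-10, 10, 41)                       # mm
      X, Y = np.meshgrid(x, x)
      tif = 1.0 * np.exp(-(X**2 + Y**2) / (2 * 3.0**2))  # nm/s per unit dwell

      # Hypothetical dwell-time map over the workpiece (seconds per grid cell).
      dwell = np.zeros((200, 200))
      dwell[80:120, 80:120] = 2.0                        # polish the central zone longer

      removal = fftconvolve(dwell, tif, mode="same")     # predicted removal depth, nm
      print("peak predicted removal: %.1f nm" % removal.max())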

  5. Combined effects of short-term rainfall patterns and soil texture on nitrogen cycling -- A Modeling Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gu, C.; Riley, W.J.

    2009-11-01

    Precipitation variability and magnitude are expected to change in many parts of the world over the 21st century. We examined the potential effects of intra-annual rainfall patterns on soil nitrogen (N) transport and transformation in the unsaturated soil zone using a deterministic dynamic modeling approach. The model (TOUGHREACT-N), which has been tested and applied in several experimental and observational systems, mechanistically accounts for microbial activity, soil-moisture dynamics that respond to precipitation variability, and gaseous and aqueous tracer transport in the soil. Here, we further tested and calibrated the model against data from a precipitation variability experiment in a tropical system in Costa Rica. The model was then used to simulate responses of soil moisture, microbial dynamics, nitrogen (N) aqueous and gaseous species, N leaching, and N trace-gas emissions to changes in rainfall patterns; the effect of soil texture was also examined. The temporal variability of nitrate leaching and NO, N2, and N2O effluxes was significantly influenced by rainfall dynamics. Soil texture combined with rainfall dynamics altered soil moisture dynamics, and consequently regulated soil N responses to precipitation changes. The clay loam soil more effectively buffered water stress during relatively long intervals between precipitation events, particularly after a large rainfall event. Subsequent soil N aqueous and gaseous losses showed either increases or decreases in response to increasing precipitation variability due to complex soil moisture dynamics. For a high rainfall scenario, high precipitation variability resulted in as high as 2.4-, 2.4-, 1.2-, and 13-fold increases in NH3, NO, N2O and NO3- fluxes, respectively, in clay loam soil. In sandy loam soil, however, NO and N2O fluxes decreased by 15% and 28%, respectively, in response to high precipitation variability. Our results demonstrate that soil N cycling responses to increasing precipitation variability depend on precipitation amount and soil texture, and that accurate prediction of future N cycling and gas effluxes requires models with relatively sophisticated representation of the relevant processes.

  6. Chemical Structure and Molecular Dimension As Controls on the Inherent Stability of Charcoal in Boreal Forest Soil

    NASA Astrophysics Data System (ADS)

    Hockaday, W. C.; Kane, E. S.; Ohlson, M.; Huang, R.; Von Bargen, J.; Davis, R.

    2014-12-01


  7. Microbial facies distribution and its geological and geochemical controls at the Hanford 300 area

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Nelson, W.; Stegen, J.; Murray, C. J.; Arntzen, E.

    2015-12-01

    Efforts have been made by various scientific disciplines to study hyporheic zones and characterize their associated processes. One way to approach the study of the hyporheic zone is to define facies, which are elements of a (hydrobio) geologic classification scheme that groups components of a complex system with high variability into a manageable set of discrete classes. In this study, we try to classify the hyporheic zone based on the geology, geochemistry, microbiology, and understand their interactive influences on the integrated biogeochemical distributions and processes. A number of measurements have been taken for 21 freeze core samples along the Columbia River bank in the Hanford 300 Area, and unique datasets have been obtained on biomass, pH, number of microbial taxa, percentage of N/C/H/S, microbial activity parameters, as well as microbial community attributes/modules. In order to gain a complete understanding of the geological control on these variables and processes, the explanatory variables are set to include quantitative gravel/sand/mud/silt/clay percentages, statistical moments of grain size distributions, as well as geological (e.g., Folk-Wentworth) and statistical (e.g., hierarchical) clusters. The dominant factors for major microbial and geochemical variables are identified and summarized using exploratory data analysis approaches (e.g., principal component analysis, hierarchical clustering, factor analysis, multivariate analysis of variance). The feasibility of extending the facies definition and its control of microbial and geochemical properties to larger scales is discussed.
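
    The exploratory analyses listed above (e.g., PCA and hierarchical clustering) reduce the many geochemical and microbial measurements per core to a few dominant factors and candidate facies classes. A minimal scikit-learn/SciPy sketch of that workflow on a hypothetical sample-by-variable matrix is shown below; the data and the choice of two clusters are illustrative only.

      # Exploratory-analysis sketch: PCA plus hierarchical clustering of a hypothetical
      # sample-by-variable matrix (e.g., grain-size fractions, biomass, pH, N/C/H/S).
      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(6)
      n_samples, n_vars = 21, 8                      # e.g., 21 freeze cores, 8 measurements
      data = rng.normal(size=(n_samples, n_vars))
      data[10:, :3] += 2.0                           # imprint a second "facies" group

      z = StandardScaler().fit_transform(data)       # put variables on comparable scales
      pca = PCA(n_components=3).fit(z)
      scores = pca.transform(z)
      print("variance explained:", np.round(pca.explained_variance_ratio_, 2))

      # Hierarchical (Ward) clustering on the PCA scores to define candidate facies.
      clusters = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
      print("cluster assignments:", clusters)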

  8. Investigation of the adhesion properties of direct 3D printing of polymers and nanocomposites on textiles: Effect of FDM printing process parameters

    NASA Astrophysics Data System (ADS)

    Hashemi Sanatgar, Razieh; Campagne, Christine; Nierstrasz, Vincent

    2017-05-01

    In this paper, 3D printing as a novel printing process was considered for deposition of polymers on synthetic fabrics, to introduce more flexible, resource-efficient and cost-effective textile functionalization processes than conventional printing processes like screen and inkjet printing. The aim is to develop an integrated or tailored production process for smart and functional textiles which avoids unnecessary use of water, energy and chemicals and minimizes waste, improving the ecological footprint and productivity. Adhesion of polymer and nanocomposite layers that were 3D printed directly onto textile fabrics using the fused deposition modeling (FDM) technique was investigated. Different variables that may affect the adhesion properties, including 3D printing process parameters, fabric type and the filler type incorporated in the polymer, were considered. A rectangular shape conforming to the peeling standard was designed as a 3D computer-aided design (CAD) model to determine the effect of the different variables. The polymers were printed in different series of the experimental design: nylon on polyamide 66 (PA66) fabrics, polylactic acid (PLA) on PA66 fabric, PLA on PLA fabric, and finally nanosize carbon black/PLA (CB/PLA) and multi-wall carbon nanotubes/PLA (CNT/PLA) nanocomposites on PLA fabrics. The adhesion forces were quantified using an innovative sample-preparation method combined with the standard peeling method. Results showed that different 3D printing process variables, such as extruder temperature, platform temperature and printing speed, can have a significant effect on the adhesion force of polymers to fabrics during direct 3D printing. A model was proposed specifically for deposition of a commercial 3D-printer nylon filament on PA66 fabrics. Among the printed polymers, PLA and its composites had a high adhesion force to PLA fabrics.

  9. Trends and Divergences in Childhood Income Dynamics, 1970-2010.

    PubMed

    Hill, Heather D

    2018-01-01

    Earnings and income variability have increased since the 1970s, particularly at the bottom of the income distribution. Considerable evidence suggests that childhood income levels, captured as average or point-in-time yearly income, are associated with numerous child and adult outcomes. The importance to child development of stable proximal processes during childhood suggests that income variability may also be important, particularly if it is unpredictable, unintentional, or does not reflect an upward trend in family income. Using the Panel Study of Income Dynamics, this study documents trends since the 1970s in three dimensions of childhood income dynamics: level, variability, and growth (n=7991). The analysis reveals that income variability during childhood has grown over time, while income growth rates have not. In addition, the economic context of childhood has diverged substantially by socioeconomic status, race, and family structure, with the most disadvantaged children facing a double-whammy of low income and high variability. © 2018 Elsevier Inc. All rights reserved.

  10. Three-stage variability-based reserve modifiers for enhancing flexibility reserve requirements under high variable generation penetrations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krad, Ibrahim; Gao, David Wenzhong; Ibanez, Eduardo

    2016-12-01

    The electric power system has continuously evolved in order to accommodate new technologies and operating strategies. As the penetration of integrated variable generation in the system increases, it is beneficial to develop strategies that can help mitigate their effect on the grid. Historically, power system operators have held excess capacity during the commitment and dispatch process to allow the system to handle unforeseen load ramping events. As variable generation resources increase, sufficient flexibility scheduled in the system is required to ensure that system performance is not deteriorated in the presence of additional variability and uncertainty. This paper presents a systematic comparison of various flexibility reserve strategies. Several of them are implemented and applied in a common test system, in order to evaluate their effect on the economic and reliable operations. Furthermore, a three stage reserve modifier algorithm is proposed and evaluated for its ability to improve system performance.

  11. Optimization of electrocoagulation process to treat grey wastewater in batch mode using response surface methodology.

    PubMed

    Karichappan, Thirugnanasambandham; Venkatachalam, Sivakumar; Jeganathan, Prakash Maran

    2014-01-10

    Discharge of grey wastewater into the ecological system has a negative impact on receiving water bodies. In the present study, the electrocoagulation (EC) process was investigated for treating grey wastewater under different operating conditions such as initial pH (4-8), current density (10-30 mA/cm2), electrode distance (4-6 cm) and electrolysis time (5-25 min), using a stainless steel (SS) anode in batch mode. A Box-Behnken response surface design (BBD) with four factors at five levels was employed to optimize and investigate the effect of the process variables on responses such as total solids (TS), chemical oxygen demand (COD) and fecal coliform (FC) removal. The process variables showed a significant effect on the electrocoagulation treatment process. The results were analyzed by Pareto analysis of variance (ANOVA), and second-order polynomial models were developed in order to study the electrocoagulation process statistically. The optimal operating conditions were found to be an initial pH of 7, current density of 20 mA/cm2, electrode distance of 5 cm and electrolysis time of 20 min. These results indicated that the EC process can be scaled up to treat grey wastewater with high removal efficiency of TS, COD and FC.
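
    The second-order polynomial models used in response surface methodology regress each response on linear, interaction, and squared terms of the process variables. The sketch below fits such a quadratic surface with scikit-learn on simulated data spanning the stated variable ranges; the simulated response and the resulting predictions are illustrative, not the study's fitted model.

      # Second-order (quadratic) response-surface sketch: fit COD removal (%) against
      # pH, current density, electrode distance and electrolysis time.
      # The simulated response surface is hypothetical, not the study's fitted model.
      import numpy as np
      from sklearn.preprocessing import PolynomialFeatures
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(4)
      n = 120
      X = np.column_stack([
          rng.uniform(4, 8, n),      # initial pH
          rng.uniform(10, 30, n),    # current density, mA/cm2
          rng.uniform(4, 6, n),      # electrode distance, cm
          rng.uniform(5, 25, n),     # electrolysis time, min
      ])
      # Hypothetical true surface with curvature in pH and time, plus noise.
      y = (40 + 8*X[:, 0] - 0.6*X[:, 0]**2 + 0.8*X[:, 1] + 2*X[:, 3]
           - 0.04*X[:, 3]**2 + rng.normal(0, 2, n))

      model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                            LinearRegression())
      model.fit(X, y)

      # Predict removal at the reported optimum (pH 7, 20 mA/cm2, 5 cm, 20 min).
      optimum = np.array([[7.0, 20.0, 5.0, 20.0]])
      print("predicted COD removal at optimum: %.1f %%" % model.predict(optimum)[0])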

  12. Soil nitrate reducing processes – drivers, mechanisms for spatial variation, and significance for nitrous oxide production

    PubMed Central

    Giles, Madeline; Morley, Nicholas; Baggs, Elizabeth M.; Daniell, Tim J.

    2012-01-01

    The microbial processes of denitrification and dissimilatory nitrate reduction to ammonium (DNRA) are two important nitrate reducing mechanisms in soil, which are responsible for the loss of nitrate (NO3−) and production of the potent greenhouse gas, nitrous oxide (N2O). A number of factors are known to control these processes, including O2 concentrations and moisture content, N, C, pH, and the size and community structure of nitrate reducing organisms responsible for the processes. There is an increasing understanding associated with many of these controls on flux through the nitrogen cycle in soil systems. However, there remains uncertainty about how the nitrate reducing communities are linked to environmental variables and the flux of products from these processes. The high spatial variability of environmental controls and microbial communities across small sub centimeter areas of soil may prove to be critical in determining why an understanding of the links between biotic and abiotic controls has proved elusive. This spatial effect is often overlooked as a driver of nitrate reducing processes. An increased knowledge of the effects of spatial heterogeneity in soil on nitrate reduction processes will be fundamental in understanding the drivers, location, and potential for N2O production from soils. PMID:23264770

  13. Soil nitrate reducing processes - drivers, mechanisms for spatial variation, and significance for nitrous oxide production.

    PubMed

    Giles, Madeline; Morley, Nicholas; Baggs, Elizabeth M; Daniell, Tim J

    2012-01-01

    The microbial processes of denitrification and dissimilatory nitrate reduction to ammonium (DNRA) are two important nitrate reducing mechanisms in soil, which are responsible for the loss of nitrate (NO3-) and production of the potent greenhouse gas, nitrous oxide (N2O). A number of factors are known to control these processes, including O2 concentrations and moisture content, N, C, pH, and the size and community structure of nitrate reducing organisms responsible for the processes. There is an increasing understanding associated with many of these controls on flux through the nitrogen cycle in soil systems. However, there remains uncertainty about how the nitrate reducing communities are linked to environmental variables and the flux of products from these processes. The high spatial variability of environmental controls and microbial communities across small sub centimeter areas of soil may prove to be critical in determining why an understanding of the links between biotic and abiotic controls has proved elusive. This spatial effect is often overlooked as a driver of nitrate reducing processes. An increased knowledge of the effects of spatial heterogeneity in soil on nitrate reduction processes will be fundamental in understanding the drivers, location, and potential for N2O production from soils.

  14. Analysis of grinding of superalloys and ceramics for off-line process optimization

    NASA Astrophysics Data System (ADS)

    Sathyanarayanan, G.

    The present study compared the performance of resinoid, vitrified, and electroplated CBN wheels in creep feed grinding of M42 and D2 tool steels. Responses such as specific energy, normal and tangential forces, and surface roughness were used as measures of performance. It was found that each of the resinoid, vitrified, and electroplated CBN wheels has its own advantages in creep feed grinding, but no single wheel could provide good finish, lower specific energy, and high material removal rates simultaneously. To optimize CBN grinding with the different bonded wheels, a Multiple Criteria Decision Making (MCDM) methodology was used. Creep feed grinding of the superalloys Ti-6Al-4V and Inconel 718 was modeled using neural networks to optimize the grinding process. A parallel effort was directed at creep feed grinding of alumina ceramics with diamond wheels to investigate the influence of process variables on responses based on experimental results and statistical analysis. A conflicting influence of the variables was observed. This led to the formulation of the ceramic grinding process as a multi-objective nonlinear mixed integer problem.

  15. Key process parameters to modify the porosity of cerium dioxide microspheres formed in the internal gelation process

    DOE PAGES

    Hunt, Rodney Dale; Collins, Jack Lee; Reif, Tyler J.; ...

    2017-08-04

    Recently, an internal gelation study demonstrated that the use of heated urea and hexamethylenetetramine can have a pronounced impact on the porosity and sintering characteristics of cerium dioxide (CeO2) microspheres. This effort has identified process variables that can significantly change the initial porosity of the CeO2 microspheres with slight modifications. A relatively small difference in the sample preparation of cerium ammonium nitrate and ammonium hydroxide solution had a large reproducible impact on the porosity and slow pour density of the produced microspheres. Increases in the gelation temperature as small as 0.5 K also produced a noticeable increase in the slow pour density. If the gelation temperature was increased too high, the use of the heated hexamethylenetetramine and urea was no longer observed to be effective in increasing the porosity of the CeO2 microspheres. In conclusion, the final process variable was the amount of dispersing agent, Span™ 80, which can increase the slow pour density and produce significantly smaller microspheres.

  16. Key process parameters to modify the porosity of cerium dioxide microspheres formed in the internal gelation process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, Rodney Dale; Collins, Jack Lee; Reif, Tyler J.

    Recently, an internal gelation study demonstrated that the use of heated urea and hexamethylenetetramine can have a pronounced impact on the porosity and sintering characteristics of cerium dioxide (CeO2) microspheres. This effort has identified process variables that can significantly change the initial porosity of the CeO2 microspheres with slight modifications. A relatively small difference in the sample preparation of cerium ammonium nitrate and ammonium hydroxide solution had a large reproducible impact on the porosity and slow pour density of the produced microspheres. Increases in the gelation temperature as small as 0.5 K also produced a noticeable increase in the slow pour density. If the gelation temperature was increased too high, the use of the heated hexamethylenetetramine and urea was no longer observed to be effective in increasing the porosity of the CeO2 microspheres. In conclusion, the final process variable was the amount of dispersing agent, Span™ 80, which can increase the slow pour density and produce significantly smaller microspheres.

  17. Biosurfactant production by Aureobasidium pullulans in stirred tank bioreactor: New approach to understand the influence of important variables in the process.

    PubMed

    Brumano, Larissa Pereira; Antunes, Felipe Antonio Fernandes; Souto, Sara Galeno; Dos Santos, Júlio Cesar; Venus, Joachim; Schneider, Roland; da Silva, Silvio Silvério

    2017-11-01

    Surfactants are amphiphilic molecules with broad industrial applications, currently produced by chemical routes mainly derived from the oil industry. However, biotechnological processes, aimed at developing new sustainable process configurations using favorable microorganisms, still require more detailed investigation. Thus, we present a novel approach for biosurfactant production using the promising yeast Aureobasidium pullulans LB 83 in a stirred tank reactor. A central composite face-centered design was carried out to evaluate the effect of the aeration rate (0.1-1.1 min-1) and sucrose concentration (20-80 g.L-1) on the maximum biosurfactant tensoactivity and productivity. Statistical analysis showed that the use of the variables at high levels enhanced tensoactivity, giving 8.05 cm in the oil spread test and a productivity of 0.0838 cm.h-1. In addition, an unprecedented investigation of the relevance of aeration rate and sucrose concentration to biosurfactant production by A. pullulans in a stirred tank reactor was detailed, demonstrating the importance of establishing adequate bioreactor conditions for process scale-up. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Comprehensive two-dimensional gas chromatography for the analysis of Fischer-Tropsch oil products.

    PubMed

    van der Westhuizen, Rina; Crous, Renier; de Villiers, André; Sandra, Pat

    2010-12-24

    The Fischer-Tropsch (FT) process involves a series of catalysed reactions of carbon monoxide and hydrogen, originating from coal, natural gas or biomass, leading to a variety of synthetic chemicals and fuels. The benefits of comprehensive two-dimensional gas chromatography (GC×GC) compared to one-dimensional GC (1D-GC) for the detailed investigation of the oil products of low and high temperature FT processes are presented. GC×GC provides more accurate quantitative data to construct Anderson-Schulz-Flory (ASF) selectivity models that correlate the FT product distribution with reaction variables. On the other hand, the high peak capacity and sensitivity of GC×GC allow the detailed study of components present at trace level. Analyses of the aromatic and oxygenated fractions of a high temperature FT (HT-FT) process are presented. GC×GC data have been used to optimise or tune the HT-FT process by using a lab-scale micro-FT-reactor. Copyright © 2010 Elsevier B.V. All rights reserved.
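
    The Anderson-Schulz-Flory model predicts the FT product distribution from a single chain-growth probability alpha: the mole fraction of chains with n carbons is (1 - alpha) * alpha^(n-1) and the weight fraction is n * (1 - alpha)^2 * alpha^(n-1), so log(W_n / n) is linear in n. The sketch below tabulates these fractions for an illustrative alpha and shows how alpha could be recovered by a linear fit to measured fractions; the alpha value is a placeholder, not a result from the paper.

      # Anderson-Schulz-Flory (ASF) distribution sketch: product fractions from a single
      # chain-growth probability alpha. The alpha value here is illustrative only.
      import numpy as np

      alpha = 0.85                       # hypothetical chain-growth probability
      n = np.arange(1, 31)               # carbon numbers C1..C30

      mole_fraction = (1 - alpha) * alpha ** (n - 1)
      weight_fraction = n * (1 - alpha) ** 2 * alpha ** (n - 1)

      # In an ASF plot, log(W_n / n) is linear in n with slope log(alpha);
      # fitting that line to measured (e.g., GCxGC) fractions recovers alpha.
      slope, _ = np.polyfit(n, np.log(weight_fraction / n), 1)
      print("recovered alpha:", round(np.exp(slope), 3))

      for carbon_number in (1, 5, 10, 20):
          print(f"C{carbon_number}: mole fraction {mole_fraction[carbon_number - 1]:.4f}, "
                f"weight fraction {weight_fraction[carbon_number - 1]:.4f}")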

  19. Single-Cell-Based Analysis Highlights a Surge in Cell-to-Cell Molecular Variability Preceding Irreversible Commitment in a Differentiation Process

    PubMed Central

    Boullu, Loïs; Morin, Valérie; Vallin, Elodie; Guillemin, Anissa; Papili Gao, Nan; Cosette, Jérémie; Arnaud, Ophélie; Kupiec, Jean-Jacques; Espinasse, Thibault

    2016-01-01

    In some recent studies, a view emerged that stochastic dynamics governing the switching of cells from one differentiation state to another could be characterized by a peak in gene expression variability at the point of fate commitment. We have tested this hypothesis at the single-cell level by analyzing primary chicken erythroid progenitors through their differentiation process and measuring the expression of selected genes at six sequential time-points after induction of differentiation. In contrast to population-based expression data, single-cell gene expression data revealed a high cell-to-cell variability, which was masked by averaging. We were able to show that the correlation network was a very dynamical entity and that a subgroup of genes tend to follow the predictions from the dynamical network biomarker (DNB) theory. In addition, we also identified a small group of functionally related genes encoding proteins involved in sterol synthesis that could act as the initial drivers of the differentiation. In order to assess quantitatively the cell-to-cell variability in gene expression and its evolution in time, we used Shannon entropy as a measure of the heterogeneity. Entropy values showed a significant increase in the first 8 h of the differentiation process, reaching a peak between 8 and 24 h, before decreasing to significantly lower values. Moreover, we observed that the previous point of maximum entropy precedes two paramount key points: an irreversible commitment to differentiation between 24 and 48 h followed by a significant increase in cell size variability at 48 h. In conclusion, when analyzed at the single cell level, the differentiation process looks very different from its classical population average view. New observables (like entropy) can be computed, the behavior of which is fully compatible with the idea that differentiation is not a “simple” program that all cells execute identically but results from the dynamical behavior of the underlying molecular network. PMID:28027290

  20. Single-Cell-Based Analysis Highlights a Surge in Cell-to-Cell Molecular Variability Preceding Irreversible Commitment in a Differentiation Process.

    PubMed

    Richard, Angélique; Boullu, Loïs; Herbach, Ulysse; Bonnafoux, Arnaud; Morin, Valérie; Vallin, Elodie; Guillemin, Anissa; Papili Gao, Nan; Gunawan, Rudiyanto; Cosette, Jérémie; Arnaud, Ophélie; Kupiec, Jean-Jacques; Espinasse, Thibault; Gonin-Giraud, Sandrine; Gandrillon, Olivier

    2016-12-01

    In some recent studies, a view emerged that stochastic dynamics governing the switching of cells from one differentiation state to another could be characterized by a peak in gene expression variability at the point of fate commitment. We have tested this hypothesis at the single-cell level by analyzing primary chicken erythroid progenitors through their differentiation process and measuring the expression of selected genes at six sequential time-points after induction of differentiation. In contrast to population-based expression data, single-cell gene expression data revealed a high cell-to-cell variability, which was masked by averaging. We were able to show that the correlation network was a very dynamical entity and that a subgroup of genes tend to follow the predictions from the dynamical network biomarker (DNB) theory. In addition, we also identified a small group of functionally related genes encoding proteins involved in sterol synthesis that could act as the initial drivers of the differentiation. In order to assess quantitatively the cell-to-cell variability in gene expression and its evolution in time, we used Shannon entropy as a measure of the heterogeneity. Entropy values showed a significant increase in the first 8 h of the differentiation process, reaching a peak between 8 and 24 h, before decreasing to significantly lower values. Moreover, we observed that the previous point of maximum entropy precedes two paramount key points: an irreversible commitment to differentiation between 24 and 48 h followed by a significant increase in cell size variability at 48 h. In conclusion, when analyzed at the single cell level, the differentiation process looks very different from its classical population average view. New observables (like entropy) can be computed, the behavior of which is fully compatible with the idea that differentiation is not a "simple" program that all cells execute identically but results from the dynamical behavior of the underlying molecular network.
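
    The heterogeneity measure used in this work is Shannon entropy computed per gene across cells at each time point, typically after binning the single-cell expression values. The sketch below shows that calculation on synthetic expression matrices; the bin count, distributions, and time labels are illustrative choices, not the authors' settings or data.

      # Shannon-entropy sketch: per-gene cell-to-cell heterogeneity at each time point,
      # computed from binned single-cell expression values. Synthetic data; the bin
      # count and distributions are illustrative, not the study's settings.
      import numpy as np

      def gene_entropy(values, bins=10):
          counts, _ = np.histogram(values, bins=bins)
          p = counts[counts > 0] / counts.sum()
          return float(-(p * np.log2(p)).sum())      # entropy in bits

      rng = np.random.default_rng(5)
      n_cells, n_genes = 90, 20
      timepoints = {
          "0h":  rng.lognormal(mean=2.0, sigma=0.3, size=(n_cells, n_genes)),  # homogeneous
          "8h":  rng.lognormal(mean=2.0, sigma=1.0, size=(n_cells, n_genes)),  # peak variability
          "72h": rng.lognormal(mean=2.0, sigma=0.4, size=(n_cells, n_genes)),  # re-homogenized
      }
      for label, expr in timepoints.items():
          mean_h = np.mean([gene_entropy(expr[:, g]) for g in range(n_genes)])
          print(f"{label}: mean per-gene entropy = {mean_h:.2f} bits")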
