Active Duty Have Been Improved, but Some Challenges Remain
2007-05-01
The Army’s MRP program has largely resolved the widespread delays in order processing that...interviewed confirmed that they did not experience gaps in pay and associated benefits because of order processing delays. However, some of the...and injured reserve component soldiers we interviewed, these improvements have virtually eliminated the widespread delays in order processing that
The sand bag model of the dispersion of the cosmic body in the atmosphere
NASA Technical Reports Server (NTRS)
Teterev, A. V.; Nemchinov, I. V.
1993-01-01
The strength of extraterrestrial bodies depends on their structure, composition, dimensions, and history. Fragmentation of a body due to aerodynamic stresses begins at sufficiently large heights above the surface of the Earth. The process of fragmentation and dispersion of the fragments is usually studied with hydrodynamic or even gasdynamic models. If the fragmentation process begins from pre-existing cracks and faults in the body, or if the body consists of large boulders glued together by ice, the strength of these boulders after fragmentation remains higher than the aerodynamic stresses exerted on the remaining part of the body. It is supposed that fragmentation occurs at the initial moment t = 0 at some height z_0 above the surface of the Earth and that the fragments remain solid; the possibility of further fragmentation during the remaining part of the trajectory is not taken into account. If the number of these fragments is large enough and their size is small in comparison with the initial radius of the body, then we can use the sand bag model, proposed here in qualitative form.
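As an illustrative aside (not taken from the paper), the sketch below integrates the descent of a fragment swarm after breakup using a generic pancake-style lateral-spreading law as a stand-in for the sand bag model; the drag law, the spreading law, and every parameter value are assumptions chosen purely for illustration.

    # Hypothetical sketch of post-breakup dispersion of a fragmented impactor.
    # A generic pancake-style spreading law stands in for the sand bag model;
    # all constants are illustrative, not values from the paper.
    import math

    RHO0, H = 1.225, 8000.0          # sea-level air density (kg/m^3), scale height (m)
    CD, RHO_BODY = 1.0, 3000.0       # drag coefficient, bulk density of the swarm (kg/m^3)

    def rho_air(z):
        return RHO0 * math.exp(-z / H)

    def disperse(v=20e3, z=30e3, r=10.0, theta=math.radians(45), dt=1e-2, max_steps=200_000):
        """Integrate velocity, height and swarm radius downward from breakup height z (m)."""
        m = RHO_BODY * (4.0 / 3.0) * math.pi * r**3      # swarm mass, held fixed (no ablation)
        for _ in range(max_steps):
            if z <= 0.0 or v <= 0.0:
                break
            a_drag = CD * rho_air(z) * math.pi * r**2 * v**2 / (2.0 * m)
            v -= a_drag * dt                             # deceleration by air drag
            z -= v * math.sin(theta) * dt                # descent along the trajectory
            r += v * math.sqrt(CD * rho_air(z) / RHO_BODY) * dt  # lateral spreading of the swarm
        return v, z, r

    print(disperse())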
Channel recovery from recent large floods in north coastal California: rates and processes
Thomas E. Lisle
1981-01-01
Abstract - Stream channel recovery from recent large floods in northern California involves a sequence of processes, including degradation of streambeds to stable levels, narrowing of channels, and accentuation of riffle-pool sequences. Most channels have degraded but remain widened because hillslope encroachment and establishment of riparian groves conducive to...
DETAIL VIEW OF LOWER CYANIDE PROCESSING WORKS, LOOKING SOUTHWEST FROM ...
DETAIL VIEW OF LOWER CYANIDE PROCESSING WORKS, LOOKING SOUTHWEST FROM LARGE TAILINGS PILE. THE REMAINS OF THREE TEN-FOOT-DIAMETER SETTLING TANKS ARE AT CENTER. THE SCATTER IN THE CENTER FOREGROUND IS THE REMAINS OF A LARGE RECTANGULAR HOLDING TANK, POSSIBLY A SETTLING TANK. THIS AREA WAS MOST LIKELY CONSTRUCTED LATER IN THE TWENTIETH CENTURY, AFTER MINING HAD CEASED AND ONLY TAILINGS WERE BEING RECLAIMED. AN EXACT DATE CANNOT BE DETERMINED; HOWEVER, THESE WORKS ARE DISTINCTLY DIFFERENT FROM THE ORIGINAL LAYOUT. THE SANDY AREA THAT OCCUPIES THE FOREGROUND AND THE CENTER IS TAILINGS. - Keane Wonder Mine, Park Route 4 (Daylight Pass Cutoff), Death Valley Junction, Inyo County, CA
Extracting Silicon From Sodium-Process Products
NASA Technical Reports Server (NTRS)
Kapur, V.; Sanjurjo, A.; Sancier, K. M.; Nanis, L.
1982-01-01
New acid leaching process purifies silicon produced in reaction between silicon fluoride and sodium. Concentrations of sodium fluoride and other impurities and byproducts remaining in silicon are within acceptable ranges for semiconductor devices. Leaching process makes sodium reduction process more attractive for making large quantities of silicon for solar cells.
What will the future of cloud-based astronomical data processing look like?
NASA Astrophysics Data System (ADS)
Green, Andrew W.; Mannering, Elizabeth; Harischandra, Lloyd; Vuong, Minh; O'Toole, Simon; Sealey, Katrina; Hopkins, Andrew M.
2017-06-01
Astronomy is rapidly approaching an impasse: very large datasets require remote or cloud-based parallel processing, yet many astronomers still try to download the data and develop serial code locally. Astronomers understand the need for change, but the hurdles remain high. We are developing a data archive designed from the ground up to simplify and encourage cloud-based parallel processing. While the volume of data we host remains modest by some standards, it is still large enough that download and processing times are measured in days and even weeks. We plan to implement a Python-based, notebook-like interface that automatically parallelises execution. Our goal is to provide an interface sufficiently familiar and user-friendly that it encourages the astronomer to run their analysis on our system in the cloud: astroinformatics as a service. We describe how our system addresses the approaching impasse in astronomy using the SAMI Galaxy Survey as an example.
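As an illustration only (not part of the archive or interface described above), the following minimal Python sketch shows the kind of embarrassingly parallel, per-object analysis such a notebook service would dispatch to the cloud; the catalogue, the analyse_galaxy function, and the worker count are hypothetical placeholders.

    # Hypothetical sketch: farming a per-galaxy analysis out to worker processes,
    # the kind of job a cloud-side notebook service might parallelise automatically.
    # analyse_galaxy and the catalogue below are placeholders, not a real survey API.
    from multiprocessing import Pool

    def analyse_galaxy(galaxy_id):
        # stand-in for a real measurement on one data cube (e.g. a flux sum)
        return galaxy_id, sum(range(galaxy_id % 1000))

    if __name__ == "__main__":
        catalogue = range(10_000)                  # placeholder list of galaxy IDs
        with Pool(processes=8) as pool:
            results = dict(pool.map(analyse_galaxy, catalogue))
        print(len(results), "galaxies analysed")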
USDA-ARS?s Scientific Manuscript database
Humans have consumed fermented cucumber products since before the dawn of civilization. Although cucumber fermentation remains largely a traditional process, it has proven to be a consistently safe process by which raw cucumbers are transformed into high quality pickles that have a long shelf-life ...
ERIC Educational Resources Information Center
Baartman, Liesbeth K. J.; de Bruijn, Elly
2011-01-01
Current research focuses on competence development and complex professional tasks. However, "learning processes" towards the integration of knowledge, skills and attitudes largely remain a black box. This article conceptualises three integration processes, in analogy to theories on transfer. Knowledge, skills and attitudes are defined, reconciling…
Parallel and Serial Processes in Visual Search
ERIC Educational Resources Information Center
Thornton, Thomas L.; Gilden, David L.
2007-01-01
A long-standing issue in the study of how people acquire visual information centers around the scheduling and deployment of attentional resources: Is the process serial, or is it parallel? A substantial empirical effort has been dedicated to resolving this issue. However, the results remain largely inconclusive because the methodologies that have…
Dyadic Processes in Early Marriage: Attributions, Behavior, and Marital Quality
ERIC Educational Resources Information Center
Durtschi, Jared A.; Fincham, Frank D.; Cui, Ming; Lorenz, Frederick O.; Conger, Rand D.
2011-01-01
Marital processes in early marriage are important for understanding couples' future marital quality. Spouses' attributions about a partner's behavior have been linked to marital quality, yet the mechanisms underlying this association remain largely unknown. When we used couple data from the Family Transitions Project (N = 280 couples) across the…
Inference of Evolutionary Jumps in Large Phylogenies using Lévy Processes
Duchen, Pablo; Leuenberger, Christoph; Szilágyi, Sándor M.; Harmon, Luke; Eastman, Jonathan; Schweizer, Manuel
2017-01-01
Abstract Although it is now widely accepted that the rate of phenotypic evolution may not necessarily be constant across large phylogenies, the frequency and phylogenetic position of periods of rapid evolution remain unclear. In his highly influential view of evolution, G. G. Simpson supposed that such evolutionary jumps occur when organisms transition into so-called new adaptive zones, for instance after dispersal into a new geographic area, after rapid climatic changes, or following the appearance of an evolutionary novelty. Only recently, large, accurate and well calibrated phylogenies have become available that allow testing this hypothesis directly, yet inferring evolutionary jumps remains computationally very challenging. Here, we develop a computationally highly efficient algorithm to accurately infer the rate and strength of evolutionary jumps as well as their phylogenetic location. Following previous work we model evolutionary jumps as a compound process, but introduce a novel approach to sample jump configurations that does not require matrix inversions and thus naturally scales to large trees. We then make use of this development to infer evolutionary jumps in Anolis lizards and Loriinii parrots where we find strong signal for such jumps at the basis of clades that transitioned into new adaptive zones, just as postulated by Simpson’s hypothesis. [evolutionary jump; Lévy process; phenotypic evolution; punctuated equilibrium; quantitative traits.] PMID:28204787
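To make the compound process concrete, the sketch below (an illustration, not the authors' inference algorithm) simulates a trait evolving along a single branch as Brownian motion plus Poisson-distributed jumps; all rates, variances, and the branch length are arbitrary illustrative values.

    # Hypothetical simulation of a trait evolving along one branch as a
    # compound (jump-diffusion) process: Brownian motion plus Poisson jumps.
    # Parameter values are illustrative, not estimates from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_branch(t=1.0, sigma2_bm=0.1, jump_rate=0.5, jump_sd=2.0, x0=0.0):
        x = x0 + rng.normal(0.0, np.sqrt(sigma2_bm * t))   # Brownian component
        n_jumps = rng.poisson(jump_rate * t)               # number of jumps on the branch
        x += rng.normal(0.0, jump_sd, size=n_jumps).sum()  # jump displacements
        return x

    print([round(simulate_branch(), 3) for _ in range(5)])

Roughly speaking, the inference problem addressed in the paper is the reverse: deciding, from observed tip values, which branches carry such jumps and how large they are.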
Level of Processing Modulates the Neural Correlates of Emotional Memory Formation
ERIC Educational Resources Information Center
Ritchey, Maureen; LaBar, Kevin S.; Cabeza, Roberto
2011-01-01
Emotion is known to influence multiple aspects of memory formation, including the initial encoding of the memory trace and its consolidation over time. However, the neural mechanisms whereby emotion impacts memory encoding remain largely unexplored. The present study used a levels-of-processing manipulation to characterize the impact of emotion on…
Development of L2 Interactional Resources for Online Collaborative Task Accomplishment
ERIC Educational Resources Information Center
Balaman, Ufuk; Sert, Olcay
2017-01-01
Technology-mediated task environments have long been considered integral parts of L2 learning and teaching processes. However, the interactional resources that the learners deploy to complete tasks in these environments have remained largely unexplored due to an overall focus on task design and outcomes rather than task engagement processes. With…
Defaunation leads to microevolutionary changes in a tropical palm
Carvalho, Carolina S.; Galetti, Mauro; Colevatti, Rosane G.; Jordano, Pedro
2016-01-01
Many large species have declined worldwide due to habitat fragmentation and poaching. The defaunation of large frugivores and the consequent reductions of seed dispersal services may have immediate effects on plant demography. Yet, the lasting effects of frugivore defaunation on microevolutionary processes of the plants they disperse remain understudied. We tested if the loss of large seed dispersers can lead to microevolutionary changes of a tropical palm. We show that frugivore defaunation is the main driver of changes in allelic frequency among populations. Turnover of alleles accounted for 100% of dissimilarity in allelic frequencies of individuals between defaunated and non-defaunated forests; and individuals from defaunated sites are 1.5 times more similar genetically than those found in pristine sites. Given that sizeable fractions of the palm fruit crops remain undispersed in defaunated sites due to lack of large-bodied frugivores, this distinct pattern of gene pool composition of early recruits may reveal strong dispersal limitation for specific genotypes, or collapses of gene flow between fragmented areas, or both. Because most tropical tree species rely on seed dispersal by vertebrates, our results show that defaunation has a lasting effect on microevolutionary processes, with potential consequences for persistence under scenarios of environmental change. PMID:27535709
Inference of Evolutionary Jumps in Large Phylogenies using Lévy Processes.
Duchen, Pablo; Leuenberger, Christoph; Szilágyi, Sándor M; Harmon, Luke; Eastman, Jonathan; Schweizer, Manuel; Wegmann, Daniel
2017-11-01
Although it is now widely accepted that the rate of phenotypic evolution may not necessarily be constant across large phylogenies, the frequency and phylogenetic position of periods of rapid evolution remain unclear. In his highly influential view of evolution, G. G. Simpson supposed that such evolutionary jumps occur when organisms transition into so-called new adaptive zones, for instance after dispersal into a new geographic area, after rapid climatic changes, or following the appearance of an evolutionary novelty. Only recently, large, accurate and well calibrated phylogenies have become available that allow testing this hypothesis directly, yet inferring evolutionary jumps remains computationally very challenging. Here, we develop a computationally highly efficient algorithm to accurately infer the rate and strength of evolutionary jumps as well as their phylogenetic location. Following previous work we model evolutionary jumps as a compound process, but introduce a novel approach to sample jump configurations that does not require matrix inversions and thus naturally scales to large trees. We then make use of this development to infer evolutionary jumps in Anolis lizards and Loriinii parrots where we find strong signal for such jumps at the basis of clades that transitioned into new adaptive zones, just as postulated by Simpson's hypothesis. [evolutionary jump; Lévy process; phenotypic evolution; punctuated equilibrium; quantitative traits.] © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
ERIC Educational Resources Information Center
Subramaniam, Karuna; Faust, Miriam; Beeman, Mark; Mashal, Nira
2012-01-01
The neural mechanisms underlying the process of understanding novel and conventional metaphoric expressions remain unclear largely because the specific brain regions that support the formation of novel semantic relations are still unknown. A well established way to study distinct cognitive processes specifically associated with an event of…
The importance of tree size and fecundity for wind dispersal of big-leaf mahogany
Julian M. Norghauer; Charles A. Nock; James Grogan
2011-01-01
Seed dispersal by wind is a critical yet poorly understood process in tropical forest trees. How tree size and fecundity affect this process at the population level remains largely unknown because of insufficient replication across adults. We measured seed dispersal by the endangered neotropical timber species big-leaf mahogany (Swietenia macrophylla King, Meliaceae)...
Nuclear autophagy: An evolutionarily conserved mechanism of nuclear degradation in the cytoplasm.
Luo, Majing; Zhao, Xueya; Song, Ying; Cheng, Hanhua; Zhou, Rongjia
2016-11-01
Macroautophagy/autophagy is a catabolic process that is essential for cellular homeostasis. Studies on autophagic degradation of cytoplasmic components have generated interest in nuclear autophagy. Although its mechanisms and roles have remained elusive, tremendous progress has been made toward understanding nuclear autophagy. Nuclear autophagy is evolutionarily conserved in eukaryotes that may target various nuclear components through a series of processes, including nuclear sensing, nuclear export, autophagic substrate encapsulation and autophagic degradation in the cytoplasm. However, the molecular processes and regulatory mechanisms involved in nuclear autophagy remain largely unknown. Numerous studies have highlighted the importance of nuclear autophagy in physiological and pathological processes such as cancer. This review focuses on current advances in nuclear autophagy and provides a summary of its research history and landmark discoveries to offer new perspectives.
An Improved Data Collection and Processing System
1988-05-01
of -use of Turbo made it the compiler of choice. These applications included data storage, processing and output. Thus, those programs...changed. As it was not envisioned that the settings would remain constant for a large number of tests in a row, the update process is executed every...store,going); * A repeat of the above reading process is done for the file containing Ic. U:. The only difference is that the first line of this
Small enterprises' importance to the U.S. secondary wood processing industry
Urs Buehlmann; Omar Espinoza; Matthew Bumgardner; Michael Sperber
2013-01-01
The past decades have seen numerous U.S. secondary wood processing companies shift their production to overseas locations, mainly in Southeast Asia. The remaining companies have been hit hard by the downturn in housing markets and the following recession. Thus, many large customers of the U.S. hardwood lumber industry have reduced or stopped the purchase of products,...
1986-06-01
model of the self-evaluation process as it differs from the evaluation process used by superiors. Symbolic Interactionism: One view of self-assessment is...supplied by the symbolic interactionists (Cooley, 1902; Mead, 1934), who state that self-perceptions are generated largely from individuals...disagreements remained even immediately after an appraisal interview in which a great deal of feedback was given. Research on the symbolic interactionist
Arguments Against a Configural Processing Account of Familiar Face Recognition.
Burton, A Mike; Schweinberger, Stefan R; Jenkins, Rob; Kaufmann, Jürgen M
2015-07-01
Face recognition is a remarkable human ability, which underlies a great deal of people's social behavior. Individuals can recognize family members, friends, and acquaintances over a very large range of conditions, and yet the processes by which they do this remain poorly understood, despite decades of research. Although a detailed understanding remains elusive, face recognition is widely thought to rely on configural processing, specifically an analysis of spatial relations between facial features (so-called second-order configurations). In this article, we challenge this traditional view, raising four problems: (1) configural theories are underspecified; (2) large configural changes leave recognition unharmed; (3) recognition is harmed by nonconfigural changes; and (4) in separate analyses of face shape and face texture, identification tends to be dominated by texture. We review evidence from a variety of sources and suggest that failure to acknowledge the impact of familiarity on facial representations may have led to an overgeneralization of the configural account. We argue instead that second-order configural information is remarkably unimportant for familiar face recognition. © The Author(s) 2015.
Quantitative real-time imaging of glutathione
USDA-ARS?s Scientific Manuscript database
Glutathione plays many important roles in biological processes; however, the dynamic changes of glutathione concentrations in living cells remain largely unknown. Here, we report a reversible reaction-based fluorescent probe—designated as RealThiol (RT)—that can quantitatively monitor the real-time ...
Professional Development: Then and Now
ERIC Educational Resources Information Center
Bolt, Susan
2012-01-01
Technological developments have altered pedagogies in classroom teaching but approaches to teacher professional development have remained largely unchanged. The purpose of this paper is to describe an evolving learning process that spans the last decade and draws from three different investigations into professional development. The author…
ERIC Educational Resources Information Center
Key, Alexandra P.; Ibanez, Lisa V.; Henderson, Heather A.; Warren, Zachary; Messinger, Daniel S.; Stone, Wendy L.
2015-01-01
Few behavioral indices of risk for autism spectrum disorders (ASD) are present before 12 months, and potential biomarkers remain largely unexamined. This prospective study of infant siblings of children with ASD (n = 16) and low-risk comparison infants (n = 15) examined group differences in event-related potentials (ERPs) indexing processing of…
Moving zone Marangoni drying of wet objects using naturally evaporated solvent vapor
Britten, Jerald A.
1997-01-01
A surface tension gradient driven flow (a Marangoni flow) is used to remove the thin film of water remaining on the surface of an object following rinsing. The process passively introduces by natural evaporation and diffusion of minute amounts of alcohol (or other suitable material) vapor in the immediate vicinity of a continuously refreshed meniscus of deionized water or another aqueous-based, nonsurfactant rinsing agent. Used in conjunction with cleaning, developing or wet etching application, rinsing coupled with Marangoni drying provides a single-step process for 1) cleaning, developing or etching, 2) rinsing, and 3) drying objects such as flat substrates or coatings on flat substrates without necessarily using heat, forced air flow, contact wiping, centrifugation or large amounts of flammable solvents. This process is useful in one-step cleaning and drying of large flat optical substrates, one-step developing/rinsing and drying or etching/rinsing/drying of large flat patterned substrates and flat panel displays during lithographic processing, and room-temperature rinsing/drying of other large parts, sheets or continuous rolls of material.
Moving zone Marangoni drying of wet objects using naturally evaporated solvent vapor
Britten, J.A.
1997-08-26
A surface tension gradient driven flow (a Marangoni flow) is used to remove the thin film of water remaining on the surface of an object following rinsing. The process passively introduces by natural evaporation and diffusion of minute amounts of alcohol (or other suitable material) vapor in the immediate vicinity of a continuously refreshed meniscus of deionized water or another aqueous-based, nonsurfactant rinsing agent. Used in conjunction with cleaning, developing or wet etching application, rinsing coupled with Marangoni drying provides a single-step process for (1) cleaning, developing or etching, (2) rinsing, and (3) drying objects such as flat substrates or coatings on flat substrates without necessarily using heat, forced air flow, contact wiping, centrifugation or large amounts of flammable solvents. This process is useful in one-step cleaning and drying of large flat optical substrates, one-step developing/rinsing and drying or etching/rinsing/drying of large flat patterned substrates and flat panel displays during lithographic processing, and room-temperature rinsing/drying of other large parts, sheets or continuous rolls of material. 5 figs.
Integration and segregation of large-scale brain networks during short-term task automatization
Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F.; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes
2016-01-01
The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes. PMID:27808095
Planning as an Iterative Process
NASA Technical Reports Server (NTRS)
Smith, David E.
2012-01-01
Activity planning for missions such as the Mars Exploration Rover mission presents many technical challenges, including oversubscription, consideration of time, concurrency, resources, preferences, and uncertainty. These challenges have all been addressed by the research community to varying degrees, but significant technical hurdles still remain. In addition, the integration of these capabilities into a single planning engine remains largely unaddressed. However, I argue that there is a deeper set of issues that needs to be considered, namely the integration of planning into an iterative process that begins before the goals, objectives, and preferences are fully defined. This introduces a number of technical challenges for planning, including the ability to more naturally specify and utilize constraints on the planning process, the ability to generate multiple qualitatively different plans, and the ability to provide deep explanation of plans.
Gender, Class and Rurality: Australian Case Studies
ERIC Educational Resources Information Center
Bryant, Lia; Pini, Barbara
2009-01-01
The interrelationship between gender and class in rural spaces has received little attention. While rural scholars have focused on the implications for class from processes of gentrification and agricultural and rural restructuring, these analyses have remained largely ungendered. Similarly, feminist rural studies have rarely explored subjectivity…
The plasma separation process as a pre-cursor for large scale radioisotope production
NASA Astrophysics Data System (ADS)
Stevenson, Nigel R.
2001-07-01
Radioisotope production generally employs either accelerators or reactors to convert stable (usually enriched) isotopes into the desired product species. Radioisotopes have applications in industry, environmental sciences, and most significantly in medicine. The production of many potentially useful radioisotopes is significantly hindered by the lack of availability or by the high cost of key enriched stable isotopes. To try to meet this demand, certain niche enrichment processes have been developed and commercialized. Calutrons, centrifuges, and laser separation processes are some of the devices and techniques being employed to produce large quantities of selected enriched stable isotopes. Nevertheless, the list of enriched stable isotopes available in sufficient quantities remains rather limited, and this continues to restrict the availability of many radioisotopes that otherwise could have a significant impact on society. The Plasma Separation Process is a newly available commercial technique for producing large quantities of a wide range of enriched isotopes and thereby promises to open the door to new and exciting applications of radioisotopes in the future.
Do leaf-cutter ants Atta colombica obtain their magnetic sensors from soil?
USDA-ARS?s Scientific Manuscript database
How animals sense, process and use magnetic information has remained largely elusive. In insects, ferromagnetic particles are candidates for a magnetic sensor. Recent studies suggest that ferromagnetic minerals from soil can be incorporated into the antennae of the migratory ant Pachycondola margina...
Small heterodimer partner (NROB2) coordinates nutrient signaling and the circadian clock in mice
USDA-ARS?s Scientific Manuscript database
Circadian rhythm regulates multiple metabolic processes and in turn is readily entrained by feeding-fasting cycles. However, the molecular mechanisms by which the peripheral clock senses nutrition availability remain largely unknown. Bile acids are under circadian control and also increase postprand...
Measuring Te inclusion uniformity over large areas for CdTe/CZT imaging and spectrometry sensors
NASA Astrophysics Data System (ADS)
Bolke, Joe; O'Brien, Kathryn; Wall, Peter; Spicer, Mike; Gélinas, Guillaume; Beaudry, Jean-Nicolas; Alexander, W. Brock
2017-09-01
CdTe and CZT are key material technologies for gamma- and x-ray imaging, with applications in industry, homeland security, defense, space, medicine, and astrophysics. Challenges remain in achieving uniformity over large detector areas (50-75 mm) due to a combination of material purity, handling, growth process, grown-in defects, doping/compensation, and metal contacts/surface states. The influence of these various factors has yet to be explored at the large substrate level required for devices with higher resolution both spatially and spectroscopically. In this study, we looked at how the crystal growth processes affect the size and density distributions of microscopic Te inclusion defects. We were able to grow single crystals as large as 75 mm in diameter and spatially characterize three-dimensional defects and map the uniformity using IR microscopy. We report on the pattern of observed defects within wafers and its relation to instabilities at the crystal growth interface.
Exploring the Self-Ownership Effect: Separating Stimulus and Response Biases
ERIC Educational Resources Information Center
Golubickis, Marius; Falben, Johanna K.; Cunningham, William A.; Macrae, C. Neil
2018-01-01
Although ownership is acknowledged to exert a potent influence on various aspects of information processing, the origin of these effects remains largely unknown. Based on the demonstration that self-relevance facilitates perceptual judgments (i.e., the self-prioritization effect), here we explored the possibility that ownership enhances object…
In ecosystems where native fish species have been greatly reduced or extirpated, ecological processes such as transport of energy and nutrients across habitats or ecosystems may be lost to the detriment of remaining native species. We hypothesized that fall spawning migrations ...
ERIC Educational Resources Information Center
McNamara, K. P.; O'Reilly, S. L.; George, J.; Peterson, G. M.; Jackson, S. L.; Duncan, G.; Howarth, H.; Dunbar, J. A.
2015-01-01
Background: Delivery of cardiovascular disease (CVD) prevention programs by community pharmacists appears effective and enhances health service access. However, their capacity to implement complex behavioural change processes during patient counselling remains largely unexplored. This study aims to determine intervention fidelity by pharmacists…
Avoiding Aging? Social Psychology's Treatment of Age
ERIC Educational Resources Information Center
Barrett, Anne E.; Redmond, Rebecca; von Rohr, Carmen
2012-01-01
Population aging, in conjunction with social and cultural transformations of the life course, has profound implications for social systems--from large-scale structures to micro-level processes. However, much of sociology remains fairly quiet on issues of age and aging, including the subfield of social psychology that could illuminate the impact of…
Extending the Testing Effect to Self-Regulated Learning
ERIC Educational Resources Information Center
Fernandez, Jonathan; Jamet, Eric
2017-01-01
In addition to serving summative assessment purposes, testing has turned out to be a powerful learning tool. However, while the beneficial effect of testing on learning performances has been confirmed in a large body of literature, the question of exactly how testing influences cognitive and metacognitive processes remains unclear. We therefore…
Advancing the Surveillance Capabilities of the Air Force’s Large-Aperture Telescopes
2014-03-06
frozen flow screens. Lastly, use of the FFM has the added benefit of requiring the estimation of significantly fewer parameters than a... FFM in the restoration process provides the decoding. This remains to be verified. Figure 14. Left: The mean diffraction-limited image for the
What Physicians Reason about during Admission Case Review
ERIC Educational Resources Information Center
Juma, Salina; Goldszmidt, Mark
2017-01-01
Research suggests that physicians perform multiple reasoning tasks beyond diagnosis during patient review. However, these remain largely theoretical. The purpose of this study was to explore reasoning tasks in clinical practice during patient admission review. The authors used a constant comparative approach--an iterative and inductive process of…
Residence Time and Military Workplace Literacies
ERIC Educational Resources Information Center
Doe, Sue; Doe, William W., III
2013-01-01
Despite widespread interest in the reintegration of Post-9/11 military veterans into civilian life, the literacies of Post-9/11 veterans, both academic and professional, remain largely untheorized. This paper addresses this dearth of information by examining the induction processes and resulting workplace literacies of soldiers, airmen/women,…
The Invisible Hand of Inquiry-Based Learning
ERIC Educational Resources Information Center
Bennett, Mark
2015-01-01
The key elements of learning in a classroom remain largely invisible. Teachers cannot expect every student to learn to their fullest capacity; yet they can augment learning within a classroom through inquiry-based learning. In this article, the author describes inquiry-based learning and how to begin this process in the classroom.
NASA Astrophysics Data System (ADS)
Weber, R. J.; Guo, H.; Russell, A. G.; Nenes, A.
2015-12-01
pH is a critical aerosol property that impacts many atmospheric processes, including biogenic secondary organic aerosol formation, gas-particle phase partitioning, and mineral dust or redox metal mobilization. Particle pH has also been linked to adverse health effects. Using a comprehensive data set from the Southern Oxidant and Aerosol Study (SOAS) as the basis for thermodynamic modeling, we have shown that particles are currently highly acidic in the southeastern US, with pH between 0 and 2. Sulfate and ammonium are the main acid-base components that determine particle pH in this region; however, they have different sources and their concentrations are changing. Over 15 years of network data show that sulfur dioxide emission reductions have resulted in a roughly 70 percent decrease in sulfate, whereas ammonia emissions, mainly linked to agricultural activities, have been largely steady, as have gas phase ammonia concentrations. This has led to the view that particles are becoming more neutralized. However, a sensitivity analysis to changing sulfate concentrations, based on thermodynamic modeling, indicates that particles have remained highly acidic over the past decade, despite the large reductions in sulfate. Furthermore, anticipated continued reductions of sulfate and relatively constant ammonia emissions into the future will not significantly change particle pH until sulfate drops to clean continental background levels. The result reshapes our expectation of future particle pH and implies that atmospheric processes and adverse health effects linked to particle acidity will remain unchanged for some time into the future.
Rapid encoding of relationships between spatially remote motion signals.
Maruya, Kazushi; Holcombe, Alex O; Nishida, Shin'ya
2013-02-06
For visual processing, the temporal correlation of remote local motion signals is a strong cue to detect meaningful large-scale structures in the retinal image, because related points are likely to move together regardless of their spatial separation. While the processing of multi-element motion patterns involved in biological motion and optic flow has been studied intensively, the encoding of simpler pairwise relationships between remote motion signals remains poorly understood. We investigated this process by measuring the temporal rate limit for perceiving the relationship of two motion directions presented at the same time at different spatial locations. Compared to luminance or orientation, motion comparison was more rapid. Performance remained very high even when interstimulus separation was increased up to 100°. Motion comparison also remained rapid regardless of whether the two motion directions were similar to or different from each other. The exception was a dramatic slowing when the elements formed an orthogonal "T," in which two motions do not perceptually group together. Motion presented at task-irrelevant positions did not reduce performance, suggesting that the rapid motion comparison could not be ascribed to global optic flow processing. Our findings reveal the existence and unique nature of specialized processing that encodes long-range relationships between motion signals for quick appreciation of global dynamic scene structure.
Cycle time and cost reduction in large-size optics production
NASA Astrophysics Data System (ADS)
Hallock, Bob; Shorey, Aric; Courtney, Tom
2005-09-01
Optical fabrication process steps have remained largely unchanged for decades. Raw glass blanks have been rough-machined, generated to near net shape, loose-abrasive or fine bound-diamond ground, and then polished. This set of processes is sequential, and each subsequent operation removes the damage and microcracking induced by the prior operational step. One of the long-lead aspects of this process has been the glass polishing. Primarily, this has been driven by the need to remove relatively large volumes of glass material compared to the polishing removal rate to ensure complete damage removal. The secondary time driver has been poor convergence to final figure and the corresponding polish-metrology cycles. The overall cycle time and resultant cost due to labor, equipment utilization and shop efficiency are increased, often significantly, when the optical prescription is aspheric. In addition to the long polishing cycle times, the duration of polishing is often very difficult to predict, given that current polishing processes are not deterministic. This paper will describe a novel approach to large optics finishing, relying on several innovative technologies to be presented and illustrated through a variety of examples. The cycle time reductions enabled by this approach promise to result in significant cost and lead-time reductions for large-size optics. In addition, corresponding increases in throughput will provide for less capital expenditure per square meter of optic produced. This process, comparative cycle time estimates, and preliminary results will be discussed.
Steinwand, Daniel R.; Maddox, Brian; Beckmann, Tim; Hamer, George
2003-01-01
Beowulf clusters can provide a cost-effective way to compute numerical models and process large amounts of remote sensing image data. Usually a Beowulf cluster is designed to accomplish a specific set of processing goals, and processing is very efficient when the problem remains inside the constraints of the original design. There are cases, however, when one might wish to compute a problem that is beyond the capacity of the local Beowulf system. In these cases, spreading the problem to multiple clusters or to other machines on the network may provide a cost-effective solution.
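As an illustration of how such a job can be spread across the nodes of a cluster, the sketch below scatters a list of image tiles with mpi4py and gathers the per-node results; mpi4py, process_tile, and the tile list are stand-ins chosen for this example, not the toolchain described in the record.

    # Hypothetical sketch of splitting an image-processing job across cluster nodes
    # with mpi4py; process_tile and the tile list are placeholders.
    # Run with, e.g.: mpirun -n 4 python tiles.py
    from mpi4py import MPI

    def process_tile(tile_id):
        return tile_id * tile_id          # stand-in for real per-tile processing

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    tiles = list(range(64)) if rank == 0 else None
    # the root rank splits the tile list into one chunk per node
    chunks = [tiles[i::size] for i in range(size)] if rank == 0 else None
    my_tiles = comm.scatter(chunks, root=0)

    my_results = [process_tile(t) for t in my_tiles]
    all_results = comm.gather(my_results, root=0)
    if rank == 0:
        print(sum(len(r) for r in all_results), "tiles processed")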
Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes
Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.
2014-01-01
Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467
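For reference, the Gutenberg-Richter law mentioned above states that event counts fall off exponentially with magnitude, log10 N(>=M) = a - bM. The sketch below applies the standard maximum-likelihood (Aki) estimate of b to a synthetic catalogue, which stands in here for the Taiwan catalogue used in the paper.

    # Hypothetical sketch: maximum-likelihood Gutenberg-Richter b-value (Aki's
    # estimator) for magnitudes above a completeness cutoff m_c.
    # The synthetic catalogue stands in for real data.
    import math, random

    def b_value(magnitudes, m_c):
        """b = log10(e) / (mean(M) - m_c), using events with M >= m_c."""
        above = [m for m in magnitudes if m >= m_c]
        return math.log10(math.e) / (sum(above) / len(above) - m_c)

    random.seed(0)
    # synthetic catalogue drawn from an exponential magnitude distribution (b ~ 1)
    catalogue = [2.0 + random.expovariate(math.log(10)) for _ in range(5000)]
    print(round(b_value(catalogue, m_c=2.0), 2))   # expected to be close to 1.0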
Developmental Changes in the Primacy of Facial Cues for Emotion Recognition
ERIC Educational Resources Information Center
Leitzke, Brian T.; Pollak, Seth D.
2016-01-01
There have been long-standing differences of opinion regarding the influence of the face relative to that of contextual information on how individuals process and judge facial expressions of emotion. However, developmental changes in how individuals use such information have remained largely unexplored and could be informative in attempting to…
Mobile Learning as Alternative to Assistive Technology Devices for Special Needs Students
ERIC Educational Resources Information Center
Ismaili, Jalal; Ibrahimi, El Houcine Ouazzani
2017-01-01
Assistive Technology (AT) revolutionized the process of learning for special needs students during the past three decades. Thanks to this technology, accessibility and educational inclusion became attainable more than at any time in the history of special education. Meanwhile, assistive technology devices remain unreachable for a large number of…
Parallel Index and Query for Large Scale Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chou, Jerry; Wu, Kesheng; Ruebel, Oliver
2011-07-18
Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in terms of designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to processing of a massive 50TB dataset generated by a large scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
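Since the abstract does not describe FastBit's actual API, the sketch below only illustrates the generic bitmap-indexing idea behind such systems: one boolean bitmap per value bin, with a range query answered by OR-ing bins and combined across variables by AND. The variable names, bin widths, and data are invented for the example.

    # Hypothetical illustration of bitmap indexing (the general technique behind
    # tools like FastBit), not FastBit's actual API. Data and bins are made up.
    import numpy as np

    rng = np.random.default_rng(1)
    energy = rng.uniform(0.0, 10.0, size=1_000_000)   # e.g. a per-particle quantity
    charge = rng.integers(0, 3, size=1_000_000)

    # one boolean bitmap per energy bin and per charge value
    energy_bins = {i: (energy >= i) & (energy < i + 1) for i in range(10)}
    charge_bins = {c: charge == c for c in range(3)}

    # query: 7 <= energy < 10 AND charge == 2, answered purely from the bitmaps
    hits = (energy_bins[7] | energy_bins[8] | energy_bins[9]) & charge_bins[2]
    print(int(hits.sum()), "records match")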
Deformation and rupture of the oceanic crust may control growth of Hawaiian volcanoes
Got, J.-L.; Monteiller, V.; Monteux, J.; Hassani, R.; Okubo, P.
2008-01-01
Hawaiian volcanoes are formed by the eruption of large quantities of basaltic magma related to hot-spot activity below the Pacific Plate. Despite the apparent simplicity of the parent process - emission of magma onto the oceanic crust - the resulting edifices display some topographic complexity. Certain features, such as rift zones and large flank slides, are common to all Hawaiian volcanoes, indicating similarities in their genesis; however, the underlying mechanism controlling this process remains unknown. Here we use seismological investigations and finite-element mechanical modelling to show that the load exerted by large Hawaiian volcanoes can be sufficient to rupture the oceanic crust. This intense deformation, combined with the accelerated subsidence of the oceanic crust and the weakness of the volcanic edifice/oceanic crust interface, may control the surface morphology of Hawaiian volcanoes, especially the existence of their giant flank instabilities. Further studies are needed to determine whether such processes occur in other active intraplate volcanoes. © 2008 Nature Publishing Group.
Deformation and rupture of the oceanic crust may control growth of Hawaiian volcanoes.
Got, Jean-Luc; Monteiller, Vadim; Monteux, Julien; Hassani, Riad; Okubo, Paul
2008-01-24
Hawaiian volcanoes are formed by the eruption of large quantities of basaltic magma related to hot-spot activity below the Pacific Plate. Despite the apparent simplicity of the parent process--emission of magma onto the oceanic crust--the resulting edifices display some topographic complexity. Certain features, such as rift zones and large flank slides, are common to all Hawaiian volcanoes, indicating similarities in their genesis; however, the underlying mechanism controlling this process remains unknown. Here we use seismological investigations and finite-element mechanical modelling to show that the load exerted by large Hawaiian volcanoes can be sufficient to rupture the oceanic crust. This intense deformation, combined with the accelerated subsidence of the oceanic crust and the weakness of the volcanic edifice/oceanic crust interface, may control the surface morphology of Hawaiian volcanoes, especially the existence of their giant flank instabilities. Further studies are needed to determine whether such processes occur in other active intraplate volcanoes.
Particle Demagnetization in Collisionless Magnetic Reconnection
NASA Technical Reports Server (NTRS)
Hesse, Michael
2006-01-01
The dissipation mechanism of magnetic reconnection remains a subject of intense scientific interest. On one hand, one set of recent studies has shown that particle inertia-based processes, which include thermal and bulk inertial effects, provide the reconnection electric field in the diffusion region. In this presentation, we present analytical theory results, as well as 2.5- and three-dimensional PIC simulations of guide field magnetic reconnection. We will show that diffusion region scale sizes in moderate and large guide field cases are determined by electron Larmor radii, and that analytical estimates of diffusion region dimensions need to include description of the heat flux tensor. The dominant electron dissipation process appears to be based on thermal electron inertia, expressed through nongyrotropic electron pressure tensors. We will argue that this process remains viable in three dimensions by means of a detailed comparison of high resolution particle-in-cell simulations.
Managing the Nuclear Fuel Cycle: Policy Implications of Expanding Global Access to Nuclear Power
2008-09-03
Spent nuclear fuel disposal has remained the most critical aspect of the nuclear fuel cycle for the United States, where longstanding nonproliferation...inalienable right and by and large, neither have U.S. government officials. However, the case of Iran raises perhaps the most critical question in...the enrichment process can take advantage of the slight difference in atomic mass between ²³⁵U and ²³⁸U. The typical enrichment process requires
Large historical growth in global terrestrial gross primary production
Campbell, J. E.; Berry, J. A.; Seibt, U.; ...
2017-04-05
Growth in terrestrial gross primary production (GPP) may provide a negative feedback for climate change. It remains uncertain, however, to what extent biogeochemical processes can suppress global GPP growth. In consequence, model estimates of terrestrial carbon storage and carbon cycle–climate feedbacks remain poorly constrained. Here we present a global, measurement-based estimate of GPP growth during the twentieth century based on long-term atmospheric carbonyl sulphide (COS) records derived from ice core, firn, and ambient air samples. We interpret these records using a model that simulates changes in COS concentration due to changes in its sources and sinks, including a large sink that is related to GPP. We find that the COS record is most consistent with climate-carbon cycle model simulations that assume large GPP growth during the twentieth century (31% ± 5%; mean ± 95% confidence interval). Finally, while this COS analysis does not directly constrain estimates of future GPP growth, it provides a global-scale benchmark for historical carbon cycle simulations.
Large historical growth in global terrestrial gross primary production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, J. E.; Berry, J. A.; Seibt, U.
Growth in terrestrial gross primary production (GPP) may provide a negative feedback for climate change. It remains uncertain, however, to what extent biogeochemical processes can suppress global GPP growth. In consequence, model estimates of terrestrial carbon storage and carbon cycle–climate feedbacks remain poorly constrained. Here we present a global, measurement-based estimate of GPP growth during the twentieth century based on long-term atmospheric carbonyl sulphide (COS) records derived from ice core, firn, and ambient air samples. We interpret these records using a model that simulates changes in COS concentration due to changes in its sources and sinks, including a large sink that is related to GPP. We find that the COS record is most consistent with climate-carbon cycle model simulations that assume large GPP growth during the twentieth century (31% ± 5%; mean ± 95% confidence interval). Finally, while this COS analysis does not directly constrain estimates of future GPP growth, it provides a global-scale benchmark for historical carbon cycle simulations.
Bello, Silvia M; Saladié, Palmira; Cáceres, Isabel; Rodríguez-Hidalgo, Antonio; Parfitt, Simon A
2015-05-01
A recurring theme of late Upper Palaeolithic Magdalenian human bone assemblages is the remarkable rarity of primary burials and the common occurrence of highly-fragmentary human remains mixed with occupation waste at many sites. One of the most extensive Magdalenian human bone assemblages comes from Gough's Cave, a sizeable limestone cave set in Cheddar Gorge (Somerset), UK. After its discovery in the 1880s, the site was developed as a show cave and largely emptied of sediment, at times with minimal archaeological supervision. Some of the last surviving remnants of sediment within the cave were excavated between 1986 and 1992. The excavations uncovered intensively-processed human bones intermingled with abundant butchered large mammal remains and a diverse range of flint, bone, antler, and ivory artefacts. New ultrafiltrated radiocarbon determinations demonstrate that the Upper Palaeolithic human remains were deposited over a very short period of time, possibly during a series of seasonal occupations, about 14,700 years BP (before present). The human remains have been the subject of several taphonomic studies, culminating in a detailed reanalysis of the cranial remains that showed they had been carefully modified to make skull-cups. Our present analysis of the postcrania has identified a far greater degree of human modification than recorded in earlier studies. We identify extensive evidence for defleshing, disarticulation, chewing, crushing of spongy bone, and the cracking of bones to extract marrow. The presence of human tooth marks on many of the postcranial bones provides incontrovertible evidence for cannibalism. In a wider context, the treatment of the human corpses and the manufacture and use of skull-cups at Gough Cave have parallels with other Magdalenian sites in central and western Europe. This suggests that cannibalism during the Magdalenian was part of a customary mortuary practice that combined intensive processing and consumption of the bodies with ritual use of skull-cups. Copyright © 2015 Elsevier Ltd. All rights reserved.
Hayashi, Paul H
2016-02-04
Hepatotoxicity due to drugs, herbal or dietary supplements remains largely a clinical diagnosis based on meticulous history taking and exclusion of other causes of liver injury. In 2004, the U.S. Drug-Induced Liver Injury Network (DILIN) was created under the auspices of the U.S. National Institute of Diabetes and Digestive and Kidney Diseases with the aims of establishing a large registry of cases for clinical, epidemiological and mechanistic study. From inception, the DILIN has used an expert opinion process that incorporates consensus amongst three different DILIN hepatologists assigned to each case. It is the most well-established, well-described and vigorous expert opinion process for DILI to date, and yet it is an imperfect standard. This review will discuss the DILIN expert opinion process, its strengths and weaknesses, psychometric performance and future.
Aftershocks and triggering processes in rock fracture
NASA Astrophysics Data System (ADS)
Davidsen, J.; Kwiatek, G.; Goebel, T.; Stanchits, S. A.; Dresen, G.
2017-12-01
One of the hallmarks of our understanding of seismicity in nature is the importance of triggering processes, which makes the forecasting of seismic activity feasible. These triggering processes, by which one earthquake induces (dynamic or static) stress changes leading to potentially multiple other earthquakes, are, at their core, relaxation processes. A specific example of triggering is the occurrence of aftershocks following a large earthquake, which have been observed to follow certain empirical relationships such as the Omori-Utsu relation. Such an empirical relation should arise from the underlying microscopic dynamics of the involved physical processes, but the exact connection remains to be established. Simple explanations have been proposed but their general applicability is unclear. Many explanations involve the picture of an earthquake as a purely frictional sliding event. Here, we present experimental evidence that these empirical relationships are not limited to frictional processes but also arise in fracture zone formation and are mostly related to compaction-type events. Our analysis is based on tri-axial compression experiments under constant displacement rate on sandstone and granite samples using spatially located acoustic emission events and their focal mechanisms. More importantly, we show that event-event triggering plays an important role in the presence of large-scale or macroscopic imperfections, while such triggering is basically absent if no significant imperfections are present. We also show that spatial localization and an increase in activity rates close to failure do not necessarily imply triggering behavior associated with aftershocks. Only if a macroscopic crack is formed and its propagation remains subcritical do we observe significant triggering.
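For reference, the Omori-Utsu relation cited above describes the decay of the aftershock rate n(t) with time t after a mainshock, with empirical constants K and c and a decay exponent p typically close to 1:

    n(t) = \frac{K}{(t + c)^{p}}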
Metastable Prepores in Tension-Free Lipid Bilayers
NASA Astrophysics Data System (ADS)
Ting, Christina L.; Awasthi, Neha; Müller, Marcus; Hub, Jochen S.
2018-03-01
The formation and closure of aqueous pores in lipid bilayers is a key step in various biophysical processes. Large pores are well described by classical nucleation theory, but the free-energy landscape of small, biologically relevant pores has remained largely unexplored. The existence of small and metastable "prepores" was hypothesized decades ago from electroporation experiments, but resolving metastable prepores from theoretical models remained challenging. Using two complementary methods—atomistic simulations and self-consistent field theory of a minimal lipid model—we determine the parameters for which metastable prepores occur in lipid membranes. Both methods consistently suggest that pore metastability depends on the relative volume ratio between the lipid head group and lipid tails: lipids with a larger head-group volume fraction (or shorter saturated tails) form metastable prepores, whereas lipids with a smaller head-group volume fraction (or longer unsaturated tails) form unstable prepores.
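For context, the classical nucleation description of large pores referred to above balances an edge (line-tension) cost against a surface-tension gain; in its standard form (not spelled out in the abstract), with line tension \gamma and membrane tension \sigma,

    \Delta G(r) = 2\pi \gamma r - \pi \sigma r^{2}, \qquad r^{*} = \frac{\gamma}{\sigma},

so only pores larger than the critical radius r* grow. In a tension-free bilayer (\sigma \to 0) this barrier formally never turns over, which is one way to see why the small-pore (prepore) regime calls for the molecular treatments used in the study.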
The Time Course of Verb Processing in Dutch Sentences
Shapiro, Lewis P.; Wester, Femke; Swinney, David A.; Bastiaanse, Roelien
2012-01-01
The verb has traditionally been characterized as the central element in a sentence. Nevertheless, the exact role of the verb during the actual ongoing comprehension of a sentence as it unfolds in time remains largely unknown. This paper reports the results of two Cross-Modal Lexical Priming (CMLP) experiments detailing the pattern of verb priming during on-line processing of Dutch sentences. Results are contrasted with data from a third CMLP experiment on priming of nouns in similar sentences. It is demonstrated that the meaning of a matrix verb remains active throughout the entire matrix clause, while this is not the case for the meaning of a subject head noun. Activation of the meaning of the verb only dissipates upon encountering a clear signal as to the start of a new clause. PMID:19452278
New Engineering Solutions in Creation of Mini-BOF for Metallic Waste Recycling
NASA Astrophysics Data System (ADS)
Eronko, S. P.; Gorbatyuk, S. M.; Oshovskaya, E. V.; Starodubtsev, B. I.
2017-12-01
New engineering solutions used in the design of a mini melting unit capable of recycling industrial and domestic metallic waste with a high content of harmful impurities are provided. High efficiency of the process technology implemented with this unit is achieved through intensified heat and mass transfer in the molten metal bath, controlled charging of large amounts of lump and fine reagents into the bath, and cut-off of the remaining process slag during metal tapping into the teeming ladle.
Peter Caldwell; Catalina Segura; Shelby Gull Laird; Ge Sun; Steven G. McNulty; Maria Sandercock; Johnny Boggs; James M. Vose
2015-01-01
Assessment of potential climate change impacts on stream water temperature (Ts) across large scales remains challenging for resource managers because energy exchange processes between the atmosphere and the stream environment are complex and uncertain, and few long-term datasets are available to evaluate changes over time. In this study, we...
R. Flint Hughes; Amanda Uowolo
2006-01-01
Invasive species have the capacity to substantially alter soil processes, including rates of litter decomposition. Currently, the few remaining native-dominated lowland wet forests in Hawaii are being invaded by Falcataria moluccana, a large, fast-growing, N2-fixing tree. In this study, we sought to determine the extent to...
ERIC Educational Resources Information Center
Daschmann, Elena C.; Goetz, Thomas; Stupnisky, Robert H.
2011-01-01
Background: Boredom has been found to be an important emotion for students' learning processes and achievement outcomes; however, the precursors of this emotion remain largely unexplored. Aim: In the current study, scales assessing the precursors to boredom in academic achievement settings were developed and tested. Sample: Participants were 1,380…
ERIC Educational Resources Information Center
Brennan, Christine; Booth, James R.
2015-01-01
Linguistic knowledge, cognitive ability, and instruction influence how adults acquire a second orthography yet it remains unclear how different forms of instruction influence grain size sensitivity and subsequent decoding skill and speed. Thirty-seven monolingual, literate English-speaking adults were trained on a novel artificial orthography…
Embracing Community Ecology in Plant Microbiome Research.
Dini-Andreote, Francisco; Raaijmakers, Jos M
2018-06-01
Community assembly is mediated by selection, dispersal, drift, and speciation. Environmental selection is mostly used to date to explain patterns in plant microbiome assembly, whereas the influence of the other processes remains largely elusive. Recent studies highlight that adopting community ecology concepts provides a mechanistic framework for plant microbiome research. Copyright © 2018 Elsevier Ltd. All rights reserved.
Readout circuit with novel background suppression for long wavelength infrared focal plane arrays
NASA Astrophysics Data System (ADS)
Xie, L.; Xia, X. J.; Zhou, Y. F.; Wen, Y.; Sun, W. F.; Shi, L. X.
2011-02-01
In this article, a novel pixel readout circuit using a switched-capacitor integrator-mode background suppression technique is presented for long wavelength infrared focal plane arrays. This circuit can improve dynamic range and signal-to-noise ratio by suppressing the large background current during integration. Compared with other background suppression techniques, the new technique is less sensitive to process mismatch and adds no additional shot noise. The proposed circuit is theoretically analysed and simulated while taking into account non-ideal characteristics. The results show that the background suppression non-uniformity is ultra-low even for a large process mismatch. The background suppression non-uniformity of the proposed circuit also remains very small with technology scaling.
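As a rough illustration of why suppressing the background current pays off in such a pixel, the charge-budget sketch below (in Python) compares the maximum integration time with and without ideal background subtraction for a fixed integration capacitor; every numerical value is assumed for illustration and is not taken from the article.

    # Illustrative charge budget for an integrating LWIR pixel with and without
    # background suppression. All parameter values are assumed, not from the paper.
    Q_E = 1.602e-19      # electron charge [C]
    C_INT = 2e-12        # integration capacitance [F] (assumed)
    V_SWING = 2.0        # usable integrator voltage swing [V] (assumed)
    I_BKG = 5e-9         # background photocurrent [A] (assumed)
    I_SIG = 0.5e-9       # signal photocurrent [A] (assumed)

    well_capacity = C_INT * V_SWING / Q_E              # electrons the well can hold
    t_max_plain = C_INT * V_SWING / (I_BKG + I_SIG)    # background fills the well
    t_max_suppressed = C_INT * V_SWING / I_SIG         # only the signal is integrated

    print(f"well capacity             : {well_capacity:.2e} e-")
    print(f"max T_int, no suppression : {t_max_plain * 1e6:.1f} us")
    print(f"max T_int, suppressed     : {t_max_suppressed * 1e6:.1f} us")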
An overview of the 1984 Battelle outside users payload model
NASA Astrophysics Data System (ADS)
Day, J. B.; Conlon, R. J.; Neale, D. B.; Fischer, N. H.
1984-10-01
The methodology and projections from a model of the market for non-NASA, non-DOD reimbursable payloads from the non-Soviet-bloc countries over the 1984-2000 AD time period are summarized. High and low forecast ranges were made based on demand forecasts by industrial users, NASA estimates, and other publications. The launches were assumed to be allotted to either the Shuttle or the Ariane. The greatest demand for launch services is expected to come from communications and materials processing payloads, the latter either becoming a large user or remaining a research item. The number of Shuttle payload equivalents over the reference time span is projected as 84-194, showing the large variance that depends on the progress in materials processing operations.
Reason, emotion and decision-making: risk and reward computation with feeling.
Quartz, Steven R
2009-05-01
Many models of judgment and decision-making posit distinct cognitive and emotional contributions to decision-making under uncertainty. Cognitive processes typically involve exact computations according to a cost-benefit calculus, whereas emotional processes typically involve approximate, heuristic processes that deliver rapid evaluations without mental effort. However, it remains largely unknown which specific parameters of uncertain decisions the brain encodes, the extent to which these parameters correspond to various decision-making frameworks, and their correspondence to emotional and rational processes. Here, I review research suggesting that emotional processes encode in a precise quantitative manner the basic parameters of financial decision theory, indicating a reorientation of emotional and cognitive contributions to risky choice.
Medical students perceive better group learning processes when large classes are made to seem small.
Hommes, Juliette; Arah, Onyebuchi A; de Grave, Willem; Schuwirth, Lambert W T; Scherpbier, Albert J J A; Bos, Gerard M J
2014-01-01
Medical schools struggle with large classes, which might interfere with the effectiveness of learning within small groups due to students being unfamiliar with fellow students. The aim of this study was to assess the effects of making a large class seem small on the students' collaborative learning processes. A randomised controlled intervention study was undertaken to make a large class seem small, without the need to reduce the number of students enrolling in the medical programme. The class was divided into subsets: two small subsets (n=50) as the intervention groups; a control group (n=102) was mixed with the remaining students (the non-randomised group n∼100) to create one large subset. The setting was the undergraduate curriculum of the Maastricht Medical School, applying the Problem-Based Learning principles. In this learning context, students learn mainly in tutorial groups, composed randomly from a large class every 6-10 weeks. The formal group learning activities were organised within the subsets. Students from the intervention groups met frequently within the formal groups, in contrast to the students from the large subset, who hardly enrolled with the same students in formal activities. Three outcome measures assessed students' group learning processes over time: learning within formally organised small groups, learning with other students in the informal context, and perceptions of the intervention. Formal group learning processes were perceived more positively in the intervention groups from the second study year on, with a mean increase of β=0.48. Informal group learning activities occurred almost exclusively within the subsets as defined by the intervention from the first week of the medical curriculum (E-I indexes>-0.69). Interviews tapped mainly positive effects and negligible negative side effects of the intervention. Better group learning processes can be achieved in large medical schools by making large classes seem small.
Medical Students Perceive Better Group Learning Processes when Large Classes Are Made to Seem Small
Hommes, Juliette; Arah, Onyebuchi A.; de Grave, Willem; Schuwirth, Lambert W. T.; Scherpbier, Albert J. J. A.; Bos, Gerard M. J.
2014-01-01
Objective Medical schools struggle with large classes, which might interfere with the effectiveness of learning within small groups due to students being unfamiliar with fellow students. The aim of this study was to assess the effects of making a large class seem small on the students' collaborative learning processes. Design A randomised controlled intervention study was undertaken to make a large class seem small, without the need to reduce the number of students enrolling in the medical programme. The class was divided into subsets: two small subsets (n = 50) as the intervention groups; a control group (n = 102) was mixed with the remaining students (the non-randomised group n∼100) to create one large subset. Setting The undergraduate curriculum of the Maastricht Medical School, applying the Problem-Based Learning principles. In this learning context, students learn mainly in tutorial groups, composed randomly from a large class every 6–10 weeks. Intervention The formal group learning activities were organised within the subsets. Students from the intervention groups met frequently within the formal groups, in contrast to the students from the large subset, who hardly enrolled with the same students in formal activities. Main Outcome Measures Three outcome measures assessed students' group learning processes over time: learning within formally organised small groups, learning with other students in the informal context, and perceptions of the intervention. Results Formal group learning processes were perceived more positively in the intervention groups from the second study year on, with a mean increase of β = 0.48. Informal group learning activities occurred almost exclusively within the subsets as defined by the intervention from the first week of the medical curriculum (E-I indexes>−0.69). Interviews tapped mainly positive effects and negligible negative side effects of the intervention. Conclusion Better group learning processes can be achieved in large medical schools by making large classes seem small. PMID:24736272
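For readers unfamiliar with the E-I index quoted in both versions of this abstract, it is the standard Krackhardt-Stern ratio (external ties minus internal ties, divided by their sum), ranging from -1 when all of a student's ties stay inside the own subset to +1 when all ties go outside it. The short Python sketch below computes it; the tie counts are invented for illustration.

    # Krackhardt-Stern E-I index: (E - I) / (E + I).
    # -1 means all ties are internal to the subset, +1 means all are external.
    def ei_index(external_ties: int, internal_ties: int) -> float:
        total = external_ties + internal_ties
        if total == 0:
            raise ValueError("no ties observed")
        return (external_ties - internal_ties) / total

    # Example: a student with 2 informal study contacts outside their subset
    # and 14 inside it (made-up numbers).
    print(ei_index(external_ties=2, internal_ties=14))  # -> -0.75, mostly internal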
NASA Astrophysics Data System (ADS)
Parsons, Todd L.; Rogers, Tim
2017-10-01
Systems composed of large numbers of interacting agents often admit an effective coarse-grained description in terms of a multidimensional stochastic dynamical system, driven by small-amplitude intrinsic noise. In applications to biological, ecological, chemical and social dynamics it is common for these models to possess quantities that are approximately conserved on short timescales, in which case system trajectories are observed to remain close to some lower-dimensional subspace. Here, we derive explicit and general formulae for a reduced-dimension description of such processes that is exact in the limit of small noise and well-separated slow and fast dynamics. The Michaelis-Menten law of enzyme-catalysed reactions and the link between the Lotka-Volterra and Wright-Fisher processes are explored as simple worked examples. Extensions of the method are presented for infinite dimensional systems and processes coupled to non-Gaussian noise sources.
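As a concrete, hedged illustration of the kind of slow-fast reduction described above (not the authors' general formulae), the Python sketch below compares full mass-action enzyme kinetics with the reduced Michaelis-Menten rate obtained by eliminating the fast complex variable; the rate constants and initial conditions are arbitrary.

    # Full enzyme kinetics E + S <-> C -> E + P versus the reduced Michaelis-Menten
    # description obtained by eliminating the fast complex variable C.
    # Rate constants and initial conditions are arbitrary illustrative values.
    K1, KM1, K2 = 10.0, 1.0, 1.0      # binding, unbinding, catalytic rates
    E_TOT, S0 = 1.0, 10.0             # total enzyme and initial substrate
    DT, T_END = 1e-4, 5.0

    def full_model() -> float:
        s, c = S0, 0.0
        for _ in range(int(T_END / DT)):
            ds = -K1 * (E_TOT - c) * s + KM1 * c
            dc = K1 * (E_TOT - c) * s - (KM1 + K2) * c
            s, c = s + DT * ds, c + DT * dc
        return s

    def reduced_model() -> float:
        # Michaelis-Menten: dS/dt = -k2 * E_tot * S / (K_M + S)
        k_m = (KM1 + K2) / K1
        s = S0
        for _ in range(int(T_END / DT)):
            s += DT * (-K2 * E_TOT * s / (k_m + s))
        return s

    print("substrate left, full model   :", round(full_model(), 3))
    print("substrate left, reduced model:", round(reduced_model(), 3))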
Fabrication of large area Si cylindric drift detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, W.; Kraner, H.W.; Li, Z.
1993-04-01
Processing steps for an advanced Si drift detector, a large-area cylindrical drift detector (CDD), were carried out in the BNL class 100 cleanroom, with the exception of the ion implantation. A double-side planar process technique was developed for the fabrication of the CDD. Important improvements of the double-side planar process in this fabrication are the introduction of an Al implantation protection mask and the retention of a 1000 Angstrom oxide layer in the p-window during the implantation. Another important design feature of the CDD is the structure called "river", which allows the current generated at the Si-SiO2 interface to "flow" into the guard anode and thus minimizes the leakage current at the signal anode. The test results showed that, for the best detector, most of the signal anodes have a leakage current of about 0.3 nA/cm2.
Large-Scale Fluorescence Calcium-Imaging Methods for Studies of Long-Term Memory in Behaving Mammals
Jercog, Pablo; Rogerson, Thomas; Schnitzer, Mark J.
2016-01-01
During long-term memory formation, cellular and molecular processes reshape how individual neurons respond to specific patterns of synaptic input. It remains poorly understood how such changes impact information processing across networks of mammalian neurons. To observe how networks encode, store, and retrieve information, neuroscientists must track the dynamics of large ensembles of individual cells in behaving animals, over timescales commensurate with long-term memory. Fluorescence Ca2+-imaging techniques can monitor hundreds of neurons in behaving mice, opening exciting avenues for studies of learning and memory at the network level. Genetically encoded Ca2+ indicators allow neurons to be targeted by genetic type or connectivity. Chronic animal preparations permit repeated imaging of neural Ca2+ dynamics over multiple weeks. Together, these capabilities should enable unprecedented analyses of how ensemble neural codes evolve throughout memory processing and provide new insights into how memories are organized in the brain. PMID:27048190
Large-scale production of human pluripotent stem cell derived cardiomyocytes.
Kempf, Henning; Andree, Birgit; Zweigerdt, Robert
2016-01-15
Regenerative medicine, including preclinical studies in large animal models and tissue engineering approaches as well as innovative assays for drug discovery, will require the constant supply of hPSC-derived cardiomyocytes and other functional progenies. Respective cell production processes must be robust, economically viable and ultimately GMP-compliant. Recent research has enabled transition of lab scale protocols for hPSC expansion and cardiomyogenic differentiation towards more controlled processing in industry-compatible culture platforms. Here, advanced strategies for the cultivation and differentiation of hPSCs will be reviewed by focusing on stirred bioreactor-based techniques for process upscaling. We will discuss how cardiomyocyte mass production might benefit from recent findings such as cell expansion at the cardiovascular progenitor state. Finally, remaining challenges will be highlighted, specifically regarding three dimensional (3D) hPSC suspension culture and critical safety issues ahead of clinical translation. Copyright © 2015 Elsevier B.V. All rights reserved.
The production of transuranium elements by the r-process nucleosynthesis
NASA Astrophysics Data System (ADS)
Goriely, S.; Martínez Pinedo, G.
2015-12-01
The production of super-heavy transuranium elements by stellar nucleosynthesis processes remains an open question. The most promising process that could potentially give rise to the formation of such elements is the so-called rapid neutron-capture process, or r-process, known to be at the origin of approximately half of the A > 60 stable nuclei observed in nature. However, despite important efforts, the astrophysical site of the r-process remains unidentified. Here, we study the r-process nucleosynthesis in material that is dynamically ejected by tidal and pressure forces during the merging of binary neutron stars. Neutron star mergers could potentially be the dominant r-process site in the Galaxy, but also, due to the extreme neutron richness found in such environments, could potentially synthesise super-heavy elements. R-process nucleosynthesis during the decompression is known to be largely insensitive to the detailed astrophysical conditions because of efficient fission recycling, producing a composition that closely follows the solar r-abundance distribution for nuclei with mass numbers A > 140. During the neutron irradiation, nuclei up to charge numbers Z ≃ 110 and mass number A ≃ 340 are produced, with a major peak production at the N = 184 shell closure, i.e. around A ≃ 280. Super-heavy nuclei with Z > 110 can hardly be produced due to the efficient fission taking place along those isotopic chains. Long-lived transuranium nuclei are inevitably produced by the r-process. The predictions concerning the production of transuranium nuclei remain, however, very sensitive to the predictions of fission barrier heights for such super-heavy nuclei. More nuclear predictions within different microscopic approaches are needed.
El Gabaly, Farid; Schmid, Andreas K.
2013-03-19
A novel method of forming large atomically flat areas is described, in which a crystalline substrate having a stepped surface is exposed to a vapor of another material to deposit that material onto the substrate; under appropriate conditions the deposited material self-arranges to form 3D islands across the substrate surface. These islands are atomically flat at their top surface and conform to the stepped surface of the substrate below at the island-substrate interface. Thereafter, the deposited materials are etched away, and in the etch process the atomically flat surface areas of the islands are transferred to the underlying substrate. Thereafter the substrate may be cleaned and annealed to remove any remaining unwanted contaminants and to eliminate any residual defects that may have remained in the substrate surface as a result of pre-existing imperfections of the substrate.
The colibactin warhead crosslinks DNA
NASA Astrophysics Data System (ADS)
Vizcaino, Maria I.; Crawford, Jason M.
2015-05-01
Members of the human microbiota are increasingly being correlated to human health and disease states, but the majority of the underlying microbial metabolites that regulate host-microbe interactions remain largely unexplored. Select strains of Escherichia coli present in the human colon have been linked to the initiation of inflammation-induced colorectal cancer through an unknown small-molecule-mediated process. The responsible non-ribosomal peptide-polyketide hybrid pathway encodes ‘colibactin’, which belongs to a largely uncharacterized family of small molecules. Genotoxic small molecules from this pathway that are capable of initiating cancer formation have remained elusive due to their high instability. Guided by metabolomic analyses, here we employ a combination of NMR spectroscopy and bioinformatics-guided isotopic labelling studies to characterize the colibactin warhead, an unprecedented substituted spirobicyclic structure. The warhead crosslinks duplex DNA in vitro, providing direct experimental evidence for colibactin's DNA-damaging activity. The data support unexpected models for both colibactin biosynthesis and its mode of action.
Metastable Prepores in Tension-Free Lipid Bilayers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ting, Christina L.; Awasthi, Neha; Muller, Marcus
The formation and closure of aqueous pores in lipid bilayers is a key step in various biophysical processes. Large pores are well described by classical nucleation theory, but the free-energy landscape of small, biologically relevant pores has remained largely unexplored. The existence of small and metastable “prepores” was hypothesized decades ago from electroporation experiments, but resolving metastable prepores from theoretical models remained challenging. Using two complementary methods—atomistic simulations and self-consistent field theory of a minimal lipid model—we determine the parameters for which metastable prepores occur in lipid membranes. Here, both methods consistently suggest that pore metastability depends on the relative volume ratio between the lipid head group and lipid tails: lipids with a larger head-group volume fraction (or shorter saturated tails) form metastable prepores, whereas lipids with a smaller head-group volume fraction (or longer unsaturated tails) form unstable prepores.
Metastable Prepores in Tension-Free Lipid Bilayers
Ting, Christina L.; Awasthi, Neha; Muller, Marcus; ...
2018-03-23
The formation and closure of aqueous pores in lipid bilayers is a key step in various biophysical processes. Large pores are well described by classical nucleation theory, but the free-energy landscape of small, biologically relevant pores has remained largely unexplored. The existence of small and metastable “prepores” was hypothesized decades ago from electroporation experiments, but resolving metastable prepores from theoretical models remained challenging. Using two complementary methods—atomistic simulations and self-consistent field theory of a minimal lipid model—we determine the parameters for which metastable prepores occur in lipid membranes. Here, both methods consistently suggest that pore metastability depends on the relative volume ratio between the lipid head group and lipid tails: lipids with a larger head-group volume fraction (or shorter saturated tails) form metastable prepores, whereas lipids with a smaller head-group volume fraction (or longer unsaturated tails) form unstable prepores.
Mostafa, Ayman; Nolte, Ingo; Wefstaedt, Patrick
2018-06-05
Medial coronoid process disease is a leading cause of thoracic limb lameness in dogs. Computed tomography and arthroscopy are superior to radiography for diagnosing medial coronoid process disease; however, radiography remains the most widely available diagnostic imaging modality in veterinary practice. The objectives of this retrospective observational study were to describe the prevalence of medial coronoid process disease in lame large breed dogs and to apply a novel method for quantifying the radiographic changes associated with the medial coronoid process and subtrochlear-ulnar region in Labrador and Golden Retrievers with confirmed medial coronoid process disease. Purebred Labrador and Golden Retrievers (n = 143, 206 elbows) without and with confirmed medial coronoid process disease were included. The prevalence of medial coronoid process disease in lame large breed dogs was calculated. Mediolateral and craniocaudal radiographs of elbows were analyzed to assess the medial coronoid process length and morphology, and subtrochlear-ulnar width. Mean grayscale value was calculated for radial and subtrochlear-ulnar zones. The prevalence of medial coronoid process disease was 20.8%. Labrador and Golden Retrievers were the most affected purebred dogs (29.6%). Elbows with confirmed medial coronoid process disease had a short (P < 0.0001) and deformed (∼95%) medial coronoid process, with associated medial coronoid process osteophytosis (7.5%). Subtrochlear-ulnar sclerosis was evident in ∼96% of diseased elbows, with a significant increase (P < 0.0001) in subtrochlear-ulnar width and standardized grayscale value. Radial grayscale value did not differ between groups. Periarticular osteophytosis was identified in 51.4% of elbows with medial coronoid process disease. Medial coronoid process length and morphology, and subtrochlear-ulnar width and standardized grayscale value varied significantly in dogs with confirmed medial coronoid process disease compared to controls. Findings indicated that medial coronoid process disease has a high prevalence in lame large breed dogs and that quantitative radiographic assessments can contribute to the diagnosis. © 2018 American College of Veterinary Radiology.
ERIC Educational Resources Information Center
Russell, A. W.; Netherwood, G. M. A.; Robinson, S. A.
2004-01-01
Photosynthesis is a central topic in biology education. It remains one of the most challenging, largely because of a) its conceptual difficulty, leading to lack of interest and misconceptions among students; b) the difficulties students have in visualising the process, or relating it to things they can see, especially when the topic is presented…
Martin Barrette; Louis Bélanger; Louis De Grandpré; Alejandro A. Royo
2017-01-01
In the absence of large-scale stand replacing disturbances, boreal forests can remain in the old-growth stage over time because of a dynamic equilibrium between small-scale mortality and regeneration processes. Although this gap paradigm has been a cornerstone of forest dynamics theory and practice for decades, evidence suggests that it could be disrupted, threatening...
ERIC Educational Resources Information Center
Blau, Vera; Reithler, Joel; van Atteveldt, Nienke; Seitz, Jochen; Gerretsen, Patty; Goebel, Rainer; Blomert, Leo
2010-01-01
Learning to associate auditory information of speech sounds with visual information of letters is a first and critical step for becoming a skilled reader in alphabetic languages. Nevertheless, it remains largely unknown which brain areas subserve the learning and automation of such associations. Here, we employ functional magnetic resonance…
Cost effective technologies and renewable substrates for biosurfactants’ production
Banat, Ibrahim M.; Satpute, Surekha K.; Cameotra, Swaranjit S.; Patil, Rajendra; Nyayanit, Narendra V.
2014-01-01
Diverse types of microbial surface-active amphiphilic molecules are produced by a range of microbial communities. The extraordinary properties of biosurfactants/bioemulsifiers (BS/BE) as surface-active products allow them to play key roles in various fields of application such as bioremediation, biodegradation, enhanced oil recovery, pharmaceutics, and food processing, among many others. This leads to a vast number of potential applications of these BS/BE in different industrial sectors. Despite the huge number of reports and patents describing BS and BE applications and advantages, commercialization of these compounds remains difficult, costly and to a large extent irregular. This is mainly due to the use of chemically synthesized media for growing the producing microorganisms and, in turn, for obtaining products of the preferred quality. It is important to note that although a number of developments have taken place in the field of BS industries, large scale production remains economically challenging for many types of these products. This is mainly due to the huge monetary difference between the investment and the achievable productivity from the commercial point of view. This review discusses low cost, renewable raw substrates, and fermentation technology in BS/BE production processes and their role in reducing the production cost. PMID:25566213
Efficient collective influence maximization in cascading processes with first-order transitions
Pei, Sen; Teng, Xian; Shaman, Jeffrey; Morone, Flaviano; Makse, Hernán A.
2017-01-01
In many social and biological networks, the collective dynamics of the entire system can be shaped by a small set of influential units through a global cascading process, manifested by an abrupt first-order transition in dynamical behaviors. Despite its importance in applications, efficient identification of multiple influential spreaders in cascading processes still remains a challenging task for large-scale networks. Here we address this issue by exploring the collective influence in general threshold models of cascading process. Our analysis reveals that the importance of spreaders is fixed by the subcritical paths along which cascades propagate: the number of subcritical paths attached to each spreader determines its contribution to global cascades. The concept of subcritical path allows us to introduce a scalable algorithm for massively large-scale networks. Results in both synthetic random graphs and real networks show that the proposed method can achieve larger collective influence given the same number of seeds compared with other scalable heuristic approaches. PMID:28349988
Efficient collective influence maximization in cascading processes with first-order transitions
NASA Astrophysics Data System (ADS)
Pei, Sen; Teng, Xian; Shaman, Jeffrey; Morone, Flaviano; Makse, Hernán A.
2017-03-01
In many social and biological networks, the collective dynamics of the entire system can be shaped by a small set of influential units through a global cascading process, manifested by an abrupt first-order transition in dynamical behaviors. Despite its importance in applications, efficient identification of multiple influential spreaders in cascading processes still remains a challenging task for large-scale networks. Here we address this issue by exploring the collective influence in general threshold models of cascading process. Our analysis reveals that the importance of spreaders is fixed by the subcritical paths along which cascades propagate: the number of subcritical paths attached to each spreader determines its contribution to global cascades. The concept of subcritical path allows us to introduce a scalable algorithm for massively large-scale networks. Results in both synthetic random graphs and real networks show that the proposed method can achieve larger collective influence given the same number of seeds compared with other scalable heuristic approaches.
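The abstract above describes ranking spreaders by the subcritical paths they feed; that algorithm is not reproduced here, but the Python sketch below at least illustrates the underlying threshold cascade and how different seed sets can be compared on a random graph. It is an assumption-laden toy, not the authors' method.

    # Toy threshold-model cascade on a random graph, used to compare seed sets.
    # This is NOT the subcritical-path algorithm of the paper; it only shows the
    # cascading process that such seed-selection methods try to maximize.
    import random

    def random_graph(n, avg_degree, rng):
        p = avg_degree / (n - 1)
        nbrs = {i: set() for i in range(n)}
        for i in range(n):
            for j in range(i + 1, n):
                if rng.random() < p:
                    nbrs[i].add(j)
                    nbrs[j].add(i)
        return nbrs

    def cascade_size(nbrs, seeds, threshold):
        """Activate seeds, then any node with >= threshold active neighbours."""
        active = set(seeds)
        frontier = set(seeds)
        while frontier:
            candidates = {v for u in frontier for v in nbrs[u]} - active
            frontier = {v for v in candidates
                        if sum(1 for w in nbrs[v] if w in active) >= threshold}
            active |= frontier
        return len(active)

    rng = random.Random(0)
    g = random_graph(n=2000, avg_degree=6, rng=rng)
    hubs = sorted(g, key=lambda v: len(g[v]), reverse=True)[:100]
    rand = rng.sample(sorted(g), 100)
    print("cascade from high-degree seeds:", cascade_size(g, hubs, threshold=2))
    print("cascade from random seeds     :", cascade_size(g, rand, threshold=2))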
Jones, M; Cameron, D
2017-09-22
Task shifting has enabled South Africa (SA) to rapidly expand its HIV treatment programme. This has been achieved by training and mentoring primary-care nurses in nurse initiation and management of antiretroviral therapy (NIMART). Five years into its clinical mentoring programme, the Foundation for Professional Development conducted an evaluation that identified improved knowledge, attitudes and confidence perceived by nurses who received NIMART mentoring. Low completion rates for the Department of Health (DoH) NIMART training process were identified and targeted mentoring was therefore introduced; this increased the percentage of primary-care nurses eligible for DoH certificates of clinical competence in NIMART from 12% by a further 30 percentage points. A large number of primary-care nurses still require mentoring in order to complete the NIMART process. For those who have completed the process, there remains a need for ongoing mentoring as SA's HIV programme evolves, complex cases emerge and primary care undergoes change.
Blanco, Juan; Arévalo, Fabiola; Correa, Jorge; Porro, M Corina; Cabado, Ana G; Vieites, Juan M; Moroño, Angeles
2016-03-15
The effect of canning in pickled sauce and autoclaving on the weight, toxin content, toxin concentration and toxicity of steamed mussels was studied. Weight decreased by 25.5%. Okadaic acid (OA) and DTX2 content of mussel meat decreased by 24.1% and 42.5%, respectively. The estimated toxicity of the mussels remained nearly unchanged (it increased by 2.9%). Part of the toxins lost by the mussels leached into the sauce, but the remaining part should have been thermally degraded. DTX2 underwent more degradation than OA and, for both toxins, free forms more than conjugated ones. This process, therefore, cannot be responsible for the large increases in toxicity of processed mussels, relative to raw ones, sometimes detected by food processing companies. The final product could be monitored in several ways, but analysing the whole can content or the mussel meat once rehydrated seems to be the most equivalent to controls on raw mussels. Copyright © 2016 Elsevier Ltd. All rights reserved.
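The near-constancy of toxicity despite the loss of toxin follows from the parallel loss of meat weight; a back-of-the-envelope check using only the percentages quoted above (with arbitrary absolute masses, and okadaic acid alone) is sketched below in Python.

    # Toxin *content* falls, but so does meat weight, so toxin *concentration*
    # (and hence estimated toxicity per gram) stays roughly flat.
    # Absolute masses are arbitrary; only the quoted percentages matter.
    meat_before = 100.0                      # g of steamed mussel meat (arbitrary)
    oa_before = 10.0                         # ug of okadaic acid in that meat (arbitrary)

    meat_after = meat_before * (1 - 0.255)   # weight decreased by 25.5%
    oa_after = oa_before * (1 - 0.241)       # OA content decreased by 24.1%

    change = oa_after / meat_after / (oa_before / meat_before) - 1
    print(f"OA concentration change: {100 * change:+.1f}%")   # roughly +1.9%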
Rare-earth abundances in chondritic meteorites
NASA Technical Reports Server (NTRS)
Evensen, N. M.; Hamilton, P. J.; Onions, R. K.
1978-01-01
Fifteen chondrites, including eight carbonaceous chondrites, were analyzed for rare earth element (REE) abundances by isotope dilution. Examination of REE patterns for a large number of individual chondrites shows that only a small proportion of the analyses have flat, unfractionated REE patterns within experimental error. While some of the remaining analyses are consistent with magmatic fractionation, many patterns, in particular those with positive Ce anomalies, cannot be explained by known magmatic processes. Elemental abundance anomalies are found in all major chondrite classes. The persistence of anomalies in chondritic materials relatively removed from direct condensational processes implies that anomalous components are resistant to equilibration or were introduced at a late stage of chondrite formation. Large-scale segregation of gas and condensate is implied, and bulk variations in REE abundances between planetary bodies are possible.
Scalable loading of a two-dimensional trapped-ion array
Bruzewicz, Colin D.; McConnell, Robert; Chiaverini, John; Sage, Jeremy M.
2016-01-01
Two-dimensional arrays of trapped-ion qubits are attractive platforms for scalable quantum information processing. Sufficiently rapid reloading capable of sustaining a large array, however, remains a significant challenge. Here with the use of a continuous flux of pre-cooled neutral atoms from a remotely located source, we achieve fast loading of a single ion per site while maintaining long trap lifetimes and without disturbing the coherence of an ion quantum bit in an adjacent site. This demonstration satisfies all major criteria necessary for loading and reloading extensive two-dimensional arrays, as will be required for large-scale quantum information processing. Moreover, the already high loading rate can be increased by loading ions in parallel with only a concomitant increase in photo-ionization laser power and no need for additional atomic flux. PMID:27677357
NASA Astrophysics Data System (ADS)
Du, Guohui; Chen, Yao; Zhu, Chunming; Liu, Chang; Ge, Lili; Wang, Bing; Li, Chuanyang; Wang, Haimin
2018-06-01
Coronal loops interconnecting two active regions (ARs), called interconnecting loops (ILs), are prominent large-scale structures in the solar atmosphere. They carry a significant amount of magnetic flux and therefore are considered to be an important element of the solar dynamo process. Earlier observations showed that eruptions of ILs are an important source of CMEs. It is generally believed that ILs are formed through magnetic reconnection in the high corona (>150″–200″), and several scenarios have been proposed to explain their brightening in soft X-rays (SXRs). However, the detailed IL formation process has not been fully explored, and the associated energy release in the corona still remains unresolved. Here, we report the complete formation process of a set of ILs connecting two nearby ARs, with successive observations by STEREO-A on the far side of the Sun and by SDO and Hinode on the Earth side. We conclude that ILs are formed by gradual reconnection high in the corona, in line with earlier postulations. In addition, we show evidence that ILs brighten in SXRs and EUVs through heating at or close to the reconnection site in the corona (i.e., through the direct heating process of reconnection), a process that has been largely overlooked in earlier studies of ILs.
Full-tree utilization of southern pine and hardwoods growing on southern pine sites
Peter Koch
1974-01-01
In 1963, approximately 30 percent of the dry weight of above- and below-ground parts of southern pine trees ended as dry-surfaced lumber or paper; the remaining 70 percent was largely unused. By 1980, computer-controlled chipping headrigs, thin-kerf saws, lamination of lumber from rotary-cut veneer, high-yield pulping processes, and more intensive use of roots, bark,...
Whole-tree utilization of southern pine advanced by developments in mechanical conversion
P. Koch
1973-01-01
In 1963 approximately 30 percent of the dry weight of above- and below-ground parts of southern pine trees ended as dry-surfaced lumber or paper; the remaining 70 percent was largely unused. By 1980, computer-controlled chipping headrigs, thin-kerf saws, lamination of lumber from rotary-cut veneer, high-yield pulping processes, and more intensive use of roots, bark,...
Whole-tree utilization of southern pine advanced by developments in mechanical conversion
Peter Koch
1973-01-01
In 1963 approximately 30 percent of the dry weight of above- and below-ground parts of southern pine trees ended as dry-surfaced lumber or paper; the remaining 70 percent was largely unused. By 1980, computer-controlled chipping headrigs, thin-kerf saws, lamination of lumber from rotary-cut veneer, high-yield pulping processes, and more intensive use of roots, bark,...
NASA Astrophysics Data System (ADS)
Neggers, Roel
2016-04-01
Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolation from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A common thread in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach), and iii) process-level evaluation at climate time-scales. The advantages and disadvantages of each approach will be identified and discussed, and some thoughts about possible future developments will be given.
High-dimensional single-cell analysis reveals the immune signature of narcolepsy.
Hartmann, Felix J; Bernard-Valnet, Raphaël; Quériault, Clémence; Mrdjen, Dunja; Weber, Lukas M; Galli, Edoardo; Krieg, Carsten; Robinson, Mark D; Nguyen, Xuan-Hung; Dauvilliers, Yves; Liblau, Roland S; Becher, Burkhard
2016-11-14
Narcolepsy type 1 is a devastating neurological sleep disorder resulting from the destruction of orexin-producing neurons in the central nervous system (CNS). Despite its striking association with the HLA-DQB1*06:02 allele, the autoimmune etiology of narcolepsy has remained largely hypothetical. Here, we compared peripheral mononucleated cells from narcolepsy patients with HLA-DQB1*06:02-matched healthy controls using high-dimensional mass cytometry in combination with algorithm-guided data analysis. Narcolepsy patients displayed multifaceted immune activation in CD4 + and CD8 + T cells dominated by elevated levels of B cell-supporting cytokines. Additionally, T cells from narcolepsy patients showed increased production of the proinflammatory cytokines IL-2 and TNF. Although it remains to be established whether these changes are primary to an autoimmune process in narcolepsy or secondary to orexin deficiency, these findings are indicative of inflammatory processes in the pathogenesis of this enigmatic disease. © 2016 Hartmann et al.
High-dimensional single-cell analysis reveals the immune signature of narcolepsy
Quériault, Clémence; Krieg, Carsten; Nguyen, Xuan-Hung
2016-01-01
Narcolepsy type 1 is a devastating neurological sleep disorder resulting from the destruction of orexin-producing neurons in the central nervous system (CNS). Despite its striking association with the HLA-DQB1*06:02 allele, the autoimmune etiology of narcolepsy has remained largely hypothetical. Here, we compared peripheral mononucleated cells from narcolepsy patients with HLA-DQB1*06:02-matched healthy controls using high-dimensional mass cytometry in combination with algorithm-guided data analysis. Narcolepsy patients displayed multifaceted immune activation in CD4+ and CD8+ T cells dominated by elevated levels of B cell–supporting cytokines. Additionally, T cells from narcolepsy patients showed increased production of the proinflammatory cytokines IL-2 and TNF. Although it remains to be established whether these changes are primary to an autoimmune process in narcolepsy or secondary to orexin deficiency, these findings are indicative of inflammatory processes in the pathogenesis of this enigmatic disease. PMID:27821550
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohammadi, Erfan; Zhao, Chuankai; Meng, Yifei
Solution processable semiconducting polymers have been under intense investigations due to their diverse applications from printed electronics to biomedical devices. However, controlling the macromolecular assembly across length scales during solution coating remains a key challenge, largely due to the disparity in timescales of polymer assembly and high-throughput printing/coating. Herein we propose the concept of dynamic templating to expedite polymer nucleation and the ensuing assembly process, inspired by biomineralization templates capable of surface reconfiguration. Molecular dynamic simulations reveal that surface reconfigurability is key to promoting template–polymer interactions, thereby lowering polymer nucleation barrier. Employing ionic-liquid-based dynamic template during meniscus-guided coating results in highly aligned, highly crystalline donor-acceptor polymer thin films over large area (>1 cm2) and promoted charge transport along both the polymer backbone and the π-π stacking direction in field-effect transistors. We further demonstrate that the charge transport anisotropy can be reversed by tuning the degree of polymer backbone alignment.
Mohammadi, Erfan; Zhao, Chuankai; Meng, Yifei; Qu, Ge; Zhang, Fengjiao; Zhao, Xikang; Mei, Jianguo; Zuo, Jian-Min; Shukla, Diwakar; Diao, Ying
2017-01-01
Solution processable semiconducting polymers have been under intense investigations due to their diverse applications from printed electronics to biomedical devices. However, controlling the macromolecular assembly across length scales during solution coating remains a key challenge, largely due to the disparity in timescales of polymer assembly and high-throughput printing/coating. Herein we propose the concept of dynamic templating to expedite polymer nucleation and the ensuing assembly process, inspired by biomineralization templates capable of surface reconfiguration. Molecular dynamic simulations reveal that surface reconfigurability is key to promoting template–polymer interactions, thereby lowering polymer nucleation barrier. Employing ionic-liquid-based dynamic template during meniscus-guided coating results in highly aligned, highly crystalline donor–acceptor polymer thin films over large area (>1 cm2) and promoted charge transport along both the polymer backbone and the π–π stacking direction in field-effect transistors. We further demonstrate that the charge transport anisotropy can be reversed by tuning the degree of polymer backbone alignment. PMID:28703136
Mohammadi, Erfan; Zhao, Chuankai; Meng, Yifei; ...
2017-07-13
Solution processable semiconducting polymers have been under intense investigations due to their diverse applications from printed electronics to biomedical devices. However, controlling the macromolecular assembly across length scales during solution coating remains a key challenge, largely due to the disparity in timescales of polymer assembly and high-throughput printing/coating. Herein we propose the concept of dynamic templating to expedite polymer nucleation and the ensuing assembly process, inspired by biomineralization templates capable of surface reconfiguration. Molecular dynamic simulations reveal that surface reconfigurability is key to promoting template–polymer interactions, thereby lowering polymer nucleation barrier. Employing ionic-liquid-based dynamic template during meniscus-guided coating results in highly aligned, highly crystalline donor-acceptor polymer thin films over large area (>1 cm2) and promoted charge transport along both the polymer backbone and the π-π stacking direction in field-effect transistors. We further demonstrate that the charge transport anisotropy can be reversed by tuning the degree of polymer backbone alignment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bakel, Allen J.; Conner, Cliff; Quigley, Kevin
One of the missions of the Reduced Enrichment for Research and Test Reactors (RERTR) program (and now the National Nuclear Security Administration's Material Management and Minimization program) is to facilitate the use of low enriched uranium (LEU) targets for 99Mo production. The conversion from highly enriched uranium (HEU) to LEU targets will require five to six times more uranium to produce an equivalent amount of 99Mo. The work discussed here addresses the technical challenges encountered in the treatment of uranyl nitrate hexahydrate (UNH)/nitric acid solutions remaining after the dissolution of LEU targets. Specifically, the focus of this work is the calcination of the uranium waste from 99Mo production using LEU foil targets and the Modified Cintichem Process. Work with our calciner system showed that a high furnace temperature, a large vent tube, and a mechanical shield are beneficial for calciner operation. One- and two-step direct calcination processes were evaluated. The high-temperature one-step process led to contamination of the calciner system. The two-step direct calcination process operated stably and resulted in a relatively large amount of material in the calciner cup. Chemically assisted calcination using peroxide was rejected for further work due to the difficulty in handling the products. Chemically assisted calcination using formic acid was rejected due to unstable operation. Chemically assisted calcination using oxalic acid was recommended, although a better understanding of its chemistry is needed. Overall, this work showed that the two-step direct calcination and the in-cup oxalic acid processes are the best approaches for the treatment of the UNH/nitric acid waste solutions remaining from dissolution of LEU targets for 99Mo production.
Fortier, Véronique; Levesque, Ives R
2018-06-01
Phase processing impacts the accuracy of quantitative susceptibility mapping (QSM). Techniques for phase unwrapping and background removal have been proposed and demonstrated mostly in brain. In this work, phase processing was evaluated in the context of large susceptibility variations (Δχ) and negligible signal, in particular for susceptibility estimation using the iterative phase replacement (IPR) algorithm. Continuous Laplacian, region-growing, and quality-guided unwrapping were evaluated. For background removal, Laplacian boundary value (LBV), projection onto dipole fields (PDF), sophisticated harmonic artifact reduction for phase data (SHARP), variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP), regularization enabled sophisticated harmonic artifact reduction for phase data (RESHARP), and 3D quadratic polynomial field removal were studied. Each algorithm was quantitatively evaluated in simulation and qualitatively in vivo. Additionally, IPR-QSM maps were produced to evaluate the impact of phase processing on the susceptibility in the context of large Δχ with negligible signal. Quality-guided unwrapping was the most accurate technique, whereas continuous Laplacian performed poorly in this context. All background removal algorithms tested resulted in important phase inaccuracies, suggesting that techniques used for brain do not translate well to situations where large Δχ and no or low signal are expected. LBV produced the smallest errors, followed closely by PDF. Results suggest that quality-guided unwrapping should be preferred, with PDF or LBV for background removal, for QSM in regions with large Δχ and negligible signal. This reduces the susceptibility inaccuracy introduced by phase processing. Accurate background removal remains an open question. Magn Reson Med 79:3103-3113, 2017. © 2017 International Society for Magnetic Resonance in Medicine.
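For orientation, one of the unwrapping methods compared above, continuous Laplacian unwrapping, can be sketched in a few lines: the Laplacian of the true phase is estimated from the sines and cosines of the wrapped phase and then inverted with an FFT-based Poisson solve. The Python sketch below is a textbook version under assumed periodic boundaries, not the authors' implementation.

    # Textbook sketch of Laplacian-based phase unwrapping on a 2D phase image,
    # assuming periodic boundaries. Not the implementation evaluated in the paper.
    import numpy as np

    def laplacian_unwrap(phase_wrapped):
        ny, nx = phase_wrapped.shape
        ky = np.fft.fftfreq(ny).reshape(-1, 1)
        kx = np.fft.fftfreq(nx).reshape(1, -1)
        k2 = (2 * np.pi) ** 2 * (kx ** 2 + ky ** 2)     # spectral -Laplacian

        def lap(f):
            return np.fft.ifft2(-k2 * np.fft.fft2(f)).real

        # Estimate the Laplacian of the true phase from the wrapped phase
        rhs = (np.cos(phase_wrapped) * lap(np.sin(phase_wrapped))
               - np.sin(phase_wrapped) * lap(np.cos(phase_wrapped)))

        # Poisson solve: invert the Laplacian, fixing the undefined mean to zero
        k2_safe = k2.copy()
        k2_safe[0, 0] = 1.0
        phi_hat = np.fft.fft2(rhs) / -k2_safe
        phi_hat[0, 0] = 0.0
        return np.fft.ifft2(phi_hat).real

    # Self-check on a smooth synthetic phase exceeding the [-pi, pi) range
    y, x = np.mgrid[0:128, 0:128] / 128.0
    true_phase = 8.0 * np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.05)
    wrapped = np.angle(np.exp(1j * true_phase))
    recovered = laplacian_unwrap(wrapped)
    err = (recovered - recovered.mean()) - (true_phase - true_phase.mean())
    print("max |error| up to a constant:", float(np.abs(err).max()))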
An innovative large scale integration of silicon nanowire-based field effect transistors
NASA Astrophysics Data System (ADS)
Legallais, M.; Nguyen, T. T. T.; Mouis, M.; Salem, B.; Robin, E.; Chenevier, P.; Ternon, C.
2018-05-01
Since the early 2000s, silicon nanowire field effect transistors have been emerging as ultrasensitive biosensors offering label-free, portable and rapid detection. Nevertheless, their large-scale production remains an ongoing challenge due to time-consuming, complex and costly technology. In order to bypass these issues, we report here on the first integration of silicon nanowire networks, called nanonets, into long-channel field effect transistors using a standard microelectronic process. Special attention is paid to the silicidation of the contacts, which involves a large number of SiNWs. The electrical characteristics of these FETs, constituted by randomly oriented silicon nanowires, are also studied. Compatible integration on the back-end of CMOS readout and promising electrical performance open new opportunities for sensing applications.
Local Helioseismology of Emerging Active Regions: A Case Study
NASA Astrophysics Data System (ADS)
Kosovichev, Alexander G.; Zhao, Junwei; Ilonidis, Stathis
2018-04-01
Local helioseismology provides a unique opportunity to investigate the subsurface structure and dynamics of active regions and their effect on the large-scale flows and global circulation of the Sun. We use measurements of plasma flows in the upper convection zone, provided by the Time-Distance Helioseismology Pipeline developed for analysis of solar oscillation data obtained by the Helioseismic and Magnetic Imager (HMI) on the Solar Dynamics Observatory (SDO), to investigate the subsurface dynamics of emerging active region NOAA 11726. The active region emergence was detected in deep layers of the convection zone about 12 hours before the first bipolar magnetic structure appeared on the surface, and 2 days before the emergence of most of the magnetic flux. The speed of emergence determined by tracking the flow divergence with depth is about 1.4 km/s, very close to the emergence speed in the deep layers. As the emerging magnetic flux becomes concentrated in sunspots, local converging flows are observed beneath the forming sunspots. These flows are most prominent in the depth range 1-3 Mm, and remain converging after the formation process is completed. On the larger scale, converging flows around the active region appear as a diversion of the zonal shearing flows towards the active region, accompanied by formation of a large-scale vortex structure. This process occurs when a substantial amount of the magnetic flux has emerged on the surface, and the converging flow pattern remains stable during the following evolution of the active region. The Carrington synoptic flow maps show that the large-scale subsurface inflows are typical of active regions. In the deeper layers (10-13 Mm) the flows become diverging, and surprisingly strong beneath some active regions. In addition, the synoptic maps reveal a complex evolving pattern of large-scale flows on scales much larger than supergranulation.
NASA Astrophysics Data System (ADS)
Kotulla, Ralf; Gopu, Arvind; Hayashi, Soichi
2016-08-01
Processing astronomical data to science readiness was and remains a challenge, in particular in the case of multi-detector instruments such as wide-field imagers. One such instrument, the WIYN One Degree Imager, is available to the astronomical community at large and, in order to be scientifically useful to its varied user community on a short timescale, provides its users fully calibrated data in addition to the underlying raw data. However, time-efficient re-processing of the often large datasets with improved calibration data and/or software requires more than just a large number of CPU cores and disk space. This is particularly relevant if all computing resources are general purpose and shared with a large number of users in a typical university setup. Our approach to address this challenge is a flexible framework, combining the best of both high performance (large number of nodes, internal communication) and high throughput (flexible/variable number of nodes, no dedicated hardware) computing. Based on the Advanced Message Queuing Protocol, we developed a Server-Manager-Worker framework. In addition to the server directing the work flow and the workers executing the actual work, the manager maintains a list of available workers, adds and/or removes individual workers from the worker pool, and re-assigns workers to different tasks. This provides the flexibility of optimizing the worker pool to the current task and workload, improves load balancing, and makes the most efficient use of the available resources. We present performance benchmarks and scaling tests, showing that, today and using existing, commodity shared-use hardware, we can process data with data throughputs (including data reduction and calibration) approaching those expected in the early 2020s for future observatories such as the Large Synoptic Survey Telescope.
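A minimal sketch of an AMQP worker in the spirit of the Server-Manager-Worker framework described above is given below in Python. It is not the ODI pipeline's code: the broker address, queue names, and message format are invented, and a RabbitMQ broker plus the pika client library are assumed.

    # Minimal AMQP worker sketch: pull one task at a time from a task queue,
    # run a (placeholder) reduction step, publish the result, acknowledge.
    # Broker address, queue names and message format are invented for illustration.
    import json
    import pika

    def reduce_exposure(task):
        # Placeholder for the actual reduction/calibration of one raw frame.
        return {"filename": task["filename"], "status": "calibrated"}

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="odi.tasks", durable=True)
    channel.queue_declare(queue="odi.results", durable=True)
    channel.basic_qos(prefetch_count=1)            # one task per worker at a time

    def on_task(ch, method, properties, body):
        result = reduce_exposure(json.loads(body))
        ch.basic_publish(exchange="", routing_key="odi.results",
                         body=json.dumps(result))
        ch.basic_ack(delivery_tag=method.delivery_tag)   # report completion

    channel.basic_consume(queue="odi.tasks", on_message_callback=on_task)
    channel.start_consuming()                      # block and process tasks forever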
Federal repatriation legislation and the role of physical anthropology in repatriation.
Ousley, Stephen D; Billeck, William T; Hollinger, R Eric
2005-01-01
Two laws governing the disposition of Native American human remains in museums and institutions have had a profound impact on anthropology, and especially physical anthropology. In contrast to the perception of constant conflict between Native Americans and physical anthropologists, the repatriation process based on these laws has been in large part harmonious between institutions and Native peoples in the US. Despite misconceptions, the Native American Graves Protection and Repatriation Act (NAGPRA; 25 United States Code (U.S.C.) 3001-3013) was not intended to halt further research on Native American remains in museums. In fact, court decisions have affirmed that the documentation of human remains produces information no other methods can provide, and provides necessary evidence to be incorporated and weighed, along with other evidence, in evaluating "cultural affiliation," the legal term for the required connection from federally recognized Native American groups to their ancestors. The wide variety of osteological data collected at the National Museum of Natural History (NMNH), Smithsonian Institution, has proven indispensable when evaluating cultural affiliation, especially when other information sources are unhelpful or ambiguous, and provides an empirical basis for determining the ancestry of individuals whose remains will be discovered in the future. To date, the claim-driven process at the NMNH has resulted in the affiliation and repatriation of more Native American remains than any other institution in the country. Repatriation experiences at the NMNH demonstrate the changing relationships between museums and Native peoples, the continuing important contributions that physical anthropology makes to the repatriation process, and the importance of physical anthropology in understanding the recent and ancient history of North America. (c) 2005 Wiley-Liss, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Decho, A.W.; Luoma, S.N.
1994-12-31
Bivalves possess two major digestion pathways for processing food particles: a rapid "intestinal" pathway, where digestion is largely extracellular, and a slower "glandular" pathway, where digestion is largely intracellular. The slower glandular pathway often results in more efficient absorption of carbon but also more efficient uptake of certain metals (e.g. Cr associated with bacteria). In the bivalve Potamocorbula amurensis, large portions (> 90%) of bacteria are selectively routed to the glandular pathway. This results in efficient C uptake but also efficient uptake of associated Cr. The authors further determined whether prolonged exposure to Cr-contaminated bacteria would result in high Cr uptake by the animals or whether mechanisms exist to reduce Cr exposure and uptake. Bivalves were exposed to natural food + added bacteria (with or without added Cr) for a 6-day period; pulse-chase experiments were then conducted to quantify digestive processing and % absorption efficiencies (%AE) of bacterial Cr. Bivalves compensate at low Cr (2-5 ug/g sed) by reducing overall food ingestion, while digestive processing of food remains statistically similar to controls. At high Cr (200-500 ug/g sed) there are marked decreases in the percentage of bacteria processed by glandular digestion. This results in lower overall %AE of Cr. The results suggest that bivalves under natural conditions might balance efficient carbon sequestration against avoiding uptake of potentially toxic metals associated with the food.
SIMULANT DEVELOPMENT FOR SAVANNAH RIVER SITE HIGH LEVEL WASTE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, M; Russell Eibling, R; David Koopman, D
2007-09-04
The Defense Waste Processing Facility (DWPF) at the Savannah River Site vitrifies High Level Waste (HLW) for repository internment. The process consists of three major steps: waste pretreatment, vitrification, and canister decontamination/sealing. The HLW consists of insoluble metal hydroxides (primarily iron, aluminum, magnesium, manganese, and uranium) and soluble sodium salts (carbonate, hydroxide, nitrite, nitrate, and sulfate). The HLW is processed in large batches through DWPF; DWPF has recently completed processing Sludge Batch 3 (SB3) and is currently processing Sludge Batch 4 (SB4). The composition of metal species in SB4 is shown in Table 1 as a function of the ratio of a metal to iron. Simulants remove radioactive species and renormalize the remaining species. Supernate composition is shown in Table 2.
Alexander C. Vibrans; Ronald E. McRoberts; Paolo Moser; Adilson L. Nicoletti
2013-01-01
Estimation of large area forest attributes, such as area of forest cover, from remote sensing-based maps is challenging because of image processing, logistical, and data acquisition constraints. In addition, techniques for estimating and compensating for misclassification and estimating uncertainty are often unfamiliar. Forest area for the state of Santa Catarina in...
Jung, Eunice L.; Zadbood, Asieh; Lee, Sang-Hun; Tomarken, Andrew J.; Blake, Randolph
2013-01-01
We live in a cluttered, dynamic visual environment that poses a challenge for the visual system: for objects, including those that move about, to be perceived, information specifying those objects must be integrated over space and over time. Does a single, omnibus mechanism perform this grouping operation, or does grouping depend on separate processes specialized for different feature aspects of the object? To address this question, we tested a large group of healthy young adults on their abilities to perceive static fragmented figures embedded in noise and to perceive dynamic point-light biological motion figures embedded in dynamic noise. There were indeed substantial individual differences in performance on both tasks, but none of the statistical tests we applied to this data set uncovered a significant correlation between those performance measures. These results suggest that the two tasks, despite their superficial similarity, require different segmentation and grouping processes that are largely unrelated to one another. Whether those processes are embodied in distinct neural mechanisms remains an open question. PMID:24198799
Sim, Hwansu; Kim, Chanho; Bok, Shingyu; Kim, Min Ki; Oh, Hwisu; Lim, Guh-Hwan; Cho, Sung Min; Lim, Byungkwon
2018-06-18
Silver (Ag) nanowires (NWs) are promising building blocks for flexible transparent electrodes, which are key components in fabricating soft electronic devices such as flexible organic light emitting diodes (OLEDs). Typically, Ag NWs have been synthesized using a polyol method, but it still remains a challenge to produce high-aspect-ratio Ag NWs via a simple and rapid process. In this work, we developed a modified polyol method and newly found that the addition of propylene glycol to ethylene glycol-based polyol synthesis facilitated the growth of Ag NWs, allowing the rapid production of long Ag NWs with high aspect ratios of about 2000 in a high yield (∼90%) within 5 min. Transparent electrodes fabricated with our Ag NWs exhibited performance comparable to that of an indium tin oxide-based electrode. With these Ag NWs, we successfully demonstrated the fabrication of a large-area flexible OLED with dimensions of 30 cm × 15 cm using a roll-to-roll process.
An effective online data monitoring and saving strategy for large-scale climate simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin
2018-01-22
Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate although the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Finally, our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.
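One simple way to realize budget-constrained online selection of extremes is a bounded heap that keeps only the largest values seen so far. The sketch below is a generic illustration under that assumption, not the authors' selection strategy, and the simulated stream is hypothetical.

```python
import heapq
import random

def record_extremes(stream, budget):
    """Keep the `budget` largest values from an unbounded stream (generic sketch)."""
    heap = []  # min-heap holding the current top-`budget` (value, timestep) pairs
    for t, value in stream:
        if len(heap) < budget:
            heapq.heappush(heap, (value, t))
        elif value > heap[0][0]:
            heapq.heapreplace(heap, (value, t))  # evict the smallest retained extreme
    return sorted(heap, reverse=True)

# Hypothetical usage on simulated output (timestep, anomaly value):
stream = ((t, random.gauss(0.0, 1.0)) for t in range(100000))
print(record_extremes(stream, budget=10))
```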
Wu, Junjun; Du, Guocheng; Zhou, Jingwen; Chen, Jian
2014-10-20
Flavonoids possess pharmaceutical potential due to their health-promoting activities. The complex structures of these products make extraction from plants difficult, and chemical synthesis is limited because of the use of many toxic solvents. Microbial production offers an alternate way to produce these compounds on an industrial scale in a more economical and environment-friendly manner. However, at present microbial production has been achieved only on a laboratory scale and improvements and scale-up of these processes remain challenging. Naringenin and pinocembrin, which are flavonoid scaffolds and precursors for most of the flavonoids, are the model molecules that are key to solving the current issues restricting industrial production of these chemicals. The emergence of systems metabolic engineering, which combines systems biology with synthetic biology and evolutionary engineering at the systems level, offers new perspectives on strain and process optimization. In this review, current challenges in large-scale fermentation processes involving flavonoid scaffolds and the strategies and tools of systems metabolic engineering used to overcome these challenges are summarized. This will offer insights into overcoming the limitations and challenges of large-scale microbial production of these important pharmaceutical compounds. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Kim, Joo Hyung; Kang, Tae Sung; Yang, Jung Yup; Hong, Jin Pyo
2015-11-01
One long-standing goal in the emerging field of flexible and transparent electronic devices is to meet the demand of key markets, such as enhanced output performance for metal oxide semiconductor thin film transistors (TFTs) prepared by a solution process. While solution-based fabrication techniques are cost-effective and ensure large-area coverage at low temperature, their utilization has the disadvantage of introducing large trap states into TFTs. Such states, the formation of which is induced by intrinsic defects initially produced during preparation, have a significant impact on electrical performance. Therefore, the ability to enhance the electrical characteristics of solution-processed TFTs, along with attaining a firm understanding of their physical nature, remains a key step towards extending their use. In this study, measurements of low-frequency noise and random telegraph signal noise are employed as generic alternative tools to examine the origins of enhanced output performance for solution-processed ZnO TFTs through the control of defect sites by Al evaporation.
Soundscapes and Larval Settlement: Characterizing the Stimulus from a Larval Perspective.
Lillis, Ashlee; Eggleston, David B; Bohnenstiehl, DelWayne R
2016-01-01
There is growing evidence that underwater sounds serve as a cue for the larvae of marine organisms to locate suitable settlement habitats; however, the relevant spatiotemporal scales of variability in habitat-related sounds and how this variation scales with larval settlement processes remain largely uncharacterized, particularly in estuarine habitats. Here, we provide an overview of the approaches we have developed to characterize an estuarine soundscape as it relates to larval processes, and a conceptual framework is provided for how habitat-related sounds may influence larval settlement, using oyster reef soundscapes as an example.
Achieving a competitive advantage through referral management.
D'Amaro, R; Thomas, C S
1989-01-01
The physician remains the primary referral source in medical service. Referral patterns, in turn, reflect interactions between referring physicians and consultants which relate to quality of care, costs, and personal factors such as age and common training. Referrals initiated by patients relate to the desire to seek a second opinion and are heavily influenced by other family members. Alterations in the referral process are emerging due to cost escalation, the emergence of large payor groupings and aggregation of physicians into larger group settings. Strategies to manage the referral process include enhanced communications using new telecommunication technology and joint ventures with hospitals.
Shock probes in a one-dimensional Katz-Lebowitz-Spohn model
NASA Astrophysics Data System (ADS)
Chatterjee, Sakuntala; Barma, Mustansir
2008-06-01
We consider shock probes in a one-dimensional driven diffusive medium with nearest-neighbor Ising interaction (KLS model). Earlier studies based on an approximate mapping of the present system to an effective zero-range process concluded that the exponents characterizing the decays of several static and dynamical correlation functions of the probes depend continuously on the strength of the Ising interaction. On the contrary, our numerical simulations indicate that over a substantial range of the interaction strength, these exponents remain constant and their values are the same as in the case of no interaction (when the medium executes an ASEP). We demonstrate this by numerical studies of several dynamical correlation functions for two probes and also for a macroscopic number of probes. Our results are consistent with the expectation that the short-ranged correlations induced by the Ising interaction should not affect the large time and large distance properties of the system, implying that scaling forms remain the same as in the medium with no interactions present.
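The non-interacting limit referred to above, in which the medium executes an ASEP, is straightforward to simulate directly. The sketch below is a bare-bones totally asymmetric exclusion process on a ring, without the Ising coupling or the probe particles analyzed in the paper.

```python
import random

def asep_sweep(lattice, p_right=1.0):
    """One Monte Carlo sweep of a totally asymmetric exclusion process on a ring."""
    L = len(lattice)
    for _ in range(L):
        i = random.randrange(L)
        j = (i + 1) % L
        # a particle hops right only into an empty site (exclusion rule)
        if lattice[i] == 1 and lattice[j] == 0 and random.random() < p_right:
            lattice[i], lattice[j] = 0, 1
    return lattice

sites = [1 if random.random() < 0.5 else 0 for _ in range(200)]
for _ in range(1000):
    asep_sweep(sites)
print("density:", sum(sites) / len(sites))  # conserved by the dynamics
```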
Neural Mechanisms of Selective Visual Attention.
Moore, Tirin; Zirnsak, Marc
2017-01-03
Selective visual attention describes the tendency of visual processing to be confined largely to stimuli that are relevant to behavior. It is among the most fundamental of cognitive functions, particularly in humans and other primates for whom vision is the dominant sense. We review recent progress in identifying the neural mechanisms of selective visual attention. We discuss evidence from studies of different varieties of selective attention and examine how these varieties alter the processing of stimuli by neurons within the visual system, current knowledge of their causal basis, and methods for assessing attentional dysfunctions. In addition, we identify some key questions that remain in identifying the neural mechanisms that give rise to the selective processing of visual information.
NASA Astrophysics Data System (ADS)
Whateley, T. L.; Poncelet, D.
2005-06-01
Microencapsulation by solvent evaporation is a novel technique to enable the controlled delivery of active materials. The controlled release of drugs, for example, is a key challenge in the pharmaceutical industries. Although proposed several decades ago, it remains largely an empirical laboratory process. The Topical Team has considered its critical points and the work required to produce a more effective technology - better control of the process for industrial production, understanding of the interfacial dynamics, determination of the solvent evaporation profile, and establishment of the relation between polymer/microcapsule structures. The Team has also defined how microgravity experiments could help in better understanding microencapsulation by solvent evaporation, and it has proposed a strategy for a collaborative project on the topic.
Assessing the effects of large mobile predators on ecosystem connectivity.
McCauley, Douglas J; Young, Hillary S; Dunbar, Robert B; Estes, James A; Semmens, Brice X; Micheli, Fiorenza
2012-09-01
Large predators are often highly mobile and can traverse and use multiple habitats. We know surprisingly little about how predator mobility determines important processes of ecosystem connectivity. Here we used a variety of data sources drawn from Palmyra Atoll, a remote tropical marine ecosystem where large predators remain in high abundance, to investigate how these animals foster connectivity. Our results indicate that three of Palmyra's most abundant large predators (e.g., two reef sharks and one snapper) use resources from different habitats creating important linkages across ecosystems. Observations of cross-system foraging such as this have important implications for the understanding of ecosystem functioning, the management of large-predator populations, and the design of conservation measures intended to protect whole ecosystems. In the face of widespread declines of large, mobile predators, it is important that resource managers, policy makers, and ecologists work to understand how these predators create connectivity and to determine the impact that their depletions may be having on the integrity of these linkages.
Cryogenic Calcite: A Morphologic and Isotopic Analog to the ALH84001 Carbonates
NASA Technical Reports Server (NTRS)
Niles, P. B.; Leshin, L. A.; Socki, R. A.; Guan, Y.; Ming, D. W.; Gibson, E. K.
2004-01-01
Martian meteorite ALH84001 carbonates preserve large and variable microscale isotopic compositions, which in some way reflect their formation environment. These measurements show large variations (>20%) in the carbon and oxygen isotopic compositions of the carbonates on a 10-20 micron scale that are correlated with chemical composition. However, the utilization of these data sets for interpreting the formation conditions of the carbonates is complex due to the lack of suitable terrestrial analogs and the difficulty of modeling under non-equilibrium conditions. Thus, the mechanisms and processes that create and preserve large microscale isotopic variations in carbonate minerals are largely unknown. Experimental tests of the possible environments and mechanisms that lead to large microscale isotopic variations can help address these concerns. One possible mechanism for creating large carbon isotopic variations in carbonates involves the freezing of water. Carbonates precipitate during extensive CO2 degassing that occurs during the freezing process as the fluid's decreasing volume drives CO2 out. This rapid CO2 degassing results in a kinetic isotopic fractionation in which the CO2 gas has a much lighter isotopic composition, causing an enrichment of 13C in the remaining dissolved bicarbonate. This study seeks to determine the suitability of cryogenically formed carbonates as analogs to ALH84001 carbonates. Specifically, our objective is to determine how accurately models using equilibrium fractionation factors approximate the isotopic compositions of cryogenically precipitated carbonates. This includes determining the accuracy of applying equilibrium fractionation factors during a kinetic process, and determining how isotopic variations in the fluid are preserved in microscale variations in the precipitated carbonates.
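Progressive enrichment of the residual reservoir during degassing is commonly described by textbook Rayleigh distillation. The sketch below uses that generic relation with an assumed fractionation factor; it is illustrative only and is not a fit to the ALH84001 data or the authors' model.

```python
def rayleigh_delta(delta0, alpha, f):
    """Delta value (per mil) of the residual reservoir when a fraction f remains.
    Rayleigh distillation: R = R0 * f**(alpha - 1), expressed in delta notation."""
    r0 = delta0 / 1000.0 + 1.0
    r = r0 * f ** (alpha - 1.0)
    return (r - 1.0) * 1000.0

# Assumed values for illustration: initial delta13C of 0 per mil and a kinetic
# fractionation factor alpha < 1 (escaping CO2 gas depleted in 13C relative to DIC).
alpha_gas_dic = 0.990
for f in (1.0, 0.5, 0.2, 0.05):
    print(f, round(rayleigh_delta(0.0, alpha_gas_dic, f), 1))  # residual DIC grows heavier
```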
Wang, Quanlong; Bai, Qingshun; Chen, Jiaxuan; Su, Hao; Wang, Zhiguo; Xie, Wenkun
2015-12-01
Large-scale molecular dynamics simulation is performed to study the nano-cutting process of single crystal copper realized by a single-point diamond cutting tool in this paper. The centro-symmetry parameter is adopted to characterize the subsurface deformed layers and the distribution and evolution of the subsurface defect structures. Three-dimensional visualization and measurement technology are used to measure the depth of the subsurface deformed layers. The influence of cutting speed, cutting depth, cutting direction, and crystallographic orientation on the depth of subsurface deformed layers is systematically investigated. The results show that many defect structures are formed in the subsurface of the workpiece during the nano-cutting process, such as stair-rod dislocations, stacking fault tetrahedra, atomic clusters, vacancy defects, and point defects. In the process of nano-cutting, the depth of subsurface deformed layers increases with cutting distance at the beginning, then decreases during stable cutting, and remains essentially unchanged once the cutting distance reaches 24 nm. The depth of subsurface deformed layers decreases with increasing cutting speed between 50 and 300 m/s. The depth of subsurface deformed layers increases proportionally with cutting depth, and remains essentially unchanged once the cutting depth exceeds 6 nm.
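For readers unfamiliar with the defect metric used above, the standard centro-symmetry parameter sums the squared residuals of near-opposite neighbor bond pairs around an atom. The sketch below shows the per-atom calculation given precomputed neighbor vectors; neighbor finding and exact bond pairing are simplified, and it is not the authors' analysis code.

```python
import numpy as np

def centro_symmetry(neighbor_vectors):
    """Centro-symmetry parameter for one atom.
    neighbor_vectors: (N, 3) vectors to the N nearest neighbors (N assumed even).
    Opposite neighbors are paired greedily by minimizing |r_i + r_j|^2."""
    vecs = [np.asarray(v, dtype=float) for v in neighbor_vectors]
    csp = 0.0
    while vecs:
        r = vecs.pop(0)
        # find the partner that best cancels r (its near-opposite neighbor)
        k = min(range(len(vecs)), key=lambda j: float(np.sum((r + vecs[j]) ** 2)))
        csp += float(np.sum((r + vecs.pop(k)) ** 2))
    return csp

# A perfect FCC 12-neighbor shell gives CSP ~ 0; distorted or defective shells give CSP > 0.
```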
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, T.
2016-05-20
ZionSolutions is in the process of decommissioning the Zion Nuclear Power Station (ZNPS). After decommissioning is completed, the site will contain two reactor Containment Buildings, the Fuel Handling Building and Transfer Canals, Auxiliary Building, Turbine Building, Crib House/Forebay, and a Waste Water Treatment Facility that have been demolished to a depth of 3 feet below grade. Additional below ground structures remaining will include the Main Steam Tunnels and large diameter intake and discharge pipes. These additional structures are not included in the modeling described in this report, but the inventory remaining (expected to be very low) will be included with one of the structures that are modeled as designated in the Zion Station Restoration Project (ZSRP) License Termination Plan (LTP). The remaining underground structures will be backfilled with clean material. The final selection of fill material has not been made.
ERIC Educational Resources Information Center
Whitney, Carin; Kirk, Marie; O'Sullivan, Jamie; Ralph, Matthew A. Lambon; Jefferies, Elizabeth
2012-01-01
To understand the meanings of words and objects, we need to have knowledge about these items themselves plus executive mechanisms that compute and manipulate semantic information in a task-appropriate way. The neural basis for semantic control remains controversial. Neuroimaging studies have focused on the role of the left inferior frontal gyrus…
State of the Art in Large-Scale Soil Moisture Monitoring
NASA Technical Reports Server (NTRS)
Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.;
2013-01-01
Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.
Quantifying hyporheic exchange dynamics in a highly regulated large river reach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammond, Glenn Edward; Zhou, T; Huang, M
Hyporheic exchange is an important mechanism taking place in riverbanks and riverbed sediments, where river water and shallow groundwater mix and interact with each other. The direction, magnitude, and residence time of the hyporheic flux that penetrates the river bed are critical for biogeochemical processes such as carbon and nitrogen cycling, and biodegradation of organic contaminants. Many approaches including field measurements and numerical methods have been developed to quantify the hyporheic exchanges in relatively small rivers. However, the spatial and temporal distributions of hyporheic exchanges in a large, regulated river reach remain less explored due to the large spatial domains, complexity of geomorphologic features and subsurface properties, and the great pressure gradient variations at the riverbed created by dam operations.
Scientific Investigations Associated with the Human Exploration of Mars in the Next 35 Years
NASA Technical Reports Server (NTRS)
Niles, P. B.; Beaty, David; Hays, Lindsay; Bass, Deborah; Bell, Mary Sue; Bleacher, Jake; Cabrol, Nathalie A.; Conrad, Pan; Eppler, Dean; Hamilton, Vicky;
2017-01-01
A human mission to Mars would present an unprecedented opportunity to investigate the earliest history of the solar system. This history has largely been overwritten on Earth by active geological processing, but on Mars, large swaths of the ancient crust remain exposed at the surface, allowing us to investigate martian processes at the earliest time periods, when life first appeared on the Earth. Mars' surface has been largely frozen in place for 4 billion years, and after losing its atmosphere and magnetic field what remains is an ancient landscape of former hydrothermal systems, river beds, volcanic eruptions, and impact craters. This allows us to investigate scientific questions ranging from the nature of the impact history of the solar system to the origins of life. We present here a summary of the findings of the Human Science Objectives Science Analysis Group (HSO-SAG), chartered by MEPAG in 2015 to address science objectives and landing site criteria for future human missions to Mars (Niles, Beaty et al. 2015). Currently, NASA's plan to land astronauts on Mars in the mid 2030's would allow for robust human exploration of the surface in the next 35 years. We expect that crews would be able to traverse to sites up to 100 km away from the original landing site using robust rovers. A habitat outfitted with state-of-the-art laboratory facilities could enable the astronauts to perform cutting-edge science on the surface of Mars. Robotic/human partnership during exploration would further enhance the science return of the mission.
Geomorphic and habitat response to a large-dam removal in a Mediterranean river
NASA Astrophysics Data System (ADS)
Harrison, L.; East, A. E.; Smith, D. P.; Bond, R.; Logan, J. B.; Nicol, C.; Williams, T.; Boughton, D. A.; Chow, K.
2017-12-01
The presence of large dams has fundamentally altered physical and biological processes in riverine ecosystems, and dam removal is becoming more common as a river restoration strategy. We used a before-after-control-impact study design to investigate the geomorphic and habitat response to removal of 32-m-high San Clemente Dam on the Carmel River, CA. The project represents the first major dam removal in a Mediterranean river and is also unique among large dam removals in that most reservoir sediment was sequestered in place. We found that in the first year post-removal, a sediment pulse migrated 3.5 km downstream, filling pools and the interstitial pore spaces of gravels with sand. These sedimentary and topographic changes initially reduced the overall quality of steelhead (O. mykiss) spawning and rearing habitat in impacted reaches. Over the second winter after dam removal, a sequence of high flows flushed large volumes of sand from pools and mobilized the river bed throughout much of the active channel. The floods substantially altered fluvial evolution in the upper part of the reservoir, promoting new avulsion and the subsequent delivery of gravel and large wood to below dam reaches. These geomorphic processes increased the availability of spawning-sized gravel and enhanced channel complexity in reaches within several km of the former dam, which should improve habitat for multiple life stages of steelhead. Results indicate that when most reservoir sediment remains impounded, high flows become more important drivers of geomorphic and habitat change than dam removal alone. In such cases, the rates at which biophysical processes are reestablished will depend largely on post-dam removal flow sequencing and the upstream supply of sediment and large wood.
Landerl, Karin
2013-01-01
Numerical processing has been demonstrated to be closely associated with arithmetic skills, however, our knowledge on the development of the relevant cognitive mechanisms is limited. The present longitudinal study investigated the developmental trajectories of numerical processing in 42 children with age-adequate arithmetic development and 41 children with dyscalculia over a 2-year period from beginning of Grade 2, when children were 7; 6 years old, to beginning of Grade 4. A battery of numerical processing tasks (dot enumeration, non-symbolic and symbolic comparison of one- and two-digit numbers, physical comparison, number line estimation) was given five times during the study (beginning and middle of each school year). Efficiency of numerical processing was a very good indicator of development in numerical processing while within-task effects remained largely constant and showed low long-term stability before middle of Grade 3. Children with dyscalculia showed less efficient numerical processing reflected in specifically prolonged response times. Importantly, they showed consistently larger slopes for dot enumeration in the subitizing range, an untypically large compatibility effect when processing two-digit numbers, and they were consistently less accurate in placing numbers on a number line. Thus, we were able to identify parameters that can be used in future research to characterize numerical processing in typical and dyscalculic development. These parameters can also be helpful for identification of children who struggle in their numerical development. PMID:23898310
Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.
1993-01-01
Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. The intent of sample processing is to determine the quantity of each taxon present in the semi-quantitative samples or to list the taxa present in qualitative samples. The processing guidelines provide standardized laboratory forms, sample labels, detailed sample processing flow charts, standardized format for electronic data, quality-assurance procedures and checks, sample tracking standards, and target levels for taxonomic determinations. The contract laboratory (1) is responsible for identifications and quantifications, (2) constructs reference collections, (3) provides data in hard copy and electronic forms, (4) follows specified quality-assurance and quality-control procedures, and (5) returns all processed and unprocessed portions of the samples. The U.S. Geological Survey's Quality Management Group maintains a Biological Quality-Assurance Unit, located at the National Water-Quality Laboratory, Arvada, Colorado, to oversee the use of contract laboratories and ensure the quality of data obtained from these laboratories according to the guidelines established in this document. This unit establishes contract specifications, reviews contractor performance (timeliness, accuracy, and consistency), enters data into the National Water Information System-II data base, maintains in-house reference collections, deposits voucher specimens in outside museums, and interacts with taxonomic experts within and outside the U.S. Geological Survey. 
This unit also modifies the existing sample processing and quality-assurance guidelines, establishes criteria and testing procedures for qualifying potential contract laboratories, identifies qualified taxonomic experts, and establishes voucher collections.
Saleem, Muhammad; Moe, Luke A
2014-10-01
Multitrophic level microbial loop interactions mediated by protist predators, bacteria, and viruses drive eco- and agro-biotechnological processes such as bioremediation, wastewater treatment, plant growth promotion, and ecosystem functioning. To what extent these microbial interactions are context-dependent in performing biotechnological and ecosystem processes remains largely unstudied. Theory-driven research may advance the understanding of eco-evolutionary processes underlying the patterns and functioning of microbial interactions for successful development of microbe-based biotechnologies for real world applications. This could also be a great avenue to test the validity or limitations of ecology theory for managing diverse microbial resources in an era of altering microbial niches, multitrophic interactions, and microbial diversity loss caused by climate and land use changes. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Christensen, C.; Summa, B.; Scorzelli, G.; Lee, J. W.; Venkat, A.; Bremer, P. T.; Pascucci, V.
2017-12-01
Massive datasets are becoming more common due to increasingly detailed simulations and higher resolution acquisition devices. Yet accessing and processing these huge data collections for scientific analysis is still a significant challenge. Solutions that rely on extensive data transfers are increasingly untenable and often impossible due to lack of sufficient storage at the client side as well as insufficient bandwidth to conduct such large transfers, that in some cases could entail petabytes of data. Large-scale remote computing resources can be useful, but utilizing such systems typically entails some form of offline batch processing with long delays, data replications, and substantial cost for any mistakes. Both types of workflows can severely limit the flexible exploration and rapid evaluation of new hypotheses that are crucial to the scientific process and thereby impede scientific discovery. In order to facilitate interactivity in both analysis and visualization of these massive data ensembles, we introduce a dynamic runtime system suitable for progressive computation and interactive visualization of arbitrarily large, disparately located spatiotemporal datasets. Our system includes an embedded domain-specific language (EDSL) that allows users to express a wide range of data analysis operations in a simple and abstract manner. The underlying runtime system transparently resolves issues such as remote data access and resampling while at the same time maintaining interactivity through progressive and interruptible processing. Computations involving large amounts of data can be performed remotely in an incremental fashion that dramatically reduces data movement, while the client receives updates progressively thereby remaining robust to fluctuating network latency or limited bandwidth. This system facilitates interactive, incremental analysis and visualization of massive remote datasets up to petabytes in size. Our system is now available for general use in the community through both docker and anaconda.
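The progressive, interruptible style of analysis described above can be mimicked with a generator that refines an estimate chunk by chunk, letting a client consume partial results as they arrive. This is a generic illustration only; it does not reproduce the system's EDSL, remote data access, or multiresolution runtime.

```python
def progressive_mean(chunks):
    """Yield a running mean after each data chunk, so callers can stop early."""
    total, count = 0.0, 0
    for chunk in chunks:
        total += sum(chunk)
        count += len(chunk)
        yield total / count  # partial, progressively refined result

# Hypothetical usage: in practice the chunks might arrive incrementally from a
# remote, coarse-to-fine data store rather than an in-memory tuple.
chunks = ([1.0, 2.0, 3.0], [4.0, 5.0], [6.0, 7.0, 8.0, 9.0])
for estimate in progressive_mean(chunks):
    print("current estimate:", estimate)
```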
Back to the Future: Consistency-Based Trajectory Tracking
NASA Technical Reports Server (NTRS)
Kurien, James; Nayak, P. Pandurang; Norvig, Peter (Technical Monitor)
2000-01-01
Given a model of a physical process and a sequence of commands and observations received over time, the task of an autonomous controller is to determine the likely states of the process and the actions required to move the process to a desired configuration. We introduce a representation and algorithms for incrementally generating approximate belief states for a restricted but relevant class of partially observable Markov decision processes with very large state spaces. The algorithm presented incrementally generates, rather than revises, an approximate belief state at any point by abstracting and summarizing segments of the likely trajectories of the process. This enables applications to efficiently maintain a partial belief state when it remains consistent with observations and revisit past assumptions about the process' evolution when the belief state is ruled out. The system presented has been implemented and results on examples from the domain of spacecraft control are presented.
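As background for the belief-state idea above, a discrete Bayes filter over a tiny state space shows the basic predict/update cycle driven by commands and observations. The transition and observation probabilities below are invented for illustration; the paper's contribution (incrementally generated, trajectory-based approximate belief states for very large spaces) is not reproduced here.

```python
# Minimal discrete Bayes filter: belief over {nominal, failed} for one component.
# All probabilities are illustrative assumptions, not values from the paper.
def predict(belief, transition):
    return {s2: sum(belief[s1] * transition[s1][s2] for s1 in belief) for s2 in belief}

def update(belief, likelihood, obs):
    posterior = {s: belief[s] * likelihood[s][obs] for s in belief}
    z = sum(posterior.values())
    return {s: p / z for s, p in posterior.items()}

belief = {"nominal": 0.99, "failed": 0.01}
transition = {"nominal": {"nominal": 0.995, "failed": 0.005},
              "failed": {"nominal": 0.0, "failed": 1.0}}
likelihood = {"nominal": {"ok": 0.9, "anomaly": 0.1},
              "failed": {"ok": 0.2, "anomaly": 0.8}}

belief = update(predict(belief, transition), likelihood, "anomaly")
print(belief)  # probability mass shifts toward the failure mode after an anomalous reading
```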
Walther, Andreas; Bjurhager, Ingela; Malho, Jani-Markus; Pere, Jaakko; Ruokolainen, Janne; Berglund, Lars A; Ikkala, Olli
2010-08-11
Although remarkable success has been achieved to mimic the mechanically excellent structure of nacre in laboratory-scale models, it remains difficult to foresee mainstream applications due to time-consuming sequential depositions or energy-intensive processes. Here, we introduce a surprisingly simple and rapid methodology for large-area, lightweight, and thick nacre-mimetic films and laminates with superior material properties. Nanoclay sheets with soft polymer coatings are used as ideal building blocks with intrinsic hard/soft character. They are forced to rapidly self-assemble into aligned nacre-mimetic films via paper-making, doctor-blading or simple painting, giving rise to strong and thick films with tensile modulus of 45 GPa and strength of 250 MPa, that is, partly exceeding nacre. The concepts are environmentally friendly, energy-efficient, and economic and are ready for scale-up via continuous roll-to-roll processes. Excellent gas barrier properties, optical translucency, and extraordinary shape-persistent fire-resistance are demonstrated. We foresee advanced large-scale biomimetic materials, relevant for lightweight sustainable construction and energy-efficient transportation.
Brennan, Christine; Booth, James R.
2016-01-01
Linguistic knowledge, cognitive ability, and instruction influence how adults acquire a second orthography yet it remains unclear how different forms of instruction influence grain size sensitivity and subsequent decoding skill and speed. Thirty-seven monolingual, literate English-speaking adults were trained on a novel artificial orthography given initial instruction that directed attention to either large or small grain size units (i.e., words or letters). We examined how initial instruction influenced processing speed (i.e., reaction time (RT)) and sensitivity to different orthographic grain sizes (i.e., rimes and letters). Directing attention to large grain size units during initial instruction resulted in higher accuracy for rimes, whereas directing attention to smaller grain size units resulted in slower RTs across all measures. Additionally, phonological awareness skill modulated early learning effects, compensating for the limitations of the initial instruction provided. Collectively, these findings suggest that when adults are learning to read a second orthography, consideration should be given to how initial instruction directs attention to different grain sizes and inherent phonological awareness ability. PMID:27829705
Accelerating root system phenotyping of seedlings through a computer-assisted processing pipeline.
Dupuy, Lionel X; Wright, Gladys; Thompson, Jacqueline A; Taylor, Anna; Dekeyser, Sebastien; White, Christopher P; Thomas, William T B; Nightingale, Mark; Hammond, John P; Graham, Neil S; Thomas, Catherine L; Broadley, Martin R; White, Philip J
2017-01-01
There are numerous systems and techniques to measure the growth of plant roots. However, phenotyping large numbers of plant roots for breeding and genetic analyses remains challenging. One major difficulty is to achieve high throughput and resolution at a reasonable cost per plant sample. Here we describe a cost-effective root phenotyping pipeline, on which we perform time and accuracy benchmarking to identify bottlenecks in such pipelines and strategies for their acceleration. Our root phenotyping pipeline was assembled from custom software and low-cost material and equipment. Results show that sample preparation and handling of samples during screening are the most time-consuming tasks in root phenotyping. Algorithms can be used to speed up the extraction of root traits from image data, but when applied to large numbers of images, there is a trade-off between the time spent processing the data and the errors contained in the database. Scaling up root phenotyping to large numbers of genotypes will require not only automation of sample preparation and sample handling, but also efficient algorithms for error detection for more reliable replacement of manual interventions.
Flexible Microstrip Circuits for Superconducting Electronics
NASA Technical Reports Server (NTRS)
Chervenak, James; Mateo, Jennette
2013-01-01
Flexible circuits with superconducting wiring atop polyimide thin films are being studied to connect large numbers of wires between stages in cryogenic apparatus with low heat load. The feasibility of a full microstrip process, consisting of two layers of superconducting material separated by a thin dielectric layer on 5 mil (approximately 0.13 mm) Kapton sheets, where manageable residual stress remains in the polyimide film after processing, has been demonstrated. The goal is a 2-mil (approximately 0.051-mm) process using spin-on polyimide to take advantage of the smoother polyimide surface for achieving high-quality metal films. Integration of microstrip wiring with this polyimide film may require high-temperature bakes to relax the stress in the polyimide film between metallization steps.
NASA Astrophysics Data System (ADS)
Schlichting, Hilke E.; Sari, Re'em
2011-02-01
Runaway growth is an important stage in planet formation during which large protoplanets form, while most of the initial mass remains in small planetesimals. The amount of mass converted into large protoplanets and their resulting size distribution are not well understood. Here, we use analytic work, which we confirm by coagulation simulations, to describe runaway growth and the corresponding evolution of the velocity dispersion. We find that runaway growth proceeds as follows. Initially, all the mass resides in small planetesimals, with mass surface density σ, and large protoplanets start to form by accreting small planetesimals. This growth continues until growth by merging large protoplanets becomes comparable to growth by planetesimal accretion. This condition sets in when Σ/σ ~ α^(3/4) ~ 10^(-3), where Σ is the mass surface density in protoplanets in a given logarithmic mass interval and α is the ratio of the size of a body to its Hill radius. From then on, protoplanetary growth and the evolution of the velocity dispersion become self-similar and Σ remains roughly constant, since an increase in Σ by accretion of small planetesimals is balanced by a decrease due to merging with large protoplanets. We show that this growth leads to a protoplanet size distribution given by N(>R) ∝ R^(-3), where N(>R) is the number of objects with radii greater than R (i.e., a differential power-law index of 4). Since only the largest bodies grow significantly during runaway growth, Σ and thereby the size distribution are preserved. We apply our results to the Kuiper Belt, which is a relic of runaway growth where planet formation never proceeded to completion. Our results successfully match the observed Kuiper Belt size distribution; they illuminate the physical processes that shaped it and explain the total mass that is present in large Kuiper Belt objects (KBOs) today. This work suggests that the current mass in large KBOs is primordial and that it has not been significantly depleted. We also predict a maximum mass ratio for Kuiper Belt binaries that formed by dynamical processes of α^(-1/4) ~ 10, which explains the observed clustering in binary companion sizes that is seen in the cold classical belt. Finally, our results also apply to growth in debris disks, as long as frequent planetesimal-planetesimal collisions are not important during the growth.
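A cumulative distribution N(>R) ∝ R^(-3) corresponds to a differential index of 4, as stated above. A quick numerical check by inverse-transform sampling is sketched below; it is purely illustrative and is not the coagulation simulation used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
r_min = 1.0
u = rng.random(200000)
radii = r_min * u ** (-1.0 / 3.0)   # sampling that gives N(>R) proportional to R^-3

# The differential distribution dN/dR should then scale roughly as R^-4.
bins = np.logspace(0, 2, 30)
counts, edges = np.histogram(radii, bins=bins)
centers = np.sqrt(edges[:-1] * edges[1:])
dens = counts / np.diff(edges)
mask = dens > 0
slope = np.polyfit(np.log(centers[mask]), np.log(dens[mask]), 1)[0]
print("differential slope ~", round(slope, 2))   # expect a value close to -4
```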
Conscience and Consciousness: a definition
Vithoulkas, G; Muresanu, DF
2014-01-01
While consciousness has been examined extensively in its different aspects, in philosophy, psychiatry, neurophysiology, neuroplasticity, etc., conscience, though an equally important aspect of human existence, remains unknown to a great degree, as an almost transcendental aspect of the human mind. It has not been examined as thoroughly as consciousness and largely remains a "terra incognita" with respect to its neurophysiology, brain topography, etc. Conscience and consciousness are part of a system of information that governs our experience and decision-making process. The intent of this paper is to define these terms, to discuss consciousness from both neurological and quantum physics points of view, to examine the relationship between the dynamics of consciousness and neuroplasticity, and to highlight the relationship between conscience, stress, and health. PMID:24653768
Large catchment area recharges Titan's Ontario Lacus
NASA Astrophysics Data System (ADS)
Dhingra, Rajani D.; Barnes, Jason W.; Yanites, Brian J.; Kirk, Randolph L.
2018-01-01
We seek to address the question of what processes are at work to fill Ontario Lacus while other, deeper south polar basins remain empty. Our hydrological analysis indicates that Ontario Lacus has a catchment area spanning 5.5% of Titan's surface and a large catchment area to lake surface area ratio. This large catchment area translates into large volumes of liquid making their way to Ontario Lacus after rainfall. The areal extent of the catchment extends to at least southern mid-latitudes (40°S). Mass conservation calculations indicate that runoff alone might completely fill Ontario Lacus within less than half a Titan year (1 Titan year = 29.5 Earth years) assuming no infiltration. Cassini Visual and Infrared Mapping Spectrometer (VIMS) observations of clouds over the southern mid and high-latitudes are consistent with precipitation feeding Ontario's large catchment area. This far-flung rain may be keeping Ontario Lacus filled, making it a liquid hydrocarbon oasis in the relatively dry south polar region.
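The mass-conservation argument above reduces to a ratio of lake volume to the runoff delivered by the catchment per unit time. The sketch below shows only the form of that calculation; every number is a placeholder, not a value from the study.

```python
# Order-of-magnitude fill-time sketch; all inputs are hypothetical placeholders.
lake_volume_m3 = 5.0e11          # assumed liquid volume needed to fill the basin
catchment_area_m2 = 4.5e12       # assumed catchment area draining to the lake
rain_rate_m_per_yr = 0.1         # assumed methane rainfall reaching the surface
runoff_fraction = 0.5            # assumed fraction of rainfall that runs off (1.0 = no infiltration)

inflow_m3_per_yr = catchment_area_m2 * rain_rate_m_per_yr * runoff_fraction
fill_time_earth_yr = lake_volume_m3 / inflow_m3_per_yr
print("fill time:", round(fill_time_earth_yr, 2), "Earth years =",
      round(fill_time_earth_yr / 29.5, 3), "Titan years")
```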
Continuous information flow fluctuations
NASA Astrophysics Data System (ADS)
Rosinberg, Martin Luc; Horowitz, Jordan M.
2016-10-01
Information plays a pivotal role in the thermodynamics of nonequilibrium processes with feedback. However, much remains to be learned about the nature of information fluctuations in small-scale devices and their relation with fluctuations in other thermodynamic quantities, like heat and work. Here we derive a series of fluctuation theorems for information flow and partial entropy production in a Brownian particle model of feedback cooling and extend them to arbitrary driven diffusion processes. We then analyze the long-time behavior of the feedback-cooling model in detail. Our results provide insights into the structure and origin of large deviations of information and thermodynamic quantities in autonomous Maxwell's demons.
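For orientation, the best-known relation of this kind is the Sagawa-Ueda integral fluctuation theorem for feedback-controlled processes, quoted below purely as background; the theorems derived in the paper, for continuous information flow and partial entropy production, are distinct results and are not reproduced here.

```latex
% Background only: the Sagawa--Ueda generalized Jarzynski relation for feedback control,
% where $W$ is work, $\Delta F$ the free-energy change, and $I$ the mutual information
% acquired by measurement.
\left\langle e^{-\beta (W - \Delta F) - I} \right\rangle = 1
\qquad\Longrightarrow\qquad
\langle W \rangle \;\ge\; \Delta F - k_{\mathrm{B}} T \,\langle I \rangle .
```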
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brombal, Daniele; Moriggi, Angela; Department of Environmental Sciences, Informatics and Statistics, University Ca' Foscari Venice
In recent years, China's government authorities have devoted increasing attention to the role of public participation processes in Environmental Impact Assessment (EIA). The capacity of these processes to influence decision-making remains widely debated. This paper aims at appraising the institutional rationale informing the implementation of public participation in China's EIA, benchmarking it against three conceptualisations: (1) Normative, based on objectives of empowerment and democratisation; (2) Substantive, where participation is pursued mainly to improve quality of decisions; (3) Instrumental, seeking participation as an instrument to legitimise decision-making processes. The appraisal is carried out by means of a new integrated index (Public Participation Index, PPI), which is applied to a case study representative of latest advancements in EIA public participation practices in China, namely the “New Beijing Airport Project”. Located 46 km south of downtown Beijing, the project was approved in 2014 and it is currently under construction. Results of the PPI application to this case study indicate that, despite progress made in recent years, the implementation of public participation in Chinese EIA still largely responds to an instrumental rationale, with limited capacity for the public to affect decisions. - Highlights: • In recent years China has strengthened EIA public participation (PP) legislation. • Despite progress, implementation of PP remains informed by an instrumental rationale. • A large gap exists between principles enunciated in regulations and implementation. • The Public Participation Index can be used to monitor China's EIA PP development.
Distributions of occupied and vacant butterfly habitats in fragmented landscapes.
Thomas, C D; Thomas, J A; Warren, M S
1992-12-01
We found several rare UK butterflies to be restricted to relatively large and non-isolated habitat patches, while small patches and those that are isolated from population sources remain vacant. These patterns of occurrence are generated by the dynamic processes of local extinction and colonization. Habitat patches act as terrestrial archipelagos in which long-term population persistence, and hence effective long-term conservation, rely on networks of suitable habitats, sufficiently close to allow natural dispersal.
Metals Additive Manufacturing. Great Promise in Mitigating Shortages but Some Risks Remain
2016-11-01
manufactured, shrinking development and delivery cycle times, and yielding improved performance at a lower cost per part. Shapes previously not...make a part. As illustrated in Figure 1, after each pass, a new layer of powder is laid down using a recoater blade and the process continues until...suggested deliberately abandoning large production runs and stockpiled inventories. The Defense Logistics Agency (DLA) lists AM as a priority in its R
Multiscale model of a freeze–thaw process for tree sap exudation
Graf, Isabell; Ceseri, Maurizio; Stockie, John M.
2015-01-01
Sap transport in trees has long fascinated scientists, and a vast literature exists on experimental and modelling studies of trees during the growing season when large negative stem pressures are generated by transpiration from leaves. Much less attention has been paid to winter months when trees are largely dormant but nonetheless continue to exhibit interesting flow behaviour. A prime example is sap exudation, which refers to the peculiar ability of sugar maple (Acer saccharum) and related species to generate positive stem pressure while in a leafless state. Experiments demonstrate that ambient temperatures must oscillate about the freezing point before significantly heightened stem pressures are observed, but the precise causes of exudation remain unresolved. The prevailing hypothesis attributes exudation to a physical process combining freeze–thaw and osmosis, which has some support from experimental studies but remains a subject of active debate. We address this knowledge gap by developing the first mathematical model for exudation, while also introducing several essential modifications to this hypothesis. We derive a multiscale model consisting of a nonlinear system of differential equations governing phase change and transport within wood cells, coupled to a suitably homogenized equation for temperature on the macroscale. Numerical simulations yield stem pressures that are consistent with experiments and provide convincing evidence that a purely physical mechanism is capable of capturing exudation. PMID:26400199
Basu, Swaraj; Larsson, Erik
2016-01-01
Identification of cancer driver genes using somatic mutation patterns indicative of positive selection has become a major goal in cancer genomics. However, cancer cells additionally depend on a large number of genes involved in basic cellular processes. While such genes should in theory be subject to strong purifying (negative) selection against damaging somatic mutations, these patterns have been elusive and purifying selection remains inadequately explored in cancer. Here, we hypothesized that purifying selection should be evident in hemizygous genomic regions, where damaging mutations cannot be compensated for by healthy alleles. Using a 7,781-sample pan-cancer dataset, we first confirmed this in POLR2A, an essential gene where hemizygous deletions are known to confer elevated sensitivity to pharmacological suppression. We next used this principle to identify several genes and pathways that show patterns indicative of purifying selection to avoid deleterious mutations. These include the POLR2A interacting protein INTS10 as well as genes involved in mRNA splicing, nonsense-mediated mRNA decay and other RNA processing pathways. Many of these genes belong to large protein complexes, and strong overlaps were observed with recent functional screens for gene essentiality in human cells. Our analysis supports that purifying selection acts to preserve the remaining function of many hemizygously deleted essential genes in tumors, indicating vulnerabilities that might be exploited by future therapeutic strategies. PMID:28027311
GraphReduce: Processing Large-Scale Graphs on Accelerator-Based Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, Dipanjan; Song, Shuaiwen; Agarwal, Kapil
2015-11-15
Recent work on real-world graph analytics has sought to leverage the massive amount of parallelism offered by GPU devices, but challenges remain due to the inherent irregularity of graph algorithms and limitations in GPU-resident memory for storing large graphs. We present GraphReduce, a highly efficient and scalable GPU-based framework that operates on graphs that exceed the device’s internal memory capacity. GraphReduce adopts a combination of edge- and vertex-centric implementations of the Gather-Apply-Scatter programming model and operates on multiple asynchronous GPU streams to fully exploit the high degrees of parallelism in GPUs with efficient graph data movement between the host and device.
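To make the Gather-Apply-Scatter (GAS) abstraction mentioned above concrete, here is a minimal, purely illustrative Python sketch. It is not GraphReduce's actual API: the real system is GPU-resident, partitions edges into shards that fit in device memory, and streams them asynchronously between host and device; all names below are hypothetical.

```python
# Minimal sketch of a synchronous Gather-Apply-Scatter (GAS) loop.
# Hypothetical names; GraphReduce's real implementation is shard-based and
# runs on asynchronous GPU streams.

def gather_apply_scatter(num_vertices, edges, init, gather, apply_fn, scatter, iterations):
    """edges: list of (src, dst) pairs; state is recomputed each iteration."""
    state = [init(v) for v in range(num_vertices)]
    for _ in range(iterations):
        # Gather: each vertex accumulates contributions from its in-edges.
        acc = [None] * num_vertices
        for src, dst in edges:
            acc[dst] = gather(acc[dst], src, state[src])
        # Apply: each vertex updates its own value from the accumulator.
        state = [apply_fn(state[v], acc[v]) for v in range(num_vertices)]
        # Scatter: push updates along out-edges (a no-op in this toy example).
        for src, dst in edges:
            scatter(state[src], dst)
    return state

# Toy PageRank-style use of the abstraction on a 3-vertex graph.
edges = [(0, 1), (1, 2), (2, 0), (0, 2)]
out_deg = [2, 1, 1]
ranks = gather_apply_scatter(
    3, edges,
    init=lambda v: 1.0 / 3,
    gather=lambda acc, src, val: (acc or 0.0) + val / out_deg[src],
    apply_fn=lambda old, acc: 0.15 / 3 + 0.85 * (acc or 0.0),
    scatter=lambda val, dst: None,
    iterations=20,
)
print(ranks)
```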
Mackay, Stephen; Gomes, Eduardo; Holliger, Christof; Bauer, Rolene; Schwitzguébel, Jean-Paul
2015-06-01
Despite recent advances in downstream processing, production of microalgae remains substantially limited for economic reasons. Harvesting and dewatering are the most energy-intensive processing steps in their production and contribute 20-30% of total operational cost. Bio-flocculation of microalgae by co-cultivation with filamentous fungi relies on the development of large structures that facilitate cost-effective harvesting. A hitherto unknown filamentous fungus was isolated as a contaminant from a microalgal culture and identified as Isaria fumosorosea. Blastospore production was optimized in minimal medium, and the development of pellets, possibly lichens, was followed during co-culture with Chlorella sorokiniana under strictly autotrophic conditions. Stable pellets (1-2 mm) formed rapidly at pH 7-8, clearing the medium of free algal cells. Biomass was harvested with large inexpensive filters, generating a wet slurry suitable for hydrothermal gasification. Nutrient-rich brine from the aqueous phase of hydrothermal gasification supported growth of the fungus and may increase the process sustainability. Copyright © 2015 Elsevier Ltd. All rights reserved.
Habacha, Hamdi; Moreau, David; Jarraya, Mohamed; Lejeune-Poutrain, Laure; Molinaro, Corinne
2018-01-01
The effect of stimulus size on the mental rotation of abstract objects has been extensively investigated, yet its effect on the mental rotation of bodily stimuli remains largely unexplored. Depending on the experimental design, mentally rotating bodily stimuli can elicit object-based transformations, relying mainly on visual processes, or egocentric transformations, which typically involve embodied motor processes. The present study included two mental body rotation tasks requiring either a same-different or a laterality judgment, designed to elicit object-based or egocentric transformations, respectively. Our findings revealed shorter response times for large-sized stimuli than for small-sized stimuli only at greater angular disparities, suggesting that the more unfamiliar the orientations of the bodily stimuli, the more stimulus size affected mental processing. Importantly, when comparing size transformation times, results revealed different patterns as a function of angular disparity between object-based and egocentric transformations. This indicates that mental size transformation and mental rotation proceed differently depending on the mental rotation strategy used. These findings are discussed with respect to the different spatial manipulations involved during object-based and egocentric transformations.
Comparing DNA damage-processing pathways by computer analysis of chromosome painting data.
Levy, Dan; Vazquez, Mariel; Cornforth, Michael; Loucas, Bradford; Sachs, Rainer K; Arsuaga, Javier
2004-01-01
Chromosome aberrations are large-scale illegitimate rearrangements of the genome. They are indicative of DNA damage and informative about damage processing pathways. Despite extensive investigations over many years, the mechanisms underlying aberration formation remain controversial. New experimental assays such as multiplex fluorescent in situ hybridization (mFISH) allow combinatorial "painting" of chromosomes and are promising for elucidating aberration formation mechanisms. Recently observed mFISH aberration patterns are so complex that computer and graph-theoretical methods are needed for their full analysis. An important part of the analysis is decomposing a chromosome rearrangement process into "cycles." A cycle of order n, characterized formally by the cyclic graph with 2n vertices, indicates that n chromatin breaks take part in a single irreducible reaction. We here describe algorithms for computing cycle structures from experimentally observed or computer-simulated mFISH aberration patterns. We show that analyzing cycles quantitatively can distinguish between different aberration formation mechanisms. In particular, we show that homology-based mechanisms do not generate the large number of complex aberrations, involving higher-order cycles, observed in irradiated human lymphocytes.
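As a rough illustration of the cycle structure described above (not the authors' actual algorithm, which must work from observed mFISH painting patterns), each double-strand break can be represented by its two free ends; one set of edges pairs the ends from the same break, another records the observed illegitimate rejoinings, and cycle orders are read off the connected components. The sketch below assumes the break ends and rejoinings are already known, and all names are illustrative.

```python
# Illustrative cycle-structure computation: each connected component of the
# break-edge plus rejoin-edge graph is a cycle, and its order equals the
# number of breaks (half the number of free-end vertices) it contains.
import networkx as nx

def cycle_structure(breaks, rejoinings):
    """breaks: (end_a, end_b) pairs of free ends from the same break.
    rejoinings: (end_x, end_y) pairs of ends that were actually joined."""
    g = nx.Graph()
    g.add_edges_from(breaks)
    g.add_edges_from(rejoinings)
    return sorted(len(component) // 2 for component in nx.connected_components(g))

# Toy example: a reciprocal translocation (order-2 cycle) plus a three-break
# complex exchange (order-3 cycle).
breaks = [("A1", "A2"), ("B1", "B2"), ("C1", "C2"), ("D1", "D2"), ("E1", "E2")]
rejoinings = [("A1", "B2"), ("B1", "A2"),
              ("C1", "D2"), ("D1", "E2"), ("E1", "C2")]
print(cycle_structure(breaks, rejoinings))  # -> [2, 3]
```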
Li, Jin; Zheng, Le; Uchiyama, Akihiko; Bin, Lianghua; Mauro, Theodora M; Elias, Peter M; Pawelczyk, Tadeusz; Sakowicz-Burkiewicz, Monika; Trzeciak, Magdalena; Leung, Donald Y M; Morasso, Maria I; Yu, Peng
2018-06-13
A large volume of biological data is being generated for studying mechanisms of various biological processes. These precious data enable large-scale computational analyses to gain biological insights. However, it remains a challenge to mine the data efficiently for knowledge discovery. The heterogeneity of these data makes it difficult to consistently integrate them, slowing down the process of biological discovery. We introduce a data processing paradigm to identify key factors in biological processes via systematic collection of gene expression datasets, primary analysis of data, and evaluation of consistent signals. To demonstrate its effectiveness, our paradigm was applied to epidermal development and identified many genes that play a potential role in this process. Besides the known epidermal development genes, a substantial proportion of the identified genes are still not supported by gain- or loss-of-function studies, yielding many novel genes for future studies. Among them, we selected a top gene for loss-of-function experimental validation and confirmed its function in epidermal differentiation, proving the ability of this paradigm to identify new factors in biological processes. In addition, this paradigm revealed many key genes in cold-induced thermogenesis using data from cold-challenged tissues, demonstrating its generalizability. This paradigm can lead to fruitful results for studying molecular mechanisms in an era of explosive accumulation of publicly available biological data.
Investigation into process-induced de-aggregation of cohesive micronised API particles.
Hoffmann, Magnus; Wray, Patrick S; Gamble, John F; Tobyn, Mike
2015-09-30
The aim of this study was to assess the impact of unit processes on the de-aggregation of a cohesive micronised API within a pharmaceutical formulation using near-infrared chemical imaging. The impact on the primary API particles was also investigated using an image-based particle characterization system with integrated Raman analysis. The blended material was shown to contain large, API-rich domains that were distributed inhomogeneously across the sample, suggesting that the blending process was not aggressive enough to disperse aggregates of micronised drug particles. Cone milling, routinely used to improve the homogeneity of such cohesive formulations, was observed to substantially reduce the number and size of API-rich domains; however, several smaller API domains survived the milling process. Conveyance of the cone milled formulation through the Alexanderwerk WP120 powder feed system completely dispersed all remaining aggregates. Importantly, powder feed transmission of the un-milled formulation was observed to produce an equally homogeneous API distribution. The size of the micronised primary drug particles remained unchanged during powder feed transmission. These findings provide further evidence that this powder feed system does induce shear, and is in fact better able to disperse aggregates of a cohesive micronised API within a blend than the blend-mill-blend step. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ise, T.; Litton, C. M.; Giardina, C. P.; Ito, A.
2009-12-01
Plant partitioning of carbon (C) to above- vs. belowground, to growth vs. respiration, and to short- vs. long-lived tissues exerts a large influence on ecosystem structure and function, with implications for the global C budget. Importantly, outcomes of process-based terrestrial vegetation models are likely to vary substantially with different C partitioning algorithms. However, controls on C partitioning patterns remain poorly quantified, and studies have yielded variable, and at times contradictory, results. A recent meta-analysis of forest studies suggests that the ratio of net primary production (NPP) to gross primary production (GPP) is fairly conservative across large scales. To illustrate the effect of this meta-analysis-based partitioning scheme (MPS), we compared NPP estimated by applying MPS to satellite-based (MODIS) GPP against NPP from two global process-based vegetation models (Biome-BGC and VISIT), in order to examine the influence of C partitioning on the C budgets of woody plants. Due to the temperature dependence of maintenance respiration, NPP/GPP predicted by the process-based models increased with latitude, while the ratio remained constant with MPS. Overall, global NPP estimated with MPS was 17% and 27% lower than that of the process-based models for temperate and boreal biomes, respectively, with smaller differences in the tropics. Global equilibrium biomass of woody plants was then calculated from the NPP estimates and tissue turnover rates from VISIT. Since turnover rates differed greatly across tissue types (i.e., metabolically active vs. structural), global equilibrium biomass estimates were sensitive to the partitioning scheme employed. The MPS estimate of global woody biomass was 7-21% lower than that of the process-based models. In summary, we found that model output for NPP and equilibrium biomass was quite sensitive to the choice of C partitioning scheme. [Figure: Carbon use efficiency (CUE; NPP/GPP) by forest biome and for the globe; values are means for 2001-2006.]
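To make the contrast between the two partitioning approaches concrete, here is a minimal numerical sketch: a fixed NPP/GPP ratio (the MPS idea) versus a process-model-style scheme in which maintenance respiration rises with temperature. All parameter values and function names are illustrative and are not those used in the study or in Biome-BGC/VISIT.

```python
def npp_fixed_cue(gpp, cue=0.5):
    """MPS-style partitioning: NPP is a fixed fraction of GPP.
    The ratio 0.5 is illustrative only."""
    return cue * gpp

def npp_process_style(gpp, biomass, mean_temp_c, r0=0.02, q10=2.0):
    """Process-model-style partitioning: maintenance respiration follows a
    Q10 temperature response, so NPP/GPP falls at warmer sites.
    All parameter values are hypothetical."""
    maintenance = r0 * biomass * q10 ** ((mean_temp_c - 20.0) / 10.0)
    growth_respiration = 0.25 * max(gpp - maintenance, 0.0)
    return max(gpp - maintenance - growth_respiration, 0.0)

# Same GPP (gC m^-2 yr^-1), different climates: the fixed-CUE estimate does not
# change, while the process-style estimate does, which is why the two schemes
# diverge most outside the tropics.
for temp_c in (5.0, 25.0):
    print(temp_c, npp_fixed_cue(1000.0), round(npp_process_style(1000.0, 10000.0, temp_c), 1))
```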
Major vessel involvement in Behçet disease.
Calamia, Kenneth T; Schirmer, Michael; Melikoglu, Melike
2005-01-01
Large vessel vasculitis occurs in a subgroup of patients with Behçet disease at high risk for disease-related morbidity and mortality. Recognition of patients at risk, early detection of vasculitis, and the need for aggressive treatment are essential for optimal care of these patients. The authors review the clinical spectrum and management of large vessel problems in Behçet disease, highlighting contributions over the past year. Vasculo-Behçet patients are at risk for multiple vessel-related complications including thromboses, stenoses, occlusions, and aneurysms. A number of factors may contribute to thrombosis in individual cases, but the primary reason for clot seems to reside in the inflammatory process in the arterial wall, still incompletely understood. An appreciation for the challenges in the perioperative period requires the joint efforts of physicians and surgeons, and fuels the study of alternate, less invasive procedures for Behçet patients. Because of earlier recognition, aggressive medical treatment, and novel surgical procedures, the morbidity and mortality of large vessel vasculitis in Behçet disease are beginning to change. In the absence of controlled treatment studies, reports of clinical experience remain an important source of information for clinicians. Identification of patients at risk for vascular complications remains a priority.
Current status and challenges for automotive battery production technologies
NASA Astrophysics Data System (ADS)
Kwade, Arno; Haselrieder, Wolfgang; Leithoff, Ruben; Modlinger, Armin; Dietrich, Franz; Droeder, Klaus
2018-04-01
Production technology for automotive lithium-ion battery (LIB) cells and packs has improved considerably in the past five years. However, the transfer of developments in materials, cell design and processes from lab scale to production scale remains a challenge due to the large number of consecutive process steps and the significant impact of material properties, electrode compositions and cell designs on processes. This requires an in-depth understanding of the individual production processes and their interactions, and pilot-scale investigations into process parameter selection and prototype cell production. Furthermore, emerging process concepts must be developed at lab and pilot scale that reduce production costs and improve cell performance. Here, we present an introductory summary of the state-of-the-art production technologies for automotive LIBs. We then discuss the key relationships between process, quality and performance, as well as explore the impact of materials and processes on scale and cost. Finally, future developments and innovations that aim to overcome the main challenges are presented.
Gallucci, Armen; Deutsch, Thomas; Youngquist, Jaymie
2013-01-01
The authors attempt to simplify the key elements of the process of negotiating successfully with private physicians. From their experience, the business elements that have resulted in the most discussion center on compensation, including the incentive plan. Secondarily, how the issue of malpractice is handled will also consume a fair amount of time. The authors have also learned that the intangible issues can often be the reason for an unexpectedly large amount of discussion and therefore add time to the negotiation process. To assist with this process, they have derived a negotiation checklist, which seeks to help hospital leaders and administrators set the proper framework to ensure successful negotiation conversations. More importantly, being organized, recognizing these broad issues upfront, and remaining transparent throughout the process will help to ensure a successful negotiation.
Experiments to Distribute Map Generalization Processes
NASA Astrophysics Data System (ADS)
Berli, Justin; Touya, Guillaume; Lokhat, Imran; Regnauld, Nicolas
2018-05-01
Automatic map generalization requires the use of computationally intensive processes often unable to deal with large datasets. Distributing the generalization process is the only way to make them scalable and usable in practice. But map generalization is a highly contextual process: the surroundings of a map feature need to be known to generalize it, which is a problem because distribution may partition the dataset and parallelize the processing of each part. This paper proposes experiments to evaluate past propositions to distribute map generalization and to identify the main remaining issues. The past propositions to distribute map generalization are first discussed, and then the experiment hypotheses and apparatus are described. The experiments confirmed that regular partitioning was the quickest strategy, but also the least effective in taking context into account. The geographical partitioning, though less effective for now, is quite promising regarding the quality of the results, as it better integrates the geographical context.
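A minimal sketch of the trade-off discussed above: regular (grid) partitioning is trivial to compute and balance, but features near a cell boundary lose geographic context unless neighbouring features are copied into a buffer zone around each cell. The code below is purely illustrative and is not taken from the paper's experiments.

```python
# Regular grid partitioning with a context buffer. Features flagged False are
# context-only copies: they are available to the worker for context but are
# discarded from that partition's output. All names are illustrative.

def regular_partition(features, cell_size, buffer=0.0):
    """features: iterable of (x, y, payload). Returns {cell: [(x, y, payload, is_home)]}."""
    cells = {}
    for x, y, payload in features:
        home = (int(x // cell_size), int(y // cell_size))
        for gx in range(int((x - buffer) // cell_size), int((x + buffer) // cell_size) + 1):
            for gy in range(int((y - buffer) // cell_size), int((y + buffer) // cell_size) + 1):
                cells.setdefault((gx, gy), []).append((x, y, payload, (gx, gy) == home))
    return cells

# Each cell can then be generalized on a separate worker; the buffer width
# controls how much surrounding context each worker sees.
parts = regular_partition([(0.4, 0.4, "a"), (0.98, 0.5, "b"), (1.2, 0.5, "c")], cell_size=1.0, buffer=0.1)
for cell, feats in sorted(parts.items()):
    print(cell, feats)
```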
The neural basis of novelty and appropriateness in processing of creative chunk decomposition.
Huang, Furong; Fan, Jin; Luo, Jing
2015-06-01
Novelty and appropriateness have been recognized as the fundamental features of creative thinking. However, the brain mechanisms underlying these features remain largely unknown. In this study, we used event-related functional magnetic resonance imaging (fMRI) to dissociate these mechanisms in a revised creative chunk decomposition task in which participants were required to perform different types of chunk decomposition that systematically varied in novelty and appropriateness. We found that novelty processing involved functional areas for procedural memory (caudate), mental rewarding (substantia nigra, SN), and visual-spatial processing, whereas appropriateness processing was mediated by areas for declarative memory (hippocampus), emotional arousal (amygdala), and orthography recognition. These results indicate that non-declarative and declarative memory systems may jointly contribute to the two fundamental features of creative thinking. Copyright © 2015 Elsevier Inc. All rights reserved.
History of surface weather observations in the United States
NASA Astrophysics Data System (ADS)
Fiebrich, Christopher A.
2009-04-01
In this paper, the history of surface weather observations in the United States is reviewed. Local weather observations were first documented in the 17th Century along the East Coast. For many years, the progression of a weather observation from an initial reading to dissemination remained a slow and laborious process. The number of observers remained small and unorganized until agencies including the Surgeon General, Army, and General Land Office began to request regular observations at satellite locations in the 1800s. The Smithsonian was responsible for first organizing a large "network" of volunteer weather observers across the nation. These observers became the foundation for today's Cooperative Observer network. As applications of weather data continued to grow and users required the data with an ever-decreasing latency, automated weather networks saw rapid growth in the latter part of the 20th century. Today, the number of weather observations across the U.S. totals in the tens of thousands, due largely to privately-owned weather networks and amateur weather observers who submit observations over the internet.
Liquid fuels from food waste: An alternative process to co-digestion
NASA Astrophysics Data System (ADS)
Sim, Yoke-Leng; Ch'ng, Boon-Juok; Mok, Yau-Cheng; Goh, Sok-Yee; Hilaire, Dickens Saint; Pinnock, Travis; Adams, Shemlyn; Cassis, Islande; Ibrahim, Zainab; Johnson, Camille; Johnson, Chantel; Khatim, Fatima; McCormack, Andrece; Okotiuero, Mary; Owens, Charity; Place, Meoak; Remy, Cristine; Strothers, Joel; Waithe, Shannon; Blaszczak-Boxe, Christopher; Pratt, Lawrence M.
2017-04-01
Waste from uneaten, spoiled, or otherwise unusable food is an untapped source of material for biofuels. A process is described to recover the oil from mixed food waste, together with a solid residue. This process includes grinding the food waste to an aqueous slurry, skimming off the oil, steam-treating the remaining solids while extruding them through a porous cylinder to release the remaining oil, skimming the oil a second time, and centrifuging the solids to obtain a moist solid cake for fermentation. The water, together with any oil resulting from the centrifuging step, is recycled back to the grinding step, and the cycle is repeated. The efficiency of oil extraction increases with the oil content of the waste, and greater than 90% of the oil was collected from waste containing at least 3% oil based on the wet mass. Fermentation was performed on the solid cake to obtain ethanol, and the dried solid fermentation residue was a nearly odorless material with potential uses including biochar, gasification, or compost production. This technology has the potential to enable large producers of food waste to comply with new laws that require this material to be diverted from landfills.
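A back-of-envelope mass balance for the recovery figure quoted above; the batch size is hypothetical, and only the 3% oil threshold and the >90% recovery come from the abstract.

```python
# Illustrative mass balance for one hypothetical 100 kg batch of wet food waste.
wet_waste_kg = 100.0      # hypothetical batch size
oil_fraction = 0.03       # 3% oil by wet mass (threshold cited above)
recovery = 0.90           # lower bound of the reported recovery efficiency

oil_present_kg = wet_waste_kg * oil_fraction      # 3.0 kg of oil in the batch
oil_recovered_kg = oil_present_kg * recovery      # at least 2.7 kg skimmed off
residual_wet_mass_kg = wet_waste_kg - oil_recovered_kg  # water (recycled) plus solids for fermentation
print(oil_recovered_kg, residual_wet_mass_kg)
```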
NASA Astrophysics Data System (ADS)
Mumma, Michael J.
2008-10-01
As messengers from the early Solar System, comets contain key information from the time of planet formation and even earlier; some may contain material formed in our natal interstellar cloud. Along with water, the cometary nucleus contains ices of natural gases (CH4, C2H6), alcohols (CH3OH), acids (HCOOH), embalming fluid (H2CO), and even anti-freeze (ethylene glycol). Comets today contain some ices that vaporize at temperatures near absolute zero (CO, CH4), demonstrating that their compositions remain largely unchanged after 4.5 billion years. By comparing their chemical diversity, several distinct cometary classes have been identified, but their specific relation to chemical gradients in the proto-planetary disk remains murky. How does the compositional diversity of comets relate to nebular processes such as chemical processing, radial migration, and dynamical scattering? No current reservoir holds a unique class, but their fractional abundances can test emerging dynamical models for the origins of the scattered Kuiper disk, the Oort cloud, and the (proposed) main-belt comets. I will provide a simplified overview emphasizing what we are learning, current issues, and their relevance to the subject of this Symposium.
NASA Astrophysics Data System (ADS)
González Riga, Bernardo J.; Astini, Ricardo A.
2007-04-01
Patagonia exhibits a particularly abundant record of Cretaceous dinosaurs with worldwide relevance. Although paleontological studies are relatively numerous, few include taphonomic information about these faunas. This contribution provides the first detailed sedimentological and taphonomical analyses of a dinosaur bone quarry from the northern Neuquén Basin. At Arroyo Seco (Mendoza Province, Argentina), a large parautochthonous/autochthonous accumulation of articulated and disarticulated bones that represent several sauropod individuals has been discovered. The fossil remains, assigned to Mendozasaurus neguyelap González Riga, correspond to a large (18-27-m long) sauropod titanosaur collected in the strata of the Río Neuquén Subgroup (late Turonian-late Coniacian). A taphonomic viewpoint recognizes a two-fold division into biostratinomic and fossil-diagenetic processes. Biostratinomic processes include (1) subaerial biodegradation of sauropod carcasses on well-drained floodplains, (2) partial or total skeletal disarticulation, (3) reorientation of bones by sporadic overbank flows, and (4) subaerial weathering. Fossil-diagenetic processes include (1) plastic deformation of bones, (2) initial permineralization with hematite, (3) fracturing and brittle deformation due to lithostatic pressure, (4) secondary permineralization with calcite in vascular canals and fractures, and (5) postfossilization bone weathering. This type of bone concentration, also present in Rincón de los Sauces (northern Patagonia), suggests that overbank facies tended to accumulate large titanosaur bones. This taphonomic mode, referred to as "overbank bone assemblages", outlines the potential of crevasse splay facies as important sources of paleontological data in Cretaceous meandering fluvial systems.
NASA Astrophysics Data System (ADS)
Akanda, A. S.; Jutla, A.; Huq, A.; Colwell, R. R.
2014-12-01
Cholera is a global disease, with large outbreaks occurring since the 1990s, notably in Sub-Saharan Africa and South Asia and recently in Haiti, in the Caribbean. Critical knowledge gaps remain in the understanding of the annual recurrence in endemic areas and the nature of epidemic outbreaks, especially those that follow extreme hydroclimatic events. Teleconnections with large-scale climate phenomena affecting regional-scale hydroclimatic drivers of cholera dynamics remain largely unexplained. For centuries, the Bengal delta region has been strongly influenced by the asymmetric availability of water in the rivers Ganges and Brahmaputra. As these two major rivers are known to have strongly contrasting effects on local cholera dynamics in the region, we argue that the role of the El Nino-Southern Oscillation (ENSO), the Indian Ocean Dipole (IOD), or other phenomena needs to be interpreted in the context of the seasonal role of individual rivers and the subsequent impact on local environmental processes, not as a teleconnection having a remote and unified effect. We present a modified hypothesis that the influences of large-scale climate phenomena such as ENSO and IOD on Bengal cholera can be explicitly identified and incorporated through regional-scale hydroclimatic drivers. Here, we provide an analytical review of the literature addressing cholera and climate linkages and present hypotheses, based on recent evidence, and quantification of the role of regional-scale hydroclimatic drivers of cholera. We argue that the seasonal changes in precipitation and temperature, and the resulting river discharge in the GBM basin region during ENSO and IOD events, have a dominant combined effect on the endemic persistence and the epidemic vulnerability to cholera outbreaks in the spring and fall seasons, respectively, that is stronger than the effect of localized hydrological and socio-economic sensitivities in Bangladesh. In addition, systematic identification of the underlying seasonal hydroclimatic drivers will allow us to harness the inherent system memory of these processes to develop early warning systems and strengthen prevention measures.
Dhakar, Lokesh; Gudla, Sudeep; Shan, Xuechuan; Wang, Zhiping; Tay, Francis Eng Hock; Heng, Chun-Huat; Lee, Chengkuo
2016-01-01
Triboelectric nanogenerators (TENGs) have emerged as a potential solution for mechanical energy harvesting over conventional mechanisms such as piezoelectric and electromagnetic harvesters, owing to easy fabrication, high efficiency and a wider choice of materials. Traditional fabrication techniques used to realize TENGs involve plasma etching, soft lithography and nanoparticle deposition for higher performance. However, the lack of truly scalable fabrication processes remains a critical challenge and bottleneck on the path to bringing TENGs to commercial production. In this paper, we demonstrate fabrication of a large-scale triboelectric nanogenerator (LS-TENG) using roll-to-roll ultraviolet embossing to pattern polyethylene terephthalate sheets. These LS-TENGs can be used to harvest energy from human motion and vehicle motion through devices embedded in floors and roads, respectively. The LS-TENG generated a power density of 62.5 mW/m^2. Using the roll-to-roll processing technique, we also demonstrate a large-scale triboelectric pressure sensor array with a pressure detection sensitivity of 1.33 V/kPa. The large-scale pressure sensor array has applications in self-powered motion tracking, posture monitoring and electronic skin. This work demonstrates scalable fabrication of TENGs and self-powered pressure sensor arrays, which will greatly lower their cost and bring them closer to commercial production. PMID:26905285
Broca's area: a supramodal hierarchical processor?
Tettamanti, Marco; Weniger, Dorothea
2006-05-01
Despite the presence of shared characteristics across the different domains modulating Broca's area activity (e.g., structural analogies, as between language and music, or representational homologies, as between action execution and action observation), the question of what exactly the common denominator of such diverse brain functions is, with respect to the function of Broca's area, remains largely a debated issue. Here, we suggest that an important computational role of Broca's area may be to process hierarchical structures in a wide range of functional domains.
The cell biology of mammalian fertilization.
Okabe, Masaru
2013-11-01
Fertilization is the process by which eggs and spermatozoa interact, achieve mutual recognition, and fuse to create a zygote, which then develops to form a new individual, thus allowing for the continuity of a species. Despite numerous studies on mammalian fertilization, the molecular mechanisms underpinning the fertilization event remain largely unknown. However, as I summarize here, recent work using both gene-manipulated animals and in vitro studies has begun to elucidate essential sperm and egg molecules and to establish predictive models of successful fertilization.
Theory of ion-matrix-sheath dynamics
NASA Astrophysics Data System (ADS)
Kos, L.; Tskhakaya, D. D.
2018-01-01
The time evolution of a one-dimensional, uni-polar ion sheath (an "ion matrix sheath") is investigated. The analytical solutions for the ion-fluid and Poisson's equations are found for an arbitrary time dependence of the wall-applied negative potential. In the case that the wall potential is large and remains constant after its ramp-up application, the explicit time dependencies of the sheath's parameters during the initial stage of the process are given. The characteristic rate of approaching the stationary state, satisfying the Child-Langmuir law, is determined.
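For reference, the stationary state mentioned above satisfies the planar Child-Langmuir law; the textbook form for singly charged ions of mass M crossing a sheath of thickness d at wall potential magnitude V0 is reproduced below (standard result, not copied from the paper).

```latex
% Planar Child--Langmuir current density (standard form for ions of mass M,
% charge e; V_0 is the magnitude of the wall potential, d the sheath thickness):
\[
  J_{\mathrm{CL}} = \frac{4}{9}\,\varepsilon_0 \sqrt{\frac{2e}{M}}\,
  \frac{V_0^{3/2}}{d^{2}} .
\]
```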
Analysis of Growth and Molecular Responses to Ethylene in Etiolated Rice Seedlings.
Ma, Biao; Zhang, Jin-Song
2017-01-01
Ethylene plays a key role in various submergence responses of rice plants, but the mechanism of ethylene action remains largely unclear in rice. Given the differences between rice and Arabidopsis in ethylene-regulated processes, rice plants may possess divergent mechanisms of ethylene signaling in addition to the conserved aspects. Forward genetic analysis is essential to fully understand the ethylene signaling mechanism in rice. Here, we describe a method for screening ethylene-response mutants and evaluating ethylene responsiveness in etiolated rice seedlings.
Garner, Bryan R.; Smith, Jane Ellen; Meyers, Robert J.; Godley, Mark D.
2010-01-01
Multiple evidence-based treatments for adolescents with substance use disorders are available; however, the diffusion of these treatments in practice remains minimal. A dissemination and implementation model incorporating research-based training components for simultaneous implementation across 33 dispersed sites and over 200 clinical staff is described. Key elements for the diffusion of the Adolescent Community Reinforcement Approach and Assertive Continuing Care were: (a) three years of funding to support local implementation; (b) comprehensive training, including a 3.5 day workshop, bi-weekly coaching calls, and ongoing performance feedback facilitated by a web tool; (c) a clinician certification process; (d) a supervisor certification process to promote long-term sustainability; and (e) random fidelity reviews after certification. Process data are summarized for 167 clinicians and 64 supervisors. PMID:21547241
How adolescents come to see themselves as more responsible through participation in youth programs.
Wood, Dustin; Larson, Reed W; Brown, Jane R
2009-01-01
This qualitative study was aimed at developing theory about the process underlying the development of responsibility, grounded in accounts of youth who reported experiencing this change. A total of 108 high-school-aged (M = 16.5) youth from 11 programs were interviewed about their experiences within the program, and 24 reported becoming more responsible through their participation. The youth's accounts suggested that this process was driven largely by successfully fulfilling program expectations. Fulfilling these expectations, in turn, depended on youth's adherence to their commitments and their consideration of the consequences of their actions on others. Youth mentioned changes in responsibility most frequently in three programs, which appeared to differ from the remaining programs in having more structure and placing greater ownership and accountability on youth.
HESS Opinions: The need for process-based evaluation of large-domain hyper-resolution models
NASA Astrophysics Data System (ADS)
Melsen, Lieke A.; Teuling, Adriaan J.; Torfs, Paul J. J. F.; Uijlenhoet, Remko; Mizukami, Naoki; Clark, Martyn P.
2016-03-01
A meta-analysis on 192 peer-reviewed articles reporting on applications of the variable infiltration capacity (VIC) model in a distributed way reveals that the spatial resolution at which the model is applied has increased over the years, while the calibration and validation time interval has remained unchanged. We argue that the calibration and validation time interval should keep pace with the increase in spatial resolution in order to resolve the processes that are relevant at the applied spatial resolution. We identified six time concepts in hydrological models, which all impact the model results and conclusions. Process-based model evaluation is particularly relevant when models are applied at hyper-resolution, where stakeholders expect credible results both at a high spatial and temporal resolution.
HESS Opinions: The need for process-based evaluation of large-domain hyper-resolution models
NASA Astrophysics Data System (ADS)
Melsen, L. A.; Teuling, A. J.; Torfs, P. J. J. F.; Uijlenhoet, R.; Mizukami, N.; Clark, M. P.
2015-12-01
A meta-analysis on 192 peer-reviewed articles reporting applications of the Variable Infiltration Capacity (VIC) model in a distributed way reveals that the spatial resolution at which the model is applied has increased over the years, while the calibration and validation time interval has remained unchanged. We argue that the calibration and validation time interval should keep pace with the increase in spatial resolution in order to resolve the processes that are relevant at the applied spatial resolution. We identified six time concepts in hydrological models, which all impact the model results and conclusions. Process-based model evaluation is particularly relevant when models are applied at hyper-resolution, where stakeholders expect credible results both at a high spatial and temporal resolution.
Effective Connectivity Reveals Largely Independent Parallel Networks of Face and Body Patches.
Premereur, Elsie; Taubert, Jessica; Janssen, Peter; Vogels, Rufin; Vanduffel, Wim
2016-12-19
The primate brain processes objects in the ventral visual pathway. One object category, faces, is processed in a hierarchical network of interconnected areas along this pathway. It remains unknown whether such an interconnected network is specific for faces or whether there are similar networks for other object classes. For example, the primate inferotemporal cortex also contains a set of body-selective patches, adjacent to the face-selective patches, but it is not known whether these body-selective patches form a similar discretely connected network or whether cross-talk exists between the face- and body-processing systems. To address these questions, we combined fMRI with electrical microstimulation to determine the effective connectivity of fMRI-defined face and body patches. We found that microstimulation of face patches caused increased fMRI activation throughout the face-processing system; microstimulation of the body patches gave similar results restricted to the body-processing system. Critically, our results revealed largely segregated connectivity patterns for the body and face patches. These results suggest that face and body patches form two interconnected hierarchical networks that are largely separated within the monkey inferotemporal cortex. Only a restricted number of voxels were activated by stimulation of both the body and face patches. The latter regions may be important for the integration of face and body information. Our findings are not only essential to advance our understanding of the neural circuits that enable social cognition, but they also provide further insights into the organizing principles of the inferotemporal cortex. Copyright © 2016 Elsevier Ltd. All rights reserved.
High-Throughput DNA sequencing of ancient wood.
Wagner, Stefanie; Lagane, Frédéric; Seguin-Orlando, Andaine; Schubert, Mikkel; Leroy, Thibault; Guichoux, Erwan; Chancerel, Emilie; Bech-Hebelstrup, Inger; Bernard, Vincent; Billard, Cyrille; Billaud, Yves; Bolliger, Matthias; Croutsch, Christophe; Čufar, Katarina; Eynaud, Frédérique; Heussner, Karl Uwe; Köninger, Joachim; Langenegger, Fabien; Leroy, Frédéric; Lima, Christine; Martinelli, Nicoletta; Momber, Garry; Billamboz, André; Nelle, Oliver; Palomo, Antoni; Piqué, Raquel; Ramstein, Marianne; Schweichel, Roswitha; Stäuble, Harald; Tegel, Willy; Terradas, Xavier; Verdin, Florence; Plomion, Christophe; Kremer, Antoine; Orlando, Ludovic
2018-03-01
Reconstructing the colonization and demographic dynamics that gave rise to extant forests is essential to forecasts of forest responses to environmental changes. Classical approaches to mapping how populations of trees changed through space and time rely largely on pollen distribution patterns, with only a limited number of studies exploiting DNA molecules preserved in archaeological and subfossil wood remains. Here, we advance such analyses by applying high-throughput DNA sequencing (HTS) to archaeological and subfossil wood material for the first time, using a comprehensive sample of 167 European white oak waterlogged remains spanning a large temporal (from 550 to 9,800 years) and geographical range across Europe. The successful characterization of the endogenous DNA and exogenous microbial DNA of 140 (~83%) samples helped identify environmental conditions favouring long-term DNA preservation in wood remains, and started to unveil the first trends in the DNA decay process in wood material. Additionally, the maternally inherited chloroplast haplotypes of 21 samples from three periods of human-induced forest use (Neolithic, Bronze Age and Middle Ages) were found to be consistent with those of modern populations growing in the same geographic areas. Our work paves the way for further studies aiming to use ancient DNA preserved in wood to reconstruct the micro-evolutionary response of trees to climate change and human forest management. © 2018 John Wiley & Sons Ltd.
Earliest Archaeological Evidence of Persistent Hominin Carnivory
Ferraro, Joseph V.; Plummer, Thomas W.; Pobiner, Briana L.; Oliver, James S.; Bishop, Laura C.; Braun, David R.; Ditchfield, Peter W.; Seaman, John W.; Binetti, Katie M.; Seaman, John W.; Hertel, Fritz; Potts, Richard
2013-01-01
The emergence of lithic technology by ∼2.6 million years ago (Ma) is often interpreted as a correlate of increasingly recurrent hominin acquisition and consumption of animal remains. Associated faunal evidence, however, is poorly preserved prior to ∼1.8 Ma, limiting our understanding of early archaeological (Oldowan) hominin carnivory. Here, we detail three large well-preserved zooarchaeological assemblages from Kanjera South, Kenya. The assemblages date to ∼2.0 Ma, pre-dating all previously published archaeofaunas of appreciable size. At Kanjera, there is clear evidence that Oldowan hominins acquired and processed numerous, relatively complete, small ungulate carcasses. Moreover, they had at least occasional access to the fleshed remains of larger, wildebeest-sized animals. The overall record of hominin activities is consistent through the stratified sequence – spanning hundreds to thousands of years – and provides the earliest archaeological evidence of sustained hominin involvement with fleshed animal remains (i.e., persistent carnivory), a foraging adaptation central to many models of hominin evolution. PMID:23637995
Bottoni, Patrizia; Isgrò, Maria Antonietta; Scatena, Roberto
2016-01-01
The epithelial-mesenchymal transition (EMT) is a morphogenetic process that results in a loss of epithelial characteristics and the acquisition of a mesenchymal phenotype. First described in embryogenesis, the EMT has been recently implicated in carcinogenesis and tumor progression. In addition, recent evidence has shown that stem-like cancer cells present the hallmarks of the EMT. Some of the molecular mechanisms related to the interrelationships between cancer pathophysiology and the EMT are well-defined. Nevertheless, the precise molecular mechanism by which epithelial cancer cells acquire the mesenchymal phenotype remains largely unknown. This review focuses on various proteomic strategies with the goal of better understanding the physiological and pathological mechanisms of the EMT process.
Trietsch, Jasper; van Steenkiste, Ben; Hobma, Sjoerd; Frericks, Arnoud; Grol, Richard; Metsemakers, Job; van der Weijden, Trudy
2014-12-01
A quality improvement strategy consisting of comparative feedback and peer review embedded in available local quality improvement collaboratives proved to be effective in changing the test-ordering behaviour of general practitioners. However, implementing this strategy was problematic. We aimed for large-scale implementation of an adapted strategy covering both test-ordering and prescribing performance. Because we failed to achieve large-scale implementation, the aim of this study was to describe and analyse the challenges of the transfer process. In a qualitative study, 19 regional health officers, pharmacists, laboratory specialists and general practitioners were interviewed within 6 months after the transfer period. The interviews were audiotaped, transcribed and independently coded by two of the authors. The codes were matched to the dimensions of normalization process theory. The general idea of the strategy was widely supported, but generating the feedback was more complex than expected, and the need for external support after transfer of the strategy remained high because participants did not assume responsibility for the work and the distribution of resources that came with it. Evidence of effectiveness, a national infrastructure for these collaboratives and a generally positive attitude were not sufficient for normalization. Thinking about managing large databases, responsibility for tasks and distribution of resources should start as early as possible when planning complex quality improvement strategies. Merely exploring the barriers and facilitators experienced in a preceding trial is not sufficient. Although multifaceted implementation strategies to change professional behaviour are attractive, their inherent complexity is also a pitfall for large-scale implementation. © 2014 John Wiley & Sons, Ltd.
Engineering two-photon high-dimensional states through quantum interference
Zhang, Yingwen; Roux, Filippus S.; Konrad, Thomas; Agnew, Megan; Leach, Jonathan; Forbes, Andrew
2016-01-01
Many protocols in quantum science, for example, linear optical quantum computing, require access to large-scale entangled quantum states. Such systems can be realized through many-particle qubits, but this approach often suffers from scalability problems. An alternative strategy is to consider a lesser number of particles that exist in high-dimensional states. The spatial modes of light are one such candidate that provides access to high-dimensional quantum states, and thus they increase the storage and processing potential of quantum information systems. We demonstrate the controlled engineering of two-photon high-dimensional states entangled in their orbital angular momentum through Hong-Ou-Mandel interference. We prepare a large range of high-dimensional entangled states and implement precise quantum state filtering. We characterize the full quantum state before and after the filter, and are thus able to determine that only the antisymmetric component of the initial state remains. This work paves the way for high-dimensional processing and communication of multiphoton quantum states, for example, in teleportation beyond qubits. PMID:26933685
Fabrication of Large-area Free-standing Ultrathin Polymer Films
Stadermann, Michael; Baxamusa, Salmaan H.; Aracne-Ruddle, Chantel; Chea, Maverick; Li, Shuaili; Youngblood, Kelly; Suratwala, Tayyab
2015-01-01
This procedure describes a method for the fabrication of large-area and ultrathin free-standing polymer films. Typically, ultrathin films are prepared using either sacrificial layers, which may damage the film or affect its mechanical properties, or they are made on freshly cleaved mica, a substrate that is difficult to scale. Further, the size of ultrathin film is typically limited to a few square millimeters. In this method, we modify a surface with a polyelectrolyte that alters the strength of adhesion between polymer and deposition substrate. The polyelectrolyte can be shown to remain on the wafer using spectroscopy, and a treated wafer can be used to produce multiple films, indicating that at best minimal amounts of the polyelectrolyte are added to the film. The process has thus far been shown to be limited in scalability only by the size of the coating equipment, and is expected to be readily scalable to industrial processes. In this study, the protocol for making the solutions, preparing the deposition surface, and producing the films is described. PMID:26066738
NASA Astrophysics Data System (ADS)
Du, Fang-Fang; Deng, Fu-Guo; Long, Gui-Lu
2016-11-01
Entanglement concentration protocol (ECP) is used to extract the maximally entangled states from less entangled pure states. Here we present a general hyperconcentration protocol for two-photon systems in partially hyperentangled Bell states that decay with the interrelation between the time-bin and the polarization degrees of freedom (DOFs), resorting to an input-output process with respect to diamond nitrogen-vacancy centers coupled to resonators. We show that the resource can be utilized sufficiently and the success probability is largely improved by iteration of the hyper-ECP process. Besides, our hyper-ECP can be directly extended to concentrate nonlocal partially hyperentangled N-photon Greenberger-Horne-Zeilinger states, and the success probability remains unchanged with the growth of the number of photons. Moreover, the time-bin entanglement is a useful DOF and it only requires one path for transmission, which means it not only economizes on a large amount of quantum resources but also relaxes from the path-length dispersion in long-distance quantum communication.
Reducing graphene device variability with yttrium sacrificial layers
NASA Astrophysics Data System (ADS)
Wang, Ning C.; Carrion, Enrique A.; Tung, Maryann C.; Pop, Eric
2017-05-01
Graphene technology has made great strides since the material was isolated more than a decade ago. However, despite improvements in growth quality and numerous "hero" devices, challenges of uniformity remain, restricting the large-scale development of graphene-based technologies. Here, we investigate and reduce the variability of graphene transistors by studying the effects of contact metals (with and without a Ti layer), resist, and yttrium (Y) sacrificial layers during the fabrication of hundreds of devices. We find that with optical photolithography, residual resist and process contamination are unavoidable, ultimately limiting the device performance and yield. However, using Y sacrificial layers to isolate the graphene from processing conditions improves the yield (from 73% to 97%), the average device performance (three-fold increase of mobility and 58% lower contact resistance), and the device-to-device variability (standard deviation of Dirac voltage reduced by 20%). In contrast to other sacrificial layer techniques, the removal of the Y sacrificial layer with dilute HCl does not harm surrounding materials, simplifying large-scale graphene fabrication.
Jiménez-Moreno, Ester; Jiménez-Osés, Gonzalo; Gómez, Ana M; Santana, Andrés G; Corzana, Francisco; Bastida, Agatha; Jiménez-Barbero, Jesus; Asensio, Juan Luis
2015-11-13
CH/π interactions play a key role in a large variety of molecular recognition processes of biological relevance. However, their origins and structural determinants in water remain poorly understood. In order to improve our comprehension of these important interaction modes, we have performed a quantitative experimental analysis of a large data set comprising 117 chemically diverse carbohydrate/aromatic stacking complexes, prepared through a dynamic combinatorial approach recently developed by our group. The obtained free energies provide a detailed picture of the structure-stability relationships that govern the association process, opening the door to the rational design of improved carbohydrate-based ligands or carbohydrate receptors. Moreover, this experimental data set, supported by quantum mechanical calculations, has contributed to the understanding of the main driving forces that promote complex formation, underlining the key role played by coulombic and solvophobic forces on the stabilization of these complexes. This represents the most quantitative and extensive experimental study reported so far for CH/π complexes in water.
Disgust and Obsessive Beliefs in Contamination-related OCD
Cisler, Josh M.; Brady, Robert E.; Olatunji, Bunmi O.; Lohr, Jeffrey M.
2010-01-01
A large body of evidence suggests that disgust is an important affective process underlying contamination fear. An independent line of research demonstrates that obsessive beliefs, particularly overestimations of threat, are also an important cognitive process underlying contamination fear. The present study attempts to integrate these two lines of research by testing whether obsessive beliefs potentiate the influence of disgust propensity on contamination fear. The interaction between disgust propensity and obsessive beliefs was tested in two large non-clinical samples (N = 252 in Study 1; N = 308 in Study 2) using two different self-report measures of contamination fear. Regression analyses supported the hypotheses in both samples. The interaction remained significant when controlling for negative affect. The results are hypothesized to suggest that contamination fear results, at least partly, from obsessive beliefs about the contamination-based appraisals that accompany heightened disgust responding. These results complement previous affective-driven explanations of the role of disgust in contamination fear by suggesting cognitive factors that similarly potentiate disgust’s role in contamination fear. PMID:20877585
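A minimal sketch of the moderation analysis described above: testing whether obsessive beliefs potentiate the effect of disgust propensity on contamination fear while controlling for negative affect. The variable names and data file are hypothetical, and the original study's exact model specification may differ.

```python
# Illustrative moderated regression (hypothetical column names and file).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("contamination_fear.csv")  # hypothetical dataset

# Mean-center the predictors so the interaction term is interpretable.
for col in ["disgust_propensity", "obsessive_beliefs", "negative_affect"]:
    df[col + "_c"] = df[col] - df[col].mean()

model = smf.ols(
    "contamination_fear ~ disgust_propensity_c * obsessive_beliefs_c + negative_affect_c",
    data=df,
).fit()

# The coefficient on the interaction term tests the potentiation hypothesis,
# with negative affect included as a covariate.
print(model.summary())
```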
Milberg, Oleg; Shitara, Akiko; Ebrahim, Seham; Tora, Muhibullah; Tran, Duy T.; Chen, Yun; Conti, Mary Anne; Ten Hagen, Kelly G.
2017-01-01
Membrane remodeling plays a fundamental role during a variety of biological events. However, the dynamics and the molecular mechanisms regulating this process within cells in mammalian tissues in situ remain largely unknown. In this study, we use intravital subcellular microscopy in live mice to study the role of the actomyosin cytoskeleton in driving the remodeling of membranes of large secretory granules, which are integrated into the plasma membrane during regulated exocytosis. We show that two isoforms of nonmuscle myosin II, NMIIA and NMIIB, control distinct steps of the integration process. Furthermore, we find that F-actin is not essential for the recruitment of NMII to the secretory granules but plays a key role in the assembly and activation of NMII into contractile filaments. Our data support a dual role for the actomyosin cytoskeleton in providing the mechanical forces required to remodel the lipid bilayer and serving as a scaffold to recruit key regulatory molecules. PMID:28600434
Du, Fang-Fang; Deng, Fu-Guo; Long, Gui-Lu
2016-01-01
Entanglement concentration protocol (ECP) is used to extract the maximally entangled states from less entangled pure states. Here we present a general hyperconcentration protocol for two-photon systems in partially hyperentangled Bell states that decay with the interrelation between the time-bin and the polarization degrees of freedom (DOFs), resorting to an input-output process with respect to diamond nitrogen-vacancy centers coupled to resonators. We show that the resource can be utilized sufficiently and the success probability is largely improved by iteration of the hyper-ECP process. Besides, our hyper-ECP can be directly extended to concentrate nonlocal partially hyperentangled N-photon Greenberger-Horne-Zeilinger states, and the success probability remains unchanged with the growth of the number of photons. Moreover, the time-bin entanglement is a useful DOF and it only requires one path for transmission, which means it not only economizes on a large amount of quantum resources but also relaxes from the path-length dispersion in long-distance quantum communication. PMID:27804973
Park, Steve; Giri, Gaurav; Shaw, Leo; Pitner, Gregory; Ha, Jewook; Koo, Ja Hoon; Gu, Xiaodan; Park, Joonsuk; Lee, Tae Hoon; Nam, Ji Hyun; Hong, Yongtaek; Bao, Zhenan
2015-01-01
The electronic properties of solution-processable small-molecule organic semiconductors (OSCs) have rapidly improved in recent years, rendering them highly promising for various low-cost large-area electronic applications. However, practical applications of organic electronics require patterned and precisely registered OSC films within the transistor channel region with uniform electrical properties over a large area, a task that remains a significant challenge. Here, we present a technique termed “controlled OSC nucleation and extension for circuits” (CONNECT), which uses differential surface energy and solution shearing to simultaneously generate patterned and precisely registered OSC thin films within the channel region and with aligned crystalline domains, resulting in low device-to-device variability. We have fabricated transistor density as high as 840 dpi, with a yield of 99%. We have successfully built various logic gates and a 2-bit half-adder circuit, demonstrating the practical applicability of our technique for large-scale circuit fabrication. PMID:25902502
Cadherin genes and evolutionary novelties in the octopus.
Wang, Z Yan; Ragsdale, Clifton W
2017-09-01
All animals with large brains must have molecular mechanisms to regulate neuronal process outgrowth and prevent neurite self-entanglement. In vertebrates, two major gene families implicated in these mechanisms are the clustered protocadherins and the atypical cadherins. However, the molecular mechanisms utilized in complex invertebrate brains, such as those of the cephalopods, remain largely unknown. Recently, we identified protocadherins and atypical cadherins in the octopus. The octopus protocadherin expansion shares features with the mammalian clustered protocadherins, including enrichment in neural tissues, clustered head-to-tail orientations in the genome, and a large first exon encoding all cadherin domains. Other octopus cadherins, including a newly-identified cadherin with 77 extracellular cadherin domains, are elevated in the suckers, a striking cephalopod novelty. Future study of these octopus genes may yield insights into the general functions of protocadherins in neural wiring and cadherin-related proteins in complex morphogenesis. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Tsagkrasoulis, Dimosthenis; Hysi, Pirro; Spector, Tim; Montana, Giovanni
2017-04-01
The human face is a complex trait under strong genetic control, as evidenced by the striking visual similarity between twins. Nevertheless, heritability estimates of facial traits have often been surprisingly low or difficult to replicate. Furthermore, the construction of facial phenotypes that correspond to naturally perceived facial features remains largely a mystery. We present here a large-scale heritability study of face geometry that aims to address these issues. High-resolution, three-dimensional facial models have been acquired on a cohort of 952 twins recruited from the TwinsUK registry, and processed through a novel landmarking workflow, GESSA (Geodesic Ensemble Surface Sampling Algorithm). The algorithm places thousands of landmarks throughout the facial surface and automatically establishes point-wise correspondence across faces. These landmarks enabled us to intuitively characterize facial geometry at a fine level of detail through curvature measurements, yielding accurate heritability maps of the human face (www.heritabilitymaps.info).
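As a simple illustration of how twin data yield per-landmark heritability values, the classical Falconer estimate is sketched below; the study itself presumably relied on more formal variance-component (e.g., ACE) modelling, and all names and the synthetic data are illustrative only.

```python
# Falconer-style twin heritability estimate: h^2 = 2 * (r_MZ - r_DZ).
# Applied to one curvature measurement per landmark, this produces a
# heritability value per point on the facial surface, i.e. a heritability map.
import numpy as np

def falconer_h2(trait_mz, trait_dz):
    """trait_mz, trait_dz: arrays of shape (n_pairs, 2) with the same
    measurement for twin 1 and twin 2 of each monozygotic/dizygotic pair."""
    r_mz = np.corrcoef(trait_mz[:, 0], trait_mz[:, 1])[0, 1]
    r_dz = np.corrcoef(trait_dz[:, 0], trait_dz[:, 1])[0, 1]
    return float(np.clip(2.0 * (r_mz - r_dz), 0.0, 1.0))

# Illustrative synthetic data with a strong shared genetic component.
rng = np.random.default_rng(0)
shared = rng.normal(size=200)
mz = np.column_stack([shared + 0.3 * rng.normal(size=200) for _ in range(2)])
dz = np.column_stack([0.7 * shared + 0.7 * rng.normal(size=200) for _ in range(2)])
print(falconer_h2(mz, dz))
```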
Akanda, Ali Shafqat; Jutla, Antarpreet S.; Gute, David M.; Sack, R. Bradley; Alam, Munirul; Huq, Anwar; Colwell, Rita R.; Islam, Shafiqul
2013-01-01
The highly populated floodplains of the Bengal Delta have a long history of endemic and epidemic cholera outbreaks, both coastal and inland. Previous studies have not addressed the spatio-temporal dynamics of population vulnerability related to the influence of underlying large-scale processes. We analyzed spatial and temporal variability of cholera incidence across six surveillance sites in the Bengal Delta and their association with regional hydroclimatic and environmental drivers. More specifically, we use salinity and flood inundation modeling across the vulnerable districts of Bangladesh to test earlier proposed hypotheses on the role of these environmental variables. Our results show strong influence of seasonal and interannual variability in estuarine salinity on spring outbreaks and inland flooding on fall outbreaks. A large segment of the population in the Bengal Delta floodplains remain vulnerable to these biannual cholera transmission mechanisms that provide ecologic and environmental conditions for outbreaks over large geographic regions. PMID:24019441
webpic: A flexible web application for collecting distance and count measurements from images
2018-01-01
Despite increasing ability to store and analyze large amounts of data for organismal and ecological studies, the process of collecting distance and count measurements from images has largely remained time-consuming and error-prone, particularly for tasks for which automation is difficult or impossible. Improving the efficiency of these tasks, which allows more high-quality data to be collected in a shorter amount of time, is therefore a high priority. The open-source web application, webpic, implements common web languages and widely available libraries and productivity apps to streamline the process of collecting distance and count measurements from images. In this paper, I introduce the framework of webpic and demonstrate one readily available feature of this application, linear measurements, using fossil leaf specimens. This application fills the gap between workflows accomplishable by individuals through existing software and those accomplishable by large, unmoderated crowds. It demonstrates that flexible web languages can be used to streamline time-intensive research tasks without the use of specialized equipment or proprietary software and highlights the potential for web resources to facilitate data collection in research tasks and outreach activities with improved efficiency. PMID:29608592
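As a rough illustration of the arithmetic behind an image-based linear measurement (not webpic's actual implementation, which is a web application), the following Python sketch converts two clicked pixel coordinates into a real-world length using a scale bar of known size digitized in the same image; all names and numbers are hypothetical.

```python
# Minimal sketch (not webpic's code): convert clicked pixel coordinates to a
# real-world distance using a scale bar of known physical length.
import math

def pixel_distance(p1: tuple[float, float], p2: tuple[float, float]) -> float:
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def calibrated_length(feature_pts, scale_pts, scale_length_mm: float) -> float:
    """Length of a feature in mm, given the endpoints of a scale bar of known size."""
    mm_per_pixel = scale_length_mm / pixel_distance(*scale_pts)
    return pixel_distance(*feature_pts) * mm_per_pixel

# Hypothetical example: a 10 mm scale bar spans 400 px; a leaf vein spans ~620 px.
length_mm = calibrated_length(((100, 50), (680, 270)), ((20, 20), (420, 20)), 10.0)
```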
Detailed Quantitative Classifications of Galaxy Morphology
NASA Astrophysics Data System (ADS)
Nair, Preethi
2018-01-01
Understanding the physical processes responsible for the growth of galaxies is one of the key challenges in extragalactic astronomy. The assembly history of a galaxy is imprinted in its detailed morphology. The bulge-to-total ratio of galaxies and the presence or absence of bars, rings, spiral arms, tidal tails, etc., all have implications for the past merger, star formation, and feedback history of a galaxy. However, current quantitative galaxy classification schemes are only useful for broad binning. They cannot classify or exploit the wide variety of galaxy structures seen in nature. Therefore, comparisons of observations with theoretical predictions of secular structure formation have only been conducted on small samples of visually classified galaxies. However, large samples are needed to disentangle the complex physical processes of galaxy formation. With the advent of large surveys like the Sloan Digital Sky Survey (SDSS) and the upcoming Large Synoptic Survey Telescope (LSST) and WFIRST, the problem of statistics will be resolved. However, the need for a robust quantitative classification scheme will still remain. Here I will present early results on promising machine learning algorithms that are providing detailed classifications, identifying bars, rings, multi-armed spiral galaxies, and Hubble type.
Davison, John; Moora, Mari; Öpik, Maarja; Ainsaar, Leho; Ducousso, Marc; Hiiesalu, Inga; Jairus, Teele; Johnson, Nancy; Jourand, Philippe; Kalamees, Rein; Koorem, Kadri; Meyer, Jean-Yves; Püssa, Kersti; Reier, Ülle; Pärtel, Meelis; Semchenko, Marina; Traveset, Anna; Vasar, Martti; Zobel, Martin
2018-06-08
Island biogeography theory is one of the most influential paradigms in ecology. That island characteristics, including remoteness, can profoundly modulate biological diversity has been borne out by studies of animals and plants. By contrast, the processes influencing microbial diversity in island systems remain largely undetermined. We sequenced arbuscular mycorrhizal (AM) fungal DNA from plant roots collected on 13 islands worldwide and compared AM fungal diversity on islands with existing data from mainland sites. AM fungal communities on islands (even those >6000 km from the closest mainland) comprised few endemic taxa and were as diverse as mainland communities. Thus, in contrast to patterns recorded among macro-organisms, efficient dispersal appears to outweigh the effects of taxogenesis and extinction in regulating AM fungal diversity on islands. Nonetheless, AM fungal communities on more distant islands comprised a higher proportion of previously cultured and large-spored taxa, indicating that dispersal may be human-mediated or require tolerance of significant environmental stress, such as exposure to sunlight or high salinity. The processes driving large-scale patterns of microbial diversity are a key consideration for attempts to conserve and restore functioning ecosystems in this era of rapid global change.
Melt Electrospinning Writing of Highly Ordered Large Volume Scaffold Architectures.
Wunner, Felix M; Wille, Marie-Luise; Noonan, Thomas G; Bas, Onur; Dalton, Paul D; De-Juan-Pardo, Elena M; Hutmacher, Dietmar W
2018-05-01
The additive manufacturing of highly ordered, micrometer-scale scaffolds is at the forefront of tissue engineering and regenerative medicine research. The fabrication of scaffolds for the regeneration of larger tissue volumes, in particular, remains a major challenge. A technology at the convergence of additive manufacturing and electrospinning, melt electrospinning writing (MEW), is also limited in thickness/volume because the accumulation of excess charge from the deposited material repels, and hence distorts, the scaffold architecture. The underlying physical principles that constrain MEW of thick, large-volume scaffolds are studied. Through computational modeling, numerical values for variable working distances are established that maintain the electrostatic force at a constant level during the printing process. Based on the computational simulations, three voltage profiles are applied to determine the maximum height (exceeding 7 mm) of a highly ordered large-volume scaffold. These thick MEW scaffolds have fully interconnected pores and allow cells to migrate and proliferate. To the best of the authors' knowledge, this is the first study to report that z-axis adjustment and increasing the voltage during the MEW process allow for the fabrication of high-volume scaffolds with uniform morphologies and fiber diameters. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Process for forming coal compacts and product thereof
Gunnink, Brett; Kanunar, Jayanth; Liang, Zhuoxiong
2002-01-01
A process for forming durable, mechanically strong compacts from coal particulates without use of a binder is disclosed. The process involves applying a compressive stress to a particulate feed comprising substantially water-saturated coal particles while the feed is heated to a final compaction temperature in excess of about 100 °C. The water present in the feed remains substantially in the liquid phase throughout the compact forming process. This is achieved by heating and compressing the particulate feed and cooling the formed compact at a pressure sufficient to prevent water present in the feed from boiling. The compacts produced by the process have a moisture content near their water saturation point. As a result, these compacts absorb little water and retain exceptional mechanical strength when immersed in high pressure water. The process can be used to form large, cylindrically-shaped compacts from coal particles (i.e., "coal logs") so that the coal can be transported in a hydraulic coal log pipeline.
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.; Bishop, Ann P.
1992-01-01
To remain a world leader in aerospace, the US must improve and maintain the professional competency of its engineers and scientists, increase the research and development (R&D) knowledge base, improve productivity, and maximize the integration of recent technological developments into the R&D process. How well these objectives are met, and at what cost, depends on a variety of factors, but largely on the ability of US aerospace engineers and scientists to acquire and process the results of federally funded R&D. The Federal Government's commitment to high speed computing and networking systems presupposes that computer and information technology will play a major role in the aerospace knowledge diffusion process. However, we know little about information technology needs, uses, and problems within the aerospace knowledge diffusion process. The use of computer and information technology by US aerospace engineers and scientists in academia, government, and industry is reported.
The effects of mutational processes and selection on driver mutations across cancer types.
Temko, Daniel; Tomlinson, Ian P M; Severini, Simone; Schuster-Böckler, Benjamin; Graham, Trevor A
2018-05-10
Epidemiological evidence has long associated environmental mutagens with increased cancer risk. However, links between specific mutation-causing processes and the acquisition of individual driver mutations have remained obscure. Here we have used public cancer sequencing data from 11,336 cancers of various types to infer the independent effects of mutation and selection on the set of driver mutations in a cancer type. First, we detect associations between a range of mutational processes, including those linked to smoking, ageing, APOBEC and DNA mismatch repair (MMR) and the presence of key driver mutations across cancer types. Second, we quantify differential selection between well-known alternative driver mutations, including differences in selection between distinct mutant residues in the same gene. These results show that while mutational processes have a large role in determining which driver mutations are present in a cancer, the role of selection frequently dominates.
Bioconversion of lignocellulosic biomass to xylitol: An overview.
Venkateswar Rao, Linga; Goli, Jyosthna Khanna; Gentela, Jahnavi; Koti, Sravanthi
2016-08-01
Lignocellulosic wastes, including agricultural and forest residues, are among the most promising alternative energy sources and serve as potential low-cost raw materials that can be exploited to produce xylitol. The robust physical and chemical structure of lignocellulose is a major constraint on the recovery of xylose. Large-scale production of xylitol currently relies on a nickel-catalyzed chemical process based on xylose hydrogenation, which requires purified xylose as the raw substrate and operates at high temperature and pressure, making it cost-intensive and energy-consuming. Therefore, there is a need to develop an economical, integrated process for the biotechnological conversion of lignocellulose to xylitol. The present review discusses pretreatment strategies that render cellulose and hemicellulose amenable to hydrolysis. There is also an emphasis on various detoxification and fermentation methodologies, including genetic engineering strategies, for the efficient conversion of xylose to xylitol. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Michalak, D. J.; Bruno, A.; Caudillo, R.; Elsherbini, A. A.; Falcon, J. A.; Nam, Y. S.; Poletto, S.; Roberts, J.; Thomas, N. K.; Yoscovits, Z. R.; Dicarlo, L.; Clarke, J. S.
Experimental quantum computing is rapidly approaching the integration of sufficient numbers of quantum bits for interesting applications, but many challenges still remain. These challenges include: realization of an extensible design for large array scale up, sufficient material process control, and discovery of integration schemes compatible with industrial 300 mm fabrication. We present recent developments in extensible circuits with vertical delivery. Toward the goal of developing a high-volume manufacturing process, we will present recent results on a new Josephson junction process that is compatible with current tooling. We will then present the improvements in NbTiN material uniformity that typical 300 mm fabrication tooling can provide. While initial results on few-qubit systems are encouraging, advanced processing control is expected to deliver the improvements in qubit uniformity, coherence time, and control required for larger systems. Research funded by Intel Corporation.
Signal processing of anthropometric data
NASA Astrophysics Data System (ADS)
Zimmermann, W. J.
1983-09-01
The Anthropometric Measurements Laboratory has accumulated a large body of data from a number of previous experiments. The data are very noisy and therefore require the application of signal processing schemes. Moreover, the measurements were not regarded as time series but as positional information; hence, the data are stored as coordinate points defined by the motion of the human body. The accumulated data define two groups or classes. Some of the data were collected from an experiment designed to measure the flexibility of the limbs, referred to as radial movement. The remaining data were collected from experiments designed to determine the surface of the reach envelope. An interactive signal processing package was designed and implemented. Since the data do not include time, this package does not include a time-series element. At present, processing is restricted to data obtained from the experiments designed to measure flexibility.
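As a rough illustration only (not the laboratory's interactive package), the following Python sketch shows one common way to smooth noisy positional samples of this kind, filtering each coordinate over its acquisition index since no time base is recorded; the data below are synthetic.

```python
# Minimal sketch (not the original package): smoothing noisy 3-D positional samples
# over their acquisition index with a Savitzky-Golay filter, treating the records
# purely as coordinate points rather than a time series.
import numpy as np
from scipy.signal import savgol_filter

def smooth_positions(xyz: np.ndarray, window: int = 11, polyorder: int = 3) -> np.ndarray:
    """xyz has shape (n_samples, 3); each coordinate column is filtered independently."""
    return savgol_filter(xyz, window_length=window, polyorder=polyorder, axis=0)

# Hypothetical reach-envelope samples: a smooth arc plus measurement noise
rng = np.random.default_rng(1)
t = np.linspace(0.0, np.pi, 200)
arc = np.stack([np.cos(t), np.sin(t), 0.1 * t], axis=1)
noisy = arc + rng.normal(scale=0.02, size=arc.shape)
smoothed = smooth_positions(noisy)
```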
Crystallization process of a three-dimensional complex plasma
NASA Astrophysics Data System (ADS)
Steinmüller, Benjamin; Dietz, Christopher; Kretschmer, Michael; Thoma, Markus H.
2018-05-01
Characteristic timescales and length scales for phase transitions of real materials are in ranges where direct visualization is unfeasible. Therefore, model systems can be useful. Here, the crystallization process of a three-dimensional complex plasma under gravity conditions is considered, in which the system extends to a large degree into the bulk plasma. Time-resolved measurements resolve the process down to the single-particle level. Primary clusters, consisting of particles in the solid state, grow vertically and, secondarily, horizontally. The box-counting method shows a fractal dimension of df≈2.72 for the clusters. This value suggests that the formation process is a combination of local epitaxial and diffusion-limited growth. The particle density and the interparticle distance to the nearest neighbor remain constant within the clusters during crystallization. All results are in good agreement with former observations of a single-particle layer.
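As a rough illustration of the box-counting estimate mentioned above (not the authors' analysis code), the following Python sketch computes a fractal dimension from a hypothetical array of 3-D particle coordinates.

```python
# Minimal sketch (not the authors' code) of a box-counting estimate of the fractal
# dimension d_f of a particle cluster from its 3-D coordinates: count occupied boxes
# N at several subdivision levels s (box edge = extent / s) and fit log N ~ d_f log s.
import numpy as np

def box_counting_dimension(points: np.ndarray, sizes=(1, 2, 4, 8, 16)) -> float:
    pts = points - points.min(axis=0)              # shift cluster into the positive octant
    extent = pts.max()
    counts = []
    for s in sizes:
        box = extent / s                           # box edge when each axis is split s times
        occupied = np.unique(np.floor(pts / box).astype(int), axis=0)
        counts.append(len(occupied))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return slope                                   # occupied boxes grow as s**d_f

# Hypothetical particle positions, shape (n_particles, 3), e.g. from tracked images
rng = np.random.default_rng(2)
d_f = box_counting_dimension(rng.random((20000, 3)))   # close to 3 for a dense cloud
```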
A Global Catalogue of Large SO2 Sources and Emissions Derived from the Ozone Monitoring Instrument
NASA Technical Reports Server (NTRS)
Fioletov, Vitali E.; McLinden, Chris A.; Krotkov, Nickolay; Li, Can; Joiner, Joanna; Theys, Nicolas; Carn, Simon; Moran, Mike D.
2016-01-01
Sulfur dioxide (SO2) measurements from the Ozone Monitoring Instrument (OMI) satellite sensor processed with the new principal component analysis (PCA) algorithm were used to detect large point emission sources or clusters of sources. A total of 491 continuously emitting point sources, each releasing from about 30 kt yr-1 to more than 4000 kt yr-1 of SO2, have been identified and grouped by country and by primary source origin: volcanoes (76 sources); power plants (297); smelters (53); and sources related to the oil and gas industry (65). The sources were identified using different methods, including a new emission detection algorithm applied to the OMI measurements themselves, and their evolution during the 2005-2014 period was traced by estimating annual emissions from each source. For volcanic sources, the study focused on continuous degassing, and emissions from explosive eruptions were excluded. Emissions from degassing volcanic sources were measured, many for the first time, and collectively they account for about 30% of total SO2 emissions estimated from OMI measurements, but that fraction has increased in recent years given that cumulative global emissions from power plants and smelters are declining while emissions from the oil and gas industry have remained nearly constant. Anthropogenic emissions from the USA declined by 80% over the 2005-2014 period, as did emissions from western and central Europe, whereas emissions from India nearly doubled, and emissions from other large SO2-emitting regions (South Africa, Russia, Mexico, and the Middle East) remained fairly constant. In total, OMI-based estimates account for about half of total reported anthropogenic SO2 emissions; the remaining half is likely related to sources emitting less than 30 kt yr-1 and not detected by OMI.
Hydroclimatic drivers, Water-borne Diseases, and Population Vulnerability in Bengal Delta
NASA Astrophysics Data System (ADS)
Akanda, A. S.; Jutla, A. S.
2012-04-01
Water-borne diarrheal disease outbreaks in the Bengal Delta region, such as cholera, rotavirus, and dysentery, show distinct seasonal peaks and spatial signatures in their origin and progression. However, the mechanisms behind these seasonal phenomena, especially the role of regional climatic and hydrologic processes in the disease outbreaks, are not fully understood. Overall diarrheal disease prevalence and the population vulnerability to transmission mechanisms thus remain severely underestimated. Recent findings suggest that diarrheal incidence in the spring is strongly associated with scarcity of freshwater flow volumes, while the abundance of water during the monsoon shows a strong positive correlation with the autumn diarrheal burden. The role of large-scale ocean-atmospheric processes that tend to modulate meteorological, hydrological, and environmental conditions over large regions, and their effects on the ecological states conducive to the vectors and triggers of diarrheal outbreaks over large geographic regions, are not well understood. We take a large-scale approach to conduct detailed diagnostic analyses of a range of climate, hydrological, and ecosystem variables to investigate their links to outbreaks, occurrence, and transmission of the most prevalent water-borne diarrheal diseases. We employ satellite remote sensing data products to track coastal ecosystems and plankton processes related to cholera outbreaks. In addition, we investigate the effect of large-scale hydroclimatic extremes (e.g., droughts, floods, El Nino) to identify how diarrheal transmission and epidemic outbreaks are most likely to respond to shifts in climatic, hydrologic, and ecological changes over the coming decades. We argue that controlling diarrheal disease burden will require an integrated predictive surveillance approach - a combination of prediction and prevention - with recent advances in climate-based predictive capabilities and demonstrated successes in primary and tertiary prevention in endemic regions.
Plasma processing conditions substantially influence circulating microRNA biomarker levels.
Cheng, Heather H; Yi, Hye Son; Kim, Yeonju; Kroh, Evan M; Chien, Jason W; Eaton, Keith D; Goodman, Marc T; Tait, Jonathan F; Tewari, Muneesh; Pritchard, Colin C
2013-01-01
Circulating, cell-free microRNAs (miRNAs) are promising candidate biomarkers, but optimal conditions for processing blood specimens for miRNA measurement remain to be established. Our previous work showed that the majority of plasma miRNAs are likely blood cell-derived. In the course of profiling lung cancer cases versus healthy controls, we observed a broad increase in circulating miRNA levels in cases compared to controls and that higher miRNA expression correlated with higher platelet and particle counts. We therefore hypothesized that the quantity of residual platelets and microparticles remaining after plasma processing might impact miRNA measurements. To systematically investigate this, we subjected matched plasma from healthy individuals to stepwise processing with differential centrifugation and 0.22 µm filtration and performed miRNA profiling. We found a major effect on circulating miRNAs, with the majority (72%) of detectable miRNAs substantially affected by processing alone. Specifically, 10% of miRNAs showed 4-30x variation, 46% showed 30-1,000x variation, and 15% showed >1,000x variation in expression solely from processing. This was predominantly due to platelet contamination, which persisted despite using standard laboratory protocols. Importantly, we show that platelet contamination in archived samples could largely be eliminated by additional centrifugation, even in frozen samples stored for six years. To minimize confounding effects in microRNA biomarker studies, additional steps to limit platelet contamination for circulating miRNA biomarker studies are necessary. We provide specific practical recommendations to help minimize confounding variation attributable to plasma processing and platelet contamination.
Man-caused seismicity of Kuzbass
NASA Astrophysics Data System (ADS)
Emanov, Alexandr; Emanov, Alexey; Leskova, Ekaterina; Fateyev, Alexandr
2010-05-01
The natural seismicity of the Kuznetsk Basin is confined mainly to the mountain frame of the Kuznetsk depression. In this paper, materials from experimental work with local station networks within the sedimentary basin are presented. Two types of seismicity within the Kuznetsk depression have been identified: first, man-caused (induced) seismic processes confined to mine workings and concentrated at depths of up to one and a half kilometers; second, seismic activations at depths of 2-56 km that are not spatially associated with the coal mines. Each of the studied seismic activations consists of a large number of small earthquakes (Ms = 1-3), with from one to several tens of earthquakes recorded per day. The earthquakes near a mine working shift in space together with the working, and the seismic process intensifies when the coal-plough machine is operating and slackens while preventive work is carried out. The seismic processes near three longwalls in the Kuznetsk Basin have been studied in detail. Uplift is the most typical focal mechanism. The activated zone near a mine working reaches 1-1.5 km in diameter. Seismic activations not linked to mine workings indicate that the subsoil of the Kuznetsk depression as a whole remains in a stressed state. The most probable causes of man-caused action on the depression are processes associated with changes in the physical state of rocks after the loss of methane from a large volume, or mining-induced changes in rock watering over a large volume. In this case the compacted rocks, having lost gas and water, can be pressed out upward, producing the reverse-fault mechanism of the earthquakes. A combination of the stressed state of the depression with man-caused action from deep mining may account for the incipient activations in the Kuznetsk Basin. Today earthquakes occur mainly beneath mine workings; although the workings themselves have not been damaged, the intense shaking at the surface calls for close study of such hazardous phenomena. In 2009, the experiment on seismic activations in the area of the previously investigated longwalls was repeated. A spatial displacement of the activations along with the mine workings was found, and the impact of technogenic factors on the behavior of the seismic process was investigated. It was demonstrated that industrial explosions in neighboring open-pit mines have no pronounced effect on the seismic process near the longwalls. Stoppage of cutting work in the longwalls leads to simultaneous changes in the man-caused seismicity: the number of technogenic earthquakes is halved, and small earthquakes remain, but such quiet periods occasionally give way to stronger technogenic earthquakes.
NASA Astrophysics Data System (ADS)
Kelkar, S.; Karra, S.; Pawar, R. J.; Zyvoloski, G.
2012-12-01
There has been increasing interest in recent years in developing computational tools for analyzing coupled thermal, hydrological and mechanical (THM) processes that occur in geological porous media. This is mainly due to their importance in applications including carbon sequestration, enhanced geothermal systems, oil and gas production from unconventional sources, degradation of Arctic permafrost, and nuclear waste isolation. Large changes in pressures, temperatures and saturation can result from injection/withdrawal of fluids or emplaced heat sources. These can potentially lead to large changes in the fluid flow and mechanical behavior of the formation, including shear and tensile failure on pre-existing or induced fractures and the associated permeability changes. Because of this, plastic deformation and large changes in material properties such as permeability and porosity can be expected to play an important role in these processes. We describe a general-purpose computational code, FEHM, that has been developed for modeling coupled THM processes during multi-phase fluid flow and transport in fractured porous media. The code uses a continuum mechanics approach based on the control volume finite element method. It is designed to address spatial scales on the order of tens of centimeters to tens of kilometers. While large deformations are important in many situations, we have adopted the small-strain formulation, as useful insight can be obtained in many problems of practical interest with this approach while remaining computationally manageable. Nonlinearities in the equations and the material properties are handled using a full Jacobian Newton-Raphson technique. Stress-strain relationships are assumed to follow linear elastic/plastic behavior. The code incorporates several plasticity models, such as von Mises and Drucker-Prager, and also a large suite of models for coupling flow and mechanical deformation via permeability and stresses/deformations. In this work we present several example applications of such models.
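As a rough illustration of the full-Jacobian Newton-Raphson technique mentioned above (not FEHM source code), the following Python sketch iterates u_{k+1} = u_k - J(u_k)^{-1} R(u_k) on a small hypothetical nonlinear system standing in for the coupled residual equations.

```python
# Minimal sketch (not FEHM itself) of a full-Jacobian Newton-Raphson iteration:
# solve R(u) = 0 by repeated linearization until the residual norm is small.
import numpy as np

def newton_raphson(residual, jacobian, u0, tol=1e-10, max_iter=50):
    u = np.asarray(u0, dtype=float).copy()
    for _ in range(max_iter):
        r = residual(u)
        if np.linalg.norm(r) < tol:
            break
        u -= np.linalg.solve(jacobian(u), r)   # full Jacobian solve each iteration
    return u

# Hypothetical two-unknown "coupled" system standing in for pressure/temperature residuals
residual = lambda u: np.array([u[0]**2 + u[1] - 3.0, u[0] + u[1]**3 - 5.0])
jacobian = lambda u: np.array([[2.0 * u[0], 1.0], [1.0, 3.0 * u[1]**2]])
u_star = newton_raphson(residual, jacobian, [1.0, 1.0])
```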
Wang, Chunya; Zhang, Mingchao; Xia, Kailun; Gong, Xueqin; Wang, Huimin; Yin, Zhe; Guan, Baolu; Zhang, Yingying
2017-04-19
The prosperous development of stretchable electronics poses a great demand for stretchable conductive materials that can maintain their electrical conductivity under tensile strain. Previously reported strategies to obtain stretchable conductors usually involve complex structure-fabrication processes or the utilization of high-cost nanomaterials. It remains a great challenge to produce stretchable and conductive materials via a scalable and cost-effective process. Herein, a large-scalable pyrolysis strategy is developed for the fabrication of intrinsically stretchable and conductive textiles, utilizing low-cost and mass-produced weft-knitted textiles as raw materials. Due to the intrinsic stretchability of the weft-knitted structure and the excellent mechanical and electrical properties of the as-obtained carbonized fibers, the obtained flexible and durable textile can sustain tensile strains up to 125% while keeping a stable electrical conductivity (as shown by a Modal-based textile), thus ensuring its applications in elastic electronics. For demonstration purposes, stretchable supercapacitors and wearable thermal-therapy devices that showed stable performance under applied tensile strain have been fabricated. Considering the simplicity and large scalability of the process, the low cost and mass production of the raw materials, and the superior performance of the as-obtained elastic and conductive textile, this strategy would contribute to the development and industrial production of wearable electronics.
Brunero, Scott; Lamont, Scott
2012-03-01
Clinical supervision (CS) has been identified within nursing as a process for improving clinical practice and reducing the emotional burden of nursing practice. Little is known about its implementation across large tertiary referral hospitals. The purpose of this study is to evaluate the implementation of clinical supervision across several different nursing specialities at a teaching hospital in Sydney, Australia. Using a model of nursing implementation science, a process was developed at the study site that facilitated the development, implementation and evaluation of the project. After the 6-month study period, the CS groups were evaluated using a survey tool developed for the project. A total of nine CS groups were in operation over the 6-month study period. A predominant focus within the sessions was one of collegial support and developing standards of practice. The process was able to achieve wide hospital-based support for the role of CS, from the senior nurse executives to junior nurses. Whilst there was overall positive support for the CS groups, logistical and resource challenges remain in the effective roll-out of CS to large numbers of nurses. © 2011 The Authors. Scandinavian Journal of Caring Sciences © 2011 Nordic College of Caring Science.
Screening Methodologies to Support Risk and Technology ...
The Clean Air Act establishes a two-stage regulatory process for addressing emissions of hazardous air pollutants (HAPs) from stationary sources. In the first stage, the Act requires the EPA to develop technology-based standards for categories of industrial sources. We have largely completed the required “Maximum Achievable Control Technology” (MACT) standards. In the second stage of the regulatory process, EPA must review each MACT standard at least every eight years and revise them as necessary, “taking into account developments in practices, processes and control technologies.” We call this requirement the “technology review.” EPA is also required to complete a one-time assessment of the health and environmental risks that remain after sources come into compliance with MACT. This residual risk review also must be done within 8 years of setting the initial MACT standard. If additional risk reductions are necessary to protect public health with an ample margin of safety or to prevent adverse environmental effects, EPA must develop standards to address these remaining risks. Because the risk review is an important component of the RTR process, EPA is seeking SAB input on the scientific credibility of specific enhancements made to our risk assessment methodologies, particularly with respect to screening methodologies, since the last SAB review was completed in 2010. These enhancements to our risk methodologies are outlined in the document title
NASA Astrophysics Data System (ADS)
Newman, Gregory A.
2014-01-01
Many geoscientific applications exploit electrostatic and electromagnetic fields to interrogate and map subsurface electrical resistivity—an important geophysical attribute for characterizing mineral, energy, and water resources. In complex three-dimensional geologies, where many of these resources remain to be found, resistivity mapping requires large-scale modeling and imaging capabilities, as well as the ability to treat significant data volumes, which can easily overwhelm single-core and modest multicore computing hardware. Treating such problems requires large-scale parallel computational resources, necessary for reducing the time to solution to a time frame acceptable to the exploration process. The recognition that significant parallel computing resources must be brought to bear on these problems gives rise to choices that must be made in parallel computing hardware and software. In this review, some of these choices are presented, along with the resulting trade-offs. We also discuss future trends in high-performance computing and the anticipated impact on electromagnetic (EM) geophysics. Topics discussed in this review article include a survey of parallel computing platforms, from graphics processing units to multicore CPUs with a fast interconnect, along with effective parallel solvers and associated solver libraries for inductive EM modeling and imaging.
Ionisation in ultra-cool, cloud forming extrasolar planetary atmospheres
NASA Astrophysics Data System (ADS)
Helling, Christiane; the LEAP Team
2015-04-01
Transit spectroscopy provides evidence that extrasolar planets are covered in clouds, a finding that was forecast by cloud model simulations 15 years earlier. Atmospheres are strongly affected by clouds through their large opacity and their chemical activity. Cloud formation models allow prediction of cloud particle sizes, their chemical composition, and the composition of the remaining atmospheric gas (Woitke & Helling 2004, A&A 414; Helling & Woitke 2006, A&A 455), for example, as input for radiative transfer codes like Drift-Phoenix (Witte et al. 2009; A&A 506). These cloud particles are charged and can discharge, for example in the form of lightning (Helling et al. 2013, ApJ 767; Bailey et al. 2014, ApJ 784). Earth observations demonstrate that lightning affects not only the local chemistry but also the electron budget of the atmosphere. This talk will present our work on cloud formation modelling and ionisation processes in cloud-forming atmospheres. A hierarchy of ionisation processes leads to a vertically inhomogeneously ionised atmosphere, which has implications for planetary mass loss and the global circulation pattern of planetary atmospheres. Processes involved, like cosmic-ray ionisation, also activate the local chemistry such that large hydrocarbon molecules form (Rimmer et al. 2014, IJAsB 13).
Emergence of cracks by mass transport in elastic crystals stressed at high temperatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, B.; Suo, Z.; Evans, A.G.
1995-12-31
Single crystals are used under high temperature and high stresses in hostile environments (usually gases). A void produced in the fabrication process can change shape and volume as atoms migrate under various thermodynamic forces. A small void under low stress remains rounded in shape, but a large void under high stress evolves to a crack. The material fractures catastrophically when the crack becomes sufficiently large. In this article three kinetic processes are analyzed: diffusion along the void surface, diffusion in a low melting point second phase inside the void, and surface reaction with the gases. An approximate evolution path is simulated, with the void evolving as a sequence of spheroids, from a sphere to a penny-shaped crack. The free energy is calculated as a functional of void shape, from which the instability conditions are determined. The evolution rate is calculated by using variational principles derived from the balance between the reduction in the free energy and the dissipation in the kinetic processes. Crystalline anisotropy and surface heterogeneity can be readily incorporated in this energetic framework. Comparisons are made with experimental strength data for sapphire fibers measured at various strain rates.
O'Dwyer, David N; Norman, Katy C; Xia, Meng; Huang, Yong; Gurczynski, Stephen J; Ashley, Shanna L; White, Eric S; Flaherty, Kevin R; Martinez, Fernando J; Murray, Susan; Noth, Imre; Arnold, Kelly B; Moore, Bethany B
2017-04-25
Idiopathic pulmonary fibrosis (IPF) is a progressive and fatal interstitial pneumonia. The disease pathophysiology is poorly understood and the etiology remains unclear. Recent advances have generated new therapies and improved knowledge of the natural history of IPF. These gains have been brokered by advances in technology and improved insight into the role of various genes in mediating disease, but gene expression and protein levels do not always correlate. Thus, in this paper we apply a novel large scale high throughput aptamer approach to identify more than 1100 proteins in the peripheral blood of well-characterized IPF patients and normal volunteers. We use systems biology approaches to identify a unique IPF proteome signature and give insight into biological processes driving IPF. We found IPF plasma to be altered and enriched for proteins involved in defense response, wound healing and protein phosphorylation when compared to normal human plasma. Analysis also revealed a minimal protein signature that differentiated IPF patients from normal controls, which may allow for accurate diagnosis of IPF based on easily-accessible peripheral blood. This report introduces large scale unbiased protein discovery analysis to IPF and describes distinct biological processes that further inform disease biology.
Li, Jin; Huang, Lijie; Song, Yiying; Liu, Jia
2017-07-28
It has been long proposed that our extraordinary face recognition ability stems from holistic face processing. Two widely-used behavioral hallmarks of holistic face processing are the whole-part effect (WPE) and composite-face effect (CFE). However, it remains unknown whether these two effects reflect similar or different aspects of holistic face processing. Here we investigated this question by examining whether the WPE and CFE involved shared or distinct neural substrates in a large sample of participants (N=200). We found that the WPE and CFE showed hemispheric dissociation in the fusiform face area (FFA), that is, the WPE was correlated with face selectivity in the left FFA, while the CFE was correlated with face selectivity in the right FFA. Further, the correlation between the WPE and face selectivity was largely driven by the FFA response to faces, whereas the association between the CFE and face selectivity resulted from suppressed response to objects in the right FFA. Finally, we also observed dissociated correlation patterns of the WPE and CFE in other face-selective regions and across the whole brain. These results suggest that the WPE and CFE may reflect different aspects of holistic face processing, which shed new light on the behavioral dissociations of these two effects demonstrated in literature. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berg, T.M.
1992-01-01
Investigations and mapping of surficial deposits in Ohio have focused largely on the glacial deposits which cover nearly two-thirds of the state. Research on Quaternary deposits beyond the glacial border has been done by Foster, Hildreth, Andrews, Leverett, Tight, Stout, Goldthwait, Forsyth, Lessig, White, Totten, Hoyer, and Noltimier. However, growing human interaction with surficial materials of southeast Ohio now requires much more detailed mapping and characterization of these deposits. Recognition of periglacial, proglacial, and preglacial processes and materials in eastern and southern states has led to the search for similar processes and materials in southeast Ohio. Evidence for gelifraction, gelifluction, cryoturbation, and considerable periglacial colluviation is more extensive than previously thought. Proglacial deposits are also much more extensive; outwash and glaciolacustrine deposits cover large areas in southeast Ohio and are poorly mapped and characterized, or not mapped at all. Preglacial processes, including a long span of profound weathering and formation of saprolite, have been given little or no attention in southeast Ohio. The signature of protracted preglacial weathering still remains in this part of the state, and should change prevailing views of the terrain upon which periglacial processes worked. Mapping and characterization of these materials are urgently needed as citizens make important land-use decisions such as locating landfills and new developments.
Endogenous hydrogen sulfide is involved in the pathogenesis of atherosclerosis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qiao, Wang; Chaoshu, Tang; Key Laboratory of Molecular Cardiovascular Medicine, Ministry of Education
2010-05-28
Atherosclerosis is a chronic, complex, and progressive pathological process in large and medium sized arteries. The exact mechanism of this process remains unclear. Hydrogen sulfide (H2S), a novel gasotransmitter, was confirmed as playing a major role in the pathogenesis of many cardiovascular diseases. It plays a role in vascular smooth muscle cell (VSMC) proliferation and apoptosis, participates in the progress of hyperhomocysteinemia (HHCY), inhibits atherogenic modification of LDL, interferes with vascular calcification, intervenes with platelet function, and there are interactions between H2S and inflammatory processes. The role of H2S in atherosclerotic pathogenesis highlights the mysteries of atherosclerosis and inspires the search for innovative therapeutic strategies. Here, we review the studies to date that have considered the role of H2S in atherosclerosis.
Using large-scale genome variation cohorts to decipher the molecular mechanism of cancer.
Habermann, Nina; Mardin, Balca R; Yakneen, Sergei; Korbel, Jan O
2016-01-01
Characterizing genomic structural variations (SVs) in the human genome remains challenging, and there is a growing interest to understand somatic SVs occurring in cancer, a disease of the genome. A havoc-causing SV process known as chromothripsis scars the genome when localized chromosome shattering and repair occur in a one-off catastrophe. Recent efforts led to the development of a set of conceptual criteria for the inference of chromothripsis events in cancer genomes and to the development of experimental model systems for studying this striking DNA alteration process in vitro. We discuss these approaches, and additionally touch upon current "Big Data" efforts that employ hybrid cloud computing to enable studies of numerous cancer genomes in an effort to search for commonalities and differences in molecular DNA alteration processes in cancer. Copyright © 2016. Published by Elsevier SAS.
Key, Alexandra P.; Ibanez, Lisa V.; Henderson, Heather A.; Warren, Zachary; Messinger, Daniel S.; Stone, Wendy L.
2014-01-01
Few behavioral indices of risk for autism spectrum disorders (ASD) are present before 12 months, and potential biomarkers remain largely unexamined. This prospective study of infant siblings of children with ASD (n=16) and low-risk comparison infants (n= 15) examined group differences in event-related potentials (ERPs) indexing processing of facial positive affect (N290/P400, Nc) at 9 months and their relation to joint attention at 15 months. Group differences were most pronounced for subtle facial expressions, in that the low-risk group exhibited relatively longer processing (P400 latency) and greater attention resource allocation (Nc amplitude). Exploratory analyses found associations between ERP responses and later joint attention, suggesting that attention to positive affect cues may support the development of other social competencies. PMID:25056131
Effect of Electric Field on CO2 Photoreduction by TiO2 Film
NASA Astrophysics Data System (ADS)
Huang, Zhengfeng; Cheng, Xudong; Dong, Peimei; Zhang, Xiwen
2017-02-01
To mitigate the greenhouse effect, many studies have been carried out to improve the CO2 conversion efficiency of TiO2. Modification of TiO2 has been intensively investigated, but the influence of an electric field on photoreduction by this material remains largely unknown. Accordingly, in this study, we explored the effect of an electric field on the photoreduction process using a porous TiO2-Ti material. The results indicated that the CO yield improved 85-fold (equivalent to 4772 μmol/g h) when a 30-kV voltage was applied during the reduction process. To make the electric field effect fully functional, we also explored the effect of water on the photoreduction process, finding that TiO2 showed the highest conversion rate when the humidity was controlled at 50% relative humidity (RH).
NASA Astrophysics Data System (ADS)
Akanda, Ali Shafqat; Jutla, Antarpreet S.; Alam, Munirul; de Magny, Guillaume Constantin; Siddique, A. Kasem; Sack, R. Bradley; Huq, Anwar; Colwell, Rita R.; Islam, Shafiqul
2011-03-01
Cholera remains a major public health threat in many developing countries around the world. The striking seasonality and annual recurrence of this infectious disease in endemic areas remain of considerable interest to scientists and public health workers. Despite major advances in the ecological and microbiological understanding of Vibrio cholerae, the causative agent of the disease, the role of underlying large-scale hydroclimatic processes in propagating the disease for different seasons and spatial locations is not well understood. Here we show that the cholera outbreaks in the Bengal Delta region are propagated from the coastal to the inland areas and from spring to fall by two distinctly different transmission cycles, premonsoon and postmonsoon, influenced by coastal and terrestrial hydroclimatic processes, respectively. A coupled analysis of the regional hydroclimate and cholera incidence reveals a strong association of the space-time variability of incidence peaks with seasonal processes and extreme climatic events. We explain how the asymmetric seasonal hydroclimatology affects regional cholera dynamics by providing a coastal growth environment for bacteria in spring, while propagating the disease to fall by monsoon flooding. Our findings may serve as the basis for "climate-informed" early warnings and for prompting effective means for intervention and preempting epidemic cholera outbreaks in vulnerable regions.
Solar to fuels conversion technologies: a perspective.
Tuller, Harry L
2017-01-01
To meet increasing energy needs, while limiting greenhouse gas emissions over the coming decades, power capacity on a large scale will need to be provided from renewable sources, with solar expected to play a central role. While the focus to date has been on electricity generation via photovoltaic (PV) cells, electricity production currently accounts for only about one-third of total primary energy consumption. As a consequence, solar-to-fuel conversion will need to play an increasingly important role and, thereby, satisfy the need to replace high energy density fossil fuels with cleaner alternatives that remain easy to transport and store. The solar refinery concept (Herron et al. in Energy Environ Sci 8:126-157, 2015), in which captured solar radiation provides energy in the form of heat, electricity or photons, used to convert the basic chemical feedstocks CO2 and H2O into fuels, is reviewed, as are the key conversion processes based on (1) combined PV and electrolysis, (2) photoelectrochemically driven electrolysis and (3) thermochemical processes, all focused on initially converting H2O and CO2 to H2 and CO. Recent advances, as well as remaining challenges, associated with solar-to-fuel conversion are discussed, as is the need for an intensive research and development effort to bring such processes to scale.
Zhang, Chunlin; Geng, Xuesong; Wang, Hao; Zhou, Lei; Wang, Boguang
2017-01-01
Atmospheric ammonia (NH3), a common alkaline gas found in air, plays a significant role in atmospheric chemistry, such as in the formation of secondary particles. However, large uncertainties remain in the estimation of ammonia emissions from nonagricultural sources, such as wastewater treatment plants (WWTPs). In this study, the ammonia emission factors from a large WWTP utilizing three typical biological treatment techniques to process wastewater in South China were calculated using the US EPA's WATER9 model with three years of raw sewage measurements and information about the facility. The individual emission factors calculated were 0.15 ± 0.03, 0.24 ± 0.05, 0.29 ± 0.06, and 0.25 ± 0.05 g NH3 per m3 of sewage for the adsorption-biodegradation activated sludge treatment process, the UNITANK process (an upgrade of the sequencing batch reactor activated sludge treatment process), and two slightly different anaerobic-anoxic-oxic treatment processes, respectively. The overall emission factor of the WWTP was 0.24 ± 0.06 g NH3 per m3 of sewage. The pH of the wastewater influent is likely an important factor affecting ammonia emissions, because higher emission factors existed at higher pH values. Based on the ammonia emission factor generated in this study, sewage treatment accounted for approximately 4% of the ammonia emissions for the urban area of South China's Pearl River Delta (PRD) in 2006, which is much less than the value of 34% estimated in previous studies. To reduce the large uncertainty in the estimation of ammonia emissions in China, more field measurements are required. Copyright © 2016 Elsevier Ltd. All rights reserved.
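As a worked example of how such an emission factor scales to a plant-level estimate, the arithmetic is simply factor × treated volume; the throughput figure below is hypothetical and only the 0.24 g NH3 per m3 factor comes from the study.

```python
# Minimal worked example: annual plant-level NH3 estimate from the overall emission
# factor reported above. The treated volume is a hypothetical figure for illustration.
EMISSION_FACTOR_G_PER_M3 = 0.24          # g NH3 per m3 of treated sewage (from the study)
treated_volume_m3_per_day = 500_000      # hypothetical large-WWTP throughput

annual_emission_tonnes = (
    EMISSION_FACTOR_G_PER_M3 * treated_volume_m3_per_day * 365 / 1e6
)
print(f"~{annual_emission_tonnes:.1f} t NH3 per year")   # ~43.8 t NH3/yr
```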
Adapting a large database of point of care summarized guidelines: a process description.
Delvaux, Nicolas; Van de Velde, Stijn; Aertgeerts, Bert; Goossens, Martine; Fauquert, Benjamin; Kunnamo, Ilka; Van Royen, Paul
2017-02-01
Questions posed at the point of care (POC) can be answered using POC summarized guidelines. To implement a national POC information resource, we subscribed to a large database of POC summarized guidelines to complement locally available guidelines. Our challenge was in developing a sustainable strategy for adapting almost 1000 summarized guidelines. The aim of this paper was to describe our process for adapting a database of POC summarized guidelines. An adaptation process based on the ADAPTE framework was tailored to be used by a heterogeneous group of participants. Guidelines were assessed on content and on applicability to the Belgian context. To improve efficiency, we chose to first aim our efforts towards those guidelines most important to primary care doctors. Over a period of 3 years, we screened about 80% of 1000 international summarized guidelines. For those guidelines identified as most important for primary care doctors, we noted that in about half of the cases, remarks were made concerning content. On the other hand, at least two-thirds of all screened guidelines required no changes when evaluating their local usability. Adapting a large body of POC summarized guidelines using a formal adaptation process is possible, even when faced with limited resources. This can be done by creating an efficient and collaborative effort and ensuring user-friendly procedures. Our experiences show that even though in most cases guidelines can be adopted without adaptations, careful review of guidelines developed in a different context remains necessary. Streamlining international efforts in adapting international POC information resources and adopting similar adaptation processes may lessen duplication efforts and prove more cost-effective. © 2015 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons, Ltd.
Interspinous Process Decompression: Expanding Treatment Options for Lumbar Spinal Stenosis
Nunley, Pierce D.; Shamie, A. Nick; Blumenthal, Scott L.; Orndorff, Douglas; Geisler, Fred H.
2016-01-01
Interspinous process decompression is a minimally invasive implantation procedure employing a stand-alone interspinous spacer that functions as an extension blocker to prevent compression of neural elements without direct surgical removal of tissue adjacent to the nerves. The Superion® spacer is the only FDA approved stand-alone device available in the US. It is also the only spacer approved by the CMS to be implanted in an ambulatory surgery center. We computed the within-group effect sizes from the Superion IDE trial and compared them to results extrapolated from two randomized trials of decompressive laminectomy. For the ODI, effect sizes were all very large (>1.0) for Superion and laminectomy at 2, 3, and 4 years. For ZCQ, the 2-year Superion symptom severity (1.26) and physical function (1.29) domains were very large; laminectomy effect sizes were very large (1.07) for symptom severity and large for physical function (0.80). Current projections indicate a marked increase in the number of patients with spinal stenosis. Consequently, there remains a keen interest in minimally invasive treatment options that delay or obviate the need for invasive surgical procedures, such as decompressive laminectomy or fusion. Stand-alone interspinous spacers may fill a currently unmet treatment gap in the continuum of care and help to reduce the burden of this chronic degenerative condition on the health care system. PMID:27819001
Interspinous Process Decompression: Expanding Treatment Options for Lumbar Spinal Stenosis.
Nunley, Pierce D; Shamie, A Nick; Blumenthal, Scott L; Orndorff, Douglas; Block, Jon E; Geisler, Fred H
2016-01-01
Interspinous process decompression is a minimally invasive implantation procedure employing a stand-alone interspinous spacer that functions as an extension blocker to prevent compression of neural elements without direct surgical removal of tissue adjacent to the nerves. The Superion® spacer is the only FDA approved stand-alone device available in the US. It is also the only spacer approved by the CMS to be implanted in an ambulatory surgery center. We computed the within-group effect sizes from the Superion IDE trial and compared them to results extrapolated from two randomized trials of decompressive laminectomy. For the ODI, effect sizes were all very large (>1.0) for Superion and laminectomy at 2, 3, and 4 years. For ZCQ, the 2-year Superion symptom severity (1.26) and physical function (1.29) domains were very large; laminectomy effect sizes were very large (1.07) for symptom severity and large for physical function (0.80). Current projections indicate a marked increase in the number of patients with spinal stenosis. Consequently, there remains a keen interest in minimally invasive treatment options that delay or obviate the need for invasive surgical procedures, such as decompressive laminectomy or fusion. Stand-alone interspinous spacers may fill a currently unmet treatment gap in the continuum of care and help to reduce the burden of this chronic degenerative condition on the health care system.
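As a rough illustration of the within-group effect sizes quoted above (not the trial's statistical code; one common convention, the standardized mean change against the baseline standard deviation, is assumed), the following Python sketch uses hypothetical ODI scores.

```python
# Minimal sketch (not the trial's analysis code) of a within-group effect size as a
# standardized mean change: (baseline mean - follow-up mean) / baseline SD, so that
# values above 1.0 correspond to the "very large" improvements quoted above.
import numpy as np

def within_group_effect_size(baseline: np.ndarray, followup: np.ndarray) -> float:
    return (baseline.mean() - followup.mean()) / baseline.std(ddof=1)

# Hypothetical ODI scores (lower is better) at baseline and 2 years
rng = np.random.default_rng(3)
baseline = rng.normal(55, 12, size=120)
followup = rng.normal(30, 15, size=120)
d = within_group_effect_size(baseline, followup)   # roughly 2, i.e. a very large effect
```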
Resiliency of the Multiscale Retinex Image Enhancement Algorithm
NASA Technical Reports Server (NTRS)
Rahman, Zia-Ur; Jobson, Daniel J.; Woodell, Glenn A.
1998-01-01
The multiscale retinex with color restoration (MSRCR) continues to prove itself in extensive testing to be a very versatile automatic image enhancement algorithm that simultaneously provides dynamic range compression, color constancy, and color rendition. However, issues remain with regard to the resiliency of the MSRCR to different image sources and arbitrary image manipulations which may have been applied prior to retinex processing. In this paper we define these areas of concern, provide experimental results, and examine the effects of commonly occurring image manipulations on retinex performance. In virtually all cases the MSRCR is highly resilient to the effects of both image source variations and commonly encountered prior image processing. Significant artifacts are primarily observed for the case of selective color channel clipping in large dark zones in an image. These issues are of concern for the processing of digital image archives and other applications where there is neither control over the image acquisition process nor knowledge of any processing applied to the data beforehand.
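As a rough illustration of the retinex processing discussed here, the following Python sketch applies the multiscale log-surround subtraction with a simple color restoration term, in the spirit of the published MSRCR formulation; it is not the authors' implementation, and the scale and gain parameters are common choices rather than theirs.

```python
# Minimal MSRCR-style sketch: multiscale (log image - log Gaussian surround) plus a
# simple color restoration weight, then a linear stretch to display range.
import numpy as np
from scipy.ndimage import gaussian_filter

def msrcr(img: np.ndarray, sigmas=(15, 80, 250), alpha=125.0, gain=1.0) -> np.ndarray:
    """img: non-negative float RGB array of shape (H, W, 3)."""
    img = img + 1.0                                   # avoid log(0)
    log_img = np.log(img)
    msr = np.zeros_like(img)
    for sigma in sigmas:                              # multiscale surround subtraction
        surround = gaussian_filter(img, sigma=(sigma, sigma, 0))
        msr += (log_img - np.log(surround)) / len(sigmas)
    # simple color restoration: weight each channel by its share of the pixel intensity
    color_rest = np.log(alpha * img / img.sum(axis=2, keepdims=True))
    out = gain * msr * color_rest
    return (out - out.min()) / (out.max() - out.min() + 1e-12)   # stretch for display
```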
Li, Yongxin; Kikuchi, Mani; Li, Xueyan; Gao, Qionghua; Xiong, Zijun; Ren, Yandong; Zhao, Ruoping; Mao, Bingyu; Kondo, Mariko; Irie, Naoki; Wang, Wen
2018-01-01
Sea cucumbers, one main class of Echinoderms, have a very fast and drastic metamorphosis process during their development. However, the molecular basis under this process remains largely unknown. Here we systematically examined the gene expression profiles of Japanese common sea cucumber (Apostichopus japonicus) for the first time by RNA sequencing across 16 developmental time points from fertilized egg to juvenile stage. Based on the weighted gene co-expression network analysis (WGCNA), we identified 21 modules. Among them, MEdarkmagenta was highly expressed and correlated with the early metamorphosis process from late auricularia to doliolaria larva. Furthermore, gene enrichment and differentially expressed gene analysis identified several genes in the module that may play key roles in the metamorphosis process. Our results not only provide a molecular basis for experimentally studying the development and morphological complexity of sea cucumber, but also lay a foundation for improving its emergence rate. Copyright © 2017 Elsevier Inc. All rights reserved.
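As a rough illustration of the module-detection idea behind WGCNA (a simplified, adjacency-based sketch, not the authors' R pipeline, which additionally uses topological overlap and dynamic tree cutting), the following Python code soft-thresholds a gene-gene correlation matrix and cuts a hierarchical clustering tree into modules.

```python
# Minimal sketch of weighted co-expression module detection: soft-threshold the
# absolute gene-gene correlation, convert to a dissimilarity, cluster, and cut.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def coexpression_modules(expr: np.ndarray, power: int = 6, n_modules: int = 20) -> np.ndarray:
    """expr: (n_samples, n_genes) expression matrix; returns a module label per gene."""
    adjacency = np.abs(np.corrcoef(expr, rowvar=False)) ** power   # soft thresholding
    dissimilarity = 1.0 - adjacency
    np.fill_diagonal(dissimilarity, 0.0)
    tree = linkage(squareform(dissimilarity, checks=False), method="average")
    return fcluster(tree, t=n_modules, criterion="maxclust")

# Hypothetical data: 16 developmental time points x 3000 genes
rng = np.random.default_rng(4)
labels = coexpression_modules(rng.normal(size=(16, 3000)))
```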
Customized data container for improved performance in optical cryptosystems
NASA Astrophysics Data System (ADS)
Vélez Zea, Alejandro; Fredy Barrera, John; Torroba, Roberto
2016-12-01
Coherent optical encryption procedures introduce speckle noise into the output, limiting many practical applications. Until now, the only method available to avoid this noise has been to encode the information to be processed into a container that is encrypted instead of the original data. Although the decrypted container exhibits the noise introduced by the optical processing, its features remain recognizable enough to allow decoding, recovering the original information free of any degradation. The first containers adopted were quick response (QR) codes. However, the limitations of optical encryption procedures and the features of QR codes imply that, in practice, only simple codes containing small amounts of data can be processed without large experimental requirements. To overcome this problem, we introduce the first tailor-made container for processing in optical cryptosystems, ensuring greater noise tolerance and the ability to process more information with fewer experimental requirements. We present both simulations and experimental results to demonstrate the advantages of our proposal.
Medial PFC Damage Abolishes the Self-reference Effect
Philippi, Carissa L.; Duff, Melissa C.; Denburg, Natalie L.; Tranel, Daniel; Rudrauf, David
2012-01-01
Functional neuroimaging studies suggest that the medial PFC (mPFC) is a key component of a large-scale neural system supporting a variety of self-related processes. However, it remains unknown whether the mPFC is critical for such processes. In this study, we used a human lesion approach to examine this question. We administered a standard trait judgment paradigm [Kelley, W. M., Macrae, C. N., Wyland, C. L., Caglar, S., Inati, S., & Heatherton, T. F. Finding the self? An event-related fMRI study. Journal of Cognitive Neuroscience, 14, 785–794, 2002] to patients with focal brain damage to the mPFC. The self-reference effect (SRE), a memory advantage conferred by self-related processing, served as a measure of intact self-processing ability. We found that damage to the mPFC abolished the SRE. The results demonstrate that the mPFC is necessary for the SRE and suggest that this structure is important for self-referential processing and the neural representation of self. PMID:21942762
ATP synthase promotes germ cell differentiation independent of oxidative phosphorylation
Teixeira, Felipe K.; Sanchez, Carlos G.; Hurd, Thomas R.; Seifert, Jessica R. K.; Czech, Benjamin; Preall, Jonathan B.; Hannon, Gregory J.; Lehmann, Ruth
2015-01-01
The differentiation of stem cells is a tightly regulated process essential for animal development and tissue homeostasis. Through this process, attainment of new identity and function is achieved by marked changes in cellular properties. Intrinsic cellular mechanisms governing stem cell differentiation remain largely unknown, in part because systematic forward genetic approaches to the problem have not been widely used. Analysing genes required for germline stem cell differentiation in the Drosophila ovary, we find that the mitochondrial ATP synthase plays a critical role in this process. Unexpectedly, the ATP synthesizing function of this complex was not necessary for differentiation, as knockdown of other members of the oxidative phosphorylation system did not disrupt the process. Instead, the ATP synthase acted to promote the maturation of mitochondrial cristae during differentiation through dimerization and specific upregulation of the ATP synthase complex. Taken together, our results suggest that ATP synthase-dependent crista maturation is a key developmental process required for differentiation independent of oxidative phosphorylation. PMID:25915123
NASA Astrophysics Data System (ADS)
Murawski, Jens; Kleine, Eckhard
2017-04-01
Sea ice remains one of the frontiers of ocean modelling and is of vital importance for correct forecasts of the northern oceans. At large scale, it is commonly considered a continuous medium whose dynamics are modelled in terms of continuum mechanics. Its specifics are a matter of constitutive behaviour, which may be characterised as rigid-plastic. The newly developed sea ice dynamics module is based on general principles and follows a systematic approach to the problem. Both the drift field and the stress field are modelled by a variational property. Rigidity is treated by Lagrangian relaxation. Thus one is led to a sensible numerical method. Modelling fast ice remains a challenge. It is understood that ridging and the formation of grounded ice keels play a role in the process. The ice dynamics model includes a parameterisation of the stress associated with grounded ice keels. Shear against the grounded bottom contact may lead to plastic deformation and loss of integrity. The numerical scheme involves a potentially large system of linear equations, which is solved by pre-conditioned iteration. The entire algorithm consists of several components that result from decomposing the problem. The algorithm has been implemented and tested in practice.
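As a generic illustration of the kind of pre-conditioned iteration mentioned above, the sketch below solves a synthetic sparse symmetric positive definite system with Jacobi-preconditioned conjugate gradients; it is a stand-in under stated assumptions, not the actual ice-dynamics solver.

```python
import numpy as np
from scipy.sparse import diags, random as sparse_random
from scipy.sparse.linalg import cg, LinearOperator

# Synthetic sparse SPD matrix standing in for the discretised ice-dynamics equations.
n = 2000
a = sparse_random(n, n, density=1e-3, random_state=0)
a = (a @ a.T).tocsr() + diags(np.full(n, 1.0))
b = np.ones(n)

# Jacobi (diagonal) preconditioner applied inside conjugate gradients.
inv_diag = 1.0 / a.diagonal()
precond = LinearOperator((n, n), matvec=lambda x: inv_diag * x)
x, info = cg(a, b, M=precond)
print("converged" if info == 0 else f"cg returned info={info}")
```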
McHale, P; Keenan, A; Ghebrehewet, S
2016-03-01
Uptake rates for the combined measles, mumps and rubella (MMR) vaccine have been below the required 95% in the UK since a retracted and discredited article linking the MMR vaccine with autism and inflammatory bowel disease was published in 1998. This study undertook semi-structured telephone interviews with parents or carers of 47 unvaccinated measles cases aged between 13 months and 9 years during a large measles outbreak in Merseyside. Results showed that concerns over the specific link with autism remain an important cause of refusal to vaccinate, with over half of respondents stating this as a reason. A quarter cited child illness at the scheduled vaccination time, while other reasons included general safety concerns and access issues. Over half of respondents felt that more information or a discussion with a health professional would help the decision-making process, while a third cited improved access. There was clear support for vaccination among respondents when asked about current opinions regarding the MMR vaccine. The findings support the hypothesis that safety concerns remain a major barrier to MMR vaccination, and also support previous evidence that experience of measles is an important determinant in the decision to vaccinate.
A Community of One: Social Cognition and Auditory Verbal Hallucinations
Bell, Vaughan
2013-01-01
Auditory verbal hallucinations have attracted a great deal of scientific interest, but despite the fact that they are fundamentally a social experience—in essence, a form of hallucinated communication—current theories remain firmly rooted in an individualistic account and have largely avoided engagement with social cognition. Nevertheless, there is mounting evidence for the role of social cognitive and social neurocognitive processes in auditory verbal hallucinations, and, consequently, it is proposed that problems with the internalisation of social models may be key to the experience. PMID:24311984
NASA Technical Reports Server (NTRS)
1992-01-01
Ames Research Center research into virtual reality led to the development of the Convolvotron, a high speed digital audio processing system that delivers three-dimensional sound over headphones. It consists of a two-card set designed for use with a personal computer. The Convolvotron's primary application is presentation of 3D audio signals over headphones. Four independent sound sources are filtered with large time-varying filters that compensate for motion. The perceived location of the sound remains constant. Possible applications are in air traffic control towers or airplane cockpits, hearing and perception research and virtual reality development.
Challenges and opportunities in synthetic biology for chemical engineers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, YZ; Lee, JK; Zhao, HM
Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement. (C) 2012 Elsevier Ltd. All rights reserved.
Challenges and opportunities in synthetic biology for chemical engineers
Luo, Yunzi; Lee, Jung-Kul; Zhao, Huimin
2012-01-01
Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement. PMID:24222925
Challenges and opportunities in synthetic biology for chemical engineers.
Luo, Yunzi; Lee, Jung-Kul; Zhao, Huimin
2013-11-15
Synthetic biology provides numerous great opportunities for chemical engineers in the development of new processes for large-scale production of biofuels, value-added chemicals, and protein therapeutics. However, challenges across all scales abound. In particular, the modularization and standardization of the components in a biological system, so-called biological parts, remain the biggest obstacle in synthetic biology. In this perspective, we will discuss the main challenges and opportunities in the rapidly growing synthetic biology field and the important roles that chemical engineers can play in its advancement.
Modeling the Bergeron-Findeisen Process Using PDF Methods With an Explicit Representation of Mixing
NASA Astrophysics Data System (ADS)
Jeffery, C.; Reisner, J.
2005-12-01
Currently, the accurate prediction of cloud droplet and ice crystal number concentration in cloud resolving, numerical weather prediction and climate models is a formidable challenge. The Bergeron-Findeisen process in which ice crystals grow by vapor deposition at the expense of super-cooled droplets is expected to be inhomogeneous in nature--some droplets will evaporate completely in centimeter-scale filaments of sub-saturated air during turbulent mixing while others remain unchanged [Baker et al., QJRMS, 1980]--and is unresolved at even cloud-resolving scales. Despite the large body of observational evidence in support of the inhomogeneous mixing process affecting cloud droplet number [most recently, Brenguier et al., JAS, 2000], it is poorly understood and has yet to be parameterized and incorporated into a numerical model. In this talk, we investigate the Bergeron-Findeisen process using a new approach based on simulations of the probability density function (PDF) of relative humidity during turbulent mixing. PDF methods offer a key advantage over Eulerian (spatial) models of cloud mixing and evaporation: the low probability (cm-scale) filaments of entrained air are explicitly resolved (in probability space) during the mixing event even though their spatial shape, size and location remain unknown. Our PDF approach reveals the following features of the inhomogeneous mixing process during the isobaric turbulent mixing of two parcels containing super-cooled water and ice, respectively: (1) The scavenging of super-cooled droplets is inhomogeneous in nature; some droplets evaporate completely at early times while others remain unchanged. (2) The degree of total droplet evaporation during the initial mixing period depends linearly on the mixing fractions of the two parcels and logarithmically on Damköhler number (Da)---the ratio of turbulent to evaporative time-scales. (3) Our simulations predict that the PDF of Lagrangian (time-integrated) subsaturation (S) goes as S^-1 at high Da. This behavior results from a Gaussian mixing closure and requires observational validation.
NASA Astrophysics Data System (ADS)
Leite, Orlando; Gance, Julien; Texier, Benoît; Bernard, Jean; Truffert, Catherine
2017-04-01
Driven by the mineral exploration market's need for ever faster and easier set-up of large 3D resistivity and induced polarization surveys, autonomous, cableless recording systems have come to the forefront. In contrast to traditional centralized acquisition, this new system permits a completely random distribution of receivers over the survey area, allowing true 3D imaging. This work presents the results of a 3 km² experiment, imaging down to 600 m depth, performed with a new type of autonomous distributed receiver: the I&V-Fullwaver. With such a system, all the usual drawbacks of laying long cables over large 3D areas - time consumption, lack of accessibility, heavy weight, electromagnetic induction, etc. - disappear. The V-Fullwavers record the entire time series of voltage on two perpendicular axes, allowing good assessment of data quality, while the I-Fullwaver simultaneously records the injected current. For this survey, despite the good quality of each individual signal on each channel of the Fullwaver systems, a significant number of negative apparent resistivity and chargeability values (around 15%) remain in the dataset. Such values are commonly not taken into account by inversion software, although they may be due to complex geological structures of interest (e.g. linked to the presence of sulfides in the earth). Given that such a distributed recording system aims to deliver the best possible 3D resistivity and IP tomography, how can the 3D inversion be improved? In this work, we present the dataset, the processing chain and the quality control of a large 3D survey. We show that the quality of the selected data is good enough to include them in the inversion processing. We propose a second processing approach, based on the modulus of the apparent resistivity, that stabilizes the inversion. We then discuss the results of both processing approaches. We conclude that an effort could be made to include negative apparent resistivities in the inversion code.
Landslides and Landscape Evolution
NASA Astrophysics Data System (ADS)
Densmore, A. L.; Hovius, N.
2017-12-01
Landslides have long been recognised as a major hazard, and are a common product of both large earthquakes and rainstorms. Our appreciation for landslides as agents of erosion and land surface evolution, however, is much more recent. Only in the last twenty years have we come to understand the critical role that landslides play at the landscape scale: in allowing hillslopes to keep pace with fluvial incision, in supplying sediment to channel networks and sedimentary basins, in divide migration, and in setting the basic structure of the landscape. This perspective has been made possible in part by repeat remote sensing and new ways of visualising the land surface, and by extending our understanding of failure processes to the landscape scale; but it is also true that the big jumps in our knowledge have been triggered by large events, such as the 1999 Chi-Chi and 2008 Wenchuan earthquakes. Thanks in part to a relative handful of such case studies, we now have a better idea of the spatial distribution of landslides that are triggered in large events, the volume of sediment that they mobilise, the time scales over which that sediment is mobilised and evacuated, and the overall volume balance between erosion and tectonic processes in the growth of mountainous topography. There remain, however, some major challenges that must still be overcome. Estimates of landslide volume remain highly uncertain, as does our ability to predict the evolution of hillslope propensity to failure after a major triggering event, the movement of landslide sediment (especially the coarse fraction that is transported as bedload), and the impact of landslides on both long-term erosion rates and tectonic processes. The limited range of case studies also means that we struggle to predict outcomes for triggering events in different geological settings, such as loess landscapes or massive lithologies. And the perspective afforded by taking a landscape-scale view has yet to be fully reflected in our approach to landslide hazard. We close by outlining some promising future research directions by which these challenges might be overcome.
Kiefer, Gundolf; Lehmann, Helko; Weese, Jürgen
2006-04-01
Maximum intensity projections (MIPs) are an important visualization technique for angiographic data sets. Efficient data inspection requires frame rates of at least five frames per second at preserved image quality. Despite the advances in computer technology, this task remains a challenge. On the one hand, the sizes of computed tomography and magnetic resonance images are increasing rapidly. On the other hand, rendering algorithms do not automatically benefit from the advances in processor technology, especially for large data sets. This is due to the faster evolving processing power and the slower evolving memory access speed, which is bridged by hierarchical cache memory architectures. In this paper, we investigate memory access optimization methods and use them for generating MIPs on general-purpose central processing units (CPUs) and graphics processing units (GPUs), respectively. These methods can work on any level of the memory hierarchy, and we show that properly combined methods can optimize memory access on multiple levels of the hierarchy at the same time. We present performance measurements to compare different algorithm variants and illustrate the influence of the respective techniques. On current hardware, the efficient handling of the memory hierarchy for CPUs improves the rendering performance by a factor of 3 to 4. On GPUs, we observed that the effect is even larger, especially for large data sets. The methods can easily be adjusted to different hardware specifics, although their impact can vary considerably. They can also be used for other rendering techniques than MIPs, and their use for more general image processing task could be investigated in the future.
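To make the memory-access idea concrete, here is a minimal sketch of a maximum intensity projection computed slab by slab along one axis; the blocked traversal keeps the working set small, which is one simple NumPy-level analogue of the cache-aware optimizations discussed above. The block size and volume are illustrative.

```python
import numpy as np

def mip_along_axis(volume, axis=2, block=64):
    """Maximum intensity projection, processing `block` slices at a time so
    that only a small slab of the volume is touched in each pass."""
    n = volume.shape[axis]
    projection = None
    for start in range(0, n, block):
        slab = np.take(volume, range(start, min(start + block, n)), axis=axis)
        slab_max = slab.max(axis=axis)
        projection = slab_max if projection is None else np.maximum(projection, slab_max)
    return projection

# Example on a synthetic angiography-sized volume.
volume = np.random.rand(128, 128, 256).astype(np.float32)
mip = mip_along_axis(volume)
```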
Microfluidic model of the platelet-generating organ: beyond bone marrow biomimetics
NASA Astrophysics Data System (ADS)
Blin, Antoine; Le Goff, Anne; Magniez, Aurélie; Poirault-Chassac, Sonia; Teste, Bruno; Sicot, Géraldine; Nguyen, Kim Anh; Hamdi, Feriel S.; Reyssat, Mathilde; Baruch, Dominique
2016-02-01
We present a new, rapid method for producing blood platelets in vitro from cultured megakaryocytes based on a microfluidic device. This device consists of a wide array of VWF-coated micropillars. Such pillars act as anchors on megakaryocytes, allowing them to remain trapped in the device and subjected to hydrodynamic shear. The combined effect of anchoring and shear induces the elongation of megakaryocytes and finally their rupture into platelets and proplatelets. This process was observed with megakaryocytes from different origins and found to be robust. This original bioreactor design allows megakaryocytes to be processed at high throughput (millions per hour). Since platelets are produced in such large amounts, their extensive biological characterisation is possible and shows that platelets produced in this bioreactor are functional.
Flower development: open questions and future directions.
Wellmer, Frank; Bowman, John L; Davies, Brendan; Ferrándiz, Cristina; Fletcher, Jennifer C; Franks, Robert G; Graciet, Emmanuelle; Gregis, Veronica; Ito, Toshiro; Jack, Thomas P; Jiao, Yuling; Kater, Martin M; Ma, Hong; Meyerowitz, Elliot M; Prunet, Nathanaël; Riechmann, José Luis
2014-01-01
Almost three decades of genetic and molecular analyses have resulted in detailed insights into many of the processes that take place during flower development and in the identification of a large number of key regulatory genes that control these processes. Despite this impressive progress, many questions about how flower development is controlled in different angiosperm species remain unanswered. In this chapter, we discuss some of these open questions and the experimental strategies with which they could be addressed. Specifically, we focus on the areas of floral meristem development and patterning, floral organ specification and differentiation, as well as on the molecular mechanisms underlying the evolutionary changes that have led to the astounding variations in flower size and architecture among extant and extinct angiosperms.
Microfluidic model of the platelet-generating organ: beyond bone marrow biomimetics.
Blin, Antoine; Le Goff, Anne; Magniez, Aurélie; Poirault-Chassac, Sonia; Teste, Bruno; Sicot, Géraldine; Nguyen, Kim Anh; Hamdi, Feriel S; Reyssat, Mathilde; Baruch, Dominique
2016-02-22
We present a new, rapid method for producing blood platelets in vitro from cultured megakaryocytes based on a microfluidic device. This device consists of a wide array of VWF-coated micropillars. Such pillars act as anchors on megakaryocytes, allowing them to remain trapped in the device and subjected to hydrodynamic shear. The combined effect of anchoring and shear induces the elongation of megakaryocytes and finally their rupture into platelets and proplatelets. This process was observed with megakaryocytes from different origins and found to be robust. This original bioreactor design allows megakaryocytes to be processed at high throughput (millions per hour). Since platelets are produced in such large amounts, their extensive biological characterisation is possible and shows that platelets produced in this bioreactor are functional.
Imaging nanobubble nucleation and hydrogen spillover during electrocatalytic water splitting.
Hao, Rui; Fan, Yunshan; Howard, Marco D; Vaughan, Joshua C; Zhang, Bo
2018-06-05
Nucleation and growth of hydrogen nanobubbles are key initial steps in electrochemical water splitting. These processes remain largely unexplored due to a lack of proper tools to probe the nanobubble's interfacial structure with sufficient spatial and temporal resolution. We report the use of superresolution microscopy to image the transient formation and growth of single hydrogen nanobubbles at the electrode/solution interface during electrocatalytic water splitting. We found that hydrogen nanobubbles can be generated even at very early stages of water electrolysis, i.e., ∼500 mV before the thermodynamic reduction potential is reached. The ability to image single nanobubbles on an electrode enabled us to observe in real time the process of hydrogen spillover from ultrathin gold nanocatalysts supported on indium-tin oxide.
Mean Energy Density of Photogenerated Magnetic Fields Throughout the EoR
NASA Astrophysics Data System (ADS)
Durrive, Jean-Baptiste; Tashiro, Hiroyuki; Langer, Mathieu; Sugiyama, Naoshi
2018-05-01
There seem to be magnetic fields at all scales and epochs in our Universe, but their origin at large scales remains an important open question of cosmology. In this work we focus on the generation of magnetic fields in the intergalactic medium due to photoionizations by the first galaxies, throughout the Epoch of Reionization. Building on previous studies that considered only isolated sources, we develop an analytical model to estimate the mean magnetic energy density accumulated in the Universe by this process. In our model, without considering any amplification process, the Universe is globally magnetized by this mechanism to the order of at least several 10^-18 G during the Epoch of Reionization (i.e. a few 10^-20 G comoving).
Longitudinal study of skin aging: from microrelief to wrinkles.
Bazin, Roland; Lévêque, Jean Luc
2011-05-01
To study the changes in skin microrelief and periocular wrinkles during the aging process. Replicas of the crow's feet area of volunteers were recorded in 1987 and 2008 and compared. Characteristic features were quantified by image analysis. Observation shows that some microrelief features disappear and even merge with wrinkles, which become more marked. Some primary lines also tend to merge to form thin new wrinkles. Quantitative data support these observations: the size of small and medium skin relief objects decreases with age, while large objects become larger. Over 21 years, in the group studied, the total area of the detected objects remains quite constant. Only the distribution between small and large detected objects (microrelief features and wrinkles, respectively) is modified. © 2011 John Wiley & Sons A/S.
Ecosystem Services Connect Environmental Change to Human Health Outcomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bayles, Brett R.; Brauman, Kate A.; Adkins, Joshua N.
Global environmental change, driven in large part by human activities, profoundly impacts the structure and functioning of Earth’s ecosystems (Millennium Ecosystem Assessment 2005). We are beginning to push beyond planetary boundaries (Steffen et al. 2015), and the consequences for human health remain largely unknown (Myers et al. 2013). Growing evidence suggests that ecological transformations can dramatically affect human health in ways that are both obvious and obscure (Myers and Patz 2009; Myers et al. 2013). The framework of ecosystem services, designed to evaluate the benefits that people derive from ecosystem products and processes, provides a compelling means of integrating the many factors that influence the human health response to global change, as well as of integrating health impacts into broader analyses of the impacts of this change.
Diagnosis and management of carotid stenosis: a review.
Nussbaum, E S
2000-01-01
Since its introduction in the 1950s, carotid endarterectomy has become one of the most frequently performed operations in the United States. The tremendous appeal of a procedure that decreases the risk of stroke, coupled with the large number of individuals in the general population with carotid stenosis, has contributed to its popularity. To provide optimal patient care, the practicing physician must have a firm understanding of the proper evaluation and management of carotid stenosis. Nevertheless, because of the large number of clinical trials performed over the last decade addressing the treatment of stroke and carotid endarterectomy, the care of patients with carotid stenosis remains a frequently misunderstood topic. This review summarizes the current evaluation and treatment options for carotid stenosis and provides a rational management algorithm for this prevalent disease process.
Convergence between biological, behavioural and genetic determinants of obesity.
Ghosh, Sujoy; Bouchard, Claude
2017-12-01
Multiple biological, behavioural and genetic determinants or correlates of obesity have been identified to date. Genome-wide association studies (GWAS) have contributed to the identification of more than 100 obesity-associated genetic variants, but their roles in causal processes leading to obesity remain largely unknown. Most variants are likely to have tissue-specific regulatory roles through joint contributions to biological pathways and networks, through changes in gene expression that influence quantitative traits, or through the regulation of the epigenome. The recent availability of large-scale functional genomics resources provides an opportunity to re-examine obesity GWAS data to begin elucidating the function of genetic variants. Interrogation of knockout mouse phenotype resources provides a further avenue to test for evidence of convergence between genetic variation and biological or behavioural determinants of obesity.
NASA Astrophysics Data System (ADS)
Postek, Michael T.; Poster, Dianne L.; Vládar, András E.; Driscoll, Mark S.; LaVerne, Jay A.; Tsinas, Zois; Al-Sheikhly, Mohamad I.
2018-02-01
Nanocellulose is a high-value material that has gained increasing attention because of its high strength, stiffness, unique photonic and piezoelectric properties, high stability and uniform structure. Through utilization of a biorefinery concept, nanocellulose can be produced in large volumes from wood at relatively low cost via ionizing radiation processing. Ionizing radiation causes significant breakdown of the polysaccharide and leads to the production of potentially useful gaseous products such as H2 and CO. The application of radiation processing to the production of nanocellulose from woody and non-wood sources, such as field grasses, bio-refining by-products, industrial pulp waste, and agricultural surplus materials, remains an open field, ripe for innovation and application. Elucidating the mechanisms of the radiolytic decomposition of cellulose and the mass generation of nanocellulose by radiation processing is key to tapping into this source of nanocellulose for the growth of nanocellulosic product development. More importantly, understanding the structural break-up of the cell walls as a function of radiation exposure is a key goal, and only through careful, detailed characterization and dimensional metrology can this be achieved at the level of detail needed to further the growth of large-scale radiation processing of plant materials. This work results from strong collaborations between NIST and its academic partners, who are pursuing the unique demonstration of applied ionizing radiation processing of plant materials as well as the development of manufacturing metrology for novel nanomaterials.
Postek, Michael T; Poster, Dianne L; Vládar, András E; Driscoll, Mark S; LaVerne, Jay A; Tsinas, Zois; Al-Sheikhly, Mohamad I
2018-02-01
Nanocellulose is a high-value material that has gained increasing attention because of its high strength, stiffness, unique photonic and piezoelectric properties, high stability and uniform structure. Through utilization of a biorefinery concept, nanocellulose can be produced in large volumes from wood at relatively low cost via ionizing radiation processing. Ionizing radiation causes significant breakdown of the polysaccharide and leads to the production of potentially useful gaseous products such as H2 and CO. The application of radiation processing to the production of nanocellulose from woody and non-wood sources, such as field grasses, bio-refining byproducts, industrial pulp waste, and agricultural surplus materials, remains an open field, ripe for innovation and application. Elucidating the mechanisms of the radiolytic decomposition of cellulose and the mass generation of nanocellulose by radiation processing is key to tapping into this source of nanocellulose for the growth of nanocellulosic product development. More importantly, understanding the structural break-up of the cell walls as a function of radiation exposure is a key goal, and only through careful, detailed characterization and dimensional metrology can this be achieved at the level of detail needed to further the growth of large-scale radiation processing of plant materials. This work results from strong collaborations between NIST and its academic partners, who are pursuing the unique demonstration of applied ionizing radiation processing of plant materials as well as the development of manufacturing metrology for novel nanomaterials.
Assessing and Projecting Greenhouse Gas Release due to Abrupt Permafrost Degradation
NASA Astrophysics Data System (ADS)
Saito, K.; Ohno, H.; Yokohata, T.; Iwahana, G.; Machiya, H.
2017-12-01
Permafrost is a large reservoir of frozen soil organic carbon (SOC; about half of all terrestrial storage). Its degradation (i.e., thawing) under global warming may therefore lead to the release of a substantial amount of additional greenhouse gas (GHG). However, understanding of the processes, the geographical distribution of such hazards, and the implementation of the relevant processes in advanced climate models are still insufficient, so that variations in permafrost remain one of the largest sources of uncertainty in climatic and biogeochemical assessments and projections. Thermokarst, induced by melting of ground ice in ice-rich permafrost, leads to dynamic surface subsidence of up to 60 m, which further affects local and regional societies and ecosystems in the Arctic. It can also accelerate a large-scale warming process through a positive feedback between released GHGs (especially methane), atmospheric warming and permafrost degradation. This three-year research project (2-1605, Environment Research and Technology Development Fund of the Ministry of the Environment, Japan) aims to assess and project the impacts of GHG release through dynamic permafrost degradation using in-situ and remote (e.g., satellite and airborne) observations, laboratory analysis of sampled ice and soil cores, and numerical modeling, demonstrating the distribution of vulnerability and the relative impacts of large-scale versus dynamic degradation. Our preliminary laboratory analysis of ice and soil cores sampled in 2016 at Alaskan and Siberian sites largely underlain by ice-rich permafrost shows that, although the gas volumes trapped per unit mass are more or less homogeneous among sites for both ice and soil cores, large variations are found in the methane concentration of the trapped gases, ranging from a few ppm (similar to the atmosphere) to hundreds of thousands of ppm. We will also present our numerical approach to evaluating the relative impacts of GHGs released through dynamic permafrost degradation, implementing conceptual modeling to assess and project the distribution and affected amounts of ground ice and SOC.
Frenette, Jean-Jacques; Massicotte, Philippe; Lapierre, Jean-François
2012-01-01
Large rivers represent a significant component of inland waters and are considered sentinels and integrators of terrestrial and atmospheric processes. They represent hotspots for the transport and processing of organic and inorganic material from the surrounding landscape, which ultimately impacts the bio-optical properties and food webs of the rivers. In large rivers, hydraulic connectivity operates as a major forcing variable to structure the functioning of the riverscape, and, despite increasing interest in large-river studies, riverscape structural properties, such as the underwater spectral regime, and their impact on autotrophic ecological processes remain poorly studied. Here we used the St. Lawrence River to identify the mechanisms structuring the underwater spectral environment and their consequences on pico- and nanophytoplankton communities, which are good biological tracers of environmental changes. Our results, obtained from a 450 km sampling transect, demonstrate that tributaries exert a profound impact on the receiving river’s photosynthetic potential. This occurs mainly through injection of chromophoric dissolved organic matter (CDOM) and non-algal material (tripton). CDOM and tripton in the water column selectively absorbed wavelengths in a gradient from blue to red, and the resulting underwater light climate was in turn a strong driver of the phytoplankton community structure (prokaryote/eukaryote relative and absolute abundances) at scales of many kilometers from the tributary confluence. Our results conclusively demonstrate the proximal impact of watershed properties on underwater spectral composition in a highly dynamic river environment characterized by unique structuring properties such as high directional connectivity, numerous sources and forms of carbon, and a rapidly varying hydrodynamic regime. We surmise that the underwater spectral composition represents a key integrating and structural property of large, heterogeneous river ecosystems and a promising tool to study autotrophic functional properties. It confirms the usefulness of the riverscape approach to study large-river ecosystems and to initiate comparisons along latitudinal gradients. PMID:22558259
Automated Processing Workflow for Ambient Seismic Recordings
NASA Astrophysics Data System (ADS)
Girard, A. J.; Shragge, J.
2017-12-01
Structural imaging using body-wave energy present in ambient seismic data remains a challenging task, largely because these wave modes are commonly much weaker than surface wave energy. In a number of situations body-wave energy has been extracted successfully; however, (nearly) all successful body-wave extraction and imaging approaches have focused on cross-correlation processing. While this is useful for interferometric purposes, it can also lead to the inclusion of unwanted noise events that dominate the resulting stack, leaving body-wave energy overpowered by the coherent noise. Conversely, wave-equation imaging can be applied directly on non-correlated ambient data that has been preprocessed to mitigate unwanted energy (i.e., surface waves, burst-like and electromechanical noise) to enhance body-wave arrivals. Following this approach, though, requires a significant preprocessing effort on often Terabytes of ambient seismic data, which is expensive and requires automation to be a feasible approach. In this work we outline an automated processing workflow designed to optimize body wave energy from an ambient seismic data set acquired on a large-N array at a mine site near Lalor Lake, Manitoba, Canada. We show that processing ambient seismic data in the recording domain, rather than the cross-correlation domain, allows us to mitigate energy that is inappropriate for body-wave imaging. We first develop a method for window selection that automatically identifies and removes data contaminated by coherent high-energy bursts. We then apply time- and frequency-domain debursting techniques to mitigate the effects of remaining strong amplitude and/or monochromatic energy without severely degrading the overall waveforms. After each processing step we implement a QC check to investigate improvements in the convergence rates - and the emergence of reflection events - in the cross-correlation plus stack waveforms over hour-long windows. Overall, the QC analyses suggest that automated preprocessing of ambient seismic recordings in the recording domain successfully mitigates unwanted coherent noise events in both the time and frequency domain. Accordingly, we assert that this method is beneficial for direct wave-equation imaging with ambient seismic recordings.
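A minimal sketch of the sort of automated window selection described above, assuming a simple peak-to-RMS criterion for rejecting burst-contaminated windows; the window length and threshold are illustrative, not the values used in the study.

```python
import numpy as np

def select_quiet_windows(trace, fs, window_s=600.0, burst_factor=5.0):
    """Split a continuous recording into fixed-length windows and keep only
    those whose peak amplitude stays below `burst_factor` times the window RMS."""
    n = int(window_s * fs)
    kept = []
    for start in range(0, len(trace) - n + 1, n):
        w = trace[start:start + n]
        rms = np.sqrt(np.mean(w ** 2))
        if rms > 0.0 and np.max(np.abs(w)) < burst_factor * rms:
            kept.append(w)
    return kept

# Example with synthetic data: white noise plus one strong injected burst.
fs = 100.0
trace = np.random.randn(int(3600 * fs))
trace[180_000:180_500] += 50.0
windows = select_quiet_windows(trace, fs)
```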
Crouch, Taylor Berens; DiClemente, Carlo C; Pitts, Steven C
2015-09-01
This study evaluated whether alcohol abstinence self-efficacy at the end of alcohol treatment was moderated by utilization of behavioral processes of change (coping activities used during a behavior change attempt). It was hypothesized that self-efficacy would be differentially important in predicting posttreatment drinking outcomes depending on the level of behavioral processes, such that the relation between self-efficacy and outcomes would be stronger for individuals who reported low process use. Models were also estimated with end-of-treatment abstinence included as a covariate. Data were analyzed from alcohol-dependent individuals in both treatment arms of Project MATCH (Matching Alcoholism Treatments to Client Heterogeneity; N = 1,328), a large alcohol treatment study. Self-efficacy was moderated by behavioral process use in predicting drinking frequency 6 and 12 months posttreatment and drinking quantity 6 months posttreatment, such that self-efficacy was more strongly related to posttreatment drinking when low levels of processes were reported than when high levels were reported, but the interactions were attenuated when end-of-treatment abstinence was controlled for. Significant quadratic relations between end-of-treatment self-efficacy and 6- and 12-month posttreatment drinking quantity and frequency were found (p < .001, ƒ² = 0.02-0.03), such that self-efficacy most robustly predicted outcomes when high. These effects remained significant when end-of-treatment abstinence was included as a covariate. Findings highlight the complex nature of self-efficacy's relation with drinking outcomes. Although the interaction between self-efficacy and behavioral processes was attenuated when end-of-treatment abstinence was controlled for, the quadratic effect of self-efficacy on outcomes remained significant. The pattern of these effects did not support the idea of "overconfidence" as a negative indicator. (c) 2015 APA, all rights reserved.
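A minimal sketch of this kind of moderation analysis, with a self-efficacy by behavioral-process interaction and a quadratic self-efficacy term; the data are simulated purely for illustration and the variable names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data: end-of-treatment self-efficacy (se), behavioral process use (bp),
# and posttreatment percent drinking days (pdd). Purely illustrative values.
rng = np.random.default_rng(0)
n = 500
se = rng.uniform(1.0, 5.0, n)
bp = rng.uniform(1.0, 5.0, n)
pdd = 60.0 - 8.0 * se - 3.0 * bp + 1.5 * se * bp - 2.0 * (se - 3.0) ** 2 + rng.normal(0.0, 10.0, n)
data = pd.DataFrame({"se": se, "bp": bp, "pdd": pdd})

# Moderation (se x bp interaction) plus a quadratic self-efficacy term.
fit = smf.ols("pdd ~ se * bp + I(se ** 2)", data=data).fit()
print(fit.params)
```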
The identification of Josef Mengele. A triumph of international cooperation.
Eckert, W G; Teixeira, W R
1985-09-01
In recent weeks, world attention has been focused on the identification of skeletal remains suspected of being those of the most widely sought Nazi war criminal still at large--Josef Mengele. Several important turns in the investigation of his whereabouts led to a small city south of São Paulo, where he had been living until 1979. Mengele was reported to have drowned and to have been buried in a country cemetery near his last residence. The initial processing of the remains was done at the Medicolegal Institute of São Paulo by police officials in consultation with anthropologists and dentists as well as Dr. Wilmes Teixeira of Mogi das Cruzes, a suburb of São Paulo. Dr. Teixeira coordinated the team of authorized international forensic experts officially representing the governments of West Germany and the United States, as well as the Simon Wiesenthal Center of Los Angeles, who joined Brazilian scientists in completing identification. The success of the investigation was due to complete cooperation among members of the team, resulting in verification, within a reasonable scientific certainty, that these were the remains of Josef Mengele.
ERIC Educational Resources Information Center
Harris, Alma; Jones, Michelle
2017-01-01
The challenges of securing educational change and transformation at scale remain considerable. While sustained progress has been made in some education systems (Fullan, 2009; Hargreaves & Shirley, 2009), it generally remains the case that the pathway to large-scale system improvement is far from easy or straightforward. While large-scale…
Kawamura, Kiyoko; Wada, Akihiko; Wang, Ji-Yang; Li, Quanhai; Ishii, Akihiro; Tsujimura, Hideki; Takagi, Toshiyuki; Itami, Makiko; Tada, Yuji; Tatsumi, Koichiro; Shimada, Hideaki; Hiroshima, Kenzo; Tagawa, Masatoshi
2016-01-01
Activation-induced cytidine deaminase (AID) is involved in the somatic hypermutation and class switch recombination processes of antibody formation. AID activity induces gene mutations and could be associated with transformation processes of B cells. Nevertheless, the relation between AID expression and the prognosis of B cell lymphoma patients remains uncharacterized. We examined expression levels of the AID gene in 89 lymph node specimens from lymphoma and non-lymphoma patients with Northern blot analysis and investigated an association with their survival. The AID gene was preferentially expressed in B cell lymphoma, in particular in diffuse large B cell lymphoma and follicular lymphoma. We confirmed AID protein expression in the mRNA-positive but not in the negative specimens with Western blot analysis and immunohistochemical staining. Survival analysis of the patients treated with cyclophosphamide-/doxorubicin-/vincristine-/prednisone-based chemotherapy demonstrated that the prognosis of diffuse large B cell lymphoma patients was unfavorable in the mRNA-positive group compared with the negative group, and that AID expression levels were correlated with poor prognosis. In contrast, AID expression was not linked with the prognosis of follicular lymphoma patients. AID expression is thus a predictive marker for an unfavorable outcome in DLBCL patients treated with this chemotherapy.
Van Landeghem, Sofie; De Bodt, Stefanie; Drebert, Zuzanna J; Inzé, Dirk; Van de Peer, Yves
2013-03-01
Despite the availability of various data repositories for plant research, a wealth of information currently remains hidden within the biomolecular literature. Text mining provides the necessary means to retrieve these data through automated processing of texts. However, only recently has advanced text mining methodology been implemented with sufficient computational power to process texts at a large scale. In this study, we assess the potential of large-scale text mining for plant biology research in general and for network biology in particular using a state-of-the-art text mining system applied to all PubMed abstracts and PubMed Central full texts. We present extensive evaluation of the textual data for Arabidopsis thaliana, assessing the overall accuracy of this new resource for usage in plant network analyses. Furthermore, we combine text mining information with both protein-protein and regulatory interactions from experimental databases. Clusters of tightly connected genes are delineated from the resulting network, illustrating how such an integrative approach is essential to grasp the current knowledge available for Arabidopsis and to uncover gene information through guilt by association. All large-scale data sets, as well as the manually curated textual data, are made publicly available, hereby stimulating the application of text mining data in future plant biology studies.
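As a toy illustration of literature-derived associations, the sketch below simply counts co-occurrences of gene names within abstracts; this is far cruder than the event-level text mining system the study relies on, and the inputs are placeholders.

```python
from collections import Counter
from itertools import combinations

def gene_cooccurrence(abstracts, gene_names):
    """Count how often each pair of gene names appears in the same abstract."""
    counts = Counter()
    genes = [g.lower() for g in gene_names]
    for text in abstracts:
        lowered = text.lower()
        present = sorted({g for g in genes if g in lowered})
        counts.update(combinations(present, 2))
    return counts

# Placeholder inputs; in practice these would be PubMed abstracts and a curated gene list.
abstracts = ["AGAMOUS and AP1 interact ...", "AP1 regulates LFY ..."]
print(gene_cooccurrence(abstracts, ["AGAMOUS", "AP1", "LFY"]))
```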
Large shift in source of fine sediment in the upper Mississippi River
Belmont, P.; Gran, K.B.; Schottler, S.P.; Wilcock, P.R.; Day, S.S.; Jennings, C.; Lauer, J.W.; Viparelli, E.; Willenbring, J.K.; Engstrom, D.R.; Parker, G.
2011-01-01
Although sediment is a natural constituent of rivers, excess loading to rivers and streams is a leading cause of impairment and biodiversity loss. Remedial actions require identification of the sources and mechanisms of sediment supply. This task is complicated by the scale and complexity of large watersheds as well as changes in climate and land use that alter the drivers of sediment supply. Previous studies in Lake Pepin, a natural lake on the Mississippi River, indicate that sediment supply to the lake has increased 10-fold over the past 150 years. Herein we combine geochemical fingerprinting and a suite of geomorphic change detection techniques with a sediment mass balance for a tributary watershed to demonstrate that, although the sediment loading remains very large, the dominant source of sediment has shifted from agricultural soil erosion to accelerated erosion of stream banks and bluffs, driven by increased river discharge. Such hydrologic amplification of natural erosion processes calls for a new approach to watershed sediment modeling that explicitly accounts for channel and floodplain dynamics that amplify or dampen landscape processes. Further, this finding illustrates a new challenge in remediating nonpoint sediment pollution and indicates that management efforts must expand from soil erosion to factors contributing to increased water runoff. © 2011 American Chemical Society.
Mejias, Jorge F; Murray, John D; Kennedy, Henry; Wang, Xiao-Jing
2016-11-01
Interactions between top-down and bottom-up processes in the cerebral cortex hold the key to understanding attentional processes, predictive coding, executive control, and a gamut of other brain functions. However, the underlying circuit mechanism remains poorly understood and represents a major challenge in neuroscience. We approached this problem using a large-scale computational model of the primate cortex constrained by new directed and weighted connectivity data. In our model, the interplay between feedforward and feedback signaling depends on the cortical laminar structure and involves complex dynamics across multiple (intralaminar, interlaminar, interareal, and whole cortex) scales. The model was tested by reproducing, as well as providing insights into, a wide range of neurophysiological findings about frequency-dependent interactions between visual cortical areas, including the observation that feedforward pathways are associated with enhanced gamma (30 to 70 Hz) oscillations, whereas feedback projections selectively modulate alpha/low-beta (8 to 15 Hz) oscillations. Furthermore, the model reproduces a functional hierarchy based on frequency-dependent Granger causality analysis of interareal signaling, as reported in recent monkey and human experiments, and suggests a mechanism for the observed context-dependent hierarchy dynamics. Together, this work highlights the necessity of multiscale approaches and provides a modeling platform for studies of large-scale brain circuit dynamics and functions.
Mejias, Jorge F.; Murray, John D.; Kennedy, Henry; Wang, Xiao-Jing
2016-01-01
Interactions between top-down and bottom-up processes in the cerebral cortex hold the key to understanding attentional processes, predictive coding, executive control, and a gamut of other brain functions. However, the underlying circuit mechanism remains poorly understood and represents a major challenge in neuroscience. We approached this problem using a large-scale computational model of the primate cortex constrained by new directed and weighted connectivity data. In our model, the interplay between feedforward and feedback signaling depends on the cortical laminar structure and involves complex dynamics across multiple (intralaminar, interlaminar, interareal, and whole cortex) scales. The model was tested by reproducing, as well as providing insights into, a wide range of neurophysiological findings about frequency-dependent interactions between visual cortical areas, including the observation that feedforward pathways are associated with enhanced gamma (30 to 70 Hz) oscillations, whereas feedback projections selectively modulate alpha/low-beta (8 to 15 Hz) oscillations. Furthermore, the model reproduces a functional hierarchy based on frequency-dependent Granger causality analysis of interareal signaling, as reported in recent monkey and human experiments, and suggests a mechanism for the observed context-dependent hierarchy dynamics. Together, this work highlights the necessity of multiscale approaches and provides a modeling platform for studies of large-scale brain circuit dynamics and functions. PMID:28138530
Characterization of microbial 'hot spots' in soils: Where are we, and where are we going?
NASA Astrophysics Data System (ADS)
Baveye, Philippe C.
2015-04-01
Fifty years ago, microbiologists realized that significant progress in our understanding of microbial processes in soils required being able to measure various physical, chemical, and microbial parameters at the scale of microorganisms, i.e., at micrometric or even submicrometric scales, and to identify areas of particularly high microbial activity. Back then, this was only a dream, severely hampered by the crudeness of our measuring instruments. In the intervening years, however, amazing technological progress has transformed that old dream into reality. We are now able to quantify the physical and (bio)chemical environment of soil microorganisms at spatial scales that are commensurate with bacterial cells. In this invited presentation, I will provide an overview of the significant progress achieved in this field over the last few years, and mention a number of further technological advances that are likely to profoundly influence the nature of the research over the next decade. Technology must however remain a means to an end, and therefore it is important to firmly keep in mind that the goal of the research on understanding better how soil processes work at the microscale is to be ultimately in a position to predict the behavior of soils at scales that matter to society at large, for example in terms of food security or global climate change. In that context, part of the research has to focus on how we can upscale information about soil microbial hotspots to macroscopic scales and beyond. I will discuss where we stand on this crucial question, which remains largely open at the moment.
An Ancestral Recombination Graph for Diploid Populations with Skewed Offspring Distribution
Birkner, Matthias; Blath, Jochen; Eldon, Bjarki
2013-01-01
A large offspring-number diploid biparental multilocus population model of Moran type is our object of study. At each time step, a pair of diploid individuals drawn uniformly at random contributes offspring to the population. The number of offspring can be large relative to the total population size. Similar “heavily skewed” reproduction mechanisms have been recently considered by various authors (cf. e.g., Eldon and Wakeley 2006, 2008) and reviewed by Hedgecock and Pudovkin (2011). Each diploid parental individual contributes exactly one chromosome to each diploid offspring, and hence ancestral lineages can coalesce only when in distinct individuals. A separation-of-timescales phenomenon is thus observed. A result of Möhle (1998) is extended to obtain convergence of the ancestral process to an ancestral recombination graph necessarily admitting simultaneous multiple mergers of ancestral lineages. The usual ancestral recombination graph is obtained as a special case of our model when the parents contribute only one offspring to the population each time. Due to diploidy and large offspring numbers, novel effects appear. For example, the marginal genealogy at each locus admits simultaneous multiple mergers in up to four groups, and different loci remain substantially correlated even as the recombination rate grows large. Thus, genealogies for loci far apart on the same chromosome remain correlated. Correlation in coalescence times for two loci is derived and shown to be a function of the coalescence parameters of our model. Extending the observations by Eldon and Wakeley (2008), predictions of linkage disequilibrium are shown to be functions of the reproduction parameters of our model, in addition to the recombination rate. Correlations in ratios of coalescence times between loci can be high, even when the recombination rate is high and sample size is large, in large offspring-number populations, as suggested by simulations, hinting at how to distinguish between different population models. PMID:23150600
The influence of spontaneous activity on stimulus processing in primary visual cortex.
Schölvinck, M L; Friston, K J; Rees, G
2012-02-01
Spontaneous activity in the resting human brain has been studied extensively; however, how such activity affects the local processing of a sensory stimulus is relatively unknown. Here, we examined the impact of spontaneous activity in primary visual cortex on neuronal and behavioural responses to a simple visual stimulus, using functional MRI. Stimulus-evoked responses remained essentially unchanged by spontaneous fluctuations, combining with them in a largely linear fashion (i.e., with little evidence for an interaction). However, interactions between spontaneous fluctuations and stimulus-evoked responses were evident behaviourally; high levels of spontaneous activity tended to be associated with increased stimulus detection at perceptual threshold. Our results extend those found in studies of spontaneous fluctuations in motor cortex and higher order visual areas, and suggest a fundamental role for spontaneous activity in stimulus processing. Copyright © 2011. Published by Elsevier Inc.
Coupling Spatiotemporal Community Assembly Processes to Changes in Microbial Metabolism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graham, Emily B.; Crump, Alex R.; Resch, Charles T.
Community assembly processes govern shifts in species abundances in response to environmental change, yet our understanding of assembly remains largely decoupled from ecosystem function. Here, we test hypotheses regarding assembly and function across space and time using hyporheic microbial communities as a model system. We pair sampling of two habitat types through hydrologic fluctuation with null modeling and multivariate statistics. We demonstrate that dual selective pressures assimilate to generate compositional changes at distinct timescales among habitat types, resulting in contrasting associations of Betaproteobacteria and Thaumarchaeota with selection and with seasonal changes in aerobic metabolism. Our results culminate in a conceptual model in which selection from contrasting environments regulates taxon abundance and ecosystem function through time, with increases in function when oscillating selection opposes stable selective pressures. Our model is applicable within both macrobial and microbial ecology and presents an avenue for assimilating community assembly processes into predictions of ecosystem function.
Adaptation to sensory input tunes visual cortex to criticality
NASA Astrophysics Data System (ADS)
Shew, Woodrow L.; Clawson, Wesley P.; Pobst, Jeff; Karimipanah, Yahya; Wright, Nathaniel C.; Wessel, Ralf
2015-08-01
A long-standing hypothesis at the interface of physics and neuroscience is that neural networks self-organize to the critical point of a phase transition, thereby optimizing aspects of sensory information processing. This idea is partially supported by strong evidence for critical dynamics observed in the cerebral cortex, but the impact of sensory input on these dynamics is largely unknown. Thus, the foundations of this hypothesis--the self-organization process and how it manifests during strong sensory input--remain unstudied experimentally. Here we show in visual cortex and in a computational model that strong sensory input initially elicits cortical network dynamics that are not critical, but adaptive changes in the network rapidly tune the system to criticality. This conclusion is based on observations of multifaceted scaling laws predicted to occur at criticality. Our findings establish sensory adaptation as a self-organizing mechanism that maintains criticality in visual cortex during sensory information processing.
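To make the idea of "multifaceted scaling laws" concrete, one standard check is whether neuronal avalanche sizes follow a power law with the exponent expected at criticality. The sketch below is a generic illustration on synthetic data using a continuous maximum-likelihood estimator; it is not the authors' analysis pipeline, and the target exponent of 1.5 is simply the value commonly associated with critical avalanches.

```python
import numpy as np

def powerlaw_mle_exponent(sizes, s_min=1.0):
    """Continuous maximum-likelihood estimate of a power-law exponent
    alpha for avalanche sizes s >= s_min (Clauset-style estimator)."""
    s = np.asarray(sizes, dtype=float)
    s = s[s >= s_min]
    return 1.0 + len(s) / np.sum(np.log(s / s_min))

# Synthetic avalanche sizes drawn from P(s) ~ s^(-1.5) by inverse-CDF sampling.
rng = np.random.default_rng(0)
u = rng.random(10_000)
sizes = (1.0 - u) ** (-1.0 / 0.5)   # exponent alpha = 1.5

print(f"estimated exponent: {powerlaw_mle_exponent(sizes):.2f}")
```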
Complete quantum control of exciton qubits bound to isoelectronic centres.
Éthier-Majcher, G; St-Jean, P; Boso, G; Tosi, A; Klem, J F; Francoeur, S
2014-05-30
In recent years, impressive demonstrations related to quantum information processing have been realized. The scalability of quantum interactions between arbitrary qubits within an array remains however a significant hurdle to the practical realization of a quantum computer. Among the proposed ideas to achieve fully scalable quantum processing, the use of photons is appealing because they can mediate long-range quantum interactions and could serve as buses to build quantum networks. Quantum dots or nitrogen-vacancy centres in diamond can be coupled to light, but the former system lacks optical homogeneity while the latter suffers from a low dipole moment, rendering their large-scale interconnection challenging. Here, through the complete quantum control of exciton qubits, we demonstrate that nitrogen isoelectronic centres in GaAs combine both the uniformity and predictability of atomic defects and the dipole moment of semiconductor quantum dots. This establishes isoelectronic centres as a promising platform for quantum information processing.
Vigmond, Edward J.; Boyle, Patrick M.; Leon, L. Joshua; Plank, Gernot
2014-01-01
Simulations of cardiac bioelectric phenomena remain a significant challenge despite continual advancements in computational machinery. Spanning large temporal and spatial ranges demands millions of nodes to accurately depict geometry, and a comparable number of timesteps to capture dynamics. This study explores a new hardware computing paradigm, the graphics processing unit (GPU), to accelerate cardiac models, and analyzes results in the context of simulating a small mammalian heart in real time. The ODEs associated with membrane ionic flow were computed on traditional CPU and compared to GPU performance, for one to four parallel processing units. The scalability of solving the PDE responsible for tissue coupling was examined on a cluster using up to 128 cores. Results indicate that the GPU implementation was between 9 and 17 times faster than the CPU implementation and scaled similarly. Solving the PDE was still 160 times slower than real time. PMID:19964295
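A minimal sketch of why the membrane ODE step maps well onto massively parallel hardware: each node's ionic state update is independent of its neighbours, whereas the diffusion (PDE) step couples nodes and limits parallel scaling. The FitzHugh-Nagumo kinetics, grid sizes, and explicit updates below are illustrative stand-ins, not the ionic model or solvers used in the study.

```python
import numpy as np

def ode_step(v, w, dt, i_stim):
    """Per-node FitzHugh-Nagumo membrane update (illustrative kinetics).
    Every node is independent, so this step is embarrassingly parallel
    and is the part typically offloaded to a GPU."""
    dv = v - v**3 / 3.0 - w + i_stim
    dw = 0.08 * (v + 0.7 - 0.8 * w)
    return v + dt * dv, w + dt * dw

def diffusion_step(v, dt, diff, dx):
    """1-D finite-difference diffusion (tissue coupling); this is the
    PDE solve that couples nodes."""
    lap = np.zeros_like(v)
    lap[1:-1] = (v[2:] - 2.0 * v[1:-1] + v[:-2]) / dx**2
    return v + dt * diff * lap

n, dt, dx, diff = 10_000, 0.01, 0.1, 0.1
v = -1.2 * np.ones(n)
w = -0.6 * np.ones(n)
stim = np.zeros(n)
stim[:50] = 0.8                      # stimulate one end of the fibre

for _ in range(5_000):
    v, w = ode_step(v, w, dt, stim)  # independent per node
    v = diffusion_step(v, dt, diff, dx)  # couples neighbouring nodes

print("membrane potential range:", float(v.min()), float(v.max()))
```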
NASA Astrophysics Data System (ADS)
Deris, A. M.; Zain, A. M.; Sallehuddin, R.; Sharif, S.
2017-09-01
Electric discharge machining (EDM) is one of the most widely used nonconventional machining processes for hard and difficult-to-machine materials. Due to the large number of machining parameters in EDM and its complicated structure, selecting the optimal combination of machining parameters that minimizes the machining response remains a challenging task for researchers. This paper presents an experimental investigation and optimization of machining parameters for the EDM process on a stainless steel 316L workpiece using the Harmony Search (HS) algorithm. A mathematical model was developed using a regression approach with four input parameters, namely pulse on time, peak current, servo voltage and servo speed, relating them to the output response, dimensional accuracy (DA). The optimal result of the HS approach was compared with the regression analysis, and HS was found to give the better result, yielding the lowest DA value compared with the regression approach.
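The abstract does not give the fitted regression equation, so the objective function, parameter bounds, and HS hyperparameters below are hypothetical placeholders. The sketch only illustrates the generic Harmony Search loop (harmony memory, HMCR, PAR, bandwidth) that such an optimization would use.

```python
import random

# Hypothetical response surface for dimensional accuracy (DA); the real study
# fits a regression model to EDM experiments, which is not reproduced here.
def da_model(x):
    t_on, current, voltage, speed = x
    return (0.002 * t_on + 0.01 * current - 0.0005 * voltage
            + 0.003 * speed + 0.05)

BOUNDS = [(2, 400), (1, 25), (30, 120), (50, 300)]   # assumed parameter ranges
HMS, HMCR, PAR, BW, ITERS = 20, 0.9, 0.3, 0.02, 5000

def random_vector():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

# Initialise the harmony memory and keep it sorted by objective value (minimise DA).
memory = sorted((random_vector() for _ in range(HMS)), key=da_model)

for _ in range(ITERS):
    new = []
    for j, (lo, hi) in enumerate(BOUNDS):
        if random.random() < HMCR:                 # memory consideration
            value = random.choice(memory)[j]
            if random.random() < PAR:              # pitch adjustment
                value += random.uniform(-1, 1) * BW * (hi - lo)
        else:                                      # random consideration
            value = random.uniform(lo, hi)
        new.append(min(max(value, lo), hi))
    if da_model(new) < da_model(memory[-1]):       # replace the worst harmony
        memory[-1] = new
        memory.sort(key=da_model)

best = memory[0]
print("best parameters:", [round(v, 2) for v in best],
      "DA:", round(da_model(best), 4))
```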
Changes in chokeberry (Aronia melanocarpa L.) polyphenols during juice processing and storage.
Wilkes, Kail; Howard, Luke R; Brownmiller, Cindi; Prior, Ronald L
2014-05-07
Chokeberries are an excellent source of polyphenols, but their fate during juice processing and storage is unknown. The stability of anthocyanins, total proanthocyanidins, hydroxycinnamic acids, and flavonols at various stages of juice processing and over 6 months of storage at 25 °C was determined. Flavonols, total proanthocyanidins, and hydroxycinnamic acids were retained in the juice to a greater extent than anthocyanins, with losses mostly due to removal of seeds and skins following pressing. Anthocyanins were extensively degraded by thermal treatments during which time levels of protocatechuic acid and phloroglucinaldehyde increased, and additional losses occurred following pressing. Flavonols, total proanthocyanidins, and hydroxycinnamic acids were well retained in juices stored for 6 months at 25 °C, whereas anthocyanins declined linearly. Anthocyanin losses during storage were paralleled by increased polymeric color values, indicating that the small amounts of anthocyanins remaining were present in large part in polymeric forms.
Astrocytic control of synaptic function.
Papouin, Thomas; Dunphy, Jaclyn; Tolman, Michaela; Foley, Jeannine C; Haydon, Philip G
2017-03-05
Astrocytes intimately interact with synapses, both morphologically and, as evidenced in the past 20 years, at the functional level. Ultrathin astrocytic processes contact and sometimes enwrap the synaptic elements, sense synaptic transmission and shape or alter the synaptic signal by releasing signalling molecules. Yet, the consequences of such interactions in terms of information processing in the brain remain very elusive. This is largely due to two major constraints: (i) the exquisitely complex, dynamic and ultrathin nature of distal astrocytic processes that renders their investigation highly challenging and (ii) our lack of understanding of how information is encoded by local and global fluctuations of intracellular calcium concentrations in astrocytes. Here, we will review the existing anatomical and functional evidence of local interactions between astrocytes and synapses, and how it underlies a role for astrocytes in the computation of synaptic information.This article is part of the themed issue 'Integrating Hebbian and homeostatic plasticity'. © 2017 The Author(s).
Optimisation of multiplet identifier processing on a PLAYSTATION® 3
NASA Astrophysics Data System (ADS)
Hattori, Masami; Mizuno, Takashi
2010-02-01
To enable high-performance computing (HPC) for applications with large datasets using a Sony® PLAYSTATION® 3 (PS3™) video game console, we configured a hybrid system consisting of a Windows® PC and a PS3™. To validate this system, we implemented the real-time multiplet identifier (RTMI) application, which identifies multiplets of microearthquakes in terms of the similarity of their waveforms. The cross-correlation computation, which is a core algorithm of the RTMI application, was optimised for the PS3™ platform, while the rest of the computation, including data input and output remained on the PC. With this configuration, the core part of the algorithm ran 69 times faster than the original program, accelerating total computation speed more than five times. As a result, the system processed up to 2100 total microseismic events, whereas the original implementation had a limit of 400 events. These results indicate that this system enables high-performance computing for large datasets using the PS3™, as long as data transfer time is negligible compared with computation time.
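The abstract does not detail the RTMI implementation, so the following is a generic normalized cross-correlation of two waveforms, the kind of kernel described as the core computation; the synthetic signals and the implicit idea of thresholding the correlation to declare a multiplet are illustrative assumptions.

```python
import numpy as np

def normalized_xcorr(a, b):
    """Maximum of the normalized cross-correlation between two equal-length
    waveforms; values near 1 indicate near-identical (multiplet) events."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.max(np.correlate(a, b, mode="full"))

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 500)
template = np.exp(-40 * (t - 0.3) ** 2) * np.sin(2 * np.pi * 25 * t)
event = np.roll(template, 20) + 0.05 * rng.standard_normal(t.size)  # shifted, noisy copy
unrelated = rng.standard_normal(t.size)

print("similar event :", round(float(normalized_xcorr(template, event)), 3))
print("random noise  :", round(float(normalized_xcorr(template, unrelated)), 3))
```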
Basu, Anirban; Kumar, Gopinatha Suresh
2015-05-15
The thermodynamics of the interaction of the food colourant tartrazine with two homologous serum proteins, HSA and BSA, were investigated, employing microcalorimetric techniques. At T=298.15K the equilibrium constants for the tartrazine-BSA and HSA complexation process were evaluated to be (1.92 ± 0.05) × 10(5)M(-1) and (1.04 ± 0.05) × 10(5)M(-1), respectively. The binding was driven by a large negative standard molar enthalpic contribution. The binding was dominated essentially by non-polyelectrolytic forces which remained largely invariant at all salt concentrations. The polyelectrolytic contribution was weak at all salt concentrations and accounted for only 6-18% of the total standard molar Gibbs energy change in the salt concentration range 10-50mM. The negative standard molar heat capacity values, in conjunction with the enthalpy-entropy compensation phenomenon observed, established the involvement of dominant hydrophobic forces in the complexation process. Tartrazine enhanced the stability of both serum albumins against thermal denaturation. Copyright © 2014 Elsevier Ltd. All rights reserved.
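As a quick worked example of the thermodynamics referred to above, the standard molar Gibbs energy change can be recovered from the reported equilibrium constants with the usual relation ΔG° = −RT ln K. The snippet uses only values quoted in the abstract; it is an arithmetic illustration, not part of the study's analysis.

```python
import math

R = 8.314            # J mol^-1 K^-1
T = 298.15           # K
K = {"tartrazine-BSA": 1.92e5, "tartrazine-HSA": 1.04e5}   # M^-1, from the abstract

for complex_name, k in K.items():
    dG = -R * T * math.log(k) / 1000.0   # kJ mol^-1
    print(f"{complex_name}: dG ~ {dG:.1f} kJ/mol")   # about -30.2 and -28.6 kJ/mol
```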
Godoy-Lorite, Antonia; Guimerà, Roger; Sales-Pardo, Marta
2016-01-01
In social networks, individuals constantly drop ties and replace them by new ones in a highly unpredictable fashion. This highly dynamical nature of social ties has important implications for processes such as the spread of information or of epidemics. Several studies have demonstrated the influence of a number of factors on the intricate microscopic process of tie replacement, but the macroscopic long-term effects of such changes remain largely unexplored. Here we investigate whether, despite the inherent randomness at the microscopic level, there are macroscopic statistical regularities in the long-term evolution of social networks. In particular, we analyze the email network of a large organization with over 1,000 individuals throughout four consecutive years. We find that, although the evolution of individual ties is highly unpredictable, the macro-evolution of social communication networks follows well-defined statistical patterns, characterized by exponentially decaying log-variations of the weight of social ties and of individuals' social strength. At the same time, we find that individuals have social signatures and communication strategies that are remarkably stable over the scale of several years.
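One way to make the "log-variation of the weight of social ties" concrete: given edge weights (for example, yearly email counts) for the same ties in two consecutive snapshots, compute log(w_{t+1}/w_t) per tie and inspect its distribution. The synthetic counts and the Laplace-distributed multiplicative change below are assumptions used purely for illustration, not the authors' data or estimator.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic yearly email counts for the same 5,000 ties in two consecutive snapshots.
w_t = rng.integers(1, 200, size=5_000).astype(float)
w_t1 = w_t * np.exp(rng.laplace(0.0, 0.4, size=w_t.size))  # assumed multiplicative change

log_var = np.log(w_t1 / w_t)   # per-tie log-variation of the weight
print("mean log-variation      :", round(float(log_var.mean()), 3))
print("mean |log-variation|    :", round(float(np.abs(log_var).mean()), 3))
print("fraction within +/- 0.4 :", round(float((np.abs(log_var) < 0.4).mean()), 3))
```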
NASA Astrophysics Data System (ADS)
Vermeulen, A.; Verheggen, B.; Pieterse, G.; Haszpra, L.
2007-12-01
Tall towers allow us to observe the integrated influence of carbon exchange processes from large areas on the concentrations of CO2. The received signal shows large variability at diurnal and synoptic timescales. The question remains how high the resolution and how accurate transport models need to be in order to discriminate the relevant source terms from the atmospheric signal. We will examine the influence of the resolution of (ECMWF) meteorological fields and of anthropogenic and biogenic fluxes when going from resolutions of 2° to 0.2° lat-lon, using a simple Lagrangian 2D transport model. Model results will be compared to other Eulerian model results and to observations at the CHIOTTO/CarboEurope tall tower network in Europe. The biogenic fluxes taken into account are from the FACEM model (Pieterse et al., 2006). Results show that the relative influence of the different CO2 exchange processes is very different at each tower and that higher model resolution clearly pays off in better model performance.
An acceptable role for computers in the aircraft design process
NASA Technical Reports Server (NTRS)
Gregory, T. J.; Roberts, L.
1980-01-01
Some of the reasons why the computerization trend is not wholly accepted are explored for two typical cases: computer use in the technical specialties and computer use in aircraft synthesis. The factors that limit acceptance are traced in part, to the large resources needed to understand the details of computer programs, the inability to include measured data as input to many of the theoretical programs, and the presentation of final results without supporting intermediate answers. Other factors are due solely to technical issues such as limited detail in aircraft synthesis and major simplifying assumptions in the technical specialties. These factors and others can be influenced by the technical specialist and aircraft designer. Some of these factors may become less significant as the computerization process evolves, but some issues, such as understanding large integrated systems, may remain issues in the future. Suggestions for improved acceptance include publishing computer programs so that they may be reviewed, edited, and read. Other mechanisms include extensive modularization of programs and ways to include measured information as part of the input to theoretical approaches.
Toro, Nicolás; Villadas, Pablo J; Molina-Sánchez, María Dolores; Navarro-Gómez, Pilar; Vinardell, José M; Cuesta-Berrio, Lidia; Rodríguez-Carvajal, Miguel A
2017-04-06
The question of how genotypic and ecological units arise and spread in natural microbial populations remains controversial in the field of evolutionary biology. Here, we investigated the early stages of ecological and genetic differentiation in a highly clonal sympatric Sinorhizobium meliloti population. Whole-genome sequencing revealed that a large DNA region of the symbiotic plasmid pSymB was replaced in some isolates with a similar synteny block carrying densely clustered SNPs and displaying gene acquisition and loss. Two different versions of this genomic island of differentiation (GID) generated by multiple genetic exchanges over time appear to have arisen recently, through recombination in a particular clade within this population. In addition, these isolates display resistance to phages from the same geographic region, probably due to the modification of surface components by the acquired genes. Our results suggest that an underlying process of early ecological and genetic differentiation in S. meliloti is primarily triggered by acquisition of genes that confer resistance to soil phages within particular large genomic DNA regions prone to recombination.
Huang, Chun; Zhang, Jin; Young, Neil P; Snaith, Henry J; Grant, Patrick S
2016-05-10
Supercapacitors are in demand for short-term electrical charge and discharge applications. Unlike conventional supercapacitors, solid-state versions have no liquid electrolyte and do not require robust, rigid packaging for containment. Consequently they can be thinner, lighter and more flexible. However, solid-state supercapacitors suffer from lower power density and where new materials have been developed to improve performance, there remains a gap between promising laboratory results that usually require nano-structured materials and fine-scale processing approaches, and current manufacturing technology that operates at large scale. We demonstrate a new, scalable capability to produce discrete, multi-layered electrodes with a different material and/or morphology in each layer, and where each layer plays a different, critical role in enhancing the dynamics of charge/discharge. This layered structure allows efficient utilisation of each material and enables conservative use of hard-to-obtain materials. The layered electrode shows amongst the highest combinations of energy and power densities for solid-state supercapacitors. Our functional design and spray manufacturing approach to heterogeneous electrodes provide a new way forward for improved energy storage devices.
[Parametabolism as Non-Specific Modifier of Supramolecular Interactions in Living Systems].
Kozlov, V A; Sapozhnikov, S P; Sheptuhina, A I; Golenkov, A V
2015-01-01
As has recently become clear, in addition to enzymatic reactions (catalysed by enzymes and/or ribozymes), a large number of ordinary chemical reactions occur in living organisms without the participation of biological catalysts. These reactions are distinguished by low rates and, as a rule, irreversibility. For example, in diabetes mellitus, glycation and fructosylation of proteins are observed, resulting in post-translational modifications that produce poorly functioning or non-functioning proteins, which are poorly susceptible to enzymatic proteolysis and therefore accumulate in the body. Other known processes of this kind include the non-enzymatic carbamoylation, pyridoxylation and thiamination of proteins. There are reasonable grounds to believe that alcoholic injury is also realized through the synthesis of parametabolic secondary metabolites such as acetaldehyde. At the same time, progress in supramolecular chemistry shows that in biological objects there is another large group of parametabolic reactions caused by the formation of supramolecular complexes. Evidently, the known parametabolic interactions can modify the formation of supramolecular complexes in living objects. These processes are of considerable interest for fundamental biology and for fundamental and practical medicine, but they remain unexplored owing to a lack of awareness among a wide range of researchers.
Miller, B.; Jimenez, M.; Bridle, H.
2016-01-01
Inertial focusing is a microfluidic based separation and concentration technology that has expanded rapidly in the last few years. Throughput is high compared to other microfluidic approaches although sample volumes have typically remained in the millilitre range. Here we present a strategy for achieving rapid high volume processing with stacked and cascaded inertial focusing systems, allowing for separation and concentration of particles with a large size range, demonstrated here from 30 μm–300 μm. The system is based on curved channels, in a novel toroidal configuration and a stack of 20 devices has been shown to operate at 1 L/min. Recirculation allows for efficient removal of large particles whereas a cascading strategy enables sequential removal of particles down to a final stage where the target particle size can be concentrated. The demonstration of curved stacked channels operating in a cascaded manner allows for high throughput applications, potentially replacing filtration in applications such as environmental monitoring, industrial cleaning processes, biomedical and bioprocessing and many more. PMID:27808244
Large heterogeneities in comet 67P as revealed by active pits from sinkhole collapse.
Vincent, Jean-Baptiste; Bodewits, Dennis; Besse, Sébastien; Sierks, Holger; Barbieri, Cesare; Lamy, Philippe; Rodrigo, Rafael; Koschny, Detlef; Rickman, Hans; Keller, Horst Uwe; Agarwal, Jessica; A'Hearn, Michael F; Auger, Anne-Thérèse; Barucci, M Antonella; Bertaux, Jean-Loup; Bertini, Ivano; Capanna, Claire; Cremonese, Gabriele; Da Deppo, Vania; Davidsson, Björn; Debei, Stefano; De Cecco, Mariolino; El-Maarry, Mohamed Ramy; Ferri, Francesca; Fornasier, Sonia; Fulle, Marco; Gaskell, Robert; Giacomini, Lorenza; Groussin, Olivier; Guilbert-Lepoutre, Aurélie; Gutierrez-Marques, P; Gutiérrez, Pedro J; Güttler, Carsten; Hoekzema, Nick; Höfner, Sebastian; Hviid, Stubbe F; Ip, Wing-Huen; Jorda, Laurent; Knollenberg, Jörg; Kovacs, Gabor; Kramm, Rainer; Kührt, Ekkehard; Küppers, Michael; La Forgia, Fiorangela; Lara, Luisa M; Lazzarin, Monica; Lee, Vicky; Leyrat, Cédric; Lin, Zhong-Yi; Lopez Moreno, Josè J; Lowry, Stephen; Magrin, Sara; Maquet, Lucie; Marchi, Simone; Marzari, Francesco; Massironi, Matteo; Michalik, Harald; Moissl, Richard; Mottola, Stefano; Naletto, Giampiero; Oklay, Nilda; Pajola, Maurizio; Preusker, Frank; Scholten, Frank; Thomas, Nicolas; Toth, Imre; Tubiana, Cecilia
2015-07-02
Pits have been observed on many cometary nuclei mapped by spacecraft. It has been argued that cometary pits are a signature of endogenic activity, rather than impact craters such as those on planetary and asteroid surfaces. Impact experiments and models cannot reproduce the shapes of most of the observed cometary pits, and the predicted collision rates imply that few of the pits are related to impacts. Alternative mechanisms like explosive activity have been suggested, but the driving process remains unknown. Here we report that pits on comet 67P/Churyumov-Gerasimenko are active, and probably created by a sinkhole process, possibly accompanied by outbursts. We argue that after formation, pits expand slowly in diameter, owing to sublimation-driven retreat of the walls. Therefore, pits characterize how eroded the surface is: a fresh cometary surface will have a ragged structure with many pits, while an evolved surface will look smoother. The size and spatial distribution of pits imply that large heterogeneities exist in the physical, structural or compositional properties of the first few hundred metres below the current nucleus surface.
Microprocessor activity controls differential miRNA biogenesis In Vivo.
Conrad, Thomas; Marsico, Annalisa; Gehre, Maja; Orom, Ulf Andersson
2014-10-23
In miRNA biogenesis, pri-miRNA transcripts are converted into pre-miRNA hairpins. The in vivo properties of this process remain enigmatic. Here, we determine in vivo transcriptome-wide pri-miRNA processing using next-generation sequencing of chromatin-associated pri-miRNAs. We identify a distinctive Microprocessor signature in the transcriptome profile from which efficiency of the endogenous processing event can be accurately quantified. This analysis reveals differential susceptibility to Microprocessor cleavage as a key regulatory step in miRNA biogenesis. Processing is highly variable among pri-miRNAs and a better predictor of miRNA abundance than primary transcription itself. Processing is also largely stable across three cell lines, suggesting a major contribution of sequence determinants. On the basis of differential processing efficiencies, we define functionality for short sequence features adjacent to the pre-miRNA hairpin. In conclusion, we identify Microprocessor as the main hub for diversified miRNA output and suggest a role for uncoupling miRNA biogenesis from host gene expression. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Sicart, J. E.; Ramseyer, V.; Lejeune, Y.; Essery, R.; Webster, C.; Rutter, N.
2017-12-01
At high altitudes and latitudes, snow has a large influence on hydrological processes. Large fractions of these regions are covered by forests, which have a strong influence on snow accumulation and melting processes. Trees absorb a large part of the incoming shortwave radiation and this heat load is mostly dissipated as longwave radiation. Trees shelter the snow surface from wind, so sub-canopy snowmelt depends mainly on the radiative fluxes: vegetation attenuates the transmission of shortwave radiation but enhances longwave irradiance to the surface. An array of 13 pyranometers and 11 pyrgeometers was deployed on the snow surface below a coniferous forest at the CEN-MeteoFrance Col de Porte station in the French Alps (1325 m asl) during the 2017 winter in order to investigate spatial and temporal variabilities of solar and infrared irradiances in different meteorological conditions. Sky view factors measured with hemispherical photographs at each radiometer location were in a narrow range from 0.2 to 0.3. The temperature of the vegetation was measured with IR thermocouples and an IR camera. In clear sky conditions, the attenuation of solar radiation by the canopy reached 96% and its spatial variability exceeded 100 W m-2. Longwave irradiance varied by 30 W m-2 from dense canopy to gap areas. In overcast conditions, the spatial variabilities of solar and infrared irradiances were reduced and remained closely related to the sky view factor. A simple radiative model taking into account the penetration through the canopy of the direct and diffuse solar radiation, and isotropic infrared emission of the vegetation as a blackbody emitter, accurately reproduced the dynamics of the radiation fluxes at the snow surface. Model results show that solar transmissivity of the canopy in overcast conditions is an excellent proxy of the sky view factor and the emitting temperature of the vegetation remained close to the air temperature in this typically dense Alpine forest.
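A minimal sketch of the kind of sub-canopy radiation budget described above: direct and diffuse shortwave transmitted through the canopy, plus longwave from the sky weighted by the sky view factor and from the vegetation treated as an isotropic blackbody emitter. The transmissivity, sky view factor, and irradiance values below are plausible placeholders, not the calibrated values or measurements of the study.

```python
STEFAN_BOLTZMANN = 5.67e-8   # W m^-2 K^-4

def subcanopy_radiation(sw_direct, sw_diffuse, lw_sky, t_veg_k, tau_direct, svf):
    """Shortwave and longwave irradiance at the snow surface beneath a canopy.

    tau_direct : canopy transmissivity for the direct beam (assumed value)
    svf        : sky view factor; diffuse shortwave and sky longwave scale with it,
                 and the vegetation emits as a blackbody over the remaining (1 - svf).
    """
    sw_below = tau_direct * sw_direct + svf * sw_diffuse
    lw_below = svf * lw_sky + (1.0 - svf) * STEFAN_BOLTZMANN * t_veg_k**4
    return sw_below, lw_below

# Clear-sky example with numbers of plausible magnitude (assumed, not measured).
sw, lw = subcanopy_radiation(sw_direct=600.0, sw_diffuse=100.0, lw_sky=250.0,
                             t_veg_k=272.0, tau_direct=0.04, svf=0.25)
print(f"sub-canopy shortwave: {sw:.0f} W/m2, longwave: {lw:.0f} W/m2")
```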
Khoo, Lay See; Lai, Poh Soon; Saidin, Mohd Hilmi; Noor, Zahari; Mahmood, Mohd Shah
2018-07-01
Cadaver body bags are the conventional means of containing a human body or human remains, including for the storage and transportation of the deceased at any crime scene or disaster scene. During disasters, more often than not, the first responders, including the police, will be equipped with cadaver body bags for scene processing of human remains and collection of personal belongings at the disaster site. However, in unanticipated large-scale disasters involving hundreds or thousands of fatalities, cadaver body bag supplies may be scarce. The authors have therefore introduced cling film plastic wrap as an alternative to the cadaver body bag at the disaster site. The plastic wrap was tested on six different experimental subjects, i.e. adult and child mannequins; body parts of a mannequin figure (arm and hand); a human adult subject; and an unknown dead body. The strengths of the cling film plastic wrap are discussed in comparison with the cadaver body bag in terms of cost, weight, duration of the wrap, water and body fluid resistance, visibility and other advantages. An average saving of more than 5000% is noted for both the adult body wrap and the child body wrap compared to the cadaver body bag. This means that either 25 adult bodies or 80 child bodies can be wrapped for the cost of one cadaver body bag. The cling film plastic wrap has proven to have significant innovation impact for dead body management, particularly by first responders in large-scale disasters. With proper handling of dead bodies, first responders can manage the dead with dignity and respect in an overwhelmed situation, facilitating the humanitarian victim identification process later. Copyright © 2018 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Shiwang; Xie, Shi-Jie; Carrillo, Jan-Michael Y.
Polymer nanocomposites (PNCs) are important materials that are widely used in many current technologies and potentially have broader applications in the future due to their excellent property of tunability, light weight and low cost. But, expanding the limits in property enhancement remains a fundamental scientific challenge. We demonstrate that well-dispersed, small (diameter ~1.8 nm) nanoparticles with attractive interactions lead to unexpectedly large and qualitatively new changes in PNC structural dynamics in comparison to conventional composites based on particles of diameter ~10-50 nm. At the same time, the zero-shear viscosity at high temperatures remains comparable to that of the neat polymer, thereby retaining good processibility and resolving a major challenge in PNC applications. These results suggest that the nanoparticle mobility and relatively short lifetimes of nanoparticle-polymer associations open qualitatively new horizons in tunability of macroscopic properties in nanocomposites with high potential for the development of new functional materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dhondt, Ineke; Petyuk, Vladislav A.; Cai, Huaihan
Most aging hypotheses assume the accumulation of damage, resulting in gradual physiological decline and, ultimately, death. Avoiding protein damage accumulation by enhanced turnover should slow down the aging process and extend the lifespan. But, lowering translational efficiency extends rather than shortens the lifespan in C. elegans. We studied turnover of individual proteins in the long-lived daf-2 mutant by combining SILeNCe (stable isotope labeling by nitrogen in Caenorhabditis elegans) and mass spectrometry. Intriguingly, the majority of proteins displayed prolonged half-lives in daf-2, whereas others remained unchanged, signifying that longevity is not supported by high protein turnover. We found that this slowdown was most prominent for translation-related and mitochondrial proteins. Conversely, the high turnover of lysosomal hydrolases and very low turnover of cytoskeletal proteins remained largely unchanged. The slowdown of protein dynamics and decreased abundance of the translational machinery may point to the importance of anabolic attenuation in lifespan extension, as suggested by the hyperfunction theory.
Galvão, Malthus Fonseca; Pujol-Luz, José Roberto; de Assis Pujol-Luz, Cristiane Vieira; de Rosa, Cássio Thyone Almeida; Simone, Luiz Ricardo L; Báo, Sônia Nair; Barros-Cordeiro, Karine Brenda; Pessoa, Larissa; Bissacot, Giovanna
2015-09-01
Little is known regarding the scavenger fauna associated with buried human corpses, particularly in clandestine burials. We report the presence of 20 shells of the terrestrial snail Allopeas micra, within hollow bones of human remains buried for 5 years, during the process of collecting DNA material. The fact that a large number of shells of A. micra had been found in the corpse and in the crime scene supports the assumption that there was no attempt to remove the corpse from the area where the crime occurred. Despite this, our observations cannot be used to estimate the postmortem interval because there is no precise knowledge about the development of this species. This is the first record of a terrestrial snail associated with a human corpse and its role in this forensic medicine case. © 2015 American Academy of Forensic Sciences.
NASA Astrophysics Data System (ADS)
Miyagi, Yousuke; Ozawa, Taku; Shimada, Masanobu
2009-10-01
On April 1, 2007 (UTC), a large Mw 8.1 interplate earthquake struck the Solomon Islands subduction zone, where complicated tectonics result from the subduction of four plates. Extensive ground movements and a large tsunami occurred in the epicentral area, causing severe damage over a wide area. Using ALOS/PALSAR data and the DInSAR technique, we detected crustal deformation exceeding 2 m in islands close to the epicenter. A slip distribution of the inferred seismic fault was estimated using geodetic information derived from DInSAR processing and field investigations. The result indicates large slip areas around the hypocenter and the centroid. It is possible that the largest slip area is related to subduction of the plate boundary between the Woodlark and Australian plates. A small slip area between those large slip areas may indicate weak coupling due to thermal activity related to volcanic activity on Simbo Island. The 2007 earthquake struck an area where no large earthquake had occurred since 1970. Most of this seismic gap was filled by the 2007 events; however, a small seismic gap still remains in the southeastern region of the 2007 rupture.
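For context on how DInSAR phase translates into the metre-scale deformation quoted above, the standard conversion from unwrapped differential phase to line-of-sight displacement for a repeat-pass system is d = −λΔφ/(4π); the L-band wavelength used by ALOS/PALSAR is roughly 23.6 cm. The fringe count below is an illustrative back-of-the-envelope number, not a value from the study.

```python
import math

WAVELENGTH_L_BAND = 0.236           # m, ALOS/PALSAR L-band (approximate)

def los_displacement(delta_phi_rad, wavelength=WAVELENGTH_L_BAND):
    """Line-of-sight displacement from unwrapped differential phase."""
    return -wavelength * delta_phi_rad / (4.0 * math.pi)

# One full interferometric fringe (2*pi of phase) corresponds to lambda/2
# of line-of-sight motion, so ~2 m of deformation is roughly 17 fringes.
one_fringe = abs(los_displacement(2.0 * math.pi))
print(f"LOS motion per fringe: {one_fringe * 100:.1f} cm")
print(f"fringes for 2 m of motion: {2.0 / one_fringe:.0f}")
```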
NASA Astrophysics Data System (ADS)
Duro, Javier; Iglesias, Rubén; Blanco, Pablo; Albiol, David; Koudogbo, Fifamè
2015-04-01
The Wide Area Product (WAP) is a new interferometric product developed to provide measurements over large regions. Persistent Scatterer Interferometry (PSI) has largely proved its robust and precise performance in measuring ground surface deformation in different application domains. In this context, however, accurate displacement estimation over large-scale areas (more than 10,000 km2) characterized by low-magnitude motion gradients (3-5 mm/year), such as those induced by inter-seismic or Earth tidal effects, still remains an open issue. The main reason is the inclusion of low-quality and more distant persistent scatterers in order to bridge low-quality areas, such as water bodies, crop areas and forested regions. This leads to spatial propagation of errors in the PSI integration process, poor estimation and compensation of the Atmospheric Phase Screen (APS), and difficulty in handling residual long-wavelength phase patterns originating from orbit state vector inaccuracies. Research work on generating a Wide Area Product of ground motion in preparation for the Sentinel-1 mission has been conducted in the last stages of Terrafirma as well as in other research programs. These developments propose technological updates for maintaining precision in large-scale PSI analysis. Some of the updates are based on the use of external information, such as meteorological models, and on the employment of GNSS data for improved calibration of large-scale measurements. Covering wide regions usually implies processing areas whose land use is dominated by livestock farming, horticulture, urbanization and forest. This represents an important challenge for providing continuous InSAR measurements and for applying advanced phase filtering strategies to enhance coherence. The advanced PSI processing has been performed over several areas, allowing a large-scale analysis of tectonic patterns and of motion caused by multiple hazards such as volcanic activity, landslides and floods. Several examples of the application of the PSI WAP to wide regions for measuring ground displacements related to different types of hazards, natural and human-induced, will be presented. The InSAR processing approach for measuring accurate movements at local and large scales to allow multi-hazard interpretation studies will also be discussed. The test areas will show deformation related to active fault systems, landslides on mountain slopes, ground compaction over underlying aquifers, and movements in volcanic areas.
Li, Liangliang; Wang, Jiangfeng; Wang, Yu
2016-08-01
Analysis of the process of decomposition is essential in establishing the postmortem interval. However, despite the fact that insects are important players in body decomposition, their exact function within the decay process is still unclear. There is also limited knowledge as to how the decomposition process occurs in the absence of insects. In the present study, we compared the decomposition of a pig carcass in open air with that of one placed in a methyl methacrylate box to prevent insect contact. The pig carcass in the methyl methacrylate box was in the fresh stage for 1 day, the bloated stage from 2 d to 11 d, and underwent deflated decay from 12 d onwards. In contrast, the pig carcass in open air passed through the fresh, bloated, active decay and post-decay stages, completing them at 22.3 h (0.93 d), 62.47 h (2.60 d), 123.63 h (5.15 d) and 246.5 h (10.27 d) after the start of the experiment, respectively, before entering the skeletonization stage. A large amount of soft tissue still remained on the pig carcass in the methyl methacrylate box at 26 d, while only scattered bones remained of the pig carcass in open air. The results indicate that insects greatly accelerate the decomposition process. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
The roles of associative and executive processes in creative cognition.
Beaty, Roger E; Silvia, Paul J; Nusbaum, Emily C; Jauk, Emanuel; Benedek, Mathias
2014-10-01
How does the mind produce creative ideas? Past research has pointed to important roles of both executive and associative processes in creative cognition. But such work has largely focused on the influence of one ability or the other-executive or associative-so the extent to which both abilities may jointly affect creative thought remains unclear. Using multivariate structural equation modeling, we conducted two studies to determine the relative influences of executive and associative processes in domain-general creative cognition (i.e., divergent thinking). Participants completed a series of verbal fluency tasks, and their responses were analyzed by means of latent semantic analysis (LSA) and scored for semantic distance as a measure of associative ability. Participants also completed several measures of executive function-including broad retrieval ability (Gr) and fluid intelligence (Gf). Across both studies, we found substantial effects of both associative and executive abilities: As the average semantic distance between verbal fluency responses and cues increased, so did the creative quality of divergent-thinking responses (Study 1 and Study 2). Moreover, the creative quality of divergent-thinking responses was predicted by the executive variables-Gr (Study 1) and Gf (Study 2). Importantly, the effects of semantic distance and the executive function variables remained robust in the same structural equation model predicting divergent thinking, suggesting unique contributions of both constructs. The present research extends recent applications of LSA in creativity research and provides support for the notion that both associative and executive processes underlie the production of novel ideas.
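To make the semantic-distance measure concrete: in an LSA-style analysis, each word is a vector in a reduced semantic space, and the distance between a cue and a response can be taken as one minus their cosine similarity, averaged over responses. The tiny vectors below are invented for illustration only; the study used vectors estimated from a large text corpus and its own scoring pipeline.

```python
import numpy as np

# Hypothetical 4-dimensional "semantic space" vectors (stand-ins for LSA vectors).
vectors = {
    "animal": np.array([0.9, 0.1, 0.0, 0.2]),
    "dog":    np.array([0.8, 0.2, 0.1, 0.1]),
    "quasar": np.array([0.0, 0.1, 0.9, 0.3]),
}

def semantic_distance(a, b):
    """1 - cosine similarity between two word vectors."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return 1.0 - cos

cue = "animal"
responses = ["dog", "quasar"]
distances = [semantic_distance(vectors[cue], vectors[r]) for r in responses]
print("per-response distances:", [round(d, 3) for d in distances])
print("average semantic distance from cue:", round(float(np.mean(distances)), 3))
```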
Jang, Mi; Shim, Won Joon; Han, Gi Myung; Song, Young Kyoung; Hong, Sang Hee
2018-06-01
Fragmentation of large plastic debris into smaller particles results in increasing microplastic concentrations in the marine environment. In plastic debris fragmentation processes, the influence of biological factors remains largely unknown. This study investigated the fragmentation of expanded polystyrene (EPS) debris by polychaetes (Marphysa sanguinea) living on the debris. A large number of EPS particles (131 ± 131 particles/individual, 0.2-3.8 mm in length) were found in the digestive tracts of burrowing polychaetes living on EPS debris. To confirm the formation of microplastics by polychaetes and identify the quantity and morphology of produced microplastics, polychaetes were exposed to EPS blocks in filtered seawater under laboratory conditions. Polychaetes burrowed into the blocks and created numerous EPS microplastic particles, indicating that a single polychaete can produce hundreds of thousands of microplastic particles per year. These results reveal the potential role of marine organisms as microplastic producers in the marine environment. Copyright © 2018 Elsevier Ltd. All rights reserved.
Microphysics of Pyrocumulonimbus Clouds
NASA Technical Reports Server (NTRS)
Jensen, Eric; Ackerman, Andrew S.; Fridlind, Ann
2004-01-01
The intense heat from forest fires can generate explosive deep convective cloud systems that inject pollutants to high altitudes. Both satellite and high-altitude aircraft measurements have documented cases in which these pyrocumulonimbus clouds inject large amounts of smoke well into the stratosphere (Fromm and Servranckx 2003; Jost et al. 2004). This smoke can remain in the stratosphere, be transported large distances, and affect lower stratospheric chemistry. In addition recent in situ measurements in pyrocumulus updrafts have shown that the high concentrations of smoke particles have significant impacts on cloud microphysical properties. Very high droplet number densities result in delayed precipitation and may enhance lightning (Andrew et al. 2004). Presumably, the smoke particles will also lead to changes in the properties of anvil cirrus produces by the deep convection, with resulting influences on cloud radiative forcing. In situ sampling near the tops of mature pyrocumulonimbus is difficult due to the high altitude and violence of the storms. In this study, we use large eddy simulations (LES) with size-resolved microphysics to elucidate physical processes in pyrocumulonimbus clouds.
Environment and host as large-scale controls of ectomycorrhizal fungi.
van der Linde, Sietse; Suz, Laura M; Orme, C David L; Cox, Filipa; Andreae, Henning; Asi, Endla; Atkinson, Bonnie; Benham, Sue; Carroll, Christopher; Cools, Nathalie; De Vos, Bruno; Dietrich, Hans-Peter; Eichhorn, Johannes; Gehrmann, Joachim; Grebenc, Tine; Gweon, Hyun S; Hansen, Karin; Jacob, Frank; Kristöfel, Ferdinand; Lech, Paweł; Manninger, Miklós; Martin, Jan; Meesenburg, Henning; Merilä, Päivi; Nicolas, Manuel; Pavlenda, Pavel; Rautio, Pasi; Schaub, Marcus; Schröck, Hans-Werner; Seidling, Walter; Šrámek, Vít; Thimonier, Anne; Thomsen, Iben Margrete; Titeux, Hugues; Vanguelova, Elena; Verstraeten, Arne; Vesterdal, Lars; Waldner, Peter; Wijk, Sture; Zhang, Yuxin; Žlindra, Daniel; Bidartondo, Martin I
2018-06-06
Explaining the large-scale diversity of soil organisms that drive biogeochemical processes-and their responses to environmental change-is critical. However, identifying consistent drivers of belowground diversity and abundance for some soil organisms at large spatial scales remains problematic. Here we investigate a major guild, the ectomycorrhizal fungi, across European forests at a spatial scale and resolution that is-to our knowledge-unprecedented, to explore key biotic and abiotic predictors of ectomycorrhizal diversity and to identify dominant responses and thresholds for change across complex environmental gradients. We show the effect of 38 host, environment, climate and geographical variables on ectomycorrhizal diversity, and define thresholds of community change for key variables. We quantify host specificity and reveal plasticity in functional traits involved in soil foraging across gradients. We conclude that environmental and host factors explain most of the variation in ectomycorrhizal diversity, that the environmental thresholds used as major ecosystem assessment tools need adjustment and that the importance of belowground specificity and plasticity has previously been underappreciated.
MetaSRA: normalized human sample-specific metadata for the Sequence Read Archive.
Bernstein, Matthew N; Doan, AnHai; Dewey, Colin N
2017-09-15
The NCBI's Sequence Read Archive (SRA) promises great biological insight if one could analyze the data in the aggregate; however, the data remain largely underutilized, in part, due to the poor structure of the metadata associated with each sample. The rules governing submissions to the SRA do not dictate a standardized set of terms that should be used to describe the biological samples from which the sequencing data are derived. As a result, the metadata include many synonyms, spelling variants and references to outside sources of information. Furthermore, manual annotation of the data remains intractable due to the large number of samples in the archive. For these reasons, it has been difficult to perform large-scale analyses that study the relationships between biomolecular processes and phenotype across diverse diseases, tissues and cell types present in the SRA. We present MetaSRA, a database of normalized SRA human sample-specific metadata following a schema inspired by the metadata organization of the ENCODE project. This schema involves mapping samples to terms in biomedical ontologies, labeling each sample with a sample-type category, and extracting real-valued properties. We automated these tasks via a novel computational pipeline. The MetaSRA is available at metasra.biostat.wisc.edu via both a searchable web interface and bulk downloads. Software implementing our computational pipeline is available at http://github.com/deweylab/metasra-pipeline. cdewey@biostat.wisc.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
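An illustrative toy of the normalization task described above: raw, free-text sample metadata is mapped to controlled ontology terms via synonym matching and assigned a sample-type category. This is a deliberately simplified stand-in under assumed inputs; the actual MetaSRA pipeline uses curated biomedical ontologies and a considerably more sophisticated matching and inference procedure, and the IDs and synonyms below are illustrative.

```python
# Toy synonym table mapping free-text metadata values to ontology terms
# (IDs and synonyms here are illustrative, not the real ontology content).
SYNONYMS = {
    "hela": ("EFO:0001185", "HeLa cell"),
    "hela cell": ("EFO:0001185", "HeLa cell"),
    "liver": ("UBERON:0002107", "liver"),
    "hepatic tissue": ("UBERON:0002107", "liver"),
}

def normalize_sample(raw_metadata):
    """Map raw key-value metadata to ontology terms and a sample-type label."""
    terms = []
    for value in raw_metadata.values():
        hit = SYNONYMS.get(value.strip().lower())
        if hit and hit not in terms:
            terms.append(hit)
    sample_type = "cell line" if any(label.endswith("cell") for _, label in terms) else "tissue"
    return {"mapped terms": terms, "sample type": sample_type}

sample = {"source_name": "HeLa", "tissue": "hepatic tissue"}
print(normalize_sample(sample))
```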
Influence of the watermark in immersion lithography process
NASA Astrophysics Data System (ADS)
Kawamura, Daisuke; Takeishi, Tomoyuki; Sho, Koutarou; Matsunaga, Kentarou; Shibata, Naofumi; Ozawa, Kaoru; Shimura, Satoru; Kyoda, Hideharu; Kawasaki, Tetsu; Ishida, Seiki; Toshima, Takayuki; Oonishi, Yasunobu; Ito, Shinichi
2005-05-01
In liquid immersion lithography, the use of cover material (C/M) films has been discussed as a way to reduce the elution of resist components into the immersion fluid. Owing to fluctuations in the exposure tool or the resist process, a waterdrop may remain on the wafer and form a watermark (W/M). We discuss the influence of the W/M on resist patterns, the formation process of the W/M, and the reduction of pattern defects due to the W/M. Resist patterns within and around an intentionally made W/M were observed in three cases: without a C/M, with TOK TSP-3A, and with an alkali-soluble C/M. In all C/M cases, the pattern defects were T-topped shapes. The reduction of pattern defects due to waterdrops was examined, and it was found that remaining waterdrops produced defects. It is therefore necessary to remove waterdrops before they dry, and/or to remove the defects they cause; however, a new drying technique and/or unit will be needed to avoid forming any W/M. To understand the formation mechanism of the W/M, the waterdrop was observed through the drying step and a simulative reproduction of the experiment was performed. If the maximum drying time of a waterdrop on the immersion exposure tool is estimated at 90 seconds, a waterdrop whose volume and diameter are less than 0.02 uL and 350 um will dry and produce a pattern defect. This threshold becomes larger as the wafer speed increases. From the results and considerations in this work, it is concluded that it will be difficult to develop a C/M as a single film that produces no pattern defects from remaining waterdrops.
Selective spatial attention modulates bottom-up informational masking of speech
Carlile, Simon; Corkhill, Caitlin
2015-01-01
To hear out a conversation against other talkers, listeners overcome energetic and informational masking. Although largely attributed to top-down processes, informational masking has also been demonstrated using unintelligible speech and amplitude-modulated maskers, suggesting bottom-up processes. We examined the role of speech-like amplitude modulations in informational masking using a spatial masking release paradigm. Separating a target talker from two masker talkers produced a 20 dB improvement in speech reception threshold, 40% of which was attributed to a release from informational masking. When the across-frequency temporal modulations in the masker talkers are decorrelated, the speech is unintelligible, although the within-frequency modulation characteristics remain identical. Used as a masker as above, informational masking accounted for 37% of the spatial unmasking seen with this masker. This unintelligible and highly differentiable masker is unlikely to engage top-down processes. These data provide strong evidence of bottom-up masking involving speech-like, within-frequency modulations, and show that this presumably low-level process can be modulated by selective spatial attention. PMID:25727100
Isotope effect of mercury diffusion in air
Koster van Groos, Paul G.; Esser, Bradley K.; Williams, Ross W.; Hunt, James R.
2014-01-01
Identifying and reducing impacts from mercury sources in the environment remains a considerable challenge and requires process based models to quantify mercury stocks and flows. The stable isotope composition of mercury in environmental samples can help address this challenge by serving as a tracer of specific sources and processes. Mercury isotope variations are small and result only from isotope fractionation during transport, equilibrium, and transformation processes. Because these processes occur in both industrial and environmental settings, knowledge of their associated isotope effects is required to interpret mercury isotope data. To improve the mechanistic modeling of mercury isotope effects during gas phase diffusion, an experimental program tested the applicability of kinetic gas theory. Gas-phase elemental mercury diffusion through small bore needles from finite sources demonstrated mass dependent diffusivities leading to isotope fractionation described by a Rayleigh distillation model. The measured relative atomic diffusivities among mercury isotopes in air are large and in agreement with kinetic gas theory. Mercury diffusion in air offers a reasonable explanation of recent field results reported in the literature. PMID:24364380
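A short sketch of the two ingredients named above: kinetic gas theory gives mass-dependent diffusivities of mercury isotopes in air (diffusivity scaling with the inverse square root of the Hg-air reduced mass), and a Rayleigh distillation model then describes how the isotope ratio of the remaining mercury evolves as the faster-diffusing light isotope is preferentially lost. The masses are standard values and the worked numbers are illustrative; this is not the paper's fitted fractionation factor.

```python
import math

M_AIR = 28.96                    # g/mol, mean molar mass of air
M_198, M_202 = 197.97, 201.97    # g/mol, 198Hg and 202Hg

def reduced_mass(m, m_air=M_AIR):
    return m * m_air / (m + m_air)

# Kinetic gas theory: D is proportional to 1/sqrt(reduced mass), so the
# ratio of isotope diffusivities depends only on the Hg-air reduced masses.
alpha = math.sqrt(reduced_mass(M_202) / reduced_mass(M_198))   # D_198 / D_202
print(f"D(198Hg)/D(202Hg) = {alpha:.5f}")

def delta_remaining(f, alpha):
    """Rayleigh model: per-mil shift of the 202/198 ratio of the residue after a
    fraction f of the mercury remains (light isotope diffuses away faster,
    so the residue grows isotopically heavier)."""
    return (f ** (1.0 / alpha - 1.0) - 1.0) * 1000.0

for f in (0.9, 0.5, 0.1):
    print(f"fraction remaining {f:>4}: 202/198 shift ~ {delta_remaining(f, alpha):+.2f} per mil")
```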
The Theory of High Energy Collision Processes - Final Report DOE/ER/40158-1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Tai, T.
In 1984, DOE awarded Harvard University a new Grant DE-FG02-84ER40158 to continue their support of Tai Tsun Wu as Principal Investigator of research on the theory of high energy collision processes. This Grant was renewed and remained active continuously from June 1, 1984 through November 30, 2007. Topics of interest during the 23-year duration of this Grant include: the theory and phenomenology of collision and production processes at ever higher energies; helicity methods of QED and QCD; neutrino oscillations and masses; Yang-Mills gauge theory; Beamstrahlung; Fermi pseudopotentials; magnetic monopoles and dyons; cosmology; classical confinement; mass relations; Bose-Einstein condensation; and large-momentum-transfer scattering processes. This Final Report describes the research carried out on Grant DE-FG02-84ER40158 for the period June 1, 1984 through November 30, 2007. Two books resulted from this project and a total of 125 publications.
Dynamics of assembly production flow
NASA Astrophysics Data System (ADS)
Ezaki, Takahiro; Yanagisawa, Daichi; Nishinari, Katsuhiro
2015-06-01
Despite recent developments in management theory, maintaining a manufacturing schedule remains difficult because of production delays and fluctuations in demand and supply of materials. The dynamic response of manufacturing systems to such disruptions has rarely been studied. To capture these responses, we investigate a process that models the assembly of parts into end products. The complete assembly process is represented by a directed tree, where the smallest parts are injected at leaves and the end products are removed at the root. A discrete assembly process, represented by a node on the network, integrates parts, which are then sent to the next downstream node as a single part. The model exhibits some intriguing phenomena, including overstock cascade, phase transition in terms of demand and supply fluctuations, nonmonotonic distribution of stockout in the network, and the formation of a stockout path and stockout chains. Surprisingly, these rich phenomena result solely from the nature of distributed assembly processes. From a physical perspective, these phenomena provide insight into delay dynamics and inventory distributions in large-scale manufacturing systems.
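A toy realization of the model described above: a directed tree in which each node waits for one part from each of its children before assembling and passing a single part upstream, with random supply at the leaves and random demand at the root. The tree depth, probabilities and step count are arbitrary choices for illustration; the paper's analysis of phase transitions, stockout paths and overstock cascades is not reproduced here.

```python
import random

random.seed(3)

# Binary assembly tree of depth 2: leaves 3-6 feed nodes 1 and 2, which feed root 0.
CHILDREN = {0: [1, 2], 1: [3, 4], 2: [5, 6], 3: [], 4: [], 5: [], 6: []}
buffers = {node: 0 for node in CHILDREN}      # finished parts waiting at each node
stockouts = {node: 0 for node in CHILDREN}    # steps on which a node could not assemble
P_SUPPLY, P_DEMAND, STEPS = 0.6, 0.6, 10_000  # balanced supply and demand (assumed)

for _ in range(STEPS):
    # Smallest parts are injected at the leaves with probability P_SUPPLY.
    for leaf in (3, 4, 5, 6):
        if random.random() < P_SUPPLY:
            buffers[leaf] += 1
    # Assembly nodes are processed top-down so a part spends at least one step at
    # each node; a node assembles only if every child can deliver one part.
    for node in (0, 1, 2):
        kids = CHILDREN[node]
        if all(buffers[k] > 0 for k in kids):
            for k in kids:
                buffers[k] -= 1
            buffers[node] += 1
        else:
            stockouts[node] += 1
    # Demand removes one end product at the root with probability P_DEMAND.
    if random.random() < P_DEMAND and buffers[0] > 0:
        buffers[0] -= 1

print("stockout frequency per assembly node:",
      {n: round(s / STEPS, 3) for n, s in stockouts.items() if CHILDREN[n]})
print("final buffer levels:", buffers)
```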
Hilbig, Benjamin E; Erdfelder, Edgar; Pohl, Rüdiger F
2011-07-01
A new process model of the interplay between memory and judgment processes was recently suggested, assuming that retrieval fluency-that is, the speed with which objects are recognized-will determine inferences concerning such objects in a single-cue fashion. This aspect of the fluency heuristic, an extension of the recognition heuristic, has remained largely untested due to methodological difficulties. To overcome the latter, we propose a measurement model from the class of multinomial processing tree models that can estimate true single-cue reliance on recognition and retrieval fluency. We applied this model to aggregate and individual data from a probabilistic inference experiment and considered both goodness of fit and model complexity to evaluate different hypotheses. The results were relatively clear-cut, revealing that the fluency heuristic is an unlikely candidate for describing comparative judgments concerning recognized objects. These findings are discussed in light of a broader theoretical view on the interplay of memory and judgment processes.
Time-based analysis of the apheresis platelet supply chain in England.
Wilding, R; Cotton, S; Dobbin, J; Chapman, J; Yates, N
2011-10-01
During 2009/2010, the loss of platelets within NHS Blood and Transplant (NHSBT) due to time expiry was 9.3%. Hospitals remain reluctant to hold stocks of platelets due to the poor shelf life at issue. The purpose of this study was to identify areas for time compression in the apheresis platelet supply chain to extend the shelf life available to hospitals and reduce wastage in NHSBT. This was done within the context of NHSBT reconfiguring their supply chain and moving towards a consolidated and centralised approach. Time-based process mapping was applied to identify value-adding and non-value-adding time in two manufacturing models. A large amount of the non-value-adding time in the apheresis platelet supply chain is due to transportation and waiting for the next step in the manufacturing process to take place. Time-based process mapping provides an effective 'lens' for supply chain professionals to identify opportunities for improvement in the platelet supply chain. © 2011 The Author(s). Vox Sanguinis © 2011 International Society of Blood Transfusion.
The search for a topographic signature of life.
Dietrich, William E; Perron, J Taylor
2006-01-26
Landscapes are shaped by the uplift, deformation and breakdown of bedrock and the erosion, transport and deposition of sediment. Life is important in all of these processes. Over short timescales, the impact of life is quite apparent: rock weathering, soil formation and erosion, slope stability and river dynamics are directly influenced by biotic processes that mediate chemical reactions, dilate soil, disrupt the ground surface and add strength with a weave of roots. Over geologic time, biotic effects are less obvious but equally important: biota affect climate, and climatic conditions dictate the mechanisms and rates of erosion that control topographic evolution. Apart from the obvious influence of humans, does the resulting landscape bear an unmistakable stamp of life? The influence of life on topography is a topic that has remained largely unexplored. Erosion laws that explicitly include biotic effects are needed to explore how intrinsically small-scale biotic processes can influence the form of entire landscapes, and to determine whether these processes create a distinctive topography.
Preece, David; Allan, Alfred; Becerra, Rodrigo
2016-01-01
To examine the neuropsychological outcomes for an adult patient, 2 years after receiving microsurgery and conventional radiotherapy for a recurrent craniopharyngioma, and the impact of a further intervention, stereotactic radiotherapy, on this level of neuropsychological functioning. JD, a 30-year-old male whose recurrent craniopharyngioma had 2 years earlier been treated with two operations and conventional radiotherapy. JD was assessed (using standardized clinical tests) before and after a course of stereotactic radiotherapy. Prior to stereotactic radiotherapy (and 2 years after microsurgery and conventional radiotherapy), JD's IQ was intact, but considerable impairments were present in executive functioning, memory, theory of mind and processing speed. Fifteen months after stereotactic radiotherapy, all neuropsychological domains remained largely static or improved, supporting the utility of this treatment option in the neuropsychological domain. However, deficits in executive functioning, memory and processing speed remained. These findings suggest that, even after multiple treatments, substantial cognitive impairments can be present in an adult patient with a recurrent craniopharyngioma. This profile of deficits underlines the inadequacy of relying purely on IQ as a marker for cognitive health in this population and emphasizes the need to include neuropsychological impairments as a focus of rehabilitation with these patients.
A mechanism for the production of ultrafine particles from concrete fracture.
Jabbour, Nassib; Rohan Jayaratne, E; Johnson, Graham R; Alroe, Joel; Uhde, Erik; Salthammer, Tunga; Cravigan, Luke; Faghihi, Ehsan Majd; Kumar, Prashant; Morawska, Lidia
2017-03-01
While the crushing of concrete gives rise to large quantities of coarse dust, it is not widely recognized that this process also emits significant quantities of ultrafine particles. These particles impact not only the environments at construction sites but also those of entire urban areas. The origin of these ultrafine particles is uncertain, as existing theories do not support their production by mechanical processes. We propose a hypothesis for this observation based on the volatilisation of materials at the concrete fracture interface. The results from this study confirm that mechanical methods can produce ultrafine particles (UFP) from concrete, and that the particles are volatile. The ultrafine mode was only observed during concrete fracture, producing particle size distributions with average count median diameters of 27, 39 and 49 nm for the three tested concrete samples. Further volatility measurements found that the particles were highly volatile, showing between 60 and 95% reduction in the volume fraction remaining by 125 °C. An analysis of the volatile fraction remaining found that different volatile materials are responsible for the production of particles in the different samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
Assessment of anorexia nervosa: an overview of universal issues and contextual challenges
2013-01-01
Aim Anorexia Nervosa (AN) is a complex and clinically challenging syndrome. Intended for specialist audiences, this narrative review aims to summarise the available literature related to assessment in the adult patient context, synthesising both research evidence and clinical consensus guidelines. Method We provide a review of the available literature on specialist assessment of AN focusing on common trajectories into assessment, obstacles accessing assessment, common presenting issues and barriers to the assessment process, the necessary scope of assessment, and tools and techniques. It describes the further step of synthesising assessment information in ways that can inform resultant care plans. Results In addition to assessment of core behaviours and diagnostic skills, considerations for the expert assessor include the functions of primary care, systemic and personal barriers, knowledge of current assessment tools and research pertaining to comorbid pathology in AN, assessing severity of illness, role of family at assessment, as well as medical, nutritional and compulsory elements of assessment. Conclusion Comprehensive assessment of AN in the current healthcare context still remains largely the remit of the specialist ED clinician. Assessment should remain an on-going process, paying particular attention to available empirical evidence, thereby reducing the gap between research and practice. PMID:24999408
NASA Astrophysics Data System (ADS)
Shi, X.
2015-12-01
As NSF indicated, "Theory and experimentation have for centuries been regarded as two fundamental pillars of science. It is now widely recognized that computational and data-enabled science forms a critical third pillar." Geocomputation is the third pillar of GIScience and geosciences. With the exponential growth of geodata, the challenge of scalable and high-performance computing for big data analytics becomes urgent, because many research activities are constrained by software or tools that cannot even complete the computation process. Heterogeneous geodata integration and analytics obviously magnify the complexity and the operational time frame. Many large-scale geospatial problems may not be processable at all if the computer system does not have sufficient memory or computational power. Emerging computer architectures, such as Intel's Many Integrated Core (MIC) Architecture and the Graphics Processing Unit (GPU), and advanced computing technologies provide promising solutions that employ massive parallelism and hardware resources to achieve scalability and high performance for data-intensive computing over large spatiotemporal and social media data. Exploring novel algorithms and deploying the solutions in massively parallel computing environments to achieve scalable data processing and analytics over large-scale, complex, and heterogeneous geodata with consistent quality and high performance has been the central theme of our research team in the Department of Geosciences at the University of Arkansas (UARK). New multi-core architectures combined with application accelerators hold the promise of achieving scalability and high performance by exploiting task- and data-level parallelism that is not supported by conventional computing systems. Such a parallel or distributed computing environment is particularly suitable for large-scale geocomputation over big data, as demonstrated by our prior work, while the potential of such advanced infrastructure remains unexplored in this domain. In this presentation, our prior and ongoing initiatives will be summarized to exemplify how we exploit multicore CPUs, GPUs, and MICs, and clusters of CPUs, GPUs and MICs, to accelerate geocomputation in different applications.
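As a simplified illustration of the data-parallel pattern described in this presentation abstract, the sketch below splits a large synthetic raster into row tiles and processes them across CPU cores. The tile size, the focal-mean kernel, and the process count are assumptions, and halo exchange at tile boundaries is omitted for brevity.

```python
# A minimal sketch of tile-parallel geocomputation on a large in-memory raster;
# a synthetic NumPy array stands in for real geodata.
import numpy as np
from multiprocessing import Pool

TILE_ROWS = 1024  # rows per tile (illustrative)

def focal_mean(tile: np.ndarray) -> np.ndarray:
    """3x3 moving-average filter computed with shifted views (no halo rows)."""
    rows, cols = tile.shape
    padded = np.pad(tile, 1, mode="edge")
    acc = np.zeros((rows, cols), dtype=np.float64)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            acc += padded[1 + dr:1 + dr + rows, 1 + dc:1 + dc + cols]
    return acc / 9.0

def split_rows(raster: np.ndarray, tile_rows: int):
    return [raster[i:i + tile_rows] for i in range(0, raster.shape[0], tile_rows)]

if __name__ == "__main__":
    raster = np.random.rand(8192, 4096)           # stand-in for a large grid
    with Pool(processes=4) as pool:               # data parallelism over row tiles
        tiles = pool.map(focal_mean, split_rows(raster, TILE_ROWS))
    result = np.vstack(tiles)
    print(result.shape)
```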
Key ingredients needed when building large data processing systems for scientists
NASA Technical Reports Server (NTRS)
Miller, K. C.
2002-01-01
Why is building a large science software system so painful? Weren't teams of software engineers supposed to make life easier for scientists? Does it sometimes feel as if it would be easier to write the million lines of code in Fortran 77 yourself? The cause of this dissatisfaction is that many of the needs of the science customer remain hidden in discussions with software engineers until after a system has already been built. In fact, many of the hidden needs of the science customer conflict with stated needs and are therefore very difficult to meet unless they are addressed from the outset in a system's architectural requirements. What's missing is the consideration of a small set of key software properties in initial agreements about the requirements, the design and the cost of the system.
Domoic acid excretion in dungeness crabs, razor clams and mussels.
Schultz, Irvin R; Skillman, Ann; Woodruff, Dana
2008-07-01
Domoic acid (DA) is a neurotoxic amino acid produced by several marine algal species of the Pseudo-nitzschia (PN) genus. We studied the elimination of DA from hemolymph after intravascular (IV) injection in razor clams (Siliqua patula), mussels (Mytilus edulis) and Dungeness crabs (Cancer magister). Crabs were also injected with two other organic acids, dichloroacetic acid (DCAA) and kainic acid (KA). For IV dosing, hemolymph was repetitively sampled and DA concentrations were measured by HPLC-UV. Toxicokinetic analysis of DA in crabs suggested that most of the injected dose remained within the hemolymph compartment, with little extravascular distribution. This observation is in sharp contrast to results obtained from clams and mussels, which exhibited similarly large apparent volumes of distribution despite large differences in overall clearance. These findings suggest fundamentally different storage and elimination processes are occurring for DA between bivalves and crabs.
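The quantities behind this comparison, apparent volume of distribution and clearance, follow from standard one-compartment analysis of IV-bolus data. The sketch below uses made-up dose, sampling times, and concentrations rather than values from the study.

```python
# A minimal sketch of one-compartment IV-bolus toxicokinetics: fit a mono-exponential
# decline to hemolymph concentrations and derive Vd, CL, and the half-life.
import numpy as np

dose_ug = 50.0                                             # hypothetical IV dose (ug)
t_h = np.array([0.5, 1, 2, 4, 8, 12, 24])                  # sampling times (h)
c_ug_ml = np.array([4.1, 3.8, 3.4, 2.7, 1.7, 1.1, 0.3])    # hemolymph concentrations

slope, lnC0 = np.polyfit(t_h, np.log(c_ug_ml), 1)          # ln(C) = ln(C0) - k*t
k = -slope                                                 # elimination rate constant (1/h)
C0 = np.exp(lnC0)                                          # back-extrapolated concentration

Vd = dose_ug / C0                                          # apparent volume of distribution (mL)
CL = k * Vd                                                # clearance (mL/h)
t_half = np.log(2) / k                                     # elimination half-life (h)

print(f"Vd = {Vd:.1f} mL, CL = {CL:.2f} mL/h, t1/2 = {t_half:.1f} h")
```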
Algal Energy Conversion and Capture
NASA Astrophysics Data System (ADS)
Hazendonk, P.
2015-12-01
We address the potential for energy conversion and capture for: energy generation; reduction in energy use; reduction in greenhouse gas emissions; remediation of water and air pollution; protection and enhancement of soil fertility. These processes have the potential to sequester carbon at scales that may have global impact. Energy conversion and capture strategies evaluate energy use and production from agriculture, urban areas and industries, and apply existing and emerging technologies to reduce and recapture energy embedded in waste products. The basis of biocrude production from microalgal feedstocks: 1) The nutrients from the liquid fraction of waste streams are concentrated and fed into photobioreactors (essentially large vessels in which microalgae are grown) along with CO2 from the flue gases of downstream processes. 2) The algae are processed to remove high-value products such as proteins and beta-carotenes. The advantage of algae feedstocks is that their biomass productivity is 30-50 times that of land-based crops, and the remaining biomass contains minimal components that are difficult to convert to biocrude. 3) The remaining biomass undergoes hydrothermal liquefaction to produce biocrude and biochar. The flue gases of this process can be used to produce electricity (fuel cell) and subsequently fed back into the photobioreactor. The thermal energy required for this process is small, hence readily obtained from solar-thermal sources; furthermore, no drying or preprocessing is required, keeping the energy overhead extremely small. 4) The biocrude can be upgraded and refined like conventional crude oil, creating a range of liquid fuels. In principle this process can be applied from the farm scale to the municipal scale. Overall, our primary food production is too dependent on fossil fuels. Energy conversion and capture can make food production sustainable.
Prevention of preterm birth: harnessing science to address the global epidemic.
Rubens, Craig E; Sadovsky, Yoel; Muglia, Louis; Gravett, Michael G; Lackritz, Eve; Gravett, Courtney
2014-11-12
Preterm birth is a leading cause of infant morbidity and mortality worldwide, but current interventions to prevent prematurity are largely ineffective. Preterm birth is increasingly recognized as an outcome that can result from a variety of pathological processes. Despite current research efforts, the mechanisms underlying these processes remain poorly understood and are influenced by a range of biological and environmental factors. Research with modern techniques is needed to understand the mechanisms responsible for preterm labor and birth and identify targets for diagnostic and therapeutic solutions. This review evaluates the state of reproductive science relevant to understanding the causes of preterm birth, identifies potential targets for prevention, and outlines challenges and opportunities for translating research findings into effective interventions. Copyright © 2014, American Association for the Advancement of Science.
Massive blow-out craters formed by hydrate-controlled methane expulsion from the Arctic seafloor
NASA Astrophysics Data System (ADS)
Andreassen, K.; Hubbard, A.; Winsborrow, M.; Patton, H.; Vadakkepuliyambatta, S.; Plaza-Faverola, A.; Gudlaugsson, E.; Serov, P.; Deryabin, A.; Mattingsdal, R.; Mienert, J.; Bünz, S.
2017-06-01
Widespread methane release from thawing Arctic gas hydrates is a major concern, yet the processes, sources, and fluxes involved remain unconstrained. We present geophysical data documenting a cluster of kilometer-wide craters and mounds from the Barents Sea floor associated with large-scale methane expulsion. Combined with ice sheet/gas hydrate modeling, our results indicate that during glaciation, natural gas migrated from underlying hydrocarbon reservoirs and was sequestered extensively as subglacial gas hydrates. Upon ice sheet retreat, methane from this hydrate reservoir concentrated in massive mounds before being abruptly released to form craters. We propose that these processes were likely widespread across past glaciated petroleum provinces and that they also provide an analog for the potential future destabilization of subglacial gas hydrate reservoirs beneath contemporary ice sheets.
Approaches for in silico finishing of microbial genome sequences
Kremer, Frederico Schmitt; McBride, Alan John Alexander; Pinto, Luciano da Silva
2017-01-01
The introduction of next-generation sequencing (NGS) had a significant effect on the availability of genomic information, leading to an increase in the number of sequenced genomes from a large spectrum of organisms. Unfortunately, due to the limitations of short-read sequencing platforms, most of these newly sequenced genomes remained as “drafts”, incomplete representations of the whole genetic content. Previous genome sequencing studies indicated that finishing a genome sequenced by NGS, even for bacteria, may require additional sequencing to fill the gaps, making the entire process very expensive. As such, several in silico approaches have been developed to optimize genome assemblies and facilitate the finishing process. The present review aims to explore some free (open source, in many cases) tools that are available to facilitate genome finishing. PMID:28898352
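As a small illustration of the bookkeeping that motivates such tools, the sketch below inventories the runs of N that remain in a draft assembly so they can be targeted for closure. The input file name is a placeholder, and real finishing pipelines do far more (read mapping, local reassembly, scaffolding).

```python
# A minimal sketch: list gap (N-run) coordinates per scaffold in a draft assembly.
import re
from collections import defaultdict

def read_fasta(path):
    """Yield (name, sequence) pairs from a FASTA file."""
    name, chunks = None, []
    with open(path) as handle:
        for line in handle:
            line = line.strip()
            if line.startswith(">"):
                if name is not None:
                    yield name, "".join(chunks)
                name, chunks = line[1:].split()[0], []
            else:
                chunks.append(line)
        if name is not None:
            yield name, "".join(chunks)

gaps = defaultdict(list)
for contig, sequence in read_fasta("draft_assembly.fasta"):  # placeholder path
    for match in re.finditer(r"[Nn]+", sequence):
        gaps[contig].append((match.start() + 1, match.end()))  # 1-based coordinates

total = sum(len(spans) for spans in gaps.values())
print(f"{total} gaps across {len(gaps)} scaffolds")
for contig, spans in gaps.items():
    for start, end in spans:
        print(f"{contig}\t{start}\t{end}\t{end - start + 1} bp")
```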
Label-assisted mass spectrometry for the acceleration of reaction discovery and optimization
NASA Astrophysics Data System (ADS)
Cabrera-Pardo, Jaime R.; Chai, David I.; Liu, Song; Mrksich, Milan; Kozmin, Sergey A.
2013-05-01
The identification of new reactions expands our knowledge of chemical reactivity and enables new synthetic applications. Accelerating the pace of this discovery process remains challenging. We describe a highly effective and simple platform for screening a large number of potential chemical reactions in order to discover and optimize previously unknown catalytic transformations, thereby revealing new chemical reactivity. Our strategy is based on labelling one of the reactants with a polyaromatic chemical tag, which selectively undergoes a photoionization/desorption process upon laser irradiation, without the assistance of an external matrix, and enables rapid mass spectrometric detection of any products originating from such labelled reactants in complex reaction mixtures without any chromatographic separation. This method was successfully used for high-throughput discovery and subsequent optimization of two previously unknown benzannulation reactions.
URANIUM RECOVERY AND PURIFICATION PROCESS AND PRODUCTION OF HIGH PURITY URANIUM TETRAFLUORIDE
Bailes, R.H.; Long, R.S.; Grinstead, R.R.
1957-09-17
A process is described wherein an anionic exchange technique is employed to separate uranium from a large variety of impurities. Very efficient and economical purification of contaminated uranium can be achieved by treatment of the contaminated uranium to produce a solution containing a high concentration of chloride. Under these conditions the uranium exists as an anionic chloride complex. Then the uranium chloride complex is adsorbed from the solution on an anionic exchange resin, whereby a portion of the impurities remain in the solution and others are retained with the uranium by the resin. The adsorbed impurities are then removed by washing the resin with pure concentrated hydrochloric acid, after which operation the uranium is eluted with pure water, yielding an acidic uranyl chloride solution of high purity.
Microfluidic model of the platelet-generating organ: beyond bone marrow biomimetics
Blin, Antoine; Le Goff, Anne; Magniez, Aurélie; Poirault-Chassac, Sonia; Teste, Bruno; Sicot, Géraldine; Nguyen, Kim Anh; Hamdi, Feriel S.; Reyssat, Mathilde; Baruch, Dominique
2016-01-01
We present a new, rapid method for producing blood platelets in vitro from cultured megakaryocytes based on a microfluidic device. This device consists of a wide array of VWF-coated micropillars. Such pillars act as anchors on megakaryocytes, allowing them to remain trapped in the device and subjected to hydrodynamic shear. The combined effect of anchoring and shear induces the elongation of megakaryocytes and finally their rupture into platelets and proplatelets. This process was observed with megakaryocytes from different origins and found to be robust. This original bioreactor design makes it possible to process megakaryocytes at high throughput (millions per hour). Since platelets are produced in such a large amount, their extensive biological characterisation is possible and shows that platelets produced in this bioreactor are functional. PMID:26898346
A critical role for PDGFRα signaling in medial nasal process development.
He, Fenglei; Soriano, Philippe
2013-01-01
The primitive face is composed of neural crest cell (NCC) derived prominences. The medial nasal processes (MNP) give rise to the upper lip and vomeronasal organ, and are essential for normal craniofacial development, but the mechanism of MNP development remains largely unknown. PDGFRα signaling is known to be critical for NCC development and craniofacial morphogenesis. In this study, we show that PDGFRα is required for MNP development by maintaining the migration of progenitor neural crest cells (NCCs) and the proliferation of MNP cells. Further investigations reveal that PI3K/Akt and Rac1 signaling mediate PDGFRα function during MNP development. We thus establish PDGFRα as a novel regulator of MNP development and elucidate the roles of its downstream signaling pathways at cellular and molecular levels.
Mathaes, Roman; Mahler, Hanns-Christian; Roggo, Yves; Huwyler, Joerg; Eder, Juergen; Fritsch, Kamila; Posset, Tobias; Mohl, Silke; Streubel, Alexander
2016-01-01
Capping equipment used in good manufacturing practice manufacturing features different designs and a variety of adjustable process parameters. The overall capping result is a complex interplay of the different capping process parameters and is insufficiently described in the literature. It remains poorly studied how the different capping equipment designs and capping equipment process parameters (e.g., pre-compression force, capping plate height, turntable rotating speed) contribute to the final residual seal force of a sealed container closure system and its relation to container closure integrity and other drug product quality parameters. Stopper compression measured by computer tomography correlated to residual seal force measurements. In our studies, we used different container closure system configurations from different good manufacturing practice drug product fill & finish facilities to investigate the influence of differences in primary packaging, that is, vial size and rubber stopper design, on the capping process and the capped drug product. In addition, we compared two large-scale good manufacturing practice capping equipment setups and different capping equipment settings and their impact on product quality and integrity, as determined by residual seal force. The capping plate-to-plunger distance had a major influence on the obtained residual seal force values of a sealed vial, whereas the capping pre-compression force and the turntable rotation speed showed only a minor influence on the residual seal force of a sealed vial. Capping process parameters could not easily be transferred from capping equipment of different manufacturers. However, the residual seal force tester did provide a valuable tool to compare the capping performance of different capping equipment. No vial showed any leakage greater than 10⁻⁸ mbar L/s as measured by a helium mass spectrometry system, suggesting that container closure integrity was warranted in the residual seal force range tested for the tested container closure systems. © PDA, Inc. 2016.
NASA Astrophysics Data System (ADS)
Wang, Haixia; Suo, Tongchuan; Wu, Xiaolin; Zhang, Yue; Wang, Chunhua; Yu, Heshui; Li, Zheng
2018-03-01
The control of batch-to-batch quality variations remains a challenging task for pharmaceutical industries, e.g., traditional Chinese medicine (TCM) manufacturing. One difficult problem is to produce pharmaceutical products with consistent quality from raw material of large quality variations. In this paper, an integrated methodology combining near infrared spectroscopy (NIRS) and dynamic predictive modeling is developed for the monitoring and control of the batch extraction process of licorice. With the spectral data in hand, the initial state of the process is firstly estimated with a state-space model to construct a process monitoring strategy for the early detection of variations induced by the initial process inputs such as raw materials. Secondly, the quality property of the end product is predicted at mid-course of the extraction process with a partial least squares (PLS) model. The batch-end-time (BET) is then adjusted accordingly to minimize the quality variations. In conclusion, our study shows that, with the help of dynamic predictive modeling, NIRS can offer past and future information about the process, which enables more accurate monitoring and control of process performance and product quality.
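A minimal sketch of the PLS step described above is given below: predict an end-of-batch quality attribute from mid-course NIR spectra. The synthetic spectra and the number of latent variables are illustrative assumptions; the paper's state-space monitoring model and the batch-end-time adjustment logic are not reproduced here.

```python
# A minimal sketch of PLS-based end-quality prediction from mid-course spectra,
# using synthetic data in place of real NIR measurements.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_batches, n_wavelengths = 60, 200
X = rng.normal(size=(n_batches, n_wavelengths))            # mid-course spectra
true_loadings = rng.normal(size=n_wavelengths)
y = X @ true_loadings * 0.05 + rng.normal(scale=0.1, size=n_batches)  # end quality

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5)   # latent variables; chosen by cross-validation in practice
pls.fit(X_train, y_train)
print("R^2 on held-out batches:", round(pls.score(X_test, y_test), 3))

# A batch whose mid-course prediction drifts from the target quality could then
# trigger an earlier or later batch-end-time (BET), which is the control idea above.
```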
Komatsu, G.; Dohm, J.M.; Hare, T.M.
2004-01-01
Large-scale tectonomagmatic complexes are common on Earth and Mars. Many of these complexes are created or at least influenced by mantle processes, including a wide array of plume types ranging from superplumes to mantle plumes. Among the most prominent complexes, the Mongolian plateau on Earth and the Tharsis bulge on Mars share remarkable similarities in terms of large domal uplifted areas, great rift canyon systems, and widespread volcanism on their surfaces. Water has also played an important role in the development of the two complexes. In general, atmospheric and surface water play a bigger role in the development of the present-day Mongolian plateau than for the Tharsis bulge, as evidenced by highly developed drainages and thick accumulation of sediments in the basins of the Baikal rift system. On the Tharsis bulge, however, water appears to have remained as ground ice except during periods of elevated magmatic activity. Glacial and periglacial processes are well documented for the Mongolian plateau and are also reported for parts of the Tharsis bulge. Ice-magma interactions, which are represented by the formation of subice volcanoes in parts of the Mongolian plateau region, have been reported for the Valles Marineris region of Mars. The complexes are also characterized by cataclysmic floods, but their triggering mechanism may differ: mainly ice-dam failures for the Mongolian plateau and outburst of groundwater for the Tharsis bulge, probably by magma-ice interactions, although ice-dam failures within the Valles Marineris region cannot be ruled out as a possible contributor. Comparative studies of the Mongolian plateau and Tharsis bulge provide excellent opportunities for understanding surface manifestations of plume-driven processes on terrestrial planets and how they interact with hydro-cryospheres. © 2004 Geological Society of America.
Sharma, Hitt J; Patil, Vishwanath D; Lalwani, Sanjay K; Manglani, Mamta V; Ravichandran, Latha; Kapre, Subhash V; Jadhav, Suresh S; Parekh, Sameer S; Ashtagi, Girija; Malshe, Nandini; Palkar, Sonali; Wade, Minal; Arunprasath, T K; Kumar, Dinesh; Shewale, Sunil D
2012-01-11
Hib vaccine can be easily incorporated in the EPI vaccination schedule as the immunization schedule of Hib is similar to that of DTP vaccine. To meet the global demand for Hib vaccine, SIIL scaled up the Hib conjugate manufacturing process. This study was conducted in Indian infants to assess and compare the immunogenicity and safety of the DTwP-HB+Hib (Pentavac(®)) vaccine of SIIL manufactured at large scale with the 'same vaccine' manufactured at a smaller scale. 720 infants aged 6-8 weeks were randomized (2:1 ratio) to receive 0.5 ml of Pentavac(®) vaccine from two different lots, one produced by the scaled-up process and the other by a small-scale process. Serum samples obtained before and at one month after the 3rd dose of vaccine from both groups were tested for IgG antibody response by ELISA and compared to assess non-inferiority. Neither immunological interference nor increased reactogenicity was observed in either of the vaccine groups. All infants developed protective antibody titres to diphtheria, tetanus and Hib disease. For hepatitis B antigen, one child from each group remained sero-negative. The response to pertussis was 88% in the large-scale group vis-à-vis 87% in the small-scale group. Non-inferiority was concluded for all five components of the vaccine. No serious adverse event was reported in the study. The scaled-up vaccine achieved a comparable response in terms of safety and immunogenicity to the small-scale vaccine and therefore can be easily incorporated in the routine childhood vaccination programme. Copyright © 2011 Elsevier Ltd. All rights reserved.
Membrane Assembly during the Infection Cycle of the Giant Mimivirus
Mutsafi, Yael; Shimoni, Eyal; Shimon, Amir; Minsky, Abraham
2013-01-01
Although extensively studied, the structure, cellular origin and assembly mechanism of internal membranes during viral infection remain unclear. By combining diverse imaging techniques, including the novel Scanning-Transmission Electron Microscopy tomography, we elucidate the structural stages of membrane biogenesis during the assembly of the giant DNA virus Mimivirus. We show that this elaborate multistage process occurs at a well-defined zone localized at the periphery of large viral factories that are generated in the host cytoplasm. Membrane biogenesis is initiated by fusion of multiple vesicles, ∼70 nm in diameter, that apparently derive from the host ER network and enable continuous supply of lipid components to the membrane-assembly zone. The resulting multivesicular bodies subsequently rupture to form large open single-layered membrane sheets from which viral membranes are generated. Membrane generation is accompanied by the assembly of icosahedral viral capsids in a process involving the hypothetical major capsid protein L425 that acts as a scaffolding protein. The assembly model proposed here reveals how multiple Mimivirus progeny can be continuously and efficiently generated and underscores the similarity between the infection cycles of Mimivirus and Vaccinia virus. Moreover, the membrane biogenesis process indicated by our findings provides new insights into the pathways that might mediate assembly of internal viral membranes in general. PMID:23737745
NASA Astrophysics Data System (ADS)
Thomazo, Christophe; Buoncristiani, Jean-Francois; Vennin, Emmanuelle; Pellenard, Pierre; Cocquerez, Theophile; Mugnier, Jean L.; Gérard, Emmanuelle
2017-09-01
Cold climate carbonates can be used as paleoclimatic proxies. The mineralogy and isotopic composition of subglacially precipitated carbonate crusts provide insights into the subglacial conditions and processes occurring at the meltwater-basement rock interface of glaciers. This study documents such crusts discovered on the lee side of a gneissic roche moutonnée at the terminus of the Bossons glacier in the Mont Blanc Massif area (France). The geological context and mineralogical investigations suggest that the Ca used for the precipitation of large crystals of radial fibrous sparite observed in these crusts originated from subglacial chemical weathering of Ca-bearing minerals of the local bedrock (plagioclase and amphibole). Measurements of the carbon and oxygen isotope compositions in the crusts indicate precipitation at, or near to, equilibrium with the basal meltwater under open system conditions during refreezing processes. The homogeneous and low carbonate δ13C values (ca. -11.3‰) imply a large contribution of soil organic carbon to the Bossons subglacial meltwater carbon reservoir at the time of deposition. In addition, organic remains trapped within the subglacially precipitated carbonate crusts give an age of deposition around 6500 years cal BP suggesting that the Mid-Holocene climatic and pedological optima are archived in the Bossons glacier carbonate crusts.
Ratanapariyanuch, Kornsulee; Tyler, Robert T; Shim, Youn Young; Reaney, Martin Jt
2012-01-12
Large volumes of treated process water are required for protein extraction. Evaporation of this water contributes greatly to the energy consumed in enriching protein products. Thin stillage remaining from ethanol production is available in large volumes and may be suitable for extracting protein-rich materials. In this work protein was extracted from ground defatted oriental mustard (Brassica juncea (L.) Czern.) meal using thin stillage. Protein extraction efficiency was studied at pHs between 7.6 and 10.4 and salt concentrations between 3.4 × 10⁻² and 1.2 M. The optimum extraction conditions were pH 10.0 and 1.0 M NaCl. Napin and cruciferin were the most prevalent proteins in the isolate. The isolate exhibited high in vitro digestibility (74.9 ± 0.80%) and lysine content (5.2 ± 0.2 g/100 g of protein). No differences in the efficiency of extraction, SDS-PAGE profile, digestibility, lysine availability, or amino acid composition were observed between protein extracted with thin stillage and that extracted with NaCl solution. The use of thin stillage, in lieu of water, for protein extraction would decrease the energy requirements and waste disposal costs of the protein isolation and biofuel production processes.
Large-Scale SRM Screen of Urothelial Bladder Cancer Candidate Biomarkers in Urine.
Duriez, Elodie; Masselon, Christophe D; Mesmin, Cédric; Court, Magali; Demeure, Kevin; Allory, Yves; Malats, Núria; Matondo, Mariette; Radvanyi, François; Garin, Jérôme; Domon, Bruno
2017-04-07
Urothelial bladder cancer is a condition associated with high recurrence and substantial morbidity and mortality. Noninvasive urinary tests that would detect bladder cancer and tumor recurrence are required to significantly improve patient care. Over the past decade, numerous bladder cancer candidate biomarkers have been identified in the context of extensive proteomics or transcriptomics studies. To translate these findings into clinically useful biomarkers, the systematic evaluation of these candidates remains the bottleneck. Such evaluation involves large-scale quantitative LC-SRM (liquid chromatography-selected reaction monitoring) measurements, targeting hundreds of signature peptides by monitoring thousands of transitions in a single analysis. The design of highly multiplexed SRM analyses is driven by several factors: throughput, robustness, selectivity and sensitivity. Because of the complexity of the samples to be analyzed, some measurements (transitions) can suffer interference from coeluting isobaric species, resulting in biased or inconsistent estimated peptide/protein levels. Thus the assessment of the quality of SRM data is critical to allow flagging of these inconsistent data. We describe an efficient and robust method to process large SRM data sets, including the processing of the raw data, the detection of low-quality measurements, the normalization of the signals for each protein, and the estimation of protein levels. Using this methodology, a variety of proteins previously associated with bladder cancer have been assessed through the analysis of urine samples from a large cohort of cancer patients and corresponding controls in an effort to establish a priority list of the most promising candidates to guide subsequent clinical validation studies.
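As an illustration of two of the processing steps mentioned here, the sketch below flags transitions that disagree with the other transitions of the same protein (a symptom of interference) and summarises the remaining signal per protein. The column names, toy areas, and the 3-MAD cutoff are assumptions for illustration, not the authors' pipeline.

```python
# A minimal sketch of interference flagging and protein-level summarisation for SRM data.
import numpy as np
import pandas as pd

# toy long-format SRM table: one row per (protein, transition, sample)
data = pd.DataFrame({
    "protein":    ["P1"] * 6 + ["P2"] * 6,
    "transition": ["t1", "t2", "t3"] * 4,
    "sample":     ["s1"] * 3 + ["s2"] * 3 + ["s1"] * 3 + ["s2"] * 3,
    "area":       [1e5, 9e4, 5e6, 2e5, 1.8e5, 9e6, 3e4, 2.8e4, 3.1e4, 6e4, 5.5e4, 6.2e4],
})

data["log_area"] = np.log2(data["area"])
# residual of each transition from the median signal of its protein in that sample
median_signal = data.groupby(["protein", "sample"])["log_area"].transform("median")
data["residual"] = data["log_area"] - median_signal

# a transition whose residual is consistently extreme is a candidate interference
med_res = data.groupby("protein")["residual"].transform("median")
mad_res = data.groupby("protein")["residual"].transform(lambda r: (r - r.median()).abs().median())
data["flagged"] = (data["residual"] - med_res).abs() > 3 * mad_res

protein_level = (data[~data["flagged"]]
                 .groupby(["protein", "sample"])["log_area"].mean()
                 .rename("protein_log2_signal"))
print(data[["protein", "transition", "sample", "flagged"]])
print(protein_level)
```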
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Ruo-Yu; Rieger, F. M.; Aharonian, F. A., E-mail: ruoyu@mpi-hd.mpg.de, E-mail: frank.rieger@mpi-hd.mpg.de, E-mail: aharon@mpi-hd.mpg.de
The origin of the extended X-ray emission in the large-scale jets of active galactic nuclei (AGNs) poses challenges to conventional models of acceleration and emission. Although electron synchrotron radiation is considered the most feasible radiation mechanism, the formation of the continuous large-scale X-ray structure remains an open issue. As astrophysical jets are expected to exhibit some turbulence and shearing motion, we here investigate the potential of shearing flows to facilitate an extended acceleration of particles and evaluate its impact on the resultant particle distribution. Our treatment incorporates systematic shear and stochastic second-order Fermi effects. We show that for typical parameters applicable to large-scale AGN jets, stochastic second-order Fermi acceleration, which always accompanies shear particle acceleration, can play an important role in facilitating the whole process of particle energization. We study the time-dependent evolution of the resultant particle distribution in the presence of second-order Fermi acceleration, shear acceleration, and synchrotron losses using a simple Fokker–Planck approach and provide illustrations for the possible emergence of a complex (multicomponent) particle energy distribution with different spectral branches. We present examples for typical parameters applicable to large-scale AGN jets, indicating the relevance of the underlying processes for understanding the extended X-ray emission and the origin of ultrahigh-energy cosmic rays.
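The kind of one-dimensional Fokker-Planck evolution referred to above can be sketched numerically: momentum diffusion with a second-order-Fermi-like coefficient plus synchrotron-type losses, advanced with an explicit finite-difference scheme. The coefficients, grid, and time step are illustrative assumptions; shear acceleration and escape terms are omitted for brevity.

```python
# A minimal sketch of dN/dt = d/dg[ D(g) dN/dg ] + d/dg[ b g^2 N ] on a Lorentz-factor grid.
import numpy as np

n_bins = 200
gamma = np.linspace(1.0, 100.0, n_bins)        # Lorentz-factor grid
dg = gamma[1] - gamma[0]

D0, b = 0.5, 1e-3                              # toy diffusion and loss constants
D = D0 * gamma**2                              # momentum-space diffusion coefficient
loss = b * gamma**2                            # synchrotron-type loss rate

N = np.exp(-(gamma - 5.0) ** 2 / 2.0)          # initial quasi-monoenergetic bump

dt = 0.4 * dg**2 / D.max()                     # explicit stability limit
for _ in range(5000):
    # diffusion term: flux D dN/dg evaluated on cell faces, then differenced
    flux = 0.5 * (D[1:] + D[:-1]) * np.diff(N) / dg
    dNdt = np.zeros_like(N)
    dNdt[1:-1] = np.diff(flux) / dg
    # loss term: d/dg (loss * N), centered difference; particles drift to lower gamma
    dNdt[1:-1] += (loss[2:] * N[2:] - loss[:-2] * N[:-2]) / (2 * dg)
    N += dt * dNdt
    N[N < 0] = 0.0                             # guard against small negative values

print("mean Lorentz factor after evolution:", (gamma * N).sum() / N.sum())
```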
Turney, G.L.; Goerlitz, D.F.
1989-01-01
Gas Works Park, in Seattle, Washington, is located on the site of a coal and oil gasification plant that ceased operation in 1956. During operation, many types of wastes, including coal, tar, and oil, accumulated on site. The park soil is presently (1986) contaminated with compounds such as polynuclear aromatic hydrocarbons, volatile organic compounds, trace metals, and cyanide. Analyses of water samples from a network of observation wells in the park indicate that these compounds are also present in the groundwater. Polynuclear aromatic hydrocarbons and volatile organic compounds were identified in groundwater samples in concentrations as large as 200 mg/L. Concentrations of organic compounds were largest where groundwater was in contact with a nonaqueous phase liquid in the soil. Concentrations in groundwater were much smaller where no nonaqueous phase liquid was present, even if the groundwater was in contact with contaminated soils. This condition is attributed to weathering processes at the site, such as dissolution, volatilization, and biodegradation. Soluble, volatile, low-molecular-weight organic compounds are preferentially dissolved from the nonaqueous phase liquid into the groundwater. Where no nonaqueous phase liquid is present, only stained soils containing relatively insoluble, high-molecular-weight compounds remain; therefore, contaminant concentrations in the groundwater are much smaller. Concentrations of organic contaminants in the soils may still remain large. Values of specific conductance were as large as 5,280 microsiemens/cm, well above a background of 242 microsiemens/cm, suggesting large concentrations of minerals in the groundwater. Trace metal concentrations, however, were generally < 0.010 mg/L, and below limits of US EPA drinking water standards. Cyanide was present in groundwater samples from throughout the park, ranging in concentration from 0.01 to 8.6 mg/L. (Author's abstract)
On the permeation of large organic cations through the pore of ATP-gated P2X receptors
Harkat, Mahboubi; Peverini, Laurie; Dunning, Kate; Beudez, Juline; Martz, Adeline; Calimet, Nicolas; Specht, Alexandre; Cecchini, Marco; Chataigneau, Thierry; Grutter, Thomas
2017-01-01
Pore dilation is thought to be a hallmark of purinergic P2X receptors. The most commonly held view of this unusual process posits that under prolonged ATP exposure the ion pore expands in a striking manner from an initial small-cation conductive state to a dilated state, which allows the passage of larger synthetic cations, such as N-methyl-d-glucamine (NMDG+). However, this mechanism is controversial, and the identity of the natural large permeating cations remains elusive. Here, we provide evidence that, contrary to the time-dependent pore dilation model, ATP binding opens an NMDG+-permeable channel within milliseconds, with a conductance that remains stable over time. We show that the time course of NMDG+ permeability superimposes that of Na+ and demonstrate that the molecular motions leading to the permeation of NMDG+ are very similar to those that drive Na+ flow. We found, however, that NMDG+ “percolates” 10 times slower than Na+ in the open state, likely due to a conformational and orientational selection of permeating molecules. We further uncover that several P2X receptors, including those able to desensitize, are permeable not only to NMDG+ but also to spermidine, a large natural cation involved in ion channel modulation, revealing a previously unrecognized P2X-mediated signaling. Altogether, our data do not support a time-dependent dilation of the pore on its own but rather reveal that the open pore of P2X receptors is wide enough to allow the permeation of large organic cations, including natural ones. This permeation mechanism has considerable physiological significance. PMID:28442564
NASA Astrophysics Data System (ADS)
Hernandez, Charles; Drobinski, Philippe; Turquety, Solène
2015-10-01
Wildfires alter land cover, creating changes in the dynamic, vegetative, radiative, thermal and hydrological properties of the surface. However, how such drastic fire-induced changes and the age of the burnt scar affect small- and meso-scale atmospheric boundary-layer dynamics is largely unknown. These questions are relevant for process analysis, meteorological and air quality forecasting, but also for regional climate analysis. Such questions are addressed numerically in this study using the case of the Portugal wildfires in 2003 as a testbed. In order to study the effects of burnt scars, an ensemble of numerical simulations using the Weather Research and Forecasting modeling system (WRF) has been performed with different surface properties mimicking the surface state immediately after the fire, a few days after the fire and a few months after the fire. In order to investigate this issue in a seamless approach, the same modelling framework has been used with various horizontal resolutions of the model grid and land use, ranging from 3.5 km, which can be considered the typical resolution of state-of-the-art regional numerical weather prediction models, to 14 km, which is now the typical target resolution of regional climate models. The study shows that the combination of high surface heat fluxes over the burnt area, large differential heating with respect to the preserved surroundings and lower surface roughness produces very intense frontogenesis, with vertical velocity reaching a few meters per second. This powerful meso-scale circulation can pump more humid air from the surroundings not impacted by the wildfire and produce more cloudiness over the burnt area. The influence of soil temperature immediately after the wildfire ceases is mainly seen at night, as the boundary layer remains unstably stratified, and lasts only a few days. So the intensity of the induced meso-scale circulation decreases with time, even though it remains until full recovery of the vegetation. Finally, all these effects are simulated whatever the land cover and model resolution; they are thus robust processes in both regional climate simulations and process studies or short-term forecasts. However, the impact of burnt scars on the precipitation signal remains very uncertain, especially because low precipitation is at stake.
NASA Astrophysics Data System (ADS)
Horanyi, Mihaly; Szalay, Jamey
2017-10-01
The lunar regolith has been formed, and remains continually reworked, by the intermittent impacts of comets, asteroids, meteoroids, and the continual bombardment by interplanetary dust particles (IDP). Thick atmospheres protect Venus, Earth, and Mars, ablating the incoming IDPs into “shooting stars” that rarely reach the surface. However, the surfaces of airless bodies near 1 AU are directly exposed to the high-speed (>> 1 km/s) IDP impacts. The Moon is expected to be bombarded by 5×10³ kg/day of IDPs arriving with a characteristic speed of ~ 20 km/s. The IDP sources impacting the Moon at high latitudes remain largely uncharacterized due to the lack of optical and radar observations in the polar regions on Earth. These high-latitude sources have very large impact speeds in the range of 30 < v < 50 km/s; hence they are expected to have a significant effect on the lunar surface, including the removal and burial of volatile deposits in the lunar polar regions. Water is thought to be continually delivered to the Moon over geological timescales by water-bearing comets and asteroids, and produced continuously in situ by the impacts of solar wind protons on oxygen-rich minerals exposed at the surface. IDPs are an unlikely source of water due to their long UV exposure in the inner solar system, but their high-speed impacts can mobilize secondary ejecta dust particles, atoms and molecules, some with high enough speed to escape the Moon. Other surface processes that can lead to mobilization, transport and loss of water molecules and other volatiles include solar heating, photochemical processes, and solar wind sputtering. Since none of these are at work in permanently shadowed regions (PSR), dust impacts remain the dominant process dictating the evolution of volatiles in PSRs. The competing effects of dust impacts are: a) ejecta production leading to loss out of a PSR; b) gardening and overturning of the regolith; and c) the possible accumulation of impact ejecta, leading to the burial of the volatiles. This talk will summarize the expected effects of dust impacts on volatile accumulation in the lunar PSRs based on theoretical models, recent laboratory results, and observations by the LADEE spacecraft.
Marketing veterinary services.
Lee, David E
2006-03-01
Marketing is a holistic process that goes far beyond a Yellow Page advertisement or a glossy brochure. A thorough evaluation of a market before entry, including best and worst case scenarios, is critical to making good investments. Veterinarians are fortunate to have a market that is largely protected by barriers to entry and characterized by reasonably high rates of return given minimal risk. Our market base continues to expand and, overall, remains fairly price insensitive. The extent to which a practice can align its capabilities with a product mix that ideally meets its clients' needs will ultimately determine its success.
Jupiter Icy Moons Orbiter Mission design overview
NASA Technical Reports Server (NTRS)
Sims, Jon A.
2006-01-01
An overview of the design of a possible mission to three large moons of Jupiter (Callisto, Ganymede, and Europa) is presented. The potential Jupiter Icy Moons Orbiter (JIMO) mission uses ion thrusters powered by a nuclear reactor to transfer from Earth to Jupiter and enter a low-altitude science orbit around each of the moons. The combination of very limited control authority and significant multibody dynamics resulted in some aspects of the trajectory design being different than for any previous mission. The results of several key trades, innovative trajectory types and design processes, and remaining issues are presented.
An Evidence-Based Practical Approach to Pediatric Otolaryngology in the Developing World.
Belcher, Ryan H; Molter, David W; Goudy, Steven L
2018-06-01
Despite humanitarian otolaryngology groups traveling in record numbers to resource-limited areas to treat pediatric otolaryngology disease processes and train local providers, there remains a large burden of unmet needs. From an otolaryngology standpoint, only a meager amount of published information comes from the developing world. As would be expected, the little information that does come involves some of the most common pediatric otolaryngology diseases and surgical burdens, including childhood hearing loss, otitis media, adenotonsillectomies, airway obstructions requiring tracheostomies, foreign body aspirations, and craniomaxillofacial surgeries, including cleft lip and palate. Copyright © 2018 Elsevier Inc. All rights reserved.
Natural versus anthropogenic factors affecting low-level cloud albedo over the North Atlantic
NASA Technical Reports Server (NTRS)
Falkowski, Paul G.; Kim, Yongseung; Kolber, Zbigniew; Wilson, Cara; Wirick, Creighton; Cess, Robert
1992-01-01
Cloud albedo plays a key role in regulating earth's climate. Cloud albedo depends on column-integrated liquid water content and the density of cloud condensation nuclei, which consists primarily of submicrometer-sized aerosol sulfate particles. A comparison of two independent satellite data sets suggests that, although anthropogenic sulfate emissions may enhance cloud albedo immediately adjacent to the east coast of the United States, over the central North Atlantic Ocean the variability in albedo can be largely accounted for by natural marine and atmospheric processes that probably have remained relatively constant since the beginning of the industrial revolution.
Kazantzis, Nikolaos; Brownfield, Nicole R; Mosely, Livia; Usatoff, Alexsandra S; Flighty, Andrew J
2017-12-01
Treatment adherence has posed a substantial challenge not only for patients but also for the health profession for many decades. The last 5 years have witnessed significant attention toward adherence with cognitive behavioral therapy (CBT) homework for anxiety and depressive disorders, and adherence assessment methods have diversified. However, a large component of the adherence process remains unassessed in CBT, with patient effort, engagement, and the known role of treatment appraisals and beliefs necessitating the pursuit of improved adherence assessment methods. Copyright © 2017 Elsevier Inc. All rights reserved.
Handbook of solar-terrestrial data systems, version 1
NASA Technical Reports Server (NTRS)
1991-01-01
The interaction between the solar wind and the earth's magnetic field creates a large magnetic cavity which is termed the magnetosphere. Energy derived from the solar wind is ultimately dissipated by particle acceleration-precipitation and Joule heating in the magnetosphere-ionosphere. The rate of energy dissipation is highly variable, with peak levels during geomagnetic storms and substorms. The degree to which solar wind and magnetospheric conditions control the energy dissipation processes remains one of the major outstanding questions in magnetospheric physics. A conference on Solar Wind-Magnetospheric Coupling was convened to discuss these issues and this handbook is the result.
The neural bases of cognitive processes in gambling disorder
Potenza, Marc N.
2014-01-01
Functional imaging is offering powerful new tools to investigate the neurobiology of cognitive functioning in people with and without psychiatric conditions like gambling disorder. Based on similarities between gambling and substance-use disorders in neurocognitive and other domains, gambling disorder has recently been classified in DSM-5 as a behavioral addiction. Despite the advances in understanding, multiple unanswered questions remain about the pathophysiology underlying gambling disorder, and the promise of translating the neurobiological understanding into treatment advances remains largely unrealized. Here we review the neurocognitive underpinnings of gambling disorder with an eye towards improving prevention, treatment and policy efforts. PMID:24961632
Bacteriophage Applications for Food Production and Processing
Moye, Zachary D.; Woolston, Joelle; Sulakvelidze, Alexander
2018-01-01
Foodborne illnesses remain a major cause of hospitalization and death worldwide despite many advances in food sanitation techniques and pathogen surveillance. Traditional antimicrobial methods, such as pasteurization, high pressure processing, irradiation, and chemical disinfectants are capable of reducing microbial populations in foods to varying degrees, but they also have considerable drawbacks, such as a large initial investment, potential damage to processing equipment due to their corrosive nature, and a deleterious impact on organoleptic qualities (and possibly the nutritional value) of foods. Perhaps most importantly, these decontamination strategies kill indiscriminately, including many—often beneficial—bacteria that are naturally present in foods. One promising technique that addresses several of these shortcomings is bacteriophage biocontrol, a green and natural method that uses lytic bacteriophages isolated from the environment to specifically target pathogenic bacteria and eliminate them from (or significantly reduce their levels in) foods. Since the initial conception of using bacteriophages on foods, a substantial number of research reports have described the use of bacteriophage biocontrol to target a variety of bacterial pathogens in various foods, ranging from ready-to-eat deli meats to fresh fruits and vegetables, and the number of commercially available products containing bacteriophages approved for use in food safety applications has also been steadily increasing. Though some challenges remain, bacteriophage biocontrol is increasingly recognized as an attractive modality in our arsenal of tools for safely and naturally eliminating pathogenic bacteria from foods. PMID:29671810
How pristine is the interior of the comet 67P/Churyumov-Gerasimenko?
NASA Astrophysics Data System (ADS)
Capria, Maria Teresa; Capaccioni, Fabrizio; Filacchione, Gianrico; Tosi, Federico; De Sanctis, Maria Cristina; Mottola, Stefano; Ciarniello, Mauro; Formisano, Michelangelo; Longobardo, Andrea; Migliorini, Alessandra; Palomba, Ernesto; Raponi, Andrea; Kührt, Ekkehard; Bockelée-Morvan, Dominique; Erard, Stéphane; Leyrat, Cedric; Zinzi, Angelo
2017-07-01
Comets are usually considered to be the most primitive bodies in the Solar System. The level of truth of this paradigm, however, is a matter of debate, especially if by primitive we mean that they represent a sample of intact, unprocessed material. We now have the possibility of analysing the comet 67P/Churyumov-Gerasimenko with an unprecedented level of detail, but its interior remains largely unprobed and unknown. The questions we address in this paper concern the depth of the processed layers, and whether the comet nucleus, under these processed layers, is really representative of the original material. We applied the Rome model for the thermal evolution and differentiation of nuclei to give an estimation of the evolution and depth of the active layers and of the interplay between the erosion process and the penetration of the heat wave. In order to characterize the illumination regime and the activity on the nucleus, two locations with very different illumination histories were chosen for the simulation. For both locations, the bulk of the activity tends to be concentrated around the perihelion time, giving rise to a high erosion rate. As a consequence, the active layers tend to remain close to the surface, and the interior of the comet, below a layer of few tens of centimetres, can be considered as pristine.
NASA Astrophysics Data System (ADS)
Hammer, Sebastian; Mangold, Hans-Moritz; Nguyen, Ariana E.; Martinez-Ta, Dominic; Naghibi Alvillar, Sahar; Bartels, Ludwig; Krenner, Hubert J.
2018-02-01
We review the fully scalable fabrication of a large array of hybrid molybdenum disulfide (MoS2)-silicon dioxide (SiO2) one-dimensional (1D), freestanding photonic-crystal cavities (PCCs) capable of enhancement of the MoS2 photoluminescence (PL) at the narrow cavity resonance. As demonstrated in our prior work [S. Hammer et al., Sci. Rep. 7, 7251 (2017)], geometric mode tuning over the wide spectral range of MoS2 PL can be achieved by changing the PC period. In this contribution, we provide a step-by-step description of the fabrication process and give additional detailed information on the degradation of MoS2 by XeF2 vapor. We avoid potential damage of the MoS2 monolayer during the crucial XeF2 etch by refraining from stripping the electron beam (e-beam) resist after dry etching of the photonic crystal pattern. The remaining resist on top of the samples encapsulates and protects the MoS2 film during the entire fabrication process. Although the thickness of the remaining resist strongly depends on the fabrication process, the resulting encapsulation of the MoS2 layer improves the confinement of the optical modes and gives rise to a potential enhancement of the light-matter interaction.
GABA abnormalities in schizophrenia: a methodological review of in vivo studies.
Taylor, Stephan F; Tso, Ivy F
2015-09-01
Abnormalities of GABAergic interneurons are some of the most consistent findings from post-mortem studies of schizophrenia. However, linking these molecular deficits with in vivo observations in patients - a critical goal in order to evaluate interventions that would target GABAergic deficits - presents a challenge. Explanatory models have been developed based on animal work and the emerging experimental literature in schizophrenia patients. This literature includes: neuroimaging ligands to GABA receptors, magnetic resonance spectroscopy (MRS) of GABA concentration, transcranial magnetic stimulation of cortical inhibitory circuits and pharmacologic probes of GABA receptors to dynamically challenge the GABA system, usually in combination with neuroimaging studies. Pharmacologic challenges have elicited behavioral changes, and preliminary studies of therapeutic GABAergic interventions have been conducted. This article critically reviews the evidence for GABAergic dysfunction from each of these areas. These methods remain indirect measures of GABAergic function, and a broad array of dysfunction is linked with the putative GABAergic measures, including positive symptoms, cognition, emotion, motor processing and sensory processing, covering diverse brain areas. Measures of receptor binding have not shown replicable group differences in binding, and MRS assays of GABA concentration have yielded equivocal evidence of large-scale alteration in GABA concentration. Overall, the experimental base remains sparse, and much remains to be learned about the role of GABAergic interneurons in healthy brains. Challenges with pharmacologic and functional probes show promise, and may yet enable a better characterization of GABAergic deficits in schizophrenia. Copyright © 2014 Elsevier B.V. All rights reserved.
Smith, Charlotte; Cook, Rachel; Rohleder, Poul
2017-02-01
This study sought to elucidate the process through which people living with HIV (PLWH) in the United Kingdom disclose their status to an intimate partner (IP). A qualitative cross-sectional survey design was used. A total of 95 PLWH took part. They were presented with a series of open-ended questions enquiring into their last experience of disclosing to an IP. The data were analysed using thematic analysis. Disclosure became a salient issue when the discloser acknowledged their relationship as meaningful. A decision to tell was mostly made to build a foundation for the evolving relationship. Once the decision was made, it was enacted via one of two mechanisms (self-initiated or opportunistic) and partners' reported reactions fell within one of four main reaction types. In the long-term for couples who remained together, disclosure was understood to have brought them closer. However, for both those whose relationships remained intact, and for those whose relationship had since broken down, sexual difficulties associated with being in a sero-discordant partnership pervaded. At a personal level, the experience resulted in increased confidence in living with the diagnosis, and an increased sense of disclosure mastery. Disclosure is a highly nuanced process. In particular, it was found to be largely characterized by the IP relational context in which it was occurring. The clinical and theoretical implications of these findings are discussed. In particular, these findings highlight a need for the provision of long-term support to PLWH in negotiating their relationships throughout the process. Statement of contribution What is already known on this subject? Disclosing a HIV+ status to an intimate partner (IP) is key in addressing the global HIV epidemic, social stigma, and the psychological and physical well-being of people living with the condition. It is increasingly recognized that HIV disclosure is a process, rather than an event. Researchers have begun to initiate a line of research into a process-based theoretical account of disclosure. What does this study add? This study provided a nuanced account of the disclosure process within an IP relationship. The process was found to be largely influenced by the discloser's subjective experience of the intimate partnership. The findings point to a need for a disclosure intervention that supports couples more longitudinally, particularly in negotiating the emotional and sexual difficulties that often arise upon disclosing. © 2016 The British Psychological Society.
Advances in Dyslexia Genetics-New Insights Into the Role of Brain Asymmetries.
Paracchini, S; Diaz, R; Stein, J
2016-01-01
Dyslexia is a common condition affecting up to 10% of school-aged children. There is strong evidence that genetics plays an important role in dyslexia, although its genetic architecture is expected to be complex in nature. Few specific susceptibility factors have been identified so far, but their functional characterization has provided novel insights into the biology of dyslexia. In particular, they point to an unexpected role of candidate genes for dyslexia in the biology of cilia, cellular organelles required in many processes including the establishment of left-right asymmetries early in development. This observation has brought back into the spotlight the old idea of a link between dyslexia and handedness. Yet much of the genetics contributing to dyslexia remains unexplained. The lack of biological markers, clear diagnostic criteria, and homogeneous assessment strategies is among the factors preventing the collection of cohorts sufficiently powered for large-scale genetic studies. While the technology and methods to generate and handle large-scale data have reached unprecedented potential, the main challenge remains in establishing universal guidelines to collect suitable phenotype information across independent studies. These difficulties reflect the complex nature of dyslexia, which is highly heterogeneous and often co-occurs with other neurodevelopmental disorders. Copyright © 2016 Elsevier Inc. All rights reserved.
Organic contamination of ground water at Gas Works Park, Seattle, Washington
Turney, G.L.; Goerlitz, D.F.
1990-01-01
Gas Works Park, in Seattle, Washington, is located on the site of a coal and oil gasification plant that ceased operation in 1956. During operation, many types of wastes, including coal, tar, and oil, accumulated on-site. The park soil is currently (1986) contaminated with compounds such as polynuclear aromatic hydrocarbons, volatile organic compounds, trace metals, and cyanide. Analyses of water samples from a network of observation wells in the park indicate that these compounds are also present in the ground water. Polynuclear aromatic hydrocarbons and volatile organic compounds were identified in ground water samples in concentrations as large as 200 mg/L. Concentrations of organic compounds were largest where ground water was in contact with a non-aqueous phase liquid in the soil. Where no non-aqueous phase liquid was present, concentrations were much smaller, even if the ground water was in contact with contaminated soils. This condition is attributed to weathering processes in which soluble, low-molecular-weight organic compounds are preferentially dissolved from the non-aqueous phase liquid into the ground water. Where no non-aqueous phase liquid is present, only stained soils containing relatively insoluble, high-molecular-weight compounds remain. Concentrations of organic contaminants in the soils may still remain large.
Vitamin D and Diabetic Complications: True or False Prophet?
Alam, Uazman; Arul-Devah, Vilashini; Javed, Saad; Malik, Rayaz A
2016-03-01
Vitamin D deficiency is now recognized as a condition of increasing prevalence worldwide. Vitamin D has an established role in calcium and bone metabolism; however, more recently, associations between vitamin D deficiency and the risk of developing diabetes, diabetes complications, and cardiovascular disease have all been acknowledged. The vitamin D receptor is ubiquitously expressed, and experimental, in vitro, and in vivo studies strongly suggest a role in regulating the transcription of multiple genes beyond calcium homeostasis. These include antiproliferative, immunomodulatory, and angiogenic effects, inhibition of the renin-angiotensin-aldosterone system, and neurotrophic factor expression. Observational studies report a strong association between vitamin D deficiency and cardiovascular and metabolic disorders; however, there remains a paucity of large long-term randomized clinical trials showing a benefit with treatment. An increasing body of literature suggests a possible pathogenetic role of vitamin D in the long-term complications of diabetes, and vitamin D deficiency may also exacerbate symptoms of painful diabetic peripheral neuropathy. It remains unknown if supplementation of vitamin D to normal or non-deficient levels alters pathogenetic processes related to diabetic microvascular complications. With the high prevalence of vitamin D deficiency in patients with diabetes and putative mechanisms linking vitamin D deficiency to diabetic complications, there is a compelling argument for undertaking large well-designed randomized controlled trials of vitamin D supplementation.
Five Describing Factors of Dyslexia.
Tamboer, Peter; Vorst, Harrie C M; Oort, Frans J
2016-09-01
Two subtypes of dyslexia (phonological, visual) have been under debate in various studies. However, the number of symptoms of dyslexia described in the literature exceeds the number of subtypes, and underlying relations remain unclear. We investigated underlying cognitive features of dyslexia with exploratory and confirmatory factor analyses. A sample of 446 students (63 with dyslexia) completed a large test battery and a large questionnaire. Five factors were found in the test battery and five in the questionnaire. Together, these 10 first-order factors loaded on 5 latent factors (spelling, phonology, short-term memory, rhyme/confusion, and whole-word processing/complexity), which explained 60% of total variance. Three analyses supported the validity of these factors. A confirmatory factor analysis fit a five-factor solution well (RMSEA = .03). Those with dyslexia differed from those without dyslexia on all factors. A combination of five factors provided reliable predictions of dyslexia and nondyslexia (accuracy >90%). We also looked for factorial deficits on an individual level to construct subtypes of dyslexia, but found varying profiles. We concluded that a multiple cognitive deficit model of dyslexia is supported, whereas the existence of subtypes remains unclear. We discussed the results in relation to advanced compensation strategies of students, measures of intelligence, and various correlations within groups of those with and without dyslexia. © Hammill Institute on Disabilities 2014.
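As a rough illustration of the exploratory step in such an analysis, the sketch below fits a five-factor model to synthetic test scores with scikit-learn. The data, dimensions, and variance-share calculation are assumptions for illustration, not the study's measures or code.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Illustrative exploratory factor analysis on synthetic "test battery" scores
# (not the study's data): generate scores from 5 latent factors plus noise,
# then recover a 5-factor solution and its share of total variance.
rng = np.random.default_rng(3)
n_students, n_measures, n_latent = 446, 20, 5
latent = rng.normal(size=(n_students, n_latent))
loadings = rng.normal(scale=0.8, size=(n_latent, n_measures))
scores = latent @ loadings + rng.normal(scale=0.5, size=(n_students, n_measures))

fa = FactorAnalysis(n_components=n_latent).fit(scores)
# squared loadings approximate the variance each measure shares with the factors
explained = np.sum(fa.components_**2) / np.var(scores, axis=0).sum()
print(f"share of total variance captured by 5 factors: {explained:.0%}")
```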
Paces, James B.; Wurster, Frederic C.
2014-01-01
Near-surface physical and chemical processes can strongly affect dissolved-ion concentrations and stable isotope compositions of water in wetland settings, especially under arid climate conditions. In contrast, heavy radiogenic isotopes of strontium (87Sr/86Sr) and uranium (234U/238U) remain largely unaffected and can be used to help identify unique signatures from different sources and quantify end-member mixing that would otherwise be difficult to determine. The utility of combined Sr and U isotopes is demonstrated in this study of wetland habitats on the Pahranagat National Wildlife Refuge, which depend on supply from large-volume springs north of the Refuge, and from small-volume springs and seeps within the Refuge. Water budgets from these sources have not been quantified previously. Evaporation, transpiration, seasonally variable surface flow, and water management practices complicate the use of conventional methods for determining source contributions and mixing relations. In contrast, 87Sr/86Sr and 234U/238U remain unfractionated under these conditions, and compositions at a given site remain constant. Differences in Sr- and U-isotopic signatures between individual sites can be related by simple two- or three-component mixing models. Results indicate that surface flow constituting the Refuge’s irrigation source consists of a 65:25:10 mixture of water from two distinct regionally sourced carbonate aquifer springs, and groundwater from locally sourced volcanic aquifers. Within the Refuge, contributions from the irrigation source and local groundwater are readily determined and depend on proximity to those sources as well as water management practices.
NASA Astrophysics Data System (ADS)
Paces, James B.; Wurster, Frederic C.
2014-09-01
Near-surface physical and chemical processes can strongly affect dissolved-ion concentrations and stable-isotope compositions of water in wetland settings, especially under arid climate conditions. In contrast, heavy radiogenic isotopes of strontium (87Sr/86Sr) and uranium (234U/238U) remain largely unaffected and can be used to help identify unique signatures from different sources and quantify end-member mixing that would otherwise be difficult to determine. The utility of combined Sr and U isotopes is demonstrated in this study of wetland habitats on the Pahranagat National Wildlife Refuge, which depend on supply from large-volume springs north of the Refuge, and from small-volume springs and seeps within the Refuge. Water budgets from these sources have not been quantified previously. Evaporation, transpiration, seasonally variable surface flow, and water management practices complicate the use of conventional methods for determining source contributions and mixing relations. In contrast, 87Sr/86Sr and 234U/238U remain unfractionated under these conditions, and compositions at a given site remain constant. Differences in Sr- and U-isotopic signatures between individual sites can be related by simple two- or three-component mixing models. Results indicate that surface flow constituting the Refuge's irrigation source consists of a 65:25:10 mixture of water from two distinct regionally sourced carbonate-aquifer springs, and groundwater from locally sourced volcanic aquifers. Within the Refuge, contributions from the irrigation source and local groundwater are readily determined and depend on proximity to those sources as well as water management practices.
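The two- or three-component mixing computation mentioned in this abstract reduces, in its simplest form, to a small linear system: two isotope-ratio balances plus the requirement that the fractions sum to one. The sketch below illustrates that calculation with placeholder endmember values (not data from the study) and assumes, for simplicity, equal Sr and U concentrations in all endmembers so that ratios mix linearly.

```python
import numpy as np

# Three-endmember mixing from two isotope tracers plus mass balance.
# Endmember and mixture values are illustrative placeholders only.
endmembers = {
    "carbonate_spring_A": (0.7105, 2.5),   # (87Sr/86Sr, 234U/238U activity ratio)
    "carbonate_spring_B": (0.7120, 3.0),
    "volcanic_groundwater": (0.7090, 1.5),
}
mixture = (0.710725, 2.525)   # hypothetical measured composition of surface flow

names = list(endmembers)
A = np.array([
    [endmembers[n][0] for n in names],   # Sr-isotope balance
    [endmembers[n][1] for n in names],   # U-isotope balance
    [1.0, 1.0, 1.0],                     # fractions sum to 1
])
b = np.array([mixture[0], mixture[1], 1.0])

fractions = np.linalg.solve(A, b)
for n, f in zip(names, fractions):
    print(f"{n}: {f:.2f}")   # prints 0.65, 0.25, 0.10 for these placeholders
```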
Extreme Beta-Cell Deficiency in Pancreata of Dogs with Canine Diabetes
Shields, Emily J.; Lam, Carol J.; Cox, Aaron R.; Rankin, Matthew M.; Van Winkle, Thomas J.; Hess, Rebecka S.; Kushner, Jake A.
2015-01-01
The pathophysiology of canine diabetes remains poorly understood, in part due to enigmatic clinical features and the lack of detailed histopathology studies. Canine diabetes, similar to human type 1 diabetes, is frequently associated with diabetic ketoacidosis at onset or after insulin omission. However, notable differences exist. Whereas human type 1 diabetes often occurs in children, canine diabetes is typically described in middle age to elderly dogs. Many competing theories have been proposed regarding the underlying cause of canine diabetes, from pancreatic atrophy to chronic pancreatitis to autoimmune mediated β-cell destruction. It remains unclear to what extent β-cell loss contributes to canine diabetes, as precise quantifications of islet morphometry have not been performed. We used high-throughput microscopy and automated image processing to characterize islet histology in a large collection of pancreata of diabetic dogs. Diabetic pancreata displayed a profound reduction in β-cells and islet endocrine cells. Unlike humans, canine non-diabetic islets are largely comprised of β-cells. Very few β-cells remained in islets of diabetic dogs, even in pancreata from new onset cases. Similarly, total islet endocrine cell number was sharply reduced in diabetic dogs. No compensatory proliferation or lymphocyte infiltration was detected. The majority of pancreata had no evidence of pancreatitis. Thus, canine diabetes is associated with extreme β-cell deficiency in both new and longstanding disease. The β-cell predominant composition of canine islets and the near-total absence of β-cells in new onset elderly diabetic dogs strongly implies that similar to human type 1 diabetes, β-cell loss underlies the pathophysiology of canine diabetes. PMID:26057531
Bayesian sensitivity analysis of bifurcating nonlinear models
NASA Astrophysics Data System (ADS)
Becker, W.; Worden, K.; Rowson, J.
2013-01-01
Sensitivity analysis allows one to investigate how changes in input parameters to a system affect the output. When computational expense is a concern, metamodels such as Gaussian processes can offer considerable computational savings over Monte Carlo methods, albeit at the expense of introducing a data modelling problem. In particular, Gaussian processes assume a smooth, non-bifurcating response surface. This work highlights a recent extension to Gaussian processes which uses a decision tree to partition the input space into homogeneous regions, and then fits separate Gaussian processes to each region. In this way, bifurcations can be modelled at region boundaries and different regions can have different covariance properties. To test this method, both the treed and standard methods were applied to the bifurcating response of a Duffing oscillator and a bifurcating FE model of a heart valve. It was found that the treed Gaussian process provides a practical way of performing uncertainty and sensitivity analysis on large, potentially-bifurcating models, which cannot be dealt with by using a single GP, although an open problem remains how to manage bifurcation boundaries that are not parallel to coordinate axes.
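The core idea of the treed approach, partitioning the input space and fitting a separate Gaussian process in each region so that a bifurcation can sit on a region boundary, can be sketched with off-the-shelf tools. The snippet below is only an illustration of that idea on a synthetic bifurcating response, not the authors' implementation; in particular, the split location is fixed by hand rather than learned by a decision tree.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Synthetic bifurcating response: a jump of height 2 at x = 0.5.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200).reshape(-1, 1)
y = np.where(x[:, 0] < 0.5, np.sin(6 * x[:, 0]), 2.0 + np.sin(6 * x[:, 0]))
y += 0.05 * rng.normal(size=y.shape)
x_test = np.linspace(0.0, 1.0, 101).reshape(-1, 1)

# single global GP: forced to smooth across the discontinuity
gp_global = GaussianProcessRegressor(kernel=RBF(0.2), alpha=0.05**2).fit(x, y)

# "treed" alternative: one GP per region defined by the (here hand-picked) split
left, right = x[:, 0] < 0.5, x[:, 0] >= 0.5
gp_left = GaussianProcessRegressor(kernel=RBF(0.2), alpha=0.05**2).fit(x[left], y[left])
gp_right = GaussianProcessRegressor(kernel=RBF(0.2), alpha=0.05**2).fit(x[right], y[right])

pred_treed = np.where(
    x_test[:, 0] < 0.5,
    gp_left.predict(x_test),
    gp_right.predict(x_test),
)
pred_global = gp_global.predict(x_test)
print("max |treed - global| prediction difference:",
      float(np.abs(pred_treed - pred_global).max()))
```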
Helium-Shell Nucleosynthesis and Extinct Radioactivities
NASA Technical Reports Server (NTRS)
Meyer, B. S.; The, L.-S.; Clayton, D. D.; ElEid, M. F.
2004-01-01
Although the exact site for the origin of the r-process isotopes remains mysterious, most thinking has centered on matter ejected from the cores of massive stars in core-collapse supernovae [13]. In the 1970s and 1980s, however, difficulties in understanding the yields from such models led workers to consider the possibility of r-process nucleosynthesis farther out in the exploding star, in particular, in the helium burning shell [4,5]. The essential idea was that shock passage through this shell would heat and compress this material to the point that the reactions 13C(α,n)16O and, especially, 22Ne(α,n)25Mg would generate enough neutrons to capture on preexisting seed nuclei and drive an "n process" [6], which could reproduce the r-process abundances. Subsequent work showed that the required 13C and 22Ne abundances were too large compared to the amounts available in realistic models [7], and recent thinking has returned to supernova core material or matter ejected from neutron star-neutron star collisions as the more likely r-process sites.
Solvent-Free Manufacturing of Electrodes for Lithium-ion Batteries
NASA Astrophysics Data System (ADS)
Ludwig, Brandon; Zheng, Zhangfeng; Shou, Wan; Wang, Yan; Pan, Heng
2016-03-01
Lithium ion battery electrodes were manufactured using a new, completely dry powder painting process. The solvents used for conventional slurry-cast electrodes have been completely removed. Thermal activation time has been greatly reduced because the time- and resource-demanding solvent evaporation step required in slurry-cast electrode manufacturing is replaced by a hot rolling process. It has been found that the thermal activation time needed to induce mechanical bonding of the thermoplastic polymer to the remaining active electrode particles is only a few seconds. Removing the solvent and drying process allows large-scale Li-ion battery production to be more economically viable in markets such as automotive energy storage systems. By understanding the surface energies of various powders, which govern the powder mixing and binder distribution, bonding tests of the dry-deposited particles onto the current collector show that the bonding strength is greater than that of slurry-cast electrodes, 148.8 kPa as compared to 84.3 kPa. Electrochemical tests show that the new electrodes outperform conventional slurry-processed electrodes, which is due to the different binder distribution.
NASA Astrophysics Data System (ADS)
Ruiz-Villanueva, Virginia; Piégay, Hervé; Gurnell, Angela A.; Marston, Richard A.; Stoffel, Markus
2016-09-01
Large wood is an important physical component of woodland rivers and significantly influences river morphology. It is also a key component of stream ecosystems. However, large wood is also a source of risk for human activities as it may damage infrastructure, block river channels, and induce flooding. Therefore, the analysis and quantification of large wood and its mobility are crucial for understanding and managing wood in rivers. As the number of large-wood-related studies by researchers, river managers, and stakeholders increases, documentation of commonly used and newly available techniques and their effectiveness has become increasingly relevant. Important data and knowledge have been obtained from the application of very different approaches and have generated a significant body of valuable information representative of different environments. This review provides a comprehensive qualitative and quantitative summary of recent advances regarding the different processes involved in large wood dynamics in fluvial systems, including wood budgeting and wood mechanics. First, some key definitions and concepts are introduced. Second, advances in quantifying large wood dynamics are reviewed; in particular, how measurements and modeling can be combined to integrate our understanding of how large wood moves through and is retained within river systems. Throughout, we present a quantitative and integrated meta-analysis compiled from different studies and geographical regions. Finally, we conclude by highlighting areas of particular research importance and their likely future trajectories, and we consider a particularly underresearched area so as to stress the future challenges for large wood research.
NASA Astrophysics Data System (ADS)
Benz, Arnold O.
2017-12-01
Solar flares are observed at all wavelengths from decameter radio waves to gamma-rays beyond 1 GeV. This review focuses on recent observations in EUV, soft and hard X-rays, white light, and radio waves. Space missions such as RHESSI, Yohkoh, TRACE, SOHO, and more recently Hinode and SDO have greatly enlarged the observational base. They have revealed a number of surprises: coronal sources appear before the hard X-ray emission in chromospheric footpoints; major flare acceleration sites appear to be independent of coronal mass ejections; electrons and ions may be accelerated at different sites; there are at least 3 different magnetic topologies; and basic characteristics vary from small to large flares. Recent progress also includes improved insights into the flare energy partition, the location(s) of energy release, and tests of energy release scenarios and particle acceleration. The interplay of observations with theory is important to deduce the geometry and to disentangle the various processes involved. There is increasing evidence supporting magnetic reconnection as the basic cause. While this process has become generally accepted as the trigger, it is still controversial how it converts a considerable fraction of the energy into non-thermal particles. Flare-like processes may be responsible for large-scale restructuring of the magnetic field in the corona as well as for its heating. Large flares influence interplanetary space and substantially affect the Earth's ionosphere. Flare scenarios have slowly converged over the past decades, but every new observation still reveals major unexpected results, demonstrating that solar flares, 150 years after their discovery, remain a complex astrophysical problem with major unsolved questions.
Détroit, Florent; Corny, Julien; Dizon, Eusebio Z; Mijares, Armand S
2013-01-01
"Pygmy populations" are recognized in several places over the world, especially in Western Africa and in Southeast Asia (Philippine "negritos," for instance). Broadly defined as "small-bodied Homo sapiens" (compared with neighboring populations), their origins and the nature of the processes involved in the maintenance of their phenotype over time are highly debated. Major results have been recently obtained from population genetics on present-day negrito populations, but their evolutionary history remains largely unresolved. We present and discuss the Upper Pleistocene human remains recovered from Tabon Cave and Callao Cave in the Philippines, which are potentially highly relevant to these research questions. Human fossils have been recovered in large numbers from Tabon Cave (Palawan Island) but mainly from reworked and mixed sediments from several archaeological layers. We review and synthesize the long and meticulous collaborative work done on the archives left from the 1960s excavations and on the field. The results demonstrate the long history of human occupations in the cave, since at least ~30,000 BP. The examination of the Tabon human remains shows a large variability: large and robust for one part of the sample, and small and gracile for the other part. The latter would fit quite comfortably within the range of variation of Philippine negritos. Farther north, on Luzon Island, the human third metatarsal recently recovered from Callao Cave and dated to ~66,000 BP is now the oldest direct evidence of human presence in the Philippines. Previous data show that, compared with H. sapiens (including Philippine negritos), this bone presents a very small size and several unusual morphological characteristics. We present a new analytical approach using three-dimensional geometric morphometrics for comparing the Callao fossil to a wide array of extant Asian mammals, including nonhuman primates and H. sapiens. The results demonstrate that the shape of the Callao metatarsal is definitely closer to humans than to any other groups. The fossil clearly belongs to the genus Homo; however, it remains at the margin of the variation range of H. sapiens. Because of its great antiquity and the presence of another diminutive species of the genus Homo in the Wallace area during this time period (H. floresiensis), we discuss here in detail the affinities and potential relatedness of the Callao fossil with negritos that are found today on Luzon Island. Copyright © 2013 Wayne State University Press, Detroit, Michigan 48201-1309.
What is the Source? Post-glacial sediment flux from the Waipaoa Catchment, New Zealand
NASA Astrophysics Data System (ADS)
Bilderback, E. L.; Pettinga, J. R.; Litchfield, N. J.; Quigley, M.; Marden, M.
2011-12-01
In the Waipaoa, and for much of the eastern North Island, the shift from the last glacial coldest period to the current interglacial climatic regime resulted in Late Pleistocene-Holocene catchment-wide channel incision (Berryman et al., 2000; Litchfield and Berryman, 2005). Only ~25% of the total post 18 ka sediment yield for the Waipaoa Catchment can be accounted for by channel incision, one of the most widespread and most effective erosive processes in the catchment (Orpin et al., 2006; Marden et al., 2008). We find that deep-seated landslides, which are pervasive, cannot make up this apparent source area sediment deficit. This presents a challenge to our current understanding of the Waipaoa Sedimentary System. New high resolution topographic data sets (lidar and photogrammetry) combined with tephrochronology and field mapping have enabled us to approximate the sediment flux from post 18 ka deep-seated landslides. The sediment delivered to the offshore sink from these upper Waipaoa landslides is likely to be less than 20% of the sediment volume calculated for channel incision. A further GIS analysis of the ~2500 km2 Waipaoa catchment using work from Crosby and Whipple (2006) delineating relict topography and Marden et al. (2008) accounting for river incision and slopes stabilized behind terrace remnants indicates that only about half of the available catchment area could have contributed additional large volumes of sediment to the offshore post 18 ka sink. The presence of tephra cover older than 18 ka on landforms ranging from flat ridgelines to steep (>30 degree) slopes in this remaining terrestrial source area suggests that it has not been eroded en masse. The apparent source deficit remains even though many of the major erosive processes available to fill this deficit have been studied and the potentially contributing catchment area is dramatically reduced by these studies. This analysis raises questions about erosive processes and our ability to balance large-scale sediment budgets. Does coastal erosion contribute a significant volume to the offshore sink? Was sediment from other catchments trapped in the Poverty Bay postglacial shelf basin? Are the uncertainties in any of these source and sink calculations large enough that the previous questions are essentially irrelevant? We believe that it is an achievable goal to account for the major processes that generate sediment in the Waipaoa Sedimentary System and that this budget tuning can inform our understanding of active landscapes.
Original size of the Vredefort structure, South Africa
NASA Technical Reports Server (NTRS)
Therriault, A. M.; Reid, A. M.; Reimold, W. U.
1993-01-01
The Vredefort structure is located approximately 120 km southwest of Johannesburg, South Africa, and is deeply eroded. Controversies remain about the origin of this structure, with the most popular hypotheses being: (1) by impact cratering about 2.0 Ga; (2) as a cryptoexplosion structure about 2.0 Ga; and (3) by purely tectonic processes starting at about 3.0 Ga and ending with the Vredefort event at 2.0 Ga. In view of recent work in which the granophyre dikes are interpreted as the erosional remnants of a more extensive impact melt sheet, injected downward into the underlying country rocks, the impact origin hypothesis for Vredefort is adopted. In order to estimate the original dimensions of the Vredefort impact structure, it is assumed that the structure was initially circular, that its predeformation center corresponds to the center of the granitic core, and that the pre-Vredefort geology of the area prior to approximately 2.0 Ga ago is as suggested by Fletcher and Reimold. The spatial relationship between shock metamorphic effects, the shock pressures they record, and the morphological features of the crater was established for a number of large terrestrial craters. The principles of crater formation at large complex impact structures comparable in size to Vredefort were also established, although many details remain unresolved. An important conclusion is that the transient crater, which is formed directly by excavation and displacement by the shock-induced cratering flow-field (i.e., the particle velocity flow field existing in the region of the transient crater but behind the initial outgoing shock front), is highly modified during the late-stage processes. The original transient crater diameter lies well within the final rim of the crater, which is established by structural movements during late-stage cavity modification.
Zadro, Joshua Robert; Shirley, Debra; Simic, Milena; Mousavi, Seyed Javad; Ceprnja, Dragana; Maka, Katherine; Ferreira, Paulo
2017-06-01
To investigate the feasibility of implementing a video-game exercise programme for older people with chronic low back pain (LBP). Single-centred single-blinded randomised controlled trial (RCT). Physiotherapy outpatient department in a public hospital in Western Sydney, Australia. We will recruit 60 participants over 55 years old with chronic LBP from the waiting list. Participants will be randomised to receive video-game exercise (n=30) or to remain on the waiting list (n=30) for 8 weeks, with follow-up at 3 and 6 months. Participants engaging in video-game exercises will be unsupervised and will complete video-game exercise for 60 minutes, 3 times per week. Participants allocated to remain on the waiting list will be encouraged to maintain their usual levels of physical activity. The primary outcomes for this feasibility study will be study processes (recruitment and response rates, adherence to and experience with the intervention, and incidence of adverse events) relevant to the future design of a large RCT. Estimates of treatment efficacy (point estimates and 95% confidence intervals) on pain self-efficacy, care seeking, physical activity, fear of movement/re-injury, pain, physical function, disability, falls-efficacy, strength, and walking speed will be our secondary outcome measures. Recruitment for this trial began in November 2015. This study describes the rationale and processes of a feasibility study investigating a video-game exercise programme for older people with chronic LBP. Results from the feasibility study will inform the design and sample required for a large multicentre RCT. Australian New Zealand Clinical Trials Registry: ACTRN12615000703505. Copyright © 2016 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
Modrek, Sepideh; Cullen, Mark R.
2013-01-01
While the negative effects of unemployment have been well studied, the consequences of layoffs and downsizing for those who remain employed are less well understood. This study used human resources and health claims data from a large multi-site fully insured aluminum company to explore the health consequences of downsizing on the remaining workforce. We exploit the variation in the timing and intensity of layoffs to categorize 30 plants as high or low layoff plants. Next, we select a stably employed cohort of workers with a history of health insurance going back to 2006 to 1) describe the selection process into layoff and 2) explore the association between the severity of plant-level layoffs and the incidence of four chronic conditions in the remaining workforce. We examined four health outcomes: incident hypertension, diabetes, asthma/COPD and depression for a cohort of approximately 13,000 employees. Results suggest that there was an increased risk of developing hypertension for workers that remain at the plants with the highest level of layoffs, and increased risk of developing diabetes for salaried workers that remain at the plants with the highest level of layoffs. The hypertension results were robust to several specification tests. In addition, the study design selected only healthy workers; therefore, our results are likely to be a lower bound and suggest that adverse health consequences of the current recession may affect a broader proportion of the population than previously expected. PMID:23849284
Van Landeghem, Sofie; De Bodt, Stefanie; Drebert, Zuzanna J.; Inzé, Dirk; Van de Peer, Yves
2013-01-01
Despite the availability of various data repositories for plant research, a wealth of information currently remains hidden within the biomolecular literature. Text mining provides the necessary means to retrieve these data through automated processing of texts. However, only recently has advanced text mining methodology been implemented with sufficient computational power to process texts at a large scale. In this study, we assess the potential of large-scale text mining for plant biology research in general and for network biology in particular using a state-of-the-art text mining system applied to all PubMed abstracts and PubMed Central full texts. We present extensive evaluation of the textual data for Arabidopsis thaliana, assessing the overall accuracy of this new resource for usage in plant network analyses. Furthermore, we combine text mining information with both protein–protein and regulatory interactions from experimental databases. Clusters of tightly connected genes are delineated from the resulting network, illustrating how such an integrative approach is essential to grasp the current knowledge available for Arabidopsis and to uncover gene information through guilt by association. All large-scale data sets, as well as the manually curated textual data, are made publicly available, thereby stimulating the application of text mining data in future plant biology studies. PMID:23532071
Molecular pathology of prostate cancer.
Cazares, L H; Drake, R R; Esquela-Kirscher, A; Lance, R S; Semmes, O J; Troyer, D A
2010-01-01
This chapter includes discussion of the molecular pathology of tissue, blood, urine, and expressed prostatic secretions. Because we are unable to reliably image the disease in vivo, a 12 core method that oversamples the peripheral zone is widely used. This generates large numbers of cores that need to be carefully processed and sampled. In spite of the large number of tissue cores, the amount of tumor available for study is often quite limited. This is a particular challenge for research, as new biomarker assays will need to preserve tissue architecture intact for histopathology. Methods of processing and reporting pathology are discussed. With the exception of ductal variants, recognized subtypes of prostate cancer are largely confined to research applications, and most prostate cancers are acinar. Biomarker discovery in urine and expressed prostatic secretions would be useful since these are readily obtained and are proximate fluids. The well-known challenges of biomarker discovery in blood and urine are referenced and discussed. Mediators of carcinogenesis can serve as biomarkers as exemplified by mutations in PTEN and TMPRSS2:ERG fusion. The use of proteomics in biomarker discovery, with an emphasis on imaging mass spectrometry of tissues, is discussed. Small RNAs are of great interest; however, their usefulness as biomarkers in clinical decision making remains the subject of ongoing research. The chapter concludes with an overview of blood biomarkers such as circulating nucleic acids and tumor cells and bound/free isoforms of prostate specific antigen (PSA).
Natural Language Processing Technologies in Radiology Research and Clinical Applications.
Cai, Tianrun; Giannopoulos, Andreas A; Yu, Sheng; Kelil, Tatiana; Ripley, Beth; Kumamaru, Kanako K; Rybicki, Frank J; Mitsouras, Dimitrios
2016-01-01
The migration of imaging reports to electronic medical record systems holds great potential in terms of advancing radiology research and practice by leveraging the large volume of data continuously being updated, integrated, and shared. However, there are significant challenges as well, largely due to the heterogeneity of how these data are formatted. Indeed, although there is movement toward structured reporting in radiology (ie, hierarchically itemized reporting with use of standardized terminology), the majority of radiology reports remain unstructured and use free-form language. To effectively "mine" these large datasets for hypothesis testing, a robust strategy for extracting the necessary information is needed. Manual extraction of information is a time-consuming and often unmanageable task. "Intelligent" search engines that instead rely on natural language processing (NLP), a computer-based approach to analyzing free-form text or speech, can be used to automate this data mining task. The overall goal of NLP is to translate natural human language into a structured format (ie, a fixed collection of elements), each with a standardized set of choices for its value, that is easily manipulated by computer programs to (among other things) order into subcategories or query for the presence or absence of a finding. The authors review the fundamentals of NLP and describe various techniques that constitute NLP in radiology, along with some key applications. ©RSNA, 2016.
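To make the free-text-to-structured mapping concrete, here is a deliberately minimal rule-based sketch in that spirit. The finding terms, negation cues, and example report are hypothetical, and real radiology NLP pipelines are considerably more sophisticated.

```python
import re

# Toy illustration of the task radiology NLP automates: mapping free-text
# report sentences to a structured presence/absence element. Terms, negation
# cues, and the example report below are hypothetical.
FINDING_TERMS = ("pleural effusion", "pulmonary embolism", "pneumothorax")
NEGATION_CUES = re.compile(r"\b(no|without|negative for|absence of)\b", re.IGNORECASE)

def extract_findings(report_text):
    structured = {}
    for sentence in re.split(r"[.\n]+", report_text):
        for term in FINDING_TERMS:
            if term in sentence.lower():
                negated = bool(NEGATION_CUES.search(sentence))
                structured[term] = "absent" if negated else "present"
    return structured

report = ("Small right pleural effusion. "
          "No evidence of pulmonary embolism.")
print(extract_findings(report))
# {'pleural effusion': 'present', 'pulmonary embolism': 'absent'}
```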
Natural Language Processing Technologies in Radiology Research and Clinical Applications
Cai, Tianrun; Giannopoulos, Andreas A.; Yu, Sheng; Kelil, Tatiana; Ripley, Beth; Kumamaru, Kanako K.; Rybicki, Frank J.
2016-01-01
The migration of imaging reports to electronic medical record systems holds great potential in terms of advancing radiology research and practice by leveraging the large volume of data continuously being updated, integrated, and shared. However, there are significant challenges as well, largely due to the heterogeneity of how these data are formatted. Indeed, although there is movement toward structured reporting in radiology (ie, hierarchically itemized reporting with use of standardized terminology), the majority of radiology reports remain unstructured and use free-form language. To effectively “mine” these large datasets for hypothesis testing, a robust strategy for extracting the necessary information is needed. Manual extraction of information is a time-consuming and often unmanageable task. “Intelligent” search engines that instead rely on natural language processing (NLP), a computer-based approach to analyzing free-form text or speech, can be used to automate this data mining task. The overall goal of NLP is to translate natural human language into a structured format (ie, a fixed collection of elements), each with a standardized set of choices for its value, that is easily manipulated by computer programs to (among other things) order into subcategories or query for the presence or absence of a finding. The authors review the fundamentals of NLP and describe various techniques that constitute NLP in radiology, along with some key applications. ©RSNA, 2016 PMID:26761536
Eavesdropping on the Arctic: Automated bioacoustics reveal dynamics in songbird breeding phenology.
Oliver, Ruth Y; Ellis, Daniel P W; Chmura, Helen E; Krause, Jesse S; Pérez, Jonathan H; Sweet, Shannan K; Gough, Laura; Wingfield, John C; Boelman, Natalie T
2018-06-01
Bioacoustic networks could vastly expand the coverage of wildlife monitoring to complement satellite observations of climate and vegetation. This approach would enable global-scale understanding of how climate change influences phenomena such as migratory timing of avian species. The enormous data sets that autonomous recorders typically generate demand automated analyses that remain largely undeveloped. We devised automated signal processing and machine learning approaches to estimate dates on which songbird communities arrived at arctic breeding grounds. Acoustically estimated dates agreed well with those determined via traditional surveys and were strongly related to the landscape's snow-free dates. We found that environmental conditions heavily influenced daily variation in songbird vocal activity, especially before egg laying. Our novel approaches demonstrate that variation in avian migratory arrival can be detected autonomously. Large-scale deployment of this innovation in wildlife monitoring would enable the coverage necessary to assess and forecast changes in bird migration in the face of climate change.
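One simple way such an arrival date could be derived from automated detections (a hedged sketch, not the authors' pipeline) is to build a daily vocal-activity index, smooth it, and take the first day on which activity stays above a fraction of its seasonal maximum:

```python
import numpy as np

# Sketch only: estimate a community arrival date from a daily vocal-activity
# index built from automated songbird detections. All numbers are synthetic.
rng = np.random.default_rng(1)
days = np.arange(120, 171)                       # day of year, hypothetical season
true_arrival = 140
activity = np.where(days < true_arrival, 0.02, 0.6) + 0.05 * rng.random(days.size)

# 5-day moving average to suppress day-to-day weather-driven variation
kernel = np.ones(5) / 5
smoothed = np.convolve(activity, kernel, mode="same")

# first day on which smoothed activity exceeds 25% of its seasonal maximum
threshold = 0.25 * smoothed.max()
arrival_estimate = days[np.argmax(smoothed > threshold)]
print("estimated arrival day of year:", arrival_estimate)
```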
Microwave amplification based on quasiparticle SIS up and down frequency converters
NASA Astrophysics Data System (ADS)
Kojima, T.; Uzawa, Y.; Shan, W.
2018-02-01
Heterodyne instruments have recently attained quantum-limited low-noise performance, particularly in radio astronomy, but it remains difficult to develop large heterodyne arrays comparable to the modern radio cameras built from sensitive cryogenic detectors such as microwave kinetic inductance detectors and transition edge sensors. In realizing such heterodyne arrays, reducing the power dissipation of semiconductor-based amplifiers remains a major challenge. Alternatively, superconducting parametric amplifiers still seem to have several barriers to application, especially in terms of operating temperature. Here, we show a novel concept of microwave amplification based on up and down frequency-conversion processes using quasiparticle superconductor-insulator-superconductor (SIS) tunnel junctions. We demonstrate positive gain using a proof-of-concept test module, which operates with a power dissipation of several μW at a bath temperature of 4 K. The performance of the module suggests great potential for application in large arrays.
Natural polyreactive IgA antibodies coat the intestinal microbiota
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunker, Jeffrey J.; Erickson, Steven A.; Flynn, Theodore M.
Large quantities of immunoglobulin A (IgA) are constitutively secreted by intestinal plasma cells to coat and contain the commensal microbiota, yet the specificity of these antibodies remains elusive. In this paper, we profiled the reactivities of single murine IgA plasma cells by cloning and characterizing large numbers of monoclonal antibodies. IgAs were not specific to individual bacterial taxa but rather polyreactive, with broad reactivity to a diverse, but defined, subset of microbiota. These antibodies arose at low frequencies among naïve B cells and were selected into the IgA repertoire upon recirculation in Peyer’s patches. This selection process occurred independent of microbiota or dietary antigens. Furthermore, although some IgAs acquired somatic mutations, these did not substantially influence their reactivity. In conclusion, these findings reveal an endogenous mechanism driving homeostatic production of polyreactive IgAs with innate specificity to microbiota.
Frankenstein, Ziv; Sperling, Joseph; Sperling, Ruth; Eisenstein, Miriam
2012-01-01
The spliceosome is a mega-Dalton ribonucleoprotein (RNP) assembly that processes primary RNA transcripts, producing functional mRNA. The electron microscopy structures of the native spliceosome and of several spliceosomal subcomplexes are available but the spatial arrangement of the latter within the native spliceosome is not known. We designed a new computational procedure to efficiently fit thousands of conformers into the spliceosome envelope. Despite the low resolution limitations, we obtained only one model that complies with the available biochemical data. Our model localizes the five small nuclear RNPs (snRNPs) mostly within the large subunit of the native spliceosome, requiring only minor conformation changes. The remaining free volume presumably accommodates additional spliceosomal components. The constituents of the active core of the spliceosome are juxtaposed, forming a continuous surface deep within the large spliceosomal cavity, which provides a sheltered environment for the splicing reaction. PMID:22578543
Silver hake tracks changes in Northwest Atlantic circulation.
Nye, Janet A; Joyce, Terrence M; Kwon, Young-Oh; Link, Jason S
2011-08-02
Recent studies documenting shifts in spatial distribution of many organisms in response to a warming climate highlight the need to understand the mechanisms underlying species distribution at large spatial scales. Here we present one noteworthy example of remote oceanographic processes governing the spatial distribution of adult silver hake, Merluccius bilinearis, a commercially important fish in the Northeast US shelf region. Changes in spatial distribution of silver hake over the last 40 years are highly correlated with the position of the Gulf Stream. These changes in distribution are in direct response to local changes in bottom temperature on the continental shelf that are responding to the same large scale circulation change affecting the Gulf Stream path, namely changes in the Atlantic meridional overturning circulation (AMOC). If the AMOC weakens, as is suggested by global climate models, silver hake distribution will remain in a poleward position, the extent of which could be forecast at both decadal and multidecadal scales.
NASA Astrophysics Data System (ADS)
An, Yongling; Fei, Huifang; Zeng, Guifang; Ci, Lijie; Xi, Baojuan; Xiong, Shenglin; Feng, Jinkui
2018-02-01
The design and synthesis of anode materials that can store the large K+ ion are key to the development of potassium-ion batteries. Low-cost, commercially available expanded graphite with large particles is a graphite-derived material whose good conductivity and enlarged interlayer spacing boost the potassium-ion diffusion coefficient during the charge/discharge process. Thus, we achieve excellent anode performance for potassium-ion batteries based on expanded graphite. It can deliver a capacity of 263 mAh g-1 at a rate of 10 mA g-1, and the reversible capacity remains almost unchanged after 500 cycles at a high rate of 200 mA g-1 with a coulombic efficiency of around 100%. The potassium storage mechanism is investigated by the ex situ XRD technique. This excellent potassium storage performance will make expanded graphite a promising anode candidate for potassium-ion batteries.
Larger CO2 source at the equatorial Pacific during the last deglaciation
Kubota, Kaoru; Yokoyama, Yusuke; Ishikawa, Tsuyoshi; Obrochta, Stephen; Suzuki, Atsushi
2014-01-01
While biogeochemical and physical processes in the Southern Ocean are thought to be central to atmospheric CO2 rise during the last deglaciation, the role of the equatorial Pacific, where the largest CO2 source exists at present, remains largely unconstrained. Here we present seawater pH and pCO2 variations from fossil Porites corals in the mid equatorial Pacific offshore Tahiti based on a newly calibrated boron isotope paleo-pH proxy. Our new data, together with recalibrated existing data, indicate that a significant pCO2 increase (pH decrease), accompanied by anomalously large marine 14C reservoir ages, occurred following not only the Younger Dryas, but also Heinrich Stadial 1. These findings indicate an expanded zone of equatorial upwelling and resultant CO2 emission, which may be derived from higher subsurface dissolved inorganic carbon concentration. PMID:24918354
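The boron-isotope pH proxy underlying such reconstructions rests on a standard relation between the measured δ11B of the carbonate (taken to record the borate ion) and seawater pH. The helper below encodes that relation with commonly quoted constants purely for illustration; none of the numbers are values from this study.

```python
import math

# Standard boron-isotope pH relation (illustrative constants, not study data):
# d11B_carb is taken to record the borate ion, d11B_sw is seawater boron,
# alpha_B is the boric acid-borate fractionation factor, pKB is the apparent
# dissociation constant of boric acid at the assumed temperature and salinity.
def boron_ph(d11B_carb, d11B_sw=39.61, alpha_B=1.0272, pKB=8.60):
    eps = 1000.0 * (alpha_B - 1.0)
    ratio = -(d11B_sw - d11B_carb) / (d11B_sw - alpha_B * d11B_carb - eps)
    return pKB - math.log10(ratio)

# a lower coral d11B implies a lower reconstructed pH, i.e. a larger pCO2
for d11B in (19.5, 18.5):
    print(f"d11B = {d11B:.1f} permil  ->  pH ~ {boron_ph(d11B):.2f}")
```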
CPU architecture for a fast and energy-saving calculation of convolution neural networks
NASA Astrophysics Data System (ADS)
Knoll, Florian J.; Grelcke, Michael; Czymmek, Vitali; Holtorf, Tim; Hussmann, Stephan
2017-06-01
One of the most difficult problems in the use of artificial neural networks is the required computational capacity. Although large search engine companies own specially developed hardware to provide the necessary computing power, the conventional user is left with the state-of-the-art approach of using a graphics processing unit (GPU) as the computational basis. Although these processors are well suited for large matrix computations, they consume massive amounts of energy. Therefore, a new processor based on a field programmable gate array (FPGA) has been developed and optimized for deep learning applications. This processor is presented in this paper. The processor can be adapted to a particular application (in this paper, an organic farming application). Its power consumption is only a fraction of that of a GPU implementation, and it should therefore be well suited for energy-saving applications.
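For context, the operation whose multiply-accumulate count dominates CNN inference, and which such a dedicated processor is designed to execute efficiently, is direct convolution. The NumPy reference below only illustrates the arithmetic; it is not the FPGA design described in the paper.

```python
import numpy as np

# Reference direct 2-D convolution: a multiply-accumulate nest over input
# channels and kernel positions. Shapes are arbitrary illustrative values.
def conv2d(image, kernels):
    c_in, h, w = image.shape
    c_out, _, kh, kw = kernels.shape
    out = np.zeros((c_out, h - kh + 1, w - kw + 1), dtype=image.dtype)
    for co in range(c_out):
        for y in range(out.shape[1]):
            for x in range(out.shape[2]):
                # one output value = c_in * kh * kw multiply-accumulates
                out[co, y, x] = np.sum(image[:, y:y + kh, x:x + kw] * kernels[co])
    return out

image = np.random.rand(3, 32, 32).astype(np.float32)
kernels = np.random.rand(8, 3, 5, 5).astype(np.float32)
print(conv2d(image, kernels).shape)   # (8, 28, 28)
```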
Frictional velocity-weakening in landslides on Earth and on other planetary bodies.
Lucas, Antoine; Mangeney, Anne; Ampuero, Jean Paul
2014-03-04
One of the ultimate goals in landslide hazard assessment is to predict maximum landslide extension and velocity. Despite much work, the physical processes governing energy dissipation during these natural granular flows remain uncertain. Field observations show that large landslides travel over unexpectedly long distances, suggesting low dissipation. Numerical simulations of landslides require a small friction coefficient to reproduce the extension of their deposits. Here, based on analytical and numerical solutions for granular flows constrained by remote-sensing observations, we develop a consistent method to estimate the effective friction coefficient of landslides. This method uses a constant basal friction coefficient that reproduces the first-order landslide properties. We show that friction decreases with increasing volume or, more fundamentally, with increasing sliding velocity. Inspired by frictional weakening mechanisms thought to operate during earthquakes, we propose an empirical velocity-weakening friction law under a unifying phenomenological framework applicable to small and large landslides observed on Earth and beyond.
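As an operational illustration of velocity-weakening friction, the snippet below evaluates one commonly used functional form in which the effective friction coefficient decays from a static toward a dynamic value as sliding velocity grows. The form and all parameter values are assumptions for illustration and are not necessarily the empirical law proposed in the paper.

```python
# Illustrative velocity-weakening friction law (assumed form and parameters):
# the effective friction coefficient decays from a static value mu_s toward a
# dynamic value mu_d as sliding velocity v exceeds a characteristic velocity v_c.
def effective_friction(v, mu_s=0.6, mu_d=0.1, v_c=4.0):
    return mu_d + (mu_s - mu_d) / (1.0 + v / v_c)

for v in (0.1, 1.0, 10.0, 100.0):   # sliding velocity, m/s (illustrative)
    print(f"v = {v:6.1f} m/s  ->  mu_eff = {effective_friction(v):.3f}")
```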
Gene regulation knowledge commons: community action takes care of DNA binding transcription factors
Tripathi, Sushil; Vercruysse, Steven; Chawla, Konika; Christie, Karen R.; Blake, Judith A.; Huntley, Rachael P.; Orchard, Sandra; Hermjakob, Henning; Thommesen, Liv; Lægreid, Astrid; Kuiper, Martin
2016-01-01
A large gap remains between the amount of knowledge in scientific literature and the fraction that gets curated into standardized databases, despite many curation initiatives. Yet the availability of comprehensive knowledge in databases is crucial for exploiting existing background knowledge, both for designing follow-up experiments and for interpreting new experimental data. Structured resources also underpin the computational integration and modeling of regulatory pathways, which further aids our understanding of regulatory dynamics. We argue that cooperation between the scientific community and professional curators can increase the capacity for capturing precise knowledge from the literature. We demonstrate this with a project in which we mobilize biological domain experts who curate large numbers of DNA binding transcription factors, and show that they, although new to the field of curation, can make valuable contributions by harvesting reported knowledge from scientific papers. Such community curation can enhance the scientific epistemic process. Database URL: http://www.tfcheckpoint.org PMID:27270715
Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B
2013-03-23
Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. To date, multiple challenges in protein MS data analysis remain: management of large-scale, complex data sets; MS peak identification and indexing; and high-dimensional differential peak analysis with false discovery rate (FDR) control for the accompanying statistical tests. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The web application supports online uploading and analysis of large-scale MS data through a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
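The differential-analysis step referred to above can be sketched in a few lines: one test per peak across groups, followed by Benjamini-Hochberg control of the false discovery rate. The snippet below is a generic illustration on synthetic intensities, not the portal's implementation.

```python
import numpy as np
from scipy import stats

# Generic peak-level differential analysis with FDR control on synthetic data:
# one Welch t-test per peak across case/control intensity matrices, then
# Benjamini-Hochberg adjustment of the resulting p-values.
rng = np.random.default_rng(42)
n_peaks = 1000
cases = rng.normal(0.0, 1.0, size=(n_peaks, 20))
controls = rng.normal(0.0, 1.0, size=(n_peaks, 20))
cases[:50] += 1.5                     # 50 truly differential peaks

pvals = stats.ttest_ind(cases, controls, axis=1, equal_var=False).pvalue

# Benjamini-Hochberg: reject the k smallest p-values, where k is the largest
# rank with p_(k) <= (k/m) * q
q = 0.05
order = np.argsort(pvals)
ranked = pvals[order]
thresholds = q * np.arange(1, n_peaks + 1) / n_peaks
passed = ranked <= thresholds
n_sig = passed.nonzero()[0].max() + 1 if passed.any() else 0
significant_peaks = order[:n_sig]
print(f"{len(significant_peaks)} peaks significant at FDR {q}")
```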
Coordinated Management of Academic Health Centers.
Balser, Jeffrey R; Stead, William W
2017-01-01
Academic health centers (AHCs) are the nation's primary resource for healthcare discovery, innovation, and training. US healthcare revenue growth has declined sharply since 2009, and is forecast to remain well below historic levels for the foreseeable future. As the cost of education and research at nearly all AHCs is heavily subsidized through large transfers from clinical care margins, our institutions face a mounting crisis. Choices centering on how to increase the cost-effectiveness of the AHC enterprise require unprecedented levels of alignment to preserve an environment that nurtures creativity. Management processes require governance models that clarify decision rights while harnessing the talents and the intellectual capital of a large, diverse enterprise to nimbly address unfamiliar organizational challenges. This paper describes key leadership tactics aimed at propelling AHCs along this journey - one that requires from all leaders a commitment to resilience, optimism, and willingness to embrace change.
Natural polyreactive IgA antibodies coat the intestinal microbiota
Bunker, Jeffrey J.; Erickson, Steven A.; Flynn, Theodore M.; ...
2017-09-28
Large quantities of immunoglobulin A (IgA) are constitutively secreted by intestinal plasma cells to coat and contain the commensal microbiota, yet the specificity of these antibodies remains elusive. In this paper, we profiled the reactivities of single murine IgA plasma cells by cloning and characterizing large numbers of monoclonal antibodies. IgAs were not specific to individual bacterial taxa but rather polyreactive, with broad reactivity to a diverse, but defined, subset of microbiota. These antibodies arose at low frequencies among naïve B cells and were selected into the IgA repertoire upon recirculation in Peyer’s patches. This selection process occurred independent of microbiota or dietary antigens. Furthermore, although some IgAs acquired somatic mutations, these did not substantially influence their reactivity. In conclusion, these findings reveal an endogenous mechanism driving homeostatic production of polyreactive IgAs with innate specificity to microbiota.
NASA Technical Reports Server (NTRS)
Silverberg, R. F.; Cheng, E. S.; Cottingham, D. A.; Fixsen, D. J.; Meyer, S. S.; Wilson, G. W.
2004-01-01
The formation of the first objects, stars, and galaxies, and their subsequent evolution, remain a cosmological unknown. Few observational probes of these processes exist. The Cosmic Infrared Background (CIB) originates from this era, and can provide information to test models of both galaxy evolution and the growth of primordial structure. The Explorer of Diffuse Galactic Emission (EDGE) is a proposed balloon-borne mission designed to measure the spatial fluctuations in the CIB from 200 micrometers to 1 millimeter on 6' to 3 degree scales with 2 microkelvin sensitivity per resolution element. Such measurements would provide a sensitive probe of the large-scale variation in protogalaxy density at redshifts of approximately 0.5-3. In this paper, we present the scientific justification for the mission and show a concept for the instrument and observations.
NASA Technical Reports Server (NTRS)
Hudson, W. R.
1976-01-01
A microscopic surface texture is created by sputter etching a surface while simultaneously sputter depositing a lower sputter yield material onto the surface. A xenon ion beam source has been used to perform this texturing process on samples as large as three centimeters in diameter. Ion beam textured surface structures have been characterized with SEM photomicrographs for a large number of materials including Cu, Al, Si, Ti, Ni, Fe, Stainless steel, Au, and Ag. Surfaces have been textured using a variety of low sputter yield materials - Ta, Mo, Nb, and Ti. The initial stages of the texture creation have been documented, and the technique of ion beam sputter removal of any remaining deposited material has been studied. A number of other texturing parameters have been studied such as the variation of the texture with ion beam power, surface temperature, and the rate of texture growth with sputter etching time.
Schiffels, Daniel; Szalai, Veronika A; Liddle, J Alexander
2017-07-25
Robust self-assembly across length scales is a ubiquitous feature of biological systems but remains challenging for synthetic structures. Taking a cue from biology-where disparate molecules work together to produce large, functional assemblies-we demonstrate how to engineer microscale structures with nanoscale features: Our self-assembly approach begins by using DNA polymerase to controllably create double-stranded DNA (dsDNA) sections on a single-stranded template. The single-stranded DNA (ssDNA) sections are then folded into a mechanically flexible skeleton by the origami method. This process simultaneously shapes the structure at the nanoscale and directs the large-scale geometry. The DNA skeleton guides the assembly of RecA protein filaments, which provides rigidity at the micrometer scale. We use our modular design strategy to assemble tetrahedral, rectangular, and linear shapes of defined dimensions. This method enables the robust construction of complex assemblies, greatly extending the range of DNA-based self-assembly methods.
Large conditional single-photon cross-phase modulation
Hosseini, Mahdi; Duan, Yiheng; Vuletić, Vladan
2016-01-01
Deterministic optical quantum logic requires a nonlinear quantum process that alters the phase of a quantum optical state by π through interaction with only one photon. Here, we demonstrate a large conditional cross-phase modulation between a signal field, stored inside an atomic quantum memory, and a control photon that traverses a high-finesse optical cavity containing the atomic memory. This approach avoids fundamental limitations associated with multimode effects for traveling optical photons. We measure a conditional cross-phase shift of π/6 (and up to π/3 by postselection on photons that remain in the system longer than average) between the retrieved signal and control photons, and confirm deterministic entanglement between the signal and control modes by extracting a positive concurrence. By upgrading to a state-of-the-art cavity, our system can reach a coherent phase shift of π at low loss, enabling deterministic and universal photonic quantum logic. PMID:27519798
Pilot line report: Development of a high efficiency thin silicon solar cell
NASA Technical Reports Server (NTRS)
1978-01-01
Experimental technology advances were implemented to increase the conversion efficiency of ultrathin 2 cm x 2 cm cells, to demonstrate a capability for fabricating such cells at a rate of 10,000 per month, and to fabricate 200 large-area ultrathin cells to determine their feasibility of manufacture. A production rate of 10,000 50-micrometer cells per month with lot-average AM0 efficiencies of 11.5% was demonstrated, with peak efficiencies of 13.5% obtained. Losses in most stages of the processing were minimized, the remaining exceptions being the photolithography and metallization steps for front contact generation and breakage in handling. The 5 cm x 5 cm cells were fabricated with a peak yield in excess of 40% for over 10% AM0 efficiency. Greater fabrication volume is needed to fully evaluate the expected yield and efficiency levels for large cells.
Arctic sea ice trends, variability and implications for seasonal ice forecasting
Serreze, Mark C.; Stroeve, Julienne
2015-01-01
September Arctic sea ice extent over the period of satellite observations has a strong downward trend, accompanied by pronounced interannual variability with a detrended 1 year lag autocorrelation of essentially zero. We argue that through a combination of thinning and associated processes related to a warming climate (a stronger albedo feedback, a longer melt season, the lack of especially cold winters), the downward trend itself is steepening. The lack of autocorrelation reflects both the large inherent variability in summer atmospheric circulation patterns and the fact that oceanic heat loss in winter acts as a negative (stabilizing) feedback, albeit one insufficient to counter the steepening trend. These findings have implications for seasonal ice forecasting. In particular, while advances in observing sea ice thickness and assimilating thickness into coupled forecast systems have improved forecast skill, there remains an inherent limit to predictability owing to the largely chaotic nature of atmospheric variability. PMID:26032315
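To make the "detrended 1 year lag autocorrelation" concrete, here is a minimal Python sketch that removes a linear trend from an annual series and computes the lag-1 autocorrelation of the residuals. The September extent values are synthetic stand-ins, not the observational record used in the paper.

```python
import numpy as np

def detrended_lag1_autocorr(series):
    """Remove a linear trend, then compute the lag-1 autocorrelation of the residuals."""
    t = np.arange(series.size)
    slope, intercept = np.polyfit(t, series, 1)
    resid = series - (slope * t + intercept)
    return np.corrcoef(resid[:-1], resid[1:])[0, 1]

# Hypothetical September extent record (10^6 km^2): downward trend plus white noise,
# mimicking the paper's finding that detrended year-to-year memory is near zero.
rng = np.random.default_rng(1)
years = np.arange(1979, 2015)
extent = 7.5 - 0.08 * (years - 1979) + rng.normal(0, 0.5, years.size)
print(round(detrended_lag1_autocorr(extent), 3))
```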
Pharmacogenetics and outcome with antipsychotic drugs.
Pouget, Jennie G; Shams, Tahireh A; Tiwari, Arun K; Müller, Daniel J
2014-12-01
Antipsychotic medications are the gold-standard treatment for schizophrenia, and are often prescribed for other mental conditions. However, the efficacy and side-effect profiles of these drugs are heterogeneous, with large interindividual variability. As a result, treatment selection remains a largely trial-and-error process, with many failed treatment regimens endured before finding a tolerable balance between symptom management and side effects. Much of the interindividual variability in response and side effects is due to genetic factors (heritability, h(2)~ 0.60-0.80). Pharmacogenetics is an emerging field that holds the potential to facilitate the selection of the best medication for a particular patient, based on his or her genetic information. In this review we discuss the most promising genetic markers of antipsychotic treatment outcomes, and present current translational research efforts that aim to bring these pharmacogenetic findings to the clinic in the near future.
Pharmacogenetics and outcome with antipsychotic drugs
Pouget, Jennie G.; Shams, Tahireh A.; Tiwari, Arun K.; Müller, Daniel J.
2014-01-01
Antipsychotic medications are the gold-standard treatment for schizophrenia, and are often prescribed for other mental conditions. However, the efficacy and side-effect profiles of these drugs are heterogeneous, with large interindividual variability. As a result, treatment selection remains a largely trial-and-error process, with many failed treatment regimens endured before finding a tolerable balance between symptom management and side effects. Much of the interindividual variability in response and side effects is due to genetic factors (heritability, h2~ 0.60-0.80). Pharmacogenetics is an emerging field that holds the potential to facilitate the selection of the best medication for a particular patient, based on his or her genetic information. In this review we discuss the most promising genetic markers of antipsychotic treatment outcomes, and present current translational research efforts that aim to bring these pharmacogenetic findings to the clinic in the near future. PMID:25733959
Meiotic recombination and male infertility: from basic science to clinical reality?
Hann, Michael C; Lau, Patricio E; Tempest, Helen G
2011-01-01
Infertility is a common problem that affects approximately 15% of the population. Although many advances have been made in the treatment of infertility, the molecular and genetic causes of male infertility remain largely elusive. This review will present a summary of our current knowledge on the genetic origin of male infertility and the key events of male meiosis. It focuses on chromosome synapsis and meiotic recombination and the problems that arise when errors in these processes occur, specifically meiotic arrest and chromosome aneuploidy, the leading cause of pregnancy loss in humans. In addition, meiosis-specific candidate genes will be discussed, including a discussion on why we have been largely unsuccessful at identifying disease-causing mutations in infertile men. Finally clinical applications of sperm aneuploidy screening will be touched upon along with future prospective clinical tests to better characterize male infertility in a move towards personalized medicine. PMID:21297654
Meiotic recombination and male infertility: from basic science to clinical reality?
Hann, Michael C; Lau, Patricio E; Tempest, Helen G
2011-03-01
Infertility is a common problem that affects approximately 15% of the population. Although many advances have been made in the treatment of infertility, the molecular and genetic causes of male infertility remain largely elusive. This review will present a summary of our current knowledge on the genetic origin of male infertility and the key events of male meiosis. It focuses on chromosome synapsis and meiotic recombination and the problems that arise when errors in these processes occur, specifically meiotic arrest and chromosome aneuploidy, the leading cause of pregnancy loss in humans. In addition, meiosis-specific candidate genes will be discussed, including a discussion on why we have been largely unsuccessful at identifying disease-causing mutations in infertile men. Finally clinical applications of sperm aneuploidy screening will be touched upon along with future prospective clinical tests to better characterize male infertility in a move towards personalized medicine.
Diel Surface Temperature Range Scales with Lake Size
Woolway, R. Iestyn; Jones, Ian D.; Maberly, Stephen C.; French, Jon R.; Livingstone, David M.; Monteith, Donald T.; Simpson, Gavin L.; Thackeray, Stephen J.; Andersen, Mikkel R.; Battarbee, Richard W.; DeGasperi, Curtis L.; Evans, Christopher D.; de Eyto, Elvira; Feuchtmayr, Heidrun; Hamilton, David P.; Kernan, Martin; Krokowski, Jan; Rimmer, Alon; Rose, Kevin C.; Rusak, James A.; Ryves, David B.; Scott, Daniel R.; Shilland, Ewan M.; Smyth, Robyn L.; Staehr, Peter A.; Thomas, Rhian; Waldron, Susan; Weyhenmeyer, Gesa A.
2016-01-01
Ecological and biogeochemical processes in lakes are strongly dependent upon water temperature. Long-term surface warming of many lakes is unequivocal, but little is known about the comparative magnitude of temperature variation at diel timescales, due to a lack of appropriately resolved data. Here we quantify the pattern and magnitude of diel temperature variability of surface waters using high-frequency data from 100 lakes. We show that the near-surface diel temperature range can be substantial in summer relative to long-term change and, for lakes smaller than 3 km2, increases sharply and predictably with decreasing lake area. Most small lakes included in this study experience average summer diel ranges in their near-surface temperatures of between 4 and 7°C. Large diel temperature fluctuations in the majority of lakes undoubtedly influence their structure, function and role in biogeochemical cycles, but the full implications remain largely unexplored. PMID:27023200
Video Browsing on Handheld Devices
NASA Astrophysics Data System (ADS)
Hürst, Wolfgang
Recent improvements in processing power, storage space, and video codec development now enable users to play back video on their handheld devices in reasonable quality. However, given the form factor restrictions of such a mobile device, screen size remains a natural limit and - as the term "handheld" implies - will always be a critical resource. This is true not only for video but for any data that is processed on such devices. For this reason, developers have come up with new and innovative ways to deal with large documents in such limited scenarios. For example, on the iPhone, innovative techniques such as flicking have been introduced to skim large lists of text (e.g. hundreds of entries in your music collection). Automatically adapting the zoom level to, for example, the width of table cells when double-tapping on the screen enables reasonable browsing of web pages originally designed for large, desktop-PC-sized screens. A multi-touch interface allows you to easily zoom in and out of large text documents and images using two fingers. In the next section, we will illustrate that advanced techniques to browse large video files have been developed in the past years as well. However, state-of-the-art video players on mobile devices normally support only simple, VCR-like controls (at least at the time of this writing) that allow users to start, stop, and pause video playback. If supported at all, browsing and navigation functionality is often restricted to simple skipping of chapters via two single buttons for backward and forward navigation and a small, and thus not very sensitive, timeline slider.
Ichinose, Tsuyoshi; Yamamoto, Atsushi; Kobayashi, Tsutomu; Shitara, Hitoshi; Shimoyama, Daisuke; Iizuka, Haku; Koibuchi, Noriyuki; Takagishi, Kenji
2016-02-01
Rotator cuff tear (RCT) is a common musculoskeletal disorder in the elderly. A large RCT is often irreparable due to retraction and degeneration of the rotator cuff muscle. The integrity of the teres minor (TM) muscle is thought to affect postoperative functional recovery in some surgical treatments. Hypertrophy of the TM is found in some patients with large RCTs; however, the process underlying this hypertrophy is still unclear. The objective of this study was to determine whether compensatory hypertrophy of the TM muscle occurs in a large RCT rat model. Twelve Wistar rats underwent transection of the suprascapular nerve and the supraspinatus and infraspinatus tendons in the left shoulder. The rats were euthanized 4 weeks after the surgery, and the cuff muscles were collected and weighed. The cross-sectional area and the involvement of Akt/mammalian target of rapamycin (mTOR) signaling were examined in the remaining TM muscle. The weight and cross-sectional area of the TM muscle were higher on the operated side than on the control side. The phosphorylated Akt/Akt protein ratio was not significantly different between these sides. The phosphorylated-mTOR/mTOR protein ratio was significantly higher on the operated side. Transection of the suprascapular nerve and the supraspinatus and infraspinatus tendons activates mTOR signaling in the TM muscle, which results in muscle hypertrophy. The Akt-signaling pathway may not be involved in this process. Nevertheless, activation of mTOR signaling in the TM muscle after RCT may be an effective therapeutic target for a large RCT. Copyright © 2016 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Yang, Bo; Wang, Mi; Xu, Wen; Li, Deren; Gong, Jianya; Pi, Yingdong
2017-12-01
The potential of large-scale block adjustment (BA) without ground control points (GCPs) has long been a concern among photogrammetric researchers, and is of practical significance for global mapping. However, significant problems with the accuracy and efficiency of this method remain to be solved. In this study, we analyzed the effects of geometric errors on BA, and then developed a step-wise BA method for integrated processing of large-scale ZY-3 satellite images without GCPs. We first pre-processed the BA data by adopting a geometric calibration (GC) method based on the viewing-angle model to compensate for systematic errors, such that the BA input images were of good initial geometric quality. The second step was integrated BA without GCPs, in which a series of technical methods were used to solve bottleneck problems and ensure accuracy and efficiency. A BA model based on virtual control points (VCPs) was constructed to address the rank deficiency problem caused by the lack of absolute constraints. We then developed a parallel matching strategy to improve the efficiency of tie point (TP) matching, and adopted a three-array data structure based on sparsity to relieve the storage and computational burden of the high-order modified equations. Finally, we used the conjugate gradient method to improve the speed of solving the high-order equations. To evaluate the feasibility of the presented large-scale BA method, we conducted three experiments on real data collected by the ZY-3 satellite. The experimental results indicate that the presented method can effectively improve the geometric accuracy of ZY-3 satellite images. This study demonstrates the feasibility of large-scale mapping without GCPs.
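The abstract names the key numerical ingredients (virtual-control-point constraints, sparse storage, conjugate-gradient solution of the normal equations) without giving code; the Python sketch below only illustrates that pattern on a toy sparse system. The matrix sizes, density, and the damping term standing in for the VCP constraints are invented, and this is not the authors' ZY-3 implementation.

```python
import numpy as np
from scipy.sparse import random as sparse_random, identity
from scipy.sparse.linalg import cg

# Hypothetical sparse design matrix A (observations x unknown orientation/tie-point parameters).
rng = np.random.default_rng(2)
A = sparse_random(5000, 1200, density=0.002, random_state=2, format="csr")
x_true = rng.normal(size=1200)
b = A @ x_true + rng.normal(scale=1e-3, size=5000)

# Normal equations N x = d; a small damping term plays the role of the soft
# constraints that virtual control points add to remove the rank deficiency.
N = (A.T @ A + 1e-6 * identity(1200)).tocsr()
d = A.T @ b
x_hat, info = cg(N, d, atol=1e-10)
print("converged" if info == 0 else f"cg info={info}",
      "residual norm:", float(np.linalg.norm(N @ x_hat - d)))
```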
NASA Astrophysics Data System (ADS)
Jutla, A.; Akanda, A. S.; Colwell, R. R.
2014-12-01
Prediction of the conditions of an impending disease outbreak remains a challenge, but is achievable if the associated large-scale hydroclimatic processes can be estimated in advance. Outbreaks of diarrheal diseases such as cholera are related to episodic seasonal variability in river discharge in regions where water and sanitation infrastructure are inadequate and insufficient. However, forecasting river discharge a few months in advance remains elusive where cholera outbreaks are frequent, probably due to the unavailability of geophysical data as well as transboundary water stresses. Here, we show that satellite-derived water storage from the Gravity Recovery and Climate Experiment (GRACE) sensors can provide reliable estimates of river discharge at least two months in advance over regional scales. Bayesian regression models predicted flooding and drought conditions, a prerequisite for cholera outbreaks, in the Bengal Delta with an overall accuracy of 70% up to 60 days in advance, without using any other ancillary ground-based data. Forecasting of river discharge will have significant impacts on planning and designing intervention strategies for potential cholera outbreaks in coastal regions where the disease remains endemic and often fatal.
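As a purely illustrative sketch of the forecasting idea, the Python snippet below fits a Bayesian linear regression from a GRACE-like storage anomaly to river discharge two months later and reports a simple flood/drought sign-agreement score. The data are synthetic and the scikit-learn BayesianRidge model is an assumed stand-in, not the authors' model.

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

# Hypothetical monthly pairs: terrestrial water storage anomaly (cm equivalent water
# height) at month t, paired with river discharge anomaly two months later.
rng = np.random.default_rng(3)
storage = rng.normal(0, 10, 120)
discharge_lead2 = 2.5 * storage + rng.normal(0, 5, 120)   # synthetic lagged response

X_train, y_train = storage[:100].reshape(-1, 1), discharge_lead2[:100]
X_test, y_test = storage[100:].reshape(-1, 1), discharge_lead2[100:]

model = BayesianRidge().fit(X_train, y_train)
y_pred, y_std = model.predict(X_test, return_std=True)    # mean forecast and its uncertainty

# A crude flood/drought "hit rate" in the spirit of the reported ~70% accuracy.
hits = np.mean(np.sign(y_pred) == np.sign(y_test))
print(f"sign agreement on held-out months: {hits:.2f}")
```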
Food processing and allergenicity.
Verhoeckx, Kitty C M; Vissers, Yvonne M; Baumert, Joseph L; Faludi, Roland; Feys, Marcel; Flanagan, Simon; Herouet-Guicheney, Corinne; Holzhauser, Thomas; Shimojo, Ryo; van der Bolt, Nieke; Wichers, Harry; Kimber, Ian
2015-06-01
Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed. In this review, the impact of processing (heat and non-heat treatment) on the allergenic potential of proteins, and on the antigenic (IgG-binding) and allergenic (IgE-binding) properties of proteins has been considered. A variety of allergenic foods (peanuts, tree nuts, cows' milk, hens' eggs, soy, wheat and mustard) have been reviewed. The overall conclusion drawn is that processing does not completely abolish the allergenic potential of allergens. Currently, only fermentation and hydrolysis may have potential to reduce allergenicity to such an extent that symptoms will not be elicited, while other methods might be promising but need more data. Literature on the effect of processing on allergenic potential and the ability to induce sensitisation is scarce. This is an important issue since processing may impact on the ability of proteins to cause the acquisition of allergic sensitisation, and the subject should be a focus of future research. Also, there remains a need to develop robust and integrated methods for the risk assessment of food allergenicity. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Corbetta, Matteo; Sbarufatti, Claudio; Giglio, Marco; Todd, Michael D.
2018-05-01
The present work critically analyzes the probabilistic definition of dynamic state-space models subject to Bayesian filters used for monitoring and predicting monotonic degradation processes. The study focuses on the selection of the random process, often called process noise, which is a key perturbation source in the evolution equation of particle filtering. Despite the large number of applications of particle filtering for predicting structural degradation, the adequacy of the chosen process noise has not been investigated. This paper reviews existing process noise models that are typically embedded in particle filters dedicated to monitoring and predicting structural damage caused by fatigue, which is monotonic in nature. The analysis emphasizes that existing formulations of the process noise can jeopardize the performance of the filter in terms of state estimation and remaining life prediction (i.e., damage prognosis). This paper subsequently proposes an optimal and unbiased process noise model and a list of requirements that the stochastic model must satisfy to guarantee high prognostic performance. These requirements are useful for future and further implementations of particle filtering for monotonic system dynamics. The validity of the new process noise formulation is assessed against experimental fatigue crack growth data from a full-scale aeronautical structure using dedicated performance metrics.
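To make the role of the process noise concrete, here is a minimal Python particle-filter sketch for Paris-law crack growth in which a multiplicative lognormal factor plays the part of the process noise being debated. All constants (C, m, the stress range, the noise spread) are invented for illustration, and the simple multiplicative choice is exactly the kind of ad hoc formulation whose bias the paper scrutinizes; it is not the authors' proposed model.

```python
import numpy as np

# Paris-law crack growth da/dN = C * (dK)^m with dK = d_stress * sqrt(pi * a).
rng = np.random.default_rng(4)
C, m, d_stress = 1e-12, 3.0, 60.0           # illustrative material/loading constants
n_particles, n_steps, dN = 500, 50, 1000    # particles, filter steps, cycles per step

def grow(a, noise):
    dK = d_stress * np.sqrt(np.pi * a)
    return a + C * dK**m * dN * noise

true_a = 5.0                                             # true crack length, mm
particles = np.full(n_particles, 5.0)
for _ in range(n_steps):
    true_a = grow(true_a, 1.0)
    obs = true_a + rng.normal(0, 0.05)                   # noisy crack-length measurement
    # Propagate with multiplicative lognormal process noise (its mean sits slightly
    # above 1, illustrating how an arbitrary choice can bias the prognosis).
    particles = grow(particles, rng.lognormal(0.0, 0.2, n_particles))
    weights = np.exp(-0.5 * ((obs - particles) / 0.05) ** 2)
    weights /= weights.sum()
    idx = rng.choice(n_particles, n_particles, p=weights)  # multinomial resampling
    particles = particles[idx]

print(f"true a = {true_a:.3f} mm, filtered estimate = {particles.mean():.3f} mm")
```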
The role of gap phase processes in the biomass dynamics of tropical forests
Feeley, Kenneth J; Davies, Stuart J; Ashton, Peter S; Bunyavejchewin, Sarayudh; Nur Supardi, M.N; Kassim, Abd Rahman; Tan, Sylvester; Chave, Jérôme
2007-01-01
The responses of tropical forests to global anthropogenic disturbances remain poorly understood. Above-ground woody biomass in some tropical forest plots has increased over the past several decades, potentially reflecting a widespread response to increased resource availability, for example, due to elevated atmospheric CO2 and/or nutrient deposition. However, previous studies of biomass dynamics have not accounted for natural patterns of disturbance and gap phase regeneration, making it difficult to quantify the importance of environmental changes. Using spatially explicit census data from large (50 ha) inventory plots, we investigated the influence of gap phase processes on the biomass dynamics of four ‘old-growth’ tropical forests (Barro Colorado Island (BCI), Panama; Pasoh and Lambir, Malaysia; and Huai Kha Khaeng (HKK), Thailand). We show that biomass increases were gradual and concentrated in earlier-phase forest patches, while biomass losses were generally of greater magnitude but concentrated in rarer later-phase patches. We then estimate the rate of biomass change at each site independent of gap phase dynamics using reduced major axis regressions and ANCOVA tests. Above-ground woody biomass increased significantly at Pasoh (+0.72% yr−1) and decreased at HKK (−0.56% yr−1) independent of changes in gap phase but remained stable at both BCI and Lambir. We conclude that gap phase processes play an important role in the biomass dynamics of tropical forests, and that quantifying the role of gap phase processes will help improve our understanding of the factors driving changes in forest biomass as well as their place in the global carbon budget. PMID:17785266
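The reduced major axis (RMA) regression used to estimate biomass change is simple enough to sketch; the following Python example implements the standard geometric-mean form of RMA on synthetic subplot biomass values. The subplot count, biomass range, and slope are invented and bear no relation to the BCI, Pasoh, Lambir, or HKK data.

```python
import numpy as np

def rma_regression(x, y):
    """Reduced major axis (geometric mean) regression: slope = sign(r) * sd(y)/sd(x)."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    intercept = np.mean(y) - slope * np.mean(x)
    return slope, intercept

# Hypothetical data: initial above-ground biomass (Mg/ha) per subplot vs. biomass at
# the next census; an RMA slope > 1 with intercept ~0 would indicate net gain.
rng = np.random.default_rng(5)
agb_initial = rng.uniform(150, 450, 50)
agb_final = 1.007 * agb_initial + rng.normal(0, 20, 50)
slope, intercept = rma_regression(agb_initial, agb_final)
print(f"RMA slope = {slope:.3f}, intercept = {intercept:.1f}")
```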
Modeling fMRI signals can provide insights into neural processing in the cerebral cortex
Sharifian, Fariba; Heikkinen, Hanna; Vigário, Ricardo
2015-01-01
Every stimulus or task activates multiple areas in the mammalian cortex. These distributed activations can be measured with functional magnetic resonance imaging (fMRI), which has the best spatial resolution among the noninvasive brain imaging methods. Unfortunately, the relationship between the fMRI activations and distributed cortical processing has remained unclear, both because the coupling between neural and fMRI activations has remained poorly understood and because fMRI voxels are too large to directly sense the local neural events. To get an idea of the local processing given the macroscopic data, we need models to simulate the neural activity and to provide output that can be compared with fMRI data. Such models can describe neural mechanisms as mathematical functions between input and output in a specific system, with little correspondence to physiological mechanisms. Alternatively, models can be biomimetic, including biological details with straightforward correspondence to experimental data. After careful balancing between complexity, computational efficiency, and realism, a biomimetic simulation should be able to provide insight into how biological structures or functions contribute to actual data processing as well as to promote theory-driven neuroscience experiments. This review analyzes the requirements for validating system-level computational models with fMRI. In particular, we study mesoscopic biomimetic models, which include a limited set of details from real-life networks and enable system-level simulations of neural mass action. In addition, we discuss how recent developments in neurophysiology and biophysics may significantly advance the modelling of fMRI signals. PMID:25972586
Modeling fMRI signals can provide insights into neural processing in the cerebral cortex.
Vanni, Simo; Sharifian, Fariba; Heikkinen, Hanna; Vigário, Ricardo
2015-08-01
Every stimulus or task activates multiple areas in the mammalian cortex. These distributed activations can be measured with functional magnetic resonance imaging (fMRI), which has the best spatial resolution among the noninvasive brain imaging methods. Unfortunately, the relationship between the fMRI activations and distributed cortical processing has remained unclear, both because the coupling between neural and fMRI activations has remained poorly understood and because fMRI voxels are too large to directly sense the local neural events. To get an idea of the local processing given the macroscopic data, we need models to simulate the neural activity and to provide output that can be compared with fMRI data. Such models can describe neural mechanisms as mathematical functions between input and output in a specific system, with little correspondence to physiological mechanisms. Alternatively, models can be biomimetic, including biological details with straightforward correspondence to experimental data. After careful balancing between complexity, computational efficiency, and realism, a biomimetic simulation should be able to provide insight into how biological structures or functions contribute to actual data processing as well as to promote theory-driven neuroscience experiments. This review analyzes the requirements for validating system-level computational models with fMRI. In particular, we study mesoscopic biomimetic models, which include a limited set of details from real-life networks and enable system-level simulations of neural mass action. In addition, we discuss how recent developments in neurophysiology and biophysics may significantly advance the modelling of fMRI signals. Copyright © 2015 the American Physiological Society.
The role of gap phase processes in the biomass dynamics of tropical forests.
Feeley, Kenneth J; Davies, Stuart J; Ashton, Peter S; Bunyavejchewin, Sarayudh; Nur Supardi, M N; Kassim, Abd Rahman; Tan, Sylvester; Chave, Jérôme
2007-11-22
The responses of tropical forests to global anthropogenic disturbances remain poorly understood. Above-ground woody biomass in some tropical forest plots has increased over the past several decades, potentially reflecting a widespread response to increased resource availability, for example, due to elevated atmospheric CO2 and/or nutrient deposition. However, previous studies of biomass dynamics have not accounted for natural patterns of disturbance and gap phase regeneration, making it difficult to quantify the importance of environmental changes. Using spatially explicit census data from large (50 ha) inventory plots, we investigated the influence of gap phase processes on the biomass dynamics of four 'old-growth' tropical forests (Barro Colorado Island (BCI), Panama; Pasoh and Lambir, Malaysia; and Huai Kha Khaeng (HKK), Thailand). We show that biomass increases were gradual and concentrated in earlier-phase forest patches, while biomass losses were generally of greater magnitude but concentrated in rarer later-phase patches. We then estimate the rate of biomass change at each site independent of gap phase dynamics using reduced major axis regressions and ANCOVA tests. Above-ground woody biomass increased significantly at Pasoh (+0.72% yr(-1)) and decreased at HKK (-0.56% yr(-1)) independent of changes in gap phase but remained stable at both BCI and Lambir. We conclude that gap phase processes play an important role in the biomass dynamics of tropical forests, and that quantifying the role of gap phase processes will help improve our understanding of the factors driving changes in forest biomass as well as their place in the global carbon budget.
Holdway, Douglas A
2002-03-01
A review of the acute and chronic effects of produced formation water (PFW), drilling fluids (muds) including oil-based cutting muds, water-based cutting muds, ester-based cutting muds and chemical additives, and crude oils associated with offshore oil and gas production was undertaken in relation to both temperate and tropical marine ecological processes. The main environmental effects are summarized, often in tabular form. Generally, the temporal and spatial scales of these studies, along with the large levels of inherent variation in natural environments, have precluded our ability to predict the potential long-term environmental impacts of the offshore oil and gas production industry. A series of critical questions regarding the environmental effects of the offshore oil and gas production industry that still remain unanswered are provided for future consideration.
Biochemical Reconstitution of the WAVE Regulatory Complex
Chen, Baoyu; Padrick, Shae B.; Henry, Lisa; Rosen, Michael K.
2014-01-01
The WAVE regulatory complex (WRC) is a 400-kDa heteropentameric protein assembly that plays a central role in controlling actin cytoskeletal dynamics in many cellular processes. The WRC acts by integrating diverse cellular cues and stimulating the actin-nucleating activity of the Arp2/3 complex at membranes. Biochemical and biophysical studies of the underlying mechanisms of these processes require large amounts of purified WRC. Recent success in recombinant expression, reconstitution, purification and crystallization of the WRC has greatly advanced our understanding of the inhibition, activation and membrane recruitment mechanisms of this complex. But many important questions remain to be answered. Here we summarize and update the methods developed in our laboratory, which allow reliable and flexible production of tens of milligrams of recombinant WRC of crystallographic quality, sufficient for many biochemical and structural studies. PMID:24630101
The business value of health care information technology.
Frisse, M C
1999-01-01
The American health care system is one of the world's largest and most complex industries. The Health Care Financing Administration reports that 1997 expenditures for health care exceeded one trillion dollars, or 13.5 percent of the gross domestic product. Despite these expenditures, over 16 percent of the U.S. population remains uninsured, and a large percentage of patients express dissatisfaction with the health care system. Managed care, effective in its ability to attenuate the rate of cost increase, is associated with a concomitant degree of administrative overhead that is often perceived by providers and patients alike as a major source of cost and inconvenience. Both providers and patients sense a great degree of inconvenience and an excessive amount of paperwork associated with both the process of seeking medical care and the subsequent process of paying for medical services.
The B1 Protein Guides the Biosynthesis of a Lasso Peptide
NASA Astrophysics Data System (ADS)
Zhu, Shaozhou; Fage, Christopher D.; Hegemann, Julian D.; Mielcarek, Andreas; Yan, Dushan; Linne, Uwe; Marahiel, Mohamed A.
2016-10-01
Lasso peptides are a class of ribosomally synthesized and post-translationally modified peptides (RiPPs) with a unique lariat knot-like fold that endows them with extraordinary stability and biologically relevant activity. However, the biosynthetic mechanism of these fascinating molecules remains largely speculative. Generally, two enzymes (B for processing and C for cyclization) are required to assemble the unusual knot-like structure. Several subsets of lasso peptide gene clusters feature a “split” B protein on separate open reading frames (B1 and B2), suggesting distinct functions for the B protein in lasso peptide biosynthesis. Herein, we provide new insights into the role of the RiPP recognition element (RRE) PadeB1, characterizing its capacity to bind the paeninodin leader peptide and deliver its peptide substrate to PadeB2 for processing.
He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z
2013-12-04
Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.
The Business Value of Health Care Information Technology
Frisse, Mark C.
1999-01-01
The American health care system is one of the world's largest and most complex industries. The Health Care Financing Administration reports that 1997 expenditures for health care exceeded one trillion dollars, or 13.5 percent of the gross domestic product. Despite these expenditures, over 16 percent of the U.S. population remains uninsured, and a large percentage of patients express dissatisfaction with the health care system. Managed care, effective in its ability to attenuate the rate of cost increase, is associated with a concomitant degree of administrative overhead that is often perceived by providers and patients alike as a major source of cost and inconvenience. Both providers and patients sense a great degree of inconvenience and an excessive amount of paperwork associated with both the process of seeking medical care and the subsequent process of paying for medical services. PMID:10495096
Small Rayed Crater Ejecta Retention Age Calculated from Current Crater Production Rates on Mars
NASA Technical Reports Server (NTRS)
Calef, F. J. III; Herrick, R. R.; Sharpton, V. L.
2011-01-01
Ejecta from impact craters, while extant, record erosive and depositional processes on their surfaces. The ejecta retention age (Eret), the time span over which ejecta remains recognizable around a crater, can be used to estimate the timescales over which surface processes operate, thereby yielding a history of geologic activity. However, the abundance of sub-kilometer diameter (D) craters identifiable in high-resolution Mars imagery has led to questions about the accuracy of absolute crater dating and hence of ejecta retention ages (Eret). This research calculates the maximum Eret for small rayed impact craters (SRC) on Mars using estimates of the Martian impactor flux adjusted for meteorite ablation losses in the atmosphere. In addition, we utilize the diameter-distance relationship of secondary cratering to adjust crater counts in the vicinity of the large primary crater Zunil.
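As a back-of-envelope illustration only (not the paper's actual procedure or numbers), a retention age can be obtained by dividing an observed density of small rayed craters by an assumed, ablation-adjusted production rate; the Python lines below use wholly hypothetical values.

```python
# Hypothetical values, for illustration only.
n_rayed_per_km2 = 2.0e-4             # assumed observed density of small rayed craters (per km^2)
production_per_km2_per_yr = 4.0e-10  # assumed ablation-adjusted production rate (per km^2 per yr)

# If rayed ejecta accumulate at the production rate and then fade, N = F * Eret.
eret_years = n_rayed_per_km2 / production_per_km2_per_yr
print(f"maximum ejecta retention age ~ {eret_years:.1e} years")
```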
State of the art in perceptual design of hearing aids
NASA Astrophysics Data System (ADS)
Edwards, Brent W.; van Tasell, Dianne J.
2002-05-01
Hearing aid capabilities have increased dramatically over the past six years, in large part due to the development of small, low-power digital signal processing chips suitable for hearing aid applications. As hearing aid signal processing capabilities increase, there will be new opportunities to apply perceptually based knowledge to technological development. Most hearing loss compensation techniques in today's hearing aids are based on simple estimates of audibility and loudness. As our understanding of the psychoacoustical and physiological characteristics of sensorineural hearing loss improves, the result should be improved design of hearing aids and fitting methods. The state of the art in hearing aids will be reviewed, including form factors, user requirements, and technology that improves speech intelligibility, sound quality, and functionality. General areas of auditory perception that remain unaddressed by current hearing aid technology will be discussed.
NASA Astrophysics Data System (ADS)
Miguez-Macho, Gonzalo; Stenchikov, Georgiy L.; Robock, Alan
2005-04-01
The reasons for biases in regional climate simulations were investigated in an attempt to discern whether they arise from deficiencies in the model parameterizations or are due to dynamical problems. Using the Regional Atmospheric Modeling System (RAMS) forced by the National Centers for Environmental Prediction-National Center for Atmospheric Research reanalysis, the detailed climate over North America at 50-km resolution for June 2000 was simulated. First, the RAMS equations were modified to make them applicable to a large region, and its turbulence parameterization was corrected. The initial simulations showed large biases in the location of precipitation patterns and surface air temperatures. By implementing higher-resolution soil data, soil moisture and soil temperature initialization, and corrections to the Kain-Fritsch convective scheme, the temperature biases and precipitation amount errors could be removed, but the precipitation location errors remained. The precipitation location biases could only be improved by implementing spectral nudging of the large-scale (wavelength of 2500 km) dynamics in RAMS.
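Spectral nudging can be illustrated in one dimension: the Python sketch below relaxes only the long-wavelength Fourier components of a model field toward a driving field, leaving shorter scales free to evolve. The 2500-km cutoff matches the abstract; the relaxation coefficient, grid, and fields are invented, and the real RAMS implementation operates on 2-D atmospheric fields inside the model's time stepping.

```python
import numpy as np

def spectral_nudge(field, driver, dx_km, cutoff_km=2500.0, alpha=0.1):
    """Relax only the long-wavelength part of a periodic 1-D model field toward the
    driving (reanalysis) field; wavelengths shorter than the cutoff are untouched."""
    k = np.fft.rfftfreq(field.size, d=dx_km)           # spatial frequency in cycles/km
    keep = (k > 0) & (1.0 / np.maximum(k, 1e-12) >= cutoff_km)
    keep[0] = True                                      # also nudge the domain mean
    f_hat, d_hat = np.fft.rfft(field), np.fft.rfft(driver)
    f_hat[keep] += alpha * (d_hat[keep] - f_hat[keep])
    return np.fft.irfft(f_hat, n=field.size)

# Hypothetical 1-D example: a 6000-km periodic domain at 50-km grid spacing.
x = np.arange(0, 6000, 50.0)
driver = np.sin(2 * np.pi * x / 6000)                                   # large-scale wave
model = np.sin(2 * np.pi * x / 6000 + 0.5) + 0.3 * np.sin(2 * np.pi * x / 300)
nudged = spectral_nudge(model, driver, dx_km=50.0)
print(np.round(np.abs(nudged - model).max(), 3))
```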
Phenotypic Diagnosis of Lineage and Differentiation During Sake Yeast Breeding
Ohnuki, Shinsuke; Okada, Hiroki; Friedrich, Anne; Kanno, Yoichiro; Goshima, Tetsuya; Hasuda, Hirokazu; Inahashi, Masaaki; Okazaki, Naoto; Tamura, Hiroyasu; Nakamura, Ryo; Hirata, Dai; Fukuda, Hisashi; Shimoi, Hitoshi; Kitamoto, Katsuhiko; Watanabe, Daisuke; Schacherer, Joseph; Akao, Takeshi; Ohya, Yoshikazu
2017-01-01
Sake yeast was developed exclusively in Japan. Its diversification during breeding remains largely uncharacterized. To evaluate the breeding processes of the sake lineage, we thoroughly investigated the phenotypes and differentiation of 27 sake yeast strains using high-dimensional, single-cell, morphological phenotyping. Although the genetic diversity of the sake yeast lineage is relatively low, its morphological diversity has expanded substantially compared to that of the Saccharomyces cerevisiae species as a whole. Evaluation of the different types of breeding processes showed that the generation of hybrids (crossbreeding) has more profound effects on cell morphology than the isolation of mutants (mutation breeding). Analysis of phenotypic robustness revealed that some sake yeast strains are more morphologically heterogeneous, possibly due to impairment of cellular network hubs. This study provides a new perspective for studying yeast breeding genetics and micro-organism breeding strategies. PMID:28642365
Cold Multiphoton Matrix Assisted Laser Desorption/Ionization (MALDI)
NASA Astrophysics Data System (ADS)
Harris, Peter; Cooke, William; Tracy, Eugene
2008-05-01
We present evidence of a cold multiphoton MALDI process occurring at a Room Temperature Ionic Liquid (RTIL)/metal interface. Our RTIL, 1-Butyl-3-methylimidazolium hexafluorophosphate, remains a stable liquid at room temperature, even at pressures lower than 10^-9 torr. We focus the second harmonic of a pulsed (2 ns pulse length) Nd:YAG laser onto a gold grid coated with RTIL to generate a cold (narrow velocity spread) ion source with temporal resolution comparable to current MALDI ion sources. Unlike conventional MALDI, we believe multiphoton MALDI does not rely on collisional ionization within the ejection plume, and thus produces large signals at laser intensities just above threshold. Removing the collisional ionization process allows us to eject material from smaller regions of a sample, enhancing the suitability of multiphoton MALDI as an ion imaging technique.
Baines, Christopher P; Gutiérrez-Aguilar, Manuel
2018-07-01
Mitochondria from different organisms can undergo a sudden process of inner membrane unselective leakiness to molecules, known as the mitochondrial permeability transition (MPT). This process has been studied for nearly four decades and several proteins have been claimed to constitute, or at least regulate, the usually inactive pore responsible for this transition. However, no protein candidate proposed as the actual pore-forming unit has passed rigorous gain- or loss-of-function genetic tests. Here we review evidence for - and against - putative channel-forming components of the MPT pore. We conclude that the structure of the MPT pore still remains largely undefined and suggest that future studies should follow established technical considerations to unambiguously consolidate the channel-forming constituent(s) of the MPT pore. Copyright © 2018 Elsevier Ltd. All rights reserved.
Coral symbiotic algae calcify ex hospite in partnership with bacteria.
Frommlet, Jörg C; Sousa, Maria L; Alves, Artur; Vieira, Sandra I; Suggett, David J; Serôdio, João
2015-05-12
Dinoflagellates of the genus Symbiodinium are commonly recognized as invertebrate endosymbionts that are of central importance for the functioning of coral reef ecosystems. However, the endosymbiotic phase within Symbiodinium life history is inherently tied to a more cryptic free-living (ex hospite) phase that remains largely unexplored. Here we show that free-living Symbiodinium spp. in culture commonly form calcifying bacterial-algal communities that produce aragonitic spherulites and encase the dinoflagellates as endolithic cells. This process is driven by Symbiodinium photosynthesis but occurs only in partnership with bacteria. Our findings not only place dinoflagellates on the map of microbial-algal organomineralization processes but also point toward an endolithic phase in the Symbiodinium life history, a phenomenon that may provide new perspectives on the biology and ecology of Symbiodinium spp. and the evolutionary history of the coral-dinoflagellate symbiosis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael Kruzic
2007-09-01
Located in Area 25 of the Nevada Test Site, the Test Cell A Facility was used in the 1960s for the testing of nuclear rocket engines as part of the Nuclear Rocket Development Program. The facility was decontaminated and decommissioned (D&D) in 2005 using the Streamlined Approach For Environmental Restoration (SAFER) process, under the Federal Facilities Agreement and Consent Order (FFACO). Utilities and process piping were verified void of contents, hazardous materials were removed, concrete with removable contamination was decontaminated, large sections were mechanically demolished, and the remaining five-foot five-inch-thick radiologically activated reinforced concrete shield wall was demolished using open-air controlled explosive demolition (CED). CED of the shield wall was closely monitored and resulted in no radiological exposure or atmospheric release.
Cowley, Benjamin U.; Korpela, Jussi
2018-01-01
Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap. PMID:29692705
Cowley, Benjamin U; Korpela, Jussi
2018-01-01
Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap.
Finding the right way: DFM versus area efficiency for 65 nm gate layer lithography
NASA Astrophysics Data System (ADS)
Sarma, Chandra S.; Scheer, Steven; Herold, Klaus; Fonseca, Carlos; Thomas, Alan; Schroeder, Uwe P.
2006-03-01
DFM (Design for Manufacturing) has become a buzzword for lithography since the 90nm node. Implementing DFM intelligently can significantly boost yield rates and reliability in semiconductor manufacturing. However, any restriction on the design space will always result in an area loss, thus diminishing the effective shrink factor for a given technology. For a lithographer, the key task is to develop a manufacturable process while not sacrificing too much area. We have developed a high-performing lithography process for attenuated gate level lithography that is based on aggressive illumination and a newly optimized SRAF placement scheme. In this paper we present our methodology and results for this optimization, using an anchored simulation model. The wafer results largely confirm the predictions of the simulations. The use of an aggressive SRAF (sub-resolution assist feature) strategy reduces forbidden pitch regions without any SRAF printing. The data show that our OPC is capable of correcting the PC tip-to-tip distance without bridging between the tips in dense SRAM cells. The SRAF strategy for various 2D cases has also been verified on wafer. We have shown that aggressive illumination schemes yielding a high-performing lithography process can be employed without sacrificing area. By carefully choosing processing conditions, we were able to develop a process that places very few restrictions on design. In our approach, the remaining issues can be addressed by DFM, partly in data prep procedures, which are largely area neutral and transparent to the designers. Hence, we have successfully shown that DFM and effective technology shrinks are not mutually exclusive.
Landscape characteristics of disturbed shrubsteppe habitats in southwestern Idaho (USA)
Knick, Steven T.; Rotenberry, J.T.
1997-01-01
We compared 5 zones in shrubsteppe habitats of southwestern Idaho to determine the effect of differing disturbance combinations on landscapes that once shared historically similar disturbance regimes. The primary consequence of agriculture, wildfires, and extensive fires ignited by the military during training activities was loss of native shrubs from the landscape. Agriculture created large square blocks on the landscape, and the landscape contained fewer small patches and more large shrub patches than non-agricultural areas. In contrast, fires left a more fragmented landscape. Repeated fires did not change the distribution of patch sizes, but decreased the total area of remaining shrublands and increased the distance between remaining shrub patches that provide seed sources. Military training with tracked vehicles was associated with a landscape characterized by small, closely spaced, shrub patches. Our results support the general model hypothesized for conversion of shrublands to annual grasslands by disturbance. Larger shrub patches in our region, historically resistant to fire spread and large-scale fires because of a perennial bunchgrass understory, were more fragmented than small patches. Presence of cheatgrass (Bromus tectorum), an exotic annual, was positively related to landscape patchiness and negatively related to number of shrub cells. Thus, cheatgrass dominance can contribute to further fragmentation and loss of the shrub patch by facilitating spread of subsequent fires, carried by continuous fuels, through the patch. The synergistic processes of fragmentation of shrub patches by disturbance, invasion and subsequent dominance by exotic annuals, and fire are converting shrubsteppe in southwestern Idaho to a new state dominated by exotic annual grasslands and high fire frequencies.
NASA Astrophysics Data System (ADS)
Pathiraja, S. D.; van Leeuwen, P. J.
2017-12-01
Model uncertainty quantification remains one of the central challenges of effective data assimilation (DA) in complex, partially observed, non-linear systems. Stochastic parameterization methods have been proposed in recent years as a means of capturing the uncertainty associated with unresolved sub-grid scale processes. Such approaches generally require some knowledge of the true sub-grid scale process or rely on full observations of the larger-scale resolved process. We present a methodology for estimating the statistics of sub-grid scale processes using only partial observations of the resolved process. It finds model error realisations over a training period by minimizing their conditional variance, constrained by available observations. A distinctive feature is that these realisations are binned conditional on the previous model state during the minimization process, allowing complex error structures to be recovered. The efficacy of the approach is demonstrated through numerical experiments on the multi-scale Lorenz '96 model. We consider different parameterizations of the model with both small and large time scale separations between slow and fast variables. Results are compared to two existing methods for accounting for model uncertainty in DA and shown to provide improved analyses and forecasts.
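The state-conditioned binning is the easiest part of the method to illustrate; the Python toy below bins precomputed model-error realisations on the previous model state and reports per-bin statistics that could drive a state-dependent stochastic parameterization. In the actual method the error realisations are themselves estimated from partial observations by minimizing their conditional variance, a step this sketch skips; the "truth" and "model" tendencies here are invented.

```python
import numpy as np

# Toy setup: error = (unresolved sub-grid tendency) - (crude resolved-scale closure).
rng = np.random.default_rng(6)
prev_state = rng.uniform(-5, 5, 2000)
true_tendency = np.sin(prev_state)
model_tendency = 0.8 * prev_state / 5.0
error = true_tendency - model_tendency + rng.normal(0, 0.1, prev_state.size)

# Bin the error realisations on the previous model state and summarise each bin.
bins = np.linspace(-5, 5, 11)
which = np.digitize(prev_state, bins) - 1
bin_mean = np.array([error[which == b].mean() for b in range(10)])
bin_std = np.array([error[which == b].std(ddof=1) for b in range(10)])
print("per-bin mean:", np.round(bin_mean, 2))
print("per-bin std: ", np.round(bin_std, 2))
```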
Skórska, Czesława; Sitkowska, Jolanta; Krysińska-Traczyk, Ewa; Cholewa, Grazyna; Dutkiewicz, Jacek
2005-01-01
The aim of this study was to determine the levels of microorganisms, dust and endotoxin in the air during processing of peppermint (Mentha piperita) and chamomile (Matricaria recutita) by herb farmers, and to examine the species composition of airborne microflora. Air samples were collected on glass fibre filters by use of personal samplers on 13 farms owned by herb cultivating farmers, located in Lublin province (eastern Poland). The concentrations of total viable microorganisms (bacteria + fungi) in the farm air during processing of peppermint herb were large, within a range from 895.1-6,015.8 x 10^3 cfu/m^3 (median 1,055.3 x 10^3 cfu/m^3). During processing of chamomile herb they were much lower and varied within a range from 0.88-295.6 x 10^3 cfu/m^3 (median 27.3 x 10^3 cfu/m^3). Gram-negative bacteria distinctly prevailed during processing of peppermint leaves, forming 46.4-88.5% of the total airborne microflora. During processing of chamomile herb, Gram-negative bacteria were dominant at 3 out of 6 sampling sites forming 54.7-75.3% of total microflora, whereas at the remaining 3 sites the most common were fungi forming 46.2-99.9% of the total count. The species Pantoea agglomerans (synonyms: Erwinia herbicola, Enterobacter agglomerans), having strong allergenic and endotoxic properties, distinctly prevailed among Gram-negative isolates. Among fungi, the most common species was Alternaria alternata. The concentrations of airborne dust and endotoxin determined on the examined herb farms were large. The concentrations of airborne dust during peppermint and chamomile processing ranged from 86.7-958.9 mg/m^3 and from 1.1-499.2 mg/m^3, respectively (medians 552.3 mg/m^3 and 12.3 mg/m^3). The concentrations of airborne endotoxin determined during peppermint and chamomile processing were within a wide range of 1.53-208.33 microg/m^3 and 0.005-2604.19 microg/m^3, respectively (medians 57.3 microg/m^3 and 0.96 microg/m^3). In conclusion, farmers cultivating peppermint are exposed during processing of this herb to large concentrations of airborne microorganisms, dust and endotoxin posing a risk of work-related respiratory disease. The exposure to bioaerosols during processing of chamomile is lower; nevertheless, peak values create a respiratory risk for exposed farmers.
Selective attention modulates high-frequency activity in the face-processing network.
Müsch, Kathrin; Hamamé, Carlos M; Perrone-Bertolotti, Marcela; Minotti, Lorella; Kahane, Philippe; Engel, Andreas K; Lachaux, Jean-Philippe; Schneider, Till R
2014-11-01
Face processing depends on the orchestrated activity of a large-scale neuronal network. Its activity can be modulated by attention as a function of task demands. However, it remains largely unknown whether voluntary, endogenous attention and reflexive, exogenous attention to facial expressions equally affect all regions of the face-processing network, and whether such effects primarily modify the strength of the neuronal response, the latency, the duration, or the spectral characteristics. We exploited the good temporal and spatial resolution of intracranial electroencephalography (iEEG) and recorded from depth electrodes to uncover the fast dynamics of emotional face processing. We investigated frequency-specific responses and event-related potentials (ERP) in the ventral occipito-temporal cortex (VOTC), ventral temporal cortex (VTC), anterior insula, orbitofrontal cortex (OFC), and amygdala when facial expressions were task-relevant or task-irrelevant. All investigated regions of interest (ROI) were clearly modulated by task demands and exhibited stronger changes in stimulus-induced gamma band activity (50-150 Hz) when facial expressions were task-relevant. Observed latencies demonstrate that the activation is temporally coordinated across the network, rather than serially proceeding along a processing hierarchy. Early and sustained responses to task-relevant faces in VOTC and VTC corroborate their role for the core system of face processing, but they also occurred in the anterior insula. Strong attentional modulation in the OFC and amygdala (300 msec) suggests that the extended system of the face-processing network is only recruited if the task demands active face processing. Contrary to our expectation, we rarely observed differences between fearful and neutral faces. Our results demonstrate that activity in the face-processing network is susceptible to the deployment of selective attention. Moreover, we show that endogenous attention operates along the whole face-processing network, and that these effects are reflected in frequency-specific changes in the gamma band. Copyright © 2014 Elsevier Ltd. All rights reserved.
Multi-carrier transmission for hybrid radio frequency with optical wireless communications
NASA Astrophysics Data System (ADS)
Wang, Gang; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik; Nguyen, Tien M.
2015-05-01
Radio frequency (RF) wireless communication is reaching its capacity to support large data rate transmissions due to hardware constraints (e.g., silicon processes), software strategies (e.g., information theory), and consumer desire for timely large file exchanges (e.g., big data and mobile cloud computing). Transmission rates must keep pace with the huge volumes of data generated for real-time processing. Integrated RF and optical wireless communications (RF/OWC) could be the next generation transmission technology to satisfy both the increased data rate exchange and the communications constraints. However, along with the promising benefits of RF/OWC, challenges remain in fully developing hybrid RF and optical wireless communications, such as a uniform waveform design for information transmission and detection. In this paper, an orthogonal frequency division multiplexing (OFDM) transmission scheme, which is widely employed in RF communications, is developed for optical communications. The traditional high peak-to-average power ratio (PAPR) in OFDM is reduced to improve system performance. The proposed multi-carrier waveform is evaluated with a frequency-selective fading channel. The results demonstrate that the bit error rate (BER) performance of our proposed optical OFDM transmission technique outperforms the traditional OWC on-off keying (OOK) transmission scheme.
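As a minimal illustration of the PAPR issue mentioned above, the sketch below generates one baseband OFDM symbol from random QPSK subcarriers and computes its peak-to-average power ratio; the subcarrier count, mapping, and oversampling factor are illustrative assumptions, not the paper's waveform design.

```python
import numpy as np

rng = np.random.default_rng(1)
n_subcarriers = 64
oversampling = 4  # zero-padding the IFFT approximates the continuous-time peak

# Random QPSK symbols on the data subcarriers.
qpsk = (rng.choice([-1, 1], n_subcarriers) + 1j * rng.choice([-1, 1], n_subcarriers)) / np.sqrt(2)
spectrum = np.zeros(n_subcarriers * oversampling, dtype=complex)
spectrum[:n_subcarriers] = qpsk          # simplified mapping; guard bands omitted
x = np.fft.ifft(spectrum) * np.sqrt(len(spectrum))

power = np.abs(x) ** 2
papr_db = 10 * np.log10(power.max() / power.mean())
print(f"PAPR of this OFDM symbol: {papr_db:.1f} dB")
```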
Multiple yielding processes in a colloidal gel under large amplitude oscillatory stress
NASA Astrophysics Data System (ADS)
Gibaud, Thomas; Perge, Christophe; Lindström, Stefan B.; Taberlet, Nicolas; Manneville, Sébastien
Fatigue refers to the changes in material properties caused by repeatedly applied loads. It has been widely studied for, e.g., construction materials, but much less work has been done on soft materials. Here, we characterize the fatigue dynamics of a colloidal gel. Fatigue is induced by large amplitude oscillatory stress (LAOStress), and the local displacements of the gel are measured through high-frequency ultrasonic imaging. We show that fatigue eventually leads to rupture and fluidization. We evidence four successive steps associated with these dynamics: (i) the gel first remains solid, (ii) it then slides against the walls, (iii) the bulk of the sample becomes heterogeneous and displays solid-fluid coexistence, and (iv) it is finally fully fluidized. The duration of each step can be consistently rescaled with respect to the stress oscillation amplitude $\sigma_0$. The data are compatible with both exponential and power-law scalings with $\sigma_0$, which hints at two possible interpretations: delayed yielding through activated processes, or the Basquin law of fatigue. Surprisingly, we find that the model parameters behave nonmonotonically as we change the oscillation frequency and/or the gel concentration.
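For concreteness, the two candidate scalings of a step duration can be written as below, where $\tau$ is the duration of a given step and $\tau_0$, $\tau_0'$, $\sigma_c$ and $\beta$ are fit parameters; this notation is an assumed sketch, not the authors' exact formulation.

```latex
% Activated (delayed-yielding) picture: duration decreases exponentially with stress
\tau(\sigma_0) \simeq \tau_0 \exp\!\left(-\frac{\sigma_0}{\sigma_c}\right)
% Basquin-type picture: duration decreases as a power law of stress
\tau(\sigma_0) \simeq \tau_0' \, \sigma_0^{-\beta}
```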
Music and emotions: from enchantment to entrainment.
Vuilleumier, Patrik; Trost, Wiebke
2015-03-01
Producing and perceiving music engage a wide range of sensorimotor, cognitive, and emotional processes. Emotions are a central feature of the enjoyment of music, with a large variety of affective states consistently reported by people while listening to music. However, besides joy or sadness, music often elicits feelings of wonder, nostalgia, or tenderness, which do not correspond to emotion categories typically studied in neuroscience and whose neural substrates remain largely unknown. Here we review the similarities and differences in the neural substrates underlying these "complex" music-evoked emotions relative to other more "basic" emotional experiences. We suggest that these emotions emerge through a combination of activation in emotional and motivational brain systems (e.g., including reward pathways) that confer its valence to music, with activation in several other areas outside emotional systems, including motor, attention, or memory-related regions. We then discuss the neural substrates underlying the entrainment of cognitive and motor processes by music and their relation to affective experience. These effects have important implications for the potential therapeutic use of music in neurological or psychiatric diseases, particularly those associated with motor, attention, or affective disturbances. © 2015 New York Academy of Sciences.
Dynamics of carbon dioxide emission at Mammoth Mountain, California
Rogie, J.D.; Kerrick, Derrill M.; Sorey, M.L.; Chiodini, G.; Galloway, D.L.
2001-01-01
Mammoth Mountain, a dormant volcano in the eastern Sierra Nevada, California, has been passively degassing large quantities of cold magmatic CO2 since 1990 following a 6-month-long earthquake swarm associated with a shallow magmatic intrusion in 1989. A search for any link between gas discharge and volcanic hazard at this popular recreation area led us to initiate a detailed study of the degassing process in 1997. Our continuous monitoring results elucidate some of the physical controls that influence dynamics in flank CO2 degassing at this volcano. High coherence between variations in CO2 efflux and variations in atmospheric pressure and wind speed implies that meteorological parameters account for much, if not all, of the variability in CO2 efflux rates. Our results help explain differences among previously published estimates of CO2 efflux at Mammoth Mountain and indicate that the long-term (annual) CO2 degassing rate has in fact remained constant since ~ 1997. Discounting the possibility of large meteorologically driven temporal variations in gas efflux at other volcanoes may result in spurious interpretations of transients that do not reflect actual geologic processes. © 2001 Elsevier Science B.V. All rights reserved.
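One standard way to quantify the coupling between CO2 efflux and meteorological forcing described above is the magnitude-squared coherence between the two time series; the sketch below uses synthetic hourly data and assumed parameters, not the Mammoth Mountain monitoring records.

```python
import numpy as np
from scipy.signal import coherence

# Synthetic hourly series standing in for barometric pressure and CO2 efflux.
rng = np.random.default_rng(2)
n_hours = 24 * 365
t = np.arange(n_hours)
pressure = 5 * np.sin(2 * np.pi * t / 24) + rng.normal(scale=1.0, size=n_hours)
efflux = -0.8 * pressure + rng.normal(scale=2.0, size=n_hours)  # efflux rises as pressure drops

# Magnitude-squared coherence estimated over two-week segments.
f, cxy = coherence(efflux, pressure, fs=1.0, nperseg=24 * 14)
peak_freq = f[np.argmax(cxy)]
print(f"Peak coherence {cxy.max():.2f} at {peak_freq:.4f} cycles/hour")
```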
Huang, Chun; Zhang, Jin; Young, Neil P.; Snaith, Henry J.; Grant, Patrick S.
2016-01-01
Supercapacitors are in demand for short-term electrical charge and discharge applications. Unlike conventional supercapacitors, solid-state versions have no liquid electrolyte and do not require robust, rigid packaging for containment. Consequently they can be thinner, lighter and more flexible. However, solid-state supercapacitors suffer from lower power density and where new materials have been developed to improve performance, there remains a gap between promising laboratory results that usually require nano-structured materials and fine-scale processing approaches, and current manufacturing technology that operates at large scale. We demonstrate a new, scalable capability to produce discrete, multi-layered electrodes with a different material and/or morphology in each layer, and where each layer plays a different, critical role in enhancing the dynamics of charge/discharge. This layered structure allows efficient utilisation of each material and enables conservative use of hard-to-obtain materials. The layered electrode shows amongst the highest combinations of energy and power densities for solid-state supercapacitors. Our functional design and spray manufacturing approach to heterogeneous electrodes provide a new way forward for improved energy storage devices. PMID:27161379
Effect of large wood retention at check dams on sediment continuity
NASA Astrophysics Data System (ADS)
Schmocker, Lukas; Schalko, Isabella; Weitbrecht, Volker
2017-04-01
Large wood transport during flood events may seriously increase the damage potential due to accumulations at river infrastructures. The large wood is therefore mostly retained upstream of populated areas using retention structures that often combine a check dam with a debris rack. One disadvantage of these structures is that the bed-load is retained along with the wood. Especially if large wood blocks the rack early during a flood event, sediment continuity is completely interrupted. This may lead to severe bed erosion downstream of the check dam. So far, no common design to retain large wood but maintain sediment continuity is available. One attempt to separate the large wood from the bed-load was made with the large wood retention structure at River Sihl in Zürich, Switzerland. The retention of the large wood occurs in a bypass channel located along the main river. The bypass is located at an outer river bend, where a separation of bed-load and large wood results due to the secondary currents induced by the river curvature. Large wood floats towards the outer bend due to inertia and the secondary currents whereas bed-load remains at the inner bend. The bypass is separated by a side weir from the main river to ensure that the bed-load remains in the river during bed forming discharges and flood events. New model tests are currently being carried out at the Laboratory of Hydraulics, Hydrology, and Glaciology (VAW) of ETH Zurich, where sediment continuity should be achieved using an inclined rack. The rack is inclined in the flow direction at angles between 45° and 20°. First results show that the large wood deposits at the upper part of the rack whereas the lower part of the rack remains free for bed-load transport. Furthermore, the backwater rise for the inclined rack due to the accumulated wood is considerably reduced compared to a vertical rack, as a large part of the rack remains clear for the flow to pass. The findings of this study help to understand the complex interaction between sediment and large wood at a check dam retention structure. Furthermore, new retention structures and rack designs are available, where sediment continuity can partially be maintained to reduce downstream bed erosion.
Inorganic material profiling using Arn+ cluster: Can we achieve high quality profiles?
NASA Astrophysics Data System (ADS)
Conard, T.; Fleischmann, C.; Havelund, R.; Franquet, A.; Poleunis, C.; Delcorte, A.; Vandervorst, W.
2018-06-01
Retrieving molecular information by sputtering of organic systems has become practical in recent years owing to the introduction of sputtering by large gas clusters, which largely eliminates compound degradation during analysis and has led to strong improvements in depth resolution. However, a limitation was rapidly observed for heterogeneous systems where inorganic layers or structures needed to be profiled concurrently. As opposed to organic material, erosion of the inorganic layer appears very difficult and prone to many artefacts. To shed some light on these problems we investigated a simple system consisting of aluminum delta layer(s) buried in a silicon matrix in order to define the most favorable beam conditions for practical analysis. We show that, counterintuitively given the small energy per atom used, and unlike monatomic ion sputtering, the information depth obtained with large cluster ions is typically very large (∼10 nm) and that this can be caused both by a large roughness development at early stages of the sputtering process and by a large mixing zone. As a consequence, a large deformation of the Al intensity profile is observed. Using sample rotation during profiling significantly improves the depth resolution while sample temperature has no significant effect. The determining parameter for high depth resolution still remains the total energy of the cluster rather than the energy per atom in the cluster.
Understanding Mechanism of Photocatalytic Microbial Decontamination of Environmental Wastewater
Regmi, Chhabilal; Joshi, Bhupendra; Ray, Schindra K.; Gyawali, Gobinda; Pandey, Ramesh P.
2018-01-01
Several photocatalytic nanoparticles have been synthesized and studied for potential application in the degradation of organic and biological wastes. Although these materials degrade organic compounds by advanced oxidation processes, the exact mechanisms of microbial decontamination remain only partially known. Understanding the real mechanisms by which these materials cause microbial cell death and growth inhibition helps in fabricating more efficient semiconductor photocatalysts, and in designing reactors, for large-scale decontamination of environmental wastewater and of liquid waste containing highly pathogenic bacteria and toxic molecules generated by industries, hospitals, and biomedical laboratories. Recent studies on microbial decontamination by photocatalytic nanoparticles and their possible mechanisms of action are highlighted with examples in this mini review. PMID:29541632
Aerosol growth in Titan’s ionosphere
Lavvas, Panayotis; Yelle, Roger V.; Koskinen, Tommi; Bazin, Axel; Vuitton, Véronique; Vigren, Erik; Galand, Marina; Wellbrock, Anne; Coates, Andrew J.; Wahlund, Jan-Erik; Crary, Frank J.; Snowden, Darci
2013-01-01
Photochemically produced aerosols are common among the atmospheres of our solar system and beyond. Observations and models have shown that photochemical aerosols have direct consequences on atmospheric properties as well as important astrobiological ramifications, but the mechanisms involved in their formation remain unclear. Here we show that the formation of aerosols in Titan’s upper atmosphere is directly related to ion processes, and we provide a complete interpretation of observed mass spectra by the Cassini instruments from small to large masses. Because all planetary atmospheres possess ionospheres, we anticipate that the mechanisms identified here will be efficient in other environments as well, modulated by the chemical complexity of each atmosphere. PMID:23382231
Prioritization of remedial actions for radioactive waste burials in the Chernobyl exclusion zone
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crossland, Ian
2013-07-01
Large volumes of radioactive waste were urgently buried in the Chernobyl Exclusion Zone (ChEZ) in the aftermath of the 1986 accident. Twenty-six years later, decisions must be taken about whether and how to remediate these sites. An attempt to resolve two key issues is described here: 1. How to assess the hazards posed by these facilities, recognizing that, for the foreseeable future, the Chernobyl Exclusion Zone will remain under institutional control? and 2. What standards to apply in deciding the extent of remediation? This paper presents an examination of the issues and proposes a simple decision-making process. (authors)
Deposition of thin silicon layers on transferred large area graphene
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lupina, Grzegorz, E-mail: lupina@ihp-microelectronics.com; Kitzmann, Julia; Lukosius, Mindaugas
2013-12-23
Physical vapor deposition of Si onto transferred graphene is investigated. At elevated temperatures, Si nucleates preferably on wrinkles and multilayer graphene islands. In some cases, however, Si can be quasi-selectively grown only on the monolayer graphene regions while the multilayer islands remain uncovered. Experimental insights and ab initio calculations show that variations in the removal efficiency of carbon residuals after the transfer process can be responsible for this behavior. Low-temperature Si seed layer results in improved wetting and enables homogeneous growth. This is an important step towards realization of electronic devices in which graphene is embedded between two Si layers.
Fine structure of 25 extragalactic radio sources. [interferometric observations of quasars
NASA Technical Reports Server (NTRS)
Wittels, J. J.; Knight, C. A.; Shapiro, I. I.; Hinteregger, H. F.; Rogers, A. E. E.; Whitney, A. R.; Clark, T. A.; Hutton, L. K.; Marandino, G. E.; Niell, A. E.
1975-01-01
Interferometric observations taken at 7.8 GHz (λ ≈ 3.8 cm) with five pairings of antennae of 25 extragalactic radio sources between April, 1972 and May, 1973 are reported. These sources exhibit a broad variety of fine structure from very simple to complex. The total flux and the correlated flux of some of the sources underwent large changes in a few weeks, while the structure and total power of others remained constant during the entire period of observation. Some aspects of the data processing and a discussion of errors are presented. Numerous figures are provided and explained. The individual radio sources are described in detail.
Aerosol growth in Titan's ionosphere.
Lavvas, Panayotis; Yelle, Roger V; Koskinen, Tommi; Bazin, Axel; Vuitton, Véronique; Vigren, Erik; Galand, Marina; Wellbrock, Anne; Coates, Andrew J; Wahlund, Jan-Erik; Crary, Frank J; Snowden, Darci
2013-02-19
Photochemically produced aerosols are common among the atmospheres of our solar system and beyond. Observations and models have shown that photochemical aerosols have direct consequences on atmospheric properties as well as important astrobiological ramifications, but the mechanisms involved in their formation remain unclear. Here we show that the formation of aerosols in Titan's upper atmosphere is directly related to ion processes, and we provide a complete interpretation of observed mass spectra by the Cassini instruments from small to large masses. Because all planetary atmospheres possess ionospheres, we anticipate that the mechanisms identified here will be efficient in other environments as well, modulated by the chemical complexity of each atmosphere.
NASA Astrophysics Data System (ADS)
Ahmad, I.; Temple, M. P.; Kallis, A.; Wojdak, M.; Oton, C. J.; Barbier, D.; Saleh, H.; Kenyon, A. J.; Loh, W. H.
2008-12-01
Erbium-doped silicon-rich silicon oxide films deposited by plasma enhanced chemical vapor deposition suffer from compressive stress as deposited, which converts to a large tensile stress on annealing due to the release of hydrogen. Although the cracking that results from this stress can be avoided by patterning the films into ridges, significant stress remains along the ridge axis. Measurements of erbium photoluminescence sensitized by silicon nanoclusters in stressed and relaxed films suggest an important role for internal film stresses in promoting the phase separation of excess silicon into nanoclusters, which has previously been thought of as a thermally driven process.
Neuropsychological Profiles on the WAIS-IV of Adults With ADHD.
Theiling, Johanna; Petermann, Franz
2016-11-01
The aim of the study was to investigate the pattern of neuropsychological profiles on the Wechsler Adult Intelligence Scale-IV (WAIS-IV) for adults with ADHD relative to randomly matched controls and to assess overall intellectual ability discrepancies of the Full Scale Intelligence Quotient (FSIQ) and the General Ability Index (GAI). In all, 116 adults with ADHD and 116 controls between 16 and 71 years were assessed. Relative to controls, adults with ADHD show significant decrements in subtests with working memory and processing speed demands with moderate to large effect sizes and a higher GAI in comparison with the FSIQ. This suggests first that deficits identified with previous WAIS versions are robust in adults with ADHD and remain evident when assessed with the WAIS-IV; second that the WAIS-IV reliably differentiates between patients and controls; and third that a reduction of the FSIQ is most likely due to a decrement in working memory and processing speed abilities. The findings have essential implications for the diagnostic process. © The Author(s) 2014.
A neural model of figure-ground organization.
Craft, Edward; Schütze, Hartmut; Niebur, Ernst; von der Heydt, Rüdiger
2007-06-01
Psychophysical studies suggest that figure-ground organization is a largely autonomous process that guides--and thus precedes--allocation of attention and object recognition. The discovery of border-ownership representation in single neurons of early visual cortex has confirmed this view. Recent theoretical studies have demonstrated that border-ownership assignment can be modeled as a process of self-organization by lateral interactions within V2 cortex. However, the mechanism proposed relies on propagation of signals through horizontal fibers, which would result in increasing delays of the border-ownership signal with increasing size of the visual stimulus, in contradiction with experimental findings. It also remains unclear how the resulting border-ownership representation would interact with attention mechanisms to guide further processing. Here we present a model of border-ownership coding based on dedicated neural circuits for contour grouping that produce border-ownership assignment and also provide handles for mechanisms of selective attention. The results are consistent with neurophysiological and psychophysical findings. The model makes predictions about the hypothetical grouping circuits and the role of feedback between cortical areas.
NASA Astrophysics Data System (ADS)
Mohammed, Habiba Ibrahim; Majid, Zulkepli; Yusof, Norhakim Bin; Bello Yamusa, Yamusa
2018-03-01
Landfilling remains the most common systematic technique of solid waste disposal in most developed and developing countries. Finding a suitable site for a landfill is a very challenging task. The landfill site selection process aims to identify suitable areas that will protect the environment and public health from pollution and hazards. Therefore, various factors such as environmental, physical, socio-economic, and geological criteria must be considered before siting any landfill. This makes the site selection process rigorous and tedious because it involves the processing of large amounts of spatial data, rules and regulations from different agencies, and also policies from decision makers. This allows the incorporation of conflicting objectives and decision maker preferences into spatial decision models. This paper particularly analyzes the multi-criteria evaluation (MCE) method of landfill site selection for solid waste management by means of literature reviews and surveys. The study will help decision makers and waste management authorities to choose the most effective method when considering landfill site selection.
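A common building block of MCE siting studies is a weighted linear combination of normalised criterion layers under exclusion constraints; the sketch below is a generic illustration with assumed layers, weights, and constraint mask, not the specific methods surveyed in the paper.

```python
import numpy as np

rows, cols = 100, 100
rng = np.random.default_rng(3)

# Normalised suitability layers in [0, 1] (e.g. distance to water, slope, distance to roads).
criteria = {
    "distance_to_water": rng.random((rows, cols)),
    "slope": rng.random((rows, cols)),
    "distance_to_roads": rng.random((rows, cols)),
}
weights = {"distance_to_water": 0.5, "slope": 0.3, "distance_to_roads": 0.2}

# Boolean constraint mask: cells excluded outright (e.g. protected areas) are set to False.
constraints = rng.random((rows, cols)) > 0.1

# Weighted linear combination, zeroed out wherever a constraint applies.
suitability = sum(weights[k] * criteria[k] for k in criteria) * constraints
best = np.unravel_index(np.argmax(suitability), suitability.shape)
print("Most suitable cell (row, col):", best)
```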
Harrison, Richard P; Medcalf, Nicholas; Rafiq, Qasim A
2018-03-01
Manufacturing methods for cell-based therapies differ markedly from those established for noncellular pharmaceuticals and biologics. Attempts to 'shoehorn' these into existing frameworks have yielded poor outcomes. Some excellent clinical results have been realized, yet emergence of a 'blockbuster' cell-based therapy has so far proved elusive. The pressure to provide these innovative therapies, even at a smaller scale, remains. In this process-economics research paper, we utilize cell expansion research data combined with operational cost modeling in a case study to demonstrate the alternative ways in which a novel mesenchymal stem cell-based therapy could be provided at small scale. This research outlines the feasibility of cell microfactories but highlights that there is a strong pressure to automate processes and split the quality control cost-burden over larger production batches. The study explores one potential paradigm of cell-based therapy provisioning as a potential exemplar on which to base manufacturing strategy.
Growth kinetics of vertically aligned carbon nanotube arrays in clean oxygen-free conditions.
In, Jung Bin; Grigoropoulos, Costas P; Chernov, Alexander A; Noy, Aleksandr
2011-12-27
Vertically aligned carbon nanotubes (CNTs) are an important technological system, as well as a fascinating system for studying basic principles of nanomaterials synthesis; yet despite continuing efforts for the past decade many important questions about this process remain largely unexplained. We present a series of parametric ethylene chemical vapor deposition growth studies in a "hot-wall" reactor using ultrapure process gases that reveal the fundamental kinetics of the CNT growth. Our data show that the growth rate is proportional to the concentration of the carbon feedstock and monotonically decreases with the concentration of hydrogen gas and that the most important parameter determining the rate of the CNT growth is the production rate of active carbon precursor in the gas phase reaction. The growth termination times obtained with the purified gas mixtures were strikingly insensitive to variations in both hydrogen and ethylene pressures ruling out the carbon encapsulation of the catalyst as the main process termination cause.
Using Twitter for Demographic and Social Science Research: Tools for Data Collection and Processing
McCormick, Tyler H.; Lee, Hedwig; Cesare, Nina; Shojaie, Ali; Spiro, Emma S.
2015-01-01
Despite recent and growing interest in using Twitter to examine human behavior and attitudes, there is still significant room for growth regarding the ability to leverage Twitter data for social science research. In particular, gleaning demographic information about Twitter users—a key component of much social science research—remains a challenge. This article develops an accurate and reliable data processing approach for social science researchers interested in using Twitter data to examine behaviors and attitudes, as well as the demographic characteristics of the populations expressing or engaging in them. Using information gathered from Twitter users who state an intention to not vote in the 2012 presidential election, we describe and evaluate a method for processing data to retrieve demographic information reported by users that is not encoded as text (e.g., details of images) and evaluate the reliability of these techniques. We end by assessing the challenges of this data collection strategy and discussing how large-scale social media data may benefit demographic researchers. PMID:29033471
Using Twitter for Demographic and Social Science Research: Tools for Data Collection and Processing.
McCormick, Tyler H; Lee, Hedwig; Cesare, Nina; Shojaie, Ali; Spiro, Emma S
2017-08-01
Despite recent and growing interest in using Twitter to examine human behavior and attitudes, there is still significant room for growth regarding the ability to leverage Twitter data for social science research. In particular, gleaning demographic information about Twitter users-a key component of much social science research-remains a challenge. This article develops an accurate and reliable data processing approach for social science researchers interested in using Twitter data to examine behaviors and attitudes, as well as the demographic characteristics of the populations expressing or engaging in them. Using information gathered from Twitter users who state an intention to not vote in the 2012 presidential election, we describe and evaluate a method for processing data to retrieve demographic information reported by users that is not encoded as text (e.g., details of images) and evaluate the reliability of these techniques. We end by assessing the challenges of this data collection strategy and discussing how large-scale social media data may benefit demographic researchers.
Smith, Ashley R; Chein, Jason; Steinberg, Laurence
2013-07-01
While there is little doubt that risk-taking is generally more prevalent during adolescence than before or after, the underlying causes of this pattern of age differences have long been investigated and debated. One longstanding popular notion is the belief that risky and reckless behavior in adolescence is tied to the hormonal changes of puberty. However, the interactions between pubertal maturation and adolescent decision making remain largely understudied. In the current review, we discuss changes in decision making during adolescence, focusing on the asynchronous development of the affective, reward-focused processing system and the deliberative, reasoned processing system. As discussed, differential maturation in the structure and function of brain systems associated with these systems leaves adolescents particularly vulnerable to socio-emotional influences and risk-taking behaviors. We argue that this asynchrony may be partially linked to pubertal influences on development and specifically on the maturation of the affective, reward-focused processing system. Copyright © 2013 Elsevier Inc. All rights reserved.
Update on Bio-Refining and Nanocellulose Composite Materials Manufacturing.
Postek, Michael T; Poster, Dianne L
2017-01-01
Nanocellulose is a high value material that has gained increasing attention because of its high strength, stiffness, unique photonic and piezoelectric properties, high stability and uniform structure. One of the factors limiting the potential of nanocellulose and the vast array of potential new products is the ability to produce high-volume quantities of this nano-material. However, recent research has demonstrated that nanocellulose can be efficiently produced in large volumes from wood at relatively low cost by the incorporation of ionizing radiation in the process stream. Ionizing radiation causes significant breakdown of the polysaccharides and leads to the production of potentially useful gaseous products such as H2 and CO. Ionizing radiation processing remains an open field, ripe for innovation and application. This presentation will review the strong collaboration between the National Institute of Standards and Technology (NIST) and its academic partners pursuing the demonstration of applied ionizing radiation processing to plant materials for the manufacturing and characterization of novel nanomaterials.
Update on Bio-Refining and Nanocellulose Composite Materials Manufacturing
Postek, Michael T.; Poster, Dianne L.
2017-01-01
Nanocellulose is a high value material that has gained increasing attention because of its high strength, stiffness, unique photonic and piezoelectric properties, high stability and uniform structure. One of the factors limiting the potential of nanocellulose and the vast array of potential new products is the ability to produce high-volume quantities of this nano-material. However, recent research has demonstrated that nanocellulose can be efficiently produced in large volumes from wood at relatively low cost by the incorporation of ionizing radiation in the process stream. Ionizing radiation causes significant breakdown of the polysaccharides and leads to the production of potentially useful gaseous products such as H2 and CO. Ionizing radiation processing remains an open field, ripe for innovation and application. This presentation will review the strong collaboration between the National Institute of Standards and Technology (NIST) and its academic partners pursuing the demonstration of applied ionizing radiation processing to plant materials for the manufacturing and characterization of novel nanomaterials. PMID:29225398
Understanding the Dynamics of the Oxic-Anoxic Interface in the Black Sea
NASA Astrophysics Data System (ADS)
Stanev, Emil V.; Poulain, Pierre-Marie; Grayek, Sebastian; Johnson, Kenneth S.; Claustre, Hervé; Murray, James W.
2018-01-01
The Black Sea, the largest semienclosed anoxic basin on Earth, can be considered as an excellent natural laboratory for oxic and anoxic biogeochemical processes. The suboxic zone, a thin interface between oxic and anoxic waters, still remains poorly understood because it has been undersampled. This has led to alternative concepts regarding the underlying processes that create it. Existing hypotheses suggest that the interface originates either by isopycnal intrusions that introduce oxygen or the dynamics of manganese redox cycling that are associated with the sinking of particles or chemosynthetic bacteria. Here we reexamine these concepts using high-resolution oxygen, sulfide, nitrate, and particle concentration profiles obtained with sensors deployed on profiling floats. Our results show an extremely stable structure in density space over the entire basin with the exception of areas near the Bosporus plume and in the southern areas dominated by coastal anticyclones. The absence of large-scale horizontal intrusive signatures in the open-sea supports a hypothesis prioritizing the role of biogeochemical processes.
Update on bio-refining and nanocellulose composite materials manufacturing
NASA Astrophysics Data System (ADS)
Postek, Michael T.; Poster, Dianne L.
2017-08-01
Nanocellulose is a high value material that has gained increasing attention because of its high strength, stiffness, unique photonic and piezoelectric properties, high stability and uniform structure. One of the factors limiting the potential of nanocellulose and the vast array of potential new products is the ability to produce high-volume quantities of this nano-material. However, recent research has demonstrated that nanocellulose can be efficiently produced in large volumes from wood at relatively low cost by the incorporation of ionizing radiation in the process stream. Ionizing radiation causes significant breakdown of the polysaccharides and leads to the production of potentially useful gaseous products such as H2 and CO. Ionizing radiation processing remains an open field, ripe for innovation and application. This presentation will review the strong collaboration between the National Institute of Standards and Technology (NIST) and its academic partners pursuing the demonstration of applied ionizing radiation processing to plant materials for the manufacturing and characterization of novel nanomaterials.
Degradation of roxarsone in a silt loam soil and its toxicity assessment.
Liang, Tengfang; Ke, Zhengchen; Chen, Qing; Liu, Li; Chen, Guowei
2014-10-01
The land application of poultry or swine litter, containing large amounts of roxarsone, causes serious arsenic pollution in soil. Understanding the biotransformation process of roxarsone and its potential risks would favor proper disposal of roxarsone-contaminated animal litter, yet this understanding has not been achieved. We report an experimental study of the biotransformation process of roxarsone in a silt loam soil under various soil moisture and temperature conditions, and of the toxicity of roxarsone and its degradation products. Results showed that soil moisture and higher temperature promoted roxarsone degradation, accompanied by the emergence of pentavalent arsenic. Analysis of fluorescein diacetate (FDA) hydrolysis activity revealed that roxarsone does not exert acute toxicity on soil microbes. With the release of inorganic arsenic, FDA hydrolysis activity was gradually inhibited, as evidenced by ecotoxicological assessment using Photobacterium leiognathi. The results shed new light on the dynamic roxarsone biotransformation processes in soil, which is important for guiding appropriate disposal of poultry or swine litter in the environment. Copyright © 2014 Elsevier Ltd. All rights reserved.
Barraoui, Driss; Labrecque, Michel; Blais, Jean-François
2008-03-01
Given the fact that, according to our knowledge, no study has compared the agro-environmental use of decontaminated with non-decontaminated sludge, a greenhouse experiment was carried out to test the growth of maize (Zea mays L., G-4011 Hybrid) and bioaccumulation of metals in the presence of four different sludges (MUC, QUC, BEC and DAI), before and after their decontamination by a novel process (METIX-AC). Data showed that decontaminated sludge ameliorated plant growth and biomass production, and decreased bioaccumulation of metals, more than control soil, inorganic chemical fertilization, or conventional non-decontaminated sludge. Since chemicals used by the METIX-AC process contained S and Fe, decontaminated sludge introduced large amounts of these elements, while the overall presence of metals was reduced. Often, sludge dose also affected maize growth and bioaccumulation of metals. Overall, no toxicity to plants was noticed and bioaccumulation and transfer of many metals remained below the limits reported in the literature.
320-nm Flexible Solution-Processed 2,7-dioctyl[1] benzothieno[3,2-b]benzothiophene Transistors.
Ren, Hang; Tang, Qingxin; Tong, Yanhong; Liu, Yichun
2017-08-09
Flexible organic thin-film transistors (OTFTs) have received extensive attention due to their outstanding advantages such as light weight, low cost, flexibility, large-area fabrication, and compatibility with solution-processed techniques. However, compared with a rigid substrate, it still remains a challenge to obtain good device performance by directly depositing solution-processed organic semiconductors onto an ultrathin plastic substrate. In this work, ultrathin flexible OTFTs are successfully fabricated based on spin-coated 2,7-dioctyl[1]benzothieno[3,2-b]benzothiophene (C8-BTBT) films. The resulting device thickness is only ~320 nm, so the device has the ability to adhere well to a three-dimensional curved surface. The ultrathin C8-BTBT OTFTs exhibit a mobility as high as 4.36 cm² V⁻¹ s⁻¹ and an on/off current ratio of over 10⁶. These results indicate the substantial promise of our ultrathin flexible C8-BTBT OTFTs for next-generation flexible and conformal electronic devices.
320-nm Flexible Solution-Processed 2,7-dioctyl[1] benzothieno[3,2-b]benzothiophene Transistors
Ren, Hang; Tang, Qingxin; Tong, Yanhong; Liu, Yichun
2017-01-01
Flexible organic thin-film transistors (OTFTs) have received extensive attention due to their outstanding advantages such as light weight, low cost, flexibility, large-area fabrication, and compatibility with solution-processed techniques. However, compared with a rigid substrate, it still remains a challenge to obtain good device performance by directly depositing solution-processed organic semiconductors onto an ultrathin plastic substrate. In this work, ultrathin flexible OTFTs are successfully fabricated based on spin-coated 2,7-dioctyl[1]benzothieno[3,2-b]benzothiophene (C8-BTBT) films. The resulting device thickness is only ~320 nm, so the device has the ability to adhere well to a three-dimensional curved surface. The ultrathin C8-BTBT OTFTs exhibit a mobility as high as 4.36 cm² V⁻¹ s⁻¹ and an on/off current ratio of over 10⁶. These results indicate the substantial promise of our ultrathin flexible C8-BTBT OTFTs for next-generation flexible and conformal electronic devices. PMID:28792438
A new processing scheme for ultra-high resolution direct infusion mass spectrometry data
NASA Astrophysics Data System (ADS)
Zielinski, Arthur T.; Kourtchev, Ivan; Bortolini, Claudio; Fuller, Stephen J.; Giorio, Chiara; Popoola, Olalekan A. M.; Bogialli, Sara; Tapparo, Andrea; Jones, Roderic L.; Kalberer, Markus
2018-04-01
High resolution, high accuracy mass spectrometry is widely used to characterise environmental or biological samples with highly complex composition enabling the identification of chemical composition of often unknown compounds. Despite instrumental advancements, the accurate molecular assignment of compounds acquired in high resolution mass spectra remains time consuming and requires automated algorithms, especially for samples covering a wide mass range and large numbers of compounds. A new processing scheme is introduced implementing filtering methods based on element assignment, instrumental error, and blank subtraction. Optional post-processing incorporates common ion selection across replicate measurements and shoulder ion removal. The scheme allows both positive and negative direct infusion electrospray ionisation (ESI) and atmospheric pressure photoionisation (APPI) acquisition with the same programs. An example application to atmospheric organic aerosol samples using an Orbitrap mass spectrometer is reported for both ionisation techniques resulting in final spectra with 0.8% and 8.4% of the peaks retained from the raw spectra for APPI positive and ESI negative acquisition, respectively.
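Two of the filtering steps described above, tolerance-based assignment filtering and common-ion selection across replicates, can be sketched as below; the peak-list structure and the 4 ppm tolerance are assumptions for illustration, not the scheme's actual parameters.

```python
PPM_TOLERANCE = 4.0  # assumed instrumental tolerance

def within_tolerance(measured_mz, theoretical_mz, ppm=PPM_TOLERANCE):
    """True if the assignment error is within the instrumental ppm tolerance."""
    return abs(measured_mz - theoretical_mz) / theoretical_mz * 1e6 <= ppm

def common_ions(replicates, decimals=3):
    """Keep ions (keyed here by rounded m/z) present in every replicate peak list."""
    keys = [set(round(mz, decimals) for mz in rep) for rep in replicates]
    return sorted(set.intersection(*keys))

# Example: three replicate peak lists; only m/z values seen in all three are retained.
replicates = [
    [151.0401, 203.0812, 301.1410],
    [151.0402, 203.0811, 287.1303],
    [151.0401, 203.0813, 329.2330],
]
print(common_ions(replicates))               # -> [151.04, 203.081]
print(within_tolerance(151.0401, 151.0405))  # -> True (about 2.6 ppm error)
```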
Plankton networks driving carbon export in the oligotrophic ocean
NASA Astrophysics Data System (ADS)
Guidi, L.; Chaffron, S.; Bittner, L.; Eveillard, D.; Raes, J.; Karsenti, E.; Bowler, C.; Gorsky, G.
2016-02-01
The biological carbon pump is the process by which CO2 is transformed to organic carbon via photosynthesis that sinks to the deep ocean as particles where it is sequestered. While the intensity of the pump correlates with plankton community composition, the underlying ecosystem structure and interactions driving the process remain largely uncharacterised. Here we use environmental and metagenomic data gathered during the Tara Oceans expedition to improve our understanding of the underlying processes. We show that specific plankton communities correlate with carbon export and highlight unexpected and overlooked taxa such as Radiolaria, alveolate parasites, as well as Synechococcus and their phages, as lineages most strongly associated with carbon export in the subtropical oligotrophic ocean. Additionally, we show that the relative abundance of just a few bacterial and viral genes can predict most of the variability in carbon export in these regions. Together these results help elucidate ecosystem drivers of the biological carbon pump and present a case study for scaling from genes-to-ecosystems.
Rab7-a novel redox target that modulates inflammatory pain processing.
Kallenborn-Gerhardt, Wiebke; Möser, Christine V; Lorenz, Jana E; Steger, Mirco; Heidler, Juliana; Scheving, Reynir; Petersen, Jonas; Kennel, Lea; Flauaus, Cathrin; Lu, Ruirui; Edinger, Aimee L; Tegeder, Irmgard; Geisslinger, Gerd; Heide, Heinrich; Wittig, Ilka; Schmidtko, Achim
2017-07-01
Chronic pain is accompanied by production of reactive oxygen species (ROS) in various cells that are important for nociceptive processing. Recent data indicate that ROS can trigger specific redox-dependent signaling processes, but the molecular targets of ROS signaling in the nociceptive system remain largely elusive. Here, we performed a proteome screen for pain-dependent redox regulation using an OxICAT approach, thereby identifying the small GTPase Rab7 as a redox-modified target during inflammatory pain in mice. Prevention of Rab7 oxidation by replacement of the redox-sensing thiols modulates its GTPase activity. Immunofluorescence studies revealed Rab7 expression to be enriched in central terminals of sensory neurons. Knockout mice lacking Rab7 in sensory neurons showed normal responses to noxious thermal and mechanical stimuli; however, their pain behavior during inflammatory pain and in response to ROS donors was reduced. The data suggest that redox-dependent changes in Rab7 activity modulate inflammatory pain sensitivity.
Dependence of Snowmelt Simulations on Scaling of the Forcing Processes (Invited)
NASA Astrophysics Data System (ADS)
Winstral, A. H.; Marks, D. G.; Gurney, R. J.
2009-12-01
The spatial organization and scaling relationships of snow distribution in mountain environs is ultimately dependent on the controlling processes. These processes include interactions between weather, topography, vegetation, snow state, and seasonally-dependent radiation inputs. In large scale snow modeling it is vital to know these dependencies to obtain accurate predictions while reducing computational costs. This study examined the scaling characteristics of the forcing processes and the dependency of distributed snowmelt simulations to their scaling. A base model simulation characterized these processes with 10m resolution over a 14.0 km2 basin with an elevation range of 1474 - 2244 masl. Each of the major processes affecting snow accumulation and melt - precipitation, wind speed, solar radiation, thermal radiation, temperature, and vapor pressure - were independently degraded to 1 km resolution. Seasonal and event-specific results were analyzed. Results indicated that scale effects on melt vary by process and weather conditions. The dependence of melt simulations on the scaling of solar radiation fluxes also had a seasonal component. These process-based scaling characteristics should remain static through time as they are based on physical considerations. As such, these results not only provide guidance for current modeling efforts, but are also well suited to predicting how potential climate changes will affect the heterogeneity of mountain snow distributions.
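Degrading a 10 m forcing field to 1 km resolution can be done by simple block averaging, in the spirit of the scale experiments described above; the grid size and synthetic field below are illustrative assumptions, not the study's data.

```python
import numpy as np

def block_average(field, factor):
    """Coarsen a 2-D field by averaging non-overlapping factor x factor blocks."""
    rows, cols = field.shape
    assert rows % factor == 0 and cols % factor == 0
    return field.reshape(rows // factor, factor, cols // factor, factor).mean(axis=(1, 3))

rng = np.random.default_rng(4)
solar_10m = rng.normal(loc=600.0, scale=80.0, size=(1000, 1000))  # W/m^2 on a 10 m grid
solar_1km = block_average(solar_10m, factor=100)                  # coarsened to a 1 km grid
print(solar_10m.shape, "->", solar_1km.shape)                     # (1000, 1000) -> (10, 10)
```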
Wollum, Alexandra; Burstein, Roy; Fullman, Nancy; Dwyer-Lindgren, Laura; Gakidou, Emmanuela
2015-09-02
Nigeria has made notable gains in improving childhood survival but the country still accounts for a large portion of the world's overall disease burden, particularly among women and children. To date, no systematic analyses have comprehensively assessed trends for health outcomes and interventions across states in Nigeria. We extracted data from 19 surveys to generate estimates for 20 key maternal and child health (MCH) interventions and outcomes for 36 states and the Federal Capital Territory from 2000 to 2013. Source-specific estimates were generated for each indicator, after which a two-step statistical model was applied using a mixed-effects model followed by Gaussian process regression to produce state-level trends. National estimates were calculated by population-weighting state values. Under-5 mortality decreased in all states from 2000 to 2013, but a large gap remained across them. Malaria intervention coverage stayed low despite increases between 2009 and 2013, largely driven by rising rates of insecticide-treated net ownership. Overall, vaccination coverage improved, with notable increases in the coverage of three-dose oral polio vaccine. Nevertheless, immunization coverage remained low for most vaccines, including measles. Coverage of other MCH interventions, such as antenatal care and skilled birth attendance, generally stagnated and even declined in many states, and the range between the lowest- and highest-performing states remained wide in 2013. Countrywide, a measure of overall intervention coverage increased from 33% in 2000 to 47% in 2013 with considerable variation across states, ranging from 21% in Sokoto to 66% in Ekiti. We found that Nigeria made notable gains for a subset of MCH indicators between 2000 and 2013, but also experienced stalled progress and even declines for others. Despite progress for a subset of indicators, Nigeria's absolute levels of intervention coverage remained quite low. As Nigeria rolls out its National Health Bill and seeks to strengthen its delivery of health services, continued monitoring of local health trends will help policymakers track successes and promptly address challenges as they arise. Subnational benchmarking ought to occur regularly in Nigeria and throughout sub-Saharan Africa to inform local decision-making and bolster health system performance.
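The population-weighting step used to aggregate state-level estimates to a national value can be sketched as below; the state names, populations, and coverage values are assumed for illustration and are not data from the study.

```python
import numpy as np

# National coverage as the population-weighted mean of state-level estimates.
states = ["Sokoto", "Ekiti", "Lagos"]
population = np.array([4.9e6, 3.0e6, 12.5e6])   # assumed populations
coverage = np.array([0.21, 0.66, 0.55])         # assumed intervention coverage per state

national = np.average(coverage, weights=population)
print(f"Population-weighted national coverage: {national:.2f}")
```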
NASA Astrophysics Data System (ADS)
Stoecklin, A.; Friedli, B.; Puzrin, A. M.
2017-11-01
The volume of submarine landslides is a key controlling factor for their damage potential. Particularly large landslides are found in active sedimentary regions. However, the mechanism controlling their volume, and in particular their thickness, remains unclear. Here we present a mechanism that explains how rapid sedimentation can lead to localized slope failure at a preferential depth and set the conditions for the emergence of large-scale slope-parallel landslides. We account for the contractive shearing behavior of the sediments, which locally accelerates the development of overpressures in the pore fluid, even on very mild slopes. When applied to the Santa Barbara basin, the mechanism offers an explanation for the regional variation in landslide thickness and their sedimentation-controlled recurrence. Although earthquakes are the most likely trigger for these mass movements, our results suggest that the sedimentation process controls the geometry of their source region. The mechanism introduced here is generally applicable and can provide initial conditions for subsequent landslide triggering, runout, and tsunami-source analyses in sedimentary regions.
Comi, Troy J; Do, Thanh D; Rubakhin, Stanislav S; Sweedler, Jonathan V
2017-03-22
The chemical differences between individual cells within large cellular populations provide unique information on organisms' homeostasis and the development of diseased states. Even genetically identical cell lineages diverge due to local microenvironments and stochastic processes. The minute sample volumes and low abundance of some constituents in cells hinder our understanding of cellular heterogeneity. Although amplification methods facilitate single-cell genomics and transcriptomics, the characterization of metabolites and proteins remains challenging both because of the lack of effective amplification approaches and the wide diversity in cellular constituents. Mass spectrometry has become an enabling technology for the investigation of individual cellular metabolite profiles with its exquisite sensitivity, large dynamic range, and ability to characterize hundreds to thousands of compounds. While advances in instrumentation have improved figures of merit, acquiring measurements at high throughput and sampling from large populations of cells are still not routine. In this Perspective, we highlight the current trends and progress in mass-spectrometry-based analysis of single cells, with a focus on the technologies that will enable the next generation of single-cell measurements.
The Complete Redistribution Approximation in Optically Thick Line-Driven Winds
NASA Astrophysics Data System (ADS)
Gayley, K. G.; Onifer, A. J.
2001-05-01
Wolf-Rayet winds are thought to exhibit large momentum fluxes, which has in part been explained by ionization stratification in the wind. However, it is the cause of high mass loss, not high momentum flux, that remains largely a mystery, because standard models fail to achieve sufficient acceleration near the surface where the mass-loss rate is set. We consider a radiative transfer approximation that allows the dynamics of optically thick Wolf-Rayet winds to be modeled without detailed treatment of the radiation field, called the complete redistribution approximation. In this approximation, it is assumed that thermalization processes cause the photon frequencies to be completely randomized over the course of propagating through the wind, which allows the radiation field to be treated statistically rather than in detail. Thus the approach is similar to the statistical treatment of the line list used in the celebrated CAK approach. The results differ from the effectively gray treatment in that the radiation field is influenced by the line distribution, and the role of gaps in the line distribution is enhanced. The ramifications for the driving of large mass-loss rates are explored.
Hydropower and sustainability: resilience and vulnerability in China's powersheds.
McNally, Amy; Magee, Darrin; Wolf, Aaron T
2009-07-01
Large dams represent a whole complex of social, economic and ecological processes, perhaps more than any other large infrastructure project. Today, countries with rapidly developing economies are constructing new dams to provide energy and flood control to growing populations in riparian and distant urban communities. If the system lacks the institutional capacity to absorb these physical and institutional changes there is potential for conflict, thereby threatening human security. In this paper, we propose analyzing sustainability (political, socioeconomic, and ecological) in terms of resilience versus vulnerability, framed within the spatial abstraction of a powershed. The powershed framework facilitates multi-scalar and transboundary analysis while remaining focused on the questions of resilience and vulnerability relating to hydropower dams. Focusing on examples from China, this paper describes the complex nature of dams using the sustainability and powershed frameworks. We then analyze the roles of institutions in China to understand the relationships between power, human security and the socio-ecological system. To inform the study of conflicts over dams, China is a particularly useful case study because we can examine what happens at the international, national and local scales. The powershed perspective allows us to examine resilience and vulnerability across political boundaries from a dynamic, process-defined analytical scale while remaining focused on a host of questions relating to hydro-development that invoke drivers and impacts on national and sub-national scales. The ability to disaggregate the effects of hydropower dam construction from political boundaries allows for a deeper analysis of resilience and vulnerability. From our analysis we find that reforms in China's hydropower sector since 1996 have been motivated by the need to create stability at the national scale rather than resilient solutions to China's growing demand for energy and water resource control at the local and international scales. Some measures that improved economic development through the market economy and a combination of dam construction and institutional reform may indeed improve hydro-political resilience at a single scale. However, if China does not address large-scale hydropower construction's potential to create multi-scale geopolitical tensions, it may be vulnerable to conflict - though not necessarily violent - in domestic and international political arenas. We conclude with a look toward a resilient basin institution for the Nu/Salween River, the site of a proposed large-scale hydropower development effort in China and Myanmar.
[Engineering a bone free flap for maxillofacial reconstruction: technical restrictions].
Raoul, G; Myon, L; Chai, F; Blanchemain, N; Ferri, J
2011-09-01
Vascularisation is a key to success in bone tissue engineering. Creating a functional vascular network is an important concern so as to ensure vitality in regenerated tissues. Many strategies have been developed to achieve this goal. One of these is cell culture in a perfusion bioreactor chamber. These new technical requirements came along with improved media and chamber receptacles: bioreactors (chapter 2). Some bone tissue engineering processes already have clinical applications, but for volumes limited by the lack of vascularisation. Resorbable or non-resorbable membranes are an example. They are used separately or in association with bone grafts and they protect the graft during the revascularization process. Potentiated osseous regeneration uses molecular or cellular adjuvants (BMPs and autologous stem cells) to improve osseous healing. Significant improvements were made: integration of specific sequences, which may guide and enhance cell differentiation in the scaffold; nano- or micro-patterned cell-containing scaffolds. Finally, some authors consider the patient's body as an ideal bioreactor to induce vascularisation in large volumes of grafted tissues. "Endocultivation", i.e., cellular culture inside the human body, has been shown to be feasible and safe. The properties of regenerated bone in the long run remain to be assessed. The objective remains the engineering of an "in vitro" osseous free flap without morbidity. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
Functional Basis for Efficient Physical Layer Classical Control in Quantum Processors
NASA Astrophysics Data System (ADS)
Ball, Harrison; Nguyen, Trung; Leong, Philip H. W.; Biercuk, Michael J.
2016-12-01
The rapid progress seen in the development of quantum-coherent devices for information processing has motivated serious consideration of quantum computer architecture and organization. One topic which remains open for investigation and optimization relates to the design of the classical-quantum interface, where control operations on individual qubits are applied according to higher-level algorithms; accommodating competing demands on performance and scalability remains a major outstanding challenge. In this work, we present a resource-efficient, scalable framework for the implementation of embedded physical layer classical controllers for quantum-information systems. Design drivers and key functionalities are introduced, leading to the selection of Walsh functions as an effective functional basis for both programming and controller hardware implementation. This approach leverages the simplicity of real-time Walsh-function generation in classical digital hardware, and the fact that a wide variety of physical layer controls, such as dynamic error suppression, are known to fall within the Walsh family. We experimentally implement a real-time field-programmable-gate-array-based Walsh controller producing Walsh timing signals and Walsh-synthesized analog waveforms appropriate for critical tasks in error-resistant quantum control and noise characterization. These demonstrations represent the first step towards a unified framework for the realization of physical layer controls compatible with large-scale quantum-information processing.
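As a purely illustrative aside (not the authors' FPGA implementation), the sketch below shows how a sequency-ordered Walsh basis can be generated in a few lines of software; the function name and ordering convention are assumptions of this sketch, and because each Walsh function only takes the values +1 and -1, the same construction maps naturally onto simple digital logic.

import numpy as np

def walsh_matrix(n):
    """Return the 2**n x 2**n Walsh matrix in sequency order.

    Rows of the Sylvester-Hadamard matrix are reordered by taking the
    Gray code of each row index and then bit-reversing it, which sorts
    the rows by their number of sign changes (sequency).
    """
    N = 2 ** n
    H = np.array([[1]])
    for _ in range(n):                      # Sylvester construction
        H = np.kron(H, np.array([[1, 1], [1, -1]]))
    idx = np.arange(N)
    gray = idx ^ (idx >> 1)                 # Gray code of each index
    rev = np.array([int(format(int(g), f"0{n}b")[::-1], 2) for g in gray])
    return H[rev]

# Example: Walsh function w_5 sampled on 32 points, usable as a +/-1
# timing pattern or modulation envelope for a control sequence.
W = walsh_matrix(5)
print(W[5])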
A developmental and genetic classification for midbrain-hindbrain malformations
Millen, Kathleen J.; Dobyns, William B.
2009-01-01
Advances in neuroimaging, developmental biology and molecular genetics have increased the understanding of developmental disorders affecting the midbrain and hindbrain, both as isolated anomalies and as part of larger malformation syndromes. However, the understanding of these malformations and their relationships with other malformations, within the central nervous system and in the rest of the body, remains limited. A new classification system is proposed, based, wherever possible, upon embryology and genetics. Proposed categories include: (i) malformations secondary to early anteroposterior and dorsoventral patterning defects, or to misspecification of mid-hindbrain germinal zones; (ii) malformations associated with later generalized developmental disorders that significantly affect the brainstem and cerebellum (and have a pathogenesis that is at least partly understood); (iii) localized brain malformations that significantly affect the brain stem and cerebellum (pathogenesis partly or largely understood, includes local proliferation, cell specification, migration and axonal guidance); and (iv) combined hypoplasia and atrophy of putative prenatal-onset degenerative disorders. Pertinent embryology is discussed and the classification is justified. This classification will prove useful both for physicians who diagnose and treat patients with these disorders and for clinical scientists who wish to understand better the perturbations of developmental processes that produce them. Importantly, both the classification and its framework remain flexible enough to be easily modified when new embryologic processes are described or new malformations discovered. PMID:19933510
Key principles to improve programmes and interventions in complementary feeding.
Lutter, Chessa K; Iannotti, Lora; Creed-Kanashiro, Hilary; Guyon, Agnes; Daelmans, Bernadette; Robert, Rebecca; Haider, Rukhsana
2013-09-01
Although there are some examples of successful complementary feeding programmes to promote healthy growth and prevent stunting at the community level, to date there are few, if any, examples of successful programmes at scale. A lack of systematic process and impact evaluations on pilot projects to generate lessons learned has precluded scaling up of effective programmes. Programmes to effect positive change in nutrition rarely follow systematic planning, implementation, and evaluation (PIE) processes to enhance effectiveness over the long term. As a result, a set of programme-oriented key principles to promote healthy growth remains elusive. The purpose of this paper is to fill this gap by proposing a set of principles to improve programmes and interventions to promote healthy growth and development. Identifying such principles for programme success has three requirements: rethinking traditional paradigms used to promote improved infant and young child feeding; ensuring better linkages to delivery platforms; and improving programming. Following the PIE model for programmes and learning from experiences from four relatively large-scale programmes described in this paper, 10 key principles are identified in the areas of programme planning, programme implementation, programme evaluation, and dissemination, replication, and scaling up. Nonetheless, numerous operational research questions remain, some of which are highlighted in this paper. © 2013 John Wiley & Sons Ltd.
DiCanio, Christian; Nam, Hosung; Whalen, Douglas H.; Timothy Bunnell, H.; Amith, Jonathan D.; García, Rey Castillo
2013-01-01
While efforts to document endangered languages have steadily increased, the phonetic analysis of endangered language data remains a challenge. The transcription of large documentation corpora is, by itself, a tremendous feat. Yet, the process of segmentation remains a bottleneck for research with data of this kind. This paper examines whether a speech processing tool, forced alignment, can facilitate the segmentation task for small data sets, even when the target language differs from the training language. The authors also examined whether a phone set with contextualization outperforms a more general one. The accuracy of two forced aligners trained on English (hmalign and p2fa) was assessed using corpus data from Yoloxóchitl Mixtec. Overall, agreement performance was relatively good, with accuracy at 70.9% within 30 ms for hmalign and 65.7% within 30 ms for p2fa. Segmental and tonal categories influenced accuracy as well. For instance, additional stop allophones in hmalign's phone set aided alignment accuracy. Agreement differences between aligners also corresponded closely with the types of data on which the aligners were trained. Overall, using existing alignment systems was found to have potential for making phonetic analysis of small corpora more efficient, with more allophonic phone sets providing better agreement than general ones. PMID:23967953
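As an illustration of the agreement metric described above (an automatic boundary counted as correct when it falls within 30 ms of the manual one), here is a minimal sketch; the function and the boundary times are invented for illustration and are not the authors' evaluation code.

def boundary_agreement(auto_bounds, manual_bounds, tol=0.030):
    """Fraction of boundaries whose automatic placement falls within
    `tol` seconds of the manual placement.

    Assumes the two lists are already matched boundary-for-boundary
    (same number of boundaries, same order).
    """
    assert len(auto_bounds) == len(manual_bounds)
    hits = sum(abs(a - m) <= tol for a, m in zip(auto_bounds, manual_bounds))
    return hits / len(manual_bounds)

# Illustrative (made-up) boundary times in seconds:
auto = [0.112, 0.270, 0.455, 0.630]
manual = [0.100, 0.260, 0.500, 0.655]
print(f"{boundary_agreement(auto, manual):.1%} within 30 ms")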
Disease Manifestations and Pathogenic Mechanisms of Group A Streptococcus
Barnett, Timothy C.; McArthur, Jason D.; Cole, Jason N.; Gillen, Christine M.; Henningham, Anna; Sriprakash, K. S.; Sanderson-Smith, Martina L.; Nizet, Victor
2014-01-01
Streptococcus pyogenes, also known as group A Streptococcus (GAS), causes mild human infections such as pharyngitis and impetigo and serious infections such as necrotizing fasciitis and streptococcal toxic shock syndrome. Furthermore, repeated GAS infections may trigger autoimmune diseases, including acute poststreptococcal glomerulonephritis, acute rheumatic fever, and rheumatic heart disease. Combined, these diseases account for over half a million deaths per year globally. Genomic and molecular analyses have now characterized a large number of GAS virulence determinants, many of which exhibit overlap and redundancy in the processes of adhesion and colonization, innate immune resistance, and the capacity to facilitate tissue barrier degradation and spread within the human host. This improved understanding of the contribution of individual virulence determinants to the disease process has led to the formulation of models of GAS disease progression, which may lead to better treatment and intervention strategies. While GAS remains sensitive to all penicillins and cephalosporins, rising resistance to other antibiotics used in disease treatment is an increasing worldwide concern. Several GAS vaccine formulations that elicit protective immunity in animal models have shown promise in nonhuman primate and early-stage human trials. The development of a safe and efficacious commercial human vaccine for the prophylaxis of GAS disease remains a high priority. PMID:24696436
Estimation of fatigue life using electromechanical impedance technique
NASA Astrophysics Data System (ADS)
Lim, Yee Yan; Soh, Chee Kiong
2010-04-01
Fatigue-induced damage is often progressive and gradual in nature. Structures subjected to a large number of fatigue load cycles undergo progressive crack initiation, propagation and, finally, fracture. Monitoring of structural health, especially for critical components, is therefore essential for early detection of potentially harmful cracks. The recent advent of smart materials such as piezo-impedance transducers, adopting the electromechanical impedance (EMI) technique and the wave propagation technique, has proven effective for incipient damage detection and characterization. Exceptional advantages such as autonomous, real-time, online and remote monitoring may provide a cost-effective alternative to conventional structural health monitoring (SHM) techniques. In this study, the main focus is to investigate the feasibility of characterizing a propagating fatigue crack in a structure using the EMI technique, as well as estimating its remaining fatigue life using the linear elastic fracture mechanics (LEFM) approach. A uniaxial cyclic tensile load is applied to a lab-sized aluminum beam up to failure. The progressive shift in admittance signatures measured by the piezo-impedance transducer (PZT patch) with increasing loading cycles reflects the effectiveness of the EMI technique in tracing fatigue damage progression. With the use of LEFM, prediction of the remaining life of the structure at different loading cycles is possible.
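A minimal sketch of the LEFM-style remaining-life estimate referred to above, assuming Paris-law crack growth; the material constants, geometry factor and crack lengths are illustrative placeholders, not values from the study.

import numpy as np

def remaining_cycles(a0, a_crit, d_sigma, C, m, Y=1.0, n_steps=100_000):
    """Integrate the Paris law da/dN = C * (dK)**m, with
    dK = Y * d_sigma * sqrt(pi * a), from crack length a0 to a_crit.
    Returns the estimated number of remaining load cycles.
    """
    a = np.linspace(a0, a_crit, n_steps)
    dK = Y * d_sigma * np.sqrt(np.pi * a)
    dN_da = 1.0 / (C * dK ** m)             # cycles per unit crack growth
    return np.trapz(dN_da, a)

# Illustrative aluminium-like values: stress range in MPa, lengths in m.
N_rem = remaining_cycles(a0=1e-3, a_crit=10e-3, d_sigma=100.0, C=1e-11, m=3.0)
print(f"Estimated remaining life: {N_rem:.2e} cycles")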
Star formation in evolving molecular clouds
NASA Astrophysics Data System (ADS)
Völschow, M.; Banerjee, R.; Körtgen, B.
2017-09-01
Molecular clouds are the principal stellar nurseries of our universe; they thus remain a focus of both observational and theoretical studies. From observations, some of the key properties of molecular clouds are well known but many questions regarding their evolution and star formation activity remain open. While numerical simulations feature a large number and complexity of involved physical processes, this plethora of effects may hide the fundamentals that determine the evolution of molecular clouds and enable the formation of stars. Purely analytical models, on the other hand, tend to suffer from rough approximations or a lack of completeness, limiting their predictive power. In this paper, we present a model that incorporates central concepts of astrophysics as well as reliable results from recent simulations of molecular clouds and their evolutionary paths. Based on that, we construct a self-consistent semi-analytical framework that describes the formation, evolution, and star formation activity of molecular clouds, including a number of feedback effects to account for the complex processes inside those objects. The final equation system is solved numerically but at much lower computational expense than, for example, hydrodynamical descriptions of comparable systems. The model presented in this paper agrees well with a broad range of observational results, showing that molecular cloud evolution can be understood as an interplay between accretion, global collapse, star formation, and stellar feedback.
Hayes, Dave J.; Northoff, Georg
2011-01-01
The ability to detect and respond appropriately to aversive stimuli is essential for all organisms, from fruit flies to humans. This suggests the existence of a core neural network which mediates aversion-related processing. Human imaging studies on aversion have highlighted the involvement of various cortical regions, such as the prefrontal cortex, while animal studies have focused largely on subcortical regions like the periaqueductal gray and hypothalamus. However, whether and how these regions form a core neural network of aversion remains unclear. To help determine this, a translational cross-species investigation in humans (i.e., meta-analysis) and other animals (i.e., systematic review of functional neuroanatomy) was performed. Our results highlighted the recruitment of the anterior cingulate cortex, the anterior insula, and the amygdala as well as other subcortical (e.g., thalamus, midbrain) and cortical (e.g., orbitofrontal) regions in both animals and humans. Importantly, involvement of these regions remained independent of sensory modality. This study provides evidence for a core neural network mediating aversion in both animals and humans. This not only contributes to our understanding of the trans-species neural correlates of aversion but may also carry important implications for psychiatric disorders where abnormal aversive behavior can often be observed. PMID:22102836
Health Decision Making: Lynchpin of Evidence-Based Practice
Spring, Bonnie
2008-01-01
Health decision making is both the lynchpin and the least developed aspect of evidence-based practice. The evidence-based practice process requires integrating the evidence with consideration of practical resources and patient preferences and doing so via a process that is genuinely collaborative. Yet, the literature is largely silent about how to accomplish integrative, shared decision making. Implications for evidence-based practice are discussed for 2 theories of clinician decision making (expected utility and fuzzy trace) and 2 theories of patient health decision making (transtheoretical model and reasoned action). Three suggestions are offered. First, it would be advantageous to have theory-based algorithms that weight and integrate the 3 data strands (evidence, resources, preferences) in different decisional contexts. Second, patients, not providers, make the decisions of greatest impact on public health, and those decisions are behavioral. Consequently, theory explicating how provider-patient collaboration can influence patient lifestyle decisions made miles from the provider's office is greatly needed. Third, although the preponderance of data on complex decisions supports a computational approach, such an approach to evidence-based practice is too impractical to be widely applied at present. More troublesomely, until patients come to trust decisions made computationally more than they trust their providers’ intuitions, patient adherence will remain problematic. A good theory of integrative, collaborative health decision making remains needed. PMID:19015288
NASA Technical Reports Server (NTRS)
Grant, J. A.; Schultz, P. H.
1993-01-01
In spite of the highly successful nature of recent planetary missions to the terrestrial planets and outer satellites, a number of questions concerning the evolution of their surfaces remain unresolved. For example, many characteristics of the stratigraphy and soils comprising the near-surface of Mars remain largely unknown, yet they are crucial for accurately defining the history of surface processes and the near-surface sedimentary record. Similar statements can be made regarding our understanding of near-surface stratigraphy and processes on other planetary bodies. Ground penetrating radar (GPR) is a proven and standard instrument capable of imaging the subsurface at high resolution to tens of meters depth in a variety of terrestrial environments. Moreover, GPR is portable and easily modified for rover deployment. Data collected with a rover-mounted GPR could resolve a number of issues related to planetary surface evolution by defining shallow stratigraphic records and would provide context for interpreting results of other surface analyses (e.g. elemental or mineralogical). A discussion of existing GPR capabilities is followed first by examples of how GPR might be used to better define surface evolution on Mars and then by a brief description of possible GPR applications to the Moon and other planetary surfaces.
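For orientation, the depth scale quoted above follows from the basic radar travel-time relation v = c / sqrt(eps_r) and d = v t / 2; the short sketch below uses illustrative permittivities and is not tied to any particular instrument.

C0 = 299_792_458.0  # speed of light in vacuum, m/s

def gpr_depth(two_way_time_ns, eps_r):
    """Depth (m) of a reflector from its two-way travel time (ns),
    assuming a uniform medium of relative permittivity eps_r."""
    v = C0 / eps_r ** 0.5
    return v * (two_way_time_ns * 1e-9) / 2.0

# Illustrative: a 150 ns echo in dry sand (eps_r ~ 4) vs. basalt (eps_r ~ 8).
for eps in (4.0, 8.0):
    print(f"eps_r = {eps}: depth ~ {gpr_depth(150.0, eps):.1f} m")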
Lithospheric Strength and Stress State: Persistent Challenges and New Directions in Geodynamics
NASA Astrophysics Data System (ADS)
Hirth, G.
2017-12-01
The strength of the lithosphere controls a broad array of geodynamic processes, ranging from earthquakes to the formation and evolution of plate boundaries and the thermal evolution of the planet. A combination of laboratory, geologic and geophysical observations provides several independent constraints on the rheological properties of the lithosphere. However, several persistent challenges remain in the interpretation of these data. Problems related to extrapolation in both scale and time (rate) need to be addressed to apply laboratory data. Nonetheless, good agreement between extrapolation of flow laws and the interpretation of microstructures in viscously deformed lithospheric mantle rocks demonstrates a strong foundation to build on to explore the role of scale. Furthermore, agreement between the depth distribution of earthquakes and predictions based on extrapolation of high temperature friction relationships provides a basis to understand links between brittle deformation and stress state. In contrast, problems remain for rationalizing larger scale geodynamic processes with these same rheological constraints. For example, at face value the lab-derived values for the activation energy for creep are too large to explain convective instabilities at the base of the lithosphere, but too low to explain the persistence of dangling slabs in the upper mantle. In this presentation, I will outline these problems (and successes) and provide thoughts on where new progress can be made to resolve remaining inconsistencies, including discussion of the role of the distribution of volatiles and alteration on the strength of the lithosphere, new data on the influence of pressure on friction and fracture strength, and links between the location of earthquakes, thermal structure, and stress state.
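To make the sensitivity to activation energy concrete, here is an illustrative power-law (dislocation) creep calculation; the prefactor, stress exponent and activation energies are placeholder olivine-like values, not the laboratory flow laws under discussion.

import numpy as np

R = 8.314  # gas constant, J/(mol K)

def strain_rate(sigma_MPa, T_K, A=1e5, n=3.5, E=530e3):
    """Power-law creep: strain rate (1/s) = A * sigma^n * exp(-E/(R*T)),
    with sigma in MPa; A, n and E are placeholder values."""
    return A * sigma_MPa ** n * np.exp(-E / (R * T_K))

# Effective viscosity eta = sigma / (2 * strain_rate) at fixed stress,
# for two activation energies, showing the strong temperature dependence.
sigma = 10.0  # MPa
for E in (430e3, 530e3):
    for T in (1200.0, 1600.0):
        eta = (sigma * 1e6) / (2.0 * strain_rate(sigma, T, E=E))
        print(f"E = {E/1e3:.0f} kJ/mol, T = {T:.0f} K: eta ~ {eta:.1e} Pa s")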
A regulatory circuit for piwi by the large Maf gene traffic jam in Drosophila.
Saito, Kuniaki; Inagaki, Sachi; Mituyama, Toutai; Kawamura, Yoshinori; Ono, Yukiteru; Sakota, Eri; Kotani, Hazuki; Asai, Kiyoshi; Siomi, Haruhiko; Siomi, Mikiko C
2009-10-29
PIWI-interacting RNAs (piRNAs) silence retrotransposons in Drosophila germ lines by associating with the PIWI proteins Argonaute 3 (AGO3), Aubergine (Aub) and Piwi. piRNAs in Drosophila are produced from intergenic repetitive genes and piRNA clusters by two systems: the primary processing pathway and the amplification loop. The amplification loop occurs in a Dicer-independent, PIWI-Slicer-dependent manner. However, primary piRNA processing remains elusive. Here we analysed piRNA processing in a Drosophila ovarian somatic cell line where Piwi, but not Aub or AGO3, is expressed; thus, only the primary piRNAs exist. In addition to flamenco, a Piwi-specific piRNA cluster, traffic jam (tj), a large Maf gene, was determined as a new piRNA cluster. piRNAs arising from tj correspond to the untranslated regions of tj messenger RNA and are sense-oriented. piRNA loading on to Piwi may occur in the cytoplasm. zucchini, a gene encoding a putative cytoplasmic nuclease, is required for tj-derived piRNA production. In tj and piwi mutant ovaries, somatic cells fail to intermingle with germ cells and Fasciclin III is overexpressed. Loss of tj abolishes Piwi expression in gonadal somatic cells. Thus, in gonadal somatic cells, tj gives rise simultaneously to two different molecules: the TJ protein, which activates Piwi expression, and piRNAs, which define the Piwi targets for silencing.
Landing Gear Integration in Aircraft Conceptual Design. Revision
NASA Technical Reports Server (NTRS)
Chai, Sonny T.; Mason, William H.
1997-01-01
The design of the landing gear is one of the more fundamental aspects of aircraft design. The design and integration process encompasses numerous engineering disciplines, e.g., structure, weights, runway design, and economics, and has become extremely sophisticated in the last few decades. Although the design process is well documented, no attempt had been made until now to develop a design methodology that can be used within an automated environment. As a result, the process remains a key responsibility of the configuration designer and is largely experience-based and graphically oriented. However, as industry and government try to incorporate multidisciplinary design optimization (MDO) methods in the conceptual design phase, the need for a more systematic procedure has become apparent. The development of an MDO-capable design methodology as described in this work is focused on providing the conceptual designer with tools to help automate the disciplinary analyses, i.e., geometry, kinematics, flotation, and weight. Documented design procedures and analyses were examined to determine their applicability and to ensure compliance with current practices and regulations. Using the latest information obtained from industry during an initial industry survey, the analyses were in turn modified and expanded to accommodate the design criteria associated with advanced large subsonic transports. Algorithms were then developed based on the updated analysis procedures to be incorporated into existing MDO codes.
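As a small example of the kind of geometric check such a methodology automates, the sketch below evaluates a longitudinal tip-back angle from assumed CG and main-gear positions; the 15-degree rule of thumb and the coordinates are illustrative, not the report's procedures.

import math

def tipback_angle_deg(x_cg_aft, x_main_gear, z_cg):
    """Angle (deg) between the vertical through the main-gear contact
    point and the line from that point to the aft CG; a common layout
    guideline keeps this at or above roughly 15 degrees.
    Coordinates in metres, x positive aft, z_cg = CG height above ground."""
    return math.degrees(math.atan2(x_main_gear - x_cg_aft, z_cg))

# Illustrative geometry (not from the report):
print(f"tip-back angle ~ {tipback_angle_deg(20.0, 21.2, 4.0):.1f} deg")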
Quantum communication and information processing
NASA Astrophysics Data System (ADS)
Beals, Travis Roland
Quantum computers enable dramatically more efficient algorithms for solving certain classes of computational problems, but, in doing so, they create new problems. In particular, Shor's Algorithm allows for efficient cryptanalysis of many public-key cryptosystems. As public key cryptography is a critical component of present-day electronic commerce, it is crucial that a working, secure replacement be found. Quantum key distribution (QKD), first developed by C.H. Bennett and G. Brassard, offers a partial solution, but many challenges remain, both in terms of hardware limitations and in designing cryptographic protocols for a viable large-scale quantum communication infrastructure. In Part I, I investigate optical lattice-based approaches to quantum information processing. I look at details of a proposal for an optical lattice-based quantum computer, which could potentially be used for both quantum communications and for more sophisticated quantum information processing. In Part III, I propose a method for converting and storing photonic quantum bits in the internal state of periodically-spaced neutral atoms by generating and manipulating a photonic band gap and associated defect states. In Part II, I present a cryptographic protocol which allows for the extension of present-day QKD networks over much longer distances without the development of new hardware. I also present a second, related protocol which effectively solves the authentication problem faced by a large QKD network, thus making QKD a viable, information-theoretic secure replacement for public key cryptosystems.
Coppi, B.; Basu, B.; Fletcher, A.
2017-05-31
In the context of a two-fluid theory of magnetic reconnection, when the longitudinal electron thermal conductivity is relatively large, the perturbed electron temperature tends to become singular in the presence of a reconnected field component and an electron temperature gradient. A finite transverse thermal diffusivity removes this singularity while a finite ‘inductivity’ can remove the singularity of the relevant plasma displacement. Then (i) a new ‘magneto-thermal’ reconnection producing mode is found, with characteristic widths of the reconnection layer remaining significant even when the macroscopic distances involved are very large; (ii) the mode phase velocities can be in the direction of the electron diamagnetic velocity as well as in the opposite (ion) direction. A numerical solution of the complete set of equations has been carried out with a simplified analytical reformulation of the problem. A sequence of processes is analyzed to point out that high-energy particle populations can be produced as a result of reconnection events. These processes involve mode-particle resonances transferring energy of the reconnecting mode to a superthermal ion population and the excitation of lower hybrid waves that can lead to a significant superthermal electron population. The same modes excited in axisymmetric (e.g. toroidal) confinement configurations can extract angular momentum from the main body of the plasma column and thereby sustain a local ‘spontaneous rotation’ of it.
Nature-Inspired Capillary-Driven Welding Process for Boosting Metal-Oxide Nanofiber Electronics.
Meng, You; Lou, Kaihua; Qi, Rui; Guo, Zidong; Shin, Byoungchul; Liu, Guoxia; Shan, Fukai
2018-06-20
Recently, semiconducting nanofiber networks (NFNs) have been considered one of the most promising platforms for large-area and low-cost electronics applications. However, the high contact resistance among stacked nanofibers remained a major challenge, leading to poor device performance and parasitic energy consumption. In this report, a controllable welding technique for NFNs was successfully demonstrated via a bioinspired capillary-driven process. The interfiber connections were achieved via a cooperative concept, combining localized capillary condensation and curvature-induced surface diffusion. With the improved interfiber connections, the welded NFNs exhibited enhanced mechanical properties and high electrical performance. Field-effect transistors (FETs) based on the welded Hf-doped In2O3 (InHfO) NFNs were demonstrated for the first time. Meanwhile, the mechanisms involved in grain-boundary modulation for polycrystalline metal-oxide nanofibers were discussed. When high-k ZrOx dielectric thin films were integrated into the FETs, the field-effect mobility and operating voltage were further improved to 25 cm2 V-1 s-1 and 3 V, respectively. This is one of the best device performances among reported nanofiber-based FETs. These results demonstrate the potential of the capillary-driven welding process and the grain-boundary modulation mechanism for metal-oxide NFNs, which could be applicable to high-performance, large-scale, and low-power functional electronics.
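For context on the quoted mobility figure, the sketch below shows the standard linear-regime field-effect mobility extraction from a transfer curve; the device geometry, dielectric capacitance and synthetic data are assumptions of the example, not the paper's measurements.

import numpy as np

def linear_mobility(Vg, Id, L, W, Ci, Vds):
    """Linear-regime field-effect mobility (cm^2 V^-1 s^-1) from a transfer
    curve: mu = (L / (W * Ci * Vds)) * max(dId/dVg).
    L and W in cm, Ci in F/cm^2, Id in A, Vg and Vds in V."""
    gm = np.gradient(Id, Vg)                # transconductance, S
    return (L / (W * Ci * Vds)) * gm.max()

# Illustrative transfer data (not the paper's measurements):
Vg = np.linspace(0, 3, 31)
Id = 1e-6 * np.maximum(Vg - 1.0, 0)         # crude linear-regime model, A
mu = linear_mobility(Vg, Id, L=20e-4, W=100e-4, Ci=3e-7, Vds=0.1)
print(f"mu ~ {mu:.1f} cm^2/Vs")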
Joseph, Adrian; Kenty, Brian; Mollet, Michael; Hwang, Kenneth; Rose, Steven; Goldrick, Stephen; Bender, Jean; Farid, Suzanne S.
2016-01-01
In the production of biopharmaceuticals, disk-stack centrifugation is widely used as a harvest step for the removal of cells and cellular debris. Depth filters followed by sterile filters are often then employed to remove residual solids remaining in the centrate. Process development of centrifugation is usually conducted at pilot scale so as to mimic the commercial-scale equipment, but this method requires large quantities of cell culture and significant levels of effort for successful characterization. A scale-down approach based upon the use of a shear device and a bench-top centrifuge has been extended in this work towards a preparative methodology that successfully predicts the performance of the continuous centrifuge and polishing filters. The use of this methodology allows the effects of cell culture conditions and large-scale centrifugal process parameters on subsequent filtration performance to be assessed at an early stage of process development where material availability is limited. Biotechnol. Bioeng. 2016;113: 1934-1941. © 2016 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc. PMID:26927621
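The scale translation implied above is commonly expressed through equivalent-settling-area (sigma) theory, in which the laboratory and manufacturing centrifuges are matched at equal Q/Sigma; the sketch below shows that matching step with invented numbers and Sigma values treated as known inputs, and is not the authors' methodology.

def equivalent_spin_time(V_lab_mL, sigma_lab_m2, Q_ds_L_per_h, sigma_ds_m2):
    """Bench-top spin time (s) giving the same Q/Sigma (critical settling
    velocity) as a disk-stack run, per sigma-theory scaling:
        V_lab / (t_lab * Sigma_lab) = Q_ds / Sigma_ds.
    Sigma values are taken as known inputs (manufacturer data or prior
    characterization)."""
    V_lab = V_lab_mL * 1e-6                  # m^3
    Q_ds = Q_ds_L_per_h / 1000.0 / 3600.0    # m^3/s
    return V_lab * sigma_ds_m2 / (Q_ds * sigma_lab_m2)

# Illustrative numbers only:
t = equivalent_spin_time(V_lab_mL=30, sigma_lab_m2=0.5,
                         Q_ds_L_per_h=900, sigma_ds_m2=5000)
print(f"equivalent bench spin time ~ {t/60:.1f} min")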
Abundance of live 244Pu in deep-sea reservoirs on Earth points to rarity of actinide nucleosynthesis
Wallner, A.; Faestermann, T.; Feige, J.; Feldstein, C.; Knie, K.; Korschinek, G.; Kutschera, W.; Ofan, A.; Paul, M.; Quinto, F.; Rugel, G.; Steier, P.
2015-01-01
Half of the heavy elements including all actinides are produced in r-process nucleosynthesis, whose sites and history remain a mystery. If production is continuous, the interstellar medium is expected to build up a quasi-steady-state abundance of short-lived nuclides (with half-lives ≤100 My), including actinides produced in r-process nucleosynthesis. Their existence in today’s interstellar medium would serve as a radioactive clock and would establish that their production was recent. In particular 244Pu, a radioactive actinide nuclide (half-life = 81 My), can place strong constraints on recent r-process frequency and production yield. Here we report the detection of live interstellar 244Pu, archived in Earth’s deep-sea floor during the last 25 My, at abundances lower than expected from continuous production in the Galaxy by about two orders of magnitude. This large discrepancy may signal a rarity of actinide r-process nucleosynthesis sites, compatible with neutron-star mergers or with a small subset of actinide-producing supernovae. PMID:25601158
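A back-of-envelope check helps in reading this result: with an 81 My half-life, radioactive decay over a 25 My archive removes only about 10% of the deposited atoms, so the reported deficit of roughly two orders of magnitude cannot be a decay artifact. The arithmetic (illustrative only):

import math

T_HALF = 81.0                      # 244Pu half-life, My
lam = math.log(2) / T_HALF         # decay constant, 1/My

# Mean surviving fraction of atoms deposited uniformly over the last
# T_ARCHIVE million years: average of exp(-lam * t) for t in [0, T_ARCHIVE].
T_ARCHIVE = 25.0
surviving = (1.0 - math.exp(-lam * T_ARCHIVE)) / (lam * T_ARCHIVE)
print(f"decay constant: {lam:.4f} per My")
print(f"mean surviving fraction over {T_ARCHIVE:.0f} My: {surviving:.2f}")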
Implementing Kanban for agile process management within the ALMA Software Operations Group
NASA Astrophysics Data System (ADS)
Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge
2014-07-01
After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives on: (1) providing software support to tasks related to System Integration, Scientific Commissioning and Verification, as well as Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Because of their different stakeholders, these tasks vary widely in importance, lifespan and complexity. Aiming to provide the proper priority and traceability for every task without stressing our engineers, we introduced the Kanban methodology into our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we have found, and the solutions and adaptations that led us to our current but still evolving implementation, which has greatly improved our throughput, prioritization and problem traceability.
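The balancing of demand against throughput mentioned above is enforced in Kanban through work-in-progress (WIP) limits; the toy sketch below illustrates the rule with invented ticket names and is not the group's actual tooling.

class KanbanColumn:
    """Minimal WIP-limited column: new items can only be pulled in while
    the column is below its work-in-progress limit."""

    def __init__(self, name, wip_limit):
        self.name = name
        self.wip_limit = wip_limit
        self.items = []

    def pull(self, item):
        if len(self.items) >= self.wip_limit:
            return False           # demand exceeds capacity; item waits upstream
        self.items.append(item)
        return True

# Illustrative board: tickets queue upstream once "In progress" hits its limit.
in_progress = KanbanColumn("In progress", wip_limit=3)
for ticket in ["TASK-101", "TASK-102", "TASK-103", "TASK-104"]:
    started = in_progress.pull(ticket)
    print(ticket, "started" if started else "waits upstream")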
Ren, Xiaojun; Deng, Ruijie; Wang, Lida; Zhang, Kaixiang; Li, Jinghong
2017-08-01
RNA splicing, which mainly involves two transesterification steps, is a fundamental process of gene expression and its abnormal regulation contributes to serious genetic diseases. Antisense oligonucleotides (ASOs) are genetic control tools that can be used to specifically control genes through alteration of the RNA splicing pathway. Despite intensive research, how ASOs or various other factors influence the multiple processes of RNA splicing still remains obscure. This is largely due to an inability to analyze the splicing efficiency of each step in the RNA splicing process with high sensitivity. We addressed this limitation by introducing a padlock probe-based isothermal amplification assay to achieve quantification of the specific products in different splicing steps. With this amplified assay, the roles that ASOs play in RNA splicing inhibition in the first and second steps could be distinguished. We identified that 5'-ASO could block RNA splicing by inhibiting the first step, while 3'-ASO could block RNA splicing by inhibiting the second step. This method provides a versatile tool for assisting efficient ASO design and discovering new splicing modulators and therapeutic drugs.
Vitorazi, L; Ould-Moussa, N; Sekar, S; Fresnais, J; Loh, W; Chapel, J-P; Berret, J-F
2014-12-21
Recent studies have pointed out the importance of polyelectrolyte assembly in the elaboration of innovative nanomaterials. Beyond their structures, many important questions on the thermodynamics of association remain unanswered. Here, we investigate the complexation between poly(diallyldimethylammonium chloride) (PDADMAC) and poly(sodium acrylate) (PANa) chains using a combination of three techniques: isothermal titration calorimetry (ITC), static and dynamic light scattering, and electrophoresis. Upon addition of PDADMAC to PANa or vice versa, the results obtained by the different techniques agree well with each other and reveal a two-step process. The primary process is the formation of highly charged polyelectrolyte complexes of size 100 nm. The secondary process is the transition towards a coacervate phase made of polymer-rich and polymer-poor droplets. The binding isotherms measured are accounted for using a phenomenological model that provides the thermodynamic parameters for each reaction. Small positive enthalpies and large positive entropies consistent with a counterion release scenario are found throughout this study. Furthermore, this work stresses the importance of the underestimated formulation pathway, or mixing order, in polyelectrolyte complexation.
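For readers less familiar with how ITC-derived parameters translate into the enthalpies and entropies quoted above, the standard relations are dG = -RT ln K and dG = dH - T dS; the sketch below applies them to invented values, not the paper's fitted parameters.

import math

R = 8.314  # gas constant, J/(mol K)

def thermo_from_itc(K, dH_kJ, T=298.15):
    """Free energy and entropy term (kJ/mol) from an ITC-derived binding
    constant K and enthalpy dH: dG = -RT ln K, and T*dS = dH - dG."""
    dG = -R * T * math.log(K) / 1000.0
    TdS = dH_kJ - dG
    return dG, TdS

# Illustrative: a small positive enthalpy with strong binding implies a
# large positive entropy, as expected for counterion release.
dG, TdS = thermo_from_itc(K=1e6, dH_kJ=5.0)
print(f"dG = {dG:.1f} kJ/mol, T*dS = {TdS:.1f} kJ/mol")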
A light-stimulated synaptic device based on graphene hybrid phototransistor
NASA Astrophysics Data System (ADS)
Qin, Shuchao; Wang, Fengqiu; Liu, Yujie; Wan, Qing; Wang, Xinran; Xu, Yongbing; Shi, Yi; Wang, Xiaomu; Zhang, Rong
2017-09-01
Neuromorphic chips refer to an unconventional computing architecture that is modelled on biological brains. They are increasingly employed for processing sensory data for machine vision, context cognition, and decision making. Despite rapid advances, neuromorphic computing has remained largely an electronic technology, making it a challenge to access the superior computing features provided by photons, or to directly process vision data that has increasing importance to artificial intelligence. Here we report a novel light-stimulated synaptic device based on a graphene-carbon nanotube hybrid phototransistor. Significantly, the device can respond to optical stimuli in a highly neuron-like fashion and exhibits flexible tuning of both short- and long-term plasticity. These features combined with the spatiotemporal processability make our device a capable counterpart to today’s electrically-driven artificial synapses, with superior reconfigurable capabilities. In addition, our device allows for generic optical spike processing, which provides a foundation for more sophisticated computing. The silicon-compatible, multifunctional photosensitive synapse opens up a new opportunity for neural networks enabled by photonics and extends current neuromorphic systems in terms of system complexities and functionalities.
Rafferty, Rae; Fairbrother, Greg
2015-06-01
To introduce a theory which describes the process of, and explicates the factors moderating, the acquisition and integration of leadership coaching skills into the routine practice of senior nurses and midwives. Organizations invest significant resources in leadership coaching programs to ensure that coaching is embedded as a core function of the manager's role. However, even after training, many managers remain unable to undertake this role successfully. The process by which health professionals translate 'manager as coach' training into successful practice outcomes has remained largely unexplored. A grounded theory study design. Data, collected between February 2012 and May 2013, included in-depth interviews with 20 senior nurses and midwives who had attended a leadership coaching program and analysis of nine reflective practice journals. Multiple researchers coded and analysed the data using constant comparative techniques. The outcomes of coaching training ranged from inappropriate use of the coaching skills through to transformed managerial practice. These outcomes were influenced by the dynamic interaction of three central domains of the emergent theoretical model: pre-existing individual perceptions, program elements and contemporaneous experiences. Interactions occurred within the domains and between them, impacting on activators such as courage, motivation, commitment and confidence. The study offers new insights into how senior nurses and midwives acquire and integrate coaching skills into their routine practice. The process is described as multifactorial and dynamic and has implications for the training design, delivery and organizational support of future leadership coaching programs. © 2015 John Wiley & Sons Ltd.
Inducing task-relevant responses to speech in the sleeping brain.
Kouider, Sid; Andrillon, Thomas; Barbosa, Leonardo S; Goupil, Louise; Bekinschtein, Tristan A
2014-09-22
Falling asleep leads to a loss of sensory awareness and to the inability to interact with the environment [1]. While this was traditionally thought to be a consequence of the brain shutting down to external inputs, it is now acknowledged that incoming stimuli can still be processed, at least to some extent, during sleep [2]. For instance, sleeping participants can create novel sensory associations between tones and odors [3] or reactivate existing semantic associations, as evidenced by event-related potentials [4-7]. Yet, the extent to which the brain continues to process external stimuli remains largely unknown. In particular, it remains unclear whether sensory information can be processed in a flexible and task-dependent manner by the sleeping brain, all the way up to the preparation of relevant actions. Here, using semantic categorization and lexical decision tasks, we studied task-relevant responses triggered by spoken stimuli in the sleeping brain. Awake participants classified words as either animals or objects (experiment 1) or as either words or pseudowords (experiment 2) by pressing a button with their right or left hand, while transitioning toward sleep. The lateralized readiness potential (LRP), an electrophysiological index of response preparation, revealed that task-specific preparatory responses are preserved during sleep. These findings demonstrate that despite the absence of awareness and behavioral responsiveness, sleepers can still extract task-relevant information from external stimuli and covertly prepare for appropriate motor responses. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
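For reference, the LRP mentioned above is conventionally obtained by a double subtraction over the motor electrodes (C3/C4), averaging the contralateral-minus-ipsilateral difference across response hands; the sketch below shows that computation on synthetic data and is not the authors' analysis pipeline.

import numpy as np

def lrp(c3, c4, response_hand):
    """Lateralized readiness potential via the double-subtraction method:
    LRP(t) = 0.5 * [ mean(C3 - C4 | right-hand trials)
                   + mean(C4 - C3 | left-hand trials) ].
    c3, c4: arrays of shape (n_trials, n_samples); response_hand holds
    'left'/'right' labels per trial."""
    right = response_hand == "right"
    left = response_hand == "left"
    return 0.5 * ((c3[right] - c4[right]).mean(axis=0)
                  + (c4[left] - c3[left]).mean(axis=0))

# Tiny synthetic example: 4 trials, 5 time samples.
rng = np.random.default_rng(0)
c3 = rng.normal(size=(4, 5))
c4 = rng.normal(size=(4, 5))
hands = np.array(["right", "left", "right", "left"])
print(lrp(c3, c4, hands))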
The second laws of quantum thermodynamics.
Brandão, Fernando; Horodecki, Michał; Ng, Nelly; Oppenheim, Jonathan; Wehner, Stephanie
2015-03-17
The second law of thermodynamics places constraints on state transformations. It applies to systems composed of many particles; however, one can also formulate laws of thermodynamics when only a small number of particles are interacting with a heat bath. Is there a second law of thermodynamics in this regime? Here, we find that for processes which are approximately cyclic, the second law for microscopic systems takes on a different form compared to the macroscopic scale, imposing not just one constraint on state transformations, but an entire family of constraints. We find a family of free energies which generalize the traditional one, and show that they can never increase. The ordinary second law relates to one of these, with the remainder imposing additional constraints on thermodynamic transitions. We find three regimes which determine which family of second laws govern state transitions, depending on how cyclic the process is. In one regime one can cause an apparent violation of the usual second law, through a process of embezzling work from a large system which remains arbitrarily close to its original state. These second laws are relevant for small systems, and also apply to individual macroscopic systems interacting via long-range interactions. By making precise the definition of thermal operations, the laws of thermodynamics are unified in this framework, with the first law defining the class of operations, the zeroth law emerging as an equivalence relation between thermal states, and the remaining laws being monotonicity of our generalized free energies.
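For readers who want the form these generalized free energies take, they are built from Renyi divergences to the thermal state; written as a summary sketch (notation assumed here, with the precise conditions as stated in the paper):

F_\alpha(\rho) \;=\; k_B T \, D_\alpha(\rho \,\|\, \gamma) \;-\; k_B T \ln Z,
\qquad \gamma = \frac{e^{-H/(k_B T)}}{Z},

D_\alpha(\rho \,\|\, \gamma) \;=\; \frac{1}{\alpha - 1} \,\ln \sum_i p_i^{\alpha} q_i^{\,1-\alpha}
\quad \text{(for states diagonal in the energy eigenbasis)},

and the generalized second laws assert that F_\alpha(\rho') \le F_\alpha(\rho) under thermal operations for all \alpha \ge 0, with \alpha \to 1 recovering the ordinary free energy F(\rho) = \langle H \rangle - k_B T \, S(\rho).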
MUSE: the Multi-Slit Solar Explorer
NASA Astrophysics Data System (ADS)
Tarbell, Theodore D.; De Pontieu, Bart
2017-08-01
The Multi-Slit Solar Explorer is a proposed Small Explorer mission for studying the dynamics of the corona and transition region using both conventional and novel spectral imaging techniques. The physical processes that heat the multi-million degree solar corona, accelerate the solar wind and drive solar activity (CMEs and flares) remain poorly known. A breakthrough in these areas can only come from radically innovative instrumentation and state-of-the-art numerical modeling and will lead to better understanding of space weather origins. MUSE’s multi-slit coronal spectroscopy will use a 100x improvement in spectral raster cadence to fill a crucial gap in our knowledge of Sun-Earth connections; it will reveal temperatures, velocities and non-thermal processes over a wide temperature range to diagnose physical processes that remain invisible to current or planned instruments. MUSE will contain two instruments: an EUV spectrograph (SG) and EUV context imager (CI). Both have similar spatial resolution and leverage extensive heritage from previous high-resolution instruments such as IRIS and the HiC rocket payload. The MUSE investigation will build on the success of IRIS by combining numerical modeling with a uniquely capable observatory: MUSE will obtain EUV spectra and images with the highest resolution in space (1/3 arcsec) and time (1-4 s) ever achieved for the transition region and corona, along 35 slits and a large context FOV simultaneously. The MUSE consortium includes LMSAL, SAO, Stanford, ARC, HAO, GSFC, MSFC, MSU, ITA Oslo and other institutions.